The purpose of this exercise is to describe the business need that led to the creation of IPython as a project and as a tool, and that supports its continued development.
An assessment of the business goals and objectives, the business problem(s) or opportunity, and the desired outcomes of the IPython project will be performed to outline the issues that existed and defined the business need for IPython at its creation.
Initially in 2001:
• Deliver an interactive environment for daily use from existing tools scientists are accustomed to using
By 2006:
• Deliver modern, high-quality tools for scientific computing that take advantage of the best practices from the open source revolution in software development and bring them to computational research
• Deliver tools that greatly enhance the Python shell and provide facilities for interactive distributed and parallel computing, as well as a comprehensive set of tools for building special-purpose interactive environments for scientific computing
• Obtain funding for the development of interactive tools
• Interactive tools that allow immediate, rapid and direct algorithmic exploration, data analysis, visualization and evaluation
• Interactive tools that take advantage of the best practices from the open source revolution in software development and bring them to computational research
• Interactive tools in which code can be run immediately, without having to go through the traditional edit/compile/execute cycle
The techniques that will be applied for this assessment are benchmarking and focus groups.
The stakeholders for this assessment are: the customer, domain SME, end user, implementation SME, and sponsor.
• Powerful interactive shells (terminal and Qt-based)
• A browser-based notebook with support for code, text, mathematical expressions, inline plots and other rich media
• Support for interactive data visualization and use of GUI toolkits.
• Flexible, embeddable interpreters to load into your own projects
• Easy to use, high performance tools for parallel computing.
• August 2013: $100k Microsoft donation
• 2012: $1.15M Sloan Foundation grant
• 2011-Present: The US DoD High Performance Computing Modernization Program (HPCMP) has funded several IPython developers in collaboration with the US Army Engineer Research and Development Center (ERDC), which provides computing resources and staff support.
• 2011: Funding by Sage/NSF via the grant “Sage: Unifying Mathematical Software for Scientists, Engineers, and Mathematicians” (NSF grant DMS-1015114) supported our Seattle developer meeting.
• 2009: Funding via the NiPy project (NIH grant 5R01MH081909-02) supported refactoring work.
• Beginning-Present: Enthought Inc. has supported IPython since its beginning in multiple forms, including, but not limited to, funding our Qt console, hosting our website for many years, the continued hosting of our mailing lists, and the inclusion of IPython in Enthought Canopy.
• Continuous period: Donations through IPython.org website
• Xxxx-Present: GitHub hosts development workflow and documentation.
• Xxxx-Present: ShiningPanda provides a free continuous integration service.
• July 2013: Free cloud hosting by Rackspace
• 2010: Microsoft’s team working on Python Tools for Visual Studio developed the integration of IPython into the Python plugin for Visual Studio.
• 2010 and 2005: Google Summer of Code provided support for prototypes in several areas of the project.
• 2009: The Ohio Supercomputer Center and the Department of Defense High Performance Computing Modernization Program (HPCMP) sponsored work on parallel computing tools.
• 2008: Tech-X Corporation supported the development of parallel computing tools.
• 2006: Bivio Software hosted an IPython sprint in addition to their support of the Front Range Pythoneers group in Boulder, CO.
• (Global goals?)
• [Increase customers usage of IPython Notebook by x% (Q4 2014)]
• [Increase customer satisfaction (Q4 2014)]
• [Maintain (or Increase) Sponsor satisfaction (Continuous)]
• Deliver the IPython Notebook as a general tool for scientific and technical computing that is open, collaborative and reproducible. (Q4 ’14)
• Deliver major release of interactive tools (every 6 months)
• Update roadmap document with each release (every 6 months)
• [Other goals?]
• [Fundraising/grant-related goal (by when?)]
• [Goal to build/sustain a team (by when?)]
• Participate in or offer X talks (every 6 months)
• Participate in or offer X workshops (every 6 months)
• [Goal to measure and maintain sponsor satisfaction]
• [Goal to measure customer satisfaction (by when? More surveys?)]
• Deliver White Papers by target date.
• Build interactive JavaScript widgets for the IPython Notebook that enable computations and visualizations to be controlled with UI controls (sliders, buttons, etc.) (Q4 ’14)
• Improve the IPython Notebook format by creating libraries for converting Notebooks to various formats (LaTeX, PDF, HTML, Presentations) and integrating these into the Notebook web application. (Q4 ’14)
• Add multi-user support to the Notebook web application, enabling small to medium-sized groups of trusted individuals to run a central Notebook server for collaborative research and teaching. (Q4 ’14)
• Develop IPython Notebooks for applied statistics in collaboration with Jonathan Taylor, who will use these materials in his Applied Statistics course at Stanford (Q4 ’14)
• Core IPython developers meet twice a year, funded by Sloan Foundation (every 6 months)
• [Other goals?]
In the last couple of decades, but increasingly so in the last few years, all scientific disciplines have been tossed into the deep end of the computational pool, often without being well prepared for the dip. Until the end of the 20th century, scientific computing was mostly the domain of physicists, chemists, and engineers. Fortran and C dominated the scene, and for the most part scientific computing was synonymous with high-end numerical analysis. Now, however, all disciplines are drowning in quantitative data and need to develop a culture of scientific computing. Fortunately, the open source revolution (made possible by the opening of the Internet in the late 1990s) has created many freely available tools, and a culture of how to use and develop them, that is causing lasting changes in scientific computing.
The backbone of scientific computing is mostly a collection of high-performance code written in Fortran, C, and C++ that typically runs in batch mode on large systems, clusters, and supercomputers. Over the past decade, however, high-level environments that integrate easy-to-use interpreted languages, comprehensive numerical libraries, and visualization facilities have become increasingly popular. As computers have grown faster, the critical bottleneck in scientific computing is not always the computer’s processing time; the scientist’s time is also a consideration. For this reason, systems that allow rapid algorithmic exploration, data analysis, and visualization have become a staple of daily scientific work. The Interactive Data Language (IDL) and Matlab (for numerical work), and Mathematica and Maple (for work that includes symbolic manipulation) are well-known commercial environments of this kind. GNU Data Language, Octave, Maxima, and Sage provide their open source counterparts.
Initially in 2001, the desired outcome was to build an interactive environment that could be used for daily work, inspired by tools a physicist is accustomed to using. Over time, it evolved into the development of modern, high-quality tools for scientific computing that take advantage of the best practices from the open source revolution in software development and bring them to computational research.
The Interactive Data Language (IDL), Matlab (for numerical work), Mathematica, Maple, GNU Data Language, Octave, Maxima, and Sage all offer an interactive command line in which code can be run immediately, without having to go through the traditional edit/compile/execute cycle. This flexible style matches the spirit of computing in a scientific context well, in which determining what computations must be performed next often requires significant work. An interactive environment lets scientists look at data, test new ideas, combine algorithmic approaches, and evaluate their outcome directly. This process might lead to a final result, or it might clarify how they need to build a more static, large-scale production code. Overall, it can make a significant difference to how scientific research is conducted and disseminated.
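As a minimal sketch of this workflow (plain Python rather than a transcript of any particular shell, and with invented sample data), the explore/adjust/re-evaluate loop looks like this: each step can be typed, run, and inspected immediately, with no compile step in between.

```python
# Illustrative only: a rapid explore/inspect/refine loop, the kind of
# workflow an interactive shell makes immediate (no edit/compile/execute cycle).

# Hypothetical measurements a scientist wants to look at right away.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

# Step 1: get a first summary of the data.
mean = sum(data) / len(data)

# Step 2: test a new idea on the spot -- how spread out are the values?
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Step 3: refine immediately with a different view of the same data.
deviations = [x - mean for x in data]

print(mean)      # 5.0
print(variance)  # 4.0
```

In an interactive session, each of these lines would be entered one at a time and its result examined before deciding what to compute next.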
Python is an excellent tool for such a workflow. The aim is to not only provide a greatly enhanced Python shell but also facilities for interactive distributed and parallel computing, as well as a comprehensive set of tools for building special-purpose interactive environments for scientific computing.
In 2001, Fernando Perez took those first steps towards offering a free interactive tool for scientific computing. He started the IPython project based on the Python programming language. In 2014, the project is focused on delivering the IPython Notebook as a general tool for scientific and technical computing that is open, collaborative, and reproducible, on a global scale.
No need to write if already written elsewhere
No need to write if already written elsewhere. High-level view.