IBM (NYSE: IBM) today announced that its machine learning technology, SystemML, has been accepted as a project by the Apache Incubator. Originally developed by IBM Research, and now used in IBM's BigInsights data analytics platform, SystemML is a machine learning algorithm translator. With SystemML, developers can build a machine learning model once and keep reusing it to analyze and make predictions on data across a wide range of industry-specific scenarios.
"In the next several years, all businesses will rely almost exclusively on applications that learn. For developers who are not experts in machine learning, the availability of SystemML as open source technology will help scale learning and widespread development of applications that truly sense, learn, reason and interact with people in new ways," said Rob Thomas, VP of Development, IBM Analytics. "IBM developed SystemML to provide the ability to scale data analysis from a small laptop to large clusters without the need to rewrite the entire codebase. This allows for domain- or industry-specific machine learning, providing developers what they need from a base code to customize applications for their enterprise's needs."
Data scientists today face time-consuming and difficult challenges when porting their algorithms to production environments. Apache SystemML addresses these challenges by dynamically compiling and optimizing machine learning algorithms in the environments familiar to the data scientist, and automatically porting those algorithms to production environments. By contributing SystemML to the open source community, IBM is helping data scientists iterate faster with the changing needs of the business, and helping data engineers by removing the need to rewrite code for varying environments. As a result, more app developers will be able to build deep intelligence into everything from mobile applications to large mainframe processes.
In June of 2015, IBM announced its intent to donate SystemML to promote open source innovation and accelerate intelligence into every application. Now called Apache SystemML, the project has achieved a number of early milestones, including:
- Over 320 patches, including improvements to APIs, data ingestion, optimizations, language and runtime operators, additional algorithms, testing, and documentation.
- 90+ contributions to the Apache Spark project, spanning machine learning and various other components, from more than 25 engineers at the IBM Spark Technology Center in San Francisco, making machine learning accessible to the fast-growing community of data science professionals.
- More than 15 contributors from a number of organizations, enhancing the capabilities of the core SystemML engine.
"SystemML not only scales for big data analytics with high-performance optimizer technology, but also empowers users to write customized machine learning algorithms using a simple domain-specific language without learning complicated distributed programming. It is a great, extensible complement to Spark MLlib. I'm looking forward to seeing this become part of the Apache Spark ecosystem," said D.B. Tsai, Apache Spark and Apache SystemML Committer.
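To illustrate the declarative style Tsai describes, SystemML algorithms are written in DML, its R-like domain-specific language. The sketch below is a hypothetical example (not taken from the release): it solves ordinary least squares via the normal equations, and the SystemML optimizer, not the author, decides whether to execute it on a single machine or distributed over a Spark cluster. The input and output paths (`$X`, `$y`, `$w`) are placeholder named arguments for this illustration.

```
# Hypothetical DML sketch: linear regression via the normal equations.
# t() is transpose, %*% is matrix multiply, solve() solves a linear system.
X = read($X)                          # feature matrix
y = read($y)                          # target vector
w = solve(t(X) %*% X, t(X) %*% y)    # weights minimizing ||Xw - y||^2
write(w, $w)
```

Because the script states only the linear algebra, the same code runs unchanged whether X fits in memory on a laptop or is partitioned across a cluster, which is the "write once, scale anywhere" property the release highlights.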