How Enterprises Can Finally Capitalize on Machine Learning

In this special guest feature, Dr. Michael Zeller, SVP, AI Strategy & Innovation at Software AG, discusses how enterprises have reached a pivotal moment for operationalizing machine learning. For a while now artificial intelligence has been overhyped, its benefits overpromised and, in the end, it has always under-delivered. Now the hype is back stronger than ever. For AI not to fade away again, however, it needs to be made actionable. Previously, Dr. Zeller was Co-Founder of Zementis, where his vision was to help companies deepen and accelerate insights from big data through the power of predictive analytics. Michael has extensive experience in the strategic implementation of technology, business process improvement and systems integration. He strives to provide customers with innovative business solutions tailored to their unique needs. He also serves on the Board of Directors of Software San Diego and as Secretary/Treasurer on the Executive Committee of ACM SIGKDD, the premier international organization for data mining researchers and practitioners from academia, industry, and government.

Enterprises have reached a pivotal moment for operationalizing machine learning. For a while now, artificial intelligence has been overhyped, its benefits overpromised and, in the end, it has always under-delivered. Now the hype is back stronger than ever. For AI not to fade away again, however, it needs to be made actionable.

Today, we still use many of the same algorithms we have had for the last few decades. The difference now is that more data and drastically enhanced computing power are available at a lower cost, which makes it far more feasible to put these algorithms to work. With the volumes of data being collected – thanks largely to the proliferation of IoT – the need to automate and make progressively intelligent decisions via AI is more important than ever.

Operational deployment in the enterprise is where AI, machine learning and predictive algorithms start generating measurable results and ROI. To utilize machine learning and AI, we need to:

  • Look at the practical aspects of what a data science team is developing
  • Explore the data – whether it comes from sensors on the factory floor or from marketing applications
  • Build a machine-learning model that meets the specific business need, then deploy it into the existing IT infrastructure
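A minimal sketch of this explore-model-deploy loop, using plain Python and made-up sensor readings rather than any particular data science toolkit, might look like:

```python
# Hypothetical sensor readings from a factory floor (illustrative data only).
readings = [0.2, 0.4, 0.35, 0.9, 0.85, 0.3, 0.95, 0.1]
labels   = [0,   0,   0,    1,   1,    0,   1,    0]  # 1 = fault observed

# Explore: basic statistics to understand how the two classes differ.
mean_ok    = sum(r for r, y in zip(readings, labels) if y == 0) / labels.count(0)
mean_fault = sum(r for r, y in zip(readings, labels) if y == 1) / labels.count(1)

# Model: a simple midpoint threshold separating normal and faulty readings.
threshold = (mean_ok + mean_fault) / 2

# Deploy: the "model" becomes a scoring function usable by operational systems.
def predict(reading: float) -> int:
    return 1 if reading >= threshold else 0
```

A real project would of course use a proper modeling platform, but the shape of the workflow – understand the data, derive model parameters, then expose a scoring function – stays the same.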

This means developing industry standards that easily allow companies to deploy, integrate and scale the use of predictive analytics, machine learning and AI to transform the next stage of the IT platform.

The Challenge

Historically, the operational deployment of AI, machine learning and predictive algorithms has been a tedious, labor- and time-intensive task. Data science teams had to build predictive and machine-learning models and then manually re-code them for deployment in operational IT systems. Only then could these models be used to effectively score new data in real-time streaming or big data batch applications.

This process was prone to errors and could easily take more than six months, wasting valuable resources along the way. Not only did it limit how quickly models could be deployed, but it also made it difficult to leverage more complex machine-learning algorithms that could deliver more precise results.

Overcoming this antiquated process meant establishing a more efficient model development life cycle built around popular open-source data mining tools.

A Standards-based Solution

Enter PMML, the Predictive Model Markup Language industry standard. PMML is an XML-based standard for the vendor-independent exchange of predictive analytics, data mining and machine learning models. Developed by the Data Mining Group, PMML has matured to the point where it now has extensive vendor support and has become the backbone of big data and streaming analytics. For today’s agile IT infrastructure, PMML delivers the necessary representational power for predictive models to be quickly and easily exchanged between systems.
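To give a sense of what the standard looks like, here is a minimal, hand-written PMML document for a simple linear regression (the field names and coefficients are illustrative, not taken from any real model):

```xml
<PMML xmlns="http://www.dmg.org/PMML-4_4" version="4.4">
  <Header description="Illustrative linear regression model"/>
  <DataDictionary numberOfFields="2">
    <DataField name="temperature" optype="continuous" dataType="double"/>
    <DataField name="output" optype="continuous" dataType="double"/>
  </DataDictionary>
  <RegressionModel modelName="demo" functionName="regression">
    <MiningSchema>
      <MiningField name="temperature"/>
      <MiningField name="output" usageType="target"/>
    </MiningSchema>
    <RegressionTable intercept="1.5">
      <NumericPredictor name="temperature" coefficient="0.8"/>
    </RegressionTable>
  </RegressionModel>
</PMML>
```

Because the model is just declarative XML – data dictionary, mining schema, model parameters – any PMML-aware scoring engine can execute it, regardless of which tool produced it.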

One of the leading statistical modeling platforms today is R. R allows for quick exploration of data and the extraction of important features, and it offers a myriad of packages that give data scientists easy access to various modeling techniques. The ‘pmml’ and ‘pmmlTransformations’ packages for R were created to allow data scientists to export their models to the PMML format. The PMML representation of these models is then easy to deploy and integrate into any enterprise IT architecture, be it cloud or on-premises, streaming, mainframe or Hadoop.
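To illustrate the consumption side, a rough sketch of a scoring step – using only Python's standard library against a hand-written PMML regression fragment, not a substitute for a production PMML engine – could look like this:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written PMML regression model (illustrative values only).
PMML_DOC = """<PMML xmlns="http://www.dmg.org/PMML-4_4" version="4.4">
  <RegressionModel functionName="regression">
    <RegressionTable intercept="1.5">
      <NumericPredictor name="temperature" coefficient="0.8"/>
    </RegressionTable>
  </RegressionModel>
</PMML>"""

NS = {"pmml": "http://www.dmg.org/PMML-4_4"}

def score(pmml_xml: str, record: dict) -> float:
    """Evaluate a simple PMML RegressionTable against one input record."""
    root = ET.fromstring(pmml_xml)
    table = root.find(".//pmml:RegressionTable", NS)
    result = float(table.get("intercept", "0"))
    for pred in table.findall("pmml:NumericPredictor", NS):
        result += float(pred.get("coefficient")) * record[pred.get("name")]
    return result

print(score(PMML_DOC, {"temperature": 10.0}))  # 1.5 + 0.8 * 10.0 → 9.5
```

The point is vendor independence: the scoring side never needs to know whether the model came from R, Python or a commercial tool – it only needs to understand the PMML representation.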

As more companies invest in IoT and exponentially more data is collected, making it easier to deploy, integrate and scale predictive analytics, machine learning and AI will be transformative. The technology and the data are available, but integrating all the parts will require an enterprise-grade framework based on open standards – one that allows machine-learning models to be exchanged across applications, vendors and organizational boundaries to maximize business value.

Only then can organizations begin to capitalize on AI and ML.

 
