
Turning IT Upside Down In a Machine Learning World

In this special guest feature, Chris Heineken, CEO and Co-founder of Atrium, suggests that as Machine Learning (ML) grows in the IT and cloud space, understanding how best to utilize its capabilities will change the approach to implementing new IT investments. As CEO of Atrium, Chris leads a world-class team in empowering companies to embrace the next generation of tech through the power of AI. Prior to founding Atrium, Chris was the COO at Appirio, where he was responsible for leading the company’s global consulting, sales, and operations teams. Chris started his career with Accenture and later founded Bay Street Solutions, a CRM/Siebel consulting firm acquired by Perficient. He earned his undergraduate degree from UC Davis and MBA from UC Berkeley.

Modern-day IT systems reflect software development practices and technology constraints inherited over the last two decades. Cloud-based IT solutions were constrained by the approaches that previously dictated the market. As Machine Learning (ML) grows in the IT and cloud space, understanding how best to utilize its capabilities will change the approach to implementing new IT investments. Systems of intelligence, characterized by augmented ML, robust analytics, and workflow-based frameworks biased towards action, will dominate the mindset of those looking to place their organizations at the top end of the IT systems ‘bell curve.’ So, how do you take your first step up the IT systems bell curve in an ML-driven future? Start by recognizing that the rules of software development that served you well in the past won’t help you in the future.

To frame the environment going forward, it is important to understand the context that created current IT development frameworks and dominated the mindset of IT professionals. For years, the approach to building IT systems has followed a principle best characterized as ‘paving the old cow paths.’ Ever wonder why some current road systems defy rational logic for today’s transportation needs? As the legend goes, cows take the path of least resistance from point A to point B, and when local towns began building road systems, they simply paved the cow paths that were already there. IT systems built over the last few decades have a lot in common with those paved cow paths. More often than not, the first question systems architects asked when initiating a new IT project was how to replicate current infrastructure in new technologies (while addressing some new pain points), rather than how to make step-change advances. Anchoring on past infrastructure to set the tone for the future has the advantages of quick, incremental progress and lower-risk deployments, but the rewards are limited, especially when new paradigms like ML emerge.

For example, teams of IT professionals would generally start with ‘as is’ and ‘to be’ process flows, triangulated against current system limitations; newly architected system designs would then be wireframed, developed, and deployed to optimize almost any process imaginable. Process speed and accuracy were consistent outputs, and a big win a few years ago. Many of these projects started with the best of intentions (impacting key business metrics), but in the end the overwhelming majority of IT programs were judged a success or failure on simple metrics of user adoption and process compliance.

Machine learning will reverse, or turn upside down, the process of IT systems development. Those looking to win in the age of machine learning will place data and analytics at the center of their systems development strategy. Data should no longer be viewed as a necessary evil required to complete a process step; rather, it should be the foundation that informs the possibilities of the future. IT investments will start by identifying the question we want to answer, inventorying the data we possess, and identifying the data architecture gaps; only then, as the last step, will we build systems to support those objectives. Consider how these two paradigms contrast across the traditional software development phases.

At every step in the software lifecycle you can see how mindsets need to shift. Rather than optimizing for ‘how can I make your current pain points better,’ it is about determining the questions that, if answered, would yield groundbreaking results. Every organization has one or two key metrics that, if changed, could dramatically improve company performance. These metrics could be customer retention, lead acquisition, win rate, or any of a large number of potential metrics that, if impacted by data, insights, and action, could produce order-of-magnitude results in revenue, margins, and valuation. For example, in environments where market share matters and high-volume interactions occur with a low cost of sale, moving from a four percent lead conversion rate to an eight percent rate can be the difference between average and best-in-class results.
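The arithmetic behind that conversion-rate example can be sketched in a few lines of Python. The lead volume and average deal value below are assumptions chosen purely for illustration, not figures from the article:

```python
# Hypothetical illustration: doubling lead conversion doubles revenue,
# holding lead volume and deal value constant.
# All input figures are assumed for the example.

def annual_revenue(leads_per_year: int, conversion_rate: float,
                   avg_deal_value: float) -> float:
    """Revenue from converted leads in a high-volume, low-cost-of-sale model."""
    return leads_per_year * conversion_rate * avg_deal_value

leads = 100_000      # assumed annual lead volume
deal_value = 500.0   # assumed average deal value in dollars

baseline = annual_revenue(leads, 0.04, deal_value)  # 4% conversion
improved = annual_revenue(leads, 0.08, deal_value)  # 8% conversion

print(f"Baseline: ${baseline:,.0f}")               # $2,000,000
print(f"Improved: ${improved:,.0f}")               # $4,000,000
print(f"Uplift:   ${improved - baseline:,.0f}")    # $2,000,000
```

The point is that a single-digit change in one key metric compounds across every interaction, which is why the data-first paradigm targets these metrics rather than incremental process speed.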

Looking forward, rather than investing time in figuring out how to capture and process data faster to support an end-to-end process, invest time in figuring out how to collect data around the most powerful target variables, and how to structure that data so it can be understood and acted upon by machine learning algorithms. Harnessing the power of machine learning is less about process speed and more about understanding your business drivers well enough to harvest data in a fashion that allows ML models to both predict and act in real time. For example, empowering a call center agent to process calls faster may still be a good investment, but it will only keep you at parity with your industry peers. That was so 2005. Identifying how to augment the call center agent with ML-driven actions and prompts that dramatically improve productivity will be the goal of those who understand how to win over the next ten years.

Approaching the future of IT investment with a ‘paving the cow path’ mindset will either deliver sub-optimal results or prevent you from starting the machine learning journey at all. Step one in preparing to take advantage of the new paradigms triggered by ML is acknowledging that deploying the next generation of IT systems will require different approaches from those used over the last twenty years. Organizations looking to develop next-generation intelligent systems that can both predict and act on key business drivers will need to revisit IT organizational structures, skill sets, and the frameworks for evaluating which IT projects to fund. Just when stale software paradigms threaten to commoditize IT, new paradigms emerge that challenge our assumptions and force us to adapt. Planning for ML and systems of intelligence represents a far greater opportunity for IT professionals than any risk of job displacement. Those who openly embrace the new ML paradigms will find a multi-decade opportunity, professionally and organizationally, as the line of demarcation between outstanding and average people and enterprises will be drawn by principles grounded in machine learning.

Sign up for the free insideBIGDATA newsletter.
