How Automation is Solving the Data Capacity Crisis


In this special guest feature, Matthew Gould, Chief Strategy Officer and Co-Founder of Arria NLG, discusses how artificial intelligence that naturally generates language from data is providing solutions to issues facing our society, such as how better use of data can improve the well-being of millions of hospital patients, or how governments can utilize data to better govern their citizens. Arria NLG is a leader in the development and deployment of Natural Language Generation (NLG) software technologies. Mr. Gould has worked in the IT and communications industry for the last 20 years, holding senior positions with the Daily Mail Group in the UK, NEC in Japan, and Hewlett Packard in Sydney and the USA. Mr. Gould has degrees in Literary Theory and Theology from Canterbury University, New Zealand, and an MBA from the Advanced Business Programme at Otago University, Dunedin, New Zealand.

Data is being generated faster than any one person's or company's ability to analyze it. Globally, we're being overwhelmed by data – from the everyday person trying to manage their health to international corporations trying to manage business decisions and strategy. The problem lies in the shortage of experts who can analyze, unlock, and communicate its value.

Gartner forecasts that 6.4 billion connected things will be in use worldwide in 2016, up 30 percent from 2015, and that the number will reach 20.8 billion by 2020. By 2020, about 1.7 megabytes of new information will be created every second for every person on earth. Yet less than 0.5% of all data is ever analyzed.

Much of this problem stems from the shortage of experts who have the capabilities to examine data. According to McKinsey & Co., the United States alone faces a shortage of 140,000 to 190,000 people with analytical expertise and 1.5 million managers and analysts with the skills to understand and make decisions based on the analysis of big data. While this problem is widely accepted, the simple fact is that data is being created faster than we could ever create jobs to analyze it.

But automation and artificial intelligence may have part of the answer – translating vast amounts of data into concise narratives or reports. Natural language generation (NLG) platforms automate the analysis of data and then communicate key insights in real time, simulating the way one person communicates with another via natural language.

NLG platforms learn, and can be taught, what is important about any given data set and how to analyze it correctly for a specific set of circumstances. In real time, the technology absorbs vast sets of unstructured data from multiple sources, analyzes it, and draws conclusions from it. It can then automatically communicate those conclusions in a compelling narrative that could have been written by an industry expert.
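To make the idea concrete, here is a minimal, hypothetical sketch of data-to-text generation in Python: compute a few statistics over a data set, apply a rule a domain expert might supply, and render the conclusions as a short narrative. The function name, the threshold, and the sample readings are illustrative assumptions, not part of Arria's actual platform.

```python
# Minimal illustration of the idea behind data-to-text generation:
# derive simple statistics from raw data, apply an expert rule, and
# express the conclusions as a short natural-language summary.
# Hypothetical sketch only, not Arria NLG's actual technology.

from statistics import mean

def summarize_readings(readings, threshold=75.0):
    """Turn raw numeric readings into a short narrative summary."""
    avg = mean(readings)
    peak = max(readings)
    trend = "rising" if readings[-1] > readings[0] else "falling or flat"

    sentences = [
        f"The average reading over the period was {avg:.1f}, with a peak of {peak:.1f}.",
        f"The overall trend is {trend}.",
    ]
    # Rule captured from a (hypothetical) domain expert: flag sustained highs.
    if avg > threshold:
        sentences.append(
            f"Readings are running above the {threshold:.0f} threshold; review is recommended."
        )
    return " ".join(sentences)

if __name__ == "__main__":
    print(summarize_readings([70.2, 73.8, 78.1, 81.5, 80.9]))
```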

NLG platforms are trained to think and act like experts, learning how to evaluate and analyze data in order to generate insightful reports. Like a human analyst, the platform can learn what information is important and then tailor the language to the audience. Once the experts' analytical skills and expertise have been captured in the platform's algorithms, workers are liberated from having to spend their days analyzing data. The best and brightest employees are free to do what they are actually trained to do – engineers to build, doctors to heal, scientists to discover.

Take financial services as an example. Financial advisors spend significant time interpreting data to provide their investors with easy access to up-to-date information about their assets and investments. NLG technology can digest massive amounts of financial data and generate tailored reports that explain each portfolio's performance. The financial advisor can then review, assess, and share the automatically generated reports as desired, or schedule automatic delivery to a client with no intervention, saving hours of analytical work. These advisors are freed from spending their days reporting past performance and can instead spend their time researching investment strategies and providing clients with insights and advice.
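As a rough illustration of that kind of automated commentary, the hypothetical sketch below turns a simple set of portfolio holdings into a one-paragraph performance narrative. The portfolio_commentary function, its input format, and the sample figures are assumptions for demonstration; a production NLG platform would work from far richer data and far more sophisticated language generation.

```python
# Hypothetical sketch of automated portfolio commentary, assuming a simple
# mapping of asset name -> (start_value, end_value) for the reporting period.

def portfolio_commentary(client, holdings):
    """Generate a short performance narrative for a client's portfolio."""
    total_start = sum(start for start, _ in holdings.values())
    total_end = sum(end for _, end in holdings.values())
    total_return = (total_end - total_start) / total_start * 100

    # Per-asset percentage returns, used to pick the best and worst performers.
    returns = {
        name: (end - start) / start * 100 for name, (start, end) in holdings.items()
    }
    best = max(returns, key=returns.get)
    worst = min(returns, key=returns.get)

    direction = "gained" if total_return >= 0 else "lost"
    return (
        f"{client}, your portfolio {direction} {abs(total_return):.1f}% this period. "
        f"{best} was the strongest contributor at {returns[best]:+.1f}%, "
        f"while {worst} lagged at {returns[worst]:+.1f}%."
    )

if __name__ == "__main__":
    sample = {
        "Global Equity Fund": (50_000, 53_500),
        "Bond Index Fund": (30_000, 29_400),
        "REIT Fund": (20_000, 21_000),
    }
    print(portfolio_commentary("Alex", sample))
```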

The financial sector isn't alone: industries such as healthcare, utilities, and marketing also need solutions to help them navigate the vast amounts of data being generated daily, if not constantly. The only way this volume of data will ever be analyzed and ultimately leveraged in business decisions is through automation. But we need to go beyond robots that churn out templated reports; we need robots that understand the data. Sophisticated NLG platforms can construct fluent narratives from idiosyncratic data sets and generate insightful analyses, ultimately transforming the very way we use and interpret data.
