Making Big Data Small


In this special guest feature, Gabe Batstone, Co-founder and CEO of contextere, suggests that rather than providing workers with as much data as possible, it may be better to make Big Data small. This means harnessing the power of Artificial Intelligence (AI) to generate the actionable insights a worker needs in that moment to complete the task at hand. Making Big Data small, keeping it both consumable and actionable, will empower workers to be more productive and safer, and will reduce equipment downtime. Gabe brings to the company two decades of experience implementing emerging technology across multiple industries on six continents, including innovations in digital oilfields, intelligent vehicles, smart cities, future soldier programs and augmented reality for the enterprise. Gabe has worked for market leaders in the aerospace, defense, automotive and energy markets, including NAVTEQ, CAE and NGRAIN. He holds a Bachelor of Applied Arts, specializing in Geographic Information Systems, from Ryerson University in Toronto and a Master of Business Administration from the University of Baltimore.

We’re witnessing an incredible evolution where the advent of the Internet of Things (IoT) and the proliferation of sensors are creating more data than we could have previously imagined – big data is the new norm. This trend is expected to continue as sensors become smaller and more affordable. It will soon be possible to integrate them with more devices and gather previously unattainable data.

In theory, this will enable us to unlock new insights, expand business potential, improve efficiencies and provide workers with contextually-relevant information. In reality, however, if organizations don’t have the proper mechanisms and systems in place to utilize the data, collecting more of it won’t necessarily lead to the desired outcome. And, even when we have the right systems in place, having too much data can often lead to a multitude of dashboards and excessive analytics.

The ubiquity of big data has, unintentionally, made work more time-consuming and confusing, and it can even lead to inaction. How can we free ourselves from the analysis-paralysis cycle we’ve created? The answer: by making big data small.

Why Is Having ‘Small Data’ Important?

There is no doubt that dashboards and analytics can reveal novel trends and insights, effectively predicting future outcomes and answering the question “So What?”. This enables companies to predict when a piece of equipment will require maintenance and helps them avoid costly, unanticipated downtime. At the same time, this type of information is less useful when it comes to execution. The employee who needs to fix a piece of equipment often finds themselves overwhelmed, sifting through data, dashboards and analytics that may or may not be relevant. This type of activity can result in workers spending over two-thirds of their time on non-productive activities.
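
To make the predictive maintenance example concrete, here is a minimal sketch in Python. The sensor features, failure rule and synthetic data are assumptions for illustration only, not drawn from any particular system: a classifier trained on historical readings produces the dashboard-style signal described above.

```python
# Illustrative sketch only: the features, thresholds and synthetic data below
# are assumptions, not drawn from any real equipment or vendor system.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic history: vibration (mm/s), temperature (°C), hours since last service.
X = np.column_stack([
    rng.normal(3.0, 1.0, 5000),   # vibration
    rng.normal(60.0, 8.0, 5000),  # temperature
    rng.uniform(0, 2000, 5000),   # hours since last service
])
# Toy failure rule: hot, vibrating, long-unserviced machines fail more often.
risk = 0.02 * X[:, 0] + 0.01 * (X[:, 1] - 60) + 0.0005 * X[:, 2]
y = (risk + rng.normal(0, 0.2, 5000) > 1.0).astype(int)  # 1 = failed within a week

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# The "So What?": a risk score for one machine's latest readings.
latest = np.array([[5.1, 78.0, 1650.0]])
print(f"Failure risk this week: {model.predict_proba(latest)[0, 1]:.0%}")
```

A score like this answers “So What?” for the planner, but it still leaves the technician to work out “Now What?”.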

To empower employees who are focused on execution, such as blue-collar workers, we need to answer the question “Now What?”. We can achieve this by making big data small, ensuring the information is more easily digestible and contextually relevant to the task at hand. Put simply, by making big data small, we’re further distilling and curating the insights big data can provide to create actionable intelligence.

How Do You Make Big Data Small?

This is where artificial intelligence (AI) plays a pivotal role. Although big data can lead to inaction for humans, it fuels AI. Programming or ‘training’ AI algorithms typically requires an enormous amount of data; “the more information there is to process, the more data the system is given, the more it learns and ultimately the more accurate it becomes.”[1] This relationship between big data and AI is one of the reasons why AI has exploded in recent years.
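
As a quick, hedged illustration of that relationship (the dataset and model below are arbitrary choices bundled with scikit-learn, not anything specific to IoT), the same classifier trained on progressively more examples generally scores higher on held-out data:

```python
# Illustrative sketch: the same model, given more training data, tends to
# become more accurate. The dataset and model choices are assumptions.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for n in (50, 200, 800, len(X_train)):
    model = LogisticRegression(max_iter=2000).fit(X_train[:n], y_train[:n])
    print(f"trained on {n:4d} examples -> test accuracy {model.score(X_test, y_test):.2f}")
```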

Today’s IoT produces the data required for AI-powered applications.

By harnessing the power of AI, we’re able to curate big data and determine what contextually-relevant information workers may need at any given moment. AI enables workers to break the analysis-paralysis cycle and effectively execute on the insights derived from big data. To be clear, when referring to big data, this is not limited to data produced by the IoT. AI can ingest IoT, enterprise and contextual data, among others, to provide real-time actionable intelligence. It’s this intelligence that I classify as “small data.”
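
What might that curation step look like in practice? Below is a minimal sketch; the data sources, field names and thresholds are hypothetical assumptions, not contextere’s actual pipeline. Several streams are reduced to at most one “Now What?” instruction for the worker:

```python
# Hypothetical sketch: distilling IoT, enterprise and contextual data into one
# actionable instruction ("small data"). All names and thresholds are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Insight:
    asset: str
    action: str
    reason: str

def curate(iot: dict, enterprise: dict, context: dict) -> Optional[Insight]:
    """Return at most one instruction relevant to this worker, right now."""
    # IoT data: only react to readings outside the normal operating band.
    if iot["vibration_mm_s"] <= iot["vibration_limit"]:
        return None
    # Enterprise data: skip assets already covered by an open work order.
    if enterprise["open_work_order"]:
        return None
    # Contextual data: only surface the insight to a qualified worker on site.
    if context["worker_role"] != "technician" or not context["on_site"]:
        return None
    return Insight(
        asset=iot["asset_id"],
        action="Inspect pump bearing and log vibration reading",
        reason=f'vibration {iot["vibration_mm_s"]} mm/s exceeds limit of {iot["vibration_limit"]}',
    )

insight = curate(
    iot={"asset_id": "PUMP-07", "vibration_mm_s": 9.4, "vibration_limit": 7.1},
    enterprise={"open_work_order": False},
    context={"worker_role": "technician", "on_site": True},
)
if insight:
    print(f"{insight.asset}: {insight.action} ({insight.reason})")
```

Everything the worker sees is a single line of instruction; the dashboards stay upstream.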

Where Do Humans Come In?

Though AI can provide recommendations for a range of tasks, these are generally narrow and specific. It cannot easily replace the hands-on activities, curiosity and judgment of humans. Instead, we should think about AI working in partnership with humans. For example, when we feed big data into our AI algorithms to arrive at a small, curated piece of intelligence that humans can easily act on, we’re augmenting the human. In this way, AI can be used to enhance our minute-to-minute decisions and judgment, increase productivity, and free up our time to take on more complex tasks.

Ultimately, I believe that to make big data actionable, we need to harness the power of AI and keep big data small. Otherwise, we’re only training humans to be good at tasks in which computers already excel. Instead, we should be giving employees the right tools and encouraging them to hone their uniquely human skills: curiosity, creativity and judgment.

[1] Bernard Marr, “Why AI Would Be Nothing Without Big Data,” Forbes, June 9, 2017. https://www.forbes.com/sites/bernardmarr/2017/06/09/why-ai-would-be-nothing-without-big-data/#49eef6ff4f6d

 
