Transforming Big Data into Meaningful Insights


In this special guest feature, Marc Alacqua, CEO and founding partner of Signafire, discusses a useful approach to data – known as data fusion – which is essentially alchemy-squared, turning not just one but multiple raw materials into something greater than the sum of their parts. It goes beyond older methods of big data analysis, like data integration, in which large data sets are simply thrown together in one environment. In this new science of data fusion, technology is deployed not just to mash together billions of data records, but to fundamentally transform them so that humans can understand the unseen commonalities or inconsistencies within them. Marc is a decorated combat veteran of the U.S. Army Special Operations Forces. For his service during Operation Iraqi Freedom, he was cited for “exceptionally conspicuous gallantry” and awarded two Bronze Star Medals and the Army Commendation Medal for Valor. A 20-year veteran and Lieutenant Colonel, Marc has extensive command experience in both combat and peacetime, having commanded airborne and light infantry as well as special operations units.

“Big Data” is all around us – it’s even on TV. Hulu fans most recently discovered this as they watched The Looming Tower. On the show, we helplessly watch as the FBI and the CIA fail to share data about the impending 9/11 attacks. Their inability to break down information silos allows obvious clues to become buried in a sea of unrelated data. This scenario is hardly unique. In fact, it resonates in many of the most infamous civil and corporate disasters. British Petroleum’s Deepwater Horizon rig explosion, Enron’s collapse, the Takata airbag recall: each of these disasters began with siloed data, the puzzle pieces of which – if properly pieced together – might have revealed the problem patterns leading to the event.

Take Deepwater Horizon, for instance. When one of BP’s largest oil rigs suddenly exploded, resulting in a massive oil spill into the Gulf of Mexico, the event itself was actually the culmination of dozens of ignored warnings, worried messages, buried reports, and seemingly unrelated signals. BP and Transocean had countless individual data points available – from emailed warnings to bypassed alarm systems – that, if pieced together, might have raised the red flags needed to avert disaster.

So what happened? The problem is certainly not a lack of data. Indeed, companies like BP are operating in the greatest era of data abundance. But Big Data is only that – copious, often-isolated recordings of fact that are only as good as the ways in which we review and analyze them. In this case, the emailed warnings meant nothing to the bypassed alarm systems, and the people responsible for overseeing one set of data had no way of piecing together the whole problem without seeing the other set. It takes joining massive amounts of disparate data to uncover the patterns and risks within.

This approach to data – known as data fusion – is essentially alchemy-squared, turning not just one but multiple raw materials into something greater than the sum of their parts. It goes beyond older methods of big data analysis, like data integration, in which large data sets are simply thrown together in one environment. In this new science of data fusion, technology is deployed not just to mash together billions of data records, but to fundamentally transform them so that humans can understand the unseen commonalities or inconsistencies within them. Fusion breaks down traditional silos, allowing analysts to search for and corroborate theories quickly, at a scale and speed previously unthinkable.
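To make the distinction concrete, here is a minimal sketch of the fusion idea in Python. All of the data, field names, and the matching rule are hypothetical, invented purely for illustration: rather than simply stacking two silos of records side by side (integration), the code links records that concern the same asset within a short time window, surfacing a correlation neither silo reveals on its own. Real fusion systems use far more sophisticated entity resolution than this toy time-and-ID match.

```python
from datetime import datetime, timedelta

# Hypothetical records from two silos: emailed warnings and alarm-system logs.
email_warnings = [
    {"asset": "rig-7", "time": datetime(2010, 4, 18, 9, 30),
     "text": "pressure anomaly reported"},
    {"asset": "rig-7", "time": datetime(2010, 4, 19, 14, 0),
     "text": "alarm bypass requested"},
]
alarm_events = [
    {"asset": "rig-7", "time": datetime(2010, 4, 19, 14, 5),
     "event": "alarm bypassed"},
    {"asset": "rig-2", "time": datetime(2010, 4, 19, 16, 0),
     "event": "routine test"},
]

def fuse(warnings, alarms, window=timedelta(hours=1)):
    """Link records from the two silos that mention the same asset
    within a short time window -- a toy stand-in for entity resolution."""
    matches = []
    for w in warnings:
        for a in alarms:
            if w["asset"] == a["asset"] and abs(w["time"] - a["time"]) <= window:
                matches.append((w["asset"], w["text"], a["event"]))
    return matches

# Only the fused view shows the warning and the alarm event are one story.
for asset, text, event in fuse(email_warnings, alarm_events):
    print(f"{asset}: '{text}' correlates with '{event}'")
```

The point of the sketch is the `fuse` step: each silo looks unremarkable in isolation, and the pattern only appears once records are transformed into a common shape and joined.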

Two of my colleagues were members of one of the very first fusion intelligence cells formed following 9/11, in which analysts sat with intel officers to sift troop reports, interrogation transcripts and more – all sharing disparate data in the same place for the first time. Their work was responsible for targeting and apprehending some of the most wanted terrorists in the world. But while these techniques were initially designed for the military and remained out of reach for private-sector companies, recent advancements in machine learning and AI-based computing are now putting this technology at our fingertips.

Automobile manufacturers, for instance, are beginning to use data fusion to identify safety problems in global fleets of vehicles, making it easier to get potentially dangerous vehicles off the roads quickly. Insurance companies are using fusion to assess the risk in underwriting certain projects and products. Financial services companies are using fused data to make better predictions about complex market trends.

But adoption outside these industries has been slow, stymied by the same infighting and siloed corporate culture that intelligence agencies faced in the lead-up to 9/11. So how can companies get around this? First, they need to elevate IT leads to C-suite level, with distinct roles such as the Chief Data Officer. These people are uniquely suited to knock down silos and demonstrate how their approach to data fusion will yield measurable results. Then, they need to develop a new operational way of working, showing that data, not human intuition, should be the primary lever behind major company decisions.

For the military, this way of working has now become table stakes. But as The Looming Tower demonstrates, this was not always the case. It took a major event to highlight the importance of data fusion, and significant effort on the parts of key people and agencies to implement the technology in an impactful way. The time has come for private sector companies to do the same.

 

Sign up for the free insideBIGDATA newsletter.
