Transforming Big Data into Meaningful Insights (Posted on Jul 19, 2018)

In this special guest feature, Marc Alacqua, CEO and founding partner of Signafire, discusses a useful approach to data – known as data fusion – which is essentially alchemy-squared, turning not just one but multiple raw materials into something greater than the sum of their parts. It goes beyond older methods of big data analysis, like data integration, in which large data sets are simply thrown together in one environment. In this new science of data fusion, technology is deployed not just to mash together billions of data records, but to fundamentally transform them so that humans can understand the unseen commonalities or inconsistencies within them. Marc is a decorated combat veteran of the U.S. Army Special Operations Forces. For his service during Operation Iraqi Freedom, he was cited for “exceptionally conspicuous gallantry” and awarded two Bronze Star Medals and the Army Commendation Medal for Valor. A 20-year veteran and Lieutenant Colonel, Marc has extensive command experience in both combat and peacetime, having commanded airborne and light infantry as well as special operations units.

“Big Data” is all around us – it’s even on TV. Hulu viewers recently discovered this as they watched The Looming Tower. On the show, we watch helplessly as the FBI and the CIA fail to share data about the impending 9/11 attacks. Their inability to break down information silos allows obvious clues to become buried in a sea of unrelated data. This scenario is hardly unique. In fact, it recurs in many of the most infamous civil and corporate disasters. British Petroleum’s Deepwater Horizon rig explosion, Enron’s collapse, the Takata airbag recall: each of these disasters began with siloed data, the puzzle pieces of which – if properly fitted together – might have revealed the problem patterns leading to the event.

Take Deepwater Horizon, for instance. When one of BP’s largest oil rigs suddenly exploded, resulting in a massive oil spill into the Gulf of Mexico, the event itself was actually the culmination of dozens of ignored warnings, worried messages, buried reports, and seemingly unrelated signals. BP and Transocean had scores of individual data points available – from emailed warnings to bypassed alarm systems – that, if pieced together, might have raised the red flags needed to avert disaster.

So what happened? The problem is certainly not a lack of data. Indeed, companies like BP are operating in the greatest era of data abundance in history. But Big Data is only that – copious, often-isolated recordings of fact that are only as good as the ways in which we review and analyze them. In this case, the emailed warnings were never connected to the records of bypassed alarm systems, and the people responsible for one set of data had no way of piecing together the whole problem without seeing the other. It takes joining massive amounts of disparate data to uncover the patterns and risks within.

This approach to data – known as data fusion – is essentially alchemy-squared, turning not just one but multiple raw materials into something greater than the sum of their parts. It goes beyond older methods of big data analysis, like data integration, in which large data sets are simply thrown together in one environment. In this new science of data fusion, technology is deployed not just to mash together billions of data records, but to fundamentally transform them so that humans can understand the unseen commonalities or inconsistencies within them. Fusion breaks down traditional silos, allowing analysts to search for and corroborate theories quickly, at a scale and speed previously unthinkable.
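The distinction the article draws between integration and fusion can be made concrete with a small sketch. The example below is purely illustrative – the records, field names, and normalization rule are all hypothetical, and real fusion platforms use far more sophisticated entity resolution – but it shows the core idea: integration merely co-locates records, while fusion transforms them into a shared form so that signals about the same entity line up.

```python
# Hypothetical records from two silos: an email-warning log and an
# alarm-system log. Note the same rig is named three different ways.
emails = [
    {"source": "email", "rig": "Rig A", "ts": "2010-04-01", "note": "pressure anomaly"},
    {"source": "email", "rig": "rig-a", "ts": "2010-04-10", "note": "cement concerns"},
]
alarms = [
    {"source": "alarm", "unit": "RIG A", "ts": "2010-04-12", "event": "alarm bypassed"},
]

# Data integration: dump everything into one environment. The records
# coexist, but "Rig A", "rig-a", and "RIG A" remain unconnected.
integrated = emails + alarms

def normalize(name):
    """Toy entity-resolution rule: lowercase and unify separators."""
    return name.lower().replace("-", " ").strip()

# Data fusion: transform records into a shared schema and resolve
# entities, so every signal about the same rig lands in one place.
fused = {}
for rec in integrated:
    key = normalize(rec.get("rig") or rec.get("unit"))
    fused.setdefault(key, []).append(
        {"source": rec["source"], "ts": rec["ts"],
         "signal": rec.get("note") or rec.get("event")}
    )

# Sort each rig's combined timeline so the pattern of warnings is visible.
for signals in fused.values():
    signals.sort(key=lambda s: s["ts"])
```

After fusion, the three previously siloed records collapse into a single timeline for one rig, which is exactly the kind of cross-silo pattern the article argues would have raised red flags.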