Actian Survey Shows Three out of Four Execs Not Fully Satisfied with Big Data and Hadoop Initiatives


HADOOP SUMMIT 2015 NEWS

Big data has been a big problem for many early adopters and has not lived up to its promise, according to a survey of more than 100 C-level executives, data scientists and IT leaders. Data is regarded as a company’s most important asset (61 percent of respondents), yet 77 percent of those surveyed reported their big data and analytics deployments are failing to live up to their expectations – indicating a huge unmet need for better analytic tooling to enable companies to glean the full value of their data.

“We are living in a very disruptive time when organizations are using data to outcompete and create new revenue and customer pathways. Actian commissioned this survey to capture the sentiments of top organizations involved in today’s data-led technology evolution and to understand what separates the winners from the losers,” said Ashish Gupta, CMO and senior vice president of business development for Actian. “The survey results echoed what we’ve been hearing directly from organizations of various sizes – painful trial and error has revealed that traditional database technologies are failing to deliver on analytical workloads, so they have turned to Hadoop for help. The problem is, while Hadoop is a very cost-effective place to store massive amounts of data, most are finding it’s too immature to manage the enterprise-grade, high-performance analytics jobs needed to get ahead and stay ahead.”

The global survey, sponsored by Actian Corporation, was conducted in May 2015 and represents the views of 106 C-level executives, IT leaders and data scientists spanning more than 25 industries, including aerospace/defense, automotive, banking/financial services/insurance, education, healthcare, technology and telecommunications.

Eighty-one percent of respondents reported that data analytics-driven business growth is their number-one priority for the next twelve months, followed by gaining customer knowledge and insight (58 percent). Respondents also cited accuracy, business alignment and speed as key to driving real insights from their data analytics initiatives. Enterprises depend on analytical insights to help drive revenue, get closer to their customers and outpace the competition. To put data to work for them, they need supporting technology that delivers those insights quickly and accurately.

Fifty-one percent of respondents claim that Hadoop could make existing data analytics operations more efficient, yet only five percent of the IT pros surveyed are asking for Hadoop. Why? Skills gaps are a limiting factor. Organizations want to use what they know – mostly SQL – to break into Hadoop. When asked which statements best summarize their views of Hadoop, one-third of the data scientists surveyed said Hadoop provides a cost-effective, scalable way to store massive amounts of data. But, nearly as many said Hadoop is hard to use and requires talent not possessed by the organization (20 percent), and Hadoop needs tooling to make it more enterprise-grade, secure and fast (20 percent). When asked what they’d change about Hadoop today, the top response (40 percent) was to have access to Hadoop data via SQL and BI tools.

“As with any technology advancement, Hadoop has areas for improvement. Many organizations have invested time and resources into making it work for them because they know that their current way of managing analytical workloads won’t cut it,” said Gupta. “More than 25 percent of CEOs surveyed said a failed big data project is a ‘fireable offense’ for their CTO or CIO, putting immense pressure on top IT leaders to deliver on the promise of big data. So, how are the IT leaders within an organization making Hadoop analytics work? They’re depending heavily on tried-and-true tools like SQL for gleaning insights.”

View more results from the survey by clicking HERE.

Comments

  1. There is some synergy here with integration projects; maybe the beast is too big to tackle. Supplemental tooling may be an answer to take the complexity away, or at least reduce it to a reasonable level.