Too Integrated to Fail, But No Longer Competent to Deliver


In this special guest feature, Ashish Gupta of Actian discusses how legacy solutions were not architected to handle today’s requirements, yet with traditional systems so pervasive (and so much having been invested in them), organizations are reluctant to rip and replace. Ashish is CMO and SVP, Business Development at Actian, bringing more than 21 years of experience at enterprise software companies, where he focused on creating go-to-market approaches that scale rapidly and on building product portfolios that became category leaders in the industry. He holds an M.B.A. from UCLA and a B.A. in Economics and Computer Science from Grinnell College.

Teradata’s reported Q2 losses were heavy, at $265 million, and although it was the company’s first quarterly loss since going public, it came on the heels of a disappointing Q1. Revenues were down 8 percent from the previous year as an increasing number of customers balked at making large purchases from the company. As a result, Teradata is fast becoming the poster child for a crop of traditional vendors that simply can’t keep up with today’s analytic workloads.

Data has become pervasive, and mining timely insights from it has become mandatory for successful companies. Gone are the days when reports built on six-month-old data, or simple alerts, would suffice. Customers are looking for analytics-enabled business processes that use insights to make business decisions in a contextual and timely manner. Legacy solutions are breaking under the pressure of these new workloads: they simply can’t handle the increasing flow of data within their static frameworks, falling behind and leaving batches of data underutilized. Enterprise big data implementations are estimated to be a $38.4 billion market this year, yet despite increased IT spending, the majority of stored data is not generating immediate business value.

Given this situation, it should come as no surprise that in a survey of IT decision-makers conducted by Actian, 11 percent of respondents were “not at all satisfied” that their technology investments in reporting, analytics and big data were giving them what they need to achieve their top business priorities. Another 25 percent were only “slightly satisfied” and 41 percent were “moderately satisfied.” That adds up to 77 percent of executives who are unimpressed with their results.

Yet companies are hesitant to rip and replace their floundering legacy solutions, because they have invested so much money in them already and are reluctant to simply write them off as a loss. Additionally, legacy data warehouses are ubiquitous throughout organizations, making them difficult and resource-intensive to remove and replace. Hadoop, on the other hand, gives you the scalability to store the data in what is often fondly called the Data Lake, but customers are finding it is more like a Data Swamp, because Hadoop is still early in its evolution and its tooling for BI and analytics remains immature.

So how do customers get through this conundrum of wanting to use their domain expertise and extend their legacy investments while still getting the benefits of the cost-effective scale of Hadoop environments? Since the majority of applications and BI tools like Tableau or MicroStrategy depend on SQL, I believe that SQL-in-Hadoop offers a way forward. Existing SQL applications and queries no longer have to be rewritten to access Hadoop, and data no longer needs to be moved out of Hadoop for processing. It’s the perfect complement to legacy deployments.
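To make that concrete, here is a minimal sketch, in Python, of what “no rewrite” looks like in practice. It assumes a SQL-on-Hadoop engine exposing a standard ODBC endpoint; the DSN name HadoopSQL and the trades table are hypothetical stand-ins, and the query itself is the same ordinary SQL a BI tool or in-house application would already be sending to a legacy warehouse.

# Minimal sketch: an existing SQL query run unchanged against a
# SQL-on-Hadoop engine over ODBC. Only the connection string points
# somewhere new; the SQL itself is untouched.
# Assumption: the DSN "HadoopSQL" and the "trades" table are hypothetical.
import pyodbc

# Point the standard ODBC driver at the SQL-on-Hadoop endpoint
# instead of the legacy warehouse.
conn = pyodbc.connect("DSN=HadoopSQL", autocommit=True)
cursor = conn.cursor()

# An unmodified analytic query: total traded notional per desk per day.
cursor.execute("""
    SELECT desk, trade_date, SUM(notional) AS total_notional
    FROM trades
    GROUP BY desk, trade_date
    ORDER BY trade_date DESC
""")

for desk, trade_date, total_notional in cursor.fetchall():
    print(desk, trade_date, total_notional)

conn.close()

Because the engine speaks standard SQL over a standard driver, the swap is a configuration change rather than a migration project, which is the whole appeal.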

For example, we worked with a leading bank that found that by leveraging Hadoop it was able to process twenty times more data than was possible with its previous database. The bank could now query 200 billion rows of data and invest a $100 billion float in just 28 seconds, as opposed to the three hours it once took, while instituting multi-level risk exposure analysis for controls and regulatory compliance across the organization.
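The bank’s actual implementation isn’t described here, but to give a flavor of the kind of workload involved, here is a hypothetical sketch of one level of such a risk-exposure rollup, expressed as plain SQL pushed down into Hadoop over the same kind of ODBC connection as above. Every name in it (the HadoopSQL DSN, the positions table, its columns, and the threshold) is invented for illustration.

# Hypothetical illustration only: one level of a multi-level
# risk-exposure rollup as ordinary SQL pushed down into Hadoop.
# All names (DSN, table, columns, threshold) are invented.
import pyodbc

conn = pyodbc.connect("DSN=HadoopSQL", autocommit=True)
cursor = conn.cursor()

# Aggregate risk-weighted exposure per counterparty and instrument
# type, surfacing anything above an illustrative threshold.
cursor.execute("""
    SELECT counterparty, instrument_type,
           SUM(notional * risk_weight) AS weighted_exposure
    FROM positions
    GROUP BY counterparty, instrument_type
    HAVING SUM(notional * risk_weight) > 1000000
    ORDER BY weighted_exposure DESC
""")

for row in cursor.fetchall():
    print(row.counterparty, row.instrument_type, row.weighted_exposure)

conn.close()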

Data’s exponential expansion may seem overwhelming, but options exist to address it without blowing your budget. There is no magic bullet, but ignoring the problem will only make it worse. Don’t limit what your data can do for you.

 
