The Business Prerogative for Speed in Big Data


In this special guest feature, Mathias Golombek of EXASOL discusses the need for speed in big data computing architectures and how in-memory computing solutions can fit the bill. Mathias Golombek is CTO of EXASOL, developer of an innovative in-memory database solution that helps companies run their businesses faster and smarter.

What business do you know that has built a lasting foundation on being slow? Not many, as just about everyone can tell a story of being frustrated by slow customer service. With analytics and data, slowness is the kiss of death. With the growing volumes of data that businesses face today, the pressure to deliver excellent customer experiences and timely business decisions is greater than ever. As any company evaluates its technology investments, it becomes critical to assess the performance benchmarks of these solutions, and validation from independent authorities is one of the best ways to do this. Since speed is one of the main competitive advantages in business, companies need to know what performance to expect from their technology investments so that they don't get left in the dust.

Third-party, industry-association benchmark organizations like the Transaction Processing Performance Council serve as validators in the analytics and data industry, helping companies accurately evaluate just what it is they are buying. They provide a fair, agreed-upon assessment of a technology's performance based on a standard set of criteria for comparison. Without these standard criteria, comparisons between different technology vendors devolve into he-said, she-said arguments. How can customers ever see through to the truth of the matter?

In September 2014, EXASOL, an in-memory analytic database vendor, released third-party audited results showing that it is the only database company ever to benchmark an in-memory system at 100TB. In addition, the system delivered a 140-fold performance increase over the nearest competitor. Its EXASolution database also broke every single performance record from 100GB to 100TB of data. These points provide hard data that companies can rely on when making critical technology buying decisions.

The Transaction Processing Performance Council's TPC-H benchmark is an important benchmark for decision-support systems, based on a suite of business-oriented ad-hoc queries and concurrent data modifications. It provides a relative performance comparison of how decision-support systems examine large volumes of data, execute complex queries and answer critical business questions. Validated benchmarks such as TPC-H give companies confidence that vendors are up to par on the solutions they're selling. They are also reliable because every feature of the hardware and software configuration is specified and priced. That's right: the TPC-H benchmark also gives companies an independently audited price/performance comparison on which to base technology purchase decisions.
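To give a concrete sense of what a TPC-H-style decision-support query looks like, here is a minimal, hypothetical sketch in Python: a simplified version of the benchmark's Query 1 "pricing summary report" aggregation, run and timed against an in-memory SQLite table. The table layout, sample rows and date cutoff are illustrative stand-ins only; the audited benchmark uses the full 22-query suite, concurrent refresh streams and far larger, audited data sets.

import sqlite3
import time

# Illustrative only: a tiny, simplified TPC-H Query 1-style aggregation run
# against an in-memory SQLite table. This is NOT the audited benchmark;
# the data and column subset here are hypothetical stand-ins meant to show
# the shape of a decision-support query and how one might time it.

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Minimal subset of the TPC-H lineitem columns used by Query 1.
cur.execute("""
    CREATE TABLE lineitem (
        l_returnflag    TEXT,
        l_linestatus    TEXT,
        l_quantity      REAL,
        l_extendedprice REAL,
        l_discount      REAL,
        l_shipdate      TEXT
    )
""")

# Hypothetical sample rows; a real run loads millions of rows per gigabyte.
cur.executemany(
    "INSERT INTO lineitem VALUES (?, ?, ?, ?, ?, ?)",
    [
        ("R", "F", 17, 24000.0, 0.04, "1998-08-01"),
        ("N", "O", 36, 51000.0, 0.09, "1998-09-15"),
        ("A", "F", 8,  11000.0, 0.10, "1998-07-20"),
    ],
)

# Simplified "pricing summary report": aggregate quantities and discounted
# revenue by return flag and line status up to a shipping-date cutoff.
query = """
    SELECT l_returnflag,
           l_linestatus,
           SUM(l_quantity)                         AS sum_qty,
           SUM(l_extendedprice * (1 - l_discount)) AS sum_disc_price,
           COUNT(*)                                AS count_order
    FROM lineitem
    WHERE l_shipdate <= '1998-09-02'
    GROUP BY l_returnflag, l_linestatus
    ORDER BY l_returnflag, l_linestatus
"""

start = time.perf_counter()
rows = cur.execute(query).fetchall()
elapsed = time.perf_counter() - start

for row in rows:
    print(row)
print(f"Query time: {elapsed:.6f} seconds")

Running the sketch prints one summary row per (return flag, line status) pair along with the elapsed query time; the official benchmark formalizes exactly this kind of measurement, but across the whole query set and with the full hardware and software configuration priced.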

In-memory database solutions, by the nature of their speed, are helping to bridge the gap between business requirements and technology investments. Gartner's recent report, 'Hype Cycle for In-Memory Computing, 2014', found that in-memory database solutions are growing rapidly due to technology maturation and decreasing costs. The report also highlighted that a few high-impact in-memory technologies are already in mainstream use, meaning now is the time to plan your company's in-memory analytics and data strategy.

 

