
An Escape Plan for Modern Businesses Trapped in the Data Dungeon

In this special guest feature, Matt Glickman, VP of Product Management at Snowflake Computing, discusses the tell-tale signs that a DBA is caught in a data dungeon — such as data project failures and limiting data loading to off-peak times to avoid query performance degradation — and how businesses can escape. Matt is an industry leader in enterprise data platforms. For over 20 years, he led the development of business-critical analytics platforms at Goldman Sachs. As a Managing Director, he was responsible for the entire data platform of Goldman’s $1 trillion asset management business. Matt also co-led Goldman’s risk management platform where his team built Goldman’s first, company-wide data warehouse that helped it navigate the 2008 financial crisis. Matt holds a bachelor’s degree in Computer Science and Applied Mathematics from Binghamton University.

Data is the driving force of today’s economy. More data is generated every four hours than existed globally only 15 years ago, according to figures in a recent IDC report, and some companies are aggregating hundreds of terabytes each month.

It is therefore perplexing that the incumbent giants of data warehousing, custodians of the very information that drives the digital economy, under-deliver on their services and charge so much that they effectively hold clients' data hostage. These services, instrumental in building the technological world as we know it, are rusty at the hinges, creaking through queries at frustratingly slow rates. As a result, modern businesses are increasingly forced to halt operations while their data teams run the numbers.

Traditional data warehousing models cannot keep up with demands for flexibility and exponential growth – data is pumped into reservoirs of fixed capacity and later piped back out to the business. This fixed pipeline between your enterprise and the warehouse is the main bottleneck, but not the only one.

Legacy data warehouse services cannot sustain the demands of a modern enterprise: data is effectively vaulted up and rendered useless by the inflexibility of the system. In real terms this is reflected in the astounding 88% of organizations that have experienced failed data initiatives.

In these data dungeons, loading can only be done "off-peak" so as not to degrade query performance. But a global enterprise has no off-peak hours. Such organizations compromise by over-provisioning for every eventuality and begrudgingly paying hefty fees for underused resources.
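The over-provisioning trade-off can be made concrete with a toy cost calculation. This sketch is purely illustrative – the demand profile and the per-unit rate are assumed numbers, not figures from any vendor – but it shows why paying for peak capacity around the clock dwarfs paying only for what each hour actually uses:

```python
# Hypothetical illustration: fixed provisioning for peak demand
# vs. elastic scaling to match actual demand.
# All figures (demand profile, $2/unit-hour rate) are assumptions.

RATE = 2.0  # dollars per compute-unit-hour (assumed)

# Assumed hourly demand (compute units) over one day: quiet overnight,
# an early batch-load window, steady daytime queries, and an afternoon peak.
demand = [2] * 6 + [20] * 2 + [8] * 8 + [30] * 2 + [8] * 4 + [2] * 2

peak = max(demand)

# Fixed model: provision for the peak and pay for it all 24 hours.
fixed_cost = peak * len(demand) * RATE

# Elastic model: pay only for the capacity each hour actually consumes.
elastic_cost = sum(demand) * RATE

utilization = sum(demand) / (peak * len(demand))
print(f"fixed: ${fixed_cost:.0f}, elastic: ${elastic_cost:.0f}, "
      f"fixed-capacity utilization: {utilization:.0%}")
```

Under these made-up numbers the fixed-capacity warehouse sits below 30% utilization yet costs more than three times as much – the "hefty fees for underused resources" in miniature.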

Within traditional data warehousing schemes, data is scattered across thousands of data silos, follows no logical distribution and once placed is virtually impossible to move. From an analytics perspective, this architecture is downright unsustainable.

Data warehousing has reached a breaking point, where the information is tied up in processing for hours, days or even weeks. The architecture we depend upon came into being at a time of disk-driven data, designed as an ugly hybrid of the physical and digital realms. It was not designed for the volume of data of a modern enterprise, and it cannot support the concurrent processing now demanded by its users.

Vendors of conventional data warehouses have refused to innovate, and it has taken too long for customers to realize the effect this is having on their performance. What is needed is a true cloud warehouse to bring an end to this scandal of inflated costs and under-delivered service.

You are not asking too much when you call for a solution capable of tackling all the data sets, all the time – ideally in under an hour. No modern business can afford to wait a week for a number-crunch. Data warehouses should be scalable, simple and flexible, not clad in iron at the bottom of a dark data dungeon.

 
