The Rise and Fall of the Traditional Data Enterprise


Not too long ago, HP, Cisco, Dell, EMC, IBM and Oracle were thought by some to have a very limited future. These giants had helped Internet businesses get established on the strength of thousands of computers, but by the mid-2010s those businesses realized they could neither afford all the hardware and software the giants supplied nor scale competitively on it. The future was going to be in the cloud, which offered cost and infrastructure flexibility. The once-dominant companies would continue to do business, but as tech giants, their time was up.

Today, we stand at an even more dramatic juncture. When we look back at the death knells sounded for Dell, EMC, HP, Cisco and IBM, it is hard not to read a similar future in the tea leaves, this time heralded by companies like Snowflake and Palantir after their wildly successful IPOs, and by Databricks with its highly anticipated public offering.

However, the change that we are seeing today goes far deeper.

The rise of Databricks, Snowflake and Palantir represents a massive and fundamental shift in the tech landscape. We have not seen a shift of this scale since the move from mainframe to client-server in the early 1990s. At that time, IBM with its mainframe business, Digital and the like were on the decline, and on the rise were the new kids: Microsoft, to run all those modern enterprise applications created by SAP and others on desktops, and Intel, to power those desktops.

Today, I believe we are seeing the clear signs of the death of the data enterprise as we have known it.

A Fundamental Shift

Any student of history understands its cyclical nature, and computing history is no different. In the 1990s, the industry shifted away from a centralized computing paradigm (mainframe) towards a decentralized one (client-server). Today, that trend is being reversed with the shift towards the cloud as a centralized solution.

However, a significant difference is that in this newly recentralized computing world, storage is no longer an issue. Services like Amazon S3 and Azure Data Lake Storage have already commoditized storage at scales old-school players like EMC and HP could only have dreamed of.

Processing power is no longer a factor either. AWS has been offering services built on its Graviton chips, designed with technology licensed from Arm, since 2018. Apple's announcement last year that it would gradually transition its Macs to its own silicon, starting with the M1 chip, has continued this trend, leaving one obvious company behind: Intel.

It Goes Far Deeper

The transformation that we are seeing today goes far deeper than infrastructure. In fact, it cuts right to the heart of what the massive shift to the cloud is really all about: better big data analytics driven by AI and machine learning. And how well 2021's rising stars facilitate this is very much defining who will come out on top and who will slowly fade away.

Snowflake built a simplified, easy-to-use data warehouse for the cloud, ideal for smaller structured datasets, and offered it as a turnkey solution. The company rose to prominence and grabbed massive market share from Teradata, HP Vertica, IBM Netezza, Oracle and others, whose products were initially unsuited to the rigors of the cloud and which have not been able to catch up effectively. They are now relegated to the ranks of the has-beens.
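To make the turnkey claim concrete, here is a minimal sketch of what working with Snowflake looks like through its official Python connector: plain SQL against a fully managed warehouse, with no cluster to provision or tune. The account identifier, credentials, warehouse and table names are hypothetical placeholders, not a recommended production setup.

```python
# A minimal sketch of the "turnkey" experience via the
# snowflake-connector-python package. Account, credentials and
# table names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_org-my_account",  # hypothetical account identifier
    user="analyst",
    password="...",               # prefer key-pair auth or SSO in practice
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # No cluster to provision, size or patch: the warehouse scales
    # behind the scenes, which is the heart of the turnkey pitch.
    cur.execute("SELECT region, SUM(revenue) FROM orders GROUP BY region")
    for region, revenue in cur:
        print(region, revenue)
finally:
    conn.close()
```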

Databricks built a massively efficient data lakehouse, designed from the ground up for the cloud and ideal for large-scale semi-structured data. In doing so, it has pushed Cloudera, which bet on a batch-oriented Hadoop product that was far less efficient in the cloud, into the ranks of old news.
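To illustrate what "lakehouse" means in practice, here is a minimal sketch using open-source Apache Spark with Delta Lake, the table format underpinning Databricks' lakehouse: semi-structured JSON landed in object storage is curated into a table with transactions and SQL on top. It assumes the Delta Lake package is on the Spark classpath; the bucket paths and columns are hypothetical.

```python
# A minimal open-source sketch of the lakehouse pattern. Assumes the
# Delta Lake package is on the classpath (e.g. spark-submit
# --packages io.delta:delta-core_2.12:2.4.0); paths are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-sketch")
    .config("spark.sql.extensions",
            "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Schema-on-read: ingest raw JSON events straight from object storage.
events = spark.read.json("s3://my-bucket/raw/events/")

# Curate them as a Delta table: columnar Parquet files plus a
# transaction log, giving warehouse-style reliability on lake storage.
events.write.format("delta").mode("overwrite").save(
    "s3://my-bucket/curated/events")

# The same files are immediately queryable with SQL.
spark.read.format("delta").load(
    "s3://my-bucket/curated/events").createOrReplaceTempView("events")
spark.sql(
    "SELECT event_type, COUNT(*) AS n FROM events GROUP BY event_type"
).show()
```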

Palantir’s cloud-based analysis tool is a powerful alternative to relational databases and is becoming the de facto choice for enterprise cloud data integration, information management and quantitative analytics, driving players like Microsoft and its SQL Server into the ranks of the previously relevant.

And last on the list of dying technologies is, of course, Hadoop. When released, Hadoop promised to store petabytes of data at a fraction of traditional data warehousing costs. But once enterprises realized that storing data and actually using it were two entirely different challenges, data began backing up into data swamps that sat unused.

The Bottom Line

We are witnessing the death of traditional enterprise computing and storage, a real changing of the guard. Companies like Databricks, Snowflake and Palantir are obliterating the players once thought to be their competitors: EMC, HP, Intel, Teradata and Cloudera, along with technologies like Hadoop.

The tremendous power of the cloud, coupled with the incredible cost efficiencies of cloud-based storage, processing and analysis, is pushing more and more legacy market leaders into the ranks of the obsolete. In the same way that IBM’s mainframe business couldn’t reinvent itself, this old guard will be hard-pressed to save itself. Enterprises are shifting away from legacy on-prem systems to the cloud, driven by AI and ML much as the client-server revolution was driven by TCP/IP.

Change does not wait for stragglers. As we navigate this sea change, the question is which giants will fall from grace, who will manage to reinvent themselves, and which new players will drive the final nails into the coffin of the data enterprise?

About the Author

David Richards co-founded WANdisco in 2005 and, with his wife, leads the David & Jane Richards Foundation, whose mission is to educate, empower and improve the lives of children.
