The Looming Datacenter Paradigm Shift


The acceleration of our digital economy has fueled an unprecedented proliferation of data – but in terms of the amount of data set to be created in the near future, “the world ain’t seen nothin’ yet.”

According to IDC’s Global DataSphere Forecast, the data generated over the next three years will exceed the entire amount generated over the past 30 years, as data-hungry AI, IoT, enterprise, and cloud workloads grow at a breakneck pace. This will put unprecedented demand on the fast-growing datacenter market, where infrastructure spending is on course to grow 6% this year and to continue rising at a similar pace through 2024.

With an estimated 90% to 95% share of the datacenter processor market, Intel has played an outsized role in meeting datacenter demand, cementing the CPU’s position as the dominant chip for processing all that information. But Moore’s Law has seemingly reached its outer limits, and CPU performance is no longer improving at the rate it once did. As more and more data is generated over the coming years, the strain on enterprises’ data operations will only get worse – that is, unless datacenters overcome the growing limitations of CPUs by embracing accelerated data analytics processing. Here’s why the shift is long overdue – and what it will take to make it happen.

Underlying Market Trends

While the CPU still reigns supreme in datacenters worldwide, the need to process larger amounts of data for AI applications has already engendered the need for another type of processor – the GPU – in the datacenter space. Indeed, the discovery of GPUs’ remarkable efficiency in processing AI workloads provided just the accelerant Nvidia needed to surpass Intel as one of the world’s most valuable chip companies.

However, the market’s demand for even greater and smarter processing power continues to grow. As enterprises face ongoing waves of data acceleration, AI systems-on-chip (SoCs) have emerged to challenge GPUs – as clear a sign as any that the future of the datacenter market will be shaped by an ongoing quest for ever-greater processing efficiency.

Of course, CPUs will still have a vital role in the datacenters of the future, but their role in data processing will diminish, as accelerators and dedicated processors move to the fore.

Different Workloads, Different Solutions

The search for greater processing efficiency stems from the need to service new types of workloads that traditional CPUs and GPUs simply can’t handle adequately. 

Take big data analytics, which is in more demand than ever as businesses across industries seek to leverage a vast range of customer, market, financial, and other data to enable smarter decisions about how they allocate their budgets, reach out to customers, develop products and services, respond in real time to supply-chain and logistical challenges, and more. The sheer array of data that businesses could leverage – particularly in the age of proliferating IoT devices – means that datacenter workloads will grow increasingly complex, requiring more and more processing power. Indeed, as a recent European Union study noted, “IoT applications are becoming the dominant workload” across many datacenters.

In this age of not only big data, but diverse data as well, one size can never fit all. Dedicated processors and accelerators can provide the tailored, hyper-efficient solutions needed for the specific, highly complex workloads that will characterize the future of the 21st-century economy – making it easier for businesses to extract the maximum value from the “new gold” in their data treasure chests.

The paradigm shift is already underway. The CPU’s leading role in datacenters is no longer a given, and as database workloads grow increasingly varied, market forces will point more and more in a clear direction: the development of new, dedicated, accelerated solutions.

Data workloads are only going to grow more diverse and intricate from here. The same holds true for the chips that will process them.

About the Author

Jonathan Friedmann is the CEO & Co-Founder of Speedata whose first-of-its-kind Analytics Processing Unit (APU) is designed to accelerate big data analytic workloads across industries. Jonathan is a serial entrepreneur with over 15 years of experience in executive roles at semiconductor companies. Previously, Friedmann was the CEO & Co-Founder of Centipede, which developed IP for general purpose processors. He also served as COO and VP R&D at Provigent, an infrastructure semiconductor company acquired by Broadcom for over $300M. Friedmann holds a BSc in Industrial Engineering (Magna Cum Laude), an MSc in Electrical Engineering (Magna Cum Laude), and a PhD, all from Tel-Aviv University.
