Condusiv Technologies Reports: Need for Speed Drives Edge Computing Growth

Edge computing is projected to grow at a compound annual growth rate of 46% over the next four years, to over $6 billion by 2022 [1]. With this growth has come a readjustment in planning strategy on the part of CIOs and other IT managers. “For nearly a decade now, large, compute-intensive enterprises have been looking at IT investment in terms of moving virtually all applications to the cloud, with a concomitant reduction of operating expenses in local computing and the possibility of lower—or at least reasonably stable—overall cost. Unfortunately, it’s going to be more complicated than that,” said James D’Arezzo, CEO of Condusiv Technologies, a world leader in I/O reduction and SQL database performance.
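For context, a quick back-of-the-envelope sketch of what that growth rate implies (the 2018 base figure below is derived from the projection itself, not taken from the cited report):

```python
# Illustrative check of the cited projection: a 46% CAGR over four years
# reaching roughly $6 billion by 2022 implies a starting market of about $1.3B.
cagr = 0.46          # 46% compound annual growth rate
years = 4            # 2018 -> 2022
market_2022 = 6.0    # projected 2022 market size, in $ billions

# CAGR relates the start and end values by: end = start * (1 + cagr) ** years
implied_2018_base = market_2022 / (1 + cagr) ** years
print(f"Implied 2018 market size: ~${implied_2018_base:.1f}B")  # ~ $1.3B
```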

The actual “edge” in edge computing depends on the application. In telecommunications, it could be a cell phone, or perhaps a cell tower. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop. The important thing about edge computing is that it enables data produced by Internet of Things devices to be processed close to where it’s created, rather than sending it to centralized, cloud-based data centers. This allows data to be analyzed in near real-time, a need of organizations across many industries, including manufacturing, healthcare, telecommunications and finance [2].

Industry observers cite the autonomous, self-driving car as perhaps the ultimate current example of the need for edge computing. Given speed, privacy concerns and available bandwidth, the output of every sensor on a self-driving car cannot be fed up to the cloud while the vehicle waits for a response. A latency delay that might be tolerable in, say, waiting for a video to start would be unacceptable in a situation involving split-second, life-or-death decisions [3].

The growing need for edge computing, and the growing range of applications in which it is being used, is causing corporate IT planners to look toward a future of managing multiple capabilities: cloud-based for some applications, non-cloud-based (or specialty cloud-based) for others. One recently suggested model would use edge computing for basic data visualization, basic analytics, caching, buffering and streaming, pre-processing and cleansing, and device-to-device communications. Cloud computing would be reserved for complex analytics, big data mining, business logic sources, machine learning rules, advanced visualizations, and long-term data storage [4].
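As a rough illustration of that division of labor, the sketch below (hypothetical names and behavior, not tied to any specific product or service) cleanses and buffers raw sensor readings at the edge and forwards only compact aggregates to a cloud tier for deeper analytics:

```python
# Hypothetical edge/cloud split: the edge node pre-processes and buffers
# raw readings, then ships small aggregates to the cloud, where heavier
# analytics would run.
from statistics import mean

BUFFER_LIMIT = 50  # readings held at the edge before forwarding


class EdgeNode:
    def __init__(self):
        self.buffer = []

    def ingest(self, reading):
        """Pre-process and cleanse at the edge: drop obviously bad sensor values."""
        if reading is None or not (-50.0 <= reading <= 150.0):  # assumed valid range
            return
        self.buffer.append(reading)
        if len(self.buffer) >= BUFFER_LIMIT:
            self.flush_to_cloud()

    def flush_to_cloud(self):
        """Forward only a compact aggregate, not every raw reading."""
        batch = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
        }
        send_to_cloud(batch)
        self.buffer.clear()


def send_to_cloud(batch):
    # In a real deployment this would POST to a cloud analytics service;
    # here it just prints, since no specific service is assumed.
    print("forwarding aggregate to cloud:", batch)


# Usage: simulate a stream of sensor readings arriving at the edge.
node = EdgeNode()
for value in [21.5, 22.0, None, 999.0, 21.8] * 20:
    node.ingest(value)
```

The point of the sketch is simply that latency-sensitive filtering and buffering stays local, while bandwidth-heavy, long-horizon analytics moves to the cloud tier described in the model above.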

Meanwhile, IT leaders in many fields are scrambling to balance explosive growth in overall data with users’ demands for immediate analysis. According to the Gartner Group, in the life sciences, for example, only 5% of the data created has actually been analyzed. More than one industry expert has referred to the situation as a “digital arms race” between data creation and the ability to store and analyze it [5].

“Basically, there’s no choice,” said D’Arezzo. “If you’re a CIO today, you have to do both. You have to do edge computing and cloud computing. You have to keep generating new data, and you have to store and analyze it. And you have to do all of this within budgets that don’t normally allow for wholesale hardware replacement. For that to happen, your I/O capacity and SQL servers need to be optimized, and, given the realities of edge computing, so do your desktops and laptops. A very cost-effective way to do that is to implement software that reduces input and output, which can improve performance dramatically.”

Condusiv Technologies has seen users of its software solutions increase the I/O capability of storage and servers, including SQL servers, by 30% to 50% or more, with some results as high as 10x initial performance.

  1. “Global Edge Computing Market By Deployment (On-Premise & Cloud), By Application (IoT, Video Surveillance, etc.), By Component (Hardware & Software), by End-User (Manufacturing, Power, etc.), By Region, Competition Forecast & Opportunities, 2015-2022,” Global Information, Inc., March 28, 2018.
  2. Butler, Brandon, “What is edge computing and how it’s changing the network,” Network World, September 21, 2017.
  3. Miller, Paul, “What is edge computing?”, The Verge, May 7, 2018.
  4. “IIoT Edge Computing vs. Cloud Computing,” Open Automation Software, 2018.
  5. Hiatt, David, “The Next Digital Arms Race in Life Sciences,” Bio-IT World, August 25, 2017.
