2023 Trends in Cloud Computing  


The cloud—including applications at its core, at the edge, and in the Internet of Things (IoT)—has become the de facto environment for the computational and storage needs of modern data management.

Although not quite synonymous with the data ecosystem itself, the cloud factors significantly into almost all workloads involving aspects of data science, Artificial Intelligence, transactional systems, DevOps, customer interactions, and more. As an indicator of its ubiquity, Gartner estimated that spending on global public cloud services would reach approximately $500 billion in 2022.

Consequently, developments in cloud computing are indicative of those affecting data-driven processes as a whole. These trends include cloud native mechanisms that make the cloud more manageable, regulatory compliance and data governance concerns that make it sustainable, edge processing and IoT deployments that typify its speed, and the increasing decentralization of enterprise architecture.

These phenomena signify both challenges and opportunities. They also herald an information technology landscape markedly different from that of even a few years ago, when organizations “used to have everything in one place,” reflected Steve Madden, Equinix VP of Global Segment Marketing. “Now, [they’ve] got to manage things in lots of places. That’s hugely complicated. But if you’re subscribing to, as a service, only components or elements from those locations… now [you’re] just controlling the use of it.”

Cloud Native

Cloud native constructs will always represent the vanguard of cloud computing and cloud architecture. They’re intensely practical for increasing the manageability of the cloud as a means of decentralizing data virtually anywhere. Established cloud native techniques involving containers, orchestration platforms, microservices, and Infrastructure as Code (IaC) will maintain relevance in 2023. However, they’ll be assisted by developments in: 

  • Event Brokers: Cloud native event brokers implement event-driven architecture that supplements microservices by enabling systems to communicate with one another through event data (see the sketch following this list). This capability has real-time implications for the IoT and edge computing. It’s also useful for integrating with SaaS offerings. For such vendors, “cloud-native event broker solutions make it easy to pipe events from anywhere into [B2B SaaS vendors’] software, and therefore satisfy their end users’ demands,” mentioned Sebastien Goasguen, TriggerMesh CEO.
  • FinOps: FinOps is an emergent discipline for monitoring, managing, and paring cloud costs. “FinOps processes and roles will grow rapidly and become standard in most enterprises, along with the tools that enable FinOps, such as resource and cost optimization,” predicted Rich Bentley, StormForge VP of Marketing.
  • Platform Engineering: According to Lightbend CEO Jonas Bonér, this concept is designed to help developers “climb the ladder of abstraction” when devising applications or web pages. Platform engineering facilitates greater self-service functionality for developers, in part via an Internal Developer Platform that lets developers consume events from multiple services. “Event brokers can help connect to such systems without needing to directly access them or needing to understand the intricacies of how to integrate with them,” Goasguen indicated.
  • Serverless Computing: Although serverless computing is no longer a novelty, recent improvements by purveyors such as AWS have made it an even more viable answer to the “challenges deploying applications on cloud-native infrastructure because the Kubernetes/Cloud Native Computing Foundation ecosystem is overwhelmingly complex,” Bonér posited.
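
To make the event-broker idea concrete, here is a minimal, self-contained sketch of the publish/subscribe pattern these tools embody. It is a generic illustration rather than TriggerMesh’s actual API; the topic name, payload, and handler functions are hypothetical.

```python
# Minimal, illustrative sketch of the event-broker pattern described above.
# This is not a specific vendor's API -- just a generic in-process
# publish/subscribe broker; topic names and handlers are hypothetical.
from collections import defaultdict
from typing import Callable, Dict, List


class EventBroker:
    """Routes events from producers to any subscribers of that event type."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event_type: str, payload: dict) -> None:
        # Each consumer reacts independently; the producer never calls it directly.
        for handler in self._subscribers[event_type]:
            handler(payload)


broker = EventBroker()

# A hypothetical SaaS integration service and an edge alerting service subscribe
# to the same event stream without knowing anything about the producer.
broker.subscribe("sensor.reading", lambda e: print(f"SaaS sync: {e}"))
broker.subscribe("sensor.reading", lambda e: print(f"Edge alert check: {e}"))

# An edge device (producer) emits one event; both consumers receive it.
broker.publish("sensor.reading", {"device": "thermostat-42", "temp_c": 21.5})
```

The value of the pattern is that producers and consumers never reference one another directly, which is what makes it easy to pipe events from anywhere into a SaaS product, an IoT pipeline, or an edge workload.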

The employment of AI will also expand in cloud native environs to “automate many of the tasks involved in managing cloud resources, such as provisioning, scaling, and monitoring,” Bentley added.

Data Sovereignty, Data Integrity 

The days in which the cloud was perceived as complicating regulatory compliance and data governance seem to be rapidly waning. According to Pavel Burmenko, General Manager of Veeva Clinical Database at Veeva, the cloud is now desired for its ability to reinforce data sovereignty and data integrity “because cloud providers are delivering compliant systems now.” This realization is particularly acute for public cloud deployments. “They can ensure data is stored in a particular geography because of the distributed nature of cloud computing,” Burmenko explained. “Global cloud providers have the geographic footprint to support this.”

A particularly prominent trend pertaining to data integrity is the expanding sophistication in the workflow capabilities of cloud environments—which reinforces data governance, data provenance, and regulatory compliance. This greater degree of accountability is primed for distributed work environments that are no longer silos because of what Burmenko termed “function-specific audit trails and workflows.” Specifically, “If I’m working in a cloud application, that application knows what my assigned role is,” Burmenko said. “It knows that I’m working on a specific project and that I’m supposed to perform my task after another person has performed it. The application is recording my activity in a way that’s reproducible. If an auditor comes from a regulatory body, we can give them an evidence trail of everything that’s been done.”
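
A brief sketch can make the idea of function-specific audit trails concrete. The following illustrates the pattern Burmenko describes rather than Veeva’s implementation; the roles, project name, and task order are hypothetical.

```python
# Illustrative sketch of a role-aware workflow with a reproducible audit trail.
# Not a vendor implementation; roles, project names, and task order are made up.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List


@dataclass
class AuditEntry:
    user: str
    role: str
    project: str
    task: str
    timestamp: str


@dataclass
class Workflow:
    """Enforces task order and records an evidence trail for auditors."""
    project: str
    task_order: List[str]                       # e.g. ["data_entry", "review", "approve"]
    trail: List[AuditEntry] = field(default_factory=list)

    def perform(self, user: str, role: str, task: str) -> None:
        expected = self.task_order[len(self.trail)]
        if task != expected:
            raise ValueError(f"'{task}' cannot run before '{expected}'")
        self.trail.append(AuditEntry(
            user, role, self.project, task,
            datetime.now(timezone.utc).isoformat()))

    def evidence_trail(self) -> List[AuditEntry]:
        # What would be handed to an auditor from a regulatory body.
        return list(self.trail)


wf = Workflow(project="trial-007", task_order=["data_entry", "review", "approve"])
wf.perform("alice", "coordinator", "data_entry")
wf.perform("bob", "monitor", "review")
print(wf.evidence_trail())
```

The point is not the specific code but the property it demonstrates: every action is recorded with who did it, in what role, on which project, and in what sequence, so the trail can be reproduced on demand.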

Edge Analytics

Effectual management of cloud resources via cloud native technologies, coupled with prudent data governance for data integrity, is the formula for reaping the benefits of the cloud as an enterprise enabler. Some of the advantages that exemplify the cloud’s premise of low-latency, highly scalable applications anywhere involve advanced analytics at the cloud’s edge. Madden noted that “the cloud is also becoming a misnomer because a lot of the infrastructure you’re running at the edge is, in [many] cases, available in a bigger cloud backend warehouse.” This sentiment especially applies to cognitive analytics deployments like evaluating medical images.

“MRIs are getting so good that each scan is 28 gigabytes,” Madden revealed. “If you’re doing those 50 times a day across several hospitals in a city, that data is huge. Sending it all back [to the cloud’s core] and down again can cost thousands of dollars and lots of time.”  With edge AI, however, organizations are “storing, processing, and managing data right where it is produced and consumed, while using the cloud for data backup, more thorough data analysis, and a holistic system view,” Bonér put forth.
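
The pattern Bonér describes can be sketched simply: run the heavy analysis where the data is produced and ship only a compact result upstream. The following is a hypothetical illustration; run_local_model() and the payload fields are stand-ins, not a real medical-imaging API.

```python
# Hypothetical sketch of the edge pattern: analyze large scans where they are
# produced and send only small results to the cloud's core.
import json


def run_local_model(scan_bytes: bytes) -> dict:
    # Placeholder for on-site inference over a multi-gigabyte scan.
    return {"finding": "no anomaly detected", "confidence": 0.97}


def summarize_at_edge(scan_bytes: bytes, scan_id: str) -> str:
    result = run_local_model(scan_bytes)          # heavy work stays at the edge
    summary = {"scan_id": scan_id,
               "bytes_processed": len(scan_bytes),
               **result}
    return json.dumps(summary)                    # kilobytes, not gigabytes


# Only this small JSON payload would travel to the cloud's core for backup,
# fleet-wide analysis, and a holistic system view.
payload = summarize_at_edge(b"\x00" * 1024, scan_id="mri-2023-001")
print(payload, "-", len(payload), "bytes sent upstream")
```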

Transmitting only the results of analytics to the cloud’s core is more timely, cost-effective, and sustainable. Madden articulated a use case in which a medical facility installed Alexa devices in patients’ rooms to analyze, in real time, when patients actually needed to contact nurses, while automating logistical requests like changing the channel. “It reduced the amount of contact between nurses and patients during Covid by 80 percent,” he recalled. “It saved lives.”

The IoT 

The IoT will continue to be one of the most impressive expressions of the cloud in the next 12 months. Digital twin applications are broadening the IoT’s utility beyond industrial settings to enterprise ones with interactive models of systems. The IoT allows organizations to enlarge the data quantities, while reducing the latency, for mission-critical applications such as clinical trial testing. Wearable devices, for example, let researchers “ask research questions for a clinical project that are going to be answered by smart devices that are able to collect the data we use for answering our clinical questions,” Burmenko disclosed.

Analysis of continuous, real-time biometric data is far more effective than traditional research methods involving daily questionnaires and other qualitative measures. “The volume of data coming from a wearable is so substantially greater than the volume of data used in clinical trials historically, that the cloud almost becomes the de facto place that enables this research to happen,” Burmenko observed.
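
A rough sketch illustrates the scale difference Burmenko points to: a wearable sampling once per second produces 86,400 readings a day versus a single questionnaire answer, which is why aggregation and cloud-scale storage matter. The sampling rate and field values below are assumptions for illustration only.

```python
# Hypothetical sketch: windowing continuous wearable readings before they are
# stored and analyzed in the cloud. Sampling rate and values are assumptions.
from statistics import mean

SAMPLES_PER_MINUTE = 60          # e.g. one heart-rate reading per second


def window_averages(readings, window=SAMPLES_PER_MINUTE):
    """Collapse raw per-second readings into per-minute averages."""
    return [round(mean(readings[i:i + window]), 1)
            for i in range(0, len(readings), window)]


# One day of per-second heart-rate samples vs. one daily questionnaire answer.
day_of_readings = [70 + (i % 5) for i in range(24 * 60 * 60)]
per_minute = window_averages(day_of_readings)

print(len(day_of_readings), "raw samples per day")      # 86,400 quantitative points
print(len(per_minute), "aggregates sent to the cloud for analysis")
```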

The Edge-to-Cloud Continuum 

The reality is that such a use case could easily become an edge computing one, depending on the infrastructure in place and enterprise definitions of it. “The edge consists of many hierarchical layers between the cloud and the devices; each layer is further away from the cloud but closer to the end-users: a cloud-to-edge continuum,” Bonér confirmed. To that end, the edge becomes a singular expansion of the cloud, and a natural progression for enterprise deployments of the cloud’s cardinal boons of scalability, elasticity, low latency, and modest storage costs.

Defining the edge, and relying on it as a means of supplementing the cloud’s core, is crucial to availing organizations of this macro level development in cloud architecture. The edge, then, is not necessarily individual end point devices, but rather a local, metropolitan “aggregation point of all those technologies, devices, and traffic that come together to be stitched together into a business process and start revenue-generating,” Madden said. “The core is regional for how to understand, across a whole region, what’s going on.”

About the Author

Jelani Harper is an editorial consultant servicing the information technology market. He specializes in data-driven applications focused on semantic technologies, data governance and analytics.

