
Businesses Building AI Applications Are Shifting to Open Infrastructure

In this special guest feature, Ami Badani, CMO of Cumulus Networks, argues that because AI requires vast amounts of data to train algorithms, along with immense compute power and storage to process larger workloads, IT leaders are fed up with inflexible, expensive, and inefficient infrastructure, and are turning to open infrastructure instead, ultimately transforming their data centers. Ami is responsible for all aspects of marketing, from messaging and positioning to demand generation, partner marketing, and amplification of the Cumulus Networks brand. She has a decade’s worth of experience at Silicon Valley technology companies. Ami holds an MBA from the University of Chicago Booth School of Business and a BS from the University of Southern California.

From facial recognition to self-driving cars, real-life use cases for AI are growing exponentially. With limitless possibilities and a promising future, interest in the technology has surged, driving companies to build new AI-focused applications. But the compute power needed to run AI-backed applications raises a question: what happens to the network infrastructure these companies rely on day in and day out? As companies adopt innovative technologies to drive new business opportunities, they face major barriers because their legacy data center infrastructure holds them back. Because AI requires vast amounts of data to train algorithms, along with immense compute power and storage to process larger workloads, IT leaders are fed up with inflexible, expensive, and inefficient infrastructure; as a result, they are turning to open infrastructure, ultimately transforming their data centers.

IT leaders are rethinking their data center infrastructure 

According to IDC, by 2020, the demands of next-generation applications and new IT architectures will force 55 percent of enterprises to either update existing data centers or deploy new ones. As AI workloads and costs continue to grow, IT leaders are questioning their current infrastructure. 

Some forward-looking companies, as Walmart recently did, are building their own data centers to handle the immense computational stress AI puts on their networks.

The root of the problem is finding hardware and software capable of moving large workloads efficiently. Even with the latest generation of TPUs, which are purpose-built AI processing units, the data sets moving through are so large that the infrastructure still requires a significant number of servers. Moreover, because these servers need to talk to each other, the bottleneck has inherently been the network. As a result, IT leaders are starting to look to open infrastructure to combat the increased workloads, rising costs, and more.

Shifting to open infrastructure 

From a broader perspective, the industry has witnessed a massive shift to open infrastructure. One study by Researchscape found that 70 percent of companies are turning to open networking to take advantage of innovative technologies like AI. Moreover, as IT leaders continue to see the benefits of open infrastructure and the critical role it plays in modernizing the data center, adoption keeps climbing: nearly 94 percent of companies now use at least some open technology in their data centers. With numbers like these, companies are continuing to switch to open infrastructure to combat the inefficiencies of proprietary underpinnings.

Why open infrastructure?

Instead of relying on proprietary legacy infrastructure, IT leaders are turning to open infrastructure to have flexibility in the hardware they use. With the growing market of AI-specific compute processing hardware, businesses see the benefits of being able to mix and match hardware and software à la carte-style to have infrastructure that best meets their specific needs.

Additionally, to operate in the digital era, businesses need to move fast and make quick decisions, and that need extends to data center operations. There is a balancing act between human-led and technology-driven operations, as a solely human-led operations team is expensive. To relieve some of that cost, companies are using modern tools like automation to scale, mitigate errors, and enable IT teams to manage more switches.
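To make the automation point concrete, here is a minimal sketch of the templating step behind most switch-automation tools: generating identical, parameterized configuration for a whole fleet instead of hand-editing each box. The hostnames, the `render_config` helper, and the VLAN example are all hypothetical; in practice a tool such as Ansible would render and push the configs.

```python
# Sketch: templated config generation for a fleet of switches.
# Device names and the render_config helper are illustrative only.

def render_config(hostname: str, vlan_id: int) -> str:
    """Render a simple VLAN config snippet for one switch."""
    return f"hostname {hostname}\nvlan {vlan_id}\n"

def render_fleet(hostnames: list[str], vlan_id: int) -> dict[str, str]:
    """Generate the same templated config for every switch in the fleet,
    removing the per-device manual edits that cause drift and typos."""
    return {host: render_config(host, vlan_id) for host in hostnames}

if __name__ == "__main__":
    configs = render_fleet(["leaf01", "leaf02", "spine01"], vlan_id=100)
    for host, cfg in configs.items():
        print(f"--- {host} ---\n{cfg}")
```

The design choice automation tools make is the same as this sketch: one template, many devices, so a single reviewed change scales to hundreds of switches with no copy-paste errors.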

Understanding Infrastructure Needs

Overall, as companies continue building out their AI programs to stay competitive and drive new business opportunities, they need to understand what that means for their infrastructure. Building new AI applications isn’t a simple task, but having simple, open infrastructure makes it possible to process large amounts of information with efficient, cost-effective hardware and software that is easy to operate and maintain.

Sign up for the free insideBIGDATA newsletter.
