Enterprises Seek to Improve Technological Infrastructure and Cloud Hosting Processes for Supporting AI


Today we continue the insideBIGDATA Executive Roundup, our annual feature showcasing the insights of thought leaders on the state of the big data industry and where it is headed. In today's discussion, our panel of experienced big data executives – Ayush Parashar, Co-founder and Vice President of Engineering for Unifi Software; Robert Lee, Vice President & Chief Architect, Pure Storage, Inc.; and Oliver Schabenberger, COO & CTO at SAS – discusses how enterprises are seeking to improve their technological infrastructure and cloud hosting processes to support their AI, machine learning, and deep learning efforts.

The conversation is moderated by Daniel D. Gutierrez, Managing Editor & Resident Data Scientist of insideBIGDATA.

insideBIGDATA: As deep learning drives businesses to innovate and improve their AI and machine learning offerings, more specialized tooling and infrastructure will need to be hosted in the cloud. What's your view of enterprises seeking to improve their technological infrastructure and cloud hosting processes to support their AI, machine learning, and deep learning efforts?

Oliver Schabenberger, COO & CTO at SAS

Oliver Schabenberger: The rise of AI during the last decade is due to the convergence of big data and big compute power with decades-old neural network technology. AI systems based on deep neural networks require large amounts of data to train well.

Improving technological infrastructure and cloud processes is a requirement for any business involved in digital transformation, whether business problems are solved through AI, machine learning, or other technologies. As a result, we will see continued investment in cloud infrastructure.

Ultimately, however, the question of whether AI-driven insights are created on cloud-based or on-premises systems is secondary. What matters is that smarter and more effective data analysis by AI systems addresses real-world challenges for businesses and governments.

Robert Lee, Vice President & Chief Architect for Pure Storage

Robert Lee: Deep learning continues to be a data-intensive process, and that shows no sign of slowing. Data at scale has gravity and is hard to move, so in general you'll train where your data is, and you'll need fast access to that data wherever it lives. Software tooling and GPU hardware are available and fairly portable both on premises and in the public cloud; the third piece of the puzzle (fast, abundant storage to feed the data) is often not given enough thought and planning, to the detriment of development speed. Ultimately, we see customers at larger scales tending to keep large pools of data on premises and finding better performance and economics training next to that data.
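
To make the point about feeding accelerators concrete, here is a minimal sketch, assuming PyTorch and torchvision are installed and that a hypothetical local directory ./train_data holds an image dataset; none of this reflects a specific Pure Storage stack. The parallel readers and prefetching below only keep a GPU busy if the storage behind the dataset can serve files fast enough.

```python
# A minimal sketch (hypothetical paths, placeholder model): a data-loading
# pipeline whose end-to-end throughput is bounded by how fast the storage
# behind ./train_data can serve samples to the workers.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


def main():
    transform = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])

    # Samples are read straight from local (or locally mounted) storage,
    # one subfolder per class in ImageFolder layout.
    train_set = datasets.ImageFolder("./train_data", transform=transform)

    loader = DataLoader(
        train_set,
        batch_size=64,
        shuffle=True,
        num_workers=8,      # parallel readers so the GPU is not starved
        pin_memory=True,    # faster host-to-GPU copies
        prefetch_factor=2,  # each worker stages batches ahead of the GPU
    )

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model = torch.nn.Linear(3 * 224 * 224, 10).to(device)  # placeholder model

    for images, labels in loader:
        images = images.view(images.size(0), -1).to(device, non_blocking=True)
        labels = labels.to(device, non_blocking=True)
        _ = model(images)  # a real training loop would add loss/backward here


if __name__ == "__main__":
    main()
```

Whether ./train_data sits on a local flash array or on storage mounted in the cloud, the same loop runs; what changes is how quickly that storage keeps the workers supplied, which is the "third piece of the puzzle" Lee describes.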

Ayush Parashar, Co-founder and Vice President of Engineering for Unifi Software

Ayush Parashar: Cloud has been a big factor in helping businesses innovate. To compete, especially around AI right now, it's critical for a company to iterate on the latest and greatest tooling that can scale up and down instantly in a commodity environment. All the major cloud providers have already innovated on their AI offerings, and that has made AI algorithms available in a ubiquitous fashion.

In addition to tooling and infrastructure around AI, data management and analytic tools are very important for taking AI to the next level. Getting the right data – integrating it from various places, cleansing it, and preparing it – is pivotal, and it's often the first step a data scientist works on for an AI project.
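
As an illustration of that first step, here is a minimal sketch in pandas; the file names and columns (crm_export.csv, usage_metrics.parquet, customer_id, region, and so on) are hypothetical, and nothing here is drawn from Unifi's product.

```python
# A minimal sketch of integrate / cleanse / prepare with hypothetical inputs.
# Assumes pandas is installed, plus a Parquet engine such as pyarrow.
import pandas as pd

# Integrate: pull records from two hypothetical sources.
crm = pd.read_csv("crm_export.csv")                # customer_id, region, signup_date
usage = pd.read_parquet("usage_metrics.parquet")   # customer_id, monthly_events

merged = crm.merge(usage, on="customer_id", how="left")

# Cleanse: drop duplicates, normalize text fields, handle missing values.
merged = merged.drop_duplicates(subset="customer_id")
merged["region"] = merged["region"].str.strip().str.upper()
merged["monthly_events"] = merged["monthly_events"].fillna(0)

# Prepare: derive a feature a downstream model could train on.
merged["signup_date"] = pd.to_datetime(merged["signup_date"])
merged["tenure_days"] = (pd.Timestamp.today() - merged["signup_date"]).dt.days

merged.to_parquet("training_ready.parquet", index=False)
```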

On-premises infrastructure doesn't always provide this agility. Organizational teams become more successful when they can do more in less time, and the cloud allows more tools to come together, faster, at scale.

 
