
NVIDIA Ships New Advanced AI System — NVIDIA DGX A100: Third-Generation DGX Packs Record 5 Petaflops of AI Performance

NVIDIA today unveiled NVIDIA DGX™ A100, the third generation of the world’s most advanced AI system, delivering 5 petaflops of AI performance and consolidating the power and capabilities of an entire data center into a single flexible platform for the first time. Immediately available, DGX A100 systems have begun shipping worldwide, with the first order going to the U.S. Department of Energy’s (DOE) Argonne National Laboratory, which will use the cluster’s AI and computing power to better understand and fight COVID-19.

New Study Details Importance of TCO for HPC Storage Buyers

Total cost of ownership (TCO) now rivals performance as a top criterion for purchasing high-performance computing (HPC) storage systems, according to an independent study published by Hyperion Research. The report, commissioned by our friends over at Panasas®, a leader in HPC data storage solutions, surveyed data center planners and managers, storage system managers, purchasing decision-makers and key influencers, as well as users of HPC storage systems.

Give the People What They Want: The Rise of the Data Marketplace

In this special guest feature, Susan Cook, CEO of Zaloni, discusses her definition of “data marketplaces,” a slightly different concept from the large industry-wide data exchanges and marketplaces in financial services, pharma and healthcare. These data marketplaces have grown up within enterprises (even though the data itself may be internally or externally sourced) to serve all their various consumers of data. This marketplace concept is the latest manifestation of the Amazonification of the enterprise and the latest practical step in turning disorganized data sprawl into targeted data shopping.

Video Highlights: What is Streaming Data Integration?

In this “Whiteboard Wednesday” video, Steve Wilkes, founder and CTO of Striim, takes a look at Streaming Data Integration – what it is, what it is used for, and most importantly, what is needed to set it up and manage it.

NAS Migrations: Five Key Steps to a Fast and Secure Data Transfer

In this special guest feature, Wendy Meyers, Director of Global Operations at Datadobi, argues that with proper planning and professional migration software, most of the major pitfalls of a NAS migration project can be avoided. After handling hundreds of complex NAS migrations over the last decade, the company has identified five key steps that, when followed, can help produce a fast and accurate NAS migration.

New Study Details Importance of TCO for HPC Storage Buyers

Total cost of ownership (TCO) is often assumed to be an important consideration for buyers of HPC storage systems. Because TCO is defined differently by HPC users, it’s difficult to make comparisons based on a predefined set of attributes. With this fact in mind, our friends over at Panasas commissioned Hyperion Research to conduct a worldwide study that asked HPC storage buyers about the importance of TCO in general, and about specific TCO components that have been mentioned frequently in the past two years by HPC storage buyers.

insideBIGDATA Latest News – 5/4/2020

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

How to Prevent Data Black Holes from Swallowing your Organization Whole

In this special guest feature, Tolga Tarhan, Chief Technology Officer at Onica, points out that as data accumulates in an environment, applications and services that rely on that data will naturally be pulled into the same environment, creating a data black hole. As companies continue to accumulate data, they are sitting on ineffectively used data — creating problems with user experience, speed, and digital transformation. Eventually, they are pulled into data black holes: the larger the data mass, the harder it is to move.

Rubber Meets the Road: Reality of AI in Infrastructure Monitoring

In this special guest feature, Farhan Abrol, Head of Machine Learning Products at Pure Storage, examines the disparity between the hype and what’s been delivered, and where we’ll see the most impactful advancements in efficiency and capacity in the coming year. The hype around artificial intelligence and machine learning’s potential to improve IT infrastructure continues to grow, as does enterprise investment in intelligent infrastructure management; however, the anticipated value has yet to be realized.

insideBIGDATA Latest News – 4/22/2020

In this regular column, we bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning.