Advanced Performance and Massive Scaling Driven by AI and DL

In this contributed article, Kurt Kuckein, Director of Marketing for DDN Storage, discusses how current enterprise and research data center IT infrastructures are woefully inadequate for the demanding needs of AI and DL. Designed for modest workloads, limited performance requirements, and small data volumes, with minimal scalability, these platforms are highly bottlenecked and lack the fundamental capabilities needed for AI-enabled deployments.

Okera Introduces Attribute-Based Access Control for Data Lake Security and Access Management

Okera, a leading active data management company for data lake security and governance, announced the release of new attribute-based access control (ABAC) and automated business metadata tagging and policy enforcement capabilities. These new features help enterprises manage, secure, and govern data access on data lakes at scale in a simple, automated way.
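
To illustrate the general idea behind attribute-based access control, the sketch below grants or denies reads based on attributes of the user and the data rather than per-user grants. This is an illustrative example only, not Okera's actual API; the attribute names, tags, and functions are hypothetical.

# Illustrative sketch of attribute-based access control (ABAC).
# NOTE: not Okera's API; all names, tags, and attributes here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class User:
    department: str
    clearance: str            # e.g. "public", "confidential", "restricted"

@dataclass
class Column:
    name: str
    tags: set = field(default_factory=set)   # e.g. {"pii", "finance"}

def can_read(user: User, column: Column) -> bool:
    """Decide access from attributes instead of explicit per-user grants."""
    if "pii" in column.tags:
        # Only users with restricted clearance may read PII-tagged columns.
        return user.clearance == "restricted"
    if "finance" in column.tags:
        return user.department == "finance"
    return True   # untagged columns are readable by everyone

# Example: a marketing analyst without restricted clearance cannot read a PII column.
analyst = User(department="marketing", clearance="confidential")
email_col = Column(name="email", tags={"pii"})
print(can_read(analyst, email_col))   # False

The point of the attribute-based approach is that policies like "mask PII from anyone without clearance" are written once against tags, rather than repeated for every user and table.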

Databricks Open Sources Delta Lake for Data Lake Reliability

Databricks, a leader in Unified Analytics founded by the original creators of Apache Spark™, announced a new open source project called Delta Lake to bring reliability to data lakes. Delta Lake is the first production-ready open source technology to provide data lake reliability for both batch and streaming data. The project enables organizations to transform their existing, messy data lakes into clean Delta Lakes with high-quality data, thereby accelerating their data and machine learning initiatives.
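
As a minimal sketch of what working with Delta Lake looks like from Spark, the example below writes and reads a Delta table with the standard DataFrame API. It assumes a SparkSession with the open source delta-spark package on the classpath; the path is a placeholder.

# Minimal Delta Lake sketch with PySpark.
# Assumes the delta-spark package is available to Spark; the path below is a placeholder.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("delta-sketch")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Write a small DataFrame as a Delta table; writes are transactional, so
# concurrent batch and streaming jobs see a consistent view of the data.
spark.range(0, 5).write.format("delta").mode("overwrite").save("/tmp/delta-table")

# Read it back like any other Spark data source.
df = spark.read.format("delta").load("/tmp/delta-table")
df.show()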

Book Excerpt: Sensitive Data Management and Access Control

Below is an excerpt from O’Reilly Media’s “The Enterprise Big Data Lake: Delivering the Promise of Big Data and Data Science,” a new release in the Data Warehousing category by Alex Gorelik. The Enterprise Big Data Lake is a go-to resource for CTOs, CDOs, chief analytics officers, and their teams: the people charged with extracting the strategic and operational insights from petabytes of data that will ultimately transform their organizations into far-sighted, agile businesses.

Teradata Expands As-a-Service Offerings for Vantage

Teradata (NYSE: TDC), the Pervasive Data Intelligence company, announced three new offerings for its Teradata Vantage platform, reflecting Teradata’s commitment to delivering as-a-service capabilities that meet the needs of its customers. These capabilities provide choice and flexibility for customers operating in Amazon Web Services (AWS), Microsoft Azure, private cloud, or hybrid cloud environments.

Expanding Adoption for Graph Databases

To facilitate access to graph database technology, Neo4j, a leader in graph databases, announced that it has expanded the availability of its free Startup Program. Neo4j graph technology drives innovation at NASA, eBay, Airbnb, and Adobe. The Neo4j Startup Program ensures that the next generation of world-changing startups is powered by the leading graph database technology.

Major Disruptions in Data Storage Technology: What This Shake-Up Means for the Enterprise

In this special guest feature, Dave Donald, Founder and CEO of Keeper Technology, looks at some of the biggest disruptors we will see, or continue to see, shaking up the storage industry in 2019, including software-defined storage (SDS), NVMe, hyper-converged architectures, edge computing, and portable storage architectures.

Building a Data Catalog: A Guide to Planning & Implementing

Building and implementing a data catalog can help your enterprise’s data community discover and use the best data and analytics resources for their projects, helping the business achieve faster results and make better decisions. Data.world covers the next steps for bringing a data catalog to your business in a new report.

The Big Data Era: Managing Challenges of Scale, Speed, Personal Information

In this contributed article, Dan Linstedt, inventor of Data Vault modeling, points out that while simply collecting lots of data presents comparatively few problems, most businesses run into two significant roadblocks in putting it to use: extracting value and handling data responsibly to the standard required by data privacy legislation like GDPR.

Transformative Solutions for Accelerating AI, Analytics and Deep Learning at NVIDIA #GTC19

One pivotal message for attendees of this week’s NVIDIA GPU Technology Conference (GTC) in Silicon Valley is the importance of game-changing storage solutions and applications that empower users to accomplish their most challenging AI objectives.