A collection of big data white papers reviewed by the editors of insideBIGDATA. Please visit the insideBIGDATA White Paper Library for a comprehensive list of white papers focused on big data strategies.

Chief Data Officer Survey and Research Results

Research undertaken by YouGov on behalf of analytics database provider Exasol finds that 72% of businesses worry that their inability to generate insights through the analysis of data will have a negative impact on financial performance. This is despite a similar number (77%) of respondents stating that data is now their organization’s most valuable asset. The findings of the research, combined with additional desk research and the views from a number of industry commentators, are brought together in Exasol’s new eBook.

New Survey: Nearly Two Thirds of Analytics Projects Are Jeopardized Due to Poor Access to the Right Data

According to the 2019 Data Decisions Survey from analytics database provider Exasol, 57% of organizations have suffered from slow or poor access to the right data, resulting in an inability to deliver real-time analytics and in inaccurate business intelligence (BI). The full results of the survey, recently released, highlight how organizations are leveraging data to make more intelligent and productive business decisions.

insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads – Part 3

Artificial Intelligence (AI) and Deep Learning (DL) represent some of the most demanding workloads in modern computing history as they present unique challenges to compute, storage and network resources. In this technology guide, insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads, we’ll see how traditional file storage technologies and protocols like NFS starve AI workloads of data, thus reducing the performance of applications and impeding business innovation. A state-of-the-art AI-enabled data center should work to concurrently and efficiently service the entire spectrum of activities involved in DL workflows, including data ingest, data transformation, training, inference, and model evaluation.
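The bottleneck the guide describes is easy to check for in practice. Below is a minimal, storage-agnostic sketch (the `profile_ingest` and `slow_loader` names are ours, not from the guide) that measures what fraction of a training loop's wall time is spent waiting on data rather than computing; a high wait fraction is the symptom of an I/O-starved pipeline:

```python
import time

def profile_ingest(loader, train_step, num_batches=100):
    """Report the share of wall time spent waiting on data vs. computing.

    A high wait fraction is the symptom of the I/O starvation the guide
    attributes to legacy file protocols such as NFS.
    """
    wait = compute = 0.0
    it = iter(loader)
    for _ in range(num_batches):
        t0 = time.perf_counter()
        try:
            batch = next(it)      # blocks on storage reads and decode
        except StopIteration:
            break
        t1 = time.perf_counter()
        train_step(batch)         # stands in for the forward/backward pass
        compute += time.perf_counter() - t1
        wait += t1 - t0
    total = wait + compute
    if total:
        print(f"data wait: {wait / total:.1%}, compute: {compute / total:.1%}")

def slow_loader(n=50, read_secs=0.01):
    """Toy loader that sleeps to mimic reads from slow shared storage."""
    for i in range(n):
        time.sleep(read_secs)
        yield {"sample": i}

profile_ingest(slow_loader(), train_step=lambda b: time.sleep(0.002), num_batches=50)
```

Pointed at a real data loader reading from NFS versus local flash, the same harness makes the protocol overhead the guide discusses directly measurable.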

insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads – Part 2

The second installment of the technology guide described above; see the series overview under Part 3.

insideBIGDATA Guide to Optimized Storage for AI and Deep Learning Workloads

The opening installment of the technology guide described above; see the series overview under Part 3.

Streaming Cloud Integration: Key Data Considerations for Hybrid and Multicloud Architectures

A new white paper from Striim explores why hybrid cloud, multicloud, and inter-cloud architecture strategies all need a modern, real-time data integration solution that keeps up with the speed of business. Striim is uniquely positioned to help make the journey to the cloud fast and smooth, simplifying the movement of real-time data across a diverse set of data sources and cloud targets with high performance, security, and reliability.
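Striim's own implementation is proprietary, but the core trade-off any streaming integration layer manages, batching events for throughput while flushing on a timer to bound latency, can be sketched generically. The `relay` function and the queue-based source below are illustrative stand-ins, not Striim's API:

```python
import json
import queue
import threading
import time

def relay(source, sink, batch_size=100, flush_secs=1.0):
    """Move change events from a source queue to a cloud sink.

    Batches for throughput, but flushes on a timer so latency stays
    bounded even when the event stream goes quiet.
    """
    batch, last_flush = [], time.monotonic()
    while True:
        try:
            batch.append(source.get(timeout=flush_secs))
        except queue.Empty:
            pass  # timed out; fall through to the flush check
        if batch and (len(batch) >= batch_size
                      or time.monotonic() - last_flush >= flush_secs):
            sink(json.dumps(batch))   # e.g., an HTTPS POST to a cloud target
            batch, last_flush = [], time.monotonic()

# Toy usage: feed a few change events and let print stand in for the cloud sink.
events = queue.Queue()
for i in range(5):
    events.put({"op": "INSERT", "id": i})
threading.Thread(target=relay, args=(events, print), daemon=True).start()
time.sleep(2.5)
```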

How Hadoop Can Help Your Business Manage Big Data

Hadoop, once largely unknown, hit the scene in part due to the explosion of unstructured data. Download the new white paper, “Making the Most of Your Investment in Hadoop,” in which SQREAM explores an approach to Hadoop that aims to help businesses reduce time-to-insight, increase productivity, empower data teams to make better decisions, and increase revenue.

How Companies Can be Ethical with AI

One big question in the industry these days is what safeguards companies can take to ensure their AI is fair and ethical. Stakeholders are trying to determine how enterprises can ensure that their employees, investors and customers trust their AI technology. With AI advancing at an incredible rate and being applied to use cases as sensitive as criminal detection, this is an important and timely topic.

DarwinAI Generative Synthesis Platform and Intel Optimizations for TensorFlow Accelerate Neural Networks

DarwinAI, a Waterloo, Canada startup creating next-generation technologies for Artificial Intelligence development, announced that the company’s Generative Synthesis platform – when used with Intel technology and optimizations – generated neural networks with a 16.3X improvement in image classification inference performance. Intel shared the optimization results in a recently published solution brief.
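Generative Synthesis itself is proprietary, but the Intel side of that result, oneDNN-optimized TensorFlow builds plus CPU thread tuning, can be exercised with stock TensorFlow APIs. The sketch below (the thread counts and model choice are illustrative assumptions, not Intel's published configuration) times CPU inference throughput under such tuning:

```python
import time
import tensorflow as tf  # intel-tensorflow builds ship oneDNN optimizations

# Thread pinning is one of the knobs Intel's optimization guides tune for
# CPU inference; the right values depend on the machine's core count.
tf.config.threading.set_intra_op_parallelism_threads(8)
tf.config.threading.set_inter_op_parallelism_threads(2)

model = tf.keras.applications.MobileNetV2(weights=None)  # random weights: timing only
batch = tf.random.uniform((32, 224, 224, 3))

model(batch)  # warm-up run so one-time graph tracing isn't counted
runs = 20
t0 = time.perf_counter()
for _ in range(runs):
    model(batch)
elapsed = time.perf_counter() - t0
print(f"{runs * 32 / elapsed:.1f} images/sec")
```

Comparing the same script under a stock TensorFlow wheel and an Intel-optimized build isolates how much of a speedup comes from the runtime alone.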

State of Data Science, Engineering & AI Report – 2019

Our friends over at Diffbot used the Diffbot Knowledge Graph to conduct, in only a matter of hours, the single largest survey of machine learning skills ever compiled, generating a clear, global picture of the machine learning workforce. All of the data contained in the “State of Data Science, Engineering & AI Report – 2019” was pulled from the company’s structured database of more than 1 trillion facts about 10 trillion entities (and growing autonomously every day).
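For readers who want to poke at the same underlying data, Diffbot exposes its Knowledge Graph through a query API. The sketch below is a hedged example only: the endpoint, DQL query syntax, and response shape are our assumptions based on Diffbot's public documentation and should be checked against the current API reference:

```python
import requests

DIFFBOT_TOKEN = "YOUR_TOKEN"  # placeholder; requires a Diffbot account

# Endpoint and query syntax are assumptions, not verified against the live API.
resp = requests.get(
    "https://kg.diffbot.com/kg/v3/dql",
    params={
        "type": "query",
        "token": DIFFBOT_TOKEN,
        "query": 'type:Person skills.name:"Machine Learning"',
        "size": 10,
    },
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("data", []):
    print(record.get("entity", {}).get("name"))
```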