
Text Analytics without Tradeoffs

Caryn Alagno, EVP, Finch Computing

The pace at which the world creates data will never be this slow again. And much of this new data we’re creating is unstructured, textual data. Emails. Word documents. News articles. Blogs. Reviews. Research reports… Understanding what’s in this text – and what isn’t, and what matters – is critical to an organization’s ability to understand the environments in which it operates. Its competitors. Its customers. Its weaknesses and its opportunities.

Enabling Value for Converged Commercial HPC and Big Data Infrastructures through Lustre*


A number of industries rely on high-performance computing (HPC) clusters to process massive amounts of data. As these same organizations explore Big Data analytics based on Hadoop, they are realizing the value of converging Hadoop and HPC onto the same cluster rather than scaling out an entirely new Hadoop infrastructure.

Hadoop for HPC—It Just Makes Sense

Building out a Hadoop cluster with massive amounts of local storage is an extensive and expensive undertaking, especially when the data already resides in a POSIX-compliant Lustre file system. Now companies can adopt analytics written for Hadoop and run them on their existing HPC clusters.
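
The practical appeal is that a POSIX-compliant file system can be read in place, with no ingest step into HDFS. As a minimal sketch of that idea (the /mnt/lustre path and file layout are assumptions for illustration, not part of any product), the following word-count-style scan runs directly against files on a shared Lustre mount – the same data a Hadoop job could target:

    import collections
    import pathlib

    # Assumed mount point for the shared Lustre file system; adjust to your cluster.
    LUSTRE_MOUNT = pathlib.Path("/mnt/lustre/project/logs")

    def word_count(root):
        """Count words across text files under the mount, read in place (no HDFS copy)."""
        counts = collections.Counter()
        for path in root.rglob("*.txt"):
            with path.open(encoding="utf-8", errors="ignore") as handle:
                for line in handle:
                    counts.update(line.split())
        return counts

    if __name__ == "__main__":
        for word, count in word_count(LUSTRE_MOUNT).most_common(10):
            print(f"{word}\t{count}")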

Enterprise Grade Lustre in the Clouds


With the release of Intel® Cloud Edition for Lustre software in collaboration with key cloud infrastructure providers like Amazon Web Services (AWS), commercial customers have an ideal opportunity to employ a production-ready version of Lustre—optimized for business-focused high-performance data analytics (HPDA)—in a pay-as-you-go cloud environment.

10 Ways IBM Platform Computing Saves You Money


IBM Platform Computing products can save an organization money by reducing a variety of direct costs associated with grid and cluster computing. Your organization can slow the rate of infrastructure growth and reduce the costs of management, support, personnel and training—while also avoiding hidden or unexpected costs.

Understanding Active Risk Management with High Performance Data

This webinar focuses on understanding active risk management with high-performance data and grid management.

Big Data Analytics: Gain a Performance Edge with a Hybrid Solution


With a hybrid approach to big data storage, companies can combine the performance and speed of in-memory technology with the capacity of disk by keeping vast historical data sets on disk. By bridging the available technologies, companies can deliver on all counts – including cost.
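
As a rough sketch of the hybrid idea (the class and field names are invented for illustration and are not tied to any particular product), recent records can be served from memory while older records are appended to disk and scanned only when a historical query needs them:

    import json
    from collections import deque

    class HybridStore:
        """Toy tiered store: the newest records stay in memory, history lives on disk."""

        def __init__(self, cold_path, hot_capacity=10_000):
            self.hot = deque(maxlen=hot_capacity)  # hot tier: most recent records, in RAM
            self.cold_path = cold_path             # cold tier: full history, appended to disk

        def append(self, record):
            # Every record is persisted to the cold tier; the hot tier keeps only the newest.
            with open(self.cold_path, "a", encoding="utf-8") as f:
                f.write(json.dumps(record) + "\n")
            self.hot.append(record)

        def recent(self, symbol):
            # Fast path: answered entirely from memory.
            return [r for r in self.hot if r["symbol"] == symbol]

        def history(self, symbol):
            # Slow path: scan the on-disk tier for long-range analysis.
            with open(self.cold_path, encoding="utf-8") as f:
                records = (json.loads(line) for line in f)
                return [r for r in records if r["symbol"] == symbol]

    store = HybridStore("trades.jsonl")
    store.append({"symbol": "ABC", "price": 101.5})
    print(store.recent("ABC"))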

In-Memory Database vs. In-Memory Data Grid

Nikita Ivanov, CTO of GridGain

In-memory computing comprises two main categories: In-Memory Databases and In-Memory Data Grids. Nikita Ivanov, CTO of GridGain, delves into the differences between the two and when to apply each technology.
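
To make the distinction concrete, here is a toy sketch under stated assumptions (this is not GridGain's API; the classes, table, and partition count are invented for illustration): an in-memory database holds the working set in RAM and answers full SQL queries over it, while an in-memory data grid spreads key-value pairs across partitions and is accessed with put/get operations.

    import sqlite3

    # In-memory database: the whole working set lives in RAM and is queried with SQL.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE trades (symbol TEXT, price REAL)")
    db.executemany("INSERT INTO trades VALUES (?, ?)",
                   [("ABC", 101.5), ("ABC", 102.0), ("XYZ", 55.1)])
    avg_price, = db.execute("SELECT AVG(price) FROM trades WHERE symbol = 'ABC'").fetchone()
    print("average ABC price:", avg_price)

    # In-memory data grid (toy model): data is partitioned by key across nodes
    # and accessed with put/get rather than SQL.
    class ToyDataGrid:
        def __init__(self, partitions=4):
            # Each partition stands in for memory on a separate cluster node.
            self.partitions = [dict() for _ in range(partitions)]

        def _partition_for(self, key):
            return self.partitions[hash(key) % len(self.partitions)]

        def put(self, key, value):
            self._partition_for(key)[key] = value

        def get(self, key):
            return self._partition_for(key).get(key)

    grid = ToyDataGrid()
    grid.put("ABC:latest", 102.0)
    print("latest ABC price:", grid.get("ABC:latest"))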

Long Live In Memory Computing

Nikita Ivanov, CTO of GridGain

In the last 12 months we have observed a growing trend: use cases for distributed caching are rapidly going away as customers are moving up the stack… in droves. Let me elaborate by highlighting three points that, when combined, provide a clear reason behind this observation.

Big Data – Quality or Quantity?

In this special guest feature, Anchita Magan from [x]cube DATA writes that the element of quality has to be weighed alongside quantity in Big Data. The entire cosmos has been turned into an aggregated ocean of data – structured or unstructured, systematic or unsystematic, useful or useless. This sprawling mass of roughly organized data […]