
The Prescribed Value of Data Over Time

Shahbaz Ali

In this special guest feature, Shahbaz Ali of Tarmin examines the commoditization of data and how aggregated data has become so crucial and proprietary that its value to the organization is unparalleled and irreplaceable.

The Analytics Frontier of the Hadoop Ecosystem

Ted Wilkie

“The Hadoop MapReduce framework grew out of an effort to make it easy to express and parallelize simple computations that were routinely performed at Google. It wasn’t long before libraries, like Apache Mahout, were developed to enable matrix factorization, clustering, regression, and other more complex analyses on Hadoop. Now, many of these libraries and their workloads are migrating to Apache Spark because it supports a wider class of applications than MapReduce and is more appropriate for iterative algorithms, interactive processing, and streaming applications.”
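To make the contrast concrete, here is a minimal sketch of an iterative computation in Spark (Scala). The input file of comma-separated (x, y) pairs at data/points.txt is hypothetical; the point is that cache() keeps the parsed dataset in memory, so each pass avoids the disk round-trip a chain of MapReduce jobs would incur.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object IterativeSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("IterativeSketch").setMaster("local[*]"))

    // Hypothetical input: one "x,y" pair per line.
    // cache() pins the parsed RDD in memory, so every iteration below
    // reuses it; an equivalent MapReduce chain would re-read it from disk.
    val points = sc.textFile("data/points.txt")
      .map(_.split(",")).map(a => (a(0).toDouble, a(1).toDouble))
      .cache()
    val n = points.count()

    // Ten passes of gradient descent on a one-parameter linear model.
    var w = 0.0
    for (_ <- 1 to 10) {
      val gradient = points.map { case (x, y) => (x * w - y) * x }.reduce(_ + _)
      w -= 0.01 * gradient / n
    }
    println(s"fitted weight: $w")
    sc.stop()
  }
}
```

It is exactly this reuse-across-iterations pattern that MapReduce handles poorly and that motivated the migration of libraries like Mahout toward Spark.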

Why The Strata + Hadoop World Conference Matters

Sundeep Sanghavi

In this special guest feature, Sundeep Sanghavi explains why attending the upcoming Strata + Hadoop World Conference, Oct. 15-17, 2014 in NYC, is important. Sundeep Sanghavi is the CEO and Co-Founder of DataRPM.

Predicting the Big Data Landscape

In this special guest feature, Al Nugent, co-author of the guide “Big Data for Dummies,” looks back at how big data progressed in 2014 and offers some predictions for how the technology might evolve.

Interview: Replacing HDFS with Lustre for Maximum Performance

Gabriele Paciucci

“When organizations operate both Lustre and Apache Hadoop within a shared HPC infrastructure, there is a compelling use case for using Lustre as the file system for Hadoop analytics, as well as HPC storage. Intel Enterprise Edition for Lustre includes an Intel-developed adapter which allows users to run MapReduce applications directly on Lustre. This optimizes the performance of MapReduce operations while delivering faster, more scalable, and easier-to-manage storage.”
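As a rough illustration of what running MapReduce directly on Lustre means on the client side, the sketch below (Scala, using the standard Hadoop Configuration and FileSystem APIs) points the default file system at a Lustre-backed implementation. The lustre:/// scheme, the property name, and the implementation class are assumptions for illustration; in practice they would come from the Intel adapter's documentation, and the adapter jar would need to be on the classpath.

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object LustreFsSketch {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    // Assumed values: the real URI scheme and implementation class are
    // defined by the Intel-developed adapter, not by stock Hadoop.
    conf.set("fs.defaultFS", "lustre:///")
    conf.set("fs.lustre.impl", "org.apache.hadoop.fs.LustreFileSystem")

    // Any MapReduce job submitted with this configuration would read its
    // input splits from, and write its output to, Lustre instead of HDFS.
    val fs = FileSystem.get(conf)
    println(fs.exists(new Path("/lustre/job-input"))) // hypothetical path
  }
}
```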

Dirk Slama Keynote on The Internet of Things

Dirk Slama, Director of Business Development, Bosch Software Innovations

“The vision for the Internet of Things is very powerful – a world in which assets, devices, machines, and cloud-based applications seamlessly interoperate, enabling new business models and services; with big data analytics as a foundation to support intelligent decision making in this connected world. As with every vision, the question is how to make it happen. This presentation provides key success factors for IoT, as well as a detailed overview of concrete IoT use cases in the areas of automotive and transport, manufacturing and supply chain, as well as energy. Finally, a framework for IoT implementation is presented, which helps make your IoT projects a success.”

From Yawn to YARN: Why You Should be Excited About Hadoop 2.0

YARN White Paper

Learn how YARN offers a world beyond MapReduce for Hadoop 2: less encumbered by complex programming protocols, faster, and at a lower cost!

Are “Small” and “Smart” Keys to Your Big Data Success?

Allen Bonde

In this special guest feature, Allen Bonde of Actuate discusses why we need to take all the Big Data we’re building up, pour it down a “data funnel” – then deliver it to customers as Small Data in ways that make it incredibly valuable. Allen Bonde is VP of Product Marketing & Innovation at business reporting and analytics leader Actuate Corporation – the people behind BIRT.

The Non-Technical Revolution: How Business Intelligence Came to the People

Saar Bitner

In this special guest feature, Saar Bitner of SiSense discusses the history and current state of BI software, and how things are starting to change to better address the needs of the business user.

Big Data Analytics: Gain a Performance Edge with a Hybrid Solution

Simon Garland

With a hybrid approach to big data storage, companies can combine the high performance and speed of in-memory technology with the capacity of disk, solving the storage problem by keeping vast historical data sets on disk. By bridging available technologies, companies can deliver on all counts, including cost.
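A minimal sketch of the idea, assuming a simple key-value workload: a hot tier held in memory with a fixed capacity, and a cold tier that spills older entries to disk. The class, method, and directory names are illustrative, not from any particular product.

```scala
import java.nio.file.{Files, Path}
import scala.collection.mutable

// Hybrid store sketch: recent ("hot") records live in memory, older
// ("cold") records are spilled to files on disk.
class HybridStore(spillDir: Path, hotCapacity: Int) {
  private val hot = mutable.LinkedHashMap.empty[String, String]

  def put(key: String, value: String): Unit = {
    hot.put(key, value)
    if (hot.size > hotCapacity) {
      // Evict the oldest in-memory entry to disk to stay within capacity.
      val (oldestKey, oldestValue) = hot.head
      Files.write(spillDir.resolve(oldestKey), oldestValue.getBytes)
      hot.remove(oldestKey)
    }
  }

  def get(key: String): Option[String] =
    hot.get(key).orElse {
      // Fast in-memory path missed; fall back to the slower disk tier.
      val file = spillDir.resolve(key)
      if (Files.exists(file)) Some(new String(Files.readAllBytes(file)))
      else None
    }
}
```

Reads try memory first and only touch disk on a miss, which is the essence of the hybrid claim: in-memory speed for the working set, disk economics for the historical bulk.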