Bridging the MapReduce Skills Gap

Data is exploding at large organizations. So is the adoption of Hadoop. Hadoop’s potential cost effectiveness and facility for accepting unstructured data are making it central to modern “Big Data” architectures. Yet a significant obstacle to Hadoop adoption has been a shortage of skilled MapReduce coders.

Lustre 101

This week’s Lustre 101 article looks at the history of Lustre and the typical configuration of this high-performance, scalable storage solution for big data applications.

Learning to Define – Software Defined Storage

Let’s start the conversation here: If you work with big data in the cloud or deal with structured and unstructured data for analytics, you need software defined storage.

Optimizing Life Sciences – Deploying IBM Platform Computing

In your world, numbers and data can save lives, and minutes and seconds absolutely matter. Whether engaged in genome sequencing, drug design, product analysis or risk management, life sciences research teams need high-performance technical environments with the ability to process massive amounts of data and support increasingly sophisticated simulations and analyses.

Elastic Storage – The power of flash, software defined storage and data analytics

The rapid, accelerating growth of data, transactions, and digitally aware devices is straining today’s IT infrastructure. At the same time, storage costs are increasing while user expectations and cost pressures rise. This staggering growth of data has led to the need for high-performance streaming, data access, and collaborative data sharing. So how can elastic storage help?

InsideBIGDATA Guide to Big Data Solutions in the Cloud

For a long time, the industry’s biggest technical challenge was squeezing as many compute cycles as possible out of silicon chips so they could get on with solving the really important, and often gigantic problems in science and engineering faster than was ever thought possible. Now, by clustering computers to work together on problems, scientists are free to consider even larger and more complex real-world problems to compute, and data to analyze.

Enterprise Risk Management

Most firms understand that robust enterprise risk management (ERM) will not only improve risk management; it will also help them to measure risk more accurately and develop a more sustainable business model. However, while simple in theory, ERM can sometimes be difficult in practice.

Attaining High-Performance Scalable Storage

As compute speed advanced toward its theoretical maximum, the HPC community quickly discovered that the speed of storage devices and the underlying Network File System (NFS), developed decades ago, had not kept pace. As CPUs got faster, storage became the main bottleneck in high data-volume environments.

insideBIGDATA Launches Big Data Events Calendar

Today insideBIGDATA announces the rollout of a comprehensive events calendar for the Big Data community and offers free listings to event planners. This events calendar is a valuable resource for planning your conference travels in 2014.

What Can Hadoop Do for Your Big Data Strategy?

For all its agility in handling big data, Hadoop by itself is not a big data strategy.