China’s Ping An Insurance Deploys IBM Storage Virtualization to Speed Data Insights

IBM (NYSE: IBM) announced today that Ping An Insurance (Group) Company of China, Ltd. (Ping An), the largest private insurer in the country, has implemented an IBM Software Defined Storage solution that helps it speed data collection from one month to one hour, dramatically improving its ability to meet new regulatory requirements such as annual audits and industry data analysis.

Attaining High-Performance Scalable Storage

As compute speed advanced toward its theoretical maximum, the HPC community quickly discovered that the speed of storage devices and the underlying Network File System (NFS), developed decades ago, had not kept pace. As CPUs got faster, storage became the main bottleneck in high data-volume environments.
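
To see why storage, rather than the CPU, ends up setting the pace, a rough back-of-envelope comparison helps. The sketch below uses assumed, illustrative figures for node count, per-node processing rate, and shared-storage bandwidth; none of them come from a specific system:

```python
# Back-of-envelope sketch: when does storage, not CPU, dominate job time?
# All figures below are illustrative assumptions, not benchmark results.

DATASET_TB = 100                # size of the data to scan (assumed)
NODES = 64                      # compute nodes in the cluster (assumed)
CPU_GBPS_PER_NODE = 8.0         # rate at which each node can process data (assumed)
SHARED_STORAGE_GBPS = 20.0      # aggregate bandwidth of a single NFS-style filer (assumed)

dataset_gb = DATASET_TB * 1000

compute_time_s = dataset_gb / (NODES * CPU_GBPS_PER_NODE)  # time if CPUs were the limit
io_time_s = dataset_gb / SHARED_STORAGE_GBPS               # time if storage is the limit

print(f"Compute-bound estimate: {compute_time_s / 60:.1f} minutes")
print(f"I/O-bound estimate:     {io_time_s / 60:.1f} minutes")
print(f"Storage slows the scan by roughly {io_time_s / compute_time_s:.0f}x")
```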

Imation Rolls Out NST4000 Hybrid Storage Appliance At VMworld

Today at VMworld, Imation introduced the NST4000, a new hybrid storage appliance purpose-built for media and entertainment as well as data protection workloads.

Designing a High Performance Lustre Storage System: A Case Study

Intel’s White Paper, “Architecting a High-Performance Storage System,” shows the step-by-step design of a Lustre file system. It is available for download from the insideBIGDATA White Paper Library. “Although a good system is well-balanced, designing it is not straightforward. Fitting components together and making the adjustments needed for peak performance is challenging. The process begins with a requirements analysis followed by a design structure (a common structure was selected for the paper) and component choices.”
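
As a minimal sketch of the sizing arithmetic that such a requirements analysis drives, the snippet below derives object storage target (OST) and object storage server (OSS) counts from a bandwidth target and a capacity target. All of the requirement and per-component figures are assumptions for illustration and are not taken from Intel’s paper:

```python
# Illustrative Lustre sizing sketch: derive component counts from requirements.
# Every number here is an assumption for illustration only.

import math

# Requirements (assumed)
required_throughput_gbps = 40.0   # aggregate read/write bandwidth the users need
required_capacity_tb = 2000.0     # usable capacity target

# Component characteristics (assumed)
ost_throughput_gbps = 1.2         # sustained bandwidth of one object storage target (OST)
ost_capacity_tb = 30.0            # usable capacity of one OST
osts_per_oss = 4                  # OSTs served by each object storage server (OSS)

# Choose an OST count that satisfies both the bandwidth and the capacity requirement
osts_for_bandwidth = math.ceil(required_throughput_gbps / ost_throughput_gbps)
osts_for_capacity = math.ceil(required_capacity_tb / ost_capacity_tb)
ost_count = max(osts_for_bandwidth, osts_for_capacity)

oss_count = math.ceil(ost_count / osts_per_oss)

print(f"OSTs needed: {ost_count} (bandwidth: {osts_for_bandwidth}, capacity: {osts_for_capacity})")
print(f"OSS nodes needed: {oss_count}")
```

A full design would also cover the metadata server and the network fabric, but the same requirement-driven arithmetic informs each component choice.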

Sponsored Post: Intel Enterprise Edition Software for Lustre

Intel Enterprise Edition Software for Lustre can transform vast amounts of data into data-driven decisions. Used with Hadoop, the software also simplifies storage management by replacing partitioned, hard-to-manage storage with a single Lustre file system.

In-Memory Computing: Three Myths That Could Put Your Business at Risk

In this special guest feature, Eric Frenkiel, Co-founder and CEO of MemSQL, writes about three myths surrounding in-memory computing and how companies that don’t take advantage of IMC risk being left behind.

IBM Introduces Elastic Storage on Cloud

IBM is announcing a new software defined storage-as-a-service on IBM SoftLayer, code-named Elastic Storage on Cloud, that gives organizations access to a fully supported, ready-to-run storage environment. The offering includes SoftLayer bare metal resources and high-performance data management, and it allows organizations to move data between their on-premises infrastructure and the cloud.

Interview: Dolphin Speeds Business with Data Volume Management for SAP HANA

“Dolphin helps companies manage data volume and optimize processes so they can balance the performance and processing capabilities of SAP systems against the cost of running those systems. We develop a data volume management strategy so our customers can keep business-critical data in SAP HANA, to get the fast, efficient processing they need, and move static or business-complete data onto other storage where it is still accessible. With a data volume management strategy in place, our customers are better prepared to go live on HANA and improve their return on investment,” says Dolphin’s Dr. Werner Hopf.

GridGain Adds New Level of Security to In-Memory Computing Platform

GridGain™ Systems, provider of the leading open source In-Memory Computing (IMC) Platform, announced that it has enhanced the security features in the latest version of its Enterprise Edition product. The release enables full visibility into all data access points across the platform and adds security auditing, advanced authentication, and fine-grained authorization.

Teradata and MongoDB Empower Big Data Strategies with JSON Integration

Teradata (NYSE: TDC) has announced an alliance with MongoDB, Inc. to integrate their systems by building a high-speed, bi-directional connector based on JavaScript Object Notation (JSON). The new connector will let users easily incorporate JSON document data for analytics while improving operations with strategic intelligence.
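
On the relational side, document exchange of this kind can lean on Teradata’s native JSON data type. The sketch below is not the announced connector; it is a minimal illustration of the hand-off, assuming a hypothetical MongoDB orders collection and a hypothetical Teradata orders_json table with a JSON column, and using the generic pymongo and teradatasql client libraries:

```python
# Illustrative sketch only -- NOT the Teradata-MongoDB connector announced above.
# Moves MongoDB documents into a Teradata table with a JSON column using the
# generic pymongo and teradatasql client libraries. Hostnames, credentials, and
# database/collection/table names are placeholder assumptions.

from pymongo import MongoClient
from bson import json_util
import teradatasql

# Pull documents from a hypothetical MongoDB collection
mongo = MongoClient("mongodb://mongo.example.com:27017")
orders = mongo["shop"]["orders"]

# Load each document, serialized as JSON, into a hypothetical Teradata table:
#   CREATE TABLE orders_json (order_id INTEGER, doc JSON(16000));
with teradatasql.connect(host="teradata.example.com",
                         user="dbuser", password="dbpass") as con:
    cur = con.cursor()
    for i, doc in enumerate(orders.find()):
        cur.execute(
            "INSERT INTO orders_json (order_id, doc) VALUES (?, ?)",
            [i, json_util.dumps(doc)],  # json_util handles BSON types like ObjectId
        )
    con.commit()
```

The production connector presumably moves data in both directions and at far higher speed; the point of the sketch is only that a shared JSON representation makes the document-to-table hand-off straightforward.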