Let’s start the conversation here: If you work with big data in the cloud or deal with structured and unstructured data for analytics, you need software-defined storage.
Denodo Technologies, a leader in Data Virtualization, today announced Denodo Express (DE), a no-cost Data Virtualization tool. Designed to democratize Data Virtualization, Denodo Express allows data management professionals to start proving the value of Data Virtualization by generating new data insights in hours instead of weeks.
The rapid, accelerating growth of data, transactions, and digitally aware devices is straining today’s IT infrastructure. At the same time, storage costs, user expectations, and cost pressures are all rising. This staggering growth of data has created the need for high-performance streaming, data access, and collaborative data sharing. So – how can elastic storage help?
For a long time, the industry’s biggest technical challenge was squeezing as many compute cycles as possible out of silicon chips so that the really important, often gigantic, problems in science and engineering could be solved faster than was ever thought possible. Now, by clustering computers to work together on problems, scientists are free to take on even larger and more complex real-world computations and to analyze ever-larger volumes of data.
IBM (NYSE: IBM) announced today that Ping An Insurance (Group) Company of China, Ltd. (Ping An), the largest private insurer in the country, has implemented an IBM Software Defined Storage solution to help it cut data collection time from one month to one hour, dramatically improving its ability to meet new regulatory requirements such as the annual audit and industry data analysis.
As compute speed advanced toward its theoretical maximum, the HPC community quickly discovered that the speed of storage devices and the underlying Network File System (NFS), developed decades ago, had not kept pace. As CPUs got faster, storage became the main bottleneck in high data-volume environments.
Intel’s White Paper, “Architecting a High-Performance Storage System,” walks you through the step-by-step design of a Lustre file system. It is available for download from the insideBIGDATA White Paper Library. “Although a good system is well-balanced, designing it is not straightforward. Fitting components together and making the adjustments needed for peak performance is challenging. The process begins with a requirements analysis, followed by a design structure (a common structure was selected for the paper) and component choices.”
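The requirements-first approach described above can be sketched in a few lines: start from the bandwidth and capacity targets, then derive the component counts. This is a minimal illustration only; the function name and all figures (per-OST bandwidth and capacity, OSTs per OSS) are hypothetical placeholders, not values from the Intel white paper.

```python
import math

def size_lustre(target_bw_gbs, target_capacity_tb,
                ost_bw_gbs=0.5, ost_capacity_tb=8.0, osts_per_oss=8):
    """Estimate how many OSTs and OSS nodes meet the stated requirements.

    The design is driven by whichever requirement -- aggregate bandwidth
    or raw capacity -- demands more object storage targets (OSTs).
    All default figures are illustrative assumptions.
    """
    osts_for_bw = math.ceil(target_bw_gbs / ost_bw_gbs)
    osts_for_capacity = math.ceil(target_capacity_tb / ost_capacity_tb)
    n_osts = max(osts_for_bw, osts_for_capacity)
    # Group OSTs onto object storage servers (OSS nodes).
    n_oss = math.ceil(n_osts / osts_per_oss)
    return n_osts, n_oss

# Example requirement: 20 GB/s aggregate bandwidth, 500 TB capacity.
n_osts, n_oss = size_lustre(20, 500)
```

Here capacity, not bandwidth, dictates the OST count, which is exactly the kind of balancing trade-off the paper’s requirements analysis is meant to surface before any hardware is chosen.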