Concurrent, Inc., a leader in data application infrastructure, has released its 2015 predictions for big data and Apache Hadoop. While the past year brought expansion and education across the Hadoop ecosystem and increased investment in related initiatives, 2015 will be the year of pragmatism, as global enterprises turn their focus to getting value from these investments.
Big data is more than a buzzword, as proven by how fast organizations are adopting new analytics technologies to obtain business value from it. That is the key takeaway from a Luth Research survey of large organizations currently using big data analytics software or planning to use it in the next 12 months.
Teradata Corp. (NYSE: TDC), the big data analytics and marketing applications company, today announced engineering advancements to the Teradata Database that deliver improved analytic performance and system efficiency through new memory and CPU optimizations.
Looker, an innovative software company with a unique approach to business analytics, today announced a partnership with Teradata Corp. (NYSE: TDC), the big data analytics and marketing applications company.
In your world, numbers and data can save lives; minutes and seconds absolutely matter. Whether engaged in genome sequencing, drug design, product analysis, or risk management, life sciences research teams need high-performance technical environments that can process massive amounts of data and support increasingly sophisticated simulations and analyses.
Today at Big Data Innovation Summit 2014, GridGain Systems (www.GridGain.com), a leading innovator in open source in-memory computing solutions, announced the launch of the GridGain In-Memory Data Fabric, a comprehensive software solution that accelerates business operations and time to insight by enabling high-performance transactions, real-time streaming, and ultra-fast analytics in a single, highly scalable data access and processing layer.
“Fortissimo Foundation is a clustered, pervasive, global direct-remote I/O access system that linearly scales I/O bandwidth; memory, flash, and hard disk storage capacity; and server performance. It provides an ‘in-memory’ scale-out solution that intelligently aggregates all the resources of a data center cluster into a massive global namespace, bridging all remote compute and storage resources so that they look and act as if they were local.”
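The core idea in that description, a global namespace that hides whether a resource is local or remote, can be illustrated with a minimal conceptual sketch. Everything below (the `Node` and `GlobalNamespace` classes, the path names) is hypothetical and assumed for illustration only; it is not Fortissimo's actual API or architecture.

```python
# Conceptual sketch of a "global namespace" over clustered storage:
# a directory maps each global key to the node that owns it, so the
# caller reads by key without knowing where the data physically lives.

class Node:
    """One server in the cluster, holding some data locally."""
    def __init__(self, name):
        self.name = name
        self.store = {}  # local key -> data

    def read(self, key):
        return self.store[key]


class GlobalNamespace:
    """Aggregates every node's storage into one flat namespace."""
    def __init__(self, nodes):
        self.nodes = {n.name: n for n in nodes}
        self.directory = {}  # global key -> owning node name

    def register(self, node_name, key, data):
        self.nodes[node_name].store[key] = data
        self.directory[key] = node_name

    def read(self, key):
        # Look up the owner and route the read there; the caller
        # never sees whether the data was local or remote.
        owner = self.directory[key]
        return self.nodes[owner].read(key)


a, b = Node("a"), Node("b")
ns = GlobalNamespace([a, b])
ns.register("a", "/data/x", b"held-on-node-a")
ns.register("b", "/data/y", b"held-on-node-b")
assert ns.read("/data/y") == b"held-on-node-b"
```

In a real system the directory lookup and remote read would cross the network (with caching and replication deciding what stays "in memory"), but the uniform-access contract is the same: one namespace, location-transparent reads.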