The Importance of Vectorization Resurfaces

Vectorization offers potential speedups in codes with significant array-based computations—speedups that amplify the improved performance obtained through higher-level, parallel computations using threads and distributed execution on clusters. Key features for vectorization include tunable array sizes to reflect various processor cache and instruction capabilities and stride-1 accesses within inner loops.

How Big Data and AI Are Transforming How We Buy Cars

In this contributed article, John Price, CEO of Vast, explores how big data coupled with the methods of artificial intelligence (AI) have transformed how we buy cars. Big data and AI aren’t just changing the sales process for consumers. They’re changing it for salespeople, too, giving each salesperson an “intelligent assistant” in their pocket.

“Above the Trend Line” – Your Industry Rumor Central for 9/18/2017

Above the Trend Line: your industry rumor central is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items grouped by category, such as people movements, funding news, financial results, industry alignments, rumors, and general scuttlebutt floating around the big data, data science, and machine learning industries, including behind-the-scenes anecdotes and curious buzz.

AI is on Its Way to the Enterprise, Bringing Easy Analytics with It

In this contributed article, Doug Bordonaro, Chief Data Evangelist at ThoughtSpot, takes a personal ride through all the ways that artificial intelligence (AI) is making strong inroads into the enterprise, specifically how the promise of AI in the enterprise is finally being realized.

AWS Cost Control Recommendations and How They Work

In this contributed article, Jay Chapel, CEO of ParkMyCloud, provides six instrumental AWS cost control recommendations and explains how they work. These will interest enterprises large and small that want to reduce their bills, just as with any utility you might run at home. We'll see how cloud management and cloud optimization tools are important for all of these organizations looking for cost control.

How To Spend Less Time Processing Queries And More Time Gaining Insights

In this contributed article, Richard Heyns, CEO of Brytlyt, discusses the trend of companies adopting GPU database software and what they should keep in mind when doing so.

2017 Data Connectivity Outlook

Our friends over at Progress revealed some new findings from their 2017 Data Connectivity Outlook global survey. This fourth annual survey, based on responses from 1,200 business and IT professionals around the world, validates the explosive growth seen in SaaS data sources and the common challenges faced when trying to connect to data in a hybrid environment.

Intel® Parallel Studio XE Helps Developers Take their HPC, Enterprise, and Cloud Applications to the Max

Intel® Parallel Studio XE is a comprehensive suite of development tools that make it fast and easy to build modern code that gets every last ounce of performance out of the newest Intel® processors. This tool-packed suite simplifies creating code with the latest techniques in vectorization, multithreading, multi-node execution, and memory optimization.

Identifying Health Risks Using Pattern Recognition and AI

Physicians are increasingly using AI technologies to treat patients with superhuman speed and performance, and predictive analytics will be key to delivering more effective, proactive, and quality care. Stephen Wheat, Director of HPC Pursuits at Hewlett Packard Enterprise, explores how we can identify health risks using pattern recognition and AI. 

‘Learning Database’ Speeds Queries from Hours to Seconds

University of Michigan researchers developed software called Verdict that enables existing databases to learn from each query a user submits, finding accurate answers without trawling through the same data again and again. Verdict allows databases to deliver answers more than 200 times faster while maintaining 99 percent accuracy. In a research environment, that could mean getting answers in seconds instead of hours or days.