GridGain 8.8 Advances Its Multi-Tier Database Engine to Scale Beyond Available Memory Capacity and Meet Growing Customer Demand

GridGain® Systems, provider of enterprise-grade in-memory computing solutions powered by the Apache® Ignite® distributed database, announced GridGain 8.8, the latest release of the company’s in-memory computing platform. The release features enhanced support for GridGain’s multi-tier database engine, which scales up and out across memory and disk.
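
To make the multi-tier idea concrete, here is a minimal sketch of a key-value store that keeps hot entries in memory and spills the least recently used ones to disk once memory capacity is exceeded. This is purely illustrative: the TieredStore class, its capacity parameter, and the spill format are hypothetical and do not reflect GridGain's or Apache Ignite's actual architecture or APIs.

```python
import os
import pickle
import tempfile
from collections import OrderedDict

# Toy multi-tier store (hypothetical, for illustration only): hot keys stay in an
# in-memory LRU map; once memory capacity is exceeded, the coldest entry is
# written to a file on disk and read back on demand.
class TieredStore:
    def __init__(self, memory_capacity=2, spill_dir=None):
        self.memory_capacity = memory_capacity
        self.memory = OrderedDict()                       # memory tier: key -> value
        self.spill_dir = spill_dir or tempfile.mkdtemp()  # disk tier: one file per key

    def put(self, key, value):
        self.memory[key] = value
        self.memory.move_to_end(key)                      # mark as most recently used
        if len(self.memory) > self.memory_capacity:
            cold_key, cold_value = self.memory.popitem(last=False)  # evict LRU entry
            with open(os.path.join(self.spill_dir, str(cold_key)), "wb") as f:
                pickle.dump(cold_value, f)

    def get(self, key):
        if key in self.memory:                            # serve from memory when possible
            self.memory.move_to_end(key)
            return self.memory[key]
        path = os.path.join(self.spill_dir, str(key))
        if os.path.exists(path):                          # otherwise fall back to disk
            with open(path, "rb") as f:
                return pickle.load(f)
        return None

store = TieredStore(memory_capacity=2)
for i in range(4):
    store.put(f"k{i}", i)                 # k0 and k1 spill to disk once memory fills up
print(store.get("k3"), store.get("k0"))   # 3 comes from memory, 0 is read back from disk
```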

When Big Data Collides with Intellectual Property Law

In this contributed article, technologist Bernard Brode discusses how the realm of intellectual property – with all its inherent twists and turns – is being subjected to the latest waves of cutting-edge analytic tools that fall within the sphere of big data.

How AI Will Shape the Future of Customer Communications

In this contributed article, Eric Schurke, VP of Operations at Ninja Number, discusses how AI solutions are changing the way businesses communicate in 2021 and beyond. To succeed, businesses should focus on three aspects: incorporating customer feedback, over-communicating wherever they can, and building a culture of customer success.

AI-driven Platform Identifies and Remediates Biases in Data

Synthesized has released the Community Edition of its data platform for Bias Mitigation. Released as a freemium version, the offering incorporates AI research and cutting-edge techniques to enable any organization to quickly identify potential biases within its data and immediately start to remediate these flaws.

Feature Stores are Critical for Scaling ML Initiatives and Accelerating both Top-line and Bottom-line Impact

Feature stores are emerging as a critical component of the infrastructure stack for ML. They solve the hardest part of operationalizing ML: building and serving ML data to production. They allow data scientists to build more accurate ML features and deploy these features to production within hours instead of months.
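
As a rough illustration of what "building and serving ML data" means in practice, the sketch below models a tiny feature store: features are registered once, materialized from raw records, and then served by name at inference time. The FeatureStore class, the order_count_7d feature, and the data are all made up for illustration and are not tied to any particular feature store product.

```python
from datetime import datetime, timezone

# Toy feature store (hypothetical, for illustration only): features are defined once,
# materialized from raw data, and served consistently to online inference.
class FeatureStore:
    def __init__(self):
        self._features = {}   # feature name -> transformation function
        self._online = {}     # (feature name, entity id) -> (value, updated_at)

    def register(self, name, transform):
        """Register a named feature with the transformation that computes it."""
        self._features[name] = transform

    def materialize(self, name, raw_rows):
        """Compute a feature from raw rows and push the latest values to the online store."""
        for entity_id, value in self._features[name](raw_rows):
            self._online[(name, entity_id)] = (value, datetime.now(timezone.utc))

    def get_online(self, name, entity_id):
        """Serve the latest value of a feature for one entity at inference time."""
        return self._online.get((name, entity_id))

# Example feature: order count per customer over some window (made-up data).
def order_count_7d(raw_rows):
    counts = {}
    for row in raw_rows:
        counts[row["customer_id"]] = counts.get(row["customer_id"], 0) + 1
    return counts.items()

store = FeatureStore()
store.register("order_count_7d", order_count_7d)
store.materialize("order_count_7d", [{"customer_id": "c1"},
                                     {"customer_id": "c1"},
                                     {"customer_id": "c2"}])
print(store.get_online("order_count_7d", "c1"))  # -> (2, <timestamp>)
```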

Why 2021 is The Year of Low-Code

Forrester analysts estimate that 75% of all enterprise software will be built with low-code technology this year. As the pandemic’s impact continues well into 2021, IT leaders will continue to rely on rapid, agile low-code software development platforms to roll out business-critical solutions and expand digital channels. Platforms that integrate augmented reality, holistic multi-experiences, and easy enterprise data access will enable organizations to navigate continued economic uncertainty.

Kyligence CEO Identifies Top Big Data, Cloud, and Data Analytics Predictions for 2021

Kyligence, originator of Apache Kylin and developer of the AI-augmented analytics platform Kyligence Cloud, provides its cloud data analytics predictions for 2021, focusing on the rapid growth of cloud-native data warehouse and data storage services, which will enable a massive acceleration of analytics adoption.

Best of arXiv.org for AI, Machine Learning, and Deep Learning – December 2020

In this recurring monthly feature, we filter the recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning, and deep learning – from disciplines including statistics, mathematics, and computer science – and provide you with a useful “best of” list for the month.

TOP 10 insideBIGDATA Articles for December 2020

In this continuing regular feature, we give all our valued readers a monthly heads-up on the top 10 most viewed articles appearing on insideBIGDATA. Over the past several months, we’ve heard from many of our followers that this feature enables them to catch up with important news and features flowing across our many channels.

Cloud Optimization 2.0: Beyond Better Performance

In this contributed article, Ross Schibler, Co-founder and CEO of Opsani, introduces the concept of Optimization 2.0. Companies understand the importance of optimizing to increase efficiency and performance and decrease costs – but humans can only do so much. Optimization 2.0 requires humans to put some trust in machines: to work with ML and AI to explore all the possible combinations of potential solutions, choose the right one at the right time, and automate the process of optimization.
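
To hint at what automating that search might look like, the toy sketch below scores every combination in a small space of resource configurations and picks the cheapest one that still meets a latency target. The cost and latency models are invented stand-ins for real measurements, and this is not Opsani's method; a production system would measure live workloads and use ML to search far larger configuration spaces.

```python
from itertools import product

# Toy automated-optimization loop (illustrative only): enumerate a small space of
# resource configurations, keep those that meet a latency target, pick the cheapest.
CPU_OPTIONS = [1, 2, 4, 8]       # vCPUs
MEMORY_OPTIONS = [2, 4, 8, 16]   # GiB
LATENCY_SLO_MS = 120             # target latency

def predicted_latency_ms(cpu, mem):
    # Made-up model: more resources lower the latency, with diminishing returns.
    return 400 / cpu + 200 / mem

def hourly_cost(cpu, mem):
    # Made-up pricing: linear in resources.
    return 0.04 * cpu + 0.01 * mem

def best_config():
    """Score every combination and return the cheapest one that meets the SLO."""
    candidates = [(hourly_cost(c, m), c, m)
                  for c, m in product(CPU_OPTIONS, MEMORY_OPTIONS)
                  if predicted_latency_ms(c, m) <= LATENCY_SLO_MS]
    return min(candidates) if candidates else None

cost, cpu, mem = best_config()
print(f"cheapest SLO-compliant config: {cpu} vCPU / {mem} GiB at ${cost:.2f}/hr")
```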