insideBIGDATA Latest News – 11/24/2021


In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

MOSTLY AI Becomes First Synthetic Data Provider to Achieve ISO Certification

MOSTLY AI, which pioneered the creation of AI-generated synthetic data, announced that it has just been awarded ISO 27001:2013 certification. The ISO 27001 standard is a globally recognized information security standard. With data privacy and information security at the heart of everything MOSTLY AI does, the company makes compliance with security standards and regulations a high priority, with a dedicated privacy and security team. In March 2021, the company received its SOC 2 Type 2 certification, which is an audit report capturing how a company safeguards customer data and how well internal controls are operating. 

“We are very proud to have received our ISO 27001:2013 certification,” said Melanie Hartl, Chief Information Security Officer at MOSTLY AI. “Since we work with many global Fortune 100 enterprises, like banks and insurance companies, who handle highly sensitive data, it is imperative that they can trust in our ability to keep data secure.”

Yugabyte Meets Developer Demand for Comprehensive PostgreSQL Compatibility with YugabyteDB 2.11

Yugabyte, a leading open source distributed SQL database company, announced the general availability of YugabyteDB 2.11, with updates that extend PostgreSQL compatibility in the open source database. These updates allow application developers to use powerful and familiar PostgreSQL features without compromising resilience, scale, or performance. This release extends YugabyteDB’s lead as the most PostgreSQL-compatible distributed SQL database in the world.

“The biggest roadblock to database adoption is familiarity. For developers, PostgreSQL is the most familiar database. Being able to work within a similar framework is critical for productivity. What other distributed SQL databases get wrong is they cherry pick features, either providing good compatibility without true distributed SQL, or the reverse,” said Karthik Ranganathan, co-founder and CTO, Yugabyte. “We’ve heard loud and clear from our community that true distributed SQL with complete PostgreSQL compatibility is the gold standard. That is what we are delivering.”

CAST AI™ Introduces ‘Instant Rebalancing,’ Advancing its Cloud Management Platform 

CAST AI™, the AI-driven cloud optimization company specializing in cost optimization for customers running cloud-native applications, introduced Instant Rebalancing, a feature that immediately and automatically reduces cloud compute costs by 50 to 75 percent. Built upon CAST AI’s cutting-edge artificial intelligence technology, Instant Rebalancing enables customers to analyze their cluster configuration and – at the click of a button – rightsize it to the most cost-efficient compute resources available. With Instant Rebalancing, customers automatically optimize a cluster from its current state to the optimal configuration by seamlessly modifying compute resources, based on real-time inventory availability and pricing. Customers typically realize savings within five minutes.

“A substantial part of cloud cost optimization is rightsizing, or using the best available resources to optimize your deployments,” said Laurent Gil, Chief Product Officer at CAST AI. “We’ve developed advanced algorithms that make rightsizing instant, and as easy as clicking a button. Customers can then integrate instant rebalancing into cluster onboarding or continuous optimization through Infrastructure as Code (IaC). Our platform provides customers with significant cost savings by enabling them to quickly and easily get optimized and stay optimized, ensuring a truly frictionless experience.”
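The rightsizing idea described above can be reduced to a simple selection problem: given a cluster's requested CPU and memory and a priced instance inventory, repeatedly pick the instance type with the lowest price per unit of remaining demand. The sketch below is a hypothetical illustration of that concept only; the instance names, prices, and greedy heuristic are assumptions, not CAST AI's actual inventory or algorithm.

```python
def rightsize(cpu_needed, mem_needed, inventory):
    """Greedily pick instance types until the requested CPU and memory
    are covered, always choosing the cheapest price per unit of the
    demand an instance would actually satisfy."""
    chosen = []
    while cpu_needed > 0 or mem_needed > 0:
        def covered(inst):
            # Units of *remaining* demand this instance type would cover.
            return (min(inst["cpu"], max(cpu_needed, 0)) +
                    min(inst["mem"], max(mem_needed, 0)))
        best = min(inventory, key=lambda i: i["price"] / covered(i))
        chosen.append(best["name"])
        cpu_needed -= best["cpu"]
        mem_needed -= best["mem"]
    return chosen

# Illustrative inventory: the xlarge is cheaper per unit of resource.
inventory = [
    {"name": "large",  "cpu": 4, "mem": 16, "price": 0.20},
    {"name": "xlarge", "cpu": 8, "mem": 32, "price": 0.38},
]
```

For a cluster requesting 16 vCPUs and 64 GB, the sketch picks two xlarge nodes rather than four large ones, trimming the hourly bill; a production rebalancer would of course also weigh availability, spot pricing, and pod placement constraints.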

Red Hat Bolsters Partner Ecosystem to Accelerate Data Science Pipelines Across the Open Hybrid Cloud

Red Hat, Inc., a leading provider of open source solutions, announced the availability of Red Hat OpenShift Data Science as a field trial, as well as an expanded partner ecosystem focused on this new cloud service offering. As leading artificial intelligence and machine-learning (AI/ML) partners support the service, Red Hat customers are provided with a range of solutions optimized for Red Hat OpenShift, letting them select the technologies to best meet their specific machine learning needs across the open hybrid cloud and edge computing environments.

“Data science and machine learning are helping drive innovation and business value in nearly every industry,” commented Mike Piech, vice president and general manager, Cloud Data Services, Red Hat. “For many companies the biggest barrier to adoption is the complexity of wiring together the necessary data sources with diverse model training and model deployment technologies. With Red Hat OpenShift Data Science, Red Hat’s contributions to Open Data Hub, and our extensive partner ecosystem, we’re helping organizations overcome such complexity to begin harnessing the full potential of machine learning from the leader in trusted open source technology.”

Druva Unveils Data Resiliency Cloud

Druva Inc. unveiled the Druva Data Resiliency Cloud. Delivering an at-scale SaaS solution for data resiliency, Druva enables enterprises to radically simplify data protection, streamline data governance, and gain data visibility as they accelerate cloud adoption. Leveraging a cloud-native, centralized, and automated approach to data protection and disaster recovery, the Druva Data Resiliency Cloud is designed to help enterprises manage data that has become increasingly fragmented across multi-cloud environments.

“Enterprises are working to stay ahead of three major trends: the accelerating rate of cloud migrations, the massive growth of data, and the concerning levels of malicious attacks,” said Jaspreet Singh, founder and CEO, Druva. “When data is resilient, a business can get back to business, and we are on a mission to make data resilient, secure, accessible, and actionable for organizations around the world. The Druva Data Resiliency Cloud is the culmination of years of unparalleled industry experience building a completely cloud-native platform, which we believe is the end-game of data resiliency. Leveraging the core innovations and benefits of the public cloud, we are bringing customers and partners a truly unmatched experience.”

First Pay-As-You-Go AI Management Platform Launches to Make AI Accessible to All Companies

The company has launched an AI Management Platform on Google Cloud, enabling companies with limited resources and infrastructure to support robust AI projects. The feature-rich platform democratizes AI by removing the complicated licensing fees that can run companies into the hundreds of thousands—even millions—each year. This empowers teams to deliver projects faster through an intuitive interface that allows users to create, deploy, monitor, and retrain models in a few clicks. By managing the complexities throughout the AI lifecycle and offering revolutionary pay-as-you-go pricing, the platform is breaking down the barriers that impede organizations from building a robust, high-impact AI practice.

“We started with the goal of improving the lives of people who work with data day in and day out,” said Tuncay Isik, the company’s founder and CEO. “Our industry-first AI management platform removes production inhibitors while still scaling the value, domain expertise, and impact users can have at their organizations. By putting our platform in the hands of users via the relationship with Google, it will shine a whole new light on the way predictive analytics can—and should—be done.”

Entropik Tech Announces Beta Launch of its Disruptive Conversational Intelligence Platform, Decode

Entropik Tech, a leader in Emotion AI, announced the beta launch of its new conversational intelligence platform, Decode, which comes ready to be integrated with the existing conferencing and collaboration ecosystem. It seamlessly gathers conversation data and creates a layer of intelligence on top to turn conversations into actionable insights that will increase the efficiency and productivity of organizations.

“We are excited to announce the beta launch of our latest innovation, Decode,” commented Ranjan Kumar, Founder and CEO, Entropik Tech. “Being a conversation intelligence platform, it offers companies a chance to unlock the unlimited potential of all conversations. The product is self-serve, easy to use, and enables users to collate all their conversations in one place, regardless of the video conferencing tools they use. With the availability of Emotion AI capabilities, we believe that it signals a new era in video conversations. We look forward to announcing the General Availability of Decode in the next two months.”

Introducing TheLoops 1.0 – an Intelligent Support Operations Platform

TheLoops, an intelligent support operations platform, announced the release of version 1.0 of its enterprise-grade platform. TheLoops transforms the support experience, enabling agents to make decisions faster and deliver modern support with real-time access to operational customer product feature data within tools such as Salesforce, Zendesk, Intercom and Jira. TheLoops contextualizes data for businesses, delivering digital customer transformation. By learning from collaborations across support, customer success and engineering, it revolutionizes the support experience by providing insights from broad data sets and recommendations embedded in intelligent process flows – upskilling representatives to make them preventative and growth-oriented. In effect, TheLoops bridges the gap between support and engineering. In addition, real-time insights drawn from people, process, and tooling interactions help support managers be more effective in monitoring the state of their service operations.

“We live in a world where more digital businesses recognize that leveraging automation and analytics to support human-centric engagement will improve the quality of customer relationships and drive empathetic loyalty,” said Somya Kapoor, CEO of TheLoops. “Many companies have digitized their data, but not their customer experience. This is where TheLoops steps in. By having an agile approach to customer support, we enable our clients to scale their businesses while reducing operational costs. TheLoops transforms support from being a cost center to a growth driver.”

Narrative Launches Buyer Studio, a No-Code App That Makes Buying Data Fast, Easy, and Cost-Effective

To help give organizations control of the data acquisition process while saving time and money, Narrative, the Data Commerce Platform, announced the launch of Buyer Studio. The new software offering enables organizations to find and access the precise data they need while automating the most time- and labor-intensive aspects of buying data. Traditionally, buying data is a cumbersome, manual, and highly inefficient multi-step process, involving various internal teams working over numerous months to find and evaluate suppliers, negotiate terms, run ETL, normalize schemas, integrate with other systems, and so on. Now, with Buyer Studio, organizations can simply find the precise data they need, place an order, and have it delivered directly to the systems they want, all with just a few clicks.

“Traditionally when you buy from a data broker, you have no say in the data you receive—you get what they offer you, and that’s it,” said Nick Jordan, founder and CEO of Narrative. “You are also most likely getting and paying for duplicate data. Narrative Buyer Studio is a synthesizer that provides all of the data that businesses want and need and makes it accessible via easy-to-use, understandable, and affordable options. With Buyer Studio, buyers no longer need to accept data from a firehose and waste money paying for duplicate and useless information. Buyers have the flexibility of buying data from one or many providers, transparency into how and when the data was collected, and the confidence of knowing that the data they purchase is exactly what they want. With Buyer Studio they can customize, filter, and control the data they need with no duplicates and set their price.”
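The duplicate-data problem Jordan describes can be illustrated with a minimal sketch: when the same individual arrives from multiple providers, keep only the freshest copy per stable key so you pay for each record once. The field names and matching key here are illustrative assumptions, not Narrative's actual schema or matching logic.

```python
def dedupe(records, key="email", ts="collected_at"):
    """Keep one record per key, preferring the most recently collected copy.
    ISO-8601 date strings compare correctly lexicographically."""
    best = {}
    for r in records:
        k = r[key]
        if k not in best or r[ts] > best[k][ts]:
            best[k] = r
    return list(best.values())

# Two providers supply the same person; only the newer copy is kept.
records = [
    {"email": "a@example.com", "collected_at": "2021-10-01", "provider": "p1"},
    {"email": "a@example.com", "collected_at": "2021-11-01", "provider": "p2"},
    {"email": "b@example.com", "collected_at": "2021-09-15", "provider": "p1"},
]
```

In practice a data-commerce platform would also normalize schemas and hash identifiers before matching, but the payoff is the same: buyers stop paying twice for the same row.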

Nasdaq Releases Data Fabric, New Managed Data API Service Available from Nasdaq Data Link

Nasdaq announced the launch of Data Fabric, a managed data solution to help investment management firms scale their data infrastructure with enhanced quality, governance and integrity. Built off Nasdaq Data Link, Data Fabric enables firms to significantly improve data time-to-value and can power investment processes and strategies with new datasets in a matter of days or weeks instead of months. The platform provides secure, end-to-end data hosting through fully managed infrastructure and data onboarding services, enabling firms to integrate internal and external data sets quickly to focus on their competitive edge.

“For many financial institutions, it is extremely expensive and time-consuming to build out a full data stack and develop reliable internal infrastructure,” said Bill Dague, Head of Alternative Data and Nasdaq Data Link. “We developed Data Fabric to empower firms to leapfrog that entire process. Financial services firms are already grappling with attracting and retaining the best technology and data science talent and Data Fabric ensures that those individuals can spend their time developing meaningful insights from data – not overseeing the infrastructure that should already exist.”

Treasure Data Unveils End-to-End Governance, Security and Privacy Foundation

Treasure Data™, a leading enterprise customer data platform (CDP), introduced the Treasure Data Trusted Foundation. The suite of features enables marketers to manage all data privacy and consent preferences related to individuals in the unified customer data record with data access permissions and controls – all within one smart platform. With Treasure Data’s best-in-class privacy, consent management, compliance and security controls, teams can quickly leverage trusted customer data to deliver both personal and ethical customer experiences. Marketers become better equipped to navigate an increasingly complex privacy landscape while benefiting from access to advanced audience segmentation, personalization and activation.
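The per-purpose consent control described above boils down to a gate in front of activation: a profile is eligible for a campaign only if the individual's recorded consent covers that specific purpose. The sketch below is a hypothetical illustration of that gate; the field names are assumptions, not Treasure Data's actual schema or API.

```python
def activatable(profiles, purpose):
    """Return only the profiles whose recorded consent preferences
    include the given purpose; profiles with no recorded consent
    are excluded by default."""
    return [p for p in profiles if purpose in p.get("consents", ())]

profiles = [
    {"id": 1, "consents": {"email_marketing", "analytics"}},
    {"id": 2, "consents": {"analytics"}},
    {"id": 3},  # no consent recorded: never activated
]
```

Filtering the audience at segmentation time, rather than at send time, is what lets marketers pair personalization with privacy: the downstream activation tools only ever see profiles that are already compliant for that purpose.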

“Just as marketers understand the need for one-to-one personalization, they must also recognize the need for one-to-one privacy,” said Tamar Shor, vice president of product strategy at Treasure Data. “Brand reputation now relies on customer data stewardship balanced with personalization across every touchpoint. Treasure Data’s Trusted Foundation empowers marketers to deliver on this promise with campaigns that pair personalization with privacy, further building trust and respecting privacy while maximizing efficiencies.”

ChaosSearch Data Lake Platform is First to Unlock JSON Files for Analytics at Scale

ChaosSearch announced JSON Flex™, a powerful new capability that delivers a first-of-its-kind, scalable, cloud-native solution for analyzing JavaScript Object Notation (JSON) log files. The ChaosSearch Data Lake Platform can now help data engineers reduce the cost, complexity and time associated with accessing and analyzing complex nested JSON files. The JSON file format has become a standard for logging, and it’s common for data engineers to have heavily nested JSON within custom logs like CloudTrail and Sidecar. However, the JSON format requires significant preparation and transformation before it can be analyzed, and flattening JSON can result in exploding data volume growth. These issues often prohibit companies from making JSON files available for on-demand analytics. Data engineers have limited options: they can exclude or transform JSON via complicated data pipelines that must be revisited whenever requirements change, or, at worst, fully expand all the JSON permutations upfront at the cost of a data explosion. These options create challenges for the business analysts who want access to all of the data and the ability to experiment with various searches as part of their analysis.
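The "data explosion" from flattening can be made concrete with a short sketch: every array in a nested record multiplies the number of flat rows, so a single log event with a few nested lists fans out into many rows before any analysis has happened. This is a generic illustration of the problem JSON Flex targets, not ChaosSearch's indexing approach.

```python
from itertools import product

def flatten(record, prefix=""):
    """Fully expand a nested JSON object into flat rows.
    Nested dicts become dotted column names; each list multiplies
    the row count (a cartesian product across fields)."""
    rows = [{}]
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            branch = flatten(value, prefix=name + ".")
        elif isinstance(value, list):
            branch = [r for v in value for r in flatten({name: v})]
        else:
            branch = [{name: value}]
        rows = [{**a, **b} for a, b in product(rows, branch)]
    return rows

# One event with a 3-element list and a 2-element list of objects
# expands to 3 * 2 = 6 flat rows.
event = {"user": "alice",
         "tags": ["a", "b", "c"],
         "resources": [{"id": 1}, {"id": 2}]}
```

With realistic logs, where arrays can hold dozens of entries and nest several levels deep, the multiplicative blow-up is what makes upfront flattening pipelines so costly.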

“Until the introduction of JSON Flex, there hasn’t been a scalable analytics solution for complex, nested JSON files,” said Thomas Hazel, CTO, Founder and Chief Scientist, ChaosSearch. “Organizations have been forced to rely on static, limited views of their data that ultimately lead to less valuable insights. We’re unlocking that data and democratizing it for the masses by delivering a platform that can automatically index and present all the data at once — making it easier to search, understand and leverage for business insights.”

Panzura Rolls Out 2nd-Gen Panzura Data Services With Complimentary and Paid Tiers, Simplified Value-based Pricing

Panzura has released the second generation of Panzura Data Services to provide all Panzura customers with observability and visibility over their data, and more agile consumption-based pricing that flexibly supports them and scales as they grow. Panzura Data Services is a powerful SaaS data management solution that offers a single, unified view and management of enterprise data, whether it is stored in the cloud, on premises in a data center, or at the edge. It also offers search and audit capabilities that allow users to see and find files across their entire data storage footprint. Panzura Data Services is now available in two tiers: a complimentary Basic service offered to all users of Panzura CloudFS, and new paid Licensed tiers priced according to the number of concurrent users.

“The second generation of Panzura Data Services allows organizations to manage their unstructured data differently than the exclusively capacity-based paradigm of the past,” said Panzura’s chief innovation officer, Edward M.L. Peters, Ph.D. “Now, they can go beyond an exclusively storage-based approach and separately license access for value-added functions such as enhanced audit and search capabilities. This allows them to easily access the value embedded in their data and provides for enhanced decision making.”

Alpine Intuition launches “iSquare for Pharma” – a version of its AI hosting platform optimized for companies in the pharmaceutical industry

AI-as-a-Service company Alpine Intuition launched “iSquare for Pharma”, the first release of its flagship platform for completely automated AI hosting, tailored to the needs of companies in the pharmaceutical sector. According to industry research, one in two organizations has adopted or plans to adopt AI in at least one business function, with global enterprise spend on AI-related solutions expected to break the $500 billion mark by 2024. However, it remains difficult for most companies to utilize the technology. One of the biggest pain points is deployment, with the majority of companies spending up to 12 months deploying AI models into production, which comes with increased overhead and computing costs. To address this pain point, Alpine Intuition has launched iSquare, an automated hosting platform that allows firms to easily and cost-effectively deploy AI models, opening up the technology without the need for AI specialists or DevOps skills.

“Deploying AI technology is fundamental for any firm to build and sustain competitive advantage in our digital age,” said Sebastian Savidan, Co-founder and CEO of Alpine Intuition. “With iSquare, we are not only levelling the playing field by allowing any company to deploy AI but also bringing advanced features to market, such as real-time inference which allows predictions to be made at any time with an immediate response, as needed for example with streaming data. We are very excited to be putting this technology into the hands of so many businesses, starting in the pharmaceutical sector.”

Open Source Immutable Database Adds Cluster Support Capable of Billions of Transactions and Potentially Unlimited Cloud Storage

The first and only open source enterprise-class database with data immutability at scale, immudb, can now be deployed in cluster configurations for demanding applications that require high scalability — up to billions of transactions per day — and high availability. Codenotary’s immudb 1.1 update also enables databases to use Amazon’s S3 storage cloud so they will never run out of disk space. 

“With this update, we’re addressing the most requested capability to scale to any level of data store,” said Jerónimo Irázabal, lead architect at Codenotary – the company behind the immudb project. “Banking applications are one example that require ultra-secure and tamperproof transaction ledgers while at the same time retaining high scalability.”

Habana Labs Announces Turnkey AI Training Solution Featuring Habana Gaudi Platform and DDN AI400X2 Storage System

Habana Labs, an Intel Company and leading developer of AI processors, announced the availability of a turnkey, enterprise-class AI training solution featuring the Supermicro X12 Gaudi AI Training Server with the DDN AI400X2 Storage system. This system is the product of a collaboration between Habana Labs, Supermicro, and DDN, a leader in AI data management and storage. With eight Habana Gaudi purpose-built AI processors, the Supermicro X12 Gaudi AI Server provides customers with highly cost-efficient AI training, ease of use and system scalability. Integration of the Gaudi platform with the DDN AI400X2 appliance eliminates storage bottlenecks found in traditional NAS storage and optimizes utilization of AI compute capacity.

“The Habana team is committed to bringing Gaudi’s price performance, usability and scalability to enterprise AI customers who need more cost-effective AI training solutions,” said Eitan Medina, chief business officer of Habana Labs. “We are pleased to support our customers with this new turnkey solution that brings the efficiency of the Supermicro X12 Gaudi AI Server together with the data management and storage performance of the DDN AI400X2 system to augment utilization of AI compute capacity and enable us to address this growing need in training deep learning models.”

Gurobi 9.5 Delivers Enterprise Features and Even Better Performance

Gurobi Optimization, LLC, creator of the fast mathematical optimization solver, announced the release of Gurobi Optimizer 9.5. This release provides customers with an even faster compute engine, with impressive performance improvements across all supported problem types. Customers will discover over a dozen enhancements across the product, such as native support for Apple M1, powerful new heuristics for non-convex quadratic models, norm constraints, deterministic work measures, memory limit parameters, and more user control of IIS computation, as well as improvements to callbacks and tuning.

“I’m confident that our customers will be really pleased with Gurobi 9.5. And happy customers are the foundation upon which our whole business is built,” said Dr. Edward Rothberg, Chief Executive Officer and Co-founder of Gurobi Optimization.

ScaleOut Software Announces Azure Digital Twins Integration for its ScaleOut Digital Twin Streaming Service™

ScaleOut Software announced major extensions to the ScaleOut Digital Twin Streaming Service™ that integrate its Azure-based in-memory computing platform with Microsoft’s Azure Digital Twins cloud service. This integration adds key new capabilities for real-time analytics to Azure Digital Twins and unlocks important new use cases in a variety of applications, such as predictive maintenance, logistics, telematics, disaster recovery, cyber and physical security, health-device tracking, IoT, smart cities, financial services, and ecommerce.

“We are excited to combine our in-memory computing technology with the popular Azure Digital Twins platform to deliver fast, scalable insights that help address real-time challenges across industries,” said Dr. William Bain, ScaleOut Software’s CEO and founder. “By incorporating this technology, ScaleOut Software is enabling a new wave of applications for Azure Digital Twins, and we look forward to helping our customers take full advantage of this integration to meet their real-time monitoring and streaming analytics needs.”

Introducing CockroachDB 21.2: Survive and Thrive in a Distributed World 

Cockroach Labs, the company behind CockroachDB, the most highly evolved SQL database on the planet, announced the release of CockroachDB 21.2, further strengthening its position as an ideal transactional database for cloud-native applications. CockroachDB 21.2 delivers improvements that let developers integrate more seamlessly with event-driven data architecture, build against CockroachDB with more schema design and query optimization tools, and operate more easily at a massive scale. Deployment of modern data architecture in the cloud demands a scalable, resilient infrastructure that can extract the full value of the distributed environment. However, the majority of infrastructure still used beneath critical applications was not built to meet the demands of today’s real-time, global economy. Organizations are looking for data platforms that can effortlessly perform as the market shifts to cater to cloud-native solutions. 

“Most of our customers turn to CockroachDB for a scalable and resilient relational database—but they also value a familiar and comfortable developer experience, simple integrations with their preferred stack, and easy operations. CockroachDB was built by developers, for developers, and 21.2 builds upon these core principles,” said Spencer Kimball, CEO and co-founder of Cockroach Labs. “With the help of our customers and their feedback, CockroachDB 21.2 is another step forward in our mission to make it easy to build world-changing applications.”

PingCAP Introduces its New Developer Tier to Boost Application Innovation with TiDB Cloud 

PingCAP, a leading distributed SQL provider, announced the availability of its new Developer Tier for TiDB Cloud. The fully managed database-as-a-service now allows developers to easily launch a small TiDB cluster for free for up to one year. The TiDB Cloud Developer Tier introduces a 12-month trial period, in which developers can build and test applications directly in the platform. Through this offering, PingCAP is breaking down the adoption barrier for TiDB, providing the crucial time developers need to experience first-hand how TiDB and TiDB Cloud can support their mission-critical applications and workloads, while giving decision makers a time frame to properly evaluate and determine where TiDB delivers the highest ROI for their needs.

“Our commitment to our customers starts before they choose PingCAP as their provider. Through this new developer tier, we are giving users the appropriate time to properly test and evaluate the benefits of TiDB and TiDB Cloud,” said Shen Li, SVP, Head of Global Business at PingCAP. “This new tier aims to ease the decision-making process and give customers the confidence that they are choosing the best option for their projects. Moreover, we are shattering the barrier for TiDB adoption and making it easier for our customers to onboard TiDB in their mission-critical applications.” 

Datadobi Software Enhancements Power Agile Multi-Cloud Expansion, Flexible Data Reorganization, Lower Costs

Datadobi, a leader in unstructured data management software, announced enhancements to its vendor-neutral unstructured data mobility engine with the introduction of DobiMigrate’s API. Version 5.13 will allow organizations to programmatically configure unstructured data migrations using the API.

“Due to the scale and complexity of unstructured data in today’s heterogeneous storage environments, enterprises can no longer rely on outdated tools and manual practices to execute data management projects. Organizations must trust specialist tools powered by automation to gain an understanding of their environments and move data accordingly,” said Carl D’Halluin, CTO, Datadobi. “Datadobi’s API allows for a seamless data management experience built with the speed and integrity needed to conduct business today.”

Alluxio Boosts AI/ML Support for Its Hybrid and Multi-Cloud Data Orchestration Platform

Alluxio, the developer of open source data orchestration software for large-scale workloads, announced the immediate availability of version 2.7 of its Data Orchestration Platform. The new release delivers up to 5x improved I/O efficiency for machine learning (ML) training at significantly lower cost by parallelizing data loading, data preprocessing, and training pipelines. Alluxio 2.7 also provides enhanced performance insights and support for open table formats like Apache Hudi and Iceberg to more easily scale access to data lakes for faster Presto- and Spark-based analytics.
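The pipelining idea behind that speedup can be sketched in a few lines: instead of running load, preprocess, and train strictly in sequence, loading and preprocessing of upcoming batches proceed on worker threads while the current batch trains. The `load`, `preprocess`, and `train_step` functions below are stand-in placeholders, not Alluxio APIs, and the sketch illustrates only the overlap pattern, not Alluxio's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def load(batch_id):
    # Stand-in for reading a batch from remote storage.
    return list(range(batch_id, batch_id + 4))

def preprocess(batch):
    # Stand-in for decode/augment work.
    return [x * 2 for x in batch]

def train_step(batch):
    # Stand-in for the accelerator step; returns a toy "loss".
    return sum(batch)

def pipelined_training(num_batches, workers=4):
    """Overlap load+preprocess of future batches with the current
    training step by prefetching on a thread pool."""
    losses = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(lambda i=i: preprocess(load(i)))
                   for i in range(num_batches)]
        for f in futures:
            losses.append(train_step(f.result()))
    return losses
```

When loading is I/O-bound (remote object storage) and training is compute-bound, this overlap is where the efficiency gain comes from: the accelerator no longer sits idle waiting for data.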

“Alluxio 2.7 further strengthens Alluxio’s position as a key component for AI, Machine Learning, and deep learning in the cloud,” said Haoyuan Li, Founder and CEO, Alluxio. “With the age of growing datasets and increased computing power from CPUs and GPUs, machine learning and deep learning have become popular techniques for AI. This rise of these techniques advances the state-of-the-art for AI, but also exposes some challenges for the access to data and storage systems.”

DDN Launches Next Generation of High Performance NVMe and Hybrid Storage for AI and Advanced Computing Acceleration

DDN®, a leader in artificial intelligence (AI) and multicloud data management solutions, announced the availability of its next generation of NVMe platforms, the SFA® 400NVX2 and 200NVX2. These Storage Fusion Architecture® systems are the foundation of DDN’s accelerated storage portfolio and are available as EXAScaler® solutions – ES400NVX2 and ES200NVX2 – as well as the recently announced AI400X2 appliances for enterprise AI deployments. DDN developed these platforms to eliminate many of the challenges organizations face when bringing demanding workloads such as AI applications, natural language processing, financial analytics, and manufacturing automation to production. Their current infrastructure is not designed to handle the ever-expanding amount of data these applications require, nor is it optimized to deliver the data fast enough for real-time processing and insight. With these systems as the foundation, DDN can provide enterprise-class storage solutions with ease of use, security and powerful data management to complement its best-in-class performance and scalability.

“DDN is enabling customers to capture the full value of their data while eliminating complexity without compromising scalability,” said Dr. James Coomer, senior VP of products, DDN. “By creating intelligent infrastructure using autonomous operations to greatly reduce administrative overhead and optimize systems for every workload, we can deliver flexible solutions that help customers get the most from their AI, analytics and high-performance computing projects.”

Starburst Announces New Product Release Which Extends Flexibility When Building Data Lakehouse Architecture

Starburst, the analytics anywhere company, announced the availability of the latest version of Starburst Enterprise. With enhanced performance, connectivity and security, Starburst Enterprise streamlines and expands data access across cloud and on-prem environments. Support for Apache Iceberg and MinIO with enhancements to materialized views empowers both data teams and domain experts with new data lake functionality that accelerates the journey to a data mesh architecture.

“Apache Iceberg is a rapidly growing open table format designed for petabyte scale datasets. With the addition of Starburst support for querying data stored in Apache Iceberg, Starburst now provides its customers the optionality to use Iceberg or Delta Lake (or both) table formats for their data lakehouse architecture,” said Matt Fuller, VP, Product and co-founder of Starburst. “Additionally, as companies continue to adopt hybrid and cross-cloud architectures, their data gravity is both in the cloud and on-prem. Businesses with data stored on-prem are opting for S3-compatible storage, such as MinIO, as they build their private, cloud-like architecture. With official Starburst support for querying data stored in MinIO, MinIO users can enhance their hybrid and cross-cloud strategies.”

Fortanix Introduces Confidential AI to Streamline the Development of Richer AI Models and Applications

Fortanix® Inc., the data-first multi-cloud security company, introduced Confidential AI, a new software and infrastructure subscription service that leverages Fortanix’s industry-leading confidential computing to improve the quality and accuracy of data models, as well as to keep data models secure. With Fortanix Confidential AI, data teams in regulated, privacy-sensitive industries such as healthcare and financial services can utilize private data to develop and deploy richer AI models.

“For today’s AI teams, one thing that gets in the way of quality models is the fact that data teams aren’t able to fully utilize private data,” said Ambuj Kumar, CEO and co-founder of Fortanix. “Confidential AI makes that problem disappear by ensuring that highly sensitive data can’t be compromised even while in use, giving organizations the peace of mind that comes with assured privacy and compliance.”  

DataRobot Introduces “AI Cloud for Industries,” Arming Banking, Healthcare, Manufacturing and Retail Customers for the Next Generation of Intelligent Business

DataRobot announced DataRobot AI Cloud for Industries, a comprehensive solution that unites industry-tailored AI capabilities and best practices, integrations, and expanded partnerships for major industries. Building upon DataRobot AI Cloud, the new offering leverages DataRobot’s deep experience working with many of the world’s largest and most successful retail, banking, manufacturing and healthcare organizations, helping them harness the power of AI to transform their operations, accelerate growth opportunities and manage risk as they deliver services for their teams and customers.

DDN Launches AI Innovation Lab with NVIDIA

DDN®, a leader in artificial intelligence (AI) and multicloud data management solutions, announced that it has joined with NVIDIA to establish an AI Innovation Lab in Singapore to drive innovation and accelerate the deployment of AI-based solutions for enterprises. The AI Innovation Lab will provide customers and partners the necessary infrastructure and tools to build AI-led solutions at scale. With the best-in-class computing, networking and storage infrastructure provided by the lab, enterprises will be able to build, test and optimize AI models. The lab will be powered by DDN’s A3I® AI400X™ systems, which provide unmatched performance, optimal efficiency and flexible growth when used with NVIDIA DGX™ systems. DDN’s AI400X systems have been deployed to deliver robust, high-performance storage for NVIDIA Selene, the world’s sixth most powerful supercomputer. DDN systems are certified by NVIDIA for scalable NVIDIA DGX™ POD™ and NVIDIA DGX™ SuperPOD™ configurations, and offer storage infrastructure optimized to meet the demands of evolving AI workloads.

“As a trusted data storage solutions provider, we are excited to collaborate with NVIDIA to deliver Intelligent Infrastructure to enterprises locally to develop, test and deploy rich AI solutions at scale,” said Atul Vidwansa, general manager for India & S.E. Asia, DDN. “This further demonstrates our commitment to empowering customers and partners to drive AI-powered innovations as quickly as possible.”

Apromore Announces Version 8 to Further Enable End-to-End Process Intelligence 

Apromore, a leading provider of enterprise-grade and open-source process mining technology, announced the latest version of its research-led process mining software, Apromore 8. New capabilities mean that organizations can quickly and accurately map end-to-end business processes that span multiple applications to accelerate, scale and efficiently manage business process improvement and automation initiatives.  

“As the economy rebounds and new challenges appear in business processes, customers are asking for ways to accelerate the time to gaining actionable insights from process mining,” said Prof. Marcello La Rosa, Apromore co-founder and Chief Executive Officer. “The improvements in Apromore 8 give customers the ability to streamline and scale business process improvement initiatives through easier integration, security and project management capabilities.” 

Vertica to Leverage NetApp StorageGRID to Deliver Cloud-Scale Analytics to On-Premises Environments

Vertica announced a new integration with NetApp StorageGRID to deliver the advantages of cloud-native analytics to on-premises environments. This combined analytics offering enables data-driven organizations to elastically scale capacity and performance as data volumes grow and as analytics and machine learning become a strategic business driver – all from within their enterprise data centers.

“With this NetApp integration, we are committed to providing our joint customers with the broadest options to power their strategic analytical and machine learning initiatives in the way that works best for their businesses — today and in the future,” said Colin Mahony, senior vice president and general manager of Vertica. “Every organization can now run Vertica’s cloud-optimized architecture with NetApp’s StorageGRID to address their performance and financial requirements – all within enterprise data centers or private clouds.”

Apollo GraphQL Launches Contracts to Expand Access to the Graph

Apollo GraphQL, a pioneer in the use of open source and commercial GraphQL API technologies, announced the launch of Contracts, a dynamic new feature that allows enterprise software teams to create tailored graphs for different audiences by applying filters to a single unified graph. 

“The graph is an essential new layer of the software stack that unifies all services into a single source of truth and is becoming the standard for cutting-edge teams developing applications,” said Matt DeBergalis, co-founder and CTO of Apollo GraphQL, which provides the industry’s only unified graph platform. “Multiple Fortune 500 companies already trust Apollo technology to operate their graph, and companies of that scale often have hundreds of client applications and thousands of developers all consuming different subsets of data from the graph. Rather than exposing the entire graph to every internal and external developer, Contracts allow you to streamline the experience by creating an individual graph for each audience which contains a filtered subset of the unified graph.”
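Conceptually, a contract is a filter over the unified graph’s schema: each audience sees only the parts that pass its include/exclude rules. The Python sketch below is a hypothetical illustration of that filtering idea only; it is not Apollo’s API, which operates on GraphQL schemas via `@tag` directives, and the `Field`/`Contract` names are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Field:
    name: str
    tags: frozenset  # audience tags attached to this part of the graph

@dataclass
class Contract:
    """A filter over a unified graph: keep fields matching the audience."""
    include: frozenset = frozenset()
    exclude: frozenset = frozenset()

    def apply(self, schema):
        kept = []
        for f in schema:
            if self.exclude & f.tags:
                continue  # an excluded tag always removes the field
            if not self.include or self.include & f.tags:
                kept.append(f.name)
        return kept

unified_graph = [
    Field("orderTotal", frozenset({"public"})),
    Field("internalMargin", frozenset({"internal"})),
    Field("customerEmail", frozenset({"internal", "pii"})),
]

# A partner-facing contract strips everything tagged "internal".
partner_view = Contract(exclude=frozenset({"internal"}))
print(partner_view.apply(unified_graph))  # -> ['orderTotal']
```

Each audience gets its own filtered view while the underlying unified graph remains the single source of truth.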

Agora Launches New Analytics Solution, Empowering Developers with Powerful Service Assurance Tools 

Agora, Inc. (NASDAQ: API), a pioneer and leading platform for real-time engagement (RTE) APIs, has launched Agora Analytics 3.0, which provides developers with valuable insights into the audio and video performance of their applications and the ability to drill down into specific Quality-of-Experience (QoE) metrics at the individual user or session level.

“Agora Analytics 3.0 will transform the way developers build and deploy new audio and video experiences,” said Tony Zhao, co-founder and CEO of Agora. “With consistent monitoring and alerts, developers now have access to user data in real-time to troubleshoot issues and ensure high-quality seamless user experiences.”

IBM to Add New Natural Language Processing Enhancements to Watson Discovery

IBM (NYSE: IBM) announced new natural language processing (NLP) enhancements planned for IBM Watson Discovery. These planned updates are designed to help business users in industries such as financial services, insurance and legal services enhance customer care and accelerate business processes by uncovering insights and synthesizing information from complex documents.

“The stream of innovation coming to IBM Watson from IBM Research is why global businesses in the fields of financial services, insurance and legal services turn to IBM to help detect emerging business trends, gain operational efficiency and empower their workers to uncover new insights,” said Daniel Hernandez, General Manager of Data and AI, IBM. “The pipeline of natural language processing innovations we’re adding to Watson Discovery can continue to provide businesses with the capabilities to more easily extract the signal from the noise and better serve their customers and employees.”

Cloudian Adds New Management and Security Features to HyperIQ Observability and Analytics Solution

Cloudian® announced new features in its HyperIQ observability and analytics solution, addressing the challenge of managing modern storage infrastructures that are increasingly distributed across geographically dispersed data centers. Introduced last year, HyperIQ gives enterprises and service providers a unified management view of their entire Cloudian storage infrastructure, encompassing interconnected users, applications, network connections and storage devices. It provides intelligent monitoring, advanced analytics and health checks that enable predictive maintenance, enhanced security and resource optimization. As a result, customers can reduce mean time to repair, increase availability and accelerate new deployments, thereby saving operational costs and making it easier to adapt to workload demands.

“Today’s modern storage infrastructure is increasingly distributed across geographically dispersed data centers, both on-premises and in public clouds,” said Jon Toor, chief marketing officer, Cloudian. “HyperIQ provides the comprehensive view of this geo-distributed storage and related networking infrastructure, and today’s announcement gives enterprises additional tools for efficiently, cost-effectively and securely managing it.”

Imply Introduces Project Shapeshift, the Next Step in the Evolution of the Druid Experience 

Imply, founded by the original creators of Apache Druid®, unveiled Project Shapeshift, which will offer developers and organizations a next-level Druid experience that reimagines the process of building modern analytics applications. A series of game-changing capabilities will be released over the next year to transform the Druid experience to fit a cloud-native, developer-centric world.

“We saw the impact that our friends at Confluent had when they launched their industry-defining Project Metamorphosis,” said Gian Merlino, an original creator of Apache Druid, and Imply co-founder and CTO. “Establishing a path forward to massive adoption of a new data infrastructure lies in a strong commitment to advancing the underlying open source technology combined with a dedication to re-engineer the very foundation of that technology to be truly cloud-native. It’s an outcome that can only be achieved by the original creators of the open source technology supported by the organization they lead. This is what Project Shapeshift is all about, and over the next 12 months there will be product updates for both Druid and Imply.” 

Confluent Sets Data in Motion Across Hybrid and Multicloud Environments for Real-Time Connectivity Everywhere

Confluent, Inc. (NASDAQ: CFLT), the platform to set data in motion, announced that Cluster Linking is available on Confluent Platform 7.0. Combined with its earlier release on Confluent Cloud, Cluster Linking can now be used in any environment, everywhere an enterprise’s data and workloads reside. Now, organizations can securely stream data across hybrid and multicloud environments without needing to manage additional layers of complex tooling across disparate and siloed architectures. With a reliable, persistent bridge for real-time data sharing, organizations can quickly mobilize their data across their business to drive next-generation digital experiences and operations while maximizing the value of their cloud initiatives. 

“There’s a massive shift to the cloud that is inadvertently creating pockets of siloed data across organizations,” said Ganesh Srinivasan, Chief Product Officer, Confluent. “It is now more important than ever for businesses to solve these data connectivity challenges as their success depends on it. With Cluster Linking, the data across all the parts of a company–from cloud, on-premises, and everything in between–can be quickly connected in real time to help modernize businesses and build stand-out applications.” 

ZeroShotBot Launches World’s First AI Chatbot That Requires Zero Training Data or Coding for Businesses of All Sizes

ZeroShotBot announced the launch of a disruptive new conversational AI technology that democratizes chatbots for businesses big and small. ZeroShotBot brings a new way of building chatbots that can be built within hours and requires no training data, allowing anyone with zero coding experience to create a fully functional chatbot. Where many market-leading chatbot solutions take several months and a team of developers to build and deploy, ZeroShotBot can reduce this effort to as little as one day.
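ZeroShotBot’s actual method is not described here, but “zero-shot” classification generally means matching an input against plain-language intent descriptions rather than labeled training examples. The toy Python sketch below illustrates that idea; it uses simple bag-of-words cosine similarity as a stand-in for the large pretrained language models such systems typically rely on, and the intent names and descriptions are entirely hypothetical.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def zero_shot_intent(utterance, intents):
    """Match an utterance to the most similar intent description.
    No labeled training examples are required, only the descriptions."""
    u = Counter(utterance.lower().split())
    scored = {name: cosine(u, Counter(desc.lower().split()))
              for name, desc in intents.items()}
    return max(scored, key=scored.get)

intents = {  # hypothetical intents described in plain language
    "track_order": "where is my order package delivery status",
    "refund": "i want my money back refund return",
}
print(zero_shot_intent("what is the status of my package", intents))
# -> track_order
```

A production system would swap the bag-of-words vectors for embeddings from a pretrained model, but the zero-training-data workflow is the same: write descriptions, not datasets.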

“I’ve studied and worked in the AI industry and recognized that there had to be a better way to create chatbots that doesn’t require a lot of time and setup and overhead costs. I strongly believe in the potential of AI and chatbots but felt that the promise of chatbots has to date outstripped reality,” said Dr. Jason Mars, founder and CEO of ZeroShotBot. “With ZeroShotBot our aim is to democratize chatbots as we know it by completely rethinking from the ground up how AI chatbots work. ZeroShotBot is the first chatbot that is truly accessible for businesses of all sizes, especially for small businesses, which is the engine of our economy. Easy and quick to set up, affordable for all budgets, and simply more effective, ZeroShotBot is the dawn of a new era in customer service.”

Lucata Launches Next Generation Computing Platform That Shatters the Performance Limits of Conventional Computers for Graph Analytics

Lucata, provider of a next-generation server platform for accelerating and scaling graph analytics, AI and machine learning (ML), announced it has launched the Lucata Pathfinder server and a customized version of GraphBLAS for the Lucata platform. Performance benchmarks demonstrate the unmatched performance and scalability of the Lucata platform, which enables users to run faster analytics on larger graphs than is possible with conventional computing technologies. The Lucata platform affordably fills the gap between the performance and scalability of conventional servers and the capabilities of supercomputers for Big Data graph analytics. A single rack of Pathfinder chassis provides the same full Breadth-First Search (BFS) performance as over 1,000 Xeon processors, while using one-tenth the power of a comparable Xeon-based system.
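For context, the Breadth-First Search kernel cited in that benchmark explores a graph level by level from a source vertex, an access pattern dominated by small, irregular memory reads, which is exactly what graph-analytics hardware aims to accelerate. A minimal Python sketch of the traversal (illustrative only, with a toy adjacency list):

```python
from collections import deque

def bfs(adjacency, source):
    """Breadth-first search: return the BFS level (hop distance) of every
    vertex reachable from `source`. The many scattered dictionary lookups
    mirror the irregular memory accesses that make BFS a standard stress
    test for graph-analytics systems."""
    level = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in adjacency.get(v, []):
            if w not in level:  # first visit fixes the vertex's level
                level[w] = level[v] + 1
                queue.append(w)
    return level

graph = {0: [1, 2], 1: [3], 2: [3], 3: [4]}
print(bfs(graph, 0))  # -> {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```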

“Innovation is always fueled by democratizing access to the latest high-end technology. By making it possible to cost-effectively analyze massive graph databases, Lucata will spur significant innovation in multiple industries,” said Marty Deneroff, Lucata COO. “The Pathfinder benchmarks demonstrate the orders-of-magnitude increase in performance and scalability made possible today by our patented Migrating Thread technology.” Launches CrateOM: A Smart Solution to Digitalize and Optimize Operational Processes, the enterprise data management company enabling data insights at scale, announced the launch of CrateOM, a smart solution that transforms process data into actionable insights. CrateOM runs in the cloud, at the edge or in a hybrid environment. By enabling digital transformation through real-time insights and in-app communications, CrateOM helps production companies improve decision making and cross-functional collaboration. Running on CrateDB, an enterprise-grade open-source database optimized for large data volumes, CrateOM supports the complexity of manufacturing and enables companies to digitalize processes and streamline operations. Designed to improve process efficiency, CrateOM reduces unplanned downtime, increases employee effectiveness, optimizes resource utilization and minimizes waste.

“ALPLA has been a great development partner in the creation of CrateOM,” commented Eva Schönleitner, CEO of “Together, we saw a need for a smart factory solution that would use the advanced functionalities of CrateDB on the shop floor. With the launch of CrateOM, we want to extend our value offering from data collection to enabling operational data analysis in real time across several plants. We also partnered with Microsoft to ensure CrateOM runs on Microsoft Azure as the primary cloud provider.”

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1
