insideBIGDATA Latest News – 7/22/2022

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

AWS Announces General Availability of Three New Serverless Analytics Offerings 

Amazon Web Services, Inc. (AWS) announced the general availability of three new serverless analytics offerings that make it even easier for customers to analyze vast amounts of data without having to configure, scale, or manage the underlying infrastructure. The announcements include new serverless offerings for Amazon EMR to enable customers to run analytics applications using open-source big data frameworks (Apache Spark and Hive) without having to manage the underlying infrastructure, Amazon Managed Streaming for Apache Kafka (Amazon MSK) to simplify real-time data ingestion and streaming, and Amazon Redshift to allow customers to run high-performance data warehousing and analytics workloads on petabytes of data without having to manage clusters. Along with other serverless analytics offerings from AWS such as Amazon QuickSight for business intelligence and AWS Glue for data integration, the new offerings make it significantly easier and more cost-effective for customers to modernize their infrastructure and analyze vast amounts of data without worrying about capacity planning or incurring excess costs by over-provisioning for peak demand. There are no upfront commitments or additional costs to use Amazon EMR Serverless, Amazon MSK Serverless, and Amazon Redshift Serverless, and customers only pay for the precise capacity needed for their analytics workloads. 

“By offering the most serverless options for data analytics in the cloud—including options for data warehousing, big data processing, real-time data analysis, data integration, interactive dashboards and visualizations, and more—we are making it even easier for customers to maximize the value of their data to drive innovation, improve customer experiences, and make better decisions faster,” said Swami Sivasubramanian, vice president of Database, Analytics, and Machine Learning at AWS. “With these new serverless options, customers can run even the most variable and intermittent analytics workloads and expand the use of analytics throughout their organizations without worrying about provisioning or scaling capacity—or incurring excess cost.” 

StarRocks Launches Fast, Cloud Native, Real-time Analytics Engine

StarRocks, a performance leader in real-time enterprise analytics, announced a cloud-native version of its SQL engine. StarRocks Cloud, a fully managed software as a service (SaaS) platform, greatly simplifies the delivery of real-time analytics projects and reduces the time to business value for enterprises. This new cloud-based offering enables the democratization of real-time analytics to help further accelerate digital transformation strategies.

“The demand for faster insights poses several challenges for data infrastructure teams, including the non-stop growth of data, the proliferation of citizen analysts and data scientists, and cross-environment data pipelines that are increasingly brittle and complex,” said Mike Leone, senior analyst, ESG. “StarRocks’ strategy is perfectly aligned to address these challenges. By harnessing the data they have collected through their digital channels to gain timely insights, enterprise organizations can benefit from data freshness and responsiveness – critical ingredients for digital transformation success.”

Speechmatics Unlocks Accurate Understanding of Financial Terms with New Language Pack 

Speechmatics, a leading speech recognition technology scaleup, has launched an English language pack specific to the finance industry. The addition has been built for use cases including compliance, fraud identification, analytics, financial news and earnings calls. The accurate and inclusive speech-to-text engine can now identify finance terminology in conversation, helping to avoid confusion over abbreviations, acronyms and finance-specific terms.

“Our aim is to understand every voice regardless of race, gender or accent and I’m proud that Speechmatics has overcome significant challenges that traditional speech-to-text engines have struggled with,” said Katy Wigdahl, CEO, Speechmatics. “However, we wanted to go even further and dive into the complexities that specific industries present. Some sectors are known for complex terms and jargon that, if added to our global models, risk making the technology less effective for other users. This led to our approach for domain-specific packs that can directly address the needs of individual sectors. Financial services was an obvious place to start but we hope our language pack will set a blueprint for every high-stakes industry where the financial, reputational and social cost of misunderstanding is high.”

Signal AI Unveils New External Intelligence Graph Making Sense of the World’s Unstructured Data

Signal AI, a leading global External Intelligence company, announced the launch of its External Intelligence Graph, a comprehensive view of an organization’s external world built on real-time data and content. Signal AI’s External Intelligence Graph maps the relationships between the things a modern organization needs to care about, like climate change, supply chain risk or competitor intelligence, and highlights how an organization is “associated” with these important topics. This entirely new kind of data, which makes sense of the huge amounts of unstructured content now available, can show how relationships are changing, and also surfaces “unknown unknowns” – new, previously unlinked connections to an organization or individual.

“In part due to the rise of stakeholder capitalism and ESG focus, businesses are increasingly affected by the actions and behaviors of the people and organizations in the world outside of the company walls,” said Clancy Childs, Chief Product Officer. “The amount of external, unstructured data that now must be considered in modern decision making is daunting. With our AI technology working at scale, we provide the external intelligence to help businesses cut through the noise and understand the topics and events that are affecting them both directly and indirectly. It’s truly a game changer for business decision making and organizations operating at scale.” 

TIBCO ModelOps Significantly Improves Efficiency and Flexibility Across the Enterprise with Impactful AI

TIBCO Software Inc., a global leader in enterprise data that empowers its customers to connect, unify, and confidently predict business outcomes, announced the release of TIBCO® ModelOps, which enables businesses to deploy AI models faster, from anywhere to everywhere, safely and at scale. This addition to the company’s game-changing analytics portfolio helps customers simplify and scale cloud-based analytic model management, deployment, monitoring, and governance.

“While 92% of firms spent more overall on data science in 2021 compared to previous years, only 12.1% deployed it at scale. To help organizations realize the value of their AI deployments, we’ve designed a system that puts self-service access to data science firmly in the hands of teams, including business users,” said Mark Palmer, senior vice president, engineering, TIBCO. “This allows decision-making teams to choose the algorithm they want, work from any cloud service, and run it safely, securely, and at scale. This is a bold step to enabling business users to take AI out of the lab and out on the road.”

Diwo Announces General Availability of Diwo Decision Intelligence (DI) Platform to Optimize Business Decisions with AI-powered Recommendations

Diwo announced the general availability of the Diwo Decision Intelligence (DI) Platform. Diwo’s DI platform delivers an innovative analytics experience designed to accelerate the path from data to decisions by providing business users with AI-powered, actionable recommendations in minutes rather than days. Diwo is purpose-built to solve the “last mile of analytics” challenge with its ability to dynamically deliver contextual insights and proactively recommend the best action to optimize business objectives and drive growth.

“With today’s data deluge, business decisions often stall out in the manual-intensive ‘last mile of analytics,’ where insights need to become actionable,” said Krishna Kallakuri, CEO of Diwo. “It’s important to know when a change occurs in your business, but even more critical to understand why and what actions to take to improve outcomes. With Diwo, you not only close the data-to-decision gap, but also put AI and contextual intelligence at the heart of decision-making to better understand the why and what questions. Using Diwo, Fortune 1000 enterprises can make better decisions, reduce the time to decision, realize breakthrough productivity gains and compete more effectively.”

NVIDIA Announces Hybrid Quantum-Classical Computing Platform

NVIDIA announced a unified computing platform for speeding breakthroughs in quantum research and development across AI, HPC, health, finance and other disciplines. The NVIDIA Quantum Optimized Device Architecture, or QODA, aims to make quantum computing more accessible by creating a coherent hybrid quantum-classical programming model. QODA is an open, unified environment for some of today’s most powerful computers and quantum processors, improving scientific productivity and enabling greater scale in quantum research. HPC and AI domain experts can use it to easily add quantum computing to existing applications, leveraging both today’s quantum processors and simulations of future quantum machines on NVIDIA DGX systems and the large installed base of NVIDIA GPUs available in scientific supercomputing centers and public clouds.

“Scientific breakthroughs can occur in the near term with hybrid solutions combining classical computing and quantum computing,” said Tim Costa, director of HPC and Quantum Computing Products at NVIDIA. “QODA will revolutionize quantum computing by giving developers a powerful and productive programming model.”

Robocorp Launches Automation Studio to Seamlessly Uplevel Teams and Bots

Robocorp, a top provider of Gen2 robotic process automation (RPA), announced the public beta release of Automation Studio – a code-native, low-code RPA development solution that offers the fastest and easiest way for developers and business users to automate enterprise processes together on the world’s most flexible and powerful RPA platform.

“At Robocorp, we believe in ‘minutes to learn, a lifetime to master’,” says Antti Karjalainen, the CEO of Robocorp. “With Robocorp Automation Studio, we are not only enabling aspiring developers of all levels to start their journey to becoming software robot developers but also making it easier for existing developers to upgrade their skills and their bots.”

Soda Unveils Soda Core: the Open-Source Framework for Data Quality and Reliability

Soda, the provider of data reliability tools and a cloud observability platform, has announced the general availability of Soda Core, an open-source framework for data engineers to embed data reliability checks and quality management into data pipelines. Powered by SodaCL (Soda Checks Language), also released today as the first domain-specific language (DSL) for data reliability, Soda Core brings as-code practices to data engineering, helping teams create broad coverage, eliminate data downtime, and streamline the cumbersome tasks of detecting and resolving issues across the entire data product lifecycle.

“This first public release of Soda Core and SodaCL is one of the most important milestones in our journey so far, giving Data Engineers the framework and language to get started and scale with reliability engineering and data quality management,” explains Tom Baeyens, CTO, and Co-Founder, Soda. “We realized early on that when it comes to data quality, the needs of engineers are quite different when compared with the needs of the data team as a whole. A lot of people in a data team know what good data looks like but only a few can code the checks. With our releases today, we are providing the tools to remove the bottlenecks that exist around coding data reliability, enabling Data Engineers to build data quality checks-as-code directly into their pipelines and fundamentally change how teams set up and maintain reliable, high-quality data products.” 
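To make the “checks-as-code” idea concrete, here is a small sketch of what checks written in SodaCL can look like, based on Soda’s public documentation; the dataset and column names (dim_customer, email, customer_id) are hypothetical placeholders:

```yaml
# checks.yml – declarative data reliability checks, versioned with the pipeline
checks for dim_customer:
  - row_count > 0                      # dataset must not be empty
  - missing_count(email) = 0           # no null emails
  - duplicate_count(customer_id) = 0   # primary key stays unique
```

A data engineer would typically run such checks with the Soda Core CLI (for example, `soda scan -d my_datasource -c configuration.yml checks.yml`, where the data source name is again an assumption), failing the pipeline step when a check does not pass.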

Future AI Unveils Sallie, Software that Thinks, Learns, and Evolves Like a Person

Future AI, an artificial general intelligence (AGI) company developing “Technologies that Think,” launched Sallie, its prototype software and artificial entity that learns in real time through vision, hearing, speech, and mobility. These capabilities give Sallie the ability to draw conclusions, a critical facet of genuine thinking and a necessary component of ushering in AGI.

“The first component of being able to understand like a person is learning about immediate surroundings. Sallie can recognize objects with vision, build an internal model, ask questions, and take direction without any initial information,” explains Charles Simon, Founder and CEO, Future AI. “Our work advances new algorithms which simulate biological neuron circuits with high-level artificial intelligence techniques. Sallie can infer information about objects she doesn’t understand – demonstrating one shot, real-world learning without tagged data sets or backpropagation.”

Acure.io Releases Free AIOps Incident Control and Automation SaaS Platform for IT Ops Professionals

Acure.io, formerly known as Monqlab, a leading AIOps developer, has released a cloud-hosted AIOps platform that detects and prevents IT failures in complex, dynamically changing IT environments. The platform is suitable for companies of any size, from SMBs to enterprises, and helps site reliability engineering (SRE), DevOps, and IT Ops professionals by providing quick and simple root-cause analysis and troubleshooting. 

“At the end of last year, we launched Monqlab’s self-hosted Free Community Edition, an AIOps platform for data collection and analysis,” said Nikolay Ganyushkin, CEO and founder of Acure.io. “We saw massive interest from the community – hundreds of people downloaded the product after just a few posts on social media. We decided to move fast and build the SaaS version as soon as we could, to bring the value of our product in the most user-friendly way to every engineer who needs it, at absolutely no charge. We want to make Acure.io the tool for those who would like to try AIOps and start learning how to use the technology, bring value to the companies they work at, and at the same time become more valuable professionals.”

Dashbot Launches Conversational Data Cloud™ to Provide a Centralized View of All Chatbot Data

Dashbot, a conversational AI and data platform, announced the launch of its proprietary Conversational Data Cloud™, letting customers build and optimize their chatbots from their businesses’ own conversational data. Dashbot’s Conversational Data Cloud™ turns unstructured, noisy, interrelated and often tangled conversational data into immediate action.

“We’re expanding beyond reporting and analytics to be able to ingest raw conversational data which can be difficult, but also very valuable for our customers,” said Andrew Hong, CEO of Dashbot. “We’re on a mission to decipher language, which is one of the most complex types of data that has ever existed. We listened to our customers that are challenged to make sense of all their conversational data, so we built our Conversational Data Cloud™ to help businesses automate, analyze and optimize their conversation channels.”

Stats Perform Transforms Sports Media Content Creation with PressBox Platform

Stats Perform, the sports tech leader in data and AI, announced expanded capabilities in its PressBox platform, which gives media a streamlined set of AI-powered tools to deliver high-quality content to fans. Enhancements to PressBox Graphics, PressBox Live and PressBox Video enable access to data, historical context and mobile content creation on an integrated platform to meet the demands of modern-day sports fans. The PressBox platform enables broadcasters, social media teams, producers and content creators to access live match data and ready-to-use video, and to create social graphics before, during and after the game. The suite is the culmination of Stats Perform’s heritage, expertise and vision to pair AI with the company’s depth and breadth of Opta data.

“PressBox is the platform for sports media content creators to deliver the best experiences to sports fans,” said Nancy Hensley, Chief Product and Marketing Officer at Stats Perform. “With this offering, Stats Perform continues to drive transformative value and lead industry innovation, capturing the moments that matter to meet the demands of today’s fan.”

Ondat 2.8 Arrives in GA with Increased Support for Stateful Workloads in Kubernetes

Ondat, a leading Kubernetes-native data platform provider, released into general availability version 2.8 of its Ondat platform for stateful workloads in Kubernetes. The new version brings significant changes that open up the option of running a robust etcd setup within production clusters, removing the need to operate an external service. This change reduces operational overhead and cost for production users.

“Our customers have come to rely on Ondat to support the performance and reliability that their stateful workloads require in Kubernetes,” said Alex Chircop, founder and CEO of Ondat. “With version 2.8, we’ve added several important new features they’ve asked for, including snapshots and the ability to run etcd within their clusters. Taken together, these new capabilities further extend the enterprise capabilities in Ondat, making it an increasingly popular choice.”

TigerGraph Announces New Cloud Features that Make Graph Technology More Accessible to All

TigerGraph, provider of a leading ML and AI graph analytics platform, announced new features to TigerGraph Cloud, its fully managed graph database as a service. The new capabilities, natively built for the cloud, address the demands of TigerGraph’s rapidly growing customer base and developer community with enhancements that enable seamless adoption, deployment, and management of multiple graph database solutions.

“Graph is a critical technology for improved business insights from ML and AI and we want to make it so easy to access and use that anyone can do it,” said Yu Xu, CEO and founder, TigerGraph. “The TigerGraph Cloud capabilities we announced today make it easy for enterprises to adopt graph technologies and answer critical business questions in the most collaborative way possible. With this, along with our significant cloud expansion into the Asia Pacific and South America regions, we’ve broadened our global reach to include even more organizations seeking graph for the business insights needed to stay ahead of the curve.”

Decodable Launches Delta Lake Connector Into GA; New Classes of Databricks Use Cases Now Unlocked Via Simple, Affordable Access to AI-Enhanced Analytics

Decodable, the real-time data engineering company, unveiled a faster, simpler and less expensive way to ingest streaming data into Databricks, enabled by a new Delta Lake connector for users of the Decodable SaaS offering.

“Ingesting streaming data into Databricks unlocks a host of powerful analytics capabilities,” said Eric Sammer, CEO and founder of Decodable. “Unfortunately, for many developers of real-time applications, the present process to make this happen is overly complicated and cost prohibitive. We built the Delta Lake connector to solve those problems and bring the power of Databricks to a whole new set of use cases that presently are blocked due to the cost and complexity of ingestion.”

John Snow Labs Releases Spark NLP 4.0

John Snow Labs, the Healthcare AI and NLP company and developer of the Spark NLP library, announced the release of Spark NLP 4.0. With new question answering annotators, major performance improvements, optimizations on new hardware platforms, and more than 1,000 state-of-the-art pre-trained transformer models available in multiple languages, Spark NLP 4.0 is the company’s most significant release this year. This is an example of John Snow Labs’ ongoing commitment to delivering the latest, most accurate NLP software to the global AI community.

“As the most widely used NLP library in the enterprise, we have a responsibility to deliver accurate, production-grade, state-of-the-art NLP software,” said David Talby, CTO, John Snow Labs. “With the pace of technology and business evolution, last year’s best-of-breed AI tools are already falling behind. Our promise to our customers and the open source community is that we will always keep them state-of-the-art—and this new release delivers on that promise.” 

Altair Releases Altair Unlimited Data Analytics Appliance

Altair (Nasdaq: ALTR), a global leader in computational science, high performance computing (HPC) and artificial intelligence (AI), announced it has launched the Altair Unlimited data analytics appliance, an all-in-one turnkey solution, which will democratize enterprise-wide data analytics and empower users to glean more data insights than ever. The Altair Unlimited data analytics appliance – which is built on Dell PowerEdge R750 servers – is designed to foster enterprise-wide, data-driven strategies by giving teams the power to use data analytics and AI to gain competitive advantages and drive next-level business results.

“Dell Technologies is a longtime partner and we are thrilled to take the Altair Unlimited data analytics appliance to market with them. After the incredible success of the Altair Unlimited simulation appliance, we wanted to give customers the same type of opportunities in the data analytics and AI sphere,” said Sam Mahalingam, chief technology officer, Altair. “With this technology, we will help enterprises move from expensive hardware and software solutions built around the SAS language, to a modern, cost-effective turnkey solution. We are confident users and organizations will see game-changing results immediately and we look forward to seeing how users maximize this technology and get the most from their data.”

YugabyteDB 2.15 and New Migration Engine YugabyteDB Voyager Effortlessly Power the Widest Range of Apps and Simplify Cloud Adoption

Yugabyte, a leading open source distributed SQL database company, announced the general availability of YugabyteDB 2.15 and unveiled a new service, YugabyteDB Voyager. YugabyteDB 2.15 gives engineering and operations teams a unified platform to run all their business-critical transactional applications effortlessly. YugabyteDB Voyager, the new cloud database migration service, accelerates cloud native adoption by making the move to a distributed database simple and efficient. YugabyteDB 2.15 and Voyager represent the most extensive updates to the database since its initial launch.

“We built YugabyteDB with the goal of simplifying the data tier for cloud native applications,” said Karthik Ranganathan, co-founder and CTO of Yugabyte. “Simplicity in today’s demanding enterprise environments is about being able to run an expanding array of diverse workloads in any cloud around the world with enterprise-class scale, performance, and availability. YugabyteDB 2.15 is our most feature-rich release since launch, reinforcing YugabyteDB as the best-in-class database for both enterprises and born-in-the-cloud startups. YugabyteDB Voyager further breaks down barriers at the data layer that often hinder or complicate the journey to the cloud.”

Kyligence Introduces an Intelligent Metrics Store to Democratize Data Analytics

Kyligence announced Kyligence Zen, an intelligent metrics store platform that helps align business goals and key metrics. The new platform automates data pipelines from data lakes or data warehouses to its multidimensional OLAP database to deliver metrics consistency and data trust in a cost-effective way. The high complexity of today’s data stacks and pipelines has led to inefficient and inconsistent data analytics: without a unified metrics store, an organization’s metrics logic is duplicated across different platforms and tools, forcing data analysts to repeatedly combine and analyze metrics in one place.

“It’s very hard to find the right metrics from thousands of reports spread across different data systems, so data teams are focusing more on data pipelines but less focused on the true value of data,” said Luke Han, CEO, Kyligence. “We believe metrics-driven analytics platforms will help organizations build the alignment between business and metrics. Helping global industry leaders including banks, insurance companies and retailers to successfully build their unified analytics platforms in the past several years, Kyligence knows deeply about those challenges and Kyligence Zen is our answer to the data industry challenges.”
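The core idea behind a metrics store – define each metric’s aggregation logic once, then let every dashboard and notebook resolve the same definition – can be sketched in a few lines of plain Python. This is a toy illustration of the concept only, not Kyligence Zen’s actual API; all names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    """A single, centrally defined business metric."""
    name: str
    expression: str    # aggregation logic, written once
    dimensions: tuple  # dimensions the metric may be sliced by

class MetricsStore:
    """Toy registry: every consuming tool reads the same definition,
    so aggregation logic is never duplicated across platforms."""
    def __init__(self):
        self._metrics = {}

    def register(self, metric: Metric):
        if metric.name in self._metrics:
            raise ValueError(f"metric {metric.name!r} already defined")
        self._metrics[metric.name] = metric

    def get(self, name: str) -> Metric:
        return self._metrics[name]

store = MetricsStore()
store.register(Metric("gross_revenue", "SUM(order_total)", ("region", "month")))

# A BI dashboard and a data science notebook both resolve one definition:
assert store.get("gross_revenue").expression == "SUM(order_total)"
```

The duplicate-name check is the key design point: a second, conflicting definition of `gross_revenue` is rejected rather than silently coexisting in another tool.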

Crate.io Launches New Tier Model for CrateDB Cloud with Added Flexibility and Cost Optimization

Crate.io, the enterprise data management company, announced a new tier for CrateDB Cloud that lets customers deploy the database on shared infrastructure resources and suspend compute power. The new tier gives customers more cost-effective options, supporting use cases with changing requirements, as well as smaller use cases, with the flexibility to turn compute power on and off.

Roman Meingassner, CrateDB Cloud Product Manager, said: “We see this new tier model providing key add-ons for our customers using CrateDB Cloud. They are both subtle in terms of their immediate impact, but the benefits are long lasting. These new features are saving our customers money in several ways, whilst ensuring the level of operational efficiency they expect.”

MakinaRocks Unveils ‘Link’ – The AI Modeling Tool Built To Leap Technical Hurdles in JupyterLab

MakinaRocks, an AI-based startup specializing in manufacturing and industrial solutions, launched a community version of its AI and machine learning (ML) modeling tool MakinaRocks Link™ (hereafter “Link”), making its industry-changing technology available to a wider variety of Machine Learning Operations (MLOps) environments. Link is an extension for JupyterLab – an interactive development interface for notebooks, code, and data – that lets users easily create readable pipelines for AI and ML modeling. Link preserves the JupyterLab usability that data scientists rely on while removing the technical hurdles of Kubernetes, the portable, open-source platform for managing workloads and services, so users can build pipelines for MLOps environments even without a working knowledge of Kubernetes.

“MakinaRocks is a startup that has been solving various problems in the manufacturing and industrial sectors through AI solutions since its inception,” said co-CEO Andre S. Yoon. “We’ve launched Link to help solve the problems data scientists experience in their daily work environment. Link will offer improved AI and ML modeling experiences for data scientists with its diverse features and help overcome JupyterLab’s limitations while further enhancing its strengths.”
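The underlying pattern – turning notebook cells into named steps wired into a dependency graph and then executed in order – can be illustrated with Python’s standard library alone. This is a minimal sketch of the pipeline-as-DAG idea, not the Link API; the step names and functions are invented for illustration:

```python
# Notebook-style functions become named steps in a DAG; graphlib (stdlib,
# Python 3.9+) resolves the execution order from declared dependencies.
from graphlib import TopologicalSorter

def load_data():
    return [3.0, 1.0, 2.0]

def preprocess(data):
    return sorted(data)

def train(data):
    return {"model": "mean", "value": sum(data) / len(data)}

steps = {"load": load_data, "prep": preprocess, "train": train}
deps = {"load": set(), "prep": {"load"}, "train": {"prep"}}

results = {}
for step in TopologicalSorter(deps).static_order():
    upstream = [results[d] for d in sorted(deps[step])]
    results[step] = steps[step](*upstream)

print(results["train"])  # {'model': 'mean', 'value': 2.0}
```

In a real MLOps setting, each step would run as an isolated container on Kubernetes rather than as an in-process function call; the value of a tool like Link is that the author only declares the graph, not the infrastructure.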

DataStax’s Astra Streaming Goes GA With New Built-in Support for Kafka and RabbitMQ

DataStax, the real-time data company, announced the general availability (GA) of Astra Streaming, an advanced, fully managed messaging and event streaming service built on Apache Pulsar. Now featuring built-in API-level support for Kafka, RabbitMQ and Java Message Service (JMS), Astra Streaming is the only service that makes it easy for enterprises to get real-time value from all their data-in-motion.

“Many enterprises are struggling with fragmented and complex streaming architectures, with most of their data-in-motion still siloed in legacy messaging and queuing middleware like JMS and RabbitMQ,” said Chris Latimer, vice president of product management at DataStax. “These valuable veins of data are impossible to harvest through Kafka. With the built-in support for Kafka, RabbitMQ and JMS, Astra Streaming makes it easy to unify all data-in-motion in a modern, multi-cloud streaming service designed for scale.”

Roboflow Makes 90,000 Datasets and 7,000 Pre-trained Models Available

In August 2021, Roboflow Universe launched with 50 open source datasets and opened Roboflow’s computer vision infrastructure products to users for free with a Public plan. Universe now has 90,000+ datasets with 66+ million images available for building computer vision models, and 7,000+ pre-trained models with available APIs to use in applications. To date, publicly available pre-trained models have powered over 14 million inferences – making AI/ML more accessible for developers. Real-world datasets can help build real-world solutions, and the goal of Roboflow Universe is to make it easy for anyone to build computer vision applications.

Large-scale datasets for training machine learning models are generally owned by the world’s biggest companies (Meta, Google, Tesla, etc.), but that data is not available to the computer vision community. Roboflow’s goal is to make such data accessible to everyone through open source contributions – helping to reduce time spent finding and labeling data. Novel computer vision use cases are being proven by the research community using open source datasets.

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1
