insideBIGDATA Latest News – 1/20/2023


In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Immuta Releases Immuta Detect for Continuous Security Monitoring

Immuta, a data security leader, announced the release of its latest product, Immuta Detect. With its continuous data security monitoring capabilities, Immuta Detect alerts data and security teams about risky data access behavior, enabling quicker and more accurate risk remediation and improved data security posture management across modern cloud data platforms. The product is a new key pillar of Immuta’s comprehensive Data Security Platform that provides sensitive data discovery, security and access control, and data activity monitoring. The platform uniquely integrates with the leading cloud data platforms and with existing SIEM and Managed Detection and Response (MDR) tools.

“In the modern business landscape, organizations face a challenging dichotomy in which they need to use and share data to remain competitive but also maintain the highest standards of data security,” said Mo Plassnig, Chief Product Officer, Immuta. “It comes down to managing your risk appetite and to do that effectively, it is vital for teams to have a comprehensive view of data access activities and risks. Immuta Detect is an important step towards meeting customer demands for a comprehensive approach to data security, from discovering sensitive data, protecting it, and monitoring its usage, to keeping data policies up to date.”

ClearML Releases New Reports Feature to Share Real-Time Results of Machine Learning Projects and Ignite ML Collaboration Across the Enterprise

ClearML, a leading open source, end-to-end MLOps platform, announced that its new Reports feature is now generally available. This new feature makes it easy to create and share real-time reports within ClearML as well as connect to third-party editors such as Confluence, Monday, Notion, Colab (Jupyter), and others using embed code. A report can contain charts that auto-update continuously throughout an experiment lifecycle. ClearML users can now easily create, collaborate on, and share reports, graphs, and charts in order to summarize and explain experiments, show how model versions improve, analyze results, compare experiments, discuss bugs, demonstrate progress toward milestones, and publish their work.

“Reporting and showcasing your findings to colleagues, managers, or even your future self is a core component of any modern collaborative workflow,” said Moses Guttmann, Co-founder and CEO of ClearML. “Having one central place where you can seamlessly report, analyze, plan, and collaborate on your work in real time makes it that much easier. ClearML-Reports connects seamlessly with existing ClearML functionality and is based on markdown, so it’s very easy to export or connect to an external reporting tool.”

AI Startup Chosen by Prestigious EIC Accelerator to Use AI for Accelerated Scientific Text Understanding

The company, provider of a world-leading and award-winning AI engine for scientific text understanding, has been selected by the European Innovation Council to receive €2.4 million of funding in grants and up to €12 million in equity investments from the 2022 EIC Fund. It was among the mere 15% of chosen startups with a female CEO and one of only 78 successful companies from over 1,000 qualified candidates in a highly competitive selection process.

The company solves a major problem for researchers: the overwhelming volume of research being published. Finding relevant research is like finding a needle in a haystack, and as a result, researchers in both academia and industry are missing relevant published papers that could advance their knowledge or are simply wasting time reading irrelevant research. The platform cuts the time required to carry out scientific research by using AI language models to categorize, navigate, summarize, and systematize data from academic papers, patents, and other technical or research documentation. It is already being used by hundreds of universities and companies – including Fortune 500 steelmaker ArcelorMittal – to eliminate the weeks and months spent manually wading through patents and research.

“When the company was founded in 2015, few people had heard of language models,” commented Schjøll Abildgaard, CEO and Co-founder. “Since then, the AI ecosystem has grown exponentially, and the concept of language models is common knowledge. However, the current generation of large language models – including ChatGPT – simply don’t work for science. They hallucinate, generate mistruths, and misunderstand scientific text due to a lack of domain-specific knowledge. What we’re doing differently is creating a language model that can be relied upon for analyzing scientific research. Together with my two brilliant co-founders, we are delighted to be selected for the European Innovation Council Accelerator funding. It will allow us to ramp up the development of our technology and achieve our goal of building a complete AI researcher – AI tools and applications which allow humans to make sense of the totality of the world’s scientific knowledge.”

Decodable Ships Additional Enterprise Readiness Features for Real-Time Data Streaming Built on Apache Flink

Decodable, the stream processing company, announced a slate of new features in its enterprise-ready platform built on Apache Flink, the industry leading stream processing technology. The Decodable platform rounds out open source Apache Flink with capabilities that deliver security, efficiency and performance, enabling enterprise users to connect to anything, develop with speed and operate with confidence.

“The future of data is real-time, and the community has created an incredibly powerful tool in Apache Flink,” said Eric Sammer, CEO and founder of Decodable. “Problem is, deploying and operating Flink in production is complex, and it takes specialized expertise and lots of time. To build a working solution that’s suitable for enterprise production can be a long-haul commitment that most organizations are understandably reluctant to take on. Today, Decodable flips this script, offering updates to our platform that packages Flink into a fully-managed, as-a-service, pay-as-you-go stream processing platform. What’s more, these new capabilities offer a streamlined developer experience requiring only SQL skills to build transformation pipelines, greatly simplifying—and more importantly, speeding—the process. The Decodable platform is now the only offering in GA that puts stream processing within the reach of every organization for every project.”

Pliops Unveils XDP Data Services to Empower Breakthrough Data-Infrastructure Optimization

Pliops, a leading provider of data processors for cloud and enterprise data centers, launched its XDP Data Services platform – a seamless-to-deploy, transformational approach to optimize data infrastructure and accelerate modern workloads, while in tandem reducing TCO by 50%. Running on the Pliops Extreme Data Processor (XDP), the portfolio of XDP Data Services includes XDP-RAIDplus, XDP-AccelDB and XDP-AccelKV. These Data Services are designed to maximize data center infrastructure investments by exponentially increasing application performance, data reliability, storage capacity, and overall stack efficiency.

“Maintaining high-performance, high-reliability SSD-based storage systems can be hugely challenging,” said Uri Beitler, Pliops founder and CEO. “SSD faults in servers hosting data-hungry applications are a leading cause of significant downtime, impacting productivity and affecting SLAs. With this in mind, XDP-RAIDplus was designed to maximize the capabilities of NVMe SSDs to meet the most demanding I/O needs of any system, while optimizing the system’s cost/performance ratio. Numerous customers are sacrificing data reliability in order to avoid performance drops – we solve this by not only accelerating current solutions but also by providing this key feature to customers that cannot take advantage of existing solutions.”

Quest Software Announces General Availability of SharePlex 11, Enabling Database Replication Across PostgreSQL and Snowflake Environments

Quest Software, a global systems management, data protection and security software provider, announced the general availability of SharePlex 11, now covering PostgreSQL and Snowflake, to support high-volume, mission-critical database replication. SharePlex offers businesses support for data replication across today’s most important data platforms, ensuring database availability and systems interoperability. SharePlex product features help businesses modernize their data infrastructure while keeping data in sync and maintaining peak operational performance.

“As enterprises continue to look for ways to reduce overbearing license costs and lower their IT spend, PostgreSQL continues to demonstrate that it is a viable open-source database option. Our mission is to ensure that customers who wish to adopt PostgreSQL for their mission-critical applications have the necessary availability, disaster recovery and horizontal scaling capabilities they need to confidently deploy PostgreSQL,” said Bharath Vasudevan, Vice President of Product Management and Marketing for Quest ISM.

UiPath Unveils New Migration Capabilities and Connectors to Expand and Simplify Next-Gen Test Automation

UiPath (NYSE: PATH), a leading enterprise automation software company, announced significant upgrades to allow customers to modernize their software testing practices by migrating testing to the UiPath Business Automation Platform. With migration streamlined and comprehensive software testing natively available to all customers, UiPath provides CIOs and IT decision makers the opportunity to save costs by consolidating and automating testing in a single platform.

“Customers are seeking to connect UiPath to their development toolchain, and these two new features of the UiPath Business Automation Platform are the latest innovations in automated testing,” said Dr. Gerd Weishaar, Senior Vice President of Product Management at UiPath. “CIOs and IT leaders can achieve software testing outcomes faster at a lower cost with our intuitive and scalable testing solutions included in the market’s only end-to-end automation platform. Organizations can leave disparate legacy testing tools behind and consolidate on the UiPath platform to modernize how teams manage and execute delivery of their software.”

TDengine Releases Simple and Secure Data Sharing for Enterprises

TDengine™ released a new data-sharing feature for its popular, cloud-native time-series data platform. By introducing user authorization to its core data subscription component, TDengine delivers simple and secure data distribution with fine-grained access control for internal and external stakeholders. Precise data access can be challenging and typically requires a second database for managing users and a custom application to control access to data. TDengine Cloud now allows sharing of any level of data—from an entire organization to a single topic—with specified users or user groups with different roles. The security of your shared data is ensured by secure tokens and encryption in flight.

“Enterprises want to share data with key members inside and outside the organization to streamline their processes and meet their business needs, but they need precise control of their data to ensure that compliance and privacy guidelines are satisfied,” said Jeff Tao, founder and CEO of TDengine. “Our data-sharing feature makes this complex process as easy as sharing Google Docs.”
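Conceptually, token-scoped sharing of this kind boils down to signing a claim that binds a user to a topic and checking that claim on every access. The sketch below is a generic Python illustration of that idea, not TDengine’s actual API; the signing key, token format, and topic names are all hypothetical:

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical signing key -- a real deployment would use a managed secret.
SECRET = b"demo-signing-key"

def issue_token(user: str, topic: str, ttl_s: int = 3600) -> str:
    """Sign a token granting `user` read access to a single topic."""
    claims = {"user": user, "topic": topic, "exp": time.time() + ttl_s}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return body + "." + sig

def authorize(token: str, topic: str) -> bool:
    """Verify the signature, expiry, and topic scope of a token."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims["topic"] == topic and claims["exp"] > time.time()

token = issue_token("analyst@partner.example", "factory/line-3/temperature")
print(authorize(token, "factory/line-3/temperature"))  # True
print(authorize(token, "factory/line-1/pressure"))     # False
```

In practice, a managed platform issues and validates such tokens itself; the point of the sketch is only that a single signed claim can scope access down to one topic for one user.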

Cognigy Launches Generative AI Solution for Enterprise Contact Centers

Cognigy, a market leader in Conversational AI, announced that its platform, Cognigy.AI, will be enhanced with Generative AI, or Large Language Models (LLMs) like OpenAI’s GPT-3, to augment Conversational AI deployments. The adoption of Generative AI will help to further transform enterprise customer service and contact centers by creating advanced conversational experiences and driving efficiency. Generative AI solutions have skyrocketed in recent months. As one example, ChatGPT, which was built by OpenAI on top of GPT-3 and made available for public testing on November 30, 2022, crossed one million users within one week of its launch.

“Generative AI combined with traditional Conversational AI platforms can deliver value that extends far beyond what each component can deliver alone,” said Philipp Heltewig, co-founder and CEO at Cognigy. “We see tremendous value in leveraging LLMs in a way that augments human tasks rather than operating autonomously. With our integrated approach, brands are now able to benefit from a better customer experience and lower cost to serve through the use of Generative AIs.”

Lexalytics Expands NLP Capabilities Across Foreign Languages

Lexalytics®, an InMoment® company and pioneer in AI-based, natural language processing (NLP) technology, announced that it has improved accuracy and expanded NLP capabilities for 11 non-English languages it supports. Now, global brands and businesses can analyze unstructured data natively and benefit from full-featured text analytics to better understand their customers and make more informed decisions across 31 total languages. 

While some companies offer a range of NLP features in languages other than English, few cover the sheer number that Lexalytics supports. Many rely on machine translation before processing, which can significantly decrease the accuracy of the analysis. Because Lexalytics processes text data in its native language, it can take into account the specific nuances and complexities of that language and better understand the meaning and context of the text, leading to more accurate results.

“We work with global companies whose customers – both current and prospective – are interacting with them in dozens of languages beyond the ones commonly supported in the data analytics landscape,” said Jeff Catlin, Head of Lexalytics, an InMoment company. “We’ve always been at the forefront of providing the most depth and breadth among enterprise NLP providers, and we’re fully committed to expanding our pioneering work to offer accurate and full-featured text analytics across the most languages for our customers.” 

DiffusionData Releases Diffusion 6.9

DiffusionData, formerly known as Push Technology, a pioneer and leader in real-time data streaming and messaging solutions, announced the release of Diffusion 6.9, the Intelligent Data Platform that consumes, enriches, and delivers data among applications, systems, and devices.

Altair Announces Release of Simulation 2022.2 Software Update

Altair (Nasdaq: ALTR), a global leader in computational science and artificial intelligence (AI), announced the latest updates to its simulation portfolio, Simulation 2022.2. These updates build on the enhancements brought by Simulation 2022.1 and improve Altair’s cloud elasticity and scalability, electrification, and product development capabilities. 

NICE Delivers New RPA Innovations Leveraging AI To Achieve Complete Performance and Master CXi

NICE (Nasdaq: NICE) announced new RPA (Robotic Process Automation) capabilities in its latest release, using AI to identify focused opportunities for automation. With NEVA Discover’s new process analytics solution and semi-supervised machine learning, organizations can optimize the execution of their business processes to achieve complete performance. NEVA Discover, NICE’s AI-powered discovery tool, uses advanced AI to produce actionable insights to optimize business processes, improve effectiveness and efficiency, and empower employees to meet their key performance indicators.

“NEVA Discover’s new capabilities demonstrate the power of AI in delivering actionable automation opportunities to empower employees and ultimately help brands master CXi (Customer Experience interactions),” said Barry Cooper, President, CX Division, NICE. “Process optimization with NEVA Discover is not a one-time event. It is a cycle of continuous improvement, with ongoing measurement of the impact of each change, to create exponential value.”

Absci First to Create and Validate De Novo Antibodies with Zero-Shot Generative AI

Absci Corporation (Nasdaq: ABSI), a generative AI drug creation company, announced the ability to create and validate de novo antibodies in silico (via a computer) with the use of zero-shot generative AI — a major milestone for the biotechnology industry. The ability to create de novo therapeutic antibodies in silico could potentially reduce the time it takes to get new drug leads into the clinic from as much as six years down to just 18-24 months while also increasing their probability of success in the clinic. This new advancement is a major industry step change, unlocking the potential to deliver breakthrough therapeutics at the click of a button, for every patient.

“It’s a slow, arduous process to bring just one safe, effective drug to the market. I have overseen the development of over ten drugs to approval throughout my career, and know the labor and dedication required for the small chance of creating a therapeutic that can improve lives,” said Andreas Busch, PhD, Chief Innovation Officer of Absci. “What Absci has accomplished is just one of the reasons I joined the team. Being part of the mission to bring potentially life-changing biologics to patients with the power of generative AI is the next evolution in medicine. We’re seeing that start today.” 

HPE Announces HPE Alletra 4000: The Next Generation of Data Storage Servers

Architected to accelerate data-driven initiatives at any scale and with ideal economics, the HPE Alletra 4000 family represents the first server-based offering within the HPE Alletra portfolio of cloud-native data infrastructure solutions. The new HPE Alletra 4000 models deliver the next generation of data storage servers that were previously under the HPE Apollo 4000 brand, providing modern data infrastructure specifically designed to keep pace with data demands securely, economically, and with operational simplicity. This empowers organizations to focus scarce capital resources and staff on their data-driven initiatives.

Aporia launches Direct Data Connectors, the industry’s first solution for monitoring large-scale data to ensure safe and trustworthy AI

Aporia, the observability platform for machine learning, announced the launch of Direct Data Connectors (DDC). DDC is a novel technology to monitor machine learning models in production by connecting directly to training and inference datasets, without the need to duplicate any data. With DDC, organizations can monitor billions of predictions without data sampling, data duplication, or hidden cloud costs. Aporia is the first company in the MLOps space to offer this capability.

“In today’s world where AI is being used for everything from scheduling airline flights to hiring employees, the traditional approach to ML monitoring is no longer sufficient,” said Liran Hason, CEO of Aporia. “We can’t rely on data samples anymore — as AI becomes more ubiquitous, it is crucial that organizations have a solution in place to accurately and completely monitor all data. We are proud to be the first to offer this capability with DDC.”
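Monitoring every prediction rather than a sample ultimately means comparing full production distributions against their training baselines. As a generic illustration of the kind of statistic such a monitor computes — not Aporia’s API — here is a dependency-free Python sketch of the Population Stability Index (PSI), a common drift measure:

```python
import math
from collections import Counter

def psi(expected, actual, bins=10):
    """Population Stability Index between two numeric samples.

    Buckets both samples with equal-width bins derived from `expected`,
    then sums (p - q) * ln(p / q) over the buckets. A common rule of
    thumb treats PSI > 0.2 as significant drift.
    """
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # guard against zero-width bins

    def bucket(x):
        return max(0, min(int((x - lo) / width), bins - 1))

    def dist(sample):
        counts = Counter(bucket(x) for x in sample)
        n = len(sample)
        # Floor at a small epsilon so empty buckets don't produce log(0).
        return [max(counts.get(b, 0) / n, 1e-6) for b in range(bins)]

    p, q = dist(expected), dist(actual)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

training = [i / 100 for i in range(100)]    # baseline feature values
production = [x + 0.5 for x in training]    # distribution shifted in production
print(psi(training, training) < 0.01)   # True: identical data, no drift
print(psi(training, production) > 0.2)  # True: shift flagged as drift
```

Running a check like this over complete datasets, rather than a sample, is what avoids the blind spots the quote above describes; a production system would compute it per feature and per model version.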
