insideBIGDATA Latest News – 1/26/2021


In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Cortical.io’s New Release of Contract Intelligence Software Uses AI to Improve Extraction and Search Capabilities

Cortical.io announced a new release of its Contract Intelligence software. Utilizing a patented natural language understanding (NLU) approach based on semantic folding theory, the software analyzes the content of large quantities of documents with a degree of accuracy that is difficult to achieve with manual labor or other automation tools. It automatically and accurately searches, extracts, classifies, and compares key information from agreements, contracts, and other unstructured documents such as policies and financial reports. The Contract Intelligence solution understands the meaning of whole sentences and concepts, instead of just keywords.

“Other vendors offer contract review and analysis software, but they are limited to pre-defined contract types. Cortical.io offers the capability to easily customize the extraction and classification to any type of document and to specific corporate requirements with little training data,” said COO Thomas Reinemer. “Because it is meaning-based, the solution reaches higher levels of accuracy even when extracting whole sentences and paragraphs, which makes it very reliable and provides efficiency and savings to our customers.”
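Cortical.io’s encoder is proprietary, but the core idea of semantic folding is public: represent text as sparse binary “fingerprints” and treat overlap between fingerprints as semantic similarity. The sketch below uses toy hand-made sets of active bit positions, not output of the real encoder.

```python
# Toy illustration of semantic-fingerprint matching; the bit positions
# below are invented, not produced by Cortical.io's encoder.
fingerprints = {
    "contract": {3, 17, 42, 88, 120, 305},
    "agreement": {3, 17, 42, 91, 120, 310},
    "banana": {7, 55, 200, 404, 512, 777},
}

def overlap_similarity(a, b):
    """Jaccard overlap of active bits approximates semantic similarity."""
    fa, fb = fingerprints[a], fingerprints[b]
    return len(fa & fb) / len(fa | fb)

# A meaning-based match ranks "agreement" closer to "contract" than "banana",
# even though none of the three share a keyword.
print(overlap_similarity("contract", "agreement") >
      overlap_similarity("contract", "banana"))  # True
```

This is why such systems can match whole concepts rather than literal keywords: similarity is computed in the fingerprint space, not over surface strings.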

Narrative Unveils Universal Onboarding, Industry’s First Self-Service Solution for Onboarding Offline Customer Data

Continuing its mission to fix the broken data broker model, Narrative—the Data Streaming Platform that makes it easy to buy, sell, and win— announced the launch of its Universal Onboarding app. This new offering represents a paradigm shift in the industry: a fully self-service solution for data onboarding, allowing users to match offline customer data to digital identifiers in minutes, without the need for technical expertise or significant upfront commitments. The app provides an alternative to traditional onboarding services characterized by a lack of visibility or control over data sources, slow turnaround times, and inflexible pricing models.

“Customers are moving fast, and today’s existing offerings for onboarding offline customer data takes two weeks on average, is error-prone, and has opaque pricing. This is frankly unacceptable,” said Nick Jordan, founder and CEO of Narrative. “With Universal Onboarding, we are delivering the industry’s easiest, most cost-effective, automated data onboarding solution that matches digital identifiers from our graph—a pool of 1.6 billion IDs—in minutes, without the need to speak with or wait for anyone.”
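Narrative has not published its matching internals; the general technique behind self-service onboarding, though, is to normalize and hash offline identifiers, then look them up in an identity graph that maps hashed keys to digital IDs. A minimal sketch, with invented graph contents and field names:

```python
import hashlib

# Hypothetical identity graph: hashed email -> digital identifiers.
# Contents are invented for illustration.
identity_graph = {
    hashlib.sha256(b"alice@example.com").hexdigest(): ["cookie:abc123", "maid:device-01"],
    hashlib.sha256(b"bob@example.com").hexdigest(): ["cookie:def456"],
}

def onboard(offline_emails):
    """Match offline emails to digital IDs via normalized, hashed lookup."""
    matches = {}
    for email in offline_emails:
        key = hashlib.sha256(email.strip().lower().encode()).hexdigest()
        if key in identity_graph:
            matches[email] = identity_graph[key]
    return matches

# Normalization makes "Alice@Example.com " match; unknown emails simply miss.
print(onboard(["Alice@Example.com ", "carol@example.com"]))
```

Hashing before lookup is also what lets raw customer identifiers stay out of the matching service, which is part of why this step can be automated end to end.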

Foxconn Announces FOXCONN NxVAE, Unsupervised Learning AI Technology

Foxconn Technology Group (Foxconn), a global leader in smart manufacturing, today announced the launch of FOXCONN NxVAE, a new unsupervised learning artificial intelligence (AI) technology that ensures higher levels of efficiency and accuracy in the inspection of defects in manufacturing production lines when compared with traditional practices.

As a first step in applying this new technology, Foxconn introduced FOXCONN NxVAE to some handheld device production lines in mainland China. After eight months of research and development, those lines successfully reduced the manpower resources required for defect inspection by 50 percent. The new technology will also be applied to broader manufacturing uses for verticals such as textiles and healthcare, as part of Foxconn’s support for accelerating transformation of a range of industry sectors.

“The yield rate of our production lines has exceeded 99% and the unsupervised learning algorithm developed by the AI team not only enhances efficiency and reduces the challenges associated with introducing new products into the production line, it also marks an important production efficiency milestone for our industry,” said Gene Liu, Vice President of the Semiconductor Subgroup at Foxconn Technology Group. “This development also demonstrates our company’s vision of “3+3=∞” which symbolizes the infinite possibilities created by Foxconn’s industrial advancement and emerging technologies, and we remain committed to investing in these areas.”

VAST Data Unveils Joint Reference Architecture with NVIDIA Designed to Significantly Increase Storage Performance for Large-Scale AI Workloads

VAST Data, the storage company breaking decades-old tradeoffs, today announced a new reference architecture based on NVIDIA DGX™ A100 systems and VAST Data’s Universal Storage platform. This reference architecture is designed to significantly increase storage performance for AI use cases such as large-scale training of conversational AI models and petabyte-scale data analytics. Jointly designed, built and tested by NVIDIA and VAST Data, the reference architecture eliminates the guesswork of building a solution from the ground up by providing enterprises with a turnkey petabyte-scale AI infrastructure solution that maximizes performance without adding needless cost and complexity.

Historically, enterprises have been forced to choose between two specific infrastructure configurations based on workload (i.e. GPU-intensive or storage-intensive), but as applications and data science teams’ requirements evolve, that infrastructure choice may limit and negatively impact performance. Built on VAST Data’s recently released LightSpeed platform and NVIDIA’s universal AI system, DGX A100, the architecture leverages VAST’s unique capabilities such as NFS-over-RDMA, NFS Multipath, and support for NVIDIA GPUDirect Storage as well as a converged fabric design. The joint reference architecture delivers more than 140GB/s of throughput for both GPU-intensive and storage-intensive AI workloads.

“For the first time, enterprises, and more importantly data science teams, are no longer constrained by the limitations of rigid infrastructure configurations,” said Jeff Denworth, Co-founder and CMO at VAST Data. “We’ve worked with NVIDIA on this new reference architecture, built on our LightSpeed platform, to provide customers a flexible, turnkey, petabyte-scale AI infrastructure solution and to remove the variables that have introduced compromise into storage environments for decades.”

Kyligence Launches Intelligent Data Cloud for Interactive Analytics at Massive Concurrency and Scale

Kyligence, originator of Apache Kylin and developer of the AI-augmented analytics platform Kyligence Cloud, announced the immediate availability of Kyligence Cloud 4, its new cloud-native distributed big data analytics platform. Available on Microsoft Azure and Amazon AWS, Kyligence Cloud 4 leverages cloud-native concepts such as the separate scaling of compute and storage to enable fast, scalable, and highly concurrent analytics against cloud data warehouses and data lakes. It combines high performance and high concurrency OLAP, a cloud-native architecture, and auto-optimization using machine learning algorithms to simplify and automate cloud analytics. Kyligence Cloud 4 can routinely deliver sub-second query response times against datasets of hundreds of terabytes to petabytes.

“When a significant proportion of your IT budget is allocated to cloud services, significant performance gains translate to pure savings in the cloud,” said Li Kang, vice president of North America, Kyligence. “Kyligence Cloud functions as a high performance data service that delivers unified semantics and supports SQL, MDX, and REST interfaces. Our mission is to follow a cloud-native approach to enable data engineers and SQL experts to build high performance into the datasets they curate.”
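Kyligence Cloud 4’s engine is proprietary, but it descends from Apache Kylin’s core idea: precompute aggregates over combinations of dimensions (a lattice of “cuboids”) so that interactive queries read tiny summary tables instead of scanning raw rows, which is how sub-second responses on huge datasets become routine. A toy version, with invented sales data:

```python
from collections import defaultdict
from itertools import combinations

# Invented fact table and dimensions for illustration.
rows = [
    {"region": "EU", "product": "A", "sales": 10},
    {"region": "EU", "product": "B", "sales": 5},
    {"region": "US", "product": "A", "sales": 7},
]
dimensions = ["region", "product"]

# Build every cuboid once, ahead of query time.
cuboids = {}
for r in range(len(dimensions) + 1):
    for dims in combinations(dimensions, r):
        agg = defaultdict(int)
        for row in rows:
            agg[tuple(row[d] for d in dims)] += row["sales"]
        cuboids[dims] = dict(agg)

def query(group_by, values):
    """Answer an aggregate query from a precomputed cuboid, not raw data."""
    return cuboids[tuple(group_by)][tuple(values)]

print(query(["region"], ["EU"]))  # 15, served from the summary table
```

The trade-off is classic OLAP: cube-building cost and storage are paid up front so that query-time work, and therefore latency under high concurrency, stays nearly constant regardless of raw data volume.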

Splice Machine Launches the Splice Machine Feature Store to Simplify Feature Engineering and Democratize Machine Learning

Splice Machine, the scale-out SQL database with built-in machine learning, announced it has launched the Splice Machine Feature Store. The solution will help more companies operationalize machine learning by reducing the complexity of feature engineering and allow data scientists to make the right decisions based on real-time data.

“The capacity to create, share, explain and reliably reproduce features for a given model is paramount to the success of a data science team,” said Monte Zweben, CEO, Splice Machine. “The old way of doing things meant data science operations were simply not scalable. The Splice Machine Feature Store enables you to harness complex analytics in real time and transform real-time data into features, so your models are never uninformed. It also stores feature history, making training set creation a single click.”
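Feature-store internals vary by vendor, and Splice Machine’s API is not shown here; this toy class illustrates the two capabilities the announcement emphasizes: serving the latest feature value in real time, and keeping feature history so training sets can be built “as of” a past timestamp without leaking future data. All names and values are invented.

```python
from bisect import bisect_right

class FeatureStore:
    """Minimal sketch: per-entity feature values with full history."""

    def __init__(self):
        self._history = {}  # (entity_id, feature) -> sorted [(ts, value)]

    def write(self, entity_id, feature, ts, value):
        series = self._history.setdefault((entity_id, feature), [])
        series.append((ts, value))
        series.sort()

    def latest(self, entity_id, feature):
        """Real-time serving path: most recent value."""
        return self._history[(entity_id, feature)][-1][1]

    def as_of(self, entity_id, feature, ts):
        """Point-in-time lookup for leakage-free training sets."""
        series = self._history[(entity_id, feature)]
        i = bisect_right(series, (ts, float("inf")))
        return series[i - 1][1] if i else None

fs = FeatureStore()
fs.write("user_1", "avg_spend", ts=100, value=20.0)
fs.write("user_1", "avg_spend", ts=200, value=35.0)
print(fs.latest("user_1", "avg_spend"))      # 35.0 for real-time serving
print(fs.as_of("user_1", "avg_spend", 150))  # 20.0 for training at t=150
```

The `as_of` lookup is the piece that makes training sets reproducible: a model trained “as of” time 150 sees only what was knowable then, which is what “your models are never uninformed” depends on in practice.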

RSIP Vision Announces Versatile Medical Image Segmentation Tool, Delivering Efficient Anatomical Measurements and Better Treatment Options

RSIP Vision, a leading innovator in medical imaging through advanced AI and computer vision solutions, announced a general purpose, AI-based segmentation and measurement tool for detecting objects of interest and their boundaries quickly and automatically, making surgical and diagnostic measurements easier and more accurate for better treatment decisions. The tool requires minimal work by the user to deliver an accurate 3D visualization and analysis of patient anatomy and is applicable across medical imaging verticals and modalities. The solution runs automatically and is robust and clinically accurate, avoiding human factors such as fatigue and misreads which may result in mistakes in measurement. It is available to medical device manufacturers for use in leading facilities worldwide.
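RSIP Vision’s segmentation models are proprietary deep-learning systems; the sketch below shows only the downstream step the announcement emphasizes: once a segmentation mask exists, anatomical measurements fall out automatically and reproducibly. The synthetic “scan,” mask, and pixel spacing are all invented.

```python
import numpy as np

# Synthetic 2D scan with one bright region of interest (invented data).
scan = np.zeros((100, 100))
scan[30:60, 40:70] = 1.0

mask = scan > 0.5          # stand-in for a learned segmentation mask
pixel_spacing_mm = 0.5     # physical size of one pixel (assumed)

# Measurements derived directly from the mask, with no manual tracing.
area_mm2 = mask.sum() * pixel_spacing_mm**2
ys, xs = np.nonzero(mask)
height_mm = (ys.max() - ys.min() + 1) * pixel_spacing_mm
width_mm = (xs.max() - xs.min() + 1) * pixel_spacing_mm
print(area_mm2, height_mm, width_mm)  # 225.0 15.0 15.0
```

Because the measurements are deterministic functions of the mask, they do not vary with operator fatigue or experience, which is the reliability argument the announcement makes.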

“Distinguishing and measuring organs, lesions, and other areas of interest in biopsy and pre-surgical planning can be tedious work, which is generally assigned to a specific employee or technician, or even a physician,” said Ron Soferman, Founder & CEO at RSIP Vision. “Our new segmentation tool makes it easier to pinpoint specific points and boundaries in images, which in turn leads to greater accuracy during surgeries without being dependent on the capability and experience of a specific individual. In 2021, RSIP Vision will continue to drive innovation in image analysis across the medical verticals through custom software, advanced algorithm development and custom technologies which will be found in medical devices in leading facilities worldwide. RSIP Vision ensures customers can leverage the latest advances in AI and computer vision, in order to save time and cost during medical procedures.”

Fivetran Expands Enterprise Data Integration with Release of New Data Source Connectors

Fivetran, a leading automated data integration provider, announced the addition of new data source connectors for IBM, Oracle and SAP, accelerating large enterprise organizations’ move to the modern data stack. The pre-built connectors add to the more than 150 maintained 24/7 by Fivetran, ensuring data engineering teams never need to waste time building and maintaining data pipelines. Mission-critical data stored in these databases can now be efficiently extracted from its source, loaded into cloud data warehouses and lakes, and reliably managed for instant availability to support data-driven decision making.

“Enterprise data teams just don’t have the time to build, manage and maintain the enormous number of data integrations that are being requested. We see it as our mission at Fivetran to eliminate this burden, freeing up invaluable time by providing an extensive library of easy-to-use, fully managed and maintained data connectors,” said Fraser Harris, vice president of product at Fivetran. “Because we observe tens of thousands of customer connector configurations and manage more than one million syncs daily, we’re able to apply our insight and continuous pipeline optimizations across all of our 1600-plus customers, providing added value for all.”

Datatron Releases New Governance Dashboard to Provide Trust, Transparency, Traceability & Validation of AI/ML Solutions

Datatron, a pioneer in AI ModelOps and governance at scale, announced the immediate availability of the Datatron Governance Dashboard, an adaptive, performance-based artificial intelligence (AI) governance solution. This new offering provides AI and machine learning (ML) model transparency and risk management to comply with regulations while optimizing business outcomes. Because many ML initiatives work in isolation from each other and don’t directly support—or can even hinder—a company’s broader business, regulatory, and privacy objectives, the Datatron Governance Dashboard provides model traceability and confidence in AI applications. The dashboard provides analytic leaders, business stakeholders, and data scientists a bird’s-eye, multi-level view of how their models perform in production via the smart visualization of key metrics.

Technologies like ML and deep learning (DL) are becoming critically important for organizations looking to increase revenue and competitive advantage. However, along with these new technologies came new government guidelines and regulatory approaches because AI/ML models can pose risks to privacy, brand reputation, autonomy, and growth. Models must be carefully managed and appropriately governed, especially since further AI/ML adoption and acceptance depend on trust, transparency, and validation. The new Datatron Governance Dashboard not only facilitates compliance with audits and regulations, it delivers consistent and easy-to-understand risk assessment and management analysis that should be the foundation of any AI/ML program.
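The dashboard’s specific metrics are not detailed in the announcement; Population Stability Index (PSI) is one widely used production-monitoring metric of the kind such governance tools visualize. It quantifies how far live input data has drifted from the training distribution. The bin count and alert cutoff below are conventional choices, not Datatron’s:

```python
import numpy as np

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline and a live sample."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    e = np.histogram(expected, bins=edges)[0] / len(expected)
    a = np.histogram(actual, bins=edges)[0] / len(actual)
    e, a = np.clip(e, 1e-6, None), np.clip(a, 1e-6, None)
    return float(np.sum((a - e) * np.log(a / e)))

rng = np.random.default_rng(1)
train = rng.normal(0, 1, 10_000)   # distribution the model was trained on
live = rng.normal(1.0, 1, 10_000)  # production traffic has shifted

print(psi(train, train) < 0.1)   # True: no drift against itself
print(psi(train, live) > 0.25)   # True: above a common "significant drift" cutoff
```

Tracking a metric like this per feature, per model, over time is what lets a dashboard surface degrading models early, before poor predictions show up in business outcomes.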

“We consistently hear from prospects about poor model performance and a serious lack of trust in predictions from AI applications,” said Harish Doddi, Datatron CEO. “The ability of large companies to operationalize and govern their AI/ML models has far exceeded their infrastructure and the bandwidth of their engineering and data science teams. We established Datatron to operationalize and govern AI model management at scale. Leveraging the new Datatron Governance Dashboard, organizations can monitor deployments, detect problems early, and increase the efficiency of managing multiple models at scale in order to maintain compliance and growth.”

ScaleOut Software Announces Tools to Simplify Development of Real-Time Digital Twins for Streaming Analytics

ScaleOut Software released enhanced support for developing streaming analytics applications that run on its ScaleOut Digital Twin Streaming Service™. Now developers can build real-time digital twin models through a widely used technique called a “rules engine” as an alternative to writing application code in programming languages such as Java, C#, or JavaScript. This technique reduces development time and lowers the barrier for developers and analysts by eliminating the need for specialized programming skills. To ensure fast and easy application development, a new software tool, called the ScaleOut Rules Engine Development Tool™, provides comprehensive features for building and testing rules engine-based models.

With its ability to immediately analyze data in motion from individual data sources in milliseconds and make immediate use of dynamic context for each data source, the ScaleOut Digital Twin Streaming Service fundamentally changes the way industries like telematics, security monitoring, healthcare, logistics, retail, and financial services process live data streams to make critical decisions in the moment. Support for describing application logic using a set of rules that are executed by a rules engine enables the streaming service to be deployed by a wider range of application developers than previously possible.
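ScaleOut’s actual rule syntax is defined by its development tool, not shown here; this sketch illustrates the general pattern the announcement describes: each digital twin holds state for one data source, and a declarative list of (condition, action) rules runs against every incoming message instead of hand-written handler code. Rule contents and message fields are invented.

```python
class DigitalTwin:
    """One twin per data source: per-source state plus shared rules."""

    def __init__(self, source_id, rules):
        self.source_id = source_id
        self.state = {"max_temp": 0.0, "alerts": []}
        self.rules = rules

    def on_message(self, msg):
        # A rules engine evaluates every rule against state + message.
        for condition, action in self.rules:
            if condition(self.state, msg):
                action(self.state, msg)

rules = [
    # Rule 1: track the hottest reading seen from this source.
    (lambda s, m: m["temp"] > s["max_temp"],
     lambda s, m: s.update(max_temp=m["temp"])),
    # Rule 2: raise an alert when a reading crosses a threshold.
    (lambda s, m: m["temp"] > 90.0,
     lambda s, m: s["alerts"].append(f"overheat at {m['temp']}")),
]

twin = DigitalTwin("sensor-17", rules)
for reading in [72.0, 95.5, 80.0]:
    twin.on_message({"temp": reading})
print(twin.state)  # {'max_temp': 95.5, 'alerts': ['overheat at 95.5']}
```

Because the rules are data rather than code, an analyst can add or change per-source logic without touching Java, C#, or JavaScript, which is the accessibility point the release makes.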

“We are excited to further simplify the development of real-time streaming analytics applications with support for rules-based digital twin models,” said Dr. William Bain, ScaleOut Software’s CEO and founder. “With the introduction of the ScaleOut Digital Twin Streaming Service, we created a breakthrough for applications that need to simultaneously track thousands of data sources. Now, the integration of an easy-to-use rules engine adds new features that make application development easier than ever.”

Greenlight Guru Launches Halo℠ For Change Management, World’s First AI & ML Recommendation Engine for Medical Device Quality

Greenlight Guru, a leading medical device quality management software (MDQMS) platform, announced the launch of Halo℠ for Change Management, an AI and machine learning recommendation engine for medical device quality. This revolutionary product provides quality and product teams with recommendations of items impacted by a change order and the real-time visibility needed to discover, assess and manage the impact of a change. With the launch of Halo℠ for Change Management, medical device companies can stop reacting and start predicting. 

“Our recent 2021 State of Medical Device Quality Report revealed that, currently, 60% of organizations rely on traceability for conducting change, while 28% of medical device professionals say it still takes a full day or more to run a change impact analysis. This tells us that medical device professionals are working from a reactive state that is tedious and error-prone, limiting organizations from gathering the quality insights that are needed to stay ahead of change,” said David DeRam, CEO at Greenlight Guru. “Halo℠ for Change Management is the first AI feature that allows you to predict the impact of your change and provides you with recommendations for what is impacted by a change, giving you real-time visibility, reduced risk, and overall improved quality.” 

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1
