Fiddler Announces Giga-Scale Model Performance Management with Deeper Understanding of Unstructured Models and Fine Discoverability to Launch New AI Initiatives

Fiddler, a pioneer in Model Performance Management (MPM), announced major improvements to its MPM platform, including model ingestion at giga-scale, natural language processing (NLP) and computer vision (CV) monitoring, class-imbalance handling, and a more intuitive, streamlined user experience. With these new features, the Fiddler MPM platform delivers a deeper understanding of unstructured model behavior and performance, along with enhanced scalability, discoverability of rare and nuanced model drifts, and ease of use.

Semantic Folding Approach Demonstrates a 2,800x Acceleration and 4,300x Increase in Energy Efficiency over BERT

The company behind the Semantic Folding approach announced its breakthrough prototype for classifying high volumes of unstructured text. Classifying documents or messages is one of the most fundamental Natural Language Understanding (NLU) functions for business artificial intelligence (AI). The benchmark was carried out on two similar system setups using the same off-the-shelf, dual AMD EPYC server hardware. The "BERT" system, a transformer-based machine learning technique for natural language processing, was augmented by an NVIDIA GPU. The "Semantic Folding" approach used a cost-comparable number of Xilinx Alveo FPGA accelerator cards.

Video Highlights: The Rise of DeBERTa for NLP Downstream Tasks

In episode seven of the NVIDIA Grandmaster Series, you’ll learn from four members of the Kaggle Grandmasters of NVIDIA (KGMON) team. Watch this video to learn how they used natural language processing to analyze argumentative writing elements from students and identified key phrases in patient notes from medical licensing exams.

More Than You Know: The Enterprise Worth of Natural Language Generation 

In this contributed article, editorial consultant Jelani Harper highlights how Natural Language Generation (NLG) is arguably the nexus point of natural language technologies. It utilizes Natural Language Processing (NLP), is a prerequisite for conversational AI, and largely requires Natural Language Understanding (NLU) for meaningful responses to interrogatives or commands.

Video Highlights: How to Optimize Deep Learning Models for Production

In the video presentation embedded below, our friends over at Neural Magic present a compelling workshop: How to Optimize Deep Learning Models for Production. After watching this video, you’ll be able to optimize your NLP and/or computer vision model, apply your own data with a few lines of code, and deploy it on commodity CPUs at GPU-level speeds.
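Pruning and quantization are typical routes to GPU-level inference speeds on commodity CPUs. As a rough illustration of the latter (a minimal hypothetical sketch, not Neural Magic's actual pipeline or API), post-training int8 quantization maps float weights to 8-bit integers with a per-tensor scale factor:

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus a per-tensor scale factor."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 127 if max_abs else 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

# Toy weight tensor: int8 storage is 4x smaller than float32, and
# integer arithmetic is what lets CPUs close the gap with GPUs.
weights = [0.31, -1.27, 0.05, 0.98, -0.44]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, approx))
assert max_err <= scale / 2 + 1e-9  # error bounded by half a quantization step
```

Real toolchains also calibrate activations, fuse operations, and exploit sparsity; the sketch only shows the core weight mapping.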

Deci Boosts Computer Vision & NLP Models’ Performance at MLPerf 

Deci, the deep learning company harnessing Artificial Intelligence (AI) to build AI, announced its results for the Computer Vision (CV) and Natural Language Processing (NLP) inference models it submitted to the MLPerf v2.0 Datacenter Open division. These submissions demonstrated the power of Deci's Automated Neural Architecture Construction (AutoNAC) technology, which automatically generated models dubbed DeciNets and DeciBERT, delivering breakthrough accuracy and throughput performance on Intel CPUs.

New Report: Natural Language Emerges as AI Priority in 2022

Expert.ai (EXAI:IM), a leading company in artificial intelligence (AI) for language understanding, released a new Expert IQ Report, "Natural Language Emerges as AI Priority in 2022." The report looks at the most discussed AI topics of 2021 while offering forecasts for 2022. The study turned AI on itself, using Expert.ai's advanced natural language understanding (NLU) capabilities to unearth its findings.

H2O.ai Democratizes Deep Learning with H2O Hydrogen Torch

H2O.ai, an AI Cloud leader, announced H2O Hydrogen Torch, a deep learning training engine that makes it easy for companies of any size, in any industry, to build state-of-the-art image, video, and natural language processing (NLP) models without coding.

Elite Deep Learning for Natural Language Technologies: Representation Learning

In this contributed article, editorial consultant Jelani Harper discusses deep learning, natural language technologies, and representation learning. The deep learning space continues to grow apace. Representation learning is one of the more meritorious approaches for paring down the quantity of training data (and the number of labels) involved in natural language technology deployments, and it does so while diversifying the utility of the underlying models for multitask learning applications.
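The multitask idea can be pictured with a toy sketch (hypothetical, not from the article): a single shared text representation feeds several lightweight task heads, so each new task fits only a small weight vector instead of learning features, and labels, from scratch.

```python
def encode(text, dim=8):
    """Toy shared representation: a deterministic hashed bag-of-words.
    A real deployment would use a pretrained language model here."""
    vec = [0.0] * dim
    for token in text.lower().split():
        vec[sum(ord(c) for c in token) % dim] += 1.0
    return vec

def head(rep, weights):
    """Task-specific linear head on top of the shared representation."""
    return sum(r * w for r, w in zip(rep, weights))

# Two downstream tasks reuse the same encoder output; the weight
# values below are arbitrary placeholders, not trained parameters.
rep = encode("the product works great")
sentiment = head(rep, [0.4, -0.1, 0.2, 0.0, 0.3, -0.2, 0.1, 0.5])
urgency = head(rep, [0.0, 0.6, -0.3, 0.1, 0.0, 0.2, -0.1, 0.4])
```

Because the encoder is shared, improving it benefits every head at once, which is the economy representation learning offers.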

New Data Report: How AI-powered Language is Enhancing Customer Engagement

Our friends over at Persado (the AI content generation and decisioning company) and Coresight Research released a new data report, "AI-Powered Language: A New Era of Enhanced Customer Engagement," that examines how AI-powered language is fueling enhanced customer engagement. The report surveyed decision-makers in industries such as retail, manufacturing, and professional services to highlight the growing importance of AI-enabled marketing technologies for generating engaging content, personalizing customer experiences, and improving how businesses leverage first-party data to gain meaningful insights.