2023 Trends in Artificial Intelligence and Machine Learning: Generative AI Unfolds  

In this contributed article, editorial consultant Jelani Harper offers his perspectives on 2023 trends for the boundless potential of generative Artificial Intelligence: the variety of predominantly advanced machine learning that analyzes existing content in order to produce strikingly similar new content.

Cortical.io Semantic Folding Approach Demonstrates a 2,800x Acceleration and 4,300x Increase in Energy Efficiency over BERT

Cortical.io announced its breakthrough prototype for classifying high volumes of unstructured text. Classifying documents or messages constitutes one of the most fundamental Natural Language Understanding (NLU) functions for business artificial intelligence (AI). The benchmark was carried out on two similar system setups using the same off-the-shelf dual AMD EPYC server hardware. The "BERT" system, based on a transformer machine learning technique for natural language processing, was augmented with an NVIDIA GPU. The "Semantic Folding" system utilized a cost-comparable number of Xilinx Alveo FPGA accelerator cards.
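Semantic Folding represents text as sparse binary "semantic fingerprints" and compares them by bit overlap rather than dense matrix arithmetic, which is what makes it amenable to FPGA acceleration. The sketch below is a toy illustration of overlap-based classification only; the fingerprints and class names are invented for the example, and a real system learns bit positions from a reference corpus.

```python
# Toy sketch of overlap-based similarity between sparse binary
# "fingerprints" -- the general idea behind semantic-fingerprint
# classification. Fingerprints here are hand-made sets of active
# bit indices; production systems learn them from a corpus.

def overlap_similarity(fp_a: set, fp_b: set) -> float:
    """Fraction of shared active bits relative to the smaller fingerprint."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / min(len(fp_a), len(fp_b))

def classify(fingerprint: set, class_fingerprints: dict) -> str:
    """Assign the class whose fingerprint overlaps the most."""
    return max(class_fingerprints,
               key=lambda c: overlap_similarity(fingerprint, class_fingerprints[c]))

# Hypothetical class fingerprints (active bit positions).
classes = {
    "finance": {3, 7, 12, 31, 44},
    "sports":  {5, 9, 18, 27, 40},
}
doc = {3, 7, 18, 31, 44}          # shares 4 of 5 bits with "finance"
print(classify(doc, classes))     # -> finance
```

Because similarity reduces to set intersection over fixed-width bit vectors, the comparison parallelizes cheaply in hardware, which is the efficiency argument behind the benchmark above.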

Research Highlights: ExBERT

In the insideBIGDATA Research Highlights column we take a look at new and upcoming results from the research community for data science, machine learning, AI and deep learning. Our readers need a glimpse of technology coming down the pipeline that will make their efforts more strategic and competitive. In this installment we review a new paper: exBERT: A Visual Analysis Tool to Explore Learned Representations in Transformer Models by researchers from the MIT-IBM Watson AI Lab and Harvard.
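What tools like exBERT render as heatmaps are a transformer's learned attention weights. As a minimal sketch of the underlying quantity, the snippet below computes scaled dot-product attention weights for one query over a few key vectors; the 2-d vectors are made up for illustration and have nothing to do with real BERT parameters.

```python
# Toy scaled dot-product attention in plain Python -- the kind of
# per-token weight distribution that visualization tools display.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_weights(query, keys):
    """One query attending over a list of key vectors."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    return softmax(scores)

# Three toy token keys; the query most resembles the first key,
# so the first token receives the largest attention weight.
keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
w = attention_weights([1.0, 0.0], keys)
print([round(x, 2) for x in w])
```

In a real model each layer and head produces one such distribution per token, and exBERT lets you inspect them interactively.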

Interview: Beerud Sheth, CEO of Gupshup

I recently caught up with Beerud Sheth, CEO of Gupshup, to discuss the state of the art in conversational AI and chatbot technology. He also gives us an idea of future areas of evolution for AI chatbots.

How NLP and BERT Will Change the Language Game

In this contributed article, Rob Dalgety, Industry Specialist at Peltarion, discusses how BERT (Bidirectional Encoder Representations from Transformers), the model open-sourced by Google in October 2018, is now reshaping the NLP landscape. BERT is significantly more evolved in its understanding of word semantics in context, and has an ability to process large amounts of text and language.
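BERT's key advance is that the same word receives a different representation depending on the sentence around it, unlike older static embeddings. The toy sketch below fakes that effect with a simple window average over hand-made 2-d vectors; it is not BERT, which learns context with bidirectional transformer layers, but it shows the contextual-vector idea in miniature.

```python
# Toy illustration of context-dependent word representations.
# The word "bank" gets a different vector in "river bank" than in
# "bank loan". Vectors are invented for the example.

STATIC = {  # hypothetical static word vectors
    "river": [1.0, 0.0],
    "bank":  [0.5, 0.5],
    "loan":  [0.0, 1.0],
}

def contextual_vector(word, sentence):
    """Average the word's static vector with its neighbours' vectors."""
    i = sentence.index(word)
    window = sentence[max(0, i - 1): i + 2]
    vecs = [STATIC[w] for w in window]
    n = len(vecs)
    return [sum(v[d] for v in vecs) / n for d in range(2)]

v1 = contextual_vector("bank", ["river", "bank"])
v2 = contextual_vector("bank", ["bank", "loan"])
print(v1, v2)  # -> [0.75, 0.25] [0.25, 0.75]: one word, two vectors
```

The two printed vectors differ even though the word is the same, which is exactly the property that lets BERT disambiguate word senses from surrounding text.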