Intel Innovation 2023 Highlights

Tuesday morning (Sept. 19, 2023), Intel kicked off its third annual developer event, Intel Innovation 2023, virtually and in San Jose, California. During the Day 1 keynote, “Developing the Future of the Siliconomy,” Intel CEO Pat Gelsinger and a variety of customers unveiled an array of technologies and applications aimed at bringing artificial intelligence everywhere and making it more accessible across all workloads.

Generative AI Report – 9/19/2023

Welcome to the Generative AI Report round-up feature here on insideBIGDATA, with a special focus on all the new applications and integrations tied to generative AI technologies. We’ve been receiving so many cool news items relating to applications and deployments centered on large language models (LLMs) that we thought it would be a timely service for readers to start a new channel along these lines. An LLM fine-tuned on proprietary data amounts to an AI application, and that is what these innovative companies are creating. The field of AI is accelerating at such a fast rate that we want to help our loyal global audience keep pace.
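
To ground that idea, here is a minimal sketch of the “fine-tune an LLM on your own data” recipe using the Hugging Face transformers library. The tiny distilgpt2 model and the toy “proprietary” strings are illustrative stand-ins, not any particular vendor’s stack.

```python
# Minimal sketch of fine-tuning a small causal LLM on in-house text.
# distilgpt2 and the toy corpus below are stand-ins for a production
# model and real proprietary data.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "distilgpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 family has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# "Proprietary data": a handful of domain documents, here just toy strings.
corpus = Dataset.from_dict({"text": [
    "Policy 12 covers returns within 30 days of purchase.",
    "Premium-tier customers receive 24/7 support.",
]})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=64),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1,
                           per_device_train_batch_size=2, report_to=[]),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```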

SambaNova Unveils New AI Chip, the SN40L, Powering its Full Stack AI Platform

SambaNova Systems, makers of the purpose-built, full stack AI platform, announced a new chip, the SN40L. The SN40L will power SambaNova’s full stack large language model (LLM) platform, the SambaNova Suite. Its new design combines dense and sparse compute on a single chip and pairs large memory with fast memory, making it a truly “intelligent chip.”

Kinetica Launches Native Large Language Model for Language-to-SQL on Enterprise Data  

Kinetica, the speed layer for generative AI and real-time analytics, announced a native Large Language Model (LLM) that, combined with Kinetica’s architecture, lets users perform ad-hoc analysis of real-time, structured data at speed using natural language. Unlike with public LLMs, no external API call is required, and data never leaves the customer’s environment.
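
As a rough illustration of that workflow (and not Kinetica’s actual client API), the sketch below uses a hypothetical translate_to_sql() stand-in for the native model and a local SQLite table to show the shape of language-to-SQL when everything stays inside the customer’s environment.

```python
# Hypothetical language-to-SQL workflow: a question is translated to SQL by
# an in-environment model and run against local data, so no external API
# call is needed. translate_to_sql() is a stand-in, not Kinetica's client.
import sqlite3

def translate_to_sql(question: str) -> str:
    """Stand-in for an in-environment language-to-SQL model."""
    # A real native LLM would generate this from the question and the schema.
    return (
        "SELECT vehicle_id, AVG(speed) AS avg_speed "
        "FROM telemetry GROUP BY vehicle_id ORDER BY avg_speed DESC LIMIT 5;"
    )

# Structured, real-time data stays local; here, a small in-memory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE telemetry (vehicle_id TEXT, speed REAL)")
conn.executemany(
    "INSERT INTO telemetry VALUES (?, ?)",
    [("v1", 61.2), ("v1", 58.9), ("v2", 72.4), ("v3", 44.0)],
)

sql = translate_to_sql("Which vehicles have the highest average speed?")
for row in conn.execute(sql):
    print(row)
```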

Anyscale Teams With NVIDIA to Supercharge LLM Performance and Efficiency

Anyscale, the AI infrastructure company built by the creators of Ray, the world’s fastest-growing open-source unified framework for scalable computing, today announced a collaboration with NVIDIA to further boost the performance and efficiency of large language model (LLM) development on Ray and the Anyscale Platform for production AI.
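
For readers new to Ray, the minimal sketch below shows the fan-out pattern it provides for scaling work such as batch LLM inference across a cluster. The fake_llm() task is a placeholder of our own, not the NVIDIA-accelerated path described in the announcement.

```python
# Minimal Ray sketch of the scale-out pattern the collaboration targets.
import ray

ray.init()  # on the Anyscale Platform this would attach to a managed cluster

@ray.remote
def fake_llm(prompt: str) -> str:
    # Stand-in for LLM inference; a real deployment would load a model here.
    return f"response to: {prompt}"

prompts = [f"question {i}" for i in range(8)]
# Fan the prompts out across the cluster, then gather the results.
futures = [fake_llm.remote(p) for p in prompts]
print(ray.get(futures))
```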

Video Highlights: Designing Machine Learning Systems — with Chip Huyen

Chip Huyen, co-founder of Claypot AI and author of O’Reilly’s best-selling “Designing Machine Learning Systems,” joins our good friend Jon Krohn, Co-Founder and Chief Data Scientist at the machine learning company Nebula, to share her expertise on designing production-ready machine learning applications, the importance of iteration in real-world deployment, and the critical role of real-time machine learning across a range of use cases.

DDN Storage Solutions Deliver 700% Gains in AI and Machine Learning for Image Segmentation and Natural Language Processing

DDN®, a leader in artificial intelligence (AI) and multi-cloud data management solutions, announced impressive performance results for its AI storage platform in the inaugural AI storage benchmarks released this week by the MLCommons Association. The MLPerf™ Storage v0.5 benchmark results confirm DDN storage solutions as the gold standard for AI and machine learning applications.

KX Announces KDB.AI Cloud: The Free, Smarter Vector Database for AI

KX, a pioneer in vector and time-series data management, has announced the general availability of KDB.AI Cloud, a vector database for real-time contextual AI. Quick and easy to set up and use, this free, cloud-based version of KDB.AI has been designed with a commitment to providing a superior developer experience.
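
The core operation a vector database serves is nearest-neighbor search over embeddings. The plain-NumPy sketch below illustrates that operation conceptually; it is not the KDB.AI client API, and the dimensions and data are made up.

```python
# Conceptual nearest-neighbor search over embeddings, the workload a vector
# database indexes and accelerates at scale.
import numpy as np

rng = np.random.default_rng(0)
index = rng.normal(size=(10_000, 384))            # stored document embeddings
index /= np.linalg.norm(index, axis=1, keepdims=True)

query = rng.normal(size=384)                      # embedding of the user query
query /= np.linalg.norm(query)

scores = index @ query                            # cosine similarity
top_k = np.argsort(scores)[-5:][::-1]             # five closest vectors
print(top_k, scores[top_k])
```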

Research Highlights: Unveiling the First Fully Integrated and Complete Quantum Monte Carlo Integration Engine

Quantinuum, a leading integrated quantum computing company, has published full details of its complete Quantum Monte Carlo Integration (QMCI) engine. QMCI applies to problems that have no analytic solution, such as pricing financial derivatives or simulating the results of high-energy particle physics experiments, and it promises computational advances across business, energy, supply chain logistics and other sectors.
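
For context, classical Monte Carlo integration estimates such quantities by averaging random samples, with an error that shrinks only as one over the square root of the sample count, which is the scaling quantum approaches like QMCI aim to improve on. The sketch below prices a simple European call option the classical way, purely to show the sampling-and-averaging pattern; the parameters are illustrative and this is not Quantinuum’s engine.

```python
# Classical Monte Carlo pricing of a European call option under geometric
# Brownian motion. Parameters are illustrative; this is the classical
# baseline, not a quantum engine.
import numpy as np

S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0  # spot, strike, rate, vol, maturity
n_paths = 1_000_000

rng = np.random.default_rng(42)
z = rng.standard_normal(n_paths)
# Terminal asset price for each simulated path.
ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
payoff = np.maximum(ST - K, 0.0)

price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"MC price ~ {price:.3f} +/- {stderr:.3f}")
```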

New FeatureByte Copilot Automatically Ideates Use-Case Specific Features for Data Scientists

FeatureByte, an AI startup formed by a team of data science experts, announced FeatureByte Copilot, an automated, intelligent feature ideation solution that marks a new era in enterprise AI. This new product, driven by data semantics and real-world relevance, eliminates a major headache for data science teams – preparing and deploying AI data.
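
For a sense of what “feature ideation” means in practice, the pandas example below hand-builds the kind of per-customer aggregate features a copilot would propose automatically. The column names are illustrative, and this is not the FeatureByte SDK.

```python
# Hand-written aggregate features of the sort a data scientist would
# otherwise derive manually from raw event data.
import pandas as pd

orders = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c2", "c2", "c3"],
    "order_ts": pd.to_datetime([
        "2023-09-01", "2023-09-10", "2023-09-03",
        "2023-09-12", "2023-09-15", "2023-09-08",
    ]),
    "amount": [25.0, 40.0, 10.0, 55.0, 30.0, 80.0],
})

# Per-customer spend features over the observation window.
features = orders.groupby("customer_id")["amount"].agg(
    total_spend="sum", avg_order_value="mean", n_orders="count"
).reset_index()
print(features)
```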