In this contributed article, Jakob Freund, Co-Founder and CEO at Camunda, explores three different types of AI that he predicts will dominate industries as organizations work to ensure business processes are streamlined and working as intended. These three AI buckets are predictive decision-making, generative processes, and assistive tools.
insideBIGDATA AI News Briefs – 9/22/2023
Welcome to insideBIGDATA AI News Briefs, our timely new feature bringing you the latest industry insights and perspectives on AI, including deep learning, large language models, generative AI, and transformers. We’re working tirelessly to dig up the most timely and curious tidbits underlying the day’s most popular technologies. We know this field is advancing rapidly, and we want to give you a regular resource that keeps you informed and up to date.
insideBIGDATA Latest News – 9/21/2023
In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.
Intel Innovation 2023 Highlights
Tuesday morning (Sept. 19, 2023), Intel kicked off its third annual developer event, Intel Innovation 2023, virtually and in San Jose, California. During the Day 1 keynote, “Developing the Future of the Siliconomy,” Intel CEO Pat Gelsinger and a variety of customers unveiled an array of technologies and applications that bring artificial intelligence everywhere and make it more accessible across all workloads.
Generative AI Report – 9/19/2023
Welcome to the Generative AI Report round-up feature here on insideBIGDATA, with a special focus on all the new applications and integrations tied to generative AI technologies. We’ve been receiving so many cool news items relating to applications and deployments centered on large language models (LLMs) that we thought it would be a timely service for readers to start a new channel along these lines. An LLM fine-tuned on proprietary data equals an AI application, and that is what these innovative companies are creating. The field of AI is accelerating at such a fast rate that we want to help our loyal global audience keep pace.
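As a loose illustration of the “LLM plus proprietary data” pattern described above, here is a minimal sketch of one common preparation step: converting internal records into prompt/completion pairs in JSONL form, the format most fine-tuning pipelines consume. The field names, records, and output filename are hypothetical, not tied to any particular vendor’s tooling.

```python
# Minimal sketch: turning proprietary records into JSONL prompt/completion
# pairs for fine-tuning. Field names and example records are hypothetical.
import json

support_tickets = [
    {"question": "How do I reset my router?",
     "resolution": "Hold the reset button for 10 seconds."},
    {"question": "Why is my invoice higher this month?",
     "resolution": "A prorated upgrade charge was applied."},
]

with open("finetune.jsonl", "w") as f:
    for ticket in support_tickets:
        record = {
            "prompt": f"Customer question: {ticket['question']}\nAnswer:",
            "completion": " " + ticket["resolution"],
        }
        f.write(json.dumps(record) + "\n")  # one JSON object per line
```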
SambaNova Unveils New AI Chip, the SN40L, Powering its Full Stack AI Platform
SambaNova Systems, makers of the purpose-built, full stack AI platform, announced a revolutionary new chip, the SN40L. The SN40L will power SambaNova’s full stack large language model (LLM) platform, the SambaNova Suite. Its new design offers both dense and sparse compute and includes both large and fast memory, making it a truly “intelligent chip.”
Heard on the Street – 9/18/2023
Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace.
Anyscale Teams With NVIDIA to Supercharge LLM Performance and Efficiency
Anyscale, the AI infrastructure company built by the creators of Ray, the world’s fastest-growing open-source unified framework for scalable computing, today announced a collaboration with NVIDIA to further boost the performance and efficiency of large language model (LLM) development on Ray and the Anyscale Platform for production AI.
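As a rough illustration of the kind of workload this collaboration targets, here is a minimal sketch of fanning out LLM prompt processing across a Ray cluster. The `generate` function body is a placeholder and does not reflect the actual Anyscale/NVIDIA integration; in a real deployment it would call a GPU-accelerated model.

```python
# Minimal sketch: distributing prompt processing with Ray.
# The model call inside generate() is a stand-in, not the actual
# Anyscale/NVIDIA integration.
import ray

ray.init()  # connect to a local or remote Ray cluster

@ray.remote  # in production you might add num_gpus=1 to reserve a GPU per task
def generate(prompt: str) -> str:
    # Placeholder for an LLM call; a real deployment would invoke a
    # GPU-backed model or inference server here.
    return f"completion for: {prompt}"

prompts = ["Summarize Q3 revenue.", "Draft a release note.", "Classify this ticket."]

# Launch all prompts in parallel; Ray schedules the tasks across the cluster.
futures = [generate.remote(p) for p in prompts]
results = ray.get(futures)

for prompt, completion in zip(prompts, results):
    print(prompt, "->", completion)
```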
Kinetica Launches Native Large Language Model for Language-to-SQL on Enterprise Data
Kinetica, the speed layer for generative AI and real-time analytics, announced a native large language model (LLM) that, combined with Kinetica’s innovative architecture, allows users to perform ad-hoc data analysis on real-time, structured data at speed using natural language. Unlike with public LLMs, no external API call is required, and data never leaves the customer’s environment.
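To make the idea concrete, here is a hedged sketch of the general language-to-SQL pattern the announcement describes: a natural-language question plus the table schema go to a locally hosted model, which returns SQL that is then executed in-database. The `local_llm` function, schema, and data are illustrative placeholders, not Kinetica’s actual API, and SQLite stands in for the analytics database.

```python
# Illustrative sketch of a language-to-SQL loop against a local database.
# local_llm() is a placeholder for an in-environment model; no data or
# prompts leave the local process, mirroring the pattern described above.
import sqlite3

SCHEMA = "CREATE TABLE trades (symbol TEXT, qty INTEGER, price REAL)"

def local_llm(question: str, schema: str) -> str:
    # Stand-in for a locally hosted LLM that translates the question,
    # given the schema, into SQL. Here it returns a canned query.
    return "SELECT symbol, SUM(qty * price) AS notional FROM trades GROUP BY symbol"

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [("AAPL", 10, 180.0), ("MSFT", 5, 330.0), ("AAPL", 3, 181.5)])

question = "What is the total notional traded per symbol?"
sql = local_llm(question, SCHEMA)    # natural language -> SQL, locally
print(conn.execute(sql).fetchall())  # run the generated SQL in-database
```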
Cash Treasury Trading in the Age of AI
In this contributed article, Shankar Narayanan, Head of Trading Research at Quantitative Brokers, discusses how, in the era of artificial intelligence, cash treasury trading presents a unique opportunity to integrate new technologies, enhance trading methodologies, and meet the growing demands of a rapidly evolving market.