Research Highlights: Attention Condensers

A group of AI researchers from DarwinAI and the University of Waterloo announced a theoretical development in deep learning: “attention condensers.” The paper describing this advancement is “TinySpeech: Attention Condensers for Deep Speech Recognition Neural Networks on Edge Devices,” by Alexander Wong et al. Wong is DarwinAI’s CTO.

Transform Raw Data to Real Time Actionable Intelligence Using High Performance Computing at the Edge

In this special guest feature, Tim Miller from One Stop Systems discusses the importance of transforming raw data into real-time actionable intelligence using HPC at the edge. The imperative now is to move processing closer to where the data is sourced, and to apply high performance computing edge technologies so that real-time insights can drive business actions.

Expansion of the Edge: The Preeminent Importance of Edge Computing Today

In this contributed article, editorial consultant Jelani Harper discusses how reliance on edge components and edge processing is becoming more critical to the decentralized big data landscape, especially with the ongoing need to communicate remotely. It’s imperative to ensure that edge networks can be remotely provisioned, secured, and kept available for processing at the fringe where applicable, to continue supporting what has become a burgeoning need for the IoT in general.

On the Edge of Something Big

In this contributed article, Tim Parker, VP of Network Strategy at Flexential, provides the top four reasons your organization needs an edge strategy now. Edge computing enables efficient data processing near the source to minimize latency, reduce bandwidth usage and lower costs while improving compliance, security and resiliency.

The Open Edge Architecture Imperative

In this contributed article, Roman Shaposhnik, Co-Founder and VP of Product & Strategy at ZEDEDA, outlines the path forward for edge computing—where, regardless of hardware, developers can create applications that run uniformly on the edge. Constricting the architecture to keep control only works in the short term.

NXP Delivers Embedded AI Environment to Edge Processing

NXP Semiconductors N.V. (NASDAQ:NXPI) announced a comprehensive, easy-to-use machine learning (ML) environment for building innovative applications with cutting-edge capabilities. Customers can now easily implement ML functionality on NXP’s breadth of devices from low-cost microcontrollers (MCUs) to breakthrough crossover i.MX RT processors and high-performance application processors.

Condusiv Technologies Reports: Need for Speed Drives Edge Computing Growth

The actual “edge” in edge computing depends on the application. In telecommunications, it could be a cell phone, or perhaps a cell tower. In manufacturing, it could be a machine on a shop floor; in enterprise IT, the edge could be a laptop. The important thing about edge computing is that it enables data produced by Internet of Things devices to be processed close to where it’s created, rather than sending it to centralized, cloud-based data centers.
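The idea of processing data close to where it is created can be sketched in a few lines. The example below is purely illustrative (the function name, threshold, and data are hypothetical, not from any vendor mentioned here): an edge node condenses a stream of raw sensor readings into a compact summary so that only the summary, not the full stream, needs to travel to a centralized data center.

```python
# Illustrative sketch of edge-side data reduction (hypothetical example,
# not tied to any product in this article): aggregate raw IoT readings
# locally and forward only a compact summary plus anomalies.

def summarize_readings(readings, threshold=75.0):
    """Condense raw readings into a summary dict with flagged anomalies."""
    if not readings:
        return {"count": 0, "mean": None, "anomalies": []}
    mean = sum(readings) / len(readings)
    anomalies = [r for r in readings if r > threshold]
    return {"count": len(readings), "mean": round(mean, 2), "anomalies": anomalies}

# 1,000 raw readings stay at the edge; only this summary is uplinked.
raw = [70.0 + (i % 10) for i in range(1000)]
summary = summarize_readings(raw, threshold=78.0)
```

The bandwidth saving is the point: the cloud receives a handful of numbers instead of every reading, and anomalies still surface immediately where they occur.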

FreeWave Unveils ZumIQ App Server Software to Power IoT Programmability at the Edge

FreeWave Technologies, Inc., a leader in industrial, secure Machine to Machine (M2M) and Internet of Things (IoT) wireless networking solutions, announced the availability of its ZumIQ App Server Software.

FogHorn Systems Brings Advanced Machine Learning Capabilities to Industrial IoT Edge Computing

FogHorn Systems announced the availability of Lightning ML, the newest version of its Lightning™ edge intelligence software platform for the Industrial Internet of Things (IIoT). Lightning ML is now the industry’s first IIoT software platform with integrated machine learning capabilities and universal compatibility across all major IIoT edge systems.