TOP 10 insideBIGDATA Articles for August 2018

In this continuing regular feature, we give all our valued readers a monthly heads-up on the top 10 most viewed articles appearing on insideBIGDATA. Over the past several months, we've heard from many of our followers that this feature enables them to catch up with important news and features flowing across our many channels. We're happy to oblige! We understand that busy big data professionals can't check the site every day.

NXP Delivers Embedded AI Environment to Edge Processing

NXP Semiconductors N.V. (NASDAQ:NXPI) announced a comprehensive, easy-to-use machine learning (ML) environment for building innovative applications with cutting-edge capabilities. Customers can now easily implement ML functionality on NXP’s breadth of devices from low-cost microcontrollers (MCUs) to breakthrough crossover i.MX RT processors and high-performance application processors.

3 Key Infrastructure Considerations for Your Big Data Operation

In this special guest feature, AJ Byers, President and CEO of ROOT Data Center, suggests that if your organization is launching or expanding a Big Data initiative, it would be wise to keep the needs of real estate, power, and uptime top-of-mind. Whether your Big Data operations ultimately reside on-premises, at a colocation data center, or in the cloud, infrastructure that is flexible, scalable, sustainable, and reliable is the foundation for ensuring their success.

DDN Storage Announces Groundbreaking 33GB/s Performance to NVIDIA DGX Servers to Accelerate Machine Learning and AI Initiatives

DataDirect Networks (DDN®) announced its EXAScaler DGX solution, which delivers leading-edge performance using a new optimized, accelerated client that integrates tightly and seamlessly with the NVIDIA DGX architecture. Using the EXAScaler® ES14KX® high-performance all-flash array, the new solution smashed existing records by demonstrating a massive 33GB/s of throughput to a single NVIDIA […]

Will Artificial Intelligence Spark a Chip Cambrian Explosion?

In this feature article, insideBIGDATA's Managing Editor and Resident Data Scientist, Daniel D. Gutierrez, explores the recent resurgence of the field of artificial intelligence (AI), which has upended the leadership positions of the biggest players in the global chip market. It turns out that AI benefits from specific types of processors that perform operations in parallel, and this fact opens up tremendous opportunities for newcomers. The question is: are we seeing the start of a Cambrian Explosion of start-up companies designing specialized AI chips?

Does It Make Sense to Do Big Data with Small Nodes?

In this contributed article, Glauber Costa, Principal Architect at ScyllaDB, busts some myths about the best node size for big data. In this age of big data and powerful commodity hardware there’s an ongoing debate about node size. Does it make sense to use a lot of small nodes to handle big data workloads? Or should we instead use only a handful of very big nodes?

Bitfusion Flex Announces Support for Xilinx FPGAs on AWS F1 Instances

Bitfusion announced that Amazon Web Services (AWS) customers can deploy deep learning workspaces and run inference on AWS F1 FPGA instances with Bitfusion Flex.

New Containers on NVIDIA GPU Cloud Help Developers Instantly Deploy Fully Optimized AI and HPC Software

NVIDIA announced that a new advanced data center GPU — the NVIDIA® Tesla® V100 GPU, based on NVIDIA's Volta architecture — is available through major computer makers and has been chosen by major cloud providers to deliver artificial intelligence and high-performance computing.

FreeWave Unveils ZumIQ App Server Software to Power IoT Programmability at the Edge

FreeWave Technologies, Inc., a leader in industrial, secure Machine to Machine (M2M) and Internet of Things (IoT) wireless networking solutions, announced the availability of its ZumIQ App Server Software.

The Future of Computing and Microchips

The looming demise of Moore's Law has spurred computer scientists and researchers to look for new ways of maintaining growth in processing speed. Promising solutions include cloud computing, deep learning, quantum computing, extreme ultraviolet lithography, and chips that mimic brain functioning. Our friends over at the New Jersey Institute of Technology's Online Master of Science in Computer Science degree program have created the infographic below, which highlights the future of computing and microchips.