Examining Architectures for the Post-Exascale Era

On Wednesday, November 11th, at 9am PST, a group of researchers and industry players on the leading edge of a new approach to HPC architecture will come together to explore the topic in a webinar titled, “Disaggregated System Architectures for Next Generation HPC and AI Workloads.”

Best of arXiv.org for AI, Machine Learning, and Deep Learning – September 2020

In this recurring monthly feature, we filter recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning and deep learning – from disciplines including statistics, mathematics and computer science – and provide you with a useful “best of” list for the month.

NVIDIA Advances Performance Records on AI Inference

NVIDIA today announced its AI computing platform has again smashed performance records in the latest round of MLPerf, extending its lead on the industry’s only independent benchmark measuring AI performance of hardware, software and services.

What’s Under the Hood of Neural Networks?

In this contributed article, Pippa Cole, Science Writer at the London Institute for Mathematical Sciences, discusses new research on artificial neural networks that has added to concerns that we don’t have a clue what machine learning algorithms are up to under the hood. She highlights a new study that compared two completely different deep-layered machines and found, surprisingly, that they did exactly the same thing – a demonstration of how little we understand about the inner workings of deep-layered neural networks.

KDD 2020 Recognizes Winning Teams of 24th Annual KDD Cup

Across four competition tracks, KDD Cup 2020 tackled e-commerce, generative adversarial networks, automatic graph representation learning, automated machine learning, mobility-on-demand (MoD) platforms and reinforcement learning. KDD 2020, the premier interdisciplinary conference in data science, recognized more than sixty winning teams in this year’s KDD Cup competition, which took place virtually Aug. 23-27, 2020.

KDD 2020 Celebrates Recipients of the SIGKDD Best Paper Awards

KDD 2020, the premier interdisciplinary conference in data science (which took place virtually Aug. 23-27, 2020), announced the recipients of the SIGKDD Best Paper Awards, recognizing papers presented at the annual SIGKDD conference that advance the fundamental understanding of the field of knowledge discovery and data mining. Winners were selected from the more than 2,000 papers initially submitted for presentation at the conference.

Best of arXiv.org for AI, Machine Learning, and Deep Learning – August 2020

In this recurring monthly feature, we filter recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning and deep learning – from disciplines including statistics, mathematics and computer science – and provide you with a useful “best of” list for the month.

The State of Data Management – Why Data Warehouse Projects Fail

Based on new research commissioned by SnapLogic and conducted by Vanson Bourne, who surveyed 500 IT Decision Makers (ITDMs) at medium and large enterprises across the US and UK, this whitepaper explores the data management challenges organizations are facing, the vital role data warehouses play, and the road to success.

HuBMAP Inaugural Data Release Puts Detailed Anatomical Data about Seven Human Organs at the Service of Scientists, Public

HuBMAP (the Human BioMolecular Atlas Program) has released its inaugural data for use by the scientific community and the general public. Included in this release are detailed, 3D anatomical data and genetic sequences of healthy tissues from seven organ types, at the level of individual cells as well as many bulk tissue data sets. HuBMAP’s ultimate goal is to provide the framework required for scientists to create a 3D atlas of the human body.

Research Highlights: Attention Condensers

A group of AI researchers from DarwinAI and the University of Waterloo announced an important theoretical development in deep learning: “attention condensers.” The advancement is described in the paper “TinySpeech: Attention Condensers for Deep Speech Recognition Neural Networks on Edge Devices,” by Alexander Wong, et al. Wong is DarwinAI’s CTO.