Research Highlights: Pen and Paper Exercises in Machine Learning

In this regular column, we highlight breaking research of the day in big data, data science, machine learning, AI, and deep learning. For data scientists, it’s important to stay connected with the research arm of the field in order to understand where the technology is headed. Enjoy!

Best of arXiv.org for AI, Machine Learning, and Deep Learning – November 2021

In this recurring monthly feature, we filter the recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning, and deep learning, drawn from disciplines including statistics, mathematics, and computer science, and provide you with a useful “best of” list for the month.

Best of arXiv.org for AI, Machine Learning, and Deep Learning – July 2021

In this recurring monthly feature, we filter the recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning, and deep learning, drawn from disciplines including statistics, mathematics, and computer science, and provide you with a useful “best of” list for the month.

Research Highlights: Attention Condensers

A group of AI researchers from DarwinAI and the University of Waterloo announced an important theoretical development in deep learning centered on “attention condensers.” The paper describing this advance is “TinySpeech: Attention Condensers for Deep Speech Recognition Neural Networks on Edge Devices,” by Alexander Wong et al. Wong is DarwinAI’s CTO.
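
As described in the paper, an attention condenser is a standalone self-attention module that condenses input activations into a compact embedding characterizing joint local and cross-channel relationships, then expands back and selectively gates the original activations. The code below is a minimal, illustrative sketch of that condense-embed-expand-gate structure in PyTorch; the specific layers, sizes, and names are our own assumptions for illustration, not the authors’ exact TinySpeech architecture.

```python
# Illustrative sketch of an attention-condenser-style module.
# NOTE: an assumption-laden reconstruction, not the TinySpeech design;
# all layer choices and hyperparameters here are hypothetical.
import torch
import torch.nn as nn

class AttentionCondenser(nn.Module):
    """Condense -> embed -> expand -> selectively gate the input."""

    def __init__(self, channels: int, reduced: int):
        super().__init__()
        # Condensation: downsample in time and reduce channels to form
        # a compact joint local/cross-channel embedding.
        self.condense = nn.Sequential(
            nn.MaxPool1d(kernel_size=2),
            nn.Conv1d(channels, reduced, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        # Embedding: lightweight mixing in the condensed space.
        self.embed = nn.Conv1d(reduced, reduced, kernel_size=3, padding=1)
        # Expansion: project back to the input channel count.
        self.expand = nn.Conv1d(reduced, channels, kernel_size=3, padding=1)
        # Learned scale applied during selective gating (assumption).
        self.scale = nn.Parameter(torch.ones(1))

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        # v has shape (batch, channels, time).
        a = self.condense(v)
        a = self.embed(a)
        # Upsample back to the input's temporal length.
        a = nn.functional.interpolate(a, size=v.shape[-1], mode="nearest")
        a = torch.sigmoid(self.expand(a))  # attention values in (0, 1)
        return v * a * self.scale          # selective gating of activations

# Usage: gate a batch of 8 feature maps with 32 channels, 100 time steps.
x = torch.randn(8, 32, 100)
out = AttentionCondenser(channels=32, reduced=8)(x)
print(out.shape)  # torch.Size([8, 32, 100])
```

Because the attention is computed in a reduced space before being expanded, a module of this shape uses far fewer parameters than full self-attention over the input, which is the efficiency argument the paper makes for edge devices.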