Impacts of Artificial Intelligence and Higher Education’s Response

Northeastern University and Gallup have just released a fascinating new survey, “Optimism and Anxiety: Views on the Impacts of Artificial Intelligence and Higher Education’s Response,” that gauges public perceptions of artificial intelligence (AI). Taken together, the results read as a wake-up call for higher education: colleges and universities will have to adapt by designing a 21st century curriculum that empowers humans to become “robot-proof.”

Best of arXiv.org for AI, Machine Learning, and Deep Learning – November 2017

In this recurring monthly feature, we filter the recent research papers appearing on the arXiv.org preprint server for subjects relating to AI, machine learning and deep learning – from disciplines including statistics, mathematics and computer science – and provide you with a useful “best of” list for the month.

A Wave of Abundance from Big Ocean Data

In this contributed article, Matthew Mulrennan, Director of the Ocean Initiative at XPRIZE, and Dr. Jyotika Virmani, Senior Director for Planet & Environment at XPRIZE and prize lead for the Shell Ocean Discovery XPRIZE, explain how advancing big data collection in ocean science can improve the business of conservation and protection of our underwater resources, provide early warnings of water quality risks to human health, and lead to new underwater discoveries.

Predicting and Preventing Power Outages Using Big Data

Texas A&M University researchers have developed an intelligent model that can predict potential vulnerabilities of utility assets and present a map of where and when an outage may occur. Dr. Mladen Kezunovic, along with graduate students Tatjana Dokic and Po-Chen Chen, has developed the framework for a model that predicts weather hazards, the vulnerability of electric grids, and the economic impact of the potential damage.
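
The framework reflects the classic risk decomposition: the likelihood of a severe-weather hazard, the vulnerability of an asset given that hazard, and the economic consequence if the asset fails. As a minimal sketch of that decomposition in Python (the asset names, probabilities, and costs below are invented for illustration and are not the Texas A&M model), expected risk can be computed per asset and ranked into a priority map:

    # Expected outage risk per asset: P(hazard) * P(failure | hazard) * cost.
    # All names and numbers are illustrative assumptions, not the actual model.
    from dataclasses import dataclass

    @dataclass
    class Asset:
        name: str
        hazard_prob: float    # P(severe weather at this location) per period
        vulnerability: float  # P(asset fails | severe weather occurs)
        outage_cost: float    # economic impact of the resulting outage, in dollars

    def expected_risk(asset: Asset) -> float:
        """Expected outage cost for one asset over the forecast period."""
        return asset.hazard_prob * asset.vulnerability * asset.outage_cost

    assets = [
        Asset("feeder-12", hazard_prob=0.30, vulnerability=0.15, outage_cost=250_000),
        Asset("substation-A", hazard_prob=0.10, vulnerability=0.05, outage_cost=2_000_000),
    ]

    # Rank assets to build a where-and-when priority map.
    for a in sorted(assets, key=expected_risk, reverse=True):
        print(f"{a.name}: expected risk ${expected_risk(a):,.0f}")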

SKA Signs Big Data Cooperation Agreement with CERN

SKA Organisation and CERN, the European Laboratory for Particle Physics, signed an agreement formalising their growing collaboration in the area of extreme-scale computing. The agreement establishes a framework for collaborative projects that address joint challenges in approaching exascale computing and data storage. It comes as the LHC prepares to generate even more data over the coming decade and as SKA prepares to collect a vast amount of scientific data of its own.

Case Study: More Efficient Numerical Simulation in Astrophysics

Novosibirsk State University is one of the major research and educational centers in Russia and one of the largest universities in Siberia. When researchers at the University were looking to develop and optimize a software tool for numerical simulation of magnetohydrodynamics (MHD) problems with hydrogen ionization, part of an astrophysical objects simulation (AstroPhi) project, they needed to optimize the tool’s performance on Intel® Xeon Phi™ processor-based hardware.

MIT Sloan Professor Builds New Meta-analysis Method to Help Settle Unresolved Debates

Science progresses when researchers build on prior work to extend, test, and apply theories. Aggregating the quantitative findings from prior research – meta-analysis – plays a significant role in advancing science; however, current techniques have limitations: they assume prior studies share similar substantive factors and designs, yet many studies are heterogeneous. A new method, co-created by MIT Sloan School of Management Prof. Hazhir Rahmandad, solves this problem by aggregating the results of prior studies with different designs and variables into a single meta-model.
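
For context, conventional meta-analysis typically pools study effects with inverse-variance weights, which presumes every study estimates the same quantity with a comparable design; that homogeneity assumption is precisely what the new method relaxes. A minimal sketch of the conventional fixed-effect pooling, with invented effect sizes and standard errors:

    # Conventional fixed-effect (inverse-variance) meta-analysis.
    # Effect sizes and standard errors are invented for illustration.
    import numpy as np

    effects = np.array([0.42, 0.31, 0.55, 0.18])  # per-study effect estimates
    ses     = np.array([0.10, 0.08, 0.15, 0.12])  # per-study standard errors

    weights   = 1.0 / ses**2                      # precise studies count more
    pooled    = np.sum(weights * effects) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))

    print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")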

Five Reasons to Attend a New Kind of Developer Event

In this special guest feature, Ubuntu Evangelist Randall Ross writes that the OpenPOWER Foundation is hosting an all-new type of developer event. “The OpenPOWER Foundation envisioned something completely different. In its quest to redefine the typical developer event the Foundation asked a simple question: What if developers at a developer event actually spent their time developing?”

Ohio State Launches High-Performance Deep Learning Project

Deep learning is one of the hottest topics at SC16. Now, DK Panda and his team at Ohio State University have announced an exciting new High-Performance Deep Learning project that aims to bring HPC technologies to the DL field. “Welcome to the High-Performance Deep Learning project created by the Network-Based Computing Laboratory of The Ohio State University. Availability of large data sets like ImageNet and massively parallel computation support in modern HPC devices like NVIDIA GPUs have fueled a renewed interest in Deep Learning (DL) algorithms. This has triggered the development of DL frameworks like Caffe, Torch, TensorFlow, and CNTK. However, most DL frameworks have been limited to a single node. The objective of the HiDL project is to exploit modern HPC technologies and solutions to scale out and accelerate DL frameworks.”
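
The standard recipe for scaling training beyond a single node is data parallelism: each node computes gradients on its own shard of the batch, and a collective allreduce averages them so every replica applies the same update. Here is a minimal sketch of that pattern with mpi4py and NumPy; it is illustrative only and not the HiDL project’s code:

    # Data-parallel gradient averaging with MPI, the core multi-node pattern.
    # Run with, e.g.: mpirun -np 4 python allreduce_sketch.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank computes gradients on its own shard of the batch
    # (random values stand in for a real backward pass).
    local_grad = np.random.rand(1024)

    # Sum gradients across all ranks, then average.
    global_grad = np.empty_like(local_grad)
    comm.Allreduce(local_grad, global_grad, op=MPI.SUM)
    global_grad /= size

    # Every rank now holds the same averaged gradient, keeping replicas in sync.
    if rank == 0:
        print(f"averaged gradient norm: {np.linalg.norm(global_grad):.4f}")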

Video: Why use Tables and Graphs for Knowledge Discovery System?

In this video from the 2016 HPC User Forum in Austin, John Feo from PNNL presents: Why use Tables and Graphs for Knowledge Discovery System? “As the computing tools and expertise used in conducting scientific research continue to expand, so do the enormity and diversity of the data being collected. Developed at Pacific Northwest National Laboratory, the Graph Engine for Multithreaded Systems, or GEMS, is a multilayer software system for semantic graph databases that provides a scalable solution for graph queries over increasingly large data sets. In their work, scientists from PNNL and NVIDIA Research examined how GEMS answered queries on science metadata and compared its scaling performance against generated benchmark data sets. They showed that GEMS could answer queries over science metadata in seconds and scaled well to larger quantities of data.”
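
To make the tables-versus-graphs question concrete: a semantic graph database stores metadata as (subject, predicate, object) triples and answers multi-hop questions by traversing edges rather than chaining large table joins. The toy sketch below illustrates the idea; the triples, predicates, and query are invented and do not reflect GEMS’s actual interface:

    # Toy triple store with multi-hop traversal (illustrative only).
    from collections import defaultdict

    triples = [
        ("paper1", "authoredBy", "alice"),
        ("paper1", "usesInstrument", "massSpec"),
        ("paper2", "authoredBy", "alice"),
        ("paper2", "usesInstrument", "telescope"),
        ("paper3", "authoredBy", "bob"),
        ("paper3", "usesInstrument", "massSpec"),
    ]

    # Index triples by (node, predicate); add inverse edges for backward hops.
    index = defaultdict(list)
    for s, p, o in triples:
        index[(s, p)].append(o)
        index[(o, "inv:" + p)].append(s)

    # Query: which other authors used an instrument that alice used?
    papers      = index[("alice", "inv:authoredBy")]
    instruments = {i for p in papers for i in index[(p, "usesInstrument")]}
    authors     = {a for i in instruments
                     for p in index[(i, "inv:usesInstrument")]
                     for a in index[(p, "authoredBy")]}
    print(authors - {"alice"})  # -> {'bob'}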