Ask a TA: Everything You Need to Know About MIT xPRO’s Online Data Science Course

Eric Bradford, a Teaching Assistant for MIT xPRO’s online data science course, Data Science and Big Data Analytics: Making Data-Driven Decisions, shares his insights into the unique seven-week course, which explores the theory and practice behind recommendation engines, regressions, network and graphical modeling, anomaly detection, hypothesis testing, machine learning, and big data analytics. The next session begins Feb. 4 and is currently enrolling.

Can Online Professional Development Courses Teach Data-Driven Decision Making?

MIT xPRO’s Data Science program is segmented into six modules, each led by a faculty member with expertise, research, and teaching experience in that area. After just seven weeks, learners earn an MIT Professional Certificate in Data Science. This guest post from MIT examines whether online professional development courses, like its xPRO Data Science program, succeed in teaching data-driven decision making.

Be on Top of Key Data Analytic Trends

Emily Washington: “Businesses are increasingly evaluating ways to streamline their overall technology stack… to successfully leverage big data and analytics.” Discover the data analytics trends driving the industry’s rapid growth here.

Interview: Vivienne Sze, Associate Professor of Electrical Engineering and Computer Science at MIT

I recently caught up with Vivienne Sze, Associate Professor of Electrical Engineering and Computer Science at MIT, to discuss the launch of a new professional education course titled “Designing Efficient Deep Learning Systems.” The two-day class will run March 28-29, 2018 at the Samsung Campus in Mountain View, CA, and will explore the latest breakthroughs in efficient algorithms and hardware that optimize power, memory, and data processing resources in deep learning systems.

The Importance of Vectorization Resurfaces

Vectorization offers potential speedups in code with significant array-based computations, speedups that compound the gains already obtained from higher-level parallelism using threads and distributed execution on clusters. Key enablers of vectorization include tunable array sizes, which accommodate varying processor cache and instruction capabilities, and stride-1 (unit-stride) accesses within inner loops.
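The contrast between an element-by-element scalar loop and a vectorized, stride-1 array operation can be sketched in Python with NumPy. This is an illustration of the general principle, not code from any of the articles above; the function names are my own.

```python
import numpy as np

def scale_loop(a, factor):
    # Scalar loop: processes one element per iteration, so the
    # interpreter overhead dominates and SIMD units sit idle.
    out = np.empty_like(a)
    for i in range(a.size):
        out[i] = a[i] * factor
    return out

def scale_vectorized(a, factor):
    # Single array operation over a contiguous (stride-1) buffer:
    # NumPy's compiled inner loop can stream through cache lines
    # and use SIMD instructions where the hardware supports them.
    return a * factor

a = np.arange(1_000_000, dtype=np.float64)
assert np.allclose(scale_loop(a, 2.0), scale_vectorized(a, 2.0))
```

On typical hardware the vectorized form runs orders of magnitude faster for large arrays, precisely because the inner loop touches memory with unit stride and fits the processor's cache and instruction capabilities.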

Five Reasons to Attend a New Kind of Developer Event

In this special guest feature, Ubuntu Evangelist Randall Ross writes that the OpenPOWER Foundation is hosting an all-new type of developer event. “The OpenPOWER Foundation envisioned something completely different. In its quest to redefine the typical developer event the Foundation asked a simple question: What if developers at a developer event actually spent their time developing?”

Dr. Eng Lim Goh on New Trends in Big Data and Deep Learning for Artificial Intelligence

In this video from SC16, Dr. Eng Lim Goh from HPE/SGI discusses new trends in HPC Energy Efficiency and Deep Learning for Artificial Intelligence. “Recently acquired by Hewlett Packard Enterprise, SGI is a trusted leader in technical computing with a focus on helping customers solve their most demanding business and technology challenges.”

Data Analytics, Machine Learning, and HPC in Today’s Changing Application Environment

In this video from the Intel HPC Developer Conference, Franz Kiraly from Imperial College London and the Alan Turing Institute describes why many companies and organizations are beginning to scope their potential for applying rigorous quantitative methodology and machine learning.

insideBIGDATA Guide to Scientific Research

In this new insideBIGDATA Guide to Scientific Research, the goal is to provide a road map for scientific researchers wishing to capitalize on the rapid growth of big data technology for collecting, transforming, analyzing, and visualizing large scientific data sets.

Cornell to Lead Aristotle Cloud Federation for Research

Today Cornell University announced a five-year, $5 million project sponsored by the National Science Foundation to build the Aristotle Cloud Federation, a federated cloud composed of data infrastructure building blocks (DIBBs) designed to support scientists and engineers who require flexible workflows and analysis tools for large-scale data sets.