Project Adam: A New Deep-Learning System

Project Adam is a new deep-learning system modeled after the human brain that has greater image classification accuracy and is 50 times faster than other systems in the industry. Project Adam is an initiative by Microsoft researchers and engineers that aims to demonstrate that large-scale, commodity distributed systems can train huge deep neural networks effectively.

Industry Perspectives from the 2013 O’Reilly Strata + Hadoop World Conference

The O’Reilly Strata + Hadoop World Conference is one of the few conferences that can seriously deliver on the mission of providing a state-of-the-art perspective on the big data industry. Here is a selection of video presentations by industry luminaries that can guide enterprise thought leaders.

Kony, Inc. Selects TIBCO Jaspersoft for Cloud-based Analytics and Reporting

TIBCO Software Inc. (NASDAQ: TIBX) has announced that Kony, Inc., a leading enterprise mobility company, is using TIBCO Jaspersoft® for Amazon Web Services to achieve embedded analytics within its mobile platform. Jaspersoft®, the “Intelligence Inside” applications and business processes, is used by Kony and its customers to monitor, report, and analyze the deployment of mobile applications.

Building a Better Brain: Saffron Cognitive Computing Platform Replicates How We Associate Facts

Saffron Technology has been on a quest since 1999 to replicate the way the human brain learns using associative memory. Saffron is now commercially available as a cognitive computing platform following beta testing for real-time operational risk intelligence and decision support in defense, energy, healthcare and manufacturing applications.

Data Science 101: Cool Hadoop Open Source Projects and How They Work

For up-and-coming data scientists who need to get up to speed on Hadoop architectures, here is another in a long line of compelling Big Data & Brews episodes. In the video below we hear from three Hadoop luminaries about the projects they’ve worked on – Erich Nachbar on Spark, Michael Stack on HBase and Ari Zilka (from Hortonworks) on Stinger. A great insider’s perspective!

Ovum’s Tony Baer on SQL on Hadoop

One of the attractions of the Hadoop Summit 2014 was the Big Data & Brews interview series – “Live from Hadoop Summit.” These short, well-focused discussions always shed light on important industry trends. In the episode below, the conversation turns to the subject of SQL on Hadoop. Stefan Groschupf, the CEO of Datameer, recorded a special interview with Ovum analyst Tony Baer, who gave his thoughts on the topic.

Big Data and Sustainability

The recent Big Ideas for Sustainable Prosperity research conference brought together some of the world’s preeminent environment and economy thinkers for two days to share knowledge and think big about policy innovation for greening growth. In the video presentation below, Dr. Matthew E. Kahn argues that the combination of Big Data and field experiments can sharply improve urban quality of life.

Data Science 101: Hadoop – Just the Basics for Big Data Rookies

With the Hadoop Summit conference coming next week (June 3-5), it might be useful for all newbies to get up to speed with this exciting distributed computing technology. Below is a video presentation that will introduce you to the Hadoop technology that’s taking the enterprise by storm.

Data Science 101: Machine Learning Class at CMU

Here is a great learning resource for anyone wishing to dive into the field of machine learning – a complete class “Machine Learning” from Spring 2011 at Carnegie Mellon University. The course is taught by Tom Mitchell, Chair of the Machine Learning Department.

Big Workflow: Accelerating Insights That Inspire Data-Driven Decisions

Big Workflow is a new industry term coined by Adaptive Computing that refers to technology that accelerates insights by more efficiently processing intense simulations and big data analysis. Big Workflow derives its name from its ability to solve big data challenges by streamlining the workflow to deliver valuable insights from massive quantities of data across multiple platforms, environments, and locations.