Wearables – A Data Scientist’s Dream Come True

Everyone knows that data scientists love data: the more of it, the greater the love. As a result, the surging interest in wearables is just what the doctor ordered, because these electronic devices collect enormous treasure troves of data. In turn, it is the job of data scientists to make sense of it all, unlock its secrets, and assign it economic value. For a data scientist, it is a dream come true!

Big Workflow: Accelerating Insights That Inspire Data-Driven Decisions

Big Workflow is a new industry term, coined by Adaptive Computing, for technology that accelerates insights by processing intense simulations and big data analysis more efficiently. Big Workflow derives its name from its ability to solve big data challenges by streamlining workflows that deliver valuable insights from massive quantities of data across multiple platforms, environments, and locations.
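To make the idea concrete, here is a minimal, hypothetical sketch of that unified-workflow concept: a single job queue dispatched to whichever environment currently has spare capacity. This is an illustration only, under assumed names and data structures; it is not Adaptive Computing's Moab product or any real API.

```python
# Toy illustration of the Big Workflow idea: one unified queue feeding
# jobs to whichever environment has free capacity, so no single platform
# becomes the bottleneck. Hypothetical sketch, not Adaptive's software.
from dataclasses import dataclass


@dataclass
class Environment:
    name: str        # e.g. "on-prem HPC", "private cloud", "public cloud"
    capacity: int    # maximum concurrent jobs this environment can run
    running: int = 0

    def has_room(self) -> bool:
        return self.running < self.capacity


@dataclass
class Job:
    name: str
    cores: int


def dispatch(jobs, environments):
    """Route each queued job to the first environment with spare capacity;
    jobs that cannot be placed anywhere stay queued."""
    placements = {}
    for job in jobs:
        target = next((env for env in environments if env.has_room()), None)
        if target is None:
            placements[job.name] = "queued"  # every environment is saturated
        else:
            target.running += 1
            placements[job.name] = target.name
    return placements


if __name__ == "__main__":
    envs = [
        Environment("on-prem HPC", capacity=2),
        Environment("private cloud", capacity=1),
        Environment("public cloud", capacity=4),
    ]
    queue = [Job("simulation", 64), Job("etl", 8),
             Job("analytics", 16), Job("ml-train", 32)]
    print(dispatch(queue, envs))
```

Run as-is, the sketch fills the on-premise cluster first and spills the remaining jobs into cloud capacity, which is the "single pane of glass across platforms, environments, and locations" behavior the term describes.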

Interview: Adaptive Computing Brings Big Workflow to the Data Center

“Our thought process was that Big Data + a better Workflow = Big Workflow. We coined it as an industry term to denote a faster, more accurate and more cost-effective big data analysis process. It is not a product or trademarked name of Adaptive’s, and we hope it becomes a common term in the industry that is synonymous with a more efficient big data analysis process.”

Big Workflow – Beyond Intelligent Workflow Management

Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. A new white paper that focuses on these issues is available here on insideBIGDATA.

Why Big Data Needs Cloud

“Tackling big data without a cloud-centric worldview is sort of like building a skyscraper without doing a soil study first: you might make some initial progress, but sooner or later you’ll discover that you need to understand and thoroughly adapt an (inadequate) foundation. At a minimum, you’ll experience false starts and thrashing; in many cases, you may never place a capstone.”

Adaptive Computing Introduces Big Workflow to Accelerate Insights

“A Big Workflow approach to big data not only delivers business intelligence more rapidly, accurately and cost effectively, but also provides a distinct competitive advantage. We are confident that Big Workflow will enable enterprises across all industries to leverage big data that inspires game-changing, data-driven decisions.”

Creating Better Infrastructure to Manage Big Data

In this video from SC13, Trev Harmon from Adaptive Computing looks back to the utility computing vision of Douglas Parkhill and proposes an application-centric workflow for the future that fulfills that vision across many disciplines of computing.

Wrangling Big Data Compute Resources with Adaptive Computing

In this video, Chad Harrington from Adaptive Computing describes how the company’s Moab software helps customers wrangle Big Data.

Big Data in the Big Apple – This Week on inside* Publications

We had an amazing time in Lugano at the HPC Advisory Council Switzerland Workshop, but there was a lot else going on here on inside* publications this week. Slidecast: Pentaho Big Data Update – In this episode of the Rich Report, Pentaho CEO Quentin Gallivan describes the company's popular business analytics solutions. Guest Feature: Big Data Software – […]