Interview: Adaptive Computing Brings Big Workflow to the Data Center

“Our thought process was that Big Data + a better Workflow = Big Workflow. We coined it as an industry term to denote a faster, more accurate and more cost-effective big data analysis process. It is not a product or trademarked name of Adaptive’s, and we hope it becomes a common term in the industry that is synonymous with a more efficient big data analysis process.”

Interview: Wise.io Sees Machine Learning Throughout the Entire Customer Lifecycle

“Our main differentiator from other machine-learning companies is that we’re focused not just on high-performance algorithms, but on delivering an end-to-end application for business users. While we continue to push the boundaries of cutting-edge machine learning technology, we made an early decision not to get sucked into the ‘algorithms arms race.’ We hold a fundamental belief that the best analytics technologies will fail unless they can be implemented in a timeframe relevant to the business and interpreted by the ultimate decision makers.”

Interview: Datameer Brings End-to-End Data Analytic Solutions Built on Hadoop

“Datameer is all about providing a self-service, end-to-end experience for big data analytics on Hadoop. From data integration to analytics to visualization, we are wizard-led, point-and-click. Most recently we announced our Smart Analytics module, which allows business users to use data mining algorithms through a drag-and-drop UI. These new capabilities complement what data scientists are doing and enable business analysts to take advantage of advanced algorithms without involving IT.”

Interview: Glassbeam Joins Forces with HDS for Complex Infrastructure Management

“Machine logs contain simple and complex data – some logs contain time-stamped data (e.g., syslogs) that are tactical events or errors used by sys admins to troubleshoot IT infrastructure. But other logs have more complex, unstructured or multi-structured text with sections on configuration info, statistics and other non-time-stamped data. To make sense of the data in these logs, one needs a powerful language and processing engine to provide meaning and structure to the information. Once structure is defined, complex analytics and trend reporting can be performed.”
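The pattern Glassbeam describes, impose structure first and analyze second, can be made concrete with a short sketch. The Python below is illustrative only: the log format, field names, and regular expressions are invented, and it is not Glassbeam’s engine or query language. It parses a mixed log into time-stamped events plus configuration facts, then runs a simple trend count over the structured records.

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical sample: time-stamped syslog-style events followed by a
# non-time-stamped configuration section in the same log file.
RAW_LOG = """\
2014-03-01T08:15:02 kernel: disk sda1 read error
2014-03-01T08:15:07 raid: rebuild started on array md0
2014-03-01T09:02:44 kernel: disk sda1 read error
[config]
controller_model=XC-900
cache_gb=64
firmware=4.2.1
"""

EVENT_RE = re.compile(r"^(\S+)\s+(\w+):\s+(.*)$")   # timestamp, subsystem, message
CONFIG_RE = re.compile(r"^(\w+)=(.*)$")             # key=value pairs

def parse(raw):
    """Split a mixed log into structured events and configuration facts."""
    events, config, in_config = [], {}, False
    for line in raw.splitlines():
        if line.strip() == "[config]":
            in_config = True
            continue
        if in_config:
            m = CONFIG_RE.match(line)
            if m:
                config[m.group(1)] = m.group(2)
        else:
            m = EVENT_RE.match(line)
            if m:
                events.append({
                    "time": datetime.fromisoformat(m.group(1)),
                    "subsystem": m.group(2),
                    "message": m.group(3),
                })
    return events, config

events, config = parse(RAW_LOG)

# Once structure exists, trend reporting is straightforward, e.g. counting
# recurring error messages per subsystem.
trend = Counter((e["subsystem"], e["message"]) for e in events)
print(config["controller_model"], trend.most_common(1))
```

Run as-is, the sketch reports the most frequent structured event (the repeated disk read error) alongside the controller model pulled from the non-time-stamped configuration section.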

Interview: Splunk Brings Machine Data to Higher Education

“Splunk Enterprise is a platform for machine data. The technology delivers fast, powerful analytics that unlock the value of machine data for IT and other users throughout an organization. In short, it’s a simple, effective way to collect, analyze and secure the massive streams of machine data generated by all IT systems and technology infrastructure.”

Intel’s Boyd Davis Talks Predictive Analytics and March Madness

“Intel’s goal is to encourage more innovative and creative uses for data as well as to demonstrate how big data and analytics technologies are impacting many facets of our daily lives, including sports. For example, coaches and their staffs are using real-time statistics to adjust game plans on the fly and throughout the season. From intelligent cameras to wearable sensors, a massive amount of data is being produced that, if analyzed in real time, can provide a significant competitive advantage. Intel is among those making big data technologies more affordable, more available, and easier to use for everything from helping develop new scientific discoveries and business models to even gaining the upper hand on good-natured predictions of sporting events.”

Interview: Active Archives for Managing and Storing Big Data

“Active archives are ideal for organizations that face exponential data growth or regularly manage high-volume unstructured data or digital assets. Target markets include life sciences, media and entertainment, education, research, government, financial services, oil and gas, and telecommunications, as well as general IT organizations requiring online data archive options.”

How MPSTOR Delivers Software Defined Storage Across Multiple Services

MPSTOR integrates virtualization into the software stack to provide better, more robust infrastructure management. “Orkestra enables automated delivery of ‘Anything as a Service,’ allowing cloud operators to create and deliver cost-effective, differentiated services.”

Interview: Caserta Brings New Business Insights via Data Intelligence

“Caserta Concepts is a leading big data innovation and implementation services organization. Among our areas of specialization is the development of solutions that meet the data volume, variety, and speed demands of the financial services sector.”

Interview: How Anaplan Delivers Innovation in Real-Time Data Modeling

“The Anaplan revolution is to provide a big-data engine for business users, removing the need to work with data scientists. The ability to scale your data – 100 billion cells in one model, with 1 billion items in a list – will prove to be the key to proliferation, so long as the data is immediate, usable, consumable via apps, and easy to modify. With Anaplan, business users can build a model with 500 million cells, use it for one hour for a specific purpose, and then throw it away and start on a new one if they want! Ease of use is key. This is the future of enterprise big data.”
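To put those cell counts in perspective, here is a minimal arithmetic sketch. The dimension names and sizes are hypothetical, not an actual Anaplan model; the point is simply that a model’s cell count is the product of its list sizes, so a handful of modest lists reaches hundreds of millions of cells very quickly.

```python
from math import prod

# Hypothetical dimensions (lists) for a planning model; the sizes are invented.
dimensions = {
    "products":   5_000,
    "regions":    200,
    "months":     24,
    "scenarios":  2,
    "line_items": 10,
}

# A model's cell count is the product of its dimension sizes.
cells = prod(dimensions.values())
print(f"{cells:,} cells")  # 480,000,000 cells, near the 500-million-cell example

# Doubling any single list doubles the total, which is why models grow toward
# billions of cells so easily.
```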