
“Above the Trend Line” – Your Industry Rumor Central for 11/21/2016

Above the Trend Line, our machine learning industry rumor central, is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items such as people movements, funding news, financial results, industry alignments, rumors and general scuttlebutt floating around the big data, data science and machine learning industries, including behind-the-scenes anecdotes and curious buzz. Our intent is to provide our readers with a one-stop source of late-breaking news to help keep you abreast of this fast-paced ecosystem. We’re working hard on your behalf with our extensive vendor network to give you all the latest happenings. Heard of something yourself? Tell us! Just e-mail me at: daniel@insidebigdata.com. Be sure to Tweet Above the Trend Line articles using the hashtag: #abovethetrendline.

We’ve had a very active week here at insideBIGDATA, with big data moving forward at a frenetic pace. We’ll start our weekly roundup with new partnerships and alignments: Talena, the always-on big data pioneer, announced a partnership with Arcadia Data, the unified visual analytics and business intelligence (BI) platform for big data. Companies using Talena ActiveRX™ and Arcadia Enterprise in combination can turn typically passive big data backups into active clusters and obtain valuable, actionable business insights. The integration saves companies precious time and IT resources by removing the need to manually move or duplicate large data sets, enabling insights across multiple big data platforms, and eliminating negative performance impacts on production environments. In just three years, it’s expected that more than 35 zettabytes of data will be generated worldwide, with more than 80% of that data stored in enterprise environments. Companies recognize the need to protect these valuable data sets against data loss, and understand how such losses could impact their businesses, yet big data backups are traditionally passive environments used only in the event of a data loss. Talena ActiveRX enables these backup data environments to become active clusters, ready for native data visualization and analysis at any time. With Talena ActiveRX, compute moves to the data layer, eliminating the often-expensive step of moving data yet again to a separate analytics cluster.
The combination of Talena ActiveRX and Arcadia Enterprise enables companies to handle data analytics at any scale, leverage multiple backup copies for trend analysis, and run analytics across backup copies of NoSQL, Hadoop and modern data warehouse environments … XSEDE, the Extreme Science and Engineering Discovery Environment, announced that academic users with XSEDE allocations may access MATLAB and other add-on products from MathWorks on XSEDE service provider supercomputers without having to bring their own license. MathWorks has made this solution widely available in order to enable faculty, students and researchers from diverse institutions around the U.S. to more easily access MATLAB and collaborate on each other’s resources as well as on XSEDE supercomputers … Snowflake Computing, the cloud data warehousing company, announced the launch of its Snowflake Solution Partner Program. This new program recognizes qualified partners who have demonstrated the skills and experience to implement solutions built on Snowflake technology, and provides them with training, resources, and collaboration to maximize their effectiveness. The Snowflake Solution Partner Program helps Snowflake customers find qualified partners who can work with them to accelerate and maximize the benefits of Snowflake. Snowflake Solution Partners bring expertise in strategies, architectures, design principles, and best practices in big data and data analytics, ensuring that customers can rapidly deploy Snowflake to help solve their data analytics challenges … SnapLogic, the unified data and application integration platform as a service (iPaaS), and Snowflake Computing, the data warehouse built for the cloud, announced a new partnership to simplify and accelerate data integration and analytics in the cloud.
The partnership includes technology integration and joint go-to-market activities to help organizations harness all data for new insights, better decisions and better business outcomes. Available immediately, SnapLogic has introduced a series of new pre-built intelligent connectors – called Snaps – which provide fast, self-service data ingestion and transformation from virtually any application or data source to the Snowflake Elastic Data Warehouse. Timely, relevant data can then be quickly and easily analyzed using a variety of analytic tools … Machine learning automation leader DataRobot announced the integration of its enterprise machine learning platform with the Alteryx, Inc. self-service data analytics platform. The Alteryx connector for DataRobot allows users of both platforms to simplify and automate the end-to-end workflow for predictive analytics, from data preparation to modeling to deployment. The connector is now available to download from the Alteryx Analytics Gallery. Predictive analytics can be challenging and complex, often requiring disconnected tools and a variety of programming environments to blend data from disparate sources, build predictive models and then seamlessly deploy them so they can drive real business value. The integration of Alteryx and DataRobot brings together self-service data preparation, blending and advanced analytics from Alteryx with intuitive, automated machine learning from DataRobot … Pivotal announced the fully self-service Pivotal Cloud Foundry on Azure to give customers the ability to run cloud-native Java and .NET applications on Microsoft’s global public cloud infrastructure. Companies including GE, Ford, Dell and Manulife are already taking advantage of the benefits of combining Pivotal’s expertise and open source platform technology with Microsoft Azure infrastructure and services.
This important update aligns with Pivotal’s strategy to work in conjunction with all of the major cloud platforms in the world, as well as being the only technology to fully support all major cloud platforms – an intention Pivotal made clear at launch nearly four years ago. This ability for enterprises to have full control of their cloud strategy is one of the most important steps toward accelerating industry-wide adoption of the cloud … Yhat, a software company working to bridge the technological divide between data scientists and engineers, announced that Lumiata, the AI-powered predictive analytics company, has implemented Yhat’s machine learning deployment platform, ScienceOps. Lumiata is using ScienceOps to incorporate its proprietary health risk algorithms into its predictive tool, the Risk Matrix. The Lumiata Risk Matrix delivers personalized, time-based predictions of an individual’s future health state based on associated clinical conditions or diagnoses, and is delivered via an API. By leveraging Yhat’s ScienceOps, Lumiata’s data science and engineering teams can efficiently work with large health data sets to develop and deploy models that deliver individual and population-level risk predictions.

In new funding news, Domino announced that the company has raised $10.5 million in a funding round led by Sequoia Capital. For Domino, fundraising is simply a means to an end: building the leading data science platform to help companies maximize the impact of their quantitative research.

We learned plenty of people movement news, starting with Sutherland Digital: to further help customers plan and execute their digital process transformation strategies, the company is advancing its capabilities under the direction of Andy Zimmerman, former President of Frog Design. Mr. Zimmerman will serve as president of Sutherland Digital and lead the company’s Telecommunications and Technology business units. Leading with a human-centric design approach, Sutherland Digital has realigned its capabilities to include the company’s design labs expertise along with consulting, big data analytics and delivery platform services. Sutherland Digital clients can now draw upon a comprehensive portfolio of services and technologies to ensure outstanding customer experiences and superior business outcomes. The announcement comes shortly after Sutherland acquired Nuevora, a top 10 big data analytics firm. The acquisition advanced Sutherland’s big data analytics capabilities to address the fundamental challenge many companies face: translating massive amounts of customer data into immediately useful action … MapR Technologies, Inc., provider of the Converged Data Platform, named Jim Kowalski to lead its worldwide sales organization. As Chief Revenue Officer, Jim is responsible for the global MapR go-to-market team, including sales and sales engineering. Jim joins MapR from Marketo, where he most recently led the enterprise organization to significant growth. Prior to that, Jim held a number of senior sales leadership positions at Oracle in both the technology and applications divisions, including its big data and analytics teams. Before Oracle, Jim was Chief Marketing Officer of SmartSignal, an early pioneer in the Internet of Things (IoT), now owned by GE.

In new big data products, services and solutions, we learned about … Striim, Inc., which announced that it has launched a new version of its end-to-end real-time data integration and streaming analytics platform. Coming on the heels of version 3.6, this release further underscores Striim’s commitment to seamless integration with big data and cloud environments. Highlights include integrations with MapR, Kafka 0.9 and Google BigQuery. Striim is also launching the Hazelcast Striim Hot Cache, announced separately … Global supercomputer leader Cray Inc. (Nasdaq: CRAY) announced it has achieved acceptance for “Theta,” the Cray® XC40™ supercomputer, as well as a Cray Sonexion® 3000 storage system, located at the Argonne Leadership Computing Facility (ALCF), a U.S. Department of Energy Office of Science User Facility at Argonne National Laboratory. The Theta system marks Cray and Intel’s first acceptance for a large-scale supercomputer featuring the latest generation of Intel® Xeon Phi™ processors, formerly code-named “Knights Landing.” Theta has a peak performance of more than eight petaflops and is currently running a broad range of scientific applications through the ALCF’s Theta Early Science Program. The Cray XC40 system was delivered to Argonne National Laboratory through a partnership with Intel as part of the DOE’s initiative to build state-of-the-art supercomputers through the Collaboration of Oak Ridge, Argonne, and Lawrence Livermore National Laboratories (CORAL) program …

insideBIGDATA also heard some compelling commentary about recent industry happenings … NoSQL database vendor DataStax announced that it is acquiring open source software cloud provider DataScale. One of DataStax’s biggest competitors, Basho Technologies, has a strong opinion about why DataStax chose to buy DataScale, beyond the obvious reasons like NoSQL industry maturation. Here’s what Basho’s CEO, Adam Wray, had to say:

“At Basho, we are not surprised that DataStax is acquiring DataScale’s services business as a potential path to accelerate revenue growth and commercial differentiation. We speculate this move may have more urgency due to DataStax losing control of Cassandra in the recent fallout with the Apache Foundation. We see this as an attempt by DataStax to find new ways to provide enterprise value because they may not have as much control over the direction of Cassandra in the future.”

Also, here’s the takeaway from Patrick Moorhead of Moor Insights & Strategy on the Intel AI event in San Francisco last week:

AI compute background: There are many flavors of AI: neural networks, LSTMs, belief networks, etc. Neural network AI is currently split between two distinct workloads, training and inference. Generally speaking, training takes much more compute performance and uses more power, while inference (formerly known as scoring) is the opposite. Leading-edge training compute is dominated by NVIDIA GPUs, legacy training compute by CPUs, and inference compute is divided across Intel CPUs, Xilinx/Altera FPGAs, NVIDIA GPUs, ASICs like the Google TPU and even DSPs. Overall: Intel formally threw its hat into the AI ring, which was very important given that the general tech industry sees GPUs as the current driver of AI compute. If Intel can execute on and deliver what it said it would do, Intel will be a future player in AI. To be clear, Intel is part of almost every AI implementation today since you can’t boot a GPU on its own, but in leading-edge installations, GPUs are doing most of the heavy lifting for deep neural net training.

And finally, our Vendor of the Week is Bit Stew Systems, which announced the company has been acquired by GE Digital (NYSE: GE), bringing its leading data intelligence capabilities to Predix, GE’s operating system for the Industrial Internet, and its industrial applications such as Asset Performance Management (APM). Bit Stew’s MIx Core™ is purpose-built to handle complex data integration and analysis across connected devices, OT and IT systems, and external sources. Unlike conventional ETL tools, MIx Core automates data integration by applying machine intelligence to the process, which can reduce project costs by an average of 90% compared to traditional approaches.

 
