“Above the Trend Line” – Your Industry Rumor Central for 5/1/2017


Above the Trend Line: your industry rumor central is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items grouped by category, such as people movements, funding news, financial results, industry alignments, rumors and general scuttlebutt floating around the big data, data science and machine learning industries, including behind-the-scenes anecdotes and curious buzz. Our intent is to provide you with a one-stop source of late-breaking news to help you keep abreast of this fast-paced ecosystem. We’re working hard on your behalf with our extensive vendor network to give you all the latest happenings. Heard of something yourself? Tell us! Just e-mail me at: daniel@insidebigdata.com. Be sure to Tweet Above the Trend Line articles using the hashtag: #abovethetrendline.

We here at insideBIGDATA are looking forward to the upcoming GPU Technology Conference on May 8-11 in Silicon Valley. We’ll be on the floor and around corners soaking up news and rumors for our readers. But in the meantime, let’s talk about all the new VC money finding a home in the Big Data industry … Looker announced it has closed an $81.5 million Series D funding round led by CapitalG, Alphabet’s growth equity investment fund. The round includes additional participation from new investors Geodesic Capital and Goldman Sachs, as well as from Looker’s previous investors Kleiner Perkins Caufield & Byers, Meritech Capital Partners, Redpoint Ventures and Sapphire Ventures. Looker has raised a total of $177.5 million since 2013. Looker is a modern data platform that leverages today’s best data technology to let everyone in an organization make better business decisions using data. The new funding will help Looker continue to innovate with even more intuitive ways for users to access data, expand product functionality, and make deeper investments in its integrations with the most powerful database technologies. Looker will also accelerate its investments in sales and marketing and continue its international expansion, including in Asia Pacific.

New data science educational opportunities keep popping up … Syracuse University recently announced that it is now accepting applications for its newest online program, DataScience@Syracuse. Designed to meet the needs of today’s data science professionals, DataScience@Syracuse will offer an 18-month Master of Science in Applied Data Science, developed in a collaboration between Syracuse University’s School of Information Studies (iSchool) and the Martin J. Whitman School of Management. Taught by esteemed Syracuse faculty, the online program’s interdisciplinary curriculum focuses on delivering organizational insight and driving business strategy by using data capture, management, mining and analysis skills. Syracuse University continues a partnership with 2U, Inc., and DataScience@Syracuse uses 2U’s cloud-based software-as-a-service technology platform. Students and faculty will meet weekly through live online class sessions, and students will complete immersive course content between classes, accessible both online and offline, on computers and mobile devices, from any location. The M.S. in Applied Data Science consists of 36 credits and can be completed in 18 months. Each year, there will be start dates in January, April, July and October with admissions decisions being made on a rolling basis. Applications are now being accepted for the first cohort, which begins in October 2017.

In M&A news, we learned that Infor, a leading provider of business applications specialized by industry and built for the cloud, announced it has reached an agreement to acquire Birst, Inc., a pioneer of cloud-native business intelligence (BI), analytics, and data visualization. Birst is a unique, comprehensive platform for sourcing, refining, and presenting standardized data insights at scale to drive business decisions. The Birst business intelligence platform connects the entire enterprise through a network of virtualized BI instances on top of a shared common analytical fabric. Birst spans ETL (extract, transform, and load), operational reports, dashboards, semantic understanding, visualization, smart discovery, and data blending to form a rich, simplified end-to-end BI suite in the cloud. In the 2017 Gartner Critical Capabilities for Business Intelligence and Analytics Platforms report, published March 2 and covering products from 26 vendors, Birst received one of the four highest scores in four of the five use cases assessed. Birst scored highest for the OEM or Embedded BI (4.15 out of 5) and Extranet Deployment (4.18 out of 5) use cases. It received the third-highest score in the Agile Centralized BI Provisioning (3.80 out of 5) use case, and the fourth-highest score in the Governed Data Discovery (3.59 out of 5) use case.

Christina Noren, CPO at Interana, offered us some comments about the Infor/Birst acquisition:

Moves like Infor buying Birst will likely be replicated more, with incumbent analytics players being acquired by application software players because they tend toward prescriptive analytics that depend on a fixed model of the business. That comes most consistently from packaged applications. Independent analytics vendors need to provide more flexibility and exploratory capability for digital businesses to understand what is happening in their proprietary services vs. analytics closely tied to the cookie cutter aspects of the business embodied in packaged applications. There is a split happening here.

We kept our ears open for new partnerships, alignments and collaborations starting with Accenture, the global professional services company, and the German Research Center for Artificial Intelligence (DFKI) forming an alliance to enable clients to take advantage of Artificial Intelligence (AI) technologies as fundamental elements of their innovation strategies, helping to shape the future of their organizations. Accenture Analytics, part of Accenture Digital, will combine its deep analytics expertise with DFKI’s specialized AI research capabilities to further the adoption of these new technologies in Germany and beyond. Together, Accenture and DFKI will provide clients with direct access to innovative AI technologies, helping them understand the potential of applying AI in their organizations and guiding the implementation of new solutions through the adoption of best practices to unlock tangible new value and growth opportunities … The Industrial Internet Consortium® (IIC), the world’s leading organization transforming business and society by accelerating the Industrial Internet of Things (IIoT), and the Industrial Value Chain Initiative (IVI), a Japan-based forum for smart manufacturing in connected industries, announced they have signed a memorandum of understanding (MoU). Under the agreement, the IIC and the IVI will work together to align efforts to maximize interoperability, portability, security and privacy for the industrial Internet … Supercomputer leader Cray Inc. (Nasdaq: CRAY) announced it has signed a solutions provider agreement with Mark III Systems, Inc. to develop, market and sell solutions that leverage Cray’s portfolio of supercomputing and big data analytics systems. Headquartered in Houston, Texas, Mark III Systems is a leading enterprise IT solutions provider focused on delivering IT infrastructure, software, services, cloud, digital, and cognitive solutions to a broad array of enterprise clients. The company’s BlueChasm digital development unit is focused on building and running open digital, cognitive, and AI platforms in partnership with enterprises, institutions, service providers, and software and cloud partners. Mark III Systems can now combine the design, development, and engineering expertise of its BlueChasm team with the data-intensive computing capabilities of the Cray® XC™, Cray CS™, and Urika®-GX systems, and offer enterprise IT customers customized solutions across a wide range of commercial use cases.

The big data vendor ecosystem also logged a number of important customer wins starting with Rubikloud™, the machine intelligence platform turning omni-channel retailers into modern data-driven innovators, announcing it has selected Microsoft Azure as the preferred platform to power its retail-focused machine learning products. This announcement is significant because it brings terabytes of insightful data to Azure from enterprise customers using Rubikloud’s Rubicore platform, Customer LifeCycle Manager and Promotion Manager products. It also comes on the heels of Rubikloud’s decision to deploy Azure for its existing partnership with A.S. Watson Group (ASW), the largest international health and beauty retailer in Asia and Europe. Following the successful implementation of Azure for A.S. Watson, Rubikloud selected Azure as a preferred platform for future enterprise retail customers given Azure’s flexibility and support in managing hundreds of terabytes of retail data. Rubikloud’s packaged cloud-based products have extracted and integrated hundreds of terabytes of data from its retail customers’ legacy systems, unlocking astonishing efficiency for merchandising, loyalty programs, dynamic pricing, stock-out reduction, and more … MapR Technologies, Inc., the provider of the Converged Data Platform enabling organizations to create intelligent applications that fully integrate analytics with operational processes in real time, announced that NorCom, a full-chain supplier for big data solutions, has selected the MapR Converged Data Platform to serve as a foundation for its autonomous driving applications that leverage deep learning technologies. The partnership enables joint customers to deploy deep learning frameworks that can provide fast and reliable analysis in critical compute environments. NorCom leverages a purpose-built deep learning framework for the automotive industry. To take full advantage of it, the company needed a way to efficiently manage the massive data sets generated by sensors and cameras in self-driving cars. Running containerized deep learning applications on the MapR Platform provided the required speed, scale and reliability to successfully analyze continuous data in an autonomous driving environment and achieve the benefits of deep learning … Information Builders, a leader in business intelligence (BI) and analytics, information integrity, and integration solutions, announced that State Volunteer Mutual Insurance Company (SVMIC), a single-line medical professional liability insurer, has selected Information Builders’ analytics solutions for the insurance market. Business leaders at SVMIC believe the new software will improve the company’s data reporting to better serve doctors, surgeons, and other medical practitioners throughout Tennessee and the surrounding states. Information Builders’ industry-specific BI solution will bring actionable data to SVMIC experts in underwriting, claims, risk management, and other parts of the business.

In the people movement department, we learned that MapR Technologies, Inc., the provider of the Converged Data Platform enabling organizations to create intelligent applications that fully integrate analytics with operational processes in real time, announced that George Roberts has joined the company’s Board of Directors. Mr. Roberts joined the MapR Board of Directors in March 2017 and serves on the Nominating and Corporate Governance Committee and as the Chair of the Compensation Committee. Previously, Mr. Roberts served as the Executive Vice President of North America at Oracle Corporation and as a member of the Oracle Executive Committee. He currently serves on the Board of Directors of a number of privately held companies.

In patent news, we heard that a new patent for “whole brain” systems for autonomous robotic control has been issued by the U.S. Patent Office to Neurala, the software company that invented The Neurala Brain, a deep learning neural networks platform. This new invention will enable AI to function more like a human brain because it integrates multiple brain areas. Human brains integrate sight, sound and other senses when making a decision, but existing AI systems do not. Traditional AI systems are engineered by first implementing separate subsystems (for instance, visual and auditory perception, spatial navigation and obstacle avoidance) and then attempting to integrate them into a unified system. In biological brains, the different senses work together to achieve a task. For example, a brain may consider the sight and sound of a moving car to estimate its position and place that car in a “mental map” of the world. A whole brain AI system, which acts like the human brain, will be significantly better at performing complex tasks because of this native integration that enables different senses and modules to complement each other’s deficiencies and shortcomings.
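The cross-sensory idea is easier to picture with a toy example. The sketch below is emphatically not Neurala’s patented method, just a generic illustration of multi-sensor fusion: two noisy estimates of a car’s position, one from vision and one from audio localization, are combined by weighting each sensor by its confidence, so the fused estimate is better than either sense alone.

```python
import numpy as np

def fuse_estimates(means, variances):
    """Inverse-variance weighted fusion of independent sensor estimates.

    Each sensor reports a position estimate (mean) plus its uncertainty
    (variance); more confident sensors get proportionally more weight.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances
    fused_mean = np.sum(weights * means) / np.sum(weights)
    fused_variance = 1.0 / np.sum(weights)
    return fused_mean, fused_variance

# Hypothetical readings: vision places the car at 12.0 m (low noise),
# audio localization says 14.5 m (higher noise).
position, variance = fuse_estimates(means=[12.0, 14.5], variances=[0.5, 2.0])
print(f"fused position: {position:.2f} m (variance {variance:.2f})")
```

A whole-brain system obviously goes far beyond fusing two point estimates, but the principle is the same: each modality compensates for the others’ blind spots.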

Microsoft recently announced upgrades to its sales software that integrates data from LinkedIn. Behavioral analytics startup Interana’s CEO and co-founder Ann Johnson sees this move as an important step toward rethinking the way people access their data and the insights gleaned from it.

Organizations save precious resources, including time and money, when they give 100 to 300 people access to the company’s data vs. one to three. This idea of putting data in the hands of more people across every department of an organization, not just the data scientists, and providing them with the tools to use it, is the premise Interana is built on, and our founders are long-time proponents of this approach. As data-informed decision making becomes more of a business-critical advantage, this latest move from Microsoft will hopefully set an example for other organizations looking to maximize the value of their data.

And finally, our friends over at Dataiku provided a short piece on how open source tools may become the lifeblood of enterprise data science projects:

Data Team Harmony Lies with Open Source

Building an effective data team can come at a high cost, yet open source tools may be the key to creating harmony and potentially reducing both short-term and long-term costs.

Harmony and analytics are two terms not often found together, especially when it comes to getting a team of diverse data professionals to execute data science chores – together. A data science team is often made up of people from diverse backgrounds, with diverse skill sets – from the machine learning specialist, to the master Python coder, to the beginning data analyst. Successfully building and executing a data science project of any size requires harmony across all of the team members. Everyone needs to work effectively and efficiently, using the tools they know best.

What’s more, the growing deficit of data scientists, along with the closed nature of many analytics tools, makes building effective teams even more difficult. That said, all is not lost. There exists a vast ecosystem of open source tools available to the masses, which can help level the playing field and bring data analytics capabilities to professionals of all stripes. Yet, much like the cola wars of the ’80s, there is an almost infinite variety of flavors and formulas that drive tastes, at least when it comes to standardizing analytics tool sets.

This is a conundrum that can only be solved by creating harmony among team members and their tools of choice. However, harmony means many things to many people; in the case of data science, it takes the form of people being able to interact with their tools of choice, as well as having some mechanism to orchestrate those tools. Naturally, orchestration and harmony cannot happen without a conductor, and in the world of data science, that conductor takes the form of a software platform that fuels interoperability and tears down barriers.

Dataiku digitizes that conductor with Dataiku Data Science Studio (DSS), a platform that embraces the ideologies of open source technologies and bridges those technologies together to give teams choices while promoting collaboration. Dataiku DSS connects to more than 25 different data storage systems, including closed source and open source technologies such as SQL Server, HDFS, and a range of NoSQL databases.

Dataiku DSS also supports numerous programming languages and frameworks (Python, R, Spark, etc.), allowing data professionals to work with the programming tools of their choice and still have connectivity to the data shared by the team. Critical features such as team knowledge sharing, change management, and project monitoring further fuel collaboration while eliminating silos of operation.
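As a rough illustration of what that shared connectivity can look like from the Python side, here is a minimal sketch using Dataiku’s Python API as exposed inside DSS; the dataset and column names are hypothetical.

```python
import dataiku  # Python package available inside Dataiku DSS

# Read a dataset a teammate prepared elsewhere in the project
# (the dataset name "customers_prepared" is hypothetical)
customers = dataiku.Dataset("customers_prepared")
df = customers.get_dataframe()

# Work in plain pandas, as a Python-minded team member would
top_spenders = (
    df.groupby("customer_id", as_index=False)["order_total"]
      .sum()
      .sort_values("order_total", ascending=False)
      .head(100)
)

# Write the result back so R users, analysts, and dashboards can pick it up
output = dataiku.Dataset("top_spenders")
output.write_with_schema(top_spenders)
```

An R user could read and write the same datasets from an R recipe, which is the sense in which the platform lets each person keep their tool of choice while the team shares one source of data.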

Dataiku DSS’s platform approach centralizes open source elements, creating an environment where team knowledge is shared and never lost when teams are reconfigured. What’s more, integrated to-do lists, document sharing, and unified logs make it easier to onboard new team members, as well as to perform forensics on previous projects.

Simply put, open source tools may become the lifeblood of enterprise data science projects, but without proper orchestration, that lifeblood, as well as communal knowledge, is sure to be lost over a short period of time.

 

Sign up for the free insideBIGDATA newsletter.
