“Above the Trend Line” – Your Industry Rumor Central for 12/9/2019


Above the Trend Line: your industry rumor central is a recurring feature of insideBIGDATA. In this column, we present a variety of short time-critical news items grouped by category such as M&A activity, people movements, funding news, industry partnerships, customer wins, rumors and general scuttlebutt floating around the big data, data science and machine learning industries including behind-the-scenes anecdotes and curious buzz. Our intent is to provide you a one-stop source of late-breaking news to help you keep abreast of this fast-paced ecosystem. We’re working hard on your behalf with our extensive vendor network to give you all the latest happenings. Heard of something yourself? Tell us! Just e-mail me at: daniel@insidebigdata.com. Be sure to Tweet Above the Trend Line articles using the hashtag: #abovethetrendline.

So much gossip as we approach year-end! Let’s start with some new funding news … Cerebri AI, which develops and sells CVX – one of the most sophisticated customer experience (CX) platforms in the world – using AI and reinforcement learning to drive customer engagement and financial success, announced a capital infusion of $7 million from new investor Arcis Capital Partners and existing investors … Tray.io, whose General Automation Platform enables citizen automators in any business role to build enterprise-class integrations and automation themselves in a low-code environment, has closed an October $50M Series C funding round just five months after its previous raise at the end of April 2019. The round is led by Meritech Capital alongside existing investors Spark Capital, GGV Capital, and True Ventures, who participated over their pro rata amounts.

And in new M&A activity we heard … Snow Software, a leader in technology intelligence solutions, announced it has acquired Embotics, a hybrid cloud management company. This acquisition brings together two market leaders, enabling CIOs to understand and manage their full technology stack from software and hardware to infrastructure and applications, regardless of whether they live on-premises, in the cloud or in a hybrid environment.

In the new customer wins category, we have … Information Builders, a leader in business intelligence (BI), analytics, and data management solutions, announced that the California State University (CSU), the nation’s largest four-year public university serving more than 481,000 students, has selected Information Builders’ cloud-based Omni-Gen™ Master Data Management (MDM) solution to consolidate and cleanse data across its 23 campuses … dotData, the company focused on delivering full-cycle data science automation and operationalization for the enterprise, announced that Seiko Epson Corporation (“Epson”), a global technology leader and innovator across multiple categories including communications, wearables, and robotics, has selected dotData to accelerate and democratize data science across its organization. Epson deployed dotData Enterprise as part of its AI and Analytics Platform strategy … Hyperion X, created in 2019 as the data and technology division of Hyperion Insurance Group, a leading insurance group with employee ownership at its heart, has selected Reltio Cloud as the core client data platform to power its data and analytics capabilities and strengthen its competitive position in the insurance sector … Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), and Cerner Corporation, a global healthcare technology company, announced that Cerner has selected AWS as its preferred cloud, artificial intelligence (AI), and machine learning (ML) provider. Cerner will use AWS’s broad portfolio of services, including ML, analytics, and the Internet of Things (IoT), to help create the next chapter of healthcare’s digital age, which will focus on advancing the patient care experience, improving the health of populations, and reducing the per capita cost of healthcare. Cerner will also use AWS services to increase efficiencies, lower operational burdens, and create new technologies for care providers to interact with patients.

We also learned of a number of new partnerships, collaborations, and alignments … DiA Imaging Analysis Ltd., an IBM Alpha Zone Accelerator alumni startup, announced a collaboration with IBM Watson Health, a leading provider of innovative AI, enterprise imaging, and interoperability solutions used by medical professionals worldwide. The IBM Imaging AI Marketplace will offer DiA’s FDA-cleared, AI-powered cardiac ultrasound software, designed to help clinicians analyze cardiac ultrasound images automatically. Analyzing ultrasound images is often a visual process that can be challenging and highly dependent on user experience. DiA’s solutions address this challenge by helping clinicians objectively and accurately analyze ultrasound images, reducing the subjectivity associated with visual interpretation … Bedrock Analytics, an intelligent analytics and insights automation platform for the Consumer Packaged Goods (CPG) industry, announced it has joined the Nielsen Connect Partner Network. As a network partner, Bedrock now offers streamlined access to insights, visualization, and reporting to customers who buy CPG data from Nielsen (www.nielsen.com), the global leader in consumer and market data. The partnership provides mutual Nielsen and Bedrock customers with sophisticated data visualization tools purpose-built to extract crucial insights and help them create compelling sales presentations for their retail partners … GoodData®, a leader in cloud BI and analytics, announced an agreement with NTT DATA, one of the world’s largest IT services companies. Through the deal, NTT DATA secured the opportunity to resell GoodData’s analytics platform, helping NTT DATA’s customers around the globe better leverage their increasingly vast data assets.
In addition, GoodData’s analytics platform will power NTT DATA’s iQuattro® Industrial Internet-of-Things (IIoT) platform being deployed by leading Japanese manufacturers to run smart factories, develop smart supply chains and dominate the 21st Century factory floor.

In people movement news we learned … TIBCO Software Inc., a global leader in integration, API management, and analytics, announced that Scott Roza has been appointed its new president & global head of customer operations. In this role, Roza will lead TIBCO’s global sales, alliances, professional services, and customer excellence functions. He will report to TIBCO’s chief executive officer, Dan Streetman.

We also received a reaction to AWS’ new managed Cassandra service from Instaclustr CTO Ben Bromhead:

“We view AWS’ announcement of a managed Cassandra service as further validation of our position that Cassandra is enterprise-grade without the need for proprietary software or ‘open core’ versions. With AWS’ involvement, we expect this will add to market excitement around, and the continued adoption of, fully open source Cassandra as the most highly scalable database for enterprises. At Instaclustr we expect to see continued customer growth as overall Cassandra interest continues to swell. Cassandra is powerful, but can be complex to operate and we are very comfortable in our position as the most experienced operators of this technology at significant scale.”

2020 Trends/2019 Year-in-Review

“The crucial skill of human beings, as opposed to any other creature, is the ability to imagine the future, to plan for it and shape it,” commented Chris Nicholson, CEO of Skymind. “More than any other species, we are good at imagining possibilities, and planning for contingencies. You could say that this ability to partially predict and control future events is what has made humans the dominant species on the planet. Scientific discovery and the scientific method, for the last few hundred years, have made us better and better at understanding cause and effect, and using causes to create the effects we want. AI can help us do that even better. Specifically, a combination of deep learning and reinforcement learning has been shown to find the best sequence of actions through complex situations to a goal we seek to attain. Threading our way through life, and choosing the actions that will get us where we need to go, is probably the most basic human challenge. And the AI that is emerging now can help us do that better and better. For the last few years, think tanks like DeepMind and OpenAI have applied deep reinforcement learning to really complex, hard-to-solve video games, and they have beaten those games. The trend line is clear: AI is getting better at solving strategic problems in scenarios where it has partial information. Those same algorithms are being applied to real-world problems that range from weather predictions to load-balancing on wind farms to reducing the cooling costs of data centers. As those algorithms are combined with business simulations, and massive compute in the cloud, we believe that large organizations will come to rely on them to solve strategic problems that get to the heart of their plans. 
Leaders and planners will use those algorithms to answer the question “What should I do next?” when they are confronted with expensive decisions like how to optimize their factories, where to build their next warehouse, or how to staff their organization on a daily basis to respond to the world’s demands. So this type of AI is stepping in to fill a fundamental need. It will augment our ability to navigate an uncertain future, and guide us down the narrow path to our goals.”

“Object Storage will be Key to Processing AI and ML Workloads,” commented Jon Toor, CMO of Cloudian. “As data volumes continue to explode, one of the key challenges is how to get the full strategic value of this data. In 2020, we will see a growing number of organizations capitalizing on object storage to create structured/tagged data from unstructured data, allowing metadata to be used to make sense of the tsunami of data generated by AI and ML workloads. While traditional file storage defines data with limited metadata tags (file name, date created, date last modified, etc.) and organizes it into different folders, object storage defines data with unconstrained types of metadata and locates it all from a single API, searchable and easy to analyze. For example, a traditional X-ray file would only have metadata describing basics like creation date, owner, location and size. An X-ray object, on the other hand, could use metadata that identifies patient name, age, injury details and which area of the body was X-rayed, making it much easier to locate via search. Object storage will be instrumental in helping to process AI and ML workloads in 2020 as this newer storage architecture leverages metadata in ways traditional file storage doesn’t.”
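The X-ray example above can be sketched in a few lines of code. The `ObjectStore` class below is a hypothetical in-memory stand-in, not any real object-storage API; production object stores (Amazon S3, for instance) attach user-defined key/value metadata to each object in a broadly similar way.

```python
# Traditional file storage: a fixed, limited set of attributes.
file_record = {
    "name": "scan_0042.dcm",
    "created": "2019-12-01",
    "modified": "2019-12-03",
    "size_bytes": 524288,
}

class ObjectStore:
    """Hypothetical in-memory store: objects carry unconstrained metadata."""

    def __init__(self):
        self._objects = {}

    def put(self, object_id, data, **metadata):
        # Any key/value pairs may be attached as metadata.
        self._objects[object_id] = {"data": data, "metadata": metadata}

    def search(self, **criteria):
        # Return IDs of objects whose metadata matches every criterion.
        return [
            oid for oid, obj in self._objects.items()
            if all(obj["metadata"].get(k) == v for k, v in criteria.items())
        ]

store = ObjectStore()
store.put(
    "xray-001",
    b"<image bytes>",
    patient="Jane Doe",
    age=54,
    body_area="left wrist",
    injury="hairline fracture",
)
store.put("xray-002", b"<image bytes>", patient="John Roe", age=31, body_area="chest")

# Rich metadata makes the object findable by domain attributes,
# which a bare filename and timestamp cannot support.
print(store.search(body_area="left wrist"))  # ['xray-001']
```

The contrast is the point Toor makes: the file record above can only ever be found by name or date, while the object can be located by any attribute someone thought to tag it with.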

“Data will enable growth … or ruin you,” commented Eric Raab, SVP of Product and Engineering at Information Builders. “There is more information available today than ever before, and that growth will only continue as organizations continue to invest in data-producing technologies like the IoT. This sheer volume of data is reaching a critical mass, and 2020 will see organizations split into two groups: those for which this is a great opportunity and those for which this is a great threat. Those that have invested in the solutions to manage, analyze and properly act on their data will have a clearer view of their business and the path to success than has ever been available to them. Those that have not will be left with a mountain of information that they cannot truly understand or responsibly act upon, leaving them to make ill-informed decisions or deal with data paralysis.”

“Algorithms over apps – In 2020, companies will focus less on shipping traditional applications and more on selling AI use cases,” commented Zinier CEO Arka Dhar. “They’ll offer customers AI models for a specific use case (e.g., diagnosing repair needs in 5G infrastructure) and separate models for a different use case (e.g., determining when oil and gas infrastructure needs to be retired). Organizations will rely less on one-size-fits-all apps and instead leverage highly specialized models for custom use cases, which will ultimately deliver better results.”

“Increased operationalization of AI/ML Yields Business Value … The data explosion is at its peak and becoming more mainstream across all industries – the supply chain is no exception,” commented Dr. Madhav Durbha, Group Vice President of Industry Strategy at LLamasoft. “Next year, AI and ML will move beyond their current hype cycle to offer more tangible use cases that deliver real business value. Here are a few examples of AI applications that will take off in 2020: (1) Predicting Volatile Order Patterns: AI and ML will give companies the ability to predict less stable, highly volatile order patterns from customers. The supplier community is seeing increased volatility in demand signals due to an uptick in order volumes from leading online retailers. Predictability in ordering is a significant challenge, and AI models perform far better in these situations; (2) Market Sensing: AI can help harness the power of external causal data such as weather, GDP, CPI, employment levels, industrial production, etc., as a better predictor of market shifts and demand drivers, bringing better sensory capabilities into the supply chain, product portfolio, capital expenditure decisions, and long-term strategic and capacity planning; (3) Chargeback Reduction: Retailers charge hefty penalties to brand owners for missed OTIF (On Time in Full) deliveries. Deep learning algorithms allow sifting through key shipment data including order types, times, quantities, locations and transportation modes to identify root causes for chargebacks and predict points of failure.”
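The market-sensing idea above boils down to regressing demand on external causal signals. A minimal sketch, using ordinary least squares and invented illustrative numbers (the temperature, confidence-index, and demand values below are not from any real dataset):

```python
import numpy as np

# Hypothetical weekly demand history alongside two external causal
# features: temperature and a consumer-confidence index.
temperature = np.array([30.0, 35.0, 40.0, 45.0, 50.0, 55.0])
confidence = np.array([95.0, 96.0, 98.0, 97.0, 99.0, 101.0])
demand = np.array([120.0, 135.0, 150.0, 158.0, 170.0, 185.0])

# Design matrix with an intercept column, then least-squares fit.
X = np.column_stack([np.ones_like(temperature), temperature, confidence])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)

# Predict next week's demand from forecast external signals
# (forecast temperature 60, forecast confidence index 102).
next_week = np.array([1.0, 60.0, 102.0])
prediction = float(next_week @ coef)
print(round(prediction, 1))
```

Real market-sensing systems would use far richer models and feature sets, but the structure is the same: demand is explained by exogenous drivers rather than by its own history alone.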

“We will see advancements in ‘explainable AI’ and more trust in AI across the board,” commented DarwinAI CEO Sheldon Fernandez. “Advancements in explainable AI will continue in 2020 and beyond as new standards are developed around the technical definition of explainability, slowly followed by new technologies to address the explainability problem for business leaders and non-technical audiences. In real estate, for example, offering a compelling explanation for why a mortgage application was rejected by an AI-driven platform will eventually be a necessity as AI adoption continues. Although we’ll see evolving technical tools and standards, progress on layperson tools will be slower, with some narrow and domain-specific solutions (e.g., non-technical explainability for finance) emerging first. Like the general public’s understanding of ‘the web’ in the 90s, awareness, understanding and trust in AI will gradually increase as the capabilities and use of the technology spread.”

“Consolidation of analytics and machine learning companies … We’re more than a decade into a massive transformation in the world of data platforms, analytics and machine learning, and some winners are emerging,” commented Okera CTO and co-founder Amandeep Khurana. “This year, we’ll see a consolidation of companies and technologies via acquisitions as well as the merger of projects and initiatives. It’ll be the beginning of the trend of consolidation that will likely accelerate into 2021.”

“Consolidation of power and platforms will accelerate in 2020 … The explosion of AI efforts will be accompanied by a growing trend toward consolidation of data science organizational power,” commented Joshua Poduska, Chief Data Scientist, Domino Data Lab. “The idea of setting up an internal data science practice is not new, and most companies have already invested here. Executives realize that the competitive advantage of the next five years will belong to those who can build the best data science flywheel. Integrating model insights into decision flows and significantly increasing the number of quality machine learning (ML) and AI projects are the high-level keys to success in building this flywheel, but the former is harder than the latter. In order to get lucrative models integrated into the fabric of the business, leaders are seeking a better process – best practices and workflows for collaborative data science that start with the end in mind. Efforts to increase the funnel of AI projects lean on recent technology advancements, specifically a new class of enterprise software called Data Science Platforms, which remove dev ops barriers to model research and deployment. They also facilitate collaboration and reproducibility, two key elements of running effective modern data science teams. With access to better centralized platforms, data scientists will be significantly more productive, but business leaders will be slower to define and enforce the processes needed to ensure that work gets successfully into production to improve decision making. While the particular implementation details will vary, this trend of consolidation of power and platforms will also accelerate in 2020.”

“Customizable approaches to deep learning will make or break AI applications,” commented Max Versace, PhD, CEO and co-founder, Neurala. “Traditional approaches to deep learning can be tedious and time consuming due to the need for massive amounts of data and for models that must be retrained over and over again. Moreover, data is often not available online or is confidential to one organization, so it cannot be combined with others to create massive AI systems. In 2020, we’ll see the emergence of new paradigms and approaches to deep learning to solve these challenges. One example of a new approach to DNN is Lifelong-DNN™: an approach that reduces the data requirements for AI model development and enables continuous learning in the cloud or at the edge to ease the AI development process. With new approaches to DNN training, organizations can build AI systems using their own data to bypass issues of data privacy. As a result, enterprises and companies looking to deploy AI solutions can leverage L-DNN for real-world applications in visual detection, recognition and classification, either using L-DNN alone or in tandem with traditional approaches to DNN.”

“The Fate of AI Depends on AI: The 2010s closed out with an AI frenzy – with marketing hype and spending around the technology at an all-time high,” commented Sean Knapp, founder and CEO of Ascend. “While organizations worldwide are anticipated to spend more than $1.8 trillion annually by 2021 on big data and AI-driven digital transformation efforts, many will struggle to translate those investments into business success. This is due to insufficient resources and expertise to support data initiatives, difficulty accessing siloed data, and an increased urgency for fast analysis and delivery. We’ve seen this type of technology gold rush before, and unless we address the core issues at hand, we will be doomed to fail. The fate of AI will depend on AI itself, or rather the ability to use automation to ensure successful AI and big data projects. I expect that advancements in automated data and delivery systems in the coming decade will help businesses increase their success rates in AI and big data initiatives across industries.”

Sign up for the free insideBIGDATA newsletter.
