“Above the Trend Line” – Your Industry Rumor Central for 12/17/2019

Above the Trend Line: your industry rumor central is a recurring feature of insideBIGDATA. In this column, we present a variety of short, time-critical news items grouped by category such as M&A activity, people movements, funding news, industry partnerships, customer wins, rumors and general scuttlebutt floating around the big data, data science and machine learning industries, including behind-the-scenes anecdotes and curious buzz. Our intent is to provide you with a one-stop source of late-breaking news to help you keep abreast of this fast-paced ecosystem. We’re working hard on your behalf with our extensive vendor network to give you all the latest happenings. Heard of something yourself? Tell us! Just e-mail me at: daniel@insidebigdata.com. Be sure to Tweet Above the Trend Line articles using the hashtag: #abovethetrendline.

We’ve been receiving a high volume of 2020 predictions leading up to year-end. Please check out a handful of these commentaries from the big data vendor ecosystem at the end of this column. For now, let’s start with some late-breaking M&A news … US semiconductor giant Intel has acquired Israeli startup Habana Labs, a developer of artificial intelligence processors, for $2 billion, the company announced. Founded in 2016, Habana Labs develops processor platforms that are optimized for training deep neural networks and for inference deployment in production environments. The company is headquartered in Tel Aviv and has offices in California, Poland, and China. Intel led a $75 million investment in Habana Labs in 2018. That year, Habana unveiled its Goya inference processor, which it says is ideally suited for the most demanding AI applications in the industry, including private and cloud data centers, autonomous vehicles, factory and warehouse automation robots, and high-end drones. In 2019, Habana announced a new processor, the Habana Gaudi, which the startup said delivers an increase in throughput of up to four times over systems built with the equivalent number of GPUs … Qualitest, the software testing and quality assurance company, has acquired AI and machine learning company AlgoTrace for an undisclosed amount. This acquisition marks the first step of Qualitest’s growth strategy following an investment from Bridgepoint earlier this year. The acquisition will allow Qualitest to radically expand the number of AI-powered testing solutions available to clients, as well as develop its capabilities in helping companies test and launch new AI-powered solutions with greater confidence and speed.
According to Gartner, as software grows in complexity and the pressure to launch faster and more frequently increases, companies that do not use AI to enhance their quality assurance will be at a significant disadvantage … Instana announced that the company has acquired three technologies in the advanced application performance management area to continue to enable Instana’s customers to build, deliver and operate better-performing software and services faster. The acquisitions augment Instana’s next-generation microservices and Kubernetes performance management solution to deliver high-frequency cloud application metrics, recognize and analyze complex system signals, perform code-level profiling and further automate root-cause analysis. Some of these innovations include: StackImpact – the first polyglot production application profiler; BeeInstant – a leading solution in the emerging area of high-frequency metrics analysis in large-scale cloud environments; and Signify – a forward-looking solution to provide insight into complex system signals … DataRobot, a leader in enterprise AI, announced that it has entered into a definitive agreement to acquire Paxata, the pioneer of self-service data preparation and leading data fabric provider, to fulfill its mission to build the world’s first automated end-to-end enterprise AI platform. While the massive impact of AI on enterprises is well understood — PwC forecasts that AI could contribute $15.7 trillion to the global economy by 2030 — companies must overcome several key challenges associated with AI in order to reap the benefits and become successful. Data preparation is one area that has historically held companies back.
Creating a dataset for training predictive models, deploying data prep steps with AI models, and preparing data specific to AI routines are all major challenges companies face when it comes to leveraging data at scale … Syncsort closed the acquisition of the Pitney Bowes software and data business, creating a powerhouse data management software company with more than 11,000 enterprise customers, $600 million in revenue and 2,000 employees worldwide. The new company brings an unmatched combination of scale, agility and breadth of portfolio to empower leading enterprises to gain a competitive advantage from their data … Accenture announced an agreement to acquire Clarity Insights, a leading provider of data science and AI/ML engineering capabilities for large enterprises, and a strategic partner to clients across a range of industries, particularly healthcare, financial services and insurance. This acquisition will boost Accenture’s data science, AI/ML engineering, and deep industry talent in North America, and add a portfolio of accelerators to Accenture’s Applied Intelligence practice.

We received a commentary on the DataRobot/Paxata acquisition news from Trifacta CEO Adam Wilson:

“DataRobot’s acquisition of Paxata further validates the strong market opportunity for data prep. The need for data prep for reporting/analytics is well established and this acquisition in particular highlights the critical role of data prep for AI/ML. Trifacta’s own strategic relationships with Google Cloud and IBM have shown the need for prep in cloud data lakes/warehouses and data catalogs respectively. The business value that data preparation brings to organizations will only continue to grow as this market evolves and Trifacta’s strategy to provide a stand-alone platform for a diverse set of downstream use cases will allow us to capitalize on this expansion.”

In new funding news we learned … Imply, the real-time analytics company, announced that it has raised $30 million in funding led by Andreessen Horowitz’s Late Stage Venture Fund with participation from Geodesic Capital and Khosla Ventures. The primary equity financing brings the company’s total funding to $45.3 million. The round was opportunistic as the company had spent less than 10% of its Series A round while growing ARR 8-fold over the 24 months preceding the funding. The financing will be used to accelerate product development and the company’s go-to-market expansion. Imply was founded in late 2015 by the original authors of Apache Druid (incubating), a popular open source database that delivers sub-second query and high-speed ingest on enormous datasets where instant insight is critical, such as risk and fraud data, supply chain events, user behavior data, Internet of Things (IoT) device interactions, and much more. Imply delivers a complete solution with Druid as its engine, adding an intuitive and responsive analytics UI, cloud-native deployment and management, performance monitoring, and enterprise-grade security … Anyscale, the distributed programming platform company, today announced $20.6M in Series A funding, led by Andreessen Horowitz (a16z) with participation from NEA, Intel Capital, Ant Financial, Amplify Partners, 11.2 Capital, and The House Fund. With the funding, Anyscale will expand its leadership team and amplify its contribution to the open source community. With the power of Ray, Anyscale simplifies distributed programming. Applications built with Ray can easily be scaled out from a laptop to a cluster, eliminating the need for in-house distributed computing expertise and resources. Supported by a rich ecosystem of libraries and applications, Anyscale helps software developers and machine learning engineers scale their applications quickly and easily.
With these tools, Anyscale removes the barriers to entry for building scalable distributed applications that have held organizations back from reaping the benefits of all the recent advances in artificial intelligence (AI).
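The programming model behind Ray is task parallelism: ordinary functions are marked as remote tasks, invoked asynchronously, and scheduled across whatever cores or machines are available. As a rough illustration of that idea, here is a standard-library-only Python sketch (the function name `featurize` is a hypothetical stand-in; Ray’s actual API uses `@ray.remote` decorators and, unlike this local executor, transparently scales the same code from a laptop to a cluster):

```python
# Illustrative sketch of the task-parallel model described above, using only
# the Python standard library rather than Ray itself.
from concurrent.futures import ThreadPoolExecutor

def featurize(record):
    # Stand-in for an expensive per-record computation.
    return record * record

with ThreadPoolExecutor(max_workers=4) as pool:
    # Submit tasks asynchronously; each submit returns a future immediately,
    # and results are collected as the workers finish.
    futures = [pool.submit(featurize, r) for r in range(8)]
    results = [f.result() for f in futures]

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The appeal of the Ray approach is that the same submit-and-gather pattern keeps working when the pool of workers becomes a multi-machine cluster, with no changes to the application code.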

We also have news of a number of new partnerships, collaborations, and alignments … Proscia, a leading provider of AI-enabled digital pathology software, and Johns Hopkins School of Medicine, one of the leading academic medical centers in the U.S., will collaborate on the development of computational applications that incorporate artificial intelligence (AI) to advance the practice of pathology for multiple diseases. Disease-specific AI applications help drive efficiency, productivity, and quality in tissue diagnosis. This is critical in overcoming the subjectivity inherent in traditional pathology and in addressing the world’s looming pathologist shortage. AI also has the power to tap into data unseen by the human eye to reveal clinically important tissue patterns. Training a successful AI system for pathology requires diverse, high-quality pathology data. Diverse data helps ensure an AI system is accurate across a wide variety of diseases, methods of biopsy, preparation of tissue, tissue dyeing procedures, and digital scanning processes … AI-driven augmented data management provider Promethium announced it has partnered with Starburst for an end-to-end solution that allows analysts to get answers quickly and easily across the most complex data environments. Users can now easily leverage Presto to run federated queries across multiple databases, data warehouses, and data lakes such as Hadoop through a user-friendly UI with automated data discovery and automated SQL query generation uniquely powered by AI and natural language processing.

In the people movement category we learned … HPE is pleased to announce that Keith White, a proven cloud strategy leader, is joining the company to lead the newly created HPE GreenLake business unit. Keith, along with the company’s talented team, will help propel HPE GreenLake to even greater heights. Keith comes to HPE with more than 20 years at Microsoft, where he played a central role in developing their end-to-end, go-to-market, and cloud business strategy. In his most recent role as head of their Intelligent Cloud, Worldwide Commercial Business, Keith played a key role in driving the significant growth of Azure over seven consecutive years. He also brings extensive experience in worldwide field sales and marketing, partner ecosystem development and business strategy leadership – all of which will be key elements of his role here at HPE … Sigma Computing, an innovator in cloud analytics and business intelligence (BI), announced the appointment of Cristina Bravo Olmo as the company’s Vice President of Marketing and Latha Colby as the Vice President of Engineering. Bravo Olmo has extensive B2B marketing experience, a legacy of successful go-to-market strategies, and a proven ability to lead high-performance teams at companies like Wrike, Zendesk, Marketo, and Trend Micro. She will oversee all aspects of marketing at Sigma. Colby has vast experience building data management and analytics products at both startups and established companies, including Workday, Platfora, and ParAccel. At Sigma, she will expand the engineering team and guide the product vision … Collibra, the Data Intelligence company, announced the appointment of Madan Gadde as Chief Customer Officer, a new role for the company. Gadde will lead the company’s efforts in supporting more than 400 global customers spanning financial services, technology, utilities, retail, healthcare and more. 
With more than 20 years of experience driving customer-centric technology solutions globally, Gadde most recently served as Senior Vice President (SVP) Customer Success at DataStax, a leading provider of database software for hybrid and multi-cloud distributed applications. Prior to that, Gadde led customer service offerings at FIS as SVP of worldwide professional services. Gadde also held various leadership positions at HP, Genesys and Mercury Interactive … Information Builders, a leader in business intelligence (BI), analytics and data management solutions, announced the appointment of Keith Kohl as senior vice president, Product Management. In this role, Kohl will assume responsibility for all product management functions and serve as a liaison between product, sales and marketing teams. With more than 30 years of overall experience and a proven, quantifiable track record of results in fast-growing, global enterprise software companies, Kohl will be responsible for Information Builders’ product strategy and roadmap. Leveraging his extensive product management expertise, Kohl will define the solutions that Information Builders creates and the strategy behind them. He will ensure products have market relevance and oversee all associated aspects of pricing, bundling, and industry-specific functionality.

Lastly, we heard from the Association for Computing Machinery (ACM) Special Interest Group on Knowledge Discovery and Data Mining (SIGKDD), which announced that KDD 2020 has opened its call for papers in both the research and applied data science tracks.

“As the largest interdisciplinary conference for data science, we are excited to open the call for papers in both the research and applied data science tracks for our 2020 conference in San Diego! Every year, we are consistently blown away by the caliber of papers spanning big data, data analytics, statistical methods, deep learning, machine learning and more, and know that the highly selective process will only get harder,” said Rajesh Gupta, KDD 2020 conference co-chair and founding director of the Halicioglu Data Science Institute.

2020 Trends/2019 Year-in-Review

“I believe that AIOps will develop as an umbrella capability that will integrate DataOps, ModelOps and DevOps together,” commented Radu Miclaus, Director of Product, AI, and Cloud, Lucidworks. “These processes will enable IT to better support the entire continuum of data pipelines, model development, performance testing, model deployment, model monitoring and model refinement at the same pace that DevOps has been operating in the recent past. Eventually it will lead to semi or fully automated systems that enable self-learning and continuous improvement.”

“The ‘low code/no code’ movement will begin giving DevOps a run for its money,” commented Sazzala Reddy, Datrium CTO and Co-founder. “There is growing momentum towards ‘low code’ or ‘no code’ ideology because it’s hard to find and retain talented developers without paying a huge price. The next phase of application development will rely on using external SaaS applications for doing most of the work except in a few narrow core use cases. This will reduce the burden of having to hire expert developers and maintain sophisticated code.”

“AI unlocks intelligent experiences in customer support,” commented Philip Say, VP of Innovation Product Management, Sutherland Labs. “AI is disrupting almost every industry, from manufacturing to finance – and in 2020 the increasing impact of AI will make its way to customer support (CS) centers, allowing businesses to empower CS agents with real-time feedback and guidance during customer interactions. This shift will allow brands to resolve issues and find answers much more quickly, unlocking a new level of more intuitive, intelligent customer experiences.”

“AI will compete more strenuously against … AI, fueling monopolistic practices and reducing competitive situations (a key early example of this includes the homogenization of air travel pricing),” commented Tyna Callahan, Senior Director of Communications & Product Marketing at Scality. “To be ready for what the fourth (and fifth) industrial revolution brings, the division between what requires ‘humans’ and what does not will accelerate, so we will continue to see the divvying-up of those tasks and functions that require humans, and those that AI does well. As time goes on, humans will do what requires care, creativity and artisanship; and everything else will be automated. 2020 will see this division of ‘labor’ accelerate.”

“Big Data is a relative term, and a moving target,” commented Dan Sommer, Senior Director, Global Market Intelligence Lead at Qlik. “With the rise of infinitely scalable cloud data warehouses, the mysticism around analytics on massive amounts of data is gone. We are moving into a distributed data approach, or what we’d like to call the ‘wide data’ era. Data formats are becoming more varied, resulting in a corresponding fragmentation of databases. Synthesizing these different sources together again will be essential in seeing the larger data picture in 2020 and will be a step forward to improve data literacy within the organization.”

“As we see more instances of AI in 2020 and a continued development of facial recognition technologies, we will begin to see government entities declaring privacy regulations on what data businesses can and cannot use,” commented Chris Downie, Chief Executive Officer of Flexential. “For example, some countries, such as China, are racing to showcase who has the best AI technology. They are measuring their students’ brain waves with IoT sensors during class to provide teachers with more customizable content to achieve better retention and results. In the next year, we will see which governments will allow this level of technology within municipalities and discuss the cyber vulnerabilities associated with this type of use case.”

“In 2020, artificial intelligence and machine learning will become more embedded into applications as a core component of business process flow,” commented Gunther Rothermel, EVP and Head of SAP Cloud Platform. “We’ll see a transformation from business process management of the past (BPM) to intelligent business process management (IBPM). With conversational AI (CAI) and robotic process automation (RPA) embedded directly into business process flow, all users will look at processes and be empowered to modify them to become more efficient. As a result, we can expect simplified workflows and an overall increase in productivity.”

“Cross-functional teams will drive excellence in AI practices: Cross-functional teams, sometimes referred to as ‘Centers of Excellence’ (CoEs), will empower organizations to create impactful AI projects in 2020,” commented Zachary Jarvinen, Head of Technology Strategy, AI and Analytics at OpenText. “These teams will represent the entire organization and will include individuals with business knowledge, IT experience, and specialized AI skills, such as data engineers, data scientists, subject matter experts, and project managers. The role of these teams will be to identify use cases and manage a digital platform that supports collaboration on key business initiatives. They must also partner with the right vendor who has the tools and expertise needed to help the organization kickstart a successful AI journey. Combining internal and external resources will be imperative to building and executing powerful AI projects that see the light of day and provide real business value, instead of getting locked in some corner of the office.”

“What the world is calling AI today will split into several areas in 2020, which someone in marketing will inevitably create pithier names for,” commented Cheryl Wiebe, Practice Lead, Industrial Intelligence Consulting at Teradata. “These include: Robotic Process Automation (RPA); automated feature engineering and selection; perception AI, which is the automation and refinement of physical perception; and resource allocation AI, the marriage of optimization technologies to sense and respond to demands in real-time. AI will begin to improve the process of data management itself. For example, for system resource allocation, for automated feature engineering, for operational metadata collection, and for better knowledge management (such as tagging).”

“There will be some impediments to the spread of ML,” commented Moshe Kranc, CTO, Ness Digital Engineering. “The severe manpower shortage of skilled ML engineers will make it difficult for second-tier companies to keep up. Transparency will continue to be an issue. So long as humans cannot understand why an ML agent made a particular choice, it will be difficult to convince humans to give these ML agents the autonomy to make choices on their own. Bias in AI algorithms will continue to be a major ethical concern. Can we rely on insights derived via training data that may express historical bias against women, the elderly or minorities?”
