
insideBIGDATA Latest News – 3/10/2020

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Bright Computing Announces Bright Cluster Manager for Data Science Now Available at No Charge with Easy8

Bright Computing, a global leader in high performance cluster management software, announced that Bright Cluster Manager for Data Science is now available at no charge as part of the Easy8 program. Launched in November 2019, Easy8 is designed to put Bright’s award-winning cluster management software in the hands of every organization working with high-performance Linux clusters. Easy8 offers the full-featured Bright Cluster Manager software free for up to 8 nodes, and now includes Bright Cluster Manager for Data Science. Bright Cluster Manager automates the process of building and managing heterogeneous Linux clusters that span from your on-premises data center to the cloud and to the edge.

“With Easy8, our goal is to give every organization the ability to use our software, free of charge for clusters up to 8 nodes, to demonstrate how powerful and easy Bright Cluster Manager software is,” said Bill Wagner, CEO of Bright Computing. “With the inclusion of Bright Cluster Manager for Data Science in the Easy8 program, we are giving organizations the ability to quickly deploy cluster-based machine learning environments for data scientists and researchers that can be easily maintained and scaled out.”

Datical Continues Investment in Liquibase with New Capabilities

Datical, a leading provider of database release automation solutions, announced the release of Targeted Rollback capabilities for Liquibase, the rapidly growing open-source tool that helps application developers track, version and deploy database schema changes quickly and safely. This Liquibase Pro feature saves developers time by letting them undo a specific database change without rolling back everything deployed after it. Additionally, Datical recently released an all-in-one graphical installer feature available to the entire Liquibase community.
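For context, rollback in open-source Liquibase is driven by the changelog: each changeset can declare how it is undone. The fragment below is an illustrative sketch (table and column names are invented for the example), following the standard Liquibase XML changelog format:

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <!-- Example changeset: add a column, with an explicit rollback instruction -->
  <changeSet id="add-loyalty-tier" author="jane">
    <addColumn tableName="customer">
      <column name="loyalty_tier" type="varchar(20)"/>
    </addColumn>
    <!-- Executed when this changeset is rolled back -->
    <rollback>
      <dropColumn tableName="customer" columnName="loyalty_tier"/>
    </rollback>
  </changeSet>
</databaseChangeLog>
```

The Pro Targeted Rollback capability builds on this mechanism by letting a single changeset, identified by its id, author, and path, be rolled back on its own rather than as part of a sequential rollback (exact command names and flags vary by Liquibase version).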

“Safely and predictably rolling back changes is a critical capability for the community of Liquibase users,” said Dion Cornett, president of Datical. “We appreciate the great feedback from the community, and with this release, we’re thrilled to respond to three-quarters of our community requesting more advanced rollback features. Community-driven innovation, such as Targeted Rollback, helps development teams innovate quickly.”

Tableau 2020.1 Delivers Community-Driven Features, Including Dynamic Parameters, Visualization Animations and Administrative Tools

Tableau Software, a leading analytics platform, today released new capabilities to help people unlock more interactivity in their dashboards and get deeper insights from their data. Tableau 2020.1 is the latest update driven by customer input in the online Tableau Community Forums, adding features requested by Tableau customers and increased flexibility for administrators. New features include Dynamic Parameters, which eliminate the need to update workbooks when the underlying data changes, and viz animations, which give viewers a new tool for understanding transitions between data points and engaging with the data. Tableau 2020.1 also provides additional capabilities for Tableau administrators through Login-Based License Management and improvements to the Data Management Add-on and Server Management Add-on.

“Since day one, we’ve put customers at the center of our product development. Our extensive and passionate Tableau Community helps steer our product innovation and keeps us focused on solving our customers’ most pressing challenges,” said Francois Ajenstat, Chief Product Officer at Tableau. “This release brings some of our Community’s most-requested updates together, empowering our customers to make analytics more interactive and engaging and allowing them to focus on tasks that drive real value and transformation across their organizations.”

Netreo Releases AIOps: Autopilot to Automatically Identify and Repair Deviations in IT Infrastructure Configurations

Netreo, the award-winning solution for IT management and one of Inc. 5000’s fastest-growing companies, announced the release of AIOps: Autopilot, the first product featuring data models that combine artificial intelligence (AI) and machine learning (ML) technology with 20 years of network management system (NMS) configuration and monitoring data. This combination gives it the intelligence to automatically discover, configure and tune the entire monitoring environment, allowing threshold baselines, event correlation rules, dependency mapping, and many other configurations to evolve and improve the longer Netreo is deployed.

“The key to transforming IT into a strategic weapon for an organization is to employ tools that can observe your whole technology stack, analyze the data found, and act on those findings,” said Netreo President Andrew Anderson. “AIOps: Autopilot was designed and developed with that idea in mind. It takes care of learning and automatically configuring the monitoring environment, so there are never any blind spots. As a result, engineering teams can take complete visibility for granted and spend their time engineering, not tuning their tools.”

Four Twenty Seven Announces its Physical Climate Risk Application

Four Twenty Seven, an affiliate of Moody’s and the leading publisher of climate data for financial markets, announced the release of a new on-demand climate risk scoring tool. This application responds to the financial sector’s growing call for the seamless integration of granular, forward-looking climate data into investment decisions and risk management practices.

Users are able to enter location and other data via an intuitive interface and immediately receive information on their assets’ exposure to floods, sea level rise, hurricanes and typhoons, heat stress, and water stress through mid-century. The application allows users to browse and download detailed facility scorecards that include data on the underlying risk drivers for each hazard. It also enables users to toggle between maps and tables to identify regional trends and multi-hazard exposure. Users can perform analyses for large volumes of locations via an API and integrate the outputs into downstream risk management and portfolio analysis applications.
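Scoring large volumes of locations via an API typically means chunking the asset list into request-sized batches on the client side. The sketch below illustrates that pattern in Python; the field names and batch size are assumptions for illustration, not Four Twenty Seven’s documented schema:

```python
import json

def batch_assets(assets, batch_size=100):
    """Split a list of asset locations into request-sized batches.

    Illustrative only: the record fields and the batch size are hypothetical,
    not taken from Four Twenty Seven's actual API.
    """
    return [assets[i:i + batch_size] for i in range(0, len(assets), batch_size)]

# A portfolio of 250 hypothetical asset locations
assets = [{"id": n, "lat": 29.76, "lon": -95.37} for n in range(250)]

batches = batch_assets(assets)
print(len(batches))                    # 3 batches: 100, 100, and 50 assets
print(json.dumps(batches[0][0]))       # first record, serialized for a request body
```

Each batch would then be submitted to the scoring endpoint and the per-facility scores merged back into the portfolio analysis downstream.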

As the material financial impacts of climate change become increasingly evident, understanding and preparing for climate risks is essential. Real estate investors can use Four Twenty Seven’s physical climate risk app for due diligence and proactive risk management across their portfolio of properties. Portfolio managers can leverage the application to report climate risk exposure and enhance portfolio decision-making. Asset owners can evaluate long-term risk exposure and engage with corporations and managers to improve resilience. Banks can score thousands of locations at once to identify risk in commercial and residential lending portfolios. Corporations can identify risk hotspots and opportunities to build resilience in their global operations.

“We are excited to bring our on-demand physical climate risk application to the market. Our app provides access to sophisticated climate model outputs in easily understandable metrics with just a few clicks,” said Four Twenty Seven’s Founder & CEO, Emilie Mazzacurati. “Real-time access to forward-looking, location-specific data on climate risk enables investors, banks and corporations to manage their risk and invest in resilience.”

Plotly Introduces Next Evolution of Dash Enterprise with Focus on Kubernetes and High Availability Environments

Plotly Technologies, developer of the leading data science and AI platform for creating analytic applications, unveiled Dash Enterprise 3.4. Dash enables data science teams to quickly build and share interactive analytic applications across and beyond their organizations through a repeatable, scalable process. New and expanded capabilities in Dash Enterprise 3.4 allow easier deployment in Kubernetes environments and across clusters, and include a full Starter app set to streamline creation of analytic apps.

The new features make Dash Enterprise the ideal choice for organizations using data science at scale or managing data-intensive applications that are mission critical and require high availability. These may range from large-scale AI model explainability to stock ticker applications that must be accessible to hundreds or thousands of people simultaneously. The combination of scale and availability makes Dash Enterprise 3.4 an invaluable solution for data scientists and engineers across industries.
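Dash Enterprise handles deployment details itself, but purely as an illustration of what running a containerized web app on Kubernetes with high availability involves, a hand-written manifest might look like the following (the app name and image are hypothetical, not Plotly artifacts):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: dash-analytics-app          # hypothetical app name
spec:
  replicas: 3                       # multiple replicas for high availability
  selector:
    matchLabels:
      app: dash-analytics-app
  template:
    metadata:
      labels:
        app: dash-analytics-app
    spec:
      containers:
        - name: dash
          image: registry.example.com/dash-analytics:1.0   # hypothetical image
          ports:
            - containerPort: 8050   # Dash's default serving port
```

Running several replicas behind a load-balancing Service is what lets an app stay reachable for hundreds or thousands of simultaneous viewers even if an individual pod fails.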

“Highly visual, interactive, and easy-to-use web applications act as bridges that connect data science investments with meaningful action,” said Chris Parmer, Plotly co-founder and author of Dash. “This latest release of Dash Enterprise helps ensure that anyone within an organization can quickly operationalize key data science outcomes working in Python or R, the programming languages preferred by data scientists, with no compromises and no need for full stack development.”

Yugabyte Scales to New Heights with the Release of YugabyteDB 2.1

Yugabyte, a leader in open source distributed SQL databases, announced the general availability of YugabyteDB 2.1. With new generally available features, including two data center (2DC) deployments for reducing write latency, read replicas for reducing read latency, enterprise-grade encryption enhancements and TPC-C and YCSB benchmarks confirming a 10x increase in performance, YugabyteDB is the only distributed SQL database on the market to provide the speed, scale and performance developers need to deploy global, cloud native applications.

These updates make it easier than ever for organizations looking to take advantage of multi-cloud, or to geo-distribute their data, to use YugabyteDB to develop, deploy and operationalize modern applications with high performance and scale. YugabyteDB 2.1 reduces wide area network (WAN) write latency with the 2DC deployment and read latency with read replicas. Users can configure the database differently, or combine these deployment models, to reduce latency and handle high volumes of data, delighting users and enabling business growth. For enterprise users of the Yugabyte Platform, the commercial self-managed DBaaS offering, creating and managing read replicas is simplified even further.

“Enterprises that deliver modern, internet-scale applications are increasingly turning to distributed SQL databases, like YugabyteDB, so they no longer have to give up the data modeling flexibility and transactional capabilities of SQL in the process of going cloud native,” said Kannan Muthukkaruppan, co-founder and CEO, Yugabyte. “YugabyteDB 2.1 builds on our vision to enable unprecedented application agility by offering greater deployment flexibility, improved performance and enhanced security controls so that businesses and their applications can fully embrace geo-distribution and multi-cloud to solve important challenges and fuel their growth.”

Matillion Delivers Effortless Data Ingestion with General Availability of Matillion Data Loader

Matillion, a leading provider of data transformation software for cloud data warehouses (CDWs), announced the general availability of Matillion Data Loader. The SaaS data integration solution enables data analysts and data-savvy business professionals to easily integrate and access the data they need to perform analytics, improve data availability, and fuel business growth. This free offering from Matillion helps enterprises continuously extract and load data into their chosen cloud data warehouse, eliminating data silos and providing business users easy access to insights.

“Throughout their data journey, companies not only need to quickly and easily move data into the cloud but also transform that data to extract its true business value,” said Matthew Scullion, CEO of Matillion. “Matillion Data Loader delivers a code-free, data loading solution that works seamlessly with our data transformation solution, Matillion ETL. Our platform supports our customers on their data journey with the flexibility to address simple use cases or more complex data projects so they can grow their business with actionable insights.”

Octopai Announces Support of Tabular on Azure Analysis Services, Netezza and Vertica as well as Expanded Support of OLAP Cubes for Better BI

Octopai, a leader in metadata management automation for BI & Analytics, announced that the company has added support for Tabular models on Azure Analysis Services, IBM Netezza, and Vertica, and has expanded its support of OLAP cubes. These additions enable organizations to understand their complete data flow and gain full control over their data assets through automated data lineage and discovery.

“Our cube customers were once blind in the face of their data, and we are very excited to offer them expanded support of OLAP so that they can finally see the entire data journey with automated data lineage,” said Gal Ziton, CTO. “We are proud to continuously add ongoing capabilities for more and more customers using varying platforms and systems to provide a unified view and map of their data.”

ODPi Announces the OpenDS4All Project

ODPi, a nonprofit Linux Foundation project accelerating the open ecosystem of big data solutions, today announced that OpenDS4All is now an ODPi Live Project. OpenDS4All is an open source project built to accelerate the creation of data science curricula at academic institutions. This project was initiated and funded by IBM, built by the University of Pennsylvania, and brought to life under the governance of The Linux Foundation.

OpenDS4All is a curriculum kit comprising a set of open source building blocks for schools to supplement, strengthen or start up their data science programs. These building blocks are based on Python, open source tools and frameworks, and include slides, documentation, code, and data sets that can be adopted or updated by anyone. By making a “starter set” of training materials available on how to build a data science program, IBM, cross-industry partners, and educators working together can help accelerate the availability of skills-building programs around the world.

John Mertic, Director of Program Management at The Linux Foundation, sees the potential: “Naturally, there was a desire for the project to have open source credibility and reach from the get-go. IBM is a member of The Linux Foundation and ODPi, and recently announced participation in The Linux Foundation’s AI efforts. This project that IBM proposed and funded is revolutionary for developing educational materials. It is a great fit for what The Linux Foundation stands for: shared technology. Shared education is an emerging and important frontier for open source.”

Honeywell Achieves Breakthrough that Will Enable the World’s Most Powerful Quantum Computer

Honeywell (NYSE: HON) announced it has achieved a breakthrough in quantum computing that accelerates the capability of quantum computers and will enable the company to release the world’s most powerful quantum computer within the next three months. The company also announced it has made strategic investments in two leading quantum computing software providers and will work together to develop quantum computing algorithms with JPMorgan Chase. Together, these announcements demonstrate significant technological and commercial progress for quantum computing and change the dynamics in the quantum computing industry.

In a scientific paper that was posted to the online repository arXiv, Honeywell has demonstrated its quantum charge coupled device (QCCD) architecture, a major technical breakthrough in accelerating quantum capability. The company also announced it is on a trajectory to increase its computer’s quantum volume by an order of magnitude each year for the next five years. This breakthrough in quantum volume results from Honeywell’s solution having the highest-quality, fully-connected qubits with the lowest error rates.
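The projected trajectory compounds quickly: a 10x (order-of-magnitude) increase in quantum volume each year for five years works out to a 100,000x total increase. The short sketch below runs the arithmetic; the starting value of 64 is an assumption for illustration, as the article does not state a baseline:

```python
# Compound an order-of-magnitude (10x) annual increase in quantum volume
# over five years. The starting value is a hypothetical baseline.
qv = 64
for year in range(1, 6):
    qv *= 10
    print(f"Year {year}: quantum volume ~ {qv:,}")

print(f"Total growth factor: {qv // 64:,}x")  # 100,000x over five years
```

Whatever the baseline, the cumulative factor after five annual 10x steps is always 10^5.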

“Quantum computing will enable us to tackle complex scientific and business challenges, driving step-change improvements in computational power, operating costs and speed,” said Honeywell Chairman and Chief Executive Officer Darius Adamczyk. “Materials companies will explore new molecular structures. Transportation companies will optimize logistics. Financial institutions will need faster and more precise software applications. Pharmaceutical companies will accelerate the discovery of new drugs. Honeywell is striving to influence how quantum computing evolves and to create opportunities for our customers to benefit from this powerful new technology.”

cnvrg.io Accelerates Enterprise AI Deployment Through Advanced MLOps Solutions Integrated with NVIDIA GPU Cloud Container Registry

cnvrg.io, the data science platform simplifying model management with MLOps and continual machine learning automation, announced its advanced MLOps solution will be integrated with the NVIDIA NGC container registry. Through a full, native integration, cnvrg.io will deliver accelerated enterprise artificial intelligence (AI), machine learning (ML) and data science automated pipelines to enterprise teams in multi-cloud and hybrid-cloud environments.

“At cnvrg.io we are constantly working to give data scientists and data engineers the best possible resources to do what they do best: data science,” said Yochay Ettun, CEO and co-founder of cnvrg.io. “We see great customer traction, and now with the NGC container integration we have another enhancement for our MLOps solution. Any AI/ML use case has the flexibility to run on any compute resource, whether it’s cloud or on premises.”


Comments

  1. David Kielpinski says:

    Technically, it’s not _Honeywell’s_ QCCD architecture. I was lead author on the paper that proposed it in 2001 and there is at least $1B invested in the QCCD industry (better known as trapped-ion quantum computing).
