
insideBIGDATA Latest News – 8/14/2020

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

DataRobot Launches Pathfinder: A Comprehensive Library of 100+ AI Use Cases

DataRobot, a leader in enterprise AI, announced DataRobot Pathfinder, a public library of more than 100 AI use cases that provide the know-how to maximize success and impact based on DataRobot’s experience with thousands of real-world applications of AI. DataRobot Pathfinder offers the most comprehensive library of AI use cases on the market, enabling individuals to discover the best application of AI for their unique business needs.

Identifying and deploying the right use cases is critical for businesses to get real value from AI, yet remains one of the biggest challenges to AI adoption. To help more individuals successfully deploy AI, Pathfinder offers insight into which use cases are commonly implemented across 14 industries to illustrate the variety of problems that AI can solve. For a number of high-value use cases, Pathfinder also provides detailed guides on how organizations can implement them from both a technical and business perspective.

Developed by DataRobot’s customer-facing data science team, Pathfinder is built on decades’ worth of academic and industry experience to help users select and deploy the best AI use cases. The public library makes it easy for users to identify the AI use case that best fits their needs by allowing them to filter by vertical, problem type, and value driver. These use cases include industry-agnostic applications of AI, such as forecasting demand, scoring new leads, and reducing customer churn, as well as industry-specific applications of AI, including reducing hospital readmissions, predicting insurance claim severity, and detecting money laundering.

“We are hyper-focused on enabling massively successful and impactful applications of AI,” said Michael Schmidt, Chief Scientist, DataRobot. “DataRobot Pathfinder is meant to help organizations—whether they’re customers or not—deeply understand specific applications of AI for use cases in their industry, and the right steps to create incredible value and efficiency. There is real depth here that encodes our experience deploying thousands of real industry use cases where AI creates profound impact, and we’re looking forward to hearing how others are able to leverage this to make their applications of AI come to life.”

Tableau 2020.3 Adds External Write to Database, Enhanced Administrator Tools

Tableau Software, the leading analytics platform, announced the general availability of Tableau 2020.3, which delivers the ability for customers to output to and update external databases directly from Tableau Prep Builder, expanding the Tableau platform to serve a broader set of data preparation needs. The latest release also includes new tools for Tableau administrators to simplify the distribution of product licenses to various groups in their organization. Customers can instantly access these features, as well as new integrations with leading database providers, by upgrading to Tableau 2020.3.

“The new capabilities in Tableau 2020.3 continue to extend the breadth and depth of the entire Tableau platform, helping customers to easily scale their analytics further than ever before,” said Francois Ajenstat, chief product officer at Tableau Software. “By introducing the ability to write to an external database, Tableau Prep can now be used for analytics use cases outside of the Tableau platform, from data science to data governance.”

ScaleOut Software Announces the Release of ScaleOut StateServer® Pro

ScaleOut Software announced ScaleOut StateServer® Pro, a new software product release that adds integrated data analytics to the company’s battle-tested ScaleOut StateServer® in-memory data grid (IMDG) and distributed cache to help businesses gain more value from their fast-changing data. 

“With the release of ScaleOut StateServer Pro, we are pleased to make powerful advanced features available to our ScaleOut StateServer users,” said Dr. William L. Bain, founder and CEO of ScaleOut Software. “We are excited to offer the latest ‘big data’ capabilities with integrated, easy to use APIs that enable our customers to uncover important new insights in their data.”

Juice Analytics Launches Next-Generation Data Reporting Platform “Juicebox” in Beta  

Juice Analytics, a data visualization company, announced the launch of the beta program for Juicebox, an innovative SaaS platform for interactive data reporting. Juicebox is an easy-to-use tool for business users to turn static spreadsheets into attractive, interactive reports that guide audiences to action. The new beta version of Juicebox now enables any user to create sophisticated reports that previously could only be created by Juice Analytics’ team of developers. Clients for whom Juice Analytics has produced reports include Accenture, Aetna, HealthStream and University of Notre Dame. 

“We’re excited to make Juicebox more accessible than ever through the launch of our self-service platform,” said Zach Gemignani, Juice Analytics co-founder and CEO. “People who work with data realize it’s critical to translate that data into a form that gets used, so their hard work isn’t wasted. We’re providing a lightweight alternative to complex dashboarding tools, spreadsheets and PowerPoint. People don’t need to settle for flat, boring and ineffective data communication.”

Panasas Launches the New PanFS with Dynamic Data Acceleration Technology to Support Diverse and Changing Workflows in HPC and AI 

Panasas® released Dynamic Data Acceleration on the new PanFS® parallel file system, a proprietary software feature that delivers predictable high performance by automatically adapting to the changing and evolving small file and mixed workloads that dominate today’s HPC and AI landscape. 

Inconsistent performance and a lack of adaptability in the face of change have been a major headache for both application users and storage administrators. PanFS with Dynamic Data Acceleration is the remedy to this headache and the answer for HPC and enterprise IT organizations that are looking for a high-performance, plug-and-play storage solution that keeps up with their pace of change.

“The rate of change in high-performance workloads and the extension of parallel file systems to AI and enterprise use cases call for a file system that is predictably fast, resilient and reliable in the face of change,” said Jim Donovan, chief marketing officer at Panasas. “Adding Dynamic Data Acceleration to the latest version of the PanFS parallel file system on ActiveStor Ultra delivers an HPC storage solution that will remain consistently fast as your workloads change.” 

Qubole Supercharges Capabilities for Data Science and Exploration via RStudio Integration

Qubole, the open data lake company, announced that customers can now enable, access and work with their enterprise-grade RStudio integrated development environment (IDE) directly within the Qubole Open Data Lake Platform. By seamlessly integrating RStudio Server Pro with Qubole, customers will have access to out-of-the-box features and unique managed services that supercharge data science and data exploration workflows for R users while optimizing costs for R-based projects.

Data scientists depend on RStudio as one of the top tools of choice for machine learning, deep data exploration, interactive data analytics and collaboration. With massive amounts of data now traversing the enterprise and becoming more accessible, data scientists and analysts need the power of computational frameworks that work with the R programming language, such as Apache Spark, to quickly make sense of this data and derive actionable insights for their businesses. 

“Users, and especially data scientists, like to work in their environment of choice. Through the RStudio integration, we are empowering data science teams with robust but easy-to-use tool sets to meet their diverse data exploration needs. Qubole is building on its commitment to help businesses scale by addressing an expansive range of use cases on data lakes,” said Ashish Thusoo, CEO and co-founder, Qubole. “While demand for advanced analytics and machine learning applications continues to soar as businesses collect troves of data in their open data lakes and data warehouses, we want to ensure that every customer using the Qubole data lake platform can effectively meet their data science needs. Our native integration with RStudio provides out-of-the-box statistical and graphical analysis functionality to further simplify the end-to-end machine learning workflow.”

Tachyum Shows Prodigy Running Existing x86, ARM, and RISC-V Software

Tachyum™ Inc. today announced that its Prodigy Universal Processor has successfully completed software emulation testing across x86, ARM and RISC-V binary environments. This important milestone demonstrates that Prodigy will enable customers to run their legacy applications transparently at launch with better performance than any contemporary or future ARM or RISC-V processor. Coupled with hyperscale data center workhorse programs such as Hadoop, Apache and more, which Tachyum is recompiling to Prodigy native code, this capability will ensure that Prodigy customers can run a broad spectrum of applications right out of the box. Tachyum customers consistently indicate that they would run 100% native applications within 9-18 months of transitioning to the Tachyum platform to exceed the performance of the fastest Xeon processor. The emulation capability is designed to smooth the transition to native software on Tachyum Prodigy.

Tachyum’s Prodigy can run HPC applications, convolution AI, explainable AI, general AI, bio AI and spiking neural networks, as well as normal data center workloads on a single homogeneous processor platform with its simple programming model. Using CPU, GPU, TPU and other accelerators in lieu of Prodigy for these different types of workloads is inefficient. A heterogeneous processing fabric, with unique hardware dedicated to each type of workload (e.g. data center, AI, HPC), results in under-utilization of hardware resources, and a more challenging programming environment. Prodigy’s ability to seamlessly switch among these various workloads dramatically changes the competitive landscape and the economics of data centers.

“Having a solution that is readily available, easy to use, and backed by massive amounts of software demonstrates the foundation for success of a platform,” said Dr. Radoslav Danilak, Tachyum founder and CEO. “This demonstration of Prodigy’s ability to run software correctly – even legacy code from x86, ARM or RISC-V processors – shows that we will enable customers to seamlessly use the applications they are using today from Day One of Prodigy’s launch. This is another validation of viability for Prodigy and proof of its ability to unlock unprecedented performance, power efficiency and cost advantages across the most challenging computing environments.”

Isima Introduces the First Self-Service, Hyper-Converged Data Platform to Advance the Data-Driven Enterprise

Isima emerged from stealth and announced the general availability of its product, bi(OS)®. The platform delivers unparalleled speed to insight for data app builders in a unified manner, reducing the complete lifecycle of building data apps from months to weeks. This includes adding varied data sources, deriving real-time insights, and deploying to production.

“Monish and I canvassed the fundamentals of data management, distributed storage, and Applied AI to look at the maze created by incremental innovations,” said Darshan Rawal, CEO and co-founder of Isima. “By questioning assumptions of the past three decades, avoiding the bolt-on mindset, and being empathetic to enterprise IT, we built bi(OS) to be Telco-grade from day-one. We are encouraged by the validation we have received from the market, especially in the current environment.”

Artificial Intelligence System Tops One Billion Neurons on a Desktop Computer

In a significant advance in the development of Artificial General Intelligence (AGI), the Brain Simulator II neural simulator successfully tested one billion neurons on a desktop computer built entirely from off-the-shelf components. From a performance perspective, the system processed three billion synapses per second. Brain Simulator II is an open-source software platform for demonstrating the evolution of artificial intelligence (AI) toward AGI.

Seen as another step toward creating brain-level functionality on computers, the spiking neural models used by the Brain Simulator II are more like biological neurons than traditional AI models and contribute immensely to the efficiency of the program. The computer used for this achievement included an AMD Ryzen Threadripper 3990X CPU running at 2.9 GHz (not overclocked) and 128 GB of RAM.

Energy analysis of the neocortex (the higher-level thinking part of the brain) shows that neurons spike, on average, only once every six seconds. This means that its 16 billion neurons generate only roughly 2.5 billion spikes per second in total. The Brain Simulator’s spiking neural model processes only the neurons that spike in a given time interval, rather than all of them, and so can be thousands of times faster than traditional artificial neural networks.
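The event-driven idea behind that speedup can be sketched in a few lines. This is an illustrative toy model, not Brain Simulator II’s actual code; the data structures, threshold, and reset behavior are assumptions for demonstration. The key point is that each tick touches only the neurons that actually fired, not all one billion:

```python
def step_event_driven(spiking, synapses, potentials, threshold=1.0):
    """Advance one tick: propagate spikes only from neurons that fired.

    spiking    -- set of neuron ids that fired this tick
    synapses   -- dict: neuron id -> list of (target id, weight)
    potentials -- dict: neuron id -> membrane potential
    Returns the set of neurons that fire on the next tick.
    """
    next_spiking = set()
    for n in spiking:                       # touch only active neurons...
        for target, weight in synapses.get(n, ()):
            potentials[target] += weight    # ...and only their synapses
            if potentials[target] >= threshold:
                next_spiking.add(target)
                potentials[target] = 0.0    # reset after firing
    return next_spiking
```

Because work scales with the number of spikes per tick rather than the total neuron count, a network where only one neuron in thousands is active per interval runs correspondingly faster than a dense update over every neuron.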

“This machine exceeds the processing performance of the human neocortex,” says Charles Simon, CEO of FutureAI, and principal author of the Brain Simulator II. “The CPU alone contains nearly 40 billion transistors and each transistor is nearly a billion times faster than each of the neocortex’s 16 billion neurons.”

Splice Machine Announces Splice Machine Kubernetes Ops Center

Splice Machine, the scale-out SQL database with native machine learning, announced it has launched the Splice Machine Kubernetes Ops Center, deployed with Helm Charts. With its comprehensive data capabilities, Splice Machine supports extensive OLTP, OLAP, data science, and MLOps in a single open source platform, making it easy for developers to accomplish what they need without stitching together a huge number of disparate technologies.

“Now, application teams throughout the enterprise can develop with the agility of having full control over their data platform and data science that would previously have required a centralized and specialized data organization,” said Monte Zweben, co-founder and CEO, Splice Machine. “Ops Center brings DBOps and MLOps together at last. Now, developers can easily spin up and tear down sandboxes, shadows, dev, test, and production clusters in minutes to be agile and cost-efficient while retaining the control provided by having a single comprehensive software stack – a powerful combination that has direct results on business performance.”

Stratifyd Launches Next Generation Data Analytics Platform

Stratifyd, a technology company that democratizes data science and artificial intelligence (AI) through self-service data analytics, announced the launch of its next generation data analytics platform. This powerful analytics engine was re-designed from the ground up to be intuitive and easy-to-use, enabling business users – regardless of education, skill, or job function – to harness the power of proprietary and third-party data to easily reveal and understand hidden stories represented within the data, thus delivering the benefits of a data science team to every organization. 

The Stratifyd platform now provides the functionality to meet the demanding data science needs of an organization, yet is specifically designed to be easy to use for those with limited data analytics experience. It empowers users of all skill levels to connect data sources to the platform, perform in-depth analysis and data modeling, and discover insightful stories faster and more easily than previously possible. Through a graphical user interface, pre-built and customizable data analytics models, and simplified dashboards, the platform enables business users to extract the insights (i.e., stories) hidden in their data. These insights help companies improve customer service, better understand customer requirements, deliver product enhancements that address gaps in the market, solve problems experienced by customers, roll out new product and service offerings that deliver a competitive advantage, and more.

“Stratifyd is a company with data science in our DNA. We were founded with a bold but simple mission to enable everyone within an organization to not only uncover but also understand the hidden stories within their data,” said Derek Wang, founder and CEO of Stratifyd. “This release of our next-gen platform is a giant leap in bringing to life our vision of putting the power of data science into the hands of business users.”

Terbine Launches Global IoT Data Subscription Service  

Terbine announced that it is offering subscriptions to its global IoT data trove to commercial users. The continually expanding index provides fingertip access to sensor readings generated by infrastructure, transportation, meteorology, energy, logistics and many other sectors. The system makes it possible, for the first time, for human users and AIs alike to rapidly find and retrieve sensor information from sources around the world and apply it to analytics, visualizations, the operation of machines and the management of infrastructure.

“Imagine this as a Bloomberg for the physical world,” said David Knight, Terbine Founder and CEO. “Instead of digging through many thousands of individual data sites and nearly indistinguishable data streams, it’s all comprehensively contextualized and indexed for you.” Terbine provides rich context, provenance, indexing and over 30 searchable parameters, each with multiple subcategories. Representing thousands of man-hours in data acquisition and characterization, the data is sourced from a broad range of public agencies and other sources encompassing air, land, water and space.

MariaDB Platform X5 Adds New Distributed SQL

MariaDB® Corporation announced the general availability of MariaDB Platform X5, a comprehensive open source database solution delivering the ultimate in versatility across workloads and scalability from a single database or data warehouse to millions of transactions per second. This major release introduces powerful upgrades to every component of the platform as well as the addition of the new MariaDB Xpand smart engine for distributed SQL for global scale and nonstop availability. MariaDB Platform X5 supports companies at any point in their growth, delivering the confidence to go boldly anywhere.

“MariaDB Platform X5 is the culmination of years of deep engineering work to bring together best-of-breed technologies in a meaningful way,” said Michael Howard, CEO, MariaDB Corporation. “We’re once again challenging the notion that you have to have different databases to get a single job done. With MariaDB Platform X5, our customers can start small and go big – adaptively, pragmatically and with extreme ease.”

Qlik Survey Shows “Data to Insights” Leaders See 23% Increase in Revenue Through Data Pipeline Optimization

Qlik® launched two new resources that build on the recent global IDC study sponsored by Qlik, which shows organizations that invest in creating data-to-insights (D2I) capabilities through modern data and analytics pipelines are seeing significant gains. Through the new IDC-hosted assessment tool (www.D2I-Score.com), every organization can evaluate the strengths and gaps in its own data pipelines. The tool also provides a set of recommendations that will help organizations better support and focus strategic investments that can have significant bottom-line impact.

“Organizations across the globe are missing a crucial opportunity to impact their performance by turning data into ongoing business value due to gaps in leaky data pipelines,” said James Fisher, Chief Product Officer at Qlik. “Qlik’s unique end-to-end approach to data integration and analytics can help any organization act at the speed of data through improved data-to-insights capabilities that drive tangible business outcomes.”

Pypestream Introduces Conversational Ads, Delivering Engaging and Personalized Experiences 

Pypestream, a leading conversational AI platform with all-in-one cloud messaging, announced Conversational Ads, an immersive brand experience driven by in-ad engagements. Conversational Ads transform present-day display ads by incorporating conversational AI alongside rich features like video, carousels, and surveys, creating authentic two-way messaging interactions between consumers and brands at scale.

“Banner blindness has forced brands to make their ads bigger, bolder and brighter. Unfortunately, this strategy has exemplified the ‘law of diminishing returns’ in that consumers are just getting more annoyed and ‘turned off,’” said Richard Smullen, CEO at Pypestream. “The answer is not to get louder but to get smarter, to get more personal, and with that, better understand the consumers and their intent. Conversational ads powered by AI achieve this at scale, and as these new ad units gain adoption, the entire advertising landscape will shift to ongoing dialogues.”

Luminoso Introduces Deep Learning Model for Evaluating Sentiment at the Concept Level

Luminoso (https://luminoso.com/), the company that automatically turns unstructured text data into business-critical insights, unveiled its new deep learning model for analyzing the sentiment of multiple concepts within the same text-based document.

Luminoso’s new deep learning model understands documents using multiple layers of attention, a mechanism that identifies which words are relevant to get context around a specific concept as expressed by a word or phrase. This model is capable of identifying the author’s sentiment for each individual concept they’ve written about, as opposed to providing an analysis of the overall sentiment of the document.
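The intuition behind attention-weighted, per-concept sentiment can be sketched with a simplified toy model. To be clear, this is not Luminoso’s model: the `relevance` and `token_sentiment` functions below are hypothetical stand-ins for what a trained network would learn, and the softmax weighting is the one generic piece the description above implies:

```python
import math

def concept_sentiment(tokens, concept, relevance, token_sentiment):
    """Score sentiment toward one concept by attention-weighting tokens.

    relevance(token, concept) -> how relevant a token is to the concept
    token_sentiment(token)    -> per-token sentiment in [-1, 1]
    Attention weights are a softmax over relevance scores, so sentiment
    words tied to the concept dominate the final score while sentiment
    about other concepts in the same document is mostly ignored.
    """
    scores = [relevance(t, concept) for t in tokens]
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    return sum(w * token_sentiment(t) for w, t in zip(weights, tokens))
```

With this structure, a review like “battery life is great but the screen is dim” can score positive for the concept “battery” and negative for “screen” from the same text, which is exactly the distinction a whole-document sentiment score cannot make.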

“While sentiment analysis has been prevalent for well over a decade, the most common form of sentiment analysis today involves evaluating whether a document’s sentiment is overall more positive than negative,” said Adam Carte, CEO of Luminoso. “This type of analysis is overly simplistic, as it fails to address nuanced comments such as customers explaining what they like and dislike about a product, or employee feedback about a company’s strengths and weaknesses. With Concept-Level Sentiment in Luminoso Daylight, businesses across industries will be able to upload any text-based document and quickly receive a nuanced analysis of the author’s sentiment regarding the topics they wrote about.”

Rockset Shatters Operational Barriers for Real-Time Analytics

Rockset, the real-time indexing database company, announced new product features—including decoupling of storage from compute in the cloud, and separation of query-compute from ingest-compute—further eliminating the operational barriers in achieving real-time analytics at scale.

To stay ahead in the digital economy, companies need to personalize user experiences, build real-time decision systems and automate actions using sensor data. These types of applications demand both very low latency and complex analytical queries on a variety of data formats from different sources. Since real-time data is messy and bursty, the high operational cost and complexity of ingesting and querying real-time data at scale has been a major barrier for enterprises moving from batch to real-time.

“First-generation real-time analytics solutions like Elasticsearch and Druid tightly coupled storage and compute because they were optimized for the data center era. But this leads to over-provisioning of resources and makes it difficult to manage costs at scale,” said Dhruba Borthakur, co-founder and CTO, Rockset. “On the other hand, scaling storage and compute independently in the cloud provides the benefits of improved scalability, availability and better price-performance ratios. Rockset’s cloud-native architecture and serverless technology already enables massive operational gains for customers, and these new features will further enable developers to achieve real-time analytics at scale.”

Knoa Software Releases New Analytics Platform

Knoa® Software, a leading provider of user experience management (UEM) software, announced Knoa Analytics, a new component of Knoa UEM, which is also sold by SAP as SAP® User Experience Management (SAP UEM) by Knoa. This new analytics platform extends and complements the current product offering with a new set of analytical capabilities geared toward accelerating adoption of enterprise software such as SAP, increasing user productivity, and offering more intuitive insights into business performance, as well as team-level and project-based collaboration.

“We’ve been working closely with customers and partners, to better understand how Knoa’s unique data can be applied to solve critical business challenges,” said Bogdan Nica, Vice President of Product and Services, Knoa. “As a result of this collaboration, we were able to identify and launch key new capabilities that we believe will help customers drive accelerated value with our solutions.”

dotData Launches dotData Enterprise Version 2.0: Full UX Remodeling with Significant Functional Upgrades to Deliver on Vision to Democratize Data Science for All

dotData, a leader in AutoML 2.0 software to help accelerate AI/ML development and operationalization for the enterprise, announced the release of dotData Enterprise Version 2.0, with significant updates and new features that completely change the AI/ML experience for citizen data scientists and deliver greater performance and transparency.

dotData released Version 2.0 as the Company continues to expand its product offerings and scale up operations to meet growing market demand for its full-cycle data science automation platform. In addition to a full UX remodeling of its interface, other key updates that deliver a superior AI/ML experience include auto-balancing of accuracy and transparency, more accurate and interpretable auto-designed features, expanded out-of-the-box connectivity, and seamless model porting with dotDataPy and dotData Stream.

“We take the quality of features and quality of outcomes very seriously and are committed to continuous improvement of our platform,” said Ryohei Fujimaki, Ph.D., founder and CEO of dotData. “After we released dotData Enterprise 1.6.2 in December 2019, we made a big decision to fully remodel our UX to realize our commitment to democratizing enterprise data science. Today, we are very happy to announce the release of dotData 2.0, which redefines the AI/ML experience for all. This is a significant change that continues to expand our vision to simplify the data science process so that BI, data professionals, and analytics teams can accelerate the development of predictive analytics by taking advantage of AI and data science using automated machine learning.”

Netradyne Passes 1 Billion Miles Mapped and Captures and Analyzes Nearly 3 Billion Minutes of Vision Based Driving Data 

Netradyne, a leader in artificial intelligence (AI) and edge computing focusing on driver and fleet safety, announced that Driveri®, its vision-based driver recognition safety program, has mapped data from vehicles traveling more than 1 billion miles on US roads, including 1.8 million unique miles, more than any other vision-based driving system. This data is collected by the Driveri system deployed in fleet vehicles around the US, driven by professional drivers, and includes numerous passes over the same roads to provide deeper insights into how driving and road conditions may change over hours, days and weeks, rather than mapped over the course of many years.

The Driveri system leverages a 4-camera system looking both within and outside the vehicle, continuously analyzing driving scenes and road data while monitoring drivers for driving behavior using AI embedded directly into the edge computing device. It captures diverse roads and traffic scenarios such as commercial driveways, temporary road closures, frontage roads, school zones and more. 

“In addition to mapping more than a billion miles, we are creating a unique database from which to study the trends and patterns regarding accidents and changing roads,” said Avneesh Agrawal, chief executive officer of Netradyne. “This rich data not only is critical for fleet companies and insurers to make our roads safer, but ultimately this data will be immensely valuable to the systems of autonomous vehicles, where the training data is imperative. Lives literally are at stake; our data can be used to help train AV ecosystems, which currently must rely on pricey surveys, specialized equipment, and human review. Instead, they can lean on the data collected by our Driveri platform to gather real-time insights and alerts that will improve safety and optimize overall fleet operations. The more miles humans safely drive today, the more safely the autonomous vehicles of the future will drive, but only if we have the data to train those future systems.”

LogicMonitor Launches Enhancements to AIOps Early Warning System

LogicMonitor, a leading cloud-based provider of IT infrastructure monitoring and intelligence software for enterprises and managed service providers, announced enhancements to its LM Intelligence™ AIOps early warning system. Its dynamic thresholds functionality, first introduced in December 2019, now includes support for seasonality and rate of change. LogicMonitor’s dynamic thresholds intelligently alert IT teams based on historical performance and newly refined algorithms to help businesses save time, avoid alert fatigue, and surface anomalies sooner to proactively prevent downtime.
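A minimal sketch can illustrate what “dynamic thresholds with seasonality” means in practice. This is not LogicMonitor’s implementation; the grouping key (hour of day), the mean-plus-k-sigma rule, and the function names are all assumptions chosen for clarity:

```python
import statistics
from collections import defaultdict

def seasonal_thresholds(history, k=3.0):
    """Build per-hour-of-day alert thresholds from (hour, value) history.

    Grouping by hour of day captures daily seasonality: a metric that is
    normally high at noon will not trip a threshold tuned on quiet
    3 a.m. data, and vice versa.
    Returns dict: hour -> upper threshold (mean + k * stdev).
    """
    by_hour = defaultdict(list)
    for hour, value in history:
        by_hour[hour].append(value)
    return {
        hour: statistics.mean(vals)
              + k * (statistics.pstdev(vals) if len(vals) > 1 else 0.0)
        for hour, vals in by_hour.items()
    }

def is_anomalous(hour, value, thresholds):
    """Flag a reading that exceeds its hour's dynamic threshold."""
    return hour in thresholds and value > thresholds[hour]
```

A production system would add rate-of-change checks, lower bounds, and weekly seasonality, but the core idea is the same: the alerting boundary is learned from each metric’s own historical behavior rather than set once by hand.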

“If static thresholds are not set or tuned well, dynamic thresholds will ensure alerts are triggered and silenced appropriately. If static thresholds are tuned well and enabled, dynamic thresholds will still defer to them,” said Tej Redkar, Chief Product Officer at LogicMonitor. “The addition of dynamic thresholds seasonality and rate-of-change support to the existing anomaly detection, forecasting and root cause analysis features of LM Intelligence means that LogicMonitor customers now have access to the most advanced AIOps capabilities in the market.”

GridGain Control Center For On-Premises Use Lets Companies Manage, Monitor And Develop Applications On GridGain And Apache Ignite Inside Their Network

GridGain Systems, provider of enterprise-grade in-memory computing solutions based on Apache® Ignite®, announced the release of GridGain Control Center for on-premises use, a downloadable version of the hosted GridGain Control Center. With the on-premises GridGain Control Center, Operations, DevOps or development users can manage, monitor and develop applications built on GridGain and Apache Ignite environments within their network. It allows users to simultaneously manage multiple GridGain or Ignite applications through a single interface to monitor application health and detect problems early. A free trial version is available for managing two nodes, perfect for DevOps or development users who want to use the GridGain Control Center locally within their corporate firewall while developing applications that leverage the GridGain or Ignite platforms. For operations professionals, the downloadable version is a convenient option for evaluating the on-premises version of GridGain Control Center. A commercial on-premises Control Center license is available for managing clusters with more than two nodes.

“GridGain Control Center makes it easy to ensure optimal performance of GridGain or Ignite in-memory computing clusters. Our on-premises version provides flexibility for situations where connecting to systems outside the corporate network isn’t possible or creates security concerns,” said Nikita Ivanov, Founder and CTO of GridGain Systems. “The downloadable version of GridGain Control Center is an excellent way for operations and DevOps staff to evaluate the solution for managing large clusters, and the free, two-node trial version is a powerful tool for developers building new applications on GridGain or Apache Ignite.”

Altair Releases New Version of Altair® Knowledge Studio® Market-leading Machine Learning and Predictive Analytics Solution

Altair (Nasdaq: ALTR), a global technology company providing solutions in product development, high performance computing (HPC), and data analytics, has released a new version of Altair Knowledge Studio that brings even greater speed, flexibility, and transparency to data modeling and predictive analytics. 

“As a powerful solution that can be used by data scientists and business analysts alike, Knowledge Studio continues to lead the data science and machine learning market,” said Sam Mahalingam, Altair chief technology officer. “Without requiring a single line of code, Knowledge Studio visualizes data fast, and quickly generates explainable results.”

Introducing Arena, Zaloni’s End-to-end DataOps Platform

Zaloni™, an award-winning leader in data management, announced that Zaloni’s product, formerly known as the Zaloni Data Platform (ZDP), has been renamed Arena™. The Arena platform provides innovative DataOps capabilities and a new user interface as the product advances to reflect market shifts and customer-driven innovation.

Zaloni’s Arena platform enables DataOps by streamlining the data supply chain and providing data unification, discovery, governance, active metadata management, collaboration, mastering, and self-service provisioning, in one unified extensible platform. Arena secures data pipelines to enable better, faster analytics, reduce the burden on IT, and lower data costs.

“Enterprises today are grappling with data sprawl and complexity along with a critical business need to deliver trusted and secure data to analysts as quickly as possible to reduce time to value,” said Susan Cook, CEO of Zaloni. “Driven by an agile DataOps mentality, we are re-orienting our approach to delivering true end-to-end data management with the Arena 6.0 release.”
