insideBIGDATA Latest News – 7/24/2021

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Dremio Launches the Industry’s First SQL Lakehouse Service to Accelerate BI and Analytics

Dremio, the SQL lakehouse company, announced its cloud-native SQL-based data lakehouse service, Dremio Cloud. Purpose-built for the cloud, Dremio Cloud makes cloud data lakes 10x easier, while delivering infinite scale and industry-leading security. Dremio Cloud enables organizations of any size to leverage a no-copy open data architecture that eliminates the need to copy data into expensive and proprietary data warehouses.

Dremio Cloud reimagines the traditional data lake by combining the best of traditional data warehouses and data lakes into a SQL lakehouse, while removing the limitations of traditional data warehouses resulting from closed data architectures. Dremio Cloud enables high performance SQL workloads directly on cloud storage, eliminating the cost and complexity of copying and moving data.
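
To make the “no-copy” idea concrete, here is a minimal, generic sketch of running SQL directly over Parquet files sitting in object storage instead of loading them into a warehouse. It uses DuckDB purely as a stand-in engine, not Dremio’s own API, and the bucket, path, and column names are hypothetical.

```python
# Generic illustration of the "no-copy" lakehouse pattern: SQL directly over
# Parquet files in object storage, using DuckDB as a stand-in engine.
# Bucket, path, and column names are hypothetical.
import duckdb

con = duckdb.connect()
con.execute("INSTALL httpfs")   # enables s3:// paths
con.execute("LOAD httpfs")      # S3 credentials/region come from DuckDB settings or the environment

revenue = con.execute("""
    SELECT region, SUM(amount) AS revenue
    FROM read_parquet('s3://example-lake/sales/*.parquet')
    GROUP BY region
    ORDER BY revenue DESC
""").fetchdf()

print(revenue.head())
```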

“We built Dremio to automatically handle any scale with consistent query performance so companies could achieve data democratization while only paying for what they truly need,” said Tomer Shiran, Dremio’s Founder and Chief Product Officer. “This type of cost-effective and highly performant scaling was a central design tenet of Dremio. We wanted to be sure data teams could achieve superior price-performance and trust the cost transparency of our platform.”

Alation Supports Next Generation of Data Enthusiasts, Provides Free Software & Training Tailored to Academic Institutions to Empower Students with Data and Analytics Skills 

Alation Inc., a leader in enterprise data intelligence solutions, announced a new philanthropic initiative, The Data Intelligence Project. As part of this initiative, Alation provides its platform free of charge to academic institutions to nurture the next generation of data enthusiasts. Using Alation, undergraduate and graduate students complete their coursework and conduct data-based research, such as analyzing different datasets regarding aspects of the COVID-19 pandemic. The first participant in the project, which launched in Summer 2020, was the University of Wisconsin-Milwaukee (UWM).

Through the project, the UWM School of Information Studies utilized Alation’s catalog and collaborative query tool in information and data-centric courses, including Introduction to Databases, Organizational Informatics, and Big Data and Society. The classes introduced classic concepts and methodology from information science as applied to data, data cataloging, and data usage. More than 250 students have used Alation in their coursework thus far. Classes for 2021 are currently underway, and more courses are available in Fall 2021.

“The Data Intelligence Project has enabled hundreds of students to learn and conduct data-based research, and experience the benefits of integrating shared data in their assignments,” said Dr. Maria Haigh, Associate Professor, University of Wisconsin-Milwaukee. “In these courses, students found Alation’s collaborative features and the hands-on introduction to metadata in a shared data warehouse extremely valuable. Alation has helped them nurture data research, collaboration and analytical skills; such skills are not only critical to solving big data problems, they are uniquely supported by Alation’s technology.” 

Narrative makes data monetization accessible to all businesses with the launch of Data Shops

Narrative, the Data Streaming Platform that makes it easy to buy, sell and win, today launched Data Shops, the company’s latest innovative solution that makes it easy for any business in any industry to launch their own branded data e-commerce experience without spending significant time and resources.

“E-commerce has become a common part of our lives,” said Nick Jordan, Founder and CEO of Narrative. “It’s never been easier to buy almost anything at any time. Yet the buying and selling of data have remained a convoluted process that often takes months. Narrative Data Shops is a category maker – a truly transformational offering in the marketplace that is being enthusiastically embraced by customers as the ‘Shopify for Data.’ It’s an end-to-end solution for standing up a data business, from the top of the sales and marketing funnel through to transactions and delivery.”

Observable Introduces Data Visualization Stack for the Enterprise

Observable, the collaborative data visualization company, announced a new Enterprise tier that offers enterprise-focused features including advanced security with authentication and dedicated customer support. Observable empowers people to explore, collaborate, and communicate data insights through a single canvas. It offers web-based, real-time collaboration that helps organizations make better data-driven business decisions faster. 

“The current business intelligence tools are siloed and limit collaboration across organizations. These tools restrict data exploration and insight, creating friction and frustration between colleagues. Making sense of data is one of the most urgent challenges facing our connected world,” said Melody Meckfessel, chief executive officer and co-founder. “Observable Enterprise offers web-based, real-time collaborative business intelligence. Working with data is an essential skill. Observable Enterprise enables people in diverse roles (developers, analysts, executives, stakeholders) to explore and visualize data and then make data-driven decisions together that benefit the business.”

New Sisense Extense Framework Empowers Customers with AI-Powered Insights in their Favorite Collaboration Applications

Sisense, a leading AI-driven platform for infusing analytics everywhere, announced the Sisense Extense Framework, an innovation developed to deliver AI-driven analytic experiences directly within the applications users are working in without needing to leave their workflow. As a part of the announcement, Sisense is introducing several new infusion applications built on the Extense Framework to deliver actionable intelligence to employees for enhanced operational, logistical, and role-based teamwork, improving collaboration and decision-making effectiveness.

“Sisense continues to challenge the status quo by innovating on new ways to approach analytics in a seamless and guided way. Customers will no longer need to adjust their workflow or leave what they’re doing to search for insights in dashboards alone,” said Ashley Kramer, Chief Product and Marketing Officer at Sisense. “Now, any business user can go about their day-to-day activities in their favorite applications where analytics will be infused directly where they are working to impact everyday decisions.”

Redpoint Global Announces In Situ

Redpoint Global announced In Situ, a cloud-native data quality-as-a-service (DQaaS) offering that delivers perfected data and resolved identities in real time, using exclusively first-party data. With a simple, powerful, and privacy-focused approach, In Situ provides unified customer data in place, with unprecedented ease, speed, and scale, without the need to transfer data across the internet. In Situ also provides radical transparency into the quality, reliability, and trust of all customer data – empowering businesses to confidently actualize their data across all edge points of the enterprise.

Redpoint In Situ – named for the Latin phrase ‘in situ,’ meaning in the original place – operates seamlessly within an organization’s existing cloud subscription. Data quality, identity resolution, and governance are contained within the existing Virtual Private Cloud (VPC), yielding a comprehensive identity graph and a full transactional and behavioral trail as a real-time, holistic golden record for each customer with zero data exposure.
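
For readers unfamiliar with identity resolution, the toy sketch below shows the general idea of merging first-party records into a single “golden record.” It is purely conceptual, not Redpoint’s engine: the field names are hypothetical, and a production system adds fuzzy matching, survivorship rules, and governance.

```python
# Toy deterministic identity resolution: merge first-party records that share
# an email address into one "golden record". Field names are hypothetical.
from collections import defaultdict

records = [
    {"email": "ada@example.com", "name": "Ada L.",       "phone": None,       "last_seen": "2021-06-01"},
    {"email": "ada@example.com", "name": "Ada Lovelace", "phone": "555-0101", "last_seen": "2021-07-15"},
    {"email": "bob@example.com", "name": "Bob",          "phone": "555-0199", "last_seen": "2021-05-20"},
]

def merge(group):
    """Prefer the most recently seen non-null value for each attribute."""
    golden = {}
    for rec in sorted(group, key=lambda r: r["last_seen"]):
        for key, value in rec.items():
            if value is not None:
                golden[key] = value   # later (more recent) records overwrite earlier ones
    return golden

by_identity = defaultdict(list)
for rec in records:
    by_identity[rec["email"].lower()].append(rec)

golden_records = {email: merge(group) for email, group in by_identity.items()}
print(golden_records["ada@example.com"])
```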

“Customer data perfection, as defined by each organization regardless of the industry, is the foundation of delivering superior experiences whether for a customer, member, patient, subscriber or donor,” said Dale Renner, co-founder and CEO of Redpoint Global. “In Situ transforms the creation and delivery of perfect data, wherever and whenever it is needed – in the cloud with no concessions in quality, speed, security and scale. For more than a decade Redpoint has led the market in providing the best, most accurate and complete customer golden record. With In Situ, all stakeholders win by having complete data transparency and trust in the quality of their customer data, while at the same time maintaining control and security of their most valuable asset.”

Quobyte Releases Hadoop Native Driver to Unlock the Power of Enterprise Analytics, Machine Learning, Streaming, and Real-time Applications

Quobyte® Inc., a leading developer of scale-out, deploy-anywhere software-defined storage (SDS), announced the availability of its Hadoop Driver. Quobyte’s new native driver for Hadoop addresses the limitations of the Hadoop Distributed File System’s (HDFS) high-capacity design within the enterprise. The new native driver brings significant benefits in optimizing Hadoop clusters for a much wider range of applications and workloads, and enables true file system sharing across object storage and applications.

“Today’s analytics solutions allow enterprises to extract important insight from large volumes of data, but with the increasing prevalence of AI and machine learning in data analytics applications, HDFS’s batch processing limitations have been exposed,” said Björn Kolbeck, CEO of Quobyte. “By deploying Quobyte’s native Hadoop/HDFS driver, enterprises can now seamlessly share large amounts of file data with high performance across Hadoop/analytics, machine learning, and any Linux or Windows application.”

Saggezza Launches Quantum Computing Lab to Help Businesses Leap Into the Future

Saggezza, a global technology solutions provider and consulting firm, announced the launch of its quantum computing lab, which allows employees to experiment with paradigm-shifting quantum computing technology.

With its exponential problem-solving capabilities, quantum computing has the potential to reshape entire industries. By investing in the emerging technology at an early stage, Saggezza will stay ahead of radical industry transformation and identify where quantum computing can offer solutions to complex business problems.

Through the quantum computing lab, Saggezza employees utilize Amazon Braket, a fully managed quantum computing service from cloud computing giant Amazon Web Services. Due to the distributed nature of the quantum environment, Saggezza’s 500+ business and technology consultants worldwide can experiment with the world-changing technology before it revolutionizes value chains in the insurance, finance, logistics, and healthcare industries.
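
As a sense of what such experimentation looks like, here is a minimal sketch using the Amazon Braket SDK’s free local simulator to prepare a Bell pair; it assumes the amazon-braket-sdk package is installed and incurs no AWS charges unless you switch to a managed device.

```python
# Minimal Amazon Braket experiment: prepare a Bell pair on the local simulator.
# Swap LocalSimulator for an AwsDevice ARN to target managed simulators or QPUs
# (which do incur AWS charges).
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)          # entangle qubits 0 and 1
device = LocalSimulator()
result = device.run(bell, shots=1000).result()

print(result.measurement_counts)          # roughly half '00' and half '11'
```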

“Quantum computing is the future — but we don’t know what the future will look like quite yet,” said Frank Trainer, Vice President of Process and Delivery at Saggezza. “Given that uncertainty, businesses can’t afford to shirk innovation because it’s challenging, uncomfortable, or risky. As the technological advancements of today shape the business environment of tomorrow, it’s more important than ever for businesses to prioritize innovation as they search for their next big idea.”

Vertica Announces Vertica 11, Delivering on Vision of Unified Analytics 

Vertica announced the Vertica 11 Analytics Platform, which includes major features and enhancements that deliver unified analytics and machine learning across multi-cloud and multi-regional deployments, with self-service container workflows to meet the agility, speed, and security requirements of the most analytically driven organizations. With Vertica 11, organizations can unify their data silos and choose from the broadest deployment options, with improved automation capabilities to future-proof their analytics and machine learning and achieve measurable business value.
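
For context on how developers typically reach the platform, below is a minimal connectivity sketch using the open-source vertica-python client; the host, credentials, and table name are placeholders, and the snippet does not demonstrate the new Vertica 11 features themselves.

```python
# Minimal connectivity sketch with the open-source vertica-python client.
# Connection details and the table name are placeholders.
import vertica_python

conn_info = {
    "host": "vertica.example.com",
    "port": 5433,
    "user": "dbadmin",
    "password": "********",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    cur.execute("SELECT version();")
    print(cur.fetchone())
    cur.execute("SELECT COUNT(*) FROM sales_events;")   # hypothetical table
    print(cur.fetchone())
```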

“Unified Analytics is a critical movement in our industry. But truly unified analytics requires proven and mature security, true deployment choice, end-to-end machine learning in production, and no-compromise analytical performance for organizations to capitalize on this mega trend,” said Colin Mahony, Senior Vice President and General Manager, Vertica, Micro Focus. “In Vertica 11, we expanded Vertica in Eon Mode to the Azure cloud, delivered support for Docker containers and Kubernetes, extended our market lead in advanced analytics and machine learning including time series forecasting, and much more. The feature list goes on and on – Vertica 11 is truly the unified analytics platform with the fastest performance at unlimited scale.”  

Latest Release of Wind River Studio Delivers Transformational Automation Technologies Across the Intelligent Systems Lifecycle

Wind River®, a global leader in delivering software for intelligent systems, has introduced its latest release of Wind River Studio. Unique to the industry, Wind River Studio is a cloud-native platform for the development, deployment, operations, and servicing of mission-critical intelligent systems from devices to cloud. It enables dramatic improvements in productivity, agility, and time-to-market, with seamless technology integration that includes far-edge cloud compute, data analytics, security, 5G, and AI/ML. The growth of 5G and sophisticated AI applications, and the emergence of more intelligent and autonomous systems, deliver on the promise of IoT. It is estimated that AI, robotics, and automation will drive 70% of global GDP growth between now and 2030.

“The next generation of cloud-connected intelligent systems requires the right software infrastructure to securely capture and process real-time machine data with digital feedback from a multitude of edge systems, enabling advanced automated and autonomous scenarios,” said Kevin Dallas, president and CEO, Wind River. “Wind River Studio delivers a flexible and collaborative platform that addresses the dynamic automation needs surrounding the intelligent systems of the future, providing a complete lifecycle experience adapted for a cloud native environment.”

Explorance Introduces Breakthrough Tool for Employee Experience Management in Tumultuous Year for Talent

Text-based open-ended employee feedback is challenging for enterprise organizations to analyze and interpret, but this unruly qualitative data can now offer the richest and most actionable insights to business and HR leaders. Explorance, a leader in Experience Management (XM) solutions, introduced BlueML, a machine learning-powered comment analysis solution specifically trained to turn employee responses into decision-grade intelligence. With BlueML, business leaders can make sense of — and make decisions from — the unstructured data gathered from employee surveys and other sources of feedback, driving meaningful and actionable insights in just seconds.
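
BlueML itself is proprietary, but the general technique class it belongs to, supervised classification of open-ended comments, can be sketched in a few lines. The example below is a generic illustration with scikit-learn, not Explorance’s model; the comments and theme labels are made up.

```python
# Generic sketch of ML-based comment classification (not BlueML itself):
# a TF-IDF + logistic regression pipeline that tags open-ended employee
# feedback with a theme. Comments and labels are fabricated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "My manager gives clear, timely feedback",
    "Onboarding was confusing and the tools were hard to set up",
    "I would like more flexibility to work from home",
    "Great team culture, people genuinely help each other",
]
labels = ["management", "onboarding", "flexibility", "culture"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(comments, labels)

# On this toy data the new comment should be tagged 'onboarding'.
print(model.predict(["The onboarding checklist was unclear"]))
```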

“Talent is one of the biggest issues businesses are facing this year, presenting three critical challenges: recruitment, retention, and inclusion. Businesses are collecting more employee feedback than ever before, yet they lack the tools to easily and accurately analyze and interpret this feedback,” said Samer Saab, Founder and CEO of Explorance. “BlueML enables business leaders to not only listen to their employees’ feedback, but to take timely and meaningful action in response.”

AWS Announces General Availability of Amazon HealthLake

Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company (NASDAQ: AMZN), announced the general availability of Amazon HealthLake, a HIPAA-eligible service for healthcare and life sciences organizations to ingest, store, query, and analyze their health data at scale. Amazon HealthLake uses machine learning to understand and extract meaningful medical information from unstructured data, and then organizes, indexes, and stores that information in chronological order. The result provides a holistic view of patient health. The service leverages the Fast Healthcare Interoperability Resources (FHIR) industry standard format to further enable interoperability by facilitating the exchange of information across healthcare systems, pharmaceutical companies, clinical researchers, health insurers, patients, and more.

Amazon HealthLake is a new service that is part of AWS for Health, a comprehensive offering of AWS services and AWS Partner Network solutions used by thousands of healthcare and life sciences customers globally. AWS for Health provides proven and easily accessible capabilities that help organizations increase the pace of innovation, unlock the potential of health data, and develop more personalized approaches to therapeutic development and care. As part of AWS for Health, Amazon HealthLake further facilitates customers’ application of analytics and machine learning on top of their newly normalized and structured data. Doing so enables customers to examine trends like disease progression at the individual or population health level over time, spot opportunities for early intervention, and deliver personalized medicine.
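
As a rough illustration of getting started, the sketch below uses boto3’s healthlake client to create an R4 FHIR data store and check its status. The data store name and region are placeholders, and parameter shapes should be verified against the current boto3 documentation before use.

```python
# Sketch of creating and checking an Amazon HealthLake FHIR data store with
# boto3. Names and region are placeholders; verify parameters against the
# current boto3 documentation.
import boto3

healthlake = boto3.client("healthlake", region_name="us-east-1")

created = healthlake.create_fhir_datastore(
    DatastoreName="patient-records",
    DatastoreTypeVersion="R4",        # HealthLake stores data in FHIR R4
)
datastore_id = created["DatastoreId"]

status = healthlake.describe_fhir_datastore(DatastoreId=datastore_id)
print(status["DatastoreProperties"]["DatastoreStatus"])   # e.g. 'CREATING'
```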

“More and more of our customers in the healthcare and life sciences space are looking to organize and make sense of their reams of data, but are finding this process challenging and cumbersome,” said Swami Sivasubramanian, Vice President of Amazon Machine Learning for AWS. “We built Amazon HealthLake to remove this heavy lifting for healthcare organizations so they can transform health data in the cloud in minutes and begin analyzing that information securely at scale. Alongside AWS for Health, we’re excited about how Amazon HealthLake can help medical providers, health insurers, and pharmaceutical companies provide patients and populations with data-driven, personalized, and predictive care.”

Amazon and Microsoft Veterans Join Forces with the Allen Institute for A.I. to Launch MajorBoost

Three tech and healthcare veterans collaborating at the Allen Institute for AI (AI2) launched MajorBoost. The AI-based communication and decision support company is set to significantly improve efficiency in the healthcare system by reimagining how doctors’ offices and insurance companies communicate with each other.

“Instead of spending more time on patient conversations and conducting health care follow-ups, medical providers are spending endless hours on the phone navigating the maze of health insurance call centers,” said Lekshmi Venu, co-founder and CEO of MajorBoost. “MajorBoost empowers providers to get their questions answered quickly, resolve insurance issues faster, and ultimately grow their patient service and human connection.”

Syniti Launches Data Jumpstart to Drive Business Value from Data

Syniti, a global leader in enterprise data management, launched Syniti Data Jumpstart to help organizations understand the impact of data quality in driving growth, increasing margin, speeding new product introductions, and maximizing the value of major initiatives such as M&A and Digital Transformations.  Syniti Data Jumpstart is a packaged, cloud-based software solution that provides turnkey improvement recommendations, enhanced by tailored, business-specific insights to help create rapid, substantial bottom-line returns by improving data quality. 
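
To give a feel for what a data-quality “x-ray” starts from, here is a hand-rolled pandas sketch of basic profiling (null rates and duplicate business keys). It is a generic illustration rather than Syniti’s product, and the tiny customer extract is fabricated.

```python
# Hand-rolled flavor of a basic data-quality profile (not Syniti's product):
# null rates per column and duplicate business keys in a fabricated extract.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [101, 102, 102, 103, 104],
    "email":       ["a@x.com", None, None, "c@x.com", "d@x.com"],
    "country":     ["US", "US", "US", None, "DE"],
})

profile = pd.DataFrame({
    "null_rate": df.isna().mean().round(3),
    "distinct_values": df.nunique(),
})
print(profile.sort_values("null_rate", ascending=False))

dupes = df[df.duplicated(subset=["customer_id"], keep=False)]
print(f"{len(dupes)} rows share a customer_id with another row")
```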

“At Syniti, our mission is to help organizations see the value and power of trusted data,” said Chris Knerr, Chief Digital Officer at Syniti. “Data Jumpstart is a fast, cloud-based data ‘x-ray’ of a business that can deliver fast data quality insights and visibility into related business upside. We call this inside-out data intelligence. Its findings connect data quality to actionable insights that help drive financial and operational performance. It’s absolutely unique in the industry. Our expectation is that every Data Jumpstart customer will leave this engagement with a focused strategy for how to create positive business impacts through data.” 

Espressive Empowers Enterprises to Fully Customize AI-Based Virtual Agent Barista with a No-Code Experience

Espressive, the pioneer in artificial intelligence (AI) for enterprise service management (ESM), announced availability of the Espressive Barista Control Center™, the administrative interface to the intelligence behind Espressive Barista™, the industry’s most comprehensive virtual agent solution. With the Barista Control Center, Espressive is empowering enterprises to expand Barista’s language model, design dynamic and interactive conversations, and integrate with virtually any API-enabled third-party system – all through a zero-code experience. Whether enterprises want a managed service, the flexibility to make their own updates, or the ability to do deep customizations on their own, Barista delivers.

Until now, enterprises had only two options when deploying workplace virtual agents: a toolkit approach that requires multiple tools, expensive resources, and long deployment times, or an out-of-the-box solution that can be deployed more rapidly but may lack customization capabilities. Barista fills that gap by accommodating a range of customer requirements on one platform, without upgrades, as needs evolve.

“Enterprises are at different levels of maturity in leveraging conversational AI, and so it is important to have one platform that can grow with them without the requirement for an expensive upgrade when they are ready to do deep customization in house,” said Pat Calhoun, Espressive Founder and CEO. “We have beta tested the Control Center over several months with multiple customers and have seen exciting use cases. For example, a leading pharmaceutical company has integrated Barista into SAP ensuring users can get help from within the app, while Dexcom has expanded their language model to handle HR benefits and government programs that are only available in the Philippines.”

CognitiveScale Announces Low Code Developer Platform to Fuel Quick Development of Large Scale, Trusted AI Campaigns

CognitiveScale, the enterprise AI company that helps organizations win with intelligent, transparent, and trusted AI-powered digital systems, announced the release of Cortex Fabric Version 6—a new, low code developer platform for automation, augmentation and transformation of knowledge work and digital experiences. 

Cortex 6 helps enterprises create trustworthy AI applications faster, more affordably, and with business outcomes delivered through KPIs based on insights from data, models and actions—all with minimal dependencies on underlying infrastructure. It is superior to existing solutions in that it uses data where it resides, avoiding cloud vendor lock-in and enabling developers to invoke any model developed by any Machine Learning platform with a focus on operationalization. 

“Enterprises need to build trust into their AI-driven applications; it’s become essential in the market. The consequences of launching unethical or biased applications are prohibitive. At the same time, not making the best use of all the data and insights companies have to differentiate represents an incredible missed opportunity,” said Matt Sanchez, CTO, CognitiveScale.

Qubit Unveils Deep Learning-Powered CommerceAI Engine Enabling Retailers to Harness New Revenue Streams as Digital Shift Accelerates

Qubit, a leader in AI-powered personalization, announced Qubit CommerceAI, which leverages AI, deep learning, and machine learning to deliver advanced 1:1 personalization techniques. Qubit CommerceAI’s models understand and react instantly to the customer context – an approach that’s proven to unlock more value from retailers’ entire product catalogs.

Qubit CommerceAI combines customer data, including intent, and design tools to drive up conversion rates and customer lifetime value while reducing abandonment. Through its use of deep learning, Qubit empowers brands to learn more about their customers and determine the products from their catalogs – which may include hundreds or thousands of items – that will drive actual sales performance. Deep learning leverages more sophisticated algorithms than machine learning, enabling the processing and understanding of massive amounts of data in real time to power the entire end user shopping experience and resulting in superior ecommerce outcomes. This data is also leveraged for insights that ecommerce teams can use to make important business decisions about merchandise, inventory, campaigns, offers, and more.
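
Qubit’s models are proprietary, but the underlying idea of scoring an entire catalog for one shopper can be sketched with embeddings. The toy example below uses random vectors as stand-ins for embeddings a real system would learn from behavioral data; it is a conceptual illustration, not Qubit CommerceAI.

```python
# Toy illustration of embedding-based recommendation (not Qubit's models):
# score every catalog item for a shopper by the dot product of a shopper
# embedding and item embeddings. Vectors are random stand-ins for learned ones.
import numpy as np

rng = np.random.default_rng(0)
n_items, dim = 1000, 32

item_embeddings = rng.normal(size=(n_items, dim))
shopper_embedding = rng.normal(size=dim)

scores = item_embeddings @ shopper_embedding      # one score per catalog item
top10 = np.argsort(scores)[::-1][:10]             # highest-affinity items first
print("recommended item ids:", top10)
```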

“Although the growth of ecommerce exceeded all expectations last year, with no sign of slowing, many customers are still frustrated with the online shopping experience because they either can’t easily find what they’re looking for or the products being recommended are not individually targeted to the shopper,” said Qubit’s CEO and founder, Graham Cooke. “Brands can remedy this by using modern technologies like deep learning to better understand customer behavior and create truly tailored shopping experiences; ones that connect the visitor directly with the products they’re most interested in. Deep learning-driven product recommendations not only drive 1:1 personalized shopping experiences, they also allow brands to derive more value and revenue from their entire product catalogs.”

Actian Launches Next-Generation Zen™ Embedded Database for Mobile and IoT

Actian, a leader in hybrid cloud data analytics, announced the general availability of its new Zen™ V15 embedded database for mobile and IoT. Actian Zen V15 addresses the demanding needs of today’s on-premise, cloud, mobile, and IoT application developers by providing persistent local and distributed data across intelligent applications deployed in enterprise, branch, and remote field environments. Actian’s Zen V15 database delivers breakthrough levels of performance and has been certified to work with the Intel OpenNESS and Smart Edge MEC platforms.

“Edge applications and devices increasingly rely on unsupervised machine learning inference at the edge to improve automation and real-time decision-making,” said Lewis Carr, Senior Director of Product Marketing at Actian. “Actian Zen V15 edge data management delivers against a broad and demanding set of requirements including leveraging a variety of hardware architectures, operating environments, networks, communications interfaces, and languages to offer comprehensive support, performance, and the flexibility modern enterprises need to stay competitive.”

BigPanda Event Enrichment Engine Supercharges Insights That Accelerate Incident Response and Increase Uptime

BigPanda, Inc., a leader in Event Correlation and Automation powered by AIOps, revealed the power of its Event Enrichment Engine that enriches raw alerts with rich topological and operational context to create high-quality incidents. This key capability of BigPanda’s AIOps platform moves organizations beyond simple alert noise to turbocharging the effectiveness of event correlation, root cause analysis, and automation. 

Many organizations are investing in AIOps to sift through and correlate alerts across observability and monitoring platforms to detect incidents in real time, before an incident turns into an outage. Unfortunately, many AIOps tools fail to convert raw alerts into context-rich, high-quality incidents because they cannot easily tap into and parse, at scale, all the sources of contextual data that could be added to those incidents.
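
The sketch below shows the basic shape of event enrichment: joining raw monitoring alerts with topology and ownership metadata so downstream correlation works on context-rich events. It is a conceptual illustration, not BigPanda’s API; the field names and the CMDB-style lookup are hypothetical.

```python
# Conceptual sketch of event enrichment (not BigPanda's API): join raw alerts
# with topology/ownership metadata before correlation. Fields are hypothetical.
topology = {                      # e.g. pulled from a CMDB or service catalog
    "web-07": {"service": "checkout", "datacenter": "us-east-1", "owner": "payments-team"},
    "db-03":  {"service": "checkout", "datacenter": "us-east-1", "owner": "dba-team"},
}

raw_alerts = [
    {"host": "web-07", "check": "http_5xx_rate",   "severity": "critical"},
    {"host": "db-03",  "check": "replication_lag", "severity": "warning"},
]

def enrich(alert):
    context = topology.get(alert["host"], {})
    return {**alert, **context}   # enriched alert now carries service, DC, owner

for event in (enrich(a) for a in raw_alerts):
    print(event)
```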

“Many AIOps projects fail to live up to their promise because alerts don’t get enriched with operational, topological or other contextual data, making it difficult to separate noisy alerts from meaningful alerts, and then eliminate the noise,” said Elik Eizenberg, co-founder and CTO at BigPanda. “Our Event Enrichment Engine is critical to improving NOC productivity and L1 resolution rates by correlating interrelated alerts into context-rich, high-quality incidents that easily describe what the problem is, what’s causing it, and what action to take.”

Hazelcast Unveils Real-Time Intelligent Applications Platform

Hazelcast, the real-time intelligent applications platform, announced the new Hazelcast Platform, enterprise software capable of serving as a single platform for transactional, operational, and analytical workloads. The Hazelcast Platform combines the capabilities of a real-time stream processing engine with in-memory computing to deliver a simplified architecture that is highly performant, scalable, and reliable.
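
For a taste of the in-memory side of the platform, here is a minimal sketch using the open-source hazelcast-python-client against a cluster reachable on localhost; it shows only a distributed map, while the stream processing (pipeline) side is configured separately.

```python
# Minimal hazelcast-python-client sketch against a local cluster (127.0.0.1:5701
# by default). Shows only the distributed in-memory map, not stream processing.
import hazelcast

client = hazelcast.HazelcastClient()
orders = client.get_map("orders").blocking()      # distributed map, synchronous API

orders.put("order-1001", {"sku": "SKU-42", "qty": 2})
print(orders.get("order-1001"))

client.shutdown()
```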

“While data continues to be an enterprise’s most valuable resource, it’s only useful if they can derive actionable insights in a timely manner,” said Kelly Herrell, CEO of Hazelcast. “The Hazelcast Platform represents a monumental step forward for the creation of real-time, intelligent applications that help enterprises capture the value they otherwise would miss.” 

Monte Carlo Launches Incident IQ To Help Organizations Achieve End-to-End Data Trust

Monte Carlo, the data reliability company, released Incident IQ, a new suite of capabilities that help data engineers better pinpoint, address, and resolve data downtime at scale through the Monte Carlo Data Observability Platform. Incident IQ automatically generates rich insights about critical data issues through root cause analysis, giving teams unprecedented visibility into the end-to-end health and trust of their data beyond the scope of traditional data quality solutions.
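
To ground the idea of data observability, the sketch below hand-rolls two of the basic checks such a platform automates: table freshness and row-volume anomalies. The load log is fabricated and the thresholds arbitrary; it is not Monte Carlo’s API.

```python
# Hand-rolled flavor of two basic observability checks (not Monte Carlo's API):
# table freshness and row-volume anomalies. The load log is fabricated.
from datetime import datetime, timedelta, timezone
import statistics

load_log = [  # (load time, rows loaded) for a hypothetical table, oldest first
    (datetime.now(timezone.utc) - timedelta(hours=h), rows)
    for h, rows in [(49, 10_250), (25, 9_980), (1, 1_020)]
]

last_load, last_rows = max(load_log)                      # most recent load
if datetime.now(timezone.utc) - last_load > timedelta(hours=24):
    print("freshness alert: no load in the last 24 hours")

history = [rows for _, rows in load_log[:-1]]             # all earlier loads
mean, stdev = statistics.mean(history), statistics.pstdev(history)
if stdev and abs(last_rows - mean) > 3 * stdev:
    print(f"volume alert: {last_rows} rows vs typical {mean:.0f}")
```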

“As companies become more data driven, it’s fundamental that organizations not only understand the health of their data, but also have the data observability necessary to trust it from end to end,” said Lior Gavish, CTO, Monte Carlo. “As the data stack fragments to incorporate new tools, it’s becoming increasingly difficult to identify when data pipelines break and take action to fix them. With Incident IQ, data practitioners and leaders alike can holistically understand and respond to issues faster, before they become a serious problem for the business. We believe these features will help customers eliminate hundreds of hours of data downtime, save thousands to millions of dollars each month, and enable data platform teams to scale with rich post-mortems that track performance and facilitate greater learning.”

Deci’s DeciNets Image Classification Models Break Efficient Frontier  

Deci, the deep learning company harnessing AI to build AI, announced the discovery of their family of industry-leading image classification models dubbed DeciNets. Deci’s proprietary Automated Neural Architecture Construction (AutoNAC) technology discovered DeciNets using roughly two orders of magnitude less computing power than Google-scale Neural Architecture Search (NAS) technologies, the latter having been used to uncover well-known and powerful neural architectures like EfficientNet.

The race for improved accuracy and performance on new and more challenging prediction tasks, in conjunction with the availability of increasingly powerful hardware and big data, has led to a push for larger deep learning models with increasing algorithmic complexity. These models have essentially become unsustainable for cost-effective inference operations in production. While NAS has been presented as a potential solution to automate the design of more effective artificial neural networks that can outperform, or at least be on par with, manually designed architectures, the resource requirements to operate such technology are excessive. To date, NAS has only been successfully implemented by tech giants like Google and Microsoft and within the confines of academia, which makes it impractical for the vast majority of developers.

To solve this problem, Deci developed AutoNAC, the first commercially viable NAS, which enables developers to automatically design and build deep learning models that outperform other known state-of-the-art architectures. Setting parameters of their choice to tackle a specific task (e.g., classification, detection, segmentation), developers can apply AutoNAC to their dataset to obtain optimized models ready for production at scale on their target inference hardware. Unlike other NAS technologies, AutoNAC is hardware-aware, meaning it can squeeze maximum performance out of any hardware and deploy models in any environment (cloud, edge, mobile).
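
AutoNAC itself is proprietary, but the general hardware-aware search idea can be illustrated with a toy random-search loop: sample candidate architectures, estimate accuracy and latency on the target device, and keep the best candidate within a latency budget. The scoring functions below are placeholders, not Deci’s algorithm.

```python
# Toy hardware-aware architecture search (not AutoNAC): random sampling with a
# latency budget. The accuracy and latency estimators are placeholders.
import random

random.seed(0)
SEARCH_SPACE = {"depth": [12, 18, 34], "width": [32, 64, 128], "kernel": [3, 5]}
LATENCY_BUDGET_MS = 20.0

def estimate_accuracy(cfg):        # placeholder: train/evaluate a proxy model
    return 0.70 + 0.001 * cfg["depth"] + 0.0005 * cfg["width"]

def measure_latency_ms(cfg):       # placeholder: benchmark on the target device
    return 0.05 * cfg["depth"] * cfg["kernel"] + 0.04 * cfg["width"]

best = None
for _ in range(50):
    cfg = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
    if measure_latency_ms(cfg) > LATENCY_BUDGET_MS:
        continue                                   # violates the hardware budget
    score = estimate_accuracy(cfg)
    if best is None or score > best[0]:
        best = (score, cfg)

print("best config under budget:", best)
```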

“Deep learning is powering the next generation of computing. Without higher-performing and more efficient models that seamlessly run on any hardware, consumer technologies we take for granted every day will reach a barrier,” said Yonatan Geifman, co-founder and CEO of Deci. “Deci’s ‘AI that builds AI’ approach is crucial in unlocking the models needed to unleash a new era of innovation, empowering developers with the tools required to transform ideas into revolutionary products.”

Fractal Announces Launch of Crux Intelligence, the Next-Generation Business Intelligence Company That Puts AI in the Hands of Every Business User

Fractal, a global provider of artificial intelligence and advanced analytics solutions to Fortune 500® companies, announced the launch of Crux Intelligence, a new provider of comprehensive Augmented Analytics products. Crux Intelligence brings together machine learning (ML) and natural language processing (NLP) technology to allow mid-to-large enterprise customers to easily uncover insights and derive business intelligence from their data. Crux Intelligence will be further powered by the roll-up of AI products and services recently acquired by Fractal.

In a data-driven economy, companies must compete not only on the depth and richness of their data intelligence, but also on the speed with which they can convert analysis into action. Crux Intelligence’s new suite of Augmented Analytics tools helps businesses in logistics-heavy verticals like consumer packaged goods, insurance, finance, and retail optimize for KPIs and proactively identify anomalies and bottlenecks.
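
As a minimal flavor of the kind of KPI anomaly detection augmented analytics automates, the sketch below flags days where a metric deviates sharply from its trailing average. The daily revenue series is synthetic and the threshold arbitrary; this is not Crux Intelligence’s product.

```python
# Minimal flavor of automated KPI anomaly detection: flag days where a metric
# deviates sharply from its trailing mean. The revenue series is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
revenue = pd.Series(1000 + rng.normal(0, 25, 60),
                    index=pd.date_range("2021-05-01", periods=60, freq="D"))
revenue.iloc[45] -= 300                               # inject a drop to detect

baseline_mean = revenue.rolling(14).mean().shift(1)   # trailing window, excludes today
baseline_std = revenue.rolling(14).std().shift(1)
zscore = (revenue - baseline_mean) / baseline_std

anomalies = revenue[zscore.abs() > 3]
print(anomalies)                                      # should flag the injected drop
```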

“Given the speed and complexity of today’s data-driven economy, it’s essential that companies are able to predict changes within their business and understand the root cause of each change,” said Kathy Leake, CEO of Crux Intelligence. “Our new Augmented Analytics solutions make it easier and faster than ever for our business customers to gain immediate insights into their performance and develop an organization-wide culture of decision-making informed by data. With support from Fractal and with the addition of the AI-driven assets we are assembling, Crux will be well positioned to deliver leading-edge data analytics tools for years to come.”

New Uniphore AI-Driven Capabilities Provide Enhanced Customer Experiences  

Uniphore, an early leader in Conversational Service Automation (CSA), announced innovative new artificial intelligence (AI) enhancements to its portfolio of products. With these additions, the company continues to lead in providing new and exciting options for organizations to deliver transformational experiences throughout the entire engagement cycle – before, during, and after contact is made. The announced enhancements to Uniphore’s U-Assist family now include new deep learning AI models specifically developed to augment and optimize both agent performance and the customer experience. Uniphore’s latest AI innovations are in the areas of enhanced intent discovery + next best action, an enhanced agent promises model, proactive supervisor alerts, and an automatic feedback loop for the optimization of its AI models.

“From the beginning, Uniphore has led the industry by focusing on delivering AI + Automation solutions that make a tangible difference in the conversations between customers and agents,” said Umesh Sachdev, CEO and co-founder of Uniphore. “These latest enhancements help our customers drive transformational experiences by delivering greater intelligence and recommendations through the application of deep learning AI models. I am extremely proud of the work our team has done to deliver these technology innovations for our customers.” 

Airbyte Offers Open Source Data Integration Platform to Data Lakes

Airbyte, creators of the open source data integration platform, announced the release of an open source data integration for data lakes, enabling AWS users to replicate data from anywhere to their Amazon Simple Storage Service (S3) account. Companies can now leverage Airbyte’s 75-plus pre-built connectors, or build their own custom connectors within two hours using Airbyte’s Connector Development Kit (CDK), to replicate their data to S3. This makes it possible for businesses to access all of their data, consolidated in their data lake, for analytics and any other use case. S3 is the first destination offered by Airbyte, but the data lakes of other cloud providers, as well as Delta Lake, will soon follow.
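
Conceptually, an S3 destination takes batches of extracted records and lands them as objects in the lake. The sketch below shows that idea with boto3 and newline-delimited JSON; it is not Airbyte’s CDK, and the bucket, key layout, and records are hypothetical.

```python
# Conceptual sketch of what an S3 destination does (not Airbyte's CDK):
# serialize a batch of extracted records as newline-delimited JSON and write
# one object per batch. Bucket, key layout, and records are hypothetical.
import json
import boto3

records = [
    {"id": 1, "email": "ada@example.com", "plan": "pro"},
    {"id": 2, "email": "bob@example.com", "plan": "free"},
]

body = "\n".join(json.dumps(r) for r in records).encode("utf-8")

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-data-lake",
    Key="raw/crm/users/2021-07-24/batch-0001.jsonl",
    Body=body,
)
```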

“Airbyte is moving forward with its mission to commoditize all data integration and will start supporting all the other data lakes,” said Michel Tricot, co-founder and CEO of Airbyte. “Airbyte is becoming the new de facto standard for open-source ETL/ELT.”

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1
