insideBIGDATA Latest News – 3/9/2021


In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Splice Machine Announces New Capabilities to Support Real-Time AI with Version 3.1 of its Scale-Out SQL Database

Splice Machine, provider of the real-time AI platform built on the scale-out SQL database with built-in machine learning, today announced version 3.1 of its database, which introduces new features and functionality to support enterprises with real-time AI projects. With 3.1, Splice Machine has added Spark 3.0 support for its database engine, which adds performance improvements, resource elasticity support on Kubernetes, GPU support, expansions to Spark’s ML libraries, and more. Splice Machine 3.1 greatly increases the transparency of the data used to create ML models. A new feature of 3.1 enables developers to query the database as it existed at a specific point in time using AS OF syntax, providing full audit and lineage information for a regulator checking for bias or data drift.
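Time-travel queries of this kind can be pictured as version lookups over row history: a query "as of" time t returns the latest version of each row committed at or before t. The sketch below is a conceptual Python illustration of that idea only, not Splice Machine's implementation or its SQL syntax; all names are hypothetical.

```python
from bisect import bisect_right

class VersionedTable:
    """Toy table where each key keeps a sorted history of (timestamp, value)."""

    def __init__(self):
        self.history = {}  # key -> sorted list of (ts, value) versions

    def put(self, key, ts, value):
        self.history.setdefault(key, []).append((ts, value))
        self.history[key].sort()

    def get_as_of(self, key, ts):
        """Return the value of `key` as it existed at time `ts`, or None."""
        versions = self.history.get(key, [])
        # Sentinel makes (ts, sentinel) sort after any real value at time ts.
        i = bisect_right(versions, (ts, chr(0x10FFFF)))
        return versions[i - 1][1] if i else None

table = VersionedTable()
table.put("model_input", ts=1, value="raw_2020_snapshot")
table.put("model_input", ts=5, value="cleaned_2021_snapshot")

# An auditor checking which data a model saw at time 3 gets the old version:
print(table.get_as_of("model_input", 3))  # raw_2020_snapshot
print(table.get_as_of("model_input", 7))  # cleaned_2021_snapshot
```

The value of the pattern for auditing is that an overwrite never destroys the prior version, so lineage questions ("what did this row look like when the model was trained?") stay answerable.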

“We are excited to be powering data engineers and data scientists with the tools they need to break down the chasms that stop ML and AI projects from being successful,” said Monte Zweben, co-founder and CEO, Splice Machine. “With 3.1, we have made vital leaps in database capabilities to successfully operationalize real-time AI applications and bring ML models into production.”

YesPlz x Shopify: Product Recommendations for SMBs, Powered by AI

YesPlz announced the release of a fashion-focused Product Recommendation Engine for small and medium-sized businesses, available now on Shopify. The engine brings the power of AI to Shopify merchants, offering retailers a new way to show customers relevant, accurate product recommendations at every step of the customer journey. Using AI specifically trained to understand key fashion attributes, the YesPlz algorithm pulls key product features from merchant stores to match clothing, shoes, and jewelry to user preferences; matches are then displayed by relevance to the customer. The underlying AI was trained on hundreds of real customer interviews, resulting in highly relevant matches. By tagging fewer but higher-quality attributes per image, the algorithm makes the search process faster and more accurate.

“Good recommendations are critical for online shopping, yet great recommendations don’t exist,” according to Jiwon Hong, CEO of YesPlz. “Today, computer vision technology allows us to gather key data to build an accurate recommendation engine that works for small and medium businesses. These smaller businesses can simply install the app to provide top-notch recommendations to their customers. That’s what I’m most excited about.”

BidFX Launches Multifaceted Data and Analytics Suite for FX Trading

BidFX, a leading cloud-based provider of electronic FX trading solutions for institutions and a wholly-owned subsidiary of SGX Group, today announced the launch of BidFX Data and Analytics. This new offering is the latest addition to BidFX’s powerful suite of offerings for financial institutions, which includes an EMS platform, risk and compliance modules and transaction cost analysis (TCA).

With this expansion, banks, hedge funds and asset managers can access tools to manage the collection and cloud storage of client-specific liquidity streams, as well as monitor composite rates across multiple FX products. This gives institutional players a comprehensive view of the pricing, market impact and liquidity for every transaction.

“The ability to efficiently harness, normalise and analyse FX data has become increasingly vital for refining execution strategies and evaluating counterparties,” said Daniel Chambers, BidFX Global Head of Data and Analytics. “The launch of this product arms our clients with tools to provide valuable insights in real time. Today, having a secure platform is not enough; traders need access to fast and reliable market data and analytics. The latest expansion of the BidFX suite greatly streamlines market analysis, liquidity provision, back-testing and more, making it an essential addition to any FX trader’s toolkit.”

VMware Evolves Developer and AI-Ready Infrastructure to Advance Digital Business

VMware, Inc. (NYSE: VMW) today announced portfolio updates to help customers modernize their applications and infrastructure. The new releases of vSphere 7 and vSAN 7 will help IT teams support new and existing applications with infrastructure that is developer and AI-ready; scales without compromise; boosts infrastructure and data security; and simplifies operations.

“Infrastructure owners are racing to support exciting new containerized applications, such as advanced AI workloads, without compromising security,” said Lee Caswell, vice president, marketing, Cloud Platform Business Unit, VMware. “VMware is helping vSphere admins expand their influence beyond traditional virtualized applications to new enterprise AI environments through our partnership with NVIDIA, to high-capacity HCI use cases with HCI Mesh from vSAN, and to security-sensitive containerized workloads that can benefit from the SEV-ES security feature in AMD EPYC processors. These new capabilities allow infrastructure to seamlessly meet the rapid pace of application change.” 

Sumo Logic Expands Observability with Deeper Insights for Performance and Reliability of Microservices

Sumo Logic (Nasdaq: SUMO), the pioneer in continuous intelligence, announced new updates to the Sumo Logic Observability Suite including Service Maps and Service Dashboards, the extension of its Root Cause Explorer solution to include Kubernetes metrics and tracing, expansion of its Global Intelligence Service for Kubernetes, as well as a new beta program for both AWS Lambda support and Browser Real User Monitoring. These additions allow DevOps and site reliability engineers (SREs) to get a holistic view of all microservices to identify and resolve issues faster.

As modern application stacks become increasingly complex and interconnected, it becomes more difficult for organizations to make connections between numerous applications to gain real, valuable insights into the performance of their microservices. However, now more than ever, end users expect real-time, always-on functionality of applications and services. Because of this, it is critical for digital businesses to effectively monitor and manage how applications across their technology stacks are operating in relation to one another, so engineers can prioritize and troubleshoot any issues.

“Today’s organizations need to have a deep understanding of not just which microservices work hand-in-hand, but also how they are supported by the application stack as a whole. When an application issue arises engineers must be able to understand the full failure chain that led to the alert at the drop of a hat, otherwise restoring the reliability of the application will take too long and the failure will likely recur,” said Bruno Kurtic, Founding VP of Strategy and Solutions at Sumo Logic. “We are excited to further expand our observability suite with new features and functionality to help organizations get a snapshot of the holistic health of your microservices and ultimately achieve application reliability.”

OmniSci Launches Free Edition of Platform to Democratize the Power of Accelerated Analytics

OmniSci, the pioneer in accelerated analytics, announced the launch of OmniSci Free, a full-featured version of its analytics platform available for use forever at no cost. OmniSci Free will allow users to leverage the full power of the OmniSci Analytics Platform, including OmniSciDB, OmniSci Render Engine, OmniSci Immerse, and the OmniSci Data Science Toolkit, to explore and gain insights from their big data at the speed of curiosity. 

“Our mission from the beginning has been to make analytics instant, powerful, and effortless for everyone, and the launch of OmniSci Free is our latest step towards making our platform accessible to an even broader audience,” said OmniSci CEO and Co-Founder Todd Mostak. “While our open source database has delivered significant value to the community as an ultra-fast OLAP SQL engine, it has become increasingly clear that many use cases heavily benefit from access to the capabilities of our full platform, including its massively scalable visualization and data science capabilities. OmniSci originated out of my own struggles with using legacy analytics tools to interrogate large datasets, and it is our deep hope that making it easy to access the OmniSci stack in its entirety will allow analysts, data scientists, and decision makers everywhere to unlock the full power of their data.”

Algorithmia Tackles Machine Learning Model Risk with New Governance Capabilities 

Algorithmia, a leader in machine learning (ML) operations and management software, has released new, advanced reporting tools to help enterprise IT and internal risk leaders govern the use of ML models in production environments.

In most organizations, governance and ML model risk management are primarily focused on validation and testing of models and inspection of documentation prior to model deployment. As ML adoption has accelerated over the last year, IT leaders, business line leaders, CIOs and chief risk officers have realized that what happens after a model is deployed is even more important than pre-deployment testing and validation. Operational risk is now the most significant analytics risk. 

“We’re still in the early days of ML governance, and organizations lack a clear roadmap or prescriptive advice for implementing it effectively in their own unique environments,” said Diego Oppenheimer, CEO of Algorithmia. “Regulations are undefined and a changing and ambiguous regulatory landscape leads to uncertainty and the need for companies to invest significant resources to maintain compliance. Those that can’t keep up risk losing their competitive edge. Furthermore, existing solutions are manual and incomplete. Even organizations that are implementing governance today are doing so with a patchwork of disparate tools and manual processes. Not only do such solutions require constant maintenance, but they also risk critical gaps in coverage.”

Deloitte Teams With NVIDIA to Launch the Deloitte Center for AI Computing

Deloitte announced the launch of the Deloitte Center for AI Computing, a first-of-its-kind center designed to accelerate the development of innovative artificial intelligence (AI) solutions for Deloitte clients. Built on NVIDIA’s DGX™ A100 systems, the Center brings together the supercomputing architecture and AI expertise that clients require as they become AI-fueled organizations.

Accelerated computing platforms featuring NVIDIA graphics processing unit (GPU) technology, NVIDIA networking and NVIDIA software are transforming data processing, analytics and AI by bringing massive parallel processing capability and speed to demanding deep learning, machine learning and data science workloads. This enables the development of extremely sophisticated AI solutions at scale.

“AI is moving from research labs into industry, and Deloitte’s efforts will supercharge its reach,” said NVIDIA CEO Jensen Huang. “Every industry will be transformed by AI. Products and services will be revolutionized by AI. Companies will become learning machines and their people will be supported by AI. Together with Deloitte’s global force of experienced specialists, we will turbocharge the realization of this vision.”

Kaskada™ Announces General Availability of its Feature Engineering Platform

Kaskada, a machine learning company that empowers data scientists to build and operate machine learning solutions, announced the general availability of its feature engineering platform. This launch signifies that, after a period of beta testing with early adopters, the platform is ready for data science teams to use for a wide variety of use cases, including fraud, personalization, and recommendation engines.

Machine learning is rapidly changing how companies do business and serve their customers. These opportunities, however, tend to be exploited most by large technology companies with significant resources invested in data collection, data processing, and productionization of machine learning, while others often struggle to achieve the same level of results. A key missing piece of getting to success is a data infrastructure that bridges the gap between model training and live serving of machine learning results in production environments.

Kaskada’s feature engineering platform is the first ML platform for data scientists that focuses on the feature engineering and feature serving experience. The platform includes a collaborative interface for data scientists and is powered by proprietary data infrastructure for computing across event-based data and serving features in production.

“Kaskada’s feature engineering platform is designed to make truly hard data problems in machine learning easy,” said Davor Bonaci, Kaskada co-founder and CEO. “Data science teams can now work better together, build better features and deliver results at a whole new level. I cannot wait to see what kind of impact they’ll accomplish in the months and years to come.”

Honeycomb Introduces Refinery, a New Solution to Optimize Observability for Enterprises at Scale

Honeycomb, the company that pioneered the commercial observability solution to understand, debug, and improve distributed production systems, announced a new solution, Refinery, to help enterprises refine their observability data at scale. Managing large, complex applications means dealing with potentially hundreds of billions of events per month. Refinery collects 100 percent of telemetry data and provides multiple ways to observe only the events that best represent important system changes, resulting in a high-fidelity debugging experience while also controlling costs.
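The core idea behind refining telemetry at scale can be sketched in a few lines: always retain events that signal something important (errors, say), keep only a weighted sample of routine traffic, and record the sample rate so downstream counts can be re-weighted. This is an illustrative Python sketch of dynamic sampling in general, not Honeycomb's actual algorithm, and the field names are hypothetical.

```python
import random

def sample_events(events, keep_rate=0.1, rng=random.random):
    """Keep all error events; keep a `keep_rate` fraction of the rest.

    Sampled routine events carry a `sample_rate` field so that a backend
    can multiply counts back up to estimate true traffic volumes.
    """
    kept = []
    for event in events:
        if event["status"] >= 500:
            # Always keep server errors: these best represent system changes.
            kept.append(event)
        elif rng() < keep_rate:
            # Keep a representative sample of routine successes.
            kept.append(dict(event, sample_rate=int(1 / keep_rate)))
    return kept
```

With `keep_rate=0.1`, roughly 90 percent of routine events are dropped while every error survives, which is the cost/fidelity trade-off the article describes.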

“At large enough scale, there’s a tradeoff between providing engineers the data they need to maintain the kind of digital experiences customers expect and the resource costs of sending all that data to Honeycomb,” said Christine Yen, CEO of Honeycomb. “Honeycomb is always looking for ways to democratize the tools and expertise that used to be reserved for only the world’s most elite companies. Previously, solutions like Refinery have been proprietary and closed-source. Now, Honeycomb makes it easy for anyone to only keep the most important data they need to debug their production services and stop paying for the rest.”

Lenovo Reveals New ThinkEdge Portfolio of Embedded Computers 

Lenovo™ announced its all new portfolio of embedded computers for the edge. Building from the existing edge portfolio from Lenovo, the ThinkEdge devices – the new ThinkEdge SE30 and ThinkEdge SE50 – are small, rugged, and powerful enough to meet the demanding needs of enterprise data processing, security and scalability at the edge. 

The new ThinkEdge SE30 is a small and rugged compute device for edge workloads. It includes the latest 11th Gen Intel® Core™ i5 vPro® processors for industrial computing. The processor improves compute power, accelerates AI workloads, and is built for the challenges of edge implementations in enterprise with extended temperature support from -20 to +60 Celsius, long-life reliability, as well as enhanced security and manageability features. 

The new ThinkEdge SE50 is designed for versatile applications that require higher analytics and data processing at the edge. The embedded edge compute device includes an Intel® Core™ i5 or i7 vPro® processor for industrial computing and up to 32GB of memory. 

Kin + Carta Launches Kin + Carta Data Labs to Address the Next Digital Evolution: Data

Kin + Carta (KTC), the global digital transformation consultancy, announced today the launch of Kin + Carta Data Labs. Kin + Carta Data Labs is strategically conceived as a hub of data-related innovation and expertise to be embedded into client-related consulting and engineering services. 

“Data transformation is the next big priority for businesses,” said Kelly Manthey, Group Chief Executive for Kin + Carta Americas. “Companies started their digital transformation journey focused on the front-end experience. Then they moved to modernizing their platforms to continue to become more digital. Now, the main focus is data. Data insights and AI-driven applications will allow leaders to make smarter business decisions and provide end-customers better user experiences.”

Quantcast Unveils New Intelligent Audience Platform to Empower Brands, Agencies and Publishers to Thrive on the Open Internet 

Quantcast, a global advertising technology company, revealed the Quantcast Platform. With the advertising industry at a critical juncture, the Quantcast Platform equalizes advertising on the open internet. This innovative, intelligent audience platform, powered by the company’s patented AI and machine learning engine, Ara™, empowers brands, agencies and publishers to know and grow their audiences by reaching the right person at the right time.

“As champions of a free and open internet, we believe in leveling the playing field for brands, agencies and publishers and eliminating the advantages currently wielded by the walled gardens,” said Konrad Feldman, co-founder and CEO of Quantcast. “We are bringing together our technology leadership in measurement and insights, programmatic advertising, privacy and consent management into a single unified, easy-to-use platform to serve the needs of our customers around the world, and to benefit the open internet.” 

Cloudera Cloud-Native Operational Database Accelerates Application Development

Cloudera, (NYSE: CLDR), the enterprise data cloud company, announced the availability of the Cloudera Data Platform (CDP) Operational Database on both Amazon Web Services (AWS) and Microsoft Azure. CDP Operational Database is a fully managed cloud-native operational database with unparalleled scale, performance, and reliability. Optimized to be deployed anywhere, on any cloud platform, CDP Operational Database aligns with the cloud infrastructure strategy best suited for the business.

“Multi-cloud is the future and the global need to enable remote business has only accelerated this shift,” said Arun Murthy, Chief Product Officer at Cloudera. “With CDP Operational Database, companies no longer need to make sacrifices when it comes to their database. Enterprises can deploy on any cloud infrastructure to move at the speed that their customers demand, while also maintaining flexibility.”

Strike Graph Introduces AI Technology Platform to Automate Security Questionnaires

Strike Graph, a compliance automation startup, today announced the launch of its new AI technology platform. The first feature on the new Strike Graph AI platform is the ability to answer each question on a company’s security questionnaire based upon an organization’s security habits and procedures. This new AI feature helps sales departments close deals faster and at scale by automating the process.

“The biggest complaint we hear from customers is answering security questionnaires. Each questionnaire is different and takes precious time away from CTOs and technical talent to close deals,” said Justin Beals, CEO and co-founder of Strike Graph. “We built this new AI feature to solve that problem by amassing thousands of typical questions and testing our ability to answer questions against the Strike Graph data model.”

Quinten Spins Off 50 Data Scientists to become a Global Leader in Precision Care Using AI and Real-world Data Science

Following the launch of PharmIA, an award-winning software solution that augments and facilitates prescription review by hospital pharmacists, Quinten now launches Quinten Health: a global, integrated offering of analytical services and tool development. Quinten Health’s AI-based technologies enable, for instance, earlier and better diagnosis of rare diseases, profiling of drug responders, identification of weak safety signals, and optimal, precise treatment pathways, ultimately augmenting and rationalizing critical medical decisions at the point of care.

Quinten Health spins off from the Quinten group, the French pioneer in AI-powered decision support solutions since 2008, and builds on Quinten’s proprietary algorithms and data science capabilities developed over a decade of R&D and hundreds of impactful projects. The Quinten Health team brings together 50 data scientists and data engineers, as well as biomedical and digital health experts, for impactful and interpretable healthcare data science.

“While healthcare has been the major focus for Quinten so far, it was time to gear up and specialize as a genuine HealthTech company, serving and promoting synergies between drug developers, care providers and public health decision makers through artificial intelligence and machine learning solutions developed with and for them” commented Dr Alexandre Templier, President and Co-founder of the Quinten group.

Logically Launches Threat Intelligence Platform to Identify and Counter Mis/disinformation at Scale

Logically, a tech company combining advanced AI with human intelligence to tackle misinformation, announced the launch of its new threat intelligence platform that can identify, analyse and disarm harmful online misinformation at scale. Built on cutting-edge, secure, scalable cloud infrastructure, Logically Intelligence brings together Logically’s capabilities in at-scale analysis, classification and detection of damaging narratives and online threats. It also provides access to a suite of countermeasures to tackle identified threats, including automated fact checking and OSINT research, making it one of the few platforms to integrate both analytical capabilities and countermeasure deployment to tackle misinformation.

“Since 2016 we have seen the phenomenon of mis and disinformation firmly take root, evolve and proliferate, and increasingly cause real world harm,” said Lyric Jain, CEO of Logically. “As the dissemination of misinformation becomes more complex and dynamic, governments across the world require more sophisticated methods to tackle it. Logically Intelligence incorporates our years of expertise in this area, and we feel our technology is best placed to solve the challenges specific to this problem.”

Promethium’s New Release Helps Enterprises Make Data-driven Decisions in Real time Without the Complexity of Data Management

With market conditions changing at unprecedented speed, the need to make data-driven decisions in real time has never been more critical for every business. Promethium’s insights acceleration solution uses search and AI to help employees at all levels, from business users to C-level executives, always be ready to answer questions with data in real time. The company announced its latest features, which include new and improved search, crowdsourcing, knowledge retention, and real-time preview, to help organizations answer questions with data in minutes instead of months.

“At Promethium we are on a mission to help every business be data-driven by enabling every employee to make data-driven decisions in real time without the technical complexity of data management,” said Kaycee Lai, CEO and Founder, Promethium. “The latest features from Promethium make data-driven answers more accessible for every organization so they can take action faster than ever before.”

CLARA Analytics Offers Rapid-Turnaround Data Quality Service for Insurers 

CLARA Analytics (“CLARA”), a leading provider of artificial intelligence (AI) technology in the commercial insurance industry, announced the availability of Network Analyzer, an API-based data quality service that matches and de-duplicates medical provider records using the most current information from the CLARA Data Platform. Unlike industry-agnostic data quality products, CLARA Network Analyzer matches network information to a trusted database of verified medical providers that is curated and kept constantly up to date by CLARA. 
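Matching and de-duplicating records against a trusted reference generally comes down to normalizing each record to a canonical key and looking that key up in the curated database. The Python sketch below illustrates that general pattern; it is not CLARA's actual matching method, and the field names are hypothetical.

```python
import re

def normalize(record):
    """Reduce a provider record to a canonical match key.

    Lowercases the name, strips punctuation and whitespace, and pairs it
    with the ZIP code so "Dr. Jane Smith" and "dr jane smith" collide.
    """
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    return (name, record["zip"])

def dedupe_against_reference(records, reference):
    """Return one trusted record per distinct incoming provider."""
    trusted = {normalize(r): r for r in reference}
    seen, matched = set(), []
    for rec in records:
        key = normalize(rec)
        if key in trusted and key not in seen:   # drop duplicates and unknowns
            seen.add(key)
            matched.append(trusted[key])
    return matched
```

Real-world matchers add fuzzy comparison and confidence scoring on top of this, but the normalize-then-lookup skeleton is the common core.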

“With CLARA’s Network Analyzer, insurers can verify the accuracy of their data against our comprehensive, trusted database of network providers, gaining confidence in the quality of their provider information,” said Gary Hagmueller, CEO of CLARA Analytics. “This new offering combines the power of the CLARA Data Platform with the simplicity of an API-driven service, adding meaningful business value at a very affordable price point.” 

New Off-the-Shelf (OTS) Datasets from Appen Accelerate AI Deployment

Appen Limited (ASX:APX), a leading provider of high-quality training data for organizations that build effective AI systems at scale, announced new off-the-shelf (OTS) datasets. These datasets are designed to make it easier and faster for businesses to acquire the high-quality training data needed to accelerate their artificial intelligence (AI) and machine learning (ML) projects. The new OTS datasets include human body movement and innovative baby crying sounds, as well as scripted speech and images with text suitable for optical character recognition (OCR) for high-demand but hard-to-acquire languages, such as Arabic, Croatian, Greek, Hungarian, Thai and more. With the expanded datasets, Appen’s total OTS offering includes over 250 datasets, comprising over 11,000 hours of audio, over 25,000 images and over 8.7 million words across 80 languages and multiple dialects.

Appen’s OTS datasets are a fast, cost-effective tool to jumpstart an AI or ML project with consistent high-quality training data. Teams expanding their AI capabilities can also leverage OTS datasets to effectively improve accuracy, develop new model skills and incorporate other improvements into their AI models. An OTS dataset is often delivered in one week, for example, compared to the eight to twelve weeks for a new dataset collection and annotation project – or even longer, depending on complexity. All Appen datasets are developed using a fully transparent, opt-in methodology, so AI specialists can be assured their data is clean and compliant, eliminating the potential risk of backlash and reputation damage.

“AI teams around the world working on projects with tight deadlines and flexible data requirements can benefit from using off-the-shelf datasets,” said Wilson Pang, CTO of Appen. “OTS datasets shorten time to value and provide access to high-quality data at a lower total cost than using traditional methods. We at Appen take the necessary steps to ensure that all our datasets are ethically sourced and demographically balanced, enabling companies to maintain responsible AI practices by minimizing bias in their models and ensuring fair treatment of data annotators. You always know the precise quality of an OTS dataset, which helps build better AI that works in the real world.”

Actian Takes Customer Insight to New Levels with Customer 360 Solution on Its Avalanche™ Analytics Service

Actian, the leader in hybrid cloud data analytics, officially launched a new Customer 360 solution offering, designed to simplify and accelerate the delivery of customer insights. Built on Actian’s industry-leading Avalanche hybrid cloud data warehouse, the solution enables data-driven organizations to shape their customers’ experiences better and more quickly, dynamically and cost-effectively — to gain competitive advantage.

“Organizations need to react quickly to changes in buyer behavior, but data silos and the lack of strategic, easy-to-use analytical tools are holding them back,” said Vikas Mathur, SVP of Products at Actian. “Actian’s Customer 360 Real-time Analytics solution running on our Avalanche service bridges the gap, giving workers of all technical abilities valuable tools they can use to harness existing data sources to generate actionable insight in the business moment.”

DeepCube Launches Product Suite to Accelerate Enterprise Adoption of Deep Learning

DeepCube, the deep learning pioneer, announced the launch of a new suite of products and services to help drive enterprise adoption of deep learning, at scale, on intelligent edge devices and in data centers.

The offerings build on DeepCube’s patented platform, which is the industry’s first software-based deep learning accelerator that drastically improves performance on any existing hardware. Now, DeepCube will offer solutions for neural network training and inference, allowing users to leverage DeepCube’s technology to address challenges in their deep learning pipeline. Additionally, a new service offering will make available DeepCube’s team of leading AI experts to support deep learning projects.

“The offerings announced by DeepCube today are the culmination of decades of work and research by some of the world’s leading experts in deep learning,” said Michael Zimmerman, CEO at DeepCube. “We have long been focused on solving the technical challenges of training and inference for next-generation deep learning models, which is no easy feat – this is proven by the fact that so many enterprises are still unable to take their AI models out of the research stage. But we’re confident in the power of our patented technology, and by commercializing it through CubeIQ, CubeEngine and CubeAdvisor, we’re taking steps toward democratizing deep learning across industries.”

Hasura Releases Enterprise Grade Features so Developers Can Build Mission-Critical Applications Faster

Hasura, the data access infrastructure company, announced that it has released version 2.0 of its popular open source GraphQL Engine. The release now enables organizations to simultaneously deploy REST and GraphQL APIs from one configuration to provide a bridge between modern application development and supporting existing REST APIs. This release includes the industry’s first GraphQL API gateway which provides granular authorization to any GraphQL API. Additionally, the release also now supports Google’s BigQuery database in addition to PostgreSQL, SQL Server and MySQL, so developers can easily access the data instantly to accelerate application development.
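Exposing REST endpoints from a GraphQL configuration can be understood as a mapping from REST routes to saved GraphQL queries, with path parameters translated into query variables at request time. The Python sketch below illustrates the concept only; the names and structures are hypothetical and do not reflect Hasura's actual configuration format.

```python
# Saved GraphQL operations, declared once in configuration.
SAVED_QUERIES = {
    "user_by_id": "query ($id: Int!) { user(id: $id) { id name } }",
}

# REST routes mapped to (saved query, {graphql_var: path_param}) pairs.
REST_ROUTES = {
    "GET /users/:id": ("user_by_id", {"id": "id"}),
}

def rest_to_graphql(method, path_template, path_params):
    """Translate a REST call into the (query, variables) pair to execute."""
    query_name, var_map = REST_ROUTES[f"{method} {path_template}"]
    variables = {gql_var: path_params[p] for gql_var, p in var_map.items()}
    return SAVED_QUERIES[query_name], variables

query, variables = rest_to_graphql("GET", "/users/:id", {"id": 42})
print(variables)  # {'id': 42}
```

The appeal of this pattern is that one declarative configuration serves both API styles: existing REST consumers keep working while new clients use GraphQL directly.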

“Hasura GraphQL Engine 2.0 has our most anticipated features that make building mission critical applications in enterprise a reality, accelerating time to market for key initiatives,” said Tanmai Gopal, co-founder and CEO of Hasura. “Hasura’s declarative configuration approach and modern developer experience, coupled with a powerful authorization engine means organizations can innovate fast without compromising on security. Hasura is proving to be the backbone for the next generation of mission critical applications. The days of proprietary data access are over. Technology and user behavior are evolving too fast to completely re-architect systems. Hasura’s vision empowers developers to focus on user experience while re-using their data in new ways.”

Next Pathway Announces New Capabilities to Crawler360 and SHIFT to Accelerate Migration From Hadoop to the Cloud

Next Pathway Inc., the Automated Cloud Migration company, announced enhanced capabilities within SHIFT™ Migration Suite and Crawler360™, allowing enterprises to automatically migrate from Apache Hadoop to their desired cloud targets, such as Snowflake and Azure Synapse.

“Next Pathway’s industry-leading automation software enables organizations to accelerate their migrations and quickly get off Hadoop,” said Chetan Mathur, Chief Executive Officer of Next Pathway. “Automation through Crawler360 and SHIFT provides a solution for the most complex parts of a migration from Hadoop – planning and execution.”

Mission Launches Data, Analytics, & Machine Learning Practice for Businesses on AWS

Mission, a managed cloud services provider and Amazon Web Services (AWS) Premier Consulting Partner, announced the launch of its dedicated Data, Analytics & Machine Learning practice. The new practice provides all of the data engineering, analytics, machine learning, and data science expertise and tools required for enterprises, SMBs, and startups to tap into the vast potential of their data on AWS and accelerate data-backed transformation within their organizations. No matter where customers are in their journey toward becoming a fully data-driven business, Mission provides the roadmap, strategy, technology integration, and hands-on execution to ensure fast and ongoing success.

“Organizations have a huge opportunity to let their data effect change,” said Jaret Chiles, VP, Consulting Services, Mission. “Regardless of company size, regardless of industry – connecting disparate data sources and deriving insight from that data continues to be a monumental challenge for businesses that don’t have the requisite (and expensive) expertise in-house. We are launching our new practice to move data and analytics modernization from goal to reality – quickly and with processes and technologies built for our customers’ long-term success.”

Sisu Redesigns the Analytics Experience to Accelerate the Exploration of Cloud-Scale Data

Sisu, the augmented intelligence solution, announced a new analytics experience designed to accelerate the exploration of cloud-scale data and help analytics teams deliver insights in minutes rather than days. Informed by months of research, the redesigned product enables data teams to streamline their workflow, answer critical questions quickly, and get the complete picture behind changing metrics without cumbersome manual exploration.

Sisu’s new approach puts metrics at the center of the analytics workflow. Data teams start by creating common definitions of KPIs, ensuring consistent measurement across the organization. From there, Sisu automatically explores the high-dimensional enterprise data behind these metrics and proactively surfaces the trends and segments that matter. Finally, the new interface enables faster drill-down, providing a more comprehensive view of the top drivers in the data and accelerating the delivery of meaningful insights for the business.

“We’ve spoken with dozens of world-class data teams and top data analysts about their decision-making processes, and one single theme emerged,” said Berit Hoffmann, Vice President of Product at Sisu. “Despite continued investments in cloud infrastructure and the modern data stack, it’s too slow and too difficult to identify which dimensions in the data matter. We are thrilled to unveil a redesigned Sisu, featuring a new metrics-first interface that helps analysts find answers fast enough to drive better decisions.”

Imperva Sonar Platform Delivers a Unified Security Platform Across Edge, Applications and Data

Imperva, Inc., the cybersecurity leader whose mission is to protect data and all paths to it, introduces the Imperva Sonar platform, which eliminates the need for siloed point solutions, delivers integrated analytics, automates workflows, and accelerates incident response. By saving time and reducing costs, the platform lets organizations simplify the security of their most targeted business assets as they accelerate digital transformation projects. Advanced analytics provide visibility into two of the most challenging information security blind spots: the applications where breaches most often originate and the data most targeted for theft.

“The rush to modernize has created gaps and security teams lack visibility into the data lifecycle and how sensitive data is being accessed,” says Kunal Anand, CTO, Imperva. “The Imperva Sonar platform is the industry’s first solution to give security-conscious organizations a unified approach to protect their data wherever it lives — from outside to inside the network — all within one tool.”

Alation Announces Data Intelligence Platform with the Broadest and Deepest Connectivity in the Market

Alation Inc., a leader in enterprise data intelligence solutions, announced the release and general availability of Alation 2021.1, strengthening the company’s data intelligence platform. The newest release extends connector and query coverage to virtually any data source, expedites relevant search & discovery through data domains, and features new data governance capabilities.

“Fortune 1000 and Global 2000 firms have invested in a myriad of best-of-breed technologies, data sources, systems, and tools. They need the freedom to deeply connect to and integrate these investments without being dependent on a particular vendor,” said Satyen Sangani, CEO and co-founder, Alation. “By transforming the catalog into an open platform for data intelligence, our customers can deploy and realize benefits in the short term and have a robust, extensible architecture they can build on top of in the long term.”
