insideBIGDATA Latest News – 7/14/2020


In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating, with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time, so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

OXO Just Made it Easier to Get Multilingual Data for NLP

OXO Innovation, a top boutique language service provider, has launched a new division to serve clients in big data and natural language processing: OXO AI. This new service builds on OXO’s main strengths to provide companies with top-quality, localized machine learning data.

“This foray into AI is a natural progression for OXO and for the localization industry in general. At OXO Innovation, as our name suggests, we strive to keep pace with the latest industry developments and technologies. AI has been part of our every day for a few years already; we use it for tasks like machine translation and quality auditing. We’re big believers in the potential of this technology. When a major client approached us to write thousands of Canadian French utterances for a customer service chatbot—our first NLP project—we jumped at the opportunity,” says Charles Lesperance, CEO of OXO Innovation.

Hazelcast Simplifies Application Modernization with Event-Driven Architectures

Hazelcast, a leading open source in-memory computing platform, announced the latest release of Hazelcast Jet, which adds new application development features to its stream processing architecture. This new release enables developers to easily integrate an event-driven architecture into brownfield deployments to gain new functionality around real-time and in-memory processing.

“Banks love Hazelcast. From its real-time performance to its lightweight, simplified architecture, Jet is an ideal technology foundation for application modernization,” said David Brimley, chief product officer (CPO) at Hazelcast. “As enterprises are seeking to extend the life of legacy investments, the latest update to Hazelcast Jet enables customers to take advantage of more data sources and execute additional processing, all the while achieving greater efficiencies from their hardware investments.”

Moogsoft Publishes “Observability with AIOps For Dummies” 

Moogsoft, a pioneer and leading provider of artificial intelligence for IT operations (AIOps), announced the release of “Observability with AIOps For Dummies,” an educational best-practices guide for DevOps and Site Reliability Engineering (SRE) teams to successfully automate observability of complex IT systems and build continuous assurance into their digital services with AIOps. Additionally, the book provides actionable insights for using AIOps for better agility and responsiveness at scale.

“In a growing digital economy, downtime simply isn’t an option for DevOps and SRE teams as they balance constant delivery of new services with maintenance of increasingly complex infrastructure,” said Adam Frank, author of “Observability with AIOps For Dummies” and VP, Product & Design at Moogsoft. “This book will guide these teams to unlock their true potential to innovate by automating observability using AIOps.”

Exasol V7: Unlock More Data At Speed For Improved Business Agility

Exasol, the analytics database, launches Exasol V7, bringing unmatched speed, performance, accuracy and insight to enhance organizations’ use of data. The new features empower businesses, giving them the ability to rapidly adapt to the dynamically changing world around them. Exasol V7 gives users the freedom and focus to run analytical models on even larger data volumes to quickly find more business-critical answers. These deeper insights can transform passive employees into data-driven teams, supported by the infrastructure they need to extract more value from data.

“The business environment is characterized by increasingly high levels of uncertainty and change. Alongside this, the exponential complexity and growth of data sources and formats has heightened business demand for more and more flexibility. Organizations need to ensure they’re building a sustainable data architecture that allows them to solve data challenges now and in years to come,” said Mathias Golombek, CTO of Exasol. “Exasol V7 really ramps up what data-driven organizations are able to do with their data. Many businesses now have the skills and the data to push the boundaries of what’s possible when it comes to analytics — our database is equipped to handle whatever they can throw at it.”

GridGain Announces Nebula Managed Service For Apache Ignite And GridGain In-Memory Computing Platforms

GridGain® Systems, provider of enterprise-grade in-memory computing solutions based on Apache® Ignite®, announced GridGain Nebula, a Managed Service Offering (MSO) for the Apache Ignite and GridGain in-memory computing platforms. The GridGain MSO ensures 24/7 optimal performance of an Ignite or GridGain in-memory computing platform at a fraction of the cost of staffing an internal IT operations team.

“GridGain has become the in-memory platform of choice for the world’s largest and most innovative companies. Apache Ignite now powers products from the world’s leading technology companies, and the digital transformation initiatives of the largest global corporations,” said GridGain President and CEO Abe Kleinfeld. “GridGain’s comprehensive vision and disciplined execution continues to make Apache Ignite implementations easier, faster and more scalable while providing the security, high-availability and management of the most mission critical applications, from merchant payment systems and securities trading platforms to supply chain management and new drug discovery. With the availability of the GridGain Nebula MSO, customers can now focus their resources on developing groundbreaking solutions that accelerate revenues and open new business channels rather than managing the underlying software infrastructure that powers them.”

Couchbase Cloud Debuts on Amazon Web Services

Couchbase, the creator of the enterprise-class, multicloud to edge NoSQL database, announced the general availability of Couchbase Cloud, its award-winning, fully-managed Database-as-a-Service (DBaaS). Couchbase Cloud is initially available on Amazon Web Services (AWS) with support for Microsoft Azure and Google Cloud Platform available by year-end.

“Couchbase is widely recognized as the most powerful and versatile NoSQL database available in the market today. As Enterprises accelerate their cloud adoption plans, Couchbase Cloud now offers them the opportunity to transform their businesses at a reduced TCO and without losing control of their data,” said Couchbase President and CEO Matt Cain.  “With more enterprises than ever migrating to both NoSQL databases and cloud deployments, we will continue to expand our cloud-native product portfolio and continue the momentum that has seen more than 500 enterprises, including over 30% of the Fortune 100, rely on the Couchbase NoSQL database.”

Matillion Exchange Enables Faster Data Innovation With Community-Contributed Pre-Built ETL Workflows

Matillion, a leading provider of data transformation for cloud data warehouses (CDWs), launched Matillion Exchange, a marketplace for Matillion ETL users and partners to publish and download Shared Jobs to reduce development time and solve business challenges faster.

In a recent study from IDG, data professionals cited faster time-to-value as the top reason enterprises migrate their data to cloud platforms for implementing analytics projects. On Matillion Exchange, users can download and reuse ETL workflows created by the Matillion ecosystem of data professionals, including recognized Matillion partners, to load and transform their business-critical data and expedite the ETL development process. Customers can leverage reusable jobs to speed up data consolidation and business reporting for advanced analytics and machine learning to inform their data-dependent projects.

“Our customers often share with us their need to accelerate time to value to enable faster decision making. Matillion Exchange enables users and partners to share knowledge while making sophisticated use cases more accessible by leveraging the code and logic created by experts to innovate quickly,” said Matthew Scullion, CEO. “In addition to publishing their own jobs, this marketplace enables our partners to rapidly build solutions for their customers, and  ISVs to deploy data analytics solutions using our leading data transformation software.”

Sisu Accelerates Complex Analyses with Two New Ways to Answer “Why” Faster

Sisu, a comprehensive diagnostic analytics platform, announced two new ways to accelerate the most time-consuming kinds of data analysis. With new tools that comprehensively diagnose the results of A/B and other group comparison tests, and new capabilities for faster text analysis, Sisu is helping businesses answer “why” faster and more comprehensively than ever before. Together, these new capabilities help data teams translate complex data into clear, actionable business decisions.

Unlike current methods for A/B testing and text analytics that only produce basic results, Sisu automatically tests thousands of hypotheses to clearly and comprehensively surface detailed differences between groups. For example, media companies can use these tools to monitor changing audience behaviors across new shows. Game publishers can precisely target high-interest players with fine-tuned offers.
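Under the hood, this style of analysis comes down to testing many candidate dimensions for differences between the test groups. Below is a minimal, hypothetical Python sketch of that general idea (not Sisu’s actual algorithm), using pandas and SciPy to rank categorical dimensions by how strongly their distributions differ between an A and a B group; the column names are placeholders.

```python
import pandas as pd
from scipy.stats import chi2_contingency

def compare_groups(df, group_col, dimensions):
    """Rank categorical dimensions by how strongly they differ across groups."""
    results = []
    for dim in dimensions:
        contingency = pd.crosstab(df[dim], df[group_col])   # counts per (value, group)
        chi2, p_value, _, _ = chi2_contingency(contingency)
        results.append({"dimension": dim, "chi2": chi2, "p_value": p_value})
    # Smallest p-values first: the dimensions most likely to explain group differences
    return sorted(results, key=lambda r: r["p_value"])

# Hypothetical usage: df holds one row per user with a "variant" column ("A"/"B")
# ranked = compare_groups(df, "variant", ["region", "device", "plan_tier"])
```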

“In the pursuit of growth, it’s no longer good enough to report on how much lift to expect from an A/B test result,” said Berit Hoffmann, Vice President of Product at Sisu. “Analysts and product leaders need to quickly understand how different groups compare, across as many variables as possible. We’re thrilled to help data teams supercharge the way they diagnose changes in their most complex data.”

dotData Launches dotData Stream – Containerized AI Model for Real-Time Prediction

dotData, a leader in full-cycle data science automation and operationalization for the enterprise, launched dotData Stream, a new containerized AI/ML model that enables real-time predictive capabilities for dotData users. dotData Stream was developed to meet the growing market demand for real-time prediction capabilities for use cases such as fraud detection, automated underwriting, dynamic pricing, industrial IoT, and more.

dotData Stream performs real-time predictions using AI/ML models developed on the dotData Platform, including feature transformations such as one-hot encoding, missing value imputation, data normalization, and outlier filtering. It is highly scalable and effective: a single prediction can complete in tens of milliseconds, and microbatch predictions are even faster. Deployment is as simple as launching a Docker container with AI/ML models downloaded from the dotData Platform in one click, after which an endpoint for real-time predictions becomes immediately available. In addition, dotData Stream can run in cloud MLOps platforms for enterprise AI/ML orchestration or on edge servers for intelligent IoT applications.
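For illustration, here is a hedged sketch of how a client might call such a containerized prediction endpoint over HTTP. The container image name, port, request payload, and /predict path are assumptions made for the example, not documented dotData Stream specifics.

```python
# Launch the container first (image name and port are illustrative assumptions):
#   docker run -p 8080:8080 dotdata-stream-model:latest
import requests

record = {"customer_id": "C-1042", "amount": 159.90, "channel": "web"}  # hypothetical features

resp = requests.post("http://localhost:8080/predict", json=record, timeout=1.0)
resp.raise_for_status()
print(resp.json())  # e.g. {"score": 0.87}; the response shape depends on the deployed model
```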

“We are seeing an increasing demand for real-time prediction capability, which has become an essential necessity for many enterprise companies. dotData Stream allows our customers to leverage AI/ML capability in a real-time environment,” said Ryohei Fujimaki, Ph.D., founder and CEO of dotData. “We are honored and excited about our partnership with JFE Steel. Their intelligent IoT application is the perfect use case to demonstrate the ability of dotData Stream, and we are fully committed to supporting their vision to adopt AI/ML in smart manufacturing and achieve the full potential of Industry 4.0.”

Fake News Detection Engine Seeks to Combat Online Harms

A digital tool designed to detect fake news, cyberbullying and other online harms is being developed at the University of Exeter Business School. “LOLA” uses sophisticated artificial intelligence to detect emotional undertones in language, such as anger, fear, joy, love, optimism, pessimism and trust. It can analyse 25,000 texts per minute, and has been found to detect harmful behaviour such as cyberbullying, hatred and Islamophobia with up to 98% accuracy.

LOLA takes advantage of the latest advances in natural language processing and behavioural theory. Taking its name from the children’s TV series Charlie and Lola, the detection engine has been developed by a team led by Dr David Lopez, from the Initiative for Digital Economy Exeter (INDEX).

“In the online world the sheer volume of information makes it harder to police and enforce abusive behavior,” said Dr Lopez. “We believe solutions to address online harms will combine human agency with AI-powered technologies that would greatly expand the ability to monitor and police the digital world. Our solution relies on the combination of recent advances in natural language processing to train an engine capable of extracting a set of emotions from human conversations (tweets) and behavioural theory to infer online harms arising from these conversations.”

Such is LOLA’s potential in the battle against misinformation that it has already led to collaborations with the Spanish government and Google. In a recent experiment, LOLA was able to pinpoint those responsible for cyberbullying Greta Thunberg on Twitter. It has also been used to spot fake news about Covid-19, detecting the fear and anger so often used to peddle misinformation and singling out the accounts responsible.

LOLA grades each tweet with a severity score and ranks them from ‘most likely to cause harm’ to ‘least likely’. Those at the top are the tweets that score highest in toxicity, obscenity and insult. This kind of analysis could be a valuable tool for cybersecurity services at a time when social media companies are under increasing pressure to tackle online harms.
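As a toy illustration of severity ranking, the Python sketch below scores each tweet with a crude keyword lexicon and sorts from most to least likely to cause harm. The lexicon and weights are invented stand-ins for LOLA’s actual NLP models, which infer emotions such as anger and fear directly from the text.

```python
# Invented keyword lexicon and weights; a stand-in for LOLA's emotion models.
HARM_LEXICON = {"hate": 3.0, "stupid": 2.0, "liar": 1.5, "hoax": 1.0}

def severity(tweet):
    """Crude severity score: sum of lexicon weights for words in the tweet."""
    return sum(HARM_LEXICON.get(word, 0.0) for word in tweet.lower().split())

def rank_tweets(tweets):
    """Return (score, tweet) pairs ordered from most to least likely to cause harm."""
    return sorted(((severity(t), t) for t in tweets), reverse=True)

print(rank_tweets(["This pandemic is a hoax", "Have a nice day"]))
```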

Bitwise Announces Support for Microsoft Azure Cloud Databases with Hydrograph ETL for Big Data Development Tool

Bitwise, a Chicago-based technology consulting firm focused on data-driven business transformation, announced support for Microsoft Azure cloud databases with its ETL development tool for big data, Hydrograph. Hydrograph ETL for Big Data provides an easy-to-use, drag-and-drop tool for developing data pipelines on powerful processing platforms such as Apache Spark and other big data processing engines, both on-premises and in the cloud. In addition to Azure cloud databases, Hydrograph supports leading databases including Google BigQuery, AWS Redshift, Snowflake, MongoDB, Kafka, Teradata, Netezza, Oracle, SQL Server, MySQL, Mainframe (EBCDIC) and many more to fit with any on-premises, cloud or hybrid environment.
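For readers who prefer code to a visual designer, the PySpark sketch below hand-writes the kind of extract-transform-load step Hydrograph builds graphically: read from a relational source over JDBC, aggregate, and write to cloud storage. The connection details, table names, and paths are placeholders, not Hydrograph output.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read an orders table from a relational source over JDBC (placeholder details)
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://source-host:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Transform: aggregate order amounts per day
daily_totals = (
    orders.groupBy(F.to_date("order_ts").alias("order_date"))
          .agg(F.sum("amount").alias("total_amount"))
)

# Load: write the aggregate to a cloud storage staging area (placeholder path)
daily_totals.write.mode("overwrite").parquet("s3a://analytics-bucket/daily_totals/")
```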

New Self-study Course Provides A Level Mathematics Students with Insights into Data Science

Maths education charity MEI has developed a new course to introduce A level Mathematics students and their teachers to data science, with support from Arm, the world’s leading semiconductor IP company.

Data science uses concepts and techniques from mathematics, statistics and computing to analyze big data. Demand for data scientists is growing rapidly, presenting exciting new career opportunities for young people. The application of data science across an increasingly wide range of contexts means it is also important for young people to understand how data is used and the impact it will have on their lives. The school curriculum will need to adapt to accommodate these trends.

The new course aims to raise awareness of data science among students and teachers and stimulate interest in further study. It will advance MEI’s understanding of how to introduce and teach data science to 16-18-year-olds and inform MEI’s future work on data science education, including the design of potential future curricula and qualifications.

“Regardless of whether young people want to become data scientists, data science and skills in data analysis are becoming increasingly important to university study, applied research and the modern workplace,” said Charlie Stripp, MEI Chief Executive. “It is essential that we take this step to explore what this means for curriculum foundations at A level.”

Cloud-based Data Layer Security Startup Cyral Announces General Availability of Enterprise Platform

Cyral, provider of the cloud security platform for the data layer, announced general availability of its data layer security platform. As databases, pipelines and data warehouses move to the cloud, infrastructure as code has become the de facto model for most enterprises. While proving a boon for productivity and agility, these changes also make it much harder for security and DevOps teams to keep track of their data, making it nearly impossible to know who has access to what data and what they are doing with it. Cyral provides DevOps and security teams unmatched visibility and control without any changes to their production applications.

“Cyral pioneered the concept of a security platform for the data layer and has closed scores of new customer trials and production deployments since announcing our funding in January 2020,” said Manav Mital, founder and CEO of Cyral. “The new GA platform gives customers security directly at the data layer. It is cloud native, fast and light, and lets organizations install and deploy with one-click native integrations to the most popular enterprise tools without requiring changes to any applications.”

Algorithmia Bolsters Enterprise ML Security with Platform Upgrade

Algorithmia, a leader in ML Operations & Management, announced a series of upgrades to its Enterprise product. The platform changes are highlighted by advanced security options that enable customers to operate Algorithmia in highly controlled and restrictive environments, including AWS C2S and AWS GovCloud. The new version of Algorithmia Enterprise also includes support for the latest AWS and Azure GPU hardware, user local debugging improvements, and integration to PyCharm.

Machine learning has the most impact on a company’s core applications, which live behind a firewall in regulated industries like financial services, insurance, healthcare and laboratory sciences. For ML systems to serve those industries, integration with core security systems and compliance with policies and processes is a requirement. Algorithmia’s updated Enterprise product addresses this with support for air-gapped deployment, C2S, GovCloud and VMWare, authenticated proxies, customer-provided and hardened OS images, private Docker hub, private dependency mirrors, and private certificate authorities.
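As a rough sketch of what consuming a model from such a locked-down deployment might look like, the example below uses the Algorithmia Python client pointed at a self-hosted cluster endpoint. The endpoint URL, API key, and algorithm path are placeholders, and the assumption that the client accepts a custom API address for enterprise installs should be checked against the current client documentation.

```python
import Algorithmia

# Placeholders: API key issued by the private cluster and its internal endpoint
client = Algorithmia.client("API_KEY", "https://algorithmia.internal.example")

algo = client.algo("myorg/credit_risk_model/1.0.0")        # hypothetical algorithm path
result = algo.pipe({"applicant_id": "A-2217"}).result      # synchronous prediction call
print(result)
```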

“Large enterprises in regulated industries have stringent requirements around all aspects of their software development lifecycles,” said Diego Oppenheimer, CEO at Algorithmia. “Many of these companies want to leverage machine learning and artificial intelligence to improve business outcomes, but current MLOps platforms don’t offer the features and integrations to support these requirements. Algorithmia Enterprise allows customers to control the provenance of all components of ML operations, including certificate authorities, operating system, container images, code, dependencies and ML models used in their ML enabled applications.”

Informatica Unveils Data Management Innovations That Drive Business Continuity and Value with Intelligence and Automation

Informatica®, the enterprise cloud data management leader, announced it has updated its Intelligent Data Platform, powered by Informatica’s AI-powered CLAIRE engine. The release includes the introduction of a privacy analytics dashboard for reducing the cost of compliance with laws like the California Consumer Privacy Act (CCPA) and Europe’s General Data Protection Regulation (GDPR), Data Asset Analytics (DAA) for data valuation, end-to-end support for DataOps and MLOps, and integration platform-as-a-service (iPaaS) updates that enable organizations to build more resilient and reliable integrations while providing 24/7 operations for business continuity. Updates to multi-cloud Master Data Management (MDM) allow businesses to master business-critical data to increase customer retention and loyalty, manage supply chain risk, drive digital commerce, and boost operational efficiency.

“In today’s era of Data 4.0, and as businesses navigate an increasingly complex landscape, digital transformation must be data-led,” said Amit Walia, CEO of Informatica. “Today’s release empowers data leaders to create more value and improve operational efficiency, all while ensuring business continuity. By introducing more automation and intelligence capabilities – powered by CLAIRE – businesses can accelerate ROI, decrease risk and improve productivity across hybrid and multi-cloud environments.”

GoodData Launches Support for Location Intelligence

GoodData®, a leading global analytics company, announced new geo-mapping capabilities to better meet the needs of companies seeking location data analytics to inform strategic decision making. This new set of analytical visualizations, analytics, and modeling techniques gives GoodData the most advanced support for geolocation in the analytics industry for market trends evaluation, site selection, asset tracking and monitoring, and other core business needs. Examples of location-based business insights include COVID-19 infections, economic shifts by geography, election results, unemployment trends, and even contact tracing as efforts ramp up to battle COVID-19.

“We are quickly moving into a world where essentially all data is geo-tagged for location intelligence,” says Roman Stanek, GoodData Founder and CEO. “The rise of IoT, smartphones, Bluetooth, and other wireless technologies gives businesses completely new perspectives into their operations and risks, and our new capabilities lead this trend.”

Signals Analytics Powers the Future of Market Intelligence with its Latest Advanced Analytics Platform Rollout

Signals Analytics, the next generation advanced analytics platform that leverages external data to uncover trends and predictive insights, announced a new rollout of its award-winning platform, leveraging breakthroughs in NLP, machine learning and other capabilities that enhance the precision, detail and utilization of advanced analytics in the enterprise. New features spanning the entire data journey, such as e-commerce product clustering; author, affiliation, and brand refinement; data mart integration; and daily alerts, allow businesses to seamlessly integrate Signals Analytics into their existing business intelligence technology stacks, while surfacing highly detailed and predictive actionable intelligence across a broader range of use cases.
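As a rough, generic illustration of the product clustering idea (not Signals Analytics’ proprietary pipeline), the sketch below embeds product titles with TF-IDF and groups them with k-means using scikit-learn; the sample titles are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

titles = [
    "Organic cold brew coffee concentrate 32oz",
    "Cold brew coffee maker glass carafe",
    "Vitamin C serum for face 1oz",
    "Hydrating vitamin C facial serum",
]

# Embed product titles as TF-IDF vectors, then group similar listings with k-means
vectors = TfidfVectorizer(stop_words="english").fit_transform(titles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for title, label in zip(titles, labels):
    print(label, title)  # coffee products and serums should land in separate clusters
```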

“From day one, our commitment has been to use AI to power the future of market intelligence and increase the business impact of analytics for the enterprise,” said Signals Analytics CEO and Co-Founder, Gil Sadeh. “What this means, given the volume of external data and how fast it changes, is a continued focus on connecting more and more data sources, pushing the envelope with NLP to extract granular and actionable insights, and enabling integrations with other business intelligence platforms. The capabilities we are announcing today, such as auto ML and product clustering for example, showcase the depth of our AI-focus, while the daily alerts demonstrate how our platform delivers strategic and tactical value to multiple stakeholders across the enterprise.”

Tellius Launches Guided Insights On-Demand to Shatter the Cost and Accessibility Barrier of AI-Driven Analytics

Tellius, the Guided Data Insights platform, announced the availability of Tellius On-Demand, the first and only on-demand SaaS application powered by machine learning that enables business users and analytics teams to quickly understand what is driving performance and uncover the reasons why metrics in their data are changing. Built from the ground up for dynamic elasticity, Tellius On-Demand lowers the cost of modern, enterprise-grade data analytics as much as 80% compared to legacy analytics and visualization tools. Organizations pay only for what they use and no longer have a barrier to analyzing all their data.

“Legacy analytics and their ‘cloudified’ scale are dead,” said Ajay Khanna, Founder and CEO of Tellius. “Billions of dollars have been wasted on traditional analytics applications that do not scale as efficiently or as easily as Tellius On-Demand. Modern data analysis driven by machine learning and AI requires a completely new architecture built from the ground up that enables organizations to get answers from all their data using automated analysis techniques without worrying about resource capacities or operational scale up and scale down.”

Sign up for the free insideBIGDATA newsletter.
