insideBIGDATA Latest News – 4/8/2021

In this regular column, we’ll bring you all the latest industry news centered around our main topics of focus: big data, data science, machine learning, AI, and deep learning. Our industry is constantly accelerating with new products and services being announced every day. Fortunately, we’re in close touch with vendors from this vast ecosystem, so we’re in a unique position to inform you about all that’s new and exciting. Our massive industry database is growing all the time so stay tuned for the latest news items describing technology that may make you and your organization more competitive.

Alternative Data Key For Decision Making In Financial Services, Bright Data Research Reveals

Bright Data (formerly Luminati Networks), a leading online data collection platform, has released new research findings that highlight the importance of alternative data in financial services. The insights, gathered in cooperation with the leading market research experts Vanson Bourne, demonstrate the impact alternative data is having in the US and UK versus legacy/traditional data. The survey concluded that nearly a quarter (24%) of financial services professionals who work for organizations that collect alternative data use it to aid their work every day. The research queried respondents from financial services sectors, including insurance, banking, and hedge funds and found a clear dependence on external data sources, with 95% of financial services organizations relying on outside information to contribute towards business success in the past year.

The research findings also shed light on the obstacles that financial institutions face when deploying and working with alternative data. Three-quarters (75%) of banking professionals who use alternative data cite data analysis as their biggest challenge. Additionally, the survey unveiled the insights that respondents are not currently getting from alternative data. For example, 80% of those surveyed expressed that they require more competitive insights from alternative data, and 79% hoped to get information on customer behavior from the data. These findings show that even though many financial services professionals are using alternative data, there’s a lack of understanding about how to properly analyze the information to unlock insights.

Further key findings from the survey include:

  • Professionals from insurance companies (74%) and hedge funds (72%) find it much easier to integrate alternative data into decision-making than those at banks (57%).
  • Sixty-four percent (64%) of organizations that rely on alternative data sources when building business strategies say that alternative data impacts their investing strategy, and 59% say it impacts their customer experience strategy.
  • Seventy-seven percent (77%) of US respondents find it easy or very easy to integrate alternative data sources into decision-making, compared to only 49% of UK respondents.

“Gone are the days where quarterly earnings reports could be relied on as the main source of data for decision-making,” said Or Lenchner, CEO of Bright Data. “Financial services institutions are seeking out alternative/external data for up-to-the-minute insights that provide the most relevant, reliable and accurate data available. We’ve seen a significant increase in businesses within the space turning to Bright Data to collect alternative data. We look forward to continuing to empower these organizations with tailored online data that guides their most important business strategies and decisions.”

The survey asked 100 employees from insurance, banking, hedge and quant funds, and loan companies about their alternative data usage. The group was representative of US and UK respondents and included employees who work within the IT and data departments of their organizations.

Snorkel AI Launches Application Studio, a Fast Way to Develop AI Applications

Snorkel AI, the company accelerating enterprise AI application development and deployment through programmatic data labeling, announced Application Studio, a visual builder with templated solutions for common AI use cases based on best practices from hundreds of deployments and research at top academic institutions over the last six years. Application Studio is in preview and will be generally available later this year within Snorkel Flow, the first AI development platform to programmatically label data and train, deploy and analyze models iteratively.

“Over the years we’ve heard a clear refrain from enterprises working to deploy AI: data is the blocker. In many settings today–for example, ones where privacy, expertise or speed are essential–even the largest organizations can’t afford to manually label the volume of data that modern machine learning approaches require,” said Alex Ratner, co-founder and CEO of Snorkel AI. “Snorkel Flow’s programmatic approach to training data labeling and model development uniquely unlocks these use cases and a whole new way to rapidly and iteratively develop AI applications–which we’re now excited to make increasingly templatized and fast to deploy with Application Studio.”
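The idea behind programmatic labeling is to replace hand-labeling individual examples with reusable heuristics, or labeling functions, whose votes are combined into training labels. The sketch below is a minimal, hypothetical illustration of that pattern in plain Python; the functions, labels, and majority-vote combiner are illustrative assumptions, not Snorkel Flow's actual API.

```python
# Illustrative sketch of programmatic data labeling (weak supervision).
# Labeling functions encode heuristics and may abstain; their votes are
# combined (here, by simple majority) into a training label.

ABSTAIN, SPAM, HAM = -1, 1, 0

def lf_contains_link(text):
    """Heuristic: messages containing URLs are often spam."""
    return SPAM if "http://" in text or "https://" in text else ABSTAIN

def lf_short_greeting(text):
    """Heuristic: very short messages are usually legitimate."""
    return HAM if len(text.split()) < 4 else ABSTAIN

def label(text, lfs):
    """Combine labeling-function votes by majority; abstain on ties."""
    votes = [v for v in (lf(text) for lf in lfs) if v != ABSTAIN]
    if not votes:
        return ABSTAIN
    spam, ham = votes.count(SPAM), votes.count(HAM)
    if spam == ham:
        return ABSTAIN
    return SPAM if spam > ham else HAM

lfs = [lf_contains_link, lf_short_greeting]
print(label("win money now http://spam.example", lfs))  # 1 (SPAM)
print(label("hi there", lfs))                           # 0 (HAM)
```

In practice, systems like Snorkel learn to weight and denoise these votes rather than taking a raw majority, which is what lets noisy heuristics produce usable training sets at scale.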

Streamlit Transforms How Data Scientists Share Data

Streamlit, the creators of the fastest and most powerful app framework for machine learning and data science, today formally introduced Streamlit for Teams, the company’s first commercial product. Streamlit for Teams lets data scientists instantly deploy and share apps with teammates, clients and other stakeholders so they can make rapid, data-informed decisions based on the insight from the apps. As of today, the new Teams product is moving out of developer preview into beta before launching later this year.

“Streamlit apps are simple interactive script visualizations – a deceptively powerful idiom that strikes just the right balance between low code, power and customizability. This unique approach enables such fast creation of powerful, useful apps, that Streamlit apps have become an entirely new workflow within companies — similar to Google Docs and Notion. Streamlit for Teams lets companies instantly bring these apps into the entire company, allowing everyone to make faster, data-informed decisions,” said Adrien Treuille, co-founder and CEO of Streamlit.

Alation Delivers Cloud-Based Platform For Data Intelligence

Alation Inc., a leader in enterprise data intelligence solutions, launched Alation Cloud Service, a comprehensive cloud-based platform for data intelligence. Alation Cloud Service offers a simple, fast way to drive data intelligence across an organization’s hybrid architecture. The new offering results in faster time to value, lower maintenance and administration overhead, and enables customers to leverage innovations faster through continuous integration and deployment options.

“We’ve always believed that speed to deployment and time to value are fundamental strengths of Alation’s platform. Alation Cloud Service doubles down on that value proposition,” said Satyen Sangani, CEO and co-founder, Alation. “Now, not only does Alation’s ease-of-use drive adoption, but new deployment options allow organizations to innovate and collaborate on data faster than ever before.”

Katana Graph Partners with Intel to Unleash the Potential of 3rd Gen Intel Xeon Scalable Processors

Katana Graph, the high-performance scale-out graph processing, AI and analytics company, announced today that it has optimized its industry-leading graph engine for the all-new 3rd Gen Intel Xeon Scalable processor and memory systems. Katana Graph can now take advantage of the latest generation Intel Xeon Scalable processors and Intel Optane persistent memory technology to process massive graphs on much smaller clusters. This will better support organizations with huge unstructured datasets and graphs, including online retailers, financial institutions, and identity management companies, in understanding their customers, operations and opportunities.

“We are proud that our Katana Graph Engine runs up to twice as fast on Intel’s 3rd Gen Intel Xeon Scalable processors as on the previous generation right out of the box, and delivers even more performance with optimizations,” said Keshav Pingali, Katana Graph co-founder and CEO. “Our customers need the high-performance analytics capabilities that we are unleashing by working closely with Intel.”

Cortical.io Message Intelligence Solution Improves Accuracy of Processing Unstructured Documents

Cortical.io announced Message Intelligence 2.1, an intelligent document processing (IDP) solution that provides high accuracy in filtering, classification, and extraction of emails, attachments, and other types of unstructured documents.

Leveraging Cortical.io’s patented method for natural language understanding (NLU), Message Intelligence 2.1 enables higher productivity, fewer false positives, and less manual intervention, saving time and money. It also requires far less material to train custom classifiers and extraction models, reducing time to production and time to value to a fraction of that of other available tools. This is particularly valuable in situations where training material is scarce.

“Corporations are overwhelmed with the wide range of interactions with a variety of external stakeholders – be it via emails, web contact forms or social media messages,” said Thomas Reinemer, COO at Cortical.io. “In today’s market, everyone expects quick responses. Speed and agility have become critical to stay competitive. Assuring these interactions are compliant with regulations is also an issue in certain verticals such as insurance, banking, and financial services.”

MemVerge Makes Big Memory Apps Sizzle

MemVerge™, the pioneers of Big Memory software, announced the release of Memory Machine software version 1.2. The software delivers Big Memory performance and capacity leveraging up to 40 cores in 3rd Gen Intel Xeon Scalable processors (code named Ice Lake) and up to 6TB of capacity per socket with Intel Optane persistent memory 200 series. The company also announced its membership in the CXL™ Consortium, and five Big Memory Labs at Arrow, Intel, MemVerge, Penguin Computing, and WWT that are now equipped and available for Big Memory demonstrations, proof-of-concept testing, and software integration.

“Memory Machine v1.2 is designed to allow application vendors and end-users to take full advantage of Intel’s latest Xeon Scalable processor and Optane memory technology,” said Charles Fan, CEO of MemVerge. “We started by providing access to new levels of performance and capacity without requiring changes to applications.”

Yellowbrick Brings Data Warehousing to Distributed Clouds for First Time, Addressing Business Challenges of Distributed Data 

Yellowbrick Data, a leader in modern data warehousing, announced a preview release of Yellowbrick Manager to give customers unified control of data warehouses across distributed clouds, and general availability of its new Andromeda optimized instance for customers with data sovereignty or high-performance requirements. Additionally, the company has added more agile data movement capabilities to help customers more easily integrate Yellowbrick with data lakes built on cloud object stores like Amazon S3. Distributed clouds are an emerging architectural pattern characterized by a mesh of interconnected physical and virtualized infrastructure, forming a best-of-breed, logical cloud managed by a single, unified control plane.  

“Data is becoming more distributed across private data centers, multiple clouds, and the network edge, creating significant data sovereignty and gravity challenges,” explained Yellowbrick CEO Neil Carson. “Yellowbrick led the industry in hybrid cloud innovation. Next, bringing our data warehouse to distributed clouds will be transformative for businesses facing these challenges, especially as use cases like IoT analytics emerge in manufacturing, telecom, and logistics.”

Google Cloud Dataprep Accelerates Data Engineering Tasks 20x By Running Inside BigQuery

Trifacta, the Data Engineering Cloud company, announced Google Cloud Dataprep by Trifacta now leverages the full power of SQL to transform data inside BigQuery. These new capabilities accelerate data engineering tasks up to 20x by eliminating the need to move or cache any data.

With these new features, database pushdown, popularly known as Extract, Load, Transform (ELT), transforms data where it already resides in the database. Using the power of SQL, the data can be transformed in place, making it efficient for data manipulations such as filters, joins, unions, and aggregations. Dataprep automatically determines whether a data pipeline can be partially or fully translated into a BigQuery SQL statement. There is no need to move or cache any data, enabling enterprises to run data analytics workloads at any scale with Google Dataprep by Trifacta. This provides complete flexibility over data consumption at lower cost, with increased security at the user or service level through standards such as OAuth and IAM.
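The ELT pushdown pattern described above can be sketched in a few lines: raw data is loaded first, then the transform step is expressed as SQL that runs entirely inside the database, so no rows are extracted into an external engine. In this illustrative example sqlite3 stands in for a warehouse such as BigQuery, and the table and column names are hypothetical.

```python
# Minimal ELT pushdown sketch: load first, then transform in place with SQL.
import sqlite3

conn = sqlite3.connect(":memory:")

# Load: raw data lands in the warehouse untransformed.
conn.execute("CREATE TABLE raw_orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("us", 10.0), ("us", 15.0), ("uk", 7.5)],
)

# Transform: filter/aggregate logic is pushed down as SQL, so the
# database does the work and no rows are moved or cached externally.
conn.execute("""
    CREATE TABLE orders_by_region AS
    SELECT region, SUM(amount) AS total
    FROM raw_orders
    GROUP BY region
""")

for row in conn.execute(
    "SELECT region, total FROM orders_by_region ORDER BY region"
):
    print(row)  # ('uk', 7.5) then ('us', 25.0)
```

The contrast with classic ETL is that an ETL tool would pull the `raw_orders` rows out, aggregate them in its own engine, and write the result back, which is exactly the data movement pushdown avoids.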

“A big requirement for our customers is to not move or cache any data. With this new capability, data can remain in the database enabling enterprises to run workloads with the highest security, with greater flexibility, and at any scale”, said Trifacta CTO and co-founder Sean Kandel. 

Alluxio Introduces Hybrid Cloud Solution Fueled by 3rd Gen Intel Xeon Scalable Processors and Intel Optane Persistent Memory 200 Series

Alluxio, a developer of open source cloud data orchestration software, announced a go-to-market solution in collaboration with Intel to offer an in-memory acceleration layer with 3rd Gen Intel Xeon Scalable processors and Intel Optane persistent memory (PMem) 200 series. The solution enables high performance analytics and AI pipelines at a massive scale while minimizing the Total Cost of Ownership (TCO) by more seamlessly managing data in local storage tiers close to compute. Intel Optane PMem provides a cost-effective storage tier for Alluxio managed data, while accelerating performance with a disaggregated compute and storage architecture.

“With an explosion in the amount of data being managed by the Alluxio Data Orchestration system, effective memory and storage media provided by Intel has a huge value-add,” said Haoyuan Li, Founder and CEO, Alluxio.

MinIO Enables IT to Manage Kubernetes-native Object Storage with Addition of Console, Operator and SUBNET Health

MinIO, a pioneer in high performance, Kubernetes-native object storage, announced the addition of Console, Operator and SUBNET Health to MinIO’s suite of object storage software. These new features simplify the deployment of multi-tenant object storage using Kubernetes. Critical components in any IT modernization effort, these additions ensure that the broadest audience can leverage MinIO.

“Collectively, these features allow customers to rapidly adopt even the most advanced MinIO capabilities while providing a roadmap for increased automation as they scale their deployments and workloads,” said MinIO CEO and Co-Founder AB Periasamy. “SUBNET Health is a critical component of MinIO’s hybrid cloud strategy and provides customers with the confidence to run on any hardware, in any environment and on any cloud.”

TYAN Uses New 3rd Gen Intel® Xeon® Scalable Processors to Drive Performance for AI and Cloud Data Centers

TYAN®, an industry-leading server platform design manufacturer and MiTAC Computing Technology Corporation subsidiary, today introduced the 3rd Gen Intel Xeon Scalable processor-based server platforms featuring built-in AI acceleration, enhanced security, and PCIe Gen4 support for the most demanding workloads in cloud, enterprise, AI and HPC fields.

Second Generation of Black Knight’s Rapid Analytics Platform Significantly Expands Data Marketplace and Team Collaboration Tools; Further Streamlines Workflow

Black Knight, Inc. (NYSE:BKI) announced the release of the second generation of its Rapid Analytics Platform (RAP), which includes a powerful new design that delivers a streamlined workflow experience for users. RAP also now includes several additional datasets that clients can leverage to address a variety of critical business needs. RAP is a unique, cloud-based data marketplace and decision-science studio that allows users to directly access Black Knight’s massive, diverse data assets and create custom analytics within a single solution. Users can seamlessly source Black Knight data managed on the platform, connect to other data sources, execute queries, create advanced analytics and train machine-learning models.

“RAP had already changed the landscape for mortgage and housing-related data science by bringing together more primary-sourced data and advanced analytics than any platform currently available,” said Ben Graboske, president of Black Knight’s Data & Analytics division. “With this second iteration, we’ve significantly enhanced the user and workflow experience and increased the number of datasets available, while simultaneously boosting the power available to users.”

DefinedCrowd Addresses Data Quality Challenge as Global Pandemic Accelerates Demand for Bias-Free Artificial Intelligence, Boosts Access via NVIDIA NGC

To address the rapid increase in the demand for high-quality, bias-aware AI training data, DefinedCrowd announced the expansion of its online data marketplace, DefinedData, to third-party suppliers to sell or share AI datasets, as well as a collaboration with NVIDIA to provide dataset samples through the NVIDIA NGC catalog. In addition, the platform now provides AI engineers with unprecedented levels of training data transparency, and a range of subscription options, with special discounts for academia.

“This is an exciting moment. I am proud to see DefinedCrowd becoming the GitHub of AI,” said Founder and CEO, Dr. Daniela Braga. “Transparent, traceable and bias-aware data is crucial to build ethical AI technologies.”

Announcing Gemini: The Disaggregated Storage Consumption Model To Marry Hyperscale Economics with Enterprise Appliance Simplicity

VAST Data, the storage software company breaking decades-old tradeoffs, unveiled Gemini, the enterprise storage appliance model reinvented for the age of hyperscale. This offering, which disaggregates the business of hardware and software, allows VAST to sell managed software on hardware that customers can buy at cost as integrated appliances that ship directly from the manufacturer. With Gemini, VAST has cleared many of the greatest hurdles associated with pure software and pure appliance models, enabling customers to buy like hyperscale cloud operators without the heavy lifting commonly associated with building always-on, massively scalable clouds of low-cost flash infrastructure.

This new offering provides customers all of the benefits of a software-defined business model in terms of flexibility and cost advantage, but at the same time preserves the simple and multigenerational cluster appliance model that customers have come to love from VAST Data. This event marks the company’s complete transition into a software business and is intended to align VAST’s business objectives precisely with the needs and objectives of its customers.

“After deploying hundreds of petabytes of storage around the world, we’ve learned so much about what our customers want and need,” said Renen Hallak, Founder and CEO of VAST Data. “The promise of disaggregation shouldn’t just be about the technology, but equally about how that technology is deployed. Instead of purchasing hardware and software together and being caught up in an endless refresh cycle as we’ve seen for the past 30 years, Gemini offers the freedom, flexibility, and simplicity, all at an affordable cost, that organizations need to deploy an infinite storage lifecycle.”

Talkwalker Announces New Speech Analytics Technology 

With its new Speech Analytics technology, Talkwalker, a leading enterprise listening company, empowers brands to monitor and analyze podcast conversations. Through text, image, video, and now speech analytics solutions, Talkwalker provides full visibility over what is being said about brands in seconds. Podcast Monitoring offers new insights and glimpses into campaign efficacy by unlocking hidden brand mentions to better understand consumers and ultimately drive revenue.

“As a major source of news and entertainment, podcasts hold an extensive amount of information that brands cannot ignore when looking to capture the voice of the customer,” said Robert Glaesener, Talkwalker CEO. 

Holberton Launches Expanded Program to Accelerate Learning Fundamentals of Artificial Intelligence

Holberton, making software engineering education affordable and accessible globally, announced the appointment of a new Machine Learning and Mathematics Team to build out a comprehensive program to accelerate training students in the key tenets of Artificial Intelligence (AI), the engine of the New Economy.

“Machine learning has revolutionized so many fields, ranging from medicine, to social media, to food, to security,” said Julien Barbier, CEO of Holberton. “This trend has accelerated with COVID-19. While yesterday, every company had to become a digital company to survive, tomorrow they will have to become machine learning companies. We’re committed to training the next generation of ML software engineers to meet this demand when there is an urgent global shortage of qualified talent.”

Denodo Launches New Data Integration Solution in the Cloud with Denodo Standard

Denodo, a leader in data virtualization, today announced Denodo Standard, a new data integration solution available on leading cloud marketplaces. The new offering leverages Denodo’s modern data virtualization engine to deliver superior performance and productivity, enabling real-time analytics and data services without replicating the data into another repository. Denodo Standard lowers the barriers to begin data integration by allowing organizations to purchase it directly from their cloud marketplace of choice—AWS, Microsoft Azure, and Google Cloud Platform. Its cloud infrastructure automation and flexible, by-the-hour pricing enable both enterprises and small and medium businesses to deploy it rapidly without a large commitment of time and resources.

“The goal of Denodo Standard is to enable organizations of all sizes to unlock the value of their data assets faster, and with a lower cost and resource commitment. By leveraging cloud infrastructure automation, usage-based pricing, and free trials in the cloud marketplaces, we have lowered barriers to start using Denodo for data integration,” said Ravi Shankar, senior vice president and chief marketing officer at Denodo. “We are excited to be able to bring the value of data virtualization to even more companies, especially small and medium-sized businesses, and we expect many of them to expand beyond their initial use quickly to the enterprise logical data fabric capabilities once they have realized the benefits and the rapid ROI Denodo provides.”

Talkdesk launches AI Trainer, the first ‘human-in-the-loop’ tool for contact centers

Talkdesk®, Inc., the cloud contact center for innovative enterprises, launched Talkdesk AI Trainer™, the first human-in-the-loop (HITL) tool for contact centers. While most artificial intelligence (AI) systems require the employment of highly specialized data scientists, the powerful simplicity of Talkdesk AI Trainer allows agents with domain knowledge to improve the AI models autonomously. As a result, enterprises can successfully resolve more cases through automation, which, in turn, improves accuracy, decreases the cost per case and increases customer satisfaction.

Human-in-the-loop systems—in which humans provide information and knowledge for AI training—are essential for many applications and the ongoing maturation of AI in the contact center. Talkdesk AI Trainer is the first HITL tool to be included in the operational flow of contact centers. Within Talkdesk AI Trainer, dashboards display the performance of each AI model and indicate where each model needs additional training. An easy-to-use interface allows non-technical staff with domain or business expertise to improve automation performance. 
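A human-in-the-loop workflow of the kind described above typically routes only low-confidence predictions to a person, whose corrections then feed back into training. The following is a hedged, toy sketch of that routing logic; the `predict` model, labels, and confidence threshold are all hypothetical and not Talkdesk's actual implementation.

```python
# Illustrative human-in-the-loop triage: confident predictions are
# automated, uncertain ones go to a human, and the human's corrections
# are collected as new training examples.

def predict(text):
    """Toy stand-in for a trained intent model: (label, confidence)."""
    if "refund" in text:
        return "billing", 0.95
    return "other", 0.40  # model is unsure

def triage(cases, human_review, threshold=0.8):
    auto, training_updates = [], []
    for text in cases:
        label, conf = predict(text)
        if conf >= threshold:
            auto.append((text, label))        # handled by automation
        else:
            corrected = human_review(text)    # agent supplies the label
            training_updates.append((text, corrected))
    return auto, training_updates

cases = ["please refund my order", "my app keeps crashing"]
auto, updates = triage(cases, human_review=lambda t: "technical")
print(auto)     # [('please refund my order', 'billing')]
print(updates)  # [('my app keeps crashing', 'technical')]
```

The collected `training_updates` are what closes the loop: retraining on them raises confidence on previously uncertain cases, so a growing share of traffic is resolved automatically over time.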

“By lowering the barrier to AI adoption in contact centers, Talkdesk AI Trainer is revolutionizing the way companies implement, maintain and customize their AI models for automation,” said Charanya Kannan, chief product officer, Talkdesk. “Enterprises no longer need to hire highly specialized data scientists to program their machine learning models. With AI Trainer, enterprises become autonomous by leveraging their internal customer service subject matter experts—agents and supervisors—to embed knowledge into their existing AI architecture. Talkdesk AI Trainer signifies a bold move into the future of work, where agents are empowered to join the era of AI.”

RSIP Vision Unveils Robust Metal Implant & Anatomical Segmentation Tool, for Improved Planning of Specialized Orthopedic Procedures including Revision Arthroplasty

RSIP Vision, a leading innovator in medical imaging through advanced AI and computer vision solutions, announced an advanced joint segmentation tool for detailed, non-invasive planning of revision arthroplasty and other orthopedic procedures for patients with pre-existing metal implants. This powerful AI-based software module enables quick and accurate segmentation of different joints from CT scans of hips, knees, shoulders and spines. It provides precise measurements of the geometry of joints, including complicated cases of joints with existing metallic orthopedic implants. RSIP Vision’s deep learning algorithms provide a solution for the artifacts associated with metals in CT images, which normally cause severe degradation of medical imaging. This vendor-neutral technology will be available to third-party CT manufacturers and medical device vendors, allowing them an improved way to plan and execute both manual and robot-assisted revision arthroplasty procedures.

“One of the most common problems in bone segmentation imaging occurs when a CT scan includes the presence of metals in the bones, either due to previous orthopedic procedures (such as hip replacements) or surgical corrections after a traumatic injury,” said Ron Soferman, CEO of RSIP Vision. “In these cases, the CT images become problematic because of nonstandard absorption values caused by crosstalk between the absorbing pixels and additional artifacts – which result in a challenging image. Moreover, standard segmentation tools can lead to inaccurate outcomes that limit the orthopedic surgeon’s ability to properly plan for the surgery.”

Penguin Computing Announces DeepData Solution Featuring Red Hat Ceph Storage and Seagate

Penguin Computing, Inc., a division of SMART Global Holdings, Inc. (NASDAQ: SGH) and leader in high-performance computing (HPC), artificial intelligence (AI), and enterprise data center solutions, today announced that it has collaborated with Red Hat and Seagate Technology to unveil its DeepData solution with Red Hat Ceph Storage, a high performance, software-defined storage solution optimized for scale, throughput and cost-effectiveness.

Every year, organizations are creating larger quantities of unstructured data, making it difficult to keep up. There is growing demand for data to feed AI systems to improve decision making, customer experiences and deliver new services. Additionally, image and video files are increasing in quality, quantity and size while the continued growth of IoT and edge devices is accelerating an influx of unstructured data. According to ESG research, traditional storage solutions limit business agility and complicate management, unlike a workload optimized software-defined architecture that can be tuned for I/O profiles. Organizations need a way to scale storage without increasing cost, head count or security issues while adapting to modern data management modalities that accelerate time to value.

“Through our work with Red Hat and Seagate, the DeepData solution is accelerating adoption of software-defined architectures by delivering a proven and tested turn-key enterprise solution,” said Kevin Tubbs, Ph.D., senior vice president, Strategic Solutions Group at Penguin Computing. “Incredibly scalable with security measures built-in and able to support workloads driven by large amounts of data, our DeepData solution is designed to leverage high capacity, cost-effective enterprise storage solutions that meet the needs of tomorrow’s data sets – with no single point of failure.”

Matillion ETL for Delta Lake on Databricks Accelerates Time to Insights in Lakehouse Architecture

Matillion, a leading enterprise cloud data integration platform, announced today the general availability of Matillion ETL for Delta Lake on Databricks, enabling data professionals across the business to aggregate and share data in a single environment for improved cross-functional decision-making. Through the powerful, scalable combination of Matillion ETL and Databricks, enterprises access insights from large, dynamic datasets on a unified platform, empowering data teams to take full advantage of modern lakehouse architecture. 

“Enterprises want to compete using data, but struggle to make data useful quickly for projects such as AI/ML and analytics. Matillion ETL for Delta Lake on Databricks makes it easier for enterprises to make their data useful, taking the pain out of manual data ingestion and transformation tasks to enable faster time to insights,” said Matthew Scullion, CEO of Matillion. “We are excited to work alongside Databricks’ popular lakehouse architecture to improve time to value for enterprises.”

Kinetica Offers Geospatial Visualization Within Tableau at Unlimited Scale

Kinetica, the streaming data warehouse that combines historical and streaming data analysis with powerful location intelligence and AI, announced a new Tableau extension. Kinetica’s new extension overcomes previous limitations to visualizing billions of geospatial data points within Tableau, helping customers filter data reflected in real time on their map, track objects through time and space, or detect patterns with heat maps.

“Kinetica was built to treat the unique challenges of geospatial analytics as a priority,” said Nima Negahban, CTO at Kinetica. “For Tableau users to tap into our distributed geospatial engine, and visualize a nearly unlimited amount of location data, represents a breakthrough for geospatial analytics as a part of business intelligence.”

Incorta Ushers in the Modern Data Analytics Platform with Incorta 5, Bringing Faster, Simplified and Advanced Analytics to Business Users

Incorta, the Direct Data Platform, launched Incorta 5, a unified modern analytics platform that extends benefits to every line of business and provides access to crucial data insights while reducing the burden on IT. Expanding on the industry-proven benefits of the Incorta Direct Data Platform™, Incorta 5 provides a unified data experience to help anyone, from C-suite executives to IT staff and data teams, easily collect, interpret and act on complex data in order to gain valuable insights for the business. With little to no training, business users can now have powerful self-driven analytics.

“Anyone who handles enterprise data knows that current technologies don’t provide the seamless self-service experience or agility and velocity that today’s digital businesses require. I have long believed that there has to be a better way to give every decision maker access to data,” said Scott Jones, CEO of Incorta. “Incorta 5 solves that with its revolutionary Direct Data capabilities, essentially cutting out the middleman, and enabling anyone from IT to the C-level to make informed, creative decisions about every element of how they manage their business and the relationships they have with their customers.”

Akkio Launches AI Data Stories Feature to Help Companies Understand their Predictive Models

Akkio, a no-code machine learning (ML) platform for modern sales, marketing, and finance teams, announced it has launched a new feature called Data Stories. Data Stories enable Akkio users to understand the inputs that are most predictive in driving the AI models they create. Data Stories help users overcome the “black box feeling” – the lack of confidence and trust many non-data scientists experience when creating predictive models.

Akkio’s no-code ML platform makes it possible for any technically savvy user – not just data scientists and software engineers – to create and deploy AI predictive models. By making AI easy to use, Akkio opens up its power to thousands of new applications. But these new users need different ways to understand and trust model performance than the highly technical ones data scientists use. That’s where Data Stories come in. Data Stories empower users to see, understand, and take action on their business data like never before.
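Akkio has not published how Data Stories ranks model inputs, but the general idea – surfacing which inputs most drive a model's predictions – can be sketched with permutation importance: shuffle one feature at a time and measure how much accuracy drops. Everything below (the toy model, the synthetic data) is hypothetical.

```python
# Hypothetical sketch of the idea behind ranking predictive inputs,
# via permutation importance. Akkio's actual method is not public;
# the "model" here is a deliberately simple toy.
import random

def model(row):
    # Toy classifier: leans heavily on feature 0, a little on feature 1,
    # and ignores feature 2 entirely.
    return 1 if 2.0 * row[0] + 0.5 * row[1] > 1.0 else 0

def accuracy(rows, labels):
    return sum(model(r) == y for r, y in zip(rows, labels)) / len(rows)

def permutation_importance(rows, labels, feature, seed=0):
    """Accuracy drop after shuffling one feature column in place."""
    rng = random.Random(seed)
    col = [r[feature] for r in rows]
    rng.shuffle(col)
    shuffled = [r[:feature] + (v,) + r[feature + 1:]
                for r, v in zip(rows, col)]
    return accuracy(rows, labels) - accuracy(shuffled, labels)

rng = random.Random(1)
rows = [(rng.random(), rng.random(), rng.random()) for _ in range(1000)]
labels = [model(r) for r in rows]  # labels generated by the toy model

drops = {f: permutation_importance(rows, labels, f) for f in range(3)}
# Feature 0 shows the largest accuracy drop; feature 2 shows none,
# since the model never uses it.
```

The ranking of accuracy drops is exactly the kind of "which inputs drive the model" narrative a feature like Data Stories presents to non-expert users.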

“The amount of data available to marketing, sales, and financial teams is exploding at a furious pace – but making sense of it all is virtually impossible without the assistance of artificial intelligence,” said Abe Parangi, co-founder and CEO of Akkio. “We believe our Data Stories functionality is the first to empower non-expert users to identify the myriad of complex factors that drive key business outcomes, so they can replicate what works and avoid what doesn’t. In essence, Data Stories helps democratize AI, making its valuable insights available to even those without data science expertise.”

AttackIQ Announces Major Platform Innovations to Bolster Informed Defense Architecture

AttackIQ®, a leading independent vendor of Breach and Attack Simulation (BAS) systems, announced a series of technology innovations to the AttackIQ Informed Defense Architecture (AIDA) that allow cybersecurity teams to better test their people, processes, and defensive technologies against advanced, multi-stage attack campaigns. AttackIQ now offers the industry’s only adversary emulation architecture built to test artificial intelligence (AI) and machine learning (ML)-based cyberdefense technologies in production, while emulating comprehensive, multi-stage attacks. 

“To validate cybersecurity effectiveness against real-world threats, organizations need a platform that can emulate the adversary with specificity and realism at every step in the cyberattack process, which is no small feat,” said Brett Galloway, CEO of AttackIQ. “We’ve developed a future-proof series of technology innovations in our kill chain testing that accounts for individual tasks and mimics human adversarial behavior. Now, organizations that are leveraging AI and ML control and detection systems can test their systems with a full-scale, automated platform across the entire kill chain with point-and-click ease of use that’s also completely aligned to MITRE ATT&CK.”

Ahana Announces New Capabilities for Its Presto Managed Service Including Data Lake Caching, Security and Seamless Operations

Ahana, the self-service analytics company for Presto, announced significant updates to its Ahana Cloud for Presto managed service. The major areas of focus include performance, better cluster management, ease of use, and security. One of the key features is a data lake IO caching capability that can dramatically improve query performance, reducing latencies by up to 80%.
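Ahana's IO cache is internal to its managed Presto service, but the general pattern it describes is familiar: keep recently read data-lake blocks in a local cache so repeated scans skip the round trip to object storage. A minimal LRU sketch of that pattern, with hypothetical keys and data:

```python
# Illustrative sketch only -- Ahana's data lake IO cache is internal to its
# Presto service. This shows the general pattern: serve repeated data-lake
# reads from a local LRU cache instead of re-fetching from object storage.
from collections import OrderedDict

class IOCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # key -> bytes, in LRU order
        self.hits = 0
        self.misses = 0

    def read(self, key, fetch):
        """Return cached bytes for `key`, calling `fetch()` on a miss."""
        if key in self.store:
            self.store.move_to_end(key)  # mark as most recently used
            self.hits += 1
            return self.store[key]
        self.misses += 1
        data = fetch()
        self.store[key] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return data

cache = IOCache(capacity=2)
fetch_s3 = lambda: b"parquet-bytes"  # stand-in for a slow object-store read
cache.read("s3://lake/orders/part-0", fetch_s3)  # miss: fetches from "S3"
cache.read("s3://lake/orders/part-0", fetch_s3)  # hit: served locally
```

Because a cache hit avoids the object-store round trip entirely, hit-heavy workloads are where latency reductions on the order Ahana cites become plausible.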

“Our latest innovations make Ahana Cloud the fastest and most advanced cloud native managed service available for Presto,” said Dipti Borkar, Cofounder and Chief Product Officer, Ahana. “We have seen that advanced capabilities like data lake IO caching can improve query performance up to 5x for real-world concurrent query workloads. This, along with our new ease of use, security and cost management advancements, continues to extend Ahana’s leadership in the market, giving data platform teams more value using the power of Presto for faster insights, easier access, and lower costs.”

Fiverr Opens New Vertical Focused on Data Related Services

Fiverr International Ltd. (NYSE: FVRR) announced the launch of its first new vertical in over nine years, dedicated to data-related services. The use of data and analytics is no longer limited to big companies with deep pockets: 67% of small businesses report spending more than $10K a year on analytics. Investing in data science and analytics can help companies make more informed decisions, improve their operational efficiency, and ultimately increase their revenue.

“Small businesses need to feel empowered to take advantage of the kind of data-driven decisions big corporations have been mining for years. However, 57% say they don’t have the right people to manage the process of implementing the solutions,” said Micha Kaufman, Fiverr CEO. “As all kinds of businesses are being forced to shift their operations online, the need for the insights data can deliver becomes even greater. By opening a vertical focused solely on data services, we are providing business buyers the ability to tap into talented data analysts, data scientists and more, on-demand. These skilled freelancers can help implement solutions that let businesses make data-driven decisions to improve, enhance and ultimately grow. This dedicated vertical is enhancing Fiverr’s user experience to meet a growing demand for these types of services.”

Tableau Business Science Brings Powerful Data Science Capabilities to Business People

Tableau, a leading analytics platform (NYSE: CRM), introduced Business Science, a new class of AI-powered analytics that lowers the barrier to data science techniques, enabling business users and analysts to make smarter decisions faster. In a market where agility is the ultimate competitive advantage, Business Science empowers more people with data, with simplified model creation, predictions, what-if scenarios, forecasting and other analytical methods – all using clicks, not code.

Tableau will deliver Einstein Discovery in its 2021.1 update, the first major release enabling Business Science. Integrating Einstein Discovery’s trusted, real-time predictions and recommendations into Tableau will help people go beyond understanding what happened and why it happened, to explore likely business outcomes and inform proactive action. For more than five years, Einstein Discovery has helped Salesforce customers surface insights and understand patterns across millions of rows of data in minutes, without requiring sophisticated data models.

“Data science has always been able to solve big problems but too often that power is limited to a few select people within an organization,” said Francois Ajenstat, chief product officer, Tableau. “To build truly data-driven organizations, we need to unlock the power of data for as many people as possible. Democratizing data science will help more people make smarter decisions faster.”

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 – https://twitter.com/InsideBigData1
