Heard on the Street – 4/20/2023


Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!

Earth Day 2023. Commentary by Glenn Stowe, Product Manager for Geospatial at MariaDB

The global climate crisis is the defining challenge of our time, requiring innovative and collaborative solutions to mitigate its impacts. By leveraging cloud computing, open standards, AI, and the democratization of access to space, we can unlock the full potential of geospatial data to better understand, predict, and mitigate the impacts of climate change. The importance of geospatial data in the fight against climate change cannot be overstated. The increased availability of affordable satellite imagery has led to an explosion of geospatial data, which in turn has facilitated a more comprehensive understanding of climate change.

Earth Day 2023. Commentary by Alex Mans, founder and CEO of FLYR Labs

Advanced forecasting technology can empower airlines to optimize their flight networks, reducing empty seats and ultimately reducing carbon emissions. With the power of artificial intelligence and machine learning, airlines can analyze demand and revenue performance months in advance, allowing them to make the most of capacity and fuel. As a result, airlines can not only generate more revenue and cut costs but also achieve fuller flights, which contribute significantly to a more sustainable future in the skies. While it’s important to pursue sustainable aviation fuel and other green technologies as long-term goals, optimizing route networks and reducing empty seats is a tangible step that airlines can take today toward achieving their sustainability objectives.
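
To make the idea concrete, here is a minimal sketch: estimate a future flight’s load factor from historical data and translate it into expected empty seats. The numbers and the simple averaging model are hypothetical stand-ins for the far richer demand models an airline would actually use.

```python
import numpy as np

# Historical final load factors for the same route/month in prior years (invented).
history = np.array([0.78, 0.81, 0.74, 0.83, 0.80])

# Simple trend-free estimate with a rough uncertainty band.
expected = history.mean()
band = 1.96 * history.std(ddof=1) / np.sqrt(len(history))

capacity = 180  # seats on the assigned aircraft
expected_empty = capacity * (1 - expected)
print(f"Expected load factor: {expected:.0%} (+/- {band:.0%})")
print(f"Expected empty seats: {expected_empty:.0f}")

# If expected empty seats stay high across the season, the network planner can
# downgauge the aircraft or adjust frequency, cutting fuel burn per passenger.
```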

Earth Day 2023. Commentary by Shekar Ayyar, chairman and CEO of Arrcus

5G will enable an array of new services and AI applications that drive sustainability across use cases such as IoT and automation. While 5G could drive an increase in the number of cell towers, in bandwidth, and in power consumption, this can be offset by using software-driven platforms built on open networking hardware rather than purpose-built networking hardware. This approach can sustainably drive digital transformation by supporting multiple applications and use cases on a single network fabric, with increased simplicity and performance, while reducing the network’s footprint and power requirements.

Predictions for RSA. Commentary by Shashi Kiran, CMO of Fortanix

2023 is an era of economic uncertainty and innovation, and I expect this paradox to shape the security space as well, trickling into 2024 and beyond. On one hand, we’ve seen the rise in data breaches and ransomware headlines, including from security companies themselves. On the other, we’ve seen ChatGPT and similar platforms capture headlines for their powerful AI. Couple these with emerging areas such as post-quantum cryptography (PQC), and the future looks both interesting and frightening. I therefore anticipate greater interest in securing data at its core through state-of-the-art cryptographic technologies, as well as in platforms that simplify management of the entire lifecycle (key management, certificates, secrets management, and so on) at scale and across clouds, including by leveraging aspects of confidential computing. We are certainly seeing large-scale interest and believe it is only the tip of the iceberg.
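
As an illustration of “securing data at its core,” here is a minimal sketch of envelope encryption, the pattern behind most key-management platforms: data is encrypted with a per-object data key, and the data key is itself wrapped by a master key that can be rotated centrally. It uses the open-source Python `cryptography` package, not any vendor’s API.

```python
from cryptography.fernet import Fernet

# Master key: in practice this lives in a KMS or HSM, never on disk.
master = Fernet(Fernet.generate_key())

# Per-object data key; only its wrapped form is stored alongside the data.
data_key = Fernet.generate_key()
wrapped_key = master.encrypt(data_key)
ciphertext = Fernet(data_key).encrypt(b"sensitive record")

# Key rotation: re-wrap the data key under a new master key without
# re-encrypting the bulk data itself.
new_master = Fernet(Fernet.generate_key())
wrapped_key = new_master.encrypt(master.decrypt(wrapped_key))

# The decryption path after rotation still round-trips cleanly.
plaintext = Fernet(new_master.decrypt(wrapped_key)).decrypt(ciphertext)
assert plaintext == b"sensitive record"
```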

Transformational Benefits of Big Data. Commentary by Misha Sulpovar, VP of Artificial Intelligence Product at Cherre

A comprehensive big data strategy can have transformational benefits for a business and its bottom line. Yet, to experience the true advantages of big data, companies must continually review and evolve their strategy and ensure they’re embracing emerging trends, including metadata-driven data fabric, artificial intelligence (AI) automation, advances in process maturity, explainability and bias detection, as well as generative AI. Metadata-driven data fabric, which connects a disparate collection of data tools, increases an organization’s agility and powers better decision-making by building in flexibility, infrastructure for modeling, and a richer data set to drive authentic insights. Tech leaders can also accelerate their speed to market by coupling automation like AutoML with tools to monitor drift and check for bias. However, the biggest momentum we are seeing in our general understanding of AI is around generative AI. It will help us unlock the art of the possible, unblock our thinking about applied AI, and serve as a powerful inflection point. Nevertheless, companies cannot simply adopt these solutions and expect a miracle to take place. They must be deliberate and diligent to ensure the decision systems they build are resilient and solve real problems.
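
On the “monitor drift” point, here is a minimal sketch of one common approach: compare a feature’s live distribution against its training-time distribution with a two-sample Kolmogorov-Smirnov test. The feature, data, and alert threshold are all hypothetical.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training = rng.normal(loc=100, scale=15, size=5000)  # feature at training time
live = rng.normal(loc=110, scale=15, size=1000)      # same feature in production

# A small p-value means the live distribution no longer matches training.
stat, p_value = ks_2samp(training, live)
if p_value < 0.01:
    print(f"Drift detected (KS statistic {stat:.3f}): investigate or retrain")
else:
    print("No significant drift")
```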

Preparing for an AI-assisted data science field. Commentary by Michael Grant, Senior Vice President, Enterprise at Anaconda

While the generative AI boom brings much problem-solving potential and excitement about how we will go about our daily lives, business decisions still need human brain power. There are dangers in hitting the autopilot button and entrusting your organization’s data analysis entirely to a black box. Generated models might run afoul of data protection regulations, or include errors that are hard to detect or understand. Of course, generative AI is here to stay, and it has great potential to empower individual users to accomplish tasks faster and more effectively than ever before. In the data science realm, AI can help automate certain data processing tasks, summarize the output of a model, or uncover missed connections and new insights. As we move into an AI-assisted future, finding a balance between artificial and human intelligence will be vital to driving a successful business.

World Backup Day 2023. Commentary by Don Boxley, CEO and Co-Founder, DH2i

World Backup Day is an annual event intended to raise awareness of the importance of data backup and protection. It serves as a reminder for individuals and organizations to take proactive measures to safeguard critical data against unexpected incidents that can result in data loss, such as hardware or software failure, cyber-attacks, natural disasters, and human error. While the exact cost varies with factors such as the size of the organization, the type and amount of data lost, the cause of the loss, and the duration of the downtime, various studies put the cost to organizations at upwards of billions of dollars each year. That’s why, for systems architects and IT executives alike, zero is the ultimate hero. To achieve zero downtime, zero security holes, and zero wasted resources, they are taking a multi-pronged approach to data protection, layering on smart high availability (HA) clustering and software-defined perimeter (SDP) technology that enables them to securely connect and fail over enterprise applications from anywhere to anywhere, at any time. On World Backup Day and all year long, it is critical to remember that businesses that invest in data protection are better equipped to navigate unexpected data loss events, maintain regulatory compliance, and protect their critical assets and reputation. Bottom line: investing in data protection is not just smart, it’s essential for business success.
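
For a sense of what the failover half of that approach looks like in logic form, here is a conceptual sketch (not DH2i’s actual implementation): probe the current primary and promote the next healthy standby when it stops responding. The hostnames and port are placeholders.

```python
import socket

# Replicas in failover-preference order; names are invented placeholders.
REPLICAS = ["primary.db.internal", "standby-a.db.internal", "standby-b.db.internal"]

def healthy(host: str, port: int = 5432, timeout: float = 2.0) -> bool:
    """A bare TCP connect standing in for a real application-level health check."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def pick_active() -> str:
    """Return the first healthy replica; callers never hard-code a node."""
    for host in REPLICAS:
        if healthy(host):
            return host
    raise RuntimeError("no healthy replica reachable; page the on-call")
```

Because the application always asks for the current active node, a failed primary is routed around transparently, which is the “zero downtime” goal in miniature.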

World Backup Day 2023. Commentary by Carl D’Halluin, CTO, Datadobi

Failing to back up your data can have catastrophic consequences, as a single hardware failure, cyber-attack, or natural disaster can wipe out all your valuable information, leaving you with no way to recover it. Years of hard work can be lost in an instant, with no chance of retrieval. Even the cost of losing just a portion of your important data can be immeasurable, with potential financial, legal, and reputational implications that can last for years. Identifying the vital data that requires protection should be the first step in the process. But even if you know and can ‘describe’ what data must be protected, finding it has always been another matter, and you cannot back up what you cannot find. To effectively address this enormous and complicated undertaking, users should look for a data management solution that is vendor-agnostic and can manage a variety of unstructured data types, such as file and object data, regardless of whether they are stored on-premises, remotely, or in the cloud. The solution should be capable of evaluating and interpreting various data characteristics, such as data size, format, creation date, type, level of complexity, access frequency, and other factors specific to your organization. The solution should then allow the user to organize the data into a structure that best suits the organization’s needs, and empower the user to take action based on the analysis; in this case, backing up the necessary data to the appropriate environment(s). And, if necessary, the solution should enable the user to identify data that belongs in a ‘golden copy’ and move it to a confidential, often air-gapped environment. To sum it up: don’t let the nightmare of data loss become your reality – always back up your data.
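
That “evaluate characteristics, then organize and act” workflow can be pictured with a toy sketch: walk a directory tree, read a few of the characteristics mentioned above, and bucket files into backup tiers. The file-type rule and tier names are invented for illustration; real products do this at scale across NAS and object storage.

```python
import time
from pathlib import Path

def classify(root: str) -> dict:
    """Bucket files into backup tiers by type and age (toy rules)."""
    now = time.time()
    tiers = {"golden_copy": [], "standard_backup": [], "archive": []}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        age_days = (now - path.stat().st_mtime) / 86400
        if path.suffix in {".contract", ".ledger"}:  # invented "vital data" rule
            tiers["golden_copy"].append(path)        # destined for air-gapped storage
        elif age_days < 365:
            tiers["standard_backup"].append(path)
        else:
            tiers["archive"].append(path)
    return tiers

for tier, files in classify(".").items():
    print(tier, len(files))
```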

AI in the Enterprise: is 2023 the year of widespread adoption? Commentary by Alexander Hagerup, co-founder and CEO, Vic.ai

The potential of AI has been talked about for years, but enterprise adoption and awareness of its true capabilities have been limited. I believe that’s about to change as the value proposition of AI becomes more tangible and accessible. The progression from augmenting work with AI and ML to relying on software to do the work for us autonomously isn’t anything new. However, we’re likely to see adoption accelerate twofold this year amid the recession and as businesses continue to cut back on labor and other costs. In the mainstream consumer space, technologies like ChatGPT are moving the AI discussion forward and lifting the blinds on what’s possible. Similarly, as early enterprise adopters start to see proven success with AI applications, it’s going to spark an “aha!” moment among key stakeholders and drive higher adoption across the enterprise. This is especially true for more corporate and highly regulated environments like finance departments, where providing accurate and timely financial information to stakeholders, and ensuring compliance with regulatory requirements, is critical. Any technology adopted in these areas must be proven to be reliable, accurate, and trustworthy.

Putting Users First: Zero-Copy Integration Governance Framework changing the game for Canadian Businesses. Commentary by Lewis Wynne-Jones, VP of Product at ThinkData Works

In an increasingly data-hungry landscape, we are witnessing a heyday of applications that consume and serve up data in ways that benefit consumers. The problem is that this naturally leads to a proliferation of application-specific copies of user data. This byproduct of increased digitization creates risk for users and, in the case of data pertaining to health and banking, stops truly helpful applications from being developed, because traditional approaches to sharing such data between producers and consumers offer too little control. Zero-copy integration aims both to improve this process and to unlock new potential. As an architectural approach, the framework eliminates the costly and risky process of backing up individual data stores, in one stroke fixing the broken process of copying data in order to deliver it to consumers. As an innovative approach to governance, the standard lays the foundation for intelligent apps in healthcare, banking, and other industries where data is extremely sensitive and yet sorely needed. None of this happens without modern tech. Protecting the data lake, and developing channels through which applications and consumers can access information within it without creating copies, is the backbone of this framework; novel approaches to data virtualization (i.e., accessing data in a database without copying it) make it possible to establish best practices for backing up data securely while maintaining critical controls on user data. This framework makes it possible to centralize access without centralizing data and represents a crucial evolution in data best practices, leading to better services, improved user security, and a more data-driven future.
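
A tiny, self-contained illustration of the virtualization idea: consumers query a governed view rather than a copy of the underlying table. The schema and columns are invented; real zero-copy deployments apply the same principle across catalogs and warehouses.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (id INTEGER, name TEXT, diagnosis TEXT)")
db.execute("INSERT INTO patients VALUES (1, 'Ada', 'A12'), (2, 'Grace', 'B34')")

# Applications see only this governed view: no name column, and no copy made.
db.execute("CREATE VIEW research_feed AS SELECT id, diagnosis FROM patients")
print(db.execute("SELECT * FROM research_feed").fetchall())

# Changing policy means altering one view definition, not chasing down
# application-specific copies of the data.
```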

How companies can safeguard themselves by working with a distributed SQL database. Commentary by Andrew Oliver, Senior Director of Product Marketing at MariaDB

Surviving a disaster is about preparation, which requires multiple redundancies, backups, and configuration management at the database layer. For cloud applications, it means deploying new technologies, including distributed databases with multiple availability zones and cross-region replication. Incidents caused by human error require both backups and a way to manage and apply configuration, as well as to roll back changes that do not work. Fault tolerance at all of these levels is built into distributed SQL databases, making them one of the best ways not only to perform and scale, but to carry on even when bad things happen.
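
On the client side, cross-region replication is typically paired with connection logic that can reach more than one regional endpoint. Here is a minimal sketch using the open-source `mariadb` connector package; the hostnames, credentials, and database name are placeholders.

```python
import mariadb

# Regional endpoints in order of preference; names are placeholders.
ENDPOINTS = ["db.us-east.example.com", "db.eu-west.example.com"]

def connect():
    """Try each regional endpoint in turn, returning the first that accepts."""
    last_error = None
    for host in ENDPOINTS:
        try:
            return mariadb.connect(host=host, port=3306, user="app",
                                   password="***", database="orders")
        except mariadb.Error as err:
            last_error = err  # region unreachable; fall through to the next
    raise last_error
```

Combined with server-side replication across availability zones and regions, the application keeps working through the loss of an entire region.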

How hybrid AI will benefit regulated industries. Commentary by Emmanuel Walckenaer, CEO of Yseop

Even with Large Language Models (LLMs) such as ChatGPT demonstrating a promising future for AI, there are still issues and circumstances where LLMs cannot be relied on. Within regulated industries, enterprise software must consider fundamental needs like security, transparency, explainability of AI models, and more. The life sciences industry is trusted to bring life-saving drugs to market, and its medical writing teams work with confidential data that platforms like ChatGPT should under no circumstances process, given their unpredictability and potential to produce inaccuracies. Additionally, with LLMs, once information is on an open platform it is available across all users, automatically losing any confidentiality. The newest face of AI is a hybrid model, which combines the best of LLMs and symbolic AI to guarantee accuracy. Combining the models allows the use of trillions of data points and documents while keeping narratives accurate and controllable. An efficient hybrid tool permits regulated industries to process sensitive data and generate the reports the industry depends on.
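
Here is a schematic sketch of that hybrid pattern: every regulated figure comes from deterministic, auditable logic (the symbolic side), while the language model only polishes the wording around it. `llm_paraphrase` is a stand-in for whatever approved, private model an organization might use; the trial data is invented.

```python
def symbolic_facts(trial: dict) -> str:
    """Compute every figure deterministically so it can be traced and audited."""
    rate = trial["responders"] / trial["enrolled"]
    return (f"Of {trial['enrolled']} enrolled patients, "
            f"{trial['responders']} responded ({rate:.1%}).")

def llm_paraphrase(text: str) -> str:
    """Placeholder: a real system would rephrase for fluency, never alter facts."""
    return text

def hybrid_report(trial: dict) -> str:
    return llm_paraphrase(symbolic_facts(trial))

print(hybrid_report({"enrolled": 240, "responders": 87}))  # 36.2% response rate
```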

Investing in AI for Successful Value-Based Care Participation: The EOM Use Case. Commentary by Kathy Ford, Chief Product and Strategy Officer, Project Ronin

Improving patient access and care delivery are top priorities for healthcare leaders in the push for value-based care (VBC). Yet value-based contracts accounted for only 7% of medical revenue among primary care specialties, 6% among surgical specialties, and 15% among nonsurgical specialties, according to a 2022 report. The Centers for Medicare & Medicaid Services (CMS) introduced an array of value-based care models to help address this issue, including the Enhancing Oncology Model (EOM). EOM is a 5-year voluntary model, beginning on July 1, 2023, that aims to improve quality and reduce costs through payment incentives and required participant redesign activities. But care teams and electronic medical records alone cannot satisfy EOM requirements. Health systems must accelerate the adoption of clinical decision-support technologies that integrate safe and ethical AI to fill the gaps in existing capabilities and realize the benefits of value-based care. Despite the push toward VBC and continued investment by health systems to improve their electronic health record (EHR) systems, the full benefit of the data captured in these systems has not yet been realized. This means many organizations lack the capabilities to administer VBC programs and payment models. Accessible, validated AI technology is critical to changing that. Today’s AI can pull data from unstructured clinician notes, accelerate time-consuming chart reviews, and improve care by analyzing data to produce actionable predictive insights. With a robust decision-support platform paired with AI, cancer centers can provide patients with 24/7 access to care teams, streamline patient-to-care-team communications, engage patients, screen for social needs, deliver health education, and identify patients at risk for adverse events. To leverage EOM and capture new revenue without the burden of more software, health systems will need to adopt solutions that incorporate safe and ethical AI tools to accelerate precise clinical decisions in care and rise above their competition.
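
As a toy version of “pull data from unstructured clinician notes,” the sketch below lifts one structured field out of free text with a pattern match. Real systems use clinical NLP models; the note text and pattern here are invented for illustration.

```python
import re

note = "Pt reports fatigue. ECOG 2. Started cycle 3 of FOLFOX on 03/12."

# Extract the ECOG performance status, a common structured field in oncology.
match = re.search(r"ECOG\s*(\d)", note)
if match:
    record = {"ecog_score": int(match.group(1))}
    print(record)  # {'ecog_score': 2} -- now queryable by risk models
```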

The promise, pitfalls and opportunities of generative AI. Commentary by Cristina Fonseca, Head of AI at Zendesk

The biggest opportunity with generative AI lies in its ability to eliminate much of the manual workload that is low value and incredibly time-consuming. Businesses want and deserve to see immediate results on the manual, repetitive work performed today, but that can’t be done with generic models alone. ChatGPT has brought Large Language Models (LLMs) mainstream and made them readily available to everyone. While LLMs are not new, what is truly innovative is ChatGPT’s conversational, chatbot-like user interface, which makes AI easy to understand and use. According to Zendesk’s 2023 CX Trends Report, 72% of businesses say expanding AI across CX will be a main priority in 2023. AI can help CX teams be more consistent, better understand customers, and derive insights from data. If we appropriately apply the technology with the right level of CX strategy, we can find unique opportunities to boost agent productivity and to completely automate customer queries without degrading the customer experience. Additionally, AI insights can help businesses identify knowledge gaps and pinpoint problems before they become large-volume issues. While the benefits are clear, there are some things to keep in mind as companies look to implement LLMs. ChatGPT is great at producing general responses that are highly conversational, but it doesn’t have the context to answer questions about your business – a real problem for customers looking for correct information. Companies must have a mixture of deep CX-specific technology and human agents who remain involved to supervise the AI and mitigate risks. Providing a good customer experience and maintaining customers’ trust is a top priority – businesses cannot lose sight of this while adopting new technology.
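
One common fix for “the model doesn’t have the context to answer questions about your business” is to ground it in the company’s own knowledge base before it answers. The sketch below does the retrieval step with naive keyword overlap and simply assembles the grounded prompt; the help-center articles are invented, and production systems would use vector search and a real model call.

```python
# Invented help-center articles standing in for a real knowledge base.
KNOWLEDGE_BASE = {
    "refund policy": "Refunds are available within 30 days of purchase.",
    "shipping times": "Standard shipping takes 3-5 business days.",
}

def retrieve(question: str) -> str:
    """Naive keyword overlap; production systems use vector search."""
    q = question.lower()
    scored = [(sum(word in q for word in topic.split()), text)
              for topic, text in KNOWLEDGE_BASE.items()]
    return max(scored)[1]

question = "What is your refund policy?"
prompt = ("Answer using only the context below. If the context does not "
          "cover the question, hand off to a human agent.\n"
          f"Context: {retrieve(question)}\nQuestion: {question}")
print(prompt)  # this grounded prompt is what goes to the LLM
```

The explicit human-handoff instruction is the code-level version of keeping human agents in the loop to supervise the AI.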

AI Co-Pilots and Predictive AI Will Transform the Way We Work. Commentary by Artem Kroupenev, VP of Strategy, Augury

We are reaching a point where every profession will be enhanced with hybrid intelligence and have an AI co-pilot that operates alongside human workers to deliver more accurate and nuanced work at a much faster pace. These co-pilots are already being deployed with clear use cases in mind to support specific roles and operational needs, like AI-driven Machine Health solutions that enable reliability engineers to ensure production uptime, safety, and sustainability through predictive maintenance. As AI becomes more accessible and reliable, it will be extremely difficult, and in some cases even irresponsible, for organizations not to operate with these insights, given the accuracy and reliability of the data. Executives are beginning to understand the value of AI co-pilots for critical decision-making and as a key competitive differentiator, which will drive adoption across the enterprise.
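
A heavily simplified gesture at the predictive-maintenance idea: flag sensor readings that drift far from a machine’s own healthy baseline. Real Machine Health systems use much richer spectral features; the vibration data here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
baseline = rng.normal(1.0, 0.05, 500)  # healthy vibration RMS readings, in g

# Live stream: normal operation, then a simulated bearing degradation.
live = np.concatenate([rng.normal(1.0, 0.05, 80),
                       rng.normal(1.4, 0.05, 20)])

# Score each live reading against the machine's own baseline statistics.
z = (live - baseline.mean()) / baseline.std()
alerts = np.where(z > 4)[0]
if alerts.size:
    print(f"Anomaly from sample {alerts[0]}: schedule maintenance before failure")
```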

The Tipping Point for AI. Commentary by Scott Leshinski, SVP of Commercial Expansion, OneStream Software

Historically, AI was a highly specialized skill set that operated as a stand-alone function and depended on bespoke third-party technologies. This operating model made it difficult to connect predictive AI into business planning processes, creating a disconnect between the business and the AI model’s output. In addition, the technical effort to integrate predictive AI technologies into business planning software subjected the business to a high level of technical debt, incurring significant infrastructure and compute costs and ongoing maintenance. Adding to this complexity, producing an efficacious predictive AI model would take months, if not years, and the results lacked the transparency needed to be explainable to the consumer. In contrast, modern approaches to predictive AI incorporate AI and machine learning directly into the technology platform used for business planning, which unifies the data science function with business planning functions such as corporate financial planning and sales and operations planning. In addition, predictive AI should make it easy to graft in external drivers to understand the relationships between these factors and the forecast results. Incorporating these external factors provides real-time signals to the business on changes in external conditions and how they will impact the business. Modern approaches to predictive AI also give end users transparency into which factors meaningfully influence the results produced by AI and machine learning models. This way, businesses know where to focus their resources to maximize revenue or minimize costs. Predictive AI enables businesses to quickly process large volumes of internal and external data and produce a forecast efficiently and accurately at any level of granularity. As the world becomes more data-driven, productized AI and machine learning are necessary for businesses to gain a competitive advantage in a constantly evolving environment.
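
To picture “grafting in external drivers” with transparent influence, here is a minimal sketch: fit a linear model of sales on one internal and one external driver and read the coefficients directly. All data is synthetic; planning platforms embed this kind of modeling, at far greater sophistication, inside the planning workflow.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120  # months of history

promo_spend = rng.uniform(0, 1, n)  # internal driver
fuel_price = rng.uniform(0, 1, n)   # external driver grafted into the model
sales = 5 + 3.0 * promo_spend - 1.5 * fuel_price + rng.normal(0, 0.1, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), promo_spend, fuel_price])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(f"promo effect: {coef[1]:+.2f}, fuel-price effect: {coef[2]:+.2f}")

# The signs and magnitudes tell planners which levers actually move the
# forecast -- the transparency the commentary calls for.
```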

