Heard on the Street – 12/27/2022

Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!

Empowering Engineers through Open Source AI. Commentary by Brian Venturo, co-founder and CTO of CoreWeave

We’re in the embryonic stages of a new AI and machine-learning era, much of which already has, and will continue to have, its roots in the open-source community. EleutherAI influenced much of the early boom we’ve been seeing in open-source AI with the release of GPT-NeoX-20B in January 2022. Stable Diffusion, developed by Stability AI, went live in August 2022 and issued V2 in mid-November; it is the latest high-profile example of what can be accomplished with open-source AI. The advances informed by these early open-source successes have quickly driven the birth and exponential growth of generative AI. The availability of open-source AI gives researchers and engineers access to the best, most cutting-edge models, helping propel AI innovation forward. Building on these rapidly evolving open-source developments, we anticipate accelerating advances in AI as we look ahead to 2023 and beyond. These may include real-time object detection, sentiment analysis and smarter chatbots. The unique beauty of those working on open-source projects is their idealism: unlike many tech advances to date, which have been closed-source and tightly held by large monolithic corporations, open-source contributors create for everyone. It is a community of people passionate about sharing ideas, advancements and best practices in a more democratic way, for the benefit of all. Whether it’s self-driving cars, doctors and nurses providing better patient care or companies giving customers what they want, the open-source AI community will continue to be key to human progress.

Data capture is changing the workforce. Commentary by Marieke Wijtkamp, SVP Product, Librestream 

Over the past year, the Great Resignation has shifted the workforce, leaving many industries with a large talent shortage and a vast knowledge gap, something that became even more apparent in critical industries. The unlikely hero in these circumstances is data, which has the power to transform and improve the way people work. The first stage in creating vital intelligence networks is data capture. With captured data, organizations can automate processes, extract information, and operate efficiently, making effective and widespread knowledge exchange possible. This is especially important in critical industries and for field work, where situations may be hazardous and require an expert-level understanding of processes and operations. Through data capture, organizations can build ‘knowledge networks’: repositories of content from the field, from maintenance logs to step-by-step guidance recorded by subject-matter experts (SMEs) to visual descriptions of processes. Through this network, workers can easily access a wealth of information and specialized knowledge while on the job. Information and knowledge continuity throughout an organization is key; data capture and redistribution are changing the workforce by filling the knowledge gap to promote efficiency, productivity, and most importantly, safety.

Big Data Drives ROI for Frontline Operations. Commentary by Matt Belkin, Chief Executive Officer, Parsable

Manufacturers invest billions into frontline operations, but most industrial leaders can’t get visibility into what’s actually happening on the frontline and how that impacts operational results. Gathering granular frontline activity data helps industrial leaders prove what’s working and what’s not across all frontline processes — production, safety, quality, and more. This is the power of generating big data on frontline work. Frontline activity data is critical for understanding which variables within frontline workflows and processes are causing the inefficiencies inhibiting industrial leaders from reaching their cost, production, and quality goals. Having on-demand access to frontline operation data helps power business intelligence and predictive analytics and can reveal where changes need to be made to increase revenue and decrease costs. By utilizing frontline activity data and taking the appropriate action from the insights generated from the data, companies can achieve productivity and efficiency gains that give them an edge over their competition.

Responsible Artificial Intelligence: Shaping AI to Reduce Inherent Bias and Promote Human Equity. Commentary by Reggie Townsend, Director of the Data Ethics Practice at SAS

With the exponentially increasing use of AI in business and government applications, trustworthy, responsible innovation has occasionally been obscured by unethical applications. In addition to potential threats to personal privacy, AIs trained on data sets built on legacy inequalities or with harmful demographic omissions can amplify these biases at enormous scale. The potential harm that comes with the far-reaching power of AI makes it critical for developers to consider the responsible application of AI at every stage, from ideation to deployment. While the Administration has begun issuing new guidance on AI implementations, we do not yet have a comprehensive set of rules and regulations to guide development across industries, so it’s imperative for enterprises and agencies alike to independently instill the tenets of responsible AI. This approach places humans in the center, with a focus on ethics and transparency to promote human well-being, agency, and equity. To instill trust in these powerful new innovations, the human element remains a primary consideration at every stage in the process.

How AI Can Rescue Customer Service. Commentary by Eli Israelov, Co-Founder and CEO, CommBox

Holiday sales are expected to be up as much as 8% this season, and travel is also expected to return to pre-pandemic levels. This means that amid ongoing labor shortages, it is more clear than ever that companies need to embrace AI to keep up with customer service and communications. But for AI to be a real game changer, and transform customer service from a cost into a growth driver, brands should keep three things in mind. First, companies’ AI-empowered communications need to be available on the platforms consumers are already using to talk to family and friends in daily life—like SMS texting, Facebook Messenger and WhatsApp—and not just on a brand’s website or app. Consumers don’t want to download apps or sign up for accounts; they just want to send a message and have it answered. Second, AI should not be limited to fielding inquiries from customers; it should also enable them to complete tasks, like changing a date on an airline ticket or returning an item of clothing. And finally, AI isn’t just for customer-facing tools; it is also critical for helping burned-out and overworked customer service and call center agents do their jobs. Agents need to be able to automate repetitive tasks and assign jobs to AI virtual assistants. This will allow them to be the powerful human-in-the-loop rather than the human trying to work like a robot. It is clear that in today’s world, where consumers expect tasks to be easy and instant, and employees expect work to be satisfying, using AI to improve experiences on both of these fronts is a key way to rise above the competition.

How e-commerce companies can utilize first-party data to mitigate the impact of economic volatility, especially with an impending recession in 2023. Commentary by Shobhit Khandelwal, Co-Founder of Minoan Experience and Founder of ShyftLabs

As third-party data begins to diminish because of privacy restrictions and regulations, e-commerce businesses are increasingly reliant on their first-party data to understand and grow their business. While many companies are nervous about this, it can actually be a positive to harness your own data and use it to make sound business decisions. For example, a strong first-party data strategy can foster better customer relationships by building accurate customer profiles, which can be used to improve the relationship at every stage of the customer journey. Many businesses understand the value of first-party data when it comes to advertising, but in times of economic volatility, it is important to be able to use the data in a variety of ways to elevate your business. Data can also be used for dynamic pricing, which adjusts the price based on a number of factors, one of which is often customer demand. This is a simple way to maximize profits on a high-demand item without making constant pricing decisions. Another aspect to consider during dire economic situations is customer acquisition and retention. Many customers are in search of positive brand experiences, and data can help. As mentioned, comprehensive customer profiles based on first-party data can establish a connection between the customer and the brand by always being one step ahead – offering convenient bundling options, re-ordering capabilities and more can help the customer feel less marketed to, and more like they are getting value from the brand authentically.
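To make the dynamic-pricing idea concrete, here is a minimal sketch of a demand-driven price adjustment. The function name, the demand-ratio signal, and the floor/ceiling band are illustrative assumptions, not a description of any particular vendor's system; real dynamic pricing typically combines many more signals (inventory, competitor prices, seasonality).

```python
def dynamic_price(base_price: float, demand_ratio: float,
                  floor: float = 0.8, ceiling: float = 1.5) -> float:
    """Scale a base price by observed demand, clamped to a band.

    demand_ratio: recent demand divided by forecast demand
    (1.0 means demand matches the forecast; >1.0 means a surge).
    floor/ceiling bound the multiplier so prices stay predictable.
    """
    multiplier = max(floor, min(ceiling, demand_ratio))
    return round(base_price * multiplier, 2)

# A 30% demand surge raises a $40 item to $52; a demand collapse
# is cushioned by the floor, and a spike is capped by the ceiling.
print(dynamic_price(40.0, 1.3))  # 52.0
print(dynamic_price(40.0, 0.5))  # 32.0 (floor applied)
print(dynamic_price(40.0, 2.0))  # 60.0 (ceiling applied)
```

The clamp is the point: it automates routine price moves while keeping outcomes inside a range a merchandising team has pre-approved, so no one has to make "constant pricing decisions."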

The Next Big Endeavor in Broadcasting is Artificial Intelligence. Commentary by Vinayak Shrivastav, Co-Founder and CEO at VideoVerse

Sports fans now have several ways to watch their favorite teams, from attending events in person to watching at home or on the go from a mobile device—without the need for a TV or cable subscription. Mobile devices, in particular, have increased demand for instant gratification. Many sports fans want to cut to the chase and access real-time highlights, player stats and specific moments. This presents a dilemma for broadcasters: adapt or risk losing fans and ROI. That’s where AI brings a competitive advantage. An AI-driven platform can identify content context, mapping it throughout the broadcast. In a tennis match, for example, AI can identify data aspects such as crowd and player reactions, ball tracking and the ball’s interaction with the racket. It can also identify and tag player interviews in the live broadcast stream with automatic cloud-agnostic metadata tagging. This ensures that the organization of videos is optimized to help fans quickly find the content they want. At the same time, it helps broadcasters boost engagement and realize ROI from different revenue streams, including the metaverse.

Breaking the “data explosion” into 3 key challenges. Commentary by Chris Cooney, Developer Advocate, Coralogix

Over the past decade, the data explosion has threatened to become the greatest constraint on our ability to make data-driven decisions. Our insights are becoming more complex and more difficult to derive, and our competition is becoming even faster. The new barrier is scale: how we can efficiently process all of this new data to generate the insights that will set us apart in the market. We can break the “data explosion” into three key challenges. The first is cost. “Big data” cannot remain as expensive as it is. Cost is already becoming a huge influence on purchasing decisions in data-driven industries, like observability. The second is performance. When queries scan millions of documents, they can slow down dramatically. Slow queries lead to frustrated users and become a bottleneck for insights. Finally, innovation. The greatest blocker to consuming new data is how that data is presented. We need to find new ways to render our information to yield actionable insights that give users the competitive edge they need to lead their industry.

AIOps’ Data Problem. Commentary by Nick Heudecker, Senior Director of Market Strategy & Competitive Intelligence at Cribl

As enterprises grapple with complex and growing IT environments, they’re increasingly relying on AIOps platforms to make these environments observable. With thousands of events per second being generated from systems, it’s impossible for IT staff to react and respond to all of them. AIOps platforms promise a better approach to dealing with this deluge of status changes, alerts, and events. However, these AIOps platforms rely on meaningful data — and that’s where organizations run into challenges. Enterprises are drowning in low-quality data in a huge variety of formats, coming from multiple disparate sources. IT operations teams pursuing an AIOps strategy need to focus less on the models deployed in their platforms, and more on the inputs to the models. They should normalize and structure their logs, events, and metrics as much as possible to ensure a higher-quality input; work with application developers on logging standards, even down to timestamp formats; and, last, enrich data as much as possible with additional context and tagging to yield a better output from their AIOps platforms.
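The normalize-then-enrich pipeline described above can be sketched in a few lines. This is an illustrative example, not Cribl's implementation: the two source schemas, the field mappings, and the enrichment keys are hypothetical stand-ins for the "disparate formats" problem.

```python
from datetime import datetime, timezone

# Hypothetical field mappings for two applications that log the
# same information under different names and timestamp formats.
FIELD_MAPS = {
    "app_a": {"ts": "timestamp", "msg": "message", "lvl": "severity"},
    "app_b": {"time": "timestamp", "text": "message", "level": "severity"},
}

def normalize_event(raw: dict, source: str, enrichment: dict) -> dict:
    """Map a raw event onto a common schema, normalize the timestamp
    to UTC ISO-8601, and attach enrichment context for the AIOps model."""
    field_map = FIELD_MAPS[source]
    event = {canonical: raw[native] for native, canonical in field_map.items()}
    ts = event["timestamp"]
    if isinstance(ts, (int, float)):          # epoch seconds
        ts = datetime.fromtimestamp(ts, tz=timezone.utc)
    else:                                     # ISO-8601 string
        ts = datetime.fromisoformat(ts).astimezone(timezone.utc)
    event["timestamp"] = ts.isoformat()
    event.update(enrichment)  # e.g. {"service": "checkout", "env": "prod"}
    return event
```

Feeding every source through one such function means the downstream model sees a single schema, one timestamp format, and consistent context tags, which is exactly the higher-quality input the commentary calls for.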

Machine learning algorithms to recognize patterns within patient data. Commentary by Austin Jordan, Head of Data Science, AI and ML at Apixio 

The shift toward electronic health records has unlocked oceans of patient data – yet this is just the tip of the healthcare data iceberg. Unstructured data, or free-form text entries, accounts for an estimated 80% of all healthcare data, yet has remained largely untapped. These massive amounts of patient data that cannot be stored in relational databases contain critical information for matching patient symptoms with potential health conditions and assessing risk levels. As healthcare organizations transition from a fee-for-service model to value-based care, they need to find ways to improve outcomes and lower costs. Unstructured data analytics provides that opportunity. By implementing AI technologies, healthcare providers can sift through unstructured data such as medical charts and social worker and behavioral health notes to gain a more complete picture of their patients’ health. With this holistic view of the patient, physicians can make more accurate diagnoses, improving overall patient care.
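To illustrate the kind of pattern matching involved at its very simplest, here is a deliberately reduced sketch that surfaces candidate conditions from free-text notes. Production systems like the one described use trained NLP and machine-learning models rather than keyword sets; the condition names and keywords below are hypothetical examples, not a clinical vocabulary.

```python
# Hypothetical keyword sets standing in for a learned model's features.
CONDITION_PATTERNS = {
    "diabetes": {"glucose", "a1c", "insulin", "hyperglycemia"},
    "hypertension": {"blood pressure", "hypertensive"},
}

def flag_conditions(note: str) -> list[str]:
    """Return conditions whose indicators appear in an unstructured note,
    so they can be surfaced to a clinician for review."""
    text = note.lower()
    return sorted(
        condition
        for condition, keywords in CONDITION_PATTERNS.items()
        if any(kw in text for kw in keywords)
    )

note = "Patient reports elevated glucose; A1C 8.1. Blood pressure 150/95."
print(flag_conditions(note))  # ['diabetes', 'hypertension']
```

Even this toy version shows why unstructured notes matter: neither condition appears as a structured diagnosis code in the note, yet the free text carries enough signal to flag both for a physician's attention.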

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: https://twitter.com/InsideBigData1

Join us on LinkedIn: https://www.linkedin.com/company/insidebigdata/

Join us on Facebook: https://www.facebook.com/insideBIGDATANOW
