Heard on the Street – 1/17/2024


Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Click HERE to check out previous “Heard on the Street” round-ups.

Solving the Unique Challenges of Aerospace Manufacturing with Text Analysis. Commentary by Pietro Cervellera, SVP Global Aerospace & Defense, Altair 

“Unlike other manufacturing industries that are semi- or fully automated, such as automotive or consumer electronics, aerospace still conducts much of its manufacturing manually. With so many individual pieces of equipment being assembled by different departments, and in some cases in different factories, the process creates a vast amount of text entries and documentation. All of this textual data from various sources, including technical documents, quality assurance reports, aircraft catalogs, manuals, and more, makes it difficult to collect the data and build effective data science models. This challenge has kept manufacturers from quickly identifying potential errors, discrepancies, or inconsistencies that lead to product defects. In an industry where the safety of passengers is at stake, the risk is too high to get it wrong, so long delays have persisted in the interest of safety.

But now, newer text analysis technologies can derive high-quality information from human-written text, helping to automate a key piece of the manufacturing process. It’s a fast, straightforward, and effective way to make sense of all the disparate text data the aerospace manufacturing industry churns out and turn it into valuable insights that quickly improve product performance and reliability. In addition to product safety monitoring, text analysis enables companies to monitor and interpret regulatory changes and updates, ensuring timely compliance with the industry’s evolving requirements. It can also complement numerical data analysis by extracting actionable insights from unstructured textual data, enabling manufacturers to make more informed decisions related to production processes, design changes, and resource allocation.

Combined with the advancements in AI and new no-code analysis tools, text analysis has become an invaluable solution. You don’t have to be a data scientist or machine learning expert to reap the benefits of text analysis; organizationally, it packages the data efficiently and makes it easier to visualize. By using text analysis, those vast catalogs of data across departments – from quality assurance, compliance and supply chain management, to maintenance, decision-making and risk mitigation – are now at the fingertips of whoever needs it.”  
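The kind of text analysis described above can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration: the log entries and the defect lexicon are invented for the example, and a production system would use trained NLP models rather than a hand-built keyword list.

```python
import re
from collections import Counter

# Hypothetical QA log entries; in practice these would come from
# maintenance reports, inspection notes, and assembly documentation.
entries = [
    "Fastener torque out of spec on wing panel A3",
    "Surface corrosion noted near fuel line bracket",
    "Wiring harness chafing against bulkhead; re-routed",
    "Fastener missing on access panel; reinstalled and torqued",
]

# Illustrative defect lexicon (an assumption for this sketch).
DEFECT_TERMS = ["torque", "corrosion", "chafing", "missing", "crack"]

def flag_defects(text: str) -> list[str]:
    """Return the defect keywords found in a free-text entry."""
    lowered = text.lower()
    return [t for t in DEFECT_TERMS if re.search(rf"\b{t}\w*\b", lowered)]

# Aggregate across entries to surface the most frequent defect categories.
counts = Counter(term for e in entries for term in flag_defects(e))
print(counts.most_common())
```

Even this toy version shows the value proposition: free-text entries scattered across departments become a ranked, queryable summary of recurring issues.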

A Perspective on Data Marketplaces. Commentary by John Tumminaro, VP Technology, GlobalLogic

“Data platforms and data lakes are old news; nearly every enterprise has one, and many are on their second or third generation. Within these data platforms, enterprises have aggregated “golden” versions of their 360-degree data to enable new downstream insights and advanced analytic data products. These data products have created value within the enterprise and are leveraged to create value outside the enterprise as products, available to clients and partners via their own APIs or a variety of data marketplaces.

Marketplaces can be convenient mechanisms to syndicate an enterprise’s data product to get broader market visibility than otherwise. Could a product be sold on its own website with its own branding? Absolutely. But could that same product get more visibility and higher sales on Amazon? Very likely, yes.

While selling data products via marketplaces comes with a cost in the form of revenue share, it can be well worth it if sales increase. As with any good product that provides value to consumers and garners positive ratings, the revenue share can become a non-issue.

A more challenging aspect of data marketplaces is the maturity of your data governance capability. Do you know the lineage and licensing aspects of every bit of data that made up your data product? Is the data within your aggregated/transformed/enriched data product yours to sell?

Some enterprise data products could be composed of several upstream sources, including commercially licensed data, open source data, tenant owned data, etc. From a legal and compliance perspective, it is crucial to understand what compositions are allowed to be sold as downstream data products. It’s easy to compose data sets and develop interesting downstream data sets, but without careful data governance processes and review, it might be impossible to publish such data products in data marketplaces due to licensing/compliance restrictions.

Fear not, here is this author’s simple recipe for success: 1) Dream up your awesome data product that consumers will love 2) Involve mature data governance from the beginning of product planning 3) Publish your awesome data product on all the fashionable data marketplaces you can 4) Make millions!”
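Step 2 of the recipe — involving data governance early — can be made concrete with a simple pre-publication check. The sketch below is purely illustrative: the license names and the "sellable" rule are assumptions for the example, not legal guidance or any marketplace's actual policy.

```python
# Licenses assumed (for this sketch) to permit resale of derived products.
SELLABLE = {"commercial", "cc-by", "public-domain"}

def can_publish(sources: dict[str, str]) -> tuple[bool, list[str]]:
    """sources: {source_name: license}. Returns (ok, blocking_sources)."""
    blockers = [name for name, lic in sources.items() if lic not in SELLABLE]
    return (not blockers, blockers)

# A composed data product built from several upstream sources.
product = {
    "sales_360": "commercial",
    "open_geo": "cc-by",
    "partner_feed": "cc-by-nc",  # non-commercial license: blocks publication
}
ok, blockers = can_publish(product)
print(ok, blockers)
```

Running a check like this against the full lineage of a data product is exactly the review the commentary argues must happen before, not after, the product reaches a marketplace.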

Big tech is struggling to turn AI hype into profit — for now. Commentary by Raj De Datta, Co-Founder and CEO of Bloomreach

“Companies building massive AI applications are not going to turn a profit any time soon, but that’s not necessarily a problem. These companies have the cash balances to make sizable investments in their models without returns. Take ChatGPT, for example. Each successive iteration has an order of magnitude more parameters. Answer quality and relevance are improving rapidly, but as the AI ingests more information, it’s becoming more expensive to generate those responses. Yet these companies will continue to fight their way through this and run losses for a very long period of time, until economies of scale drive costs down. For now, increased costs are the tradeoff for scaling models and improving output quality.”

Quantum’s Role in Driving AI Acceleration. Commentary by Christopher Arrasmith, SVP of Enterprise Computing Solutions at Unisys

“Artificial Intelligence (AI) gives companies the opportunity to unlock new value by finding the right answers to complex problems hidden in structured and unstructured data. While AI provides insight into how to take action, one thing limiting its use for generating significant outcomes is the processing of the massive amounts of data required to run accurate AI-powered applications. Running data through traditional computing systems limits organizations’ ability to solve highly complex data-based challenges in time to make a positive and substantive impact on the business. To solve these data-rich challenges, pairing hybrid compute architectures, such as quantum computing, to process the data with AI to analyze and act on it removes some of the inherent delays in classical data processing, generating answers faster so the business can put them to use. In some cases, this compression enables near real-time decision making. This combination of quantum computing techniques and AI unlocks segment-specific use cases in industries such as airlines, logistics, and transportation, where traditional computing and AI cannot instantaneously solve complex data analysis problems that have transformative business impact.

With specific, large problem sets that AI alone cannot solve quickly, the air cargo industry is one of the first to experience the power of pairing quantum computing with AI to bring near real-time analysis and optimization to cargo management. Cargo shipping has become an increasingly important revenue driver for airlines, and the need to optimize capacity, minimize fuel costs, and reduce claims is critical to a carrier’s bottom line, which also faces increased pressure from rising supply chain costs and growing customer experience demands. In this industry, reams of historical and real-time data exist to help airline operations teams make cargo analysis decisions. By pairing quantum computing and AI, companies can analyze cargo data in near real time to optimize utilization of cargo space, select the most efficient routes for specific containers and pallets, and reduce claims and damages. Sample use cases such as these in aviation illustrate the promise of quantum and AI as a winning combination for creating transformational outcomes in other industries.”
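The cargo-space optimization described above can be illustrated, in a deliberately tiny classical form, as a 0/1 knapsack problem: choose which containers to load to maximize revenue within a weight limit. The shipment data below is invented for the example; real carriers face far larger, multi-constraint versions of this problem, which is where quantum-accelerated solvers are being pitched.

```python
def best_load(containers: list[tuple[int, int]], capacity: int) -> int:
    """containers: (weight, revenue) pairs; returns the max total revenue
    achievable without exceeding capacity (classic 0/1 knapsack DP)."""
    best = [0] * (capacity + 1)  # best[w] = max revenue using weight <= w
    for weight, revenue in containers:
        # Iterate weights downward so each container is loaded at most once.
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + revenue)
    return best[capacity]

# Hypothetical shipment: (weight in tons, revenue in $k).
shipment = [(3, 40), (4, 50), (2, 30), (5, 70)]
print(best_load(shipment, capacity=9))
```

The dynamic-programming approach is exact but scales with capacity times item count; the commentary's argument is that realistic instances, with routing and damage-risk constraints layered on, outgrow this kind of classical exhaustiveness.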

AI / Machine Learning. Commentary by Miroslav Klivansky, Analytics and Global Practice Lead at Pure Storage

“As a whole, the bar for understanding and harnessing the full value of AI is still low, but it won’t be for long as market pressures continue to accelerate AI adoption. The future of enterprise AI will center on AI being built into the products and services already in use. But as AI innovation evolves, we’ll see enterprises learn to build their own in-house AI data platforms and move part of their workflows onto their own infrastructure. For enterprises that want to get ahead of the curve, it’s critical that they start investing in building their in-house expertise now. A central ‘center of excellence’ for AI and data science will be more beneficial than individual AI projects scattered around the company.”

How Generative AI Will Impact Workforce Management. Commentary by Mitri Dahdaly, VP of Products at Legion Technologies

“Generative AI is poised to revolutionize workforce management and software engagement in general. All functions and deep expertise within a workforce management platform will be easily accessible to users through a simple conversation – no learning curve, no complicated interfaces. The combination of natural language processing and intelligent automation means requests will be understood and instantly executed. It will enable everyone to have a personalized, knowledgeable, and actionable virtual assistant who can amplify their skills and productivity.

A generative-AI-based personal assistant will serve as an in-house workforce management policy expert and enable managers and employees to get instant answers to questions about their organization’s workforce management policies and software questions. Managers can get instant answers to general labor compliance questions based on up-to-date information from curated trusted .gov sources. The virtual assistant will also be able to act as a data expert, providing a summary of weekly schedules, schedule execution, and more. One of the most exciting developments is that the virtual assistant can act as an actionable agent that carries out workforce management actions such as adding or editing shifts, which will increase manager mobility and efficiency.”
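The "actionable agent" pattern described above is commonly built as an intent dispatcher: a language model emits a structured intent, and a registry maps it to a workforce action. The sketch below is a generic illustration of that pattern; the intent names, handlers, and schedule store are hypothetical, not Legion's actual API.

```python
from typing import Callable

schedule: dict[str, list[str]] = {}  # toy in-memory schedule store

def add_shift(day: str, employee: str) -> str:
    """Add an employee's shift to a day's schedule."""
    schedule.setdefault(day, []).append(employee)
    return f"Added {employee} to {day}"

def summarize(day: str) -> str:
    """Summarize how many shifts are scheduled for a day."""
    return f"{day}: {len(schedule.get(day, []))} shift(s)"

# Registry mapping intent names (as an NLU layer might emit them) to handlers.
HANDLERS: dict[str, Callable[..., str]] = {
    "add_shift": add_shift,
    "summarize_day": summarize,
}

def dispatch(intent: dict) -> str:
    """Execute a structured intent like {"name": ..., "args": {...}}."""
    handler = HANDLERS[intent["name"]]
    return handler(**intent["args"])

print(dispatch({"name": "add_shift", "args": {"day": "mon", "employee": "Ana"}}))
print(dispatch({"name": "summarize_day", "args": {"day": "mon"}}))
```

Keeping the actions behind an explicit registry, rather than letting the model call arbitrary code, is what makes the assistant "actionable" while still auditable.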

How Data Became a Gatekeeper to Creativity. Commentary by Keith Pitt; Founder and CEO of Buildkite

“Companies are becoming wary of taking chances, searching for data to validate every decision before it is made. While data is a useful tool, developers have experience and instincts that can yield innovative solutions beyond improving existing products.

If an engineering CEO were to ask their customers what they would like to see from their organization, the customers would likely list ways to improve current products. When it comes to pushing past current products and innovating new offerings, customers struggle. Breakthrough products are not pulled from customer focus groups; rather, they come from developers using years of expertise to pioneer new solutions to customer issues.

The era of generative AI and automated testing has pushed many to think that software development consists of hard skills and understanding code. However, creativity, now more than ever, is imperative to develop breakthrough products and push boundaries. Organizations should use data to help inform elements of the development process, but it should not come at the expense of creative thinking and the developer’s expertise. To breed innovation, organizations need to give their engineers the autonomy and trust to develop creative solutions.” 

How the AI services of Big Tech providers are failing users and the path forward. Commentary by Josh Mesout, Chief Innovation Officer, Civo

“Today’s AI market is not working. The dominance of AI solutions run by the cloud computing giants leaves customers with limited choice that under-delivers for them at every turn. Users are struggling with AI & machine learning (ML) services that are complex, expensive to run, and generally out of reach for smaller businesses. The status quo has failed: Civo research found that 48% of developers find ML projects highly time-consuming to run.

The path forward lies with open source solutions that lower the barrier to entry and enable smaller businesses to leverage AI. If we hope to achieve a responsible and equal future for AI, providers must focus on giving businesses accessible solutions and support – not on putting the interests of shareholders first.

The current dominance of Big Tech in the market is something that should concern everyone genuinely motivated to see AI succeed. The future has to be a level playing field, where any business – no matter their size – can innovate and deliver cutting-edge solutions using AI & ML.”

Organizations Can Overcome Bad Master Data With an MDM Overhaul. Commentary by Danny Thompson, CPO at apexanalytix

“Master data is the foundation of business success, but many companies struggle to control the sheer volume of information – let alone keep it clean, secure and organized on a regular basis. Poor data quality comes at a price, though. Bad master data costs companies significant time, money and reputational damage, and it doesn’t improve on its own. Those wanting to take control of their master data must re-evaluate and reconstruct their master data management (MDM) programs.

Doing so starts with understanding the gaps and errors in current datasets, and then implementing clean data processes moving forward. As many organizations quickly realize, this is no easy feat and often leads them to cut corners or abandon their endeavors altogether. Fortunately, businesses can rely on automated tools to clean, enrich and organize the data, leaving them with a complete, accurate and up-to-date master dataset. Effective MDM takes time, effort and expertise, but organizations that achieve it will reap the benefits. With the right partner on board, companies can drastically transform operations and drive significant revenue growth.”

What do the new AI guidelines look like and how will they be implemented? Commentary by Wendy Gonzalez, CEO of Sama

“They are a set of voluntary guidelines designed to help companies make informed decisions about the design, development, deployment and operation of AI systems. These guidelines were developed in collaboration with 18 governments (including the US and UK, but not other major AI-developing countries like China) and some AI companies. The objective is to build AI systems that function as intended, are available when needed, and work without revealing sensitive data to unauthorized parties. 

Overall, they are a generic set of standards that are mostly already implemented by AI companies. They set a low threshold for the industry and have no enforcement mechanisms, meaning there will be no material impact.

These guidelines are already best practices for companies building AI, so they will be easy for companies to implement and say they “comply.” Ultimately, however, they will not solve the problems that the AI industry is currently facing around ethical development and usage of models and the data that powers them, nor do they diminish the potential misuse of AI.

Companies that don’t adhere to these standards won’t face any consequences legally yet (although they may lose out on valuable contracts from governments who want their vendors to follow them), and it will continue to be nearly impossible to determine liability in the event that AI makes a mistake or is misused.”

