Big Data Industry Predictions for 2023


Welcome to insideBIGDATA’s annual technology predictions round-up! The big data industry has significant inertia moving into 2023. In order to give our valued readers a pulse on important new trends leading into next year, we here at insideBIGDATA heard from all our friends across the vendor ecosystem to get their insights, reflections and predictions for what may be coming. We were very encouraged to hear such exciting perspectives. Even if only half actually come true, Big Data in the next year is destined to be quite an exciting ride. Enjoy!

[NOTE: please check back often as we’ll be adding new content to this feature article into February 2023]

Daniel D. Gutierrez – Editor-in-Chief & Resident Data Scientist


Multi-Cloud Analytics. There are many reasons why a customer would choose to implement their architecture on multiple clouds, whether technology-, market-, or business-driven. When this happens, transactional and operational data often ends up stored on multiple cloud platforms. The challenge this brings is how to gain insight into this data without resorting to implementing multiple disparate data platforms. Historically, data virtualization tools have been introduced to solve this style of problem, but it gets challenging when working across cloud environments. We are seeing increased emphasis from vendors on this message (Google’s BigQuery Omni is one example) and expect customer adoption to pick up in order to quickly unlock value across data platforms without having to perform migrations. – Brian Suk, Associate CTO at SADA
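The pattern Suk describes, a single query surface over data living in separate stores, can be sketched with two attached SQLite databases standing in for warehouses on different clouds. This is a toy illustration, not BigQuery Omni's actual API; all table and schema names are invented:

```python
import sqlite3

# One "virtualization" connection with two attached stores, each standing in
# for a warehouse on a different cloud provider.
con = sqlite3.connect(":memory:")
con.execute("ATTACH ':memory:' AS cloud_a")
con.execute("ATTACH ':memory:' AS cloud_b")

con.execute("CREATE TABLE cloud_a.orders(order_id INTEGER, customer_id INTEGER, amount REAL)")
con.execute("CREATE TABLE cloud_b.customers(customer_id INTEGER, region TEXT)")
con.executemany("INSERT INTO cloud_a.orders VALUES (?, ?, ?)",
                [(1, 10, 99.5), (2, 11, 15.0), (3, 10, 42.0)])
con.executemany("INSERT INTO cloud_b.customers VALUES (?, ?)",
                [(10, "EMEA"), (11, "APAC")])

# A single federated query joins across both stores without migrating data.
rows = con.execute("""
    SELECT c.region, SUM(o.amount)
    FROM cloud_a.orders AS o
    JOIN cloud_b.customers AS c USING (customer_id)
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # revenue per region, computed across both "clouds"
```

Real data virtualization layers add pushdown, caching, and credential handling on top of this basic idea, but the shape of the query is the same.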

Augmentation of data quality, data preparation, metadata management and analytics. While the end result of many data management efforts is to feed advanced analytics and support AI and ML efforts, proper data management itself is pivotal to an organization’s success. Data is often called the new oil, because data- and analytics-based insights are constantly propelling business innovation. As organizations accelerate their usage of data, it’s critical for companies to keep a close eye on data governance, data quality and metadata management. Yet, as the volume, variety, and velocity of data continue to grow, these various aspects of data management have become too complex to manage at scale. Consider the amount of time data scientists and data engineers spend finding and preparing data before they can start utilizing it. That is why augmented data management has recently been embraced by various data management vendors: with the application of AI, organizations are able to automate many data management tasks. According to some of the top analyst firms, each layer of a data fabric (data ingestion, data processing, data orchestration, data governance, etc.) should have AI/ML baked into it, to automate each stage of the data management process. In 2023, augmented data management will find strong market traction, helping data management professionals focus on delivering data-driven insights rather than being held back by routine administrative tasks. While these are the five most important trends in our mind, there are other areas of data and analytics practice that will shape how digital business will not only survive but thrive in 2023 and beyond. The last two to three years have definitely taught us that digital business is not merely a fallback option when the world cannot meet in person, but where the future lies.
Hopefully your organization can gain some insights from this article as you lay out your digital business plan. – Angel Viña, CEO and founder of Denodo

Not only will companies rely on analytics to track their productivity, down-time, budget, and performance KPIs, but companies will also solicit new areas to implement analytics for better tracking and precision. As the saying goes, you can’t manage what you can’t measure – and especially in today’s volatile economic climate (e.g. inflation, supply chain, geopolitical tensions), business leaders are increasingly keen to track as many facets of their business as possible so they can do more with less and become more adaptable. I’d predict that companies who already use analytics will be eager to use even more, and companies/industries that haven’t traditionally been as dependent on analytics will jump into the game. – Eddy Azad, CEO, Parsec Automation

Data analytics are increasingly being asked to drive greater data freedom in the multicloud: Data freedom is data’s ability to be accessed and to move, and therefore the freedom to deliver insights and the business value that comes from these insights. It means a frictionless flow—unobstructed by structural, economic, and other barriers. One way in which analytics can drive data freedom is via open data lake architectures that leverage open-source frameworks. – Vamsi Paladugu, Senior Director, Lyve Cloud Analytics at Seagate Technology

More knowledge workers in non-data science roles will turn to advanced analytic techniques: Thirty years ago, a marketing professional didn’t need to know about the internet. Fast forward to today and it’s an integral part of just about any job, including marketing. Analytics is on a similar trajectory, and that will become even clearer in the coming year. We’ll see more companies enable knowledge workers throughout their business to garner impactful insights from their data. The combination of no-code/code-friendly cloud analytics solutions and increased investments in data literacy and upskilling will take data analytics out of the realm of specialty roles and into the entire organization. – Alan Jacobson, Chief Data & Analytics Officer at Alteryx

Data Analysis Will Be Continuous: Data warehouses are becoming “always on” analytics environments. In the years ahead, the flow of data into and out of data warehouses will be not only faster, but continuous. Technology strategists have long sought to utilize real-time data for business decision-making, but architectural and system limitations have made that a challenge, if not impossible. Also, consumption-based pricing could make continuous data cost prohibitive. Increasingly, however, data warehouses and other infrastructure are offering new ways to stream data for real-time applications and use cases. Popular examples of real-time data in action include stock-ticker feeds, ATM transactions, and interactive games. Now, emerging use cases such as IoT sensor networks, robotic automation, and self-driving vehicles are generating ever more real-time data, which needs to be monitored, analyzed, and utilized. – Chris Gladwin, CEO and Co-founder of Ocient
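A minimal sketch of the continuous-analytics idea Gladwin describes: instead of running batch queries over a finished table, an aggregate is maintained incrementally as each event arrives. Pure Python, with a short list standing in for a live ticker feed:

```python
from collections import deque

def rolling_average(events, window=3):
    """Consume an event stream and yield a rolling average after every tick,
    so downstream consumers always see an up-to-date value."""
    buf = deque(maxlen=window)  # only the last `window` events are retained
    for price in events:
        buf.append(price)
        yield sum(buf) / len(buf)

ticks = [100.0, 101.0, 99.0, 102.0]  # stand-in for an unbounded ticker feed
for avg in rolling_average(ticks):
    print(avg)
```

Because the generator is lazy, the same code works unchanged whether `ticks` is a four-element list or a socket that never closes, which is the essential property of "always on" analytics.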

Google considers shedding Google Analytics: In the past few years, Google Analytics has been a thorn in the side of Google. Google Analytics continues to cause legal and privacy issues for Google and may eventually put Google’s advertising revenue (its “cash cow”) at risk. The marketplace has spoken and told Google that privacy is important and that website/app users don’t want to be tracked by Google. Add to this the fact that the latest version of Google Analytics (GA4) has been perceived by the marketplace as a flawed product. If Google Analytics continues to subject the larger Google to legal scrutiny and puts its advertising business at risk, there is a possibility (albeit a remote one) that Google spins off the GA unit so that it is no longer connected to Google. If that were to happen, Google Analytics would have a difficult time surviving since it likely loses money on an annual basis and is currently supported by the Google Ads business. – Adam Greco, product evangelist, Amplitude

Organizations will need to shift to confidential computing analytic platforms that do not compromise data security: The massive organizational need to protect data throughout its lifecycle will lead to the rapid adoption of confidential AI and analytics platforms that enable data analysts and machine learning practitioners to securely analyze data without ever having to expose it unencrypted during processing. The adoption will be driven by the rise of business use cases that mandate confidential analytics on sensitive data and the hefty costs associated with a data breach or failure to meet data privacy regulations and compliance policies. – Rishabh Poddar, CEO and Co-Founder of Opaque Systems

Data & Analytics ROI is Key Metric: Data and analytics continue to be the focus of IT transformation for many organizations, but as the market evolves – and in an environment of economic uncertainty – most will demand that their investments in analytics show a path to a clear return on investment in the near-term. Lessons learned during the pandemic and from the experiences of organizations that are already a good distance down the analytics road, show that producing faster, more accurate business intelligence is an attainable goal. That puts an emphasis on production rather than proof-of-concept, and ROI is a key metric for enterprises to consider before adopting any analytics platform or product. – Kyligence‘s CEO Luke Han

SaaS Free Forever is the New Open Source in 2023: When open-source software hit the data analytics scene, interest and adoption skyrocketed as organizations cited the benefits of cost effectiveness, speed and agility, community, and avoiding vendor lock-in. What most companies learned was that many projects required extensive setup, integration, and maintenance that slowed both innovation and migration to the cloud. SaaS models will continue delivering on the promise of speed and agility, while reducing switching costs. Emerging Free Forever SaaS models will further make these offerings cost effective and facilitate robust developer communities. – Nima Negahban, CEO and Cofounder, Kinetica

Data and Analytics Will Knock Down Silos, Providing a Holistic Look at the Customer Journey: New contact center technologies are emerging that knock down silos and walls and offer an end-to-end look at the holistic customer journey. This will change the customer experience as we know it. For instance, in a traditional environment, an end-user customer may visit a company’s website, then call in to make a purchase or resolve an issue via an agent. Instead, in a modern contact center environment, the conversational AI agent will already know the end user has been on the company’s website and immediately present them with customized offers based on cookies that recreate their browsing journey. Data and analytics will evolve to create that holistic customer journey coupled with software that helps organizations shorten the time needed to serve their customer – all while being smarter about it. – Waterfield CEO Steve Kezirian 

2023 will see the shift from traditional business intelligence to modern business intelligence: The data analytics space is advancing with AI/ML, driving faster decision making and allowing companies to focus on their success through business intelligence. As the implementation of automation increases, business intelligence will advance as well. For example, to scale and democratize automation, robotic process automation will evolve into an as-a-service offering, allowing enterprises and small businesses alike to take advantage and automate business processes to move the business forward. – Ali Ahmed, GM, Enterprise Applications at Cloud Software Group

Teams use self-service analytics to revolt against the HiPPO: According to IDC research sponsored by Heap, 69% of digital product teams reported that decisions were often driven by the highest paid person in the room (HiPPO). Letting HiPPOs base critical decisions on instinct can lead to volatility in an organization, and with the state of the current economic market, it can be catastrophic. In 2023, more teams will revolt and use data to fight executive “gut” decisions. Self-service analytics tooling will become more readily available on the market to better drive investments and business results. The same IDC survey revealed that organizations that rely on data had a 2.5X increase in overall business outcomes. – Rachel Obstler, EVP of Product at Heap

Dashboards are Dead: Experienced business intelligence and data engineers have long been aware that dashboards–often counted as a business asset–tend to generate a great deal of technical debt that accrues over time, hampering performance. That’s because, for all the ease-of-use glitz and democratization acclaim, dashboards are just dirty tools for connecting data silos. With enterprises rushing to adopt public cloud applications and services, using business intelligence dashboards for managing processes and reviewing results will create more problems than they solve. But while BI dashboards die off, the metrics store concept will displace them as the preferred method for efficiently managing data on one platform across the entire enterprise. – Kyligence‘s CEO Luke Han

Tighter integration of managed cloud services and object storage will deliver better AI/ML and analytics performance: Application vendors will publish their own extended storage APIs for enhanced monitoring, reporting, performance acceleration and optimal data placement. This will be embraced by leading object storage solutions to deliver even more compelling solutions and ROI for enterprise and mid-market customers in data protection (backup and ransomware protection), big data analytics and AI/ML. In addition, because customers value cloud-like storage services but show a preference for them from the comfort of their own data center infrastructure, we will see increasing partnerships between object storage vendors and large OEMs or managed service providers (MSPs) to provide fully integrated, private-cloud S3 storage-as-a-service offerings. – Paul Speciale, CMO, Scality 

Companies should adopt real-time analytics in 2023: In 2023, companies should focus on making real-time analytics the center of all their activities. We are in the Industrial Revolution 4.0 (4IR), a time when the boundaries between the physical, digital, and biological worlds are blurred, with data being a shared output of all three. Data in the 4IR is fast, accessible and immediate — it’s modern data. A difference of milliseconds can drive customers from your organization to a competing one. Every industry needs to focus on becoming an “on-demand” service provider to meet rising customer expectations. As real-time becomes the mainstream expectation in 2023, businesses need to reevaluate their data strategies to include real-time applications that allow for scale and will enhance customer experiences. Data movement causes latency, which is why a unified database will prevail – a singleverse. Real-time will not only play an important role for businesses, but the world as a whole will benefit if transactions happen in real time across all functions, including banking, healthcare and cybersecurity. In order to be victorious in the new year, companies should reimagine their data strategies to meet the real-time demands of society. – Raj Verma, CEO, SingleStore

Artificial Intelligence

AI model explainability stays hot. As AI becomes increasingly important in the lives of everyday people, more people want to know exactly how the models work. This is being driven by internal stakeholders, consumers, and regulators. – Anupam Datta, co-founder, president and chief scientist at TruEra 

AIOps is placing more emphasis on cyber asset management for tagging and classification of assets – AIOps is on the rise as companies embrace automation to help with alert management and auto resolution of issues to maximize operational reliability and uptime. Along with this, we’re seeing a rise in advanced tagging and metadata management of assets to ensure AIOps algorithms can manage these assets effectively in automated processes. – Keith Neilson, Technical Evangelist at CloudSphere 

Formal regulation of AI use in the U.S. U.S. regulatory agencies have been studying the challenges and impacts of AI, but have yet to make a significant move, unlike the European Commission. I anticipate that to change in 2023, with the U.S. finally drafting its own rules at the federal level, similar to those already in effect in the EU and Asia. Guardrails are good for everyone in this market, and will ultimately help build trust in AI. U.S. regulations are not far off, and businesses should get ready. – Anupam Datta, co-founder, president and chief scientist at TruEra

We will see an AI Maturity divide, in which there will be AI Haves and Have Nots; the economic situation will exacerbate that. Modern data architecture will be the underpinning of that split. – Peak’s CEO Rich Potter

AI Becomes Democratized. While AI has traditionally been seen as a complicated and challenging innovation, in 2023, artificial intelligence will be spread to a wider user base that includes those without specialized knowledge of AI. This change will put the power in the hands of customers, not just developers. Companies will seek self-service tools to create their own custom machine learning models that examine business-specific attributes. – Monish Darda, Co-Founder and Chief Technology Officer at Icertis

Companies will address looming AI Regulations with Responsible AI. Governments in the EU and US plan to impose new regulations to protect consumers (i.e., the EU’s liability rules on products and AI and the White House’s AI Bill of Rights). Surprisingly though, many organizations view AI regulation as a boon rather than a barrier to success: more than half (57%) of companies regard AI as a critical enabler of their strategic priorities. 2023 will see many organizations shifting from a reactive AI compliance strategy to a proactive development of Responsible AI capabilities – in order to build solid foundations for adapting to new regulations and guidance. – Accenture

More debate about AI and bias. Is AI a friend or foe to fairness? In 2021 and 2022, people were concerned that AI was causing bias, due to factors such as bad training data. In 2023, we’ll see a growing realization that AI can help eliminate bias by bypassing the historical points where bias came into play. People are often more biased than machines. We’re starting to see ways that AI can reduce bias rather than introduce it. – Anupam Datta, co-founder, president and chief scientist at TruEra

Geopolitical shifts will slow AI adoption as fear and protectionism create barriers to data movement and processing locations. Macroeconomic instability, including rising energy costs and a looming recession, will hobble the advancement of AI initiatives as companies struggle just to keep the lights on. – Peak’s CEO Rich Potter

Focus on ethical AI practices. In 2023, organizations will focus on eliminating bias from automated decision-making systems. The development of ethical and explainable AI models has been a priority for Icertis in recent years. Now, with the release of the blueprint for an AI Bill of Rights, the technology industry as a whole will work to eliminate unfairness in AI. The machine will never have all the data, which is why keeping the human in the loop is so important. – Monish Darda, Co-Founder and Chief Technology Officer at Icertis

Augmented Data Management: Augmented data management will rise in importance as AI becomes more integrated with data quality, metadata management, and master data management. This means that manual data management tasks will be lessened thanks to ML and AI developments, which enable specialists to take care of more high-value tasks. – Founder and CEO of Nexla, Saket Saurabh

AI will prove recession-proof. The demand for AI and other kinds of resource-intensive workloads and applications requiring specialized and high-performance computing platforms WON’T slow down because of economics, and will, in fact, increase. AI has become critical to almost every business, government agency, and organization on the planet. And its importance grows every day. But AI will add complexity to IT decisions – see Prediction 1 – and will compel CIOs to execute strategies that include a mix of cloud repatriation of AI apps and more complex multicloud, multiplatform distributed architectures. – Holland Barry, SVP and Field CTO, Cyxtera

Businesses will have the ability to use AI within their organization to better suit their individual, specific business needs. One of the biggest trends that we’re going to see in the AI space in 2023 will be the shift from AI being an artisanal sport for data scientists and quant jockeys to more of an industrialized, embedded construct where actual business users are able to start using and working with algorithms. It will no longer be strictly the domain of data scientists, and it is going to move away from the standard, laboratory-style black-box construct. We’re really going to start to see more industrialization within these programs. You have all of these models – the kind you’ve done POCs on and have figured out the value of – and now you are putting them out into the field. It’ll be more focused on letting business users figure out how to use AI in a very non-prescriptive way. To put it in other terms, you’re sitting down and saying, “okay, here are the things that AI can do: predictive analytics, anomaly detection, etc.,” and now you’re putting it into the hands of first-time users. By eliminating these data silos and putting AI directly into the organization, we can enable information to be more democratized within the organization. This will also benefit from a low-code/no-code type of environment where users can configure which data sets they want to work on and figure out how to utilize this data to create predictions, fine-tune it, and make it work for them. – Bikram Singh, Founder and CEO at EZOPS

The AI industry will offer more tools that can be operated directly by business users. Companies have been hiring more and more data scientists and MLEs but net AI adoption in production has not increased at the same rate. While a lot of research and trials are being executed, companies are not benefiting from production AI solutions that can be scaled and managed easily as the business climate evolves. In the coming year, AI will start to become more democratized such that less technical people can directly leverage tools that abstract all the machine learning complexity. Knowledge workers and citizen “data scientists” without formal training in advanced statistics and/or mathematics will be extracting high-value insights from data using these self-service tools allowing them to perform advanced analytics and solve specific business problems at the speed of the business. – Ryan Welsh, Founder and CEO of Kyndi

AI is advancing rapidly. Look no further than Google Translate’s evolution over the past five years for proof of that. And as data significantly increases, creating new AI and ML opportunities, early AI investors will soon establish a clear lead in their respective industries. Companies that cannot prove themselves quickly and establish a market position will fall behind, leaving a group of clear winners capable of delivering top-tier solutions. – TealBook Chief Technology Officer Arnold Liwanag

Ethical AI becomes paramount as commercial adoption of AI-based decision making increases. Companies across industries are accelerating the usage of AI for their data-based decision making. Whether it’s social media platforms suppressing posts, healthcare providers connecting professionals with patients, or large wealth management banks granting credit to their end consumers, when artificial intelligence decides the end result there is currently no way to suppress the inherent bias in the algorithm. That is why emerging regulations such as the proposed EU Artificial Intelligence Act and Canada’s Bill C-27 (which may become the Artificial Intelligence and Data Act if enacted) are starting to put a regulatory framework around the use of AI in commercial organizations. These new regulations classify the risk of AI applications as unacceptable, high, medium, or low and prohibit or manage the use of these applications accordingly. In 2023, organizations will need to be able to comply with these proposed regulations, including ensuring privacy and data governance, algorithmic transparency, fairness and non-discrimination, accountability, and auditability. With this in mind, organizations have to implement their own frameworks to support ethical AI, e.g., guidelines for trustworthy AI, peer review frameworks, and AI ethics committees. As more and more companies put AI to work, ethical AI is bound to become more important than ever in the coming year. – Angel Viña, CEO and founder of Denodo

2023 Will Be The Year of Embedded Innovation: Embedded innovation is key for digital transformation. If businesses are siloed, they’ll never achieve the goals they set for themselves. For most organizations, AI is currently an IT project not integrated into the business processes. With embedded systems, you also need the right data and machine learning algorithms to drive the AI to make the right decisions. As companies increase the value of the data in their ERP/ERM solutions and couple it with other external sources (e.g., weather patterns), they start to build a better pool of data that will drive better decision making. The best sources for this data come directly from the production floor, from the bottom up. What we’re starting to see is increased accessibility of public-sourced data, which allows businesses to have a wider scope of information on things affecting their suppliers or customers that they can then use to inform their own business decisions. In the future, companies will be able to calculate the probability of a disruption in their supply chain or build a better profile for marketing campaigns based on this data. It extends the value of traditional ERP in decision making past what is currently in use. – Kevin Miller, Interim CTO, IFS

Powers of AI & ML to Improve Workflows & Alleviate Resource Constraints. At a time when organizations face constant waves of sophisticated threats across multiple vectors, cloud security will increasingly harness AI and ML capabilities to not only alleviate skills shortages and resourcing challenges, but also automate powerful workflows to help enterprises stay ahead of attackers. – Rodman Ramezanian, the Global Cloud Threat Lead at Skyhigh Security

Adversarial learning – CIOs need to understand this technique: bad actors training neural networks to fool predictive algorithms. For example, adversarial attacks have been used to dupe self-driving cars into veering across lanes and to render a stop sign invisible to classification algorithms. The same technique can be applied to image and audio samples to trick prediction algorithms. – John McClurg, Sr. Vice President and CISO at BlackBerry
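A minimal sketch of the technique McClurg describes, using the fast gradient sign method (FGSM) against a toy linear classifier in NumPy. The weights and "image features" here are random stand-ins, not a real perception model:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=16)   # stand-in model weights
b = 0.0
x = rng.normal(size=16)   # stand-in image features, true label y = 1 ("stop sign")

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(features):
    """Model's confidence that the input is a stop sign."""
    return sigmoid(w @ features + b)

# FGSM: for cross-entropy loss with true label y = 1, the gradient of the
# loss w.r.t. the input is (p - 1) * w. Nudging every feature a tiny step
# in the *sign* of that gradient maximally increases the loss per unit
# of perturbation, so the classifier's confidence drops.
eps = 0.5
grad_x = (predict(x) - 1.0) * w
x_adv = x + eps * np.sign(grad_x)

print(predict(x), predict(x_adv))  # confidence before vs. after the attack
```

Against a deep network the gradient comes from backpropagation rather than a closed form, but the attack is the same: a perturbation small enough to look unremarkable to a human can flip the model's decision.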

In an increasingly data-centric world, edge computing will fuel the evolution of reliable AI: AI is ubiquitous in our everyday lives. It suggests what to buy and the news we read. It could determine the emails we receive and augment the cars we drive. Moving forward, AI will be even more embedded in our world. It’ll go through a maturation phase that enables us to rely on it more. Predictability and explainability of AI will improve dramatically as we move forward. Moreover, AI will evolve from being algorithm driven to being more data driven. In order for this to be effective, more and more computation will happen at the edge for AI to be reliable, responsive and cost-effective. This trend of more data influencing the algorithms will determine how AI will evolve to be a tool that is relied upon heavily in this data centric future. – Ravi Mayuram, CTO, Couchbase

The Battle Between Speed and Quality Will Come To a Head: For as long as businesses have leveraged AI, executives have been focused on prioritizing one of two things: speed to deploy AI or quality of AI data. These two have often been treated as mutually exclusive in the past, which has led to fundamental problems in how companies build, scale, deploy, and maintain their AI systems. In the future, however, companies should no longer find themselves in a position where they are sacrificing speed for quality or vice versa. To avoid this problem, we will see companies continue to deploy solutions that help them both source quality data and scale AI systems more efficiently and effectively than ever before. Technology, combined with human oversight to help spot areas of improvement along the way, will help merge speed and quality and help companies make their AI moonshot goals a reality in the coming year. – Sujatha Sagiraju, CPO, Appen

AI moves deeper into the pipeline: As concerns about a recession and economic uncertainty rise, many will see a pull back on investment and hiring. However, with the global skills shortage continuing to impact companies of all sizes, ensuring technologies such as Artificial Intelligence (AI) and Machine Learning (ML) are able to automate some of the more menial data preparation tasks will be crucial. By moving AI deeper into the data pipeline before an application or dashboard has even been built, we can finally start to shift the breakdown of time spent on data preparation versus data analytics. Right now, less than 20% of time is spent analyzing data, while just over 80% is spent collectively on searching for, preparing, and governing the appropriate data. Doing this would enable hard-to-come-by data talent to focus on the value-add: cross-pollinating and generating new insights that weren’t possible before. That is a far more productive use of their time. That’s why Gartner estimates that by 2024 manual data integration tasks will be reduced by up to 50%, and why this is set to be a key trend for 2023. – Dan Sommer, former Gartner analyst and Qlik’s Global Market Intelligence Lead

Defining Hybrid AI in 2023: 2023 will be the year we define what hybrid AI really is. Hype for hybrid AI has grown exponentially, but there’s been some debate over what it is. Some say it’s a physical simulator tied to machine learning; some say it uses a hybrid cloud — as of late, no clear definition has emerged. In the new year, the industry will reach a consensus, and once it does, an explosion of new tools will emerge as organizations take action. – Michael Krause, the AI Solutions Director at Beyond Limits 

AI-Powered decision-making will take data analysis to the next level, but the technology is still nascent: AI (artificial intelligence) and ML (machine learning) models have been crucial in highlighting underlying correlations in data that are usually not obvious to human interpretation. In the next two to three years, these models will further evolve to suggest corrective action based on the analysis. Actionable insights will be accompanied by recommendations towards possible actions. Such recommendation engines will be highly vertical-specific and use case-specific to begin with before becoming vertical-agnostic. Take IT workload and SLA balance, for instance. AI-powered analytical engines will be able to provide insights on why you’re unable to improve SLAs by taking into account several factors: how your SLAs compare to industry trends, your ticket-to-technician ratio, how the total workload is split among your technicians, who your busiest technicians are, who struggles to hit SLAs, and how cost-cutting measures have impacted your SLAs. The suggestions can range from ideas to reshuffle workload to adding more technicians or upskilling technicians. This is an example of a vertical-specific suggestion. This can help senior management (CIOs or CTOs), who may not be experts with analytics tools, gain actionable insights that can put them on the fast track to implementing solutions instead of hunting for answers within data. – Rakesh Jayaprakash, Head of Product Management at ManageEngine

AI ethics will be a key consideration for DevOps leaders: Organizations that leverage AI have gotten caught up in the benefits it provides while ignoring the implicit bias that naturally infiltrates data sets. With brands being held to higher customer standards and governance pressure than ever before, it is imperative to consider AI and data ethics from the start. This approach will require an increased level of human and machine-based collaboration, ultimately resulting in an ethics-based approach to driving business outcomes with unbiased data at the forefront, satisfying a wide variety of stakeholders. – Vamsi Paladugu, Senior Director, Lyve Cloud Analytics at Seagate Technology 

ERP Systems Need to be “AI-ified”: While ERP systems are strategic for entering, storing, and tracking data related to various business transactions, CIOs, COOs, and business analysis teams have struggled for decades to extract, transform, and load data from ERP systems and utilize it for AI/ML applications. As enterprises spearhead digital transformation journeys and look to implement AI, the demand to connect to enterprise data across the organization has never been greater. In 2023, the market will start to support the concept of AI micro-products or toolkits that can connect to ERP systems through middleware. These middleware toolkits must be able to link to data both within the organization (from ERP systems as well as CRM or HR platforms) and outside it (such as news or social media). The middleware can then feed into the leading AI platform to develop, select, and deploy ML models that provide highly accurate predictions and forecasting. – Anand Mahurkar, CEO, Findability Sciences

Bias is Overhyped: Bias is a concept that gets a lot of attention – and it will continue to get more with the AI Bill of Rights – but it’s not something that many ML practitioners are concerned with day-to-day. Of course, they account for it, but sound ML practitioners understand the issues and know what to do to prevent bias from adversely affecting outcomes. – Gideon Mendels, CEO and co-founder of MLOps platform Comet

IT teams will prioritize AI/ML applications that automate tasks to save cost and human effort: With high labor costs and a recession looming, AI/ML solutions that solve practical use cases will take priority in 2023. IT organizations will look to build or purchase autonomous solutions that bring them cost savings, provide operational efficiency, and reduce or eliminate human effort. – Preethi Srinivasan, Director of Innovation at Druva

Organizations have rapidly and successfully adopted AI for wide-ranging use cases in recent years, but a majority of AI today can be described as ‘narrow’ – it replicates human actions for highly specific purposes, such as answering common customer questions, and exists mostly in siloes. As more and more organizations introduce AI and automation throughout the enterprise, they will move away from deploying disparate applications and begin building connected ecosystems with AI at their core. This enables organizations to take data from throughout the enterprise to strengthen machine learning models across applications, effectively creating learning systems that continually improve outcomes. AI becomes a business multiplier, rather than simply an optimizer. – Vinod Bidarkoppa, Chief Technology Officer, Sam’s Club and Senior Vice President, Walmart

AI goes ROI: The slowdown in tech spending will show up in AI and machine learning in two ways: major new AI methodologies and breakthroughs will slow down, while innovation in AI moves toward “productization.” We’ll see AI get faster and cheaper as the innovation moves into techniques that make deep learning less expensive to apply, such as distilled models like DistilBERT, where accuracy goes down a bit but the need for GPUs is reduced. – Jeff Catlin, Head of Lexalytics, an InMoment Company
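The distillation technique behind models like DistilBERT can be sketched in a few lines: a small “student” model is trained to match the softened output distribution of a large “teacher,” trading a little accuracy for much cheaper inference. A minimal illustration with toy logits (not a real training loop):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature; higher T produces a softer distribution."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's softened outputs.

    This is the core idea of knowledge distillation: the student learns
    from the teacher's full probability distribution, not just hard labels.
    """
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

teacher = [4.0, 1.0, 0.5]
student_good = [3.5, 1.2, 0.4]   # roughly agrees with the teacher
student_bad = [0.2, 3.0, 1.0]    # disagrees with the teacher

# Training would minimize this loss, pulling the student toward the teacher.
assert distillation_loss(teacher, student_good) < distillation_loss(teacher, student_bad)
```

In practice the loss above is combined with the ordinary supervised loss on labeled data; the sketch only shows the distillation term.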

State, local, federal governments will focus on skill-building for AI adoption, aided by new federal funding: Government investment in AI at all levels will increase AI integration and job creation. Agencies with limited technical skills will deploy AI-as-a-Service so municipalities can better support the public. While US AI infrastructure remains competitive against peer and near-peer states, the country currently lags behind AI powerhouses like China in qualified STEM professionals (source: Brookings Institution). – Rodrigo Liang, Co-Founder and CEO of SambaNova

After years of growth in AI adoption, the technology has now hit an inflection point where AI is no longer just for enterprises with the most advanced technology stacks. Today, more mainstream businesses are seeing the value AI can bring to help solve their most critical problems and are embracing it, and I expect to see much more AI on “Main Street” in the coming year. As these organizations adopt AI, I expect them to start with pre-trained models and fine-tune them with additional data as they go, rather than building models from scratch, since there are more pre-trained AI models available in the market than ever before. I also expect to see these companies opting for open data ecosystems over proprietary data stacks, so that as they continue on their AI journeys they have the flexibility to innovate and scale faster. – June Yang, Vice President, Cloud AI and Industry Solutions, Google Cloud

AI on the Offense: Deepfake technology to date has resulted in political confusion, internet chatter, and some amusing mashup videos, but expect this to change in the near term. Security experts have warned for years about the possibility of social engineering attacks with deepfakes, and the technology has matured enough for 2023 to see hackers successfully leverage it. We will see an increase in image generation, generated audio, and conversations that appear realistic, designed to trick recipients into sharing personal data or other sensitive information. The deepfake threat isn’t relegated solely to consumers; we’ll likely see threat actors spoof a Fortune 100 CEO in an attempt to defraud or otherwise damage the organization. – Scott Register, VP Security Solutions at Keysight Technologies

AI Will Supplement Virtual Sales Training to Create Top Performers: The applications for AI in sales have only begun to scratch the surface. In the coming years, we’ll see new use cases—starting with virtual training and coaching. Of the 75% of organizations that provided practice opportunities during the pandemic, 90% found them effective. Coaching is in demand, but few sales managers have time for it with large, dispersed teams. As a solution, more sales teams will use AI-powered virtual actors to simulate live conversations for practice and feedback in 2023. As AI becomes more sophisticated, companies that use this tech to stay on the cutting edge will attract, train and retain top sales reps. – Andre Black, Chief Product Officer of Allego

In 2023, many companies will seek Artificial Intelligence (AI)-powered solutions that modernize their business processes and improve productivity and employee morale while growing the bottom line. I believe that, as AI applications become more widespread, the common misunderstanding that “AI takes jobs from people” will dissipate. Over time, the understanding that technologies denoted as AI help employees focus on more strategic tasks will grow. In my opinion, the widening of AI applications mirrors other historical inventions that were originally suspected to “take jobs.” Let us draw an analogy with the steam engine – this advancement did not prove disastrous for workers. It reduced their workloads and enabled them to do more. Advances in Machine Learning (ML), like Intelligent Document Processing (IDP) that handles all data entry and processing from a cloud-based platform, are among the technological advancements happening today. Manual data entry is the modern equivalent of physical tasks in the workplace, and ML is here to help. – Milan Šulc, Head of AI Lab at Rossum

Implementing AI & Data for better digital adoption products & solutions: Artificial intelligence and machine learning will grow as crucial tools of a successful digital adoption platform. They will help focus and improve the customer experience by greater organization of the learning journey. This will be done by enabling more data-driven processes, which will highlight and predict what customers will use on the platform, and what needs to be improved. – Uzi Dvir, Global CIO at WalkMe

Foundation models are AI’s next frontier: We’ve seen recent and rapid growth of generative AI models and expect that they will change business operations across a variety of industries in 2023. Foundation models are enabling many of these breakthrough AI capabilities that are delivering value that was not previously possible. Given the economic climate we’re in, we foresee the government and the most forward-looking organizations (banks, research organizations, oil and gas companies) leveraging AI to drive ROI and cost savings in text-heavy workflows like fraud and compliance, customer service, and operational efficiency. – Rodrigo Liang, Co-Founder and CEO of SambaNova 

Investment in data will continue to open organizations to new possibilities and drive transformation in decision-making. Data compliance, security, and mobilization will continue to be principal drivers of new investment in 2023, with data becoming more central to mission-critical operations. Businesses’ most significant challenge remains staying competitive in an increasingly complicated world. Successful adopters will find solutions that uplift existing tech stacks and staff, tailored specifically to their industry or vertical, instead of trying to squeeze onto platforms not designed for them. AI should target opportunities for new business growth and is not one-size-fits-all. – Sasha Grujicic, Chief Operating Officer, NowVertical Group Inc

More debate about AI and bias: Is AI a friend or foe to fairness? In 2021 and 2022, people were concerned that AI was causing bias, due to factors such as bad training data. In 2023, we’ll see a growing realization that AI can help eliminate bias by bypassing the historical points where bias came into play. People are often more biased than machines. We’re starting to see ways that AI can reduce bias rather than introduce it. – Anupam Datta, co-founder, president and chief scientist at TruEra

Organizations Will Prioritize AI/ML Expertise, but Finding IT Talent Won’t be Easy: Truly taking advantage of AI is a long journey, one that will continue to be bottlenecked by a lack of IT talent. Organizations need data scientists – people who are technically capable and cognizant of AI/ML technologies – however, labor market shortages have caused them to pull back on growth plans. While layoffs may continue in 2023, organizations will hold onto the technologists working in areas of AI/ML that can provide far-reaching benefits in the future. – Amy Fowler, VP of FlashBlade Strategy and Operations, Pure Storage

AI-powered advertisements: Whether it’s companies bringing their mascots to life, or avatars engaging with us via AI-powered billboards and signage, AI will enable advertisers to take their efforts up a notch. Soon, instead of a person handing out flyers, individuals will get specific, targeted solicitations as they’re waiting for the streetlight to change, walking towards an ATM, or refueling their car at a gas station. All of this will be made possible via facial recognition, and for those of us who have ever played an Xbox game, watched Netflix, or ordered anything from Amazon, these daily encounters will know how to capture your specific attention. – David Ly, CEO and founder of Iveda

AI will be increasingly augmented by automation to enhance its impact on the business. AI and automation make a powerful combination, opening up more capabilities that can unlock new value. From Communications Mining to NLP to Document Processing, AI and automation can tackle entirely new realms of work. – Ted Kummert, Executive Vice President of Products & Engineering at UiPath

Generative AI Transforms Enterprise Applications: The hype about generative AI becomes reality in 2023. That’s because the foundations for true generative AI are finally in place, with software that can transform large language models and recommender systems into production applications that go beyond images to intelligently answer questions, create content and even spark discoveries. This new creative era will fuel massive advances in personalized customer service, drive new business models and pave the way for breakthroughs in healthcare. – Manuvir Das, Senior Vice President, Enterprise Computing, NVIDIA

We will start to see a democratization of AI: AI can be a scary tool, but it is an integration businesses must get on board with to simplify and integrate their processes – and this can be accomplished with no code at all. No-to-low code opens the door to customization at a better price and to less technical people using it. It also creates an easy starting point, so users are not scared off. It’s what everyone has been waiting for – a properly intelligent model to answer the company’s questions. Democratization of AI offers more than just an answer; it provides a direct extraction of the answer from the source content. – Daniel Fallmann, CEO and founder of Mindbreeze

AI May Not Be Sentient (Yet), But It’s Smart Enough to Put Your Dark Data to Work: True discovery will be found through AI and advanced data analytics, by assigning more tasks to AI. Organizations are awash with dark data. In the legal world, dark data can be a treasure trove for the finding. What was once thought of as a burden of data is now either an asset or a risk in the age of AI: in those dark spaces there are breadcrumbs for AI to follow. AI has the cycles to mine insights from vast quantities of data and uncover patterns in the haystack. This may finally fully unlock the value of dark data to chart a faster path to the truth. – Chuck Kellner, Strategic Discovery Advisor at Everlaw

AI is going to continue to be easier to add. You’re going to see companies stop trying to build their own because they can’t compete with the technology and the dollars invested by the big cloud services providers. Everyone is going to consolidate their AI tooling onto the big cloud service providers in favor of utilizing their infrastructure in the most effective way. – Troy Pospisil, CEO of Ontra

The last of the data-generating or data-consuming companies that haven’t already adopted AI will do so next year. In an era where data is growing so fast, a business will become obsolete if it does not have tools to automate repetitive decisions, process internal data, and/or take advantage of external data. At the end of the day, the role of automation is not only to accelerate existing processes, but to enable the maximum potential of human productivity. In 2023, when a turbulent economic climate will continue to force enterprises to reduce their workforces, intelligent automation will mitigate the onus on remaining talent, transforming operations and creating more engaged employees. Moreover, companies will see incredible value on the customer side, with intelligent automation enabling heightened demand predictive capabilities and more efficient data pipelines. The key to adopting this critical technology is to ensure all users understand how the automated decisions are being made, creating trust in the system and optimizing implementation. – Farshad Kheiri, Head of AI and Data Science at Legion Technologies

AI model explainability stays hot. As AI becomes increasingly important in the lives of everyday people, more people want to know exactly how the models work. This is being driven by internal stakeholders, consumers, and regulators. – Anupam Datta, co-founder, president and chief scientist at TruEra

Generalist AI Agents: AI agents will solve open-ended tasks with natural language instructions and large-scale reinforcement learning, while harnessing foundation models — those large AI models trained on a vast quantity of unlabeled data at scale — to enable agents that can parse any type of request and adapt to new types of questions over time. – Anima Anandkumar, Director of ML Research at NVIDIA and Bren Professor at Caltech

In 2023, AI-driven tools are going to augment human efforts in virtually every function. Workers from marketing to sales to support to finance will have AI tools that are capable of completing half or more of their daily tasks and the human role will be to build on, enhance, polish or focus AI output to complete their duties. The routine minutiae will be stripped out of the work day, allowing people to spend their time applying judgment and insight. In that sense, AI should stand for Augmented Intelligence, where its best use is in automating tasks with assumptions being made through pattern analysis, but ultimately vetted by human intuition and guidance. – Ramon Chen, chief product officer, ActivTrak

Fewer keywords, greater understanding: In years past, AI has relied heavily on keywords to search a database or maintain a conversation. Being this literal is, of course, limiting: if the AI didn’t receive the right keyword, it could fumble a chatbot conversation, come up with the wrong search results, or return no results at all. In 2023 we’ll see AI continue to move further away from keywords and progress toward actual comprehension and understanding. Language-agnostic AI, already existent within certain AI and chatbot platforms, will understand hundreds of languages — and even interchange them within a single search or conversation — because it’s not learning language like you or I would. This advanced AI instead focuses on meaning, and attaches code to words accordingly, so language is more of a finishing touch than the crux of a conversation or search query. Language-agnostic AI will power stronger search results — both from external (the Internet) and internal (a company database) sources — and less robotic chatbot conversations, enabling companies to lean on automation to reduce resources and strain on staff and truly trust their AI.

Or no words at all: This same concept extends to images as well. With AI that actually understands language, rather than piecing it together one word at a time, we’ll also be able to describe an image that we’re looking for, and the AI can produce the correct image. Similarly, this AI can also handle image-to-image search: we provide it with an image, and the AI finds another image resembling the first. Beyond its consumer search engine applications, this technology can be applied commercially. For example, retail brands can use it to show customers exactly what they are looking for (even if they can’t remember what it’s called). – Dr. Pieter Buteneers, Director of Engineering in ML and AI, Sinch
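The shift from keywords to meaning described above usually rests on embedding vectors compared by cosine similarity: words (or images) in any language map to nearby points in a shared space, so “dog” and “chien” retrieve each other without sharing a single character. A minimal sketch with hand-made toy vectors (a real system would obtain them from a multilingual encoder):

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy embeddings: in practice a multilingual encoder places "dog" and its
# French equivalent "chien" close together, and "cat" elsewhere.
embeddings = {
    "dog":   [0.90, 0.10, 0.00],
    "chien": [0.88, 0.12, 0.05],
    "cat":   [0.10, 0.90, 0.00],
}

def nearest(query_vec, index):
    """Return the indexed item most similar to the query vector."""
    return max(index, key=lambda k: cosine(query_vec, index[k]))

# A query vector near the "dog" region retrieves by meaning, not spelling.
assert nearest([0.92, 0.08, 0.01], embeddings) == "dog"
```

The same nearest-vector lookup powers text-to-image and image-to-image search; only the encoder that produces the vectors changes.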

Companies Will Leverage AI to Shed Some Light on Dark Data: The lack of insight caused by the “black hole” of dark data will continue to plague companies in 2023, including the loss of “tribal” knowledge every time an employee leaves the company. This knowledge gap will lead organizations to leverage AI to classify employee knowledge, making sure everything is captured and searchable, thus enabling team members and new employees to quickly ramp up rather than starting from ground zero. – PFU America’s Technology Evangelist Scott Francis

Explainability Will Create More Trustworthy AI for Enterprise Users: As individuals continue to worry about how businesses and employers will use AI and machine learning technology, it will become more important than ever for companies to provide transparency into how their AI is applied to worker and finance data. Explainable AI will increasingly help to advance enterprise AI adoption by establishing greater trust. More providers will start to disclose how their machine learning models lead to their outputs (e.g. recommendations) and predictions, and we’ll see this expand even further to the individual user level with explainability built right into the application being used. – Workday’s CTO, Jim Stratton
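Disclosing how a model reaches its outputs can start very simply: for a linear scoring model, each feature’s attribution is its weight times its deviation from a baseline; SHAP-style tooling generalizes this idea to non-linear models. A toy sketch (the feature names and weights are hypothetical, not from any vendor):

```python
def explain_linear(weights, features, baseline):
    """Per-feature contribution to a linear model's score vs. a baseline.

    For a linear model, the attribution of feature i is simply
    w_i * (x_i - baseline_i); the contributions sum to the score difference.
    """
    return {name: w * (x - b)
            for (name, w), x, b in zip(weights.items(), features, baseline)}

# Hypothetical worker-data model: score = 0.5*tenure - 0.2*overtime
weights = {"tenure_years": 0.5, "overtime_hours": -0.2}
contrib = explain_linear(weights, [4.0, 10.0], [2.0, 5.0])

assert contrib["tenure_years"] == 1.0      # 0.5 * (4 - 2)
assert contrib["overtime_hours"] == -1.0   # -0.2 * (10 - 5)
```

Surfacing such per-feature contributions next to each recommendation is what “explainability built right into the application” looks like at its simplest.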

One of the most exciting developments in 2022 was the introduction of Data Creation as an approach. This is the process of intentionally creating data to power AI and advanced data applications, as opposed to data exhaust, which is the byproduct of data emitted from existing systems. Data Contracts have been a much-discussed topic in the data community this year. In the context of the Modern Data Stack, this sensible idea has been controversial primarily because practitioners are stuck in the ‘data is oil’ paradigm, in which they assume that data is extracted and not intentionally created. We expect the ‘data product manager’ role and ‘data product management’ skill set to continue to gain traction in 2023. They are instrumental for organizations to successfully treat data as a product, design and implement the right data contracts (as part of data products) and enable them to build out a ‘data mesh.’ – Yali Sassoon, Co-Founder, Snowplow

Open source and AI realize their shared potential in 2023, but it gets complicated in a hurry: The massive potential of open source-powered AI efforts will quickly become evident in 2023, as new machine learning models find an ever-growing set of important use cases. Yet it could be a bumpy road: questions such as who owns code written with a machine learning model trained on open source, whether that use is compatible with open source licenses, and whether the result is a derived product will all need to be addressed in the year ahead. – Luis Villa, co-founder, Tidelift

AI in Mainstream Business: Most businesses are still in the exploration stage when it comes to AI, but they are experimenting with what this technology can do for them specifically. At this point, many businesses are running POC (proof of concept) trials in specific use cases to further investigate the benefits. As they continue to explore and understand how AI can be utilized, the level of maturity and commitment continues to differ across industries. For example, organizations in the manufacturing sector are further along in their AI journeys than organizations in the retail sector. In 2023, more businesses will invest in developing and testing the benefits of AI in their organizations. – Marieke Wijtkamp, SVP of Product at Librestream

Responsible AI solutions – those that address trust, risk, ethics, security and transparency – will gradually become more mainstream, as will solutions that target personalized insights, whether related to credit risk, underwriting, or recommendation engines for dynamic pricing and influencing buying decisions. – Nicolas Sekkaki, Kyndryl’s GM of Applications, Data & AI

AI Gets Smarter in 2023: AI will not only grow smarter in 2023, it will become more affordable, simple, and accessible. Companies will automate many processes done manually before, such as invoicing, transcriptions, medical charting, and resume processing. As companies digitize their document archives and historic data, they will use Natural Language Processing (NLP) to make it searchable in real-time. This will increase employee productivity and accelerate the availability of information to business units, delivering richer insights and stronger ROI. – PFU America’s Technology Evangelist Scott Francis
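Making a digitized archive searchable, as predicted above, can be illustrated with classic TF-IDF ranking, the baseline that richer NLP search builds on: terms that are frequent in a document but rare across the archive score highest. A minimal sketch over a toy document set (the documents are invented for illustration):

```python
import math
from collections import Counter

def tfidf_index(docs):
    """Build a tiny TF-IDF index so a document archive becomes searchable.

    Each document maps terms to tf * idf, where idf = log(N / doc_frequency).
    Production systems use richer tokenization, stemming and ranking.
    """
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))
    n = len(docs)
    index = []
    for toks in tokenized:
        tf = Counter(toks)
        index.append({t: (c / len(toks)) * math.log(n / df[t])
                      for t, c in tf.items()})
    return index

def search(query, index):
    """Return the position of the best-matching document for the query."""
    terms = query.lower().split()
    scores = [sum(doc.get(t, 0.0) for t in terms) for doc in index]
    return scores.index(max(scores))

docs = ["invoice payment received from vendor",
        "resume submitted for engineering role",
        "medical chart updated by nurse"]
idx = tfidf_index(docs)

assert search("engineering resume", idx) == 1
```

Modern NLP search replaces the bag-of-words scoring here with learned embeddings, but the index-then-rank workflow is the same.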

Businesses leveraging AI to do more with less during challenging times will win in the long term: Microsoft CEO Satya Nadella recently said, “software is ultimately the biggest deflationary force.” And I would add that out of all software, AI is the most deflationary force. Deflation basically means getting the same amount of output with less money — and the way to accomplish that is to a large degree through automation and AI. AI allows you to take something that costs a lot of human time and resources and turn it into computer time, which is dramatically cheaper — directly impacting productivity. While many companies are facing budget crunches amid a tough market, it will be important to continue at least some AI and automation efforts in order to get back on track and realize cost savings and productivity enhancements in the future. – Varun Ganapathi, Ph.D., CTO and co-founder at AKASA

Embedded AI expands the potential of operational applications: IDC forecasts that AI spending will surpass the half-trillion-dollar mark in 2023. However, this will only be possible if AI is generally available to customers and organizations across all industries. One of the most significant barriers to the wide adoption of AI has been the skills gap. Now, AI has become more accessible with the ever-growing number of apps that include AI functionality, like predictive text suggestions. This, combined with the ease of use that stems from the growth of AI capabilities, will bring the utility of AI within reach for all. Operationalizing ML algorithms within databases has been a harder challenge, but that will change in 2023. And finally, don’t worry: GPT-4 is not going to create Skynet 🙂 – Shireesh Thota, SVP of Engineering, SingleStore

Human-Centered AI will come into focus in 2023: The outdated idea of AI replacing people is giving way to a more human-centered vision of AI, with tools to augment human intelligence, creativity and well-being in 2023. AI has moved from black-box models beyond human comprehension to emphasize transparency and model explainability – at both population and individual levels. Instead of making decisions for people, algorithms can recommend a number of good options, allowing people to choose using their experience and knowledge. – Michael O’Connell, Chief Analytics Officer at TIBCO

Teams are testing and experimenting with AI systems and machine learning tools for everything from “art-to-documentation” to “code assistance.” These integrated innovations are driving more sophisticated intelligence into cloud systems, which helps accelerate development, support, customer interaction and adaptive tooling. Artificial intelligence and machine learning (AI/ML) are an exploding area of innovation pushing breakthrough services for the cloud. – Chris Chapman, CTO at MacStadium

Between the increasing complexity of systems and the huge shortage of tech talent, organizations will need a much greater reliance on automation just to keep up in 2023 and beyond. Organizations have been collecting telemetry data for a long time, but that data has not historically been structured enough for good automation. Now, organizations are increasingly able to collect data at full fidelity. That’s going to change things dramatically, because much higher quality data lets us build the models needed to automate processes reliably. Moving forward, this trend will contribute toward the increased adoption of AIOps, which leverages data to automate the prediction, prevention and resolution of incidents. There simply aren’t enough individuals in the workforce to do this manually — especially as data volumes continue to skyrocket — and more enterprise leaders are beginning to view AIOps as a sustainable path forward. Additionally, business leaders will begin to leverage AIOps for solutions that extend beyond the traditional IT use cases. Believe it or not, AIOps can be applied to KPI business metrics such as revenue, transactions or e-commerce. By taking purely business metrics and tying them back to underlying software infrastructure, companies can understand where KPIs are trending, plot their evolution, and make early, informed decisions. In 2023, expect automated solutions to become commonplace in the enterprise, and look out for AIOps use cases to expand significantly. – Spiros Xanthos, General Manager of Observability, Splunk
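The KPI-monitoring idea above can be made concrete with one of the simplest AIOps building blocks: flag any point that deviates sharply from its trailing window. A rolling z-score sketch (the revenue series and thresholds are illustrative, not from any platform):

```python
import statistics

def anomalies(series, window=5, threshold=3.0):
    """Indices where a point deviates from the trailing window's mean
    by more than `threshold` standard deviations.

    A minimal stand-in for AIOps-style KPI monitoring: the same check
    works for error rates, latency, revenue, or transaction counts.
    """
    flagged = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.pstdev(hist) or 1e-9  # guard against flat windows
        if abs(series[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# A hypothetical daily revenue KPI with one sudden spike at index 8.
revenue = [100, 102, 99, 101, 100, 98, 101, 100, 150, 99]
assert anomalies(revenue) == [8]
```

Production AIOps systems replace the rolling z-score with learned seasonal baselines and correlate the flagged KPI back to the infrastructure telemetry underneath it, but the detect-then-trace loop is the same.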

For CIOs, deciding how and where AI workloads will be placed is about to get infinitely more complicated, due to the double whammy of skyrocketing cloud costs and inflation. AI in the cloud remains ‘the easy button,’ but you pay a big premium for it with diminishing performance as you scale – a premium that only gets bigger with inflation. So after years of increasing AI workloads in the cloud, some companies will rethink those decisions in 2023. In fact, we predict that a vast majority of companies will move quickly to undo a big percentage of their previous (expensive) decisions to put AI workloads in the cloud. – Holland Barry, SVP and Field CTO, Cyxtera

As AI becomes cheaper and easier to build, train and maintain, there will be much more adoption of AI and breadth of AI use cases in 2023. For example, the CX industry is becoming AI-first as the technology extends beyond reactive service and becomes embedded across the entire customer journey, including proactive and preventative service. To prepare for this future where the majority of front line customer interactions are completely automated with AI, businesses will double down on AI in 2023 and start to use AI in completely new use cases. – Adrian McDermott, Zendesk CTO

In 2023, AI Testing Will Become Prevalent as the Technology Catches Up to Advanced AI: CX complexity is increasing because the technology underpinning customer experiences, customer expectations and interactions is evolving. As contact centers move to the cloud, complexity shifts from technology-driven concerns to business-driven ones, such as processes, integration with APIs, and channels. When complexity becomes too great, manual testing is no longer sufficient; the only way to test this level of complexity is to bring AI to the delivery of CX. As CX business applications become more complex and increasingly utilize AI, automated testing will keep pace by leveraging AI to match the scope and pace of testing. – Max Lipovetsky, VP of Products, Cyara

Enterprises embrace AI to gain resiliency amid economic uncertainty: During economic uncertainty, enterprises want improved business uptime, productivity gains and revenue assurance. To gain an advantage, they will have to build an autonomous enterprise on the pillars of AI, ML and intelligent automation. This will help with business scaling and resiliency and create the competitive differentiation needed during uncertain times. Being able to demonstrate business value during an economic downturn will be key. Gartner, at a recent keynote in London, stated that just 17 percent of organizations are consistently able to demonstrate the business value of IT. That percentage has to improve moving into the new year. In 2023, organizations will increasingly use automation to maximize productivity. Expect greater adoption of AI to make IT systems more resilient without growing costs. Using AI, businesses can automate some of the most essential and elemental IT operations tasks, such as monitoring alerts and managing employee onboarding and offboarding. In doing so, companies not only make their IT systems stronger, they also free up skilled IT staff to focus on higher-value projects. Expect greater adoption of cloud and multi-cloud operations. Sustainability metrics are also a major focus, so AI has a role to play there in supporting organizational efforts to meet their goals. – Akhilesh Tripathi, CEO, Digitate

AI will become table stakes for how organizations and IT departments run their business: AI has evolved from a sideline experiment to a core part of how most organizations run their business. It’s no longer about whether or not to use AI, but how to realize its value to drive outcomes. Looking ahead to 2023, we will continue to see a fundamental shift from experimental to ubiquitous, pervasive AI deployments across IT. Many AI projects fail to deliver on the value they promise. The success of these deployments will rely on choosing the right use cases that optimize core processes within the organization and selecting the right toolkit for the job. – Stephen Franchetti, Chief Information Officer, Samsara

Opening up to AI: Learning to trust our AI colleagues:  With AI tools increasingly standardized, organizations are realizing that competitive gains will best be achieved when there is high confidence that AI is delivering the right analytics and insights. To build trust, AI algorithms must be visible, auditable and explainable, and workers must be involved in AI design and output. – Deloitte’s Chief Futurist Mike Bechtel

AI/ML Gets Grounded: While AI is the new buzzword, using it in most content production organizations has been messy. Cool AI tools to up-res or transcribe video are scattered across web tools, custom applications, or embedded in editing suites. Looking forward, beyond the “standard” actions of these tools, content producers will work with data scientists and computer vision experts to customize these tools to meet their specialized content domain and needs, and to automate actions that take content loggers ages to complete. As content leaders realize they are in a critical race to develop and implement these tools, they will look to centralize this development and connect it to powerful GPU pipelines. – Skip Levens, Product, Media and Entertainment, Quantum

Organizations will face greater pressure to realize AI’s value: While AI has become firmly entrenched in all industries – from financial services to healthcare and others – many organizations still struggle to shift AI proofs-of-concept to full-scale production. In 2023, business/IT decision makers will focus on tighter collaboration to truly address company issues and needs. – Keshav Pingali, co-founder and CEO of Katana Graph

AI Will Become a DevOps Competitive Advantage in 2023: The future of enterprise DevOps is being able to turn data into actionable, predictive insights so enterprises can learn from past historical trends to make higher-quality software at greater speed, and AI/ML has finally reached a tipping point that enables this. A machine learning model can now capture thousands of monthly change events, including what infrastructure changed, what testing was done during development, who the developer or team was, what defects were found during testing, and other factors. In 2023, this information will increasingly be correlated with the success and failure of past changes so teams can learn from these past issues – and plan to avoid them. – Wing To, Vice President of Engineering for Value Stream Delivery Platform & DevOps at

2023 will be a pivotal year for mobilizing AI solutions across several business verticals. With machine learning quickly becoming ubiquitous across the business landscape, how businesses create trust with usable and explainable AI will separate the leaders from the pack in the coming year. To achieve this, many organizations will pivot focus beyond the algorithm with things such as business-ready predictive dashboards, visualizations, and applications that demystify how AI systems work and reach conclusions — this will help business leaders understand the impact on their business and take action quickly with confidence. – Santiago Giraldo, Senior Product Director at Cloudera 

AI is ‘coming of age’ in terms of its maturity, and while there continues to be some resistance towards it, AI will grow in acceptance as a valuable tool for businesses. AI, for instance in the legal setting, gives legal departments the ability to reduce the amount of “low-value work” being done by team members by replacing mundane tasks with technology, giving employees back time for more fulfilling work and enabling legal teams to do more with less. Additionally, we’ll see an increase in business leaders recognizing how efficient and time-saving AI technology can be in keeping teams engaged and focused on high-value work, especially as departments get stretched thin. This is what will drive businesses towards more AI-powered tools in 2023. – Matt Gould, General Counsel, ContractPodAi

AI and Machine Learning Make Data Lifecycle Management More Intelligent: Data is the driving force of artificial intelligence (AI) and machine learning. Vast quantities of training data enhance accuracy in the search for potentially predictive relationships. In the past, security solutions were predominantly reactive, but that is changing. Machine learning and AI algorithms play a key role in this shift. While they are not a one-stop solution for all cybersecurity concerns, they are incredibly useful for rapidly automating decision-making processes and inferring patterns from data. These algorithms work by first learning from real-world data, analyzing normal behavior patterns and responding to deviations from that baseline norm. Traditional methodologies of backup to tape are also going away due to the level of effort and time it takes to restore from tape. Additionally, with rich data and history stored in backup files, organizations will start to realize the benefits of backing up their data to either on-premises or cloud object storage and continuously analyzing that data to make it more useful to business decisions. – Jimmy Tam, CEO of Peer Software
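The baseline-and-deviation approach described above can be sketched in a few lines. This is a minimal illustration of the general technique, not any vendor's implementation; the login-rate data and the three-sigma threshold are assumptions chosen for the example.

```python
import statistics

def learn_baseline(history):
    """Learn normal behavior from observed real-world data."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomaly(value, mean, stdev, threshold=3.0):
    """Flag deviations beyond `threshold` standard deviations from the baseline."""
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > threshold

# Hypothetical logins-per-minute counts observed during normal operation
history = [52, 48, 50, 51, 49, 53, 47, 50]
mean, stdev = learn_baseline(history)

print(is_anomaly(51, mean, stdev))   # False: within the baseline norm
print(is_anomaly(500, mean, stdev))  # True: large deviation, flagged
```

Real systems layer far more sophisticated models on top of this idea, but the core loop is the same: learn a baseline from data, then respond to deviations from it.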

Use of AI/ML to drive data management will be a foundational requirement (vs. a nice-to-have) to meet the ever-changing data landscape as well as shrinking resources and budget freezes. Companies will be looking to go from best-of-breed tools to tools that can accelerate automation and provide core capabilities. Focus will shift from evaluating whether every capability exists in each tool to understanding the foundational capabilities required to deliver shorter-term value. The goal will be to reduce the management of vendors, tools, costs, training, and integrations across tools. Moving to cloud SaaS applications will continue to reduce infrastructure management and sharpen the focus on utilizing toolsets to obtain value. Companies will leave it up to the vendors to manage performance, upgrades, and general troubleshooting. – Stephen Gatchell, Director of Data Advisory at BigID

Photos, icons and other visual materials created by AI will make their way out of the lab and into mainstream business. This year, we saw major companies like Microsoft and Canva begin experimenting with AI-powered algorithms to build digital images from plain text. In 2023, text-to-image processes will become accessible to everyday companies, eliminating their dependency on overused stock images and visuals—which often don’t fully communicate the ideas they want to convey about their businesses. We’ll see more digital imagery, app-making, art and even entertainment created by AI rather than through manual design processes, opening up an entirely new market and revenue streams. – Roman Reznikov, VP of Delivery, Head of Digital Segment at Intellias

Never-Before-Possible AI and ML Use Cases Will Emerge–and Ultimately Become Mainstream: As companies break free from the constraints of legacy systems and are able to bring together massive data sets from disparate systems, we’ll see a slew of never-before-possible use cases for AI and machine learning. In auto manufacturing, for instance, we’re just starting to see the emergence of next generation manufacturing data platforms–or single unified cloud-based platforms where manufacturers are aggregating all data across their entire organizations. Once the data’s in there, they can start building AI-enabled applications against that. Here are a few that we expect to begin seeing more of in the coming year: (i) Visual inspection with computer vision to detect anomalies on the assembly line; (ii) Natural Language Processing – Using voice as an interface in the manufacturing process and across operations to control software systems; (iii) Introduction of predictive and preventive maintenance via anomalies in machines that signal mechanical failure before it happens. – Marco Santos, CEO USA and LATAM at GFT

AI will need to go beyond a buzzword to be valuable: Artificial intelligence (AI) has long been a buzzword that technologies have used to drum up interest in a product, but companies will begin to look beyond the buzzword itself for value from the technology. In 2023, companies will look to AI for two distinct reasons: 1) to help them do more with fewer people and 2) to drive more revenue. It will become less about using technology with AI and more about how AI drives efficiency for digital marketing teams and critical business outcomes. And if it does not drive outcomes, companies will look to other technologies that can. – Deniz Ibrahim, VP of Product Marketing, Bluecore

Alignment will bring the concept of adversarial machine learning into the public consciousness: AI Alignment is the study of the behavior of sophisticated AI models, considered by some as precursors to transformative AI (TAI) or artificial general intelligence (AGI), and of whether such models might behave in undesirable ways that are potentially detrimental to society or life on this planet. This discipline can essentially be considered adversarial machine learning, since it involves determining what sort of conditions lead to undesirable outputs and actions that fall outside the expected output distribution of a model. The process involves fine-tuning models using techniques such as RLHF – Reinforcement Learning from Human Feedback. Alignment research leads to better AI models; it will be discussed a lot more in 2023 and will likely become a mainstream topic, bringing the idea of adversarial machine learning into the public consciousness. – Andrew Patel, Senior Researcher, WithSecure Intelligence
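The adversarial framing above — systematically finding conditions that push a model outside its expected output distribution — can be illustrated with a toy probe. Everything here is hypothetical: the "model," the input grid, and the expected range are stand-ins for the much richer behaviors alignment researchers actually test.

```python
def model(x):
    """Hypothetical model: well-behaved on [0, 10], erratic outside that domain."""
    return x * 2 if 0 <= x <= 10 else x ** 3

def probe(model, inputs, expected_range=(0, 20)):
    """Return the inputs whose outputs fall outside the expected distribution."""
    lo, hi = expected_range
    return [x for x in inputs if not (lo <= model(x) <= hi)]

# Sweep a grid of inputs, including edge cases beyond the training domain
bad_inputs = probe(model, [-2, 0, 5, 10, 12])
print(bad_inputs)  # [-2, 12]: the conditions that trigger out-of-range behavior
```

The point is the workflow, not the arithmetic: probing enumerates conditions, records which ones produce undesirable outputs, and feeds those findings back into fine-tuning.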

Unethical AI practices and AI abuse/misuse will increase while AI legislation is still being developed and debated: Use of unethical AI practices and the biases that come with them will continue and will only really be addressed when they start happening en masse or for incidents with larger impact. As some of the larger industries make it mainstream, there will be more data and more potential for abuse/misuse. The EU AI Act will help, but the real data about us historically, the apps we use and the data we have produced are not located, processed or monetized in Europe for the most part. Based on the above, we will have increased cases of “runaway data,” where on paper and for auditors the data is in a certain location but is enriched outside of certain legislated jurisdictions when it comes to ML/AI. In the same way that we currently do CO2 bookkeeping to “buy clean air” from other countries, so too will there be an economy for data enrichment once the first legislation gets passed in certain regions of the world. In the metal industry, companies spend millions shipping raw aluminum ore to Iceland because of the low cost of geothermal energy. So too will we have countries where data can be “shipped to” digitally, enriched and exported. The more moving parts and data, the more potential for misuse and abuse and thus leaks and real societal impact. The insurance industry, along with special legislation, will grow and be further introduced to the market to help manage the risk of data misclassification, prosecution and other data-driven decisions coming from ML/AI. Systematic probing of AI/ML will become mainstream, and detection for safeguarding AI algorithms will be required to protect the quality of outcomes and data as well as to detect manipulation of the algorithms. – Tom Van de Wiele, Principal Technology & Threat Researcher, WithSecure

AI is a Game Changer: Even though the industry has been talking about AI for many years, the trifecta of massive amounts of data available through the rise of the internet, the dramatic increase in processing power fueled by GPUs originally designed for gaming, and steady advances in software algorithms to analyze and use that data will result in the power of AI being more fully realized across almost all markets. AI is 100% the number one game changer in the coming years – it will change the way we interact with computers just as Google Search did 20 years ago. Generative Design is one great example. Being able to use text prompts, leverage huge datasets and natural language recognition is now driving increased use of generative design, and I think we’ll see much more of that in 2023. But it goes beyond generative design and will have a big impact on Computer Aided Engineering (CAE) in general. For example, you will be able to generate large data sets and then have the system learn from them, predict the outcome of deformations or anything else, give an advance preview of post-processing results and even suggest certain workflows for optimum efficiency. Or give the system a basic 2D image and then it generates a 3D design from scratch. In addition to CAE, AI will completely revolutionize how realistic digital-twin assets are created. For example, designers would sit for hours generating and applying complex repeating textures on objects and environments. This can now be done with AI and will result in far greater volume and variety of assets. – Jonathan Girroir, Technical Evangelist at Tech Soft 3D

AI solutions that directly impact the top line (versus just reducing cost with a focus on the bottom line) – closing new business and retaining existing customers – are the ones that will win out in 2023: However, it’s still in the early adopter category. Next year, expansion is the new retention. Businesses must prepare for a cold winter that will likely subside in the 3rd and 4th quarters. During economic slowdowns, companies will need to reduce costs and protect revenue, and AI automation is a suitable answer for that because it scales at a cost that is way more effective during times when companies do not have the resources to expand human capital. – Jim Kaskade, CEO of Conversica

Vendors need AI to keep up with volume of data: Machine data analytics is the future of data analytics. With machines increasing their capacity at a high rate, their data output is growing exponentially. It’s going to be essential for vendors to manage this unstructured log data as increased data will place stress on processes and resources. AI will help reduce false positives, as well as detect and fix problems faster. – Erez Barak, VP of Product Development for Observability, Sumo Logic

Companies will have to win users’ trust by ensuring that operations are ethical—this includes handling customer data and ensuring energy efficiency. To do this, those operating in this field will need to work on their ability to explain AI algorithms as well as highlight the value of the technology within a wider context. An example of this could be the evolution of self-driving autonomous cars, which are expected to achieve 40% to 50% more fuel efficiency than manual driving by 2050. – Bal Heroor, CEO of Mactores

Having access to AI is no longer enough, AI needs to be adaptive in order to ensure agility within contract management: It seems everyone has or is claiming to have AI capabilities, but simply having AI is not necessarily enough to meet business key performance indicators (KPIs) and deliver on promises to customers. In order to take AI a step further, enterprises will need to invest in adaptive AI. Gartner predicts this technology to be one of the top trends that focus on business model changes, accelerated responses and opportunities. Adaptive AI allows businesses to change their model behavior after deployment using real-time feedback. Based on new data and adjusted goals, it will be possible to remain agile when faced with challenges in the real world. This adaptability will be vital when looking beyond problems to find solutions that leverage the vast knowledge and expertise available, empowering enterprises to act proactively rather than reactively. – Scott Quinn, Vice President – Customer Success at SirionLabs  

In the market, AI is primarily utilized for extraction. This is an important use case of AI as it creates visibility into the key data of an organization. Although this will continue to grow, extracting data out of legal documents is only the first step. In order to maximize outcomes and create strategic value, businesses must ask themselves what they can do with the data that’s extracted. In 2023, the approach to AI-driven data will shift, and enterprises will turn their attention to making this data more consumable and actionable to provide insights for strategic decisions across the business. – Atena Reyhani, SVP Product Management, ContractPodAi

Big Data

The Rise of Data-as-a-Product. In 2023, data-as-a-product will reach maturity resulting in increased quality and trust in data at companies. This will lead to more robust data organizations within enterprises that require an increased need for data modeling technologies and data teams/engineers. – Armon Petrossian, CEO and co-founder of Coalesce 

The Data Market will Evolve as Large Enterprises Drive Change. The last ten years have been all about cloud and modern data stacks with the rise of technologies like dbt, Snowflake, Databricks, and others. While this trend is extremely impactful for smaller and mid-size organizations, since they have a really simple way to start a data platform in minutes, larger enterprises have a different set of challenges, mostly around modernization, change management, and governance. This is where data lineage is beginning to play a critical role. The data market today is extremely fragmented, and one of the big questions, especially considering the recession, is if and how it will consolidate. We may see a lot more mergers and acquisitions in 2023. Additionally, we expect more evolution on the non-technology side with data contracts, data mesh, and/or advanced federated governance processes, as that seems to be the next obvious step on the data journey for any mature data organization. – VP of Research and Education at MANTA, Jan Ulrych

Data Complexity Will Increase: The nature of data is changing. There are both more data types and more complex data types with the lines continuing to blur between structured and semi-structured data. At the same time, the software and platforms used to manage and analyze data are evolving. A new class of purpose-built databases specialize in different data types—graphs, vectors, spatial, documents, lists, video, and many others. Next-generation cloud data warehouses must be versatile—able to support multimodal data natively, to ensure performance and flexibility in the workloads they handle. The Ocient Hyperscale Data Warehouse, for example, supports arrays, tuples, matrices, lines, polygons, geospatial data, IP addresses, and large variable-length character fields, or VARCHARs. The need to analyze new and more complex data types, including semi-structured data, will gain strength in the years ahead, driven by digital transformation and global business requirements. For example, a telecommunications network operator may look to analyze network metadata for visibility into the health of its switches and routers. Or an ocean shipping company may want to run geospatial analysis for logistics and route optimization. – Chris Gladwin, CEO and Co-founder of Ocient

Data Reduction: There is an exponentially increasing amount of data, but I believe we will see rise of solutions that deduce the meaningful bits of data from the overall mass of data collected, or even reduce the footprint of data using new technologies beyond current classic data storage techniques. – Dan Spurling, SVP Product Engineering, Teradata

It was a year of fast-moving discussions around the modern data stack. Lots of new vendors popped up, and major ones like Snowflake and Databricks continue their journey to take over many technical components, despite the challenging economic situation. But at the same time, voices emerged who questioned the modern data stack as such, whose decoupled approach often leads to many tools and high costs, let alone the complexity of getting it all together. The discussions around the ‘postmodern data stack’ (as just one out of many terms) were started, and we’re all eager to see where this will lead us in the coming years. – Chris Lubasch, Chief Data Officer (CDO) & RVP DACH, Snowplow

2023 will be put up or shut up time for data teams. Companies have maintained investment in IT despite wide variance in the quality of returns. With widespread confusion in the economy, it is time for data teams to shine by providing actionable insight because executive intuition is less reliable when markets are in flux. The best data teams will grow and become more central in importance. Data teams that do not generate actionable insight will see increased budget pressure. – Alexander Lovell, Head of Product at Fivetran

Metadata Will be Driven by Data Lineage. Metadata is the most notable data strategy trend today. However, it’s not just about collecting metadata, but unlocking its power through activation. Data lineage is the foundational type of metadata, with the ability to deliver the most powerful benefits. When done right, it can enable automated and intelligent data management practices. – VP of Research and Education at MANTA, Jan Ulrych

In 2023, I expect Data Creation to go big! There will be a rise of the ‘data product manager’ and ‘data product management’ as the key persona/key skill sets required to treat data as a product, design and implement the right data contracts (as part of data products) and the enablement of organizations to build out a ‘data mesh.’ Furthermore, I expect more organizations to build operational (including real-time) use cases on top of cloud data warehouses and data lakes, supported by better tooling from the core vendors (e.g. Databricks and Snowflake) and companies in the ecosystem. Finally, I predict there will be more thoughtful approaches/technology architectures for managing the tension between data democratization and data compliance for personally identifiable data. Increased focus on measuring ROI on data investments. – Yali Sassoon, Co-Founder, Snowplow

Iteration Replaces Big-Bang: For large-scale modernization projects with high complexity, business leaders might be looking for the “easy button” but CTOs and software architects know that the reality is much different. Carving out microservices from a monolith is an iterative process often requiring refactoring and replatforming. Unless your application is fairly simple, the “one-at-a-time” approach is the recommended path to ensure success and manage risk – and actually increase project velocity. Trying to do too much at once or attempt a “big-bang” modernization or re-write is why most app modernization projects fail, sputter, or just fizzle out. This is not necessarily as slow-and-steady as it seems, as iteration builds velocity that will outpace any massive undertaking in short order. –  vFunction Chief Ecosystem Officer, Bob Quillin 

The rise of the Data Processing Agreement (DPA): How organizations process data within on-premises systems has historically been a very controlled process that requires heavy engineering and security resources. However, using today’s SaaS data infrastructure, it’s never been easier to share and access data across departments, regions, and companies. With this in mind, and as a result of the increase in data localization/sovereignty laws, the rules as to how one accesses, processes, and reports on data use will need to be defined through contractual agreements – also known as data processing agreements (DPA). In 2023, we’ll see DPAs become a standard element of SaaS contracts and data sharing negotiations. How organizations handle these contracts will fundamentally change how they architect data infrastructure and will define the business value of the data. As a result, it will be in data leaders’ best interest to fully embrace DPAs in 2023 and beyond. These lengthy documents will be complex, but the digitization of DPAs and the involvement of legal teams will make them far easier to understand and implement. – Matt Carroll, Co-founder & CEO, Immuta

For the past few years, big data has been framed as a technology that will disrupt diverse industries. However, big data has spiked in adoption thanks to advancements in metadata-driven data fabric, AutoML and the ever-growing variety of data. Data technology communities began discussing metadata-driven data fabric in 2022. Since active metadata-assisted automated functions in the data fabric reduce human effort while improving data utilization, this technology will gain significant traction in 2023. The data fabric listens, learns and acts on metadata or “data in context,” which helps users access contextual information. One of the key strategic differentiators will be having access to contextual data. Machine learning (ML) will become more accessible to non-experts over the next year thanks to AutoML, a class of algorithms that helps automate the design and training of ML models — a process traditionally carried out by human experts. Many organizations now either have in-house talent or partners that have delivered successful big data-driven solutions. These well-documented success stories, coupled with more efficient analytical techniques, have resulted in commercial outcomes becoming increasingly essential filters in big data projects. Consequently, in 2023, we expect to only rarely come across big data projects that use R&D budgets. – Raj Bhatti, SVP, Client Solutions at Cherre
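The automated design-and-train loop behind AutoML can be reduced to a toy sketch: enumerate candidate model configurations, score each on held-out data, and keep the winner. The threshold "model," the search space, and the validation pairs below are invented for illustration; production AutoML systems search vastly larger spaces of architectures and hyperparameters.

```python
def make_threshold_model(threshold):
    """A trivial candidate 'model': classify x as 1 if x >= threshold."""
    return lambda x: 1 if x >= threshold else 0

def evaluate(model, data):
    """Accuracy: fraction of (x, label) pairs the model classifies correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Hypothetical held-out validation data: (feature, label) pairs
validation = [(1, 0), (2, 0), (3, 0), (6, 1), (7, 1), (8, 1)]

# The search space an AutoML system would explore on our behalf
search_space = [1, 2, 3, 4, 5, 6, 7]

best_threshold, best_score = None, -1.0
for threshold in search_space:
    score = evaluate(make_threshold_model(threshold), validation)
    if score > best_score:
        best_threshold, best_score = threshold, score

print(best_threshold, best_score)  # 4 1.0
```

This is what puts ML within reach of non-experts: the human supplies labeled data and a goal, and the search loop handles the model-design decisions.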

Big data isn’t dead (yet): Providers will attempt to get ahead of trends, and we will see many start to advertise that “Big data is dead.” Instead, many organizations are leaning into “smart data” for greater insights. But despite the advertisements, big data will continue to play an important role in business operations — for now. The key is to make sure you have easy-to-use, self-service tools in place that enable cleansing, verifying, and prepping of the data, which can then be plugged into a data analytics model for valuable results and smart decisions. The companies that turn their big data into smart data will be the ones that benefit from the new ways of thinking about data. – Christian Buckner, SVP, Data Analytics and IoT, Altair

In the coming year we will, unfortunately, start to see the industry backsliding into another era of vendor lock-in. Just as the cloud is becoming the center of business, companies are at risk of having limited access to their own data because they’re locked into closed vendor ecosystems, which limits the flexibility and creativity needed to maximize driving value from their data. Businesses whose data is locked in will not be as agile in reacting to market conditions and building new apps and services needed to meet customer demands. And of course, lock-in will also prove problematic for businesses because it will limit their ability to shop around for the best competitive pricing. It will be imperative for companies to invest in open data ecosystems to ensure they can quickly change strategies to keep pace with changing markets. – Gerrit Kazmaier, Vice President & General Manager for Database, Data Analytics & Looker, Google 

Moving to a metadata mindset: Metadata-based data management emerges from the shadows: Metadata is emerging as a vital component of data management as organizations look to accelerate the time to value of their data, optimize costs, and comply with the ever-evolving landscape of industry and governmental regulations. In 2023, the role of metadata in the data ecosystem will continue to grow, spurred by more organizations shifting to the cloud, and a growing interest in data discovery, governance, virtualization, and catalogs, as well as the need to speed up data delivery through the automation of data pipelines and warehouse automation. However, metadata is still often overlooked and understated in data analytics and data management. Businesses should take heed, as this alone could sabotage your data management strategy. When it comes to good quality data, metadata is the foundation. Simply put, metadata is data that provides information about other data so that it can be more easily understood and used by the organization. It answers the who, what, when, where, why, and how questions for data users. Metadata management also plays a large part in supporting data governance programs. With the recent influx of new regulatory compliance laws, businesses are increasingly investing in data governance programs to help securely manage their data assets. Metadata and metadata management can support these efforts by providing the foundation for identifying, defining, and classifying data. And with data quality standards established, metadata management can ensure that the necessary regulatory controls are applied to the corresponding data. As we look ahead, we encourage modern enterprise teams to embrace a metadata mindset, and we expect a new wave of metadata management tools and best practices to be a focal point in the market. – Jens Graupmann, SVP of product & innovation, Exasol

The need for on-the-ground data excavators will become increasingly important as vital information gaps grow wider and wider in “data dark” parts of the world, especially where investment opportunities are plentiful. For example, the projected growth of African markets signals an increase in the production capacity of economies across the continent. Therefore, the demand for data will significantly increase to enable investors and businesses to capitalize on this increase in output. – Joseph Rutakangwa, co-founder and CEO of Rwazi

Global data creation is expected to continue to grow in the new year, nearly doubling by 2025. For savvy businesses, this presents an incredible opportunity if they find new ways to leverage this wealth of data to make smarter and faster decisions. We can expect successful data-driven enterprises to focus on several key AI and data science initiatives in 2023, in order to realize the full value of their data and unlock ROI. These include: (i) Productizing data for actionable insights, (ii) Embedding automation in core business processes to reduce costs, and (iii) Enhancing customer experiences through engagement platforms. The key to success will be underpinning these characteristics with data that is accurate, consistent, and contextual. For example, higher-quality data provides better fuel for training machine learning applications and programs which translates into greater efficiency for MLOps and AIOps.  Also, focusing data engineering efforts to improve consistencies in how data is standardized, labeled, and delivered can unlock greater collaboration and productivity with domain experts. Data integrity will be integral for fueling the top data initiatives of 2023. – Precisely’s Chief Product Officer, Anjan Kundavaram

APIs will drive democratization of data: APIs make it easy to adjust, transform, enrich and consume data. Traditionally, hundreds of highly paid engineers were needed to manage the data, and data scientists were needed to understand algorithms. In 2023, we will see a shift towards API technologies managing data as a way to gain insights and also control data-related costs, which means people will no longer need highly developed engineering skills to harness the power of data. – Rapid’s CEO and founder Iddo Gino

The data mindset will shift to real time. Many businesses use Apache Kafka to send data to traditional systems like data warehouses or databases for processing, essentially treating data streams like static data. In the next year, we will see more developers make use of data streams in their real-time form. Expect to see direct lines into more machine learning, analytics, and business applications as companies take advantage of real-time data. – Chad Verbowski, Senior Vice President of Engineering, Confluent
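The batch-versus-streaming mindset shift can be shown in a few lines. This sketch is deliberately Kafka-free: the generator stands in for a topic, and the point is only that a streaming consumer has an answer after every event, while the batch approach must land everything first:

```python
import random
import statistics

def event_stream(n):
    """Hypothetical stand-in for a data stream: yields order amounts."""
    random.seed(7)  # fixed seed so both passes see identical events
    for _ in range(n):
        yield round(random.uniform(5, 100), 2)

# Batch mindset: collect everything into a static store, then compute once.
events = list(event_stream(1000))
batch_avg = statistics.mean(events)

# Streaming mindset: maintain a running aggregate, usable after every event.
count, total = 0, 0.0
for amount in event_stream(1000):
    count += 1
    total += amount
    running_avg = total / count  # available in real time, per event

# Same answer either way; the stream just delivers it continuously.
assert abs(running_avg - batch_avg) < 1e-9
```

In a real deployment the loop body would be a stream-processing job (Kafka Streams, Flink, ksqlDB, etc.) feeding ML or analytics directly, rather than waiting for a warehouse load.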

In 2023, IT leaders and organizations are going to rely even more heavily on data. Making informed data-driven decisions is more crucial than ever, as companies navigate economic uncertainty and brace for a recession. Having data to inform purchase or renewal decisions around the tools and technologies employees use can help businesses cut back on costs while actually improving efficiencies and overall agility. Companies today drastically overspend on cloud applications without even realizing it. For many organizations, the number of SaaS subscriptions in use is three to six times higher than IT leaders think. Organizations will invest in solutions that automatically collect, normalize, and analyze data about SaaS tools, costs, usage, and savings opportunities, so that they can evaluate which apps to prioritize, which to get rid of, and how to right-size contracts to match app utilization. It’s more important than ever for IT leaders to use data to safeguard organizations against economic uncertainty, deliver value, and eliminate inefficiencies, risk and wasted spend. – Uri Haramati, Founder and CEO of Torii

The year big data becomes accessible: Most companies now understand the value of data as an asset, but many still struggle to unlock its true value. Trying to do so has partly driven the growth in managed Database-as-a-Service and/or Data-Management-as-a-Service offerings that reduce complexity. 2023 will see the next step in this process, with a wave of data visualization platforms and tools being adopted by the C-suite to better understand big data and use it to make better-informed decisions. – Heikki Nousiainen, CTO and co-founder at Aiven

Data Streaming: This year, the concept of data as a product will become more mainstream. Across many industries, data streaming is becoming more central to how businesses operate and disseminate information within their companies. However, there is still a need for broader education about key data principles and best practices, like those outlined through data mesh, for people to understand these complex topics. For the people creating this data, these new concepts and principles require data to be treated like a product so that other people can consume it easily, with fewer barriers to access. In the future, we expect to see a shift from companies using data pipelines to manage their data streaming needs to allowing this data to serve as a central nervous system so more people can derive smarter insights from it. – Danica Fine, senior developer advocate at Confluent

Open Source: Digital Transformation of Traditional Industries: I believe that open source and open systems will become even more critical over the next year. In particular, traditional industries like manufacturing in the United States will look to open systems to rebuild infrastructure to become more modern, cost-effective, and globally competitive. Open systems will allow traditional industries not to be locked in by legacy vendors and allow them to stay on the pulse of cutting-edge tools and technologies like AI, ML, AR, and more. They will remove data silos, allowing data to be shared easily internally or with outside partners for better analysis. I predict that this transformation will happen in two ways. First, through embracing the cloud: with cloud-based open systems, engineers will be able to share data more easily and take full advantage of modern data processing, analytics tools, and the elasticity of the cloud to reduce operating costs. Second, democratizing infrastructure by embracing open-source projects will open traditional industries like manufacturing and automation to a larger developer community ecosystem. – Jeff Tao, Founder and CEO of TDengine

Freedom and flexibility will become the mantra of virtually every data management professional in the coming year. In particular, data management professionals will seek data mobility solutions that are cloud-enabled and support data migration, data replication and data synchronization across mixed environments including disk, tape and cloud to maximize ROI by eliminating data silos. We will likewise see an uptick in solutions that support vendor-agnostic file replication and synchronization, are easily deployed and managed on non-proprietary servers and can transfer millions of files simultaneously – protecting data in transit to/from the cloud with SSL encryption. – Brian Dunagan, Vice President of Engineering, Retrospect

Governments will need to go beyond the data and focus on how it can be used to support storytelling so they can present data in more profound and impactful ways. Data alone doesn’t motivate people to take action, but a story can inspire action. Storytelling encourages user-centric thinking and helps governments work toward more innovative and connected solutions. – Cathy Grossi, vice president, product management at Accela 

The IT complexity of data storage, processing and analytics is reaching its breaking point: The current methods of storing and analyzing data have typically involved data warehouses and data lakes (i.e., originally using technologies like Hadoop).  With the advent of cloud computing, data lakes and warehouses have been able to move from on-premises to the Cloud to take advantage of scale economics (think Snowflake, Databricks, Azure Synapse, Amazon Athena to name a few).  All of the major cloud service providers now offer a set of robust data capabilities – storage, technical meta-data management, pipelines, warehousing, data science workbenches, etc. The challenge is that in most large enterprises, data is now being copied and proliferating into multiple disparate on-prem data marts and data lakes across multiple cloud providers. The more data is being copied, the greater the risk of data fidelity, integrity and quality issues, along with risk of data leakage and cybertheft. The latest technology innovations to try and manage data across all of these new environments include data fabrics, or data meshes. However, data fabrics are only exacerbating the complexity of the data estate. New technologies will emerge which will get to the core of data reusability and dramatically change the landscape and conversation.  This will be more of a return to better, smarter uses of reusable ‘Small Data’ components versus continuing to proliferate more and more ‘Big Data.’ – Eliud Polanco, the president of Fluree

Real-time data processing: Data Platforms: For enterprise business applications, user responsiveness is a major factor that determines customer behavior, retention, and loyalty. All modern digital interactions are going to be seamless; B2B and B2C users expect real-time responses to their requests and transactions. These digital interactions are typically small transactions, but they require modern data platforms for storing and processing billions of objects at scale. Data platforms and their infrastructure should evolve fast enough to deal with the present and future needs of enterprises by adopting real-time data processing at the economics of scale. – Tony Afshary, Global VP of Products and Marketing, Pliops

Data quality determines success: Over the past few years, many companies have made significant strides to accelerate their CX initiatives, but most are only now realizing the success of these programs is reliant on the quality of their data. This data can be sourced through solicited customer feedback, like surveys, or unsolicited feedback, like the conversations that happen in contact and customer service centers. In 2023, transcription accuracy of omnichannel customer-brand interactions will transition from a “nice-to-have” to a critical capability. The most successful organizations in the coming year will be the ones who understand the direct correlation between transcription accuracy and the quality of customer insights, and then use that better intelligence to drive even greater CX value. – Eric Williamson, CMO of CallMiner

Data Projects Face Prioritization: Market conditions will impact data projects, but not stop them. The trajectory toward leveraging data for business insight and advantage won’t be reversed despite reduced funding or revenue. In fact, it can’t. The need to eke out every little bit of profit, revenue and cost savings from the business requires data. But you’ll have to prioritize. If you have an ongoing data quality project, a data security project, and a BI or Analytics project, you’ll need to decide which is a “must have”, which is a “should have” and which is really a “nice to have.” The balance will start to tip toward security as the “must have” as it’s foundational to enable the other projects. – James Beecham, founder and CEO, ALTR

The synergy between structured and unstructured data: Despite the exponential growth of unstructured data, structured data will still carry substantial value in the future. And it’s almost inevitable for organizations to deal with both structured and unstructured data simultaneously to realize maximum business growth. Incumbent solutions originally engineered to deal with structured data for traditional data analytics can extend their processing capabilities to unstructured data through plug-ins, like “native vector search” in ElasticSearch 8.0 and “vector similarity search” in Redis 6.0. For AI applications known for intensive unstructured data, that’s where a purpose-built solution like vector databases shines, complemented with the hybrid search functionality that supports filtering based on tags, attributes, etc. – Frank Liu, Director of Operations at Zilliz
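The hybrid search pattern described above (vector similarity complemented by filtering on tags or attributes) can be sketched in a few lines of plain Python. This is a toy with made-up documents, tags, and vectors, not how ElasticSearch, Redis, or any vector database implements it internally; real engines use approximate-nearest-neighbor indexes to do the same thing at scale:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical corpus: unstructured content embedded as vectors,
# plus structured attributes (tags) alongside each document.
docs = [
    {"id": 1, "tags": {"shoes"},   "vec": [0.9, 0.1, 0.0]},
    {"id": 2, "tags": {"shoes"},   "vec": [0.1, 0.9, 0.0]},
    {"id": 3, "tags": {"jackets"}, "vec": [0.95, 0.05, 0.0]},
]

def hybrid_search(query_vec, required_tag, k=1):
    # 1) structured filter on the tag, 2) rank survivors by similarity
    candidates = [d for d in docs if required_tag in d["tags"]]
    candidates.sort(key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["id"] for d in candidates[:k]]

# Doc 3 is the closest vector overall, but the tag filter excludes it.
print(hybrid_search([1.0, 0.0, 0.0], "shoes"))  # → [1]
```

The design point is exactly the synergy in the prediction: the vector handles the unstructured side, and the tag filter brings the structured side back into the same query.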

CIOs, chief data officers, and data managers will need to confront the cost of modernization: Modernization and digital transformation involve many long-term initiatives like migrating major systems to the cloud or constantly integrating existing data and systems in order to meet business process needs. These can be high-cost initiatives, but are more and more necessary in today’s digital world. To minimize the risks of failures with these initiatives, executives should look for ways to balance their long-term vision with the short-term ROI of modernization, like working with a data integration platform or automating the integration process. – Rajesh Raheja, Chief Engineering Officer, Boomi

Data will be the center of collaboration for organizations: We live in a data-rich environment with most companies sitting on an excess of data. But as much data as businesses have, there’s usually a lack of access to it because everything is siloed and owned by various stakeholders and departments spread across the organization. In 2023, we’ll see these data silos start to break down as companies move towards digital workplace tools, which offer a single place to get work done. In the digital workplace, teams can easily access and bring multiple data sources together to get a full-picture overview of their business, and make more informed decisions because of it. While decisions have long been ruled by the highest-ranking official or loudest person in the room, we’ll see data opening dialogue between team members and lending itself to fresh ideas. When data becomes the center of team communications, collaboration will increase and teams will drive smarter, more-informed decisions for the business. – Dean Guida, Founder of Slingshot & CEO of Infragistics

Enterprises are facing an issue when it comes to data. Insights generated by the data are not translated quickly enough, and it takes so long to process and analyze it that when the insights are finally in hand, it’s often too late to act upon them. DataOps is the application of agile engineering and DevOps best practices to data management, and it promises to improve the success rate of data and analytics initiatives. It is a process-oriented, automated, and collaborative approach to designing, implementing, and managing data workflows and a distributed data architecture. In 2023 we will see more companies recognize the value of data, implement DataOps strategies and become data-driven businesses. Those that have begun that process will start to see a reduction in the number of times data causes exceptions in applications (and vice versa) and an improvement in the ability to deliver data projects on time. With these improvements, organizations will be able to build quality and trust back into the modern data environment, improving data quality and driving positive business outcomes. – Ram Chakravarti, Chief Technology Officer at BMC Software

In 2023, the most successful data people won’t be judged just on their knowledge of SQL, Tableau, or dbt. Instead, they’ll be measured by their impact on the business at large. This is a discipline data practitioners have largely struggled with in the past, but in a challenging economy, demonstrating ROI is everything. I hear many of my peers talk about the importance of business people being data literate, but I believe this is the year that we see more data people become business literate. Data scientists, analysts, engineers, and stewards are as essential to business outcomes as anyone in marketing, finance, or sales. After all, they deliver the raw material and dashboards that spark insight. When data people understand how the business operates, makes money, reduces risk, and drives innovation, they take greater pride in their work and focus on the metrics that the business cares most about. – Juan Sequeda, Principal Scientist at

Chatbots and Conversational AI

Is Conversational AI becoming too human-like? Technologies like natural language processing/generation (e.g. GPT-3, BERT, etc.) are becoming more and more powerful. As they do, conversational AI is evolving to support more human-like relationships. This will lead companies to think through the ethical implications of building conversational AI tools along three questions: 1. Does the conversational AI have a human-looking avatar that might embed stereotypes? 2. Does it set out to understand the human user? 3. Does it behave like a human in a way that changes the relationship with the end user? – Accenture

Virtual agents will become the experience integration hub for employees. As conversational AI, ML, and NLP capabilities expand, virtual agents will be able to resolve more complex issues and meet employees where they are to become the experience integration hub. Looking ahead, virtual agents will continue to evolve from automating the tier 1 issues handled by service desk agents (password resets, questions on PTO, etc.) to resolving much more complex issues that historically required higher-skilled IT staff. This will not only be critical to the journey of upleveling employee self-help, but also to reducing costs during uncertain economic conditions. Automation will be used to upskill current teams so leaders don’t have to mine for technical expertise. – Pat Calhoun, CEO and founder of Espressive

Chatbots will chat less and answer questions more. Humans don’t want to spend more time interacting with machines as if they were talking to people; they really just want their questions answered quickly and efficiently from the start, without lengthy wait times or having to choose from a myriad of options. Although many chatbots accurately execute the specific tasks they were designed to do, they fall far short of end-user expectations because they rarely answer their actual questions. In 2023, organizations will finally be able to complement chatbots with Natural Language Search capabilities. Because Natural Language Search understands human language and can process unstructured text-based data (documents, etc.), individuals can phrase questions using their own words, as if they were speaking to a person, and receive all the relevant answers back instantly. – Ryan Welsh, Founder and CEO of Kyndi

AI-powered surveys will change the customer feedback game: We’ll see attention shift beyond simply delivering strong customer experiences to how well each of those experiences are perceived by the consumer. Thanks to AI and a wealth of rich chat, SMS and WhatsApp messages showing the cadence of customer and agent conversations, as opposed to stop/start email interactions, organizations will get smarter and more sophisticated feedback. AI-powered surveys that dynamically morph to get maximum feedback from customers based on prior responses, will reshape the customer feedback game and textual analysis will be able to pull together a much more integrated sense of how customers feel. – Colin Crowley, CX Advisor at Freshworks

Over the last several decades, the value of automation has largely been derived from using robotics to replicate human actions and eliminate laborious, repetitive tasks. This coming year, I predict we’ll witness a significant expansion beyond robotics to intelligent automation, which uses artificial intelligence and analysis to carry out data-driven tasks with very little human interaction. This enablement shifts reliance off humans and onto technology, so workers can focus their attention on other areas of the business. As more businesses adopt this newer structure, they’ll find greater efficiencies in everyday tasks across their organization. Imagine streamlining hundreds of processes and decisions—everything from prioritizing employee work tasks, to determining the products stocked on shelves, to automating customer contact—with the push of a button. The possibilities and opportunities are endless for optimizing workflows and reducing costs. – Srinivasan Venkatesan, Executive Vice President, U.S. Omni Tech, Walmart Global Tech

AI shopping assistants: Something I personally don’t enjoy is buying gifts for the holiday season (don’t judge!) –– the good news is, it won’t be long before AI comes to the rescue for people like me. Smart online shopping with AI assistants is right around the corner and will have the ability to suggest items that your kids, partner, or parent will love, all while making unique recommendations based on the person you’re shopping for. It’s all about automation simplifying our lives. – David Ly – CEO and founder of Iveda

AI Assistants Meet the Data Professional: AI assistants finally seem ready to be useful for data professionals. GitHub Copilot routinely writes code in a way that seemed impossible a decade ago, while OpenAI Codex is able to plainly understand and explain code. Every data professional knows the pain of issuing simple queries and aggregates against a database, or counting records looking for a discrepancy, or generating charts for a presentation. By handing these simple-in-theory but nuanced and idiosyncratic tasks over to an AI assistant, data professionals can free their time and focus on problems that truly deserve intelligence. – from Prefect CEO and Founder, Jeremiah Lowin

The chatbot evolution is upon us: We’ll continue to see advancements with AI technology like OpenAI to drive independent human interaction, and AI training itself to better adapt to human responses. In the customer service space specifically, businesses will forego the method of human interaction to allow AI technology to take the place of call centers and human operators. – Olga, VP of AI and Machine Translation at Smartling

By this time next year, a new kind of conversational AI will be breaking away from the awkward pauses and turn-taking that can make voice interfaces feel robotic. Voice AI will also be more proactive, noting the context of a situation and using it to make helpful suggestions. The test ground for this technology will be the restaurant space, and as it transitions from the lab to the drive-thru we’ll see that it’s ready for broad commercialization, particularly in industries battling labor challenges and inflation – like restaurants. – Zubin Irani, CRO, SoundHound

With the Natural Language Processing (NLP) Tech Boom, More Contact Centers Will Strategically Implement Conversational AI Bots: As companies shift contact centers to the cloud, AI comes baked into many cloud solutions such as Amazon Connect. This brings in an NLP component, which makes chatbots and interactive voice response (IVR) systems more intuitive and conversational. There’s been an explosion in NLP startups in the last year, not to mention Google’s release of LaMDA, a large language model for chatbot applications. At the same time, big tech companies like Microsoft, Google, Amazon and Salesforce are now investing in contact center solutions to sell to their enterprise customers. Deploying IVRs and chatbots powered by conversational AI allows companies to improve customer support and, in turn, drive more value for the business. NLP enables bots to understand customer intent and use that context to provide a better customer experience. Bringing AI into the contact center also increases the speed and volume at which customers are helped, reducing wait times and requiring fewer live phone calls with agents. More satisfied customers mean higher retention and less revenue lost from customer churn. So now, the contact center is strategic. – James Isaacs, President of Cyara
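Intent understanding, the capability at the heart of the prediction above, can be illustrated with a deliberately tiny keyword scorer. Real contact-center NLP uses trained language models rather than word overlap; the intent names and keyword sets here are invented for illustration only:

```python
# Toy intent detector: map a free-text utterance to the intent whose
# keyword set it overlaps most, falling back when nothing matches.
INTENTS = {
    "reset_password": {"password", "reset", "locked", "login"},
    "billing_question": {"bill", "invoice", "charge", "refund"},
    "speak_to_agent": {"agent", "human", "representative"},
}

def detect_intent(utterance):
    words = set(utterance.lower().replace("?", "").split())
    # score each intent by keyword overlap with the utterance
    scores = {name: len(words & kws) for name, kws in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "fallback"

print(detect_intent("I am locked out and need a password reset"))
print(detect_intent("Why is there an extra charge on my invoice?"))
```

Once an intent is known, the bot can route to the right automated flow or escalate to a live agent, which is exactly the "understand intent, then use that context" loop the paragraph describes.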

Today’s customers look for highly flexible communication, and they don’t want to be restricted by pre-defined experiences. They are looking for conversations with businesses that can switch between modes and channels to deliver more immersive and engaging experiences. The key to this is multiexperience, which is all about creating seamless and simple experiences across apps, digital touchpoints, and interaction modalities. In 2023, businesses will need to adopt a strategy equipped with the ability to drive automation across different modalities, be it voice or chat, across web or social, as well as across channels, be it WhatsApp or Instagram. And conversational AI-powered dynamic agents will be key to delivering the much-sought multiexperience. Additionally, conversational commerce will see increased demand and adoption, becoming a non-negotiable element in every brand’s marketing, sales, and customer support strategies. When it comes to channels, voice emerged as a prominent channel of communication between customers and businesses in 2022. Next year, the focus will be on making voice AI more human. As users demand more human-like, hyper-personalized experiences while interacting with voice AI agents, 2023 will see the industry working towards achieving this feat. Voice AI agents will only become more mainstream if synthetic monotones are completely replaced by conversational human tones. There will also be a quicker transition from analogue voice in telephony to digital voice, which will eventually lead to voice interactions supplemented by video. – Raghu Ravinutala, CEO & Co-founder,

Next-gen conversational AI solutions that can understand the context and hold dialogues like a human are starting to kill the scripted chatbots that only frustrate users: End users don’t want to get on the phone with someone if they can avoid it. They want tech on their side to help them solve problems and answer questions in the moment, while still experiencing an interaction that feels like they are talking to a human. Companies cannot offer this level of personalization at scale unless they use technology—and the right tech at that. The conversation has to feel natural and follow the customer’s needs, not the business’s. That means scripted chatbots with rigid rules won’t cut it. Chat must be equipped with Natural Language Processing technology that can understand incoming messages, respond appropriately and take proper action to truly meet customer expectations and generate revenue. – Jim Kaskade, CEO of Conversica


Hybrid Cloud. I look at hybrid cloud deployments as deployments that have significant production usage in both cloud and data centers. These typically arise where each location (cloud and data center) possesses tools or requirements that are uniquely satisfied by something contained therein. Think about data location requirements or unique hardware for data centers, or using tools like BigQuery or Cloud Dataflow in Google Cloud. While we’ve made incremental steps towards true hybrid cloud over the years, hybrid cloud deployment patterns are still in their infancy. This is primarily because the tooling is all relatively new to consumers. However, I expect this to change next year, primarily because the tooling, led by the increasing maturity of Anthos, is at a tipping point where companies will start really investing in hybrid deployments. – Peter-Mark Verwoerd, Associate CTO, SADA

Hybrid computing is dead. Hybrid computing will cease to exist as everyone transitions to the cloud. Sales teams at Azure, AWS, and Google will continue to focus on moving legacy applications and organizations to the cloud because it’s where they make the most money, contributing to the death of hybrid computing. – Uma Chingunde, VP of Engineering at Render

Prioritize a cloud-first strategy, not cloud-only: We are still working on reining in cloud spend and where to appropriately place workloads. A few years ago, some believed on-premise hosting was dying, but this has not been the case. On-prem has a use and a purpose, and it works within a more traditional finance model. However, you need to do what works best for your industry, your skillset and your capacity. Migrating to the cloud or moving to on-prem to save money requires investment, as both are niche expertise. Some organizations have completely shifted to the cloud, but we are currently in a hybrid world. Moving into 2023, companies must follow a cloud-first strategy, not a cloud-only strategy. – Jesse Stockall, Chief Architect at Snow Software

Inflationary pressure will force major public cloud providers to reevaluate pricing and fee structures: Many organizations rely on public cloud service providers to help them “do more with less” and meet organizational requirements to store and analyze growing volumes of data with IT budgets that aren’t necessarily growing at the same rate. However, delivering on this expectation will become increasingly difficult for cloud providers in 2023, as inflationary pressures continue to rise. As a result, there’s a good likelihood that organizations will be met with service list price and fee increases as cloud hyperscalers seek to maintain margins. The simple fact is that $/GB rates for cloud storage have stopped trending down, and there have been no price reductions among leading providers since 2017. Between 2018-2022, instead of list-price reductions, many providers have introduced additional, lower-cost storage tiers (e.g., “cold” or “archive” tiers) as a way to help customers achieve ongoing cost reductions. But these new tiers require adoption, transfer of data, and new data lifecycle and management policies to realize these cost reductions. 2023 may mark the beginning of a new era of expectations when it comes to cloud infrastructure services pricing – and we expect organizations to heavily scrutinize the risk and impact of potential price changes from their cloud storage provider. – Andrew Smith, Senior Manager of Strategy and Market Intelligence, Wasabi Technologies
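The tiering trade-off described above is easy to see with back-of-the-envelope arithmetic. Every price in this sketch is hypothetical, chosen only to be in the ballpark of typical hot/archive tier spreads, and is not any provider's actual rate:

```python
# Hypothetical $/GB-month prices and retrieval fee (illustrative only).
HOT_PER_GB_MONTH = 0.023
ARCHIVE_PER_GB_MONTH = 0.004
RETRIEVAL_PER_GB = 0.02

def monthly_cost(gb, archived_fraction, retrieved_gb=0.0):
    """Monthly storage bill with a share of data moved to the archive tier."""
    hot = gb * (1 - archived_fraction) * HOT_PER_GB_MONTH
    cold = gb * archived_fraction * ARCHIVE_PER_GB_MONTH
    return hot + cold + retrieved_gb * RETRIEVAL_PER_GB

# 500 TB: everything hot vs. 80% archived with 2 TB of monthly retrieval
all_hot = monthly_cost(500_000, 0.0)
tiered = monthly_cost(500_000, 0.8, retrieved_gb=2_000)
print(round(all_hot), round(tiered))  # → 11500 3940
```

The numbers make the paragraph's point concrete: the savings are real, but only after the organization takes on the migration and lifecycle-policy work required to actually move data into the cheaper tier, and retrieval fees claw some of it back.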

To the Cloud and Back Again: Enterprises will look to repatriate data from the public cloud to reduce their operating costs. Enterprises know they can reduce their costs if they can get their data back on-prem, particularly for data that need to be retained for long periods of time for compliance reasons. Digital data sources, including imagery and video, continue to grow exponentially, forcing organizations to confront the impracticality of relying on the public cloud for all of their data management and storage. Cost constraints, access control, data sovereignty, and longer-term retention of massive amounts of data will influence large enterprises to consider building their own low-cost storage cloud internally for access and use across the entire organization. – Tim Sherbak, Product Manager, Quantum

As digital data repositories continue to grow, organizations are under increased pressure to optimize their IT infrastructures to take full advantage of the power of their data. Additionally, the concept of a cloud-first strategy for organizations is becoming more mainstream. These trends are driving the need for organizations to manage data at scale in single-cloud or multi-cloud architectures, raising new complex challenges regarding the management, usage and preservation of data stored in the cloud. The rise of modern multi-cloud and multi-location ecosystems provides agility and flexibility in data utilization across clouds and sites, where data, applications and workflows are accessible to drive more value from the data.  New tools, such as distributed multi-site, multi-cloud data management software, will help to unify and simplify data access, usage and placement across on-premises storage and multiple clouds. This will empower organizations to leverage cloud services no matter where data is created or stored for optimal productivity and collaboration. – Deanna Hoover, Director of Product Marketing for Spectra Logic

Proper Data Protection in the Cloud will put Data Democratization Into Sharper Focus: As positions across companies become more reliant on utilizing data in their daily roles, and with data primarily being protected in the cloud, there are more opportunities emerging for businesses to leverage data in ways they have not yet done before. More specifically, the cloud opened up the ability to leverage infinite compute, to democratize access to data in ways that could not be previously supported. Before the cloud, backup data protection meant locking up highly sensitive data without being able to utilize it again, for fear of threatening its security. Data stored in the cloud, however, can stay protected and be repeatedly accessed across an organization while remaining secure. For example, many businesses are now taking advantage of data lakes to warehouse large swaths of data that can go on to ultimately train artificial intelligence (AI) and machine learning (ML) models. This makes it necessary to protect both the original data source as well as the new data that is continuously generated from these models, on a granular level. The key here will be to execute incredibly well in regard to data protection in the cloud in order for this to become a regular and effective practice. This dual use of data will revolutionize how businesses are able to interact with data to achieve their desired outcomes, while continuing to secure such data at all times. – Chadd Kenney, VP of Product, Clumio

Continued AI breakthroughs due to increasing cloud options: Testing and experimentation with AI systems and machine learning tools now spans everything from “art-to-documentation” to “code assistance.” These integrated innovations are driving more sophisticated intelligence into cloud systems, which helps accelerate development, support, customer interaction and adaptive tooling. Artificial intelligence and machine learning (AI/ML) is an exploding area of innovation that is pushing breakthrough services for the cloud. – Chris Chapman, CTO at MacStadium

Database/Data Warehouse/Data Lake

The Return of Data Modeling. In 2023, industry veterans who spent nearly a decade calling for thoughtfulness in building fundamental data infrastructure instead of rushing to build buzzworthy products will get their “I told you so” moment. Data modeling is making a comeback, alongside the realization that without the infrastructure to deliver high-quality data, businesses will not get very far towards the promise of predictive analytics, machine learning/AI, or even making truly data-driven decisions. – Satish Jayanthi, CTO and co-founder of Coalesce

Databases will Streamline the Tech Stack, Enabling DevOps to do More with Less. With today’s organizations dealing with massive amounts of data in an uncertain macro environment, they will continue to move away from relational databases to multi-model databases, which can handle different data types and models, including documents, graphs, and relational and key-value databases – all from a single, integrated backend. Multi-model databases provide unified data management, access, and governance. This speeds up time to market and lowers deployment and operational costs, saving resources at a time when budgets are tight. We’ll also see next-generation databases emerge that can seamlessly support both transactions and analytics for real-time business insights. No longer will businesses need a separate vendor solution for operational analytics. – Rahul Pradhan, VP of Product at Couchbase

In 2023 with rising economic concerns, we will see companies taking a deeper look at cloud data warehouse costs and becoming much more serious about cost control. Information on total cost of ownership is critical to effectively managing cost for data teams because driving down costs one at a time will undermine the efficacy of the whole system. The burden of proof shifts to data teams to demonstrate efficiency with total cost of ownership and that data insights are driving excess value to the business. – Alexander Lovell, Head of Product at Fivetran

Cloud databases will reach new levels of sophistication to support modern applications in an era where fast, personalized and immersive experiences are the goal: From a digital transformation perspective, it’s about modernizing the tech stack to ensure that apps are running without delay – which in turn gives users a premium experience when interacting with an app or platform. Deploying a powerful cloud database is one way to do this. There’s been a massive trend in going serverless, and using cloud databases will become the de facto way to manage the data layer. In the next year, we will also see the decentralization of data as it moves closer to the edge to offer faster, more dependable availability. Additionally, we’ll start to see the emergence of AI-assisted databases to enable teams to do more with less. The proliferation of data will only continue, making AI-assisted databases a critical strategy for making the data lifecycle more operationally efficient for the business. – Ravi Mayuram, CTO, Couchbase

Enterprises move from traditional data warehouses to real time data storage: In 2023, we will continue to see movement away from traditional data warehousing to storage options that support analyzing and reacting to data in real time. Organizations will lean into processing data as it becomes available and storing it in a user-friendly format for reporting purposes (whether that’s as a denormalized file in a data lake or in a key-value NoSQL database like DynamoDB). Whether it’s a manufacturer monitoring streaming IoT data from machinery or a retailer monitoring ecommerce traffic, being able to identify trends in real time will help organizations avoid costly mistakes and capitalize on opportunities when they present themselves. – Jay Upchurch, Executive Vice President and Chief Information Officer, SAS
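
As a toy illustration of the real-time trend detection described above, a rolling-window spike detector can flag readings (say, ecommerce traffic) that jump well above their recent baseline. The window size and threshold here are invented for illustration; real systems would use more robust statistics.

```python
from collections import deque

def make_spike_detector(window=5, factor=2.0):
    """Return a closure that flags values exceeding `factor` times
    the average of the last `window` observations."""
    history = deque(maxlen=window)

    def observe(value):
        # With no history yet, the first value is its own baseline.
        baseline = sum(history) / len(history) if history else value
        history.append(value)
        return value > factor * baseline

    return observe

detect = make_spike_detector()
readings = [100, 98, 103, 101, 99, 240]   # final reading is a traffic spike
flags = [detect(r) for r in readings]     # only the last reading is flagged
```

Because the detector keeps only a small fixed-size window, it can run inline on a stream without storing the full history.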

The rise of the data lakehouse will lead to simplicity but will require a rethinking of data security and governance: Enterprises will increasingly migrate to the data lakehouse, which is a unified data platform for previously siloed workloads such as data engineering, warehousing, streaming, data science, and machine learning. Unifying these silos unlocks tremendous value for businesses, but also requires the organization to evolve to realize the full potential of the platform. Understanding that data remains one of the most valued assets within a business, forward-thinking enterprises will also be looking for simplicity by consolidating on a single vendor lakehouse architecture. Enterprises will evolve their teams and processes to adopt a secure, centralized management and governance model with the associated tooling for the data assets that span multiple workloads. – Fermín J. Serna, Chief Security Officer, Databricks

Open Lakehouse Will More Effectively Augment the Proprietary Cloud Enterprise Data Warehouse: As the architectural paradigm shift toward the lakehouse continues, the disaggregated stack will evolve into a more fully featured data management system, with disjoint components becoming cohesive stacks that include metadata, security, and transactions. – Dave Simmon, Co-founder and CTO, Ahana

In a multi-cloud world, object storage is primary storage: Right now, databases are converging on object storage as their primary storage solution, driven by performance, scalability and open table formats. One key advantage of the rise of open table formats (Iceberg, Hudi, Delta) is that they allow multiple databases and analytics engines to coexist. This, in turn, creates the requirement to run anywhere, something that modern object storage is well suited for. The early evidence is powerful: both Snowflake and Microsoft will GA external tables functionality in late 2023. Companies will now be able to leverage object storage for any database without ever needing to move those objects directly into the database; they can query in place. – Anand Babu Periasamy, Co-founder and CEO at MinIO

For years, data lakes held the promise of taming data chaos. Many organizations dumped their ever-growing body of data into a data lake with the hope that having all their data in one place would help bring order to it. But data lakes are overhyped and often lack proper governance. And without clean, curated data, they simply do not work. That’s why many organizations that implemented data lakes are realizing that what they actually have is a data swamp. Having clean, curated data is valuable. That’s a fact. But dirty data swamps are not, and organizations must prioritize the importance of accurate and integrated data, develop a strategy of eliminating data silos, and make cleaning data everyone’s responsibility. – Tamr Co-Founder and CEO Andy Palmer

Standards-based Semantic Layers Will Power Data Selection through Business Terms: Data fabrics, data lakes, and data lakehouses contain a surplus of unstructured and semi-structured data from external sources. In 2023 there will be a significant uptick in organizations applying W3C standards-based semantic layers atop these architectures, where data assets are described by metadata in familiar business terms and enable business users to select data through a lens of business understanding. This method will provide a seamless business understanding of data that fosters a culture of data literacy and self-service, while simplifying data integration and improving analytics. – Jans Aasman, Ph.D., an expert in Cognitive Science and CEO of Franz Inc.
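
The core idea of a semantic layer can be sketched in a few lines: metadata maps a familiar business term to the physical asset behind it, so a user never needs to know table or column names. The terms, tables, and metadata shape below are invented for illustration; a real W3C-based layer would use RDF/OWL vocabularies rather than a Python dict.

```python
# Hypothetical metadata: business vocabulary -> physical data assets.
SEMANTIC_LAYER = {
    "customer lifetime value": {"table": "analytics.clv_scores", "column": "ltv_usd"},
    "churn risk":              {"table": "ml.churn_predictions", "column": "risk_score"},
}

def resolve(business_term):
    """Translate a business term into a query against the physical asset."""
    asset = SEMANTIC_LAYER[business_term.lower()]
    return f'SELECT {asset["column"]} FROM {asset["table"]}'

query = resolve("Churn Risk")
# query == 'SELECT risk_score FROM ml.churn_predictions'
```

The business user asks for "Churn Risk"; the layer, not the user, knows where that data physically lives.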

End in sight for legacy databases: 2023 will see an acceleration of the end for legacy databases. With the world moving towards real-time unified databases, speed has been an important differentiator, and legacy systems can no longer keep up with the real-time nature of this digital services economy. We have seen this trend in industries like finance up until this point, but it’s now becoming apparent to business leaders across sectors that the digital revolution begins with the tech stack that holds your company together: the database. We are ushering in an era of unified, simple, modern data in real time. Without this, your company will likely not see 2024. – Shireesh Thota, SVP of Engineering, SingleStore

Organizations’ competitive advantage lies in being able to easily build intelligent, data-driven applications. This requires today’s developer to unlock and leverage data from both operational and analytical systems and infuse machine learning models into their applications. We believe in the coming years, the barriers between transactional and analytics workloads will disappear. Traditionally, data architectures have separated these workloads because each needed a fit-for-purpose database. Transactional databases are optimized for fast reads and writes, while analytical databases are optimized for aggregating large data sets. With advances in cloud-based data architectures that leverage highly scalable, disaggregated compute and storage with high-performance networking, we predict there will be new database architectures that allow both transactional and analytical workloads within one system without requiring applications to compromise on workload needs. – Andi Gutmans, Vice President & General Manager, Google Databases, Google
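
The pattern described here, one engine serving both transactional writes and analytical reads without an ETL hop between systems, can be sketched minimally. sqlite3 is used below purely as a self-contained stand-in to show the shape of the workload mix; it is not an HTAP system and nothing here reflects Google's actual database architecture.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

# Transactional path: fast single-row writes, as an application would issue.
for amount in (19.99, 5.00, 42.50):
    conn.execute("INSERT INTO orders (amount) VALUES (?)", (amount,))
conn.commit()

# Analytical path: an aggregate over the same live data, no export step.
total, count = conn.execute(
    "SELECT SUM(amount), COUNT(*) FROM orders"
).fetchone()
```

In a traditional architecture the aggregate would run against a separate warehouse loaded by a pipeline; the prediction above is that both paths converge on one system.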

From centralized Hive catalog to open table formats in data lakes: With data lakes becoming the primary destination for a growing volume and variety of data, having a table format for data stored in a data lake is a no-brainer. More organizations now have realized that Hive catalogs have become the central bottleneck. In the cloud-native era, decentralized open data table formats are popular, especially in large-scale data platforms. In 2023, we can expect to see more enterprise data being stored in open table formats as Apache Iceberg, Hudi and Delta Lake are rapidly adopted. – Haoyuan Li, Founder and CEO, Alluxio

Rise of the SQL: The most important language to learn isn’t Python; it’s SQL. Databases of all sizes are on a tear. Many workloads are moving to the cloud (and powerful cloud data warehouses in particular), finally reaching a tipping point as a combination of features and price make it difficult for any company to hold out. And when data is available locally, new in-memory databases like DuckDB make it possible to use advanced, SQL-based query engines from a laptop, from a serverless function, even from the browser itself. These ubiquitous SQL-based tools are crowding out yesterday’s heavily scripted approaches to data manipulation because they empower users to work with data where it sits, rather than have to extract it, manipulate it, and re-insert it. – Prefect CEO and Founder, Jeremiah Lowin
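
The claim that SQL engines are crowding out scripted data manipulation can be illustrated with an in-memory database. The text names DuckDB; the sketch below uses sqlite3 from the standard library as a stand-in so it stays self-contained, and the sample data is invented.

```python
import sqlite3

# Data that might otherwise be grouped with a hand-written loop.
rows = [("widgets", 3), ("gadgets", 7), ("widgets", 5)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, qty INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# One declarative query replaces the extract-manipulate-reinsert cycle.
by_product = dict(conn.execute(
    "SELECT product, SUM(qty) FROM sales GROUP BY product"
).fetchall())
# by_product == {"gadgets": 7, "widgets": 8}
```

The same query text would run largely unchanged against a laptop-local DuckDB file or a cloud warehouse, which is the portability argument being made.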

Ushering an era of unified databases: 2023 is going to be the year of unified databases. Unified databases are designed to support large amounts of transactional and analytical workloads at the same time. This allows a simplified and flexible data architecture for companies to process massive workloads. In 2023, we will witness a convergence of specialized databases that will be built on four primary characteristics: distributed, shared-nothing architecture, cloud-native, multi-model and relational foundation. Organizations will need one platform to transact and reason with data in milliseconds in a hybrid, multi-cloud environment. 2022 saw many popular vendors move in this direction, and it will pick up a significant pace in the coming year. – Shireesh Thota, SVP of Engineering, SingleStore

The rise of hybrid “bring-your-own-database” (BYODB) cloud deployments: The benefits of moving certain data-driven projects to the cloud are undisputed — quicker deployment, reduced infrastructure and maintenance costs, built-in support and SLAs, and instant scalability when you need it. However, there will always be use case obligations that require keeping data on-premises, including performance, security, regulatory compliance, local development, and air-gapped hardware (to name a few). A more flexible solution is for modern data vendors to support hybrid “bring-your-own-database” (BYODB) cloud deployments in addition to the more common on-premises and fully-managed cloud service options. This new approach will catch on in the years ahead, allowing data to be kept in situ and unaltered but remotely connected to SaaS services that layer on top from nearby data centers. This provides all the benefits of the cloud, while still allowing for full authority and control over the company’s most precious resource… its data. – Ben Haynes, co-founder/CEO of Directus

PostgreSQL will continue to take over the world: PostgreSQL continues to grow as a project and as a community. It will eventually take over the position that MySQL holds on the DB-Engines ranking and become the most popular open source database, but this will take a while. There are lots of new projects being launched that base themselves on PostgreSQL, and then offer their spin on top. The reason for this is that it is easy to make PostgreSQL do what you want it to, and the license it is released under makes it possible to build businesses on this as well. For users, it is simple to implement and the community is a strong one. – Donnie Berkholz, SVP of Product Management at Percona

NoSQL had a good run over the past 10 years. By catering to developers, NoSQL vendors changed how database companies address their audiences. True to the name, NoSQL started by replacing SQL to appeal to developers. Transactions and many typical database features became collateral damage and got ditched along the way. Now, much matured, NoSQL vendors are looking to conquer the enterprise. However, they now realize that enterprise customers consider SQL table stakes. In a complete about-face, NoSQL databases are now adding SQL and other database features. 2023 will be a watershed moment for NoSQL vendors. They must prove they can move beyond their original niches. Otherwise they will not be able to challenge the dominance of established database players. – Mike Waas, CEO, Datometry

Hyperscale Will Become Mainstream: Data warehouse vendors will develop new ways to build and expand systems and services. Some leading-edge IT organizations are now working with data sets that comprise billions and trillions of records. In 2023, we could even see data sets of a quadrillion rows in data-intensive industries such as adtech, telecommunications, and geospatial. Hyperscale data sets will become more common as organizations leverage increasing data volumes in near real-time from operations, customers, and on-the-move devices and objects. – Chris Gladwin, CEO, and Co-founder of Ocient

Vector databases take hold to unleash the value of untapped unstructured data: As businesses embrace the AI era and attempt to make full use of its benefits in production, there is a significant spike in the volume of unstructured data, taking all sorts of forms, that needs to be made sense of. To cope with the challenges of extracting tangible value from unstructured data, vector databases – a new type of database management technology purpose-built for unstructured data processing – are on the rise and will take hold in the years to come. – Frank Liu, Director of Operations at Zilliz

Snowflake will become a niche technology as legacy providers costs rise: In 2023 Snowflake will become more of a niche technology. With Snowflake’s costs increasing on average 71% year over year, based on their earnings report, customers are getting to a point where they can no longer afford to continue that kind of exponential increase in costs. Because of this, customers are going to be much more cautious about what they put in there, and will put up walls of approvals and rules regarding who’s allowed to use and access what. With companies becoming more careful in this regard, they will be looking for open alternatives. The demand to make data accessible and to become data driven is still there, and data’s still growing very fast. But, customers need systems that are able to do that at scale, and customers need them to be cost efficient. The industry is moving towards those types of systems. – Tomer Shiran, CPO, and co-founder of Dremio

Why NoSQL is irrelevant, NewSQL is insufficient and Distributed SQL reigns supreme: 2023 is the year that companies facing new challenges in the cloud will realize they cannot solve their problems, meet customers’ expectations, or achieve goals entirely with NoSQL, SQL, or extensions of NewSQL databases. It’s time for them to adopt an architecture that provides the freedom and flexibility required to address their current and future needs. – Co-founder and CTO of Yugabyte Karthik Ranganathan

Data Lakes and Warehouses will converge as data infrastructure vendors of all sizes try to differentiate themselves through innovation. We are currently in a rare moment in the evolution of the data infrastructure industry as users are increasingly seeing competitive parity features emerge from major players like Snowflake, Databricks, etc. In turn, this will create an even stronger buyer’s market as vendors continue to innovate and differentiate themselves, resulting in greater industry integration, interoperability and a standardization of best practices. In 2023 and beyond, data warehouses, data lakes and other similar infrastructure technologies will see notable consolidation as buyers sift through vendors and features to find the most value for their data stack while removing redundancies and the need to build and manage their own bespoke platforms. – Sean Knapp, CEO and founder of

Companies should adopt a unified database: Companies need a unified database to support large amounts of transactional and analytical workloads at the same time. This allows a simplified and flexible data architecture for companies to process massive workloads. In 2023, successful organizations will use one platform to transact and reason with data in milliseconds in a hybrid, multi-cloud environment. Specialized databases need to converge to be built on four primary characteristics: distributed, shared-nothing architecture, cloud-native, multi-model and relational foundation. – Shireesh Thota, SVP of Engineering, SingleStore

Data Engineering

Python will become a key enabler of the democratization of data: Business people are no longer patiently waiting for data scientists and ML engineers to unlock the value of data; they want to extract insights from data themselves. In 2023, Python will be the primary medium for democratizing access to, and insights from, data for everyone across an organization. Python will become more enterprise-ready as the runtime infrastructure around Python grows simpler and more straightforward, and includes more security and governance. At the same time, productionizing Python results will become further streamlined, and that code will be wrapped in a meaningful user experience so it can be easily consumed and understood by non-IT users such as a company’s marketing team. We’ll see Python have the same, or more likely an even greater, transformational impact on democratizing data than the emergence of self-service business intelligence tools 15 to 20 years ago. – Torsten Grabs, Director of Product Management, Snowflake

The Data Engineer role will continue down the path of specialization. Over the past couple of decades, we’ve seen the role of the software engineer split into a variety of other roles such as infrastructure, backend, frontend, mobile, and even product engineer. The role of the data engineer will follow a similar path, and we will see continued growth not only in today’s “data engineer” (i.e. the full stack engineer), but expansion in the “analytics engineer” (i.e. the frontend engineer), as well as the “data platform engineer” (i.e. the infrastructure engineer). This specialization will enable organizations to tap into a greater body of talent and skills, as well as align jobs with what is of greatest interest to their developers. – Sean Knapp, CEO and founder of

The Beginning of the End for Data Engineering As We Know It: Maintaining legacy data pipelines is a costly affair for enterprises. Data Engineers are constantly moving data across different platforms. This is a wasteful, inefficient allocation of talent and resources. Recent developments in cloud architecture and Artificial Intelligence will fundamentally change how data is prepared, resulting in more efficient data engineering. These developments will bring about new products, architectures, and methodologies in the coming year that will improve the profiling, acquisition, and movement of data, all powered by AI. – James Li, CEO of CelerData

Believe it or not, technical skills are not the most important qualification when looking for an engineer. If someone is sufficiently curious, they can pick up any technical skill. Engineers looking to make themselves “recession-proof” will need to understand that companies are looking for someone with the potential and willingness to learn while also being able to communicate effectively. Hiring based on curiosity and potential opens the door to a more diverse, and therefore successful, workforce. – Manjot Singh, Field CTO for MariaDB

Data Contracts Become More Real & The Business Finally Gets Involved: Too many engineering teams are struggling to maintain data quality, access, and track usage patterns. While many businesses have in-function analytics professionals collaborating with core enterprise analytics teams, data and/or analytics engineering professionals are still navigating data domains and certifying data coming out of data build tools. In 2023, the continued proliferation of data is going to finally force the business to take more ownership of data, not just in use and interpretation, but also in the patterns of how it is managed and provisioned. Distributed stewardship will become a reality, and the best way to enable this will be with tools that are not built for engineers, but with data contracts that clearly map ownership, use, dependencies, etc. This will become more visible as features in data catalogs and/or a few emerging startups, since Confluence will not cut it at scale. – Nik Acheson, Field Chief Data Officer for Okera

Data reliability engineering: All too often, bad data is first discovered by stakeholders downstream in dashboards and reports instead of in the pipeline, or even earlier. Since data is rarely ever in its ideal, perfectly reliable state, data teams are hiring data reliability engineers to put the tooling (like data observability platforms and data testing) and processes (like CI/CD) in place to ensure that when issues happen, they’re quickly resolved and impact is conveyed to those who need to know before your CFO finds out. – Lior Gavish, co-founder and CTO of Monte Carlo
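
As a hypothetical sketch of the kind of checks a data reliability engineer might automate in a pipeline, the functions below test freshness and null rate so that bad data fails fast instead of surfacing in a dashboard. The thresholds and field names are invented; production teams would use an observability platform rather than hand-rolled checks.

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at, max_age=timedelta(hours=1)):
    """Pass if the table was loaded within the allowed window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_age

def check_null_rate(rows, field, max_rate=0.05):
    """Pass if at most `max_rate` of rows are missing `field`."""
    if not rows:
        return True
    nulls = sum(1 for r in rows if r.get(field) is None)
    return (nulls / len(rows)) <= max_rate

rows = [{"user_id": 1}, {"user_id": None}, {"user_id": 3}]
fresh = check_freshness(datetime.now(timezone.utc) - timedelta(minutes=5))
clean = check_null_rate(rows, "user_id")   # 1/3 nulls exceeds the 5% threshold
```

Wired into CI/CD, a failing check blocks the pipeline run and alerts the owning team, which is the "resolve before the CFO finds out" workflow described above.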

The role of AI/ML engineers will become mainstream: Since model deployment, scaling AI across the enterprise, reducing time to insight and reducing time to value will become the key success criteria, these roles will be critical to meeting them. Today, a lot of AI projects fail because they are not built to scale or integrate with business workflows. – Nicolas Sekkaki, Kyndryl’s GM of Applications, Data & AI

Data Governance

The share of organizations with a formal data governance team will increase 30%. Organizations must take the critical step of establishing a dedicated data governance team to develop standards and policies, implement enterprise data governance tools, and coordinate business-line efforts. In the next 12 months, 36% of organizations plan to bring on a head of data governance, and 32% of organizations will bring on a chief data officer. – Forrester 

Getting access to data does not necessarily mean being in a position to derive useful insight. In this data deluge, the successful organizations will be those who are able to crack the data governance dilemma by leveraging both self-executing policies, such as access control and obfuscation, and auditing capabilities, with a view to reducing time to data. They will discard meaningless pre-approval workflows and federate data governance by making data owners the key players: data owners will be both domain experts and data stewards. – Sophie Stalla-Bourdillon, Senior Privacy Counsel and Legal Engineer, Immuta
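
A "self-executing policy" of the kind mentioned here can be sketched as masking applied at read time based on the requester's role, replacing a manual pre-approval workflow. The role names and masking rule below are invented for illustration and do not describe any specific product.

```python
def apply_policy(record, role):
    """Return a copy of `record`, obfuscated unless the role permits full access."""
    if role == "data_owner":
        return dict(record)                           # full, unmasked access
    masked = dict(record)
    email = masked.get("email", "")
    if "@" in email:
        local, domain = email.split("@", 1)
        masked["email"] = local[0] + "***@" + domain  # keep analytic utility
    return masked

row = {"id": 7, "email": "jane.doe@example.com"}
analyst_view = apply_policy(row, "analyst")     # email obfuscated
owner_view = apply_policy(row, "data_owner")    # email intact
```

Because the policy executes on every read, access decisions are enforced and auditable automatically rather than gated by human approvals.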

New data sovereignty laws will spur businesses to make data more visible and interoperable: We expect to see businesses take a more proactive role in creating their own data governance policies amid the current wave of regulatory action. The current global patchwork of data sovereignty and privacy laws has made it more complicated than ever for businesses to create consistent policies on data sharing, integration and compliance. This will continue to have a significant impact on organizations’ ability to maximize the use of data across their IT infrastructure, unless they put together clear plans for data integration and governance. In 2023, the passing of more data sovereignty and sharing laws will spur businesses to invest in getting visibility into their data and creating clear plans for sharing and integration across their IT landscape. – Danny Sandwell, Senior Solutions Strategist, Quest

Automation will offer data teams a way to balance defensive/offensive data operations: With recent high-profile cases of organizations being fined for data breaches, there was a risk that data teams may have become “spooked” into funneling efforts and funds into data regulation compliance and away from data innovation. While that may have happened in previous years, the balancing act between data compliance and data innovation could become a lot easier in 2023 as data governance automation begins to take root. Data access and security controls have been automated for a while, but we expect data governance automation to blend existing automated operations with data governance policy-making to free up time for data teams to focus on business innovation without leaving the organization defenseless. – John Wills, Field CTO at Alation

Ungoverned data will have to fall in line. Because of the rapid proliferation of data, many companies still don’t have a good grasp on who has access to what data and where their data lives. Data governance will remain a top priority for IT leaders, not only for compliance reasons, but also to ensure the data in your company is usable. It’s harder to turn data into value if you struggle to locate, access, and securely share it with the right people. – Chad Verbowski, SVP of engineering at Confluent

Businesses to focus efforts on AI regulation and governance: As the United States – and the rest of the world – respond to the increasing demand for trust in AI, technology leaders will follow in pursuit. While AI and automation can provide businesses with many benefits, they also come with a lot of risks. It is critical to embrace AI to stay competitive, but it is also imperative to set standards within your business. To stand out in the new digital landscape in the coming year, we will see businesses not only continue to invest in AI, but also invest their resources in getting ready for new regulations. – Anand Rao, Global AI Lead; US Innovation Lead, Emerging Technology Group, PwC

Data democratization and data governance balancing act takes center stage: The trend of data democratization will continue to grow as organizations increasingly seek greater accessibility. However, data democratization comes with risks, and today’s organizations are faced with a challenge — how do you drive governance around data democratization? With the continued rise of data democratization, we’ll see a shift from a storage-centric world to an analytics-centric world, one where data is treated as a first class product to be consumed by the rest of the organization. The democratization of Data Products enables organizations to have greater access to their data across the board, but with that, there also needs to be fine-grained access controls. To support that, similar to how the data science occupation was created about 10 years ago to support big data analytics projects, in 2023 we’ll see the Data Product Manager role emerge to provide the necessary governance to facilitate who has access to what. – Justin Borgman, Starburst CEO

Data Integration, Data Quality

Need real-time data monitoring to ensure AI is fed by good quality data: For companies looking to up-level their data and AI strategies in 2023, data quality should be a top priority. Poor quality or biased data results in erroneous or biased AI models. In order to trust and ensure the quality of the data being fed into AI and ML models, enterprise leaders need to implement real-time monitoring of data pipelines via data observability tools to catch issues and pipeline breakdowns before they cause serious problems. – Jay Militscher, Head of Data Office at Collibra

This is the year that the data quality issue in healthcare will really come to the forefront. This is because the federal government is looking at provider data quality more seriously (through the CMS National Health Directory proposal) and it is becoming more apparent that machine learning-based interventions in healthcare do not actually work. There is a big push to free data in healthcare (APIs to share data from health plans, price transparency, etc.) that will be realized next year, and the assumption is that all of this will be great. However, if you’re sharing “dirty data,” the system won’t get the benefits that everyone is predicting. – Amit Garg, CEO and co-founder of HiLabs

Junky or dirty data is data that is incorrect, incomplete, inconsistent, outdated, or duplicative (or all of the above), and it may be killing your business. It’s a common problem often heightened during cyclical periods when you need your customer data to work for you most — i.e., for holiday shopping and travel. Avoid confusion and frustration, and ease your customers’ shopping and travel experience by mastering your customer data. Customer mastering creates a unified, accurate and enriched view of customer data across systems and sources, and a unique identifier enabling consistent tracking of the customer. Mastering your customer data at scale gives sales, marketing and customer experience teams a powerful way to accelerate data-driven selling. It also enables customer insights for competitive advantage. – Anthony Deighton, Chief Product Officer at Tamr
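
The shape of customer mastering can be sketched as merging duplicate records under a unique identifier derived from a normalized key. Real mastering (Tamr's included) relies on fuzzy matching and survivorship rules far beyond this; the records and the naive "first non-empty value wins" rule below are purely illustrative.

```python
def master_customers(records):
    """Consolidate records sharing a normalized email into one golden record."""
    mastered = {}
    for rec in records:
        key = rec["email"].strip().lower()       # the unique identifier
        merged = mastered.setdefault(key, {"email": key})
        for field, value in rec.items():
            if field != "email" and value:
                merged.setdefault(field, value)  # keep first non-empty value
    return list(mastered.values())

records = [
    {"email": "Ann@Shop.com", "name": "Ann"},
    {"email": "ann@shop.com ", "phone": "555-0100"},
]
golden = master_customers(records)
# Two source rows collapse into one unified customer record.
```

The same customer seen twice (with casing and whitespace differences) yields a single record carrying both the name and the phone number.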

Data contracts: Designed to prevent data quality issues that occur upstream when data-generating services unexpectedly change, data contracts are very much in vogue. Why? Because of changes made by software engineers whose updates unknowingly create ramifications for the downstream data pipeline, and because of the rise of data modeling, which gives data engineers the option to deliver data into the warehouse pre-modeled. 2023 will see broader data contract adoption as practitioners attempt to apply these frameworks. – Lior Gavish, Co-founder and CTO of Monte Carlo
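
A minimal sketch of a data contract: the producing service declares a schema, and a check rejects payloads that silently drift from it before they reach the pipeline. The field names and the dict-based contract format are invented for illustration; real implementations typically use schema registries or typed serialization formats.

```python
# Hypothetical contract the producing service has agreed to.
CONTRACT = {"order_id": int, "amount": float, "currency": str}

def validate(payload, contract=CONTRACT):
    """Return a list of contract violations (empty means the payload passes)."""
    errors = []
    for field, expected in contract.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

ok = validate({"order_id": 1, "amount": 9.5, "currency": "EUR"})   # passes
drifted = validate({"order_id": 1, "amount": "9.5"})               # two violations
```

An upstream engineer who changes `amount` to a string, or drops `currency`, gets an immediate, attributable failure instead of a broken dashboard days later.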

Developer platforms will overtake loosely connected tools as teams tire of integrations. While only a handful of players continue to dominate the data infrastructure stack (Snowflake, Databricks, and the big 3 clouds), businesses have been using a hodgepodge of tools upstack for tasks like data ingestion, transformation, orchestration, management, and observability. In 2023, businesses will reach their breaking point as they tire of assembling and managing these upstack tools and don’t see enough returns on those investments. As they become increasingly frustrated by the inherent inefficiency of the fragmentary model, businesses will start to consolidate their vendor tools in order to prioritize developer team productivity and ease of maintenance. – Sean Knapp, CEO and founder of

As data integration technology becomes more accessible, the focus will shift to operationalizing those technologies. The ability to scale simple-to-build pipelines and comply with enterprise governance requirements will be seen as more important than simply being able to connect to lots of environments. There will be a greater focus on efficiency and cost-benefit for users of cloud solutions. Companies have spent years moving data into cloud-hosted data warehouses and data lakes and, especially with uncertainty in the economy, leadership will be expected to justify the existing spend and control it moving forward. Technology companies that can provide complete end-to-end solutions for businesses undergoing digital transformation will have a significant advantage over those that don’t. As businesses look to see returns on their digital transformation investments, time-to-value and battle-tested solutions will be highly sought-after. – Dima Spivak, COO Products at StreamSets

AI and ML will enhance data quality: Artificial intelligence and machine learning have been adopted at unprecedented rates. In today’s multi-cloud environments, enterprises are sometimes left to their own devices to manage their data. AI and machine learning can help enterprises manage their data sources while also quickly scaling a multi-cloud environment. Enterprises are finally understanding that one size does not fit all and are turning to improved tools to get more value from their data. We’re seeing a renewed cloud conversation that encompasses cloud, data center, edge, and how data plays across those environments. In 2023, enterprises will treat data as a first-class consideration and not an afterthought. – Matt Wallace, CTO at Faction
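
One concrete flavor of ML-assisted data quality is treating quality metrics as time series and letting a statistical model flag deviations automatically. A deliberately simple sketch, with invented numbers and an illustrative threshold, that z-score-tests a daily null rate against its recent baseline:

```python
# Toy data-quality monitor: flag a day whose null rate deviates sharply
# from the recent baseline (z-score test). Thresholds are illustrative.
import statistics

def null_rate_anomaly(history, today, z_threshold=3.0):
    """Return True if today's null rate is anomalous vs. history."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return today != mean
    z = abs(today - mean) / stdev
    return z > z_threshold

history = [0.010, 0.012, 0.011, 0.009, 0.013]  # recent daily null rates
print(null_rate_anomaly(history, 0.011))  # a normal day
print(null_rate_anomaly(history, 0.25))   # schema break upstream?
```

Production systems generalize this idea across many metrics (freshness, volume, distribution drift) and learn the baselines rather than hardcoding them, but the principle is the same: the machine watches the metrics so people don't have to.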

Data Mesh, Data Fabric

Data Mesh Takes a Backseat But Data Products Will Push on Ahead in 2023: With the economy in a slowdown, we can expect data mesh (frameworks that bring scale to enterprise-wide data adoption) to take a backseat. 2023 will be the year of data products before the industry moves towards data mesh in 2024 and beyond. – Founder and CEO of Nexla, Saket Saurabh

Accelerated adoption of data fabric and data mesh. Over the past two decades, data management has gone through cycles of centralization vs. decentralization, including databases, data warehouses, cloud data stores, data lakes, etc. While the debate over which approach is best has its own proponents and opponents, the last few years have proven that data is more distributed than centralized for most organizations. While there are numerous options for deploying enterprise data architecture, 2022 saw accelerated adoption of two architectural approaches to better manage and access distributed data: data fabric and data mesh. While there is an inherent difference between the two (data fabric is a composable stack of data management technologies, while data mesh is a process orientation for distributed groups of teams to manage enterprise data as they see fit), both are critical to enterprises that want to manage their data better. Easy access to data, and ensuring it’s governed and secure, is important to every data stakeholder, from data scientists all the way to executives. After all, it is critical for dashboarding and reporting, advanced analytics, machine learning, and AI projects. Both data fabric and data mesh can play critical roles in enterprise-wide data access, integration, management and delivery when constructed properly with the right data infrastructure in place. So in 2023, expect a rapid increase in adoption of both architectural approaches within mid-to-large size enterprises. – Angel Viña, CEO and founder of Denodo

X fabric connects data governance: Investment in data and analytics has dramatically accelerated thanks to the pandemic and will continue to do so for the foreseeable future, with 93% of companies indicating they plan to continue to increase budgets in these areas. But rapidly shifting rules and regulations around privacy, as well as the distribution, diversity and dynamics of data, are holding back organizations’ ability to really squeeze the best competitive edge out of it. This becomes especially challenging in a distributed and fragmented world, as data governance becomes even more complex. Improving access, real-time movement and advanced transformation of data between sources and systems across the enterprise is crucial to organizations realizing the full power of data and will be a key trend for 2023. This is why an increasing number of businesses are turning to data control plane architecture, an “X-fabric” not just for your data, but also for your applications, BI dashboards and algorithms, enabled by catalogs and cloud data integration solutions. Such architectures are vital for anyone tasked with increasing agility, flexibility, compliance and sustainability while propelling innovation. In short, they are a critical component in the modern data environment and for any organization that wants to act with certainty. – Dan Sommer, former Gartner analyst and Qlik’s Global Market Intelligence Lead

One of the main challenges data executives will face in 2023 is deciding how they will leverage data to gain a competitive advantage. The ‘cloud wars’ have given way to the ‘data wars.’ To stay ahead of competitors, companies will need to improve the success rate of their AI and ML projects as disciplines like MLOps and related toolsets are helping AI/ML to have more of an impact beyond the data lab. To improve operations, organizations must go beyond technology and address the structural, cultural, and people-based aspects of data management—embracing disciplines like ‘data mesh’ and DataOps. – Myles Gilsenan, Vice President of Data, Analytics and AI at Apps Associates

Organizations Must Focus on Getting the Data Fabric in Place or Risk AI Project Failure: As more enterprises look to implement AI projects in 2023 to increase productivity, gain better insights and have the ability to make more accurate predictions regarding strategic business decisions, the challenge will be for traditional enterprises to establish a robust data framework that will allow their organizations to leverage data effectively for AI purposes. To succeed, organizations must have the correct data infrastructure architecture (IA) in place. The issue is that most companies do not have a sound data infrastructure and will struggle to maximize the value of their data unless their data fabric is in place. Additionally, the data is often unorganized, uncleaned, and unanalyzed and could be sitting in several systems, from ERP to CRM. In 2023, organizations must utilize data in the same way that oil firms use crude oil and farmers use their land and crops to generate profit: identify the sources, plant the “seeds,” extract the impurities, refine, store, and pipe them, build the infrastructure for distribution, nurture, cure, safeguard, and yield it. AI solution providers can work with enterprises on these obstacles and implement frameworks that will strengthen the infrastructure architecture (IA) so that enterprises can more successfully implement AI. The first order of business should be how to collect data, which includes widening the data by adding external features, both structured and unstructured, with more focus on the quality and availability of the data required for developing an AI solution versus just volume. When finding answers to “what will happen,” enterprises need various data sources.
Once all the data is collected, it can be unified, processed, and ultimately presented as the AI output to iterate on predictions and other information enterprises need. Enterprises should then focus on all three forms of ROI (strategic, capability, and financial) rather than financial ROI alone. – Anand Mahukar, CEO, Findability

Data mesh initiatives gain momentum, but misinformation threatens to slow adoption: Companies that rely on data management tools to make decisions are 58% more likely to beat their revenue goals than non-data-driven companies, according to a study from Collibra and Forrester Consulting. And data-driven organizations are 162% more likely to significantly surpass revenue goals than those that are not data-driven. The enterprise has reached a turning point in the evolution of data management. We are increasingly seeing companies pivot away from traditional, monolithic approaches, and instead embrace new strategies and solutions in an effort to become more data-driven. One growing trend is the adoption of a data mesh architecture, which many organizations view as an answer to their challenges around data ownership, quality, governance, and access. A data mesh offers a distributed model where domain experts have an ownership role over their data products. In 2023, we anticipate even greater pressure on organizations to move faster and build resilient, agile data architectures that will push data teams towards data mesh implementations. However, despite the growing enthusiasm around data mesh, we do expect roadblocks due to misinformation. In order to move forward, misinformation needs to be eradicated so that data mesh can be successfully adopted at scale. For example, despite being marketed as such, you cannot buy a data mesh — it is not a technology. There is also still much discussion and confusion about how to prevent data meshes from exacerbating data silos, and whether or not data mesh and data fabric are actually the same thing. To overcome these challenges and move beyond any debates or uncertainties, companies must take responsibility for educating themselves to strengthen their understanding of what a data mesh is and how it can optimize their data management strategy. – Jens Graupmann, SVP of product & innovation, Exasol

Data fabrics will become more popular for businesses at the forefront of data management: As organizations work to break down data silos, many highly data-literate companies are turning to data fabrics as a way to utilize the data they have as optimally as possible. As these systems gain traction, we’re likely to see more data marketplaces within organizations, where users of all skill levels from departments across the organization can find and use data in a self-service, consumable manner. – John Wills, Field CTO at Alation

Centralized data infrastructures will shift to data meshes: Data mesh’s popularity will keep growing as data teams search to reduce bottlenecks and improve data usage. To make data more accessible and available, organizations will adopt key concepts popularized by Zhamak Dehghani, such as domain ownership or data as a product. – Andy Petrella, founder and CPO of Kensu

2023 will be the year of data products before we move fully towards data mesh. With the economy in a slowdown, we can expect complete adoption of data mesh to take a backseat. We will also see augmented data management rise in importance as AI becomes more integrated with data quality, metadata management, and master data management. Specialists will be able to focus on more high-value tasks as manual data management tasks are lessened by ML and AI developments. New approaches will make it possible to converge multiple tools such as integration, monitoring, and transformation into one. With data products, augmented data management, and convergence changing the way we handle data, 2023 will bring new solutions for handling data problems. – Saket Saurabh, CEO of Nexla

Businesses will take incremental steps towards Data Mesh and Fabric adoption. While the tech industry as a whole may still be debating what defines each of these (as well as what differentiates them), there is undeniable interest among organizations to gain the benefits of these specific architectures, namely speed and agility at scale. In 2023 and the years ahead, while most data teams will not have the resources to dive headfirst into working with a new mesh or fabric architecture, we will see them incrementally work their way towards the adoption of these solutions, often first by defining new best practices and strategies that support these future cutting-edge architectures. – Sean Knapp, CEO and founder of

Smart data fabrics move to the edge: Edge computing, the analysis of data and the development of solutions at the location where the data is generated, can significantly reduce latency and improve both customer and end-user experiences. While the edge computing market is forecast to grow 38.9% each year and reach $155.90 billion by 2030, progress is beginning to stall as implementation and enablement of edge technologies is proving more costly than beneficial. The “edge” is never a defined thing, but instead is a jagged edge that is constantly in a state of flux – a moving target that businesses are always trying to hit the mark on. As moving data is one of the biggest IT expenses for companies, ensuring you have the right data architecture to support the edge is critical for success and for budgeting. I anticipate smart data fabrics will become inherent to edge implementations in 2023, as this reference architectural approach provides organizations with a consistent, accurate, unified view of their data assets and keeps pace with changing demand at the edge. – Scott Gnau, Head of Data Platforms, InterSystems

Data Science

Data science democratizes to address skills shortage: The demand for data scientists has grown but organizations are struggling with finding the best candidates, leading to difficulty in putting models into production and operationalizing AI. At the same time, citizen data scientists are emerging as organizations have recognized that data science is no longer a skill for only the technical few and that everyone should be enabled to work with data comfortably. In 2023, look for organizations to consolidate diverse AI and analytics tools around modern, open, multi-language tools that will increase data science productivity, empower end users to do basic analytics tasks, and allow data scientists to focus on core tasks. By democratizing analytics, more people can join the field. – Marinela Profi, Data Scientist and Product Lead for Analytics and ModelOps, SAS

New vulnerability in the data science ranks. For the past several years, there has been a severe shortage of data scientists, and companies lucky enough to have them treated them like gold. But as trends continue regarding the difficulty of demonstrating ROI on AI efforts, and as the economy softens, enterprises are taking a harder line on results. It’s common today for only 1 in 10 models developed to ever see the light of day. Data science teams that aren’t able to get models into production at a faster pace will face pressure. Those jobs may not be secure forever. – Anupam Datta, co-founder, president and chief scientist at TruEra 

The world reaches the era of “peak data scientist.” The shortfall of data scientists and machine learning engineers has always been a bottleneck in companies realizing value from AI. Two things have happened as a result: (1) more people have pursued data science degrees and accreditation, increasing the number of data scientists; and (2) vendors have come up with novel ways to minimize the involvement of data scientists in the AI production roll out. The coincident interference of these two waves yields “peak data scientist,” because with the advent of foundational models, companies can build their own applications on top of these models rather than requiring every company to train its own models from scratch. Less bespoke model training requires fewer data scientists and MLEs at the same time that more are graduating. In 2023, expect the market to react accordingly, resulting in data science oversaturation. – Ryan Welsh, Founder and CEO of Kyndi

The role of data scientists will evolve in 2023, reflecting the rising popularity and power of AutoML and AI capabilities that can handle many of the routine tasks usually handled by these experts. With everyday predictive needs increasingly addressed by automated platforms, business leaders will dedicate data scientists’ scarce time and costly resources more intentionally toward projects requiring hand-crafted, one-off models or specialized domain expertise. To preserve flexibility and reduce risk in the coming year’s turbulent conditions, more businesses will turn to specialized vendors and SaaS for routine predictive modeling projects instead of continuing to build expensive data science teams. – Zohar Bronfman, co-founder and CEO of Pecan AI 

The generalist data scientist becomes specialized: Recent focus has been on training data scientists as generalists in coding, algorithm development, and open source, across widely dispersed (and sometimes low-business-value) topics. However, we are seeing that when faced with a real-world business problem, data scientists lack industry-specific knowledge – an understanding of the dynamics, trends, and challenges of a specific industry, and the ultimate goals that everyone is looking to achieve in that niche market – stalling projects at the onset. In 2023, data scientists who have industry-specific knowledge will be the most successful in meeting business demand and expectations, and we will see data scientists seek out specialized training. – Marinela Profi, Data Scientist and Product Lead for Analytics and ModelOps, SAS

Data scientists will develop a greater appetite for pre-built industry-specific and domain-specific ML models: In 2023, we’ll see an increased number of pre-built machine learning models becoming available to data scientists. These models encapsulate domain expertise within an initial ML model, which then speeds up time-to-value and time-to-market for data scientists and their organizations. For instance, these pre-built ML models help to remove or reduce the amount of time that data scientists have to spend on retraining and fine-tuning models. Take a look at the work that the Hugging Face AI community is already doing in driving a marketplace for ready-to-use ML models. What I expect to see next year and beyond is an increase in industry-specific and domain-specific pre-built ML models, allowing data scientists to work on more targeted problems using a well-defined set of underlying data and without having to spend time on becoming a subject matter expert in a field that’s non-core to their organization. – Torsten Grabs, Director of Product Management, Snowflake

Pipelines Will Get More Sophisticated: A data pipeline is how data gets from its original source into the data warehouse. With so many new data types—and data pouring in continuously—these pipelines are becoming not only more essential, but potentially more complex. In 2023, users should expect data warehouse vendors to offer new and better ways to extract, transform, load, model, test, and deploy data. And vendors will do so with a focus on integration and ease of use. – Chris Gladwin, CEO and Co-founder of Ocient 
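
At its simplest, such a pipeline is a chain of extract, transform, and load stages. A bare-bones sketch with hardcoded, illustrative data, structured so that each stage can be tested and swapped independently as the pipeline grows more sophisticated:

```python
# Bare-bones extract-transform-load sketch (names and data are invented).
def extract():
    """Pull raw rows from a source system (hardcoded stand-in here)."""
    return [
        {"sku": "A1", "price": "19.99", "qty": "2"},
        {"sku": "B2", "price": "5.00", "qty": "0"},  # empty order, drop it
    ]

def transform(rows):
    """Cast types, compute revenue, filter out empty orders."""
    out = []
    for row in rows:
        qty = int(row["qty"])
        if qty == 0:
            continue
        out.append({"sku": row["sku"],
                    "revenue": float(row["price"]) * qty})
    return out

def load(rows, warehouse):
    """Append cleaned rows to the warehouse (a list standing in for a table)."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # one cleaned row, revenue 39.98
```

Vendor offerings layer the promised sophistication on top of exactly this shape: incremental extraction, declarative transformation, schema tests between stages, and deployment tooling around the whole chain.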

In 2023, business leaders will evaluate potential data science projects much more rigorously than in the past. These projects often fail to generate real impact due to poor alignment with business needs or because they never make it into production. With the expense and time commitment involved in data science, leaders will scrutinize proposed efforts more carefully and investigate the right way to pursue them to ensure that business-improvement actions could be taken in the near term based on the output of the models — or scuttle them before resources are wasted. – Zohar Bronfman, co-founder and CEO of Pecan AI 

Data science teams will need to drive ROI to dodge layoffs: Today, many data scientists and data science teams are too disconnected from core business outcomes, focused on interesting experiments instead of programs that deliver measurable revenue. Even with the relative scarcity of talent, the economic need to show results will evolve roles to be more revenue-based. As it relates to the marketing function, the successful data scientist will be able to collaborate with marketing counterparts to productionize a churn model to not just predict churn but actually prevent it. This requires a degree of business acumen – but it also requires the right tooling. MarTech stacks that enable seamless connectivity from data and data science environments will be critical. This prediction will ultimately lead to changes in the data scientist role – while there still will be core data scientists, they’ll be focused on solving well defined problems with understood value and incrementality. For those data scientists that don’t make the cut, marketing teams will need tools in place to allow them to be self-sufficient in the absence of this resource. The wide-reaching data science roles that we see today will move closer to business functions to focus on marketing insights or product analytics in 2023. – Simon Data CEO and co-founder Jason Davis

Business leaders will be demanding increased AI/ML Engineering Productivity and better ROI of AI: The war on data science talent has led to a number of companies acquiring AI/ML talent at a premium in the market during the ‘Great Resignation’. As the economy slows down, business leaders are looking for proof points for better AI/ML engineering productivity from their data scientists. This will also translate into a more rigorous approach to measuring and enhancing Return on Investment (ROI) of their AI initiatives. – Anand Rao, Global AI Lead; US Innovation Lead, Emerging Technology Group, PwC

In the past few years, businesses have come to realize just how much data is available within their organization. The challenge now becomes what to do with this sheer volume of data and how to extract insights from it in a reliable and near real-time fashion. In 2023, Data Science will move from a job role all by itself, to a skill set performed by many in wider job functions in an effort to capitalize on the valuable insights a company’s data can provide. Additionally, AI practitioners and data scientists will continue to see a growing need for more compute power at the desktop as datasets get even larger, and with the right technology platforms in place, the possibilities are almost endless for corporations of any size. – Mike Leach, senior manager for client AI and workstation solutions at Lenovo

Within the field of data science in 2023, things that will stand out include autonomous systems in software platforms and physical self-management with self-learning mechanics. It’s likely that autonomous platforms will be able to adapt their algorithms without the usual software upgrade. What’s more, autonomous systems will be used to solve problems facing model-based and conventional automation systems that are struggling to scale as per business needs. – Bal Heroor, CEO of Mactores

Deep Learning

This year, Generative AI became the source of tremendous hype. It marks the entrance of AI into the creative realm, for example, with Jasper creating original marketing content or Dall-E 2 instantaneously producing art based on user-provided text. But I believe the real excitement should be focused on Generative AI’s ability to democratize existing AI use cases for traditional companies—and this could have a major impact as early as 2023. Today, traditional companies are limited in their AI adoption because they have to build algorithms from scratch and often lack both the data and skills to do so. With Generative AI, companies will have access to pre-trained “all-purpose” algorithms that require little data and few data scientists to fine-tune, and are easy for employees to operate. As a result, we could see a major wave of AI adoption in different sectors. For instance, Generative AI could enable the large-scale use of AI in healthcare to produce more accurate medical diagnoses or in manufacturing through the introduction of better predictive maintenance at lower cost. – François Candelon, Global Director, BCG Henderson Institute

End-to-End Deep Learning Needs Data-Driven Solutions: In 2023, more companies will use data-driven solutions to build end-to-end deep learning AI models. Approaching the models with data is the most productive and efficient way to advance the existing technology and ramp up the pace of innovation. Each massive step in AI thus far has been faster than the previous, and that’s due to other companies bringing awareness to data-driven strategies for building AI models and ramping up the speed of the entire industry along with it. – Scott Stephenson, CEO and co-founder of Deepgram 

Deep learning is here: The next step for artificial intelligence in 2023 is deep learning. While AI so far has mostly been a mix of supervised machine learning and data analytics, the rise of deep learning will usher in a new era where computers are able to learn without supervision. Advancements in deep learning will lead to innovations in robotics, generative AI, natural language processing and speech recognition, and scientific breakthroughs in health, sustainability, and more. Of course, as with any AI model, the key for organizations is to ensure the results are accurate and comply with emerging regulations, which means keeping a human element in place for routine monitoring and for maintaining trust in the accuracy of the ML models. – Rosemary Francis, Chief Scientist, Altair

Training and Deploying Foundation Models Supporting Billions of Parameters Becomes Less Costly and Complex: In 2022, we saw an increase in the accessibility of models with a number of parameters in the order of billions to hundreds of billions. These models – known as foundation models – exhibit strong generalization and the ability to solve tasks for which they were not trained. Currently, training foundation models require months of work and budgets that can stretch to millions of dollars for a single training run. In 2023, I anticipate a significant reduction in both the effort and cost required to train foundation models as well as deploy them in a number of industries, like pharmaceutical research and fintech. – Luca Antiga, CTO of Lightning AI

The singularity in our sights?: A research paper by Jiaxin Huang et al. was published this past October with the attention-grabbing title “Large Language Models Can Self-Improve.” While not yet the singularity, the researchers coaxed a large language model into generating questions from text snippets, answering the self-posed questions through “chain of thought prompting,” and then learning from those answers in order to improve the abilities of the network on a variety of tasks. These bootstrapping approaches have historically had a pretty tight bound on improvement – eventually models start teaching themselves the wrong thing and go off the rails – but the promise of improved performance without laborious annotation efforts is a siren song to AI practitioners. We predict that while approaches like this won’t drive us into a singularity moment, they will be the hot research topic of 2023 and, by the end of the year, a standard technique in state-of-the-art natural language processing. – Paul Barba, Chief Scientist at Lexalytics, an InMoment Company
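
The bootstrapping idea generalizes beyond language models: pseudo-label your own unlabeled data, keep only the confident labels, retrain, repeat. A toy illustration using a nearest-centroid classifier as a stand-in for the model (the data, the confidence gate, and the 1-D feature are all invented for illustration):

```python
# Toy self-training loop: a model labels its own unlabeled data, keeps
# only confident guesses, and retrains on them. The confidence gate is
# the guard against "teaching itself the wrong thing."
def centroid(points):
    return sum(points) / len(points)

def train(labeled):
    """labeled: list of (x, label) pairs with labels 'lo'/'hi' on a 1-D feature."""
    lo = centroid([x for x, y in labeled if y == "lo"])
    hi = centroid([x for x, y in labeled if y == "hi"])
    return {"lo": lo, "hi": hi}

def predict(model, x):
    """Return (label, confidence); confidence is the distance margin."""
    d = {label: abs(x - c) for label, c in model.items()}
    best = min(d, key=d.get)
    confidence = abs(d["lo"] - d["hi"])
    return best, confidence

labeled = [(0.0, "lo"), (1.0, "lo"), (9.0, "hi"), (10.0, "hi")]
unlabeled = [0.5, 9.5, 5.1]            # 5.1 sits near the boundary
model = train(labeled)

# One self-training round: pseudo-label, keep confident examples, retrain.
for x in unlabeled:
    label, conf = predict(model, x)
    if conf > 2.0:                     # discard low-confidence pseudo-labels
        labeled.append((x, label))
model = train(labeled)
print(predict(model, 2.0)[0])
```

The ambiguous point (5.1) is correctly excluded by the gate; dropping the gate is precisely how such loops "go off the rails," since a wrong pseudo-label gets amplified on the next round.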

DALL-E and Generative AI in 2023: Generative models, like the one used to create DALL-E, analyze data and interpolate to create something brand new. But they’re not just good at creating weird art — generative models can be used to discover new materials for battery design, carbon capture, and loads of other innovations. Generative models will reach new heights in 2023 as solutions like DALL-E are adapted to the younger generation’s desire for video over audio and pictures. We can also expect to see generative models continue to infiltrate the healthcare space for vaccine modeling, drug discovery, and even personalized medicine supported by training data generated from electronic medical records. – Michael Krause, the AI Solutions Director at Beyond Limits

Environmental Impact of Generative Models: Generative models are producing extremely impressive results, but it’s not clear what impact they have on an actual business. What is clear is the carbon-emission impact of training these massive models. The compute requirements are insane. This raises the question: are the outcomes worth the environmental cost? – Gideon Mendels, CEO and co-founder of MLOps platform Comet
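
The question can at least be framed with arithmetic. A back-of-the-envelope sketch, where every input (GPU count, wattage, training time, datacenter overhead, grid carbon intensity) is an assumed illustrative value, not a measurement of any particular model:

```python
# Back-of-the-envelope training-emissions estimate. All numbers are
# illustrative assumptions, not measurements of any real training run.
def training_emissions_kg(gpus, watts_per_gpu, hours, pue, kg_co2_per_kwh):
    """Energy (kWh) scaled by datacenter overhead (PUE) and grid intensity."""
    energy_kwh = gpus * watts_per_gpu * hours / 1000.0
    return energy_kwh * pue * kg_co2_per_kwh

# Assume 512 GPUs at 400 W for 30 days, PUE 1.1, 0.4 kg CO2 per kWh:
kg = training_emissions_kg(512, 400, 30 * 24, 1.1, 0.4)
print(round(kg / 1000.0, 1), "tonnes of CO2")
```

Even this crude model makes the trade-off discussable: halving training time or moving to a lower-carbon grid shows up linearly in the estimate, which is why reporting these inputs alongside model results is increasingly common practice.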

Investment in Large Language Models Will Skyrocket in 2023: Large Language Models (LLMs) like GPT-3 and BERT exhibit the ability to perform complex tasks by crafting input text in a way that triggers the model to solve a specific problem, such as a mathematical quiz. LLMs are considered “large” because of the inability to load them on an individual device and the attendant difficulties encountered when training them. Next year, I expect to see a significant increase in the number of startups and established businesses seeking funding or redirecting extant funds to budgets specifically allocated toward creating and training an individual LLM. – Luca Antiga, CTO of Lightning AI

The rise of multi-modal learning: The wave of image-generating networks like Stable Diffusion and DALL-E demonstrate the power of AI approaches that understand multiple forms of data – in this case, image in order to generate a picture, and text in order to take in descriptions from a human. While multimodal learning has always been a significant research area, it’s been hard to translate into the business world where each data source is difficult to interact with in its own way. Still, as businesses continue to grow more sophisticated in their use of data, multimodal learning jumps out as an extremely powerful opportunity in 2023. Systems that can marry the broad knowledge conveyed in text, image and video with sophisticated modeling of financial and other numeric series will be the next stage in many companies’ data science initiatives. – Paul Barba, Chief Scientist at Lexalytics, an InMoment Company

The Next Version of Tech is Generative: In 2023, one of the biggest focuses will be on generative technologies. As the technology grows more sophisticated it will continue to be disruptive, not just for images and content development, but for other areas like speech recognition and banking. I predict that generative technology will soon act as an exoskeleton for humans—it will support the work we are doing and ultimately drive a more efficient and creative future. – Scott Stephenson, CEO and co-founder of Deepgram

The rise of generative AI startups: Generative artificial intelligence exploded in 2022. In this next year, we will see text processing and visual art using generative AI continue to improve. Entrepreneurs will look to get in on the action (and the $$) and many startups will emerge that create simple experiences for non-technical people based on generative AI. This could range from advertising copy, SQL queries, documentation copy, blog title ideas, code comments, and instructional material to even deepfake video content. Increasingly, the creative output from these models is indistinguishable from – and in many cases superior to – human output. – Christian Buckner, SVP, Data Analytics and IoT, Altair

Big models for AI are driving innovations in specialized infrastructure and solutions: Over the past few years, AI and deep learning have become mainstream and reached the same maturity level as data analytics. Big models, from OpenAI’s DALL-E 2 image generation model to Google’s LaMDA conversation agent, are expected to dominate the landscape in 2023. Billions of files will be used to train big models for longer periods of time, requiring more specialized infrastructure and solutions. Next-generation AI infrastructure will be developed to handle the scale. – Haoyuan Li, Founder and CEO, Alluxio

If 2022 was the year of Generative AI toys, 2023 will be the year of Generative AI products. The focus will be on revenue generation and not just product viability. Optimizing Generative AI means focusing on things humans can’t do well. With DALL-E 2, much of its success is similar to that of a good search engine. Humans can already make incredible art and write life-changing text. However, what humans aren’t good at is analyzing billions of data points to unpack trends, patterns and predictions. That’s where the opportunity lies. There are massive applications for everything from drug discovery to solving traffic! – Joshua Meier, Lead AI Scientist, Absci

Prompt Engineering Takes Flight and Gains Refinement as Deployment of LLMs Increase: Once a foundation model is trained, it can be used to address problems that go well beyond predicting, for example, the next word in a sentence based on the words that precede it. Inference – in other words, submitting an input to a model and receiving an output – becomes both more complex and of greater interest as a model grows in size. Next year, I expect prompt engineering – fine-tuning inputs to a model in order to receive a specific type of output – to become a central strategy in the deployment of LLMs. We’ve already seen this take place with the multitude of resources for engineering images generated by models like Stable Diffusion. In 2023, I expect the efforts to refine prompt engineering to take center stage. – Luca Antiga, CTO of Lightning AI
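The mechanics of prompt engineering can be illustrated with a minimal sketch in Python. The task, examples, and wording below are hypothetical; a real deployment would send the assembled string to an LLM API, and the craft lies in how that string is framed:

```python
def build_prompt(task: str, examples: list, query: str) -> str:
    """Assemble a few-shot prompt: an instruction, worked examples, then the query.

    The same underlying model can behave very differently depending on how
    this string is framed -- that framing is the "engineering" in prompt
    engineering.
    """
    lines = [task, ""]
    for example_input, example_output in examples:
        lines.append(f"Input: {example_input}")
        lines.append(f"Output: {example_output}")
        lines.append("")
    lines.append(f"Input: {query}")
    lines.append("Output:")  # the model completes from here
    return "\n".join(lines)

# Hypothetical sentiment-classification prompt.
prompt = build_prompt(
    task="Classify the sentiment of each review as positive or negative.",
    examples=[
        ("Great service, will return!", "positive"),
        ("Cold food and a long wait.", "negative"),
    ],
    query="The staff went out of their way to help us.",
)
print(prompt)
```

Swapping the instruction line or the worked examples redirects the model to an entirely different task without retraining, which is why prompt design is becoming a deployment discipline in its own right.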

Generative AI will gain momentum across businesses: Recent advances in AI image generation are making generative AI understandable to people who previously couldn’t picture it or imagine its uses. With image generation, AI has entered the home. Business leaders are hearing about it from their kids and their friends. Once a technology is being talked about by families over the dinner table, it’s poised to change the business landscape. In this way, generative images are paving the way for language models. Generative AI is trained on unfathomable amounts of data. If businesses aren’t figuring out how to operationalize the data they own, they will need to start there. It’s the perfect entry point for partnering with companies that have developed and honed AI tools over the years to accelerate progress in profound ways. Now is the moment for generative AI. –  Lisa Spira, Head of Content Intelligence, Persado

Increased opportunities for deep learning will boost demand for GPUs: The biggest source of improvement in AI has been the deployment of deep learning—and especially transformer models—in training systems, which are meant to mimic the action of a brain’s neurons and the tasks of humans. These breakthroughs require tremendous compute power to analyze vast structured and unstructured data sets. Unlike CPUs, graphics processing units (GPUs) can support the parallel processing that deep learning workloads require. That means in 2023, as more applications founded on deep learning technology emerge to do everything from translating menus to curing disease, demand for GPUs will continue to soar. – Domino Data Lab’s CEO Nick Elprin

HPC tackles deep learning: As deep learning becomes more prevalent in 2023, we will see a further shift in HPC workloads. While initially most machine learning workloads were run on Kubernetes or other container orchestration frameworks, it’s become clear that these systems are designed for microservices, not for the bursty, compute-intensive machine learning workloads now required for deep learning. Commercial HPC workload managers need comprehensive container support so organizations can spool up their compute and start to take advantage of batch scheduling, cloud bursting, and fair share — all key aspects of efficient HPC. – Altair’s Chief Scientist Rosemary Francis

The Rise of LLM Applications: Research on large language models will lead to new types of practical applications that can transform languages, text and even images into useful insights that can be used across a multitude of diverse organizations by everyone from business executives to fine artists. We’ll also see rapid growth in demand for the ability to customize models so that LLM expertise spreads to languages and dialects far beyond English, as well as across business domains, from generating catalog descriptions to summarizing medical notes. – KARI BRISKI, Vice President, AI and HPC Software, NVIDIA

The rise of generative AI is poised to change the enterprise: Generative AI is emerging as a game-changing technology. James Currier from nfx defines it as a rocketship for the human mind. I believe generative AI can be a rocketship for the Enterprise. This ability of machines to generate, summarize, and perfect enterprise content and communications will digitally transform every function of the enterprise, from finance and legal to marketing or customer services, in 2023 and beyond. If we zoom into commerce next year, we’ll see generative AI further blurring the lines between offline and online/digital commerce. Brands will be able to generate personalized language, images, and even videos, and combine those with personal multimedia experiences that are the closest we have ever seen to walking into a store/branch and getting the most powerful, relevant sales and service interaction. – Assaf Baciu, co-founder and COO, Persado

Generative AI needs maturing: Generative AI will continue to become more powerful and more prominent, but it still needs to develop and mature before it can be used appropriately in industries where the accuracy of statements is critical, such as in finance or medicine. The field includes natural language generators such as GPT-3 language models, which generate natural language using an initial prompt. At Intuit, generative AI may play a pivotal role in helping us create personalized conversational systems to provide financial advice and guidance directly to our customers. But we’ll need to make significant advances in the technology to provide the right advice and guidance at scale. For example, in the ideal scenario, generative AI could provide our human experts financial insights on areas such as cash flow and money movement, along with narratives that would be factually correct and useful for the customer. We’ll also explore using generative AI to help small businesses engage with their customers and help solve their financial problems. – Ashok Srivastava, Senior Vice President & Chief Data Officer at Intuit

Large language models (LLMs) will help people be more creative: We’re likely going to see more solutions like GitHub Co-Pilot where large language models (LLMs) are able to assist people in meaningful ways. These tools are not going to get everything right but they will help solve that initial writer’s block. When you’re staring at a blank page, it’s often hard to get started. But if you can describe the prompt or problem and the model outputs something, it may give you a good starting point and show you something you can use — even if it’s not entirely what you want. Prompt engineering (i.e. instructing models using the right starting text) will become a new way of writing programs, using natural language. I think AI will help in this or similar ways in a lot of domains and it’s very interesting. People thought the big hold-up for AI would be creativity, but ironically it may be the reverse. It may be that AI will actually help us become more creative — by seeding us with initial ideas that we can build upon and refine. – Varun Ganapathi, Ph.D., CTO and co-founder at AKASA

Generative AI Changed How We Create Pictures. Emerging 3D Generative AI Will Simulate the World: Generative AI enabled the creation of 2D images from text prompts, creating new opportunities for artistic expression. Over the next year, we’ll see the technology spur even more transformation as companies take these models one step further to generate 3D models. This emerging capability will change the way games are built, visual effects are produced, and immersive 3D environments are developed. For industrial uses, democratizing this technology will create significant opportunities for digital twins and simulations to train complex computer vision systems, such as autonomous vehicles. – Yashar Behzadi, CEO and Founder of Synthesis AI

Advances in Generative AI will lead to the democratization of creativity across consumers and businesses. Just as we saw an explosion of user-generated content a decade or so ago, we will see an explosion of AI-generated images, videos, artwork, blogs, summaries of articles, and articles. This will also change and challenge creative artists to work with AI to produce truly augmented art and creative work. Creative artists will focus more on aesthetics that appeal to certain groups of people and let the AI generate the art that fits the aesthetics. – Anand Rao, Global AI Lead; US Innovation Lead, Emerging Technology Group, PwC

AI will become more accessible across organizations and functions: The recent spike in popularity of ChatGPT has been heralded as a major breakthrough in delivering safe and useful AI systems that non-technical users can access in a conversational way. In 2023, we can expect more models to be unveiled, and the data exchanged between users and AI assistants will reveal ways to improve how departments – from marketing and sales to HR and others – work. – Keshav Pingali, co-founder and CEO of Katana Graph

With the introduction of Generative AI comes the increase of content on the internet. As user-generated content grows, the need to search for this content effectively will only become greater. The best way of engaging with all this new content is through text, which will be the widely used interface to describe images, videos and audio in the near future. The same neural networks that assist Generative AI in creating content will also help change and transform our engagement with this influx of new information through the summarization, categorization, and language translation of content across the internet. Organizations will seek pipelines for extracting, storing, and serving this content within existing web applications and analytic interfaces. As a result, semantic processing and serving engines will become an integral part of modern application architectures, sitting in front of, behind, and/or on top of data lakes, data warehouses, and application data pipelines. – Ed Albanese, Chief Operating Officer at Vectara

Generative AI results are both spectacular and concerning. They are spectacular because they appear structurally and semantically coherent even in long dialogs. They are concerning because they generate words and phrases to fill in gaps not founded on truth, fact, evidence, or even logical consistency. As the popularity of this powerful but easily misunderstood technology spreads, we will see a push to enforce more legislation surrounding best practices in artificial intelligence. This type of legislation will require that AI systems be hybrid systems, that they may find statistical patterns in the data but also use other AI techniques that interpret the data with respect to rational and transparent models of values, rules, and judgments. The result is a system that does not just blindly follow the data but can subject the implicit trends to a rational process. These types of hybrid AI systems are necessary for the broader acceptance, transparency, and responsible integration of AI in human decision processes. – Dr. David Ferrucci CEO and Founder of Elemental Cognition

These last few weeks of the year, we’ve seen the power of generative AI take off and grab the attention of the public. ChatGPT and AI-powered artistry wow people as they see what’s possible via artificial intelligence. As we enter 2023, I anticipate we’ll see large language models (LLMs) move to the forefront of the conversation about what is possible not only for consumers, but in an enterprise setting. AI-powered and deep learning based LLMs have allowed computers to better understand human language. Instead of training models to observe the structure of sentences, neural networks have allowed them to understand more complex semantics. As this technology continues to develop, we’ll see how it can impact job functions across every department and industry, transforming how knowledge workers approach their roles and ultimately aiding people in becoming smarter, faster and better at work. – Arvind Jain, CEO and co-founder of Glean

Generative AI becomes a marketer’s best friend: Now that generative AI can create compelling text, images, and videos, marketers will need to decide the extent to which they implement the technology. At the very least, AI can support existing content creation workflows, but in the near future, it could become responsible for new content creation processes that won’t require any support from humans. – Thomas Peham, VP of Marketing, Storyblok

Deep learning capabilities will edge further into the mainstream as more popular consumer use cases are adopted such as image recognition for authentication, video processing and object detection for security use. This is thanks to the increased efficiency of convolutional neural networks (CNN). Such technology will make way for semantic segmentation and even more real time use cases like self-driving cars, computing emotions during e-commerce purchases or enterprise sales calls, and improved health care support.  – Bal Heroor, CEO of Mactores

Graph Technology
I’ve always been a proponent of graphs, and it appears that the industry has finally come around to graph technology (database and compute) over the past year. Many contemporary data challenges can be solved with graphs at the data engineering, data science, and user levels. Graphs were previously utilized in a small number of critical systems, including the manufacturing and transportation of medicines during the pandemic, but I predict graph usage will increase in the coming year and become more widespread. While the pattern will be similar, it will be more broadly democratized. – Jim Webber, Chief Scientist, Neo4j

Causal Knowledge Graphs will Emerge: The next few years will see growth in Causal AI starting with the creation of Knowledge Graphs that discover causal relationships between events. Healthcare, pharma, financial services, manufacturing and supply chain organizations will link domain-specific knowledge graphs with causal graphs and conduct simulations to go beyond correlation-based machine learning that relies on historical data. Causal predictions have the potential to improve the explainability of AI by making cause-and-effect relationships transparent. – Jans Aasman, Ph.D., an expert in Cognitive Science and CEO of Franz Inc.
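A minimal sketch of the causal-graph idea in plain Python, with an invented supply-chain domain (the edges below are assumptions for illustration; real systems would use a graph database and learned causal discovery rather than hand-written edges):

```python
# Toy causal graph for a hypothetical supply-chain domain:
# each key causally influences the nodes it points to.
causal_graph = {
    "port_congestion": ["delivery_delay"],
    "delivery_delay": ["stockout"],
    "stockout": ["lost_sales"],
    "price_increase": ["lost_sales"],
}

def downstream_effects(graph: dict, cause: str) -> set:
    """Return every node reachable from `cause` -- the set of effects a
    'what-if' simulation on this cause could plausibly touch."""
    seen, stack = set(), [cause]
    while stack:
        node = stack.pop()
        for effect in graph.get(node, []):
            if effect not in seen:
                seen.add(effect)
                stack.append(effect)
    return seen

print(downstream_effects(causal_graph, "port_congestion"))
```

Unlike a correlation learned from historical data, the directed edges encode an explicit cause-and-effect claim, which is what makes the resulting predictions inspectable and explainable.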

Graph neural networks will be particularly influential in AI applications: Whether it’s halting information fraud or security in real time, or uncovering more effective healthcare treatments, innovations in graph can identify relationships in data that users otherwise wouldn’t see. Solutions focused on extrapolating insights from unstructured data will further support a range of enterprise use cases. – Keshav Pingali, co-founder and CEO of Katana Graph

One hot topic in AI tech right now is knowledge graphs, which basically add another layer to data sets and allow you to describe properties and relationships between entities. This technology will have practical applications in various industries, but we might see it used in social media and digital marketing in 2023. One example is the ability to detect and even predict the real impact of an online influencer or opinion leader based on the data of their network and interactions. We’ll also be able to predict whether a piece of content will go viral. – Erudit’s Chief Science Officer & Co-founder, Ricardo Michel Reyes
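As a toy illustration of the network-based influence idea, here is a sketch in plain Python over a hypothetical edge list; a production system would use a graph database and far richer signals (edge types, entity properties, temporal dynamics) than raw in-degree:

```python
from collections import Counter

# Hypothetical interaction edges: (who, engaged_with_whom).
interactions = [
    ("alice", "bob"), ("carol", "bob"), ("dave", "bob"),
    ("bob", "alice"), ("erin", "carol"),
]

def influence_scores(edges):
    """Rank accounts by in-degree -- a crude proxy for influence that a
    knowledge graph would refine with relationship types and properties."""
    indegree = Counter(target for _, target in edges)
    return indegree.most_common()

print(influence_scores(interactions))
```

Even this crude in-degree ranking surfaces "bob" as the most-engaged account; layering typed relationships and entity attributes on top of the same graph is what turns a ranking like this into a prediction of real-world impact or virality.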

Energy-Efficient Computing and Infrastructure
AI Becomes Cost-Effective With Energy-Efficient Computing: In 2023, inefficient, x86-based legacy computing architectures that can’t support parallel processing will give way to accelerated computing solutions that deliver the computational performance, scale and efficiency needed to build language models, recommenders and more. Amidst economic headwinds, enterprises will seek out AI solutions that can deliver on objectives, while streamlining IT costs and boosting efficiency. New platforms that use software to integrate workflows across infrastructure will deliver computing performance breakthroughs — with lower total cost of ownership, reduced carbon footprint and faster return on investment on transformative AI projects — displacing more wasteful, older architectures. – CHARLIE BOYLE, Vice President, DGX systems, NVIDIA

Green Computing for a Sustainable Future: As the threat of the global energy crisis and recession continue and companies become more mindful of their carbon footprint, more and more enterprises—especially the mid-to higher-end—will significantly accelerate their investment in cloud adoption to securely and efficiently manage both modern and legacy applications. While the cloud is not perfect when it comes to sustainability, it is more eco-friendly than traditional data centers. And with more companies shifting workloads to the cloud, cloud providers will continue to invest in renewable energy sources to enable environmentally friendly cloud-native applications. – Prashant Ketkar, Chief Technology and Product Officer at Alludo

The foundation for our future is the implementation of composable infrastructure. This technology advancement will become critical in 2023, allowing enterprises to succeed and grow despite economic downturns and dwindling resources. Composable infrastructure reduces our reliance on technical experts, removes bottlenecks, and allows innovation from everyone involved. The modularity afforded by this model brings greater compartmentalization and specialization, with digital capabilities embedded throughout the business. These efforts to liberate quick innovation support Moore’s law, under which computing becomes significantly faster, smaller, and more efficient over time. Enterprises that don’t keep up with the quick changes coming will fall behind. According to Gartner, organizations that adopt a composable approach in 2022 will outpace their competition by 80% in new feature implementation. As adoption of composable infrastructure grows, we will see a shift towards a more decentralized IT model. While IT won’t disappear, the old system will change. Instead, more responsibility will rest within the business, where we’ll see the hiring of technical people, developers, and others who support and embrace the new approach. – Tam Ayers, Field CTO, North America at Digibee

The AI computing war will reach a fever pitch: The race for evermore powerful AI supercomputers will shift into high gear and will introduce a new problem, being able to feed data fast enough with higher bandwidth and low latency. – Tony Pialis, CEO of Alphawave

As Generative AI and its resulting content explodes, the energy consumption and load on hardware chips will only increase and accelerate: Generative AI is taking the internet by storm. Now publicly accessible, new AI technology, such as AI-generated portraits and artwork, is proliferating across social platforms. This sudden increase in access and popularity will have a direct impact on the energy consumption load placed onto hardware chips. Already, current AI models running on traditional chips require massive computing power from data centers. Given the demand for AI and ML, its compounding annual growth rates, and the ever-increasing need for compute power, we’re already facing a crisis. In 2023, hyperscalers will need to factor in this energy consumption as AI and ML continue to see explosive growth through content creation of generative AI. – Nicholas Harris, CEO and founder of Lightmatter

IoT and Edge Computing

IoT will become more accessible for IT: Companies need to ensure they have the same tools, APIs, and infrastructures from the cloud to the edge to ensure their teams don’t have to re-learn things. Providers need to make tools and devices simpler, more accessible for companies to innovate with, and easier to bring to market as the adoption of IoT technologies continues to grow. Many companies will start leveraging IoT technologies to drive business and operational transformation by extending the benefits of the cloud to the edge. It is important to bring the agility, elasticity, economics, scale, and security of the cloud to the edge. – Yasser Alsaied, VP of IoT at AWS

In 2023, it will become clearer that succeeding in the IoT space is all about working in an ecosystem of applications instead of a singular application. An IoT platform means you’re able to bring in more third-party sensors and devices to add to your arsenal of sensing, and you’re also able to push out the data you’re collecting to other platforms using APIs. – Sammy Kolt, Chief Product Officer, SmartSense by Digi®, a business unit of Digi International 

Connected device visibility and security will be a major area of focus for most organizations: IoT-connected devices have been deployed by most organizations over the years, but often without adequate security governance. As the number of IoT-, OT-, ICS- and IIoT-connected devices grows, the attack surface for the networks and ecosystems to which they’re connected grows as well, creating exponentially more security, data, and privacy risks. Leading organizations will focus in the year ahead on connected device cyber practices by establishing or updating related policies and procedures, updating inventories of their IoT-connected devices, monitoring and patching devices, honing both device procurement and disposal practices with security in mind, correlating IoT and IT networks, monitoring connected devices more closely to further secure those endpoints, managing vulnerabilities, and responding to incidents. – Wendy Frank, Deloitte’s US Cyber IoT leader

Edge Data: 2023 will increase opportunities for cloud and network convergence and force a rethinking of IT architectures, especially at the edge and for mobile environments where it meets the physical world. The explosive growth of edge data, driven by IIoT adoption and 5G, will allow companies to quickly process and analyze data where it lives and where quick responses are required. – Bjorn Andersson, senior director of global digital innovation marketing and strategy, Hitachi Vantara

Distributed communication and data processing will become a new trend: With a movement towards edge computing, more resources are available on the edge. As 2022 comes to a close, many companies have started deploying edge infrastructure. The need to balance costs has resulted in a tradeoff: the newly gained computing power comes at the expense of reduced centralized resources. This new distribution of computing creates a new challenge: effectively using the “enterprise-wide data center” for workload execution. A distributed — but low-power — infrastructure requires new models of workload management and leads to needs around distributed data processing and better means of near-real-time communication across distributed platforms. As those needs grow, new and advanced technology solutions will enter the market to address this demand. Torrent-like P2P networking models will see a resurrection, and blockchain-based validation algorithms will emerge and establish themselves. – Ekim Maurer, director of product management, NS1

Analytics at the Edge Will Go Mainstream: Analytics at the edge may not be a new concept, but 2023 will see more happening at the edge than ever before. Today’s terminal devices are almost as powerful as a low-end server, and the increasing availability of 5G networks has enabled the transmission of more data at higher speeds. As more data is pushed to the edge, it will enable real-time decision making in the field, shorten response times, and reduce the compute, storage, and network costs of cloud infrastructure. For these reasons, along with lower technical barriers, we are sure to see greater adoption of analytics at the edge in 2023. – James Li, CEO of CelerData

The Edge will remain a nebulous and disputed concept: Edge is a bad name for a distributed compute paradigm. There is simultaneously no edge to the Internet, and many Edges, depending on your perspective. The debate will continue to rage about where the Edge is and whether some distributed systems are more or less “Edge-y” than others. What will not be disputed is that distribution of applications to wider hosting footprints has advantages with respect to elements such as latency, reliability, redundancy and data backhaul cost. So maybe a new phrase will emerge with a focus on application distribution rather than Edge. – Stewart McGrath, co-founder and CEO of Section 

IoT will become a universal business decision and will be an expectation rather than an exception: Many businesses and industries will continue to invest in IoT because it provides business and operational value. For example, agriculture companies use IoT technology to expand their business by adding precision agriculture tools into their vehicles and improving their factory operations with digital twins. We are consistently seeing new customer segments, like the retail industry, unlocking the value of IoT. Customers are trying to formulate a unified experience that traverses easily between online and offline (O2O), resulting in the convergence with mobile, social media, and the Internet of Things (IoT) that can serve wherever and whenever they desire. Customers are also broadening their sustainability initiatives to go beyond emission reduction to create a smart building or smart city environment leveraging IoT to monitor energy performance, reduce waste, and align facility operations with occupancy trends. – Yasser Alsaied, VP of IoT at AWS

As sensor technologies become embedded in enterprises across industries, IoT strategies have started fueling the products and services that shape our world today. However, IoT’s growth is also causing enterprises to amass data at an unmanageable rate. IoT data is increasingly becoming trapped in edge environments, limiting enterprises’ ability to derive its full value. Throughout 2023, enterprises will realize that they are not only losing money at the edge but also insights that can guide organizational decision-making, fuel new partnerships and reveal new business models. As the scale of this loss becomes clearer, enterprises will experiment with ways to move IoT data to the cloud, and solutions for large-scale and continuous IoT migrations will become mission-critical. – Paul Scott-Murphy, Chief Technology Officer of WANdisco

Kubernetes
Organizations will prioritize easy-to-maintain technology to bridge the skills gap: Accelerated digital transformation has led to more distributed IT infrastructures, with Kubernetes becoming the de facto standard for managing containerized environments. Although there are many benefits to using Kubernetes in hybrid and multi-cloud environments, Kubernetes is a complex technology that requires deep technical skills to deploy and manage. Because Kubernetes is a relatively new technology, the talent pool of skilled Kubernetes engineers is limited. This is why we expect to see organizations gradually abandon DIY Kubernetes projects and put their budgets toward training and technology for their Kubernetes deployments and projects. Considering the economic uncertainty over the next year, CIOs and business decision-makers are being forced to look at their budgets closely and be more selective on which technology investments to move forward with. One critical factor during this time is the growing skills gap in emerging technology sectors. In an effort to bridge this gap, technology and tools that are both impacting the business’s bottom line and are easy to deploy and maintain will rise to top priority. – Tobi Knaup, CEO and co-founder of D2iQ

AI and ML workloads running in Kubernetes will overtake non-Kubernetes deployments: AI and ML workloads are picking up steam but the dominant projects are still currently not on Kubernetes. We expect that to shift in 2023. There has been a massive amount of focus put into adapting Kubernetes over the past year, with new projects that make it more attractive for developers. These efforts have also focused on adapting Kubernetes offerings to allow the compute-intensive needs of AI and ML to run on GPUs, maintaining quality of service while hosted on Kubernetes. – Patrick McFadin, DataStax’s VP of Developer Relations

Edge burns white-hot: Kubernetes may have gained popularity as the operating system for the data center, but its real value may prove to be at the edge, where its portable and resilient application workloads can power an almost infinite variety of digital business processes and customer experiences. Our research has found that 35% of production Kubernetes users are already running Kubernetes at the edge, and many more plan to do so in the next 12 months. The use cases are incredibly varied, from fruit-picking drones to AI on MRI machines, and many of them have the potential to drive revenue and competitive differentiation for the companies that get them right. But the challenges are equally immense, from manageability to security. 2023 is the tipping point, when the challenges get hit head-on, and edge truly goes mainstream.

More large-scale analytics and AI workloads will be containerized, but the talent pool is a bottleneck: In the cloud-native era, Kubernetes has become the de facto standard, with a variety of commercial platforms available on the market. Organizations are increasingly deploying large-scale analytics and AI workloads in containerized environments. While containers provide many benefits, the transition to containers is very complex. As a result, in 2023 the main bottleneck to container adoption will be the shortage of talent with the necessary skill set for tools like Kubernetes. – Haoyuan Li, Founder and CEO, Alluxio

The rise of Kubernetes as a Service: Kubernetes has been described as an operating system for containers. As workload management continues to expand to serverless and virtual machines, and the operations ecosystem (e.g., security and observability) matures and hardens, we will see Kubernetes more abstracted from users. No developer working on building an application really needs (or probably wants) to understand and manage Kubernetes. What they really want is the benefits of Kubernetes when managing their applications in production. In the same way, no developer wants to manage Linux or even the servers on which it runs, so cloud computing gave us compute as a service. Kubernetes is one layer above that compute, and a natural fit for an “as a service” offering; in 2023 we’ll see that take off. – Stewart McGrath, co-founder and CEO of Section

Kubernetes goes mission-critical: Over the last 24 months, Kubernetes has become mainstream. Containers are now being adopted in mission-critical environments, meaning that the application environment and the underlying data in these environments now need protection. Ownership of these containers and their protection has become more complex, creating silos and confusion over whether the backup admin or the DevOps admin is responsible. At the same time, organizations are struggling to identify which containers to back up and how to do so, which will likely lead to more investment in training to help close the Kubernetes skills gap. In 2023, IT departments will continue to navigate how to adequately protect and back up their Kubernetes environments. – Deepak Mohan, EVP of Engineering at Veritas Technologies

Kubernetes’ complexity will continue to be an issue: Kubernetes is one of those technologies that everyone uses because they have to but nobody likes – sort of like Maven during the heyday of Java. There have been a number of attempts to build something like Kubernetes, but simpler, including K3S, MDSO, and others. However, none of these technologies seem to be gaining significant traction. In 2023, people will keep trying to build simpler technologies and someone may succeed. This is the kind of product I would expect Hashicorp to do well. – Mike Loukides, Vice President of Emerging Tech Content at O’Reilly Media

2023 Will See The Taming of Kubernetes Chaos: The “great K8s irony” is that the very technology created to streamline the management of cloud applications is, itself, incredibly difficult to manage. Enterprise adoption of Kubernetes is often stalled by the staggering number of disparate tools and software add-ons that need extensive integration and maintenance, especially as the number of workloads and clusters increases. Consequently, enterprises struggle to sustain the increasing time, cost and resources needed to manage this “Kubernetes jigsaw puzzle.” In 2023, leaders will realize that platform teams are essential to solving the Kubernetes jigsaw puzzle and taming the Kubernetes chaos. By eliminating the complexities of Kubernetes from developers’ workloads, platform teams bring an operational mindset to internal tools and workflows, enabling them to manage and operate Kubernetes at scale. – Mohan Atreya, SVP product and solutions, Rafay Systems

Low-Code / No-Code

No-code / no-brain AI. Advanced machine learning technologies will enable no-code developers to innovate and create applications never seen before. This evolution may pave the way for a new breed of development tools. – Esko Hannula, Sr. VP of Product Management at Copado

Rise of low-code and no-code AutoML: In 2023, there will be greater availability of industrialized AI through low-code and no-code automated machine learning (AutoML). The models will be provided through self-service marketplaces and can be enhanced with packaged services for customization and deployment. – Jason Mann, Vice President of IoT, SAS

An emergence of low-code CX: The past few years have highlighted the need for enterprises to pivot efficiently to meet the ever-shifting landscape of customer needs. Next year, we’ll see an increase in user-friendly, low-code processes and systems to create a seamless customer experience across a myriad of touchpoints and systems. Vendors will embrace industry-standard APIs to allow enterprises to integrate their CX ecosystem, connecting internal and external systems painlessly. – Dr. Prateek Kapadia, Chief Technology Officer at Flytxt

Low-code will be a top priority for democratizing automation and AI. Simple-to-use tools help both experienced and beginner technical workers do more with their technology. With low-code, business users unfamiliar with analytics can still tap into their knowledge and expertise to train sophisticated AI models to solve challenges. – Ted Kummert, Executive Vice President of Products & Engineering at UiPath

Low code/no code applications will create compliance issues: Low code/no code application development has been instrumental in democratizing application development across companies. In 2023, low code/no code adoption will become mainstream, and non-technical employees (citizen developers) across any organization will have the power to create their own app. While this will significantly alleviate the burden on IT teams, it will also create a big compliance risk for organizations. Because citizen developers don’t have the same experience in implementing security and privacy, most of the applications they develop won’t be adequately protected and protection policies may be inaccurately applied. As a result, not only will organizations face compliance issues, their applications may also create new vulnerabilities for bad actors to exploit. – Deepak Mohan, EVP of Engineering at Veritas Technologies

Modern no-code and low-code solutions will follow a bottom-up approach: No-code and low-code platforms like AirTable have been instrumental in democratizing company data. However, while they provide highly intuitive facades for non-technical business users, their top-down architecture is extremely limiting or inaccessible to engineers. While quickly adoptable, these band-aid services have an unconsidered backend that is unable to scale, and therefore needs to be replaced over time. In the coming years, modern NC/LC solutions will follow a bottom-up approach that lays a foundational data layer composed of powerful developer tools, performant APIs, tailored data stores, and an unopinionated tech stack. True data democratization can’t be achieved without equally enabling both non-technical and highly technical users. – Ben Haynes, co-founder/CEO of Directus

2023 will be low-code’s time to shine as the developer shortage continues. The real mark of a great developer is not the project they deliver, but how they choose that project from among many. Organizations will need to deploy their technological talent very smartly; no one can afford to deploy it haphazardly. – Gordon Allott, CEO of BroadPeak Partners

AI will be brought more heavily into the low-code equation: Artificial intelligence will increasingly enable software development processes that are proactively guided, and even written, by other software. This will allow business users to create new applications using text prompts with the assistance of application development tools. While this prospect may cause professional developers to feel anxious, the shift promises to create new opportunities within IT rather than eliminate old ones. Software developers will become adept at enabling this evolution by learning how to provide the right prompts to an AI tool to generate the code that a no-code application developer will need. More fundamentally, AI, AR/VR, and simulation software are going to rule; to support this necessary backbone, compute, network, and storage technologies will take an exponential leap in the next three to five years. So, tech changes will be driven at the compute, storage and network level!

IT departments will increasingly rely on the “business technologist”: Low-code options have created a new persona in the workplace: the business technologist—also known as the “citizen developer”—who can participate in the application development process. As of now, the IT department still does the heavy lifting of application development, but in 2023 and beyond, business users will increasingly be able to create applications end-to-end with relatively little intervention from developers. This shift will allow developers to focus on maintaining large-scale strategic projects, while monitoring the long tail of applications being built by business technologists. – Lloyd Adams, President, SAP North America

Economic uncertainty will make low-code tools an enterprise necessity: In today’s rapidly changing environment, businesses are moving faster than ever before to innovate and gain market traction amid inflation, a looming economic recession, tightening budgets and continued supply chain issues. This means that enterprises can no longer afford to wait for months to get engineering and IT resources to update their systems. A well-built low- or no-code platform gives businesses the ability to quickly build applications that solve an even broader range of use cases. When seeking new low-code tools, companies should make sure they can integrate seamlessly with their existing systems. Low-code tools should be easy to configure, ensuring faster implementation and time to value. Low-code tools also need the ability to grow with a company. This requires the tools to have a stable platform, reliable performance and the ability to process data at scale. With these features in mind, low-code tools will soon become a requirement for any business looking to modernize their business processes. – Lu Cheng, Co-founder and CTO, Zip

Machine Learning

The data science and ML community will actively embrace more standardization: Today’s data science and machine learning market is still very fragmented, while the pace of innovation remains rapid. In 2023, we will see the desire for standardization led by two key driving forces — the traditional Python community focusing more on successful productization of Python code, and a growing group of enterprises who are fast becoming important stakeholders in Python. Both groups are eager to realize a stable, trusted platform on which they can build for the long term. We see ML standardization already in progress with the four leading ML frameworks: scikit-learn, XGBoost, PyTorch, and TensorFlow. Any innovator will naturally use one of these very popular, broadly adopted frameworks over another alternative. I expect the data science and ML community will continue to move towards standardization, which is very healthy for the entire market space, as we’ve previously seen play out in other areas, for example, in the Linux community. – Torsten Grabs, Director of Product Management, Snowflake
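Much of the standardization the quote describes centers on a shared estimator interface popularized by scikit-learn. As a hedged illustration — a toy model, not any framework’s actual code — here is that fit/predict convention in miniature:

```python
# Minimal illustration of the scikit-learn-style estimator contract
# that the major ML frameworks broadly converge on. The model itself
# is deliberately trivial (it predicts the training-set mean); the
# interface, not the model, is the point.

class MeanRegressor:
    def fit(self, X, y):
        # By convention, attributes learned from data get a trailing underscore.
        self.mean_ = sum(y) / len(y)
        return self  # fit returns self, enabling chained calls
    def predict(self, X):
        # One prediction per input row.
        return [self.mean_ for _ in X]

model = MeanRegressor().fit([[1], [2], [3]], [10, 20, 30])
print(model.predict([[4], [5]]))  # [20.0, 20.0]
```

Because the contract is uniform, tooling built against it (pipelines, hyperparameter search, model registries) works across any conforming estimator — which is exactly what makes such conventions attractive standardization targets.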

The uptake of digital technologies in manufacturing in 2022 was surprising. In 2023, we’ll see manufacturers continue to invest in machine learning for factory optimization, specifically white-box (transparent) machine learning so that they can generate ROI from data harmonization efforts. In addition, infrastructure sectors (steel, cement) will see more activity as last year’s Infrastructure Bill is tapped into. Within these sectors, there will be increasing interest in AI and machine learning as they pursue decarbonization solutions. – Berk Birand, co-founder and CEO of Fero Labs

While we have seen more attention paid to correcting gender bias, machine learning will continue to reduce bias. In conversational AI, systems that “know the customer” by leveraging information about that particular person will also reduce bias. – Language I/O Chief Technology Officer Diego Bartolome

Automating ML workflows will become more essential: Although we’ve seen plenty of top technology companies announce layoffs in the latter part of 2022, it’s likely none of these companies are laying off their most talented machine learning personnel. However, to fill the void of fewer people on deeply technical teams, companies will have to lean even further into automation to keep productivity up and ensure projects reach completion. We also expect to see companies that use ML technology put more systems into place to monitor and govern performance and make more data-driven decisions on how to manage ML or data science teams. With clearly defined goals, these technical teams will have to be more KPI-centric so leadership can have a more in-depth understanding of machine learning’s ROI. Gone are the days of ambiguous benchmarks for ML. – Moses Guttmann, CEO and Co-Founder of ClearML

Data needs a seat at the table as AI grows: As AI and machine learning continue to grow and become more commonplace, the data fueling these applications will need a seat at the decision maker’s table. In 2023, we will see continued development of data teams and data leadership roles.  The Chief Data Officer position will continue to expand, furthering its rise as the fastest growing CxO position amongst Global 2000 enterprises. – Stijn Christiaens, Co-Founder & Chief Data Citizen at Collibra

SaaS will be all about specialization: Most cloud providers have become comparable in basic capabilities and there is very little to differentiate them. The journey from here is going to be about specialization. Companies will need to start diving a little deeper into the key value they are looking for and which cloud provider can provide it best. For example, for some AI and ML capabilities, there may be a specific cloud that has a significant upper hand or for PaaS there could be another cloud which delivers a significant discount based on previous usage. For organizations to drive the value needed to stay competitive it will be critical to have the right infrastructure and tools in place to effectively manage data & operations in a multi-cloud environment. – Amit Rathi, VP of Engineering, Virtana

AutoML synergies will help to ease tensions between citizen and hardcore data scientists: In 2023, we will see the breaking down of the artificial barrier that has existed between two former factions — the early emerging citizen data scientists and the hardcore data scientists and ML engineers. The conflict over the merits of Automated Machine Learning (AutoML) as opposed to traditional hand-coded ML models will disappear in favor of synergy between the two camps and the two types of ML. We’ll see hardcore data scientists and ML engineers change their view of AutoML, adopting it as a quick start that reaches an initial draft of an ML model an order of magnitude faster than their prior hand-coded approach. AutoML will be the car that contains the ML engine. So while citizen data scientists use AutoML, the data scientists and ML engineers will be able to crack open the hood of the AutoML car to fine-tune the complex ML engine inside. – Torsten Grabs, Director of Product Management, Snowflake
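The “quick start, then fine-tune” workflow can be made concrete with a minimal sketch of the selection loop at the heart of AutoML. The candidate set and scoring function below are illustrative stand-ins (not any specific AutoML product), where a real system would score candidate pipelines by cross-validated quality:

```python
# Naive AutoML-style search: evaluate every candidate configuration
# and keep the best one as a draft model.

def score(candidate, data):
    # Toy objective standing in for cross-validated model quality;
    # it peaks at candidate value 2 purely for illustration.
    return -abs(candidate - 2)

def auto_select(candidates, data):
    """Try each candidate, return the highest-scoring one."""
    return max(candidates, key=lambda c: score(c, data))

best = auto_select(candidates=[1, 2, 3, 4], data=None)
print(best)  # 2 -- the draft an ML engineer then fine-tunes by hand
```

The citizen data scientist stops at `best`; the ML engineer “opens the hood” by replacing the scoring function, widening the candidate space, or hand-tuning the winning configuration.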

Hoarding ML talent is over: Those recently laid off who work with machine learning are likely the most recent hires, as opposed to the longer-term staff who have been working with ML for years. Since ML and AI have become more common technologies in the last decade, many big tech companies began hiring these types of workers because they could handle the financial cost and keep them away from competitors – not necessarily because they were needed. From this perspective, it’s not surprising to see so many ML workers being laid off, considering the surplus within larger companies. However, as the era of ML talent hoarding ends, it could usher in a new wave of innovation and opportunity for startups. With so much talent now looking for work, we will likely see many of these folks trickle out of big tech and into small and medium-sized businesses or startups. – Moses Guttmann, CEO and Co-Founder of ClearML

End User Experience Becomes A Top Priority: As deep integrations of data platforms become standard, the reduced complexity will usher in a focus on end user experience. The data platform will become abstracted even further from end users. Instead of worrying about the underlying engines and data structures, end users will be able to easily and seamlessly leverage powerful underlying engines for interactive, batch, real-time, streaming and ML workloads. – Steven Mih, Co-founder and CEO, Ahana

Most ML Projects Still Fail: This is not a maturity issue, and it’s not a failing of available tools (though technology is often blamed). Organizations are going to have to take a closer look at their teams: why some ML teams within the same company are underperforming while others are absolutely crushing it, and what companies can do to ensure that teams are asking the right questions and building models that deliver business value with the capabilities available to them. – Gideon Mendels, CEO and co-founder of MLOps platform Comet

Organizations will realize the need to invest in AI/ML platforms for reusability, scale, and faster delivery: Organizations have struggled to scale their AI/ML model deployment in production due to complex data dependencies, manual processes, lack of expert AI/ML skills, and siloed engineering teams. In 2023, organizations will look to invest in AI/ML platform teams and services which can simplify dependencies, improve data and model discoverability, manage dataset access and governance, and make model building and deployment easily repeatable. MLOps platforms which can provide these benefits will gain traction. – Preethi Srinivasan, Director of Innovation at Druva

Rise of Real-Time Machine Learning: With all the real-time data being collected, stored, and constantly changing, the demand for real-time ML will be on the rise in 2023. The shortcomings of batch predictions are apparent in the user experience and engagement metrics for recommendation engines, but become more pronounced in online systems that do fraud detection, since catching fraud three hours later introduces very high risk for the business. In addition, real-time ML is proving to be more efficient both in terms of cost and the complexity of ML operations. While some companies are still debating whether there’s value in online inference, those who have already embraced it are seeing the return on their investment and surging ahead of their competitors. – Dhruba Borthakur, co-founder and CTO at Rockset
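The batch-versus-online contrast for fraud detection can be sketched in a few lines. The threshold, features, and scoring rule below are illustrative assumptions, not a real fraud model:

```python
# Online inference: score the transaction while it is in flight,
# so a risky transfer can be blocked before it settles. A batch job
# running the same model hours later would flag the same transaction
# only after the money has already moved.

RISK_THRESHOLD = 0.8  # assumed cutoff for illustration

def fraud_score(txn):
    # Stand-in model: large transfers to new payees look risky.
    score = 0.0
    if txn["amount"] > 1000:
        score += 0.5
    if txn["new_payee"]:
        score += 0.4
    return score

def score_online(txn):
    """Decide synchronously, inside the payment path."""
    return "BLOCK" if fraud_score(txn) >= RISK_THRESHOLD else "ALLOW"

print(score_online({"amount": 5000, "new_payee": True}))   # BLOCK
print(score_online({"amount": 20,   "new_payee": False}))  # ALLOW
```

The operational cost the quote mentions shows up here too: the online path must serve features and the model at request latency, whereas the batch path only needs a nightly scan.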

ML project prioritization will focus on revenue and business value: Looking at ML projects in-progress, teams will have to be far more efficient given the recent layoffs and look towards automation to help projects move forward. Other teams will need to develop more structure and determine deadlines to ensure projects are completed effectively. Different business units will have to begin communicating more, improving collaboration, and sharing knowledge so these now smaller teams can act as one, cohesive unit. In addition, teams will also have to prioritize which types of projects they need to work on to make the most impact in a short period of time. I see machine learning projects boiled down to two types: sellable features that leadership believes will increase sales and win against the competition, and revenue optimization projects that directly impact revenue. Sellable feature projects will likely be postponed, as they’re hard to get out quickly, and instead, the now-smaller ML teams will focus more on revenue optimization as it can drive real revenue. Performance, in this moment, is essential for all business units and ML isn’t immune to that. – Moses Guttmann, CEO and Co-Founder of ClearML

2023 will be a year of acceleration for the operationalization of widespread usage of analytics and ML in all functions of enterprises. For years, early adopters have already been building out systems to automate a host of mundane tasks and to focus on higher-value activities: this has included everything from financial reporting, to data cleansing and document parsing. They’ve also combined automation with traditional analytics and AI or ML activities. The benefits can be significant, with companies reporting greater efficiencies and improved quality control, with time to focus on developing the next great ideas and products. Moving on to more profound work also delivers a higher sense of accomplishment: it makes people feel that their job has more value and sense. All of this together creates a strong incentive for more conservative companies to heavily invest in these practices, which are more often than not accelerated by employees eager for more automation, more analytics, and more insight. When it’s grassroots-driven like this, you get buy-in from across the organization. The success of these initiatives relies on appropriate tooling and standard processes (MLOps, data ops, sometimes called XOps) in order to disseminate such power across organizations, while retaining appropriate controls and governance. – Clément Stenac, Co-founder and CTO, Dataiku

Data powers every major decision in organizations today. In 2023, organizations will continue to collect vast amounts of data from the physical world – through sensors and instruments – as well as from the digital world of online transactions. To organize these intermingled data sets, developers will increasingly tag data, not only with time-stamps, but also “location-stamps”. To extract value from this data tagged with geospatial context, data scientists and developers will work together to build and use specialized machine learning models and analytical queries. Much like DevOps transformed the software landscape two decades ago, MLOps services purpose-built for specialized data, models, and queries will increase the agility and productivity of developers building intelligent services. Developers who take advantage of these emerging platform capabilities to deliver a new class of mobile applications and cloud services enriched with geospatial data will drive the next wave of growth in the industry. – Vin Sharma, VP of Engineering at Foursquare
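A minimal sketch of the “location-stamp” idea described above: each event gets tagged with both a timestamp and a geospatial coordinate so downstream models and queries can slice by place as well as time. The field names (`ts`, `loc`) are assumptions for illustration:

```python
from datetime import datetime, timezone

def stamp(event, lat, lon):
    """Attach a timestamp and a 'location-stamp' to an event record."""
    return {
        **event,
        "ts": datetime.now(timezone.utc).isoformat(),  # time-stamp
        "loc": {"lat": lat, "lon": lon},               # location-stamp
    }

e = stamp({"type": "checkin", "venue": "cafe"}, 37.7749, -122.4194)
print(e["loc"])  # {'lat': 37.7749, 'lon': -122.4194}
```

With both stamps present, a geospatial ML model can join events against places and time windows, which is the kind of specialized query the quote expects MLOps platforms to support.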

Modernization efforts will increasingly rely on automation to curb costs, accelerate project completion, and help address challenges sourcing programmers in the current market. AI and ML algorithms will become more intelligent, and the number of modernization success stories will mount as processes become more standardized and human errors are reduced. – EvolveWare CEO Miten Marfatia

Open source ML tools will gain greater market share: It’s clear that next year MLOps teams, which specifically focus on ML operations, management, and governance, will have to do more with less. Because of this, businesses will adopt more off-the-shelf solutions because they are less expensive to produce, require less research time, and can be customized to fit most needs. MLOps teams will also need to consider open-source infrastructure instead of getting locked into long-term contracts with cloud providers. While organizations doing ML at hyperscale can certainly benefit from integrating with their cloud providers, it forces these companies to work the way the provider wants them to work. At the end of the day, you might not be able to do what you want, the way you want, and I can’t think of anyone who actually relishes that predicament. You are also at the mercy of the cloud provider for cost increases and upgrades, and will suffer if you are running experiments on local machines. On the other hand, open source delivers flexible customization, cost savings, and efficiency – and you can even modify open source code yourself to ensure it works exactly the way you want. Especially with teams shrinking across tech, this is becoming a much more viable option. – Moses Guttmann, CEO and Co-Founder of ClearML

Companies with more than one machine learning use case in production will shift toward an MLOps “Continuous X” culture. This means that such companies will implement Continuous Integration (CI), Continuous Delivery (CD), Continuous Training (CT), and Continuous Monitoring (CM) within their practices. Companies with zero machine learning use cases in production will need to develop a collaborative approach between their business, technology, and product teams in order to define a clear MVP and land their first win. – Bal Heroor, CEO of Mactores
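The Continuous Training (CT) piece of that loop can be sketched as a simple control rule: Continuous Monitoring (CM) watches a live quality metric and triggers the training pipeline when it decays. The accuracy floor and the retraining stub below are illustrative assumptions:

```python
# Hedged sketch of a CT trigger: retrain when monitored live accuracy
# drops below an agreed floor. In a real pipeline, `retrain` would kick
# off CI/CD for a new model version; here it is a stub.

ACCURACY_FLOOR = 0.90  # assumed service-level floor

def should_retrain(live_accuracy):
    return live_accuracy < ACCURACY_FLOOR

def ct_step(live_accuracy, retrain):
    """One monitoring tick: trigger the training pipeline if needed."""
    if should_retrain(live_accuracy):
        retrain()  # CT: launch training; CD then deploys the new model
        return "retrained"
    return "ok"

print(ct_step(0.95, retrain=lambda: None))  # ok
print(ct_step(0.85, retrain=lambda: None))  # retrained
```

The same shape generalizes to other triggers (data drift, feature distribution shifts) by swapping the predicate in `should_retrain`.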

Metaverse

Quiet times for the Metaverse and NFTs: They’re not dead, but we won’t be seeing any major milestones in 2023. – Dan Parsons, CPO and co-founder of Thoughtful 

Metaverse Technologies Will Remain Just Hype, While the Adoption of Digital Transformation Technologies Trends Higher and Higher: While there might be flashes of jazzy product introductions around Metaverse technologies, there will not be any mass adoption or game-changing impact in 2023 stemming from Metaverse. These technologies will remain just hype for the foreseeable future until more and more enterprises gain a better understanding of this space and its impact. Technologies accelerating digital transformation, with a focus on cost reduction, will gain steam in 2023. The digital transformation trend that started during the Covid pandemic will only continue to accelerate as enterprises look for new ways to extract efficiencies in systems and processes. – Shiva Nathan, Founder & CEO of Onymos 

Metaverse technologies are still too immature to provide the psychological safety required for high-quality coaching. While some practitioners will continue experimenting in the space, there won’t be any large-scale adoptions of coaching using metaverse technology in 2023. – CoachHub Global Director of Consulting Sam Isaacson

Start Thinking Ahead to Cybersecurity Concerns in the Metaverse: The metaverse, digital twins, and similar advanced technologies will present new security challenges for organizations and individual users. Artificial intelligence solutions will be needed to validate the legitimacy of identities and controls. When we think of the metaverse today, we often envision immersive gaming environments such as Fortnite. However, the metaverse will eventually reach beyond gaming into nearly all aspects of business and society. This new type of digital interface will present unforeseen security risks when avatars impersonate other people and trick users into giving away personal data. We are already seeing significant attack patterns that compromise users who click on a bad file or a malicious link. It could be a credential-harvesting ploy conducted through a spoofed URL, or a social engineering attack launched through a natural language message that triggers malware or ransomware. Then there are doctored videos of synthetic media, “deep fakes,” which can cause viewers to question whether someone or something they see is real or fake. We also find this trend with digital twins that allow users to conduct physical facility maintenance remotely through a digital environment. We can expect to see more of these holographic-type phishing attacks and fraud scams as the metaverse develops. In turn, folks will have to fight AI with stronger AI because we can no longer rely solely on the naked eye or human intuition to solve these complex security problems. – Patrick Harr, Chief Executive Officer, SlashNext

Digital and synthetic twins take center stage: The next generation of the analytics life cycle will see a focus on simulating complex systems to help prepare for any possible scenario or disruptive event with digital and synthetic twins. Introducing rare events into our modeling and simulation will be key to understanding the highest probabilities of outcomes when the past is not a predictor of the future. From there, businesses can make rapid and resilient decisions to minimize risk and maximize profits. – Bryan Harris, Executive Vice President and Chief Technology Officer, SAS
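The rare-event simulation idea can be sketched with a toy Monte Carlo run: inject a low-probability disruption into an otherwise ordinary model and look at the tail of the outcome distribution. All distributions and probabilities below are illustrative assumptions, not any SAS product:

```python
import random

def simulate_quarter(rng, disruption_prob=0.02):
    """One simulated quarter of revenue for a toy business twin."""
    revenue = rng.gauss(100, 5)          # ordinary variation
    if rng.random() < disruption_prob:   # rare event, e.g. a supply shock
        revenue *= 0.5                   # assumed 50% impact
    return revenue

rng = random.Random(42)                  # fixed seed for repeatability
runs = [simulate_quarter(rng) for _ in range(10_000)]
worst_1pct = sorted(runs)[len(runs) // 100]
print(f"1st-percentile revenue: {worst_1pct:.1f}")
```

Without the rare-event injection, the 1st percentile would sit near the bottom of the normal band; with it, the tail reveals the disruption scenario, which is the point of stress-testing a twin when "the past is not a predictor of the future."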

Digital twins will save thousands of lives in 2023: Improvements in IoT sensors, AI/ML, and 3D printing mean that digital twins, as a facet of synthetic biology, have arrived to make a major impact in the lives of thousands of humans – and animals. We now have the technology to fully model biological organs, recreating anything from a heart to a nose to skin in cyberspace. Not only does this enable doctors to more quickly diagnose current diseases and predict possible future health outcomes, but it also means that we are getting closer to ending animal testing, forever. Using a digital skin, for instance, a cosmetics company could test new skincare solutions for toxicity in cyberspace, instead of testing solutions on animals in a lab. Between the impact on humans and animals, digital twins will make a measurable impact to improve lives in 2023. – Frank Diana, Principal Futurist at Tata Consultancy Services

In the next five years, technologies like the metaverse, AI and automation, and biometrics will make huge leaps, and customers will expect their financial institutions to engage with them using these technologies. In order to meet them there, banks must be thinking now about how to maximize these opportunities, and they must be prepared to adapt to change more quickly than ever before both culturally and technologically. Those who choose to invest in resources that identify and solve problems such as removing bias in machine learning will come out the other end ready to capture the market and prevail as the real winners. – Chief Transformation and Operations Officer at Arvest Bank, Laura Merling

I don’t believe in the winner-take-all metaverse future. It will likely be a mix of companies and platforms that make up the metaverse, just like social media. You’re going to have different levels of immersive experiences, similar to how TikTok and Instagram require different forms of content. These future subtleties will require a metaverse brand manager. – Sami Khan, CEO of ATLAS: EARTH

For now, we are far from the version of the Metaverse that will give us an immersive digital world to traverse. Instead, I see the use of augmented reality (AR) and virtual reality (VR) increasing drastically in the next few years. These extended reality experiences can help tech teams to learn and innovate by augmenting human vision and the human mind. Things like Google Glass were ahead of their time in this regard, but I do see there being a place for these types of technologies to become common for technology teams. – Pluralsight CEO Aaron Skonnard

Deep Learning Meets the Metaverse: The sky’s the limit when it comes to the benefits of deep learning. It could be applied to anything that requires large amounts of data and decision making with high levels of accuracy. One such example that’s anticipated to explode in coming years is the Metaverse. Given the massive amounts of personal and valuable data involved, security will be a paramount concern and deep learning technology will be an extraordinary tool that can be used to help mitigate any security issues along the way. – Matthew Fulmer, Manager of Cyber Intelligence Engineering at Deep Instinct

Web access will bring more visitors into the metaverse: In 2023, we will see more virtual worlds vie for your attention on the browser. One of the biggest blockers to the proliferation of the metaverse is accessibility. VR adoption is still in its infancy and many users are unwilling to download dedicated apps when their attention could be spent elsewhere with significantly less effort (read: friction). This presents a conundrum for metaverse developers, as the potential for more genuine online interpersonal interactions is an oft-touted value proposition for spending time in virtual worlds. Although some examples of web-based 3D virtual worlds are out there, we expect that 2023 will see established and new players offering browser access to their metaverse destinations to grow their user bases. – Shawn Zhong, CTO at Agora


Natural Language Processing and Computer Vision Will Play an Important Role: Enterprise adoption of automated processes involving text or voice data, using Natural Language Processing (NLP) and Computer Vision (CV) technologies, will grow substantially in 2023. Large language models with high complexity will increase the sophistication of NLP applications. For example, AI-based virtual assistants are becoming essential to most organizations’ customer service lifecycle and engagement strategies. These allow customers, vendors, and employees to ask questions that can be easily answered through automated processes, as in a chatbot. But there are more sophisticated uses as well. For instance, broadcast editors who used to struggle to match timestamps with subtitles for a newly posted video can now utilize NLP and context analysis to provide subtitles and generate near-perfect translations. When designing a solution, recommendation and search engines are powerful tools for surfacing relevant content. With CV and NLP, it is now possible to scan documents and retrieve relevant information instantaneously. AI has also empowered quality assurance teams by analyzing inputs, outputs, and simulated data for anomalies. Based on wide data (data drawn from multiple sources), AI can also help predict business outcomes, allowing companies to make rapid decisions. In addition, NLP-based systems help organizations meet regulatory compliance requirements. – Anand Mahukar, CEO, Findability

SQL workloads will explode as more NLP (Natural Language Processing) and other Machine Learning (ML) applications generate SQL: While data analysts and scientists continue to uncover insights using SQL, increasingly we’ll see apps that “speak SQL” drive a large portion of the analytical compute. Natural Language Processing (NLP) applications are enabling citizen data analysts and demand more compute on data platforms. Similarly, ML applications can dig into datasets in new ways which will blow through today’s level of demand for analytic compute. SQL is not only the ‘lingua franca’ of data analysis, SQL is the ‘lingua franca’ of ML and NLP too. –  Steven Mih, Co-founder and CEO, Ahana
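As a toy illustration of applications that “speak SQL,” the sketch below maps two simple English question shapes onto SQL statements. The question patterns, table names, and column names are purely illustrative assumptions; production NL-to-SQL systems use large language models rather than regexes, but the input/output contract is the same:

```python
import re

def nl_to_sql(question: str) -> str:
    """Toy rule-based translator for questions like
    'how many orders where status is shipped' or 'show all users'."""
    q = question.lower().strip()
    # Counting questions, with an optional single equality filter
    m = re.match(r"how many (\w+)(?: where (\w+) is (\w+))?$", q)
    if m:
        table, col, val = m.groups()
        sql = f"SELECT COUNT(*) FROM {table}"
        if col:
            sql += f" WHERE {col} = '{val}'"
        return sql + ";"
    # Full-table listing questions
    m = re.match(r"show all (\w+)$", q)
    if m:
        return f"SELECT * FROM {m.group(1)};"
    raise ValueError("unsupported question shape")
```

Each natural-language question the app accepts becomes one more SQL query hitting the data platform, which is the compute-demand effect described above.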

Growing acceptance of hybrid NLP: It’s fairly common knowledge that hybrid NLP solutions, which combine classic NLP techniques such as white lists, queries and sentiment dictionaries with deep learning models, typically provide better business solutions than pure machine learning approaches. The benefit of these hybrid solutions means that they will become a checkbox item in corporate evaluations of NLP vendors. – Jeff Catlin, Head of Lexalytics, an InMoment Company

Dynamic data searching will become essential: Organizations will prioritize the management of unstructured conversation content. As the volume of dynamic chat, voice, video, and text data exponentially increases, accessing, searching, and retrieving this data will be more important than ever. Whether for forensics, financial reporting, internal investigations, or litigation, the ability to understand the contents of complex communication data, organize it, and retrieve it will need to become a core competency. Any inability to manage data could have profound regulatory and other, more serious, consequences. – Devin Redmond, CEO and co-founder, Theta Lake

Natural language processing (NLP) + object recognition will bring search to the next level: While most people write scrapers today to get data off of websites, this may soon be replaced by further advancements in NLP. You will just have to describe in natural language what you want to extract from a given web page, and it will pull it for you. For example, you could say, “search this travel site for all the flights from San Francisco to Boston and put all of them in a spreadsheet, along with price, airline, time, and day of travel.” It’s a hard problem, but we could actually solve it in the next year. As another example on the healthcare side, I think we’ll be able to predict — automatically — the notes and documentation a doctor might write for a given diagnosis or treatment, which would be a huge achievement. It could save healthcare workers valuable time. Generally, we’ll be able to tie object detection — where we train algorithms to predict what is in an image — to natural language processing. This would be a big step forward, as it will allow us to simply describe what output we want and it would figure out how to build a classifier to deliver it. For example, you could say “does this image contain an animal with four feet and a tail?” and that would be “programming” a classifier. While we can do this to some extent now, it will become more advanced in the coming year and allow us to go one level deeper — only describing attributes of what we want to find rather than providing labeled examples of the object itself. We may also develop new methods for combining prompt engineering and supervised labeled examples into a coherent whole. – Varun Ganapathi, Ph.D., CTO and co-founder at AKASA

Large Language Models for NLP (like BERT, GPT, and derivatives) will keep improving, and their use will become more pervasive. One pretrained model will also be usable, with little modification, for many functions (sentiment analysis, summarization, word sense disambiguation, etc.). Transformers, text-to-image, and diffusion models require large-scale datasets, and supervised pre-training of large models is extremely expensive. Self-supervised Contrastive Learning, whose aim is to learn representations of data by contrasting similar and dissimilar samples, can be used to leverage vast amounts of unlabeled data in order to efficiently pre-train large models. Furthermore, contrastive search is a related technique that has been shown to significantly improve the output of large language models on text generation tasks. – Adi Andrei, Head of Data at SpaceNK
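To make the contrastive idea concrete, here is a minimal NumPy sketch of the InfoNCE loss commonly used in contrastive learning: each anchor embedding is scored against every candidate in the batch, and cross-entropy pushes it toward its own positive and away from the rest. The batch size, embedding dimension, and temperature below are illustrative, not from any particular paper:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss over a batch of (anchor, positive) pairs.
    anchors, positives: (N, D) embedding matrices; row i of each is a pair."""
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature              # (N, N) similarity matrix
    # The matching pair for row i sits on the diagonal; treat column i as its label
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Because the "labels" come from the pairing itself (e.g. two augmentations of the same sample), no human annotation is needed, which is what lets this objective exploit vast unlabeled corpora.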


Observability is the most under-recognized technology to advance cybersecurity solutions: We’re starting to realize that understanding an organization’s data makes a difference in being able to thwart cyberattacks and prevent breaches. That’s why observability has the biggest opportunity to change how we advance cybersecurity detection and remediation. Through observability solutions, security teams are empowered to take action and tie technology performance to specific business outcomes. By layering observability with identity management, security teams have access to more data on identity-based threats to remediate incidents in real time and improve their security defenses. – Dan Conrad, AD Security and Management Team Lead, One Identity

Most of the AI trends now have to do a lot with Natural Language Processing, Transformers, the attention mechanism, and how you can use it to make your AI models better. As this evolves, businesses will make better sense of all their stored textual data and pursue the objective of being a data-driven organization. If businesses found themselves collecting more and more raw, organic communications data in the past years, 2023 will see a surge of AI tools that help businesses convert their company data into actionable insights. – Erudit’s Chief Science Officer & Co-founder, Ricardo Michel Reyes

Natural Language Understanding Will Become Part of AI Models: In 2023 we will start to see natural language understanding become possible for AI applications. There will be a transition from simple pattern matching to language understanding within the underlying model. By starting with taxonomies, ontologies, speech technology and new rule based approaches – it will be possible to take natural language understanding and instantly turn it into triples that describe the pragmatics of the world. These triples become the underlying ontological description of the world, which is essential to produce high-quality AI using natural language. – Jans Aasman, Ph.D., an expert in Cognitive Science and CEO of Franz Inc.

Data observability will become a critical industry: In today’s economy, it’s critical to constantly calculate ROI and prioritize ways that we can do more with less. I believe engineering teams have an opportunity to lean in and work towards increasing the capacity of the company to win. I predict that we’ll increasingly see engineers and data teams becoming facilitators of enabling companies to make data-driven decisions by building the infrastructure and providing tools needed to enable other teams (especially non-technical teams). One of the ways they’ll enable this shift is to help teams understand how to access their data in a self-serving manner, rather than being constantly at the center of answering questions. Instead of hiring more data scientists, I expect data teams to increase data engineering roles in order to build lasting infrastructures that enable folks on all sides of the business to answer questions independently. – Shadi Rostami, SVP of engineering, Amplitude

Save all data, or delete what ages out? That’s every data management policy’s challenge. In today’s world, everyone is watching their budgets while looking for creative ways to cut costs and maintain productivity. In 2023, data observability — one of the most valuable ways for companies to derive insights from data to drive critical business decisions — will improve across collection, retention, summarization, and context building to enable organizations to do a lot more with less. – Kit Merker, Chief Growth Officer, Nobl9

Having deep observability coupled with MELT (metrics, events, logs, traces) can help discover rogue activities coming from users, cloud app developers, or bad actors, such as running P2P applications or crypto-mining servers. In fact, Google’s recent Threat Horizons report shows that once threat actors gained access to a system, 65% of them engaged in crypto-mining, so spotting such activity can be mission-critical for security teams in the year to come. MELT has been the staple tool for data ingestion and application monitoring for two decades now, and these capabilities have made their way into observability tools for cloud workloads. As the architecture and landscape become increasingly complex in 2023, complementing MELT with the “network perspective” that deep observability provides will give security teams more insight into observability reporting and monitoring capabilities. – Bassam Khan, Vice President of Product and Technical Marketing Engineering, Gigamon
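As a hypothetical sketch of how a network perspective can complement MELT signals, the heuristic below flags hosts that combine sustained pegged CPU (a metric) with connections to ports commonly associated with mining pools (network context). The port list, field names, and thresholds are illustrative assumptions, not rules from any vendor product:

```python
# Illustrative stratum-style ports often seen in mining-pool traffic
MINING_POOL_PORTS = {3333, 4444, 14444}

def flag_cryptomining(hosts):
    """hosts: list of dicts with 'name', 'cpu_samples' (percent utilization
    readings), and 'dst_ports' (destination ports of outbound connections).
    Returns names of hosts matching both the metric and network signals."""
    flagged = []
    for h in hosts:
        sustained_cpu = min(h["cpu_samples"]) > 90          # pegged across every sample
        pool_traffic = bool(set(h["dst_ports"]) & MINING_POOL_PORTS)
        if sustained_cpu and pool_traffic:
            flagged.append(h["name"])
    return flagged
```

Neither signal alone is conclusive (batch jobs peg CPUs; ports get reused), which is the argument for correlating metrics with network telemetry rather than alerting on either in isolation.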

Observability: What companies can expect to see in 2023: Water systems make an excellent analogy for understanding the world of data. Data lakes are enormous centralized repositories, while data streams flow freely at high speeds and high volumes. The water analogy also extends to observability, where teams analyze an ever-growing number of logs, metrics and traces to ensure systems are working at their peak levels. Just as the water gets analyzed and tested for safety in two places (at the reservoir and in the pipelines), observability data needs to be analyzed and acted on while it is still flowing through the pipeline. Waiting for the data to arrive in the data lake is insufficient, and failure to maintain constant insight into the quality of water (and data) could lead to disaster. As businesses in every industry become more data-driven, the amount of data generated will continue to increase. The IDC Global DataSphere, a measure of how much new data is created, captured, replicated, and consumed each year, is expected to double in size from 2022 to 2026. That’s leading to a growing demand for observability tools and some emerging best practices for implementing them. Here’s what 2023 has in store for observability professionals. – Eduardo Silva, founder and CEO of Calyptia and the founder and creator of FluentBit

Observability, security, and business analytics will converge as organizations begin to tame the data explosion: The continued explosion of data from multicloud and cloud-native environments, coupled with the increased complexity of technology stacks, will lead organizations to seek new, more efficient ways to drive intelligent automation in 2023. It’s not just the huge increase in payloads transmitted, but the exponential volumes of additional data, which can be harnessed to gain better observability, enhanced security, and deeper business insights. However, the prevalence of siloed monitoring tools that offer insights into a single area of the technology stack or support an isolated use case has impeded progress in accessing this value, making it difficult to retain the context of data. It also results in departmental silos, as each team remains focused on its own piece of the puzzle, rather than combining data to reveal the bigger picture. To address this, observability, security, and business analytics will converge as organizations consolidate their tools and move from a myriad of isolated and hard-to-manage DIY tools to multi-use, AI-powered analytics platforms that offer BizDevSecOps teams the insights and automation they need. This will help to tame clouds and the data explosion and drive intelligent automation across multiple areas, from cloud modernization to regulatory compliance and cyber forensics. – Dynatrace founder and chief technology officer Bernd Greifeneder

Data Observability investments are immune to the economic environment: While inflation and recession hit the global economy and stock markets plunge, funds will keep investing in the data observability industry, which is already worth billions in valuation. This new category of solution focuses on helping data teams better understand data usage and troubleshoot data incidents faster, saving money and ensuring data teams are more efficient and free to focus on revenue-generating initiatives. – Andy Petrella, founder and CPO of Kensu

Observability will become the watchword in 2023: AI, ML and observability solutions that take AI to the next level, so that organisations are getting more actionable insights and predictions out of their data for improved reporting and analytical purposes, will be paramount. This is where we will see true AI solutions shine – ones that bring in intelligence and richer observability instead of simple monitoring. Observability is more about the correlation of multiple aspects, context gathering and behavioural analysis. Observability correlation enables applications to operate more efficiently and identify when a site’s operations are sub-optimal, with this context delivered to the right person at the right time. This means a high volume of alerts is transformed into a small volume of actionable insights. Without a doubt 2023 will be a challenging year, but there will also be opportunity for innovation and growth in certain sectors.  This is where working with a cost-effective partner will be critical; a trusted partner that can rapidly pivot, innovate and adapt as requirements and market conditions evolve. – Mark Cooke, COO of Xalient

Quantum Computing

The crossover from quantum to classical computing is going to yield new techniques in data science. In particular, the transition from discrete linear algebra as the underpinning math of neural nets to more algebraic topology and quantum foundations could create a third wave in AI! – Moogsoft co-founder and CEO, Phil Tee

Quantum Matures and Consolidates: Companies will merge to fill gaps in their hardware and software offerings and seek firmer financial footings. We’ll say goodbye to some familiar names, but rather than representing a Quantum Winter, it will indicate a necessary maturation and evolution of the industry. – Bob Sutor, Vice President and Chief Quantum Advocate at ColdQuanta

Building the Foundation for Quantum: After decades-long hype around quantum computing and quantum systems, the industry will start to realize its potential for creating new opportunities in fields spanning cybersecurity, materials creation, financial analysis and military receivers. Proactive companies will start investing in quantum, fostering quantum talent within the next generation of workers through university partnerships, hackathons and other projects. This will create an ancillary boost to DEI initiatives resulting in much-needed diversity in the tech workforce. Recent research revealed that 74% of companies believe they will fall behind if they fail to adopt quantum. As a result, organizations will begin to shift their thinking that quantum is a futuristic technology and begin addressing key challenges, including financial resources and operations, and developing real enterprise applications of quantum by 2026, if not sooner. – Dr. Eric Holland, Director of Quantum Engineering Solutions, Keysight Technologies

Quantum Sensing Will Support AI – and Vice Versa: Machine learning will be used to optimize the performance of quantum sensors, while quantum sensors will enable new classes of machine learning algorithms for discovery within, and adaptation to, the sensors’ environment. Very different from the Big Data applications of machine learning and quantum computing, machine learning together with quantum sensing will bring about new capabilities in real-time sensing and signal processing. – Bob Sutor, Vice President and Chief Quantum Advocate at ColdQuanta

An exciting trend in AI is quantum computing, which has been in development for around seven years but will find its applications over the next 25 years or so. Quantum computers are even more powerful than supercomputers at solving complex problems. Today’s computers started development in the 1940s, but look at us now! Quantum computers will do the same for the future. – Erudit’s Chief Science Officer & Co-founder, Ricardo Michel Reyes

Quantum implications are here and will be painful to adapt to in 2023: Making infrastructures quantum-resilient is going to be more difficult than imagined, both for the public and private sectors. One major area of concern when it comes to quantum is national security. Governments have secrecy policies that last for decades…those policies are going to be threatened by quantum computing as the technology evolves, with much of the information under these policies being transmitted (and potentially captured in encrypted form) with algorithms that may not be quantum safe. Within the next 5-10 years, quantum technology will likely become commercially available, making it a very real threat to past and outdated encryption algorithms – many of which are used to conceal the nation’s top secrets. Quantum computing is going to be able to overcome complex roadblocks at speeds that will render multiple forms of current encryption useless. For the private sector, trade secrets, intellectual property, financial data and more are at the same risk if a bad actor gets their hands on quantum computing capabilities and breaks the encryption keeping critical assets under lock and key. Building cyber resilience in preparation for quantum technology should have been an effort started a decade ago…but now is the second best time. In 2023, we’ll see both the private and public sector’s increased awareness around the challenges associated with quantum resilience, and we’ll see efforts begin to take hold more significantly to prepare for quantum computing. Much of the encryption infrastructure in communication networks that keeps information safe now is deeply embedded, i.e., certificates, and will take years to transition to quantum resilient algorithms, posing a timeline issue for changeover before the general availability of quantum computing. – Chief Information Security Officer of (ISC)² Jon France 

Preparedness for a post-quantum future: Given the proliferation of data today and the rise of data breaches, security is top of mind for everyone, everywhere. As a result, in 2023 more attention will be given to the threats and the implications of a future quantum world. For enterprises, specifically, that means starting to ensure your infrastructure is “quantum-resistant” by taking measures now to secure your networks, protect your backend services, and so on. CISOs will begin to take notice and put the necessary processes in place. Practical solutions are still somewhat tricky as standardization of post-quantum cryptography is ongoing. CISOs need to be diligent and opt for “hybrid” deployments, where even if new algorithms are discovered to be weak, security against classic attacks is still ensured. – Yaron Sheffer, VP of Technology at Intuit 

With quantum computing gaining traction, and both governments and businesses investing heavily in quantum research, we’ll witness computing power that enables us to solve complex problems previously thought “unsolvable” in a matter of minutes. From new drug discovery, to weather predictions, to easing traffic congestion, the practical use cases in certain industries will be game-changing. – Hillary Ashton, Chief Product Officer, Teradata

RPA, Automation, Robotics

The Rise and Fall of Everything-as-Code. In 2023, as budgets likely continue to tighten, a trend will emerge towards seeking optimization and productivity. Rather than continuing to grow teams, companies that are forced to do more with less will look towards ways to automate data processes that they once did manually. That is good news for platforms and tools that enable automation, are simple to use, and free up time spent on repetitive tasks to focus instead on creating impact for the business. – Satish Jayanthi, CTO and co-founder of Coalesce

In 2023, organizations will lean heavily on process orchestration tools to meet full automation potential: Business processes will continue to grow more complex at the same time that the number of endpoints involved increases. Legacy systems, microservices, manual tasks, RPA bots, AI/ML tools and IoT devices that already adequately automate individual tasks in a process must be reconciled. In order to guarantee that these various tasks run smoothly within a process and can be appropriately analyzed and optimized, process orchestration tools will be critical. This is because they coordinate the end-to-end process and integrate a wide variety of endpoints. If companies don’t manage to orchestrate their processes end-to-end, they only automate and optimize locally and don’t exploit the full potential that automation offers. In addition, process orchestration supports companies in gradually migrating from legacy systems to modern, microservice-based architectures. A good orchestration tool is software- and device-agnostic, works within an organization’s existing tech stack, and allows individual tasks to be gradually automated outside of a legacy system. Another trend is the increased use of “low-code” in process orchestration. Low-code tools are typically applied to automate simple processes. A smarter way of doing low-code is to use flexible and extensible tools, often in a domain-specific way, which make it possible to apply low-code to more complex scenarios in process orchestration, counteracting the lack of skilled software developers for core and mission-critical processes. – Bernd Ruecker, Co-Founder and Chief Technologist, Camunda
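The core job of a process orchestrator, coordinating tasks end-to-end so each runs only after its prerequisites, can be sketched in a few lines. The task names and dependency map below are illustrative; real orchestration engines such as Camunda add state persistence, retries, human tasks, and endpoint connectors on top of this basic idea:

```python
def run_process(tasks, deps):
    """tasks: {name: callable}; deps: {name: [prerequisite names]}.
    Runs every task exactly once, each after its prerequisites
    (depth-first topological order; no cycle detection in this sketch)."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for prereq in deps.get(name, []):
            run(prereq)              # ensure upstream tasks finish first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order
```

For example, an extract → transform → load chain declared as `deps = {"transform": ["extract"], "load": ["transform"]}` always executes in that order regardless of how the tasks are listed, which is the "end-to-end coordination" an orchestrator provides over locally automated steps.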

Artificial intelligence has continued to rise in adoption across many types of technologies and industries over the past year. Looking forward to 2023, there are many exciting AI use-cases that will continue to gain popularity, and AI’s convergence with robotic process automation (RPA) to generate intelligent automation is near the top of the list. RPA is already a common technology implemented in many businesses; however, intelligent automation, or the intersection of AI and RPA, empowers teams to accelerate their overall digital transformation with quicker speeds, sharper insights, and less stress. Typical RPA software bots are programmed to execute certain tasks, so that’s what they do — end of story. With the infusion of AI, the bots are able to automate certain predictions and decisions based on both structured and unstructured input. Essentially, intelligent automation up-levels RPA’s ability to work in tandem with humans — perceiving, learning, and anticipating processes from the available data to create smarter and more efficient outcomes. Additionally, data captured from intelligent automation can then be utilized by different departments who become more connected across their entire tech stack. In 2023 I think we’ll see more organizations across various industries dive into intelligent automation as they fully appreciate its value proposition. – Dave Dabbah, CMO, Robocorp

Rise of AI-Assistance: Working Smarter, More with Less: Software architects don’t have a lot of tools designed for their needs. Sure, there are a plethora of lower-level developer tools, IDEs, graphing, and profilers that most architects grew up on, but purpose-built tooling that truly helps an architect, well “architect,” really doesn’t exist – to do things like help identify architectural dependencies, recognize natural domain service clusters, define service boundaries and entry points, split services, build commons libraries, and recognize architectural drift. That’s where AI-assistance comes in, much like robot-assisted surgery, to help the expert do their actual job faster, smarter, with lower risk, more efficiently, and much more precisely. – vFunction Chief Ecosystem Officer, Bob Quillin 

2023 will see more organizations relying on existing processes as opposed to architecting new ones: Process automation will begin focusing on optimizing existing processes, rather than designing or architecting new ones. Companies should focus on auditing processes, services, and data to run faster and leaner. Transitioning to reusable components and connector/plugin architectures allows for fast implementations and integrations. One size doesn’t fit all. Any out-of-the-box solution needs to be flexible, with availability of, for example, APIs, SDKs, and other programmatic implementations. – Amara Graham, Head of Developer Experience, Camunda

The future is long-term workforce shortages across industries, not a recovery – organizations will need to understand how to combat workforce shortages using automation for redundant tasks. With one in five Americans being 65 or older by 2030, and a workforce shortage caused by COVID as well as the Great Resignation, organizations need to prepare now for a continued, long-term effect on the workforce. This year organizations already adjusted their workforce models to support these shortages, but next year we will begin to see more implementation of digital automation. With the growth in digital channels, organizations will have to deal with many more customer interactions without growing their headcount. This is where AI can help. Instead of being viewed as the enemy that is replacing an employee, automation will help deal with this extra workload by centralizing and streamlining simple tasks in a self-service mode. Organizations can also use automation to surface relevant insights and guidance in real-time to help employees handle tasks and support customers more efficiently and effectively. – Brett Weigl, SVP & GM, Digital and AI, Genesys

The last of the data-generating or data-consuming companies that haven’t already adopted AI will do so next year. In an era where data is growing so fast, a business will become obsolete if it does not have tools to automate repetitive decisions, process internal data, and/or take advantage of external data. At the end of the day, the role of automation is not only to accelerate existing processes, but to enable the maximum potential of human productivity. In 2023, when a turbulent economic climate will continue to force enterprises to reduce their workforces, intelligent automation will mitigate the onus on remaining talent, transforming operations and creating more engaged employees. Moreover, companies will see incredible value on the customer side, with intelligent automation enabling heightened demand predictive capabilities and more efficient data pipelines. The key to adopting this critical technology is to ensure all users understand how the automated decisions are being made, creating trust in the system and optimizing implementation. – Farshad Kheiri, Head of AI and Data Science at Legion Technologies

Very similar to what we saw at the start of the pandemic, the 2023 recession environment will force organizations to figure out how to scale through technology like automation and AIOps and not through headcount. As companies implement hiring freezes and are forced to work with flat budgets, in addition to cutting staff, companies must identify ways to support existing employees and create a less stressful work environment for their IT, SRE and DevOps teams to avoid employee burnout. Effective, automated solutions that address these challenges will become a must-have. – Mohan Kompella at BigPanda

Automation rewriting automation: 47% of developers don’t have access to the tools they need to build applications fast enough to meet deadlines. We can expect the next wave of automation to automate its own development to fill this gap. Code will be written by AI engines, intelligently generating its own code. We’ll see more maturity, more time saved (cutting down development time by 90%), fewer errors and faster development. – Prasad Ramakrishnan, CIO, Freshworks

Automation’s Silent Revolution Becomes Loud and Mainstream: Automation has been a “silent revolution,” with companies slowly increasing adoption over time. Amidst the global economic downturn, adoption of automation will be fast tracked as companies look to cut costs while boosting productivity and collaboration. In 2023, automation will burst on to the scene as a key solution for businesses to scale and expand without breaking the bank. – Aytekin Tank, CEO, Jotform

Security

Conversations about data security are expected and highlighted during Cybersecurity Awareness Month, but it’s critical that they continue every month, throughout every year, into the foreseeable future. Companies across every vertical are trying to become more data-driven and to democratize data within their organization, but that’s difficult to do securely if you have to constantly move data around to meet business objectives—moving data sets from your data lake into a data warehouse, then subsets into BI extracts, creating cubes and lots of copies. Companies lose visibility into who’s accessing which datasets. Worse still, since access controls don’t necessarily travel with data copies, it’s likely that those aren’t being adequately protected over time, which exposes data and creates risk. It’s imperative for enterprises to shift gears and implement an open and secure data architecture, where data is its own tier. With a lakehouse, companies no longer need to copy and move data from object storage into data warehouses to analyze it. Lakehouses bring engines directly to the data—and allow processing and analysis against a single source of truth. This gives data teams full visibility into who’s accessing what data in their lake, so they don’t have to worry about rogue copies of data that could threaten corporate security and governance rules. In turn, companies can take advantage of all the technical innovation that’s happening with data lakehouses and offer their business units genuine self-service with data. – Tomer Shiran, Co-Founder and CPO, Dremio   

Data Security Strategies: As rising international tensions lead to more frequent cyberattacks and more laws take effect that impose harsh penalties for mishandling data, executives in charge of data will begin to interrogate the assumptions on which they’ve built data security strategies. This will include creating a comprehensive inventory of the data they have on hand, questioning whether the data they store is necessary to accomplish their goals, and raising awareness of data security best practices throughout the organization. – Stephen Cavey, co-founder and Chief Evangelist of Ground Labs

Weaponizing deepfakes: In October 2022, a deepfake of U.S. President Joe Biden singing ‘Baby Shark’ instead of the national anthem was circulated widely. Was this a joke, or an attempt to influence the important U.S. mid-term elections? Deepfake technology will be increasingly used to target and manipulate opinions, or to trick employees into giving up access credentials. Deepfakes will go mainstream, with hacktivists and cybercriminals leveraging video and voicemails for successful phishing and ransomware attacks. – Mark Ostrowski, Office of the CTO, Check Point Software

Most organizations will continue to tackle data security with a patchwork of technology that covers only small portions of their data estate, leaving the majority of data unmonitored and unprotected. Meanwhile, regulations will continue to expound on the need for specific types of data monitoring and fundamental protection for that data, further exacerbating siloed and targeted security practices. – Terry Ray, SVP Data Security, Imperva

Deep Fakes Replicate Digital Humans – The Digital DNA Theft: In 2023, deepfakes will become so authentic that not only will we see our digital identities being stolen, but also digital versions of our DNA. Exposing our digital DNA on the internet will enable deepfakes to replicate and create digital humans. If you have ever seen the movie “The 6th Day,” we are on the same path toward replicas of our digital selves. Humans sync our physical lives to social media with constant uploads of photos, videos, audio and personal preferences; with enough data points and some enhanced algorithms, it is only a matter of time before attackers can create lifelike digital avatars of anyone, and it will be incredibly difficult to identify the difference without technology to analyze the source data. – Joseph Carson, Chief Security Scientist and Advisory CISO at Delinea

Using AI and machine learning to combat ransomware attacks can help strengthen a company: Ransomware, phishing attacks, and data breaches have become all too familiar among organizations, and while these attacks are not new concerns, they have taken and will continue to take their toll on industries. What’s more, bad actors show no sign of stopping. To combat these ongoing and evolving attacks, AI and machine learning will be beneficial as organizations look toward these tools to navigate another hyperactive cyberthreat landscape in 2023. When implemented, AI can protect individual projects and core ecosystem services, while identifying deployed open-source programs and applying automated security analysis. – Rick Vanover, Senior Director Product Strategy at Veeam

Investing in AI to combat fraudulent and synthetic identities: In 2023, fraudsters will devise new ways to hack into accounts, including new ways to spoof biometrics, new ways to create fraudulent identity documents, and new ways to create synthetic identities. Organizations will need to invest in artificial intelligence (AI) and machine learning (ML) anti-fraud technology to counter new attack vectors and effectively fight fraud as hacker methods evolve. – Ricardo Amper, CEO and founder of Incode  

AI will completely transform security, risk and fraud: We’re seeing AI and powerful data capabilities redefine the security models and capabilities for companies. Security practitioners and the industry as a whole will have much better tools and much faster information at their disposal, and they should be able to isolate security risks with much greater precision. They’ll also be using more marketing-like techniques to understand anomalous behavior and bad actions. In due time, we may very well see parties using AI to infiltrate systems, attempt to take over software assets through ransomware and take advantage of the cryptocurrency markets. – Ashok Srivastava, Senior Vice President & Chief Data Officer at Intuit

CISOs Tackle Security in The Data Lake: Now that companies recognize the value of data, they are keeping all of it, just in case, and their preferred storage is the data lake. With various analysts and data scientists accessing the data lake for so many purposes, it is harder than ever to configure permissions. Who can take what action on data in the lake?  The folks in IT who receive requests for access feel pressure to respond. After hundreds of requests, it’s hard to not start clicking “allow.” The inevitable result is that companies will collect “access debt.” In 2023, CISOs will start looking for ways to monitor and trim the access debt. – Tarun Thakur, Co-Founder & CEO of Veza
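The “access debt” idea above lends itself to a simple audit: compare every standing grant against when it was last exercised. The sketch below is a minimal, hypothetical illustration (the data structures and the 90-day staleness window are assumptions, not any vendor’s actual product logic):

```python
from datetime import date, timedelta

def find_access_debt(grants, access_log, today, stale_days=90):
    """Flag grants unused for at least `stale_days` days as access debt.

    grants: set of (user, dataset) pairs currently allowed.
    access_log: dict mapping (user, dataset) -> date of last access.
    Grants that never appear in the log count as debt too.
    """
    cutoff = today - timedelta(days=stale_days)
    debt = []
    for grant in sorted(grants):
        last_used = access_log.get(grant)
        if last_used is None or last_used < cutoff:
            debt.append(grant)
    return debt

grants = {("ana", "sales_lake"), ("bob", "sales_lake"), ("bob", "hr_lake")}
log = {("ana", "sales_lake"): date(2023, 1, 10),
       ("bob", "sales_lake"): date(2022, 6, 1)}  # bob's hr_lake grant: never used
stale = find_access_debt(grants, log, today=date(2023, 1, 15))
# stale now lists bob's unused grants, which a CISO could review or revoke.
```

In practice the grant list would come from an entitlements system and the log from audit trails, but the review loop is the same: surface what is allowed but never used.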

APIs Are Data Pipelines That Will Attract More Attackers: While traditional databases allow users to find, store and maintain data, application programming interfaces (APIs) enable users to access and review the data as it transfers between the company, customers, and third parties. Software code has come under attack in innovative and deeply troubling ways as APIs have become the critical pipeline in modern organizations, and because of this, we can expect to continue to see API hacking as a major threat vector when it comes to critical data. Whether it be through a mobile application or website, APIs interact with business logic and allow adversaries to understand exactly how a company is processing information and data, making APIs a major area of vulnerability for organizations. We expect 2023 to be the year that the risk becomes so apparent that companies can no longer ignore it. – Dor Dankner, Head of Research at Noname Security

“Generative AI” will become a buzzword in the cyber security space: “Generative AI” will transform the AI use cases from “classification, translation, recommendation” to “creation”. In the cybersecurity world, this means threats are even more personalized and more deceiving, but at the same time it means we will have more creative tools to help us to combat the bad guys. – Howie Xu, Vice President of ML/AI at Zscaler

Data context-driven automation will emerge as a priority for organizations looking to mature basic AIOps into more precise AISecOps: Organizations will increasingly realize that to be effective, the platforms they use to automate software delivery pipelines and support AIOps need to be data context-driven. That means they need the ability to unify data in a single source of truth, where it can be transformed into precise answers and intelligent automation. This will be key to ensuring the AI that powers automation can distinguish between cause and effect to make more intelligent and timely decisions. However, organizations are struggling to maintain this context as the growing complexity of dynamic cloud architectures and increasingly distributed digital journeys has led to an explosion of data and disparate analytics tools. In the coming year, organizations will seek to address this by shifting their focus from consolidating tools to drive efficient AIOps, to embracing platforms that support more advanced AISecOps. This will enable them to break down the silos between observability, business, and security data and bring it together with topology and dependency mapping. As a result, they will be able to retain the relationship between data streams and unlock the full context needed to drive more powerful and precise automation, so they can deliver seamless digital experiences. – Dynatrace founder and chief technology officer Bernd Greifeneder

IAM Teams Look to Adopt AI and Machine Learning but Only for Specific Instances: As more enterprises adopt an identity-first approach to their security strategy, they are challenged with how to manage the increasing number of entitlements and permissions connected to applications that live in a variety of environments (on-premises, private cloud, public cloud, etc.) and create a lot of data about events, logs, users and more. In addition, the explosion of demand for cloud infrastructure and entitlement management (CIEM) solutions has resulted in creating more predictable models about users, entitlements and provisions. As a result, in 2023 enterprises that are mature in their implementation of these areas will consider leveraging artificial intelligence or machine learning to further scale these strategies. However, AI and ML adoption by IAM teams will likely remain constrained to those targeted areas as enterprises continue to mature their IAM strategies. – Axiomatics’ Chief Product Officer, Mark Cassetta

It’s never too late for policy to evolve; in 2023, it finally might: As biometric and AI-driven healthcare technologies become more pervasive, we will need a federal policy that governs how personal data is collected, managed, and used. It’s unsettling that mobile app creators can collect health-related data that does not have federal data protection. The current administration has announced new guidelines, though many of these policy updates are incremental steps that don’t go far enough in protecting data. Policy must evolve at the same rate as technology and cyber threats do. With that in mind, any CISO will tell you it’s never too late to mature the current approaches, since the next threat or attack is around the corner. – Chris Bowen, Founder and CISO at ClearDATA

Artificial Intelligence Will Continue to Play a Prominent Role in Detecting Data Breaches: In 2023, more workers will still telework and use personal devices to connect to work networks remotely. Workers connecting to networks with non-secured remote or cloud-based devices may unwittingly fall prey to more phishing attacks and credential hacking. Improved artificial intelligence (AI) algorithms can identify and reduce vulnerabilities in systems with weak security in 2023. Companies that use AI and automation to detect and respond to data breaches have better safeguards. With the most recent and sophisticated algorithms, security vendors can effectively examine the vast amount of data moving across networks in real time. – Philip Chan, PhD, Adjunct Professor, School of Cybersecurity & Information Technology, University of Maryland

Both Structured and Unstructured Data Are at Risk for Theft: In 2022, structured data was more at risk than unstructured data for malicious exfiltration. Attackers targeted structured data used in databases such as Oracle and Microsoft® Azure SQL Server (68%) and for analytics in web platforms such as Databricks (63%). However, attackers also searched for unstructured data used in applications (57%) such as Amazon S3, Microsoft® Azure Blob and created by users (50%) in tools such as Microsoft OneDrive, Microsoft SharePoint, and others. Moving into 2023, attackers will target structured data used for analytics (68%) over that used in databases (62%). They’ll also target unstructured data created by users (58%) over that created by applications (54%) or other sources (16%). Analytics and user data reveal corporate intent, providing a lens into strategies, plans, product launches, partnerships, and other information of interest to attackers, such as nation-states, cybercriminals and more. – Titaniam, Inc.

Security will dominate big data IT buying criteria, including for data storage: Supply chain issues and economic challenges will continue to impact storage projects in 2023 — the exception being those that can show tangible ROI on ransomware protection initiatives. This will present an opportunity for big data storage solutions with the intelligence to address current gaps in multi-level security, detection and data immutability for ransomware protection and fast business recoverability. Moreover, solutions that can provide AI-based anomaly-detection capabilities for detecting ransomware attacks will become more mainstream in the near future. – Giorgio Regni, CTO, Scality
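The anomaly-detection idea behind ransomware alerts in storage can be illustrated at its simplest: a ransomware infection typically rewrites files far faster than normal workloads, so a rate that sits many standard deviations above a fleet baseline is suspicious. The following is a deliberately minimal statistical sketch (the host names, rates, and 3-sigma threshold are invented for illustration; production systems use far richer ML models):

```python
import statistics

def flag_anomalies(baseline, current, threshold=3.0):
    """Flag hosts whose current file-write rate deviates more than
    `threshold` standard deviations from the fleet baseline."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return {host: rate for host, rate in current.items()
            if abs(rate - mean) > threshold * stdev}

# Baseline: files modified per minute across the fleet on a normal day.
baseline = [12, 15, 11, 14, 13, 12, 16, 14]
# Current readings: "srv-7" is rewriting files at ransomware-like speed.
current = {"srv-1": 13, "srv-2": 15, "srv-7": 480}
alerts = flag_anomalies(baseline, current)
# alerts contains only the outlier host, which would trigger an investigation.
```

Real products layer entropy analysis, immutability checks, and learned per-workload baselines on top of this, but the core signal is the same deviation-from-normal test.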


Organizations will be forced to look for new approaches to manage unstructured data growth in 2023. Many have already noticed that the pace of unstructured data growth is accelerating far faster than it has in the past. This leads to increased costs, as companies have to buy more storage, and the introduction of risk, as the organization has less knowledge about the data as it ages in its network. Organizations need new solutions to minimize the financial impact and risk their business faces. Furthermore, much of this unstructured data is stored in network-attached storage (NAS). This is because many applications haven’t yet been redeveloped to leverage object storage. So, much of an organization’s unstructured data will continue to be stored on-premises in 2023. Because of this, public cloud providers will form more relationships with traditional on-premises NAS vendors. They will offer branded, cloud-based, managed file services. These services will benefit customers because they have a simple “on-ramp,” they preserve pre-existing documentation and processes, and they take care of the underlying hardware and operating environment for the customer. – Carl D’Halluin, CTO, Datadobi

Synthetic Data

Ramping up Innovation While Scaling Back Costs: How Synthetic Data Drives Efficiency in AI Development: Many organizations feel the pressure to innovate despite pared-back budgets and staff. During this uncertain time, it’s important to remember that scaling back does not have to stifle innovation. Organizations will need to continue investing in the tools and technology required to advance their processes, products and services––but in a much smarter and more efficient way. Over the past decade-plus, data has become a major source of competitive differentiation for businesses. As the anticipated economic fallout threatens organizations’ health, data will again play a critical role. Synthetic data will play an essential role here, as it’s 100x cheaper and faster than using real-world data. The added advantage of cost savings will enable organizations to integrate and iterate in a much faster way. Companies that master the balancing act of scaling efficiently will ultimately reap the reward of maintaining innovation during the anticipated economic downturn. – Yashar Behzadi, CEO and Founder of Synthesis AI

Derivative and synthetic data are on the rise: If the last few years have taught us anything, it’s the value of investing time and resources into preparing for the unexpected, or dare I say it, unprecedented. Good data, analytics, automation and AI all enable us to react quickly, or ideally “pre-act” to forecast issues before they even start. It’s an approach that has been proven time and time again to work with issues we have a wealth of real data to model our potential futures on, like consumer demand to predict stock levels and reduce wastage. But unfortunately, as COVID-19 showed, there are still plenty of unprecedented events that the average operation doesn’t have easy access to enough real data to react to, let alone predict. These increasingly apparent gaps are why the use of derivative and synthetic data will be a key trend in 2023. In fact, the use of synthetic data looks to completely overshadow real data in AI models by 2030. Data that is artificially created enables organizations to model innovatively for things that have never happened before, while jumping over some of the privacy, copyright and ethical hurdles associated with real data. And for anyone questioning its validity, research suggests that models trained on synthetic data can be more accurate than others, while derivative data allows us to repurpose data for multiple needs. It enables the crucial scenario-planning needed to prepare for future issues and crises, holding great potential for highly regulated industries like healthcare and financial services. – Dan Sommer, former Gartner analyst and Qlik’s Global Market Intelligence Lead
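At its most basic, synthetic data generation means learning the statistical shape of a real dataset and then sampling artificial records from it. The toy sketch below fits independent Gaussians per column and draws new rows; it is an illustration only (the columns and figures are invented, and it deliberately ignores the cross-column correlations that real generators such as GANs or copula models capture):

```python
import random
import statistics

def fit_columns(rows):
    """Learn per-column mean and standard deviation from real numeric records."""
    cols = list(zip(*rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def sample_synthetic(params, n, seed=0):
    """Draw n synthetic rows, one independent Gaussian per column.
    (Real generators also model how columns vary together.)"""
    rng = random.Random(seed)
    return [[rng.gauss(mu, sigma) for mu, sigma in params] for _ in range(n)]

# A handful of real records (age, income) stand in for a sensitive dataset.
real = [[34, 52000], [29, 48000], [41, 61000], [38, 57000]]
params = fit_columns(real)
synthetic = sample_synthetic(params, n=100)
# The synthetic rows mimic the real distribution but contain no real person.
```

Because the output rows are drawn from the fitted distribution rather than copied from individuals, they sidestep many of the privacy hurdles the quote describes, at the cost of some fidelity.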

The latest innovations in AI will be used to synthesize new content from existing content. One example of this is to ingest documents or images and to produce a summary with key points to aid in comprehension. This is a time-saving innovation that will be a main focus on AI in the ECM space in 2023. Eventually, actionable highlights will be provided, which combines comprehension with judgment that may consider datasets and other factors outside of the document but put in the context of the material being analyzed. – Michael Allen, CTO at Laserfiche

Synthetic Data: The Key to Addressing Generative AI Ethical Concerns: Generative AI has dominated headlines, and the hype surrounding the technology continues to grow. Data remains the most critical aspect in building generative AI systems, but using real-world data poses ethical and privacy concerns, including the use of human data to train ID verification models. Development teams will increasingly use synthetic data when creating generative AI models, as it’s artificial data created in simulated worlds and thus eliminates many biases and privacy concerns associated with datasets collected from the real world. AI adoption is steadily rising, with over 55% of organizations indicating AI as a core function in 2021, up from 50% in 2020. As innovation only continues to increase in the space, it will be imperative for organizations to invest in the tools and technologies that help mitigate bias imbalances and ensure generative AI models are built in a more ethical and privacy-compliant way. – Yashar Behzadi, CEO and Founder of Synthesis AI


AI will yield tremendous breakthroughs in treating medical conditions in the next few years. Just look at the 2021 Breakthrough Prize winner Dr. David Baker. Dr. Baker used AI to design completely new proteins. This ground-breaking technology will continue having huge ramifications in the life sciences, potentially developing life-saving medical treatments for diseases like Alzheimer’s and Parkinson’s. – Moogsoft co-founder and CEO, Phil Tee

Data has become a currency for many discoveries in today’s society and we will continue to see its value grow in the next year. The integration of data sources, especially when personal identifying information (PII) or protected health information (PHI) is involved, has a significant impact on the ability of AI to learn from diverse sets of data. The problem is even more complex in medical applications, where patient data are protected by HIPAA. In the next year, we can expect to see commercial organizations overcome this problem by using approaches that can link diverse data sets for the same individuals, owned and stored by different entities, through de-identified data. Tokenization is one such approach – it allows algorithm developers to gain access to diverse sets of data that are representative of the intended use population, which can then be used to develop and validate generalizable algorithms. Tokenization also creates an effective data search and exchange platform, where organizations can make available and find datasets of different modalities for the same patients, in a privacy-preserving manner. As real world data becomes a major source for AI application development and validation, tokenization will play an increasingly bigger role. – Evangelos Hytopoulos, sr. director of data science at iRhythm
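The tokenization approach described above can be sketched with a keyed hash: if every data holder runs the same identifier through the same keyed function, the resulting tokens match, so records for the same patient can be joined without anyone exchanging the raw PII. This is a minimal conceptual sketch, not any vendor's implementation — real tokenization services also handle key custody, fuzzy matching, and re-identification controls (the key and identifiers here are obviously illustrative):

```python
import hashlib
import hmac

def tokenize(identifier, key):
    """Replace a PII identifier with a deterministic keyed token.

    The same identifier and key always yield the same token, so datasets
    de-identified by different holders can still be joined on the token,
    while the token alone cannot be reversed into the identifier.
    """
    normalized = identifier.strip().lower().encode("utf-8")
    return hmac.new(key, normalized, hashlib.sha256).hexdigest()

key = b"shared-secret-held-by-the-tokenization-service"  # illustrative only

# Two organizations tokenize the same patient independently...
claims_record = tokenize("jane.doe@example.com", key)
wearable_record = tokenize("Jane.Doe@example.com ", key)  # same person, messier input
# ...and can link their records without ever exchanging the raw identifier.
linkable = claims_record == wearable_record
```

Using a keyed HMAC rather than a plain hash matters: without the secret key, an attacker with a list of candidate identifiers could simply hash them all and match tokens by brute force.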

Bridge the Talent Gap by Preserving Institutional Knowledge: As mature workers retire and fewer young people enter the field, insurers are searching for ways to obtain talent resilience. Leveraging AI will be a key strategic initiative, as insurers look to capture seasoned professionals’ institutional knowledge, then transfer it to less experienced employees. AI can effectively serve as a co-pilot for newer employees, dramatically decreasing their training time. As AI becomes more widely adopted, underwriters and adjusters will be able to amplify their capabilities, allowing them to process more policies and claims at a higher level of effectiveness. New workflows will emerge that leverage the combined strengths of AI and talented professionals. – Stan Smith, CEO and founder of Gradient AI

Intelligent and predictive data, not just data, is the future: The current trend in the automotive industry involves gathering more data from many disconnected systems, which doesn’t really solve the one major problem: quicker, more accurate and focused decision-making. The challenge with data being split into multiple variables from different sources is that it’s too complex to analyze and collate. Instead, what businesses actually need is ‘intelligent and deterministic data’: a hybrid model focusing on edge as well as cloud computing that would highlight the aspects that actually matter, yielding ready-to-use data for real-time decisions with maximum impact. – CerebrumX CEO Sandip Ranjhan

AI is mature and omnipresent in the Office of Finance: We are utterly immersed in AI, and it is already impacting our lives in powerful and exciting ways. A recent study by the Business Application Research Center (BARC) revealed that in the span of just two years, the percentage of organizations relying on predictive planning technology has increased elevenfold to 44%. According to Forbes, AI is responsible for filtering irrelevant and potentially dangerous messages from our inbox. It personalizes our social media feeds and helps Amazon curate our online shopping experience. It powers the recommendation and personalization algorithms that Netflix uses to produce and serve relevant and compelling content. It populates closed captions and subtitles in PowerPoint to make presentations more accessible. From a technical perspective, AI is stable, mature, and ready to support the office of finance. From a strategic perspective, it is helping CFOs provide the sophisticated insights that business leaders and investors are demanding. – Dr. Björn Schmidt, Chief Finance Officer at Jedox
