Bringing on the Pain: Majority of C-Suite Not Involved with Data Quality Management Initiatives


In this special guest feature, Dan Ortega, VP of Marketing at Blazent, provides a first-hand commentary on IT’s struggle with data quality. As Vice President of Marketing, Dan is responsible for all facets of Marketing for Blazent. Dan brings over 25 years of experience as both a senior executive with multiple Fortune 500 technology companies, including Sun Microsystems, SAP, and BMC, as well as extensive experience as a VP of Marketing for a series of successful start-ups such as Metacode Technologies and Astoria Software. Dan’s focus includes Product Marketing, Marketing Communications, Field and Corporate Marketing, and Channel Marketing. Dan graduated from the University of Michigan with a degree in Economics and lives in Berkeley.

It is a broadly accepted fact that the volume of data entering any enterprise has skyrocketed, and it is going to continue to accelerate at a staggering rate. The good news is that enterprises have more fodder than ever before for making decisions; the challenge is that this data fodder needs to be converted into actionable intelligence, and there is a massive disconnect between the people making the decisions (the C-Suite) and the people who provide them with the information needed to make those decisions (the IT function).

The Data Quality Divide

According to a new Blazent survey, which polled IT directors and VPs live on the exhibition floor at ServiceNow’s annual Knowledge16 conference (effectively, a big box full of real experts), nearly all respondents (98 percent) indicated that data quality management was either very important or critical to IT operations. Yet nearly half (48 percent) would give their organization a grade of C or lower on data quality driving business decisions, and over two thirds are only nominally confident in the quality of their organization’s data to drive IT decisions.

Couple that with the fact that 56 percent of execs reported that the C-suite isn’t involved in daily data quality management (DQM) conversations or in technology selection, yet still expects data to positively impact ROI. Then, putting the icing on the cake, nearly half of all execs still rely on spreadsheets (1980s technology) or manual processes (15th-century technology) to perform analytics within their organization – an inherently time-consuming approach, using the wrong tools, and saddled with an uncomfortably large margin for error.

So what does this mean?

  • Nearly all of IT agrees that data quality is important, yet half of IT admits to not managing it correctly
  • A majority of IT execs don’t trust themselves to manage data quality within their organization
  • The C-suite/senior management is not involved with data quality initiatives half the time
  • Half the time, the tools being used to make data true, accurate, and actionable are antiquated or, even worse, non-existent

Clearly there is a significant disconnect. If data quality is that important, how do these companies rationalize the dismal implications of their other responses? The key issue is the lack of participation by the C-Suite – these are the people sitting at the top of the mountain with the long-range view. Anyone lower than that is usually in fire-fighting mode, and it’s hard for that group to look down the road. Data quality is a strategic enabler: whoever has better data is in a stronger position to make better decisions than their competitors. The tools are available, the talent is in place, and the payoff is significant and quantifiable. This is a matter of focus and will; every day we see evidence of decisions based on bad data, and every day is another day in which the majority of the C-suite is not doing anything about it.

What’s Next?

It’s clear that data quality remains a huge priority for IT, yet its current state of implementation (or lack thereof) is genuinely alarming. To clear this data quality hurdle, organizations need to educate from the top down on the importance of having a consolidated view of the information associated with quality data, and of leveraging the proper automated processes to minimize human error – all with the goal of enabling enterprise-wide strategic value. The companies that figure this out first will have a massive competitive advantage; those that don’t will become a footnote in case studies of what to avoid.
