Why Data Quality without Data Integrity is No Match for Today’s Business Demands


In this special guest feature, Bobby Koritala of Infogix discusses data management best practices and why data quality without data integrity is no match for today’s business demands. Bobby Koritala, Chief Product Officer, joined Infogix in 2009 and leads the Product Management and Development Group. Prior to this, Bobby served as the Director of Risk Technology Solutions at Protiviti, COO of Spark Biotech, Vice President of Investments at Open Prairie Ventures, Director of Applied Technology at Blue Cross Blue Shield, Director of Product Development at LexisNexis, and Senior Manager, Software Development at SPSS. Bobby has a Bachelor of Arts degree in Computer Science and Physics from Coe College, a Master of Science degree in Computer Science from the University of Wisconsin, and an MBA from the Kellogg School of Management, Northwestern University.

Salt without pepper. Batman without Robin. Bonnie without Clyde.

One without the other just doesn’t make sense, right?

Dynamic duos enhance each other’s best attributes: just the right combination makes each partner more valuable. Taken together, they complement each other – and that’s when the magic happens.

The same thing could be said about data quality and data integrity – different, yet equally important components of an overarching data management strategy. One without the other yields incomplete data quality standards, leaving companies vulnerable to problems resulting from bad data.

In the past few years, data quality has been front and center, commanding attention in Master Data Management (MDM) and other initiatives, particularly as risk management and compliance have dictated the priorities – and budgets – of companies in highly regulated industries such as financial services and healthcare.

Gartner describes data quality tools as follows: “Traditionally aligned with cleansing of customer data (names and addresses) in support of CRM-related activities, the tools have expanded well beyond such capabilities, and forward-thinking organizations are recognizing the relevance of these tools in other data domains.” Such tools provide the following capabilities, according to the research firm: parsing and standardization; generalized “cleansing”; matching; profiling; monitoring; and enrichment.

Despite the allusion to expanded capabilities above, traditional definitions of data quality are insufficient: data quality, on its own, cannot address all of the areas where data is susceptible to corruption, alteration or deletion, nor can it identify the many ways data needs to be reconciled and verified to accomplish different tasks.

When it comes to data management best practices, data quality should be something for which all companies strive. Whether they choose to broaden their definition of data quality to include data integrity best practices, or think of data quality and data integrity as two distinct sides of one coin, matters less than the fact that certain aspects of data quality are often overlooked. Robust data quality – your Data IQ, if you will – includes both integrity and quality. And just as with intelligence IQ, companies are striving for a higher Data IQ: 60% of IT leaders indicate that their organizations don’t trust their data.

What’s at stake is millions of dollars in lost revenue, exposure to significant risk and efficiency drains on operational costs. Data quality, without data integrity, is only half of the equation to fix this glaring deficit – and companies need to raise their Data IQ to make the grade.

Let’s outline the different parts of Data IQ, then, by focusing on the distinctions between data quality and data integrity, so that businesses can better grasp all that’s required to ensure their data is trustworthy through and through.

Data Quality: The Four C’s

Data quality controls cover dimensions such as completeness, type conformance, value conformance and consistency (what we call the four C’s). Taken together, these concepts describe how data should appear where it resides, once it lands after moving through an enterprise’s systems. Checking data quality at the field level (the data element) provides a valuable level of granularity – like having another set of eyes on your data. Those eyes, though, are automated business rules that validate individual fields, confirming, for example, that account numbers follow established corporate schemas and policies.
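To make this concrete, here is a minimal sketch in Python of what field-level four C’s checks might look like. The required fields, the account-number pattern and the closed-account rule are hypothetical examples for illustration, not actual Infogix rules.

```python
import re

# Value conformance rule: a hypothetical corporate account-number schema.
ACCOUNT_PATTERN = re.compile(r"^ACCT-\d{8}$")

def check_record(record: dict) -> list[str]:
    """Return a list of four C's rule violations for one record."""
    issues = []

    # Completeness: every required field must be present and non-empty.
    for field in ("account_number", "customer_name", "balance"):
        if record.get(field) in ("", None):
            issues.append(f"missing field: {field}")

    # Type conformance: balance must be numeric.
    if "balance" in record and not isinstance(record["balance"], (int, float)):
        issues.append("balance is not numeric")

    # Value conformance: account number must match the corporate schema.
    acct = record.get("account_number", "")
    if acct and not ACCOUNT_PATTERN.match(acct):
        issues.append(f"account number {acct!r} violates schema")

    # Consistency: a closed account should carry a zero balance.
    if record.get("status") == "closed" and record.get("balance") != 0:
        issues.append("closed account has nonzero balance")

    return issues

print(check_record({"account_number": "ACCT-1234",
                    "customer_name": "Ada",
                    "balance": "12"}))
# ['balance is not numeric', "account number 'ACCT-1234' violates schema"]
```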

By zooming into the data at the field level, companies set a foundation for more accurate data overall. But companies should implement additional steps to bring data quality out of the weeds, incorporating these checks into procedures that rectify the issues the data quality controls initially spot.

Data Integrity: “VBRT” (Verification, Balancing, Reconciliation and Tracking)

Knowing that there’s an issue with your data is one thing; determining what to do about the discrepancy is another issue entirely. Adding data integrity to traditional standards of data quality helps companies follow their data from source to destination and promptly discover when irregularities occur. When a data problem cascades into downstream systems, it not only proliferates bad data but is invariably detected by a consumer of that data – which is exactly why so many IT leaders report that their users don’t trust their data.

A thorough understanding of data quality standards involves VBRT: verification, balancing, reconciliation and tracking. These steps provide visibility into organizational data, making errors easy to identify and respond to in real time. Tracking the health of data as it flows through myriad disparate systems, using automated business rules, helps proactively prioritize issues before they grow into major problems that waste manpower on temporary workarounds while the root cause is investigated.
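As a rough illustration of the balancing and reconciliation pieces of VBRT, the Python sketch below compares record counts and amount totals between a source system and its downstream destination, then identifies which individual records fell out. The field names and tolerance are assumptions for the example; real controls would read from the systems themselves.

```python
def balance(source: list[dict], destination: list[dict],
            tolerance: float = 0.01) -> dict:
    """Verify record counts and amount totals agree end to end."""
    result = {
        "source_count": len(source),
        "dest_count": len(destination),
        "source_total": sum(r["amount"] for r in source),
        "dest_total": sum(r["amount"] for r in destination),
    }
    result["count_balanced"] = result["source_count"] == result["dest_count"]
    result["amount_balanced"] = (
        abs(result["source_total"] - result["dest_total"]) <= tolerance
    )
    return result

def reconcile(source: list[dict], destination: list[dict],
              key: str = "txn_id") -> dict:
    """Identify which records fell out, not just that totals differ."""
    src_ids = {r[key] for r in source}
    dst_ids = {r[key] for r in destination}
    return {"missing_downstream": src_ids - dst_ids,
            "unexpected_downstream": dst_ids - src_ids}

src = [{"txn_id": 1, "amount": 100.0}, {"txn_id": 2, "amount": 250.0}]
dst = [{"txn_id": 1, "amount": 100.0}]
print(balance(src, dst))    # counts and totals disagree -> raise an alert
print(reconcile(src, dst))  # {'missing_downstream': {2}, 'unexpected_downstream': set()}
```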

The Dynamic Duo in Action

Understanding the value of data quality combined with data integrity is half the battle; putting these concepts into action is another task altogether. Companies with regular access to data quality and integrity alerts can monitor data health from an enterprise vantage point and continuously adapt their automated business rule logic, refining data operations to catch bad data at the source. Implementing automated data controls in systems across the enterprise requires discipline – with the potential to transform how your company interacts with its data along the way.

Companies can best take on Data IQ by ensuring their data integrity and quality controls are automated and continuous – easy enough in theory, but often overlooked in reality. Manual and semi-automated data controls, set up and validated with snapshot reports for data quality assessments, are incomplete at best. When data controls analyze data across the entire enterprise, a company gains a bird’s-eye, end-to-end view of the health of its data, reducing the risk of proliferating bad data.
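One way to picture “automated and continuous” rather than snapshot-based: register each control as a small function and run the whole registry on a schedule, aggregating alerts into one enterprise-wide view. This is only a sketch under assumed names; the example control and the 60-second interval are illustrative, not a prescribed architecture.

```python
import time
from typing import Callable

# Registry of named data controls; each returns a list of alert strings.
CONTROLS: dict[str, Callable[[], list[str]]] = {}

def control(name: str):
    """Decorator that registers a function as a named data control."""
    def decorator(fn):
        CONTROLS[name] = fn
        return fn
    return decorator

@control("billing-feed-balanced")
def billing_feed_balanced() -> list[str]:
    # In practice this would call a balancing check like balance() above.
    return []  # an empty list means the control passed

def run_all_controls() -> dict[str, list[str]]:
    """Run every registered control and collect alerts for a dashboard."""
    return {name: fn() for name, fn in CONTROLS.items()}

if __name__ == "__main__":
    # Continuous monitoring loop instead of a one-off snapshot report.
    while True:
        alerts = {k: v for k, v in run_all_controls().items() if v}
        if alerts:
            print("ALERTS:", alerts)  # route to monitoring/paging in practice
        time.sleep(60)
```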

Think of it this way: what if a customer doesn’t get his or her insurance bill one month, but instead receives a cancellation notice? Because the bill was never paid, the insurer sent a default notice to cancel the customer’s membership. Data IQ principles apply here in multiple respects – but first, you need to work backward from the result to the specific initial action that caused this sequence of errors. Untangling the multiple issues here – first, that the bill wasn’t sent; second, that the cancellation notice was sent anyway – requires an integrated application of logic. If automated controls flagged the insurance company the moment the bill wasn’t sent, then other if/then scenarios could prevent the cancellation notice from ever being printed and sent to the customer.
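A minimal sketch of that if/then control, assuming hypothetical record shapes: flag policies whose bill was never sent, and suppress any cancellation notice that would otherwise go out for them.

```python
def bills_not_sent(policies: list[dict], bills_sent: set[str]) -> set[str]:
    """Active policies due a bill this cycle with no record of one being sent."""
    return {p["policy_id"] for p in policies if p["active"]} - bills_sent

def safe_to_cancel(policy_id: str, missing_bills: set[str]) -> bool:
    """Never cancel for non-payment if we never billed the customer."""
    return policy_id not in missing_bills

policies = [{"policy_id": "P-100", "active": True},
            {"policy_id": "P-200", "active": True}]
bills_sent = {"P-200"}

missing = bills_not_sent(policies, bills_sent)
print(missing)                            # {'P-100'} -> alert billing operations
print(safe_to_cancel("P-100", missing))   # False -> suppress the cancellation notice
```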

The biggest advantage of implementing end-to-end data quality controls is that the logic is applied from the get-go, rather than after a mistake has been made. By thinking through potential scenarios where bad data could have a negative impact, companies are forced to be proactive and strategic when planning their operational data flows. Working with teams and partners who have deep industry knowledge helps establish Data IQ principles, ensuring organizations can lean on seasoned expertise to determine which mistakes to flag. From there, companies can customize their logic up front to reflect scenarios unique to their industries, based on their own experiences.

Steps Toward Successful Data IQ Implementation

Creating a solid framework for Data IQ begins with assessing existing operational processes to identify gaps and opportunities where automation will remove risk by improving data governance. Consider the following questions, with some examples to jumpstart your Data IQ:

Are you working with the right data? This, at the highest level, includes removing duplicates to prevent confusion later on, as well as determining procedurally where data needs to be sent within your system and which data needs to be collected or requested.
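A minimal sketch of that duplicate-removal step: matching on a normalized name/email key here is an illustrative assumption; production matching typically adds fuzzy rules for typos and nicknames.

```python
def normalize(record: dict) -> tuple:
    """Build a match key that survives casing and whitespace differences."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def dedupe(records: list[dict]) -> list[dict]:
    """Keep the first record seen for each match key."""
    seen, unique = set(), []
    for r in records:
        key = normalize(r)
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

customers = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "ada lovelace ", "email": "ADA@example.com"},  # duplicate
]
print(len(dedupe(customers)))  # 1
```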

Is data being properly routed through your system? Data sequencing is important, in particular when issues prevent timely receipt of data. Late data is bad data – end of story. Keeping this in mind is helpful when assessing how data flows through your system. If it’s getting bottlenecked at a certain stage, you can address that by being alerted to data that arrives behind schedule – or not at all. Supply chain companies dealing with SKU proliferation, for example, need to be confident they’re routing orders to the right vendor class – and that this process is happening behind the scenes smoothly so that operations can continue uninterrupted.
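The timeliness check described above might look like the following sketch: alert when an expected feed hasn’t arrived within its SLA window. The feed name and the two-hour SLA are hypothetical.

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=2)  # illustrative grace window after the due time

def late_feeds(expected: dict[str, datetime], arrived: dict[str, datetime],
               now: datetime) -> list[str]:
    """Feeds past their due time plus SLA, or missing entirely."""
    alerts = []
    for feed, due in expected.items():
        arrival = arrived.get(feed)
        if arrival is None and now > due + SLA:
            alerts.append(f"{feed}: missing (due {due:%H:%M})")
        elif arrival is not None and arrival > due + SLA:
            alerts.append(f"{feed}: arrived late at {arrival:%H:%M}")
    return alerts

now = datetime(2019, 6, 1, 12, 0)
expected = {"vendor_orders": datetime(2019, 6, 1, 9, 0)}
print(late_feeds(expected, {}, now))  # ['vendor_orders: missing (due 09:00)']
```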

Is data following the right path along the journey? Take the above thinking one level deeper, looking at the individual steps in the data movement process. Think of it as guiding data the correct way – left, not right! – to make sure it gets where it needs to be. This is where balancing checks and reconciliation come into play, and they can be very complex. Matching records when many transactions merge into one, or tracking millions of transactions en masse, requires automated protocols. When banks receive financial data from tellers or mortgage brokers each night, reconciliation can’t be left to chance – or they’ll risk mistakes on the general ledger that compound into larger compliance issues and stiff penalties.
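Here is a rough sketch of that many-to-one matching: individual teller transactions should roll up to a single posted general-ledger entry per account. Field names and the penny tolerance are illustrative assumptions.

```python
from collections import defaultdict

def reconcile_rollup(transactions: list[dict], ledger: list[dict],
                     tolerance: float = 0.01) -> list[str]:
    """Flag accounts where summed detail transactions != the ledger entry."""
    totals = defaultdict(float)
    for t in transactions:
        totals[t["account"]] += t["amount"]

    posted = {e["account"]: e["amount"] for e in ledger}
    breaks = []
    for account in set(totals) | set(posted):
        if abs(totals.get(account, 0.0) - posted.get(account, 0.0)) > tolerance:
            breaks.append(f"{account}: detail {totals.get(account, 0.0):.2f} "
                          f"vs ledger {posted.get(account, 0.0):.2f}")
    return breaks

txns = [{"account": "A1", "amount": 40.0}, {"account": "A1", "amount": 60.0}]
ledger = [{"account": "A1", "amount": 90.0}]
print(reconcile_rollup(txns, ledger))  # ['A1: detail 100.00 vs ledger 90.00']
```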

Is the right content available when and where it needs to be accessed? Drilling down to this level of detail allows you to check whether individual pieces of data are the ones you’re looking to collect, measure or assess. It is here that data quality checks, performance checks and content checks can verify that what you’re gathering is thorough and matches what you’re seeking. If you’re a supply chain company, for example, you’ll want to check that incoming parts lists match your catalogue – so nothing slips through the cracks.
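That content check could be as simple as the sketch below: verify that incoming part numbers exist in the master catalogue and hold anything unrecognized for review. The part numbers are hypothetical.

```python
def content_check(incoming_parts: list[str], catalogue: set[str]) -> list[str]:
    """Return incoming part numbers not present in the master catalogue."""
    return [p for p in incoming_parts if p not in catalogue]

catalogue = {"PN-1001", "PN-1002", "PN-1003"}
incoming = ["PN-1001", "PN-9999"]  # PN-9999 is not in the catalogue

unknown = content_check(incoming, catalogue)
if unknown:
    print("Unrecognized parts:", unknown)  # hold these for review
```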

Raising the Bar with Data IQ

When data quality practices are extended to include data integrity techniques, you’ve set up the right foundation to make data ready for action. By applying strong operational and predictive analytics methods to data you’re confident is accurate and reliable, you’ll be able to more thoroughly interpret the information already pouring through your business systems to effect change and impact the bottom line.

Focusing first on fixing your data processes using Data IQ principles to correct data quality and data integrity gaps is essential, freeing your business to then focus on how to apply advanced analytical techniques to derive meaningful and accurate insights from your trustworthy data. Data IQ should be the very first step, not an afterthought, toward continual business improvement. As companies wave the banner of being data-driven, their success hinges upon data quality – and a broader definition of it, at that.

The sooner your data quality best practices include measures of data integrity, the sooner you’ll be confident that you’ve got a winning combination for your business data – allowing the dual concepts to work together to make your data, and your company, shine.
