To Improve Enterprise Visibility, Shine a Light on Dark Data and Shadow IT


The use of information technology systems, devices, software, applications, and services without explicit IT department approval is known as “shadow IT.” It’s not a new concept, but the adoption of cloud-based applications and services and the rise of the hybrid workforce have only increased its presence. What’s less commonly discussed is that shadow IT creates shadow data, or dark data. A recent report found that 55 percent of organizations’ data is dark – unnoticed and unaccounted for. Yet nearly everyone insists data is “very” or “extremely” valuable to success. How can that be if organizations don’t even know what data they have?

If organizations don’t have an accurate picture of their data, they probably have even less of an idea of that data’s quality. Let’s say someone gets hold of data that appears useful to the business and circulates it around the organization. This unvalidated data could become the organization’s source of truth – a dangerous outcome, especially if it’s bad data. Even if 90% of the data fed into an algorithm is sound, it doesn’t take much incorrect data to skew results, and skewed results skew business strategy.
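To make that point concrete, here is a minimal illustration in Python using invented numbers: if just 10 percent of records carry an erroneous value, an aggregate that downstream teams treat as truth can be thrown off by an order of magnitude.

```python
# Illustrative only, with made-up numbers: a small share of bad records
# can badly skew an aggregate that feeds downstream decisions.
good = [100.0] * 90      # 90% of records hold a plausible reading
bad = [10_000.0] * 10    # 10% are erroneous (e.g., unit or data-entry errors)

clean_mean = sum(good) / len(good)
skewed_mean = sum(good + bad) / (len(good) + len(bad))

print(f"mean over clean data:   {clean_mean:.1f}")   # 100.0
print(f"mean including bad 10%: {skewed_mean:.1f}")  # 1090.0 – roughly 10x off
```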

Data visibility and accessibility enable sound business decisions

Business leaders rely on data to make decisions, and bad, unvalidated data can be detrimental. That’s why data visibility – and by extension, data democratization (putting more data into more people’s hands in a useful, usable way) – is an essential cornerstone of data literacy, and it should be a top priority for every organization. On one hand, increased visibility gets more people doing things like analytics; they can’t use data if they don’t know it exists. On the other hand, when shadow data is allowed to proliferate, questions arise about where to go to verify that the data is trustworthy. Leaders need visibility to see what data they have and what that data means to the business.

Data visibility also drives data democratization. With clear visibility into data and sturdy guardrails alongside it, you reach that nirvana stage: people using data for initiatives that are genuinely valuable to the business. The more data people know about, the more ideas they have for mixing and matching it, arriving at viewpoints and understanding they never had before. But if users don’t have a place to find out what’s available and a way to navigate it efficiently, they’ll default to doing the quickest thing, with the mantra: “I have a deadline and I’m going to do the best that I can.”

Businesses need to develop an architecture that supports data visibility

The role of business leaders is to get all of the IT roadblocks out of the way so people can do the right thing with data – extraordinary things, if it’s done well. 

To build that architecture, it’s important to recognize that it must fit the use and purpose of the data, rather than the other way around. First, understand what data is available, how it currently serves the business, and how it’s being managed. Then compare that as-is model against enterprise goals and strategies: which processes are in place today, what the organization wants to do in the future, and where it needs to improve through digital transformation. Taking this 30,000-foot view and mapping it back to the organization’s data gives business leaders a clear picture of where the organization wants to be and how its architecture needs to change.

Next, it’s time to go from an ‘as-is’ to a ‘to-be’ model – one with visibility into how things look today and a clear roadmap for where to go and how to get there. Digital transformation is not a one-and-done deal; it’s an iterative process. Modernizing and expanding data capabilities will be a lifetime project for every organization, because no one knows what kinds of data, drivers, and technologies will come down the road. What we do have is the ability to look at where we are today, look at where we need to be, justify the resources, and show that we can get there efficiently and with minimal risk to produce a positive return on investment. We can then follow that roadmap wherever it may lead.

The data points to the applications, and in turn, the applications point to business processes and business capabilities. In that way, visible data becomes interconnected throughout the business – provided it’s put into an environment where business users can work with it effectively.
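As a rough sketch of what that interconnection can look like (the dataset, application, and capability names below are hypothetical), a simple mapping from datasets to applications to business capabilities makes unmapped – and therefore dark – data stand out immediately:

```python
# Hypothetical sketch: chain datasets -> applications -> business capabilities
# so that any dataset with no known consumer surfaces as a dark-data candidate.
from dataclasses import dataclass, field

@dataclass
class Dataset:
    name: str
    applications: list[str] = field(default_factory=list)  # apps that consume it

app_capabilities = {
    "SalesPortal": ["Lead Management"],    # assumed example application
    "BillingEngine": ["Invoicing"],
}

datasets = [
    Dataset("crm_contacts", ["SalesPortal"]),
    Dataset("invoice_history", ["BillingEngine"]),
    Dataset("regional_spreadsheet_v7"),    # no known consumer
]

for ds in datasets:
    caps = [c for app in ds.applications for c in app_capabilities.get(app, [])]
    label = ", ".join(caps) if caps else "UNMAPPED – dark data candidate"
    print(f"{ds.name}: {label}")
```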

Real-time observability and automation are required for effective data visibility

The next thing to consider is: how real-time is the business? If real-time impact is a core foundation underpinning the business, data visibility has to be real-time too. Business leaders need to be able to look at data and answer: What is my data quality? How is it varying from day to day? Why is it varying from day to day? At what thresholds should the data no longer be used? All of these answers need to be as close to real time as possible.
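One way to frame those questions, sketched below with an assumed threshold and invented scores, is a simple day-over-day quality check that flags when data drops below the point where it should no longer drive decisions:

```python
# Hypothetical sketch: track a daily data-quality score (e.g., the share of
# records passing validation) and flag days that fall below a usable threshold.
QUALITY_THRESHOLD = 0.95   # assumed cutoff; a real threshold depends on the use case

daily_scores = {           # invented scores from a nightly validation job
    "2024-05-01": 0.99,
    "2024-05-02": 0.97,
    "2024-05-03": 0.91,
}

previous = None
for day, score in daily_scores.items():
    change = f" (day-over-day {score - previous:+.2f})" if previous is not None else ""
    flag = "  -> below threshold: do not use" if score < QUALITY_THRESHOLD else ""
    print(f"{day}: quality {score:.2f}{change}{flag}")
    previous = score
```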

It’s also important that the process of getting information, and viewing it in real time, is as simple and up to date as possible. Anyone who gives out bad or incomplete information is unlikely to have customers, partners, and stakeholders come back to them. Is the process of getting information so unwieldy that people think they have to go around you to access the data? That’s an obstacle that needs to be removed.

It ultimately comes down to human nature and trust. Businesses need a process that is strong, but also welcoming, and one that ensures that the data given out is the best information available. Automation can help.

The more that businesses automate data observability and accessibility processes, the easier it becomes to interconnect data with business operations. To support a modern data architecture, automation should be repeatable, intelligent, and mindful of data governance and data management, while allowing leaders to maintain the speed of business and data capability. Artificial intelligence (AI) and robotic process automation (RPA) can help IT teams set policies for proper data access and automate data discovery – shining a light on dark data and helping businesses make faster decisions, surface smarter insights, and drive better business outcomes.
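As a simple, hypothetical illustration of automated discovery (the asset names are invented), the core idea is to continuously compare what exists in the environment against what the governed catalog knows about, and route anything untracked to review:

```python
# Hypothetical sketch of automated data discovery: compare assets found by a
# metadata scan against the governed catalog and flag anything untracked.
cataloged_assets = {"sales.orders", "crm.contacts", "finance.invoices"}
discovered_assets = {
    "sales.orders", "crm.contacts", "finance.invoices",
    "hr.exports_2023", "marketing.tmp_leads",   # turned up by the scan, unknown to governance
}

dark_assets = discovered_assets - cataloged_assets
for asset in sorted(dark_assets):
    # In practice this step might open a ticket or trigger a classification workflow.
    print(f"Untracked asset found: {asset} – route to governance review")
```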

Easier said than done, right? Not necessarily. With the right executive buy-in and sponsorship, ridding an organization of dark data is entirely achievable. In this regard, culture matters more than technology. When leaders create a culture of enterprise visibility around data and data literacy – with buy-in to democratization, buy-in to rights and responsibilities, and buy-in to compliance – eliminating dark data becomes not just a plausible goal but an attainable one.

About the Author

Danny Sandwell, Director of Product Marketing, erwin by Quest Software. Danny Sandwell is an IT industry veteran who has been helping organizations create value from their data for more than 30 years. His goal is to help enterprises unlock their potential while mitigating data-related risks. In his role at Quest, Danny helps businesses get the maximum value out of enterprise modeling and data intelligence solutions. Danny has 20+ years of experience in pre-sales consulting, product management, business development and business strategy roles, making him a key advisor to IT leaders across various industries as they plan, develop and manage their data architectures. 
