The New Alchemy: Mastering Use of Qualitative Data to Create Insight Gold


In this special guest feature, Daniel Erickson, Founder and CEO, Viable, discusses how to master the use of qualitative data to drive sales. Dan brings 15+ years of experience in the field as a self-taught coder, including positions as CTO of Getable, VP of Engineering at Eaze, and Senior Engineer at Yammer. Dan believes in building systems and tools to help teams achieve their goals.

Businesses have always collected data. From notches on sticks to marks on clay tablets and eventually writing on paper, the act of recording information to inform future actions is as old as commerce itself. Of course, data collection today is on a wildly different scale. We’ve hit a point where there’s so much data flowing into enterprises that simply trying to handle it can detract from, rather than add to, its value for an organization.

We’ve all seen countless headlines claiming, “Data is the New Gold!” or “Data is the New Oil!” And in the interest of ensuring that no chance to capitalize on it is lost, businesses have become veritable data-collecting machines. However, and I’m hardly the first to point this out, data is just like gold and oil in that its value is only realized after it’s been processed. With data, it’s this processing, or what happens to data after it’s collected, that most organizations have yet to master.

According to Forrester, between 60% and 73% of all data within an enterprise goes unused for analytics. That alone illustrates the lack of mastery to which I’m referring. 

Still, even if we were to use 100% of the available data for analytics, we tend to over-rely on quantitative assessments. Organizations are quick to crank out hard numbers like units sold and conversion rates. Meanwhile, they leave on the table the qualitative data that, if properly analyzed, could reveal a wealth of valuable insights.

According to Statista, market research companies generated over $47 billion in revenue in 2019 in the U.S. alone. Nearly two-thirds of that spend was on quantitative methods such as online surveys and phone interviews. Customer satisfaction surveys were the largest single category of this spend.

With a quantitative customer satisfaction survey, you’re likely to find out things like what proportion of customers are happy with the product and whether they would recommend the product to others. Now, imagine you’re on the product team. You learn that 60 percent of customers are happy with the product but only 30 percent would recommend it to others. Now what? You know something is working and something isn’t, but you have no idea what either one is. Not very helpful.

Qualitative surveys let you get past yes and no and ask the most important question: Why? For the previous example, maybe you find out that customers love the product, but it’s the after-sale support that isn’t meeting expectations. Now you have insights that inform how you can take action to fix the issue. Or more accurately, work with customer service to fix the issue.

Adjusting a purely yes/no survey to incorporate a few open-ended questions is easy enough. The challenge – and the reason why so many organizations don’t even try – is analyzing those unstructured, text-based responses. And that’s just a survey. Imagine all the information that organizations are already sitting on: social media mentions, call center transcripts, support emails, chat logs, and so on. If analyzed, it could provide game-changing insights. But when it just sits in a data warehouse somewhere, it’s useless.
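To make “analyzing unstructured, text-based responses” less abstract, here is a minimal sketch of one common approach: tagging each open-ended response with the theme it most resembles and tallying the results. The Hugging Face transformers library, the zero-shot model, and the candidate themes below are illustrative assumptions, not the author’s stack or a specific vendor’s product.

```python
# A minimal sketch of theme-tagging open-ended survey responses with NLP.
# Assumes the Hugging Face `transformers` library; the model choice and the
# candidate themes are illustrative, not prescriptive.
from collections import Counter
from transformers import pipeline

responses = [
    "The product itself is great, but support took a week to reply.",
    "Love the features, onboarding was confusing though.",
    "Billing charged me twice and nobody answered my email.",
]

# Hypothetical themes a product team might care about.
themes = ["after-sale support", "onboarding", "pricing and billing", "product features"]

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

counts = Counter()
for text in responses:
    result = classifier(text, candidate_labels=themes)
    counts[result["labels"][0]] += 1  # tally the top-scoring theme per response

for theme, n in counts.most_common():
    print(f"{theme}: {n} responses")
```

The same pattern scales from a handful of survey answers to support emails, chat logs, and call-center transcripts: free text goes in, an aggregate count of themes comes out that a team can actually act on.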

The second area where lack of mastery is consistently demonstrated is how organizations are – or more accurately, aren’t – disseminating data throughout the organization. What we see most often is more like “he who collects it keeps it.” For example, the product team might have usage metrics while the marketing team has app store feedback and the comms team has social media mentions. While each of these divisions may get some modicum of use from the information, it would be exponentially more valuable if it could be properly shared and used to inform larger, cross-functional decisions and strategies.

While the value of becoming masters of data usage is clear, the reasons organizations don’t strive for it are less so. In the past, this type of analysis would have been cost prohibitive and/or labor intensive. Today, that’s not true. With advances in natural language processing (NLP) and greater availability of cost-friendly technologies that automate previously manual tasks, achieving mastery is possible. Here are three steps you can take to drive the necessary changes within your organization that will set it up for data-usage success: 

  1. Proactively embed qualitative data into every step of decision making: From the research phase through product development and post launch, make sure you are asking your customers the types of “why” questions that will contribute real insights. For example, why are customers drawn to your product in the first place? Why do users feel the latest update is confusing? Why is feedback overwhelmingly positive for one segment but underwhelming for another? Of course, the responses are worthless if you can’t understand them, so build in, from the start, the processes you’ll use to analyze responses and act on them where necessary. 
  2. Make the internal case for automation: Lack of analysis is often a side effect of lack of resources. But with the increasing availability of budget-friendly automation tools, this should be a thing of the past. For example, manually reading and analyzing data might take 30 to 45 seconds per data point. If your organization has 7,000 data points, that works out to roughly 60-90 human hours of processing (see the sketch after this list). With automation, you can reduce that to just minutes. In a cost-benefit analysis, the winner should be clear. 
  3. Give everyone in your organization access to qualitative data insights: Data analysis that’s accessible only to technically skilled users is less useful than data analysis that every business team can use. Find ways to proactively share qualitative data analysis with every type of user, regardless of technical ability. Building habits and processes (point 1 above) and embracing automation (point 2) go a long way toward making this possible. 
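To make the cost-benefit case in point 2 concrete, here is a back-of-the-envelope sketch of the manual-review math. The data-point count and per-point times are the article’s own figures; everything else is an illustrative assumption, not a measurement.

```python
# Back-of-the-envelope math behind point 2: manual review time for 7,000
# data points at 30-45 seconds each.
data_points = 7_000
seconds_per_point = (30, 45)  # assumed manual read-and-tag time per item

low, high = (s * data_points / 3600 for s in seconds_per_point)
print(f"Manual review: {low:.0f} to {high:.0f} hours")
# -> Manual review: 58 to 88 hours (the "60-90 human hours" cited above)
```

Even at the low end, that is more than a full work week of a single analyst’s time for one pass over the data, which is why the manual approach so rarely happens at all.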

While data isn’t gold, mastering its use can have an alchemy-like effect, bringing immense value to something that previously had little. 

