
Time to Stop Treating Data Like a Four-Letter Word

It is clear that Apple is addressing privacy head-on with its iOS 14 updates, which give users increased transparency and control over app location tracking. This is a step in the right direction, as the daily news cycle is filled with stories of data being misused or repurposed far beyond its original intent. Consumers have been conditioned to look for the negative, and data has become a “four-letter word” – raising consumer hackles and distrust. However, for most businesses, the value lies not in the data itself but in the insight it provides – and the power to deliver innovation meant to help, not harm, consumers.

How to solve the disconnect? Organizations at the forefront of digital transformation and AI need to drive awareness around the principles of responsible data use. Data is used responsibly when it achieves the purpose of the application and that purpose is narrowly focused, never requiring that any consumer be identified specifically.

Establishing privacy-first principles and pro-actively adopting responsible use standards as the rule will not only build consumer trust but also help scale digital transformation and AI, thus accelerating consumer value. As consumers recognize the benefits, their confidence in businesses’ ability to respect data privacy will increase, creating a virtuous cycle.

Taking the Personal Out of Personal Data

Traditionally, data privacy centered around personally identifiable information (PII) — the kind of data that can be used to identify an individual and potentially to commit fraud (e.g., name, Social Security number, driver’s license number, bank account number). In recent years, the focus has broadened to Persistent Information (PI) such as Advertising ID, IP address and cookies, as this pseudonymous data reflects an individual’s behavior and may be used to indirectly identify him or her if combined with more information.

IoT and 5G have led to an explosion of data interactions, further creating grey areas. According to Strategy + Business, the average user produces 4,900 data interactions per day, and the amount of data stored per Internet user is expected to double by 2025. As legislation lags technology, it becomes incumbent on businesses to lead standards for the “responsible use of data.” These standards address “should we do this?” rather than “can we do this?”

Creating a Responsible-First Mindset

Responsible data use begins at the development of any application or platform. Four straightforward principles should guide the use of any data: framing the initial problem statement around aggregate insight, ensuring consumer consent, using privacy-aware data, and committing to transparency.

Starting with Aggregate vs. Individual Insight. The platform’s goal should be to never require identifying any individual specifically and to enforce a minimum scale threshold for inclusion (e.g., a certain number of records). Data that does not meet this standard should be discarded so that isolated cases are never retained.

For example, tracking foot traffic to certain retailers requires population counts segmented by demographic (to optimize for specific groups, such as the younger generation vs. older). Identifying someone’s personal information, as an individual, will not improve these efforts. There are countless similar use cases that require the application of AI to large, aggregated, high-quality datasets.
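The threshold principle above can be sketched in a few lines. This is a minimal illustration, not a production implementation: the record layout, field names, and the threshold value are all hypothetical, chosen only to show how groups below a minimum size are suppressed rather than retained.

```python
from collections import Counter

# Hypothetical visit records: (retailer, demographic_segment) pairs.
visits = [
    ("store_a", "18-24"), ("store_a", "18-24"), ("store_a", "18-24"),
    ("store_a", "25-34"), ("store_a", "25-34"),
    ("store_b", "18-24"),  # below the threshold -- will be discarded
]

MIN_GROUP_SIZE = 2  # minimum records required for a segment to be reported


def aggregate_foot_traffic(records, k=MIN_GROUP_SIZE):
    """Count visits per (retailer, segment), suppressing small groups.

    Groups smaller than k are dropped entirely, so no isolated case
    survives into the reported aggregates.
    """
    counts = Counter(records)
    return {group: n for group, n in counts.items() if n >= k}


print(aggregate_foot_traffic(visits))
```

Only aggregate counts per segment survive; the single visit to `store_b` is discarded because it could point back to one individual.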

Ensuring First-Party Consent and De-identification. Again, we see this happening with Apple’s new iOS, which holds apps accountable and puts the power into the hands of the consumer. Organizations need to verify that the data was obtained with consumers’ rights and benefits in mind. Consumers should not only consent to sharing data but also be able to change their participation at any time, and they should be aware of the purposes and benefits for which the data is being collected.

Once validated, all personal or user-level identifiers should be discarded by either the first-party provider or the product (i.e., some advanced platforms filter user level identifiers and utilize only aggregated or de-identified data).
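Discarding user-level identifiers after validation can be as simple as an allow/deny filter over each record. The sketch below is an assumption-laden illustration: the field names and the example event are invented, and a real platform would maintain its deny-list from a governed schema rather than a hard-coded set.

```python
# Hypothetical set of user-level identifier fields that must never
# reach the analytics platform.
USER_LEVEL_FIELDS = {"advertising_id", "ip_address", "device_id", "cookie_id"}


def strip_identifiers(event: dict) -> dict:
    """Return a copy of the event with all user-level identifiers removed."""
    return {k: v for k, v in event.items() if k not in USER_LEVEL_FIELDS}


raw = {
    "advertising_id": "abc-123",       # dropped
    "ip_address": "203.0.113.7",       # dropped
    "timestamp": "2021-03-01T12:00:00Z",
    "event": "store_visit",
}
print(strip_identifiers(raw))
```

Whether this filter runs at the first-party provider or inside the product, the effect is the same: downstream systems only ever see de-identified events.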

Using Privacy-Aware vs. Personal Data. Privacy-aware data consists of the outputs of de-identification or aggregation, rather than the original data, which can identify a unique individual directly or indirectly. For example, a privacy-aware output could obscure a device ID so that what the platform sees is a random string – not identifiable to any individual. Rather than dealing with PII or PI, these systems consume only privacy-aware data. Additional rules can be set to ensure that no user is identifiable, eliminating any element of human error or malicious action.
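One common way to turn a device ID into an unidentifiable string is a keyed (salted) hash. The sketch below assumes this approach; the function name and the example ID are hypothetical. Keeping the salt secret and outside the analytics platform, and rotating it periodically, limits long-term linkability of the resulting pseudonyms.

```python
import hashlib
import hmac
import secrets

# Secret salt held outside the analytics platform. Rotating it breaks
# any long-term linkability of the pseudonyms it produces.
SALT = secrets.token_bytes(32)


def privacy_aware_id(device_id: str, salt: bytes = SALT) -> str:
    """Replace a device ID with a keyed hash: a random-looking string
    the platform cannot reverse to the original identifier."""
    return hmac.new(salt, device_id.encode(), hashlib.sha256).hexdigest()


print(privacy_aware_id("AEBE52E7-03EE-455A-B3C4-E57283966239"))
```

The same device always maps to the same pseudonym under a given salt, so aggregate counting still works, while a different salt yields entirely different strings.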

Maintaining Transparency and Simplicity. Open, honest and friendly communication is key to helping consumers understand that the data they provide is actually used for the purpose for which it is intended. Much like the actions that followed the Credit Card Act of 2009, companies should make the data journey more accessible and describe its complete lifecycle, including the source, the types of interactions and how the data is retained in the platform.

Responsibility Will Rebuild Trust

When it comes to collecting data, there will always be debate about the nature of de-identification and aggregation, with many people still wary that data will be misused. Yet the probability of privacy violations under responsible use should be no greater than with the hundreds of manual, error-prone processes used to gather the same information today. This ongoing wariness reflects just how much consumer trust has dwindled.

Businesses can and should rebuild consumer trust through their own leadership and adoption of responsible use standards. The standards for responsible use of data provide a framework for utilizing the power of data, without compromising fundamental rights of privacy.

About the Authors

Amit Chauhan, SVP & General Counsel, InMobi Group. Amit is responsible for the global legal function, including privacy and data protection for TruFactor and its parent company, InMobi Group. Amit is an attorney with over 18 years of experience working with Indian companies and global corporations. Amit specializes in designing and implementing the legal and compliance functions for technology companies and has deep expertise in solving cross border legal issues, international privacy and data protection requirements as well as navigating government regulators and authorities. Amit is a lawyer licensed to practice in India and a member of the Institute of Company Secretaries of India. Amit has a bachelor’s degree in Science and law degree from the CSJM University, Kanpur.

Meera Mehta, VP Marketing, TruFactor. Meera is responsible for marketing and bringing the vision of TruFactor to life to prospective clients and partners. Meera has 20 years of experience leading sales and marketing teams and bringing SaaS platforms to market. Prior to TruFactor, Meera held strategic roles at Microsoft and Oracle as well as innovative start-ups. Her experience spans CRM, unified communications and collaboration, sales engagement platforms, IoT for smart cities and retail pricing analytics. Meera has a bachelor’s degree in Economics from Stanford University, a master’s degree in International Policy Studies from Stanford University and an MBA from the Sloan School of Management at MIT.

Sign up for the free insideBIGDATA newsletter.
