Surveillance Capitalism in the New Data Economy


Do privacy and data ethics stand a chance against today’s data economy that seeks to exploit and profit from personal data at every turn?

In a 1995 interview with Inc. magazine, Kurt Vonnegut conjectured: “The information superhighway will be two lanes with tollgates, and it’s going to tell you what to look for. People will just watch the show.” Today, we’re not far off from Vonnegut’s predicted dystopia. Personal information has never been more of a commodity. We are arguably in a golden age of “Surveillance Capitalism,” in which businesses are built to extract vast profits from personal data. Consumers, as both audience and information source, are captive and disengaged.

Consumers tend to adopt an “ignorance is bliss” mentality when exchanging their sensitive information for services. Meanwhile, gains in processing power and storage have made possible the rise of machine learning and the beginnings of artificial intelligence. What once seemed like pure science fiction (self-driving cars, intelligent homes, an entire world of information in the palm of your hand) is now woven into the digital efficiencies of our everyday analog lives. Our smart devices track our every move, eavesdrop on our conversations, and offer helpful suggestions (and targeted ads). While gaining every convenience, are we also losing every semblance of privacy?

In this golden age of “Surveillance Capitalism,” the momentum of technological and economic opportunity comes at the expense of positive and lasting social impact. While we do not feel doomed to accept our AI overlords, it’s clear that the current state of the information economy is both unhealthy and unsustainable. We hope that the organizations closest to development will change how they harvest consumer information and build ethical approaches to data usage into their products. But really, we don’t know if they will, and we probably won’t know if they do. Sound familiar? It should.

The Truth in Advertising 

Consumers were once part of an anonymous mass market in which brands worked with advertisers to place adverts in publications, on billboards, or within the allotted time slots of early TV. It was simple, easy to understand, and fairly transparent, if a wee bit alcohol-fueled. With the possible exception of that final point, this world is long gone (except in reruns of Mad Men).

The advertising and marketing landscape of today continues to undergo a Kafkaesque transformation. With over 7,000 different “martech” providers spread across more than 49 categories, you would be hard-pressed to find an industry more over-burdened by technology. It is also, perhaps, the industry most frequently associated with abandoning privacy and liberty in the quest for profit. Driven by the overriding incentive to collect as much information as possible and monetize it wherever available, it is leading the charge toward a surveillance-fueled data-pocalypse.

As tech giants like Google and Facebook run up against growing public outrage and consumer fatigue over the loss of personal privacy, the memory of their own outcry at the intelligence-community leaks just a few years back stands as a testament to their hypocrisy. Still, what we’ve built cannot be unbuilt. Our habits, priorities, insecurities, and urges, the digital identities that can easily reveal the facets and subtleties of our lives, have been distributed across databases and flung across the web.

Getting back on track will require a new way of thinking about our privacy, and here ethics has a part to play. Not only are ethics-driven solutions achievable, they may also represent an unrealized revenue opportunity in the market.

Ethics and Data 

Informing technology decisions through an ethical lens begins with consensus, which is key to determining the parameters for ethical data usage. The boundaries between makers, agents, and users have blurred in the digital age, giving rise to the beginnings of a new body of ethics adapted to managing data.

Viewing the information economy through the lens of big data comes with the understanding that constructed, actionable data will only continue to proliferate beyond traditional electronic environments, while the systems through which it flows become more integrated, self-learning, and focused on optimizing for scale and speed. Consequently, big data is increasing the distance between the average individual and control of their data.

Protecting Identity and Preserving Open Commerce  

Enabling a solution that balances interoperability, capability, and functionality with control, management, and consumer consent is no easy task. However, solutions that effectively re-establish a healthy data relationship between consumers and the business community can capture an unrealized opportunity in the market.

Like many of the innovations that spawned the internet, the solutions may not be centralized and may simply be the sum of numerous building blocks. So far, emerging ideas like self-sovereign identity and “Personal Agent” technology fall into that category.

Under this paradigm, with the additional application of blockchain, trust shifts from corporations to an incorruptible network in which individuals control how, and which parts of, their identities are used in exchange for access to services.
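To make the idea more concrete, the sketch below shows, in Python, one way selective disclosure could work under a self-sovereign identity model: the holder keeps raw identity attributes on their own device, publishes only salted hash commitments (the piece that could be anchored to a ledger), and reveals individual attributes, with enough material to verify them, only after consenting to a specific request. The class and function names here are illustrative assumptions for this article, not the API of any real SSI framework; production systems (W3C Verifiable Credentials, DIDs, and the like) rely on digital signatures and richer proof formats.

```python
import hashlib
import secrets
from dataclasses import dataclass, field

def commit(value: str, salt: str) -> str:
    """Salted hash commitment to a single attribute; only the digest is shared publicly."""
    return hashlib.sha256(f"{salt}:{value}".encode()).hexdigest()

@dataclass
class IdentityWallet:
    """Holder-side store: raw attributes and salts stay with the user; only commitments leave."""
    attributes: dict                      # e.g. {"age_over_18": "true", "email": "..."}
    salts: dict = field(default_factory=dict)

    def __post_init__(self):
        # One random salt per attribute so commitments can't be guessed by brute force.
        self.salts = {k: secrets.token_hex(16) for k in self.attributes}

    def public_commitments(self) -> dict:
        # These digests are what might be anchored to a ledger in an SSI scheme.
        return {k: commit(v, self.salts[k]) for k, v in self.attributes.items()}

    def disclose(self, requested: list) -> dict:
        # Consent step: reveal only the attributes the user agrees to share.
        return {k: (self.attributes[k], self.salts[k]) for k in requested if k in self.attributes}

def verify(disclosure: dict, anchored: dict) -> bool:
    """Verifier recomputes commitments for the disclosed attributes and checks the anchored digests."""
    return all(commit(v, salt) == anchored[k] for k, (v, salt) in disclosure.items())

# Example: a service asks only whether the user is over 18; nothing else is revealed.
wallet = IdentityWallet({"age_over_18": "true", "email": "user@example.com", "country": "US"})
anchored = wallet.public_commitments()      # published once, e.g. to a ledger
shared = wallet.disclose(["age_over_18"])   # user consents to share a single attribute
print(verify(shared, anchored))             # True
```

The design point worth noticing is that the verifier never sees the full identity; it checks only the attributes the holder chose to disclose against the previously anchored commitments.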

Of course, the critical element will be adoption. In order for self-sovereign identity, or really any form of digital personal agent, to gain momentum, buy-in is needed from every element across the value chain: consumer, organization, and government. Achieving this will require educating each stakeholder on the importance of data ethics.

Although we study ethical principles, we are not taught how to apply them, nor how to think about ethical implications in the design and development of new technological projects and solutions. Imagine a world in which a product or feature stays offline until the team has resolved its ethical dilemmas: a sort of Hippocratic Oath for upholding ethical practice.

Toward an Ethical Data Future

The impact of “Surveillance Capitalism” and this new data economy will be tremendous, but the outlook for the future is still up in the air. Personally, I think that if we continue down this course, an unregulated and poorly designed variation of surveillance capitalism, we stand to reap the consequences of a fairly inhumane, prescribed reality. But I’m still optimistic. If we take advantage of the brilliant people and institutions that the digital economy has empowered, and train them, and the rest of us, to develop, use, and care for a suite of ethics-driven, nuanced, solutions-oriented products, we could easily wind up the beneficiaries.

Many people are already working on inventions that seek to upend the now-entrenched exchange of personal data for profit. The primary belief behind this type of work is an ethical one: it is wrong for large, powerful forces to profit from the personal data of individuals.

While the merits of that last statement could and should be argued extensively, we can use ethics-driven solutions to change how we treat data, build products differently, and advocate for a more equitable exchange of value. In doing so, we can imagine a future where technology puts people first. I think it’s what Vonnegut would have wanted.

About the Author

Dashiell Pinger is Senior Product Strategy Manager for Data Platforms and Media Solutions at Intertrust Technologies. His work has focused on transformational solutions affecting digital and analog change: data-driven advertising, digital rights management, a patent product to defend startups, and now a trusted data platform for the future of energy.

