
Putting a Price on Private Data

People value their private data higher in cash than in digital goods or services.

Private data is perhaps the most commonly traded asset today. While technology platforms claim to provide their services for free, users are in fact exchanging browsing and purchase histories, geolocation, content preferences and other personal, private information for access to social media, search results, streaming platforms or other goods and services. But are consumers being fairly compensated for the value they provide to these companies?

Our recent research article, forthcoming in the Journal of Marketing Research, found that people put a higher price on their private data when they exchange them for cash than when they trade them for goods. As tech companies almost exclusively pay people in goods – such as search results or social networks – our findings suggest that firms may not be sufficiently compensating consumers for their data.

A series of 11 experiments revealed that consumers demand two prices for their data: a higher one in cash and a lower one in goods. We also provide experimental evidence that consumers place a lower value on their private data when bartering for goods because they focus more on the value of the data itself when exchanging them for cash.

A higher price for privacy

In one valuation task, we measured how much cash we would need to offer participants so that they would be willing to give us three hours of their geolocation data. This measure provided us with a direct valuation of their private data. In a second valuation, we had them tell us the number of units of a good (e.g. Amazon movie rentals, months of Netflix streaming, Kindle eBooks or Shell gasoline) they would demand in exchange for their data. In a third valuation, they indicated how much cash they would be willing to accept instead of that quantity of the good – that is, the cash value they ascribed to those units of the good. This measure provided us with an indirect valuation of their private data via goods.

We compared their direct cash valuation of their data to their indirect valuation. As predicted, participants placed a higher value on their private data when considering exchanging them for cash than for goods.
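To make the comparison concrete, here is a stylised illustration using purely hypothetical numbers (they are invented for exposition, not results from our experiments). Suppose a participant demands $60 in cash for three hours of geolocation data, demands 10 movie rentals for the same data, and would accept $40 in place of those 10 rentals. Their indirect valuation of the data via goods then falls below their direct valuation in cash:

```latex
% Hypothetical illustration of the gap between direct and indirect valuations
% (the figures are invented for exposition; they are not results from the study).
\begin{align*}
V_{\text{direct}}   &= \$60 && \text{cash demanded for three hours of geolocation data}\\
V_{\text{indirect}} &= \$40 && \text{cash accepted in place of the 10 rentals demanded for the same data}\\
V_{\text{direct}}   &> V_{\text{indirect}} && \text{the data are priced higher in cash than via goods}
\end{align*}
```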

To establish whether this discrepancy in valuations extended beyond geolocation data, we asked participants in one of our experiments to evaluate ten different types of private data of varying degrees of personal sensitivity, such as ten hours of browsing history, a list of all the apps on their phone, or a sample of their saliva. Again, valuations of private data were higher when measured in cash than in goods, and this effect generalised across data types.

Markets for private data

Consumers’ uncertainty about how to value data may be driven by the lack of well-defined market prices for private data. Therefore, in another experiment, we asked Amazon Mechanical Turk (MTurk) workers to value either their private data or their MTurk labour, for which there is a well-defined wage rate. We again found that participants valued their private data more in cash than in goods, but there was no such discrepancy in how they valued their labour. This suggests that it is the absence of well-defined market prices that brings about the discrepancy in consumers’ valuations of their data.

Next, we directly manipulated uncertainty about the value of private data by informing some participants of a clear market price for their data (US$51.50 for three hours of GPS data). As predicted, the presence of the market price reduced participants’ uncertainty about the value of their data, which led them to value their data equivalently in cash and goods. For the remaining participants, the absence of an established market price meant they were less certain of the value of their data and therefore placed a higher value on their data in cash.

Protecting consumer welfare

Our findings mark a violation of an essential requirement of rational decision-making: procedure invariance. Procedure invariance requires that a rational person’s preference between choice options not depend on the method used to elicit it – in other words, how a valuation is asked for should not change the valuation itself. In all our experiments, participants violated procedure invariance: their valuations of their private data depended on the medium of exchange (cash or goods).
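Stated formally, and in our own notation rather than the paper’s, procedure invariance requires that the cash a person demands for their data directly equals the cash equivalent of the goods they demand for the same data. Our participants systematically violated this equality:

```latex
% Procedure invariance applied to private-data valuations (our own notation, not the paper's).
% WTA_cash(d):        cash demanded to give up data d (direct valuation).
% c(WTA_goods(d)):    cash the person would accept instead of the quantity of goods
%                     they demand for d (indirect valuation via goods).
\begin{align*}
\text{Procedure invariance requires: } & \mathrm{WTA}_{\text{cash}}(d) = c\bigl(\mathrm{WTA}_{\text{goods}}(d)\bigr)\\
\text{Pattern we observed: }           & \mathrm{WTA}_{\text{cash}}(d) > c\bigl(\mathrm{WTA}_{\text{goods}}(d)\bigr)
\end{align*}
```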

The existence of two different prices, depending on the exchange medium, reveals a systematic psychological bias in how consumers value their private data. Our findings suggest that technology platforms may be able to exploit this bias. This raises the question of whether these companies are adequately compensating consumers for their data.

The prevailing practice on technology platforms contrasts with most other markets, in which goods are bought and sold for cash instead of being bartered for other goods. Our findings therefore also point to a possible psychological account of how technology platforms have been able to build their dominant market positions.

To protect consumer welfare and remedy possible inefficiencies in markets for private data, policymakers could explore methods through which consumers could sell, rather than trade, their private data to technology companies. The EU’s General Data Protection Regulation and the California Consumer Privacy Act provide consumers with a degree of control over their data but stop short of granting legal ownership. A potential solution could be to assign explicit property rights to consumers’ private data.

While numerous technical challenges would have to be overcome, enabling consumers to sell or rent their data for cash would force companies to compete for that data at more clearly defined market prices, much as they compete for labour. Doing so would address the inefficiencies that our findings suggest exist in markets for private data. It might also dampen the market power that big tech companies amass from obtaining private data.

When designing strategy around data collection, managers should consider that they may not be compensating consumers fairly. Above all, consumers need to realise that they’re giving away something of economic value when they barter their data. Understanding this might make them rethink whether they’re getting a fair deal. Policymakers and regulators should also aim to shield consumers from this fundamental psychological bias in privacy valuations, both to protect consumer privacy and to ensure healthy competition between technology platforms.

Edited by:

Katy Scott

About the research

"Intransitivity of Consumer Preferences for Privacy" is forthcoming in Journal of Marketing Research.
