
Ask a Data Ethicist: Is Consent the Wrong Approach for Modern Data Regulation?

By Katrina Ingram

Asking for consent to collect and use someone's data as the basis for legitimately processing that data is key to our data privacy regulations, and it is also ethical. However, many people agree that consent is broken: given the power imbalances between big tech organizations and the average user, we don't really have meaningful consent. It's not always possible to truly understand long and complex terms and conditions, or to opt out of a digital service that may be a necessary part of life in the 21st century. Some say we need better consent mechanisms, but:

Is consent the wrong approach for modern data regulation? 

This post is inspired by the work of legal scholar Salomé Viljoen, an assistant professor at the University of Michigan with research interests at the intersection of law and the political economy of data. There are links to her work and related resources at the end of the column.

What Is Consent Based On?

In simple terms, we can think of consent as an agreement between two parties. When it comes to data, giving our consent means we are agreeing to the collection, disclosure, use, or processing of our data and entering into a contract whose details are outlined in the terms and conditions (which few people read). Thus, in this context, consent is an element of contract law. But there's more to it.

Consent stems from the idea of an individual’s freedom or autonomy to make choices that involve them, including being able to enter into agreements. Thus, I can’t give consent for someone else unless there are special circumstances (e.g., power of attorney, legal guardianship of a minor). The gold standard of informed, meaningful consent involves a full understanding by both parties of the implications of the agreement as well as being able to act freely with no coercion or duress. I’ll also note there are means other than consent to legally collect and process data, but for the purposes of this column we’ll stick with unpacking consent. 

Consent shifts the ethics of a situation, as this article makes clear:

“Without consent, taking someone’s money is theft: with consent, it’s an investment or a gift. Without consent, entering someone’s home is trespass. With consent, it’s hospitality. Without consent, performing a medical procedure on someone is a ghoulish type of battery. With it, it’s welcome assistance.” (Ethics Explainer: Consent)

The assumption that underpins consent as it relates to data is that I have the right to determine who collects and uses my personal data. Through the agreement, I can effectively control what happens to this data to ensure that its access or usage is aligned with my interests. Those who feel consent is broken might argue for greater control mechanisms or rights for the individual data subject. For example, the GDPR provides for the right to erasure, which gives the data subject greater control over data retention. The CCPA provides for rights around collection and limitations on use.

Others have advocated for more benefits to sweeten the data deal, such as a payment for the use of personal data. W. Russell Neuman, a professor of media technology at NYU, makes this argument. In this Time Magazine opinion piece, he says:

“if this is our information to begin with, why don’t we have the means to not only control it, but to potentially derive value from it? If someone wants complete privacy, they can opt out. But if someone is happy to let marketers know about their market preferences, they could be compensated for providing that valuable information.” (Neuman, 2023)

Both of these reforms – the call for greater data subject rights and the idea of selling our data – revolve around an individualistic perspective where a data subject engages directly with the entity seeking to collect or use the data. It’s still consent based, just with stronger consent mechanisms or a more lucrative agreement. Yet, when it comes to the modern data economy, these approaches, while better for the data subject, miss the relational aspects of data use that transpire between people. It’s this relational piece that informs how organizations derive economic value from data. 

How the Data Economy Works

To illustrate why consent is an inadequate way of protecting people from data harms, Viljoen asks us to consider this scenario:

Let’s say there is a fertility tracking app used by person A – let’s call her Anna. Anna benefits from the use of the app and has consented to the terms of use around data collection. The app sells insights from her data to a data broker. Pregnancy data is particularly valuable because this life event can signal all kinds of new consumer activity. The data broker then sells insights derived from Anna’s data, mixed with new data from other sources, to other clients. One of their clients is an AI-powered HR tool used to screen resumes. 

Person B – Becca – has data that suggests she is very similar to Anna. Becca has never used a fertility app, but she is applying for a job, and the company she applies to is using the AI-powered HR resume screening tool that contains insights from Anna’s data. The system flags Becca as high risk and screens her out of the applicant pool. The company using the resume tool doesn’t really know why Becca is flagged as high risk, just that she was screened out. Becca is harmed not by her own data, but by Anna’s data that is like hers.

I wrote about “data like mine” in the November 2024 column:

Data like mine = the patterns found in data that correlate to my data or data about me in some way. These patterns can be used to make decisions about me. They are inferences – statistical best guesses – based on the data at hand.

Machine learning and predictive analytics are not necessarily concerned with you as an individual, but rather with the probability of how you will behave, given how similar you are to the other people you correlate with based on patterns in the data. 
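To make "data like mine" concrete, here is a minimal sketch of similarity-based scoring, loosely modeling the Anna/Becca scenario. All names, features, and risk labels are made up for illustration; real systems are far more complex, but the core idea is the same: a person can be scored using records from other, similar people who did consent.

```python
# Hypothetical illustration: a nearest-neighbor-style risk score.
# "records" come from consenting users (like Anna); the applicant
# (like Becca) never consented, yet her score is derived from how
# closely her features correlate with those records.

def euclidean(a, b):
    """Distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict_risk(applicant, records, k=3):
    """Average the risk labels of the k most similar records."""
    nearest = sorted(records, key=lambda r: euclidean(applicant, r["features"]))[:k]
    return sum(r["risk"] for r in nearest) / k

# Entirely invented data: patterns resembling Anna's are labeled high risk.
records = [
    {"features": [0.90, 0.80], "risk": 1},
    {"features": [0.85, 0.75], "risk": 1},
    {"features": [0.10, 0.20], "risk": 0},
    {"features": [0.20, 0.10], "risk": 0},
]

becca = [0.88, 0.79]  # correlates with the high-risk records
print(predict_risk(becca, records))  # a high score, from other people's data
```

Becca contributed nothing to `records`, and no consent mechanism available to her touches this computation; the score flows entirely from the relational patterns between her data and everyone else's.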

I Can’t Provide Consent for You

No amount of consent on Anna’s part can account for the harms caused to Becca in this scenario. It would be as ludicrous as saying that Anna could consent to a third party entering Becca’s home. Only Becca (or her approved designate) has the right to make such an agreement. That is the crux of Viljoen’s argument about why consent is the wrong approach for our current data economy. 

Now, imagine this scenario at scale. We are all both Anna and Becca. We are being datafied and then sorted into categories (aka target markets) based not only on our own data but also on the opaque social relationships – the inferences – of how we are similar to other people in data terms. It’s not clear where the data winds up or how it’s used. It might determine whether you get a job, insurance, or access to healthcare. 

How Do We Fix This?

For Viljoen, there is an opportunity to rethink data governance regulations not only to protect people but also to ensure data can be used in socially beneficial ways. Her work goes much deeper into the details and outlines many other issues beyond that of consent. The upshot is this – we need responses that go beyond the individual level:

“The relevant task of data governance is not to reassert individual control over the terms of one’s own datafication (even if this were possible) or to maximize personal gain, as leading legal approaches to data governance seek to do. Instead, the task is to develop the institutional responses necessary to represent (and adjudicate among) the relevant population-level interests at stake in data production.” (Viljoen, 2020)

More Resources

Send Me Your Questions!

I would love to hear about your data dilemmas or AI ethics questions and quandaries. You can send me a note at [email protected] or connect with me on LinkedIn. I will keep all inquiries confidential and remove any potentially sensitive information – so please feel free to keep things high level and anonymous as well. 

This column is not legal advice. The information provided is strictly for educational purposes. AI and data regulation is an evolving area and anyone with specific questions should seek advice from a legal professional.