The F-Word That Really Matters

By Sheila Colclasure

We now exist in a post-privacy world. Our expectations for the proper curation and care of personal data went out the window during the global pandemic, in which Big Tech, Big Pharma, and Big Government repeatedly acted more like Big Brother without significant objection from the public. Internet trolls, deceptive sales practices, and data breaches have become so prevalent that we have lost our sense of outrage, as the previously unthinkable becomes the banal norm. The pace of technological “advancements” has thus far exceeded lawmakers’ ability to build proper guardrails for consumers – and in the process has made everyone a victim.

I submit that if we put down our phones, tune in, and really think about it, we would all be yearning for the F-word. 

No – not that F-word. Fairness.

For years, society mislabeled what it wanted as “data privacy.” As the chief privacy officer for one of the largest data companies in the world, I learned that what consumers want most is better privacy achieved through the ethical use of data. What I’ve noticed shifting over the past several years, however, is that the expectation of privacy goes out the window the second more important desires emerge: the desire for information, entertainment, or escapism; the desire for reward; the desire to be protected from fear … or a virus. The truth is that the need for privacy is elastic – it ebbs and flows as it is weighed against competing wants.

What is needed is much more fundamental. It is data fairness – and the human desire for fairness never changes.

I give my Social Security number to my doctor willingly because it is required to be seen by that doctor when I am sick. I, therefore, deem it a fair exchange. I allow Amazon to ostensibly listen to every aspect of my private life in my home because I have deemed it a fair trade for answers, music, and home automation on demand. I install a telematics device in my car to track my every move and driving habit because I deem it a fair proposition for the possibility of cheaper auto insurance rates. In all of these cases, the operative word is fairness, and the key to that fairness is that I am exercising my personal agency to choose what I do and do not deem fair. As long as that stays in symbiotic balance, life is good, and things are OK.

The principle of data fairness should be a first-order requirement for the procurement and use of personal data.

There is nothing more intimate than our personal data. Through ones and zeros, we disclose a clear tapestry of exactly who we are as individuals – our wants, our desires, our dreams, our shortcomings, our quirks, our curiosities, our fears, our interests, our passions, and our secrets. And while we gladly disclose these virtual breadcrumbs to various entities in exchange for things we deem fair in return, the common thread is that we expect the data to remain protected and that it be used fairly according to our consent for proper purposes.

Unfortunately, the notion of “proper purposes” has become increasingly subjective. Some companies have concluded that whoever controls the data controls the market. Technology companies that previously claimed benevolent platform status now use citizens’ data to “de-platform” those with whom they disagree. And the egregiousness of that act is that it, in essence, turns the de-platformed person into someone who not only no longer exists but never existed at all, as every trace of that person is removed from the platform. Could there be anything more dehumanizing? For the companies doing this, the notions of humanity and fairness have been completely distorted, if not altogether lost.

Before the Digital Age, there was a sacred and fragile nature to the relationship between a proprietor and a customer. A smart shopkeeper would profile their customers much as companies do today – but through relationships, trust, and observation, and with proper intent.

For data fairness to exist and ultimately prevail, I’d like to offer three concrete requirements for ethical companies to consider:

  1. Design data fairness into data collection and use: From the beginning, ensure that the proper calibration of data use is interwoven into audience design. The more sensitive the data, the higher the calibration (and guardrails). 
  2. Protect and serve: Be the custodian/guardian/steward of others’ private data and ensure all is being done to establish and/or maintain fairness in how that data is being used. Use data for the good of each person whose data is being used. If something is not for their good, don’t do it. 
  3. Stay human: In a world where artificial intelligence and machine learning receive a nearly infinite stream of inputs from all manner of machines and devices in the Internet of Things (IoT), humanity can easily get lost in the data. Forgetting that every byte of data relates to an actual human being who deserves respect and dignity creates a slippery slope toward data-use practices that are deceptive and manipulative. 

I would like to challenge all companies that collect and/or use personal data to apply the F-word wherever possible: use fairness as your guide. If a use of data will not be interpreted as fair by a person, that use should never be employed. It is only through maintaining and upholding the social contract of fairness that we will be able to navigate the increasingly opaque ethical quagmire of a digital-first, IoT reality.

Data fairness is the answer.
