In his latest book, “Why Privacy Matters” (2022), Neil Richards connects privacy to the social power that human information confers on those who hold it. Because this power is not always channeled in ways that benefit people, protecting the privacy of human information is essential to building a society in which our roles as consumers and workers are safeguarded and in which we, as members of that society, can trust and rely on one another.
Richards also shows that, aware of the destructive power of human information, people are anxious to protect their personal data against exposure. A recent global survey found that 88% of people are concerned about others accessing their data. Over 80% expect governments to regulate data privacy and penalize companies that do not use people’s private information honestly and responsibly. Data privacy weighs so heavily on people’s minds that even those venturing to build an e-commerce business or trade online expect their business or investing courses to cover privacy and the rules governing human information.
The problem is that, despite all the technological advances in data security, regulating privacy is not easy. Technology changes constantly. Data spreads across the internet fast and unpredictably. Worst of all, both the internet’s economy and national security agencies benefit from exploiting customers’ profiles and personal information. Companies, too, often prefer breaching their users’ privacy to constraining their businesses. Given how badly governments and companies want to collect data about users, making privacy regulations effective is challenging.
For many years, the favored solution to this problem has been an approach called “Privacy by Design” (PbD). Yet even though it was conceived to curb the privacy breaches rampant in the digital world, the much-applauded PbD has a chink in its armor that compromises its power. To appreciate its limitations, let us first understand what PbD is and why it has long been heralded as the ultimate protector of people’s personal data.
Coined in the 1990s by Ann Cavoukian, former Information and Privacy Commissioner of Ontario, the term “Privacy by Design” refers to an approach that embeds privacy protections into the IT systems that process personal data. Privacy is built into networked data systems and technologies by default rather than retrofitted later. It should be a requirement for products and services offered to individual customers, such as WiFi routers, search engines, and social networks. Because most people cannot protect their personal information by themselves, providers need to supply basic protection by default and strengthen it with appropriate privacy tools, including encryption, access controls, and user anonymity.
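What “protection by default” means can be sketched in a few lines of code. The following is a hypothetical illustration, not any vendor’s actual API: a settings object whose default values are the most protective ones, so a user who never touches the configuration is still protected.

```python
from dataclasses import dataclass


@dataclass
class AccountPrivacySettings:
    """Hypothetical per-user privacy settings for a consumer service.

    Privacy by Design asks that the most protective values be the
    defaults; users may choose to relax them, but protection should
    never depend on the user taking action first.
    """
    profile_public: bool = False      # profile hidden by default
    activity_shared: bool = False     # activity feed private by default
    encryption_enabled: bool = True   # encryption on by default
    analytics_opt_in: bool = False    # no tracking unless the user opts in


# A user who never opens the settings page still gets full protection.
settings = AccountPrivacySettings()
print(settings.profile_public, settings.encryption_enabled)  # False True
```

The key design choice is that every field’s default is the privacy-preserving value; exposure always requires an explicit, deliberate change.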
In her manual “Privacy by Design: The 7 Foundational Principles” (2006), Cavoukian stresses that PbD does more than ensure security: it aims to minimize the amount of personal data processed in the first place. This is achieved by separating personal identifiers from content data. Cavoukian also suggests that the anonymization or deletion of personal data should happen at the earliest stages of data processing. Addressing privacy early is more straightforward and less expensive, and it can help companies avoid legal complications over their customers’ privacy.
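The idea of separating identifiers from content can be made concrete with a small sketch. This is a simplified illustration of the general pseudonymization technique, under assumed names (`PseudonymVault`, `anonymize`), not code from Cavoukian’s manual: records are stored under random pseudonyms, and the pseudonym-to-identity map is kept apart so that deleting it anonymizes the content outright.

```python
import secrets


class PseudonymVault:
    """Minimal sketch of separating personal identifiers from content data.

    Content records are keyed by random pseudonyms; the mapping from
    pseudonym to real identity lives in a separate store that can be
    access-controlled or deleted to anonymize the data in one step.
    """

    def __init__(self):
        self._identity_map = {}  # pseudonym -> real identifier (kept apart)
        self._records = {}       # pseudonym -> content data only

    def store(self, identifier, content):
        pseudonym = secrets.token_hex(8)  # random, unguessable key
        self._identity_map[pseudonym] = identifier
        self._records[pseudonym] = content
        return pseudonym

    def anonymize(self):
        """Delete the identity map; remaining records carry no identifiers."""
        self._identity_map.clear()


vault = PseudonymVault()
p = vault.store("alice@example.com", {"purchase": "book"})
vault.anonymize()
# The content survives, but it can no longer be linked back to a person.
print(p in vault._records, p in vault._identity_map)  # True False
```

Because anonymization is a single cheap operation here, performing it early, as Cavoukian recommends, costs far less than retrofitting it into a system where identifiers are scattered through the content.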
Here is how the PbD requirements outlined by Cavoukian play out in practice. Suppose a financial institution plans to offer a new money transfer service for customers’ mobile devices. The major privacy concern with payment transfers is the transaction history: who can access it? As a customer, you might not want other users to know who receives your money. If the institution cannot assure you that no one but you can browse your payment history, you will probably turn down its new payment service. By not practicing PbD, financial institutions are bound to lose potential clients.
However, if financial institutions treat privacy as the default and build it into their networked data systems at the start of a new payment project, transaction records will not be shared automatically: your payment history stays hidden from prying eyes by default. The institutions may also cater to customers who do want to share their payment histories by offering sharing as an explicit option in the design, inviting customers to opt in to revealing their histories to others. When financial institutions adopt the PbD principles during the design phase, they are likely to prevent privacy problems from emerging and to build customer-friendly businesses.
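The hidden-by-default ledger with opt-in sharing described above can be sketched as follows. All names here (`PaymentLedger`, `visible_to`) are hypothetical, for illustration only: each transaction is private unless its payer explicitly chose to share it, so privacy requires no action while exposure does.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Transaction:
    payer: str
    payee: str
    amount: float
    shared: bool = False  # hidden by default: privacy is the default setting


@dataclass
class PaymentLedger:
    transactions: List[Transaction] = field(default_factory=list)

    def record(self, payer, payee, amount, share=False):
        """Record a transfer; sharing happens only on explicit opt-in."""
        tx = Transaction(payer, payee, amount, shared=share)
        self.transactions.append(tx)
        return tx

    def visible_to(self, viewer):
        """A transaction is visible to its own payer, and to anyone else
        only if the payer explicitly opted to share it."""
        return [t for t in self.transactions if t.payer == viewer or t.shared]


ledger = PaymentLedger()
ledger.record("alice", "bob", 25.0)                # private by default
ledger.record("alice", "carol", 10.0, share=True)  # explicit opt-in
print(len(ledger.visible_to("dave")))   # 1: only the shared transaction
print(len(ledger.visible_to("alice")))  # 2: alice always sees her own
```

Note that the opt-in lives in the data model itself rather than in a settings page bolted on later, which is the sense in which PbD wants privacy incorporated at the design phase.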
Although promising and effective, PbD faces challenges. Often, the management of financial institutions refuses to become actively involved in dealing with privacy issues. Managers often fail to understand that a successful strategy for their company’s strongest asset, customers’ personal data, requires them to govern that asset. Instead of merely viewing and collecting personal information, financial institutions need to manage it actively by optimizing its strategic use, quality, and availability. Few managers, however, want this additional task on their plate. Not many financial organizations are ready to incorporate the active handling of personal data into their strategic asset management, preferring to delegate privacy issues to lawyers.
In her article “The Challenges of Privacy by Design” (2017), Sarah Spiekermann warns that managers’ reluctance to handle personal data actively is not the only obstacle financial institutions might encounter when adopting the PbD rules. Even staunch supporters of PbD will not arrive at the right strategy without capable engineers in their institutions. A strong IT department is a prerequisite for practicing the PbD foundational principles. Because PbD presupposes privacy technically incorporated into a system’s design, that design must be shaped from the outset to protect people’s private information. Financial institutions thus need to encourage their IT departments to think about customers’ privacy from the inception of any IT project. Engineers must consider privacy problems in the early phases of their projects, while decisions about data processing, transfer, and storage can still be made.
Yet such close cooperation between managers and engineers is not always feasible in financial institutions. Nor is customers’ privacy always made a concern of IT departments. This is an unfortunate state of affairs because only by jointly deciding on technical and governance controls of privacy problems can managers and engineers in financial organizations protect customers’ data effectively, as the PbD principles call them to do.
PbD has indisputable merits. It offers practical and technically sophisticated protective measures. It embodies the idea that systems should be built to minimize the amount of personal information gathered from people and processed. Yet its principles should be binding for everyone involved in financial institutions: the designers, producers, engineers, and data controllers who decide how IT systems are used. Only when their employees collectively apply the PbD principles from the earliest phases of a project can financial institutions guarantee that their customers’ privacy will remain secure.