
The Privacy Triangle: Emerging Privacy Expectations and Challenges

With every new decade, researchers grapple with the definition of privacy. As society adopts new technologies (such as sophisticated online tracking) and applications, the definition keeps shifting; at any given moment there may be multiple definitions depending on the context. While some theories postulate that privacy is a form of regulation that enables being private in public, other frameworks treat it as rigid concealment, for example, to protect secrets, leaving little room for a common, unified view of privacy among users.

For greater profit and a better customer experience, organizations have invested heavily in automating their business processes. At the same time, user expectations about how data should be handled have begun to take shape. Regulatory bodies, after consulting a variety of experts such as academics, legal professionals, civil society organizations, and other stakeholders, are working to frame rules and regulations in the interest of users while ensuring that digital innovation is not throttled. What remains is a gap created by three competing views of data: that of the user, the regulator, and the organization. We like to call these views “The Privacy Triangle.”

Issues in data privacy: Is someone really watching and listening to my every move?

Data privacy has become one of the defining social and cultural issues of our era. Our need to control what we hide and what we share extends from individuals to homes, businesses, communities, and governments. Because of this pervasive reach, the data we produce and carry has burrowed into our lives in ways we take for granted. In day-to-day life, privacy issues arise in several familiar areas:

  1. Patient data: although laws like HIPAA and HITECH are in place to protect patients’ privacy, users still have concerns. Wearables collect a multitude of data, and users may ask who owns their personal data and how it will be processed.
  2. Location history: most websites collect users’ IP addresses, and some use device location services to pinpoint location, in some cases tracking background location continuously.
  3. Search and purchase history: the clicks we leave behind on websites are used to build profiles and scores of individuals, which can affect them, often negatively.
  4. Audio systems: perhaps the most invasive of the bunch. Have you ever suspected that your phone was listening to your conversations? Devices with microphones can record and transcribe audio, meaning voice-activated assistants can listen in on private conversations.

This raises the question: how much choice does the user really have left when it comes to privacy?

User expectations: The role of data handling

In an increasingly digitalized world, sharing data is inevitable, and the opportunity to maintain privacy is limited. This is understood both by users who share their data and by the organizations and governments that process it. The real problem lies in how the data is processed. Users want their data to be processed only for lawful purposes, or for those to which they have explicitly consented. Privacy laws enacted by governments have two main objectives: to prevent the misuse of data, and to allow data to flow seamlessly between stakeholders so that the opportunities opened up by digitization can be exploited.

User expectations also vary across demographic, social, geographic, and financial segments of the population. Examples of the expectations users have for data handling include adequate protection of sensitive and personal data, transparency in data processing, and notification of changes in data processing or storage. Adequate protection of sensitive and personal data is a basic requirement for every user: users expect their data to be protected with state-of-the-art technologies. Further, they expect organizations to clearly state the purposes behind processing their data and to categorize those purposes as lawful (mandatory), desirable, or rewarding (monetary). When data processing or storage changes, users expect organizations and regulators to notify them so they are aware of the impact of the new technology and of possible threats to their sensitive and personal data.

Regulatory promise: A story of supply and demand

Since privacy laws take time to catch up with user expectations, it is never easy to build a dynamic mechanism that keeps laws updated as those expectations change. Laws are usually written to prohibit certain actions by the custodians of data. Whatever is not prohibited is generally treated as lawful, and the exploitation of data continues until citizens voice the harms to society at large. That is what spurs lawmakers to form committees to deliberate and enact regulations. As a result, there is almost always a lag between policy and ever-changing user expectations.

Likewise, the rapid evolution of technology adds to the demands from users and organizations and puts heavy pressure on regulators. At the same time, governments want laws that give them more control over public data, which can, in a way, be monetized. Both existing privacy laws and emerging ones tend to be generic, setting out rules that organizations can follow only in a broad sense.

Organizational implementation: What is the real definition?

A privacy policy is a legal document an organization prepares to communicate its privacy intent to all of its stakeholders. Organizations prepare privacy policies for multiple reasons: to comply with privacy laws, to make data processing activities explicit to stakeholders, and to protect themselves from frivolous claims. These policies are living documents, subject to change when privacy law shifts, technology changes pace, or mergers and acquisitions occur.

Typically, a privacy policy deals with the following aspects:

  1. Data collection and processing: the privacy policy makes clear what customer/user data the organization collects and which parts of it are sensitive. It also describes the modes and channels of collection. Lastly, it explicitly states the purposes behind collecting the data.
  2. Data protection practices: privacy policies may detail the protection measures the organization has implemented to safeguard data from attackers.
  3. Data sharing: most organizations state in their privacy policies how collected data may be shared (in some cases) with third parties or associated group companies.

However, it is worth noting that not every user of an organization’s services can access the privacy policy document, and a privacy-conscious user will not necessarily agree to all of the purposes it documents. Overall, a variety of reasons may leave a user’s privacy expectations unfulfilled. One is the well-known privacy-utility tradeoff: organizations focus on maximizing the value extracted from collected data, while users want their data used only for lawful purposes. Ideally, data should be protected according to the usage scenario, although framing laws around usage scenarios is difficult. Much can be improved by taking the FIPPs (Fair Information Practice Principles) and the OECD (Organisation for Economic Co-operation and Development) privacy guidelines as a baseline.

This leads us to the question: should privacy policies be based on governing rules and regulations alone or should these policies be designed to be more user centric?


The evolving definition: The bottom line is proactivity

Data privacy is a promise organizations make in their privacy policy documentation, and it is ever evolving. Although organizations try to adhere to local rules and regulations, protecting customer data with data protection technologies in various forms, grey areas remain where a pragmatic approach encompassing people, processes, and technology can help fill the gaps. Proactivity and consistent assessment of the landscape will keep all parties alert to changes and prepared for the future.

 

Chief Scientist at TCS Research