With every new decade, researchers grapple with the definition of privacy. As society adopts new technologies (such as sophisticated online tracking) and applications, the definition keeps shifting; at any given moment there may be multiple definitions depending on the context. While some theories posit that privacy is a process of regulation that enables being private in public, other frameworks treat it as a rigid form of concealment, for example, to protect secrets, leaving little room for a common, unified view of privacy among users.
For greater profit and a better customer experience, organizations have invested heavily in automating their business processes. At the same time, user expectations about how data should be handled have begun to take shape. Regulatory bodies, after consulting with a variety of experts such as academics, legal professionals, civil society organizations, and other stakeholders, are working to frame rules and regulations in the interest of users while ensuring that digital innovation is not throttled. What this means is that a gap remains among three competing views of data: that of the user, the regulator, and the organization. We like to call these views “The Privacy Triangle.”
Issues in data privacy: Is someone really watching and listening to my every move?
Data privacy has become one of the defining social and cultural issues of our era. The need to control what we hide and what we share extends from individuals to homes, businesses, communities, and governments. Because of this pervasive nature, the data we produce and carry has burrowed into our lives in ways we take for granted. In day-to-day life, privacy issues arise around patient data, location history, search and purchase history, and audio systems. Regarding patient data: although laws like HIPAA and HITECH are in place to protect patients’ privacy, users still have concerns. Wearables collect vast amounts of data, and users may ask who owns their personal data and how it will be processed. Another bulk source of data is location history. Most websites collect users’ IP addresses, and some use device location services to pinpoint location, sometimes tracking background location continuously. Similarly, the clicks we leave behind on websites, including our search and purchase history, are used to build profiles and scores of individuals that can affect them, often negatively. Lastly, and perhaps most invasive of all: have you ever suspected that your phone was listening to your conversations? Devices with microphones can record and transcribe audio, meaning voice-activated assistants can pick up private conversations. This raises the question: how much choice about privacy does the user really have left?
User expectations: The role of data handling
In an increasingly digitized world, sharing data is inevitable, and opportunities to maintain privacy are limited. This is understood both by users who share their data and by the organizations and governments that process it. The real problem lies in how data is processed. Users want their data processed only for lawful purposes, or for those to which they have explicitly consented. Privacy laws enacted by governments have two main objectives: to prevent misuse of data and to allow data to flow seamlessly between stakeholders so they can exploit the opportunities created by digitization.
User expectations also vary across demographic, social, geographic, and financial segments of the population. Examples of privacy expectations users have for data handling include adequate protection of sensitive and personal data, transparency in data processing, and notification when data processing or storage changes. Adequate protection of sensitive and personal data is a basic requirement for every user; users expect their data to be protected with state-of-the-art technologies. Further, they expect organizations to clearly state the purposes behind processing their data and to categorize those purposes as lawful (mandatory), desirable, or rewarding (monetary). When data processing or storage changes, users expect organizations and regulators to notify them so they are aware of the impact of the new technology and of possible threats to their sensitive and personal data.
Regulatory promise: A story of supply and demand
Since privacy laws take time to catch up with user expectations, it is never easy to create a dynamic mechanism that keeps laws updated as those expectations change. Usually, laws are created to prohibit certain actions by the custodians of data. Whatever is not prohibited is normally considered lawful, and the exploitation of data continues until citizens voice its harms to society at large. That is what spurs lawmakers to form committees to deliberate and enact regulations. As a result, policy creation almost always lags behind ever-changing user expectations.
Likewise, technology’s rapid evolution adds to the demands from users and organizations and puts heavy pressure on regulators. At the same time, governments want to create laws that give them more control over public data, which in a way can be monetized. Existing privacy laws, and the new ones emerging, can be seen as generic, allowing organizations to follow the rules only in a broad sense.
Organizational implementation: What is the real definition?
Privacy policies typically address areas such as the following. Data protection practices: privacy policies may detail the various protection measures an organization implements to safeguard data from attackers.
Data sharing: in their privacy policies, most organizations describe how the collected data might, in some cases, be shared with third parties or associated group companies.
This leads us to the question: should privacy policies be based on governing rules and regulations alone, or should they be designed to be more user-centric?
The evolving definition: The bottom line is proactivity