In the world of information technology, the term “privacy” has effectively become a homonym. It’s not just the US trying to define privacy 50 different ways; we see great disparities among EU member states in enforcement and penalties for GDPR compliance violations. Add this past July’s invalidation of the EU–US Privacy Shield framework to the mix, and the task of ensuring organizational compliance around data privacy in the US has truly never been more challenging. How do we move forward when the language, even basic definitions, is often at odds or outright contradictory?
On the first day of the Biden presidency, Christopher Hoff was appointed to a key post at the US Department of Commerce, tasked with overseeing negotiations for a Privacy Shield replacement. Hoff is a data privacy veteran, and this day-one action could be read as a signal that the Biden administration intends to take data privacy concerns to the federal level. While a single US privacy framework (akin to GDPR) seems appealing in contrast to keeping pace with multiple state definitions, the appeal may be surface level at best. Given the challenges in the uneven application and overall enforcement of GDPR, codifying clear privacy problem statements, and identifying the pain points in doing so, may seem less ambitious, but would likely be a far more productive first step.
Take GDPR as a baseline standard: it seeks to provide regulatory guidance on both data security (much of which is recited in article 33) and data privacy (as recited in article 23), yet its enforcement challenges are clear. The vast majority of GDPR compliance violations fall under the category of data breach (security) rather than the data privacy requirements of article 23. GDPR sets no clear legal standard of care for information security, and it offers no guidance on what may constitute non-material damages to an individual whose privacy was violated.
While GDPR is by far the most comprehensive attempt to address data security and data privacy together, tackling both in a single standard has not been without challenges. The lack of clear definitions or guidance on security requirements has led to massively uneven fines across EU member states for data breaches, while civil litigation over privacy infringements has largely settled out of court or been dismissed.
The California Consumer Privacy Act (CCPA) became enforceable in 2020, as did the New York SHIELD Act. The idea of privacy as a homonym becomes evident when comparing the provisions for consumers in these two statutes.
The CCPA focuses squarely on privacy rights for consumers, giving clear categories of PI, standards for privacy notices, and established guidance on how consumers can reach organizations regulated by the law. Consumers must have the ability to “opt out” of the sale of their information under the CCPA, while the law states only that regulated businesses must maintain “reasonable security procedures” to protect personal information. This is less “California’s GDPR” and more “California’s portion of GDPR article 23”; and we should not ignore EU member states’ difficulties in enforcing and litigating pure privacy violations under GDPR.
On the other end of the spectrum, the New York SHIELD Act focuses almost entirely on breach notification, expanding the definitions of both breach and personal information, and it does delve into specifics on the technical, administrative, and physical safeguards necessary for compliance. The SHIELD Act requires continuous monitoring and applies to any organization that stores personal information on New York residents. Moreover, it defines “private information” as a subset of “personal information,” specifying a prescriptive combination of data elements on NY State residents that is protected under the law.
These kaleidoscopic notions of privacy persist because the privacy laws under consideration are being drafted without:
A singular standard which defines personally identifiable information
Clear elements which must be present in any US privacy notification
Uniform rights defined for US citizens around the collection, sale, and distribution of their PII
Uniform access requirements for individuals to modify or remove their PII if they so choose
Defined requirements or guiding principles around AI or algorithmic transparency as it pertains to leveraging PII
Thresholds for a concentration of datapoints (or PII) on any one individual
For too long, data security and data privacy have been uttered in the same breath (and even used interchangeably). The conflation of these two concepts is perilous. Privacy and compliance professionals remember all too well the efforts required to prepare for GDPR compliance. If the definition of PI differs between states, how reasonable is it to accommodate these laws if I have a global business? How mature are user-friendly tools to provide the required access as prescribed by the CCPA? With no clearly defined guidance on security controls under California law, might it be tempting to loosen operations to ensure data access? And in NY State, what are my remedies if my PI is exposed in a breach?
We must start by defining the privacy problems we need to solve, and that begins with agreement on language, standards, and tooling. Ultimately, we cannot measure the success of any solution without clear definitions of terms and an agreed-upon standard of care.