Now that the General Data Protection Regulation (GDPR) has been in effect for almost two years, it’s fair to ask what effects individuals have seen from the increased emphasis on privacy. As lawmakers in the United States work to craft a federal privacy bill, we should take note of what has been working with GDPR implementation, both in terms of the regulation itself and the technologies being developed to assist with compliance.
As we move toward a federal privacy bill in the United States, we should learn from this experience and avoid its underlying flaw: the “notice and choice” model that is central to GDPR (and the California Consumer Privacy Act, or CCPA). “Notice and choice” requires organizations holding personal data to notify people that their information could be used in various ways and to offer them a choice of opting out. In theory, giving individuals control over how their personal data is used sounds empowering; in practice, it proves burdensome and is widely ignored, leaving individuals no better off than they were before the regulation. A better approach is to require the organizations that collect, hold and process personal data to handle it responsibly. This is the approach taken in the model bill that Intel offered up for comment in late 2018 and throughout 2019.
In the meantime, technical tools are emerging to help safeguard privacy. Companies are developing technology to help the many individuals beleaguered by privacy notices. Using artificial intelligence and machine learning, such companies can trace a user’s digital footprint – learning what personal information is out there and precisely where it is. This is nontrivial, as the average user’s personal data is held by about 350 companies.
Once individuals know who holds what data about them, they can take action – agreeing to the use of their data in some cases, or asking for it to be deleted in others. The technology can send a notice to specific sites requesting deletion of the data, making it easy for people to exercise the “right to be forgotten” they have under GDPR, or under the CCPA once enforcement begins in May.
These new data privacy technologies are creating bridges between individuals and companies so that individuals can easily exercise control of their data. On the flip side, companies can learn what data people want to delete under what circumstances, giving them valuable insight about their customers.
We need both technologies and a legal structure to help people navigate the privacy landscape and start taking back control of their personal data. The U.S. needs a strong federal privacy law that tells data brokers and other companies what is and isn’t an acceptable use of personal data, and how they should safeguard the data they do use. The law should not rely on “notice and choice,” which leaves individuals to sort through voluminous notices and act on each one to protect themselves. The law also should be tailored to the U.S., where innovation often depends on access to data – so long as personal data can be protected.
With a comprehensive U.S. federal privacy law in place, we can expect more new technologies and services to develop – including those that help people exercise their data privacy rights and assist organizations in using personal data responsibly. The combination of a strong law that is not based on “notice and choice” and privacy-enabling technologies will give individuals the power they should have without causing them undue burden – ensuring their personal information is used the way they intend.