Generative AI can be life-changing, whether helping to review the intricacies of contracts and legal documents, creating product descriptions for millions of retail SKUs, or anything else we can dream up. The use cases are limitless. Yet no technology poses a bigger threat to data privacy than generative AI. Models are often constructed and refined without data consent as a design principle; information is incorporated while its owners remain blissfully unaware that their search queries, online reviews, and so much more have become part of something far bigger than they could have foreseen, let alone given permission for. Within an enterprise, it remains unclear how much proprietary information is being uploaded to AI systems, or how interactions with an AI model will shape its future versions. And that is to say nothing of the risks posed by the AI built into the third-party vendors you already use.
In an effort to achieve clarity, expect vendors to claim they can help you control AI. Adopt with caution. It is still too early to know the unintended consequences of AI use, and no one can honestly claim to control it. Expect plenty of overpromising, which risks giving companies a false sense of security.
We must teach AI to work with us, not against us. We need frameworks, ethical-use policies, and strong governance to ensure the proper use of AI. We can expect to see new technologies created to address the security and data privacy concerns of an AI world. Imagine consumers getting their own “AI Consent Assistant.” Such a tool would move us from static, one-time consent checkboxes to dynamic, ongoing conversations between consumers and platforms, with the AI Consent Assistant acting as a personal guardian to negotiate on our behalf. Or maybe AI tools could be developed to help security teams predict privacy breaches before they happen, or to proactively auto-redact sensitive information in real time.
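To make the auto-redaction idea concrete, here is a minimal sketch of what real-time redaction could look like. It is purely illustrative: the patterns, labels, and `redact()` helper are assumptions for demonstration, not a production PII engine, which would need far more robust detection than simple regular expressions.

```python
import re

# Illustrative patterns only; a real system would use trained PII detectors.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each detected sensitive span with a labeled placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text

print(redact("Reach me at jane.doe@example.com or 555-867-5309."))
# → Reach me at [REDACTED-EMAIL] or [REDACTED-PHONE].
```

A filter like this could sit in front of an AI prompt pipeline, scrubbing sensitive fields before any data leaves the enterprise boundary.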
We must think differently about AI in relation to data privacy – the future of data is not about how much we collect, but how ethically it is used and how we can realistically safeguard it so that we get the best out of AI without violating data privacy tenets.
With this in mind, even in companies aspiring to do right by consumers, there will be mistakes. Ultimately, 2024 will be a year of mishaps. Expect some thrown elbows as well, as the tech industry tries to muscle its wishes through, influencing potential regulations, and as the government responds based on which direction the dollars flow.
Data privacy: Navigating a young, fragmented market
Although it feels like data privacy is an issue that has been around for years, the reality is that many of the problems within data privacy are still developing as consumers become more aware of how their information is actually being used; data collection, sharing, sales, and retention practices that were once acceptable are no longer so. With the advent of app tracking transparency, the release of the Facebook papers, the overturning of Roe v. Wade, and the privacy fine levied against popular beauty retailer Sephora, many consumers now understand how much data is collected on them and how it can be used against them. As such, they have become more actively engaged in fighting for their right to privacy. Hand in hand with growing consumer concern comes more government regulation.
Given that Gartner predicts that 75% of the world’s population will be protected by data privacy laws by 2024, an increasing number of U.S. citizens will likely gain at least some guaranteed data privacy rights. Today, 12 states have enacted “comprehensive” privacy laws, and many others have tightened regulation over specific sectors. Expect further state laws—and perhaps even a federal privacy law—in coming years.
But what do we do until we get there? We build according to emerging best practices. Data tells our story, and consumers deserve to be the authors, with all the rights and control of that position. This should be the guiding principle and the focus of any business. When viewed from this perspective, and ultimately enforced, pieces fall into place. Data privacy becomes an integral part of product design and development. Transparency should never be a question: no one should have to guess what data is collected, why, how it is stored, or how to remove it. Before launching any new technology or platform, companies should assess the privacy impact, working to identify potential privacy issues and taking preventive measures from the start, as it remains quite difficult to retrofit privacy.
The number of data privacy requests is only going to go up, as is enforcement of the limited regulations in place. Thus far, there has been a lack of regulatory personnel in place dedicated to data privacy, but industry experts expect that will change. In addition, VCs, customers, and public perception will drive privacy compliance.
The future of data privacy is not static, so it is imperative that businesses closely monitor new enforcement actions coming out of California and the EU. These actions will bring clarity to otherwise opaque laws.
In 2024, hopefully CISOs will be freed to communicate true technology risk so that everyone is aware when decisions are made. The shift will require collaboration with other departments, like legal and compliance. This, however, raises new challenges around differences in risk appetites, preferred toolsets (technology vs. governance), and the best ways to communicate. It may not be a year of smooth sailing for everyone’s favorite corporate scapegoat, but if the CISO can weather it, better days lie ahead, both for attracting and retaining much-needed talent in the role and for company health and consumer trust.
The road to a better place starts with a change in culture. Data security and data privacy must become the responsibility of every individual. At the corporate level, this means every member is accountable for preserving data integrity. What might this look like? This could mean that companies are more transparent with more disclosures about security issues, privacy requests, or even the size of a bug bounty program.
The CISO’s role has evolved over time, much as the CFO’s did after the Enron scandal. Before Enron, CFOs didn’t play a critical role at the board level; the scandal served as a catalyst that dramatically increased the scope of the job. Today, CFOs wield influence, act as good stewards of money, and provide governance and tools to manage financial resources. Data is often described as the new digital currency, signifying its importance in business. As such, CISOs now hold more visible roles in organizations, with increased focus on board-level reporting on risk and the secure use of technology.
Organizations might develop data accountability programs, identifying the CISO as the primary decision maker. This step would ensure the CISO is equipped with the necessary resources while upleveling processes. Others are already forming cross-functional risk councils that include legal, compliance, security, and privacy. In these sessions, teams surface and rank the highest-priority risks and figure out how to communicate them most effectively to executives and boards. Let’s make it happen in 2024 so that one person is not solely responsible for the actions of many. Let’s bring everyone onto the same, collaborative team.
Despite massive change, one thing remains the same
The data privacy landscape is transforming in front of us. The points raised above hint at the areas ripe for change and how that change may transpire, but there are many other ways privacy will evolve in 2024 and beyond.
The one thing that remains constant: privacy is a human right.