OpenAI on cell phone screen showing EU data privacy rules

OpenAI Shifts EU Data Privacy Responsibility to Dublin Office

OpenAI will now call Ireland its home, at least for the purposes of European Union regulation. The pioneering “large language model” company opened its Dublin office in 2023, but a recent change to its EU terms of service and an email sent to some ChatGPT users indicate that the company is now formally under the watch of the Irish Data Protection Commission (DPC) with respect to its obligations under EU data privacy regulations.

The move is unsurprising, as many tech companies have opted to set up shop in Dublin. This is due in no small part to a favorable tax situation, but the Irish DPC has also spent the past five years developing a reputation for being lenient in its regulatory penalties and very slow to wrap up investigations.

OpenAI seeks to establish better compliance with EU data privacy regulations after tumultuous 2023

OpenAI’s formal establishment of a European office in Dublin means that EU data privacy regulations are fully in force for data subjects in the bloc, as well as in Switzerland and the non-EU EEA countries that participate in the “single market.” While the Irish DPC will be the lead regulator, OpenAI has said that it will continue to send most user data back to its San Francisco headquarters for processing.

Privacy issues with ChatGPT prompted a series of regulatory moves and inquiries throughout the bloc in 2023, the most severe being Italy’s temporary ban of the service. ChatGPT spent most of April unavailable in the country, until OpenAI got back into the government’s good graces with new privacy controls and a new age-gating system.

OpenAI appears to be making immediate staffing moves to address its EU privacy obligations, advertising for a number of open positions in Dublin such as a privacy program manager and an EMEA general counsel. In June the company announced that it also intends to set up a new office in London, which has become a hotbed for AI development, with more than a thousand companies setting up shop or relocating there in response to an abundance of available venture capital.

AI firms face stronger EU regulations in near future

While Italy took the most drastic immediate action against ChatGPT, it was far from the only country in the bloc to express concern and open investigations into OpenAI’s compliance with EU data privacy regulations. After the Italy ban, the European Data Protection Board (EDPB) set up a task force to monitor the company. The EDPB, along with the European Data Protection Supervisor (EDPS), has been looking to harmonize regulations for AI firms across the region. The two agencies issued a joint opinion on the subject in November that reaffirmed proposed “red line” prohibitions for the technology (such as biometric categorization) and promoted the creation of a new AI Office that would act independently of any member nation to curtail “forum shopping” by companies.

General Data Protection Regulation (GDPR) scrutiny of OpenAI accelerated in August of last year, when a privacy researcher accused the firm of multiple violations of EU data privacy rules. France, Germany, Ireland, Spain and Switzerland all have ongoing investigations into the company, and Italy’s probe continues even though the initial ban has been lifted.

OpenAI and similar companies are also looking directly down the barrel of the EU AI Act, which saw negotiations on provisional rules conclude in early December; it is widely expected to receive final approval from the European Parliament in the near future and to go into force sometime in 2026. The act creates new standards and risk assessments for tools like ChatGPT, and AI-generated content would be saddled with new disclosure and transparency requirements. This would be the world’s first set of comprehensive AI regulations, and other nations may adopt similar laws in turn, just as numerous regions have moved in concert with EU data privacy terms since the GDPR went into effect.

However, the assignment of the Irish DPC as lead regulator for a company now tends to raise questions about exactly how effective that regulation will ultimately be. The agency has led numerous EU data privacy investigations involving the likes of Meta, Google and Microsoft since the GDPR terms went into effect in 2018, but in recent years it has drawn mounting criticism as other national regulators and privacy advocates increasingly see it as an intentional “bottleneck” shielding the tech giants that contribute to Ireland’s economy. This has manifested primarily in very slow investigations (some stretching out for years) and in low proposed fines that other regulators have at times had to challenge. For its part, the Irish DPC says that investigations involving tech giants are complicated and necessarily take a long time. The agency has also at times settled on very large fines, such as last year’s €1.2 billion bill to Meta for systemic international data transfer violations.