A new study published in the data science journal Patterns examined a number of apps that handle sensitive health data, including some that interface with labs and other entities covered by HIPAA privacy regulations, and found that they are sharing health data with third-party trackers that provide cues for targeted Facebook ads.
The exposure of health data to data brokers and advertisers via basic fitness and health apps, such as exercise and pregnancy trackers, has long been known to the public. These entities do not qualify as patient care organizations and are thus not bound by HIPAA requirements (which forbid sharing health data with advertisers) when app users freely hand such information over. This study, however, focused on five apps that are intended for cancer patients and that handle their sensitive medical records.
The findings continue a string of recent health data issues tied to the Facebook ads system, as the company has been linked to both an accidental disclosure involving one of the country’s largest health systems and a website scraping incident that affected 33 patient care sites.
Apps funnel cancer patient health data, sometimes drawn from protected records, to targeted advertising programs
The study found that the five apps it examined were passing health data to a collective 32 “middleware” outfits that track end users across different sites and apps via cookies. These apps were selected because they are frequently used by patients who engage on Facebook and other social media platforms for information and community support related to their condition. In particular, the apps focus on genetic testing that can provide risk projections for different types of cancer, and on health services for those already diagnosed.
These apps may also meet the Federal Trade Commission (FTC) definition of a personal health record (PHR) vendor that receives patient records from a CLIA-certified diagnostic laboratory subject to HIPAA requirements. (PHR vendors are not subject to HIPAA by default, but can be if they receive patient records as a business associate of a health plan or provider.)
The study found that all five of the apps had privacy policies, but three of them indicated that health data was not being shared when it actually was. Two of the apps, Ciitizen and Invitae, do disclose in their privacy policies that health data may be shared with advertisers, and their use of trackers was consistent with those disclosures.
The study’s results specifically affect those who use these apps and also have a Facebook account that they use for communicating or seeking information about their condition. The apps generally ask users to agree to terms of service that allow a tracker embedded in the vendor’s website or app to gather health data; this information is shared with multiple third-party trackers that are also accessed by the Facebook ads system, feeding the “interests” algorithm that determines which ads the user is shown on Facebook. In some cases, the vendors seek to expand the advertising profile by targeting these users with quizzes or surveys designed to collect even more supplementary personal information for ad targeting purposes.
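To illustrate the general mechanism described above, the sketch below simulates how an embedded tracker beacon can leak health details. This is a hypothetical example, not code from any of the apps studied: the endpoint, parameter names, and page URL are all invented. The point is that trackers commonly forward the full page URL along with a cross-site cookie ID, so anything sensitive encoded in that URL leaves the first-party site.

```python
from urllib.parse import urlencode, parse_qs, urlparse

# Hypothetical third-party tracker endpoint; real trackers differ in detail.
TRACKER_ENDPOINT = "https://tracker.example.com/collect"

def build_tracker_beacon(page_url: str, cookie_id: str, event: str) -> str:
    """Serialize page context into a tracker beacon URL.

    The full page URL is forwarded verbatim, so any health detail
    encoded in its path or query string (e.g. a test name or risk
    level) travels to the third party alongside a cookie ID that can
    be matched against the same user on other sites.
    """
    params = {
        "cid": cookie_id,  # cross-site cookie identifier
        "ev": event,       # event name, e.g. a page view
        "dl": page_url,    # "document location", copied as-is
    }
    return f"{TRACKER_ENDPOINT}?{urlencode(params)}"

# A results page whose URL itself encodes sensitive health context.
beacon = build_tracker_beacon(
    "https://health-app.example.com/results?test=brca1&risk=elevated",
    cookie_id="abc123",
    event="pageview",
)

# Decoding the beacon shows the sensitive URL arrives intact.
leaked_url = parse_qs(urlparse(beacon).query)["dl"][0]
print(leaked_url)
```

Running this prints the original results URL, test name and risk level included, demonstrating why a tracker fired from a sensitive page is itself a disclosure even if no form field is explicitly read.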
Facebook ads draw on sensitive data, raising HIPAA questions
Of the five apps that the study examined, Color Genomics is one that is both directly bound by HIPAA terms and operates its own CLIA-certified lab. This app’s privacy notice indicates it does not share or sell health data, and that it asks users to opt in before sharing any other information for marketing purposes. However, the study found that the app interfaces with three cross-site trackers, including one (Nanigans) that shares information with Facebook ads.
The study notes that all of these practices could potentially put the apps in violation of the FTC’s Health Breach Notification Rule, though that rule has never been enforced in the 13 years since it was put in place. However, cases of other digital medicine apps caught freely selling health data to advertisers, such as Flo, set a precedent for potential legal action against apps feeding this type of information into Facebook ads if they are found to be making “deceptive statements” to the end user.
A Facebook spokesperson added that this sort of health data should not be shared with the platform in the first place, as doing so violates Meta’s own company policies. The spokesperson specified that the company has internal filtering tools meant to catch this sort of data before it can be used in Facebook ads, but that they do not always detect all of it.
Chris Olson, CEO of The Media Trust, sees the study as a timely reminder of the scope of this issue as the digital advertising market grows with minimal regulation: “Data privacy violations are one of many risks associated with unsupervised third-party code like ad trackers, content recommendation algorithms, shopping cart plugins, and more. Today, up to 90% of the code across consumer-facing websites is provided by third parties – even privacy-conscious companies are often unaware of their activities which can lead to data breaches, phishing attacks and worse. Complacency is no longer an option – in the face of emerging data privacy legislation and rising cyber risk, organizations need to commit to the digital safety of their customers by taking control of their online domains and carefully vetting third-party vendors for risky activity. This is especially true for companies that collect sensitive and personally identifiable information (PII) like health data.”
Meta involved in other recent questionable health data adventures
While Meta may have policies in place restricting the flow of this sensitive medical data to Facebook ads, the company was recently taken to court for scraping health data from a variety of hospital websites using its Pixel advertising tool. Pixel is embedded on 33 of the country’s top 100 hospital websites and on the patient portals of seven health systems, and is able to track even those who do not have a Facebook account.
Pixel was also recently implicated in an “unintended disclosure” of the health data of some 1.3 million Novant Health patients. A misconfiguration of the tool appears to have exposed private contact and appointment information, as well as demographic details. The exposure window appears to have opened in May 2020, when Novant Health launched an errant ad campaign containing the misconfiguration, and may have run into the following month before being detected and cleaned up.
Yariv Shivek, VP of Product at Neosec, notes that all of this could indicate a bundle of HIPAA-related issues coming down the pike for Meta: “Most of us wouldn’t install a piece of adware on our laptop, and yet it seems that ad trackers are installed on sensitive healthcare websites, giving advertisers visibility into our transactions on these websites. This seems to circumvent HIPAA compliance. You’d hope that security permissions are more orderly in the world of APIs, but while electronic health records (EHR) companies take protecting your sensitive healthcare data seriously, this data is often being insecurely disseminated by 3rd-party aggregators and apps, whose vulnerable APIs can be easily exploited. This connected world of APIs and apps is only as strong as the weakest link. What good is a bank safe, if your courier gets robbed the minute they walk out of the bank with your cash?”