The uproar over Apple’s new privacy rules for its devices may be ending rather suddenly and in a whimper, as reporting from the Financial Times indicates the company is backing down and allowing app developers to collect some amount of user-specific information even from users who have opted out.
Part of this retreat appears to stem from the fact that Apple has been unable to stop apps from collecting device-identifying information that developers declare “necessary” for the app to function. Apple’s privacy rules will still block developers from the convenience of using each device’s unique identifying number without permission, but a number of anonymized device-level “signals” now appear to be fair game for ad tracking purposes.
Apple’s battle over privacy rules appears to end with concessions
First rolled out earlier this year with the release of iOS 14.5, Apple’s new App Tracking Transparency framework forbids advertisers from tracking the unique device number (the IDFA) without the user’s permission. This permission must be collected via a pop-up prompt shown in the app itself, and developers are not supposed to circumvent the privacy rules by other means.
Though the practice of “device fingerprinting” (recognizing a device based on unique combinations of its hardware and software characteristics) has been expressly forbidden by Apple’s new privacy rules as well, it has proven to be a hard thing to police. App developers are within their rights to ask for many of the “signals” that can be used in this way as elements that are necessary for the app to operate. Once these pieces of information get to the developer’s server side, it’s difficult for Apple to tell what is being done with them.
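To illustrate why fingerprinting is so hard to police, the sketch below shows how signals that each look individually “necessary” can be hashed into a stable per-device identifier once they reach a server. The signal names and values here are hypothetical stand-ins, not Apple APIs or any developer’s actual scheme:

```python
import hashlib

def fingerprint(signals: dict) -> str:
    """Combine individually innocuous device signals into a stable ID.

    Each value (model, OS version, locale, etc.) is plausible for an app
    to request for legitimate reasons, but the combination is often
    unique enough to recognize the same device across apps and sessions.
    """
    canonical = "|".join(f"{k}={signals[k]}" for k in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical signals an app might declare as "necessary" to function:
device_a = {
    "model": "iPhone13,2",
    "os": "14.6",
    "locale": "en_US",
    "timezone": "America/Chicago",
    "screen": "1170x2532",
}

print(fingerprint(device_a))  # identical signals always yield the same ID
```

Because the hashing happens server-side, nothing in the network traffic distinguishes these requests from ordinary feature checks, which is exactly the enforcement gap the article describes.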
The report from the Financial Times indicates that Apple has essentially surrendered on all but the most obvious forms of illicit device fingerprinting. Statements from tech giants such as Facebook and Snap indicate that these companies are continuing to collect enough user-level signals from devices to keep tracking individuals for the purpose of delivering interest-based ads. Apple seems willing to concede ground in this space so long as the tracking anonymizes the user and does not cross the line into collecting personally identifiable information about them.
Apple’s plan appears to be similar to the “Federated Learning of Cohorts” (FLoC) strategy that Google has proposed for its upcoming Privacy Sandbox project, which pledges to end the use of tracking cookies entirely in the coming years. Information provided by Snap to investors indicates that the company is moving ahead with a similar cohort-based plan for Apple users, adding the devices it tracks to generalized interest groups for ad selection rather than creating individual user profiles. Facebook has said something similar about a multi-year plan to rebuild its ad infrastructure around anonymized and aggregated data.
None of this is an official announcement; the Financial Times report infers it from statements by various third-party sources. But it certainly tracks with the reported difficulties Apple has had in preventing developers from continuing to “fingerprint” users who opt out. Apple publicly stands by its current privacy rules, which center on barring developers from deriving data from a device for the purpose of uniquely identifying it.
Ambuj Kumar, CEO of Fortanix, provides some insight into how Apple’s “reformed” privacy rules may work: “So far, Apple’s strategy seems to be collecting and anonymizing data on phones locally before sharing with any service provider. This approach has some merits but it doesn’t prevent a service provider from piecing together multiple pieces of information and de-masking the user! If you are the only one in your office from your neighborhood, an app can learn your home address and identity just by looking at your location history and publicly available information such as your company’s website or social media. Apple can review iOS apps to ensure they are not trying to reverse engineer the user’s identity. There should be clear guidelines for app developers. For example, apps should not be allowed to combine anonymized user data with external data sources such as social media or data collection by other apps. When there is so much at stake for user privacy, much more care is needed to protect it.”
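Kumar’s office-address example amounts to a linkage attack: joining an “anonymized” dataset with public information on shared quasi-identifiers until only one candidate remains. The toy sketch below illustrates the mechanism with entirely made-up names and data:

```python
# "Anonymized" app-side data: no names, just a token and inferred attributes.
app_data = [
    {"token": "u1", "home_area": "Maple Heights", "workplace": "Acme Corp"},
    {"token": "u2", "home_area": "Riverside", "workplace": "Acme Corp"},
]

# Public information, e.g. from a company website or social media.
public_profiles = [
    {"name": "Alice", "employer": "Acme Corp", "neighborhood": "Maple Heights"},
    {"name": "Bob", "employer": "Acme Corp", "neighborhood": "Downtown"},
]

def reidentify(app_rows, profiles):
    """Link anonymized rows to names when quasi-identifiers match uniquely."""
    found = {}
    for row in app_rows:
        matches = [p["name"] for p in profiles
                   if p["employer"] == row["workplace"]
                   and p["neighborhood"] == row["home_area"]]
        if len(matches) == 1:  # a unique match de-masks the token
            found[row["token"]] = matches[0]
    return found

print(reidentify(app_data, public_profiles))  # {'u1': 'Alice'}
```

Here the token “u1” is exposed because only one Acme Corp profile lives in Maple Heights, which is precisely the single-employee-from-your-neighborhood scenario Kumar describes.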
Personalized ad tracking regroups under the banner of increased personal privacy
If the information from Snap and Facebook is accurate, Apple would appear to have resigned itself to joining the rest of the industry in driving toward an anonymized form of fingerprinting users as a replacement for unpopular (and increasingly regulated) tracking cookies that can vacuum up personal information. But this approach is not without its own privacy concerns, and Apple could be courting a user revolt by pressing toward it so soon after publicly committing to an apparent privacy-first approach.
The advertising industry has been fighting Apple’s new privacy rules since they were announced, none more so than Facebook (which even went so far as to take out several full-page newspaper ads decrying the move’s potential impact on small businesses). Given no similarly profitable alternative, some “free” ad-supported apps were facing the prospect of having to simply give up on Apple devices entirely. Apple thus faced significant pressure from desperate advertisers (who have lost an estimated collective $10 billion since the new privacy rules went into effect), and also from regulators that might see the move as an attempt to make its own first-party personalized ad system the only viable game in its digital ecosystem.
While Apple’s particular approach has yet to be addressed, groups such as the Electronic Frontier Foundation (EFF) have raised privacy concerns about cohort-based models such as FLoC. The group argues that a cohort ID could facilitate browser fingerprinting, making it even easier to identify an individual user by pairing their preference groups with personal information. The system would also not necessarily address potential discrimination, as users could still be grouped into sensitive demographic categories such as religious or political affiliation.