Meta’s planned Twitter killer, Threads, isn’t publicly available yet, but it already looks like a privacy nightmare.
Information provided through the mandatory privacy disclosures required on iOS shows that the app can collect highly sensitive information about users to profile their digital activity, including health and financial data, precise location, browsing history, contacts, search history and other sensitive information.
Considering that Meta, the developer behind the app and the company formerly known as Facebook, makes its money by tracking and profiling web users so it can sell their attention via its behavioral advertising microtargeting tools, this is hardly surprising. But it does raise questions about whether Threads can launch in the European Union, where the legal basis Meta claimed for processing Facebook users' personal data (performance of a contract) was found to be unlawful early this year.
Meta has since switched to a claim of legitimate interest for this data-for-ads processing. But earlier this week the bloc's top court added to Meta's regional woes with a judgment, on a referral from a German case, in which it said this legal basis is also inappropriate for running Meta's behavioral ads and that consent must be sought. Under current EU law, sensitive information such as health data requires an even higher standard of explicit consent to be legally processed in compliance with the General Data Protection Regulation. So Meta would need to ask for, and obtain, specific consent to process sensitive data such as health information.
On top of that, incoming EU regulations prohibit the use of sensitive data for advertising outright and may require tech giants to obtain explicit consent before combining data for advertising profiling (see: the Digital Services Act and the Digital Markets Act). So there is more regional legal uncertainty on the horizon for Meta's attention-farming business. (Designated gatekeepers must comply with the DMA by next spring, while so-called very large online platforms must comply with obligations under the DSA by August 25.)
Currently, the adtech giant doesn't even give users a general, up-front choice to opt out of its tracking and profiling, let alone explicitly ask whether it can share data about your health condition so advertisers can try to sell you diet pills or whatever. And with even tougher limits on surveillance ads coming to the EU market, an app that proposes to track everything to maximize advertiser appeal will be a hard sell for regional regulators.
And if that wasn't enough, Meta was recently ordered to stop sending EU users' data to the US for processing and was fined nearly $1.3 billion for violating the GDPR's requirements on data exports. That order is specific to Facebook, but in principle the same requirement could be applied to other Meta services that do not adequately protect Europeans' data (for example, by using end-to-end encryption with a zero-knowledge architecture). And it's clear that Threads won't offer users that kind of privacy.
Bringing Meta's surveillance advertising business into line with EU law would require a sweeping change to how it operates, one that doesn't appear to be the plan with Threads, since the app presents more of the same data-grabbing attention farming that won Mark Zuckerberg's empire such a toxic reputation the company had to undergo an expensive rebranding to Meta in recent years.
Whether the rebrand worked to detoxify Meta's corporate image seems debatable, given it has chosen to tie Threads to Instagram's brand rather than explicitly badging it as a Meta app (the developer listed on the App Store is "Instagram Inc" and the text description calls the app "Instagram's text-based conversation app"). That choice may have more to do with Meta seeing it as the best strategy for quickly building a Threads user base: if it can push Instagram's large and engaged community to insta-adopt what it frames as a sister "text" app, the latter can hit the ground running.
One thing is clear: Threads won't be launching in the EU just yet. And possibly never, at least not unless Meta radically reshapes its approach to user choice over tracking.
Yesterday the Irish Independent reported that the app will not launch in the EU, citing Meta's lead regional data protection regulator, the Irish DPC, as saying it had been in contact with Meta about the service, which would not launch "at this time".
While today the Guardian, citing sources inside Meta, reported that the company has postponed an EU launch of Threads owing to legal uncertainty around data usage related to the DMA's aforementioned limits on sharing user data across platforms.
A spokesperson for Meta has not responded to our questions about whether it plans to launch Threads in the EU or not.
But the DPC clarified to businessroundups.org that it has not prevented Meta from launching Threads under its role enforcing GDPR compliance, saying the company has "no plans yet to launch in the EU". So it appears there has been no active regulatory intervention to block a launch at this stage. Rather, Meta seems concerned about the legal risk it could face by going ahead with a launch when it will be subject to the DMA in a few months. (Earlier this week, the company informed the EU that it believes the incoming ex ante antitrust regime applies to its business, but compliance is not required until six months after official EU gatekeeper designations.)
The new regulation will be enforced centrally by the European Commission, rather than by Member State-level authorities such as the Irish DPC. The bloc is thus expected to take a more vigorous approach to enforcement against digital giants, and that paradigm shift also increases the legal uncertainty Meta faces in the EU.
Notably, Threads is launching in the UK on Thursday, where the rules differ since the market is no longer subject to EU law following the Brexit referendum vote to leave the bloc.
The current data protection regime in the UK is still derived from the GDPR, so technically the same legal requirements for processing personal data apply there. However, the ICO, the country's data protection watchdog, has been notoriously inactive over systematic breaches by the surveillance advertising industry. So Meta may feel comfortable with the level of legal risk the company faces in Brexit Britain. And while the UK government recently revived a shelved plan to enact its own ex ante antitrust reform targeting digital giants, it will likely be years before legislation similar to the EU's DMA makes it onto the UK's own statute books.
The UK government has also signaled a plan to water down national data protection standards, under a post-Brexit data reform bill, which also appears to undermine the ICO’s independence and could make the watchdog even more toothless than it already is when it comes to tackling data protection abuses.
Meanwhile, in the EU, Meta was fined more than $410 million in January for lacking a valid legal basis under the GDPR to run behavioral ads on Facebook and Instagram. By contrast, the last time the ICO fined Meta was in the wake of the Cambridge Analytica scandal, when the company was still called Facebook.
Under the DMA, centrally imposed fines can reach as high as 10% of global annual revenue, significantly above the theoretical maximum penalty that can be levied on data controllers for GDPR breaches (which tops out at just 4%).
That said, the fines actually handed to tech giants for breaching EU data protection rules have remained a fraction of that maximum, including in Meta's case.