It turns out that the EU’s push to completely abolish digital privacy might not actually be an altruistic move to save children from abuse. Several tech companies, including one headed by Ashton Kutcher and Demi Moore, stand to profit substantially from the legislation. Which might explain why they massively influenced it.
Who Benefits from Chat Control?
Balkan Insight has published a very interesting story digging into the lobbyists involved in the EU’s Chat Control law, which I discussed in episode 158. Hilariously, this story was in part funded by the European Commission.
The story focuses on Ylva Johansson, the Swedish politician currently serving as Home Affairs Commissioner, and the people who met with her and the rest of the team who drafted the Chat Control legislation. Johansson herself maintains that the law is necessary to protect children. This is not surprising, as WON’T SOMEONE THINK OF THE CHILDREN!?! has been the slogan of those who started, and continue to fight, the Crypto Wars all along.
Johansson, however, has not blinked. “The privacy advocates sound very loud,” the commissioner said in a speech in November 2021. “But someone must also speak for the children.”
In reality, however, it might be that the law is also being pushed because some tech companies stand to gain massively from its implementation. Which is why they have influenced policy in this direction.
The proposed regulation is excessively “influenced by companies pretending to be NGOs but acting more like tech companies”, said Arda Gerkens, former director of Europe’s oldest hotline for reporting online CSAM. “Groups like Thorn use everything they can to put this legislation forward, not just because they feel that this is the way forward to combat child sexual abuse, but also because they have a commercial interest in doing so.”
These companies include Thorn, a venture headed by Ashton Kutcher and his ex-wife Demi Moore.
Star of That ’70s Show and a host of Hollywood hits, 45-year-old Kutcher resigned as chairman of the Thorn board in mid-September amid uproar over a letter he wrote to a judge in support of convicted rapist and fellow That ’70s Show actor Danny Masterson, prior to his sentencing [for two counts of rape]. Up until that moment, however, Kutcher had for years been the very recognisable face of a campaign to rid the Internet of CSAM, a role that involved considerable access to the top brass in Brussels.
Of course, good old Zensursula is involved, too.
In November 2020, it was the turn of Commission President Ursula von der Leyen, who was part of a video conference with Kutcher and an organisation registered in the small Dutch town of Lisse – the WeProtect Global Alliance.
Though registered in the EU lobby database as a charity, Thorn sells its AI tools on the market for a profit; since 2018, the US Department of Homeland Security, for example, has purchased software licences from Thorn for a total of $4.3 million.
Of course, corrupt people who like to take money from other corrupt people tend to find each other.
In November 2022, Kutcher and Johansson lined up as key speakers at a summit organised and moderated by then European Parliament Vice President Eva Kaili, who three weeks later was arrested and deposed over an investigation into the ‘Qatargate’ cash-for-lobbying scandal.
In March this year, six months before his resignation amid uproar over his letter of support for Masterson, Kutcher addressed lawmakers in Brussels, seeking to appease concerns about the possible misuse and shortcomings of the existing technology. Technology can scan for suspicious material without violating privacy, he said, a claim that the European Digital Rights association said was “deeply misleading”.
No, it’s a fucking lie is what that is.
The Commission has been reluctant to detail the relationship between Thorn and Johansson’s cabinet under the EU’s freedom of information mechanism. It refused to disclose Thorn CEO Julie Cordua’s emailed response to Johansson’s May 2022 letter or a ‘policy one pager’ Thorn had shared with her cabinet, citing Thorn’s position that “the disclosure of the information contained therein would undermine the organisation’s commercial interest”.
After seven months of communication concerning access to documents and the intervention of the European Ombudsman, in early September the Commission finally released a series of email exchanges between Johansson’s Directorate-General for Migration and Home Affairs and Thorn.
The emails reveal a continuous and close working relationship between the two sides in the months following the rollout of the CSAM proposal, with the Commission repeatedly facilitating Thorn’s access to crucial decision-making venues attended by ministers and representatives of EU member states. The European Ombudsman is looking into the Commission’s refusal to grant access to a host of other internal documents pertaining to Johansson’s proposal.
FGS Global, a major lobbying firm hired by Thorn and paid at least 600,000 euros in 2022 alone, said Thorn would not comment for this story. Johansson also did not respond to an interview request.
There are other lobbyists involved. In this modern thicket of NGOs, companies and government officials, it’s almost impossible to determine where government ends and business starts. This reminds me very much of the censorship-industrial complex.
WeProtect is the offspring of two governmental initiatives – one co-founded by the Commission and the United States, the other by Britain. They merged in 2016 and, in April 2020, as momentum built for legislation to tackle CSAM with client-side scanning technology, WeProtect was transformed from a British government-funded entity into a putatively independent ‘foundation’ registered at a residential address in Lisse, on the Dutch North Sea coast. Its membership includes powerful security agencies, a host of governments, Big Tech managers, NGOs, and one of Johansson’s most senior cabinet officials, Antonio Labrador Jimenez, who heads the Commission’s team tasked with fighting CSAM.
Labrador Jimenez officially joined the WeProtect Policy Board in July 2020, after the Commission decided to join and fund it as “the central organisation for coordinating and streamlining global efforts and regulatory improvements” in the fight against CSAM. WeProtect public documents, however, show Labrador Jimenez participating in WeProtect board meetings in December 2019.
Commenting on this story, the Commission said Labrador Jimenez “does not receive any kind of compensation for his participation in the WeProtect Global Alliance Management Board, and performs this function as part of his duties at the Commission”. Labrador Jimenez’s position on the WeProtect Board, however, raises questions about how the Commission uses its participation in the organisation to promote Johansson’s proposal.
Labrador Jimenez has also played a central role in drafting and promoting Johansson’s regulation, the same proposal that WeProtect is actively campaigning for with EU funding. And next to him on the board sits Thorn’s Julie Cordua, as well as government officials from the US and Britain [the latter currently pursuing its own Online Safety Bill], Interpol, and United Arab Emirates colonel Dana Humaid Al Marzouqi, who chairs or participates in numerous international police task forces. Between 2020 and 2023, Johansson’s Directorate-General awarded almost 1 million euros to WeProtect to organise the June 2022 summit in Brussels, which was dedicated to the fight against CSAM and activities to enhance law enforcement collaboration.
So I guess what they are saying is that there’s no conflict of interest if you don’t get paid? It’s no problem at all when politicians are also on the board of a lobbying organisation?
The Brave Movement, a survivor-led advocacy group, is another player in this lobbying effort. Its internal advocacy documents lay out a comprehensive strategy for utilising the voices of abuse survivors to leverage support for Johansson’s proposal in European capitals and, most importantly, within the European Parliament, while targeting prominent critics.
“The main objective of the Brave Movement mobilisation around this proposed legislation is to see it passed and implemented throughout the EU,” it states.
“Once the EU Survivors taskforce is established and we are clear on the mobilised survivors, we will establish a list pairing responsible survivors with MEPs – we will ‘divide and conquer’ the MEPs by deploying in priority survivors from MEPs’ countries of origin,” its advocacy strategy reads. According to the Brave Movement strategy, Conservative Spanish MEP Javier Zarzalejos, the lead negotiator on the issue in the parliament, has called for “strong survivors’ mobilisation in key countries like Germany”.
How convinced can these people really be of their supposed good cause, if they can’t win politicians and critics over simply with good arguments, but instead need to exert inordinate amounts of political pressure to force the issue?
Brave Movement’s links with the Directorate-General for Migration and Home Affairs go deeper still: its Europe campaign manager, Jessica Airey, worked on communications for the Directorate-General between October 2022 and February 2023, promoting Johansson’s regulation.
According to her LinkedIn profile, Airey worked “closely with the policy team who developed the [child sexual abuse imagery] legislation in D.4 [where Labrador Jimenez works] and partners like Thorn”. She also “worked horizontally with MEPs, WeProtect Global Alliance, EPCAT”.
Asked about a possible conflict of interest in Airey’s work for Brave Movement on the same legislative file, the European Commission responded that Airey was appointed as a trainee and so no formal permission was required.
Again. If you’re not paid, it can’t be corruption, right? Give me a break!
Brave Movement has enlisted expert support: its advocacy strategy was drafted by UK consultancy firm Future Advocacy, while its ‘toolkit’, which aims to “build a beating drum of support for comprehensive legislation that protects children” in the EU, was drafted with the involvement of Purpose, a consultancy whose European branch is controlled by French Capgemini SE. Purpose specialises in designing campaigns for UN agencies and global companies, using “public mobilisation and storytelling” to “shift policies and change public narratives”.
Since April 2022, Purpose representatives have met regularly with ECLAG – the network of civil society groups and lobbyists – to refine a pan-European communications strategy. Documents seen by this investigation also show they met with members of Johansson’s team.
Classic propaganda is what that is.
‘Offlimits’, previously known as the Online Child Abuse Expertise Agency, or EOKM, is Europe’s oldest hotline for children and adults wanting to report abuse, whether happening behind closed doors or seen on video circulating online. Arda Gerkens, Offlimits director between 2015 and September this year, is deeply knowledgeable about EU policy on the matter. Yet unlike Thorn, she had little luck getting access to Johansson.
“Commissioner Johansson and her staff visited Silicon Valley and big North American companies,” she said. Companies presenting themselves as NGOs but acting more like tech companies have influenced Johansson’s regulation, Gerkens said, arguing that Thorn and groups like it “have a commercial interest”.
Gerkens said that the fight against child abuse must be deeply improved and involve an all-encompassing approach that addresses welfare, education, and the need to protect the privacy of children, along with a “multi-stakeholder approach with the internet sector”.
Maybe this is the most important point of the story, even beyond the allegations of corrupt lobbying. It gets to the heart of why laws like this are a bad idea. There is obviously a significant societal problem here, and we need to open our eyes to it: study the causes of this behaviour and find a holistic solution that goes far beyond just sniffing out and locking up criminals. We need to find out what makes these people tick and how we can help them.
But instead, we are closing our eyes to what is actually going on. Instead of having to grapple with the actual darkness in the human soul, we simply declare the offenders to be monsters and look away. And we are prepared to treat the entirety of the general population as potential criminals to do so.
It’s a position reflected in some of the concerns raised by the Dutch in ongoing negotiations on a compromise text at the EU Council, arguing in favour of a less intrusive approach that protects encrypted communication and addresses only material already identified and designated as CSAM by monitoring groups and authorities.
A Dutch government official, speaking on condition of anonymity, said: “The Netherlands has serious concerns with regard to the current proposals to detect unknown CSAM and address grooming, as current technologies lead to a high number of false positives.”
“The resulting infringement of fundamental rights is not proportionate.”
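To get a feel for what “a high number of false positives” means at the scale of all EU communications, here is a back-of-the-envelope sketch. All the numbers in it (daily message volume, error rates, prevalence) are illustrative assumptions I picked for the exercise, not figures from the story:

```python
# Back-of-the-envelope: false positives from scanning every message in the EU.
# Every number below is an illustrative assumption, not an official figure.

messages_per_day = 10_000_000_000  # assumed EU-wide daily message volume
false_positive_rate = 0.001        # assumed 0.1% FPR (optimistic for AI detection)
true_positive_rate = 0.9           # assumed 90% detection rate
prevalence = 0.000001              # assumed fraction of messages that are abusive

abusive = messages_per_day * prevalence
innocent = messages_per_day - abusive

true_flags = abusive * true_positive_rate
false_flags = innocent * false_positive_rate

# Bayes' rule: probability that a flagged message is actually abusive.
precision = true_flags / (true_flags + false_flags)

print(f"Flags per day:          {true_flags + false_flags:,.0f}")
print(f"False flags per day:    {false_flags:,.0f}")
print(f"Chance a flag is right: {precision:.2%}")
```

Even with these generous assumptions, the scanner produces roughly ten million false flags per day, and fewer than one in a thousand flags points at actual abuse. That is the base-rate problem the Dutch are worried about.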
Now let’s get to the point of why companies (or NGOs or whatever) like Thorn actually want this legislation to happen:
In June 2022, shortly after the rollout of Johansson’s proposal, Thorn representatives sat down with one of the commissioner’s cabinet staff, Monika Maglione. An internal report of the meeting, obtained for this investigation, notes that Thorn was interested to understand how “bottlenecks in the process that goes from risk assessment to detection order” would be dealt with. Detection orders are a crucial component of the procedure set out within Johansson’s proposed regulation, determining the number of people to be surveilled and how often.
European Parliament sources say that in technical meetings, Zarzalejos, the rapporteur on the proposal, has argued in favour of detection orders that do not necessarily focus on individuals or groups of suspects, but are calibrated to allow scanning for suspicious content. This, experts say, would unlock the door to the general monitoring of EU citizens, otherwise known as mass surveillance.
In the same meeting with Maglione, Thorn representatives expressed a “willingness to collaborate closely with COM [European Commission] and provide expertise whenever useful, in particular with respect to the creation of the database of indicators to be hosted by the EU Centre” as well as to prepare “communication material on online child sexual abuse”.
The EU Centre to Prevent and Combat Child Sexual Abuse, which would be created under Johansson’s proposal, would play a key role in helping member states and companies implement the legislation; it would also vet and approve scanning technologies, as well as purchase and offer them to small and medium companies.
As a producer of such scanning technologies, a role for Thorn in supporting the capacity building of the EU Centre database would be of significant commercial interest to the company.
Meredith Whittaker, president of Signal Foundation, the US not-for-profit foundation behind the Signal encrypted chat application, says that AI companies that produce scanning systems, sensing the market potential, are effectively promoting themselves as clearing houses and a liability buffer for big tech companies.
“The more they frame this as a huge problem in the public discourse and to regulators, the more they incentivise large tech companies to outsource their dealing of the problems to them,” Whittaker said in an interview for this story.
“So it’s very clear that whatever their incorporation status is, that they are self-interested in promoting child exploitation as a problem that happens “online,” and then proposing quick (and profitable) technical solutions as a remedy to what is in reality a deep social and cultural problem. (…) I don’t think governments understand just how expensive and fallible these systems are, that we’re not looking at a one-time cost. We’re looking at hundreds of millions of dollars indefinitely due to the scale that this is being proposed at.”
I’ve talked about why this whole idea is a bad one a lot, going back to the early days of the podcast, but just in case you need even more arguments:
Matthew Daniel Green, a cryptographer and security technologist at Johns Hopkins University, said there was an evident lack of scientific input into the crafting of Johansson’s regulation. “In the first impact assessment of the EU Commission there was almost no outside scientific input and that’s really amazing since Europe has a terrific scientific infrastructure, with the top researchers in cryptography and computer security all over the world,” Green said.
“The idea that we are going to be able to have encrypted conversations like ours is totally incompatible with these scanning automated systems, and that’s by design.”
In a blow to the advocates of AI-driven CSAM scanning, US tech giant Apple said in late August that it is impossible to implement CSAM-scanning while preserving the privacy and security of digital communications. The same month, UK officials privately admitted to tech companies that there is no existing technology able to scan end-to-end encrypted messages without undermining users’ privacy.
But, of course, it gets worse. As I had predicted. The law isn’t even passed yet and they are already discussing how to broaden the scope of this blanket suspicion they want to place every single citizen under.
In July 2022, the head of Johansson’s Directorate-General, Monique Pariat, visited Europol to discuss the contribution the EU police agency could make to the fight against CSAM, in a meeting attended by Europol executive director Catherine de Bolle.
Europol officials floated the idea of using the proposed EU Centre to scan for more than just CSAM, telling the Commission, “There are other crime areas that would benefit from detection”. According to the minutes, a Commission official “signalled understanding for the additional wishes” but “flagged the need to be realistic in terms of what could be expected, given the many sensitivities around the proposal.”
Ylva Johansson Replies
Let’s see what the Commissioner at the centre of this whole disaster said when asked by the European Parliament’s Civil Liberties Committee (LIBE) to address the Balkan Insight story:
The media article you refer to, published in various versions by different media, is an attempt to misrepresent the normal consultative work of the Commission. In this case consultation with tech companies or with survivors of child sexual abuse.
I wonder if instead, it’s accurately representing “the normal consultative work of the Commission” and we’ve simply learned that this process is horrible.
Let the text of the proposal speak for itself. One of its key pillars is technological neutrality. In other words, the proposal does not incentivise or disincentivise the use of any given technology, leaving to the providers the choice of the technologies to be operated to comply effectively with the obligations of the proposal, provided they meet the high standards set in the proposal and in EU law more generally. Technologies to detect online child sexual abuse have existed for years, and they are certainly not the monopoly of a single organisation.
This is just empty speech. It doesn’t deny what the story alleged. A classic non-denial denial.
The answer to the question ‘Who benefits’ from my proposal is: children. And who benefits from its rejection? Abusers who can continue their crimes undetected and possibly big tech companies and messaging services who do not want to be regulated.
I find the last bit there curious. What does this law have to do with regulation of big tech companies? Especially since all of them are already mass-scanning for CSAM? The only change she is proposing seems to be one that affects private citizens.
I am confident that the Honourable Members of the Parliament, those in the LIBE Committee, will judge the regulation on its merits, will continue to ensure the balance and the respect of all the fundamental rights at stake, regardless of sensationalist media, and will continue ensuring that children’s rights are treated equally to other fundamental rights.
This seems to be a classic example of Hitler’s big lie theory. The big lie in this case being the assertion that the right to privacy (which LIBE is concerned about) is being given merit here; that it is weighed against other considerations. Which is patently bullshit because this regulation proposes the end of all digital privacy. That is absolutely true and people, including the press, need to start putting it this clearly for the public to understand this matter. There simply is no privacy in the digital realm without end-to-end encryption. And there is no encryption if you upload everything people want to encrypt to the cloud beforehand.
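To make that last point concrete: this is what client-side scanning looks like in principle. The sketch below is a toy model I wrote for illustration only (the XOR “cipher” and the function names are made up, not any real protocol), but the structure is the point: the scanner sees the plaintext before encryption ever happens, so the encryption no longer protects you from whoever controls the scanner.

```python
# Toy illustration (not a real protocol): why scanning content *before*
# encrypting it defeats the point of end-to-end encryption.

def encrypt(plaintext: str, key: int) -> bytes:
    # Stand-in for real E2E encryption (toy XOR cipher, illustration only).
    return bytes(b ^ key for b in plaintext.encode())

def client_side_scan(plaintext: str, blocklist: set[str]) -> bool:
    # The scanner runs on your device, on the *plaintext*, pre-encryption.
    return any(term in plaintext for term in blocklist)

def report_to_authority(plaintext: str) -> None:
    # A match ships your plaintext out of the "encrypted" channel.
    print(f"REPORTED: {plaintext!r}")

def send_message(plaintext: str, key: int, blocklist: set[str]) -> bytes:
    if client_side_scan(plaintext, blocklist):
        # The content has now left the E2E channel, key or no key.
        report_to_authority(plaintext)
    return encrypt(plaintext, key)

ciphertext = send_message("totally private message", key=42,
                          blocklist={"private"})
# The transport is still encrypted, but the scanner already saw (and
# reported) the plaintext. Whoever controls the blocklist decides what leaks.
```

Note that nothing in this flow depends on what the blocklist contains today. Swap “CSAM indicators” for any other list of content and the same machinery reports it, which is exactly the scope creep Europol was already floating.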
A nice touch is also the cheap attack against the press she put in there. Where would we get to in this world if journalists actually did their job? WON’T SOMEONE THINK OF THE CHILDREN!?!
- “Who Benefits?” Inside the EU’s Fight over Scanning for Child Sex Content, Balkan Insight
- Press release by MEP Patrick Breyer
- Johansson letter
First and foremost, I would like to thank everybody who provided feedback on this or previous episodes. You are very important to the continued success of this podcast!
This podcast is provided free of charge and free of obligations under the value-for-value model. However, as a freelance journalist volunteering my time to produce this show, I need your support. If you like my work and want to make sure The Private Citizen keeps going, please consider joining my Patreon.
- Sir Galteran
- Jaroslav Lichtblau
avis, Bennett Piater, Dave, ikn, Jackie Plage, Jonathan M. Hethey, krunkle, Michael Mullan-Jensen, Tobias Weber
Andrew Davidson, astralc, Barry Williams, Cam, Captain Egghead, Dirk Dede, Fadi Mansour, Florian Pigorsch, Joe Poser, MrAmish, RJ Tracey, Robert Forster
D, Jonathan, Juhan Sonin, Kai Siers, RikyM, Steve Hoos, Vlad
Thanks to Bytemark, who are providing the hosting and bandwidth for this episode’s audio file.
The show’s theme song is Acoustic Routes by Raúl Cabezalí, licensed via Jamendo Music. This episode’s ending song is Inner Stress by Autohacker, licensed via Epidemic Sound.
Podcast cover art photo by GegenWind.