EU's proposal to combat online child abuse would put kids in more danger, expert says
No one argues – nor should they – that fighting online child abuse is vital. But the European Union is choosing a risky direction that would endanger end-to-end encryption, Callum Voge, a tech policy researcher, says.
Voge, a Senior Government Affairs & Advocacy Advisor at the Internet Society, a global nonprofit organization fighting for an open and secure internet, told Cybernews in an interview that the European Commission’s proposal for addressing the proliferation of online child sexual abuse material would actually make everyone, including children, less safe.
How come? According to Voge and many other online privacy advocates, the European Commission’s proposal, if implemented, would threaten encryption by mandating automated searches for child sexual abuse material.
The logic behind the idea might sound perfectly reasonable. Policymakers in Brussels seem to think that encrypted messaging allows for child abuse to take place.
According to the officials behind the idea, eagerly backed by law enforcement agencies keen to conduct surveillance, it would only make sense to require tech companies to comply with the new regulation.
The Tory government in the post-Brexit United Kingdom is also pushing its Online Safety Bill, with the former Home Secretary Priti Patel calling encryption a “betrayal” of children.
But Voge says the proposal would undoubtedly weaken – or even break – end-to-end encryption, which is vital for the security and privacy of every internet user.
Even if the Commission insists this would not be the case, “this is wishful thinking” because once you break encryption, there’s no going back, and all our online data and communications would be exposed not just to governments but to third parties, too, Voge stressed.
You have been warning the wider public about the Commission's proposal to combat child sexual abuse material and its threat to encryption. Why do you think it’s dangerous? You have said it would actually make children less safe. What's the route here?
The intention of this proposal is really a good thing. The objective is to better protect children online from harm, specifically sexual abuse. And so, you know, that's a really good ambition.
The issue is actually more in the approach and how the proposal wants to tackle this. The text states very explicitly that the proposal is compatible with end-to-end encryption, but that's actually not the case, as our analysis has shown.
No technical solutions currently exist that would allow service providers to give their users end-to-end encrypted services while still complying with the detection obligations under the proposal. The latter has a series of detection obligations that are quite stringent, so the providers will be really pressured to compromise encryption to comply with the proposal.
That would mean either creating encryption backdoors or, more likely, introducing client-side scanning. Our view on this is that either of these approaches creates new vulnerabilities that could be exploited by criminals or other actors, including hostile states, and they violate the basic value of encryption.
Especially with client-side scanning, it's quite dangerous because the database can be manipulated and indicators can be used for social engineering attacks, extortion, blackmail, or things like this. So there definitely is a threat.
The main conclusion of the Internet Society was that the proposal would harm the internet by harming encryption. It would make the internet less global, less open, less secure, less trustworthy – which is really concerning from the EU's perspective.
A second point I want to emphasize is that it would really derail the EU’s ambitions for the digital decade and hurt potential future prosperity. The reason for that is, of course, new barriers to innovation. Using encryption in new and creative ways would not be incentivized at all. European companies simply wouldn't go into those areas, and it could really cement the dominance of the existing big foreign providers.
The third really important point is what that means for ordinary people, which is that when you weaken encryption, European internet users are put at risk – and that includes children. That's online scams, cyberattacks, even physical attacks facilitated by the online environment.
Of course, in some scenarios, we could also imagine government overreach, as authorities would have these new tools to monitor and surveil users. We're not alone in voicing those concerns. The European Data Protection Board also published a joint opinion in July.
Just to make it clear, the European Commission wants to access private data and messages and says it’s possible to detect child sexual abuse material online while still enjoying the perks of encryption. But is that possible at all? What actually is client-side scanning?
Client-side scanning is just one of the possible technologies that could be used to satisfy the requirements of the proposal. There could be others, but we just think client-side scanning would be the most likely. And the way that client-side scanning works is that it occurs on the device. So your messages and content would be scanned.
Before being sent out, right?
Yes, before being encrypted and then sent out. Policymakers are saying that this doesn't violate encryption because the scanning is happening before encryption happens. But at the end of the day, our point is that it undermines the value of encryption and the kind of use that this has.
On a technicality, you can say it doesn't violate encryption, but as we said, it violates the spirit.
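To make the mechanism concrete, here is a minimal, purely illustrative sketch of how client-side scanning sits in the message pipeline: the content is checked against a database of known material on the device, on plaintext, before encryption ever happens. The database contents, function names, and the use of an exact SHA-1 digest are all assumptions for illustration – real deployments would use perceptual hashes that survive re-encoding, not exact cryptographic digests.

```python
import hashlib

# Illustrative-only "database" of hashes of known abuse imagery.
# Real systems would use perceptual hashes (robust to resizing and
# re-encoding), not exact cryptographic digests like the SHA-1 used here.
known_hashes = {hashlib.sha1(b"example-known-image").hexdigest()}

def scan_before_encrypt(payload: bytes) -> bool:
    """True if the payload matches the on-device database.
    Crucially, this check runs on plaintext, before end-to-end
    encryption is applied -- the point critics object to."""
    return hashlib.sha1(payload).hexdigest() in known_hashes

def send(payload: bytes, encrypt):
    # Flagged content never reaches the encrypted channel;
    # everything else is encrypted and sent as usual.
    if scan_before_encrypt(payload):
        return None
    return encrypt(payload)
```

The sketch shows why the "encryption is untouched" argument is only a technicality: the ciphertext pipeline is indeed unchanged, but the device inspects every message in the clear first.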
Do you think it’s overly ambitious to actually ask the providers to eliminate 100% of this kind of material online if the price is losing privacy?
Well, policymakers are seeking access to this private data in the name of law enforcement. They seek to share this information with law enforcement. The question is, would law enforcement actually be able to process this mass of data and use it efficiently?
We've heard from some law enforcement agencies that the data that they do have access to already is not all processed because of capacity issues. So if you're kind of sacrificing the privacy and security of our society, will it actually create results? That's a big question.
Also, what's the real goal? Is the goal content moderation, or is it actually about amplification? Because we can see that amplification is a big part of this crime, and the spread of this content is very concerning. You know, maybe something can be done in that space that wouldn't require violating encryption.
Is this proportional? We would argue that it’s not, that there are probably other measures that could be taken which could get results and wouldn't require this kind of violation of a key security tool. You know, it's kind of contradictory that, to protect people, we would take away one of the main security tools that we have online. So that's definitely the question about proportionality.
You mention law enforcement. I think various agencies around the world have been maybe not demanding, but asking for encryption backdoors for years now. I guess it depends on the country. This time, the packaging of the proposal is very clever, isn’t it? When people read about this proposal, they probably think, fair enough, let’s protect the children, right? It didn't work with terrorism before because nobody actually believed the correlation, but child abuse is an effective weapon here, don’t you think?
As you said, child sexual abuse is a very emotional topic. When you're a policymaker who needs to make a decision, it's very hard to go to your constituents and say, I'm not going to support this bill, and then try to explain this technical reason why you don't want to support the bill.
When people hear we're trying to protect children, a lot of people will say, yeah, we should do everything we can, even if we're just saving one child. That’s a very emotional argument.
As for the intention, I can let you be the judge of that. But when we compare it to the United Kingdom, you see a very similar approach.
The UK didn't go directly to child sexual abuse. They had a much wider remit and originally talked about illegal content, as well as legal but harmful. But they didn't define what was what, so it could be bullying or content that encourages self-harm. It was very wide and kind of unclear.
In the UK, most of the government PR supporting their Online Safety Bill has been about child sexual abuse. And I think the reason for that, as you mentioned, was because it's so emotional.
That's why it's really important to emphasize that it's not a linear kind of trade-off, as in sacrificing our privacy to keep children safer. It's not a trade-off. We always want to remind people that children also rely on encryption for protection against different types of harm – cyberattacks, or impersonation of their contacts. A more informed debate is needed.
So if breaking encryption is not the way to go, what other ideas, maybe not threatening anyone’s privacy online, could be worth exploring?
This is challenging. I think this is what policymakers will need to think a lot about because it's a societal issue that maybe needs some societal kind of solution. But from a technical standpoint, it might be a good idea to focus on amplification instead of content moderation.
With amplification, you can improve tools like user reporting so that when users report an image, it can be processed by law enforcement.
One key thing about encryption is that the message is encrypted between the two endpoints. But if one endpoint decides to share that with law enforcement, then that's not a violation of encryption, of course.
Certain privacy organizations might disagree, but there's also metadata that surrounds the sending of a message. For example, WhatsApp or other companies look at certain profiles and see the volume of messages going out. They can see, for example, the profile picture of that account. With that information, they can often make educated guesses about distributor accounts for child sexual abuse.
Law enforcement is already using this data. They can maybe use that more efficiently and work with companies to access that. As I said, that's a bit contentious. Not all privacy organizations agree on the use of metadata, but it's already happening anyway.
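The kind of metadata-based "educated guess" Voge describes could look something like the toy heuristic below. Everything here is hypothetical – the field names, thresholds, and scoring rule are invented for illustration and are not taken from WhatsApp or any real provider's policy; real systems use far richer signals.

```python
from dataclasses import dataclass

@dataclass
class AccountMetadata:
    # Illustrative signals only -- the kind of metadata Voge mentions
    # (outbound volume, reach, account age), not a real provider's schema.
    messages_sent_per_day: int
    distinct_recipients: int
    account_age_days: int

def looks_like_distributor(meta: AccountMetadata) -> bool:
    """Toy rule: very high outbound volume to many recipients from a
    newly created account. The thresholds are invented for illustration."""
    return (
        meta.messages_sent_per_day > 500
        and meta.distinct_recipients > 200
        and meta.account_age_days < 30
    )
```

The point of the sketch is that such signals are derived without reading message contents – which is why metadata analysis is possible even on an end-to-end encrypted service, and also why some privacy organizations still contest it.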
These would be, I think, a couple of very early first steps that should be taken. They would not be as intrusive to our privacy and undermining encryption as the proposal suggests.
How do you think opponents of the proposal are succeeding in raising the issue? It surely is important to have people talking about it and realizing that there could be problems later if the Commission’s proposal becomes law.
There are some quite promising indicators on that front. I already mentioned the statements by the European Data Protection Board. That's one.
In addition, there have been several members of the European Parliament that have spoken out against the proposal. They've used the term “chat control” to talk about what the Commission is proposing. That's quite encouraging.
We've also seen things like the Austrian government issuing a binding decision that they wouldn't support the proposal with the current phrasing that undermines encryption.
I'm also working on the UK profile, and I can say that the diversity of the EU member states creates a lot more space for the debate to happen.
I have noticed that especially in countries that have a more recent history of state surveillance, you know, whether from fascism or communism or other government types, there are a lot more suspicions about giving this kind of new power to the government.
We do have other supporters, such as the Global Encryption Coalition, which the Internet Society is a founding member of. I think we have the support and a lot of space to shape this debate.
Could you remind our readers how the timeline will work in the European Commission? Ylva Johansson, the Commissioner for Home Affairs, has been talking about the proposal being a priority in 2023. Will the effort be successful, do you think?
It's not going to be quick. It's going to be very spread out. I was listening to the priorities of the Swedish EU presidency a few weeks ago, and they didn't mention this proposal explicitly at the press conference that I went to.
I know that child sexual abuse is a big issue in Sweden, but at least I didn't hear them mentioning it as an explicit kind of priority for their presidency.
All in all, the proposal shouldn't be accepted as long as encryption is threatened. Our main point is that action can be taken to protect children – just not in the way that it's proposed.