Europeans won’t be mass surveilled or forced to communicate with weakened encryption just yet. But they may be required to verify their age on porn platforms.
The European Parliament Committee on Civil Liberties, Justice, and Home Affairs dealt a major setback to the EU Home Affairs’ initiative to mass-scan Europeans’ communications for child sexual abuse material and weaken encrypted communications.
In a clear-cut 51-2 vote, it rejected the proposed measures and drafted a new position.
The Committee now urges the adoption of effective measures to fight child sexual abuse online that strike a balance and avoid “mass surveillance” or “generalized monitoring of the internet.”
According to the proposal, court-validated detection orders would be required to track down illegal material when mitigation measures fall short. Politicians agree that authorities fighting crime should receive time-limited orders, as a last resort, to detect and take down child sexual abuse material (CSAM).
They also excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential.
The new proposed rules would still mandate internet providers to assess whether there is a significant risk of their services being misused for online child sexual abuse and to take measures to mitigate these risks.
“MEPs (Members of the European Parliament) want mitigation measures to be targeted, proportionate, and effective, and providers should be able to decide which ones to use. They also want to ensure that pornographic sites have adequate age verification systems, flagging mechanisms for CSAM, and human content moderation to process these reports,” the press release reads.
The EP’s proposed position contrasts with the UK’s recently approved Online Safety Act, which requires tech companies to enforce age limits and age-checking measures, and to quickly scan for and remove illegal content, even when messages are encrypted.
This outcome was expected after a deal struck by the Parliament’s seven political groups at the end of October.
The draft Parliament position still needs to be endorsed by the plenary. The start of negotiations is due to be announced on 20th November.
The European Digital Rights (EDRi) association warned that the European Commission’s original draft law, dubbed ‘Chat Control,’ would lead to “mass scanning of private and encrypted messages across Europe.” The draft has courted controversy and been widely criticized for lacking proportionality. EDRi criticized the proposal as one that “likely violates the essence of the right to privacy.”
“Today’s vote shows the strong political will of the Parliament to remove the most dangerous parts of this law – mass scanning, undermining digital security, and mandating widespread age verification. Parliamentarians have recognized that no matter how important the aim of a law, it must be pursued using only lawful and legitimate measures,” the EDRi press release reads.
Mass scanning, the weakening of end-to-end encryption, age verification, and other risk mitigation measures, all part of the fight against online child sexual abuse material, were the association’s chief concerns.
“<…> in order to search for child sexual abuse material (CSAM), all people’s messages may be scanned (Articles 7-11). Instead, MEPs require that specific suspicion must be required – a similar principle to warrants. This is a vital change which would resolve one of the most notorious parts of the law,” EDRi stated.
The European Parliament’s proposed position states that end-to-end encrypted private message services – like WhatsApp, Signal, or ProtonMail – are not subject to scanning technologies, and those message services should not be weakened in a way that could harm everyone.
EDRi still expressed concerns, remaining skeptical about the chances of a good final outcome for the proposed legislation, and noted that the Commission had attempted “to manipulate the process.”
While MEPs removed mandatory age verification for private message services and app stores, EDRi is disappointed that the draft position makes age verification mandatory for porn platforms.
“We recommend that there should not be mandatory age verification for porn platforms and that risk mitigation measures should oblige providers to achieve a specific outcome rather than creating overly detailed (and sometimes misguided) service design requirements,” EDRi stated.
EU to establish Centre for Child Protection
The law would set up an EU Centre for Child Protection to help implement the new rules and support internet providers in detecting CSAM. It would collect, filter, and distribute CSAM reports to competent national authorities and Europol. Also, the Centre would develop detection tech for providers and maintain a database of hashes and other technical indicators of CSAM.
“To meet this compelling challenge effectively, we have found a legally sound compromise supported by all political groups. It will create uniform rules to fight the sexual abuse of children online, meaning that all providers will have to assess if there is a risk of abuse in their services and mitigate those with tailor-made measures. As a last resort, detection orders can be used to take down abusive material still circulating on the internet. This agreement strikes a balance between protecting children and protecting privacy,” said Javier Zarzalejos, an MEP from Spain.