Having passed in the House of Lords, the controversial bill is, in theory, set to become law later this year.
The UK's Online Safety Bill has reached its final stages – and is causing just as much controversy as ever. Having already passed in the House of Commons, it's now also worked its way through the House of Lords, where yet more changes have been introduced.
So what does the legislation now cover, and how is it being received by campaigners and the tech industry?
The Online Safety Bill has been years in the making, getting more ambitious all the time. It's aimed, broadly, at making the internet a safe space, particularly for children.
Platforms will be required to remove all illegal content, such as terrorist material and incitement to violence, and protect children from pornography and material relating to suicide and self-harm. Users should be able to filter out content they don't want to see.
They’ll have to publish risk assessments and carry out age verification where appropriate, with fines of up to £18 million or 10% of annual revenue, whichever is higher, if they fail to comply. There are also potential jail sentences for tech executives in the event of a breach.
And the rules will apply not just to the major platforms such as search engines and social media companies, but also to an enormous number of smaller websites hosting user-generated content.
One of the biggest issues with the bill is its sheer scale. Indeed, since 2021, the draft bill has doubled in length. From the initial intention to hold tech giants to account for child sexual abuse and terrorism material, round after round of debate has led to a situation where the bill now covers everything from deepfakes and cyberflashing to fraudulent ads.
All this is to be overseen by regulator Ofcom, which says it expects to be dealing with more than 100,000 services, mostly from overseas. It’s estimating that the costs of all this will hit £169 million by the end of the 2024-25 financial year – although the plan is for the system to become self-financing through the payment of fines.
However, effectively regulating the entire internet with one piece of legislation is clearly a pretty tall order.
From the start, there have been enormous concerns over privacy, with the government insisting that the content of messages and other online activities should be monitored for harmful material, while simultaneously claiming that it isn't trying to eliminate encryption.
And while the contentious concept of monitoring for 'legal but harmful' content has rightly been scrapped, these concerns remain.
Indeed, a number of tech firms have warned against the bill's provisions, with Apple, most recently, saying that it represents 'a serious threat' in terms of surveillance, identity theft, fraud, and data breaches.
Other companies, including Signal and WhatsApp, have indicated that they could pull out of the UK altogether if the plans go ahead, and half a dozen more have called for the government to urgently rethink the legislation.
The latest changes
The Lords considered two amendments aimed at increasing the oversight of Ofcom's powers when it comes to the scanning of private messages.
An opposition amendment proposed requiring a Judicial Commissioner to approve Ofcom’s notices to tech companies ordering them to scan all messages for child sexual abuse material (CSAM).
The commissioner would have been obliged to check that the notice was proportionate, and to follow the same principles as would be applied by a court in a judicial review. And, crucially, an order wouldn't be approved if it meant that encryption would be weakened or removed.
Meanwhile, however, a government amendment, which has now been passed, instead provides for a 'skilled person' to oversee Ofcom's requests: a consultant called in when the regulator is short-staffed or needs help with a particular problem.
This, according to the Open Rights Group, is problematic: "Given that this 'skilled person' could be a political appointee, and they would be overseeing decisions about free speech and privacy rights, this would not be effective oversight," it says.
Either way, the most contentious – and hard to implement – feature of the bill remains: the requirement to scan encrypted messages for illegal content.
Compliance would be likely to involve client-side scanning (CSS) to detect illegal content before it's encrypted – a suggestion that’s already seen serious criticism from security experts as being both unworkable and a threat to privacy and security.
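In broad terms, client-side scanning means checking content on the user's own device, against a database of known illegal material, before it is encrypted and sent. The sketch below is purely illustrative: all the names are invented, and it uses exact cryptographic hashing for simplicity, whereas real deployments typically use perceptual hashing so that near-duplicates also match.

```python
import hashlib

# Stand-in blocklist for illustration only. Real systems hold perceptual
# hashes of known illegal images supplied by bodies such as the IWF or NCMEC.
KNOWN_BAD_HASHES = {hashlib.sha256(b"demo-illegal-content").hexdigest()}

def client_side_scan(message: bytes) -> bool:
    """Return True if the message matches the blocklist.

    Runs on the sender's device, before encryption - which is exactly
    why critics argue it undermines end-to-end encryption guarantees.
    """
    digest = hashlib.sha256(message).hexdigest()
    return digest in KNOWN_BAD_HASHES

def send(message: bytes, encrypt):
    """Scan first, then encrypt and transmit only if the scan passes."""
    if client_side_scan(message):
        raise ValueError("blocked: matched known illegal content")
    return encrypt(message)
```

Even in this toy form, the core objection is visible: the scanning step sees every message in plaintext, so the privacy of the channel depends entirely on what the scanner is configured to look for.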
Indeed, the Open Rights Group has commissioned a legal opinion from Dan Squires KC and Emma Foubister of Matrix Chambers, which has found that the bill may involve breaches of international law.
By placing a duty on online platforms such as Facebook and Twitter to prevent users from encountering certain illegal content, the lawyers argue, the government is exercising 'prior restraint', restricting speech before it is ever published.
However, with the House of Lords now having approved the bill, it could come into law in September as it stands – although this will depend on the government's priorities as it starts to approach the next general election. There's a fair chance the whole thing could be put on the back burner once again.