Big social media networks, including Facebook, TikTok, and Twitter, have now been designated as very large online platforms (VLOPs) by the European Commission, meaning they will have to meet strict new requirements as early as this summer.
The Commission has published a list of 19 platforms that will have to comply with the rules the European Union’s Digital Services Act (DSA) – the wide-ranging online content law – imposes on companies.
These networks – 17 are VLOPs, and two are very large online search engines – will be required to swiftly remove illegal content, ensure minors aren’t targeted with personalized ads, and limit the spread of disinformation or harmful content like cyberbullying.
The criterion for designation as a VLOP or a very large online search engine is reaching at least 45 million monthly active users in the EU.
Technology should serve people
“Today is the D(SA)-Day for digital regulation. The countdown is starting for 19 very large online platforms and search engines to fully comply with the special obligations that the Digital Services Act imposes on them,” said Thierry Breton, European Commissioner for the Internal Market.
Margrethe Vestager, the Commission’s Executive Vice-President for a Europe Fit for the Digital Age, added: “The whole logic of our rules is to ensure that technology serves people and the societies that we live in – not the other way around. The Digital Services Act will bring about meaningful transparency and accountability of platforms and search engines and give consumers more control over their online life.”
The platforms that had to publish user data by February 17 include eight social media companies, five online marketplaces, two search engines – Bing and Google – and other services such as Google Maps and Wikipedia.
The platforms now have four months to comply with the DSA regulations, the essence of which the Commission says is to empower users.
For example, users will have to be clearly informed about why they are recommended certain content and will be able to opt out of recommendation systems based on profiling. They will also be able to report illegal content easily.
The companies will have to stop displaying ads to users based on sensitive data like religion and political opinions. AI-generated content like manipulated videos and photos, known as deepfakes, will have to be labeled.
Fines could be hefty
Companies will also have to conduct yearly assessments of the risks their platforms pose to a range of issues, including public health, children’s safety, and freedom of expression. They will be required to detail – in writing – how they are tackling such risks, and the first assessment will have to be finalized by August.
External firms will audit the platforms, and the Commission’s enforcement team will have access to their data and algorithms to check what’s happening. If rules are found to be broken, fines could reach 6% of a company’s global annual turnover – and in very serious cases, platforms could face temporary bans in the EU.
Breton said Twitter had already invited his team to visit its headquarters in the US and carry out a stress test. TikTok, under heavy fire in the US, where Congress is mooting a nationwide ban of the platform, has also reportedly asked the Commission for checks in advance.
The Commission, eager to rein in Big Tech, proposed the DSA as a “comprehensive framework to ensure a safer, more fair digital space for all” back in 2020. The pan-European law entered into force in November 2022.
The DSA applies to all digital services that connect consumers to goods, services, or content. It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users' rights online, and places digital platforms under a unique new transparency and accountability framework, the Commission says.