A move in China towards algorithmic transparency could be disastrous for big tech.
The algorithms that serve us content and dictate our digital lives have never been under more scrutiny. After years of research alleging that they lead us down rabbit holes of ever-more extreme content and push us toward fringe beliefs we would not otherwise hold, they have been blamed for political polarisation and many of the world’s ills.
They’ve also been heavily criticized for their ability to push and pull human emotion with impunity, an issue compounded by the rise of TikTok, the first true social media success story to emerge from outside Silicon Valley’s tight control.
TikTok’s non-US origins have sharpened western lawmakers’ worries about the impact of algorithms. The concern echoes broader unease about the rise of big tech and a belief that the path these companies are leading us down is not one the world wants to follow. The European Union, long one of the most skeptical regulators of technology, has announced plans to regulate AI, including algorithms. “The opacity of many algorithms may create uncertainty and hamper the effective enforcement of the existing legislation on safety and fundamental rights,” it claims.
A rising tide against algorithms
But it’s not just in the west that governments are getting antsy about algorithms. As artificial intelligence grows ever better at knowing humans than they know themselves, worries are mounting about these systems’ power and reach, and about how little is understood of the way they work.
China’s internet watchdog, the Cyberspace Administration of China, recently compelled tech businesses operating in the country to disclose some details of the algorithms they use and how they operate. In all, 30 companies disclosed how the algorithms inside their apps work and how they interact with user behavior.
The disclosures are high-level rather than detailed. Tencent News, for instance, says only that its top-story ranking algorithm is about “ordering selection”, with no further explanation. ByteDance, the parent company of a number of apps including TikTok in the west, uses a “personalized push algorithm” that is “used in the recommendation of images, videos, goods, and services to recommend content that may be of interest to users through users' historical clicks, duration, likes, comments, shares, dislikes, and other behavioral data.” That algorithm powers Douyin, Xigua Video, and Toutiao; TikTok itself is not covered by the disclosure, likely because it does not operate in China.
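To give a sense of what a disclosure at this level of abstraction does and doesn't reveal, here is a deliberately toy sketch of a behavior-weighted recommender. Everything in it — the signal names, the weights, the tag-based matching — is invented for illustration; real systems like ByteDance's are vastly more complex and nothing here reflects their actual design.

```python
# Hypothetical illustration only: a toy "personalized push" scorer.
# Signal names and weights are invented; this is not how any real
# platform's recommendation system is actually built.

BEHAVIOR_WEIGHTS = {
    "click": 1.0,
    "watch_seconds": 0.05,  # duration signal
    "like": 2.0,
    "comment": 3.0,
    "share": 4.0,
    "dislike": -5.0,        # negative feedback pushes content away
}

def interest_score(user_history, item_tags):
    """Sum weighted behavioral signals the user has shown
    toward content sharing this item's tags."""
    score = 0.0
    for tag in item_tags:
        for behavior, count in user_history.get(tag, {}).items():
            score += BEHAVIOR_WEIGHTS.get(behavior, 0.0) * count
    return score

def rank_feed(user_history, candidates):
    """Order candidate items by descending predicted interest."""
    return sorted(candidates,
                  key=lambda item: interest_score(user_history, item["tags"]),
                  reverse=True)

# A user who engages with cat videos and dislikes political content
history = {
    "cats": {"like": 3, "watch_seconds": 120},
    "politics": {"dislike": 2},
}
feed = rank_feed(history, [
    {"id": "v1", "tags": ["cats"]},
    {"id": "v2", "tags": ["politics"]},
])
print([item["id"] for item in feed])  # the cat video ranks first
```

The point of the sketch is that a one-line description like "personalized push algorithm" tells regulators almost nothing: the behavior it produces lives entirely in the weights, the signals, and how they are learned — none of which the disclosures reveal.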
Opening the floodgates
The transparency, however slight, has been welcomed. But it could well open the floodgates for companies that will now be expected to answer further questions, not just from Chinese authorities but from many more regulators worldwide. The reality is that politicians want to dig into the algorithms that shape our public discourse and learn how they work.
For some, the end goal is to tame and limit these algorithms, either to stop the spread of harmful content they fear will divide us, or to bring the platforms to heel and promote the viewpoints they favor. Having given Chinese authorities an inch, tech companies, eventually including TikTok, will likely have to offer regulators elsewhere a mile to keep them happy.
And with that comes another set of problems. Politicians like to pretend they understand everything, even something as complicated as the algorithms that power our apps, systems whose full workings even their own developers admit they cannot entirely explain. That is why this initial transparency could spell disaster for big tech. Having offered even a slight insight into how their algorithms work, companies will be dogged with follow-up questions and may be asked to alter those algorithms by people who do not understand them. It’s a slippery slope, and we’re only at the start of it.