
An investigation by The Guardian, a British newspaper, has found that more than half of all top-trending videos offering mental health advice on TikTok contain misinformation.
Therapeutic language is being abused, and the platform is rife with false claims and ungrounded “quick fix” solutions, the study said.
TikTok said it was taking the videos down if they discouraged people from seeking professional medical support or promoted dangerous treatments.
Some of the advice is simply ridiculous. For instance, one video suggests eating an orange in the shower to reduce anxiety. Another presents normal emotional experiences as a sign of borderline personality disorder or abuse.
Mostly, though, the videos peddle supplements such as saffron, magnesium glycinate, and holy basil, despite insufficient evidence that they actually alleviate anxiety.
The Guardian took the top 100 videos posted under the #mentalhealthtips hashtag on TikTok and shared them with mental health professionals who checked the posts for misinformation.
According to the study, 52 out of 100 videos offering advice on dealing with trauma, neurodivergence, anxiety, depression, and severe mental illness contained at least some misinformation, and many others were vague or unhelpful.
Many videos offered general advice based on narrow personal experience and anecdotal evidence, said David Okai, a consultant neuropsychiatrist and researcher in psychological medicine at King’s College London.
The posts reflected how “short-form, attention-grabbing soundbites can sometimes overshadow the more nuanced realities of qualified therapeutic work” on social media, he added.
Experts and politicians interviewed by The Guardian have urged the UK government to protect the public from the spread of misinformation, which, in this case, could directly endanger people's lives.
Tech companies, including TikTok, however, resist any kind of regulation and say they already do a lot to eliminate, or at least minimize, harm.

The company explained to The Guardian that it was taking the videos down if they discouraged people from seeking professional medical support or promoted dangerous treatments. TikTok also says UK users are directed to National Health Service resources.
“TikTok is a place where millions of people express themselves, come to share their authentic mental health journeys, and find a supportive community. There are clear limitations to the methodology of this study, which opposes this free expression and suggests that people should not be allowed to share their own stories,” said a TikTok spokesperson.
However, the UK government can now take action through the Online Safety Act, which requires large online platforms to tackle harmful and misleading material if it is potentially damaging to children.
Ofcom, the UK's communications regulator, has finalized a series of child safety rules that will come into force for social media, search, and gaming apps and websites on July 25, 2025.
The rules will prevent young people from encountering the most harmful content relating to suicide, self-harm, eating disorders, and pornography.