October 18, 2021

TikTok peddles misinformation to children as young as nine years old

Photo credit: Canva

The popular social media app TikTok has been found to feed misinformation to young children, sometimes within minutes of them signing up. Research found that false information was pushed to children as young as nine, even if they did not follow or search for that content.

NewsGuard conducted a study in August and September, asking children aged nine to 17 from a range of cultural backgrounds to create accounts on TikTok. Although TikTok is supposed to restrict access for users younger than 13, the participants under that age were able to create accounts without outside help.

Alex Cadier, the UK managing director for NewsGuard, said: “TikTok’s failure to stop the spread of dangerous health misinformation on their app is unsustainable bordering on dangerous. Despite claims of taking action against misinformation, the app still allows anti-vaccine content and health hoaxes to spread relatively unimpeded. This is made worse by the fact that the more anti-vaccine content kids interact with, the more anti-vaccine content they’ll be shown. If self-regulation isn’t working for social media platforms, then regulation, like the online safety bill, has to be the way forward to keep young people safe online.”

Published in May, the draft Online Safety Bill imposes a “duty of care” on social media companies, and on other platforms that allow users to share and post material, to remove “harmful content”. This can include content that is legal but still judged to be harmful, such as abuse that does not reach the threshold of criminality, as well as posts that encourage self-harm and misinformation.

Cadier added: “The difficulty in really knowing the scale of this problem is that TikTok hold all the information and get to mark their own homework. They say they’ve taken down 30,000 videos containing Covid-19 misinformation in the first quarter of 2021, which is a good step, but how many are left? Of the ones they deleted, how many views did each get? Who shared them? Where did they spread? Where did they come from? How many users mostly see misinformation when they see Covid-19 related content?”

“TikTok is very bad at removing videos with misinformation, and these videos with vaccine misinformation stay for months and months on the platform. The more viral these videos get, the more eyes will see them, and unfortunately some will be children, due to the nature of the algorithms,” epidemiologist Katrine Wallace of the University of Illinois School of Public Health, who battles misinformation on TikTok, told media outlets.

Besides TikTok, other platforms such as Facebook, Instagram and Twitter have recently come under fire, as increased transparency from the companies has revealed the effects of social media on society, especially on young people.

Meanwhile, high-profile influencers continue to spread misinformation, increasing the amount of harmful content seen by young children and adolescents.

Sources: The Guardian, Business Insider
