Study: TikTok uses its algorithms to hurt teens

Have you ever stopped to think about how YouTube, Facebook, or really any social media platform you use always seems to know exactly what to show you? I mean, if you watch videos on YouTube about “Cats Doing Crazy Things,” or enjoy a lecture on the Constitution, once you’re done, bingo: more crazy pet videos and enlightening commentaries are ready and waiting. If someone made a video about James Madison and his cat, that would probably be queued up too.

It’s almost like magic.

But it’s all built into the platform’s programming. It’s an algorithm. Each platform has a similar algorithmic setup that monitors every move you make and quickly calculates what you like and what it should offer you next. They’re all slightly different, though: each one ranks certain signals as higher or lower priorities.

So Facebook (Meta) will put relationships and other Facebook connections at the top of the priority list and feed you your uncle Larry’s cat videos first. Twitter is more concerned with interactions around your favorite topics and the people you follow most. And so on.

Either way, it’s an amazing set of formulas all designed to keep you coming back for more or to simply keep your butt where it is as the hours fly by like seconds.
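To make that idea a little more concrete, here’s a minimal sketch of how a weighted ranking like this could work. It’s purely illustrative: the signal names, weights, and scoring function below are assumptions made up for the example, not anything any platform has actually published.

```python
# Purely illustrative sketch of weighted feed ranking.
# The signals, weights, and scoring are hypothetical, not any
# platform's real algorithm.
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    from_friend: bool   # posted by someone you're connected to
    topic_match: float  # 0..1, how closely it matches your interests
    watch_time: float   # 0..1, how long similar posts kept you watching


# Two made-up "platforms" weighting the same signals differently.
PLATFORM_WEIGHTS = {
    "friends_first": {"from_friend": 3.0, "topic_match": 1.0, "watch_time": 1.0},
    "topics_first":  {"from_friend": 0.5, "topic_match": 3.0, "watch_time": 1.5},
}


def score(post: Post, weights: dict) -> float:
    # Higher score = shown earlier in the feed.
    return (weights["from_friend"] * post.from_friend
            + weights["topic_match"] * post.topic_match
            + weights["watch_time"] * post.watch_time)


def rank_feed(posts: list, platform: str) -> list:
    weights = PLATFORM_WEIGHTS[platform]
    return sorted(posts, key=lambda p: score(p, weights), reverse=True)


posts = [
    Post("Uncle Larry's cat video", from_friend=True, topic_match=0.4, watch_time=0.5),
    Post("Lecture on the Constitution", from_friend=False, topic_match=0.9, watch_time=0.8),
]

for platform in PLATFORM_WEIGHTS:
    top = rank_feed(posts, platform)[0]
    print(platform, "->", top.title)
```

Same posts, same user behavior; only the weights change, and the top of the feed changes with them. That, in cartoon form, is why each platform’s feed “feels” different.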

However, it is starting to become clear that some social media algorithms are much more problematic than others. TikTok, for example, an app that has more than 1 billion monthly active users, may be a lot less cool than its ever-growing user base thinks.

A new study from the Center for Countering Digital Hate (CCDH) found that TikTok’s algorithm can push suicide-related content to children within as little as 2.6 minutes. The report also noted that eating disorder content was recommended to teens within “just 8 minutes.”

How did they come up with that? The CCDH had researchers set up TikTok accounts posing as 13-year-old users (the minimum age for an account) who expressed interest in body image and mental health content. To put the algorithm to the test, each researcher created two such accounts: one for an average or “typical” teen and one for what could be considered a “vulnerable” teen. The first account was given an ordinary girl’s username. The other got a username that signaled concern about body image by including the term “loseweight.”

From there, every account briefly paused on and “liked” videos about body image and mental health. The TikTok algorithm quickly began pushing potentially harmful videos to all of the accounts, but the researchers found that the “loseweight” accounts were served three times more harmful content overall, and 12 times more self-harm and suicide videos, than the standard accounts.
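If you want a sense of how a comparison like that gets tallied, here’s a rough sketch. The counts below are placeholders I made up for illustration; the report itself only publishes the resulting multiples (the 3x and 12x figures quoted above), not these raw numbers.

```python
# Hypothetical tally of what each test account was shown.
# The counts are placeholders, NOT CCDH's actual data; only the
# resulting multiples mirror the figures cited in the report.
from collections import Counter

standard_feed = Counter({"harmful": 10, "self_harm": 1, "other": 89})
vulnerable_feed = Counter({"harmful": 30, "self_harm": 12, "other": 58})


def multiple(category: str) -> float:
    """How many times more of this category the vulnerable account saw."""
    return vulnerable_feed[category] / standard_feed[category]


print(f"harmful content multiple:   {multiple('harmful'):.0f}x")   # 3x
print(f"self-harm content multiple: {multiple('self_harm'):.0f}x") # 12x
```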

Let me be clear: the videos pushed were not just suicide-awareness or prevention videos, but videos that promoted the idea, like one captioned about making everyone think you’re fine so you can try it in private. And again, that kind of content was served up just minutes after the user opened the app.

“TikTok is able to recognize users’ vulnerability and tries to exploit it,” said Imran Ahmed, the CEO of CCDH, in a CBS News report. “It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing our kids’ psychology and adapting to keep them online.”

CCDH’s research went even further than the 2.6-minute figure I quoted above, stating that its findings suggest TikTok “pushes potentially harmful content to users every 39 seconds.”

Let that sink in for a moment. That’s the kind of statistic that should give every user, and every teen’s guardian, pause. And if it doesn’t, maybe it’s time we all checked our own internal algorithms.
