TikTok just made a huge change to its For You feed — but is it enough?

(Image: TikTok on a smartphone. Credit: Shutterstock)

TikTok is planning to adjust the way its individually curated ‘For You’ page picks content, to try to stop users from seeing too much of the same thing. The company believes the kind of content it's concerned about is fine in isolation, but could be damaging when viewed excessively.

The news was first reported in the Wall Street Journal, which had previously published a pretty damning investigation into how the TikTok algorithm learns about its users and supplies suitably engaging content to send them down rabbit holes. 

In that investigation, a bot the paper set up to display depressive tendencies was shown videos about sadness and depression 93% of the time after just 36 minutes of using the app.

In response, the paper claims, TikTok intends to “avoid showing users too much of the same content.”

Sure enough, TikTok then wrote a blog post outlining its planned changes. “We’re testing ways to avoid recommending a series of similar content – such as around extreme dieting or fitness, sadness, or breakups – to protect against viewing too much of a content category that may be fine as a single video but problematic if viewed in clusters,” the company explained.

“We’re also working to recognize if our system may inadvertently be recommending only very limited types of content that, though not violative of our policies, could have a negative effect if that’s the majority of what someone watches, such as content about loneliness or weight loss,” it continued. “Our goal is for each person’s For You feed to feature a breadth of content, creators, and topics.”
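TikTok hasn't described how this works under the hood, but the underlying idea, capping how often any one sensitive category can appear in a short stretch of the feed, can be sketched in a few lines of Python. Everything here (the category labels, the window size, the cap) is an invented assumption for illustration, not TikTok's actual logic:

```python
from collections import deque

# Invented category labels and limits, purely for illustration
SENSITIVE_CATEGORIES = {"extreme_dieting", "sadness", "breakups"}
WINDOW_SIZE = 10      # how many recent recommendations to consider
MAX_PER_WINDOW = 2    # cap per sensitive category within that window

def disperse(ranked_videos):
    """Yield videos in ranked order, deferring any that would let a
    single sensitive category dominate the recent window."""
    recent = deque(maxlen=WINDOW_SIZE)   # categories of recent picks
    deferred = []
    for video in ranked_videos:
        category = video["category"]
        if category in SENSITIVE_CATEGORIES and recent.count(category) >= MAX_PER_WINDOW:
            deferred.append(video)       # hold back rather than drop outright
            continue
        recent.append(category)
        yield video
    # A real system would re-rank deferred videos into later sessions;
    # this sketch simply appends them at the end.
    yield from deferred

feed = [{"id": i, "category": c} for i, c in enumerate(
    ["sadness", "sadness", "sadness", "comedy", "sadness", "pets"])]
print([v["category"] for v in disperse(feed)])
# -> ['sadness', 'sadness', 'comedy', 'pets', 'sadness', 'sadness']
```

Note that the sketch defers videos rather than dropping them: the content itself isn't judged harmful, only its concentration, which is exactly the distinction TikTok is drawing.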

While it's hoped that this will help TikTok fans' wellbeing, algorithms can only go so far. So the company is also working on a feature that will allow users to block content associated with certain words or hashtags from appearing in their ‘For You’ feed.
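TikTok hasn't said how that filter will work either, but conceptually it's a simple blocklist check against a video's caption and hashtags. Here's a minimal sketch, with invented field names and a case-insensitive substring match:

```python
def is_blocked(video: dict, blocked_terms: set[str]) -> bool:
    """Return True if the video's caption or hashtags mention any
    term the user has chosen to block (case-insensitive)."""
    text = " ".join([video["caption"], *video["hashtags"]]).lower()
    return any(term.lower() in text for term in blocked_terms)

videos = [
    {"caption": "My weight loss journey, day 40", "hashtags": ["#weightloss"]},
    {"caption": "Cat discovers the vacuum cleaner", "hashtags": ["#pets"]},
]
blocked = {"weightloss", "weight loss"}
feed = [v for v in videos if not is_blocked(v, blocked)]
print(len(feed))  # 1: only the cat video survives
```

Even this toy version hints at the hard part: substring matching catches “#weightloss” but would miss euphemisms and coded spellings, which is why a user-managed blocklist can complement, but not replace, the algorithmic changes above.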

Growing pains

Social media companies the world over, from YouTube to Facebook, are coming under intense scrutiny for the content they surface, and the impact that seemingly benign algorithms can have on the wellbeing of both individuals and societies. Given that TikTok passed a billion users in September (roughly an eighth of the world's population), it's perhaps unsurprising that it's now attracting similar levels of concern.

The problem sounds like a familiar one. Just as YouTube's engagement algorithm appears to boost watch time by pushing people towards ever more extreme or controversial content, TikTok's success seems to come from building a profile of each user and then delivering the kind of content they crave, whether it's ultimately good for them or not.

The change described here is an interesting one, because the company carefully stops short of calling the content damaging in and of itself. If that were the core argument, then stricter moderation would be required, and bigger questions would invariably have to be asked. This is a quantity issue, not a quality one, in other words.

But ultimately, this could still open a can of worms. After all, if one extreme dieting video is fine but “clusters” are not, then what's the number that can be safely viewed without inflicting damage? Five? Twenty? A hundred?

These are seemingly impossible questions that the company will have to answer eventually: if not directly, then via the mysteriously opaque workings of the algorithm. If it can't answer them satisfactorily, this won't be the last time TikTok ends up in the news for the wrong reasons.

Alan Martin

Freelance contributor Alan has been writing about tech for over a decade, covering phones, drones and everything in between. Previously Deputy Editor of tech site Alphr, his words are found all over the web and in the occasional magazine too. When not weighing up the pros and cons of the latest smartwatch, you'll probably find him tackling his ever-growing games backlog. Or, more likely, playing Spelunky for the millionth time.