Over the past few years, the social media app TikTok has taken the world by storm, finding its way onto most people’s phones with its quick-to-navigate interface and wide variety of content.
Owned by China’s ByteDance, TikTok has appeared in the media many times, for reasons both positive and negative. Today it is back in the headlines after recent developments in the company’s efforts to tackle the growing problem of harmful content being spread on the app.
TikTok has hired hundreds of content moderators across Europe away from outsourcing companies that serve its rivals in the social media field, such as Facebook.
TikTok has been rapidly expanding its ‘trust and safety hub’ in Dublin, as well as hiring moderators in London, who have been tasked with reviewing material posted by European users. Since January last year, several hundred moderators have been hired by TikTok in the UK and Ireland.
These numbers add to the pre-existing thousands of staff that work for the company in similar hubs in California and Singapore.
According to an FT analysis of public LinkedIn profiles, at least 190 of the moderators who have joined TikTok since January 2021 previously worked for the contracting companies Accenture, Covalen and Cpl.
Meta (Facebook and Instagram’s parent company), YouTube and Twitter are known to rely heavily on these contracting companies to oversee and remove their platforms’ most violent and harmful content.
When Meta chief executive Mark Zuckerberg reviewed his company’s growth earlier this month, he blamed the slowdown on younger users fleeing Facebook and Instagram in favour of TikTok, an announcement that saw more than $220bn wiped off the company’s value in a single day.
However, TikTok’s huge growth brings with it the problem of dealing with the worst excesses of its users, an issue that has put other leading social platforms in the sights of politicians and regulators across the world.
Last month, Facebook revealed that its monthly active users had dropped for the first time, to 1.9bn. TikTok is catching up, with more than 1bn monthly users, bringing it in line with Instagram and above Snapchat, which has more than 500mn.
“Our continuous investment in our trust and safety operations reflects our focus on maintaining TikTok as a place for creativity and entertainment,” said Cormac Keenan, global head of Trust and Safety at TikTok.
This push for moderators meant TikTok’s European workforce rose by more than 1,000 in 2020, when the company’s turnover in the region grew 545 per cent to $170.8mn. But according to UK Companies House filings, pre-tax losses widened fourfold to $644.3mn, “driven primarily by the increase in employees to support the growth of the business [in Europe]”.
TikTok’s strategy has been to offer moderators in-house positions with better salaries and benefits, in order to lure experienced staff away from the limited talent pool it shares with Facebook, Instagram, YouTube and Snapchat.
“I chose TikTok because the benefits are better, the environment is better, and the company values every member,” said one TikTok employee who joined last year from Accenture. “It was better for my career and I wanted to be able to work from home, which was a battle at Accenture.”
According to those with direct knowledge of the hiring process, recruits often speak multiple languages and have experience in content moderation, as TikTok employers find language skills to be a “key consideration for prospective candidates”.
Another content moderator who moved from YouTube to TikTok revealed that the levels of disturbing content in the job were similar, but the psychological support was better at TikTok.
However, Candie Frazier, a former content moderator in California, is suing TikTok, claiming the company failed to protect her mental health after she viewed extreme and violent videos. The company said it does not comment on ongoing litigation but that it continues to expand a range of wellness services to support moderators.
Accenture, Cpl and YouTube did not respond to requests for comment, while Covalen declined to comment.
Facebook, however, previously agreed to pay $52mn to thousands of US moderators who claimed they were left traumatised after watching disturbing content on the platform.
Meta said it offers wellbeing training and support for internal and contracted content moderators, breakout areas where reviewers can step away from their desks if needed, and technology that ensures reviewers are not exposed to potentially graphic content back-to-back for long periods of time.
Written by Jade Andrew
Research by Louis-Daniel Oloyede