AI replacing human reviewers, TikTok begins massive layoffs starting from Malaysia

The social media platform TikTok is set to lay off hundreds of employees globally as the company shifts toward using more artificial intelligence (AI) to review content. Among the regions affected is Malaysia, where social media companies are facing regulatory pressure: as part of its efforts to combat cybercrime, the Malaysian government has mandated that operators apply for operating licenses by the end of this year.

AI Gradually Replacing Human Moderation

According to Reuters, TikTok, owned by China's ByteDance, is cutting 500 jobs in Malaysia, most of them held by employees working in the company's content moderation operations. Sources indicate that more layoffs will follow next month as the company consolidates certain regional operations.

ByteDance has over 110,000 employees across more than 200 cities globally and currently uses a combination of automated detection and human moderation to review content posted on its platforms. A spokesperson said the company plans to invest $2 billion globally this year to strengthen its trust and safety mechanisms and further improve efficiency, noting that 80% of guideline-violating content is already removed through automated technology.

"We are making changes as part of our ongoing efforts to strengthen global content moderation operations," the spokesperson said.

Malaysia Requires Social Media Operators to Apply for Operating Licenses

The Malaysian government announced at the end of July that social media operators with over 8 million users in the country must apply for operating licenses by the end of this year, a measure aimed at combating rising cyber offenses, Reuters reported.

In light of the sharp rise in harmful content on social media, Malaysian regulators have instructed social media companies to respond to the government's concerns about cyber offenses and harmful material found on their platforms. They have urged platforms including Meta, the parent company of Facebook, and the short-video platform TikTok to step up monitoring. At present, the communications regulator can flag content that violates local laws to social media companies, but the decision to remove that content is left to the platforms.