
Just one person remains on Twitter's Asia child safety team, report says, despite Elon Musk saying dealing with child abuse is his biggest priority

A cyclist passes Twitter's HQ in San Francisco. Only one employee remains on Twitter's child safety team in APAC. Justin Sullivan/Getty Images

  • Only one person is left on Twitter's child safety team in Asia, Wired reported.
  • At least four people worked on the team prior to Elon Musk's post-takeover staff cuts.
  • Since the takeover, Musk has said that removing CSAM on Twitter is his number one priority.

Only one Twitter employee is left on a team dedicated to removing child sexual abuse material across Japan and the Asia-Pacific region, Wired reported.

Twitter previously employed at least four people focused on child safety in APAC, Wired found via LinkedIn. The employees were based in Singapore, Twitter's Asia headquarters, and three of them publicly said they left Twitter in November.

Sources told Wired that this left only one full-time member of staff in Asia-Pacific to tackle the massive problem of CSAM on Twitter.

The Asia-Pacific region is home to around 4.3 billion people, roughly 60% of the world's population. Japan is second only to the United States in number of Twitter users, with 59 million, data from Statista shows.


After Musk cut roughly half of the company's 8,000 employees around the world, Twitter's Singapore office was also affected.

Twitter did not immediately respond to a request for comment about its child safety team in Asia-Pacific. 

The shrinking of the team tasked with tackling child sexual abuse material in Asia sits somewhat at odds with Elon Musk's assertion last week that removing such content has been his "Priority #1" since taking over the company.

Insider previously found that the three main hashtags used to sell child sexual exploitation (CSE) content on Twitter had largely been cleared since Musk became CEO. Twitter has also added a direct reporting option for CSE on tweets with images or videos, letting users select "child sexual exploitation" when reporting a tweet.


Twitter has historically struggled to adequately address CSAM.

In September, Twitter sent an email to advertisers explaining that ads had appeared on profiles posting or selling CSAM. Although Twitter banned those accounts, companies including Mazda and Dyson suspended advertising on the platform after data showed their ads had appeared next to CSAM posts.

Twitter's latest transparency report, covering July to December 2021, showed the company removed over 500,000 CSAM accounts, up 31% from the previous six months.
