Data Shows That X Has Significantly Fewer Moderation Staff Than Other Platforms

Does X now have a lot fewer moderators than other apps, following its cull of around 80% of its total staff in 2022?

We don’t have full insight into the staffing of each app, but X has publicly leaned on its “Community Notes” crowd-sourced fact-checking program as a means to supplement its reduced moderation workforce, which it frames as a better solution in many respects.

But how much has that workforce actually reduced, and how does it compare to other apps?

The latest E.U. transparency reports provide some insight.

Under the E.U. Digital Services Act (D.S.A.), all large online platforms are required to regularly report their E.U. user and moderation staff counts, in order to provide more transparency into their operations.

Over the last week, all of the major social apps have shared their latest reports, which provide a comparison between the total users and moderation staff for each.

Based on these reports, X has the worst ratio of moderation staff to users, at one moderator per 60,249 users, with LinkedIn coming in second (1/41,652), then TikTok (1/22,586) and Meta (1/17,600).

Though there are some provisos here.

Meta, for example, reports that it has 15,000 content reviewers working across both IG and Facebook, each of which has around 260 million E.U. users. In that sense, Meta’s users-per-moderator figure could arguably be doubled, though even then, it would still be better than X’s and LinkedIn’s.
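The “doubling” argument works out as follows, a quick sketch using the figures cited above (15,000 reviewers shared across two apps of roughly 260 million E.U. users each):

```python
# Meta's reviewer-to-user ratio, per app vs. both apps combined.
# Figures are those cited in the article, not independently verified.
reviewers = 15_000                 # content reviewers shared by Facebook and IG
eu_users_per_app = 260_000_000     # approximate E.U. users on each app

per_app_ratio = eu_users_per_app / reviewers        # ≈ 17,333 users per reviewer
combined_ratio = (2 * eu_users_per_app) / reviewers # ≈ 34,667 users per reviewer

print(f"1 reviewer per {per_app_ratio:,.0f} users (single app)")
print(f"1 reviewer per {combined_ratio:,.0f} users (both apps combined)")
```

Even with the combined count (roughly 1/34,667), Meta still sits ahead of LinkedIn (1/41,652) and X (1/60,249).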

X’s total user count also includes logged-out guests, which the others seemingly don’t count. Though guests on Facebook, LinkedIn, and IG can’t see as much content, so that’s probably not a major factor in this context.

It’s also not entirely clear how many moderators are assigned to the E.U. specifically by each platform.

In TikTok’s report, for example, it states that:

“TikTok has 6,287 people dedicated to the moderation of content in the European Union.”

Which clearly states that TikTok has this many staff servicing its E.U. user base. Yet the descriptions from Meta and X are less clear.

Meta says that:

“The team working on safety and security is made up of around 40,000 people. About 15,000 of those are content reviewers; they include a mixture of full-time employees, contractors, and outsourced support. We partner with companies to help with content review, which allows us to scale globally with coverage across time zones, languages, and markets. For content that requires specific language review in the EU, there are dedicated teams of reviewers that perform content moderation activities specifically for that content.”

That aligns with what Meta has reported elsewhere as its global moderation team, which services both IG and Facebook (and presumably Threads as well these days). That changes the calculation significantly, while X also notes that the 1,849 moderators it has listed “are not specifically designated to only work on EU matters”.

Yet, even factoring this in, X still trails the others.

X has 550 million total monthly active users, and if its entire moderation workforce is only 1,849 people, that’s a ratio of one human moderator for every 297,458 users. Even if you count all of Meta’s 3 billion users, its human moderator-to-user ratio is still 1/200,000, and that’s not accounting for the other 25,000 people Meta has assigned to safety and security.
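The global comparison above is simple division; here it is worked through with the figures cited in the article:

```python
# Back-of-the-envelope global users-per-moderator ratios,
# using the headline figures cited in the article.
x_users, x_moderators = 550_000_000, 1_849
meta_users, meta_reviewers = 3_000_000_000, 15_000

x_ratio = x_users / x_moderators          # ≈ 297,458 users per moderator
meta_ratio = meta_users / meta_reviewers  # = 200,000 users per reviewer

print(f"X:    1 moderator per {x_ratio:,.0f} users")
print(f"Meta: 1 reviewer per {meta_ratio:,.0f} users")
```

So even under the most generous reading, X’s worst-case ratio is roughly 50% higher than Meta’s.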

On balance, then, X does have far fewer humans moderating content. X hasn’t really made a secret of that, but it would presumably also impact its capacity to detect and action violative content.

That aligns with third-party reports that more rule-breaking content is now visible on X, which could point to a potential weakness of Community Notes as an enforcement mechanism. Various online safety experts have argued that Community Notes is not an adequate safety solution, due to shortfalls in its process, and while X regards it as a better approach to moderation calls, it may not be enough in certain circumstances.

Even X has acknowledged this, to some degree, by pledging to build a new moderation center in Texas. Though since that announcement (in January), no further news on the project has come out of X HQ.

Essentially, if you’re concerned that X may not be doing as much to address harmful content, these stats likely underline that concern, though the numbers here may not be fully indicative of X’s broader measures, for the reasons noted above.

But it seems, based on the descriptions, that X is trailing behind the others, which could reinforce those concerns. 

You can read X’s latest E.U. report here, Meta’s are here (Facebook and IG), LinkedIn’s is here, and TikTok’s is here. Thanks to Xavier Degraux for the heads up on the latest reports. 
