
More than 280,000 users have contributed at least one note to X’s Community Notes. Almost 1.3 million have enrolled in the program overall.

Those are remarkable numbers for a fact-checking effort fueled by unpaid volunteers on a private platform owned by a polarizing gajillionaire who has accused the program of being “gamed by governments & legacy media.” Doubly so when you consider that roughly 90% of notes never get published on X due to a filtering algorithm that evaluates the helpfulness of notes based on “diversity of perspectives.”

And yet, in a new preprint I co-authored with Zahra Arjmandi-Lari and Tom Stafford, we find that the Community Notes community may be showing signs of fatigue. The recent launch of AI contributors and X users' growing reliance on Grok as a fact-checker could further accelerate the program's decline.

This matters not just for X, whose content moderation efforts operate at the whim of Elon Musk, but for the wider online information ecosystem. 

In January, Mark Zuckerberg (wrongly) cast Community Notes as an alternative to the fact-checking program Meta had built with the International Fact-Checking Network.1 With America's governing party treating "fact-checker" as an insult, both Meta and TikTok have launched Community Notes-like features. YouTube has also said it is doing the same, but it must be operating at such an infinitesimally small scale that it has been impossible to detect with the naked eye.

Whether as a fig leaf or a genuine effort (or a little bit of both), algorithmically mediated crowdsourced labels are now part of the infrastructure of major social platforms.

As the program that inspired the copycats – and the only one to share data at scale – X’s Community Notes matters disproportionately. 

Here’s what we found out in our research.

Sizing up the Notes community on X

Community Notes, initially called Birdwatch, was in a restricted pilot phase until around the time that Elon Musk purchased Twitter. As more and more countries were onboarded, the active user base grew significantly until it reached a steady-ish state in 2024.

Throughout 2024, the number of monthly active authors (MAAs) – defined as users who contributed at least one note – hovered between 30,000 and 40,000. That is less than one tenth of the number of active users on Wikipedia, a project much reviled by Musk. The number of MAAs has mostly declined this year.

Number of Monthly Active Authors on X’s Community Notes
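If you want to poke at this number yourself, X publishes the underlying Community Notes data for download. Here is a minimal sketch of the MAA count in Python, assuming the notes file and column names from recent dumps (treat both as assumptions that may have changed):

```python
import pandas as pd

# Public Community Notes data dump; the file name and the columns
# (noteAuthorParticipantId, createdAtMillis) are assumptions based
# on recent downloads and may change.
notes = pd.read_csv(
    "notes-00000.tsv", sep="\t",
    usecols=["noteAuthorParticipantId", "createdAtMillis"],
)

# Bucket each note into the calendar month it was written.
notes["month"] = pd.to_datetime(
    notes["createdAtMillis"], unit="ms"
).dt.to_period("M")

# Monthly active authors: distinct authors with at least one note.
maa = notes.groupby("month")["noteAuthorParticipantId"].nunique()
print(maa.tail(12))
```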

Consistent with other platforms like Wikipedia, the program relies on a subset of authors who contribute most of the notes. My co-authors and I found that in 2024 the top 1% of authors wrote almost one third of all helpful notes, and the top 7.5% wrote half of the total. (Earlier this year, I wrote on Indicator about the single most prolific contributor.)
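The concentration figure can be approximated from the same dump by joining notes to their current status (again, file and column names are assumptions based on recent downloads):

```python
import pandas as pd

# Join notes to their current status to keep only "helpful" ones.
notes = pd.read_csv("notes-00000.tsv", sep="\t",
                    usecols=["noteId", "noteAuthorParticipantId"])
status = pd.read_csv("noteStatusHistory-00000.tsv", sep="\t",
                     usecols=["noteId", "currentStatus"])
helpful = notes.merge(status, on="noteId")
helpful = helpful[helpful["currentStatus"] == "CURRENTLY_RATED_HELPFUL"]

# Helpful notes per author, most prolific first.
counts = helpful.groupby("noteAuthorParticipantId").size()
counts = counts.sort_values(ascending=False)

# Share of helpful notes written by the top 1% of authors.
top_1pct = max(1, int(len(counts) * 0.01))
share = counts.iloc[:top_1pct].sum() / counts.sum()
print(f"Top 1% of authors wrote {share:.1%} of helpful notes")
```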

Are users sticking around?

Community Notes is experiencing significant churn. Only 29% of users who started writing notes in the first six months of 2023 are still active two years later. The share of authors who stick around six months after writing their first note has steadily declined, from 63% for those who joined in the first half of 2023 to 44% for those who joined in the second half of 2024.

Slice the data another way and the downward trend is confirmed. The chart below shows the share of authors who wrote at least one additional note in the four months following their first contribution. That share has decreased from 67% for authors who first contributed in January 2024 to 54% for those who first authored a note in February of this year.

Share of Community Notes users who wrote at least one more note in the four months following their first contribution.
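A rough version of this retention metric can be computed from first-note timestamps. The sketch below uses a 120-day window as a stand-in for "four months" and the same assumed column names:

```python
import pandas as pd

# Load note timestamps from the public dump (column names assumed).
notes = pd.read_csv("notes-00000.tsv", sep="\t",
                    usecols=["noteAuthorParticipantId", "createdAtMillis"])
notes["ts"] = pd.to_datetime(notes["createdAtMillis"], unit="ms")

# Each author's first note defines their cohort.
first = notes.groupby("noteAuthorParticipantId")["ts"].min().rename("first_ts")
notes = notes.join(first, on="noteAuthorParticipantId")

# Did the author write any later note within the 120-day window?
window = pd.Timedelta(days=120)
came_back = (notes["ts"] > notes["first_ts"]) & \
            (notes["ts"] <= notes["first_ts"] + window)
retained = came_back.groupby(notes["noteAuthorParticipantId"]).any()

# Retention rate by first-note month. Note: cohorts less than 120 days
# old haven't had a full window yet, so drop the most recent months
# before interpreting.
cohort = first.dt.to_period("M")
print(retained.groupby(cohort).mean())
```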

Publication matters

What could be causing the program’s decreasing retention rate and the overall decrease in monthly active authors? 

We know remarkably little about the incentive structure of Community Notes. One valuable study has found that contributors appear motivated by correcting their political rivals. My conversations with a few power users suggest that most do it out of a sense of making the internet a slightly better place.

Regardless of the underlying incentive, Zahra, Tom, and I hypothesized that getting a note published motivates users to write additional notes. Put another way, you are more likely to stick around on Community Notes if your notes get shown to all X users because they received the necessary ratings and algorithmic nod to be considered “helpful.”

We set out to test this hypothesis by looking at authors whose first note scored just above or just below the 0.4 helpfulness threshold that determines publication. Around this threshold, we assumed that the quality of the underlying notes was roughly equivalent and that we were essentially comparing like with like. The sole difference was that notes scoring below 0.4 were not published and those above 0.4 were.
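This like-for-like comparison around a cutoff is in the spirit of a regression discontinuity. Conceptually, it boils down to something like the sketch below; the first_notes table, its columns, and the 0.02 bandwidth are illustrative stand-ins, not the preprint's actual pipeline:

```python
import pandas as pd

# Hypothetical table: one row per author's first note, with the note's
# helpfulness score and whether the author ever wrote another note.
first_notes = pd.read_csv("first_notes.csv")  # columns: score, wrote_again

THRESHOLD = 0.40   # publication cutoff used by Community Notes
BANDWIDTH = 0.02   # arbitrary "just above / just below" window

near = first_notes[(first_notes["score"] - THRESHOLD).abs() <= BANDWIDTH]
published = near["score"] >= THRESHOLD

# Notes just above and just below 0.4 should be of similar quality,
# so a gap in continuation rates points to the effect of publication.
rate_above = near.loc[published, "wrote_again"].mean()
rate_below = near.loc[~published, "wrote_again"].mean()
print(f"Wrote again: {rate_above:.1%} (published) vs {rate_below:.1%} (not)")
```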

We found that having a first note published increases the likelihood that a Community Notes contributor will write another one by 5%. That is not a huge effect, but it appears to confirm our hypothesis2 that getting published incentivizes users to write more.

The trouble for X is that the share of "helpful" notes is decreasing, meaning a smaller share of users has a positive incentive to keep contributing. From a peak of 13.1% in March 2024, the share of notes published on X fell to 8.2% in May of this year.

Share of notes considered "helpful" by X's Community Notes algorithm, excluding NNNs (notes arguing that no note is needed)
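The published share is another quantity you can approximate from the public data, by joining notes to their current status and dropping NNNs (my reading: notes classified NOT_MISLEADING; file and column names remain assumptions):

```python
import pandas as pd

# Join notes to their current status (names assumed from recent dumps).
notes = pd.read_csv("notes-00000.tsv", sep="\t",
                    usecols=["noteId", "createdAtMillis", "classification"])
status = pd.read_csv("noteStatusHistory-00000.tsv", sep="\t",
                     usecols=["noteId", "currentStatus"])
df = notes.merge(status, on="noteId")

# Exclude NNNs: notes arguing the post doesn't need a note.
df = df[df["classification"] != "NOT_MISLEADING"]
df["month"] = pd.to_datetime(df["createdAtMillis"], unit="ms").dt.to_period("M")

# Monthly share of notes currently rated helpful.
helpful_share = (df["currentStatus"] == "CURRENTLY_RATED_HELPFUL") \
    .groupby(df["month"]).mean()
print(helpful_share.tail(12))
```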

The introduction of AI notes may further increase the number of human contributors who never get a note published. AI Contributors can write notes but cannot rate them, so a growing volume of notes will compete for the attention of the same pool of human raters. And the program already has a significant backlog of notes in need of ratings (see chart below).

Given that AI contributors can potentially write an unlimited number of notes, the backlog could become markedly worse – and further depress the share of human notes that are published.

Community Notes is not dying, yet. Even if our findings are confirmed through peer review, users are clearly contributing for reasons beyond seeing their notes published (exactly what those reasons are is a question future research will hopefully help answer).

At the same time, Community Notes appears to be retrenching from its 2024 peaks, and the latest changes introduced by X may well end up further degrading the incentive structure that keeps people writing. 

X, Meta, TikTok and any others building crowdsourced labeling features like Community Notes would do well to keep track of how contributors respond to the lack of publication. In turn, this may provide a deeper understanding of why contributors participate in the programs.

1 Disclaimer: I was the founding director of the IFCN and was involved in the early days of the program.

2 The conditional is de rigueur, seeing as our analysis has not yet been peer reviewed.
