X now requires community fact checks to include sources

Researchers have said that misinformation has reached new heights amid the Israel-Hamas war.

X is making a significant change to its crowd-sourced fact-checking tool in an attempt to stem the flow of misinformation on its platform. The requirement will be familiar to professional fact-checkers, academics and Wikipedia editors, but it's new to X's approach: the company will now require its volunteer contributors to include sources on every community note they write.

The company announced the change in a post on X, shortly after Wired reported that some Community Notes contributors are worried the tool is being manipulated by bad actors and worsening X’s misinformation problems amid the ongoing Israel-Hamas war. “Starting today, sources are now required for proposed notes,” the company wrote. “We haven’t previously required this, as some helpful notes inherently do not need sources – for example, they refer to details of the post or media it contains. But those instances are less common, and we believe the overall impact of this change will be positive.”

The change comes amid mounting scrutiny of the misinformation and other falsehoods spreading on X in recent days. Longtime researchers have said that misinformation reached new heights following Hamas’ attacks in Israel and the ensuing war, and that the advent of paid verification, along with algorithm changes that boost paying subscribers, has allowed it to spread relatively unchecked.

European Union officials have also raised concerns, pointing to the viral spread of video game footage and other unrelated content falsely claiming to depict scenes from the ongoing conflict. EU officials opened an investigation into X over its handling of misinformation last week.

Under Elon Musk’s leadership, X cut the teams responsible for curating reputable information about breaking news events, removed misinformation-reporting tools, slashed safety teams that patrolled for disinformation, and stopped labeling state-affiliated media accounts. Instead, the company has relied almost entirely on Community Notes, which allows volunteer contributors to append fact-checks to individual tweets.

Contributors are not vetted before joining the program, though notes have to reach a certain threshold of “helpful” ratings from other contributors before they’ll be visible. X CEO Linda Yaccarino told EU officials last week that the company had “recently launched a major acceleration in the speed at which notes appear.”
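
To make that gating mechanism concrete, a heavily simplified model might look like the sketch below. The threshold, vote counts and field names are invented for illustration; X’s actual ranking model, which it has open-sourced, scores notes in a far more sophisticated way than a simple vote ratio.

    from dataclasses import dataclass

    # Hypothetical values -- X does not publish a simple ratio threshold.
    MIN_RATINGS = 5
    HELPFUL_RATIO = 0.7

    @dataclass
    class CommunityNote:
        text: str
        helpful_votes: int = 0
        not_helpful_votes: int = 0

        def is_visible(self) -> bool:
            # A note surfaces only after enough contributors have rated it
            # and a sufficient share of those ratings are "helpful."
            total = self.helpful_votes + self.not_helpful_votes
            if total < MIN_RATINGS:
                return False
            return self.helpful_votes / total >= HELPFUL_RATIO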

According to Wired, the system is easy to manipulate: groups of contributors can coordinate to rate each other’s notes, or selectively rate contributions that align with their views. The report also says that community notes related to the Israel-Hamas war have been filled with conspiracy theories and infighting between contributors.

The change to require a linked source may be X’s attempt to increase the quality of its notes, though the company doesn’t appear to have published guidelines about which types of sources can be cited. X says “thousands” of new contributors have joined the program in recent days, and that notes have been viewed “millions” of times.
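
Mechanically, the new requirement could be as minimal as rejecting any proposed note that contains no link. The hypothetical check below is an assumption (X hasn’t described its implementation), but it illustrates why the absence of source guidelines matters: any URL at all would pass.

    import re

    # Hypothetical validation -- X has not said how it enforces the rule.
    URL_PATTERN = re.compile(r"https?://\S+")

    def has_source(note_text: str) -> bool:
        # Accepts any URL, reputable or not, since X publishes
        # no guidelines on which types of sources qualify.
        return bool(URL_PATTERN.search(note_text))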