Meta will end its fact-checking program with independent third parties, starting in the United States. The company believes that expert fact-checkers have their own biases and that too much content ends up being fact-checked.
Instead, Meta will turn to a community-based rating system that relies on fact-checking contributions from users.
“We’ve seen this approach work on X, where the community can decide when posts are potentially misleading and need more context,” Joel Kaplan, Meta’s chief global affairs officer, explained in a blog post.
According to Kaplan, the new system will be rolled out gradually over the next two months, and the company will work to improve it throughout the year.
The Associated Press previously participated in Meta’s fact-checking program, but ended its participation a year ago.
“More freedom of expression”
At the same time, Meta said it intends to allow “more freedom of expression” by lifting some restrictions on topics that are part of mainstream debate. The company wants to focus instead on illegal and “high-severity” violations, such as terrorism, child sexual exploitation and drugs.
Meta acknowledged that its approach of building complex systems to manage content on its platforms had “gone too far” and made “too many mistakes” by censoring too much content.
Meta CEO Mark Zuckerberg admitted that the changes announced Tuesday were partly triggered by political events, including Donald Trump’s victory in the presidential election.
“The recent election also appears to be a cultural tipping point toward a new emphasis on free speech,” Mr. Zuckerberg said in a video.
Meta’s quasi-independent Oversight Board, which was created to act as an arbiter on controversial content decisions, welcomed the changes. The board said it looked forward to working with the company “to understand the changes in more detail, ensuring its new approach is as effective and free-speech-friendly as possible.”