Mark Zuckerberg, Meta’s chief executive, blamed the company’s fact-checking partners for some of Facebook’s moderation issues, saying in a video that “fact-checkers have been too politically biased” and have “destroyed more trust than they created.”
Fact-checking groups that worked with Meta have taken issue with that characterization, saying they had no role in deciding what the company did with the content that was fact-checked.
“I don’t believe we were doing anything, in any form, with bias,” said Neil Brown, the president of the Poynter Institute, a global nonprofit that runs PolitiFact, one of Meta’s fact-checking partners. “There’s a mountain of what could be checked, and we were grabbing what we could.”
Mr. Brown said the group used Meta’s tools to submit fact-checks and followed Meta’s rules, which barred the group from fact-checking politicians. Meta alone decided how to respond to each fact-check: adding warning labels, limiting the reach of some content or even removing the posts.