Fighting fake news, a struggle in transformation for 15 years


In 2009, intrigued by the proliferation of fake news about the newly elected President Obama and his health insurance plan, an American political science researcher decided to study the impact of disinformation on public opinion. Little did he know that he would become a pioneer in a field of research destined to grow considerably in importance.

Fifteen years later, the amount of research has not only exploded; the subject has brought together researchers from multiple disciplines, reports the American journal Science, which devotes three reports to the question. Just as political disinformation in the 2010s prompted psychologists, sociologists, political scientists and philosophers to launch research to understand the phenomenon and to consider possible solutions, COVID, from the start of 2020, forced public health researchers to take an interest in it too.

If one thing has changed in the meantime, notes the pioneer in question, Adam Berinsky of the Massachusetts Institute of Technology, it is that we are less “naive”: we no longer face just “a few crazy stories”, but a real ecosystem of disinformation.

Still searching for truly effective remedies

But one thing has not changed: we have still not put our finger on the solutions that would most effectively counter those who spread false information that they sincerely believe to be true.

Observing the experts’ debates, however, one notes that a few other things have changed.

Among them, a troubling one: disinformation researchers have themselves become targets of online attacks and harassment. One of them, Kate Starbird of the University of Washington, suspects that one reason for these attacks is not just that her team was studying disinformation, but that it was fighting it: “We were trying to make a difference on a concrete problem, in real time,” she says.

But among the things that have not changed, and that have become increasingly thorny over the years, is the opacity of social media platforms. The algorithm, the computer code that determines which photo or video will be shown to which type of user, is a trade secret. And while researchers and journalists have, in recent years, managed to deduce some of what the algorithm favors (content that provokes an emotion rather than reflection), there is neither a regulatory body nor an independent observer that can examine these lines of code and ensure they do not have harmful impacts on society, democracy or public health.

The one exception was Twitter: for a long time, researchers had access to part of its database, which made it possible to produce many studies on the behavior of Twitter users and on how information, true or false, propagates. In March 2023, however, shortly after the company’s acquisition by Elon Musk, it ended that free access.

It is in this context that new European legislation could change things. The European Digital Services Act, which entered into force in November 2022, requires platforms to give researchers access to their data for certain projects. Its impact is being watched closely by researchers in other parts of the world, who have no shortage of ideas for measuring the effects of disinformation and of the solutions being tried here and there.
