“Dissuading women”: female politicians targeted by pornographic deepfakes


From the United States to Italy, from the United Kingdom to Pakistan, female politicians are increasingly the victims of sexual or pornographic images generated by artificial intelligence (AI), a worrying trend that threatens women's participation in public life, researchers say.

This explosion of deepfakes published without the consent of those targeted is thwarting efforts to regulate artificial intelligence at the global level, experts believe, fueled by a proliferation of low-cost AI tools, in particular phone-based photo editing applications that allow images of women to be "undressed."

For researchers, these intimate images are weapons used to damage the reputation of women in the public sphere, compromising their careers, undermining public confidence and threatening national security by fueling blackmail and harassment.

In the United States, the American Sunlight Project (ASP), a disinformation research group, has identified more than 35,000 occurrences of deepfakes depicting 26 members of the US Congress, 25 of whom are women, on pornographic sites.

A "dark and disturbing reality," say the researchers, whose study published last month shows that one in six women elected to Congress has been the victim of such AI-generated images.

“Women members of Congress are being targeted by these AI-generated pornographic deepfakes at an alarming level,” warns Nina Jankowicz, head of the ASP.

“It’s not just a question of technology, it’s an attack against women who have power and against democracy itself,” she emphasizes.

The ASP did not publish the names of the women elected to Congress targeted by the images in question, to avoid generating further interest and research, but indicated that it had notified their offices confidentially.

“Wage this war”

In the United Kingdom, Deputy Prime Minister Angela Rayner is one of at least 30 British political figures targeted by a pornographic deepfake site, according to an investigation by Channel 4 television published in July.

This site, which attracts many visitors and whose name the channel has not revealed to avoid giving it publicity, uses AI to "lay bare" around ten of these political figures, transforming real photos into fake images in which they appear naked.

Women are targeted by AI-powered apps and tools, freely available to the general public and requiring no technical skills, that allow their users to virtually remove clothing from images or generate deepfakes via sexualized text queries.

Technological advances have given rise to what researchers call a growing "cottage industry" around AI-augmented pornography, with some deepfake creators accepting paid requests to generate content featuring a person of the customer's choosing.

In Italy, Prime Minister Giorgia Meloni is demanding 100,000 euros in damages from two men accused of creating fake pornographic videos featuring her and publishing them on American pornographic sites.

“This is a form of violence against women,” Ms. Meloni told a court in October 2024, according to the Italian news agency ANSA.

"With the advent of AI, if we allow one woman's face to be superimposed on another's body, our daughters will find themselves in these situations, which is exactly why I consider it legitimate to wage this war," she insisted.

“Dissuade women”

In Pakistan, AFP journalists analyzed a deepfake video showing a local elected official, Meena Majeed, kissing a man in public, an act considered immoral in this conservative Muslim country.

Azma Bukhari, information minister of Pakistan’s Punjab province, said she felt “devastated” after discovering a deepfake video that superimposed her face onto the naked body of an Indian actress.

"We are increasingly observing the paralyzing effect of these AI-generated images and videos used to harass women in politics," the NGO Tech Policy Press noted in 2024, warning that this has the effect of "deterring women who have political ambitions."

Around the world, the proliferation of these deepfakes has outpaced regulations.

Pakistan's existing legislation "to prevent online crimes" includes provisions against "cyberbullying" that prohibit sharing photos or videos without the consent of those depicted, "in such a way as to harm any person." However, the country has no specific law against the dissemination of sexual deepfakes.

UK law already criminalizes the distribution of pornographic deepfakes, and the Labour government has promised to ban their creation as well, though no timetable has yet been set.

A few US states, including California and Florida, have passed laws criminalizing the publication of sexually explicit deepfakes, and activists are calling on the US Congress to urgently pass new laws to regulate their creation and distribution.

While the victims of pornographic deepfakes have so far been mostly politicians and celebrities, including singer Taylor Swift, experts say that all women, including those outside the public eye, are vulnerable.

After ASP notified the targeted congresswomen, the AI-generated images were almost entirely removed from the affected sites, reflecting what researchers call a “privilege disparity.”

“Women who do not have the resources available to members of Congress would be unlikely to get as quick a response from sites publishing pornographic deepfakes if they initiated a takedown request themselves,” explains the ASP in its report.
