Bern goes to war against apps that let you “see your neighbor naked”

Such applications allow you to generate nude photos with a single click. Image: dr

Many young Swiss people are victims of blackmail using fake nude photos. The perpetrators are organized into gangs and choose their victims in a targeted manner.

Michael Graber / ch media

It’s like a scenario straight out of a sleazy 80s erotic film: just press a button and all the clothes disappear. And yet, it has long been a reality. Applications and sites like Nudify use artificial intelligence (AI) to turn any innocuous portrait into nude photos. And the results are frighteningly realistic.

“Finally see your neighbor naked”

It is with slogans like this that several providers have made a name for themselves, in very crude fashion, on social networks. Although a degree of self-regulation has reduced the impact of these advertisements, the problem has not gone away. There are still dozens of sites and applications that offer to generate such photos in one click.

A broad alliance in Parliament is now coming together to put an end to this AI-generated content. Raphaël Mahaim (Greens/VD) and Nina Fehr Düsel (UDC/ZH) have filed identical interventions aimed at limiting the spread of degrading applications such as Nudify. They have found support across all parties.

Raphaël Mahaim tried the experiment on himself.

At the end of September, the Vaud politician Raphaël Mahaim explained his concerns about deep nudes to 24 Hours:

“We see more and more of them circulating, particularly in school playgrounds. It’s dramatically easy to create them! Basically, you take a photo of a dressed person, without them even appearing in full. You slip it into one of the many “nudifier” or “undress” apps that abound on the Web, and it reconstructs the person’s entire body in a few seconds, without the slightest item of clothing. Boys, in particular, use them a lot.”

Make access technically impossible

Concretely, the French-speaking elected official and his UDC counterpart wish to prohibit, or make technically impossible, “the promotion, sale or provision of digital applications and services” that generate nude photos using AI. The authors of the motions thus directly target the creators of such content. Nina Fehr Düsel explains that the aim is to prevent possible abuse.

A law already exists: today, anyone who generates a nude photo of another person and shares it is liable to prosecution. But the distribution of such photos is “extremely painful” for the victims. Once a photo circulates, whether it is authentic or not, the situation is hardly controllable, and convicting the perpetrator is of little help.

For the UDC’s Nina Fehr Düsel, it is about “sending a signal” against the more dangerous aspects of artificial intelligence. Despite its “many benefits and useful applications,” its use also opens the door to abuse. The Zurich National Councillor is realistic and knows that technology is progressing rapidly and that the legal framework still lags somewhat behind. However, she hopes this will provide a basis for preventing, or at least limiting, future abuse through other technical means.

Even children receive advertising

She herself has noticed ads for apps like Nudify on her children’s accounts. “It’s unacceptable,” laments the lawyer. Fortunately, there are now ways to specifically ban this type of advertising and to ensure that sites offering this kind of content are quickly taken out of circulation.

Children and adolescents are particularly affected by abuse committed with such applications, explains Regula Bernhard Hug, head of the Swiss Association for Child Protection. The organization also operates the online reporting service clickandstop.ch, where child abuse material can be reported and information and advice obtained. Here too, those responsible are increasingly confronted with artificially generated material.

Regula Bernhard Hug is aware of numerous cases in Switzerland where young people have been blackmailed with such AI-created photos. Perpetrators obtain photos of victims from social media, generate fake nudes and use them to try to extort money. Or they try to extort genuine material using the fake material.

In both cases, young people are often affected at a fragile period, in the midst of puberty.

“Such a situation of blackmail can lead to major crises”

Bernhard Hug

Young people of both sexes are affected. The expert also knows of cases where AI has been used to generate artificial nude photos, or even child abuse videos of young children.

Do not put full-face photos on the Internet

Regula Bernhard Hug welcomes the fact that politicians are taking action. However, she recommends leaving as few recognizable images of children and adolescents as possible on social networks. Photos taken face-on are particularly problematic, she says, because such images lend themselves best to being processed with apps like Nudify or other tools. She therefore calls on parents to choose carefully what they share.

Parents, too, sometimes fall into the trap. Recently, for example, a fake profile and AI-generated nude photos were used to make a father believe his daughter was working as a prostitute. The perpetrators demanded money and threatened to publish the photos if he refused to pay. Entire gangs specialize in this type of blackmail. Regula Bernhard Hug believes the number of unreported cases is high:

“Many victims are ashamed and do not contact the police or a center like ours”

Translated and adapted from German by Léa Krejci
