A viral and dangerous trend
These videos share several traits: they were produced with AI tools such as Runway, and they are inspired by real events. They replace the human beings involved in tragic events with animated characters, which makes the scenes all the more disturbing.
In one of the most viral videos, a Minion sits in front of a computer and ends his life, re-enacting the livestreamed 2020 suicide of Ronnie McNutt. Another borrows the visual elements of the 2019 Christchurch shootings, perpetrated in two mosques in New Zealand. Although stylized, these reconstructions reproduce the settings and sounds of the original incidents, amplifying their psychological impact.
Initially launched in Russia, the trend quickly spread to platforms like TikTok and Instagram, where some videos rack up hundreds of thousands of views. They seem innocent at first, but their twist reveals a deep unease, as content creators like Noah Glenn Carter have pointed out: videos that would never be allowed with humans are posted with these animated characters, making it possible to publish things that should not exist.
A call for vigilance
Questioned by our colleagues at 404 Media, TikTok said it removes all content that violates its community rules, whether or not it was created with artificial intelligence, including hateful, gory, or extremely violent content. The platform, which is hardly in good graces in the United States at the moment, also says it is working on updates to its algorithms to detect AI-generated videos more effectively and prevent the spread of inappropriate content.
Faced with the rise of this trend, heightened vigilance is therefore in order. Behind the apparent lightness of the Minions lie videos that not only normalize violence but can also disturb the youngest viewers, who are often fans of the characters. Beyond that, these videos raise broader questions about content moderation: how should (or can?) platforms handle new forms of AI-generated content while protecting their communities? A problem which, in the age of AI, seems far from being resolved.