“You will never believe the horrible thing that happened to me. I am Estelle Mouzin, and I am known for having suffered atrocities at the hands of one of France's most notorious murderers.” On TikTok, an AI reconstruction of little Estelle Mouzin recounts the day of her disappearance and the involvement of serial killer Michel Fourniret.
This video, viewed more than 270,000 times, is not an isolated case. Little Grégory and the teachers Samuel Paty and Dominique Bernard are other faces you can see brought to life when you open the app, speaking as if directly to the user to recount the horrors they suffered. But their stiff facial expressions and robotic voices betray the involvement of artificial intelligence.
Thanks to artificial intelligence, it is now possible to make the dead or the missing say anything at all. And the format is particularly popular, with views numbering in the hundreds of thousands, even millions. “My death shocked the whole of France”, “I was abandoned in a forest”: the opening sentences are deliberately shocking so that the user stays hooked on the video. A formula so lucrative that some TikTok profiles are dedicated entirely to this kind of content. The @histoireextra account, for instance, has more than 273,000 subscribers. Some of these accounts are “bots”, programs created to churn out these videos automatically and draw mass engagement, such as @mister_story_ai.
And it works. In the comments, some Internet users go so far as to make specific requests. Under a TikTok video presenting a digital image of little Grégory, a user asks for “the story of Kevin and Leslie”, the two young people whose bodies were found in March in Deux-Sèvres. Some users are even taken in by these sometimes very realistic fakes. Under a video depicting the face of little Émile, missing since July in the Alpes-de-Haute-Provence, a user writes: “but you know where you are, why don’t you tell us?”
Yet the process is not complicated. To make these videos, the accounts in question use deepfake (“hypertrucage” in French) creation applications. These digital manipulations make it possible to replace one face with another, to reconstruct someone's voice, or to falsify a public figure's words with ever greater realism. Once a photo of the victim has been obtained, only a few steps remain.
FaceSwap, the best-known application, disclaims any responsibility for this highly controversial use. “We are not trying to bash celebrities or put anyone down,” it states on its website. “FaceSwap is not used to create inappropriate content or face swapping without consent or with the intent to conceal its use.”
These tools abound and are very easy to access, such as Revive, available on the App Store, or DeepNostalgia, software developed by the genealogy platform MyHeritage. Originally, this software was designed to bring a little life back to family photos. The concept is simple: after registering on the site, you can drag in a photograph and watch it come to life, for free. Then you simply add a synthetic voice using online text-to-speech software: a robotic, AI-generated voice similar to that of translation sites or voice assistants like Ok Google or Alexa. There is therefore no need for a voice sample of the person in question.
Asked about the moderation and monetization of this type of video, TikTok declined to respond. A spokesperson confirms, however, that videos depicting murdered people “are not authorized”. Since last May, the social network has required that any realistic content created or edited with AI be clearly identified as such, in the caption or through a superimposed sticker. In practice, none of these videos mentions the use of any such external software.
Shortly after this content was reported to TikTok, many of the videos were removed, and the wave of moderation is still ongoing.