
Should television be allowed to use the “deepfake”?


► “The ‘deepfake’ is a neutral tool that must be appropriated”

Thierry Ardisson, creator of the show “L’hôtel du temps”

« The deepfake (“hyper-deception”) is a technology that can be used for parody but also for harmful ends, as in the video in which Volodymyr Zelensky appears to urge his troops to lay down their arms. With the show “L’hôtel du temps”, I invented another use of the deepfake, one that is at once positive, noble and cultural.

For a long time, I had had the idea of making the dead speak. In fact, I had already done shows about John Lennon and Victor Hugo with look-alikes and actors, but I thought it looked like cabaret… So when I discovered deepfake technology, I said to myself: here is the solution, we are going to resurrect them!


For the first broadcast, devoted to Dalida, we collected as many images of her as possible. Artificial intelligence then integrated her face and her voice, and all that remained was to map them onto an actress who had studied the singer’s gestures. This show is revolutionary! We are selling it in several European countries and in the United States.

A documentary, not entertainment

This use is not harmful at all; it is educational. “L’hôtel du temps” is not an entertainment show, it’s a documentary. And we have two rules: report only authentic statements, and always have the heirs’ consent. The program has a dimension of heritage and of the transmission of knowledge.

I think that, precisely, the mission of public service broadcasting is to be at the service of the people. I am a child of the ORTF, and it seems to me that today public television no longer has this power to transmit knowledge, creativity and innovation. But I’m not at all nostalgic; the proof is that I work with artificial intelligence. I also have other projects around the deepfake, including a show that would confront a living celebrity with themselves as they were twenty or thirty years ago…

These dead can still bring us something

What must be understood is that the deepfake is a tool, just like a hammer, with which you can hit a nail or a head. Now that the deepfake exists, we have to decide how we use it. In any case, you cannot stop new technologies; you can only choose to do great things with them rather than horrible ones. It is a neutral tool that must be appropriated.

We cannot, on the one hand, deplore the cultural collapse of our time and, on the other, deprive ourselves of the modern means that allow us to speak to the younger generations. I think that literature, cinema and the recording industry in France are going through a period of weakness. That is why I am interested in these dead, who can still bring us something. Without bitterness or sourness. »

► “From data recorded during your lifetime, we can hijack what you have been”

Jean-Gabriel Ganascia, computer scientist specializing in artificial intelligence, former president of the CNRS ethics committee and author of “Virtual Servitudes” (1)

« The deepfake uses artificial intelligence techniques, specifically machine learning, to create an illusion. From images and videos of a person, a digital model is built that can be animated at will, with the same facial expressions, the same intonations, everything.
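
For readers curious how such a model is built in code, here is a minimal sketch, entirely an illustration rather than the pipeline of any broadcaster or researcher quoted here, of the shared-encoder, per-identity-decoder idea behind classic face-swap deepfakes. It assumes PyTorch; the class names, layer sizes and the random tensors standing in for real face footage are all invented for the example.

```python
# Sketch of the classic face-swap setup: one encoder shared between two people,
# one decoder per identity. After training, encoding a frame of person A and
# decoding it with person B's decoder renders A's expression with B's face.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-3,
)
loss_fn = nn.MSELoss()

# Dummy 64x64 RGB face crops standing in for aligned footage of each person.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):  # toy reconstruction training loop
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()

# The "swap": person A's expression, rendered through person B's decoder.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a[:1]))
```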

The most widespread use is to make a celebrity say things they have never said, or do things they have never done. This can be for entertainment or, on the contrary, for malicious purposes: for example, superimposing a woman’s face onto that of a pornographic actress, or, as happened with Barack Obama, putting words into politicians’ mouths. The danger is that an image is striking. Even after a denial, even once the deception is proven, the damage is done and people remain confused. We have seen it with climate or medicine: false information persists.

Of course, humanity did not wait for the deepfake to manipulate the image or memory of the dead. Just look at how politicians from very different backgrounds pick and choose quotes from General de Gaulle to suit their purposes! But with this technology, such distortions take on another, far more problematic dimension. Above all, where these manipulations were often the work of totalitarian regimes, they are now accessible to a wide variety of social groups, who can use them to destabilize democracies and fragment societies.

To come back to entertainment uses, the question is that of image rights, particularly after a person’s death. This prolongation of a form of “existence” after death raises dizzying questions. Unlike photographs or writings, which are traces of the departed, the deepfake makes it possible to reconstruct the behavior of a deceased person and have them react to the present, as if they were still among us. But from data recorded during your lifetime, what you were can be completely hijacked! People should be able, before they die, to protect themselves against malicious uses, a bit like leaving advance medical directives, to be sure that the legacy they leave is not exploited.

Philosophically, this raises questions about our relationship to mourning and the contemporary world’s difficulty in accepting the break between life and death. Keeping the memory of the deceased alive and preserving their testimonies is extremely important, both collectively and individually. But imagining that one could survive through technology is shocking. States must legislate to prevent abuses. »
