Deepfake: how will it change cinema? Part two

“What does this mean for cinema? Personally, I think it could be a major innovation. Deepfakes could make it possible to design avatars at least as well as current special effects do, and maybe even better: creating humanoid beings, rejuvenating characters, making stunt scenes more realistic, and more. The possibilities are endless,” said Andreas Babiolakis, film editor, on the Canadian blog Films Fatale. According to this film expert, the main current limitation of CGI is the recurring “disharmony” between the real movements of a human being and those simulated by a computer. “Deepfakes could fix that by simply replacing the face on a host, that is, on someone’s real body. The movements would then be natural, and therefore realistic,” he says.

And indeed, it is hard not to prefer the “deepfake” version of Princess Leia in an altered cut of Rogue One to the original, which was created with computer graphics but looks noticeably off… Granted, the result isn’t perfect (the face can be seen “moving” around Leia’s forehead), but “it’s just the start,” promises Andreas Babiolakis. And the technique costs far less than painstaking frame-by-frame work on 4K monitors.
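For readers curious about how this kind of face replacement works in practice, most open-source face-swap tools are built around an autoencoder with a shared encoder and one decoder per identity. The sketch below is a minimal, illustrative PyTorch version of that idea, not the code of any production tool; the layer sizes and names are assumptions made purely for the example.

```python
# Minimal sketch of the shared-encoder / dual-decoder idea behind classic
# face-swap deepfakes. Layer sizes are illustrative assumptions only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent code (pose + expression)."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face for ONE specific identity from the latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns pose and expression; each decoder learns one face.
encoder = Encoder()
decoder_a = Decoder()  # trained to reconstruct actor A
decoder_b = Decoder()  # trained to reconstruct actor B

# At inference, feeding a frame of actor A through decoder B "paints" B's face
# onto A's expression and head pose -- the face-on-host swap described above.
frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a cropped, aligned face
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

Because the encoder is shared, it is forced to describe pose and expression in a way both decoders can read, which is why routing one actor’s frames through the other actor’s decoder produces the swap, with the “host” body supplying the natural movement.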

Just a matter of time?

Among film professionals in Hollywood, some remain skeptical that deepfakes will revolutionize their industry anytime soon, without, however, dismissing the idea outright. “I don’t think it’s a technique that films are going to use anytime soon. The results are interesting, but the current quality is not acceptable for cinema,” remarks Richard Clegg, visual effects supervisor at Moving Picture Company, the studio behind the digital clone of Rachael in Blade Runner 2049 and the young Arnold Schwarzenegger in Terminator Genisys. But on the website L’actualité, he admits that this AI-based image synthesis technique could one day progress to the point where filmmakers end up using it.

For their part, other special effects professionals outright predict the future arrival of fully virtual yet “hyper believable” characters, capable of replacing actors in a film. This points clearly in the direction of the predictions outlined above… Darren Hendler, creator of digital avatars for the special effects company Digital Domain, explains in the Guardian that for the moment a human must still “be behind” every digital clone to perfectly reproduce an actor’s performance (the “young” version of Will Smith in Gemini Man, for example, is based on the actor’s own performance), but that AI could one day make it possible to do without them, leading to the design of “virtual humans” functioning “autonomously”, their words and expressions “guided by artificial intelligence”.

Also in the Guardian, Yuri Lowenthal, an actor who stars in the Spider-Man video game on PS4, wonders: “Not everyone agrees to have their data captured on a large scale the way I voluntarily do. We record all the data from my performances and my voice, the details of my face and my body language. Does this foreshadow the future? How long will it take before an acting performance can be created out of thin air?” For Darren Hendler, it will undoubtedly take another 5 to 10 years before we see something even “semi-plausible”.

So it might just be a matter of time before Hollywood uses deep learning to swap faces and replace real actors, although it might be a long time… AI is already on the menu of some special effects studios’ projects: to “improve existing tools”, but also to create new ones, ultimately allowing CGI to be produced faster and more creatively. Deep learning could thus be used to generate artificial sets, or even to “automate” rotoscoping, the technique of cutting a character out of its environment frame by frame, which is currently quite laborious.
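To make the rotoscoping point concrete, here is a rough sketch of how a pretrained segmentation network could stand in for hand-traced mattes. It is only an illustration of the idea under that assumption, not a description of any studio’s pipeline, and it uses an off-the-shelf torchvision model rather than the more refined matting tools real VFX houses rely on.

```python
# Sketch of "automated rotoscoping": extracting a per-frame person matte with a
# pretrained segmentation network instead of hand-tracing each frame.
# Illustrative only -- real VFX pipelines use far more precise matting models.
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50
from PIL import Image

model = deeplabv3_resnet50(weights="DEFAULT").eval()  # pretrained, Pascal VOC classes
PERSON_CLASS = 15  # index of the "person" class in the Pascal VOC label set

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def person_matte(frame: Image.Image) -> Image.Image:
    """Return a black-and-white mask isolating people in one video frame."""
    with torch.no_grad():
        logits = model(preprocess(frame.convert("RGB")).unsqueeze(0))["out"][0]
    mask = (logits.argmax(0) == PERSON_CLASS).to(torch.uint8) * 255
    return Image.fromarray(mask.numpy())

# Hypothetical usage: loop over extracted frames and save the mattes that a
# compositor would previously have rotoscoped by hand.
# person_matte(Image.open("frame_0001.png")).save("matte_0001.png")
```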

Richard Clegg nevertheless concedes that AI could well make the work of studios and their artists easier, and ultimately admits that “artificial intelligence could allow us to imagine and put things together much faster, and to present more choices to the director”, but also to make “large-scale” films while spending less money. Naturally, “smaller” studios and directors will also be able to use these tools for their special effects, without needing the resources of the big Hollywood studios.

Why does “deepfake” worry Hollywood?

The technology that makes it possible to fake a person’s face has become so democratized that its use raises questions for Hollywood… and, more broadly, for the American government. To what extent does an actor own their own body?

In 2018, a Buzzfeed video went around the world. In it, Barack Obama presents the dangers of “deepfakes”, those faked videos that use increasingly advanced artificial intelligence techniques to modify a face and what it appears to do. Of course, the video itself turns out to be a pretty impressive deepfake, voiced by Jordan Peele.

Although some imperfections can be spotted, the success of this image manipulation is particularly impressive. And the truth is, such videos are more and more common. At a time when information is scrolled past in a few seconds in a social network news feed, some deepfakes can go unnoticed if we do not pay attention. And as the Buzzfeed video makes clear, with this technology anyone can be made to do or say anything.

The democratization of this software now makes it easy to reproduce a person’s face, in particular to map it onto someone else’s body. In most cases, deepfakes are openly presented as funny, disturbing and immediately identifiable experiments, particularly when the internet’s fascination with Nicolas Cage puts his face into all kinds of films. But it also means that a celebrity can be inserted into footage where they were never supposed to appear.

Potential abuses that call for an overhaul of the law

Advertisements produced without authorization, defamatory videos… American legislation today tries to anticipate, through certain legal texts, the many potential abuses of deepfakes. However, according to The Hollywood Reporter, the main limit of these efforts lies in how difficult it often is to judge the intent behind one of these videos. Can we really speak of defamation if the seams of the technology are visible, or if the deepfake is announced as such by its creator?

In reality, if the fear of deepfakes lies in their ability to fool us, it is only the reflection of a culture increasingly anchored in post-information and in the constant hijacking of reality by images. In the case of cinema, the deepfake has become a tool for rewriting the history of the seventh art, especially when we know that some actors turned down roles that could have defined their careers. Today it is possible to realize those fantasized alternatives, to bring such alternate histories to life.

Soon, perfect digital doubles in every movie?

In short, the deepfake is a fascinating tool that currently worries the entertainment industry, in the sense that it makes digital doubling easier, especially combined with performance capture. Recently, movies like Fast and Furious 7 and Rogue One: A Star Wars Story have used such technology to recreate the faces of the late Paul Walker and Peter Cushing, or to restore the youthful features of Princess Leia from the first Star Wars.

Ethically, recreating a deceased actor continues to raise questions, because it assumes that the person is no longer in possession of their own body. Some would say that cinema has always rested on a Faustian pact (and they would not be wrong), but the fact that studios can keep virtual copies of a personality in their databases, the way one owns a puppet, forces us today to rethink the link between an actor and their own image, which could be used against their will (especially in the context of pornographic deepfakes, which are more and more common).

For example, a Game of Thrones fan had fun taking an excerpt from the series and subverting its meaning. In the altered scene, Kit Harington explains, in a convincing imitation, that he hates season 8. While the sequence is fairly funny and good-natured (especially since the lip synchronization is far from perfect), one can imagine that the actor, who has not publicly commented on the quality of that finale, would not appreciate having his body manipulated to put words in his mouth.

‘Gemini Man’ by Ang Lee, a highly anticipated film

Much more explicitly, the Corridor Digital YouTube channel, which specializes in short films built around special effects, has produced several videos playing with the possibilities of deepfake. In one of its most realistic creations, the channel dissects the process that allowed its teams to stage a fake visit by Tom Cruise to their premises.

In addition, the end of the year will be marked by the release of Ang Lee’s highly anticipated Gemini Man, built on a new technical feat by the filmmaker: a film starring Will Smith opposite his younger clone, whose body is entirely computer-generated but animated from the actor’s own performance.

The tools behind the deepfake inevitably contribute, in amplified form, to this distorting mirror, as magnificent as it is disturbing. There is no doubt that the plot, which seems to revolve around a manhunt between the two characters, will echo the very questions that haunt the dream factory.
