The Ethics of a Deepfake Anthony Bourdain Voice in “Roadrunner”


The documentary “Roadrunner: A Film About Anthony Bourdain,” which opened on Friday, is an angry, elegant, and often deeply emotional chronicle of the TV star’s life and of his impact on those close to him. Directed by Morgan Neville, the film portrays Bourdain as intense, self-loathing, relentlessly driven, supernaturally charismatic, and – in his life and in his death, by suicide, in 2018 – a man who both focused and disrupted the lives of those around him. To craft the film’s narrative, Neville drew on tens of thousands of hours of video footage and audio archives, and, for three particular lines heard in the film, he commissioned a software company to create an A.I.-generated version of Bourdain’s voice. News of the synthetic audio, which Neville discussed last week in interviews with me and with Brett Martin, at GQ, prompted striking anger and unease among Bourdain’s fans. “Well, it’s macabre”; “It’s horrible”; “WTF?!” people said on Twitter, where Bourdain’s faked voice became a trending topic. The critic Sean Burns, who had reviewed the documentary negatively, tweeted, “I feel like this tells you everything you need to know about the ethics of the people behind this project.”

When I first spoke with Neville, I was surprised to learn of his use of synthetic audio, and surprised, too, that he had chosen not to disclose its presence in his movie. He acknowledged using the technology for a specific voice-over I had asked about – in which Bourdain improbably reads aloud a despairing email he had sent to a friend, the artist David Choe – but did not reveal the documentary’s other two instances of technological wizardry. Creating a synthetic Bourdain voice-over struck me as far less crass than, say, a CGI Fred Astaire put to work selling vacuums in a Dirt Devil commercial, or a holographic Tupac Shakur performing alongside Snoop Dogg at Coachella, and far more trivial than the intentional mingling of fiction and nonfiction in, for example, Errol Morris’s “The Thin Blue Line.” Neville used the A.I.-generated audio only to narrate text that Bourdain himself had written. Bourdain composed the words; he just – as far as we know – never said them aloud. Some of Neville’s critics argue that Bourdain should have the right to control how his written words are spoken. But doesn’t a person relinquish that control every time their writing goes out into the world? The act of reading, whether it’s an email or a novel, in our heads or out loud, always involves some degree of interpretation. (I was more troubled that Neville said he hadn’t interviewed Bourdain’s former girlfriend Asia Argento, who is portrayed in the film as the agent of his unravelling.)

Further, documentary film, like nonfiction writing, is a broad and loose category, encompassing everything from unedited, unmanipulated footage of real events to highly constructed and reconstructed narratives. Winsor McCay’s short “The Sinking of the Lusitania,” a propaganda film from 1918 that is considered one of the earliest examples of the animated-documentary form, was made entirely of reconstructed and re-created footage. Ari Folman’s Oscar-nominated “Waltz with Bashir,” from 2008, is a cinematic war memoir told through animation, with an unreliable narrator and entirely fictional characters. The truth is “only a superficial truth, the truth of accountants,” Werner Herzog wrote in his famous “Minnesota Declaration” manifesto. “There are deeper strata of truth in cinema, and there is such a thing as poetic, ecstatic truth. It is mysterious and elusive, and can be reached only through fabrication and imagination and stylization.” At the same time, “deepfakes” and other computer-generated synthetic media carry certain disturbing connotations – political machinations, fake news, lies wearing the face of truth in HD – and it is natural for viewers and filmmakers to question the limits of their responsible use. Neville’s offhand comment, in his interview with me, that “we can have a documentary-ethics panel about it later” did little to reassure people that he took these matters seriously.

On Friday, to help me untangle the ethical and emotional questions raised by the three pieces of audio in “Roadrunner” (which total just forty-five seconds), I spoke with two people who would be well qualified to sit on Neville’s hypothetical ethics panel. The first, Sam Gregory, is a former filmmaker and the program director of Witness, a nonprofit human-rights organization that focuses on the ethical applications of video and technology. “In some ways, this is a pretty minor use of synthetic-media technology,” he told me. “It’s a few lines in a genre where you sometimes construct things, where there aren’t set standards for what’s acceptable.” But, he explained, Neville’s re-creation, and the way he deployed it, raise fundamental questions about how we define the ethical use of synthetic media.

The first is about consent, and what Gregory described as our “worry” about manipulating the image or voice of a deceased person. In Neville’s interview with GQ, he said that he had pursued the A.I. idea with the support of Bourdain’s inner circle – “I checked, you know, with his widow and his executor, just to make sure people were cool with it,” he said. But early Friday morning, as news of his use of A.I. ricocheted around, Bourdain’s ex-wife Ottavia Busia tweeted, “I was definitely NOT the one who said Tony would have been cool with this.” On Saturday afternoon, Neville wrote to me that the A.I. idea “was part of my initial pitch of having Tony narrate the film posthumously, à la ‘Sunset Boulevard’ – one of Tony’s favorite movies, and one he even re-created on ‘A Cook’s Tour,’ ” adding, “I didn’t mean to say that Ottavia thought Tony would have liked it. All I know is that no one ever expressed any reservations to me.” (Busia told me, in an email, that she remembered the A.I. idea coming up in an early conversation with Neville and others, but that she hadn’t realized it had been used until the social-media wave began. “I believe Morgan thought he had everyone’s blessing to move forward,” she wrote. “I made the decision to withdraw from the process early on because it was too painful for me.”)

A second basic principle is disclosure – whether and how the use of synthetic media is made clear to an audience. Gregory brought up the example of “Welcome to Chechnya,” the 2020 film about underground Chechen activists working to free survivors of the country’s violent anti-gay purges. The film’s director, David France, relied on deepfake technology to protect the identities of his subjects, swapping their faces for others’, but he left a slight shimmer around the activists’ heads to alert viewers to the manipulation – what Gregory described as an example of “creative signalling.” “It’s not that you have to literally label something – it’s not that you have to write something at the bottom of the screen every time you use a synthetic tool – but it’s fair to remind the audience that this is a representation,” he said. “If you watch a Ken Burns documentary, it doesn’t say ‘reconstruction’ at the bottom of every photo he’s animated. But there are norms and context – trying to think, within the nature of the genre, how you might show manipulation in a way that is responsible to the audience and doesn’t deceive them.”
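The shimmer is, at heart, a visible watermark over the altered region. As a loose illustration of the idea – not France’s actual pipeline, whose details haven’t been published; every name and parameter below is invented for the sketch – here is a minimal Python example that brightens a thin ring around a face-swapped area of a frame:

```python
import numpy as np

def shimmer_ring(frame: np.ndarray, box: tuple[int, int, int, int],
                 width: int = 6, strength: float = 0.25) -> np.ndarray:
    """Brighten a thin ring around a bounding box, as a visible
    'this region is synthetic' cue. `frame` is an H x W x 3 float
    array in [0, 1]; `box` is (top, left, bottom, right) in pixels."""
    top, left, bottom, right = box
    out = frame.copy()
    # Ring = expanded rectangle minus the original rectangle.
    mask = np.zeros(frame.shape[:2], dtype=bool)
    mask[max(top - width, 0):bottom + width,
         max(left - width, 0):right + width] = True
    mask[top:bottom, left:right] = False
    out[mask] = np.clip(out[mask] + strength, 0.0, 1.0)
    return out

# Hypothetical use: mark a face-swapped region in one decoded frame.
frame = np.random.rand(480, 640, 3)            # stand-in for a real frame
marked = shimmer_ring(frame, (120, 200, 280, 340))
```

In a real production the cue would be animated and composited per frame; the point is only that disclosure can be built into the image itself rather than into a caption.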

Gregory suggested that much of the discomfort people feel about “Roadrunner” may stem from the newness of the technology. “I’m not sure it’s really about what the director did in this movie – it’s that it gets us thinking about how this will play out, in terms of standards of what’s acceptable, of our expectations of the media,” he said. “It may well be that in a few years we’ll be comfortable with it, the same way we’re comfortable with a narrator reading a Civil War poem or letter.”

“There are some really great creative uses for these tools,” my second interviewee, Karen Hao, an editor at the MIT Technology Review who focuses on artificial intelligence, told me. “But we have to be very careful about how we use them from the start.” She pointed to two recent deployments of deepfake technology that she considers successful. The first, a 2020 collaboration between artists and A.I. companies, is a synthetic audio-video representation of Richard Nixon reading his infamous “In Event of Moon Disaster” speech, which he would have given had the Apollo 11 mission failed and Neil Armstrong and Buzz Aldrin perished. (“The first time I watched it, I got chills,” Hao said.) The second, a March episode of “The Simpsons” in which the character Mrs. Krabappel, voiced by the late actress Marcia Wallace, was resurrected by assembling phonemes from earlier recordings, passed her litmus test because, on a fictional show like “The Simpsons,” “you know a person’s voice doesn’t represent them, so there’s less attachment to whether the voice might be wrong,” Hao said. But, in the context of a documentary, “you don’t suddenly expect to see fake footage or hear fake sounds.”
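To make “assembling phonemes” concrete: the technique splices small units of archived speech back together into new words. Here is a deliberately toy Python sketch of that idea, under invented assumptions – the phoneme labels, sample rate, and crossfade length are all hypothetical, and nothing here reflects the actual Simpsons or “Roadrunner” pipelines:

```python
import numpy as np

SAMPLE_RATE = 22_050  # Hz; assumed for this sketch

def crossfade_concat(clips: list[np.ndarray], fade_ms: int = 20) -> np.ndarray:
    """Join audio clips with short linear crossfades to soften the seams."""
    fade = int(SAMPLE_RATE * fade_ms / 1000)
    out = clips[0]
    for clip in clips[1:]:
        ramp = np.linspace(0.0, 1.0, fade)
        seam = out[-fade:] * (1.0 - ramp) + clip[:fade] * ramp
        out = np.concatenate([out[:-fade], seam, clip[fade:]])
    return out

# Hypothetical library: phoneme label -> audio snippet cut from old recordings.
phoneme_library = {
    "HH": np.random.randn(2_000),  # stand-ins; real entries would be excised speech
    "AH": np.random.randn(3_000),
    "L":  np.random.randn(2_500),
    "OW": np.random.randn(3_500),
}

# "Hello," assembled from archival phonemes into one synthetic utterance.
utterance = crossfade_concat([phoneme_library[p] for p in ["HH", "AH", "L", "OW"]])
```

A modern A.I. voice clone, of the kind Neville commissioned, goes much further, generating entirely new audio from text rather than splicing old sound – which is part of why it unsettles listeners in a way the Krabappel splices apparently did not.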

A particularly disturbing aspect of the Bourdain voice clone, Hao speculated, may be its hybridization of reality and unreality: “It’s not clearly fake, and it’s not clearly real, and the fact that these were his real words only confuses things further.” In the world of broadcast media, deepfakes and other synthetic technologies are the logical successors to ubiquitous, more easily discernible techniques of analog and digital manipulation. Face renders and voice clones are already an emerging presence in scripted media, especially in big-budget productions, where they promise an alternative to laborious and expensive practical effects. But the potential of these technologies is undermined “if we introduce them to the public in a jarring way,” Hao said, adding, “It could give the public a more negative perception of this technology than it perhaps deserves.” Part of what makes Bourdain’s synthetic voice so unnerving is that it went undetected until Neville pointed it out. “I’m sure people are wondering, How many other things have I heard that I thought were real, because it was something that person would say, and it was actually fabricated?” Hao said. Still, she added, “I would recommend people give the guy” – Neville – “a little slack. It’s such fresh territory. Personally, I would be inclined to forgive him for crossing a line that didn’t exist before.”


