Anthony Bourdain’s documentary and the ethics of audio deepfakes


Today, using machine learning to simulate a deceased person onscreen is an accepted Hollywood technique. Synthetic media, widely known as “deepfakes” (a portmanteau of “deep learning” and “fake”), was famously used to recreate Carrie Fisher in a Star Wars movie. In 2019, footage of comedian Jimmy Fallon eerily transformed into Donald Trump showed just how advanced the technology had become.

But no one was laughing when it was revealed that deepfake technology was used to simulate Anthony Bourdain’s voice in the new documentary Roadrunner: A Film About Anthony Bourdain. In an interview with GQ, director Morgan Neville revealed that he commissioned an AI model of the chef and TV personality’s voice and considered using it to narrate the entire movie. In the final cut, Neville told The New Yorker, he used it for three lines of the two-hour production. Among them is a poignant line from an email to artist David Choe: “My life is sort of shit now. You are successful, and I am successful, and I’m wondering: Are you happy?”

Neville uses the AI-generated clip as an artistic touch to accentuate Choe’s pathos as he recounts the last email he received before Bourdain’s death by suicide in 2018. The audio is compelling, if a little flatter than the rest of Bourdain’s narration.

An emerging controversy over the recreation of Bourdain’s voice

Neville, a former reporter, didn’t see a problem with mixing AI-generated sound bites with actual clips of Bourdain’s voice. “We can have a documentary-ethics panel about it later,” he joked in The New Yorker interview.

Neville added that he obtained consent from Bourdain’s estate. “I checked, you know, with his widow and executor, just to make sure people were cool with that. And they were like, ‘Tony would have been cool with that.’ I didn’t put words in his mouth. I was just trying to bring them to life,” he told GQ. Bourdain’s ex-wife Ottavia Busia-Bourdain, who appears extensively in the documentary, later disputed this, saying she had never authorized an audio surrogate.

Meredith Broussard, a journalism professor at New York University and author of the book Artificial Unintelligence: How Computers Misunderstand the World, says it’s understandable that many find Bourdain’s audio clone deeply disturbing. “I’m not surprised his widow doesn’t feel like she gave her permission for this,” she says. “It’s such a new technology that no one really expects it to be used this way.”

The use of AI in journalism poses the biggest ethical dilemma, Broussard says. “People are more forgiving when we use this kind of technology in fiction as opposed to documentaries,” she explains. “In a documentary, people feel like it’s real, and therefore they feel cheated.”

Simulated media and AI in journalism

Roadrunner, which is co-produced by CNN, is not the first instance of a news outlet relying on AI. The Associated Press, for example, has been using AI to automatically generate articles on companies’ quarterly earnings since 2015. Each automatically generated AP article is accompanied by a note, “This story was generated by Automated Insights,” referring to the machine-learning technology it uses.
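To illustrate that disclosure convention, here is a minimal sketch of how a newsroom pipeline might append a boilerplate note to machine-generated copy before publication. The function name, note handling, and sample story are invented for illustration and do not describe Automated Insights’ actual system.

DISCLOSURE = "This story was generated by Automated Insights."

def publish_automated_story(headline: str, body: str) -> str:
    # Append the disclosure once, so readers always see that the copy was machine-generated.
    if DISCLOSURE not in body:
        body = f"{body}\n\n{DISCLOSURE}"
    return f"{headline}\n\n{body}"

print(publish_automated_story(
    "Acme Corp tops Street 1Q forecasts",
    "Acme Corp on Tuesday reported first-quarter profit of $1.2 billion.",
))

The point of keeping the note in the publishing step itself, rather than leaving it to individual editors, is that the disclosure cannot be accidentally omitted.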

It is imperative to clearly disclose when AI is used, Broussard says. That Neville discussed it only after the fact is notable, though it is questionable whether his explanation will satisfy his detractors. “It’s interesting that documentary filmmakers are going to have to think about the ethics of deepfakes,” Broussard says. “They’ve always thought about the ethics of storytelling, just like journalists, but here’s a whole new area where we’re going to have to develop ethical standards.”

The controversy reignites a long-standing debate about how journalists handle quotes. The famed writer Gay Talese, for example, reconstructs quotes as he remembers them, believing that the tape recorder is “the death knell of literary reporting.” In her book The Journalist and the Murderer, The New Yorker’s Janet Malcolm pointed out the problem of combining fragments of multiple interviews into a single statement, as writer Joe McGinniss did when covering the murder trial of former physician Jeffrey MacDonald. “The journalist can no more create his subjects than the analyst can create his patients,” she wrote. The late writer herself was embroiled in a decade-long legal battle over five quotes she used in a 1983 profile of the projects director of the Sigmund Freud Archives. The libel case was ultimately decided in Malcolm’s favor.

The ethics of deepfakes

Broussard is unsure of her own position on Neville’s use of deepfake technology. “The problem with ethics is that it’s all about context,” she explains. “Three lines in a documentary film is not the end of the world, but it’s important as a precedent. And it’s important to have a conversation about whether we think it’s the right thing to do.”

Ultimately, Broussard says, the Roadrunner controversy presents another argument for regulating the use of AI as a whole. “There is an emerging conversation in the field of machine learning about the need for ethics in machine learning,” she says. “I am grateful that this conversation has started, because it is long overdue.”


