Edward Saatchi, CEO of Amazon-backed startup Fable, sees AI as “possibly the end of human creativity,” at least as an exclusively human phenomenon. And no, he doesn’t believe that’s a bad thing.
The exec made the provocative comments Friday in an interview on CNBC’s Squawk Box. The appearance was timed to the news that Fable’s interactive platform Showrunner has embarked on a “non-commercial, academic” initiative to reconstruct lost footage from the 1942 Orson Welles film The Magnificent Ambersons.
“What’s coming is a world where we’re not the only creative species,” Saatchi said, “and that we will enjoy entertainment created by AIs. So, we wanted to train our AI on the greatest storyteller of the past 200 years, Orson Welles.”
Following close on the heels of Welles’ towering achievement Citizen Kane, Ambersons was released by RKO in 1942 in a slimmed-down version bearing a tacked-on happy ending that the filmmaker did not approve. About 43 minutes of the director’s cut wound up being destroyed so that the silver nitrate could be reclaimed from the physical print. Because Welles was in Brazil when the cuts were made, film historians and cinephiles have long searched for remnants there. Over time, the fate of the footage has become one of the biggest enigmas in Hollywood history.
Warner Bros., which has participated in recent efforts to revitalize library titles like The Wizard of Oz and Willy Wonka and the Chocolate Factory, has not formally endorsed the Ambersons project. “This painstaking AI reconstruction over the next two years aims to get as close as possible to Welles’ exact vision – as close as possible without finding the destroyed footage,” Saatchi said in a press release.
Anxiety remains elevated in the creative community over fears that AI will take jobs, a concern that motivated the WGA and SAG-AFTRA to secure some protections on that front in their 2023 negotiations with studios. Saatchi acknowledges the disruption but sees AI as a “huge revenue generator” for studios, saying his company has talked with Disney and other large players.
The goal would be for IP holders, including filmmakers and writers, to share in that revenue opportunity. “You could imagine a world where a movie would come out on a Friday, with [an AI model] alongside it, day-and-date,” the CEO said. By Sunday of opening weekend, he imagined, “there are millions of new scenes” and even full fan-generated features online.
While studios and other stakeholders were dead-set against that scenario as recently as a year ago, Saatchi says they are starting to come around to the idea of monetizing AI. “They could make enormous amounts of money,” he said, instead of having Google or other tech firms not connected with studios reaping all of the benefits.
Computers generating original work “is something Warhol would have found very exciting, da Vinci,” Saatchi said. “The idea that AI can be creative and that you can create a work of art that creates more works of art is really exciting.”
Filmmaker and researcher Brian Rose, who has spent the past five years trying to piece together the 30,000 missing frames from a range of source material, has boarded the Showrunner project. As part of his reconstruction, Rose rebuilt the physical sets in 3D and then worked out the camera moves to fit the documentary evidence: the script, extensive notes from Welles and the studio, set photos, interviews and archival materials.
Of the 73 scenes in the original film, 21 were either cut completely or reshot and 39 were shortened, according to Showrunner. Rose said “only 13 were left intact. They just radically altered the film. These changes were all made without Welles’s approval.”