Future presence? Creating films just by describing them to AI tech

[We’re used to watching movies and TV shows on high-definition screens while understanding, but not consciously acknowledging (except perhaps during the credits), that large groups of people with a wide variety of specialized skills worked for months or years, spent enormous amounts of money, and used sophisticated, specialized technologies to construct every element of our (presence) experience. But a new project described in this story from PC Magazine suggests that in the future “everyone” will be able to merely describe their vision of a movie’s form and content to an AI-based technology that will then create the film, producing an unlimited number of presence experiences in which the role of the technology is even more opaque and less likely to be understood or acknowledged. See the original story for three videos embedded from Twitter. –Matthew]

The Footage in This Sci-Fi Movie Project Comes From AI-Generated Images

A tech entrepreneur is using images from AI-based programs including Midjourney to unlock the storytelling. ‘We’re on the verge of a new era,’ he says.

By Michael Kan
September 1, 2022

Imagine producing your own film filled with big-budget scenery, but from a computer.

A tech entrepreneur in Germany named Fabian Stelzer is trying to do just that by using AI-powered programs to create the footage, sound effects, and voices for a 70s-inspired sci-fi film.

The experimental project is called Salt, and it’s built entirely with AI-generated art. To create the visuals, Stelzer has been tapping publicly available programs such as Midjourney, Stable Diffusion, and DALL-E 2, which can essentially draw anything you want by relying on a mere text description from the user.

On Twitter, Stelzer has been releasing Salt in short clips, called “story seeds.” As you can see, the film project is a bit like watching a Ken Burns documentary built around still images. That’s because AI-generated art can’t render moving pictures, at least not yet.

Nevertheless, Stelzer is able to create the feeling of motion through video editing and even some deepfake programs, which can make the characters’ faces move. All the voices in the film—including the female ones—also come from Stelzer, who’s been using an AI voice generator called Murf to create them.

“It’s all AI, except for one voice, which is… mine,” Stelzer told PCMag via Twitter.

Stelzer, who has a background in neuroscience, said he never considered himself an artist. But the rapid advancement of AI-based programs suggests that movie-making could one day be accessible to anyone.

“We’re on the verge of a new era, really,” he said, later adding, “To me this is as big as the invention of photography, and to be honest maybe as big as the invention of writing.”

Indeed, programs such as Midjourney and DALL-E 2 have already given people with no art skills the ability to quickly produce images only a professional artist could create. Stelzer decided to start his film project after he saw one AI-generated image that looked like a mysterious sci-fi world out of a 1970s film. “I had to go in,” he wrote in a tweet.

Stelzer’s been creating various scenes for Salt by using text prompts like this: “<DESCRIPTION OF SCENE>, film still of a 1980s sci-fi movie, screenshot from film, photography, 35mm, grainy, VHS distortion, cinematic lighting.” Each two-minute chapter he’s made so far took about two hours to create.
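The prompt pattern described above — a scene description followed by a fixed list of style keywords — can be sketched as a small template function. This is a hypothetical helper for illustration, not Stelzer’s actual tooling:

```python
# Sketch of the prompt pattern described above: a scene description
# followed by a fixed suffix of 1980s-film style keywords.
# Hypothetical helper, not Stelzer's actual workflow.
STYLE_SUFFIX = (
    "film still of a 1980s sci-fi movie, screenshot from film, "
    "photography, 35mm, grainy, VHS distortion, cinematic lighting"
)

def build_prompt(scene_description: str) -> str:
    """Combine a scene description with the fixed style suffix."""
    return f"{scene_description}, {STYLE_SUFFIX}"

# Example scene description (illustrative, not from the film):
print(build_prompt("a lone figure crossing a glittering salt flat"))
```

Keeping the style suffix constant across every prompt is what gives the stills a consistent look from scene to scene, even though each image is generated independently.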

Salt’s story focuses on a mysterious salt-like substance on a mining colony. The trailer hints that the substance has an alien origin and a nefarious purpose.

However, Salt won’t be a traditional film. Instead, Stelzer is letting the public vote on how the film’s story will develop after each story seed. The most recent chapter asks whether one of the characters, Sara, should follow her orders, disobey, or take a small sample of the mysterious salt-like substance before shipping out the containers. Stelzer envisions creating multiple, branching plot lines for the film to entertain different groups.
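The branching structure Stelzer describes — each story seed ending in an audience vote that selects the next chapter — can be sketched as a simple story graph. All chapter names below are illustrative; only the three choices come from the article:

```python
# Minimal sketch of a branching, vote-driven story graph.
# Each chapter maps the possible audience choices to a next chapter.
# Chapter names are illustrative placeholders.
story = {
    "chapter_1": {
        "follow orders": "chapter_2a",
        "disobey": "chapter_2b",
        "take a sample": "chapter_2c",
    },
    # Later chapters are left empty until they are written and voted on.
    "chapter_2a": {},
    "chapter_2b": {},
    "chapter_2c": {},
}

def next_chapter(current: str, winning_vote: str) -> str:
    """Return the chapter that follows `current` given the winning vote."""
    return story[current][winning_vote]

print(next_chapter("chapter_1", "take a sample"))
```

Because each vote only fixes one edge of the graph, the other branches remain available for the “multiple, branching plot lines” Stelzer envisions for different audiences.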

“I definitely want to have a ‘Director’s Cut’ at some point, or a ‘Community’s Cut,’ but the real goal is to transcend the medium of film into something new,” he said. “Like enable everyone in the community to eventually use a model that lets them write their own scenes.”

The idea could represent the future of film-making. Don’t like how a movie ended? Well, AI technologies could help you create a new one more to your liking.

For now, it’s obvious Stelzer’s movie-making process still faces limitations, especially when it comes to depicting moving characters. However, he notes the same AI-based programs are quickly improving, so it may only be a matter of time before they can help him render realistic in-motion movie scenes.

“All big things start as experiments,” he added.

This entry was posted in Presence in the News.
