Synthetic humans: New tech further closes gap between real and CGI

[Epic Games is leading a group of companies in making significant strides in “crossing the uncanny valley” to create more convincing, less creepy presence illusions of people. The short story below from Slashgear summarizes one of the developments demonstrated at the recent Game Developers Conference; more details follow in a story from VentureBeat. The original versions of both contain more images and videos, and more information is available in coverage by The Verge and the Unreal Engine blog. Note especially this prediction in the VentureBeat story: “By 2024, we may all be interacting with digital humans in some way or other, whether it’s via headsets, films, TV, games, live performances and broadcasts, or by directing digital assistants in our homes in real-time.” –Matthew]

[Image: Source: Engadget]

Epic Games’ Siren demo is both amazing and super-creepy

Eric Abent
Mar 22, 2018

Epic Games is no stranger to incredibly detailed motion capture technology. 2017’s Hellblade: Senua’s Sacrifice showcased Unreal Engine’s motion capture capabilities, and it received a lot of praise for just how detailed the in-game character of Senua was. Now Epic is looking to take what it learned with Hellblade one step further, creating a new digital personality called Siren who is as impressive as she is unsettling.

For this demo, Epic partnered with four other companies: Vicon, Cubic Motion, 3Lateral, and Tencent. What sets Siren apart from other digital personalities similar to her is that she relies on real-time motion capture. Using technologies from all five companies, Unreal Engine is able to render Siren in real-time based on the actions of an actress who seems to be moving in tandem with her.

That’s impressive enough on its own, but the level of detail in Siren’s model takes this to a completely different level. In an interview with VentureBeat, Epic Games chief technology officer Kim Libreri discussed how Siren’s model has improved over Senua from Hellblade, noting that facial detail has drastically improved. For instance, Libreri says that Siren actually has 300,000 tiny hairs all over her face – something that graphics technology couldn’t support back when Hellblade was being created in 2016, but can now.

Add to that improvements to shading, reflection, and refraction, and we have Siren, taking us one step closer to CGI that’s indistinguishable from the real thing. There are still some spots that need work, Libreri admits, and as advanced as Siren is, it’s still somewhat easy to tell that she’s a CGI character. Even so, this represents a pretty big jump in quality over what we saw in Hellblade, and it’s going to keep getting better from here.

While we may not see motion capture on Siren’s level make it into video games in the immediate future, this has big implications for the industry as a whole. Outside of video games, the ability to create a digital personality in real time can be applied across many other industries, especially given the rising popularity of live streaming, so Epic and its partners are definitely onto something special here.

—–

[From VentureBeat]

Epic Games shows off amazing real-time digital human with Siren demo

Dean Takahashi
March 21, 2018

Epic Games, Cubic Motion, 3Lateral, Tencent, and Vicon took a big step toward creating believable digital humans today with the debut of Siren, a demo of a woman rendered in real-time using Epic’s Unreal Engine 4 technology.

The move is a step toward transforming both films and games using digital humans who look and act like the real thing. The tech, shown off at Epic’s event at the Game Developers Conference in San Francisco, is available for licensing for game or film makers.

Cubic Motion’s computer vision technology lets producers create digital facial animation instantly, saving the time and cost of animating it by hand.

“Everything you saw was running in the Unreal Engine at 60 frames per second,” said Epic Games chief technology officer Kim Libreri, during a press briefing on Wednesday morning at GDC. “Creating believable digital characters that you can interact with and direct in real-time is one of the most exciting things that has happened in the computer graphics industry in recent years.”

The tech is an improvement on what Epic showed two years ago with the demo of the Senua character from the groundbreaking Ninja Theory video game Hellblade, which debuted in 2017.

Epic [unveiled] the Siren character, which was driven by a live actress. It showed realistic face and eye movement and the ability to interact with participants. The demo used 3Lateral’s facial rigging technology with body animation driven by Vicon’s motion capture system.

Cubic Motion’s computer vision technology tracks more than 200 facial features at over 90 frames per second. It automatically maps this data to high-quality digital characters in real-time.
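To make the pipeline concrete, here is a minimal sketch of the reduction step such a system performs each frame: collapsing hundreds of tracked facial landmarks into a fixed set of blendshape weights that can drive a rigged character. This is an illustrative assumption, not Cubic Motion’s actual solver or API; the names `FacialFrame` and `solve_blendshapes` are hypothetical, and a production solver fits landmarks against a face rig rather than averaging them.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FacialFrame:
    """One frame of tracked facial data (hypothetical structure)."""
    timestamp: float
    landmarks: List[float]  # flattened (x, y) positions of tracked features

def solve_blendshapes(frame: FacialFrame, num_shapes: int = 50) -> List[float]:
    """Reduce tracked landmarks to a fixed set of blendshape weights.

    Here we simply average chunks of landmark values and clamp to [0, 1];
    a real solver fits the landmarks against a rigged face model.
    """
    chunk = max(1, len(frame.landmarks) // num_shapes)
    weights = []
    for i in range(0, len(frame.landmarks), chunk):
        block = frame.landmarks[i:i + chunk]
        w = sum(block) / len(block)
        weights.append(min(1.0, max(0.0, w)))  # clamp to valid weight range
    return weights[:num_shapes]

# 200 tracked features -> 400 coordinate values per frame, at 90+ fps
frame = FacialFrame(timestamp=0.0, landmarks=[0.5] * 400)
weights = solve_blendshapes(frame)
```

The point of the sketch is the data-rate math: at 90 frames per second with 200 features, the solver must run well under 11 milliseconds per frame to keep the character in sync with the actor.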

“We are offering the keys to unlock a virtual world, enabling content producers and game developers to more easily interact with our technology and streamline the creation process for performance driven real-time digital humans,” said Andy Wood, Chairman of Cubic Motion, in a statement. “By 2020, this will no doubt transform content production across the board by making this technology universally available. By 2024, we may all be interacting with digital humans in some way or other, whether it’s via headsets, films, TV, games, live performances and broadcasts, or by directing digital assistants in our homes in real-time.”

Cubic Motion’s motion-capture technology allows a live actor to perform within the entertainment and game world in real-time. The technology transforms the production process, allowing directors and producers to see the result instantly and perform continuous retakes, saving time and money when developing games, film, TV and virtual assistants.

It also works for live esports broadcasts, allowing new content to be created in real-time and transmitted directly into a game.

The real-time presentation displayed during Epic’s keynote was previously recorded on Vicon’s capture stage at its headquarters in Oxford, England. To create the video, actress Alexa Lee wore a full body motion capture suit with a head-mounted camera.

Using Vicon’s new Shōgun 1.2 software, her body and finger movements were captured on one screen while the data was streamed into Unreal Engine using Vicon’s new Live Link plugin. On a second screen, the Siren character — created using the likeness of Chinese actress Bingjie Jiang — moved in sync, driven in-engine at 60 frames per second.
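One detail worth noting in the setup above is the rate mismatch: tracking runs faster than rendering (90+ Hz capture versus 60 fps in-engine), so a Live Link-style stream lets the renderer consume only the most recent frame rather than queuing every sample. Below is a minimal sketch of that pattern; the class name and rates are illustrative assumptions, not Vicon’s actual Live Link API.

```python
import threading

class LatestFrameLink:
    """Single-slot, thread-safe frame handoff between capture and render.

    The capture thread pushes frames as fast as they arrive; the render
    thread reads whatever is newest. Stale frames are simply overwritten,
    which keeps the rendered character in sync instead of lagging behind
    a growing queue.
    """

    def __init__(self):
        self._lock = threading.Lock()
        self._frame = None

    def push(self, frame):
        with self._lock:
            self._frame = frame  # overwrite: drop anything not yet consumed

    def latest(self):
        with self._lock:
            return self._frame

# Capture side pushes at ~90 Hz; render side calls latest() at ~60 Hz
link = LatestFrameLink()
for i in range(10):
    link.push(i)
```

Dropping stale frames is the key design choice here: queuing every capture sample would add a frame of latency for each rate-mismatch cycle, which is exactly what a live, interactive character cannot afford.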

“When we began working on Siren, we knew from the beginning that it was going to push several boundaries. To make this possible we needed the best motion capture hardware and software,” said Kim Libreri, Epic Games’ CTO.

The Siren project began as a collaboration between Epic and Tencent, with the goal of creating a proof-of-concept demonstration to show both the capabilities of Unreal Engine 4, and what the next generation of digital characters will look like.

Libreri said the real actress would demo the tech live at Epic’s booth during GDC.

This entry was posted in Presence in the News.
