
MetaHuman Animator starts to lead us out of the Uncanny Valley

The MetaHuman Creator interface - drag and drop people

Some of the results we are starting to see from the combination of Unreal Engine 5.3 and the MetaHuman tool for Unreal are signposting a route beyond the Uncanny Valley.

The MetaHuman tool for Unreal has been available for quite a while now, but for those not yet familiar with it, it's a tool for building animatable 3D characters. The MetaHuman Creator itself runs in a web browser, and the finished characters are handed off to the Unreal Editor on the local machine.

The process begins with selecting a base MetaHuman from the ones available and then customizing it. There are several presets for build, height, and gender, plus a virtual DNA tool: select three MetaHuman “parents” and then drag the dot in the center to blend between them. For further customization, there is a sculpt tool, currently limited to the MetaHuman's face. It offers a set of controls represented by dots, with lines indicating the direction in which they will shift facial features. It's pretty easy to use after just a little experimentation.
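Epic hasn't published how the virtual DNA blend works internally, but conceptually the three-parent dot behaves like barycentric interpolation: the dot's position inside the triangle of parents yields three weights that sum to one, and each facial parameter becomes the weighted average of the parents' values. A minimal sketch of that idea (the parameter names and values here are purely illustrative):

```python
# Hypothetical sketch of a three-parent "virtual DNA" blend as
# barycentric interpolation -- not Epic's actual implementation.

def blend_parents(parent_a, parent_b, parent_c, w_a, w_b, w_c):
    """Blend three parents' facial-parameter vectors by barycentric weights.

    The weights correspond to the dot's position inside the triangle
    of parents and must sum to 1.
    """
    assert abs(w_a + w_b + w_c - 1.0) < 1e-9, "weights must sum to 1"
    return [w_a * a + w_b * b + w_c * c
            for a, b, c in zip(parent_a, parent_b, parent_c)]

# Illustrative parameters (e.g. jaw width, nose length, brow height)
# for three parent MetaHumans:
a = [1.0, 0.2, 0.5]
b = [0.4, 0.8, 0.5]
c = [0.7, 0.5, 0.2]

# Dot in the center of the triangle -> an equal third of each parent.
print(blend_parents(a, b, c, 1/3, 1/3, 1/3))
```

Dragging the dot toward one parent simply shifts its weight toward 1, so the blended face converges on that parent.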

There is some hair and clothing customization available, but the options there are somewhat limited at the moment.

The Creator app has several lighting and backdrop presets available, as well as several pose and emotion presets.

MetaHuman Creator operates in the cloud, with Unreal serving only as a front end, so the resulting MetaHumans are easy to browse and download with Quixel Bridge and, from there, import into Unreal, complete with their light rigs if desired.

Once in Unreal, however, things get more interesting. The MetaHumans arrive fully rigged, ready for keyframe animation or performance capture.

Performance capture in Unreal is not new, but it usually requires specialized stereo rigs and relatively large systems. Now, all it requires is an iPhone. MetaHuman Animator can take a few frames from the captured video, build a rig from them, and then use that rig to map a captured performance to any MetaHuman. With the new rendering features in Unreal 5.3 and a current-generation CPU and GPU, the process takes just a few minutes, and the results are astonishing. Even more remarkable is that it's now feasible for an indie game developer to use tools like this and create the kind of realistic characters we're seeing.

While the results are not perfect yet, MetaHumans are almost out of the Uncanny Valley. Here’s an example from JSFILMZ on YouTube.

