On Epic's plans for digital humans in Unreal and Microsoft's plans for digital humans everywhere else.
News
[Embedded video: "MetaHuman Creator: High-Fidelity Digital Humans Made Easy | Unreal Engine"]

Unreal Announces Metahumans
Two issues ago we talked about the then-just-announced Unreal Engine 5, and some of its aims to smooth out production workflows around high-density meshes. Since then, Epic has been on a kick building out the case for itself and that goal, releasing various videos about virtual production and cinematic asset creation/lighting on its YouTube channel.
Epic has effectively no competition here on the software side outside of people’s willingness to switch over from offline tooling to Unreal, so their non-stop PR blitz about virtual production is commendable. A general theme of these videos is how Unreal fits into all aspects of the production stack, but the elephant in the room has been (and will likely always be), you know, humans.
Humans, the part we probably most watch films for, have been cut out of the conversation (or obscured) when talking about virtual production. Virtual production is largely about everything besides humans.
Virtual production, by Epic’s reasoning, is largely an efficiency mechanism. Not “cheaper” or “easier,” but efficient. Unreal’s own “Virtual Production Field Guide” goes to great pains to describe how Virtual Production’s primary goal is to “reduce uncertainty”:
All of these efficiencies and increased image quality offer a trickle-down effect to more modest and tightly scheduled productions. By leveraging virtual production techniques with a real-time engine, network series, streaming productions, and indies can all achieve very high quality imagery and epic scope. A real-time engine has the potential to eliminate many of the bottlenecks of budgeting, schedule, and development time that can prohibit smaller-scale productions from producing imagery on par with blockbusters.
This is at odds with the fact that humans are, have always been, and will always be the least efficient part of movie production. They run late, have to eat, sleep, be famous, etc. What if you could make a movie using humans, but… not really.
This, in my mind, is the ultimate promise of the recently announced Metahumans. Epic is, in part, saying, “Listen, everything you already do is in Unreal… why not have your humans in Unreal as well?”
Digital humans aren’t a particularly new idea, but the fact that Metahumans work in a realtime context is what differentiates them. We’ve had really great digital humans for the past few decades through robust offline rendering, but offline rendering doesn’t mesh with the virtual production ethos. All the same live micro-tweaking talked about with virtual production can now be applied to human performances, too.
The pitch is salient, but at the same time you’re making a game engine the central nucleus of a film production. Show of hands: how many people feel comfortable debugging UASSET errors in Unreal while talent is staring at you from the stage and a producer is whispering in your ear that this bug is costing them thousands and the studio manager just walked in to say the next production has arrived and… oops you missed your shot. Every virtual production shoot day is essentially live coding a movie.
Don’t get me wrong, I’m a fan of virtual production and do think it’s what’s next, but I also think all talk of it, especially from Epic, often obscures the fact that you’re running a game engine in the background, which has all the issues game engines normally have. Not only this, but because you’re likely renting out space in someone else’s studio to shoot (you didn’t build your own LED wall right?), you’re relying on the past production to have cleaned up their Unreal workspace (lol) or the studio technical manager to have done it (lol). What happens when you boot up Unreal on shoot day and find a Windows update has broken it (it happens)?
I’m being a bit facetious here too — as good as Metahumans look (what a name too right?), nobody right now will assume these are actual humans… yet. Most of the copy on the main Metahumans page talks about using them in games, with virtual production getting only a passing mention:
Imagine […] digital doubles on the latest virtual production set that will stand up to close-up shots, virtual participants in immersive training scenarios you can’t tell from the real thing: the possibilities for creators are limitless.
These avatars are definitely AAA-game quality, a massive leap forward for any indie, but they are also a step towards realtime digital humans that are indistinguishable from the “real” thing.
Here we go again
Are we surprised that the most prominent Metahuman demos are an Asian man and a black woman, and that the people demoing and puppeting their faces (in the closed beta of Metahumans; this tech isn’t publicly accessible right now) are largely men who don’t share their race?
I spent all of the last issue of Rendered talking about this, saying “emerging technology can and will be used in ways that go beyond whatever cute scenario you’ve imagined for it, and as such we should be taking an active role in preventing these outcomes instead of accepting them as an inevitability.”
The fact that the #metahumans tag on Twitter is full of men trying on blackness is a massive issue in the same vein. I’m not a diversity and inclusion expert by any means, nor do I claim to have solutions, but what I have to ask is… what the hell? What sort of precedent are we setting here by giving groundbreaking technology to what seems to be the same class of people that has abused technology in the past to reinforce the same social norms? We have to be better.
If you’re reading this and work at Epic: advocate for diversity and inclusion in even the smallest tests like these. Consider what message you’re sending to your prospective users and other non-white creators who rely on your tools to create their work. Bake this into your process.
Be better.
Microsoft Mesh Announced

At its annual Ignite conference, Microsoft announced a new cloud service, Mesh: