At GDC’s State of Unreal last month, Unreal Editor for Fortnite and its creator economy took centre stage, illustrating Epic Games’ vision for the metaverse.
CEO Tim Sweeney explained to us why and how the metaverse is “happening for real” (unless Apple tries to crush it) and EVP Saxs Persson told us more about expanding Fortnite’s creations beyond shooters and building an ecosystem based on fairness.
But the State of Unreal also introduced several major features and tools that all play a part in Epic Games’ ambition to be a leading competitor in the engine space and beyond.
One of them is a set of new procedural tools, part of the Unreal Engine 5.2 update and of Epic’s wider photorealism push. At the State of Unreal, the demo prompted audible gasps from the audience; the new tools let developers generate photoreal foliage, scatter rocks, carve paths and place other natural elements in just a few clicks, building on an initial piece of man-made art direction.
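Epic didn’t detail the tool’s internals on stage, but the core idea – filling in natural detail around hand-authored guide geometry – can be sketched in a few lines. The Python below is purely illustrative: it assumes nothing about Unreal’s actual PCG framework, and the function name and parameters are our own invention.

```python
import random

# Hypothetical sketch of procedural scattering guided by man-made art
# direction; NOT Unreal's PCG framework, just the general idea.

def scatter_foliage(path_points, density_per_step=4, jitter=2.5, seed=42):
    """Scatter foliage instances around an artist-drawn path.

    path_points: list of (x, y) points the artist placed by hand.
    Returns a list of (x, y, scale) transforms for foliage meshes.
    """
    rng = random.Random(seed)
    instances = []
    for (x, y) in path_points:
        for _ in range(density_per_step):
            # Jitter each instance so the result reads as natural, not gridded.
            ix = x + rng.uniform(-jitter, jitter)
            iy = y + rng.uniform(-jitter, jitter)
            scale = rng.uniform(0.8, 1.3)
            instances.append((ix, iy, scale))
    return instances

# The artist authors the rough path; the procedural step fills in the detail.
art_directed_path = [(0, 0), (5, 1), (10, 3), (15, 2)]
foliage = scatter_foliage(art_directed_path)
print(f"Placed {len(foliage)} foliage instances along the path")
```

The artist stays in control of the broad strokes, while the procedural step handles the repetitive placement work.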
“When we were envisioning UE5 and the suite of all the tools and technology, procedural content generation was one of the big ones,” VP of engineering Nick Penwarden tells GamesIndustry.biz. “And in order to ship 5.0 originally, we knew we were gonna do that a little bit later.
“We were really interested in making new tools to help developers be more efficient and be more creative, and so we’d been working on the procedural content generation tools for a while. We worked on those all throughout last year.”
It’s now been one year since the release of Unreal Engine 5 and our last chat with Nick Penwarden. At the State of Unreal, Epic communications director Dana Cowley shared that 77% of Unreal users are now using UE5, with the engine currently boasting 750,000 active users.
“I think the first year’s been pretty amazing, seeing what customers are doing with the engine, seeing how many people have made the transition from Unreal Engine 4 to Unreal Engine 5, it’s really cool to see,” Penwarden says. “We tried to make that as easy as possible, but [having] so many developers [thinking] it was worth their time and energy to [transition] was really good to see.
“The hardest problem is probably deciding what to do next in what order! There’s so much we want to do.”
When introducing the new procedural tools on stage, Penwarden said it was all about how procedural systems can bring an environment to life, but that there will always be a need for man-made elements. We ask him for his thoughts on generative AI and the concerns surrounding it.
“We’re always going to need that human element of creativity,” he says. “Tools help humans be more expressive and different types of AI tools like non-generative AI tools… There’s a lot of potential for those to really be powerful tools for artists to be able to more easily express their creativity.”
He adds: “We think that any sort of data used for AI to learn from needs to be ethically sourced.”
Epic Games’ CTO Kim Libreri chimes in on the generative AI debate, agreeing that it doesn’t replace human creativity.
“And it wouldn’t even exist without people,” he continues. “But I think as a tool, for an artist, if anything it helps inspire people to be even more creative. What can machines not produce? What new level of artistry or art form, or type of storytelling, is going to be possible?”
Talking about what machines can produce, another impressive aspect of Epic’s presentation at GDC was MetaHuman Animator. Two years ago, Epic launched MetaHuman Creator, which promised high-fidelity human characters in under an hour. Libreri shared on stage that, since then, four million characters have been created by one million users.
A continuation of that tool, MetaHuman Animator promises to animate characters in mere minutes, with the on-stage demo showing how a few seconds of footage of actress Melina Juergens, filmed on a phone, can be turned into an animated performance in just a few clicks.
“We have a long-term plan for the MetaHuman product,” says VP for digital humans technology Vlad Mastilovic. “The initial concepts were drawn ten years ago. Animator was always in the plans, because [it’s] one thing to create hyper-realistic characters, but then animating them is also an important part.
“Ultimately, when it comes to digital humans, it’s really the weakest component that defines the final quality of the whole thing. So rendering got really good a few years ago and the textures [and so on], but animation was always a challenge, because as soon as those stills start moving, that’s when the illusion breaks.
“One of the key innovations for MetaHuman Animator is the MetaHuman DNA file, which is at the same time the character rig and the animation solver. If it’s a rig, then it comes from some kind of base elements and computes them into a final expression; but if you also have final data of how the expression should look, it also inverse-solves it down to those basic elements. So this is how it works: DNA is really the key that unlocks both things.”
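To make that forward/inverse duality concrete, consider a simple blendshape model: the forward pass combines base expressions by weights into a final expression, while the inverse solve recovers those weights from a captured expression via least squares. The sketch below is a generic illustration, not Epic’s DNA format or solver, neither of which has been published in this detail.

```python
import numpy as np

# Illustrative blendshape model, NOT Epic's MetaHuman DNA format.
# Each column of rig_basis holds the vertex offsets of one base
# expression ("base element"), flattened to a vector.
rng = np.random.default_rng(0)
rig_basis = rng.normal(size=(3000, 50))  # 1,000 vertices x 3 coords, 50 bases

def forward_rig(weights):
    """Rig mode: combine base elements into a final expression."""
    return rig_basis @ weights

def inverse_solve(target_expression):
    """Solver mode: recover base-element weights from a captured
    expression via least squares (the 'inverse solve')."""
    weights, *_ = np.linalg.lstsq(rig_basis, target_expression, rcond=None)
    return weights

# Round trip: weights -> expression -> recovered weights.
true_weights = rng.uniform(0.0, 1.0, size=50)
expression = forward_rig(true_weights)
recovered = inverse_solve(expression)
assert np.allclose(recovered, true_weights, atol=1e-6)
```

One artefact, two directions: drive the face forward from animation data, or solve captured footage back down to the same controls.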
Epic’s evolutions – whether procedural tools, MetaHumans, or others – all take into account the growing convergence of film and games, with its engine increasingly used by the film and TV industries on large productions. Over 550 TV and film projects have been made using Unreal Engine to date, Cowley revealed at GDC.
“A large portion of [Epic] are game makers by trade, and a large portion came from the film industry,” says Libreri, who spent 20 years in film before making the jump to games a decade ago. “A lot of us use the same tools across the two industries. The motion picture industry sort of pioneered the use of computer graphics to make photorealistic images and we know that gamers have always been interested in better looking images. It doesn’t mean they need to be photorealistic – Fortnite isn’t photorealistic – but it benefits from very high quality [images].
“And what we’ve been able to do is take a bunch of techniques and turn them into real-time versions, and actually improve on them. And because a lot of us worked in the film business as well as the games business, we sort of have a natural intuition for that. And we really believe that over the next decade, we’re going to see all forms of entertainment merge into one sort of melting pot of, you know, you make an IP, you make some characters, you make an environment – could be a game, could be a movie, could be an experience, who knows what it could be! We’re beginning to see this.”
Penwarden says that this is something that’s at the back of Epic’s mind at all times – the ability for Unreal to scale up or down, and cater both to a small mobile game and to a big film production.
“As we’re developing the engine, we’re always thinking about scalability of the businesses. How do we provide tools, tech, algorithms, that not only can you make a console game and get it running on a mobile phone, but also use that technology for film and achieve cinematic quality as well?
“It’s an extra dimension whereas, many years ago, if you were focused on just making an Xbox 360 game, you’re not really thinking about, ‘how does this scale for film?’ But today that’s very core to how we’re developing Unreal Engine.”
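As a loose illustration of that “extra dimension”, imagine a single project whose content is authored once and whose renderer is dialled per target. The tier names and settings below are invented for illustration; they are not Unreal Engine’s actual configuration keys.

```python
# Hypothetical quality tiers: one set of assets, with the rendering
# budget dialled up or down per target. Keys are illustrative only.
QUALITY_TIERS = {
    "mobile":    {"shadow_quality": 1, "dynamic_gi": False, "screen_percentage": 75},
    "console":   {"shadow_quality": 3, "dynamic_gi": True,  "screen_percentage": 100},
    "cinematic": {"shadow_quality": 4, "dynamic_gi": True,  "screen_percentage": 200},
}

def apply_tier(name):
    """Same content pipeline; only the rendering budget changes."""
    settings = QUALITY_TIERS[name]
    print(f"Applying '{name}' tier: {settings}")
    return settings

apply_tier("mobile")     # phone-friendly budget
apply_tier("cinematic")  # film-quality budget
```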
And that increasing focus on film is also core to the vision for MetaHuman Animator, Mastilovic adds.
“We’re going to a future where movies and games are going to be mixed, and we’ll probably be able to watch a movie and then play a game which looks almost the same,” he says. “And for this to happen, we need to enable actors to exist in that space. When it comes to interactive mediums, they need to be translated into that 3D space.”
He continues: “We can do that at a reasonable level of quality – we have not solved the problem, but we’re reasonably good at it. But I think the more important issue for us is to enable everybody else to do the same, without having to spend millions of dollars in order to do this for a single actor. And I think MetaHuman Animator is a part of that agenda.”
While the games industry seems to see the convergence of film and games in a positive light – from a technical point of view at least – it remains to be seen whether the motion picture industry reciprocates that interest. We suggest to the table that film isn’t very interested in games.
“You know what? I don’t think it matters,” Libreri reacts. “I think what matters is consumers. We’ve just seen it with a bunch of TV shows. Look at The Last of Us, it was a tremendous success from the same minds that created the game. So the film business… it depends. A lot of the old school filmmakers are just obsessed with telling linear stories.
“But when you think about the percentage of people that play video games nowadays… As we have a new generation of filmmakers, they absolutely are not just open but want to explore the stories they tell in all sorts of mediums. So I think we’ll see, over the next generation, a crossover just naturally happening. We’re going to see a new breed of creator.”
Concluding our chat, we ask Libreri, Penwarden and Mastilovic what’s ahead for Epic.
“I think we’re going to continue our march towards photorealism,” Penwarden says. “That’s built into Epic’s culture. I’m really interested in continuing to build out the unified toolset to give artists more and better tools that allow them to be successful without worrying about technical details wherever possible.”
Mastilovic says their mission is not only to achieve photorealism, but also to achieve diversity in their data sets, as far as MetaHumans are concerned, so that they can “generate pretty much anybody, represent everybody on the planet, which is incredibly difficult.”
“And then we want to enable people to become a MetaHuman,” he adds. “Hopefully, in the future, we can enable people to capture a few images on future devices and become a MetaHuman. But ultimately we would also like to model behaviours, and I think that will provide a very interesting immersion in this virtual world, if you’re surrounded with characters which are actually not like NPCs but agents with some sense of intelligence. That would be quite amazing.”
This, of course, loops back to Epic’s vision for the metaverse, which was at the core of its GDC showcase. As Saxs Persson told us: “The metaverse needs human inhabitants. And part of this keynote was to show how these things are converging into a single story.”
But that’s not all, as Libreri points out: “On the metaverse ecosystem side of things, I want to start experimenting with entertainment that is not just regular video games. I think that [there’s] potential to entertain millions of people in ways they’ve never seen before, because it’s a virtual world, it’s huge. The concerts we did with Travis Scott and Ariana Grande were just the tip of the iceberg on what’s possible.
“We want to be able to have players get closer to the stars that they care about in a way that their interactions are meaningful. Bringing music into the virtual universe is going to be really cool. I just think there’s a whole infinite amount of possibilities and we just want to make sure that the actual underlying engine is capable of, you know, not being a glass ceiling to people’s dreams.”