After years of work, Epic Games is launching early access to game developers for Unreal Engine 5, the latest version of the company’s tools for making games with highly realistic 3D animations. And to show it off, Epic Games released an amazing video that showed how the engine could render a game space that faithfully reproduced a huge area of the red rock formations of Moab, Utah.
Epic is no doubt pleased to be talking about something other than its three-week antitrust trial against Apple that just wrapped up on Monday. And while the company creates a lot of fun with Fortnite, the Unreal Engine goes back to the geeky roots of the company and its CEO Tim Sweeney, who is one of the world’s graphics gurus. Founded in 1991, Epic now has 3,200 employees and is once again free to tout how it innovates in the gaming market.
And Unreal Engine 5, which will officially ship in 2022, is the company’s crowning technical achievement. The early access build will let game developers start testing features and prototyping their upcoming games. Epic isn’t saying how long this took or how many employees are working on it, but it’s a safe bet that a large chunk of its staff is involved in Unreal Engine 5. It’s been seven years since the last engine shipped.
“Unreal Engine 5 will deliver the freedom, fidelity, and flexibility to create next-generation games that will blow players’ minds,” said Nick Penwarden, vice president of engineering, in an interview with GamesBeat.
He said it will be effortless for game developers to use groundbreaking new features such as Nanite and Lumen, which provide a generational leap in visual fidelity. The new World Partition system enables the creation of expansive worlds with scalable content.
As it always does, Epic will battle-test the engine in-house, as it prepares to ship Fortnite on UE5 across all platforms down the line. Epic said it will empower creators across all industries to deliver stunning real-time content and experiences. The early access build has only been tested on game development workflows and offers a chance for game developers to go hands-on with some of the new features.
Additional features and other improvements for all industries will be part of the full Unreal Engine 5.0 release in early 2022. The current tech is not yet production ready.
Valley of the Ancient
Developers can also download the new sample project, Valley of the Ancient, to start exploring the new features of UE5. Captured on an Xbox Series X and PlayStation 5, Valley of the Ancient is a rich and practical example of how the new features included with Unreal Engine 5 early access can be used, and is the result of internal stress-testing.
The demo features a woman named Echo in a deserted mountain area. The team from Quixel, which Epic acquired in 2019, went out to Moab to scan tons of rock formations, using drones and cameras. And the artists who created the demo populated the scene with Megascans assets, as opposed to using anything procedural or traditional animation tools.
“We wanted to see how we could populate pretty much everything with Nanite content using Megascans assets,” said Chance Ivey, senior technical designer on the project.
Ivey walked me through the demo over a remote video call. Even viewed remotely, I could see it had massive amounts of detail, as Ivey zoomed in on a realistic rock and zoomed out to show terrain covering a couple of square kilometers. Ivey said the demo runs at 30 frames per second, slower than the fastest shooter games but still smooth to the human eye. At some moments it reaches 50 frames per second on a high-end machine.
“We are targeting 30FPS on next-generation console hardware” at 4K output with the demo, said Penwarden. “We expect people to be targeting 60 frames per second. It’s really a choice of the gaming content itself, what you want to target, and UE5 is absolutely capable of powering 60 frames per second experiences. We chose to, in this case, absolutely maximize visual quality. And so we targeted 30FPS. But we’re absolutely going to support 60 frames per second experiences.”
The visual quality was amazing, and so I asked Penwarden what the trick was to getting millions of polygons into a scene with rich details that move fast in real time.
“The trick is a lot of smart engineering that goes into the algorithms and techniques that we’re using,” Penwarden said. “Nanite is able to make sure that we’re spending graphics cycles only on detail that you can actually see. As you are traversing the environment, it’s streaming data in and out. So we don’t need to pre-populate memory with all of the triangles that you would potentially see. It’s very intelligently selecting which LODs [levels of detail] to render across the board in a very smooth and continuous manner so that we’re able to deliver that micro polygon detail in real time.”
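To make that idea concrete, here is a minimal, conceptual sketch in C++ of screen-space-error-driven LOD selection. It is not Epic’s Nanite code, which refines clusters of triangles rather than whole meshes and streams data as it goes; the mesh, error values, and camera numbers below are hypothetical.

```cpp
// Conceptual sketch only: a classic screen-space-error LOD pick, not Epic's
// Nanite implementation (which refines per-cluster, not per-mesh).
#include <cmath>
#include <cstdio>
#include <vector>

struct LodLevel {
    int   triangleCount;     // triangles stored for this level of detail
    float geometricErrorM;   // worst-case simplification error, in meters
};

// Choose the coarsest LOD whose simplification error projects to no more
// than about one pixel on screen; cheaper levels are tried first.
int PickLod(const std::vector<LodLevel>& lods, float distanceM,
            float verticalFovRad, int screenHeightPx) {
    // Meters covered by one pixel at this distance (pinhole-camera estimate).
    float metersPerPixel =
        2.0f * distanceM * std::tan(verticalFovRad * 0.5f) / screenHeightPx;
    for (int i = static_cast<int>(lods.size()) - 1; i > 0; --i) {
        if (lods[i].geometricErrorM <= metersPerPixel) {
            return i;  // error is sub-pixel: this coarse level is enough
        }
    }
    return 0;  // fall back to the full-detail mesh
}

int main() {
    // Hypothetical rock mesh: LOD 0 is full detail, LOD 3 is heavily simplified.
    std::vector<LodLevel> rock = {
        {2'000'000, 0.001f}, {200'000, 0.01f}, {20'000, 0.1f}, {2'000, 1.0f}};
    for (float d : {2.0f, 20.0f, 200.0f}) {
        int lod = PickLod(rock, d, /*verticalFovRad=*/1.0f, /*4K height=*/2160);
        std::printf("distance %.0fm -> LOD %d (%d triangles)\n",
                    d, lod, rock[lod].triangleCount);
    }
}
```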
In the demo scene, the world is one large map, not a bunch of prefabricated parts that are put together. Yet artists can work on its partitions without interfering with each other’s work, loading only the sections of the world that they need. Echo’s body motions were captured and brought into the scene, so she can fluidly vault over obstacles and walk like a human without looking so much like a robot.
In the video, Echo runs into a giant robot creature called the Ancient, which was created completely inside the Unreal Engine by Aaron Sims Creative. The creature is made out of Nanite meshes and 50 million polygons. Echo proceeds to attack it with a magical energy charge and brings it down in a big pile of dust amid the Moab rocks. Sadly, Ivey confirmed that Echo is just a demo character and the company isn’t building a game around her.
“We’re targeting next-gen hardware, high-end PCs,” Ivey said. “The important thing is to give developers a look inside UE5. We want people to poke around at it, give us feedback, and prepare their next projects on it. There are a number of different transformative workflows for building maps that are coming in UE5, and they’re available in early access.”
Compared to Unreal Engine 4, which debuted in 2014, the visuals are much better.
“If we want to look at scene complexity, we’re looking at orders of magnitude more objects in the scene,” Penwarden said. “We’re looking at orders of magnitude more triangles, so we’re talking hundreds of millions plus triangles, compared to maybe one or two million in a traditional game.”
Nanite
Developers can modify the project and make it their own using Unreal Engine. The new Nanite virtualized micropolygon system lets developers create games with massive amounts of geometric detail.
One thing that Unreal Engine 5 does that couldn’t be done before is directly import film-quality source assets comprising millions of polygons. Developers can place them millions of times, all while maintaining a real-time frame rate, and without any noticeable loss of fidelity. Nanite intelligently streams and processes only the detail you can perceive, largely removing poly count and draw call constraints, and eliminating time-consuming tasks such as baking details to normal maps and manually authoring levels of detail (LODs), freeing creators up to concentrate on creativity.
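A rough sketch of why repeated placement stays cheap: each placement can be little more than a transform pointing at one shared copy of the heavy mesh data. This is illustrative C++ only, not Unreal’s actual asset or instancing model, and the triangle and placement counts are made up.

```cpp
// Conceptual sketch, not Unreal's data model: placing one scanned asset many
// times costs only a transform per instance, because every placement shares
// the same heavyweight mesh data.
#include <cstdio>
#include <memory>
#include <vector>

struct ScannedMesh {
    size_t triangleCount;  // e.g. millions of triangles from photogrammetry
    // vertex/index buffers would live here, stored exactly once
};

struct Transform { float position[3]; float rotation[4]; float scale[3]; };

struct Instance {
    std::shared_ptr<const ScannedMesh> mesh;  // shared, not copied
    Transform transform;                      // unique per placement
};

int main() {
    auto rock = std::make_shared<ScannedMesh>(ScannedMesh{8'000'000});
    std::vector<Instance> scene;
    for (int i = 0; i < 100'000; ++i) {
        // Scatter placements on a grid; only the transform differs per instance.
        scene.push_back({rock, Transform{{float(i % 100), 0.f, float(i / 100)},
                                         {0, 0, 0, 1}, {1, 1, 1}}});
    }
    std::printf("placements: %zu, unique meshes stored: 1, "
                "triangles stored once: %zu\n",
                scene.size(), rock->triangleCount);
}
```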
Lumen
Lumen is a fully dynamic global illumination solution that lets developers create dynamic, believable scenes. With Lumen, indirect lighting adapts on the fly to changes to direct lighting or geometry, such as changing the sun’s angle with the time of day, turning on a flashlight, or opening an exterior door. This means creators no longer have to author lightmap UVs, wait for lightmaps to bake, or place reflection captures; they can simply create and edit lights inside the Unreal Editor and see the same final lighting as when the game is run on console.
The scene has global illumination, where the sun casts rays of light on every object, and that light bounces off objects or changes the look of areas in shadows. As the time of day changes, you can see the shadows retreat behind every single rock in the scene. The Megascans library — which consists of real objects meticulously scanned into digital form — is available for artists to drag and drop items into the scenes they’re creating. MegaAssemblies are groups of preassembled objects, such as a bunch of rocks put together as one object.
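As a toy illustration of what “indirect lighting adapts on the fly” means, the C++ sketch below computes direct sunlight on a shaded point plus a single diffuse bounce off a nearby wall, and shows the indirect term changing as the sun direction changes. Lumen’s real pipeline is far more involved; the points, normals, and albedo values here are invented.

```cpp
// Conceptual one-bounce diffuse GI gather; a toy, not Lumen's actual pipeline.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Normalize(Vec3 v) {
    float len = std::sqrt(Dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

struct SurfacePoint { Vec3 position, normal; float albedo; };

// Direct sunlight: Lambert term against the (changing) sun direction.
float DirectLight(const SurfacePoint& p, Vec3 sunDir, float sunIntensity) {
    return sunIntensity * std::fmax(0.f, Dot(p.normal, sunDir));
}

// One indirect bounce: light reflected off `bouncer` toward `receiver`.
float OneBounce(const SurfacePoint& receiver, const SurfacePoint& bouncer,
                Vec3 sunDir, float sunIntensity) {
    Vec3 toBouncer = Normalize(Sub(bouncer.position, receiver.position));
    float geometry =
        std::fmax(0.f, Dot(receiver.normal, toBouncer)) *
        std::fmax(0.f, Dot(bouncer.normal,
                           {-toBouncer.x, -toBouncer.y, -toBouncer.z}));
    return DirectLight(bouncer, sunDir, sunIntensity) * bouncer.albedo * geometry;
}

int main() {
    // A shaded point at the foot of a sunlit rock wall (made-up geometry).
    SurfacePoint ground = {{0, 0, 0}, {0, 1, 0}, 0.3f};
    SurfacePoint wall   = {{2, 1, 0}, {-1, 0, 0}, 0.6f};
    for (float sunX : {-0.7f, 0.7f}) {  // sun swings as time of day changes
        Vec3 sun = Normalize({sunX, 1.f, 0.f});
        std::printf("sun (%+.1f): direct %.2f  indirect %.2f\n", sunX,
                    DirectLight(ground, sun, 1.f),
                    OneBounce(ground, wall, sun, 1.f));
    }
}
```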
Open Worlds
One of Epic’s goals is to make the creation of open worlds faster, easier, and more collaborative for teams of all sizes. The new World Partition system in Unreal Engine 5 changes how levels are managed and streamed, automatically dividing the world into a grid and streaming the necessary cells.
Team members can also simultaneously work on the same region of the same game world without stepping on each other’s toes via the new One File Per Actor system. Data Layers let creators set up different variations of the same world, such as daytime and nighttime versions, as layers that exist in the same space.
“Compared to our last demo last summer, this is quite a bit more open,” Ivey said. “We’re using the open world features to show that off. One of the features that comes along with World Partition, which is our new world building system, is a system called data layers, which allows us to very quickly change the state of the world. We had a team of artists build an entire dark world representation of this area.”
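The gist of grid-based streaming can be sketched in a few lines of C++: divide the world into fixed-size cells, figure out which cells sit near the player each tick, and diff that against what is already loaded. This is only a conceptual illustration of the idea World Partition describes, not Epic’s implementation; the cell size and streaming radius below are arbitrary.

```cpp
// Conceptual sketch of grid-based streaming: not Unreal's World Partition
// code, just the idea of loading only the cells near the player and
// unloading the rest as they move.
#include <cmath>
#include <cstdio>
#include <set>
#include <utility>

using Cell = std::pair<int, int>;  // (x, y) index of a grid cell

// Cells within roughly `radiusM` of the player (a square footprint, for
// simplicity).
std::set<Cell> CellsToLoad(float playerX, float playerY, float cellSizeM,
                           float radiusM) {
    std::set<Cell> wanted;
    int range = static_cast<int>(std::ceil(radiusM / cellSizeM));
    int cx = static_cast<int>(std::floor(playerX / cellSizeM));
    int cy = static_cast<int>(std::floor(playerY / cellSizeM));
    for (int dx = -range; dx <= range; ++dx)
        for (int dy = -range; dy <= range; ++dy)
            wanted.insert({cx + dx, cy + dy});
    return wanted;
}

int main() {
    std::set<Cell> loaded;
    // Player walks across the valley; each tick we diff wanted vs. loaded.
    for (float x : {0.f, 300.f, 600.f}) {
        std::set<Cell> wanted = CellsToLoad(x, 0.f, /*cellSizeM=*/256.f,
                                            /*radiusM=*/512.f);
        int streamedIn = 0, streamedOut = 0;
        for (const Cell& c : wanted)
            if (!loaded.count(c)) ++streamedIn;   // would load cell data here
        for (const Cell& c : loaded)
            if (!wanted.count(c)) ++streamedOut;  // would unload cell data here
        loaded = wanted;
        std::printf("player at x=%.0f: %zu cells loaded (+%d / -%d)\n",
                    x, loaded.size(), streamedIn, streamedOut);
    }
}
```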
Animation
Creators can author detailed characters in dynamic, real-time environments with Unreal Engine 5’s powerful animation toolset. Working in context, they can iterate faster and more accurately, without the need for time-consuming round-tripping.
Artist-friendly tools like Control Rig let developers quickly create rigs and share them across multiple characters; pose them in Sequencer and save and apply the poses with the new Pose Browser; and easily create natural movement with the new Full-Body IK solver. And with Motion Warping, developers can dynamically adjust a character’s root motion to align to different targets with a single animation.
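As a hint of what an IK solver computes, here is a simplified two-bone solve (think thigh and shin reaching for a foothold) using the law of cosines. UE5’s Full-Body IK solver handles entire skeletons with many constraints; this standalone C++ sketch, with made-up bone lengths, only shows the core geometric problem.

```cpp
// Simplified two-bone IK, solved with the law of cosines. UE5's Full-Body IK
// solver works on whole skeletons; this only illustrates the kind of problem
// such a solver answers.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Given two bone lengths and a target distance from the root joint,
// return the bend angle at the middle joint (the knee), in radians.
// If the target is out of reach, the chain simply straightens out.
float MiddleJointAngle(float upperLen, float lowerLen, float targetDist) {
    targetDist = std::clamp(targetDist, std::fabs(upperLen - lowerLen),
                            upperLen + lowerLen);
    float cosAngle = (upperLen * upperLen + lowerLen * lowerLen -
                      targetDist * targetDist) /
                     (2.f * upperLen * lowerLen);
    return std::acos(std::clamp(cosAngle, -1.f, 1.f));
}

int main() {
    const float thigh = 0.45f, shin = 0.42f;   // hypothetical bone lengths, meters
    for (float reach : {0.5f, 0.7f, 0.87f}) {  // foot target closer or farther
        float knee = MiddleJointAngle(thigh, shin, reach);
        std::printf("reach %.2fm -> knee angle %.0f degrees\n",
                    reach, knee * 180.f / 3.14159265f);
    }
}
```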
When a camera moves through the world, the engine knows how to stream in the right data at the right level of detail so that the scene looks real but doesn’t slow the computer down to a crawl.
MetaSounds
Unreal Engine 5 introduces a fundamentally new way of making audio, with MetaSounds, a high-performance system that offers complete control over audio digital signal processing (DSP) graph generation of sound sources. Users can manage all aspects of audio rendering to drive next-generation procedural audio experiences. MetaSounds is analogous to a fully programmable material and rendering pipeline, bringing all the benefits of procedural content creation to audio that the Material Editor brings to shaders: dynamic data-driven assets, the ability to map game parameters to sound playback, huge workflow improvements, and more.
“I think it’s going to change how people think about audio in games,” Ivey said.
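To picture what a DSP graph of sound sources means in practice, the sketch below wires a sine oscillator node into a gain node and renders audio block by block, with the gain exposed as a parameter a game could drive at runtime. This is plain illustrative C++, not the MetaSounds API, and the node names and block size are invented.

```cpp
// Toy node-based audio graph: a sine oscillator feeding a gain node, rendered
// block by block. Not the MetaSounds API, just an illustration of a DSP graph
// where a game parameter (here, gain) can be changed at runtime.
#include <cmath>
#include <cstdio>
#include <memory>
#include <vector>

constexpr float kSampleRate = 48000.f;

struct Node {
    virtual ~Node() = default;
    virtual void Render(std::vector<float>& block) = 0;  // fill one audio block
};

struct SineOsc : Node {
    float frequencyHz, phase = 0.f;
    explicit SineOsc(float hz) : frequencyHz(hz) {}
    void Render(std::vector<float>& block) override {
        for (float& s : block) {
            s = std::sin(phase);
            phase += 2.f * 3.14159265f * frequencyHz / kSampleRate;
        }
    }
};

struct Gain : Node {
    std::shared_ptr<Node> input;
    float gain;  // a game parameter could be mapped onto this at runtime
    Gain(std::shared_ptr<Node> in, float g) : input(std::move(in)), gain(g) {}
    void Render(std::vector<float>& block) override {
        input->Render(block);            // pull from the upstream node
        for (float& s : block) s *= gain;
    }
};

int main() {
    auto graph = std::make_shared<Gain>(std::make_shared<SineOsc>(440.f), 0.25f);
    std::vector<float> block(256);
    graph->gain = 0.5f;    // e.g. driven by distance to the sound source
    graph->Render(block);  // one block; a real engine does this per audio callback
    std::printf("first samples: %.3f %.3f %.3f\n", block[0], block[1], block[2]);
}
```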
Editor Workflow
The revamped Unreal Editor provides an updated visual style, streamlined workflows, and optimized use of screen real estate, making it easier, faster, and more pleasing to use, Epic said. The demo video showed how artists can look at a scene in the Moab rock formations and insert black monoliths and other objects into it, without having to exit the scene or bring out another tool.
“With UE5, we’ve tried to get rid of as much complexity in that workflow as possible,” Penwarden said. “That entire world was actually just one big map file. Unreal Engine 5 takes that map file and breaks it up into a grid of cells for edit time. Then for runtime, it dices it up into streaming chunks that it streams in and out dynamically. And the details of how we break it up can be set up to be optimal for the platform or platforms that you’re targeting. On the edit-time side, we don’t store everything in one big file or one file for the entire level. Rather, we store data at the granularity of the individual actors and objects that you’re placing in the map.”
To free up more space for viewport interactions, Epic added the ability to easily summon and stow the Content Browser and to dock any editor tab to the collapsible sidebar. Creators can now quickly access frequently used properties in the Details panel with a new favoriting system, while the new Create button on the main toolbar lets them easily place Actors into the world.
“An artist would be able to build an environment like this without writing code,” Penwarden said. “Depending on the complexity of the game, it is absolutely possible to build games entirely out of Blueprint. That is possible today in UE4, and it will continue to be possible in UE5. As the scope and complexity of your project grows, the native C++ APIs are there for developers to start working with and really expand on what their game is. And of course, just like with Unreal Engine 4, developers will have access to the full source code of Unreal Engine 5.”
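For a sense of what that native C++ side looks like, below is a minimal sketch of a bare Actor class using standard Unreal boilerplate (UCLASS, GENERATED_BODY, BeginPlay, Tick). It assumes a UE4/UE5 project with the usual build setup and generated headers; the class name is hypothetical and the body does nothing beyond the scaffolding.

```cpp
// A minimal sketch of the C++ side Penwarden refers to: a bare Actor class
// as it would look in a UE4/UE5 project. Build setup and the generated
// "*.generated.h" header are assumed; "MonolithActor" is a hypothetical name.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MonolithActor.generated.h"

UCLASS()
class AMonolithActor : public AActor {
    GENERATED_BODY()

public:
    AMonolithActor() {
        PrimaryActorTick.bCanEverTick = true;  // receive per-frame Tick calls
    }

protected:
    virtual void BeginPlay() override {
        Super::BeginPlay();  // runs once when the actor enters the world
    }

public:
    virtual void Tick(float DeltaSeconds) override {
        Super::Tick(DeltaSeconds);  // per-frame game logic would go here
    }
};
```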
The user interface is all visual, so artists don’t have to know how to code to create what they need.
“We wanted the user interface to get out of the way,” Ivey said. “The artist or designer can be as close as they can to the actual work that they are doing. There are more and more tools that keep you close to the creation. You can work in this space, and when you do that, you have less iteration time and you’re able to focus more on creating. This leads to better outcomes. You have tools that let designers and artists work more closely together rather than having to bring in engineering.”
One huge productivity gain is being able to access so many photorealistic environment assets, which artists no longer have to recreate in digital form themselves.
“It’s really about trying to keep developers in the editor as much as we can, and allowing them to iterate and be productive,” Penwarden said.
MetaHumans
To populate the Unreal Engine 5 world with characters, developers can use MetaHumans: digital humans created in MetaHuman Creator are compatible with both UE4 and UE5. Developers can sign up for MetaHuman Creator early access to begin importing their own MetaHumans, or simply download any of over 50 premade digital humans from Quixel Bridge.
Quixel was an important acquisition in the development of the features of Unreal Engine 5, but so were other acquired companies such as 3Lateral and Cubic Motion, Penwarden said.
The whole goal is to make dramatic changes to workflows for everyone working on a game. Ivey said that the more people working on the same scene, the better. They shouldn’t have to wait for each other to get things done, he said.
“We just were able to let the artists go crazy and start building stuff and add to it again and getting other people’s changes without stepping on each other’s toes,” Ivey said. “It’s a wildly large productivity boost.”