Unleashing the Potential of Unreal Engine 5 — by Ben Zysberg, Director of Technology at iXperiential Media
A few months ago (these days it feels like forever…) Epic Games released a demo of Unreal Engine 5 running on a PlayStation 5 (video above).
This past week at Unreal Fest, they explained how they did it in a 45-minute presentation that is worth watching (below):
iXperiential’s Director of Technology Ben Zysberg explains some of the tech and unique features:
“UE5 is built around two new technologies called Nanite and Lumen. Nanite grants the ability to import cinematic-quality models straight out of ZBrush/Max/Maya without the need to decimate them. In the demo, a single statue was 33 million polygons, and what they called “the graveyard scene” displayed 500 of them. Not only can they load a lot of polys, but also a lot of objects (500,000 to a million) at a low CPU cost. All of this with no levels of detail (LODs) and no merged assets. Every asset in the scene was independent and in its full 33-million-poly glory!
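As a concrete illustration of that independence, placing hundreds of copies of one full-resolution asset is a few lines of editor scripting; here is a minimal sketch using UE4's existing Python API (the asset path and grid spacing are placeholders of ours):

```python
import unreal

# Hypothetical content path; in the demo each statue was a full 33M-poly mesh.
statue = unreal.EditorAssetLibrary.load_asset('/Game/Demo/SM_Statue')

# Lay out a grid of 500 independent actors -- no merging, no hand-made LODs.
for i in range(500):
    location = unreal.Vector((i % 25) * 300.0, (i // 25) * 300.0, 0.0)
    unreal.EditorLevelLibrary.spawn_actor_from_object(statue, location)
```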
Epic Games did not explain exactly how they pulled it off, and some speculate that they might use what NVIDIA calls “mesh shaders.” One thing they did explain is that they rely a lot on streaming, meaning they only load into memory what you can see at any given time. This is something they do very efficiently… Not only are the assets compressed on local storage, they are also compressed in memory. Epic says the streaming pool was only 768 MB of RAM for the demo, and they are trying to optimize it even further. In fact, a single object with one million polys and a single UV channel has the same memory impact as a 4K normal map.
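A back-of-envelope check makes that last comparison plausible; the byte counts below are our own assumptions, not figures from Epic:

```python
# A 4K normal map, BC5-compressed at roughly 1 byte per texel:
normal_map_bytes = 4096 * 4096          # ~16.8 MB

# A one-million-poly mesh, assuming ~16 bytes per triangle once positions,
# UVs, and index data are quantized and compressed (our assumption):
mesh_bytes = 1_000_000 * 16             # ~16.0 MB

print(f"4K normal map: {normal_map_bytes / 1e6:.1f} MB")
print(f"1M-poly mesh:  {mesh_bytes / 1e6:.1f} MB")
```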
As you might imagine with groundbreaking technology, Nanite comes at a cost. For now, Nanite objects cannot support transparency, skeletal animation, tessellation, or displacement. Only rigid objects that can be moved with the typical translation/rotation/scale are supported. Fortunately, the old system still works for objects that need it. In other words: Nanite and non-Nanite objects can coexist in the same scene.
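At the asset level, that suggests opting a rigid, opaque mesh into Nanite will be a per-mesh switch. The sketch below guesses at what that could look like in editor Python once UE5 ships; the nanite_settings property name is an assumption on our part:

```python
import unreal

# Hypothetical asset path for a rigid, opaque statue mesh.
mesh = unreal.EditorAssetLibrary.load_asset('/Game/Demo/SM_Statue')

# Enable Nanite on this mesh only; skinned, translucent, or displaced
# meshes would stay on the traditional rendering path.
settings = mesh.get_editor_property('nanite_settings')  # assumed property name
settings.enabled = True
mesh.set_editor_property('nanite_settings', settings)
unreal.EditorAssetLibrary.save_loaded_asset(mesh)
```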
Also introduced was Lumen: the new fully dynamic global illumination system. It means that lights and shadows are no longer baked, approximated, or static. Everything is done in real time, including color bleeding, indirect shadows, light bouncing from one surface to another, and a new fully dynamic sunlight system. Light will adapt in real time when you change the geometry, the weather, the time of day, or the light sources. For now, Lumen runs at 30 FPS on PS5 (which explains why the demo was capped at 30 FPS), but Epic is aiming for 60. You can see a full demo of Lumen in action at 0:43.
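If Lumen follows the pattern of UE4's renderer settings, it will likely be driven by console variables; purely as a sketch, toggling it from editor Python might look like this (the cvar names are guesses modeled on existing conventions and could change before release):

```python
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()

# Hypothetical cvars selecting Lumen for global illumination and reflections.
unreal.SystemLibrary.execute_console_command(world, 'r.DynamicGlobalIlluminationMethod 1')
unreal.SystemLibrary.execute_console_command(world, 'r.ReflectionMethod 1')
```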
Taking all of this in, one quickly realizes the major implications for the production pipeline. Lightmaps and AO maps are gone. Epic says the statues use only three types of maps: diffuse, metal/roughness, and normal. Technically you don't even need normal maps, since all the detail is now geometry, but they remain useful for fine details like scratches. With assets now carrying millions upon millions of polygons, one thing is certain: projects are going to be massive… To address the storage concerns, Epic is working on an asset virtualization system: instead of downloading the entire project to your local machine, you only load references to the assets. The assets live on a shared network drive and are pulled onto your machine as needed.
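Epic hasn't detailed the virtualization system, but the described behavior maps onto a familiar lazy-fetch pattern; here is a purely hypothetical sketch in plain Python (every path and name is invented for illustration):

```python
from pathlib import Path
import shutil

NETWORK_STORE = Path('//studio-nas/assets')    # hypothetical shared drive
LOCAL_CACHE = Path.home() / '.asset_cache'     # hypothetical local cache

def resolve(reference: str) -> Path:
    """Fetch an asset from the shared store the first time it is touched."""
    local = LOCAL_CACHE / reference
    if not local.exists():
        local.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(NETWORK_STORE / reference, local)  # pull on demand
    return local

# The project only carries the reference; bytes move when the asset is opened.
statue = resolve('statues/SM_Statue.uasset')
```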
Continuing on the pipeline front: Epic is working on better tools for moving between the different applications (Maya, Houdini, Shotgun, UE4…), using Python scripts to automatically load, import, and render assets. For teams that might not have time to create hundreds, thousands, or even millions of assets, some good news: the entire Quixel Megascans library is free to use in UE4 and UE5, and a lot of the assets in the demo are Quixel Megascans. Epic is also improving cooking times in UE5 (preparing assets for a build) and smoothing out mobile deployment, including tedious iOS steps like provisioning profiles. UE5 will also include better versions of tools already in UE4. One of them is Niagara: the new version will let particles interact with sound and with each other, enabling some fluid simulations. The second is Chaos, their physics engine, which will fully replace PhysX. Chaos is already impressive in UE4, but for UE5 Epic wants the same level of destruction shared among players in a multiplayer game.
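The load/import half of that Python workflow is already scriptable in UE4 today; a minimal sketch using the shipping editor API (the source file and destination path are placeholders):

```python
import unreal

task = unreal.AssetImportTask()
task.filename = 'C:/incoming/statue.fbx'   # placeholder source file
task.destination_path = '/Game/Imported'   # placeholder content path
task.automated = True                      # suppress import dialogs
task.save = True                           # save the resulting .uasset

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```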
As for the timeline, UE5 is expected to go into beta in early 2021 and reach full release in late 2021. In the meantime, UE4 is alive and kicking. UE 4.25 is out and already supports the PS5 and the Xbox Series X. It also ships brand-new Insights tools to analyze everything from memory footprint to network traffic, even down to animation steps. In Fall 2020, Epic Games will release 4.26, which will add a new rigging system for procedural animations. 4.26 should also unify all of the AR platforms and add support for Azure Spatial Anchors. It will probably be the last version of UE4, but UE5 will be backward compatible with UE4 projects. Some things will break because features have been deprecated, but it should let developers future-proof their projects. This also means that Blueprints, the node-based programming language of UE4, is here to stay.
“UE5 explained in a few words…” turned into much more than a few words! What does all of this mean for agencies like iXperiential? So much more than just a few things! Baking is gone. Polygon decimation is gone. The same model used for a pre-rendered shot can be used in real time. At this level of fidelity, we may no longer need to pre-render at all! The walls between VFX, pre-render, and real-time are about to get a lot thinner. If your mind is already spinning with the possibilities, the demo did exactly what a tech demo is supposed to do: present the tools and unleash the potential. In the end, what really matters for agencies is telling a good story. But hell, it’s a lot easier to do with 16 billion polygons.”
-Ben Zysberg, Director of Technology @ iXperiential Media