
A Quick Dive Into That Matrix Demo

Updated: Dec 15, 2021

Yesterday, I ended up properly finding out about this Matrix-themed tech demo on PS5 and Xbox Series X called "The Matrix Awakens: An Unreal Engine 5 Experience" from my media technician (who also serves as a tutor), who himself found out about its existence from another student at the college. (I know this is a bit of an unconventional subject for me, but I'm only covering it so that I actually have work to do in college while I'm unable to make the remaining assets for the 20-second video advert I'm making to promote a cartoon I've been working on in my free time.)


This particular demo was shown off during The Game Awards (a yearly event where they give awards to video games in a similar vein to the Oscars, though the main draw for me is actually some of the game announcements that get made there), but I was too busy looking at the trailers for Sonic The Hedgehog 2 (the movie, not the game) and the new Sonic game, which we now know is going to be called Sonic Frontiers, to really notice it (because they're Sonic, and the Matrix isn't).


I then found out yesterday that the demo was actually playable, when a friend mentioned that they had played it for themselves and were really impressed with the graphical fidelity of the whole thing. So, without further ado, let's see what all the fuss is about:

For this, I'm going to show a few sample screenshots to see if you can tell what's being done using in-game graphics and what's being done as a video/stock film asset:




Give up? Well, as it turns out, almost everything is being rendered in real time within the new Unreal Engine 5. The exceptions are Morpheus (who, as it turns out, is actually just a 2D asset ripped from the original film, thus technically making him "Paper Morpheus") and the "MetaHumans" that show up when Carrie-Anne Moss (Trinity's actress from the Matrix movies) is talking about switching bodies as easily as changing clothes (and even then, that's only because of the sheer volume of MetaHumans on screen at once). Other than those (as well as the re-used footage from the first film), everything is rendered in real time. (Don't feel bad if you got it wrong; I couldn't quite figure out what was real time and what wasn't when I first looked at it either.)

With this newfound knowledge I gained thanks to a Digital Foundry video I found discussing the demo with the developers who made it (those being Epic Games, the creators of all versions of the Unreal Engine), I'm actually quite impressed with how the graphics look in this demo. Thanks to the power of ray-traced lighting mixed with Unreal Engine 5's fancy new "Lumen" system (which, to my understanding, is like ray tracing but cheaper when it comes to hardware usage) and its "Nanite" system (which seems to have something to do with how the geometry on 3D models is handled), they ended up mirroring real life so well that if I were to claim this demo was actually leaked footage from the new Matrix movie without any context, I think I'd have a shot at fooling people. (I haven't had the chance to play it for myself yet, but those visuals are still impressive.)

In fact, uuuuh... Look at this totally real photograph I took on the set of the new Matrix movie

I'm only kidding, of course, but you get the point: the graphics are so realistic that I could totally say something like that and it'd seem legit. But how does all this work?

Lumen Lighting

So from what I've gathered by watching the video you see above and by doing a little bit of further digging, Lumen seems to be a way for game developers to add ray tracing into their games without it being super taxing on the hardware the game is being played on. You can think of it as being like the "performance ray tracing" mode in games such as Ratchet & Clank: Rift Apart and the PS5 version of Spider-Man: Miles Morales:

Part of what Lumen does is create a lower-quality version of the scene you're working in (this "Lumen scene", as they call it, has worse textures and a lower polygon count than the original), then calculate the lighting based off of that. It also uses that same version to generate the various reflections seen in the world (i.e. puddles, building windows, etc.), thus allowing real-time reflections and global illumination to be possible in Unreal Engine 5.
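
To make that idea a bit more concrete, here's a little toy sketch I put together of the general principle. To be clear, this is my own made-up illustration and not Epic's actual Lumen code: I've pretended the entire simplified scene is just a single sky colour so it stays short, but the point is that the bounce lighting gets gathered from a cheap stand-in scene rather than the full-detail one.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Hypothetical stand-in for the low-detail "Lumen scene" proxy. In the
// real engine this would be simplified geometry with smaller textures;
// here I just pretend the whole simplified scene is one sky colour.
struct ProxyScene {
    Vec3 skyColour;
};

// Hypothetical cheap trace: a real version would intersect the simplified
// geometry, but this toy just returns the sky colour every time.
Vec3 TraceProxy(const ProxyScene& proxy, Vec3 /*origin*/, Vec3 /*direction*/) {
    return proxy.skyColour;
}

// Gather approximate bounce light at a surface point by averaging a
// handful of cheap rays fired into the simplified scene instead of the
// full-detail one.
Vec3 GatherIndirect(const ProxyScene& proxy, Vec3 point, Vec3 normal, int numRays) {
    Vec3 total{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < numRays; ++i) {
        // A real implementation would pick a random direction around the
        // surface normal; this toy just reuses the normal itself.
        Vec3 sample = TraceProxy(proxy, point, normal);
        total.x += sample.x;
        total.y += sample.y;
        total.z += sample.z;
    }
    total.x /= numRays;
    total.y /= numRays;
    total.z /= numRays;
    return total;
}

int main() {
    ProxyScene proxy{{0.4f, 0.5f, 0.9f}};  // a bluish "sky"
    Vec3 indirect = GatherIndirect(proxy, {0.0f, 0.0f, 0.0f}, {0.0f, 1.0f, 0.0f}, 16);
    std::printf("approximate bounce light: %.2f %.2f %.2f\n",
                indirect.x, indirect.y, indirect.z);
    return 0;
}
```

The takeaway is just that every ray is hitting something deliberately cheap, which (as far as I can tell) is why Lumen can run on console hardware without the full cost of proper ray tracing.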


That is all I can realistically say on "Lumen", however, as I don't understand a lot of the techno mumbo-jumbo going on in the video I found.

Nanite

So, from my (admittedly limited) understanding of what this "Nanite" system is, it seems to be a way for game developers to create 3D models with higher polygon counts than ever before while also changing the level of detail (or LOD for short) more dynamically, in the same vein as the dynamic resolution scaling you see in games such as Halo 5 and even Doom 2016:

As we can see when they use the "Triangle" visualiser and move closer and further away, the polygon count actually goes up and down depending on our distance (meaning if an object is closer, we'll see more triangles, and if we're farther away, we'll get fewer).

The reason the colourful triangles do this is that, as we get farther away from a given object, it doesn't need to be rendered with as much detail as it does when we get all up close and personal with it, which means objects made using this system can actually render faster and cost less in terms of memory than a regular 3D model that doesn't use the "Nanite" system. (And to think, we've come a long way since the Nintendo 64 days, where everything was made of just a handful of polygons.)
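
To give a rough feel for what "changing the LOD dynamically" means, here's another little toy sketch I wrote. Again, this is my own made-up illustration and nothing like Epic's actual Nanite code (which, as far as I can tell, works on clusters of triangles rather than whole objects); it just shows the basic idea that the triangle budget scales with how big an object looks on screen.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical helper: a rough guess at how much of the screen an object
// covers (0 to 1) given its world-space radius and distance to the camera.
float ScreenCoverage(float objectRadius, float distanceToCamera) {
    if (distanceToCamera <= 0.0f) {
        return 1.0f;  // camera is inside the object; treat it as full screen
    }
    return std::min(1.0f, objectRadius / distanceToCamera);
}

// Pick a triangle budget that scales with on-screen size, clamped so tiny
// far-away objects never drop below a small floor and close-ups never
// exceed the full-detail count.
int ChooseTriangleCount(int fullDetailTriangles, float coverage) {
    int budget = static_cast<int>(fullDetailTriangles * coverage * coverage);
    return std::clamp(budget, 32, fullDetailTriangles);
}

int main() {
    const int fullDetail = 1000000;  // a very dense, "film quality" mesh
    const float distances[] = {1.0f, 10.0f, 100.0f, 1000.0f};
    for (float distance : distances) {
        float coverage = ScreenCoverage(2.0f, distance);  // a 2m-radius object
        std::printf("distance %7.1fm -> %7d triangles\n",
                    distance, ChooseTriangleCount(fullDetail, coverage));
    }
    return 0;
}
```

If you ran this, the triangle count would drop off sharply as the distance grows, which is basically the same effect the colourful triangle visualiser is showing, just far cruder than what the engine actually does.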
