Most of what happens within the video games we play is invisible to us. Even the elements we’re looking straight at work because of what’s happening behind the scenes. If you’ve ever watched a behind-the-scenes video about game development, you might’ve seen those flat, gray versions of game worlds filled with lines and icons pointing every which way, overlaid with multiple grids and layers. These are the visual representations of all the systems that make the game work.

This is an especially weird dichotomy to consider when it comes to lighting in any game with a 3D perspective, but especially so in high-fidelity games. We don’t see light so much as we see everything it touches; it’s invisible, but it gives us most of our information about game worlds. And it’s a lot more complex than “turn on lamp, room light up.” Reflection, absorption, diffusion, subsurface scattering–the movement of light is a complex thing that has been explored by physicists in the real world for literally centuries, and will likely be studied for centuries more. In the middle of all of that are game designers, applying the science of light to video games in practical ways, balanced with the limitations of even today’s powerful GPUs, just to show all us nerds a good time.
If you’ve wondered why many games seem to be like static amusement parks waiting for you to interact with a few specific things, lighting is often the reason. But it’s also the reason more and more game worlds look vibrant and lifelike. Game developers have gotten good at simulating static lighting, but making it move is harder. Dynamic lighting has long been computationally expensive, potentially tanking game performance, and we’re finally starting to see that change.
Case in point: Assassin’s Creed Shadows and Ubisoft Technical Architect Nicolas Lopez. Lopez spoke at the Game Developers Conference this spring, and reading through his slide deck was fascinating; the release of the video recording of the talk only amplified that. I reached out to Ubisoft and talked with Lopez via email about how technology like ray tracing is changing game lighting. For more on those aspects of video game visuals, check out our primer on path tracing, ray tracing, and rasterization in game graphics.
The thing that initially piqued my interest was a specific set of numbers Lopez put on the screen during the GDC presentation. Lopez noted that if lighting were calculated in Assassin’s Creed Shadows the same way it had been in Assassin’s Creed Unity, Shadows would’ve taken nearly two years to “bake” all the lighting–that is, precalculate and render it–and around 2TB of disk space to hold all of that lighting data. (Assassin’s Creed Shadows requires 115GB on PC for the entire game.) Those numbers represent a vast gulf between the games of yesteryear and modern triple-A titles, and I wanted to put them into context as soon as I saw them.
Cities in Assassin’s Creed Unity and Syndicate, Lopez explained, were pretty small spaces: dense cities about four square kilometers in area. In those games, developers used uniform lighting probes to dictate how global illumination was rendered (alongside other lighting techniques). Lighting probes are points on the map that contain information about what lighting is passing through the empty space in that area, which is then used to help light the static and dynamic objects in that space. These were 50-centimeter probes, meaning they were requesting light information every half a meter across Unity’s entire city of Paris.
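To get a feel for why a uniform probe grid gets expensive, here’s a back-of-envelope sketch. The per-probe byte size is an invented assumption for illustration–Ubisoft hasn’t published its exact probe format–but the area and spacing come straight from Lopez’s numbers:

```python
# Back-of-envelope estimate of a uniform light-probe grid.
# The bytes-per-probe figure is an illustrative assumption,
# not Ubisoft's real number.

def probe_grid_size(area_km2, spacing_m, bytes_per_probe):
    """Estimate probe count and storage for a flat 2D grid of probes."""
    area_m2 = area_km2 * 1_000_000           # 1 km^2 = 1,000,000 m^2
    probes = area_m2 / (spacing_m ** 2)      # one probe per spacing x spacing cell
    return probes, probes * bytes_per_probe

# Unity-style city: ~4 km^2 at 50 cm spacing, assuming ~100 bytes
# per probe (e.g., compressed spherical-harmonics data).
probes, size_bytes = probe_grid_size(4, 0.5, 100)
print(f"{probes:,.0f} probes, ~{size_bytes / 1e9:.1f} GB")  # 16,000,000 probes, ~1.6 GB
```

Even this simplified 2D version yields 16 million probes for a Unity-sized city, and scaling the same grid to a 256-square-kilometer world multiplies everything by 64–before accounting for more times of day or weather states.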
Let’s pause for a moment to discuss global illumination, also referred to as GI. This is a broad term used to describe a variety of techniques meant to approximate realistic lighting, especially indirect lighting. These include cube maps, screen-space reflections, probe lighting, and more. Digital Foundry has a great primer on global illumination that breaks this stuff down into chunks.
“The small scale [of these smaller game worlds] let us afford high-quality lighting, even though it required significant storage,” Lopez told me. From Assassin’s Creed Origins forward, though, world size exploded, and suddenly we were exploring worlds 256 square kilometers in area. That much baked light information would’ve caused games to explode in file storage size, so the team began to use a dynamic system.
“We vary probe density based on scene complexity–dense urban areas still use 50-centimeter spacing, but in open landscapes like deserts or forests, we reduce the resolution,” Lopez explained. Artists paint this “GI density map” directly–meaning they can focus the quality of light where it matters most–which Lopez said “dramatically reduces data size while preserving visual quality where it matters most.”
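The idea behind that density map can be sketched in a few lines. The map values and spacing tiers below are invented for illustration–the only figure from the article is the 50-centimeter urban spacing:

```python
# Sketch of density-driven probe spacing, loosely modeled on the
# "GI density map" Lopez describes. The density thresholds and the
# coarser spacing tiers are hypothetical.

def spacing_for_density(density):
    """Map a painted density value (0.0-1.0) to probe spacing in meters."""
    if density >= 0.8:       # dense urban block: finest grid
        return 0.5
    elif density >= 0.4:     # villages, forest edges
        return 2.0
    else:                    # open desert, plains
        return 8.0

def probes_for_tile(tile_area_m2, density):
    """Probe count for one terrain tile at its painted density."""
    s = spacing_for_density(density)
    return tile_area_m2 / (s * s)

# A 100 m x 100 m tile: urban vs. open terrain.
print(probes_for_tile(10_000, 0.9))  # 40000.0 probes
print(probes_for_tile(10_000, 0.1))  # 156.25 probes
```

The same patch of land costs a tiny fraction of the probe data when it’s open terrain, which is how artists can “spend” lighting quality where the player will actually notice it.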

It’s also worth mentioning that Assassin’s Creed Unity had a limited number of times of day–just four, compared to 11 in Shadows–and had very limited weather effects, while Shadows has a dynamic weather system. All those elements in Shadows create changes in lighting the game has to take into account to present believable visuals.
That original question–why would baking the lighting take so long and require so much file storage?–comes back to a simple conclusion. The limitations of hardware and game engines at the time forced the team to pre-bake a lot of lighting information; neither the hardware nor the software was ready to handle calculating so much light data in real time.
Historically, most if not all game lighting has been “baked.” That means that the various techniques used to light a given scene are calculated ahead of time and then stored as textures and maps that the engine overlays onto the basic scene, so your computer or game console doesn’t have to do those calculations while you’re playing and slow the game to a crawl. This works well for largely static games and can provide very convincing, impactful lighting. As a game becomes more dynamic, though, baked lighting becomes less viable. If we consider cube maps (see our primer and that Digital Foundry video above for the specifics), you’d have to calculate a cube map for every possible place a character could stand, and that’s where game install sizes can start ballooning when it comes to game lighting.
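The bake-then-sample pattern itself is simple. Here’s a toy model–the “lighting solve” is a fake stand-in function, since a real bake solves global illumination–that shows the trade: cheap lookups at runtime, but stale data the moment the scene changes:

```python
# Toy illustration of baked lighting: do the expensive work once,
# store the result, and only look it up at runtime. Real bakes solve
# global illumination; this stand-in just fabricates a brightness value.

def expensive_lighting(x, y):
    """Stand-in for an expensive offline lighting solve."""
    return (x * 31 + y * 17) % 256  # fake irradiance value

# Bake pass (done once, ahead of time): store results in a lightmap.
lightmap = {(x, y): expensive_lighting(x, y)
            for x in range(64) for y in range(64)}

# Runtime: a cheap dictionary lookup instead of recomputation.
def sample_lightmap(x, y):
    return lightmap[(x, y)]

# The catch: if the scene changes (a wall moves, the sun moves),
# every affected texel is now stale and must be re-baked.
print(sample_lightmap(10, 20))  # 138
```

That re-bake cost is exactly the “overnight compute jobs” Lopez describes below, and it’s also why baked data balloons: every lighting condition you want to support needs its own stored result.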


Unity’s game world was a small, uniform space when compared to the more recent Assassin’s Creed games. The same approach applied to the much larger areas of Shadows, with more dynamic effects and varied density, would’ve made for a truly massive game.
But it has been a long time since the release of Assassin’s Creed Unity. More than a decade, in fact, and in that time, a lot has changed! One of the biggest changes has been ray tracing, which is available to all PlayStation 5 and Xbox Series X|S gamers and supported by roughly 30-40% of the graphics cards listed in the Steam Hardware Survey as of this May.
“Ray tracing has had a major impact on how we approach lighting, from both a creative and production standpoint,” Lopez said. “In traditional pipelines, lighting was mostly baked global illumination, reflections, ambient occlusion, etc. That meant lighting couldn’t react to world changes. Change or move a piece of architecture? Suddenly the lighting is invalid. Teams would have to re-bake the lighting data, which could take hours or even days. At Ubisoft, we ran overnight compute jobs so artists could get fresh lighting each morning. But it still slowed down iteration dramatically.”
Ray tracing freed artists from many of those constraints, he said.
“Artists can move objects, adjust scenes, or iterate without waiting for long bakes. And since lighting is computed per pixel, the visual fidelity is much higher and more physically accurate,” Lopez explained. In other words, artists can do art at their own pace instead of waiting for computers to compute before they do so.
Even so, the benefits of ray tracing are often hard to spot on the consumer side of things, especially when we consider how static many game worlds can be. In those games, ray tracing is all but invisible, as it achieves much the same result that game designers once got with creatively placed static lights. A Call of Duty level, for example, doesn’t have to account for procedural changes to time of day and weather, so traditional lighting can be very effective.


“The benefits can be subtle,” Lopez admitted, with respect to that set of game-design rules. Shadows, Lopez said, is a truly dynamic world, and that makes ray tracing a genuine game changer–in a figurative and literal sense. Shadows has, as previously mentioned, 11 different times of day, four seasons, a variety of dynamic weather effects, and destructible environments that all affect the way players see the game as they play. The hybrid solution Shadows uses, combining some baked and some dynamic lighting, “pushed baked GI to its absolute limits.”
“Ray tracing allows us to light these dynamic environments accurately and consistently,” Lopez explained. “Lighting behaves as it should, even when the world changes drastically. No hacks, no workarounds. It makes the world feel more grounded and believable.”
“Interiors light up naturally when doors open,” Lopez continued. “Destructible objects finally contribute to the scene lighting. Season lighting changes the mood of a scene. Without ray tracing, these effects can only be approximated to a certain extent.”
The team iterated on Ubisoft’s Anvil engine, which is used across the Assassin’s Creed series, in other ways. They adopted the Academy Color Encoding System, for example, which is a universal color standard used by the film industry to keep the work of hundreds or thousands of people consistent across many devices. It helps ensure that people working on different machines, in different locations, creating different parts of the game in a variety of applications, are working with the same color information. That gives us a more consistent and more grounded game–that is, the characters, buildings, foliage, and even effects look like they belong in the same game and feel convincing–at the end of development. Color Look-Up Tables (LUTs) allow the team to shift the visual tone and color grading to match the weather and environmental ambiance.
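A color LUT is conceptually just a precomputed remapping table. Real grading LUTs are 3D cubes indexed by all three channels at once, but a simplified per-channel version shows the mechanism; the “warm evening” curve values here are invented for illustration:

```python
# Simplified color look-up table (LUT). Production grading LUTs are 3D,
# but the core idea -- remap an input color through a precomputed table --
# is the same. The curve values below are hypothetical.

def apply_lut(value, lut):
    """Remap a 0.0-1.0 channel value through a 1D LUT with linear interpolation."""
    pos = value * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# A toy "warm evening" grade: lift the reds, crush the blues a bit.
red_lut = [0.0, 0.4, 0.75, 1.0]
blue_lut = [0.0, 0.2, 0.5, 0.9]

print(apply_lut(0.5, red_lut))   # midtone red lifted to 0.575
print(apply_lut(0.5, blue_lut))  # midtone blue crushed to 0.35
```

Because the table is computed ahead of time, swapping the whole mood of a scene–sunset versus storm, for instance–costs only a table lookup per pixel at runtime.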


One of the keys to getting Shadows right in particular came down, again, to the lighting. With HDR screens being so ubiquitous these days, developers can rely on having some ability to show a greater range of light and dark–but dark can still often mean pitch black. Even high-quality displays are subject to a user’s personal preferences, the characteristics of that particular panel, and a game console’s ability to display color accurately. This became especially important when Ubisoft decided to finally fulfill years of fan wishes for a ninja-focused Assassin’s Creed game.
“We reworked key parts of our physically based rendering and exposure pipelines to behave more accurately in low-light conditions,” Lopez said.
With ray tracing helping to inform the way lighting works in Shadows, the team can “deliver nights that feel moody and authentic, without relying on artificial fill lighting,” Lopez said. The quality of the lighting directly supports the gameplay, both giving it more weight and believability, and making it feel more cinematic.
All of these changes came together, Lopez said, to create a “unified, reactive visual where everything feels connected,” with Shadows in particular having sparked a cycle of innovation that pushed Anvil and Assassin’s Creed both forward.
For the rest of us, Assassin’s Creed Shadows works as an example of how video game design is adapting to new technology for both gamers and game designers alike. Designers working on tentpole games like Shadows have new ways to create games more quickly without sacrificing fidelity. In return, we get more dynamic worlds that live and move around us.
Image credits: Ubisoft