[Enhancement] Consider other global illumination options #26

Open
opened 2025-04-04 08:33:26 +00:00 by TriVoxel · 0 comments

Description

Many real-time, fully dynamic global illumination methods have been developed in recent years. It would be worth evaluating alternatives that could significantly improve our visual fidelity.

Examples

Obviously, we could do path tracing. It is by far the most costly option, but with modern high-end hardware such as AMD's Radeon RX 7000 series and newer, or Nvidia's RTX 3000 series and newer, it is finally feasible with upscaling. Here is an interesting [video series by Sebastian Lague](https://youtube.com/playlist?list=PLFt_AvWsXl0dlgwe4JQ0oZuleqOTjmox3) where he builds a real-time path tracer for Unity from scratch; the videos contain a lot of very useful visualizations. There are also many papers, especially from Nvidia, which demonstrate massive potential for real-time path tracing. Check out [2 Minute Papers'](https://www.youtube.com/playlist?list=PLujxSBD-JXgk1hb8lyu6sTYsLL39r_3bG) videos about path tracing tech [and Nvidia RTX](https://www.youtube.com/playlist?list=PLujxSBD-JXgkZIkzudS-dOZbbCFJpiAFD)!
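To make the core idea concrete, here is a minimal CPU sketch (plain Python, not engine code) of the Monte Carlo estimator at the heart of a path tracer: sample directions over the hemisphere, weight by the cosine term, and divide by the sampling pdf. For a constant-radiance sky the estimate should converge to the analytic irradiance L·π.

```python
import math
import random

def sample_uniform_hemisphere(rng):
    # Uniformly sample a direction on the unit hemisphere (z >= 0).
    # For a uniform hemisphere distribution, z = cos(theta) is uniform on [0, 1].
    u1, u2 = rng.random(), rng.random()
    z = u1
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(radiance, n_samples, seed=1):
    # Monte Carlo estimate of E = integral of L * cos(theta) over the hemisphere.
    # With uniform sampling the pdf is 1 / (2*pi), so each sample contributes
    # L * cos(theta) * 2*pi. For constant L the exact answer is L * pi.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        _, _, cos_theta = sample_uniform_hemisphere(rng)
        total += radiance * cos_theta * 2.0 * math.pi
    return total / n_samples
```

A real path tracer does this recursively per bounce and uses importance sampling (e.g. cosine-weighted directions) to cut variance, which is exactly where the upscaling/denoising mentioned above comes in.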

EA's in-house Frostbite engine uses a technique called surfel-based GI; here's [a video demonstrating the technology](https://www.youtube.com/watch?v=Uea9Wq1XdA4). It has a lot of advantages: it approaches path-traced quality at a fraction of the GPU cost, scales to scenes of any size, works on animated and skinned meshes like characters and moving geometry, and adapts well to changing environmental conditions such as a dynamic time of day. It honestly seems like a near-flawless GI implementation, and something like it would massively upgrade our GI performance and quality. It also scales well across resolutions and low-end devices, and it has been powering DICE games like Star Wars Battlefront EA and Battlefield! Amazingly, there is an [open-source OpenGL renderer called "EARenderer"](https://github.com/man-in-black382/EARenderer/) which implements this surfel technology, so we could take inspiration from that codebase. Here's a [video demo](https://www.youtube.com/watch?v=n0ktyKqq1UE)!
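To illustrate the core idea, here is a toy sketch (hypothetical names, not code from Frostbite or EARenderer): surfels act as irradiance caches pinned to geometry, each one temporally accumulating noisy per-frame samples, and shading gathers from nearby surfels weighted by distance and normal agreement.

```python
import math
from dataclasses import dataclass

@dataclass
class Surfel:
    # A surface element caching incoming light at a point on geometry.
    position: tuple
    normal: tuple
    irradiance: float = 0.0

def update_surfel(surfel, new_sample, blend=0.1):
    # Accumulate a fresh (noisy) irradiance sample into the surfel's cache
    # with an exponential moving average: the cache converges over frames
    # yet still tracks dynamic lighting like a moving sun.
    surfel.irradiance = (1.0 - blend) * surfel.irradiance + blend * new_sample
    return surfel.irradiance

def gather_irradiance(point, normal, surfels, radius=1.0):
    # Blend the cached irradiance of nearby surfels, weighting by distance
    # falloff and by how closely their normals agree with the shading normal.
    total_w = total_e = 0.0
    for s in surfels:
        d = math.dist(point, s.position)
        if d > radius:
            continue
        n_dot = sum(a * b for a, b in zip(normal, s.normal))
        if n_dot <= 0.0:
            continue
        w = (1.0 - d / radius) * n_dot
        total_w += w
        total_e += w * s.irradiance
    return total_e / total_w if total_w > 0.0 else 0.0
```

The temporal accumulation is what lets the technique reuse work across frames on skinned meshes: the surfels follow the geometry, and only their cached lighting needs refreshing.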

There is also voxel-based cone tracing. This approach can be very realistic too, although it is much more resource-intensive than something like surfels. Here's [a video demonstrating it well](https://www.youtube.com/watch?v=5m9fOVWaqdE). Something I find noteworthy about this particular implementation is how much creative control the artist gets: the global illumination can be exaggerated and the lighting crafted in a very aesthetically pleasing way. Presumably this will perform best in smaller indoor levels, unless we can find optimizations such as biasing more samples toward the camera and blending.
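A toy 1-D sketch of the core loop, under stated simplifications (real implementations march cones through a 3-D mip-mapped voxel texture on the GPU): prefilter the scene into mip levels, then march a cone, reading coarser mips as the footprint widens and compositing front to back.

```python
import math

def build_mips(base):
    # Prefilter a 1-D row of voxel opacities into mip levels by averaging
    # pairs, mirroring the 3-D mip chain a cone tracer samples from.
    mips = [list(base)]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2.0
                     for i in range(0, len(prev) - 1, 2)])
    return mips

def cone_trace_occlusion(mips, aperture=0.35, max_dist=16.0):
    # March a cone through the prefiltered grid: pick the mip whose texel
    # size matches the cone diameter at distance t, then composite the
    # sampled opacity front to back.
    occlusion = 0.0
    t = 1.0  # start one voxel out to avoid self-occlusion
    while t < max_dist and occlusion < 0.99:
        diameter = max(1.0, 2.0 * t * math.tan(aperture))
        level = min(int(math.log2(diameter)), len(mips) - 1)
        mip = mips[level]
        idx = min(int(t / (2 ** level)), len(mip) - 1)
        sample = mip[idx]  # prefiltered opacity in [0, 1]
        occlusion += (1.0 - occlusion) * sample
        t += diameter * 0.5  # step proportional to cone width
    return occlusion
```

The step size growing with the cone diameter is also why the "bias samples toward the camera" optimization fits naturally here: near samples are cheap and precise, far samples are coarse.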

There is also screen-space GI, which can look good but falls apart in complex scenes with multiple significant sources of indirect light. Perhaps we should use it in conjunction with another method. For example, we could use a low-resolution voxel cone-traced GI that is mostly accurate but not very precise to light everything, then layer screen-space GI on top to give on-screen elements more clarity. We could blend these results together so that whatever the screen-space trace misses is picked up by the fallback.
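Per pixel, the proposed layering could reduce to a confidence-weighted blend; a minimal sketch (names are illustrative, and "confidence" stands in for whatever miss/edge mask the screen-space trace produces):

```python
def blend_gi(ssgi, ssgi_confidence, fallback):
    # Combine per-pixel screen-space GI with a coarser fallback (e.g. a
    # low-resolution cone-traced result): where the screen-space trace
    # succeeded (confidence near 1) use its result; where it ran off-screen
    # or missed (confidence near 0), fall back.
    return [c * s + (1.0 - c) * f
            for s, c, f in zip(ssgi, ssgi_confidence, fallback)]
```

Because the fallback is always defined everywhere, the blend degrades gracefully at screen edges instead of showing hard seams where the screen-space data runs out.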

There are also baked irradiance grids like Blender's. They aren't fantastic for real-time rendering: a large grid can use a lot of storage, takes a fair amount of time to bake, and requires a lot of artist fussing. However, they allow a lot of artistic control, and since Blender itself supports them, it would be nice to at least see them supported. I could see cases, particularly in simpler indoor scenes, where indirect lighting via manually placed irradiance grids would be the best option.
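At runtime a baked grid is cheap: just probes looked up by interpolation. A minimal trilinear sketch (scalar probes for brevity; real grids store directional data such as spherical harmonics per probe, and this assumes at least 2 probes per axis):

```python
def sample_irradiance_grid(grid, x, y, z):
    # Trilinearly interpolate a baked irradiance grid (nested lists indexed
    # [ix][iy][iz], one scalar probe value per cell corner) at a continuous
    # position given in grid-cell units.
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    # Clamp to the grid so the 8 corner probes always exist.
    x = min(max(x, 0.0), nx - 1)
    y = min(max(y, 0.0), ny - 1)
    z = min(max(z, 0.0), nz - 1)
    ix, iy, iz = min(int(x), nx - 2), min(int(y), ny - 2), min(int(z), nz - 2)
    fx, fy, fz = x - ix, y - iy, z - iz

    def probe(dx, dy, dz):
        return grid[ix + dx][iy + dy][iz + dz]

    # Blend the 8 surrounding probes with trilinear weights.
    c00 = probe(0, 0, 0) * (1 - fx) + probe(1, 0, 0) * fx
    c10 = probe(0, 1, 0) * (1 - fx) + probe(1, 1, 0) * fx
    c01 = probe(0, 0, 1) * (1 - fx) + probe(1, 0, 1) * fx
    c11 = probe(0, 1, 1) * (1 - fx) + probe(1, 1, 1) * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz
```

This is the whole runtime cost, which is why baked grids remain attractive for low-end targets despite the bake times and storage overhead noted above.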

Additional resources

https://github.com/Cutano/CGLibrary

Reference: LeenkxTeam/LNXSDK#26