Global Illumination Based on Surfels

  • Published 21. 07. 2024
  • This talk delves into real-time global illumination approaches based on surfels, developed by EA SEED and EA DICE. It is part of the SIGGRAPH 2021 Advances in Real-Time Rendering in Games course (advances.realtimerendering.com/).
    Global Illumination Based on Surfels (GIBS) is a solution for calculating indirect diffuse illumination in real time. The solution combines hardware ray tracing with a discretization of scene geometry to cache and amortize lighting calculations across time and space. It requires no pre-computation, no special meshes, and no special UV sets, freeing artists from tedious and time-consuming processes required by traditional solutions. GIBS enables new possibilities at runtime, allowing for high-fidelity lighting in dynamic environments and for user-created content, while accommodating content of arbitrary scale. The algorithm is part of the suite of tools available to developers and teams throughout EA as part of the Frostbite engine.
    This talk will detail the GIBS algorithm and how surfels are used to enable real-time ray-traced global illumination. We will describe how the scene is discretized into surfels on the fly, and why we think this discretization is a good fit for caching lighting operations. The talk will describe the acceleration structure used to enable efficient access to surfel data, and how this structure allows us to cover environments of arbitrary size while keeping performance and memory footprint predictable. We will detail how the algorithm handles dynamic objects, skinned characters, and transparency. Several techniques have been developed to efficiently integrate irradiance on surfels; we will describe our use of ray guiding, ray binning, and spatial filters, and how we handle scenes with large numbers of lights. (A rough illustrative sketch of such a surfel cache follows the speaker bios below.)
    Speaker Bios:
    Henrik Halen joined Electronic Arts' SEED research division as a Senior Rendering Engineer in 2017. His work at SEED is focused on real-time graphics algorithms, lighting, and characters. Henrik's experience as a rendering engineer prior to joining SEED includes a decade of contributions to franchises such as Gears of War, Battlefield, Medal of Honor and Mirror's Edge.
    Andreas Brinck has worked as a rendering engineer for more than two decades. He joined Electronic Arts in 2011 to help start Ghost Games and was later the rendering lead on NFS Rivals, NFS 2015, NFS Payback, and NFS Heat. In 2019 he joined DICE LA where he is currently working on the Battlefield franchise.
    Kyle Hayward has worked as a rendering engineer since 2010. He has focused on multiple areas in graphics, from animation compression to global illumination, working on both offline and real-time solutions. He joined EA in 2012 and served as the NBA rendering lead from 2014 onwards. In 2019 he joined Frostbite, where he has been working on global illumination and ray tracing.
    Xiangshun Bei has been a rendering engineer within DICE LA at EA since 2019, focusing on real-time rendering and ray tracing. He currently works on the Battlefield franchise. Prior to DICE, he contributed to graphics drivers for Adreno GPU on Snapdragon SoC at Qualcomm. He received his master’s degree in computer science from University of Southern California in 2017.
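
    The abstract describes discretizing the scene into surfels on the fly and caching lighting on them behind an acceleration structure with a predictable memory footprint. The talk does not include source code, so the following is only a minimal C++ sketch of what such a surfel cache could look like; every name, field, and the hash-grid layout here are illustrative assumptions, not the actual Frostbite/GIBS implementation.

```cpp
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <vector>

// A surfel caches lighting for a small patch of surface.
struct Surfel {
    float position[3];    // world-space center
    float normal[3];      // surface orientation
    float radius;         // world-space footprint
    float irradiance[3];  // cached indirect diffuse lighting
    uint32_t sampleCount; // how many ray samples have been accumulated
};

// Surfels indexed by a world-space hash grid: positions are quantized to
// cells and hashed, so lookups stay O(1) and memory scales with the number
// of live surfels rather than with the size of the world.
struct SurfelGrid {
    float cellSize = 0.5f; // meters per cell (assumed tuning value)
    std::vector<Surfel> surfels;
    std::unordered_map<uint64_t, std::vector<uint32_t>> cells; // cell -> surfel indices

    uint64_t cellKey(const float p[3]) const {
        auto q = [&](float v) { return static_cast<int64_t>(std::floor(v / cellSize)); };
        // Pack three cell coordinates (21 bits each) into one 64-bit key.
        uint64_t x = static_cast<uint64_t>(q(p[0])) & 0x1FFFFF;
        uint64_t y = static_cast<uint64_t>(q(p[1])) & 0x1FFFFF;
        uint64_t z = static_cast<uint64_t>(q(p[2])) & 0x1FFFFF;
        return (x << 42) | (y << 21) | z;
    }

    void insert(const Surfel& s) {
        uint32_t index = static_cast<uint32_t>(surfels.size());
        surfels.push_back(s);
        cells[cellKey(s.position)].push_back(index);
    }

    // Surfels that might cover a point are found with a single cell lookup;
    // a real implementation would also probe the neighboring cells.
    const std::vector<uint32_t>* lookup(const float p[3]) const {
        auto it = cells.find(cellKey(p));
        return it == cells.end() ? nullptr : &it->second;
    }
};
```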

Comments • 85

  • @ch3dsmaxuser (7 months ago)

    That is awesome!

  • @nowherebrain (2 years ago, +17)

    Including skinned meshes, too; this is impressive. Couldn't you also save surfels by ignoring geometry with direct lighting... that is, not applying surfels to directly lit surfaces?

    • @NullPointer (2 years ago, +4)

      I thought the same, but then the surfaces close to those areas won't receive that bounce

    • @nowherebrain (2 years ago, +2)

      @NullPointer I get that; I'm not clever enough to have a creative solution for it. Besides, it's kind of arrogant of me to have assumed that during (ongoing) development you hadn't already thought of this. I love this btw, good work.

  • @cube2fox (6 months ago)

    So did they end up using this approach as a default for GI? Or do they use something else for new EA/Frostbite games?

  • @sampruden6684 (2 years ago, +25)

    There are some cool ideas here, but after watching this just once I'm not seeing an obvious advantage vs. DDGI. This has very slow convergence times, and even the converged renders sometimes look a little blotchy in the demo. There's a lot of complexity that goes into handling skinned meshes etc. (and that doesn't handle procedural geometry) that DDGI avoids by storing all of the information in the probe volume.
    At the start they mention that they think it's better to calculate the GI on the surface, because that's where it's needed. That sounds sensible in theory, but I wouldn't say that anything here stood out as being visually better than DDGI in practice.
    Is there something in the "pro" column that I've missed? I guess it doesn't suffer from DDGI's corner case when all eight surrounding probes are unreachable.

    • @williamxie5901 (1 year ago, +9)

      It's good for large open-world games. With DDGI, far objects fall back to a low-resolution probe grid due to its clipmap structure, whereas GIBS spawns surfels from screen space, so the cost stays almost constant.
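
      To illustrate that point about roughly constant cost, here is a small hypothetical sketch (not the actual GIBS shader): surfel candidates are taken from visible pixels, and a surfel is spawned only where existing surfels leave a coverage gap, so the work scales with resolution rather than world size.

```cpp
#include <cstdlib>

// One candidate point read back from the G-buffer.
struct PixelSample {
    float worldPos[3];
    float worldNormal[3];
    float coverage; // 0..1, how well existing surfels already cover this pixel
};

// Hypothetical spawn decision for one screen-space sample: spawn only where
// coverage is low, and randomize so the per-frame spawn budget stays bounded.
bool shouldSpawnSurfel(const PixelSample& s) {
    float spawnChance = 1.0f - s.coverage;
    float r = static_cast<float>(std::rand()) / static_cast<float>(RAND_MAX);
    return r < spawnChance;
}
```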

  • @cptairwolf (1 year ago)

    Interesting solution but I'll take path tracing with radiance caching over this anyway.

  • @erikm9768 (2 years ago, +7)

    Isn't this just photon mapping, essentially? Is there a difference, other than using surfels with depth functions instead of spheres? Photon mapping goes back several decades.

    • @clonkex (1 year ago, +1)

      In realtime though?

  • @antoinelifestyle (2 years ago)

    Wow

  • @635574 (2 years ago, +1)

    I can thank Coretex for telling me about surfels

    • @gamingtemplar9893 (1 year ago, +1)

      Yeah, sadly he believes ray tracing is a thing and not another HairWorks.

  • @davinsaputraartandgamedev9453

    I'm curious how this compares to Lumen. Anyone willing to share their thoughts on comparing the two?

    • @Xodroc (2 years ago)

      If it supports VR, it beats Lumen. Otherwise, it's a nice alternative.

    • @eclairesrhapsodos5496 (2 years ago, +6)

      Irradiance cache is better; not sure about this one, but Lumen does reflections too, and it's not ray tracing. In my opinion VXGI / lightmaps / SVOGI / brute-force RTGI / photon-mapping diffuse GI are the best options for now. PS: Unreal Engine will soon add real-time caustics via photon mapping on the GPU with very good denoising/approximation. I'm really excited about that irradiance cache method; it should be a good middle ground lol (balance of quality / speed / production time).

    • @brainz80 (2 years ago)

      I had the exact same thought

    • @edinev5766 (2 years ago, +3)

      In my testing, and since it's linked to my job it has been extensive - Lumen is slow for big exteriors. Unusable for most professional applications.
      This doesn't seem to be. But no way to know unless it becomes available for the general public.

    • @halthewise3995 (2 years ago, +7)

      I'm not an expert, but you're right to point out that Lumen and this are trying to solve roughly the same problem, and the high-level approach is somewhat similar as well. Both combine local probe points stuck to the surface of objects with a global grid of sample points, and both are using roughly similar approaches for ray steering.
      The biggest difference in approach that I see is that Lumen's "local" sampling points are re-created from scratch each frame because they are strictly placed on a screen-space grid, while surfels stay alive as long as the camera hasn't moved too dramatically. That means Lumen needs to do temporal smoothing in screen space at the end of the pipeline, while surfels can do it earlier (and a little more flexibly). In theory, that means the best-case performance of surfels when the scene _isn't_ changing and the camera's not moving is significantly better, especially for high-resolution rendering. On the other hand, when the camera is moving, a surfel system needs to do a lot more bookkeeping to move and update the surfels, so it is likely more expensive in that case.
      In practice, the big difference is that Lumen is much farther in development, and actually exists today, including lots of work hammering out edge cases and adding all the little tweaks required to get good real-world performance. Surfel-based GI is clearly earlier stage right now, so it's hard to say how good it will be when it's "done".
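
      A tiny sketch of the per-surfel temporal accumulation described above. The exact blending GIBS uses is not given in this comment, so the scheme and the history cap below are assumptions, not the shipped implementation: because a surfel persists across frames, new ray results can be blended into its cached irradiance directly, rather than being filtered in screen space at the end of the pipeline.

```cpp
// Cached lighting state carried by one persistent surfel.
struct SurfelLighting {
    float irradiance[3] = {0.0f, 0.0f, 0.0f};
    float sampleCount = 0.0f;
};

// Blend one new ray result into the cache as a capped running average.
void accumulate(SurfelLighting& s, const float newSample[3], float maxHistory = 64.0f) {
    // New samples are weighted less as history grows, but the history is
    // capped so the cache still reacts when lighting changes
    // (64 is an arbitrary illustrative value).
    if (s.sampleCount < maxHistory) s.sampleCount += 1.0f;
    float alpha = 1.0f / s.sampleCount;
    for (int i = 0; i < 3; ++i)
        s.irradiance[i] += alpha * (newSample[i] - s.irradiance[i]);
}
```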

  • @dragonslayerornstein387 (2 years ago, +1)

    Oh god this is so jank. But it works!

  • @lohphat (2 years ago, +4)

    OH!
    I thought you said "squirrels". Worst clickbait EVAR!
    (I still enjoyed the video.)

  • @dixie_rekd9601 (2 years ago, +9)

    Sooo, am I right in thinking it's kind of like Nvidia's hardware-ray-tracing-based global illumination, but instead of single-pixel samples with an AI noise filter, it's a softer, blobbier sample radius with far better performance?

    • @clonkex (1 year ago, +3)

      RTX simply provides hardware acceleration of tracing the rays. That is, it makes it really fast to say "fire some rays from this point in this direction and tell me what they hit, how they bounce and what colour information they gather along the way". That's literally all it does. It's up to you to decide how to use that information and incorporate it into your GI shading.
      This is basically another version of "fire as many rays as we can afford and accumulate the results over time until it looks realistic". Hardware raytracing could totally be used in this algorithm to make it "look good faster" by firing a lot more rays. The trick with this sort of solution (well, one of many, many tricks) is that you don't want to waste any work you've already done, but you also have limited memory.
      I also don't think there's any AI noise filtering going on here. It's just regular procedural denoising unless I missed something.
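
      As a concrete, purely illustrative version of the "fire as many rays as we can afford and accumulate the results" idea: a minimal C++ sketch that averages cosine-weighted hemisphere samples around a surfel's normal. traceRadiance() is a placeholder for whatever ray query the engine provides (e.g. hardware ray tracing); none of this is the actual GIBS code.

```cpp
#include <cmath>
#include <random>

struct Vec3 { float x, y, z; };

// Stand-in for a ray-traced radiance query; here it just returns a dummy sky color.
Vec3 traceRadiance(const Vec3& /*origin*/, const Vec3& /*dir*/) {
    return {0.6f, 0.7f, 0.9f};
}

// Average a handful of cosine-weighted hemisphere samples around the normal.
// With cosine-weighted directions, the irradiance estimate (up to constant
// factors) is simply the mean of the returned radiance values; this per-frame
// result would then be blended into the surfel's cached value over time.
Vec3 estimateIrradiance(const Vec3& pos, const Vec3& n, int rayCount) {
    static std::mt19937 rng{1234};
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    // Build an orthonormal basis (b, t, n) around the normal.
    Vec3 h = std::fabs(n.z) < 0.999f ? Vec3{0.0f, 0.0f, 1.0f} : Vec3{1.0f, 0.0f, 0.0f};
    Vec3 b{n.y * h.z - n.z * h.y, n.z * h.x - n.x * h.z, n.x * h.y - n.y * h.x};
    float bl = std::sqrt(b.x * b.x + b.y * b.y + b.z * b.z);
    b = {b.x / bl, b.y / bl, b.z / bl};
    Vec3 t{b.y * n.z - b.z * n.y, b.z * n.x - b.x * n.z, b.x * n.y - b.y * n.x};
    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < rayCount; ++i) {
        // Cosine-weighted direction in the local frame, rotated into world space.
        float u = u01(rng), v = u01(rng);
        float r = std::sqrt(u), phi = 6.2831853f * v;
        float lx = r * std::cos(phi), ly = r * std::sin(phi), lz = std::sqrt(1.0f - u);
        Vec3 dir{lx * b.x + ly * t.x + lz * n.x,
                 lx * b.y + ly * t.y + lz * n.y,
                 lx * b.z + ly * t.z + lz * n.z};
        Vec3 L = traceRadiance(pos, dir);
        sum = {sum.x + L.x, sum.y + L.y, sum.z + L.z};
    }
    float inv = rayCount > 0 ? 1.0f / rayCount : 0.0f;
    return {sum.x * inv, sum.y * inv, sum.z * inv};
}
```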

  • @Zi7ar21 (2 years ago, +41)

    Neat! This was made by EA though and so we have to troll them with jokes about how they are gonna start charging $0.01 per surfel

  • @Jkauppa (2 years ago, +4)

    You could shoot rays from all the light sources, bounce them around, and keep a running average; then you get automatic global illumination. Just keep track of the real-time light maps, as if it's accumulated real ray tracing, i.e. real-time light baking. (A rough sketch of this idea follows this reply thread.)

    • @Jkauppa (2 years ago)

      paint the textures with light

    • @Jkauppa (2 years ago)

      you only need to fill the image pixels and no more

    • @Jkauppa (2 years ago)

      How's the light outside screen space?

    • @Jkauppa (2 years ago)

      importance sample all objects

    • @Jkauppa (2 years ago)

      send more rays
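
      What this thread is suggesting is roughly progressive light baking: trace rays from the lights and fold whatever energy they deposit into persistent light-map texels. A minimal, hypothetical sketch of that idea; the names and structure are illustrative assumptions, not from the talk.

```cpp
#include <vector>

// One light-map texel accumulating energy from light-traced hits.
struct Texel { float rgb[3] = {0.0f, 0.0f, 0.0f}; float hits = 0.0f; };

struct LightMap {
    int width = 0, height = 0;
    std::vector<Texel> texels; // width * height entries

    // Fold one light-path hit into the texel it landed on, as a running
    // average, so the map converges over frames.
    void splat(int u, int v, const float energy[3]) {
        Texel& t = texels[v * width + u];
        t.hits += 1.0f;
        float a = 1.0f / t.hits;
        for (int i = 0; i < 3; ++i)
            t.rgb[i] += a * (energy[i] - t.rgb[i]);
    }
};
```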

  • @charoleawood (2 years ago, +2)

    I think that "surface circle" is a better description of what these are than "surface element".

    • @nielsbishere (1 year ago, +2)

      Surficle

    • @inxiveneoy (10 months ago, +1)

      @nielsbishere Surcle

    • @nielsbishere (10 months ago)

      @inxiveneoy sule

    • @endavidg (4 months ago)

      Since it’s also something that has to do with sinuses, “Snot on a wall”.

  • @TheIqCook (2 years ago, +2)

    Pixar introduced this kind of rendering technique 15 years ago for offline rendering.

    • @gamingtemplar9893 (1 year ago)

      Everything is 15 years old or more: shadow techniques and this, or Nanite, which is similar to REYES tech from the '80s. But there are also new techniques. The future is not RT, it's raster.

    • @clonkex (1 year ago, +2)

      Did they? Wasn't it just regular offline raytracing?

    • @art-creator (1 year ago, +1)

      @clonkex No, it was point-cloud/brick-map-based, with harmonic filtering etc.

  • @diligencehumility6971 (2 years ago, +3)

    Quite beautiful results...
    But Frostbite is an EA engine... and EA is not a nice company at all: pay-to-win and microtransactions, surprise mechanics, taking advantage of kids, etc. So not really interesting.

    • @miurasrpnt_v2 (2 years ago, +4)

      Company and engine have to be separated, imo.

    • @clonkex (1 year ago, +4)

      Who cares? They're also spending some of that money on advancing GI technology. We can benefit greatly from their research and still never touch Frostbite.