Radiance Cascades Rendered Directly

  • added 21. 08. 2024
  • In this video we explore data stored in radiance cascades by observing it directly. This is equivalent to precalculating a scene, storing a cross-section of its radiance field and then rendering it from any viewpoint and any angle in O(1).

Comments • 211

  • @Alexander_Sannikov
    @Alexander_Sannikov  1 month ago +113

    How did y'all find this video? I'm seeing a lot of half-life themed visitors coming from somewhere, and I've no idea where from.

    • @porttek0oficial
      @porttek0oficial 1 month ago +47

      This was on my recommended page and it had an interesting thumbnail. I watched it until the end and I still don't know what I am looking at.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +14

      @@porttek0oficial sorry for that..

    • @valshaped
      @valshaped 1 month ago +27

      Don't be sorry for it! This was a wonderful demo, and you don't need to be a graphics programmer to find it interesting, fun, enjoyable, and visually appealing!

    • @Deathbynature89
      @Deathbynature89 1 month ago +39

      Radiance Cascade sounds like Resonance Cascade; combined with a cool thumbnail, it is intriguing.
      "What is a Radiance Cascade? I wanna know."
      This was recommended to me on my homepage. I have watched videos on radiance fields and tutorials on Gaussian Splatting. The thumbnail also looks like something that could be rendered in Source 2.

    • @SuboptimalEng
      @SuboptimalEng 1 month ago +25

      Probably getting recommended because of SimonDev’s recent video on Radiance Cascades heh.

  • @OtterCynical
    @OtterCynical 1 month ago +522

    Gordon doesn't need to hear all of this, he's a highly trained professional.

    • @minecraftermad
      @minecraftermad 1 month ago +15

      No no that's resonance cascade.

    • @Etka06
      @Etka06 1 month ago +1

      @@minecraftermad same thing

    • @Gelatinocyte2
      @Gelatinocyte2 1 month ago +8

      Gordon doesn't need to *see* all of this, he's a highly trained *game dev.*

  • @astr0_th3_man84
    @astr0_th3_man84 1 month ago +400

    My dumbass thinking this was a half life video

    • @Lexie_T
      @Lexie_T 1 month ago +2

      There's no chance of a half life video.

    • @elpanatv2537
      @elpanatv2537 1 month ago +15

      resonance cascade🔥🔥🔥🔥🔥🔥🗣🗣🗣🗣

    • @noisetide
      @noisetide 1 month ago +4

      @@elpanatv2537 It's time to choose Mr. Freeman...

    • @makeandbreakgames1791
      @makeandbreakgames1791 1 month ago +1

      same lol i thought it was a high quality render or something

    • @freakmoler5563
      @freakmoler5563 1 month ago

      It was the weird thumbnail that tricked me

  • @K4leidos
    @K4leidos 1 month ago +158

    Never change your mic. It's somehow almost perfect

    • @derc4N
      @derc4N 1 month ago

    • @atlas_19
      @atlas_19 1 month ago +16

      It's like a voice recording/voice report from games and movies.

    • @liegon
      @liegon 1 month ago +4

      It has a lofi feel, like a cassette recording.

    • @antonio_carvalho
      @antonio_carvalho 1 month ago +3

      The slightly nasal voice and matter-of-fact intonation also contribute to the effect

    • @Jam_Axo
      @Jam_Axo 17 days ago

      its a mad scientist mic in the making

  • @Klayperson
    @Klayperson 1 month ago +33

    this is a sickass way to render cosmic shadow people

  • @DilettanteProjects
    @DilettanteProjects 1 month ago +147

    I never thought I'd live to see a radiance cascade

    • @Kavukamari
      @Kavukamari 1 month ago +35

      let alone create one...

    • @Masonova1
      @Masonova1 1 month ago

      ☝️🤓 actually it was a resonance cascade

    • @DilettanteProjects
      @DilettanteProjects 29 days ago +3

      @@Masonova1 I like to do this thing sometimes where I notice that one word sounds a bit like another and then I make a joke out of that

    • @Synthonym
      @Synthonym 13 days ago

      @@Masonova1 whoooosh

  • @Alexander_Sannikov
    @Alexander_Sannikov  1 year ago +212

    One thing that I forgot to mention in the video is the, um, sparkles? These are path tracing fireflies that made their way into the radiance fields -- they go away the more time you give path tracing to converge, but I did not bother waiting more than half an hour and I thought they look kind of cool anyway. They don't exist when using this data structure for calculating actual global illumination, because it needs much lower resolution to be resolved and so it converges much faster.

    • @johnsherfey3675
      @johnsherfey3675 1 month ago

      Would denoising help?

    • @Mr.LeoNov
      @Mr.LeoNov 1 month ago +1

      @@johnsherfey3675 I don't think so, given how little resolution there is

    • @ThePrimaFacie
      @ThePrimaFacie 1 month ago +2

      You should pin this as the top comment

    • @monx
      @monx 1 month ago +5

      thanks. the video is missing a brief assurance that this "ghost skull" asset is presented exactly as it would appear in a game. It is not a human head in debug mode.

    • @draco6349
      @draco6349 1 month ago +2

      it's a fascinating effect, actually- a volume that sparkles isn't something i've seen much before. i wonder if it has any actual use.

  • @valshaped
    @valshaped 1 month ago +45

    The cascades look like the raw output from a light field camera! Very cool!

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +21

      There's only so many ways you can encode a light field..

    • @johndawson6057
      @johndawson6057 1 month ago +2

      @@Alexander_Sannikov so it is used in light field cameras then?

  • @longjohn7992
    @longjohn7992 1 month ago +29

    “Carmack doesn’t need to hear all this he’s a highly trained professional”

  • @Vulcorio
    @Vulcorio 1 month ago +36

    No idea what I've just listened to but the imagery is very fascinating.

  • @ThePrimaFacie
    @ThePrimaFacie 1 month ago +21

    My brain, watching this and hearing how it's done, is doing the sparkly bits of the model. Thanks for the vid

  • @euphemia3092
    @euphemia3092 1 year ago +23

    Absolutely blown away by all of your work. Thank you for sharing!!

  • @footspring94
    @footspring94 1 month ago +3

    like a digital hologram. crazy that this has been doable for such a long time and only now has been found. And just by someone on yt.

  • @4.0.4
    @4.0.4 1 month ago +15

    As Wave Function Collapse terrain generation has proven, cool names inspired by physics are generally better for game development.

    • @fritt_wastaken
      @fritt_wastaken 1 month ago +2

      But wave function collapse is an abysmally terrible name. It has nothing to do with waves or functions. It gives the wrong impression of both the essence and the complexity of the algorithm

  • @bebroid420
    @bebroid420 1 year ago +11

    2 years ago I was experimenting with directional lightmaps, trying to achieve both diffuse and sharp specular lighting. I'd been messing with "plenoptic textures" that look very similar to this demo. It's really interesting how one can come up with a similar concept while trying to achieve a different goal. The whole idea with the cascades and the usage of this technique to calculate screen space global illumination... Just wow!

  • @codymitchell4114
    @codymitchell4114 23 days ago

    Absolutely gorgeous rendering

  • @aburak621
    @aburak621 1 year ago +11

    Thanks Alexander for your ExileCon presentation! It was a joy to watch and learn.

  • @melody3741
    @melody3741 22 days ago

    This is the first time I have ever comprehended these, because everyone else basically just called it a black box. Thank you!!!

  • @jeremyashcraft2053
    @jeremyashcraft2053 11 days ago

    Really clever stuff! I know PoE 2 will be shipping with this lighting technique, and I can't wait to see it in action!

  • @CharlesVanNoland
    @CharlesVanNoland 1 month ago +4

    This is very reminiscent of lightfield rendering (originally "image based rendering" 20 years ago) of the sort that OTOY and Lytro were working on a decade ago, except here you have multiple resolutions for multiple depths? I'll have to look at your paper to understand the cascade aspect.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +2

      each cascade is encoded in a way that's similar to the good ol' image based rendering. but the most powerful property of RC is how information is distributed across multiple cascades.

  • @perkele1989
    @perkele1989 20 days ago

    The halflife refs are clearly due to your phenomenal HL voice and production quality

  • @Vegan_Kebab_In_My_Hand
    @Vegan_Kebab_In_My_Hand 1 month ago +2

    Nice video, and great explanation of the cross-sections and their relationship to the spatial and angular resolutions!

  • @DabidarZ
    @DabidarZ 1 month ago +3

    im so confuseddddd BUT THAT LOOKS cool and i love seeing new stuff!!!

  • @TheNSJaws
    @TheNSJaws 1 year ago +36

    would you mind doing more of these? Not just for Cascade rendering, but in general.
    I quite appreciate your PoE presentations, and every time I rewatch them, I wish they gave you more time.

  • @footspring94
    @footspring94 1 month ago +1

    just think how this could replace individual props like tables in corners or show massive events like large cut scenes or animated backgrounds. I also think it will work very well as a way to manage surface texturing.

  • @f7029
    @f7029 1 month ago

    This is amazing!!!!! Really excited to see what comes out of this

  • @Merthalophor
    @Merthalophor 7 days ago

    What I gather:
    We're "storing" in some way what the 3D model looks like when viewed through a plane from any angle, without keeping the 3D model. The most obvious way to do that is to store, for each pixel of the plane, what the pixel would look like when viewed from every angle. We could store, say, 200 different angles, which would mean that we have to store 200 different colors for _every_ pixel. Then, when rendering, we could check at what angle we're looking at a pixel, then linearly interpolate between the colors associated with similar angles.
    What this paper shows is that this is not necessary, and we can make the image still look decent while storing much less information. The key idea is that while we want to store a _few_ angles for _every_ pixel, we only need a _few_ pixels that store a _lot_ of angles. So for example, we could store 4 angles for _every_ pixel, then have a separate map that stores 8 angles for every 16th pixel, then another map that stores 16 angles but only every 36 pixels, and so on. By cleverly interpolating this information, we get a really life-like image, while only storing a fraction of the information (and, conversely, while only having to _calculate_ a fraction of the information, if running in real time).

    • @Alexander_Sannikov
      @Alexander_Sannikov  7 days ago

      the really important part is that the tradeoff of spatial:angular density is only possible for a given distance range. that's why RC stores radiance intervals, because they capture light coming from a certain distance range.

    • @Merthalophor
      @Merthalophor 4 days ago

      @@Alexander_Sannikov Distance... from the object being rendered? If you moved away from the plane the quality would decrease?
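
The spatial-vs-angular tradeoff summarized in the thread above can be sketched numerically. Everything below is illustrative, not the demo's actual parameters: the probe counts, direction counts, and the 2D setting are assumptions chosen only to show why halving spatial resolution while multiplying angular resolution keeps each cascade's storage constant, so the whole hierarchy costs far less than storing the finest angular set at every pixel.

```python
# Toy numbers for a 2D cascade hierarchy (hypothetical, not from the video):
# each successive cascade keeps half as many probes per axis but stores
# four times as many directions per probe, so per-cascade storage is flat.

def cascade_sizes(base_probes_per_axis=256, base_dirs=4, levels=4):
    """Return (probes_per_axis, directions, total_values) per cascade."""
    sizes = []
    for i in range(levels):
        probes = base_probes_per_axis // (2 ** i)   # half the probes per axis...
        dirs = base_dirs * (4 ** i)                 # ...four times the directions
        sizes.append((probes, dirs, probes * probes * dirs))
    return sizes

# Naive alternative: every pixel stores the finest angular set.
naive = 256 * 256 * (4 * 4 ** 3)
cascaded = sum(total for _, _, total in cascade_sizes())

print(cascade_sizes())
print(f"naive: {naive}, cascaded: {cascaded}")
```

With these toy numbers every cascade costs the same 262,144 stored values, and the four cascades together come to 1,048,576 values, 16x cheaper than the naive dense storage; the exponentially growing depth intervals Sannikov mentions are what make this redistribution valid.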

  • @TheEagleFace
    @TheEagleFace 1 month ago +3

    babe wake up they just dropped virtual holograms

  • @LukeGeaney
    @LukeGeaney 1 month ago

    Subbed - found your channel via the SimonDev video on radiance cascades then your ExileCon video :)

  • @chris-hayes
    @chris-hayes 1 month ago

    How.. illuminating. Very cool.

  • @user-mx7mc7sv2q
    @user-mx7mc7sv2q 1 month ago

    It came from recommendations and I have no idea what this is. But it's damn cool! Now I'll go deeper down that rabbit hole to find out what it is and how you managed to make it work

  • @photometricman
    @photometricman 9 months ago +4

    A lot of similarities in the atlas to lightfield photography. Highly interesting work. Thank you!

    • @Alexander_Sannikov
      @Alexander_Sannikov  9 months ago +2

      it is capturing the same data, so yeah

    • @Tr33fiddy
      @Tr33fiddy 7 months ago +1

      Good thought. Like the Lytro light field camera.

  • @LoraHannike
    @LoraHannike 1 month ago

    digital hologram, finally
    great work! keep it up

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +1

      i have a video on actual simulated holograms. check my community page to see what it looks like.

    • @LoraHannike
      @LoraHannike 1 month ago

      @@Alexander_Sannikov I will

  • @Visigoth_
    @Visigoth_ 22 days ago

    Cool video (Game Dev)... YT sees my interest in "digital volumes" and recommended this.

  • @DJDDstrich
    @DJDDstrich 1 year ago +11

    Hey Alex thanks for the video - Really cool to see a little more insight into this Radiance Cascades tech you've developed! Must be fun to play around with various little projects like this now that the tech has proven itself! How are you going with potentially open sourcing some form of it in the near future? Is it still a possibility? It seems like the only viable GI option for real-time procedural games like ARPGs or sim/management games, and, well I want to get my greedy little mitts on it! :D

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +26

      I'm close to publishing an article first, then I'm going to open-source it.

    • @DJDDstrich
      @DJDDstrich 1 year ago +4

      @@Alexander_Sannikov amazing mate! I'm super excited and hope you get the recognition from it! Where might the article land and how do I get notified?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +12

      @@DJDDstrich I'm probably going to make a short announcement video on this channel with a link to something like google drive pdf before sending it to a journal.

  • @blakes8901
    @blakes8901 1 month ago

    this genuinely might change everything. good luck. I hope this actually pays off for you monetarily.

  • @pravkdey
    @pravkdey 1 month ago

    Looks pretty! If it's too memory intensive I can imagine it being used sparingly, like for smoke grenades or effects on weapons

  • @UncoveredTruths
    @UncoveredTruths 1 year ago +1

    such a cool technique

  • @Deathend
    @Deathend 1 month ago

    Totally a coincidence that it looks like an interdimensional ghost celebrating Cinco de Mayo.

  • @sublucid
    @sublucid 1 year ago +1

    Super cool! Show us what this same model looks like illuminated by the final technique!

  • @chapterone1162
    @chapterone1162 1 month ago

    Really cool stuff man good work

  • @ellescer
    @ellescer 1 month ago +1

    no idea what you're talking about or what any of this means but, looks wicked cool dude

  • @satibel
    @satibel 11 months ago +3

    have you tried that with a billboard explosion? it may allow high speed rendering with some semi-3d effect.

  • @pravkdey
    @pravkdey 1 month ago

    The title sounds like a sleeper agent phrase haha

  • @kabu8341
    @kabu8341 1 month ago

    If you ever happen to have enough free time on your hands, I would love a video explanation of your radiance cascades for people who don't know much about graphics programming :D SimonDev's video was a good start for it tho, now I understand the things you do a bit more. Very fascinating. I still can't really imagine what those probes look like. As far as I understand now, it sounds like you have those probes with different properties in the whole... err... view? volume? of the camera, and they are scanning everything and sending the averaged data to the camera, and this is what you see in the end?

  • @manapotion1594
    @manapotion1594 1 year ago +3

    Cool research. So, does that mean the angle of view is quite limited? Does the quality degrade with the "depth" of the object from the camera?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago

      the angle is not limited, but the quality does degrade with depth with this encoding

  • @firstclass3223
    @firstclass3223 1 year ago

    Awesome stuff, well explained.

  • @iestynne
    @iestynne 1 month ago

    Thanks for showing the atlas view.
    I'm curious why you chose parabolic encoding?

  • @seccentral
    @seccentral 1 month ago

    the profiler is legit. 👍

  • @JamesAidanKing
    @JamesAidanKing 21 days ago

    yea i dont know what any of this is but it looks cool

  • @skicreature
    @skicreature 28 days ago

    Your mention of the accuracy of penumbras makes me wonder if this same technique could be adapted for dose calculation in radiation therapy. Basically we do a bunch of complex ray tracing or kernel convolutions to perform dose calculations and compute the interactions based on fluence maps from particular angles. For us, however, reflectivity isn't so important. Mostly we just have attenuation (a simple calculation) and a couple of different types of scattering interactions (much more complex, requiring Monte Carlo simulation). However, our accuracy in dose calculation tends to decrease at the edge of beams (at the penumbra), as the calculation becomes most difficult there, with scattering interactions beginning to dominate over attenuation.

    • @Alexander_Sannikov
      @Alexander_Sannikov  28 days ago

      @@skicreature people are already applying RC to a bunch of non-light radiative transfer processes. any process where energy is propagated in rays is suitable.

  • @attashemk8985
    @attashemk8985 1 year ago

    Looks cool, will be interesting to see if NeRFs could be baked into this cascade technique

  • @grimtin10
    @grimtin10 1 month ago

    yo this is really cool

  • @RPG_Guy-fx8ns
    @RPG_Guy-fx8ns 1 month ago

    so basically you bake point clouds to an atlas of cube maps and use it to render imposter pixels

  • @Jianju69
    @Jianju69 1 month ago

    Interesting effect, somewhat like glitter immersed in plastic. How much memory does this demo use? Seems too resource intensive for complex real-time scenes.

  • @simonl1938
    @simonl1938 1 month ago

    this is so cool

  • @thewaysh
    @thewaysh 1 year ago +10

    Great video, thank you. Please upgrade your microphone!

    • @iamvfx
      @iamvfx 1 year ago +9

      I like it, it sounds like a recording from 1960s 😁

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +16

      Sorry for that. That's why I usually don't voice my videos. I should get an actual mic, but I normally much prefer to program something than to waste time setting it up.
      UPD: ok, anyway, ordered a mic. Again, sorry for the quality on this one.

  • @harry1010
    @harry1010 1 month ago

    Never seen radiance fields rendered so fast - on the CPU, too! Quick q - do you have angular resolution inconsistencies at different layers due to how the resolution changes at each depth layer?

  • @perialis2970
    @perialis2970 1 month ago

    cool, this is a super cool video (not understanding one thing)

  • @addmix
    @addmix 1 month ago

    Sparkles

  • @eucenor4171
    @eucenor4171 1 month ago

    0:07 global elimination technique 💀

  • @nates9778
    @nates9778 1 year ago

    This is cool!

  • @iamvfx
    @iamvfx 1 year ago

    Cool stuff! Needs more high quality examples

  • @AtomicBl453
    @AtomicBl453 1 month ago

    i'm curious, would this help with more realistic sun sparkles on a body of water?

  • @PanzerschrekCN
    @PanzerschrekCN 1 year ago +1

    Looks great!
    But it is hard to understand how it works.
    As I understand this, you just render the scene into a tiny cubemap for each point on a regular grid. You have 4 variants of this grid (cascades), from less spatial detail to more: the less detailed grid contains more detailed cubemaps, and the more detailed grid contains less detailed cubemaps. It is still unclear to me how fetching and mixing from these arrays of cubemaps works.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +4

      That's right, the blending part is kind of hard to explain in the video, which is why I'm writing an article with a proper explanation. However, the idea is actually very simple: each cubemap has an alpha channel, and since each cascade encodes its own depth range, the cascades are always sorted front-to-back, so you just blend them using their alpha channels.

    • @JannikVogel
      @JannikVogel 1 year ago

      @@Alexander_Sannikov How do you choose these depth ranges? Are they manually chosen or do they "emerge" from the data (due to information not being representable at some angular resolution for example)?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago

      @@JannikVogel the discretization scheme is completely scene-independent. That means the exact same radiance cascades can be used to capture the radiance of an arbitrary scene. In the paper I explain in great detail why depth ranges need to increase exponentially for subsequent cascades. If you're on the "graphics programming" discord, you can have a look at the draft of my paper that people are reviewing right now.
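
The merge step Sannikov describes in this thread — each cascade covers its own depth range and carries an alpha channel, and ranges are sorted by distance — amounts to ordinary front-to-back alpha compositing. A minimal sketch, assuming premultiplied alpha (that convention, and the per-pixel tuple layout, are my assumptions, not the demo's actual code):

```python
def merge_cascades(samples):
    """samples: per-cascade (r, g, b, a) values looked up for one pixel,
    ordered nearest depth range first. Premultiplied alpha assumed."""
    out_rgb = [0.0, 0.0, 0.0]
    transmittance = 1.0  # fraction of light from farther cascades still visible
    for r, g, b, a in samples:
        out_rgb[0] += transmittance * r
        out_rgb[1] += transmittance * g
        out_rgb[2] += transmittance * b
        transmittance *= (1.0 - a)  # each cascade occludes everything behind it
    return out_rgb

# An opaque near cascade completely hides the far one:
print(merge_cascades([(1.0, 0.0, 0.0, 1.0), (0.0, 1.0, 0.0, 1.0)]))
# A half-transparent near cascade lets half of the far one through:
print(merge_cascades([(0.5, 0.0, 0.0, 0.5), (0.0, 1.0, 0.0, 1.0)]))
```

Because the depth intervals never overlap and are already depth-sorted, no per-pixel sorting is needed at render time.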

  • @lemonjumpsofficial
    @lemonjumpsofficial 1 month ago

    IT'S A HOLOGRAM!!!!!

  • @gaia35
    @gaia35 1 month ago

    This is why computers were built.

  • @TavishMcEwen
    @TavishMcEwen 1 month ago

    so cool!!

  • @rapideye101
    @rapideye101 1 month ago +1

    how does the sampling work then with the cascades? can you also make a video on that? or is there a paper?

  • @Lucsueus
    @Lucsueus 1 month ago

    Saw this video on my homepage recommendations, no relation to Half-life at all (haven't looked at source/half-life content in forever).
    To be honest I have no clue what this even means, but I'm very intrigued and I often enjoy watching people passionately explain something that to me is very niche.
    Thank you for sharing!

  • @simplegamer6660
    @simplegamer6660 1 month ago

    Damn, what mic are you using? It sounds rad af. I seriously wanna know
    P.S. Why do I have a feeling that I'm close to being the only one who came here *not* because I thought the video was somehow involved with Half-Life? :D

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +1

      Boys, looks like we have a non-half-life person here. I repeat, a person who didn't come to joke about resonance cascades.

  • @AlexDicy
    @AlexDicy 1 month ago

    How do you style ImGui like that? Is there a theme or did you manually change backgrounds and border radius?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago +2

      it's open source, you can check: legitengine by raikiri

    • @AlexDicy
      @AlexDicy 1 month ago

      @@Alexander_Sannikov Thanks!

  • @anipodat394
    @anipodat394 1 month ago

    Half-Life + Hollow Knight = Radiance Cascade

  • @oBdurate
    @oBdurate 1 month ago

    Bro sounds like Posy

  • @Terszel
    @Terszel 1 month ago

    yeah

  • @oldlifeorig5028
    @oldlifeorig5028 1 month ago

    Does it mean that we can look inside 3d models and see them more realistically from the inside? Not like backfaces, but the volume of the mesh idk

  • @Skythedragon
    @Skythedragon 1 year ago

    So from what I understand from this and the presentation, you take an array of textures, and for each pixel in that texture, look at its 8 neighbors and do some short-distance (1px) raytracing, and repeat that for the number of cascades you have?
    Then to get the final value, you read the pixels in the final cascade for the direction you want to trace in?
    Wouldn't this make it O(n) for the number of cascades instead of having some fixed upper bound?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +6

      Nope, after the precalculation is done, there's no raycasting in this demo at all. Rendering the radiance field of one cascade in this case is equivalent to just reading a cubemap texture for every pixel, which is just a hardware bilinear interpolation. So, each of the 4 cascades is looked up this way and then merged. See at 4:23: each cascade literally stores tiny cubemaps; you don't raymarch them, you interpolate them.
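
The constant-time lookup described in this reply — no raymarching, just interpolated reads of precomputed probes — can be sketched as follows. A toy 2D probe grid with scalar radiance per discrete direction stands in for the real parabolic-mapped cubemaps; all names and the grid layout are illustrative assumptions:

```python
# Hypothetical sketch of an O(1) radiance-field read: `field[y][x][d]` is a
# tiny probe grid storing radiance for a few discrete directions d; one
# lookup bilinearly blends the 4 nearest probes, mimicking what hardware
# texture filtering does for free on the GPU.

def sample_radiance(field, x, y, d):
    """Bilinear interpolation between the four probes around (x, y) at
    direction index d. Constant work regardless of scene complexity."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    p00 = field[y0][x0][d]
    p10 = field[y0][x0 + 1][d]
    p01 = field[y0 + 1][x0][d]
    p11 = field[y0 + 1][x0 + 1][d]
    top = p00 * (1 - fx) + p10 * fx
    bot = p01 * (1 - fx) + p11 * fx
    return top * (1 - fy) + bot * fy

# 2x2 probe grid, one direction; sampling halfway between all four probes
# yields their average:
field = [[[0.0], [1.0]],
         [[2.0], [3.0]]]
print(sample_radiance(field, 0.5, 0.5, 0))
```

Rendering a frame is then one such read per cascade per pixel (four here), merged by alpha, which is why the cost is independent of scene complexity.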

  • @hnlo
    @hnlo 10 days ago

    Great work! I'm really interested to find out how to encode the radiance cascade data into a texture, can you give me some pointers?

  • @vanillagorilla8696
    @vanillagorilla8696 1 month ago

    A digital hologram.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 month ago

      @@vanillagorilla8696 i have a video about actual digital wavefield holograms if you're interested in that

    • @vanillagorilla8696
      @vanillagorilla8696 1 month ago

      @@Alexander_Sannikov I'd love that.

  • @wonkaytry
    @wonkaytry 1 month ago

    mirrors

  • @gadirom
    @gadirom 1 year ago

    Ah! So there are no spherical harmonics, you used cube maps. Still seems like trickery. I'm looking forward to the paper. Really, O(1) looks like magic.

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +4

      I use spherical harmonics to gather diffuse GI (not in this demo). This demo does not need gathering irradiance, so it only uses parabolic mapping (basically, cubemaps).

    • @gadirom
      @gadirom 1 year ago

      @@Alexander_Sannikov I see. Thank you.

  • @aladorn
    @aladorn 1 year ago

    thank you for the video... veeery impressive... please upgrade your mic

  • @ABWABWABWABWABWABWABWABWABWA

    hell yeah

  •  1 month ago

    is that gauss splattering?

  • @themerpheus
    @themerpheus 1 month ago

    so its like deep shadow maps but for irradiance?

  • @neon_Nomad
    @neon_Nomad 1 month ago

    Ghost skeletons

  • @perkele1989
    @perkele1989 20 days ago

    you didn't even touch the roughness slider :( please, go into more technical detail about how this works! the graphics programmer in me is amazed

    • @Alexander_Sannikov
      @Alexander_Sannikov  20 days ago +1

      now I also wonder what that slider did, because it makes no sense :D

  • @stimpyfeelinit
    @stimpyfeelinit 1 year ago

    neat!!!!!

  • @NeoShameMan
    @NeoShameMan 1 year ago

    It's funny, i do something like that to render GI on a Mali 400 GPU in real time lol, but instead of storing the results, i store the uv of the points, which allows real-time updates by simple texture feedback, ie the texture samples itself to resolve GI. I had the cascade idea but didn't implement it, i didn't know it would be that efficient. It's on unity's forum under the name "exploration of custom diffuse rtgi", the technique is called MAGIC for mapping approximation of gi compute 😂. I resolve slowly because i haven't tested how much i can render per frame, so it's one ray per pixel per frame; my computer is dead now lol, it hasn't finished.

  • @computerghost596
    @computerghost596 Před 13 dny

    I dont think this is half life guys

  • @user-nm4mi2sq1o
    @user-nm4mi2sq1o 7 months ago

    Oh, turns out this was a teaser for Affliction in PoE)

    • @Alexander_Sannikov
      @Alexander_Sannikov  7 months ago

      How did you even manage to connect those two?

    • @user-nm4mi2sq1o
      @user-nm4mi2sq1o 6 months ago

      @@Alexander_Sannikov Answering this question will embarrass me. I myself understand little about engine development or object design/modeling. But I watch your ExileCon panels and these videos purely out of curiosity. The presentation of the material is excellent.
      Here I saw a similar model of particles layering onto an object, if I can put it that way, as in the current league in the game... even the colors match )
      I dropped by after your podcast with CARDIFF. By the way, it would be great if you could periodically organize such podcasts with him or other streamers, at least once every six months.
      The foreign audience has Chris and Jonathan, and we will have you.

  • @JannikVogel
    @JannikVogel 1 year ago

    Can you share those 8k images you have used in this demo for people who want to reproduce this (without having to capture their own scene first), or could you even upload the entire demo code somewhere?

    • @Alexander_Sannikov
      @Alexander_Sannikov  1 year ago +2

      If you want to try this, i really recommend replacing the volume rendering part with some really simple SDF fractal raymarcher or a sphere. The only reason why i used volume data is because it's obvious that i'm not rendering it in realtime (that'd be much slower).
      That being said, at some point I will publish the sources.

    @Danuxsy 1 month ago +1
    @Danuxsy Před měsícem +1

    So what would be the usecase of this?

    • @NightmareCourtPictures
      @NightmareCourtPictures 1 month ago

      Use case of global illumination? With O(1) time complexity? Practically all applications for every graphical representation for the foreseeable future.

    • @Danuxsy
      @Danuxsy 1 month ago

      Well this is different because it is NOT real-time like the path tracing we see in games, this is computed offline and so cannot change afterwards. It's similar to splatting, looks like real life but not very usable in games (as of now anyway), me sleep.

    • @NightmareCourtPictures
      @NightmareCourtPictures 1 month ago

      @@Danuxsy it's a stretch to say path tracing is real-time.
      i also don't know where or why you have the impression this isn't real-time, nor why it wouldn't be real-time, given how efficient it is.

      @Danuxsy 1 month ago

      @@NightmareCourtPictures i watched the ExileCon 2023 talk about their GI implementation using radiance cascades, so I see that it does have good use cases, yes, it's cool!

    • @caffiend81
      @caffiend81 1 month ago

      @@Danuxsy it's usable in real time. Path of Exile 2 is using Radiance Cascades. IIRC their dev team published a white paper on the technique.

  • @homematvej
    @homematvej 1 year ago +1

    Is there a paper or something?

  • @k-vandan4289
    @k-vandan4289 1 year ago

    cool

  • @paranoidPhantom
    @paranoidPhantom 1 month ago +2

    Didn't understand a damn thing, but very interesting)

  • @icaroamorim3123
    @icaroamorim3123 3 months ago

    I'm still a bit too stupid to understand but I will keep trying until I implement it

    • @icaroamorim3123
      @icaroamorim3123 3 months ago

      Just read the full paper again, I can understand it much better now.

  • @user-yx5wd5yy6h
    @user-yx5wd5yy6h 1 year ago

    understood nothing, but very interesting =)

  • @macratak
    @macratak 1 year ago

    wheres the siggraph paper my man. im tryna read that!!!

  • @dougbeard7624
    @dougbeard7624 1 month ago

    Audio is pretty bad. You need to compress.