Lecture 18: Monte Carlo Rendering (CMU 15-462/662)

  • Added 6. 07. 2024
  • Full playlist: • Computer Graphics (CMU...
    Course information: 15462.courses.cs.cmu.edu/
  • Science & Technology

Comments • 14

  • @bimDe2024
    @bimDe2024 2 years ago +8

    100 out of 100 for this course. Thank you so much @KeenanCrane.

  • @aditya_a
    @aditya_a 1 year ago

    Thanks for this awesome series! A question: is the direct lighting example here really "importance sampling"? It seems we're just changing the domain of integration rather than sampling one portion of the domain with higher probability.
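
    One way to see the connection, sketched here with the lecture's p/p'/θ/θ' notation (the pdf bookkeeping below is an assumption for illustration, not a quote from the slides): sampling a point p' uniformly on a light of area A' and converting to the direction measure induces the pdf

        p(ω) = |p - p'|^2 / (A' cosθ')

    over the hemisphere, which is nonzero only for directions toward the light. So "changing the domain of integration" and "putting all of the sampling probability on the part of the domain that matters" are the same move.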

  • @neilfan4972
    @neilfan4972 1 year ago

    Great Lecture!
    A question: at 52:00, shouldn't dw = dA' * cos(theta')/... instead of dA' * cos(theta)/... ?

    • @RajkumarDarbar
      @RajkumarDarbar 7 months ago

      You are right. It should be dω = dA' cosθ' / |p - p'|^2.
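
      For reference, the geometric identity behind this thread (θ' is the angle at the light's surface, between the light's normal at p' and the direction from p' back to p):

          dω = cosθ' dA' / |p - p'|^2

      i.e. a patch dA' on the light subtends a solid angle that shrinks with the squared distance and with how obliquely the patch is seen from p.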

  • @GuillermoValleCosmos
    @GuillermoValleCosmos 3 years ago

    When you say "terminate 50% of all contributions with luminance less than 0.25" in the Russian roulette example, how do you do that?
    In the Russian roulette explanation you only showed how to terminate paths with a certain probability at each bounce, but how can you make it luminance-dependent?

    • @GuillermoValleCosmos
      @GuillermoValleCosmos 3 years ago

      Or is the idea that there is some relation between the Russian roulette probability p_rr and the percentage of paths below a certain luminance that you discard? Also, why talk about luminance rather than radiance here?

    • @emmpati
      @emmpati 2 years ago

      Compute the radiance, then generate a random number; if the random number is smaller than some predetermined value and the computed radiance is smaller than 0.25, kill that ray.
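
      A minimal Python sketch of that recipe, with the usual 1/(1 - p_kill) reweighting so the estimator stays unbiased (the function names, the Rec. 709 luminance weights, and the use of the path throughput as the tested quantity are assumptions for illustration, not the lecture's code):

          import random

          def luminance(rgb):
              # Rec. 709 luma weights: a scalar brightness proxy for an RGB triple
              r, g, b = rgb
              return 0.2126 * r + 0.7152 * g + 0.0722 * b

          def russian_roulette(throughput, p_kill=0.5, threshold=0.25):
              # Bright paths always continue; dim paths are killed with probability p_kill,
              # and survivors are scaled by 1/(1 - p_kill) to keep the expectation unchanged.
              if luminance(throughput) >= threshold:
                  return throughput
              if random.random() < p_kill:
                  return None                      # path terminated, contributes nothing
              w = 1.0 / (1.0 - p_kill)
              return tuple(c * w for c in throughput)

      Calling this once per bounce (throughput = russian_roulette(throughput)) and stopping the path when it returns None reproduces the "terminate 50% of all contributions with luminance less than 0.25" behavior asked about above.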

    • @emmpati
      @emmpati 2 years ago

      @GuillermoValleCosmos I think most of the time luminance and radiance refer to the same thing.

  • @GuillermoValleCosmos
    @GuillermoValleCosmos 3 years ago +1

    The variance of a sum of random variables is equal to the sum of the variances (22:10) if they are independent, no?

    • @GuillermoValleCosmos
      @GuillermoValleCosmos 3 years ago

      At 51:30, the expression for dω should say cosθ'.

    • @emmpati
      @emmpati 2 years ago

      @GuillermoValleCosmos Yes to all.

    • @sungjuyea4627
      @sungjuyea4627 2 years ago +1

      Yes, if and only if cov(x, y) = 0. And cov(x, y) = 0 if x and y are independent.
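
      For reference, the two-variable identity behind this exchange:

          Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)

      so the variances add exactly when Cov(X, Y) = 0; independence implies zero covariance, but zero covariance does not imply independence.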

  • @AinurEru
    @AinurEru 3 years ago

    The Russian roulette section feels very incoherent:
    1) 1:10:00 Is the "tentative contribution" being tested CUMULATIVE across the entire path up to and including the current event/vertex, or is it just the contribution of the CURRENT event by itself that is being tested?
    2) In either case, should it also include the direct illumination component (NEE), or exclude it?
    3) 1:11:00 Anything multiplied by 0 is 0, and the expected value of a constant is that constant, so the right part of the equation would be "E[(something)*0]", which is "E[0]", which is just 0, so the whole thing would always be 0.
    So, what's the point? What is that notation meant to convey?

    • @GuillermoValleCosmos
      @GuillermoValleCosmos 3 years ago

      The calculation at 1:11:00 is meant to show that the *new* random variable obtained by introducing the Russian roulette procedure has the same expected value as the original random variable, so it's valid for our Monte Carlo estimation.
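
      Written out, using p_rr for the continuation probability as in one of the comments above (the symbol is borrowed from that comment, not necessarily the slide's notation): the roulette estimator is X_rr = X / p_rr with probability p_rr and 0 otherwise, so

          E[X_rr] = p_rr * E[X / p_rr] + (1 - p_rr) * 0 = E[X] + 0 = E[X]

      The 0 only multiplies the terminated branch; it does not zero out the whole expectation, which is what the notation at 1:11:00 is meant to convey.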