Generative Modeling - Normalizing Flows

  • added 5 Sep 2024

Comments • 21

  • @svart-rav8072
    @svart-rav8072 17 days ago

    This explanation is amazing... it asks for an understanding of basic concepts of linear algebra and statistics, but it is still clear enough to follow even when your knowledge of these subjects is based more on intuition than on in-depth education.
    Thanks a lot for this, it's really great!

  • @heyasmusic7553
    @heyasmusic7553 a year ago +3

    This is the greatest explanation of coupling layers I've seen. Thank you

  • @jakewong6305
    @jakewong6305 a month ago

    It's like the diffusion models of today! Great!

  • @MathStuff1234
    @MathStuff1234 a year ago +4

    wow... this is absolutely brilliant. Due to the bijective nature of the normalizing flow, you're constrained to only utilizing bijective functions (which is quite limiting indeed). However, by designing the NN structure in this way, you're able to offload parameter learning to an entire internalized NN, where the NN outputs parameters for a bijective function. Mind blown, crazy stuff!
    After all this is complete, the learning piece simply being MLE makes a ton of sense.
    Thank you very much!
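    The idea in the comment above, a coupling layer where an arbitrary network only *parameterizes* an affine bijection and is never itself inverted, can be sketched in plain numpy. This is a hypothetical minimal illustration (the tiny `net`, the split sizes, and all names are made up here, not taken from the video):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical tiny "network": it can be any function at all, because it
    # is never inverted -- only the affine map it parameterizes is.
    W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
    W2, b2 = rng.normal(size=(4, 8)), rng.normal(size=4)

    def net(x_a):
        # maps the untouched half to a log-scale s and a shift t
        h = np.tanh(W1 @ x_a + b1)
        out = W2 @ h + b2
        return out[:2], out[2:]          # s (log-scale), t (shift)

    def coupling_forward(x):
        x_a, x_b = x[:2], x[2:]          # split the input in two halves
        s, t = net(x_a)
        y_b = x_b * np.exp(s) + t        # affine transform of second half
        log_det = s.sum()                # log|det J| is just the sum of s
        return np.concatenate([x_a, y_b]), log_det

    def coupling_inverse(y):
        y_a, y_b = y[:2], y[2:]
        s, t = net(y_a)                  # same net: first half is unchanged
        x_b = (y_b - t) * np.exp(-s)
        return np.concatenate([y_a, x_b])

    x = rng.normal(size=4)
    y, log_det = coupling_forward(x)
    x_rec = coupling_inverse(y)
    assert np.allclose(x, x_rec)         # invertible whatever net does
    ```

    The inverse works because the first half passes through unchanged, so the inverse pass can recompute exactly the same `s` and `t` from it; the network itself never needs to be bijective.
    
    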

  • @jcamargo2005
    @jcamargo2005 a month ago

    I think at 9:00, because of the chain rule, we must evaluate not at x but like this (example for two functions): Df(x) = Df_1(f_2(x)) Df_2(x)
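    The evaluation point the comment above raises can be checked numerically. Below is an illustrative sketch with two made-up maps on R^2 (names `f1`, `f2` are mine, not from the video): the chain-rule product with Df_1 evaluated at f_2(x) matches a finite-difference Jacobian of the composition, while evaluating Df_1 at x does not:

    ```python
    import numpy as np

    # Two toy maps on R^2 with hand-written Jacobians.
    def f2(x):
        return np.array([x[0] + x[1]**2, x[1]])

    def Df2(x):
        return np.array([[1.0, 2.0 * x[1]],
                         [0.0, 1.0]])

    def f1(z):
        return np.array([z[0] * np.exp(z[1]), z[1]])

    def Df1(z):
        return np.array([[np.exp(z[1]), z[0] * np.exp(z[1])],
                         [0.0, 1.0]])

    def f(x):                            # the composition f = f1 o f2
        return f1(f2(x))

    x = np.array([0.7, -1.3])

    # Chain rule: Df(x) = Df1(f2(x)) @ Df2(x) -- note Df1 is evaluated
    # at f2(x), not at x.
    J_chain = Df1(f2(x)) @ Df2(x)

    # Central finite-difference Jacobian of f as a ground truth.
    eps = 1e-6
    J_fd = np.column_stack([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                            for e in np.eye(2)])

    assert np.allclose(J_chain, J_fd, atol=1e-4)
    # Evaluating Df1 at x instead produces the wrong matrix:
    assert not np.allclose(Df1(x) @ Df2(x), J_fd, atol=1e-4)
    ```
    
    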

  • @ShuaimingJing-ue6qn
    @ShuaimingJing-ue6qn 2 months ago

    Hi, that is an amazing lecture. Thank you so much for the video. Could you please post the lecture powerpoints?

  • @praveen3779
    @praveen3779 19 days ago

    Nice video again! However, I could not wrap my head around one thing: if I use random shuffling, how can it still be invertible?

  • @piotrkaminski2305
    @piotrkaminski2305 2 months ago

    Great explanation! Straight to the point and clear!

  • @tkkitk
    @tkkitk 4 months ago

    This is the most useful lecture for getting started with normalizing flows!!!

  • @moon_dragon22
    @moon_dragon22 a year ago +2

    This is an incredibly nice explanation! Thank you so much

  • @BartoszBielecki
    @BartoszBielecki 11 days ago

    Are diffusion models a specific implementation of this, or something else?

  • @seunghyeonjeon3174
    @seunghyeonjeon3174 a year ago +1

    Thanks for your generative model series!

  • @sehaba9531
    @sehaba9531 8 months ago

    Thank you so much for such a great, clear and easy-to-follow explanation. I like the comparison between flow-based models, GANs and VAEs at the end of the video! Also, the math explanation is very clear :)

  • @blacksages
    @blacksages 5 months ago +1

    Hello, thanks for your explanation.
    I don't understand how shuffling is an invertible function. Do you have to remember where you shuffled your points?

    • @drdca8263
      @drdca8263 4 months ago +1

      I think each random shuffling is selected once when the architecture of the network is established, and does not change after that.
      Done this way, the shuffling can be undone.

    • @jcamargo2005
      @jcamargo2005 a month ago

      Same here. This does not look differentiable.
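    Both points in this thread, that the shuffle is fixed once at architecture time and therefore invertible, and that it needs no gradients of its own, can be illustrated with a small numpy sketch (my own hypothetical example, not code from the video):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    d = 6
    perm = rng.permutation(d)        # drawn ONCE at model-build time, then fixed
    inv_perm = np.argsort(perm)      # its exact inverse permutation

    x = rng.normal(size=d)
    y = x[perm]                      # "shuffle" layer, forward pass
    x_back = y[inv_perm]             # inverse pass undoes it exactly
    assert np.allclose(x, x_back)

    # As a linear map, the shuffle is a permutation matrix P with
    # |det P| = 1, so it contributes nothing to the log-det term; it has
    # no parameters, so no gradients are needed for it -- gradients just
    # pass through with their entries reordered.
    P = np.eye(d)[perm]
    assert np.allclose(P @ x, y)
    assert np.isclose(abs(np.linalg.det(P)), 1.0)
    ```

    Remembering the shuffle is exactly what the model does: the permutation is stored as part of the architecture, like any other layer definition.
    
    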

  • @nathanwong1780
    @nathanwong1780 9 months ago

    Hello, I really enjoyed the explanation. It was easy to follow and the analogy was very useful!

  • @mohammadashraful1598
    @mohammadashraful1598 5 months ago

    The explanations! Very impressive.

  • @CalebCranney
    @CalebCranney 4 months ago +1

    Thank you so much for this video. I'd watched several videos on flow before this one, but this is where it really clicked for me. I echo @MathStuff1234, absolutely brilliant.

  • @simonhradetzky7055
    @simonhradetzky7055 a year ago

    Very nice explanation for me as a data science student, thank you!

  • @manangupta6459
    @manangupta6459 6 months ago

    Extremely good!