A Beginner's Guide to Markov Chain Monte Carlo (MCMC) Analysis 2016

  • published 4 Aug 2016
  • presented by Dr. David Kipping (Columbia)

Comments • 35

  • @crysis1234567 · 7 years ago · +157

    6:15 skip to MCMC

  • @boseongcho62 · 2 years ago · +6

    Thank you for your help!
    9:36 posterior
    16:11 Bayes' theorem
    29:18 Metropolis rule
    40:35 walkers (emcee)

  • @lexparsimoniae2107 · 3 years ago

    Very clear and helpful lecture! Thank you!

  • @nemmaadeni · 4 years ago · +3

    This was super helpful. Thanks!

  • @cadenhong245 · 2 years ago

    Quite clear and clever explanation. Thanks a lot for sharing.

  • @broadwaybrian · 3 years ago · +7

    It would be nice to see a video on Nested Sampling. I've read some literature and coded some models in R, but I'm still a little shaky on some things (e.g., dynamic nested sampling, extracting parameter samples proportional to posterior density [I'm getting good estimates with ± sd, but would prefer output similar to MCMC]). Throwing it out there since there are not a ton of resources, and because of the tease at the beginning :)

  • @EliseLucy92 · 4 years ago · +6

    I think the biggest problem with this video is that the presenter is using a laser pointer we can't see. He says "this is this and this is that" and we never know which elements on the slides to follow. There are also the technical problems with things not showing up on the slides.

  • @karannchew2534 · 3 years ago · +6

    MCMC starts at 6:17

  • @andreneves6064 · 6 years ago · +1

    Do you know of a beginner's guide to Gibbs sampling?

  • @prodbyryshy · a year ago

    Is there a video that discusses which situations are best suited to this approach?

  • @TheRossspija · 3 years ago · +1

    He conveys a great point, and I understand that rigor is not the point of this video, but the statistician in me was screaming most of the time.

  • @lillianrthistlethwaite9638

    I had a question about the plots where you show the jumps trying to find the "green zones". The axis labels are "a" and "b", with a_min and a_max as the range for variable a, and b_min and b_max for variable b. Are a and b some sort of high-dimensional space variables, like principal components? Basically, what are these variables you are plotting against to visualize the jumps/path the MCMC algorithm takes?

    • @MadaxeMunkeee · 6 years ago

      Lillian R Ashmore: the variables are the parameters of the model, theta. a and b are two model parameters, but there will likely be many others; he's just offering a simpler scenario.
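
    For readers puzzling over the a-b plots: below is a minimal sketch in Python of what they correspond to, assuming a toy two-parameter model and the emcee package mentioned at 40:35. The Gaussian log_prob and the numbers in it are placeholders, not the model from the talk.

    ```python
    import numpy as np
    import emcee  # the ensemble sampler mentioned around 40:35

    # The "a" and "b" axes in the slides are just the two components of
    # theta = (a, b); a real model may have many more parameters.
    def log_prob(theta):
        a, b = theta
        # Placeholder standing in for log(prior * likelihood) of a real model.
        return -0.5 * ((a - 1.0) ** 2 + (b - 2.0) ** 2)

    ndim, nwalkers = 2, 32
    p0 = np.random.default_rng(0).normal(size=(nwalkers, ndim))  # random starting points

    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
    sampler.run_mcmc(p0, 2000)

    samples = sampler.get_chain(flat=True)  # column 0 is a, column 1 is b
    # Plotting samples[:, 0] against samples[:, 1] reproduces the picture of
    # walkers exploring the "green zones" of the a-b plane from the slides.
    ```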

  • @aymanhijazy9668 · 6 years ago · +1

    Great lecture! A really nice introduction to MCMC.

  • @WahranRai · 3 years ago

    31:10 What about the case in the Metropolis rule where P_trial = P_i? Do we accept theta_(i+1) = theta_i?

  • @jonathanstudentkit · 6 years ago · +1

    Laplace started Bayesian statistics at least as much as Bayes!

  • @hfkssadfrew · 4 years ago · +7

    28:06 The presenter should mention that he cannot compute the posterior itself, only compare the ratio between the trial and the previous step. Otherwise it can confuse people: since it looks like you have computed P, why bother?
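
    A minimal sketch of the point being made, using a hypothetical unnormalized density rather than the talk's model: the Metropolis rule only ever uses the ratio P_trial / P_i, so the unknown normalizing constant (the evidence) cancels, and a ratio of 1 (the P_trial = P_i case asked about above) is always accepted.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def log_unnormalized_posterior(theta):
        # Placeholder for log(prior * likelihood); the normalizing constant
        # (the evidence) is unknown and never needed below.
        return -0.5 * np.sum((theta - 3.0) ** 2)

    def metropolis_step(theta, step=0.3):
        trial = theta + step * rng.normal(size=theta.shape)
        # Only the ratio P_trial / P_i enters the rule, so any constant
        # factor in the posterior cancels out.
        log_ratio = log_unnormalized_posterior(trial) - log_unnormalized_posterior(theta)
        # Accept with probability min(1, P_trial / P_i): a ratio >= 1
        # (including the P_trial == P_i case) is always accepted.
        if np.log(rng.uniform()) < log_ratio:
            return trial
        return theta

    theta = np.zeros(2)
    chain = [theta.copy()]
    for _ in range(10_000):
        theta = metropolis_step(theta)
        chain.append(theta.copy())
    chain = np.array(chain)  # samples drawn from the unnormalized density
    ```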

  • @rizwanniaz9265 · 6 years ago

    How do you calculate the odds ratio in Bayesian ordered logistic regression? Please tell me.

  • @sandeepmane5779 · 2 years ago

    Video starts at 6:15.

  • @sitendugoswami1990 · 3 years ago · +1

    This is definitely not a beginner's guide. I know a little bit about Bayes' theorem, priors, posteriors, etc., but Dr. Kipping ensured that I had to open up the books to sort out the mess he created in my mind. Definitely won't recommend it to anyone.

  • @hfkssadfrew · 4 years ago · +3

    The way to teach MCMC should start from the definition of each letter and explain how it computes the posterior, not messy terminology together with an extremely personal viewpoint on MCMC'ers. I have been in similar situations and I know what the worst way to teach it looks like.

  • @golgoli4351 · 7 years ago · +12

    The main concern is that the presenter is not familiar with how to deliver his points. Hope this feedback helps you.

  • @edwardbrown7469 · 6 years ago · +20

    Stopping after 14 minutes, this is a garbled mess.

  • @jiayicox2755 · 4 years ago · +1

    Too many unnecessary details and not a clear logical presentation. The slides are hard to follow; it just makes people who understand MCMC even more confused.

  • @powerdriller4124 · a year ago

    Too many conceptual jumps, leaving big blanks. Something as important and as central to Bayesian analysis as how to define the likelihood function is left unexplained. And at 27:35 he says that Metropolis does not consider the evidence!! ABSURD!! It is precisely in the likelihood definition where the evidence is considered.

  • @farshadgoldoust6548 · 4 years ago

    you tried, anyway it's difficult ;)

  • @JohnSmith-lf5xm · 7 years ago · +20

    This is not for beginners. You did not explain anything. You just keep saying "Monte Carlo does this or that," but you do not explain anything.

    • @terrysworld5651 · 5 years ago · +4

      I think this is a talk designed for people who have started a PhD in astrophysics. We use a very terse definition of the word 'beginner' in astrophysics...

    • @yevgenydevine · 5 years ago · +11

      People on YouTube think "MCMC for beginners" means "MCMC for those who missed all math classes", smh.

  • @m_sh_oh · a year ago · +1

    Presentation PDF download:
    github.com/davidkipping/sagan2016/blob/master/MCMC.pdf
