
Sparse Gaussian Process Approximations, Richard Turner

  • added 17 Sep 2017
  • Sparse Gaussian Process Approximations
    Richard Turner
    University of Cambridge
    gpss.cc/gpss17/...
    Wednesday 11am
    For the full programme and notebooks: gpss.cc/gpss17

Comments • 8

  • @giannisbdk • 6 months ago

    Your teaching skills are truly impressive! Thank you for sharing those insightful tips that make these approximations more understandable at their core, especially the link to factor graphs. Very useful for anyone who wants to connect the dots and gain a comprehensive understanding of why and how these approaches work, rather than just applying their results.

  • @kazparala • 6 years ago

    Brilliant! It's a very niche topic, and this lecture explains the currently available literature in a very intuitive manner.

  • @mohamadmoghadam4756 • 6 years ago • +1

    Amazing explanation. Thanks a lot for uploading. Although I am outside the field, I could grasp it.

  • @mohammadjoshaghani5388

    Thanks for sharing, very nice explanation!

  • @chenxin4741 • 3 years ago

    Turner is so great at teaching, too!

  • @andreashadjiantonis2596

    Thanks a lot!

  • @king2176 • 3 years ago

    Great explanation; I'm sure it'll get more views as GPs become more mainstream.

    • @monsterous289 • 3 years ago

      Unfortunately for GPs, it seems that for nearly every machine-learning-style task the sparse Bernoulli model has proven superior, along with the rTop-k algorithm as of late 2020. From my understanding, of course.