03 Numerical Integration

  • uploaded 10. 09. 2024

Comments • 9

  • @gageshmadaan6819 · 3 years ago

    Finally a great tutorial on Bayesian Quadrature. Thank you..!!

  • @katateo328 · 2 years ago

    yeah, Gaussian quadrature is exactly the boosting concept.

  • @matej6418 · 11 months ago

    elite content

  • @nullpter · 3 years ago +1

    Thank you so much for the tutorial!

  • @katateo328 · 2 years ago

    shifting or rotating the nodes a little bit would create other weak estimators.

  • @katateo328 · 2 years ago

    don't you think we can improve the performance of the trapezoidal method by using the boosting concept? The trapezoidal rule is a weak estimator, and boosting weak estimators may perform as well as Simpson's method.
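
The idea in this comment is closer to Richardson extrapolation than to boosting: combining two trapezoidal estimates at different resolutions (the first step of Romberg integration) exactly reproduces composite Simpson's rule. A minimal sketch (the integrand and node counts are illustrative):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def simpson(f, a, b, n):
    """Composite Simpson's rule with n subintervals (n even)."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return h * s / 3

f, a, b = math.exp, 0.0, 1.0
t_coarse = trapezoid(f, a, b, 8)    # weak estimate, O(h^2) error
t_fine = trapezoid(f, a, b, 16)     # weak estimate at half the step size

# Richardson extrapolation: combine the two weak trapezoid estimates
richardson = (4 * t_fine - t_coarse) / 3

# matches composite Simpson on 16 subintervals (up to rounding)
print(richardson, simpson(f, a, b, 16))
```

So a fixed linear combination of two "weak" trapezoid estimators already attains Simpson-level, O(h^4) accuracy.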

  • @danel5726 · 3 years ago

    Thank you so much for this tutorial! It's the only resource I have found so far with an intuitive example and explanation of BQ.
    I have a question regarding the variance of Z in BQ.
    There are 3 x's being considered when estimating the variance: X being the training data, and then x and x' as well.
    If k(x,x') is a kernel function for any two given inputs, how can one picture k(x,X) and k(X,x')?
    Thank you again!

    • @willtsing3000 · 3 years ago

      Hi, I am also a little bit confused by the fact that there are two inputs (x, x'), which are written separately here as k(x,X) and k(X,x').
      I was thinking maybe another formulation, written as $V = E_{x}k(x,x) - E_{x}k(x,X)K^{-1}k(x,X)^T$, could be a little bit clearer, where the two inputs x and x' are both folded into the single input x. What do you think?
      And the diagonal part of $E_{x}k(x,x)$ should be exactly equal to the original k(x,x); is this understanding correct?
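
One way to picture k(x,X): it is the length-n vector of covariances between a single query point x and the n training nodes X (and k(X,x') is the analogous vector for x'), and in the BQ variance these vectors only ever appear averaged over the integration measure. A small numpy sketch, assuming an RBF kernel and a Uniform(0,1) integration measure, with Monte Carlo standing in for the closed-form kernel mean embeddings (the nodes, lengthscale, and sample counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def k(a, b, ls=0.2):
    """RBF kernel. a: (m,), b: (n,) -> (m, n) covariance matrix."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

X = np.array([0.2, 0.8])               # training nodes X
K = k(X, X) + 1e-10 * np.eye(len(X))   # Gram matrix k(X, X), shape (2, 2)

xs = rng.uniform(0.0, 1.0, 2000)       # samples of x  ~ p(x) = Uniform(0, 1)
xs2 = rng.uniform(0.0, 1.0, 2000)      # samples of x' ~ p(x')

# k(x, X) is the vector of cross-covariances between a query point x and the
# nodes; averaging it over x ~ p gives the kernel mean embedding at the nodes.
z = k(xs, X).mean(axis=0)              # z = E_x[k(x, X)], shape (2,)

# E_{x,x'}[k(x, x')]: the double integral of the kernel over both arguments
c = k(xs, xs2).mean()

# BQ posterior variance of Z = integral of f against p under the GP prior
V = c - z @ np.linalg.solve(K, z)
print(V)  # close to zero when the nodes already pin down the integral
```

So k(x,X) and k(X,x') are not new kinds of kernel: they are the same k(·,·), evaluated between one free integration variable and the fixed node set.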

  • @katateo328 · 2 years ago

    hahaha, generally, lazy is goooooood.