Prof. Deisenroth, thank you for sharing your excellent lecture. At the very beginning, you say, "...yesterday we looked at Bayesian Linear Regression towards the end of the class...". Is that previous lecture's video available?
At 58:10, I guess f1,...,fk and x1,...,xk should have stars. At 1:00:59, what is cov(f(x), f(x*))? Maybe it is a (k+N)x(k+N) covariance matrix there, defined via kernels.
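On the (k+N)x(k+N) guess: in a Gaussian process, the training outputs f(x) at N inputs and the test outputs f(x*) at k inputs are jointly Gaussian, with a block covariance matrix built entirely from the kernel. A minimal NumPy sketch of that construction (the RBF kernel choice and all names here are my own illustration, not taken from the lecture):

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(a, b) = variance * exp(-(a - b)^2 / (2 * lengthscale^2))
    sq_dists = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

# N = 3 training inputs x, k = 2 test inputs x*
x = np.array([0.0, 1.0, 2.0])
x_star = np.array([0.5, 1.5])

K_xx = rbf_kernel(x, x)            # N x N: cov(f(x), f(x))
K_xs = rbf_kernel(x, x_star)       # N x k: cov(f(x), f(x*))
K_ss = rbf_kernel(x_star, x_star)  # k x k: cov(f(x*), f(x*))

# Joint covariance of [f(x), f(x*)]: an (N + k) x (N + k) block matrix
joint_cov = np.block([[K_xx,   K_xs],
                      [K_xs.T, K_ss]])
print(joint_cov.shape)  # (5, 5)
```

So cov(f(x), f(x*)) at 1:00:59 would be the N x k off-diagonal block K_xs, and the full joint covariance is the (N+k) x (N+k) matrix assembled above.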
Amazing!!
Incredible lecture; thank you
The last 5 minutes, on the process of adding observations, were so insightful and helped me a lot. Thanks, professor!
Excellent explanation. Thank you so much for making this video.
Very helpful !
Sir, may I ask why the posterior on slide 40 shows several different functions, whereas the previous slides show only one function?