Nando de Freitas
Joined 17 Nov 2009
I am a machine learning professor at UBC. I am making my lectures available to the world with the hope that this will give more folks out there the opportunity to learn some of the wonderful things I have been fortunate to learn myself. Enjoy.
Deep Learning Lecture 15: Deep Reinforcement Learning - Policy search
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/
Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
views: 55,278
Deep Learning Lecture 16: Reinforcement learning and neuro-dynamic programming
120K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 14: Karol Gregor on Variational Autoencoders and Image Generation
49K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford. Guest Lecture by Karol Gregor of Google Deepmind.
Deep Learning Lecture 13: Alex Graves on Hallucination with RNNs
52K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford. Guest lecture by Alex Graves of Google Deepmind, who gives a new meaning to "the dreams in which I'm dying are the best I ever had"
Deep Learning Lecture 12: Recurrent Neural Nets and LSTMs
127K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 11: Max-margin learning, transfer and memory networks
30K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 10: Convolutional Neural Networks
154K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 9: Neural networks and modular design in Torch
44K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 7: Logistic regression, a Torch approach
38K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 8: Modular back-propagation, logistic regression and Torch
39K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 5: Regularization, model complexity and data complexity (part 2)
35K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 6: Optimization
47K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 4: Regularization, model complexity and data complexity (part 1)
47K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 3: Maximum likelihood and information
79K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Deep Learning Lecture 2: linear models
98K views · 9 years ago
Slides available at: www.cs.ox.ac.uk/people/nando.defreitas/machinelearning/ Course taught in 2015 at the University of Oxford by Nando de Freitas with great help from Brendan Shillingford.
Machine learning - Markov chain Monte Carlo (MCMC) II
35K views · 11 years ago
Machine learning - Importance sampling and MCMC I
86K views · 11 years ago
Machine learning - Deep learning II, the Google autoencoders and dropout
24K views · 11 years ago
Machine learning - Logistic regression
39K views · 11 years ago
Machine learning - Unconstrained optimization
17K views · 11 years ago
Machine learning - Random forests applications
18K views · 11 years ago
Machine learning - Bayesian optimization and multi-armed bandits
131K views · 11 years ago
Machine learning - Introduction to Gaussian processes
296K views · 11 years ago
Machine learning - Bayesian learning part 2
24K views · 11 years ago
Best lecture on GPs. Complement it with examples in GPT or Claude.
This is amazing. Thanks so much!
59:55 Why does the prediction for new data x_* calculate P(y|x_*, D, σ) instead of (x_*)^T Θ ?
Isn't the right-side formula at 22:19 for x1|x2, not for x2|x1?
At 1:02:40: I know this teacher is a legend. 11 years later and I'm still able to gain much valuable knowledge from these lectures!
Beautifully linked the idea of maximising likelihood by illustrating the 'green line' @ 51:41.
Amazing lecturer
Hi. Thanks for the wonderful videos. Please recommend a book to study for this course.
Thank you for this
Great explanation and pace. Very legit.
For the noisy GP case, we assume the noise covariance is sigma^2 times the identity matrix, which assumes iid noise. What if the noise is correlated; can we incorporate the true covariance matrix?
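Yes, nothing in the GP regression equations requires a diagonal noise term. A minimal sketch (not code from the lecture; the kernel, lengthscale, and data here are illustrative) shows the posterior mean with the usual sigma^2*I replaced by an arbitrary noise covariance matrix:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(X, y, Xstar, noise_cov):
    # GP regression mean where sigma^2 * I is replaced by a full
    # (possibly correlated) noise covariance matrix noise_cov.
    K = rbf_kernel(X, X)
    Ks = rbf_kernel(Xstar, X)
    return Ks @ np.linalg.solve(K + noise_cov, y)

X = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 0.5])
# iid noise: diagonal; correlated noise: add off-diagonal terms
# (the matrix must stay positive semi-definite).
iid = 0.1 * np.eye(3)
corr = iid + 0.05 * (np.ones((3, 3)) - np.eye(3))
m_iid = gp_posterior_mean(X, y, np.array([1.5]), iid)
m_corr = gp_posterior_mean(X, y, np.array([1.5]), corr)
```

The only requirement is that K + noise_cov remains positive definite so the linear solve is well posed.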
Still great in 2023
Hello Nando, thank you for your excellent course. Following the bell example, the mu_{1|2} and sigma_{1|2} you wrote should be for the case where we are given X2 = x2 and try to find the distribution of X1 given X2 = x2. Am I correct? Other interpretations are welcome. Thanks a lot!
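For reference, the standard textbook conditional-Gaussian identity that several comments here are debating is (this is the general form, not a transcription of the slide):

```latex
x_1 \mid x_2 \;\sim\; \mathcal{N}\!\left(
  \mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(x_2 - \mu_2),\;
  \Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}
\right)
```

So mu_{1|2} is the mean of x1 after observing x2; if the slide conditions the other way around, the subscripts 1 and 2 simply swap roles throughout.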
It amazes me that people were discussing these topics when I was studying about the water-cycle lol.
Hi Nando, is it possible to access the codes that you have mentioned in the lecture?
Thanks prof. Freitas, this is extremely helpful
super good
This lecture is amazing Professor. From the bottom of my heart, I say thank you.
One thing that remained confusing for me for a long time, and which I don't think he clarified in the video, is that the N in the summation from i = 1 to N does not refer to the number of data points in our dataset but to the number of times we run the Monte Carlo simulation.
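A tiny sketch (my own illustration, not from the lecture) makes the distinction concrete: N below counts sampler calls, and no dataset is involved at all.

```python
import random

def mc_estimate_mean(sample, N=10000, seed=0):
    # N is the number of Monte Carlo draws, NOT the size of any dataset:
    # we call the sampler N times and average the results.
    rng = random.Random(seed)
    draws = [sample(rng) for _ in range(N)]
    return sum(draws) / N

# Estimate E[X] for X ~ Uniform(0, 1); the true value is 0.5,
# and the estimate tightens as N (the draw count) grows.
est = mc_estimate_mean(lambda rng: rng.random(), N=100000)
```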
Amazing example!
Thank you very much.
Thanks!!! Easy to understand👍👍👍
He is a hidden gem. I have gone through a lot of his videos; they are great in terms of removing jargon and bringing clarity.
This is absolutely the best explanation of the Gaussian!
49:40 page 46
Thank you for everything you have given to us.
This was an excellent lecture. Thank you.
Great lecture, abrupt ending. I believe this is the short (but dense) book mentioned by Criminisi about decision forests www.microsoft.com/en-us/research/wp-content/uploads/2016/02/CriminisiForests_FoundTrends_2011.pdf
It looks like the notation of the axes in the graph on the right side of the presentation, at around 20:39, is not correct. It should probably be x1 on the x-axis; i.e., it would make sense if μ12 referred to the mean of variable x1, rather than x2, judging from the equation shown on the next slide.
You can only "die" once. That is how I remember die is singular :D
The use of notation at 23:00 is confusing for me
Learning advanced ml concepts for free! What a time to be alive. Thanks a lot for the vid!
To this day, I keep coming back to your videos to refresh ML concepts. Your courses are a Masterpiece!
Very elaborate and simple presentation. Thank you.
This is when it got really interesting, 22:02: "typically, I'm given points and I am trying to learn the mu's and the sigma's."
1:12:24 What is mu(x)? Is that different from mu?
I wish if i discovered your courses a bit earlier
Love the content. <3
Thank you! Really great explanation!!!
5 minutes in and it's already better than all 3 hours in class earlier today!
This is the best video for understanding Bayesian optimization. It would be a great help if you could post a video on multi-objective Bayesian optimization, specifically on expected hypervolume improvement. Thank you.
Don't stop sharing this knowledge; it is vital to the progress of humankind!
access to the source code to do this would be useful
So, basically, it's partial derivatives?
I don't understand; it's basically a lot of derivatives within the layers, correct?
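Essentially, yes: backpropagation is the chain rule applied layer by layer, caching intermediate activations. A toy sketch (my own example, not from the lecture) for a hypothetical two-"layer" function f(x) = sigmoid(w2 * sigmoid(w1 * x)):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_backward(x, w1, w2):
    # Forward pass, caching intermediates.
    a1 = sigmoid(w1 * x)
    a2 = sigmoid(w2 * a1)
    # Backward pass: chain rule, outermost layer first.
    da2 = a2 * (1.0 - a2)          # sigmoid'(w2 * a1)
    grad_w2 = da2 * a1             # d a2 / d w2
    da1 = a1 * (1.0 - a1)          # sigmoid'(w1 * x)
    grad_w1 = da2 * w2 * da1 * x   # chain rule through both layers
    return a2, grad_w1, grad_w2

out, g1, g2 = forward_backward(x=0.5, w1=1.0, w2=-2.0)
```

The gradients can be checked against finite differences, which is exactly the sanity check the derivative chain must pass.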
After 30 mins, I am sure that he is a top-10 teacher in my life.
The best way to explain the gamma function is that it is a continuous factorial. You should point out that the P(theta) you write is a probability DENSITY function here.
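The "continuous factorial" view is easy to verify numerically: Gamma(n) = (n - 1)! at positive integers, and the function interpolates smoothly in between (a quick check using Python's standard library, not anything from the lecture):

```python
import math

# Gamma(n) = (n - 1)! for positive integers n.
vals = [math.gamma(n) for n in range(1, 6)]            # Gamma(1)..Gamma(5)
facts = [math.factorial(n - 1) for n in range(1, 6)]   # 0!..4!

# Between the integers it still has a closed value at some points,
# e.g. Gamma(0.5) = sqrt(pi).
half = math.gamma(0.5)
```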
Great lecture!
How do you understand the math part in depth? Anyone? Help me!
Is there a mistake at 21:58? Should it be conditioned on x1 instead of x2?