10. Introduction to Learning, Nearest Neighbors
- date added: 12. 09. 2024
- MIT 6.034 Artificial Intelligence, Fall 2010
View the complete course: ocw.mit.edu/6-0...
Instructor: Patrick Winston
This lecture begins with a high-level view of learning, then covers nearest neighbors using several graphical examples. We then discuss how to learn motor skills such as bouncing a tennis ball, and consider the effects of sleep deprivation.
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu
Thank you MIT. I just found out today that Professor Patrick passed away on the 19th of July, 2019. I am immensely saddened by this news. I was actually looking forward to meeting you, but I guess that is no longer possible. Rest in peace, legend!
RIP Professor Patrick Winston. Your work will definitely live on forever.
Deep respect!
"You have to be very careful about the confusion of correlation with cause. They see the correlation, but they don't understand the cause, so that's why they make a mistake"
This is so simple but so meaningful!
The last minute of the lecture is gold.
This MIT OpenCourseWare is like eating potato chips: you cannot eat just one, or view just one lecture. Thank you.
except they are healthy
...aaaaand this is why MIT is one of the top educational institutions in the world ! Thanks MIT !
This is why I love this prof's lecture.. so much of passion along with simplicity and fun!
46:06 Another thing, not especially related to the topic: even when deprived of sleep, the brain works better in the middle of the day than at the start or end. The huge drop in performance happens when a subject usually sleeps or needs to sleep, while performance doesn't drop at all (and even rises, relative to the usual sleeping time) during mid-day. So linear regression can tell you the obvious hypothesis (losing sleep = losing performance), while the cubic spline can teach you things you didn't even think of.
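The point about model choice in the comment above can be sketched with made-up numbers (a hypothetical mid-day bump, not the actual study data): a degree-1 fit only captures the overall downward trend, while a degree-3 fit can also express a mid-day recovery.

```python
import numpy as np

# Hypothetical performance scores over hours awake (invented numbers,
# just to illustrate the point about model choice).
hours = np.array([0.0, 6.0, 12.0, 18.0, 24.0])
score = np.array([100.0, 80.0, 90.0, 70.0, 40.0])  # bump at 12h

# A degree-1 (linear) fit captures only the downward trend...
lin = np.polyfit(hours, score, 1)
# ...while a degree-3 (cubic) fit can also express the mid-day recovery.
cub = np.polyfit(hours, score, 3)

# Predictions at mid-day for both models (no claims about exact values).
lin_pred = np.polyval(lin, 12.0)
cub_pred = np.polyval(cub, 12.0)
```

The cubic model's extra flexibility is exactly what lets it surface the non-obvious mid-day pattern the comment describes.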
Love this professor..thanks MIT
The world is better with you, thanks prof Winston and MIT
Professor Winston is the best. He is amazing!
His stories and jokes inspire learning and intuition about the subject. That's good teaching skill right there. I was lucky enough to meet teachers with this skill during my secondary school years, but I find it really rare in university-level education. Thank you MIT and the late professor for this lecture series.
I can't imagine how much the knowledge contained in this course is worth.
Really love the way the professor teaches; his confidence and body language are great. Thank you very much, sir.
I started out going: "This is too slow." I'm now on day two, another 10-hour session. The pace of new information is just perfect. You are a great teacher!
I always put it on 1.25x speed, occasionally pause to ponder. ;)
These lessons just keep getting better and better
Thanks Patrick Winston for the lively presentations! Thanks MIT!!!
Excellent teacher, very organized. He has clearly taught this course many times.
Taught by Marvin Minsky and a truly class act at so many levels.
Amazing lecture thank you MIT for providing us with free knowledge!
I wish I were at MIT; such outstanding teaching.
Boy, he packed a lot into this lecture, but made it so engaging!
Best professor: funny in a sense that is often senseless...
Love this guy!!!!
Absolutely blessed to be able to get MIT courses for free.
he looks so cool !!!
he's absolutely amazing in every way
I love these lectures. Thanks MIT and Mr. Patrick for sharing.
Great lecture; he explains it with nice animations on the blackboard. Great for programmers :-)
That was hilarious, the part about the Rangers and sleep deprivation. So the question then should be: how do they get major decisions out of soldiers when they only have 25 percent of their ability? And naps do help immensely, if you can handle rounds or just nervousness.
Very useful for students, US graduate PhD researchers, or anyone needing to be introduced to the AI field. I guess the basics you get here from Prof. Winston are a great basis for tackling or using any notions in this vast field. Thanks from Algeria.
luv his curmudgeonly persona, but always lively!
The deeper you go into the series the more hard-core programmers you meet in the comment section :D
Nicecore
Wait, so the majority here are programmers and not math people? I thought that it would be the other way around. I don't know where one subject starts and the other begins anymore
I'm surprised in the 21st century we still haven't found a way to reduce our need for sleep.
I present you coffee!
Coffee doesn't reduce the *need* for sleep. It just prevents you from getting what you need.
FortNikitaBullion It does if you BELIEVE!
Well what I'm thinking is something you can take and then it would feel like you had 8 hours sleep, even though you didn't (or maybe even though you only had 2 hours). Caffeine doesn't do that --- it just makes it impossible to sleep without really improving your productivity.
FortNikitaBullion I understood what you meant.
I can't believe I just willingly watched an entire lecture.
R.I.P Patrick Winston
Thank you Patrick Winston !
I hope his x-axis grows at a much faster rate than his y-axis; otherwise the example used to get the idea across makes less sense. Still a great lecture though! Thumbs up.
At 23:01 the professor says 'so that's just the dot product, right' - but that's to say that cosine similarity = dot product, which is not precise, right? The dot product is the numerator in this case.
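A quick sketch of the distinction this comment raises (toy vectors, not from the lecture): the dot product alone is scale-sensitive, and only after dividing by the vector lengths do you get cos(theta).

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # The dot product is only the numerator; dividing by the vector
    # lengths is what turns it into cos(theta).
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

u = [3.0, 4.0]
v = [6.0, 8.0]  # same direction as u, twice the length

# dot(u, v) = 50, but cosine_similarity(u, v) = 1.0: scaling v changes
# the dot product while the angle between u and v stays the same.
```

So the commenter is right: the dot product is the numerator of the cosine similarity, not the whole expression.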
Can anyone please help me?
1. Regarding the Robotic Hand Solutions Table:
If I understand correctly, in the case of the robotic hand, we start from an empty table and drop a ball from a fixed height onto the robotic hand. When the robotic hand feels the touch of the ball, it gives a random hit while we record the hand's movements.
Then, only if the robotic arm detects after X seconds that the ball has hit the surface again does it realize that the previous movement was successful, and it records the movements it made for the successful result in the table for future use.
I guess there is a way to calculate where on the surface the ball fell, and then, when the robotic hand feels the ball touch a region close to an area it remembers, it will try the movement in the table closest to those points.
Now there are a few things I do not understand:
A. The ball arrives at an angle, so touching the same point on the board at different angles requires a different response. Our table can only hold data about the desired point and effect, and knows nothing about the intensity or angle of the ball's fall. Will the data in the table be corrupted, or never fully filled?
B. How do we update the table? It's possible that we drop a ball and, at first, when the table is empty, we try a random hit whose result is that the ball flies off to the side, so we write nothing in the table. This case may repeat itself over and over, and we would always be left with an empty table?
It seems to me that I did not quite understand the professor's words and therefore I have these questions. I would be very happy if any of you could explain to me exactly what he meant by this method of solution.
2. In relation to finding properties by vector:
If I understand correctly, we fill in the data we know in advance, and then when a new data point arrives and we don't know much about it, we measure the angle it creates with the x-axis (the angle of the vector) and check which group's angle is the best match.
Now there is a point I don't understand. Suppose I have two sets of data: group 1 has points with very low y and very high x, and group 2 has points with high x and high y. When I get a new data point with low y and low x, the vector-angle method will probably assign it to group 1, although on paper the point looks more suitable for group 2.
It seems that if we used a simple spatial partition here (as in the first case the professor presented), we would get more accurate results than pairing according to vector angles?
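A minimal sketch of the kind of disagreement the question above describes, with made-up points (a stylized configuration, not the commenter's exact numbers or the lecture's data): classifying one new point by nearest angle versus nearest Euclidean distance can give different answers.

```python
import math

def angle(p):
    # Angle the vector (x, y) makes with the x-axis.
    return math.atan2(p[1], p[0])

def euclid(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical data: group 1 sits at high x / low y, group 2 at
# moderate x and y (invented for illustration).
g1 = (10.0, 1.0)
g2 = (2.0, 2.0)
new = (1.0, 0.1)  # low x, low y

# Classify by angle: pick the group whose vector angle is closest.
by_angle = 1 if abs(angle(new) - angle(g1)) < abs(angle(new) - angle(g2)) else 2

# Classify by Euclidean distance: pick the group whose sample is closest.
by_dist = 1 if euclid(new, g1) < euclid(new, g2) else 2

# by_angle picks group 1 (same direction), by_dist picks group 2
# (much closer in space) -- the two criteria disagree.
```

Which criterion is "right" depends on whether direction or position is the meaningful feature for the problem, which is exactly the judgment call the question is circling.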
Incredible lecture, thanks MIT.
RIP Professor Winston!
When lecture is over, he just nods and walks away...
As always great stuff
R.I.P. Professor Winston
Deep respect!
Isn't there an ambiguous way to divide the graph into 4 areas? That little triangle in the middle looks like it could be included in any of the four boundaries.
What a great lecture..
amazing lecture
Amazing lecturer!
I once used a nearest-neighbor algorithm to create a Voronoi diagram. I didn't even know there was a name for either of them. I was just playing around with pixels.
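The pixel trick this comment describes can be sketched like this (a toy grid and hypothetical seed points): assigning each cell to its nearest seed yields exactly a Voronoi partition.

```python
def nearest_seed(px, py, seeds):
    # Index of the seed closest to pixel (px, py), by squared distance.
    return min(range(len(seeds)),
               key=lambda i: (seeds[i][0] - px) ** 2 + (seeds[i][1] - py) ** 2)

# Two made-up seed points on a 10x10 grid.
seeds = [(2, 2), (7, 7)]
width, height = 10, 10

# diagram[y][x] holds the index of the nearest seed -- each region of
# constant index is one Voronoi cell.
diagram = [[nearest_seed(x, y, seeds) for x in range(width)]
           for y in range(height)]
```

Replacing the index with a color per seed is all it takes to render the diagram, which is presumably what the commenter stumbled onto.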
What is comparitor 8:32? Couldn't find on the web. Is this a spelling mistake?
Thanks for posting, is there somewhere I can go to ask questions or discuss the lecture?
Amazing teacher !
Deep respect to Patrick Winston. And this solo is made using KNN classifier: czcams.com/video/K2PQOgmlQwY/video.html
great lecture
Some of the principles look very abstract and supernatural; humans have considered them a mystery and classified them as AI, but actually they are very simple. A computer could simulate them easily. The brain is small but can do a lot of things, not because of mystery but because of very simple structure.
good work sir
Does anybody know the authors of the robot balancing the tennis ball? Thanks!
Legend.
Just Wow!!
At 41:45, the professor indicates that you cannot use AI for predicting bankruptcies in credit card companies. That's like making cake without flour. Wouldn't the credit card company have relevant data to be able to use AI to predict bankruptcies? Why is the answer "no"?
Wish he could be my machine learning professor!!
Did anyone understand the derivative of x at 40:00?
If you are talking about "x prime", that's not the derivative of x. It's a new random variable. More precisely, it's a random variable transformed from the original x.
With the definition of "x prime", you can calculate its variance by plugging it in the formula. You will get 1.
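A quick numeric check of the claim in the reply above, using arbitrary sample values: after subtracting the mean and dividing by the standard deviation, the variance of "x prime" comes out to 1.

```python
import math

# Arbitrary sample values, just for the check.
xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
std = math.sqrt(var)

# "x prime": the standardized variable described above.
x_prime = [(x - mean) / std for x in xs]

mean_p = sum(x_prime) / len(x_prime)
var_p = sum((x - mean_p) ** 2 for x in x_prime) / len(x_prime)
# mean_p is 0 and var_p is 1 (up to floating-point error), by construction.
```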
so.. what is nearest neighbor??
I don't think I'll even be able to walk with 36 hours of sleep deprivation.
xXxBladeStormxXx I start hallucinating after 26 hours
good!!
The longest and least efficient way to deliver the intuition!
Possible image knn code in Java
Can some one help me expand the Cos theta equation?
Take two vectors u, v in R2 as an example. Let u = [x11, x12] and v = [x21, x22]. Then cos(theta) = (x11*x21 + x12*x22) / ( sqrt(x11^2 + x12^2) * sqrt(x21^2 + x22^2) ). If u = v, then x11 = x21 and x12 = x22, so cos(theta) = 1.
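The R2 expansion in the reply above can be checked numerically (toy vectors): identical vectors give cos(theta) = 1, and orthogonal vectors give 0.

```python
import math

def cos_theta(u, v):
    # The R2 formula written out term by term.
    num = u[0] * v[0] + u[1] * v[1]
    den = math.sqrt(u[0] ** 2 + u[1] ** 2) * math.sqrt(v[0] ** 2 + v[1] ** 2)
    return num / den

# u = v: theta = 0, so cos(theta) = 1.
same = cos_theta([3.0, 4.0], [3.0, 4.0])

# Perpendicular axes: theta = 90 degrees, so cos(theta) = 0.
perp = cos_theta([1.0, 0.0], [0.0, 1.0])
```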
With nearest-neighbour learning I got 92% accuracy on the MNIST database (with Euclidean distance), and 97% with neural nets.
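A minimal sketch of the setup this comment describes, 1-nearest-neighbor classification with Euclidean distance, shown here on toy feature vectors rather than actual MNIST images:

```python
def predict(train, labels, query):
    # Squared Euclidean distance to every training row; the label of
    # the closest row is the prediction (1-NN).
    dists = [sum((a - b) ** 2 for a, b in zip(row, query)) for row in train]
    return labels[dists.index(min(dists))]

# Toy "images" as 2-D feature vectors (made up for illustration).
train = [[0.0, 0.0], [1.0, 1.0], [0.9, 0.8]]
labels = ["zero", "one", "one"]
```

For real MNIST each row would be a flattened 784-pixel vector, but the algorithm is the same: no training step at all, just distance comparisons at query time.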
Good
If you think this is about the KNN algorithm, it's not! Get out of here... this is not about the KNN algorithm.
Tough crowd. He's funny.
Wonderful
thanx a lot ^^
C'mon pierre...
The sleep data is helpful to me. This Professor is very typical of how a Robotics Prof. would teach.
11:20 turn on caption ..lol
There he GOES again: blaming his age on poor GAIN control
Diet coke
this course is a waste of time...
Could you explain why you think that?