10. Introduction to Learning, Nearest Neighbors

  • Added 12. 09. 2024
  • MIT 6.034 Artificial Intelligence, Fall 2010
    View the complete course: ocw.mit.edu/6-0...
    Instructor: Patrick Winston
    This lecture begins with a high-level view of learning, then covers nearest neighbors using several graphical examples. We then discuss how to learn motor skills such as bouncing a tennis ball, and consider the effects of sleep deprivation.
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

Comments • 106

  • @MM-uh2qk • 5 years ago +151

    Thank you, MIT. I just found out today that Professor Winston passed away on the 19th of July, 2019. I am immensely saddened by this news. I was actually looking forward to meeting you, but I guess that is no longer possible. Rest in peace, legend!

  • @AhmedSALAH-bb7un • 2 years ago +13

    RIP Professor Patrick Winston; your work will definitely live on forever.

  • @cristiannievesp • 3 years ago +21

    "You have to be very careful about the confusion of correlation with cause. They see the correlation, but they don't understand the cause, so that's why they make a mistake"
    This is so simple but so meaningful!

  • @user-ol2gx6of4g • 7 years ago +12

    The last minute of the lecture is gold.

  • @qzorn4440 • 8 years ago +103

    This MIT OpenCourseWare is like eating potato chips: you cannot eat just one, or view just one lecture. Thank you.

    • @TGPadm • 7 years ago +5

      except they are healthy

  • @Andrei-fg6uv • 7 years ago +5

    ...aaaaand this is why MIT is one of the top educational institutions in the world ! Thanks MIT !

  • @BapiKAR • 6 years ago +5

    This is why I love this prof's lectures: so much passion, along with simplicity and fun!

  • @EranM • 6 years ago +5

    46:06 Another thing, not especially related to the topic: even when deprived of sleep, the brain works better in the middle of the day than at the start or end. The huge drops in performance happen when a subject would usually be asleep or needs sleep, while performance doesn't drop at all during mid-day (and even rises relative to the usual sleeping times). So linear regression can tell you the obvious hypothesis (losing sleep = losing performance), while the cubic spline can teach you new things you didn't even think of.
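
    The contrast the comment draws can be sketched numerically. This is a minimal illustration with made-up performance numbers, assuming NumPy; a cubic polynomial stands in for the lecture's spline:

```python
import numpy as np

# Hypothetical scores (performance vs. hours into the experiment)
# with a mid-day recovery bump, as the comment describes.
hours = np.array([0.0, 4.0, 8.0, 12.0, 16.0, 20.0, 24.0])
perf  = np.array([70.0, 55.0, 60.0, 72.0, 65.0, 45.0, 35.0])

# A linear fit captures only the overall downward trend; the cubic fit
# also picks up the mid-day recovery hidden in the same data.
lin_noon = float(np.polyval(np.polyfit(hours, perf, 1), 12.0))
cub_noon = float(np.polyval(np.polyfit(hours, perf, 3), 12.0))

print(round(lin_noon, 1), round(cub_noon, 1))  # → 57.4 64.0
```

    On this made-up data the linear model predicts mediocre noon performance, while the cubic model recovers the mid-day bump, which is the comment's point about model choice shaping what you learn.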

  • @ranjeetchoice • 8 years ago +41

    Love this professor..thanks MIT

  • @cheng-haochang3509 • 7 years ago +8

    The world is better with you, thanks prof Winston and MIT

  • @bxh062000 • 10 years ago +10

    Professor Winston is the best. He is amazing!

  • @HangtheGreat • 4 years ago +1

    His stories and jokes inspire learning and intuition about the subject. That's good teaching skill right there. I was lucky enough to meet teachers with this skill during my secondary school years, but I find it really rare in university-level education. Thank you MIT and the late professor for this lecture series.

  • @hengyue6596 • 7 years ago +18

    I can't imagine how much the knowledge contained in this course is worth.

  • @saikumarmv1250 • 6 years ago +1

    Really love the way the professor teaches; his confidence and body language are great. Thank you very much, sir.

  • @martinmadsen1199 • 7 years ago +9

    I started out going: "This is too slow." I'm now on day two, another 10-hour session. The pace of new information is just perfect. You are a great teacher!

    • @user-ol2gx6of4g • 7 years ago +6

      I always put it on 1.25x speed, occasionally pause to ponder. ;)

  • @sassoleo • 6 years ago +1

    These lessons just keep getting better and better

  • @2flight • 7 years ago +1

    Thanks Patrick Winston for the lively presentations! Thanks MIT!!!

  • @pjakobsen • 5 years ago +1

    Excellent teacher, very organized. He has clearly taught this course many times.

  • @MathsatBondiBeach • 5 years ago

    Taught by Marvin Minsky and a truly class act at so many levels.

  • @mangaart3366 • 3 years ago +2

    Amazing lecture thank you MIT for providing us with free knowledge!

  • @nb1587 • 2 years ago +1

    I wish I were at MIT; such outstanding teaching.

  • @MacProUser99876 • 6 months ago

    Boy, he packed a lot into this lecture, but made it so engaging!

  • @KulvinderSingh-pm7cr • 6 years ago +1

    best professor funny in a sense that is often senseless...
    Love this guy!!!!

  • @thechesslobster2768 • 3 years ago

    Absolutely blessed to be able to get MIT courses for free.

  • @KulvinderSingh-pm7cr • 6 years ago +1

    he looks so cool !!!
    he's absolutely amazing in every way

  • @angelmcorrea1704 • 4 years ago +1

    I love these lectures; thanks MIT and Mr. Patrick for sharing.

  • @HaiderKhanZ • 10 years ago +1

    Great lecture; he explains it with nice animations on the blackboard. Great for programmers :-)

  • @acal790 • 10 years ago +1

    That was hella hilarious, the part about the Rangers and sleep deprivation. So the question then should be: how do they get major decisions out of soldiers who are operating at only 25 percent ability? And naps do help immensely, if you can handle the rounds or just the nervousness.

  • @amrdel2730 • 6 years ago

    Very useful for students, graduate and PhD researchers, or anyone needing to be introduced to the AI field. I guess the basics you get here from Prof. Winston are a great foundation for tackling or using any notions in this vast field. Thanks from Algeria.

  • @TheBirdBrothers • 8 years ago +18

    Luv his curmudgeonly persona, but he's always lively!

  • @donbasti • 7 years ago +39

    The deeper you go into the series the more hard-core programmers you meet in the comment section :D

    • @zingg7203 • 3 years ago +1

      Nicecore

    • @marceloflc • 3 years ago

      Wait, so the majority here are programmers and not math people? I thought that it would be the other way around. I don't know where one subject starts and the other begins anymore

  • @ThePeterDislikeShow • 9 years ago +16

    I'm surprised in the 21st century we still haven't found a way to reduce our need for sleep.

    • @KaosFireMaker • 9 years ago +5

      I present you coffee!

    • @ThePeterDislikeShow • 9 years ago +2

      Coffee doesn't reduce the *need* for sleep. It just prevents you from getting what you need.

    • @KaosFireMaker • 9 years ago +15

      FortNikitaBullion It does if you BELIEVE!

    • @ThePeterDislikeShow • 9 years ago

      Well, what I'm thinking of is something you could take that would make it feel like you had 8 hours of sleep, even though you didn't (or even though you only had 2 hours). Caffeine doesn't do that; it just makes it impossible to sleep without really improving your productivity.

    • @KaosFireMaker • 9 years ago +7

      FortNikitaBullion I understood what you meant.

  • @redthunder6183 • a year ago

    I can't believe I just willingly watched an entire lecture.

  • @WepixGames • 5 years ago +1

    R.I.P Patrick Winston

  • @elivazquez7582 • 6 years ago

    Thank you Patrick Winston !

  • @user-hf2dr7sh4y • 8 years ago

    I hope his x-axis grows at a much faster rate than his y-axis; otherwise the example used to get the idea across makes less sense. Still a great lecture though! Thumbs up.

  • @Jackeeba • 2 years ago

    At 23:01 the professor says "so that's just the dot product, right", but that equates cosine similarity with the dot product, which is not precise, right? The dot product is only the numerator in this case.
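
    The distinction can be checked in a few lines (a minimal sketch assuming NumPy; the example vectors are made up):

```python
import numpy as np

# The dot product alone is only the numerator; dividing by both vector
# lengths gives the cosine of the angle between the vectors.
u = np.array([3.0, 4.0])
v = np.array([6.0, 8.0])  # same direction as u, twice the length

dot = float(np.dot(u, v))
cos = dot / (np.linalg.norm(u) * np.linalg.norm(v))

print(dot, cos)  # → 50.0 1.0
```

    The dot product (50.0) depends on the vectors' lengths, while the cosine similarity is 1.0 because the two vectors point in the same direction, which is exactly what the angle-based comparison in the lecture relies on.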

  • @hdfhfdhdfhdfghdfh3306 • 6 years ago

    Can anyone please help me?
    1. Regarding the robotic-hand solutions table:
    If I understand correctly, in the case of the robotic hand we start from an empty table and drop a ball from a fixed height onto the robotic hand. When the hand feels the touch of the ball, we give a random hit while recording the hand's movements.
    Only if the arm detects after X seconds that the ball has hit the surface again does it realize that the previous movement was successful, and it then records the movements that produced the successful result in the table for future use.
    I guess there is a way to calculate where on the surface the ball fell, so that when the hand later feels the ball touch a region close to one it remembers, it will try the movement in the table closest to those points.
    Now there are a few things I do not understand:
    A. The ball arrives at an angle, so touching the same point on the paddle at different angles should require a different response. If our table only holds data about the contact point and the effect, and knows nothing about the intensity or angle of the ball's fall, won't the data in the table be corrupted, or never fully filled in?
    B. How do we update the table? We might drop a ball and, at first, when the table is empty, give a random hit that sends the ball flying off to the side, so we write nothing in the table. This case could repeat itself over and over, and we would always be left with an empty table.
    It seems I did not quite understand the professor's words, hence these questions. I would be very happy if anyone could explain exactly what he meant by this method of solution.
    2. Regarding finding properties by vector:
    If I understand correctly, we fill in the data we know in advance, and then when a new data point arrives and we do not know much about it, we measure the angle it makes with the x-axis (the angle of the vector) and check which group's angle is the closest match.
    Now there is a point I do not understand. Suppose I have two sets of data: group 1 has points with very low y and very high x, and group 2 has points with high x and y. When I get a new data point with low y and low x, the vector-angle method will probably assign it to group 1, although on paper the point looks more suited to group 2.
    It seems that if we used a simple division of the plane here (as in the first case the professor presented) we would get more accurate results than pairing by vector angle?
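
    For part 2 of the question, a small made-up example (assuming NumPy; all coordinates are hypothetical) shows how angle-based matching and plain Euclidean distance can disagree, which is the tension the question describes:

```python
import numpy as np

# Hypothetical points: matching by vector angle and by Euclidean
# distance can pick different groups for the same query.
group1 = np.array([2.0, 0.2])    # low y, close to the query
group2 = np.array([20.0, 20.0])  # same direction as the query, far away
query  = np.array([1.0, 1.0])

def angle(p):
    # Angle of the vector with the x-axis, in radians.
    return np.arctan2(p[1], p[0])

candidates = [("group1", group1), ("group2", group2)]
by_angle = min(candidates, key=lambda g: abs(angle(g[1]) - angle(query)))[0]
by_dist  = min(candidates, key=lambda g: np.linalg.norm(g[1] - query))[0]

print(by_angle, by_dist)  # → group2 group1
```

    The angle comparison ignores vector length, so it prefers the far-away point in the same direction, while Euclidean distance prefers the nearby point. Which one is "right" depends on whether direction or position is the meaningful feature for the data.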

  • @michafilek6883 • 7 years ago +2

    Incredible lecture, thanks MIT.

  • @sakcee • 2 years ago

    RIP Professor Winston!

  • @stephk42 • 5 years ago +6

    When lecture is over, he just nods and walks away...

  • @pokixd2298 • 4 years ago

    As always great stuff

  • @sansin-dev • 5 years ago +1

    R.I.P. Professor Winston

  • @pmcate2 • 5 years ago

    Isn't the division of the graph into 4 areas ambiguous? That little triangle in the middle looks like it could be included in any of the four regions.

  • @oguzhanakgun9591 • 2 years ago

    What a great lecture..

  • @user-bt6bo8lb4l • 2 years ago

    amazing lecture

  • @ally_jr • 7 years ago

    Amazing lecturer!

  • @kevnar • 4 years ago

    I once used a nearest-neighbor algorithm to create a Voronoi diagram. I didn't even know there was a name for either of them; I was just playing around with pixels.
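
    A minimal sketch of that idea (hypothetical seed points, Python standard library only): label each grid cell with the index of its nearest seed, and the cells form a discrete Voronoi diagram.

```python
import math

# Hypothetical seed points on a 10x10 pixel grid.
seeds = [(2, 2), (7, 1), (4, 8)]

def nearest_seed(x, y):
    # Index of the seed closest to (x, y) in Euclidean distance.
    return min(range(len(seeds)), key=lambda i: math.dist((x, y), seeds[i]))

# Print the grid: each cell shows which seed's region it belongs to.
for y in range(10):
    print("".join(str(nearest_seed(x, y)) for x in range(10)))
```

    Each contiguous run of the same digit is one Voronoi cell, exactly the "playing around with pixels" the comment describes.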

  • @abjkgp • 2 years ago

    What is the "comparator" at 8:32? I couldn't find it on the web. Is it a spelling mistake?

  • @shumakriss • 8 years ago

    Thanks for posting. Is there somewhere I can go to ask questions or discuss the lecture?

  • @famishedrover • 6 years ago

    Amazing teacher !

  • @dragolov • 2 years ago

    Deep respect to Patrick Winston. And this solo is made using KNN classifier: czcams.com/video/K2PQOgmlQwY/video.html

  • @yoyokagus9245 • 9 years ago

    great lecture

  • @katateo328 • 2 years ago

    Some of these principles look very abstract and supernatural, and humans have considered them a mystery and classified them as AI, but actually they are very simple; a computer can simulate them easily. The brain is small but can do a lot of things, not because of mystery but because of very simple structure.

  • @suniltech7586 • 9 years ago +1

    Good work, sir.

  • @edusson • 8 years ago

    Does anybody know who built the robot balancing the tennis ball? Thanks!

  • @thetranslator1044 • 10 months ago

    Legend.

  • @anishreddyellore6002 • 2 years ago

    Just Wow!!

  • @amitgrover • 6 years ago +1

    At 41:45, the professor indicates that you cannot use AI for predicting bankruptcies in credit card companies. That's like making cake without flour. Wouldn't the credit card company have relevant data to be able to use AI to predict bankruptcies? Why is the answer "no"?

  • @whutismyname • 5 years ago

    I wish he could be my machine learning professor!!

  • @magnfiyerlmoro3301 • 7 years ago +1

    Did anyone understand the "derivative" of x at 40:00?

    • @MrFurano • 7 years ago

      If you are talking about "x prime", that's not the derivative of x. It's a new random variable; more precisely, a random variable transformed from the original x.
      With the definition of "x prime", you can calculate its variance by plugging it into the formula. You will get 1.

  • @이종법-o2m • 3 years ago

    so.. what is nearest neighbor??

  • @xXxBladeStormxXx • 8 years ago +4

    I don't think I'd even be able to walk after 36 hours of sleep deprivation.

  • @samirelzein1978 • 2 years ago

    The longest and least efficient way to deliver the intuition!

  • @ffzcdbnc9679 • 6 years ago

    Possible image knn code in Java

  • @sauravfalia9676 • 6 years ago +1

    Can someone help me expand the cos theta equation?

    • @freeeagle6074 • 9 months ago

      Take two vectors u, v of R2 as an example. Let u=[x11 x12] and v=[x21 x22]. Then cos(theta) = (x11*x21 + x12*x22) / (sqrt(x11^2 + x12^2) * sqrt(x21^2 + x22^2)). If u=v, then x11=x21 and x12=x22, so cos(theta)=1.
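
      A quick numeric check of that expanded formula (Python standard library only; the example vectors are made up):

```python
import math

def cos_theta(u, v):
    # Expanded form: dot(u, v) / (|u| * |v|) for 2-D vectors.
    num = u[0] * v[0] + u[1] * v[1]
    den = math.hypot(u[0], u[1]) * math.hypot(v[0], v[1])
    return num / den

print(cos_theta((3.0, 4.0), (3.0, 4.0)))  # identical vectors → 1.0
print(cos_theta((1.0, 0.0), (0.0, 1.0)))  # perpendicular → 0.0
```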

  • @philippg6023 • 5 years ago +2

    With nearest-neighbor learning, I got 92% accuracy on the MNIST database (with Euclidean distance); 97% with neural nets.
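
    A minimal 1-nearest-neighbor classifier of the kind described (assuming NumPy; a tiny made-up dataset stands in for MNIST):

```python
import numpy as np

# Tiny made-up "images" as flat feature vectors, with two classes.
train_X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
train_y = np.array([0, 0, 1, 1])

def nn_classify(x):
    # 1-nearest-neighbor with Euclidean distance, as in the comment.
    dists = np.linalg.norm(train_X - x, axis=1)
    return int(train_y[np.argmin(dists)])

print(nn_classify(np.array([0.2, 0.1])), nn_classify(np.array([4.8, 5.1])))
# → 0 1
```

    The same loop, run over 28x28 MNIST vectors, is all the commenter's 92% figure requires; no training phase at all, just distance comparisons at query time.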

  • @sainiarvind3660 • 2 years ago

    Good

  • @surflaweb • 5 years ago +1

    IF YOU THINK THIS IS ABOUT THE KNN ALGORITHM: NO! GET OUT OF HERE. THIS IS NOT ABOUT THE KNN ALGORITHM.

  • @tdreamgmail • 5 years ago +1

    Tough crowd. He's funny.

  • @110Turab • 6 years ago +1

    Wonderful

  • @asmadjaidri1219 • 7 years ago

    thanx a lot ^^

  • @MattyHild • 6 years ago +1

    C'mon pierre...

  • @keira1412 • 6 years ago

    The sleep data is helpful to me. This Professor is very typical of how a Robotics Prof. would teach.

  • @dhruvjoshi8744 • 5 years ago

    11:20 turn on captions... lol

  • @bubbleman2059 • 9 years ago

    lama

  • @luke8489 • 3 years ago

    asdf

  • @robertmcintyre9023 • 2 years ago

    :(

  • @bryanjohnson7781 • 9 years ago

    There he GOES again: blaming his age on poor GAIN control

  • @akshayakumars2814 • 3 years ago

    Diet coke

  • @bensalemn271 • a year ago

    this course is a waste of time...

    • @maar2001 • 9 months ago

      Could you explain why you think that?