I have been taught least squares many times, but this is the first time I've actually understood what it is!
I learned more in this 5-minute video than in 4 weeks of correlation and regression analysis lecture and practice.
Yay! This makes me so happy! Thanks for stopping by, and glad it helped!
So glad it helped! Thanks for stopping by!
Yay! Glad you found it helpful! It sure helped me to understand it. Thanks for stopping by!
Merci beaucoup! Hello from Oregon!
So nice to hear from you, and I'm glad you found this helpful. Knowledge should be shared, no?
Au revoir
Hey there! Thanks for stopping in! You minimize the vertical (offset) distance because you're checking the error between the model (the "best fit" line) and the actual "performance" of the data. By measuring the vertical distance, the x-coordinate (the input variable) stays the same for both y (the observed data value) and "y hat" (the predicted y-value). Take care!
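If it helps to see that in code, here's a quick Python sketch of the vertical offsets. The four data points are made up for illustration; the line y = 0.35x + 0.55 (mentioned elsewhere in this thread) is just the candidate model:

```python
# Hypothetical data points (made up for this example)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.0, 1.2, 1.5, 2.0]

def y_hat(x):
    # Predicted y-value from the model at the same x-coordinate
    return 0.35 * x + 0.55

# Vertical (offset) distances: observed y minus predicted y, at each x
residuals = [y - y_hat(x) for x, y in zip(xs, ys)]

# The quantity the least-squares line makes as small as possible
sum_of_squares = sum(r ** 2 for r in residuals)
print(residuals, sum_of_squares)
```

The least-squares line is the one choice of slope and intercept that minimizes that final sum.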
Awesome! So glad it helped! Thanks for stopping by!
Very straightforward explanation. Very helpful. Hope you will do more! Good job.
still helpful 11 years later.
Thank you so much!!!!
My pleasure! Thanks for stopping by!
I have my AP Stats class today, and this helped a lot! Wish I were doing that project you kept referencing, because I still need more help haha
+Rachel Hartman Glad it helped, my friend! If you like, here's the link to the project:
coccweb.cocc.edu/srule/MTH244/projects/4Correlationandregression.pdf
Here are the datasets you'll need:
coccweb.cocc.edu/srule/MTH244/projects/correg.xlsx
That was my question when I saw the video as well. Seeing it graphically, though, really helps my understanding of what is going on. Even though we are about done with the last course I'll be able to take from you, I had to subscribe, since this stuff is so fun to learn.
Thank you! Glad it made sense. I, for sure, will do more as I find time!
Very well explained. Thank you so much for your generosity. From FRANCE.
Yes, the visual aspect (seeing the actual squares) makes a big difference. Thanks for the help on my stats project! :-)
Thank you for the explanation. Helped me quite a bit.
Heck yeah! Thanks, Randy! It's been fun!
You're so welcome! Thanks for checking it out!
Good job. I'm using this for my class. It's a very clear explanation
Thank you! And heck yeah! Have fun with it!
that was a very good explanation and presentation of the least squares. very good job, thank you
Thank you so much! Thanks for stopping by!
Thanks for stopping by, and a very good point! I was just trying to get at the explanation of the "squares" bit in this video. Maybe I need another one. :)
From what I remember from Stat 370, the reason isn't to accentuate the larger variations, but to create a differentiable function (when you're minimizing the sum of the squares, you need to set partial derivatives to 0).
Hope that helps! Thanks, again, for stopping in!
Love it! I'm so visual - Thank you.
Wow!! Such a nice video for understanding the meaning of least squares. Thanks!!!
It was so useful and very well explained.
You're very welcome! Glad it helped!
Thank you. Visual cues + Meaning = Dual Code Processing = LEARNING! You just made that cool for a person with math anxiety! Yay!
+Tierceleyas Yay! So glad it was helpful! I like to come at things non-traditionally when I can. :)
The best explanation. Thank you.
Oh, thank you! Be well!
Great video!!
My god this clarified SO MUCH, THANKS!!!
You're so welcome! It always used to mystify me, too. :)
@XboxFearTheReaper95 You're more than welcome! Thanks for stopping by!
Thank you! Glad it made sense!
@DocKTP This is my favorite explanation, so I'm glad you like it, too!
Incredibly useful, thanks.
So glad it helped! Thanks for stopping in!
Great visualization .. thanks
You're so welcome! Glad it helped!
Sweet! SO glad it was helpful!
Thank you. This is very well explained.
You're welcome. Glad it helped!
You're more than welcome!
Thank you for explaining the method of least squares ; ]
Very nice and crystal clear explanation
Thanks, Doc! Appreciate it!
You're so very welcome!
Hello!
The program is called Geometer's Sketchpad. It's truly wonderful.
(I tried to post a link to its site, but got an error...so it's "dynamic geometry dot com")
Hope you find it useful!
Thank you! Hope it helped!
Thank you soo very much!!!! Very very much appreciated!!!!! :)
Man, that is awesome. Thanks!
You're welcome! Thank you! Glad it helped!!!
I love the Capricorn reference.
Thank you very much 🙏 from India..
You're so welcome!
Oh, thank you SO much!! I've got a stats exam in a couple of weeks, and I just don't understand most of what my lecturer has spent hours talking about... it's just that his wording and explanations aren't helpful, but THIS... oh, thank you!! That's one topic down!
I'm sorry if I never responded to this! So glad it helped!!!
Thanks! Very helpful! :)
Indeed, and if certain political decision-makers had the same generosity of spirit as you, the world would be a paradise. Take care.
Awesome, thank you
You're so welcome! Thanks for stopping by!
Excellent. Thanks a lot
You're so welcome! :)
fantastic stuff!! Thank you!
You're so welcome! Glad it helped!
it did!
this video helps a lot~~!!
Thanks!
@SkateObsession You're welcome!
Thank you for this wonderful sentiment (that I had to Google translate...sorry for being such an...American). Take care!
@aecesped You're welcome!
Thanks, it helped a lot!
Yay! Glad it helped!!!
Excellent!! Thanks a lot!
+chichasb You're welcome!
Sorry for my comment in French; it means: it would be heaven on earth if the politicians (around the world, not just in the USA) had the same generosity as you.
@alexe610 It was a tremendous comment! And beautiful in your native language. Thank you so much!
Hey, you're welcome! The quick and easy answer is that the sum of the deviations (unsquared) will always be zero (this is the same thing that happens when you square the "sum of deviations from average" when computing variances...I mention the motivation at around 2:06). Make sense?
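A quick numeric check of the "deviations sum to zero" claim, with made-up numbers:

```python
data = [2.0, 4.0, 4.0, 6.0, 9.0]
mean = sum(data) / len(data)  # 5.0

# Unsquared deviations from the mean always cancel out...
deviations = [x - mean for x in data]
print(sum(deviations))  # 0.0

# ...which is why variance squares them before summing
squared = [(x - mean) ** 2 for x in data]
print(sum(squared) / (len(data) - 1))  # sample variance: 7.0
```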
amazing
Thank you! Glad it helped!
thank you very much. :)
You're so welcome! Glad it helped!
Brilliant
Thank you! Hope it helped! And your profile pic, likewise, is brilliant! Thought it was Randy Rhoads. :)
Ah, that makes sense. Do all regression methods require the derivative? I assume other functions could work instead of square, but would provide a different optimized fit? I use LevMar at work for non-linear regression, which requires the Jacobian. If (-1)^2.5 existed, what do you imagine its sign would be?
Thanks! This is useful!
Oh yay! So glad it helped! :)
I loved the explanation and the video, but I have a question about the ending. I agree that the line of best fit was y=0.35x + 0.55, but to find that answer the way you showed in the video by minimizing the areas, you would have to be pivoting that line around a point that was already on the line of best fit. How could you have found this answer graphically without more information or did I miss a step?
Hello, and thanks for the note!
You missed not a thing, my friend. The only way to find it graphically would be to a) randomly test all pairs of points and b) minimize the sum of squares for the lines determined by them. No prob...assuming you have infinite precision and infinite time. :) A much more direct method is to use partial derivatives to derive the slope and intercept. This video is just to give the viewer the big picture (without the calculus).
Have a great day, and thanks!
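For anyone curious what "use partial derivatives" works out to: setting the two partials of the sum of squares to zero gives closed-form formulas for the slope and intercept. A sketch in Python, with made-up data (not the points from the video):

```python
# Minimizing S(a, b) = sum((y_i - (b*x_i + a))^2) and setting
# dS/da = dS/db = 0 yields the formulas below.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]  # hypothetical inputs
ys = [2.1, 2.9, 3.6, 4.4, 5.1]  # hypothetical observations

n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Slope: b = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / \
    sum((x - x_bar) ** 2 for x in xs)

# Intercept: the line always passes through (x_bar, y_bar)
a = y_bar - b * x_bar

print(b, a)
```

No infinite graphical search needed: two averages, one ratio, done.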
Thanks! But why minimize the squares and not just the distances?
This is great, but it doesn't explain why sum of squares and not just sum of absolutes. I imagine it's because we want bigger errors to carry more weight, but why not power of 4 or sum(exp(abs(error))) then?
cool
Thanks!
Which program are you using to demonstrate this?
Thanks
Sorry I have tourettes
real it is good
Xans in my l e a n
Which program is this made with?
Geometer's Sketchpad: www.dynamicgeometry.com/
Scoresby SC for the win
Tomorrow I have an exam... thank you.
I rocked my viva today... thanks to you 😀😀
@@EWB438 Oh YAY!!!! You're so welcome - and glad it helped!!!!! :) :) :)
Why is the squaring method preferred over absolute values?
Absolute values are way more intuitive, for sure! I (just today) had my students "invent" measures of variation, and they came up with average deviation (not standard deviation).
As far as I know, the reason the squared deviations are used is to maintain differentiability...if you want to minimize the sum of those squares, you can differentiate x^2, but not |x|.
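A concrete way to see the contrast, fitting just a single number rather than a whole line: the sum of squared deviations is smooth and bottoms out at the mean, while the sum of absolute deviations has corners at every data point and bottoms out at the median. A rough sketch, with made-up data:

```python
data = [1.0, 2.0, 2.0, 3.0, 10.0]

def sum_sq(c):
    # Smooth in c: derivative is -2 * sum(x - c), zero exactly at the mean
    return sum((x - c) ** 2 for x in data)

def sum_abs(c):
    # Piecewise linear in c: not differentiable at the data points
    return sum(abs(x - c) for x in data)

mean = sum(data) / len(data)           # 3.6
median = sorted(data)[len(data) // 2]  # 2.0

# Brute-force search shows where each criterion bottoms out
grid = [i / 100 for i in range(0, 1101)]
best_sq = min(grid, key=sum_sq)    # lands on the mean
best_abs = min(grid, key=sum_abs)  # lands on the median
print(best_sq, best_abs)
```

With calculus you get the squared-error answer directly; for absolute values you're stuck with search-style (or linear-programming) methods.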
COCCmath, thank you for writing back. I'm afraid I'm not prepared to understand your explanation. An article on algorithms brought me here. Is this topic a part of statistics?
All good! When you say "an article brought you here", do you mean to this video? If so, which article?
And yep - it is statistics. But the fact that you have to minimize the squared deviations of the points requires calculus (at least in the background).
Hope that helps!
Not this video in particular, but the term "least squares" was used. Calculus... I failed it in a previous semester.
Gotcha. There's an overlap in much of inferential statistics with calculus - generally, when something needs to be optimized.
Can we get graph
Not sure I know what you mean.
Your voice sounds like Hank's from Breaking Bad
Hope that's not a bad thing. :)
@@COCCmath hah, that is not; this is so cool 😎😅
@@aleksandrsimonov4591 Whew! You had me worried there! :) Glad you liked the video!
You forgot to explain why it is called a "regression"
True.
But I'm fairly certain that "regression to the mean" is more of a universal idea than "least squares".
I would guess that all do, since they're all minimizing the error. And, for sure, other differentiable functions besides squaring could work (especially because the squaring does create such exaggerations at large deviations)...I wonder, though, if we're tied to it due to our common use of the variance statistic as an unbiased estimator.
I imagine that (-1)^2.5 would be Capricorn...an imaginary one :)
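A tiny illustration of the "other differentiable functions could work" idea, again fitting just a constant (not a whole line) to made-up data: a 4th-power penalty is still differentiable, but it gets dragged even further toward the outlier than squaring does, so it gives a different "best" answer:

```python
data = [1.0, 2.0, 3.0, 10.0]

def loss(c, p):
    # Sum of absolute deviations raised to the power p
    return sum(abs(x - c) ** p for x in data)

# Brute-force search over candidate constants (fine enough for a demo)
grid = [i / 100 for i in range(0, 1001)]
best2 = min(grid, key=lambda c: loss(c, 2))  # squared loss: lands on the mean, 4.0
best4 = min(grid, key=lambda c: loss(c, 4))  # 4th power: pulled further toward 10
print(best2, best4)
```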
finagle
matlab uoe gang?
Not sure what you mean!
Thanks!
cool
I thought so, too!