Linear Regression From Scratch in Python (Mathematical)

  • Published 23 Jun 2024
  • In this video we implement the linear regression algorithm from scratch. This episode is highly mathematical.
    ◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾◾
    📚 Programming Books & Merch 📚
    🐍 The Python Bible Book: www.neuralnine.com/books/
    💻 The Algorithm Bible Book: www.neuralnine.com/books/
    👕 Programming Merch: www.neuralnine.com/shop
    🌐 Social Media & Contact 🌐
    📱 Website: www.neuralnine.com/
    📷 Instagram: / neuralnine
    🐦 Twitter: / neuralnine
    🤵 LinkedIn: / neuralnine
    📁 GitHub: github.com/NeuralNine
    🎙 Discord: / discord
    🎵 Outro Music From: www.bensound.com/
    Timestamps:
    (0:00) Intro
    (0:19) Mathematical Theory
    (12:48) Implementation From Scratch
    (24:05) Outro
  • Science & Technology

Comments • 133

  • @MrTheBroMoe
    @MrTheBroMoe 2 years ago +192

    Please do more mathematical implementations like this! This really helped me understand the math behind these algorithms.

  • @kapilsonyt
    @kapilsonyt 2 years ago +3

    I recently stumbled upon this while looking for a similar approach in Python, and I have now subscribed.
    Thank you very much for imparting knowledge.

  • @kenadams2261
    @kenadams2261 2 years ago +1

    Amazing video, looking forward to more implementations from scratch!

  • @vardhanr8177
    @vardhanr8177 2 years ago +10

    AMAZING video!!! I love these videos which teach theory too! Thank youuu!!!

  • @santrix1595
    @santrix1595 1 year ago +3

    Dude, slammer video!! Love the fact that you made the math interesting and super easy to understand.

  • @Nasrullah0
    @Nasrullah0 8 months ago +3

    This video is amazing, good job. I'm now actually thinking about revisiting the math classes I couldn't take before, in order to get better at these machine learning algorithms.

  • @Ryguy12543
    @Ryguy12543 1 year ago +1

    Thank you so much. I'm working through a problem set for a Neural Analysis class and this really helps. Great video.

  • @PML37
    @PML37 14 days ago +1

    That was just wow! The way you explained it was amazing.
    Thank you!

  • @Ossigen8
    @Ossigen8 2 months ago +2

    I don't often comment on videos, but this one is really good. You explained the math concepts really well by doing the examples on the graph, etc. Good stuff on this channel!

  • @SabarishanM
    @SabarishanM 22 days ago

    I was searching everywhere and finally found what I needed. Your video really clears up the fundamentals of creating a linear regression model. Thank you.

  • @AkhilendraOjhaBME
    @AkhilendraOjhaBME 1 year ago +2

    Greatly explained in simple words; looking forward to learning more methods from you.

  • @davidmedina1886
    @davidmedina1886 2 years ago +2

    This is a great video! Thank you so much!

  • @zukofire6424
    @zukofire6424 2 years ago +7

    Thanks very much for this!! I am a data analysis student, close to giving up but still hanging on!

  • @fraggodgaming4433
    @fraggodgaming4433 1 year ago

    Thank you so much man! Great vid... keep up the good work!

  • @zezestyles6215
    @zezestyles6215 1 year ago

    Thank youuu :) Loved the way you explained it.

  • @happypotato4252
    @happypotato4252 6 months ago

    Excellent explanation of the implementation of linear regression, thanks!!

  • @lashrack4548
    @lashrack4548 1 year ago

    You got a new subscriber... best detailed explanation ever!!!

  • @graphicsguy1965
    @graphicsguy1965 1 year ago +3

    Thanks man! Awesome video that dumbed it down enough for me. Could you do linear regression for fitting a sphere to points, given the sphere radius (or not), like is used in terrestrial laser scanning? If you've never messed with it: you place spheres around what you are scanning, then the software uses the spheres to align all of the different scans from different locations together.

  • @naitikpatil243
    @naitikpatil243 2 years ago +2

    Waiting for a series of such videos 🤩

  • @anonymousvevo8697
    @anonymousvevo8697 1 year ago

    The math part was awesome! Thanks a lot, very clear and simple.

  • @TheVerbalAxiom
    @TheVerbalAxiom 2 years ago +1

    I am subscribing, definitely. You have taught, in the most straightforward and explanatory way, a concept that other videos made a bit too complicated. Thank you!

  • @dragnar4743
    @dragnar4743 6 days ago

    Damn!! I like your video so much, especially the implementation part. Most people would just use the sklearn library and call it a day, but you really showed how the code actually runs behind the scenes. 👍👍💛💛

  • @throgmortonartstudio2402

    That was so much easier to understand. Thank you.

  • @madi6736
    @madi6736 3 months ago

    Your explanations are really clear and precise, thank you very much ♥♥♥

  • @omeshnath6826
    @omeshnath6826 8 months ago

    Got a good idea about linear regression, thanks bud!!

  • @amirnaeej7986
    @amirnaeej7986 6 months ago

    You're a lifesaver, thank you soooo much ❤❤

  • @lamiyanabaova3020
    @lamiyanabaova3020 1 year ago

    Thanks a lot for the great explanation!

  • @milorday5141
    @milorday5141 1 year ago

    Thank you, very useful and clear.

  • @cambridgebreaths3581
    @cambridgebreaths3581 2 years ago +49

    I do like all your videos that contain the keywords "mathematical theory" and "from scratch" :). Please do more similar videos. Thank you kindly

    • @xKreesherZ
      @xKreesherZ 2 years ago +3

      same, stuff from scratch is the best stuff for learning

    • @fater8711
      @fater8711 2 years ago +3

      @@xKreesherZ Yeah, and it's really fun.
      I am always so proud when I am able to code something from scratch.

  • @drewprof
    @drewprof 3 months ago

    Nicely done!

  • @sryps
    @sryps 2 years ago +2

    love the videos, finally

  • @mysticrustacean4065
    @mysticrustacean4065 10 months ago

    Brother! I simply love you after coming across this video!

  • @archer9056
    @archer9056 3 months ago

    Amazingly simple explanation of linear regression. Please also cover the other techniques in the same way.

  • @sajid_ahamed
    @sajid_ahamed 2 years ago

    Awesome video, and dark-mode Paint rocksss

  • @unknowncorsairtim
    @unknowncorsairtim 10 months ago +3

    Awesome tutorial! Could you please explain why you use gradient descent to minimize squared error instead of using the formula: divide the standard deviation of y values by the standard deviation of x values and then multiply this by the correlation between x and y?
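
    That closed-form estimate is worth spelling out: for a single feature, slope = r · (sd_y / sd_x) gives exactly the line gradient descent converges to, and the iterative method mainly earns its keep on models with no such formula. A minimal sketch, with hypothetical data and NumPy assumed:

        import numpy as np

        xs = np.array([32.0, 53.0, 61.0, 47.0, 59.0])  # hypothetical inputs
        ys = np.array([31.0, 68.0, 62.0, 71.0, 87.0])  # hypothetical targets

        # Closed-form simple linear regression
        r = np.corrcoef(xs, ys)[0, 1]      # correlation between x and y
        m = r * ys.std() / xs.std()        # slope = r * (sd_y / sd_x)
        b = ys.mean() - m * xs.mean()      # the line passes through the means
        print(m, b)                        # what gradient descent approaches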

  • @cryptoemperor3399
    @cryptoemperor3399 2 years ago

    Thanks a bunch. My professor taught this in class and I was like a deer in headlights, but after your video I understand.

  • @vallabhshelar3176
    @vallabhshelar3176 2 years ago

    Loved it bro, you are just amazing.

  • @donowen9848
    @donowen9848 6 months ago

    Awesome representation!!!

  • @chadstrachan9696
    @chadstrachan9696 5 months ago

    Great video, very informative

  • @actualBIAS
    @actualBIAS 4 months ago

    Brother, thank you. It just wouldn't get into my head. Thanks to you, I've understood it.

  • @kuroisan2698
    @kuroisan2698 1 year ago

    Great video, NeuralNine.
    I feel this is gradient descent, isn't it?
    If it is, is there an implementation for least squares? Because this feels so random: you take some values and choose the smallest from those random values, not necessarily the smallest value you can get.
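
    A note on the "random values" worry: gradient descent does not sample candidates at random. It starts from one guess and repeatedly steps against the derivative of the error, so every step is deterministic. A minimal sketch of that loop, using the m, b, L names that appear in snippets quoted later in this thread:

        def gradient_descent(m, b, points, L):
            # One step downhill on the mean squared error
            n = len(points)
            dm = sum(-2 / n * x * (y - (m * x + b)) for x, y in points)
            db = sum(-2 / n * (y - (m * x + b)) for x, y in points)
            return m - L * dm, b - L * db

        points = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # hypothetical (x, y) data
        m = b = 0.0
        for _ in range(1000):
            m, b = gradient_descent(m, b, points, L=0.01)
        print(m, b)  # converges toward the least-squares line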

  • @bluestar2253
    @bluestar2253 2 years ago +13

    Yes, please do more videos like this one. Even Einstein gives it 2 thumbs up! :)

  • @adejumoadeniyi5152
    @adejumoadeniyi5152 2 years ago

    This was really helpful

  • @user-tr5vv6gf6l
    @user-tr5vv6gf6l 10 months ago +1

    This is amazing!
    How can I store the current m and b values for every iteration?
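
    One way to do that, reusing the gradient_descent sketch from a comment above: keep two history lists and append after every update.

        m, b = 0.0, 0.0
        m_history, b_history = [], []
        for i in range(1000):
            m, b = gradient_descent(m, b, points, L=0.01)
            m_history.append(m)   # slope after epoch i
            b_history.append(b)   # intercept after epoch i
        # m_history[k], b_history[k] hold the parameters after epoch k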

  • @minuet6919
    @minuet6919 2 years ago

    Lmao, I just finished summer school for Algebra 1, where we learned this and used a calculator to compute it. This is an amazing video; the mathematics was correct and your explanation is amazing!

  • @shubhamchoudhary5461
    @shubhamchoudhary5461 2 years ago

    That's a pretty good lecture.. 😍

  • @bharthyadav6794
    @bharthyadav6794 1 year ago +1

    Yoo, out of all the explanations I saw, you did the best work. This is what I wanted. Best work, man. Subbed.

  • @yxyyy
    @yxyyy 2 years ago +4

    Next for logistic regression please!

  • @sriram2k4
    @sriram2k4 1 year ago

    Amazing video!!!!

  • @allenallen5136
    @allenallen5136 1 year ago +1

    Combines theory with practice and shows the calculation procedure. Really good 👍

  • @ahmedifhaam7266
    @ahmedifhaam7266 2 years ago

    Dude did my 6-hour lecture + lab class all in just 24 minutes, bruh. EZ clap hahah, thanks a lot.

  • @debashishsarkar649
    @debashishsarkar649 1 year ago +2

    Hi sir, it really helped me understand linear regression and build it myself from scratch. Thanks 👍 Can you describe all the models from scratch like this?

  • @kaleemahmed3300
    @kaleemahmed3300 2 years ago

    Great explanation.
    Please upload more model implementations.

  • @nilothpalbhattacharya8230

    This is the real algorithm. Great :)

  • @lucasgonzalezsonnenberg3204
    @lucasgonzalezsonnenberg3204 11 months ago

    Hi, I really like your videos. One question, what is h?
    Thank you!

  • @davidisaacgalang3000
    @davidisaacgalang3000 1 year ago

    Thank you very much

  • @curtezyt1984
    @curtezyt1984 7 months ago +1

    At 10:00 we need to take the derivative with respect to each weight (M) like this: first for M1, then for M2, then M3, ... up to Mn.
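
    Spelled out in LaTeX, the per-weight derivatives the comment describes look like this (a sketch of the multivariate generalization; $x_{ij}$ is feature $j$ of sample $i$):

        \frac{\partial E}{\partial m_j} = -\frac{2}{N} \sum_{i=1}^{N} x_{ij} \left( y_i - \hat{y}_i \right),
        \qquad \hat{y}_i = b + \sum_{k=1}^{n} m_k x_{ik}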

  • @leul1407
    @leul1407 8 months ago

    Thanks man

  • @atrumluminarium
    @atrumluminarium 2 years ago +7

    It would have been nice to compare it to the analytical solution of least-squares regression, (Xᵀ·X)⁻¹·(Xᵀ·Y), just to show they're identical.

    • @andreagalloni92
      @andreagalloni92 1 year ago

      Quicker and theoretically more correct for solving this problem!

    • @JohnMacPherson-hr4yz
      @JohnMacPherson-hr4yz 1 year ago +1

      @@andreagalloni92 Could you make a video and share it with everyone please? (theory and Python code)

    • @andreagalloni92
      @andreagalloni92 1 year ago +1

      @@JohnMacPherson-hr4yz Mmm... maybe I can :)
      I'll try to do it.
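
      A minimal NumPy sketch of that analytical route, using the pseudo-inverse for numerical safety (hypothetical data; X gets a column of ones so the intercept is learned too):

          import numpy as np

          xs = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical inputs
          ys = np.array([2.1, 3.9, 6.2, 7.8])   # hypothetical targets

          X = np.column_stack([np.ones_like(xs), xs])  # rows of [1, x]
          # Normal equation: w = (X^T X)^(-1) X^T y
          w = np.linalg.pinv(X.T @ X) @ (X.T @ ys)
          b, m = w                               # intercept and slope
          print(m, b)  # should match a converged gradient descent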

  • @baivabdattamajumder6568

    If you're counting from 0 to n, it means you have (n+1) data points, so shouldn't the dividing factor be 1/(n+1) when calculating the mean of squared errors at 4:43?
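
    The commenter's point, written out: with $N$ data points the mean squared error is

        E = \frac{1}{N} \sum_{i=1}^{N} \bigl( y_i - (m x_i + b) \bigr)^2

    so an index running from 0 to n inclusive covers $N = n + 1$ points, and the divisor should indeed be $n + 1$. The constant factor does not move the minimum, though; it only rescales the gradient.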

  • @nikhdzia
    @nikhdzia 2 years ago

    Please do more such videos

  • @mohammadf9646
    @mohammadf9646 3 months ago +1

    Well done

  • @hayki_ds
    @hayki_ds 1 year ago

    Perfect job

  • @thomashoughton228
    @thomashoughton228 2 years ago +1

    Why does 1/n become -2/n? As I read on Towards Data Science, the factor would stay 1/n and the -2 would appear after the sum symbol, next to the xᵢ. So why, or how, does 1/n become -2/n? Also, great video btw; it helped me unlock a few skills in the ML skill tree :)
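
    The step in question is the chain rule plus factoring the constant out of the sum; both forms the comment mentions are the same expression:

        \frac{\partial}{\partial m} \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - (m x_i + b) \bigr)^2
        = \frac{1}{n} \sum_{i=1}^{n} 2 \bigl( y_i - (m x_i + b) \bigr) (-x_i)
        = -\frac{2}{n} \sum_{i=1}^{n} x_i \bigl( y_i - (m x_i + b) \bigr)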

  • @gamingwithdingo
    @gamingwithdingo 12 days ago +1

    I love this guy

  • @munazirhussain5450
    @munazirhussain5450 5 months ago

    Which software do you use for the video?

  • @tusharvashistha190
    @tusharvashistha190 1 year ago

    Amazing content

  • @fishfanaticgokul4758
    @fishfanaticgokul4758 1 year ago

    Bro, how is the model created by these mathematical outputs, and how does the predict function work?

  • @DJLEGS420
    @DJLEGS420 3 days ago

    16:29 "The lube is basically the sum"
    lmao
    v helpful vid tho

  • @deveshjain9543
    @deveshjain9543 11 months ago

    I implemented it myself and it came out to be more accurate than scikit-learn:

    import numpy as np
    from sympy import Symbol, solve, diff

    class LinearRegression:
        def __init__(self):
            self.w = {}  # "w0" (intercept), "w1", ... -> symbol, later fitted value

        def fit(self, x, y):
            # One symbolic weight per feature column, plus w0 as the intercept
            for i in np.arange(x.shape[1] + 1):
                self.w[f"w{i}"] = Symbol(f"w{i}")
            # Build the total squared error symbolically
            e = 0
            for i in range(len(x)):
                yp = 0
                for j in np.arange(len(self.w)):
                    if j == 0:
                        yp += self.w[f"w{j}"]
                    else:
                        yp += self.w[f"w{j}"] * x[i][j - 1]
                e += (yp - y[i]) ** 2
            # Set each partial derivative to zero and solve the system exactly
            eq = []
            for i in np.arange(len(self.w)):
                eq.append(diff(e, self.w[f"w{i}"]))
            w = solve(eq, list(self.w.values()))
            for i in np.arange(len(self.w)):
                self.w[f"w{i}"] = w[self.w[f"w{i}"]]

        def predict(self, x):
            def prediction(features):
                yp = 0
                for i in np.arange(len(self.w)):
                    if i == 0:
                        yp += self.w[f"w{i}"]
                    else:
                        yp += self.w[f"w{i}"] * features[i - 1]
                return yp
            return list(map(prediction, x))
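
    For reference, a hypothetical usage of the class above (the symbolic solve matches the least-squares normal equations exactly, so any gap to scikit-learn should be at floating-point noise level):

        x = np.array([[1.0], [2.0], [3.0]])
        y = np.array([2.0, 4.0, 6.0])
        model = LinearRegression()
        model.fit(x, y)                 # solves the zero-gradient equations
        print(model.predict([[4.0]]))   # ~[8.0] for this perfectly linear data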

  • @MultiCodFTW
    @MultiCodFTW 1 year ago +1

    @NeuralNine How do I test the trained models to give a prediction based on a newly given value/independent variable (a value that is foreign to the training set)?
    Great video btw!

    • @adrianbitsinnie1537
      @adrianbitsinnie1537 1 year ago

      Yes! This is a good question I hope he answers. If he already has, can someone link me please? :)
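
      Once training has finished, predicting for an unseen input is just evaluating the fitted line; a minimal sketch, with hypothetical learned values:

          def predict(x, m, b):
              # Apply the learned slope and intercept to a new input
              return m * x + b

          m, b = 1.9, 0.3              # hypothetical values produced by training
          print(predict(10.0, m, b))   # prediction for an x outside the training set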

  • @piyush9555
    @piyush9555 1 year ago

    Wow, that is exactly what is called simplicity.

  • @rockymani94
    @rockymani94 2 years ago

    LSTM, RNN, logistic regression, and more: we are expecting more from you.

  • @dronedangwal447
    @dronedangwal447 11 months ago +1

    Nice video! Could you please provide the dataset that you used?

  • @anidea8012
    @anidea8012 1 year ago

    Please provide the dataset you use, because it will help us follow along with you.

  • @alihekmat2517
    @alihekmat2517 2 years ago

    You're great bro

  • @farukesatergin8129
    @farukesatergin8129 1 year ago +1

    Where can we get the CSV file he worked on?

  • @deepakdubey3973
    @deepakdubey3973 1 year ago

    Thanks

  • @joshuarobinson2338
    @joshuarobinson2338 2 years ago

    Good job man. Can you put the code in PDF format? Again, awesome video.

  • @mehdibouhamidi4675
    @mehdibouhamidi4675 2 years ago

    Can you do the same for support vector machines?

  • @sneha.tiwari
    @sneha.tiwari 1 year ago

    Hello, do you have a video or notes on plotting the linear regression when there is more than one attribute in the input data points? Say 10 columns of x and 1 column of y?
    Please respond.
    Thank you.

    • @sairekhaunnam3001
      @sairekhaunnam3001 1 year ago

      I too need an explanation of this. It would be really helpful.

    • @sneha.tiwari
      @sneha.tiwari 1 year ago

      @@sairekhaunnam3001 Hey, if there is one attribute, we can plot it in 2D; if there are two attributes, then in 3D; and for three attributes we would need 4D, which is not possible visually. That's why we restrict ourselves.
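
      Beyond three dimensions, the usual workaround is to plot predicted against actual values instead of the fit surface itself. A minimal sketch, with hypothetical 10-feature data:

          import numpy as np
          import matplotlib.pyplot as plt

          rng = np.random.default_rng(0)
          X = rng.normal(size=(100, 10))     # hypothetical 10 feature columns
          y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=100)

          Xb = np.column_stack([np.ones(len(X)), X])    # add intercept column
          w, *_ = np.linalg.lstsq(Xb, y, rcond=None)    # least-squares fit
          y_pred = Xb @ w

          plt.scatter(y, y_pred)                             # predicted vs actual
          plt.plot([y.min(), y.max()], [y.min(), y.max()])   # perfect-fit reference
          plt.xlabel("actual y")
          plt.ylabel("predicted y")
          plt.show()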

  • @eslamashraf5847
    @eslamashraf5847 1 year ago +1

    Amazing explanation, thanks a lot!
    Can you upload the CSV file and send me the link, please?

  • @phoneix24886
    @phoneix24886 10 months ago

    Why did we take a partial derivative?
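
    Short answer: the loss depends on two parameters at once, so there is one partial derivative per parameter, each taken with the other held fixed:

        \nabla E(m, b) = \left( \frac{\partial E}{\partial m}, \frac{\partial E}{\partial b} \right)

    Gradient descent then steps against this vector, updating m and b together.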

  • @duyanh4186
    @duyanh4186 1 year ago

    I love this video

  • @zombiekiller7101
    @zombiekiller7101 2 years ago +2

    Nice video as usual :). But I don't know calculus 😥

  • @aandresriera7927
    @aandresriera7927 2 years ago

    LSTM from scratch! 🙏😍

  • @ghost_riderrr
    @ghost_riderrr 2 months ago

    Why do we need to print the epoch values??

  • @roomian
    @roomian 1 year ago

    I don't understand the reasoning behind [m * x + b for x in range(20, 80)]. Why is this the y axis? What does it represent? Why that equation? Also, why aren't we just reading the found values of x and y? I have understood all the calculations being done, but I don't understand the visualization/graph part. Please let me know if you see this.

    • @OfficialMazLi
      @OfficialMazLi 8 months ago

      He did it all wrong, just ignore it imo
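
      For what it's worth, that list comprehension exists only for plotting: it evaluates the fitted line at each x in the plotted range, producing the y coordinates of the straight line drawn over the scatter. A minimal sketch, with hypothetical fitted values:

          import matplotlib.pyplot as plt

          m, b = 1.2, 7.5                 # hypothetical fitted slope and intercept
          xs = range(20, 80)              # x range covered by the data
          ys = [m * x + b for x in xs]    # height of the fitted line at each x

          plt.plot(xs, ys)                # draws the regression line
          plt.show()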

  • @muhammadhammadsarwar698
    @muhammadhammadsarwar698 1 month ago

    Can anyone tell me a practical use case or example where we can implement this? It would be great if anyone could give me a full scenario.

  • @shashwatbalodhi4042
    @shashwatbalodhi4042 11 months ago

    I am getting this error:

    AttributeError                            Traceback (most recent call last)
    Cell In[10], line 39
         37 if i % 50 == 0:
         38     print(f"Epoch: {i}")
    ---> 39 m, b = gradient_descent(m, b, data, L)
         41 print(m, b)
         43 plt.scatter(data.studytime, data.score, color="black")

  • @shaileshchauhan7855
    @shaileshchauhan7855 2 years ago

    Why did you decrease the number of epochs? And can't the minimum be found by setting the partial derivatives to 0?

    • @CptAJbanned
      @CptAJbanned 1 year ago

      The minimum of one of the partial derivatives won't necessarily be the minimum of the loss function, since the loss also depends on the other variables/features.

  • @Graverman
    @Graverman 2 years ago +1

    here to learn something that I already know but the video is too entertaining to click off

  • @chrisdalain6979
    @chrisdalain6979 1 year ago

    Woooow that's great

  • @quaka96
    @quaka96 2 years ago +1

    Why the gradient descent method? The pseudo-inverse solves this immediately.

    • @CptAJbanned
      @CptAJbanned 1 year ago

      Make a video about it and link me, plz

  • @navardowilliams7484
    @navardowilliams7484 1 year ago

    You sound like Tech With Tim 😂😂😂

  • @user-cw8cw3fq7k
    @user-cw8cw3fq7k 2 months ago

    Do Multiple Linear Regression!!!!!

  • @annawilson3824
    @annawilson3824 1 year ago

    What's the point of printing epochs if we see nothing on the screen in that regard, lol? And why is it "highly mathematical" if we do not even derive anything? I would redo the video with the loss function included in the print; otherwise it just hangs in the code for God knows what reason.

  • @simssim262
    @simssim262 2 years ago +1

    Pls pls make one for neural networks

  • @keyurpancholi4277
    @keyurpancholi4277 1 year ago

    Where is the CSV file?

  • @naturemusic7391
    @naturemusic7391 1 year ago

    There is an improvement to be made, sir: while discussing the error you keep calling the function the mean squared error, but the function that needs to be minimized is the cost function. The loss (error) function measures the error of a single training example, while the cost function is the sum of the losses over the whole training set.