Stochastic gradient descent explained | Stochastic gradient descent vs Gradient descent|Mini batch
- Added 4 Jun 2020
#StochasticgradientDescent #SGD
Hello All,
I am Aman, a data scientist by profession.
About this video:
In this video, I explain stochastic gradient descent in detail, show with an example how it differs from gradient descent, and also cover mini-batch gradient descent.
The following questions are answered in this video:
1. What is stochastic gradient descent in a neural network?
2. SGD in machine learning.
3. Gradient descent vs stochastic gradient descent (SGD).
4. What is mini-batch gradient descent?
5. How does stochastic gradient descent (SGD) work in machine learning?
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples. Anyone without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.
Join Facebook group :
groups/41022...
Follow on medium : / amanrai77
Follow on quora: www.quora.com/profile/Aman-Ku...
Follow on twitter : @unfoldds
Get connected on LinkedIn : / aman-kumar-b4881440
Follow on Instagram : unfolddatascience
Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch statistics and mathematics playlist here :
• Measures of Central Te...
Watch End to End Implementation of a simple machine learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging and Boosting here:
• Introduction to Ensemb...
Access all my codes here:
drive.google.com/drive/folder...
Have question for me? Ask me here : docs.google.com/forms/d/1ccgl...
My Music: www.bensound.com/royalty-free...
Thanks a lot for such info #Unfold Data Science, Aman
What an explanation 👏👏
Nice explanation.
Thanks for your deep explanation. If you add code with a dataset, it will be amazing.
Good article on SGD. Please make a video on SGD with momentum.
Thanks Anjani. Will create one.
You're the best... very good explanation
Thanks and welcome
great explanation
Thanks a lot Sandipan for motivating me through your comment.
Very good Aman
Thanks a lot.
Please do it, sir... Your explanation is good.
Thank you. Sure.
Very well explained
Thanks a lot. Happy Learning :)
You explained every topic very well. Please make a video on how to predict COVID-19 with the help of GradientBoostingRegressor in Python, and on the ARENA model.
Hi Vaishali, I have created a video on this topic; you can check it in my videos. My predictions were also right as per the current scenario. I used a time series model there. Thanks. Happy Learning!
@@UnfoldDataScience I have seen the video, really good knowledge.
nice video !!
Thank you.
Understood SGD, Aman, thanks. Please make a video on momentum.
Make a video on SGD with momentum.
Thanks Saan. Sure.
Thanks Ragul :)
Hi Sir, thank you so much for your video. I'd like to ask, if I'm already doing Lasso Regression, should I still do gradient descent after that?
Depends on what you are trying to achieve.
Sir, good evening.
I have a small clarification (regarding SGD).
I have a dataset of 50k rows and 15 features, which I used to perform logistic regression with SGD (with L2 regularization) from scratch in Python. I initially split my dataset into train/test with a 25 percent test split and trained my classifier on 37,500 data points.
When I executed the code with 500 epochs (where each epoch uses one random data point to update the weights and intercept), it took almost 20 minutes to run. I actually wanted to reach convergence with a tolerance of 10^-3, but with 500 iterations I wasn't able to. I then increased the number of iterations to 1500, where it takes almost an hour to produce output.
Could you please tell me whether I'm going in the right direction to reach the optimal weights?
Try to reduce batch size.
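As a rough illustration of the advice above, here is a minimal mini-batch SGD sketch for L2-regularized logistic regression in NumPy. The function name, synthetic data, and all hyperparameter values are illustrative assumptions, not the questioner's actual setup; vectorising each batch update this way is usually far faster than a Python-level loop over single data points:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def minibatch_sgd_logreg(X, y, lr=0.1, l2=0.01, batch_size=64, epochs=50):
    """Mini-batch SGD for L2-regularized logistic regression."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        idx = rng.permutation(n)           # reshuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            p = sigmoid(Xb @ w + b)
            # Gradient of mean cross-entropy + L2 penalty over the batch
            grad_w = Xb.T @ (p - yb) / len(batch) + l2 * w
            grad_b = np.mean(p - yb)
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Synthetic, linearly separable data: 1000 points, 5 features
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
y = (X @ true_w > 0).astype(float)

w, b = minibatch_sgd_logreg(X, y)
acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"train accuracy: {acc:.2f}")
```

With a moderate batch size, each epoch is a handful of vectorised NumPy operations rather than 37,500 separate Python-level updates, which is typically where the runtime described above goes.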
The explanation was good, sir, but we need your notes along with the videos. Could you please prepare notes, sir?
Please do video on Momentum also if possible..
Sure Sudheer. You will get it today :)
Weights are calculated using the loss function; we optimise the loss function.
Gradient descent takes the whole dataset, whereas stochastic gradient descent takes one record at random, but the problem is that the weights calculated might be biased. So we use mini-batch gradient descent. The goal is finding the best slope (m) and intercept (c).
Otherwise, computation will be expensive.
The loss function is nothing but MSE; we have to optimise it by differentiating it with respect to the slope and the intercept.
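The idea in the comment above, minimising MSE by differentiating with respect to the slope m and intercept c, can be sketched in a few lines of NumPy (the learning rate, iteration count, and synthetic data are illustrative assumptions):

```python
import numpy as np

# Gradient descent for simple linear regression y = m*x + c,
# minimizing MSE = mean((pred - y)**2).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=200)  # true m=3, c=2 plus noise

m, c = 0.0, 0.0
lr = 0.01
for _ in range(2000):
    pred = m * x + c
    err = pred - y
    # Partial derivatives of MSE with respect to m and c
    grad_m = 2 * np.mean(err * x)
    grad_c = 2 * np.mean(err)
    m -= lr * grad_m
    c -= lr * grad_c

print(f"m ≈ {m:.2f}, c ≈ {c:.2f}")
```

Here every update uses the full dataset (batch gradient descent); replacing the `np.mean` over all 200 points with a mean over one random point or a small batch gives SGD and mini-batch gradient descent, respectively.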
Sir make a video on adam also...
Hi Gokul, Sure. Thank you.
Please explain what bias and momentum are.
These things are explained in a separate video. Please watch it. It's in the same playlist.
Sir, please explain momentum as well... I've lost confidence in myself 😁😁
Sure :)
SGD with momentum
The notation is incorrect; it should be loss = sum((actual - predicted)²), with the square applied to each term inside the sum.
Hmm ok, thanks for correcting.
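The corrected notation, squaring each residual before summing, can be checked with a tiny helper (the function name and data are purely illustrative):

```python
def sse(actual, predicted):
    """Sum of squared errors: sum((actual_i - predicted_i)**2)."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))

print(sse([3, 5, 2], [2, 5, 4]))  # (3-2)^2 + (5-5)^2 + (2-4)^2 = 5
```

Dividing this sum by the number of points gives the MSE discussed in the video.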
Anyone from Delhi University?? Anyone who took the exam today and couldn't understand anything looking at the paper??