Perceptron Learning Algorithm in Machine Learning | Neural Networks
- Published 22. 06. 2020
- Machine Learning: • Machine Learning
Android App(Notes+Videos): play.google.com/store/apps/de...
Facebook: / thinkxacademy
Twitter: / thinkxacademy
Instagram: / thinkxacademy
#perceptron #ml #neuralnetworks
Excellent explanation, my friend. I loved it. I am a 61-year-old CSE professor. May God bless you. Please provide more videos like this so that many in the student community can learn. May Goddess Saraswathi bless you with a bright future.
Thank you so much for such kind words. I will keep creating more videos to help the student community ✅😄
Sir, at the age of 61 what will you do by learning the perceptron? Are you going to teach it in your college?
@@ninasirsi2340 he said he's a CSE professor
Thank you SO much for such a clear explanation, with the visuals to support it. I really appreciate it!
Finally a clear explanation. Thank you.
😂
Your way of explanation is so simple and organized that anyone can understand.
I enjoyed learning about the perceptron; you are an amazing educator. Thank you for such content. :)
it was super useful for me
thank you my friend!
finally understood after hopping through so many videos. W!
Simply amazing explanation. Thanks a lot.
Thanks for support✔️These comments make my day😄
This was so good! Thank you very much.
Share and Like💯
Explained in a very easy way. Please also share the basics of the perceptron in a Jupyter notebook, for real hands-on experience.
Will work on it 😄
Excellent explanation👍
Thanks man, now the concept is clear for me...
Great👍🏻Please share our videos to help this channel grow🌟
I have gone through a few videos on this topic and did not get a clear understanding, but your video was very clear and the examples were very simple to follow. Great job, and keep up the good work. A big thanks for explaining things clearly.
Thanks😀 Do share my videos with other students to help this channel grow🌟
You forgot to multiply by xi in the formula for calculating the new weight.
Really awesome
Correct me if I am wrong.
The weight-update rule increases the weights if the target value is higher than the actual value, and vice versa. That makes the output in the next iteration closer to the target output, BUT it will not work that way if the inputs (xi) are, say, always negative. In that case the weight update has to be modified: if the input was negative, then the sign of the delta weight should be reversed.
Eg.:
x1 = -1, w1 = 1 -> x1*w1 = -1 (actual output)
sigmoid(-1) -> 0
target output = 1
delta weight = n * (target - actual) = n * (1 - 0)
Assuming n to be 0.1, then
delta weight = 0.1
So the new weight becomes -> w1 = w1 + delta weight
w1 = 1.1
But now running it again we see:
x1 = -1, w1 = 1.1 -> x1*w1 = -1.1 (actual output)
sigmoid(-1.1) -> 0
This makes the algorithm even worse.
So, since the input was negative, rather than adding the delta weight we should have subtracted it:
w1 = w1 - delta weight = 1 - 0.1 => 0.9
x1 = -1, w1 = 0.9 => x1*w1 = -0.9
sigmoid(-0.9) = 0
Yes, for negative inputs we need to handle that case — which is exactly what the xi factor in the update does.
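A minimal Python sketch of the point made in this thread (assuming a step activation and the delta rule delta_w = eta * (target - output) * xi; the learning rate and values are the ones from the example above):

```python
# Perceptron delta rule WITH the xi factor: dw = eta * (target - output) * xi.
# For a negative input, xi flips the sign of the delta automatically, so the
# weight is pushed in the direction that raises the output toward the target.

def step(z):
    """Threshold activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def update_weight(w, x, target, eta=0.1):
    """One delta-rule update for a single weight/input pair."""
    output = step(x * w)
    return w + eta * (target - output) * x

# The negative-input case from the comment above: x1 = -1, w1 = 1, target = 1.
w = 1.0
for _ in range(20):
    w = update_weight(w, -1.0, 1)

# The weight has been driven negative, so x*w = (-1)*w > 0 and the unit
# finally fires 1 as desired — no special case for negative inputs needed.
print(w, step(-1.0 * w))
```

With the xi factor included, the "subtract instead of add" fix the commenter describes happens automatically, since xi = -1 negates the delta.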
Thank you man
Thank you
Very nice explanation.. thank you
Like and share our content to support us😄
dude you're amazing!! keep up the good job :)
Thank you😀 Please share to help my channel reach to more students✌🏻
Nice 👌 keep it up sir
Keep Learning 💯
Excellent!!
Thanks
Thank you!
Welcome 🌟 Share it with other students also👍
Please post a multilayer perceptron model with an example
Sure sir
Very clearly explained video, thank you!
Thank you 👍
Share and Subscribe😄
Wouldn't all the weights be equal after all the iterations, since the delta we are adding to each weight is always the same in any iteration (assuming the weights were the same at the start)?
I think the xi factor in the update (delta wi = n * (yi - yi bar) * xi) takes care of that: each weight's delta is scaled by its own input, so the weights drift apart even if they start equal. And as the iterations go on, the error term (yi - yi bar) also becomes smaller and smaller until it eventually converges.
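A tiny sketch of this (assuming the standard update delta_wi = eta * (target - output) * xi; the learning rate and input vector here are made-up illustrative values):

```python
# With the xi factor in the update, each weight gets a DIFFERENT delta even
# when all weights start equal, because each delta is scaled by its own input.
eta = 0.1              # made-up learning rate for illustration
x = [2.0, -1.0, 0.5]   # made-up input vector
w = [0.0, 0.0, 0.0]    # all weights start equal
target, output = 1, 0  # suppose the unit misfired on this example

deltas = [eta * (target - output) * xi for xi in x]
w = [wi + di for wi, di in zip(w, deltas)]
print(w)  # the weights are no longer equal
```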
Very good explanation!!
Thanks😄 Share our videos to help this channel grow💯
How can a set of data be classified using a simple perceptron? Using a simple perceptron with weights w0, w1, and w2 as −1, 2, and 1, respectively, classify the data points (3, 4); (5, 2); (1, −3); (−8, −3); (−3, 0).
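A sketch of that exercise, assuming the usual convention that the perceptron outputs 1 when w0 + w1*x1 + w2*x2 >= 0 and 0 otherwise (w0 acting as the bias):

```python
# Classify points with a simple perceptron: z = w0 + w1*x1 + w2*x2,
# output 1 if z >= 0 else 0 (assuming the usual sign convention).
w0, w1, w2 = -1, 2, 1
points = [(3, 4), (5, 2), (1, -3), (-8, -3), (-3, 0)]

labels = []
for x1, x2 in points:
    z = w0 + w1 * x1 + w2 * x2  # z values: 9, 11, -2, -20, -7
    labels.append(1 if z >= 0 else 0)

print(labels)  # [1, 1, 0, 0, 0]
```

So the first two points fall in class 1 and the last three in class 0 under this convention.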
amazing 10/10
Share and Subscribe 😃
Thank you for the very clear explanation; it was a pleasure to learn. I have a question on the activation function x.w + b: since we are using a squashing function, should it not be x.w + b < 0.5 for 0, and x.w + b > 0.5 for it to be classified as 1? Thanks again
No, I have rechecked the conditions, it is correct in the video.
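One way to see why both conditions agree (a small check of my own, not from the video itself): sigmoid(0) = 0.5 and sigmoid is strictly increasing, so sigmoid(z) > 0.5 exactly when z > 0. Thresholding the raw sum x.w + b at 0 is therefore equivalent to thresholding the sigmoid output at 0.5.

```python
import math

def sigmoid(z):
    """Logistic squashing function: maps the reals into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# sigmoid(0) == 0.5 and sigmoid is strictly increasing, so
# sigmoid(z) > 0.5  <=>  z > 0. Check a few sample pre-activations:
for z in (-2.0, -0.1, 0.1, 2.0):
    assert (sigmoid(z) > 0.5) == (z > 0)
print(sigmoid(0.0))  # 0.5
```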
Thanks dude .... I have exam tmrw
Best of luck bro💯
Basically.
Very well explained 😀😀
Thanks..please share my videos to help me grow😄