I noticed that your channel contains the entirety of Data Mining taught at the Master's level! Thank you very much, subscribing immediately!
Welcome
Do like share and subscribe
Super explanation.. today is my machine learning paper
Thanks and welcome
Do like share and subscribe
Mine is tomorrow!
How was it?
Super explanation.. the best channel on YouTube to learn machine learning and ANN topics ❤❤
Thank You
Do like share and subscribe
This man has in-depth knowledge of this topic.
Thank You
Do like share and subscribe
Amazing step-by-step outline!
I love it💌, so I subscribe!
Thank You
Do like share and subscribe
Excellent Teaching. Salute to you sir
Welcome
Do like share and subscribe
Very clear Explanation Sir.... Thank you so much...
Welcome
Please do like share and subscribe
Clear and nice explanation. Thanks for the video
Welcome
Do like share and subscribe
Thanks for the video. Great explanation!
Welcome
Do like share and subscribe
Thank you very much. Very clear explanation and it is easy to understand.
Welcome
Do like share and subscribe
Thank you for uploading videos like this.
Welcome
Do like share and subscribe
That's the clearest explanation I have seen.
Thank you
Do like share and subscribe
Super explanation.. very easy to understand, without any hiccups, sir
thanks ...Inspr KVV.Prasad
Thank You
Do like share and subscribe
Thank you sir for this wonderful concept
Welcome
Do like share and subscribe
Thank you so much, today is my data mining and ML paper
Welcome
Do like share and subscribe
Thank you sir. Clear and easy to understand. Thank you.
Welcome
Do like share and subscribe
thank you sir, you were amazing🤩
Welcome
Please do like share and subscribe
Best explanation
The content and teaching are very good. Please also provide the notes, it would be helpful.
Thank You
Do like share and subscribe
Can you add the concept of hidden Markov models to your machine learning playlist?
Sure
Working on it
@@MaheshHuddar okay 👍
@@MaheshHuddar my exam is near
thanks a lot for this wonderful lecture.
Welcome!
Do like share and subscribe
Thanks sir for your explanation 🎉
Welcome
Do like share and subscribe
Sir, please upload content on ensemble methods: bagging, boosting, and random forest
Ensemble Learning: czcams.com/video/eNyUfpGBLts/video.html
Random Forest: czcams.com/video/kPq328mJNE0/video.html
Nice presentation, thank you sir
thank you so much you are great professor
You are very welcome
Do like share and subscribe
Thank you very much master huddar❤
Welcome
Do like share and subscribe
Thank you so much
Welcome
Do like share and subscribe
Super Bhayya ...
Thank you so much sir, amazing explanation ♥♥♥
Welcome
Do like share and subscribe
Why are we not dealing with e2? I mean, why don't we also compute e2^T . [cov matrix]?
Here, 2 dimensions were taken as the "high-dimensional" data just as an example.
One of the main use cases of PCA is dimensionality reduction.
So if you want, you can use e2 and get the second PC. But then think about it:
from 2 variables we would again get 2 variables, so nothing is reduced. That's why he has shown only PC1.
However, in reality we generally use 2 PC axes (it mostly depends on your data). If there are a lot of variables, then 3 or 4 can also be reasonable, but we don't generally go beyond that. In that case you would need e2, e3, and e4 as well. So this is how it works.
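To add a common rule of thumb for "how many PCs": look at the fraction of total variance each eigenvalue explains and keep the smallest number of PCs covering, say, 95%. A minimal NumPy sketch (the random data here is just a stand-in, not the video's example):

```python
import numpy as np

# Stand-in dataset: 50 samples of 4 variables (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))

Xc = X - X.mean(axis=0)                                       # center each column
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]  # eigenvalues, descending

explained = eigvals / eigvals.sum()   # fraction of total variance per PC
print(np.cumsum(explained))           # keep the smallest k whose cumulative sum is high enough
```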
Thank you very much Sir, for your explanation in that video. I am still confused, so I would like to ask how to get the values [-4.3052, 3.7361, 5.6928, -5.1238]. I still don't get it. Thank you Sir.
Yeah, I'm also confused how he got those; I'm getting different values: 0.3761, 5.6928, -5.128
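For anyone still stuck here: those values are the PC1 scores, obtained by centering the data and projecting it onto the first eigenvector. The quoted numbers are consistent with the 4-sample dataset below (an inference from the numbers; the comments don't show the video's actual data), so this NumPy sketch reproduces them:

```python
import numpy as np

# Assumed 4-sample, 2-variable dataset (inferred from the quoted PC1 values).
X = np.array([[4.0, 11.0],
              [8.0,  4.0],
              [13.0, 5.0],
              [7.0, 14.0]])

Xc = X - X.mean(axis=0)                # step 1: subtract the column means
C = np.cov(Xc, rowvar=False)           # step 2: 2x2 covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)   # step 3: eigendecomposition (ascending eigenvalues)

e1 = eigvecs[:, -1]                    # eigenvector of the largest eigenvalue
if e1[0] < 0:                          # eigenvector sign is arbitrary; fix a convention
    e1 = -e1

pc1_scores = Xc @ e1                   # step 4: project each centered sample onto e1
print(pc1_scores)                      # ≈ [-4.3052, 3.7361, 5.6928, -5.1238]
```

If your values differ, the usual culprits are forgetting to center the data first or using a population (ddof=0) instead of sample (ddof=1) covariance.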
Thank you very much sir
Welcome
Do like share and subscribe
Hello sir, thank you for your explanation. I have a doubt: at 08:17, why have you considered only the first equation?
You will get the same answer with the second equation.
You can use either the first or the second, no issues.
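A quick way to see why either equation works: for an eigenvalue λ, the matrix (C − λI) is singular, so its two row-equations are scalar multiples of each other and pin down the same eigenvector direction. A sketch, assuming the 2x2 covariance matrix [[14, −11], [−11, 23]] from the worked example (inferred, not shown in the comments):

```python
import numpy as np

C = np.array([[14.0, -11.0],
              [-11.0, 23.0]])          # assumed covariance matrix from the example

lam = np.linalg.eigvalsh(C)[-1]        # largest eigenvalue
A = C - lam * np.eye(2)                # the system (C - lambda*I) e = 0

# A is singular, so its two rows are scalar multiples of each other:
r1, r2 = A
print(r1[0] * r2[1] - r1[1] * r2[0])   # determinant, ~0

# Solving either row-equation gives the same unit direction:
e_row1 = np.array([-r1[1], r1[0]]) / np.hypot(r1[0], r1[1])
e_row2 = np.array([-r2[1], r2[0]]) / np.hypot(r2[0], r2[1])
print(e_row1, e_row2)                  # identical (up to sign)
```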
Linear discriminant analysis, please make a video bhayya
Thank you, sir
Welcome
Do like share and subscribe
Excellent
Thank You
Do like share and subscribe
thanks a lot
You are most welcome
Do like share and subscribe
Thank you
Welcome
Do like share and subscribe
Thank you sir, how to calculate the 2nd PC?
Select the second eigenvector and multiply it by the given (mean-centered) feature matrix
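In code, that reply looks like this; a sketch assuming a small 4-sample dataset (inferred from the worked example, not shown in the comments):

```python
import numpy as np

# Assumed example data: 4 samples, 2 variables.
X = np.array([[4.0, 11.0], [8.0, 4.0], [13.0, 5.0], [7.0, 14.0]])
Xc = X - X.mean(axis=0)                # center the data first

eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
e2 = eigvecs[:, 0]                     # eigh sorts ascending, so column 0 is the 2nd PC's eigenvector
pc2_scores = Xc @ e2                   # one PC2 value per sample
print(pc2_scores)
```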
Nice!
Thank You
Do like share and subscribe
Sir book name please
Hi Sir, great explanation of PCA. But when I searched for the covariance matrix for more than 2 variables, everything showed covariance computed between only 2 variables.
How do you calculate the covariance if a dataset has more than 2 variables? Could you please give an explanation on that?
You need to do it for all pairwise combinations
@fintech1378 is right. You need to do pairwise combinations. For example, for 4 variables, your covariance matrix will be 4x4 with the following combinations:
cov(a, a) cov(a, b) cov(a, c) cov(a, d)
cov(b, a) cov(b, b) cov(b, c) cov(b, d)
cov(c, a) cov(c, b) cov(c, c) cov(c, d)
cov(d, a) cov(d, b) cov(d, c) cov(d, d)
If there are n variables, covariance matrix will be of nxn shape.
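NumPy's `np.cov` builds exactly this n x n matrix in one call; a minimal sketch with stand-in random data:

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(size=(100, 4))   # 100 samples of 4 variables: a, b, c, d

C = np.cov(data, rowvar=False)     # C[i, j] = cov(variable i, variable j)
print(C.shape)                     # (4, 4): n variables give an n x n matrix
```

Note the matrix is symmetric, since cov(a, b) = cov(b, a), and its diagonal holds each variable's variance.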
Is this covariance for reducing 4 variables to 1? @@shahmirkhan1502
بحبككككككككككككككككككككككككككككككككككككككككك يا سوسو (Arabic: "I love you soooo, Soso")
What it means..?
@@MaheshHuddar According to google translate: _"I love you sooo"_
Devru sir neevu (Kannada: "You are a god, sir")
Do like share and subscribe
Thanks Sir
Welcome
Do like share and subscribe