StatQuest: Hierarchical Clustering
- added Jul 3, 2024
- Hierarchical clustering is often used with heatmaps and with machine learning type stuff. It's no big deal, though, and based on just a few simple concepts. If you want to draw a heatmap using R, I've put some sample code on my website: statquest.org/statquest-hiera...
For a complete index of all the StatQuest videos, check out:
statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Buying The StatQuest Illustrated Guide to Machine Learning!!!
PDF - statquest.gumroad.com/l/wvtmc
Paperback - www.amazon.com/dp/B09ZCKR4H6
Kindle eBook - www.amazon.com/dp/B09ZG79HXC
Patreon: / statquest
...or...
YouTube Membership: / @statquest
...a cool StatQuest t-shirt or sweatshirt:
shop.spreadshirt.com/statques...
...buying one or two of my songs (or go large and get a whole album!)
joshuastarmer.bandcamp.com/
...or just donating to StatQuest!
www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
/ joshuastarmer
#statquest #ML #clustering
Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/
You're a person who saved me lots of time and pain. Thank you. I wish you the best
Thank you very much! :)
The intro song removed my fear of clustering. Thanks for the awesome video.
going on a statquest😌
You are, and I cannot stress this enough, a national treasure!! The ease in how you explain things that have eluded me for over a decade and make it click is truly a gift. Thank you so freaking much!!!
Wow, thank you!
I used to watch your videos while I was a student. It’s been 3 years since my graduation and I’m still here (I’m changing jobs and need to review some stuff).
Thank you a lot for your incredible work
Congratulations on the new job! BAM! :)
I have to congratulate you for this video, it gives the basic notions of the hierarchical cluster easy and fast. Bravo!
I can't thank you enough. Such clear and helpful explanations. Great.
Thanks! :)
you can, with patreon
I still don't believe how this content is free. Thank you sir!
Thanks!
I already watched some of your videos. This one I watched because I want to apply hierarchical clustering in my thesis. It is about time I buy one of your sweaters. I hope this supports you. Thanks for all the truly great explanations. THANK YOU!
Thank you very much!!! :)
Love your videos. The fact that you make it so simple shows the depth of your understanding.
Thank you!
this video proved that "hard" stuff = badly explained stuff
so fuckin true. Not sorry for swearing. Happy learning guys
if you can't explain something in simple terms, then you don't understand it that well.
@@gummybear8883 or you've been a professor for 20 years and are so deep into a topic that you completely forgot how people approach new problems. Your sentence really only applies to novices trying to be teachers.
@@julius4858 We could just change it to: if you can't explain something in simple terms, then you can't teach it that well.
@@Joreselin Yeah, that is absolutely true. Many of my professors for theoretical computer science are experts on various fields but man do their explanations suck. That's why I have to watch youtube videos for stuff like this.
your videos help me see the "big picture" of concepts. after your videos, I can actually understand what is going on and why we are doing something. Thank you!
Happy to help!
This channel is a treasure! Absolutely incredible job my man
Thank you so much 😀!
Thank you for clearly explaining the details at a moderate speed! You save me lots of time!
Thank you!
great videos, I like the way you explain these topics
The visualizations and simplicity of explanations as well as great examples motivate me to keep learning. Thank you so much for making it so interesting. I'll try to do my bit by buying a t-shirt. 😊
Wow! Thank you very much! :)
hi pragya
You are simply amazing !! I love your style and simplicity and the word is BAM! .. your videos are very informative and worth going through... thanks for all your hard work in simplifying the complex topics
Thank you so much!!
Absolutely brilliant..Thank you sooo much for your time and effort!
Thanks! :)
Great explanation. Thanks StatQuest!
I am super grateful for this video. You are such an excellent teacher! Thank you for being such a "you"
Wow, thank you!
Congratulations from Brazil!
Very nice.
I use this in Python and it's a really good way to cluster.
Another thing: from a coding perspective, it's only one line of code in Seaborn, very easy.
Thanks for sharing!
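For anyone curious what that Seaborn one-liner looks like, here is a minimal sketch on a small made-up expression matrix (the data, linkage method, and distance metric here are illustrative assumptions, not anything from the video):

```python
# A minimal sketch of hierarchical clustering with a heatmap via seaborn.
# The matrix is random, and method/metric are just example choices.
import numpy as np
import pandas as pd
import seaborn as sns

rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(6, 4)),
                    index=[f"gene{i}" for i in range(6)],
                    columns=[f"sample{j}" for j in range(4)])

# One call draws the heatmap plus the row and column dendrograms.
g = sns.clustermap(data, method="average", metric="euclidean")
```

The `method` and `metric` arguments correspond to the linkage and distance choices discussed in the video.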
you saved yet another day Josh. Thank you
Bam! :)
Joshua's video is always helpful. Next time, probably k-means clustering.
The best as always! Love this channel! It's super easy to understand
Thanks!
StatQuest is the Best! Teaching is an art...and these are master pieces.
WOW! Thank you very much! :)
really awesome video! This will help me with my test. Thank you!
I love this channel so much
Thank you! :)
Your explanation is very clear to me, and I watch all your videos; you are very friendly to me. I like you very much.
Thank you! 😃
THANK YOU! This has been SO HELPFUL!
bam!
awesome content and delivery
Glad you think so!
Congratulations! Your video is so great! You explain in a very clear and simple way.
Thank you! 😃
I love this. Your video is wonderful!
Thank you! :)
Thank you. Better than university teaching
Thanks!
This channel is truly a treasure trove! I was wondering if you could do a video on consensus clustering? I.e., how to evaluate clustering across multiple models and parameters. You are awesome!
I'll keep that in mind.
StatQuest never disappoints
BAM! :)
Thank you for allowing me to ascend the stats hierarchy!
bam! :)
Excellent explanation!
Thanks!
Thank You Sir, It was awesome to learn from you.
BAM! :)
just beautiful!
I am preparing my actuarial exam and you saved me a lot❤
Good luck! :)
fantastic explanation, thank you so much for this video.
Thanks!
I would like to add that:
- single-linkage (comparing the closest points of 2 clusters) tends to form more elliptic clusters;
- complete-linkage tends to form more globular clusters.
And since the linkage is computed from distances, the scale of your features matters: not scaling your data, scaling with a StandardScaler, or scaling with a MinMaxScaler will each affect your clustering.
Noted!
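A rough illustration of the single- vs. complete-linkage point above, sketched with scipy on made-up data (the blob shapes and the choice of two clusters are assumptions for the demo):

```python
# Single linkage merges clusters by their CLOSEST points, so it can "chain"
# along elongated shapes; complete linkage uses the FARTHEST points and
# favors compact, globular clusters. The toy data below is made up.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
# Two elongated, parallel blobs.
points = np.vstack([rng.normal([0, 0], [3.0, 0.3], size=(20, 2)),
                    rng.normal([0, 5], [3.0, 0.3], size=(20, 2))])

# Cut each tree so that at most two clusters remain.
single = fcluster(linkage(points, method="single"), t=2, criterion="maxclust")
complete = fcluster(linkage(points, method="complete"), t=2, criterion="maxclust")
```

Scaling the columns first (e.g. with a StandardScaler) changes the pairwise distances, and therefore the tree, which is the scaling point made above.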
Thank you so much sir! This is very helpful and very informative.
Glad it was helpful!
Explained in a simple manner.
Thank you very much for this video! It was really well done :)
Glad you liked it!
Great as always! Thanks.
Thank you! :)
This is Awesome......
Please Make a session on K Modes, KNN and K Prototypes
Here's a complete list of my videos so far: statquest.org/video-index/
Very helpful, thank you!
Thanks! :)
You saved me a week
Awesome! :)
Nice explanation 👍👍
Thanks!
Please add a video on Latin Square design, Joshua!
I am going to pass my stats final tomorrow, only because of your videos :D
your students are lucky.
The PCA and clustering question was worth 30% of the total marks on my exam today, and I managed to answer it so well only because of your videos. You're a savior. Thank you!!
Man your videos are soo super helpful! THANK YOU (ps consider the color library viridis to make it easier for the colorblind)
Thanks!
Hey Josh! Your videos are great! Thank you for the effort you've put on it!
If you allow me... have you considered making videos explaining DBSCAN and HDBSCAN?
Yes, I've thought about those topics and may make a video about them.
Excellent Sir
Thanks!
Watching this after watching your more recent videos. Missed your 'BAM's a lot!!! You should remake these old videos again! Thanks :)
bam! :)
@@statquest 😍
Found this gem of a channel today. Agreed on the fun rhymes and puns.
great video. thank you very much
Thanks!
Good job mr josh.
Thank you!
You could explain the same concept with some other datasets and maybe a visualization other than a heatmap.
Great video!
Do you have any plans to talk about co-clustering, look forward to it.
Thanks for the video
@StatQuest please explain probability and Naive Bayes. Thanks in advance! I am a huge fan of your way of teaching and your little song creations. Keep up the good work!
Thanks! Naive Bayes is on the to-do list.
@@statquest waiting. plz .
Hello Josh! The videos are soooooooo goooood! These are BAMMMMM Good!!
1 request - Could you please create a video on LCA - Latent Class Analysis? Maybe by comparing it to k-means clustering? I cannot be more thankful!
would like this too
beautiful BRO!
Amazing! Your videos are so comprehensible. I really enjoy watching!!!*_*
Thank you!
Awesome video
Thank you! :)
Hi Josh, I am really enjoying your videos, especially the wha-whas and BAMs!! You make stats sound easy but also fun! Thank you! I wonder if you could please do a video to explain the different uses of PCA and HCA: when do you use one or the other? In the meantime I will watch your videos on PCA and HCA :) hooray!
BAM! Thank you very much! I'll keep that topic in mind.
Thank You
:)
Thank you for all your videos clearly explaining complex concepts. Can you also make video(s) on different bi-clustering methods?
I'll keep that in mind.
I LOVE YOU JOSH !
:)
Thank you
So cool the video!
Thanks! :)
ohh my god thanks josh u are so brilliant i think marvel should add another new superhero "josh starmer the life saver"
:)
You're THE BEST
Thanks!
Great video
Thanks! :)
thank you so much!
:)
This video is super duper bam bam double double bam!
Will you cover more advanced clustering techniques such as model-based clustering (MCLUST) and weighted gene co-expression network analysis (WGCNA)? I'm learning about these things now for my research, and will be very grateful if you can cover these topics for me. Thanks! :)
Thanks! :)
Amazing explanation. Please make a video on Cluster evaluation. :)
I'll keep that in mind.
You're amaaaaaazing
Thanks!
The opening is always funny
:)
the intro.......nice one bro🖐
bam! :)
how can clustering be applied on spectral data?
Love the song!
:)
4.4 K likes, zero dislikes! You're awesome. Thanks very much
bam!
thank you
:)
Hi, Joshua. Do you know the basics of pseudotime analysis in single-cell RNA-seq. Can you make a short video talking about the basics? Thanks!
I'll put that on the to-do list!
Hi Josh, amazing video as always. Do you think you could make a video on how to determine the best number of clusters to have? I get the Elbow method, but I really struggle with the inconsistency method. I was looking at the inconsistency coefficients, and I am confused about whether they include singleton clusters or exclude them. I am also confused about what exactly the "jump" in the inconsistency coefficient is that we are supposed to look out for.
I'll keep that topic in mind.
Hi Josh!! We need a DBSCAN tutorial please!!!!
I'll keep that in mind! :)
Great channel. Clearly explained almost all the topics I watched on ML. One question: what does "gene" stand for? Is it a feature of the data?
Yes, it's a feature.
Thank you very much! Can you also teach software? Like a basic introduction to R, and the basics of how to arrange data with various commands?
I have a handful of videos that teach you how to do certain things in R. They don't start at the very beginning, but I still go one step at a time. You can find these videos on the index page: statquest.org/video-index/
@@statquest Thank you very much.
You are amazing
Thanks!
Hey, you explain this very well and in very simple form, thanks for this. I have a request: could you please make a video on DESeq2? I mean finding DE genes between the time points and then drawing the heatmap, volcano plot, and cluster lines.
Thanks
I'll keep that in mind. I already have a few videos on DESeq2 here: statquest.org/video-index/
finished watching
bam! :)
You saved my life😇 Thank you very much.
And I think the link for the sample code in R isn't available right now...
Yep, that's a really old link. Here's a new one: statquest.org/statquest-hierarchical-clustering/
Thanks for the explanation. Can you please make a video about consensus NMF clustering?
I'll keep that in mind.
Hi Josh! Can you please make a video on DBSCAN, if possible? Especially the parameter tuning part of it, I'm sure that would be of great help to lots of people.
I'll keep that in mind.
Hi Joshua, can you do a video on Gaussian Mixture Models? Also, your videos are awesome! Keep it up.
The good news is that is already on the To-Do list. I'll bump it up a notch since you requested it as well.
@@statquest, make that two requests! Thanks!
@@jordanmakesmaps Cool! It's in the top 10 things for me to do, so hopefully I'll get to it soon.
@@statquest Yes! Anytime people start talking about gaussian mixture models, EM, "sampling the posterior", and MCMCs - I get cold sweats.
Thank you!!!!!!!!!!!!!! :)
@StatQuest with Josh Starmer, in this video you are clustering and combining genes (the attributes of the data). Aren't you supposed to cluster and combine the samples? That's the inverse of the approach shown.
You can cluster the samples or the genes, or both! It all depends on the question you are asking. For example, if I have some healthy people and some sick people, I might be interested in clustering the people (to see if healthy people form one cluster and unhealthy people form another) or I might be interested in clustering the genes. In this case I would find out which genes are correlated and up-regulated in healthy people compared to unhealthy people. Or I could do both. Does that make sense?
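In code, the choice between clustering genes and clustering samples amounts to which axis of the matrix you hand to the linkage function; a sketch with scipy (the matrix and method here are made up for illustration):

```python
# Rows vs. columns: cluster the genes by passing the matrix as-is, or
# cluster the samples by passing its transpose. The data is random and
# only serves to show the shapes involved.
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(1)
expression = rng.normal(size=(5, 8))   # 5 genes (rows) x 8 samples (columns)

gene_tree = linkage(expression, method="average")      # merges the 5 genes
sample_tree = linkage(expression.T, method="average")  # merges the 8 samples
```

Each linkage matrix has one row per merge, so clustering n items always yields n - 1 merge steps.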
Dear StatQuest! Thank you for the explanation.
1. How would you best evaluate the algorithm (silhouette score, ...) to decide which clustering method and distance to use? (I understand that the silhouette score is good for choosing the number of clusters k, but not for deciding between algorithms.)
To decide on the best algorithm, I have been plotting a PCA and coloring the points by cluster label, this way checking whether the clusters make sense or not (however, it is known from the literature that PCA does not work well for evaluating binary data).
2. In the case that the data are binary (e.g., genomic alteration data instead of expression data), what kind of distance would you use?
Best Regards, Manuel
1) I guess it depends. If I had "training" data, with known categories, I would compare how many times the data were correctly and incorrectly grouped. Otherwise, it really just boils down to subjective preference.
2) If you measure a lot of things, the Euclidean distance will still work in this situation.
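For the binary-data case, distances designed for 0/1 vectors, such as Jaccard or Hamming, are also common choices; a small sketch with scipy (the two alteration profiles are made up):

```python
# Comparing distance measures on two made-up binary alteration profiles.
import numpy as np
from scipy.spatial.distance import jaccard, hamming, euclidean

a = np.array([1, 0, 1, 1, 0, 0])
b = np.array([1, 1, 0, 1, 0, 0])

d_jac = jaccard(a, b)    # mismatches / positions where either vector is 1
d_ham = hamming(a, b)    # mismatches / all positions
d_euc = euclidean(a, b)  # sqrt of the number of mismatches for 0/1 data
```

Any of these can be passed as the `metric` to scipy's `linkage` or seaborn's `clustermap`.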