What Is Self-Supervised Learning and Why Care?
- Published 17. 06. 2023
- What is Self-Supervised Learning?
“Self-Supervised methods […] are going to be the main method to train neural nets before we train them for difficult tasks” - Yann LeCun
Let’s discover the fascinating machine learning domain of Self-Supervised Learning!
It is one of the most important machine learning techniques in the current day and age.
So, in this first video of the series we will motivate the idea of representation learning using Self-Supervised Learning!
👾 Join the discord server!
/ discord
Sources of clips used:
• #86 - Prof. YANN LECUN...
• Yann LeCun: Dark Matte...
• Yann LeCun, Chief AI S...
⬇️ Follow me on my other socials and feel free to DM questions! ⬇️
📲 Instagram: / m.m.boris
⚫⚪ Medium: / boris.meinardus
🐦 Twitter: / borismeinardus
#ai #education #SSL - Science & Technology
What Machine Learning Domain would you like to explore next?
Something that you find interesting.
Very well explained, in a simple manner. Thanks Boris
very good explanation, also love how Vsauce music pops in for a split second when you say "...or are they?" lol
thanks 🤩
haha yeah, really appreciate people noticing the details! ☺️
I was not able to get my head around pretext tasks for a long time and then I found this video. Thanks man.
So happy to hear it helped 🤩
Best explanation on this topic I've seen or read. Thanks so much
💛 Really happy to hear that! ☺️
By far the best explanation
Thank you Boris. Such a clear Explanation!
I‘m really glad to hear it could help you! 🤩☺️
Great Video! I'm taking a deep learning class right now, and this video really helped me understand the idea of self-supervised learning!
I'm so glad it was helpful!
Well explained, without going into unnecessary complexities (details) like traditional professors in class or textbooks do.
Thanks a lot. Continue making videos like these.
Glad you enjoyed it!
You explain complex concepts really well! Thanks!
Thank you! Really glad you enjoyed it :)
Very good high-level explanation.
Thank you!! 🤩☺️
what is this background? headache
Great video! Helps a lot with my research
Awesome, thank you!
Nice. Can you do a video on how this might be related to reinforcement learning?
Do you have a specific paper in mind? :)
@@borismeinardus Not really. I was just wondering if ssl can be used to predict future rewards or sth...
@@binig.4591 I see. Well, in a sense, RL already is similar to SSL. The agent performs actions in its environment and autonomously generates its own training data.
I’m no expert in RL, but if I happen to learn more about it or find someone to talk to that is more educated in that direction, I will try to work on a video :)
This helped me a lot 🤗
I‘m really happy to hear that ☺️
Amazing explanation, thank you
Glad it was helpful!
So if I stack an autoregressive LLM on top of a self-supervised encoder model like BERT, would it understand the context of the text better?
(Not thinking too long about the idea) I have two thoughts:
- how do you want to use the BERT encoder? In the classical encoder-decoder transformer? Or do you want to use it in between the tokenizer and the actual input into the decoder?
- If you pretrain the BERT model separately from the AR decoder and just use it as a frozen module, the decoder will not instantly understand the embeddings produced by the encoder. You will have to either fine-tune the encoder and decoder together to align their representations, or freeze them both and train some sort of connection module. This connection module can be a simple linear layer, an MLP, or something more complex.
Of course, there are more things to consider, but those are the basic things you might need to think about.
I hope this makes sense and helps!
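To make the "connection module" idea from the reply above concrete, here is a minimal sketch in plain Python. The dimensions, the random initialization, and the `linear` helper are all illustrative assumptions, not any real BERT or decoder API; the point is only that a learned projection maps the frozen encoder's embedding size onto whatever input size the decoder expects.

```python
# Hypothetical sketch of a linear "connection module" between a frozen
# encoder and a frozen decoder. All names and dimensions are illustrative.
import random

random.seed(0)

ENC_DIM = 4   # assumed hidden size of the frozen encoder (e.g. BERT-style)
DEC_DIM = 3   # assumed input embedding size of the autoregressive decoder

def linear(x, weights, bias):
    """Project a vector of length ENC_DIM to length DEC_DIM: y = Wx + b."""
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) + b
            for row, b in zip(weights, bias)]

# Trainable parameters of the connection module (random init here);
# in practice only these would be updated while encoder/decoder stay frozen.
W = [[random.uniform(-0.1, 0.1) for _ in range(ENC_DIM)]
     for _ in range(DEC_DIM)]
b = [0.0] * DEC_DIM

# One token's embedding as produced by the frozen encoder (made-up values).
frozen_encoder_output = [0.5, -1.2, 0.3, 0.8]

# The projected vector is what would be fed into the decoder.
decoder_input = linear(frozen_encoder_output, W, b)
print(len(decoder_input))  # 3
```

Replacing `linear` with a small MLP follows the same pattern; the design choice is just how much capacity the bridge between the two frozen models needs.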
Well explained thx
Glad you liked it! 😊
Please remove the music. I had to pause four times to avoid stress... If your speech is interesting enough, and I think it is, why cover it with noise?