What Is Self-Supervised Learning and Why Care?

  • Published 17 Jun 2023
  • What is Self-Supervised Learning?
    “Self-Supervised methods […] are going to be the main method to train neural nets before we train them for difficult tasks”  -  Yann LeCun
    Let’s discover the fascinating machine learning domain of Self-Supervised Learning!
    It is one of the most important machine learning techniques in use today.
    So, in this first video of the series, we will motivate the idea of representation learning using Self-Supervised Learning!
    👾 Join the discord server!
    / discord
    Sources of clips used:
    • #86 - Prof. YANN LECUN...
    • Yann LeCun: Dark Matte...
    • Yann LeCun, Chief AI S...
    ⬇️ Follow me on my other socials and feel free to DM questions! ⬇️
    📲 Instagram: / m.m.boris
    ⚫⚪ Medium: / boris.meinardus
    🐦 Twitter: / borismeinardus
    #ai #education #SSL
  • Science & Technology

Comments • 36

  • @borismeinardus
    @borismeinardus  1 year ago

    What Machine Learning Domain would you like to explore next?

    • @kiffbeatz
      @kiffbeatz 1 year ago

      Something that you find interesting.

  • @modakad
    @modakad 1 day ago

    Very well explained, in a simple manner. Thanks, Boris!

  • @InfraredScale
    @InfraredScale 1 year ago +9

    Very good explanation. Also love how the Vsauce music pops in for a split second when you say "...or are they?" lol

    • @borismeinardus
      @borismeinardus  1 year ago +1

      thanks 🤩
      haha yeah, really appreciate people noticing the details! ☺️

  • @maximinjoshua766
    @maximinjoshua766 7 months ago +1

    I was not able to get my head around pretext tasks for a long time, and then I found this video. Thanks, man.

  • @Opinionman2
    @Opinionman2 9 months ago +2

    Best explanation on this topic I've seen or read. Thanks so much!

  • @ashwanthramas5095
    @ashwanthramas5095 12 hours ago

    By far the best explanation

  • @af121x
    @af121x 10 months ago +3

    Thank you, Boris. Such a clear explanation!

    • @borismeinardus
      @borismeinardus  10 months ago +1

      I'm really glad to hear it could help you! 🤩☺️

  • @granttannert4457
    @granttannert4457 8 months ago

    Great video! I'm taking a deep learning class right now, and this video really helped me understand the idea of self-supervised learning!

  • @engr_nadeemshah
    @engr_nadeemshah 8 months ago

    Well explained, without going into unnecessary complexities the way traditional professors or textbooks do.
    Thanks a lot. Keep making videos like these.

  • @tfhfjhgkjhkjh
    @tfhfjhgkjhkjh 4 months ago

    You explain complex concepts really well! Thanks!

  • @kiffbeatz
    @kiffbeatz 1 year ago +2

    Very good high-level explanation.

  • @user-us6sg8vn3o
    @user-us6sg8vn3o 4 months ago +4

    What is this background? It's giving me a headache.

  • @sergioorozco7331
    @sergioorozco7331 7 months ago

    Great video! Helps a lot with my research.

  • @binig.4591
    @binig.4591 1 year ago +1

    Nice. Can you do a video on how this might be related to reinforcement learning?

    • @borismeinardus
      @borismeinardus  1 year ago

      Do you have a specific paper in mind? :)

    • @binig.4591
      @binig.4591 1 year ago

      @borismeinardus Not really. I was just wondering if SSL can be used to predict future rewards or something...

    • @borismeinardus
      @borismeinardus  1 year ago

      @binig.4591 I see. Well, in a sense, RL already is similar to SSL: the agent performs actions in its environment and autonomously generates its own training data (see the sketch below).
      I'm no expert in RL, but if I happen to learn more about it or find someone to talk to who is more educated in that direction, I will try to work on a video :)
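
      A minimal PyTorch sketch of that idea: a dynamics/reward model trained purely on transitions the agent collected itself, with no human labels. All names, dimensions, and the fake batch here are illustrative assumptions, not something from the video.

      ```python
      import torch
      import torch.nn as nn

      OBS_DIM, ACT_DIM = 8, 2  # toy environment sizes (assumed)

      class DynamicsModel(nn.Module):
          """Predicts the next observation and reward from (obs, action).
          The targets come from the agent's own experience, which is
          what makes the objective self-supervised."""
          def __init__(self):
              super().__init__()
              self.trunk = nn.Sequential(nn.Linear(OBS_DIM + ACT_DIM, 64), nn.ReLU())
              self.next_obs_head = nn.Linear(64, OBS_DIM)
              self.reward_head = nn.Linear(64, 1)

          def forward(self, obs, act):
              h = self.trunk(torch.cat([obs, act], dim=-1))
              return self.next_obs_head(h), self.reward_head(h)

      model = DynamicsModel()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)

      # Fake batch of (obs, action, next_obs, reward) transitions standing
      # in for data the agent gathered by acting in its environment.
      obs, act = torch.randn(32, OBS_DIM), torch.randn(32, ACT_DIM)
      next_obs, reward = torch.randn(32, OBS_DIM), torch.randn(32, 1)

      pred_obs, pred_rew = model(obs, act)
      loss = (nn.functional.mse_loss(pred_obs, next_obs)
              + nn.functional.mse_loss(pred_rew, reward))
      opt.zero_grad(); loss.backward(); opt.step()
      ```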

  • @anna040598
    @anna040598 1 year ago +1

    This helped me a lot 🤗

  • @athenaconstantinou5353
    @athenaconstantinou5353 8 months ago

    Amazing explanation, thank you!

  • @r.s.e.9846
    @r.s.e.9846 9 months ago

    So if I stack an autoregressive LLM on top of a self-supervised model like BERT, would it understand the context of the text better?

    • @borismeinardus
      @borismeinardus  9 months ago

      (Not having thought too long about the idea) I have two thoughts:
      - How do you want to use the BERT encoder? In the classical encoder-decoder transformer? Or do you want to use it between the tokenizer and the actual input to the decoder?
      - If you pretrain the BERT model separately from the AR decoder and just use it as a frozen module, the decoder will not instantly understand the embeddings produced by the encoder. You will have to either fine-tune the encoder and decoder together, to align their representations, or freeze them both and train some sort of connection module. This connection module can be a simple linear layer, an MLP, or something more complex (see the sketch below).
      Of course, there are more things to consider, but those are the basic things you might need to think about.
      I hope this makes sense and helps!
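
      A minimal PyTorch/Hugging Face sketch of the second option: a frozen BERT encoder, a small trainable connection module, and the projected embeddings prepended to the decoder's token embeddings as a prefix. The model choices and prefix-style wiring are illustrative assumptions, not a prescription.

      ```python
      import torch
      import torch.nn as nn
      from transformers import AutoTokenizer, BertModel, GPT2LMHeadModel

      bert = BertModel.from_pretrained("bert-base-uncased")  # frozen SSL encoder
      gpt2 = GPT2LMHeadModel.from_pretrained("gpt2")         # AR decoder
      for p in bert.parameters():
          p.requires_grad = False  # keep the encoder frozen

      # Connection module: map BERT's hidden states into GPT-2's embedding
      # space. A single linear layer here; an MLP also works.
      connector = nn.Linear(bert.config.hidden_size, gpt2.config.n_embd)

      bert_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
      gpt2_tok = AutoTokenizer.from_pretrained("gpt2")

      ctx = bert_tok("Self-supervised learning builds representations.",
                     return_tensors="pt")
      with torch.no_grad():
          enc = bert(**ctx).last_hidden_state    # (1, n_ctx, 768)
      prefix = connector(enc)                    # aligned to the decoder's space

      dec_ids = gpt2_tok("It works by", return_tensors="pt").input_ids
      tok_emb = gpt2.transformer.wte(dec_ids)    # decoder token embeddings
      out = gpt2(inputs_embeds=torch.cat([prefix, tok_emb], dim=1))
      print(out.logits.shape)                    # (1, n_ctx + n_dec, vocab_size)
      ```

      Training would then optimize the connector (and optionally the decoder) with the usual next-token loss while BERT stays frozen.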

  • @efesirin2870
    @efesirin2870 7 months ago

    Well explained, thanks!

  • @pauceano945
    @pauceano945 6 months ago +1

    Please remove the music. I had to stop 4 times to avoid the stress... If your speech is interesting enough, and I think it is, why muddy it with noise?