Contrastive Learning in PyTorch - Part 1: Introduction

  • Published on 29 Aug 2024

Comments • 39

  • @philipmay9614
    @philipmay9614 1 year ago +14

    Cosine similarity is between -1 and 1, not just between 0 and 1.

    • @DeepFindr
      @DeepFindr  1 year ago +6

      Oh yes, stupid mistake. Cosine is obviously also between -1 and 1.
      Thanks for pointing this out!

    • @DeepFindr
      @DeepFindr  1 year ago +14

      This will not affect the general concept of the loss, however, because the exp maps all negative terms into (0, 1); see the sketch below.
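
A minimal sketch of the point above, with made-up toy tensors (not from the video): cosine similarity lands in [-1, 1], and exponentiating maps any negative similarity into (0, 1), so negative pairs still contribute well-defined positive terms to the loss.

```python
import torch
import torch.nn.functional as F

# Two toy embedding vectors pointing in roughly opposite directions.
z_i = torch.tensor([1.0, 0.5, -0.3])
z_j = torch.tensor([-1.0, -0.4, 0.2])

# Cosine similarity lies in [-1, 1], not just [0, 1].
sim = F.cosine_similarity(z_i.unsqueeze(0), z_j.unsqueeze(0)).item()
print(f"cosine similarity: {sim:.3f}")  # negative here, close to -1

# exp(.) maps any similarity in [-1, 0) into (0, 1), so negative
# similarities still yield positive terms inside the loss ratio.
print(f"exp(similarity):   {torch.exp(torch.tensor(sim)).item():.3f}")
```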

  • @HafeezUllah
    @HafeezUllah 1 year ago +1

    Man, you have delivered this lecture extremely well.

  • @rajeshve7211
    @rajeshve7211 26 days ago

    Fantastic explanation. You made it look easy!

  • @buh357
    @buh357 1 year ago

    I recently discovered self-supervised learning.
    And I'm starting to work on it.
    Your video helped me a lot.
    Thank you for the great explanation.

  • @zhuangzhuanghe530
    @zhuangzhuanghe530 1 year ago

    This video is the best video I've ever seen

  • @thegimel
    @thegimel 1 year ago +2

    Great video on a very interesting subject. I've recently read the Supervised Contrastive Learning paper, since I'm trying to use it in a problem I'm working on. Excited to watch the next video!
    P.S. It would be cool if you could do a video (or series) on N-shot learning (few-, one- and zero-shot).

    • @DeepFindr
      @DeepFindr  1 year ago

      Thank you :)
      Thanks for the recommendation, I put it on the list!

  • @amortalbeing
    @amortalbeing 1 year ago

    Loved this. Keep up the great work.
    Thanks a lot!

  • @user-sn4ws7qc7n
    @user-sn4ws7qc7n 5 months ago

    Thank you for this video. I learned a lot.

  • @Rfhbe1
    @Rfhbe1 1 year ago

    Hi. Thank you for the video. I found a defect in the NT-Xent loss formula: the temperature should be inside the exponent. Also, when you plug numbers into the formula, the denominator should include what's in the numerator. Have a nice day!

    • @DeepFindr
      @DeepFindr  1 year ago

      Yeah, thanks for pointing that out! I messed some things up regarding NT-Xent :D Will do some corrections in the next part :)
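
For reference, a minimal sketch of NT-Xent with both fixes the commenter mentions: the temperature divides the similarity inside the exp, and the denominator sum over k != i includes the positive (numerator) term. The pairing convention (z[2k] and z[2k+1] form a positive pair) and the toy batch are illustrative assumptions, not taken from the video.

```python
import torch
import torch.nn.functional as F

def nt_xent(z: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent for 2N embeddings where z[2k] and z[2k+1] are a positive pair.

    loss_i = -log( exp(sim(z_i, z_j)/t) / sum_{k != i} exp(sim(z_i, z_k)/t) )
    """
    z = F.normalize(z, dim=1)          # unit norm: dot product == cosine similarity
    sim = (z @ z.T) / temperature      # temperature scales *inside* the exponent
    sim.fill_diagonal_(float("-inf"))  # drop k == i; the positive k == j stays in the sum

    # Index of each sample's positive partner: (0,1), (2,3), ...
    pos = torch.arange(z.size(0)) ^ 1
    # cross_entropy = -log softmax, i.e. exactly the NT-Xent ratio above,
    # with the numerator's exp(...) term included in the denominator sum.
    return F.cross_entropy(sim, pos)

# Toy usage with random embeddings standing in for augmented views.
z = torch.randn(8, 128)
print(nt_xent(z).item())
```

Using F.cross_entropy here is just a numerically stable way to evaluate the -log softmax of the scaled similarities.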

  • @mhadnanali
    @mhadnanali 1 year ago

    Looking forward to the implementation.

  • @mafiamustafa
    @mafiamustafa 1 year ago

    Another amazing video!

  • @nikosspyrou3890
    @nikosspyrou3890 1 year ago +1

    Great video!! Could you also make a video showing an implementation of contrastive learning for a semantic segmentation problem?

    • @DeepFindr
      @DeepFindr  1 year ago

      Thanks! Soon I'll upload the implementation for point clouds. It should be quite similar, just using other layer types.
      Or are you referring to any special variants of CL for semantic segmentation?

    • @nikosspyrou3890
      @nikosspyrou3890 1 year ago +1

      Thanks for your reply! Actually, I would like to see an experimental example on an image segmentation dataset where a contrastive loss (for example, InfoNCE) combined with a supervised loss such as cross-entropy boosts the segmentation performance.

    • @DeepFindr
      @DeepFindr  1 year ago

      I have to see if I find time, but it's certainly noted. Thanks for the suggestion!

  • @jamesgalante7967
    @jamesgalante7967 1 year ago

    Damn. You’re a good teacher

  • @CollegeTasty
    @CollegeTasty 1 year ago

    Thank you!

  • @PrajwalSingh15
    @PrajwalSingh15 1 year ago

    Awesome explanation, thanks! Just a small query: how long will this series be, and what is the expected release frequency?

    • @DeepFindr
      @DeepFindr  1 year ago +1

      Thanks! I plan to upload the hands-on part within two weeks at the latest. That will be the final part of this introduction :)

  • @Sciencehub-oq5go
    @Sciencehub-oq5go 1 year ago

    Great video. Thanks. Could you please comment on some of the ways of handling false negatives?

  • @sakib.9419
    @sakib.9419 1 year ago

    Such a good video!

  • @hussainmujtaba638
    @hussainmujtaba638 1 year ago

    Amazing content!

  • @eranjitkumar11
    @eranjitkumar11 1 year ago

    Thanks for your videos. Can you create a tutorial video on Deep Graph Infomax (maybe on the Cora dataset)? This would (besides being useful for me ;) ) tie in with your last subject, GNNs with contrastive learning.

    • @DeepFindr
      @DeepFindr  1 year ago +1

      Yep, I've read the paper. Will note it down :) but the list is getting very loooong :D

  • @kornellewychan
    @kornellewychan 1 year ago

    great

  • @badrinathroysam5159
    @badrinathroysam5159 1 year ago

    The temperature term seems to be misplaced.

    • @DeepFindr
      @DeepFindr  1 year ago

      Yes, please see the correction at the beginning of the second part :)

  • @vignatej663
    @vignatej663 1 year ago

    But the loss at 12:50 has to be 0.8/(0.8+0.2). Since the denominator has a sigma, I don't know why you didn't add the 0.8 to the denominator (see the worked numbers after this thread).

    • @DeepFindr
      @DeepFindr  1 year ago

      Yeah, as mentioned in the second part, I had some errors there :\

    • @The_Night_Knight
      @The_Night_Knight 11 months ago

      @@DeepFindr What if we used disentangled variational autoencoders to rotate 2D images in 3D, not just changing the color or in-plane rotation? The model would be able to generalize far better to many more 3D angles with less data.
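
To make the thread's point concrete, a quick check using the toy numbers from the comment above (0.8 as the exponentiated positive term, 0.2 as the single exponentiated negative term; assumed from the comment, not verified against the video):

```python
import math

pos = 0.8  # exp(sim(anchor, positive) / t), toy value from the comment
neg = 0.2  # exp(sim(anchor, negative) / t), toy value from the comment

# The sigma in the denominator runs over all other samples, which
# includes the positive pair itself: 0.8 / (0.8 + 0.2).
ratio = pos / (pos + neg)
loss = -math.log(ratio)
print(f"ratio = {ratio:.2f}, loss = {loss:.3f}")  # ratio = 0.80, loss = 0.223
```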

  • @user-wd7gv5jl9z
    @user-wd7gv5jl9z 4 months ago

    Anyone from IISc B?