Building Recommender System with PyTorch using Collaborative Filtering

  • Published Sep 3, 2022
  • Welcome to this video! In this video, we cover how to implement a basic #recommendersystem using Collaborative Filtering and #deeplearning with #pytorch . It also covers evaluation metrics such as RMSE and Recall@k.
    Don't hesitate to leave a comment if you have any questions.
    Likes 👍🏼, subscribing 🔔 and sharing are always much appreciated and will be my motivation to make more videos! 🙌🏼
    #recommendersystems #pytorch #gnn #deeplearning #machinelearning #artificialintelligence #ai #graphneuralnetwork #graph #neuralnetwork #developers #pytorchgeometric #towermodel #explanation #machinelearner #matrixmultiplication #systemdesign #recommendationsystem #recommendersystem #recommendation #rankingsystem #ranking #google #youtube #ml #mlsystemdesign
  • Science & Technology

Comments • 31

  • @the-ai-alchemy
    @the-ai-alchemy  1 year ago +15

    Thank you for supporting this video. If you are interested in the supplementary materials (slides and notebook) in the video, please visit: open.substack.com/pub/aialchemyofficial/p/ai-alchemy-supplementary-materials

  • @cailin9601
    @cailin9601 1 year ago +7

    Great video with excellent information - keep it up! 😊

  • @Debz_panache
    @Debz_panache 1 year ago +3

    Perfect! It's executable too. Thanks

  • @FromNamoR3k
    @FromNamoR3k 1 year ago +3

    Great video with clear explanation, I think watching it with zoom-in/fullscreen would make it even better

  • @nobodycanfindme314
    @nobodycanfindme314 6 months ago

    Great video

  • @utubesiddhant
    @utubesiddhant 5 months ago

    Awesome video! It provides a great intro to collaborative filtering using PyTorch. One question: the model in the video doesn't take the dot product of the user and movie embeddings, right? Instead, it predicts movie ratings from a linear combination of both embeddings using learned weights, with the embeddings and weights evolving as the model learns at each iteration. Is my understanding correct, or am I missing something?
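For reference, the concat-plus-linear architecture this question describes can be sketched as below (layer sizes and names are hypothetical, not the exact model from the video):

```python
import torch
import torch.nn as nn

class ConcatRecommender(nn.Module):
    """Predicts a rating from concatenated user/item embeddings
    (no dot product); the final Linear layer holds the learned weights."""
    def __init__(self, n_users, n_items, emb_dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, emb_dim)
        self.item_emb = nn.Embedding(n_items, emb_dim)
        # Weights here linearly combine both embeddings into one rating.
        self.fc = nn.Linear(emb_dim * 2, 1)

    def forward(self, user_ids, item_ids):
        x = torch.cat([self.user_emb(user_ids), self.item_emb(item_ids)], dim=1)
        return self.fc(x).squeeze(1)

model = ConcatRecommender(n_users=100, n_items=50)
ratings = model(torch.tensor([0, 1]), torch.tensor([3, 7]))
print(ratings.shape)  # torch.Size([2])
```

During training, gradients flow into both the embedding tables and the linear weights, so both evolve together, as the question suggests.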

  • @christianreizner2546
    @christianreizner2546 1 year ago +1

    Why did you choose 32 as the output size for the embeddings? Is this an arbitrary choice?

  • @736939
    @736939 1 year ago +1

    So why don't you use any activation function in the model? And how do we use it in production after training? Should we just take the dot product of both tuned embedding layers to get the predicted result?
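On the production question: whether a dot product is enough depends on how the model was trained. If the model uses a fully connected head (as in this video), serving must run the full forward pass; if it was trained with a dot-product score, serving can be a single matrix multiply against the item table. A dot-product serving sketch (toy, untrained tables; all names are illustrative):

```python
import torch
import torch.nn as nn

# Toy embedding tables; in practice these come from the trained model.
n_users, n_items, emb_dim = 100, 50, 32
user_emb = nn.Embedding(n_users, emb_dim)
item_emb = nn.Embedding(n_items, emb_dim)

with torch.no_grad():                    # inference only, no gradients
    u = user_emb(torch.tensor([7]))      # (1, emb_dim) vector for user 7
    scores = u @ item_emb.weight.T       # (1, n_items) dot products vs. all items
    top_items = torch.topk(scores, k=5, dim=1).indices  # 5 best item ids
print(top_items.shape)  # torch.Size([1, 5])
```

This one-matmul pattern is why dot-product models are popular for retrieval at scale.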

  • @vinothkannan5091
    @vinothkannan5091 1 year ago +1

    If we use the number of unique users and items for the embedding sizes, what happens if a new user/item is added in the future? How does the trained model handle this case?
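This cold-start limitation is real: a fixed-size embedding table has no row for an unseen id. One common workaround (not covered in the video; a sketch under that assumption) is to grow the table, copy over the learned rows, and fine-tune the new ones:

```python
import torch
import torch.nn as nn

old = nn.Embedding(100, 32)           # trained table for 100 items
new = nn.Embedding(101, 32)           # table after one new item arrives

with torch.no_grad():
    new.weight[:100] = old.weight     # keep all learned vectors
    # Row 100 stays randomly initialized until fine-tuning
    # (or could be set to the mean of existing rows as a warm start).
```

Alternatives include hashing ids into a fixed number of buckets, or content-based models (e.g. two-tower) that build vectors from features instead of ids.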

  • @TheJosephjeffy
    @TheJosephjeffy 1 year ago

    Thank you for your video! I want to ask where I can incorporate user characteristics into the model for prediction. The data you used did not contain that information.

    • @the-ai-alchemy
      @the-ai-alchemy  1 year ago

      Hi YH Dai, thank you for your support. I will make future videos on that for models such as Two Tower and Wide & Deep.

    • @ron-hd9hr
      @ron-hd9hr 1 year ago

      @@the-ai-alchemy That's great. Looking forward to the Two Tower video. Thanks

  • @XxXx-sc3xu
    @XxXx-sc3xu 7 months ago

    Awesome video! Are the slides available for download? Loved the format, thank you.

    • @utubesiddhant
      @utubesiddhant 5 months ago

      drive.google.com/drive/folders/1B_NZl0GeDrdx67dd_7mymYRF3Ktkdrd0

    • @user-iq4xg9dw5e
      @user-iq4xg9dw5e 2 months ago

      @@utubesiddhant Could you grant me access?

  • @vikramsudarshan6416
    @vikramsudarshan6416 1 year ago

    Hello, the iter(train_loader) and dataiter.next() steps appear to be very slow on my end - is this normal? Does it always take this long to load the dataset in batches?

    • @matealextoth2649
      @matealextoth2649 3 months ago

      I have the same issue, but I didn't really find a solution for it.
      Please tell me if you solved it.
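For anyone hitting this: the first fetch from a DataLoader often pays worker startup and per-batch collate costs, and note that newer PyTorch versions removed the `dataiter.next()` method in favor of Python's built-in `next(dataiter)`. A sketch of settings that commonly help (parameter values are illustrative):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

ds = TensorDataset(torch.arange(10_000).float())
loader = DataLoader(
    ds,
    batch_size=256,
    shuffle=True,
    num_workers=0,    # increase (e.g. 2-4) to load batches in parallel on real,
                      # I/O-heavy datasets; keep 0 for small in-memory data,
                      # since spawning workers has its own startup cost
    pin_memory=True,  # faster host-to-GPU copies when training on CUDA
)

dataiter = iter(loader)
batch = next(dataiter)     # use next(...), not the removed .next() method
print(batch[0].shape)  # torch.Size([256])
```

If the slowness persists across every batch rather than just the first, the bottleneck is usually the Dataset's `__getitem__` (e.g. per-row pandas indexing), not the loader itself.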

  • @stevekelly5381
    @stevekelly5381 1 year ago

    Please explain how you derived the User x Item co-occurrence matrix from the User Matrix * Item Matrix. I don't understand where the values come from. The first cell is 3.16 but you leave it blank; the second cell is 1.92 but you put 4.5?! Why?

  • @user-bl8hi7je1z
    @user-bl8hi7je1z 1 year ago +1

    Thanks for sharing. Could you please explain how you create the latent vectors or embeddings for users and items? I tried to reproduce the video but could not get it.

    • @the-ai-alchemy
      @the-ai-alchemy  1 year ago +1

      Embedding is learned/trained. In this video you can see (inside def __init__) that the items' embedding has shape n_item x 32. It's initialized with random numbers, and through the training process the model learns the best values of the embedding vector for each item to minimize the loss.
      On a side note, embedding is a very interesting and important topic in deep learning. I'm planning a series of videos dedicated to different types of embeddings, so stay tuned 🔔.
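The "embedding is learned" point can be seen in a tiny sketch (toy loss and sizes, for illustration only): after one optimizer step, only the rows used in the loss move, while untouched rows keep their random values.

```python
import torch
import torch.nn as nn

emb = nn.Embedding(5, 32)              # 5 items, 32-dim vectors, random init
row0_before = emb.weight[0].clone()
row1_before = emb.weight[1].clone()
opt = torch.optim.SGD(emb.parameters(), lr=0.1)

loss = emb(torch.tensor([0])).pow(2).sum()  # toy loss touching item 0 only
loss.backward()
opt.step()

print(torch.equal(emb.weight[0], row0_before))  # False: row 0 was updated
print(torch.equal(emb.weight[1], row1_before))  # True: row 1 untouched
```

With a real rating loss, each step nudges exactly the user and item rows that appear in the batch.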

    • @user-bl8hi7je1z
      @user-bl8hi7je1z 1 year ago

      @@the-ai-alchemy Thanks a lot. So in PyTorch there is an embedding layer which creates random values, but how do you extract the user and item vectors from the matrix in the slide? Is it also by creating random numbers? If so, does that mean it performs different operations until it gets the numbers in the matrix? I'm asking because I've seen embeddings in different areas (e.g. graphs) and I could not understand exactly what they do.

    • @the-ai-alchemy
      @the-ai-alchemy  1 year ago +3

      The concept of an embedding is essentially just an M x N matrix, where M is n_item or n_user, and N is the size of the vector (array) we want to give each user or item as their individual embedding. Once you have the embeddings, you can use them however you like: either a dot product (as in my GNN tutorial part 2) or concatenating them for fully connected layers (as in this tutorial).
      You might be confusing the general concept of an embedding with the PyTorch Embedding. The two are related: a PyTorch Embedding is just an M x N matrix that's initialized with random values at the start of training; once training finishes, each row represents a given user's or item's embedding vector for easy lookup.
      E.g.: item_emb[5] will give me the embedding of the item with item_id=5.
      I also made a video series on graph neural networks in which graph embeddings are covered as well. You can check it out: czcams.com/play/PLcLdsfpLufYCJ_eg7VWuI7ROQT7SxUDGj.html
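The lookup described above can be seen directly in a minimal sketch (sizes are illustrative):

```python
import torch
import torch.nn as nn

n_items, emb_dim = 10, 32
item_emb = nn.Embedding(n_items, emb_dim)  # a 10 x 32 matrix, random init
vec = item_emb(torch.tensor(5))            # embedding vector of item_id=5
print(vec.shape)  # torch.Size([32])

# The lookup is nothing more than row indexing into the weight matrix.
print(torch.equal(vec, item_emb.weight[5]))  # True
```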

    • @user-bl8hi7je1z
      @user-bl8hi7je1z 1 year ago +1

      Thanks a lot, appreciate it.

  • @hamnaka3355
    @hamnaka3355 1 year ago

    Sir, can you explain the output?

  • @spicytuna08
    @spicytuna08 20 days ago

    Do you have a GitHub repo where we can access the code?