Artificial Intelligence - All in One
  • 413
  • 4,694,339
Lecture 39 - Recommender Systems Content based Filtering -- Part 2 | UIUC
🔔 Stay Connected! Get the latest insights on Artificial Intelligence (AI) 🧠, Natural Language Processing (NLP) 📝, and Large Language Models (LLMs) 🤖. Follow (@mtnayeem) on Twitter 🐦 for real-time updates, news, and discussions in the field.
Check out the following interesting papers. Happy learning!
Paper Title: "On the Role of Reviewer Expertise in Temporal Review Helpfulness Prediction"
Paper: aclanthology.org/2023.findings-eacl.125/
Dataset: huggingface.co/datasets/tafseer-nayeem/review_helpfulness_prediction
Paper Title: "Abstractive Unsupervised Multi-Document Summarization using Paraphrastic Sentence Fusion"
Paper: aclanthology.org/C18-1102/
Paper Title: "Extract with Order for Coherent Multi-Document Summarization"
Paper: aclanthology.org/W17-2407.pdf
Paper Title: "Paraphrastic Fusion for Abstractive Multi-Sentence Compression Generation"
Paper: dl.acm.org/doi/abs/10.1145/3132847.3133106
Paper Title: "Neural Diverse Abstractive Sentence Compression Generation"
Paper: link.springer.com/chapter/10.1007/978-3-030-15719-7_14
3,715 views

Video

Lecture 38 - Recommender Systems Content based Filtering -- Part 1 | UIUC
7K views · 6 years ago
Lecture 37 - Future of Web Search | UIUC
898 views · 6 years ago
Lecture 36 - Learning to Rank -- Part 3 | UIUC
925 views · 6 years ago
Lecture 35 - Learning to Rank -- Part 2 | UIUC
1.3K views · 6 years ago
Lecture 34 - Learning to Rank -- Part 1 | UIUC
3K views · 6 years ago
Lecture 33 - Link Analysis -- Part 3 | UIUC
838 views · 6 years ago
Lecture 32 - Link Analysis -- Part 2 | UIUC
1.1K views · 6 years ago
Lecture 31 - Link Analysis -- Part 1 | UIUC
1.3K views · 6 years ago
Lecture 30 - Web Indexing | UIUC
1.3K views · 6 years ago
Lecture 29 - Web Search Introduction & Web Crawler | UIUC
8K views · 6 years ago
Lecture 28 - Feedback in Text Retrieval Feedback in LM | UIUC
988 views · 6 years ago
Lecture 27 - Feedback in Vector Space Model | UIUC
3.4K views · 6 years ago
Lecture 40 - Recommender Systems Collaborative Filtering -- Part 1 | UIUC
1.9K views · 6 years ago
Lecture 41 - Recommender Systems Collaborative Filtering -- Part 2 | UIUC
1.4K views · 6 years ago
Lecture 42 - Recommender Systems Collaborative Filtering -- Part 3
971 views · 6 years ago
Lecture 43 - Course Summary | UIUC
576 views · 6 years ago
Lecture 26 - Feedback in Text Retrieval | UIUC
1.1K views · 6 years ago
Lecture 12 - System Implementation Fast Search | UIUC
1.1K views · 6 years ago
Lecture 13 - Evaluation of TR Systems | UIUC
1.4K views · 6 years ago
Lecture 14 - Evaluation of TR Systems Basic Measures | UIUC
1.2K views · 6 years ago
Lecture 15 - Evaluation of TR Systems Evaluating Ranked Lists -- Part 1 | UIUC
1.2K views · 6 years ago
Lecture 16 - Evaluation of TR Systems Evaluating Ranked Lists -- Part 2 | UIUC
1.4K views · 6 years ago
Lecture 17 - Evaluation of TR Systems Multi Level Judgements | UIUC
959 views · 6 years ago
Lecture 18 - Evaluation of TR Systems Practical Issues | UIUC
906 views · 6 years ago
Lecture 19 - Probabilistic Retrieval Model Basic Idea | UIUC
21K views · 6 years ago
Lecture 20 - Statistical Language Models | UIUC
3.6K views · 6 years ago
Lecture 21 - Query Likelihood Retrieval Function | UIUC
4K views · 6 years ago
Lecture 22 - Smoothing of Language Model -- Part 1 | UIUC
2K views · 6 years ago
Lecture 23 - Smoothing of Language Model -- Part 2 | UIUC
974 views · 6 years ago

Comments

  • @bintoanto1471 · 4 days ago

    If nodes A and B have two edges between each other, in the adjacency matrix of an UNDIRECTED graph is the entry A_ij = 1 (an edge exists) or 2 (there are two edges)? Which one is correct? Also, if there is no edge A→B but there is an edge B→A (a direction), should it be counted for both?
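A minimal sketch of the usual conventions behind this question (toy example, not taken from the lecture): an undirected simple graph stores 0/1 entries, an undirected multigraph stores edge counts, and a directed graph sets only the entry for the direction that exists.

    import numpy as np

    # Nodes: A = 0, B = 1 (toy example)
    A_simple = np.array([[0, 1],
                         [1, 0]])     # simple undirected graph: 1 if any edge exists, matrix is symmetric
    A_multi = np.array([[0, 2],
                        [2, 0]])      # undirected multigraph: the entry counts the parallel edges
    A_directed = np.array([[0, 0],
                           [1, 0]])   # directed graph with only B -> A: row B, column A is 1, not symmetric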

  • @sohanaryal · 13 days ago

    Is there a book that teaches just like this lecture series?

  • @30ashishchotani81 · 18 days ago

    how is d_concept = d × V different from the U × Σ matrix, if both give the position of the user-to-rating data in the new coordinate space?
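For the SVD-style decomposition this question refers to, the two descriptions coincide: if A = U Σ Vᵀ, then projecting the original rows onto V (i.e., d × V per row) gives exactly the rows of U Σ. A small numpy check (toy data, illustrative only):

    import numpy as np

    A = np.random.default_rng(0).normal(size=(5, 4))   # toy user-item / document-term matrix
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    coords_from_us = U * s          # rows of U @ diag(s): coordinates in the concept space
    coords_from_av = A @ Vt.T       # projecting each original row d onto V, i.e. d x V
    print(np.allclose(coords_from_us, coords_from_av))   # True: both give the same coordinates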

  • @hmcyrus · 19 days ago

    2:56 - RL; 3:26 - why RL is difficult

  • @gemini_537 · 21 days ago

    Gemini 1.5 Pro: This video is about a mixture of experts model, a neural network architecture proposed by Geoffrey Hinton in the early 1990s. The main idea is to train a set of neural networks, each specializing in a different part of the data. A manager neural network then assigns weights to these experts to decide which expert to rely on for a particular case. The benefit of this approach is that it allows the models to specialize on a subset of the data, leading to better performance on that subset. This is particularly useful for very large datasets. However, training a mixture of experts model is more complex than training a single model. Here are the key points of the video:
    * A mixture of experts model consists of multiple neural networks (experts) and a manager neural network.
    * Each expert is trained on a different part of the data.
    * The manager neural network assigns weights to the experts based on the input data.
    * This allows experts to specialize on a subset of the data and improve performance.
    * The mixture of experts model is particularly useful for very large datasets.
    * Training a mixture of experts model is more complex than training a single model.
    I hope this summary is helpful!
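A toy numpy sketch of the mixture-of-experts idea summarized above (illustrative names and sizes, not the architecture from the video): a manager/gating network produces softmax weights over the experts, and the output is the gate-weighted combination of the experts' predictions.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=4)                                  # one input vector (toy data)
    experts = [rng.normal(size=(2, 4)) for _ in range(3)]   # three linear "expert" networks
    W_gate = rng.normal(size=(3, 4))                        # the manager / gating network

    gate_logits = W_gate @ x
    gates = np.exp(gate_logits) / np.exp(gate_logits).sum()   # softmax weights over the experts
    expert_outputs = np.stack([E @ x for E in experts])       # each expert's prediction, shape (3, 2)
    y = (gates[:, None] * expert_outputs).sum(axis=0)         # gate-weighted combination, shape (2,)
    print(gates, y)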

  • @SamuelOgazi · 29 days ago

    This was great to watch! Thank you and God bless!

  • @kakumon8949 · 1 month ago

    missed you, the previous prof was horrible

  • @yangningxin1832 · 1 month ago

    this video is the best on SVD, the best, the best!!!

  • @Iamfafafel · 1 month ago

    for the last slide, you should impose the condition in the minimization that the sum of y is 0; otherwise the constant vector of all 1's is a trivial solution (corresponding to lambda_1)
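The standard variational characterization behind this point (not taken from the slide itself): for the Laplacian L of a connected graph,

    \lambda_2 = \min_{y \neq 0,\ \mathbf{1}^\top y = 0} \frac{y^\top L y}{y^\top y}

Without the constraint 1ᵀy = 0, the all-ones vector gives Ly = 0 and a trivial minimum of 0 (the eigenvalue lambda_1 = 0), which is exactly the issue raised in this comment.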

  • @Iamfafafel · 1 month ago

    such great presentation with great insights. so we can interpret a vector as a scalar function on the vertex set. multiplication with the adj matrix can be seen as averaging out this function wrt to its neighbors. so in a way, this is like the linear-algebraic version of the heat equation

    • @Iamfafafel · 1 month ago

      this smoothing interpretation makes the normalized laplacian the undisputed discrete counterpart to the riemannian laplacian. in short, the C0 story of the laplacian is measuring the difference between the function value and its average (this is an adaptation of harmonic functions satisfying the mean-value property). therefore, the discrete analog of this story is to consider I - AD^{-1}, where I is the identity matrix, A is the adjacency matrix, and D is the diagonal degree matrix.
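A small numpy check of this smoothing reading (toy path graph, not from the video): applying the random-walk Laplacian I - D^{-1}A to a function on the vertices returns, at each vertex, the function value minus the average over its neighbors. (For an undirected graph, the comment's I - AD^{-1} is the column-normalized transpose of the same operator.)

    import numpy as np

    # Path graph 0-1-2-3 (toy example)
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    D_inv = np.diag(1 / A.sum(axis=1))          # inverse degree matrix
    L_rw = np.eye(4) - D_inv @ A                # random-walk Laplacian I - D^{-1} A

    f = np.array([3.0, 1.0, 4.0, 1.0])          # a "scalar function" on the vertex set
    neighbor_avg = D_inv @ A @ f                # average of f over each vertex's neighbors
    print(np.allclose(L_rw @ f, f - neighbor_avg))   # True: (Lf)_i = f_i minus its local average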

  • @abdulahadqureshi1104 · 2 months ago

    best explanation of PCA 🔥🔥

  • @DataScientist-ou7sc · 2 months ago

    this is too boring, and the sadder part is that it comes from Stanford uni

  • @andrewdavidson2574 · 2 months ago

    Thank you for this video, it has been very helpful

  • @hermeitus · 2 months ago

    Turkish subtitles please ❤

  • @user-vs6xc7dk1w · 2 months ago

    Thank you so much.

  • @pinpintea-makorn9678 · 2 months ago

    I LOVE JURE!

  • @musashifanboy · 2 months ago

    very clear and straight forward, thank you for making this video

  • @ali_najafzadeh · 2 months ago

    Insightful!!

  • @amerginmilesius2336 · 2 months ago

    At 4:25, I don't understand why there is only one block of size 8. It looks like there should be two blocks of size 8, and the block of size 16 should be shifted by 8 (and then the same reasoning could apply). However, he computes everything as if it were fine. Am I missing something?

  • @pauljay9379 · 2 months ago

    When a user does not learn mic placement 101 when recording a video, a bad user experience is the result. Isn't it ironic?

  • @theo9743 · 2 months ago

    So what's the difference between negative log likelihood and cross-entropy?
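A short numpy illustration of the usual answer (toy numbers, not from the video): for a one-hot target, the cross-entropy between the target and the predicted distribution reduces to the negative log-likelihood of the true class, so in classification the two losses coincide.

    import numpy as np

    probs = np.array([0.1, 0.7, 0.2])       # model's predicted class probabilities (toy values)
    target = 1                              # index of the true class
    one_hot = np.eye(3)[target]

    nll = -np.log(probs[target])                       # negative log-likelihood of the true class
    cross_entropy = -(one_hot * np.log(probs)).sum()   # cross-entropy against the one-hot target
    print(np.isclose(nll, cross_entropy))              # True: the two coincide for hard labels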

  • @yvettecrystal6075 · 3 months ago

    Great introduction.

  • @sutterismine · 3 months ago

    Very well explained, thank you so much!

  • @akshpatel1321 · 3 months ago

    He's a genius.

  • @cooking60210 · 4 months ago

    They are saying that the decomposition is unique, but this isn't true.
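A quick numpy demonstration of one source of non-uniqueness (toy matrix, illustrative only): flipping the signs of a matching left and right singular vector pair leaves the product unchanged, so the factors U and V are not unique even when the singular values are distinct.

    import numpy as np

    A = np.random.default_rng(0).normal(size=(4, 3))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    D = np.diag([1, -1, 1])                 # flip the sign of the second singular vector pair
    U2, Vt2 = U @ D, D @ Vt
    print(np.allclose(A, U2 @ np.diag(s) @ Vt2))   # True: different factors, same matrix A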

  • @VuongSyViet · 4 months ago

    Good!

  • @Government_of_moon · 4 months ago

    5:40 - even humans can't understand those.

  • @astroknight5 · 4 months ago

    I never really understood the concept behind SVD until now. The example in the first minute made everything click!

  • @karthikrajeshwaran1997 · 5 months ago

    Helped me totally grok! Thanks so much prof.

  • @jeevan88888 · 5 months ago

    Great lessons, thank you Stanford.

  • @revimfadli4666 · 5 months ago

    I wonder if liquid/ODE echo state nets could model high-dimensional systems better

  • @D8teen · 5 months ago

    This is the legendary introduction of RMSProp to the wider world; the paper was only written after this. Interestingly enough, almost on the next slide, where Geoffrey recommends which algorithms to pick, he suggests RMSProp with momentum, which is indeed the principle behind Adam, the current industry standard.
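A minimal sketch of the RMSProp update being referred to (hypothetical variable names and a toy objective, not Hinton's slides): the step is divided by a running root-mean-square of past gradients; adding momentum to this rule is close in spirit to Adam.

    import numpy as np

    def rmsprop_step(w, grad, cache, lr=0.1, decay=0.9, eps=1e-8):
        """One RMSProp update: divide the step by a running RMS of past gradients."""
        cache = decay * cache + (1 - decay) * grad**2     # moving average of squared gradients
        w = w - lr * grad / (np.sqrt(cache) + eps)        # per-parameter adaptive step
        return w, cache

    # toy usage on f(w) = w^2, whose gradient is 2w
    w, cache = np.array([5.0]), np.zeros(1)
    for _ in range(100):
        w, cache = rmsprop_step(w, 2 * w, cache)
    print(w)   # heads toward the minimum at w = 0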

  • @greattatsumaki3141 · 6 months ago

    Thanks. I was able to understand the concept of SVD. Very cleanly explained.

  • @CarlintVeld · 6 months ago

    How do I calculate that Fiedler vector?
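One common way to compute it (a numpy sketch with a toy graph, not tied to the lecture): build the Laplacian L = D - A and take the eigenvector belonging to the second-smallest eigenvalue.

    import numpy as np

    # Two triangles {0,1,2} and {3,4,5} joined by the edge 2-3 (toy graph)
    edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
    A = np.zeros((6, 6))
    for i, j in edges:
        A[i, j] = A[j, i] = 1
    L = np.diag(A.sum(axis=1)) - A       # unnormalized Laplacian L = D - A

    vals, vecs = np.linalg.eigh(L)       # eigenvalues in ascending order
    fiedler = vecs[:, 1]                 # eigenvector of the second-smallest eigenvalue
    print(np.sign(fiedler))              # the sign pattern separates the two triangles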

  • @matteolugli1607 · 6 months ago

    When exactly did we pass from x being just a vector to supposing that x is an eigenvector? Thanks in advance.

  • @shubhamsd100 · 6 months ago

    @5:35 - please correct your statement: SVD is applicable to both real and complex matrices.

  • @PotatoMan1491 · 6 months ago

    The prof is excellent at delivering the essential pieces; I find this very informative.

  • @NUCLEAR154 · 6 months ago

    How you describe this study should realistically prove the corrupt person's in authority > though what should correlate are those people that are intent on totally abusing all our fundamental rights which includes the corrupt Malta government staff, lawfirms, Notaries, Police, judges who use Bogus Court Claims to steal all our assets

  • @JusticeForTheHacked · 7 months ago

    And when you don't want to be connected, but are being intellectually trafficked, through frequencies you help promote? like myself, at this time? You don't care, until you are held a silent hostage, ignored, trafficked, silenced.

  • @patrick_bateman-ty7gp · 7 months ago

    The notation for factoring R as Q and P can be made better as follows:
    Q -> matrix of shape (k x m), where m = no. of movies and k = latent vector dimensions
    P -> matrix of shape (k x n), where n = no. of users and k = latent vector dimensions
    R ~ transpose(Q) * P
    This gives a nice view, as the latent vectors are all packed as columns in both matrices P and Q.
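A quick shape check of the notation proposed in this comment (toy sizes, illustrative only):

    import numpy as np

    m, n, k = 4, 6, 2                  # movies, users, latent dimensions (toy sizes)
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(k, m))        # one latent column per movie
    P = rng.normal(size=(k, n))        # one latent column per user

    R_hat = Q.T @ P                    # predicted rating matrix, shape (m, n)
    print(R_hat.shape)                 # (4, 6)
    print(np.isclose(R_hat[1, 3], Q[:, 1] @ P[:, 3]))   # True: rating = dot product of latent columns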

  • @mdzidane9620 · 7 months ago

    Indians are the best teachers

  • @muthukrishnanp6208 · 7 months ago

    Thank you for your clear explanation and great effort, sir. Also, what evaluation techniques are used to evaluate these content-based recommendation systems? Could you please answer? It would be very helpful.

  • @user-iy7rm7dt8x · 7 months ago

    Thank you very much!!!

  • @devhau5 · 7 months ago

    Godfather of AI ❤🙏

  • @gustafkugelberg3906 · 7 months ago

    Hinton is one of the greats in machine learning so how can he mess up the indices this badly?

  • @marcopierrefernandezburgos3116

    Amazing video. All clear now.

  • @justusmzb7441 · 7 months ago

    I would love to see a little explanation of the complexity that this creates, and how it is better than pairwise comparison. How is e.g. the pairwise comparison of band hashes avoided in this circumstance?
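A rough sketch of how the banding trick sidesteps all-pairs comparison (toy minhash signatures and sizes, not the course's implementation): each band of a signature is hashed into a bucket, and only items that land in the same bucket in some band become candidate pairs, so pairwise checks are confined to bucket collisions rather than all item pairs.

    from collections import defaultdict
    import numpy as np

    rng = np.random.default_rng(0)
    signatures = rng.integers(0, 100, size=(5, 12))   # 5 items, minhash signatures of length 12
    signatures[1] = signatures[0]                     # make items 0 and 1 duplicates (toy setup)
    b, r = 4, 3                                       # 4 bands of 3 rows each

    candidates = set()
    for band in range(b):
        buckets = defaultdict(list)
        for item, sig in enumerate(signatures):
            key = tuple(sig[band * r:(band + 1) * r])   # hash key for this item's band
            buckets[key].append(item)
        for items in buckets.values():                  # compare only within a bucket
            for i in range(len(items)):
                for j in range(i + 1, len(items)):
                    candidates.add((items[i], items[j]))

    print(candidates)   # {(0, 1)}: only colliding items are ever compared pairwise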

  • @miguelfsousa · 7 months ago

    Is the equation on slide 22 correct? I mean, the SVD there is not the approximation, right? The optimal-error equation only makes sense when applied to the truncated SVD approximation; otherwise, that subtraction would equal zero.
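A numpy check of the point being made (toy matrix, illustrative only): the full SVD reconstructs A exactly, so the error formula is informative only for the rank-k truncation, where the Frobenius error equals the root of the sum of the discarded squared singular values (Eckart-Young).

    import numpy as np

    A = np.random.default_rng(0).normal(size=(6, 4))
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    print(np.linalg.norm(A - U @ np.diag(s) @ Vt))       # ~0: the full SVD reproduces A exactly

    k = 2
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]             # rank-k truncated SVD
    err = np.linalg.norm(A - A_k, 'fro')
    print(np.isclose(err, np.sqrt((s[k:] ** 2).sum())))  # True: error = sqrt of discarded sigma^2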

  • @WahranRai · 7 months ago

    bla bla bla!!!!

  • @carlnxlvl6455 · 8 months ago

    He was on the wave earlier than GPT.