Lecture 9 : N-Gram Language Models

  • Added 25 Aug 2024

Comments • 6

  • @MohitSharma-lp4cp • 1 year ago +2

    This is one of the best n gram videos I've seen. Straight up no cap

  • @pawanchoure1289 • 2 years ago +1

    N-grams are contiguous sequences of words, symbols, or tokens in a document. In technical terms, they can be defined as the neighboring sequences of items in a document. They come into play when we deal with text data in NLP (Natural Language Processing) tasks.
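    The definition above is easy to see in code. A minimal sketch (the function name `ngrams` and the sample sentence are illustrative, not from the lecture):

    ```python
    def ngrams(tokens, n):
        """Return all contiguous n-grams (as tuples) over a token sequence."""
        return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

    tokens = "the quick brown fox".split()
    # Bigrams: each word paired with its immediate neighbor
    print(ngrams(tokens, 2))
    # [('the', 'quick'), ('quick', 'brown'), ('brown', 'fox')]
    ```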

  • @pawanchoure1289 • 2 years ago +1

    The term smoothing refers to the adjustment of the maximum likelihood estimator of a language model so that it will be more accurate. ... When estimating a language model based on a limited amount of text, such as a single document, smoothing of the maximum likelihood model is extremely important.
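    One concrete instance of this adjustment is add-one (Laplace) smoothing, which keeps unseen n-grams from getting zero probability. A minimal sketch, assuming a bigram model (the helper name `laplace_bigram_prob` and the toy corpus are illustrative):

    ```python
    from collections import Counter

    def laplace_bigram_prob(bigram_counts, unigram_counts, vocab_size, w_prev, w):
        """Add-one smoothed estimate of P(w | w_prev):
        (count(w_prev, w) + 1) / (count(w_prev) + V)."""
        return (bigram_counts[(w_prev, w)] + 1) / (unigram_counts[w_prev] + vocab_size)

    tokens = "the cat sat on the mat".split()
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    V = len(unigrams)

    # A bigram never seen in the corpus still gets a small nonzero probability
    print(laplace_bigram_prob(bigrams, unigrams, V, "cat", "the"))
    ```

    Without the `+1` adjustment, the maximum likelihood estimate would assign probability zero to every bigram absent from the training text, which is exactly the problem smoothing addresses.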

  • @louerleseigneur4532 • 4 years ago +4

    Thanks a lot sir

  • @vivekanand8912 • 3 years ago +2

    JAZZ😉👌

  • @pawanchoure1289 • 2 years ago +2

    completion prediction