HuggingFace Fundamentals with LLMs such as TinyLlama and Mistral 7B

  • Published 4 Feb 2024
  • Chris looks under the hood of Hugging Face models such as TinyLlama and Mistral 7B. In the video, Chris presents a high-level reference model of large language models and uses it to show how tokenization and the AutoTokenizer module from the Hugging Face transformers library work, linking them back to the Hugging Face repository. In addition, we look at the tokenizer config, and Chris shows how Mistral and Llama-2 both use the same tokenizer and embeddings architecture (albeit with different vocabularies). Finally, Chris shows you how to inspect the model configuration and model architecture of Hugging Face models (a brief illustrative sketch of these steps follows the video details below).
    As we start to build towards our own large language model, understanding these fundamentals is critical whether you are a builder or a consumer of AI.
    Google Colab:
    colab.research.google.com/dri...
  • Science & Technology
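
  A minimal sketch (not the exact Colab notebook from the video; the repo ID and printed fields are assumptions) of loading a tokenizer and a model configuration with the Hugging Face transformers library, along the lines the description walks through:

      # Minimal sketch: inspect a Hugging Face model's tokenizer and config.
      # The repo ID is an assumption; swap in e.g. "mistralai/Mistral-7B-v0.1" to compare.
      from transformers import AutoTokenizer, AutoConfig

      model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

      # Tokenizer: text -> sub-word pieces -> integer IDs fed to the embedding layer
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      text = "Hello, world!"
      print(tokenizer.tokenize(text))            # sub-word pieces
      print(tokenizer.encode(text))              # token IDs
      print("vocab size:", tokenizer.vocab_size)

      # Configuration only (reads config.json, no weights): the architecture hyperparameters
      config = AutoConfig.from_pretrained(model_id)
      print(config.model_type, config.hidden_size, config.num_hidden_layers)

  Running the same snippet with both repo IDs is one way to see the comparison the description mentions: the same tokenizer and embedding setup, but different vocabularies and config values.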

Comments • 47

  • @msssouza2
    @msssouza2 1 day ago

    Thanks for another great video, Chris. I've been through some LLM courses on Udemy, but your channel is helping me clear up many doubts I had about the whole thing. I'm glad I found your channel. It's really the best on this subject. Congratulations. Marcelo.

  • @ukaszrozewicz7488
    @ukaszrozewicz7488 5 months ago +9

    The best video I've watched on YouTube about LLMs so far. You explain complex topics in accessible language, clearly and understandably. You are doing a very good job. I'm eagerly waiting for the next videos :)

    • @ThomazMartinez
      @ThomazMartinez 5 months ago +3

      same here

    • @chrishayuk
      @chrishayuk  5 months ago +2

      Wow, thanks! This one actually took a long time to get right, glad you liked it

  • @narenkrishnanGenius
    @narenkrishnanGenius 5 months ago +3

    very well explained and useful

    • @chrishayuk
      @chrishayuk  5 months ago +2

      So glad to hear that, thank you

  • @RejectAnticipatoryObedience
    @RejectAnticipatoryObedience 5 months ago +3

    Insanely valuable video. Thank you!

  • @nguyenhuuanhtuan5360
    @nguyenhuuanhtuan5360 5 months ago +3

    Always awesome content ❤

    • @chrishayuk
      @chrishayuk  5 months ago +2

      Super glad it’s useful, thank you

  • @janstrunk
    @janstrunk 5 months ago +3

    Great video! Looking forward to your next videos…

    • @chrishayuk
      @chrishayuk  5 months ago +2

      Yeah, the next ones in the series will be fun, glad you're enjoying it

  • @BipinRimal314
    @BipinRimal314 5 months ago +2

    Looking really forward to the next video.

  • @tekperson
    @tekperson 5 months ago +2

    Great video. Just the right amount of detail. Thanks.

  • @mindurownbussines
    @mindurownbussines 3 months ago +2

    Thank you so much Chris
    I truly believe if one has a great understanding of a subject he can teach it clearly and you simply did that !
    God bless you

    • @chrishayuk
      @chrishayuk  3 months ago

      You are too kind, and thank you. Glad it was useful

  • @prashantkowshik5637
    @prashantkowshik5637 2 months ago +1

    Thanks a lot Chris.

  • @wadejohnson4542
    @wadejohnson4542 4 months ago

    For the very first time, I finally get it, thanks to you. Thank you for your service to the community.

  • @atifsaeedkhan9207
    @atifsaeedkhan9207 5 months ago +1

    Thanks for being so detailed. That was really a refresher for me. Glad someone like you is doing such good work.

    • @chrishayuk
      @chrishayuk  4 months ago

      thank you, very much appreciate that

  • @Jaypatel512
    @Jaypatel512 4 months ago +1

    Amazing way to get people comfortable with the model architecture. Thank you so much for sharing your knowledge.

  • @javaneze
    @javaneze 4 months ago +1

    great video - many thanks!

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 4 months ago +1

    Excellent tutorial to get started with LLMs.

  • @ilyanemihin6029
    @ilyanemihin6029 5 months ago +1

    Thank you! This video brings light into the black box of LLM magic)

    • @chrishayuk
      @chrishayuk  4 months ago +1

      more to come, the next set of videos reveal a bunch more

  • @kenchang3456
    @kenchang3456 5 months ago +2

    Excellent explanation. Although I don't have a use case to fine-tune a model currently, I presume I will eventually, so it'll be great to have what you've shared in my back pocket. Thanks a bunch.

  • @AncientSlugThrower
    @AncientSlugThrower 5 months ago +1

    Great video.

  • @MannyBernabe
    @MannyBernabe 5 months ago +1

    great video. Thx!

  • @john6268
    @john6268 5 months ago +2

    How does the tokenizer decode sub-word embeddings? Specifically, how do you determine which sequence is concatenated into a word vs. standing on its own? As shown, the answer would be decoded with spaces between the embeddings, which wouldn't make "Lovelace" into a word.

    • @chrishayuk
      @chrishayuk  5 months ago +2

      Certain tokens will have spaces, others won't, so _lace would be a different token from lace (see the sketch after this thread). I have a deep dive on the tiktoken tokenizer where I spend a lot of time on this. I'm planning to do a "building a tokenizer" vid soon as part of this series

    • @chrishayuk
      @chrishayuk  5 months ago

      how the tokenizer for gpt-4 (tiktoken) works and why it can't reverse strings
      czcams.com/video/NMoHHSWf1Mo/video.html

    • @john6268
      @john6268 5 months ago

      @@chrishayuk Thanks, I'll check out the other video and looking forward to the next one.
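
      Following up on the sub-word question above, here is a minimal sketch (assuming the SentencePiece-style tokenizer used by Llama-2 and Mistral; the repo ID and the exact pieces printed are assumptions) of how word boundaries survive the round trip: pieces that begin a word carry a leading '▁' marker while continuation pieces do not, so decoding re-inserts spaces only at word starts and "Lovelace" comes back as a single word.

          # Minimal sketch: how a SentencePiece-style tokenizer (as used by Llama-2 / Mistral)
          # keeps word boundaries. The repo ID is an assumption for illustration.
          from transformers import AutoTokenizer

          tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")

          text = "Ada Lovelace"
          print(tokenizer.tokenize(text))  # word-initial pieces carry a leading '▁'; continuation pieces don't

          ids = tokenizer.encode(text, add_special_tokens=False)
          print(tokenizer.decode(ids))     # "Ada Lovelace" - spaces restored only at '▁'-marked pieces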

  • @tec-earning8672
    @tec-earning8672 5 months ago +1

    Great job, sir. One video request: how to build Llama APIs? I've trained my own model and now I want to use it in my website.

    • @chrishayuk
      @chrishayuk  5 months ago +1

      That’s where we are working up to, but you can check out my existing fine-tuning Llama-2 video

  • @huiwencheng4585
    @huiwencheng4585 5 months ago +2

    Bro, just turn on the Big Thanks so I can donate to you

    • @chrishayuk
      @chrishayuk  4 months ago

      lol, not gonna happen but appreciate the gesture and glad you like the videos