Ollama.ai to install llama2 | Local Language Models on your machine | open source LLM

  • Added 28. 08. 2024

Comments • 15

  • @YTDIMIR
    @YTDIMIR 10 months ago +1

    Good one! You should do another video on how to connect / train a local Llama 2 instance with your own data. Cheers.

    • @techwithmarco
      @techwithmarco 10 months ago

      Thanks for the idea! Will definitely try to find some sneaky cool ways to do that :)
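
A minimal sketch of one common way to do the "your own data" part: retrieval-augmented generation (RAG) against Ollama's local REST API, rather than actual fine-tuning. It assumes `ollama serve` is running with the llama2 model pulled, and uses the documented /api/embeddings and /api/generate routes; the example documents are made up. Treat it as an illustration, not the video's method:

```python
# Minimal RAG sketch against a local Ollama server.
# Assumes `ollama serve` is running and `ollama pull llama2` was done.
import requests

OLLAMA = "http://localhost:11434"

def embed(text: str) -> list[float]:
    # Ollama's embeddings endpoint returns {"embedding": [...]}.
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "llama2", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

# Your own data: here just a few strings; in practice, chunked documents.
docs = [
    "Our office is closed on public holidays.",
    "Support tickets are answered within 24 hours.",
    "The VPN endpoint is vpn.example.com.",
]
doc_vecs = [embed(d) for d in docs]

question = "How fast do you answer support tickets?"
q_vec = embed(question)

# Retrieve the most similar document and stuff it into the prompt.
best = max(range(len(docs)), key=lambda i: cosine(q_vec, doc_vecs[i]))
prompt = f"Answer using only this context:\n{docs[best]}\n\nQuestion: {question}"

r = requests.post(f"{OLLAMA}/api/generate",
                  json={"model": "llama2", "prompt": prompt, "stream": False})
print(r.json()["response"])
```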

  • @consig1iere294
    @consig1iere294 8 months ago

    Hi, can you please make a video on how to use Ollama with an already downloaded LLM? Bonus points if you can show it on Windows (WSL) :)

    • @techwithmarco
      @techwithmarco 7 months ago

      You mean a self-downloaded LLM and how to get that into Ollama, or an LLM downloaded through Ollama and how to use it?
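
For the first case (a model file you downloaded yourself, e.g. a GGUF), Ollama supports a Modelfile whose FROM line points at a local file. A rough sketch, with a hypothetical file name and model name; inside WSL the same commands apply as on Linux:

```python
# Sketch: import a self-downloaded GGUF model into Ollama via a Modelfile.
# The .gguf file name below is a placeholder; point it at your actual file.
import pathlib
import subprocess

modelfile = pathlib.Path("Modelfile")
modelfile.write_text("FROM ./llama-2-7b-chat.Q4_K_M.gguf\n")

# Register the weights with Ollama under a name of your choice...
subprocess.run(["ollama", "create", "my-llama2", "-f", str(modelfile)],
               check=True)

# ...then use it like any pulled model.
subprocess.run(["ollama", "run", "my-llama2", "Say hello!"], check=True)
```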

  • @joekustek2623
    @joekustek2623 3 months ago

    How can I run this on my website or in a browser instead of a terminal window?
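
Ollama is not terminal-only: while running, it serves an HTTP API on port 11434 that a website's backend (or, with cross-origin requests allowed, browser JavaScript) can call. A minimal sketch in Python, assuming a local server with llama2 pulled; the same POST body works from a browser fetch call:

```python
# Sketch: call a locally running Ollama server over HTTP instead of
# using the terminal.
import requests

r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2",
          "prompt": "Explain local LLMs in one sentence.",
          "stream": False},  # set True to stream tokens as they arrive
)
print(r.json()["response"])
```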

  • @NeilMartin98
    @NeilMartin98 10 months ago

    Great video, local LLMs are gaining a lot of traction in research

    • @techwithmarco
      @techwithmarco 9 months ago

      Yes, it's a very interesting topic, and also a great way to learn how things work!

  • @vincentdupont
    @vincentdupont 10 months ago

    Thanks for this very insightful video. In the "start your local LLM" section you mention memory usage; are you talking about the graphics card's RAM or the CPU RAM? How do they relate to each other in the LLM runtime? Thanks

    • @techwithmarco
      @techwithmarco 9 months ago +2

      I was talking about the standard RAM of the computer, so not the GPU RAM :)
      But if you want to run Ollama on a remote machine at no cost, so you don't need to run it on your local machine, I just made a video about that. Feel free to check it out!
      czcams.com/video/Qa1h7ygwQq8/video.html

  • @buggingbee1
    @buggingbee1 9 months ago +1

    What are your laptop's specs? RAM, VRAM, CPU, GPU, SSD

    • @techwithmarco
      @techwithmarco 9 months ago +1

      I have a MacBook Pro with an M1 Pro chip, 16 GB of RAM, and a 512 GB SSD.

  • @sourabhsahu4048
    @sourabhsahu4048 10 months ago

    Can we use it on Google Colab?

    • @techwithmarco
      @techwithmarco 10 months ago

      Yes, for sure!
      It's just as installable as anything else on Google Colab, so you can even use bigger language models.

    • @sourabhsahu4048
      @sourabhsahu4048 10 months ago +1

      If you have time, and only if you want to, can you show us how to do it? It's not urgent, but it's always better for me to know how to do something 😅

    • @techwithmarco
      @techwithmarco 9 months ago +1

      Hey @sourabhsahu4048,
      I just made a video on how to run it on Google Colab :) Thanks for the inspiration!
      Feel free to check it out, and I'd be more than happy if you left some feedback :)
      czcams.com/video/Qa1h7ygwQq8/video.html
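
For reference, a rough sketch of what running Ollama in a Colab notebook cell can look like. It assumes the official Linux install script URL (https://ollama.com/install.sh) and the llama2 model; the linked video is the authoritative walkthrough:

```python
# Rough sketch of running Ollama inside a Google Colab cell. Assumes the
# official Linux install script at https://ollama.com/install.sh.
import subprocess
import time
import requests

# Install Ollama (Colab VMs are Linux, so the Linux installer applies).
subprocess.run("curl -fsSL https://ollama.com/install.sh | sh",
               shell=True, check=True)

# Start the server in the background and give it a moment to come up.
server = subprocess.Popen(["ollama", "serve"])
time.sleep(5)

# Pull a model, then query it over the local API.
subprocess.run(["ollama", "pull", "llama2"], check=True)
r = requests.post("http://localhost:11434/api/generate",
                  json={"model": "llama2",
                        "prompt": "Hello from Colab!",
                        "stream": False})
print(r.json()["response"])
```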