How to run ChatGPT on your own laptop for free? | GPT4All

  • Published 21 July 2024
  • AI-based tools like ChatGPT have EXPLODED these days: every week there's new development and crazy ideas about what you can do with them. I've done a few videos already on what you can accomplish with the OpenAI APIs or ChatGPT, but today's topic is a bit different. I want to share how you can run a large language model locally, and why you would want to do that.
    It's not quite the same as ChatGPT, of course, not today, but there are still good reasons why you might want to do it. Join me for a bit and find out why, how, and where.
    As always, if you like the video, give me a thumbs-up and subscribe to my channel for more. And leave any questions, comments, and feedback below the video. All activity helps others find these videos and helps my channel grow.
    Here are some links from the video:
    - github.com/nomic-ai/gpt4all
    - huggingface.co/datasets/nomic...

Comments • 25

  • @yakymoto · a year ago · +10

    Hell yeah! I would love to see a guide from you on how to train and/or improve the model. Cheers!

    • @DevXplaining · a year ago · +3

      Hahaa, thank you, that's definitely a solid vote for it then :)

  • @martinf5129 · a year ago

    Oh yes, I'd definitely like to see how you train this model 😊 I really appreciate that 👍👍

    • @DevXplaining · a year ago · +1

      Ah, I didn't train this model, I just used pretrained ones. Much more powerful ones are already available now. Interesting times!

  • @valibb1 · a year ago · +4

    Thank you for your good explanation. Please correct me if I understand this right: "GPT4 is a general computational machine based on a trained model. If I train this machine with data from other fields, for example math, physics, etc., can I chat with this machine about those fields?"

    • @DevXplaining · a year ago · +2

      Hi, thank you for your feedback. Yes, I think you got that right: you can fit/fine-tune a more general model to be aware of a documentation base, so its answers can draw on that documentation as well as on the pre-training. GPT models typically come extensively pre-trained, which gives them a very nice base, but there is a lot of interest in having them answer based on a defined set of documents. There are a few ways to approach this, depending on the exact needs, and I might be tempted to make another video on the details later on, if there is some interest in it.

  • @longjimmy2620 · 10 months ago · +1

    Hi. I appreciate your tutorial. One question: can I install it on Chrome OS?

    • @DevXplaining · 10 months ago

      As far as I know, no. At least I haven't seen any indication that it would be supported. Your best bet would be to use the hosted services, or to set up a Linux machine in the cloud and remote into it.

  • @smartpicksonline · 10 months ago · +1

    Thanks for this. I hear GPT4All now has an API. Any idea how to set this up?

    • @DevXplaining · 10 months ago

      Oh yes, it has a great API, with bindings for Python, for example. I kind of skipped the API part because I went with LangChain, which also includes an API for GPT4All, but I might do a video on the GPT4All native API too if it becomes relevant.
      Right now I'm tracking all the open-source models to see which one wins the race, but the API documentation and setup are fairly simple, here:
      docs.gpt4all.io/gpt4all_api.html
      docs.gpt4all.io/index.html
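As a rough illustration of what the Python bindings look like, here is a minimal sketch. Assumptions: the package has been installed with `pip install gpt4all`, and the model file name below is only an example; the library downloads the model into a local cache on first use.

```python
# Minimal sketch of the GPT4All Python bindings (assumed installed via
# `pip install gpt4all`). The model file name is only an example; the
# library downloads it on first use, and it is a sizeable file.
EXAMPLE_MODEL = "orca-mini-3b-gguf2-q4_0.gguf"  # example/hypothetical model file name

def ask_locally(prompt: str, max_tokens: int = 128) -> str:
    """Load a local GPT4All model and generate a reply on the CPU."""
    from gpt4all import GPT4All  # imported here so the sketch parses without the package
    model = GPT4All(EXAMPLE_MODEL)
    with model.chat_session():
        return model.generate(prompt, max_tokens=max_tokens)

# Usage (commented out here to avoid triggering the model download):
# print(ask_locally("Why run a language model locally?"))
```

LangChain's GPT4All integration wraps these same bindings, so either route ends up at the same local model.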

  • @m.goedeker7381 · 11 months ago · +1

    Can you do fine-tuning and training? I want to include data from 2022-2023. I am already using the LocalDocs plugin.

    • @DevXplaining · 11 months ago

      Oh, definitely. But I have to warn you: fine-tuning requires computational resources and time, much more than using the LocalDocs plugin (which indexes your documents and injects a bit more context). Furthermore, fine-tuning is a machine-learning task that requires a bit more understanding to be successful. It also requires data, and the more you have, the better. I haven't made a video on fine-tuning GPT4All, but after all these caveats, it's definitely something you can do, and there are some tutorials for it.
      What is right for you depends a lot on how much data you have and what you want to accomplish. There are at least three ways to bring in data from 2022-2023 and beyond:
      - Fine-tuning: take the model and keep training it with more data. This is also the only way to change the nature of the model itself. As mentioned, it is the most expensive and slowest option, and it needs a lot of data to do you much good.
      - LocalDocs plugin or similar: bring in extra content as context for the queries. No training required, just a well-indexed database to find and prioritize what is relevant. For very small queries, you can simply use a large token count and include the context as part of the input. The model works mostly as it did before, but with extra information.
      - Web access: if the model has the ability to fetch more data from search engines and the Internet, it can stay very much up to date without fine-tuning or retraining every year. This also keeps the model itself unchanged while injecting more information.
      I think you hit an interesting area with this question, as that is what most organizations are figuring out right now: how to combine the model with the most recent, up-to-date information or a custom knowledge base.
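The second option above (injecting extra context) can be shown with a tiny, hypothetical sketch. A real LocalDocs-style plugin uses embeddings and a vector index; this toy version just ranks local documents by keyword overlap and prepends the best match to the prompt.

```python
# Toy sketch of context injection (option 2 above). Real plugins use embeddings
# and a vector index; this hypothetical version ranks documents by keyword overlap.
def score(question: str, doc: str) -> int:
    """Count how many words from the question appear in the document."""
    q_words = set(question.lower().split())
    return sum(1 for word in doc.lower().split() if word in q_words)

def build_prompt(question: str, docs: list[str]) -> str:
    """Inject the most relevant local document as context for the model."""
    best = max(docs, key=lambda d: score(question, d))
    return f"Context:\n{best}\n\nQuestion: {question}\nAnswer:"

docs = [
    "GPT4All runs large language models locally on the CPU.",
    "Bananas are rich in potassium.",
]
prompt = build_prompt("How does GPT4All run models locally?", docs)
print(prompt.splitlines()[1])  # -> GPT4All runs large language models locally on the CPU.
```

The resulting prompt string would then be passed to the local model as normal input, which is why this approach needs no training at all.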

    • @DevXplaining · 11 months ago

      Oh, and here's a good blog post on the topic in case you would like to try that route.
      medium.com/@xiucat/gpt4all-train-with-local-data-for-fine-tuning-6efefb07c101
      Note: for GPT4All, fine-tuning is a bit of a work in progress, meaning you can do it, but it's a bit tricky. I think a more user-friendly feature for this is coming in the future, as part of the distribution, from the GPT4All team.
      github.com/nomic-ai/gpt4all/issues/15

  • @Cyberfoxxy · a year ago · +2

    Yeah, with the current velocity, OpenAI's dominance is very temporary. They should crank up the price while they are still relevant.

    • @DevXplaining · a year ago

      Yeah, it's not all about the price either; ownership of the data is what's interesting for me. But exciting times! These things are evolving like crazy right now!

  • @polyrics5857 · a year ago · +1

    Can I use Google Colab?

    • @DevXplaining · a year ago · +1

      Haven't used Colab, but since you can run it on a Raspberry Pi, sure, why not. Any corner of a machine should be fine. Not even a GPU is required.

  • @polyrics5857 · a year ago · +4

    Can we feed it our own data?

    • @DevXplaining · a year ago · +6

      It's possible, but that's a topic for another video, another day. There are a few ways to approach it.

    • @dragon3602010 · a year ago · +1

      Awesome🎉🎉🎉 thanks

  • @vaisakhkm783 · a year ago · +2

    Yes, please do a video on how to improve it with personal data.
    🥲 I really want a personal assistant that I can trust with my data...

    • @vaisakhkm783 · a year ago

      😅 Can I train it on our textbook and make it do my assignments?

    • @DevXplaining · a year ago · +1

      Hi, thank you! There have been a few requests already, so I think I'd better start finding the time to dig into this :)

    • @naimneman4216 · a year ago

      Could you train it with the notes you take? I'm imagining a kind of personal AI professor!

    • @aftermath7 · a year ago

      @naimneman4216 Same, I have the same idea of an AI for education.