Build the LLM OS | Autonomous LLMs as the new Operating System
- added 11 May 2024
- Let's build the `LLM OS`, inspired by the great Andrej Karpathy
Can LLMs be the CPU of a new operating system and solve problems using:
💻 software 1.0 tools
🌎 internet browsing
📕 knowledge retrieval
🤖 communication with other LLMs
Code: git.new/llm-os
⭐️ Phidata: git.new/phidata
Questions on Discord: phidata.link/discord
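The pitch above (an LLM acting as the CPU of a new OS, dispatching work to tools) can be sketched as a tiny loop. This is an illustrative toy, not phidata's actual code: the router is a rule-based stand-in for the model, and both tools are hypothetical.

```python
# Toy sketch of the "LLM as CPU" idea: a router (a stand-in for the LLM)
# picks a tool for each request and the kernel loop executes it.

def calculator(task: str) -> str:
    # Software-1.0 tool: evaluate a simple arithmetic expression.
    return str(eval(task, {"__builtins__": {}}))

def knowledge_lookup(task: str) -> str:
    # Stand-in for knowledge retrieval from a store.
    kb = {"llm os": "An LLM orchestrating tools, memory, and other assistants."}
    return kb.get(task.lower(), "not found")

TOOLS = {"calc": calculator, "kb": knowledge_lookup}

def llm_cpu(request: str) -> str:
    """Route a 'tool: task' request to the right tool, like an LLM kernel."""
    tool_name, _, task = request.partition(":")
    tool = TOOLS.get(tool_name.strip())
    if tool is None:
        return f"unknown tool: {tool_name}"
    return tool(task.strip())

print(llm_cpu("calc: 2 + 3"))   # -> 5
print(llm_cpu("kb: LLM OS"))
```

In the real system the routing decision is made by the LLM itself from the tool descriptions, not by string parsing.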
Really like the idea of selecting assistants for a specific task
You are doing an amazing job ❤
Amazing stuff Ashpreet, thank you
Beautiful, congrats!
Thank you! Really nice work.
Can I swap out OpenAI for something like Groq?
another great video
Wow, this is so cool! Is the chat history also saved to memory or a database? Maybe add an option so that when you let it do research, you can save the research and later ask questions about it. Or a tool triggered when you say "save history" in the chat; or one that first summarises the chat history and saves it, or saves only the last answer/research. And after that, export the last answer/research to a .md or PDF. So cool to see an LLM as a CPU.
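The export idea in the comment above is easy to prototype. A minimal sketch of the save-to-markdown part only (phidata itself persists runs to a database; the message data here is hypothetical):

```python
from pathlib import Path

# Toy sketch: persist a chat history to a markdown file so a later
# session can re-load it and answer questions about past research.

history = [
    {"role": "user", "content": "research the LLM OS"},
    {"role": "assistant", "content": "Here is a summary of the LLM OS..."},
]

def save_markdown(messages: list[dict], path: str) -> None:
    """Write each message as a '**role**: content' markdown paragraph."""
    lines = [f"**{m['role']}**: {m['content']}" for m in messages]
    Path(path).write_text("\n\n".join(lines))

save_markdown(history, "chat_history.md")
print(Path("chat_history.md").read_text().startswith("**user**"))   # -> True
```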
Question: I tried LangGraph yesterday and had to make a router node (which can have a lot of options) that generates the next step in the route. Is this the same as the CPU in the LLM OS, or is it different? Thanks for your great work, love it ❤️❤️❤️
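For context on the question above: a LangGraph-style router node and the LLM OS "leader" assistant play a similar role, in that both pick the next worker for a step. A toy sketch, with a keyword rule standing in for the LLM's routing decision and hypothetical assistant names:

```python
# Toy router node: route() inspects the state and names the next node,
# which is conceptually what the LLM OS leader does when delegating.

def route(state: dict) -> str:
    query = state["query"].lower()
    if "search" in query or "news" in query:
        return "research_assistant"
    if "code" in query or "python" in query:
        return "python_assistant"
    return "respond"

NODES = {
    "research_assistant": lambda s: {**s, "result": "searched the web"},
    "python_assistant": lambda s: {**s, "result": "ran some python"},
    "respond": lambda s: {**s, "result": "answered directly"},
}

def run(query: str) -> dict:
    state = {"query": query}
    next_node = route(state)   # the router picks which edge to follow
    return NODES[next_node](state)

print(run("search for LLM OS news")["result"])   # -> searched the web
```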
Great presentation! Can we use the LLM OS with Ollama?
Someone said it, so I'll say it again: how is this different from the conversational agents LangChain showed how to build about a year ago? Is it a nicer API? (LangChain at this point is like bloatware.) What is the new pattern here? We've always had agents, where agent = LLM + knowledge + memory + tools. Otherwise, nice demo!
"Research new/updated functionality required to improve this LLM OS App and produce a new version of it every day at noon"
... Delegating task to Research Assistant ---> Redirecting Task to Python Assistant
Why Postgres for memory? Is there something about memory that makes Postgres a good fit?
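One likely answer to the question above: with the pgvector extension, Postgres can hold both structured chat history and vector embeddings in a single database, so one store serves memory and knowledge retrieval. Conceptually, vector recall works like this toy in-memory version (hand-picked 3-d embeddings instead of real ones):

```python
import math

# Toy in-memory version of what pgvector does inside Postgres:
# store (text, embedding) rows and retrieve by cosine similarity.

memory: list[tuple[str, list[float]]] = [
    ("user likes dark mode", [0.9, 0.1, 0.0]),
    ("user is learning rust", [0.1, 0.9, 0.2]),
]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recall(query_embedding: list[float]) -> str:
    # Roughly: SELECT text FROM memory ORDER BY embedding <=> $1 LIMIT 1;
    return max(memory, key=lambda row: cosine(query_embedding, row[1]))[0]

print(recall([0.95, 0.05, 0.0]))   # -> user likes dark mode
```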
When can we see the same application running with local LLMs via the Ollama or LM Studio API?
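Regarding the Groq/Ollama questions in this thread: both providers expose OpenAI-compatible HTTP endpoints, so one common pattern is to keep an OpenAI-style client and swap the base URL. A sketch of the configuration only (the endpoint paths are the documented defaults at the time of writing, and the model names may have changed, so verify against each provider's docs):

```python
# Provider table for an OpenAI-compatible client. Model names here are
# examples and may be outdated; check each provider's current catalog.

PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o"},
    "groq":   {"base_url": "https://api.groq.com/openai/v1", "model": "llama3-70b-8192"},
    "ollama": {"base_url": "http://localhost:11434/v1", "model": "llama3"},
}

def client_config(provider: str) -> dict:
    """Return kwargs you could pass to an OpenAI-style client constructor."""
    cfg = PROVIDERS[provider]
    return {"base_url": cfg["base_url"], "default_model": cfg["model"]}

print(client_config("ollama")["base_url"])   # -> http://localhost:11434/v1
```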
Where do you enter the API key? Apologies, I'm pretty new to this.
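Typically the key is read from an environment variable rather than entered in the app itself. A minimal sketch (the key value below is a placeholder, not a real key):

```python
import os

# Most OpenAI-based tools read OPENAI_API_KEY from the environment.
# Usually you set it in your shell (export OPENAI_API_KEY=sk-...);
# for a quick local test you can also set it from Python before
# constructing the client:

os.environ.setdefault("OPENAI_API_KEY", "sk-your-key-here")  # placeholder
print("key is set:", "OPENAI_API_KEY" in os.environ)   # -> key is set: True
```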
I was thinking the same thing: integrate an LLM into Linux.
I think an entire OS is too much for an LLM, but I expect to use one as a Linux shell in the near future.
NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
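The 404 above means the account doesn't have access to the `gpt-4o` model. A generic workaround is to fall back to a model the account can use; a sketch with a hypothetical set of accessible models:

```python
# Hypothetical set of models this account can access; in practice you
# would query the provider's model-list endpoint instead.
AVAILABLE = {"gpt-4-turbo", "gpt-3.5-turbo"}

def pick_model(preferred: list[str]) -> str:
    """Return the first preferred model the account can actually use."""
    for model in preferred:
        if model in AVAILABLE:
            return model
    raise RuntimeError("no accessible model found")

print(pick_model(["gpt-4o", "gpt-4-turbo"]))   # -> gpt-4-turbo
```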
Interesting title, but I'm disappointed to see that it's just a website wrapper for GPT-4. I was expecting some level of kernel development that integrates AI into the core, using GPT as an input device like a mouse or keyboard.
Sorry about that
What would you like instead of a web interface? A CLI? I'm also building a minimal LLM OS implementation and honestly don't like the web interface. It would be helpful to know your thoughts.
@yashwanthsai3401 I like the web UI, but it doesn't make sense to call it an operating system. Although an OS is technically just software that runs on a computer and runs other software, at first glance I thought:
- the AI would sit right on top of the kernel instead of Windows or Linux (this should be possible; I saw a YouTuber build a Tetris OS that just plays Tetris and nothing else)
- it would use the various models to interact with peripheral devices
- instead of reading text out of a document, it could open it in the default text editor.
Basically, all the manual stuff like searching for a file, sorting a folder, or looking for an image would be handled by natural language, and things like document editing could have assistance without being intrusive.
If it were up to me, I would just drop the "OS" from the name and get a mascot like Stapler (friend of Clippy) that floats on the screen and helps using AI.
Now you're just renaming an "agent" as an "OS". There is nothing novel here: you're just taking LLM processes and equating them to OS processes. It's all quite pointless and doesn't expose anything new or interesting.
@avi7278 We're implementing what Karpathy proposes as the LLM OS. If you share something novel, I'm happy to build it :)
@phidata How about storing and reproducing data from vector space to rebuild the original file without the original file? You know, like a novel LLM-driven file system that generates formats that can only be decrypted and read by the LLM (a specifically fine-tuned one), not by your OS's actual file system, just called the LLM OS file system 😂
@avi7278 Haha, this is a very good one, I'll get on it :) By the way, love the feedback.
You are nothing new or interesting, therefore no new knowledge is interesting to you.
@toddai2721 Good one, bro. Wow, deep.