Tirendaz AI
Getting Started with GPT-4o API for Beginners
In this video, we covered how to use the GPT-4o model with the OpenAI API. We performed generative AI tasks such as image, audio, and video processing to see the power of this model. A minimal code sketch of the image-processing step follows the chapter list below.
▶ Subscribe: bit.ly/subscribe-tirendazai
▶ Join my channel: bit.ly/join-tirendazai
00:01 Introduction
01:09 Setup
02:43 Set the OpenAI API key
03:44 Configure the client
05:44 Image processing
07:34 Video processing
10:31 Video summary
11:37 Audio summary
13:00 Audio + visual summary
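The following is a minimal, hedged sketch of the image-processing step (05:44), assuming the official openai Python package and an OPENAI_API_KEY set in the environment; the prompt text is an illustrative placeholder, and the image URL is the one linked in this description.

```python
# Minimal sketch: sending an image URL to GPT-4o via the Chat Completions API.
# Assumes `pip install openai` and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

IMAGE_URL = (
    "https://upload.wikimedia.org/wikipedia/commons/e/e2/"
    "The_Algebra_of_Mohammed_Ben_Musa_-_page_82b.png"
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what is shown on this page."},
                {"type": "image_url", "image_url": {"url": IMAGE_URL}},
            ],
        }
    ],
    max_tokens=300,
)
print(response.choices[0].message.content)
```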
RELATED VIDEOS:
▶ PandasAI Tutorials: bit.ly/pandasai
▶ Ollama Tutorials: bit.ly/ollama-tutorials
▶ LangChain Tutorials: bit.ly/langchain-tutorials
▶ Generative AI for DS: bit.ly/genai-for-data-science
▶ HuggingFace Tutorials: bit.ly/hugging-face-tutorials
▶ LLMs Tutorials: bit.ly/llm-tutorials
FOLLOW ME:
▶ Medium: tirendazacademy.medium.com
▶ X: TirendazAcademy
▶ LinkedIn: www.linkedin.com/in/tirendaz-academy
Don't forget to subscribe, like the video, and turn on notifications so you don't miss new videos.
▶ Project file: github.com/TirendazAcademy/GPT-4o-Tutorials/tree/main/Getting-Started-with-GPT-4o
▶ Image link: upload.wikimedia.org/wikipedia/commons/e/e2/The_Algebra_of_Mohammed_Ben_Musa_-_page_82b.png
Hi guys, I'm a PhD and a content creator focused on generative AI and data science. I aim to create engaging content that makes the latest AI technologies understandable for everyone.
#ai #generativeai #datascience
513 views

Video

MySQL database with PandasAI & Ollama & Streamlit
4K views • 1 month ago
Today, we'll cover how to analyze data in a MySQL database with local Llama 3 using Pandas AI, Ollama and Streamlit for free. Happy learning. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 01:06 Setup 02:47 Create a new database 04:43 Load the model with Ollama 06:41 Initialize the model 08:29 Connect to the database 10:45 Build the app 12:...
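A rough sketch of the pattern this video follows, assuming the pandasai 2.x MySQLConnector and LocalLLM APIs with Ollama serving llama3 on its default port; the connection details, table name, and prompt below are placeholders, not values from the video.

```python
# Minimal sketch: querying a MySQL table through PandasAI with a local
# Llama 3 model served by Ollama (default endpoint http://localhost:11434).
from pandasai import SmartDataframe
from pandasai.connectors import MySQLConnector
from pandasai.llm.local_llm import LocalLLM

# Llama 3 exposed through Ollama's OpenAI-compatible API
llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

mysql = MySQLConnector(
    config={
        "host": "localhost",
        "port": 3306,
        "database": "mydb",     # placeholder database
        "username": "root",     # placeholder credentials
        "password": "password",
        "table": "orders",      # placeholder table
    }
)

df = SmartDataframe(mysql, config={"llm": llm})
print(df.chat("What are the total sales by country?"))
```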
How to Use PandasAI Agent with Llama 3 using Ollama?
3.4K views • 1 month ago
Today, we'll cover how to perform data analysis and visualization with local Meta Llama 3 using Pandas AI agent and Ollama for free. Happy learning. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 01:12 Setup 02:47 Load the model with Ollama 04:37 Initialize the model 07:43 Build the app 10:54 Inference 13:15 Data visualization RELATED VIDEO...
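A minimal sketch of the PandasAI Agent pattern covered here, assuming pandasai 2.x and a local llama3 served by Ollama; the CSV file and prompts are placeholders.

```python
# Minimal sketch: a PandasAI Agent backed by a local Llama 3 model via Ollama.
import pandas as pd
from pandasai import Agent
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

data = pd.read_csv("sales.csv")            # placeholder dataset
agent = Agent(data, config={"llm": llm})

print(agent.chat("Which five products have the highest revenue?"))
# Follow-up questions reuse the same conversation context:
agent.chat("Plot them as a bar chart.")    # saves the chart as a PNG
```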
How to Use Llama 3 with PandasAI and Ollama Locally
15K views • 1 month ago
Today, we'll cover how to perform data analysis and visualization with local Meta Llama 3 using Pandas AI and Ollama for free. Happy learning. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 01:32 Setup 03:02 Initialize the model 05:15 Initialize the app 08:10 Build the app 09:18 Inference 11:16 Data visualization RELATED VIDEOS: ▶ PandasAI ...
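A minimal sketch of the SmartDataframe workflow shown in this video, again assuming pandasai 2.x and llama3 running locally under Ollama; the dataset and prompts are placeholders.

```python
# Minimal sketch: chatting with a pandas DataFrame through SmartDataframe.
import pandas as pd
from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

df = pd.read_csv("sales.csv")                    # placeholder dataset
sdf = SmartDataframe(df, config={"llm": llm})

print(sdf.chat("What is the average sale amount per country?"))
sdf.chat("Plot sales by country as a bar chart.")  # chart is saved as a PNG
```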
AUTOMATE Your Blog Posts with CREWAI and OLLAMA
940 views • 1 month ago
CrewAI tutorial with Ollama and LangChain using the local mistral model step-by-step for free. Automate your blog posts with AI Agents Locally. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 00:57 Setup 01:27 Installation 04:01 Load the model with Ollama 05:21 Initialize the local model 07:18 Create agents 10:23 Define tasks 13:16 Build a c...
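A rough sketch of the CrewAI-plus-Ollama setup described here, assuming crewai with a LangChain Ollama LLM and the local mistral model; the roles, goals, and task descriptions are illustrative placeholders, not the ones used in the video.

```python
# Minimal sketch: two CrewAI agents collaborating on a blog post,
# both driven by a local mistral model served through Ollama.
from crewai import Agent, Task, Crew
from langchain_community.llms import Ollama

llm = Ollama(model="mistral")

researcher = Agent(
    role="Researcher",
    goal="Collect key points about a given topic",
    backstory="An analyst who gathers accurate background information.",
    llm=llm,
)
writer = Agent(
    role="Writer",
    goal="Turn research notes into a short blog post",
    backstory="A technical writer with a clear, friendly style.",
    llm=llm,
)

research_task = Task(
    description="Research the topic 'running LLMs locally' and list the main points.",
    expected_output="A bullet list of key points.",
    agent=researcher,
)
write_task = Task(
    description="Write a short blog post based on the research notes.",
    expected_output="A blog post of roughly 300 words in markdown.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research_task, write_task])
print(crew.kickoff())
```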
Llama-3 70B vs GPT-4 with OpenAI API & Groq API
2.2K views • 1 month ago
Today, we'll compare Llama-3 70B and GPT-4 for data analysis using Python and build chatbot apps with Streamlit. Happy learning. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 01:07 Setup 03:21 Set environment variables 04:27 Load the dataset 05:44 Create the agent 08:58 Chatbot for Llama-3 10:34 Compare GPT4 & Llama3 RELATED VIDEOS: ▶ Lang...
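A minimal sketch of the comparison setup, assuming langchain_groq, langchain_openai, and the pandas-dataframe agent from langchain_experimental; the dataset, prompt, and exact model names are assumptions, and GROQ_API_KEY / OPENAI_API_KEY are read from the environment.

```python
# Minimal sketch: the same pandas-dataframe agent built once with Llama-3 70B
# on Groq and once with GPT-4 via OpenAI, answering the same question.
import pandas as pd
from langchain_groq import ChatGroq
from langchain_openai import ChatOpenAI
from langchain_experimental.agents import create_pandas_dataframe_agent

df = pd.read_csv("sales.csv")  # placeholder dataset

models = {
    "Llama-3 70B (Groq)": ChatGroq(model_name="llama3-70b-8192", temperature=0),
    "GPT-4 (OpenAI)": ChatOpenAI(model="gpt-4", temperature=0),
}

for name, llm in models.items():
    agent = create_pandas_dataframe_agent(
        llm, df, verbose=False, allow_dangerous_code=True
    )
    answer = agent.invoke({"input": "Which country has the highest total sales?"})
    print(f"{name}: {answer['output']}")
```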
Build a Chatbot with Llama 3 8B & 70B + Ollama + Streamlit
5K views • 1 month ago
Today, I'll show you how to build a llm app with the Meta local Llama 3 model, Ollama and Streamlit for free using LangChain and Python. Happy learning! ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 01:10 Setup 02:38 Initialize Llama 3 with Ollama 05:55 Build chatbot with Streamlit 08:00 Inference 09:03 Test Llama 3 70B RELATED VIDEOS: ▶ L...
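A minimal sketch of a Streamlit chat UI over Llama 3 served by Ollama, assuming langchain_community's ChatOllama; the layout is simplified and not necessarily the exact app built in the video.

```python
# Minimal sketch: a Streamlit chatbot backed by Llama 3 running in Ollama.
# Save as app.py and start with: streamlit run app.py
import streamlit as st
from langchain_community.chat_models import ChatOllama

st.title("Llama 3 Chatbot")

llm = ChatOllama(model="llama3")  # swap in "llama3:70b" if your hardware allows

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far
for role, text in st.session_state.messages:
    st.chat_message(role).write(text)

if prompt := st.chat_input("Ask something..."):
    st.session_state.messages.append(("user", prompt))
    st.chat_message("user").write(prompt)
    answer = llm.invoke(prompt).content
    st.session_state.messages.append(("assistant", answer))
    st.chat_message("assistant").write(answer)
```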
Chat with Llama-3 with HuggingFace & Build a chatbot with Gradio
3.3K views • 1 month ago
Run Meta Llama 3 8B tutorial with Transformers locally and for free and then test this model by creating an app with Gradio. 00:01: Introduction 00:54 Setup 03:38 Load the model 06:27 Create the prompt 08:08 Inference 10:07 Build the app 15:21 Test the app ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai RELATED VIDEOS: ▶ LangChain Tutorials: bit.ly/langchain-t...
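A rough sketch of the Transformers-plus-Gradio setup, assuming access to the gated meta-llama/Meta-Llama-3-8B-Instruct checkpoint, a GPU, and recent transformers, accelerate, and gradio releases; the generation settings are placeholders.

```python
# Minimal sketch: Llama 3 8B Instruct via a transformers pipeline, wrapped
# in a Gradio chat app. Requires a Hugging Face token with access to the
# gated repo and `pip install transformers accelerate gradio`.
import torch
import gradio as gr
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

def chat(message, history):
    # The pipeline applies the model's chat template to the message list.
    messages = [{"role": "user", "content": message}]
    out = pipe(messages, max_new_tokens=256)
    return out[0]["generated_text"][-1]["content"]

gr.ChatInterface(chat).launch()
```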
Build an App using Streamlit for data visualization with PandasAI
1.1K views • 1 month ago
Today, I'll show you how to chat with your data and create an app using PandasAI, Streamlit, and Claude 3 Haiku. Happy learning. ▶ Subscribe: bit.ly/subscribe-tirendazai ▶ Join my channel: bit.ly/join-tirendazai 00:01 Introduction 00:59 Setup 03:35 Initialize the app 04:16 Load the dataset 05:22 Create SmartDataframe 06:51 Build the app 08:05 Inferences 09:11 Data Visualization RELATED VIDEOS: ...
Build an App with Streamlit Using PandasAI & Ollama for Data Analysis
2.2K views • 1 month ago
Today, I'll show you how to create a data analysis app with Streamlit, Ollama, PandasAI locally and for free using Python. Subscribe ➡️ bit.ly/subscribe-tirendazai Join my channel ➡️ bit.ly/join-tirendazai 00:01 Introduction 01:01 Setup 02:40 Initialize the app 04:16 Start Ollama 05:26 Initialize the model 09:10 Inference LangChain Tutorials ➡️ bit.ly/langchain-tutorials Generative AI for DS ➡️...
Build a QA Chatbot with Chainlit & OpenAI using Python
430 views • 1 month ago
Today, I'll show you how to create a question-answering chatbot app with Chainlit and OpenAI using Python in minutes. Subscribe ➡️ bit.ly/subscribe-tirendazai 00:01 Introduction 00:59 Setup 03:35 Initialize the chatbot 06:30 Initialize the chat model 08:46 Test the app LangChain Tutorials ➡️ bit.ly/langchain-tutorials Generative AI for DS ➡️ bit.ly/genai-for-data-science HuggingFace Tutorials ➡...
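A minimal sketch of a Chainlit question-answering bot backed by the OpenAI API; the model name and system prompt are assumptions rather than values from the video.

```python
# Minimal sketch: a Chainlit QA chatbot using the OpenAI API.
# Save as app.py and start with: chainlit run app.py
import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

@cl.on_message
async def on_message(message: cl.Message):
    response = await client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; use whichever chat model you prefer
        messages=[
            {"role": "system", "content": "You are a helpful question-answering assistant."},
            {"role": "user", "content": message.content},
        ],
    )
    await cl.Message(content=response.choices[0].message.content).send()
```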
Talk to Your Database with the Langchain SQL Agent
1K views • 2 months ago
Today, I'll show you how to use the langchain SQL agent to talk to your database, build an app with Streamlit and monitor this app in LangSmith. Happy learning. Subscribe ➡️ bit.ly/subscribe-tirendazai Medium ➡️ tirendazacademy.medium.com X ➡️ TirendazAcademy LinkedIn ➡️ www.linkedin.com/in/tirendaz-academy LangChain Tutorials ➡️ bit.ly/langchain-tutorials Generative AI for DS ➡️ bi...
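A minimal sketch of the LangChain SQL-agent pattern, assuming langchain_community's create_sql_agent with an OpenAI chat model; the database URI and question are placeholders.

```python
# Minimal sketch: a LangChain SQL agent that turns questions into SQL queries.
from langchain_community.utilities import SQLDatabase
from langchain_community.agent_toolkits import create_sql_agent
from langchain_openai import ChatOpenAI

db = SQLDatabase.from_uri("sqlite:///chinook.db")  # placeholder database URI
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

agent = create_sql_agent(llm, db=db, agent_type="openai-tools", verbose=True)
result = agent.invoke({"input": "Which customer spent the most in total?"})
print(result["output"])
```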
LangChain Pandas DataFrame Agent for Data Analysis
988 views • 2 months ago
Today, I'll show you how to use pandas dataframe agent for data analysis and monitor an LLM app in LangSmith. 00:01 Introduction 00:54 Setup 01:23 Install libraries 02:52 Set environment variables 04:24 Load the dataframe 05:11 Create an agent 07:13 Monitor the app LangChain Tutorials ➡️ bit.ly/langchain-tutorials Generative AI for DS ➡️ bit.ly/genai-for-data-science HuggingFace Tutorials ➡️ bi...
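A minimal sketch of a pandas-dataframe agent with LangSmith tracing switched on through environment variables; the project name, dataset, and model are assumptions, not the exact ones used in the video.

```python
# Minimal sketch: pandas-dataframe agent + LangSmith tracing via env vars.
import os
import pandas as pd
from langchain_openai import ChatOpenAI
from langchain_experimental.agents import create_pandas_dataframe_agent

# LangSmith monitoring is enabled purely through environment variables.
os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_PROJECT"] = "pandas-agent-demo"  # assumed project name
# os.environ["LANGCHAIN_API_KEY"] = "..."              # your LangSmith key

df = pd.read_csv("sales.csv")  # placeholder dataset
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)

agent = create_pandas_dataframe_agent(llm, df, verbose=True, allow_dangerous_code=True)
print(agent.invoke({"input": "How many rows and columns does the dataset have?"})["output"])
```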
Build an App with LangChain AGENTs & Streamlit for Data Analysis
1.4K views • 2 months ago
Hi guys! Today, I'll show you how to use agents for data analysis and build an LLM app. Subscribe ➡️ bit.ly/subscribe-tirendazai LangChain Tutorials ➡️ bit.ly/langchain-tutorials Generative AI for DS ➡️ bit.ly/genai-for-data-science HuggingFace Tutorials ➡️ bit.ly/hugging-face-tutorials Generative AI Tutorials ➡️ bit.ly/generative-ai-tutorials LLMs Tutorials ➡️ bit.ly/llm-tutorials 00:01 Introd...
Build a ChatGPT clone using LangChain, Ollama & Streamlit
543 views • 2 months ago
Today, I'll show you how to create a ChatGPT-like app using Python for free and locally with the Mistral model. Don't forget to subscribe and hit the bell icon so you don't miss the latest videos. 🔗 Let's connect: Medium: tirendazacademy.medium.com X: x.com/tirendazacademy LinkedIn: www.linkedin.com/in/tirendaz-academy 00:01 Introduction 01:00 Setup 02:28 Initialize the local model 03:56 Create...
RAG with Chat History using LangChain & Ollama Locally and For Free
2.2K views • 2 months ago
RAG with LangChain & Ollama Locally and For Free
2.1K views • 2 months ago
How to Use LlamaParse? 🦙 LlamaIndex Tutorial
1.1K views • 2 months ago
How to Build Chat APP with Claude API and Streamlit?
806 views • 2 months ago
Gradio APP with the Claude API using Python
287 views • 3 months ago
PandasAI with the Groq Free API in LangChain using Python
3.2K views • 3 months ago
Data Analysis with PandasAI and Ollama - Locally and Free
6K views • 3 months ago
Groq API: Build Chat UI with Groq & LangChain & Gradio using Python
2.3K views • 3 months ago
How to use GEMMA in PyTorch (Free GPU)
421 views • 3 months ago
How to RUN GEMMA with LANGCHAIN and OLLAMA Locally
1.8K views • 3 months ago
First Step: How to USE GEMMA with KerasNLP on Colab (Free GPU!)
269 views • 3 months ago
How to RUN GEMMA with HuggingFace Transformers (Free GPU!)
1.3K views • 3 months ago
LangChain Prompt Templates with Ollama 🔥 Generative AI Tutorial
1.1K views • 3 months ago
LangChain Models using OpenAI API & Ollama 🔥 Generative AI Tutorial
471 views • 3 months ago
What is LangChain? 🦜 LangChain New Version Explained
679 views • 3 months ago

Comments

  • @jackgaleras
    @jackgaleras 7 hours ago

    Has anyone already verified that the numbers are right?

  • @Mallubeast69_xd
    @Mallubeast69_xd 1 day ago

    How much did it cost?

  • @Deepakbhogle
    @Deepakbhogle 1 day ago

    I like it when you say "there we go" 😊

  • @muhaiminhading3477

    Thank you, but could you please help me? I get this error: "Unfortunately, I was not able to answer your question, because of the following error: 404 page not found". When I look at pandasai.log I see: What is the sales in Canada? Variable `dfs: list[pd.DataFrame]` is already declared. At the end, declare "result" variable as a dictionary of type and value. If you are asked to plot a chart, use "matplotlib" for charts, save as png. Generate python code and return full updated code: 2024-06-10 13:31:35 [INFO] Executing Step 3: CodeGenerator 2024-06-10 13:31:35 [INFO] HTTP Request: POST 127.0.0.1:11434/chat/completions "HTTP/1.1 404 Not Found" 2024-06-10 13:31:35 [ERROR] Pipeline failed on step 3: 404 page not found

  • @Deepakbhogle
    @Deepakbhogle 3 days ago

    Thanks, your videos are detailed and informative.

  • @ClimateDS
    @ClimateDS 6 days ago

    I would call this less of an agent and more of a pandas chatbot.

  • @anshulsingh8326
    @anshulsingh8326 13 days ago

    Hi, are there any steps you can suggest I should learn first? I'm from a game development background with C# and I know a little bit of Python. I don't know about conda or anything else.

  • @AshishSingh-ri9rr
    @AshishSingh-ri9rr 18 days ago

    Hi, great video. I wanted to ask whether prompting is necessary to improve the agent's answer accuracy. Also, how do you deal with huge databases? Thanks in advance.

  • @gemini22581
    @gemini22581 19 days ago

    Can you show how I can use RAG in this framework? I have a .csv file I would want to use for the responses. Please show that.

  • @erolkuluslu3942
    @erolkuluslu3942 20 days ago

    What is the purpose of using a Conda environment for the projects? I have been following your tutorials for a while, and in your implementations you always create a Conda environment. Since I am new to Python, I am curious whether there is any reason beyond benefits like isolation and dependency management.

  • @souravbarua3991
    @souravbarua3991 21 days ago

    It's not working all the time. It's handy but not great. On the other hand, LangChain dataframe agents work better than this.

    • @TirendazAI
      @TirendazAI 21 days ago

      If you are using a smaller model like Llama 3 8B, sometimes you may need to try a few prompts to get a good response.

    • @souravbarua3991
      @souravbarua3991 21 days ago

      @TirendazAI I am using the same model as shown in the video.

  • @erdemkoraysanl7547
    @erdemkoraysanl7547 22 days ago

    Hello, I wrote the same code and it works, but it is very slow. For example, when I ask it to bring the first 4 rows, it responds in 2 or 3 minutes, yet it is very fast for you. Chatting normally with Llama 3 is very fast, and my database is fast too, but when I run this code it works very slowly.

    • @TirendazAI
      @TirendazAI 21 days ago

      The app responded quickly in the video because I used the cache. The cache lets PandasAI store the results of previous queries; my first queries took some time too. Which graphics card do you use?

    • @erdemkoraysanl7547
      @erdemkoraysanl7547 21 days ago

      @TirendazAI I use an AMD Radeon RX 580 graphics card. Actually, my goal is to use AI search on my website. Algorithms can't give correct results most of the time. I really thought that artificial intelligence would be very useful for thinking like a human, but it works very slowly.

  • @Alicornriderm
    @Alicornriderm 22 days ago

    Thanks for the tutorial! Is this actually running locally? If so, how did it download so quickly?

    • @TirendazAI
      @TirendazAI 22 days ago

      Yes, the model runs locally and for free. To download the model, you can use Ollama.

    • @muhaiminhading3477
      @muhaiminhading3477 5 days ago

      @TirendazAI Hi, I have used Llama 3 with Ollama on localhost port 127.0.0.1:11434, but I am confused: how do I load the model with Transformers so I can follow your steps?

  • @sdplusm3
    @sdplusm3 22 days ago

    Nice video. Thank you.

  • @PrinBensonic
    @PrinBensonic 24 days ago

    @TirendazAI HELLO, Are you available to maximize your Udemy course's potential! Our tailored marketing strategies ensure greater visibility, increased enrollments, and enhanced student engagement. Ready to see your course top the charts? Let’s connect! @PBG TEAM

  • @user-ey7pb5re2i
    @user-ey7pb5re2i 24 days ago

    Why am I getting this error with llm = ollama(model="llama3"): TypeError: 'module' object is not callable?

  • @TooyAshy-100
    @TooyAshy-100 24 days ago

    Thank you!

  • @HmzaY
    @HmzaY 25 days ago

    I have 64GB RAM and 8GB VRAM. I want to run Llama 3 70B, but it doesn't fit. How can I run it on system RAM (the 64GB) in Python? Can you make a video on that?

    • @TirendazAI
      @TirendazAI 24 days ago

      I also have 64GB RAM and it worked for me. My system used about 58GB of RAM for llama3:70b. I'll show the RAM I use if I make a video with llama3:70b.

  • @suffympm1601
    @suffympm1601 25 days ago

    Thanks for your video, greetings from Indonesia. But why does getting the result take so long on my laptop? Any idea? My spec is an ASUS with a Ryzen 7 gen 5.

    • @TirendazAI
      @TirendazAI 25 days ago

      Hi, to get a quick response, it is important to have a powerful graphics card.

  • @dangya3481
    @dangya3481 25 days ago

    I used conda 23 + Python 3.10 and it worked, but with conda 24 + Python 3.12 it didn't work.

    • @TirendazAI
      @TirendazAI 25 days ago

      This is due to the dependencies of the libraries. I recommend creating a separate virtual environment.

  • @lowkeylyesmith
    @lowkeylyesmith 26 days ago

    Is it possible to expand the limit per file? My CSV files are larger than 1GB.

    • @TirendazAI
      @TirendazAI 25 days ago

      This is possible, but you need to use a larger model, such as llama3:70b instead of llama3:8b.

  • @thetanukigame3289
    @thetanukigame3289 27 days ago

    Thank you for the great video. It was really helpful for getting everything set up. If I may ask: I have a 4090 graphics card and I can see it maxing out my GPU usage, so CUDA should be working correctly. However, my prompts take anywhere between 20 seconds and 2 minutes to return, and after a few questions the chatbot stops responding at all and just stays processing. Is this normal?

    • @TirendazAI
      @TirendazAI 26 days ago

      Which model do you use? If you're using llama3:70b, I think that's normal.

  • @PuffNSnort
    @PuffNSnort 27 days ago

    Great video! What are the dataset size limitations? I get an answer 30% of the time and errors the rest of the time.

    • @TirendazAI
      @TirendazAI 26 days ago

      Large models like Llama-3:70b and GPT-4 respond better.

  • @sebastianarias9790
    @sebastianarias9790 27 days ago

    I'm not getting a response from the chat; it keeps saying "Generating the prompt". What could be the reason for that? Thanks!

    • @TirendazAI
      @TirendazAI 27 days ago

      Did you get an error? If yes, can you share it? I can say more if I see the error.

    • @sebastianarias9790
      @sebastianarias9790 27 days ago

      @TirendazAI There's no error, my friend. It only takes a very long time to get the output. Any ideas?

    • @TirendazAI
      @TirendazAI 27 days ago

      Which large model do you use?

    • @sebastianarias9790
      @sebastianarias9790 27 days ago

      @TirendazAI llama3!

  • @focusedstudent464
    @focusedstudent464 27 days ago

    First

  • @PriyanshuSiddharth-ku7ev

    awesome

  • @ByteBop911
    @ByteBop911 29 days ago

    Is it possible to use agents without the PandasAI API and use Ollama?

    • @TirendazAI
      @TirendazAI 27 days ago

      You can use many agents with LangChain.

  • @stanTrX
    @stanTrX 29 days ago

    Excel?

  • @suryadiyadi2240
    @suryadiyadi2240 29 days ago

    really great 😮

  • @varshakrishnan3686
    @varshakrishnan3686 29 days ago

    I'm getting the error "No module named pandasai.llm.local_llm". Is there any way to solve it?

    • @TirendazAI
      @TirendazAI 29 days ago

      llm is a module in pandasai. Make sure pandasai is installed and your virtual environment is activated.

  • @sareythakumar3062
    @sareythakumar3062 1 month ago

    👍👌

  • @mohamedmaf
    @mohamedmaf 1 month ago

    Thanks a lot, very helpful tutorial. I have one question: what are the specs of the running machine (CPU, GPU, RAM, OS)?

    • @TirendazAI
      @TirendazAI 29 days ago

      If you use a local model with Ollama, system requirements depend on the model you'll use. For example, you need to have at least 8GB RAM for the 7B or 8B model version.

  • @zainmunir5628
    @zainmunir5628 1 month ago

    Hi, I love your videos. Is there any way to do the same with MongoDB or any NoSQL database?

    • @TirendazAI
      @TirendazAI 1 month ago

      Thanks for your feedback. PandasAI provides connectors for the following data sources: PostgreSQL, MySQL, generic SQL, Snowflake, Databricks, Google BigQuery, Yahoo Finance, and Airtable.

  • @TheBestgoku
    @TheBestgoku 1 month ago

    Bro, please tell me if PandasAI is free if I host it on my own server. Why is PandasAI so expensive?

    • @TirendazAI
      @TirendazAI 1 month ago

      PandasAI is free and open source.

  • @stanTrX
    @stanTrX 1 month ago

    Thanks. It has potential, but it's still not as consistent and robust as you demonstrated.

    • @TirendazAI
      @TirendazAI 1 month ago

      If a prompt doesn't work, I try another prompt. You can get the same results with the PandasAI cache. This cache stores the results of previous queries.

    • @stanTrX
      @stanTrX 1 month ago

      @TirendazAI Thanks, I will give it a try then.

    • @TirendazAI
      @TirendazAI 1 month ago

      Happy learning!

  • @HimanshuSaxena16november
    @HimanshuSaxena16november 1 month ago

    Can we get data from multiple tables using prompts?

    • @TirendazAI
      @TirendazAI 1 month ago

      PandasAI supports queries with multiple dataframes. You can use a SmartDatalake to implement such queries; see the sketch after this thread.

    • @HimanshuSaxena16november
      @HimanshuSaxena16november 1 month ago

      @TirendazAI What about joins across tables? What would the prompt be in that case?
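A minimal sketch of the SmartDatalake pattern mentioned in the reply above, assuming pandasai 2.x and llama3 served locally by Ollama; the CSV files, column names, and prompt are placeholders.

```python
# Minimal sketch: letting PandasAI join several dataframes via SmartDatalake.
import pandas as pd
from pandasai import SmartDatalake
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

employees = pd.read_csv("employees.csv")  # placeholder tables
salaries = pd.read_csv("salaries.csv")

lake = SmartDatalake([employees, salaries], config={"llm": llm})
# The prompt describes the join; PandasAI generates the pandas code for it.
print(lake.chat("Join the tables on employee_id and show the average salary per department."))
```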

  • @MANONTHEMOON419
    @MANONTHEMOON419 1 month ago

    Dude, please do another video on this with more things you can do, or maybe explaining it further. This is amazing.

  • @nguyenngothuong
    @nguyenngothuong 1 month ago

    Can you make a video tutorial on role configuration for the chatbot? For example, only respond in English, and focus on using SQL to query data and respond to users...

    • @nguyenngothuong
      @nguyenngothuong 1 month ago

      And add features to save chat history, suggest prompts after asking, etc.

  • @DarkNessShadow-
    @DarkNessShadow- 1 month ago

    Hi, awesome videos. Can you make a video using the PandasAI agent with the explain and clarification-questions features? I tried to use them, but they don't work for me.

  • @JohnBvoN
    @JohnBvoN 1 month ago

    I wanted to know: can I load the model directly from Hugging Face? Also, I have stored the model and tokenizer using save_pretrained. How can I use these?

    • @TirendazAI
      @TirendazAI 1 month ago

      To load the model from Hugging Face, you can use LangChain. Check this link: python.langchain.com/v0.1/docs/integrations/platforms/huggingface/

  • @nguyenngothuong
    @nguyenngothuong 1 month ago

    Nice 🎉❤

  • @MustafaCam-uv5fr
    @MustafaCam-uv5fr 1 month ago

    Sir, why are there no conv and maxpooling layers?

    • @TirendazAI
      @TirendazAI 1 month ago

      Can you tell me the timestamp (which second of the video)?

  • @user-kk1li5mk7q
    @user-kk1li5mk7q 1 month ago

    How are you able to get the response so fast? It takes me a few minutes to get a response. My CSV file has 7K records, and I'm running on Ubuntu 22 with an i7 and 32GB RAM.

    • @TirendazAI
      @TirendazAI 1 month ago

      The most important component for a fast response is the graphics card.

  • @felipemorelli4059
    @felipemorelli4059 1 month ago

    Excellent video. Which CPU are you using?

    • @TirendazAI
      @TirendazAI 1 month ago

      Thanks! My system is an AMD Ryzen 5 7500F, 64GB RAM, and a 4070 Ti Super graphics card with 16GB VRAM.

  • @felipemorelli4059
    @felipemorelli4059 1 month ago

    Thanks!

  • @satvikbisht
    @satvikbisht 1 month ago

    I am getting this error: "Unfortunately, I was not able to answer your question, because of the following error: Connection error."

    • @TirendazAI
      @TirendazAI 1 month ago

      Did you start Ollama with the "ollama serve" command in your terminal?

    • @amanporwal6748
      @amanporwal6748 28 days ago

      @TirendazAI Yes, I started ollama serve, but after that the same problem still occurs.

  • @thrashassault1
    @thrashassault1 1 month ago

    nice 🤩

  • @varganbas427
    @varganbas427 1 month ago

    Great! Thanks! It works 😁 There was an error in Python!!!

  • @TooyAshy-100
    @TooyAshy-100 1 month ago

    Thank you. I installed crewai_tools and Python 3.11.9, but I get an error: ModuleNotFoundError: No module named "crewai_tools". How do I solve it?

    • @TirendazAI
      @TirendazAI 1 month ago

      Did you activate your virtual environment?