LangChain, SQL Agents & OpenAI LLMs: Query Database Using Natural Language | Code

  • Published 11 Sep 2024
  • Discover how you can harness the power of LangChain, SQL Agents, and OpenAI LLMs to query databases using natural language. This step-by-step tutorial will guide you through setting up a powerful, intuitive, and user-friendly database querying system; a minimal setup sketch follows at the end of this description.
    Code: blog.futuresma...
    AI Demos: www.aidemos.com/
    AIDemos.com is your go-to directory for video demos of the latest AI tools. AI Demos's goal is to educate and inform about the possibilities of AI.
    Building a Document-based Question Answering System with LangChain, Pinecone, and LLMs like GPT-4.
    • Building a Document-ba...
    Revolutionizing Search: How to Combine Semantic Search with GPT-3 Q&A
    • Revolutionizing Search...
    GPT-4 API Tutorial: • GPT-4 API: Real-World ...
    ChatGPT API Tutorial: • Using OpenAI's ChatGPT...
    Building a GPT-4 Chatbot using ChatGPT API and Streamlit Chat
    • Building a GPT-4 Chatb...
    🚀 Top Rated Plus Data Science Freelancer with 8+ years of experience, specializing in NLP and Back-End Development. Founder of FutureSmart AI, helping clients build custom AI NLP applications using cutting-edge models and techniques. Former Lead Data Scientist at Oracle, primarily working on NLP and MLOps.
    💡 As a Freelancer on Upwork, I have earned over $60K with a 100% Job Success rate, creating custom NLP solutions using GPT-3, ChatGPT, GPT-4, and Hugging Face Transformers. Expert in building applications involving semantic search, sentence transformers, vector databases, and more.
    #LangChain #Pinecone #GPT4 #ChatGPT #SemanticSearch #DocumentQnA
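    A minimal sketch of the kind of setup the tutorial walks through (connection details, table names, and the question are placeholders; the API shown is the classic langchain 0.0.x interface used at the time):

    from langchain.agents import create_sql_agent
    from langchain.agents.agent_toolkits import SQLDatabaseToolkit
    from langchain.llms import OpenAI
    from langchain.sql_database import SQLDatabase

    # Placeholder MySQL connection string (pymysql driver assumed)
    db = SQLDatabase.from_uri("mysql+pymysql://user:password@host:3306/dbname")

    llm = OpenAI(temperature=0)  # temperature 0 keeps the generated SQL deterministic
    toolkit = SQLDatabaseToolkit(db=db, llm=llm)

    agent_executor = create_sql_agent(llm=llm, toolkit=toolkit, verbose=True)
    agent_executor.run("How many orders were placed in 2023?")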

Comments • 162

  • @FutureSmartAI
    @FutureSmartAI  Před rokem +2

    📌 Hey everyone! Enjoying these NLP tutorials? Check out my other project, AI Demos, for quick 1-2 min AI tool demos! 🤖🚀
    🔗 YouTube: www.youtube.com/@aidemos.futuresmart
    We aim to educate and inform you about AI's incredible possibilities. Don't miss our AI Demos YouTube channel and website for amazing demos!
    🌐 AI Demos Website: www.aidemos.com/
    Subscribe to AI Demos and explore the future of AI with us!

  • @AshishKhurana71
    @AshishKhurana71 Před 5 měsíci

    I normally never comment on videos but this was a very informative video. Thank you!

  • @satheeshthangaraj5614
    @satheeshthangaraj5614 Před rokem +3

    👍 Thanks Pradip for sharing

  • @neerajshrivastava5600
    @neerajshrivastava5600 Před 11 měsíci +1

    Excellent Video and knowledge sharing !!!! Thank you very much Pradip

  • @2771237
    @2771237 Před 9 měsíci +1

    Thanks a lot! The video is very good and meaningful.

  • @negidheerendra1
    @negidheerendra1 Před rokem +2

    Great 👍

  • @manuthvann7560
    @manuthvann7560 Před 6 měsíci +1

    Really, thanks Pradip for your effort in producing such informative content; your explanation is very easy for a newbie like me to follow. From your personal experience, may I know your preference between LangChain and LlamaIndex for an NL2SQL project? I have also watched another of your videos to help me decide what to use. And for the scenario where we have a large amount of data, meaning many, many tables in the DB, do you suggest using the SQL agent or an alternative method?
    Your suggestion is greatly appreciated ^^

    • @FutureSmartAI
      @FutureSmartAI  Před 6 měsíci

      Hi. These two libraries keep adding more features, and we use both of them in our projects. But if you want to start with one, start with LangChain; it has better support.

  • @nulladdin
    @nulladdin Před rokem +1

    Very helpful! Thank you

  • @yazhiniraja3599
    @yazhiniraja3599 Před 5 měsíci

    It worked for me.thank you so much.

  • @nachopascual9085
    @nachopascual9085 Před rokem +4

    Hi Pradip, great video!! Is it possible to have a SQL agent with memory in order to avoid spending too many tokens each time? (I'm using a Postgres database)

  • @user-cw1mg1es7d
    @user-cw1mg1es7d Před rokem +1

    Nice work ..thanks for sharing ...

  • @HarshitSharma-yr3yv
    @HarshitSharma-yr3yv Před 4 měsíci

    Hi, I'm facing an error after running the agent executor. After entering the new AgentExecutor chain, it displays "Unexpect exception formatting exception. Falling back to standard exception". Can you please help me out?

  • @rameshh3821
    @rameshh3821 Před 4 měsíci

    Hi Pradip. I'm getting this error:
    NameError: name 'SQLDatabaseToolkit' is not defined

  • @rameshh3821
    @rameshh3821 Před 4 měsíci

    Hi. Good explanation. Could you please do a demo of using a SQL agent with Azure OpenAI, where the data is stored in Azure SQL Server? Thank you.

  • @Tirtharaj___Talukdar
    @Tirtharaj___Talukdar Před rokem +2

    Hi Pradip, really helpful video, but while using the SQLDatabase agent with the gpt-3.5-turbo model I get the following error: "OutputParserException: Could not parse LLM output: `Action: list_tables_sql_db,". I followed the exact same steps as you, and this error only seems to happen with this model. If I use models like text-davinci-003 it does not happen. Can you please help me!

  • @bastianelgueta7318
    @bastianelgueta7318 Před rokem +1

    In the last two queries, how does the model know the names of the columns that are in each table and that it must use to generate the query?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      The agent can perform actions like listing tables or checking the schema of a particular table, and based on that it can create the query.

    • @bastianelgueta7318
      @bastianelgueta7318 Před rokem

      @@FutureSmartAI Ahhh, ok. I had a doubt since the step of consulting the schema is not shown within the agent's chain of thoughts and actions.

  • @simsingh4321
    @simsingh4321 Před 9 měsíci +1

    Amazing

  • @apdallhapasher6191
    @apdallhapasher6191 Před rokem +1

    I'm using SQL Server on RDS; how can I get my connection string?

  • @mr.premponvilai9983
    @mr.premponvilai9983 Před rokem +1

    Hi Pradip, thanks for the wonderful concepts and knowledge. I'm trying to find the connection string for an Azure SQL database, but it's very hard to find and I have been unable to connect yet. If you have any resource or advice on this, it would be very helpful. Thank you.

  • @MANALI-sn9li
    @MANALI-sn9li Před 4 měsíci

    Hello, I have been working on a project in which two CSV files are given and I need to develop a QA system based on these CSV files. The output should be a Python dictionary with:
    ● select_column: Data to be retrieved.
    ● from_table: Dataset containing the necessary data.
    ● where: List of conditions to refine the search.
    Each condition within where is a dictionary specifying the column, relation, and value. Can I use an agent for this? What tools from LangChain can I use for this use case? Your guidance will be of great help.

  • @notmanas
    @notmanas Před rokem

    Hi Pradip, great video!
    Here's my question - what if the table names or column names are not self explanatory? What if they are abbreviations, or short forms?
    What I have currently are data catalogues, with column descriptions and table descriptions that can possibly help the agent.
    So, basically, how do I pass such information to the agent? How do I provide it context? Any hints would be appreciated.

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      We can use a custom prompt that adds table descriptions, as sketched below.
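      For example, a minimal sketch of passing table descriptions via a custom prefix (the table names and descriptions are made up, and db is assumed to be an existing SQLDatabase instance):

      from langchain.agents import create_sql_agent
      from langchain.agents.agent_toolkits import SQLDatabaseToolkit
      from langchain.llms import OpenAI

      # Hypothetical data-catalogue text prepended to the agent's instructions
      CUSTOM_PREFIX = (
          "You are an agent designed to interact with a SQL database.\n"
          "Table descriptions:\n"
          "- cust_mst: customer master data; column cst_nm holds the customer name.\n"
          "- ord_hdr: order headers; column ord_dt holds the order date.\n"
          "Given an input question, create a syntactically correct SQL query to run."
      )

      llm = OpenAI(temperature=0)
      toolkit = SQLDatabaseToolkit(db=db, llm=llm)
      agent_executor = create_sql_agent(llm=llm, toolkit=toolkit, prefix=CUSTOM_PREFIX, verbose=True)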

  • @manthnimisha
    @manthnimisha Před rokem +1

    Hey.. Great video and explanation !!
    I want to know how can I connect my Redshift Database here?

  • @ahsanrossi4328
    @ahsanrossi4328 Před rokem +1

    Awesome

  • @SanjogMehta
    @SanjogMehta Před rokem +1

    Hi Pradip, thanks for such an elaborate video on this. I have one question on this if you can help. In enterprise databases, there could be multiple users querying same database tables at a time very often. How does Langchain or LLM preserve and handle the context within that user's session without mixing it with others?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      There is nothing specific to LangChain here; it's the same as the way a web server is able to handle different requests without mixing them up.

  • @neerajchimwal2274
    @neerajchimwal2274 Před rokem +1

    There is one problem with this method: if your database is very large, it will take forever to establish the connection. I have around 140 tables with many columns in my Azure SQL Server, and it takes around 13 minutes to connect to the database. There is no problem with the internet bandwidth. I tried connecting with SQLAlchemy directly and it took 13 seconds, but that won't work either, because SQLDatabaseToolkit() needs a SQLDatabase instance.

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Can you check this? You can print the prompt, limit the number of sample rows taken from each table, and also limit the tables that get used; see the sketch below.
      python.langchain.com/docs/modules/chains/popular/sqlite
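      A minimal sketch of restricting what gets sent (connection string and table names are placeholders; include_tables and sample_rows_in_table_info are standard SQLDatabase.from_uri options):

      from langchain.sql_database import SQLDatabase

      db = SQLDatabase.from_uri(
          "mssql+pyodbc://user:password@host/dbname?driver=ODBC+Driver+17+for+SQL+Server",
          include_tables=["customers", "orders"],   # only expose the tables you need
          sample_rows_in_table_info=1,              # at most one sample row per table
      )
      print(db.table_info)  # inspect exactly what will be placed in the prompt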

  • @user-qu3vl2zg5q
    @user-qu3vl2zg5q Před 7 měsíci

    Hi Pradip, thanks for sharing the video. I am currently working with MongoDB databases. Could you please create a similar video for MongoDB? Unfortunately, I haven't found a MongoDB agent in LangChain, so I was wondering if you could help me with that.

  • @SampikGupta
    @SampikGupta Před 10 měsíci

    Thanks for the video. But the issue I am getting is exceeding the token limit: the agent enters a long sequence of actions and observations, and finally the token limit is exceeded.

    • @FutureSmartAI
      @FutureSmartAI  Před 10 měsíci

      It can hit the token limit? How many tables do you have, and how many columns do they have? Are you also passing sample rows? If it's a lot, you have to build an index over the tables and then decide which tables to pick first. I have shown this in the LlamaIndex SQL video.

    • @SampikGupta
      @SampikGupta Před 10 měsíci

      @@FutureSmartAI Ok sir, I'll go through that video. I'm using the same MySQL sample database. Sometimes it works fine and gives the result, but sometimes it exceeds the token limit for the same question. I tried adding more parameters such as prefix, suffix, and description in create_sql_agent, but it's the same issue: sometimes it works, sometimes it fails. Thanks for your reply; I'll go through the other video you mentioned.

  • @pranjalgupta6920
    @pranjalgupta6920 Před 5 měsíci

    @pradip, do you know if we can have the same solution using Redshift instead of MySQL via RDS?

  • @manojrangera1580
    @manojrangera1580 Před 9 měsíci

    Hi @Pradip Nichite,
    can we use this to query a BigQuery table?

  • @adiy77
    @adiy77 Před rokem

    @FutureSmartAI In your current implementation, does the data in your database table get shared with ChatGPT? If yes, is there any safer way to accomplish the same thing without actually sharing the data in your database with ChatGPT?

  • @dharminisaravanan3369
    @dharminisaravanan3369 Před rokem +1

    Hi Pradip, loved your tutorial. Can we do the same with azure sql database?

  • @boopalanm5206
    @boopalanm5206 Před rokem

    Hi Pradip, great video!
    Have one doubt: how can we use SQLDatabase with MultiPromptChain? When we try to pass it like the code below, we get this error:
    validation errors for MultiPromptChain default_chain -> database extra fields not permitted (type=value_error.extra) default_chain -> input_key extra fields not permitted (type=value_error.extra) default_chain -> llm_chain extra fields not permitted
    SQLDatabaseChain(llm=llm, database=db, verbose=True, prompt=prompt)

  • @karthikvakee9063
    @karthikvakee9063 Před rokem

    Is there a way to get just the generated query while also using memory with this?

  • @user-jj5et9ep7f
    @user-jj5et9ep7f Před rokem +1

    Hi, thanks for the video.
    I have 2 questions:
    Question 1: what is the difference between "SQLDatabaseChain" and "SQLDatabaseToolkit"?
    Question 2: I get an error when I try using create_sql_agent in the code below.
    toolkit = SQLDatabaseToolkit(db=db)
    agent_executor = create_sql_agent(
    llm=OpenAI(temperature=0),
    toolkit=toolkit,
    verbose=True
    )
    ValidationError: 1 validation error for SQLDatabaseToolkit
    llm
    field required (type=value_error.missing)
    Can you please help here? Appreciate your help.

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      This thread has the solution: github.com/hwchase17/langchain/issues/5068 (in short, the toolkit also needs the LLM; see the sketch below).
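      A minimal sketch of that fix (assuming the missing piece is simply the llm argument; db is an existing SQLDatabase instance):

      from langchain.agents import create_sql_agent
      from langchain.agents.agent_toolkits import SQLDatabaseToolkit
      from langchain.llms import OpenAI

      llm = OpenAI(temperature=0)
      # Newer LangChain versions require the LLM on the toolkit as well, which is
      # exactly what the "llm field required" validation error complains about.
      toolkit = SQLDatabaseToolkit(db=db, llm=llm)
      agent_executor = create_sql_agent(llm=llm, toolkit=toolkit, verbose=True)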

  • @yashkolhatkar2097
    @yashkolhatkar2097 Před 11 měsíci

    Hello Pradeep
    Thank you for the useful video.
    I have a question.
    Can we connect two databases using a single agent?

  • @bunty672744
    @bunty672744 Před rokem

    I am using an Oracle Autonomous Cloud database. How do I get its connection string?

  • @nishkam5jan
    @nishkam5jan Před rokem

    Can you please help with establishing a connection to a PostgreSQL server? Will the same SQLDatabaseToolkit work for it also?

  • @user-lo7fi2qu5c
    @user-lo7fi2qu5c Před rokem +1

    Do you think it is practical to use SQL Agent on a database which has 25 tables and complicated joins? What's the best approach in this scenario?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      25 tables is perfectly fine. For complicated joins you should test it; you can add a few complicated queries as prompt examples.

    • @multiverse9512
      @multiverse9512 Před rokem

      @@FutureSmartAI I have a very big and complex database from which I am planning to extract meaningful data using an LLM, but I am kind of confused about where to start. Do you have any suggestions for me?

  • @farexBaby-ur8ns
    @farexBaby-ur8ns Před 6 měsíci

    Why do they say don't run this against sensitive data?
    Is there any data security issue with this?
    Will there be data spillage? Say I run something similar (as in your example) inside a company - will that data be sent to outside LLMs?

    • @FutureSmartAI
      @FutureSmartAI  Před 6 měsíci

      Basically, what if the agent modifies data? Make sure the DB credentials you use have read-only permissions if you don't expect the agent to insert anything into the DB; a sketch of that setup is below.
      Is there any data security issue with this?
      I think it's more about behaviour control: sometimes we don't know how agents will behave. When you connect a DB to LangChain or an LLM, it basically has access to the schema, sample rows if you allow them, and it can run queries on the DB.
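      A minimal sketch of a read-only setup (MySQL assumed; all names and passwords are placeholders):

      from sqlalchemy import create_engine, text
      from langchain.sql_database import SQLDatabase

      # One-time setup with an admin connection: create a SELECT-only account for the agent.
      admin_engine = create_engine("mysql+pymysql://admin:admin_password@host:3306/mydb")
      with admin_engine.connect() as conn:
          conn.execute(text("CREATE USER 'langchain_ro'@'%' IDENTIFIED BY 'strong_password'"))
          conn.execute(text("GRANT SELECT ON mydb.* TO 'langchain_ro'@'%'"))

      # The agent then connects with the read-only credentials only.
      db = SQLDatabase.from_uri("mysql+pymysql://langchain_ro:strong_password@host:3306/mydb")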

  • @LogicaQI
    @LogicaQI Před rokem

    Many thanks, Pradip. I was trying to build my own agent without knowing one already exists ^^ ueheuheu. You helped me a lot. Now let me ask: you said that with a good schema you will get good results, but what if you don't have one? I'm working on a legacy system. Is there some way to help the agent get a correct understanding?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      You can customize the prompt and add some information about your schema. You can create a mapping mentioning that column xyz is actually customer_name, etc.

  • @addyjain1375
    @addyjain1375 Před 7 měsíci

    Hi, thanks for this video! Had a question -
    when we call the ChatGPT API, do we send all our data to ChatGPT, and does it retrain on it?

    • @FutureSmartAI
      @FutureSmartAI  Před 7 měsíci

      Are you asking whether we have to send all our data to ChatGPT? We send the DB schema and the SQL output so it can paraphrase the answer.
      Or do you mean does ChatGPT use our data to retrain? Per the latest developer notes, no, they don't use it.

  • @preeshap468
    @preeshap468 Před 10 měsíci

    When I try to connect to SQL Server I get the error "data source name not found and no default driver specified", or otherwise an error that no connection can be made because the target machine refused it. Please help; what can be done?

    • @FutureSmartAI
      @FutureSmartAI  Před 10 měsíci

      Hi, make sure you use the correct SQL Server connection string.

  • @AngweenAnnora
    @AngweenAnnora Před 3 měsíci

    I wonder if you have updated the code; I tried the SQL agents but they are not working anymore.

    • @FutureSmartAI
      @FutureSmartAI  Před 3 měsíci +1

      Check my recent video on NL2SQL using LangChain.

  • @aleccisco
    @aleccisco Před rokem

    How can we train the model so that we can explain which tables and columns should be used for certain information? Then I can train it and someone else can ask the questions.

  • @santoshsingh-dev
    @santoshsingh-dev Před rokem +1

    How can I connect to MS SQL Server?
    I am doing some analysis using OpenAI.

  • @jayrn4596
    @jayrn4596 Před rokem

    Hi sir. Can we integrate Streamlit to see the output of the SQL queries?

  • @vedantgosavi4936
    @vedantgosavi4936 Před 7 měsíci

    Hi, I need help with the PostgreSQL DB connection string.

  • @sakilaprabu2858
    @sakilaprabu2858 Před rokem +1

    Great tutorial. Can you help me with getting the connection string for SQL Server?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      Can you try this:
      db_user = "your_username"
      db_password = "your_password"
      db_host = "your_host"
      db_name = "your_database_name"
      db_driver = "ODBC Driver 17 for SQL Server"  # update the driver version as needed
      # The driver name is URL-encoded (spaces as '+') so it is valid in the URI query string
      connection_string = f"mssql+pyodbc://{db_user}:{db_password}@{db_host}/{db_name}?driver={db_driver.replace(' ', '+')}"

    • @sakilaprabu2858
      @sakilaprabu2858 Před rokem

      @@FutureSmartAI Thank you

  • @AaronAsherRandall
    @AaronAsherRandall Před rokem

    Hi Pradip, thank you so much for this video 🙏 I have a question. I am building an app that allows people to drag and drop files into an app, ChatGPT looks at the files (word files etc.) and assigns a title, description and category. That data is then stored in an SQL database. These files could be anything.
    In the case that, say, the files are sales and marketing data, would a user be able to use agents to ask broad questions like "What marketing campaigns did we run in February?" and get an accurate result, even if the tables in SQL only contain title, description, and category, for example?
    What I mean is, would it be able to produce a meaningful answer even with a very broad and non-specific schema?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      If there is no specific schema, then instead of a SQL agent you should use semantic search + QnA.

  • @pravingupta2892
    @pravingupta2892 Před rokem

    Hi Pradip, it was nicely explained. Please help me with the SQL Server connection string for LangChain.

  • @inspiredteaching4468
    @inspiredteaching4468 Před rokem

    When I use LangChain to connect my SQL database to the OpenAI API, does my private data become accessible to OpenAI? I am asking this in the context of data security. Suppose I am building a chatbot for my client and use this architecture; is there a possibility that the data is exposed to OpenAI, whether or not they use it to train their models?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      No, OpenAI doesn't have access to your data. You only need to share the schema, and it will convert the natural-language query to SQL.

  • @tanvijindal3606
    @tanvijindal3606 Před rokem

    I'm not able to access AWS RDS. Is there any other option to connect a database to LangChain?

  • @ambrosionguema9200
    @ambrosionguema9200 Před rokem +1

    Hi Pradip, can you explain the database part in detail, please? toolkit = SQLDatabaseToolkit(db=db). I have an issue: ValidationError: 1 validation error for SQLDatabaseToolkit
    llm
    field required (type=value_error.missing). And how can we build our own database?

  • @SD-io6yu
    @SD-io6yu Před rokem

    @Pradip: Great work! I would like to seek some advice for one of my projects, where I would like to predict an error code (e.g. Error 8505) corresponding to an error message (e.g. Logger: NetworkError). The error codes and error messages are stored in a SQL database. Would it be a good idea to use the SQL LangChain for this task? Is there any combination of LangChain that could be used, since this is more about predicting an error code from a message rather than querying?

  • @mayurkastiya9242
    @mayurkastiya9242 Před rokem

    Hello, can you please help me with the connection string that I can use to connect to my local MySQL (Workbench) database?
    db_user = "username"
    db_password = "password"
    db_host = "host name"
    db_port = port number
    db_name = "db name"
    These are the variables that I've declared but if you could please help me with the connection string, that would be great!
    Currently I'm getting this error "Can't connect to MySQL server on 'localhost' ([Errno 99] Cannot assign requested address)"
    Thanks in advance!

  • @helptest-f6c
    @helptest-f6c Před 9 měsíci

    Hey Pradip, can we plot a chart from the output?

    • @FutureSmartAI
      @FutureSmartAI  Před 9 měsíci

      We can take the SQL query output and then use any visualisation library; a sketch is below.
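      A minimal sketch (connection string and query are placeholders; normally you would run the SQL the agent generated and plot the result):

      import pandas as pd
      import matplotlib.pyplot as plt
      from sqlalchemy import create_engine

      engine = create_engine("mysql+pymysql://user:password@host:3306/dbname")
      df = pd.read_sql("SELECT status, COUNT(*) AS orders FROM orders GROUP BY status", engine)

      df.plot(kind="bar", x="status", y="orders", legend=False)  # bar chart of orders per status
      plt.ylabel("Number of orders")
      plt.tight_layout()
      plt.show()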

  • @josiahadesola
    @josiahadesola Před rokem

    Can we use Llama 2 instead of an OpenAI API key, because of the cost, @FutureSmartAI?

  • @geekyprogrammer4831
    @geekyprogrammer4831 Před rokem +7

    Leo's gf's age 🤣

  • @Suresh_Singanamala
    @Suresh_Singanamala Před rokem

    Hi, how can we add permission-based search? For example, I have a user table; first ask for the username in the prompt and check whether the user is present in the table; only then proceed with the search, otherwise stop the search and return that the user is unauthorized to search.

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      Why do you want to ask for the username in the prompt? You can have basic user management using Python and MySQL.

  • @s1mplyFrost
    @s1mplyFrost Před rokem

    Thanks for the video. Super helpful. Can we use the same code with an MS SQL Server database instead of MySQL?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      Yes, you can! You just need to use the MS SQL connection string.

  • @VjPrasanna
    @VjPrasanna Před rokem

    Hi Pradip,
    Thanks for the wonderful content. When I try to use my custom response from the SQL agent I get the error below; how do I print this response for the "generated" array?
    response = openai.ChatCompletion.create(
    model = model,
    messages = agent_executor.run(messages)
    )
    openai.error.InvalidRequestError: 'There are 5 orders from Amazon.' is not of type 'array' - 'messages'
    'There are 5 orders from Amazon.' --- this is the response that I want to send to the Streamlit chat. Could you please help here?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      It seems your executor response is not in the format that chat models accept.

  • @vishnuajoshy1588
    @vishnuajoshy1588 Před 10 měsíci

    Instead of OpenAI, is it possible to do this with Llama or Mistral models?

  • @yoongfu2426
    @yoongfu2426 Před rokem

    Hi, now I’m also using SQLDatabase Agent. Do you know how to reduce the execute time of agent? Because it takes too long to execute it, every time execute almost 20-30 seconds. Would be appreciated.

  • @elahehsalehirizi618
    @elahehsalehirizi618 Před rokem

    Hi I have a question. I am using GPT-3 for the task of few-shot text classification, I have tried directly GPT-3 and also via LangChain (using the same few-shot examples for both, the same format of the prompt (the one from LangChain formatter) and the same features of GPT-3). I had better results when I used LangChain. Is there any explanation why LangChain performed better than directly using GPT-3? thanks

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      Ideally, if the prompt and the model are the same, the output should be the same.

    • @Amr-Ibrahim-AI
      @Amr-Ibrahim-AI Před rokem

      @@FutureSmartAI If the temperature is greater than 0, answers might change even for the same prompt and the same model, might they not?

    • @arjunob
      @arjunob Před rokem +1

      LangChain internally uses a specific prompt template that instructs the LLM to be more cogent in its response, even if you provide the prompt in your own style/format. That's what makes the difference. This, however, comes at a cost: if you use get_openai_callback() from langchain.callbacks to track the number of tokens used in both scenarios, you will notice that the LangChain one consumed more tokens (and therefore costs more money) for the same question asked of the LLM.

    • @elahehsalehirizi618
      @elahehsalehirizi618 Před rokem

      @@arjunob Thanks a lot Arjun. That helps a lot

  • @balasrinivas8646
    @balasrinivas8646 Před rokem

    Thank you so much Pradip, can you share the URI combination for mssql? I am using local db.
    I have been trying 'mssql+pyodbc' but got the error
    InterfaceError: (pyodbc.InterfaceError) ('IM002', '[IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified (0) (SQLDriverConnect)')

    • @ZeyadTallalMoharam
      @ZeyadTallalMoharam Před 10 měsíci

      Hello my friend,
      did you try this tutorial on a local MySQL DB, please?

  • @levsigal6151
    @levsigal6151 Před rokem +1

    Hi Pradip, thank you for the helpful video. I've tried the LangChain SQL Database Agent with my production database, but I was always getting errors like "InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 56821 tokens (56565 in your prompt; 256 for the completion). Please reduce your prompt; or completion length." It is obvious that the agent generates too long a prompt. Can you please advise if there is a workaround for this problem? Thanks.

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Can you check this? You can print the prompt, limit the number of sample rows taken from each table, and also limit the tables that get used.
      python.langchain.com/docs/modules/chains/popular/sqlite

  • @user-lx8yb6si8p
    @user-lx8yb6si8p Před 6 měsíci

    How can we convert the results into a dataframe or a table?

    • @FutureSmartAI
      @FutureSmartAI  Před 6 měsíci

      Instead of showing the final rephrased answer, you can access the generated query, run it yourself, and share those results as a table or dataframe; a sketch is below.
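      A minimal sketch (the SQL text and connection string are placeholders; in practice you would grab the query from the agent's verbose trace or intermediate steps):

      import pandas as pd
      from sqlalchemy import create_engine

      generated_sql = "SELECT customerName, creditLimit FROM customers LIMIT 5"  # placeholder query
      engine = create_engine("mysql+pymysql://user:password@host:3306/dbname")   # placeholder connection
      df = pd.read_sql(generated_sql, engine)  # results as a pandas DataFrame
      print(df)  # or render df as a table in a notebook / Streamlit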

  • @Aidev7876
    @Aidev7876 Před 8 měsíci

    How do we add memory to the conversation?

    • @FutureSmartAI
      @FutureSmartAI  Před 8 měsíci

      blog.futuresmart.ai/langchain-memory-with-llms-for-advanced-conversational-ai-and-chatbots

  • @prathamnotorious8685
    @prathamnotorious8685 Před 9 měsíci

    Can anyone please tell me how I can actually deploy this technique with a Next.js web app and a PlanetScale SQL database? On what server do I have to deploy this?

    • @FutureSmartAI
      @FutureSmartAI  Před 9 měsíci

      Hi, you should first expose it as an API; check my FastAPI LLM video. A minimal sketch is below.
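      A minimal sketch of wrapping the agent in a FastAPI endpoint (all names and the connection string are placeholders):

      from fastapi import FastAPI
      from pydantic import BaseModel
      from langchain.agents import create_sql_agent
      from langchain.agents.agent_toolkits import SQLDatabaseToolkit
      from langchain.llms import OpenAI
      from langchain.sql_database import SQLDatabase

      # Build the same kind of SQL agent as in the video (placeholder connection string)
      db = SQLDatabase.from_uri("mysql+pymysql://user:password@host:3306/dbname")
      llm = OpenAI(temperature=0)
      agent_executor = create_sql_agent(llm=llm, toolkit=SQLDatabaseToolkit(db=db, llm=llm))

      app = FastAPI()

      class Question(BaseModel):
          query: str

      @app.post("/ask")
      def ask(question: Question):
          # The Next.js frontend POSTs {"query": "..."} and gets back the agent's answer
          return {"answer": agent_executor.run(question.query)}

      # Run with: uvicorn main:app --reload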

  • @tusharjain9302
    @tusharjain9302 Před rokem

    Hi Pradip, I'm currently working on building a chatbot for article recommendations, using LangChain and Pinecone to interact with a CSV file. I was wondering if you could offer any advice or assistance on this project?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem +1

      Hi, You can ask any specific questions you have

    • @tusharjain9302
      @tusharjain9302 Před rokem

      @@FutureSmartAI In your previous video, you built a document-based qna system. It works well for pdfs. What changes do I need to make to create a similar system for CSV files?

    • @jayrn4596
      @jayrn4596 Před rokem

      I also have the same doubt. Could you please clear it @FutureSmartAI

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      So you don't want to use a pandas or SQL agent for the CSVs?

    • @jayrn4596
      @jayrn4596 Před rokem

      @@FutureSmartAI No, we don't. We just want to replace the text file with structured data in the form of a CSV file.

  • @navneetkrc
    @navneetkrc Před rokem

    Which Hugging Face LLM works best for the above scenario?

  • @ravitejavemula333
    @ravitejavemula333 Před 10 měsíci

    Hello Pradip sir, how can we connect to IBM Db2?

    • @FutureSmartAI
      @FutureSmartAI  Před 10 měsíci

      Did you find the connection string for the DB? You can give the MySQL string to ChatGPT and ask it to create a similar one for IBM Db2.

  • @harshalmandaliya8968
    @harshalmandaliya8968 Před rokem

    The problem here is that this only works for small databases due to GPT-3's token limit. Do you have any workaround where you can use only the relevant tables and columns instead of listing all of them?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Hi Harshal, it works even for bigger databases with many tables and columns. Rather than passing all table schemas to GPT, it first decides which tables are relevant and then passes only the relevant tables' schemas.

    • @NaranMoturi
      @NaranMoturi Před rokem

      @@FutureSmartAI Are there any guidelines on this? The agent fails due to the token limitation; the first list_table_sql_db action doesn't seem to be retrieving only the tables required for the query.

    • @chaletsurlespistes6087
      @chaletsurlespistes6087 Před rokem

      @@NaranMoturi Hi, I have the same problem as Pradip described, on my business DB. I'm using GPT-4 but exceed the token limit on every request.

    • @danielvillacis-EIGO
      @danielvillacis-EIGO Před rokem

      Hi, same problem here. I'm working on a demo with the AdventureWorksLT MS SQL sample DB and I always hit the token-limit-exceeded error.

    • @imabhisht
      @imabhisht Před rokem

      ​@@FutureSmartAI How to pass only Relevant Tables?

  • @LifeTravelerAmmu
    @LifeTravelerAmmu Před rokem

    Hello, many thanks!!!! Can you please make a video without an openai_key?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Without an OpenAI key we would need an open-source LLM.

    • @LifeTravelerAmmu
      @LifeTravelerAmmu Před rokem

      @@FutureSmartAI Yes, you are right. My question is how to connect the database with an open-source LLM model (Hugging Face models). If possible, please make a video on this; it would be beneficial for students. Thank you for your response.

  • @megaheuston
    @megaheuston Před rokem

    Hi Pradip, thanks for the video and explanation. I am having trouble with SQLDatabaseToolkit(db=db), and I am using SQL Server.
    ValidationError: 2 validation errors for SQLDatabaseToolkit
    db
    instance of SQLDatabase expected (type=type_error.arbitrary_type; expected_arbitrary_type=SQLDatabase)
    llm
    Can't instantiate abstract class BaseLLM with abstract methods _agenerate, _generate, _llm_type (type=type_error)
    Thanks in advance.

  • @truliapro7112
    @truliapro7112 Před rokem

    Does OpenAI read and save data on their own servers?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      As per their latest developer policy, they won't be using our data to train further models.

  • @PraveenKumar-oj8tk
    @PraveenKumar-oj8tk Před rokem

    Can I connect to a local MySQL database?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Yes, just create the appropriate connection string; a sketch is below.
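      A minimal sketch of a local MySQL connection (all values are placeholders; the pymysql driver is assumed to be installed):

      from langchain.sql_database import SQLDatabase

      db_user = "root"
      db_password = "your_password"
      db_host = "127.0.0.1"   # 127.0.0.1 instead of "localhost" forces a TCP connection
      db_port = 3306
      db_name = "your_database"
      db = SQLDatabase.from_uri(
          f"mysql+pymysql://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}"
      )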

  • @vinsin4619
    @vinsin4619 Před 6 měsíci

    You shouldn't show your API key like this, I guess?

  • @VishMedh
    @VishMedh Před rokem

    Hi Pradip, your video is immensely helpful. I need to connect to a Snowflake database. Can you help me figure out the right connection string for Snowflake?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      Thanks. Did you check this python.langchain.com/docs/integrations/document_loaders/snowflake ?

  • @dipanshukumar9477
    @dipanshukumar9477 Před rokem

    Hi Pradip, great video!!!
    While trying to connect to a MySQL database, I'm facing this error: in _correct_for_mysql_bugs_88718_96365
    fkey["referred_table"] = rec["TABLENAME"]
    KeyError: 'TABLENAME'
    Can anyone help me with the same...

  • @sakilaprabu2858
    @sakilaprabu2858 Před rokem

    ProgrammingError: (pymssql._pymssql.ProgrammingError) (2812, b"Could not find stored procedure" followed by the query). What is the reason for this error?

    • @FutureSmartAI
      @FutureSmartAI  Před rokem

      I think it is interpreting the query wrongly and trying to find a stored procedure.

  • @gurjant47
    @gurjant47 Před 6 měsíci

    Hi, after connecting the DB I want to store that DB in a vector store with embeddings. How can I do this, @pradip?