Comments •

  • @nishantchoudhary3245
    @nishantchoudhary3245 2 months ago +9

    One of the best LangChain series. Thank God I was able to find such good content.

  • @rajamailtome
    @rajamailtome 2 months ago +17

    Instead of OpenAI, please use some other downloadable open-source LLM that can be run locally 😊

  • @canyouvish
    @canyouvish 2 months ago

    Very comprehensive and super helpful!

  • @user-oi6rs3fp2b
    @user-oi6rs3fp2b 2 months ago +1

    Thank you, sir. Great videos you are making.

  • @carlosbelleza5104
    @carlosbelleza5104 2 months ago +2

    Your videos are amazing... state of the art.

  • @jayanthkothapalli9.2
    @jayanthkothapalli9.2 2 months ago +1

    Sir, I understood the complete flow. Great explanation. ❤❤

  • @shankarpentyala2390
    @shankarpentyala2390 20 days ago

    Thanks for introducing tools and agents.

  • @tintintintin576
    @tintintintin576 2 months ago +1

    God bless you, sir!

  • @DoomsdayDatabase
    @DoomsdayDatabase 2 months ago +7

    Wow, this is what I wanted!
    Can I use the same code with an open-source model by just changing the way it loads, sir?
    I wrote an email to you yesterday and the video came today! This is next level, Krish sir! Thank you ❤!

  • @lalaniwerake881
    @lalaniwerake881 1 month ago

    Amazing, thank you!

  • @zshortsfeed6862
    @zshortsfeed6862 2 months ago +5

    Sir, please create a separate playlist for the open-source LLMs so we can access them easily.

  • @binayashrestha4131
    @binayashrestha4131 2 months ago

    Thank you so much.

  • @Nishant-xu1ns
    @Nishant-xu1ns 2 months ago

    Waiting for the next video.

  • @123arskas
    @123arskas 1 month ago

    You're awesome.

  • @varindanighanshyam
    @varindanighanshyam 2 months ago

    Krish, fantastic work. Can you explain it keeping Ollama in context?

  • @piyush_nimbokar_07
    @piyush_nimbokar_07 2 months ago +1

    Sir, can you please elaborate on using a Neo4j knowledge graph to build a RAG application?

  • @THOSHI-cn6hg
    @THOSHI-cn6hg 2 months ago +5

    You can use Llama 2 or another open-source API instead...

  • @santhiyac8252
    @santhiyac8252 2 months ago +1

    Hello Krish,
    I built a PDF bot using Google PaLM 2, but it doesn't respond properly because the prompt answers from public knowledge. How do I handle the prompt for a specific dataset? Please reply.

  • @AIConverge
    @AIConverge 2 months ago

    Great video. Any alternatives to LangSmith?

  • @kannansingaravelu
    @kannansingaravelu 1 month ago

    Hi Krish, if we use "create_conversational_retrieval_agent", how do we pass the prompt? Is the prompt mandatory?
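
    A minimal sketch of one way this could look, assuming LangChain's create_conversational_retrieval_agent helper (there the prompt is optional: a custom instruction can be passed as system_message, and llm and tools stand in for objects created earlier in the notebook):

        from langchain.agents.agent_toolkits import create_conversational_retrieval_agent
        from langchain_core.messages import SystemMessage

        # system_message is optional; leave it out to fall back on the default prompt
        agent_executor = create_conversational_retrieval_agent(
            llm,
            tools,
            system_message=SystemMessage(content="Answer only from the retrieved documents."),
            verbose=True,
        )
        agent_executor.invoke({"input": "What is LangSmith used for?"})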

  • @muhammedyaseenkm9292
    @muhammedyaseenkm9292 2 months ago

    How can we extract multi-column tabular data, especially from images?

  • @hemanthram7907
    @hemanthram7907 2 months ago +1

    This video is very comprehensive and easy to understand; I'm really grateful for your efforts, sir. However, could you please create a session on how to achieve function calling, tools, and agents using Gemini Pro or any other open-source LLM? Unfortunately, there is no alternative to the OpenAI version (create_openai_tools_agent). Please explain the workaround for using other LLMs.

    • @aj.arijit
      @aj.arijit 24 days ago +1

      Exactly. I spent one full day on this and read a lot of documentation, although I gained a lot of knowledge along the way.
      I found a function someone wrote using Gemini (since there is no agent-building tool for Ollama) that supports chat generation using different tools and Ollama together at the same time:
      def process_user_request(user_input):
          # Parse user input for potential tool usage
          if "{" in user_input and "}" in user_input:
              # Extract tool name and arguments
              tool_call = user_input.split("{")[1].split("}")[0]
              tool_name, arguments = tool_call.split(":")
              arguments = eval(arguments)
              # Find the corresponding tool function
              for tool in tools:
                  if tool.__name__ == tool_name:
                      # Execute the tool with user arguments
                      tool_output = tool(arguments)
                      return tool_output
          # User request doesn't involve a tool, respond normally
          return f"I understand, but I can't use a tool for this request. {user_input}"

      while True:
          # Get user input
          user_input = input("User: ")
          # Process user request and generate response with Llama 2
          response = model.generate(
              input_ids=model.tokenizer.encode(
                  prompt.format(
                      user_input=user_input,
                      list_of_available_tools="\n* ".join([t.__name__ for t in tools]),
                  )
              ),
              max_length=1024,
              num_beams=5,
              no_repeat_ngram_size=2,
              early_stopping=True,
          )
          # Extract and format the generated response
          generated_text = model.tokenizer.decode(response[0]["generated_tokens"], skip_special_tokens=True)
          tool_output = process_user_request(user_input)
          final_response = generated_text.replace("{generated_response}", tool_output)
          # Print the final response to the user
          print(final_response)
      There is also something called binding of functions which I could not understand at all; a sketch of it follows below.
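
      For reference, the "binding" mentioned above usually means attaching tool schemas to a chat model so it can emit structured tool calls. A minimal sketch, assuming a chat model with tool-calling support (ChatOpenAI is used here only as an example, and wiki/arxiv stand in for tools defined earlier):

          from langchain_openai import ChatOpenAI

          llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
          # bind_tools attaches the tool schemas so the model can return structured tool calls
          llm_with_tools = llm.bind_tools([wiki, arxiv])
          ai_msg = llm_with_tools.invoke("Tell me about LangChain")
          print(ai_msg.tool_calls)  # empty list if the model decided no tool was needed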

  • @AsifKhan-cc3ye
    @AsifKhan-cc3ye 2 months ago

    Hey Krish, I developed an app using RAG for QC of manually populated data in Excel, but the model is not performing accurately. I used Llama 2; is there another open-source LLM that is better at arithmetic?

  • @atharvsakalley9633
    @atharvsakalley9633 2 months ago

    How do we measure the accuracy of our search, with an implementation?

  • @mohsenghafari7652
    @mohsenghafari7652 2 months ago

    Hi dear friend.
    Thank you for your efforts.
    How can I use this tutorial with PDFs in another language (for example Persian)?
    What would the approach be?
    I have made many attempts and tested different models, but the results when asking questions about PDFs are not good and accurate!
    Thank you for the explanation.

  • @divya-ob3jq
    @divya-ob3jq 2 months ago

    Sir, please make a video on a virtual car assistant using LLMs.

  • @ashishmalhotra2230
    @ashishmalhotra2230 2 months ago

    Hi Krish, can you make a video on a conversational chatbot trained on your own data source?

  • @SrinithiMalar
    @SrinithiMalar 2 months ago

    Is blockchain a good career to start in 2024, and what is its future scope?

  • @Political_Memess
    @Political_Memess 2 months ago

    Please use Llama 2 and create a Flask HTML front end too.

  • @arID3371ER
    @arID3371ER 2 months ago

    Man I thought you got your hair back! 😂😂😂❤❤❤

  • @yashthakkar2629
    @yashthakkar2629 1 month ago

    I tried adding an SQLDatabase tool to the tools list. I got an error because I think QuerySQLDataBaseTool is not really returning a tool. What am I supposed to do if I want to add an SQL database to this list of tools without any error? Kindly help.
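
    One way this might be wired up, sketched under the assumption that LangChain's SQLDatabaseToolkit is used to produce ready-made tools (the sqlite:///example.db URI and the wiki, arxiv, retriever_tool and llm names are placeholders for whatever already exists in the notebook):

        from langchain_community.utilities import SQLDatabase
        from langchain_community.agent_toolkits import SQLDatabaseToolkit

        # The toolkit wraps query/schema/checker tools around the database connection
        db = SQLDatabase.from_uri("sqlite:///example.db")
        sql_tools = SQLDatabaseToolkit(db=db, llm=llm).get_tools()

        # Append them to the existing tool list instead of instantiating QuerySQLDataBaseTool by hand
        tools = [wiki, arxiv, retriever_tool] + sql_tools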

  • @karansingh-fk4gh
    @karansingh-fk4gh 1 month ago

    Hi Krish,
    Can you please create a video on LangGraph?

  • @rishiraj2548
    @rishiraj2548 2 months ago

    🙏🙂👍

  • @AlanSabuJohn
    @AlanSabuJohn 2 months ago

    Sir, can you do the Embedchain tutorials?

  • @DamanjeetGTBIT
    @DamanjeetGTBIT 2 months ago

    I am learning stats, SQL, and ML from miscellaneous videos. I want to start with a clean course. Which one is better for pursuing a data science career: IBM Machine Learning or Google Advanced Data Analytics?

    • @slayer_dan
      @slayer_dan 14 days ago

      Find a roadmap and follow it loosely, without any big reroutes.
      Then find a person who teaches the concepts in a way you can absorb, and practice whether they recommend it or not.
      For example, I found these people very much to my taste:
      - Codebasics and Krish for ML concepts with analogies and practical implementation.
      - StatQuest for visual interpretation and understanding of statistical concepts.
      I hope this is useful for you.

  • @mvuyisogqwaru2409
    @mvuyisogqwaru2409 2 months ago

    Hey guys, is anyone else having an issue with the invoke call when using Ollama (llama2 and llama3)? I get the following error: ValueError: Ollama call failed with status code 400. Details: {"error":"invalid options: tools"}

    • @aj.arijit
      @aj.arijit 24 days ago

      Exactly. I spent one full day on this and read a lot of documentation, although I gained a lot of knowledge along the way.
      I found a function someone wrote using Gemini (since there is no agent-building tool for Ollama) that supports chat generation using different tools and Ollama together at the same time:
      def process_user_request(user_input):
          # Parse user input for potential tool usage
          if "{" in user_input and "}" in user_input:
              # Extract tool name and arguments
              tool_call = user_input.split("{")[1].split("}")[0]
              tool_name, arguments = tool_call.split(":")
              arguments = eval(arguments)
              # Find the corresponding tool function
              for tool in tools:
                  if tool.__name__ == tool_name:
                      # Execute the tool with user arguments
                      tool_output = tool(arguments)
                      return tool_output
          # User request doesn't involve a tool, respond normally
          return f"I understand, but I can't use a tool for this request. {user_input}"

      while True:
          # Get user input
          user_input = input("User: ")
          # Process user request and generate response with Llama 2
          response = model.generate(
              input_ids=model.tokenizer.encode(
                  prompt.format(
                      user_input=user_input,
                      list_of_available_tools="\n* ".join([t.__name__ for t in tools]),
                  )
              ),
              max_length=1024,
              num_beams=5,
              no_repeat_ngram_size=2,
              early_stopping=True,
          )
          # Extract and format the generated response
          generated_text = model.tokenizer.decode(response[0]["generated_tokens"], skip_special_tokens=True)
          tool_output = process_user_request(user_input)
          final_response = generated_text.replace("{generated_response}", tool_output)
          # Print the final response to the user
          print(final_response)
      There is also something called binding of functions which I could not understand at all; it could solve this problem via the langchain.agents function create_tool_calling_agent. A sketch of that route follows below.
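
      A minimal sketch of that create_tool_calling_agent route, assuming a chat model that actually supports tool calling (the 400 "invalid options: tools" error above suggests the Ollama model/version in question did not; llm and tools stand in for objects created earlier):

          from langchain.agents import AgentExecutor, create_tool_calling_agent
          from langchain_core.prompts import ChatPromptTemplate

          prompt = ChatPromptTemplate.from_messages([
              ("system", "You are a helpful assistant."),
              ("human", "{input}"),
              ("placeholder", "{agent_scratchpad}"),  # required slot for the agent's intermediate steps
          ])

          agent = create_tool_calling_agent(llm, tools, prompt)
          executor = AgentExecutor(agent=agent, tools=tools, verbose=True)
          executor.invoke({"input": "Tell me about LangSmith"})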

  • @Mabzone-q4p
    @Mabzone-q4p 5 days ago

    Hi @Krish Naik, I have seen many videos on generative AI, but I feel there are many missing connections between basic-level understanding and coding understanding. I think everyone is capable of loading the libraries, using the classes, and getting the code done. Please also cover the basic core concepts about prompts, chat models, tools, agents, memory, and chains, with their types and where to use them, through code. The basic knowledge in these videos is scattered across different parts, times, and places. I hope you find this comment.

    • @vinayaksharma3650
      @vinayaksharma3650 3 days ago

      Learn from the blogs posted on the official websites; from there it will be easy to understand things like agents, tools, etc.

    • @Mabzone-q4p
      @Mabzone-q4p 3 days ago

      @@vinayaksharma3650 Hi, thanks for the suggestion. I read them and tried to understand, but some of the concepts are only a high-level overview and still need to be made understandable. My point in the comment above was that if someone is explaining things that are already explained, the explanation should go beyond the documentation and be easier to follow.

  • @tekionixkeshavag.452
    @tekionixkeshavag.452 2 months ago +26

    Please don't use OpenAI in this project, as its API is paid and we can't access it. Instead, use Gemini or any other open-source model so that we can also try it on our end.

    • @datatalkswithchandranshu2028
      @datatalkswithchandranshu2028 2 months ago +1

      Not Gemini, as they are removing free features quickly.

    • @tekionixkeshavag.452
      @tekionixkeshavag.452 2 months ago

      @@datatalkswithchandranshu2028 OK.

    • @quezinmark8225
      @quezinmark8225 2 months ago

      Break your training data into chunks smaller than the token limit so that you can use the free version even for big data.
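
      A minimal sketch of that chunking idea, assuming LangChain's RecursiveCharacterTextSplitter and a docs list of documents loaded earlier (the chunk_size/chunk_overlap values are illustrative, not tuned):

          from langchain_text_splitters import RecursiveCharacterTextSplitter

          # Split documents into small overlapping chunks (sized in characters here)
          # so that each request stays comfortably under the model's token limit
          splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
          chunks = splitter.split_documents(docs)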

    • @theinhumaneme
      @theinhumaneme 2 months ago

      Switch the LLM in LangChain (see the sketch at the end of this thread).

    • @tekionixkeshavag.452
      @tekionixkeshavag.452 2 months ago +1

      That is somewhat complicated, and we need help with that from Krish sir only @@theinhumaneme
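
      For anyone who wants to try it, swapping the LLM is mostly a one-line change; a hedged sketch assuming a locally running Ollama server and the langchain_community integration (the model name llama2 is just an example, and tool-calling agents additionally need a model/server version that supports tool calls, as discussed in other threads here):

          from langchain_community.chat_models import ChatOllama

          # Point LangChain at a local Ollama model instead of OpenAI
          llm = ChatOllama(model="llama2", temperature=0)
          # ...then pass this llm wherever ChatOpenAI was used before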

  • @omkarjamdar4076
    @omkarjamdar4076 2 months ago

    I decided not to use the retriever tool, so:
    tools = [wiki, arxiv]
    The following error occurred:
    TypeError: type 'Result' is not subscriptable

    • @mohamed_deshaune
      @mohamed_deshaune 2 months ago

      Can you provide the whole code? I can help you if you want.

    • @EkNidhi
      @EkNidhi 2 months ago

      This works for me, and I have also created a UI for it; it's working fine. Please send the code or more details about it... we will help you.

  • @thetagang6854
    @thetagang6854 2 months ago +1

    Great video. Ignore the comments about not using OpenAI; if they don't want to pay, they wouldn't be the ones developing actual apps anyway.

  • @EkNidhi
    @EkNidhi 2 months ago +1

    Please don't use any paid API key... we can't access it.

  • @rakeshkumar-pf6yu
    @rakeshkumar-pf6yu 2 months ago

    With due respect, you are a little fast here. I don't know why you are in a hurry these days. Please go a little slower, maybe 80% of your current speed. Thanks.

  • @thelifehackerpro9943
    @thelifehackerpro9943 1 month ago

    The code explanation is not good; each class and component should be explained properly. It is very confusing; you are just writing the code that you have been working with.

  • @venky433
    @venky433 2 months ago

    Did anyone face the "orjson.orjson" module error below?
    error: ModuleNotFoundError: No module named 'orjson.orjson'
    It comes up after running the "from langchain_community.tools import WikipediaQueryRun" line.

    • @venky433
      @venky433 2 months ago

      This error occurs with Python 3.12, but it worked with a lower version, i.e. 3.10.