This video is part of my Best Selling Udemy Course on Amazon Bedrock and Generative AI - www.udemy.com/course/amazon-bedrock-aws-generative-ai-beginner-to-advanced/?referralCode=A02153572B8864F928E7&couponCode=NVDPRODIN35
Is it possible to optimize the costs of OpenSearch in a development environment? Implementing this example incurs a charge of approximately USD $6 daily, which amounts to around USD $180 per month, regardless of whether the agent is being used or not. Can the service be shut down without deleting the collection? Or what alternatives exist to optimize costs?
I lowered the size of the resources in the collection created in OpenSearch; it is cheaper, but still expensive.
Thanks for watching my video...Please try Pinecone and other vector stores...they seem to be cheaper alternatives compared to OpenSearch...
Thanks for this great demo. Could you please share some insights regarding costing for the services used from knowledge base excluding gateway, lambda and S3?
Knowledge Bases, though AWS calls it a serverless service, is not truly serverless...It has a per-hour cost of approximately $1 whether you use it or not, and the cost mainly comes from the OpenSearch vector store...Please be careful with Knowledge Bases from a cost perspective...When you delete the Knowledge Base, you also need to separately delete the OpenSearch vector store by going to the OpenSearch service....
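To act on the cleanup advice above, you can locate and delete the leftover vector store programmatically. This is a minimal sketch using the boto3 `opensearchserverless` client; the collection name shown is an assumption — use whatever name Bedrock generated for your knowledge base, and note that `delete_collection` is irreversible.

```python
def find_collection_id(list_response, name):
    """Return the id of the collection whose name matches, else None.
    `list_response` is the dict returned by list_collections()."""
    for summary in list_response.get("collectionSummaries", []):
        if summary.get("name") == name:
            return summary["id"]
    return None

# Against a live account (requires AWS credentials; the collection
# name below is a placeholder, not a real default):
# import boto3
# aoss = boto3.client("opensearchserverless")
# cid = find_collection_id(aoss.list_collections(), "bedrock-kb-vectors")
# if cid:
#     aoss.delete_collection(id=cid)  # stops the per-hour OCU charges
```

The helper is kept separate from the AWS calls so you can verify your matching logic before pointing it at a real account.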
Question: this guy made an AI that's hyper-brilliant at communicating with other people, and not as good at other things. That's kind of what I'm looking for (for another specialization). Is this the way of achieving that? Is this how you create savant AIs?
I'm not sure what the question is?
Could you please explain the AWS Lambda part after the knowledge base, and whether this app runs on AWS or locally? Please explain in detail.
Hi Jisu - I will try...In the meantime you can check out the detailed use case implementation including AWS Lambda code from Udemy Course on Bedrock and GenAI - www.udemy.com/course/amazon-bedrock-aws-generative-ai-beginner-to-advanced/?referralCode=A02153572B8864F928E7&couponCode=NVDPRODIN35
I believe in the Lambda part, Rahul is showing how the RAG / knowledge base model can be called via an API call. The Lambda function in the video was just to call the knowledge base — that's all, to test the knowledge base bot.
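A minimal sketch of such a Lambda handler, assuming the `retrieveAndGenerate` API of the `bedrock-agent-runtime` client. The knowledge base id and model ARN are placeholders you would replace with your own; the request-building helper is split out so it can be checked without AWS credentials.

```python
import json

def build_rag_request(kb_id, model_arn, question, session_id=None):
    """Assemble the retrieveAndGenerate request payload."""
    req = {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }
    if session_id:  # only include when continuing a conversation
        req["sessionId"] = session_id
    return req

def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported here so the
    # helper above stays testable outside AWS.
    import boto3
    client = boto3.client("bedrock-agent-runtime")
    body = json.loads(event.get("body") or "{}")
    req = build_rag_request(
        kb_id="KB_ID_HERE",          # assumption: your knowledge base id
        model_arn="MODEL_ARN_HERE",  # assumption: e.g. a Claude model ARN
        question=body.get("question", ""),
        session_id=body.get("sessionId"),
    )
    resp = client.retrieve_and_generate(**req)
    return {
        "statusCode": 200,
        "body": json.dumps({
            "answer": resp["output"]["text"],
            "sessionId": resp.get("sessionId"),
        }),
    }
```

Wired behind API Gateway, this is enough to test the knowledge base bot end to end.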
There is an error when using any model other than Claude. Does AWS allow integrating the knowledge base with other LLMs?
It only supports Claude at the moment.
@@trisalrahul Thanks Rahul sir... And for Claude, AWS requires company details and a use case... Can't we do it without that?
@Akashbadal978 - You can mention you are doing a PoC for personal use. It's just for Anthropic, makers of the Claude LLM, to learn how their model is being used…shouldn't be a problem.
Thanks for your video and Udemy tutorial. I have a question on the AWS Bedrock knowledge base: if new PDF files are added to the source S3 location, does it automatically perform the embedding, or do we have to sync again?
Good question Neeraj...You will have to do a sync for any new documents added....
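The sync can also be triggered programmatically, as a sketch using the `bedrock-agent` client's `start_ingestion_job` / `get_ingestion_job` calls; the knowledge base and data source ids are placeholders for your own.

```python
def is_sync_complete(job_response):
    """True once the ingestion job reports the COMPLETE status."""
    return job_response["ingestionJob"]["status"] == "COMPLETE"

# Usage (requires AWS credentials; ids are placeholders):
# import boto3, time
# agent = boto3.client("bedrock-agent")
# job = agent.start_ingestion_job(
#     knowledgeBaseId="KB_ID_HERE", dataSourceId="DS_ID_HERE")
# job_id = job["ingestionJob"]["ingestionJobId"]
# while not is_sync_complete(agent.get_ingestion_job(
#         knowledgeBaseId="KB_ID_HERE", dataSourceId="DS_ID_HERE",
#         ingestionJobId=job_id)):
#     time.sleep(10)  # poll until the new documents are embedded
```

Running this after each S3 upload (e.g. from an S3-event Lambda) effectively automates the manual "Sync" button.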
Hello can you expand on the sessionID portion of the request syntax for retrieveAndGenerate API? The documentation doesn't explain sessionIDs. Thank you!
Hi Alex, the sessionId is used if you want to create a chatbot, so that the entire previous conversation is used as part of the context window…
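In practice that means carrying the `sessionId` from each `retrieveAndGenerate` response into the next request. A sketch, assuming the `bedrock-agent-runtime` client; the config ids in the commented loop are placeholders.

```python
def next_call_kwargs(question, config, prev_response=None):
    """Build kwargs for the next retrieve_and_generate call, carrying
    forward the sessionId from the previous response so the service
    keeps the conversation context."""
    kwargs = {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": config,
    }
    if prev_response and prev_response.get("sessionId"):
        kwargs["sessionId"] = prev_response["sessionId"]
    return kwargs

# Sketch of a chat loop (requires credentials; ids are placeholders):
# import boto3
# rt = boto3.client("bedrock-agent-runtime")
# config = {"type": "KNOWLEDGE_BASE",
#           "knowledgeBaseConfiguration": {"knowledgeBaseId": "KB_ID",
#                                          "modelArn": "MODEL_ARN"}}
# resp = None
# for q in ["What is the leave policy?", "And for contractors?"]:
#     resp = rt.retrieve_and_generate(**next_call_kwargs(q, config, resp))
#     print(resp["output"]["text"])
```

The first call omits `sessionId` (the service creates one); follow-up questions like "And for contractors?" then resolve against the earlier turns.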
I am your 1000 subscriber
Awesome…thank you 👍🏻👍🏻
Great tutorial, just subbed… one question on cost: if the solution includes OpenSearch Serverless, why is there an hourly cost related to it?
Great question… it is a serverless service, so there should be no per-hour cost, but unfortunately there is. I will need to research why and update here.
Thank you ❤
Helpful video!
One issue I am facing with the retrieveAndGenerate API is that the cited references are not returned, even though the answer is present in the response.
Any idea what the issue is?
Hi, does the raw response show citations? Maybe you have incorrectly parsed the dictionary response...
@@trisalrahul The response JSON contains citations as well, as per the retrieveAndGenerate API documentation.
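For reference, citations sit in a nested structure in the `retrieveAndGenerate` response, which is easy to mis-parse. A sketch of pulling out the source S3 URIs, assuming the documented shape (`citations` → `retrievedReferences` → `location.s3Location.uri`):

```python
def extract_citations(response):
    """Collect the S3 URIs of retrieved references from a
    retrieveAndGenerate response dict."""
    uris = []
    for citation in response.get("citations", []):
        for ref in citation.get("retrievedReferences", []):
            loc = ref.get("location", {}).get("s3Location", {})
            if loc.get("uri"):
                uris.append(loc["uri"])
    return uris

# answer = response["output"]["text"]
# sources = extract_citations(response)
```

If this returns an empty list while the answer is populated, the model may simply not have used any retrieved chunk for that answer.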
Hey Rahul, nice step-by-step tutorial — it helps get the overall idea. I have a question though: what happens if I ask a question for which no related data is present in the configured S3 data source?
So, let's say I ask "Who is Michael Jackson?" — will it know the answer?
I am trying to understand what data it is already trained on by default, apart from what we provide from the S3 data source.
Good question Prashant...Bedrock Knowledge Bases provides contextual information, but the answer is actually generated by the Claude foundation model (the LLM) in this use case...So, if it gets context from Knowledge Bases, it will use it to augment the answer; if it does not find any context, it will still try to answer based on its training data. Michael Jackson is a very common question, so it will be able to answer it...
In this scenario, is it possible to configure the model so that it doesn't respond to questions whose context is not present in the uploaded files?
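One possible approach — a sketch, not an official Bedrock feature — is to call the `retrieve` API first and refuse to generate when nothing relevant comes back. The 0.4 score threshold below is an illustrative assumption, not an AWS default.

```python
REFUSAL = "Sorry, I can only answer questions covered by the uploaded documents."

def guard_answer(retrieval_results, min_score=0.4):
    """Return a refusal message when no retrieved chunk clears the
    relevance threshold, else None (meaning: go ahead and generate)."""
    relevant = [r for r in retrieval_results
                if r.get("score", 0.0) >= min_score]
    return None if relevant else REFUSAL

# Usage (requires credentials; ids are placeholders):
# import boto3
# rt = boto3.client("bedrock-agent-runtime")
# results = rt.retrieve(knowledgeBaseId="KB_ID",
#                       retrievalQuery={"text": question})["retrievalResults"]
# refusal = guard_answer(results)
# if refusal is None:
#     ...  # proceed with retrieve_and_generate as usual
```

Alternatively, instructing the model via the prompt to answer only from the provided context helps, but the pre-retrieval check above is more deterministic.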
Hello sir! I have an issue creating the knowledge base. When I create it, it shows "failed to create OpenSearch Serverless collection". Even though I gave the user full access to Bedrock and the OpenSearch service, and made the S3 bucket accessible to the OpenSearch service, the issue is not fixed. Can you help me resolve it? I'm struggling with it — please help!
Let Knowledge Bases create the IAM Service Role please…
@@trisalrahul I even tried that. It doesn't work !!!
Can you provide the code?
@trisalrahul
Hi - I will try and upload the code...In the meantime you can check out the detailed use case implementation including AWS Lambda code from Udemy Course on Bedrock and GenAI - www.udemy.com/course/amazon-bedrock-aws-generative-ai-beginner-to-advanced/?referralCode=A02153572B8864F928E7&couponCode=NVDPRODIN35