Simple QA using Open Source Large Language Models
In this use case, we will build a simple Question and Answering assistant, similar to ChatGPT, with the help of Open Source Large Language Models.
Every Large Language Model needs a well-crafted prompt to guide its behavior. For a Q&A chatbot, the Prompt Template component provides the model with clear instructions about its task, ensuring it understands what is expected and generates accurate responses.
Click on the Prompt Template component and edit the template with your prompt:
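For example, a minimal Q&A template might look like the sketch below. The exact wording and the {question} variable name are illustrative; adjust them to your use case and make sure the variable matches the input your flow passes in.

```
You are a helpful assistant. Answer the user's question clearly and concisely.

Question: {question}

Answer:
```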
Know more about Prompts: https://docs.aiplanet.com/components/prompts
To generate intelligent responses, we need a Large Language Model that can understand instructions and respond accordingly. Here we use an Open Source Large Language Model because it is accessible and cost-effective. With the HuggingFace Hub LLM component, you simply enter your Access Token and the model name.
Notably, this approach gives you access to 7B Large Language Models without having to load them manually.
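Conceptually, this component behaves like LangChain's HuggingFaceHub LLM wrapper. The Python sketch below is illustrative only: the repo_id (a 7B instruct model), the token placeholder, and the model_kwargs are assumptions you should replace with your own model and Access Token.

```python
from langchain.llms import HuggingFaceHub

# Illustrative values: swap in your own model name and HuggingFace access token.
llm = HuggingFaceHub(
    repo_id="mistralai/Mistral-7B-Instruct-v0.1",      # any 7B instruct model on the Hub
    huggingfacehub_api_token="hf_...",                  # your HuggingFace access token
    model_kwargs={"temperature": 0.5, "max_new_tokens": 256},
)
```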
Know more about LLM: https://docs.aiplanet.com/components/large-language-models
To link the prompt with the Large Language Model (LLM), we use a component called LLM Chain. Connect your Prompt Template and LLM to the LLM Chain, and optionally add a memory component if needed. The LLM Chain fills your template with the user's input and passes the resulting prompt to the LLM, so the model receives clear instructions and returns sensible responses.
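The visual LLM Chain component corresponds roughly to LangChain's LLMChain. A minimal sketch, assuming the llm object from the previous step and a single {question} input variable:

```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# The same template as above, expressed as a LangChain PromptTemplate.
prompt = PromptTemplate(
    input_variables=["question"],
    template=(
        "You are a helpful assistant. Answer the user's question clearly and concisely.\n\n"
        "Question: {question}\n\nAnswer:"
    ),
)

# Wire the prompt and the LLM together; 'llm' is the HuggingFace Hub model defined earlier.
chain = LLMChain(prompt=prompt, llm=llm)

print(chain.run(question="What is an open source large language model?"))
```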
Know more about Chains: https://docs.aiplanet.com/components/chains
Once the chain is complete, click the build icon in the bottom-right corner of the page.
Once the build is complete, click the chat icon to test the flow.
If you receive a response, the flow is running successfully, and you can now deploy it to share with others.
Check out how to use Chat Interface: https://docs.aiplanet.com/quickstart/chat-interface-genai-stack-chat