Prompts

A prompt is the instruction text you pass to a Large Language Model (LLM) to steer the response it generates. Writing effective prompts is crucial, as it helps mitigate hallucinations to a certain extent. Several prompting techniques exist, including Few-shot prompting, Zero-shot prompting, and Chain of Thought. You can think of the prompt as a parent guiding the LLM, the child, toward producing accurate responses.

Note: Keep in mind that every Large Language Model has a maximum context length, which limits how many tokens it can process at once. Longer prompts consume more of this context length, leaving less room for the generated response.
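As a rough illustration, you can estimate how much of the context length a prompt consumes by counting its tokens. The sketch below is Python and assumes the tiktoken library and an arbitrary 4,096-token limit; the actual limit and tokenizer depend on the model you use.

import tiktoken

MAX_CONTEXT = 4096  # assumed limit; check your model's documentation
encoding = tiktoken.get_encoding("cl100k_base")

prompt = "You are a helpful assistant that talks casually about life in general."
prompt_tokens = len(encoding.encode(prompt))

# Whatever the prompt consumes is no longer available for the response.
tokens_left_for_response = MAX_CONTEXT - prompt_tokens
print(prompt_tokens, tokens_left_for_response)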

System Message Prompt Template

System messages are instructions that describe the overall task you are prompting the Large Language Model for. They guide the model on the specific goal or objective it needs to achieve within the context of the task at hand.

Parameters

  • prompt: The field where you enter the instructions that make up the System prompt.

Example

You are a helpful assistant that talks casually about life in general.
You are a good listener and you can talk about anything.

The System Message Prompt template doesn't require any input component; its output can be passed to a Sequential Chain or a Chat Prompt Template.
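For reference, an equivalent system prompt can also be composed in code. This is a minimal sketch assuming the LangChain Python package; in the visual builder you only fill in the prompt field and connect the component.

from langchain.prompts import SystemMessagePromptTemplate

system_prompt = SystemMessagePromptTemplate.from_template(
    "You are a helpful assistant that talks casually about life in general. "
    "You are a good listener and you can talk about anything."
)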


Human Message Prompt Template

The Human Message prompt provides the specific request relevant to the broad task outlined in the system message. It serves as a focused input that guides the model's response generation toward the desired outcome.

Parameters

  • prompt: Accepts the user query along with any rules the model should follow when generating the response.

Example Usage

There may be occasions where the Human Message prompt template needs to incorporate user input. To integrate user input or variables, enclose each variable name in curly brackets {}.

Without Input Variable

You can define the rules that an LLM needs to follow when generating the response, as in the example below.

Since the prompt does not include any input variables, there is no input connection on the Human Message Prompt template. The output connection remains the same as for the System Message Prompt.

Rules:
- Answer to the given context
- If the user question is not in context, say "I don't know"
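A minimal code sketch of the same rule-only human prompt, assuming the LangChain Python package:

from langchain.prompts import HumanMessagePromptTemplate

human_prompt = HumanMessagePromptTemplate.from_template(
    "Rules:\n"
    "- Answer to the given context\n"
    "- If the user question is not in context, say \"I don't know\""
)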

With Input Variable

Let's use the same prompt as before but with input variables.

Rules:
- Answer to the given {context}
- If the user {question} is not in context, say "I don't know"

Now, two new input fields appear on the Human Message component, one for each variable. These can be connected to a loader or a text splitter to supply the values for the input variables. Leaving them unconnected will not cause a stack error; however, when using the chat interface you will need to fill in these variables manually.
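The same prompt with input variables, sketched in code under the same LangChain assumption. The values shown for context and question are placeholders; in the flow they would come from a loader, a text splitter, or the chat interface.

from langchain.prompts import HumanMessagePromptTemplate

human_prompt = HumanMessagePromptTemplate.from_template(
    "Rules:\n"
    "- Answer to the given {context}\n"
    "- If the user {question} is not in context, say \"I don't know\""
)

# Both variables must be filled in before the prompt reaches the model.
message = human_prompt.format(
    context="Paris is the capital of France.",
    question="What is the capital of France?",
)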


Chat Prompt Template

The Chat Prompt Template is a hybrid template that combines elements of both the System Message Prompt and the Human Message Prompt. Its output is passed directly to the chain, which forwards the composed prompt to the Large Language Model for generating a response.

Parameters

  • Messages: This is the input component in ChatPromptTemplate, where you can connect Human Message and System Message prompts.

Example Usage

The Chat Prompt Template takes messages as input and can be connected to both the Human Message and System Message prompts. Its output is then connected to the chain.
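A minimal sketch of what this composition looks like in code, assuming the LangChain Python package:

from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

chat_prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(
        "You are a helpful assistant that talks casually about life in general."
    ),
    HumanMessagePromptTemplate.from_template(
        "Rules:\n"
        "- Answer to the given {context}\n"
        "- If the user {question} is not in context, say \"I don't know\""
    ),
])

# The chain receives the fully formatted list of messages.
messages = chat_prompt.format_messages(
    context="Paris is the capital of France.",
    question="What is the capital of France?",
)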


Prompt Template

Prompt Template is a component where you can apply different prompting techniques such as Few-shot, Zero-shot, Chain of Thought, and so on. A prompt template can be constructed from either a set of examples or from an Example Selector object.

Parameters

  • Template: A free-text field where you write the prompt using whichever prompting technique you need.

Example Usage

This prompt component is very straightforward; you can enter any instructions that you need to provide to the LLM. To receive a response, this component should be connected to either an LLM Chain or a RetrievalQAPrompt.
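As an illustration, the template below applies a Few-shot style with a single input variable. It is a minimal sketch assuming the LangChain Python package; the example reviews are placeholders.

from langchain.prompts import PromptTemplate

few_shot_template = """Classify the sentiment of the review.

Review: The food was amazing. Sentiment: positive
Review: The service was slow and rude. Sentiment: negative

Review: {review} Sentiment:"""

prompt = PromptTemplate.from_template(few_shot_template)
print(prompt.format(review="Great atmosphere, will come again."))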


Chat Message Prompt Template

The Chat Message Prompt Template resembles the Prompt Template but offers the flexibility to designate whether the entered prompt is a system message or a human message.

Parameters

  • prompt: Input either a system prompt or a human prompt according to your requirements.

  • role: Assign system for a System prompt or user for a Human prompt, matching the prompt you entered.

Example Usage

The Chat Message Prompt Template takes the prompt and its assigned role as input, and its output connects to the Chat Prompt Template.

Think of the Chat Message Prompt Template as an alternative to both the System and Human Message prompt templates.
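A minimal sketch of the same idea in code, assuming the LangChain Python package; the role value decides whether each message behaves as a system or a human prompt.

from langchain.prompts import ChatMessagePromptTemplate, ChatPromptTemplate

system_message = ChatMessagePromptTemplate.from_template(
    role="system",
    template="You are a helpful assistant that talks casually about life in general.",
)
user_message = ChatMessagePromptTemplate.from_template(
    role="user",
    template="Rules:\n"
             "- Answer to the given {context}\n"
             "- If the user {question} is not in context, say \"I don't know\"",
)

chat_prompt = ChatPromptTemplate.from_messages([system_message, user_message])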

