Prompt Design
N-shot prompting and it comes in different variants. For example, zero-shot prompting is when you simply send an instruction without any examples, and few-shot prompting is when you send an instruction along with examples.
Chain of Thought which consists of providing a prompt and breaking up the actions, into intermediary reasoning steps, so that the LLM can find the most correct, well thought out answer. This technique excelling at various arithmetic, commonsense, and symbolic reasoning tasks.
ReAct and it’s a loop thinking process involving answering questions and looking up data in external systems. After a prompt, the LLM responds with a thought and an action, which is used to generate an observation.
The thoughts are the reasoning about how to act.
The actions are used to formulate calls to an external system, which can be Cloud Functions or external APIs.
And the observations are the response from the external system.
Through these interleaved thoughts, actions, and observations the LLM eventually arrives at an answer.
Prompt Tips
Label: una label è una sentenza seguita da ":".
Prompt principalmente sono di 3 tipi: classification summarization and extraction. classification prompts make the model assign a class or a category to the text you can have the model automatically assign a category or you can provide it with a list of categories to choose from summarization prompts extract a summary of the most important information from the text you can give the model additional direction or let it choose an appropriate summary you can influence the length of its response by increasing or decreasing the maximum number of tokens when you submit your prompt and extraction
prompts allow you to ask the model questions about a given text it will parse the text and use it to answer your questions
Classifcation Prompt
The model will select one classification for the input text rather than outputting a list of probabilities furthermore it means that if you give the model the same input text and parameters it should always give you back the same response. È possibile istruire anche un'opzione di fallback.
- set temperature to 0
- set the top K responses to one this forces the model to always pick the classification label
Summarization Prompt
Scrivi prima le istruzioni per il summary e poi a seguire testo da fare summary
Increase the creativity of the model's responses by increasing the temperature top k and top P values and when you write your prompt tell the model the purpose
Extraction Prompt
in the text event extraction is used for extracting information about events from text such as project milestones and product launch dates and question answering this
allows you to ask the model questions about text that you provide giving you contextual responses with citations from the text this can save you a
- set temperature to 0
- set the top K responses to one this forces the model to always
Chat Prompt
analyze data for you there are three principal components of a chat prompt that you should take into consideration
messages context and examples messages contains a list of author content pairs that distinguish whether the message content originated
from the user or the model the model will respond to the last message in the list using all other messages in the list as context for its response thereby making the user's chat experience more conversationalComponents:
- message
- context (history)
- examples
Best practices:
- start by giving the chatbot an identity and a Persona this ells the model to role play
- give rules for the chatbot to follow so that you can limit Its Behavior
- add rules that prevent the exposure of context information
- test your chatbot and add rules to counteract undesirable behaviors
- add a rule to reduce hallucinations and to help the chatbot give more factual answers