Use a Prompt node to call a large language model (LLM) that generates, extracts, or classifies information based on your inputs. A Prompt node can return either plain text or a structured JSON object.

A Prompt node in Flow includes four key attributes:
system_prompt
Initial instructions that guide the LLM's behavior. Use the system prompt to set the tone or role for the LLM. For example:
```text
You're a friendly bot that generates a morning greeting message.
```
Note:
You can insert dynamic values using `{variable_name}` syntax.
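For instance, a system prompt can reference a dynamic value directly (the `{time_of_day}` variable below is illustrative):

```text
You're a friendly bot that generates a {time_of_day} greeting message.
```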
user_prompt
The specific request or task you want the LLM to perform. Use the user prompt to define the task clearly. For example:
```text
Generate a simple greeting message for {name} wishing them a good day.
```
Note:
You can insert dynamic values using `{variable_name}` syntax, as shown with `{name}` above.
llm
The name of the LLM to use. Make sure the name matches one of the supported models. For example:
```text
meta-llama/llama-3-3-70b-instruct
```
llm_parameters
Configuration options such as temperature, token limits, and sampling strategies. Customize the generation behavior using these parameters (see the sketch after this list):
- `temperature`: Controls randomness. Higher values produce more diverse outputs.
- `min_new_tokens`: Sets the minimum number of tokens to generate.
- `max_new_tokens`: Sets the maximum number of tokens to generate.
- `top_k`: Limits token selection to the top k most likely options.
- `top_p`: Uses nucleus sampling to select from tokens within the top p cumulative probability.
- `stop_sequences`: Defines sequences that stop generation when encountered.
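A minimal sketch of an `llm_parameters` dictionary, assuming the parameters are passed as a plain Python `dict` (the values shown are illustrative, not tuned recommendations):

```python
# Illustrative llm_parameters; adjust the values for your own use case.
llm_parameters = {
    "temperature": 0.7,         # moderate randomness
    "min_new_tokens": 10,       # generate at least 10 tokens
    "max_new_tokens": 200,      # cap the response length
    "top_k": 50,                # sample from the 50 most likely tokens
    "top_p": 0.9,               # nucleus sampling within 90% cumulative probability
    "stop_sequences": ["\n\n"]  # stop generating at the first blank line
}
```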
Example:
In this example, you use a Prompt node to extract structured data from a support request.
```python
prompt_node = aflow.prompt(
    name="extract_support_info",
    display_name="Extract information from a support request message.",
    description="Extract information from a support request message.",
    system_prompt=[
        "You are a customer support processing assistant. Your job is to take the supplied support request received by email,",
        "and extract the information in the output as specified in the schema."
    ],
    user_prompt=[
        "Here is the {message}"
    ],
    llm="meta-llama/llama-3-3-70b-instruct",
    llm_parameters={
        "temperature": 0,                    # deterministic output for extraction
        "min_new_tokens": 5,
        "max_new_tokens": 400,
        "top_k": 1,                          # always pick the most likely token
        "stop_sequences": ["Human:", "AI:"]
    },
    input_schema=Message,
    output_schema=SupportInformation
)
```
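The example assumes that `Message` and `SupportInformation` are schema classes defined elsewhere. A minimal sketch of what they might look like, assuming Pydantic-style models (the field names are hypothetical):

```python
from pydantic import BaseModel

# Hypothetical input schema: the raw support request text.
# The "message" field matches the {message} placeholder in the user prompt.
class Message(BaseModel):
    message: str

# Hypothetical output schema: the structured fields the LLM extracts.
class SupportInformation(BaseModel):
    customer_name: str
    issue_summary: str
    priority: str
```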