Prompt Component in FlowHunt

The Prompt component in FlowHunt specifies bot roles and behavior for personalized AI responses. Control output with custom templates to build effective, context-aware chatbots.

Component description

How the Prompt component works

Without a good prompt, all bots would act the same way and often miss the mark with their answers. Prompts give instructions and context to the language model, helping it to understand what kind of text it should produce.

Prompt Component Overview

The Prompt component is designed to generate flexible prompt templates for use in AI workflows, allowing dynamic insertion of variables and context. This component is particularly useful in conversational AI scenarios, such as chatbots or virtual assistants, where creating adaptable and context-aware prompts is essential.

What Does the Component Do?

The Prompt component creates a prompt template that can incorporate various dynamic variables, such as user input, chat history, system instructions, and context messages. By leveraging these variables, the component helps you structure rich and context-sensitive prompts that enhance the performance and relevance of downstream AI models or agents.

Key Features

  • Dynamic Templates: Build prompts that automatically include available information like chat history, user input, and context.
  • Custom Variables: Supports insertion of variables such as {input}, {human_input}, {context}, {chat_history}, {system_message}, and {all_input_variables} directly into the prompt template.
  • System Message Support: Allows for the addition of system-level instructions to influence AI behavior.
  • Reusable in Workflows: The output of this component can be used as input for other components, such as LLMs (Large Language Models) or further processing steps.
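
The core idea behind these features is simple placeholder substitution. The following is a minimal Python sketch of that idea, assuming a plain string template; the function and variable names are illustrative and are not FlowHunt's internal implementation.

```python
# Minimal sketch of dynamic prompt templating. The placeholder names
# mirror the ones the component documents ({input}, {context}, ...),
# but this is an illustration, not FlowHunt's actual code.

def render_prompt(template: str, **variables: str) -> str:
    """Replace {placeholders} in the template with supplied values.

    Placeholders without a supplied value are left intact, so a
    partially filled template stays visible while debugging.
    """
    result = template
    for name, value in variables.items():
        result = result.replace("{" + name + "}", value)
    return result

template = (
    "SYSTEM: {system_message}\n"
    "CONTEXT: {context}\n"
    "INPUT: {input}\n"
    "ANSWER:"
)

prompt = render_prompt(
    template,
    system_message="You are a helpful customer service bot.",
    context="URLsLab pricing page contents...",
    input="How much does URLsLab cost?",
)
print(prompt)
```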

Inputs

The following inputs can be provided to the Prompt component:

| Input Name | Type | Required | Description |
|---|---|---|---|
| Chat History | InMemoryChatMessageHistory | No | Previous conversation messages. Useful for maintaining context or generating alternative queries. |
| Context | Message | No | Additional context information to be included in the prompt. |
| Input | Message | No | The main user input or message. |
| System Message | String (multiline) | No | System-level instructions to guide the AI’s behavior. |
| Template | Prompt (multiline) | No | The actual template for the prompt, supporting dynamic variables for customization. |

Outputs

  • Message:
    The component outputs a single message object that contains the constructed prompt, with all dynamic variables replaced by their corresponding values.
| Output Name | Type | Description |
|---|---|---|
| Message | Message | The generated prompt, ready for use in downstream AI components. |

Example Use Cases

  • Conversational AI: Automatically generate prompts for chatbots based on user input, conversation history, and additional context.
  • Retrieval-Augmented Generation: Customize prompts for retrieval tasks by including relevant past interactions and system instructions.
  • Instruction Tuning: Easily adapt prompts for different tasks or user personas by adjusting the template and system message.

Why Use This Component?

  • Enhances Prompt Engineering: Easily manage and update prompt templates without hardcoding.
  • Improves AI Relevance: By injecting context, system messages, and history, prompts become more informative and precise.
  • Increases Flexibility: Supports a wide range of use cases, from simple Q&A to complex, multi-turn conversations.

Summary Table

| Feature | Benefit |
|---|---|
| Dynamic variable injection | Context-aware, adaptable prompts |
| Support for chat history | Maintains continuity in multi-turn interactions |
| System message integration | Fine-tunes AI personality or instructions |
| Easy integration in workflows | Streamlines prompt creation for downstream AI |

This component is a foundational tool for anyone building sophisticated, context-sensitive AI workflows where prompt construction is key to achieving high-quality results.

Template

This is an advanced optional setting. You can create prompt templates with specified variables to control the chat output fully. For example:

As a skilled SEO, analyze the content of the URL and come up with a title up to 65 characters long.
— Content of the URL —
{input}
Task: Generate a Title similar to others using {human_input} query. Don’t change {human_input} in the new title.
NEW TITLE:

The default prompt template looks like this:

You are an AI language model assistant.
Your task is to generate answer based on the input query.
If context is provided, use it to generate the answer to INPUT and HUMAN_INPUT query.
Format answer with markdown.

ANSWER IN LANGUAGE: {lang}
VARIABLES:
{"session_start_time": "2025-06-03 07:35:22", "current_page_url": "https://app.flowhunt.io/aistudio/flows/de6c2e2c-d817-4b2f-af2c-12dba3f46870?ws=74be5f74-d7c5-4076-839d-8ac1771a3b75"}
INPUT: {input}

ANSWER:

The default prompt mirrors the structure of the component’s settings. You can override those settings by editing the template and using the variables directly in the template field. Creating your own templates gives you greater control over the output.
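
To make the override concrete, here is a hypothetical rendering of the custom SEO template from above. The values supplied for {input} and {human_input} are invented sample data; this is a plain-Python illustration, not FlowHunt's internal API.

```python
# Hypothetical rendering of the custom SEO template shown earlier.
# The values for {input} and {human_input} are invented sample data.
template = (
    "As a skilled SEO, analyze the content of the URL and come up with "
    "a title up to 65 characters long.\n"
    "— Content of the URL —\n"
    "{input}\n"
    "Task: Generate a Title similar to others using {human_input} query. "
    "Don't change {human_input} in the new title.\n"
    "NEW TITLE:"
)

# Once both placeholders are filled, the result is the exact text
# the downstream LLM would receive.
prompt = template.format(
    input="Page text comparing popular keyword research tools...",
    human_input="keyword research tools",
)
print(prompt)
```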

How to connect the Prompt component to your flow

The prompt is an optional component that further refines and specifies the final output. Several components can be connected to it:

  • Chat History: Connecting Chat History is not required but is often beneficial. Remembering previous messages makes future replies more relevant.
  • Context: Any meaningful text output can serve as context. The most common choice is to connect the knowledge from retrievers.
  • Input: Only the Chat Input component can be connected here.

This component’s output is text that can be connected to various components. Most of the time, you immediately follow up with the Generator component to connect the prompt to an LLM.

Example

Let’s create a very simple bot. We’ll expand on the medieval knight bot example from earlier. While it talks funny, its main mission is to be a helpful customer service bot, and we want it to provide relevant information.

Let’s ask our bot a typical customer service question. We’ll ask about the pricing of URLsLab. To get a successful answer, we need to:

  • Give it context: For the purposes of this example, let’s use the URL retriever component to give it a page with all the necessary information.
  • Connect input: Input is always the human message from the Chat Input component.
  • Chat History: It’s optional, but let’s connect it for this particular case.
  • Template: We’ll keep the prompt “You are a helpful customer service bot that talks like a medieval knight.” Prompts can be much more elaborate than this; see our prompts library for inspiration.
  • Add Generator: We want the bot to have conversational abilities. To do this, connect the Generator. The Prompt serves as input for the generator.
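
The steps above can be sketched as a pipeline: a retriever supplies context, the Prompt step assembles the final prompt, and a Generator (LLM) step produces the reply. The function names below are illustrative stand-ins, not FlowHunt APIs, and the generated reply is invented.

```python
# Hypothetical sketch of the example flow: retriever -> prompt -> generator.

def retrieve_context(url: str) -> str:
    # Stand-in for the URL retriever component.
    return f"Contents of {url}: plan names, pricing tables, ..."

def build_prompt(system_message: str, context: str,
                 user_input: str, history: list[str]) -> str:
    # Stand-in for the Prompt component: inject the connected inputs.
    return (
        f"SYSTEM: {system_message}\n"
        "CHAT HISTORY:\n" + "\n".join(history) + "\n"
        f"CONTEXT: {context}\n"
        f"INPUT: {user_input}\n"
        "ANSWER:"
    )

def generate(prompt: str) -> str:
    # Stand-in for the Generator component (the actual LLM call).
    return "Hark! Our plans and prices art listed thus..."

history = ["user: Hello", "bot: Well met, traveller!"]
context = retrieve_context("https://www.urlslab.com/pricing/")
prompt = build_prompt(
    "You are a helpful customer service bot that talks like a medieval knight.",
    context,
    "What does URLsLab cost?",
    history,
)
reply = generate(prompt)
print(reply)
```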

The resulting flow will look something like this:

Result flow using Prompt component in FlowHunt

It’s time to test the knowledge of our medieval knight bot. The URL we gave it is the URLsLab pricing page, so let’s ask about it:

Flowhunt bot answers according to Prompt

Our bot now uses pompous old-timey language to answer basic queries. But more importantly, notice how the bot adheres to its central role as a helpful customer service bot. Lastly, it successfully uses the information from the specified URL.

Examples of flow templates using the Prompt component in FlowHunt

To help you get started quickly, we have prepared several example flow templates that demonstrate how to use the Prompt component effectively. These templates showcase different use cases and best practices, making it easier for you to understand and implement the component in your own projects.


Frequently asked questions

What is the Prompt component?

The Prompt component gives the bot instructions and context, ensuring it replies in the desired way.

Do I always need to include Prompt in my flows?

Including it for many use cases is a great idea, but the component is optional.

What is the system message?

It’s an editable text field where you set the personality and role of the bot. Simply fill in the template: 'You are a {role} that {behavior}.' For example, 'You are a helpful customer service bot that talks like a medieval knight.'
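
As a quick illustration only (FlowHunt fills this in for you; no code is required), the pattern from the FAQ behaves like Python's `str.format`:

```python
# Illustrative only: filling the system-message pattern with str.format.
pattern = "You are a {role} that {behavior}."
msg = pattern.format(
    role="helpful customer service bot",
    behavior="talks like a medieval knight",
)
print(msg)  # You are a helpful customer service bot that talks like a medieval knight.
```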


Try FlowHunt's Prompt Component

Start building personalized, context-aware AI chatbots with FlowHunt's intuitive Prompt feature. Define roles, behaviors, and control output for smarter automations.
