Flow description
Purpose and benefits
Workflow Overview: Simple Flow with Chat History
This workflow is designed to facilitate an interactive chat experience where the AI assistant responds to user-defined tasks, while leveraging the chat history for context-aware answers. It is a general-purpose template, making it adaptable for a wide variety of conversational automations and scalable AI-driven chat solutions.
Step-by-Step Workflow Breakdown
1. Chat Session Initiation and Welcome Message
- Chat Opened Trigger: Fires when the user opens the chat, starting the workflow.
- Welcome Message: A message widget displays a friendly welcome message to the user:
👋 Welcome to the Simple Task Flow!
This tool is designed for you to define your own task based on your input 🌟. I’ll take into account our chat history to provide relevant assistance without any additional context.
Just let me know what you’d like to do, and let’s get started! ✨💬
- Display: The welcome message is shown in the chat output area, providing onboarding and setting expectations.
2. User Input
- Chat Input Node: Receives text (and optionally file) input from the user, representing the task or question they want to address.
3. Retrieving Chat History
- Chat History Node: Fetches up to the last 10 messages (with a token cap of 8000) from the chat. This history is later used to provide context and maintain continuity in the conversation.
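The trimming behavior of the Chat History node can be sketched in Python. This is a minimal illustration, not the node's actual implementation: the function name `trim_history` is hypothetical, and the token count is approximated with whitespace-separated words, whereas a real node would use the model's tokenizer.

```python
def trim_history(messages, max_messages=10, max_tokens=8000):
    """Keep the most recent messages within the message and token caps.

    `messages` is a list of {"role": ..., "content": ...} dicts,
    oldest first. Token counts are approximated as word counts here.
    """
    recent = messages[-max_messages:]
    kept, total = [], 0
    # Walk backwards so the newest messages are preferred under the cap.
    for msg in reversed(recent):
        tokens = len(msg["content"].split())
        if total + tokens > max_tokens:
            break
        kept.append(msg)
        total += tokens
    return list(reversed(kept))
```

The key design choice is dropping the oldest messages first, so the context the model sees is always the most recent slice of the conversation.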
4. Prompt Construction
Prompt Template Node: Constructs a dynamic prompt for the language model. It integrates:
- The user’s latest input.
- The recent chat history.
- A fixed system message that instructs the AI to generate context-aware answers.
The prompt template used is:
You are an AI language model assistant.
Your task is to generate answer for human INPUT with consideration of previous conversation in CHAT HISTORY.
--- CHAT HISTORY START
{chat_history}
--- CHAT HISTORY END
--- INPUT START
{input}
--- INPUT END
ANSWER:
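Filling this template is a straightforward string substitution. The sketch below shows one plausible way to do it; the helper name `build_prompt` and the "role: content" rendering of history messages are illustrative assumptions, not the workflow's actual internals.

```python
PROMPT_TEMPLATE = """You are an AI language model assistant.
Your task is to generate answer for human INPUT with consideration of previous conversation in CHAT HISTORY.
--- CHAT HISTORY START
{chat_history}
--- CHAT HISTORY END
--- INPUT START
{input}
--- INPUT END
ANSWER:"""


def build_prompt(chat_history, user_input):
    # Render each stored message as a "role: content" line (an assumed format).
    history_text = "\n".join(f"{m['role']}: {m['content']}" for m in chat_history)
    return PROMPT_TEMPLATE.format(chat_history=history_text, input=user_input)
```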
5. AI Generation
- Generator Node: Receives the constructed prompt and generates a text response using a large language model (LLM), so the answer reflects both the latest request and the preceding conversation.
6. Output Display
- Chat Output Node: The AI-generated answer is displayed to the user in the chat interface.
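Putting steps 2 through 6 together, one chat turn can be sketched as below. Everything here is illustrative: `generate` is a stub standing in for the Generator node's LLM call, and `handle_turn` is a hypothetical name for the per-turn logic.

```python
def generate(prompt):
    # Stand-in for the Generator node; a real deployment would call an LLM API here.
    return f"(model response to a {len(prompt)}-character prompt)"


def handle_turn(session_history, user_input):
    """One chat turn: trim history, build the prompt, generate, record the exchange."""
    # Keep only the 10 most recent messages, mirroring the Chat History node.
    history_text = "\n".join(
        f"{m['role']}: {m['content']}" for m in session_history[-10:]
    )
    prompt = (
        "--- CHAT HISTORY START\n" + history_text + "\n--- CHAT HISTORY END\n"
        "--- INPUT START\n" + user_input + "\n--- INPUT END\nANSWER:"
    )
    answer = generate(prompt)
    # Append both sides of the turn so the next turn sees them as history.
    session_history.append({"role": "user", "content": user_input})
    session_history.append({"role": "assistant", "content": answer})
    return answer
```

Because each session carries its own `session_history` list, concurrent users naturally keep separate contexts, which is what makes the design scale.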
Workflow Structure Table
| Step | Node/Component | Purpose |
|---|---|---|
| Chat Start | ChatOpenedTrigger | Detects when the chat is opened |
| Welcome Message | MessageWidget | Greets and informs the user |
| Display Welcome | ChatOutput | Shows the welcome message |
| User Input | ChatInput | Captures user's task or question |
| Retrieve History | ChatHistory | Fetches recent conversation for context |
| Prompt Construction | PromptTemplate | Builds prompt for the LLM with input and chat history |
| AI Generation | Generator | Produces context-aware response using the prompt |
| Display AI Output | ChatOutput | Shows the AI-generated answer to the user |
Why This Workflow is Useful for Scaling and Automation
- Contextual Interactions: By incorporating chat history, the system maintains context, improving response relevance and user satisfaction.
- User-Defined Tasks: The workflow is task-agnostic, allowing users to define their own objectives, making it highly flexible.
- Scalable Automation: The modular design is suitable for scaling—multiple users can interact simultaneously, with each session maintaining its own context.
- Easy Customization: The prompt template and nodes can be easily adapted for specific use-cases (e.g., support, information retrieval, onboarding).
- Consistent User Experience: Automated greeting and context-aware responses ensure that every user interaction is handled professionally and efficiently.
Example Use Cases
- Customer support chatbots that remember previous interactions.
- Onboarding assistants that guide new users based on their ongoing conversation.
- General-purpose AI helpers in apps where users can define their own queries or tasks.
This workflow provides a robust foundation for building intelligent, context-aware chat automations that can be tailored to many different applications.