Conversation Flow
A Conversation Flow outlines how a conversation progresses from one point to another. It is typically organized in a tree-like structure, with nodes representing the different states or steps in the conversation.
Main elements in a conversation flow:
- Nodes
- Connections
1. Nodes
Nodes represent individual states or steps within the conversation. Each node typically corresponds to a specific action, prompt or response.
2. Connections
Connections specify the pathways through which the conversation advances, allowing for branching, looping or other forms of interaction.
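Conceptually, a flow is just a directed graph. The sketch below is a minimal illustration in plain Python, not the platform's actual API; the Node class and connect helper are hypothetical, but they show how nodes (states) and connections (pathways) fit together, including branching.

```python
# A minimal sketch of a conversation flow as a directed graph.
# Illustrative plain Python only, not the platform's API:
# the Node class and connect() helper are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Node:
    """One state or step in the conversation."""
    name: str
    outputs: list["Node"] = field(default_factory=list)  # outgoing connections


def connect(source: Node, target: Node) -> None:
    """Add a connection so the conversation can advance from source to target."""
    source.outputs.append(target)


# Branching: a single node can connect to more than one downstream node.
greet = Node("Greet")
faq = Node("Answer FAQ")
handoff = Node("Human handoff")
connect(greet, faq)
connect(greet, handoff)
```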
Now that you’ve created a BotStack and added your data to the Brain Vault, let’s build a simple assistant using nodes and connections.
In the Sequence Studio, you will find the Node Stack on the left side. This section contains all the nodes available for building your AI Assistant. Each node has its own specific purpose, and you can learn more about them by clicking here.
For now, we will only focus on four nodes:
- Start Node
- LLM Node
- Response AI Node
- Listen Node
1. Start Node
The Start Node represents the beginning of the conversation. It is triggered by a mention or the initial message in a DM. The input provided to start the conversation is immediately passed to all of the Start Node’s outputs.
2. LLM Node
The LLM Node uses the Large Language Model of your choice, such as OpenAI ChatGPT, Anthropic Claude, or Google Gemini, to handle conversational dialogue.
3. Response AI Node
The Response AI Node takes input from the other nodes in the workflow and sends it to the user in the chat.
4. Listen Node
The Listen Node waits for user input within the chat. Once a message is received, it is immediately passed on to all of the Listen Node’s outputs.
Here, we have added the four nodes mentioned above and connected them in sequence.
The conversation flow begins with the Start Node, transitions to the LLM Node, continues to the Response AI Node, and finally moves to the Listen Node.
Connecting the Listen Node back to the LLM Node forms a loop, ensuring that the assistant doesn’t end the conversation but instead continues the flow indefinitely.
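To make the loop concrete, here is a minimal, self-contained sketch in plain Python (again, not the platform's API; fake_llm and run_flow are stand-ins) that walks a message from the Start step through the LLM, Response AI, and Listen steps, then back to the LLM step:

```python
# A self-contained sketch of the loop described above (not the platform's API):
# Start -> LLM -> Response AI -> Listen, with Listen feeding back into LLM.
def fake_llm(message: str) -> str:
    # Stand-in for the Large Language Model configured on the LLM Node.
    return f"(model reply to: {message!r})"


def run_flow(first_message: str, turns: int = 3) -> None:
    message = first_message          # Start Node: the mention or DM that opens the chat
    for _ in range(turns):           # the Listen -> LLM connection keeps the loop going
        reply = fake_llm(message)    # LLM Node: generate a reply
        print(reply)                 # Response AI Node: send the reply to the user
        message = input("You: ")     # Listen Node: wait for the next user message


if __name__ == "__main__":
    run_flow("Hello, assistant!")
```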
You can copy the settings for the LLM Node from the image above. If you want to learn more about how the LLM Node works, click here.
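If the image isn’t available to you, note that an LLM Node’s settings typically include at least a model, a temperature, and a system prompt. The values below are illustrative placeholders only, not the exact settings shown in the screenshot:

```python
# Illustrative placeholder settings for an LLM Node.
# These are assumptions for the example, not the screenshot's exact values.
llm_node_settings = {
    "model": "gpt-4o",            # whichever provider/model you selected
    "temperature": 0.7,           # higher values produce more varied replies
    "system_prompt": "You are a helpful assistant for our product.",
}
```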
Now you can proceed to the next chapter, which covers how to test the assistant you just built.