Handoff
The Handoff operator acts as an intelligent router for conversations. It uses a configured LLM to analyze the current conversation and decide which specialized Agent operator is best suited to handle the next step. When Enable Handoff is off, it instead passes the conversation directly to a manually selected agent.
Overview
The Handoff operator sits between your input conversation and a pool of Agent operators. On the Handoff page, you define a sequence of agents, each with an optional rename and system prompt inclusion toggle. When triggered, the operator either uses an LLM to make a routing decision (when Enable Handoff is on) or sends the conversation to whichever agent is selected in Current Agent (when Enable Handoff is off).
The routing LLM uses a route_conversation tool call to pick the best agent. Its decision includes whether to continue with the current agent or hand off to a different one, along with a reason displayed in the Reason field.
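The exact tool schema is internal to the operator, but a function-calling tool of this kind typically looks like the sketch below, written in the common OpenAI-style tool format. The field names (`agent`, `continue_current`, `reason`) are illustrative assumptions, not the operator's documented schema:

```python
# Hypothetical sketch of a route_conversation tool definition as it
# might be presented to a function-calling LLM. Field names are
# assumptions for illustration only.
route_conversation_tool = {
    "type": "function",
    "function": {
        "name": "route_conversation",
        "description": "Choose which agent should handle the next turn.",
        "parameters": {
            "type": "object",
            "properties": {
                "agent": {
                    "type": "string",
                    "description": "Name of the agent to hand off to.",
                },
                "continue_current": {
                    "type": "boolean",
                    "description": "True to stay with the current agent.",
                },
                "reason": {
                    "type": "string",
                    "description": "Short rationale, surfaced in the Reason field.",
                },
            },
            "required": ["agent", "continue_current", "reason"],
        },
    },
}
```

Because the decision arrives as structured tool-call arguments rather than free text, the operator can act on it deterministically and log the `reason` alongside each handoff.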
Key Features
- LLM-powered routing that evaluates agent descriptions and system prompts to pick the best specialist
- Manual agent selection mode for deterministic routing without LLM overhead
- Handoff history tracking with from/to agent, reason, and timestamp
- Switchable display between the conversation and the handoff history table
- Agent sequence with per-agent rename and system prompt inclusion controls
Input/Output
Inputs
- Input 1: A conversation table DAT with columns role, message, id, timestamp. Typically the output of a Chat or similar operator.
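A minimal sketch of building rows with those four columns outside TouchDesigner, for testing or prototyping. The id and timestamp formats shown are assumptions; any unique id and consistent timestamp format should work:

```python
import time
import uuid

def make_row(role, message):
    """Build one conversation row with the columns the Handoff
    operator expects: role, message, id, timestamp."""
    return {
        "role": role,
        "message": message,
        "id": uuid.uuid4().hex,  # any unique id would do
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
    }

conversation = [
    make_row("user", "My render keeps crashing on export."),
    make_row("assistant", "Let me route you to the right specialist."),
]
```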
Outputs
- Output 1: The conversation table, which mirrors the input conversation and appends agent responses as they arrive.
Usage Examples
LLM-Driven Routing
- Add two or more Agent operators to your network, each configured with a different system prompt reflecting their specialty (e.g., one for creative writing, one for technical support).
- On the Handoff page, add blocks to the Agents sequence and link each OP parameter to one of your Agent operators.
- Give each agent a clear name in the Rename field (e.g., “Creative Writer”, “Support Bot”). Toggle Include System to On so the routing LLM can read each agent’s system prompt.
- Connect your input conversation DAT to the Handoff operator’s first input.
- Toggle Enable Handoff to On.
- On the Model page, select an LLM that supports function calling (such as GPT-4, Claude 3, or Gemini).
- Pulse Call Agents. The Handoff operator analyzes the conversation and routes it to the most appropriate agent.
- Check the Reason field to see the LLM’s rationale for its routing decision.
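To illustrate what happens with the model's reply, here is a pure-Python sketch of turning a route_conversation tool call's JSON arguments into a routing decision. The argument names (`agent`, `continue_current`, `reason`) are assumptions matching the sketch schema above, not the operator's documented internals:

```python
import json

def parse_routing_decision(tool_call_arguments):
    """Turn the JSON arguments of a route_conversation tool call into
    a (target_agent, reason) pair. target_agent is None when the model
    chose to continue with the current agent. Field names are
    illustrative assumptions."""
    args = json.loads(tool_call_arguments)
    if args.get("continue_current"):
        return None, args.get("reason", "")
    return args["agent"], args.get("reason", "")

agent, reason = parse_routing_decision(
    '{"agent": "Support Bot", "continue_current": false, '
    '"reason": "User reports a technical problem."}'
)
```

In this example the decision resolves to the "Support Bot" agent, with the rationale string available for display in a field like Reason.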
Manual Agent Selection
- Configure the Agents sequence as above.
- Toggle Enable Handoff to Off.
- Select the desired target agent from the Current Agent menu.
- Pulse Call Agents. The conversation is sent directly to the selected agent without any LLM evaluation.
Viewing Handoff History
- On the Handoff page, set Display to “handoff_history”.
- The operator’s viewer switches to the handoff history table, showing a log of all routing decisions with source agent, target agent, reason, and timestamp.
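Assuming the history table follows the fields described above (source agent, target agent, reason, timestamp; the exact header names in the DAT are assumptions), its rows can be read like any header-plus-rows table:

```python
# Sketch of summarizing rows from a handoff-history-style table.
# The header names below are illustrative assumptions.
history = [
    ["from_agent", "to_agent", "reason", "timestamp"],
    ["Creative Writer", "Support Bot",
     "User asked a technical question", "2024-11-10 12:00:05"],
    ["Support Bot", "Creative Writer",
     "User wants a poem", "2024-11-10 12:03:41"],
]

header, *rows = history
records = [dict(zip(header, row)) for row in rows]
summary = [
    f"{r['from_agent']} -> {r['to_agent']}: {r['reason']}"
    for r in records
]
```

This kind of summary makes it easy to audit why the router moved the conversation between specialists over a session.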
Best Practices
- Use descriptive Rename values and enable Include System for each agent in the sequence. The routing LLM relies on these descriptions to make informed decisions — vague names lead to poor routing.
- Choose a capable model for the routing decision. Smaller models may not reliably produce the required route_conversation tool call.
- For deterministic workflows where the target agent is always known, disable Enable Handoff and select the agent manually. This avoids the extra LLM call and associated latency.
Troubleshooting
- No routing decision: If the Reason field stays empty after pulsing Call Agents, the routing LLM may not support function calling. Switch to a model that does (GPT-4, Claude 3, Gemini).
- Wrong agent selected: Improve the Rename labels and system prompts for each agent. The routing LLM can only distinguish agents based on the information it receives.
- Agent response not appearing: Ensure each Agent operator in the sequence is properly configured and can generate responses on its own. The Handoff operator manages routing but relies on the individual agents to produce the actual output.
Parameters
Handoff
op('handoff').par.Call (Pulse) - Default: False
op('handoff').par.Enablehandoff (Toggle) - Default: False
op('handoff').par.Active (Toggle) - Default: False
op('handoff').par.Status (Str) - Default: "" (Empty String)
op('handoff').par.Reason (Str) - Default: "" (Empty String)
op('handoff').par.Agents (Sequence) - Default: 0
op('handoff').par.Agents0op (OP) - Default: "" (Empty String)
op('handoff').par.Agents0rename (Str) - Default: "" (Empty String)
op('handoff').par.Agents0includesystem (Toggle) - Default: False
op('handoff').par.Maxtokens (Int) - Default: 2048 - Range: 0 to 1 - Slider Range: 16 to 4096
op('handoff').par.Temperature (Float) - Default: 0.7 - Range: 0 to 1 - Slider Range: 0 to 1
op('handoff').par.Modelcontroller (OP) - Default: "" (Empty String)
op('handoff').par.Search (Toggle) - Default: False
op('handoff').par.Modelsearch (Str) - Default: "" (Empty String)
op('handoff').par.Showmodelinfo (Toggle) - Default: False
Provider Model Documentation
Consult the documentation for your chosen provider to find supported models, API key information, and usage limits.
View LiteLLM Supported Providers →
Changelog
v1.0.0 (2024-11-10)
Initial release