Chat Operator
Overview
The Chat LOP manages multi-turn conversations with AI models directly from the TouchDesigner parameter panel. It provides a dynamic message sequence where you define roles and content for each message, making it ideal for few-shot prompting, conversation prototyping, and building example dialogues that downstream operators like Agent can use as context.
Input/Output
Inputs
Section titled “Inputs”- Conversation Table (DAT, optional): A table with
role,message,id, andtimestampcolumns. Combined with the message sequence based on the Input Handling setting.
Outputs
- conversation_dat: A table DAT containing the full conversation state with role, message, id, and timestamp columns.
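For reference, a sketch of rows matching this four-column schema (all values here are hypothetical examples, not output from a real session):

```python
# Hypothetical rows matching the conversation_dat schema
header = ["role", "message", "id", "timestamp"]
rows = [
    ["user", "Translate 'hello' to French.", "msg_0", "2025-07-30 12:00:00"],
    ["assistant", '{"translation": "bonjour"}', "msg_1", "2025-07-30 12:00:02"],
]
# Every row carries all four columns
assert all(len(row) == len(header) for row in rows)
```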
Usage Examples
Few-Shot Prompting
The Chat LOP excels at creating few-shot examples that teach an AI a specific response format.
- On the Messages page, click the + button on the Message sequence to add message blocks.
- Set up alternating user/assistant pairs as examples:
  - Block 0: Role = user, Text = Translate 'hello' to French.
  - Block 1: Role = assistant, Text = {"translation": "bonjour"}
  - Block 2: Role = user, Text = Translate 'goodbye' to Spanish.
  - Block 3: Role = assistant, Text = {"translation": "adios"}
- Wire the output of this Chat LOP into the first input of an Agent LOP.
When the Agent receives a new prompt, it uses these examples as context and follows the established format.
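Outside TouchDesigner, the same block layout maps onto an alternating role/content message list. A minimal sketch of the structure (the role/content dict layout is an assumed wire format, not taken from the LOP's internals):

```python
# Few-shot examples as alternating user/assistant pairs
few_shot = [
    {"role": "user", "content": "Translate 'hello' to French."},
    {"role": "assistant", "content": '{"translation": "bonjour"}'},
    {"role": "user", "content": "Translate 'goodbye' to Spanish."},
    {"role": "assistant", "content": '{"translation": "adios"}'},
]

# A new prompt is then answered in the format the examples establish
new_prompt = {"role": "user", "content": "Translate 'thanks' to German."}
messages = few_shot + [new_prompt]

assert [m["role"] for m in few_shot] == ["user", "assistant"] * 2
```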
Generating a Response with Call Assistant
- Add one or more message blocks on the Messages page, ending with a user role message.
- Pulse Call Assistant.
- The AI model responds and a new assistant message block is automatically appended to the sequence.
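Conceptually, Call Assistant appends the model's reply as a new assistant block. A minimal sketch of that append step (the reply text is a hypothetical placeholder, not a real model response):

```python
def append_assistant_reply(conversation, reply_text):
    """Mimic Call Assistant's effect: add the reply as a new assistant message."""
    conversation.append({"role": "assistant", "content": reply_text})
    return conversation

# The sequence should end with a user message before calling the assistant
conversation = [{"role": "user", "content": "Name one TouchDesigner operator family."}]
append_assistant_reply(conversation, "LOP")  # hypothetical reply text

assert conversation[-1]["role"] == "assistant"
```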
Simulating Both Sides with Call User
Call User reverses all roles before sending to the API: the model sees user messages as assistant and vice versa. This lets the AI generate the “user” side of a conversation.
- Set up your conversation with existing assistant messages.
- On the Conversation page, enable Use User Prompt and enter a prompt that describes what kind of user responses to generate.
- Pulse Call User on the Messages page.
- A new user message block is appended with the generated response.
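The role flip described above can be sketched in plain Python (an illustration of the documented behavior, not the operator's internal code):

```python
def reverse_roles(messages):
    """Flip user <-> assistant; any other role (e.g. system) is left unchanged."""
    flip = {"user": "assistant", "assistant": "user"}
    return [{**m, "role": flip.get(m["role"], m["role"])} for m in messages]

convo = [
    {"role": "assistant", "content": "How can I help you today?"},
    {"role": "user", "content": "Tell me about noise TOPs."},
]
# After the flip, the model is effectively asked to continue the "user" side
flipped = reverse_roles(convo)
assert [m["role"] for m in flipped] == ["user", "assistant"]
```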
Loading a Conversation from a DAT
- Create a Table DAT with role and message columns (plus optional id and timestamp).
- Wire it into the Chat LOP’s input.
- Pulse Load from Input on the Conversation page to populate the message sequence from the table.
This sets Input Handling to none so the loaded messages replace any input merging.
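The table-to-sequence mapping can be sketched as follows (plain Python, assuming rows are lists with a header row first, as in a Table DAT):

```python
def load_from_table(rows):
    """Map role/message table rows (header row first) onto a message sequence."""
    header, *data = rows
    role_i = header.index("role")
    msg_i = header.index("message")
    return [{"role": row[role_i], "content": row[msg_i]} for row in data]

table = [
    ["role", "message"],
    ["user", "Hello"],
    ["assistant", "Hi there!"],
]
messages = load_from_table(table)
assert messages[0] == {"role": "user", "content": "Hello"}
```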
Combining Input with Message Sequence
The Input Handling menu on the Messages page controls how wired input data merges with the message sequence:
- prepend — Input messages appear before the sequence messages
- append — Input messages appear after the sequence messages
- index — Insert at a specific position (set via Insert Index)
- none — Ignore input entirely, use only the message sequence
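These four modes can be sketched as a single merge function (an illustration of the documented behavior, not the operator's internal implementation):

```python
def merge_input(input_msgs, sequence_msgs, mode, insert_index=0):
    """Combine wired input messages with the message sequence per Input Handling mode."""
    if mode == "prepend":
        return input_msgs + sequence_msgs
    if mode == "append":
        return sequence_msgs + input_msgs
    if mode == "index":
        return (sequence_msgs[:insert_index] + input_msgs
                + sequence_msgs[insert_index:])
    return list(sequence_msgs)  # "none": input ignored entirely

inp, seq = ["in_a", "in_b"], ["seq_a", "seq_b"]
assert merge_input(inp, seq, "prepend") == ["in_a", "in_b", "seq_a", "seq_b"]
assert merge_input(inp, seq, "index", insert_index=1) == ["seq_a", "in_a", "in_b", "seq_b"]
```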
System Message
On the Conversation page, enable Use System Message and enter instructions in the System Message field. This is prepended to the conversation when calling the assistant.
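The prepend step can be sketched like this (illustrative only):

```python
def with_system_message(messages, system_text, use_system=True):
    """Prepend a system message when enabled, mirroring Use System Message."""
    if not use_system:
        return list(messages)
    return [{"role": "system", "content": system_text}] + list(messages)

convo = [{"role": "user", "content": "Summarize this network."}]
final = with_system_message(convo, "Reply in one sentence.")
assert [m["role"] for m in final] == ["system", "user"]
```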
Best Practices
- Use the Chat LOP for static or semi-static conversation templates. For dynamic single-message injection, use the Add Message LOP instead.
- Wire multiple Chat LOPs in series to build layered conversation contexts for an Agent.
- Use Conversation ID on the Conversation page to tag conversations for tracking across your network.
- Pulse Clear Conversation to reset the sequence to a single empty user message.
Parameters
Messages
op('chat').par.Active Toggle - Default: False
op('chat').par.Callassistant Pulse - Default: False
op('chat').par.Calluser Pulse - Default: False
op('chat').par.Insertindex Int - Default: 0 - Range: 0 to 1 - Slider Range: 0 to 1
op('chat').par.Message Sequence - Default: 0
op('chat').par.Message0text Str - Default: "" (Empty String)
op('chat').par.Maxtokens Int - Default: 256 - Range: 0 to 1 - Slider Range: 16 to 4096
op('chat').par.Temperature Float - Default: 0.0 - Range: 0 to 1 - Slider Range: 0 to 1
op('chat').par.Modelcontroller OP - Default: "" (Empty String)
op('chat').par.Search Toggle - Default: False
op('chat').par.Modelsearch Str - Default: "" (Empty String)
Provider Model Documentation
Consult the documentation for your chosen provider to find supported models, API key information, and usage limits.
View LiteLLM Supported Providers →
Conversation
op('chat').par.Usesystemmessage Toggle - Default: False
op('chat').par.Systemmessage Str - Default: "" (Empty String)
op('chat').par.Useuserprompt Toggle - Default: False
op('chat').par.Userprompt Str - Default: "" (Empty String)
op('chat').par.Clearconversation Pulse - Default: False
op('chat').par.Loadfrominput Pulse - Default: False
op('chat').par.Conversationid Str - Default: "" (Empty String)
Callbacks
op('chat').par.Callbackdat DAT - Default: ChatTD_callbacks
op('chat').par.Editcallbacksscript Pulse - Default: False
op('chat').par.Createpulse Pulse - Default: False
op('chat').par.Ontaskstart Toggle - Default: False
op('chat').par.Ontaskcomplete Toggle - Default: False
Callbacks
onTaskStart / onTaskComplete
def onTaskStart(info):
    # Called when a Call Assistant or Call User request begins
    # info contains: op, callType
    pass

def onTaskComplete(info):
    # Called when the AI response is received and added to the conversation
    # info contains: op, result, conversationID
    pass

Changelog
v2.0.0 (2025-07-30)
- upgraded model page for release 2.0.0
v1.0.0 (2024-11-06)
- Initial release