subagent
Defines a nested LLM agent that a deepAgent can invoke via its task() tool. The node is not executed directly by the runner; instead, DeepAgentCompiler reads its configuration from the graph and registers it so the main agent can delegate work to it.
Tools are attached to a subagent via tool edges from tool nodes. The subagent connects to the main deepAgent via an agent edge.
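For illustration, a tool edge attaching a tool node to this subagent might look like the following. This is a sketch assuming React Flow-style edge objects; the edge field names, node ids (vectorSearchTool, searchSubagent), and handle names are illustrative, not part of a confirmed schema:

```json
{
  "id": "edge-tool-1",
  "source": "vectorSearchTool",
  "sourceHandle": "tool",
  "target": "searchSubagent",
  "targetHandle": "tools"
}
```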
note
Calling process() on this node raises NotImplementedError. It exists purely for graph wiring and configuration.
Parameters
| Param | Type | Required | Default | Description |
|---|---|---|---|---|
| llm_model | string | Yes | — | LLM model identifier for this subagent (e.g. gpt-4o-mini) |
| llm_provider | string | No | "openai" | LLM provider name |
| system_prompt | string | Yes | — | System prompt for this subagent |
| description | string | No | "Subagent for specialized tasks" | Description shown to the main agent when deciding whether to invoke this subagent |
Output
Output depends on what the subagent's LLM produces. It is returned to the main agent as the response to the task() tool call.
Example
{
  "id": "searchSubagent",
  "type": "subagent",
  "data": {
    "label": "Search Subagent",
    "isExecuted": false,
    "handles": ["agent"],
    "schema": {},
    "params": {
      "llm_model": { "value": "gpt-4o-mini", "isExpression": false, "isAttachedToInputNode": false },
      "system_prompt": { "value": "You are a vector search specialist. Search the knowledge base and return relevant passages.", "isExpression": false, "isAttachedToInputNode": false },
      "description": { "value": "Searches the vector database for passages relevant to a query", "isExpression": false, "isAttachedToInputNode": false }
    },
    "inputs": [],
    "outputs": [],
    "errors": []
  },
  "position": { "x": 300, "y": 200 },
  "isSelected": false,
  "isDragging": false
}
Connect this node to a deepAgent with an agents → agent edge.
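As a sketch, such an agents → agent edge could be expressed as the following, assuming React Flow-style edge objects, a main deepAgent node with id mainAgent, and the subagent from the example above. The edge field names and the mainAgent id are illustrative assumptions; only the agents and agent handle names come from this document:

```json
{
  "id": "edge-agent-1",
  "source": "mainAgent",
  "sourceHandle": "agents",
  "target": "searchSubagent",
  "targetHandle": "agent"
}
```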