# Streaming
Stream agent execution events in real-time with the streaming API.
## Basic Streaming
```typescript
for await (const event of agent.stream({ task: 'Write a summary' })) {
  switch (event.type) {
    case 'run_start':
      console.log(`Run started: ${event.runId}`);
      break;
    case 'llm_token':
      process.stdout.write(event.token);
      break;
    case 'tool_start':
      console.log(`\nCalling: ${event.toolName}`);
      break;
    case 'tool_end':
      console.log(`Result: ${event.result}`);
      break;
    case 'done':
      console.log(`\nCompleted: ${event.result.status}`);
      break;
    case 'error':
      console.error(`Error: ${event.error}`);
      break;
  }
}
```

```python
async for event in agent.stream(task="Write a summary"):
    match event.type:
        case "run_start":
            print(f"Run started: {event.run_id}")
        case "llm_token":
            print(event.token, end="")
        case "tool_start":
            print(f"\nCalling: {event.tool_name}")
        case "tool_end":
            print(f"Result: {event.result}")
        case "done":
            print(f"\nCompleted: {event.result.status}")
        case "error":
            print(f"Error: {event.error}")
```

## AgentStreamEvent
The stream yields a union of event types:
| Event | Fields | When |
|---|---|---|
| `run_start` | `runId` | Run begins |
| `step_start` | `stepIndex` | New step begins |
| `llm_token` | `token` | LLM produces a token |
| `llm_stream` | `event: ChatStreamEvent` | Raw LLM stream event |
| `tool_start` | `toolName`, `input` | Tool execution begins |
| `tool_end` | `toolName`, `result`, `duration` | Tool execution completes |
| `step_end` | `stepIndex`, `step: StepRecord` | Step completes |
| `done` | `result: RunResult` | Run completes |
| `error` | `error: string` | Error occurred |
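Based on the table above, the event union can be sketched as a discriminated union on the `type` field. This is an illustrative sketch, not the library's actual declarations; the `unknown` placeholders stand in for `ChatStreamEvent`, `StepRecord`, and the tool payload types, and `RunResult` is reduced to its `status` field:

```typescript
// Sketch of the event union implied by the table above; field names follow
// the table, but the real type declarations live in the library.
type AgentStreamEvent =
  | { type: 'run_start'; runId: string }
  | { type: 'step_start'; stepIndex: number }
  | { type: 'llm_token'; token: string }
  | { type: 'llm_stream'; event: unknown } // ChatStreamEvent in the real API
  | { type: 'tool_start'; toolName: string; input: unknown }
  | { type: 'tool_end'; toolName: string; result: unknown; duration: number }
  | { type: 'step_end'; stepIndex: number; step: unknown } // StepRecord
  | { type: 'done'; result: { status: string } } // RunResult
  | { type: 'error'; error: string };

// The `type` discriminant narrows each case, so event-specific fields
// are fully typed inside the switch:
function describe(event: AgentStreamEvent): string {
  switch (event.type) {
    case 'llm_token':
      return `token: ${event.token}`;
    case 'error':
      return `error: ${event.error}`;
    default:
      return event.type;
  }
}
```

Because the union is discriminated, a `switch` over `event.type` (as in the examples above) gives you exhaustive, type-safe handling of every event.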
## LLM Streaming Requirement
Streaming requires the LLM to implement the `StreamingLLM` interface. All built-in providers (Anthropic, OpenAI, Gemini, Ollama) support streaming.
```typescript
import { createLLM, isStreamingLLM } from '@ahzan-agentforge/core';

const llm = createLLM({ provider: 'anthropic', model: 'claude-sonnet-4-20250514' });
console.log(isStreamingLLM(llm)); // true
```

If the LLM doesn't support streaming, `agent.stream()` throws an error.
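A capability check like `isStreamingLLM` is typically a duck-type test for the streaming method. Here is a self-contained sketch of the pattern; the interface shapes and the `chat`/`chatStream` method names are illustrative assumptions, not the library's actual API:

```typescript
// Illustrative stand-ins: the real LLM/StreamingLLM types and the
// isStreamingLLM guard come from @ahzan-agentforge/core.
interface LLM {
  chat(prompt: string): Promise<string>;
}

interface StreamingLLM extends LLM {
  chatStream(prompt: string): AsyncIterable<string>;
}

// A type guard can detect streaming support by checking for the
// streaming method at runtime:
function supportsStreaming(llm: LLM): llm is StreamingLLM {
  return typeof (llm as Partial<StreamingLLM>).chatStream === 'function';
}
```

With a guard like this, calling code can branch to a non-streaming code path when streaming is unavailable instead of letting `agent.stream()` throw.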
## Collecting Stream Results
To both stream and collect the final result:
```typescript
let finalResult;
for await (const event of agent.stream({ task: 'Do something' })) {
  if (event.type === 'llm_token') {
    process.stdout.write(event.token);
  }
  if (event.type === 'done') {
    finalResult = event.result;
  }
}
console.log(finalResult.status); // "completed"
```

## Next Steps
- Hooks — lifecycle callbacks
- LLM Streaming — LLM-level streaming details