AgentForge

Streaming

Stream agent execution events in real time with the streaming API.

Basic Streaming

```typescript
for await (const event of agent.stream({ task: 'Write a summary' })) {
  switch (event.type) {
    case 'run_start':
      console.log(`Run started: ${event.runId}`);
      break;
    case 'llm_token':
      process.stdout.write(event.token);
      break;
    case 'tool_start':
      console.log(`\nCalling: ${event.toolName}`);
      break;
    case 'tool_end':
      console.log(`Result: ${event.result}`);
      break;
    case 'done':
      console.log(`\nCompleted: ${event.result.status}`);
      break;
    case 'error':
      console.error(`Error: ${event.error}`);
      break;
  }
}
```
```python
async for event in agent.stream(task="Write a summary"):
    match event.type:
        case "run_start":
            print(f"Run started: {event.run_id}")
        case "llm_token":
            print(event.token, end="")
        case "tool_start":
            print(f"\nCalling: {event.tool_name}")
        case "tool_end":
            print(f"Result: {event.result}")
        case "done":
            print(f"\nCompleted: {event.result.status}")
        case "error":
            print(f"Error: {event.error}")
```

AgentStreamEvent

The stream yields a union of event types:

| Event | Fields | When |
| --- | --- | --- |
| `run_start` | `runId` | Run begins |
| `step_start` | `stepIndex` | New step begins |
| `llm_token` | `token` | LLM produces a token |
| `llm_stream` | `event: ChatStreamEvent` | Raw LLM stream event |
| `tool_start` | `toolName`, `input` | Tool execution begins |
| `tool_end` | `toolName`, `result`, `duration` | Tool execution completes |
| `step_end` | `stepIndex`, `step: StepRecord` | Step completes |
| `done` | `result: RunResult` | Run completes |
| `error` | `error: string` | Error occurred |
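The table above corresponds to a discriminated union on the `type` field; narrowing on `type` gives typed access to each variant's fields. The following is a minimal sketch of such a union based on the table, not the library's actual type definition (the `unknown` payload types are placeholders):

```typescript
// Sketch of the event union from the table above; field names follow the
// table, payload types are simplified to `unknown` where the docs reference
// library types (ChatStreamEvent, StepRecord, RunResult).
type AgentStreamEvent =
  | { type: 'run_start'; runId: string }
  | { type: 'step_start'; stepIndex: number }
  | { type: 'llm_token'; token: string }
  | { type: 'llm_stream'; event: unknown }
  | { type: 'tool_start'; toolName: string; input: unknown }
  | { type: 'tool_end'; toolName: string; result: unknown; duration: number }
  | { type: 'step_end'; stepIndex: number; step: unknown }
  | { type: 'done'; result: { status: string } }
  | { type: 'error'; error: string };

// Switching on `type` narrows the event, so variant-specific fields
// are available without casts.
function describe(event: AgentStreamEvent): string {
  switch (event.type) {
    case 'llm_token':
      return `token: ${event.token}`;
    case 'tool_start':
      return `tool: ${event.toolName}`;
    default:
      return event.type;
  }
}

const sample: AgentStreamEvent = { type: 'llm_token', token: 'Hello' };
console.log(describe(sample)); // "token: Hello"
```

Because every variant shares the `type` discriminant, the compiler will flag any `case` label that does not exist in the union.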

LLM Streaming Requirement

Streaming requires the LLM to implement the StreamingLLM interface. All built-in providers (Anthropic, OpenAI, Gemini, Ollama) support streaming.

```typescript
import { isStreamingLLM } from '@ahzan-agentforge/core';

const llm = createLLM({ provider: 'anthropic', model: 'claude-sonnet-4-20250514' });
console.log(isStreamingLLM(llm)); // true
```

If the LLM doesn't support streaming, agent.stream() will throw an error.
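A guard like `isStreamingLLM` is typically a structural check: the LLM qualifies if it exposes a streaming method. The sketch below illustrates that pattern with hypothetical `LLM`/`StreamingLLM` shapes; it is an assumption about how such a guard might work, not AgentForge's actual implementation:

```typescript
// Hypothetical interfaces standing in for the library's LLM types.
interface LLM {
  complete(prompt: string): Promise<string>;
}
interface StreamingLLM extends LLM {
  stream(prompt: string): AsyncIterable<string>;
}

// A user-defined type guard: the LLM supports streaming if it has a
// `stream` method. Narrows the type on the true branch.
function supportsStreaming(llm: LLM): llm is StreamingLLM {
  return typeof (llm as Partial<StreamingLLM>).stream === 'function';
}

const blockingOnly: LLM = {
  complete: async (p) => p.toUpperCase(),
};

const streaming: StreamingLLM = {
  complete: async (p) => p,
  async *stream(p) {
    yield p;
  },
};

console.log(supportsStreaming(blockingOnly)); // false
console.log(supportsStreaming(streaming)); // true
```

Checking the guard before calling `agent.stream()` lets you fall back to a blocking call instead of handling the thrown error.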

Collecting Stream Results

To both stream and collect the final result:

```typescript
let finalResult;

for await (const event of agent.stream({ task: 'Do something' })) {
  if (event.type === 'llm_token') {
    process.stdout.write(event.token);
  }
  if (event.type === 'done') {
    finalResult = event.result;
  }
}

console.log(finalResult.status); // "completed"
```
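If you do this in more than one place, the pattern can be factored into a helper that forwards tokens to a callback and resolves with the final result. The sketch below is self-contained: the mock async generator stands in for `agent.stream()`, and the simplified event and result shapes are assumptions for illustration:

```typescript
// Simplified event union for this sketch (the real stream has more variants).
type StreamEvent =
  | { type: 'llm_token'; token: string }
  | { type: 'done'; result: { status: string; output: string } };

// Drain the stream, forwarding each token and capturing the final result.
async function collect(
  stream: AsyncIterable<StreamEvent>,
  onToken: (token: string) => void,
): Promise<{ status: string; output: string } | undefined> {
  let final;
  for await (const event of stream) {
    if (event.type === 'llm_token') onToken(event.token);
    if (event.type === 'done') final = event.result;
  }
  return final;
}

// Mock stream standing in for agent.stream() in this example.
async function* mockStream(): AsyncGenerator<StreamEvent> {
  yield { type: 'llm_token', token: 'Hel' };
  yield { type: 'llm_token', token: 'lo' };
  yield { type: 'done', result: { status: 'completed', output: 'Hello' } };
}

const tokens: string[] = [];
const result = await collect(mockStream(), (t) => tokens.push(t));
console.log(tokens.join('')); // "Hello"
console.log(result?.status); // "completed"
```

Returning `undefined` when no `done` event arrives forces callers to handle streams that end early (for example, after an `error` event).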

Next Steps