# Example: Research Assistant

Build a research agent with memory, streaming, and budget control.

## Overview

This agent researches a topic using search, stores findings in memory, and generates a structured report.

## Implementation
```typescript
import {
  defineAgent, defineTool, createLLM,
  InMemoryMemoryStore,
} from '@ahzan-agentforge/core';
import { z } from 'zod';

const searchTool = defineTool({
  name: 'web-search',
  description: 'Search the web for information',
  input: z.object({ query: z.string() }),
  output: z.object({
    results: z.array(z.object({
      title: z.string(),
      snippet: z.string(),
      url: z.string(),
    })),
  }),
  execute: async ({ query }) => {
    // Your search API integration
    return { results: [{ title: 'Example', snippet: 'Info...', url: 'https://example.com' }] };
  },
  cacheable: true, // Cache identical searches
});
```
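The library's internals aren't shown here, but the effect of `cacheable: true` can be sketched as plain memoization keyed on the serialized input. The `withCache` helper below is illustrative, not part of the library's API:

```typescript
// Hypothetical sketch: memoize an async execute() by its serialized input,
// so identical inputs reuse the first result instead of re-running the call.
type Execute<I, O> = (input: I) => Promise<O>;

function withCache<I, O>(execute: Execute<I, O>): Execute<I, O> {
  const cache = new Map<string, O>();
  return async (input: I) => {
    const key = JSON.stringify(input);          // identical inputs share a key
    if (cache.has(key)) return cache.get(key)!; // skip the duplicate call
    const result = await execute(input);
    cache.set(key, result);
    return result;
  };
}
```

With this pattern, repeated identical search queries within a run return the cached result rather than hitting the search API again.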
```typescript
const agent = defineAgent({
  name: 'researcher',
  description: 'Researches topics and generates reports',
  tools: [searchTool],
  llm: createLLM({ provider: 'anthropic', model: 'claude-sonnet-4-20250514' }),
  systemPrompt: `You are a research assistant. When given a topic:
1. Search for relevant information using multiple queries
2. Synthesize findings into a structured report
3. Include citations for all claims`,
  maxSteps: 15,
  budget: { maxCostUsd: 0.50, warnThreshold: 0.7 },
  memory: {
    store: new InMemoryMemoryStore(),
    config: { autoCapture: true },
  },
});
```
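The `budget` option caps spending at $0.50 and warns at 70% of that cap. A minimal sketch of what such a tracker might do, assuming costs are recorded per LLM call (the `makeBudgetTracker` helper is hypothetical, not the library's implementation):

```typescript
// Hypothetical sketch of a running-cost budget with a warning threshold.
interface Budget {
  maxCostUsd: number;   // hard cap: stop the run when reached
  warnThreshold: number; // fraction of the cap that triggers a warning
}

function makeBudgetTracker(budget: Budget) {
  let spent = 0;
  return {
    record(costUsd: number): 'ok' | 'warn' | 'exceeded' {
      spent += costUsd;
      if (spent >= budget.maxCostUsd) return 'exceeded';
      if (spent >= budget.maxCostUsd * budget.warnThreshold) return 'warn';
      return 'ok';
    },
    get spent() {
      return spent;
    },
  };
}
```

With `maxCostUsd: 0.50` and `warnThreshold: 0.7`, the warning fires once cumulative spend reaches $0.35, and the run stops at $0.50.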
```typescript
// Stream results
for await (const event of agent.stream({
  task: 'Research the current state of AI agent frameworks in 2025',
})) {
  if (event.type === 'llm_token') {
    process.stdout.write(event.token);
  }
  if (event.type === 'tool_start') {
    console.log(`\nSearching: ${event.input?.query}`);
  }
}
```

## What this uses
- `cacheable: true` on the search tool prevents duplicate queries
- `autoCapture` stores research findings for future sessions
- Budget caps spending at $0.50 with warnings at 70%
- Streaming gives real-time output as the report is written
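The streaming loop above dispatches on each event's `type` field, which TypeScript can narrow via a discriminated union. A self-contained sketch of that pattern with a stand-in stream (the event shapes mirror the example but are assumptions, not the library's types):

```typescript
// Hypothetical event union mirroring the stream events used above.
type AgentEvent =
  | { type: 'llm_token'; token: string }
  | { type: 'tool_start'; input?: { query: string } };

// Stand-in for agent.stream(): an async generator of events.
async function* fakeStream(): AsyncGenerator<AgentEvent> {
  yield { type: 'tool_start', input: { query: 'AI agent frameworks' } };
  yield { type: 'llm_token', token: 'Report: ' };
  yield { type: 'llm_token', token: 'done.' };
}

async function collectTokens(): Promise<string> {
  let text = '';
  for await (const event of fakeStream()) {
    if (event.type === 'llm_token') {
      text += event.token; // narrowed: `token` is known to exist here
    }
  }
  return text;
}
```

Checking `event.type` inside the loop narrows the union, so each branch gets type-safe access to that variant's fields.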
## Next Steps

- Multi-Agent Pipeline — a multi-agent example