# Your First Agent
This tutorial walks you through building a single agent and then extending it to a multi-agent setup using the Heartbit library.
## Create a new project

```sh
cargo new my-agent
cd my-agent
cargo add heartbit
cargo add tokio --features full
```

## A minimal agent

Replace the contents of `src/main.rs` with:
```rust
use std::sync::Arc;

use heartbit::{AgentRunner, AnthropicProvider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("ANTHROPIC_API_KEY")
        .expect("set ANTHROPIC_API_KEY environment variable");
    let provider = Arc::new(AnthropicProvider::new(api_key, "claude-sonnet-4-20250514"));

    let agent = AgentRunner::builder(provider)
        .name("greeter")
        .system_prompt("You are a friendly assistant. Be concise.")
        .max_turns(1)
        .max_tokens(1024)
        .build()?;

    let output = agent.execute("Say hello in three languages.").await?;
    println!("{}", output.result);
    eprintln!(
        "[tokens: {} in / {} out, {} tool calls]",
        output.tokens_used.input_tokens,
        output.tokens_used.output_tokens,
        output.tool_calls_made
    );

    Ok(())
}
```

## What each part does
- `AnthropicProvider::new(api_key, model)` — creates an LLM provider that talks to Anthropic’s API. Wrap it in `Arc` so it can be shared.
- `AgentRunner::builder(provider)` — starts building an agent with the given provider.
- `.name("greeter")` — gives the agent a name (used in logs and events).
- `.system_prompt(...)` — sets the system prompt that guides the agent’s behavior.
- `.max_turns(1)` — limits the agent to one reasoning turn (no tool-use loops).
- `.max_tokens(1024)` — caps the LLM response length.
- `.build()?` — validates the configuration and returns the agent.
- `agent.execute(task)` — runs the agent’s ReAct loop and returns an `AgentOutput` with the result text and token usage.
## Run it

```sh
export ANTHROPIC_API_KEY=sk-...
cargo run
```

You should see the agent’s response, followed by token usage on stderr.
## Adding streaming output

To see tokens as they arrive, add an `on_text` callback:

```rust
let agent = AgentRunner::builder(provider)
    .name("greeter")
    .system_prompt("You are a friendly assistant. Be concise.")
    .max_turns(1)
    .max_tokens(1024)
    .on_text(Arc::new(|text| print!("{text}")))
    .build()?;
```

The closure receives each text chunk as it streams from the LLM.
## Adding retry logic

Wrap the provider with `RetryingProvider` to handle transient API errors (429, 500, 502, 503) with exponential backoff:

```rust
use heartbit::{AgentRunner, AnthropicProvider, BoxedProvider, RetryingProvider};

let provider = Arc::new(BoxedProvider::new(RetryingProvider::with_defaults(
    AnthropicProvider::new(api_key, "claude-sonnet-4-20250514"),
)));
```

`BoxedProvider` is a type-erasing wrapper needed when composing providers (such as adding retry on top of the base provider).
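If you are curious what exponential backoff means in practice, here is a small standalone sketch of the general idea. This illustrates the concept only; the function names are made up and this is not Heartbit’s actual implementation:

```rust
use std::time::Duration;

// Statuses the tutorial lists as transient and worth retrying.
fn is_retryable(status: u16) -> bool {
    matches!(status, 429 | 500 | 502 | 503)
}

// Delay before retry `attempt` (0-indexed): base * 2^attempt, capped at `max_ms`.
fn backoff_delay(attempt: u32, base_ms: u64, max_ms: u64) -> Duration {
    let ms = base_ms.saturating_mul(2u64.saturating_pow(attempt));
    Duration::from_millis(ms.min(max_ms))
}

fn main() {
    // With a 250 ms base and a 4 s cap, the waits grow 250, 500, 1000, 2000, 4000 ms.
    for attempt in 0..5 {
        println!(
            "attempt {attempt}: wait {} ms",
            backoff_delay(attempt, 250, 4_000).as_millis()
        );
    }
    assert!(is_retryable(429) && !is_retryable(404));
}
```

Doubling the delay after each failure, with a cap, keeps a struggling API from being hammered while still recovering quickly from a single transient error.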
## Multi-agent orchestration

Now let’s extend to multiple agents. The `Orchestrator` manages sub-agents and delegates tasks between them.

Replace `src/main.rs` with:
```rust
use std::sync::Arc;

use heartbit::{AnthropicProvider, Orchestrator};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("ANTHROPIC_API_KEY")
        .expect("set ANTHROPIC_API_KEY environment variable");
    let provider = Arc::new(AnthropicProvider::new(api_key, "claude-sonnet-4-20250514"));

    let mut orchestrator = Orchestrator::builder(provider)
        .sub_agent(
            "researcher",
            "Finds facts and data on a topic",
            "You are a research assistant. Find key facts and return them as bullet points.",
        )
        .sub_agent(
            "writer",
            "Writes polished prose from notes",
            "You are a writer. Turn bullet-point notes into a short, polished paragraph.",
        )
        .max_turns(5)
        .max_tokens(4096)
        .build()?;

    let output = orchestrator
        .run("Write a short paragraph about the Rust programming language.")
        .await?;

    println!("{}", output.result);
    eprintln!(
        "[total tokens: {} in / {} out]",
        output.tokens_used.input_tokens,
        output.tokens_used.output_tokens
    );

    Ok(())
}
```

## How it works
- `Orchestrator::builder(provider)` — creates an orchestrator that uses the given provider for its own reasoning and as the default for sub-agents.
- `.sub_agent(name, description, system_prompt)` — registers a sub-agent. The description tells the orchestrator when to delegate to this agent.
- `orchestrator.run(task)` — the orchestrator analyzes the task, delegates subtasks to the appropriate sub-agents using `delegate_task`, and synthesizes the final result.
The orchestrator follows a flat hierarchy: it delegates to sub-agents, but sub-agents never spawn further agents. This keeps execution predictable and debuggable.
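The shape of this flat delegate-and-join flow can be sketched with plain threads. This is a conceptual illustration only: `run_sub_agent` and `delegate_all` are hypothetical stand-ins, not Heartbit APIs:

```rust
use std::thread;

// Hypothetical stand-in for an LLM-backed sub-agent.
fn run_sub_agent(name: &str, task: &str) -> String {
    format!("[{name}] {task}: done")
}

// Fan independent subtasks out in parallel, then join the results --
// the shape of the orchestrator's single-level delegation.
fn delegate_all(subtasks: Vec<(String, String)>) -> Vec<String> {
    let handles: Vec<_> = subtasks
        .into_iter()
        .map(|(name, task)| thread::spawn(move || run_sub_agent(&name, &task)))
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let results = delegate_all(vec![
        ("researcher".into(), "gather facts".into()),
        ("writer".into(), "draft paragraph".into()),
    ]);
    for r in &results {
        println!("{r}");
    }
}
```

Because workers never spawn workers of their own, the call graph is always one level deep: every result flows straight back to the orchestrator that requested it.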
## Two delegation strategies

The orchestrator has two built-in tools:

- `delegate_task` — sends independent subtasks to agents in parallel
- `form_squad` — creates a group of agents that share a `Blackboard` for collaborative work
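Conceptually, a shared blackboard is just synchronized state that every squad member can read and append to. Here is a toy stand-in built from a `Mutex`-guarded `Vec`; it is an assumption-laden sketch of the idea, not Heartbit’s `Blackboard` type:

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// A toy blackboard: notes every squad member can read and append to.
type Blackboard = Arc<Mutex<Vec<String>>>;

// Each member posts one note; members that run later see earlier notes.
fn run_squad(members: &[&str]) -> Vec<String> {
    let board: Blackboard = Arc::new(Mutex::new(Vec::new()));
    let handles: Vec<_> = members
        .iter()
        .map(|name| {
            let board = Arc::clone(&board);
            let name = name.to_string();
            thread::spawn(move || {
                let mut notes = board.lock().unwrap();
                let seen = notes.len(); // notes already on the board
                notes.push(format!("{name} posted (saw {seen} prior notes)"));
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    Arc::try_unwrap(board).unwrap().into_inner().unwrap()
}

fn main() {
    for note in run_squad(&["researcher", "writer"]) {
        println!("{note}");
    }
}
```

The contrast with `delegate_task` is that squad members share intermediate state as they work, instead of each returning an isolated result to the orchestrator.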
## Next steps

- Add custom tools to give your agents capabilities
- Configure guardrails to constrain agent behavior
- Enable memory for agents that remember across sessions
- Set up multi-agent configs with per-agent providers and MCP tool servers