Octo reads its configuration from a `.env` file at your workspace root. The setup wizard (`octo init`) generates this file, or you can create it manually from `.env.example`.
## LLM Providers
Octo supports five providers. You only need to configure one:

- Anthropic
- AWS Bedrock
- OpenAI
- Azure OpenAI
- GitHub Models
The simplest option is Anthropic: direct API access to Claude models.
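As a sketch, a minimal `.env` for the Anthropic provider might contain just an API key. The variable name `ANTHROPIC_API_KEY` is the provider's conventional name, not confirmed for Octo; check `.env.example` for the exact keys it expects.

```bash
# Assumed key name; confirm against .env.example
ANTHROPIC_API_KEY=sk-ant-...
```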
## Auto-Detection
The model factory auto-detects the provider from the model name:

| Model name pattern | Provider |
|---|---|
| `github/*` | GitHub Models |
| `eu.anthropic.*`, `us.anthropic.*` | AWS Bedrock |
| `claude-*` | Anthropic direct |
| `gpt-*`, `o1-*`, `o3-*` | OpenAI |
| `gpt-*` + `AZURE_OPENAI_ENDPOINT` set | Azure OpenAI |
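The detection rules in the table above can be sketched roughly as follows. This is an illustrative reconstruction, not Octo's actual implementation; the function name and return values are assumptions.

```python
import os

def detect_provider(model: str) -> str:
    """Map a model name to a provider, mirroring the documented patterns."""
    if model.startswith("github/"):
        return "github"
    if model.startswith(("eu.anthropic.", "us.anthropic.")):
        return "bedrock"
    if model.startswith("claude-"):
        return "anthropic"
    if model.startswith(("gpt-", "o1-", "o3-")):
        # OpenAI-style names route to Azure when AZURE_OPENAI_ENDPOINT is set.
        if os.environ.get("AZURE_OPENAI_ENDPOINT"):
            return "azure"
        return "openai"
    raise ValueError(f"cannot detect provider for {model!r}")
```

Note that the `AZURE_OPENAI_ENDPOINT` check makes detection environment-dependent: the same `gpt-*` name resolves differently depending on your `.env`.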
To override auto-detection, set `LLM_PROVIDER` explicitly:
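A hedged example of such an override in `.env`; the accepted values are assumptions based on the provider list above, so verify them against `.env.example`.

```bash
# Force a provider instead of relying on name-based detection
# (value names assumed; see .env.example)
LLM_PROVIDER=bedrock
```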
## Model Tiers
Octo uses three tiers to balance cost against quality. Different agents use different tiers:

| Tier | Used For | Example |
|---|---|---|
| HIGH | Complex reasoning, architecture, multi-step planning | `claude-opus-4-5-20250929` |
| DEFAULT | Supervisor routing, general chat, tool use | `claude-sonnet-4-5-20250929` |
| LOW | Summarization, simple workers, cost-sensitive tasks | `claude-haiku-4-5-20251001` |
## Model Profiles
Profiles are presets that map tiers to agent roles:

| Profile | Supervisor | Workers | High-tier agents |
|---|---|---|---|
| `quality` | high | default | high |
| `balanced` | default | low | high |
| `budget` | low | low | default |
Switch profiles at runtime with `/profile <name>`.
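The profile table can be read as a simple lookup from profile and role to tier. This sketch is illustrative only; the dictionary layout and role names are assumptions, not Octo's internal representation.

```python
# Tier assignments copied from the profile table (names assumed)
PROFILES = {
    "quality":  {"supervisor": "high",    "workers": "default", "high_tier_agents": "high"},
    "balanced": {"supervisor": "default", "workers": "low",     "high_tier_agents": "high"},
    "budget":   {"supervisor": "low",     "workers": "low",     "high_tier_agents": "default"},
}

def tier_for(profile: str, role: str) -> str:
    """Return the model tier a given profile assigns to a given agent role."""
    return PROFILES[profile][role]
```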
## Agent Directories
Load agents from external projects by pointing Octo at directories containing their `AGENT.md` files; each configured directory is scanned for `*/AGENT.md` files.
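A hypothetical `.env` sketch: the variable name `AGENT_DIRECTORIES` and the comma-separated format are assumptions, so consult `.env.example` for the real key and syntax.

```bash
# Hypothetical variable name; each directory is scanned for */AGENT.md
AGENT_DIRECTORIES=~/projects/team-agents,~/projects/shared-agents
```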
## Middleware Tuning
## MCP Servers
MCP server configuration lives in `.mcp.json`. See `.mcp.json.example` for a template, and MCP Servers for management commands.
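A minimal `.mcp.json` might look like the following. The server name and command are placeholders, and the `mcpServers` layout follows the common MCP client convention; Octo's own template in `.mcp.json.example` is authoritative.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```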

