
Providers

Swarmie provider configuration is defined in crates/core/src/config/toml_types.rs and resolved at runtime in crates/core/src/config/resolve.rs.

Provider Config Shape

Use [providers.<name>] entries and point [defaults].provider at one of those names.

[defaults]
provider = "anthropic"
model = "sonnet"
 
[providers.anthropic]
type = "anthropic"
authentication = "api_key"
api_key_env = "ANTHROPIC_API_KEY"
 
[providers.openai]
type = "openai-compatible"
api_key_env = "OPENAI_API_KEY"
base_url = "https://api.openai.com/v1"

Supported fields (crates/core/src/config/toml_types.rs):

Field            Type     Notes
type             string   Provider runtime type.
authentication   string   api_key or oauth.
api_key_env      string   Environment variable name for API key lookup.
base_url         string   Optional endpoint override for compatible backends.
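
Taken together, a fully specified entry uses all four fields. The provider name and values below are illustrative, not defaults:

```toml
[providers.example]
type = "openai-compatible"          # runtime transport category
authentication = "api_key"          # "api_key" or "oauth"
api_key_env = "EXAMPLE_API_KEY"     # env var consulted at resolution time
base_url = "https://example.com/v1" # optional endpoint override
```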

Runtime Resolution

resolve_runtime_provider() in crates/core/src/config/resolve.rs resolves providers in this order:

  1. Load [defaults].provider and the corresponding [providers.<name>] entry.
  2. Resolve credentials by authentication:
    • oauth: load the token from the ~/.swarmie credential store.
    • otherwise: read the environment variable named by api_key_env.
  3. Resolve the model via swarmie_provider::resolve_model_id() (short names like sonnet map to full model IDs).
  4. Build the API client from the provider type:
    • anthropic uses SwarmProviderLlmApi.
    • openai-compatible selects the OpenAI-compatible runtime API.

If step 1 fails, try_resolve_runtime_provider() falls back to the OAuth built-ins defined in provider metadata.
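
The credential step (step 2) can be sketched as a small Rust function. This is a hypothetical simplification; the real types and function names live in crates/core/src/config/resolve.rs and differ in detail:

```rust
// Sketch of step 2 of provider resolution: choose a credential source
// based on the entry's `authentication` field.

#[derive(Debug, PartialEq)]
enum Credential {
    OauthToken(String), // from the ~/.swarmie credential store
    ApiKey(String),     // from the env var named by api_key_env
}

struct ProviderEntry {
    authentication: Option<String>,
    api_key_env: Option<String>,
}

// `stored_oauth` stands in for the credential store; `env` stands in for
// environment lookup, so the sketch is testable without real state.
fn resolve_credential(
    entry: &ProviderEntry,
    stored_oauth: Option<&str>,
    env: &dyn Fn(&str) -> Option<String>,
) -> Option<Credential> {
    match entry.authentication.as_deref() {
        // oauth: read the token from the credential store.
        Some("oauth") => stored_oauth.map(|t| Credential::OauthToken(t.to_string())),
        // otherwise: read the configured api_key_env variable.
        _ => entry
            .api_key_env
            .as_deref()
            .and_then(|var| env(var))
            .map(Credential::ApiKey),
    }
}

fn main() {
    let env = |var: &str| {
        if var == "ANTHROPIC_API_KEY" {
            Some("sk-test".to_string())
        } else {
            None
        }
    };

    let api_key_entry = ProviderEntry {
        authentication: Some("api_key".to_string()),
        api_key_env: Some("ANTHROPIC_API_KEY".to_string()),
    };
    assert_eq!(
        resolve_credential(&api_key_entry, None, &env),
        Some(Credential::ApiKey("sk-test".to_string()))
    );

    let oauth_entry = ProviderEntry {
        authentication: Some("oauth".to_string()),
        api_key_env: None,
    };
    assert_eq!(
        resolve_credential(&oauth_entry, Some("tok"), &env),
        Some(Credential::OauthToken("tok".to_string()))
    );
}
```

Injecting the store and environment as parameters keeps the branch logic isolated, mirroring the "oauth vs otherwise" split in the list above.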

Supported Provider IDs

Canonical provider IDs come from crates/provider/src/metadata.rs (PROVIDER_METADATA):

  • amazon-bedrock
  • anthropic
  • google
  • google-gemini-cli
  • google-antigravity
  • google-vertex
  • openai
  • openai-codex
  • azure-openai-responses
  • github-copilot
  • xai
  • groq
  • cerebras
  • openrouter
  • vercel-ai-gateway
  • zai
  • mistral
  • minimax
  • minimax-cn
  • huggingface
  • opencode
  • kimi-coding
  • gitlab

Provider Types vs Provider IDs

[providers.<name>].type names a runtime transport category, while a provider ID identifies a specific integration.

  • type = "anthropic" for Anthropic-native API flow.
  • type = "openai-compatible" for OpenAI-compatible style providers and endpoints.

Most providers in metadata use openai-compatible; Anthropic uses anthropic.
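
For example, a provider from the metadata list such as openrouter is still configured with the generic openai-compatible transport. The env var name and base URL below are illustrative assumptions, not values taken from the metadata:

```toml
[providers.openrouter]
type = "openai-compatible"
api_key_env = "OPENROUTER_API_KEY"
base_url = "https://openrouter.ai/api/v1"
```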

Custom Endpoints

Set base_url in [providers.<name>] to route to custom or self-hosted endpoints:

[providers.internal-gateway]
type = "openai-compatible"
api_key_env = "INTERNAL_LLM_KEY"
base_url = "https://llm-gateway.example.com/v1"

OAuth Example

[providers.openai-codex]
type = "openai-compatible"
authentication = "oauth"
api_key_env = "OPENAI_API_KEY" # retained for schema consistency

With authentication = "oauth", Swarmie reads credentials from the credential store instead of the environment.