# Providers
AgentCrew supports multiple AI agent providers. A provider determines which CLI and container image are used to run your agents. The platform features — teams, agents, skills, schedules, chat, and workspaces — work identically regardless of which provider you choose.
## Available Providers
| Provider | CLI | Agent Image | Credentials |
|---|---|---|---|
| Claude Code | `claude` | `ghcr.io/helmcode/agent_crew_agent` | `ANTHROPIC_API_KEY` or `CLAUDE_CODE_OAUTH_TOKEN` |
| OpenCode | `opencode` | `ghcr.io/helmcode/agent_crew_opencode_agent` | Standard provider keys (see below) |
## Choosing a Provider
When creating a team, you select the provider in Step 1 of the creation wizard. The provider determines which container image is deployed and which CLI handles the AI interaction inside the container.
You can run multiple teams simultaneously with different providers. For example, one team using Claude Code and another using OpenCode — they operate independently.
## Claude Code
Claude Code is Anthropic's official agentic coding tool. It runs as a CLI process that reads from stdin and writes to stdout. The sidecar bridges NATS messages to this stdin/stdout interface.
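A simplified picture of that bridge, as illustrative Python rather than the sidecar's actual code: the CLI runs as a child process, the prompt is written to its stdin, and the reply is read back from stdout. Here `cat` stands in for the real CLI so the sketch is runnable.

```python
import subprocess

def ask(prompt: str) -> str:
    """Illustrative stdin/stdout bridge: send one prompt, read one reply.

    Uses `cat` as a stand-in for the provider CLI so the sketch is
    self-contained; a real sidecar would launch the CLI binary instead.
    """
    proc = subprocess.run(
        ["cat"],              # stand-in for the provider CLI
        input=prompt,         # written to the child's stdin
        capture_output=True,  # collect the child's stdout
        text=True,
        check=True,
    )
    return proc.stdout
```

Because the call blocks until the child process finishes writing, this interface is synchronous — which is why the comparison table below labels Claude Code's communication style "blocking".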
### Credentials
Claude Code requires one of the following credentials, configured in the Settings page:
- `ANTHROPIC_API_KEY`: Your Anthropic API key from console.anthropic.com.
- `CLAUDE_CODE_OAUTH_TOKEN`: An OAuth token for Claude Code. If both are set, the API key takes precedence.
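The precedence rule can be sketched as follows (hypothetical logic for illustration, not AgentCrew's actual implementation):

```python
import os

def pick_claude_credential(env=os.environ):
    """Return which Claude Code credential would be used, or None.

    Mirrors the documented rule: the API key takes precedence
    when both variables are set.
    """
    if env.get("ANTHROPIC_API_KEY"):
        return "ANTHROPIC_API_KEY"
    if env.get("CLAUDE_CODE_OAUTH_TOKEN"):
        return "CLAUDE_CODE_OAUTH_TOKEN"
    return None
```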
## OpenCode
OpenCode is an open-source terminal-based AI assistant. It runs as an HTTP server (`opencode serve`), and the sidecar communicates with it via a REST API and SSE (Server-Sent Events) for streaming responses.
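An SSE stream is plain text: each event is a block of `field: value` lines terminated by a blank line. A minimal parser for the `data:` payloads might look like this (illustrative only; the sidecar's actual client is not shown here):

```python
def iter_sse_events(lines):
    """Yield the `data:` payloads from an iterable of SSE text lines.

    A blank line ends an event; multiple `data:` lines within one
    event are joined with newlines, per the SSE specification.
    """
    data = []
    for line in lines:
        line = line.rstrip("\n")
        if line == "":
            # Blank line: dispatch the accumulated event, if any.
            if data:
                yield "\n".join(data)
                data = []
        elif line.startswith("data:"):
            value = line[5:]
            # The spec strips a single leading space after the colon.
            if value.startswith(" "):
                value = value[1:]
            data.append(value)
    if data:  # stream ended without a trailing blank line
        yield "\n".join(data)
```

Feeding this generator the response body of a streaming request yields each chunk of the agent's reply as it arrives.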
### Credentials
OpenCode supports 75+ LLM providers. Configure the API key for your chosen model provider in the Settings page. At least one must be set:
- `ANTHROPIC_API_KEY`: For Anthropic models (Claude Sonnet, Opus, Haiku, etc.).
- `OPENAI_API_KEY`: For OpenAI models (GPT-4o, o3, etc.).
- `GOOGLE_GENERATIVE_AI_API_KEY`: For Google models (Gemini, etc.).

Local model servers are also supported via `OLLAMA_BASE_URL` or `LM_STUDIO_BASE_URL`.
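For example, pointing OpenCode at a local Ollama server might look like this (the value is illustrative; 11434 is Ollama's default port):

```shell
# Illustrative only: point OpenCode at a local Ollama server.
OLLAMA_BASE_URL=http://localhost:11434
```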
## What Stays the Same
Regardless of provider, all platform features work identically:
- Teams and agents: Leader/worker model, Markdown-driven specialization.
- Skills: Install and manage skills the same way.
- Schedules: Cron-based task automation works across providers.
- Chat: Real-time messaging, activity feed, and message history.
- Workspaces: Shared `/workspace` directory with project files.
## What Differs
| Aspect | Claude Code | OpenCode |
|---|---|---|
| Container image | `agent_crew_agent` | `agent_crew_opencode_agent` |
| Communication | stdin/stdout (blocking) | HTTP REST + SSE (async) |
| Credential keys | `ANTHROPIC_API_KEY` | `ANTHROPIC_API_KEY`, `OPENAI_API_KEY`, `GOOGLE_GENERATIVE_AI_API_KEY`, or local URLs |
| Instructions file | `.claude/CLAUDE.md` | `.opencode/AGENTS.md` |
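Both files play the same role: per-agent instructions written in Markdown. A minimal sketch of such a file (contents entirely illustrative, not a template shipped with AgentCrew):

```markdown
# Role

You are the backend specialist on this team.

# Conventions

- Work only inside the shared /workspace directory.
- Report progress back to the team leader before starting new tasks.
```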
## Next Steps
- Quick Start: Get AgentCrew running and create your first team with any provider.
- Configuration: Review all credential and environment variable options.
- Architecture: Understand how the sidecar abstracts provider differences.