# OpenCode Models & LLM Providers

OpenCode is model-agnostic and supports 75+ LLM providers through Models.dev. Use built-in free models, connect your existing subscriptions, or bring your own API keys.
## Quick Model Selection

| Use Case | Recommended Model | Provider |
|---|---|---|
| Getting Started | Zen Default | OpenCode (Free) |
| Complex Tasks | Zen Advanced | OpenCode (Free) |
| Fast Responses | Zen Fast | OpenCode (Free) |
| Best Reasoning | Claude 3.5 Sonnet | Anthropic |
| Large Context | Gemini 1.5 Pro | Google |
| Existing Subscription | GPT-4o | OpenAI |
## Built-in Models (Zen)

The easiest way to get started is with OpenCode Zen - no API key required:

```sh
opencode auth login --provider zen
```

See OpenCode Zen for detailed information about built-in models.
## Major Providers

### Anthropic (Claude)

Claude models are known for excellent reasoning and code understanding.

**Available Models:**

- Claude 3.5 Sonnet (Recommended)
- Claude 3.5 Haiku
- Claude 3 Opus

**Setup:**

```sh
opencode auth login --provider anthropic
```

Or configure manually:

```json
{
  "provider": "anthropic",
  "apiKey": "sk-ant-..."
}
```

**Use Cases:**

- **Complex reasoning:** Best for multi-step problem solving
- **Code understanding:** Excellent at analyzing large codebases
- **Documentation:** Great for generating detailed documentation
### OpenAI (GPT)

GPT models excel at general-purpose coding tasks and have strong tool use capabilities.

**Available Models:**

- GPT-4o (Recommended)
- GPT-4 Turbo
- GPT-4o Mini

**Setup:**

```sh
opencode auth login --provider openai
```

Or configure manually:

```json
{
  "provider": "openai",
  "apiKey": "sk-..."
}
```

**Use Cases:**

- **General coding:** Well-rounded for most coding tasks
- **Fast iterations:** Quick responses for rapid development
- **Tool integration:** Strong function calling capabilities
### Google (Gemini)

Gemini models offer large context windows and strong multimodal capabilities.

**Available Models:**

- Gemini 1.5 Pro (Recommended)
- Gemini 1.5 Flash
- Gemini 1.0 Ultra

**Setup:**

```sh
opencode auth login --provider google
```

Or configure manually:

```json
{
  "provider": "google",
  "apiKey": "AIza..."
}
```

**Use Cases:**

- **Large codebases:** 1M+ token context window
- **Fast responses:** Flash models for quick tasks
- **Multimodal:** Can analyze images and diagrams
## Additional Providers

### Groq

Ultra-fast inference for supported models.

**Available Models:**

- Llama 3.1 70B
- Mixtral 8x7B
- Gemma 2 9B

**Setup:**

```sh
opencode auth login --provider groq
```

**Use Cases:**

- **Speed priority:** Fastest response times available
- **High throughput:** Great for batch processing
### GitHub Copilot

Use your existing GitHub Copilot subscription with OpenCode.

**Setup:**

```sh
opencode auth login --provider github
```

**Use Cases:**

- **Existing subscription:** No additional API costs
- **Familiar models:** GPT-4 and Claude models
### Azure OpenAI

Use Azure-hosted OpenAI models with enterprise features.

**Available Models:**

- GPT-4
- GPT-4 Turbo
- GPT-3.5 Turbo

**Setup:**

```sh
opencode auth login --provider azure
```

**Use Cases:**

- **Enterprise:** Data privacy and compliance requirements
- **SLA guarantees:** Azure's enterprise support
## Local Models

Run models locally with Ollama for complete privacy.

### Ollama

**Setup:**

1. Install Ollama from ollama.com
2. Pull a model:

   ```sh
   ollama pull llama3.1
   ollama pull codellama
   ollama pull qwen2.5-coder
   ```

3. Configure OpenCode:

   ```sh
   opencode auth login --provider ollama
   ```

**Recommended Local Models:**

- **Llama 3.1 70B:** Best overall quality (requires 40GB+ RAM)
- **Llama 3.1 8B:** Good balance of quality and speed (requires 8GB+ RAM)
- **Code Llama 34B:** Specialized for code (requires 20GB+ RAM)
- **Qwen 2.5 Coder:** Excellent code understanding (various sizes)
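If you prefer a manual config over `opencode auth login`, a local setup might look like the sketch below. The exact field names (in particular `baseURL`) are an assumption modeled on the provider config examples elsewhere on this page; Ollama serves its API on port 11434 by default.

```json
{
  "provider": "ollama",
  "model": "llama3.1",
  "baseURL": "http://localhost:11434"
}
```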
**Use Cases:**

- **Privacy:** Code never leaves your machine
- **Offline:** Works without internet connection
- **Cost:** No API costs after setup
## Model Comparison

| Provider | Best For | Speed | Privacy | Cost |
|---|---|---|---|---|
| Anthropic | Complex reasoning | Medium | Cloud | $15/M tokens |
| OpenAI | General coding | Medium | Cloud | $10/M tokens |
| Google | Large context | Fast | Cloud | $7/M tokens |
| Groq | Speed | Very Fast | Cloud | Free tier |
| GitHub | Existing subscription | Medium | Cloud | Free with sub |
| Ollama | Privacy | Depends | Local | Free |
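The per-million-token rates above translate directly into a rough spend estimate. The snippet below is a standalone sketch (not an OpenCode command): it multiplies a token count by a rate, using the table's Anthropic figure and an example usage number.

```shell
# Estimate spend from tokens used and a per-million-token rate.
# Example figures: 2.5M tokens at the table's $15/M Anthropic rate.
tokens=2500000
rate_per_million=15

cost=$(awk -v t="$tokens" -v r="$rate_per_million" \
  'BEGIN { printf "%.2f", t / 1000000 * r }')
echo "Estimated cost: \$$cost"
# prints: Estimated cost: $37.50
```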
## Model Configuration

### Default Model

Set your default model in the OpenCode configuration:

```sh
opencode config set model anthropic/claude-3.5-sonnet
```

### Per-Session Model

Override the default for a specific session:

```sh
opencode tui --model openai/gpt-4o
```

### Model Fallback

Configure fallback models for when the primary model is unavailable:

```json
{
  "models": {
    "primary": "anthropic/claude-3.5-sonnet",
    "fallback": ["openai/gpt-4o", "google/gemini-1.5-pro"]
  }
}
```
## Provider Authentication

### Interactive Setup

The easiest way to configure providers:

```sh
opencode auth login
```

This will guide you through setting up each provider.

### Manual Configuration

For automated setup or CI/CD environments:

```sh
# Set API keys via environment variables
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="AIza..."

# Or use the config file
opencode config set providers.anthropic.apiKey "sk-ant-..."
```
### Multiple Providers

You can use multiple providers and switch between them:

```sh
# List configured providers
opencode auth list

# Switch the default provider
opencode config set provider anthropic

# Use a specific provider for one session
opencode tui --provider openai
```
## Cost Management

### Track Usage

Monitor your API costs:

```sh
opencode stats
```

This shows:

- Tokens used per provider
- Estimated costs
- Usage trends

### Set Limits

Configure spending limits:

```sh
opencode config set budget.monthly 100  # USD
opencode config set budget.alerts true
```
### Optimize Costs

- **Use smaller models for simple tasks:** GPT-4o Mini is 10x cheaper than GPT-4o
- **Use free tiers:** Groq and GitHub Copilot offer free usage
- **Use local models:** Ollama has no API costs
- **Compact sessions:** Use `/compact` to reduce token usage
## Troubleshooting

### Rate Limits

If you hit rate limits:

- Wait for the limit to reset
- Switch to a different provider
- Use local models (Ollama)

### API Errors

Common issues:

- **401 Unauthorized:** Check your API key
- **429 Rate Limited:** You've exceeded your quota
- **500 Server Error:** Provider issue, try again later

### Model Not Found

If a model isn't available:

- Check that the model name is correct
- Verify your provider supports that model
- Check whether the model requires special access
## Next Steps

- Quick Start Guide - Get started with OpenCode
- Free Models Guide - Use OpenCode for free
- Configuration - Customize your setup