# OpenCode Free Models Guide
Use OpenCode with completely free LLM models. No API keys or credit cards required.
## Free Model Options
| Provider | Model | Free Tier | Requirements |
|---|---|---|---|
| Google Gemini | Gemini 2.5 Flash | 250 requests/day | Google account |
| Groq | Llama 3.1 8B | Free tier | Groq account |
| Ollama | Various local models | Unlimited | GPU recommended |
| GitHub Copilot | GPT-4o | Free tier | GitHub account |
## Option 1: Google Gemini (Recommended)
Google Gemini offers 250 free requests per day with no credit card required.
### Get Your Free API Key
- Go to Google AI Studio
- Sign in with your Google account
- Click “Get API key”
- Copy your API key
### Configure OpenCode
```bash
# Start the TUI
opencode tui

# Then configure your provider
/connect
# Select Google and enter your API key
```
Or authenticate from the command line:

```bash
opencode auth login --provider google
```
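For non-interactive setups (scripts, containers), the key can also be exported before launch. Note that the variable name `GEMINI_API_KEY` is an assumption here; check which environment variable your OpenCode version actually reads:

```bash
# GEMINI_API_KEY is an assumed variable name -- verify against your
# OpenCode version's documentation before relying on it
export GEMINI_API_KEY="your-api-key-here"

# Launch directly with the Gemini model
opencode tui --model google/gemini-2.5-flash
```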
### Recommended Model
```bash
# Start OpenCode with Gemini
opencode tui --model google/gemini-2.5-flash
```
## Option 2: Ollama (100% Local, Unlimited)
Run AI models completely locally with no internet required.
### Install Ollama

macOS/Linux:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Windows: Download the installer from ollama.com
### Download a Model
```bash
# Download Qwen 2.5 Coder (recommended for coding)
ollama pull qwen2.5-coder

# Or download Llama 3.1
ollama pull llama3.1

# Or download Code Llama
ollama pull codellama
```
### Configure OpenCode
```bash
# Start the TUI
opencode tui

# Configure Ollama as your provider
/connect
# Select Ollama (it runs on localhost:11434 by default)
```
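Before connecting, you can confirm the local Ollama server is actually up. This sketch probes Ollama's `/api/tags` endpoint (which lists your locally downloaded models) on the default port:

```bash
# Check that the Ollama server is reachable on its default port (11434)
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama server is running"
else
  echo "Ollama server is not reachable; start it with: ollama serve"
fi
```

If the server is down, `ollama serve` starts it in the foreground; the desktop app starts it automatically in the background.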
### Start Coding

```bash
opencode tui --model ollama/qwen2.5-coder
```
## Option 3: Groq (Free Tier)

Groq offers free-tier access to fast models through its API.
### Get Your API Key
- Go to console.groq.com
- Sign up for a free account
- Create an API key
- Copy your API key
### Configure OpenCode
```bash
# Start the TUI
opencode tui

# Configure Groq
/connect
# Select Groq and enter your API key
```
### Start Coding

```bash
opencode tui --model groq/llama-3.1-8b-instant
```
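As with Gemini, the key can be supplied via an environment variable for scripted setups. `GROQ_API_KEY` is Groq's conventional variable name, but whether your OpenCode version reads it is an assumption to verify against its docs:

```bash
# GROQ_API_KEY is Groq's conventional env var name; confirm that your
# OpenCode version picks it up before relying on this
export GROQ_API_KEY="your-groq-key-here"
opencode tui --model groq/llama-3.1-8b-instant
```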
## Option 4: GitHub Copilot
If you have a GitHub account, you can use GitHub Copilot with OpenCode.
### Configure OpenCode
```bash
# Start the TUI
opencode tui

# Configure GitHub Copilot
/connect
# Select GitHub and authenticate via browser
```
## Comparison of Free Options
| Feature | Google Gemini | Ollama | Groq | GitHub Copilot |
|---|---|---|---|---|
| Free Tier | 250 req/day | Unlimited | Limited | Free tier |
| Privacy | Cloud | Local | Cloud | Cloud |
| Speed | Fast | Depends on hardware | Very fast | Fast |
| Model Quality | Excellent | Good | Good | Excellent |
| Offline | ❌ | ✅ | ❌ | ❌ |
## Tips for Using Free Models
- Start with Gemini: It’s the easiest to set up and has the most generous free tier
- Use Ollama for privacy: Keep your code completely local
- Switch models: Use different models for different tasks
- Monitor usage: Check `opencode stats` to see your API usage
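The "switch models" tip can be wrapped in a small helper script. The task names and the mapping below are purely illustrative (a hypothetical `pick-model.sh`), using the model IDs from this guide:

```bash
# Hypothetical helper: map a task name to one of the free models above.
# Usage: ./pick-model.sh [code|fast|general]
task="${1:-general}"
case "$task" in
  code) model="ollama/qwen2.5-coder" ;;        # local and private
  fast) model="groq/llama-3.1-8b-instant" ;;   # fastest responses
  *)    model="google/gemini-2.5-flash" ;;     # good general default
esac
echo "Selected model: $model"
# Then launch OpenCode with the chosen model:
# opencode tui --model "$model"
```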
## Troubleshooting

### Rate Limits
If you hit rate limits:
- Wait for the limit to reset
- Switch to a different free provider
- Use Ollama for unlimited local usage
### Slow Performance
If models are slow:
- Try a different model (Gemini Flash is faster than Pro)
- Use Groq for fastest response times
- For Ollama, ensure you have a good GPU
### Authentication Issues
If authentication fails:
- Check that your API key is correct
- Verify your account is active
- Try re-authenticating with `/connect`
## Next Steps
- Quick Start Guide - Get started with OpenCode
- Models Documentation - Explore all available models
- Configuration - Customize your setup