
OpenCode Free Models Guide

Use OpenCode with completely free LLM models. No API keys or credit cards required.

Free Model Options

| Provider | Model | Free Tier | Requirements |
|---|---|---|---|
| Google Gemini | Gemini 2.5 Flash | 250 requests/day | Google account |
| Groq | Llama 3.1 8B | Free tier | Groq account |
| Ollama | Various local models | Unlimited | GPU recommended |
| GitHub Copilot | GPT-4o | Free tier | GitHub account |

Option 1: Google Gemini (Recommended)

Google Gemini offers 250 free requests per day with no credit card required.

Get Your Free API Key

  1. Go to Google AI Studio
  2. Sign in with your Google account
  3. Click “Get API key”
  4. Copy your API key
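
Before wiring the key into OpenCode, you can sanity-check it against the public Gemini REST API. This is an optional sketch; the `GEMINI_API_KEY` variable name here is just a placeholder for the example, not something OpenCode requires:

```shell
# Hedged sketch: verify the key by listing models via the Gemini REST API.
# GEMINI_API_KEY is a placeholder variable name for this example only.
if [ -z "$GEMINI_API_KEY" ]; then
  status="GEMINI_API_KEY is not set"
else
  status=$(curl -s "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY")
fi
echo "$status"
```

A valid key returns a JSON list of models; an invalid key returns a JSON error object.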

Configure OpenCode

# Start the TUI
opencode tui

# Then configure your provider
/connect

# Select Google and enter your API key

Or authenticate from the command line:

opencode auth login --provider google
# Start OpenCode with Gemini
opencode tui --model google/gemini-2.5-flash

Option 2: Ollama (100% Local, Unlimited)

Run AI models entirely on your own machine. After the initial model download, no internet connection is required.

Install Ollama

macOS/Linux:

curl -fsSL https://ollama.com/install.sh | sh

Windows: Download from ollama.com
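
Either way, you can confirm the install succeeded before pulling any models. A minimal check:

```shell
# Confirm the ollama binary is on PATH before pulling any models.
if command -v ollama >/dev/null 2>&1; then
  ollama_status=$(ollama --version)
else
  ollama_status="ollama not found on PATH; re-run the installer"
fi
echo "$ollama_status"
```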

Download a Model

# Download Qwen 2.5 Coder (recommended for coding)
ollama pull qwen2.5-coder

# Or download Llama 3.1
ollama pull llama3.1

# Or download Code Llama
ollama pull codellama

Configure OpenCode

# Start the TUI
opencode tui

# Configure Ollama as your provider
/connect

# Select Ollama (it runs on localhost:11434 by default)

Start Coding

opencode tui --model ollama/qwen2.5-coder
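
If the model fails to load, first make sure the Ollama server is actually listening on its default port. This sketch uses Ollama's standard `/api/tags` endpoint:

```shell
# Check that the Ollama server is reachable on its default port (11434)
# before pointing OpenCode at it.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  status="Ollama is running"
else
  status="Ollama is not running; start it with: ollama serve"
fi
echo "$status"
```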

Option 3: Groq (Free Tier)

Groq offers a free tier with API access to its notably fast model inference.

Get Your API Key

  1. Go to console.groq.com
  2. Sign up for a free account
  3. Create an API key
  4. Copy your API key

Configure OpenCode

# Start the TUI
opencode tui

# Configure Groq
/connect

# Select Groq and enter your API key

Start Coding

opencode tui --model groq/llama-3.1-8b-instant
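
As with Gemini, you can verify the key directly against Groq's OpenAI-compatible REST API before using it in OpenCode. `GROQ_API_KEY` is a placeholder variable name for this sketch:

```shell
# Hedged sketch: list available Groq models to confirm the key works.
# GROQ_API_KEY is a placeholder variable name for this example only.
if [ -z "$GROQ_API_KEY" ]; then
  status="GROQ_API_KEY is not set"
else
  status=$(curl -s https://api.groq.com/openai/v1/models \
    -H "Authorization: Bearer $GROQ_API_KEY")
fi
echo "$status"
```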

Option 4: GitHub Copilot

If you have a GitHub account, you can use GitHub Copilot with OpenCode.

Configure OpenCode

# Start the TUI
opencode tui

# Configure GitHub Copilot
/connect

# Select GitHub and authenticate via browser
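
If browser authentication fails and you happen to have the GitHub CLI installed, `gh auth status` can confirm your GitHub login. Note this checks the gh CLI's own session, not Copilot access itself, so treat it only as a first diagnostic:

```shell
# Optional sanity check via the GitHub CLI (checks gh's login, not Copilot).
if command -v gh >/dev/null 2>&1; then
  gh_status=$(gh auth status 2>&1 || true)
else
  gh_status="GitHub CLI (gh) not installed; use the browser flow above"
fi
echo "$gh_status"
```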

Comparison of Free Options

| Feature | Google Gemini | Ollama | Groq | GitHub Copilot |
|---|---|---|---|---|
| Free Tier | 250 req/day | Unlimited | Limited | Free tier |
| Privacy | Cloud | Local | Cloud | Cloud |
| Speed | Fast | Depends on hardware | Very fast | Fast |
| Model Quality | Excellent | Good | Good | Excellent |
| Offline | No | Yes | No | No |
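
The table above can be turned into a quick picker. This helper is purely illustrative (it is not an OpenCode feature); the model IDs are the ones used earlier in this guide:

```shell
# Illustrative: choose an opencode model flag from what matters most to you.
# Not an OpenCode feature; just a small wrapper over the comparison above.
need="${1:-quality}"   # privacy | speed | quality

case "$need" in
  privacy) model="ollama/qwen2.5-coder" ;;          # local, works offline
  speed)   model="groq/llama-3.1-8b-instant" ;;     # fastest responses
  *)       model="google/gemini-2.5-flash" ;;       # strong default, 250 req/day
esac

echo "opencode tui --model $model"
```

For example, `./pick-model.sh privacy` prints the Ollama command line.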

Tips for Using Free Models

  1. Start with Gemini: It’s the easiest to set up and has the most generous free tier
  2. Use Ollama for privacy: Keep your code completely local
  3. Switch models: Use different models for different tasks
  4. Monitor usage: Check opencode stats to see your API usage

Troubleshooting

Rate Limits

If you hit rate limits:

  • Wait for the limit to reset
  • Switch to a different free provider
  • Use Ollama for unlimited local usage
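
One way to see a limit coming rather than hitting it: a minimal local counter against Gemini's 250-requests/day quota. This is a hypothetical sketch you could wrap around your own calls; OpenCode provides no such helper:

```shell
# Hypothetical local counter for Gemini's 250-requests/day free quota.
# Not an OpenCode feature; a sketch you could wrap around your own calls.
QUOTA=250
# $$ keeps this example's file unique per run; drop it in real use so the
# count persists for the whole day.
COUNT_FILE="${TMPDIR:-/tmp}/gemini_requests_$(date +%F)_$$"

count=0
[ -f "$COUNT_FILE" ] && count=$(cat "$COUNT_FILE")

if [ "$count" -ge "$QUOTA" ]; then
  msg="daily quota of $QUOTA reached; switch provider or wait for reset"
else
  count=$((count + 1))
  echo "$count" > "$COUNT_FILE"
  msg="request $count of $QUOTA today"
fi
echo "$msg"
```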

Slow Performance

If models are slow:

  • Try a different model (Gemini Flash is faster than Pro)
  • Use Groq for fastest response times
  • For Ollama, ensure you have a good GPU

Authentication Issues

If authentication fails:

  • Check your API key is correct
  • Verify your account is active
  • Try re-authenticating with /connect

Next Steps