Model Connection
Connect DSPy Code to local and cloud LLM providers for code generation and optimization.
Provider Overview
DSPy Code supports both local and cloud LLMs:
- Ollama (Local) – runs models on your machine (free, private)
- OpenAI (Cloud) – GPT-4o and the gpt-5 family (e.g. gpt-5-nano)
- Anthropic (Cloud) – Claude Sonnet/Opus 4.5 (paid only)
- Google Gemini (Cloud) – Gemini 2.5 family (via google-genai)
Local vs Cloud
- Local (Ollama): Best for experimentation, zero API cost, but uses your CPU/GPU and RAM.
- Cloud (OpenAI, Anthropic, Gemini): Best quality and scale, but billed per token. Optimization workflows can generate many calls.
Quick Connect
Easiest: Interactive Model Selector
This walks you through:
- Picking Ollama (local) vs cloud providers
- For Ollama: selecting from detected models (for example gpt-oss:120b, llama3.2) by number
- For cloud: picking OpenAI, Anthropic, or Gemini, then typing a model name (for example gpt-5-nano, claude-sonnet-4.5, gemini-2.5-flash)
Ollama (Local - Recommended for Beginners)
Advantages:
- ✓ Free
- ✓ Private (runs locally)
- ✓ No API key needed
- ✓ Fast
Requirements:
- Ollama installed
- Model downloaded: ollama pull gpt-oss:120b
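A quick way to confirm the local setup is a guarded check like the one below; it is a general sketch, safe to run even on a machine where Ollama is absent:

```shell
# Check for a local Ollama install and list downloaded models if present.
if command -v ollama >/dev/null 2>&1; then
  ollama list                     # shows pulled models, e.g. gpt-oss:120b, llama3.2
  ollama_found="yes"
else
  ollama_found="no"               # install Ollama first, then: ollama pull gpt-oss:120b
fi
echo "ollama installed: $ollama_found"
```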
OpenAI (Cloud)
Requirements:
- OpenAI Python SDK (installed via dspy-code[openai])
- OpenAI API key: export OPENAI_API_KEY=sk-...
Use the Best Model You Have
gpt-5-nano is a good starter model. For higher quality, switch to GPT-4o or newer gpt-5 family models your account supports.
Anthropic (Cloud, Paid Only)
Requirements:
- Anthropic Python SDK (installed via dspy-code[anthropic])
- Anthropic API key: export ANTHROPIC_API_KEY=sk-ant-...
Anthropic no longer offers free API keys. DSPy Code fully supports Claude if you have a paid key; otherwise, just skip Anthropic.
Google Gemini (Cloud)
Requirements:
- Google Gen AI SDK (google-genai, installed via dspy-code[gemini])
- API key: export GEMINI_API_KEY=... (or GOOGLE_API_KEY=...)
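Whichever cloud provider you use, it only sees keys exported in the shell that launches DSPy Code. This small loop reports which of the common key variables are set:

```shell
# List which provider API keys are exported in the current shell.
keys_checked=0
for var in OPENAI_API_KEY ANTHROPIC_API_KEY GEMINI_API_KEY GOOGLE_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "$var: set"
  else
    echo "$var: not set"
  fi
  keys_checked=$((keys_checked + 1))
done
```

If a key shows as "not set" here, re-run the matching export line and check again before connecting.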
Check Your Quotas
All cloud providers enforce quotas and rate limits. If you see 429 or quota errors during optimization, check your usage dashboards and billing settings.
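When 429s are transient rather than hard quota limits, retrying with exponential backoff usually resolves them. The sketch below is a generic shell pattern, not a DSPy Code feature; the retry function name is chosen here for illustration:

```shell
# Generic retry-with-exponential-backoff wrapper for rate-limited commands.
# Usage: retry <command...>
retry() {
  attempt=1; max=3; delay=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $max attempts" >&2
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))          # back off: 1s, 2s, 4s, ...
    attempt=$((attempt + 1))
  done
}

retry true && retry_result="ok"   # a command that succeeds on the first try
```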
Connection Status
Check your connection:
Output shows:
- ✓ Model Connected: llama3.1:8b (ollama)
- Or: ✗ No Model Connected
Disconnect
Configure Default Model
Edit dspy_config.yaml:
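The exact schema depends on your DSPy Code version; check the generated dspy_config.yaml for the real field names. A sketch of the kind of structure to expect, with illustrative (not documented) keys:

```yaml
# Hypothetical layout — your dspy_config.yaml may use different keys.
model:
  provider: ollama        # ollama | openai | anthropic | gemini
  name: llama3.1:8b
```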
Troubleshooting
Ollama Not Running
Solution: Start the Ollama server: ollama serve
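Before restarting, you can confirm whether the server is actually down. This probe assumes Ollama's default port (11434) and requires curl:

```shell
# Probe the default Ollama endpoint; harmless if nothing is listening.
if curl -s --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
  ollama_up="yes"
else
  ollama_up="no"          # start the server with: ollama serve
fi
echo "Ollama running: $ollama_up"
```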
Invalid API Key
Solution: Check that the environment variable is set correctly in the shell that launched DSPy Code (for example, echo $OPENAI_API_KEY)
Model Not Found
Solution: For Ollama, pull the model first: ollama pull llama3.1:8b