Installation¶
This guide will help you install CodeOptiX on your system.
Prerequisites¶
Before installing CodeOptiX, make sure you have:
- Python 3.12 or higher - check with python --version
- pip or uv - Python package manager
- API Key - from at least one LLM provider (OpenAI, Anthropic, or Google), or a local Ollama install, which requires no key
Installation Methods¶
Method 1: Using pip (Recommended for Beginners)¶
The simplest way to install CodeOptiX:
Method 2: Using uv (Faster)¶
If you have uv installed:
Method 3: From Source¶
For development or latest features:
# Clone the repository
git clone https://github.com/SuperagenticAI/codeoptix.git
cd codeoptix
# Install in development mode
pip install -e .
Verify Installation¶
After installation, verify that CodeOptiX is installed correctly and that the codeoptix command is available on your PATH.
Setting Up LLM Providers¶
CodeOptiX supports multiple LLM providers. Choose the one that works best for you:
Option 1: Ollama (Recommended for Open-Source Users) 🆕¶
No API key required! Use local Ollama models:
# 1. Install Ollama: https://ollama.com
# 2. Start Ollama service
ollama serve
# 3. Pull a model
ollama pull llama3.1:8b
# 4. Use in CodeOptiX
codeoptix eval --agent basic --behaviors insecure-code --llm-provider ollama
See Ollama Integration Guide for detailed setup.
Option 2: Cloud Providers (Requires API Keys)¶
CodeOptiX supports cloud LLM providers. Set at least one API key:
OpenAI¶
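On Linux/macOS, set the key as an environment variable. The `OPENAI_API_KEY` name follows OpenAI's usual convention; replace the placeholder with your real key:

```shell
export OPENAI_API_KEY="your-openai-api-key"
```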
Anthropic¶
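For Anthropic, the conventional variable name is `ANTHROPIC_API_KEY` (replace the placeholder with your real key):

```shell
export ANTHROPIC_API_KEY="your-anthropic-api-key"
```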
Google¶
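For Google, the variable is commonly `GOOGLE_API_KEY`; check your provider settings if CodeOptiX does not pick it up:

```shell
export GOOGLE_API_KEY="your-google-api-key"
```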
Ollama vs Cloud Providers
Ollama (Local):
- ✅ No API key required
- ✅ Free to use
- ✅ Privacy-friendly (runs locally)
- ✅ Works offline
- ⚠️ Requires local compute resources

Cloud Providers:
- ✅ More powerful models
- ✅ No local compute needed
- ⚠️ Requires API key
- ⚠️ May incur costs
- ⚠️ Data sent to external service
Windows (PowerShell)¶
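In PowerShell, environment variables are set with the `$env:` prefix. `OPENAI_API_KEY` is used here as an example; substitute your provider's variable and your real key:

```powershell
$env:OPENAI_API_KEY = "your-api-key-here"
```

This sets the variable for the current session only; use the System Properties dialog or `setx` to persist it.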
Windows (CMD)¶
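In CMD, use `set` for the current session (again with `OPENAI_API_KEY` as an example name):

```bat
set OPENAI_API_KEY=your-api-key-here
```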
Optional Dependencies¶
Development Tools¶
For development and testing:
Includes:
- pytest - Testing framework
- ruff - Code linter
- mypy - Type checker
- black - Code formatter
Ollama Integration (Local LLM Support)¶
CodeOptiX supports local Ollama models - no API key required!
Prerequisites:
1. Install Ollama: https://ollama.com
2. Start the Ollama service: ollama serve
3. Pull a model: ollama pull llama3.1:8b (or gpt-oss:120b, qwen3:8b, etc.)
Usage:
codeoptix eval \
--agent basic \
--behaviors insecure-code \
--llm-provider ollama \
--config examples/configs/ollama-insecure-code.yaml
Configuration:
adapter:
  llm_config:
    provider: ollama
    model: llama3.2:3b  # Or llama3.1:8b, gpt-oss:120b, qwen3:8b, etc.
    # No api_key needed!
See Ollama Integration Guide for detailed setup and examples.
Troubleshooting¶
Installation Fails¶
If installation fails, try:
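Some generic first steps (standard pip hygiene, not specific to CodeOptiX):

```shell
# Upgrade packaging tools first
python -m pip install --upgrade pip setuptools wheel

# Retry in a fresh virtual environment
python -m venv .venv
source .venv/bin/activate
pip install codeoptix
```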
Import Errors¶
If you get import errors:
# Verify installation
pip show codeoptix
# Reinstall if needed
pip uninstall codeoptix
pip install codeoptix
API Key Not Found¶
If CodeOptiX can't find your API key:
- Check that the environment variable is set
- Make sure you're using the correct variable name
- Restart your terminal after setting the variable
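One quick way to check on Linux/macOS, using `OPENAI_API_KEY` as an example (substitute your provider's variable name):

```shell
echo "$OPENAI_API_KEY"
```

If this prints nothing, the variable is not set in the current shell session.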
Next Steps¶
Now that CodeOptiX is installed, you're ready to:
- Quick Start - Run your first evaluation
- Your First Evaluation - Detailed walkthrough
- Python API Guide - Use CodeOptiX in Python
Need Help?¶
If you encounter any issues:
- Check the Troubleshooting section above
- Open an issue on GitHub
- Join our Discussions