
Installation

This guide will help you install CodeOptiX on your system.


Prerequisites

Before installing CodeOptiX, make sure you have:

  • Python 3.12 or higher - Check with python --version
  • pip or uv - Python package manager
  • API Key - From at least one LLM provider (OpenAI, Anthropic, or Google)
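
The checks above can be scripted in one pass. A minimal sketch (the command names `python3` and `pip` are assumptions; adjust to `python`, `pip3`, or `uv` as appropriate for your system):

```shell
# Report which prerequisites are present before installing.
# `python3` and `pip` are assumed command names; adjust for your system.
for cmd in python3 pip; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok: $cmd ($($cmd --version 2>&1 | head -n 1))"
  else
    echo "missing: $cmd"
  fi
done
```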

Installation Methods

Method 1: Using pip (Recommended)

The simplest way to install CodeOptiX:

pip install codeoptix

Method 2: Using uv (Faster)

If you have uv installed:

uv pip install codeoptix

Method 3: From Source

For development or latest features:

# Clone the repository
git clone https://github.com/SuperagenticAI/codeoptix.git
cd codeoptix

# Install in development mode
pip install -e .

Verify Installation

After installation, verify that CodeOptiX is installed correctly:

# Check version
codeoptix --version

# View help
codeoptix --help

You should see output like:

CodeOptiX, version 0.1.0
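
If a script needs the bare version string, the last field of that line can be extracted. A sketch, assuming the `CodeOptiX, version X.Y.Z` banner format shown above:

```shell
# Extract just the version number from the CLI banner
# (assumes the "CodeOptiX, version X.Y.Z" format shown above).
version=$(codeoptix --version | awk '{print $NF}')
echo "Installed version: $version"
```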

Setting Up LLM Providers

CodeOptiX supports multiple LLM providers. Choose the one that works best for you:

Option 1: Ollama (Local, No API Key)

No API key required! Use local Ollama models:

# 1. Install Ollama: https://ollama.com
# 2. Start Ollama service
ollama serve

# 3. Pull a model
ollama pull llama3.1:8b

# 4. Use in CodeOptiX
codeoptix eval --agent basic --behaviors insecure-code --llm-provider ollama

See Ollama Integration Guide for detailed setup.

Option 2: Cloud Providers (Requires API Keys)

CodeOptiX supports cloud LLM providers. Set at least one API key:

OpenAI

export OPENAI_API_KEY="sk-your-api-key-here"

Anthropic

export ANTHROPIC_API_KEY="sk-ant-your-api-key-here"

Google

export GOOGLE_API_KEY="your-api-key-here"
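
Keys exported this way last only for the current shell session. To persist one, append the export line to your shell profile; a sketch for bash (use `~/.zshrc` on zsh):

```shell
# Persist the key across sessions (bash shown; use ~/.zshrc for zsh).
# Replace the placeholder with your real key, then open a new terminal
# or run `source ~/.bashrc` to load it into the current shell.
echo 'export OPENAI_API_KEY="sk-your-api-key-here"' >> ~/.bashrc
```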

Ollama vs Cloud Providers

Ollama (Local):

  • ✅ No API key required
  • ✅ Free to use
  • ✅ Privacy-friendly (runs locally)
  • ✅ Works offline
  • ⚠️ Requires local compute resources

Cloud Providers:

  • ✅ More powerful models
  • ✅ No local compute needed
  • ⚠️ Requires API key
  • ⚠️ May incur costs
  • ⚠️ Data sent to external service

Windows (PowerShell)

$env:OPENAI_API_KEY="sk-your-api-key-here"

Windows (CMD)

set OPENAI_API_KEY=sk-your-api-key-here

Optional Dependencies

Development Tools

For development and testing:

pip install "codeoptix[dev]"

Includes:

  • pytest - Testing framework
  • ruff - Code linter
  • mypy - Type checker
  • black - Code formatter

Ollama Integration (Local LLM Support)

CodeOptiX supports local Ollama models - no API key required!

Prerequisites:

  1. Install Ollama: https://ollama.com
  2. Start Ollama service: ollama serve
  3. Pull a model: ollama pull llama3.1:8b (or gpt-oss:120b, qwen3:8b, etc.)

Usage:

codeoptix eval \
  --agent basic \
  --behaviors insecure-code \
  --llm-provider ollama \
  --config examples/configs/ollama-insecure-code.yaml

Configuration:

adapter:
  llm_config:
    provider: ollama
    model: llama3.2:3b  # Or llama3.1:8b, gpt-oss:120b, qwen3:8b, etc.
    # No api_key needed!

See Ollama Integration Guide for detailed setup and examples.
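
Before running an evaluation against Ollama, it can help to confirm the service is actually reachable. A sketch assuming Ollama's default port 11434:

```shell
# Check whether the local Ollama service answers on its default port.
# 11434 is Ollama's default; adjust if you changed OLLAMA_HOST.
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not reachable - start it with: ollama serve"
fi
```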


Troubleshooting

Installation Fails

If installation fails, try:

# Upgrade pip first
pip install --upgrade pip

# Then install CodeOptiX
pip install codeoptix

Import Errors

If you get import errors:

# Verify installation
pip show codeoptix

# Reinstall if needed
pip uninstall codeoptix
pip install codeoptix

API Key Not Found

If CodeOptiX can't find your API key:

  1. Check that the environment variable is set:

    echo $OPENAI_API_KEY
    

  2. Make sure you're using the correct variable name

  3. Restart your terminal after setting the variable
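
The checks above can be combined into one guard. A minimal sketch for the OpenAI variable (substitute `ANTHROPIC_API_KEY` or `GOOGLE_API_KEY` as needed):

```shell
# Fail early with a clear message when the key is missing.
if [ -z "${OPENAI_API_KEY:-}" ]; then
  echo "OPENAI_API_KEY is not set - export it before running codeoptix" >&2
else
  echo "OPENAI_API_KEY is set"
fi
```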

Next Steps

Now that CodeOptiX is installed, you're ready to:

  1. Quick Start - Run your first evaluation
  2. Your First Evaluation - Detailed walkthrough
  3. Python API Guide - Use CodeOptiX in Python

Need Help?

If you encounter any issues: