🧠 Model Management API

The model management system provides unified access to multiple language model backends with automatic discovery, configuration, and DSPy integration.

SuperOptiXModelManager

The main model manager class that handles model operations across different backends.

Constructor

Python
class SuperOptiXModelManager:
    def __init__(self, config_dir: Optional[str] = None)

Parameters:

- config_dir (str, optional): Configuration directory path

Core Methods

list_models()

Python
def list_models(
    self,
    backend: Optional[SuperOptiXBackendType] = None,
    size: Optional[SuperOptiXModelSize] = None,
    task: Optional[SuperOptiXModelTask] = None,
    installed_only: bool = True,
    verbose: bool = False
) -> List[SuperOptiXModelInfo]:
    """List models with optional filtering."""

Parameters:

- backend (SuperOptiXBackendType, optional): Filter by backend type
- size (SuperOptiXModelSize, optional): Filter by model size
- task (SuperOptiXModelTask, optional): Filter by model task
- installed_only (bool): Show only installed models. Defaults to True
- verbose (bool): Show detailed information. Defaults to False

Returns: List of model information objects

test_model()

Python
def test_model(
    self,
    model_name: str,
    prompt: str = "Hello, world!",
    backend_type: Optional[SuperOptiXBackendType] = None
) -> Dict[str, Any]:
    """Test a SuperOptiX model with a simple prompt."""

Parameters:

- model_name (str): Name of the model to test
- prompt (str): Prompt to send to the model. Defaults to "Hello, world!"
- backend_type (SuperOptiXBackendType, optional): Backend type (auto-detected if not specified)

Returns: Dictionary with test results including:

- success (bool): Whether the test was successful
- response (str): Model response text
- model (str): Model name
- prompt (str): Input prompt
- response_time (float): Response time in seconds
- tokens (int): Number of tokens generated (if available)
- error (str): Error message if the test failed

Features:

- Auto-backend detection: Automatically finds the correct backend for the model
- Auto-installation: Automatically installs the model if it is not found
- Cross-backend support: Works with Ollama, MLX, and HuggingFace
- Real-time execution: Direct model execution without server setup

Supported Backends:

- Ollama: Uses the ollama run command for direct execution
- MLX: Downloads the model and runs it directly via MLX-LM
- HuggingFace: Downloads the model and runs it via a transformers pipeline
- LM Studio: Not supported (designed for server mode)

Example:

Python
from superoptix.models.manager import SuperOptiXModelManager

manager = SuperOptiXModelManager()

# Test with auto-detection and auto-installation
result = manager.test_model("llama3.2:3b", "Write a Python function to add two numbers")

if result["success"]:
    print(f"Response: {result['response']}")
    print(f"Time: {result['response_time']:.2f}s")
else:
    print(f"Error: {result['error']}")

install_model()

Python
def install_model(
    self,
    model_name: str,
    backend_type: Optional[SuperOptiXBackendType] = None
) -> bool:
    """Install a SuperOptiX model with automatic backend detection."""

Parameters:

- model_name (str): Name of the model to install
- backend_type (SuperOptiXBackendType, optional): Target backend (auto-detected if not specified)

Returns: Boolean indicating success

Features:

- Auto-backend detection: Determines the appropriate backend from model-name patterns
- Backend-specific installation: Uses the appropriate installation method for each backend
- Progress feedback: Provides real-time installation progress
- Error handling: Comprehensive error reporting and recovery

Backend-Specific Installation:

Ollama:

- Uses the ollama pull command
- Downloads from the Ollama model registry
- Supports all Ollama-compatible models

MLX:

- Downloads from the HuggingFace Hub
- Uses the MLX-LM format for Apple Silicon
- Supports mlx-community models

HuggingFace:

- Downloads from the HuggingFace Hub
- Uses the transformers pipeline format
- Supports all HuggingFace models

LM Studio:

- Not supported for direct installation
- Designed for server-mode operation
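The auto-detection described above can be pictured as a name-pattern heuristic. The sketch below is illustrative only: `detect_backend_sketch` is a hypothetical helper, and the manager's actual rules are not shown in this reference.

```python
def detect_backend_sketch(model_name: str) -> str:
    """Guess a backend from the model-name pattern (illustrative only)."""
    if model_name.startswith("mlx-community/"):
        return "mlx"  # MLX models are typically published under the mlx-community org
    if "/" in model_name:
        return "huggingface"  # HuggingFace repo ids look like "org/name"
    return "ollama"  # bare "name:tag" names default to Ollama
```

Under this heuristic, "llama3.2:3b" maps to Ollama while "microsoft/phi-1_5" maps to HuggingFace, matching the examples in this section.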

Example:

Python
from superoptix.models.manager import SuperOptiXModelManager

manager = SuperOptiXModelManager()

# Auto-detection installation
success = manager.install_model("llama3.2:3b")  # Uses Ollama

# Explicit backend installation
success = manager.install_model("microsoft/phi-1_5", SuperOptiXBackendType.HUGGINGFACE)

get_model_info()

Python
def get_model_info(
    self,
    model_name: str,
    backend: SuperOptiXBackendType = SuperOptiXBackendType.OLLAMA
) -> SuperOptiXModelInfo:
    """Get detailed information about a model."""

Parameters:

- model_name (str): Name of the model
- backend (SuperOptiXBackendType): Backend type. Defaults to OLLAMA

Returns: Detailed model information object

start_server()

Python
def start_server(
    self,
    model_name: str,
    backend: SuperOptiXBackendType = SuperOptiXBackendType.OLLAMA,
    port: Optional[int] = None,
    **kwargs
) -> Dict[str, Any]:
    """Start a model server for the specified model."""

Parameters:

- model_name (str): Name of the model to serve
- backend (SuperOptiXBackendType): Backend type. Defaults to OLLAMA
- port (int, optional): Port number for the server
- **kwargs (Any): Additional server configuration

Returns: Server information dictionary

create_dspy_client()

Python
def create_dspy_client(
    self,
    model_name: str,
    backend: SuperOptiXBackendType = SuperOptiXBackendType.OLLAMA,
    **kwargs
) -> dspy.LM:
    """Create a DSPy language model client."""

Parameters:

- model_name (str): Name of the model
- backend (SuperOptiXBackendType): Backend type. Defaults to OLLAMA
- **kwargs (Any): Additional client configuration

Returns: DSPy language model client

check_backend_status()

Python
def check_backend_status(
    self,
    backend: SuperOptiXBackendType
) -> SuperOptiXBackendInfo:
    """Check the status of a backend."""

Parameters:

- backend (SuperOptiXBackendType): Backend to check

Returns: Backend status information

uninstall_model()

Python
def uninstall_model(
    self,
    model_name: str,
    backend_type: Optional[SuperOptiXBackendType] = None
) -> bool:
    """Uninstall a SuperOptiX model with intelligent backend detection."""

Parameters:

- model_name (str): Name of the model to uninstall
- backend_type (SuperOptiXBackendType, optional): Backend type (auto-detected if not specified)

Returns: Boolean indicating success

Features:

- Auto-backend detection: Automatically finds the backend containing the model
- Intelligent removal: Removes model files and cache entries
- Cross-backend support: Works with all supported backends
- Safe removal: Confirms the model exists before removal

Backend-Specific Removal:

Ollama:

- Uses the ollama rm command
- Removes the model from the local Ollama registry

MLX:

- Removes model files from the MLX cache
- Cleans up HuggingFace cache entries

HuggingFace:

- Removes model files from the SuperOptiX cache
- Cleans up HuggingFace cache entries

Example:

Python
from superoptix.models.manager import SuperOptiXModelManager

manager = SuperOptiXModelManager()

# Auto-detection removal
success = manager.uninstall_model("llama3.2:3b")

# Explicit backend removal
success = manager.uninstall_model("microsoft/phi-1_5", SuperOptiXBackendType.HUGGINGFACE)

Data Classes

SuperOptiXModelInfo

Python
@dataclass
class SuperOptiXModelInfo:
    name: str
    backend: SuperOptiXBackendType
    size: SuperOptiXModelSize
    task: SuperOptiXModelTask
    status: SuperOptiXModelStatus
    installed: bool
    file_size: Optional[int]
    download_url: Optional[str]
    description: Optional[str]
    tags: List[str]
    parameters: Optional[int]
    context_length: Optional[int]
    last_updated: Optional[datetime]

SuperOptiXBackendInfo

Python
@dataclass
class SuperOptiXBackendInfo:
    backend: SuperOptiXBackendType
    available: bool
    version: Optional[str]
    models_count: int
    server_running: bool
    server_url: Optional[str]
    error_message: Optional[str]

SuperOptiXModelConfig

Python
@dataclass
class SuperOptiXModelConfig:
    name: str
    backend: SuperOptiXBackendType
    temperature: float = 0.1
    max_tokens: int = 2000
    top_p: float = 1.0
    frequency_penalty: float = 0.0
    presence_penalty: float = 0.0
    stop_sequences: Optional[List[str]] = None
    api_base: Optional[str] = None
    api_key: Optional[str] = None

Enums

SuperOptiXBackendType

Python
class SuperOptiXBackendType(Enum):
    OLLAMA = "ollama"
    MLX = "mlx"
    HUGGINGFACE = "huggingface"
    LMSTUDIO = "lmstudio"

SuperOptiXModelSize

Python
class SuperOptiXModelSize(Enum):
    TINY = "tiny"      # < 1B parameters
    SMALL = "small"    # 1B - 7B parameters
    MEDIUM = "medium"  # 7B - 30B parameters
    LARGE = "large"    # > 30B parameters
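The size buckets above can be expressed as a small classifier. This sketch uses only the thresholds stated in the comments; how the real enum assigns boundary values (exactly 7B, exactly 30B) is an assumption here:

```python
def classify_size(parameters_billions: float) -> str:
    """Map a parameter count (in billions) to a size bucket (sketch)."""
    if parameters_billions < 1:
        return "tiny"
    if parameters_billions <= 7:   # boundary handling is an assumption
        return "small"
    if parameters_billions <= 30:
        return "medium"
    return "large"
```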

SuperOptiXModelTask

Python
class SuperOptiXModelTask(Enum):
    CHAT = "chat"
    CODE = "code"
    REASONING = "reasoning"
    EMBEDDING = "embedding"
    MULTIMODAL = "multimodal"

SuperOptiXModelStatus

Python
class SuperOptiXModelStatus(Enum):
    AVAILABLE = "available"
    INSTALLED = "installed"
    DOWNLOADING = "downloading"
    ERROR = "error"
    NOT_FOUND = "not_found"

Utility Functions

get_superoptix_system_info()

Python
def get_superoptix_system_info() -> Dict[str, Any]:
    """Get comprehensive system information."""

Returns: Dictionary with system information including OS, Python version, available backends, etc.
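For comparison, fields of this kind can be gathered with the standard library. The sketch below is not the SuperOptiX implementation and mirrors only a subset of the documented keys:

```python
import platform

def system_info_sketch() -> dict:
    """Collect a subset of the documented fields with the stdlib (sketch)."""
    return {
        "os": platform.system(),                    # e.g. "Linux", "Darwin"
        "python_version": platform.python_version(),
        "machine": platform.machine(),              # e.g. "arm64", "x86_64"
    }
```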

format_size()

Python
def format_size(size_bytes: int) -> str:
    """Format file size in human-readable format."""

Parameters:

- size_bytes (int): Size in bytes

Returns: Formatted size string (e.g., "1.5 GB")
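The described behavior can be sketched with a simple base-1024 loop (an illustrative reimplementation, not the SuperOptiX source):

```python
def format_size_sketch(size_bytes: int) -> str:
    """Render a byte count as a human-readable string (base-1024, sketch)."""
    size = float(size_bytes)
    for unit in ("B", "KB", "MB", "GB"):
        if size < 1024:
            return f"{size:.1f} {unit}"
        size /= 1024
    return f"{size:.1f} TB"
```

For example, `format_size_sketch(1_610_612_736)` yields "1.5 GB".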

parse_superoptix_model_name()

Python
def parse_superoptix_model_name(model_name: str) -> Dict[str, Any]:
    """Parse a SuperOptiX model name into components."""

Parameters:

- model_name (str): Model name to parse

Returns: Dictionary with parsed components
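The exact components returned are not specified here. As an illustration, a sketch that splits the common "org/name:tag" shape — the key names below are assumptions, not the real return schema:

```python
def parse_model_name_sketch(model_name: str) -> dict:
    """Split "org/name:tag" into parts (sketch; key names are assumptions)."""
    org, _, rest = model_name.rpartition("/")  # empty org for bare names
    name, _, tag = rest.partition(":")
    return {"org": org or None, "name": name, "tag": tag or None}
```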

get_superoptix_model_discovery_guide()

Python
def get_superoptix_model_discovery_guide() -> str:
    """Get a comprehensive guide for model discovery."""

Returns: Formatted guide text

validate_superoptix_model_compatibility()

Python
def validate_superoptix_model_compatibility(
    model_name: str,
    backend: SuperOptiXBackendType
) -> bool:
    """Validate if a model is compatible with a backend."""

Parameters:

- model_name (str): Model name to validate
- backend (SuperOptiXBackendType): Backend to check compatibility with

Returns: True if compatible, False otherwise

Configuration Functions

get_superoptix_global_config()

Python
def get_superoptix_global_config() -> Dict[str, Any]:
    """Get global SuperOptiX configuration."""

Returns: Global configuration dictionary

save_superoptix_global_config()

Python
def save_superoptix_global_config(config: Dict[str, Any]) -> None:
    """Save global SuperOptiX configuration."""

Parameters:

- config (Dict[str, Any]): Configuration to save

get_superoptix_project_config()

Python
def get_superoptix_project_config(project_path: str) -> Dict[str, Any]:
    """Get project-specific configuration."""

Parameters:

- project_path (str): Path to the project

Returns: Project configuration dictionary

save_superoptix_project_config()

Python
def save_superoptix_project_config(
    project_path: str,
    config: Dict[str, Any]
) -> None:
    """Save project-specific configuration."""

Parameters:

- project_path (str): Path to the project
- config (Dict[str, Any]): Configuration to save

Example Usage

Python
from superoptix.models import (
    SuperOptiXModelManager,
    SuperOptiXBackendType,
    SuperOptiXModelSize,
    SuperOptiXModelTask,
    get_superoptix_system_info,
    get_superoptix_model_discovery_guide,
)

# Initialize model manager
manager = SuperOptiXModelManager()

# List installed models
models = manager.list_models(installed_only=True)
for model in models:
    print(f"{model.name} ({model.backend.value}) - {model.size.value}")

# List all available models
all_models = manager.list_models(installed_only=False, verbose=True)

# Filter by backend and size
ollama_models = manager.list_models(
    backend=SuperOptiXBackendType.OLLAMA,
    size=SuperOptiXModelSize.SMALL
)

# Install a model (returns True on success)
success = manager.install_model(
    "llama3.2:3b",
    backend_type=SuperOptiXBackendType.OLLAMA
)

# Get model information
info = manager.get_model_info("llama3.2:3b")

# Start a model server
server_info = manager.start_server(
    "llama3.2:3b",
    port=8000
)

# Create DSPy client
lm = manager.create_dspy_client(
    "llama3.2:3b",
    temperature=0.2,
    max_tokens=1000
)

# Check backend status
status = manager.check_backend_status(SuperOptiXBackendType.OLLAMA)
print(f"Ollama available: {status.available}")
print(f"Models installed: {status.models_count}")

# Get system information
system_info = get_superoptix_system_info()
print(f"OS: {system_info['os']}")
print(f"Python: {system_info['python_version']}")

# Get discovery guide
guide = get_superoptix_model_discovery_guide()
print(guide)

Backend-Specific Configuration

Ollama Configuration

Python
ollama_config = {
    "api_base": "http://localhost:11434",
    "timeout": 30,
    "retry_attempts": 3
}

MLX Configuration

Python
mlx_config = {
    "api_base": "http://localhost:8000",
    "model_path": "/path/to/models",
    "device": "cpu"  # or "gpu"
}

HuggingFace Configuration

Python
huggingface_config = {
    "api_base": "http://localhost:8001",
    "trust_remote_code": True,
    "device_map": "auto"
}

LM Studio Configuration

Python
lmstudio_config = {
    "api_base": "http://localhost:1234",
    "timeout": 60,
    "stream": False
}