
# Conversational Interface

## 🎯 Overview

SuperOptiX features a conversational interface that makes it easy to interact with the framework through natural language and slash commands.

Just type:

```bash
super
```

That's it! No need for `super chat` or other subcommands.


## 🚀 Getting Started

### First Run

The first time you run `super`, you'll go through a quick setup:

```bash
$ super

🎉 Welcome to SuperOptiX!

First time setup - this will take about 30 seconds.

Step 1/2: Choose AI Model Provider

1. 🏠 Ollama (Local - FREE, Private, Offline)
2. ☁️  OpenAI (Cloud - Paid)
3. ☁️  Anthropic (Cloud - Paid)

Choose [1-3]: 1
```

We recommend Ollama for:

- Privacy (data stays local)
- No API costs (free)
- Offline capability

### After Setup

Once configured, just type `super` to enter conversational mode:

```bash
$ super

Welcome to SuperOptiX!
Using: ollama (llama3.1:8b)

Type /help for slash commands or just tell me what to do.

SuperOptiX> _
```

## 🎮 Slash Commands

Slash commands provide quick access to SuperOptiX features:

### Configuration & Models

```bash
/model                   # Show current model
/model list              # List all available models
/model set <model>       # Switch model
/config                  # Show configuration
/config show             # Show all settings
/config set <k> <v>      # Set configuration value
```

### Help & Documentation

```bash
/help                    # Show all commands
/ask <question>          # Ask about SuperOptiX
/help <topic>            # Topic-specific help
/docs <topic>            # Open documentation
/examples                # Show example workflows
```

### Project & Agents

```bash
/status                  # Show project status
/agents                  # List all agents
/playbooks               # List all playbooks
/templates               # Show available templates
```

### Conversation

```bash
/clear                   # Clear screen
/history                 # Show conversation history (coming soon)
/exit, /quit             # Exit conversational mode
```

## 💬 Example Session

```bash
$ super

Welcome to SuperOptiX!
Using: ollama (llama3.1:8b)

SuperOptiX> /help

[Shows all slash commands]

SuperOptiX> /model list

Available AI Models

🏠 LOCAL MODELS (via Ollama):
  llama3.1:8b (current)
  qwen2.5:14b

☁️  CLOUD MODELS:
  OpenAI: gpt-4o, gpt-4o-mini
  Anthropic: claude-3.5-sonnet

SuperOptiX> /ask How do I add memory?

💡 How do I add memory to my agent?

To add memory to your agent, update your playbook's spec:

  spec:
    memory:
      enabled: true
      enable_context_optimization: true
      max_context_tokens: 2000

SuperOptiX> /agents

Agents

Found 2 agent(s):

• code_reviewer Compiled
• customer_support ⚠️  Not compiled

SuperOptiX> /playbooks

Available Playbooks

📦 Library Templates (5):
• genie_playbook [memory, tools, rag] General-purpose intelligent agent...
• security_agent_playbook [tools, rag] Code security review agent...

📁 Your Project (2):
• code_reviewer_playbook [memory, tools]
• customer_support_playbook [memory, rag]

SuperOptiX> /exit

👋 Goodbye! Happy building with SuperOptiX!
```

---

## 🔧 Model Management

### Viewing Models

```bash
SuperOptiX> /model

Current Model Configuration

• Provider: ollama
• Model: llama3.1:8b
• API Base: http://localhost:11434
• Status: Connected
```

### Switching Models

```bash
# Switch to a different Ollama model
SuperOptiX> /model set qwen2.5:14b

Switched to: qwen2.5:14b

# Switch to OpenAI (requires API key)
SuperOptiX> /model set gpt-4o

⚠️  OPENAI_API_KEY not set
Set it with: /config set OPENAI_API_KEY sk-...
```

### Listing All Models

```bash
SuperOptiX> /model list

Available AI Models

🏠 LOCAL MODELS (via Ollama):

Installed:
  llama3.1:8b (current)
  qwen2.5:14b

Available to install:
  • deepseek-coder:33b (19GB) - Best for coding
  • mistral:7b (4.1GB) - Fast alternative

Install: ollama pull <model>

─────────────────────────────────────

☁️  CLOUD MODELS:

OpenAI (Requires OPENAI_API_KEY):
  • gpt-4o - Best overall
  • gpt-4o-mini - Fast and affordable

Anthropic (Requires ANTHROPIC_API_KEY):
  • claude-3.5-sonnet - Best for coding
  • claude-3.5-haiku - Fast and affordable

Set API key: /config set OPENAI_API_KEY sk-...
```

## 🎓 Learning SuperOptiX

### Ask Questions

Use `/ask` to learn about SuperOptiX features:

```bash
SuperOptiX> /ask How do I add memory?
SuperOptiX> /ask What is GEPA?
SuperOptiX> /ask How do I add RAG?
SuperOptiX> /ask What is SuperSpec?
```

### View Examples

```bash
SuperOptiX> /examples

Example Workflows

1. Build and Optimize Agent:
   super spec generate genie code_reviewer
   super agent compile code_reviewer
   super agent optimize code_reviewer --auto medium

2. Quick Agent from Template:
   super agent pull developer
   super agent compile developer
   super agent run developer --goal "Build a CLI tool"
```

## 🔄 Backwards Compatibility

Traditional CLI commands still work:

```bash
# Traditional commands (no conversational mode)
$ super agent compile code_reviewer
$ super agent optimize code_reviewer --auto medium
$ super agent evaluate code_reviewer

# These bypass conversational mode and run directly
```

**When to use each:**

- **Conversational mode** (`super`): interactive exploration, learning, quick tasks
- **Traditional CLI** (`super agent ...`): scripts, CI/CD, automation
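Because the traditional commands run non-interactively, they drop straight into a CI pipeline. A hypothetical GitHub Actions sketch follows; the workflow name, trigger, and install step are illustrative assumptions (including the `superoptix` package name), while the `super agent ...` commands are the ones documented above:

```yaml
# Hypothetical CI sketch - assumes SuperOptiX is pip-installable on the runner.
name: agent-ci
on: [push]
jobs:
  evaluate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install superoptix            # assumed package name
      - run: super agent compile code_reviewer
      - run: super agent evaluate code_reviewer
```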

๐Ÿ” Configuration & Privacy

### Local Mode (Default)

- No authentication required
- All data stays on your machine
- Privacy-first

### Configuration Storage

```
~/.superoptix/
├── config.yaml          # Your model choice and settings
├── credentials.yaml     # API keys (encrypted)
└── history/             # Conversation history
```
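The exact schema of `config.yaml` isn't documented here; as a hypothetical sketch based only on the fields that `/model` reports (provider, model, API base), it might look like:

```yaml
# Illustrative only - field names are assumptions inferred from /model output.
provider: ollama
model: llama3.1:8b
api_base: http://localhost:11434
```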

### Changing Configuration

```bash
# View configuration
SuperOptiX> /config

# View detailed settings
SuperOptiX> /config show

# Set API keys
SuperOptiX> /config set OPENAI_API_KEY sk-...

# Reset configuration
SuperOptiX> /config reset
```
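As an alternative to `/config set`, most tools that use `OPENAI_API_KEY` also read it from the environment. Assuming SuperOptiX does too (not confirmed on this page), you could export the key in your shell before launching:

```shell
# Assumption: SuperOptiX picks up standard provider env vars from the shell.
# "sk-your-key" is a placeholder, not a real key.
export OPENAI_API_KEY="sk-your-key"
```

This keeps the key out of your shell history files if you source it from a secrets manager instead of typing it inline.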

## 💡 Tips & Tricks

### 1. Use `/ask` for Quick Help

```bash
SuperOptiX> /ask memory
SuperOptiX> /ask RAG
SuperOptiX> /ask GEPA
```

### 2. Discover Available Playbooks

```bash
SuperOptiX> /playbooks
# Shows all library templates + your project playbooks
```

### 3. Check Project Status

```bash
SuperOptiX> /status
# Quick overview of your project
```

### 4. Clear Screen

```bash
SuperOptiX> /clear
# Clears the screen
```

### 5. Traditional CLI Still Works

```bash
# Exit conversational mode
SuperOptiX> /exit

# Run a traditional command
$ super agent compile code_reviewer

# Re-enter conversational mode
$ super
```


## 🚀 Coming Soon

**Natural Language Mode:**

```bash
SuperOptiX> Build a code review agent
SuperOptiX> Optimize my customer support agent
SuperOptiX> Show me optimization results
```

Currently in development! For now, use:

- Slash commands in conversational mode
- Traditional CLI for full functionality


## 🆘 Troubleshooting

"Ollama not running"

Solution:

1. Install Ollama: https://ollama.com
2. Run: `ollama serve`
3. Install a model: `ollama pull llama3.1:8b`
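To confirm the server is actually up before re-running `super`, you can probe Ollama's default address (localhost:11434, the API base shown by `/model`). A small sketch:

```shell
# Probe the default Ollama address; prints a status either way.
if curl -s --max-time 2 http://localhost:11434/ >/dev/null 2>&1; then
  status="running"
else
  status="not reachable - start it with: ollama serve"
fi
echo "Ollama is $status"
```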

"Not in a SuperOptiX project"

Solution:

```bash
# Exit conversational mode
SuperOptiX> /exit

# Initialize project
$ super init my_project
$ cd my_project

# Re-enter conversational mode
$ super
```

"No agents found"

Solution:

```bash
SuperOptiX> /exit

# Create an agent
$ super spec generate genie code_reviewer

# Re-enter and check
$ super
SuperOptiX> /agents
```




## ✨ Summary

**Just type `super`** - that's it!

- 🎮 Slash commands for quick access
- 💬 Ask questions with `/ask`
- 🔧 Manage models with `/model`
- 📋 Explore playbooks with `/playbooks`
- 🚀 Full backwards compatibility with the traditional CLI

Welcome to the future of SuperOptiX! 🎉