# Conversational Interface

## 🎯 Overview

SuperOptiX features a conversational interface that makes it easy to interact with the framework through natural language and slash commands.

Just type:

```bash
super
```

That's it! No need for `super chat` or other subcommands.
## 🚀 Getting Started

### First Run

The first time you run `super`, you'll go through a quick setup:

```bash
$ super

👋 Welcome to SuperOptiX!
First-time setup - this will take about 30 seconds.

Step 1/2: Choose AI Model Provider

  1. 🏠 Ollama (Local - FREE, Private, Offline)
  2. ☁️ OpenAI (Cloud - Paid)
  3. ☁️ Anthropic (Cloud - Paid)

Choose [1-3]: 1
```

We recommend Ollama for:

- Privacy (data stays local)
- No API costs (free)
- Offline capability
### After Setup

Once configured, just type `super` to enter conversational mode:

```bash
$ super

Welcome to SuperOptiX!
Using: ollama (llama3.1:8b)

Type /help for slash commands or just tell me what to do.

SuperOptiX> _
```
## 🎮 Slash Commands

Slash commands provide quick access to SuperOptiX features:

### Configuration & Models

```bash
/model               # Show current model
/model list          # List all available models
/model set <model>   # Switch model
/config              # Show configuration
/config show         # Show all settings
/config set <k> <v>  # Set configuration value
```

### Help & Documentation

```bash
/help                # Show all commands
/ask <question>      # Ask about SuperOptiX
/help <topic>        # Topic-specific help
/docs <topic>        # Open documentation
/examples            # Show example workflows
```

### Project & Agents

```bash
/status              # Show project status
/agents              # List all agents
/playbooks           # List all playbooks
/templates           # Show available templates
```

### Conversation

```bash
/clear               # Clear screen
/history             # Show conversation history (coming soon)
/exit, /quit         # Exit conversational mode
```
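To illustrate how a slash-command REPL of this shape typically routes input, here is a generic toy dispatcher. This is a sketch, not SuperOptiX's actual implementation; the handler names and return values are invented for illustration.

```python
def dispatch(line, commands):
    """Route one line of REPL input: '/cmd args' -> handler, else -> LLM."""
    line = line.strip()
    if not line.startswith("/"):
        return ("llm", line)                     # natural-language input falls through
    name, _, args = line[1:].partition(" ")      # split '/model set x' -> 'model', 'set x'
    handler = commands.get(name)
    if handler is None:
        return ("error", f"Unknown command: /{name}")
    return ("ok", handler(args))

# Two hypothetical handlers for illustration only
commands = {
    "help": lambda args: "commands: /help /model /exit",
    "model": lambda args: f"switched to {args}" if args else "llama3.1:8b",
}

print(dispatch("/model", commands))              # ('ok', 'llama3.1:8b')
print(dispatch("/model qwen2.5:14b", commands))  # ('ok', 'switched to qwen2.5:14b')
print(dispatch("hello there", commands))         # ('llm', 'hello there')
```

The key design point the sketch captures: a leading `/` selects a deterministic command path, while everything else is free-form input for the model.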
## 💬 Example Session

```bash
$ super

Welcome to SuperOptiX!
Using: ollama (llama3.1:8b)

SuperOptiX> /help
[Shows all slash commands]

SuperOptiX> /model list

Available AI Models

🏠 LOCAL MODELS (via Ollama):
  llama3.1:8b (current)
  qwen2.5:14b

☁️ CLOUD MODELS:
  OpenAI: gpt-4o, gpt-4o-mini
  Anthropic: claude-3.5-sonnet

SuperOptiX> /ask How do I add memory?

💡 How do I add memory to my agent?

To add memory to your agent, update your playbook's spec:

  spec:
    memory:
      enabled: true
      enable_context_optimization: true
      max_context_tokens: 2000

SuperOptiX> /agents

Agents

Found 2 agent(s):
  • code_reviewer      Compiled
  • customer_support   ⚠️ Not compiled

SuperOptiX> /playbooks

Available Playbooks

📦 Library Templates (5):
  • genie_playbook [memory, tools, rag] - General-purpose intelligent agent...
  • security_agent_playbook [tools, rag] - Code security review agent...

📁 Your Project (2):
  • code_reviewer_playbook [memory, tools]
  • customer_support_playbook [memory, rag]

SuperOptiX> /exit

👋 Goodbye! Happy building with SuperOptiX!
```
---
## 🔧 Model Management
### Viewing Models
```bash
SuperOptiX> /model

Current Model Configuration
  • Provider: ollama
  • Model: llama3.1:8b
  • API Base: http://localhost:11434
  • Status: Connected
```

### Switching Models

```bash
# Switch to a different Ollama model
SuperOptiX> /model set qwen2.5:14b
Switched to: qwen2.5:14b

# Switch to OpenAI (requires API key)
SuperOptiX> /model set gpt-4o

⚠️ OPENAI_API_KEY not set
Set it with: /config set OPENAI_API_KEY sk-...
```
### Listing All Models

```bash
SuperOptiX> /model list

Available AI Models

🏠 LOCAL MODELS (via Ollama):

  Installed:
    llama3.1:8b (current)
    qwen2.5:14b

  Available to install:
    • deepseek-coder:33b (19GB) - Best for coding
    • mistral:7b (4.1GB)        - Fast alternative

  Install: ollama pull <model>

─────────────────────────────────────

☁️ CLOUD MODELS:

  OpenAI (Requires OPENAI_API_KEY):
    • gpt-4o      - Best overall
    • gpt-4o-mini - Fast and affordable

  Anthropic (Requires ANTHROPIC_API_KEY):
    • claude-3.5-sonnet - Best for coding
    • claude-3.5-haiku  - Fast and affordable

  Set API key: /config set OPENAI_API_KEY sk-...
```
## 📚 Learning SuperOptiX

### Ask Questions

Use `/ask` to learn about SuperOptiX features:

```bash
SuperOptiX> /ask How do I add memory?
SuperOptiX> /ask What is GEPA?
SuperOptiX> /ask How do I add RAG?
SuperOptiX> /ask What is SuperSpec?
```

### View Examples

```bash
SuperOptiX> /examples

Example Workflows

1. Build and Optimize Agent:
   super spec generate genie code_reviewer
   super agent compile code_reviewer
   super agent optimize code_reviewer --auto medium

2. Quick Agent from Template:
   super agent pull developer
   super agent compile developer
   super agent run developer --goal "Build a CLI tool"
```
## 🔄 Backwards Compatibility

Traditional CLI commands still work:

```bash
# Traditional commands (no conversational mode)
$ super agent compile code_reviewer
$ super agent optimize code_reviewer --auto medium
$ super agent evaluate code_reviewer

# These bypass conversational mode and run directly
```

When to use each:

- **Conversational mode** (`super`): interactive exploration, learning, quick tasks
- **Traditional CLI** (`super agent ...`): scripts, CI/CD, automation
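Because the traditional CLI runs non-interactively, it drops straight into CI pipelines. The workflow below is a hypothetical sketch: the job names, trigger, and the `pip install superoptix` step are assumptions for illustration, not documented setup; only the `super agent` commands come from this guide.

```yaml
# .github/workflows/agents.yml - hypothetical CI job (names and install step assumed)
name: validate-agents
on: [push]

jobs:
  evaluate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install SuperOptiX
        run: pip install superoptix   # package name is an assumption
      - name: Compile and evaluate agents
        run: |
          super agent compile code_reviewer
          super agent evaluate code_reviewer
```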
## 🔒 Configuration & Privacy

### Local Mode (Default)

- No authentication required
- All data stays on your machine
- Privacy-first

### Configuration Storage

```
~/.superoptix/
├── config.yaml       # Your model choice and settings
├── credentials.yaml  # API keys (encrypted)
└── history/          # Conversation history
```
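For illustration, `config.yaml` might hold the settings shown elsewhere in this guide. The exact keys below are assumptions based on the values `/model` and `/config` display (provider, model, API base), not a documented schema:

```yaml
# ~/.superoptix/config.yaml - hypothetical contents (key names assumed)
provider: ollama
model: llama3.1:8b
api_base: http://localhost:11434
```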
### Changing Configuration

```bash
# View configuration
SuperOptiX> /config

# View detailed settings
SuperOptiX> /config show

# Set API keys
SuperOptiX> /config set OPENAI_API_KEY sk-...

# Reset configuration
SuperOptiX> /config reset
```
## 💡 Tips & Tricks

### 1. Use `/ask` for Quick Help

```bash
SuperOptiX> /ask memory
SuperOptiX> /ask RAG
SuperOptiX> /ask GEPA
```

### 2. Discover Available Playbooks

```bash
SuperOptiX> /playbooks
# Shows all library templates + your project playbooks
```

### 3. Check Project Status

```bash
SuperOptiX> /status
# Quick overview of your project
```

### 4. Clear the Screen

```bash
SuperOptiX> /clear
# Clears the screen
```

### 5. Traditional CLI Still Works

```bash
# Exit conversational mode
SuperOptiX> /exit

# Run a traditional command
$ super agent compile code_reviewer

# Re-enter conversational mode
$ super
```
## 🔜 Coming Soon

Natural language mode:

```bash
SuperOptiX> Build a code review agent
SuperOptiX> Optimize my customer support agent
SuperOptiX> Show me optimization results
```

Currently in development! For now, use:

- Slash commands in conversational mode
- Traditional CLI for full functionality
## 🔍 Troubleshooting

### "Ollama not running"

Solution:

1. Install Ollama: https://ollama.com
2. Run: `ollama serve`
3. Install a model: `ollama pull llama3.1:8b`

### "Not in a SuperOptiX project"

Solution:

```bash
# Exit conversational mode
SuperOptiX> /exit

# Initialize a project
$ super init my_project
$ cd my_project

# Re-enter conversational mode
$ super
```

### "No agents found"

Solution:

```bash
SuperOptiX> /exit

# Create an agent
$ super spec generate genie code_reviewer

# Re-enter and check
$ super
SuperOptiX> /agents
```
## 📚 Learn More

- Full documentation: https://superoptix.ai
- Getting started: run `super docs` for a comprehensive guide
- GitHub: https://github.com/SuperagenticAI/superoptix

## ✨ Summary

Just type `super` - that's it!

- 🎮 Slash commands for quick access
- 💬 Ask questions with `/ask`
- 🔧 Manage models with `/model`
- 📚 Explore playbooks with `/playbooks`
- 🔄 Full backwards compatibility with the traditional CLI

Welcome to the future of SuperOptiX! 🚀