Running Examples with Ollama
This tutorial shows you how to run all the example scripts from the DSPy Code repository using Ollama for local execution. All examples are configured to work without API keys, making them perfect for learning and experimentation.
Prerequisites
Before running the examples, you'll need:
- Python 3.10+ installed
- Ollama installed and running locally
- DSPy Code repository cloned or installed
- Required Python packages
Step 1: Install Ollama
If you haven't installed Ollama yet:
macOS/Linux:
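On macOS and Linux, Ollama ships an official install script (URL current as of this writing; see ollama.ai if it has moved):

```shell
# Download and run the official Ollama install script
curl -fsSL https://ollama.com/install.sh | sh
```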
Windows: Download from ollama.ai
Step 2: Start Ollama and Pull the Model
Start the Ollama service:
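```shell
# Start the Ollama server; leave this running in its own terminal
ollama serve
```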
In a new terminal, pull the model used by the examples:
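```shell
# Download the model used by the examples (~4.7 GB)
ollama pull llama3.1:8b

# Confirm the model is available
ollama list
```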
This will download the llama3.1:8b model (approximately 4.7GB). The examples use this model for all LLM operations.
Step 3: Clone the Repository
If you haven't already, clone the DSPy Code repository:
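The repository URL below is a placeholder; substitute the actual location of the DSPy Code repository:

```shell
# Clone the DSPy Code repository (replace the URL with the real one)
git clone https://github.com/<your-org>/dspy-code.git
cd dspy-code
```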
Step 4: Install Dependencies
Install the required Python packages:
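Assuming the repository ships a standard requirements file:

```shell
# Install the Python dependencies; at minimum the examples need `dspy`
pip install -r requirements.txt
```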
Or if you want to install from source:
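```shell
# From the cloned repository root, install the package in editable mode
pip install -e .
```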
Available Examples
The examples/ directory contains several ready-to-run scripts:
1. Complete Workflow Example
File: examples/complete_workflow_example.py
What it demonstrates:
- Creating a DSPy Signature for sentiment analysis
- Building a Module with Chain of Thought reasoning
- Preparing training examples
- Optimizing with GEPA (Genetic Pareto)
- Evaluating the optimized module
How to run:
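From the repository root:

```shell
python examples/complete_workflow_example.py
```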
Expected output:
- Model configuration
- Module creation
- Pre-optimization test
- GEPA optimization progress (takes 3-10 minutes)
- Post-optimization validation results
Key features:
- Uses `ollama/llama3.1:8b` for both the main LM and the reflection LM
- Demonstrates the full DSPy workflow from signature to optimization
- Shows GEPA optimization with a budget of 30 metric calls
- Includes validation evaluation
2. Email Classifier Demo
File: examples/email_classifier_demo.py
What it demonstrates:
- Email classification by priority (urgent, normal, low)
- Multi-field signature (subject, body, sender)
- Chain of Thought reasoning for classification
- Training data preparation
- Module optimization
How to run:
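From the repository root:

```shell
python examples/email_classifier_demo.py
```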
Expected output:
- Email classification examples
- Module predictions before optimization
- Training data preparation
- Optimization process
- Improved accuracy after optimization
Key features:
- Realistic use case (email triage)
- Multi-input signature design
- Priority classification with reasoning
3. MCP Filesystem Assistant
File: examples/mcp_filesystem_assistant.py
What it demonstrates:
- Integrating the MCP (Model Context Protocol) filesystem server
- Reading project files via MCP
- Using DSPy modules with MCP tools
- File summarization workflow
Prerequisites:
- Node.js installed (for the MCP server)
- MCP filesystem server configured
How to run:
First, ensure you have Node.js installed:
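```shell
# Check that Node.js and npx are on your PATH
node --version
npx --version
```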
Then run the example:
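```shell
python examples/mcp_filesystem_assistant.py
```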
Configuration: The script expects a dspy_config.yaml with MCP filesystem server configuration. See examples/mcp_config_examples.yaml for reference.
Expected output:
- MCP server connection
- File reading via MCP tools
- File content summarization
- Project file analysis
Note: This example is experimental. For a more stable MCP example, see the GitHub triage assistant below.
4. MCP GitHub Triage Assistant
File: examples/mcp_github_triage_assistant.py
What it demonstrates:
- MCP GitHub server integration
- Fetching GitHub issues and pull requests
- Summarizing and triaging issues with DSPy
- Using MCP tools for external API access
Prerequisites:
- GitHub personal access token (optional, for private repos)
- Node.js installed
- MCP GitHub server configured
How to run:
1. Create or edit `dspy_config.yaml` in your project root.
2. If using a GitHub token, set it as an environment variable.
3. Run the example:
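A sketch of the token setup and launch. The official MCP GitHub server reads `GITHUB_PERSONAL_ACCESS_TOKEN`; adjust the variable name if your `dspy_config.yaml` expects something different:

```shell
# Optional: needed only for private repositories or higher rate limits
export GITHUB_PERSONAL_ACCESS_TOKEN="<your-token>"

python examples/mcp_github_triage_assistant.py
```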
Expected output:
- MCP GitHub server connection
- Issue/PR fetching
- Summarization and triage recommendations
- Priority classification
Key features:
- Stable MCP integration example
- Real-world use case (GitHub issue triage)
- External API integration via MCP
Common Issues and Solutions
Issue: "Connection refused" or "Failed to connect to Ollama"
Solution:
1. Ensure Ollama is running: `ollama serve`
2. Check that the model is pulled: `ollama list`
3. Verify the API base URL in the script matches your Ollama setup (default: `http://localhost:11434`)
Issue: "Model not found" or "Model llama3.1:8b not found"
Solution:
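```shell
# Pull the exact tag the examples expect
ollama pull llama3.1:8b

# Confirm it appears in the local model list
ollama list
```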
Issue: "Module 'dspy' not found"
Solution:
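Assuming DSPy is published on PyPI under the package name `dspy`, install it into the active environment:

```shell
pip install dspy
python -c "import dspy; print(dspy.__version__)"
```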
Issue: MCP examples fail with "Connection failed"
Solution:
1. Ensure Node.js is installed: `node --version`
2. Check that `dspy_config.yaml` exists and is correctly configured
3. Verify the MCP server package names are correct:
   - Filesystem: `@modelcontextprotocol/server-filesystem`
   - GitHub: `@modelcontextprotocol/server-github`
4. Try running the MCP server command manually to test:
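For example, the filesystem server can be launched directly with `npx` against a test directory (substitute your own path):

```shell
# Starts the MCP filesystem server rooted at the given directory;
# it should start up and wait for a client on stdio
npx -y @modelcontextprotocol/server-filesystem /path/to/project
```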
Issue: GEPA optimization takes too long
Solution:
- The examples use `max_metric_calls=30` for reasonable demo times
- For faster testing, reduce this value in the script
- For production, increase it to 100+ for better optimization
Issue: "Out of memory" errors
Solution:
- The llama3.1:8b model requires ~8GB of RAM
- Close other applications
- Consider a smaller model, e.g. `ollama pull llama3.2:3b` (llama3.1 has no 3b variant)
Customizing Examples
All examples use the same Ollama configuration. To customize:
1. Change the model: use any tag you have pulled locally.
2. Change the API base: if Ollama is running on a different host or port.
3. Adjust the GEPA budget: in optimization examples, modify `max_metric_calls`.
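The three customizations above can be sketched as follows. This is a minimal outline, not the exact code in the example scripts; `my_metric` is a placeholder for whatever metric the script defines:

```python
import dspy

# 1. Change the model: any tag you have pulled with `ollama pull`
# 2. Change the API base if Ollama runs on a different host/port
lm = dspy.LM("ollama/llama3.1:8b", api_base="http://localhost:11434")
dspy.configure(lm=lm)

def my_metric(example, prediction, trace=None):
    # Placeholder metric: exact match on a `sentiment` output field
    return example.sentiment == prediction.sentiment

# 3. Adjust the GEPA budget: more metric calls means longer runs
# but usually better optimization results
optimizer = dspy.GEPA(metric=my_metric, max_metric_calls=30)
```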
Next Steps
After running the examples:
- Modify the examples to suit your use case
- Create your own signatures and modules
- Build your own training data using the patterns shown
- Explore the interactive CLI to build programs interactively:
- Read the tutorials for deeper understanding:
- Build a Sentiment Analyzer
- Create a RAG System
- Optimize with GEPA
Example File Reference
| File | Purpose | Runtime | Complexity |
|---|---|---|---|
| `complete_workflow_example.py` | Full DSPy workflow with GEPA | 5-15 min | Intermediate |
| `email_classifier_demo.py` | Email classification use case | 5-10 min | Intermediate |
| `mcp_filesystem_assistant.py` | MCP filesystem integration | 1-2 min | Advanced |
| `mcp_github_triage_assistant.py` | MCP GitHub integration | 2-5 min | Advanced |
| `mcp_config_examples.yaml` | MCP server configurations | Reference | - |
Additional Resources
đĄ Tip: Start with complete_workflow_example.py to understand the full DSPy workflow, then explore the other examples based on your interests!