A Streamlit application for comparing different approaches to using MCP (Model Context Protocol) tools with language models. This app allows you to test and compare:
- **Direct Ollama + MCP**: Using `OllamaClient` with MCP tools directly
- **Minions MCP**: Using the full `SyncMinionsMCP` framework with task decomposition
Before running the app, make sure you have:
- Ollama installed, with the `llama3.2:1b` model available: `ollama pull llama3.2:1b`
- MCP configuration (`mcp.json`) in the root directory with filesystem server setup
- An OpenAI API key configured for the remote client (`gpt-4o-mini`)
- Python dependencies installed (see `requirements.txt`)
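For reference, a minimal `mcp.json` for the reference MCP filesystem server might look like the sketch below. The `mcpServers` schema and the `npx` package name follow the standard MCP filesystem server convention, which this project is assumed to use; adjust the allowed directory argument (here `"."`) to your setup.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
```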
1. **Navigate to the app directory**: `cd apps/minions-tools`
2. **Install dependencies**: `pip install -r requirements.txt`
3. **Ensure MCP configuration exists**: Make sure you have an `mcp.json` file in the project root with filesystem server configuration.
4. **Start the Streamlit app**: `streamlit run app.py`
5. **Configure your comparison**:
   - Enter a task in the sidebar (filesystem-related tasks work best)
   - Select which methods you want to compare
   - Click "🚀 Run Comparison"
6. **View results**:
   - Each method's results appear in separate tabs
   - Performance metrics are displayed for each method
   - A summary table compares execution times
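The comparison loop behind the summary table can be sketched roughly as follows. `run_comparison` and the lambda stand-ins are hypothetical names for illustration; in the real app the callables would wrap `OllamaClient` and `SyncMinionsMCP` invocations.

```python
import time

def run_comparison(task, methods):
    """Run each selected method on the task and record wall-clock time.

    `methods` maps a display name to a callable taking the task string;
    the callables here are hypothetical stand-ins for the real clients.
    """
    results = {}
    for name, run in methods.items():
        start = time.perf_counter()
        output = run(task)                      # execute one method
        elapsed = time.perf_counter() - start   # seconds for the summary table
        results[name] = {"output": output, "seconds": elapsed}
    return results

# Usage with dummy methods standing in for the real clients:
summary = run_comparison(
    "List all Python files.",
    {"Direct Ollama + MCP": lambda t: f"direct: {t}",
     "Minions MCP": lambda t: f"minions: {t}"},
)
for name, r in summary.items():
    print(f"{name}: {r['seconds']:.3f}s")
```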
- Best for: Simple, direct tool calling scenarios
- Pros: Lightweight, fast, direct integration
- Cons: Limited coordination capabilities
- Best for: Tasks requiring coordination between models
- Pros: Balance of performance and capability
- Cons: More complex than direct approach
- Best for: Complex tasks requiring planning and decomposition
- Pros: Most sophisticated approach with multi-round processing
- Cons: Most resource-intensive, slower execution
The app uses several configuration options:
- **Local Model**: `llama3.2:1b` (configurable in the code)
- **Remote Model**: `gpt-4o-mini` (requires OpenAI API key)
- **MCP Server**: `filesystem` (configurable via `mcp.json`)
- **Max Rounds**: 3 (configurable per method)
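These defaults could be captured in a small settings object; the class and field names below are illustrative, not the app's actual code:

```python
from dataclasses import dataclass

@dataclass
class ComparisonConfig:
    """Defaults mirroring the options above (names are hypothetical)."""
    local_model: str = "llama3.2:1b"   # served by Ollama
    remote_model: str = "gpt-4o-mini"  # requires an OpenAI API key
    mcp_server: str = "filesystem"     # selected from mcp.json
    max_rounds: int = 3                # per-method round cap

cfg = ComparisonConfig()
print(cfg)
```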
Here are some example tasks that work well with the filesystem MCP server:
- **Directory Analysis**: "Can you show me the directory structure of the examples folder and summarize what you find?"
- **File Search**: "Find all Python files in the current directory and list their names."
- **Content Analysis**: "Read the contents of README.md and provide a summary of the main points."
- **"Failed to initialize clients"**
  - Check that Ollama is running and the model is available
  - Verify the MCP configuration exists and is valid
  - Ensure the OpenAI API key is properly configured
- **"Tool execution failed"**
  - Verify the MCP server is running
  - Check file paths and permissions
  - Review the MCP server logs
- **Import errors**
  - Ensure you're running from the correct directory
  - Check that the `minions` package is properly installed
  - Verify the Python path includes the project root
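Several of these checks can be scripted. The helper below is a hypothetical sketch (not part of the app) that reports common setup problems before you launch Streamlit; it assumes the key is read from the standard `OPENAI_API_KEY` environment variable.

```python
import json
import os
from pathlib import Path

def check_setup(project_root: str) -> list[str]:
    """Return human-readable problems with the local setup, if any."""
    problems = []
    mcp_path = Path(project_root) / "mcp.json"
    if not mcp_path.is_file():
        problems.append("mcp.json not found in project root")
    else:
        try:
            json.loads(mcp_path.read_text())  # must parse as JSON
        except json.JSONDecodeError:
            problems.append("mcp.json is not valid JSON")
    if not os.environ.get("OPENAI_API_KEY"):
        problems.append("OPENAI_API_KEY is not set")
    return problems

# Example: run against the current directory and print anything found.
for problem in check_setup("."):
    print("setup problem:", problem)
```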
- Start with simpler tasks to verify setup
- Use the Direct Ollama + MCP method for quick tests
- Enable only necessary methods for faster comparisons
- Monitor system resources when running multiple methods