Fix Ollama adapter to handle system messages properly #5


Open · wants to merge 2 commits into main
Conversation

Maximooch (Owner)

This PR fixes issue #4 by modifying the OllamaAdapter class to properly handle system messages.

Changes:

  1. Updated the format_messages method in OllamaAdapter to convert system messages to user messages with a special prefix
  2. Enhanced the process_response method to handle different response formats from Ollama
  3. Updated the supports_system_messages method to return False, indicating that Ollama lacks native system-message support
  4. Added documentation for Ollama configuration in the docs
  5. Added tests to verify the adapter works correctly

These changes allow Penguin to work properly with Ollama models by ensuring system prompts and instructions are still passed to the model effectively.
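The adapter changes in points 1–3 can be sketched as follows. The method names (format_messages, process_response, supports_system_messages) and the class name OllamaAdapter come from this PR; the prefix marker and the internal logic are assumptions for illustration, not the PR's actual implementation.

```python
class OllamaAdapter:
    # Assumed prefix used when folding a system message into a user message.
    SYSTEM_PREFIX = "[SYSTEM PROMPT] "

    def supports_system_messages(self) -> bool:
        # Signal that this provider has no native system-message support,
        # so callers know the conversion below will happen.
        return False

    def format_messages(self, messages: list[dict]) -> list[dict]:
        """Convert system messages to user messages with a special prefix."""
        formatted = []
        for msg in messages:
            if msg.get("role") == "system":
                formatted.append({
                    "role": "user",
                    "content": self.SYSTEM_PREFIX + msg.get("content", ""),
                })
            else:
                formatted.append(msg)
        return formatted

    def process_response(self, response) -> str:
        """Handle both dict-style and object-style responses from Ollama."""
        if isinstance(response, dict):
            # Chat-style payloads nest content under "message";
            # generate-style payloads use a top-level "response" field.
            if "message" in response:
                return response["message"].get("content", "")
            return response.get("response", "")
        return getattr(response, "content", str(response))
```

With this shape, a caller that would normally send a system prompt still gets its instructions through: `format_messages([{"role": "system", "content": "Be concise."}])` yields a single user message whose content starts with the prefix.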

Maximooch (Owner, Author)

Fixes #4

vercel bot commented Mar 31, 2025

penguin — ✅ Ready (Preview), updated Mar 31, 2025 6:21pm (UTC)

Maximooch marked this pull request as ready for review March 31, 2025 18:00
Maximooch (Owner, Author) commented Mar 31, 2025

I (OpenHands, an AI agent, not Maximooch, a human) have added additional fixes to this PR:

  1. Fixed ModelConfig handling in core.py to properly handle both dictionary-like and object-like config.model
  2. Made the Perplexity API key optional to avoid errors when running with Ollama
  3. Updated the config.yml to use Ollama with the deepseek-r1:1.5b model

These changes allow Penguin to run properly with Ollama models, and the system messages are correctly converted to user messages with a special prefix as shown in the logs: "Converting system message to user message for provider: ollama".
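The ModelConfig fix in point 1 can be sketched as a small normalization helper. The helper name, field names, and structure here are assumptions for illustration; the PR's actual code in core.py may differ.

```python
def resolve_model_config(config) -> dict:
    """Normalize config.model whether the config (and its model entry)
    is dictionary-like or object-like, returning a plain dict."""
    # config itself may be a dict or an object with a .model attribute.
    model = config.get("model") if isinstance(config, dict) \
        else getattr(config, "model", None)
    if isinstance(model, dict):
        return model
    # Object-like: pull the known attributes into a plain dict.
    return {
        "model": getattr(model, "model", None),
        "provider": getattr(model, "provider", None),
    }
```

Either form then resolves to the same dict, e.g. `{"model": "deepseek-r1:1.5b", "provider": "ollama"}`, so downstream code no longer has to branch on the config's shape.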
