Releases · kspviswa/pyOllaMx
v0.0.7
v0.0.6
v0.0.5
v0.0.4
New Functionality
- Now you can download Ollama models right within 🤌🏻 PyOllaMx's Model Hub tab. You can also inspect existing models 🧐 and delete models 🗑️ from PyOllaMx instead of using the Ollama CLI. This greatly simplifies the user experience 🤩🤩. And before you ask: yes, I'm working to bring similar functionality for MLX models from the Hugging Face Hub. Please stay tuned 😎
Bug Fixes
- Updated the DDGS (duckduckgo_search) dependency to fix some of the rate-limit issues
v0.0.3
Dark Mode Support
Toggle between Dark & Light mode with a click of the icon
Model settings menu
Brand-new settings menu to set the model name and temperature, along with an Ollama/MLX model toggle
Streaming support
Streaming support for both chat & search tasks
Brand New Status bar
Status bar that displays the selected model name, model type & model temperature
Web search enabled for Apple MLX models
Now you can use Apple MLX models to power the web search when choosing the search tab
v0.0.2
- Web search capability (powered by DuckDuckGo search engine via https://github.com/deedy5/duckduckgo_search)
a. Web search powered via basic RAG using prompt engineering. More advanced techniques are in the pipeline
b. Search response will cite clickable sources for easy follow-up / deep dive
c. Beneath every search response, search keywords are also shown to verify the search scope
d. Easy toggle between chat and search operations
- Clear / erase chat history
- Automatic scroll on chat messages for better user experience
- Basic error & exception handling for searches
Limitations:
- Web search is only enabled for Ollama models. Use the dolphin-mistral:7b model for better results. MLX model support is planned for the next release
- Search results aren't deterministic and vary widely across the chosen models, so experiment with different models to find your optimum
- Sometimes search results are gibberish. This is because the search-engine RAG is vanilla, i.e. done via basic prompt engineering without any library support. If the results aren't satisfactory, re-trigger the same search prompt and check the response again.
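The vanilla prompt-engineering RAG described above boils down to stuffing search snippets into the prompt and asking the model to cite them. A minimal sketch (the function name is hypothetical and the results are hardcoded in the shape duckduckgo_search returns; this is not PyOllaMx's actual code):

```python
def build_rag_prompt(query: str, results: list[dict]) -> str:
    """Stuff web-search snippets into the prompt and ask the model to cite sources."""
    context = "\n".join(
        f"[{i + 1}] {r['title']} ({r['href']}): {r['body']}"
        for i, r in enumerate(results)
    )
    return (
        "Answer the question using only the web results below. "
        "Cite sources as [n].\n\n"
        f"Web results:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

# Hardcoded example shaped like duckduckgo_search output
# (each result dict has "title", "href" and "body" keys):
results = [
    {"title": "MLX", "href": "https://github.com/ml-explore/mlx",
     "body": "MLX is an array framework for machine learning on Apple silicon."},
]
prompt = build_rag_prompt("What is Apple MLX?", results)
```

Because nothing constrains the model beyond this prompt, output quality varies by model, which is why re-running the same search can give a much better answer.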
v0.0.1
v0.0.1 Features
- Auto-discover Ollama & MLX models. Simply download the models as you do with the respective tools and PyOllaMx will discover and use them seamlessly
- Markdown support on chat messages for programming code
- Selectable Text
- Temperature control
- Basic error handling
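For the Ollama side, auto-discovery of installed models is typically done by querying the local server's `/api/tags` endpoint. A minimal sketch (assuming the default server address; not PyOllaMx's exact code):

```python
import json
import urllib.request

def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response payload."""
    return [m["name"] for m in payload.get("models", [])]

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Query the local Ollama server for its installed models."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

With this, a chat UI can populate its model picker automatically whenever a new model is pulled, with no extra configuration from the user.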