
Starred repositories
Simple and lightweight asynchronous HTTP server for MicroPython
Fully async MicroPython web server with a small memory footprint.
User-friendly AI interface (supports Ollama, OpenAI API, ...)
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1, and other large language models.
A multi-platform desktop application to evaluate and compare LLM models, written in Rust and React.
🐝 Awesome R and Python packages offering extended UI or server components for the Shiny web framework
Repository with everything needed for sentiment analysis and related areas