RAGEmini, a play on RAGE + Gemini, is the MVP
The MVP connects to Ollama, auto-detecting a running Ollama server on localhost
The MVP also connects to the Gemini API, with experimental models available via a toggle
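The localhost detection and model toggle above could be sketched as follows. This is a minimal sketch, not the project's actual code: the Ollama port is its documented default, but `pick_backend` and both Gemini model ids are assumptions for illustration (experimental ids change over time).

```python
import urllib.request

OLLAMA_URL = "http://localhost:11434"    # Ollama's default local endpoint
GEMINI_MODEL = "gemini-1.5-pro"          # assumed default model id
GEMINI_EXPERIMENTAL = "gemini-exp-1206"  # hypothetical experimental model id

def ollama_running(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if a local Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, DNS failure, etc.
        return False

def pick_backend(experimental: bool = False) -> str:
    """Prefer a running local Ollama instance; otherwise fall back to Gemini."""
    if ollama_running():
        return "ollama"
    return GEMINI_EXPERIMENTAL if experimental else GEMINI_MODEL
```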
The RAGE folder contains the as-yet-unimplemented Retrieval Augmented Generative Engine
RAGEmini functions as a basic chat UI for responses from a localhost Ollama model or Gemini,
with memory.py and logger.py providing persistence and logging
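The session logging role of logger.py could look something like the sketch below. This is an assumption about its design, not the actual module; the `log_turn` helper is hypothetical, though the memory/sessions/ path matches the tree below.

```python
import json
import os
import time

SESSIONS_DIR = "memory/sessions"  # per the project tree below

def log_turn(session_id: str, role: str, text: str,
             base: str = SESSIONS_DIR) -> str:
    """Append one chat turn to a per-session JSONL file; return the file path."""
    os.makedirs(base, exist_ok=True)
    path = os.path.join(base, f"{session_id}.jsonl")
    with open(path, "a") as f:
        f.write(json.dumps({"ts": time.time(), "role": role, "text": text}) + "\n")
    return path
```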
RAGEmini/
├── src/
│ ├── memory.py
│ ├── logger.py
│ ├── openmind.py
│ └── locallama.py
├── gfx/
│ └── styles.css
├── memory/
│ ├── sessions/
│ ├── knowledge/
│ └── long_term_memory.json
└── rage.py
Retrieval Augmented Generative Engine
RAGE (Retrieval Augmented Generative Engine) is a dynamic engine designed to learn from context, ingested data, and memory over time.
By leveraging continuously updated data and learning from past interactions, RAGE can understand and respond to nuances in user queries. This makes it particularly effective in scenarios where context heavily influences the nature of the response.
As RAGE evolves, it becomes more adept at predicting user needs and adjusting its responses accordingly, ensuring high relevance and personalization.
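Learning from past interactions implies persisting them somewhere; a minimal sketch of such a store, assuming the long_term_memory.json file from the tree above holds a JSON list of entries (the `remember`/`load_memory` helpers are hypothetical, not the project's API):

```python
import json
import os
import time

MEMORY_PATH = "memory/long_term_memory.json"  # path from the project tree

def load_memory(path: str = MEMORY_PATH) -> list:
    """Load the long-term memory list, or an empty list if none exists yet."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return []

def remember(entry: dict, path: str = MEMORY_PATH) -> None:
    """Append a timestamped entry and write the whole list back to disk."""
    memory = load_memory(path)
    memory.append({"timestamp": time.time(), **entry})
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(memory, f, indent=2)
```

A read-modify-write of the full file keeps the sketch simple; a real engine would likely index entries for retrieval rather than reload the whole list per turn.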
Perform either the manual install or INSTALL, not both.
manual install
git clone https://github.com/GATERAGE/DeepSeekRAGE
python3.11 -m venv rage
source rage/bin/activate
pip install --no-cache-dir -r requirements.txt
streamlit run rage.py
INSTALL
source install.sh
streamlit run rage.py