Offline LLMs
Tool for running an LLM server
Answered by wassfila, Sep 8, 2024
Replies: 1 comment
Ollama (https://ollama.com/) is becoming a de facto standard for managing and running offline LLM models.
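For example, after pulling a model with `ollama pull llama3`, the local server can be queried over its HTTP API. Below is a minimal Python sketch, assuming Ollama is running on its default port 11434 and that `llama3` is the model you pulled (both are assumptions; adjust to your setup):

```python
# Minimal sketch: query a local Ollama server over its HTTP API.
# Assumes Ollama is running on the default port 11434 and that the
# "llama3" model has already been pulled with `ollama pull llama3`.
import json
import urllib.request


def generate(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming response carries the generated text in "response".
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Explain what an offline LLM is in one sentence."))
```

The same endpoint works from any language or from the `ollama run` CLI; the HTTP API is just convenient when you want to treat the local model like a service.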
Answer selected by wassfila