Which LLMs can run offline #2

Answered by wassfila
wassfila asked this question in Q&A

Ollama (https://ollama.com/) is becoming a de facto standard for managing and running LLM models offline.
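For context, Ollama exposes the locally downloaded models over an HTTP API on port 11434 (its default), so any program on the machine can query them without an internet connection. Below is a minimal sketch of calling the `/api/generate` endpoint with only the Python standard library; the model name `llama3` is just an example of a model you would have pulled beforehand with `ollama pull`.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local address

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    # Construct the POST request for Ollama's /api/generate endpoint.
    # "stream": False asks for one complete JSON reply instead of chunks.
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    # Send the prompt to a locally running Ollama server and return the reply.
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama serve` running and the model already pulled, e.g.:
    #   ollama pull llama3
    print(generate("llama3", "Why is the sky blue?"))
```

The same endpoint also supports streaming responses (one JSON object per line) by setting `"stream": True`, which is what the `ollama run` CLI uses under the hood.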

Replies: 1 comment


wassfila
Sep 8, 2024
Maintainer Author

Answer selected by wassfila