Create symbolic links for the GGUF models in Ollama's blob store, for use in other applications such as Llama.cpp, Jan, LM Studio, etc.

Onllama.GGUFLinkOut

Simply run it as an administrator or root user, and symbolic links for the GGUF models will be created under the OllamaGGUFs folder in the current working directory.

That's it, nothing more.

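As a rough illustration of the idea (a minimal Python sketch, not this project's code): walk Ollama's manifest store, find each model's GGUF weights layer, and symlink the matching blob into OllamaGGUFs. It assumes Ollama's usual on-disk layout under ~/.ollama/models, and the link-naming scheme below is arbitrary; adjust both for your setup.

import json
import os
from pathlib import Path

models = Path.home() / ".ollama" / "models"   # default Ollama model store (assumption)
out = Path("OllamaGGUFs")
out.mkdir(exist_ok=True)

for manifest in (models / "manifests").rglob("*"):
    if not manifest.is_file():
        continue
    layers = json.loads(manifest.read_text()).get("layers", [])
    for layer in layers:
        # the GGUF weights layer carries this media type in Ollama manifests
        if layer.get("mediaType") != "application/vnd.ollama.image.model":
            continue
        # digest "sha256:<hex>" maps to blob file "sha256-<hex>"
        blob = models / "blobs" / layer["digest"].replace(":", "-")
        # name the link after the model and tag taken from the manifest path
        link = out / f"{manifest.parent.name}-{manifest.name}.gguf"
        if not link.exists():
            # creating symlinks needs elevated rights on Windows and may need root elsewhere
            os.symlink(blob, link)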

Usage: Onllama.GGUFsLinkOut [options]

Options:
  -?|-h|--help       Show help information.
  -m|--model <path>  Set ollama model path / Ollama 模型文件路径。
  -g|--ggufs <path>  Set GGUFs link output path / GGUF 文件链接输出路径。
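
For example, to read models from a non-default Ollama store and write the links somewhere else (the paths below are illustrative), run from an elevated prompt:

  Onllama.GGUFsLinkOut -m D:\Ollama\models -g D:\OllamaGGUFs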
