
Cease support for llama.cpp-served Gemma #4

Open
AstraBert opened this issue Jan 3, 2025 · 0 comments

@AstraBert
Owner

References #2, but also the inefficiency of the current solution.

Explore new local serving methods, such as quantization (non-dockerizable) and the llama.cpp Python package.
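As a starting point for the exploration above, here is a minimal sketch of serving a quantized Gemma GGUF locally through the llama.cpp Python bindings (the `llama-cpp-python` package) instead of a dockerized llama.cpp server. The model filename and quantization level are assumptions for illustration, not part of this repo.

```python
# Hedged sketch: local Gemma serving via llama-cpp-python.
# The GGUF path/quantization below are hypothetical examples.

def gemma_llm_config(model_path: str, n_ctx: int = 2048) -> dict:
    """Build keyword arguments for llama_cpp.Llama (pure config, no model load)."""
    return {
        "model_path": model_path,  # e.g. a Q4_K_M-quantized Gemma GGUF
        "n_ctx": n_ctx,            # context window size
        "n_gpu_layers": 0,         # CPU-only; raise if a GPU is available
        "verbose": False,
    }

if __name__ == "__main__":
    cfg = gemma_llm_config("gemma-2b-it.Q4_K_M.gguf")
    try:
        from llama_cpp import Llama  # pip install llama-cpp-python
        llm = Llama(**cfg)
        out = llm("Q: Name one planet. A:", max_tokens=8)
        print(out["choices"][0]["text"])
    except ImportError:
        # Bindings not installed: just show the intended configuration.
        print("llama-cpp-python not installed; config only:", cfg)
```

The same package can also expose an OpenAI-compatible HTTP endpoint (`python -m llama_cpp.server --model <gguf>`), which may be the simpler replacement for the current llama.cpp-served setup.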

@AstraBert added the "enhancement" (New feature or request) and "licensing issue" labels on Jan 3, 2025
@AstraBert added this to the January 2025 milestone on Jan 3, 2025
@AstraBert self-assigned this on Jan 3, 2025