
Where is the TurboSparse-Mixtral mlp_predictor? #203

Open
MatthewCroughan opened this issue Jun 27, 2024 · 1 comment
Labels
question Further information is requested

Comments

@MatthewCroughan
Contributor

Predictors for all other models are provided, but not for this one: https://huggingface.co/PowerInfer?search_models=predictor

I want to convert TurboSparse-Mixtral into a quantized GGUF to reproduce the claims made about usage on smartphones.

@MatthewCroughan added the question label Jun 27, 2024
@yichen0104

Similar situation here. Since TurboSparse is basically Bamboo with an extra MLP acting as the predictor inside BambooMLP, I suspect we need standalone pretrained predictor weights so that they can be combined with the TurboSparse model.
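If the predictor really is an embedded submodule, one way to obtain standalone predictor weights would be to split them out of the combined checkpoint by filtering state_dict keys. This is only a sketch under that assumption: the submodule name ("predictor") and the toy module layout below are hypothetical, not the actual TurboSparse/BambooMLP structure.

```python
import torch
import torch.nn as nn

class ToyMLP(nn.Module):
    """Stand-in for an MLP block that embeds a small predictor MLP.
    The 'predictor' attribute name is an assumption for illustration."""
    def __init__(self, dim=8, pred_dim=4):
        super().__init__()
        self.up = nn.Linear(dim, dim)
        self.down = nn.Linear(dim, dim)
        # Hypothetical embedded predictor, analogous to the extra MLP
        # described above for BambooMLP
        self.predictor = nn.Sequential(
            nn.Linear(dim, pred_dim),
            nn.Linear(pred_dim, dim),
        )

model = ToyMLP()
full_sd = model.state_dict()

# Split predictor weights from the base model weights by key prefix,
# so each part could be saved and distributed separately
predictor_sd = {k: v for k, v in full_sd.items() if "predictor" in k}
base_sd = {k: v for k, v in full_sd.items() if "predictor" not in k}

torch.save(predictor_sd, "predictor_only.pt")
torch.save(base_sd, "base_only.pt")
```

Recombining would then just be `model.load_state_dict({**base_sd, **predictor_sd})`; whether PowerInfer's GGUF conversion accepts predictor weights in this form is a separate question.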
