Run in WebGPU
#10889

Does anyone know if it's possible to run https://huggingface.co/nvidia/canary-1b via WebGPU? Possibly via https://github.com/xenova/transformers.js/ or https://github.com/mlc-ai/web-llm. I understand people have done it with Whisper (https://huggingface.co/spaces/Xenova/whisper-webgpu), but I wanted to try out Canary.
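For reference, the Whisper WebGPU demo follows the transformers.js v3 (`@huggingface/transformers`) pattern below. This is only a minimal sketch: the model id is an example with ONNX weights already on the Hub, and Canary itself is not among the architectures transformers.js implements, so an ONNX export alone would not plug into this pipeline.

```ts
// Minimal sketch: automatic-speech-recognition on the WebGPU backend with
// transformers.js v3. The model id is an example, not Canary.
import { pipeline } from '@huggingface/transformers';

// Build an ASR pipeline and request the WebGPU device.
const transcriber = await pipeline(
  'automatic-speech-recognition',
  'onnx-community/whisper-tiny.en', // example model with ONNX weights on the Hub
  { device: 'webgpu' }
);

// Transcribe a sample clip by URL; the pipeline fetches and decodes the audio.
const result = await transcriber(
  'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav'
);
console.log(result);
```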
I think if I export an ONNX version, I might be able to use it with transformers.js? https://docs.nvidia.com/nemo-framework/user-guide/latest/nemotoolkit/core/export.html
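If the NeMo export produces standalone ONNX graphs, another route (independent of transformers.js) would be onnxruntime-web, which ships a WebGPU execution provider. A rough sketch follows; the file name, tensor names, and shapes are hypothetical placeholders, and a real Canary setup would still need feature extraction, the decoder loop, and the tokenizer on top of this.

```ts
// Rough sketch of running an exported ONNX graph in the browser with
// onnxruntime-web's WebGPU execution provider. "canary_encoder.onnx" and the
// input names/shapes below are placeholders; match the real export's signature.
// Depending on the onnxruntime-web version, the WebGPU build may also be
// available from the default 'onnxruntime-web' entry point.
import * as ort from 'onnxruntime-web/webgpu';

async function runEncoder(features: Float32Array, frames: number) {
  // Load the exported graph, preferring WebGPU and falling back to WASM.
  const session = await ort.InferenceSession.create('canary_encoder.onnx', {
    executionProviders: ['webgpu', 'wasm'],
  });

  // features must contain 80 * frames values for the placeholder shape below.
  const input = new ort.Tensor('float32', features, [1, 80, frames]);
  const lengths = new ort.Tensor('int64', BigInt64Array.from([BigInt(frames)]), [1]);

  // Feed names are placeholders; they must match the exported graph's inputs.
  const outputs = await session.run({ audio_signal: input, length: lengths });
  return outputs; // map of output name -> ort.Tensor
}
```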

Replies: 1 comment

@pdufour did you get an answer for this?