forked from suno-ai/bark
Error using Swap Voice and Clone Voice on Apple Silicon M2 #77
I receive the following error when starting webui.py -enablemps on my Mac mini with an M2 CPU. This only happens with Swap Voice or Clone Voice; TTS works fine.

  File "/Users/jan/anaconda3/lib/python3.11/site-packages/torch/serialization.py", line 165, in validate_cuda_device
    raise RuntimeError('Attempting to deserialize object on a CUDA '
  RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

It does not seem to load the HuBERT model onto MPS but instead tries to use CUDA (which I do not have). Do I need to adjust the code somewhere? I thought setting the -enablemps switch would be enough.

Thank you very much for your support.

Comments

I solved this by replacing the line `model.load_state_dict(torch.load(path))` with `model.load_state_dict(torch.load(path, map_location='mps'))` in the file `customtokenizer.py` in the folder `/bark-gui/bark/hubert/`. Hope this helps others.

This worked for me on an M1 Pro 😉
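A more general variant of the fix above is to pick the `map_location` at runtime instead of hard-coding `'mps'`, so the same line works on CUDA, Apple Silicon, and CPU-only machines. This is a minimal sketch, not the actual bark-gui code; the tiny `nn.Linear` stand-in and the `/tmp` checkpoint path are illustrative assumptions (the real code loads the HuBERT tokenizer checkpoint).

```python
import torch
import torch.nn as nn

def best_device() -> str:
    """Pick the best available backend: CUDA, Apple-Silicon MPS, or CPU."""
    if torch.cuda.is_available():
        return "cuda"
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"
    return "cpu"

# Demo with a tiny stand-in model; bark-gui would load its HuBERT checkpoint here.
model = nn.Linear(4, 2)
torch.save(model.state_dict(), "/tmp/demo_ckpt.pt")

# map_location remaps CUDA-saved tensors onto whatever backend is present,
# which avoids the "Attempting to deserialize object on a CUDA device" error.
state_dict = torch.load("/tmp/demo_ckpt.pt", map_location=best_device())
model.load_state_dict(state_dict)
```

With this, checkpoints saved on a CUDA machine load cleanly on an M1/M2 Mac or a CPU-only box without editing the load call per platform.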