Replies: 1 comment
-
I only have one GPU, so I am not able to test or support multi-GPU setups.
-
Is it possible to utilize the VRAM from multiple GPUs at once, rather than selecting which one to use individually?
I recently added a 2nd GPU (48 GB total) and keep getting out-of-memory errors, even though the model worked fine on the single card.
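For reference: adding a second card does not pool VRAM by itself; the model is still loaded onto one device unless the application explicitly shards it across GPUs, so the out-of-memory behavior is expected. A minimal sketch of what sharding looks like outside this project, assuming a PyTorch/Hugging Face Transformers stack with the `accelerate` package installed (the model name below is a placeholder, not anything this project ships):

```python
# Sketch: shard one model across all visible GPUs so their VRAM is used together.
# Requires `pip install transformers accelerate`; the model name is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # placeholder; substitute your own model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",         # let accelerate place layers on GPU 0, GPU 1, ...
    torch_dtype=torch.float16, # half precision to reduce per-GPU VRAM pressure
)

# Inputs go to the device that holds the first model shard.
inputs = tokenizer("Hello, multi-GPU world", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Whether something like this is usable here depends on the project exposing a multi-GPU or device-map option; as noted in the reply above, that is not currently supported or tested.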