
can't make it work on windows11... #6

Open
moebiussurfing opened this issue Aug 30, 2024 · 3 comments

Comments


moebiussurfing commented Aug 30, 2024

I am stuck at these errors:

(deforum_xflux_env) PS D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux> python.exe run.py
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 2/2 [00:00<00:00,  3.60it/s]
Traceback (most recent call last):
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\run.py", line 122, in <module>
    root.__dict__.update(ModelSetup())
                         ^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\run.py", line 119, in ModelSetup
    model = Model("flux-dev-fp8", quantized=True)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\run.py", line 92, in __init__
    self.dit, self.ae, self.t5, self.clip = self.get_models(name, 'cuda', offload=False, is_schnell=False, quantized=quantized)
                                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\run.py", line 95, in get_models
    t5 = load_t5(device, max_length=256 if is_schnell else 512)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\./x-flux\src\flux\util.py", line 315, in load_t5
    return HFEmbedder("xlabs-ai/xflux_text_encoders", max_length=max_length, torch_dtype=torch.bfloat16).to(device)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\nn\modules\module.py", line 1174, in to
    return self._apply(convert)
           ^^^^^^^^^^^^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\nn\modules\module.py", line 780, in _apply
    module._apply(fn)
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\nn\modules\module.py", line 780, in _apply
    module._apply(fn)
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\nn\modules\module.py", line 805, in _apply
    param_applied = fn(param)
                    ^^^^^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\nn\modules\module.py", line 1160, in convert
    return t.to(
           ^^^^^
  File "D:\_AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\deforum-x-flux\deforum_xflux_env\Lib\site-packages\torch\cuda\__init__.py", line 305, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled

I tried

import torch
print(torch.cuda.is_available())

but it returns False...

Any idea?


@tonywhite11

Go to the PyTorch site and download the correct torch version with CUDA support.

Uninstall the current version of PyTorch:

pip uninstall torch

Install the correct version of PyTorch with CUDA support:

pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu117

Verify the installation:

python -c "import torch; print(torch.cuda.is_available())"


After completing these steps, you should be able to run your script without encountering the "Torch not compiled with CUDA enabled" error.
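If the check still prints False after reinstalling, it helps to see why. A minimal diagnostic sketch (not from this repo; `cuda_diagnostic` is a hypothetical helper name) that distinguishes a CPU-only torch wheel from a missing driver:

```python
# Hedged diagnostic sketch: reports whether this environment's torch build
# includes CUDA support. Safe to run with or without a GPU installed.
def cuda_diagnostic():
    try:
        import torch
    except ImportError:
        return "torch is not installed in this environment"
    lines = [f"torch version: {torch.__version__}"]
    # torch.version.cuda is None on CPU-only builds (e.g. "+cpu" wheels),
    # and holds the CUDA toolkit version string on CUDA-enabled builds.
    lines.append(f"built with CUDA: {torch.version.cuda}")
    if torch.cuda.is_available():
        lines.append(f"device: {torch.cuda.get_device_name(0)}")
    else:
        lines.append("CUDA not available: CPU-only wheel, or no usable driver")
    return "\n".join(lines)

if __name__ == "__main__":
    print(cuda_diagnostic())
```

If "built with CUDA" prints None, the wheel itself is CPU-only and a reinstall from the CUDA index (as above) is needed; if it prints a version but availability is still False, the NVIDIA driver is the likelier culprit.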

@SoftologyPro

I can get further on Windows.
I have batch files here to install and run it:
#2 (comment)
It does run super slow on Windows, though, which I am hoping a dev can help with eventually.
