
cudaMallocAsync does not yet support checkPoolLiveAllocations. If you need it, please file an issue describing your use case. #132

Open
if-ai opened this issue Apr 15, 2024 · 5 comments


if-ai commented Apr 15, 2024

I always get this error:

[2024-04-15 13:57:46,399] [0/0] torch._dynamo.variables.torch: [WARNING] Profiler function <class 'torch.autograd.profiler.record_function'> will be ignored
/home/impactframes/micromamba/envs/comfy/lib/python3.11/multiprocessing/popen_fork.py:66: RuntimeWarning: os.fork() was called. os.fork() is incompatible with multithreaded code, and JAX is multithreaded, so this will likely lead to a deadlock.
  self.pid = os.fork()

WSL Ubuntu CUDA 12.1


if-ai commented Apr 16, 2024

It works when I set torch.compile to False, but that's not ideal. Maybe my torch is too new and it manages memory allocation differently:
torch 2.2.1+cu121 torch_scatter 2.1.2 torchaudio 2.2.1+cu121
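The workaround above can be sketched as a simple guard (`maybe_compile` is a hypothetical helper name, not part of this repository; the import is wrapped so the sketch also runs where torch is absent):

```python
# Hypothetical helper mirroring the workaround: only invoke torch.compile
# when explicitly enabled; otherwise return the function unchanged so it
# runs eagerly and never reaches the Inductor/Triton backend.
def maybe_compile(fn, enable: bool):
    if enable:
        try:
            import torch
            if hasattr(torch, "compile"):
                return torch.compile(fn)
        except ImportError:
            pass  # torch not installed; fall through to eager
    return fn

# With enable=False the original function is returned untouched.
double = maybe_compile(lambda x: x * 2, enable=False)
print(double(21))  # prints 42
```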

@if-ai if-ai closed this as completed Apr 16, 2024

jpc commented Apr 16, 2024

The "Profiler function" warning sohuld be 100% harmless. I am not sure where forking may be used during inference.

Were you able to figure out the cause of the issue?


if-ai commented Apr 16, 2024 via email


if-ai commented Apr 17, 2024

Just in case, here is the full log. The first attempt succeeds because I set torch_compile to False; the second run, with it set to True, fails with an error from Dynamo and Triton, and I don't know why:
` 2.1 seconds: /home/impactframes/ComfyUI/custom_nodes/ComfyUI-IF_AI_tools
4.2 seconds: /home/impactframes/ComfyUI/custom_nodes/ComfyUI-Crystools
6.8 seconds: /home/impactframes/ComfyUI/custom_nodes/ComfyUI-Zho-InstantID

Starting server

To see the GUI go to: http://127.0.0.1:8188
FETCH DATA from: /home/impactframes/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json
got prompt
Fetch hyperparams.yaml: Using existing file/symlink in ~/.cache/speechbrain/hyperparams.yaml.
Fetch custom.py: Delegating to Huggingface hub, source speechbrain/spkrec-ecapa-voxceleb.
Fetch embedding_model.ckpt: Using existing file/symlink in ~/.cache/speechbrain/embedding_model.ckpt.
Fetch mean_var_norm_emb.ckpt: Using existing file/symlink in ~/.cache/speechbrain/mean_var_norm_emb.ckpt.
Fetch classifier.ckpt: Using existing file/symlink in ~/.cache/speechbrain/classifier.ckpt.
Fetch label_encoder.txt: Using existing file/symlink in ~/.cache/speechbrain/label_encoder.ckpt.
Loading pretrained files for: embedding_model, mean_var_norm_emb, classifier, label_encoder
Vibrant green leaf sways,
Intricate veins intertwine,
Nature's artwork shines.
/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/backends/cuda/__init__.py:342: FutureWarning: torch.backends.cuda.sdp_kernel() is deprecated. In the future, this context manager will be removed. Please see, torch.nn.attention.sdpa_kernel() for the new context manager, with updated signature.
warnings.warn(

<IPython.lib.display.Audio object>
ffmpeg version 6.0 Copyright (c) 2000-2023 the FFmpeg developers
built with gcc 11 (Ubuntu 11.4.0-1ubuntu1~22.04)
configuration: --enable-libx264 --enable-gpl
libavutil 58. 2.100 / 58. 2.100
libavcodec 60. 3.100 / 60. 3.100
libavformat 60. 3.100 / 60. 3.100
libavdevice 60. 1.100 / 60. 1.100
libavfilter 9. 3.100 / 9. 3.100
libswscale 7. 1.100 / 7. 1.100
libswresample 4. 10.100 / 4. 10.100
libpostproc 57. 1.100 / 57. 1.100
Guessed Channel Layout for Input Stream #0.0 : mono
Input #0, wav, from '/home/impactframes/ComfyUI/output/IF_whisper_speech_20240417104816/IF_whisper_speech_20240417104816.wav':
Duration: 00:00:05.47, bitrate: 512 kb/s
Stream #0:0: Audio: pcm_f32le ([3][0][0][0] / 0x0003), 16000 Hz, 1 channels, flt, 512 kb/s
Stream mapping:
Stream #0:0 -> #0:0 (pcm_f32le (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
Output #0, wav, to 'tmp/IF_AI_DreamTalk_20240417104854/IF_AI_DreamTalk_20240417104854_16K.wav':
Metadata:
ISFT : Lavf60.3.100
Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 16000 Hz, mono, s16, 256 kb/s
Metadata:
encoder : Lavc60.3.100 pcm_s16le
size= 171kB time=00:00:05.44 bitrate= 257.4kbits/s speed= 512x
video:0kB audio:171kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.044588%
Some weights of Wav2Vec2Model were not initialized from the model checkpoint at jonatasgrosman/wav2vec2-large-xlsr-53-english and are newly initialized: ['wav2vec2.encoder.pos_conv_embed.conv.parametrizations.weight.original0', 'wav2vec2.encoder.pos_conv_embed.conv.parametrizations.weight.original1']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Moviepy - Building video /home/impactframes/ComfyUI/output/IF_AI_DreamTalk_20240417104854/IF_AI_DreamTalk_20240417104854_converted.mp4.
MoviePy - Writing audio in IF_AI_DreamTalk_20240417104854_convertedTEMP_MPY_wvf_snd.mp4
MoviePy - Done.
Moviepy - Writing video /home/impactframes/ComfyUI/output/IF_AI_DreamTalk_20240417104854/IF_AI_DreamTalk_20240417104854_converted.mp4

Moviepy - Done !
Moviepy - video ready /home/impactframes/ComfyUI/output/IF_AI_DreamTalk_20240417104854/IF_AI_DreamTalk_20240417104854_converted.mp4

IF_AI_tool_output:

Vibrant green leaf sways,
Intricate veins intertwine,
Nature's artwork shines.
Prompt executed in 101.68 seconds
got prompt
/home/impactframes/micromamba/envs/comfy/lib/python3.11/multiprocessing/popen_fork.py:66: RuntimeWarning: os.fork() was called. os.fork() is incompatible with multithreaded code, and JAX is multithreaded, so this will likely lead to a deadlock.
self.pid = os.fork()

Fetch hyperparams.yaml: Using existing file/symlink in ~/.cache/speechbrain/hyperparams.yaml.
Fetch custom.py: Delegating to Huggingface hub, source speechbrain/spkrec-ecapa-voxceleb.
Fetch embedding_model.ckpt: Using existing file/symlink in ~/.cache/speechbrain/embedding_model.ckpt.
Fetch mean_var_norm_emb.ckpt: Using existing file/symlink in ~/.cache/speechbrain/mean_var_norm_emb.ckpt.
Fetch classifier.ckpt: Using existing file/symlink in ~/.cache/speechbrain/classifier.ckpt.
Fetch label_encoder.txt: Using existing file/symlink in ~/.cache/speechbrain/label_encoder.ckpt.
Loading pretrained files for: embedding_model, mean_var_norm_emb, classifier, label_encoder
Here is a haiku about the image:

Verdant leaf unfurls,
Intricate veins glisten green,
Nature's artistry.
/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/backends/cuda/__init__.py:342: FutureWarning: torch.backends.cuda.sdp_kernel() is deprecated. In the future, this context manager will be removed. Please see, torch.nn.attention.sdpa_kernel() for the new context manager, with updated signature.
warnings.warn(

W0417 10:51:58.792000 140378514298432 torch/_logging/_internal.py:1016] [0/0] Profiler function <class 'torch.autograd.profiler.record_function'> will be ignored
!!! Exception during processing !!!
Traceback (most recent call last):
File "/home/impactframes/ComfyUI/execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/ComfyUI/execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/ComfyUI/execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/ComfyUI/custom_nodes/ComfyUI-IF_AI_tools/IFWhisperSpeechNode.py", line 115, in generate_audio
stoks = pipe.t2s.generate(chunk, cps=cps, show_progress_bar=False)[0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/whisperspeech/t2s_up_wds_mlang_enclm.py", line 462, in generate
toks[:,i+1] = self.generate_next(toks[:,i:i+1], toks_positions[i:i+1], cps_emb, xenc, xenc_positions, T, top_k)[:,0]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/eval_frame.py", line 410, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 977, in catch_errors
return callback(frame, cache_entry, hooks, frame_state, skip=1)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 411, in _convert_frame_assert
return _compile(
^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_utils_internal.py", line 70, in wrapper_function
return function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 700, in _compile
guarded_code = compile_inner(code, one_graph, hooks, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 568, in compile_inner
out_code = transform_code_object(code, transform)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/bytecode_transformation.py", line 1116, in transform_code_object
transformations(instructions, code_options)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 173, in _fn
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/convert_frame.py", line 515, in transform
tracer.run()
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 2237, in run
super().run()
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 875, in run
while self.step():
^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 790, in step
self.dispatch_table[inst.opcode](self, inst)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 2380, in RETURN_VALUE
self._return(inst)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/symbolic_convert.py", line 2365, in _return
self.output.compile_subgraph(
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1050, in compile_subgraph
self.compile_and_call_fx_graph(tx, list(reversed(stack_values)), root)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1264, in compile_and_call_fx_graph
compiled_fn = self.call_user_compiler(gm)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1331, in call_user_compiler
raise BackendCompilerFailed(self.compiler_fn, e).with_traceback(
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/output_graph.py", line 1312, in call_user_compiler
compiled_fn = compiler_fn(gm, self.example_inputs())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/repro/after_dynamo.py", line 127, in debug_wrapper
compiled_gm = compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/repro/after_dynamo.py", line 127, in debug_wrapper
compiled_gm = compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/__init__.py", line 1742, in __call__
return compile_fx(model_, inputs_, config_patches=self.config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1174, in compile_fx
return compile_fx(
^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1418, in compile_fx
return aot_autograd(
^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/backends/common.py", line 65, in compiler_fn
cg = aot_module_simplified(gm, example_inputs, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 958, in aot_module_simplified
compiled_fn = create_aot_dispatcher_function(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_functorch/aot_autograd.py", line 685, in create_aot_dispatcher_function
compiled_fn = compiler_fn(
^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/runtime_wrappers.py", line 469, in aot_wrapper_dedupe
return compiler_fn(flat_fn, leaf_flat_args, aot_config, fw_metadata=fw_metadata)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/runtime_wrappers.py", line 671, in aot_wrapper_synthetic_base
return compiler_fn(flat_fn, flat_args, aot_config, fw_metadata=fw_metadata)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_functorch/_aot_autograd/jit_compile_runtime_wrappers.py", line 149, in aot_dispatch_base
compiled_fw = compiler(fw_module, updated_flat_args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 1322, in fw_compiler_base
return inner_compile(
^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/repro/after_aot.py", line 83, in debug_wrapper
inner_compiled_fn = compiler_fn(gm, example_inputs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/debug.py", line 304, in inner
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 477, in compile_fx_inner
compiled_graph = fx_codegen_and_compile(
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/compile_fx.py", line 758, in fx_codegen_and_compile
compiled_fn = graph.compile_to_fn()
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/graph.py", line 1517, in compile_to_fn
return self.compile_to_module().call
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/graph.py", line 1460, in compile_to_module
self.codegen_with_cpp_wrapper() if self.cpp_wrapper else self.codegen()
^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/graph.py", line 1416, in codegen
self.scheduler.codegen()
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_dynamo/utils.py", line 266, in time_wrapper
r = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/scheduler.py", line 2473, in codegen
self.get_backend(device).codegen_node(node) # type: ignore[possibly-undefined]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/codegen/cuda_combined_scheduling.py", line 69, in codegen_node
return self._triton_scheduling.codegen_node(node)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/codegen/triton.py", line 3313, in codegen_node
return self.codegen_node_schedule(node_schedule, buf_accesses, numel, rnumel)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/codegen/triton.py", line 3483, in codegen_node_schedule
src_code = kernel.codegen_kernel()
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/_inductor/codegen/triton.py", line 2806, in codegen_kernel
"backend_hash": torch.utils._triton.triton_hash_with_backend(),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/utils/_triton.py", line 62, in triton_hash_with_backend
backend = triton_backend()
^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/torch/utils/_triton.py", line 48, in triton_backend
target = driver.active.get_current_target()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/runtime/driver.py", line 23, in __getattr__
self._initialize_obj()
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/runtime/driver.py", line 20, in _initialize_obj
self._obj = self._init_fn()
^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/runtime/driver.py", line 9, in _create_driver
return actives[0]()
^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/backends/nvidia/driver.py", line 350, in __init__
self.utils = CudaUtils() # TODO: make static
^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/backends/nvidia/driver.py", line 79, in __init__
mod = compile_module_from_src(Path(os.path.join(dirname, "driver.c")).read_text(), "cuda_utils")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/backends/nvidia/driver.py", line 56, in compile_module_from_src
so = _build(name, src_path, tmpdir, library_dirs(), include_dir, libraries)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/triton/runtime/build.py", line 77, in _build
setuptools.setup(**args)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/__init__.py", line 155, in setup
return distutils.core.setup(**attrs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/_distutils/core.py", line 148, in setup
return run_commands(dist)
^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/_distutils/core.py", line 163, in run_commands
dist.run_commands()
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 967, in run_commands
self.run_command(cmd)
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 984, in run_command
cmd_obj = self.get_command_obj(command)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/site-packages/setuptools/_distutils/dist.py", line 859, in get_command_obj
cmd_obj = self.command_obj[command] = klass(self)
^^^^^^^^^^^
File "/home/impactframes/micromamba/envs/comfy/lib/python3.11/distutils/cmd.py", line 57, in __init__
raise TypeError("dist must be a Distribution instance")
torch._dynamo.exc.BackendCompilerFailed: backend='inductor' raised:
TypeError: dist must be a Distribution instance

Set TORCH_LOGS="+dynamo" and TORCHDYNAMO_VERBOSE=1 for more information

You can suppress this exception and fall back to eager by setting:
import torch._dynamo
torch._dynamo.config.suppress_errors = True

Prompt executed in 170.13 seconds`
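For completeness, the eager fallback suggested at the end of the traceback can be applied programmatically. This is a minimal sketch (`enable_eager_fallback` is a hypothetical name; the import is guarded so the snippet is illustrative even where torch is not installed):

```python
# Sketch of the fallback the error message suggests: suppress
# torch._dynamo backend errors so graphs that fail to compile run
# eagerly instead of raising BackendCompilerFailed.
def enable_eager_fallback() -> bool:
    try:
        import torch._dynamo
        torch._dynamo.config.suppress_errors = True
        return True
    except ImportError:
        # torch not installed; nothing to configure.
        return False

enable_eager_fallback()
```

This trades compiled-kernel performance for robustness, which matches the observed behavior that the node works with torch_compile set to False.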

@if-ai if-ai reopened this Apr 17, 2024
@MaxTran96

Hi, I'm getting the same issue. Any fix for this?

Labels: None yet
Development: No branches or pull requests
3 participants