This error only appears when fp8_e4m3fn is selected as the weight_dtype; it does not appear when fp8_e5m2 is selected.

The exception started after upgrading CUDA from 11.8 to 12.4, but images are still generated. In the UNet loader, loading flux-dev-fp8.safetensors with the weight dtype set to fp8_e4m3fn produces the error messages below; selecting either of the other two options (e.g. fp8_e5m2) produces no errors.
pytorch version: 2.5.0+cu124
xformers version: 0.0.28.post2
model weight dtype torch.float8_e4m3fn, manual cast: torch.bfloat16
model_type FLUX
Requested to load Flux
Loading 1 new model
loaded partially 10745.827979049682 10745.7216796875 46
ERROR lora diffusion_model.double_blocks.0.img_attn.qkv.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
................
ERROR lora diffusion_model.double_blocks.16.txt_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
ERROR lora diffusion_model.double_blocks.17.img_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
ERROR lora diffusion_model.double_blocks.17.txt_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
ERROR lora diffusion_model.double_blocks.18.img_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
ERROR lora diffusion_model.double_blocks.18.txt_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'
hong2610 changed the title from "Help, cuda 11.8 level to 12.4. The console shows an exception, but the image is still being drawn." to "ERROR lora diffusion_model.double_blocks.16.txt_attn.proj.weight "addmm_cuda" not implemented for 'Float8_e4m3fn'" on Nov 9, 2024.