
"llava is already used by a transformers config" #111

Closed
linkpharm opened this issue Apr 29, 2024 · 7 comments

@linkpharm

I am using xformers==0.0.26.post1, torch==2.3.0, torchvision==0.18.0, and torchaudio==2.3.0.

This is the error I'm getting. I've asked ChatGPT, but it has no idea.

Can anybody help?

C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py
no module 'xformers'. Processing without...
no module 'xformers'. Processing without...
Traceback (most recent call last):
  File "C:\Users\Luke\Supir_Install\SUPIR\gradio_demo.py", line 11, in <module>
    from llava.llava_agent import LLavaAgent
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\model\__init__.py", line 1, in <module>
    from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
  File "C:\Users\Luke\Supir_Install\SUPIR\llava\model\language_model\llava_llama.py", line 139, in <module>
    AutoConfig.register("llava", LlavaConfig)
  File "C:\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 981, in register
    CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)
  File "C:\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 680, in register
    raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'llava' is already used by a Transformers config, pick another name.
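For context, the collision happens because newer transformers releases ship their own "llava" config, so SUPIR's bundled LLaVA copy registers a key that is already taken. The following is a minimal simulation (not transformers' actual code) of the registry logic behind the error, including the `exist_ok` escape hatch visible in the traceback:

```python
# Simulated version of the CONFIG_MAPPING registry: a key can only be
# registered once unless the caller passes exist_ok=True.
class ConfigRegistry:
    def __init__(self):
        self._mapping = {}

    def register(self, key, config, exist_ok=False):
        if key in self._mapping and not exist_ok:
            raise ValueError(
                f"'{key}' is already used by a Transformers config, pick another name."
            )
        self._mapping[key] = config

registry = ConfigRegistry()
registry.register("llava", "native LlavaConfig")        # shipped with transformers
try:
    registry.register("llava", "SUPIR bundled config")  # second registration fails
except ValueError as exc:
    print(exc)
registry.register("llava", "SUPIR bundled config", exist_ok=True)  # tolerated
```

In other words, either the duplicate registration has to be tolerated or the transformers version has to predate the native "llava" config.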

@FurkanGozukara

We have auto installers and a much-improved app, in case you decide to use them:

#109

or try

xformers 0.0.24 and torch 2.2.0
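A matching install command for the versions suggested above might look like the following. Note the transformers pin is my own guess, not part of the suggestion: native LLaVA support reportedly landed in transformers 4.36, so pinning below that would avoid the "llava" key collision.

```shell
pip install xformers==0.0.24 torch==2.2.0 "transformers<4.36"
```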

@linkpharm
Author

We don't have auto installers; you sell them. Please don't advertise. I already bought the subscription, and the one-click installer doesn't work.

I installed the correct versions of xformers and torch; however, now there's no output. The only sign of life is the cursor spinning for a second. Here's the cmd output:

C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py

C:\Users\Luke\Supir_Install\SUPIR>python gradio_demo.py --use_tile_vae --no_llava --use_image_slider --loading_half_params

C:\Users\Luke\Supir_Install\SUPIR>

I also tried to access http://127.0.0.1:6688, as ChatGPT said that's the default IP and port it should be on, but no luck. Thanks for the help.

@FurkanGozukara

No, my 1-click installer works perfectly fine:

[three screenshots of the installer running successfully]

@linkpharm
Author

Alright, I redownloaded your build. It worked this time; I probably had some dependency wrong. I troubleshot it for hours before resorting to following a Reddit guide, and then somehow one click worked. Huh. Thanks for the help. I still don't like paywalling work on an open-source project.

@KenWuqianghao

I'm still facing this issue. Are there any open-source options?

@linkpharm
Author

Fixed in DMs.

@17Reset

17Reset commented Aug 8, 2024

This problem still exists. How can it be solved?

(supir_venv) xlab@xlab:/mnt/Deblur2/supir$ CUDA_VISIBLE_DEVICES=0,1 python test.py --img_dir '/mnt/Deblur2/tmp/input/' --save_dir '/mnt/Deblur2/tmp/output/' --SUPIR_sign Q --upscale 2
/mnt/Deblur2/supir_venv/lib/python3.12/site-packages/kornia/feature/lightglue.py:44: FutureWarning: `torch.cuda.amp.custom_fwd(args...)` is deprecated. Please use `torch.amp.custom_fwd(args..., device_type='cuda')` instead.
  @torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
/mnt/Deblur2/supir_venv/lib/python3.12/site-packages/xformers/ops/fmha/flash.py:211: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_fwd")
/mnt/Deblur2/supir_venv/lib/python3.12/site-packages/xformers/ops/fmha/flash.py:344: FutureWarning: `torch.library.impl_abstract` was renamed to `torch.library.register_fake`. Please use that instead; we will remove `torch.library.impl_abstract` in a future version of PyTorch.
  @torch.library.impl_abstract("xformers_flash::flash_bwd")
Traceback (most recent call last):
  File "/mnt/Deblur2/supir/test.py", line 5, in <module>
    from llava.llava_agent import LLavaAgent
  File "/mnt/Deblur2/supir/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/mnt/Deblur2/supir/llava/model/__init__.py", line 1, in <module>
    from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
  File "/mnt/Deblur2/supir/llava/model/language_model/llava_llama.py", line 139, in <module>
    AutoConfig.register("llava", LlavaConfig)
  File "/mnt/Deblur2/supir_venv/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 1029, in register
    CONFIG_MAPPING.register(model_type, config, exist_ok=exist_ok)
  File "/mnt/Deblur2/supir_venv/lib/python3.12/site-packages/transformers/models/auto/configuration_auto.py", line 728, in register
    raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'llava' is already used by a Transformers config, pick another name.
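One possible in-place workaround (my own assumption, not an official SUPIR fix): the traceback shows that `AutoConfig.register` forwards an `exist_ok` flag, so editing line 139 of `llava/model/language_model/llava_llama.py` to pass it would let the bundled config coexist with the native one instead of crashing:

```python
# llava/model/language_model/llava_llama.py, line 139 (hypothetical patch):
# tolerate the "llava" key already registered by newer transformers releases
AutoConfig.register("llava", LlavaConfig, exist_ok=True)
```

The safer route is still pinning transformers to a release that predates native LLaVA support, since the bundled and native configs may not be interchangeable.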
