
Triton? #1

Open
tayshie opened this issue Aug 20, 2024 · 12 comments

Comments

@tayshie

tayshie commented Aug 20, 2024

[screenshot of the error attached]
Any suggestions?

CPU: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz (8 cores / 16 threads)
GPU: Intel(R) UHD Graphics (driver 30.0.101.2079)
GPU: NVIDIA GeForce RTX 3050 Ti Laptop GPU (driver 32.0.15.6081)

@stazizov
Collaborator

Sorry about this; we should have noted in the instructions that it's better to create a virtual environment and install all the packages there.
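For anyone unsure of the exact commands, a minimal sketch on Linux/WSL (assuming `python3` and the standard `venv` module are available, and that `requirements.txt` sits in the project root):

```shell
python3 -m venv .venv             # create an isolated environment in .venv
source .venv/bin/activate         # activate it for the current shell session
pip install -r requirements.txt   # install the project's dependencies inside it
```

This keeps the project's packages separate from the system Python, so a broken install can be deleted and recreated without side effects.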

@melMass

melMass commented Aug 20, 2024

Triton is notoriously hard to install (if even possible) on Windows: triton-inference-server/server#4737

@tonywhite11

Open a Linux or WSL CLI and install the requirements there.

@Bladed3d

pip install of the requirements results in a Triton error message. I can open a WSL window in Windows, but how do I install the requirements there? And why would this resolve the Triton error?

@tonywhite11

tonywhite11 commented Aug 23, 2024

I got the same error message in an earlier project, and just like the previous commenter and ChatGPT said, it's virtually impossible to install Triton on Windows but not on Linux, and WSL is Linux for Windows. I use VS Code: just open a new terminal, then choose an Ubuntu WSL terminal and follow the project instructions. It works. Correction: I got it installed, but ran out of memory before I could use it.

@Bladed3d

> I got the same error message in an earlier project, and just like the previous commenter and ChatGPT said, it's virtually impossible to install Triton on Windows but not on Linux, and WSL is Linux for Windows. I use VS Code: just open a new terminal, then choose an Ubuntu WSL terminal and follow the project instructions. It works. Correction: I got it installed, but ran out of memory before I could use it.

The instructions seem to say that I just need to install requirements.txt from a WSL window. Is that right, and what exactly is the command to install requirements.txt?

@Bladed3d

And what is this error? ValueError: Non-consecutive added token '<extra_id_99>' found. Should have index 32100 but has index 32000 in saved vocabulary.

@tonywhite11

No, I was responding to the Triton installation problem. The command is the same, except I have to use python3 instead of python. My suggestion is to use an LLM to help with your issues. I gave Copilot your error message, and this is what it responded:

It seems like you're encountering an error related to the tokenization process in a language model, possibly while using Hugging Face's Transformers library.

The error message suggests that there's an issue with the indices of the added tokens. The token <extra_id_99> should have an index of 32100, but it has an index of 32000 in the saved vocabulary.

This could be due to a mismatch between the pre-trained model's tokenizer and the one you're using. If you've added new tokens to the tokenizer, you need to make sure that the model is aware of these new tokens.

Here's a general way to add new tokens:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('model_name')
model = AutoModel.from_pretrained('model_name')

# Add new tokens
new_tokens = ['<extra_id_99>', '<extra_id_98>']  # add your new tokens here
num_added_tokens = tokenizer.add_tokens(new_tokens)

# Resize the model's token embeddings so it knows about the new tokens
model.resize_token_embeddings(len(tokenizer))
```

@stazizov
Collaborator

Guys, have you just tried ignoring the Triton installation?

@stazizov
Collaborator

and the installation of all the torch dependencies too

@tonywhite11

I haven't, but it's good to have WSL installed for other Linux projects, or for projects that don't have Windows compatibility yet, like Ollama when it first came out. I also use Copilot or the internet for most of my installation issues.

@stazizov
Collaborator

Please try just ignoring Triton and running the code again.
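To make "ignoring Triton" concrete, a minimal sketch of guarding the import so the code falls back to a plain path when Triton is unavailable (e.g. on Windows). The function and fallback below are illustrative, not from this repo:

```python
# Try the optional compiled-kernel backend; fall back gracefully if missing.
try:
    import triton  # not installable on most Windows setups
    HAS_TRITON = True
except ImportError:
    HAS_TRITON = False

def scale(x, factor=2):
    """Illustrative op: would dispatch to a Triton kernel if available."""
    if HAS_TRITON:
        pass  # a Triton kernel path would go here
    return x * factor  # plain fallback, used either way in this sketch
```

In practice this is the pattern many libraries use: the import failure is caught once at module load, and every call site checks the flag instead of crashing at install time.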
