[Question] Support for Custom Attention Mask #2232

Open
Peng-YM opened this issue Apr 26, 2024 · 0 comments
Labels
question Question about the usage

Comments

Peng-YM commented Apr 26, 2024

❓ General Questions

Many thanks for your effort in developing such a great library. I want to add support for the ChatGLM model (not the 3rd generation) in mlc-llm; however, it seems that custom attention masks are not currently supported, as documented in the following file:

# 3rdparty/tvm/python/tvm/relax/frontend/nn/modules.py:924

assert attention_mask is None, "Attention mask not yet supported."
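For context on why a custom mask is needed: first-generation ChatGLM uses a prefix-LM attention pattern, where prompt tokens attend to each other bidirectionally and only generated tokens attend causally. Below is a minimal NumPy sketch of such a mask; prefix_lm_mask, seq_len, and prefix_len are illustrative names of my own, not APIs from mlc-llm or TVM:

import numpy as np

def prefix_lm_mask(seq_len: int, prefix_len: int) -> np.ndarray:
    """Boolean mask where True marks positions a query may attend to."""
    # Start from the standard lower-triangular causal mask.
    mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))
    # Additionally let every position see the whole prefix bidirectionally.
    mask[:, :prefix_len] = True
    return mask

# Example: a 6-token sequence whose first 4 tokens are the prompt.
print(prefix_lm_mask(6, 4).astype(int))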

How can I resolve this issue? I sincerely look forward to hearing from you.

Peng-YM added the question label on Apr 26, 2024