
Add VLLM_T_COMPILE_FULLGRAPH flag #932

Merged
2 commits merged into habana_main on Mar 24, 2025
Conversation

@anko-intel commented Mar 19, 2025

The flag defaults to False so end-user behavior is unchanged, but it can be enabled in CI to catch performance regressions earlier, since graph breaks usually reduce performance.

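A minimal sketch of the idea behind such a flag (not the PR's actual diff): a boolean environment variable is read and forwarded to `torch.compile`'s `fullgraph` argument, so CI can opt in to hard failures on graph breaks while end-users keep the default behavior. The helper names `_env_flag` and `compile_model` below are illustrative assumptions, not code from this change.

```python
import os

import torch


def _env_flag(name: str, default: bool = False) -> bool:
    # Illustrative helper: treat "1"/"true"/"yes" (case-insensitive) as enabled.
    return os.getenv(name, str(default)).strip().lower() in ("1", "true", "yes")


def compile_model(model: torch.nn.Module) -> torch.nn.Module:
    # With fullgraph=True, torch.compile raises an error on any graph break
    # instead of silently splitting the graph, which is how a CI job can catch
    # breaks that would otherwise only show up as slower runs.
    fullgraph = _env_flag("VLLM_T_COMPILE_FULLGRAPH", default=False)
    return torch.compile(model, fullgraph=fullgraph)
```

In a CI job, the flag would simply be exported (e.g. `VLLM_T_COMPILE_FULLGRAPH=true`) before running the workload, leaving the default False for regular users.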
@anko-intel force-pushed the dev/anko/add_fullgraph_flag branch from 44dc690 to c492e18 on March 19, 2025 12:47
@anko-intel requested a review from PatrykWo on March 19, 2025 16:35
@michalkuligowski merged commit 181b6a9 into habana_main on Mar 24, 2025
33 checks passed