
How to visualize the attention? #29

Open
Johnzu-2019 opened this issue Jun 24, 2021 · 5 comments

Comments

@Johnzu-2019

Do you have a way to visualize the attention mechanism? I want to know what exactly the attention mechanism focuses on.

@v-iashin
Owner

Just propagate attention

softmax = F.softmax(sm_input, dim=-1)

to the output of the modules and apply an aggregation function of your liking.
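A minimal sketch of what this suggestion could look like. The function name `attention_with_weights` is hypothetical (not from the repo); the only part taken from the thread is the `softmax = F.softmax(sm_input, dim=-1)` line, which is returned alongside the module output instead of being discarded:

```python
import torch
import torch.nn.functional as F

def attention_with_weights(q, k, v):
    # Hypothetical scaled dot-product attention that also returns the
    # softmax weights so they can be propagated upward and visualized.
    sm_input = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
    softmax = F.softmax(sm_input, dim=-1)  # the line referenced above
    out = softmax @ v
    return out, softmax                    # propagate the weights upward

# toy example: batch of 1, 4 query steps, 4 key steps, feature dim 8
q = torch.randn(1, 4, 8)
k = torch.randn(1, 4, 8)
v = torch.randn(1, 4, 8)
out, attn = attention_with_weights(q, k, v)

# aggregate however you like, e.g. average over the batch dimension;
# the resulting (4, 4) matrix can be plotted as a heatmap
agg = attn.mean(dim=0)
```

Each row of `attn` is a probability distribution over the key positions, so plotting it (e.g. with `matplotlib.pyplot.imshow`) shows which inputs each query step attends to.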

@Johnzu-2019
Author

> Just propagate attention `softmax = F.softmax(sm_input, dim=-1)` to the output of the modules and apply an aggregation function of your liking.

Thank you. I'll try that.

@Johnzu-2019
Author

> softmax

Sorry, I didn't quite catch your meaning. Could you explain it in more detail? In which file should this line of code be added?

@v-iashin
Copy link
Owner

I meant that you will need to add this variable to the output of each module.
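A sketch of the "add this variable to the output of each module" pattern. `AttnBlock` is a hypothetical stand-in for one of the repo's attention modules, not its actual class: each module's `forward` returns `(output, softmax)`, and the caller collects one attention map per module for later aggregation or plotting:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttnBlock(nn.Module):
    # Hypothetical stand-in for one attention module in the model.
    def __init__(self, d):
        super().__init__()
        self.proj = nn.Linear(d, d)

    def forward(self, x):
        sm_input = x @ x.transpose(-2, -1) / (x.size(-1) ** 0.5)
        softmax = F.softmax(sm_input, dim=-1)
        out = softmax @ self.proj(x)
        return out, softmax  # the variable added to the module's output

x = torch.randn(2, 5, 16)  # (batch, sequence, features)
blocks = nn.ModuleList(AttnBlock(16) for _ in range(3))

attn_maps = []
for block in blocks:
    x, attn = block(x)
    attn_maps.append(attn)       # one (2, 5, 5) map per module

stacked = torch.stack(attn_maps)  # (num_modules, batch, 5, 5)
```

From here an aggregation of your liking (mean over modules, max over heads, a single layer of interest) gives a matrix you can inspect directly.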

@Johnzu-2019
Author

> I meant that you will need to add this variable to the output of each module.

Thank you. I will continue to study this issue.
