
how to implement target attention in this framework #208

Open
LeiShenVictoria opened this issue Apr 25, 2024 · 3 comments

Comments

@LeiShenVictoria

In DeepInterestNetwork, there is a target attention between a candidate feature (one column) and a sequence feature. How can this target attention be implemented in this repo? I guess it could be seen as an attention between a column in the deep part (the candidate) and the text part (the sequence)...
Thanks a lot
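
For reference, a minimal PyTorch sketch of what DIN-style target attention looks like (this is not part of this library; the class and parameter names are illustrative). Each item in the behaviour sequence is scored against the single candidate item by a small MLP, and the sequence embeddings are pooled with the resulting weights:

```python
import torch
import torch.nn as nn

class TargetAttention(nn.Module):
    """DIN-style target attention (illustrative sketch): scores each item
    in a behaviour sequence against one candidate item and returns a
    weighted sum of the sequence embeddings."""

    def __init__(self, embed_dim: int, hidden_dim: int = 32):
        super().__init__()
        # MLP over [candidate, item, candidate - item, candidate * item],
        # as in the DIN paper's local activation unit
        self.mlp = nn.Sequential(
            nn.Linear(4 * embed_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, candidate: torch.Tensor, sequence: torch.Tensor) -> torch.Tensor:
        # candidate: (batch, embed_dim); sequence: (batch, seq_len, embed_dim)
        seq_len = sequence.size(1)
        cand = candidate.unsqueeze(1).expand(-1, seq_len, -1)
        feats = torch.cat([cand, sequence, cand - sequence, cand * sequence], dim=-1)
        scores = self.mlp(feats).squeeze(-1)            # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)         # attention weights
        return (weights.unsqueeze(-1) * sequence).sum(dim=1)  # (batch, embed_dim)
```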

@jrzaurin
Owner

Hey @LeiShenVictoria

So I have not read the Deep Interest Network paper yet. I will, and maybe I can incorporate some of its ideas into the library.

As of right now, the only "kind-of" similar thing you would have here are the attention weights of the models.

All model components that are based on attention mechanisms have an attribute called attention_weights: see here

I will have a look at the Deep Interest Network paper asap and see if I can come up with a quick answer that is more helpful :)

@LeiShenVictoria
Author

Hi, thanks for your reply.
One more question: how can the embedding-sharing operation be implemented for a candidate feature and a sequence feature?
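
In case it helps frame the question, embedding sharing in the DIN sense usually just means the candidate item column and the behaviour-sequence items index into the same embedding table, so both features live in one embedding space. A minimal sketch (names are illustrative, not any library's API):

```python
import torch
import torch.nn as nn

class SharedItemEmbeddings(nn.Module):
    """Illustrative sketch: one nn.Embedding table shared between a
    single candidate-item column and a sequence of item ids."""

    def __init__(self, n_items: int, embed_dim: int):
        super().__init__()
        # a single table; both features look up into it
        self.item_emb = nn.Embedding(n_items, embed_dim, padding_idx=0)

    def forward(self, candidate_ids: torch.Tensor, sequence_ids: torch.Tensor):
        # candidate_ids: (batch,); sequence_ids: (batch, seq_len)
        cand = self.item_emb(candidate_ids)   # (batch, embed_dim)
        seq = self.item_emb(sequence_ids)     # (batch, seq_len, embed_dim)
        return cand, seq
```

Because the table is shared, the same item id maps to the same vector whether it appears as the candidate or inside the sequence.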

@jrzaurin
Owner

jrzaurin commented May 9, 2024

Hey @LeiShenVictoria

I would have to read the paper :)

I am busy at work now, but I'll see what I can do asap.
