Is there a good reason to put the `cls_tokens` at the beginning of the sequence?

If I want a 1D sequence classifier that runs in streaming mode, then I want to limit, maybe even forbid, right context. In the extreme case of zero right context, you essentially have causal attention, and in that case it probably makes more sense to put the `cls_tokens` at the end so that they can attend to the whole sequence, right?
Reference: vit-pytorch/vit_pytorch/vit_1d.py, line 104 at commit 5578ac4
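To make the suggestion concrete, here is a minimal, hypothetical sketch (not the repo's actual code) of the cls-token-at-the-end idea: with a causal mask, only the last position can attend to every token, so the classification token is appended rather than prepended. All names and shapes here are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

batch, seq_len, dim = 2, 8, 16
x = torch.randn(batch, seq_len, dim)  # a toy 1D token sequence

# append the cls token at the END of the sequence (instead of prepending)
cls_token = torch.zeros(batch, 1, dim)
x = torch.cat((x, cls_token), dim=1)  # (batch, seq_len + 1, dim)

# causal mask: position i may attend only to positions <= i,
# so the final (cls) position sees the whole sequence
n = x.shape[1]
causal_mask = torch.tril(torch.ones(n, n, dtype=torch.bool))

# single-head scaled dot-product attention under the causal mask
q = k = v = x
out = F.scaled_dot_product_attention(q, k, v, attn_mask=causal_mask)

# the classification read-out comes from the LAST position
cls_out = out[:, -1]  # (batch, dim)
```

With the cls token prepended instead, a causal mask would let it attend only to itself, which is exactly the problem raised above. Note that position 0 here attends only to itself, so its output equals its own value vector, confirming the mask is strictly causal.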