
Commit 1c99be2

Author: Eren Golge
Message: Change window size for attention
Parent: 97a16ce

File tree

1 file changed: +2 -2 lines

layers/attention.py

Lines changed: 2 additions and 2 deletions

@@ -118,8 +118,8 @@ def __init__(self, out_dim, rnn_dim, annot_dim, memory_dim, align_model, windowing):
         self.rnn_cell = nn.GRUCell(annot_dim + memory_dim, rnn_dim)
         self.windowing = windowing
         if self.windowing:
-            self.win_back = 1
-            self.win_front = 3
+            self.win_back = 3
+            self.win_front = 6
         self.win_idx = None
         # pick bahdanau or location sensitive attention
         if align_model == 'b':
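
The change widens the attention window: 3 steps back and 6 steps ahead of the previously attended encoder index, instead of 1 back and 3 ahead. Below is a minimal sketch (not the repository's actual implementation) of how win_back / win_front are typically used with windowed attention: during inference, raw attention energies outside a window centred on win_idx, the encoder step attended at the previous decoder step, are masked before the softmax. The helper name apply_windowing and its signature are assumptions for illustration; only win_back, win_front, and win_idx come from the diff.

import torch

def apply_windowing(energies, win_idx, win_back=3, win_front=6):
    # Sketch only: mask attention energies outside
    # [win_idx - win_back, win_idx + win_front) so the softmax
    # assigns zero weight to encoder steps beyond the window.
    seq_len = energies.size(1)                       # energies: (batch, seq_len)
    back = max(win_idx - win_back, 0)
    front = min(win_idx + win_front, seq_len)
    mask = torch.full_like(energies, float('-inf'))
    mask[:, back:front] = 0.0                        # keep only the window
    return energies + mask

A wider window gives the attention more room to move at each decoder step, which presumably makes it less likely to stall on or skip encoder steps, at the cost of a looser monotonicity constraint.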
