🚀 Feature
Given the experimental launch of the `scan` operator that lowers to XLA's `WhileOp`, we should leverage it to implement performant RNN layers. It would be great to use it in place of the Python `for` loop over the time dimension, which can be large.
Motivation
`for` loops defined in Python are traced through by the lazy tensor machinery and produce a huge XLA graph. The runtime of these unrolled graphs might even be better than the rolled loop, but the compile time would be far too long.
A scan operator is the standard technique for shortening compile time; it is also what flax's RNN implementation uses.
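For illustration, a minimal sketch of the kind of loop that currently gets unrolled (`rnn_forward_unrolled` and its arguments are hypothetical names, not an existing API). With lazy tensors, each Python iteration is traced separately, so the XLA graph grows linearly with the sequence length:

```python
import torch

def rnn_forward_unrolled(cell, xs, h0):
    # xs: (T, B, input_size). Every iteration of this Python loop is
    # traced into the lazy-tensor graph, so a sequence of length T
    # produces roughly T copies of the cell in the compiled XLA graph.
    h = h0
    outputs = []
    for t in range(xs.shape[0]):
        h = cell(xs[t], h)
        outputs.append(h)
    return torch.stack(outputs), h
```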
Pitch
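A minimal sketch of what a scan-based RNN layer could look like. This assumes a `scan(fn, init, xs)` API along the lines of `jax.lax.scan`, where `fn` maps `(carry, x) -> (carry, y)`; the exact signature and import path of the experimental torch_xla operator may differ, and `ScanRNN` is a hypothetical name for illustration:

```python
import torch
import torch.nn as nn
from torch_xla.experimental.scan import scan  # experimental; signature assumed

class ScanRNN(nn.Module):
    """Hypothetical RNN layer whose time loop lowers to a single XLA WhileOp."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.RNNCell(input_size, hidden_size)
        self.hidden_size = hidden_size

    def forward(self, xs):
        # xs: (T, B, input_size). scan carries the hidden state across
        # time steps and stacks the per-step outputs along dim 0.
        def step(h, x):
            h_next = self.cell(x, h)
            return h_next, h_next

        h0 = xs.new_zeros(xs.shape[1], self.hidden_size)
        h_final, outputs = scan(step, h0, xs)
        return outputs, h_final
```

Compared with the unrolled loop above, the traced graph would contain a single copy of the cell regardless of sequence length, so compile time stays roughly constant as the time dimension grows.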
Alternatives
Additional context