The SeLU activation uses `(typename data_T::value_type)1.0507009873554804934193349852946`. This is dangerous because there is no guarantee that the data type has a range covering 1.05, or that it has sufficient precision for the value to be meaningful. What if the data range is -0.25 to 0.25 or so? The constant's type should be decoupled from the data type. One possibility is to just hardcode, say, `ap_ufixed<16, 1, AP_RND>` or something.
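A minimal sketch of the concern, assuming a standalone element-wise helper (the real hls4ml function and its signature differ; `selu_scale_sketch` and the commented-out cast are illustrative only):

```cpp
#include "ap_fixed.h"

// Illustrative sketch only -- not the hls4ml implementation. Shows why the
// SELU scale constant should not be cast to the user-chosen data type.
template <typename data_T, typename res_T>
res_T selu_scale_sketch(data_T x) {
    // Risky: the constant inherits data_T's range/precision. If data_T only
    // covers e.g. [-0.25, 0.25), then 1.0507... saturates or wraps silently.
    // data_T lambda_bad = (data_T)1.0507009873554804934193349852946;

    // Decoupled: the constant gets its own fixed-point type with one integer
    // bit (enough for 1.05) and 15 fractional bits, independent of data_T.
    static const ap_ufixed<16, 1, AP_RND> lambda = 1.0507009873554804934193349852946;

    // The multiply happens at the combined precision; only the final
    // assignment quantizes down to res_T.
    return lambda * x;
}
```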
* Replaced literal cast 1.050700987… → `res_T` with a
`static const ap_fixed<16,6> lambda`, preserving range and
~1.5 × 10⁻² LSB precision regardless of user-chosen `res_T`.
* Removed redundant datareg scope; now a single per-element branch (see the sketch after this commit message):
  if (x ≥ 0) y = λ · x
  else       y = selu_table[idx] (with index clamped to [0, N-1]).
* Guard against negative-index underflow; behaviour for <0 inputs unchanged.
* Keeps identical latency/area on Vivado 2023.1 & Vitis 2024.1, but
eliminates silent overflow/rounding when `res_T` is narrow (e.g., ap_fixed<8,2>).
Fixes fastmachinelearning#1287
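For reference, a hedged sketch of the per-element logic described in the bullets above. The `CONFIG_T` fields, the `selu_table` argument, and the index scaling are placeholders, not the exact hls4ml code:

```cpp
#include "ap_fixed.h"

// Hypothetical element-wise SELU kernel following the change described above.
// Table generation and index scaling are placeholders; hls4ml builds and
// scales its lookup tables differently.
template <class data_T, class res_T, typename CONFIG_T>
void selu_sketch(const data_T data[CONFIG_T::n_in], res_T res[CONFIG_T::n_in],
                 const res_T selu_table[CONFIG_T::table_size]) {
    // The constant lives in its own type: 6 integer bits hold 1.0507 with
    // headroom, regardless of how narrow res_T is (e.g. ap_fixed<8,2>).
    static const ap_fixed<16, 6> lambda = 1.0507009873554804934193349852946;

    for (unsigned i = 0; i < CONFIG_T::n_in; i++) {
        data_T x = data[i];
        if (x >= 0) {
            // Positive branch: y = lambda * x, quantized to res_T on assignment.
            res[i] = lambda * x;
        } else {
            // Negative branch: look up the precomputed SELU value.
            // Placeholder index derivation for x in (-8, 0]; the real code
            // scales by the table resolution.
            int idx = (int)(-x * CONFIG_T::table_size / 8);
            // Clamp to [0, table_size - 1] so large-magnitude inputs cannot
            // index outside the table (guards the underflow case as well).
            if (idx < 0) idx = 0;
            if (idx > (int)CONFIG_T::table_size - 1) idx = CONFIG_T::table_size - 1;
            res[i] = selu_table[idx];
        }
    }
}
```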