This issue was raised by @wacky6 and @a-sully in Chromium CL reviews 5064994 and 5541389. The discussion was about `MLBatchNormalizationOptions.epsilon`.

`epsilon` is used by `batchNormalization` according to its calculation: `Output = Scale * ((Input - Mean) / sqrt(Variance + Epsilon)) + Bias`.

The WebNN spec doesn't put any restriction on `epsilon`, which means that if both the variance and `epsilon` are 0, there will be a divide-by-zero error.

`epsilon` is also used by `instanceNormalization` and `layerNormalization`, which may have a similar divide-by-zero error. The generic element-wise binary `div` should also be considered.

The proposal is to standardize the divide-by-zero outcome. For example, as @fdwr mentioned, DirectML would give NaN if variance and epsilon are both 0.

Refer to the Wikipedia "Division by zero" page for more information.
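To make the corner case concrete, here is an illustrative per-element sketch of the `batchNormalization` formula quoted above (the function name and scalar signature are assumptions for illustration, not the WebNN API, which operates on tensors). In IEEE-754 floating point, `0 / 0` evaluates to NaN and `x / 0` to ±Infinity for nonzero `x`, which matches the DirectML behavior noted above.

```javascript
// Per-element sketch of the batchNormalization calculation from the issue:
//   Output = Scale * ((Input - Mean) / sqrt(Variance + Epsilon)) + Bias
// Hypothetical helper for illustration only -- not the WebNN implementation.
function batchNormalize(input, mean, variance, scale = 1, bias = 0, epsilon = 0) {
  return scale * ((input - mean) / Math.sqrt(variance + epsilon)) + bias;
}

// Normal case: variance + epsilon > 0.
console.log(batchNormalize(3, 1, 4, 2, 0.5, 0)); // 2 * (2 / 2) + 0.5 = 2.5

// Corner case: variance and epsilon are both 0.
// IEEE-754 arithmetic yields 0/0 = NaN when input === mean,
// and ±Infinity when input !== mean.
console.log(batchNormalize(1, 1, 0)); // NaN
console.log(batchNormalize(2, 1, 0)); // Infinity
```

A standardized outcome would mean the spec states whether implementations must produce these IEEE-754 results (NaN / ±Infinity) in this case, rather than leaving the behavior undefined.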