Is there a reason to support out-of-order axes for layerNormalization? #748

Open · philloooo opened this issue Aug 1, 2024 · 0 comments
philloooo commented Aug 1, 2024

Hi!
I found that on CoreML, when the axes are out of order, the result is sometimes wrong. I've also tested with TensorFlow, which likewise returns a wrong result when the axes are out of order.

Is there a reason to support unordered axes?

We could emulate this by transposing (it's just not as efficient; CoreML only supports scale and bias as constants, so applying the transposed scale and bias has to be done outside of layerNormalization), but if there is no valid use case for unordered axes, it's simpler to just reject them.
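
For reference, here is a minimal NumPy sketch of the transpose-based emulation described above: sort the axes, then permute the scale and bias tensors to match the sorted order. Function names like `layer_norm_unordered` are hypothetical and for illustration only; this models the math, not the WebNN API or any particular backend.

```python
import numpy as np

def layer_norm_sorted(x, axes, scale, bias, eps=1e-5):
    """Reference layerNormalization that assumes `axes` is sorted ascending.

    `scale` and `bias` have shape [x.shape[a] for a in axes] and are
    broadcast back into x's rank before being applied.
    """
    mean = x.mean(axis=tuple(axes), keepdims=True)
    var = x.var(axis=tuple(axes), keepdims=True)
    normalized = (x - mean) / np.sqrt(var + eps)
    # Expand scale/bias to x's rank: size-1 dims everywhere except `axes`.
    shape = [x.shape[a] if a in axes else 1 for a in range(x.ndim)]
    return normalized * scale.reshape(shape) + bias.reshape(shape)

def layer_norm_unordered(x, axes, scale, bias, eps=1e-5):
    """Emulate unordered axes by sorting them and permuting scale/bias to match."""
    order = np.argsort(axes)               # e.g. axes=[2, 1] -> order=[1, 0]
    sorted_axes = [axes[i] for i in order]
    return layer_norm_sorted(x, sorted_axes,
                             np.transpose(scale, order),
                             np.transpose(bias, order), eps)

x = np.random.rand(2, 3, 4).astype(np.float32)
axes = [2, 1]                                      # out of order on purpose
scale = np.random.rand(4, 3).astype(np.float32)    # shaped per the given axes order
bias = np.random.rand(4, 3).astype(np.float32)
print(layer_norm_unordered(x, axes, scale, bias).shape)  # (2, 3, 4)
```

The mean/variance step is unaffected by axes order; only scale and bias depend on it. In a real graph, the transposed scale and bias would be computed tensors rather than constants, which is why (per the CoreML constraint above) their application would have to move outside the fused layerNormalization op.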
