
Error when sequence length is changed #160

Open
shingoaoyagi opened this issue Oct 8, 2024 · 0 comments

shingoaoyagi commented Oct 8, 2024

Hello.
Thank you for presenting your excellent research.
I have a question about pre_seq_length.
I tried SimVP in the Colab tutorial. With pre_seq_length = 10 and aft_seq_length = 20 it ran fine, but when I changed pre_seq_length to 5 or 20, I got the error "running_mean should contain 320 elements not 640". It seems the number of channels in the batch normalization layer no longer matches the input. How can I solve this? Can pre_seq_length be made variable, or is changing the structure of the model the only way?
Thank you in advance for your help.
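
To illustrate what I think is happening, here is a minimal sketch (not the actual SimVP/OpenSTL code; the module and its parameters are made up for illustration, assuming a hidden width of 64 per frame). SimVP-style models fold the time axis into the channel axis before the translator, so the normalization layers are built for pre_seq_length × hidden channels and break when pre_seq_length changes:

```python
import torch
import torch.nn as nn

class ToyTranslator(nn.Module):
    """Toy stand-in for a translator that stacks frames along the channel axis."""
    def __init__(self, pre_seq_length: int, hid: int = 64):
        super().__init__()
        channels = pre_seq_length * hid          # channel count is tied to T
        self.block = nn.Sequential(
            nn.BatchNorm2d(channels),            # running_mean has `channels` entries
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )

    def forward(self, x):                        # x: (B, T, hid, H, W)
        b, t, c, h, w = x.shape
        y = self.block(x.reshape(b, t * c, h, w))  # fold time into channels
        return y.reshape(b, t, c, h, w)

model = ToyTranslator(pre_seq_length=10)         # buffers sized for 10 * 64 = 640
model(torch.randn(2, 10, 64, 8, 8))              # OK
try:
    model(torch.randn(2, 5, 64, 8, 8))           # 5 * 64 = 320 input channels
except RuntimeError as e:
    print(e)  # running_mean should contain 320 elements not 640
```

If that is the cause, then the model's input shape (e.g., an in_shape = (T, C, H, W) setting, if the tutorial config exposes one) would need to change together with pre_seq_length, and the model would need to be retrained, since the translator's channel count is architecturally tied to T.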
