BAD-Gaussians' performance on the NeRF 360 dataset #22
Hi @chenkang455, thank you very much for your continued interest in our work! To analyze this issue, I'm curious whether you used COLMAP to initialize the camera poses and point cloud with the blurred Lego images. If so, I suggest investigating the error between the estimated poses and the GT poses; our approach may fail if COLMAP does not give a proper initialization. Also, if possible, could you share the data so that we can reproduce and analyze this issue?
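The pose check suggested above can be sketched as follows. This is a hedged illustration, not code from the BAD-Gaussians repository; `pose_error` and its 4x4 camera-to-world inputs are an assumed interface for the example.

```python
# Hedged sketch (not from the BAD-Gaussians codebase): comparing an
# estimated camera-to-world pose against a ground-truth pose.
import numpy as np

def pose_error(T_est: np.ndarray, T_gt: np.ndarray):
    """Return (rotation error in degrees, translation error) between
    two 4x4 camera-to-world matrices."""
    R_rel = T_est[:3, :3].T @ T_gt[:3, :3]
    # Geodesic rotation distance: angle of the relative rotation.
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = float(np.degrees(np.arccos(cos_angle)))
    trans_err = float(np.linalg.norm(T_est[:3, 3] - T_gt[:3, 3]))
    return rot_err_deg, trans_err

# Identical poses should give zero error.
I = np.eye(4)
print(pose_error(I, I))  # -> (0.0, 0.0)
```

Large per-frame rotation errors here would point at a failed initialization rather than at the deblurring optimization itself.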
Hi @LingzheZhao, thanks for your reply! I didn't use COLMAP to initialize the camera poses; I utilize the transform.json from the … . I have tried applying the … . My lego dataset can be downloaded from this link: https://pan.baidu.com/s/1U7qUCEzJYRI2sGaJ-jWNhA?pwd=8poe . And my code modification is from the … .
It should be noted that the filename in … . This dataset reading format has been shown to be correct on the vanilla NeRF/3DGS, so you can trust that the data is read correctly. Thanks a lot for your help!
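For reference, reading poses from a Blender-style transform.json (the NeRF-synthetic format discussed above) typically looks like the minimal sketch below. `load_transforms` is a hypothetical helper, not the actual modification in the linked code.

```python
# Hedged sketch, not the issue author's dataparser change: loading camera
# poses from a Blender-style transforms.json (NeRF-synthetic format).
import json

def load_transforms(path: str):
    with open(path) as f:
        meta = json.load(f)
    fov_x = meta["camera_angle_x"]           # horizontal FoV in radians
    names = [frame["file_path"] for frame in meta["frames"]]
    poses = [frame["transform_matrix"]       # 4x4 camera-to-world matrices
             for frame in meta["frames"]]
    return fov_x, names, poses
```

The focal length then follows from the FoV and image width as `0.5 * W / tan(0.5 * fov_x)`.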
Hi @LingzheZhao, thanks for your great work. While applying BAD-Gaussians to a 360 scene such as lego, I encountered the following problems, and I am looking for your help.
My reconstructed scene is severely corrupted, as shown below:
It seems that the rays are blocked by something.
My default input image is shown below:
And my modified dataparser code is:
Can you help me figure it out? Thanks a lot. By the way, I encountered similar problems with BAD-NeRF: WU-CVGL/BAD-NeRF#9
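One common pitfall when adapting a transforms.json dataparser is the camera convention: Blender/NeRF stores camera-to-world poses with +y up and -z forward (OpenGL style), while many 3DGS pipelines expect +y down and +z forward (OpenCV style). The sketch below shows the usual axis flip; it is offered as a hedged illustration of that general pitfall, not a confirmed diagnosis of the artifact in this issue.

```python
# Hedged illustration (an assumption, not the issue author's actual code):
# convert a camera-to-world pose from the OpenGL convention used by
# Blender-style transforms.json to the OpenCV convention by negating
# the y and z camera axes.
import numpy as np

def opengl_to_opencv(c2w: np.ndarray) -> np.ndarray:
    """Flip the y and z axes of a 4x4 camera-to-world matrix."""
    flip = np.diag([1.0, -1.0, -1.0, 1.0])
    return c2w @ flip
```

Applying the flip twice returns the original pose, which makes it easy to test whether a dataparser has applied it the wrong number of times.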