
Split not working when submodels folder is bound to a Docker volume #1762

Open
CarstenL opened this issue Jun 4, 2024 · 4 comments

Comments

@CarstenL

CarstenL commented Jun 4, 2024

How did you install ODM? (Docker, installer, natively, ...)?

Docker - Image: opendronemap/odm:latest

What is the problem?

I have a large dataset of about 3000 images.
Our server does not have enough RAM, so I wanted to split it into submodels of about 1000 images each using --split.

According to the log, the split stage reports that the dataset has been split into 0 submodels:

{"log":"2024-06-03 22:24:06,219 DEBUG: Matching DJI_20240412141228_1017_V.JPG and DJI_20240412141110_0925_V.JPG.  Matcher: FLANN (symmetric) T-desc: 1.604 T-robust: 0.001 T-total: 1.607 Matches: 375 Robust: 371 Success: True\n","stream":"stdout","time":"2024-06-03T22:47:27.54336146Z"}
{"log":"2024-06-03 22:24:06,224 DEBUG: Matching DJI_20240412141356_1120_V.JPG and DJI_20240412141527_1227_V.JPG.  Matcher: FLANN (symmetric) T-desc: 1.600 T-robust: 0.001 T-total: 1.601 Matches: 208 Robust: 200 Success: True\n","stream":"stdout","time":"2024-06-03T22:47:27.543367581Z"}
{"log":"2024-06-03 22:24:06,250 DEBUG: Matching DJI_20240412140920_0796_V.JPG and DJI_20240412141950_1529_V.JPG.  Matcher: FLANN (symmetric) T-desc: 1.481 T-robust: 0.001 T-total: 1.483 Matches: 337 Robust: 331 Success: True\n","stream":"stdout","time":"2024-06-03T22:47:27.543373693Z"}
{"log":"2024-06-03 22:24:06,273 DEBUG: Matching DJI_20240412133744_0145_V.JPG and DJI_20240412133923_0261_V.JPG.  Matcher: FLANN (symmetric) T-desc: 1.563 T-robust: 0.005 T-total: 1.570 Matches: 1507 Robust: 1493 Success: True\n","stream":"stdout","time":"2024-06-03T22:47:27.543379784Z"}
{"log":"2024-06-03 22:24:06,299 DEBUG: Matching DJI_20240412134111_0388_V.JPG and DJI_20240412134002_0307_V.JPG.  Matcher: FLANN (symmetric) T-desc: 1.505 T-robust: 0.011 T-total: 1.522 Matches: 3568 Robust: 3502 Success: True\n","stream":"stdout","time":"2024-06-03T22:47:27.543385835Z"}
{"log":"2024-06-03 22:24:06,375 INFO: Matched 29235 pairs (brown-brown: 29235) in 1763.6751683829934 seconds (0.06032752420783345 seconds/pair).\n","stream":"stdout","time":"2024-06-03T22:47:27.543391907Z"}
{"log":"\u001b[93m[WARNING] Submodels directory already exist at: /code/submodels\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.543397687Z"}
{"log":"\u001b[39m[INFO]    Dataset has been split into 0 submodels. Reconstructing each submodel...\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.550810523Z"}
{"log":"\u001b[39m[INFO]    Aligning submodels...\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.550960113Z"}
{"log":"\u001b[39m[INFO]    Finished split stage\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.559395036Z"}
{"log":"\u001b[39m[INFO]    Running merge stage\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.559428889Z"}
{"log":"\u001b[93m[WARNING] No input point cloud files to process\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.559742295Z"}
{"log":"\u001b[39m[INFO]    Classifying /code/odm_georeferencing/odm_georeferenced_model.laz using Simple Morphological Filter (1/2)\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.559752294Z"}
{"log":"\u001b[39m[INFO]    running pdal translate -i /code/odm_georeferencing/odm_georeferenced_model.laz -o /code/odm_georeferencing/odm_georeferenced_model.laz smrf --filters.smrf.scalar=1.25 --filters.smrf.slope=0.15 --filters.smrf.threshold=0.5 --filters.smrf.window=18.0\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.559756752Z"}
{"log":"PDAL: Unable to open stream for '/code/odm_georeferencing/odm_georeferenced_model.laz' with error 'No such file or directory'\n","stream":"stdout","time":"2024-06-03T22:47:27.689991401Z"}
{"log":"\n","stream":"stdout","time":"2024-06-03T22:47:27.690031716Z"}
{"log":"\u001b[93m[WARNING] Error creating classified file /code/odm_georeferencing/odm_georeferenced_model.laz\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.690038128Z"}
{"log":"\u001b[39m[INFO]    Created /code/odm_georeferencing/odm_georeferenced_model.laz in 0:00:00.130250\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.690041905Z"}
{"log":"\u001b[39m[INFO]    Classifying /code/odm_georeferencing/odm_georeferenced_model.laz using OpenPointClass (2/2)\u001b[0m\n","stream":"stdout","time":"2024-06-03T22:47:27.690045382Z"}

I start the Docker container via the following Docker Compose file.

services:
  ODM:
    container_name: ODM
    image: opendronemap/odm:latest
    command: --dtm --dsm --build-overviews --pc-skip-geometric --split 1000
    volumes:
      - /data/Example/images:/code/images
      - /data/Example/odm_georeferencing:/code/odm_georeferencing
      - /data/Example/odm_meshing:/code/odm_meshing
      - /data/Example/odm_orthophoto:/code/odm_orthophoto
      - /data/Example/odm_texturing:/code/odm_texturing
      - /data/Example/odm_dem:/code/odm_dem
      - /data/Example/opensfm:/code/opensfm
      - /data/Example/smvs:/code/smvs    
      - /data/Example/submodels:/code/submodels
    logging:
        driver: "json-file"
        options:
            max-file: "5"
            max-size: "10m"

If I remove the last volume (“- /data/Example/submodels:/code/submodels”), the process works.
So the problem only occurs when the submodels folder is bound to a Docker volume.
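
For comparison, one commonly documented way to run ODM in Docker is to mount a single project directory and point ODM at it with --project-path, instead of binding every output folder separately. A minimal compose sketch of that layout (untested here; the paths only mirror the setup above, and whether it sidesteps the submodels problem would still need to be verified):

services:
  ODM:
    container_name: ODM
    image: opendronemap/odm:latest
    command: --project-path /datasets --dtm --dsm --build-overviews --pc-skip-geometric --split 1000
    volumes:
      - /data/Example:/datasets/code

With this single bind, ODM reads images from /data/Example/images and writes all outputs, including submodels, back under /data/Example on the host, so no per-folder mounts are needed.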

What should be the expected behavior? If this is a feature request, please describe in detail the changes you think should be made to the code, citing files and lines where changes should be made, if possible.

Creating submodels should also work when the submodels folder is bound to a Docker volume.

How can we reproduce this? What steps did you do to trigger the problem? If this is an issue with processing a dataset, YOU MUST include a copy of your dataset AND task output log, uploaded on Google Drive or Dropbox (otherwise we cannot reproduce this).

It should be possible to reproduce this with any dataset.

Complete logfile: logfile.log


github-actions bot commented Jun 4, 2024

Thanks for the report, but it looks like you didn't include a copy of your dataset for us to reproduce this issue? Please make sure to follow our issue guidelines 🙏

p.s. I'm just an automated script, not a human being.

@github-actions github-actions bot closed this as completed Jun 4, 2024
@smathermather smathermather reopened this Jun 4, 2024
@smathermather
Contributor

Is this perchance on a Windows machine or a disk type that doesn't support symlinks?

@Saijin-Naib
Contributor

Saijin-Naib commented Jun 5, 2024

NTFS does support symlinks, though I am not sure if we call out to mklink to generate them 🤔

All bets are off on NFS/SMB drives I think, though.
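
Since a bind mount keeps the host filesystem's semantics, a quick sanity check is to try creating a symlink directly on the host inside the bound directory (paths taken from the compose file above; the link name is just an example):

# create and remove a test symlink inside the bound submodels directory
ln -s /data/Example /data/Example/submodels/symlink_test && echo "symlinks supported"
rm /data/Example/submodels/symlink_test

If this succeeds, basic symlink support on the volume is unlikely to be the culprit.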

@CarstenL
Author

CarstenL commented Jun 5, 2024

Docker is running in an Ubuntu 24.04 LTS Hyper-V VM.
The host is Windows Server 2022 21H2.

Extract from /etc/fstab:

/dev/disk/by-uuid/4d8b4b00-9ed0-4570-8422-9dd5848ddc48 /var/lib/docker ext4 defaults 1 2
/dev/disk/by-uuid/91640669-ac84-45d4-b029-ae88cd895358 /data ext4 defaults 1 2

The first one works, the second one does not.
