The repository contains the software used to obtain the DeepHistReg results in the ANHIR challenge. Its purpose is to make the results fully reproducible. Please contact us in case of any technical problems. For the iterative MIND-Demons method (the best median-of-median rTRE among all participants, but with a significantly higher registration time), please visit: ANHIR-AGH
There are two ways to use the code:
- To reproduce the results
- To improve the registration procedure and perform your own experiments
If you do not want to use the code, or do not have time to, and your goal is just to obtain the results, contact us. We will provide the submission file used to create the ANHIR submission, with the transformed landmarks and the reported registration time.
To reproduce the results without performing the training from scratch, perform the following steps:
- Prepare Dataset
- Use the file to convert the original ANHIR dataset into the format used by the DeepHistReg framework.
- Contact the authors for access to the pretrained models and use: Main File.
- First point the model variables to the appropriate paths (documented in the file), then run the file.
- Create the submission from the registration output using Submission File.
- Add your own machine benchmark file to the submission folder and zip the folder (see the challenge website: ANHIR Website).
- The submission is ready.
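The final packaging step can be sketched in Python. Note that the folder layout and file names below are placeholders, not names from this repository; use the names required by the ANHIR website.

```python
import pathlib
import shutil

def package_submission(submission_dir: str, benchmark_file: str) -> str:
    """Copy a machine benchmark file into the submission folder and zip it.

    A hedged sketch of the packaging step; all paths are supplied by the
    caller and nothing here is specific to the DeepHistReg scripts.
    Returns the path of the created archive ("<submission_dir>.zip").
    """
    submission = pathlib.Path(submission_dir)
    # Place the benchmark file alongside the registration output.
    shutil.copy(benchmark_file, submission / pathlib.Path(benchmark_file).name)
    # Zip the folder contents; the archive is created next to the folder.
    return shutil.make_archive(str(submission), "zip", root_dir=submission)
```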
To improve the results and perform your own training, the following steps are necessary:
- Prepare Dataset
- Use the file to convert the original ANHIR dataset into the format used by the DeepHistReg framework.
- Use Main File with only the initial alignment to create training set for the affine registration. Then use Prepare Dataset to prepare the dataset for the affine training.
- Use Affine Reg to train the affine registration network.
- Use Main File with the initial alignment and the affine registration to create the training set for the nonrigid registration. Then use Prepare Dataset to prepare the dataset for the nonrigid training.
- Use Nonrigid Reg to train the nonrigid registration network.
- Use Main File with the whole framework turned on to create the final registration results.
- Create the submission from the registration output using Submission File.
- Add your own machine benchmark file to the submission folder and zip the folder (see the challenge website: ANHIR Website).
- The submission is ready.
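The training workflow above alternates between generating intermediate results and training the next stage. As a schematic only (the stage names are descriptive placeholders, not identifiers from this repository), the order is:

```python
# Schematic of the training order described above; each tuple is
# (action, target) and stands in for running the corresponding script.
TRAINING_PIPELINE = [
    ("run", "initial alignment only"),      # create the affine training set
    ("prepare", "affine training data"),
    ("train", "affine registration network"),
    ("run", "initial alignment + affine"),  # create the nonrigid training set
    ("prepare", "nonrigid training data"),
    ("train", "nonrigid registration network"),
    ("run", "full framework"),              # final registration results
]

def stage_summary(pipeline=TRAINING_PIPELINE):
    """Render the pipeline as human-readable stage descriptions."""
    return [f"{action}: {target}" for action, target in pipeline]
```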
For dataset access and the full description, please visit the ANHIR Website. If you find the dataset useful, please cite the appropriate publications.
The software depends on the following packages:
- PyTorch
- NumPy
- SciPy
- Matplotlib
- SimpleITK
- Pandas
- PyTorch-Summary
The software was tested on Ubuntu 18.04 LTS with Python 3.6 or newer.
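A quick way to verify the environment before running the scripts is to check that the listed packages are importable. The module names below are the usual import names; `torchsummary` is assumed (not confirmed by this repository) to be the import name for PyTorch-Summary.

```python
import importlib.util
import sys

# Usual import names for the dependencies listed above; "torchsummary"
# for PyTorch-Summary is an assumption.
REQUIRED = ["torch", "numpy", "scipy", "matplotlib",
            "SimpleITK", "pandas", "torchsummary"]

def missing_packages(names=REQUIRED):
    """Return the packages that cannot be imported in this environment."""
    return [name for name in names if importlib.util.find_spec(name) is None]

def python_version_ok(minimum=(3, 6)):
    """Check the tested Python version requirement (>= 3.6)."""
    return sys.version_info >= minimum
```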
If you find the software useful, please cite:
- Marek Wodzinski and Henning Müller, DeepHistReg: Unsupervised Deep Learning Registration Framework for Differently Stained Histology Samples. Computer Methods and Programs in Biomedicine Vol. 198, January 2021.
https://www.sciencedirect.com/science/article/pii/S0169260720316321
The article presents the whole DeepHistReg framework, with deep segmentation, initial alignment, affine registration, and improved deformable registration.
- Marek Wodzinski and Henning Müller, Unsupervised Learning-based Nonrigid Registration of High Resolution Histology Images, 11th International Workshop on Machine Learning in Medical Imaging (MICCAI-MLMI), 2020.
https://link.springer.com/chapter/10.1007/978-3-030-59861-7_49
The article introduces the first version of the nonrigid registration.
- Marek Wodzinski and Henning Müller, Learning-based Affine Registration of Histological Images, 9th International Workshop on Biomedical Image Registration (WBIR), 2020.
https://link.springer.com/chapter/10.1007/978-3-030-50120-4_2
The article introduces the patch-based, resolution-independent affine registration that is part of the framework.
You may also find it useful to cite (note, however, that DeepHistReg is not part of the challenge summary article):
- J. Borovec et al., ANHIR: Automatic Non-rigid Histological Image Registration Challenge, IEEE Transactions on Medical Imaging, 2020, DOI: 10.1109/TMI.2020.2986331 https://ieeexplore.ieee.org/document/9058666