diff --git a/README.md b/README.md
index abb8463..6570963 100644
--- a/README.md
+++ b/README.md
@@ -1,8 +1,4 @@
- # SDCL: Students Discrepancy-Informed Correction Learning for Semi-supervised Medical Image Segmentation
-
- #### By [Bentao Song](), [Qingfeng Wang]()
-
-![MICCAI2024](https://img.shields.io/badge/MICCAI-2024-blue)
+ # SDCL: Students Discrepancy-Informed Correction Learning for Semi-supervised Medical Image Segmentation (MICCAI 2024)
 Pytorch implementation of our method for MICCAI 2024 paper: "SDCL: Students Discrepancy-Informed Correction Learning for Semi-supervised Medical Image Segmentation".[Paper Link](https://papers.miccai.org/miccai-2024/672-Paper0821.html)
 
 ## Contents
@@ -26,10 +22,12 @@ the current State-of-the-Art (SOTA) methods by 2.57%, 3.04%, and 2.34% in the Di
 
 ## Introduction
 Official code for "SDCL: Students Discrepancy-Informed Correction Learning for Semi-supervised Medical Image Segmentation".
-Due to the large size of the model parameter files, it is difficult to upload them anonymously. After our paper is accepted, we will publish both the pretrained model and the fully trained model. The proof for the kl_loss in the code can be found in the document "MICCAI2024_SDCL.pdf".
 
+## News
+2024/11/12
+We provide the SDCL model weights on [Google Drive](https://drive.google.com/file/d/18C5C8VEUnFFZwg-zG6pu1WPC0Bi3GLCe/view?usp=sharing).
 
 ## Requirements
 This repository is based on PyTorch 2.1.0, CUDA 12.1, and Python 3.8. All experiments in our paper were conducted on an NVIDIA GeForce RTX 4090 GPU with an identical experimental setting under Windows.
 
 ## Datasets
@@ -68,7 +66,7 @@ If our SDCL is useful for your research, please consider citing:
 organization={Springer}
 }
 
 
 ## Acknowledgements
 
-Our code is largely based on [BCP](https://github.com/DeepMed-Lab-ECNU/BCP). Thanks for these authors for their valuable work, hope our work can also contribute to related research.
+Our code is largely based on [BCP](https://github.com/DeepMed-Lab-ECNU/BCP) and [SSL4MIS](https://github.com/HiLab-git/SSL4MIS). Thanks to these authors for their valuable work; we hope our work can also contribute to related research.