nuaazs/SARUNet-Pytorch

SARU-Net: A Self Attention ResUnet to generate synthetic CT images for MRI-only BNCT treatment planning



Todo List:

  • SARU++
  • 3D SARU++

Links:

Backend: SARU-flask

Frontend: SARU-VUE

Topas utils: topas4bnct

Table of Contents

Preparation

  • Linux or macOS
  • Python 3
  • CPU or NVIDIA GPU + CUDA CuDNN

Environment setup

We recommend creating a new conda environment with all necessary packages. The repository includes a requirements file; create and activate the environment with

conda env create -f requirements.yml
conda activate attngan
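
To confirm the environment is usable, a quick sanity check like the one below can be run; it only assumes that PyTorch is installed by requirements.yml and that a CUDA device is present if you plan to train on GPU.

# Optional sanity check: verify PyTorch and CUDA visibility.
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))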

Dataset preparation

Organize your data so that the resulting directory structure looks similar to the following:

root
  datasets
    MRICT
      train
        patient_001_001.png
        ...
        patient_002_001.png
        ...
        patient_100_025.png
      test
        patient_101_001.png
        ...
        patient_102_002.png
        ...
        patient_110_025.png
      val
        patient_111_001.png
        ...
        patient_112_002.png
        ...

Our pretrained model was trained on 130+ patient cases, for a total of about 4,500 image pairs, using data augmentation such as random flipping, random scaling, and random cropping, as sketched below.
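
For reference, a minimal augmentation pipeline along these lines could look like the following sketch. The function name, scale range, and output size are illustrative placeholders, not the exact settings used for the pretrained model.

# Illustrative paired augmentation for MR/CT slices: random flip, scale, crop.
# Parameter values are placeholders, not the actual training configuration.
import random
import torchvision.transforms.functional as TF

def augment_pair(mr, ct, out_size=256):
    # Random horizontal flip, applied identically to both images
    if random.random() < 0.5:
        mr, ct = TF.hflip(mr), TF.hflip(ct)

    # Random up-scaling (>= 1.0 so the crop below always fits)
    scale = random.uniform(1.0, 1.2)
    new_size = int(out_size * scale)
    mr = TF.resize(mr, [new_size, new_size])
    ct = TF.resize(ct, [new_size, new_size])

    # Random crop back to the target size, same crop window for both images
    top = random.randint(0, new_size - out_size)
    left = random.randint(0, new_size - out_size)
    mr = TF.crop(mr, top, left, out_size, out_size)
    ct = TF.crop(ct, top, left, out_size, out_size)
    return mr, ct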

Pretrained weights

We release a pretrained set of weights to allow reproducibility of our results. The weights are downloadable from Google Drive (or Baidu Cloud). Once downloaded, unpack the file in the root of the project and test them with the inference notebook.

All models were trained on 2× NVIDIA TITAN V GPUs (12 GB each).
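
As a rough sketch, the unpacked generator weights can be restored with torch.load. The checkpoint path below follows the pix2pix naming convention and is hypothetical; adapt it, and the generator construction, to the actual files and modules in this repository.

# Hypothetical example of restoring the generator from unpacked weights.
# "checkpoints/attnunet-gan/latest_net_G.pth" is a placeholder path.
import torch

state_dict = torch.load("checkpoints/attnunet-gan/latest_net_G.pth",
                        map_location="cpu")
# netG = <construct the SARU generator here, e.g. via the networks module>
# netG.load_state_dict(state_dict)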

Training

The training routine of SARU is mainly based on the pix2pix codebase; details are available in the official repository.

To launch a default training, run

python train.py --data_root path/to/data --gpu_ids 0,1,2 --netG attnunet --netD basic --model pix2pix --name attnunet-gan
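
Since the training routine follows the pix2pix codebase, inference can presumably be run with a companion test script using matching options. The command below is an assumption based on the pix2pix conventions; check the flag names against this repository's test script before using it.

python test.py --data_root path/to/data --gpu_ids 0 --netG attnunet --model pix2pix --name attnunet-gan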

Code structure

To help users better understand and use our code, we briefly describe the functionality and implementation of each package and module below.

Citation

If you use this code for your research, please cite our paper.

@article{zhao2022saru,
  title={SARU: A self attention ResUnet to generate synthetic CT images for MR-only BNCT treatment planning},
  author={Zhao, Sheng and Geng, Changran and Guo, Chang and Tian, Feng and Tang, Xiaobin},
  journal={Medical Physics},
  year={2022},
  publisher={Wiley Online Library}
}

Related Projects

contrastive-unpaired-translation (CUT) | CycleGAN-Torch | pix2pix-Torch | pix2pixHD | BicycleGAN | vid2vid | SPADE/GauGAN | iGAN | GAN Dissection | GAN Paint

Acknowledgments

Our code is inspired by pytorch-CycleGAN-and-pix2pix and pytorch-CBAM.
