Relative Rotation

Original implementation of the paper "Unsupervised Domain Adaptation through Inter-modal Rotation for RGB-D Object Recognition": https://arxiv.org/pdf/2004.10016.pdf
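
The core idea of the paper is an inter-modal self-supervised pretext task: the RGB and depth crops of the same object are rotated independently by multiples of 90 degrees, and a pretext head predicts the relative rotation between the two modalities (4 classes). The snippet below is only a minimal sketch of this labeling scheme for intuition; the function name and array conventions are ours and do not correspond to the data pipeline in code/.

import numpy as np

def relative_rotation_sample(rgb, depth, rng=np.random):
    # Sketch only (not the repository's implementation): rotate the RGB and
    # depth crops of one RGB-D pair independently by k * 90 degrees and use
    # the relative rotation between the two modalities as the pretext label.
    # rgb, depth: H x W x C numpy arrays with the same spatial size.
    k_rgb = rng.randint(4)                        # rotation applied to the RGB crop
    k_depth = rng.randint(4)                      # rotation applied to the depth crop
    rgb_rot = np.rot90(rgb, k=k_rgb, axes=(0, 1)).copy()
    depth_rot = np.rot90(depth, k=k_depth, axes=(0, 1)).copy()
    label = (k_depth - k_rgb) % 4                 # one of 4 relative rotations
    return rgb_rot, depth_rot, label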

Requirements:

Instructions:

  1. Download the datasets (see link below) and extract them into a directory <dataset_dir> (skip to point 4 to run without Docker)
  2. To execute the code within a Docker container, build the image with docker build -t <container_name> .
  3. Start the container with docker run -it --runtime=nvidia --shm-size 16G -v <dataset_dir>:<dataset_dir> <container_name> bash
  4. To train the network, run:
python code/train.py \
--source synHB \
--target valHB \
--epoch 40 \
--batch_size 64 \
--lr 0.0003 \
--weight_rot 1 \
--weight_ent 0.1 \
--data_root_source '<dataset_dir>/HB_Syn_crops_square' \
--train_file_source '<dataset_dir>/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_train1.txt' \
--test_file_source '<dataset_dir>/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_test1.txt' \
--data_root_target '<dataset_dir>/HB_val_crops_square' \
--train_file_target '<dataset_dir>/HB_val_crops_square/HB_val_crops_25k-split_sync.txt' \
--test_file_target '<dataset_dir>/HB_val_crops_square/HB_val_crops_25k-split_sync.txt'
  5. To evaluate a trained model, run (see also the launcher sketch after this list):
python code/eval.py \
--source synHB \
--target valHB \
--epoch 40 \
--batch_size 64 \
--lr 0.0003 \
--weight_rot 1 \
--weight_ent 0.1 \
--data_root_source '<dataset_dir>/HB_Syn_crops_square' \
--train_file_source '<dataset_dir>/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_train1.txt' \
--test_file_source '<dataset_dir>/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_test1.txt' \
--data_root_target '<dataset_dir>/HB_val_crops_square' \
--train_file_target '<dataset_dir>/HB_val_crops_square/HB_val_crops_25k-split_sync.txt' \
--test_file_target '<dataset_dir>/HB_val_crops_square/HB_val_crops_25k-split_sync.txt'
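
The training and evaluation commands share the same arguments, so they can also be launched from a small wrapper. The script below is a hedged convenience sketch, not part of this repository: the file name (run_hb.py), the DATASET_DIR handling, and the train/eval switch are our own, while all flag names and values are copied verbatim from the commands above.

import subprocess
import sys

# Hypothetical launcher (e.g. run_hb.py); not part of this repository.
# Usage: python run_hb.py <dataset_dir> [train|eval]
DATASET_DIR = sys.argv[1]
MODE = sys.argv[2] if len(sys.argv) > 2 else "train"

common_args = [
    "--source", "synHB",
    "--target", "valHB",
    "--epoch", "40",
    "--batch_size", "64",
    "--lr", "0.0003",
    "--weight_rot", "1",
    "--weight_ent", "0.1",
    "--data_root_source", f"{DATASET_DIR}/HB_Syn_crops_square",
    "--train_file_source", f"{DATASET_DIR}/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_train1.txt",
    "--test_file_source", f"{DATASET_DIR}/HB_Syn_crops_square/HB_Syn_crops_25k-split_sync_test1.txt",
    "--data_root_target", f"{DATASET_DIR}/HB_val_crops_square",
    "--train_file_target", f"{DATASET_DIR}/HB_val_crops_square/HB_val_crops_25k-split_sync.txt",
    "--test_file_target", f"{DATASET_DIR}/HB_val_crops_square/HB_val_crops_25k-split_sync.txt",
]

# Pick the entry point and launch it with the shared arguments.
script = "code/train.py" if MODE == "train" else "code/eval.py"
subprocess.run([sys.executable, script, *common_args], check=True)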

Download:

For more information about the datasets presented and used in our experiments, please visit our webpage: https://www.acin.tuwien.ac.at/vision-for-robotics/software-tools/synthetic-to-real-rgbd-datasets/

Contributors:

Citation:

@article{loghmani2020unsupervised,
  title={Unsupervised Domain Adaptation through Inter-modal Rotation for RGB-D Object Recognition},
  author={Loghmani, Mohammad Reza and Robbiano, Luca and Planamente, Mirco and Park, Kiru and Caputo, Barbara and Vincze, Markus},
  journal={arXiv preprint arXiv:2004.10016},
  year={2020}
}
