
Windows-compatible repository for running the pix2surf code from the paper "Learning to Transfer Texture from Clothing Images to 3D Humans"


minar09/pix2surf_windows


Disclaimer

This is a modified version of the original repository Pix2Surf, for running the demo and visualization on Windows. Please refer to the original repository for details.

Learning to Transfer Texture from Clothing Images to 3D Humans (CVPR 2020)

This repository contains code corresponding to the paper "Learning to Transfer Texture from Clothing Images to 3D Humans"

Demo

To run the demo you will need Blender. This code has been tested with Blender 2.79; please download it from here. After installation, add the Blender directory to your PATH environment variable. The demo has also been tested with Python 3.6.9.

  1. Clone/Download the repo.

  2. Install the requirements: pip install -r requirements.txt

  3. Download pretrained weights and other assets from here, and unzip inside the repository folder.

  4. Running the demo is as simple as: python demo.py

The script produces a video in which the front and back views of a T-shirt and a pair of shorts are rendered atop a textured SMPL mesh. By changing the parameters in the script, different textures and garment classes can be rendered atop SMPL. Example:

python demo.py --pose_id 2 --img_id 4 --low_type pants
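The flags in the example above imply a small command-line interface. The actual demo.py lives in the repository; a minimal argparse sketch consistent with those flags (the defaults and the set of valid garment types are assumptions, not the repo's actual values) might look like:

```python
import argparse

def build_demo_parser():
    # Sketch of the interface implied by the example command above; defaults
    # and the choices for --low_type are assumptions for illustration only.
    parser = argparse.ArgumentParser(description="Render clothing textures onto an SMPL mesh")
    parser.add_argument("--pose_id", type=int, default=0, help="index of the SMPL pose to render")
    parser.add_argument("--img_id", type=int, default=0, help="index of the clothing image pair")
    parser.add_argument("--low_type", default="shorts", choices=["shorts", "pants"],
                        help="lower-garment class to render")
    return parser

# Parsing the example invocation from above
args = build_demo_parser().parse_args(["--pose_id", "2", "--img_id", "4", "--low_type", "pants"])
```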

We provide five pairs of upper and lower clothing images for running our demo script. Please note that we do not own the copyrights to the clothing images. They are released merely for demonstration and should not be used for any other purpose. By executing the script to download data, you automatically consent to the license agreement of the SMPL body model.

Training

The training data for all neural models was obtained from the following websites:

  1. Zalando
  2. Jack and Jones
  3. Tom-Tailor

We do not own the copyrights to these images. They can be downloaded using a web scraper. Once this is done, we obtain silhouettes of the clothing images through a mixture of manual and automatic application of GrabCut.

Sample masks and texture images are stored in the ./train/data directory. The code for obtaining the correspondence and texture maps is in the ./prep_data directory. All three scripts (silhouette matching, correspondence extraction, and texture map extraction) can be executed with the command:

python ./prep_data/run.py
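The internals of run.py are not reproduced here; a minimal orchestration sketch that runs the three preparation steps in sequence (the step script names below are hypothetical, not the repo's actual filenames) could look like:

```python
import subprocess
import sys

def run_steps(steps):
    """Run each preparation step as a subprocess, stopping at the first failure.

    'steps' is a list of (name, argv) pairs; each argv is launched with
    subprocess.run and a nonzero exit code aborts the pipeline.
    """
    for name, argv in steps:
        result = subprocess.run(argv)
        if result.returncode != 0:
            raise RuntimeError(f"{name} failed with exit code {result.returncode}")

# Hypothetical wiring; the real run.py may import and call the steps directly.
PREP_STEPS = [
    ("silhouette matching", [sys.executable, "./prep_data/match_silhouettes.py"]),
    ("correspondence extraction", [sys.executable, "./prep_data/extract_correspondences.py"]),
    ("texture map extraction", [sys.executable, "./prep_data/extract_texture_maps.py"]),
]
```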

The dependencies for running the three scripts are listed in the requirements_prep_data.txt file.

Once the data has been obtained, the mapping and segmentation networks can be trained using the scripts provided in the train directory with the commands:

python train_seg.py

python train_map.py

Citation

If you find the code useful, please consider citing the paper:

@inproceedings{mir20pix2surf,
    title = {Learning to Transfer Texture from Clothing Images to 3D Humans},  
    author = {Mir, Aymen and Alldieck, Thiemo and Pons-Moll, Gerard},  
    booktitle = {{IEEE} Conference on Computer Vision and Pattern Recognition (CVPR)}, 
    month = {jun},
    organization = {{IEEE}},
    year = {2020},
} 

License

This code is available for non-commercial scientific research purposes as defined in the LICENSE file. By downloading and using this code you agree to the terms in the LICENSE.