
Staying in Shape: Learning Invariant Shape Representations using Contrastive Learning

This is the original PyTorch implementation of the Staying in Shape paper:

@article{gu2021staying,
  title={Staying in Shape: Learning Invariant Shape Representations using Contrastive Learning},
  author={Gu, Jeffrey and Yeung, Serena},
  journal={arXiv preprint arXiv:2107.03552},
  year={2021}
}

Preparation

Download the aligned version of ModelNet40 here and ShapeNet here. You will also need to create a `models` folder in the base directory.

Unsupervised Training

An example training command for the unsupervised pre-training of our model is

python main_moco_shape.py \
  [your shapenet folder] -d ShapeNet \
  --lr 0.0075 \
  --batch-size 64 \
  --dist-url 'tcp://localhost:10001' --multiprocessing-distributed --world-size 1 --rank 0 \
  --mlp --moco-t 0.2 --aug-plus --cos \
  --model-name [your model name] --orth

The available augmentations are --orth, --rip, --perturb, --interp, --rotation, and --y-rotation, which are described in the paper. The multiple-data-augmentation settings in the paper use the --rand flag, which applies one randomly chosen augmentation out of those provided to the model, as opposed to applying them all sequentially. Models are saved in models/.
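To make the --rand behavior concrete, here is a minimal illustrative sketch (not the repository's code; the function and augmentation names are hypothetical) of the difference between picking one random augmentation per sample and applying all selected augmentations in sequence:

```python
import random

def augment(points, augmentations, rand=False):
    """points: list of 3D coordinates; augmentations: list of callables.

    Hypothetical sketch of the --rand semantics described above.
    """
    if rand:
        # --rand: apply one randomly chosen augmentation
        aug = random.choice(augmentations)
        return aug(points)
    # default: apply every selected augmentation sequentially
    for aug in augmentations:
        points = aug(points)
    return points

# Toy stand-ins for real augmentations such as --orth or --perturb
scale = lambda pts: [[2 * c for c in p] for p in pts]
shift = lambda pts: [[c + 1 for c in p] for p in pts]

print(augment([[1.0, 1.0, 1.0]], [scale, shift]))  # → [[3.0, 3.0, 3.0]]
```

With rand=False both toy augmentations compose (scale then shift); with rand=True only one of them would be applied to each sample.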

2-layer MLP Classification

With a pre-trained model, to train a supervised 2-layer MLP classifier, run:

python main_lincls.py \
  [your path to modelnet40] \
  --lr 0.01 \
  --batch-size 128 \
  --pretrained [your checkpoint path]/checkpoint_0199.pth.tar \
  --dist-url 'tcp://localhost:10001' --multiprocessing-distributed --world-size 1 --rank 0 \
  --mlp \
  --model-name [your classification model name]

To run robustness experiments, the same augmentation flags are available as in the Unsupervised Training section.

See Also

This repository is based on this implementation of the MoCo paper and MoCo v2 paper:

@article{he2019moco,
  author  = {Kaiming He and Haoqi Fan and Yuxin Wu and Saining Xie and Ross Girshick},
  title   = {Momentum Contrast for Unsupervised Visual Representation Learning},
  journal = {arXiv preprint arXiv:1911.05722},
  year    = {2019},
}
@article{chen2020mocov2,
  author  = {Xinlei Chen and Haoqi Fan and Ross Girshick and Kaiming He},
  title   = {Improved Baselines with Momentum Contrastive Learning},
  journal = {arXiv preprint arXiv:2003.04297},
  year    = {2020},
}

We also based some code on this implementation of PointNet/PointNet++:

@article{Pytorch_Pointnet_Pointnet2,
  author  = {Xu Yan},
  title   = {Pointnet/Pointnet++ Pytorch},
  journal = {https://github.com/yanx27/Pointnet_Pointnet2_pytorch},
  year    = {2019}
}

About

[UAI 2021] code for "Staying in Shape: Learning Invariant Shape Representations using Contrastive Learning"
