
CompoNet: Learning to Generate the Unseen by Part Synthesis and Composition

Created by Nadav Schor, Oren Katzir, Hao Zhang, Daniel Cohen-Or.


Introduction

This work is based on our ICCV 2019 paper. We present CompoNet, a generative neural network for 3D shapes that is built on a part-based prior: the key idea is for the network to synthesize shapes by varying both the shape parts and their composition.
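The two-stage idea can be sketched schematically. In the sketch below, `part_generator` and `composition` are illustrative stubs standing in for the trained networks, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def part_generator(code, n_points=400):
    # Stub: decode a latent code into an (n_points, 3) part point cloud.
    return np.tanh(code[:3]) + 0.1 * rng.standard_normal((n_points, 3))

def composition(codes):
    # Stub: map the part codes to a per-part (scale, translation) placement.
    return [(1.0 + 0.1 * c[0], c[1:4]) for c in codes]

# One latent code per part (e.g. back, seat, legs, armrests for a chair).
codes = [rng.standard_normal(8) for _ in range(4)]
parts = [part_generator(c) for c in codes]
shape = np.concatenate(
    [s * p + t for p, (s, t) in zip(parts, composition(codes))], axis=0)
print(shape.shape)  # (1600, 3): four parts of 400 points each
```

Varying either the part codes or their composition yields new shapes, which is what lets the network generate combinations unseen during training.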

Citation

If you find our work useful in your research, please consider citing:

@InProceedings{Schor_2019_ICCV,
  author = {Schor, Nadav and Katzir, Oren and Zhang, Hao and Cohen-Or, Daniel},
  title = {CompoNet: Learning to Generate the Unseen by Part Synthesis and Composition},
  booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
  month = {October},
  year = {2019}
}

Dependencies

Requirements:

  • Python 2.7
  • TensorFlow (version 1.4+)
  • OpenCV (for visualization)

Our code has been tested with Python 2.7, TensorFlow 1.4.0, CUDA 8.0 and cuDNN 6.0 on Ubuntu 18.04.

Installation

Download the source code from the git repository:

git clone https://github.com/nschor/CompoNet

Compile the Chamfer loss op, under CompoNet/tf_ops/nn_distance, taken from Fan et al.

cd CompoNet/tf_ops/nn_distance

Modify the TensorFlow and CUDA paths in the tf_nndistance_compile.sh script, then run it:

sh tf_nndistance_compile.sh
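The compiled op computes the Chamfer distance between point clouds, which serves as the reconstruction loss. A minimal NumPy reference of the same quantity (an illustrative sketch, not the CUDA kernel; it assumes the common squared-distance convention) is:

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point sets a (n, 3) and b (m, 3),
    using squared Euclidean distances (a common convention)."""
    # Pairwise squared distances, shape (n, m).
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    # For each point, the distance to its nearest neighbor in the other set.
    return d.min(axis=1).mean() + d.min(axis=0).mean()

a = np.zeros((4, 3))
b = np.ones((5, 3))
print(chamfer_distance(a, b))  # 3.0 + 3.0 = 6.0
```

The CUDA op exists because the pairwise distance matrix above is far too slow and memory-hungry for training-scale point clouds.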

For visualization, go to utils/.

cd CompoNet/utils

Run the compile_render_balls_so.sh script.

sh compile_render_balls_so.sh
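The compiled library rasterizes point clouds into images for display. A rough NumPy-only sketch of the underlying idea (orthographic projection with single-pixel "balls"; the real renderer draws shaded, depth-sorted balls) is:

```python
import numpy as np

def render_points(points, size=128):
    """Orthographically project (n, 3) points onto the xy-plane and
    rasterize them into a size x size grayscale image (illustrative only)."""
    img = np.zeros((size, size), dtype=np.uint8)
    # Normalize x, y into [0, 1], then into pixel coordinates.
    xy = points[:, :2]
    xy = (xy - xy.min(axis=0)) / (np.ptp(xy, axis=0) + 1e-9)
    px = np.clip((xy * (size - 1)).astype(int), 0, size - 1)
    img[px[:, 1], px[:, 0]] = 255
    return img

pts = np.random.rand(400, 3)
img = render_points(pts)
print(img.shape, img.max())  # (128, 128) 255
```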

If you are using Anaconda, we have included the environment we used, CompoNet.yml, under anaconda_env/. Create the environment using:

cd CompoNet/anaconda_env
conda env create -f CompoNet.yml

Activate the environment:

source activate CompoNet

Data Set

Download the ShapeNetPart dataset by running the download_data.sh script under datasets/.

cd CompoNet/datasets
sh download_data.sh

The point clouds will be stored in CompoNet/datasets/shapenetcore_partanno_segmentation_benchmark_v0.
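In the common layout of this benchmark, each sample is a whitespace-separated .pts file (one "x y z" per line) with a matching .seg file of per-point part labels; verify this against your download. A loading sketch under that assumption:

```python
import numpy as np
import os
import tempfile

def load_sample(pts_path, seg_path):
    """Load one ShapeNetPart sample: an (n, 3) point cloud and its
    (n,) integer part labels (assumes the benchmark's text format)."""
    points = np.loadtxt(pts_path, dtype=np.float32)  # n rows of "x y z"
    labels = np.loadtxt(seg_path, dtype=np.int64)    # one label per point
    assert points.shape[0] == labels.shape[0]
    return points, labels

# Tiny synthetic sample in the same text format, for illustration.
d = tempfile.mkdtemp()
with open(os.path.join(d, "sample.pts"), "w") as f:
    f.write("0.1 0.2 0.3\n0.4 0.5 0.6\n")
with open(os.path.join(d, "sample.seg"), "w") as f:
    f.write("1\n2\n")
points, labels = load_sample(os.path.join(d, "sample.pts"),
                             os.path.join(d, "sample.seg"))
print(points.shape, labels.tolist())  # (2, 3) [1, 2]
```

The per-point labels are what let CompoNet split each shape into the semantic parts its generators are trained on.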

Train CompoNet

To train CompoNet on the Chair category with 400 points per part, run:

python train.py

Check the available options using:

python train.py -h

Generate Shapes Using the Trained Model

To generate new shapes and visualize them, run:

python test.py --category category --model_path model_path

Check the available options using:

python test.py -h

License

This project is licensed under the terms of the MIT license (see LICENSE for details).
