Simple PyTorch implementation of NeRF (Neural Radiance Fields).
git clone https://github.com/murumura/NeRF-Simple.git
cd NeRF-Simple
pip install -r environment.txt
A Dockerfile for building this project's environment is provided. Those familiar with basic Docker operations are welcome to run this project directly through the provided docker/dockerfile and the accompanying build/run scripts (docker/docker_build.sh and docker/docker_run.sh).
git clone https://github.com/murumura/NeRF-Simple.git
cd NeRF-Simple
cd docker
sh docker_build.sh
Once everything is set up, to run experiments, first edit configs/lego.conf to specify your own parameters.
(For details of the training options, please refer to src/opt.py.)
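If you just want to see which options a config exposes before editing it, a quick way is to parse and print it. This is only a sketch: it assumes the .conf files are HOCON-style and readable with pyhocon (an assumption; src/opt.py is the authoritative reference for how options are actually parsed).

import pprint
from pyhocon import ConfigFactory  # assumption: configs are HOCON-style

# Parse the config and list every option it defines.
conf = ConfigFactory.parse_file("configs/lego.conf")
pprint.pprint(dict(conf))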
Invoke the training procedure with:
python src/train.py \
--conf_path configs/lego.conf \
--exp_tag tag-you-want
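For context, training optimizes the radiance-field MLP through volume rendering: densities and colors sampled along each camera ray are composited into a pixel color and compared against the ground-truth pixel with a photometric loss. The snippet below is a minimal sketch of that compositing step using the standard NeRF formulation; it is not this repo's code, and sigmas, rgbs, and deltas stand in for hypothetical per-sample tensors produced by the model and the ray sampler.

import torch

def composite(sigmas, rgbs, deltas):
    """Composite per-sample densities/colors along each ray (standard NeRF math)."""
    # alpha_i = 1 - exp(-sigma_i * delta_i): opacity contributed by sample i.
    alphas = 1.0 - torch.exp(-sigmas * deltas)
    # Transmittance T_i = prod_{j < i} (1 - alpha_j): exclusive cumulative product.
    ones = torch.ones_like(alphas[..., :1])
    trans = torch.cumprod(torch.cat([ones, 1.0 - alphas + 1e-10], dim=-1), dim=-1)[..., :-1]
    weights = alphas * trans                       # (num_rays, num_samples)
    rgb = (weights[..., None] * rgbs).sum(dim=-2)  # (num_rays, 3)
    return rgb, weights

# Hypothetical usage inside a training step:
# rgb_pred, _ = composite(sigmas, rgbs, deltas)
# loss = torch.nn.functional.mse_loss(rgb_pred, rgb_gt)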
Monitor the training procedure through TensorBoard:
tensorboard --logdir=./exp/tag-you-want/exp_name/logs --host=0.0.0.0 --port=6006
To export the synthesized result as a mesh, you additionally need to install PyMCubes:
pip install PyMCubes
Then invoke mesh reconstruction with:
python src/eval.py \
--conf configs/lego.conf \
--ckpt_path pretrained/lego.pth \
--output_path ./output \
--mesh_name lego.obj \
--iso_level 90 \
--limit 1.2 \
--sample_resolution 128
Pretrained checkpoints are expected under pretrained/, for example:
pretrained/
└── lego.pth
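For reference, mesh export from a trained NeRF generally works by querying the density network on a regular 3D grid inside [-limit, limit]^3 and running marching cubes at the chosen iso-level, which is what the --iso_level, --limit and --sample_resolution flags correspond to. The sketch below illustrates that general technique with PyMCubes; it is not this repo's eval.py, and query_density stands in for a hypothetical function returning the trained model's density at a batch of 3D points.

import numpy as np
import torch
import mcubes  # PyMCubes

def extract_mesh(query_density, limit=1.2, resolution=128, iso_level=90.0,
                 out_path="lego.obj"):
    """Sample density on a [-limit, limit]^3 grid and run marching cubes."""
    # Build a regular grid of 3D query points.
    t = np.linspace(-limit, limit, resolution, dtype=np.float32)
    xs, ys, zs = np.meshgrid(t, t, t, indexing="ij")
    pts = np.stack([xs, ys, zs], axis=-1).reshape(-1, 3)

    # Query the (hypothetical) density function in chunks to bound memory.
    densities = []
    with torch.no_grad():
        for chunk in np.array_split(pts, max(1, len(pts) // 65536)):
            densities.append(query_density(torch.from_numpy(chunk)).cpu().numpy())
    grid = np.concatenate(densities).reshape(resolution, resolution, resolution)

    # Marching cubes at the chosen iso-level, then export an .obj mesh.
    vertices, triangles = mcubes.marching_cubes(grid, iso_level)
    vertices = vertices / (resolution - 1) * (2 * limit) - limit  # grid index -> world
    mcubes.export_obj(vertices, triangles, out_path)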
You can download the Blender dataset from the paper authors' link here. Unzip and place the downloaded directory in ./data/datasets for later training. See the following directory structure for an example:
data/
└── datasets
└── nerf_synthetic
├── chair
├── drums
├── ficus
├── hotdog
├── lego
├── materials
├── mic
├── README.txt
└── ship
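Each scene in this dataset stores its camera metadata in transforms_train.json / transforms_val.json / transforms_test.json: a global camera_angle_x (horizontal field of view), plus a file_path and a 4x4 camera-to-world transform_matrix for every frame. A minimal loader sketch for that format, independent of this repo's data pipeline, looks like this:

import json
import os
import numpy as np
from PIL import Image

def load_blender_split(scene_dir, split="train"):
    """Load images, 4x4 camera-to-world poses and the focal length for one split."""
    with open(os.path.join(scene_dir, f"transforms_{split}.json")) as f:
        meta = json.load(f)

    images, poses = [], []
    for frame in meta["frames"]:
        img_path = os.path.join(scene_dir, frame["file_path"] + ".png")
        images.append(np.asarray(Image.open(img_path), dtype=np.float32) / 255.0)
        poses.append(np.asarray(frame["transform_matrix"], dtype=np.float32))

    images = np.stack(images)  # (N, H, W, 4) -- images in this dataset are RGBA
    poses = np.stack(poses)    # (N, 4, 4)
    width = images.shape[2]
    focal = 0.5 * width / np.tan(0.5 * meta["camera_angle_x"])  # pinhole focal length
    return images, poses, focal

# Example: images, poses, focal = load_blender_split("data/datasets/nerf_synthetic/lego")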