Unofficial implementation of Tensorial Radiance Fields (Chen et al., 2022)
JAX implementation of Tensorial Radiance Fields, written as an exercise.
```bibtex
@misc{TensoRF,
  title={TensoRF: Tensorial Radiance Fields},
  author={Anpei Chen and Zexiang Xu and Andreas Geiger and Jingyi Yu and Hao Su},
  year={2022},
  eprint={2203.09517},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
```
We don't attempt to reproduce the original paper exactly, but we can achieve decent results after 5–10 minutes of training:
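For context, TensoRF's core idea is to model the radiance field as a low-rank tensor factorization. Here is a minimal sketch of a vector-matrix (VM) density lookup at an integer grid index, with hypothetical names and shapes (not this repo's actual API):

```python
import jax.numpy as jnp

def vm_density(vectors, matrices, idx):
    """Density from a rank-R vector-matrix (VM) decomposition.

    vectors:  three (R, N) arrays, one 1D factor per axis.
    matrices: three (R, N, N) arrays, one 2D factor per complementary plane.
    idx:      integer grid index (i, j, k).
    """
    i, j, k = idx
    # Each axis vector pairs with the matrix over the two remaining axes,
    # summed over the R rank components.
    return (jnp.sum(vectors[0][:, i] * matrices[0][:, j, k])
            + jnp.sum(vectors[1][:, j] * matrices[1][:, i, k])
            + jnp.sum(vectors[2][:, k] * matrices[2][:, i, j]))
```

In practice the factors are interpolated at continuous sample positions rather than indexed at grid points, but the storage win is the same: three vectors and three matrices instead of a dense 3D grid.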
As proposed, TensoRF only supports scenes that fit in a fixed-size bounding box. We've also added basic support for unbounded "real" scenes via mip-NeRF 360-inspired scene contraction[^1]. From nerfstudio's "dozer" dataset:
[^1]: Same as the original, but with an $L_\infty$ norm instead of the $L_2$ norm.
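A minimal sketch of that contraction, assuming the mip-NeRF 360 formula with the norm swapped (the function name is illustrative, not this repo's API):

```python
import jax.numpy as jnp

def contract(x):
    """Map unbounded points into a cube of half-width 2.

    Identity inside the unit cube; points outside are squashed so the
    whole scene fits in [-2, 2]^3.
    """
    # L-infinity norm per point, instead of mip-NeRF 360's L2 norm.
    norm = jnp.maximum(jnp.max(jnp.abs(x), axis=-1, keepdims=True), 1e-9)
    return jnp.where(norm <= 1.0, x, (2.0 - 1.0 / norm) * x / norm)
```

With the $L_\infty$ norm the contracted space is a cube rather than a ball, which pairs naturally with TensoRF's axis-aligned feature grids.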
Download the nerf_synthetic dataset: Google Drive. With the default training script arguments, we expect it to be extracted to `./data`, e.g. `./data/nerf_synthetic/lego`.
Install dependencies. You probably want the GPU version of JAX; see the official installation instructions. Then:

```bash
pip install -r requirements.txt
```
To print training options:

```bash
python ./train_lego.py --help
```
To monitor training, we use TensorBoard:

```bash
tensorboard --logdir=./runs/
```
To render:

```bash
python ./render_360.py --help
```
Implementation details aren't fully matched to the official implementation; they are based loosely on the original PyTorch code, apchenstu/TensoRF.
unixpickle/learn-nerf and google-research/jaxnerf were also very helpful for understanding core NeRF concepts and connecting them to JAX.