NeX360: Real-time All-around View Synthesis with Neural Basis Expansion

Pakkapon Phongthawee*
Suttisak Wizadwongsa*
Jiraphon Yenphraphai
Supasorn Suwajanakorn
VISTEC - Vidyasirimedhi Institute of Science and Technology
Rayong, Thailand

*equal contribution

Abstract

We present NeX, a new approach to novel view synthesis based on enhancements of multiplane images (MPI) that can reproduce view-dependent effects in real time. Unlike traditional MPI, our technique parameterizes each pixel as a linear combination of spherical basis functions learned from a neural network to model view-dependent effects and uses a hybrid implicit-explicit modeling strategy to improve fine detail. We also present an extension to NeX that leverages knowledge distillation to train multiple MPIs for unbounded 360° scenes. Our method is evaluated on several benchmark datasets: the NeRF-Synthetic dataset, the Light Field dataset, the Real Forward-Facing dataset, the Space dataset, as well as Shiny, our new dataset containing significantly more challenging view-dependent effects, such as the rainbow reflections on a CD. Our method outperforms other real-time rendering approaches on PSNR, SSIM, and LPIPS and can render unbounded 360° scenes in real time.
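The per-pixel color model described in the abstract can be sketched as follows. This is a minimal illustration of a basis-expansion color model, not the paper's implementation: the function name, array shapes, and toy values are all hypothetical, and in the actual method the basis functions are predicted by a neural network from the viewing direction.

```python
import numpy as np

def pixel_color(k, basis_values):
    """Illustrative basis-expansion color model.

    k            : (N+1, 3) per-pixel RGB coefficients; k[0] is the
                   view-independent base color, k[1:] are coefficients
                   for N view-dependent basis functions.
    basis_values : (N,) values of the basis functions H_n(v) evaluated
                   at the current viewing direction v.
    Returns the RGB color k0 + sum_n k_n * H_n(v).
    """
    return k[0] + basis_values @ k[1:]

# Toy example with N = 2 basis functions.
k = np.array([[0.5, 0.5, 0.5],   # base color k0
              [0.2, 0.0, 0.0],   # coefficient k1
              [0.0, 0.1, 0.0]])  # coefficient k2
H = np.array([1.0, 0.5])         # hypothetical basis values for one view
print(pixel_color(k, H))         # [0.7, 0.55, 0.5]
```

Because the coefficients `k` are stored explicitly per pixel while only the shared basis functions depend on the view, the view-dependent color can be evaluated with a single small dot product per pixel at render time, which is what makes real-time playback feasible.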

Video renderings

More results

Real-time demos

*A high-end GPU is recommended for PNG MPIs or full resolution.

NeRF-Synthetic dataset

Lego
Chair
Ficus
Hotdog
Materials
Mic
Ship

Light Field (LF) dataset