Neural Spectro-polarimetric Fields

POSTECH
SIGGRAPH ASIA 2023
Teaser.

We present neural spectro-polarimetric fields that allow for rendering a scene from novel views, not limited to the synthesis of (a) RGB radiance images. We can render (b) hyperspectral Stokes vectors for each individual light ray, providing a comprehensive description of its spectro-polarimetric properties. (c) It also inherently reduces noise in low-signal-to-noise-ratio spectro-polarimetric measurements, as shown in the degree of polarization (DoP). (d) We further show the analysis of spectro-polarimetric information of the CD that is otherwise invisible to the human eye.

Abstract

Modeling the spatial radiance distribution of light rays in a scene has been extensively explored for applications, including view synthesis. Spectrum and polarization, the wave properties of light, are often neglected due to their integration into three RGB spectral bands and their non-perceptibility to human vision. However, these properties are known to encompass substantial material and geometric information about a scene. Here, we propose to model spectro-polarimetric fields, the spatial Stokes-vector distribution of any light ray at an arbitrary wavelength.

We present Neural Spectro-polarimetric Fields (NeSpoF), a neural representation that models the physically-valid Stokes vector at given continuous variables of position, direction, and wavelength. NeSpoF manages inherently noisy raw measurements, showcases memory efficiency, and preserves physically vital signals, factors that are crucial for representing the high-dimensional signal of a spectro-polarimetric field. To validate NeSpoF, we introduce the first multi-view hyperspectral-polarimetric image dataset, comprising both synthetic and real-world scenes. These were captured using our compact hyperspectral-polarimetric imaging system, which has been calibrated for robustness against system imperfections. We demonstrate the capabilities of NeSpoF on diverse scenes.

Neural Spectro-polarimetric Field (NeSpoF)

We propose modeling the spectro-polarimetric field $f(\cdot)$ using NeSpoF, a neural representation that describes the Stokes vector $\mathbf{s}$ and volumetric density $\sigma$ for the input variables of position, direction, and wavelength.

$\mathbf{s}, \sigma = F_\Theta(x,y,z,\theta, \phi, \lambda)$

Network architecture.

We design NeSpoF with a positional MLP $F_\Theta^p$ and a spectro-directional MLP $F_\Theta^d$, and model a physically-valid Stokes vector ($s_0^2 \geq s_1^2+s_2^2+s_3^2$) by predicting intermediate polarimetric properties ($s_0$, $\rho$, $\chi$, and $\psi$) as outputs. Additionally, we perform coordinate conversion of Stokes vectors, since the polarization state of a light ray is described by different Stokes vectors in different coordinate frames.
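Below is a minimal PyTorch sketch of this two-stage design, assuming a NeRF-style split into a positional MLP and a spectro-directional MLP; the layer widths, encoding dimensions, and activations are illustrative rather than the paper's exact configuration. The key point is the parameterization: predicting $(s_0, \rho, \chi, \psi)$ with $\rho \in [0,1]$ guarantees $s_0^2 \geq s_1^2+s_2^2+s_3^2$ by construction.

    import torch
    import torch.nn as nn

    def stokes_from_polar(s0, rho, chi, psi):
        # Standard polarization-ellipse parameterization:
        #   s1 = s0*rho*cos(2*psi)*cos(2*chi)
        #   s2 = s0*rho*sin(2*psi)*cos(2*chi)
        #   s3 = s0*rho*sin(2*chi)
        # With rho in [0, 1], s1^2 + s2^2 + s3^2 = (s0*rho)^2 <= s0^2 holds by construction.
        s1 = s0 * rho * torch.cos(2 * psi) * torch.cos(2 * chi)
        s2 = s0 * rho * torch.sin(2 * psi) * torch.cos(2 * chi)
        s3 = s0 * rho * torch.sin(2 * chi)
        return torch.stack([s0, s1, s2, s3], dim=-1)

    class NeSpoFSketch(nn.Module):
        # Hypothetical layer widths and encoding sizes; not the paper's exact configuration.
        def __init__(self, pos_dim=63, dir_dim=27, wav_dim=9, width=256):
            super().__init__()
            self.pos_mlp = nn.Sequential(                          # positional MLP F^p
                nn.Linear(pos_dim, width), nn.ReLU(),
                nn.Linear(width, width), nn.ReLU(),
                nn.Linear(width, width + 1))                       # feature + density
            self.dir_mlp = nn.Sequential(                          # spectro-directional MLP F^d
                nn.Linear(width + dir_dim + wav_dim, width // 2), nn.ReLU(),
                nn.Linear(width // 2, 4))                          # (s0, rho, chi, psi)

        def forward(self, pos_enc, dir_enc, wav_enc):
            h = self.pos_mlp(pos_enc)
            sigma, feat = torch.relu(h[..., :1]), h[..., 1:]
            out = self.dir_mlp(torch.cat([feat, dir_enc, wav_enc], dim=-1))
            s0 = torch.relu(out[..., 0])          # non-negative intensity
            rho = torch.sigmoid(out[..., 1])      # degree of polarization in [0, 1]
            chi, psi = out[..., 2], out[..., 3]   # ellipticity angle, AoLP
            return stokes_from_polar(s0, rho, chi, psi), sigma

The predicted Stokes vector would then be rotated into a common per-ray coordinate frame, as described above, before volume rendering and comparison against measurements.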

Portable Hyperspectral-polarimetric Imager

We built a portable hyperspectral-polarimetric imaging system consisting of a monochromatic camera, a liquid crystal tunable filter (LCTF) that selects the transmitted wavelength, and a quarter-wave plate mounted on a motorized rotation stage.


Multi-view Hyperspectral Polarimetric Dataset

Synthetic


For synthetic scenes, we apply a polarimetric BRDF to each mesh. Because the existing pBRDF dataset [Baek et al. 2020] provides only five spectral channels, we fit a fourth-order polynomial along the spectrum at each pixel, and render images with the spectro-polarimetric renderer Mitsuba 3.
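As an illustration, a fourth-order polynomial has exactly five coefficients, so it interpolates the five measured channels exactly while providing a continuous spectrum in between. A minimal NumPy sketch, assuming hypothetical wavelength samples and a per-pixel (H, W, 5) pBRDF slice:

    import numpy as np

    # Hypothetical wavelength samples for the five measured pBRDF channels and the
    # target dense sampling; the actual values depend on the dataset and capture setup.
    wl_meas = np.array([450.0, 500.0, 550.0, 600.0, 650.0])   # nm (illustrative)
    wl_dense = np.linspace(450.0, 650.0, 21)

    def upsample_spectrum(pbrdf_5ch):
        """pbrdf_5ch: (H, W, 5) per-pixel pBRDF values at wl_meas.
        Returns a (H, W, 21) spectrum from an exact 4th-order polynomial fit."""
        H, W, _ = pbrdf_5ch.shape
        # Normalize wavelengths to [0, 1] so the Vandermonde system is well-conditioned.
        t_meas = (wl_meas - wl_meas.min()) / np.ptp(wl_meas)
        t_dense = (wl_dense - wl_meas.min()) / np.ptp(wl_meas)
        V = np.vander(t_meas, 5)                                  # (5, 5)
        # A 4th-order polynomial has 5 coefficients, so it interpolates the
        # 5 measured channels exactly at every pixel.
        coeffs = np.linalg.solve(V, pbrdf_5ch.reshape(-1, 5).T)   # (5, H*W)
        dense = np.vander(t_dense, 5) @ coeffs                    # (21, H*W)
        return dense.T.reshape(H, W, -1)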

Real-world


For real-world scenes, we capture each scene at 21 wavelengths and 4 quarter-wave-plate angles, and reconstruct the per-pixel Stokes vector from these captures by solving a least-squares problem with spatially-varying spectro-polarimetric calibration.
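A minimal sketch of this reconstruction, assuming the textbook model of a rotating quarter-wave plate followed by a fixed linear analyzer with ideal components; the paper instead uses spatially-varying, calibrated per-pixel measurement matrices, and the four plate angles below are only illustrative.

    import numpy as np

    def analyzer_row(theta):
        # Intensity through an ideal quarter-wave plate at angle theta followed by a
        # horizontal linear analyzer:
        #   I = 0.5 * (s0 + s1*cos^2(2t) + s2*sin(2t)*cos(2t) - s3*sin(2t))
        c, s = np.cos(2.0 * theta), np.sin(2.0 * theta)
        return 0.5 * np.array([1.0, c * c, s * c, -s])

    def reconstruct_stokes(images, qwp_angles_deg):
        """images: (K, H, W) captures at one wavelength, one per QWP angle.
        Returns (H, W, 4) per-pixel Stokes vectors via least squares."""
        A = np.stack([analyzer_row(np.deg2rad(a)) for a in qwp_angles_deg])  # (K, 4)
        K, H, W = images.shape
        b = images.reshape(K, -1)                                            # (K, H*W)
        s, *_ = np.linalg.lstsq(A, b, rcond=None)                            # (4, H*W)
        return s.T.reshape(H, W, 4)

    # Illustrative angles only; a degenerate set (e.g. 0/45/90/135 deg) cannot recover s2.
    # stokes_550nm = reconstruct_stokes(captures_550nm, [0.0, 22.5, 45.0, 67.5])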

Novel-view Spectro-polarimetric Synthesis

Synthetic Results

Real Results

NeSpoF enables novel-view rendering of the intensity (the first Stokes-vector element $s_0$) as well as polarimetric properties, visualized as AoLP (angle of linear polarization), DoP (degree of polarization), and ToP (type of polarization).
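For reference, these maps follow directly from the Stokes vector. A small NumPy sketch, where the ellipticity angle $\chi$ stands in for ToP (the paper's exact ToP definition may differ):

    import numpy as np

    def polarization_maps(stokes, eps=1e-8):
        """stokes: (..., 4) Stokes vectors. Returns (DoP, AoLP, ellipticity angle)."""
        s0, s1, s2, s3 = np.moveaxis(stokes, -1, 0)
        dop = np.sqrt(s1**2 + s2**2 + s3**2) / np.maximum(s0, eps)   # degree of polarization
        aolp = 0.5 * np.arctan2(s2, s1)                              # angle of linear polarization
        chi = 0.5 * np.arctan2(s3, np.sqrt(s1**2 + s2**2))           # 0 = linear, +/- pi/4 = circular
        return dop, aolp, chi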



Comparing against ground truth, with intensity $s_0$ and polarization states visualized as AoLP, DoP, and ToP, we find that NeSpoF achieves accurate polarimetric reconstruction.

This scene includes a glass box surrounding several objects and a specular crystal outside the glass box. NeSpoF models the intricate variations of AoLP on the crystal surfaces, whose polarization also changes significantly with viewpoint and wavelength.

This is another scene containing a linear polarization film, a laptop, and a smartphone covered with a protective film. Visualizing AoLP for this region, we observe that the linear polarization film and the laptop display emit uniformly linearly polarized light, while the light from the smartphone display interacts with the protective film, producing rich polarimetric variation. NeSpoF successfully models these variations, closely matching the ground truth across wavelengths.

Applications

Spectral Light Transport Decomposition

NeSpoF enables decomposing light transport into unpolarized and polarized hyperspectral components. As a result, we can separate not only diffuse and specular components (red arrows) but also more complex light transport, such as polarized inter-reflection between the walls (blue arrows). Additionally, we can extract hyperspectral intensity for the diffuse and specular components at a novel view.
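This decomposition follows from the fact that any Stokes vector splits uniquely into an unpolarized part and a fully polarized part. A minimal sketch (treating the two parts as proxies for diffuse and specular transport is, of course, an approximation):

    import numpy as np

    def decompose_light_transport(stokes):
        """stokes: (H, W, L, 4) rendered hyperspectral Stokes vectors (L wavelengths).
        Splits each vector into an unpolarized part and a fully polarized part:
        s = [s0 - p, 0, 0, 0] + [p, s1, s2, s3], with p = |(s1, s2, s3)|."""
        s0 = stokes[..., 0]
        p = np.linalg.norm(stokes[..., 1:], axis=-1)
        unpolarized = s0 - p      # mostly diffuse transport
        polarized = p             # mostly specular / polarized inter-reflection
        return unpolarized, polarized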

Novel-view Spectral Relighting

We identify elliptically-polarized sky light and extract the illumination spectrum of the sky. NeSpoF then enables rendering the hyperspectral intensity at a novel view and relighting the scene by replacing the original illumination with a target illuminant, here CIE D65 (plotted in blue).
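A minimal sketch of this per-wavelength relighting, assuming a single dominant illuminant and that the source spectrum has already been estimated from the sky; the array names are hypothetical:

    import numpy as np

    def relight(intensity, illum_src, illum_tgt, eps=1e-8):
        """intensity: (H, W, L) novel-view hyperspectral intensity.
        illum_src: (L,) estimated sky illumination spectrum.
        illum_tgt: (L,) target illuminant sampled at the same wavelengths (e.g. CIE D65).
        Rescales each wavelength by the ratio of target to source illumination."""
        scale = illum_tgt / np.maximum(illum_src, eps)
        return intensity * scale[None, None, :]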

Fast Forward

BibTeX


      @inproceedings{kim2023neural,
        title={Neural spectro-polarimetric fields},
        author={Kim, Youngchan and Jin, Wonjoon and Cho, Sunghyun and Baek, Seung-Hwan},
        booktitle={SIGGRAPH Asia 2023 Conference Papers},
        pages={1--11},
        year={2023}
      }