DeepShadow: Neural Shape From Shadows
Department of Computer Science, Reichman University, Israel
Abstract
We present ‘DeepShadow’, a one-shot method for
recovering the depth map and surface normals from photometric stereo
shadow maps. Previous works that recover surface normals
from photometric stereo images treat cast shadows as a disturbance. We
show that self and cast shadows not only do not disturb 3D reconstruction,
but can be used on their own, as a strong learning signal, to recover
the depth map and surface normals. We demonstrate that 3D reconstruction
from shadows can even outperform shape-from-shading in certain
cases. The method does not require any pre-training or expensive labeled data, and is optimized
at inference time.
Method
Algorithm Overview
DeepShadow takes the light source location \(L^j\) and pixel coordinates \((u, v)\) as inputs,
along with the estimated depth \(\hat{d}\) from the MLP, and outputs an estimate of the shadow map
\(\hat{S}^j\) at each pixel location. The ground-truth shadow map \(S^j\) is then used as
supervision to optimize the learned depth map \(\hat{d}\).
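This per-light optimization step can be sketched as follows. The sketch below is a minimal illustration, not the authors' implementation: the MLP maps pixel coordinates to depth, and `estimate_shadow` is a hypothetical stand-in for the differentiable shadow computation, here replaced by a toy soft-threshold proxy so the loop is runnable end to end.

```python
import torch

# Coordinate MLP: (u, v) -> depth. Architecture and sizes are assumptions.
mlp = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
)
opt = torch.optim.Adam(mlp.parameters(), lr=1e-2)

def estimate_shadow(depth, light):
    # Toy stand-in for the differentiable shadow estimate S^j-hat:
    # any function of depth that is differentiable works for the sketch.
    return torch.sigmoid(light[2] - depth)

coords = torch.rand(256, 2)                 # sampled (u, v) coordinates
gt_shadow = (coords[:, :1] > 0.5).float()   # synthetic ground-truth map S^j
light = torch.tensor([0.0, 0.0, 1.0])       # synthetic light position L^j

losses = []
for step in range(100):
    depth = mlp(coords)                     # d-hat at each sampled pixel
    pred = estimate_shadow(depth, light)    # S^j-hat
    loss = torch.nn.functional.binary_cross_entropy(pred, gt_shadow)
    opt.zero_grad()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

The key point the sketch illustrates is that the only supervision flowing into the depth MLP is the binary shadow map, via a differentiable shadow estimate.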
Flow of our Method
Left - The light source \(L^j\) is projected onto the image plane to obtain \(\boldsymbol\ell^j\). A ray \(\mathbf{r}_i^j\) of \((u,v)\) points is created
between \(\boldsymbol\ell^j\) and \(\mathbf{u}_i\). Each point, together with its estimated depth \(\hat{d}\), is then unprojected to world coordinates.
Right - The shadow line scan algorithm is run on the points in 3D space to determine which pixels are shadowed.
Red points are shadowed, since their angle to the light source is smaller than the occlusion angle \(\alpha\) of a nearer point.
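The line scan above can be sketched as a 1D horizon scan over the unprojected samples along one ray. This is an illustrative reconstruction, not the paper's code: samples are ordered from the point nearest the light's ground projection outward, the scan tracks the largest angle \(\alpha\) seen so far, and a sample whose angle to the light is smaller than \(\alpha\) is occluded by a nearer, higher point.

```python
import numpy as np

def shadow_line_scan(xs, heights, light_height):
    """1D shadow line scan along one ray of unprojected surface points.

    xs:           horizontal distance of each sample from the light's
                  ground projection, ordered nearest to farthest.
    heights:      surface height at each sample (assumed below the light).
    light_height: height of the light source above its ground projection.

    Returns a boolean array that is True where a sample is shadowed.
    """
    tan_alpha = -np.inf              # tangent of the occlusion angle alpha
    shadowed = np.zeros(len(xs), dtype=bool)
    for i, (x, h) in enumerate(zip(xs, heights)):
        tan_i = x / (light_height - h)   # tangent of this sample's angle
        if tan_i < tan_alpha:
            shadowed[i] = True           # a nearer point occludes the light
        else:
            tan_alpha = tan_i            # this sample is the new horizon
    return shadowed
```

For example, a single bump on otherwise flat ground casts a shadow only on the sample directly behind it: with a light at height 5 and heights `[0, 0, 2, 0, 0]` at distances `[1, 2, 3, 4, 5]`, only the fourth sample is shadowed.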
Citation
@inproceedings{karnieli2022deepshadow,
  title={DeepShadow: Neural shape from shadows},
  author={Karnieli, Asaf and Fried, Ohad and Hel-Or, Yacov},
  year={2022},
  booktitle={ECCV},
}
Acknowledgements
This work was supported by the Israeli Ministry of Science and Technology under The National Foundation for Applied Science (MIA),
and by the Israel Science Foundation (grant No. 1574/21).