Captured Images: These are images of various objects captured with a Canon EOS 40D in a lab setting, under a directional light source, with a chrome sphere in the scene used to determine the ground-truth lighting direction. The 99th-percentile intensity value in each image was assumed to correspond to the product of lighting intensity and albedo, and was used for normalization.
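The percentile-based normalization described above can be sketched as follows. This is a minimal illustration, not the authors' exact code; the choice of float conversion and the small epsilon guard are assumptions.

```python
import numpy as np

def normalize_by_percentile(img, q=99.0):
    """Divide an image by its q-th percentile intensity, which is
    taken as a proxy for lighting intensity x albedo."""
    img = np.asarray(img, dtype=np.float64)
    scale = np.percentile(img, q)
    # Epsilon guard against a degenerate all-zero image (assumption).
    return img / max(scale, 1e-12)
```

After normalization, the 99th-percentile intensity of the image is 1, so pixel values are directly comparable across captures with different lighting strengths.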
Synthetic Images: These are images of random surfaces, constructed by drawing random depth samples from a Gaussian distribution on a 5×5 grid, interpolating them to 128×128 surfaces, and rendering synthetically under horizontal directional lighting with an elevation angle of 60°. We also show results for cases in which these synthetic images are corrupted by additive white Gaussian noise and by specular reflection.
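The synthetic-image pipeline above (random coarse depth grid, upsampling, Lambertian rendering under a directional light, optional noise) can be sketched as below. This is a hedged approximation: the paper does not specify the interpolation scheme, depth variance, or noise level, so bilinear interpolation, unit standard deviation, unit albedo, and a noise sigma of 0.05 are all assumptions here.

```python
import numpy as np

rng = np.random.default_rng(0)

def upsample_rows(a, n):
    """Linearly interpolate each row of `a` to length n."""
    x_old = np.linspace(0.0, 1.0, a.shape[1])
    x_new = np.linspace(0.0, 1.0, n)
    return np.array([np.interp(x_new, x_old, row) for row in a])

def random_surface(n=128, k=5, sigma=1.0):
    """Random k x k Gaussian depth samples, bilinearly upsampled to n x n
    (interpolation scheme and sigma are assumptions)."""
    coarse = rng.normal(0.0, sigma, size=(k, k))
    return upsample_rows(upsample_rows(coarse, n).T, n).T

def render_lambertian(z, light):
    """Shade a depth map with unit albedo: I = max(0, n . l)."""
    zy, zx = np.gradient(z)                      # depth gradients
    n = np.stack([-zx, -zy, np.ones_like(z)], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    return np.clip(n @ light, 0.0, None)

# Horizontal directional light (azimuth along +x) at 60 deg elevation.
el = np.deg2rad(60.0)
light = np.array([np.cos(el), 0.0, np.sin(el)])

img = render_lambertian(random_surface(), light)
noisy = img + rng.normal(0.0, 0.05, img.shape)   # AWGN corruption (sigma assumed)
```

A specular corruption could be added analogously, e.g. with a Phong-style lobe on top of the Lambertian term, but the paper does not give its specular model, so it is omitted here.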
Baseline Methods: The SFS algorithms used for comparison are:
 A. Ecker and A. D. Jepson, "Polynomial shape from shading," in Proc. CVPR, 2010.
 J. T. Barron and J. Malik, "Shape, illumination, and reflectance from shading," Technical Report, EECS, UC Berkeley, 2013.