Focal Track

Depth and Accommodation with Oscillating Lens Deformation

Qi Guo   Emma Alexander   Todd Zickler

Abstract: The focal track sensor is a monocular and computationally efficient depth sensor that is based on defocus controlled by a liquid membrane lens. It synchronizes small lens oscillations with a photosensor to produce real-time depth maps by means of differential defocus, and it couples these oscillations with bigger lens deformations that adapt the defocus working range to track objects over large axial distances. To create the focal track sensor, we derive a texture-invariant family of equations that relate image derivatives to scene depth when a lens changes its focal length differentially. Based on these equations, we design a feed-forward sequence of computations that: robustly incorporates image derivatives at multiple scales; produces confidence maps along with depth; and can be trained end-to-end to mitigate noise, aberrations, and other non-idealities. Our prototype with 1-inch optics produces depth and confidence maps at 100 frames per second over an axial range of more than 75cm.
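As a rough illustration of the differential-defocus idea behind the sensor, the sketch below takes two frames captured at slightly different focal powers and estimates, per image window, the ratio between the temporal image derivative and the image Laplacian, together with a texture-dependent confidence map. This is a hedged, single-scale sketch, not the authors' pipeline: the function names, window size, and the placeholder mapping from the estimated ratio to metric depth are assumptions, and the real system uses multi-scale derivatives, end-to-end training, and lens calibration.

# Illustrative sketch only: a minimal, single-scale take on depth from
# differential defocus. The actual focal track pipeline is multi-scale,
# trained end-to-end, and calibrated to the liquid lens; the names and
# window size here are assumptions, not the authors' implementation.
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def differential_defocus(img_a, img_b, window=15, eps=1e-6):
    """Estimate a per-pixel defocus ratio kappa and a confidence map from
    two frames taken at slightly different focal powers.

    Under a differential defocus model, the temporal image change is
    approximately proportional to the image Laplacian,
        I_t(x, y) ~= kappa(Z) * laplacian(I)(x, y),
    where kappa depends on scene depth Z and the lens deformation, but not
    on scene texture. kappa is estimated here by windowed least squares.
    """
    i_t = img_b.astype(np.float64) - img_a.astype(np.float64)   # temporal derivative
    lap = laplace(0.5 * (img_a + img_b).astype(np.float64))     # spatial Laplacian

    # Windowed least-squares solution of i_t = kappa * lap.
    num = uniform_filter(i_t * lap, size=window)
    den = uniform_filter(lap * lap, size=window)
    kappa = num / (den + eps)

    # Texture-dependent confidence: low Laplacian energy means kappa is
    # poorly constrained (e.g. in textureless regions).
    confidence = den
    return kappa, confidence

# Converting kappa to metric depth requires the lens/defocus calibration
# described in the paper; a per-sensor lookup table or fitted curve
# (hypothetical here) would map kappa -> Z over the working range.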

Demo Video

Publication

Guo, Q., Alexander, E., Zickler, T.: Focal Track: Depth and Accommodation with Oscillating Lens Deformation. In: International Conference on Computer Vision (ICCV). IEEE (2017).

[pdf] [technical report] [ICCV poster] [NECV slides] [code]