Depth from defocus aims to estimate scene depth from two or more photos captured with differing camera parameters,
such as lens aperture or focus, by characterizing the difference in image blur. In the absence of noise, the ratio of Fourier
transforms of two corresponding image patches captured under differing focus conditions reduces to the ratio of the
optical transfer functions, since the contribution from the scene cancels. For a focus or aperture bracket, the shape of this
spectral ratio depends on object depth. Imaging noise complicates matters, introducing biases that vary with object
texture, making extraction of a reliable depth value from the spectral ratio difficult. We propose taking the mean of the
complex-valued spectral ratio over an image tile as a depth measure. This cancels much of the
effect of noise and significantly reduces depth bias compared to characterizing only the modulus of the spectral ratio.
This method is fast to calculate and does not require assuming any particular shape for the optical transfer function, such as a
Gaussian approximation. Experiments with real-world photographic imaging geometries show that our method produces
depth maps with greater tolerance to varying object texture than several previous depth-from-defocus methods.
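As a concrete illustration, the depth measure described above can be sketched as the mean of the complex-valued spectral ratio over a tile. The function name, the regularizing `eps`, and the DC-removal step are illustrative assumptions for this sketch, not the authors' implementation:

```python
import numpy as np

def spectral_ratio_depth_measure(patch_a, patch_b, eps=1e-6):
    """Mean of the complex-valued spectral ratio over an image tile.

    patch_a and patch_b are the same scene patch captured with two
    different focus/aperture settings; the scene content cancels in the
    ratio of their spectra, leaving the ratio of the optical transfer
    functions. (Hypothetical helper; names are illustrative.)
    """
    Fa = np.fft.fft2(patch_a - patch_a.mean())
    Fb = np.fft.fft2(patch_b - patch_b.mean())
    # Stabilised complex division: Fa/Fb = Fa * conj(Fb) / |Fb|^2.
    ratio = Fa * np.conj(Fb) / (np.abs(Fb) ** 2 + eps)
    # Averaging the complex ratio lets zero-mean noise terms cancel,
    # whereas averaging only the modulus rectifies noise into a bias.
    return ratio.mean()
```

For identical patches the measure is close to 1 (the OTF ratio of identical captures), drifting away as the defocus difference between the two captures grows with depth.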

We present a new method for accurately determining the best focus position of a camera lens in the context of image
quality evaluation and modulation transfer function (MTF) measurement. Our method makes use of the “live preview”
function of digital cameras to image a test chart containing spatially and rotationally invariant alignment patterns. The
patterns can be located to sub-pixel accuracy even under defocus using the technique of blur-invariant phase correlation,
which leads to an absolute measure of focus position, independent of any backlash in the lens mechanism. We describe
an efficient closed-loop feedback algorithm that uses this measure to drive the lens rapidly to best focus. This method
achieves the peak focus position to within a single step of the focus drive motor, typically allowing the peak focus MTF
to be measured to within 1.4% RMS. The mean time taken to find the peak focus position and drive the focus motor back
to that position ready for a comprehensive test exposure is 11.7 seconds, with maximum time 26 seconds, across a
variety of lenses of varying focal lengths.
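The blur-invariant alignment step can be illustrated with a minimal phase-correlation sketch: a symmetric defocus blur contributes (approximately) zero phase, so the normalised cross-power spectrum still peaks at the true translation. The function name and the integer-pixel peak search are simplifying assumptions for this sketch (the paper's method locates patterns to sub-pixel accuracy):

```python
import numpy as np

def phase_correlate(a, b, eps=1e-9):
    """Return the shift d such that a is approximately np.roll(b, d).

    Only the phase of the cross-power spectrum is kept, so a symmetric
    (zero-phase) defocus blur on either image leaves the correlation
    peak in place -- the blur-invariance exploited for alignment.
    (Illustrative sketch, integer-pixel accuracy only.)
    """
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cps = Fa * np.conj(Fb)
    cps /= np.abs(cps) + eps                 # keep phase only
    corr = np.fft.ifft2(cps).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap circular peak coordinates to signed shifts.
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

Blurring one of the two images with a symmetric kernel leaves the recovered shift unchanged, which is what makes the derived focus-position measure independent of defocus.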

We present a novel method for accurately measuring the optical transfer function (OTF) of a camera lens by digitally
imaging a tartan test pattern containing sinusoidal functions with multiple frequencies and orientations. The tartan
pattern can be tuned to optimize the measurement accuracy for an adjustable set of sparse spatial frequencies. The
measurement method is designed to be accurate, reliable, and fast in a wide range of measurement conditions, including
uncontrolled lighting. We describe the design of the tartan pattern and the algorithm for estimating the OTF accurately
from a captured digital image. Simulation results show that the tartan method has significantly better accuracy for
measuring the modulus of the OTF (the modulation transfer function, or MTF) than the ISO 12233 standard slanted-edge
method, especially at high spatial frequencies. With 1% simulated imaging noise, the root mean square (RMS) error of
the tartan method is on average 5 times smaller than the RMS error of the slanted-edge method. Experiments with a
printed tartan chart show good agreement (0.05 RMS) with MTFs measured using the slanted-edge method and confirm that, like
the slanted-edge method, ours is tolerant of wide variations in illumination conditions.
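The core measurement idea, estimating the modulation at a sparse set of known chart frequencies and dividing by the chart's own modulation, can be sketched as a quadrature projection at each target frequency. All names and parameters below are illustrative assumptions, not the paper's estimation algorithm:

```python
import numpy as np

def sinusoid_amplitude(signal, freq):
    """Amplitude of a known-frequency sinusoid in a 1-D sample row,
    via projection onto quadrature components (a single-frequency DFT).
    `freq` is in cycles per sample. (Illustrative sketch.)"""
    n = np.arange(signal.size)
    sig = signal - signal.mean()             # remove the DC pedestal
    a = 2.0 * np.dot(sig, np.cos(2 * np.pi * freq * n)) / signal.size
    b = 2.0 * np.dot(sig, np.sin(2 * np.pi * freq * n)) / signal.size
    return np.hypot(a, b)                    # phase-independent amplitude

def mtf_at(freq, chart_amplitude, captured_row):
    """MTF at one sparse frequency: measured modulation over the
    modulation printed on the chart. (Hypothetical helper.)"""
    return sinusoid_amplitude(captured_row, freq) / chart_amplitude
```

Because the quadrature projection is phase-independent, the estimate is insensitive to where the pattern lands on the sensor, and subtracting the mean makes it insensitive to the overall illumination level.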
Conference Committee Involvement (1)
Digital Photography and Mobile Imaging XI
9 February 2015 | San Francisco, California, United States