Most near-eye displays have a single fixed focal plane and therefore suffer from the vergence-accommodation conflict (VAC), causing visual discomfort to users. In contrast, a light field display with continuous focal planes offers a natural and comfortable AR/VR visual experience free of the VAC and holds the promise of being the ultimate near-eye 3-D display. It projects light rays onto the human retina as if they emanated from a real object. This paper considers a near-eye light field display comprising three main components: a light field generator, a collimator, and a geometric waveguide. It takes 4-D light field data, in the form of an array of 2-D subview images, as input and generates a light field as output. The light field generator converts the light emitted from the display panel into light representing the light field of a virtual scene. The collimator ensures that the light rays propagating in the geometric waveguide are collimated. The partially reflective mirrors of the waveguide replicate the optical path to achieve exit pupil expansion (EPE) and a large eyebox. However, existing waveguide eyepieces for near-eye AR/VR displays are not designed for, and hence may not suit, light field displays. In this work, we examine a geometric waveguide for light field display and find that the light fields replicated by the partially reflective mirrors cannot perfectly overlap on the user's retina, producing multiple repetitive images, a phenomenon we call the "ghost artifact". This paper investigates the cause of this artifact and develops a solution for applications that require short-range interaction with virtual objects, such as surgical procedures. We define a working range free of noticeable ghost artifacts based on the angular resolution of the human eye and optimize the orientations of the waveguide's array of partially reflective mirrors to meet the image quality requirement for short-range interaction. With the optimized waveguide, the ghost artifact is significantly reduced. More results of the optimized waveguide will be shown at the conference.
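The visibility criterion behind such a working range can be illustrated with a simple paraxial calculation. The sketch below is not the paper's model: the mirror replication pitch, the 1-arcminute acuity threshold, and the small-angle geometry are assumptions chosen only to show how a ghost-free depth range could be estimated.

```python
import numpy as np

# Minimal paraxial sketch (illustrative, not the paper's model): adjacent
# partially reflective mirrors replicate the exit pupil with a lateral pitch p.
# For a virtual point at finite depth d, two replicated ray bundles reach the
# retina with an angular offset of roughly p / d.  The ghost is assumed to be
# invisible when that offset falls below the eye's angular resolution, taken
# here as 1 arcminute.  All numerical values are hypothetical.

ARC_MIN = np.deg2rad(1.0 / 60.0)            # ~2.9e-4 rad, assumed eye resolution

def ghost_offset(pitch_m: float, depth_m: float) -> float:
    """Approximate angular offset (rad) between replicated images."""
    return pitch_m / depth_m                 # small-angle approximation

def min_ghost_free_depth(pitch_m: float, threshold_rad: float = ARC_MIN) -> float:
    """Closest depth at which the ghost offset stays below the threshold."""
    return pitch_m / threshold_rad

if __name__ == "__main__":
    pitch = 3e-3                             # hypothetical 3 mm replication pitch
    for d in (0.3, 0.5, 1.0, 5.0, 10.0):     # virtual object depths in metres
        off = ghost_offset(pitch, d)
        print(f"depth {d:5.1f} m: offset {np.rad2deg(off) * 60:5.2f} arcmin "
              f"({'visible' if off > ARC_MIN else 'not visible'})")
    print(f"ghost-free beyond ~{min_ghost_free_depth(pitch):.1f} m")
```

Under these assumed numbers, ghosting would be noticeable throughout typical short-range interaction distances, which is why the mirror orientations themselves must be optimized rather than simply restricting the content depth.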
A natural and comfortable visual experience is critical to the success of the metaverse, which has recently drawn worldwide attention. However, due to the vergence-accommodation conflict (VAC), most augmented reality (AR) and virtual reality (VR) displays on the market today can easily cause visual discomfort or eyestrain. Because it resolves the VAC, the light field display is commonly regarded as the ultimate display for the metaverse. Like a conventional near-eye AR display, a near-eye light field AR display consists of three basic components: a light source, a projection unit, and an eyepiece. Although the same light source can be used in both kinds of displays, the projection unit and the eyepiece of a near-eye light field AR display call for a new design that preserves the structure of the light field when it reaches the user's retina. The primary focus of this paper is the eyepiece design for a near-eye light field AR display. For a compact form factor and a wide field of view, the birdbath architecture, which consists of a beam splitter and a combiner, is selected as the basis of our eyepiece design. We optimize the birdbath eyepiece for the light field projection module produced by PetaRay Inc. The eyepiece receives the light field emitted from the light field projection module and projects it fully into the user's eye. Our design preserves the structure of the light field and hence allows virtual objects at different depths to be properly perceived. In addition, the eyepiece design leads to a compact form factor for the near-eye light field AR display. Specifically, our eyepiece is designed by optimizing the tradeoff between the eyebox and the depth of focus (DOF) of the near-eye light field AR display. The resulting DOF allows the user to perceive any virtual object clearly and sharply within the working range, which extends from 30 cm to infinity. In addition, we optimize the entrance pupil position and the F-number of the eyepiece according to the exit pupil position and the divergence angle of the light field projection module. In this way, the eyepiece preserves the structure of the light field, meaning that the angular relations among light rays coming from the same object point in space are preserved. To demonstrate the performance of our birdbath eyepiece, we simulate the image formation process with the Liou-Brennan human eye model (JOSA A, Aug. 1997).
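The pupil-matching and eyebox considerations can be illustrated with first-order optics. The sketch below is only a rough illustration under assumed numbers: the module-to-eyepiece gap, divergence half angle, focal length, and F-number are hypothetical rather than PetaRay's or the paper's specifications, and the eyebox is approximated by the paraxial pupil diameter f/N.

```python
import numpy as np

# First-order sketch of the matching conditions described above (assumed
# values only).  Two simple checks are shown:
#   1) the eyepiece entrance pupil, at its assumed distance from the module's
#      exit pupil, must be wide enough to capture the full divergence cone of
#      the light field, otherwise rays (and light field structure) are lost;
#   2) the eyebox is approximated by the paraxial pupil diameter f / N.

def required_entrance_pupil(gap_mm: float, half_angle_deg: float) -> float:
    """Diameter (mm) needed to capture a cone of the given half angle
    emitted from a point-like exit pupil located gap_mm away."""
    return 2.0 * gap_mm * np.tan(np.deg2rad(half_angle_deg))

def eyebox_diameter(focal_length_mm: float, f_number: float) -> float:
    """Paraxial pupil-diameter estimate used here as a proxy for the eyebox."""
    return focal_length_mm / f_number

if __name__ == "__main__":
    gap = 20.0          # hypothetical module exit pupil -> eyepiece entrance pupil gap (mm)
    half_angle = 15.0   # hypothetical divergence half angle of the light field (deg)
    f, N = 25.0, 2.0    # hypothetical eyepiece focal length (mm) and F-number

    need = required_entrance_pupil(gap, half_angle)
    have = eyebox_diameter(f, N)   # pupil diameter implied by f and N
    print(f"entrance pupil needed {need:.1f} mm, available {have:.1f} mm "
          f"-> {'ok' if have >= need else 'vignetting'}")
    print(f"approximate eyebox: {have:.1f} mm")
```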
Switching the backlight of a handheld device to low power mode saves energy but alters the color appearance of the displayed image. In this paper, we consider this chroma degradation problem and propose an enhancement algorithm that incorporates the CIECAM02 color appearance model to quantitatively characterize it. In the proposed algorithm, we enhance the color appearance of the image in low power mode through a weighted linear superposition of the chroma of the original image and that of the estimated dim-backlight image. Subjective tests are carried out to determine the perceptually optimal weighting and to verify the effectiveness of our framework.
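The chroma-blending step can be sketched as follows. This is a minimal illustration, assuming the CIECAM02 chroma maps of the full-backlight image and the estimated dim-backlight image are already available from a color-appearance transform; the weight value is a placeholder, since the paper determines it through subjective tests.

```python
import numpy as np

# Sketch of the weighted chroma superposition described above (illustrative
# only).  The CIECAM02 forward/inverse transforms are assumed to be provided
# elsewhere; here we operate directly on per-pixel chroma maps.

def blend_chroma(c_full: np.ndarray, c_dim: np.ndarray, w: float) -> np.ndarray:
    """Weighted linear superposition of chroma maps, with w in [0, 1]."""
    return w * c_full + (1.0 - w) * c_dim

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c_full = rng.uniform(20.0, 60.0, size=(4, 4))   # hypothetical chroma of the original image
    c_dim = 0.7 * c_full                            # dim backlight assumed to lower chroma
    w = 0.8                                         # placeholder weight (paper: from subjective tests)
    c_enhanced = blend_chroma(c_full, c_dim, w)
    print(c_enhanced.round(1))
```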