We propose a new framework for interactive Augmented Reality (AR) and Mixed Reality (MR) representation using both visible and invisible projection onto physical target objects. Projection-based approaches to AR/MR use physical objects, such as walls, books, and plaster ornaments, as surfaces onto which computer-generated content is optically projected. In other words, projection makes it possible to use real objects as displays.
We mainly focus on capturing and utilizing the 3D shape of the object surface, which allows the AR/MR system to maintain visual consistency when merging physical and rendered objects. The 3D shape data can be used to compensate for the distortion caused by the difference between the positions of the projectors and the viewer. Another advantage is the ability to generate proper visual occlusion between physical and virtual objects, so that they appear to coexist in front of the viewer.
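The occlusion handling described above amounts to a per-pixel depth comparison between the measured real surface and the rendered virtual object. A minimal sketch, assuming depth maps and color buffers already registered to the same viewpoint (the function and array names are illustrative, not the authors' implementation):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel occlusion between captured real geometry and a rendered
    virtual object: a virtual pixel is shown only where it lies nearer
    to the viewer than the measured real surface.
    Illustrative sketch; inputs are assumed to be registered to the same
    viewpoint, which is not part of the original text."""
    virt_in_front = virt_depth < real_depth      # boolean occlusion mask
    mask = virt_in_front[..., None]              # broadcast over RGB channels
    return np.where(mask, virt_rgb, real_rgb)
```

Keeping the real-surface depth map up to date is what makes this comparison valid as the physical geometry changes.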
What we demonstrate in this study is the use of near-infrared pattern projection for triangulation, so that the object's geometry data is scanned and updated automatically as a background process; AR/MR rendering can then proceed in parallel, adapting to dynamic changes in the physical geometry.
In this paper, we propose a new technique for measuring the whole three-dimensional shape of small moving objects. The proposed measurement system has a very simple structure: a CCD camera fitted with a fish-eye lens and a cylinder whose inner surface is mirror-coated. The CCD camera is set at the top of the cylinder with its optical axis aligned with the cylinder's central axis. A captured image contains two kinds of information: a direct view of the target and a mirror-reflected view. These two views are used to measure the shape of the target by stereo matching. Because the proposed method can acquire the target's shape from a single image, the three-dimensional shape of a moving object can be obtained from an image sequence.
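Once the direct and mirror-reflected views have been matched, each correspondence yields two viewing rays (the reflected ray can be unfolded into a virtual camera), and the 3D point is recovered by intersecting them. A minimal sketch of that triangulation step, assuming calibrated ray origins and directions (the interface is an assumption, not the paper's implementation):

```python
import numpy as np

def triangulate_rays(o1, d1, o2, d2):
    """Midpoint of closest approach between two viewing rays.
    In the catadioptric setup one ray comes from the direct fish-eye view
    and the other from the mirror-reflected view unfolded into a virtual
    camera; the calibration producing (o, d) pairs is assumed."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|
    A = np.stack([d1, -d2], axis=1)          # 3x2 system
    b = o2 - o1
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    p1 = o1 + t[0] * d1
    p2 = o2 + t[1] * d2
    return 0.5 * (p1 + p2)
```

Because every correspondence in a single frame is triangulated independently, one image suffices for a full shape estimate, which is what enables per-frame measurement of a moving target.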
KEYWORDS: 3D metrology, Motion measurement, Cameras, Sensors, 3D acquisition, 3D image processing, 3D scanning, Imaging systems, Stereoscopic cameras, Gyroscopes
Wearable 3D measurement makes it possible to acquire 3D information about an object or an environment using a wearable computer. In Japan, mobile phones can already transmit voice and sound as well as pictures, and it will soon be easy to capture and send short movies with them. Meanwhile, computers are becoming compact and high-performance, and can easily connect to the Internet over wireless LAN. In the near future, wearable computers will be usable anytime and anywhere, so three-dimensional data measured by a wearable computer could be transmitted as a new kind of data. This paper proposes a method and system for measuring the three-dimensional shape of an object with a wearable computer. The method uses slit-light projection for 3D measurement, with the user's own motion taking the place of a mechanical scanning system.
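In slit-light (light-sectioning) measurement, a pixel lit by the slit lies both on the camera ray through that pixel and on the projected light plane, so its 3D position follows from a ray-plane intersection; as the user moves, the plane sweeps the object. A minimal sketch of the per-pixel step, assuming a calibrated light-plane pose per frame (the function and parameter names are assumptions):

```python
import numpy as np

def slit_light_point(ray_dir, plane_n, plane_d, cam_origin=np.zeros(3)):
    """Intersect a camera viewing ray with the slit-light plane
    n . x + d = 0 to recover the 3D point on the lit surface.
    The per-frame plane pose (plane_n, plane_d) would come from tracking
    the user's motion; that tracking is assumed, not shown here."""
    t = -(plane_n @ cam_origin + plane_d) / (plane_n @ ray_dir)
    return cam_origin + t * ray_dir
```

Repeating this for every lit pixel in every frame, with the plane pose updated from the user's motion, accumulates a full scan without any dedicated scanning mechanism.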