We propose a three-stage navigation method that guides visually impaired users to a hand-sized target object using sound guidance over a walking-distance range. The advantage of the proposed method is that it lets a visually impaired person reach a target object that he or she needs to touch using only a camera-equipped wearable device. It can be applied to any indoor situation because the system requires only a vision-based pre-registration process in which a single video trajectory is recorded in advance. The navigation is decomposed into three stages: path navigation, body navigation, and hand navigation. For the walking stage, we use the Clew app, which is sufficient for this purpose. For the subsequent two stages, we introduce an AR anchor, which is registered on the target object in advance. Our sound guidance is designed to bring the user's hand to the target with hand-size resolution, and each stage change is signaled by vibration. A preliminary evaluation with our smartphone-based system confirmed that the proposed method can navigate users to a hand-sized target starting from a position 5 meters away.
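The stage-switching behavior can be pictured with a minimal sketch. The stage names and the vibration-on-stage-change behavior come from the abstract; the distance thresholds, type names, and haptic call are illustrative assumptions, not the paper's implementation, and in a real system the transforms would come from ARKit (ARFrame.camera.transform and ARAnchor.transform).

```swift
import simd
import UIKit  // UIImpactFeedbackGenerator for the vibration cue on stage change

/// The three stages named in the abstract (hypothetical enum names).
enum NavigationStage {
    case path   // walking toward the target area (handled by Clew in the paper)
    case body   // aligning the body with the registered AR anchor
    case hand   // guiding the hand onto the hand-sized target
}

/// Illustrative thresholds in meters -- the paper does not specify exact values.
let bodyStageDistance: Float = 1.5   // assumed path -> body switch
let handStageDistance: Float = 0.5   // assumed body -> hand switch

/// Picks the current stage from the straight-line distance between the
/// device (or hand) and the AR anchor pre-registered on the target object.
func stage(forDistance d: Float) -> NavigationStage {
    if d > bodyStageDistance { return .path }
    if d > handStageDistance { return .body }
    return .hand
}

/// Minimal tracker that detects stage changes and signals them with vibration.
final class StageTracker {
    private var current: NavigationStage?
    private let haptic = UIImpactFeedbackGenerator(style: .heavy)

    func update(cameraTransform: simd_float4x4,
                anchorTransform: simd_float4x4) -> NavigationStage {
        // Translation components of the 4x4 pose matrices.
        let cameraPosition = simd_make_float3(cameraTransform.columns.3)
        let anchorPosition = simd_make_float3(anchorTransform.columns.3)
        let newStage = stage(forDistance: simd_distance(cameraPosition, anchorPosition))
        if newStage != current {
            haptic.impactOccurred()   // vibrate once when the stage changes
            current = newStage
        }
        return newStage
    }
}
```

In the same loop, a sonification layer would map the remaining distance (and direction) to sound parameters so the guidance resolution tightens as the user approaches the hand-navigation stage; the details of that mapping are specific to the paper and are not reproduced here.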