Pose estimation algorithm for mobile augmented reality based on inertial sensor fusion
Abstract
Augmented reality (AR) applications have become increasingly ubiquitous because they integrate virtual information such as images, 3D objects, and video into the real world, thereby enhancing the real environment. Many researchers have investigated augmenting 3D objects on the digital screen. However, existing systems have limitations in estimating an object's pose, making them inaccurate for mobile augmented reality (MAR) applications. Objects augmented in current systems exhibit considerable jitter caused by changes in frame illumination, which degrades the accuracy of vision-based pose estimation. This paper proposes estimating the pose of an object by blending vision-based techniques with a micro-electro-mechanical system (MEMS) sensor (gyroscope) to minimize the jitter problem in MAR. Oriented FAST and rotated BRIEF (ORB) is used for feature detection and description, while random sample consensus (RANSAC) is used to estimate the homography for pose estimation. The gyroscope sensor data is then incorporated into the vision-based pose estimate. We evaluated the performance of augmenting a 3D object on video data using both the purely vision-based technique and the sensor-fused technique. Extensive experiments showed that the proposed method is superior to existing vision-based pose estimation algorithms.
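The abstract describes a pipeline of ORB feature matching, RANSAC-based homography estimation, and gyroscope fusion. The sketch below is a minimal illustration of that pipeline using OpenCV in Python; the function names, the complementary-filter style blend, and the weight alpha are illustrative assumptions, not the paper's exact formulation, and extracting the vision rotation from the homography would additionally require the camera intrinsics (e.g., via cv2.decomposeHomographyMat).

import cv2
import numpy as np

# ORB detector and brute-force Hamming matcher (standard OpenCV components)
orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def vision_homography(ref_img, frame):
    """Estimate the planar homography between a reference marker image and the
    current camera frame using ORB features and RANSAC (hypothetical helper)."""
    ref_kp, ref_des = orb.detectAndCompute(ref_img, None)
    kp, des = orb.detectAndCompute(frame, None)
    if des is None or ref_des is None:
        return None
    matches = sorted(matcher.match(ref_des, des), key=lambda m: m.distance)[:100]
    if len(matches) < 4:
        return None
    src = np.float32([ref_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def fuse_rotation(vision_R, gyro_rate, dt, prev_R, alpha=0.95):
    """Blend the gyroscope-propagated rotation with the vision-based rotation.
    gyro_rate is the angular rate (rad/s) over the frame interval dt; alpha is
    an assumed smoothing weight favoring the gyroscope prediction."""
    # Integrate the gyroscope angular rate into an incremental rotation
    delta, _ = cv2.Rodrigues(np.asarray(gyro_rate, dtype=np.float64) * dt)
    gyro_R = prev_R @ delta
    if vision_R is None:
        return gyro_R  # no vision update this frame; rely on the gyroscope
    # Correct a fraction (1 - alpha) of the way toward the vision estimate
    correction, _ = cv2.Rodrigues(gyro_R.T @ vision_R)
    blend, _ = cv2.Rodrigues((1.0 - alpha) * correction)
    return gyro_R @ blend

In this sketch the gyroscope prediction smooths frame-to-frame jitter, while the vision estimate prevents the integrated gyroscope rotation from drifting over time.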
Keywords
3D virtual object; augmented reality; gyroscope sensor; oriented FAST and rotated BRIEF; pose estimation
DOI: http://doi.org/10.11591/ijece.v12i4.pp3620-3631
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
International Journal of Electrical and Computer Engineering (IJECE)
p-ISSN 2088-8708, e-ISSN 2722-2578
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).