Fast Natural Feature Tracking Using Optical Flow


The KIPS Transactions: Part B, Vol. 17, No. 5, pp. 345-354, Oct. 2010
DOI: 10.3745/KIPSTB.2010.17.5.345

Abstract

Visual tracking techniques for augmented reality are classified into marker tracking approaches and natural feature tracking approaches. Marker-based tracking algorithms can be implemented efficiently enough to run in real time on mobile devices. Natural feature tracking methods, on the other hand, require computationally expensive procedures: most previous methods perform heavy feature extraction and pattern matching for each input image frame, which makes it difficult to implement real-time augmented reality applications with natural feature tracking on low-performance devices. The required computation time is also proportional to the number of patterns to be matched. To speed up the natural feature tracking process, we propose a novel fast tracking method based on optical flow. We implemented the proposed method on mobile devices so that it runs in real time and is suitable for mobile augmented reality applications. Moreover, during tracking, we maintain the total number of feature points by inserting new feature points in proportion to the number of vanished feature points. Experimental results show that the proposed method reduces the computational cost and also stabilizes the camera pose estimation results.
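
The tracking-with-replenishment idea described above can be illustrated with a short sketch. The following Python/OpenCV code is a minimal, hypothetical rendering, not the authors' implementation: the detector, the pyramidal Lucas-Kanade parameters, and the MAX_FEATURES target are illustrative assumptions.

    # Sketch: propagate feature points frame-to-frame with optical flow,
    # then top up the set in proportion to the points lost in tracking.
    import cv2
    import numpy as np

    MAX_FEATURES = 200  # assumed target size of the tracked feature set

    def detect_features(gray, mask=None, max_corners=MAX_FEATURES):
        # Corner detection used only to (re)fill the tracked set.
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=7,
                                      mask=mask)
        return pts if pts is not None else np.empty((0, 1, 2), dtype=np.float32)

    def track(prev_gray, curr_gray, prev_pts):
        if len(prev_pts) == 0:
            return detect_features(curr_gray)
        # Pyramidal Lucas-Kanade optical flow from the previous frame.
        curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, curr_gray, prev_pts, None,
            winSize=(21, 21), maxLevel=3)
        good = curr_pts[status.ravel() == 1]      # points that survived
        vanished = len(prev_pts) - len(good)      # points lost this frame
        if vanished > 0:
            # Re-detect only away from surviving points.
            mask = np.full(curr_gray.shape, 255, dtype=np.uint8)
            for x, y in good.reshape(-1, 2):
                cv2.circle(mask, (int(x), int(y)), 7, 0, -1)
            new_pts = detect_features(curr_gray, mask=mask,
                                      max_corners=vanished)
            if len(new_pts) > 0:
                good = np.vstack([good.reshape(-1, 1, 2), new_pts])
        return good.reshape(-1, 1, 2).astype(np.float32)

In a full pipeline the tracked points would then feed a camera pose estimator (for example, a PnP solver against known reference coordinates); that stage is outside this sketch.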



Cite this article
[IEEE Style]
J. S. Park and B. J. Bae, "Fast Natural Feature Tracking Using Optical Flow," The KIPS Transactions: Part B, vol. 17, no. 5, pp. 345-354, 2010. DOI: 10.3745/KIPSTB.2010.17.5.345.

[ACM Style]
Jong Seung Park and Byung Jo Bae. 2010. Fast Natural Feature Tracking Using Optical Flow. The KIPS Transactions: Part B 17, 5 (2010), 345-354. DOI: 10.3745/KIPSTB.2010.17.5.345.