Eye localization in video by combining eye detector and eye tracker
Author affiliations
DOI: https://doi.org/10.15625/1813-9663/29/2/2667
Keywords: eye localization, eye tracker, eye detector
Abstract
In this paper, we propose a method that combines an eye tracker and an eye detector for robust eye localization in video. Instead of integrating the two systems sequentially, we use the eye locations suggested by the eye detector to initialize the particles used in the eye tracker and to drive their measurement-update step. This combination improves localization performance: the detector provides good candidate eye locations, while the tracker finds the best eye location by exploiting temporal information. Experiments are conducted on two benchmark video databases (the TRECVID and Boston University Head Pose datasets) and on videos from Vietnamese Television. The results show that our method is more effective than state-of-the-art eye detectors and eye trackers.
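The abstract describes a particle-filter tracker whose particles are initialized from, and re-weighted by, per-frame detector candidates. The paper's actual equations are not given on this page, so the sketch below is only an illustrative assumption of that scheme: particles are seeded around the first frame's detections, propagated with a random-walk motion model, weighted by a Gaussian likelihood of the confidence-scored detections, and resampled. All names (`localize_eye`, `motion_std`, `meas_std`) and the specific likelihood are hypothetical, not the authors' implementation.

```python
import random
import math

def localize_eye(detections, n_particles=200, motion_std=5.0, meas_std=10.0, seed=0):
    """Track one eye position (x, y) across frames.

    `detections` is a per-frame list of detector candidates [(x, y, score), ...].
    Returns the weighted-mean particle position for each frame.
    """
    rng = random.Random(seed)
    # Initialization: scatter particles around the first frame's candidates.
    first = detections[0]
    particles = []
    for _ in range(n_particles):
        x, y, _ = rng.choice(first)
        particles.append((x + rng.gauss(0, motion_std), y + rng.gauss(0, motion_std)))

    estimates = []
    for frame in detections:
        # Predict: random-walk motion model.
        particles = [(x + rng.gauss(0, motion_std), y + rng.gauss(0, motion_std))
                     for x, y in particles]
        # Measurement update: weight each particle by its proximity to the
        # detector candidates, scaled by the detector confidence score.
        weights = []
        for x, y in particles:
            w = sum(s * math.exp(-((x - dx) ** 2 + (y - dy) ** 2) / (2 * meas_std ** 2))
                    for dx, dy, s in frame)
            weights.append(w + 1e-12)
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle cloud.
        ex = sum(w * x for w, (x, _) in zip(weights, particles))
        ey = sum(w * y for w, (_, y) in zip(weights, particles))
        estimates.append((ex, ey))
        # Resample particles in proportion to their weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

On synthetic detections with one high-confidence candidate moving across frames and one spurious low-confidence candidate, the resampling step quickly concentrates the particles on the consistently supported location, which is the temporal-information benefit the abstract refers to.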
Published
30-05-2013
How to Cite
[1]
C. P. Đình Thăng, N. Đức Thành, L. Đình Duy, D. C. Nhân, and D. A. Đức, “Eye localization in video by combining eye detector and eye tracker”, JCC, vol. 29, no. 2, pp. 170–180, May 2013.
Issue
Section
Computer Science
License
1. We hereby assign copyright of our article (the Work) in all forms of media, whether now known or hereafter developed, to the Journal of Computer Science and Cybernetics. We understand that the Journal of Computer Science and Cybernetics will act on my/our behalf to publish, reproduce, distribute and transmit the Work.
2. This assignment of copyright to the Journal of Computer Science and Cybernetics is done so on the understanding that permission from the Journal of Computer Science and Cybernetics is not required for me/us to reproduce, republish or distribute copies of the Work in whole or in part. We will ensure that all such copies carry a notice of copyright ownership and reference to the original journal publication.
3. We warrant that the Work is our own result, has not been published before in its current or a substantially similar form, is not under consideration for another publication, does not contain any unlawful statements, and does not infringe any existing copyright.
4. We also warrant that We have obtained the necessary permission from the copyright holder/s to reproduce in the article any materials including tables, diagrams or photographs not owned by me/us.