Please use this identifier to cite or link to this item:
http://dspace.iitrpr.ac.in:8080/xmlui/handle/123456789/1461
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Dubey, N. | - |
dc.contributor.author | Ghosh, S. | - |
dc.contributor.author | Dhall, A. | - |
dc.date.accessioned | 2020-01-02T14:53:40Z | - |
dc.date.available | 2020-01-02T14:53:40Z | - |
dc.date.issued | 2020-01-02 | - |
dc.identifier.uri | http://localhost:8080/xmlui/handle/123456789/1461 | - |
dc.description.abstract | Automatic eye gaze estimation has interested researchers for a while now. In this paper, we propose an unsupervised learning based method for estimating the eye gaze region. To train the proposed network "Ize-Net" in a self-supervised manner, we collect a large 'in the wild' dataset containing 154,251 images from the web. For the images in the database, we divide the gaze into three regions using an automatic technique based on pupil-center localization, and then use a feature-based technique to determine the gaze region. The performance is evaluated on the Tablet Gaze and CAVE datasets by fine-tuning Ize-Net for the task of eye gaze estimation. The learned feature representation is also used to train traditional machine learning algorithms for eye gaze estimation. The results demonstrate that the proposed method learns a rich data representation, which can be efficiently fine-tuned for any eye gaze estimation dataset. | en_US |
dc.language.iso | en_US | en_US |
dc.title | Unsupervised learning of eye gaze representation from the web | en_US |
dc.type | Article | en_US |
Appears in Collections: Year-2019
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
Full Text.pdf | | 2.38 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
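The abstract's pseudo-labelling step, localizing the pupil centers and dividing gaze into three coarse regions to obtain self-supervised training targets for Ize-Net, can be illustrated with a short sketch. Everything below is an assumption made for illustration only: the names `EyeObservation` and `assign_gaze_region`, the left/center/right split, and the 0.40/0.60 thresholds are hypothetical and are not taken from the paper; the authors' actual criteria are described in the full text.

```python
# Illustrative sketch (not the authors' implementation): pupil centers are
# localized within each eye box, and the normalized horizontal pupil position
# is thresholded into three coarse gaze regions.

from dataclasses import dataclass


@dataclass
class EyeObservation:
    pupil_x: float      # pupil-center x coordinate in image pixels
    eye_left: float     # left edge of the detected eye bounding box
    eye_right: float    # right edge of the detected eye bounding box

    def normalized_offset(self) -> float:
        """Pupil position within the eye box, mapped to roughly [0, 1]."""
        width = max(self.eye_right - self.eye_left, 1e-6)
        return (self.pupil_x - self.eye_left) / width


def assign_gaze_region(left_eye: EyeObservation,
                       right_eye: EyeObservation,
                       low: float = 0.40,
                       high: float = 0.60) -> str:
    """Return a coarse pseudo-label: 'left', 'center', or 'right'.

    The two eyes' normalized pupil offsets are averaged and compared against
    assumed thresholds; images with unreliable detections could instead be
    discarded rather than pseudo-labelled.
    """
    offset = 0.5 * (left_eye.normalized_offset() + right_eye.normalized_offset())
    if offset < low:
        return "left"
    if offset > high:
        return "right"
    return "center"


if __name__ == "__main__":
    # Toy example: both pupils sit toward the right side of their eye boxes.
    left = EyeObservation(pupil_x=48.0, eye_left=30.0, eye_right=60.0)
    right = EyeObservation(pupil_x=88.0, eye_left=70.0, eye_right=100.0)
    print(assign_gaze_region(left, right))  # -> "right"
```

Labels produced this way would serve only as weak supervision for pre-training; the representation learned by the network is what is later fine-tuned or fed to traditional machine learning models for gaze estimation.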