UBIQUITOUS GAZE-BASED INTERACTION FOR MOBILE DEVICES

TIN2014-52897-R

Funding agency name: Ministerio de Economía y Competitividad
Funding agency acronym: MINECO
Programme: Programa Estatal de I+D+I Orientada a los Retos de la Sociedad
Subprogramme: Todos los retos (all challenges)
Call: Retos Investigación: Proyectos de I+D+I (2014)
Call year: 2014
Management unit: Dirección General de Investigación Científica y Técnica
Beneficiary institution: UNIVERSIDAD PÚBLICA DE NAVARRA (UPNA)
Host centre: DEPARTAMENTO INGENIERÍA ELÉCTRICA Y ELECTRÓNICA
Persistent identifier: http://dx.doi.org/10.13039/501100003329

Publications

Total results (including duplicates): 5

Low cost gaze estimation: knowledge-based solutions

Academica-e. Repositorio Institucional de la Universidad Pública de Navarra
  • Martinikorena Aranburu, Ion
  • Larumbe Bergera, Andoni
  • Ariz Galilea, Mikel
  • Porta Cuéllar, Sonia
  • Cabeza Laguna, Rafael
  • Villanueva Larre, Arantxa
Eye tracking technology in low resolution scenarios is not yet a fully solved problem. The possibility of using eye tracking on a mobile device is a challenging objective that would allow this technology to spread to unexplored fields. In this paper, a knowledge-based approach to gaze estimation in low resolution settings is presented. Understanding the high resolution paradigm makes it possible to propose alternative models for gaze estimation. In this manner, three models are presented as solutions for gaze estimation in remote low resolution systems: a geometrical model, an interpolation model and a compound model. Since this work considers head position essential to improving gaze accuracy, a method for head pose estimation is also proposed. The methods are validated on an optimal framework, the I2Head database, which combines head and gaze data. The experimental validation of the models demonstrates their sensitivity to image processing inaccuracies, which is critical in the case of the geometrical model. Static and extreme movement scenarios are analyzed, showing the greater robustness of the compound and geometrical models in the presence of user displacement. Accuracy values of about 3° have been obtained, increasing to values close to 5° in extreme displacement settings, results fully comparable with the state of the art.

This work was supported in part by the Ministry of Economy and Competitiveness under Grant TIN2014-52897-R and in part by the Ministry of Science, Innovation and Universities under Grant TIN2017-84388-R.




Introducing I2Head database

  • Martinikorena Aranburu, Ion
  • Cabeza Laguna, Rafael
  • Villanueva Larre, Arantxa
  • Porta Cuéllar, Sonia
The I2Head database has been created with the aim of becoming an optimal reference for low cost gaze estimation. It exhibits the following outstanding characteristics: it takes into account key aspects of low resolution eye tracking technology; it combines images of users gazing at different grids of points from alternative positions with records of the user's head position; and it provides camera calibration information and a simple 3D head model for each user. The hardware used to build the database includes a 6D magnetic sensor and a webcam. A careful calibration method between the sensor and the camera has been developed to guarantee the accuracy of the data. Several sessions have been recorded for each user, including not only static head scenarios but also controlled displacements and even free head movements. The database is an outstanding framework for testing both gaze estimation algorithms and head pose estimation methods.

The authors would like to acknowledge the Spanish Ministry of Economy, Industry and Competitiveness for its support under Contracts TIN2014-52897-R and TIN2017-84388-R in the framework of the National Plan of I+D+i.




Fast and robust ellipse detection algorithm for head-mounted eye tracking systems

  • Martinikorena Aranburu, Ion
  • Cabeza Laguna, Rafael
  • Villanueva Larre, Arantxa
  • Urtasun, Iñaki
  • Larumbe Bergera, Andoni
In head-mounted eye tracking systems, correct detection of the pupil position is a key factor in estimating gaze direction. However, this is a challenging issue when the videos are recorded in real-world conditions, due to the many sources of noise and artifacts that exist in these scenarios, such as rapid changes in illumination, reflections, occlusions and an elliptical appearance of the pupil. It is therefore an indispensable prerequisite that a pupil detection algorithm be robust under these challenging conditions. In this work, we present a pupil center detection method based on searching for the point of maximum contribution to the radial symmetry of the image. Additionally, two different center refinement steps were incorporated with the aim of adapting the algorithm to images with a highly elliptical pupil appearance. The performance of the proposed algorithm is evaluated using a dataset consisting of 225,569 annotated head-mounted eye images from publicly available sources. The results are compared with the best algorithms found in the literature, and our algorithm is shown to be superior.

The authors would like to acknowledge the Spanish Ministry of Economy, Industry and Competitiveness for its support under Contract TIN2014-52897-R in the framework of the National Plan of I+D+i.
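As a rough illustration of radial-symmetry-based center detection (a generic voting scheme, not the authors' algorithm): each strong edge pixel casts votes a fixed distance against its gradient direction, i.e. toward the dark pupil interior, and the accumulator maximum is taken as the center.

```python
import numpy as np

def radial_symmetry_center(img, radii):
    """Vote-based center detector for a dark circular blob such as a pupil.

    Each strong edge pixel casts one vote per candidate radius, `r` pixels
    against its gradient direction (toward the darker interior); the
    accumulator maximum is returned as the center (row, col).
    """
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    acc = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    ys, xs = np.nonzero(mag > 0.2 * mag.max())   # keep only strong edges
    for y, x in zip(ys, xs):
        dy, dx = gy[y, x] / mag[y, x], gx[y, x] / mag[y, x]
        for r in radii:
            vy, vx = int(round(y - r * dy)), int(round(x - r * dx))
            if 0 <= vy < h and 0 <= vx < w:
                acc[vy, vx] += mag[y, x]    # gradient-weighted vote
    return np.unravel_index(np.argmax(acc), acc.shape)
```

A scheme like this is naturally robust to partial occlusions (eyelids, reflections) because the remaining visible arc still votes for the true center, which is one reason radial symmetry is attractive in real-world head-mounted recordings.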




Supervised descent method (SDM) applied to accurate pupil detection in off-the-shelf eye tracking systems

  • Larumbe Bergera, Andoni
  • Cabeza Laguna, Rafael
  • Villanueva Larre, Arantxa
The precise detection of the pupil/iris center is key to estimating gaze accurately. This becomes especially challenging in low cost frameworks, in which the algorithms employed for high performance systems fail. In recent years, a considerable effort has been made to apply training-based methods to low resolution images. In this paper, the Supervised Descent Method (SDM) is applied to the GI4E database. The 2D landmarks employed for training are the corners of the eyes and the pupil centers. To validate the proposed algorithm, a cross-validation procedure is performed. The strategy employed for the training allows us to affirm that our method can potentially outperform the state-of-the-art algorithms applied to the same dataset in terms of 2D accuracy. The promising results encourage further study of training-based methods for eye tracking.

Spanish Ministry of Economy, Industry and Competitiveness, contracts TIN2014-52897-R and TIN2017-84388-R.
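SDM works by learning a cascade of linear regressors, each mapping features sampled at the current landmark estimate to an update that moves the estimate toward the ground truth. The sketch below shows a single stage on a toy 1D edge-localization problem; all function names, features and data are hypothetical, and the real method uses image descriptors (e.g. SIFT) around 2D landmarks:

```python
import numpy as np

def train_sdm_stage(features, deltas):
    """Fit one cascade stage: an affine map from features sampled at the
    current estimate to the update that moves it toward the ground truth."""
    A = np.column_stack([features, np.ones(len(features))])
    W, *_ = np.linalg.lstsq(A, deltas, rcond=None)
    return W

def apply_sdm_stage(W, features):
    """Predict the update for new samples with the learned stage."""
    A = np.column_stack([features, np.ones(len(features))])
    return A @ W

def edge_features(signal_edge, pos, offsets):
    """Toy 'image' features: binary step values sampled around `pos`
    in a 1D signal whose edge sits at `signal_edge`."""
    return (pos + offsets >= signal_edge).astype(float)
```

In the full method, several such stages are chained: each stage is trained on the residual errors left by the previous one, which is what lets a sequence of cheap linear maps approximate a nonlinear descent.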




Improved strategies for HPE employing learning-by-synthesis approaches

  • Larumbe Bergera, Andoni
  • Ariz Galilea, Mikel
  • Bengoechea Irañeta, José Javier
  • Segura, Rubén
  • Cabeza Laguna, Rafael
  • Villanueva Larre, Arantxa
The first contribution of this paper is the presentation of a synthetic video database in which the ground truth of 2D facial landmarks and 3D head poses is available for training and evaluating Head Pose Estimation (HPE) methods. The database is publicly available and contains videos of users performing guided and natural movements. The second and main contribution is the proposal of a hybrid method for HPE based on Pose from Orthography and Scaling with Iterations (POSIT). The 2D landmark detection is performed using Random Cascaded-Regression Copse (R-CR-C). For the training stage, we use state-of-the-art labeled databases. A learning-by-synthesis approach has also been used to augment the size of the training set with the synthetic database. HPE accuracy is tested using two 3D head models from the literature. The proposed tracking method has been compared with state-of-the-art methods based on Supervised Descent Regressors (SDR) in terms of accuracy, achieving an improvement of 60%.

Spanish Ministry of Economy, Industry and Competitiveness, contract TIN2014-52897-R.
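POSIT iterates a POS (Pose from Orthography and Scaling) step: under a scaled-orthographic camera, the first two rows of the rotation matrix (scaled by s = f / Z0) are recovered by solving a linear system relating 3D model vectors to 2D image vectors, and the iterations then correct for perspective. A minimal numpy sketch of that core POS step is shown below; the function name and setup are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def pos_step(model_pts, image_pts):
    """One POS (Pose from Orthography and Scaling) step, the core of POSIT.

    model_pts: (N, 3) 3D model points; the first one is the reference point.
    image_pts: (N, 2) corresponding 2D projections.
    Returns (R, s): rotation matrix and scale s = f / Z0 under a
    scaled-orthographic camera.
    """
    M = model_pts - model_pts[0]              # model vectors from reference
    m = image_pts - image_pts[0]              # image vectors from reference
    # Solve M @ [I | J] = m in the least-squares sense; I and J are the
    # first two rotation rows scaled by s.
    sol, *_ = np.linalg.lstsq(M, m, rcond=None)
    I, J = sol[:, 0], sol[:, 1]
    s = 0.5 * (np.linalg.norm(I) + np.linalg.norm(J))
    r1, r2 = I / np.linalg.norm(I), J / np.linalg.norm(J)
    return np.vstack([r1, r2, np.cross(r1, r2)]), s
```

With exact scaled-orthographic data this single step already recovers the pose; the full POSIT loop re-scales the image points using the current depth estimates and repeats until the perspective effect is absorbed.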