
Investigation of Architectures and Receptive Fields for Appearance-based Gaze Estimation

Yunhan Wang, Xiangwei Shi, Shalini De Mello, Hyung Jin Chang, Xucong Zhang
¹Computer Vision Lab, Delft University of Technology  ²NVIDIA  ³School of Computer Science, University of Birmingham

Description

This repository contains frameworks for pre-processing, training, and evaluating full-face or multi-region (face, left and right eyes) gaze estimation models on three datasets: ETH-XGaze, MPIIFaceGaze, and Gaze360. The frameworks allow flexible loading of single-face or multi-region input, and serve both to reproduce our results and to benchmark new models.

The readme files in the dataset-specific submodules contain step-by-step tutorials to help you set up, train, and evaluate our existing models as well as your own new ones. A minimal sketch of the multi-region idea follows below.
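To illustrate what a multi-region model consumes and produces, here is a minimal sketch assuming PyTorch and ResNet-18 backbones. The class name MultiRegionGazeNet, the choice of backbone, the crop sizes, and the simple feature-concatenation head are all assumptions for illustration, not the architectures actually used in this repository.

import torch
import torch.nn as nn
from torchvision.models import resnet18


class MultiRegionGazeNet(nn.Module):
    """Illustrative model: predicts (pitch, yaw) from face, left-eye, and right-eye crops."""

    def __init__(self):
        super().__init__()
        # One backbone per region; replacing the classifier with Identity
        # exposes the 512-d feature vector of each ResNet-18.
        self.face_net = resnet18(weights=None)
        self.face_net.fc = nn.Identity()
        self.left_eye_net = resnet18(weights=None)
        self.left_eye_net.fc = nn.Identity()
        self.right_eye_net = resnet18(weights=None)
        self.right_eye_net.fc = nn.Identity()
        # Concatenate the three region features and regress a 2D gaze direction.
        self.head = nn.Linear(512 * 3, 2)

    def forward(self, face, left_eye, right_eye):
        feats = torch.cat(
            [self.face_net(face), self.left_eye_net(left_eye), self.right_eye_net(right_eye)],
            dim=1,
        )
        return self.head(feats)


# Example: a batch of 8 face crops (224x224) and eye crops (64x96, an assumed size).
model = MultiRegionGazeNet()
gaze = model(
    torch.randn(8, 3, 224, 224),
    torch.randn(8, 3, 64, 96),
    torch.randn(8, 3, 64, 96),
)
print(gaze.shape)  # torch.Size([8, 2])

A single-face variant would simply drop the two eye branches and regress gaze from the face feature alone; the actual pre-processing and model definitions are documented in the dataset submodules.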

Links

Pre-trained models

Pre-trained models for the different datasets can be found in the corresponding sub-folders (ETH-XGaze, MPIIFaceGaze, Gaze360).
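As a hedged example of restoring such a checkpoint with PyTorch: the file name below is a placeholder rather than an actual release asset, and the wrapping key "model_state" is an assumption, since the exact checkpoint format depends on how the weights were saved in this repository.

import torch

# Placeholder file name; substitute the checkpoint downloaded from the sub-folder.
checkpoint = torch.load("ethxgaze_model.pth", map_location="cpu")

# Some checkpoints store the weights directly, others wrap them in a dict;
# unwrap if a "model_state"-style key is present (key name is an assumption).
state_dict = checkpoint.get("model_state", checkpoint)

model = MultiRegionGazeNet()   # the illustrative model class sketched above
model.load_state_dict(state_dict)
model.eval()                   # switch to inference mode before evaluation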

Citation

@misc{wang2023investigation,
      title={Investigation of Architectures and Receptive Fields for Appearance-based Gaze Estimation}, 
      author={Yunhan Wang and Xiangwei Shi and Shalini De Mello and Hyung Jin Chang and Xucong Zhang},
      year={2023},
      eprint={2308.09593},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
