[TPAMI] Automatic Gaze Analysis ‘in-the-wild’: A Survey

EyeGazeSurvey [TPAMI 2023 Accepted]

Overview of Human Visual System

(Figure: eye model)

Overview of Gaze Estimation Setup

(Figure: gaze estimation setup)

  • Automatic Gaze Analysis: A Survey of Deep Learning based Approaches by Shreya Ghosh, Abhinav Dhall, Munawar Hayat, Jarrod Knibbe and Qiang Ji. (arXiv: 2108.05479)

If we have missed your work, please let us know and we'll add it.

Datasets

A comparison of gaze datasets with respect to several attributes: number of subjects (# subjects), gaze labels, modality, head pose and gaze angle ranges along the yaw and pitch axes, environment (Env.), baseline method, amount of data (# of data), and year of publication. Abbreviations used: In: Indoor, Out: Outdoor, Both: Indoor + Outdoor, Gen.: Generation, u/k: unknown, Seq.: Sequence, VF: Visual Field, EB: Eye Blink, GE: Gaze Event, GBRT: Gradient Boosting Regression Trees, GC: Gaze Communication, GNN: Graph Neural Network, and Seg.: Segmentation.

Dataset Links:

CAVE 2013, EYEDIAP 2014, UT MV 2014, OMEG 2015, MPIIGaze 2015, GazeFollow 2015, SynthesEyes 2015, GazeCapture 2016, UnityEyes 2016, TabletGaze 2017, MPIIFaceGaze 2017, InvisibleEye 2017, RT-GENE 2018, Gaze360 2019, RT-BENE 2019, NVGaze 2019, HUST-LBW 2019, VACATION 2019, OpenEDS2019 2019, OpenEDS2020 2020, OpenEDS2021 2021, mEBAL 2020, ETH-XGaze 2020, EVE 2020, Gaze-in-the-Wild 2020, LAEO 2021, GOO 2021, OpenNEEDS 2021

The details of the datasets are summarized in the following table. (Figure: datasets)

Gaze Analysis Methods (to be updated)

A comparison of gaze analysis methods with respect to registration (Reg.), representation (Represent.), level of supervision, model, prediction, validation on benchmark datasets (validation), platforms, publication venue (Publ.), and year. Here, GV: Gaze Vector, Scr.: Screen, LOSO: Leave One Subject Out, LPIPS: Learned Perceptual Image Patch Similarity, MM: Morphable Model, RRF: Random Regression Forest, AEM: Anatomic Eye Model, GRN: Gaze Regression Network, ET: External Target, FV: Free Viewing, HH: HandHeld Device, HMD: Head Mounted Device, Seg.: Segmentation, GR: Gaze Redirection, and LAEO: Looking At Each Other.

(Figure: methods)

Privacy Issues

(Figure: methods)

Contact

If you find the survey useful for your research, please consider citing our work:

@article{ghosh2021Automatic,
  title={Automatic Gaze Analysis: A Survey of Deep Learning based Approaches},
  author={Ghosh, Shreya and Dhall, Abhinav and Hayat, Munawar and Knibbe, Jarrod and Ji, Qiang},
  journal={arXiv preprint arXiv:2108.05479},
  year={2021}
}

Eye-Gaze

A curated list of papers and datasets for various gaze estimation techniques, inspired by awesome-computer-vision. Mostly recent papers are listed here.

Contents

Eye Gaze Estimation/Tracking

Gaze Trajectory

Gaze Redirection

Gaze Zone + Driver Gaze

Gaze and Attention

Gaze and Interaction

Visual Attention

Uncategorized Papers
