

Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. / Hou, Baosheng James; Newn, Joshua; Sidenmark, Ludwig et al.
Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. New York: ACM, 2023. p. 253:1-253:14.


Harvard

Hou, BJ, Newn, J, Sidenmark, L, Khan, AA, Bækgaard, P & Gellersen, H 2023, Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. in Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. ACM, New York, pp. 253:1-253:14. https://doi.org/10.1145/3544548.3581201

APA

Hou, B. J., Newn, J., Sidenmark, L., Khan, A. A., Bækgaard, P., & Gellersen, H. (2023). Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (pp. 253:1-253:14). ACM. https://doi.org/10.1145/3544548.3581201

Vancouver

Hou BJ, Newn J, Sidenmark L, Khan AA, Bækgaard P, Gellersen H. Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. New York: ACM. 2023. p. 253:1-253:14. doi: 10.1145/3544548.3581201

Author

Hou, Baosheng James ; Newn, Joshua ; Sidenmark, Ludwig et al. / Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. New York : ACM, 2023. pp. 253:1-253:14

Bibtex

@inproceedings{c8f1611f40b6429ca74d139546af0378,
title = "Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input",
abstract = "Head movement is widely used as a uniform type of input for human-computer interaction. However, there are fundamental differences between head movements coupled with gaze in support of our visual system, and head movements performed as gestural expression. Both Head-Gaze and Head Gestures are of utility for interaction but differ in their affordances. To facilitate the treatment of Head-Gaze and Head Gestures as separate types of input, we developed HeadBoost as a novel classifier, achieving high accuracy in classifying gaze-driven versus gestural head movement (F1-Score: 0.89). We demonstrate the utility of the classifier with three applications: gestural input while avoiding unintentional input by Head-Gaze; target selection with Head-Gaze while avoiding Midas Touch by head gestures; and switching of cursor control between Head-Gaze for fast positioning and Head Gesture for refinement. The classification of Head-Gaze and Head Gesture allows for seamless head-based interaction while avoiding false activation.",
author = "Hou, {Baosheng James} and Joshua Newn and Ludwig Sidenmark and Khan, {Anam Ahmad} and Per B{\ae}kgaard and Hans Gellersen",
year = "2023",
month = mar,
day = "19",
doi = "10.1145/3544548.3581201",
language = "English",
pages = "253:1--253:14",
booktitle = "Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems",
publisher = "ACM",
address = "New York",
}

RIS

TY - GEN

T1 - Classifying Head Movements to Separate Head-Gaze and Head Gestures as Distinct Modes of Input

AU - Hou, Baosheng James

AU - Newn, Joshua

AU - Sidenmark, Ludwig

AU - Khan, Anam Ahmad

AU - Bækgaard, Per

AU - Gellersen, Hans

PY - 2023/3/19

Y1 - 2023/3/19

N2 - Head movement is widely used as a uniform type of input for human-computer interaction. However, there are fundamental differences between head movements coupled with gaze in support of our visual system, and head movements performed as gestural expression. Both Head-Gaze and Head Gestures are of utility for interaction but differ in their affordances. To facilitate the treatment of Head-Gaze and Head Gestures as separate types of input, we developed HeadBoost as a novel classifier, achieving high accuracy in classifying gaze-driven versus gestural head movement (F1-Score: 0.89). We demonstrate the utility of the classifier with three applications: gestural input while avoiding unintentional input by Head-Gaze; target selection with Head-Gaze while avoiding Midas Touch by head gestures; and switching of cursor control between Head-Gaze for fast positioning and Head Gesture for refinement. The classification of Head-Gaze and Head Gesture allows for seamless head-based interaction while avoiding false activation.

AB - Head movement is widely used as a uniform type of input for human-computer interaction. However, there are fundamental differences between head movements coupled with gaze in support of our visual system, and head movements performed as gestural expression. Both Head-Gaze and Head Gestures are of utility for interaction but differ in their affordances. To facilitate the treatment of Head-Gaze and Head Gestures as separate types of input, we developed HeadBoost as a novel classifier, achieving high accuracy in classifying gaze-driven versus gestural head movement (F1-Score: 0.89). We demonstrate the utility of the classifier with three applications: gestural input while avoiding unintentional input by Head-Gaze; target selection with Head-Gaze while avoiding Midas Touch by head gestures; and switching of cursor control between Head-Gaze for fast positioning and Head Gesture for refinement. The classification of Head-Gaze and Head Gesture allows for seamless head-based interaction while avoiding false activation.

U2 - 10.1145/3544548.3581201

DO - 10.1145/3544548.3581201

M3 - Conference contribution/Paper

SP - 253:1-253:14

BT - Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems

PB - ACM

CY - New York

ER -