Section: Application Domains

Modelling of awareness and expertise from Eye Gaze and Emotion

Humans display awareness and expertise through a variety of non-verbal channels, and it is increasingly possible to record and interpret such information with available technology. In the ANR CEEGE project, we have constructed an instrument for capturing and interpreting multimodal signals from humans engaged in solving challenging problems. Our instrument captures eye gaze, fixations, body posture, and facial expression signals from humans engaged in interactive tasks on a touch screen.

An initial experiment with multimodal observation of human experts solving problems in Chess revealed an unexpected phenomenon: rapid changes in emotion as players attempt to solve challenging problems. In a scientific collaboration with the NeuroCognition group at the University of Bielefeld, we have constructed a model of expert chess play that explains these unexpected results. This model has recently been tested in a second experiment with 22 chess players. Our results indicate that chess players associate emotions with chess chunks, and reactively use these associations to guide search over chunks for planning and problem solving. These results have recently been reported in a paper at the International Conference on Multimodal Interaction, and are the subject of the nearly completed doctoral dissertation of Thomas Guntz.

The results are currently being used in the construction of a student-aware driver training device to be commercialized by the SME Sym2B, financed by the SATT Linksium Project MAT (Monitoring Attention of Trainees), starting in September 2019. In this project we will construct a training simulator for the operation of buses and tramways.