Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models

Publication/Creation Date
September 9 2015
Julian Steil (creator)
Andreas Bulling (creator)
Sonja Forderer (contributor)
Sabrina Hoppe (contributor)
Saarland University (contributor)
Max Planck Institute For Informatics (contributor)
Human visual behaviour has significant potential for activity recognition and computational behaviour analysis, but previous work has focused on supervised methods and the recognition of predefined activity classes from short-term eye movement recordings. We propose a fully unsupervised method to discover users’ everyday activities from their long-term visual behaviour. Our method combines a bag-of-words representation of visual behaviour that encodes saccades, fixations, and blinks with a latent Dirichlet allocation (LDA) topic model. We further propose different methods to encode saccades for use in the topic model. We evaluate our method on a novel long-term gaze dataset that contains full-day recordings of the natural visual behaviour of 10 participants (more than 80 hours in total). We also provide annotations for eight sample activity classes (outdoor, social interaction, focused work, travel, reading, computer work, watching media, eating) and for periods with no specific activity. We show that our method discovers these activities with performance competitive with that of previously published supervised methods.
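The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the vocabulary of eye-movement "words", the window counts, and the topic count are all hypothetical, and scikit-learn's LDA stands in for whatever inference the paper uses.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical vocabulary of eye-movement "words": each symbol stands for a
# discretized event, e.g. a saccade binned by direction, a fixation binned
# by duration, or a blink. The paper's actual encodings differ.
vocab = ["sac_up", "sac_down", "sac_left", "sac_right",
         "fix_short", "fix_long", "blink"]

# Toy corpus: each row is one time window of recorded visual behaviour,
# represented as counts of eye-movement words (bag-of-words). In practice
# these counts would come from a full-day eye tracking recording.
rng = np.random.default_rng(0)
X = rng.integers(0, 20, size=(50, len(vocab)))

# Fit an LDA topic model: each latent topic is a distribution over
# eye-movement words and is interpreted as a candidate everyday activity.
lda = LatentDirichletAllocation(n_components=4, random_state=0)
doc_topics = lda.fit_transform(X)

print(doc_topics.shape)       # (50, 4): per-window topic mixture
print(lda.components_.shape)  # (4, 7): per-topic word weights
```

Each row of `doc_topics` sums to one, so a window's dominant topic can serve as its discovered activity label, which is then compared against the ground-truth annotations for evaluation.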

Date archived
October 5 2015
Last edited
July 5 2021
How to cite this entry
Julian Steil, Andreas Bulling. (September 9 2015). "Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models". Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015). Fabric of Digital Life.