Results 1 to 10 of 23
May 26, 2023
Hirsch, Alexandra; Franke, Max; Koch, Steffen, 2023, "Source code for "Comparative Study on the Perception of Direction in Animated Map Transitions Using Different Map Projections"", https://doi.org/10.18419/darus-3540, DaRUS, V1
This repository contains the source code related to an OSF pre-registration for an online study. The goal of the study was to evaluate how well participants can determine the geographical direction of an animated map transition. In our between-subject online study, each of three...
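All entries in this listing are hosted on DaRUS, a Dataverse installation, so each dataset's metadata and files can be fetched programmatically through the standard Dataverse native API using the DOI shown in the citation. A minimal sketch, assuming the DaRUS API is served at darus.uni-stuttgart.de:

```python
import requests

# Assumed DaRUS server URL; the Dataverse native API exposes dataset
# metadata under /api/datasets/:persistentId, keyed by DOI.
SERVER = "https://darus.uni-stuttgart.de"
DOI = "doi:10.18419/darus-3540"  # first entry in this listing

resp = requests.get(
    f"{SERVER}/api/datasets/:persistentId",
    params={"persistentId": DOI},
)
resp.raise_for_status()
latest = resp.json()["data"]["latestVersion"]

# List the files attached to the latest published version.
for f in latest["files"]:
    print(f["dataFile"]["id"], f["dataFile"]["filename"])
```

Individual files can then be downloaded from the standard Dataverse access endpoint, `{SERVER}/api/access/datafile/<id>`.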
May 26, 2023
Hirsch, Alexandra; Franke, Max; Koch, Steffen, 2023, "Stimulus Data for "Comparative Study on the Perception of Direction in Animated Map Transitions Using Different Map Projections"", https://doi.org/10.18419/darus-3463, DaRUS, V1
We compare how well participants can determine the geographical direction of an animated map transition. In our between-subject online study, each of three groups is shown map transitions in one map projection: Mercator, azimuthal equidistant projection, or two-point equidistant...
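The quantity participants had to judge in both of these studies is the geographical direction between a transition's start and end point. For reference, the textbook initial great-circle bearing between two coordinates (this is the standard formula, not code taken from the study itself):

```python
from math import radians, degrees, sin, cos, atan2

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from north (textbook formula)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dlon = radians(lon2 - lon1)
    x = sin(dlon) * cos(phi2)
    y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlon)
    return degrees(atan2(x, y)) % 360

# Stuttgart -> New York starts out roughly west-northwest (~296 degrees).
print(initial_bearing(48.78, 9.18, 40.71, -74.01))
```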
Mar 14, 2023 - Perceptual User Interfaces
Bulling, Andreas, 2023, "MPIIFaceGaze", https://doi.org/10.18419/darus-3240, DaRUS, V1
We present the MPIIFaceGaze dataset, which is based on the MPIIGaze dataset, with additional human facial landmark annotations and the face regions available. We added facial landmark and pupil center annotations for 37,667 face images. Facial landmark annotations w...
Mar 8, 2023 - Perceptual User Interfaces
Bulling, Andreas, 2023, "Labeled pupils in the wild (LPW)", https://doi.org/10.18419/darus-3237, DaRUS, V1
We present labelled pupils in the wild (LPW), a novel dataset of 66 high-quality, high-speed eye region videos for the development and evaluation of pupil detection algorithms. The videos in our dataset were recorded from 22 participants in everyday locations at about 95 FPS usin...
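A minimal sketch for stepping through one of the eye-region videos with OpenCV; the file path here is hypothetical, and the actual directory layout should be checked after downloading the dataset:

```python
import cv2

# Hypothetical path; the layout of the LPW download may differ.
cap = cv2.VideoCapture("LPW/1/1.avi")
print(f"reported frame rate: {cap.get(cv2.CAP_PROP_FPS):.1f} FPS")

n_frames = 0
while True:
    ok, frame = cap.read()  # BGR image; feed this to a pupil detector
    if not ok:
        break
    n_frames += 1
cap.release()
print(f"{n_frames} frames read")
```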
Feb 24, 2023 - Perceptual User Interfaces
Bulling, Andreas, 2023, "MPIIEmo", https://doi.org/10.18419/darus-3287, DaRUS, V1
We present a human-validated dataset that contains 224 high-resolution, multi-view video clips and audio recordings of emotionally charged interactions between eight couples of actors. The dataset is fully annotated with categorical labels for four basic emotions (anger, happines...
Nov 30, 2022 - Perceptual User Interfaces
Bulling, Andreas, 2022, "DEyeAdicContact", https://doi.org/10.18419/darus-3289, DaRUS, V1, UNF:6:7QxaU+oeOPfaI8gMpCn5cw== [fileUNF]
We created our own dataset of natural dyadic interactions with fine-grained eye contact annotations, using videos of dyadic interviews published on YouTube. Especially compared to lab-based recordings, these YouTube interviews allow us to analyse behaviour in a natural situation. ...
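The [fileUNF] tag marks Dataverse-ingested tabular data, which downloads as tab-separated values. A sketch of loading it with pandas; the file and column names here are hypothetical, so the dataset's own documentation should be consulted for the real schema:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
ann = pd.read_csv("eye_contact_annotations.tab", sep="\t")

# e.g. fraction of annotated frames labelled as eye contact, per video
print(ann.groupby("video_id")["eye_contact"].mean())
```

The same approach applies to the other [fileUNF] entries in this listing (MPIIEgoFixation and the topic-model dataset below).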
Nov 30, 2022 - Perceptual User Interfaces
Bulling, Andreas, 2022, "MPIIMobileAttention", https://doi.org/10.18419/darus-3285, DaRUS, V1
This is a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total).
Nov 30, 2022 - Perceptual User Interfaces
Bulling, Andreas, 2022, "MPIIPrivacEye", https://doi.org/10.18419/darus-3286, DaRUS, V1
First-person video dataset recorded in daily life situations of 17 participants, annotated by the participants themselves for privacy sensitivity. The dataset of Steil et al. contains more than 90 hours of data recorded continuously from 20 participants (six females, aged 22-31) over more than fo...
Oct 31, 2022 - Perceptual User Interfaces
Bulling, Andreas, 2022, "MPIIEgoFixation", https://doi.org/10.18419/darus-3234, DaRUS, V1, UNF:6:fcDo56Ha9jxYApA9klubEQ== [fileUNF]
This dataset consists of hand-made fixation annotations for a subset of the private dataset introduced in Sugano and Bulling (2015), "Self-calibrating head-mounted eye trackers using egocentric visual saliency", in Proceedings of the 28th Annual ACM Symposium on User In...
Oct 28, 2022 - Perceptual User Interfaces
Bulling, Andreas, 2022, "Data for "Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models"", https://doi.org/10.18419/darus-3231, DaRUS, V1, UNF:6:ED4k3r2vTuaHMlglWvQ+Gw== [fileUNF]
We recorded a dataset of more than 80 hours of eye tracking data. The dataset comprises 7.8 hours of outdoor activities, 14.3 hours of social interaction, 31.3 hours of focused work, 8.3 hours of travel, 39.5 hours of reading, 28.7 hours of computer work, 18.3 hours of...
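Note that the per-activity durations listed above already sum to well over the 80 hours of recording, which indicates that the activity labels overlap, i.e. several labels can apply to the same time span. A quick check over the durations that survive the truncation:

```python
# Per-activity durations (hours) as given in the description; the
# final, truncated category is omitted because its label is unknown.
hours = {
    "outdoor activities": 7.8,
    "social interaction": 14.3,
    "focused work": 31.3,
    "travel": 8.3,
    "reading": 39.5,
    "computer work": 28.7,
}
print(f"{sum(hours.values()):.1f} labelled hours vs. >80 recorded hours")
# -> 129.9 labelled hours vs. >80 recorded hours
```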