Nov 22, 2024 - SFB-TRR 161 A07 "Visual Attention Modeling for Optimization of Information Visualizations"
Wang, Yao, 2024, "SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)", https://doi.org/10.18419/darus-3884, DaRUS, V2
The link between visual attention and users' needs when visually exploring information visualisations is under-explored due to a lack of large and diverse datasets to facilitate such analyses. To fill this gap, we introduce SalChartQA, a novel crowd-sourced dataset...
Oct 28, 2024
Sood, Ekta; Kögel, Fabian; Bulling, Andreas, 2024, "VQA-MHUG", https://doi.org/10.18419/darus-4428, DaRUS, V1
We present VQA-MHUG - a novel 49-participant dataset of multimodal human gaze on both images and questions during visual question answering (VQA), collected using a high-speed eye tracker. To the best of our knowledge, this is the first resource containing multimodal human gaze d...
May 22, 2024
Zermiani, Francesca, 2024, "InteRead", https://doi.org/10.18419/darus-4091, DaRUS, V1, UNF:6:peWc+ExRsnPhsVEeOyMu0w== [fileUNF]
The InteRead dataset is designed to explore the impact of interruptions on reading behavior. It includes eye-tracking data from 50 adults with normal or corrected-to-normal eyesight and proficiency in English (native or C1 level). The dataset encompasses a self-paced reading task...
May 16, 2024
Bulling, Andreas, 2024, "InvisibleEye", https://doi.org/10.18419/darus-3288, DaRUS, V1
We recorded a dataset of more than 280,000 close-up eye images with ground-truth annotation of the gaze location. A total of 17 participants were recorded, covering a wide range of appearances. Gender: five (29%) female, 12 (71%) male. Nationality: seven (41%) German, seven (41...
Mar 14, 2023
Bulling, Andreas, 2023, "MPIIFaceGaze", https://doi.org/10.18419/darus-3240, DaRUS, V1
We present the MPIIFaceGaze dataset, which is based on the MPIIGaze dataset, with additional human facial landmark annotations and face regions available. We added facial landmark and pupil center annotations for 37,667 face images. Facial landmark annotations w...
Mar 8, 2023
Bulling, Andreas, 2023, "Labeled pupils in the wild (LPW)", https://doi.org/10.18419/darus-3237, DaRUS, V1
We present labelled pupils in the wild (LPW), a novel dataset of 66 high-quality, high-speed eye region videos for the development and evaluation of pupil detection algorithms. The videos in our dataset were recorded from 22 participants in everyday locations at about 95 FPS usin...
Feb 24, 2023
Bulling, Andreas, 2023, "MPIIEmo", https://doi.org/10.18419/darus-3287, DaRUS, V1
We present a human-validated dataset that contains 224 high-resolution, multi-view video clips and audio recordings of emotionally charged interactions between eight couples of actors. The dataset is fully annotated with categorical labels for four basic emotions (anger, happiness...
Nov 30, 2022
Bulling, Andreas, 2022, "DEyeAdicContact", https://doi.org/10.18419/darus-3289, DaRUS, V1, UNF:6:7QxaU+oeOPfaI8gMpCn5cw== [fileUNF]
We created our own dataset of natural dyadic interactions with fine-grained eye contact annotations, using videos of dyadic interviews published on YouTube. Compared to lab-based recordings in particular, these YouTube interviews allow us to analyse behaviour in a natural situation...
Nov 30, 2022
Bulling, Andreas, 2022, "MPIIMobileAttention", https://doi.org/10.18419/darus-3285, DaRUS, V1
This is a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total).
Nov 30, 2022
Bulling, Andreas, 2022, "MPIIPrivacEye", https://doi.org/10.18419/darus-3286, DaRUS, V1
A first-person video dataset recorded in daily-life situations of 17 participants, annotated by the participants themselves for privacy sensitivity. The dataset of Steil et al. contains more than 90 hours of data recorded continuously from 20 participants (six female, aged 22-31) over more than fo...
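Each entry above resolves through DaRUS, a Dataverse installation, so its metadata can be retrieved programmatically from the dataset DOI. The following is a minimal sketch only: the DaRUS host name and the endpoint path are assumptions based on the standard Dataverse native API layout, not something stated in the listing itself.

```python
# Sketch: build the Dataverse native-API metadata URL for a DaRUS
# dataset from its DOI. Host and endpoint are assumed, following the
# standard Dataverse API convention.
from urllib.parse import urlencode

DARUS_BASE = "https://darus.uni-stuttgart.de"  # assumed DaRUS host

def metadata_url(doi: str) -> str:
    """Return the native-API URL for a dataset's metadata record."""
    query = urlencode({"persistentId": f"doi:{doi}"})
    return f"{DARUS_BASE}/api/datasets/:persistentId/?{query}"

# Example for the SalChartQA entry above:
url = metadata_url("10.18419/darus-3884")
# The JSON record could then be fetched with any HTTP client,
# e.g. requests.get(url).json()
```

The returned JSON (under the usual Dataverse layout) would include the citation metadata and file list shown in each entry here.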