1 to 5 of 5 Results
Nov 22, 2024
Wang, Yao, 2024, "SalChartQA: Question-driven Saliency on Information Visualisations (Dataset and Reproduction Data)", https://doi.org/10.18419/DARUS-3884, DaRUS, V2
The link between visual attention and users' needs when visually exploring information visualisations is under-explored due to a lack of large and diverse datasets to facilitate these analyses. To fill this gap, we introduce SalChartQA - a novel crowd-sourced dataset that uses the BubbleView interface as a proxy for human gaze and a q...
Apr 8, 2024
Wang, Yao; Bulling, Andreas, 2024, "VisRecall++: Analysing and Predicting Recallability of Information Visualisations from Gaze Behaviour (Dataset and Reproduction Data)", https://doi.org/10.18419/DARUS-3138, DaRUS, V1, UNF:6:NwphGtoYrBQqd2TyRh0OHA== [fileUNF]
This dataset contains the stimuli and collected participant data of VisRecall++. The structure of the dataset is described in the README file. Further, if you are interested in the code accompanying the publication, you can find a copy of the code repository (see Metadata for Research Software) within this dataset.
Mar 22, 2024
Wang, Yao; Bulling, Andreas, 2024, "Saliency3D: A 3D Saliency Dataset Collected on Screen (Dataset and Experiment Application)", https://doi.org/10.18419/DARUS-4101, DaRUS, V1
While visual saliency has recently been studied in 3D, the experimental setup for collecting 3D saliency data can be expensive and cumbersome. To address this challenge, we propose a novel experimental design that utilizes an eye tracker on a screen to collect 3D saliency data. Our experimental design reduces the cost and complexity of 3D saliency...
Jun 26, 2023
Wang, Yao, 2023, "Data for: "Scanpath Prediction on Information Visualizations"", https://doi.org/10.18419/DARUS-3361, DaRUS, V2, UNF:6:cqkNueYjBVCLYaXEqJq3yw== [fileUNF]
We propose the Unified Model of Saliency and Scanpaths (UMSS) - a model that learns to predict multi-duration saliency and scanpaths (i.e. sequences of eye fixations) on information visualisations. Although scanpaths provide rich information about the importance of different visualisation elements during the visual exploration process, prior work has b...
Jun 21, 2022
Wang, Yao; Bulling, Andreas, 2022, "Data for "VisRecall: Quantifying Information Visualisation Recallability via Question Answering"", https://doi.org/10.18419/DARUS-2826, DaRUS, V1, UNF:6:AuvgRc09o1rESd63AqlW9Q== [fileUNF]
Despite its importance for assessing the effectiveness of communicating information visually, fine-grained recallability of information visualisations has not been studied quantitatively so far. We propose a question-answering paradigm to study visualisation recallability and present VisRecall -- a novel dataset consisting of 200 information visual...
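All of the datasets above are hosted on DaRUS, which runs Dataverse, so their metadata and file listings can be fetched programmatically by DOI through the Dataverse native API. The sketch below builds the metadata URL for a given DOI; the base URL is assumed from the DOI landing pages, and the commented-out request is left to the reader since it requires network access.

```python
def darus_dataset_url(doi: str) -> str:
    """Build the Dataverse native-API metadata URL for a dataset DOI on DaRUS.

    Assumes https://darus.uni-stuttgart.de as the repository host
    (inferred from the dataset landing pages, not stated in this listing).
    """
    base = "https://darus.uni-stuttgart.de"
    return f"{base}/api/datasets/:persistentId/?persistentId=doi:{doi}"


# Example: metadata URL for the SalChartQA dataset listed above.
url = darus_dataset_url("10.18419/DARUS-3884")

# To actually fetch the JSON metadata (requires network access):
# import urllib.request, json
# with urllib.request.urlopen(url) as resp:
#     meta = json.load(resp)
```

A GET on this URL returns a JSON document whose `data.latestVersion.files` array lists the downloadable files of the dataset's most recent version.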