Data for "Prediction of Search Targets From Fixations in Open-World Settings" (doi:10.18419/darus-3226)


Document Description

Citation

Title:

Data for "Prediction of Search Targets From Fixations in Open-World Settings"

Identification Number:

doi:10.18419/darus-3226

Distributor:

DaRUS

Date of Distribution:

2022-10-28

Version:

1

Bibliographic Citation:

Bulling, Andreas, 2022, "Data for "Prediction of Search Targets From Fixations in Open-World Settings"", https://doi.org/10.18419/darus-3226, DaRUS, V1

Study Description

Citation

Title:

Data for "Prediction of Search Targets From Fixations in Open-World Settings"

Identification Number:

doi:10.18419/darus-3226

Authoring Entity:

Bulling, Andreas (Universität Stuttgart)

Other identifications and acknowledgements:

Cluster of Excellence on Multimodal Computing and Interaction (MMCI) at Saarland University

Grant Number:

EXC 284 - 39134088

Distributor:

DaRUS

Access Authority:

Bulling, Andreas

Depositor:

Perceptual User Interfaces

Date of Deposit:

2022-10-13

Holdings Information:

https://doi.org/10.18419/darus-3226

Study Scope

Keywords:

Computer and Information Science

Topic Classification:

Gaze, Visual Search

Abstract:

We designed a human study to collect fixation data during visual search. We opted for a task that involved searching for a single image (the target) within a synthesised collage of images (the search set). Each collage is a random permutation of a finite set of images. To explore the impact of the similarity in appearance between target and search set on both fixation behaviour and automatic inference, we created three search tasks covering a range of similarities. In prior work, colour was found to be a particularly important cue for guiding search to targets and target-similar objects. For the first task we therefore selected 78 coloured O'Reilly book covers to compose the collages. These covers show a woodcut of an animal at the top and the title of the book in a characteristic font underneath. Given that overall cover appearance was very similar, this task allows us to analyse fixation behaviour when colour is the most discriminative feature. For the second task we used a set of 84 book covers from Amazon. In contrast to the first task, the appearance of these covers is more diverse. This makes it possible to analyse fixation behaviour when participants could use both structure and colour information to find the target. Finally, for the third task, we used a set of 78 mugshots from a public database of suspects. In contrast to the other tasks, we transformed the mugshots to grey-scale so that they did not contain any colour information, which allows analysis of fixation behaviour when colour information was not available at all. We found faces to be particularly interesting given the relevance of searching for faces in many practical applications.

- 18 participants (9 males), age 18-30
- Gaze data recorded with a stationary Tobii TX300 eye tracker

More information about the dataset can be found in the README file (https://darus.uni-stuttgart.de/file.xhtml?fileId=165014).
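To make the collage construction described above concrete, the following is a minimal hypothetical sketch of how such a layout could be produced: each collage places every image of the search set exactly once, in a random order, on a fixed grid. The function name, grid shape, and seed handling are assumptions for illustration; the dataset does not include the original generation code.

```python
import random

def make_collage_layout(n_images, rows, cols, seed=None):
    """Return a rows x cols grid of image indices, i.e. one random
    permutation of the full search set (hypothetical helper)."""
    if rows * cols != n_images:
        raise ValueError("grid must hold exactly the full search set")
    rng = random.Random(seed)
    order = list(range(n_images))
    rng.shuffle(order)  # random permutation of the search-set images
    # Slice the shuffled order into rows to obtain the grid layout.
    return [order[r * cols:(r + 1) * cols] for r in range(rows)]

# Example: a 6 x 13 collage for the 78 O'Reilly covers.
layout = make_collage_layout(78, 6, 13, seed=0)
```

Each call with a different seed yields a different permutation, matching the description that every collage is a random arrangement of the same finite image set.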

Kind of Data:

Eye gaze fixations

Notes:

The data is only to be used for non-commercial scientific purposes.

Methodology and Processing

Sources Statement

Data Access

Other Study Description Materials

Related Publications

Citation

Title:

Prediction of Search Targets From Fixations in Open-world Settings

Identification Number:

10.1109/CVPR.2015.7298700

Bibliographic Citation:

Hosnieh Sattar, Sabine Müller, Mario Fritz and Andreas Bulling. 2015. Prediction of Search Targets From Fixations in Open-world Settings. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 981-990.

Other Study-Related Materials

Label:

Amazon.zip

Notes:

application/zip

Other Study-Related Materials

Label:

changelog.txt

Notes:

text/plain

Other Study-Related Materials

Label:

Mugshots.zip

Notes:

application/zip

Other Study-Related Materials

Label:

Oreilly.zip

Notes:

application/zip

Other Study-Related Materials

Label:

readme.txt

Notes:

text/plain