Part 1: Document Description

Citation
Title: |
Supplemental Material for "Which Experimental Design is Better Suited for VQA Tasks? - Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations" |
Identification Number: |
doi:10.18419/darus-3380 |
Distributor: |
DaRUS |
Date of Distribution: |
2023-04-03 |
Version: |
1 |
Bibliographic Citation: |
Vriend, Sita; Vidyapu, Sandeep; Rama, Amer; Chen, Kun-Ting; Weiskopf, Daniel, 2023, "Supplemental Material for "Which Experimental Design is Better Suited for VQA Tasks? - Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations"", https://doi.org/10.18419/darus-3380, DaRUS, V1, UNF:6:mnjCJZLU4zgl+uXCgQV5fA== [fileUNF] |
Citation |
|
Title: |
Supplemental Material for "Which Experimental Design is Better Suited for VQA Tasks? - Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations" |
Identification Number: |
doi:10.18419/darus-3380 |
Authoring Entity: |
Vriend, Sita (Universität Stuttgart) |
Vidyapu, Sandeep (Universität Stuttgart) |
Rama, Amer (Universität Stuttgart) |
Chen, Kun-Ting (Universität Stuttgart) |
Weiskopf, Daniel (Universität Stuttgart) |
Software used in Production: |
RStudio |
Software used in Production: |
Gazealytics |
Distributor: |
DaRUS |
Access Authority: |
Vriend, Sita |
Access Authority: |
Chen, Kun-Ting |
Access Authority: |
Vidyapu, Sandeep |
Access Authority: |
Rama, Amer |
Depositor: |
Vriend, Sita |
Date of Deposit: |
2023-03-13 |
Holdings Information: |
https://doi.org/10.18419/darus-3380 |
Study Scope |
|
Keywords: |
Computer and Information Science, Social Sciences, Visual Analytics, Empirical Research, Human-Centered Computing, Eye Tracking |
Abstract: |
<p>We investigated the effect of stimulus-question ordering and of the modality in which the question is presented in a user study with visual question answering (VQA) tasks. In an eye-tracking user study (N=13), we tested 5 conditions within-subjects. The conditions were counter-balanced to account for order effects. We collected participants' answers to the VQA tasks and their responses to the NASA TLX questionnaire after each completed condition; gaze data was recorded only during exposure to the image stimulus.</p> <p>We provide the data and scripts used for statistical analysis, the files used for the exploratory analysis in WebVETA, the image stimuli used per condition and for training, as well as the VQA tasks related to the images. The images and questions used in the user study are a subset of the GQA dataset (Hudson and Manning, 2019). For more information see: <a href="https://cs.stanford.edu/people/dorarad/gqa/index.html">https://cs.stanford.edu/people/dorarad/gqa/index.html</a></p> The mean fixation duration, hit-any-AOI rate, and scan paths were generated using Gazealytics (<a href="https://www2.visus.uni-stuttgart.de/gazealytics/">https://www2.visus.uni-stuttgart.de/gazealytics/</a>). The hit-any-AOI rate and mean fixation duration were calculated per person per image stimulus. |
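The per-person, per-stimulus aggregation described above can be sketched in plain Python. The record layout and column names here are illustrative assumptions, not the dataset's actual schema; the deposited metrics were produced with Gazealytics:

```python
from collections import defaultdict

# Hypothetical fixation records: (participant, stimulus, duration_ms, hit_any_aoi).
# Field names and values are invented for illustration only.
fixations = [
    ("p01", "101.jpg", 250, True),
    ("p01", "101.jpg", 180, False),
    ("p01", "102.jpg", 300, True),
    ("p02", "101.jpg", 220, True),
]

# Group fixations by (participant, stimulus).
groups = defaultdict(list)
for pid, stim, dur, hit in fixations:
    groups[(pid, stim)].append((dur, hit))

# Mean fixation duration and hit-any-AOI rate (percent) per group.
metrics = {}
for key, fix in groups.items():
    mean_dur = sum(d for d, _ in fix) / len(fix)
    hit_rate = 100 * sum(1 for _, h in fix if h) / len(fix)
    metrics[key] = (mean_dur, hit_rate)
```

This mirrors the aggregation level of DataGazeMetrics.tab (one row per participant and image stimulus), under the stated assumptions about the input format.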
Kind of Data: |
User performance data |
Kind of Data: |
Stimuli |
Notes: |
<p>Users interested in reproducing the results can follow the methodology reported in the paper and use the image and question stimuli from the Files section. The analysis code (R scripts) and the data are also located in the Files section.</p> The zip files inside webveta_files.zip should be uploaded to Gazealytics (<a href="https://www2.visus.uni-stuttgart.de/gazealytics/">https://www2.visus.uni-stuttgart.de/gazealytics/</a>) as zip files. |
Methodology and Processing |
|
Sources Statement |
|
Data Sources: |
The images and questions used in the user study are a subset of the GQA dataset (Hudson and Manning, 2019, CC-BY 4.0), downloaded from <a href="https://cs.stanford.edu/people/dorarad/gqa/download.html">https://cs.stanford.edu/people/dorarad/gqa/download.html</a>. For more information see: <a href="https://cs.stanford.edu/people/dorarad/gqa/index.html">https://cs.stanford.edu/people/dorarad/gqa/index.html</a> and <p>Drew A. Hudson and Christopher D. Manning. 2019. GQA: A new dataset for real-world visual reasoning and compositional question answering. In 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition. IEEE, 6693-6702. doi: <a href="https://doi.org/10.1109/CVPR.2019.00686">10.1109/CVPR.2019.00686</a></p> |
Data Access |
|
Other Study Description Materials |
|
Related Materials |
|
Drew A. Hudson and Christopher D. Manning. 2019. GQA: A new dataset for real-world visual reasoning and compositional question answering. In 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition. IEEE, 6693–6702. <a href="https://doi.org/10.1109/CVPR.2019.00686">https://doi.org/10.1109/CVPR.2019.00686</a> |
|
Related Publications |
|
Citation |
|
Title: |
Sandeep Vidyapu, Sita A. Vriend, Amer Rama, Kun-Ting Chen, Daniel Weiskopf. 2023. Which Experimental Design is Better Suited for VQA Tasks? - Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations |
Bibliographic Citation: |
Sandeep Vidyapu, Sita A. Vriend, Amer Rama, Kun-Ting Chen, Daniel Weiskopf. 2023. Which Experimental Design is Better Suited for VQA Tasks? - Eye Tracking Study on Cognitive Load, Performance, and Gaze Allocations |
File Description--f199150 |
|
File: DataAccuracy.tab |
|
|
|
Notes: |
UNF:6:vZ/M6dCHYcfAn4BcK3jWQA== |
File Description--f199124 |
|
File: DataGaze.tab |
|
|
|
Notes: |
UNF:6:dBxORRTF7/Az+AUa1bdeHg== |
File Description--f199170 |
|
File: DataGazeMetrics.tab |
|
|
|
Notes: |
UNF:6:cEerEP0SIhqvkcm3NuU4FQ== |
File Description--f199145 |
|
File: DataNasaTLX.tab |
|
|
|
Notes: |
UNF:6:pAi9d6yvtxn8XsJmCGlgkA== |
List of Variables: |
|
Variables |
|
f199150 Location: |
Summary Statistics: Max. 520.0; Min. 101.0; Valid 1200.0; StDev 141.59787533216078; Mean 310.5; Variable Format: numeric Notes: UNF:6:LvoANKDhckARmyxAPDxr3A== |
f199150 Location: |
Summary Statistics: StDev 3.8844514445083376; Max. 13.0; Mean 6.916666666666667; Min. 1.0; Valid 1200.0 Variable Format: numeric Notes: UNF:6:OfWMP84qGm/OIt9Hg5Kh4Q== |
f199150 Location: |
Variable Format: character Notes: UNF:6:LXXO2RfXKebBIgYaqSEOvg== |
f199150 Location: |
Variable Format: character Notes: UNF:6:a6F8pyY+P9ygrbEqVTFCDg== |
f199150 Location: |
Variable Format: character Notes: UNF:6:5MftPSI0JgiBuQBbeE0I2Q== |
f199150 Location: |
Variable Format: character Notes: UNF:6:Q+4UahKJkc8J28dOTapMDg== |
f199150 Location: |
Variable Format: character Notes: UNF:6:sraKMkRsMaGmONRhddte5g== |
f199124 Location: |
Variable Format: character Notes: UNF:6:+EwF6cm8JNJJabm+c5wmhw== |
f199124 Location: |
Summary Statistics: Max. 520.0; Mean 310.47120559404703; Valid 2583019.0; Min. 101.0; StDev 129.23531677674086 Variable Format: numeric Notes: UNF:6:aUs5N752xJH9yk8uVnwBnQ== |
f199124 Location: |
Summary Statistics: StDev 3.8827966810412335; Max. 13.0; Min. 1.0; Mean 6.91644467191326; Valid 2583019.0 Variable Format: numeric Notes: UNF:6:ZN2IqvFVrENGJnW/3v53Uw== |
f199124 Location: |
Variable Format: character Notes: UNF:6:nHuGLUK8AUER2HWbcwcHmw== |
f199124 Location: |
Summary Statistics: Min. 1.0; Max. 4.0; StDev 0.8150389729217249; Mean 1.9642093225020782; Valid 2583019.0 Variable Format: numeric Notes: UNF:6:Kt+pqIP1k2zFc0JxLgTJAg== |
f199124 Location: |
Summary Statistics: Valid 2583019.0; Mean 1995.6869055145664; Min. 1.66666666666667; StDev 1412.4639798149042; Max. 6090.0; Variable Format: numeric Notes: UNF:6:YKthtDxK+5kLISqxj3PbzA== |
f199124 Location: |
Summary Statistics: Min. -1339.11849975586; Valid 2523428.0; StDev 335.41361217276085; Mean 990.5006086928836; Max. 3153.26831817627; Variable Format: numeric Notes: UNF:6:ilq5asg1JijNjI+sh0rgbQ== |
f199124 Location: |
Summary Statistics: StDev 190.10077537521366; Min. -33.5766374319791; Max. 2126.91780567169; Valid 2523428.0; Mean 523.5497908685409 Variable Format: numeric Notes: UNF:6:H8jt649T51osGCFHHGckWw== |
f199170 Location: |
Variable Format: character Notes: UNF:6:Fw5tFc7gvEv3AlAsV74jJw== |
f199170 Location: |
Summary Statistics: Mean 310.17445742904835; StDev 141.49128916983466; Max. 520.0; Min. 101.0; Valid 1198.0 Variable Format: numeric Notes: UNF:6:6jks14T9tMj+DrBuuaxnrA== |
f199170 Location: |
Summary Statistics: Mean 6.9065108514190285; Max. 13.0; Min. 1.0; Valid 1198.0; StDev 3.879721417209451; Variable Format: numeric Notes: UNF:6:+K7K+eVcUp4r3/w63yFenA== |
f199170 Location: |
Variable Format: character Notes: UNF:6:203cWW+BmUwt212oPGZbmQ== |
f199170 Location: |
Summary Statistics: StDev 81.09993695070962; Valid 1198.0; Mean 274.91096049965; Max. 992.777777777778; Min. 161.666666666667 Variable Format: numeric Notes: UNF:6:95rQxro4G+R4/2GzPfvruw== |
f199170 Location: |
Summary Statistics: StDev 20.14918489464005; Valid 863.0; Max. 100.0; Min. 0.0; Mean 19.60023174971032 Variable Format: numeric Notes: UNF:6:xPKgm1Oz5tlhK07hvT1x6g== |
f199145 Location: |
Summary Statistics: Max. 13.0; Valid 55.0; StDev 4.08248290463863; Mean 7.0; Min. 1.0 Variable Format: numeric Notes: UNF:6:ZXQl7/yOG3pq24otjCZ7Mw== |
f199145 Location: |
Variable Format: character Notes: UNF:6:84PSCZIJ+e1DffAqtjFRjg== |
f199145 Location: |
Variable Format: character Notes: UNF:6:OoH+YSRf8udKb0XGlKAUCw== |
f199145 Location: |
Variable Format: character Notes: UNF:6:jF5PBV5lnvyL5/3y5RIQIg== |
f199145 Location: |
Variable Format: character Notes: UNF:6:2wcr6u2EadSWt4jzI0THoA== |
f199145 Location: |
Variable Format: character Notes: UNF:6:g1rOgtGo9JJUK+edp4+HRw== |
f199145 Location: |
Variable Format: character Notes: UNF:6:pLXBY0+9osocBjocYP1f9A== |
Label: |
webveta_files.zip |
Text: |
Contains Gazealytics files (.zip) for all image stimuli per condition. Extract only this zip file, not the zip files contained within it. Each contained Gazealytics zip file can be uploaded individually to https://www2.visus.uni-stuttgart.de/gazealytics/ |
Notes: |
application/zip |
Label: |
answerAccuracy.R |
Text: |
Statistical analysis script for accuracy. |
Notes: |
type/x-r-syntax |
Label: |
GazeAnalysis.R |
Text: |
Statistical analysis script for gaze analysis: hit-any-AOI rate and mean fixation duration. |
Notes: |
type/x-r-syntax |
Label: |
nasatlx.R |
Text: |
Statistical analysis script for the NASA TLX ratings. |
Notes: |
type/x-r-syntax |
Label: |
questions.py |
Text: |
List of questions related to the image stimuli. The number in front of each question corresponds to the image file name. |
Notes: |
text/x-python |
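The numbering convention above (question prefix matches image file name) can be sketched as follows. The question strings and the exact prefix format are illustrative assumptions; consult questions.py for the actual structure:

```python
# Hypothetical entries: each question string starts with the image number,
# which maps to an image file such as "101.jpg". Content is invented.
questions = [
    "101 What color is the car?",
    "102 Is there a dog in the picture?",
]

# Build a lookup from image file name to its question text.
question_for_image = {}
for entry in questions:
    number, _, text = entry.partition(" ")
    question_for_image[f"{number}.jpg"] = text
```

Under these assumptions, `question_for_image["101.jpg"]` yields the question shown for image stimulus 101.jpg.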
Label: |
501.jpg |
Notes: |
image/jpeg |
Label: |
502.jpg |
Notes: |
image/jpeg |
Label: |
503.jpg |
Notes: |
image/jpeg |
Label: |
504.jpg |
Notes: |
image/jpeg |
Label: |
505.jpg |
Notes: |
image/jpeg |
Label: |
506.jpg |
Notes: |
image/jpeg |
Label: |
507.jpg |
Notes: |
image/jpeg |
Label: |
508.jpg |
Notes: |
image/jpeg |
Label: |
509.jpg |
Notes: |
image/jpeg |
Label: |
510.jpg |
Notes: |
image/jpeg |
Label: |
511.jpg |
Notes: |
image/jpeg |
Label: |
512.jpg |
Notes: |
image/jpeg |
Label: |
513.jpg |
Notes: |
image/jpeg |
Label: |
514.jpg |
Notes: |
image/jpeg |
Label: |
515.jpg |
Notes: |
image/jpeg |
Label: |
516.jpg |
Notes: |
image/jpeg |
Label: |
517.jpg |
Notes: |
image/jpeg |
Label: |
518.jpg |
Notes: |
image/jpeg |
Label: |
519.jpg |
Notes: |
image/jpeg |
Label: |
520.jpg |
Notes: |
image/jpeg |
Label: |
401.jpg |
Notes: |
image/jpeg |
Label: |
402.jpg |
Notes: |
image/jpeg |
Label: |
403.jpg |
Notes: |
image/jpeg |
Label: |
404.jpg |
Notes: |
image/jpeg |
Label: |
405.jpg |
Notes: |
image/jpeg |
Label: |
406.jpg |
Notes: |
image/jpeg |
Label: |
407.jpg |
Notes: |
image/jpeg |
Label: |
408.jpg |
Notes: |
image/jpeg |
Label: |
409.jpg |
Notes: |
image/jpeg |
Label: |
410.jpg |
Notes: |
image/jpeg |
Label: |
411.jpg |
Notes: |
image/jpeg |
Label: |
412.jpg |
Notes: |
image/jpeg |
Label: |
413.jpg |
Notes: |
image/jpeg |
Label: |
414.jpg |
Notes: |
image/jpeg |
Label: |
415.jpg |
Notes: |
image/jpeg |
Label: |
416.jpg |
Notes: |
image/jpeg |
Label: |
417.jpg |
Notes: |
image/jpeg |
Label: |
418.jpg |
Notes: |
image/jpeg |
Label: |
419.jpg |
Notes: |
image/jpeg |
Label: |
420.jpg |
Notes: |
image/jpeg |
Label: |
301.jpg |
Notes: |
image/jpeg |
Label: |
302.jpg |
Notes: |
image/jpeg |
Label: |
303.jpg |
Notes: |
image/jpeg |
Label: |
304.jpg |
Notes: |
image/jpeg |
Label: |
305.jpg |
Notes: |
image/jpeg |
Label: |
306.jpg |
Notes: |
image/jpeg |
Label: |
307.jpg |
Notes: |
image/jpeg |
Label: |
308.jpg |
Notes: |
image/jpeg |
Label: |
309.jpg |
Notes: |
image/jpeg |
Label: |
310.jpg |
Notes: |
image/jpeg |
Label: |
311.jpg |
Notes: |
image/jpeg |
Label: |
312.jpg |
Notes: |
image/jpeg |
Label: |
313.jpg |
Notes: |
image/jpeg |
Label: |
314.jpg |
Notes: |
image/jpeg |
Label: |
315.jpg |
Notes: |
image/jpeg |
Label: |
316.jpg |
Notes: |
image/jpeg |
Label: |
317.jpg |
Notes: |
image/jpeg |
Label: |
318.jpg |
Notes: |
image/jpeg |
Label: |
319.jpg |
Notes: |
image/jpeg |
Label: |
320.jpg |
Notes: |
image/jpeg |
Label: |
201.jpg |
Notes: |
image/jpeg |
Label: |
202.jpg |
Notes: |
image/jpeg |
Label: |
203.jpg |
Notes: |
image/jpeg |
Label: |
204.jpg |
Notes: |
image/jpeg |
Label: |
205.jpg |
Notes: |
image/jpeg |
Label: |
206.jpg |
Notes: |
image/jpeg |
Label: |
207.jpg |
Notes: |
image/jpeg |
Label: |
208.jpg |
Notes: |
image/jpeg |
Label: |
209.jpg |
Notes: |
image/jpeg |
Label: |
210.jpg |
Notes: |
image/jpeg |
Label: |
211.jpg |
Notes: |
image/jpeg |
Label: |
212.jpg |
Notes: |
image/jpeg |
Label: |
213.jpg |
Notes: |
image/jpeg |
Label: |
214.jpg |
Notes: |
image/jpeg |
Label: |
215.jpg |
Notes: |
image/jpeg |
Label: |
216.jpg |
Notes: |
image/jpeg |
Label: |
217.jpg |
Notes: |
image/jpeg |
Label: |
218.jpg |
Notes: |
image/jpeg |
Label: |
219.jpg |
Notes: |
image/jpeg |
Label: |
220.jpg |
Notes: |
image/jpeg |
Label: |
101.jpg |
Notes: |
image/jpeg |
Label: |
102.jpg |
Notes: |
image/jpeg |
Label: |
103.jpg |
Notes: |
image/jpeg |
Label: |
104.jpg |
Notes: |
image/jpeg |
Label: |
105.jpg |
Notes: |
image/jpeg |
Label: |
106.jpg |
Notes: |
image/jpeg |
Label: |
107.jpg |
Notes: |
image/jpeg |
Label: |
108.jpg |
Notes: |
image/jpeg |
Label: |
109.jpg |
Notes: |
image/jpeg |
Label: |
110.jpg |
Notes: |
image/jpeg |
Label: |
111.jpg |
Notes: |
image/jpeg |
Label: |
112.jpg |
Notes: |
image/jpeg |
Label: |
113.jpg |
Notes: |
image/jpeg |
Label: |
114.jpg |
Notes: |
image/jpeg |
Label: |
115.jpg |
Notes: |
image/jpeg |
Label: |
116.jpg |
Notes: |
image/jpeg |
Label: |
117.jpg |
Notes: |
image/jpeg |
Label: |
118.jpg |
Notes: |
image/jpeg |
Label: |
119.jpg |
Notes: |
image/jpeg |
Label: |
120.jpg |
Notes: |
image/jpeg |
Label: |
001.jpg |
Notes: |
image/jpeg |
Label: |
002.jpg |
Notes: |
image/jpeg |
Label: |
003.jpg |
Notes: |
image/jpeg |