This .zip file contains the Unity project of the visualization tool described in the paper "Visual Gaze Labeling for Augmented Reality Studies", accepted at the EuroVis 2023 conference. The tool can be used to annotate gaze data from Augmented Reality (AR) scenarios in order to perform AOI-based eye-tracking analysis. It consists of a gaze replay and a timeline visualization, which are linked together to provide spatial and image-based annotation. The project includes a dataset that we collected in an AR pilot study.
Please install Unity 2020.3.24 to run the visualization tool. The Unity project contains two scenes (located in Assets/Scenes):
- TimelineVisualization
- GazeReplay
First, drag both scenes into the Hierarchy window, then unload GazeReplay. In File > Build Settings, the Scenes in Build should be ordered as follows: TimelineVisualization (index 0), GazeReplay (index 1). When you start the active scene (TimelineVisualization), the GazeReplay scene is loaded as well. You can work with the visualization tool in the Game view.
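The exact loading logic lives in the project's own scripts; as a rough sketch (not the project's actual code), a startup component in the TimelineVisualization scene could load GazeReplay additively like this:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Illustrative sketch only: loads the GazeReplay scene additively when the
// TimelineVisualization scene starts, mirroring the build-settings order above.
public class SceneLoaderSketch : MonoBehaviour
{
    void Start()
    {
        // Load GazeReplay (build index 1) on top of the active TimelineVisualization scene.
        if (!SceneManager.GetSceneByName("GazeReplay").isLoaded)
        {
            SceneManager.LoadScene("GazeReplay", LoadSceneMode.Additive);
        }
    }
}
```

Additive loading keeps both scenes active at the same time, which matches the behavior described above (starting TimelineVisualization also brings up GazeReplay).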
The project consists of the Assets, Packages, and ProjectSettings folders. The Assets folder contains the necessary scripts, prefabs, and data.
Assets
├── Scenes
│   ├── TimelineVisualization   # Unity scene for the timeline visualization
│   └── GazeReplay              # Unity scene for the gaze replay
├── Scripts
│   ├── AOI_Manager             # creates AOI cubes in the gaze replay
│   ├── FileHandler             # saves annotated gaze data to a JSON file
│   ├── Frame                   # holds fixation information
│   ├── FrameAnnotator          # handles fixation annotation in the timeline visualization
│   └── dataHandler             # extracts gaze data from the CSV files (see the sketch below)
└── StreamingAssets
    ├── frames                  # thumbnail images of the fixations for each participant
    ├── study_data               # fixation data of each participant
    └── RScript                  # R code to extract fixations from the gaze data
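To give a concrete idea of what dataHandler and FileHandler do, the sketch below reads fixation rows from a CSV file in StreamingAssets/study_data and writes annotated data back out as JSON. It is only an illustration: the column layout, the field names (timestamp, x, y, label), the class names, and the output location are assumptions, not the project's actual format.

```csharp
using System.IO;
using UnityEngine;

// Illustrative sketch only: the actual dataHandler/FileHandler scripts may use a
// different data layout. The fields below are assumptions made for this example.
[System.Serializable]
public class FixationRecord
{
    public float timestamp;
    public float x;
    public float y;
    public string label;   // AOI label assigned during annotation
}

[System.Serializable]
public class FixationList
{
    public FixationRecord[] fixations;
}

public static class GazeDataIOSketch
{
    // Read a participant's fixation CSV from StreamingAssets/study_data (assumes a header line).
    public static FixationList LoadCsv(string fileName)
    {
        string path = Path.Combine(Application.streamingAssetsPath, "study_data", fileName);
        string[] lines = File.ReadAllLines(path);
        var records = new FixationRecord[lines.Length - 1];
        for (int i = 1; i < lines.Length; i++)   // skip the header line
        {
            string[] cols = lines[i].Split(',');
            records[i - 1] = new FixationRecord
            {
                timestamp = float.Parse(cols[0]),
                x = float.Parse(cols[1]),
                y = float.Parse(cols[2]),
                label = ""                       // filled in during annotation
            };
        }
        return new FixationList { fixations = records };
    }

    // Save the annotated fixations as a JSON file.
    public static void SaveJson(FixationList data, string fileName)
    {
        string json = JsonUtility.ToJson(data, true);
        File.WriteAllText(Path.Combine(Application.persistentDataPath, fileName), json);
    }
}
```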
Please check the GitHub page for the latest version.