The purpose of this experiment is to record a participant's eye movements while they listen to a spoken utterance and look at a screen displaying a set of four objects. For each trial a set of images is displayed and an utterance is played. The utterance is typically an instruction to the participant. The participant's task is to look and listen carefully and to select one of the four objects as instructed.
This example is a slightly modified version of the visworld4 experiment template provided on the Zep wiki. If desired, you can change the script to, for instance, show only one image. This experiment template could also be used as a basis for a looking-while-listening experiment.
- Download experiment package: visworld4_demo.zip
- Install the experiment package as described in Installing a Zep experiment script
- Run the experiment as described in Running a Zep experiment
On successful completion of the experiment session, the experiment results database will contain the results for that session as defined by the experiment. Since this is an eye-tracking experiment, it should also have produced an eye-tracking data file (or several, depending on the script) in a subdirectory of the experiment folder.
In addition to eye-tracking data (eye samples and gaze analysis events), an eye-tracking data file can contain various other data records, for instance markers for the start and end of blocks, trials and stimuli, or other experiment-defined messages.
For details about the experiment script please refer to the documentation on the Zep wiki.
Screencast of the experiment
This screencast shows running the experiment from ZepMan. The visual stimuli are taken from the Bank of Standardized Stimuli (https://doi.org/10.3758/s13428-011-0184-7) and from Openclipart (https://openclipart.org). The auditory stimuli were created using a text-to-speech system (https://luvvoice.com). The video has not yet been narrated.
This experiment produces trial results in the Experiment Results Database. We can use ZepMan to extract these results from the database:
visworld4_demo-test-1.csv
Since we ran just one session, the database will only contain data for that single session. Normally you would extract the database contents after all participants taking part in the study have been tested.
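As an illustration, the per-session CSV exports could be combined for analysis roughly as follows. This is a minimal sketch in Python with pandas; the file-name pattern is modelled on the example export above, and the exact column layout depends on what your experiment records:

    # Combine per-session trial results exported from ZepMan.
    # The file-name pattern below follows the example export shown above;
    # adjust it (and the CSV delimiter, if different) to your own exports.
    from glob import glob

    import pandas as pd

    session_files = sorted(glob("visworld4_demo-test-*.csv"))
    results = pd.concat(
        (pd.read_csv(f) for f in session_files), ignore_index=True
    )
    print(results.head())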
In addition to the regular trial results, an eye-tracking data file is created:
0001_00_01.bed[1]
Converting this file to CSV format yields:
0001_00_01.csv
The following command was used to convert the file:
eyevec-convert 0001_00_01.bed -right=no -left=no -px -r
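Once converted to CSV, the file can be inspected with ordinary data tools. The sketch below uses Python with pandas; the column names ("type", "time", "message") are assumptions made for illustration, so check the header of the CSV that eyevec-convert actually produces:

    # Load the converted eye-tracking data file and look at the record types.
    # Column names used here are assumptions; verify against your CSV header.
    import pandas as pd

    data = pd.read_csv("0001_00_01.csv")

    # Count the records of each kind (eye samples, gaze events, messages, ...).
    print(data["type"].value_counts())

    # Experiment-defined messages mark, for instance, the start and end of
    # blocks, trials and stimuli; these link the gaze data to the trials.
    messages = data[data["type"] == "message"]
    print(messages[["time", "message"]].head())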
Post-processing the eye-tracking data
For each test item you might want to determine the cumulative fixation duration per stimulus object and the first fixation relative to some reference point in the auditory stimulus (for example the target onset).
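A rough sketch of such an analysis in Python with pandas is shown below. It assumes you have already derived a table of fixations from the gaze analysis events, with one row per fixation; all column names (trial, object, start, end, target_onset) are hypothetical:

    import pandas as pd

    # Hypothetical fixation table derived from the gaze analysis events.
    # Times are in ms; 'object' is the stimulus object the fixation fell on.
    fix = pd.DataFrame({
        "trial":        [1, 1, 1, 2, 2],
        "object":       ["target", "distractor", "target", "target", "distractor"],
        "start":        [120, 480, 900, 150, 700],
        "end":          [450, 880, 1400, 650, 1100],
        "target_onset": [800, 800, 800, 600, 600],
    })

    # Cumulative fixation duration per stimulus object, per trial.
    fix["duration"] = fix["end"] - fix["start"]
    cumulative = fix.groupby(["trial", "object"])["duration"].sum()
    print(cumulative)

    # First fixation on each object relative to the target onset: earliest
    # fixation start per trial and object, minus the trial's reference point.
    first_start = fix.groupby(["trial", "object"])["start"].min()
    onset = fix.groupby("trial")["target_onset"].first()
    print(first_start.sub(onset, level="trial"))

A negative latency then indicates that the object was already fixated before the target word started playing, which is common in visual world studies.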