Eyes on you: ensuring empathic accuracy or signaling empathy?
SUPPLEMENT S6.3

A customized MATLAB script (MathWorks, Inc., Natick, MA, version 9.5) was used to pre-process raw eye-tracking data into measures of eye gaze per perceiver per video. Raw gaze data from at least one eye were used to calculate gaze position and duration. In addition, the validity of the gaze data was calculated as the percentage of successfully recorded eye-tracking samples per video, as an estimate of data quality. Individual videos with validity below 70% were excluded from the analyses.

To follow the natural movement of the targets in the videos, dynamically moving areas of interest (AOIs) were created around the left eye, right eye, mouth, and the face as a whole of each individual target using MATLAB's cascade object detector, which implements the Viola and Jones (2001) algorithm for face and facial-feature detection. Specifically, for each frame of each video, this algorithm output rectangular AOIs encompassing the left eye, right eye, mouth, and face (see below). Outlier removal, smoothing, and interpolation were then applied to the AOIs to correct any AOIs identified incorrectly due to movement or blinking of the target in the video. Gaze data within the right- and left-eye AOIs were corrected for overlap and combined into a single AOI for the eye region. The screenshot of the target person presented below is blurred for privacy reasons.
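The validity check described above can be sketched as follows. This is a minimal Python illustration, not the authors' MATLAB script: it assumes that missing gaze samples are encoded as NaN (the original script's missing-data coding is not documented here), and the function names `video_validity` and `exclude_low_validity` are hypothetical.

```python
import math

def video_validity(samples):
    """Percentage of gaze samples in a video that were successfully recorded.

    Assumes missing samples are encoded as NaN (an assumption; the original
    MATLAB script's missing-data convention is not specified in the text).
    """
    recorded = sum(1 for s in samples if not math.isnan(s))
    return 100.0 * recorded / len(samples)

def exclude_low_validity(videos, threshold=70.0):
    """Keep only videos whose validity meets the threshold (70% in the text)."""
    return {vid: trace for vid, trace in videos.items()
            if video_validity(trace) >= threshold}
```

For example, a video with three recorded samples out of four has 75% validity and is retained, while one with one recorded sample out of four (25%) is excluded.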
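The overlap correction when combining the two eye AOIs can be illustrated with a short sketch. This assumes AOIs are axis-aligned rectangles given as (x, y, width, height), as the cascade object detector returns; counting a gaze sample once even when it falls inside both eye rectangles (a logical OR over the two AOIs) is one straightforward way to realize the correction described in the text, and the function names here are hypothetical.

```python
def in_rect(px, py, rect):
    """True if the gaze point (px, py) lies inside the (x, y, w, h) rectangle."""
    x, y, w, h = rect
    return x <= px < x + w and y <= py < y + h

def in_eye_region(px, py, left_eye, right_eye):
    """A gaze sample belongs to the combined eye-region AOI if it falls in
    either eye AOI; membership in both (rectangle overlap) counts once."""
    return in_rect(px, py, left_eye) or in_rect(px, py, right_eye)
```

A sample landing in the overlap of two adjacent eye rectangles therefore contributes a single hit to the eye region rather than being double-counted.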