Testing how movement affects the tracking algorithm
Thus far, small head movements during eye tracking have been assumed to have little effect on the tracking algorithm. Here, we investigate this assumption in a controlled manner. Applying fixed translational motion of the camera relative to the eye should give insight into the robustness of the algorithm. The insights obtained here are also relevant to the head-mounted system, where movement is exacerbated. In addition, several other factors, including the collimation of light and gamma correction, will be compared.
Plan
Part I: Joypad control
- Translate the stage using the existing mechanism
  - Joypad controller
  - X, Y, and Z directions
- Do offline tracking of the artificial eye at a static position
- Compare tracking between each subsequent motion of the stage (see the sketch after this list), namely:
  - Is there a difference in the shape of the Purkinje images?
  - Is there a difference in the tracked position of the eye?
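As a rough illustration of the last comparison step, the sketch below detects the brightest blob in each recorded frame as a stand-in for the first Purkinje image and reports its centroid and circularity per stage position. Python with OpenCV and NumPy is assumed, since the plan does not name an analysis stack; the file names and the purkinje_stats helper are hypothetical.

```python
import cv2
import numpy as np

def purkinje_stats(frame_gray):
    """Return (centroid, circularity) of the brightest blob in the frame,
    used here as a stand-in for the first Purkinje image."""
    # Otsu thresholding isolates the bright corneal reflection.
    _, mask = cv2.threshold(frame_gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blob = max(contours, key=cv2.contourArea)  # largest bright region
    area = cv2.contourArea(blob)
    perimeter = cv2.arcLength(blob, True)
    circularity = 4 * np.pi * area / perimeter ** 2  # 1.0 = perfect circle
    m = cv2.moments(blob)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    return (cx, cy), circularity

# Hypothetical frames recorded at three stage positions.
for path in ["pos_x0.png", "pos_x1.png", "pos_x2.png"]:
    frame = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    (cx, cy), circ = purkinje_stats(frame)
    print(f"{path}: centroid=({cx:.1f}, {cy:.1f}), circularity={circ:.3f}")
```

Comparing the printed centroids and circularities across positions then directly answers the two questions above.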
Part II: Fixed Route
- Create a preprogrammed motion in Arduino in 3-space
- Create voxels of camera positions with respect to the eye (see the sketch after this list)
- Repeat steps 2 and 3 from Part I
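One way to realize the voxel grid is to precompute the waypoints on the host and stream them to the Arduino over serial. The sketch below (Python; the grid extent, step size, and the MOVE command format are all assumptions, not the actual firmware protocol) generates the voxel offsets, with the serial write left commented out.

```python
import itertools

# Hypothetical grid: +/- 2 mm around the nominal camera position,
# sampled every 1 mm along X, Y, and Z.
STEP_MM = 1.0
EXTENT_MM = 2.0
offsets = [i * STEP_MM for i in range(int(-EXTENT_MM / STEP_MM),
                                      int(EXTENT_MM / STEP_MM) + 1)]

# Each (x, y, z) triple is one voxel of camera position relative to the eye.
waypoints = list(itertools.product(offsets, repeat=3))

for x, y, z in waypoints:
    cmd = f"MOVE {x:.1f} {y:.1f} {z:.1f}\n"  # hypothetical firmware command
    # serial_port.write(cmd.encode())  # requires pyserial and the real protocol
    print(cmd.strip())
```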
Part III: Collimation of light
- Repeat Parts I and II using uncollimated light
Part IV: Gamma correction
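For reference, gamma correction remaps normalized pixel intensities as V_out = V_in^γ, which changes the relative brightness of the Purkinje reflections against the background. A minimal sketch of applying such a correction to a frame before tracking is shown below; the gamma_correct name and the γ value are illustrative assumptions.

```python
import numpy as np

def gamma_correct(frame, gamma):
    """Apply gamma correction to an 8-bit grayscale frame.
    Values are normalized to [0, 1], raised to gamma,
    and rescaled back to [0, 255]."""
    normalized = frame.astype(np.float32) / 255.0
    corrected = np.power(normalized, gamma)
    return (corrected * 255.0).astype(np.uint8)

# Example: re-run tracking on corrected frames and compare.
# gamma = 0.8 is an arbitrary illustrative value.
# corrected_frame = gamma_correct(frame, gamma=0.8)
```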
Evaluation
Changes in the relative position between the camera and the eye (during fixation) should not affect the tracking reported by the current algorithm. Thus, the following criteria should hold during translational motion of the camera:
- The shapes of the Purkinje images remain consistently round across positions
- The location tracked by the algorithm remains static across positions (see the check sketched below)
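The second criterion can be checked quantitatively from the spread of the tracked locations across stage positions. A minimal sketch, assuming the centroids have already been extracted, with an arbitrary 1-pixel tolerance:

```python
import numpy as np

# Hypothetical tracked centroids, one (x, y) pair per stage position.
tracked = np.array([(312.4, 240.1), (312.6, 240.0), (312.5, 239.9)])

# The tracked location should be static across positions: a per-axis
# standard deviation below the tolerance counts as "static".
TOLERANCE_PX = 1.0  # arbitrary choice for illustration
spread = tracked.std(axis=0)
print("per-axis std (px):", spread)
print("static:", bool(np.all(spread < TOLERANCE_PX)))
```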