EyeDraw: Enabling Children with Severe Motor Impairments to Draw with Their Eyes
Anthony J. Hornof
Computer and Information Science,
University of Oregon,
Eugene, OR 97403 USA
Anna Cavender
Computer Science and Engineering,
University of Washington,
Seattle, WA 98195 USA
Summary:
Introduction:
EyeDraw is an application that enables children with severe motor disabilities to draw pictures with their eyes. It runs on a computer equipped with an eye-tracking device.
To draw a line, the start and end points are identified from successive gaze positions, and the line is then created automatically by connecting the two points. The Midas touch problem is avoided by using dwell time accompanied by visual feedback in the form of a change in the shape of the cursor.
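The paper does not publish its dwell-detection code, but the idea of turning "successive gaze positions" into a deliberate selection point can be sketched as follows. The function name, tolerance radius, and sample format here are assumptions for illustration, not the authors' implementation:

```python
import math

def detect_dwell(samples, radius_px=30.0, dwell_ms=500):
    """Return the index where a dwell begins, or None.

    `samples` is a list of (timestamp_ms, x, y) gaze points. A dwell is
    registered when the gaze stays within `radius_px` of a starting
    sample for at least `dwell_ms`, distinguishing a deliberate
    fixation from ordinary looking (the Midas touch problem).
    """
    for i, (t0, x0, y0) in enumerate(samples):
        for t, x, y in samples[i:]:
            if math.hypot(x - x0, y - y0) > radius_px:
                break  # gaze left the tolerance circle; try the next start
            if t - t0 >= dwell_ms:
                return i  # gaze held still long enough: treat as a selection
    return None
```

A real system would run this incrementally over a streaming buffer rather than over a complete list, but the windowing logic is the same.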
Drawing Commands:
Alternation between viewing and drawing is controlled by dwell time. The cursor changes from green (viewing) to red (drawing). The dwell-time threshold is adjustable with experience but starts at half a second. A drawing command can be cancelled by extending the dwell for another half second.
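The viewing/drawing toggle described above amounts to a small state machine driven by dwell events. A minimal sketch, assuming a hypothetical API (the class and method names are not from the paper):

```python
from enum import Enum

class Cursor(Enum):
    VIEWING = "green"
    DRAWING = "red"

class EyeDrawState:
    """Hypothetical sketch of the dwell-driven mode toggle; the paper does
    not give its implementation, so this design is an assumption."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms     # adjustable threshold, 500 ms to start
        self.state = Cursor.VIEWING
        self.start = None

    def on_dwell(self, x, y):
        """Handle one completed dwell; return a finished line or None."""
        if self.state is Cursor.VIEWING:
            self.state = Cursor.DRAWING  # cursor turns red; anchor start point
            self.start = (x, y)
            return None
        if (x, y) == self.start:
            # Gaze never moved: the extended dwell cancels the command.
            self.state = Cursor.VIEWING
            self.start = None
            return None
        line = (self.start, (x, y))      # second dwell elsewhere ends the line
        self.state = Cursor.VIEWING
        self.start = None
        return line
```

One dwell enters drawing mode, a dwell at a new location commits the line, and an extended dwell in place cancels, mirroring the behavior described in the summary.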
Version 1:
This is a minimal-control version with the following tools:
Line drawing,
Circle drawing,
Undo Button,
Grid to assist with dwell stabilization,
Save and open drawings.
Evaluation of Version 1:
Participants without disabilities were recruited first. After calibration of the eye tracker and a short familiarization with the controls, the participants were asked to make some drawings.
500ms was found to be the preferred dwell time.
The easiest task was clicking buttons.
The hardest tasks were drawing and controlling the drawing.
The grid was found to be useful.
Overall the task required a lot of focused attention.
A second evaluation was performed with impaired users, which led to the second version of the program.
Version 2:
This version of the program responded to the needs of the impaired users and contained the following additions:
An image of what the camera sees, so the user can stay in an optimal position while drawing.
More user-defined settings, such as dwell time.
Audio feedback on the current state of the cursor while drawing.
Rectangle and polygon tools were added.
Colors.
On/off switch for the eye tracking.
Evaluation of Version 2:
As before, the evaluation was divided into two groups. Both found the rich palette of tools engaging, with the consequence that an extended period of familiarization was required. The next version will have a feature that reveals tools one at a time, enabling first-time users to start drawing without being overwhelmed by the apparent complexity of the application.
Discussion:
This is a very interesting article. The evaluation of the first two versions of the application is very thorough in its descriptions of each individual's reaction to the system. Some information about the implementation of the eye tracking would have been of interest. One of the users is reported to have written text at the bottom of their picture; again, some information on how the keyboard was represented and controlled would have been of interest. Very little is mentioned about participants' reactions to using eye movements deliberately; this is only touched upon in the note that a lot of focused attention was required.
It was a good paper with plenty of results. I liked how they presented one application and then improved it based on user recommendations.
I agree that some more details of the implementation would be nice, though this paper is encouraging since it accomplished something seemingly useful using eye tracking.