UI&us is about User Interface Design, User Experience design and the cognitive psychology behind design in general. It's written by Keith Lang, co-founder of Skitch; now a part of Evernote.  His views and opinions are his own and do not represent in any way the views or opinions of any company. 

Thursday, August 7, 2008

Eye-Tracking in Future Interfaces

[Image: computerized eyes]

The video below shows Martin Tall, a researcher at Lund University, demonstrating his eye-tracking operating system. Eye-tracking is not new, but it is still largely confined to research due to its cost and complexity. I recently spoke with Dr Peter Brawn, who runs one of Australia's top eye-tracking businesses, www.eyetracker.com.au. Some of the users of eye-tracking technology include:

  • Website engineers who want to know what features and ads are being seen

  • Large physical stores where shelf space is paid for by breakfast cereal makers, and the like

  • Computer Interface Designers

  • Disabled people who have difficulty or inability to use a mouse or keyboard

  • General researchers into human vision




The technology used in this research costs tens of thousands of dollars, and can track the eye to a resolution of 1-1.5 centimetres on a computer screen in normal PC usage. It works by shining small infrared lights (invisible to us) near the computer screen, and tracking the resulting glints in our eyes with specialised cameras. Software then works out the relative position of the eyes and computes the point where we are focussed. However, it is possible to do eye- and feature-tracking with off-the-shelf components, at the expense of accuracy.
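The gaze-estimation step above can be sketched in a few lines of code. This is a hypothetical, simplified illustration (not how any particular commercial tracker works): it assumes the camera software has already extracted the pupil-to-glint offset from each frame, and fits a simple linear map from that offset to screen coordinates using a handful of calibration points.

```python
import numpy as np

def fit_calibration(offsets, screen_points):
    """Least-squares affine fit: screen = [dx, dy, 1] @ A.
    offsets: pupil-glint vectors recorded while the user looked at known targets."""
    X = np.hstack([np.asarray(offsets, float), np.ones((len(offsets), 1))])
    A, *_ = np.linalg.lstsq(X, np.asarray(screen_points, float), rcond=None)
    return A  # 3x2 matrix mapping one offset to an (x, y) screen point

def gaze_point(A, offset):
    """Estimate where on screen the user is looking, given one offset."""
    dx, dy = offset
    return np.array([dx, dy, 1.0]) @ A

# Calibration: the user fixates four known targets while offsets are recorded.
offsets = [(-0.02, -0.01), (0.02, -0.01), (-0.02, 0.01), (0.02, 0.01)]
targets = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
A = fit_calibration(offsets, targets)
print(gaze_point(A, (0.0, 0.0)))  # a centred offset maps to the screen centre
```

In practice the mapping is non-linear and varies per user and per head position, which is part of why research-grade systems are so expensive; a linear fit like this only holds near the calibration points.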



The above video demonstrates the Logitech Orbit, a consumer webcam which includes some fun face-tracking software, popular with video chat users. As always, consumer technology is on track to do what only research labs could do before. Our everyday point-and-shoot cameras now feature subtleties like smile detection. What could accurate eye-tracking mean for us UI designers? Ever started typing somewhere, only to discover your cursor was elsewhere? Knowing where you are looking would help the computer better understand where the text was supposed to go. Important but overlooked notifications could be subtly brought nearer your point of focus. And a personal dream of mine: no more would your screen go to sleep whilst you intently watch a video!
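That "typing into the wrong window" idea can be made concrete. Below is a hypothetical sketch, with all names invented and no real windowing API assumed, of routing a keystroke to the text field under the user's gaze, falling back to the keyboard-focused field when the gaze is elsewhere:

```python
from dataclasses import dataclass

@dataclass
class TextField:
    name: str
    x: int
    y: int
    w: int
    h: int
    contents: str = ""

def field_under_gaze(fields, gaze):
    """Return the field whose rectangle contains the gaze point, if any."""
    gx, gy = gaze
    for f in fields:
        if f.x <= gx < f.x + f.w and f.y <= gy < f.y + f.h:
            return f
    return None

def route_keystroke(fields, focused, gaze, char):
    """Deliver a keystroke to the looked-at field, else the focused one."""
    target = field_under_gaze(fields, gaze) or focused
    target.contents += char
    return target

search = TextField("search", 0, 0, 200, 30)   # has keyboard focus
chat = TextField("chat", 0, 500, 200, 30)     # where the user is looking
route_keystroke([search, chat], search, gaze=(50, 510), char="h")
print(chat.contents)  # the keystroke lands in the chat field, not the search box
```

A real implementation would need debouncing and a confidence threshold, since gaze jitters far more than a mouse pointer; you would not want every stray glance to steal keyboard focus.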
So what are the challenges to incorporating eye-tracking in mainstream computing?



  1. Cost. Improved HD cameras and perhaps additional IR hardware would be required

  2. Cost of CPU-time. Continuous monitoring of video and complex image processing takes significant processing power

  3. Power consumption. Battery life, and Green concerns

  4. Some outstanding technical issues like tracking eyes behind thick corrective lenses

  5. Privacy. Could you be spied upon if your computer always had a camera?

  6. Designing interactions for both eye-tracking-equipped computers and those without; only a percentage of computer users will have this technology. Can you design something that will not 'disable' your legacy users?



And the answers:



  1. The cost of miniature high-resolution cameras is continually driven down by the huge mobile phone market

  2. Upcoming technologies like Apple's OS X 'Grand Central' architecture and GPU-inspired Intel chips are highly geared towards efficient image processing at little cost to the responsiveness of the computer

  3. This is harder. Ongoing eye-tracking will cost some power to run the camera and the processing

  4. Tracking eyes through corrective lenses or thick-rimmed glasses is still an issue in even the best systems today

  5. Apple's answer to this is including an LED next to its iSight cameras, included on all new Macs. Eye-tracking may instead rely on a dedicated camera and a sand-boxed processing space

  6. This may be the toughest challenge of all, and perhaps is the reason why so little new hardware has gained ground in personal computers in the last 30 years?



I'd love to see other examples of integrating eye-tracking into the everyday computing experience for the average user. As always I look forward to your comments and directions to learn more about this intriguing field.

[ADDITION] Jobe in the comments brings up an interesting point about how far eye-tracking can be taken. Could you write using just eye-tracking? The answer is yes. The Google Tech Talk below illustrates 'Dasher', a system which lets you 'walk' the computer through a predictive text system using something like eye-tracking (though it could also be a breath-pressure tube, etc.)
 
