
Eye Tracking


Eye tracking (also known as Eye Gaze) technology is something that wouldn't have looked out of place in a science fiction movie a few years ago. Now fiction has become reality, though there are mixed views and reviews surrounding both its suitability and its uses.

 

Eye tracking hit the disability scene pretty much before any other scene, and to this day the technology is still trying to find a true identity outside of this market. There are stories of supermarkets looking at the technology to research which shelves people view within their stores, and the gaming industry is, of course, a huge potential market for eye tracking (limb movement on the Wii and Xbox Kinect is 'old hat' now).

 

However, the reason we're writing about eye tracking is that the one area it has established itself in is the electronic Assistive Technology market. This may still come as news to some people who support disabled people, but more and more people whose mobility is severely restricted are finding that eye tracking is helping to open up their world.

 

So, what’s it all about then?

 

Unlike many technologies out there, eye tracking actually describes the technology quite accurately. In essence, a device with two built-in cameras triangulates and tracks a person's pupils, and uses this movement to move the cursor on the computer screen. This essentially turns eye movement into mouse movement: when you look at something on the screen, the cursor moves to that location.
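
To make that mapping concrete, here's a minimal sketch in Python. It assumes a hypothetical tracker that reports gaze as normalised (0 to 1) screen coordinates; the names here are purely illustrative and don't come from any real vendor's API.

    # Minimal sketch: turning a normalised gaze point into a cursor position.
    SCREEN_WIDTH, SCREEN_HEIGHT = 1920, 1080

    def gaze_to_cursor(gaze_x, gaze_y):
        """Convert a normalised (0.0-1.0) gaze point into screen pixels."""
        px = int(gaze_x * SCREEN_WIDTH)
        py = int(gaze_y * SCREEN_HEIGHT)
        # Clamp, in case the tracker reports a point slightly off-screen.
        px = max(0, min(SCREEN_WIDTH - 1, px))
        py = max(0, min(SCREEN_HEIGHT - 1, py))
        return px, py

    # Looking at the centre of the screen moves the cursor there:
    print(gaze_to_cursor(0.5, 0.5))  # -> (960, 540)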

 

How does it work?

 

Taking the most popular Tobii eye gaze system and its included software as an example: to get the optimum results from the technology, the user first needs to calibrate the unit by gazing at dots on the screen in turn. The user can then use one of two methods to control the computer with their eyes, 'Gaze selection' or 'Mouse emulation'.
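
For those wondering what calibration actually does, here's a rough conceptual sketch in Python: fit a simple correction that maps the tracker's raw readings onto the known positions of the calibration dots. Real systems use far more sophisticated per-eye models; the simulated data and the affine fit below are purely illustrative.

    # Conceptual sketch of calibration: least-squares fit of an affine
    # correction from raw gaze readings to the known dot positions.
    import numpy as np

    # On-screen dot positions the user was asked to gaze at (pixels).
    dots = np.array([[100, 100], [960, 100], [1820, 100],
                     [100, 980], [960, 980], [1820, 980]], dtype=float)

    # Simulated raw readings: slightly scaled, offset and noisy (made-up data).
    rng = np.random.default_rng(0)
    raw = dots * 1.04 + [12.0, -8.0] + rng.normal(scale=5.0, size=dots.shape)

    # Solve [x, y, 1] @ coeffs ~= dots by least squares.
    X = np.hstack([raw, np.ones((len(raw), 1))])
    coeffs, *_ = np.linalg.lstsq(X, dots, rcond=None)

    def correct(x, y):
        """Apply the fitted calibration to a new raw gaze sample."""
        return np.array([x, y, 1.0]) @ coeffs

    print(correct(*raw[0]))  # lands close to the first dot, (100, 100)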

 

Gaze selection uses a toolbar that slides in at the side of the screen when the user looks off to that side. This allows the user to select a mouse action such as left-click, right-click, double-click, drag and so on. The user then looks at the part of the screen they want to click on or drag, and once they keep their gaze fixed there, the software zooms in on that part of the screen to enable them to be more precise. No cursor is shown on the screen.
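
The zoom step is what makes gaze selection forgiving: magnifying the target region shrinks the tracker's error relative to the content. A quick illustration in Python, with both figures assumed rather than taken from any specification:

    # Why zooming helps: the same gaze wobble covers less of the content.
    tracker_error_px = 40            # assumed practical on-screen gaze error
    zoom_factor = 4                  # assumed magnification of the region
    effective_error_px = tracker_error_px / zoom_factor
    print(effective_error_px)        # -> 10.0 pixels of the original content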

 

Mouse emulation uses a similar toolbar, though this one is permanently displayed on-screen. The action (clicking, dragging and so on) stays set to the last action selected, and the cursor follows the movement of the user's gaze. When the user's gaze is relatively stationary, a countdown starts to signify the amount of time before the action (clicking, double-clicking and so on) will take place. This technique is called dwell clicking. Dwell clicking isn't new: it was first introduced with devices such as head pointers, where a person could move a cursor but not press a button.
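
The dwell logic itself is simple enough to sketch in a few lines of Python. This assumes a hypothetical get_gaze() that returns the current gaze point in pixels and a click() that performs the selected action; the radius and timing values are illustrative, not taken from any product.

    # Minimal dwell-clicking sketch: if the gaze stays within a small
    # radius for long enough, trigger the action at that spot.
    import math
    import time

    DWELL_RADIUS_PX = 30   # how still the gaze must stay (assumed)
    DWELL_TIME_S = 1.0     # how long it must stay still (assumed)

    def dwell_click_loop(get_gaze, click):
        anchor = get_gaze()
        started = time.monotonic()
        while True:
            x, y = get_gaze()
            if math.dist((x, y), anchor) > DWELL_RADIUS_PX:
                # Gaze moved: restart the countdown from the new position.
                anchor, started = (x, y), time.monotonic()
            elif time.monotonic() - started >= DWELL_TIME_S:
                click(*anchor)                  # perform the selected action
                anchor, started = get_gaze(), time.monotonic()
            time.sleep(0.02)                    # sample ~50 times per second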

 

We will be publishing a video in the next few days that should help illustrate these two access methods.

 

Is it any good?

 

The bottom line is that it works well for some and not so well for others. I know this isn't an ideal answer, though it is the best we can give in a review. Some people's eyes just don't seem to be picked up as well as others', though as revisions of the technology are released, the accuracy and compatibility are improving.

 

Having said that, there are other factors that aren't down to the individual. Lighting can play a major part in the accuracy of eye tracking.

 

Direct, or even semi-direct, light shining on the camera can impair accuracy. Often, turning the camera (and the person) around by 180 degrees can dramatically improve performance.

 

Applications for eye tracking

 

Eye tracking can be used effectively for general computer use, but only with very good results from the initial calibration, as well as a consistent lighting environment, as I've just mentioned. Basically, the smaller the area you want to select with your eyes, the better the accuracy needs to be. So selecting items such as small buttons in Word can be quite tricky if the eye tracker is not picking your eyes up very well.
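
A bit of back-of-the-envelope arithmetic shows why. The figures below (tracker accuracy, viewing distance, screen size) are typical assumed values, not measurements of any particular device:

    # Why small targets are hard: angular error becomes on-screen distance.
    import math

    accuracy_deg = 0.5           # assumed tracker accuracy in degrees
    distance_mm = 600            # assumed eye-to-screen distance (60 cm)
    error_mm = math.tan(math.radians(accuracy_deg)) * distance_mm
    print(round(error_mm, 1))    # -> 5.2 mm of on-screen error

    # On a 24-inch 1920x1080 screen (~0.28 mm per pixel) that is roughly:
    print(round(error_mm / 0.28))  # -> 19 pixels, bigger than many small buttons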

 

Applications that require large areas to be selected are much more forgiving, as you don't have to be so precise. An example would be verbal communication (AAC), where there may be just a handful of relatively large buttons to select. Another scenario would be a literate user who wants to use eye tracking to write emails. Having a large on-screen keyboard to type with is going to be easier than a small one, especially for those who are borderline candidates for eye tracking.

 

Also, remember to take into account the issues around lighting. Using eye tracking while mobile will produce varying results, as the lighting levels and directions will change with your location. So a user could target quite easily in a room without light shining over their back into the tracker's camera, but then lose effectiveness when going outside on a sunny day.

 

Conclusion

 

It's all a case of looking at the bigger picture, I suppose. If you calibrate the eye tracker in optimum lighting conditions and it works well, then it may well be for you. However, there may still be an alternative technology that suits you better, and you will only know this if you have knowledge of, and access to, those alternatives.

 

If you want to use eye tracking in a multitude of locations where you can't control the lighting, then that too could be a 'deal-breaker'. It's definitely a viable solution for some, and as the technology improves, so will the number of people who benefit from it.
