[Image: Man in wheelchair using gaze technology]

GazeIT: Accessibility by Gaze Tracking

Gaze tracking provides fast, precise and reliable input for communication, mobility and smart home control. GazeIT works to make this technology available to people with motor disabilities.

Background & motivation

Low-cost gaze tracking technology is now built into computers and head-mounted displays. This opens up a range of uses where people who cannot control their arm and finger movements can make input simply by looking at objects they would like to engage with, or at keys on an on-screen keyboard they would like to type. For instance, gaze can be used to steer a wheelchair, or to control a telerobot that drives in the direction the person is looking.
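
As an illustration of the basic principle, the sketch below shows one common gaze-interaction technique, dwell-time selection, where a key or object is activated once the gaze has rested on it for a short period. The class name, thresholds and coordinates are illustrative assumptions, not part of GazeIT's own software.

```python
# Minimal sketch of dwell-time selection, a common gaze-interaction technique.
# All names and thresholds here are illustrative assumptions.

import math
import time


class DwellSelector:
    """Triggers a target when the gaze rests on it for dwell_time seconds."""

    def __init__(self, dwell_time=0.8, radius=40):
        self.dwell_time = dwell_time   # seconds the gaze must stay on a target
        self.radius = radius           # pixel radius counted as "on target"
        self._current = None           # target currently being fixated
        self._started = None           # time the current fixation began

    def update(self, gaze_x, gaze_y, targets, now=None):
        """targets: dict of name -> (x, y) centre. Returns a name on selection."""
        now = time.monotonic() if now is None else now
        hit = None
        for name, (tx, ty) in targets.items():
            if math.hypot(gaze_x - tx, gaze_y - ty) <= self.radius:
                hit = name
                break
        if hit != self._current:            # gaze moved to a new target (or away)
            self._current, self._started = hit, now
            return None
        if hit is not None and now - self._started >= self.dwell_time:
            self._started = now             # reset so a key can be repeated
            return hit
        return None


# Example: gaze samples at 20 Hz hovering over an on-screen key labelled "A".
selector = DwellSelector()
keys = {"A": (100, 200), "B": (260, 200)}
for i in range(40):
    key = selector.update(104, 197, keys, now=i * 0.05)
    if key:
        print("typed:", key)       # fires after 0.8 s of steady gaze on "A"
        break
```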


Project Objectives

The objective of the project is to invent new interaction methods that take advantage of gaze input – as a mono-modal input or in combination with e.g. voice commands, head movements or hand gestures. These interaction methods address the daily needs of people with motor challenges, e.g. cerebral palsy, repetitive strain injury or locked-in conditions in late-stage ALS. The main focus will be on:

  • Type-to-speak communication, striving to increase typing speed above 30 words per minute, which would make it possible to take part in fluent conversations. 
  • Control of wheelchairs with gaze. The wish for independent mobility is strong among people who cannot walk, but gaze interaction with wheelchairs is still an open research area. We have built a wheelchair simulator that makes it possible to test new gaze interaction principles safely (a simple gaze-to-steering sketch follows this list).
  • Telerobot control, a new possibility for immobile people to take part in a video conference, for instance moving around in their home while lying in a hospital bed. Gaze control of telerobots is a challenging task, however, and research is needed to make it easy for people to learn. 
  • Control of smart home devices. Gaze tracking makes it possible to e.g. open a door with a gaze gesture, turn on a light that is looked at, or change TV channels. A start-up company was launched by DTU students associated with researchers from the GazeIT group; this company has since been acquired by JABRA/GN Nord.
  • Interaction with virtual realities by gaze. With gaze tracking now being integrated into VR glasses, it is relevant to study how people with disabilities can take advantage of this.
  • Finally, the GazeIT group is involved in the Horizon 2020 project Rehyb (rehyb.eu), exploring gaze interaction and comfort monitoring for stroke patients who use exoskeletons in rehabilitation exercises.
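
As a rough illustration of the wheelchair and telerobot items above, the sketch below maps a gaze point on a control screen to driving commands: looking above the screen centre drives forward, looking to the sides turns, and a central dead zone keeps the device still. The screen layout, dead zone and speed limits are assumptions made for this example and do not describe GazeIT's actual control scheme.

```python
# Illustrative mapping from a gaze point to wheelchair/telerobot velocities.
# Parameters (screen size, dead zone, speed limits) are assumptions for this sketch.

def gaze_to_velocity(gaze_x, gaze_y, screen_w=1920, screen_h=1080,
                     max_linear=0.5, max_angular=1.0, dead_zone=0.15):
    """Map a gaze point in pixels to (linear m/s, angular rad/s).

    Looking above the screen centre drives forward, below drives backward,
    and left/right of centre turns. A central dead zone keeps the device
    still while the user is just looking around.
    """
    # Normalise to [-1, 1] with (0, 0) at the screen centre, y pointing up.
    nx = (gaze_x / screen_w) * 2.0 - 1.0
    ny = 1.0 - (gaze_y / screen_h) * 2.0

    # Ignore small offsets so ordinary glances do not move the device.
    nx = 0.0 if abs(nx) < dead_zone else nx
    ny = 0.0 if abs(ny) < dead_zone else ny

    linear = max(-max_linear, min(max_linear, ny * max_linear))
    angular = max(-max_angular, min(max_angular, -nx * max_angular))
    return linear, angular


# Example: gazing at the upper-right part of the screen -> forward while turning right.
print(gaze_to_velocity(1600, 200))
```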


Funding

This project has been funded by the Danish BEVICA Foundation.



Contact

John Paulin Hansen
Professor
DTU Management
+45 45 25 48 52
https://www.cachet.dk/research/research_projects/gaze-it