
Patient Training for Gaze-Controlled Telepresence

For individuals with motor disabilities, gaze control can be built into a telepresence robot that provides a feeling of presence, promoting their participation in social interactions and enhancing their communication quality.

Background

Over 190 million people worldwide live with severe disabilities [1], which limit their ability to interact seamlessly with everyday devices and to engage in social communication and activities.
Gaze interaction is a common control mode for individuals with movement disorders and reduced body control. Eye-tracking components can be built into computers and mobile devices, opening up a range of uses for affected individuals, who can interact with such devices simply by looking at the elements they wish to engage with. Prior work has examined the accuracy and precision of remote gaze tracking in a bed scenario and has shown that eye-tracking quality in this setting can sufficiently support gaze interaction [2]. Robotic telepresence systems promote social interaction among geographically dispersed people, allowing those with limited mobility to participate independently in social activities [3]. Gaze-controlled robotic telepresence systems [4] combine these technologies. However, few studies have implemented gaze interaction in a telepresence robot, and it remains unclear how gaze interaction within these systems affects users and how the systems can be improved.
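A common mechanism behind "interacting simply by looking" is dwell-time selection: a target is activated when the gaze point rests on it for a continuous period. The following is a minimal sketch of that idea; the function name, sample format, and the 0.8-second dwell threshold are illustrative assumptions, not details of the project's actual system.

```python
DWELL_TIME = 0.8  # seconds of continuous fixation required (assumed value)

def dwell_select(samples, target, dwell_time=DWELL_TIME):
    """Return the timestamp at which the target is activated, or None.

    samples: list of (timestamp, x, y) gaze samples, sorted by time.
    target:  (x_min, y_min, x_max, y_max) screen rectangle.
    """
    x0, y0, x1, y1 = target
    start = None  # time when the gaze first entered the target
    for t, x, y in samples:
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if inside:
            if start is None:
                start = t
            elif t - start >= dwell_time:
                return t  # dwell threshold reached: activate target
        else:
            start = None  # gaze left the target: reset the timer
    return None
```

In a gaze-controlled telepresence setting, such a selection event could, for example, trigger a driving command or a communication shortcut; the dwell threshold trades off speed against accidental activations (the "Midas touch" problem).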

Objectives

To address the issues outlined above, this project [4] investigates the potential benefits and challenges of these systems for people with motor disabilities. Furthermore, the impact of gaze-control training within a simulated environment will be explored in order to improve the target users' communication quality.

The project is rooted in the GazeIT project. The final goal is to make these systems simple, compact, non-obtrusive, and comfortable, helping people with motor disabilities overcome daily interaction problems. Additionally, this mobile robotic telepresence system should be as accessible as possible for people who can move only their eyes.

The research envisions two phases:

  1. identify the potential benefits and challenges of gaze-controlled telepresence systems;
  2. investigate the impact of eye-gaze control training in simulation-based environments (e.g. hardware-in-the-loop and virtual reality environments) on gaze-controlled telepresence for the target users.

References

  1. WHO. (2011). World report on disability. Geneva: WHO.
  2. Hansen, J. P., Agustin, J. S., & Skovsgaard, H. (2011). Gaze interaction from bed. In Proceedings of the 1st Conference on Novel Gaze-Controlled Applications (NGCA '11). ACM.
  3. Kristoffersson, A., Coradeschi, S., & Loutfi, A. (2013). A review of mobile robotic telepresence. Advances in Human-Computer Interaction, 2013, 3.
  4. Hansen, J. P., Alapetite, A., Thomsen, M., Wang, Z., Minakata, K., & Zhang, G. (2018). Head and gaze control of a telepresence robot with an HMD. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (Article 82). ACM.

Funding

This project is funded by the China Scholarship Council and the Danish BEVICA Foundation.

Contact

Guangtao Zhang
PhD student
DTU Management
+45 45 25 45 56

John Paulin Hansen
Professor, Group Leader
DTU Management
+45 45 25 48 52

Jakob Eyvind Bardram
Head of Section, Professor
DTU Health Tech
+45 45 25 53 11
http://www.cachet.dk/research/phd-projects/gaze-controlled-telepresence
23 JULY 2019