Wednesday, 16 December
Interacting in Mixed Reality with a Head-Mounted Display: Application to Augmented Surgery

Abstract:

This thesis is in the field of Human-Computer Interaction (HCI) and specifically focuses on user interaction with Mixed Reality systems that rely on Head-Mounted Displays. Mixed Reality (MR) makes it possible to create mixed environments that combine virtual content with physical content. The entire workspace, including both physical and virtual objects, can then be used to interact with the system. Unlike traditional graphical interfaces, MR retains the advantages of the physical environment while providing the flexibility of virtual elements. The fields of application are many and varied, ranging from industrial manufacturing to medicine.

Surgery is an application context in which MR is particularly promising. The ability to view medical data directly on the patient's body and around the operating table is a crucial benefit for surgeons. However, existing techniques for interacting with this virtual information are not suited to the many constraints of the operating room. New approaches are needed to enable surgeons to view, manipulate, and edit virtual content during an operation.

Our research is driven by this need for MR interaction techniques for augmented surgery. Thanks to our partnership with the company Aesculap for this CIFRE thesis, our work is dedicated to knee prosthesis operations. We focus on the use of Head-Mounted Displays (HMDs), with which the augmented field of view is directly tied to the position of the head.

We start by providing a design space for MR head-based interaction. We explore this design space by devising new interaction techniques with menus. These techniques involve the entire reality-virtuality continuum by considering different physicalities for targets and implementing transitions between mixed reality and virtual reality. Our goal is to satisfy the constraints of surgery by taking advantage of the objects already present in the operating room.

After exploring head-based techniques, we broaden our area of research by considering surgical stages where surgeons have at least one hand available to interact with the system. Our contributions on this aspect are twofold. We first continue our study of menu techniques by considering a key aspect of the surgeon's work: visual attention to the operating field. We thus propose three new techniques allowing surgeons to bring virtual widgets back into their field of view without having to look away from the operating field. We then study the exploration of MR 3D scenes with multiple views. For instance, multiple views of the knee are displayed during the surgical planning step. The originality of this work lies in the differing natures of the multiple views: virtual views and mixed views. We propose two new complementary interaction techniques that allow users to control the level of coupling between these different views and implement transitions between virtual reality and mixed reality.
 

Keywords

Human-Computer Interaction, Mixed Reality, Augmented Reality, Augmented surgery, Menus.
Updated on 20 December 2020