Gijs Luijten holds a Master’s degree in Technical Medicine, specializing in Medical Imaging and Intervention, from the University of Twente, where he focused on Augmented Reality (AR) applications for oral and maxillofacial (OMF) and free-flap surgery. At the Institute for Artificial Intelligence in Medicine (IKIM), Gijs conducts research with University Hospital Essen (UKE) aimed at improving (surgical) diagnosis and intervention using state-of-the-art 3D technologies, particularly Virtual and Augmented Reality (VR/AR). As a visiting scientist, Gijs collaborated closely with Graz University of Technology (Institute of Computer Graphics and Vision) and the Medical University of Graz to investigate the effectiveness and usability of AR tools in real clinical settings and to explore promising combinations of machine learning (ML) and AR. Recently, Gijs has explored diminished reality on synthetic surgical data, created an extensive database and API of 3D medical shapes for machine learning, investigated eye tracking to improve diagnosis, and more. Currently, Gijs is co-managing the XR-Lab at IKIM while working on a combination of ultrasound, machine learning, and augmented reality. A strong desire for continuous learning and improvement in related fields, including machine learning, workflow planning, modeling, and programming, is what drives Gijs. His work aims to empower researchers, improve patient care, and inspire clinicians, patients, and companies to push the boundaries of what is currently possible.
XR Projects in Medicine at the smart-XR Lab of the AI-Guided Therapies Group
The smart-XR Lab of the Artificial Intelligence Guided Therapies (AIT) group within the Institute for Artificial Intelligence in Medicine (IKIM) is located at University Hospital Essen (AöR). Our team develops XR applications to support researchers, patients, and clinicians.
After an overview of the hospital, the institute, and the XR Lab, we will present past and ongoing projects. For education, we developed MultiAR, a multi-user, cross-device platform for collaborative anatomy education, and recently received a grant to further develop virtual reality for educational purposes. For diagnostics, we are recording eye-tracking data during the HINTS exam in clinical settings. For patients, we are building an application that helps them relax during chemotherapy. For clinicians and medical students, we stream ultrasound images to multiple headsets and incorporate automated length and width measurements. In surgery, we are working on several markerless registration projects for CT-to-patient superimposition, using both standard and custom algorithms. With the recently acquired Apple Vision Pro and the Siemens Cinematic Reality application, we are investigating their added value for surgical planning. To support researchers, we created a 3D surgical instrument collection used to generate synthetic scenes for object detection, segmentation, and diminished reality. We have also developed several facial and skull datasets based on CT scans, which support the training of algorithms for patient-specific implants. All these datasets, along with those of our collaboration partners, have been integrated into MedShapeNet, a database accessible via an API. We hope MedShapeNet will impact medical computer vision research as ShapeNet did for general computer vision. We look forward to sharing our projects for researchers, patients, and clinicians and to collaborating on translating our research into meaningful applications.
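To give a flavor of the automated ultrasound measurements mentioned above, here is a minimal sketch (not our actual pipeline) that derives length and width from a binary segmentation mask via OpenCV's minimum-area rectangle; the mask source and the pixel spacing are assumptions for illustration.

```python
import cv2
import numpy as np

def length_width_mm(mask: np.ndarray, px_spacing_mm: float) -> tuple[float, float]:
    """Estimate length/width (in mm) of the largest segmented structure
    in a binary ultrasound mask via a minimum-area bounding rectangle."""
    contours, _ = cv2.findContours(
        mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE
    )
    if not contours:
        raise ValueError("mask contains no segmented structure")
    largest = max(contours, key=cv2.contourArea)
    (_, _), (w_px, h_px), _ = cv2.minAreaRect(largest)  # rotated bounding box
    length_px, width_px = max(w_px, h_px), min(w_px, h_px)
    return length_px * px_spacing_mm, width_px * px_spacing_mm

# Toy example: a synthetic elliptical "lesion" with 0.2 mm isotropic pixels.
mask = np.zeros((200, 200), np.uint8)
cv2.ellipse(mask, (100, 100), (60, 25), 30, 0, 360, 255, -1)
print(length_width_mm(mask, px_spacing_mm=0.2))  # roughly (24.0, 10.0)
```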
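Among the standard algorithms for markerless CT-to-patient registration, a common baseline is Iterative Closest Point (ICP), which aligns a CT-derived surface with a point cloud captured, for example, by a headset's depth sensor. The sketch below uses Open3D and assumed placeholder input files; it is a generic illustration, not the lab's registration code.

```python
import numpy as np
import open3d as o3d

# Assumed inputs: a skin surface segmented from CT, and a point cloud of the
# patient captured by a depth sensor (both file names are placeholders).
ct_mesh = o3d.io.read_triangle_mesh("ct_skin_surface.ply")
ct_cloud = ct_mesh.sample_points_uniformly(number_of_points=50_000)
patient_cloud = o3d.io.read_point_cloud("depth_sensor_scan.ply")

# Point-to-plane ICP requires normals on the target cloud.
patient_cloud.estimate_normals(
    o3d.geometry.KDTreeSearchParamHybrid(radius=5.0, max_nn=30)
)

# Refine a coarse initial guess (identity here, for brevity) with ICP;
# distances are in the scan's units, e.g. millimetres.
result = o3d.pipelines.registration.registration_icp(
    ct_cloud,
    patient_cloud,
    max_correspondence_distance=10.0,
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane(),
)
print("fitness:", result.fitness)            # fraction of matched points
print("CT-to-patient transform:\n", result.transformation)
```

In practice a coarse initial alignment (e.g. from detected landmarks or a global registration step) would replace the identity matrix, since ICP only converges locally.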
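MedShapeNet is accessible programmatically; the exact endpoints are beyond the scope of this announcement, so the snippet below only illustrates the general pattern of fetching a single shape over HTTP. The base URL and file name are hypothetical placeholders, not the real API; see the MedShapeNet project pages for actual access.

```python
import requests

# Hypothetical example of downloading one shape from MedShapeNet over HTTP.
BASE_URL = "https://medshapenet.example.org/api/shapes"  # placeholder, not the real endpoint
shape_id = "skull_0001.stl"                              # placeholder file name

response = requests.get(f"{BASE_URL}/{shape_id}", timeout=30)
response.raise_for_status()  # fail loudly on HTTP errors

with open(shape_id, "wb") as f:
    f.write(response.content)
print(f"saved {shape_id} ({len(response.content)} bytes)")
```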
The talk takes place on Thursday, April 3, 2025, at 14:00 in E105 and will be streamed live.