A mixed reality system that allows medical practitioners to view and interact with virtual replicas of patients’ organs, bones and other body parts is being developed by academics.
Researchers at Birmingham City University’s Digital Media Technology Lab (DMT Lab) are devising the system, which enables users to interact with virtual models and patient data using freehand movements.
The system allows users to manipulate, navigate and present patient data using hand motions and gestures, so that practitioners can demonstrate medical procedures, the effects of lifestyle choices and treatment outcomes using customised 3D virtual models and patients’ real medical records.
It could be used to visually demonstrate medical problems, the areas where surgery will be conducted, the improvements that could follow treatment, or the damage caused by harmful addictive substances such as tobacco.
The technology combines motion-detecting sensors with the DMT Lab’s expertise in freehand interaction in mixed reality to create a more realistic experience in virtual environments and bridge the gap between users and technology.
Dr. Ian Williams says: “We are developing this system as a platform to allow medical professionals to interact with genuine patient data and manipulate it by hand to educate and inform patients.
“The real advantages this brings are being able to visually demonstrate parts of the anatomy, using virtual models which can be customised for each patient and show how they have been impacted by lifestyle choices or how they may be changed following treatments or surgery.”
In the future, the system will be upgraded to replicate injuries, mobility problems or illnesses, and to show changes that could be made through lifestyle choices or medical procedures.
It could also help practitioners by creating a new way to view patient data in an array of settings.
Medical practitioners would be able to showcase medical procedures and treatment effects on customised medical models.
Surgeons would also be able to view and manipulate images of patients’ bodies during procedures in sterilised environments, without needing to remove their scrubs and gloves.
The use of customised models, and an interactive environment that can be shared with the patient, can help boost patients’ engagement with their treatment and their understanding of it.