From Academia: Mixed Reality, Augmented Reality, and 3D Printing in Healthcare

Want to write a piece for 3DHEALS Expert Corner? Email us: info@3dheals.com

In this summary blog, we review nine recent publications combining mixed reality, augmented reality, virtual reality, and 3D printing in healthcare. Stay tuned for a more in-depth discussion of the technical and clinical insights in an upcoming Expert Corner blog.

Mixed Reality Combined With Three-Dimensional Printing Technology in Total Hip Arthroplasty: An Updated Review With a Preliminary Case Presentation (Peng-Fei Lei, et al) Orthop Surg, 11 (5), 914-920 Oct 2019

Three-dimensional (3D) printing technology, virtual reality, and augmented reality technology have been used to help surgeons complete complex total hip arthroplasty, while their respective shortcomings limit their further application. With the development of technology, mixed reality (MR) technology has been applied to improve the success rate of complicated hip arthroplasty because of its unique advantages. We present a case of a 59-year-old man with an intertrochanteric fracture in the left femur, who had received a prior left hip fusion. After admission to our hospital, a left total hip arthroplasty was performed on the patient using a combination of MR technology and 3D printing technology. Before surgery, 3D reconstruction of a certain bony landmark exposed in the surgical area was first performed. Then a veneer part was designed according to the bony landmark and connected to a reference registration landmark outside the body through a connecting rod. After that, the series of parts was made into a holistic reference registration instrument using 3D printing technology, and the patient’s data for bone and surrounding tissue, along with digital 3D information of the reference registration instrument, were imported into the head-mounted display (HMD). During the operation, the disinfected reference registration instrument was installed on the selected bony landmark, and the automatic real-time registration was then realized by the HMD through recognizing the registration landmark on the reference registration instrument, whereby the patient’s virtual bone and other anatomical structures were quickly and accurately superimposed on the real body of the patient. To the best of our knowledge, this is the first report to use MR combined with 3D printing technology in total hip arthroplasty.

(Copyright to the referenced article) Preoperative operative design for the patient: (A) The designed osteotomy plane (cyan part) of the femur neck was very close to the sciatic nerve (yellow part) and the femoral vessels (red part). (B) The three-dimensional (3D) fracture digital model is presented. (C) The location and size of the designed acetabular cup (cyan part) are presented. (D) The selected bony landmark (orange part) is presented. (E) The personalized veneer part simulation. (F) The reference registration instrument simulation. (G) The reference registration instrument (blue) was installed on the selected bony landmark. (H) Sterilized 3D printed model. (I) The 3D image was displayed using a head-mounted display (HMD).
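
For readers curious about what the "automatic real-time registration" step involves mathematically, the core operation is a rigid alignment of the virtual model to the tracked landmark points. The sketch below is our own illustration of point-based rigid registration (a Kabsch/SVD fit), not code from the paper; every variable name and data value in it is hypothetical.

```python
# Illustrative only: least-squares rigid registration (Kabsch algorithm) of
# virtual landmark points onto their tracked physical counterparts.
import numpy as np

def rigid_registration(model_pts, world_pts):
    """Return rotation R and translation t that map model_pts onto world_pts."""
    c_m, c_w = model_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (model_pts - c_m).T @ (world_pts - c_w)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # correct an accidental reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_w - R @ c_m
    return R, t

# Hypothetical landmark coordinates on the 3D printed reference instrument (mm)
model_landmarks = np.array([[0.0, 0.0, 0.0], [30.0, 0.0, 0.0],
                            [0.0, 25.0, 0.0], [0.0, 0.0, 40.0]])
# Hypothetical positions of the same landmarks as recognized by the headset
rot_90z = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
tracked_landmarks = model_landmarks @ rot_90z.T + np.array([10.0, 5.0, 2.0])

R, t = rigid_registration(model_landmarks, tracked_landmarks)
# Applying R and t to the whole virtual bone model overlays it on the patient.
aligned = model_landmarks @ R.T + t
```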

A Review of Simulation Applications in Temporal Bone Surgery (Tanisha S Kashikar, et al) Laryngoscope Investig Otolaryngol, 4 (4), 420-424 2019 Jun 7

Background: Temporal bone surgery is a technically challenging and high-risk procedure in an anatomically complex area. Safe temporal bone surgery emphasizes a consummate anatomic understanding and technique development that requires the guidance of an experienced otologic surgeon and years of practice. Temporal bone simulation can augment otologic surgical training and enable rehearsal of surgical procedures.

Objectives: The purpose of this article is to provide an updated review of temporal bone simulation platforms and their uses.

Data sources: PubMed literature search. Search terms included temporal bone, temporal bone simulation, virtual reality (VR), and presurgical planning and rehearsal.

Discussion: Various simulation platforms such as cadaveric bone, three-dimensional (3D) printed models, and VR simulation have been used for temporal bone surgery training. However, each simulation method has its drawbacks. There is a need to improve upon current simulation platforms to enhance surgical training and skills assessment, as well as a need to explore other clinically significant applications of simulation, such as preoperative planning and rehearsal, in otologic surgery.

Conclusions: There is no replacement for actual surgical experience, but high-fidelity temporal bone models such as those produced with 3D printing and computer simulation have emerged as promising tools in otolaryngologic surgery. Improvements in the fidelity of both 3D printed and VR simulators, as well as the integration of a standardized assessment format, would allow for an expansion in the use of these simulation platforms in training and assessment.

Comparison of 3-Dimensional and Augmented Reality Kidney Models With Conventional Imaging Data in the Preoperative Assessment of Children With Wilms Tumors (Lianne M Wellens, et al) JAMA Network Open, 2 (4), e192633 2019 Apr 5 PMID: 31002326 PMCID: PMC6481457 DOI: 10.1001/jamanetworkopen.2019.2633

Importance: Nephron-sparing surgery can be considered in well-defined cases of unilateral and bilateral Wilms tumors, but the surgical procedure can be very challenging for the pediatric surgeon to perform.

Objective: To assess the added value of personalized 3-dimensional (3-D) kidney models derived from conventional imaging data to enhance preoperative surgical planning.

Design, setting, and participants: In a survey study, the conventional imaging data of 10 Dutch children with Wilms tumors were converted to 3-D prints and augmented reality (AR) holograms and a panel of pediatric oncology surgeons (n = 7) assessed the quality of the different imaging methods during preoperative evaluation. Kidney models were created with 3-D printing and AR using a mixed reality headset for visualization.

Main outcomes and measures: Differences in the assessment of 4 anatomical structures (tumor, arteries, veins, and urinary collecting structures) using questionnaires. A Likert scale measured differences between the imaging methods, with scores ranging from 1 (completely disagree) to 5 (completely agree).

Results: Of the 10 patients, 7 were girls, and the mean (SD) age was 3.7 (1.7) years. Compared with conventional imaging, the 3-D print and the AR hologram models were evaluated by the surgeons to be superior for all anatomical structures: tumor (median scores for conventional imaging, 4.07; interquartile range [IQR], 3.62-4.15 vs 3-D print, 4.67; IQR, 4.14-4.71; P = .008 and AR hologram, 4.71; IQR, 4.26-4.75; P = .002); arteries (conventional imaging, 3.62; IQR, 3.43-3.93 vs 3-D print, 4.54; IQR, 4.32-4.71; P = .002 and AR hologram, 4.83; IQR, 4.64-4.86; P < .001), veins (conventional imaging, 3.46; IQR 3.39-3.62 vs 3-D print, 4.50; IQR, 4.39-4.68; P < .001 and AR hologram, 4.83; IQR, 4.71-4.86; P < .001), and urinary collecting structures (conventional imaging, 2.76; IQR, 2.42-3.00 vs 3-D print, 3.86; IQR, 3.64-4.39; P < .001 and AR hologram, 4.00; IQR, 3.93-4.58; P < .001). There were no differences in anatomical assessment between the two 3-D techniques (the 3-D print and AR hologram).

Conclusions and relevance: In this study, the 3-D kidney models were associated with improved anatomical understanding among the surgeons and can be helpful in future preoperative planning of nephron-sparing surgery for Wilms tumors. These models may be considered as a supplementary visualization in clinical care.

(Copyright to the referenced article) Figure 1. Workflow Diagram Depicting the Construction Process of 3-Dimensional (3-D) Visualizations. From the patient-derived magnetic resonance image (MRI), computed tomographic (CT) image, or both, a corresponding 3-D print and augmented reality hologram was made. In step 3, segmentations were saved as stereolithography (.STL) files.
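
Step 3 of the workflow above, exporting each segmentation as an .STL surface, is routine to reproduce with open-source tools. The snippet below is a minimal sketch of that step, not the authors' pipeline; the file names and voxel spacing are placeholders.

```python
# Minimal sketch: binary segmentation mask -> printable / AR-ready STL surface.
import numpy as np
from skimage import measure
import trimesh

seg = np.load("kidney_tumor_mask.npy")    # hypothetical binary mask (z, y, x)
voxel_spacing = (1.0, 0.8, 0.8)           # mm, taken from the MRI/CT header

# Extract the isosurface of the mask, scaled to physical units
verts, faces, normals, _ = measure.marching_cubes(
    seg.astype(np.uint8), level=0.5, spacing=voxel_spacing)

# Save the triangle mesh as an STL file for 3D printing or AR import
trimesh.Trimesh(vertices=verts, faces=faces).export("kidney_tumor.stl")
```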

Augmented Reality and Three-Dimensional Printing in Percutaneous Interventions on Pulmonary Arteries (Jan Witowski, et al) Quant Imaging Med Surg, 9 (1), 23-29 Jan 2019 PMID: 30788243 PMCID: PMC6351817 DOI: 10.21037/qims.2018.09.08

Background: Percutaneous pulmonary interventions require extensive and accurate navigation planning and guidance, especially in regard to the three-dimensional (3D) relationships between anatomical structures. In this study, we demonstrate the feasibility of two novel visualization techniques, 3D printing (3DP) and augmented reality (AR), in planning transcatheter pulmonary interventions.

Methods: Two patients were qualified for balloon pulmonary angioplasty (BPA) for treatment of chronic thromboembolic pulmonary hypertension (CTEPH) and stent implantation for pulmonary artery stenosis, respectively. Computed tomography images of both patients were processed with segmentation algorithms and subsequently submitted to 3D modelling software. Microsoft HoloLens® AR headsets with dedicated CarnaLife Holo® software were utilized to display surface and volume rendering of pulmonary vessels as holograms.

Results: Personalized life-sized models of the same structures were additionally 3D-printed for preoperative planning. Holograms were shown to physicians throughout the procedure and were used as a guidance and navigation tool. The operative team was able to manipulate the hologram, and multiple users of the AR system could share the same image in real time. Clinicians expressed their satisfaction with the quality of imaging and potential clinical benefits.

Conclusions: This study reports the potential value of AR in pulmonary interventions; however, prospective trials need to be conducted to determine whether novel 3D visualization techniques affect perioperative treatment and outcomes. Keywords: Percutaneous pulmonary interventions; augmented reality (AR); balloon pulmonary angioplasty (BPA); chronic thromboembolic pulmonary hypertension (CTEPH); three-dimensional printing (3DP).

(Copyright to the referenced article) 3DP models fabricated with white polylactic acid (A), flexible resin (B) and multi-part polylactic acid model (C). 3DP, three-dimensional printing.

(Copyright to the referenced article) First person view of holograms displayed throughout the procedures. (A) surface rendering of pulmonary artery with selected branches highlighted in a different color (blue, red, green); (B) volume-rendering-based hologram of pulmonary artery and its branches.

Patient-specific 3D Printed and Augmented Reality Kidney and Prostate Cancer Models: Impact on Patient Education (Nicole Wake, et al) 3D Print Med, 5 (1), 4 2019 Feb 19 PMID: 30783869 PMCID: PMC6743040 DOI: 10.1186/s41205-019-0041-3

Background: Patient-specific 3D models are being used increasingly in medicine for many applications, including surgical planning, procedure rehearsal, trainee education, and patient education. To date, experience with the use of 3D models to facilitate patient understanding of their disease and surgical plan is limited. The purpose of this study was to investigate, in the context of renal and prostate cancer, the impact of using 3D printed and augmented reality models for patient education.

Methods: Patients with MRI-visible prostate cancer undergoing either robotic-assisted radical prostatectomy or focal ablative therapy, or patients with renal masses undergoing partial nephrectomy, were prospectively enrolled in this IRB-approved study (n = 200). Patients underwent routine clinical imaging protocols and were randomized to receive pre-operative planning with imaging alone or imaging plus a patient-specific 3D model, which was either 3D printed, visualized in AR, or viewed in 3D on a 2D computer monitor. 3D uro-oncologic models were created from the medical imaging data. A 5-point Likert scale survey was administered to patients prior to the surgical procedure to determine understanding of the cancer and treatment plan. If randomized to receive a pre-operative 3D model, the survey was completed twice, before and after viewing the 3D model. In addition, the cohort that received 3D models completed additional questions to compare the usefulness of the different forms of visualization of the 3D models. Survey responses for each of the 3D model groups were compared using the Mann-Whitney and Wilcoxon rank-sum tests.

Results: All 200 patients completed the survey after reviewing their cases with their surgeons using imaging only. A total of 127 patients completed the 5-point Likert scale survey regarding understanding of disease and surgical procedure twice, once with imaging and again after reviewing imaging plus a 3D model. Patients had a greater understanding using 3D printed models versus imaging for all measures, including comprehension of disease, cancer size, cancer location, treatment plan, and the comfort level regarding the treatment plan (range 4.60-4.78/5 vs. 4.06-4.49/5, p < 0.05).

Conclusions: All types of patient-specific 3D models were reported to be valuable for patient education. Out of the three advanced imaging methods, the 3D printed models helped patients to have the greatest understanding of their anatomy, disease, tumor characteristics, and surgical procedure.

(Copyright to the referenced article) (a) 3D printed, (b) 3D computer, and (c) AR kidney cancer models with the kidney – clear, tumor – white (3D print and computer), tumor – purple (AR), artery – red, vein – blue, collecting system – yellow. (d) 3D printed, (e) 3D computer, and (f) AR prostate cancer models (sagittal view) with the prostate – clear, tumor – blue, rectal wall – white, bladder neck and urethra – yellow, and neurovascular bundles – pink.
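
As a rough illustration of the kind of analysis reported above (not the study's actual data or code), Likert responses collected with and without a 3D model can be compared with the nonparametric tests named in the Methods using SciPy; all numbers below are invented.

```python
# Hypothetical Likert-scale responses (1-5): "I understand my tumor's location."
from scipy import stats

imaging_only  = [3, 4, 4, 3, 5, 4, 3, 4, 4, 3]
with_3d_model = [5, 5, 4, 4, 5, 5, 4, 5, 5, 4]

# Same patients rated twice (before / after seeing the model) -> Wilcoxon signed-rank
w_stat, w_p = stats.wilcoxon(imaging_only, with_3d_model)

# Independent groups (e.g., 3D print arm vs. AR arm) -> Mann-Whitney U
u_stat, u_p = stats.mannwhitneyu(imaging_only, with_3d_model, alternative="two-sided")

print(f"Wilcoxon p = {w_p:.3f}; Mann-Whitney p = {u_p:.3f}")
```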

Augmented Reality, Surgical Navigation, and 3D Printing for Transcanal Endoscopic Approach to the Petrous Apex (Samuel R Barber, et al) OTO Open, 2 (4), 2473974X18804492 2018 Oct 29 eCollection Oct-Dec 2018 PMID: 30719506 PMCID: PMC6348519 DOI: 10.1177/2473974X18804492

Otolaryngologists increasingly use patient-specific 3-dimensional (3D)-printed anatomic physical models for preoperative planning. However, few reports describe concomitant use with virtual models. Herein, we aim to (1) use a 3D-printed patient-specific physical model with lateral skull base navigation for preoperative planning, (2) review anatomy virtually via augmented reality (AR), and (3) compare physical and virtual models to intraoperative findings in a challenging case of a symptomatic petrous apex cyst. Computed tomography (CT) imaging was manually segmented to generate 3D models. AR facilitated virtual surgical planning. Navigation was then coupled to 3D-printed anatomy to simulate surgery using an endoscopic approach. Intraoperative findings were comparable to simulation. Virtual and physical models adequately addressed details of endoscopic surgery, including avoidance of critical structures. Complex lateral skull base cases may be optimized by surgical planning via 3D-printed simulation with navigation. Future studies will address whether simulation can improve patient outcomes.

(Copyright to the referenced article above) Figure 1. Left ear 3-dimensional (3D) reconstruction. (A) Manual segmentation from computed tomography images into 3D meshes using ITK-SNAP. (B, C) Augmented reality mobile phone application visualized anatomy preoperatively, registered with target image. FN, facial nerve; ICA, internal carotid artery; JB, jugular bulb; PAC, petrous apex cyst; SCC, semicircular canal; SS, sigmoid sinus.

(Copyright to the referenced article above) Figure 2. (A) A 3-dimensional (3D) print of temporal bone used for preoperative simulation. (B) Computed tomography scan of a 3D print was a 1:1 match with the original and able to be registered for navigation. (C) Transcanal approach to petrous apex was simulated on 3D print with navigation.

(Copyright to the referenced article above) Figure 3. (A) Intraoperative photo of the live surgery performed using a transcanal endoscopic approach. (B) Comparison of intraoperative (upper panel) with virtual, preoperative otoendoscopic views (lower panel) demonstrated that the virtual render predicted the trajectory of the real surgical approach based on structures at risk: (a) internal carotid artery, (b) jugular bulb, (c) basal turn of the cochlea, and (d) access to petrous apex cyst.

Visualization Improves Supraclavicular Access to the Subclavian Vein in a Mixed Reality Simulator (Joshua Warren Sappenfield, et al) Anesthesia & Analgesia, 127 (1), 83-89 July 2018 PMID: 29200069 PMCID: PMC6774241 DOI: 10.1213/ANE.0000000000002572

Background: We investigated whether visual augmentation (3D, real-time, color visualization) of a procedural simulator improved performance during training in the supraclavicular approach to the subclavian vein, an approach not as widely known or used as its infraclavicular counterpart.

Methods: To train anesthesiology residents to access a central vein, a mixed reality simulator with emulated ultrasound imaging was created using an anatomically authentic, 3D-printed, physical mannequin based on a computed tomographic scan of an actual human. The simulator has a corresponding 3D virtual model of the neck and upper chest anatomy. Hand-held instruments such as a needle, an ultrasound probe, and a virtual camera controller are directly manipulated by the trainee and tracked and recorded with submillimeter resolution via miniature, 6 degrees of freedom magnetic sensors. After Institutional Review Board approval, 69 anesthesiology residents and faculty were enrolled and received scripted instructions on how to perform subclavian venous access using the supraclavicular approach based on anatomic landmarks. The volunteers were randomized into 2 cohorts. The first used real-time 3D visualization concurrently with trial 1, but not during trial 2. The second did not use real-time 3D visualization concurrently with trial 1 or 2. However, after trial 2, they observed a 3D visualization playback of trial 2 before performing trial 3 without visualization. An automated scoring system based on time, success, and errors/complications generated objective performance scores. Nonparametric statistical methods were used to compare the scores between subsequent trials, differences between groups (real-time visualization versus no visualization versus delayed visualization), and improvement in scores between trials within groups.

Results: Although the real-time visualization group demonstrated significantly better performance than the delayed visualization group on trial 1 (P = .01), there was no difference in gain scores, between performance on the first trial and performance on the final trial, that were dependent on group (P = .13). In the delayed visualization group, the difference in performance between trial 1 and trial 2 was not significant (P = .09); reviewing performance on trial 2 before trial 3 resulted in improved performance when compared to trial 1 (P < .0001). There was no significant difference in median scores (P = .13) between the real-time visualization and delayed visualization groups for the last trial after both groups had received visualization. Participants reported a significant improvement in confidence in performing supraclavicular access to the subclavian vein. Standard deviations of scores, a measure of performance variability, decreased in the delayed visualization group after viewing the visualization.

Conclusions: Real-time visual augmentation (3D visualization) in the mixed reality simulator improved performance during supraclavicular access to the subclavian vein. No difference was seen in the final trial of the group that received real-time visualization compared to the group that had delayed visualization playback of their prior attempt. Training with the mixed reality simulator improved participant confidence in performing an unfamiliar technique.

(Copyright to the referenced article above) The mixed reality simulator of central venous access used in the study. The three-dimensional visualization is displayed on the laptop screen.
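
The automated scoring system mentioned in the Methods combines time, success, and errors/complications into a single objective number. The function below is purely illustrative of that idea; the weights and penalty values are invented, not those used in the study.

```python
# Illustrative composite scoring of a simulated central-line attempt.
from dataclasses import dataclass

@dataclass
class TrialRecord:
    time_seconds: float
    vein_accessed: bool        # success criterion
    arterial_punctures: int    # tracked complications
    pleural_contacts: int

def score_trial(trial: TrialRecord, time_limit: float = 300.0) -> float:
    score = 100.0
    score -= 10.0 * trial.arterial_punctures                    # penalize each complication
    score -= 10.0 * trial.pleural_contacts
    score -= 20.0 * min(trial.time_seconds / time_limit, 1.0)   # slower attempts score lower
    if not trial.vein_accessed:
        score -= 50.0                                           # failure dominates the score
    return max(score, 0.0)

print(score_trial(TrialRecord(time_seconds=140, vein_accessed=True,
                              arterial_punctures=0, pleural_contacts=1)))
```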

Neurosurgical Virtual Reality Simulation for Brain Tumor Using High-definition Computer Graphics: A Review of the Literature (Taichi Kin, et al) Neurol Med Chir (Tokyo), 57 (10), 513-520 2017 Oct 15 PMID: 28637947 PMCID: PMC5638778 DOI: 10.2176/nmc.ra.2016-0320

Simulation and planning of surgery using a virtual reality model is becoming common with advances in computer technology. In this study, we conducted a literature search to find trends in virtual simulation of surgery for brain tumors. A MEDLINE search for “neurosurgery AND (simulation OR virtual reality)” retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed for clinical application of surgical simulation. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful in better understanding complicated spatial relations of anatomical landmarks and in examining surgical approaches. In some studies, virtual reality models were used with either a surgical navigation system or augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in standardized, objective evaluation of surgical simulation systems; inability to respond to tissue deformation caused by surgical maneuvers; absence of system functionality to reflect features of tissue (e.g., hardness and adhesion); and many problems with image processing. Descriptions of the image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.

Augmented Reality in Computer-Assisted Interventions Based on Patient-Specific 3D Printed Reference (Rafael Moreta-Martinez, et al) Healthc Technol Lett, 5 (5), 162-166 2018 Sep 14 eCollection Oct 2018 PMID: 30464847 PMCID: PMC6222179 DOI: 10.1049/htl.2018.5072

Augmented reality (AR) can be an interesting technology for clinical scenarios as an alternative to conventional surgical navigation. However, the registration between augmented data and real-world spaces is a limiting factor. In this study, the authors propose a method based on desktop three-dimensional (3D) printing to create patient-specific tools containing a visual pattern that enables automatic registration. This specific tool fits on the patient only in the location it was designed for, avoiding placement errors. This solution has been developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D printed phantom replicating the anatomy of a patient presenting with an extraosseous Ewing’s sarcoma, and then tested during the actual surgical intervention. The application allowed physicians to visualise the skin, bone and tumour location overlaid on the phantom and patient. This workflow could be extended to many clinical applications in the surgical field and also for training and simulation, in cases where hard body structures are involved. Although the authors have tested their workflow on an AR head-mounted display, they believe that a similar approach can be applied to other devices such as tablets or smartphones.

(Copyright to the referenced article above) Surgical guide containing a visual marker.
(Copyright to the referenced article above) Fig. 3: Point recording on phantom. (a) Virtual view; (b) real view; (c) AR view.
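
To make the registration idea concrete: once the visual pattern on the 3D printed guide is detected in the camera image, the guide's pose follows from a standard perspective-n-point solve. The snippet below is a conceptual sketch with OpenCV, not the authors' HoloLens implementation; the corner coordinates, camera intrinsics, and marker size are all hypothetical.

```python
# Conceptual sketch: known 3D marker geometry + detected 2D corners -> guide pose.
import numpy as np
import cv2

marker_size = 30.0  # mm, side length of the printed pattern on the guide
object_points = np.array([[0, 0, 0],
                          [marker_size, 0, 0],
                          [marker_size, marker_size, 0],
                          [0, marker_size, 0]], dtype=np.float32)

# Pixel corners of the same pattern, e.g. from a fiducial/marker detector
image_points = np.array([[412, 285], [498, 290], [495, 372], [409, 368]],
                        dtype=np.float32)

# Hypothetical pinhole camera intrinsics; no lens distortion assumed
camera_matrix = np.array([[1400.0, 0.0, 640.0],
                          [0.0, 1400.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # rotation of the guide in the camera frame
    # R and tvec position the guide (and the anatomy it fits) relative to the
    # camera, so virtual skin, bone, and tumour models can be overlaid in place.
```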

Related Articles:

From 3D Printing to VR/AR: Simple Connection?

From Academia: 3D Printed Pills, 4D Printed Structure, and How to Properly 3D Print Chocolate

From Academia: 3D-Printed Aligner, Bioprinting for Mouth Ulcer, Vertebroplasty Guides

From Academia: 3D Printing and Robotics, Stem cell coated Implants, Decentralized Mitigation of Pandemics
