Sinus navigation systems have been around since the late 1980s. For many years, the systems were relatively basic; the monitors were low resolution and the instruments crude, noted Martin J. Citardi, MD, professor and chair of the department of otolaryngology–head and neck surgery at the University of Texas Health Science Center in Houston.
However, in the past five years, several advancements in image-guided surgery (IGS) have increased the functionality of these systems, which are now enhanced by augmented and virtual reality technology.
“The newest systems are like a GPS system for the sinuses,” said Jivianne Lee, MD, associate professor in the department of head and neck surgery at the David Geffen School of Medicine at the University of California, Los Angeles.
Dr. Citardi agreed that significant advances have been made. “There are multiple [commercial] systems that are available now; competition has been very good at forcing innovation,” he said. (Dr. Citardi is a consultant for Acclarent and Intersect.)
Advances in Technology
One of the major advances in sinus navigation systems has been an improvement in instrumentation. All systems used in IGS have similar components: computer, workstation, video monitor, tracking system, surgical instrumentation, and data transfer hardware. Today’s models have high-resolution monitors that offer improved visualization and color contrast, especially in the red spectrum, which is essential for endoscopy of the sinuses.
According to Dr. Citardi, one of the foundations for reliable and accurate IGS is the registration process, a software-guided component that enhances the accuracy of the technology. “There’s a lot of discussion around instrumentation,” he said. “Each platform has a slightly different registration process, but getting registration right is essential for performing any of the advanced applications that are becoming possible.”
Most systems today use a contour-based registration protocol. The software automatically identifies the surface contour in the imaging dataset volume by building a 3D model from the computed tomography (CT) scan. At the time of surgery, the surgeon identifies contours on the surface of the patient’s face and head with a probe that’s tracked in 3D space. The software then aligns the contours mapped in the surgical volume with the corresponding contours in the preoperative imaging volume to create the registration. The system can also track the position of the patient’s head through a sensor/tracker that’s firmly attached to the patient’s forehead; this sensor allows the system to compensate for movement of the patient’s head during surgery (Ear Nose Throat J. 2021;100:NP475-NP486).
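The alignment step at the heart of contour-based registration can be illustrated with a minimal least-squares fit. The sketch below is plain NumPy, not any vendor's actual algorithm, and it assumes the probe-sampled facial contour points have already been paired with their matching points on the CT-derived surface model; it then recovers the rigid rotation and translation that map tracked instrument positions into the preoperative imaging volume (the Kabsch/Procrustes method):

```python
import numpy as np

def rigid_register(probe_pts, model_pts):
    """Estimate the rigid transform (R, t) that best aligns probe-sampled
    facial contour points to matched points on the preoperative CT surface
    model, via the Kabsch/Procrustes least-squares fit.

    probe_pts, model_pts: (N, 3) arrays of paired 3D points.
    """
    # Center both point clouds on their centroids.
    p_mean = probe_pts.mean(axis=0)
    m_mean = model_pts.mean(axis=0)
    P = probe_pts - p_mean
    M = model_pts - m_mean
    # SVD of the cross-covariance matrix yields the optimal rotation;
    # the determinant correction guards against a reflection solution.
    U, _, Vt = np.linalg.svd(P.T @ M)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = m_mean - R @ p_mean
    return R, t

def to_image_space(R, t, tracked_pt):
    """Map a tracked instrument-tip position into image coordinates."""
    return R @ tracked_pt + t
```

In practice the systems must also solve the pairing problem (which probe point matches which model point), typically with an iterative closest-point scheme layered on top of a fit like this one.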
You have the benefit of dual information: The CT scan tells you the areas of bony erosion, for example, and the MRI tells you the extent of the tumor. The surgeon can toggle back and forth between the two images as they are navigating.
—Jivianne Lee, MD
A second advance involves navigation software that facilitates simultaneous navigation using multiple imaging modalities. “We now have the ability to merge CT and MRI [magnetic resonance imaging] together,” Dr. Lee said. “Traditionally, we would navigate off the CT scan for our endoscopic sinus surgery cases.” Now, she explained, with innovations in surgical navigation software, the CT and MRI can be used concurrently during intraoperative tracking, which can be particularly useful during endoscopic resection of sinonasal and skull base tumors. (Dr. Lee is a consultant for Medtronic ENT and Stryker ENT.)
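Conceptually, once the CT and MRI are co-registered to the same world frame, "toggling" between modalities is just a choice of which volume to sample at the tracked tip position. The sketch below is a simplification, not any vendor's software; it assumes each volume comes with the inverse of its voxel-to-world affine (a hypothetical 4×4 matrix) and uses nearest-neighbor lookup:

```python
import numpy as np

def sample(volume, affine_inv, world_pt):
    """Return the voxel value at a tracked world-space point, using the
    inverse of the volume's voxel-to-world affine (4x4 homogeneous)."""
    # Convert the world point to homogeneous coordinates, map it into
    # voxel indices, and round to the nearest voxel.
    i, j, k, _ = np.rint(affine_inv @ np.append(world_pt, 1.0)).astype(int)
    return volume[i, j, k]
```

Because both datasets share one world frame after fusion, the same tracked position indexes into either one; switching the display from CT to MRI is simply a matter of sampling the other (volume, affine) pair.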
Advances in Visualization
The manufacturers of the navigation systems have taken two approaches to improved visualization: augmented reality and virtual reality.
Augmented Reality. “Augmented reality [AR] is when you take annotations that the surgeon makes on the preoperative imaging [usually the CT scan] and overlay those annotations on the endoscopic camera view,” said Dr. Citardi.
In a study of AR systems, researchers compared conventional navigation software (n = 52) with navigation software incorporating AR elements (n = 48). To evaluate the benefit of each system, surgeons completed a questionnaire after finishing the operation. Surgeons using the AR system reported a greater benefit when they spent more time on preoperative image analysis, and they used the navigation system for more surgical steps than surgeons using the conventional system. The authors concluded that “the AR-enhanced navigation software offers potential benefits during surgery without affecting the duration of the operation or the incidence of postoperative complications” (Laryngoscope Investig Otolaryngol. 2020;5:621-629).
“AR technology permits virtual elements to be overlaid on the endoscopic surgical view in real time,” said Waleed M. Abuzeid, MD, an associate professor in the department of otolaryngology–head and neck surgery and a specialist in rhinology and skull base surgery at the University of Washington in Seattle. “Take the example of a surgery for a complex skull base tumor. The surgeon can use the surgical navigation system to outline critical structures, including the tumor target, as well as anatomy that one wishes to avoid, such as critical nerves and vascular structures,” Dr. Abuzeid said. (Dr. Abuzeid is a consultant for Medtronic.)
Charles Stephen Ebert, Jr., MD, MPH, a professor and director of advanced neurorhinology and endoscopic skull base surgery at the University of North Carolina in Chapel Hill, agreed. “The newer systems have the ability to auto-segment structures, such as the orbital contents, brain, and soft tissues. Some systems can allow the surgeon to pick an area of interest, like tumors, to segment. As such, these structures can be used as anti-targets. Surgeons can place points of anti-targets on various structures, like the anterior ethmoid artery, that warn the surgeon when the guided instrument nears that structure,” said Dr. Ebert. (Dr. Ebert is a consultant for Acclarent.)
I can imagine a reality where the surgeon wears AR goggles that serve to highlight key anatomy, surgical targets, surgical trajectory, and relevant imaging as a real-time overlay on the endoscopic view.
—Waleed M. Abuzeid, MD
Another innovation of AR has been the development of surgical pathway planning. The surgeon has the option to preplan a route to safely gain access to the tumor target. “All this data can be overlaid on the endoscopic view, augmenting the surgeon’s ability to avoid important anatomy as they navigate a preplanned trajectory to the tumor,” Dr. Abuzeid said. “This allows the surgeon to pick a point in the nasal cavity and another target point in a sinus. The computer picks a path to the target around those structures. Surgeons may follow this path to find the opening of the sinus of interest.”
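The "computer picks a path" step is, at heart, a graph search over the imaging volume. As a toy illustration (a 2D grid and breadth-first search, not the commercial planning algorithm), the sketch below finds the shortest open route between a chosen start point and a target while treating bone and surgeon-marked anti-target voxels as blocked:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2D voxel slice.

    grid: list of rows; 0 = open air space, 1 = blocked (bone or a
    surgeon-marked anti-target such as the anterior ethmoid artery).
    Returns the shortest open path from start to goal as a list of
    (row, col) voxels, or None if no route exists.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # doubles as the visited set
    q = deque([start])
    while q:
        cur = q.popleft()
        if cur == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cur
                q.append((nr, nc))
    return None
```

A real planner works in three dimensions and weights the search toward clinically sensible corridors, but the underlying idea (a shortest path through open voxels around marked hazards) is the same.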
Virtual Reality. In the context of surgical navigation systems, virtual reality (VR) technology refers to the addition of virtual elements onto a digital model of the sinuses and/or skull base. These customizable elements can include critical anatomic features or, for example, a pathway for surgical dissection into the frontal sinus. The elements can be highlighted on a “virtual endoscopic” view that “is akin to a digital reconstruction of the real endoscopic surgical view, or onto the preoperative CT or MRI,” said Dr. Abuzeid.
With VR, the user adds virtual information to computerized models of the sinuses during surgery. Newer VR systems have the ability to perform “fast-anatomical mapping [FAM],” said Dr. Ebert. “This feature tracks all the points where guided instruments have been during surgery. This is important because, at any point, the surgeon can see where and to what extent surgery has been performed, allowing them to check for completeness of surgery,” he said.
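The bookkeeping behind a feature like FAM can be sketched simply: quantize every tracked tip position to a voxel, log it, and compare the logged set against the planned dissection region. The class below is a toy model under those assumptions, not any vendor's implementation:

```python
class FastAnatomicalMap:
    """Toy sketch of fast-anatomical mapping: log every voxel the tracked
    instrument tip visits, then report how much of a planned dissection
    region has actually been reached."""

    def __init__(self, voxel_size=1.0):
        self.voxel_size = voxel_size
        self.visited = set()

    def record(self, x, y, z):
        # Quantize the tracked tip position (in mm) to a voxel index.
        voxel = tuple(int(c // self.voxel_size) for c in (x, y, z))
        self.visited.add(voxel)

    def coverage(self, planned_voxels):
        # Fraction of the planned dissection region the tip has reached;
        # a proxy for "completeness of surgery" at any point in the case.
        planned = set(planned_voxels)
        return len(planned & self.visited) / len(planned)
```

The surgeon-facing version renders the visited set as a colored overlay on the imaging, but the underlying record is the same running set of positions.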
The merging of the CT and MRI has been very useful when performing skull base surgery and tumor resections, Dr. Lee said. “You have the benefit of dual information: The CT scan tells you the areas of bony erosion, for example, and the MRI tells you the extent of the tumor. The surgeon can toggle back and forth between the two images as they are navigating,” she said.
The implementation of AR and VR technology in surgical navigation systems has the potential to “increase the safety, efficiency, and completeness of surgery while reducing the surgeon’s mental workload,” said Dr. Abuzeid.
Office-Based Systems
The smaller footprint of the newer sinus navigation systems has made it possible to move this technology from the operating room to the office, said Dr. Lee. In the office setting, other benefits accrue, including the ability to navigate with flexible-tipped instruments, she said. Previously, Dr. Lee explained, surgeons could navigate only with rigid instrumentation. “But that has changed. You can now track a flexible tip, a flexible balloon, or a guidewire, and more. In addition, we now have distal tip tracking available, compared with the traditional tracker located on the mid-shaft or proximal end of the instrument. Both of these, the distal tip and flexible tip tracking, are now available in the office setting.”
The Future of Sinus Navigation
“Ultimately, I expect this technology to allow for real-time updates to the preoperative imaging as surgery progresses,” said Dr. Abuzeid. “In currently available systems, the anatomy displayed on the preoperatively acquired imaging uploaded into the navigation system and the patient’s true anatomy diverge. The divergence gets worse the further the surgery progresses. VR technology will eventually allow anatomy to be ‘deleted’ in real-time from the imaging as the surgery progresses so that the imaging and reality are a better reflection of one another. We have already developed and continue to refine surgical navigation technology that allows us to achieve real-time updates to the ‘virtual CT’ with our team of surgeons and engineers at the University of Washington.”
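The "deletion" Dr. Abuzeid describes amounts to editing the imaging volume as tissue is removed. As a minimal sketch of the idea (not the University of Washington group's actual technology), resected voxels could be set to air density so the displayed "virtual CT" keeps pace with the patient's true anatomy:

```python
def carve(volume, removed_voxels, air_value=-1000):
    """Toy 'virtual CT' update: set voxels the surgeon has resected to
    air density (about -1000 Hounsfield units) so the displayed imaging
    reflects the evolving surgical anatomy.

    volume: nested lists indexed [i][j][k]; removed_voxels: (i, j, k) tuples.
    """
    for i, j, k in removed_voxels:
        volume[i][j][k] = air_value
    return volume
```

The hard problems, of course, are deciding in real time *which* voxels have been removed and keeping the edit consistent with tracking error, which is where the ongoing engineering work lies.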
“I see the technology advancing more toward real-time feedback methods like the FAM,” said Dr. Ebert. “The ability to navigate angled drills and debriders is on the horizon. I also think that the ability to autosegment vascular structures and tumors will aid in the endoscopic skull base surgery area.”
Dr. Abuzeid can also see AR becoming a means of combining the many streams of data that a surgeon has to process simultaneously during surgery—including the preoperative imaging, the endoscopic view, a constant situational awareness as it relates to critical anatomy—into a more efficient, combined view akin to the data aggregation that a fighter pilot gets through their heads-up display. “I can imagine a reality where the surgeon wears AR goggles that serve to highlight key anatomy, surgical targets, surgical trajectory, and relevant imaging as a real-time overlay on the endoscopic view. This could be game changing from a mental workload, safety, and efficiency standpoint,” he said.
Nikki Kean is a freelance medical writer based in New Jersey.
Sinus Navigation Systems and Resident Training
There are limited data on the ability of augmented reality (AR) and virtual reality (VR) to enhance resident training in sinus surgery. But early experience with the technology suggests that it has a great deal of promise.
“Most residents and fellows in training now get exposure to [sinus navigation systems],” said Martin J. Citardi, MD, a professor and chair of the department of otolaryngology–head and neck surgery at the University of Texas Health Science Center in Houston. “When someone is ready to go into practice, he or she should be comfortable with it. We’ve done a couple of papers that show that the robustness of the registration process is related to the experience of the person doing the registration.” Based on these findings, noted Dr. Citardi, surgeons may want to reconsider delegating the registration process to their most junior person.
Fellowship-trained surgeons have reported increased confidence operating near critical structures and more complete surgical dissection using this technology, which, in future studies, may translate to increased safety and reduced surgical failure rates, said Waleed M. Abuzeid, MD, an associate professor in the department of otolaryngology–head and neck surgery and a specialist in rhinology and skull base surgery at the University of Washington in Seattle. Furthermore, mental workload and stress during surgery may also improve based on early studies (Laryngoscope Investig Otolaryngol. 2020;5:621-629).
From a training standpoint, AR and VR could facilitate a deeper understanding of anatomic relationships and, importantly, identification of variations in anatomy that could lead to complications in endoscopic sinus surgery. “This technology may also allow for a graduated approach in surgical technique, beginning with the trainee adhering to a preplanned ‘flight path’ or surgical trajectory set by the teacher and progressing through to the trainee demonstrating their own ability to develop and execute a presurgical dissection plan,” said Dr. Abuzeid.
“For training, you can build a 3D model of the sinus anatomy, particularly with respect to the frontal recess, which can help plan our surgical approaches during endoscopic dissection,” added Jivianne Lee, MD, an associate professor in the department of head and neck surgery at the David Geffen School of Medicine at the University of California, Los Angeles. “For the frontal sinus, you can color block specific cells and pathways that you anticipate you’ll encounter at the time of surgery [on the scan] and then superimpose those drawings on the endoscopic image at the time of surgery.”
“I would say the advancement that has most aided resident education is fast-anatomical mapping [FAM],” said Charles Stephen Ebert, Jr., MD, MPH, a professor and director of advanced neurorhinology and endoscopic skull base surgery at the University of North Carolina Health in Chapel Hill. This feature allows the resident to get real-time feedback on the extent of their surgery, which can then be compared with the attending’s completeness of surgery.