Image-guided surgery (IGS) is a technique where a computer is used to obtain a real-time correlation of the location of an instrument that has been inserted into a patient's body to a set of preoperatively obtained images (e.g., a CT or MRI scan, 3-D map, etc.), such that the computer system may superimpose the current location of the instrument on the preoperatively obtained images. An example of an electromagnetic IGS navigation system that may be used in IGS procedures is the CARTO® 3 System by Biosense-Webster, Inc., of Irvine, California. In some IGS procedures, a digital tomographic scan (e.g., CT or MRI, 3-D map, etc.) of the operative field is obtained prior to surgery. A specially programmed computer is then used to convert the digital tomographic scan data into a digital map. During surgery, special instruments having sensors (e.g., electromagnetic coils that emit electromagnetic fields and/or are responsive to externally generated electromagnetic fields) are used to perform the procedure while the sensors send data to the computer indicating the current position of each medical procedure instrument. The computer correlates the data it receives from the sensors with the digital map that was created from the preoperative tomographic scan. The tomographic scan images are displayed on a video monitor along with an indicator (e.g., crosshairs or an illuminated dot, etc.) showing the real-time position of each medical procedure instrument relative to the anatomical structures shown in the scan images. The surgeon is thus able to know the precise position of each sensor-equipped instrument by viewing the video monitor even if the surgeon is unable to directly visualize the instrument itself at its current location within the body.
In some instances, it may be desirable to dilate an anatomical passageway in a patient. This may include dilation of ostia of paranasal sinuses (e.g., to treat sinusitis), dilation of the larynx, dilation of the Eustachian tube, dilation of other passageways within the ear, nose, or throat, etc. One method of dilating anatomical passageways includes using a guide wire and catheter to position an inflatable balloon within the anatomical passageway, then inflating the balloon with a fluid (e.g., saline) to dilate the anatomical passageway. For instance, the expandable balloon may be positioned within an ostium at a paranasal sinus and then be inflated, to thereby dilate the ostium by remodeling the bone adjacent to the ostium, without requiring incision of the mucosa or removal of any bone. The dilated ostium may then allow for improved drainage from and ventilation of the affected paranasal sinus.
It may also be desirable to ablate tissue within the ear, nose, or throat of a patient. For instance, such ablation may be desirable to remodel tissue (e.g., to reduce the size of a turbinate), to provide denervation (e.g., to disable the posterior nasal nerve), and/or for other purposes. To achieve ablation, an end effector with one or more needle electrodes or other kind(s) of tissue contacting electrodes may be activated with monopolar or bipolar RF energy. Such ablation procedures may be carried out in conjunction with a dilation procedure or separately from a dilation procedure.
It may also be desirable to provide easily controlled placement of a dilation catheter, ablation instrument, or other ENT instrument in an anatomical passageway, including in procedures that will be performed only by a single operator. While several systems and methods have been made and used to position an ENT instrument in an anatomical passageway, it is believed that no one prior to the inventors has made or used the invention described in the appended claims.
The drawings and detailed description that follow are intended to be merely illustrative and are not intended to limit the scope of the invention as contemplated by the inventors.
The following description of certain examples of the invention should not be used to limit the scope of the present invention. Other examples, features, aspects, embodiments, and advantages of the invention will become apparent to those skilled in the art from the following description, which is by way of illustration, one of the best modes contemplated for carrying out the invention. As will be realized, the invention is capable of other different and obvious aspects, all without departing from the invention. Accordingly, the drawings and descriptions should be regarded as illustrative in nature and not restrictive.
For clarity of disclosure, the terms “proximal” and “distal” are defined herein relative to a surgeon, or other operator, grasping a medical procedure instrument having a distal medical procedure end effector. The term “proximal” refers to the position of an element arranged closer to the surgeon, and the term “distal” refers to the position of an element arranged closer to the medical procedure end effector of the medical procedure instrument and further away from the surgeon. Moreover, to the extent that spatial terms such as “upper,” “lower,” “vertical,” “horizontal,” or the like are used herein with reference to the drawings, it will be appreciated that such terms are used for exemplary description purposes only and are not intended to be limiting or absolute. In that regard, it will be understood that medical procedure instruments such as those disclosed herein may be used in a variety of orientations and positions not limited to those shown and described herein.
As used herein, the terms “about” and “approximately” for any numerical values or ranges indicate a suitable dimensional tolerance that allows the part or collection of components to function for its intended purpose as described herein.
When performing a medical procedure within a head (H) of a patient (P), it may be desirable to have information regarding the position of an instrument within the head (H) of the patient (P), particularly when the instrument is in a location where it is difficult or impossible to obtain an endoscopic view of a working element of the instrument within the head (H) of the patient (P).
IGS navigation system (50) of the present example comprises a field generator assembly (60), which comprises a set of magnetic field generators (64) that are integrated into a horseshoe-shaped frame (62). Field generators (64) are operable to generate alternating magnetic fields of different frequencies around the head (H) of the patient (P). An instrument, such as any of the instruments described below, may be inserted into the head (H) of the patient (P). Such an instrument may be a standalone device or may be positioned on an end effector. In the present example, frame (62) is mounted to a chair (70), with the patient (P) being seated in the chair (70) such that frame (62) is located adjacent to the head (H) of the patient (P). By way of example only, chair (70) and/or field generator assembly (60) may be configured and operable in accordance with at least some of the teachings of U.S. Pat. No. 10,561,370, entitled “Apparatus to Secure Field Generating Device to Chair,” issued Feb. 18, 2020, the disclosure of which is incorporated by reference herein, in its entirety.
IGS navigation system (50) of the present example further comprises a processor (52), which controls field generators (64) and other elements of IGS navigation system (50). For instance, processor (52) is operable to drive field generators (64) to generate alternating electromagnetic fields; and process signals from the instrument to determine the location of a navigation sensor in the instrument within the head (H) of the patient (P). Processor (52) comprises a processing unit (e.g., a set of electronic circuits arranged to evaluate and execute software instructions using combinational logic circuitry or other similar circuitry) communicating with one or more memories. Processor (52) of the present example is mounted in a console (58), which comprises operating controls (54) that include a keypad and/or a pointing device such as a mouse or trackball. A physician uses operating controls (54) to interact with processor (52) while performing the medical procedure.
While not shown in
Processor (52) uses software stored in a memory of processor (52) to calibrate and operate IGS navigation system (50). Such operation includes driving field generators (64), processing data from the instrument, processing data from operating controls (54), and driving display screen (56). Processor (52) is further operable to provide video in real time via display screen (56), showing the position of the distal end of the instrument in relation to a video camera image of the patient's head (H), a CT scan image of the patient's head (H), and/or a computer-generated three-dimensional model of the anatomy within and adjacent to the patient's nasal cavity. Display screen (56) may display such images simultaneously and/or superimposed on each other during the medical procedure. Such displayed images may also include graphical representations of instruments that are inserted in the patient's head (H), such that the operator may view the virtual rendering of the instrument at its actual location in real time. By way of example only, display screen (56) may provide images in accordance with at least some of the teachings of U.S. Pat. No. 10,463,242, entitled “Guidewire Navigation for Sinuplasty,” issued Nov. 5, 2019, the disclosure of which is incorporated by reference herein, in its entirety. In the event that the operator is also using an endoscope, the endoscopic image may also be provided on display screen (56).
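By way of illustration only, and not as a description of any particular commercial implementation, the following sketch shows one way a tracked sensor position might be mapped into preoperative image coordinates after registration, so that an indicator (e.g., crosshairs) may be drawn on the corresponding scan slice. The rigid transform, voxel spacing, and function names below are assumptions for illustration.

```python
# Illustrative sketch only: mapping a tracked sensor position into preoperative
# CT voxel coordinates via a rigid registration transform, so a crosshair can
# be drawn on the matching slice.
import numpy as np

def to_homogeneous(p):
    """Return a 3-D point as a 4-vector for use with 4x4 transforms."""
    return np.array([p[0], p[1], p[2], 1.0])

def sensor_to_voxel(p_tracker_mm, T_ct_from_tracker, voxel_size_mm):
    """Map a sensor position (tracker frame, mm) to CT voxel indices.

    T_ct_from_tracker: 4x4 rigid transform obtained from patient registration.
    voxel_size_mm: (dx, dy, dz) spacing of the CT volume (assumed values).
    """
    p_ct_mm = T_ct_from_tracker @ to_homogeneous(p_tracker_mm)
    return np.round(p_ct_mm[:3] / np.asarray(voxel_size_mm)).astype(int)

# Example: identity registration, 0.5 mm isotropic voxels (hypothetical numbers).
T = np.eye(4)
print(sensor_to_voxel((12.0, -3.5, 40.0), T, (0.5, 0.5, 0.5)))  # -> [24 -7 80]
```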
The images provided through display screen (56) may help guide the operator in maneuvering and otherwise manipulating instruments within the patient's head (H). It should also be understood that other components of a medical procedure instrument and other kinds of medical procedure instruments, as described below, may incorporate a navigation sensor like the navigation sensor described above.
II. Example of an ENT Instrument with Steerable Guide and Working Element
Instrument (100) of this example includes a handle assembly (106) and a shaft assembly (108). In some versions, handle assembly (106) and/or shaft assembly (108) includes one or more navigation sensors (e.g., navigation sensor assembly (110) as described below) that cooperate with IGS navigation system (50) as described above, to provide signals indicating the real-time position of corresponding portions of instrument (100). Instrument (100) may be coupled with an inflation fluid source (not shown), which may be operable to selectively supply an inflation fluid to a balloon (not shown) of end effector (104). In addition, or alternatively, instrument (100) may be coupled with an RF generator (not shown), which may be operable to generate RF energy for delivery to tissue via electrodes (not shown) of end effector (104) and/or via electrodes (121, 123) (
Handle assembly (106) of this example includes a body (112) and a slider (114). Slider (114) is operable to translate longitudinally relative to body (112). Slider (114) is coupled with working element (101) and is thus operable to translate working element (101) longitudinally between a proximally retracted position (
Shaft assembly (108) of the present example includes a rigid portion (116), a flexible portion (118) distal to rigid portion (116), and an open distal end (120). A pull-wire (not shown) is coupled with flexible portion (118) and with a deflection control knob (122) of handle assembly (106). Deflection control knob (122) is rotatable relative to body (112) to selectively retract the pull-wire proximally. As the pull-wire is retracted proximally, flexible portion (118) bends and thereby deflects distal end (120) laterally away from the longitudinal axis of rigid portion (116). Deflection control knob (122), the pull-wire, and flexible portion (118) thus cooperate to impart steerability to shaft assembly (108). By way of example only, such steerability of shaft assembly (108) may be provided in accordance with at least some of the teachings of U.S. Pat. Pub. No. 2021/0361912, entitled “Shaft Deflection Control Assembly for ENT Guide Instrument,” published Nov. 25, 2021, the disclosure of which is incorporated by reference herein, in its entirety; and/or U.S. Pat. No. 11,376,401, entitled “Deflectable Guide for Medical Instrument,” issued Jun. 15, 2022, the disclosure of which is incorporated by reference herein, in its entirety. Other versions may provide some other kind of user input feature to drive steering of flexible portion (118), instead of deflection control knob (122). In some alternative versions, deflection control knob (122) is omitted, and flexible portion (118) is malleable. In still other versions, the entire length of shaft assembly (108) is rigid.
Shaft assembly (108) is also rotatable relative to handle assembly (106), about the longitudinal axis of rigid portion (116). Such rotation may be driven via rotation control knob (124), which is rotatably coupled with body (112) of handle assembly (106). Alternatively, shaft assembly (108) may be rotated via some other form of user input; or may be non-rotatable relative to handle assembly (106).
A working lumen extends longitudinally from an open proximal end of shaft assembly (108) all the way to open distal end (120) and is configured to slidably receive working element (101), such that shaft assembly (108) may receive working element (101) at the open proximal end, and such that shaft assembly (108) may guide working element (101) out through open distal end (120).
As shown in
As also shown in
Illuminating elements (162, 163) are configured and operable to illuminate the field of view of camera (161). Conduits (164, 165) laterally flank camera (161) in this example. One or both of conduits (164, 165) may be in fluid communication with a source of liquid (e.g., saline, etc.) and/or a source of suction. In versions where at least one of conduits (164, 165) is in communication with a source of liquid, such conduit(s) (164, 165) may be used to deliver such liquid to a distal end of camera (161). By flushing the distal end of camera (161) with liquid, conduits (164, 165) may be used to keep the distal end of camera (161) clear of debris and thereby maintain appropriate visualization via camera (161).
Plate member (160) of this example includes a plate (166) and a pair of transversely extending tabs (167, 168). Plate (166) is positioned over camera (161) and may thus serve to shield camera (161) from getting snagged and perhaps damaged by working element (101) and/or other instruments that are advanced along working channel (149). Tabs (167, 168) are positioned to correspond with the locations of respective distal ends of conduits (164, 165). Tab (167) may be further positioned to leave a gap (not shown) between the proximal face of tab (167) and the distal end of conduit (164), and a similar gap may be left between the proximal face of tab (168) and the distal end of conduit (165). These gaps may be sized to allow liquid to escape from the distal ends of conduits (164, 165); and to allow suction to be applied via the distal ends of conduits (164, 165). However, the presence of tabs (167, 168) may assist in diverting liquid expelled via the distal ends of conduits (164, 165) toward the distal end of camera (161) and thereby assist in flushing debris away from camera (161).
III. Example of Overlaying Instrument Data onto Images
While image (400) clearly shows the anatomical structures and instrument (410) within the field of view of camera (161), it may be desirable to provide additional information in images (400) captured via camera (161). For instance, such additional information may include the kind of working element (101) that is disposed in instrument (100); the kind of other instrument (410) that is inserted in the patient with instrument (100); structural characteristics of working element (101) or instrument (410), such as length, bend angle, diameter, etc.; operational characteristics of working element (101) or instrument (410), such as depth of insertion, activation status, etc.; and/or other information. With the additional information being provided in the form of one or more overlays on an image (400) from camera (161), the operator may readily observe the additional information while observing the image (400) from camera (161) during a procedure, without the operator needing to look away from image (400) to observe the additional information.
As noted above, it may be beneficial to include additional information about working element (101) or instrument (410) in an image (400) captured by camera (161), to enable the operator to observe the additional information without needing to look away from image (400). It may also be beneficial to include such additional information within user interface (500) (e.g., as an overlay), to enable the operator to observe the additional information without needing to look away from user interface (500). In addition, it may be beneficial to integrate image (400) into user interface (500). Thus, some versions of display screen (56) may render a combination of image (400), preoperative images (510, 520, 530, 540), and one or more overlays, etc., providing additional information about working element (101) or instrument (410).
Once the instrument data is obtained (601), processor (52) may determine or identify (602) various factors about the instrument use, such as, for example, the instrument type, one or more kinds of instrument use cases (e.g., the kind or kinds of procedures in which the working element (101) and/or instrument (410) is configured to be used), structural characteristics of working element (101) or instrument (410) (e.g., length, bend angle, diameter, etc.), and/or other preoperative information associated with working element (101) or instrument (410).
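By way of example only, the following non-limiting sketch illustrates steps (601) and (602): reading a hypothetical instrument data record (e.g., from an EEPROM-like storage device on the instrument) and looking up the instrument type, use case, and structural characteristics. The field names and catalog entries are illustrative assumptions, not a defined data format.

```python
# A minimal sketch of steps (601)/(602): deriving instrument type, use case,
# and structural characteristics from a raw instrument data record.
from dataclasses import dataclass

@dataclass
class InstrumentInfo:
    instrument_type: str
    use_cases: tuple
    length_mm: float
    diameter_mm: float
    max_bend_deg: float

# Hypothetical catalog keyed by a model identifier read from the instrument.
CATALOG = {
    "DIL-18": InstrumentInfo("dilation catheter", ("sinus ostium dilation",), 180.0, 2.0, 0.0),
    "ABL-07": InstrumentInfo("ablation instrument", ("turbinate reduction",), 150.0, 1.8, 45.0),
}

def identify_instrument(raw_record: dict) -> InstrumentInfo:
    """Look up preoperative instrument information from a raw data record."""
    return CATALOG[raw_record["model_id"]]

print(identify_instrument({"model_id": "ABL-07", "serial": "A123"}))
```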
Once instruments (100, 410) have been inserted into the patient (P), processor (52) may obtain (603) position information from one or more position sensors to determine the real-time position of working element (101) or instrument (410) in the patient (P). As noted above, instrument (100) includes a navigation sensor assembly (110) that may be used to generate such position information. It should be understood that working element (101) or instrument (410) may include a similar kind of position sensor to generate signals indicating the real-time position of working element (101) or instrument (410) in the patient (P).
Based on the obtained (603) position information indicating the real-time position of working element (101) or instrument (410) within the patient (P), processor (52) may determine (604) the location of the instrument relative to instrument (100) (or relative to some other kind of endoscope or other image capturing instrument). As noted above, processor (52) may already “know” the location of instrument (100) within the patient (P) based on signals from navigation sensor assembly (110). Processor (52) may thus correlate the real-time position of working element (101) or instrument (410) within the patient (P) with the real-time position of instrument (100) within the patient (P).
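By way of illustration only, one way to correlate the two tracked positions is to express the working instrument's pose in the endoscope's coordinate frame. The sketch below assumes each tracked pose is available as a 4x4 homogeneous transform in the tracker (world) frame; the numeric values are hypothetical.

```python
# A sketch of step (604): the pose of the working instrument relative to the
# endoscope is the inverse of the endoscope pose composed with the instrument pose.
import numpy as np

def relative_pose(T_world_endoscope, T_world_instrument):
    """Pose of the instrument expressed in the endoscope's coordinate frame."""
    return np.linalg.inv(T_world_endoscope) @ T_world_instrument

# Example with hypothetical poses: endoscope at x=10 mm, instrument at x=25 mm.
T_endo = np.eye(4); T_endo[0, 3] = 10.0
T_inst = np.eye(4); T_inst[0, 3] = 25.0
print(relative_pose(T_endo, T_inst)[:3, 3])  # -> [15.  0.  0.]
```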
Once processor (52) has determined (602) the instrument information (e.g., type, configuration, use case, etc.) and determined (604) the location of the instrument relative to instrument (100) (or relative to some other kind of endoscope or other image capturing instrument), processor (52) may generate (605) an overlay. This overlay may include a graphical representation of working element (101) or instrument (410), textual information, numerical information, and/or other graphical information (e.g., arrows, etc.). As noted above, the overlay may be positioned on the endoscopic image (400), on one or more preoperative images (510, 520, 530, 540), and/or elsewhere on display screen (56). It should also be understood that more than one overlay may be generated (605). For instance, a graphical representation of working element (101) or instrument (410), and one or more other graphical representations (e.g., an arrow indicating a direction of insertion, etc.), may be overlaid on the endoscopic image (400) and/or on one or more preoperative images (510, 520, 530, 540) on display screen (56); while alphanumeric data (e.g., representing type of instrument, depth of insertion, etc.) may be overlaid on display screen (56) separately from images (400, 520, 530, 540). In addition, or in the alternative, alphanumeric data (e.g., representing type of instrument, depth of insertion, etc.) may be overlaid on the endoscopic image (400) and/or on one or more preoperative images (510, 520, 530, 540) on display screen (56).
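By way of example only, the following sketch shows one way an overlay of step (605) might be assembled: projecting the tracked instrument tip into the endoscopic image with a simple pinhole camera model and pairing the resulting marker with alphanumeric data. The camera intrinsics and the downstream renderer that would draw these primitives are assumptions for illustration.

```python
# A non-limiting sketch of step (605): project the tracked instrument tip into
# the endoscopic image and assemble overlay primitives (marker plus text).
import numpy as np

def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3-D point in camera coordinates (z forward) to pixel coords."""
    x, y, z = p_cam
    return np.array([fx * x / z + cx, fy * y / z + cy])

def build_overlay(tip_in_camera, instrument_type, depth_mm,
                  fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Return drawable overlay primitives for a display layer."""
    u, v = project_point(tip_in_camera, fx, fy, cx, cy)
    return [
        {"kind": "marker", "pixel": (float(u), float(v))},
        {"kind": "text", "pixel": (float(u) + 12, float(v)),
         "value": f"{instrument_type}, depth {depth_mm:.1f} mm"},
    ]

print(build_overlay(np.array([5.0, -2.0, 40.0]), "dilation catheter", 32.4))
```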
In some variations, the configuration of the instrument itself may change during use of the instrument. For instance, the instrument may have a steerable distal section such that the distal section may have a bend angle that varies during the medical procedure. Similarly, an instrument may have a rotatable shaft or other feature that rotates during the medical procedure. In scenarios where the configuration of the instrument itself changes during use of the instrument in a medical procedure, an overlay (734) may visually indicate those changes in real time. For instance, an overlay (734) may indicate a changing bend angle (or angle of rotation, etc.) in real time with an alphanumeric representation, a graphical representation, or some other visual indication.
Thus, as disclosed herein, in some implementations, an analysis system or method for a medical procedure at an operative site in a volume located inside a patient body may involve the following non-limiting steps. Firstly, a medical procedure type may be selected or determined (e.g., based on preoperative data or user input). Once the procedure has begun, the system may detect, using an image (e.g., an endoscope image such as shown in
In a further implementation, the system may determine a position of the detected instrument relative to the detected anatomical structure in the patient (e.g., via image analysis of the captured image of the operative site, preoperative data, navigational tracking data, etc.), so that the detected instrument, the endoscope, the detected tissue, and their positions relative to one another are determined. The position of the instrument relative to the endoscope and/or tissue may be used to generate one or more medical procedure parameters defining a medical procedure step. In some implementations, the medical procedure step may be a known step associated with a medical procedure plan. In an alternative implementation, the system may identify or determine the step based on a factor known to the system, such as, for example, the obtained (601) instrument data; the determined (602) instrument type, use case, and configuration; the determined (604) instrument location relative to the endoscope; etc.
Once a medical procedure step is identified, the system may, in some implementations, select (e.g., from a database) a conceptual medical procedure step that corresponds to the detected medical procedure step (e.g., the step identified based on information housed in the device on a storage platform, such as an EEPROM device). The system may then compare one or more factors that were identified or determined, as discussed herein, with one or more factors that define the selected conceptual medical procedure step. Based on this comparison, the system may determine what, if any, differences exist between these parameters. If any differences are detected, the system may take various actions, such as, for example, determining that the difference is negligible (thus proceeding with the medical procedure plan); providing a notification to the user of the differences; providing a notification to the user indicating the specific factors causing the difference; requesting user feedback; searching a database for historical medical procedure information that may more closely match the one or more factors; providing information associated with the quality of the detected medical procedure step; etc.
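By way of illustration only, the comparison between a detected medical procedure step and a conceptual medical procedure step might be carried out as in the following sketch, in which the factor names and tolerance values are illustrative assumptions; differences within tolerance are treated as negligible, while larger differences identify the specific factors to report to the user.

```python
# A sketch of comparing detected step factors against a conceptual step.
def compare_step(detected: dict, conceptual: dict, tolerances: dict):
    """Return (negligible, differing_factors) for a detected vs. conceptual step."""
    differing = {}
    for name, target in conceptual.items():
        observed = detected.get(name)
        if observed is None or abs(observed - target) > tolerances.get(name, 0.0):
            differing[name] = (observed, target)
    return (len(differing) == 0, differing)

detected = {"insertion_depth_mm": 31.0, "bend_angle_deg": 20.0}
conceptual = {"insertion_depth_mm": 30.0, "bend_angle_deg": 15.0}
ok, diffs = compare_step(detected, conceptual,
                         {"insertion_depth_mm": 2.0, "bend_angle_deg": 3.0})
print(ok, diffs)  # -> False {'bend_angle_deg': (20.0, 15.0)}
```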
Accordingly, in some implementations, the system may compare the factors of the detected medical procedure step with the factors of a previously performed step or conceptual step. It should be understood that the historical medical procedure information may be information that originated at the medical procedure navigation system as well as historical medical procedure information that originated at other medical procedure navigation systems. Stated differently, in some implementations, the system may access a shared or global database that contains medical procedures performed by other users and/or other medical procedure navigation systems.
Once the system determines the most accurate medical procedure step, it may then display (e.g., on a display screen or wearable device) an overlay on an image (e.g., the image captured by an endoscope) showing a current position of the detected instrument and indicating an optimal position of the instrument. In a further implementation, the system may also display a determined depth of the instrument (e.g., as shown in
IV. Example of Combining CT Image View with Endoscopic Image View
As discussed herein, providing supplemental information to a user (e.g., surgeon) during medical procedure navigation may be beneficial for training purposes and/or for improving patient outcomes. This may be especially true when performing procedures within a nasal cavity and/or when performing minimally invasive surgery because only small portions of the instrumentation being used may be visible (e.g., via an endoscope) during the medical procedure. However, further improvements may be possible, such as, for example, showing a high accuracy “virtual” or projected image even when the instrument is not in the vicinity (e.g., field of view) of the endoscope or is otherwise obscured by anatomical structures and/or fluid (e.g., blood). For instance, as shown in
It may therefore be desirable to provide a modified version of endoscopic image (400), to effectively indicate visually to the operator the position of distal portion (414) in relation to surrounding anatomical structures. To the extent that such positional information of distal portion (414) may be conveyed via a set of images (510, 520, 530, 540) such as those shown in user interface (500), or via a set of images such as those shown in user interface (700), it may be further desirable to visually convey to the operator the position of distal portion (414) in relation to surrounding anatomical structures within a single image. To that end, it may be desirable to effectively combine an endoscopic image (400) (depicting the portion of instrument (410) visible within the field of view of the endoscope (e.g., instrument (100))) with a preoperative image and a real-time position indicator in the preoperative image (to show the otherwise obscured distal portion (414) of instrument (410)). Such a single composite image may more effectively convey the real-time position of distal portion (414) than could be conveyed via a set of images such as those shown in user interfaces (500, 700) described above.
The following describes examples of how a single composite image may be generated to effectively convey the real-time position of a portion of an instrument that is obscured by blood, anatomical structures, etc., while simultaneously displaying the portion of the same instrument that is visible within the field of view of an endoscope. It should be understood that the composite images described below may also include one or more overlay features similar to the overlays described above in connection with
As noted above, an endoscopic view such as the one depicted in image (400) may be captured by an endoscope or other viewing instrument like instrument (100) of
In addition to an endoscope (e.g., instrument (100)) having a position sensor (e.g., navigation sensor assembly (110)), an instrument (e.g., working element (101) or instrument (410)) that is used with the endoscope may also have a position sensor. Such a position sensor may thus be used to determine the real-time position of the instrument in three-dimensional space. In some scenarios, the instrument has known characteristics such as length, width, height, geometry, angle, diameter, etc., that may be used in combination with data from a position sensor of the endoscope to extrapolate the real-time position of the instrument in three-dimensional space. For instance, if a position sensor of the endoscope is used to determine the real-time position of the endoscope in three-dimensional space, and one or more features of the instrument are used to determine (e.g., optically, electromagnetically, etc.) the position of the instrument relative to the endoscope, then the real-time position of the instrument in three-dimensional space may be determined. In addition, certain structural characteristics of the instrument may be predetermined and utilized in determining the real-time position of the instrument in three-dimensional space.
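By way of example only, the following sketch shows one way the real-time tip position of an instrument without its own position sensor might be extrapolated from the endoscope's tracked pose, an estimated instrument-to-endoscope pose (e.g., determined optically), and a known shaft length. The transforms and length value are hypothetical.

```python
# A sketch of the extrapolation described above: compose the endoscope's world
# pose with the instrument's relative pose, then advance along the instrument
# axis by its known length to locate the tip in world coordinates.
import numpy as np

def extrapolate_tip_world(T_world_endoscope, T_endoscope_instrument, shaft_length_mm):
    """Tip = instrument origin advanced along its own z-axis by the known length."""
    tip_in_instrument = np.array([0.0, 0.0, shaft_length_mm, 1.0])
    return (T_world_endoscope @ T_endoscope_instrument @ tip_in_instrument)[:3]

T_we = np.eye(4); T_we[2, 3] = 100.0          # endoscope 100 mm along z (hypothetical)
T_ei = np.eye(4); T_ei[0, 3] = 5.0            # instrument offset 5 mm laterally
print(extrapolate_tip_world(T_we, T_ei, 30.0))  # -> [  5.   0. 130.]
```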
Once the instrument data is obtained (801), processor (52) may determine or identify (802) various factors about the instrument use, such as, for example, the instrument type, one or more kinds of instrument use cases (e.g., the kind or kinds of procedures in which the working element (101) and/or instrument (410) is configured to be used), structural characteristics of working element (101) or instrument (410) (e.g., length, bend angle, diameter, etc.), and/or other preoperative information associated with working element (101) or instrument (410).
Once instruments (100, 410) have been inserted into the patient (P), processor (52) may obtain (803) position information from one or more position sensors to determine the real-time position of working element (101) or instrument (410) in the patient (P). As noted above, instrument (100) includes a navigation sensor assembly (110) that may be used to generate such position information. It should be understood that working element (101) or instrument (410) may include a similar kind of position sensor to generate signals indicating the real-time position of working element (101) or instrument (410) in the patient (P).
Based on the obtained (803) position information indicating the real-time position of working element (101) or instrument (410) within the patient (P), processor (52) may determine (804) the location of the instrument relative to instrument (100) (or relative to some other kind of endoscope or other image capturing instrument). As noted above, processor (52) may already “know” the location of instrument (100) within the patient (P) based on signals from navigation sensor assembly (110). Processor (52) may thus correlate the real-time position of working element (101) or instrument (410) within the patient (P) with the real-time position of instrument (100) within the patient (P).
Processor (52) may further identify (805) one or more preoperative images associated with the real-time position of the distal end of working element (101) or instrument (410), and with the real-time position of instrument (100) (which will correspond to the position of the field of view of camera (161)). In other words, with the real-time position of the patient (P) having been registered with processor (52) to thereby register the patient (P) with the preoperative images, and with the real-time positions of instrument (100) and either working element (101) or instrument (410) being tracked by processor (52), processor (52) may determine which preoperative images depict the anatomical position in patient (P) where instrument (100) and either working element (101) or instrument (410) are currently located. Moreover, as discussed herein, processor (52) may determine the location of each device relative to the other. Thus, by utilizing a high precision tracking system in combination with known instrumentation, processor (52) may determine the real-time locations of the instruments (e.g., instrument (100) and either working element (101) or instrument (410)) inside the body of the patient (P). Based on the location information, processor (52) can then correlate the location of the instruments, and thus the endoscopic view from camera (161), relative to one or more preoperative images.
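By way of illustration only, the following sketch shows one way step (805) might select a preoperative image once the patient has been registered: the tracked tip position is converted into image coordinates and the nearest axial slice is chosen. The registration transform, voxel spacing, and slice count are illustrative assumptions.

```python
# A sketch of step (805): pick the axial CT slice nearest the tracked tip.
import numpy as np

def select_axial_slice(p_tracker_mm, T_ct_from_tracker, voxel_size_mm, num_slices):
    """Return the axial slice index closest to the tracked tip position."""
    p = T_ct_from_tracker @ np.array([*p_tracker_mm, 1.0])
    k = int(round(p[2] / voxel_size_mm[2]))
    return min(max(k, 0), num_slices - 1)      # clamp to the available slices

print(select_axial_slice((4.0, 9.0, 36.2), np.eye(4), (0.5, 0.5, 1.0), 200))  # -> 36
```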
Once the appropriate preoperative images are identified, processor (52) may determine (806) if a portion of working element (101) or instrument (410) is likely to be obscured by an anatomical structure of the patient (P). This determination (806) may be achieved in a variety of ways. For example, in some versions, processor (52) and/or the operator may rely on the endoscopic view from camera (161) to determine what, if any, portion of working element (101) or instrument (410) is obscured and/or outside the endoscopic view from camera (161). Using the high precision tracking, combined with the known factors, processor (52) may construct a projected and/or superimposed view of the location of the obscured or otherwise unseen portion of working element (101) or instrument (410) with reference to a preoperative image (e.g., a CT image).
Once processor (52) has determined (802) the instrument information (e.g., type, configuration, use case, etc.) and determined (804) the location of the instrument relative to the endoscope, a graphical representation of the instrument (e.g., its location, size, direction, shape, etc.) may be overlaid (805) with the preoperative image data onto the endoscopic view. In other words, in a region of the endoscopic view where a portion of the inserted instrument is obscured (e.g., by an anatomical structure or by blood, etc.), processor (52) may overlay (805) a corresponding portion of a preoperative image, with a graphical representation of the obscured portion of the inserted instrument in the preoperative image. Processor (52) may thus provide a composite image formed by a combination of the endoscopic view, a portion of a preoperative image, and a graphical representation of an obscured portion of an inserted instrument (e.g., working element (101) or instrument (410)). The portion of the inserted instrument that is not obscured may still be shown in the corresponding portion of the endoscopic view in the composite image. By providing these views within a single composite image, display screen (56) may provide less visual clutter to the operator than might otherwise be provided through the presentation of several separate images simultaneously.
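By way of example only, the compositing described above might be sketched as follows: within a mask marking where the instrument is obscured, the endoscopic pixels are replaced with the co-registered preoperative image region, and a virtual representation of the hidden instrument portion is then drawn at its projected pixel locations. The array layout and marker color are assumptions for illustration, not an actual rendering pipeline.

```python
# A minimal compositing sketch: endoscopic view + preoperative patch + virtual
# projection of the hidden instrument portion, combined in one image.
import numpy as np

def composite(endoscopic, preop_patch, obscured_mask, instrument_pixels):
    """All images are HxWx3 float arrays in [0, 1]; mask is HxW bool."""
    out = endoscopic.copy()
    out[obscured_mask] = preop_patch[obscured_mask]   # show preoperative data
    for (r, c) in instrument_pixels:                  # virtual projection
        out[r, c] = (0.0, 1.0, 0.0)                   # e.g., green marker
    return out

h, w = 4, 4
endo = np.zeros((h, w, 3)); preop = np.ones((h, w, 3))
mask = np.zeros((h, w), bool); mask[1:3, 1:3] = True
print(composite(endo, preop, mask, [(2, 2)])[2, 2])   # -> [0. 1. 0.]
```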
In some variations, the operator may manually designate a region (e.g., via operating controls (54)) within an endoscopic view where processor (52) should overlay a corresponding portion of a preoperative image. Such manual designations may be provided even in scenarios where a portion of an inserted instrument is not being obscured. For instance, an initial endoscopic image may provide a view of a middle turbinate (MT) and surrounding anatomical structures in a nasal cavity, without any preoperative image overlaid on the endoscopic image. The user may select a region within the endoscopic image, such as a portion of the middle turbinate (MT); and processor (52) may then overlay a corresponding preoperative image of that region of the patient's anatomy onto that portion of the endoscopic image.
In some cases (e.g., where the corresponding preoperative image is at a plane that is deeper into the patient anatomy along the line of sight), the overlaid preoperative image may allow the operator to effectively “see through” the middle turbinate (MT) or other anatomical structure. This ability to effectively “see through” anatomical structures in a composite image formed by a combination of a real-time endoscopic view and a preoperative image may be provided regardless of whether the composite image is formed via the method (800) of FIG. and/or via an operator's manual selection of a region to “see through.” This “see through” capability may also be provided in scenarios where instrumentation or other anatomical structures are obscured in an endoscopic view by blood, other fluids, debris, etc. Similarly, this “see through” capability may also be provided in scenarios where instrumentation or other anatomical structures are otherwise outside the field of view of the endoscope (e.g., instrument (100) or another instrument with a camera).
A distal portion of instrument (910) would be obscured by an anatomical structure in endoscopic view (902), similar to how distal portion (414) is obscured by the middle turbinate (MT) in
Since processor (52) is also configured to track the real-time position of instrument (910) within the patient (P), processor (52) is operable to generate virtual projection (914) to visually indicate where the otherwise obscured distal portion of instrument (910) is located. In this example, virtual projection (914) is provided only within semi-transparent overlay (904) formed by the corresponding preoperative image region. In some versions where semi-transparent overlay (904) does not span across the entirety of the obscured portion of instrument (910), virtual projection (914) may overlay a portion of endoscopic view (902) where semi-transparent overlay (904) does not extend. In other words, virtual projection (914) may still be used to visually indicate where an obscured portion of instrument (910) is located within endoscopic view (902), even if semi-transparent overlay (904) does not extend into that region of endoscopic view (902). Moreover, processor (52) may be configured to enable the operator to select (e.g., via operating controls (54)) the extent to which semi-transparent overlay (904) should be provided over endoscopic view (902). In some cases, the operator may wish to omit semi-transparent overlay (904) from all or part of a region of endoscopic view (902) in which a distal portion of instrument (910) is obscured, such that the operator may simply rely on virtual projection (914) in endoscopic view (902) to determine where some or all of the obscured portion of instrument (910) is located in relation to anatomical structures.
A distal portion of instrument (1010) would be obscured by an anatomical structure in endoscopic view (1002), similar to how distal portion (414) is obscured by the middle turbinate (MT) in
Since processor (52) is also configured to track the real-time position of instrument (1010) within the patient (P), processor (52) is operable to generate virtual projection (1014) to visually indicate where the otherwise obscured distal portion of instrument (1010) is located. In this example, virtual projection (1014) is provided only within opaque overlay (1004) formed by the corresponding preoperative image region. In some versions where opaque overlay (1004) does not span across the entirety of the obscured portion of instrument (1010), virtual projection (1014) may overlay a portion of endoscopic view (1002) where opaque overlay (1004) does not extend. In other words, virtual projection (1014) may still be used to visually indicate where an obscured portion of instrument (1010) is located within endoscopic view (1002), even if opaque overlay (1004) does not extend into that region of endoscopic view (1002). Moreover, processor (52) may be configured to enable the operator to select (e.g., via operating controls (54)) the extent to which opaque overlay (1004) should be provided over endoscopic view (1002). In some cases, the operator may wish to omit opaque overlay (1004) from all or part of a region of endoscopic view (1002) in which a distal portion of instrument (1010) is obscured, such that the operator may simply rely on virtual projection (1014) in endoscopic view (1002) to determine where some or all of the obscured portion of instrument (1010) is located in relation to anatomical structures.
As described above, a composite image (900) provides a semi-transparent overlay (904) of a preoperative image region while composite image (1000) provides an opaque overlay (1004) of a preoperative image region. Some variations may allow the operator to select a degree of transparency for a preoperative image overlay (904, 1004) on an endoscopic view (902, 1002). For instance, some versions of processor (52) may allow the operator to select a degree of transparency via operating controls (54), such as through manipulation of a slider, knob, or other control feature, in a graphical user interface or otherwise.
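By way of illustration only, the operator-selected transparency might map a slider value to a blending weight applied within the overlay region, as in the following non-limiting sketch; the control range and blend formula are assumptions.

```python
# A sketch of adjustable transparency for the preoperative image overlay:
# slider value 0 leaves the endoscopic view unchanged, 100 shows the
# preoperative image fully opaque inside the overlay region.
import numpy as np

def blend_overlay(endoscopic, preop_patch, overlay_mask, slider_value):
    """Blend the preoperative patch over the endoscopic view within the mask."""
    alpha = np.clip(slider_value / 100.0, 0.0, 1.0)
    out = endoscopic.astype(float).copy()
    out[overlay_mask] = (alpha * preop_patch[overlay_mask]
                         + (1.0 - alpha) * endoscopic[overlay_mask])
    return out

endo = np.zeros((2, 2, 3)); preop = np.ones((2, 2, 3))
mask = np.ones((2, 2), bool)
print(blend_overlay(endo, preop, mask, 25)[0, 0])  # -> [0.25 0.25 0.25]
```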
It should also be understood that, as the operator moves instrument (910, 1010) within the patient, the position of overlay (904, 1004) and virtual projection (914, 1014) may correspondingly move in real time to facilitate visual tracking of the movement of instrument (910, 1010) relative to adjacent anatomical structures.
As yet another variation, a composite image (900, 1000) may include one or more additional overlays, such as overlay (734) described above, in addition to including overlay (904, 1004) and virtual projection (914, 1014). In scenarios where more than one instrument (910, 1010) is disposed in the patient simultaneously, and processor (52) is configured to track the real-time position of the two or more instruments (910, 1010) in the patient, each such instrument may have its own corresponding overlay (904, 1004) and virtual projection (914, 1014) within the same single composite image (900, 1000).
The following examples relate to various non-exhaustive ways in which the teachings herein may be combined or applied. It should be understood that the following examples are not intended to restrict the coverage of any claims that may be presented at any time in this application or in subsequent filings of this application. No disclaimer is intended. The following examples are being provided for nothing more than merely illustrative purposes. It is contemplated that the various teachings herein may be arranged and applied in numerous other ways. It is also contemplated that some variations may omit certain features referred to in the below examples. Therefore, none of the aspects or features referred to below should be deemed critical unless otherwise explicitly indicated as such at a later date by the inventors or by a successor in interest to the inventors. If any claims are presented in this application or in subsequent filings related to this application that include additional features beyond those referred to below, those additional features shall not be presumed to have been added for any reason relating to patentability.
A medical procedure navigation system, the system comprising processing circuitry configured to: (a) obtain, from an electronic storage device, data associated with a medical procedure instrument, the electronic storage device being housed within the medical procedure instrument; (b) obtain at least one preoperative patient image; (c) track, using one or more tracking devices, a real-time location of the medical procedure instrument and an imaging instrument; (d) obtain, from the imaging instrument, at least one patient image; (e) determine, based on the data associated with the medical procedure instrument, at least one feature of the medical procedure instrument; (f) identify, based on the real-time location of the medical procedure instrument and the imaging instrument and the at least one feature, a localized view of the at least one preoperative patient image; and (g) superimpose and display, on a display device, a localized view of the at least one preoperative patient image over the at least one patient image.
The medical procedure navigation system of Example 1, the electronic storage device comprising an EEPROM device.
The medical procedure navigation system of any of Examples 1 through 2, the processing circuitry being further configured to: (a) determine an insertion depth of the medical procedure instrument within patient anatomy; and (b) display, on the display device, a visual representation of the determined insertion depth.
The medical procedure navigation system of any of Examples 1 through 3, the processing circuitry being further configured to: (a) determine a bend angle of the instrument; and (b) display, on the display device, a visual representation of the determined bend angle.
The medical procedure navigation system of any of Examples 1 through 4, the processing circuitry being further configured to: (a) receive, via user input, at least one user criteria; and (b) display, on the display device, a visual representation of the at least one user criteria.
The medical procedure navigation system of any of Examples 1 through 5, the processing circuitry being further configured to display, on the display device, a portion of the medical procedure instrument, the portion of the medical procedure instrument being within a field of view of the imaging instrument.
The medical procedure navigation system of any of Examples 1 through 6, the processing circuitry being further configured to: (a) determine a portion of the medical procedure instrument that is hidden from a view of the imaging instrument; and (b) generate and display a virtual instrument projection associated with the portion of the medical procedure instrument that is hidden from the view of the imaging instrument.
The medical procedure navigation system of any of Examples 1 through 7, the one or more tracking devices comprising an optical tracking device.
The medical procedure navigation system of any of Examples 1 through 8, the one or more tracking devices comprising an electromagnetic tracking device.
The medical procedure navigation system of any of Examples 1 through 9, the medical procedure instrument and the imaging instrument being housed together in a single instrument.
A system comprising: (a) a medical procedure instrument having a first position sensor assembly; (b) an imaging instrument having a second position sensor assembly; and (c) a processor, the processor being configured to: (i) determine a real-time position of the medical procedure instrument based on a signal from the first position sensor assembly, (ii) determine a real-time position of the imaging instrument based on a signal from the second position sensor assembly, (iii) display a real-time image from the imaging instrument, the real-time image providing a field of view, (iv) determine that a portion of the medical procedure instrument is obscured within the field of view, (v) select a preoperative image region corresponding to the obscured portion of the medical procedure instrument, and (vi) overlay the selected preoperative image region onto the real-time image from the imaging instrument, at a location corresponding to a region of the field of view in which the portion of the medical instrument is obscured.
The system of Example 11, the medical procedure instrument being selected from the group consisting of a dilation catheter, an ablation instrument, a curette, a microdebrider or other shaver, a probe, an irrigation instrument, and a suction instrument.
The system of any of Examples 11 through 12, the imaging instrument comprising an endoscope.
The system of any of Examples 11 through 13, the imaging instrument having a working channel, the medical procedure instrument being disposed within the working channel.
The system of any of Examples 11 through 14, the processor being further configured to render a virtual representation of the obscured portion of the medical procedure instrument at a location corresponding to a region of the field of view in which the portion of the medical instrument is obscured.
The system of Example 15, the processor being further configured to render the virtual representation within the overlaid preoperative image region.
The system of any of Examples 11 through 16, the processor being further configured to: (i) receive a user selection indicating a transparency level, and (ii) apply a transparency level to the overlay based on the user selection.
The system of any of Examples 11 through 17, the processor being further configured to provide a secondary overlay on the real-time image from the imaging instrument, the secondary overlay representing a preoperative characteristic of the medical procedure instrument.
The system of Example 18, the preoperative characteristic being selected from the group consisting of length, bend angle, and diameter.
The system of any of Examples 11 through 19, the processor being further configured to provide a secondary overlay on the real-time image from the imaging instrument, the secondary overlay representing a real-time operational characteristic of the medical procedure instrument.
The system of Example 20, the real-time operational characteristic being selected from the group consisting of a depth of insertion, an activation status, and a bend angle.
A medical procedure navigation system, the system comprising processing circuitry configured to: (a) obtain, from an electronic storage device, preoperative data associated with a medical procedure instrument; (b) obtain at least one preoperative patient image; (c) track, using one or more tracking devices, a real-time location of the medical procedure instrument and an imaging instrument; (d) obtain, from the imaging instrument, at least one real-time patient image; (e) determine, based on the preoperative data associated with the medical procedure instrument, at least one feature of the medical procedure instrument; (f) identify, based on the location of the medical procedure instrument, the location of the imaging instrument, and the at least one feature of the medical procedure instrument, a localized view of the at least one preoperative patient image; (g) determine, based on the localized view, whether a distal portion of the instrument will be obscured by an anatomical structure of a patient; and (h) responsive to determining that the distal portion of the instrument will be obscured by the anatomical structure of the patient, superimpose and display, on a display device, the localized view of the at least one preoperative patient image over the at least one real-time patient image.
The medical procedure navigation system of Example 22, the electronic storage device comprising an EEPROM device.
The medical procedure navigation system of any of Examples 22 through 23, the processing circuitry being further configured to: (a) determine, based on the location of the medical procedure instrument, an insertion depth associated with patient anatomy; and (b) display, on the display device, a visual representation of the determined insertion depth.
The medical procedure navigation system of any of Examples 22 through 24, the processing circuitry being further configured to: (a) determine a real-time bend angle of the instrument; and (b) display, on the display device, a visual representation of the determined bend angle.
The medical procedure navigation system of any of Examples 22 through 25, the processing circuitry being further configured to: (a) receive, via user input, at least one user criteria; and (b) display, on the display device, a visual representation of the at least one user criteria.
The medical procedure navigation system of any of Examples 22 through 26, the processing circuitry being further configured to display, on the display device, a portion of the medical procedure instrument, the portion of the medical procedure instrument being visible to the imaging instrument.
The medical procedure navigation system of any of Examples 22 through 27, the processing circuitry being further configured to: (a) determine a portion of the medical procedure instrument being hidden from the view of the imaging instrument; and (b) generate and display, based on the data associated with a medical procedure instrument, a virtual instrument projection associated with the portion of the medical procedure instrument that is hidden from the view of the imaging instrument.
The medical procedure navigation system of any of Examples 22 through 28, the one or more tracking devices comprising an optical tracking device.
The medical procedure navigation system of any of Examples 22 through 29, the one or more tracking devices comprising an electromagnetic tracking device.
The medical procedure navigation system of any of Examples 22 through 30, the medical procedure instrument and the imaging instrument being housed in the same instrument.
A method comprising: (a) obtaining, from an electronic storage device, data associated with a medical procedure instrument, the electronic storage device being housed within the medical procedure instrument; (b) obtaining at least one preoperative patient image; (c) tracking, using one or more tracking devices, a real-time location of the medical procedure instrument and an imaging instrument; (d) obtaining, from the imaging instrument, at least one patient image; (e) determining, based on the data associated with the medical procedure instrument, at least one feature of the medical procedure instrument; (f) identifying, based on the real-time location of the medical procedure instrument and the imaging instrument and the at least one feature, a localized view of the at least one preoperative patient image; and (g) superimposing and displaying, on a display device, a localized view of the at least one preoperative patient image over the at least one patient image.
The method of Example 32, further comprising: (a) determining an insertion depth of the medical procedure instrument within patient anatomy; and (b) displaying, on the display device, a visual representation of the determined insertion depth.
The method of any of Examples 32 through 33, further comprising: (a) determining a bend angle of the instrument; and (b) displaying, on the display device, a visual representation of the determined bend angle.
The method of any of Examples 32 through 34, further comprising: (a) receive, via user input, at least one user criteria; and (b) display, on the display device, a visual representation of the at least one user criteria.
The method of any of Examples 32 through 35, further comprising displaying, on the display device, a portion of the medical procedure instrument, the portion of the medical procedure instrument being within a field of view of the imaging instrument.
The method of any of Examples 32 through 36, further comprising: (a) determining a portion of the medical procedure instrument that is hidden from a view of the imaging instrument; and (b) generating and displaying a virtual instrument projection associated with the portion of the medical procedure instrument that is hidden from the view of the imaging instrument.
A method comprising: (a) determining a real-time position of a medical procedure instrument based on a signal from a first position sensor assembly; (b) determining a real-time position of an imaging instrument based on a signal from a second position sensor assembly; (c) displaying a real-time image from the imaging instrument, the real-time image providing a field of view; (d) determining that a portion of the medical procedure instrument is obscured within the field of view; (e) selecting a preoperative image region corresponding to the obscured portion of the medical procedure instrument; and (f) overlaying the selected preoperative image region onto the real-time image from the imaging instrument, at a location corresponding to a region of the field of view in which the portion of the medical instrument is obscured.
The method of Example 38, further comprising rendering a virtual representation of the obscured portion of the medical procedure instrument at a location corresponding to a region of the field of view in which the portion of the medical instrument is obscured.
The method of Example 39, the virtual representation being rendered within the overlaid preoperative image region.
The method of any of Examples 38 through 40, further comprising: (a) receiving a user selection indicating a transparency level; and (b) applying a transparency level to the overlay based on the user selection.
The method of any of Examples 38 through 41, further comprising providing a secondary overlay on the real-time image from the imaging instrument, the secondary overlay representing a preoperative characteristic of the medical procedure instrument.
The method of any of Examples 38 through 42, further comprising providing a secondary overlay on the real-time image from the imaging instrument, the secondary overlay representing a real-time operational characteristic of the medical procedure instrument.
A method comprising: (a) obtaining, from an electronic storage device, preoperative data associated with a medical procedure instrument; (b) obtaining at least one preoperative patient image; (c) tracking, using one or more tracking devices, a real-time location of the medical procedure instrument and an imaging instrument; (d) obtaining, from the imaging instrument, at least one real-time patient image; (e) determining, based on the preoperative data associated with the medical procedure instrument, at least one feature of the medical procedure instrument; (f) identifying, based on the location of the medical procedure instrument, the location of the imaging instrument, and the at least one feature of the medical procedure instrument, a localized view of the at least one preoperative patient image; (g) determining, based on the localized view, whether a distal portion of the instrument will be obscured by an anatomical structure of a patient; and (h) responsive to determining that the distal portion of the instrument will be obscured by the anatomical structure of the patient, superimposing and displaying, on a display device, the localized view of the at least one preoperative patient image over the at least one real-time patient image.
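By way of example only, step (g) of Example 44 might be approximated by marching along the line of sight from the imaging instrument to the instrument tip through the registered preoperative volume and testing whether any intervening voxel resembles tissue or bone, as in the following non-limiting sketch. The density threshold, sampling scheme, and geometry are assumptions made purely for illustration.

    # Sketch of the conditional display of Example 44(g)-(h).
    import numpy as np

    def tip_will_be_obscured(preop_volume, cam_pos, tip_pos, voxel_size_mm,
                             density_threshold=300.0, n_samples=100):
        """Return True if any sampled voxel between the camera and the tip exceeds
        the density threshold, i.e. an anatomical structure blocks the line of sight."""
        cam = np.asarray(cam_pos, float)
        tip = np.asarray(tip_pos, float)
        for t in np.linspace(0.05, 0.95, n_samples):   # skip the endpoints themselves
            i, j, k = ((cam + t * (tip - cam)) / voxel_size_mm).astype(int)
            if preop_volume[k, j, i] > density_threshold:
                return True
        return False

    volume = np.zeros((64, 256, 256))
    volume[30, 100:110, 100:110] = 1000.0              # a bony structure in the path
    obscured = tip_will_be_obscured(volume, cam_pos=[90, 90, 25],
                                    tip_pos=[115, 115, 35], voxel_size_mm=1.0)
    if obscured:
        pass  # superimpose the localized preoperative view (see earlier sketches)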
The method of Example 44, further comprising: (a) determining, based on the location of the medical procedure instrument, an insertion depth associated with patient anatomy; and (b) displaying, on the display device, a visual representation of the determined insertion depth.
The method of any of Examples 44 through 45, further comprising: (a) determining a real-time bend angle of the instrument; and (b) displaying, on the display device, a visual representation of the determined bend angle.
The method of any of Examples 44 through 46, further comprising: (a) receiving, via user input, at least one user criterion; and (b) displaying, on the display device, a visual representation of the at least one user criterion.
The method of any of Examples 44 through 47, further comprising displaying, on the display device, a portion of the medical procedure instrument, the portion of the medical procedure instrument being visible to the imaging instrument.
The method of any of Examples 44 through 48, further comprising: (a) determining a portion of the medical procedure instrument that is hidden from the view of the imaging instrument; and (b) generating and displaying, based on the preoperative data associated with the medical procedure instrument, a virtual instrument projection associated with the portion of the medical procedure instrument that is hidden from the view of the imaging instrument.
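By way of example only, the virtual instrument projection of Example 49 might use geometry read from the instrument's onboard storage (here, simply a known straight distal length) together with the tracked sensor pose to synthesize points along the hidden segment, which could then be projected and drawn as in the earlier sketch. The stored length and the sensor pose are illustrative assumptions.

    # Sketch of Example 49: reconstruct the hidden distal segment from stored geometry.
    import numpy as np

    def hidden_segment_points(sensor_pos, sensor_dir, distal_length_mm, n=25):
        """Sample points from the tracked sensor to the instrument tip, assuming a
        straight distal segment of the length read from the instrument's storage."""
        p = np.asarray(sensor_pos, float)
        d = np.asarray(sensor_dir, float)
        d = d / np.linalg.norm(d)
        return np.array([p + t * d for t in np.linspace(0.0, distal_length_mm, n)])

    stored_length_mm = 12.0                             # assumed value read from the instrument
    pts = hidden_segment_points([0, 0, 40], [0, 0.2, 0.98], stored_length_mm)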
It should be understood that any one or more of the teachings, expressions, embodiments, implementations, examples, etc. described herein may be combined with any one or more of the other teachings, expressions, embodiments, implementations, examples, etc. that are described herein. The above-described teachings, expressions, embodiments, examples, etc. should therefore not be viewed in isolation relative to each other. Various suitable ways in which the teachings herein may be combined will be readily apparent to those skilled in the art in view of the teachings herein. Such modifications and variations are intended to be included within the scope of the claims.
It should be appreciated that any patent, publication, or other disclosure material, in whole or in part, that is said to be incorporated by reference herein is incorporated herein only to the extent that the incorporated material does not conflict with existing definitions, statements, or other disclosure material set forth in this disclosure. As such, and to the extent necessary, the disclosure as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is said to be incorporated by reference herein, but which conflicts with existing definitions, statements, or other disclosure material set forth herein will only be incorporated to the extent that no conflict arises between that incorporated material and the existing disclosure material.
Versions of the devices described above may be designed to be disposed of after a single use, or they can be designed to be used multiple times. Versions may, in either or both cases, be reconditioned for reuse after at least one use. Reconditioning may include any combination of the steps of disassembly of the device, followed by cleaning or replacement of particular pieces, and subsequent reassembly. In particular, some versions of the device may be disassembled, and any number of the particular pieces or parts of the device may be selectively replaced or removed in any combination. Upon cleaning and/or replacement of particular parts, some versions of the device may be reassembled for subsequent use either at a reconditioning facility or by a user immediately prior to a procedure. Those skilled in the art will appreciate that reconditioning of a device may utilize a variety of techniques for disassembly, cleaning/replacement, and reassembly. Use of such techniques, and the resulting reconditioned device, are all within the scope of the present application.
By way of example only, versions described herein may be sterilized before and/or after a procedure. In one sterilization technique, the device is placed in a closed and sealed container, such as a plastic or TYVEK bag. The container and device may then be placed in a field of radiation that can penetrate the container, such as gamma radiation, x-rays, or high-energy electrons. The radiation may kill bacteria on the device and in the container. The sterilized device may then be stored in the sterile container for later use. A device may also be sterilized using any other technique known in the art, including but not limited to beta or gamma radiation, ethylene oxide, or steam.
Some versions of the examples described herein may be implemented using a processor, which may be part of a computer system and communicate with a number of peripheral devices via a bus subsystem. Versions of the examples described herein that are implemented using a computer system may be implemented using a general-purpose computer that is programmed to perform the methods described herein. Alternatively, versions of the examples described herein that are implemented using a computer system may be implemented using a specific-purpose computer that is constructed with hardware arranged to perform the methods described herein. Versions of the examples described herein may also be implemented using a combination of at least one general-purpose computer and at least one specific-purpose computer.
In versions implemented using a computer system, each processor may include a central processing unit (CPU) of a computer system, a microprocessor, an application-specific integrated circuit (ASIC), other kinds of hardware components, and combinations thereof. A computer system may include more than one type of processor. The peripheral devices of a computer system may include a storage subsystem including, for example, memory devices and a file storage subsystem, user interface input devices, user interface output devices, and a network interface subsystem. The input and output devices may allow user interaction with the computer system. The network interface subsystem may provide an interface to outside networks, including an interface to corresponding interface devices in other computer systems. User interface input devices may include a keyboard; pointing devices such as a mouse, trackball, touchpad, or graphics tablet; a scanner; a touch screen incorporated into the display; audio input devices such as voice recognition systems and microphones; and other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and ways to input information into a computer system.
In versions implemented using a computer system, a storage subsystem may store programming and data constructs that provide the functionality of some or all of the modules and methods described herein. These software modules may be generally executed by the processor of the computer system alone or in combination with other processors. Memory used in the storage subsystem may include a number of memories including a main random-access memory (RAM) for storage of instructions and data during program execution and a read-only memory (ROM) in which fixed instructions are stored. A file storage subsystem may provide persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges. The modules implementing the functionality of certain implementations may be stored by the file storage subsystem in the storage subsystem, or in other machines accessible by the processor.
In versions implemented using a computer system, the computer system itself may be of varying types including a personal computer, a portable computer, a workstation, a computer terminal, a network computer, a television, a mainframe, a server farm, a widely-distributed set of loosely networked computers, or any other data processing system or user device. Due to the ever-changing nature of computers and networks, the example of the computer system described herein is intended only as a specific example for purposes of illustrating the technology disclosed. Many other configurations of a computer system are possible having more or fewer components than the computer system described herein.
Having shown and described various embodiments of the present invention, further adaptations of the methods and systems described herein may be accomplished by appropriate modifications by one of ordinary skill in the art without departing from the scope of the present invention. Several of such potential modifications have been mentioned, and others will be apparent to those skilled in the art. For instance, the examples, embodiments, geometries, materials, dimensions, ratios, steps, and the like discussed above are illustrative and are not required. Accordingly, the scope of the present invention should be considered in terms of the following claims and is understood not to be limited to the details of structure and operation shown and described in the specification and drawings.
This application claims priority to U.S. Provisional Pat. App. No. 63/460,050, entitled “Apparatus and Method to Overlay Information on Endoscopic Images,” filed Apr. 18, 2023, the disclosure of which is incorporated by reference herein, in its entirety.