1. Technical Field
The present disclosure relates to the use of surgical devices to navigate through a body during a surgical procedure. More specifically, the present disclosure is directed to the use of a fiducial pattern and image capture device to determine a spatial relationship between the patient and the surgical devices.
2. Background of the Related Art
Electrosurgical devices have become widely used. Electrosurgery involves the application of thermal and/or electrical energy to cut, dissect, ablate, coagulate, cauterize, seal or otherwise treat biological tissue during a surgical procedure. Electrosurgery is typically performed using a handpiece including a surgical device (e.g., end effector or ablation probe) that is adapted to transmit energy to a tissue site during electrosurgical procedures, a remote electrosurgical generator operable to output energy, and a cable assembly operatively connecting the surgical device to the remote generator.
Treatment of certain diseases requires the destruction of malignant tissue growths, e.g., tumors. In the treatment of diseases such as cancer, certain types of tumor cells have been found to denature at elevated temperatures that are slightly lower than temperatures normally injurious to healthy cells. Known treatment methods, such as hyperthermia therapy, typically involve heating diseased cells to temperatures above 41° C. while maintaining adjacent healthy cells below the temperature at which irreversible cell destruction occurs. These methods may involve applying electromagnetic radiation to heat, ablate and/or coagulate tissue. There are a number of different types of electrosurgical apparatus that can be used to perform ablation procedures.
Minimally invasive tumor ablation procedures for cancerous or benign tumors may be performed using two dimensional (2D) preoperative computed tomography (CT) images and an “ablation zone chart” which typically describes the characteristics of an ablation needle in an experimental, ex vivo tissue across a range of input parameters (power, time). Energy dose (power, time) can be correlated to ablation tissue effect (volume, shape) for a specific design. It is possible to control the energy dose delivered to tissue through microwave antenna design, for example, an antenna choke may be employed to provide a known location of microwave transfer from device into tissue. In another example, dielectric buffering enables a relatively constant delivery of energy from the device into the tissue independent of differing or varying tissue properties.
After a user determines which ablation needle should be used to effect treatment of a target, the user performs the treatment with ultrasound guidance. Typically, a high level of skill is required to place a surgical device into a target identified under ultrasound. Of primary importance is the ability to choose the angle and entry point required to direct the device toward the ultrasound image plane (e.g., where the target is being imaged).
Ultrasound-guided intervention involves the use of real-time ultrasound imaging (transabdominal, intraoperative, etc.) to accurately direct surgical devices to their intended target. This can be performed by percutaneous application and/or intraoperative application. In each case, the ultrasound system will include a transducer that images patient tissue and is used to identify the target and to anticipate and/or follow the path of an instrument toward the target.
Ultrasound-guided interventions are commonly used today for needle biopsy procedures to determine malignancy of suspicious lesions that have been detected (breast, liver, kidney, and other soft tissues). Additionally, central-line placements are common to gain jugular access and allow medications to be delivered. Finally, emerging uses include tumor ablation and surgical resection of organs (liver, lung, kidney, and so forth). In the case of tumor ablation, after ultrasound-guided targeting is achieved, a biopsy-like needle may be employed to deliver energy (RF, microwave, cryo, and so forth) with the intent to kill the tumor. In the case of an organ resection, intimate knowledge of subsurface anatomy during dissection, and display of a surgical device in relation to this anatomy, is key to achieving a successful surgical margin while avoiding critical structures.
In each of these cases, the ultrasound guidance typically offers a two-dimensional image plane that is captured from the distal end of a patient-applied transducer. Of critical importance to the user for successful device placement is the ability to visualize and characterize the target, to choose the instrument angle and entry point to reach the target, and to see the surgical device and its motion toward the target. Today, the user images the target and uses a high level of skill to select the instrument angle and entry point. The user must then either move the ultrasound transducer to see the instrument path (thus losing sight of the target) or assume the path is correct until the device enters the image plane.
This description may use the phrases “in an embodiment,” “in embodiments,” “in some embodiments,” or “in other embodiments,” which may each refer to one or more of the same or different embodiments in accordance with the present disclosure. For the purposes of this description, a phrase in the form “A/B” means A or B. For the purposes of the description, a phrase in the form “A and/or B” means “(A), (B), or (A and B)”. For the purposes of this description, a phrase in the form “at least one of A, B, or C” means “(A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C)”.
As shown in the drawings and described throughout the following description, as is traditional when referring to relative positioning on a surgical device, the term “proximal” refers to the end of the apparatus that is closer to the user or generator, while the term “distal” refers to the end of the apparatus that is farther away from the user or generator. The term “user” refers to any medical professional (i.e., doctor, nurse, or the like) performing a medical procedure involving the use of aspects of the present disclosure described herein.
As used in this description, the term “surgical device” generally refers to a surgical tool that imparts electrosurgical energy to treat tissue. Surgical devices may include, but are not limited to, needles, probes, catheters, endoscopic instruments, laparoscopic instruments, vessel sealing devices, surgical staplers, etc. The term “electrosurgical energy” generally refers to any form of electromagnetic, optical, or acoustic energy.
Electromagnetic (EM) energy is generally classified by increasing frequency or decreasing wavelength into radio waves, microwaves, infrared, visible light, ultraviolet, X-rays and gamma-rays. As used herein, the term “microwave” generally refers to electromagnetic waves in the frequency range of 300 megahertz (MHz) (3×10⁸ cycles/second) to 300 gigahertz (GHz) (3×10¹¹ cycles/second). As used herein, the term “RF” generally refers to electromagnetic waves having a lower frequency than microwaves. As used herein, the term “ultrasound” generally refers to cyclic sound pressure with a frequency greater than the upper limit of human hearing.
As used in this description, the term “ablation procedure” generally refers to any ablation procedure, such as microwave ablation, radio frequency (RF) ablation or microwave ablation-assisted resection. As it is used in this description, “energy applicator” generally refers to any device that can be used to transfer energy from a power generating source, such as a microwave or RF electrosurgical generator, to tissue.
As they are used in this description, the terms “power source” and “power supply” refer to any source (e.g., battery) of electrical power in a form that is suitable for operating electronic circuits. As it is used in this description, “transmission line” generally refers to any transmission medium that can be used for the propagation of signals from one point to another. As used in this description, the terms “switch” or “switches” generally refer to any electrical actuators, mechanical actuators, electro-mechanical actuators (rotatable actuators, pivotable actuators, toggle-like actuators, buttons, etc.), optical actuators, or any suitable device that generally fulfills the purpose of connecting and disconnecting electronic devices, or a component thereof, instruments, equipment, transmission line or connections and appurtenances thereto, or software.
As used in this description, “electronic device” generally refers to a device or object that utilizes the properties of electrons or ions moving in a vacuum, gas, or semiconductor. As it is used herein, “electronic circuitry” generally refers to the path of electron or ion movement, as well as the direction provided by the device or object to the electrons or ions. As it is used herein, “electrical circuit” or simply “circuit” generally refers to a combination of a number of electrical devices and conductors that when connected together, form a conducting path to fulfill a desired function. Any constituent part of an electrical circuit other than the interconnections may be referred to as a “circuit element” that may include analog and/or digital components.
The term “generator” may refer to a device capable of providing energy. Such device may include a power source and an electrical circuit capable of modifying the energy outputted by the power source to output energy having a desired intensity, frequency, and/or waveform.
As it is used in this description, “user interface” generally refers to any visual, graphical, tactile, audible, sensory or other mechanism for providing information to and/or receiving information from a user or other entity. The term “user interface” as used herein may refer to an interface between a human user (or operator) and one or more devices to enable communication between the user and the device(s). Examples of user interfaces that may be employed in various embodiments of the present disclosure include, without limitation, switches, potentiometers, buttons, dials, sliders, a mouse, a pointing device, a keyboard, a keypad, joysticks, trackballs, display screens, various types of graphical user interfaces (GUIs), touch screens, microphones and other types of sensors or devices that may receive some form of human-generated stimulus and generate a signal in response thereto. As it is used herein, “computer” generally refers to anything that transforms information in a purposeful way.
The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, or the like. The controller may also include a memory to store data and/or algorithms to perform a series of instructions.
Any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program. A “Programming Language” and “Computer Program” is any language used to specify instructions to a computer, and includes (but is not limited to) these languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, Machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, and fifth generation computer languages. Also included are database and other data schemas, and any other meta-languages. For the purposes of this definition, no distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. For the purposes of this definition, no distinction is made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked) is a reference to any and all such states. The definition also encompasses the actual instructions and the intent of those instructions.
Any of the herein described methods, programs, algorithms or codes may be contained on one or more machine-readable media or memory. The term “memory” may include a mechanism that provides (e.g., stores and/or transmits) information in a form readable by a machine such as a processor, computer, or a digital processing device. For example, a memory may include a read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device. Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and by other like signals.
As it is used in this description, the phrase “treatment plan” refers to a selected ablation needle, energy level, and/or treatment duration to effect treatment of a target. The term “target” refers to a region of tissue slated for treatment, and may include, without limitation, tumors, fibroids, and other tissue that is to be ablated. The phrase “ablation zone” refers to the area and/or volume of tissue that will be ablated.
As it is used in this description, the phrase “computed tomography” (CT) or “computed axial tomography” (CAT) refers to a medical imaging method employing tomography created by computer processing. Digital geometry processing is used to generate a three-dimensional image of the inside of an object from a large series of two-dimensional X-ray images taken around a single axis of rotation.
As it is used in this description, the terms “magnetic resonance imaging” (MRI), “nuclear magnetic resonance imaging” (NMRI), and “magnetic resonance tomography” (MRT) refer to a medical imaging technique used in radiology to visualize detailed internal structures. MRI makes use of the property of nuclear magnetic resonance (NMR) to image nuclei of atoms inside the body. An MRI machine uses a powerful magnetic field to align the magnetization of some atomic nuclei in the body, while using radio frequency fields to systematically alter the alignment of this magnetization. This causes the nuclei to produce a rotating magnetic field detectable by the scanner and this information is recorded to construct an image of the scanned area of the body.
As it is used in this description, the term “three-dimensional ultrasound” or “3D ultrasound” refers to a medical ultrasound technique providing three-dimensional images.
As it is used in this description, the phrase “digital imaging and communication in medicine” (DICOM) refers to a standard for handling, storing, printing, and transmitting information relating to medical imaging. It includes a file format definition and a network communications protocol. The communication protocol is an application protocol that uses TCP/IP to communicate between systems. DICOM files can be exchanged between two entities that are capable of receiving image and patient data in DICOM format.
Any of the herein described systems and methods may transfer data therebetween over a wired network, wireless network, point to point communication protocol, a DICOM communication protocol, a transmission line, a removable storage medium, and the like.
The systems described herein may utilize one or more sensors configured to detect one or more properties of tissue and/or the ambient environment. Such properties include, but are not limited to: tissue impedance, tissue type, tissue clarity, tissue compliance, temperature of the tissue or jaw members, water content in tissue, jaw opening angle, water mobility in tissue, energy delivery, and jaw closure pressure.
In an aspect of the present disclosure, a fiducial tracking system is provided. The fiducial tracking system includes a first device having a fiducial pattern disposed thereon and a second device having an image capture device disposed thereon, the image capture device configured to obtain a fiducial image of the fiducial pattern. A controller is also provided that receives the fiducial image, corrects the fiducial image for lens distortion, finds correspondence between the fiducial image and a model image, estimates a camera pose, and transforms a position of the surgical device to model coordinates.
In the fiducial tracking system, the fiducial pattern includes a plurality of first unique identifiers disposed in a region and a plurality of second unique identifiers.
The controller finds the plurality of first unique identifiers by applying a first threshold to the fiducial image, performing a connected component analysis, applying a geometric filter to determine the weighted centroids of the plurality of first unique identifiers, and storing the weighted centroids of the plurality of first unique identifiers.
The controller finds the plurality of second unique identifiers by inverting the fiducial image, applying a second threshold to the inverted fiducial image, performing a connected component analysis, applying a geometric filter to determine the weighted centroids of the plurality of second unique identifiers and to determine the region having the plurality of first unique identifiers, and storing the weighted centroids of the plurality of second identifiers and the region having the plurality of first unique identifiers.
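The identifier-detection steps above (thresholding, connected-component analysis, a geometric filter, and weighted-centroid computation) can be sketched as follows. This is a minimal Python illustration under stated assumptions, not the disclosed implementation: the area-based geometric filter, the 4-connected flood fill, and the threshold values are hypothetical simplifications.

```python
def find_blob_centroids(image, threshold, min_area=2, max_area=50):
    """Return intensity-weighted centroids (row, col) of bright blobs in a 2D image."""
    rows, cols = len(image), len(image[0])
    # Apply the threshold to obtain a binary mask of candidate pixels.
    mask = [[image[r][c] >= threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r0 in range(rows):
        for c0 in range(cols):
            if mask[r0][c0] and not seen[r0][c0]:
                # Connected-component analysis via iterative flood fill.
                stack, component = [(r0, c0)], []
                seen[r0][c0] = True
                while stack:
                    r, c = stack.pop()
                    component.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and mask[nr][nc] and not seen[nr][nc]):
                            seen[nr][nc] = True
                            stack.append((nr, nc))
                # Simple geometric filter: keep components of plausible size.
                if min_area <= len(component) <= max_area:
                    total = sum(image[r][c] for r, c in component)
                    cy = sum(r * image[r][c] for r, c in component) / total
                    cx = sum(c * image[r][c] for r, c in component) / total
                    centroids.append((cy, cx))
    return centroids
```

Finding the second (dark) identifiers follows the same path after inverting the image, e.g. replacing each pixel value p with 255 − p.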
The controller may find correspondence between the fiducial image and the model image by selecting a plurality of first unique identifiers from the fiducial image, arranging the plurality of first unique identifiers in clockwise order, arranging a plurality of model fiducials in clockwise order, computing a planar homography, transforming the plurality of model fiducials into image coordinates using the computed planar homography, finding a model fiducial from the plurality of model fiducials that matches the fiducial image, and computing the residual error.
The controller may also select a plurality of first unique identifiers from the fiducial image by selecting the plurality of first unique identifiers, selecting the region having the plurality of first unique identifiers, counting the number of first unique identifiers in the selected region, and comparing the number of first unique identifiers in the selected region to a predetermined number. If the number of first unique identifiers in the selected region equals the predetermined number, the method proceeds to arranging the plurality of first unique identifiers in clockwise order. If the number of first unique identifiers in the selected region does not equal the predetermined number, a new region is selected and the number of first unique identifiers in the new region is counted. The first threshold or the second threshold may be a dynamic threshold.
The above and other aspects, features, and advantages of the present disclosure will become more apparent in light of the following detailed description when taken in conjunction with the accompanying drawings in which:
Particular embodiments of the present disclosure are described hereinbelow with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the disclosure and may be embodied in various forms. Well-known functions or constructions are not described in detail to avoid obscuring the present disclosure in unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.
Turning to the figures,
Planning system 100, which is described in more detail below, receives the pre-operative images 15 and determines the size of a target. Based on the target size and a selected surgical device, planning system 100 determines settings that include an energy level and a treatment duration to effect treatment of the target.
Navigation system 200, which is described in more detail below, utilizes a fiducial pattern disposed on a medical imaging device (e.g., an ultrasound imaging device) to determine an intracorporeal position of a surgical device. The intracorporeal position of the surgical device is displayed on a display device in relation to an image obtained by the medical imaging device. Once the surgical device is positioned in the vicinity of the target, the user effects treatment of the target based on the treatment zone settings determined by the planning system.
In some embodiments, a user determines the treatment zone settings using planning system 100 and utilizes the treatment zone settings in effecting treatment using navigation system 200. In other embodiments, the planning system 100 transmits the treatment zone settings to navigation system 200 to automatically effect treatment of the target when the surgical device is in the vicinity of the target. Additionally, in some embodiments, planning system 100 and navigation system 200 are combined into a single standalone system. For instance, a single processor and a single user interface may be used for planning system 100 and navigation system 200, a single processor and multiple user interfaces may be used for planning system 100 and navigation system 200, or multiple processors and a single user interface may be used for planning system 100 and navigation system 200.
To improve the energy focus of the ablation needle 60, the electrical choke 72 is used to contain field propagation or radiation pattern to the distal end of the ablation needle 60. Generally, the choke 72 is disposed on the ablation needle 60 proximally of the radiating section. The choke 72 is placed over a dielectric material that is disposed over the ablation needle 60. The choke 72 is a conductive layer that may be covered by a tubing or coating to force the conductive layer to conform to the underlying ablation needle 60, thereby forming an electrical connection (or short) more distally and closer to the radiating portion 62. The electrical connection between the choke 72 and the underlying ablation needle 60 may also be achieved by other connection methods such as soldering, welding, brazing, crimping, use of conductive adhesives, etc. Ablation needle 60 is electrically coupled to a generator that provides ablation needle 60 with electrosurgical energy.
The flowchart of
In step 154, controller 106 applies a geometric filter to compute the size and shape of an object. The geometric filter enables the measurement of geometric features of all objects in a labeled volume. This labeled volume can represent, for instance, a medical image segmented into different anatomical structures. The measurement of various geometric features of these objects can provide additional insight into the image.
The algorithm determines if a predetermined shape is detected in step 155. If a predetermined shape is not detected, the algorithm proceeds to step 156 where the threshold is increased by a predetermined value. The algorithm repeats steps 153 to 155 until a predetermined object is detected.
Once a predetermined object is detected, the algorithm ends in step 157 and the planning system 100 proceeds to step 144 to perform volumetric analysis. During the volumetric analysis, the following properties of the spherical object may be calculated by controller 106: minimum diameter; maximum diameter; average diameter; volume; sphericity; minimum density; maximum density; and average density. The calculated properties may be displayed on display 110 as shown in region 139 of
In step 146, power and time settings are calculated for a demarcated target.
Using the values in Table 1, a linear equation can be derived to compute optimal power and time settings. For example, using a linear regression analysis, Table 1 provides the following equation:
Volume=0.292381*Power+8.685714*Time−44.0762 (1)
which can be written as
Power=(Volume−8.685714*Time+44.0762)/0.292381. (2)
The desired volume can be calculated using the maximum diameter from the volumetric analysis plus a 1 centimeter margin as follows:
DesiredVolume=4/3*pi*DesiredRadius^3 (3)
where the desired radius is calculated as follows:
DesiredRadius=MaximumNoduleDiameter/2+Margin. (4)
Substituting the desired volume into equation (1) or (2) leaves two unknowns, power and time. Using equation (2) controller 106 can solve for power by substituting values for time. Controller 106 chooses the smallest value for time that maintains power below 70 W, or some other predetermined value, so that the user can perform the procedure as quickly as possible while keeping power in a safe range.
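The selection described above can be sketched in Python using equations (1)-(4). The 1 cm margin and 70 W limit come from the description; the candidate time range and time units (following Table 1, not reproduced here) are illustrative assumptions.

```python
import math

MARGIN_CM = 1.0        # margin added to the nodule radius (from the description)
POWER_LIMIT_W = 70.0   # predetermined safe power limit (from the description)

def plan_power_time(max_nodule_diameter_cm, candidate_times):
    """Pick the smallest time whose required power stays within the limit."""
    desired_radius = max_nodule_diameter_cm / 2 + MARGIN_CM        # eq. (4)
    desired_volume = 4.0 / 3.0 * math.pi * desired_radius ** 3     # eq. (3)
    for time in sorted(candidate_times):
        # Solve eq. (2) for power at this candidate time.
        power = (desired_volume - 8.685714 * time + 44.0762) / 0.292381
        if 0 < power <= POWER_LIMIT_W:
            return power, time
    return None  # no feasible setting in the candidate range
```

Because power decreases as time increases, scanning candidate times in ascending order returns the smallest time that keeps power within the safe range, exactly as the controller is described to do. For a 2 cm nodule and candidate times of 1 through 10, this yields a time of 7 at roughly 57 W.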
Once the power and time are calculated in step 146, the power and time are displayed on display 110 as shown in
Memory 104 and/or controller 106 may store a number of equations that correspond to different surgical devices. When a user selects a different surgical device in drop down menu 131, controller 106 can perform the same analysis described above to determine the smallest value for time that keeps the power below 70 W or some other predetermined value.
Although the above described procedure describes the use of a single seed point to determine a predetermined object, some targets may have an irregular shape that cannot be treated by the predetermined treatment zone without causing damage to other tissue. In such instances, multiple seed points may be used to create an irregularly shaped treatment plan using a single surgical device that is repositioned in a number of places or multiple surgical devices that may be used concurrently to treat an irregularly shaped region.
In other embodiments, memory 104 and/or controller 106 may store a catalog of surgical devices and treatment zone performance, which includes power, time, number of instruments, and spacing of instruments required to achieve treatment zones ex vivo or in vivo. Based on the results of the image segmentation and volumetric analysis, the controller may automatically select device types, numbers of devices, spacing of multiple devices, and/or power and time settings for each device to treat the ROI. Alternatively, a user can manually select device types, numbers of devices, spacing of multiple devices, power and/or time settings for each device to treat the ROI using the GUI to generate a treatment plan.
In another embodiment according to the present disclosure, planning system 100 may also segment organs and other vital structures in addition to targets. Segmentation of organs and other structures, such as vessels, is used to provide a more advanced treatment plan. As described above with regard to
In other embodiments, after targets, tissues, organs, and other structures are segmented, known tissue properties can be attributed to these structures. Such tissue properties include, but are not limited to, electrical conductivity and permittivity across frequency, thermal conductivity, thermal convection coefficients, and so forth. The planning algorithm of
Turning to
The navigation system also incorporates a camera 208 affixed to a surgical device 206. The camera 208 captures an image of fiducial patch 204 in real time in order to determine the position of the surgical device 206 in relation to the scan plane “S”. In particular, fiducial patch 204 has a defined spatial relationship to scan plane “S”. This defined spatial relationship is stored in controller 212. Camera 208 also has a known spatial relationship to surgical device 206 that is stored in controller 212. In order to determine the spatial relationship between surgical device 206 and scan plane “S”, camera 208 captures an image of fiducial patch 204 and transmits the image to controller 212. Using the image of the fiducial patch 204, controller 212 can calculate the spatial relationship between the surgical device 206 and the scan plane “S”.
After controller 212 determines the spatial relationship between the surgical device 206 and scan plane “S”, controller 212 displays that relationship on display 214. As shown in
Controller 212 can also be controlled by a user to input the surgical device type, energy level, and treatment duration. The surgical device type, energy level, and treatment duration can be displayed on display 214 as shown in
The fiducial tracking system is described hereinbelow with reference to
In step 232, controller 212 finds the white circles in the image frame using the algorithm of
threshold=(average black circle intensity+average white circle intensity)/2 (5)
A predetermined threshold may be used to capture the initial valid frame which is then used to calculate a new threshold.
Alternatively, controller 212 may scan for an initial threshold by testing a range of threshold values until a threshold value is found that results in a valid frame. Once an initial threshold is found, controller 212 would use equation (5) for dynamic thresholding based on the valid frame.
In other embodiments, a fixed threshold may be used. The fixed threshold may be a predetermined number stored in controller 212 or it may be determined by testing the range of threshold values until a threshold value is found that results in a valid frame.
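The thresholding alternatives above — equation (5) for dynamic updates and a scan over candidate values for the initial threshold — can be sketched as follows. The frame-validity predicate and the scan range are placeholders for whatever the frame-validation step actually checks.

```python
def dynamic_threshold(black_intensities, white_intensities):
    """Equation (5): midpoint of the average black and white circle intensities."""
    black_avg = sum(black_intensities) / len(black_intensities)
    white_avg = sum(white_intensities) / len(white_intensities)
    return (black_avg + white_avg) / 2

def scan_initial_threshold(frame_is_valid, lo=0, hi=255, step=5):
    """Test a range of threshold values until one yields a valid frame."""
    for t in range(lo, hi + 1, step):
        if frame_is_valid(t):
            return t
    return None  # no threshold in the range produced a valid frame
```

Once `scan_initial_threshold` finds a valid frame, the circle intensities measured in that frame feed `dynamic_threshold` for subsequent frames.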
After a threshold and automatic gain control are applied to the image, a connected component analysis is performed in step 244 to find all the objects in the thresholded image. A geometric filter is applied to the results of the connected component analysis and the image frame in step 245. The geometric filter computes the size and shape of the objects and keeps only those objects that are circular and about the right size as shown in
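One simple way to realize a "circular and about the right size" test is to compare a component's fill ratio within its bounding box to that of an ideal disk (π/4 ≈ 0.785). This is an illustrative stand-in for the geometric filter, not the disclosure's actual criterion; the area limits and tolerance are hypothetical.

```python
import math

def is_circular(pixels, min_area=10, max_area=10_000, tol=0.15):
    """Accept a component (list of (row, col) pixels) that is disk-like in size and shape."""
    area = len(pixels)
    if not (min_area <= area <= max_area):
        return False  # too small (noise) or too large to be a fiducial circle
    rs = [p[0] for p in pixels]
    cs = [p[1] for p in pixels]
    box = (max(rs) - min(rs) + 1) * (max(cs) - min(cs) + 1)
    # A filled disk occupies about pi/4 of its bounding box.
    return abs(area / box - math.pi / 4) <= tol
```

A thin line or a square blob fills its bounding box almost completely (ratio near 1) and is rejected, while a filled disk passes.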
Turning back to
In step 234 of
After the four circles are chosen, they are arranged in a clockwise order using a convex hull algorithm in step 252. The convex hull or convex envelope for a set of points X in a real vector space V is the minimal convex set containing X. If the points are all on a line, the convex hull is the line segment joining the outermost two points. In the planar case, the convex hull is a convex polygon unless all points are on the same line. Similarly, in three dimensions the convex hull is in general the minimal convex polyhedron that contains all the points in the set. In addition, the four matching fiducials in the model are also arranged in a clockwise order.
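The ordering step above can be sketched with Andrew's monotone-chain convex hull. The sketch returns the hull ordered counterclockwise in conventional y-up axes; in image coordinates, where the y axis points down, the same traversal appears clockwise on screen (an assumption about the coordinate convention, which the description does not state).

```python
def convex_hull_ordered(points):
    """Monotone-chain convex hull; returns hull vertices in a consistent rotational order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Positive when o->a->b turns counterclockwise (y-up axes).
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Drop each chain's last point (it starts the other chain).
    return lower[:-1] + upper[:-1]
```

Applying the same ordering to both the four image circles and the four model fiducials gives matched sequences for the homography computation.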
In step 253, a planar homography matrix is computed. After a planar homography matrix is calculated, the homography matrix is used to transform the fiducial models to image coordinates using the four corresponding fiducial models shown in
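With four point correspondences, a planar homography can be computed by the direct linear transform: fixing h33 = 1 yields eight linear equations in the remaining eight entries. A self-contained sketch follows; the Gaussian-elimination solver and the example points are illustrative, and a production system would typically use a least-squares solve over all correspondences.

```python
def solve_linear(a, b):
    """Gaussian elimination with partial pivoting for a square system a*x = b."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def homography_from_points(src, dst):
    """Direct linear transform: 3x3 homography H (h33 = 1) mapping src[i] to dst[i]."""
    a, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = solve_linear(a, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def project(h, p):
    """Apply homography h to a 2D point p (homogeneous division by w)."""
    x, y = p
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)
```

Projecting each model fiducial through the recovered homography and measuring its distance to the matched image fiducial gives the residual error used to accept or reject the correspondence.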
In step 235 of
Because object boundaries expand and contract under different lighting conditions, the location of a conventional square-corner fiducial may change depending on lighting conditions. Fiducial patch 204 uses black and white circles and, thus, is not hampered by this problem because the center of a circle always stays the same and continues to work well for computing weighted centroids. Other contrasting images or colors are also contemplated.
In another embodiment of the present disclosure, and as shown in
In other embodiments of the present disclosure, CT navigation and software can be integrated with planning system 100. Turning to
Navigation system 406 may use an electromagnetic tracking system as shown in
After receiving data from navigation system 406, controller 408 may correlate the position of the surgical device 424 with the CT images in order to navigate the surgical device 424 to a target “T” as described below. In this case, the patient reference (of any type) may have radiopaque markers on it as well to allow visualization during CT. This allows the controller to connect the patient CT image coordinate system to the instrument tracking coordinate system.
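Connecting the CT image coordinate system to the instrument tracking coordinate system amounts to chaining rigid transforms; the sketch below illustrates this with 4x4 homogeneous matrices. The transform names and decomposition are illustrative assumptions, not taken from the disclosure:

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def instrument_tip_in_ct(T_ct_from_ref, T_ref_from_tracker, tip_tracker):
    """Chain transforms: tracker coords -> patient-reference coords -> CT coords.

    T_ct_from_ref would come from registering the patient reference's
    radiopaque markers in the CT image; T_ref_from_tracker from the
    electromagnetic tracker. (Names are illustrative.)
    """
    p = np.append(np.asarray(tip_tracker, dtype=float), 1.0)
    return (T_ct_from_ref @ T_ref_from_tracker @ p)[:3]
```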
Controller 408 and display 410 cooperate with each other to display the CT images on a navigation screen 440 as shown in
A navigation guide screen 448 is provided on display screen 440 to assist in navigating the ablation needle to the target “T”. Based on the data received from the navigation system 406, the controller can determine whether the surgical device 424 is aligned with target “T”. If the surgical device 424 is not aligned with target “T”, circle 454 is displayed off-center from outer circle 453. The user then adjusts the angle of entry for the surgical device 424 until the center of circle 454 is aligned with the center of outer circle 453. In some embodiments, circle 454 may be displayed as a red circle when the center of circle 454 is not aligned with the center of outer circle 453, or as a green circle when the centers are aligned. Additionally, controller 408 may calculate the distance between the target “T” and the surgical device 424.
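The alignment check driving the red/green indicator can be sketched as a simple distance test; the 2 mm tolerance below is an illustrative value, not specified in the disclosure:

```python
import math

def alignment_indicator(tip_offset, tolerance_mm=2.0):
    """Decide the navigation-guide circle color from the lateral offset
    between the center of circle 454 and the center of outer circle 453.

    tip_offset: (dx, dy) offset in mm. Returns (distance, color); the
    tolerance is an assumed value for illustration.
    """
    dist = math.hypot(*tip_offset)
    return dist, ("green" if dist <= tolerance_mm else "red")
```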
In another embodiment depicted in
It should be understood that the foregoing description is only illustrative of the present disclosure. Various alternatives and modifications can be devised by those skilled in the art without departing from the disclosure. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications and variances. The embodiments described with reference to the attached drawing figures are presented only to demonstrate certain examples of the disclosure. Other elements, steps, methods and techniques that are insubstantially different from those described above and/or in the appended claims are also intended to be within the scope of the disclosure.
Number | Name | Date | Kind |
---|---|---|---|
5515160 | Schulz et al. | May 1996 | A |
5528699 | Obata et al. | Jun 1996 | A |
5776062 | Nields | Jul 1998 | A |
5788636 | Curley | Aug 1998 | A |
5799099 | Wang et al. | Aug 1998 | A |
5810008 | Dekel et al. | Sep 1998 | A |
5817022 | Vesely | Oct 1998 | A |
5825908 | Pieper et al. | Oct 1998 | A |
5836954 | Heilbrun et al. | Nov 1998 | A |
5842473 | Fenster et al. | Dec 1998 | A |
5873822 | Ferre et al. | Feb 1999 | A |
5891030 | Johnson et al. | Apr 1999 | A |
5902239 | Buurman | May 1999 | A |
5953013 | Shimizu | Sep 1999 | A |
5954648 | Van Der Brug | Sep 1999 | A |
5957844 | Dekel et al. | Sep 1999 | A |
5967980 | Ferre et al. | Oct 1999 | A |
6002808 | Freeman | Dec 1999 | A |
6006126 | Cosman | Dec 1999 | A |
6019724 | Gronningsaeter et al. | Feb 2000 | A |
6052477 | Wang et al. | Apr 2000 | A |
6081577 | Webber | Jun 2000 | A |
6112112 | Gilhuijs et al. | Aug 2000 | A |
6112113 | Van Der Brug et al. | Aug 2000 | A |
6119033 | Spigelman et al. | Sep 2000 | A |
6165181 | Heilbrun et al. | Dec 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
6195444 | Simanovsky et al. | Feb 2001 | B1 |
6203497 | Dekel et al. | Mar 2001 | B1 |
6216029 | Paltieli | Apr 2001 | B1 |
6226542 | Reisfeld | May 2001 | B1 |
6259943 | Cosman et al. | Jul 2001 | B1 |
6285902 | Kienzle | Sep 2001 | B1 |
6298262 | Franck et al. | Oct 2001 | B1 |
6301495 | Gueziec et al. | Oct 2001 | B1 |
6332089 | Acker et al. | Dec 2001 | B1 |
6334847 | Fenster et al. | Jan 2002 | B1 |
6338716 | Hossack et al. | Jan 2002 | B1 |
6341231 | Ferre et al. | Jan 2002 | B1 |
6343936 | Kaufman et al. | Feb 2002 | B1 |
6379302 | Kessman et al. | Apr 2002 | B1 |
6381483 | Hareyama et al. | Apr 2002 | B1 |
6440071 | Slayton et al. | Aug 2002 | B1 |
6442417 | Shahidi et al. | Aug 2002 | B1 |
6466815 | Saito et al. | Oct 2002 | B1 |
6470207 | Simon et al. | Oct 2002 | B1 |
6477275 | Melikian et al. | Nov 2002 | B1 |
6487432 | Slack | Nov 2002 | B2 |
6505065 | Yanof et al. | Jan 2003 | B1 |
6529758 | Shahidi | Mar 2003 | B2 |
6539247 | Spetz | Mar 2003 | B2 |
6540679 | Slayton et al. | Apr 2003 | B2 |
6546279 | Bova et al. | Apr 2003 | B1 |
6553152 | Miller et al. | Apr 2003 | B1 |
6574493 | Rasche et al. | Jun 2003 | B2 |
6612980 | Chen et al. | Sep 2003 | B2 |
6669635 | Kessman et al. | Dec 2003 | B2 |
6675032 | Chen et al. | Jan 2004 | B2 |
6694163 | Vining | Feb 2004 | B1 |
6711429 | Gilboa et al. | Mar 2004 | B1 |
6724930 | Kosaka et al. | Apr 2004 | B1 |
6731966 | Spigelman et al. | May 2004 | B1 |
6733458 | Steins et al. | May 2004 | B1 |
6751361 | Wagman | Jun 2004 | B1 |
6754374 | Miller et al. | Jun 2004 | B1 |
6772002 | Schmidt et al. | Aug 2004 | B2 |
6812933 | Silver | Nov 2004 | B1 |
6892090 | Verard et al. | May 2005 | B2 |
6909913 | Vining | Jun 2005 | B2 |
6920347 | Simon et al. | Jul 2005 | B2 |
6925319 | McKinnon | Aug 2005 | B2 |
6947786 | Simon et al. | Sep 2005 | B2 |
6961405 | Scherch | Nov 2005 | B2 |
6968224 | Kessman et al. | Nov 2005 | B2 |
6969352 | Chiang et al. | Nov 2005 | B2 |
6973202 | Mostafavi | Dec 2005 | B2 |
7035461 | Luo et al. | Apr 2006 | B2 |
7043055 | Silver | May 2006 | B1 |
7043064 | Paik et al. | May 2006 | B2 |
7050845 | Vilsmeier | May 2006 | B2 |
7161596 | Hoile | Jan 2007 | B2 |
7171255 | Holupka et al. | Jan 2007 | B2 |
7204254 | Riaziat et al. | Apr 2007 | B2 |
7215990 | Fessner et al. | May 2007 | B2 |
7251352 | Sauer et al. | Jul 2007 | B2 |
7259762 | Tanacs et al. | Aug 2007 | B2 |
7302288 | Schellenberg | Nov 2007 | B1 |
7333644 | Jerebko et al. | Feb 2008 | B2 |
7343026 | Niwa et al. | Mar 2008 | B2 |
7379572 | Yoshida et al. | May 2008 | B2 |
7383073 | Abovitz et al. | Jun 2008 | B1 |
7450749 | Rouet et al. | Nov 2008 | B2 |
7452357 | Vlegele et al. | Nov 2008 | B2 |
7457443 | Persky | Nov 2008 | B2 |
7491198 | Kockro | Feb 2009 | B2 |
7492930 | Leitner et al. | Feb 2009 | B2 |
7496173 | Goldman et al. | Feb 2009 | B2 |
7499743 | Vass et al. | Mar 2009 | B2 |
7519218 | Takemoto et al. | Apr 2009 | B2 |
7536041 | Pekar et al. | May 2009 | B2 |
7567697 | Mostafavi | Jul 2009 | B2 |
7570987 | Raabe et al. | Aug 2009 | B2 |
7581191 | Rice et al. | Aug 2009 | B2 |
7593505 | Saracen et al. | Sep 2009 | B2 |
7623250 | Moctezuma de la Barrera et al. | Nov 2009 | B2 |
7630753 | Simon et al. | Dec 2009 | B2 |
7636420 | Spies et al. | Dec 2009 | B2 |
7639853 | Olivera et al. | Dec 2009 | B2 |
7643663 | Wiemker et al. | Jan 2010 | B2 |
7672705 | Lachaine et al. | Mar 2010 | B2 |
7689019 | Boese et al. | Mar 2010 | B2 |
7780084 | Zhang et al. | Aug 2010 | B2 |
7809184 | Neubauer et al. | Oct 2010 | B2 |
7831082 | Holsing et al. | Nov 2010 | B2 |
7844087 | Ray et al. | Nov 2010 | B2 |
7853305 | Simon et al. | Dec 2010 | B2 |
7856130 | Suri et al. | Dec 2010 | B2 |
7860331 | Lal et al. | Dec 2010 | B2 |
7860548 | McIntyre et al. | Dec 2010 | B2 |
7873400 | Moctezuma de la Barrera et al. | Jan 2011 | B2 |
7874987 | Altmann et al. | Jan 2011 | B2 |
7876937 | Schildkraut et al. | Jan 2011 | B2 |
7876939 | Yankelevitz et al. | Jan 2011 | B2 |
7876942 | Gilboa | Jan 2011 | B2 |
7892224 | Hartlep et al. | Feb 2011 | B2 |
7894663 | Berg et al. | Feb 2011 | B2 |
7899513 | Phillips et al. | Mar 2011 | B2 |
7907772 | Wang et al. | Mar 2011 | B2 |
7912258 | Warmath et al. | Mar 2011 | B2 |
7916918 | Suri et al. | Mar 2011 | B2 |
7920911 | Hoshino et al. | Apr 2011 | B2 |
7953265 | Sirohey et al. | May 2011 | B2 |
20110137168 | Lee et al. | Jun 2011 | A1 |
7957572 | Von Berg et al. | Jun 2011 | B2 |
7970174 | Goldbach | Jun 2011 | B2 |
8000442 | Lachaine et al. | Aug 2011 | B2 |
8010180 | Quaid et al. | Aug 2011 | B2 |
8019133 | Knoplioch et al. | Sep 2011 | B2 |
8023712 | Ikuma et al. | Sep 2011 | B2 |
8023734 | Jolly et al. | Sep 2011 | B2 |
8036435 | Partain et al. | Oct 2011 | B2 |
8045778 | Blaffert et al. | Oct 2011 | B2 |
8046052 | Verard et al. | Oct 2011 | B2 |
20030135115 | Burdette et al. | Jul 2003 | A1 |
20040015070 | Liang et al. | Jan 2004 | A1 |
20040034297 | Darrow et al. | Feb 2004 | A1 |
20060229594 | Francischelli et al. | Oct 2006 | A1 |
20070238961 | Vilsmeier et al. | Oct 2007 | A1 |
20080063136 | Ohyu et al. | Mar 2008 | A1 |
20080081982 | Simon et al. | Apr 2008 | A1 |
20080097186 | Biglieri et al. | Apr 2008 | A1 |
20080119712 | Lloyd | May 2008 | A1 |
20080123921 | Gielen et al. | May 2008 | A1 |
20080123927 | Miga et al. | May 2008 | A1 |
20080167547 | Bova et al. | Jul 2008 | A1 |
20080200794 | Teichman et al. | Aug 2008 | A1 |
20080200926 | Verard et al. | Aug 2008 | A1 |
20080200927 | Hartmann et al. | Aug 2008 | A1 |
20080214922 | Hartmann et al. | Sep 2008 | A1 |
20080232656 | Voegele | Sep 2008 | A1 |
20080242978 | Simon et al. | Oct 2008 | A1 |
20080262345 | Fichtinger et al. | Oct 2008 | A1 |
20080285854 | Kotake et al. | Nov 2008 | A1 |
20090124896 | Haras | May 2009 | A1 |
20090198126 | Klingenbeck-Regn | Aug 2009 | A1 |
20090221908 | Glossop | Sep 2009 | A1 |
20090292201 | Kruecker | Nov 2009 | A1 |
20090312629 | Razzaque et al. | Dec 2009 | A1 |
20100063392 | Nishina et al. | Mar 2010 | A1 |
20100063496 | Trovato et al. | Mar 2010 | A1 |
20100076305 | Maier-Hein et al. | Mar 2010 | A1 |
20100121189 | Ma et al. | May 2010 | A1 |
20100121190 | Pagoulatos et al. | May 2010 | A1 |
20100168763 | Zhao et al. | Jul 2010 | A1 |
20100179529 | Podhajsky et al. | Jul 2010 | A1 |
20100208963 | Kruecker et al. | Aug 2010 | A1 |
20100217117 | Glossop et al. | Aug 2010 | A1 |
20100249771 | Pearson et al. | Sep 2010 | A1 |
20100250209 | Pearson et al. | Sep 2010 | A1 |
20100268223 | Coe et al. | Oct 2010 | A1 |
20100274124 | Jascob et al. | Oct 2010 | A1 |
20100295931 | Schmidt | Nov 2010 | A1 |
20100298705 | Pelissier et al. | Nov 2010 | A1 |
20100322489 | Tizhoosh et al. | Dec 2010 | A1 |
20110015628 | Dalal et al. | Jan 2011 | A1 |
20110118596 | Vining et al. | May 2011 | A1 |
20110129154 | Shimodaira | Jun 2011 | A1 |
20110137156 | Razzaque et al. | Jun 2011 | A1 |
20110160569 | Cohen et al. | Jun 2011 | A1 |
20110251483 | Razzaque et al. | Oct 2011 | A1 |
20120050258 | Kay et al. | Mar 2012 | A1 |
20121362420 | Qi et al. | May 2012 | A1 |
Number | Date | Country |
---|---|---|
1 649 822 | Apr 2006 | EP |
WO 9515729 | Jun 1995 | WO |
WO 9703609 | Feb 1997 | WO |
WO 0139124 | May 2001 | WO |
WO 2006089426 | Aug 2006 | WO |
WO 2008017051 | Feb 2008 | WO |
WO 2008058520 | May 2008 | WO |
WO 2012066446 | May 2012 | WO |
Entry |
---|
U.S. Appl. No. 13/477,374, filed May 22, 2012, Jason A. Case et al. |
U.S. Appl. No. 13/477,406, filed May 22, 2012, Kevin Frank et al. |
U.S. Appl. No. 13/477,291, filed May 22, 2012, Jason A. Case et al. |
U.S. Appl. No. 13/477,417, filed May 22, 2012, Kevin Frank et al. |
U.S. Appl. No. 13/477,395, filed May 22, 2012, Jason A. Case et al. |
International Search Report dated Aug. 21, 2013, corresponding to International Application No. PCT/US2013/041842; 4 pages. |
European Search Report dated Oct. 10, 2013, corresponding to European Application No. EP 13 16 8705; 8 pages. |
European Search Report dated Oct. 8, 2013 corresponding to European Application No. EP 13 16 8706; 8 pages. |
European Search Report dated Aug. 23, 2013, corresponding to European Application No. EP 13 16 8707; 9 pages. |
European Search Report dated Aug. 23, 2013, corresponding to European Application No. EP 16 16 8516; 15 pages. |
Kosaka A. et al. “Augmented Reality System for Surgical Navigation Using Robust Target Vision”, Proceedings 2000 IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2000. Hilton Head, SC, Jun. 13-15, 2000, pp. 187-194. |
European Search Report dated Oct. 23, 2013 for EP 13 17 6292. |
Number | Date | Country | |
---|---|---|---|
20130315440 A1 | Nov 2013 | US |