The systems and methods disclosed herein relate generally to computer systems facilitating medical device guidance through tissue by a medical practitioner.
Various existing medical device systems, such as those used for ablation procedures, require a healthcare provider, such as a physician, to place controlled formations of medical devices into patient tissue. Examples of these systems include the RFA Medical InCircle™ System and the AngioDynamics NanoKnife™ system. Similarly, the Covidien Evident™ MWA System does not require, but supports, multi-medical device configurations. Many of these systems include a plastic guide piece to help the healthcare provider hold the medical devices in an acceptable spatial configuration (e.g., three medical devices held parallel, with shafts 2 cm apart).
Unfortunately, controlled placement of the medical devices often comes at the expense of operator flexibility and responsiveness. Accordingly, there is a need for a medical device guidance system which assists operators in placing medical devices, while permitting the operator to maintain an appropriate degree of hand flexibility.
System Overview
Implementations disclosed herein provide systems, methods and apparatus for generating images facilitating medical device insertion into tissue by an operator. Certain embodiments pertain to a free-hand medical device guidance system. The system can provide the healthcare provider full manual control over the medical device, while making the spatial relationships between the target, medical device and ultrasound (U/S) image more intuitive via a visual display. Using this visual feedback, the operator can adjust the medical device's position, orientation, or trajectory. In particular, the system can be used to facilitate multiple-medical device configurations. Certain of the contemplated embodiments can be used in conjunction with previous systems, such as U.S. patent application Ser. No. 13/014,587 and U.S. patent application Ser. No. 11/137,156, each of which is hereby incorporated by reference in its entirety.
In some embodiments, a user desires that the medical devices that are used together (e.g., used simultaneously for treating one tumor) are parallel to each other. In certain embodiments, the medical devices are arranged to be equally spaced from the center of the tumor. In some embodiments, a user desires that the tips of the medical devices be in the same plane, and all medical device shafts be perpendicular to that plane. In some embodiments, each medical device includes a single shaft-like electrode surrounded by a tube of electrically insulating material. The medical provider can expose some length of the electrode by sliding the insulating tube partially into the medical device's handle. In some embodiments, the length of the exposed electrode can be specified by the user (e.g., 0.5 cm-2 cm). In some embodiments, the length of the exposed electrodes can be the same for all medical devices. In certain embodiments, the healthcare provider exposes the electrode (by withdrawing the insulating sleeve) before placing the medical device into the patient. In some embodiments, the power generator is provided the distances between the medical devices and the length of the exposed electrodes. In some embodiments, the medical devices can continue to move, even after the healthcare provider has placed them into the proper position in the patient's tissue, because of patient motion from breathing, manual manipulation, etc. In some embodiments, the medical devices are not parallel to an image plane.
The system can aid the healthcare provider in placing the medical devices. In some embodiments, the system improves the healthcare provider's ability to place the medical devices level with each other (e.g., tips are co-planar) and parallel with each other. The system can also aid the healthcare provider in determining the number of medical devices to be used and what their optimal positions are, including: distance from tumor's center; distance from tumor's extent or boundary; medical device depth; spacing between medical devices; angle between medical devices; exposure of the deployable electrodes (or optimal retraction of the electrode's insulation), etc.
The system can also help the healthcare provider determine what healthy tissues are in an ablation zone volume (by displaying the predicted ablation zone of the multi-medical device configuration). The system can help the healthcare provider place the medical devices in the above determined (planned) configuration. The system can help the healthcare provider understand how the current configuration (i.e. the way the medical devices are currently placed) differs from the optimal, acceptable or pre-planned configurations. The system can output the distance between medical devices to the power generator, so that the ablation time, the ablation power and other ablation parameters can be automatically computed by the ablation generator. The system can be used for treatment of tumors, fibroids or cysts, with bipolar radiofrequency medical device ablation, multiple microwave medical devices, electroporation, and/or electrochemotherapy systems. It can also be used for nerve or muscle stimulation or sensing (electrodes in the spine, brain). The system can be used during open surgery, laparoscopic surgery, endoscopic procedures, biopsies, and/or interventional radiology procedures.
The system can be used in conjunction with live intraoperative ultrasound (U/S), pre-operative CT, or any cross-sectional medical imaging modality (e.g., MRI, OCT, etc.). In addition, the system can use a variety of techniques to determine each medical device's position and orientation. For example, the system can use the NDI Aurora magnetic system, NDI Polaris optical system, etc. In some embodiments, a position sensor can be embedded inside, or affixed to the outside of, each medical device, at the tip, along the shaft, or on the handle. Sensors can be built into the medical devices or attached after manufacturing, before use. Each medical device can have its own sensor, which continually reports position and orientation, or a single sensor can be used for all the medical devices. In embodiments where one sensor is used, the healthcare provider can attach the sensor to the particular medical device that she is intentionally repositioning, and then, once she has placed that medical device, remove the sensor and attach it to the next medical device she is repositioning. In some embodiments, the medical devices, U/S probe and/or laparoscope can be manipulated by the healthcare provider. In certain embodiments, the system can be used with a robotic manipulator, where the robot controls the medical devices, U/S probe and/or laparoscope.
In some embodiments, the handles of medical devices can have push-button switches to allow the user to select a medical device, indicate a tissue target, etc. The handle can also have an indicator light to indicate to the users which medical device is selected. Finally, the handle can have an encoder to detect how much length of electrode has been exposed by the user, and report this information to the guidance system and therapeutic generator.
Image Guidance Systems
In some embodiments, position sensing units 110 and 140 can track surgical instruments, also referred to herein as medical devices, within a tracking area and provide data to the image guidance unit. The medical devices can include invasive medical devices, biopsy needles, ablation needles, surgical needles, nerve-block needles, or other needles, electrocautery devices, catheters, stents, laparoscopic cameras, or other instruments that enter a part of the body, and non-invasive medical devices that do not enter the body, such as ultrasound transducers. The medical devices can also include medical imaging devices that provide or aid in the selection of medical images for display. In some embodiments, the medical imaging device can be any device that is used to select a particular medical image for display. The medical imaging devices can include invasive medical devices, such as laparoscopic cameras, and non-invasive medical devices, such as ultrasound transducers.
Although only two surgical instruments 145 and 155 are shown in
Information about and from multiple surgical systems 149 and attached surgical instruments 145 (and additional surgical instruments not shown) can be processed by image guidance unit 130 and shown on display 120. These and other possible embodiments are discussed in more detail below. Imaging unit 150 can be coupled to image guidance unit 130. In some embodiments, imaging unit 150 can be coupled to a second display unit (not shown). The second display unit can display imaging data from imaging unit 150. The imaging data displayed on display unit 120 and displayed on second display unit can be the same or different. In some embodiments, the imaging unit 150 is an ultrasound machine 150, the movable imaging device 155 is an ultrasound transducer 155 or ultrasound probe 155, and the second display unit is a display associated with the ultrasound machine 150 that displays the ultrasound images from the ultrasound machine 150. In some embodiments, a movable imaging unit 155 can be connected to image guidance unit 130. The movable imaging unit 155 can be useful for allowing a user to indicate what portions of a first set of imaging data are to be displayed. For example, the movable imaging unit 155 can be an ultrasound transducer 155, a needle or other medical device, for example, and can be used by a user to indicate what portions of imaging data, such as a pre-operative CT scan, to show on a display unit 120 as image 125. Further, in some embodiments, there can be a third set of pre-operative imaging data that can be displayed with the first set of imaging data.
In some embodiments, system 100 comprises a first position sensing unit 110, a display unit 120, and second position sensing unit 140 (if it is included) all coupled to image guidance unit 130. In some embodiments, first position sensing unit 110, display unit 120, and image guidance unit 130 are all physically connected to stand 170. Image guidance unit 130 can be used to produce images 125 that are displayed on display unit 120. The images 125 produced on display unit 120 by the image guidance unit 130 can be determined based on ultrasound or other visual images from the first surgical instrument 145 and second surgical instrument 155.
For example, if the first surgical instrument 145 is an ablation needle 145 and the second surgical instrument 155 is an ultrasound probe 155, then images 125 produced on display 120 can include the images, or video, from the ultrasound probe 155 combined with graphics, such as projected medical device drive or projected ablation volume, determined based on the emplacement of ablation needle 145. If the first surgical instrument 145 is an ultrasound probe 145 and the second surgical instrument 155 is a laparoscopic camera 155, then images 125 produced on display 120 can include the video from the laparoscopic camera 155 combined with ultrasound data superimposed on the laparoscopic image. More surgical instruments can be added to the system. For example, the system can include an ultrasound probe, ablation needle, laparoscopic camera, cauterizer, scalpel and/or any other surgical instrument or medical device. The system can also process and/or display collected data, such as preoperative CT scans, X-Rays, MRIs, laser scanned 3D surfaces etc.
The term “emplacement” and the term “pose” as used herein are broad terms encompassing their plain and ordinary meanings and may refer to, without limitation, position, orientation, the combination of position and orientation, or any other appropriate location information. In some embodiments, the imaging data obtained from one or both of surgical instruments 145 and 155 can include other modalities such as a CT scan, MRI, open-magnet MRI, optical coherence tomography (“OCT”), positron emission tomography (“PET”) scans, fluoroscopy, ultrasound, or other preoperative, or intraoperative 2D or 3D anatomical imaging data. In some embodiments, surgical instruments 145 and 155 can also be scalpels, implantable hardware, or any other device used in surgery. Any appropriate surgical system 149 or imaging unit 150 can be attached to the corresponding medical instruments 145 and 155.
As noted above, images 125 produced can also be generated based on live, intraoperative, or real-time data obtained using the second surgical instrument 155, which is coupled to second imaging unit 150. The term “real time” as used herein is a broad term and has its ordinary and customary meaning, including without limitation instantaneously or nearly instantaneously. The use of the term real time can also mean that actions are performed or data is obtained with the intention to be used immediately, upon the next cycle of a system or control loop, or any other appropriate meaning. Additionally, as used herein, real-time data can be data that is obtained at a frequency that would allow a healthcare provider to meaningfully interact with the data during surgery. For example, in some embodiments, real-time data can be a medical image of a patient that is updated one time per second. In some embodiments, real-time data can be ultrasound data that is updated multiple times per second.
Second surgical instrument 155 can be coupled to second position sensing unit 140. Second position sensing unit 140 can be part of imaging unit 150 or it can be separate. Second position sensing unit 140 can be used to determine the emplacement of second surgical instrument 155. In some embodiments, first and/or second position sensing units 110 and/or 140 can be magnetic trackers and magnetic coils can be coupled to surgical instruments 145 and/or 155. In some embodiments, first and/or second position sensing units 110 and/or 140 can be optical trackers and visually-detectable fiducials can be coupled to surgical instruments 145 and/or 155.
Images 125 can be produced based on intraoperative or real-time data obtained using first surgical instrument 145, which is coupled to first surgical system 149. In
In some embodiments, the first position sensing unit 110 tracks the emplacement of first surgical device 145. First position sensing unit 110 can be an optical tracker 110 and first surgical device 145 can have optical fiducials attached thereto. The emplacement of optical fiducials can be detected by first position sensing unit 110, and, therefrom, the emplacement of first surgical device 145 can be determined.
In various embodiments, as depicted in
In some embodiments, either or both of the first position sensing unit 110 and the second position sensing unit 140 can be an Ascension Flock of Birds, Nest of Birds, driveBAY, medSAFE, trakSTAR, miniBIRD, MotionSTAR, pciBIRD, or Calypso 2D Localization System and tracking units attached to the first and/or second medical devices 145 and 155 can be magnetic tracking coils. The term “tracking unit,” as used herein, is a broad term encompassing its plain and ordinary meaning and includes without limitation all types of magnetic coils or other magnetic field sensing devices for use with magnetic trackers, fiducials or other optically detectable markers for use with optical trackers, such as those discussed above and below. In some embodiments, the tracking units can be implemented using optical position sensing devices, such as the HiBall tracking system and the first and second position sensing units 110 and 140 can form part of the HiBall tracking system. Tracking units can also include a GPS device or signal emitting device that allows for tracking of the position and, optionally, orientation of the tracking unit. In some embodiments, a signal emitting device might include a radio-frequency identifier (RFID). In such embodiments, the first and/or second position sensing unit 110 and 140 can use the GPS coordinates of the tracking units or can, for example, triangulate the radio frequency signal being emitted by the RFID associated with tracking units. The tracking systems can also include one or more 3D mice.
In some embodiments, either or both of the first position sensing unit 110 and the second position sensing unit 140 can be an electromagnetic measurement system (e.g., NDI Aurora system) using sensor coils for tracking units attached to the first and/or second surgical devices 145 and 155. In some embodiments, either or both of the first position sensing unit 110 and the second position sensing unit 140 can be an optical 3D tracking system using fiducials. Such optical 3D tracking systems can include the NDI Polaris Spectra, Vicra, Certus, PhaseSpace IMPULSE, Vicon MX, InterSense IS-900, NaturalPoint OptiTrack, Polhemus FastTrak, IsoTrak, or Claron MicronTracker2. In some embodiments, either or both of position sensing units 110 and 140 can each be an inertial 3D tracking system comprising a compass, accelerometer, tilt sensor and/or gyro, such as the InterSense InertiaCube or the Nintendo Wii controller. In some embodiments, either or both of position sensing units 110 and 140 can be attached to or affixed on the corresponding surgical device 145 and 155. In some embodiments, the position sensing units, 110 and 140, can include sensing devices such as the HiBall tracking system, a GPS device, or signal emitting device that would allow for tracking of the position and, optionally, orientation of the tracking unit. In some embodiments, a position sensing unit 110 or 140 can be affixed to either or both of the surgical devices 145 and 155. The surgical devices 145 or 155 can be tracked by the position sensing units 110 or 140. A room coordinate system reference, such as the display 120 can also be tracked by the position sensing unit 110 or 140 in order to determine the emplacements of the surgical devices 145 and 155 with respect to the room coordinate system. Devices 145 and 155 can also include or have coupled thereto one or more accelerometers, which can be used to estimate movement, position, and location of the devices.
In some embodiments, the display unit 120 displays 3D images to a user, such as a healthcare provider. Stereoscopic 3D displays separate the imagery shown to each of the user's eyes. This can be accomplished by a stereoscopic display, a lenticular auto-stereoscopic display, or any other appropriate type of display. The display 120 can be an alternating row or alternating column display. Example alternating row displays include the Miracube G240S, as well as Zalman Trimon Monitors. Alternating column displays include devices manufactured by Sharp, as well as many “auto-stereoscopic” displays (e.g., Philips). Display 120 can also be a cathode ray tube (CRT). CRT-based devices can use temporal sequencing, showing imagery for the left and right eye in temporal sequential alternation. This method can also be used by projection-based devices, as well as by liquid crystal display (LCD) devices, light emitting diode (LED) devices, and/or organic LED (OLED) devices.
In certain embodiments, a user can wear a head mounted display in order to receive 3D images from the image guidance unit 130. In such embodiments, a separate display, such as the pictured display unit 120, can be omitted. The 3D graphics can be produced using underlying data models, stored in the image guidance unit 130 and projected onto one or more 2D planes in order to create left and right eye images for a head mount, lenticular, or other 3D display. The underlying 3D model can be updated based on the relative emplacements of the various devices 145 and 155, as determined by the position sensing unit(s), and/or based on new data associated with the devices 145 and 155. For example, if the second medical device 155 is an ultrasound probe, then the underlying data model can be updated to reflect the most recent ultrasound image. If the first medical device 145 is an ablation needle, then the underlying model can be updated to reflect any changes related to the needle, such as power or duration information. Any appropriate 3D graphics processing can be used for rendering including processing based on OpenGL, Direct3D, Java 3D, etc. Whole, partial, or modified 3D graphics packages can also be used, such packages including 3DS Max, SolidWorks, Maya, Form Z, Cybermotion 3D, VTK, Slicer, or any others. In some embodiments, various parts of the needed rendering can occur on traditional or specialized graphics hardware. The rendering can also occur on the general CPU, on programmable hardware, on a separate processor, be distributed over multiple processors, over multiple dedicated graphics cards, or using any other appropriate combination of hardware or technique.
One or more modules, units, devices, or elements of various embodiments can be packaged and/or distributed as part of a kit. For example, in one embodiment, an ablation needle, tracking elements, 3D viewing glasses, and/or a portion of an ultrasound wand can form a kit. Other embodiments can have different elements or combinations of elements grouped and/or packaged together. Kits can be sold or distributed separately from or with the other portions of the system.
One will readily recognize that there are numerous other examples of image guidance systems which can use, incorporate, support, or provide for the techniques, methods, processes, and systems described herein.
Depicting Surgical Instruments
Previous systems do not provide satisfactory image guidance data. It can often be difficult to discern the content of a 3D scene from a 2D depiction of it, or even from a 3D depiction of it. Therefore, various embodiments herein provide image guidance that can help the doctor better understand the scene and the relative emplacements or poses of objects in the scene, and thereby provide improved guidance.
The virtual surgical instrument 201 can be displayed in a virtual 3D space with the screen 220 acting as a window into the virtual 3D space, which can also be referred to as the perspective view. Thus, as the surgical instrument 245 is moved to the right, the virtual surgical instrument 201 also moves to the right. Similarly, if the surgical instrument 245 is rotated 90 degrees so that the tip of the surgical instrument is pointing towards the screen 220, the virtual surgical instrument 201 will likewise show the change in orientation, and show the tip of the virtual surgical instrument 201 in the background and the other end of the image 201 in the foreground.
Some models of medical devices have markings such as bands around the shaft (to indicate distance along the shaft), and a colored region near the tip to indicate where the radio frequency or microwave energy is emitted from in the case of an ablation probe. Healthcare providers performing medical device procedures are often familiar with these markings and can use them to help understand the spatial relationship between the medical device and anatomy. In some embodiments, the make and model of the medical device 245 is known to the image guidance system and the virtual medical device displayed (201) in display 220 can resemble medical device 245. The features of medical devices that can be rendered in the scene include the overall shape (diameter, cross sectional shape, curvature, etc.), color, distance markers, visuals or echogenic fiduciary markers, the state of deployable elements such as tines, paddles, anchors, resection loops, stiffening or steerable sleeves, temperature, radiation, light or magnetic field sensors, lens, waveguides, fluid transfer channels, and the like.
The type of medical device being used can be input into the image guidance system, can be a system default, can be detected by a camera or other device, can be received as data from an attached medical device, such as surgical system 149 in
Consider an embodiment in which the virtual surgical instrument 201 in the display 220 is an ablation needle depicting the portion of the needle that will perform the ablation, for example, the portion that emits the radio or microwave energy. If the display 220 also includes ultrasound data, then the doctor may be able to find the tumor she wishes to ablate by moving the ultrasound probe around until she spots the tumor. In various embodiments, she will be able to see the displayed ultrasound data and its location relative to the displayed medical device with the markings. She can then drive the medical device until she sees, on display 220, that the emitter-portion of the medical device encompasses the tumor in the ultrasound, also seen on display 220. When she activates the ablation, she can then be much more certain that she has ablated the correct portion of the tissue. Various embodiments of this are discussed more below.
As another example, consider the physical markings that can be on the instruments themselves. These markings can help orient a healthcare provider during use of the instrument. In some embodiments, the image guidance unit can represent these markings in the images displayed in the display. For example, certain ultrasound transducers are built with an orientation mark (e.g., a small bump) on one side of the transducing array. That mark can also be shown in the ultrasound image on the scanner's display, to help the healthcare provider understand where the scanned anatomical structures shown on screen are located under the transducer, inside the patient. In some embodiments, the image guidance system can display a symbolic 3D representation of the orientation mark both next to the motion-tracked ultrasound slice (e.g., moving with the displayed ultrasound slice) and next to the 2D ultrasound slice also displayed by the system. An example of this is displayed in
Other embodiments can track and display other types of instruments and their features. For example, a healthcare provider may want to track one or more of a scalpel, a biopsy needle, a cauterizer (including an electrocauterizer and Bovies), forceps, cutting loops on hysteroscopes, harmonic shears, lasers (including CO2 lasers), etc. For example, in various embodiments, the following devices can be tracked and various aspects of their design displayed on display 220: Olympus™ OES Pro Hystero-Resectoscope, SonoSurg Ultrasonic Surgical System, Olympus™ GF-UC 160 Endoscope, Wallus™ Embryo Transfer Catheter, AngioDynamics® NanoKnife™, VenaCure™ laser, StarBurst, Uniblade, Habib® Resector, Bovie™ Electrodes, Covidien Evident™, Cool-tip™ Ablation Antennas, Opti4™ Electrodes, Microsulis MEA (microwave endometrial ablation), Acculis Halt™ Medical System, Optimed BigLumen Aspiration Catheter, Optimed Optipure Stent, and central venous catheterization introducer medical devices (such as those made by Bard and Arrow).
Once tracked, a healthcare provider is able to see image guidance data on display 220 that will allow her to know the relative pose, location, or emplacement of the tracked instrument(s) with respect to one another or with respect to imaging data and will be able to see, on display 220, the features of the instrument rendered in the scene.
Depicting Ablation Volume and Other Instrument Information
Various embodiments of the systems herein depict, as part of the image guidance data, information related to the surgical instruments. For example, in some embodiments, an image guidance system such as the systems of
For some ablation needles, the expected volume of ablated tissue is neither spherical nor centered at the tip of the medical device. For example: a Covidien surgical microwave medical device has an ellipsoidal ablation volume; a Covidien Evident transcutaneous microwave medical device has a teardrop-like ablation volume; RFA Medical's bipolar ablation system uses two medical devices simultaneously, where each medical device has paddles that deploy after the medical device is inserted inside the tissue (which one can equate to a canoe's oar). In some embodiments, the ablation volume for such a medical device is, to a first approximation, a volume that lies directly between the paddles of the two medical devices.
The position and orientation of the volume can be specified by the placement of a tracked medical device, such as medical device 345 in
Other instrument information can also be depicted. For example, if a cauterizer is tracked as part of an image guidance system, then the cauterization volume can be determined or estimated and that volume can be displayed. If a laser is tracked as part of the image guidance system, then the projected laser path can be determined or estimated and displayed. In embodiments where multiple medical devices are used, the combined volume can be shown, as described in greater detail below with reference to
Depicting Medical Device Placement Trajectory and Other Prediction Information
In certain procedures, the system can provide prediction information related to the surgical instruments. In the context of scalpel movement, this can be the location that the scalpel will hit if a healthcare provider continues to move the scalpel in a particular direction. In the context of ablation or biopsies, this can be the projected medical device placement if it is driven along its central axis, which is also referred to herein as a longitudinal axis.
In some embodiments, in order to aid the healthcare provider in placing or orienting a medical device 445, an image guidance system, such as that depicted in
The rings can be spaced at regular (e.g., 0.5, 1, or 2 cm) intervals to provide the healthcare provider with visual or guidance cues regarding the distance from the medical device tip to the targeted anatomy. In some embodiments, the spacing of the rings can indicate other aspects of the data, such as the drive speed of the medical device, the density of the tissue, the distance to a landmark, such as the ultrasound data, or any other appropriate guidance data or property. In some embodiments, the rings or other trajectory indicators can extend beyond the medical device tip by a distance equal to the length of the medical device's shaft. This way, the user knows if the medical device is long enough to reach the target, even before the tip enters the patient. That is, in some embodiments, if the rings do not reach the target while the tip is still outside the body, then the tip will not reach the target even when the entire length of the shaft is inserted into the body.
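The ring placement described above amounts to stepping along the device's longitudinal axis at a fixed interval, extending past the tip by the shaft length. The following is a minimal sketch only, assuming tracked positions in millimeters and NumPy; the function name ring_centers and its parameters are illustrative and not part of any disclosed implementation.

```python
import numpy as np

def ring_centers(tip, direction, shaft_length_mm, spacing_mm=10.0):
    """Centers of trajectory rings ahead of the device tip.

    tip             -- 3D position of the device tip (mm, tracker space)
    direction       -- vector along the device's longitudinal axis,
                       pointing from handle toward tip
    shaft_length_mm -- length of the device shaft; rings extend this far
                       beyond the tip so the user can judge reachability
    spacing_mm      -- interval between rings (e.g., 5, 10, or 20 mm)
    """
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                      # normalize to a unit vector
    n_rings = int(shaft_length_mm // spacing_mm)
    return [np.asarray(tip, float) + d * spacing_mm * (i + 1)
            for i in range(n_rings)]

# Example: a 150 mm shaft with 10 mm spacing yields 15 ring centers.
centers = ring_centers(tip=[0, 0, 0], direction=[0, 0, 1], shaft_length_mm=150)
```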
Other display markers can be used to show trajectory, such as a dashed, dotted, or solid line, transparent medical device shaft, point cloud, wire frame, etc. In some embodiments, three-dimensional rings can be used; they provide depth cues while obscuring little of the ultrasound image. Virtual rings or other virtual markers can be displayed semi-transparently, so that they obscure less of the ultrasound image than an opaque marker would.
Other prediction information can also be displayed. For example, if a scalpel is being tracked by the image guidance system, then a cutting plane corresponding to the scalpel can be displayed (not pictured). Such a cutting plane can be coplanar with the blade of the scalpel and can project from the blade of the scalpel. For example, the projected cutting plane can show where the scalpel would cut if the doctor were to advance the scalpel. Similar prediction information can be estimable or determinable for cauterizers, lasers, and numerous other surgical instruments.
Depicting Combinations of Graphics
As discussed herein, when there are multiple instruments or devices being used in a procedure, images, graphics, and data associated with the multiple instruments can be displayed to the healthcare provider. In some embodiments, as depicted in
The data from two or more devices can be combined and displayed based on their relative emplacements or poses. For example, the system 100 can determine an image plane based on the emplacement information of the ultrasound probe 555. Further, the ultrasound image 504 can be displayed on the image plane with respect to a virtual ablation needle 502 on a display 520 in a manner that estimates the relative emplacements or poses of an ultrasound probe 555 and ablation needle 545. As illustrated in
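One way to realize this kind of combined display is to compose the tracked pose of the ultrasound probe with the known physical extent of its image to obtain the image-plane corners in the common tracker coordinate system in which the virtual needle is also rendered. The sketch below is illustrative only; it assumes NumPy, a 4x4 homogeneous pose reported for the probe, and a hypothetical convention that the image lies in the probe's local x-z plane (a real system would also apply a probe-to-image calibration transform).

```python
import numpy as np

def pose_matrix(rotation_3x3, translation_3):
    """Build a 4x4 homogeneous transform from tracker rotation/translation."""
    T = np.eye(4)
    T[:3, :3] = rotation_3x3
    T[:3, 3] = translation_3
    return T

def image_corners_world(probe_pose_world, width_mm, depth_mm):
    """Corners of the ultrasound image plane in world (tracker) coordinates.

    probe_pose_world -- 4x4 pose of the transducer reported by the tracker
    width_mm, depth_mm -- physical extent of the scanned image

    Assumes (hypothetically) that the image lies in the probe's local x-z
    plane with the transducer face at the local origin.
    """
    local = np.array([[-width_mm / 2, 0, 0,        1],
                      [ width_mm / 2, 0, 0,        1],
                      [ width_mm / 2, 0, depth_mm, 1],
                      [-width_mm / 2, 0, depth_mm, 1]], dtype=float)
    return (probe_pose_world @ local.T).T[:, :3]

# Example: identity rotation, probe 50 mm above the tracker origin.
pose = pose_matrix(np.eye(3), [0.0, 0.0, 50.0])
corners = image_corners_world(pose, width_mm=38.0, depth_mm=70.0)
```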
In addition, the display 520 includes an intersection indicator 510 that indicates where the virtual ablation medical device 502 intersects the ultrasound image 504. In some embodiments, the intersection indicator 510 can be displayed before the medical device is inserted, thereby allowing the healthcare provider to see where the medical device will intersect the image.
In this image, a tumor 512 appears in the ultrasound image 504 and the virtual ablation needle 502 is shown driven through the tumor 512. The ablation volume 506 estimates where ablation would occur if the tissue were ablated at that time. The healthcare provider can see that the ablation volume 506 appears to cover the tumor displayed in the ultrasound image.
Various embodiments can include any combinations of the graphics described above with reference to
In some embodiments, the image guidance system can constantly display an additional 2D view of the ultrasound image 505 (in screen space), simultaneously with the 3D depiction of the procedure, so that the ultrasound image is always visible, regardless of the orientation in which the healthcare provider holds the transducer. This is illustrated in
In some embodiments, the 2D view 505 of an ultrasound image is depicted in the upper right corner of the monitor (though it can be placed in any location). In some embodiments, the guidance system can automatically (and continually) choose a corner in which to render the 2D view of the ultrasound image, based on the 3D position of the surgical instruments in the rendered scene. For example, in
In some embodiments, the system attempts to avoid having the 2D ultrasound image quickly moving among corners of the display in order to avoid overlapping with graphics and data in the display. For example, a function f can be used to determine which corner is most suitable for the 2D ultrasound image to be drawn in. The inputs to f can include the locations, in the screen coordinate system, of the displayed medical device tip, the corners of the 3D ultrasound image, etc. In some embodiments, f's output for any given point in time is independent of f's output in the previous frames, which can cause the ultrasound image to move among corners of the display rapidly. In some embodiments, the image guidance system will filter f's output over time. For example, the output of a filter g, for any given frame, could be the corner which has been output by f the most number of times over the last n frames, possibly weighting the most recent values for f most heavily. The output of the filter g can be used to determine in which corner of display 520 to display the 2D ultrasound image and the temporal filtering provided by g can allow the 2D ultrasound image display to move more smoothly among the corners of the display 520.
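A minimal sketch of the corner-selection function f and the temporal filter g described above might look like the following, assuming Python and screen-space (pixel) coordinates. The scoring used inside f (the clearance of each corner from the projected 3D content) is one plausible choice rather than the disclosed one, and the simple mode-over-n-frames filter omits the optional recency weighting mentioned above; all names are illustrative.

```python
from collections import Counter, deque
import numpy as np

CORNERS = ("top-left", "top-right", "bottom-left", "bottom-right")

def choose_corner(needle_tip_px, image_corners_px, screen_w, screen_h):
    """f: pick the display corner farthest from the projected 3D content."""
    anchors = {"top-left": (0, 0), "top-right": (screen_w, 0),
               "bottom-left": (0, screen_h), "bottom-right": (screen_w, screen_h)}
    content = np.vstack([needle_tip_px, image_corners_px])
    def clearance(corner):
        return np.min(np.linalg.norm(content - np.asarray(anchors[corner]), axis=1))
    return max(CORNERS, key=clearance)

class CornerFilter:
    """g: temporally filter f's output (mode over the last n frames)."""
    def __init__(self, n_frames=60):
        self.history = deque(maxlen=n_frames)
    def update(self, corner):
        self.history.append(corner)
        return Counter(self.history).most_common(1)[0][0]

# Example: one frame where the needle tip sits near the top-left of the screen.
corner_filter = CornerFilter(n_frames=60)
raw = choose_corner((100, 80),
                    np.array([[300, 200], [500, 200], [500, 400], [300, 400]]),
                    1920, 1080)
print(corner_filter.update(raw))   # prints "bottom-right" for this frame
```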
In some embodiments, other appropriate virtual information can be overlaid on the 2D ultrasound image as well. Examples include: an indication of the distance between the medical device's tip and the point in the plane of the ultrasound image that is closest to the medical device tip; the cross section or outline of the ablation volume that intersects with the ultrasound slice; and/or the intersection point, box, outline, etc. between the medical device's axis and the ultrasound image plane.
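The first of these overlays, the distance between the medical device's tip and the closest point in the plane of the ultrasound image, is a point-to-plane computation. A sketch, assuming NumPy and a plane given by a point and a normal; the function and parameter names are illustrative:

```python
import numpy as np

def tip_to_plane(tip, plane_point, plane_normal):
    """Signed distance from the device tip to the ultrasound image plane,
    plus the closest point in that plane (the point an overlay would mark)."""
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    d = np.dot(np.asarray(tip, float) - np.asarray(plane_point, float), n)
    closest = np.asarray(tip, float) - d * n
    return d, closest

# Example: tip 12 mm in front of a plane through the origin facing +z.
distance, closest = tip_to_plane([5.0, 3.0, 12.0], [0, 0, 0], [0, 0, 1])
```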
Representing Spatial Relationships
At times, when three dimensional relationships are depicted in 2D, or even in 3D, it can be difficult to gauge the relative positions, orientations, and distances among various objects. Consider
In some embodiments, the image guidance system can indicate spatial relationships with graphical indicators. For example, in
In some unpictured embodiments, the image guidance system can draw “guidance graphics” in the form of projective lines between the medical device and the ultrasound slice. These lines can be perpendicular to the plane of the slice and serve to indicate the most likely location in the slice where the medical device will become visible if it is moved to become coplanar with the slice. Together with stereoscopic head-tracked visualization, the projective lines help a healthcare provider make a more accurate assessment of the location of the medical device with respect to the ultrasound slice.
Returning to
Representing Non-Intersecting Objects or Images
When data related to two devices or surgical instruments are displayed with relative emplacement, it can be difficult to orient their relative locations if they do not intersect. In some embodiments, an image guidance system will render relative location information. The relative location information can be shown with color (e.g., objects can be rendered in brighter colors if they are closer), with rendering techniques (e.g., objects can be rendered with transparency so that one object behind another can be visible, but visually appear behind the closer object), with geometry (e.g., a geometric connector can be shown that will allow the viewer to discern the relative relationships), or with any other appropriate technique.
For example, in some embodiments, if the intersection point of a medical device 702 is outside of the area of the ultrasound slice 704, the image guidance system can draw geometry, such as a line (or rectangle) in the image plane to indicate the relative positions of the medical device(s) and ultrasound image. This is depicted in
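The geometry for this indicator can be derived by intersecting the medical device's longitudinal axis with the image plane and testing whether the intersection falls inside the slice rectangle. A sketch under those assumptions, using NumPy, with the rectangle described by one corner and two spanning edge vectors; names are illustrative:

```python
import numpy as np

def axis_plane_intersection(tip, direction, plane_origin, u_axis, v_axis):
    """Where the device's longitudinal axis crosses the image plane.

    plane_origin, u_axis, v_axis describe the slice rectangle: origin is one
    corner, u_axis and v_axis span the width and depth (not unit vectors).
    Returns (point, inside), where inside indicates whether the crossing lies
    within the rectangle; if not, the caller can draw a line or rectangle in
    the image plane from the slice toward the returned point.
    """
    n = np.cross(u_axis, v_axis)
    denom = np.dot(direction, n)
    if abs(denom) < 1e-9:                      # axis parallel to the plane
        return None, False
    t = np.dot(np.asarray(plane_origin, float) - np.asarray(tip, float), n) / denom
    point = np.asarray(tip, float) + t * np.asarray(direction, float)
    rel = point - np.asarray(plane_origin, float)
    u = np.dot(rel, u_axis) / np.dot(u_axis, u_axis)
    v = np.dot(rel, v_axis) / np.dot(v_axis, v_axis)
    return point, (0.0 <= u <= 1.0 and 0.0 <= v <= 1.0)
```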
Marking Points of Interest
In certain procedures, healthcare providers desire to keep track of multiple spots within the volume of the patient or keep track of a single point or feature while looking at other parts of the volume. For example, when a healthcare provider is going to perform an ablation, before inserting any medical devices, the healthcare provider will often scan the tissues at the procedure site to find all targets (e.g., tumors) and note other features of the tissues. Then, later in the procedure, the healthcare provider can return to the previously identified points-of-interest. For example, a healthcare provider might first scan the liver and find seven lesions that she will attempt to ablate. After ablating the first lesion, she might be required to find the second lesion again, and so forth. Before finishing the procedure, she might be required to verify that she has ablated all seven of the lesions that she identified at the beginning of the procedure. This constant scanning and rescanning can be time consuming and error prone. Further, in a procedure where the healthcare provider is attempting to locate, for example, fluid-filled cysts, once a medical device pierces the cyst, the fluid can drain out, making the target difficult or impossible to locate again with ultrasound.
In some embodiments, the image guidance system allows the healthcare provider to mark or keep track of points or features of interest. In various embodiments, the healthcare provider can mark the points or features of interest in various ways. For example, consider a procedure where the doctor is using the image guidance system with an ablation needle and an ultrasound probe. The doctor can mark the point by pressing a button on a keyboard or medical device, by gesturing or issuing a verbal command, or with any other appropriate method. The point of interest can be marked at the point where the medical device intersects with the ultrasound image plane, where the medical device's projection intersects with the ultrasound image plane, or any other appropriate relationship (such as at the location of the tip of the medical device). For example, when the healthcare provider identifies a point-of-interest within the ultrasound image, she can point to it using the medical device even if the medical device is outside the body of the patient. This is depicted in
Healthcare providers, during some liver ablation procedures, can manage fifteen points-of-interest, or even more. As depicted in
In some embodiments, the image guidance system stores the positions of the points-of-interest in the position sensing system's coordinate system. If the position sensing system is fixed to the image guidance system, then, if the patient or image guidance system is moved, stored points-of-interest can become incorrectly located. In some embodiments, this can be remedied via a fiducial or other detectable feature or item, the pose of which relative to the tracking system can be continually, continuously, periodically, or occasionally measured. The fiducial can be attached to the operating table, the patient's skin, or even embedded into the tissue itself (e.g., as a magnetic tracking coil), and the positions of the points-of-interest relative to it can be stored and displayed. For example, in a system where magnetic tracking is used, a magnetic tracking coil can be affixed to the operating table or patient. In some embodiments, the healthcare provider can draw the annotations.
The image 1056 can be associated with a medical device, such as an ultrasound transducer (not pictured in
The annotation 1071, although it has been drawn on an image 1056, can be located in the virtual 3D space, as defined by the placement of the image 1056 and the annotation 1071.
Multiple Medical Device Tracking
Using the emplacement information received from the tracking units, the system can calculate the emplacement of each medical device within a predefined area as well as the relative emplacement of the medical devices with respect to each other. The system can also determine the longitudinal axis 1112A, 1112B (also referred to as the central axis) of the medical devices 1102, 1104 and use the longitudinal axes to calculate the trajectories of each medical device, angle differences between the medical devices, etc. This data can be used to generate useful information for display for the healthcare provider.
As mentioned previously, in some medical procedures that use multiple medical devices, it is desirable to have the medical devices on the same plane (e.g., the tips on the same plane) and parallel with each other. The system described herein can aid a healthcare provider in placing the medical devices so that they are level and parallel with each other.
During, or prior to, a medical procedure, one of the medical devices can be selected as the foundational medical device. In the illustrated embodiment of
Once a foundational medical device is selected, the system can calculate one or more foundational planes, and can generate foundational plane indicators for the different foundational planes. As used herein, a foundational plane is a plane that is perpendicular to a trajectory of the foundational medical device and intersects at least one point of the foundational medical device and/or intersects at least one point in the trajectory of the foundational medical device. In some embodiments, the trajectory of the foundational medical device is determined by the longitudinal axis of the foundational medical device.
Accordingly, a variety of foundational planes can be calculated and used as described herein. For example, the foundational tip plane 1114 is a foundational plane that intersects a tip of the foundational medical device 1102. As another example, the foundational electrode plane 1116 is a foundational plane that intersects the electrode of the foundational medical device 1102. In some embodiments, the foundational electrode plane intersects the electrode at a location where the exposed electrode 1108A ends, such as where the exposed electrode meets the insulated tube 1110A or handle. In certain embodiments the foundational electrode plane 1116 intersects the exposed electrode 1108A at any location.
The emplacement information can also be used to calculate various distances between the foundational medical device and other medical devices and between the foundational planes and the medical devices. In some embodiments, the emplacement information can be used to calculate the relative distance 1118 between the tips 1106A, 1106B of the medical devices 1102, 1104.
In certain embodiments, the emplacement information can be used to calculate the horizontal distance 1120 between the tips 1106A, 1106B of the medical devices 1102, 1104. The horizontal distance 1120 can represent the distance between the tips 1106A, 1106B if the tips were (or are) level (e.g., on the same plane, such as the foundational tip plane). In some embodiments, the horizontal distance can be calculated by determining the distance between the tip 1106A of the foundational medical device 1102 and the location of the tip 1106B of the secondary medical device 1104 if the secondary medical device 1104 were on the foundational tip plane 1114.
The emplacement information can also be used to calculate the vertical distance 1122 between the tips 1106A, 1106B of the medical devices 1102, 1104. The vertical distance 1122 can represent the distance one tip is in front of or behind the other tip (e.g., how far the tip of one medical device is from the foundational tip plane). In some embodiments, the vertical distance can be calculated by determining the distance between the tip 1106B of the secondary medical device 1104 and the foundational tip plane 1114. This information can be used to determine whether the medical devices 1102, 1104 are level (e.g., whether the tips are on the same plane, such as the foundational tip plane).
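The relative, vertical, and horizontal distances discussed above can all be derived from the tip-to-tip offset vector and the foundational device's axis, which serves as the normal of the foundational tip plane. A sketch, assuming NumPy and illustrative names:

```python
import numpy as np

def tip_distances(found_tip, found_axis, secondary_tip):
    """Decompose the tip-to-tip offset relative to the foundational tip plane.

    found_axis is a vector along the foundational device's longitudinal axis;
    the foundational tip plane passes through found_tip with found_axis as its
    normal.  Returns (relative, vertical, horizontal) distances, corresponding
    to items 1118, 1122, and 1120 described above.
    """
    a = np.asarray(found_axis, float)
    a /= np.linalg.norm(a)
    offset = np.asarray(secondary_tip, float) - np.asarray(found_tip, float)
    relative = np.linalg.norm(offset)                # straight-line distance
    vertical = abs(np.dot(offset, a))                # distance to the tip plane
    horizontal = np.linalg.norm(offset - np.dot(offset, a) * a)  # in-plane part
    return relative, vertical, horizontal

# Example: a tip 20 mm to the side and 5 mm behind the foundational tip plane.
rel, vert, horiz = tip_distances([0, 0, 0], [0, 0, 1], [20.0, 0.0, -5.0])
```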
The emplacement information can also be used to calculate the relative difference in degrees between the medical devices 1102, 1104 (e.g., how many degrees from being parallel). For example, the system can compare the longitudinal axis 1112A of the foundational medical device 1102 with the longitudinal axis 1112B of the secondary medical device 1104 to determine how many degrees of difference there is between the two. The difference in degrees can be displayed as a number or graphical indicator. For example, the system can provide projective lines as discussed previously and also in greater detail below with reference to
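The "degrees from parallel" value can be computed from the two longitudinal axes with a dot product. A sketch, assuming NumPy; treating oppositely pointing shafts as parallel is an assumption of this sketch rather than something the description requires:

```python
import numpy as np

def degrees_from_parallel(axis_a, axis_b):
    """Angle between two device axes, in degrees (0 means parallel)."""
    a = np.asarray(axis_a, float) / np.linalg.norm(axis_a)
    b = np.asarray(axis_b, float) / np.linalg.norm(axis_b)
    # Ignore which way each shaft points: 180 degrees also counts as parallel.
    cos_angle = np.clip(abs(np.dot(a, b)), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

# Example: two shafts tilted slightly apart report a small angle.
print(degrees_from_parallel([0, 0, 1], [0.05, 0, 1]))   # roughly 2.9 degrees
```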
The emplacement information can also be used to determine a target axis 1124 for the secondary medical device 1104. In some embodiments, the target axis 1124 can be the axis at which the secondary medical device 1104 is to be placed. The target axis 1124 location can be based on user input entered prior to or during the procedure, such as where the user wants the secondary medical device to be placed for a medical procedure, and/or dynamically determined based on the emplacement of the foundational medical device 1102. In certain embodiments, the target axis 1124 is a predetermined distance from and parallel to the longitudinal axis 1112A of the foundational medical device 1102. The predetermined distance can be selected by a user or dynamically determined by the system 100 based on the length of the exposed electrodes of the medical devices, a model of the ablation or biopsy parameters of the medical devices (e.g., a lookup table), tumor size, etc.
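One simple way to construct such a target axis is to offset the foundational axis by the predetermined distance within the foundational tip plane. A sketch, assuming NumPy; the offset_direction parameter (e.g., toward the planned second insertion site) is an illustrative input, not necessarily something the system exposes:

```python
import numpy as np

def target_axis(found_tip, found_axis, offset_direction, offset_mm):
    """A target axis parallel to the foundational axis, offset_mm away.

    offset_direction is projected into the foundational tip plane so that the
    resulting axis stays parallel to found_axis at the requested spacing.
    Returns (point_on_axis, unit_direction).
    """
    a = np.asarray(found_axis, float)
    a /= np.linalg.norm(a)
    d = np.asarray(offset_direction, float)
    d = d - np.dot(d, a) * a                  # remove any along-axis component
    d /= np.linalg.norm(d)
    return np.asarray(found_tip, float) + offset_mm * d, a

# Example: a target axis 20 mm to the +x side of the foundational device.
point, direction = target_axis([0, 0, 0], [0, 0, 1], [1, 0, 0.2], 20.0)
```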
In some embodiments, the system can determine relative spatial indicators that indicate the relative distance between portions of the secondary medical device 1104 and the target plane or target axis (e.g., the longitudinal axis 1112A of the foundational medical device 1102, location of the preferred placement of the secondary medical device 1104, etc.). In certain embodiments, the relative spatial indicators can indicate the distance from portions of the secondary medical device 1104 to corresponding portions of the foundational medical device 1102.
Once the system determines one or more of the parameters described above, it can cause a display device to display those parameters. In some embodiments, one or more of the determined parameters are displayed for each medical device that is being tracked. In certain embodiments, the system displays one or more parameters of a selected medical device (or multiple selected devices) that is being tracked. In some embodiments, the one or more parameters of the selected medical device are displayed in conjunction with the foundational medical device.
A user can select the selected medical device using any number of different inputs (e.g., touchscreen, keyboard, mouse, foot pedal, button, etc.). Once the selected medical device is selected, the system can cause the display device to display one or more of the associated parameters. For example, when a user is attempting to place a medical device during a procedure, she can press a button that causes the medical device to be the selected medical device. In response, the system can display one or more parameters associated with the selected medical device, such as, but not limited to, trajectory, intersection indicators, relative spatial indicators, etc., as will be described in greater detail below with reference to
In addition, the system can use the emplacement information to determine a perspective view of the virtual 3D space. For example, the system can use any one or a combination of the following points as the center of perspective for the perspective view: an image (e.g., the center of an ultrasound image); the center of the foundational needle; the center of the exposed electrode 1108A of the foundational medical device 1102; the center of the exposed electrode 1108B of the selected secondary medical device 1104 (or other secondary medical device); the location between the foundational medical device 1102 and the selected secondary medical device 1104; the center of the exposed electrode of a non-selected secondary medical device; the center of all medical devices selected thus far in the procedure; the center of all medical devices within a distance from the foundational medical device; or the center of all medical devices.
Using the emplacement information and calculated data, the system can provide a user various indicators to aid in the placement of the medical devices, as will be described in greater detail below with reference to
Rendering with Multiple Medical Devices
2D Image Display Area
The 2D image display area 1202, described in greater detail above with reference to
3D Image Display Area
The 3D image display area 1204 can represent a virtual 3D space that corresponds to an actual 3D space that is being tracked. In the illustrated embodiment, the 3D image display area 1204 includes a virtual medical imaging device 1208, an image 1210, as described in greater detail above with reference to
Furthermore, the 3D image display area 1204 can include multiple virtual medical devices 1212, 1214, 1216, image guidance cues (e.g., trajectory indicators 1218, 1220, 1222, image plane intersection indicators 1224, 1226, foundational plane indicators 1228, 1230, and foundational plane intersection indicator 1232), a patient orientation indicator 1234, and a medical provider indicator 1236.
Although three medical devices 1212, 1214, 1216 are displayed, it will be understood that fewer or more medical devices can be used and displayed as desired. In addition, it will be understood that in some embodiments not all of the features shown in
The multiple virtual medical devices 1212, 1214, 1216 can be implemented by tracking emplacement information of multiple real medical devices located within a predetermined area. The predetermined area can correspond to a location of a medical procedure, the location of a patient, the location of tracking units, the range of position sensing units, a surgical table, an area corresponding to a virtual 3D area displayed on a display, etc. Tracking emplacement information for a medical device is described in greater detail above with reference to
In some embodiments, each medical device and its associated image guidance cues (e.g. trajectory rings, intersection square, text, etc.) can be associated with a color. In some embodiments, each medical device with its associated image guidance cues is associated with a different color. For example, the first medical device 1212 and image guidance cues related to it can be drawn in pink, a second medical device 1214 and its associated image guidance cues can be drawn in green, and a third medical device 1216 and its associated image guidance cues can be drawn in blue. It will be understood that any combination of colors can be used as desired. Furthermore, the medical devices and associated image guidance cues can be distinguished based on different patterns or markings. For example, the system can cause the medical devices to have zigzags, lines, be bolded, flicker, etc.
Foundational Plane Indicators
In some embodiments, the image guidance cues can include foundational plane indicators, such as foundational plane indicators 1228, 1230 and foundational plane intersection indicators, such as foundational plane intersection indicator 1232. The foundational plane indicators 1228, 1230 can indicate a location of a foundational plane and/or indicate a location where secondary medical devices are to be placed in order to be level with the foundational medical device. In some embodiments, the foundational plane indicators 1228, 1230 can also indicate how the medical devices are to be placed so they are parallel to each other. The foundational plane intersection indicators (e.g., foundational plane intersection indicator 1232) can indicate a location where the trajectory of a medical device intersects a foundational plane.
In the illustrated embodiment of
In the illustrated embodiment, the foundational tip plane indicator 1228 extends from the tip of the foundational medical device 1212 to locations on the foundational tip plane where the tips of the secondary medical devices 1214, 1216 are to be placed. Accordingly, a user can use the foundational tip plane indicator 1228 to identify the location where the secondary medical devices 1214, 1216 are to be located to be level with the foundational medical device 1212. In certain embodiments, the foundational electrode plane indicators can indicate the location where the trajectories of the secondary medical devices intersect the foundational plane.
In some embodiments, the desired location for the tips of the secondary medical devices on the foundational tip plane is determined based on user input. For example, the user can indicate at what distances and locations the secondary medical devices are to be placed with respect to the foundational medical device. Based on the input, the system can determine the appropriate shape and size of the foundational tip plane indicator. In certain embodiments, the system dynamically calculates the location for the tips of the secondary medical devices on the foundational tip plane based on an identified object, such as a tumor, fibroid, etc., the size of the identified object, the number of medical devices to be used, electrical power specifications of the medical devices, etc.
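For the dynamically calculated case, one plausible approach is to space the planned tip locations evenly on a circle in the foundational tip plane, centered on the identified object. The sketch below assumes NumPy and that the circle radius has already been chosen from the object size and device parameters; all names are illustrative:

```python
import numpy as np

def planned_tip_positions(target_center, plane_normal, n_devices, radius_mm):
    """Tip positions equally spaced on a circle around the target.

    The circle lies in the foundational tip plane (normal = plane_normal) and
    is centered on the identified object (e.g., a tumor).  radius_mm might be
    derived from the object's size and the devices' ablation parameters.
    """
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    # Build an orthonormal basis (u, v) spanning the tip plane.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, n)) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper); u /= np.linalg.norm(u)
    v = np.cross(n, u)
    center = np.asarray(target_center, float)
    angles = np.linspace(0.0, 2.0 * np.pi, n_devices, endpoint=False)
    return [center + radius_mm * (np.cos(t) * u + np.sin(t) * v) for t in angles]

# Example: three devices spaced 15 mm from the tumor center on the tip plane.
tips = planned_tip_positions([10.0, 20.0, 30.0], [0, 0, 1], 3, 15.0)
```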
In the illustrated embodiment of
The foundational electrode plane indicator 1230 extends from the electrode of the foundational medical device 1212 to locations on the foundational electrode plane where the electrodes of secondary medical devices are to be placed. In the illustrated embodiment of
The foundational plane intersection indicator 1232 can be used to indicate where the trajectory of a medical device intersects with a foundational plane, or where a medical device will be if it is moved forward. In the illustrated embodiment of
In some embodiments, the guidance cues (e.g., trajectory indicators, intersection indicators, etc.) associated with one or more medical devices are displayed simultaneously. In certain embodiments, the guidance cues associated with each medical device are displayed only when the associated medical device is selected. For example, in the illustrated embodiment of
Patient and Provider Indicators
The patient orientation indicator 1234 can indicate the orientation of the patient with respect to the perspective view of the 3D image display area 1204. In the illustrated embodiment of
In some embodiments, the display 1200 can include a provider indicator 1236 that indicates the location of a medical provider with respect to the patient. The provider indicator can be implemented as a dot, shape, color, arrow, line, or any other graphical indicator to indicate the location of the medical provider with respect to the patient. The system can determine the location of the provider based on user input and/or dynamically based on one or more tracking units associated with the medical provider.
In certain embodiments, the location of the medical provider coincides with the perspective view of the virtual 3D space. In the illustrated embodiment of
Emplacement Measurement Display Area
The emplacement measurement display area 1206 can provide a user with information regarding the emplacement of the medical devices. In some embodiments, the emplacement measurement display area 1206 provides relative emplacement information of the medical devices. For example, the emplacement measurement display area 1206 can indicate relative distances between the medical devices, relative degrees between the medical devices, etc. The relative distances between the medical devices can include the distance between the ends of medical devices, the distance between other corresponding portions of the medical devices, the horizontal distance between corresponding portions of medical devices (e.g., the distance between a tip of the foundational medical device and the tip of the second medical device if the tip of the second medical device was on the foundational tip plane), the vertical distance between corresponding portions of medical devices (e.g., the distance between the foundational tip plane and the tip of a second medical device), etc. The relative degrees between the medical devices can include the difference in degrees between the longitudinal axis of one medical device and the longitudinal axis of another medical device or some other axis of the medical devices. The emplacement measurement display area 1206 can include any one of or any combination of the distances and/or degrees described herein.
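The following sketch illustrates, under simplifying assumptions (straight-line device axes, hypothetical function names, NumPy vectors standing in for emplacement information), how the kinds of relative measurements described above could be computed:

```python
import numpy as np

def emplacement_measurements(tip_a, axis_a, tip_b, axis_b, plane_normal):
    """Compute examples of the relative measurements the display area can show:
    tip-to-tip distance, the acute angle between longitudinal axes, and the
    vertical and horizontal distances of device B's tip relative to the
    foundational tip plane passing through tip A."""
    tip_distance = float(np.linalg.norm(tip_b - tip_a))
    a = axis_a / np.linalg.norm(axis_a)
    b = axis_b / np.linalg.norm(axis_b)
    angle_deg = float(np.degrees(np.arccos(np.clip(abs(np.dot(a, b)), -1.0, 1.0))))
    n = plane_normal / np.linalg.norm(plane_normal)
    vertical = float(abs(np.dot(tip_b - tip_a, n)))      # distance to the tip plane
    horizontal = float(np.sqrt(max(tip_distance**2 - vertical**2, 0.0)))
    return {"tip_distance": tip_distance, "angle_deg": angle_deg,
            "vertical": vertical, "horizontal": horizontal}

print(emplacement_measurements(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                               np.array([2.0, 0.0, 0.5]), np.array([0.1, 0.0, 1.0]),
                               np.array([0.0, 0.0, 1.0])))
```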
In the illustrated embodiment of
In some embodiments, the text indicating the distance between the tips of a pair of medical devices can be drawn in a first color (e.g., white) if it satisfies a threshold distance (e.g., 2 cm), and a second color (e.g., red) if it does not. Similarly, the angles can be drawn in the first color if they meet a threshold angle (e.g., 10 degrees), and the second color if they do not. It will be understood that other color schemes can be used as well. In some embodiments, the medical device numbers can be included next to the distances and angles, and in certain embodiments, the number of the medical device can be color-coded similar to the image guidance cues discussed previously.
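A minimal sketch of the threshold-based color selection described above; the interpretation of "satisfies" as not exceeding the threshold, and the specific colors and values, are illustrative assumptions:

```python
def measurement_color(value, threshold, ok_color="white", warn_color="red"):
    """Pick a display color for a distance or angle: the first color when the
    value does not exceed the threshold, the second when it does (an assumed
    reading of "satisfies"; the colors and thresholds are illustrative)."""
    return ok_color if value <= threshold else warn_color

print(measurement_color(1.8, 2.0))    # within a 2 cm spacing threshold -> "white"
print(measurement_color(14.0, 10.0))  # exceeds a 10 degree threshold  -> "red"
```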
Relative Location Indicators
As illustrated in
In the illustrated embodiment of
Multiple Medical Device Guidance—Relative Spatial Relationships
When multiple medical devices are depicted in 2D or 3D, it can be difficult to determine when the medical devices are parallel. Similarly, it can be difficult to determine the relative angle between a longitudinal axis of the foundational medical device and the longitudinal axis of another medical device. It can also be difficult to determine the distance between various portions of the foundational medical device and corresponding portions of a secondary medical device.
Furthermore, in the illustrated embodiment of
In the illustrated embodiment, the guidance cues include relative spatial indicators 1250 and foundational plane indicators 1252, 1254. The relative spatial indicators 1250 indicate the distance between portions of a longitudinal axis of the secondary medical device 1216 and portions of a target axis or target plane. In some embodiments, the target axis can be the axis at which the secondary medical device 1216 is to be placed. The target axis can be based on user input, such as where the user wants the secondary medical device 1216 to be placed and/or dynamically based on the emplacement of the foundational medical device 1212. In certain embodiments, the target axis or target plane is parallel to the longitudinal axis of the foundational medical device 1212. In some embodiments the target plane is the image plane, described previously.
In some embodiments, the relative spatial indicators 1250 indicate the relative distance between portions of the secondary medical device 1216 and the longitudinal axis of the foundational medical device 1212. In certain embodiments, the relative spatial indicators 1250 can indicate the distance from portions of the secondary medical device 1216 to corresponding portions of the foundational medical device 1212.
As the secondary medical device 1216 is moved closer to, or farther away from, the target axis or target plane, the spatial indicators 1250 can shorten or lengthen, respectively. If the secondary medical device 1216 is placed at the target axis or target plane, the spatial indicators 1250 can disappear. In some embodiments, if the secondary medical device 1216 is parallel to the target axis or target plane, the spatial indicators 1250 can be equal in length. However, if the secondary medical device 1216 is angled with respect to the target axis, the spatial indicators 1250 can have different lengths.
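One possible way to compute relative spatial indicator lengths as point-to-axis distances is sketched below with hypothetical names and simplified geometry; equal lengths correspond to a secondary medical device that is parallel to the target axis:

```python
import numpy as np

def spatial_indicator_lengths(samples_on_secondary, target_point, target_axis):
    """For each sampled point along the secondary device's shaft, return its
    distance to the target axis. Equal lengths suggest the device is parallel
    to the target axis; unequal lengths suggest it is angled relative to it."""
    d = target_axis / np.linalg.norm(target_axis)
    lengths = []
    for p in samples_on_secondary:
        v = p - target_point
        perpendicular = v - np.dot(v, d) * d     # component perpendicular to the axis
        lengths.append(float(np.linalg.norm(perpendicular)))
    return lengths

# A secondary shaft sampled at three points, offset 1.5 units from a vertical target axis.
samples = [np.array([1.5, 0.0, z]) for z in (0.0, 2.0, 4.0)]
print(spatial_indicator_lengths(samples, np.array([0.0, 0.0, 0.0]),
                                np.array([0.0, 0.0, 1.0])))  # equal -> parallel
```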
In the illustrated embodiment of
In the illustrated embodiment of
The foundational plane indicators 1252, 1254 are similar to the foundational plane indicators 1228, 1230 described above with reference to
In some embodiments, the foundational plane indicators 1252, 1254 can indicate when the secondary medical device 1216 is parallel with the foundational medical device 1212 and/or indicate when the secondary medical device 1216 is level with the foundational medical device 1212. For example, the end of each foundational plane indicator 1252, 1254 can include a line that indicates at what angle the medical device is to be placed in order to be parallel to the target plane or target axis. In certain embodiments, a number indicating the relative difference in degrees can be provided as part of the foundational plane indicators 1252, 1254.
It will be understood that any one, or any combination, of the embodiments described above with respect to the foundational plane indicators of
Multiple Medical Device Guidance Routine
At block 1302, the system 100 receives emplacement information of a plurality of medical devices within a predetermined area. In some embodiments, the medical devices are invasive medical devices, such as ablation or biopsy needles, catheters, etc. As described previously, the medical devices, such as needles, can include tips, electrodes, and handles. In certain embodiments, the medical devices are non-invasive medical devices. In some embodiments, the medical devices are medical imaging devices, such as ultrasound transducers and/or laparoscopic cameras.
As described in greater detail above with reference to
At block 1304, the system 100 calculates a viewing angle in a virtual 3D space of a plurality of virtual medical devices. The virtual medical devices can correspond to the medical devices that are being tracked. Further, the viewing angle can be based at least on the emplacement information of the plurality of medical devices. In some embodiments, the system calculates a viewing angle for each of the medical devices. In certain embodiments, to calculate the viewing angle, the system determines the emplacement of the medical devices with respect to a perspective view.
As mentioned previously with reference to
At block 1306, the system 100 causes a display device to display the plurality of virtual medical devices based at least on the calculated viewing angle(s) in the virtual 3D space. Based on the calculated viewing angle(s) and the perspective view, the system 100 can cause the display to display the medical devices with respect to one another. As the system 100 can calculate the viewing angle for each virtual medical device separately, each virtual medical device can be displayed at a different angle with respect to the perspective view. As mentioned previously, the perspective view can be based on the location of the healthcare provider, the location of an image in the virtual 3D space, the location and number of medical devices, etc.
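As a rough sketch only (a generic look-at style transform with hypothetical names, not the system's renderer), a tracked device's emplacement could be expressed in the frame of the perspective view as follows:

```python
import numpy as np

def device_in_view(device_tip, device_axis, camera_position, camera_forward, camera_up):
    """Express a tracked device's tip and longitudinal axis in the coordinate
    frame of the perspective view, so it can be rendered in the virtual 3D space
    (a simplified look-at style transform, not the system's actual renderer)."""
    f = camera_forward / np.linalg.norm(camera_forward)
    r = np.cross(f, camera_up); r /= np.linalg.norm(r)
    u = np.cross(r, f)
    rotation = np.vstack([r, u, -f])             # world -> view rotation
    tip_view = rotation @ (device_tip - camera_position)
    axis_view = rotation @ (device_axis / np.linalg.norm(device_axis))
    return tip_view, axis_view

tip, axis = device_in_view(np.array([0.0, 0.0, -10.0]), np.array([1.0, 0.0, 0.0]),
                           np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0]),
                           np.array([0.0, 1.0, 0.0]))
print(tip, axis)
```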
As the position and orientation of the medical devices change, the system 100 can display the change with respect to the perspective view. For example, if the longitudinal axis of a virtual medical device is displayed as being left to right on the display and the medical device is rotated 90 degrees around the z-axis, the system 100 will change the display to show the longitudinal axis of the virtual medical device as being from the foreground to the background (or front to back).
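The 90-degree rotation example above can be illustrated with a standard z-axis rotation matrix (illustrative only):

```python
import numpy as np

def rotate_about_z(direction, degrees):
    """Rotate a longitudinal-axis direction about the z-axis, illustrating how a
    left-to-right device (along x) maps to front-to-back (along y) after 90 degrees."""
    theta = np.radians(degrees)
    rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
    return rz @ direction

print(rotate_about_z(np.array([1.0, 0.0, 0.0]), 90))  # approximately [0, 1, 0]
```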
Additional, fewer, or different blocks can be used to implement the routine 1300 without departing from the spirit and scope of the description. For example, any one or a combination of blocks 1308-1322 can be used as part of routine 1300.
At block 1308, the system 100 receives emplacement information of a medical imaging device within the predetermined area. As described previously, the medical imaging device can be an ultrasound transducer, laparoscopic camera, etc. Similar to the receipt of emplacement information described above with reference to block 1302, the system 100 can receive emplacement information of the medical imaging device.
At block 1310, the system 100 receives image data based at least on the emplacement information of the medical imaging device. In some embodiments, the system receives one or more images from the medical imaging device. For example, the medical imaging device can be an ultrasound transducer and can provide one or more ultrasound images to the system 100. As described in greater detail above with reference to
At block 1312, the system 100 calculates a viewing angle in the virtual 3D space of the image data (e.g., one or more images) based at least on the emplacement information of the medical imaging device. As described previously with reference to block 1304, the system can calculate viewing angles in the virtual 3D space based on the emplacement information of the medical imaging device with respect to the perspective view. Using this information, the system 100 can calculate a viewing angle for image data in the virtual 3D space.
At block 1314, the system 100 causes the display device to display the image data based at least on the calculated viewing angle in the virtual 3D space. As described previously with reference to block 1306, the system can cause a display device to display images based on the calculated viewing angle. Similarly, the system 100 can cause the display device to display the image data (e.g., one or more images).
At block 1316, the system 100 identifies a foundational medical device. As described in greater detail above with reference to
At block 1318, the system 100 determines one or more foundational planes and displays one or more foundational plane indicators, as described in greater detail above with reference to
At block 1320, the system 100 determines one or more foundational plane intersections and displays one or more foundational plane intersection indicators. As described in greater detail above with reference to
At block 1322, the system 100 determines and displays one or more distances between the medical devices. As described in greater detail above with reference to
Furthermore, additional blocks can be used as part of the routine 1300. As described in greater detail above with reference to
In addition, as described above with reference to
Medical Device Guidance—Planning Mode
The guidance cues in the illustrated embodiment include the center of the tumor marker 1412, which can be manually placed by a user, and tumor boundary markers 1410, which indicate the extent or boundary of the tumor in two perpendicular planes. As described in greater detail above with reference to
The guidance cues in the illustrated embodiment also include draw bars 1408A-1408D, which can be generated by the system after the user has marked the center of the tumor (or other point in the tumor) and the boundary of the tumor. The system can calculate the location of the draw bars based on the suggested placement of a medical device, the length of the exposed electrode of the medical device and/or different locations within the exposed electrode of the medical device (e.g., tip, center, and end).
In the illustrated embodiment, the draw bars include draw bar 1408A between the end of the exposed electrode of the virtual medical device 1406 (when placed at the placement suggestion 1404A) and the image 1402, draw bar 1408B between the center of the exposed electrode of the medical device 1406 (when placed at the placement suggestion 1404A) and the center of the tumor marker 1412, draw bar 1408C between the tip of the virtual medical device 1406 (when placed at the placement suggestion 1404A) and the image 1402, and draw bar 1408D parallel to the exposed electrode of the virtual medical device (when placed at the placement suggestion 1404A), but running through the previously marked center of the tumor.
The draw bars 1408A-1408D can show the user 1) if the exposed electrode is long enough to cover the extent/boundary of the tumor, 2) if the exposed electrode is centered with respect to the tumor, and 3) the distance between the tumor-center and the electrode. This distance can also be displayed numerically on the screen (not shown).
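A simplified sketch (hypothetical names, idealized straight-line geometry) of the checks the draw bars convey, namely coverage, centering, and the distance from the tumor center to the electrode axis:

```python
import numpy as np

def electrode_coverage(electrode_tip, electrode_end, tumor_center, tumor_radius):
    """Evaluate what the draw bars convey: whether the exposed electrode spans the
    tumor boundary along its axis, how far its midpoint is from the tumor center,
    and the perpendicular distance from the tumor center to the electrode axis."""
    axis = electrode_end - electrode_tip
    length = np.linalg.norm(axis)
    d = axis / length
    t = np.dot(tumor_center - electrode_tip, d)       # tumor center projected onto the axis
    covers = (t - tumor_radius >= 0.0) and (t + tumor_radius <= length)
    centered_offset = abs(t - length / 2.0)
    perpendicular = np.linalg.norm((tumor_center - electrode_tip) - t * d)
    return {"covers_tumor": covers,
            "offset_from_center": float(centered_offset),
            "distance_to_axis": float(perpendicular)}

print(electrode_coverage(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 3.0]),
                         np.array([0.3, 0.0, 1.5]), 1.0))
```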
In addition, the system can display placement suggestions 1404A, 1404B for the medical devices. The system can generate the placement suggestions 1404A, 1404B and the number of placement suggestions based on: 1) the distance between the first medical device (after placement) and the tumor-center, 2) the length of the exposed electrodes of the medical devices, 3) a model of the ablation parameters of the medical device (e.g. a lookup table), 4) tumor size, etc. In the illustrated example, the system suggests a configuration of two medical devices.
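By way of illustration only, a placeholder lookup-table model for suggesting the number of medical devices might look like the following; the thresholds are invented for the example and are not the ablation model referenced above:

```python
def suggest_device_count(tumor_diameter_cm, exposed_electrode_cm):
    """Illustrative lookup of how many medical devices to suggest for a tumor of a
    given size; the table entries and the electrode-length check are placeholders,
    not the system's actual ablation parameter model."""
    ablation_model = [          # (max tumor diameter in cm, suggested device count)
        (1.5, 1),
        (3.0, 2),
        (4.5, 3),
    ]
    for max_diameter, count in ablation_model:
        if tumor_diameter_cm <= max_diameter and exposed_electrode_cm >= max_diameter / 2.0:
            return count
    return None   # outside the modeled range; no suggestion generated

print(suggest_device_count(2.4, 2.0))   # -> 2, matching a two-device configuration
```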
In the illustrated embodiment, the placement suggestions 1404A, 1404B are illustrated as faded virtual medical devices; however, it will be understood that other methods can be used to provide the placement suggestions. For example, the placement suggestions 1404A, 1404B can be illustrated as lines, bars, cylinders, letters (e.g., X's), etc.
Furthermore, when the medical device 1406 is close to, or enters, the predetermined area, the system can generate the virtual medical device 1406 and the user can guide the virtual medical device 1406 to the placement suggestion 1404A. The system's proposed configuration can be repeatedly updated as the user manipulates the first medical device (still outside the patient), until she accepts the proposed position.
The second virtual medical device 1414 is in a vertical orientation, and the user manipulates the second medical device corresponding to the second virtual medical device 1414 (e.g., while it is outside the patient) such that the intersection indicator 1416 (described in
Medical Device Guidance—Planning Routine
At block 1502, the system 100 receives emplacement information associated with a target region. The emplacement information associated with the target region can be based on a marker or annotation placed by a healthcare provider as described earlier with reference to
At block 1504, the system 100 determines a first position relative to the target region for a first medical device based at least on the emplacement information of the target region. As described in greater detail above with reference to
At block 1506, the system 100 determines a second position relative to the target region for a second medical device. As mentioned previously the second position can be based on the emplacement information of the target region, the emplacement information of the first position, and/or any one or any combination of the parameters described above with reference to
At block 1508, the system 100 causes a display device to display the target region, as described in greater detail above with reference to
At block 1510, the system 100 causes the display device to display a first position indicator. The system can cause the display device to display the first position indicator at the first position. As discussed previously with reference to
As described in greater detail above with reference to
Additional, fewer, or different blocks can be used to implement the routine 1500 without departing from the spirit and scope of the description. For example, any one or a combination of blocks 1516-1520 can be used as part of routine 1500.
Similar to blocks 1510-1514 described previously, the system can display a second position indicator, receive emplacement information for the second medical device and display a virtual second medical device, as illustrated at blocks 1516-1520, respectively.
Furthermore, in some embodiments and as described in greater detail above with reference to
Previous Emplacement of a Medical Device
Referring back to
Typically, when a medical device is removed (or the associated tracking unit is removed), the corresponding virtual medical device is not shown on the display. However, in some embodiments, when a medical device is removed (or when a tracking unit associated with the medical device is removed), the system can display an altered image 1404B (e.g., a faded image) of the virtual medical device.
The system can use the emplacement information received at the first time to determine the emplacement of the altered image 1404B. In certain embodiments, the system displays the altered image 1404B at the location of the virtual medical device at the first time. Over time, as the location of the previously removed (or no longer tracked) medical device becomes less reliable (e.g., due to normal organ movement, etc.), the system can continue to alter the altered image 1404B, until it is removed. For example, the system can make the altered image 1404B more transparent over time (e.g., more faded).
In the interim, a user can use the altered image 1404B to place a second medical device 1406. For example, this can be done when a second biopsy is taken. The system can provide the user with the guidance cues described above with reference to
Previous Emplacement of a Medical Device Routine
At block 1602, the system 100 receives emplacement information of a tracking unit associated with a medical device at a first time. As described in greater detail above with reference to block 1302 of
At block 1604, the system 100 calculates a viewing angle of a virtual medical device in a virtual 3D space based at least on the emplacement information. As described in greater detail above with reference to block 1302 of
At block 1606, the system 100 causes a display device to display the virtual medical device based at least on the calculated viewing angle. As described in greater detail above with reference to block 1302 of
At block 1608, the system 100 determines that the tracking unit associated with the medical device has been removed from the predetermined area. As the system 100 receives emplacement information from the tracking unit, it can determine the location of the tracking unit and associated medical device. Further, it can determine when the tracking unit has left the predetermined area, such as the area corresponding to the virtual 3D space displayed by the display. To determine that the tracking unit has left the predetermined area, the system 100 can compare the emplacement information of the tracking unit with emplacement information of the predetermined area. When the tracking unit is outside the predetermined area, the system 100 can determine that the tracking unit has been removed from the predetermined area.
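A minimal sketch of the containment test described above, assuming (purely for illustration) that the predetermined area is modeled as an axis-aligned box:

```python
def inside_predetermined_area(position, area_min, area_max):
    """Compare a tracking unit's reported position with the bounds of the
    predetermined area (modeled here, as an assumption, as an axis-aligned box);
    a False result is treated as removal of the tracking unit from the area."""
    return all(lo <= p <= hi for p, lo, hi in zip(position, area_min, area_max))

# Example: a tracking unit just outside the tracked volume.
print(inside_predetermined_area((0.0, 0.0, 45.0),
                                (-20.0, -20.0, 0.0), (20.0, 20.0, 40.0)))  # False
```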
In some embodiments, the removal of the tracking unit from the predetermined area corresponds to the removal of the medical device from the predetermined area. In certain embodiments, the removal of the tracking unit from the predetermined area does not correspond to the removal of the medical device from the predetermined area. As described in greater detail above with reference to
At block 1610, the system 100 causes the display device to display an altered image of the virtual medical device based at least on the emplacement information received at the first time. Upon determining that the tracking unit has left the predetermined area, the system 100 can cause the display device to display an altered image of the virtual medical device.
Typically, when a medical device is removed from the predetermined area, the corresponding virtual medical device is no longer displayed by the system 100. However, in some cases it can be useful to identify where medical devices were located at a first time, such as during previous medical procedures (e.g., ablations, biopsies, etc.). Accordingly, in some embodiments, the system 100 can use the emplacement information received at the first time to display an altered image of the medical device on the display. In some embodiments, the virtual medical device is displayed at its previous location from the first time. In certain embodiments, the system 100 can cause the virtual medical device to be faded or grayed out to indicate that it is no longer present.
Similarly, in cases where a tracking unit is removed from one medical device (or stops working), it can be useful to retain an image of the virtual medical device on the display. Accordingly, in some embodiments, the system 100 can use the emplacement information received at the first time to display an altered image of the medical device on the display. In some embodiments, the altered image of the virtual medical device is displayed at its last known location (based on the first time). In certain embodiments, the system 100 can cause the virtual medical device to be faded or grayed out to indicate that its location is no longer being tracked and/or may be unreliable.
Additional, fewer, or different blocks can be used to implement the routine 1600 without departing from the spirit and scope of the description. For example, the system can omit blocks 1604 and 1606 and determine a previous placement of a medical device based on emplacement information received previously. Using that information, the system can cause the display device to display an altered image of the virtual medical device or can cause the display device to display the virtual medical device. Further, any one or a combination of blocks 1612-1620 can be used as part of routine 1600.
At block 1612, the system 100 further alters the image over time. Over time the reliability of the location of the altered virtual medical device will decrease. This can be due to normal organ movement of the patient or movement due to the healthcare provider. Accordingly, based at least on an amount of time that has passed since the first time, the system can further alter (e.g., fade or gray out) the altered virtual medical device. Thus, the altered virtual medical device can continue to fade until it is removed from the display (e.g., the system ceases to display the altered image), as illustrated at block 1614.
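As an illustrative sketch of the time-based fading described above (the linear fade and 60-second duration are assumptions, not values from the disclosure):

```python
def altered_image_alpha(seconds_since_first_time, fade_seconds=60.0):
    """Opacity of the altered (faded) virtual medical device as a linear function
    of time since the first time; the 60-second fade-out is an illustrative value.
    Returning None signals that the altered image should be removed from the display."""
    alpha = 1.0 - seconds_since_first_time / fade_seconds
    return None if alpha <= 0.0 else alpha

print(altered_image_alpha(15.0))   # still fairly opaque
print(altered_image_alpha(75.0))   # faded out entirely -> None
```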
At block 1616, the system 100 can receive emplacement information of a tracking unit associated with a second medical device after causing the display device to display the altered image. In some embodiments, the second medical device is the same as the first medical device. For example, after the medical device has been removed from the predetermined area, the healthcare provider may re-introduce it at another location for another medical procedure. Accordingly, the system 100 can receive the new emplacement information of the tracking unit associated with the medical device. Based on the new emplacement information, the system 100 can calculate the new viewing angle of the virtual medical device in the virtual 3D space and cause the display to display it. However, as mentioned previously, the system 100 can retain the altered image as a reference for the healthcare provider.
Similarly, the system can receive emplacement information of a tracking unit (the same or different tracking unit from the one mentioned above with reference to block 1616) associated with a second medical device after causing the display device to display the altered image, as illustrated at block 1618. The second medical device can be a different medical device with its own tracking unit or a different medical device using the same tracking unit as the first medical device. In either instance, the system 100 can receive the emplacement information of the second medical device (via the emplacement information of the tracking unit), calculate the viewing angle of the second virtual medical device in the virtual 3D space, and cause the display to display it. As mentioned previously, the system 100 can retain the altered image of the first virtual medical device on the display as a reference.
At block 1620, the system 100 determines and displays one or more guidance cues associated with the altered image of the first virtual medical device, the first virtual medical device (based on new emplacement information), and/or the second virtual medical device. The guidance cues can be any one or more of the guidance cues described above with reference to
Various example embodiments of the disclosure can be described in view of the following clauses:
Those having skill in the art will further appreciate that the various illustrative logical blocks, modules, circuits, and process steps described in connection with the implementations disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. One skilled in the art will recognize that a portion, or a part, can comprise something less than, or equal to, a whole. For example, a portion of a collection of pixels can refer to a sub-collection of those pixels.
The various illustrative logical blocks, modules, and circuits described in connection with the implementations disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or process described in connection with the implementations disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory storage medium known in the art. An exemplary computer-readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the computer-readable storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal, camera, or other device. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal, camera, or other device.
Headings are included herein for reference and to aid in locating various sections. These headings are not intended to limit the scope of the concepts described with respect thereto. Such concepts can have applicability throughout the entire specification.
The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein can be applied to other implementations without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The present application claims priority benefit to U.S. Provisional Application Nos. 61/592,531, filed Jan. 30, 2012 and 61/736,789 filed Dec. 13, 2012, each of which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
3556079 | Omizo | Jan 1971 | A |
4058114 | Soldner | Nov 1977 | A |
RE30397 | King | Sep 1980 | E |
4249539 | Vilkomerson et al. | Feb 1981 | A |
4294544 | Altschuler et al. | Oct 1981 | A |
4390025 | Takemura et al. | Jun 1983 | A |
4407294 | Vilkomerson | Oct 1983 | A |
4431006 | Trimmer et al. | Feb 1984 | A |
4567896 | Barnea et al. | Feb 1986 | A |
4583538 | Onik et al. | Apr 1986 | A |
4620546 | Aida et al. | Nov 1986 | A |
4671292 | Matzuk | Jun 1987 | A |
4839836 | Fonsalas | Jun 1989 | A |
4862873 | Yajima et al. | Sep 1989 | A |
4884219 | Waldren | Nov 1989 | A |
4899756 | Sonek | Feb 1990 | A |
4911173 | Terwillige | Mar 1990 | A |
4945305 | Blood | Jul 1990 | A |
5076279 | Arenson et al. | Dec 1991 | A |
5078140 | Kwoh | Jan 1992 | A |
5078142 | Siczek et al. | Jan 1992 | A |
5095910 | Powers | Mar 1992 | A |
5109276 | Nudelman et al. | Apr 1992 | A |
5158088 | Nelson et al. | Oct 1992 | A |
5161536 | Vilkomerson et al. | Nov 1992 | A |
5193120 | Gamache et al. | Mar 1993 | A |
5209235 | Brisken et al. | May 1993 | A |
5249581 | Horbal et al. | Oct 1993 | A |
5251127 | Raab | Oct 1993 | A |
5261404 | Mick et al. | Nov 1993 | A |
5265610 | Darrow et al. | Nov 1993 | A |
5271400 | Dumoulin et al. | Dec 1993 | A |
5307153 | Maruyama et al. | Apr 1994 | A |
5309913 | Kormos et al. | May 1994 | A |
5323002 | Sampsell et al. | Jun 1994 | A |
5371543 | Anderson | Dec 1994 | A |
5383454 | Bucholz | Jan 1995 | A |
5394875 | Lewis et al. | Mar 1995 | A |
5411026 | Carol | May 1995 | A |
5433198 | Desai | Jul 1995 | A |
5433739 | Sluijter et al. | Jul 1995 | A |
5443489 | Ben-Haim | Aug 1995 | A |
5446798 | Morita et al. | Aug 1995 | A |
5447154 | Cinquin et al. | Sep 1995 | A |
5452024 | Sampsell | Sep 1995 | A |
5457493 | Leddy et al. | Oct 1995 | A |
5474073 | Schwartz et al. | Dec 1995 | A |
5476096 | Olstad et al. | Dec 1995 | A |
5483961 | Kelly et al. | Jan 1996 | A |
5488431 | Gove et al. | Jan 1996 | A |
5489952 | Gove et al. | Feb 1996 | A |
5491510 | Gove | Feb 1996 | A |
5494039 | Onik et al. | Feb 1996 | A |
5503152 | Oakley et al. | Apr 1996 | A |
5505204 | Picot et al. | Apr 1996 | A |
5515856 | Olstad et al. | May 1996 | A |
5517990 | Kalfas et al. | May 1996 | A |
5526051 | Gove et al. | Jun 1996 | A |
5526812 | Dumoulin et al. | Jun 1996 | A |
5529070 | Augustine et al. | Jun 1996 | A |
5531227 | Schneider | Jul 1996 | A |
5532997 | Pauli | Jul 1996 | A |
5541723 | Tanaka | Jul 1996 | A |
5558091 | Acker et al. | Sep 1996 | A |
5568811 | Olstad | Oct 1996 | A |
5570135 | Gove et al. | Oct 1996 | A |
5579026 | Tabata | Nov 1996 | A |
5588948 | Takahashi et al. | Dec 1996 | A |
5608468 | Gove et al. | Mar 1997 | A |
5608849 | King, Jr. | Mar 1997 | A |
5611345 | Hibbeln | Mar 1997 | A |
5611353 | Dance et al. | Mar 1997 | A |
5612753 | Poradish et al. | Mar 1997 | A |
5625408 | Matsugu et al. | Apr 1997 | A |
5628327 | Unger et al. | May 1997 | A |
5629794 | Magel et al. | May 1997 | A |
5630027 | Venkateswar et al. | May 1997 | A |
5647361 | Damadian | Jul 1997 | A |
5647373 | Paltieli | Jul 1997 | A |
5660185 | Shmulewitz et al. | Aug 1997 | A |
5662111 | Cosman | Sep 1997 | A |
5699444 | Palm | Dec 1997 | A |
5701898 | Adam et al. | Dec 1997 | A |
5701900 | Shehada et al. | Dec 1997 | A |
5726670 | Tabata et al. | Mar 1998 | A |
5728044 | Shan | Mar 1998 | A |
5758650 | Miller et al. | Jun 1998 | A |
5766135 | Terwilliger | Jun 1998 | A |
5784098 | Shoji et al. | Jul 1998 | A |
5792147 | Evans et al. | Aug 1998 | A |
5793701 | Wright et al. | Aug 1998 | A |
5797849 | Vesely et al. | Aug 1998 | A |
5807395 | Mulier et al. | Sep 1998 | A |
5810008 | Dekel et al. | Sep 1998 | A |
5817022 | Vesely | Oct 1998 | A |
5820554 | Davis et al. | Oct 1998 | A |
5820561 | Olstad et al. | Oct 1998 | A |
5829439 | Yokosawa et al. | Nov 1998 | A |
5829444 | Ferre et al. | Nov 1998 | A |
5851183 | Bucholz | Dec 1998 | A |
5870136 | Fuchs et al. | Feb 1999 | A |
5891034 | Bucholz | Apr 1999 | A |
5920395 | Schulz | Jul 1999 | A |
5961527 | Whitmore, III et al. | Oct 1999 | A |
5967980 | Ferre et al. | Oct 1999 | A |
6016439 | Acker | Jan 2000 | A |
6019724 | Gronningsaeter et al. | Feb 2000 | A |
6048312 | Ishrak et al. | Apr 2000 | A |
6064749 | Hirota et al. | May 2000 | A |
6095982 | Richards-Kortum et al. | Aug 2000 | A |
6099471 | Torp et al. | Aug 2000 | A |
6108130 | Raj | Aug 2000 | A |
6122538 | Sliwa, Jr. et al. | Sep 2000 | A |
6122541 | Cosman et al. | Sep 2000 | A |
6167296 | Shahidi | Dec 2000 | A |
RE37088 | Olstad et al. | Mar 2001 | E |
6216029 | Paltieli | Apr 2001 | B1 |
6241725 | Cosman | Jun 2001 | B1 |
6245017 | Hashimoto et al. | Jun 2001 | B1 |
6246898 | Vesely et al. | Jun 2001 | B1 |
6248101 | Whitmore, III et al. | Jun 2001 | B1 |
6261234 | Lin | Jul 2001 | B1 |
6341016 | Malione | Jan 2002 | B1 |
6348058 | Melken et al. | Feb 2002 | B1 |
6350238 | Olstad et al. | Feb 2002 | B1 |
6352507 | Torp et al. | Mar 2002 | B1 |
6379302 | Kessman et al. | Apr 2002 | B1 |
6385475 | Cinquin et al. | May 2002 | B1 |
6442417 | Shahidi et al. | Aug 2002 | B1 |
6447450 | Olstad | Sep 2002 | B1 |
6456868 | Saito et al. | Sep 2002 | B2 |
6470207 | Simon et al. | Oct 2002 | B1 |
6471366 | Hughson et al. | Oct 2002 | B1 |
6477400 | Barrick | Nov 2002 | B1 |
6478793 | Cosman et al. | Nov 2002 | B1 |
6503195 | Keller et al. | Jan 2003 | B1 |
6511418 | Shahidi et al. | Jan 2003 | B2 |
6517485 | Torp et al. | Feb 2003 | B2 |
6518939 | Kikuchi | Feb 2003 | B1 |
6527443 | Vilsmeier | Mar 2003 | B1 |
6529758 | Shahidi | Mar 2003 | B2 |
6537217 | Bjærum et al. | Mar 2003 | B1 |
6545706 | Edwards et al. | Apr 2003 | B1 |
6546279 | Bova et al. | Apr 2003 | B1 |
6551325 | Neubauer et al. | Apr 2003 | B2 |
6570566 | Yoshigahara | May 2003 | B1 |
6579240 | Bjaerum et al. | Jun 2003 | B2 |
6587711 | Alfano et al. | Jul 2003 | B1 |
6591130 | Shahidi | Jul 2003 | B2 |
6592522 | Bjaerum et al. | Jul 2003 | B2 |
6594517 | Nevo | Jul 2003 | B1 |
6597818 | Kumar et al. | Jul 2003 | B2 |
6604404 | Paltieli et al. | Aug 2003 | B2 |
6616610 | Steininger et al. | Sep 2003 | B2 |
6626832 | Paltieli et al. | Sep 2003 | B1 |
6652462 | Bjaerum et al. | Nov 2003 | B2 |
6669635 | Kessman et al. | Dec 2003 | B2 |
6676599 | Torp et al. | Jan 2004 | B2 |
6689067 | Sauer et al. | Feb 2004 | B2 |
6695786 | Wang et al. | Feb 2004 | B2 |
6711429 | Gilboa et al. | Mar 2004 | B1 |
6725082 | Sati et al. | Apr 2004 | B2 |
6733458 | Steins et al. | May 2004 | B1 |
6764449 | Lee et al. | Jul 2004 | B2 |
6766184 | Utzinger et al. | Jul 2004 | B2 |
6768496 | Bieger et al. | Jul 2004 | B2 |
6775404 | Pagoulatos et al. | Aug 2004 | B1 |
6782287 | Grzeszczuk et al. | Aug 2004 | B2 |
6783524 | Anderson et al. | Aug 2004 | B2 |
6827723 | Carson | Dec 2004 | B2 |
6863655 | Bjaerum et al. | Mar 2005 | B2 |
6873867 | Vilsmeier | Mar 2005 | B2 |
6875179 | Ferguson et al. | Apr 2005 | B2 |
6881214 | Cosman et al. | Apr 2005 | B2 |
6895268 | Rahn et al. | May 2005 | B1 |
6915150 | Cinquin et al. | Jul 2005 | B2 |
6917827 | Kienzle, III | Jul 2005 | B2 |
6923817 | Carson et al. | Aug 2005 | B2 |
6936048 | Hurst | Aug 2005 | B2 |
6947783 | Immerz | Sep 2005 | B2 |
6968224 | Kessman et al. | Nov 2005 | B2 |
6978167 | Dekel et al. | Dec 2005 | B2 |
7008373 | Stoianovici et al. | Mar 2006 | B2 |
7033360 | Cinquin et al. | Apr 2006 | B2 |
7072707 | Galloway, Jr. et al. | Jul 2006 | B2 |
7077807 | Torp et al. | Jul 2006 | B2 |
7093012 | Olstad et al. | Aug 2006 | B2 |
7110013 | Ebersole et al. | Sep 2006 | B2 |
7209776 | Leitner | Apr 2007 | B2 |
7245746 | Bjaerum et al. | Jul 2007 | B2 |
7248232 | Yamazaki et al. | Jul 2007 | B1 |
7261694 | Torp et al. | Aug 2007 | B2 |
7313430 | Urquhart et al. | Dec 2007 | B2 |
7331932 | Leitner | Feb 2008 | B2 |
7351205 | Szczech et al. | Apr 2008 | B2 |
7379769 | Piron et al. | May 2008 | B2 |
7385708 | Ackerman et al. | Jun 2008 | B2 |
7392076 | Moctezuma de La Barrera | Jun 2008 | B2 |
7398116 | Edwards | Jul 2008 | B2 |
7466303 | Yi et al. | Dec 2008 | B2 |
7480533 | Cosman et al. | Jan 2009 | B2 |
7505809 | Strommer et al. | Mar 2009 | B2 |
7588541 | Floyd et al. | Sep 2009 | B2 |
7662128 | Salcudean et al. | Feb 2010 | B2 |
7678052 | Torp et al. | Mar 2010 | B2 |
7728868 | Razzaque et al. | Jun 2010 | B2 |
7798965 | Torp et al. | Sep 2010 | B2 |
7833168 | Taylor et al. | Nov 2010 | B2 |
7833221 | Voegele et al. | Nov 2010 | B2 |
7846103 | Cannon, Jr. et al. | Dec 2010 | B2 |
7876942 | Gilboa | Jan 2011 | B2 |
7889905 | Higgins et al. | Feb 2011 | B2 |
7912849 | Ohrn et al. | Mar 2011 | B2 |
7920909 | Lyon et al. | Apr 2011 | B2 |
7962193 | Edwards et al. | Jun 2011 | B2 |
7976469 | Bonde et al. | Jul 2011 | B2 |
8023712 | Ikuma et al. | Sep 2011 | B2 |
8038631 | Sanghvi et al. | Oct 2011 | B1 |
8041413 | Barbagli et al. | Oct 2011 | B2 |
8050736 | Piron et al. | Nov 2011 | B2 |
8052636 | Moll et al. | Nov 2011 | B2 |
8066644 | Sarkar et al. | Nov 2011 | B2 |
8073528 | Zhao et al. | Dec 2011 | B2 |
8086298 | Whitmore, III et al. | Dec 2011 | B2 |
8135669 | Olstad et al. | Mar 2012 | B2 |
8137281 | Huang et al. | Mar 2012 | B2 |
8147408 | Bunce et al. | Apr 2012 | B2 |
8152724 | Ridley et al. | Apr 2012 | B2 |
8216149 | Oonuki et al. | Jul 2012 | B2 |
8221322 | Wang et al. | Jul 2012 | B2 |
8228028 | Schneider | Jul 2012 | B2 |
8257264 | Park et al. | Sep 2012 | B2 |
8296797 | Olstad et al. | Oct 2012 | B2 |
8340379 | Razzaque et al. | Dec 2012 | B2 |
8350902 | Razzaque et al. | Jan 2013 | B2 |
8482606 | Razzaque et al. | Jul 2013 | B2 |
20010007919 | Shahidi | Jul 2001 | A1 |
20010016804 | Cunningham et al. | Aug 2001 | A1 |
20010045979 | Matsumoto et al. | Nov 2001 | A1 |
20020010384 | Shahidi et al. | Jan 2002 | A1 |
20020032772 | Olstad et al. | Mar 2002 | A1 |
20020049375 | Strommer et al. | Apr 2002 | A1 |
20020077540 | Kienzie, III | Jun 2002 | A1 |
20020077543 | Grzeszczuk et al. | Jun 2002 | A1 |
20020135673 | Favalora et al. | Sep 2002 | A1 |
20020138008 | Tsujita et al. | Sep 2002 | A1 |
20020140814 | Cohen-Solal et al. | Oct 2002 | A1 |
20020156375 | Kessmam et al. | Oct 2002 | A1 |
20020198451 | Carson | Dec 2002 | A1 |
20030040743 | Cosman et al. | Feb 2003 | A1 |
20030073901 | Simon et al. | Apr 2003 | A1 |
20030135119 | Lee et al. | Jul 2003 | A1 |
20030163142 | Paltieli et al. | Aug 2003 | A1 |
20030164172 | Chumas et al. | Sep 2003 | A1 |
20030231789 | Willis et al. | Dec 2003 | A1 |
20040034313 | Leitner | Feb 2004 | A1 |
20040078036 | Keidar | Apr 2004 | A1 |
20040095507 | Bishop et al. | May 2004 | A1 |
20040116810 | Olstad | Jun 2004 | A1 |
20040147920 | Keidar | Jul 2004 | A1 |
20040152970 | Hunter et al. | Aug 2004 | A1 |
20040181144 | Cinquin et al. | Sep 2004 | A1 |
20040215071 | Frank et al. | Oct 2004 | A1 |
20040238732 | State et al. | Dec 2004 | A1 |
20040243146 | Chesbrough et al. | Dec 2004 | A1 |
20040243148 | Wasielewski | Dec 2004 | A1 |
20040249281 | Olstad | Dec 2004 | A1 |
20040249282 | Olstad | Dec 2004 | A1 |
20040254454 | Kockro | Dec 2004 | A1 |
20050010098 | Frigstad et al. | Jan 2005 | A1 |
20050085717 | Shahidi | Apr 2005 | A1 |
20050085718 | Shahidi | Apr 2005 | A1 |
20050090742 | Mine et al. | Apr 2005 | A1 |
20050111733 | Fors et al. | May 2005 | A1 |
20050159641 | Kanai | Jul 2005 | A1 |
20050182316 | Burdette et al. | Aug 2005 | A1 |
20050192564 | Cosman et al. | Sep 2005 | A1 |
20050219552 | Ackerman et al. | Oct 2005 | A1 |
20050222574 | Giordano et al. | Oct 2005 | A1 |
20050251148 | Friedrich et al. | Nov 2005 | A1 |
20060004275 | Vija et al. | Jan 2006 | A1 |
20060020204 | Serra et al. | Jan 2006 | A1 |
20060036162 | Shahidi et al. | Feb 2006 | A1 |
20060052792 | Boettiger et al. | Mar 2006 | A1 |
20060058609 | Olstad | Mar 2006 | A1 |
20060058610 | Olstad | Mar 2006 | A1 |
20060058674 | Olstad | Mar 2006 | A1 |
20060058675 | Olstad | Mar 2006 | A1 |
20060089625 | Voegele et al. | Apr 2006 | A1 |
20060100505 | Viswanathan | May 2006 | A1 |
20060122495 | Kienzle, III | Jun 2006 | A1 |
20060184040 | Keller et al. | Aug 2006 | A1 |
20060193504 | Salgo et al. | Aug 2006 | A1 |
20060229594 | Francischelli et al. | Oct 2006 | A1 |
20060235290 | Gabriel et al. | Oct 2006 | A1 |
20060235538 | Rochetin et al. | Oct 2006 | A1 |
20060241450 | Da Silva et al. | Oct 2006 | A1 |
20060253030 | Altmann et al. | Nov 2006 | A1 |
20060253032 | Altmann et al. | Nov 2006 | A1 |
20060271056 | Terrill-Grisoni et al. | Nov 2006 | A1 |
20060282023 | Leitner | Dec 2006 | A1 |
20060293643 | Wallace et al. | Dec 2006 | A1 |
20070016035 | Hashimoto | Jan 2007 | A1 |
20070032906 | Sutherland et al. | Feb 2007 | A1 |
20070073155 | Park et al. | Mar 2007 | A1 |
20070073455 | Oyobe et al. | Mar 2007 | A1 |
20070078346 | Park et al. | Apr 2007 | A1 |
20070167699 | Lathuiliere et al. | Jul 2007 | A1 |
20070167701 | Sherman | Jul 2007 | A1 |
20070167705 | Chiang et al. | Jul 2007 | A1 |
20070167771 | Olstad | Jul 2007 | A1 |
20070167801 | Webler et al. | Jul 2007 | A1 |
20070225553 | Shahidi | Sep 2007 | A1 |
20070239281 | Gotte et al. | Oct 2007 | A1 |
20070244488 | Metzger et al. | Oct 2007 | A1 |
20070255136 | Kristofferson et al. | Nov 2007 | A1 |
20070270718 | Rochetin et al. | Nov 2007 | A1 |
20070276234 | Shahidi | Nov 2007 | A1 |
20080004481 | Bax et al. | Jan 2008 | A1 |
20080004516 | DiSilvestro et al. | Jan 2008 | A1 |
20080030578 | Razzaque et al. | Feb 2008 | A1 |
20080039723 | Suri et al. | Feb 2008 | A1 |
20080051910 | Kammerzell et al. | Feb 2008 | A1 |
20080091106 | Kim et al. | Apr 2008 | A1 |
20080114235 | Unal et al. | May 2008 | A1 |
20080161824 | McMillen | Jul 2008 | A1 |
20080200794 | Teichman et al. | Aug 2008 | A1 |
20080208031 | Kurpad et al. | Aug 2008 | A1 |
20080208081 | Murphy et al. | Aug 2008 | A1 |
20080214932 | Mollard et al. | Sep 2008 | A1 |
20080232679 | Hahn et al. | Sep 2008 | A1 |
20080287805 | Li | Nov 2008 | A1 |
20090024030 | Lachaine et al. | Jan 2009 | A1 |
20090118724 | Zvuloni et al. | May 2009 | A1 |
20090137907 | Takimoto et al. | May 2009 | A1 |
20090226069 | Razzaque et al. | Sep 2009 | A1 |
20090234369 | Bax et al. | Sep 2009 | A1 |
20090312629 | Razzaque et al. | Dec 2009 | A1 |
20100045783 | State et al. | Feb 2010 | A1 |
20100198045 | Razzaque et al. | Aug 2010 | A1 |
20100208963 | Kruecker et al. | Aug 2010 | A1 |
20100268067 | Razzaque et al. | Oct 2010 | A1 |
20100268072 | Hall et al. | Oct 2010 | A1 |
20100268085 | Kruecker et al. | Oct 2010 | A1 |
20100305448 | Dagonneau et al. | Dec 2010 | A1 |
20100312121 | Guan | Dec 2010 | A1 |
20110043612 | Keller et al. | Feb 2011 | A1 |
20110046483 | Fuchs et al. | Feb 2011 | A1 |
20110057930 | Keller | Mar 2011 | A1 |
20110082351 | Razzaque et al. | Apr 2011 | A1 |
20110130641 | Razzaque et al. | Jun 2011 | A1 |
20110137156 | Razzaque et al. | Jun 2011 | A1 |
20110201915 | Gogin et al. | Aug 2011 | A1 |
20110201976 | Sanghvi et al. | Aug 2011 | A1 |
20110237947 | Boctor et al. | Sep 2011 | A1 |
20110238043 | Kleven | Sep 2011 | A1 |
20110251483 | Razzaque et al. | Oct 2011 | A1 |
20110282188 | Burnside et al. | Nov 2011 | A1 |
20110288412 | Deckman et al. | Nov 2011 | A1 |
20110295108 | Cox et al. | Dec 2011 | A1 |
20110301451 | Rohling | Dec 2011 | A1 |
20120035473 | Sanghvi et al. | Feb 2012 | A1 |
20120059260 | Robinson | Mar 2012 | A1 |
20120071759 | Hagy et al. | Mar 2012 | A1 |
20120078094 | Nishina et al. | Mar 2012 | A1 |
20120101370 | Razzaque et al. | Apr 2012 | A1 |
20120108955 | Razzaque et al. | May 2012 | A1 |
20120143029 | Silverstein et al. | Jun 2012 | A1 |
20120143055 | Ng et al. | Jun 2012 | A1 |
20120165679 | Orome et al. | Jun 2012 | A1 |
20120259210 | Harhen et al. | Oct 2012 | A1 |
20130132374 | Olstad et al. | May 2013 | A1 |
20130151533 | Udupa et al. | Jun 2013 | A1 |
Number | Date | Country |
---|---|---|
7656896 | May 1997 | AU |
9453898 | Apr 1999 | AU |
1719601 | Jun 2001 | AU |
9036301 | Mar 2002 | AU |
2003297225 | Jul 2004 | AU |
2001290363 | Feb 2006 | AU |
0113882 | Jul 2003 | BR |
2420382 | Apr 2011 | CA |
69618482 | Aug 2002 | DE |
60126798 | Oct 2007 | DE |
0 427 358 | May 1991 | EP |
1955284 | Aug 2008 | EP |
S63-290550 | Nov 1988 | JP |
H07-116164 | May 1995 | JP |
2005-058584 | Mar 2005 | JP |
2005-323669 | Nov 2005 | JP |
2009-517177 | Apr 2009 | JP |
WO 9605768 | Feb 1996 | WO |
WO 9715249 | May 1997 | WO |
WO 9717014 | May 1997 | WO |
WO 9926534 | Jun 1999 | WO |
WO 01039683 | Jun 2001 | WO |
WO 03032837 | Apr 2003 | WO |
PCTUS200317987 | Dec 2003 | WO |
WO 2005010711 | Feb 2005 | WO |
WO 2007019216 | Feb 2007 | WO |
WO 2007067323 | Jun 2007 | WO |
WO 2007067323 | Sep 2007 | WO |
WO 2008017051 | Feb 2008 | WO |
WO 2009094646 | Jul 2009 | WO |
WO 2010057315 | May 2010 | WO |
WO 2010096419 | Aug 2010 | WO |
WO 2009063423 | Oct 2010 | WO |
WO 2011014687 | Feb 2011 | WO |
PCTUS2013023678 | Feb 2013 | WO |
Entry |
---|
U.S. Appl. No. 11/828,826, filed Jul. 26, 2007, Keller et al. |
“3D Laparoscope Technology,” http://www.inneroptic.com/tech—3DL.htm, copyright 2007 InnerOptic Technology, Inc. printed Sep. 19, 2007, 2 pages. |
“Cancer Facts & Figures 2004,” www.cancer.org/downloads/STT/CAFF—finalPWSecured.pdf, copyright 2004 American Cancer Society, Inc., printed Sep. 19, 2007, 60 pages. |
“David Laserscanner <-Latest News <- Institute for Robotics and Process Control <- Te . . . ,” http://www/rob.cs.tu-bs.de/en/news/david, printed Sep. 19, 2007, 1 page. |
“laser scanned 3d model Final” video, still image of video attached, http://www.youtube.com/watch?v+DaLgIgmoUf8, copyright 2007 YouTube, LLC, printed Sep. 19, 2007, 2 pages. |
“Olympus Endoscopic Ultrasound System,” www.olympusamerica.com/msg—section/download—brochures/135—b—gfum130.pdf, printed Sep. 20, 2007, 20 pages. |
“Point Grey Research Inc.—Imaging Products—Triclops SDK Samples,” http://www.ptgrey.com/products/triclopsSDK/samples.asp, copyright 2007 Point Grey Research Inc., printed Sep. 19, 2007, 1 page. |
“Robbins, Mike—Computer Vision Research—Stereo Depth Perception,” http://www.compumike.com/vision/stereodepth. php, copyright 2007 Michael F. Robbins, printed Sep. 19, 2007, 3 pages. |
“Rue, Registered Ultrasound-Endoscope,” copyright 2007 InnerOptic Technology, Inc., 2 pages. |
Advertisement, “Inspeck 3DC 3D Capturor,” Inspeck 3DC 3D Capturor (www.inspeck.com), 1998. |
Advertisement, “Virtual 3D High Speed Non-Contact Surface Perception,” Virtual 3-D Technologies Corporation (www.virtual3dtech.com)., Dec. 21, 1998. |
Advertisements, “Virtuoso,” Visual Interface, Inc. (www.visint.com), Dec. 21, 1998. |
Akka, “Automatic Software Control of Display Parameters for Stereoscopic Graphics Images,” SPIE vol. 1669: Stereoscopic Displays and Applications III, pp. 31-38 (1992). |
Ali et al., “Near Infrared Spectroscopy and Imaging to Probe Differences in Water Content in Normal and Cancer Human Prostate Tissues,” Technology in Cancer Research & Treatment; Oct. 2004; 3(5):491-497; Adenine Press. |
Aylward et al., Analysis of the Parameter Space of a Metric for Registering 3D Vascular Images, in W. Niessen and M. Viergever (Eds.): MICCAI 2001, LNCS 2208, pp. 932-939, 2001. |
Aylward et al., Registration and Analysis of Vascular Images, International Journal of Computer Vision 55(2/3), 123-138, 2003. |
Aylward, et al., Intra-Operative 3D Ultrasound Augmentation, Proceedings of the IEEE International Symposium on Biomedical Imaging, Washington, Jul. 2002. |
Azuma, “A Survey of Augmented Reality,” Presence: Teleoperators and Virtual Environments 6, 4:1-48 (Aug. 1997). |
Bajura, Michael et al.,, “Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient,” Computer Graphics, Proceedings of SIGGRAPH 1992, vol. 26(2), pp. 203-210, available from www.cs.unc.edu/—fuchs/publications/MergVirtObjs92.pdf, printed Sep. 20, 2007, 8 pages. |
Benavides et al., “Multispectral digital colposcopy for in vivo detection of cervical cancer,” Optics Express; May 19, 2003; 11(1 0) Optical Society of America; USA. |
Beraldin, J.A. et al., “Optimized Position Sensors for Flying-Spot Active Triangulation Systems,” Proceedings of the Fourth International Conference on a 3-D Digital Imaging and Modeling (3DIM), Banff, Alberta, Canada, Oct. 6-10, 2003, pp. 334-341, NRC 47083, copyright 2003 National Research Council of Canada, http:/iit-iti.nrc-cnrc.gc.ca/iit-publications-iti/docs/NRC-47083.pdf, printed Sep. 19, 2007, 9 pages. |
Billinghurst, M. et al., Research Directions in Handheld AR; Int. J. of Virtual Reality 5(2),51-58 (2006). |
Bishop, Azuma et al., “Improving Static and Dynamic Registration in an Optical See-Through HMD,” Paper Presented at SIGGRAPH '94 Annual Conference in Orlando, FL (1994). |
Blais, F., “Review of 20 Years of Range Sensor Development,” Journal of Electronic Imaging, 13(1): 231-240, Jan. 2004, NRC 46531, copyright 2004 National Research Council of Canada, http://iit-iti.nrc-cnrc.gc.ca/iit-publications-iti/docs/NRC-46531.pdf, printed Sep. 19, 2007, 14 pages. |
Bouguet, Jean-Yves, “Camera Calibration Toolbox for Matlab,” www.vision.caltech.edu/bouguetj/calib—doc, printed Sep. 20, 2007, 5 pages. |
Cancer Prevention & Early Detection Facts & Figures 2004; National Center for Tobacco-Free Kids; 2004; American Cancer Society; USA. |
Cantor et al., “Cost-Effectiveness Analysis of Diagnosis and Management of Cervical Squamous Intraepithelial Lesions,” Diagnostic Strategies for SILs; Feb. 1998; 91(2):270-277. |
Catalano et al. “Multiphase helical CT findings after percutaneous ablation procedures for hepatocellular carcinoma.” Abdom. Imaging, 25(6),2000, pp. 607-614. |
Chiriboga et al., “Infrared Spectroscopy of Human Tissue. IV. Detection of Dysplastic and Neoplastic Changes of Human Cervical Tissue Via Infrared Microscopy,” Cellular and Molecular Biology; 1998; 44(1): 219-229. |
Crawford, David E. et al., “Computer Modeling of Prostate Biopsy: Tumor Size and Location—Not Clinical Significance—Determine Cancer Detection,” Journal of Urology, Apr. 1998, vol. 159(4), pp. 1260-1264, 5 pages. |
Deering, Michael “High Resolution Virtual Reality.” Proceedings of SIGGRAPH '92, Computer Graphics, 26(2), 1992, pp. 195-202. |
Depiero et al., “3-D Computer Vision Using Structured Light: Design, Calibration and Implementation Issues,” The University of Tennessee, pp. 1-46, (1996). |
Dodd, G.D. et al. “Minimally invasive treatment of malignant hepatic tumors: at the threshold of a major breakthrough.” Radiographies 20(1),2000, pp. 9-27. |
Drascic et al., “Perceptual Issues in Augmented Reality,” SPIE vol. 2653: Stereoscopic Displays and Virtual Reality Systems III, pp. 123-134 (Feb. 1996). |
Fahey et al., “Meta-analysis of Pap Test Accuracy; American Journal of Epidemiology,” 1995 141(7):680-689; The John Hopkins University School of Hvgiene and Public Health; USA. |
Foxlin et al., “An Inertial Head-Orientation Tracker with Automatic Drift Compensation for Use with HMD's,” Proceedings of the 1994 Virtual Reality Software and Technology Conference, Aug. 23-26, 1994, Singapore, pp. 159-173 (1994). |
Fronheiser et al., Real-Time 3D Color Doppler for Guidance of Vibrating Interventional Devices, IEEE Ultrasonics Symposium, pp. 149-152 (2004). |
Fuchs, Henry et al. “Augmented Reality Visualization for Laparoscopic Surgery,” Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI) 1998, pp. 934-943, available from www.cs.unc.edu/—fuchs/publications/AugRealVis—LaparoSurg98.pdf, printed Sep. 20, 2007, 10 pages. |
Fuchs, et al.: “Virtual Environments Technology to Aid Needle Biopsies of the Breast,” Health Care in the Information Age, Ch. 6, pp. 60-61, Presented in San Diego, Jan. 17-20, 1996, published by IOS Press and Ohmsha Feb. 1996. |
Garrett, William F. et al., “Real-Time Incremental Visualization of Dynamic Ultrasound Volumes Using Parallel BSP Trees,” Proceedings of IEEE Visualization 1996, pp. 235-240, available from www.cs.unc.edu/˜andrei/pubs/1996—VIS—dualBSP—Mac.pdf, printed Sep. 20, 2007, 7 pages. |
Georgakoudi et al., “Trimodal spectroscopy for the detection and characterization of cervical precancers in vivo,” American Journal of Obstetrics and Gynecology; Mar. 2002; 186(3):374-382; USA. |
Herline et al., Surface Registration for Use in Interactive, Image-Guided Liver Surgery, Computer Aided Surgery 5:11-17 (2000). |
Holloway, R.; Registration Error Analysis for Augmented Reality; Presence: Teleoperators and Virtual Environments 6(4), 413—432 (1997). |
Hornung et al., “Quantitative near-infrared spectroscopy of cervical dysplasia in vivo,” Human Reproduction; 1999; 14(11):2908-2916; European Society of Human Reproduction and Embryology. |
Howard, M.D., et al.: “An Electronic Device for Needle Placement during Sonographically Guided Percutaneous Intervention”, Radiology 2001; 218:905-911. |
InnerAim Brochure; 3D Visualization Software for Simpler, Safer, more Precise Aiming, Published no earlier than Apr. 1, 2010. |
InVision System Brochure; A “GPS” for Real-Time 3D Needle Visualization & Guidance, Published no earlier than Mar. 1, 2008. |
InVision User Manual; Professional Instructions for Use, Published no earlier than Dec. 1, 2008. |
Jacobs, Marco C. et al., “Managing Latency in Complex Augmented Reality Systems,” ACM SIGGRAPH Proceedings of the Symposium of Interactive 3D Graphics 1997, pp. 49-54, available from www.cs.unc.edu/˜us/Latency//ManagingRelativeLatency.html, printed Sep. 20, 2007, 12 pages. |
Kanbara et al., “A Stereoscopic Video See-through Augmented Reality System Based on Real-time Vision-Based Registration,” Nara Institute of Science and Technology, pp. 1-8 (2000). |
Lass, Amir, “Assessment of Ovarian Reserve,” Human Reproduction, 2004, vol. 19(3), pp. 467-469, available from http://humrep.oxfordjournals.orgcgi/reprint/19/3/467, printed Sep. 20, 2007, 3 pages. |
Lee et al., “Modeling Real Objects Using Video See-Through Augmented Reality,” Presence, 11(2):144-157 (Apr. 2002). |
Leven et al., DaVinci Canvas: A Telerobotic Surgical System with Integrated, Robot-Assisted, Laparoscopic Ultrasound Capability, in J. Duncan and G. Gerig (Eds.): MICCAI 2005, LNCS 3749, pp. 811-818, 2005. |
Levy, et al., An Internet-Connected, Patient Specific, Deformable Brain Atlas Integrated into a Surgical Navigation System, Journal of Digital Imaging, vol. 10, No. 3. Suppl. 1 Aug. 1997: pp. 231-237. |
Livingston, Mark A. et al., “Magnetic Tracker Calibration for Improved Augmented Reality Registration,” Presence: Teleoperators and Virtual Environments, 1997, vol. 6(5), pp. 532-546, available from www.cs.unc.edu/˜andrei/pubs/1997—Presence—calibr.pdf, printed Sep. 20, 2007, 14 pages. |
Matsunaga et al., “The Effect of the Ratio Difference of Overlapped Areas of Stereoscopic Images on each Eye in a Teleoperation,” Stereoscopic Displays and Virtual Reality Systems VII, Proceedings of SPIE, 3957:236-243 (2000).
Meehan, Michael et al., “Effect of Latency on Presence in Stressful Virtual Environment,” Proceedings of IEEE Virtual Reality 2003, pp. 141-148, available from http://www.cs.unc.edu/~eve/pubs.html, printed Sep. 20, 2007, 9 pages.
Milgram et al., “Adaptation Effects in Stereo due to Online Changes in Camera Configuration,” SPIE vol. 1669-13, Stereoscopic Displays and Applications III, pp. 1-12 (1992).
Mitchell et al., “Colposcopy for the Diagnosis of Squamous Intraepithelial Lesions: A Meta-analysis,” Obstetrics and Gynecology; Apr. 1998; 91(4):626-631.
Nakamoto et al., 3D Ultrasound System Using a Magneto-optic Hybrid Tracker for Augmented Reality Visualization in Laparoscopic Liver Surgery, in T. Dohi and R. Kikinis (Eds.): MICCAI 2002, LNCS 2489, pp. 148-155, 2002.
Nordstrom et al., “Identification of Cervical Intraepithelial Neoplasia (CIN) Using UV-Excited Fluorescence and Diffuse-Reflectance Tissue Spectroscopy,” Lasers in Surgery and Medicine; 2001; 29; pp. 118-127; Wiley-Liss, Inc.
Ohbuchi et al., “An Incremental Volume Rendering Algorithm for Interactive 3D Ultrasound Imaging,” UNC-CH Computer Science Technical Report TR91-003 (1991).
Ohbuchi et al., “Incremental Volume Reconstruction and Rendering for 3D Ultrasound Imaging,” Visualization in Biomedical Computing, SPIE Proceedings, pp. 312-323 (Oct. 13, 1992).
Ohbuchi, “Incremental Acquisition and Visualization of 3D Ultrasound Images,” Ph.D. Dissertation, UNC-CH Computer Science Technical Report TR95-023 (1993).
PCT, International Search Report and Written Opinion, re PCT Application No. PCT/US07/75122, mailing date Aug. 20, 2008.
PCT, International Preliminary Report on Patentability, re PCT Application No. PCT/US07/75122, mailing date Mar. 3, 2009.
PCT, International Search Report and Written Opinion, re PCT Application No. PCT/US2010/024378, mailing date Oct. 13, 2010.
PCT, International Search Report and Written Opinion, re PCT Application No. PCT/US2010/043760, mailing date Mar. 3, 2011.
PCT, The International Search Report and Written Opinion of the International Searching Authority, mailed Sep. 9, 2009, for case PCT/US2009/032028.
Pogue, Brian W. et al., “Analysis of acetic acid-induced whitening of high-grade squamous intraepithelial lesions,” Journal of Biomedical Optics; Oct. 2001; 6(4):397-403.
Raij, A.B., et al., Comparing Interpersonal Interactions with a Virtual Human to Those with a Real Human; IEEE Transactions on Visualization and Computer Graphics 13(3), 443-457 (2007).
Raz et al., Real-Time Magnetic Resonance Imaging-Guided Focal Laser Therapy in Patients with Low-Risk Prostate Cancer, European Urology 58, pp. 173-177, Mar. 12, 2010.
Robinett et al., “A Computational Model for the Stereoscopic Optics of a Head-Mounted Display,” SPIE vol. 1457, Stereoscopic Displays and Applications II, pp. 140-160 (1991).
Rolland et al., Towards Quantifying Depth and Size Perception in Virtual Environments, Presence: Teleoperators and Virtual Environments, Winter 1995, vol. 4, Issue 1, pp. 1-21 and 24-49.
Rosenthal, Michael et al., “Augmented Reality Guidance for Needle Biopsies: An Initial Randomized, Controlled Trial in Phantoms,” Proceedings of Medical Image Analysis, Sep. 2002, vol. 6(3), pp. 313-320, available from www.cs.unc.edu/~fuchs/publications/AugRealGuida—NeedleBiop02.pdf, printed Sep. 20, 2007, 8 pages.
Rosenthal, Michael et al., “Augmented Reality Guidance for Needle Biopsies: A Randomized, Controlled Trial in Phantoms,” Proceedings of MICCAI 2001, eds. W. Niessen and M. Viergever, Lecture Notes in Computer Science, 2001, vol. 2208, pp. 240-248, available from www.cs.unc.edu/~us/AugmentedRealityAssistance.pdf, printed Sep. 20, 2007, 9 pages.
Splechtna, Fuhrmann A., et al., Comprehensive calibration and registration procedures for augmented reality; Proc. Eurographics Workshop on Virtual Environments 2001, pp. 219-228 (2001).
State et al., “Case Study: Observing a Volume Rendered Fetus within a Pregnant Patient,” Proceedings of IEEE Visualization 1994, pp. 364-368, available from www.cs.unc.edu/~fuchs/publications/cs-ObservVolRendFetus94.pdf, printed Sep. 20, 2007, 5 pages.
State et al., “Interactive Volume Visualization on a Heterogeneous Message-Passing Multicomputer,” Proceedings of 1995 Symposium on Interactive 3D Graphics, 1995, pp. 69-74, 208, available from www.cs.unc.edu/~andrei/pubs/1995—I3D—vol2—Mac.pdf, printed Sep. 20, 2007.
State et al., “Simulation-Based Design and Rapid Prototyping of a Parallax-Free, Orthoscopic Video See-Through Head-Mounted Display,” Proceedings of International Symposium on Mixed and Augmented Reality (ISMAR) 2005, available from www.cs.unc.edu/~andrei/pubs/2005—ISMAR—VSTHMD—design.pdf, printed Sep. 20, 2007, 4 pages.
State et al., “Stereo Imagery from the UNC Augmented Reality System for Breast Biopsy Guidance,” Proc. Medicine Meets Virtual Reality (MMVR) 2003 (Newport Beach, CA, Jan. 22-25, 2003).
State et al., “Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking,” ACM SIGGRAPH Computer Graphics, Proceedings of SIGGRAPH 1996, pp. 429-438, available from www.cs.princeton.edu/courses/archive/fall01/cs597d/papers/state96.pdf, printed Sep. 20, 2007, 10 pages.
State et al., “Technologies for Augmented Reality Systems: Realizing Ultrasound-Guided Needle Biopsies,” Computer Graphics, Proceedings of SIGGRAPH 1996, pp. 429-438, available from www.cs.princeton.edu/courses/archive/fall01/cs597d/papers/state96.pdf, printed Sep. 20, 2007.
State, Andrei, “Exact Eye Contact with Virtual Humans,” Proc. IEEE International Workshop on Human Computer Interaction 2007 (Rio de Janeiro, Brazil, Oct. 20, 2007), pp. 138-145.
Takagi et al., “Development of a Stereo Video See-through HMD for AR Systems,” IEEE, pp. 68-77 (2000).
Ultraguide 1000 System, Ultraguide, www.ultraguideinc.com, 1998.
Van Staveren et al., “Light Scattering in Intralipid-10% in the wavelength range of 400-1100 nm,” Applied Optics; Nov. 1991; 30(31):4507-4514.
Viola et al., “Alignment by Maximization of Mutual Information,” International Journal of Computer Vision, vol. 24, No. 2, pp. 1-29 (1997).
Viola, Paul A., Alignment by Maximization of Mutual Information, Ph.D. Dissertation, MIT Artificial Intelligence Laboratory Technical Report No. 1548 (Jun. 1995).
Ware et al., “Dynamic Adjustment of Stereo Display Parameters,” IEEE Transactions on Systems, Man, and Cybernetics, 28(1):1-19 (1998).
Watson et al., “Using Texture Maps to Correct for Optical Distortion in Head-Mounted Displays,” Proceedings of the Virtual Reality Annual Symposium '95, IEEE, pp. 1-7 (1995).
Welch, Hybrid Self-Tracker: An Inertial/Optical Hybrid Three-Dimensional Tracking System, University of North Carolina Chapel Hill Department of Computer Science, TR 95-048 (1995).
Yinghui et al., Real-Time Deformation Using Modal Analysis on Graphics Hardware, Graphite 2006, Kuala Lumpur, Malaysia, Nov. 29-Dec. 2, 2006.
Zitnick et al., “Multi-Base Stereo Using Surface Extraction,” Visual Interface Inc. (Nov. 24, 1996).
Bajura, Michael et al., “Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient,” Computer Graphics, Proceedings of SIGGRAPH 1992, vol. 26(2), pp. 203-210, available from www.cs.unc.edu/~fuchs/publications/MergVirtObjs92.pdf, printed Sep. 20, 2007, 8 pages.
Buxton et al., “Colposcopically directed punch biopsy: a potentially misleading investigation,” British Journal of Obstetrics and Gynecology; Dec. 1991; 98:1273-1276.
Caines, Judy S. et al., Stereotaxic Needle Core Biopsy of Breast Lesions Using a Regular Mammographic Table with an Adaptable Stereotaxic Device, American Journal of Roentgenology, vol. 163, No. 2, Aug. 1994, pp. 317-321. Downloaded from www.ajronline.org on Jul. 10, 2013.
Dumoulin, C.L., et al., Real-Time Position Monitoring of Invasive Devices Using Magnetic Resonance, Magnetic Resonance in Medicine, vol. 29, Issue 3, Mar. 1993, pp. 411-415.
Jolesz, Ferenc A., M.D., et al., MRI-Guided Laser-Induced Interstitial Thermotherapy: Basic Principles, SPIE Institute on Laser-Induced Interstitial Thermotherapy (LITT), Jun. 22-23, 1995, Berlin, Germany.
Kadi, A. Majeed, et al., Design and Simulation of an Articulated Surgical Arm for Guiding Stereotactic Neurosurgery, SPIE vol. 1708, Applications of Artificial Intelligence X: Machine Vision and Robotics (1992). Downloaded from http://proceedings.spiedigitallibrary.org/ on Jul. 11, 2013.
Kato, Amami, et al., A frameless, armless navigational system for computer-assisted neurosurgery, Journal of Neurosurgery, vol. 74, No. 5, May 1991, pp. 845-849.
PCT, International Search Report and Written Opinion, re PCT Application No. PCT/US2013/023678, mailed Jun. 13, 2013.
Screenshots from video produced by the University of North Carolina, produced circa 1992.
State et al., Contextually Enhanced 3D Visualization for Multi-burn Tumor Ablation Guidance, Departments of Computer Science and Radiology, and School of Medicine, University of North Carolina at Chapel Hill; InnerOptic Technology, Inc., 2008, Chapel Hill, NC, pp. 70-77.
Fuchs, et al., Optimizing a Head-Tracked Stereo Display System to Guide Hepatic Tumor Ablation, Departments of Computer Science and Radiology, and School of Medicine, University of North Carolina at Chapel Hill; InnerOptic Technology, Inc., 2008, Chapel Hill, NC.
Related Publications
Number | Date | Country
---|---|---
20130197357 A1 | Aug 2013 | US
Provisional Applications
Number | Date | Country
---|---|---
61592531 | Jan 2012 | US
61736789 | Dec 2012 | US