System and method for displaying anatomy and devices on a movable display

Information

  • Patent Grant
  • 11020016
  • Patent Number
    11,020,016
  • Date Filed
    Friday, May 23, 2014
  • Date Issued
    Tuesday, June 1, 2021
Abstract
An image display system is provided, comprising a virtual window system that creates a visual coherency between the patient's anatomical images and the actual patient by aligning the image on the display to the patient and then presenting the image to the user in a way that feels as if the user is looking directly into the patient through the display. The image shown within the image display system is dependent upon the position of the image display apparatus and the position of the user, so that the display orientation of the image may be biased slightly toward the user to improve ergonomics and usability.
Description
BACKGROUND OF THE INVENTION
1. Field of the Invention

The invention relates generally to the diagnosis and treatment of disorders using minimally invasive techniques. In many minimally invasive procedures very small devices are manipulated within the patient's body under visualization from a live imaging source like ultrasound, fluoroscopy, or endoscopy. Live imaging in a minimally invasive procedure may be supplemented or replaced by displaying the position of a sensored medical device within a stored image of the patient anatomy.


Many minimally invasive procedures are conducted in expensive settings by specialized physicians. Often small, percutaneous medical devices are visualized during the procedure by using live fluoroscopic or ultrasonic imaging. While the live imaging provides a real-time image of anatomy, it has many drawbacks:


Time spent in an imaging suite is expensive and raises the cost of many minimally invasive medical procedures.


Ionizing radiation used to create the fluoroscopic image is dangerous to the patient, physician, and assistants.


Needles, guidewires, and other small devices may be difficult to locate within the live two-dimensional image. These devices may be too small to see clearly in fluoroscopic images. In ultrasound images, these devices may be difficult to locate when they are outside of the ultrasonic imaging plane or they may reflect a diffused, ambiguous image when they are within the ultrasonic imaging plane.


The fluoroscopic and ultrasonic images are two-dimensional and do not provide determinant information about motion of the medical device and three-dimensional anatomical structures.


During a typical minimally invasive procedure, the physician must look away from the patient and his or her hands to see the display showing the live image. Additionally, the frame of reference for the live image is typically misaligned from the frames of reference for the physician, the tool, and the patient. This presents a challenging situation for the physician, who must compensate for differences in these frames of reference. For instance, when the physician inserts a device into the patient by moving his or her hands from left to right, the fluoroscopic image of the device moves towards the top of the display. Ultrasonic images can be even more confounding in that the frame of reference for the ultrasound image is based on the position and orientation of the ultrasound probe, which is frequently moving during imaging. The physician must compensate for the misalignment of the coordinate systems for the respective frames of reference while also concentrating on achieving the goals of the minimally invasive procedure. The physician's need to look away from the patient and his or her instrument creates an ergonomic challenge in addition to this mental challenge. As a result, the completion of minimally invasive procedures is delayed, increasing the procedure cost.


Prior to a minimally invasive catheter procedure, patients often have an anatomical image created using CT or MR imaging systems commercially provided by companies like Philips, Siemens, General Electric, and Toshiba. The anatomical images can be processed, or “segmented,” into three-dimensional representations of the anatomy of interest. Individual organs, muscles and vasculature can be visually separated from other anatomy for even clearer viewing of regions of interest. In this invention the three-dimensional pre-procedure images may be used instead of or in addition to live imaging for navigation during the procedure because the position and orientation of the medical device can be sensed in real-time. For example, navigation systems provided by Medtronic, GE, and Stryker sense the positions of medical devices within the patient's body and present the sensed position data in a pre-procedural image of the patient's anatomy. These navigation systems provide a supplement or replacement to fluoroscopic imaging so that the physician may conduct a minimally invasive procedure within the patient's body using little or no X-ray. However, the navigation systems do not provide a means for making the physician's hand motions on the medical device match the motions of the device displayed in the image of the anatomy on the display. In order to make minimally invasive procedures easy and intuitive, the coordinate systems of the patient, the device, the display, and the physician's hands must be unified.


Minimally invasive procedures where a medical device is inserted into the body are especially well suited for a system that provides navigation assistance by unifying the physician, patient, display, and device coordinate systems. These procedures usually employ devices that are navigated through the body to small anatomical targets. For example, to obtain a tissue biopsy of a prostate, a physician may insert a small catheter through the urethra into the bladder. The urethral catheter provides an ideal location for the placement of sensors that can be used by software to match the live three-dimensional shape of the urethra to the stored three-dimensional shape of the urethra in the pre-operative image set. This “registration” of the real-time position of the patient's soft tissue to the pre-operative image of the same tissue allows the tissue and adjacent tissue structures to be accessed using the pre-operative images. Then a biopsy needle may be inserted into biopsy targets within the prostate by a physician who is navigating the needle using a three-dimensional image of the prostate. Once target tissue is reached with a needle, it may be treated directly with therapies like RF ablation, cryotherapy, brachytherapy, or chemo-embolization. Similar use of the invention may be made for other tissues like the breast, liver, and lung.


Endoscopic device use may similarly be improved by displaying an anatomical image that is aligned to the patient. Prior to inserting the endoscope, it is difficult to know the exact locations of anatomical structures within the body. After the endoscope is inserted, the external references of the patient's body are lost. Displaying an anatomical image that is aligned to the patient's body provides context by unifying the external view of the patient with the internal view of the anatomy, allowing the physician to choose optimal placement of access ports and improving the ability to access desired anatomy quickly and directly.


Robotic surgical procedures may be improved by displaying the projected workspaces of robotic devices on an anatomical image that is aligned to the patient. The projected path, workspace, and collision space of robotic devices may be overlaid on the anatomical image and viewed from different perspectives by moving the display, allowing the user to optimize the placement of the devices in the patient's body for reaching specific target anatomies.


The present invention improves the ease and reliability of visualizing anatomy within a patient by providing a system for displaying the device and patient anatomy in a substantially aligned manner.


2. Description of Background Art

Relevant references include US 2010/295931; US 2010/053151; US 2010/039506; US 2009/322671; U.S. Pat. Nos. 7,880,739; 7,203,277; 5,808,665; 7,774,044; 5,134,390; 6,038,467; and Nikou C, DiGioia AM, Blackwell M, et al. Augmented reality imaging technology for orthopaedic surgery. Operative Techniques in Orthopaedics. 2000;10:82-86.


SUMMARY OF THE INVENTION

The invention comprises a virtual window system that creates a visual coherency between the patient's anatomical images and the actual patient by aligning the image on the display to the patient and then presenting the image to the user in a way that feels as if the user is looking directly into the patient through the display. The invention is designed to also display medical devices, such as a biopsy needle. The invention makes the anatomy and the motion of the minimally invasive medical device in the display match the motion of the physician's hands by substantially unifying the coordinate systems of the patient, the medical device, the display, and the physician's hands. The invention creates a visual coherency between the motion of the medical device in the image and the motion of the physician's hands manipulating the device. This invention also creates a visual coherency between the motion of the image in the display and the motion of the display. For example, the invention shows the image of the anatomy, the projected path of the biopsy needle, and the actual location of the tip of the biopsy needle in a single image that is shown on a display over the patient in substantial alignment to the patient's actual anatomy.


Embodiments of the invention possess inventive design elements that provide excellent user ergonomics and increase the functional anatomical workspace of the virtual window surgical system. Coupling the position and orientation of the display to the image allows the image to remain aligned to the patient for various positions and orientations of the display. To improve the ergonomics and workspace of the system, the knowledge of the general position of the user relative to the patient is leveraged to slightly bias the image position to an optimized position. For example, if the user is on the left side of the patient, the image may be angled fifteen degrees away from the user so that when the display is angled fifteen degrees toward the user, the image will appear flat relative to the patient. Practice has shown that the intuitive benefits to the user of an aligned image may still be captured when small angular offsets are in place, with offsets of 30 degrees being the well-tolerated limit in many procedures. The system uses the knowledge of the user's position to bias the display toward more comfortable positions. The knowledge of the user's position may be input to the system by the user, inferred by the system using the position of the display, or sensed by the system using position or contact devices on the system. To further increase the workspace of the system, this invention allows for decoupling the relationship to reposition the display independently of the image. For instance, an aligned display may interfere with other equipment during some portion of the procedure and it may be desirable to reposition the display slightly to relieve the interference. Additionally this invention allows for a scaled coupling for improved ergonomics. For instance, moving the display with a unity ratio may cause the display to interfere with other equipment during some portion of the procedure or may make the screen difficult to view. Up to a 1.5:1 scale may be used to optimize the ergonomics of the system while maintaining the visual coherency between the patient and the image. It should be noted that the display may be repositioned along multiple axes and in multiple directions and that the scaling may be different for different axes and directions. For example, the scaling may be unity in the translational axes and 1.3:1 in the rotational axes.
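The bias-and-scaling behavior described above can be illustrated with a short sketch. This is a minimal illustration only, assuming a single rotational axis, a fixed 15-degree bias toward the user's side, and a 1.3:1 rotational scale; the constants and function name below are illustrative assumptions rather than part of the disclosed system.

```python
# Minimal illustration (not the disclosed implementation): applying a user-side
# angular bias and a scaled rotational coupling between display and image.
# USER_BIAS_DEG, ROTATION_SCALE, and the function name are assumptions.

USER_BIAS_DEG = 15.0    # bias toward the user's side (the text describes 15-30 degrees)
ROTATION_SCALE = 1.3    # example rotational scaling (the text allows up to 1.5:1)

def image_rotation_from_display(display_tilt_deg: float, user_side: str) -> float:
    """Map a display tilt about the patient's long axis to an image rotation.

    The neutral display tilt is offset toward the user so that, when the display
    is tilted toward the user by the bias angle, the image appears flat relative
    to the patient. Tilts away from neutral are scaled by ROTATION_SCALE.
    """
    # Convention: negative tilt = tilted toward a user standing on the patient's left.
    neutral = -USER_BIAS_DEG if user_side == "patient_left" else USER_BIAS_DEG
    return ROTATION_SCALE * (display_tilt_deg - neutral)

# Display tilted 15 degrees toward a user on the patient's left -> image appears flat (0 degrees).
print(image_rotation_from_display(-15.0, "patient_left"))
```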


Additionally, this invention provides a movable support structure to hold a display directly in front of the physician, between the physician and the patient. Ideally, the images are presented in a fashion such that the images are substantially aligned with the patient. This invention details the methods and techniques needed to align the images to the patient. Many embodiments utilize a display that is mounted on a movable support structure that allows for the display to be positioned between the patient and the physician. The range of motion of the support structure and the degrees of freedom enable a wide range of display positions and orientations. In one embodiment, the patient is lying on an exam table with the physician standing by the patient's side. The support structure allows the display to be brought over the patient. The physician can move and orient the display so the display is located roughly between the physician and the patient. Providing a display over the operative area of the patient allows the physician to perform minimally invasive procedures with needles, guidewires, and catheters as if the physician were performing open surgery by looking directly into the patient.


Techniques are also disclosed to track the position of the display, the imaging source, the patient, and the medical device. Tracking individual elements of the system allows the image to be aligned with the patient and constantly updated to accommodate for a moving patient, moving medical device, or moving display.


A live image of the patient anatomy may also be shown on a display located over the patient. Sensors track the position and orientation of the display screen and the imaging source so that the position and orientation of the display screen may control position and orientation of the imaging source, keeping the anatomical image, the medical device image, and the patient substantially co-aligned. Alternatively, sensors track the position and orientation of the display screen and the imaging source so that the position and orientation of the imaging source may control position and orientation of the display screen, to keep the anatomical image, the display screen, the medical device image, and the patient substantially co-aligned. The live image may be supplemented with other anatomical images from live or static sources that are sensored, registered, and displayed in the same substantially co-aligned manner on the display screen. For example, a live endoscopic image may be superimposed over a three-dimensional image of the prostate derived from a pre-operative MR scan. As the physician moves the display to view the three-dimensional image from different angles, the endoscope may be remotely automatically repositioned so that the live image viewing position matches the viewing position of the three-dimensional image.


All embodiments create a coupling between the image position and orientation and the position and orientation of a secondary system component. This invention improves the workspace of the system by providing an input device to temporarily decouple the relationship to reposition the display or secondary system component for improved workspace. Additionally, this invention improves the ergonomics by allowing for a scaling factor between the coupled display and secondary system component.


In another embodiment the system comprises a processor further adapted to receive image data for the patient's anatomy. Such image data may be a static image obtained by MRI, ultrasound, X-ray, computed tomography or fluoroscopic imaging modalities. The image data can also be a live fluoroscopic image collected in real-time. The system can further track patient position by one or more of the following: fiducial markers, live imaging data, optical sensors, or electromagnetic sensors. The processor is also further adapted to receive position data from a tool, which is tracked by electromagnetic sensors. The display is held by a support arm having at least one degree of freedom, wherein the members and joints of the support arm may be operatively coupled to counterbalance springs or weights. The processor is further adapted to receive position data of the display, which is tracked by one or more of the following: optical tracking, electromagnetic sensors, or encoded joints of the support arm. The processor processes the various position data and image data to display an image of the patient's anatomy substantially aligned with the patient's actual anatomy superimposed with the position of any device being tracked. The processor is also adapted to direct any live imaging equipment to ensure proper functioning of the system. When used in a surgical setting the invention may be located in the surgical field and may also comprise a sterile drape for the display to protect the integrity of the surgical field.


In one embodiment, a live image of the patient anatomy is shown on a repositionable display screen located over the patient. The physician can move the display over the patient while sensors track the motion of the display so that the image shown on the display screen may be periodically or constantly updated to show the medical device, and the patient anatomy substantially aligned with the patient from the perspective of the user with a slight angular bias toward the user. The position of the user relative to the patient may be entered by the user at the start of the procedure by touching a button on the display labeled “patient left,” “patient right,” “patient head,” or “patient feet.” In this manner, the image shown on the display provides a view of the medical device and patient anatomy that is intuitive, ergonomic, and allows for easy navigation of the medical device within the patient anatomy shown on the display screen. While the image of the anatomy is frequently based on a pre-operative image, a live image may be supplemented with other anatomical images from live or static sources which are sensored, registered, and displayed in the same substantially co-aligned manner on the display screen.


In additional embodiments, a sensor on the medical device provides position and orientation data of the device to a data processor. A sensor on the patient provides position and orientation data of the patient to the processor, and sensors on the display screen provide the viewing position and orientation of the display screen to the processor. With data from the medical device, the patient, and the display, the processor unifies the three coordinate systems so that the image shown on the display screen substantially matches the position of the patient anatomy. Adjustments to the display position over the patient result in similar changes to the position of the image in the display: changing the position of the display changes the view of the image on the display screen. For example, the user may change the angle of the display to change the angle of the apparent image on the display screen or may translate the display to pan the image in the display along the patient to show different anatomy. Aligning the positions of the shown image and the patient anatomy helps coordinate the physician's control of the medical device.
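One conventional way to unify the device, patient, and display coordinate systems is to compose homogeneous transforms reported against a common tracker frame. The sketch below is a minimal illustration under that assumption; the frame names and example poses are hypothetical, not values from the disclosure.

```python
# Minimal illustration: expressing a tracked device pose in the display's
# coordinate frame by composing 4x4 homogeneous transforms.
# T_a_b denotes the pose of frame b expressed in frame a.
import numpy as np

def make_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert(T: np.ndarray) -> np.ndarray:
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

# Example poses reported in a shared "world" frame (e.g., the electromagnetic transmitter).
T_world_device  = make_transform(np.eye(3), np.array([0.10, 0.02, 0.30]))  # device sensor
T_world_patient = make_transform(np.eye(3), np.array([0.00, 0.00, 0.25]))  # patient reference
T_world_display = make_transform(np.eye(3), np.array([0.05, 0.40, 0.30]))  # display pose

# Device pose relative to the patient, then relative to the display, so the
# rendered device icon stays aligned with the patient as the display moves.
T_patient_device  = invert(T_world_patient) @ T_world_device
T_display_patient = invert(T_world_display) @ T_world_patient
T_display_device  = T_display_patient @ T_patient_device
print(np.round(T_display_device[:3, 3], 3))
```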


Elements of both embodiments may be combined to display pre-operative and intra-operative anatomical images within the same procedure. In both embodiments, the invention provides a virtual window into the patient where the physician may view the anatomy and navigate the surgical device in substantial alignment with the patient. For example, a sensored endoscope may be shown relative to the aligned anatomical image. An anatomical target may be chosen and marked on the image. As sensored medical devices are moved to different potential access points on the body, the ability to reach the anatomical target may be shown by projecting the path of the device to the target and presenting a positive indication when the path to the anatomical target is uninterrupted. Similar real-time updates may be used to assist in quickly choosing access points for minimally invasive devices by showing whether adjacent medical devices will collide with each other, external anatomy, or internal anatomy as different potential access points on the body are selected by moving the medical device to those access points.





BRIEF DESCRIPTION OF THE DRAWINGS

The novel features of the invention are set forth with particularity in the claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:



FIG. 1 is a side diagrammatic view of a system for displaying a substantially co-aligned anatomical image with a sensored medical device over a patient's anatomy.



FIG. 2 is a block diagram showing data flow for the system in FIG. 1.



FIG. 3 is an isometric view of an embodiment of the display and support arm positioned next to the patient table with the projected workspace of a robotic surgical device overlaid on the anatomy in the display.



FIG. 4 is an isometric view of an embodiment of the display and support arm positioned next to the patient table.



FIG. 5 is a flow chart describing the basic steps for a minimally invasive procedure using a sensored medical device and the system for displaying a co-aligned image.





DETAILED DESCRIPTION OF THE INVENTION

While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.



FIGS. 1-2 describe an embodiment for navigating a minimally invasive medical device within the patient using an acquired three-dimensional anatomical image shown in a display 7 that is substantially aligned to the patient anatomy. A sterile cover may be used to separate the display from the sterile operating field, and the sterile cover may incorporate a conductive film to provide a sterile touch interface for a capacitive touch screen display. The sterile display cover may be a flexible, clear drape made of plastic like polyethylene or polyurethane film, a rigid plate made of clear plastic like polycarbonate or acrylic, or a combination of both flexible and rigid plastics. The display is preferably a light-weight, flat LCD display provided by manufacturers like LG Display, Philips, and Innolux, or a light-weight, flat OLED display provided by manufacturers like Samsung and Sony. A prime example of such a display would be the NEC TFT color LCD module, which provides a usable viewing angle of 85° in all directions.

In FIG. 1, the position of the medical device within the patient 5 is provided by an electromagnetic coil sensor located on the distal elongated section of the medical device 1. The position of the sensor is derived through an electromagnetic transmitter 2 similar to those transmitters supplied commercially by NDI and Ascension Technology Corporation. Alternatively, the position of the medical device may be derived from an optical fiber position sensor like that supplied by Luna Innovations. A similar patient reference sensor 3 is placed on the patient in a reliably stable position like the outcropping of the pelvic bone, sternum, or clavicle. The reference sensor or sensors provide frequently updated data describing the position of the patient anatomy in the same coordinate system as the medical device sensor. The patch holding the patient sensor may be placed on the patient before the patient's anatomy of interest is imaged, and the patch may contain known X-ray visible materials such as tungsten, platinum-iridium, platinum, barium sulfate, or iodine and MR visible materials such as gadolinium or vitamin E. The patch is visible within the image of the anatomy and therefore the patient reference sensor 3 can be registered to the three-dimensional anatomical image. Position data from the sensor in the medical device 1, the patient reference sensor 3, and the display support arm 4 are sent to the system processor 6. The local coordinate systems of the medical device sensor 1 and display 7 may undergo a coordinate system transformation in the system processor so that the positions of the device sensor, patient sensor, and display may be evaluated in a single world coordinate system. Display 7 has a user input button 8.

FIG. 2 shows the flow of sensor position data from the sensor buffer 9 to the system processor 10, where the position sensor data is used by the processor to place an icon of the medical device into the three-dimensional patient anatomy image for display through the system display 11. The system processor is a standard computing system like those supplied by Dell or Hewlett Packard running an operating system like Windows or Linux. Position data from the system display and support arm is likewise used by the system processor to orient the image on the screen so that the image, based on display position data from the display 7 and support arm 4 and patient position data from the patient reference sensor 3, is substantially aligned with the patient anatomy.
Display position data may also be used to modify the image in the display, for example, zooming or clipping the image as the display moves closer to the patient. Other image modifications may include changing transparency, removing layers, removing anatomical structures, or changing colors. Additionally, scaling of the image in discrete steps or image modifications may be done via a touch-sensitive surface on the display.
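As one plausible illustration of driving such image modifications from display position data, the sketch below maps the display-to-patient distance onto a zoom factor and a clipping depth. The thresholds, mapping, and names are illustrative assumptions, not values taken from the disclosure.

```python
# Minimal illustration: deriving zoom and a clipping depth from how close the
# display is held to the patient. All constants are example values.
import numpy as np

def zoom_and_clip(display_pos, patient_pos, near=0.15, far=0.60):
    """Return (zoom, clip_depth_mm) from the display-to-patient distance in meters.

    clip_depth_mm is the depth of a clipping plane below the skin; tissue above
    the plane is hidden, so moving the display closer reveals deeper anatomy.
    """
    d = float(np.linalg.norm(np.asarray(display_pos) - np.asarray(patient_pos)))
    d = min(max(d, near), far)            # clamp to the working range
    t = (far - d) / (far - near)          # 0.0 at the far limit, 1.0 at the near limit
    zoom = 1.0 + 2.0 * t                  # 1x when far, 3x when close (example values)
    clip_depth_mm = 150.0 * t             # clip plane moves deeper as the display approaches
    return zoom, clip_depth_mm

# Display held 45 cm above the patient reference point.
print(zoom_and_clip([0.0, 0.0, 0.45], [0.0, 0.0, 0.0]))
```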



FIG. 3 shows a movable display 12 positioned over a surgical table showing an image of the patient anatomy. A target 13 may be chosen on the image of the anatomy. A remote electromagnetic transmitter, such as those commercially available from Northern Digital Incorporated (NDI) and Ascension Technology Corporation, is positioned near or under the table to localize sensors 15 on at least one medical device 16. As the display is moved, the image of the anatomy, the medical devices, the projected path 14 of the medical devices, and the collision boundaries of the medical devices are repositioned to provide the optimum view for navigation of the medical device within the anatomical image. The access points may be chosen to optimize the ability of the medical devices to reach the anatomical target without creating collisions of the medical devices that are internal and external to the patient and to optimize the ability of the medical devices to reach the target anatomy without intersecting other anatomical structures. Software may be employed to present the collision-free projected path to the anatomical target in an intuitively obvious manner by, for example, showing a collision-free path as a green line and a path with collisions as a red line.
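A simple way to implement the green/red path indication described for FIG. 3 is to test the straight projected path against bounding volumes of nearby devices and anatomy. The sketch below uses spherical obstacles as a stand-in for the collision boundaries; the geometry and names are illustrative assumptions rather than the disclosed algorithm.

```python
# Minimal illustration: testing a straight projected path against spherical
# obstacles and choosing the display color for the path.
import numpy as np

def segment_hits_sphere(p0, p1, center, radius) -> bool:
    """True if the segment p0->p1 passes within `radius` of `center`."""
    p0, p1, c = map(np.asarray, (p0, p1, center))
    d = p1 - p0
    t = np.clip(np.dot(c - p0, d) / np.dot(d, d), 0.0, 1.0)  # closest point on the segment
    return float(np.linalg.norm(p0 + t * d - c)) < radius

def path_color(tool_tip, target, obstacles) -> str:
    """Green for a collision-free projected path, red otherwise."""
    blocked = any(segment_hits_sphere(tool_tip, target, c, r) for c, r in obstacles)
    return "red" if blocked else "green"

# One obstacle (e.g., an adjacent device or anatomical structure) near the midpoint of the path.
obstacles = [([0.0, 0.05, 0.10], 0.02)]
print(path_color([0.0, 0.0, 0.0], [0.0, 0.1, 0.2], obstacles))
```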



FIG. 4 presents an embodiment of the display and support arm with counterbalanced joints at the support arm elbow 18 and shoulder 19. An additional rotational or linear joint is provided at the base of the shoulder 20 to allow the display to move along the inferior-to-superior axis of the patient. All support arm joints may be encoded to provide data describing the position of the display. The display support is shown in an embodiment where the arm is mounted to a portable cart that is positioned next to the patient table. Axis 17 allows the display to rotate. An alternate embodiment may attach to the table or imaging system.
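If the support arm joints of FIG. 4 are encoded, the display pose can be recovered by composing the joint readings through the arm's kinematics. The sketch below assumes a simplified planar two-link arm on a linear base joint, with illustrative link lengths; the actual arm geometry is not specified here.

```python
# Minimal illustration (assumed simplified geometry): recovering the display
# position from encoded support-arm joints — a linear base joint along the
# patient's long axis plus shoulder and elbow rotations.
import numpy as np

L_SHOULDER = 0.60   # upper-arm link length in meters (illustrative)
L_ELBOW = 0.50      # forearm link length in meters (illustrative)

def display_position(base_travel_m, shoulder_rad, elbow_rad):
    """Planar forward kinematics: display (x, y) over the table from joint readings."""
    x = (base_travel_m
         + L_SHOULDER * np.cos(shoulder_rad)
         + L_ELBOW * np.cos(shoulder_rad + elbow_rad))
    y = (L_SHOULDER * np.sin(shoulder_rad)
         + L_ELBOW * np.sin(shoulder_rad + elbow_rad))
    return x, y

# Base slid 20 cm toward the patient's head, shoulder at 30 degrees, elbow at 45 degrees.
print(display_position(0.20, np.deg2rad(30), np.deg2rad(45)))
```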



FIG. 5 provides an overview of the procedure flow for a minimally invasive procedure using a stored image for navigation. The patient anatomy is scanned 21 with a non-invasive imaging modality like CT, MR, or rotational angiography. The imaged anatomy is stored and segmented into a three-dimensional image, and borders and centerlines of anatomical structures are calculated using commercially available software from vendors like Philips, Siemens, GE, Toshiba, TeraRecon, Calgary Scientific, Materialise, or OsiriX. The image is transferred to the memory of the system processor, and the image is registered 22 to the system coordinate system along with the patient and the medical device sensors. Registration of the image may be done by imaging the patient with an image-visible skin patch, by touching a sensored probe to prominent bony anatomical points, or with an external anatomical marker placed on the patient. At least three separate points of the patch are visible in the image, and a position sensor is then placed into the patch. The visible points on the patch or bones may be selected on the displayed image, and then the known distance from the marker is used to register the image to the patient position sensor. The patient position sensor and medical device position sensor are inherently registered because their positions are determined by the same sensing system.

Next, the registered image is shown 23 above the patient in a manner substantially aligned to the patient anatomy. The image position may be biased slightly toward the user to provide improved ergonomics. For example, if the user is on the right side of the patient, the user may press a button on the display touch screen to inform the system of the user's operating position. The system processor will then bias the image rotationally by a small amount, usually by an angle α of 15-30 degrees (FIG. 4), toward the user. The system may also bias rotational scaling in the user's direction, creating a rotation scale factor that increases slightly as the display is moved rotationally away from the user. In this way, the image is biased toward ergonomically comfortable viewing positions for the user without losing the substantial alignment of the image to the patient that provides for improved perception and usability.

The medical device may be navigated 24 within or near the patient as the position sensor in the medical device is tracked and presented as an image icon within the image of the patient anatomy. The image of the anatomy and the image of the medical device may be shown with varying degrees of transparency to maximize the visibility of the device and anatomical images. The display, showing the image of the medical device within the image of the anatomy, may be repositioned 25 to enhance the viewing angle of the anatomy. As the display is moved, the image on the screen is updated to maintain substantial alignment between the displayed anatomical image and the patient anatomy.
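The point-based registration step described above, which selects at least three corresponding points on the patch or bony landmarks, can be computed with a standard SVD-based rigid fit. The sketch below shows that standard technique as one plausible implementation; it is not asserted to be the specific method of the disclosure, and the example points are illustrative.

```python
# Minimal illustration: standard SVD (Kabsch) rigid registration between points
# selected in the image and the corresponding points sensed on the patient.
import numpy as np

def rigid_register(image_pts: np.ndarray, patient_pts: np.ndarray):
    """Return R, t mapping image-space points onto patient-sensor-space points."""
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)   # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cp - R @ ci
    return R, t

# Three marker points selected in the image and the same points localized on the patient.
img = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
pat = img @ true_R.T + np.array([0.02, 0.03, 0.0])
R, t = rigid_register(img, pat)
print(np.round(R, 3), np.round(t, 3))
```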

Claims
  • 1. A system for displaying an image of a tool and an image of a patient's anatomy, said system comprising: a repositionable display screen configured to show the images of the tool and the patient's anatomy; a robotic device configured to control movement of the tool; and a processor configured to receive: (a) the image of the patient's anatomy; (b) position data and orientation data for the tool; (c) position data for the patient's anatomy; (d) position data for the display screen; and (e) position data for a user's position relative to the patient, wherein the processor is configured to: superimpose the image of the tool on the image of the patient's anatomy and reposition the image of the patient's anatomy on the display screen in real time based on the position data for the user's position relative to the patient so the images of both the patient's anatomy and the tool are substantially aligned with the patient as the display screen is moved over the patient, allow the user to selectively angle the aligned images away from the user so that when the display screen is angled toward the user, the aligned images will appear flat relative to the patient, receive a user input comprising a selection of an anatomical target on the image of the patient's anatomy, output a predicted position of the tool based on the position data for the tool and the orientation data for the tool relative to the position data for the patient's anatomy, project a path from the predicted position of the tool to the anatomical target, determine that the path is free of collisions (i) between the tool and an internal anatomy of the patient and (ii) between the tool and another medical device external to the patient, cause the path and an indication that the path is collision-free to be displayed on the display screen, and cause a collision space of the robotic device to be overlaid on the image of the patient's anatomy.
  • 2. A system as in claim 1, wherein the processor is configured to receive a pre-operative static image of the patient's anatomy.
  • 3. A system as in claim 1, wherein the processor is configured to receive a real time image of the patient's anatomy.
  • 4. A system as in claim 3, wherein the real time image is fluoroscopic.
  • 5. A system as in claim 3, wherein the real time image is a 3 dimensional point cloud of a position of the tool within the patient's anatomy.
  • 6. A system as in claim 1, further comprising an external position tracker configured to track a position of the tool, a position of the patient's anatomy, and a position of the display screen in a reference frame.
  • 7. A system as in claim 6, wherein: the external position tracker comprises a plurality of electromagnetic sensors, at least one of the plurality of electromagnetic sensors is present on the tool, and at least one of the plurality of electromagnetic sensors is affixed to the patient.
  • 8. A system as in claim 1, further comprising an articulated support coupled to the display screen to hold the display screen over the patient, the articulated support having an encoder configured to provide the position data for the display screen to the processor.
  • 9. A system as in claim 1, wherein the processor is configured to allow the display screen to be repositioned relative to its associated image by: interrupting a control loop within the processor between the display screen and the associated image; freezing the associated image on the display screen at the time of the interruption; and uninterrupting the control loop between the display screen and the associated image subsequent to the display screen being repositioned to a desired position.
  • 10. A system as in claim 1, wherein the processor is configured to selectively decouple a relationship between the display screen and the image of the patient's anatomy displayed on the display screen based at least in part on a signal received from a user input device.
  • 11. A system as in claim 1, wherein the processor is configured to change a relationship between the display screen and the image of the patient's anatomy displayed on the display screen based at least in part on a signal received from a user input device.
  • 12. A system for displaying an image of a tool and an image of a patient's anatomy, said system comprising: a repositionable display screen configured to show the images of the tool and the patient's anatomy; a robotic device configured to control movement of the tool; and a processor configured to receive: (a) the image of the patient's anatomy; (b) position data for the patient's anatomy; (c) position data for the display screen; and (d) position data and orientation data for the tool; wherein the processor is configured to: superimpose the image of the tool on the image of the patient's anatomy and reposition the image of the patient's anatomy on the display screen in real time based on the position data for the display screen so the image of both the patient's anatomy and the tool are substantially aligned with the patient as the display screen is moved, output a predicted position of the tool based on the position data for the tool and the orientation data for the tool relative to the position data for the patient's anatomy, project a path from the predicted position to an anatomical target of the patient's anatomy, determine that the path from the predicted position to the anatomical target is free of collisions (i) between the tool and another structure of the patient's anatomy and (ii) between the tool and another medical device external to the patient, output the path and an indication that the path from the predicted position to the anatomical target is free of collisions, and cause a collision space of the robotic device to be overlaid on the image of the patient's anatomy.
  • 13. A system as in claim 12, wherein the processor is configured to track a position of the patient in real time and shift a coordinate system associated with the display screen in response to changes in position of the patient.
  • 14. A system as in claim 12, wherein the processor is configured to receive a real time image of the patient's anatomy.
  • 15. A system as in claim 14, wherein the real time image is ultrasonic.
  • 16. A system as in claim 12, further comprising an external position tracker configured to track a position of the patient and a position of the display screen in a reference frame.
  • 17. A system as in claim 16, wherein the external position tracker comprises a plurality of electromagnetic sensors.
  • 18. A system as in claim 17, wherein: at least one of the plurality of electromagnetic sensors is affixed to the patient, and at least one of the plurality of electromagnetic sensors is affixed to the display screen.
  • 19. A system as in claim 12, wherein the processor is configured to allow the display screen to be repositioned relative to its associated image by: interrupting a control loop within the processor between the display screen and the associated image; freezing the associated image on the display screen at the time of the interruption; and uninterrupting the control loop between the display screen and the associated image subsequent to the display screen being repositioned to a desired position.
  • 20. A system as in claim 12, wherein the processor is configured to selectively decouple a relationship between the display screen and an image displayed on the display screen based at least in part on a signal received from a user input device.
  • 21. A system as in claim 12, wherein the processor is configured to change a relationship between the display screen and an image displayed on the display screen based at least in part on a signal received from a user input device.
  • 22. A system as in claim 12, wherein the display screen is repositionable in a first axis with a first scaling factor for the displayed image of the patient's anatomy and repositionable in a second axis different from the first axis with a second scaling factor different from the first scaling factor for the displayed images, the first axis comprising a first translational axis or a first rotational axis, and the second axis comprising a second translational axis different from the first translational axis or a second rotational axis different from the first rotational axis.
  • 23. A system as in claim 22, wherein the first scaling factor is in a range between 1:1 and 1.5:1.
  • 24. A system as in claim 22, wherein the second scaling factor is in a range between 1:1 and 1.5:1.
  • 25. A system as in claim 1, wherein the processor is configured to freeze the aligned images on the display screen while the display screen is being repositioned, unfreeze the aligned images after the display screen has been repositioned, and resume the alignment of the images with the patient as the display screen is further moved over the patient.
  • 26. A system as in claim 12, wherein the processor is configured to freeze images on the display screen while the display screen is being repositioned, and unfreeze images on the display screen after the display screen has been repositioned.
  • 27. A system as in claim 1, wherein the processor is configured to cause another path from the predicted position to the selected anatomical target that includes at least one collision to be displayed.
  • 28. A system as in claim 12, wherein the processor is configured to: cause another path from the predicted position to the anatomical target to be displayed on the display screen along with another indication that the another path is not free of collisions.
  • 29. A system as in claim 1, wherein the indication is a first indication displayed on the display screen in response to determining that the path from the predicted position to the anatomical target is collision-free, wherein the processor is further configured to display on the display screen a second indication in response to determining that the path from the predicted position to the anatomical target is not collision-free.
  • 30. A system as in claim 12, wherein the indication is a first indication outputted in response to determining that the path from the predicted position to the anatomical target is collision-free, wherein the processor is further configured to output a second indication in response to determining that the path from the predicted position to the anatomical target is not collision-free.
  • 31. A system, comprising: a display comprising a display position sensor configured to generate position data; a medical tool configured to be inserted into a patient's anatomy, the medical tool comprising a sensor configured to generate position data and orientation data for the medical tool; a patient reference sensor configured to generate position data for the patient's anatomy; a robotic device configured to control movement of the medical tool; and a processor configured to: generate an image on the display comprising a position of the medical tool with respect to the patient's anatomy superimposed on an image of the patient's anatomy based on the position data for the medical tool and the position data for the patient's anatomy, determine that a path from the position of the medical tool to an anatomical target is free of collisions (i) between the medical tool and an internal anatomy of the patient and (ii) between the medical tool and another medical device external to the patient, based on the orientation data of the medical tool, cause the path to be displayed on the display, output an indication that the path is collision-free, and cause a collision space of the robotic device to be overlaid on the image of the patient's anatomy.
  • 32. A system as in claim 31, wherein the processor is further configured to: apply a rotational bias to the image on the display toward a user based on the position data for the display and position data for a user's position relative to the patient's anatomy, and adjust an amount of the rotational bias applied to the image in response to the display being moved rotationally with respect to the user.
  • 33. A system as in claim 31, wherein the image of the patient's anatomy is a pre-operative image, and wherein the image on the display further comprises an intra-operative endoscopic image of the patient's anatomy superimposed on the pre-operative image.
  • 34. A system as in claim 31, wherein the processor is further configured to: detect movement of the display based on the position data for the display, and cause re-positioning of the medical tool in response to the detected movement of the display.
  • 35. A system as in claim 31, wherein the processor is further configured to: indicate that the path is collision-free by causing the path to be displayed as a green line on the display, and indicate that a second path involves a collision by causing the second path to be displayed as a red line on the display.
CROSS-REFERENCE

This application claims the benefit of U.S. Provisional Application No. 61/829,078, filed May 30, 2013.

US Referenced Citations (643)
Number Name Date Kind
4745908 Wardle May 1988 A
4771262 Reuss Sep 1988 A
4896554 Culver Jan 1990 A
5008528 Duchon Apr 1991 A
5134390 Kishimoto et al. Jul 1992 A
5176310 Akiyama et al. Jan 1993 A
5273025 Sakiyam et al. Dec 1993 A
5280781 Oku Jan 1994 A
5499632 Hill et al. Mar 1996 A
5524180 Wang et al. Jun 1996 A
5526812 Dumoulin et al. Jun 1996 A
5550953 Seraji Aug 1996 A
5694142 Dumoulin et al. Dec 1997 A
5762458 Wang et al. Jun 1998 A
5808665 Green Sep 1998 A
5831614 Tognazzini et al. Nov 1998 A
5899851 Koninckx May 1999 A
5935075 Casscells Aug 1999 A
5963770 Eakin Oct 1999 A
6007550 Wang et al. Dec 1999 A
6016439 Acker Jan 2000 A
6038467 De Bliek et al. Mar 2000 A
6047080 Chen Apr 2000 A
6059718 Taniguchi et al. May 2000 A
6063095 Wang et al. May 2000 A
6096004 Meglan et al. Aug 2000 A
6167292 Badano Dec 2000 A
6203493 Ben-Haim Mar 2001 B1
6246784 Summers Jun 2001 B1
6246898 Vesely Jun 2001 B1
6332089 Acker Dec 2001 B1
6425865 Salcudean Jul 2002 B1
6466198 Feinstein Oct 2002 B1
6468265 Evans et al. Oct 2002 B1
6490467 Bucholz Dec 2002 B1
6516421 Peters Feb 2003 B1
6553251 Lahdesmaki Apr 2003 B1
6665554 Charles Dec 2003 B1
6690963 Ben-Haim Feb 2004 B2
6690964 Bieger et al. Feb 2004 B2
6755797 Stouffer Jun 2004 B1
6812842 Dimmer Nov 2004 B2
6856827 Seeley et al. Feb 2005 B2
6899672 Chin May 2005 B2
6926709 Beiger et al. Aug 2005 B2
7180976 Wink Feb 2007 B2
7203277 Birkenbach et al. Apr 2007 B2
7206627 Abovitz Apr 2007 B2
7233820 Gilboa Jun 2007 B2
7386339 Strommer et al. Jun 2008 B2
7594925 Danek Sep 2009 B2
7618371 Younge et al. Nov 2009 B2
7756563 Higgins Jul 2010 B2
7774044 Sauer et al. Aug 2010 B2
7850642 Moll et al. Dec 2010 B2
7880739 Long et al. Feb 2011 B2
7901348 Soper Mar 2011 B2
7935059 Younge et al. May 2011 B2
7963288 Rosenberg et al. Jun 2011 B2
7972298 Wallace et al. Jul 2011 B2
7974681 Wallace et al. Jul 2011 B2
7976539 Hlavka et al. Jul 2011 B2
8005537 Hlavka et al. Aug 2011 B2
8021326 Moll et al. Sep 2011 B2
8041413 Barbagli et al. Oct 2011 B2
8050523 Younge et al. Nov 2011 B2
8052621 Wallace et al. Nov 2011 B2
8052636 Moll et al. Nov 2011 B2
8092397 Wallace et al. Jan 2012 B2
8108069 Stahler et al. Jan 2012 B2
8155403 Tschirren Apr 2012 B2
8172747 Wallace et al. May 2012 B2
8180114 Nishihara et al. May 2012 B2
8190238 Moll et al. May 2012 B2
8257303 Moll et al. Sep 2012 B2
8285364 Barbagli et al. Oct 2012 B2
8290571 Younge et al. Oct 2012 B2
8298135 Ito et al. Oct 2012 B2
8311626 Hlavka et al. Nov 2012 B2
8317746 Sewell et al. Nov 2012 B2
8388538 Younge et al. Mar 2013 B2
8388556 Wallace et al. Mar 2013 B2
8394054 Wallace et al. Mar 2013 B2
8409136 Wallace et al. Apr 2013 B2
8409172 Moll et al. Apr 2013 B2
8409234 Stahler et al. Apr 2013 B2
8460236 Roelle et al. Jun 2013 B2
8498691 Moll et al. Jul 2013 B2
8617102 Moll et al. Dec 2013 B2
8716973 Lammertse May 2014 B1
8718837 Wang et al. May 2014 B2
8720448 Reis et al. May 2014 B2
8801661 Moll et al. Aug 2014 B2
8821376 Tolkowsky Sep 2014 B2
8858424 Hasegawa Oct 2014 B2
8926603 Hlavka et al. Jan 2015 B2
8929631 Pfister et al. Jan 2015 B2
8961533 Stahler et al. Feb 2015 B2
8971597 Zhao et al. Mar 2015 B2
8974408 Wallace et al. Mar 2015 B2
9014851 Wong et al. Apr 2015 B2
9084623 Gomez et al. Jul 2015 B2
9125639 Mathis Sep 2015 B2
9138129 Diolaiti Sep 2015 B2
9173713 Hart et al. Nov 2015 B2
9183354 Baker et al. Nov 2015 B2
9186046 Ramamurthy et al. Nov 2015 B2
9241767 Prisco et al. Jan 2016 B2
9272416 Hourtash et al. Mar 2016 B2
9283046 Walker et al. Mar 2016 B2
9289578 Walker et al. Mar 2016 B2
9358076 Moll et al. Jun 2016 B2
9457168 Moll et al. Oct 2016 B2
9459087 Dunbar Oct 2016 B2
9498291 Balaji et al. Nov 2016 B2
9498601 Tanner et al. Nov 2016 B2
9503681 Popescu et al. Nov 2016 B1
9504604 Alvarez Nov 2016 B2
9561019 Mihailescu et al. Feb 2017 B2
9561083 Yu et al. Feb 2017 B2
9566414 Wong et al. Feb 2017 B2
9603668 Weingarten et al. Mar 2017 B2
9622827 Yu et al. Apr 2017 B2
9629682 Wallace et al. Apr 2017 B2
9636184 Lee et al. May 2017 B2
9710921 Wong et al. Jul 2017 B2
9713509 Schuh et al. Jul 2017 B2
9717563 Tognaccini Aug 2017 B2
9727963 Mintz et al. Aug 2017 B2
9737371 Romo et al. Aug 2017 B2
9737373 Schuh Aug 2017 B2
9744335 Jiang Aug 2017 B2
9763741 Alvarez et al. Sep 2017 B2
9770216 Brown et al. Sep 2017 B2
9788910 Schuh Oct 2017 B2
9827061 Balaji et al. Nov 2017 B2
9844412 Bogusky et al. Dec 2017 B2
9867635 Alvarez et al. Jan 2018 B2
9918681 Wallace et al. Mar 2018 B2
10016900 Meyer et al. Jul 2018 B1
10022192 Ummalaneni Jul 2018 B1
10028789 Quaid et al. Jul 2018 B2
10046140 Kokish et al. Aug 2018 B2
10123843 Wong et al. Nov 2018 B2
10130427 Tanner et al. Nov 2018 B2
10136950 Schoenefeld Nov 2018 B2
10143526 Walker et al. Dec 2018 B2
10145747 Lin et al. Dec 2018 B1
10159532 Ummalaneni et al. Dec 2018 B1
10206746 Walker et al. Feb 2019 B2
10278778 State May 2019 B2
10285574 Landey et al. May 2019 B2
10299870 Connolly et al. May 2019 B2
10346976 Averbuch et al. Jul 2019 B2
10482599 Mintz et al. Nov 2019 B2
10820954 Marsot et al. Nov 2020 B2
20010021843 Bosselmann et al. Sep 2001 A1
20010039421 Heilbrun Nov 2001 A1
20020065455 Ben-Haim et al. May 2002 A1
20020077533 Bieger et al. Jun 2002 A1
20020082612 Moll et al. Jun 2002 A1
20020120188 Brock et al. Aug 2002 A1
20020161280 Chatenever et al. Oct 2002 A1
20020173878 Watanabe Nov 2002 A1
20030105603 Hardesty Jun 2003 A1
20030125622 Schweikard Jul 2003 A1
20030181809 Hall et al. Sep 2003 A1
20030195664 Nowlin et al. Oct 2003 A1
20040047044 Dalton Mar 2004 A1
20040072066 Cho et al. Apr 2004 A1
20040186349 Ewers Sep 2004 A1
20040249267 Gilboa Dec 2004 A1
20040263535 Birkenbach Dec 2004 A1
20050027397 Niemeyer Feb 2005 A1
20050043718 Madhani Feb 2005 A1
20050060006 Pflueger Mar 2005 A1
20050085714 Foley Apr 2005 A1
20050107679 Geiger May 2005 A1
20050143649 Minai et al. Jun 2005 A1
20050143655 Satoh Jun 2005 A1
20050182295 Soper et al. Aug 2005 A1
20050182319 Glossop Aug 2005 A1
20050193451 Quistgaard Sep 2005 A1
20050197557 Strommer et al. Sep 2005 A1
20050222554 Wallace et al. Oct 2005 A1
20050256398 Hastings Nov 2005 A1
20050272975 McWeeney et al. Dec 2005 A1
20060004286 Chang Jan 2006 A1
20060015096 Hauck et al. Jan 2006 A1
20060025668 Peterson Feb 2006 A1
20060058643 Florent Mar 2006 A1
20060079745 Viswanathan et al. Apr 2006 A1
20060084860 Geiger Apr 2006 A1
20060095022 Moll et al. May 2006 A1
20060095066 Chang May 2006 A1
20060098851 Shoham May 2006 A1
20060149134 Soper et al. Jul 2006 A1
20060173290 Lavallee et al. Aug 2006 A1
20060184016 Glossop Aug 2006 A1
20060200026 Wallace et al. Sep 2006 A1
20060209019 Hu Sep 2006 A1
20060258935 Pile-Spellman et al. Nov 2006 A1
20060258938 Hoffman et al. Nov 2006 A1
20070032826 Schwartz Feb 2007 A1
20070055128 Glossop Mar 2007 A1
20070055144 Neustadter Mar 2007 A1
20070073136 Metzger Mar 2007 A1
20070083098 Stern et al. Apr 2007 A1
20070083193 Werneth Apr 2007 A1
20070123748 Meglan May 2007 A1
20070135886 Maschke Jun 2007 A1
20070138992 Prisco et al. Jun 2007 A1
20070144298 Miller Jun 2007 A1
20070156019 Larkin et al. Jul 2007 A1
20070161857 Durant et al. Jul 2007 A1
20070167743 Honda Jul 2007 A1
20070167801 Webler et al. Jul 2007 A1
20070185486 Hauck et al. Aug 2007 A1
20070208252 Makower Sep 2007 A1
20070253599 White et al. Nov 2007 A1
20070269001 Maschke Nov 2007 A1
20070293721 Gilboa Dec 2007 A1
20070299353 Harlev et al. Dec 2007 A1
20080027313 Shachar Jan 2008 A1
20080027464 Moll et al. Jan 2008 A1
20080033442 Amoit Feb 2008 A1
20080071140 Gattani Mar 2008 A1
20080079421 Jensen Apr 2008 A1
20080082109 Moll et al. Apr 2008 A1
20080097465 Rollins et al. Apr 2008 A1
20080103389 Begelman et al. May 2008 A1
20080108870 Wiita et al. May 2008 A1
20080118118 Berger May 2008 A1
20080118135 Averbach May 2008 A1
20080123921 Gielen et al. May 2008 A1
20080140087 Barbagli Jun 2008 A1
20080147089 Loh Jun 2008 A1
20080159653 Dunki-Jacobs et al. Jul 2008 A1
20080161681 Hauck Jul 2008 A1
20080183064 Chandonnet Jul 2008 A1
20080183068 Carls Jul 2008 A1
20080183073 Higgins et al. Jul 2008 A1
20080183188 Carls et al. Jul 2008 A1
20080201016 Finlay Aug 2008 A1
20080207997 Higgins et al. Aug 2008 A1
20080212082 Froggatt et al. Sep 2008 A1
20080218770 Moll et al. Sep 2008 A1
20080243064 Stahler et al. Oct 2008 A1
20080243142 Gildenberg Oct 2008 A1
20080249536 Stahler et al. Oct 2008 A1
20080262297 Gilboa Oct 2008 A1
20080262480 Stahler et al. Oct 2008 A1
20080262513 Stahler et al. Oct 2008 A1
20080275349 Halperin Nov 2008 A1
20080287963 Rogers et al. Nov 2008 A1
20080306490 Lakin et al. Dec 2008 A1
20080312501 Hasegawa et al. Dec 2008 A1
20090024141 Stahler et al. Jan 2009 A1
20090030307 Govari Jan 2009 A1
20090054729 Mori Feb 2009 A1
20090062602 Rosenberg et al. Mar 2009 A1
20090076476 Barbagli et al. Mar 2009 A1
20090138025 Stahler et al. May 2009 A1
20090149867 Glozman Jun 2009 A1
20090209817 Averbuch Aug 2009 A1
20090227861 Ganatra Sep 2009 A1
20090228020 Wallace et al. Sep 2009 A1
20090248036 Hoffman et al. Oct 2009 A1
20090254083 Wallace et al. Oct 2009 A1
20090259230 Khadem Oct 2009 A1
20090259412 Brogardh Oct 2009 A1
20090262109 Markowitz et al. Oct 2009 A1
20090292166 Ito Nov 2009 A1
20090295797 Sakaguchi Dec 2009 A1
20090322671 Scott et al. Dec 2009 A1
20090326322 Diolaiti Dec 2009 A1
20090326556 Diolaiti et al. Dec 2009 A1
20100008555 Trumer Jan 2010 A1
20100019890 Helmer et al. Jan 2010 A1
20100030061 Canfield Feb 2010 A1
20100039506 Sarvestani et al. Feb 2010 A1
20100041949 Tolkowsky Feb 2010 A1
20100053151 Marti et al. Mar 2010 A1
20100054536 Huang Mar 2010 A1
20100076263 Tanaka Mar 2010 A1
20100113852 Sydora May 2010 A1
20100121139 OuYang May 2010 A1
20100121269 Goldenberg May 2010 A1
20100125284 Tanner et al. May 2010 A1
20100160733 Gilboa Jun 2010 A1
20100161022 Tolkowsky Jun 2010 A1
20100161129 Costa et al. Jun 2010 A1
20100204613 Rollins et al. Aug 2010 A1
20100225209 Goldberg Sep 2010 A1
20100240989 Stoianovici Sep 2010 A1
20100290530 Huang et al. Nov 2010 A1
20100292565 Meyer Nov 2010 A1
20100295931 Schmidt Nov 2010 A1
20100298641 Tanaka Nov 2010 A1
20100328455 Nam et al. Dec 2010 A1
20110015648 Alvarez et al. Jan 2011 A1
20110021926 Spencer Jan 2011 A1
20110054303 Barrick Mar 2011 A1
20110092808 Shachar Apr 2011 A1
20110113852 Prisco May 2011 A1
20110118748 Itkowitz May 2011 A1
20110118752 Itkowitz et al. May 2011 A1
20110118753 Itkowitz et al. May 2011 A1
20110130718 Kidd et al. Jun 2011 A1
20110184238 Higgins Jul 2011 A1
20110196199 Donhowe et al. Aug 2011 A1
20110234780 Ito Sep 2011 A1
20110235855 Smith Sep 2011 A1
20110238010 Kirschenman et al. Sep 2011 A1
20110238082 Wenderow Sep 2011 A1
20110238083 Moll et al. Sep 2011 A1
20110245665 Nentwick Oct 2011 A1
20110248987 Mitchell Oct 2011 A1
20110249016 Zhang Oct 2011 A1
20110257480 Takahashi Oct 2011 A1
20110270273 Moll et al. Nov 2011 A1
20110276058 Choi et al. Nov 2011 A1
20110276179 Banks et al. Nov 2011 A1
20110295247 Schlesinger et al. Dec 2011 A1
20110295248 Wallace et al. Dec 2011 A1
20110295267 Tanner et al. Dec 2011 A1
20110295268 Roelle et al. Dec 2011 A1
20110306873 Shenai et al. Dec 2011 A1
20110319910 Roelle et al. Dec 2011 A1
20120046521 Hunter et al. Feb 2012 A1
20120056986 Popovic Mar 2012 A1
20120059392 Diolaiti Mar 2012 A1
20120062714 Liu Mar 2012 A1
20120065481 Hunter Mar 2012 A1
20120069167 Liu et al. Mar 2012 A1
20120071752 Sewell Mar 2012 A1
20120071782 Patil et al. Mar 2012 A1
20120071891 Itkowitz et al. Mar 2012 A1
20120071892 Itkowitz et al. Mar 2012 A1
20120071894 Tanner et al. Mar 2012 A1
20120075638 Rollins et al. Mar 2012 A1
20120078053 Phee et al. Mar 2012 A1
20120082351 Higgins Apr 2012 A1
20120103123 McInroy et al. May 2012 A1
20120116253 Wallace et al. May 2012 A1
20120120305 Takahashi May 2012 A1
20120158011 Sandhu Jun 2012 A1
20120165656 Montag Jun 2012 A1
20120191079 Moll et al. Jul 2012 A1
20120203067 Higgins et al. Aug 2012 A1
20120209069 Popovic Aug 2012 A1
20120215094 Rahimian et al. Aug 2012 A1
20120219185 Hu Aug 2012 A1
20120253276 Govari et al. Oct 2012 A1
20120289777 Chopra Nov 2012 A1
20120289783 Duindam et al. Nov 2012 A1
20120296161 Wallace et al. Nov 2012 A1
20120302869 Koyrakh Nov 2012 A1
20120314022 Jo Dec 2012 A1
20130018306 Ludwin Jan 2013 A1
20130060146 Yang Mar 2013 A1
20130072787 Wallace et al. Mar 2013 A1
20130144116 Cooper et al. Jun 2013 A1
20130165854 Sandhu et al. Jun 2013 A1
20130165945 Roelle Jun 2013 A9
20130190741 Moll et al. Jul 2013 A1
20130204124 Duindam Aug 2013 A1
20130225942 Holsing Aug 2013 A1
20130243153 Sra Sep 2013 A1
20130245375 DiMaio et al. Sep 2013 A1
20130246334 Ahuja Sep 2013 A1
20130259315 Angot et al. Oct 2013 A1
20130303892 Zhao Nov 2013 A1
20130317519 Romo et al. Nov 2013 A1
20130345718 Crawford Dec 2013 A1
20140058406 Tsekos Feb 2014 A1
20140072192 Reiner Mar 2014 A1
20140107390 Brown Apr 2014 A1
20140107666 Madhani Apr 2014 A1
20140111457 Briden et al. Apr 2014 A1
20140114180 Jain Apr 2014 A1
20140148808 Inkpen et al. Apr 2014 A1
20140142591 Alvarez et al. May 2014 A1
20140180063 Zhao Jun 2014 A1
20140222204 Kawashima Aug 2014 A1
20140235943 Paris Aug 2014 A1
20140243849 Saglam Aug 2014 A1
20140257746 Dunbar et al. Sep 2014 A1
20140264081 Walker et al. Sep 2014 A1
20140275988 Walker et al. Sep 2014 A1
20140276033 Brannan Sep 2014 A1
20140276392 Wong et al. Sep 2014 A1
20140276594 Tanner et al. Sep 2014 A1
20140276933 Hart et al. Sep 2014 A1
20140276937 Wong et al. Sep 2014 A1
20140276938 Hsu et al. Sep 2014 A1
20140277333 Lewis et al. Sep 2014 A1
20140296655 Akhbardeh et al. Oct 2014 A1
20140296657 Izmirli Oct 2014 A1
20140309527 Namati et al. Oct 2014 A1
20140309649 Alvarez et al. Oct 2014 A1
20140343416 Panescu Nov 2014 A1
20140350391 Prisco et al. Nov 2014 A1
20140357953 Roelle et al. Dec 2014 A1
20140357984 Wallace et al. Dec 2014 A1
20140364739 Liu Dec 2014 A1
20140364870 Alvarez et al. Dec 2014 A1
20140379000 Romo et al. Dec 2014 A1
20150018622 Tesar et al. Jan 2015 A1
20150051482 Liu et al. Feb 2015 A1
20150051592 Kintz Feb 2015 A1
20150054929 Ito et al. Feb 2015 A1
20150057498 Akimoto Feb 2015 A1
20150073266 Brannan Mar 2015 A1
20150101442 Romo Apr 2015 A1
20150105747 Rollins et al. Apr 2015 A1
20150119638 Yu et al. Apr 2015 A1
20150141808 Elhawary May 2015 A1
20150141858 Razavi May 2015 A1
20150142013 Tanner et al. May 2015 A1
20150157191 Phee et al. Jun 2015 A1
20150164594 Romo et al. Jun 2015 A1
20150164596 Romo Jun 2015 A1
20150223725 Engel Aug 2015 A1
20150223897 Kostrzewski et al. Aug 2015 A1
20150223902 Walker et al. Aug 2015 A1
20150224845 Anderson et al. Aug 2015 A1
20150255782 Kim et al. Sep 2015 A1
20150265087 Park Sep 2015 A1
20150265368 Chopra Sep 2015 A1
20150265807 Park et al. Sep 2015 A1
20150275986 Cooper Oct 2015 A1
20150287192 Sasaki Oct 2015 A1
20150290454 Tyler et al. Oct 2015 A1
20150297133 Jouanique-Dubuis et al. Oct 2015 A1
20150305650 Hunter Oct 2015 A1
20150313503 Seibel et al. Nov 2015 A1
20150335480 Alvarez et al. Nov 2015 A1
20150375399 Chiu et al. Dec 2015 A1
20160000302 Brown Jan 2016 A1
20160000414 Brown Jan 2016 A1
20160000520 Lachmanovich Jan 2016 A1
20160001038 Romo et al. Jan 2016 A1
20160008033 Hawkins et al. Jan 2016 A1
20160026253 Bradski et al. Jan 2016 A1
20160059412 Oleynik Mar 2016 A1
20160098095 Gonzalez-Banos et al. Apr 2016 A1
20160111192 Suzara Apr 2016 A1
20160128992 Hudson May 2016 A1
20160183841 Duindam et al. Jun 2016 A1
20160199134 Brown et al. Jul 2016 A1
20160206389 Miller Jul 2016 A1
20160213432 Flexman Jul 2016 A1
20160213436 Inoue Jul 2016 A1
20160213884 Park Jul 2016 A1
20160228032 Walker et al. Aug 2016 A1
20160235495 Wallace et al. Aug 2016 A1
20160256069 Jenkins Sep 2016 A1
20160270865 Landey et al. Sep 2016 A1
20160279394 Moll et al. Sep 2016 A1
20160287279 Bovay et al. Oct 2016 A1
20160287346 Hyodo et al. Oct 2016 A1
20160296294 Moll et al. Oct 2016 A1
20160314710 Jarc Oct 2016 A1
20160314716 Grubbs Oct 2016 A1
20160314717 Grubbs Oct 2016 A1
20160324580 Esterberg et al. Nov 2016 A1
20160331469 Hall et al. Nov 2016 A1
20160360947 Iida Dec 2016 A1
20160372743 Cho et al. Dec 2016 A1
20160374541 Agrawal et al. Dec 2016 A1
20170007337 Dan Jan 2017 A1
20170023423 Jackson Jan 2017 A1
20170055851 Al-Ali Mar 2017 A1
20170065364 Schuh et al. Mar 2017 A1
20170065365 Schuh Mar 2017 A1
20170079725 Hoffman Mar 2017 A1
20170079726 Hoffman Mar 2017 A1
20170086929 Moll et al. Mar 2017 A1
20170100199 Yu et al. Apr 2017 A1
20170113019 Wong et al. Apr 2017 A1
20170119411 Shah May 2017 A1
20170119412 Noonan et al. May 2017 A1
20170119413 Romo May 2017 A1
20170119481 Romo et al. May 2017 A1
20170119484 Tanner et al. May 2017 A1
20170143429 Richmond et al. May 2017 A1
20170165011 Bovay et al. Jun 2017 A1
20170172664 Weingarten et al. Jun 2017 A1
20170172673 Yu et al. Jun 2017 A1
20170189118 Chopra Jul 2017 A1
20170202627 Sramek et al. Jul 2017 A1
20170209073 Sramek et al. Jul 2017 A1
20170215808 Shimol et al. Aug 2017 A1
20170215969 Zhai et al. Aug 2017 A1
20170215978 Wallace et al. Aug 2017 A1
20170238807 Veritkov et al. Aug 2017 A9
20170258366 Tupin Sep 2017 A1
20170290631 Lee et al. Oct 2017 A1
20170296032 Li Oct 2017 A1
20170296202 Brown Oct 2017 A1
20170303941 Eisner Oct 2017 A1
20170325896 Donhowe Nov 2017 A1
20170333679 Jiang Nov 2017 A1
20170340241 Yamada Nov 2017 A1
20170340396 Romo et al. Nov 2017 A1
20170348067 Krimsky Dec 2017 A1
20170360418 Wong et al. Dec 2017 A1
20170360508 Germain et al. Dec 2017 A1
20170365055 Mintz et al. Dec 2017 A1
20170367782 Schuh et al. Dec 2017 A1
20180025666 Ho et al. Jan 2018 A1
20180055576 Koyrakh Mar 2018 A1
20180055582 Krimsky Mar 2018 A1
20180055583 Schuh et al. Mar 2018 A1
20180078321 Liao Mar 2018 A1
20180079090 Koenig et al. Mar 2018 A1
20180098690 Iwaki Apr 2018 A1
20180177383 Noonan et al. Jun 2018 A1
20180177556 Noonan et al. Jun 2018 A1
20180177561 Mintz et al. Jun 2018 A1
20180184988 Walker et al. Jul 2018 A1
20180214011 Graetzel et al. Aug 2018 A1
20180217734 Koenig et al. Aug 2018 A1
20180221038 Noonan et al. Aug 2018 A1
20180221039 Shah Aug 2018 A1
20180240237 Donhowe et al. Aug 2018 A1
20180250083 Schuh et al. Sep 2018 A1
20180271616 Schuh et al. Sep 2018 A1
20180279852 Rafii-Tari et al. Oct 2018 A1
20180280660 Landey et al. Oct 2018 A1
20180286108 Hirakawa Oct 2018 A1
20180289431 Draper et al. Oct 2018 A1
20180308247 Gupta Oct 2018 A1
20180325499 Landey et al. Nov 2018 A1
20180333044 Jenkins Nov 2018 A1
20180360435 Romo Dec 2018 A1
20180368920 Ummalaneni Dec 2018 A1
20190000559 Berman et al. Jan 2019 A1
20190000560 Berman et al. Jan 2019 A1
20190000566 Graetzel et al. Jan 2019 A1
20190000576 Mintz et al. Jan 2019 A1
20190046814 Senden et al. Feb 2019 A1
20190066314 Abhari Feb 2019 A1
20190083183 Moll et al. Mar 2019 A1
20190086349 Nelson Mar 2019 A1
20190090969 Jarc et al. Mar 2019 A1
20190105776 Ho et al. Apr 2019 A1
20190105785 Meyer Apr 2019 A1
20190107454 Lin Apr 2019 A1
20190110839 Rafii-Tari et al. Apr 2019 A1
20190110843 Ummalaneni et al. Apr 2019 A1
20190117176 Walker et al. Apr 2019 A1
20190117203 Wong et al. Apr 2019 A1
20190151148 Alvarez et al. Apr 2019 A1
20190125164 Roelle et al. May 2019 A1
20190142519 Siemionow et al. May 2019 A1
20190151032 Mustufa et al. May 2019 A1
20190167361 Walker et al. Jun 2019 A1
20190167366 Ummalaneni Jun 2019 A1
20190175009 Mintz Jun 2019 A1
20190175062 Rafii-Tari et al. Jun 2019 A1
20190175287 Hill Jun 2019 A1
20190175799 Hsu Jun 2019 A1
20190183585 Rafii-Tari et al. Jun 2019 A1
20190183587 Rafii-Tari et al. Jun 2019 A1
20190216548 Ummalaneni Jul 2019 A1
20190216550 Eyre Jul 2019 A1
20190216576 Eyre Jul 2019 A1
20190223974 Romo Jul 2019 A1
20190228525 Mintz et al. Jul 2019 A1
20190246882 Graetzel et al. Aug 2019 A1
20190262086 Connolly et al. Aug 2019 A1
20190269468 Hsu et al. Sep 2019 A1
20190274764 Romo Sep 2019 A1
20190287673 Michihata Sep 2019 A1
20190290109 Agrawal et al. Sep 2019 A1
20190298160 Ummalaneni et al. Oct 2019 A1
20190298458 Srinivasan Oct 2019 A1
20190298460 Al-Jadda Oct 2019 A1
20190298465 Chin Oct 2019 A1
20190328213 Landey et al. Oct 2019 A1
20190336238 Yu Nov 2019 A1
20190365201 Noonan et al. Dec 2019 A1
20190365209 Ye et al. Dec 2019 A1
20190365479 Rafii-Tari Dec 2019 A1
20190365486 Srinivasan et al. Dec 2019 A1
20190371012 Flexman Dec 2019 A1
20190374297 Wallace et al. Dec 2019 A1
20190375383 Alvarez Dec 2019 A1
20190380787 Ye Dec 2019 A1
20190380797 Yu Dec 2019 A1
20200000530 DeFonzo Jan 2020 A1
20200000533 Schuh Jan 2020 A1
20200022767 Hill Jan 2020 A1
20200038123 Graetzel Feb 2020 A1
20200039086 Meyer Feb 2020 A1
20200046434 Graetzel Feb 2020 A1
20200054405 Schuh Feb 2020 A1
20200054408 Schuh et al. Feb 2020 A1
20200060516 Baez Feb 2020 A1
20200078103 Duindam Mar 2020 A1
20200085516 DeFonzo Mar 2020 A1
20200093549 Chin Mar 2020 A1
20200093554 Schuh Mar 2020 A1
20200100845 Julian Apr 2020 A1
20200100853 Ho Apr 2020 A1
20200100855 Leparmentier Apr 2020 A1
20200101264 Jiang Apr 2020 A1
20200107894 Wallace Apr 2020 A1
20200121502 Kintz Apr 2020 A1
20200146769 Eyre May 2020 A1
20200155084 Walker May 2020 A1
20200170630 Wong Jun 2020 A1
20200170720 Ummalaneni Jun 2020 A1
20200171660 Ho Jun 2020 A1
20200188043 Yu Jun 2020 A1
20200197112 Chin Jun 2020 A1
20200206472 Ma Jul 2020 A1
20200217733 Lin Jul 2020 A1
20200222134 Schuh Jul 2020 A1
20200237458 DeFonzo Jul 2020 A1
20200261172 Romo Aug 2020 A1
20200268459 Noonan et al. Aug 2020 A1
20200268460 Tse Aug 2020 A1
20200281787 Ruiz Sep 2020 A1
20200297437 Schuh Sep 2020 A1
20200297444 Camarillo Sep 2020 A1
20200305983 Yampolsky Oct 2020 A1
20200305989 Schuh Oct 2020 A1
20200305992 Schuh Oct 2020 A1
20200315717 Bovay Oct 2020 A1
20200315723 Hassan Oct 2020 A1
20200323596 Moll Oct 2020 A1
20200330167 Romo Oct 2020 A1
20200345216 Jenkins Nov 2020 A1
20200345432 Walker Nov 2020 A1
20200352420 Graetzel Nov 2020 A1
20200360183 Alvarez Nov 2020 A1
20200360659 Wong Nov 2020 A1
20200367726 Landey et al. Nov 2020 A1
20200367981 Ho et al. Nov 2020 A1
20200375678 Wallace Dec 2020 A1
Foreign Referenced Citations (37)
Number Date Country
101147676 Mar 2008 CN
101222882 Jul 2008 CN
102316817 Jan 2012 CN
102458295 May 2012 CN
102946801 Feb 2013 CN
102973317 Mar 2013 CN
103705307 Apr 2014 CN
103735313 Apr 2014 CN
103813748 May 2014 CN
104758066 Jul 2015 CN
105559850 May 2016 CN
105559886 May 2016 CN
105611881 May 2016 CN
106455908 Feb 2017 CN
106821498 Jun 2017 CN
104931059 Sep 2018 CN
1 800 593 Jun 2007 EP
1 109 497 May 2009 EP
2 158 834 Mar 2010 EP
3 025 630 Jun 2019 EP
10-2014-0009359 Jan 2014 KR
10-1713676 Mar 2017 KR
2569699 Nov 2015 RU
WO 05087128 Sep 2005 WO
WO 09097461 Jun 2007 WO
WO 08049088 Apr 2008 WO
WO 10025522 Mar 2010 WO
WO 13040498 Mar 2013 WO
WO 15089013 Jun 2015 WO
WO 16077419 May 2016 WO
WO 16203727 Dec 2016 WO
WO 17036774 Mar 2017 WO
WO 17048194 Mar 2017 WO
WO 17066108 Apr 2017 WO
WO 17146890 Aug 2017 WO
WO 17167754 Oct 2017 WO
WO 17214243 Dec 2017 WO
Non-Patent Literature Citations (48)
Entry
Racadio et al., “Live 3D guidance in the interventional radiology suite,” Dec. 2007, AJR, 189:W357-W364.
Solheim et al., “Navigated resection of giant intracranial meningiomas based on intraoperative 3D ultrasound,” May 14, 2009, Acta Neurochir, 151:1143-1151.
“Point Cloud,” Sep. 10, 2010, Wikipedia.
European search report and search opinion dated Aug. 24, 2015 for EP Application No. 12832283.1.
International search report and written opinion dated Feb. 5, 2013 for PCT/US2012/055634.
Nikou et al., Augmented reality imaging technology for orthopaedic surgery, Operative Techniques in Orthopaedics 10.1 (2000): 82-86.
Office action dated Mar. 17, 2015 for U.S. Appl. No. 13/618,915.
Office action dated May 11, 2016 for U.S. Appl. No. 13/618,915.
Office action dated May 24, 2017 for U.S. Appl. No. 13/618,915.
Office action dated Aug. 14, 2014 for U.S. Appl. No. 13/618,915.
Office action dated Oct. 14, 2016 for U.S. Appl. No. 13/618,915.
Notice of allowance dated Nov. 8, 2017 for U.S. Appl. No. 13/618,915.
Notice of allowance dated Nov. 22, 2017 for U.S. Appl. No. 13/618,915.
Ciuti et al., 2012, Intra-operative monocular 3D reconstruction for image-guided navigation in active locomotion capsule endoscopy. Biomedical Robotics and Biomechatronics (BioRob), 4th IEEE RAS & EMBS International Conference on IEEE.
Fallavollita et al., 2010, Acquiring multiview C-arm images to assist cardiac ablation procedures, EURASIP Journal on Image and Video Processing, vol. 2010, Article ID 871408, pp. 1-10.
Kumar et al., 2014, Stereoscopic visualization of laparoscope image using depth information from 3D model, Computer methods and programs in biomedicine 113(3):862-868.
Livatino et al., 2015, Stereoscopic visualization and 3-D technologies in medical endoscopic teleoperation, IEEE.
Luo et al., 2010, Modified hybrid bronchoscope tracking based on sequential Monte Carlo sampler: Dynamic phantom validation, Asian Conference on Computer Vision. Springer, Berlin, Heidelberg.
Mourgues et al., 2002, Flexible calibration of actuated stereoscopic endoscope for overlay in robot assisted surgery, International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, Berlin, Heidelberg.
Nadeem et al., 2016, Depth Reconstruction and Computer-Aided Polyp Detection in Optical Colonoscopy Video Frames, arXiv preprint arXiv:1609.01329.
Sato et al., 2016, Techniques of stapler-based navigational thoracoscopic segmentectomy using virtual assisted lung mapping (VAL-MAP), Journal of Thoracic Disease, 8(Suppl 9):S716.
Shen et al., 2015, Robust camera localisation with depth reconstruction for bronchoscopic navigation. International Journal of Computer Assisted Radiology and Surgery, 10(6):801-813.
Song et al., 2012, Autonomous and stable tracking of endoscope instrument tools with monocular camera, Advanced Intelligent Mechatronics (AIM), 2012 IEEE/ASME International Conference on. IEEE.
Verdaasdonk et al., Jan. 23, 2013, Effect of microsecond pulse length and tip shape on explosive bubble formation of 2.78 μm Er,Cr:YSGG and 2.94 μm Er:YAG laser, Proceedings of SPIE, vol. 8221, 12.
Yip et al., 2012, Tissue tracking and registration for image-guided surgery, IEEE transactions on medical imaging 31(11):2169-2182.
Zhou et al., 2010, Synthesis of stereoscopic views from monocular endoscopic videos, Computer Vision and Pattern Recognition Workshops (CVPRW), 2010 IEEE Computer Society Conference on IEEE.
Haigron et al., 2004, Depth-map-based scene analysis for active navigation in virtual angioscopy, IEEE Transactions on Medical Imaging, 23(11):1380-1390.
Kiraly et al., 2002, Three-dimensional Human Airway Segmentation Methods for Clinical Virtual Bronchoscopy, Acad Radiol, 9:1153-1168.
Kiraly et al., Sep. 2004, Three-dimensional path planning for virtual bronchoscopy, IEEE Transactions on Medical Imaging, 23(9):1365-1379.
Solomon et al., Dec. 2000, Three-dimensional CT-Guided Bronchoscopy With a Real-Time Electromagnetic Position Sensor: A Comparison of Two Image Registration Methods, Chest, 118(6):1783-1787.
Konen et al., 1998, The VN-project: endoscopic image processing for neurosurgery, Computer Aided Surgery, 3:1-6.
Mayo Clinic, Robotic Surgery, https://www.mayoclinic.org/tests-procedures/robotic-surgery/about/pac-20394974?p=1, downloaded from the internet on Jul. 12, 2018, 2 pp.
Shi et al., Sep. 14-18, 2014, Simultaneous catheter and environment modeling for trans-catheter aortic valve implantation, IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2024-2029.
Vemuri et al., Dec. 2015, Inter-operative biopsy site relocations in endoluminal surgery, IEEE Transactions on Biomedical Engineering, Institute of Electrical and Electronics Engineers, <10.1109/TBME.2015.2503981>. <hal-01230752>.
Wilson et al., 2008, A buyer's guide to electromagnetic tracking systems for clinical applications, Proc. of SPIE, 6918:69182B-1 to 69182B-11.
Al-Ahmad et al., dated 2005, Early experience with a computerized robotically controlled catheter system, Journal of Interventional Cardiac Electrophysiology, 12:199-202.
Bell et al., 2014, Six DOF motion estimation for teleoperated flexible endoscopes using optical flow: a comparative study, IEEE International Conference on Robotics and Automation.
Gutierrez et al., Mar. 2008, A practical global distortion correction method for an image intensifier based x-ray fluoroscopy system, Med. Phys, 35(3):997-1007.
Hansen Medical, Inc. 2005, System Overview, product brochure, 2 pp., dated as available at http://hansenmedical.com/system.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine).
Hansen Medical, Inc. Bibliography, product brochure, 1 p., dated as available at http://hansenmedical.com/bibliography.aspx on Jul. 14, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine).
Hansen Medical, Inc. dated 2007, Introducing the Sensei Robotic Catheter System, product brochure, 10 pp.
Hansen Medical, Inc. dated 2009, Sensei X Robotic Catheter System, product brochure, 5 pp.
Hansen Medical, Inc. Technology Advantages, product brochure, 1 p., dated as available at http://hansenmedical.com/advantages.aspx on Jul. 13, 2006 (accessed Jun. 25, 2019 using the internet archive way back machine).
Marrouche et al., dated May 6, 2005, AB32-1, Preliminary human experience using a novel robotic catheter remote control, Heart Rhythm, 2(5):S63.
Oh et al., dated May 2005, P5-75, Novel robotic catheter remote control system: safety and accuracy in delivering RF lesions in all 4 cardiac chambers, Heart Rhythm, 2(5):S277-S278.
Reddy et al., May 2005, P1-53, Porcine pulmonary vein ablation using a novel robotic catheter control system and real-time integration of CT imaging with electroanatomical mapping, Heart Rhythm, 2(5):S121.
Ren et al., 2011, Multisensor data fusion in an integrated tracking system for endoscopic surgery, IEEE Transactions on Information Technology in Biomedicine, 16(1):106-111.
Slepian, dated 2010, Robotic Catheter Intervention: the Hansen Medical Sensei Robot Catheter System, PowerPoint presentation, 28 pp.
Related Publications (1)
Number Date Country
20140357984 A1 Dec 2014 US
Provisional Applications (1)
Number Date Country
61829078 May 2013 US