TOOL POSITION AND IDENTIFICATION INDICATOR DISPLAYED IN A BOUNDARY AREA OF A COMPUTER DISPLAY SCREEN

Abstract
An endoscope captures images of a surgical site for display in a viewing area of a monitor. When a tool is outside the viewing area, a GUI indicates the position of the tool by positioning a symbol in a boundary area around the viewing area so as to indicate the tool position. The distance of the out-of-view tool from the viewing area may be indicated by the size, color, brightness, or blinking or oscillation frequency of the symbol. A distance number may also be displayed on the symbol. The orientation of the shaft or end effector of the tool may be indicated by an orientation indicator superimposed over the symbol, or by the orientation of the symbol itself. When the tool is inside the viewing area, but occluded by an object, the GUI superimposes a ghost tool at its current position and orientation over the occluding object.
Description
FIELD OF THE INVENTION

The present invention generally relates to robotic surgical systems and in particular, to a tool position and identification indicator displayed in a boundary area of a computer display screen.


BACKGROUND

Robotic surgical systems such as those used in performing minimally invasive surgical procedures offer many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue. Consequently, demand for minimally invasive surgery using robotic surgical systems is strong and growing.


One example of a robotic surgical system is the da Vinci® Surgical System from Intuitive Surgical, Inc., of Sunnyvale, Calif. The da Vinci® system includes a surgeon's console, a patient-side cart, a high performance 3-D vision system, and Intuitive Surgical's proprietary EndoWrist™ articulating instruments, which are modeled after the human wrist so that, when added to the motions of the robot arm holding the surgical instrument, they allow a full six degrees of freedom of motion, comparable to the natural motions of open surgery.


The da Vinci® surgeon's console has a high-resolution stereoscopic video display with two progressive scan cathode ray tubes (“CRTs”). The system offers higher fidelity than polarization, shutter eyeglass, or other techniques. Each eye views a separate CRT presenting the left or right eye perspective, through an objective lens and a series of mirrors. The surgeon sits comfortably and looks into this display throughout surgery, making it an ideal place for the surgeon to display and manipulate 3-D intraoperative imagery.


A stereoscopic endoscope is positioned near a surgical site to capture left and right views for display on the stereoscopic video display. When an instrument is outside a viewing area on the display, however, the surgeon may not know how far away or in which direction the instrument is at the time. This makes it difficult for the surgeon to guide the instrument to the surgical site. Also, it may be disconcerting to the surgeon if the instrument unexpectedly appears in view. Even when an instrument is within the viewing area of the display, the surgeon may not know which instrument it is or which patient-side manipulator (e.g., robotic arm on the patient-side cart) the instrument is associated with. This makes it difficult, for example, for the surgeon to instruct a patient side assistant to replace the instrument with another during a surgical procedure.


In order to locate an instrument which is outside of a viewing area on the display, it may be necessary to move the endoscope until the instrument appears in the viewing area. In this case, if the surgical instrument is being guided to the surgical site, the cameras' zoom and focus controls may also require frequent adjustment, making the process tedious and time consuming for the surgeon. If it happens that the instrument is in the camera field of view (“FOV”), but outside of the viewing area, because of a zoom-in adjustment to the view, then a zoom-out adjustment may be performed so that the instrument is back in the viewing area. Such a zoom-out, however, may be undesirable when a delicate surgical procedure is being performed that requires close scrutiny by the surgeon.


OBJECTS AND BRIEF SUMMARY

Accordingly, one object of various aspects of the present invention is a method for indicating a tool position relative to images being displayed on a computer display screen when the tool is outside a viewing area of the screen.


Another object of various aspects of the present invention is a method for indicating a tool distance from images being displayed on a computer display screen when the tool is outside a viewing area of the screen.


Another object of various aspects of the present invention is a method for indicating a tool orientation relative to images being displayed on a computer display screen when the tool is outside a viewing area of the screen.


Another object of various aspects of the present invention is a method for indicating a tool position or orientation relative to images being displayed on a computer display screen when the tool is occluded within a viewing area of the screen.


Still another object of various aspects of the present invention is a method for indicating a tool identification on a computer display screen that clearly identifies which patient-side manipulators are connected to which surgical instruments, so as to improve surgeon performance and surgeon-assistant communications.


These and additional objects are accomplished by the various aspects of the present invention, wherein the embodiments of the invention are summarized by the claims that follow below.


In preferred embodiments of the method, apparatus and medical robotic system, the symbol that is displayed to indicate a tool's position also provides information identifying the tool and/or its associated patient-side manipulator by an associated color or some other means, such as text or numeric information that is written on or displayed adjacent to the symbol. In the latter case, the text information may be continuously displayed on the computer display screen. Alternatively, it may only be displayed when a cursor is placed over the symbol or the symbol is clicked on using a pointing device.


Additional objects, features and advantages of the various aspects of the present invention will become apparent from the following description of its preferred embodiment, which description should be taken in conjunction with the accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a top view of an operating room employing a robotic surgical system utilizing aspects of the present invention.



FIG. 2 illustrates two tools positioned in the FOV of an endoscope camera.



FIG. 3 illustrates one tool positioned in and one tool positioned out of the FOV of an endoscope camera.



FIG. 4 illustrates a first computer display screen resulting from a method for indicating a tool's position when the tool is out of the FOV of an endoscope camera, utilizing aspects of the present invention.



FIG. 5 illustrates a second computer display screen resulting from a method for indicating a tool's position when the tool is out of the FOV of an endoscope camera, utilizing aspects of the present invention.



FIG. 6 illustrates a third computer display screen resulting from a method for indicating a tool's position when the tool is out of the FOV of an endoscope camera, utilizing aspects of the present invention.



FIG. 7 illustrates a fourth computer display screen resulting from a method for indicating a tool's position when the tool is occluded in the FOV of an endoscope camera, utilizing aspects of the present invention.



FIG. 8 illustrates a flow diagram of a method for indicating a tool's position when the tool is outside of, or occluded in, the FOV of an endoscope camera, utilizing aspects of the present invention.



FIG. 9 illustrates left and right views of a point in an endoscope camera reference frame as used in a robotic surgical system configured to perform the method described in reference to FIG. 8 which utilizes aspects of the present invention.



FIGS. 10 and 11 respectively illustrate a full left camera view being displayed on a left viewing area of a computer monitor and a partial left camera view being displayed on a left viewing area of a computer monitor.



FIG. 12 illustrates a flow diagram of a method for identifying a tool in a camera view that may be used in the method described in reference to FIG. 8 which utilizes aspects of the present invention.





DETAILED DESCRIPTION


FIG. 1 illustrates, as an example, a top view of an operating room employing a robotic surgical system. The robotic surgical system in this case is a Minimally Invasive Robotic Surgical (MIRS) system 100 including a Console (“C”) utilized by a Surgeon (“S”) while performing a minimally invasive diagnostic or surgical procedure, usually with assistance from one or more Assistants (“A”), on a Patient (“P”) who is lying down on an Operating table (“O”).


The Console includes a 3-D monitor 104 for displaying an image of a surgical site to the Surgeon, one or more manipulatable master manipulators 108 and 109 (also referred to herein as “control devices” and “input devices”), and a processor 102. The control devices 108 and 109 may include any one or more of a variety of input devices such as joysticks, gloves, trigger-guns, hand-operated controllers, or the like. The processor 102 is preferably a personal computer that may be integrated into the Console or positioned next to it.


The Surgeon performs a minimally invasive surgical procedure by manipulating the control devices 108 and 109 so that the processor 102 causes their respectively associated slave manipulators 128 and 129 (also referred to herein as “robotic arms” and “patient-side manipulators”) to manipulate their respective removably coupled surgical instruments 138 and 139 (also referred to herein as “tools”) accordingly, while the Surgeon views the surgical site in 3-D, as it is captured by a stereoscopic endoscope 140 (having left and right cameras for capturing left and right stereo views) and displayed on the Console 3-D monitor 104.


Each of the tools 138 and 139, as well as the endoscope 140, is preferably inserted through a cannula or other tool guide (not shown) into the Patient so as to extend down to the surgical site through a corresponding minimally invasive incision such as incision 166. Each of the robotic arms is conventionally formed of linkages, such as linkage 162, which are coupled together and manipulated through motor controlled joints, such as joint 163.


The number of surgical tools used at one time and, consequently, the number of robotic arms being used in the system 100 will generally depend on the diagnostic or surgical procedure and the space constraints within the operating room, among other factors. If it is necessary to change one or more of the tools being used during a procedure, the Surgeon may instruct the Assistant to remove the tool no longer being used from its robotic arm, and replace it with another tool 131 from a Tray (“T”) in the operating room. To aid the Assistant in identifying the tool to be replaced, each of the robotic arms 122, 128 and 129 may have an identifying number or color indicator printed on it, such as on its setup joint.


Preferably, the monitor 104 is positioned near the Surgeon's hands so that it displays a projected image oriented such that the Surgeon feels that he or she is actually looking directly down onto the operating site. To that end, the images of the tools 138 and 139 preferably appear to be located substantially where the Surgeon's hands are located. To do this, the processor 102 preferably changes the orientations of the control devices 108 and 109 so as to match the orientations of their associated tools 138 and 139 as seen by the endoscope 140.


The processor 102 performs various functions in the system 100. One important function that it performs is to translate and transfer the mechanical motion of control devices 108 and 109 to their respective robotic arms 128 and 129 through control signals over bus 110 so that the Surgeon can effectively move and/or manipulate their respective tools 138 and 139. Another important function is to implement a method for indicating positions of a tool when the tool is outside a camera captured view being displayed on the monitor 104, or occluded within the camera captured view being displayed on the monitor 104, as described herein. Still another important function is to implement a method for readily identifying tools and/or their respective patient-side manipulators on the monitor 104 to facilitate Surgeon/Assistant communications.


Although described as a personal computer, it is to be appreciated that the processor 102 may be implemented in practice by any combination of hardware, software and firmware. Also, its functions as described herein may be performed by one unit, or divided up among different components, each of which may be implemented in turn by any combination of hardware, software and firmware.


During the performance of a minimally invasive surgical procedure, the tools 138 and 139 are preferably kept within a viewing area 200 of the monitor 104 (such as shown in FIG. 2) so that the Surgeon may see them on the monitor 104 and accordingly, use them during the procedure. When one of the tools 138 is outside the viewing area 200 of the monitor 104 (such as shown in FIG. 3), however, the Surgeon will be unable to see that tool on the monitor 104 and consequently, will be unable to properly use it during the procedure. In addition, the Surgeon may have difficulty moving the out-of-view tool into the viewing area 200 of the monitor 104 without any knowledge of where the out-of-view tool is currently positioned relative to the viewing area 200.


To indicate tool positions to the Surgeon for out-of-view or occluded tools, the processor 102 is configured with a Graphical User Interface (“GUI”) computer program which implements a method for indicating tool positions on the monitor 104, as described in reference to FIG. 8. Before describing this aspect of the GUI, however, examples of output generated by the GUI are illustrated and described in reference to FIGS. 4-7.


In each of the FIGS. 4-7, the viewing area 300 of the monitor 104 may correspond to the FOV of the endoscope 140 (with proper scaling of the entire FOV) such as depicted in FIG. 10, or it may correspond to only a portion of the FOV of the endoscope 140 (with proper scaling corresponding to a ZOOM-IN of images in the portion of the FOV displayed on the monitor 104) such as depicted in FIG. 11. Tools within the viewing area 300 are seen in bold line in the viewing area 300. Circumscribing the viewing area 300 is a boundary area 400, in which non-clickable symbols or clickable icons (hereinafter collectively referred to as “symbols”) are positioned so as to indicate positions of corresponding tools.


The symbols also preferably provide information identifying their respective tools and/or associated patient-side manipulators. One way they may do this is by their colors which may match color indications printed on the patient-side manipulators, such as on their setup joints. For example, patient-side manipulators 122, 128 and 129 may be color coded respectively as red, green and yellow, and symbols corresponding to their attached tools also color coded in the same manner. Alternatively, number indicators and/or other identifying information may be displayed on or adjacent to the symbols which may match numbers printed on the patient-side manipulators, such as on their setup joints. For example, patient-side manipulators 122, 128 and 129 may be numbered 1, 2 and 3 respectively, and symbols corresponding to their attached tools also numbered in the same manner. Where text information is provided with the symbols, the text may be written on or displayed adjacent to the symbol. It may be continuously displayed on the computer display screen, or only displayed when a cursor is placed over the symbol or the symbol is clicked on using a pointing device.
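
By way of a non-limiting sketch, the color/number scheme described above may be represented as a simple lookup keyed by patient-side manipulator. The manipulator numbers and colors below are merely the examples given in this paragraph; the function name is hypothetical.

```python
# Illustrative mapping of patient-side manipulators to the colors/numbers
# used both on their setup joints and on their tools' symbols.
MANIPULATOR_STYLES = {
    122: {"color": "red",    "number": 1},
    128: {"color": "green",  "number": 2},
    129: {"color": "yellow", "number": 3},
}

def symbol_style(manipulator_id):
    """Color and number used to draw the symbol of the tool attached to
    the given patient-side manipulator (hypothetical helper)."""
    return MANIPULATOR_STYLES[manipulator_id]

print(symbol_style(129))  # {'color': 'yellow', 'number': 3}
```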


Tools outside the viewing area 300 are seen in dotted line for the purposes of explaining certain aspects of the method implemented by the GUI. It is to be appreciated that these dotted lined tools (or dotted line tool extensions) are not seen by the Surgeon on the monitor 104. Their relative positions with respect to the viewing area 300 in FIGS. 4-7, however, correspond to their relative positions in or to the FOV of the endoscope 140 in the endoscope camera frame of reference.


Although the tools shown in FIGS. 4-7 appear as 2-D images, it is to be appreciated that this is not to be construed as a limitation, but rather as a simplification for descriptive purposes only. Preferably, 3-D images are displayed in the viewing area 300. The symbols and in particular, end effector or tool shaft orientation indications superimposed on the symbols, may appear in 2-D or 3-D in the boundary area 400. Also, although the examples described herein refer to images captured by the endoscope 140, it is to be appreciated that the various aspects of the present invention are also applicable to images captured by other types of imaging devices such as those using MRI, ultrasound, or other imaging modalities, which may be displayed in the viewing area 300 of the monitor 104.



FIG. 4 illustrates, as a first example, a GUI generated screen that is displayed on the monitor 104, wherein a first symbol 410 is placed in the boundary area 400 to indicate the position of the out-of-view tool 138, and an orientation indicator 411 is superimposed on the symbol 410 to indicate the current orientation of an end effector 215 of the out-of-view tool 138. An in-view tool 139 is shown partially extending into the viewing area 300 from a second symbol 420 in the boundary area 400.


In this example, the position of the first symbol 410 is determined by the intersection of a line 402 and the boundary area 400, wherein the line 402 extends from a reference point on the out-of-view tool 138 to a central point 401 of the viewing area 300 of the monitor 104. The position of the second symbol 420 is determined by the intersection of the shaft 222 of the in-view tool 139 and the boundary area 400.
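
A minimal sketch of this placement computation follows, assuming the viewing area 300 is an axis-aligned rectangle and that positions have already been projected into screen coordinates; the same routine also serves the shaft-axis case of FIG. 5 if the direction vector is taken along the shaft 217 rather than toward the central point 401. Function and parameter names are illustrative.

```python
import numpy as np

def boundary_symbol_position(tool_xy, center_xy, half_w, half_h):
    """Point where the line from the central point 401 toward the tool's
    reference point (line 402 in FIG. 4) crosses the edge of the viewing
    area; the symbol is drawn in the boundary area 400 at that crossing."""
    c = np.asarray(center_xy, dtype=float)
    d = np.asarray(tool_xy, dtype=float) - c
    # Scale the direction vector so the line just reaches the nearer edge.
    scale = min(half_w / abs(d[0]) if d[0] else np.inf,
                half_h / abs(d[1]) if d[1] else np.inf)
    return c + scale * d

# Example: tool up and to the right of a 640x480 viewing area centered at (320, 240).
print(boundary_symbol_position((900.0, 100.0), (320.0, 240.0), 320.0, 240.0))
# -> approximately [640. 162.76], i.e., on the right edge of the viewing area
```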


The distance that the out-of-view tool 138 is away from the viewing area 300 may be indicated in a number of ways, such as by the size, color, brightness/intensity, blinking frequency, or oscillating frequency of its symbol. Alternatively, the distance may simply be indicated by displaying a distance number (such as the distance in centimeters) over the symbol. For example, when the tool is in view, such as the tool 139, its symbol may be a maximum size, such as the symbol 420 of the in-view tool 139. When the tool is out of view, however, such as the tool 138, the size of its symbol may indicate the distance that the out-of-view tool is away from the viewing area 300, so that the symbol gets larger as the tool moves closer to entering the viewing area 300. Alternatively, the color of the symbol may indicate the distance using a color spectrum; the brightness/intensity or blinking frequency of the symbol may increase as the tool moves closer to entering the viewing area 300; or an oscillation frequency of the symbol about its nominal position may decrease as the tool is brought closer to being in the viewing area 300.
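
As one non-limiting sketch of such distance encodings (the pixel sizes, frequencies, and working range below are illustrative assumptions, not values from the specification):

```python
def symbol_size_px(distance_cm, min_px=8, max_px=32, range_cm=10.0):
    """Symbol grows as the out-of-view tool nears the viewing area; an
    in-view tool is drawn at the maximum size (like symbol 420)."""
    if distance_cm <= 0.0:
        return max_px                         # tool is in view
    t = min(distance_cm / range_cm, 1.0)      # clamp to the working range
    return round(max_px - t * (max_px - min_px))

def blink_frequency_hz(distance_cm, min_hz=0.5, max_hz=4.0, range_cm=10.0):
    """Blinking speeds up as the tool moves closer to entering the view."""
    t = min(max(distance_cm, 0.0) / range_cm, 1.0)
    return max_hz - t * (max_hz - min_hz)

print(symbol_size_px(2.5), blink_frequency_hz(2.5))  # 26 3.125
```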



FIG. 5 illustrates, as a second example, a GUI generated screen that is displayed on the monitor 104, wherein a first symbol 510 is placed in the boundary area 400 to indicate the position of the out-of-view tool 138, and an orientation indicator 511 is superimposed on the symbol 510 to indicate the current orientation of a shaft 217 of the out-of-view tool 138.


In this example, the position of the first symbol 510 is determined by the intersection of a line 502 and the boundary area 400, wherein the line 502 extends along an axis of the shaft 217. The distance that the out-of-view tool 138 is away from the viewing area 300 may be indicated in the same manner as described above in reference to FIG. 4.



FIG. 6 illustrates, as a third example, a GUI generated screen that is displayed on the monitor 104, wherein a first symbol 610 is placed in the boundary area 400 to indicate the position of the out-of-view tool 138, and an orientation indicator 611 is superimposed on the symbol 610 to indicate the current orientation of a shaft 217 of the out-of-view tool 138.


In this example, the position of the first symbol 610 is determined by the intersection of a trajectory 602 and the boundary area 400, wherein the trajectory 602 is defined by the path of a reference point on the out-of-view tool 138 as it moves in the endoscope camera reference frame. In this way, the symbol 610 is placed in the boundary area 400 where the tool will first appear in the viewing area 300 if it continues along its current trajectory (or, if it is moving away from the viewing area 300, where it would appear if the trajectory were reversed). For example, if only two points in time are used to determine the trajectory, as the tool 138 moves from a first location at time t1 to a second location at time t2, the path of the reference point is represented by a line extending through the two points. If time t2 is the current time and time t1 a prior time, then the current orientation of the shaft 217 is indicated by the orientation indicator 611. By using more than two points in time to define the trajectory of the out-of-view tool 138, the trajectory may take on more sophisticated curves. The distance that the out-of-view tool 138 is away from the viewing area 300 may be indicated in the same manner as described above in reference to FIG. 4.
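
A sketch of the two-point version of this trajectory placement is given below, again assuming screen coordinates and an axis-aligned viewing area; reversing the direction vector covers the case of a tool moving away from the view. All names are illustrative.

```python
import numpy as np

def ray_rect_entry(p, d, cx, cy, hw, hh):
    """First point where the ray p + t*d (t >= 0) meets the edge of the
    axis-aligned rectangle centered at (cx, cy) with half-extents (hw, hh);
    returns None if the extrapolated trajectory never reaches the view."""
    hits = []
    for axis, edge in ((0, cx - hw), (0, cx + hw), (1, cy - hh), (1, cy + hh)):
        if d[axis] == 0.0:
            continue                           # ray parallel to this edge
        t = (edge - p[axis]) / d[axis]
        q = p + t * d
        lo, hi = (cy - hh, cy + hh) if axis == 0 else (cx - hw, cx + hw)
        if t >= 0.0 and lo <= q[1 - axis] <= hi:
            hits.append((t, q))
    return min(hits, key=lambda h: h[0])[1] if hits else None

# Reference-point samples at times t1 and t2; viewing area 640x480.
p1, p2 = np.array([900.0, 100.0]), np.array([860.0, 120.0])
print(ray_rect_entry(p2, p2 - p1, 320.0, 240.0, 320.0, 240.0))  # [640. 230.]
```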



FIG. 7 illustrates, as a fourth example, a GUI generated screen that is displayed on the monitor 104, wherein both tools 138 and 139 are positioned so as to be within the viewing area 300, but the end effector of the tool 138 is occluded by an object 700. In this case, since both tools are in the viewing area 300, their respective symbols 710 and 420 are at maximum size. Although the end effector of the tool 138 is occluded by the object 700, a ghost image 711 (e.g., a computer model) of the end effector is shown at the proper position and orientation over the object 700. If the ghost image 711 is too distracting, an outline of the end effector may be used instead, as either a programmed or surgeon-selected option.


As previously described, the symbols 420, 410, 510, 610, and 710 may be non-clickable symbols or clickable icons. In the former case, if the Surgeon passes the cursor of a pointing device such as a mouse over the non-clickable symbol, additional information about the associated tool may be provided. In the latter case, if the Surgeon clicks on the clickable icon using the pointing device, additional information about the associated tool may be provided. The additional information in either case supplements the identification of the associated patient-side manipulator, which may be indicated by a color or a number that is always displayed on or adjacent to the symbol. Examples of such additional information include identification of the tool's type and its associated master manipulator. The additional information may be provided in a separate window such as a picture-in-picture, or it may be provided as text adjacent to, or superimposed over, the symbol. When the separate window is provided, the additional information may further include a zoomed out, computer generated view of the surgical site including the FOV of the endoscope 140 and computer generated models of all tools outside of it.


Although shown as circles, the symbols 420, 410, 510, 610, and 710 may be displayed in any one or more of many different shapes. For example, when the tool is positioned so as to be viewed inside the viewing area 300, the symbol may take the form of a computer model of the tool shaft so that a ghost shaft is displayed in the boundary area 400. On the other hand, when the tool is positioned so as to be outside of the viewing area 300, the symbol may take the form of a computer model of the distal end of the tool so that a ghost end effector is displayed in the boundary area 400. As the tool moves from outside of the viewing area 300 into the viewing area 300, the symbol would then seamlessly change from the ghost end effector to the ghost shaft, and vice versa when the tool moves from inside of the viewing area 300 to outside of the viewing area 300. The orientation of the ghost shaft or ghost end effector, as the case may be, would preferably match that of the actual tool. When the tool is outside of the viewing area 300, the size of the ghost end effector may indicate its distance from the viewing area 300, as previously described for the symbols. Likewise, in order to identify the tool and/or its patient-side manipulator, the ghost shaft or ghost end effector, as the case may be, may be color coded or numbered as previously described for the symbols.



FIG. 8 illustrates, as an example, a flow diagram of a method for indicating a tool's position and identification on the monitor 104. The method is preferably performed for each tool by a GUI executed in the processor 102. In 801, the position and orientation of a tool are determined in the reference frame of an imaging device whose captured images are being displayed on the monitor 104. Although for the purposes of this example the images are described as being captured by the stereo cameras of the endoscope 140, it is to be appreciated that images captured by other imaging devices using other imaging modalities may also be used with the method. Also for the purposes of this example, the full FOV of the cameras is assumed to be displayed in the viewing area 300, such as depicted in FIG. 10. Therefore, in such case, the position and orientation of the tool may not be determinable using conventional imaging techniques when the tool is outside the FOV of the cameras.


Consequently, the tool position and orientation (also referred to herein as the “tool state”) are first estimated in a tool reference frame by receiving information from joint sensors in the tool's robotic arm, and applying the information to kinematics of the robotic arm. Because the tool state in this case is primarily determined from the robotic arm kinematics, it can be readily determined even though the tool is outside the FOV of the endoscope 140 or occluded in the FOV of the endoscope 140.
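
The paragraph above amounts to running the arm's forward kinematics on the joint sensor readings. The planar two-link chain below is a stand-in sketch for the real manipulator's full 3-D kinematic model; all names and dimensions are illustrative.

```python
import numpy as np

def planar_forward_kinematics(joint_angles, link_lengths):
    """Chain per-joint homogeneous transforms (rotate by the sensed joint
    angle, then translate along the link) to estimate the tool-tip pose
    in the tool reference frame."""
    T = np.eye(3)
    for theta, L in zip(joint_angles, link_lengths):
        c, s = np.cos(theta), np.sin(theta)
        T = T @ np.array([[c, -s, L * c],
                          [s,  c, L * s],
                          [0.0, 0.0, 1.0]])
    return T  # 3x3 homogeneous transform: tool position and orientation

pose = planar_forward_kinematics([np.pi / 4, -np.pi / 4], [10.0, 10.0])
print(pose[:2, 2])  # estimated tool-tip position, approximately [17.07  7.07]
```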


The estimated tool state is then translated into the camera reference frame, and corrected using a previously determined error transform. The error transform may be determined from a difference between the tool state determined using its robotic arm kinematics and a tool state determined using video image processing. The error transform may be first determined with a pre-operative calibration step, and periodically updated when the tool is in the FOV of the endoscope 140 during a minimally invasive surgical procedure.
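
Under the common convention that poses are 4 x 4 homogeneous transforms, the correction described above might look like the following sketch; the composition order and frame names are assumptions made for illustration only.

```python
import numpy as np

def update_error_transform(T_vision_cam, T_kin_cam):
    """Estimate the error transform while the tool is visible in the FOV,
    as the discrepancy between the vision-derived and kinematics-derived
    tool poses (both expressed in the camera reference frame)."""
    return T_vision_cam @ np.linalg.inv(T_kin_cam)

def corrected_tool_pose(T_cam_base, T_base_tool_kin, T_err):
    """Translate the kinematics-based estimate into the camera frame and
    apply the previously determined error transform."""
    return T_err @ (T_cam_base @ T_base_tool_kin)

# With a perfect kinematic estimate the error transform is the identity.
T = np.eye(4)
print(np.allclose(update_error_transform(T, T), np.eye(4)))  # True
```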


If only a portion of the FOV of the cameras is displayed in viewing area 300 of the monitor 104, however, such as depicted by area 1101 in FIG. 11, then it may still be possible to use conventional imaging techniques to determine the tool position if the tool is in a portion of the FOV of the cameras that is not being displayed in the viewing area 300 of the monitor 104, such as depicted by the area 1102 in FIG. 11. Note that in both FIGS. 10 and 11, only the left camera view I1 is shown. It is to be appreciated, however, that for a 3-D display, a corresponding right camera view I2 is also necessary as described, for example, in reference to FIG. 9, but is not being shown herein to simplify the description.


Additional details for determining tool positions and orientations, and in particular, for performing tool tracking are described, for example, in commonly owned U.S. application Ser. No. 11/130,471 entitled “Methods and Systems for Performing 3-D Tool Tracking by Fusion of Sensor and/or Camera derived Data during Minimally Invasive Robotic Surgery,” filed May 16, 2005, which is incorporated herein by this reference.


In 802, a determination is made whether the position of the tool is within the viewing area 300 of the monitor 104, which is equivalent in this example to determining whether the tool is within the FOV of the endoscope 140. This latter determination may be performed using epipolar geometry. Referring to FIG. 9, for example, the endoscope 140 includes two cameras, C1 and C2, separated by a baseline distance “b”, and having image planes, I1 and I2, defined at the focal length “f” of the cameras. The image planes, I1 and I2, are warped using a conventional stereo rectification algorithm to remove the effects of differing internal and external camera geometries.


A point P in the camera reference frame is projected onto the image planes, I1 and I2, at image points, P1 and P2, by an epipolar plane containing the point P, the two optical centers of the cameras, C1 and C2, and the image points, P1 and P2. The position of the point P may then be determined in the camera reference frame using known values for the baseline distance “b” and focal length “f”, and a disparity “d” calculated from the distances of the image points, P1 and P2, from their respective image plane center points (i.e., at the intersections of the x-axis with the y1 and y2 axes).
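
In a rectified stereo pair, this construction reduces to the standard triangulation formulas. Writing P1 = (x1, y1) and P2 = (x2, y2) for the image points of P, with the left camera taken at the origin of the camera reference frame (an assumption made here for concreteness):

```latex
d = x_1 - x_2, \qquad
z = \frac{f\,b}{d}, \qquad
x = \frac{x_1\,z}{f}, \qquad
y = \frac{y_1\,z}{f}
```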


Thus, in order for a tool to be in the FOV of the endoscope 140, at least one point on the tool must be projected onto at least one of the two image planes, I1 and I2. Although it may be possible to estimate a position of a point on the tool that is projected onto only one of the two image planes, I1 and I2, using disparity information calculated for nearby points, for example, preferably the point on the tool would be projected onto both of the two image planes, I1 and I2, so that a disparity value may be calculated for the point and consequently, its depth can be determined directly. Also, although the tool may technically be in the FOV of the endoscope 140 if only one point of the tool is in it, for practical reasons, a sufficient number of points are preferably required so that the tool is visually identifiable in the monitor 104 by the Surgeon.
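
A sketch of this visibility test follows, assuming the rectified geometry of FIG. 9 with the left camera C1 at the origin of the camera reference frame and C2 offset by the baseline b along the x-axis; the image bounds are illustrative parameters.

```python
def in_field_of_view(P, f, b, width, height):
    """Project a camera-frame point P = (x, y, z) onto both rectified image
    planes, I1 and I2, and test whether both projections land inside the
    image bounds, so that a disparity (and hence depth) can be measured."""
    x, y, z = P
    if z <= 0.0:
        return False                         # behind the cameras
    x1, y1 = f * x / z, f * y / z            # image point P1 (left)
    x2, y2 = f * (x - b) / z, f * y / z      # image point P2 (right)
    return all(abs(u) <= width / 2.0 and abs(v) <= height / 2.0
               for u, v in ((x1, y1), (x2, y2)))

# A point 80 mm in front of the cameras (f = 5 mm, b = 5 mm, 8x6 mm sensor).
print(in_field_of_view((10.0, 5.0, 80.0), 5.0, 5.0, 8.0, 6.0))  # True
```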


Now, if the position of the tool is determined in 802 to be outside the viewing area 300 of the monitor 104, then in 803, a position for a symbol in the boundary area 400 circumscribing the viewing area 300 is determined such that the position of the symbol indicates the tool's position relative to the viewing area 300. Examples of such determination have been previously described in reference to FIGS. 4-6. After determining the position of the symbol in the boundary area 400, in 804, the symbol is then displayed in the boundary area 400 at its determined position. In addition, an orientation indicator may be superimposed on the symbol and other tool and/or its robotic arm identifying information provided as described in reference to FIGS. 4-6. The method then repeats for another processing interval by going back to 801.


On the other hand, if the position of the tool is determined in 802 to be within the viewing area 300 of the monitor 104, then in 805, an attempt is made to identify the tool in the FOV of the endoscope 140. Referring to FIG. 12 as one example for performing this task, in 1201, a 3-D computer model of the tool is generated. This is generally a one time, pre-operative process. In 1202, the 3-D computer model of the tool is positioned and oriented according to the tool state determined in 801. In 1203, left and right 2-D outlines of the computer model of the tool are generated by projecting an outline of the 3-D computer model of the tool onto the left and right image planes, I1 and I2, of the left and right cameras, C1 and C2, of the endoscope 140. In 1204, the 2-D outline of the computer model of the tool that was generated in 1203 for the left image plane I1 is cross-correlated with a left camera view captured by the left camera C1, and/or the 2-D outline of the computer model of the tool that was generated in 1203 for the right image plane I2 is cross-correlated with a right camera view captured by the right camera C2.
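
One way to realize the cross-correlation of 1204 is a normalized correlation score between the projected outline (rendered as a binary mask at the pose predicted in 1202) and an edge map of the corresponding camera view. The mask representation, threshold value, and helper names below are illustrative assumptions, not details from the specification.

```python
import numpy as np

def outline_match_score(outline_mask, edge_image):
    """Normalized cross-correlation between the projected 2-D outline of
    the tool's computer model and an edge map of the camera view; both are
    same-sized 2-D arrays. A score near 1 means the tool is visible where
    the kinematics-derived tool state says it should be."""
    a = outline_mask.astype(float) - outline_mask.mean()
    v = edge_image.astype(float) - edge_image.mean()
    denom = np.sqrt((a * a).sum() * (v * v).sum())
    return float((a * v).sum() / denom) if denom else 0.0

TOOL_VISIBLE_THRESHOLD = 0.5  # illustrative value for the decision of 806

def tool_identified(outlines, edge_maps):
    """806: the tool is identified if either eye's outline correlates well
    with its camera view. outlines/edge_maps are (left, right) pairs."""
    return any(outline_match_score(o, e) >= TOOL_VISIBLE_THRESHOLD
               for o, e in zip(outlines, edge_maps))
```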


In 806, a determination is then made whether the tool has been identified in the FOV of the endoscope 140 by, for example, determining whether a cross-correlation value calculated in 1204 meets or exceeds a threshold value for one or both of the left and right camera views. If the result of 806 is a YES, then the tool has been identified in the right and/or left camera view. The method then goes to 803 to determine the symbol position in the boundary area 400, which in this case may be simply determined by the intersection of the tool shaft with the boundary area 400. The method then proceeds to 804 to display the symbol in the determined position in the boundary area 400, and then to 801 to repeat the method for another processing period.


If the result of 806 is a NO, however, then the tool is presumably occluded by another object. In that case, in 807, the 2-D outline of the computer model of the tool that was generated by projecting the 3-D computer model of the tool into the left image plane I1 is superimposed on the left camera view captured by the left camera C1, and the 2-D outline of the computer model of the tool that was generated by projecting the 3-D computer model of the tool into the right image plane I2 is superimposed on the right camera view captured by the right camera C2. As a result, a 3-D outline of the computer model of the tool is displayed in the viewing area 300 of the monitor 104 superimposed over the occluding object. Alternatively, the full 3-D computer model of the tool may be displayed as a ghost tool rather than just its outline over the occluding object by superimposing appropriate left and right images of the 3-D computer model on the left and right camera views captured by the cameras C1 and C2 of the endoscope 140.
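
A sketch of the overlay step follows, performed once per eye so that the outline (or ghost model) appears in 3-D on the stereoscopic display; the alpha blend and color are illustrative choices.

```python
import numpy as np

def overlay_ghost(camera_view, outline_mask, color=(0, 255, 0), alpha=0.6):
    """Blend the projected computer-model outline over one camera view so
    the occluded tool shows through as a ghost. camera_view: HxWx3 uint8
    image; outline_mask: HxW boolean array marking outline pixels."""
    out = camera_view.astype(float)
    out[outline_mask] = ((1.0 - alpha) * out[outline_mask]
                         + alpha * np.asarray(color, dtype=float))
    return out.astype(np.uint8)

# Applied to both eyes: the left view with the I1 outline, the right with I2.
```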


The method then goes to 803 to determine the symbol position in the boundary area 400, which in this case may be simply determined by the intersection of the tool shaft with the boundary area 400. The method then proceeds to 804 to display the symbol in the determined position in the boundary area 400, and then to 801 to repeat the method for another processing period.


Although the various aspects of the present invention have been described with respect to a preferred embodiment, it will be understood that the invention is entitled to full protection within the full scope of the appended claims.

Claims
  • 1-27. (canceled)
  • 28. An apparatus comprising: a memory device non-transitorily storing program instructions; and a processor coupled to a display device, an image capture device, and the memory device, the processor configured to execute the program instructions to: determine a position of a tool in a reference frame of the image capture device; determine a position to display a non-depictive symbol, which is non-depictive of the tool, in a boundary area circumscribing a viewing area on the display device, so as to indicate the position of the tool, while the viewing area is displaying an image captured by the image capture device; and cause the non-depictive symbol to be displayed at the determined position in the boundary area.
  • 29. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area so that the non-depictive symbol provides information identifying one of: the tool, and a manipulator used for moving the tool.
  • 30. (canceled)
  • 31. The apparatus according to claim 29, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area so that the non-depictive symbol is marked with one of: a color that is uniquely associated with the manipulator, and a number that is uniquely associated with the manipulator.
  • 32. (canceled)
  • 33. (canceled)
  • 34. The apparatus according to claim 29, wherein the processor is configured to execute the program instructions to: determine the position of the tool by using at least one of: kinematics for the manipulator, and an image identification technique.
  • 35. (canceled)
  • 36. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: determine the position to display the non-depictive symbol in the boundary area by determining where a line passing through an axis, which extends along a length of a shaft of the tool, enters a field of view of the image capture device.
  • 37. (canceled)
  • 38. The apparatus according to claim 37, wherein the processor is configured to execute the program instructions to: determine the position of the non-depictive symbol in the boundary area by determining where a line extending from the position of the tool to a reference point, which is in the field of view of the image capture device, enters the field of view.
  • 39. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: determine the position of the non-depictive symbol in the boundary area by: determining a trajectory of the tool from current and past positions of the tool, and determining where an extrapolation of the trajectory enters the field of view of the image capture device.
  • 40. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area by: determining a distance of a position of a reference point on the tool from a position of a reference point in the field of view of the image capture device, and causing the non-depictive symbol to be displayed so that its size indicates the distance.
  • 41. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area by: determining a distance of a position of a reference point on the tool from a position of a reference point in the field of view of the camera, and causing the non-depictive symbol to be displayed so that its color indicates the distance.
  • 42. The apparatus according to claim 41, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed in the boundary area so that the intensity of the color of the non-depictive symbol indicates the distance.
  • 43. The apparatus according to claim 41, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed in the boundary area so that the color of the non-depictive symbol relative to a color spectrum indicates the distance.
  • 44. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area by: determining a distance of a position of a reference point on the tool from a position of a reference point in the field of view of the camera, and causing the non-depictive symbol to be displayed so that a frequency of blinking of the non-depictive symbol indicates the distance.
  • 45. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area by: determining a distance of a position of a reference point on the tool from a position of a reference point in the field of view of the camera, and causing the non-depictive symbol to be displayed so that a frequency of oscillation of the non-depictive symbol about the determined position in the boundary area indicates the distance.
  • 46. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: cause the non-depictive symbol to be displayed at the determined position in the boundary area by: determining a distance of a position of a reference point on the tool from a position of a reference point in the field of view of the camera, and causing the non-depictive symbol to be displayed so that the distance is indicated by overlaying a distance number over the non-depictive symbol.
  • 47. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: determine the position of the tool by determining a position and an orientation of an end effector of the tool, and cause the non-depictive symbol to be displayed at the determined position in the boundary area by causing an orientation indicator to be displayed over the non-depictive symbol such that the orientation indicator is oriented so as to indicate the orientation of the end effector.
  • 48. The apparatus according to claim 28, wherein the processor is configured to execute the program instructions to: determine an orientation of an axis that extends along a length of a shaft of the tool, and cause the non-depictive symbol to be displayed at the determined position in the boundary area by causing an orientation indicator to be displayed over the non-depictive symbol such that the orientation indicator is oriented so as to indicate the orientation of the axis.
  • 49. (canceled)
  • 50. (canceled)
  • 51. An apparatus comprising: a memory device non-transitorily storing program instructions; and a processor configured to execute the program instructions to: cause images captured by an image capture device to be displayed in a viewing area on a display device; determine a position of a tool, which is within, but occluded in, a field of view of the image capture device, by determining a position of the tool in a reference frame of the image capture device by using kinematics of a manipulator used for moving the tool; and cause a ghost tool to be displayed in the viewing area where the tool is occluded by using the determined position of the tool in the reference frame of the image capture device.
  • 52. The apparatus according to claim 51, wherein the viewing area includes stereoscopic right and left two-dimensional views, and wherein the processor is configured to further execute the program instructions to: determine that the tool is occluded in the field of view of the image capture device by: generating a three-dimensional computer model of the tool; positioning and orienting the computer model to coincide with the current position and orientation of the tool in the reference frame of the computer display screen; generating a two-dimensional outline of the computer model that projects onto a selected one of the right and left two-dimensional views; and cross-correlating the two-dimensional outline of the computer model with the selected one of the right and left two-dimensional views.
  • 53. The apparatus according to claim 52, wherein the processor is configured to execute the program instructions to: cause the ghost tool to be displayed in the viewing area by causing at least the two-dimensional outline of the computer model of the tool to be displayed as an overlay on the selected one of the right and left two-dimensional views as it is being displayed on the computer screen.
  • 54. The apparatus according to claim 52, wherein the processor is configured to execute the program instructions to: cause the ghost tool to be displayed in the viewing area by causing right and left two-dimensional ghost images of the tool to be overlaid in corresponding ones of the right and left two-dimensional views, so as to appear as a three-dimensional ghost image of the tool at the position and orientation of the tool in the reference frame of the computer display screen.
  • 55-76. (canceled)
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a divisional of application Ser. No. 11/478,531 (filed Jun. 29, 2006), which is incorporated herein by reference.

Divisions (1)
Parent: application Ser. No. 11/478,531, filed Jun. 2006 (US)
Child: application Ser. No. 15/459,145 (US)