Robotic surgery involves a clinician, such as a surgeon or technician, operating a surgical robot via a control console. Robotic surgery may be performed endoscopically, and thus the only view of a surgical site available to the clinician may be images, such as three-dimensional (3D) or stereoscopic images, captured by an endoscopic camera. While operating the surgical robot, and thus viewing the 3D images, the clinician may move his or her head. Such movement of the clinician's head may cause the clinician to expect corresponding movement of the 3D images, for instance, based on a change in the clinician's perspective. However, conventional 3D video images are not configured to move based on movement of the clinician's head. Thus, if the clinician moves his or her head while viewing the 3D images, the 3D images that the clinician perceives differ somewhat from the 3D images that the clinician expects to perceive. This difference may be even greater in surgical robotic systems that utilize head or gaze tracking to control movement of the robotic arm coupled to the endoscopic camera. In view of the foregoing, it would be beneficial to have improved systems and methods for controlling and displaying stereoscopic images from an endoscopic camera while controlling a surgical robot during robotic surgery.
The present disclosure describes robotic surgical systems with observer movement compensation, in accordance with various embodiments. In an aspect of the present disclosure, an illustrative system includes an image capture device configured to capture images of a surgical site, a stereoscopic display device, a sensor configured to detect positions of an observer, and a computing device including at least one processor and a memory storing instructions. When executed by the at least one processor, the instructions cause the computing device to receive the images of the surgical site from the image capture device, receive data from the sensor indicating a first position of the observer, process the received images of the surgical site based on the first position of the observer, and cause the stereoscopic display device to display the processed stereoscopic images of the surgical site.
In embodiments, the images of the surgical site include left-eye image data and right-eye image data.
In some embodiments, the images of the surgical site have a frame size, and the processing the received images of the surgical site based on the first position of the observer includes determining a portion of the images to display based on the first position of the observer, the portion of the images to display being smaller than the frame size.
In another embodiment, the portion of the images to display corresponds to a number of pixels less than the number of pixels included in the images of the surgical site.
In an embodiment, the determining the portion of the images to display includes cropping at least a portion of the images.
In embodiments, the determining the portion of the images to display includes shifting at least a portion of the images.
In some embodiments, the shifting of at least the portion of the images includes determining a vector of movement of the observer based on the first position of the observer, determining a direction and an amount of pixels to shift based on the determined vector of movement of the observer, and shifting at least a portion of the images based on the determined direction and amount of pixels to shift.
In additional embodiments, the determining the vector of movement of the observer further includes determining a degree of movement, and the determining the direction and the amount of pixels to shift is further based on the determined degree of movement.
In another embodiment, the determining the direction and the amount of pixels to shift is further based on a relationship between the vector of movement of the observer and the direction and amount of pixels to shift. The relationship may be based on a table or a threshold.
In embodiments, the determining of the vector of movement of the observer includes determining whether the first position of the observer approaches a maximum threshold, and providing an alert indicating that the first position of the observer approaches the maximum threshold.
In an embodiment, the determining the vector of movement of the observer includes determining whether the first position of the observer exceeds a maximum threshold, and providing an alert indicating that the first position of the observer exceeds the maximum threshold.
In another embodiment, the system further includes a surgical robot, wherein the image capture device is coupled to the surgical robot, and the instructions, when executed by the processor, further cause the computing device to determine a vector of movement of the observer based on the data received from the sensor; and cause the surgical robot to reposition the image capture device based on the determined vector of movement of the observer.
In some embodiments, the stereoscopic display is an autostereoscopic display.
In several embodiments, the stereoscopic display is a passive stereoscopic display, and the system further comprises three-dimensional (3D) glasses worn by the observer.
In embodiments, the 3D glasses cause a left-eye image to be displayed to a left eye of the observer, and a right-eye image to be displayed to a right eye of the observer.
In an embodiment, the image capture device is a stereoscopic camera coupled to an endoscope.
In another embodiment, the sensor is a motion sensor or a camera.
In some embodiments, the data received from the sensor indicating a first position of the observer includes an image of the observer, and the instructions, when executed by the processor, further cause the computing device to generate second image data based on the image of the observer, and detect the first position of the observer by processing the second image data.
In several embodiments, the detecting of the first position of the observer includes detecting one or more of a distance of the observer relative to a vector normal to the stereoscopic display, a direction of the observer relative to the vector normal to the stereoscopic display, or an orientation of the observer relative to the stereoscopic display.
In embodiments, the direction of the observer is one or more of a lateral direction or a vertical direction.
In some embodiments, the instructions, when executed by the at least one processor, further cause the computing device to receive additional data from the sensor indicating a second position of the observer, process the received images of the surgical site based on the second position of the observer, and cause the stereoscopic display to display the processed images of the surgical site.
Provided in accordance with embodiments of the present disclosure are methods for compensating for observer movement in a robotic surgical system. In an aspect of the present disclosure, an illustrative method includes receiving images of a surgical site from an image capture device, receiving data from a sensor indicating a first position of an observer, processing the received images of the surgical site based on the first position of the observer, and causing a stereoscopic display device to display the processed images of the surgical site.
Provided in accordance with embodiments of the present disclosure are non-transitory computer-readable storage media storing a program for compensating for observer movement in a robotic surgical system. In an aspect of the present disclosure, an illustrative program includes instructions which, when executed by a processor, cause a computing device to receive images of a surgical site from an image capture device, receive data from a sensor indicating a first position of an observer, process the received images of the surgical site based on the first position of the observer, and cause a stereoscopic display device to display the processed images of the surgical site.
Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
Various aspects and features of the present disclosure are described hereinbelow with reference to the drawings.
The present disclosure generally relates to the display of stereoscopic images on a three-dimensional (3D) display and, more particularly, to the mitigation of a head-movement effect that is perceived by a user when viewing stereoscopic images on a conventional 3D display. In that regard, the present disclosure relates to systems and methods for compensating for observer movement in robotic surgical systems having stereoscopic displays, such as by detecting movement of the observer, repositioning an endoscopic camera based on the detected movement of the observer, shifting images received from the endoscopic camera based on the detected movement of the observer to compensate for the movement of the observer, and displaying the shifted images on a stereoscopic display. In this manner, the stereoscopic images are cropped and/or shifted, such as by shifting one or more lines and/or columns of pixels around the edges of the stereoscopic images, based on movement of the observer's head prior to being displayed. The effect of such shifting of the stereoscopic images is that the observer's perceived movement of the stereoscopic images caused by movement of the observer's head is visually mitigated by changing a displayed portion of the stereoscopic images to cause the displayed images to “move” in the way the observer expects the stereoscopic images to move even if the images received from the endoscopic camera do not move. Visual and/or auditory guidance, notifications, and/or alarms may be displayed and/or emitted by the stereoscopic display and/or a computing device associated therewith, to assist the observer with appropriately moving the observer's body, head, face, and/or eyes to control movement of the endoscopic camera and/or shifting of the images. Those skilled in the art will appreciate that the endoscopic camera may also be controlled based on other user interfaces, and thus need not be controlled based on movement of the observer.
With reference to
Endoscopic camera 145 may be a single camera or a plurality of cameras capable of capturing stereoscopic images and/or any other camera or imaging device known to those skilled in the art that may be used to capture 3D images of a surgical site. In some embodiments, endoscopic camera 145 is a dual-lens or multi-lens camera. Display 120 may be any stereoscopic display device configured to output stereoscopic images to the observer. For example, display 120 may be an autostereoscopic display, a passive stereoscopic display, and/or any other display device configured to display three-dimensional (3D) images known to those skilled in the art. In embodiments where display 120 is a passive stereoscopic display device, system 100 may further include 3D glasses 127 worn by the observer. For example, 3D glasses 127 may cause a left-eye image to be displayed to a left eye of the observer, and a right-eye image to be displayed to a right eye of the observer.
Image capture devices 125a and 125b may be any image capture devices known to those skilled in the art, such as video cameras, still cameras, stereoscopic cameras, etc. In some embodiments, image capture devices 125a and 125b are motion sensors configured to detect movement of the observer. In other embodiments, image capture devices 125a, 125b are infrared-light-based marker tracking devices configured to track markers attached to the observer, such as to the observer's head and/or to 3D glasses 127. In embodiments, image capture devices 125a and 125b are positioned about display 120 to detect a viewing direction and/or angle of the observer. Image capture devices 125a and 125b are referred to collectively hereinafter as image capture devices 125.
Surgical robot assembly 150 includes a base 151, a first joint 152 coupled to base 151, a first robotic arm 155 coupled to first joint 152, a second joint 153 coupled to first robotic arm 155, a second robotic arm 154 coupled to second joint 153, and an instrument drive unit 156 coupled to second robotic arm 154. Endoscope 140 is attached to surgical robot assembly 150 via instrument drive unit 156. In embodiments, multiple surgical robot assemblies 150 may be used concurrently and may together form a surgical robot. While a single surgical robot assembly 150 is shown in
Computing device 200 may be any computing device known to those skilled in the art that is configurable for use during robotic surgery. For example, computing device 200 may be a desktop computer, a laptop computer, a server and terminal configuration, and/or a control computer for surgical robot assembly 150, etc. In some embodiments, computing device 200 may be included in display 120. As described further below, system 100 may be used during robotic surgery to detect movement of the observer, reposition endoscope 140 based on the detected movement, and shift images captured by endoscopic camera 145 based on the detected movement.
Turning now to
Memory 202 may include any non-transitory computer-readable storage medium for storing data and/or software that is executable by processor 204 and which controls the operation of computing device 200, display 120, and/or surgical robot assembly 150. In an embodiment, memory 202 may include one or more solid-state storage devices such as flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown in
Computing device 200 further includes an input interface 206, a communications interface 208, and an output interface 210. Input interface 206 may be a mouse, keyboard, or other hand-held controller, foot pedal, touch screen, voice interface, and/or any other device or interface by means of which a user may interact with computing device 200.
Communications interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Output interface 210 may be a screen or other display device usable to output images or data by computing device 200.
With reference to
Next, at step S304, computing device 200 receives data indicating a position of the observer. The data may be acquired by image capture devices 125. The data may include images and/or image data of the observer, such as images of a portion of the observer's body, for example, the observer's head, face, and/or eyes. For example, if the data received from image capture devices 125 include images, computing device 200 may process the images to generate image data, and may then further process the image data to identify the observer and detect a position of the observer in the image data, such as a position relative to a vector normal to, and centered on, a front face of display 120 (referred to hereinafter as the “normal vector”). For example, the normal vector may be a vector extending from the center of the face of display 120 that faces the observer. In some embodiments, computing device 200 may process the images and/or image data received from image capture devices 125 to identify the observer's head, face, and/or eyes, as well as a viewing direction, orientation, and/or angle of the observer's head, face, and/or eyes. Computing device 200 may further process the images and/or image data received from image capture devices 125 to determine movement of the observer's body, head, face, and/or eyes, such as movement relative to the normal vector. For example, computing device 200 may intermittently and/or continuously receive data from image capture devices 125, and may intermittently and/or continuously process the data to identify successive positions of the observer, and thereby determine movement of the observer relative to a previous position and/or relative to the normal vector. In other embodiments, such as embodiments where image capture devices 125 are motion sensors, image capture devices 125 provide motion data to computing device 200.
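By way of illustration only, the following sketch shows one way such processing of observer images could be approximated in software. The helper that locates the observer's face, the use of the tracking camera's optical axis as a stand-in for the normal vector, and the calibration constant are all assumptions introduced for this sketch and are not taken from the disclosure.

```python
import numpy as np

# Illustrative calibration constant (assumption): centimeters of observer
# displacement per pixel of offset in the tracking camera's image.
CM_PER_PIXEL = 0.05

def detect_face_center(image: np.ndarray) -> tuple[float, float]:
    """Hypothetical placeholder for a face/eye detector.

    A real system would use a trained detector; here the brightest
    region's centroid stands in so the sketch remains self-contained.
    """
    gray = image if image.ndim == 2 else image.mean(axis=2)
    row, col = np.unravel_index(np.argmax(gray), gray.shape)
    return float(col), float(row)

def observer_position(image: np.ndarray) -> tuple[float, float]:
    """Estimate the observer's lateral/vertical offset (in cm) from the
    normal vector, taking the tracking camera's optical axis as a proxy
    for the vector normal to, and centered on, the display."""
    height, width = image.shape[:2]
    face_x, face_y = detect_face_center(image)
    dx_pixels = face_x - width / 2.0   # positive: observer is to the right
    dy_pixels = face_y - height / 2.0  # positive: observer is lower
    return dx_pixels * CM_PER_PIXEL, dy_pixels * CM_PER_PIXEL
```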
As an optional step, at step S306, computing device 200, such as via application 280, may process the data received at step S304 to determine a direction, amount, speed, and/or degree of movement of the observer based on the determined position or positions of the observer, such as of the observer's head, face, and/or eyes. For example, computing device 200 may determine whether the observer is moving in a horizontal direction, a vertical direction, a diagonal direction, and/or a rotational direction relative to the normal vector. Computing device 200 may further determine an amount, speed, and/or degree of movement of the observer in each direction; for example, the observer may be moving left by 5 cm and up by 2 cm. Computing device 200 may then cause surgical robot assembly 150 to reposition endoscope 140 based on the determined movement of the observer.
As noted above, in some embodiments, computing device 200 may intermittently and/or continuously receive images and/or image data of the observer from image capture devices 125. Computing device 200 may then intermittently and/or continuously process the images and/or image data of the observer to detect movement of the observer. For example, computing device 200 may determine successive positions of the observer relative to the normal vector in the images and/or image data received from image capture devices 125. Computing device 200 may then determine a vector of the observer's movement based on the determined successive positions of the observer relative to the normal vector in the images and/or image data.
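As a minimal sketch of how successive positions might be turned into a vector of movement, the direction, amount, and speed of the observer's movement can be derived from two consecutive position samples. The sampling interval and the coordinate convention are assumptions for illustration only.

```python
import numpy as np

def movement_vector(prev_pos, curr_pos, dt_seconds):
    """Compute a movement vector from two successive observer positions,
    each given as an (x, y) offset in cm relative to the normal vector.

    Returns the per-axis displacement, its magnitude (amount), the unit
    direction, and the speed over the sampling interval.
    """
    displacement = np.asarray(curr_pos, dtype=float) - np.asarray(prev_pos, dtype=float)
    amount = float(np.linalg.norm(displacement))
    direction = displacement / amount if amount > 0 else np.zeros_like(displacement)
    speed = amount / dt_seconds if dt_seconds > 0 else 0.0
    return displacement, amount, direction, speed

# Example from the text: the observer moves left by 5 cm and up by 2 cm
# (negative x is left, negative y is up) between samples 0.1 s apart.
disp, amount, direction, speed = movement_vector((0.0, 0.0), (-5.0, -2.0), 0.1)
```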
As an additional optional step, after endoscope 140 is repositioned, computing device 200, at step S308, receives second images of the surgical site from endoscopic camera 145. Similar to the first images, the second images may be stereoscopic images including left-eye image data and right-eye image data, and may have the same aspect ratio, resolution, frame size, and/or number of pixels as the first images.
Thereafter, concurrently with steps S306 and/or S308, or, in embodiments where steps S306 and S308 are not performed, directly after step S304, at step S310 computing device 200, such as via application 280, determines whether the position of the observer, as determined at step S306, approaches a maximum threshold for that particular direction of movement. For example, computing device 200 may have stored in database 240 various tables and/or thresholds for each direction relative to the normal vector. The tables and/or thresholds may indicate various relationships between the direction and distance of the observer relative to the normal vector, and corresponding movement of endoscope 140 and/or adjustment required for displaying stereoscopic image data. For example, the tables and/or thresholds may indicate an amount of movement of the observer in a particular direction, such as a horizontal direction, required to cause surgical robot assembly 150 to reposition endoscope 140 by a predetermined amount in the same direction, and/or a number of pixels in a particular direction to shift stereoscopic images received from endoscope 140 based on the position of the observer relative to the normal vector. The tables and/or thresholds may also indicate a maximum amount of movement in a particular direction that can be used to cause surgical robot assembly 150 to reposition endoscope 140, and/or a maximum number of pixels that can be shifted in a particular direction. Such a maximum threshold may correspond to a limit of motion of endoscope 140 and/or surgical robot assembly 150, and/or a maximum number of pixels available to be shifted in a particular direction.
If it is determined at step S310 that the position of the observer does not approach a maximum threshold corresponding to that particular direction (“No” at step S310), processing skips ahead to step S318. Alternatively, if it is determined at step S310 that the position of the observer approaches a maximum threshold corresponding to that particular direction (“Yes” at step S310), processing proceeds to step S312. At step S312, computing device 200, such as via application 280, determines whether the position of the observer exceeds the maximum threshold corresponding to that particular direction. If it is determined at step S312 that the position of the observer does not exceed the maximum threshold corresponding to that particular direction (“No” at step S312), processing proceeds to step S314, where computing device 200 provides an alert to notify the observer that the position of the observer is approaching the maximum threshold corresponding to that particular direction. Alternatively, if it is determined at step S312 that the position of the observer exceeds the maximum threshold corresponding to that particular direction (“Yes” at step S312), processing proceeds to step S316, where computing device 200 provides a warning to notify the observer that the position of the observer has exceeded the maximum threshold. After either the alert is provided at step S314 or the warning is provided at step S316, processing proceeds to step S318.
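The decision flow of steps S310 through S316 might be sketched as follows. The per-direction maximum offsets and the margin at which an alert (as opposed to a warning) is raised are illustrative assumptions rather than values taken from the disclosure.

```python
# Illustrative per-direction maximum offsets (cm) from the normal vector,
# standing in for the tables/thresholds described as stored in database 240.
MAX_OFFSET_CM = {"horizontal": 20.0, "vertical": 10.0}
APPROACH_FRACTION = 0.9  # assumed margin at which an alert is raised

def check_limits(offset_cm: float, direction: str) -> str | None:
    """Return 'warning' if the observer's position exceeds the maximum
    threshold for the given direction (step S316), 'alert' if it merely
    approaches that threshold (step S314), or None otherwise (skip to S318)."""
    limit = MAX_OFFSET_CM[direction]
    magnitude = abs(offset_cm)
    if magnitude > limit:
        return "warning"   # position exceeds the maximum threshold
    if magnitude >= APPROACH_FRACTION * limit:
        return "alert"     # position approaches the maximum threshold
    return None

# Example: a 19 cm horizontal offset approaches, but does not exceed, the limit.
status = check_limits(19.0, "horizontal")  # -> "alert"
```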
At step S318, computing device 200, such as via application 280, processes the images of the surgical site received at step S302 and/or the second images of the surgical site received at step S308. Further details regarding an exemplary procedure 400 that may be employed as part of the processing of the images of the surgical site at step S318 are described below with reference to
Thereafter, at step S322, computing device 200, such as via application 280, determines whether the surgical procedure has been completed. For example, computing device 200 may receive input from the observer and/or another clinician involved in the surgical procedure indicating that the surgical procedure has been completed. If it is determined at step S322 that the surgical procedure has not been completed (“No” at step S322), processing returns to step S304. Alternatively, if it is determined at step S322 that the surgical procedure has been completed (“Yes” at step S322), processing ends.
Turning now to
Starting at step S402, computing device 200 receives images of a surgical site. The images of the surgical site may be received from a camera such as endoscopic camera 145, as shown in
Thereafter, at step S404, computing device 200, such as via application 280, crops at least a portion of the image 502 of the surgical site to designate a displayed portion 504 of the image 502, as shown in
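One way to picture the crop of step S404 is as a centered window within the full frame 502, with a margin of pixels held in reserve on each edge for later shifting. The margin size below is an assumption chosen purely for illustration.

```python
import numpy as np

MARGIN = 64  # assumed number of reserve pixels on each edge of the frame

def initial_window(frame: np.ndarray, margin: int = MARGIN):
    """Return the (top, left) corner and (height, width) of a centered
    displayed portion that is smaller than the full frame (step S404)."""
    full_h, full_w = frame.shape[:2]
    view_h, view_w = full_h - 2 * margin, full_w - 2 * margin
    return (margin, margin), (view_h, view_w)

def crop(frame: np.ndarray, top_left, size) -> np.ndarray:
    """Extract the displayed portion from the full frame. For side-by-side
    stereoscopic content this would be applied to the left-eye and
    right-eye halves separately."""
    (top, left), (view_h, view_w) = top_left, size
    return frame[top:top + view_h, left:left + view_w]
```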
Next, at step S406, computing device 200, such as via application 280, determines a position of the observer's body, e.g., the observer's head, face, and/or eyes, relative to the normal vector. In some embodiments, the determination described above with reference to step S306 is the same as the determination described here. In other embodiments, computing device 200 may perform two separate determinations of a position of the observer relative to the normal vector. For example, computing device 200 may determine a horizontal distance, a vertical distance, and/or a diagonal distance of the position of the observer relative to the normal vector. In some embodiments, computing device 200 may determine a directional component, such as a vector, and a scalar component, such as a magnitude, based on the position of the observer relative to the normal vector. In some embodiments, successive positions of the observer (such as a first position, a second position, etc.) are determined, and movement of the observer may then be detected based on the determined successive positions of the observer relative to the normal vector. For example, as the observer moves relative to the normal vector, one or more positions of the observer relative to the normal vector may be determined, and a direction, amount, speed, and/or degree of movement of the observer may be determined based on the successive positions of the observer relative to the normal vector.
Thereafter, at step S408, computing device 200, such as via application 280, determines a direction and/or amount of pixels to shift and/or pan the displayed portion 504 of the image 502, based on the position of the observer determined at step S406. The determination may be based on a table and/or a threshold. For example, computing device 200 may have stored in database 240 various tables and/or thresholds, as described above with reference to step S310. The tables and/or thresholds may indicate various relationships between a direction and/or distance of the observer relative to the normal vector, and a corresponding direction and amount of pixels to shift and/or pan the displayed portion 504 of the image 502. For example, the tables and/or thresholds may indicate a distance of the observer from the normal vector in a particular direction, such as a horizontal direction, required to shift a predetermined amount of pixels of the displayed portion 504 in the same direction. In embodiments, different thresholds and/or relationships may be configured for each direction. For example, based on the preference of the observer, the threshold for the horizontal direction may be lower than the threshold for the vertical direction, thus allowing for more sensitivity to movement of the observer in a horizontal direction relative to the normal vector than movement of the observer in a vertical direction relative to the normal vector. The tables and/or thresholds may also indicate a maximum distance of the observer from the normal vector in a particular direction that can be used to shift pixels in that direction. For example, if too many pixels are shifted and/or panned at once, the images displayed by display 120 may appear distorted. As such, a maximum threshold may correspond to a limit of the number of pixels that may be shifted and/or panned at once. Further, in embodiments where computing device 200 determines directional and scalar components based on movement of the observer, the determination of the direction and/or amount of pixels to shift may further be based on the directional and scalar components.
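A minimal sketch of the lookup performed at step S408 follows. The dead-band thresholds, per-direction gains, and pixel limits are assumed values standing in for the tables and/or thresholds stored in database 240; an actual implementation could instead use a lookup table keyed on distance ranges.

```python
# Assumed per-direction mapping parameters: a dead-band below which no shift
# occurs, a gain in pixels per centimeter of offset, and a maximum shift.
# The lower horizontal dead-band reflects the greater horizontal sensitivity
# described above.
SHIFT_PARAMS = {
    "horizontal": {"deadband_cm": 0.5, "pixels_per_cm": 8.0, "max_pixels": 64},
    "vertical":   {"deadband_cm": 1.0, "pixels_per_cm": 8.0, "max_pixels": 64},
}

def pixels_to_shift(offset_cm: float, direction: str) -> int:
    """Map the observer's offset from the normal vector in one direction to a
    signed pixel shift, clamped to the configured maximum (step S408)."""
    params = SHIFT_PARAMS[direction]
    if abs(offset_cm) < params["deadband_cm"]:
        return 0
    shift = offset_cm * params["pixels_per_cm"]
    max_px = params["max_pixels"]
    return int(max(-max_px, min(max_px, round(shift))))

# Example: an observer 3 cm to the left of the normal vector (offset -3.0 cm)
# yields a 24-pixel shift whose sign the caller applies at step S410.
horizontal_shift = pixels_to_shift(-3.0, "horizontal")  # -> -24
```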
Next, at step S410, computing device 200, such as via application 280, shifts the displayed portion 504 of the image 502 based on the direction and/or amount of pixels to shift determined at step S408. For example, if the position of the observer is to the left of the normal vector by a particular amount, computing device 200 may shift the displayed portion 504 right by the amount of pixels determined at step S408. In the example shown in
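Step S410 might then be realized by moving the window's top-left corner within the full frame by the computed shift, clamped so that the displayed portion never extends beyond the captured image. The sign convention noted in the comment follows the example given above and is otherwise an assumption.

```python
import numpy as np

def shift_window(frame: np.ndarray, top_left, size, dx_pixels: int, dy_pixels: int):
    """Move the displayed portion within the full frame by the computed pixel
    shift (step S410), clamping so the window never leaves the frame.

    Per the example in the text, an observer positioned to the left of the
    normal vector corresponds to a positive dx_pixels (window shifted right).
    """
    full_h, full_w = frame.shape[:2]
    (top, left), (view_h, view_w) = top_left, size
    new_left = int(np.clip(left + dx_pixels, 0, full_w - view_w))
    new_top = int(np.clip(top + dy_pixels, 0, full_h - view_h))
    shifted_view = frame[new_top:new_top + view_h, new_left:new_left + view_w]
    return (new_top, new_left), shifted_view
```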
As such, by virtue of the above-described systems and methods, computing device 200 may be configured to detect one or more positions of the observer, reposition an endoscopic camera based on movement of the observer determined based on successive positions of the observer, shift one or more lines and/or columns of pixels in the images received from the endoscopic camera based on the one or more positions of the observer to compensate for the movement of the observer, and display the shifted images on a stereoscopic display. In this manner, the stereoscopic images are cropped and/or shifted based on movement of the observer's head prior to being displayed. One effect of such shifting of the stereoscopic images is that the displayed images “move” in the way the observer expects the stereoscopic images to move even if the images received from the endoscopic camera do not move.
Detailed embodiments of devices, systems incorporating such devices, and methods using the same are described herein. However, these detailed embodiments are merely examples of the disclosure, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for allowing one skilled in the art to variously employ the present disclosure in appropriately detailed structure.
This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2019/025096, filed Apr. 1, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/660,398, filed Apr. 20, 2018, the entire disclosure of which is incorporated by reference herein.