Compensation for observer movement in robotic surgical systems having stereoscopic displays

Information

  • Patent Grant
  • Patent Number
    11,647,888
  • Date Filed
    Monday, April 1, 2019
  • Date Issued
    Tuesday, May 16, 2023
Abstract
Systems and methods for compensating for observer movement during robotic surgery. An exemplary system includes an image capture device configured to capture images of a surgical site, a stereoscopic display device, a sensor configured to detect positions of an observer, and a computing device including a processor and a memory storing instructions. The instructions, when executed by the processor, cause the computing device to receive the images of the surgical site from the image capture device, receive data from the sensor indicating a position of the observer, process the received images of the surgical site based on the position of the observer, and cause the stereoscopic display device to display the processed images of the surgical site.
Description
BACKGROUND
Description of Related Art

Robotic surgery involves a clinician, such as a surgeon or technician, operating a surgical robot via a control console. Robotic surgery may be performed endoscopically, and thus the only view of a surgical site available to the clinician may be images, such as three-dimensional (3D) or stereoscopic images, captured by an endoscopic camera. While operating the surgical robot, and thus viewing the 3D images, the clinician may move his or her head. Such head movement may cause the clinician to expect corresponding movement of the 3D images, for instance, based on a change in the clinician's perspective. However, conventional 3D video images are not configured to move based on movement of the clinician's head. Thus, if the clinician moves his or her head while viewing the 3D images, the 3D images that the clinician perceives differ from the 3D images that the clinician expects to perceive. This difference may be even greater in surgical robotic systems that utilize head or gaze tracking to control movement of the robotic arm coupled to the endoscopic camera. In view of the foregoing, it would be beneficial to have improved systems and methods for controlling and displaying stereoscopic images from an endoscopic camera while controlling a surgical robot during robotic surgery.


SUMMARY

The present disclosure describes robotic surgical systems with observer movement compensation, in accordance with various embodiments. In an aspect of the present disclosure, an illustrative system includes an image capture device configured to capture images of a surgical site, a stereoscopic display device, a sensor configured to detect positions of an observer, and a computing device including at least one processor and a memory storing instructions. When executed by the at least one processor, the instructions cause the computing device to receive the images of the surgical site from the image capture device, receive data from the sensor indicating a first position of the observer, process the received images of the surgical site based on the first position of the observer, and cause the stereoscopic display device to display the processed images of the surgical site.


In embodiments, the images of the surgical site include left-eye image data and right-eye image data.


In some embodiments, the images of the surgical site have a frame size, and the processing the received images of the surgical site based on the first position of the observer includes determining a portion of the images to display based on the first position of the observer, the portion of the images to display being smaller than the frame size.


In another embodiment, the portion of the images to display corresponds to a number of pixels less than the number of pixels included in the images of the surgical site.


In an embodiment, the determining the portion of the images to display includes cropping at least a portion of the images.


In embodiments, the determining the portion of the images to display includes shifting at least a portion of the images.


In some embodiments, the shifting of at least the portion of the images includes determining a vector of movement of the observer based on the first position of the observer, determining a direction and an amount of pixels to shift based on the determined vector of movement of the observer, and shifting at least a portion of the images based on the determined direction and amount of pixels to shift.


In additional embodiments, the determining the vector of movement of the observer further includes determining a degree of movement, and the determining the direction and the amount of pixels to shift is further based on the determined degree of movement.


In another embodiment, the determining the direction and the amount of pixels to shift is further based on a relationship between the vector of movement of the observer and the direction and amount of pixels to shift. The relationship may be based on a table or a threshold.


In embodiments, the determining of the vector of movement of the observer includes determining whether the first position of the observer approaches a maximum threshold, and providing an alert indicating that the first position of the observer approaches the maximum threshold.


In an embodiment, the determining the vector of movement of the observer includes determining whether the first position of the observer exceeds a maximum threshold, and providing an alert indicating that the first position of the observer exceeds the maximum threshold.


In another embodiment, the system further includes a surgical robot, wherein the image capture device is coupled to the surgical robot, and the instructions, when executed by the processor, further cause the computing device to determine a vector of movement of the observer based on the data received from the sensor; and cause the surgical robot to reposition the image capture device based on the determined vector of movement of the observer.


In some embodiments, the stereoscopic display is an autostereoscopic display.


In several embodiments, the stereoscopic display is a passive stereoscopic display, and the system further comprises three-dimensional (3D) glasses worn by the observer.


In embodiments, the 3D glasses cause a left-eye image to be displayed to a left eye of the observer, and a right-eye image to be displayed to a right eye of the observer.


In an embodiment, the image capture device is a stereoscopic camera coupled to an endoscope.


In another embodiment, the sensor is a motion sensor or a camera.


In some embodiments, the data received from the sensor indicating a first position of the observer includes an image of the observer, and the instructions, when executed by the processor, further cause the computing device to generate second image data based on the image of the observer, and detect the first position of the observer by processing the second image data.


In several embodiments, the detecting the first position of the observer includes detecting one or more of a distance of the observer relative to a vector normal to the stereoscopic display, a direction of the observer relative to the vector normal to the stereoscopic display, or an orientation of the observer relative to the stereoscopic display.


In embodiments, the direction of the observer is one or more of a lateral direction or a vertical direction.


In some embodiments, the instructions, when executed by the at least one processor, further cause the computing device to receive additional data from the sensor indicating a second position of the observer, process the received images of the surgical site based on the second position of the observer, and cause the stereoscopic display to display the processed images of the surgical site.


Provided in accordance with embodiments of the present disclosure are methods for compensating for observer movement in a robotic surgical system. In an aspect of the present disclosure, an illustrative method includes receiving images of a surgical site from an image capture device, receiving data from a sensor indicating a first position of an observer, processing the received images of the surgical site based on the first position of the observer, and causing a stereoscopic display device to display the processed images of the surgical site.


Provided in accordance with embodiments of the present disclosure are non-transitory computer-readable storage media storing a program for compensating for observer movement in a robotic surgical system. In an aspect of the present disclosure, an illustrative program includes instructions which, when executed by a processor, cause a computing device to receive images of a surgical site from an image capture device, receive data from a sensor indicating a first position of an observer, process the received images of the surgical site based on the first position of the observer, and cause a stereoscopic display device to display the processed images of the surgical site.


Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects and features of the present disclosure are described hereinbelow with reference to the drawings, wherein:



FIG. 1 is a schematic diagram of an exemplary robotic surgical system that may be used to compensate for observer movement, according to an embodiment of the present disclosure;



FIG. 2 is a simplified block diagram of an exemplary computing device forming part of the system of FIG. 1;



FIG. 3 is a flowchart of an exemplary method for compensating for observer movement in a robotic surgical system having a stereoscopic display, according to an embodiment of the present disclosure;



FIG. 4 is a flowchart of an exemplary method for processing stereoscopic image data that compensates for observer movement, according to an embodiment of the present disclosure; and



FIG. 5 shows various views of an exemplary graphical user interface that may be displayed by the computing device of FIG. 2, according to an embodiment of the present disclosure.





DETAILED DESCRIPTION

The present disclosure generally relates to the display of stereoscopic images on a three-dimensional (3D) display and, more particularly, to the mitigation of a head-movement effect that is perceived by a user when viewing stereoscopic images on a conventional 3D display. In that regard, the present disclosure relates to systems and methods for compensating for observer movement in robotic surgical systems having stereoscopic displays, such as by detecting movement of the observer, repositioning an endoscopic camera based on the detected movement of the observer, shifting images received from the endoscopic camera based on the detected movement of the observer to compensate for the movement of the observer, and displaying the shifted images on a stereoscopic display. In this manner, the stereoscopic images are cropped and/or shifted, such as by shifting one or more lines and/or columns of pixels around the edges of the stereoscopic images, based on movement of the observer's head prior to being displayed. The effect of such shifting of the stereoscopic images is that the observer's perceived movement of the stereoscopic images caused by movement of the observer's head is visually mitigated by changing a displayed portion of the stereoscopic images to cause the displayed images to “move” in the way the observer expects the stereoscopic images to move even if the images received from the endoscopic camera do not move. Visual and/or auditory guidance, notifications, and/or alarms may be displayed and/or emitted by the stereoscopic display and/or a computing device associated therewith, to assist the observer with appropriately moving the observer's body, head, face, and/or eyes to control movement of the endoscopic camera and/or shifting of the images. Those skilled in the art will appreciate that the endoscopic camera may also be controlled based on other user interfaces, and thus need not be controlled based on movement of the observer.


With reference to FIG. 1, there is shown a system 100 for compensating for observer movement in robotic surgical systems having stereoscopic displays, according to an embodiment of the present disclosure. System 100 includes a table 110 supporting a body B, a stereoscopic display device 120, one or more image capture devices 125a and 125b, an endoscope 140 including an endoscopic camera 145, a surgical robot assembly 150, and a computing device 200. FIG. 1 further shows the observer O. The observer may be a user, clinician, surgeon, nurse, technician, and/or any other person operating surgical robot assembly 150.


Endoscopic camera 145 may be a single camera or a plurality of cameras capable of capturing stereoscopic images and/or any other camera or imaging device known to those skilled in the art that may be used to capture 3D images of a surgical site. In some embodiments, endoscopic camera 145 is a dual-lens or multi-lens camera. Display 120 may be any stereoscopic display device configured to output stereoscopic images to the observer. For example, display 120 may be an autostereoscopic display, a passive stereoscopic display, and/or any other display device configured to display three-dimensional (3D) images known to those skilled in the art. In embodiments where display 120 is a passive stereoscopic display device, system 100 may further include 3D glasses 127 worn by the observer. For example, 3D glasses 127 may cause a left-eye image to be displayed to a left eye of the observer, and a right-eye image to be displayed to a right eye of the observer.


Image capture devices 125a and 125b may be any image capture devices known to those skilled in the art, such as video cameras, still cameras, stereoscopic cameras, etc. In some embodiments, image capture devices 125a and 125b are motion sensors configured to detect movement of the observer. In other embodiments, image capture devices 125a, 125b are infrared light based marker tracking devices configured to track markers attached to the observer, such as to the observer's head and/or to 3D glasses 127. In embodiments, image capture devices 125a and 125b are positioned about display 120 to detect a viewing direction and/or angle of the observer. Image capture devices 125a and 125b are referred to collectively hereinafter as image capture devices 125.


Surgical robot assembly 150 includes a base 151, a first joint 152 coupled to base 151, a first robotic arm 155 coupled to first joint 152, a second joint 153 coupled to first robotic arm 155, a second robotic arm 154 coupled to second joint 153, and an instrument drive unit 156 coupled to second robotic arm 154. Endoscope 140 is attached to surgical robot assembly 150 via instrument drive unit 156. While a single surgical robot assembly 150 is shown in FIG. 1, multiple surgical robot assemblies 150 may be used concurrently in the surgical environment and may together form a surgical robot. Those skilled in the art will recognize that the below-described methods may be applied using surgical robots having single and/or multiple surgical robot assemblies 150, each including at least one base 151, robotic arms 154 and 155, joints 152 and 153, and instrument drive unit 156, without departing from the scope of the present disclosure. Body B may be a body of a patient upon whom a robotic surgical procedure is being performed.


Computing device 200 may be any computing device configurable for use during robotic surgery known to those skilled in the art. For example, computing device 200 may be a desktop computer, laptop computer, server and terminal configuration, and/or a control computer for surgical robot assembly 150, etc. In some embodiments, computing device 200 may be included in display 120. As described further below, system 100 may be used during robotic surgery to detect movement of the observer, reposition endoscope 140 based on the detected movement, and shift images captured by endoscopic camera 145 based on the detected movement.


Turning now to FIG. 2, there is shown a schematic diagram of computing device 200 forming part of system 100 of FIG. 1, according to an embodiment of the present disclosure. Computing device 200 includes a memory 202 storing a database 240 and an application 280. Application 280 includes instructions which, when executed by a processor 204, cause computing device 200 to perform various functions, as described below. Application 280 further includes graphical user interface (GUI) instructions 285 which, when executed by processor 204, cause computing device 200 to generate one or more GUIs (not shown in FIG. 2), such as, for example, the exemplary GUI shown in FIG. 5. Database 240 stores various tables, thresholds, and/or relational data related to shifting of images based on movement of the observer, as further described below.


Memory 202 may include any non-transitory computer-readable storage medium for storing data and/or software that is executable by processor 204 and which controls the operation of computing device 200, display 120, and/or surgical robot assembly 150. In an embodiment, memory 202 may include one or more solid-state storage devices such as flash memory chips. Alternatively, or in addition to the one or more solid-state storage devices, memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown in FIG. 2) and a communications bus (not shown in FIG. 2). Although the description of computer-readable media included herein refers to a solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media may be any available media that can be accessed by processor 204. That is, computer-readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information and which may be accessed by computing device 200.


Computing device 200 further includes an input interface 206, a communications interface 208, and an output interface 210. Input interface 206 may be a mouse, keyboard, or other hand-held controller, foot pedal, touch screen, voice interface, and/or any other device or interface by means of which a user may interact with computing device 200.


Communications interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the internet. Output interface 210 may be a screen or other display device usable to output images or data by computing device 200.


With reference to FIG. 3, there is shown a flowchart of an exemplary method 300 for compensating for observer movement in robotic surgical systems having stereoscopic displays, according to an embodiment of the present disclosure. In embodiments, method 300 may be implemented using a system, such as system 100 of FIG. 1, and one or more computing devices, such as computing device 200 of FIG. 2. Starting at step S302, computing device 200 receives one or more images of a surgical site within body B. The images may be first images of the surgical site, and may be captured by endoscopic camera 145. The first images may be stereoscopic images including left-eye image data and right-eye image data. In some embodiments, the first images are live images of the surgical site, and the below-described processing of the first images occurs in real time.


Next, at step S304, computing device 200 receives data indicating a position of the observer. The data may be acquired by image capture devices 125. The data may include images and/or image data of the observer, such as images of a portion of the observer's body, for example, the observer's head, face, and/or eyes. For example, if the data received from image capture devices 125 include images, computing device 200 may process the images to generate image data, and may then further process the image data to identify the observer and detect a position of the observer in the image data, such as a position relative to a vector normal to, and centered on, a front face of display 120 (referred to hereinafter as the "normal vector"). For example, the normal vector may be a vector coming out of the center of a face of display 120 facing the observer. In some embodiments, computing device 200 may process the images and/or image data received from image capture devices 125 to identify the observer's head, face, and/or eyes, as well as a viewing direction, orientation, and/or angle of the observer's head, face, and/or eyes. Computing device 200 may further process the images and/or image data received from image capture devices 125 to determine movement of the observer's body, head, face, and/or eyes, such as movement relative to the normal vector. For example, computing device 200 may intermittently and/or continuously receive data from image capture devices 125, and may intermittently and/or continuously process the data to identify successive positions of the observer, and thereby determine movement of the observer relative to a previous position and/or relative to the normal vector. In other embodiments, such as embodiments where image capture devices 125 are motion sensors, image capture devices 125 provide motion data to computing device 200.
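

As a concrete illustration of this step, the following sketch estimates an observer position relative to the normal vector from a tracking-camera image using a simple pinhole-camera model. It is not taken from the disclosure: the face-detection inputs, the assumed average face width, and the focal length are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the disclosure's implementation): estimate the
# observer's position relative to the display's normal vector with a
# pinhole-camera model. The face center and apparent face width are
# assumed to come from whatever face/eye detection the tracking-camera
# pipeline provides; the constants below are illustrative assumptions.

KNOWN_FACE_WIDTH_CM = 15.0  # assumed average head width used for ranging
FOCAL_LENGTH_PX = 900.0     # assumed tracking-camera focal length

def observer_position_from_image(face_center_px, face_width_px, image_size):
    """Return (x, y, z) in cm relative to the normal vector.

    x: lateral offset (+ = image right), y: vertical offset (+ = down),
    z: distance from the display. The tracking camera is assumed to be
    centered on the display and aligned with the normal vector.
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    # Pinhole-model range estimate from apparent face size.
    z = KNOWN_FACE_WIDTH_CM * FOCAL_LENGTH_PX / face_width_px
    # Convert pixel offsets from image center into lateral/vertical cm.
    x = (face_center_px[0] - cx) * z / FOCAL_LENGTH_PX
    y = (face_center_px[1] - cy) * z / FOCAL_LENGTH_PX
    return np.array([x, y, z])

# Face detected 60 px right of image center, 30 px above it, 150 px wide
# in a 1280x720 tracking image -> roughly 6 cm lateral, 3 cm vertical
# offset, 90 cm from the display.
print(observer_position_from_image((700, 330), 150.0, (1280, 720)))
```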


As an optional step, at step S306, computing device 200, such as via application 280, may process the data received at step S304 to determine a direction, amount, speed, and/or degree of movement of the observer based on the determined position or positions of the observer, such as of the observer's head, face, and/or eyes. For example, computing device 200 may determine whether the observer is moving in a horizontal direction, a vertical direction, a diagonal direction, and/or a rotational direction relative to the normal vector. Computing device 200 may further determine an amount, speed, and/or degree of movement of the observer in each direction; e.g., the observer may be moving left by 5 cm and up by 2 cm. Computing device 200 may then cause surgical robot assembly 150 to reposition endoscope 140 based on the determined movement of the observer.
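

The movement determination of step S306 can be illustrated with a short sketch that derives a direction, distance, and speed from two successive observer positions. The names and the coordinate convention are assumptions carried over from the position-estimation sketch above.

```python
import numpy as np

# Minimal sketch: derive a movement vector (direction, distance, speed)
# from two successive observer positions relative to the normal vector.
# Positions are (x, y, z) in cm; dt is the elapsed time in seconds.

def movement_vector(prev_pos, curr_pos, dt):
    delta = np.asarray(curr_pos, float) - np.asarray(prev_pos, float)
    distance = float(np.linalg.norm(delta))
    direction = delta / distance if distance > 0 else np.zeros(3)
    speed = distance / dt if dt > 0 else 0.0
    return direction, distance, speed

# Observer moved left 5 cm and up 2 cm over 0.2 s (convention from the
# sketch above: +x right, +y down, +z away from the display).
direction, distance, speed = movement_vector((0, 0, 90), (-5, -2, 90), 0.2)
print(direction, distance, speed)
```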


As noted above, in some embodiments, computing device 200 may intermittently and/or continuously receive images and/or image data of the observer from image capture devices 125. Computing device 200 may then intermittently and/or continuously process the images and/or image data of the observer to detect movement of the observer. For example, computing device 200 may determine successive positions of the observer relative to the normal vector in the images and/or image data received from image capture devices 125. Computing device 200 may then determine a vector of the observer's movement based on the determined successive positions of the observer relative to the normal vector in the images and/or image data.


As an additional optional step, after endoscope 140 is repositioned, computing device 200, at step S308, receives second images of the surgical site from endoscopic camera 145. Similar to the first images, the second images may be stereoscopic images including left-eye image data and right-eye image data, and may have the same aspect ratio, resolution, frame size, and/or number of pixels as the first images.


Thereafter, concurrently with steps S306 and/or S308, or, in embodiments where steps S306 and S308 are not performed, directly after step S304, at step S310 computing device 200, such as via application 280, determines whether the position of the observer, as determined at step S306, approaches a maximum threshold for that particular direction of movement. For example, computing device 200 may have stored in database 240 various tables and/or thresholds for each direction relative to the normal vector. The tables and/or thresholds may indicate various relationships between the direction and distance of the observer relative to the normal vector, and corresponding movement of endoscope 140 and/or adjustment required for displaying stereoscopic image data. For example, the tables and/or thresholds may indicate an amount of movement of the observer in a particular direction, such as a horizontal direction, required to cause surgical robot assembly 150 to reposition endoscope 140 by a predetermined amount in the same direction, and/or a number of pixels in a particular direction to shift stereoscopic images received from endoscope 140 based on the position of the observer relative to the normal vector. The tables and/or thresholds may also indicate a maximum amount of movement in a particular direction that can be used to cause surgical robot assembly 150 to reposition endoscope 140 and/or a maximum number of pixels that could be shifted in a particular direction. Such a maximum threshold may correspond to a limit of motion of endoscope 140 and/or surgical robot assembly 150, and/or a maximum number of pixels available to be shifted in a particular direction.
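

The per-direction tables and thresholds described here might be organized along the following lines. The field names and numeric values are invented for illustration and are not taken from the disclosure.

```python
# Illustrative sketch of the kind of per-direction relationships the
# disclosure says may be stored in database 240. All names and numbers
# are assumptions chosen for the example.

DIRECTION_TABLE = {
    "horizontal": {
        "cm_per_camera_step": 2.0,  # observer cm per camera reposition step
        "pixels_per_cm": 12,        # image pixels shifted per observer cm
        "max_camera_cm": 10.0,      # limit of endoscope/robot motion
        "max_shift_px": 64,         # pixels available to shift on this axis
    },
    "vertical": {
        "cm_per_camera_step": 2.0,
        "pixels_per_cm": 8,         # less sensitive vertically (preference)
        "max_camera_cm": 6.0,
        "max_shift_px": 36,
    },
}

def approaching_limit(offset_cm, axis, margin=0.8):
    """True when the observer's offset nears the maximum for that axis."""
    limit = DIRECTION_TABLE[axis]["max_camera_cm"]
    return abs(offset_cm) >= margin * limit
```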


If it is determined at step S310 that the position of the observer does not approach a maximum threshold corresponding to that particular direction ("No" at step S310), processing skips ahead to step S318. Alternatively, if it is determined at step S310 that the position of the observer approaches a maximum threshold corresponding to that particular direction ("Yes" at step S310), processing proceeds to step S312. At step S312, computing device 200, such as via application 280, determines whether the position of the observer exceeds the maximum threshold corresponding to that particular direction. If it is determined at step S312 that the position of the observer does not exceed the maximum threshold corresponding to that particular direction ("No" at step S312), processing proceeds to step S314, where computing device 200 provides an alert to notify the observer that the position of the observer is approaching the maximum threshold corresponding to that particular direction. Alternatively, if it is determined at step S312 that the position of the observer exceeds the maximum threshold corresponding to that particular direction ("Yes" at step S312), processing proceeds to step S316, where computing device 200 provides a warning to notify the observer that the position of the observer has exceeded the maximum threshold. After either the alert is provided at step S314 or the warning is provided at step S316, processing proceeds to step S318.
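

The branching of steps S310 through S316 can be sketched as follows, reusing the illustrative DIRECTION_TABLE above. The 80% "approaching" margin and the alert/warn callbacks are assumptions, not values from the disclosure.

```python
# Sketch of the S310/S312/S314/S316 branching, using the illustrative
# DIRECTION_TABLE from the previous sketch. alert() and warn() stand in
# for whatever visual/auditory notification the system emits.

APPROACH_MARGIN = 0.8  # assumed: "approaching" means within 80% of the limit

def check_observer_limits(offset_cm, axis, alert, warn):
    limit = DIRECTION_TABLE[axis]["max_camera_cm"]
    if abs(offset_cm) < APPROACH_MARGIN * limit:
        return  # S310 "No": proceed directly to image processing (S318)
    if abs(offset_cm) <= limit:
        alert(f"{axis} position approaching limit")  # S314
    else:
        warn(f"{axis} position exceeds limit")       # S316

check_observer_limits(9.5, "horizontal", print, print)
```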


At step S318, computing device 200, such as via application 280, processes the images of the surgical site received at step S302 and/or the second images of the surgical site received at step S308. Further details regarding an exemplary procedure 400 that may be employed as part of the processing of the images of the surgical site at step S318 are described below with reference to FIG. 4. Computing device 200 then, at step S320, causes display 120 to display the processed images of the surgical site.


Thereafter, at step S322, computing device 200, such as via application 280, determines whether the surgical procedure has been completed. For example, computing device 200 may receive input from the observer and/or another clinician involved in the surgical procedure indicating that the surgical procedure has been completed. If it is determined at step S322 that the surgical procedure has not been completed (“No” at step S322), processing returns to step S304. Alternatively, if it is determined at step S322 that the surgical procedure has been completed (“Yes” at step S322), processing ends.


Turning now to FIG. 4, there is shown a flowchart of an exemplary method 400 for processing stereoscopic images to compensate for observer movement, according to an embodiment of the present disclosure. Computing device 200 may perform some or all of the steps of the method of FIG. 4, for example, at or during step S318 of the method 300 of FIG. 3, described above. However, those skilled in the art will recognize that some of the steps of method 400 may be repeated, omitted, and/or performed in a different sequence without departing from the scope of the present disclosure.


Starting at step S402, computing device 200 receives images of a surgical site. The images of the surgical site may be received from a camera such as endoscopic camera 145, as shown in FIG. 5 where, for example, image 502 is a stereoscopic image displayed by display 120. Thus, the images received from endoscopic camera 145 may include left-eye image data and right-eye image data which are displayed by display device 120 as a stereoscopic image 502. The image 502 may have a particular aspect ratio, resolution, frame size, and/or number of pixels. For example, the image 502 may have multiple rows and columns of pixels. The aspect ratio, resolution, frame size, and/or number of pixels may correspond to a type of endoscopic camera 145 used. Alternatively, or in addition, the aspect ratio, resolution, frame size, and/or number of pixels may correspond to image processing techniques used by computing device 200. For example, the frame size of the image 502 may correspond to a number of pixels included in the image 502.


Thereafter, at step S404, computing device 200, such as via application 280, crops at least a portion of the image 502 of the surgical site to designate a displayed portion 504 of the image 502, as shown in FIG. 5. For example, the displayed portion 504 may include a number of pixels less than the full number of pixels included in the image 502 of the surgical site. Thus, the displayed portion 504 may have a smaller resolution, frame size, and/or number of pixels than the image 502 of the surgical site. For example, the displayed portion 504 may exclude one or more rows and/or columns of pixels 506 around outer edges of the image 502 of the surgical site. As shown in FIG. 5, the displayed portion 504 has a smaller frame size than the image 502. Image view 508 shows an example of the displayed portion 504 of the image 502 that may be displayed on display 120 prior to being shifted.
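

A minimal sketch of this cropping step follows, assuming a 64-pixel margin on every edge; the margin size is an illustrative assumption that defines the reserve of pixels later available for shifting.

```python
import numpy as np

# Minimal sketch of step S404: designate a displayed portion 504 by
# cropping rows and columns of pixels from the outer edges of image 502.
# The 64-pixel margin is an assumed value.

MARGIN_PX = 64

def crop_displayed_portion(frame, margin=MARGIN_PX):
    """Return the centered displayed portion, excluding `margin` pixels
    on every edge of the full camera frame."""
    h, w = frame.shape[:2]
    return frame[margin:h - margin, margin:w - margin]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in camera frame
displayed = crop_displayed_portion(frame)
print(displayed.shape)  # (952, 1792, 3): smaller frame size than image 502
```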


Next, at step S406, computing device 200, such as via application 280, determines a position of the observer's body, e.g., the observer's head, face, and/or eyes, relative to the normal vector. In some embodiments, the determination described above with reference to step S306 is the same as the determination described here. In other embodiments, computing device 200 may perform two separate determinations of a position of the observer relative to the normal vector. For example, computing device 200 may determine a horizontal distance, a vertical distance, and/or a diagonal distance of the position of the observer relative to the normal vector. In some embodiments, computing device 200 may determine a directional component, such as a vector, and a scalar component, such as a magnitude, based on the position of the observer relative to the normal vector. In some embodiments, successive positions of the observer (such as a first position, a second position, etc.) are determined, and movement of the observer may then be detected based on the determined successive positions of the observer relative to the normal vector. For example, as the observer moves relative to the normal vector, one or more positions of the observer relative to the normal vector may be determined, and a direction, amount, speed, and/or degree of movement of the observer may be determined based on the successive positions of the observer relative to the normal vector.


Thereafter, at step S408, computing device 200, such as via application 280, determines a direction and/or amount of pixels to shift and/or pan the displayed portion 504 of the image 502, based on the position of the observer determined at step S406. The determination may be based on a table and/or a threshold. For example, computing device 200 may have stored in database 240 various tables and/or thresholds, as described above with reference to step S310. The tables and/or thresholds may indicate various relationships between a direction and/or distance of the observer relative to the normal vector, and a corresponding direction and amount of pixels to shift and/or pan the displayed portion 504 of the image 502. For example, the tables and/or thresholds may indicate a distance of the observer from the normal vector in a particular direction, such as a horizontal direction, required to shift a predetermined amount of pixels of the displayed portion 504 in the same direction. In embodiments, different thresholds and/or relationships may be configured for each direction. For example, based on the preference of the observer, the threshold for the horizontal direction may be lower than the threshold for the vertical direction, thus allowing for more sensitivity to movement of the observer in a horizontal direction relative to the normal vector than movement of the observer in a vertical direction relative to the normal vector. The tables and/or thresholds may also indicate a maximum distance of the observer from the normal vector in a particular direction that can be used to shift pixels in that direction. For example, if too many pixels are shifted and/or panned at once, the images displayed by display 120 may appear distorted. As such, a maximum threshold may correspond to a limit of the number of pixels that may be shifted and/or panned at once. Further, in embodiments where computing device 200 determines directional and scalar components based on movement of the observer, the determination of the direction and/or amount of pixels to shift may further be based on the directional and scalar components.
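

A sketch of this mapping follows, reusing the illustrative DIRECTION_TABLE above. The per-axis gains and clamps are assumed values, and the sign convention (an observer to the left of the normal vector pans the view to the right) follows the example given in the next step.

```python
# Sketch of step S408: convert the observer's offset from the normal
# vector into a (dx, dy) pixel shift, using the per-axis sensitivities
# and maxima from the illustrative DIRECTION_TABLE above.

def pixels_to_shift(offset_x_cm, offset_y_cm):
    """Return (dx, dy) in pixels; sign gives direction, magnitude is
    clamped to the per-axis maximum so the view is never shifted by
    more pixels than the cropped margin provides."""
    h = DIRECTION_TABLE["horizontal"]
    v = DIRECTION_TABLE["vertical"]
    # Pan opposite the observer's offset: observer left -> view pans right.
    dx = int(round(-offset_x_cm * h["pixels_per_cm"]))
    dy = int(round(-offset_y_cm * v["pixels_per_cm"]))
    dx = max(-h["max_shift_px"], min(h["max_shift_px"], dx))
    dy = max(-v["max_shift_px"], min(v["max_shift_px"], dy))
    return dx, dy

print(pixels_to_shift(-5.0, -2.0))  # observer left 5 cm, up 2 cm -> (60, 16)
```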


Next, at step S410, computing device 200, such as via application 280, shifts the displayed portion 504 of the image 502 based on the direction and/or amount of pixels to shift determined at step S408. For example, if the position of the observer is to the left of the normal vector by a particular amount, computing device 200 may shift the displayed portion 504 right by the amount of pixels determined at step S408. In the example shown in FIG. 5, the displayed portion 504 of the image 502, as shown in unshifted image view 508, is shifted to the right (as viewed by the observer), as shown in shifted image view 510, such that the displayed portion 504 includes one or more columns of pixels to the right of, and not included in, the unshifted image view 508, and excludes one or more columns of pixels of the image 502 of the surgical site included in the left side of the unshifted image view 508. As such, the unshifted image view 508 and the shifted image view 510, both being based on the image 502 of the surgical site received from endoscopic camera 145, may share an overlapping portion that makes up the majority of both views. However, the shifted image view 510 includes a minor portion at its right edge that is not included in the unshifted image view 508, and excludes a minor portion at the left edge of the unshifted image view 508.
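

A sketch of the window shift itself, consistent with the cropping sketch above; the margin and example values are assumptions.

```python
import numpy as np

# Sketch of step S410: pan the displayed portion 504 within image 502 by
# offsetting the crop window rather than moving the camera. MARGIN_PX
# matches the cropping sketch above; (dx, dy) comes from pixels_to_shift.

MARGIN_PX = 64

def shifted_displayed_portion(frame, dx, dy, margin=MARGIN_PX):
    """Slide the crop window by (dx, dy), clamped to the margin, so the
    displayed portion gains pixels on one side and drops them on the
    other, as in shifted image view 510."""
    h, w = frame.shape[:2]
    dx = max(-margin, min(margin, dx))
    dy = max(-margin, min(margin, dy))
    return frame[margin + dy:h - margin + dy, margin + dx:w - margin + dx]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
view = shifted_displayed_portion(frame, dx=60, dy=16)
print(view.shape)  # (952, 1792, 3): same frame size as the unshifted view
```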


As such, by virtue of the above-described systems and methods, computing device 200 may be configured to detect one or more positions of the observer, reposition an endoscopic camera based on movement of the observer determined based on successive positions of the observer, shift one or more lines and/or columns of pixels in the images received from the endoscopic camera based on the one or more positions of the observer to compensate for the movement of the observer, and display the shifted images on a stereoscopic display. In this manner, the stereoscopic images are cropped and/or shifted based on movement of the observer's head prior to being displayed. One effect of such shifting of the stereoscopic images is that the displayed images “move” in the way the observer expects the stereoscopic images to move even if the images received from the endoscopic camera do not move.
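

For orientation, the sketches above can be composed into a single illustrative compensation cycle. This is an assumed arrangement, not the disclosure's code, and it requires the preceding helper definitions to be in scope.

```python
# One illustrative compensation cycle composing the earlier sketches
# (observer_position_from_image, check_observer_limits, pixels_to_shift,
# shifted_displayed_portion); it corresponds roughly to steps S304-S320.

def compensation_cycle(frame, face_center_px, face_width_px,
                       image_size, display):
    pos = observer_position_from_image(face_center_px, face_width_px,
                                       image_size)                # S304/S306
    check_observer_limits(pos[0], "horizontal", print, print)     # S310-S316
    check_observer_limits(pos[1], "vertical", print, print)
    dx, dy = pixels_to_shift(pos[0], pos[1])                      # S408
    view = shifted_displayed_portion(frame, dx, dy)               # S404/S410
    display(view)                                                 # S320
```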


Detailed embodiments of devices, systems incorporating such devices, and methods using the same are described herein. However, these detailed embodiments are merely examples of the disclosure, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for allowing one skilled in the art to variously employ the present disclosure in appropriately detailed structure.

Claims
  • 1. A robotic surgical system with observer movement compensation, the system comprising: a camera configured to capture images of a surgical site; a stereoscopic display device; a sensor configured to detect positions of an observer; and a computing device including at least one processor and a memory storing instructions which, when executed by the at least one processor, cause the computing device to: receive the images of the surgical site from the camera, receive data from the sensor indicating a first position of the observer, process the received images of the surgical site based on the first position of the observer, cause the stereoscopic display device to display the processed images of the surgical site, determine a vector of movement of the observer based on the first position of the observer, determine whether the vector of movement exceeds a threshold corresponding to a maximum amount of movement for which the camera may be repositioned, reposition the camera to a new position corresponding to the vector of movement when it is determined that the vector of movement does not exceed the threshold corresponding to the maximum amount of movement for which the camera may be repositioned, determine whether the vector of movement exceeds a threshold corresponding to a maximum number of pixels available to be shifted, and shift the stereoscopic images when it is determined that: the vector of movement does not exceed the threshold corresponding to the maximum number of pixels available to be shifted; and the vector of movement exceeds the threshold corresponding to the maximum amount of movement for which the camera may be repositioned.
  • 2. The system according to claim 1, wherein the images of the surgical site include left-eye image data and right-eye image data.
  • 3. The system according to claim 1, wherein the images of the surgical site have a frame size, and wherein the processing the received images of the surgical site based on the first position of the observer includes determining a portion of the images to display based on the first position of the observer, the portion of the images to display being smaller than the frame size.
  • 4. The system according to claim 3, wherein the portion of the images to display corresponds to a number of pixels less than the number of pixels included in the images of the surgical site.
  • 5. The system according to claim 3, wherein the determining the portion of the images to display includes cropping at least a portion of the images.
  • 6. The system according to claim 3, wherein the determining the portion of the images to display includes shifting at least a portion of the images.
  • 7. The system according to claim 6, wherein the shifting at least the portion of the images includes: determining a direction and an amount of pixels to shift based on the determined vector of movement of the observer; and shifting at least a portion of the images based on the determined direction and amount of pixels to shift.
  • 8. The system according to claim 7, wherein the determining the vector of movement of the observer further includes determining a degree of movement, and wherein the determining the direction and the amount of pixels to shift is further based on the determined degree of movement.
  • 9. The system according to claim 7, wherein the determining the direction and the amount of pixels to shift is further based on a relationship between the vector of movement of the observer and the direction and amount of pixels to shift.
  • 10. The system according to claim 9, wherein the relationship is based on a table.
  • 11. The system according to claim 1, wherein the computing device is further configured to: determine whether the first position of the observer approaches a maximum threshold; and provide an alert indicating that the first position of the observer approaches the maximum threshold.
  • 12. The system according to claim 1, wherein the computing device is further configured to: determine whether the first position of the observer exceeds a maximum threshold; and provide an alert indicating that the first position of the observer exceeds the maximum threshold.
  • 13. The system according to claim 1, wherein the stereoscopic display is an autostereoscopic display.
  • 14. The system according to claim 1, wherein the stereoscopic display is a passive stereoscopic display, and the system further comprises three-dimensional (3D) glasses worn by the observer.
  • 15. The system according to claim 14, wherein the 3D glasses cause a left-eye image to be displayed to a left eye of the observer, and a right-eye image to be displayed to a right eye of the observer.
  • 16. The system according to claim 1, wherein the camera is a stereoscopic camera coupled to an endoscope.
  • 17. The system according to claim 1, wherein the sensor is a motion sensor.
  • 18. The system according to claim 1, wherein the sensor is a second camera.
  • 19. The system according to claim 18, wherein the data received from the sensor indicating a first position of the observer includes an image of the observer, and wherein the instructions, when executed by the at least one processor, further cause the computing device to: generate second image data based on the image of the observer; and detect the first position of the observer by processing the second image data.
  • 20. The system according to claim 19, wherein the detecting the first position of the observer includes detecting one or more of: a distance of the observer relative to a vector normal to the stereoscopic display; a direction of the observer relative to the vector normal to the stereoscopic display; or an orientation of the observer relative to the stereoscopic display.
  • 21. The system according to claim 20, wherein the direction of the observer is one or more of a lateral direction or a vertical direction.
  • 22. The system according to claim 1, wherein the instructions, when executed by the at least one processor, further cause the computing device to: receive additional data from the sensor indicating a second position of the observer; process the received images of the surgical site based on the second position of the observer; and cause the stereoscopic display device to display the processed images of the surgical site.
  • 23. A method for compensating for observer movement in a robotic surgical system, the method comprising: receiving images of a surgical site from a camera; receiving data from a sensor indicating a first position of an observer; processing the received images of the surgical site based on the first position of the observer; causing a stereoscopic display device to display the processed images of the surgical site; determining a vector of movement of the observer based on the first position of the observer; determining whether the vector of movement exceeds a threshold corresponding to a maximum amount of movement for which the camera may be repositioned; repositioning the camera to a new position corresponding to the vector of movement when it is determined that the vector of movement does not exceed the threshold corresponding to the maximum amount of movement for which the camera may be repositioned; determining whether the vector of movement exceeds a threshold corresponding to a maximum number of pixels available to be shifted; and shifting the stereoscopic images when it is determined that: the vector of movement does not exceed the threshold corresponding to the maximum number of pixels available to be shifted; and the vector of movement exceeds the threshold corresponding to the maximum amount of movement for which the camera may be repositioned.
  • 24. A non-transitory computer-readable storage medium storing a program for compensating for observer movement in a robotic surgical system, the program including instructions which, when executed by a processor, cause a computing device to: receive images of a surgical site from a camera; receive data from a sensor indicating a first position of an observer; process the received images of the surgical site based on the first position of the observer; cause a stereoscopic display device to display the processed images of the surgical site; determine a vector of movement of the observer based on the first position of the observer; determine whether the vector of movement exceeds a threshold corresponding to a maximum amount of movement for which the camera may be repositioned; reposition the camera to a new position corresponding to the vector of movement when it is determined that the vector of movement does not exceed the threshold corresponding to the maximum amount of movement for which the camera may be repositioned; determine whether the vector of movement exceeds a threshold corresponding to a maximum number of pixels available to be shifted; and shift the stereoscopic images when it is determined that: the vector of movement does not exceed the threshold corresponding to the maximum number of pixels available to be shifted; and the vector of movement exceeds the threshold corresponding to the maximum amount of movement for which the camera may be repositioned.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2019/025096, filed Apr. 1, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/660,398, filed Apr. 20, 2018, the entire disclosure of which is incorporated by reference herein.

PCT Information
  • Filing Document: PCT/US2019/025096, Filing Date: 4/1/2019, Country: WO
  • Publishing Document: WO2019/204012, Publishing Date: 10/24/2019, Country: WO, Kind: A
Related Publications (1)
  • Number: 20210243427 A1, Date: Aug 2021, Country: US

Provisional Applications (1)
  • Number: 62660398, Date: Apr 2018, Country: US