Robotic surgical system with automated guidance

Information

  • Patent Grant
  • Patent Number
    11,839,441
  • Date Filed
    Monday, May 7, 2018
  • Date Issued
    Tuesday, December 12, 2023
Abstract
A robotic surgical system includes at least one robot arm, at least one instrument, and a plurality of drive motors configured to drive the at least one robot arm and at least one instrument. The system also includes a laparoscopic port having a plurality of fiducials, a sensor configured to detect the plurality of fiducials, and a controller configured to control the plurality of drive motors. The controller includes a processor that determines a current distance between each fiducial among the plurality of fiducials, determines a location of the laparoscopic port based on the distance between each fiducial, determines a position of the at least one robot arm and the at least one instrument relative to the location of the laparoscopic port, and controls the plurality of drive motors to align the at least one robot arm or the at least one instrument with the laparoscopic port.
Description
BACKGROUND

Robotic surgical systems such as teleoperative systems are used to perform minimally invasive surgical procedures that offer many benefits over traditional open surgery techniques, including less pain, shorter hospital stays, quicker return to normal activities, minimal scarring, reduced recovery time, and less injury to tissue.


Robotic surgical systems can have a number of robotic arms that move attached instruments or tools, such as an image capturing device, a stapler, an electrosurgical instrument, etc., in response to movement of input devices by a surgeon viewing images captured by the image capturing device of a surgical site. During a surgical procedure, each of the tools may be inserted through an opening, e.g., a laparoscopic port, into the patient and positioned to manipulate tissue at a surgical site. The openings are placed about the patient's body so that the surgical instruments may be used to cooperatively perform the surgical procedure and the image capturing device may view the surgical site.


During the surgical procedure, the tools are manipulated in multiple degrees of freedom by a clinician. In order to manipulate the tool through the laparoscopic port, the clinician has to position the robotic arm correctly to facilitate insertion and/or removal of the tool. However, obtaining the correct position may be a relatively time consuming step. Further, manual positioning of the robotic arm by the clinician adds further time and complexity to this step of the surgical procedure.


Accordingly, there is a need for guiding the robotic arms of the robotic surgical system to reduce the complexity and duration of a surgical procedure, as well as to improve the outcome and/or results of the surgical procedure.


SUMMARY

The present disclosure relates generally to guiding of a robotic surgical system and, in particular, guiding a robotic arm of the robotic surgical system to automatically insert and/or remove tools through an opening or laparoscopic port.


In an aspect of the present disclosure, a robotic surgical system is provided. The system includes at least one robot arm, at least one instrument coupled to the robot arm, and a plurality of drive motors configured to drive the at least one robot arm and at least one instrument. The system also includes a laparoscopic port having a plurality of fiducials and a sensor configured to detect the plurality of fiducials. A controller that is configured to control the plurality of drive motors includes a processor configured to determine a current distance between each fiducial among the plurality of fiducials, determine a location of the laparoscopic port based on the distance between each fiducial, determine a position of the at least one robot arm and the at least one instrument relative to the location of the laparoscopic port, and control the plurality of drive motors to align the at least one robot arm or the at least one instrument with the laparoscopic port.


In embodiments, the processor is also configured to obtain a predetermined distance between each fiducial among the plurality of fiducials and obtain a predetermined distance between each fiducial and the laparoscopic port. The processor may determine the location of the laparoscopic port based on the current distance between each fiducial among the plurality of fiducials, the predetermined distance between each fiducial among the plurality of fiducials, and the predetermined distance between each fiducial and the laparoscopic port.


In embodiments, the processor is configured to obtain a length of the at least one instrument and a length of the at least one robot arm. Control of the plurality of drive motors may be based on the position of the at least one robot arm and the at least one instrument relative to the location of the laparoscopic port, the length of the at least one instrument, and the length of the at least one robot arm.


In some embodiments, the plurality of fiducials is active light emitting diodes.


In some embodiments, the robotic surgical system includes a light source configured to emit light directed at the plurality of fiducials. The plurality of fiducials may include a reflective material and the light emitted from the light source may be reflected by the plurality of fiducials and detected by the sensor.


In another aspect of the present disclosure, a method for guiding a robot arm and/or instrument toward a laparoscopic port is provided. The method includes determining a current distance between each fiducial among a plurality of fiducials disposed around the laparoscopic port and determining a location of the laparoscopic port based on the distance between each fiducial. The method also includes determining a position of the robot arm and the instrument relative to the location of the laparoscopic port and controlling a plurality of drive motors associated with the robot arm and the instrument to align the robot arm or the instrument with the laparoscopic port.


In embodiments, the method includes obtaining a predetermined distance between each fiducial among the plurality of fiducials and obtaining a predetermined distance between each fiducial and the laparoscopic port. The method also includes determining the location of the laparoscopic port based on the current distance between each fiducial among the plurality of fiducials, the predetermined distance between each fiducial among the plurality of fiducials, and the predetermined distance between each fiducial and the laparoscopic port.


In embodiments, the method includes obtaining a length of the instrument and obtaining a length of the robot arm. Controlling the plurality of drive motors is based on the position of the robot arm and the instrument relative to the location of the laparoscopic port, the length of the instrument, and the length of the robot arm.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:



FIG. 1 is a schematic illustration of a user interface and a robotic system of a robotic surgical system in accordance with the present disclosure;



FIG. 2A is a top view of a laparoscopic port in accordance with the present disclosure;



FIG. 2B is a perspective view of a portion of a robotic arm of the robotic system of FIG. 1 with a surgical instrument attached to the robotic arm;



FIG. 3A is a perspective view of an exemplary outside-looking-in optical system in accordance with the present disclosure;



FIG. 3B is a perspective view of an exemplary inside-looking-out optical system in accordance with the present disclosure;



FIG. 3C is a perspective view of an exemplary optical system in accordance with the present disclosure including sensors disposed about an operating theater; and



FIG. 4 is a flowchart depicting operation of an automated guidance system of the present disclosure including an optical system of FIG. 3A, 3B, or 3C.





DETAILED DESCRIPTION

The present disclosure is directed to systems and methods for guiding one or more robotic arms in a robotic surgical system utilizing images captured during a surgical procedure. Image data captured during a surgical procedure may be analyzed to guide a robotic arm having a device or tool coupled thereto to insert and/or remove the tool from a laparoscopic port. In the systems described herein, one or more laparoscopic ports have three (3) fiducials, e.g., active infrared light emitting diodes (LEDs), with known distances between the fiducials and known distances between the fiducials and a center of the laparoscopic port. The positions of each fiducial may be analyzed using motion analysis techniques to establish a plane defined by the fiducials and to establish the steps necessary to move the robot arm into position. The position of the center of the laparoscopic port relative to the three (3) fiducials determines where to insert and/or remove the tool, as well as an orientation angle for the tool.


In the systems described herein, aligning and orienting the robotic arm to insert and/or remove the tool through the laparoscopic port relies on one or more variables. The variable(s) may include, but is/are not limited to, distance(s) between three (3) or more fiducials, distance(s) between the fiducials and the center of the laparoscopic port, the port plane defined by the fiducials, and/or the length of the interchangeable tool and robotic arm.


The systems described herein permit quicker installation of robotic instrumentation into laparoscopic ports, less intervention on the part of a clinician during a surgical procedure, and/or potentially lower incidents of impact with a laparoscopic port site.


Turning to FIG. 1, a robotic surgical system 100 may be employed with one or more consoles 102 that are next to the operating theater or located in a remote location. In this instance, one team of clinicians or nurses may prep the patient for surgery and configure the robotic surgical system 100 with one or more tools 104 while another clinician (or group of clinicians) remotely controls the instruments via the robotic surgical system. As can be appreciated, a highly skilled clinician may perform multiple operations in multiple locations without leaving his/her remote console which can be both economically advantageous and a benefit to the patient or a series of patients.


The robotic arms 106 of the surgical system 100 are typically coupled to a pair of master handles 108 by a controller 110. Controller 110 may be integrated with the console 102 or provided as a standalone device within the operating theater. The master handles 108 can be moved by the clinician to produce a corresponding movement of the working ends of any type of tools 104 (e.g., probes, mechanical or electrosurgical end effectors, graspers, knives, scissors, etc.) attached to the robotic arms 106. For example, tool 104 may be a probe that includes an image capture device.


The console 102 includes a display device 112 which is configured to display two-dimensional or three-dimensional images. The display device 112 displays images of the surgical site which may include data captured by tool 104 positioned on the ends 114 of the arms 106 and/or data captured by imaging devices that are positioned about the surgical theater (e.g., an imaging device positioned within the surgical site, an imaging device positioned adjacent the patient, an imaging device positioned at a distal end of an imaging arm). The imaging devices may capture visual images, infrared images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site. The imaging devices transmit captured imaging data to the controller 110 which creates the images of the surgical site in real-time from the imaging data and transmits the images to the display device 112 for display.


The movement of the master handles 108 may be scaled so that the working ends have a corresponding movement that is different, smaller or larger, than the movement performed by the operating hands of the clinician. The scale factor or gearing ratio may be adjustable so that the operator can control the resolution of the working ends of the surgical instrument(s) 104.
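The scaling relationship described above can be sketched as follows; the function name and gearing value are illustrative only and not part of the disclosed system:

```python
def scale_motion(handle_delta, gearing_ratio):
    """Map a master-handle displacement (per axis, in mm) to a
    working-end displacement via an adjustable gearing ratio."""
    return [component * gearing_ratio for component in handle_delta]

# With a 0.2 gearing ratio, a 10 mm handle motion produces a 2 mm
# tool motion, giving the clinician finer resolution at the tissue.
tool_delta = scale_motion([10.0, 0.0, 5.0], 0.2)
```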


During operation of the surgical system 100, the master handles 108 are operated by a clinician to produce a corresponding movement of the robotic arms 106 and/or surgical instruments 104. The master handles 108 provide a signal to the controller 110 which then provides a corresponding signal to one or more drive motors 114. The one or more drive motors 114 are coupled to the robotic arms 106 in order to move the robotic arms 106 and/or surgical instruments 104.


The master handles 108 may include various haptics 116 to provide feedback to the clinician relating to various tissue parameters or conditions, e.g., tissue resistance due to manipulation, cutting or otherwise treating, pressure by the instrument onto the tissue, tissue temperature, tissue impedance, etc. As can be appreciated, such haptics 116 provide the clinician with enhanced tactile feedback simulating actual operating conditions. The haptics 116 may include vibratory motors, electroactive polymers, piezoelectric devices, electrostatic devices, subsonic audio wave surface actuation devices, reverse-electrovibration, or any other device capable of providing a tactile feedback to a user. The master handles 108 may also include a variety of different actuators (not shown) for delicate tissue manipulation or treatment further enhancing the clinician's ability to mimic actual operating conditions.


The controller 110 includes a transceiver 118 and a processor 120. Transceiver 118 receives a signal from infrared sensors 122 which will be described in more detail below. The signal from infrared sensors may be transmitted to transceiver 118 via any conventional wired or wireless methods. Transceiver 118 provides the signal to a motion analysis unit 124 in processor 120 which performs a motion analysis on the signal in order to control the one or more drive motors 114 to move the robotic arms 106 and/or surgical instruments 104 into the correct position and/or orientation. A memory 126 may store an algorithm used to perform the motion analysis. In some embodiments, memory 126 may store a look up table (LUT) that includes information pertaining to instruments 104, robot arms 106, and laparoscopic ports discussed below.


As will be discussed in more detail below, infrared sensors receive light from infrared light sources. The infrared sensors may be disposed on the robotic arm 106, the surgical instrument 104, the laparoscopic port 130, or may be disposed anywhere in the surgical environment. The infrared light sources may be incorporated in the robotic arm 106, the surgical instrument 104, the laparoscopic port 130, or may be disposed anywhere in the surgical environment.


With reference to FIG. 2A, the laparoscopic port 130 has at least three (3) port fiducials 132a-c positioned in a port plane about a port opening 131 of the laparoscopic port 130. The port plane may be defined by an outer surface 133 (FIG. 3A) of the laparoscopic port 130. The port fiducials 132a-c are positioned about the port opening 131 with distances D1-3 between the port fiducials 132a-c and distances P1-3 between the port fiducials 132a-c and a center “X” of the port opening 131 being known.


Referring to FIG. 2B, the robotic arm 106 has at least three (3) arm fiducials 107a-c positioned in an arm plane orthogonal to a longitudinal axis of the robotic arm 106. The arm plane may be defined by a distal surface 103 of the robotic arm 106. Similar to the port fiducials 132a-c, distances between the arm fiducials 107a-c and distances between each of the arm fiducials 107a-c and the longitudinal axis of the robotic arm 106 are known.


Turning to FIG. 3A, an outside-looking-in optical system is shown according to an embodiment of the present disclosure including the robotic arm 106 and the laparoscopic port 130. In such an embodiment, the port fiducials 132a-c of laparoscopic port 130 are infrared light sources (e.g., active infrared light emitting diodes (LEDs)) which each emit infrared light IL1-3 having a distinctive characteristic (e.g., wavelength, phase) and the arm fiducials 107a-c are infrared sensors (e.g., infrared cameras) in communication with the transceiver 118 (FIG. 1). The arm fiducials 107a-c determine the time it takes for infrared light (e.g., IL1-3) from the port fiducials 132a-c to reach the respective arm fiducial 107a-c and provide a signal indicative of the time to the motion analysis unit 124.


Referring to FIG. 3B, an inside-looking-out optical system is shown according to an embodiment of the present disclosure including the robotic arm 106 and the laparoscopic port 130. The at least three (3) arm fiducials 107a-c are infrared light sources (e.g., active infrared light emitting diodes (LEDs)) which each emit infrared light EL1-3 having a distinctive characteristic (e.g., wavelength, phase) and the at least three (3) port fiducials 132a-c of laparoscopic port 130 are infrared sensors (e.g., infrared cameras) in communication with the transceiver 118 (FIG. 1). The port fiducials 132a-c determine the time it takes for the infrared light (e.g., EL1-3) from the arm fiducials 107a-c to reach the respective port fiducial 132a-c and provide a signal indicative of the time to the motion analysis unit 124.


Referring to FIG. 3C, a fixed optical system is shown according to an embodiment of the present disclosure including the robotic arm 106 and the laparoscopic port 130. In such an embodiment, at least three (3) infrared sensors 192a-c (e.g., infrared cameras) are positioned about the operating theater and are in communication with the transceiver 118 (FIG. 1). The port fiducials 132a-c of laparoscopic port 130 are infrared light sources (e.g., active infrared light emitting diodes (LEDs)) which each emit infrared light IL1-3 having a distinctive characteristic (e.g., wavelength, phase), and the arm fiducials 107a-c are likewise infrared light sources which each emit infrared light EL1-3 having a distinctive characteristic. The positions of the infrared sensors 192a-c within the operating theater are known such that the position of the arm and port fiducials 107a-c, 132a-c relative to the infrared sensors 192a-c can be determined as detailed below. The infrared sensors 192a-c determine the time it takes for infrared light (e.g., IL1-3 or EL1-3) from the arm fiducials 107a-c and the port fiducials 132a-c to reach the respective infrared sensor 192a-c and provide a signal indicative of the time to the motion analysis unit 124.


As detailed below with reference to FIG. 4, the motion analysis unit 124 (FIG. 1) determines the position of the robotic arm 106 relative to the laparoscopic port 130 based on signals from the fiducials 107a-c, 132a-c. FIG. 4 will be described while making references to FIGS. 1-3A. As shown in FIG. 4, a user enters a laparoscopic port identification number in step s202. Based on the laparoscopic port identification number, the motion analysis unit queries the memory 126 in step s204 to obtain the distances between port fiducials 132a-c (D1, D2, and D3 as shown in FIG. 2A) and the distances between each port fiducial 132a-c and the port center “X” of the port opening 131 (P1, P2, and P3 as shown in FIG. 2A) that correspond to the laparoscopic port 130. In step s206, an instrument identification number is obtained. The instrument identification number of an instrument 104 coupled to the robotic arm 106 may be obtained via a user input or it may be obtained from an integrated circuit (IC) included in the instrument 104. In step s208, the length of the instrument 104 is obtained based on the instrument identification number, either from the memory 126 or the IC included in the instrument 104. In step s210, a robot arm identification number is obtained. The robot arm identification number may be obtained via a user input or it may be obtained from an IC included in the robot arm 106. In step s212, the geometry of the robot arm 106 is obtained based on the robot arm identification number, either from memory 126 or the IC included in the robot arm 106.
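The identification-number look-ups in steps s202–s212 can be sketched with a simple in-memory table; the identifiers and geometry values below are hypothetical, standing in for whatever the memory or the tool's IC actually stores:

```python
# Hypothetical look-up table (LUT) keyed by port identification number,
# mirroring the memory query in steps s202-s204.
PORT_LUT = {
    "PORT-01": {
        "fiducial_distances": (30.0, 30.0, 30.0),  # D1-D3, mm
        "center_distances": (17.3, 17.3, 17.3),    # P1-P3, mm
    },
}

# Hypothetical instrument table, mirroring steps s206-s208.
INSTRUMENT_LUT = {
    "INST-42": {"length": 330.0},  # mm
}

def lookup_port_geometry(port_id):
    """Return the known inter-fiducial and fiducial-to-center distances."""
    return PORT_LUT[port_id]

def lookup_instrument_length(instrument_id):
    """Return the instrument length, as read from memory or the tool's IC."""
    return INSTRUMENT_LUT[instrument_id]["length"]
```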


In step s214, the motion analysis unit 124 receives signals from the arm fiducials 107a-c representing the time that the respective infrared light (e.g., IL1-3) from a respective one of the port fiducials 132a-c reached a respective arm fiducial 107a-c. Based on the signals from the arm fiducials 107a-c, the motion analysis unit 124 uses three-dimensional spherical trilateration to determine the distance between each of the port fiducials 132a-c and each of the arm fiducials 107a-c in step s216.
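Converting a measured time of flight into the distances used by the trilateration step can be sketched as follows, assuming one-way infrared propagation at the speed of light; note that a 1 ns flight already corresponds to roughly 0.3 m, which is why practical optical trackers typically measure phase of modulated light rather than raw pulse edges:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_to_distance(time_of_flight_s):
    """Convert a one-way infrared time of flight (seconds) into the
    emitter-to-sensor distance in metres."""
    return SPEED_OF_LIGHT_M_PER_S * time_of_flight_s
```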


For a detailed discussion of three-dimensional spherical trilateration used to determine the location of points in three dimensions, reference can be made to MURPHY JR., WILLIAM S. & HEREMAN, WILLY, DETERMINATION OF A POSITION IN THREE DIMENSIONS USING TRILATERATION AND APPROXIMATE DISTANCES, Nov. 28, 1999, available at http://inside.mines.edu/~whereman/papers/Murphy-Hereman-Trilateration-1995.pdf, the entire contents of which are hereby incorporated by reference.
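A minimal closed-form three-dimensional trilateration, in the spirit of the reference above, can be sketched as follows; the helper and function names are illustrative. It returns the two mirror-image solutions on either side of the plane through the three known points:

```python
import math

def _sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def _add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def _mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def _dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])
def _norm(a): return math.sqrt(_dot(a, a))

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Locate a point from its distances r1-r3 to three known,
    non-collinear positions p1-p3. Returns both candidate points."""
    # Build an orthonormal frame with p1 at the origin, p2 on the x-axis.
    ex = _mul(_sub(p2, p1), 1.0 / _norm(_sub(p2, p1)))
    i = _dot(ex, _sub(p3, p1))
    ey_dir = _sub(_sub(p3, p1), _mul(ex, i))
    ey = _mul(ey_dir, 1.0 / _norm(ey_dir))
    ez = _cross(ex, ey)
    d = _norm(_sub(p2, p1))
    j = _dot(ey, _sub(p3, p1))
    # Intersect the three spheres in the local frame.
    x = (r1 ** 2 - r2 ** 2 + d ** 2) / (2.0 * d)
    y = (r1 ** 2 - r3 ** 2 + i ** 2 + j ** 2) / (2.0 * j) - (i / j) * x
    z = math.sqrt(max(r1 ** 2 - x ** 2 - y ** 2, 0.0))
    base = _add(p1, _add(_mul(ex, x), _mul(ey, y)))
    return _add(base, _mul(ez, z)), _add(base, _mul(ez, -z))
```

With sensor positions known and radii derived from time of flight, the two candidates lie above and below the sensor plane; context (e.g., which side of the port faces the patient) selects the physical solution.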


In step s218, based on the distances between each of the port fiducials 132a-c and each of the arm fiducials 107a-c, the port plane defined by the port fiducials 132a-c is determined relative to the arm fiducials 107a-c.


It will be appreciated that at least three port fiducials (e.g., port fiducials 132a-c) are required to define the port plane relative to the arm fiducials 107a-c, which can be determined as follows:

    • 1. Solving for the vector AB from 132a to 132b and the vector AC from 132a to 132c as follows:

      AB = (xB−xA)i + (yB−yA)j + (zB−zA)k
      AC = (xC−xA)i + (yC−yA)j + (zC−zA)k
    • 2. Determining the normal vector n as the cross product AB×AC:

      n = AB × AC = | i         j         k         |
                    | (xB−xA)   (yB−yA)   (zB−zA)   |
                    | (xC−xA)   (yC−yA)   (zC−zA)   |

      so that n = nx i + ny j + nz k with

      nx = (yB−yA)(zC−zA) − (zB−zA)(yC−yA)
      ny = (zB−zA)(xC−xA) − (xB−xA)(zC−zA)
      nz = (xB−xA)(yC−yA) − (yB−yA)(xC−xA)
    • 3. Determining the equation of the plane from the normal vector as follows:

      nx·x + ny·y + nz·z + d = 0

    • 4. Using any known point on the plane (e.g., port fiducial 132a) to solve for d.
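Steps 1–4 above can be sketched directly; the function and variable names are illustrative:

```python
def plane_from_fiducials(a, b, c):
    """Return (nx, ny, nz, d) of the plane nx*x + ny*y + nz*z + d = 0
    through three fiducial positions a, b, c."""
    # Step 1: vectors AB and AC from the first fiducial.
    abx, aby, abz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    acx, acy, acz = c[0] - a[0], c[1] - a[1], c[2] - a[2]
    # Step 2: normal vector as the cross product AB x AC.
    nx = aby * acz - abz * acy
    ny = abz * acx - abx * acz
    nz = abx * acy - aby * acx
    # Steps 3-4: substitute the known point a to solve for d.
    d = -(nx * a[0] + ny * a[1] + nz * a[2])
    return nx, ny, nz, d
```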





As detailed above, the arm fiducials 107a-c may be sensors and the port fiducials 132a-c may be light sources. Alternatively, the arm fiducials 107a-c can be light sources and the port fiducials 132a-c can be sensors in communication with the motion analysis unit 124. In addition, as detailed above, the arm and port fiducials 107a-c, 132a-c can both be light sources, with the separate infrared sensors 192a-c in communication with the motion analysis unit 124.


In addition, during step s218, the motion analysis unit 124 determines the port center “X” on the port plane from the position of the three arm fiducials 107a-c. To determine the port center “X”, the motion analysis unit 124 solves the following system of equations:

(Xx−XA)2+(Yx−YA)2+(Zx−ZA)2=RA2
(Xx−XB)2+(Yx−YB)2+(Zx−ZB)2=RB2
(Xx−XC)2+(Yx−YC)2+(Zx−ZC)2=RC2

where RA-C are the distances each port fiducial 132a-c is from a given one of the arm fiducials 107a-c.


In step s220, from the position of the port fiducials 132a-c and the port center “X”, a vector “V” normal to the port plane defined by the port fiducials 132a-c and passing through the port center “X” provides the orientation of the port opening 131 relative to the arm fiducials 107a-c.
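Step s220 amounts to normalizing the port-plane normal and anchoring it at the port center “X”; a minimal sketch, assuming the normal and center have already been computed by the earlier steps:

```python
import math

def insertion_axis(normal, center):
    """Return the unit vector V normal to the port plane and a
    parametric line p(t) = center + t * V along the insertion axis."""
    length = math.sqrt(sum(n * n for n in normal))
    v = tuple(n / length for n in normal)

    def line(t):
        # Points along the axis: t = 0 is the port center "X".
        return tuple(c + t * vi for c, vi in zip(center, v))

    return v, line

v, line = insertion_axis((0.0, 0.0, 2.0), (5.0, 5.0, 0.0))
```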


In step s222, the motion analysis unit 124 determines the current position of the robot arm 106 and the instrument 104 relative to the laparoscopic port 130. To determine the current position of the instrument 104, the motion analysis unit 124 can use inverse kinematics based on the known geometry of links of the robot arm 106, the instrument 104, and measured angles between the links of the robot arm 106.
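The disclosure leaves the kinematic model unspecified; as an illustration of how known link geometry and measured joint angles yield a position, a minimal planar two-link forward-kinematics sketch follows (the link lengths and angles are hypothetical, and a real arm would use a full spatial chain):

```python
import math

def forward_kinematics_2link(l1, l2, theta1, theta2):
    """Tip position of a planar two-link arm: link lengths l1, l2 and
    joint angles theta1 (at the base) and theta2 (relative to link 1),
    both in radians."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Fully extended along the x-axis: both joint angles zero.
tip = forward_kinematics_2link(0.3, 0.2, 0.0, 0.0)
```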


In step s224, the length of the instrument 104 obtained in step s208, the geometry of the robot arm 106 obtained in step s212, the location of the laparoscopic port 130 determined in step s220, and the current position of the robot arm 106 (e.g., angles between links) and the instrument 104 determined in step s222 are used to determine how to control the drive motors 114 in order to align the instrument 104 with the vector “V” so that the robot arm 106 and/or instrument 104 may be inserted into and/or removed from the laparoscopic port 130. It will be appreciated that by determining the vector “V” relative to the arm fiducials 107a-c, the movement of the robot arm 106, and thus the instrument 104, is relative to the port fiducials 132a-c. In step s226, the drive motors 114 are controlled to perform insertion and/or removal of the instrument 104.
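The alignment of steps s224–s226 can be sketched as computing the angle between the instrument axis and the vector “V” and checking it against a tolerance before commanding insertion; the function names and tolerance are illustrative:

```python
import math

def misalignment_angle(instrument_axis, v):
    """Angle in radians between the instrument axis and the port
    normal vector V; neither input needs to be unit length."""
    dot = sum(a * b for a, b in zip(instrument_axis, v))
    na = math.sqrt(sum(a * a for a in instrument_axis))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp to guard acos against floating-point round-off.
    return math.acos(max(-1.0, min(1.0, dot / (na * nv))))

def aligned(instrument_axis, v, tol_rad=0.01):
    """True when the instrument is close enough to V for insertion."""
    return misalignment_angle(instrument_axis, v) <= tol_rad
```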


The embodiments disclosed herein are examples of the disclosure and may be embodied in various forms. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.


The phrases “in an embodiment,” “in embodiments,” “in some embodiments,” or “in other embodiments” may each refer to one or more of the same or different embodiments in accordance with the present disclosure. A phrase in the form “A or B” means “(A), (B), or (A and B)”. A phrase in the form “at least one of A, B, or C” means “(A), (B), (C), (A and B), (A and C), (B and C), or (A, B and C)”. A clinician may refer to a surgeon or any medical professional, such as a doctor, nurse, technician, medical assistant, or the like, performing a medical procedure.


The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, or the like. The controller may also include a memory to store data and/or algorithms to perform a series of instructions.


The systems described herein utilize optical or visual motion tracking technology; however, it is envisioned that other motion tracking technologies can be used in place of or in conjunction with the optical motion tracking technologies detailed above including, but not limited to, accelerometer, gyroscopic, ultrasound, magnetic, radio frequency, or other light based (e.g., laser) motion tracking technologies.


Any of the herein described methods, programs, algorithms, or codes may be converted to, or expressed in, a programming language or computer program. A “Programming Language” and “Computer Program” includes any language used to specify instructions to a computer, and includes (but is not limited to) these languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, Machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, and fifth generation computer languages. Also included are database and other data schemas, and any other meta-languages. No distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. No distinction is also made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked) is a reference to any and all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.


Any of the herein described methods, programs, algorithms or codes may be contained on one or more machine-readable media or memory. The term “memory” may include a mechanism that provides (e.g., stores and/or transmits) information in a form readable by a machine such as a processor, computer, or a digital processing device. For example, a memory may include a read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device. Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and by other like signals.


It should be understood that the foregoing description is only illustrative of the present disclosure. Various alternatives and modifications can be devised by those skilled in the art without departing from the disclosure. For instance, any of the augmented images described herein can be combined into a single augmented image to be displayed to a clinician. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications and variances. The embodiments described with reference to the attached drawing figures are presented only to demonstrate certain examples of the disclosure. Other elements, steps, methods, and techniques that are insubstantially different from those described above and/or in the appended claims are also intended to be within the scope of the disclosure.

Claims
  • 1. A robotic surgical system comprising: at least one robot arm; at least one instrument coupled to the robot arm; a plurality of drive motors configured to drive the at least one robot arm; a laparoscopic port; a plurality of fiducials; a plurality of sensors configured to detect the plurality of fiducials; and a controller configured to control the plurality of drive motors, the controller including a processor configured to: determine a current distance between each sensor of the plurality of sensors and each fiducial of the plurality of fiducials; determine a port plane based on the distance between each sensor and each fiducial; determine a vector normal to the port plane passing through a center of the laparoscopic port based on the distance between each sensor and each fiducial; determine a position of the at least one robot arm and the at least one instrument relative to the vector; and control the plurality of drive motors to align the at least one robot arm or the at least one instrument with the vector.
  • 2. The robotic surgical system of claim 1, wherein the processor is also configured to: obtain a predetermined distance between each fiducial among the plurality of fiducials; and obtain a predetermined distance between each fiducial and the center of the laparoscopic port.
  • 3. The robotic surgical system of claim 2, wherein the processor determines the port plane based on the current distance between each fiducial among the plurality of fiducials, the predetermined distance between each fiducial among the plurality of fiducials, and the predetermined distance between each fiducial and the center of the laparoscopic port.
  • 4. The robotic surgical system of claim 3, wherein the processor is configured to obtain a length of the at least one instrument.
  • 5. The robotic surgical system of claim 4, wherein the processor is configured to obtain a geometry of the at least one robot arm.
  • 6. The robotic surgical system of claim 5, wherein controlling the plurality of drive motors is based on the position of the at least one robot arm and the at least one instrument relative to the vector, the length of the at least one instrument, and the geometry of the at least one robot arm.
  • 7. The robotic surgical system of claim 1, wherein the plurality of fiducials are active light emitting diodes.
  • 8. The robotic surgical system of claim 1, wherein the plurality of sensors are disposed on the robot arm.
  • 9. The robotic surgical system of claim 1, wherein the plurality of fiducials are disposed on the laparoscopic port.
  • 10. The robotic surgical system of claim 1, wherein the plurality of fiducials are disposed on the robot arm.
  • 11. The robotic surgical system of claim 1, wherein the plurality of sensors are disposed on the laparoscopic port.
  • 12. The robotic surgical system of claim 1, wherein the plurality of drive motors are configured to drive the at least one instrument.
  • 13. A method for guiding a robot arm and/or instrument toward a laparoscopic port, the method comprising: determining a position of each fiducial among a plurality of fiducials disposed around the laparoscopic port; determining a port plane of the laparoscopic port based on the position of each fiducial; determining a vector normal to the port plane passing through a center of the laparoscopic port based on the position of each fiducial; determining a position of the robot arm and the instrument relative to the vector; and controlling a plurality of drive motors associated with the robot arm and the instrument to align the robot arm or the instrument with the vector.
  • 14. The method of claim 13, further comprising: obtaining a predetermined distance between each fiducial among the plurality of fiducials; and obtaining a predetermined distance between each fiducial and the center of the laparoscopic port.
  • 15. The method of claim 14, further comprising determining the vector based on the position of each fiducial among the plurality of fiducials, the predetermined distance between each fiducial among the plurality of fiducials, and the predetermined distance between each fiducial and the center of the laparoscopic port.
  • 16. The method of claim 15, further comprising obtaining a length of the instrument.
  • 17. The method of claim 16, further comprising obtaining a length of the robot arm.
  • 18. The method of claim 17, wherein controlling the plurality of drive motors is based on the position of the robot arm and the instrument relative to the vector, the length of the instrument, and the length of the robot arm.
  • 19. The method of claim 13, wherein determining the position of each fiducial among a plurality of fiducials includes determining a distance between each sensor of a plurality of sensors and each fiducial of the plurality of fiducials.
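The geometric core of claims 1 and 13 — fit a plane to the fiducial positions, take its normal through the port center, then steer the instrument onto that axis — can be sketched in a few lines. This is an illustrative simplification, not the patented method: the function name `port_axis` is invented, and it approximates the port center as the fiducial centroid (which holds only if the fiducials are placed symmetrically around the port rim), whereas the claims use predetermined fiducial-to-center distances.

```python
# Illustrative sketch only: recover the port plane and its normal axis
# from three fiducial positions measured in the robot's frame.
from math import sqrt

def cross(u, v):
    # Cross product of two 3-vectors.
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def sub(u, v):
    return (u[0]-v[0], u[1]-v[1], u[2]-v[2])

def normalize(v):
    n = sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/n, v[1]/n, v[2]/n)

def port_axis(f1, f2, f3):
    """Return (center, unit_normal) for the plane through three fiducials.

    ASSUMPTION: the fiducials sit symmetrically around the port rim, so
    their centroid approximates the port center. The patented system
    instead combines measured sensor-to-fiducial distances with
    predetermined fiducial-to-center distances.
    """
    # Two in-plane edge vectors span the port plane; their cross
    # product is normal to it.
    normal = normalize(cross(sub(f2, f1), sub(f3, f1)))
    center = tuple(sum(c) / 3.0 for c in zip(f1, f2, f3))
    return center, normal
```

Aligning the arm then reduces to commanding the drive motors so the instrument shaft lies along the line `center + t * normal`, which standard inverse kinematics can solve once the arm geometry and instrument length (claims 4-6) are known.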
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a National Stage Application of PCT/US2018/031302, filed May 7, 2018 under 35 U.S.C. § 371(a), which claims benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/510,938, filed May 25, 2017. The disclosures of each of the above-identified applications are hereby incorporated by reference in their entirety.

PCT Information
Filing Document Filing Date Country Kind
PCT/US2018/031302 5/7/2018 WO
Publishing Document Publishing Date Country Kind
WO2018/217431 11/29/2018 WO A
US Referenced Citations (367)
Number Name Date Kind
5868673 Vesely Feb 1999 A
6132368 Cooper Oct 2000 A
6206903 Ramans Mar 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6312435 Wallace et al. Nov 2001 B1
6331181 Tierney et al. Dec 2001 B1
6394998 Wallace et al. May 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6459926 Nowlin et al. Oct 2002 B1
6491691 Morley et al. Dec 2002 B1
6491701 Tierney et al. Dec 2002 B2
6493608 Niemeyer Dec 2002 B1
6565554 Niemeyer May 2003 B1
6645196 Nixon et al. Nov 2003 B1
6659939 Moll et al. Dec 2003 B2
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6685698 Morley et al. Feb 2004 B2
6699235 Wallace et al. Mar 2004 B2
6714839 Salisbury, Jr. et al. Mar 2004 B2
6716233 Whitman Apr 2004 B1
6728599 Wang et al. Apr 2004 B2
6746443 Morley et al. Jun 2004 B1
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6772053 Niemeyer Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6793652 Whitman et al. Sep 2004 B1
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843403 Whitman Jan 2005 B2
6846309 Whitman et al. Jan 2005 B2
6866671 Tierney et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6936042 Wallace et al. Aug 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6974449 Niemeyer Dec 2005 B2
6991627 Madhani et al. Jan 2006 B2
6994708 Manzo Feb 2006 B2
7048745 Tierney et al. May 2006 B2
7066926 Wallace et al. Jun 2006 B2
7118582 Wang et al. Oct 2006 B1
7125403 Julian et al. Oct 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7239940 Wang et al. Jul 2007 B2
7306597 Manzo Dec 2007 B2
7357774 Cooper Apr 2008 B2
7373219 Nowlin et al. May 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7391173 Schena Jun 2008 B2
7398707 Morley et al. Jul 2008 B2
7413565 Wang et al. Aug 2008 B2
7453227 Prisco et al. Nov 2008 B2
7524320 Tierney et al. Apr 2009 B2
7574250 Niemeyer Aug 2009 B2
7594912 Cooper et al. Sep 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7666191 Orban, III et al. Feb 2010 B2
7682357 Ghodoussi et al. Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7695481 Wang et al. Apr 2010 B2
7695485 Whitman et al. Apr 2010 B2
7699855 Anderson et al. Apr 2010 B2
7713263 Niemeyer May 2010 B2
7725214 Diolaiti May 2010 B2
7727244 Orban, III et al. Jun 2010 B2
7741802 Prisco et al. Jun 2010 B2
7756036 Druke et al. Jul 2010 B2
7757028 Druke et al. Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7778733 Nowlin et al. Aug 2010 B2
7803151 Whitman Sep 2010 B2
7806891 Nowlin et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7819885 Cooper Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7835823 Sillman et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7865266 Moll et al. Jan 2011 B2
7865269 Prisco et al. Jan 2011 B2
7886743 Cooper et al. Feb 2011 B2
7899578 Prisco et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7935130 Williams May 2011 B2
7963913 Devengenzo et al. Jun 2011 B2
7983793 Toth et al. Jul 2011 B2
8002767 Sanchez et al. Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8012170 Whitman et al. Sep 2011 B2
8054752 Druke et al. Nov 2011 B2
8062288 Cooper et al. Nov 2011 B2
8079950 Stern et al. Dec 2011 B2
8082064 Kay Dec 2011 B2
8100133 Mintz et al. Jan 2012 B2
8108072 Zhao et al. Jan 2012 B2
8120301 Goldberg et al. Feb 2012 B2
8142447 Cooper et al. Mar 2012 B2
8147503 Zhao et al. Apr 2012 B2
8151661 Schena et al. Apr 2012 B2
8155479 Hoffman et al. Apr 2012 B2
8182469 Anderson et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8206406 Orban, III Jun 2012 B2
8210413 Whitman et al. Jul 2012 B2
8216250 Orban, III et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8256319 Cooper et al. Sep 2012 B2
8285517 Sillman et al. Oct 2012 B2
8315720 Mohr et al. Nov 2012 B2
8335590 Costa et al. Dec 2012 B2
8347757 Duval Jan 2013 B2
8374723 Zhao et al. Feb 2013 B2
8418073 Mohr et al. Apr 2013 B2
8419717 Diolaiti et al. Apr 2013 B2
8423182 Robinson et al. Apr 2013 B2
8452447 Nixon May 2013 B2
8454585 Whitman Jun 2013 B2
8499992 Whitman et al. Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8528440 Morley et al. Sep 2013 B2
8529582 Devengenzo et al. Sep 2013 B2
8540748 Murphy et al. Sep 2013 B2
8551116 Julian et al. Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597182 Stein et al. Dec 2013 B2
8597280 Cooper et al. Dec 2013 B2
8600551 Itkowitz et al. Dec 2013 B2
8608773 Tierney et al. Dec 2013 B2
8620473 Diolaiti et al. Dec 2013 B2
8624537 Nowlin et al. Jan 2014 B2
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8644988 Prisco et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8668638 Donhowe et al. Mar 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8758352 Cooper et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8768516 Diolaiti et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8790243 Cooper et al. Jul 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8821480 Burbank Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827989 Niemeyer Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8852174 Burbank Oct 2014 B2
8858547 Brogna Oct 2014 B2
8862268 Robinson et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864752 Diolaiti et al. Oct 2014 B2
8903546 Diolaiti et al. Dec 2014 B2
8903549 Itkowitz et al. Dec 2014 B2
8911428 Cooper et al. Dec 2014 B2
8912746 Reid et al. Dec 2014 B2
8944070 Guthart et al. Feb 2015 B2
8989903 Weir et al. Mar 2015 B2
9002518 Manzo et al. Apr 2015 B2
9014856 Manzo et al. Apr 2015 B2
9016540 Whitman et al. Apr 2015 B2
9019345 O Apr 2015 B2
9043027 Durant et al. May 2015 B2
9050120 Swarup et al. Jun 2015 B2
9055961 Manzo et al. Jun 2015 B2
9068628 Solomon et al. Jun 2015 B2
9078684 Williams Jul 2015 B2
9084623 Gomez et al. Jul 2015 B2
9095362 Dachs, II et al. Aug 2015 B2
9096033 Holop et al. Aug 2015 B2
9101381 Burbank et al. Aug 2015 B2
9113877 Whitman et al. Aug 2015 B1
9138284 Krom et al. Sep 2015 B2
9144456 Rosa et al. Sep 2015 B2
9198730 Prisco et al. Dec 2015 B2
9204923 Manzo et al. Dec 2015 B2
9226648 Saadat et al. Jan 2016 B2
9226750 Weir et al. Jan 2016 B2
9226761 Burbank Jan 2016 B2
9232984 Guthart et al. Jan 2016 B2
9241766 Duque et al. Jan 2016 B2
9241767 Prisco et al. Jan 2016 B2
9241769 Larkin et al. Jan 2016 B2
9259275 Burbank Feb 2016 B2
9259277 Rogers et al. Feb 2016 B2
9259281 Griffiths et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9261172 Solomon et al. Feb 2016 B2
9265567 Orban, III et al. Feb 2016 B2
9265584 Itkowitz et al. Feb 2016 B2
9280158 Bron et al. Mar 2016 B2
9283049 Diolaiti et al. Mar 2016 B2
9301811 Goldberg et al. Apr 2016 B2
9314307 Richmond et al. Apr 2016 B2
9317651 Nixon Apr 2016 B2
9345546 Toth et al. May 2016 B2
9358682 Ruiz Morales Jun 2016 B2
9393017 Flanagan et al. Jul 2016 B2
9402689 Prisco et al. Aug 2016 B2
9417621 Diolaiti et al. Aug 2016 B2
9424303 Hoffman et al. Aug 2016 B2
9433418 Whitman et al. Sep 2016 B2
9446517 Burns et al. Sep 2016 B2
9452020 Griffiths et al. Sep 2016 B2
9474569 Manzo et al. Oct 2016 B2
9480533 Devengenzo et al. Nov 2016 B2
9503713 Zhao et al. Nov 2016 B2
9550300 Danitz et al. Jan 2017 B2
9554859 Nowlin et al. Jan 2017 B2
9566124 Prisco et al. Feb 2017 B2
9579164 Itkowitz et al. Feb 2017 B2
9585641 Cooper et al. Mar 2017 B2
9615883 Schena et al. Apr 2017 B2
9623563 Nixon Apr 2017 B2
9623902 Griffiths et al. Apr 2017 B2
9629520 Diolaiti Apr 2017 B2
9662177 Weir et al. May 2017 B2
9664262 Donlon et al. May 2017 B2
9687312 Dachs, II et al. Jun 2017 B2
9700334 Hinman et al. Jul 2017 B2
9718190 Larkin et al. Aug 2017 B2
9730719 Brisson et al. Aug 2017 B2
9737199 Pistor et al. Aug 2017 B2
9795446 DiMaio et al. Oct 2017 B2
9797484 Solomon et al. Oct 2017 B2
9801690 Larkin et al. Oct 2017 B2
9814530 Weir et al. Nov 2017 B2
9814536 Goldberg et al. Nov 2017 B2
9814537 Itkowitz et al. Nov 2017 B2
9820823 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830371 Hoffman et al. Nov 2017 B2
9839481 Blumenkranz et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9850994 Schena Dec 2017 B2
9855102 Blumenkranz Jan 2018 B2
9855107 Labonville et al. Jan 2018 B2
9872737 Nixon Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9883920 Blumenkranz Feb 2018 B2
9888974 Niemeyer Feb 2018 B2
9895813 Blumenkranz et al. Feb 2018 B2
9901408 Larkin Feb 2018 B2
9918800 Itkowitz et al. Mar 2018 B2
9943375 Blumenkranz et al. Apr 2018 B2
9948852 Lilagan et al. Apr 2018 B2
9949798 Weir Apr 2018 B2
9949802 Cooper Apr 2018 B2
9952107 Blumenkranz et al. Apr 2018 B2
9956044 Gomez et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10028793 Griffiths et al. Jul 2018 B2
10033308 Chaghajerdi et al. Jul 2018 B2
10034719 Richmond et al. Jul 2018 B2
10052167 Au et al. Aug 2018 B2
10085811 Weir et al. Oct 2018 B2
10092344 Mohr et al. Oct 2018 B2
10123844 Nowlin et al. Nov 2018 B2
10188471 Brisson Jan 2019 B2
10201390 Swarup et al. Feb 2019 B2
10213202 Flanagan et al. Feb 2019 B2
10258416 Mintz et al. Apr 2019 B2
10278782 Jarc et al. May 2019 B2
10278783 Itkowitz et al. May 2019 B2
10282881 Itkowitz et al. May 2019 B2
10335116 Boctor Jul 2019 B2
10335242 Devengenzo et al. Jul 2019 B2
10405934 Prisco et al. Sep 2019 B2
10433922 Itkowitz et al. Oct 2019 B2
10464219 Robinson et al. Nov 2019 B2
10485621 Morrissette et al. Nov 2019 B2
10500004 Hanuschik et al. Dec 2019 B2
10500005 Weir et al. Dec 2019 B2
10500007 Richmond et al. Dec 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10524871 Liao Jan 2020 B2
10548459 Itkowitz et al. Feb 2020 B2
10575909 Robinson et al. Mar 2020 B2
10592529 Hoffman et al. Mar 2020 B2
10595946 Nixon Mar 2020 B2
10881469 Robinson Jan 2021 B2
10881473 Itkowitz et al. Jan 2021 B2
10898188 Burbank Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10905506 Itkowitz et al. Feb 2021 B2
10912544 Brisson et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10918387 Duque et al. Feb 2021 B2
10918449 Solomon et al. Feb 2021 B2
10932873 Griffiths et al. Mar 2021 B2
10932877 Devengenzo et al. Mar 2021 B2
10939969 Swarup et al. Mar 2021 B2
10939973 DiMaio et al. Mar 2021 B2
10952801 Miller et al. Mar 2021 B2
10965933 Jarc Mar 2021 B2
10966742 Rosa et al. Apr 2021 B2
10973517 Wixey Apr 2021 B2
10973519 Weir et al. Apr 2021 B2
10984567 Itkowitz et al. Apr 2021 B2
10993773 Cooper et al. May 2021 B2
10993775 Cooper et al. May 2021 B2
11000331 Krom et al. May 2021 B2
11013567 Wu et al. May 2021 B2
11020138 Ragosta Jun 2021 B2
11020191 Diolaiti et al. Jun 2021 B2
11020193 Wixey et al. Jun 2021 B2
11026755 Weir et al. Jun 2021 B2
11026759 Donlon et al. Jun 2021 B2
11040189 Vaders et al. Jun 2021 B2
11045077 Stern et al. Jun 2021 B2
11045274 Dachs, II et al. Jun 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11076925 DiMaio et al. Aug 2021 B2
11090119 Burbank Aug 2021 B2
11096687 Flanagan et al. Aug 2021 B2
11098803 Duque et al. Aug 2021 B2
11109925 Cooper et al. Sep 2021 B2
11116578 Hoffman et al. Sep 2021 B2
11129683 Steger et al. Sep 2021 B2
11135029 Suresh et al. Oct 2021 B2
11147552 Burbank et al. Oct 2021 B2
11147640 Jarc et al. Oct 2021 B2
11154373 Abbott et al. Oct 2021 B2
11154374 Hanuschik et al. Oct 2021 B2
11160622 Goldberg et al. Nov 2021 B2
11160625 Wixey et al. Nov 2021 B2
11161243 Rabindran et al. Nov 2021 B2
11166758 Mohr et al. Nov 2021 B2
11166770 DiMaio et al. Nov 2021 B2
11166773 Ragosta et al. Nov 2021 B2
11173597 Rabindran et al. Nov 2021 B2
11185378 Weir et al. Nov 2021 B2
11191596 Thompson et al. Dec 2021 B2
11197729 Thompson et al. Dec 2021 B2
11213360 Hourtash et al. Jan 2022 B2
11221863 Azizian et al. Jan 2022 B2
11234700 Ragosta et al. Feb 2022 B2
11241274 Vaders et al. Feb 2022 B2
11241290 Waterbury et al. Feb 2022 B2
11259870 DiMaio et al. Mar 2022 B2
11259884 Burbank Mar 2022 B2
11272993 Gomez et al. Mar 2022 B2
11272994 Saraliev et al. Mar 2022 B2
11291442 Wixey et al. Apr 2022 B2
11291513 Manzo et al. Apr 2022 B2
20030120283 Stoianovici et al. Jun 2003 A1
20100241079 Abrams Sep 2010 A1
20130066335 Barwinkel et al. Mar 2013 A1
20140276007 Sela et al. Sep 2014 A1
20160235493 LeBoeuf, II et al. Aug 2016 A1
20170007349 Solar et al. Jan 2017 A1
20170079722 O'Grady Mar 2017 A1
20170165005 Kheradpir et al. Jun 2017 A1
Foreign Referenced Citations (7)
Number Date Country
102905642 Jan 2013 CN
102010029275 Dec 2011 DE
102010040987 Mar 2012 DE
2015150636 Aug 2015 JP
2015129474 Sep 2015 WO
2016013636 Jan 2016 WO
2016029289 Mar 2016 WO
Non-Patent Literature Citations (6)
Entry
Japanese Office Action dated Jan. 26, 2022 corresponding to counterpart Patent Application JP 2019-564842.
Indian Office Action dated Mar. 2, 2022 corresponding to counterpart Patent Application IN 201917050301.
Chinese First Office Action dated May 7, 2022 corresponding to counterpart Patent Application CN 201880034162.3.
International Search Report dated Sep. 3, 2018 and Written Opinion completed Aug. 29, 2018 corresponding to counterpart Int'l Patent Application PCT/US18/31302.
Extended European Search Report dated Feb. 9, 2021 corresponding to counterpart Patent Application EP 18806526.2.
Weiss et al: “Dynamic Sensor-Based Control of Robots with Visual Feedback”, IEEE Journal on Robotics and Automation, IEEE, USA, vol. 3, No. 5, Oct. 1, 1987 (Oct. 1, 1987), pp. 404-417, XP011217419, ISSN: 0882-4967, DOI: 10.1109/JRA.1987.1087115.
Related Publications (1)
Number Date Country
20200163730 A1 May 2020 US
Provisional Applications (1)
Number Date Country
62510938 May 2017 US