Collision handling algorithms for robotic surgical systems

Information

  • Patent Grant
  • Patent Number
    11,628,022
  • Date Filed
    Tuesday, September 4, 2018
  • Date Issued
    Tuesday, April 18, 2023
Abstract
Methods of collision handling for robotic surgical systems include slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction that corresponds to moving the tool towards the obstruction. The input handle has an offset relative to a desired pose of the tool after the input handle is slipped.
Description
BACKGROUND

Robotic surgical systems have been used in minimally invasive medical procedures. During a medical procedure, the robotic surgical system is controlled by a surgeon interfacing with a user interface. The user interface allows the surgeon to manipulate an end effector that acts on a patient.


The end effector is inserted into a small incision (via a cannula) or a natural orifice of a patient to position the end effector at a work site within the body of the patient. Some robotic surgical systems include a robotic console supporting a robot arm, and at least one end effector such as a scalpel, a forceps, or a grasping tool that is mounted to the robot arm.


In general, the user interface includes an input controller or handle that is moveable by the surgeon to control the robotic surgical system. Robotic surgical systems typically use a scaling factor to scale down the motions of the surgeon's hands to determine the desired position of the robotic instruments within the patient. Often, the scaled motion requires the input handles to travel farther than the workspace of the user interface allows. The handles therefore reach a boundary limit of the workspace, preventing the surgeon from completing the desired motion. Current robotic surgical systems on the market use a feature called “clutching” to decouple the motion of the input handles from the robotic instruments. The surgeon is then free to move the handles to a new position within the workspace of the user interface while the instruments remain stationary. Once the input handle is away from the workspace boundary, the surgeon can “reclutch” to recouple the motion of the input handle and complete the desired motion with the robotic instrument.
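
By way of illustration only, the clutching behavior described above can be sketched in a few lines of Python. The sketch below is not part of the disclosure; the class and method names are hypothetical and positions are one-dimensional for brevity.

```python
# Minimal sketch of "clutching" for a scaled master-slave mapping.
# All names are hypothetical; positions are one-dimensional for brevity.

class ClutchedMapping:
    def __init__(self, scale_factor: float = 3.0):
        self.scale_factor = scale_factor
        self.engaged = True
        self.handle_ref = 0.0  # handle position at the last (re)clutch
        self.tool_ref = 0.0    # tool position at the last (re)clutch

    def clutch(self) -> None:
        """Decouple the handle so it can be repositioned in the workspace."""
        self.engaged = False

    def reclutch(self, handle_pos: float, tool_pos: float) -> None:
        """Recouple at the handle's new position; the tool has not moved."""
        self.handle_ref, self.tool_ref = handle_pos, tool_pos
        self.engaged = True

    def tool_target(self, handle_pos: float, tool_pos: float) -> float:
        """Scaled-down tool command; the tool holds still while declutched."""
        if not self.engaged:
            return tool_pos
        return self.tool_ref + (handle_pos - self.handle_ref) / self.scale_factor
```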


During a robotic surgical procedure, the robot arm or end effector may collide with tissue, an organ, or another surgical implement (e.g., another robot arm or end effector, access port, or camera). Such collisions can create a positional mismatch between the position of the input handles and the robot arm or end effector associated with the input handle. This positional mismatch can create undesired motions of the robot arm or the end effector during the surgical procedure.


Accordingly, there is a need for collision handling algorithms for robotic surgical systems.


SUMMARY

In an aspect of the present disclosure, a method of collision handling for a robotic surgical system includes slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction that corresponds to moving the tool towards the obstruction. The input handle has an offset relative to a desired pose of the tool after the input handle is slipped.


In aspects, the method includes moving the input handle in a direction to move the portion of the surgical robot away from the obstruction after the slipping of the input handle. The input handle may move a distance corresponding to the offset before the tool moves in a direction away from the obstruction. Alternatively, the tool may move in a direction away from the obstruction while maintaining a trim between a position of the input handle and a pose of the tool. The trim may be equal to the offset or the method may include dynamically scaling movement of the input handle relative to the pose of the tool in a direction parallel to the offset until the trim reaches a predetermined value. The predetermined value may be zero or nonzero.


In some aspects, slipping the handle relative to the pose of the tool occurs after the surgical robot reaches a predetermined force threshold to move the tool towards a desired pose. The method may further include a processing unit of the robotic surgical system defining the offset between a threshold position of the input handle, at which the tool reaches the predetermined force threshold, and a position of the input handle after the input handle is pushed beyond the threshold position. The method may include the robotic surgical system providing force feedback to a clinician to resist slipping of the input handle beyond the threshold position.


In another aspect of the present disclosure, a method of collision handling of a robotic surgical system with a processing unit of the robotic surgical system includes receiving a first input signal from a user interface of the robotic surgical system to move a tool of a surgical robot of the robotic surgical system to a desired pose of the tool, transmitting an input control signal to the surgical robot to move the tool towards the desired pose, receiving a feedback signal from the surgical robot that a force to move the tool towards the desired pose is greater than a predetermined threshold, maintaining the tool at a threshold pose when the predetermined threshold is reached, and slipping a position of the input handle relative to the threshold pose to a second position of the input handle to define an offset between the second position of the input handle and a desired pose of the tool corresponding to the second position of the input handle.


In aspects, the method includes transmitting a feedback control signal to the user interface to resist movement of the input handle beyond a threshold position corresponding to the threshold pose of the tool.


In some aspects, the method includes receiving a second input signal from the user interface after slipping the position of the input handle indicative of the input handle moving towards a threshold position corresponding to the threshold pose of the tool. The method may include maintaining the tool in the threshold pose in response to receiving the second input signal. Alternatively, the method may include transmitting a second control signal to the surgical robot to move the tool away from the desired pose with a trim defined between the input handle and the pose of the tool. Transmitting the second control signal may include the trim being equal to the offset between the second position of the input handle and the desired pose of the tool corresponding to the second position of the input handle. The method may include dynamically scaling movement of the input handle to the pose of the tool to reduce the trim between the position of the input handle and the pose of the tool until the trim reaches a predetermined value. The predetermined value may be zero or nonzero.


Further details and aspects of exemplary embodiments of the present disclosure are described in more detail below with reference to the appended figures.





BRIEF DESCRIPTION OF THE DRAWINGS

Various aspects of the present disclosure are described hereinbelow with reference to the drawings, which are incorporated in and constitute a part of this specification, wherein:



FIG. 1 is a schematic illustration of a user interface and a robotic system in accordance with the present disclosure;



FIG. 2 is a schematic plan view of a workspace of the user interface of FIG. 1;



FIG. 3 is a view of a display device of the user interface of FIG. 1 illustrating a tool of a surgical robot within a surgical site;



FIG. 4 is a flowchart of a method of collision handling and collision recovery in accordance with the present disclosure; and



FIG. 5 is a flowchart of another method of collision handling and collision recovery in accordance with the present disclosure.





DETAILED DESCRIPTION

Embodiments of the present disclosure are now described in detail with reference to the drawings in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “clinician” refers to a doctor, a nurse, or any other care provider and may include support personnel. Throughout this description, the term “proximal” refers to the portion of the device or component thereof that is closest to the clinician and the term “distal” refers to the portion of the device or component thereof that is farthest from the clinician. In addition, as used herein the term “neutral” is understood to mean non-scaled.


This disclosure generally relates to collision handling and collision recovery algorithms or methods for robotic surgical systems. Specifically, for collision handling a processing unit of a robotic surgical system may allow an input handle of a user interface to slip beyond a position corresponding to a pose of a tool of a surgical robot when a portion of the surgical robot collides with an obstruction. Slipping the input handle relative to the pose of the tool defines an offset between the position of the input handle and a pose of the tool.


To recover from the collision, the input handle may move through the entire offset before the tool moves from the pose it held when the surgical robot collided with the obstruction. Alternatively, any movement of the input handle away from the obstruction may immediately move the surgical robot away from the obstruction, such that a trim is defined between the position of the input handle and the pose of the tool. The trim may be equal to the offset, or the robotic surgical system may dynamically scale movement of the surgical robot to reduce or remove the trim in a manner imperceptible to a clinician.


Referring to FIG. 1, a robotic surgical system 1 in accordance with the present disclosure is shown generally as a surgical robot 10, a processing unit 30, and a user interface 40. The surgical robot 10 generally includes linkages 12 and a robot base 18. The linkages 12 moveably support an end effector or tool 20 which is configured to act on tissue, and may be in the form of arms each having an end 14 that supports the tool 20. In addition, the ends 14 of the linkages 12 may include an imaging device 16 for imaging a surgical site “S”. The user interface 40 is in communication with the robot base 18 through the processing unit 30.


The user interface 40 includes a display device 44 which is configured to display three-dimensional images. The display device 44 displays three-dimensional images of the surgical site “S” which may include data captured by imaging devices 16 positioned on the ends 14 of the linkages 12 and/or include data captured by imaging devices that are positioned about the surgical theater (e.g., an imaging device positioned within the surgical site “S”, an imaging device positioned adjacent the patient “P”, imaging device 56 positioned at a distal end of an imaging arm 52). The imaging devices (e.g., imaging devices 16, 56) may capture visual images, infra-red images, ultrasound images, X-ray images, thermal images, and/or any other known real-time images of the surgical site “S”. The imaging devices transmit captured imaging data to the processing unit 30 which creates three-dimensional images of the surgical site “S” in real-time from the imaging data and transmits the three-dimensional images to the display device 44 for display.


The user interface 40 also includes input handles 42 which are supported on control arms 43 which allow a clinician to manipulate the surgical robot 10 (e.g., move the arms 12, the ends 14 of the linkages 12, and/or the tools 20). Each of the input handles 42 is in communication with the processing unit 30 to transmit control signals thereto and to receive feedback signals therefrom. Additionally or alternatively, each of the input handles 42 may include input devices 46 (FIG. 2) which allow the surgeon to manipulate (e.g., clamp, grasp, fire, open, close, rotate, thrust, slice, etc.) the tools 20 supported at the ends 14 of the linkages 12.


With additional reference to FIG. 2, each of the input handles 42 is moveable through a predefined workspace to move the ends 14 of the linkages 12, e.g., tools 20 (FIG. 1), within a surgical site “S”. The three-dimensional images on the display device 44 are oriented such that movement of the input handles 42 moves the ends 14 of the linkages 12 as viewed on the display device 44. The three-dimensional images remain stationary while movement of the input handles 42 is scaled to movement of the ends 14 of the linkages 12 within the three-dimensional images. To maintain an orientation of the three-dimensional images, kinematic mapping of the input handles 42 is based on a camera orientation relative to an orientation of the ends 14 of the linkages 12. The orientation of the three-dimensional images on the display device 44 may be mirrored or rotated relative to the view from above the patient “P”. In addition, the size of the three-dimensional images on the display device 44 may be scaled to be larger or smaller than the actual structures of the surgical site, permitting a clinician to have a better view of structures within the surgical site “S”. As the input handles 42 are moved, the tools 20 are moved within the surgical site “S” as detailed below. Movement of the tools 20 may also include movement of the ends 14 of the linkages 12 which support the tools 20.
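
The camera-referenced kinematic mapping described above may be pictured with the following hedged sketch, which rotates a handle displacement expressed in the camera (display) frame into the robot base frame; the function name and frame conventions are assumptions, not part of the disclosure.

```python
# Sketch of camera-referenced mapping: a handle displacement expressed in the
# display/camera frame is rotated into the robot base frame, so that "right"
# on the display device moves the tool "right" in the displayed image.
import numpy as np

def handle_delta_to_base_frame(delta_in_camera: np.ndarray,
                               R_base_from_camera: np.ndarray) -> np.ndarray:
    """Rotate a 3-vector handle displacement into the robot base frame."""
    return R_base_from_camera @ delta_in_camera

# e.g., a camera yawed 90 degrees about Z: "right" on screen is +Y in base
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
print(handle_delta_to_base_frame(np.array([1.0, 0.0, 0.0]), R))  # -> [0. 1. 0.]
```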


For a detailed discussion of the construction and operation of a robotic surgical system 1, reference may be made to U.S. Pat. No. 8,828,023, the entire contents of which are incorporated herein by reference.


The movement of the tools 20 is scaled relative to the movement of the input handles 42. When the input handles 42 are moved within a predefined workspace, the input handles 42 send input signals to the processing unit 30. The processing unit 30 analyzes the input signals to move the tools 20 in response to the input signals. The processing unit 30 transmits scaled control signals to the robot base 18 to move the tools 20 in response to the movement of the input handles 42. The processing unit 30 scales the input signals by dividing an input distance Input_distance (e.g., the distance moved by one of the input handles 42) by a scaling factor SF to arrive at a scaled output distance Output_distance (e.g., the distance that one of the ends 14 is moved). The scaling factor SF is in a range between about 1 and about 10 (e.g., 3). This scaling is represented by the following equation:

Output_distance = Input_distance / SF

It will be appreciated that the larger the scaling factor SF, the smaller the movement of the tools 20 relative to the movement of the input handles 42.
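
By way of a minimal sketch (the function name and range check are illustrative only, not part of the disclosure), the scaling may be expressed as:

```python
def scaled_output_distance(input_distance: float, scale_factor: float = 3.0) -> float:
    """Output_distance = Input_distance / SF, with SF between about 1 and 10."""
    if not 1.0 <= scale_factor <= 10.0:
        raise ValueError("scaling factor SF expected in the range of about 1 to 10")
    return input_distance / scale_factor

# e.g., with SF = 3, a 30 mm handle motion commands a 10 mm tool motion
assert scaled_output_distance(30.0) == 10.0
```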


For a detailed description of scaling movement of the input handle 42 along the X, Y, and Z coordinate axes to movement of the tool 20, reference may be made to commonly owned International Patent Application Serial No. PCT/US2015/051130, filed Sep. 21, 2015, and International Patent Application No. PCT/US2016/14031, filed Jan. 20, 2016, the entire contents of each of these disclosures are herein incorporated by reference.


Referring to FIGS. 1-3, during a robotic surgical procedure, a clinician interfaces with the input handle 42 to manipulate the tool 20 within the surgical site “S”. As the tool 20 is moved within the surgical site “S”, a clinician can visualize movement of the tool 20 within the surgical site “S” on the display 44.


To manipulate the tool 20, a clinician moves an input handle 42 from a first position “P1” to a second position “P2”, shown in dashed lines (FIG. 2). The processing unit 30 receives an input signal sent from the user interface 40 and transmits a control signal to the surgical robot 10 to move the tool 20 from a first pose to a second pose. For example, the input handle 42 is moved a distance along a control X axis in a direction illustrated by arrow “M1” and the tool 20 is moved in a direction along a robotic X axis illustrated by arrow “R1” representing movement of the tool 20 from a first pose “T1” towards a second pose “T2”.


During movement of the tool 20 from the first pose “T1” towards the second pose “T2”, the tool 20 may collide with an obstruction within the surgical site “S”, e.g., tissue T, another tool 20, an organ, or other surgical implement. When the tool 20 collides with the obstruction, the processing unit 30 receives a feedback signal from the surgical robot 10 and transmits a feedback control signal to the user interface 40. In response to receiving the feedback control signal, the user interface provides force feedback to the clinician indicative of the tool 20 colliding with the obstruction. For example, the clinician may feel resistance to continued movement along the control X axis in the direction of the arrow “M1”.


When the clinician feels the force feedback, the clinician may push the input handle 42 against the force feedback (e.g., in a direction opposite to the direction of the force feedback) and continue to move the input handle 42 along the control X axis in the direction of arrow “M1”. In response, the processing unit 30 continues to send control signals to the surgical robot 10 to move the tool 20 along the robotic X axis in the direction of arrow “R1” until the force of the surgical robot 10, to continue movement of the tool 20 along the robotic X axis, exceeds a predetermined threshold. The predetermined threshold may be determined by a deflection of a portion of the surgical robot 10 or by a torque at one or more joints of the surgical robot 10. When the force of the surgical robot 10 exceeds the predetermined threshold, the surgical robot 10 “clutches” the movement of the input handle 42 from movement of the surgical robot 10, scales down movement of the input handle 42 relative to movement of the surgical robot 10, and/or applies any other known means of collision handling. For a detailed discussion of systems and methods for detecting and handling a collision of a tool or linkage of a robotic system with an obstruction, reference may be made to U.S. Provisional Patent Application Ser. No. XX/XXX,XXX, filed XXXX, and entitled “SURGICAL ROBOT INCLUDING TORQUE SENSORS”, the entire contents of which are hereby incorporated by reference.
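
As a hedged illustration of such a threshold test, the sketch below assumes joint torque sensing with per-joint limits; the names and values are made up, and the disclosure equally permits deflection-based detection.

```python
# Collision test sketched from the description above: the predetermined
# threshold is expressed here as per-joint torque limits. Values are made up.
from typing import Sequence

def force_threshold_exceeded(joint_torques: Sequence[float],
                             torque_limits: Sequence[float]) -> bool:
    """True when any joint torque exceeds its predetermined threshold."""
    return any(abs(tau) > limit for tau, limit in zip(joint_torques, torque_limits))

# e.g., three joints with 5 N*m limits; joint 2 is pushing against tissue "T"
print(force_threshold_exceeded([1.2, 6.3, 0.4], [5.0, 5.0, 5.0]))  # True
```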


With particular reference to FIG. 2, the force to move the tool 20 along the robotic X axis reached the predetermined threshold when the input handle 42 was positioned at a threshold position “PT”. As shown, the input handle 42 was pushed through the threshold position “PT” to the second position “P2”. As the input handle 42 is moved between the threshold position “PT” and the second position “P2”, the tool 20 is substantially stationary within the surgical site “S”, e.g., the tool 20 remains in the first pose “T1” as shown in FIG. 3, such that the input handle 42 “slips” relative to the tool 20. This “slipping” of the input handle 42 relative to the tool 20 results in a position mismatch between a desired pose “T2” of the tool 20, based on the position of the input handle 42, and the actual pose of the tool 20, which remains at the first pose “T1”.


With the input handle 42 in the second position “P2”, the surgical robot 10 maintains the tool 20 at the first pose “T1”, the pose at which the predetermined threshold was reached, until the input handle 42 is moved along the control X axis in a direction that requires a force below the predetermined threshold to reposition the tool 20 along the robotic X axis, e.g., in a direction opposite the arrow “R1”.


This position mismatch can create undesired motions of the tool 20 within the surgical site “S” during a surgical procedure. For example, when the input handle 42 is in the second position “P2”, the tool 20 may be maintained in the first pose “T1” with the predetermined threshold force being directed towards an obstruction, e.g., tissue “T”, such that, were the tool 20 to free itself from the obstruction, the tool 20 may move towards the desired pose “T2” unexpectedly and/or at an undesirably high velocity.


With reference to FIG. 4, a method 200 for slipping the input handle 42 relative to the tool 20 in the event of a collision with an obstruction, and for recovering from the collision, is disclosed in accordance with the present disclosure with reference to the robotic surgical system 1 of FIGS. 1-3. As detailed below, a collision between a tool 20 and tissue “T” of a patient is described; however, such a collision may be between any portion of the surgical robot 10 and an obstruction. For example, a collision may occur between a linkage 12 of the surgical robot 10 and another linkage 12.


Initially, a clinician moves the input handle 42 in a first direction along the control X axis towards the second position “P2”, and the user interface 40 transmits an input signal indicative of the movement (Step 210). The processing unit 30 receives the input signal (Step 240) and transmits an input control signal to the surgical robot 10 to move the tool 20 towards the desired pose (Step 242). The surgical robot 10 receives the control signal and moves the tool 20, and thus the surgical robot 10, towards the desired pose “T2” (Step 260).


As the tool 20 is moved towards the desired pose “T2”, a portion of the surgical robot 10, e.g., tool 20, may collide with tissue “T” such that the surgical robot 10 would require a force greater than a predetermined threshold to continue to move the surgical robot 10 towards the desired pose “T2” (Step 262); this pose is defined as the threshold pose “T1”. When the predetermined threshold is reached or exceeded, the surgical robot 10 transmits a feedback signal to the processing unit 30.


The processing unit 30 receives the feedback signal (Step 244) from the surgical robot 10 and transmits a control signal to the surgical robot 10 (Step 246) to maintain the surgical robot at the threshold pose “T1” (Step 264). In addition, the processing unit 30 transmits a feedback control signal to the user interface 40 (Step 246). In response to the feedback control signal, a clinician experiences force feedback against moving the input handle beyond a threshold position “PT” that corresponds to the threshold pose “T1” of the surgical robot 10 (Step 212).


The clinician may push the input handle 42 in the first direction through the force feedback of the user interface 40 to a second position “P2” (Step 214). The processing unit 30 receives an input signal in response to movement of the input handle 42 in the first direction and slips the position of the input handle 42 relative to the pose of the surgical robot 10 (Step 248). As the input handle 42 is moved beyond the threshold position “PT”, an offset is generated along the control X axis as the input handle 42 is “slipped” between the threshold position “PT” and the second position “P2”. The offset represents the distance between the point at which the position of the input handle 42 last corresponded to the pose of the surgical robot 10, e.g., the threshold position “PT”, and the current position of the input handle 42, e.g., the second position “P2”.
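
In a one-dimensional sketch along the control X axis (sign convention and names assumed, not part of the disclosure), the offset of Step 248 is simply the slipped distance between the threshold position “PT” and the second position “P2”:

```python
# One-dimensional sketch of the slip offset along the control X axis, with
# the positive direction taken toward the obstruction (arrow "M1").
def slip_offset(handle_pos: float, threshold_pos: float) -> float:
    """Offset between the handle position and the threshold position "PT";
    zero if the handle has not been pushed past "PT"."""
    return max(0.0, handle_pos - threshold_pos)

# e.g., PT at 4.0 cm, handle pushed to P2 at 5.5 cm: offset is 1.5 cm while
# the tool is held at the threshold pose "T1"
assert slip_offset(5.5, 4.0) == 1.5
```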


With the input handle 42 at the second position “P2”, the input handle 42 can be moved along the control X axis in a second direction away from the obstruction, e.g., the direction opposite the arrow “M1”, (Step 216) such that the input handle 42 moves through a dead zone equal to the offset between the second position “P2” and the threshold position “PT” before the tool 20 of the surgical robot 10 moves along the robot X axis in a direction opposite the arrow “R1”. Once the input handle 42 returns to the threshold position “PT” along the control X axis, the surgical robot 10 is recovered from the collision such that the surgical robot 10 moves the tool 20 along the robot X axis in response to additional movement of the input handle 42 in the second direction (Steps 220, 254, 256, 266). It will be appreciated that movement of the input handle 42 along the control X axis towards the threshold position “PT” will be allowed with little or no resistance, e.g., force feedback, while additional movement of the input handle 42 along the control X axis away from the threshold position “PT” will be resisted with additional force feedback.
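
Under the same assumed one-dimensional conventions, the dead-zone recovery of method 200 may be sketched as follows; the tool holds the threshold pose “T1” until the handle has traveled back through the full offset, after which scaled coupling resumes.

```python
# Dead-zone recovery sketch (method 200, FIG. 4): the handle must return
# from "P2" through the offset to "PT" before the tool leaves pose "T1".
def tool_command_dead_zone(handle_pos: float, threshold_pos: float,
                           threshold_pose: float, scale_factor: float = 3.0) -> float:
    if handle_pos >= threshold_pos:      # still inside the dead zone
        return threshold_pose            # tool remains at "T1" (Step 264)
    # handle is past "PT" in the second direction: scaled coupling resumes
    return threshold_pose + (handle_pos - threshold_pos) / scale_factor

PT, T1 = 4.0, 10.0
assert tool_command_dead_zone(5.5, PT, T1) == T1   # within the dead zone
assert tool_command_dead_zone(4.0, PT, T1) == T1   # recovered at "PT"
print(tool_command_dead_zone(3.7, PT, T1))         # tool backs away: ~9.9
```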


With additional reference to FIG. 5, another method 300 of collision recovery is disclosed in accordance with the present disclosure. After the processing unit 30 slips the position of the input handle 42 relative to the threshold pose of the surgical robot 10 to define an offset (Step 248), the input handle 42 is moved in the second direction along the control X axis (Step 302). The processing unit 30 receives an input signal indicative of the movement of the input handle 42 in the second direction (Step 350) and transmits a second control signal to the surgical robot 10 to move away from the threshold pose “T1” with a trim between the input handle and the pose of the surgical robot (Step 352). It will be appreciated that the trim is substantially equal to the offset between the threshold position “PT” and the second position “P2”. The surgical robot 10 receives the second control signal and moves the surgical robot 10 away from the threshold pose (Step 366). The robotic surgical system 1 may continue to manipulate the surgical robot 10 in response to movements of the input handle 42 with the trim maintained between the position of the input handle 42 and the pose of the surgical robot 10.
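
A hedged one-dimensional sketch of this trim-carrying recovery of method 300 follows; here the tool responds immediately to reversal of the handle, with the handle-to-tool correspondence shifted by a constant trim equal to the slip offset (names and conventions are assumptions).

```python
# Trim-carrying recovery sketch (method 300, FIG. 5): motion resumes at once,
# with a constant trim (== the slip offset) between handle and tool.
def tool_command_with_trim(handle_pos: float, trim: float,
                           threshold_pos: float, threshold_pose: float,
                           scale_factor: float = 3.0) -> float:
    """Scaled coupling in which the handle reads as (handle_pos - trim)."""
    effective_handle = handle_pos - trim
    return threshold_pose + (effective_handle - threshold_pos) / scale_factor

PT, P2, T1 = 4.0, 5.5, 10.0
trim = P2 - PT                                         # 1.5, equal to the offset
assert tool_command_with_trim(P2, trim, PT, T1) == T1  # no jump at "P2"
print(tool_command_with_trim(5.2, trim, PT, T1))       # tool moves immediately
```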


Alternatively, in some embodiments, the robotic surgical system 1 may dynamically scale the movement of the input handle 42 and the tool 20 to reduce or eliminate the trim in a manner imperceptible to a clinician. For example, the input handle 42 can be moved in the first and second directions along the control X axis such that input signals are transmitted to the processing unit 30 (Step 304). The processing unit 30 receives the input signals (Step 354) and dynamically scales movements of the input handle 42 to reduce the trim between the input handle 42 and the pose of the surgical robot 10 (Step 356). The processing unit 30 transmits scaled control signals to the surgical robot 10 (Step 358), which moves the surgical robot 10 in response to the scaled control signals (Step 368). The trim may be reduced to a predetermined value, and the robotic surgical system 1 may continue to move the surgical robot 10 in response to movement of the input handle 42. In particular embodiments, the predetermined value of the trim is nonzero, and in other embodiments the trim is reduced to zero such that the position of the input handle 42 corresponds to the pose of the surgical robot 10.
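
One way to picture the dynamic scaling of Steps 354 through 358 is the sketch below, in which each control cycle consumes a small fraction of the clinician's handle motion to shrink the trim toward a predetermined value; the bleed fraction and names are assumptions, chosen so that larger handle motions absorb larger corrections and the adjustment remains imperceptible.

```python
# Dynamic-scaling sketch: bleed off the trim in proportion to handle motion,
# clamping at a predetermined target value (zero or nonzero). Illustrative.
def reduce_trim(trim: float, handle_delta: float,
                bleed_fraction: float = 0.1, target: float = 0.0) -> float:
    """Shrink the trim toward `target` by up to bleed_fraction * |motion|."""
    step = min(abs(handle_delta) * bleed_fraction, abs(trim - target))
    return trim - step if trim > target else trim + step

trim = 1.5
for delta in (0.8, 1.2, 0.5):   # successive handle motions (either direction)
    trim = reduce_trim(trim, delta)
print(trim)  # about 1.25: the trim decays as the clinician keeps working
```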


For a detailed discussion of a robotic surgical system functioning with an offset and/or dynamic scaling to eliminate an offset reference can be made to commonly owned U.S. Provisional Patent Application No. 62/554,292, filed Sep. 5, 2017 and entitled “ROBOTIC SURGICAL SYSTEMS WITH ROLL, PITCH, AND YAW REALIGNMENT INCLUDING TRIM AND FLIP ALGORITHMS”, the entire contents of which are hereby incorporated by reference.


Slipping a position of the input handle 42 relative to a pose of the tool 20 allows for movement or repositioning of the input handle 42 within the workspace of the user interface 40 without movement of the tool 20 within the surgical site “S”. The methods of collision recovery detailed above, e.g., moving the input handle 42 through a dead zone, operating with an offset, and dynamically scaling to eliminate the offset, allow for predictable movement of a tool, e.g., tool 20, of a surgical robot after a collision. Such predictable movement may improve surgical outcomes, reduce the surgical time, reduce recovery time, and/or reduce the cost of surgery.


As detailed above, the user interface 40 is in operable communication with the surgical robot 10 to perform a surgical procedure on a patient; however, it is envisioned that the user interface 40 may be in operable communication with a surgical simulator (not shown) to virtually actuate a robotic system and/or tool in a simulated environment. For example, the robotic surgical system 1 may have a first mode in which the user interface 40 is coupled to actuate the surgical robot 10 and a second mode in which the user interface 40 is coupled to the surgical simulator to virtually actuate a robotic system. The surgical simulator may be a standalone unit or be integrated into the processing unit 30. The surgical simulator virtually responds to a clinician interfacing with the user interface 40 by providing visual, audible, force, and/or haptic feedback to a clinician through the user interface 40. For example, as a clinician interfaces with the input handles 42, the surgical simulator moves representative tools that are virtually acting on tissue. It is envisioned that the surgical simulator may allow a clinician to practice a surgical procedure before performing the surgical procedure on a patient. In addition, the surgical simulator may be used to train a clinician on a surgical procedure. Further, the surgical simulator may simulate “complications” during a proposed surgical procedure to permit a clinician to plan a surgical procedure.


While several embodiments of the disclosure have been shown in the drawings, it is not intended that the disclosure be limited thereto, as it is intended that the disclosure be as broad in scope as the art will allow and that the specification be read likewise. Any combination of the above embodiments is also envisioned and is within the scope of the appended claims. Therefore, the above description should not be construed as limiting, but merely as exemplifications of particular embodiments. Those skilled in the art will envision other modifications within the scope of the claims appended hereto.

Claims
  • 1. A method of collision handling for a robotic surgical system, the method comprising: slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction corresponding to moving the tool towards the obstruction, the input handle having an offset relative to a desired pose of the tool in response to slipping of the input handle; and slipping the input handle relative to the pose of the tool after the surgical robot reaches a predetermined force threshold to move the tool towards a desired pose.
  • 2. The method according to claim 1, further comprising, in response to slipping the input handle, moving the input handle in a direction away from the obstruction such that the tool moves in a direction away from the obstruction maintaining a trim between a position of the input handle and a pose of the tool.
  • 3. The method according to claim 2, further comprising the robotic surgical system providing force feedback to a clinician to resist slipping of the input handle beyond the threshold position.
  • 4. The method according to claim 1, wherein the predetermined force threshold is non-zero.
  • 5. The method according to claim 2, further comprising, in response to moving the input handle a distance corresponding to the offset, moving the tool in a direction away from the obstruction.
  • 6. The method according to claim 2, wherein the trim is equal to the offset.
  • 7. A method of collision handling for a robotic surgical system, the method comprising: slipping an input handle of a user interface of the robotic surgical system relative to a pose of a tool of a surgical robot of the robotic surgical system when a portion of the surgical robot collides with an obstruction and the input handle is moved in a direction corresponding to moving the tool towards the obstruction, the input handle having an offset relative to a desired pose of the tool in response to slipping of the input handle; in response to a moving of the input handle in a direction away from the obstruction, after the slipping of the input handle, moving the tool in a direction away from the obstruction maintaining a trim between a position of the input handle and a pose of the tool; and dynamically scaling movement of the input handle relative to the pose of the tool in a direction parallel to the offset until the trim reaches a predetermined value.
  • 8. The method according to claim 7, further comprising, in response to: moving the input handle in a direction to move the portion of the surgical robot away from the obstruction, after the slipping of the input handle; and moving the input handle a distance corresponding to the offset, moving the tool in a direction away from the obstruction.
  • 9. The method according to claim 7, wherein the trim is equal to the offset.
  • 10. The method according to claim 7, wherein the predetermined value is nonzero.
  • 11. The method according to claim 7, wherein the dynamic scaling movement of the input handle relative to the pose of the tool results in a reduction of the trim.
  • 12. The method according to claim 11, wherein the reduction of the trim is in a manner imperceptible to the clinician.
  • 13. A method of collision handling of a robotic surgical system with a processing unit of the robotic surgical system, the method comprising: receiving a first input signal from a user interface of the robotic surgical system to move a tool of a surgical robot of the robotic surgical system to a desired pose of the tool; transmitting an input control signal to the surgical robot to move the tool towards the desired pose; receiving a feedback signal from the surgical robot that a force to move the tool towards the desired pose is greater than a predetermined threshold; maintaining the tool at a threshold pose when the predetermined threshold is reached; and slipping a position of the input handle relative to the threshold pose to a second position of the input handle to define an offset between the second position of the input handle and a desired pose of the tool corresponding to the second position of the input handle.
  • 14. The method according to claim 13, further comprising transmitting a feedback control signal to the user interface to resist movement of the input handle beyond a threshold position corresponding to the threshold pose of the tool.
  • 15. The method according to claim 13, further comprising receiving a second input signal from the user interface after slipping the position of the input handle indicative of the input handle moving towards a threshold position corresponding to the threshold pose of the tool.
  • 16. The method according to claim 15, further comprising maintaining the tool in the threshold pose in response to receiving the second input signal.
  • 17. The method according to claim 15, further comprising transmitting a second control signal to the surgical robot to move the tool away from the desired pose with a trim defined between the input handle and the pose of the tool.
  • 18. The method according to claim 17, wherein transmitting the second control signal includes the trim being equal to the offset between the second position of the input handle and the desired pose of the tool corresponding to the second position of the input handle.
  • 19. The method according to claim 17, further comprising dynamically scaling movement of the input handle to the pose of the tool to reduce the trim between the position of the input handle and the pose of the tool until the trim reaches a predetermined value.
  • 20. The method according to claim 19, wherein the predetermined value is nonzero.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) of International Patent Application Serial No. PCT/US2018/049334, filed Sep. 4, 2018, which claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 62/554,331, filed Sep. 5, 2017, the entire disclosure of which is incorporated by reference herein.

PCT Information
Filing Document Filing Date Country Kind
PCT/US2018/049334 9/4/2018 WO
Publishing Document Publishing Date Country Kind
WO2019/050829 3/14/2019 WO A
US Referenced Citations (385)
Number Name Date Kind
6132368 Cooper Oct 2000 A
6206903 Ramans Mar 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6312435 Wallace et al. Nov 2001 B1
6331181 Tierney et al. Dec 2001 B1
6394998 Wallace et al. May 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6459926 Nowlin et al. Oct 2002 B1
6491691 Morley et al. Dec 2002 B1
6491701 Tierney et al. Dec 2002 B2
6493608 Niemeyer Dec 2002 B1
6565554 Niemeyer May 2003 B1
6645196 Nixon et al. Nov 2003 B1
6659939 Moll et al. Dec 2003 B2
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6685698 Morley et al. Feb 2004 B2
6699235 Wallace et al. Mar 2004 B2
6714839 Salisbury, Jr et al. Mar 2004 B2
6716233 Whitman Apr 2004 B1
6728599 Wang et al. Apr 2004 B2
6746443 Morley et al. Jun 2004 B1
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6772053 Niemeyer Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6793652 Whitman et al. Sep 2004 B1
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843403 Whitman Jan 2005 B2
6846309 Whitman et al. Jan 2005 B2
6866671 Tierney et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6936042 Wallace et al. Aug 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6974449 Niemeyer Dec 2005 B2
6991627 Madhani et al. Jan 2006 B2
6994708 Manzo Feb 2006 B2
7035716 Harris et al. Apr 2006 B2
7048745 Tierney et al. May 2006 B2
7066926 Wallace et al. Jun 2006 B2
7118582 Wang et al. Oct 2006 B1
7125403 Julian et al. Oct 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7155316 Sutherland et al. Dec 2006 B2
7239940 Wang et al. Jul 2007 B2
7306597 Manzo Dec 2007 B2
7357774 Cooper Apr 2008 B2
7373219 Nowlin et al. May 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7391173 Schena Jun 2008 B2
7398707 Morley et al. Jul 2008 B2
7413565 Wang et al. Aug 2008 B2
7453227 Prisco et al. Nov 2008 B2
7524320 Tierney et al. Apr 2009 B2
7574250 Niemeyer Aug 2009 B2
7594912 Cooper et al. Sep 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7666191 Orban, III et al. Feb 2010 B2
7682357 Ghodoussi et al. Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7695481 Wang et al. Apr 2010 B2
7695485 Whitman et al. Apr 2010 B2
7699855 Anderson et al. Apr 2010 B2
7713263 Niemeyer May 2010 B2
7725214 Diolaiti May 2010 B2
7727244 Orban, III et al. Jun 2010 B2
7741802 Prisco et al. Jun 2010 B2
7756036 Druke et al. Jul 2010 B2
7757028 Druke et al. Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7778733 Nowlin et al. Aug 2010 B2
7803151 Whitman Sep 2010 B2
7806891 Nowlin et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7819885 Cooper Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7835823 Sillman et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7865266 Moll et al. Jan 2011 B2
7865269 Prisco et al. Jan 2011 B2
7886743 Cooper et al. Feb 2011 B2
7899578 Prisco et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7935130 Williams May 2011 B2
7963913 Devengenzo et al. Jun 2011 B2
7983793 Toth et al. Jul 2011 B2
8002767 Sanchez et al. Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8010180 Quaid et al. Aug 2011 B2
8012170 Whitman et al. Sep 2011 B2
8054752 Druke et al. Nov 2011 B2
8062288 Cooper et al. Nov 2011 B2
8079950 Stern et al. Dec 2011 B2
8083691 Goldenberg et al. Dec 2011 B2
8092397 Wallace et al. Jan 2012 B2
8100133 Mintz et al. Jan 2012 B2
8108072 Zhao et al. Jan 2012 B2
8120301 Goldberg et al. Feb 2012 B2
8142447 Cooper et al. Mar 2012 B2
8147503 Zhao et al. Apr 2012 B2
8151661 Schena et al. Apr 2012 B2
8155479 Hoffman et al. Apr 2012 B2
8182469 Anderson et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8206406 Orban, III Jun 2012 B2
8210413 Whitman et al. Jul 2012 B2
8216250 Orban, III et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8256319 Cooper et al. Sep 2012 B2
8285517 Sillman et al. Oct 2012 B2
8315720 Mohr et al. Nov 2012 B2
8335590 Costa et al. Dec 2012 B2
8347757 Duval Jan 2013 B2
8374723 Zhao et al. Feb 2013 B2
8418073 Mohr et al. Apr 2013 B2
8419717 Diolaiti et al. Apr 2013 B2
8423182 Robinson et al. Apr 2013 B2
8452447 Nixon May 2013 B2
8454585 Whitman Jun 2013 B2
8499992 Whitman et al. Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8528440 Morley et al. Sep 2013 B2
8529582 Devengenzo et al. Sep 2013 B2
8540748 Murphy et al. Sep 2013 B2
8551116 Julian et al. Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597182 Stein et al. Dec 2013 B2
8597280 Cooper et al. Dec 2013 B2
8600551 Itkowitz et al. Dec 2013 B2
8608773 Tierney et al. Dec 2013 B2
8620473 Diolaiti et al. Dec 2013 B2
8624537 Nowlin et al. Jan 2014 B2
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8644988 Prisco et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8668638 Donhowe et al. Mar 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8753346 Suarez et al. Jun 2014 B2
8758352 Cooper et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8768516 Diolaiti et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8790243 Cooper et al. Jul 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8821480 Burbank Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827989 Niemeyer Sep 2014 B2
8828023 Neff et al. Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8852174 Burbank Oct 2014 B2
8858547 Brogna Oct 2014 B2
8862268 Robinson et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864752 Diolaiti et al. Oct 2014 B2
8903546 Diolaiti et al. Dec 2014 B2
8903549 Itkowitz et al. Dec 2014 B2
8911428 Cooper et al. Dec 2014 B2
8912746 Reid et al. Dec 2014 B2
8944070 Guthart et al. Feb 2015 B2
8989903 Weir et al. Mar 2015 B2
9002518 Manzo et al. Apr 2015 B2
9014856 Manzo et al. Apr 2015 B2
9016540 Whitman et al. Apr 2015 B2
9019345 Patrick Apr 2015 B2
9043027 Durant et al. May 2015 B2
9050120 Swarup et al. Jun 2015 B2
9055961 Manzo et al. Jun 2015 B2
9068628 Solomon et al. Jun 2015 B2
9078684 Williams Jul 2015 B2
9084623 Gomez et al. Jul 2015 B2
9095362 Dachs, II et al. Aug 2015 B2
9096033 Holop et al. Aug 2015 B2
9101381 Burbank et al. Aug 2015 B2
9113877 Whitman et al. Aug 2015 B1
9138284 Krom et al. Sep 2015 B2
9144456 Rosa et al. Sep 2015 B2
9198730 Prisco et al. Dec 2015 B2
9204923 Manzo et al. Dec 2015 B2
9226648 Saadat et al. Jan 2016 B2
9226750 Weir et al. Jan 2016 B2
9226761 Burbank Jan 2016 B2
9232984 Guthart et al. Jan 2016 B2
9241766 Duque et al. Jan 2016 B2
9241767 Prisco et al. Jan 2016 B2
9241769 Larkin et al. Jan 2016 B2
9259275 Burbank Feb 2016 B2
9259277 Rogers et al. Feb 2016 B2
9259281 Griffiths et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9261172 Solomon et al. Feb 2016 B2
9265567 Orban, III et al. Feb 2016 B2
9265584 Itkowitz et al. Feb 2016 B2
9266239 Miller Feb 2016 B2
9283049 Diolaiti et al. Mar 2016 B2
9301811 Goldberg et al. Apr 2016 B2
9308646 Lim et al. Apr 2016 B2
9314307 Richmond et al. Apr 2016 B2
9317651 Nixon Apr 2016 B2
9333650 Bajo et al. May 2016 B2
9345546 Toth et al. May 2016 B2
9393017 Flanagan et al. Jul 2016 B2
9402689 Prisco et al. Aug 2016 B2
9417621 Diolaiti et al. Aug 2016 B2
9424303 Hoffman et al. Aug 2016 B2
9433418 Whitman et al. Sep 2016 B2
9446517 Burns et al. Sep 2016 B2
9452020 Griffiths et al. Sep 2016 B2
9469034 Diolaiti et al. Oct 2016 B2
9474569 Manzo et al. Oct 2016 B2
9480533 Devengenzo et al. Nov 2016 B2
9503713 Zhao et al. Nov 2016 B2
9550300 Danitz et al. Jan 2017 B2
9554859 Nowlin et al. Jan 2017 B2
9566124 Prisco et al. Feb 2017 B2
9579164 Itkowitz et al. Feb 2017 B2
9585641 Cooper et al. Mar 2017 B2
9615883 Schena et al. Apr 2017 B2
9623563 Nixon Apr 2017 B2
9623902 Griffiths et al. Apr 2017 B2
9629520 Diolaiti Apr 2017 B2
9662177 Weir et al. May 2017 B2
9664262 Donlon et al. May 2017 B2
9687312 Dachs, II et al. Jun 2017 B2
9696700 Beira et al. Jul 2017 B2
9700334 Hinman et al. Jul 2017 B2
9718190 Larkin et al. Aug 2017 B2
9730719 Brisson et al. Aug 2017 B2
9737199 Pistor et al. Aug 2017 B2
9795446 DiMaio et al. Oct 2017 B2
9797484 Solomon et al. Oct 2017 B2
9801690 Larkin et al. Oct 2017 B2
9814530 Weir et al. Nov 2017 B2
9814536 Goldberg et al. Nov 2017 B2
9814537 Itkowitz et al. Nov 2017 B2
9820823 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830371 Hoffman et al. Nov 2017 B2
9839481 Blumenkranz et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9850994 Schena Dec 2017 B2
9855102 Blumenkranz Jan 2018 B2
9855107 Labonville et al. Jan 2018 B2
9872737 Nixon Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9883920 Blumenkranz Feb 2018 B2
9888974 Niemeyer Feb 2018 B2
9895813 Blumenkranz et al. Feb 2018 B2
9901408 Larkin Feb 2018 B2
9918800 Itkowitz et al. Mar 2018 B2
9943375 Blumenkranz et al. Apr 2018 B2
9948852 Lilagan et al. Apr 2018 B2
9949798 Weir Apr 2018 B2
9949802 Cooper Apr 2018 B2
9952107 Blumenkranz et al. Apr 2018 B2
9956044 Gomez et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10028793 Griffiths et al. Jul 2018 B2
10033308 Chaghajerdi et al. Jul 2018 B2
10034719 Richmond et al. Jul 2018 B2
10052167 Au et al. Aug 2018 B2
10085811 Weir et al. Oct 2018 B2
10092344 Mohr et al. Oct 2018 B2
10123844 Nowlin et al. Nov 2018 B2
10188471 Brisson Jan 2019 B2
10201390 Swarup et al. Feb 2019 B2
10213202 Flanagan et al. Feb 2019 B2
10258416 Mintz et al. Apr 2019 B2
10278782 Jarc et al. May 2019 B2
10278783 Itkowitz et al. May 2019 B2
10282881 Itkowitz et al. May 2019 B2
10335242 Devengenzo et al. Jul 2019 B2
10405934 Prisco et al. Sep 2019 B2
10433922 Itkowitz et al. Oct 2019 B2
10464219 Robinson et al. Nov 2019 B2
10485621 Morrissette et al. Nov 2019 B2
10500004 Hanuschik et al. Dec 2019 B2
10500005 Weir et al. Dec 2019 B2
10500007 Richmond et al. Dec 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10524871 Liao Jan 2020 B2
10548459 Itkowitz et al. Feb 2020 B2
10575909 Robinson et al. Mar 2020 B2
10592529 Hoffman et al. Mar 2020 B2
10595946 Nixon Mar 2020 B2
10881469 Robinson Jan 2021 B2
10881473 Itkowitz et al. Jan 2021 B2
10898188 Burbank Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10905506 Itkowitz et al. Feb 2021 B2
10912544 Brisson et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10918387 Duque et al. Feb 2021 B2
10918449 Solomon et al. Feb 2021 B2
10932873 Griffiths et al. Mar 2021 B2
10932877 Devengenzo et al. Mar 2021 B2
10939969 Swarup et al. Mar 2021 B2
10939973 DiMaio et al. Mar 2021 B2
10952801 Miller et al. Mar 2021 B2
10965933 Jarc Mar 2021 B2
10966742 Rosa et al. Apr 2021 B2
10973517 Wixey Apr 2021 B2
10973519 Weir et al. Apr 2021 B2
10984567 Itkowitz et al. Apr 2021 B2
10993773 Cooper et al. May 2021 B2
10993775 Cooper et al. May 2021 B2
11000331 Krom et al. May 2021 B2
11013567 Wu et al. May 2021 B2
11020138 Ragosta Jun 2021 B2
11020191 Diolaiti et al. Jun 2021 B2
11020193 Wixey et al. Jun 2021 B2
11026755 Weir et al. Jun 2021 B2
11026759 Donlon et al. Jun 2021 B2
11040189 Vaders et al. Jun 2021 B2
11045077 Stern et al. Jun 2021 B2
11045274 Dachs, II et al. Jun 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11076925 DiMaio et al. Aug 2021 B2
11090119 Burbank Aug 2021 B2
11096687 Flanagan et al. Aug 2021 B2
11098803 Duque et al. Aug 2021 B2
11109925 Cooper et al. Sep 2021 B2
11116578 Hoffman et al. Sep 2021 B2
11129683 Steger et al. Sep 2021 B2
11135029 Suresh et al. Oct 2021 B2
11147552 Burbank et al. Oct 2021 B2
11147640 Jarc et al. Oct 2021 B2
11154373 Abbott et al. Oct 2021 B2
11154374 Hanuschik et al. Oct 2021 B2
11160622 Goldberg et al. Nov 2021 B2
11160625 Wixey et al. Nov 2021 B2
11161243 Rabindran et al. Nov 2021 B2
11166758 Mohr et al. Nov 2021 B2
11166770 DiMaio et al. Nov 2021 B2
11166773 Ragosta et al. Nov 2021 B2
11173597 Rabindran et al. Nov 2021 B2
11185378 Weir et al. Nov 2021 B2
11191596 Thompson et al. Dec 2021 B2
11197729 Thompson et al. Dec 2021 B2
11213360 Hourtash et al. Jan 2022 B2
11221863 Azizian et al. Jan 2022 B2
11234700 Ragosta et al. Feb 2022 B2
11241274 Vaders et al. Feb 2022 B2
11241290 Waterbury et al. Feb 2022 B2
11259870 DiMaio et al. Mar 2022 B2
11259884 Burbank Mar 2022 B2
11272993 Gomez et al. Mar 2022 B2
11272994 Saraliev et al. Mar 2022 B2
11291442 Wixey et al. Apr 2022 B2
11291513 Manzo et al. Apr 2022 B2
11334063 Celia May 2022 B2
20090076476 Barbagli et al. Mar 2009 A1
20110015649 Anvari et al. Jan 2011 A1
20110105898 Guthart et al. May 2011 A1
20110319910 Roelle et al. Dec 2011 A1
20120209069 Popovic et al. Aug 2012 A1
20130090763 Simaan et al. Apr 2013 A1
20140052150 Taylor et al. Feb 2014 A1
20140058406 Tsekos Feb 2014 A1
20140058564 Zhao et al. Feb 2014 A1
20140228631 Kwak et al. Aug 2014 A1
20150182285 Yen et al. Jul 2015 A1
20150250547 Fukushima et al. Sep 2015 A1
20150320514 Ahn et al. Nov 2015 A1
20170112582 Itkowitz et al. Apr 2017 A1
20170224428 Kopp Aug 2017 A1
20180014897 Peine Jan 2018 A1
20180310999 Peine Nov 2018 A1
20200367984 Peine Nov 2020 A1
Foreign Referenced Citations (6)
Number Date Country
1871267 Jan 2008 EP
2009240789 Oct 2009 JP
5754820 Jul 2015 JP
2014151550 Sep 2014 WO
2016030336 Mar 2016 WO
2016205266 Dec 2016 WO
Non-Patent Literature Citations (4)
Entry
Extended European Search Report dated Aug. 27, 2021 corresponding to counterpart Patent Application EP18853152.9.
International Search Report dated Dec. 19, 2018 and Written Opinion completed Dec. 19, 2018 corresponding to counterpart Int'l Patent Application PCT/US2018/049330.
Japanese Office Action dated Aug. 31, 2022 corresponding to counterpart Patent Application JP 2020-534809.
Indian Office Action dated Mar. 25, 2022 corresponding to counterpart Patent Application IN 202017008950.
Related Publications (1)
Number Date Country
20200345433 A1 Nov 2020 US
Provisional Applications (1)
Number Date Country
62554331 Sep 2017 US