This application claims priority to EP Patent Appl. No. 22305572.4, filed Apr. 19, 2022, the entire contents of which are incorporated herein by reference.
The present disclosure is directed to co-manipulation robotic systems having a constant tension mode for assisting with laparoscopic surgical procedures.
Managing vision and access during a laparoscopic procedure is a challenge. Most such surgeries are performed with at least four hands, i.e., a surgeon and an assistant. The surgical assistant paradigm is inherently imperfect, as the assistant is asked to anticipate and see with the surgeon's eyes, without standing where the surgeon stands, and similarly to anticipate and adjust how the surgeon wants the tissue of interest exposed throughout the procedure. For example, during a laparoscopic procedure, one assistant may be required to hold a retractor device to expose tissue for the surgeon, while another assistant may be required to hold a laparoscope device to provide the surgeon with a field of view of the surgical space within the patient during the procedure, and either assistant may be required to hold the respective tool in an impractical position, e.g., from between the arms of the surgeon while the surgeon is actively operating additional surgical instruments.
Moreover, the assistant holding the retractor may be required to control and maintain the tension applied to the organ as the surgeon performs a procedure on the organ, which may cause movement of the organ and redistribution of the forces applied to the retractor by the organ over the course of the procedure. The surgeon's and assistant's vision of the surgical scene is limited by the field of view of the laparoscopic camera. Thus, the surgeon must rely on the assistant holding the retractor, as the retractor may not be visible within the field of view of the camera. In addition, because the dedicated surgical instruments are passed through a fixed point, e.g., the trocar ports, through the wall of the abdomen, hand movements, including aberrant movements or tremors, may be amplified by a leverage effect. Accordingly, it may be very difficult for the assistant holding the retractor to physically maintain a constant pull or push force over the minutes or hours of an operation. The assistant will therefore, usually imperceptibly, even to himself or herself, weaken by letting the arm carrying the retractor descend, which may consequently push the retractor into the abdomen or off-center, with the risk of damaging, e.g., by puncturing or tearing, the organ and/or other exposed adjacent tissues.
In addition, the assistant may be required to adapt the force applied as the procedure progresses, e.g., during a dissection where the force applied to the retractor by the anatomical structure changes during the dissection, to maintain the constant force applied to the anatomical structure, which may require readjusting the position of the retractor. For example, during the procedure, the surgeon may release one organ from another organ, such as when a gallbladder is freed from the liver during a cholecystectomy, where the point of traction on the gallbladder may move more than 10 cm. Another difficulty is the variability of the operating mode of the surgeon/assistant pair. For example, when the assistant is experienced and used to working with the surgeon, the assistant may immediately provide assistance with the right pulling or retracting maneuvers. However, when the assistant is less experienced or is not known to the surgeon, the surgeon may position the retractor, and may even apply a force on the retractor to grasp an anatomical structure, such that the surgeon must trust the assistant to reproduce the same force along the same axis.
Various attempts have been made at solving this issue. For example, a rail-mounted orthopedic retractor, which is a purely mechanical device that is mounted to the patient bed/table, may be used to hold a laparoscope device in position during a laparoscopic procedure, and another rail-mounted orthopedic retractor may be used to hold a retractor device in position during the laparoscopic procedure. However, the rail-mounted orthopedic retractor requires extensive manual interaction to unlock, reposition, and lock the tool in position.
Complex robot-assisted systems such as the Da Vinci Surgical System (made available by Intuitive Surgical, Sunnyvale, California) have been used by surgeons to enhance laparoscopic surgical procedures by permitting the surgeon to tele-operatively perform the procedure from a surgeon console remote from the patient console holding the surgical instruments. Such complex robot-assisted systems are very expensive and have a very large footprint that occupies considerable space in the operating room. Moreover, such robot-assisted systems typically require unique system-specific surgical instruments that are compatible with the system, and thus surgeons may not use the standard off-the-shelf surgical instruments that they are used to. As such, the surgeon is required to learn an entirely different way of performing the laparoscopic procedure.
In view of the foregoing drawbacks of previously known systems and methods, there exists a need for a system that provides the surgeon with the ability to seamlessly position and manipulate various surgical instruments as needed, thus avoiding the workflow limitations inherent to both human and mechanical solutions.
The present disclosure overcomes the drawbacks of previously-known systems and methods by providing a co-manipulation surgical system to assist with laparoscopic surgery performed using a surgical instrument having a handle, an operating end, and an elongated shaft therebetween. The co-manipulation surgical system may include a robot arm having a proximal end, a distal end that may be removably coupled to the surgical instrument, a plurality of links, and a plurality of joints between the proximal end and the distal end. The co-manipulation surgical system further may include a controller operatively coupled to the robot arm. The controller may be programmed to cause the robot arm to automatically switch between: a passive mode responsive to determining that movement of the robot arm due to movement at the handle of the surgical instrument is less than a predetermined amount for at least a predetermined dwell time period, wherein the controller may be programmed to cause the robot arm to maintain a static position in the passive mode; and a co-manipulation mode responsive to determining that force applied at the robot arm due to force applied at the handle of the surgical instrument exceeds a predetermined threshold, wherein the controller may be programmed to permit the robot arm to be freely moveable in the co-manipulation mode responsive to movement at the handle of the surgical instrument for performing laparoscopic surgery using the surgical instrument, and wherein the controller may be programmed to apply a first impedance to the robot arm in the co-manipulation mode to account for weight of the surgical instrument and the robot arm. The controller further may be programmed to cause the robot arm to automatically switch to a haptic mode responsive to determining that at least a portion of the robot arm is outside a predefined haptic barrier, wherein the controller may be programmed to apply a second impedance to the robot arm in the haptic mode greater than the first impedance, thereby making movement of the robot arm responsive to movement at the handle of the surgical instrument more viscous in the haptic mode than in the co-manipulation mode.
Disclosed herein are co-manipulation surgical robot systems for assisting an operator, e.g., a surgeon, in performing a surgical procedure, e.g., a laparoscopic procedure, and methods of use thereof. Currently, laparoscopic procedures typically require a surgeon and one or more assistants. For example, as shown in
As shown in
The co-manipulation surgical robot systems described herein provide superior control and stability such that the surgeon and/or assistant may seamlessly position various off-the-shelf surgical instruments as needed, thus avoiding the workflow limitations inherent to both human and mechanical solutions. For example, the robot arms of the co-manipulation surgical robot system may provide surgical assistance by holding a first surgical instrument, e.g., a laparoscope, stable via a first robot arm, and a second surgical instrument, e.g., a retractor, stable via a second robot arm, throughout the procedure to provide an optimum view of the surgical site and reduce the variability of force applied by the surgical instruments to the body wall at the trocar point. As will be understood by a person having ordinary skill in the art, the robot arms of the co-manipulation surgical robot systems described herein may hold any surgical instrument, preferably having a long and thin instrument shaft, used for surgical procedures such as laparoscopic procedures including, e.g., endoscopes/laparoscopes, retractors, graspers, surgical scissors, needle holders, needle drivers, clamps, suturing instruments, cautery tools, staplers, clip appliers, etc.
The co-manipulation surgical robot system further allows the surgeon to easily maneuver both tools when necessary, providing superior control and stability over the procedure and overall safety. Implementations of the systems described herein enable a surgeon to directly co-manipulate instruments while remaining sterile at the patient bedside. For example, the system may include two robot arms that may be used by the surgeon to hold both a laparoscope and a retractor. During a surgical procedure, the system may seamlessly reposition either instrument to provide optimal visualization and exposure of the surgical field. Both instruments may be directly coupled to the robot arms of the system and the system may constantly monitor and record the position of the two instruments and/or the two robot arms throughout the procedure. Moreover, the system may record information such as the position and orientation of surgical instruments attached to the robot arms, sensor readings related to force(s) applied at proximal and distal ends of the surgical instruments attached to the robot arms, force required to hold each instrument in position, endoscopic video streams, algorithm parameters, and the operating room 3D stream captured with an optical scanning device, including, e.g., position(s) of surgical entry port(s), position and movements of the surgeon's hands, surgical instrument position and orientation, whether or not attached to the robot arms, patient position, and patient table orientation and height.
Such data may be used to develop a database of historical data that may be used to develop the algorithms used in some implementations to control one or more aspects of an operation of the system. In addition, such data may be used during a procedure to control one or more aspects of an operation of the system per one or more algorithms of the system. For example, the data may be used to assess a level of fatigue of a user of the system.
As the operator manipulates a robot arm of the co-manipulation surgical robot system by applying movement to the surgical instrument coupled to the robot arm, the system may automatically transition the robot arm between various operational modes upon determination of predefined conditions. For example, the system may transition the robot arm to a passive mode responsive to determining that movement of the robot arm due to movement at the handle of the surgical instrument is less than a predetermined amount for at least a predetermined dwell time period, such that in the passive mode, the robot arm maintains a static position, e.g., to prevent damage to the equipment and/or injury to the patient. Additionally, the system may transition the robot arm to a co-manipulation mode responsive to determining that force applied at the robot arm due to force applied at the handle of the surgical instrument exceeds a predetermined threshold, such that in the co-manipulation mode, the robot arm is permitted to be freely moveable responsive to movement at the handle of the surgical instrument for performing laparoscopic surgery using the surgical instrument, while a first impedance is applied to the robot arm in the co-manipulation mode to account for weight of the surgical instrument and the robot arm. Moreover, the system may transition the robot arm to a haptic mode responsive to determining that at least a portion of the robot arm is outside a predefined haptic barrier, such that in the haptic mode, a second impedance greater than the first impedance is applied to the robot arm, thereby making movement of the robot arm responsive to movement at the handle of the surgical instrument more viscous in the haptic mode than in the co-manipulation mode. The system further may transition the robot arm to a robotic assist mode responsive to detecting various conditions that warrant automated movement of the robot arm to guide the surgical instrument attached thereto, e.g., along a planned trajectory or to avoid a collision with another object or person in the surgical space.
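By way of non-limiting illustration only, the automatic mode transitions described above may be expressed as a simple state-update routine; the mode names follow the description above, while the function name, threshold values, and units below are merely illustrative assumptions and are not limiting:

```python
from enum import Enum, auto

class Mode(Enum):
    PASSIVE = auto()
    CO_MANIPULATION = auto()
    HAPTIC = auto()
    ROBOTIC_ASSIST = auto()

# Illustrative thresholds (assumed values, not limiting).
MOVEMENT_THRESHOLD_MM_S = 2.0    # distal-end speed below which the arm is considered "still"
DWELL_TIME_S = 0.5               # how long the arm must remain still to enter passive mode
FORCE_THRESHOLD_N = 3.0          # handle force needed to enter co-manipulation mode

def update_mode(mode, distal_speed_mm_s, handle_force_n, still_time_s, outside_haptic_barrier):
    """Return the next operational mode from the current measurements."""
    if outside_haptic_barrier:
        # A higher (second) impedance is applied in haptic mode, making motion more viscous.
        return Mode.HAPTIC
    if handle_force_n > FORCE_THRESHOLD_N:
        # Force at the handle above threshold: arm is freely movable with gravity compensation.
        return Mode.CO_MANIPULATION
    if distal_speed_mm_s < MOVEMENT_THRESHOLD_MM_S and still_time_s >= DWELL_TIME_S:
        # Negligible movement for at least the dwell period: hold a static position.
        return Mode.PASSIVE
    return mode
```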
Referring now to
In addition, each of robot arms 300 further may include indicators 334 for visually indicating the operational mode associated with the respective robot arm in real-time. For example, indicators 334 may be positioned on at least the elbow joint of the robot arm. Additionally or alternatively, indicators 334 may be placed elsewhere on system 200, e.g., on platform 100, on display 110, etc. Moreover, indicators 334 may include lights, e.g., LED lights, that may illuminate in a variety of distinct colors and in distinct patterns, e.g., solid on or blinking. For example, each operational mode of system 200 may be associated with a uniquely colored light, such as red, yellow, blue, green, purple, white, orange, etc. Accordingly, indicators 334 may indicate a transition from one operational mode to another operational mode.
As shown in
Surgical robot system 200 is configured for co-manipulation, such that system 200 may assist the user or operator, e.g., a surgeon and/or surgical assistant, by permitting the user to freely move robot arm 300a and/or robot arm 300b due to manipulation of one or more surgical instruments coupled with the robot arms in response to force inputs provided by the user to the surgical instruments. Accordingly, system 200 may be configured so that it is not controlled remotely, such that robot arms 300 move directly responsive to movement of the surgical instrument coupled thereto by the operator, while compensating for the mass of the surgical instrument and of the respective robot arm and providing localized impedance along the robot arm, thereby increasing the accuracy of the movements or actions of the operator as the operator manipulates the surgical instrument.
System 200 may be particularly useful in laparoscopic surgical procedures and/or other surgical procedures that utilize long and thin instruments that may be inserted, e.g., via cannulas, into the body of a patient to allow surgical intervention. As will be understood by a person having ordinary skill in the art, system 200 may be used for any desired or suitable surgical operation. Moreover, system 200 may be used in conjunction or cooperation with video monitoring provided by one or more cameras and/or one or more endoscopes so that an operator of system 200 may view and monitor the use of the instrument coupled with robot arms 300 via coupler interface 400. For example, robot arm 300a may be removably coupled with and manipulate an endoscope, while robot arm 300b may be removably coupled with and manipulate a surgical instrument.
Referring now to
Robot arm 300 further may include shoulder link 305, which includes proximal shoulder link 306 rotatably coupled to distal shoulder link 308. A proximal end of proximal shoulder link 306 may be rotatably coupled to shoulder portion 304 of the base at shoulder joint 318, such that proximal shoulder link 306 may be rotated relative to shoulder portion 304 about axis Q2 at shoulder joint 318. As shown in
In some embodiments, upon actuation of actuator 330, distal shoulder link 308 may be manually rotated in predefined increments relative to proximal shoulder link 306. Alternatively, upon actuation of actuator 330, distal shoulder link 308 may be automatically rotated relative to proximal shoulder link 306 until actuator 330 is released. For example, actuator 330 may be a button or switch operatively coupled to a motor operatively coupled to distal shoulder link 308 and/or proximal shoulder link 306, such that upon actuation of actuator 330, the associated motor causes distal shoulder link 308 to rotate relative to proximal shoulder link 306. Preferably, the motor is disposed within the base of robot arm 300, or alternatively, the motor may be disposed on shoulder link 305. Accordingly, actuator 330 may be a button or switch that permits dual actuation, e.g., a first actuation to cause distal shoulder link 308 to rotate in a first direction relative to proximal shoulder link 306, and a second actuation to cause distal shoulder link 308 to rotate in a second direction opposite to the first direction. In some embodiments, the button or switch may be located on a graphical user interface such as display 110.
Robot arm 300 further may include elbow link 310. A proximal end of elbow link 310 may be rotatably coupled to a distal end of distal shoulder link 308 at elbow joint 322, such that elbow link 310 may be rotated relative to distal shoulder link 308 about axis Q4 at elbow joint 322. Robot arm 300 further may include wrist portion 311, which may include proximal wrist link 312 rotatably coupled to the distal end of elbow link 310 at wrist joint 324, middle wrist link 314 rotatably coupled to proximal wrist link 312 at joint 326, and distal wrist link 316 rotatably coupled to middle wrist link 314 at joint 328, as further shown in
Referring again to
Axis Q6 and axis Q7 each may be a “passive” axis, such that middle wrist link 314 may be rotated relative to proximal wrist link 312 without any applied impedance from system 200, and distal wrist link 316 may be rotated relative to middle wrist link 314 without any applied impedance from system 200. The distal end of distal wrist link 316 may include surgical instrument coupler interface 400 for removably coupling with a surgical instrument, e.g., via coupler body 500 as shown in
Referring again to
Prior to attachment with a surgical instrument, robot arm 300 may be manually manipulated by a user, e.g., to position robot arm 300 in a desired position for coupling with the surgical instrument. For example, the user may manually manipulate robot arm 300 via wrist portion 311, actuator 330, and/or actuator 332. Upon actuation of actuator 330, the user may manually rotate distal shoulder link 308, and upon actuation of actuator 332, the user may manually manipulate proximal wrist link 312. Upon attachment to the surgical instrument, robot arm 300 may still be manipulated manually by the user exerting force, e.g., one or more linear forces and/or one or more torques, directly to robot arm 300; however, during the laparoscopic procedure, the operator preferably manipulates robot arm 300 only via the handle of the surgical instrument, which applies force/torque to the distal end of robot arm 300, and accordingly the links and joints of robot arm 300. As the operator applies a force to the surgical instrument attached to robot arm 300, thereby causing movement of the surgical instrument, robot arm 300 will move responsive to the movement of the surgical instrument to provide the operator the ability to freely move the surgical instrument relative to the patient. As described in further detail below, robot arm 300 may apply an impedance to account for the weight of the surgical instrument and of robot arm 300 itself, e.g., gravity compensation, as the operator moves the surgical instrument, thereby making it easier for the operator to move the instrument despite gravitational forces and/or inertial forces being exerted on the robot arm and/or the surgical instrument. As will be understood by a person having ordinary skill in the art, robot arm 300 may include fewer or more articulation joints than shown in
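By way of non-limiting illustration only, the gravity compensation described above may be sketched, for a simplified planar two-link approximation of the arm, as joint torques that offset the weight of the links and the attached instrument; the link masses, lengths, and function name below are illustrative assumptions rather than actual system parameters:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gravity_compensation_torques(q1, q2, m1=2.0, m2=1.5, m_tool=0.4,
                                 l1=0.35, lc1=0.17, l2=0.30, lc2=0.15):
    """Joint torques (N*m) that offset gravity for a planar two-link arm
    holding an instrument of mass m_tool at the distal end.
    q1 is the absolute shoulder angle, q2 the relative elbow angle (radians).
    Link masses and lengths are illustrative assumptions, not measured values."""
    # Torque about the shoulder joint from both links and the instrument.
    tau1 = (m1 * lc1 + (m2 + m_tool) * l1) * G * math.cos(q1) \
         + (m2 * lc2 + m_tool * l2) * G * math.cos(q1 + q2)
    # Torque about the elbow joint from the distal link and the instrument.
    tau2 = (m2 * lc2 + m_tool * l2) * G * math.cos(q1 + q2)
    return tau1, tau2
```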
Referring now to
Coupler body 500, which may have opening 514 sized and shaped to slidably and releasably receive the elongated shaft of a surgical instrument therethrough, may be removably coupled with coupler interface 400. For example, coupler body 500 may be removably coupled to coupler interface 400 via a magnetic connection, to thereby facilitate efficient attachment and detachment between coupler body 500 and coupler interface 400, e.g., by overcoming the magnetic coupling force between coupler body 500 and coupler interface 400. Accordingly, as shown in
Accordingly, coupler interface 400 or the distal end of distal wrist link 316 may have a ferrous base component configured to receive and magnetically couple with magnets 506 of coupler body 500 so that coupler body 500 may be removably coupled with coupler interface 400 and/or the distal end of distal wrist link 316.
In addition, coupler interface 400 may have one or more recesses or depressions 406 sized and shaped to receive one or more magnets 506 therein. Coupler interface 400 may have a ferrous base component or magnets within recesses 406 to magnetically couple with magnets 506. For example, the magnets within recesses 406 may have a south magnetic pole and magnets 506 may have a north magnetic pole, or vice versa. Moreover, the polarity of the magnets can ensure appropriate coupling orientation. Recesses 406 may be sized and shaped to limit or otherwise prevent movement between coupler body 500 and coupler interface 400 in any direction that is radial or normal to an axial (e.g., longitudinal) centerline of magnets 506 when coupler body 500 is in an assembled state with coupler interface 400. As will be understood by a person having ordinary skill in the art, coupler interface 400 may have fewer or more than two recesses 406, such that coupler body 500 will have a corresponding number of magnets.
Referring now to
As shown in
The diameter of opening 514 may be selected based on the surgical instrument to be coupled to coupler body 500. For example, a coupler body may be selected from a plurality of coupler bodies, each coupler body having an opening sized and shaped to receive the elongated shaft of a specific surgical instrument having a predefined elongated shaft diameter, such as a laparoscopic or other surgical instrument, including surgical instruments used for orthopedic and trauma surgery (OTS), needle holders, clamps, scissors, etc. Coupler body 500 may be coupled with the surgical instrument at any desired axial position on the surgical instrument.
As shown in
Opening 514 may be defined by a first semi-circular cutout in first portion 508 and a second semi-circular cutout in the second portion 510 of coupler body 500, to thereby engage with the circular outer surface of the elongated shaft of a surgical instrument. Opening 514 may include, e.g., rubber pads, sheets, bumps, O-rings, projections, or other components or features configured to contact and grip the outer surface of the elongated shaft of the surgical instrument. For example, the rubber material may be a silicone rubber or any other suitable type of rubber. Accordingly, once coupler body 500 is coupled with the surgical instrument, e.g., by securing screw 516, the surgical instrument may be at least inhibited or otherwise prevented from moving axially, e.g., in the direction along the longitudinal axis of the surgical instrument, or, in some embodiments, moving axially and rotationally, relative to coupler body 500 in the secured state. Preferably, the surgical instrument coupled with coupler body 500 may be freely rotated by an operator relative to coupler body 500, while axial movement of the surgical instrument relative to coupler body 500 is inhibited or otherwise prevented in the secured state. For example, the frictional force between the outer surface of the elongated shaft of the surgical instrument and the inner surface of coupler body 500 defining opening 514 may be selected such that rotation of the surgical instrument relative to coupler body 500 requires less force than axial movement of the surgical instrument relative to coupler body 500 in the secured state. Accordingly, coupler body 500 may be configured to account for diametric variations and surface variations (including variations in a coefficient of friction of the surface) of the surgical instruments.
In some embodiments, the surgical instrument may be moved in an axial direction relative to coupler body 500 upon the application of at least a threshold force on the surgical instrument relative to coupler body 500, or upon actuation of a release or a state change of coupler body 500. For example, such actuation may be achieved by, e.g., pressing a button, loosening a locking screw such as screw 516 or other connector, moving a dial, or otherwise changing coupler body 500 and/or coupler interface 400 from a second, secured state to a first, unsecured state. Accordingly, the surgical instrument may be axially repositioned relative to coupler body 500 by loosening screw 516 or other hand-operated fastener or fastening mechanism such as a clamp in coupler body 500, repositioning the surgical instrument in the desired axial position, and re-tightening screw 516 or other hand-operated fastener or fastening mechanism. Coupler body 500 may be disposable, or alternatively, may be sterilizable such that it may be sterilized between surgical procedures.
As described above, the diameter of the opening of the coupler body may be selected based on the surgical instrument to be coupled to the coupler body. Most commonly used laparoscopic surgical instruments have a predefined, known elongated shaft diameter, and thus numerous coupler bodies may be provided, each having an opening sized and shaped to receive and engage with a specific surgical instrument. For example,
With the appropriately sized coupler body coupled to the selected surgical instrument, the coupler body may be removably coupled to coupler interface 400 of robot arm 300. Coupler body 500 and coupler interface 400 may be configured for single-handed coupling, such that an operator may couple coupler body 500, and accordingly the surgical instrument coupled thereto, to coupler interface 400 of robot arm 300 using a single hand. Preferably, a surgical drape may be pinched or clamped between the coupler body and coupler interface 400, and draped over robot arm 300 to maintain sterility of the surgical space and prevent contact with non-sterile components of robot arm 300. Accordingly, the sterile drape may pass continuously (e.g., without a hole, a slit, or any other type of opening) between the coupler body and the coupler interface such that the coupler body is on a first side of the sterile drape and the coupler interface, robot arm 300, and/or other components of system 200 are on the other side of the sterile drape. In some embodiments, the coupler body may be integrated with the surgical drape. Additionally or alternatively, the surgical drape may include an adapter integrated therewith, such that coupler body 500 may be coupled to coupler interface 400 via the adapter, e.g., the adapter may be positioned between coupler body 500 and coupler interface 400.
Referring now to
Sterile drape 800 may be completely closed at an end portion thereof. In some embodiments, sterile drape 800 may have an opening (that can optionally have a sterile seal or interface) in a distal portion thereof that a portion of robot arm 300, coupler interface 400, coupler body 500, and/or the surgical instrument may pass through. Drapes having a sealed end portion without any openings, and being sealed along a length thereof, may provide a better sterile barrier for system 200. Accordingly, all of robot arm 300 may be located inside sterile drape 800 and/or be fully enclosed within sterile drape 800, except at an opening at a proximal end of sterile drape 800 (e.g., near the base of robot arm 300). In some embodiments, coupler body 500 and coupler interface 400 may have electrical connectors to produce an electronic connection between robot arm 300 and the surgical instrument. Accordingly, the electrical signals may be transmitted through sterile drape 800. Alternatively, sterile drape 800 may include an opening such that electrical wires or other components may pass through the opening to provide a wired communication channel to electrical components that may include, e.g., memory chips for calibration, radiofrequency probes for ablation, cameras, and other electronic components. The surgical instrument and the coupler body may instead be passive or non-electronic such that no electrical wires need pass through sterile drape 800.
Referring now to
Referring now to
As shown in
For example, the data obtained may be used to optimize the procedures performed by the system including, e.g., automatic servoing (i.e., moving) of one or more portions of robot arm 300. By tracking the tendency of the surgeon to keep the tools in a particular region of interest and/or the tendency of the surgeon to avoid moving the tools into a particular region of interest, the system may optimize the automatic servoing algorithm to provide more stability in the particular region of interest. In addition, the data obtained may be used to optimize the procedures performed by the system including, e.g., automatic re-centering of the field of view of the optical scanning devices of the system. For example, if the system detects that the surgeon has moved or predicts that the surgeon might move out of the field of view, the system may cause the robot arm supporting the optical scanning device, e.g., a laparoscope, to automatically adjust the laparoscope to track the desired location of the image as the surgeon performs the desired procedure. This behavior may be surgeon-specific and may require an understanding of a particular surgeon's preference for an operating region of interest. Thus, the system may control the robot arms pursuant to specific operating requirements and/or preferences of a particular surgeon.
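By way of non-limiting illustration only, the automatic re-centering behavior described above may be sketched as a dead-zone proportional controller; the image size, dead-zone radius, gain, and function name below are illustrative assumptions:

```python
def recenter_command(feature_px, image_size_px=(1920, 1080),
                     dead_zone_radius_px=150, gain_mm_per_px=0.02):
    """Return an (x, y) camera-motion command, in mm, that nudges the laparoscope
    so that the tracked feature (pixel coordinates) drifts back toward the image
    center. Inside the dead zone no motion is commanded, keeping the view stable."""
    cx, cy = image_size_px[0] / 2, image_size_px[1] / 2
    ex, ey = feature_px[0] - cx, feature_px[1] - cy          # pixel error from center
    if (ex * ex + ey * ey) ** 0.5 <= dead_zone_radius_px:
        return 0.0, 0.0                                      # within dead zone: hold still
    return -gain_mm_per_px * ex, -gain_mm_per_px * ey        # proportional re-centering step
```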
Referring now to
The depth data generated by the plurality of optical sensors may be used by the controller of system 200 to generate a virtual map, e.g., a “bird's eye view”, of the area surrounding platform 100, e.g., within the operating room, in real-time. For example, the virtual map may illustrate the operating room from a top perspective. Moreover, as shown in
In some embodiments, the controller may only cause display 110 to display the virtual map while platform 100 is being moved within the operating room. For example, platform 100 may include one or more actuators, e.g., a button, lever, or handlebar, that may be operatively coupled to the braking mechanism of the wheels of platform 100, such that upon actuation of the actuator, the braking mechanism is disengaged such that mobility of platform 100 is permitted. Accordingly, when the actuator is not actuated, the braking mechanism is engaged such that mobility of platform 100 is prevented. Thus, upon actuation of the actuator, the controller may automatically cause display 110 to display the virtual map, such that operator O can view the area surrounding platform 100 before, during, or after movement of platform 100 while the braking mechanism is disengaged. Once the actuator is released, such that the braking mechanism is reengaged, display 110 may stop displaying the virtual map. In some embodiments, when the virtual map indicates that platform 100 and/or robot arms 300a, 300b are approaching or within the predetermined distance from the one or more objects/persons within the operating room, the controller may override actuation of the actuator by the operator and reengage the braking mechanism to thereby prevent further movement of platform 100. Accordingly, the actuator may need to be released and re-actuated by the operator to disengage the braking mechanism and permit further movement of platform 100.
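By way of non-limiting illustration only, the interplay between the actuator, the braking mechanism, and the depth-derived virtual map described above may be sketched as follows; the clearance value and names below are illustrative assumptions:

```python
def update_braking(actuator_pressed, obstacle_distances_mm, min_clearance_mm=300):
    """Decide whether the platform brakes should remain disengaged.
    obstacle_distances_mm: distances from the platform/robot arms to the objects or
    persons detected in the depth-derived virtual map (illustrative input)."""
    too_close = any(d < min_clearance_mm for d in obstacle_distances_mm)
    if not actuator_pressed:
        return "BRAKES_ENGAGED"            # actuator released: platform immobilized
    if too_close:
        return "BRAKES_ENGAGED_OVERRIDE"   # override: re-engage brakes despite actuation
    return "BRAKES_DISENGAGED"             # actuation honored: platform may be moved
```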
As shown in
For example, the system may measure and record any of the following within the coordinate space of the system: motion of the handheld surgical instruments manipulated by the surgeon (attached to or apart from a robot arm); the presence/absence of other surgical staff (e.g., scrub nurse, circulating nurse, anesthesiologist, etc.); the height and angular orientation of the surgical table; patient position and volume on the surgical table; presence/absence of the drape on the patient; presence/absence of trocar ports, and if present, their position and orientation; gestures made by the surgical staff; tasks being performed by the surgical staff; interaction of the surgical staff with the system; surgical instrument identification; attachment or detachment “action” of surgical instruments to the system; position and orientation tracking of specific features of the surgical instruments relative to the system (e.g., camera head, coupler, fiducial marker(s), etc.); measurement of motion profiles or specific features in the scene that allow for the phase of the surgery to be identified; position, orientation, identity, and/or movement of any other instruments, features, and/or components of the system or being used by the surgical team.
The system may combine measurements and/or other data described above with any other telemetry data from the system and/or video data from the laparoscope to provide a comprehensive dataset with which to improve the overall usability, functionality, and safety of the co-manipulation robot-assisted surgical systems described herein. For example, as the system is being set up to start a procedure, optical scanner 1100 may detect the height and orientation of the surgical table. This information may allow the system to automatically configure the degrees of freedom of platform 100 supporting robot arms 300 to the desired or correct positions relative to the surgical table. Specifically, optical scanner 1100 may be used to ensure that the height of platform 100 is optimally positioned to ensure that robot arms 300 overlap with the intended surgical workspace. Moreover, based on the data obtained by optical scanner 1100, the system may alert the surgical staff of a potential collision (either during setup or intra-operatively) between the system and other pieces of capital equipment in the operating room, e.g., the surgical table, a laparoscopic tower, camera booms, etc., as well as with a member of the surgical staff, e.g., an inadvertent bump by the staff member. The system may use this information to recommend a repositioning of platform 100 and/or other components of the system, the surgical table, and/or the patient, and/or prevent the robot arm from switching to the co-manipulation mode as a result of the force applied to the robot arm by the collision with the staff member, even if the force exceeds the predetermined force threshold of the robot arm.
In addition, the data obtained from optical scanner 1100 may be used to monitor the progress of setup for a surgical procedure and may be combined with the known state of the system to inform remote hospital staff (e.g., the surgeon) of the overall readiness to start the procedure. Such progress steps may include: (i) patient on table; (ii) patient draped; (iii) sterile instruments available; (iv) robot arm draped; (v) trocar ports inserted; and (vi) confirmation that instruments (e.g., a laparoscope and retractor) are attached to the robotic arms of system. For example, the data obtained from optical scanner 1100 may include detected gestures indicative of the system state (e.g., system is draped), readiness to start the procedure, etc., and further may be used to prepare the system for the attachment or detachment of a surgical instrument.
In addition, optical scanner 1100 may identify the specific surgeon carrying out the procedure, such that the system may use the surgeon's identity to load a system profile associated with the particular surgeon into the system. The system profile may include information related to a surgeon's operating parameters and/or preferences, a surgeon's patient list having parameters for each patient, the desired or required algorithm sensitivity for the surgeon, the degree of freedom positioning of the support platform, etc. Examples of algorithm sensitivities that may be surgeon-specific include: adapting/adjusting the force required to transition from passive mode to co-manipulation mode (e.g., from low force to high force), adapting/adjusting the viscosity felt by the surgeon when co-manipulating the robot arm (e.g., from low viscosity to high viscosity), etc. Moreover, the surgeon's preferences may include preferred arrangements of robot arm 300, e.g., the positioning of the links and joints of robot arm 300 relative to the patient, with regard to specific surgical instruments, e.g., the preferred arrangement may be different between a laparoscope and a retractor.
In some embodiments, the surgeon's preferences may be learned based on data from past procedures and/or sensors collecting information about current procedure including a surgeon's current pose, a surgeon's height, a surgeon's hand preference, and other similar factors. For example, the system may record when a user interacts with the system and also record what the user does with the system, such that the dataset may allow for surgeon preferences to be “learned” and updated over time. This learning may be done either via traditional algorithmic methods (i.e., trends over time, averaging, optical flow, etc.) or via machine learning approaches (classification, discrimination, neural networks, reinforcement learning, etc.).
Regarding the degree of freedom positioning, a height of a surgical table is typically adjusted to accommodate the height of the surgeon in some operating rooms. Thus, by detecting the surgeon and loading the surgeon's specific profile, the system may position the platform at a height that is suitable for the respective surgeon to accommodate the preferred height of the surgical table. In addition, the horizontal translation of a robot arm may depend on the size of the patient. Thus, by accessing the patient list, the system may adjust the position of the arm based on the patient's body mass index (“BMI”). For example, for a patient with a high BMI, the system may move the robot arm away from the operating table and, for a patient with a low BMI, the system may move the robot arm closer to the operating table. Accordingly, the system permits the surgical team to fine-tune the position of the robot arm relative to the patient as necessary. The system further may be configured to access a hospital medical record database to access the procedure type and any other medical data available (e.g., CT scan images, x-ray images, MRI images, and/or other patient specific information), which may be used to inform positioning of the trocar ports, and the position and orientation of platform 100 relative to the patient.
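By way of non-limiting illustration only, the degree of freedom positioning described above may be sketched as a simple mapping from a surgeon profile and patient BMI to platform set-points; the offsets, BMI cut-offs, and names below are illustrative assumptions:

```python
def platform_setup(surgeon_profile, patient_bmi, table_height_mm):
    """Return illustrative platform set-points (platform height, horizontal arm offset)
    from a surgeon profile and patient BMI; values and ranges are assumptions."""
    # Match the platform height to the table height preferred by this surgeon.
    platform_height_mm = table_height_mm + surgeon_profile.get("height_offset_mm", 0)
    # Translate the arm horizontally based on the patient's body habitus.
    if patient_bmi >= 30:
        arm_offset_mm = 150      # move the robot arm farther from the operating table
    elif patient_bmi <= 20:
        arm_offset_mm = -100     # move the robot arm closer to the operating table
    else:
        arm_offset_mm = 0
    return platform_height_mm, arm_offset_mm
```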
Based on the data captured by optical scanner 1100, the system may generate a virtual model of the pieces of capital equipment and/or other objects in an operating room that are within a range of movement of the robot arms, in the same co-ordinate space as the robot arms and surgical instruments coupled thereto, such that the virtual model may be stored and monitored, e.g., to detect potential collisions. Additionally, the system may track the position and orientation of each virtual model, and the objects within the virtual models as the objects move relative to each other, such that the system may alert the user if the proximity of (i.e., spacing between) any of the virtual models or objects falls below a predefined threshold, e.g., within 50 mm or 75 mm, or from 30 mm or less to 100 mm or more. In some embodiments, the distance threshold may be based on the Euclidean distance between the closest points on two virtual models, the normal distance between two surfaces of the virtual models, etc. Moreover, the system may stop or inhibit (e.g., prevent) further movement of a robot arm, e.g., freeze the robot arm, if the proximity of any of the virtual models or objects reaches or falls below the predefined threshold, e.g., if a robot arm reaches or falls below the predefined threshold relative to a laparoscopic tower, the surface of the surgical table, or other objects within the surgical space. In addition, the system may freeze the robot arm if the system detects that the proximity between an object, e.g., capital equipment or a member of the surgical staff other than the surgeon, moving toward a respective robot arm reaches or falls below the predefined threshold, to thereby prevent the inadvertent movement of the robot arm that may otherwise result from such a collision or inadvertent force, e.g., an inadvertent bump from a member of the staff or another piece of capital equipment, etc.
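By way of non-limiting illustration only, the proximity check between virtual models described above may be sketched using the Euclidean distance between closest points, assuming for illustration that each virtual model is represented as a set of surface points in the shared coordinate space:

```python
import numpy as np

def min_model_distance(points_a, points_b):
    """Smallest Euclidean distance between two virtual models represented,
    for illustration, as N x 3 arrays of surface points in the shared frame."""
    a = np.asarray(points_a, float)[:, None, :]      # shape (Na, 1, 3)
    b = np.asarray(points_b, float)[None, :, :]      # shape (1, Nb, 3)
    return float(np.sqrt(((a - b) ** 2).sum(axis=2)).min())

def check_collision_guard(arm_points, obstacle_points, threshold_mm=50.0):
    """Freeze the robot arm and alert the user when the closest-point spacing
    between the arm model and an obstacle model falls below the threshold."""
    if min_model_distance(arm_points, obstacle_points) <= threshold_mm:
        return "FREEZE_ARM_AND_ALERT"
    return "CONTINUE"
```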
In some embodiments, the system may render and display the virtual models generated from laparoscopic video data, e.g., at an angle that the laparoscope is pointing within the patient's body to align the virtual models with the user's viewpoint and the anatomical structures within the field of view of the laparoscope. The virtual models may illustrate deformations corresponding to real-time deformations of the actual anatomical structure caused by forces applied to the anatomical structure by one or more surgical instruments or adjacent anatomical structures in real-time, to thereby provide enhanced visualization of the surgical environment and anatomical structures.
Referring again to
Moreover, based on the data captured by optical scanner 1100, the system may track the motion of the handheld surgical instruments that are directly and independently controlled by the surgeon and that are not coupled with the robot arm. For example, optical scanner 1100 may track a clearly defined feature of the instrument, a fiducial marker attached to the instrument or to the gloves (e.g., the sterile gloves) of the surgeon, the coupler between the robot arm and the instrument, a distal tip of the instrument, and/or any other defined location on the instrument. For example, fiducial markers may include Manus virtual reality gloves (made available by Manus, The Netherlands) or other wearables, and/or the OptiTrack systems (made available by NaturalPoint, Corvallis, Oregon). The following are examples of uses and purposes of the motion data: (i) closing a control loop between a handheld instrument and the robot arm holding the camera, thus allowing the surgeon to servo (i.e., move) the camera by “pointing” with a handheld instrument; (ii) tracking information that may be used independently or in combination with other data streams to identify the phase of the surgical procedure; (iii) to identify the dominant hand of the surgeon; (iv) to monitor metrics associated with the experience of the surgeon; (v) to identify which tools the surgeon is using and when to change them for other tools; and/or (vi) tracking of the skin surface of the patient, as well as the number, position, and orientation of the trocar ports. This data and information also may be used and computed by the system as part of the co-manipulation control paradigm. By measuring the true position and orientation of the trocar ports, the system may be provided with an additional safety check to ensure that the system level computations are correct, e.g., to ensure that the actual motion of the robot arms or instrument matches a commanded motion of the robot arms or instrument in robotic assist mode.
Based on the data captured by optical scanner 1100, the system further may track which instrument is being used in a respective port, how often instruments are swapped between ports, and which ports have manually held instruments versus instruments coupled to the robot arm, and may monitor and determine whether additional trocar ports are added, whether the system is holding the instruments in place while the patient or surgical table is moving (in which case, the system may change the operational mode of the robot arms to a passive mode and accommodate the movement by repositioning robot arm 300 and/or platform 100), and/or other conditions or parameters of the operating room or the system. The knowledge of the position and orientation of the skin surface and trocar ports relative to the robot arms may facilitate the implementation of “virtual boundaries” as described in further detail below.
Referring now to
Platform 1400 may contain memory and/or be coupled, via one or more buses, to read information from, or write information to, memory. Memory 1410 may include processor cache, including a multi-level hierarchical cache in which different levels have different capacities and access speeds. The memory also may include random access memory (RAM), other volatile storage devices, or non-volatile storage devices. Memory 1410 may be RAM, ROM, Flash, other volatile storage devices or non-volatile storage devices, or other known memory, or some combination thereof, and preferably includes storage in which data may be selectively saved. For example, the storage devices can include, for example, hard drives, optical discs, flash memory, and Zip drives. Programmable instructions may be stored on memory 1410 to execute algorithms for, e.g., calculating desired forces to be applied along robot arm 300 and/or the surgical instrument coupled thereto and applying impedances at respective joints of robot arm 300 to effect the desired forces.
Platform 1400 may incorporate processor 1402, which may consist of one or more processors and may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof designed to perform the functions described herein. Platform 1400 also may be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
Platform 1400, in conjunction with firmware/software stored in the memory may execute an operating system (e.g., operating system 1446), such as, for example, Windows, Mac OS, QNX, Unix or Solaris 5.10. Platform 1400 also executes software applications stored in the memory. For example, the software may be programs in any suitable programming language known to those skilled in the art, including, for example, C++, PHP, or Java.
Communication circuitry 1404 may include circuitry that allows platform 1400 to communicate with image capture devices such as an optical scanner and/or an endoscope. Communication circuitry 1404 may be configured for wired and/or wireless communication over a network such as the Internet, a telephone network, a Bluetooth network, and/or a WiFi network using techniques known in the art. Communication circuitry 1404 may be a communication chip known in the art such as a Bluetooth chip and/or a WiFi chip. Communication circuitry 1404 permits platform 1400 to transfer information, such as force measurements on the body wall at the trocar insertion point, locally and/or to a remote location such as a server.
Power supply 1406 may supply alternating current or direct current. In direct current embodiments, power supply 1406 may include a suitable battery, such as a replaceable battery or a rechargeable battery, and the apparatus may include circuitry for charging the rechargeable battery and a detachable power cord. Power supply 1406 may be a port to allow platform 1400 to be plugged into a conventional wall socket, e.g., via a cord with an AC to DC power converter and/or a USB port, for powering components within platform 1400. Power supply 1406 may be operatively coupled to an emergency switch, such that upon actuation of the emergency switch, power stops being supplied to the components within platform 1400 including, for example, the braking mechanism disposed on at least some joints of the plurality of joints of robot arm 300. For example, the braking mechanisms may require power to disengage, such that without power supplied to the braking mechanisms, the braking mechanisms engage to prevent movement of robot arm 300 when power is lost.
User interface 1408 may be used to receive inputs from, and/or provide outputs to, a user. For example, user interface 1408 may include a touchscreen, display, switches, dials, lights, etc. Accordingly, user interface 1408 may display information such as selected surgical instrument identity and force measurements observed during operation of robot arm 300. Moreover, user interface 1408 may receive user input including adjustments to the predetermined amount of movement at the handle of the surgical instrument or the predetermined dwell time period to cause the robot arm to automatically switch to the passive mode, the predetermined threshold of force applied at the handle of the surgical instrument to cause the robot arm to automatically switch to the co-manipulation mode, a position of the predefined haptic barrier, an identity of the surgical instrument coupled to the distal end of the robot arm, a vertical height of the robot arm, a horizontal position of the robot arm, etc., such that platform 1400 may adjust the information/parameters accordingly. In some embodiments, user interface 1408 is not present on platform 1400, but is instead provided on a remote, external computing device communicatively connected to platform 1400 via communication circuitry 1404.
Memory 1410, which is one example of a non-transitory computer-readable medium, may be used to store operating system (OS) 1446, surgical instrument identification module 1412, surgical instrument calibration module 1414, encoder interface module 1416, robot arm position determination module 1418, trocar position detection module 1420, force detection module 1422, impedance calculation module 1424, motor interface module 1426, optical scanner interface module 1428, gesture detection module 1430, passive mode determination module 1432, co-manipulation mode determination module 1434, haptic mode determination module 1436, robotic assist mode determination module 1438, fault detection module 1440, indicator interface module 1442, and fatigue detection module 1444. The modules are provided in the form of computer-executable instructions/algorithms that may be executed by processor 1402 for performing various operations in accordance with the disclosure.
For example, during a procedure, the system may continuously run the algorithms described herein based on the data collected by the system. That data may be collected and/or recorded using any of the components and methods disclosed herein, including, e.g., from sensors/encoders within the robots, from optical scanning devices in communication with the other components of the robotic system, and/or from manual inputs by an operator of the system. Accordingly, the algorithms, the data, and the configuration of the system may enable the user to co-manipulate the robot arms with minimal impact and influence from the weight of the robot arms and/or surgical instruments coupled thereto, force of gravity, and other forces that traditional robot arms fail to compensate for. Some of the parameters of the algorithms described herein may control an aspect of the behavior of the system including, e.g., robustness of detected features, sensitivity to false positives, robot control gains, number of features to track, dead zone radius, etc.
Surgical instrument identification module 1412 may be executed by processor 1402 for identifying the surgical instrument coupled to each of the robot arms, and loading the appropriate calibration file into the controller system. For example, the calibration file for each surgical instrument may be stored in a database accessible by surgical instrument identification module 1412, and may include information associated with the surgical instrument such as, e.g., instrument type, weight, center of mass, length, instrument shaft diameter, etc. Accordingly, when the appropriate calibration file is loaded, and the associated surgical instrument is coupled to robot arm 300, the system will automatically account for the mass of the surgical instrument, e.g., compensate for gravity on the surgical instrument, when the surgical instrument is attached to robot arm 300 based on the data in the calibration file, such that robot arm 300 may hold the surgical instrument in position after the surgical instrument is coupled to the robot arm and the operator lets go of the surgical instrument. For example, surgical instrument identification module 1412 may identify the surgical instrument based on user input via user interface 1408, e.g., the operator may select the surgical instrument from a database of surgical instruments stored in memory 1410.
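By way of non-limiting illustration only, loading a calibration file for an identified surgical instrument may be sketched as a simple lookup with a default configuration; the database entries, field names, and values below are illustrative assumptions:

```python
# Illustrative calibration database; the entries and field values are assumptions.
CALIBRATION_DB = {
    "laparoscope_10mm": {"mass_kg": 0.62, "com_m": 0.18, "shaft_diameter_mm": 10.0},
    "retractor_5mm":    {"mass_kg": 0.35, "com_m": 0.15, "shaft_diameter_mm": 5.0},
}

DEFAULT_CALIBRATION = {"mass_kg": 0.0, "com_m": 0.0, "shaft_diameter_mm": 0.0}

def load_calibration(instrument_id):
    """Return the calibration file for the identified instrument, or the default
    (no instrument) configuration if the instrument is unknown or detached.
    The returned mass and center of mass feed the gravity-compensation model,
    so the arm holds the instrument in place once the operator lets go."""
    return CALIBRATION_DB.get(instrument_id, DEFAULT_CALIBRATION)
```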
In some embodiments, surgical instrument identification module 1412 may automatically identify the surgical instrument coupled with the robotic arm via the coupler body and the coupler interface using, e.g., an RFID transmitter chip and reader or receiver (e.g., placing an RFID sticker or transmitter on the surgical instrument that may transmit information about the surgical instrument to a receiver of the system), a near field communication (“NFC”) device such as a near field magnetic induction communication device, a barcode and scanner or other optical device, a magnet based communication system, reed switches, a Bluetooth transmitter, the weight of the instrument and/or data gathered from the optical scanner and a lookup table, and/or any other features or mechanisms described herein or suitable for identification of the surgical instrument. As described above, the coupler body may be selected based on the size and shape of the lumen extending therethrough to accommodate and engage with a surgical instrument having a known elongated shaft diameter. Accordingly, surgical instrument identification module 1412 may automatically identify the surgical instrument based on the coupler body that is coupled to the surgical instrument via the magnetic connection between the coupler body and the coupler interface.
In some embodiments, surgical instrument identification module 1412 may identify the surgical instrument, e.g., the type of surgical instrument, based on data obtained by optical scanner 1100 via optical scanner interface module 1428 described in further detail below. For example, the data may include measurement data associated with the specific instrument, such that surgical instrument identification module 1412 may compare such data with information contained within the database to identify the instrument and load the appropriate calibration file into the controller system. Similarly, surgical instrument identification module 1412 may detect if the instrument is removed and return the calibration parameters to a default configuration.
Surgical instrument calibration module 1414 may be executed by processor 1402 for calibrating a surgical instrument, e.g., a surgical instrument that does not currently have an associated calibration file in the database stored in memory 1410. Accordingly, surgical instrument calibration module 1414 may calculate measurements and specifications of a surgical instrument when it is coupled to robot arm 300 and the system is in calibration mode, as described in further detail below with regard to
If surgical instrument calibration module 1414 determines that re-calibration results are consistently different from the configurations already loaded into the system, surgical instrument calibration module 1414 may replace existing information or add to its list of known tools without any user inputs and load them automatically. Surgical instrument calibration module 1414 may determine that the calibration factors are not adequate to compensate for the force of gravity if, e.g., when a surgical instrument is coupled with the robot arm, the robot arm moves due only to forces of gravity acting on the robot arm and/or the surgical instrument, which may be done when the surgical instrument is positioned completely outside of the patient's body. Moreover, surgical instrument calibration module 1414 may automatically update or adjust the calibration factors (e.g., the forces applied to the joints of the robot arm) if it determines that the calibration factors are not adequate to compensate for the force of gravity. Thus, surgical instrument calibration module 1414 may update the calibration factors for the particular surgical instrument and store the updated calibration factors for the particular surgical instrument in the associated calibration file for future use.
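By way of non-limiting illustration only, the adequacy check and automatic adjustment of the calibration factors described above may be sketched as follows; the thresholds, learning rate, and function names below are illustrative assumptions:

```python
def calibration_adequate(distal_speed_mm_s, handle_force_n,
                         drift_threshold_mm_s=1.0, force_threshold_n=0.5):
    """With the instrument held outside the patient and no operator force applied,
    the arm should not move; residual drift suggests inadequate gravity compensation.
    Thresholds are illustrative assumptions."""
    operator_idle = handle_force_n < force_threshold_n
    return not (operator_idle and distal_speed_mm_s > drift_threshold_mm_s)

def update_tool_mass(current_mass_kg, residual_vertical_force_n, learning_rate=0.2):
    """Nudge the stored tool mass by a fraction of the measured residual weight,
    gradually bringing the calibration factors back into agreement with gravity."""
    return current_mass_kg + learning_rate * (residual_vertical_force_n / 9.81)
```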
Encoder interface module 1416 may be executed by processor 1402 for receiving and processing angulation measurement data from the plurality of encoders of robot arm 300, e.g., encoders E1-E7, in real time. For example, encoder interface module 1416 may calculate the change in angulation over time of the links of robot arm 300 rotatably coupled to a given joint associated with the encoder. As described above, the system may include redundant encoders at each joint of robot arm 300, to thereby ensure safe operation of robot arm 300. Moreover, additional encoders may be disposed on platform 100 to measure angulation/position of each robot arm relative to platform 100, e.g., the vertical and horizontal position of the robot arms relative to platform 100. Accordingly, an encoder may be disposed on platform 100 to measure movement of the robot arms along the vertical axis of platform 100 and another encoder may be disposed on platform 100 to measure movement of the robot arms along the horizontal axis of platform 100.
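The following is a minimal sketch, assuming each joint reports an absolute angle at a fixed sample rate: the angular velocity is estimated from consecutive encoder samples, and the redundant encoder at the same joint is cross-checked against an illustrative disagreement limit.

```python
def joint_velocity(theta_prev_rad, theta_curr_rad, dt_s):
    """Finite-difference estimate of joint angular velocity in rad/s."""
    return (theta_curr_rad - theta_prev_rad) / dt_s


def encoders_agree(theta_primary_rad, theta_redundant_rad, max_delta_rad=0.01):
    """True when the redundant encoder confirms the primary measurement."""
    return abs(theta_primary_rad - theta_redundant_rad) <= max_delta_rad


print(joint_velocity(0.100, 0.104, 0.001))   # ~4 rad/s between samples
print(encoders_agree(0.104, 0.1041))         # True: within the disagreement limit
```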
Robot arm position determination module 1418 may be executed by processor 1402 for determining the position of robot arm 300 and the surgical instrument attached thereto, if any, in 3D space in real time based on the angulation measurement data generated by encoder interface module 1416. For example, robot arm position determination module 1418 may determine the position of various links and joints of robot arm 300 as well as positions along the surgical instrument coupled to robot arm 300. Based on the position data of robot arm 300 and/or the surgical instrument, robot arm position determination module 1418 may calculate the velocity and/or acceleration of movement of robot arm 300 and the surgical instrument attached thereto in real time. For example, by determining the individual velocities of various joints of robot arm 300, e.g., via the encoder associated with each joint of the various joints, robot arm position determination module 1418 may determine the resultant velocity of the distal end of robot arm 300, which may be used by passive mode determination module 1432 to determine whether movement of the distal end of robot arm 300 is within a predetermined threshold for purposes of transitioning system 200 to passive mode, as described in further detail below.
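A minimal numeric sketch of combining individual joint velocities into the resultant velocity of the distal end follows; the Jacobian values are placeholders, as a real system evaluates the Jacobian from the arm's kinematic model at the current joint configuration.

```python
import numpy as np

# The distal-end (Cartesian) velocity follows from the joint velocities through
# the arm Jacobian, v = J(q) * qdot; its norm gives the scalar speed used for
# threshold checks such as the passive-mode transition.
J = np.array([[0.10, 0.05, 0.00],
              [0.00, 0.12, 0.03],
              [0.08, 0.00, 0.11]])          # 3x3 example Jacobian (m/rad)
qdot = np.array([0.2, -0.1, 0.05])          # joint velocities (rad/s)

v_distal = J @ qdot                         # Cartesian velocity of the distal end (m/s)
speed = np.linalg.norm(v_distal)            # scalar speed for threshold comparisons
print(v_distal, speed)
```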
Trocar position detection module 1420 may be executed by processor 1402 for determining the position and/or orientation of one or more trocar ports inserted within the patient. The position and/or orientation of a trocar port may be derived based on data obtained from, e.g., inertial measurement units and/or accelerometers, optical scanners, electromechanical tracking instruments, linear encoders, and the sensors and data as described above. For example, the position of the trocar ports on the patient may be determined using a laser pointing system that may be mounted on one or more of the components of the system, e.g., wrist portion 311 of the robot arm, and may be controlled by the system to point to the optimal or determined position on the patient's body to insert the trocar. Moreover, upon insertion of the surgical instrument that is attached to robot arm 300 through a trocar, virtual lines may continuously be established along the longitudinal axis of the surgical instrument, the alignment/orientation of which may be automatically determined upon attachment of the surgical instrument to coupler interface 400 via the coupler body and the magnetic connection as described above, in real time as the surgical instrument moves about the trocar point. Moreover, when the surgical instrument is inserted within the trocar port, it will be pointing toward the trocar point, and accordingly, distal wrist link 316 will also point toward the trocar point, the angle of which may be measured by an encoder associated therewith. Accordingly, the trocar point may be calculated as the intersection of the plurality of virtual lines continuously established along the longitudinal axis of the surgical instrument. In this manner, the calculated trocar point will remain fixed relative to the patient as the surgical instrument is maneuvered about the trocar port, e.g., rotated or moved in or out of the patient.
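One way to compute such an intersection, sketched below under the assumption that each observation is a point on the instrument axis plus a direction, is to take the least-squares point closest to all of the virtual lines; the example coordinates are illustrative.

```python
import numpy as np

# Estimate the trocar point as the 3D point closest, in the least-squares sense,
# to a set of lines defined by points on the instrument shaft and shaft directions.

def estimate_trocar_point(points, directions):
    """points, directions: (N, 3) arrays; directions need not be normalized."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        proj = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the line
        A += proj
        b += proj @ p
    return np.linalg.solve(A, b)


# Two instrument poses whose shaft axes both pass near (0.2, 0.0, 0.1)
pts = np.array([[0.2, 0.0, 0.4], [0.5, 0.0, 0.1]])
dirs = np.array([[0.0, 0.0, -1.0], [-1.0, 0.0, 0.0]])
print(estimate_trocar_point(pts, dirs))     # ~[0.2, 0.0, 0.1]
```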
Based on the known position and/or orientation of a trocar port in addition to the known position of the distal end of robot arm 300 from robot arm position determination module 1418, the system may maintain the position of the distal end of robot arm 300 relative to the trocar point as robot arm 300 moves, e.g., via vertical or horizontal adjustment thereof by platform 100, or as the patient table height is adjusted, which causes the height of the patient's abdomen to move, thereby keeping the surgical instrument within the patient's body and coupled to robot arm 300 steady during these external movements. To achieve this, the known position of the distal end of robot arm 300 from robot arm position determination module 1418 is calculated in the global frame of the system by adding the position of platform 100 to the kinematics calculations (e.g., the “forward kinematics” of robot arm 300 in the context of serial chain robotic manipulators). With the position of the distal end of robot arm 300 known globally, the system may hold that position steady by applying appropriate forces to robot arm 300 during the external movements that minimize the error between its current and desired positions.
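A minimal sketch of this global-frame bookkeeping and position hold is shown below, assuming the platform offset is simply added to the arm-frame position and a proportional restoring force is applied; the gain and coordinates are illustrative assumptions.

```python
import numpy as np

# Shift the distal-end position from the arm frame into the global frame using
# the platform offset, then compute a simple proportional force that pulls the
# distal end back toward the position it held before the external movement.

def distal_position_global(p_distal_arm_frame, p_platform_global):
    return np.asarray(p_distal_arm_frame) + np.asarray(p_platform_global)


def hold_position_force(p_current_global, p_desired_global, kp=400.0):
    """Restoring force (N) that reduces the error between current and desired positions."""
    return kp * (np.asarray(p_desired_global) - np.asarray(p_current_global))


p_desired = distal_position_global([0.30, 0.10, 0.25], [0.00, 0.00, 0.80])
p_current = distal_position_global([0.30, 0.10, 0.25], [0.00, 0.00, 0.79])  # platform lowered 1 cm
print(hold_position_force(p_current, p_desired))   # force that pushes the arm back up
```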
Force detection module 1422 may be executed by processor 1402 for detecting forces applied on robot arm 300, e.g., at the joints or links of robot arm 300 or along the surgical instrument, as well as applied on the trocar, e.g., body wall forces. For example, force detection module 1422 may receive motor current measurements in real time at each motor, e.g., M1, M2, M3, disposed within the base of robot arm 300, which are each operatively coupled to a joint of robot arm 300, e.g., base joint 303, shoulder joint 318, elbow joint 322, wrist joint 332. The motor current measurements are indicative of the amount of force applied to the associated joint. Accordingly, the force applied to each joint of robot arm 300 as well as to the surgical instrument attached thereto may be calculated based on the motor current measurements and the position data generated by robot arm position determination module 1418 and/or trocar position detection module 1420.
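The following sketch illustrates one way such a calculation could proceed, under the stated assumptions that joint torque is roughly proportional to motor current (torque constant times gear ratio) and that the distal-end force is recovered from the Jacobian-transpose relation; the numeric constants are illustrative.

```python
import numpy as np

# Convert motor currents to joint torques, then solve tau = J^T * F for the
# equivalent force at the distal end in a least-squares sense.

def joint_torques_from_currents(currents_a, torque_constants_nm_per_a, gear_ratios):
    return np.asarray(currents_a) * np.asarray(torque_constants_nm_per_a) * np.asarray(gear_ratios)


def distal_force_from_torques(tau, J):
    """Least-squares solution of tau = J^T F for the distal-end force F."""
    return np.linalg.pinv(J.T) @ tau


J = np.array([[0.10, 0.05, 0.00],
              [0.00, 0.12, 0.03],
              [0.08, 0.00, 0.11]])          # placeholder Jacobian
tau = joint_torques_from_currents([1.2, 0.8, 0.5], [0.09, 0.09, 0.06], [100, 100, 80])
print(distal_force_from_torques(tau, J))
```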
Due to the passive axes at the distal end of robot arm 300, the force applied by the instrument coupled with the robot arm on the trocar may remain generally consistent throughout the workspace of the robot arm. The force on the trocar may be affected by the interaction of the distal tip of the instrument with tissue within the body. For example, if a tissue retractor advanced through the trocar is engaged with (e.g., grasping) bodily tissue or another object inside the body, the force exerted on the end of the instrument from the bodily tissue or other object may cause a change in the force applied to the trocar. In some aspects, the force on the trocar may be a function of how much weight is being lifted by the instrument being used.
Impedance calculation module 1424 may be executed by processor 1402 for determining the amount of impedance/torque needed to be applied to respective joints of robot arm 300 to achieve the desired effect, e.g., holding robot arm 300 in a static position in the passive mode, permitting robot arm 300 to move freely while compensating for gravity of robot arm and the surgical instrument attached thereto in the co-manipulation mode, applying increased impedance to robot arm 300 when robot arm 300 and/or the surgical instrument attached thereto is within a predefined virtual haptic barrier in the haptic mode, applying a constant tension force to an anatomical structure during a constant tension mode, etc.
For example, impedance calculation module 1424 may determine the amount of force required by robot arm 300 to achieve the desired effect based on position data of robot arm 300 generated by robot arm position determination module 1418 and the position data of the trocar generated by trocar position detection module 1420. For example, by determining the position of the distal end of robot arm 300, as well as the point of entry of the surgical instrument into the patient, e.g., the trocar position, and with knowledge of one or more instrument parameters, e.g., mass and center of mass of the surgical instrument stored by surgical instrument calibration module 1414, impedance calculation module 1424 may calculate the amount of force required to compensate for gravity of the surgical instrument (compensation force), as described in further detail below with regard to
Moreover, by determining the position of the distal end of robot arm 300, and accordingly, a change in position of the distal end of robot arm 300 over time, for example, due to an external force applied to the distal end of robot arm 300, e.g., by tissue held by the operating end of the surgical instrument, and with knowledge of one or more instrument parameters, e.g., mass, center of mass, and length of the surgical instrument stored by surgical instrument calibration module 1414, impedance calculation module 1424 may calculate the amount of force required to maintain the surgical instrument in a static position (hold force), as described in further detail below with regard to
In addition, rather than calculating the amount of force required to maintain the surgical instrument in a static position, impedance calculation module 1424 may continuously calculate the amount of force to maintain a constant tension force applied by, e.g., the distal end of a surgical instrument such as a retractor or grasper, to an anatomical structure independent of the position of the surgical instrument, as described in further detail below with regard to
In addition, impedance calculation module 1424 and/or force detection module 1422 may calculate the amount of force applied by the surgical instrument to the patient at the point of entry, e.g., at the trocar, as well as the amount of force applied to the operating end of the surgical instrument, e.g., the grasper end of a surgical instrument, based on the compensation force, the hold force, one or more parameters of the surgical instrument such as the mass, center of mass, and length of the surgical instrument, and the distance from the center of mass to the point of entry. Additionally or alternatively, by determining the forces applied on robot arm 300 via force detection module 1422, as well as the position/velocity/acceleration of the distal end of robot arm 300 in 3D space via robot arm position determination module 1418, the desired force/impedance to be applied to robot arm 300 to compensate for the applied forces may be calculated, e.g., for gravity compensation or to hold robot arm 300 in a static position in the passive mode. Accordingly, the desired force may be converted to torque to be applied at the joints of robot arm 300, e.g., by the motors operatively coupled to the joints of robot arm 300. For example, the robot Jacobian may be used for this purpose. The Jacobian is a matrix that is computed at each given pose of the robot arm and relates the velocities at the joints to the velocity at the distal end of robot arm 300:
V = J * qdot

Here, V is the velocity vector at the distal end of robot arm 300, J is the Jacobian matrix of robot arm 300, and qdot is the vector of joint velocities. Using the energy principle, and assuming negligible masses for the links of robot arm 300 and negligible friction/damping, the power of the system may be determined by multiplying its force and velocity:

P = F^t * V = tau^t * qdot

Here, F is the generalized force vector at the distal end of robot arm 300 and tau is the vector of torques at the joints of robot arm 300. Further, vector manipulation results in:

tau = J^t * F

Here, t denotes the transpose of the matrix, such that the forces at the distal end of robot arm 300 may be converted to torques to be applied at the joints using the Jacobian matrix.
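The following is a minimal numeric illustration of the relation derived above; the Jacobian values and desired force are placeholders rather than values from the system described herein.

```python
import numpy as np

# Map a desired Cartesian force at the distal end of the arm to the joint
# torques the motors must apply, using tau = J^T * F.
J = np.array([[0.10, 0.05, 0.00],
              [0.00, 0.12, 0.03],
              [0.08, 0.00, 0.11]])           # maps joint velocities to distal velocity
F = np.array([0.0, 0.0, 4.5])                # desired upward force at the distal end (N)

tau = J.T @ F                                # joint torques (N*m)
print(tau)                                   # [0.36, 0.0, 0.495]
```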
Motor interface module 1426 may be executed by processor 1402 for receiving motor current readings at each motor, e.g., M1, M2, M3, disposed within the base of robot arm 300, and for actuating the respective motors, e.g., by applying a predetermined impedance to achieve the desired outcome as described herein and/or to cause the joints operatively coupled to the respective motors to move, such as in the robotic assist mode.
Optical scanner interface module 1428 may be executed by processor 1402 for receiving depth data obtained by optical scanner 1100 and processing the depth data to detect, e.g., predefined conditions therein. Moreover, optical scanner interface module 1428 may generate depth maps indicative of the received depth data, which may be displayed to the operator, e.g., via a monitor. For example, optical scanner interface module 1428 may map the location of the trocar ports in 3D space, such that the mapping of trocar ports may be communicated to the operator, e.g., via display or user interface 1408. Moreover, optical scanner interface module 1428 may receive depth data obtained by optical scanners 1100a, 1100b, 1100c coupled to platform 100 and process the depth data to generate a virtual map of the area surrounding platform 100, as described above with regard to
Optical scanner interface module 1428 further may receive image data from additional optical scanning devices as defined herein, including for example, an endoscope operatively coupled to the system. In some embodiments, optical scanner interface module 1428 further may detect when the “horizontality” of the laparoscopic video images as perceived by the user operating the laparoscope coupled to the robot arm, e.g., an assistant, differs from the horizontality of the user viewing the video display monitor and performing the surgical procedure, e.g., the surgeon, and automatically adjust the laparoscopic video images displayed on the monitor to align with the surgeon's horizontality. For example, the horizontality of the laparoscopic video images may depend on the orientation of the laparoscope in space and the visual comfort of the user, and may appear different to the user depending on the user's point of view of the video display monitor. Accordingly, as an assistant holding the laparoscope may have a horizon (the horizontality of the image displayed on the monitor) that is different from the horizontality of the image from the surgeon's point of view, frequently observing the monitor via a lateral angle of incidence by the assistant may result in errors in determining the horizontality of the image, i.e., the “parallax effect.” Moreover, the horizontality of the laparoscopic images may inadvertently change as the laparoscope is moved from one position to another. Accordingly, optical scanner interface module 1428 may adjust, e.g., rotate, the laparoscopic video images displayed on the monitor to align with the surgeon's horizon and provide an optimal viewing angle. For example, the initial orientation of the laparoscope video images before the laparoscope is moved may be established as the default horizontality of the laparoscope images, such that when optical scanner interface module 1428 detects that the horizontality of the laparoscope images falls outside of a predetermined angular threshold range of the default horizontality, optical scanner interface module 1428 may automatically rotate the laparoscope images to align the horizontality of the laparoscopic images with the default horizontality. Alternatively, without establishing a default horizontality based on an initial orientation of the laparoscope, optical scanner interface module 1428 may automatically rotate the laparoscope images via machine learning algorithms trained on historical data of the same or similar procedures when optimizing viewing angle to address the parallax effect.
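A minimal sketch of the default-horizontality check is given below, assuming the measured image roll is compared against the horizon captured when the laparoscope was last repositioned; the threshold value and the helper function are illustrative assumptions.

```python
# Hypothetical sketch: return the counter-rotation (in degrees) to apply to the
# displayed laparoscopic image whenever the measured roll drifts outside an
# angular threshold of the default horizontality, otherwise leave it unchanged.

def corrected_roll_deg(current_roll_deg, default_roll_deg, threshold_deg=10.0):
    deviation = current_roll_deg - default_roll_deg
    if abs(deviation) > threshold_deg:
        return -deviation          # counter-rotate the displayed image
    return 0.0                     # within tolerance; image is not adjusted


print(corrected_roll_deg(current_roll_deg=18.0, default_roll_deg=2.0))   # -16.0
print(corrected_roll_deg(current_roll_deg=6.0, default_roll_deg=2.0))    # 0.0
```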
Gesture detection module 1430 may be executed by processor 1402 for detecting predefined gestural patterns as user input, and executing an action associated with the user input. The predefined gestural patterns may include, for example, movement of a surgical instrument (whether or not attached to robot arm 300), movement of robot arm 300 or other components of the system, e.g., foot pedal, buttons, etc., and/or movement of the operator in a predefined pattern. For example, movement of the surgical instrument back and forth in a first direction (e.g., left/right, up/down, forward/backward, in a circle) may be associated with a first user input requiring a first action by the system and/or back and forth in a second direction (e.g., left/right, up/down, forward/backward, in a circle) that is different than the first direction may be associated with a second user input requiring a second action by the system. Similarly, pressing the foot pedal or a button operatively coupled with the system in a predefined manner may be associated with a third user input requiring a third action by the system, and movement of the operator's head back and forth or up and down repeatedly may be associated with a fourth user input requiring a fourth action by the system. Various predefined gestural patterns associated with different components or operators of the system may be redundant such that the associated user input may be the same for different gestural patterns. The predefined gestural patterns may be detected by, e.g., an optical scanning device such as a laparoscope or optical scanner 1100 via optical scanner interface module 1428 or directly by force applied to robot arm 300 via force detection module 1422 or other components of the system.
Actions responsive to user input associated with predefined gestural patterns may include, for example, enabling tool tracking to servo (i.e., move) the laparoscope based on the motion of a handheld tool; engaging the brakes on (e.g., preventing further movement of) the robot arm; engaging a software lock on the robot arm; dynamically changing the length of time that the robot arm takes to transition between states from a default setting; and/or identifying which member of the surgical staff is touching the robot arm, if any. This information may be used to ensure that the system does not move if the surgeon is not touching the robot arm, e.g., to avoid the scenario where an external force is acting on the robot arm (e.g., a light cable or other wire being pulled across the robot arm) and the system perceives the force to be intentional from the surgeon. The same information may be used to detect the gaze direction of the surgeon, e.g., whether the surgeon is looking at the video feed or somewhere else in the room, such that the system may freeze the robot arm if the surgeon's gaze is not in the direction it should be. Additionally, the system may reposition a field of view of a camera based on, for example, the direction a surgeon is facing or based on the objects that the surgeon appears to be looking at, based on the data from the optical scanner 1100.
In some embodiments, the operator may actively switch the system to a command mode, e.g., via user interface 1408, where particular movements or gestures of the robot arm, surgical instrument, operator, or otherwise as described herein are monitored by gesture detection module 1430 to determine if they are consistent with a predefined gestural pattern associated with a predefined user input.
Passive mode determination module 1432 may be executed by processor 1402 for analyzing the operating characteristics of robot arm 300 to determine whether to switch the operational mode of robot arm 300 to the passive mode where the system applies impedance to the joints of robot arm 300 via motor interface module 1426 in an amount sufficient to maintain robot arm 300, and accordingly a surgical instrument attached thereto, if any, in a static position, thereby compensating for the mass of robot arm 300 and the surgical instrument, and any other external forces acting on robot arm 300 and/or the surgical instrument. If robot arm 300 is moved slightly while in the passive mode, but not with enough force to switch out of the passive mode, the system may adjust the amount of impedance applied to robot arm 300 to maintain the static position, and continue this process until robot arm 300 is held in a static position. For example, passive mode determination module 1432 may determine to switch the operational mode of robot arm 300 to the passive mode if movement of the robot arm due to movement at the handle of the surgical instrument as determined by force detection module 1422 is less than a predetermined amount, e.g., no more than 1 to 5 mm, for at least a predetermined dwell time period associated with robot arm 300. The predetermined dwell time period refers to the length of time that robot arm 300 and/or the surgical instrument attached thereto, if any, are held in a static position. For example, the predetermined dwell time may range between, e.g., 0.1 to 3 seconds or more, and may be adjusted by the operator.
In some embodiments, passive mode determination module 1432 may determine to switch the operational mode of robot arm 300 to the passive mode if movement of the distal end of the robot arm due to movement at the handle of the surgical instrument as determined by force detection module 1422 has a velocity that is less than a predetermined dwell velocity/speed. For example, if passive mode determination module 1432 determines that the distal end of the robot arm 300 and/or the surgical instrument attached thereto, if any, moves at a speed that is lower than the predetermined dwell speed during an entire predetermined dwell period, then passive mode determination module 1432 may switch the operational mode of robot arm 300 to the passive mode.
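A minimal sketch of this dwell check is shown below, assuming the distal-end speed is sampled at a fixed rate; the dwell speed, dwell period, and sampling rate are illustrative assumptions.

```python
# Hypothetical sketch: switch to passive mode only when the distal-end speed
# stays below the dwell speed for the entire predetermined dwell period.

def should_enter_passive_mode(speed_samples_mps, sample_dt_s,
                              dwell_speed_mps=0.003, dwell_time_s=0.25):
    """True when the most recent samples spanning dwell_time_s are all slow."""
    n = max(1, int(dwell_time_s / sample_dt_s))
    recent = speed_samples_mps[-n:]
    return len(recent) >= n and all(s < dwell_speed_mps for s in recent)


samples = [0.010] * 50 + [0.001] * 300        # the arm slows down and is held still
print(should_enter_passive_mode(samples, sample_dt_s=0.001))   # True
```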
Co-manipulation mode determination module 1434 may be executed by processor 1402 for analyzing the operating characteristics of robot arm 300 to determine whether to switch the operational mode of robot arm 300 to the co-manipulation mode where robot arm 300 is permitted to be freely moveable responsive to movement at the handle of the surgical instrument for performing laparoscopic surgery using the surgical instrument, while the system applies an impedance to robot arm 300 via motor interface module 1426 in an amount sufficient to account for mass of the surgical instrument and robot arm 300. Moreover, the impedance applied to robot arm 300 may provide a predetermined level of viscosity perceivable by the operator.
Moreover, the force exerted by the user on the surgical instrument and any external tissue forces applied to the surgical instrument may be directionally dependent. For example, if the force exerted by the user on the surgical instrument is in the same direction as an external tissue force applied to the surgical instrument, the two forces may be additive such that the amount of force exerted by the user on the surgical instrument needed to overcome the predefined force threshold may be reduced by the magnitude of the external tissue force such that a lower force than the predefined force threshold would be required to exit the passive mode and enter the co-manipulation mode. On the other hand, if the force exerted by the user on the surgical instrument is in a direction opposite to an external tissue force applied to the surgical instrument, then the necessary amount of force exerted by the user on the surgical instrument needed to overcome the predefined force threshold may be increased by the magnitude of the external tissue force such that a higher force than the predefined force threshold would be required to exit the passive mode and enter the co-manipulation mode.
In addition, if the force exerted by the user on the surgical instrument is in a direction that is perpendicular to an external tissue force applied to the surgical instrument, then the necessary amount of force exerted by the user on the surgical instrument needed to overcome the predefined force threshold may not be affected by the magnitude of the external tissue force such that the necessary force exerted by the user on the surgical instrument needed to exit the passive mode and enter the co-manipulation mode will equal the predefined force threshold. For other directions, the force vectors of the applied forces may be added to or offset by the force vectors of the external tissue forces to overcome the predefined force threshold values for the system or the particular surgical instrument that is coupled with the robot arm, depending on the direction of the external tissue force, if any, and the force applied by the user. In some embodiments, co-manipulation mode determination module 1434 may determine to switch the operational mode of robot arm 300 to the co-manipulation mode based on the identity of the surgical instrument.
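The directional behavior described above can be summarized, under the assumption that the tissue force is projected onto the direction of the user's applied force, by the minimal sketch below; the threshold value is illustrative.

```python
import numpy as np

# Project the external tissue force onto the direction of the user's force:
# an aligned tissue force lowers the effective force the user must add, an
# opposing one raises it, and a perpendicular one leaves it unchanged.

def exits_passive_mode(user_force, tissue_force, threshold_n=6.0):
    user_force = np.asarray(user_force, dtype=float)
    tissue_force = np.asarray(tissue_force, dtype=float)
    direction = user_force / np.linalg.norm(user_force)
    assisting_component = float(tissue_force @ direction)   # + aligned, - opposing, 0 perpendicular
    return np.linalg.norm(user_force) + assisting_component > threshold_n


print(exits_passive_mode([5.0, 0.0, 0.0], [2.0, 0.0, 0.0]))   # True: forces add
print(exits_passive_mode([5.0, 0.0, 0.0], [-2.0, 0.0, 0.0]))  # False: tissue force opposes
print(exits_passive_mode([5.0, 0.0, 0.0], [0.0, 2.0, 0.0]))   # False: perpendicular, 5 N < 6 N
```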
Haptic mode determination module 1436 may be executed by processor 1402 for analyzing the operating characteristics of robot arm 300 to determine whether to switch the operational mode of robot arm 300 to the haptic mode where the system applies an impedance to robot arm 300 via motor interface module 1426 in an amount higher than applied in the co-manipulation mode, thereby making movement of robot arm 300 responsive to movement at the handle of the surgical instrument more viscous than in the co-manipulation mode. For example, haptic mode determination module 1436 may determine to switch the operational mode of robot arm 300 to the haptic mode if at least a portion of robot arm 300 and/or the surgical instrument attached thereto is within a predefined virtual haptic boundary. Specifically, a virtual haptic boundary may be established by the system, such that the robot arm or the surgical instrument coupled thereto should not breach the boundary. For example, a virtual boundary may be established at the surface of the patient to prevent any portion of the robot arms or the instruments supported by the robot arms from contacting the patient, except through the one or more trocars. Similarly, the virtual haptic boundary may include a haptic funnel to help guide the instrument into the patient as the operator inserts the instrument into a trocar port. Accordingly, based on position data of robot arm 300 and/or the surgical instrument coupled thereto, e.g., received by robot arm position determination module 1418 and/or trocar position detection module 1420, haptic mode determination module 1436 may determine if robot arm 300 and/or the surgical instrument is within the predefined virtual haptic boundary, and accordingly transition robot arm 300 to the haptic mode where processor 1402 may instruct associated motors to apply an effective amount of impedance to the joints of robot arm 300 perceivable by the operator to communicate to the operator the virtual haptic boundary. Accordingly, the viscosity of robot arm 300 observed by the operator will be much higher than in co-manipulation mode. In some embodiments, haptic mode determination module 1436 may determine to switch the operational mode of robot arm 300 to the haptic mode based on the identity of the surgical instrument.
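A minimal sketch of such a boundary check is given below, assuming the virtual haptic boundary is approximated by a horizontal plane at the patient surface with a funnel opening at each trocar; the plane height, funnel radius, and impedance gains are illustrative assumptions.

```python
import numpy as np

# Points below the patient-surface plane and away from a trocar funnel opening
# trigger the higher haptic-mode impedance; otherwise the co-manipulation
# impedance is used.

def in_haptic_boundary(p_tip, patient_plane_z, trocar_points, funnel_radius_m=0.03):
    p_tip = np.asarray(p_tip, dtype=float)
    if p_tip[2] > patient_plane_z:
        return False                                  # above the patient surface
    for trocar in trocar_points:
        if np.linalg.norm(p_tip[:2] - np.asarray(trocar)[:2]) < funnel_radius_m:
            return False                              # inside a trocar funnel opening
    return True                                       # breaching the virtual boundary


def impedance_gain(inside_boundary, k_comanip=5.0, k_haptic=80.0):
    return k_haptic if inside_boundary else k_comanip


tip = [0.30, 0.12, 0.78]
inside = in_haptic_boundary(tip, patient_plane_z=0.80, trocar_points=[[0.10, 0.10, 0.80]])
print(inside, impedance_gain(inside))   # True 80.0
```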
Robotic assist mode determination module 1438 may be executed by processor 1402 for analyzing the operating characteristics of robot arm 300 to determine whether to switch the operational mode of robot arm 300 to the robotic assist mode where processor 1402 may instruct associated motors via motor interface module 1426 to cause movement of corresponding link and joints of robot arm 300 to achieve a desired outcome. For example, robotic assist mode determination module 1438 may determine to switch the operational mode of robot arm 300 to the robotic assist mode if a predefined condition exists based on data obtained from, e.g., optical scanner interface module 1428.
For example, robotic assist mode determination module 1438 may determine that a condition exists, e.g., the field of view of a laparoscope coupled to robot arm 300 or optical scanner 1100 is not optimal for a given surgical procedure, e.g., due to blocking by the surgeon or assistant or another component of the system, based on image data obtained from the laparoscope or optical scanner 1100 via optical scanner interface module 1428, such that the robot arm coupled to the laparoscope or optical scanner 1100 should be repositioned or zoom in/out to optimize the field of view of the surgical site for the operator. Thus, in robotic assist mode, processor 1402 may instruct robot arm 300, either automatically/quasi-automatically or responsive to user input by the operator, to move to reposition the laparoscope and/or cause the laparoscope to zoom in or zoom out, or to increase a resolution of an image, or otherwise. For example, the user input by the operator may be determined by gesture detection module 1430, as described above, such that movement of the robot arm or a surgical instrument in a predefined gestural pattern in a first direction causes the endoscope to increase resolution or magnification and in a second direction causes the endoscope to decrease resolution or magnification, and movement in another predefined gestural pattern causes the robot arm holding the laparoscope to retract away from the patient's body.
In some embodiments, robotic assist mode determination module 1438 may instruct robot arm 300 coupled to a laparoscope to move between one or more preset configurations to optimize the field of view of the laparoscope for a given procedure, e.g., zoom in/out, based on the phase of a surgical procedure upon determination that a condition exists. For example, for the treatment of cancers, there is often a dissection which may benefit from a close-up view to allow the surgeon sufficient resolution or detail to dissect the vessels, and a distant view to clearly see the limits of the dissection; for handmade sutures, the surgeon typically has to pass the needle with precision, which may benefit from a close-up view of the camera, and also tie the knots by finding the ends of the threads, which may benefit from a distant view of the camera; for a cholecystectomy or any other organ resection, it may be necessary to free the vessel(s), which may benefit from a close-up view to provide more precision, and a distant view to visualize the gallbladder in its entirety upon release from the liver. The preset configurations may be determined via machine learning algorithms trained on historical data of the same or similar procedures.
Additionally, or alternatively, robotic assist mode determination module 1438 may instruct robot arm 300 coupled to an instrument other than a laparoscope to move between one or more preset configurations to assist with a given surgical procedure based on the type of surgical instrument and the phase of a surgical procedure upon determination that a condition exists. For example, for a cholecystectomy, which typically begins with exposing the underside of the liver where the gallbladder is located, robotic assist mode determination module 1438 may instruct robot arm 300 to move from a first preset configuration where the surgical instrument coupled to the robot arm pushes the liver up the abdomen, such that the surgeon may dissect the neck area to locate the canal and cystic artery, to a second preset configuration where the surgical instrument pulls the gallbladder forward and down, such that the surgeon may separate the gallbladder from the hepatic bed. For treatment of gastroesophageal reflux, which typically begins with pulling the stomach forward and down to free the esophagus and putting on a lace, robotic assist mode determination module 1438 may instruct robot arm 300 to successively and repeatedly move between three preset configurations to pass from a traction alternately in the axis, to the left and to the right depending on whether the surgeon wants to free the esophagus forward in the mediastinum or dissect the right pillar or the left pillar, or whether the surgeon wants to fix the anti-reflux valve to the right edge or to the left edge. For a dissection of the mesorectum in a proctectomy, robotic assist mode determination module 1438 may instruct robot arm 300 to successively and repeatedly move between three preset configurations to stretch the rectum forward and upwards, then to the right and then to the left, in a downward manner until the pelvic floor is reached. As described above, the preset configurations may be determined via machine learning algorithms trained on historical data of the same or similar procedures indicative of what conditions existed at the time that the user moved the instrument to a first, second, and/or third position, etc., so that the system may identify the same or similar conditions in future procedures to automatically move the robot arm to move the instrument to a preset configuration upon the occurrence of the conditions.
In addition, robotic assist mode determination module 1438 may determine that a condition exists, e.g., that one or more trocars are not in an optimal position, for example, due to movement of the patient, such that robot arm 300 should be repositioned to maintain the trocar in the optimal position, e.g., in an approximate center of the movement range of robot arm 300, thereby minimizing the risk of reaching a joint limit of the robot arm during a procedure. Thus, in robotic assist mode, processor 1402 may instruct system to reposition robot arm 300, e.g., via vertical/horizontal adjustment by platform 100 or via the joints and links of robot arm 300, to better align the surgical instrument workspace.
Robotic assist mode determination module 1438 may determine that a condition exists, e.g., the distance between an object and robot arm 300 is within a predetermined threshold, based on image data obtained from the laparoscope or optical scanner 1100 via optical scanner interface module 1428, such that the robot arm should be frozen to avoid collision with the object. Thus, in robotic assist mode, processor 1402 may instruct robot arm 300 to apply the brakes to slow down the robot arm or inhibit or prevent movement within a predetermined distance from the other object. In addition, robotic assist mode determination module 1438 may determine that a condition exists, e.g., a force applied by a surgical instrument coupled to robot arm 300 falls outside of a predetermined threshold of a predetermined constant tension force when the robot arm is in a constant tension mode, based on motor current measurements in real-time, such that robot arm 300 should move the surgical instrument to maintain the predetermined constant tension force.
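A minimal sketch of the constant-tension condition is provided below, assuming the arm commands a small corrective displacement along the retraction axis whenever the measured force drifts outside a tolerance band around the target tension; the gain and tolerance are illustrative assumptions.

```python
# Hypothetical sketch: return a displacement command (m) along the retraction
# axis when the measured retraction force leaves the tolerance band around the
# predetermined constant tension force, or zero when within the band.

def constant_tension_correction(measured_force_n, target_force_n,
                                tolerance_n=0.5, gain_m_per_n=0.002):
    error = target_force_n - measured_force_n
    if abs(error) <= tolerance_n:
        return 0.0                     # within the predetermined threshold
    return gain_m_per_n * error        # pull slightly harder (+) or release (-)


print(constant_tension_correction(measured_force_n=3.1, target_force_n=5.0))  # +0.0038 m
print(constant_tension_correction(measured_force_n=5.2, target_force_n=5.0))  # 0.0
```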
Fault detection module 1440 may be executed by processor 1402 for analyzing the data indicative of the operating characteristics of the system, e.g., position data generated by robot arm position determination module 1418 and/or trocar position detection module 1420 and/or force measurements calculated by force detection module 1422, to detect whether a fault condition is present. For example, fault detection module 1440 may detect a fault condition of the system and determine whether the fault condition is a “minor fault,” a “major fault,” or a “critical fault,” wherein each category of fault condition may be cleared in a different predefined manner.
For example, fault detection module 1440 may detect a minor fault condition such as robot arm 300 being moved with a velocity exceeding a predetermined velocity threshold, which may be cleared, e.g., by slowing down the movement of robot arm 300. In some embodiments, the system may automatically apply additional impedance to robot arm 300 when robot arm 300 is moving too fast to thereby force the operator to slow down movement of robot arm 300. Moreover, fault detection module 1440 may detect a major fault condition such as an inadvertent bump of robot arm 300 as indicated by a large force applied to robot arm 300 by a person other than the operator. In response to detection of a major fault condition, fault detection module 1440 may actuate the braking mechanism associated with each motorized joint of robot arm 300 (or at least the joints associated with the major fault condition), to thereby freeze robot arm 300 and inhibit further movement of robot arm 300. Such a major fault condition may be cleared by the operator actuating a “clear” option displayed on user interface 1408. Fault detection module 1440 may detect a critical fault condition such as redundant encoders associated with a given joint of robot arm 300 generating different angulation measurements with a delta exceeding a predetermined threshold. In response to detection of a critical fault condition, fault detection module 1440 may actuate the braking mechanism associated with each motorized joint of robot arm 300 to thereby freeze robot arm 300 and inhibit further movement of robot arm 300. Such a critical fault condition may be cleared by the operator restarting the system. Upon restart of the system, if the critical fault condition is still detected by fault detection module 1440, robot arm 300 will remain frozen until the critical fault condition is cleared.
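The three fault categories and their responses described above can be summarized by the minimal sketch below; the thresholds and the exact clearing actions are illustrative assumptions.

```python
# Hypothetical sketch: classify a fault condition from a few monitored signals
# and report the response, mirroring the minor/major/critical hierarchy above.

def classify_fault(distal_speed_mps, external_bump_force_n, encoder_delta_rad,
                   speed_limit_mps=0.5, bump_limit_n=30.0, encoder_limit_rad=0.01):
    if encoder_delta_rad > encoder_limit_rad:
        return "critical", "engage brakes; cleared only by system restart"
    if external_bump_force_n > bump_limit_n:
        return "major", "engage brakes; cleared via the user interface"
    if distal_speed_mps > speed_limit_mps:
        return "minor", "apply additional impedance until the arm slows down"
    return None, "no fault"


print(classify_fault(distal_speed_mps=0.7, external_bump_force_n=2.0, encoder_delta_rad=0.001))
```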
Indicator interface module 1442 may be executed by processor 1402 for causing indicators 334 to communicate the state of the system, e.g., the operational mode of robot arm 300, to the operator or other users, based on, for example, determinations made by passive mode determination module 1432, co-manipulation mode determination module 1434, haptic mode determination module 1436, and/or robotic assist mode determination module 1438. For example, indicator interface module 1442 may cause indicators 334 to illuminate in a specific color of light associated with a specific state of the system. For example, indicator interface module 1442 may cause indicators 334 to illuminate in a first color (e.g., yellow) to indicate that no surgical instrument is attached to the robot arm, and that the robot arm may be moved freely such that the system compensates for the mass of the robot arm; in a second color (e.g., purple) to indicate that a surgical tool is attached to the robot arm, and that the robot arm may be moved freely such that the system compensates for the mass of the robot arm and the mass of the surgical instrument coupled to the robot arm; in a third color (e.g., blue) to indicate that a surgical instrument is attached to the robot arm, and that the robot arm is in the passive mode as determined by passive mode determination module 1432; in a fourth color (e.g., pulsing orange) to indicate that at least a portion of the robot arm and/or the surgical instrument attached thereto is within the virtual haptic boundary, e.g., 1.4 m or more above the ground; and in a fifth color (e.g., pulsing red) to indicate that a fault has been detected by fault detection module 1440. As will be understood by a person having ordinary skill in the art, different colors and patterns may be communicated by indicators 334 to indicate the states of the system described above.
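This example color scheme may be expressed as a simple state-to-color mapping, sketched below; the state names are illustrative labels rather than identifiers used by the system.

```python
# Hypothetical mapping of system states to indicator colors following the
# example scheme described above; unrecognized states fall back to "off".

INDICATOR_COLORS = {
    "no_instrument_free_drive": "yellow",
    "instrument_attached_free_drive": "purple",
    "passive_mode": "blue",
    "haptic_boundary": "pulsing orange",
    "fault_detected": "pulsing red",
}


def indicator_color(state):
    return INDICATOR_COLORS.get(state, "off")


print(indicator_color("passive_mode"))   # blue
```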
Additionally, indicators 334 may be illuminated in other distinct colors and/or patterns to communicate additional maneuvers by robot arm 300, e.g., when robot arm 300 retracts the surgical instrument in the robotic assist mode, or performs another robotically-assisted maneuver in the robotic assist mode. As described above, indicators 334 further may include devices for emitting other alerts such as an audible alert or text alert. Accordingly, indicator interface module 1442 may cause indicators 334 to communicate the state of the system to the operator using audio or text, as well as or instead of light.
Fatigue detection module 1444 may be executed by processor 1402 for detecting user fatigue that may occur during operation of robot arm 300 in a surgical procedure, as described in further detail below with regard to
The co-manipulation surgical robot systems described herein may include additional modules within memory 1410 of platform 200 for executing additional tasks based on the data obtained. For example, the system may determine that a surgical instrument has been attached to robot arm 300 by detecting a rapid or sudden change in force (a “snapping motion”) applied to robot arm 300, e.g., due to the attraction force of the magnetic connection between the coupler body and coupler interface 400, via force detection module 1422. For example, the attractive forces of the magnets on the coupler body and coupler interface 400 may cause a sudden movement on at least an end portion of the robot arm, and/or a sudden rotation of the last joint of the robot arm when the magnets are aligning. Accordingly, this sudden movement may be detected and may trigger surgical instrument identification module 1412 to determine that an instrument has been attached or detached from the robot arm. Similarly, surgical instrument identification module 1412 may determine that the surgical instrument has been detached from robot arm 300, e.g., when subsequent motions of the distal end of robot arm 300 are accompanied by little to no rotation in the distal-most joint of robot arm 300.
Additionally, the system may determine if the surgical instrument has been detached from robot arm 300 based on data indicative of the position of the distal end of robot arm 300 relative to the trocar point generated by trocar position detection module 1420, as well as the direction of an instrument shaft and/or an orientation of the distal-most link of robot arm 300, e.g., distal wrist link 316. For example, if the instrument is pointing directly at the trocar, then there is a higher probability that a tool is attached to the robot arm. Moreover, axis Q7 of robot arm 300 may indicate the pointing direction of the instrument and, if the instrument is passing through the trocar port, distal wrist link 316 will point in a direction of the trocar port. Therefore, if distal wrist link 316 is not pointing toward the trocar port, then the system may determine that the robot arm is not supporting an instrument or the instrument is not advanced through the trocar port. For example, when an instrument is detached from robot arm 300 and robot arm 300 is moved, the computed direction of the instrument shaft (e.g., the direction that the instrument would point if attached to robot arm 300) likely will no longer point to the trocar entry point. Accordingly, the system may alert a user if the system determines that no tool is coupled with robot arm 300, e.g., via indicators 334.
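A minimal sketch of this pointing-direction check follows, assuming the axis of the distal wrist link is compared against the direction from the distal end to the known trocar point within an angular tolerance; the tolerance and coordinates are illustrative assumptions.

```python
import numpy as np

# If the distal wrist link axis does not point toward the trocar point within a
# tolerance, assume no instrument is advanced through that trocar.

def points_toward_trocar(p_distal, wrist_axis, p_trocar, max_angle_deg=10.0):
    wrist_axis = np.asarray(wrist_axis, dtype=float)
    to_trocar = np.asarray(p_trocar, dtype=float) - np.asarray(p_distal, dtype=float)
    cos_angle = (wrist_axis @ to_trocar) / (np.linalg.norm(wrist_axis) * np.linalg.norm(to_trocar))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= max_angle_deg


print(points_toward_trocar([0.3, 0.1, 0.6], [0.0, 0.0, -1.0], [0.3, 0.1, 0.2]))   # True
print(points_toward_trocar([0.3, 0.1, 0.6], [1.0, 0.0, 0.0], [0.3, 0.1, 0.2]))    # False
```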
In addition, the system may identify when a user may be attempting to remove or decouple a surgical instrument from robot arm 300 and adjust the removal force required to decouple the surgical instrument, and accordingly the coupler body, from coupler interface 400. For example, where one or more magnets are used to provide a biasing force to bias the surgical coupler body to the coupler interface, a force greater than the attraction force provided by the one or more magnets in a direction opposing the force provided by the one or more magnets must be exerted on the surgical instrument and/or the coupler body that is coupled with the surgical instrument to overcome the attracting force and decouple the coupler body and surgical instrument from the coupler interface. For example, the removal force may be 30-60 Newtons.
Moreover, the system may gather and analyze telemetry data regarding forces being applied to the robot arm to assess or estimate whether a user is attempting to remove a tool from the robot arm and, if so, reduce the coupling force between the coupler body and the coupler interface to make it easier for the user to disengage the surgical instrument from the robot arm. For example, the coupling/removal force may be reduced by 50-80%. Based on historical data and user feedback, as well as on data such as whether a user replaces the instrument without adjusting a location of the instrument, which could indicate inadvertent removal of the instrument, the system may estimate the optimal times to reduce a coupling force between the coupler body and the coupler interface. Moreover, the coupling force may be increased during operation to prevent inadvertent removal of surgical instrument from the robot arm.
Additionally, the system may determine an optimum positioning of robot arms 300 and their joints, the surgical instruments coupled with the robot arms, or other components of the robot arms and/or the system based on data obtained from the optical scanning devices used with the system, and provide guidance to the operator of the system to achieve the optimum positioning. Data indicative of the optimum positioning further may be used by processor 1402 to instruct the motors to cause corresponding links and joints of robot arm 300 to move, e.g., in robotic assist mode, to automatically reposition robot arm 300 and/or the optical scanning devices in the optimum position, e.g., during the setup stage or thereafter.
In addition, the system may collect data from sensors, e.g., position data of robot arm 300 or the surgical instrument attached thereto via the encoders or optical scanning devices and/or position data of the operator via body sensors or optical scanning devices, during a procedure, e.g., during setup or operation of robot arm 300, such that processor 1402 may detect deviations of movements or processes of the current user as compared to a model or optimal movement pattern, and communicate the deviations to the current user in real-time. For example, processor 1402 may cause a monitor to display the deviations to the current user in real-time, as well as the optimal and/or actual movement pattern. Additionally, or alternatively, indicator interface module 1442 may cause indicators 334 to indicate deviations from the model or optimal movement pattern, e.g., by illuminating a specific color and/or in a specific pattern. Additionally, or alternatively, motor interface module 1426 may apply impedance to robot arm 300 perceivable by the operator as haptic feedback including vibrations, restrictions on movement, or sensations to indicate deviations from the model or optimal movement pattern. Accordingly, the system may be used as a training tool for new users as such data may be used to optimize the position of a surgical device in real-time.
The system further may analyze the depth map generated by the optical scanning devices and cluster different groups of (depth) pixels into unique objects, a process which is referred to as object segmentation. Examples of such algorithms for segmentation may include: matching acquired depth map data to a known template of an object to segment; using a combination of depth and RGB color image data to identify and isolate relevant pixels for the object; and/or machine learning algorithms trained on a real or synthetic dataset of objects to identify and segment. Examples of such segmentation on a depth map may include: locating the robot arms or determining the position of the robot arms; identifying patient ports (e.g., trocar ports) and determining a distance from the instruments to the trocar ports; identifying the surgeon and distinguishing the surgeon from other operators in the room; and/or identifying the surgeon in the sensor's field of view. Moreover, the system may use object segmentation algorithms to uniquely identify the surgeon and track the surgeon with respect to, for example, a surgical table, a patient, one or more robot arms, etc. In addition, the system may use object segmentation algorithms to determine if a surgeon is touching or handling either of the robot arms and, if so, identify which robot arm is being touched or handled by the surgeon. The system further may use object segmentation to locate the surgical instrument and the distal end of the robot arm in 3D space, such that the system may determine whether the surgical instrument is attached to the distal end of the robot arm, e.g., based on proximity between the surgical instrument and the distal end of the robot arm.
Referring now to
If the calibration file for the selected surgical instrument is not available in the database, the operator may self-calibrate the surgical instrument using the system. For example,
At step 1605, the system compensates for the gravity of the surgical instrument and the force applied by the hand of the operator, e.g., by measuring the force applied to the distal end of robot arm 300 due to the mass of the surgical instrument. As described above, the force applied to the distal end of robot arm 300 may be measured by measuring the motor current across the motors disposed in the base of robot arm 300. If the system overcompensates for the gravity of the surgical instrument, at step 1606, robot arm 300 may “run away,” e.g., drift upward. The runaway effect may be detected at step 1607, and at step 1608, indicators 334 may blink to alert the operator of the runaway. At step 1609, the system may identify the runaway as a minor fault, and accordingly apply additional impedance to robot arm 300 and freeze robot arm 300 when robot arm 300 slows down before removing the additional impedance. Once the minor fault is addressed, calibration process 1600 may return to step 1603.
After step 1605, when the system compensates for the gravity of the surgical instrument, if the surgical instrument is detached, either accidentally or manually by the operator at step 1611, at step 1610, the system detects the detachment of the surgical instrument from robot arm 300. As a result, the system will stop compensating for the gravity of the surgical instrument, and calibration process 1600 may return to step 1603. After step 1605, when the system compensates for the gravity of the surgical instrument, calibration process 1600 is ready to enter calibration mode at step 1612. For example, the operator may initiate calibration mode via user interface 1408 at step 1613. At step 1614, the system may indicate to the operator, e.g., via user interface 1408 and/or blinking of indicators 334, that it is safe to let go of the surgical instrument, such that the operator may let go of the surgical instrument at step 1616. At step 1615, the system calibrates the surgical instrument.
Referring again to
For example, an operator may exert a particular force on the distal end of robot arm 300, e.g. by manipulating the surgical instrument coupled to robot arm 300, to indicate that the operator wishes to change the operational mode of the particular robot arm. Sensors and/or motor current readings may be used to detect the force applied to the distal end of robot arm 300 and to determine if the force matches a predefined force signature associated with an operational change, e.g., by comparing the force with one or more predefined force signatures stored in the system. If there is a match, then the system may change the operational mode of the robot arm to the particular operational mode that matches the force signature.
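A minimal sketch of such force-signature matching is provided below, assuming a short window of measured distal-end forces is compared against stored signatures by Euclidean distance; the signatures, window length, and distance threshold are illustrative assumptions.

```python
import numpy as np

# Compare a short trace of measured distal-end forces against stored force
# signatures; the closest match within a distance threshold selects the
# requested operational mode change, otherwise no change is made.

FORCE_SIGNATURES = {
    "enter_co_manipulation": np.array([0.0, 2.0, 4.0, 2.0, 0.0]),
    "enter_robotic_assist":  np.array([0.0, -2.0, -4.0, -2.0, 0.0]),
}


def match_force_signature(measured_trace, max_distance=1.5):
    measured_trace = np.asarray(measured_trace, dtype=float)
    best_name, best_dist = None, np.inf
    for name, signature in FORCE_SIGNATURES.items():
        dist = np.linalg.norm(measured_trace - signature)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None


print(match_force_signature([0.1, 1.8, 4.2, 2.1, 0.0]))   # enter_co_manipulation
print(match_force_signature([5.0, 5.0, 5.0, 5.0, 5.0]))   # None (no mode change)
```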
As described above, during operation of the co-manipulation surgical system, the system may continuously monitor the robot arm and forces applied thereto to detect predefined conditions that require switching the operational modes of the system, as described in method 1700 of
For example, a first robot arm may be coupled to a laparoscope, and the operator may manipulate the laparoscope within the patient until a desirable field of view is provided by the laparoscope, e.g., via a monitor displaying the image feed from the laparoscope. In order to freely move the laparoscope coupled to the first robot arm in the co-manipulation mode, the operator must apply a sufficient force to the laparoscope that exceeds a predetermined force threshold. The predetermined force threshold should be low enough such that it does not require much force by the operator to freely move the laparoscope. Moreover, the predetermined force threshold may be selected so as to resist inadvertent movement away from the passive mode. As the operator freely moves the laparoscope in the co-manipulation mode, as described above, the system will apply enough impedance to the first robot arm to compensate for the effects of mass (i.e., inertia) and/or gravity of the first robot arm and the laparoscope during the movement, such that a mass or weight of the first robot arm is not detectable by the operator or is otherwise significantly attenuated. In some embodiments, if the laparoscope is not already positioned within the body of the patient when the operator couples the laparoscope to the first robot arm, the system may determine that there are no external forces acting on the surgical instrument and may automatically switch the first robot arm to the haptic mode in order to guide the operator to move the laparoscope to the appropriate location through the trocar port, e.g., via a virtual haptic funnel established about the trocar port.
When the laparoscope is in the desired position relative to the patient and the surgical site within the patient, the system will automatically switch from co-manipulation mode to passive mode upon detection that movement of the first robot arm due to movement of the surgical instrument is within a predetermined movement threshold for a period of time exceeding a predetermined dwell time. For example, upon reaching the desired position, the operator will hold the laparoscope in the desired position, e.g., for at least a quarter of a second. Thus, if the predetermined dwell time is a quarter of a second, holding the laparoscope in the desired position for any longer than the predetermined dwell period will cause the system to automatically switch to passive mode. Moreover, as the operator may not be able to hold the laparoscope perfectly still, at least some movement of the laparoscope is permitted for the duration of the predetermined dwell time to enter into the passive mode. As described above, in passive mode, the first robot arm will hold the laparoscope in a static position, e.g., by the system applying enough impedance to the first robot arm to compensate for all external forces acting on the laparoscope.
Similarly, a second robot arm may be coupled to a retractor, and the operator may freely manipulate the retractor within the patient in the co-manipulation mode, e.g., to grasp tissue within the patient and retract the tissue to provide a clear field of view of the surgical site by the laparoscope coupled to the first robot arm, by applying a sufficient force to the second robot arm due to force applied at the retractor exceeding the predetermined force threshold of the second robot arm. As the operator grasps/lifts/retracts the tissue with the retractor, the system may only compensate for the gravity of the second robot arm and/or the instrument and not of the tissue being grasped, such that the operator may feel any other forces acting on the retractor, including without limitation the forces acting on the instrument from the tissue. Accordingly, the haptics associated with the tissue being grasped may be preserved.
When the retractor sufficiently grasps and retracts the tissue, the system may automatically transition to the passive mode upon the operator holding the retractor in position, e.g., with movement not exceeding a predetermined movement threshold of the second robot arm, for a period of time exceeding the predetermined dwell period of the second robot arm. Accordingly, when the retractor is retracting the tissue within the patient in the passive mode, the second robot arm will account for the mass of the tissue in addition to the mass of the retractor and the second robot arm. Thus, the predetermined force threshold to cause the second robot arm to switch out of the passive mode must be greater than the force applied to the second robot arm due to force applied to the tip of the retractor by the tissue, such that if the force applied by the tissue to the surgical instrument exceeds the predetermined force threshold of the second robot arm, the system will automatically cause the second robot arm to switch out of the passive mode and into, e.g., the co-manipulation mode. However, the predetermined force threshold should not be so high that it is very difficult for the operator to move the retractor. As described above, the operator may adjust the predetermined force threshold via, e.g., user interface 1408.
Additionally, or alternatively, the system may transition to a constant tension mode, which may be a sub-mode of the robotic assist mode, responsive to user input, e.g., a predefined gestural pattern that may be detected by optical scanner 1100 and/or the laparoscopic video feed, user input received by user interface 1408, voice command, one or more actuators associated with robot arm 300, etc., to maintain a constant tension force applied by the retractor on the tissue, as described in further detail below with regard to
Upon retraction of the tissue via the retractor coupled to the second robot arm, the operator may need to readjust the field of view of the laparoscope coupled to the first robot arm. Accordingly, the operator may apply a force to the laparoscope that exceeds the predetermined force threshold of the first robot arm, such that the system automatically switches the first robot arm from the passive mode to the co-manipulation mode. When the new desired position of the laparoscope is achieved, the first robot arm may automatically switch back to the passive mode if the predefined conditions described above are met. Alternatively, to readjust the laparoscope or to reposition the links of the first robot arm to avoid potential collisions during the laparoscopic procedure or to switch the laparoscope to a different robot arm altogether, the operator may elect to decouple the laparoscope, readjust the robot arm and/or laparoscope, and reattach the laparoscope to the first robot arm (or to the other robot arm). Upon reattachment of the laparoscope to the first robot arm, the first robot arm may automatically switch to the passive mode if the predefined conditions described above are met.
Moreover, as the operator freely moves the retractor in the co-manipulation mode, e.g., prior to inserting the tip of the retractor through the trocar within the patient, if the operator moves the tip of the retractor too close to the patient's skin at a location away from the trocar port, and a virtual haptic boundary has been established by the system on the skin of the patient outside the trocar ports, the system may automatically switch to the haptic mode. Accordingly, the system may apply an impedance to the second robot arm that is much higher than the impedance applied to the second robot arm in co-manipulation mode to indicate to the operator that they are approaching or within the virtual haptic boundary. For example, movement of the retractor by the operator may feel much more viscous in the haptic mode. The system may remain in the haptic mode until the operator moves the retractor out of the virtual haptic boundary. In some embodiments, in the haptic mode, the second robot arm may reduce the effects of gravity, eliminate tremor of the instrument tip, and apply force feedback to avoid critical structures as defined by the virtual haptic boundary. Accordingly, the system does not replace the operator, but rather augments the operator's capabilities through features such as gravity compensation, tremor removal, haptic barriers, force feedback, etc.
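A minimal sketch of the increased impedance near a virtual haptic boundary, assuming for illustration a simple spherical boundary and viscous damping values that are not the system's actual parameters:

```python
# Minimal sketch, assuming a spherical "keep-out" haptic boundary defined on
# the patient's skin; parameter names and values are illustrative only.
BASE_DAMPING = 2.0     # viscous damping applied in co-manipulation mode
HAPTIC_DAMPING = 40.0  # much higher damping applied at or inside the boundary

def damping_for_tip(tip_position, boundary_center, boundary_radius):
    """Return the viscous damping to apply to the arm: normal damping outside
    the virtual haptic boundary, much higher damping at or inside it, so that
    motion feels viscous and the operator is warned away."""
    dist = sum((t - c) ** 2 for t, c in zip(tip_position, boundary_center)) ** 0.5
    return HAPTIC_DAMPING if dist <= boundary_radius else BASE_DAMPING
```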
In some embodiments, the system may switch the second robot arm to the robotic assist mode. For example, as the operator attempts to retract the tissue, if more force is required to retract the tissue than the operator is able or willing to apply to the retractor, the operator may provide user input to the system indicating that the operator wants the second robot arm to assist in the retraction of the tissue. For example, as described above, the operator may perform a predefined gestural pattern that may be detected by, e.g., optical scanner 1100, such that the system switches the second robot arm to the robotic assist mode and causes the motors of the second robot arm to move the second robot arm, and accordingly the retractor, to provide the additional force required to retract the tissue.
In addition, instead of manually manipulating the laparoscope coupled to the first robot arm as described, the operator may provide another user input to the system indicating that the operator wants the system to reposition the laparoscope. For example, if the operator is actively manipulating a surgical scissor, which may or may not be coupled to a robot arm of the system, such that the tip of the surgical scissor is within the field of view of the laparoscope coupled to the first robot arm, the operator may perform a predefined gestural pattern with the tip of the surgical scissor, e.g., moving the surgical scissor quickly back and forth in a particular direction. The predefined gestural pattern of the surgical scissor may be captured as image data by the laparoscope, and based on the image data, the system may detect and associate the predefined gestural pattern with a predefined user input requiring the system to switch the first robot arm from the passive mode to the robotic assist mode and cause the first robot arm to reposition itself, and accordingly the laparoscope, to adjust the field of view in the direction of the pattern motion of the surgical scissor. As described above, additional gestural patterns may be performed via the surgical scissor within the field of view of the laparoscope to cause the first robot arm to retract the laparoscope and/or to cause the laparoscope itself to zoom in, zoom out, or improve resolution. In some embodiments, based on the image data captured by the laparoscope, using object tracking of the additional tools in the field of view of the laparoscope, e.g., the surgical scissors actively operated by the operator, the system may cause the first robot arm coupled to the laparoscope to automatically switch to the robotic assist mode and reposition itself to adjust the field of view, to ensure that the tip of the surgical scissors remains within an optimal position within the field of view of the laparoscope during the procedure.
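One plausible way the "quick back-and-forth" gestural pattern could be recognized from tip positions tracked in the laparoscopic image data is sketched below; the projection onto a single image axis, the reversal count, and the amplitude threshold are illustrative assumptions, not the system's actual detection algorithm.

```python
# Illustrative sketch of detecting a quick back-and-forth gesture from a short
# history of tracked instrument-tip positions (image coordinates, pixels).
def detect_back_and_forth(tip_positions, min_reversals=3, min_amplitude=5.0):
    """Return True if the 1-D projected tip motion reverses direction at least
    `min_reversals` times with an amplitude above `min_amplitude` pixels."""
    if len(tip_positions) < 3:
        return False
    xs = [p[0] for p in tip_positions]           # project onto one image axis
    if max(xs) - min(xs) < min_amplitude:
        return False
    deltas = [b - a for a, b in zip(xs, xs[1:]) if b != a]
    reversals = sum(1 for d0, d1 in zip(deltas, deltas[1:]) if d0 * d1 < 0)
    return reversals >= min_reversals
```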
The operational mode of any one of the robot arms may be changed independent of the operational mode of the other robot arms of the system. In addition, the operational parameters of each robot arm may be tailored to the specific surgical instrument coupled thereto. For example, the predetermined force threshold for the robot arm coupled to the retractor device may be higher than the predetermined force threshold for the robot arm coupled to the laparoscope, as the retractor will endure higher forces during the procedure. The sensors, motors, etc. of the system may be active in all modes, but may act very differently in each mode, e.g., including acting as if inactive. As will be understood by a person having ordinary skill in the art, the system may include more than two robot arms, such that the operator may couple a third surgical instrument, e.g., a grasper device, to a third robot arm and a fourth surgical instrument, e.g., a surgical scissor device, to a fourth robot arm for operation during the laparoscopic procedure.
In some embodiments, the operational mode of a robot arm may be changed responsive to user input provided by the operator. For example, the operator may selectively change the operational mode of the robot arm by actuating a button, dial, or switch located on the robot arm, a foot pedal or foot switch, voice command, an input on a touchscreen, or using gestures or force signatures as described above. In some embodiments, the operational mode of a robot arm may be changed based only on the coupling of the surgical instrument to the coupler interface via the coupler body. As described above, the system may automatically identify the surgical instrument based on the coupling of the coupler body to the coupler interface. Accordingly, based on the identity of the surgical instrument coupled to the robot arm, the system may automatically switch the operational mode of the robot arm to a predetermined operational mode, e.g., passive mode if the surgical instrument is an endoscope, or if the robot arm is already in the passive mode, the system will remain in the passive mode upon coupling of the endoscope with the robot arm.
Similarly, based on the identity of the surgical instrument upon attachment of the surgical instrument to the robot arm, the system may automatically switch the operational mode of the robot arm to the co-manipulation mode, e.g., if the surgical instrument identity indicates that it is a tool that will be actively operated by the operator during the laparoscopic procedure. Additionally, based on the identity of the surgical instrument upon attachment of the surgical instrument to the robot arm, the system may automatically switch the operational mode of the robot arm to the robotic assist mode, e.g., if the surgical instrument identity indicates that it is a tool that the operator desires to be completely robotically controlled, such as an irrigation device. Accordingly, upon attachment of the irrigation device to the robot arm, the system will switch to the robotic assist mode and cause the robot arm to position the irrigation device in the desired position within the body.
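The attachment-driven mode selection described in the preceding two paragraphs could be represented, purely for illustration, as a lookup from instrument identity to a default operational mode; the instrument names and mode strings below are hypothetical.

```python
# Hypothetical mapping from the identified instrument type to the default
# operational mode selected on attachment, per the behavior described above.
DEFAULT_MODE_BY_INSTRUMENT = {
    "endoscope": "passive",          # hold the camera still once attached
    "retractor": "co-manipulation",  # actively handled by the operator
    "scissors": "co-manipulation",
    "irrigator": "robotic-assist",   # fully robot-positioned
}

def mode_on_attachment(instrument_type, current_mode):
    """Return the operational mode to enter when an instrument is coupled."""
    target = DEFAULT_MODE_BY_INSTRUMENT.get(instrument_type, current_mode)
    # If the arm is already in the target mode (e.g., passive when an
    # endoscope is attached), simply remain in that mode.
    return current_mode if current_mode == target else target
```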
Moreover, the system may be instructed by the operator, e.g., via user interface 1408, to operate the robot arm in fewer than the four operational modes discussed above. For example, the operator may deactivate any one of the operational modes for a given procedure. In some embodiments, the system may cause the robot arm to operate in an additional operational mode, such as a locking mode, which may be similar to the passive mode, except that the predetermined force threshold of the robot arm to switch out of the passive/locking mode may be so high that the robot arm is effectively frozen, so as to protect the robot arm from inadvertently switching out of the passive/locking mode, e.g., to avoid movement due to inadvertent bumps of the robot arm. In this locking mode, if the force from the inadvertent bump is sufficiently high to cause even a slight movement of the robot arm, the system may cause the robot arm to reposition itself to the position it was in prior to the inadvertent bump.
In addition, when no surgical instrument is coupled to the distal end of a robot arm of the system, the system is still capable of automatically switching the operational modes of the robot arm responsive to movement of the robot arm by an operator upon detection of the predefined conditions described above. Accordingly, the system will apply an impedance to the joints of the robot arm to compensate for the mass of the robot arm such that the robot arm may remain in a static position when in the passive mode, and will permit the robot arm to be freely moved by the operator in the co-manipulation mode if the system detects that the force applied to the robot arm by the operator exceeds the predetermined force threshold of the robot arm. Additionally, the system will switch the robot arm to the haptic mode if the operator attempts to move any portion of the robot arm within a predefined virtual haptic barrier.
At step 1514, when the laparoscopic procedure is complete, the operator may remove the surgical instruments from the respective robot arms.
Referring now to
As shown in
Where Feff is the force at the distal end of robot arm 300 (e.g., the “end-effector force” of robot arm 300), W is the weight vector of the surgical instrument (=−mgz), and Ftr is the trocar force. Accordingly, Feff is the desired force sent to the system, which is the sum of all the forces generated in the algorithm pipeline including, e.g., gravity compensation, hold, etc.
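Although the referenced relation itself is not reproduced here, a plausible reconstruction, assuming the instrument is treated as quasi-statically balanced so that the end-effector force, the instrument weight, and the trocar force sum to zero (the sign convention is an assumption), is:

$$\vec{F}_{eff} + \vec{W} + \vec{F}_{tr} = \vec{0}, \qquad \vec{W} = -mg\,\hat{z},$$

so that, under this assumption, the trocar force may be estimated as $\vec{F}_{tr} = -(\vec{F}_{eff} + \vec{W})$.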
As shown in
Here, distances D1 and D3 are known as described above, and D2 may be derived based on the known position of the distal end of robot arm 300 and the calculated position of trocar Tr. As shown in
As described above, the system may alert the operator if the forces, e.g., force Ftt applied to the tip of the instrument and/or force Ftr applied by the instrument at the trocar, are greater than the respective threshold forces, and accordingly may freeze the system if the calculated force is greater than the threshold force, and/or reduce the force exerted at the trocar point at the body wall or at the tip of the instrument by automatically applying brakes or stopping forces to robot arm 300, by slowing or impeding further movement of the instrument in the direction that would increase the forces applied at the tip of the instrument or the trocar, and/or by automatically moving the robot arm in a direction that reduces the force being exerted at the instrument tip and/or at the trocar point at the body wall.
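A minimal sketch of the supervision behavior described above, with hypothetical threshold values and hypothetical alert/brake callbacks standing in for the system's actual alerting and braking mechanisms:

```python
# Minimal sketch of the force-monitoring behavior described above; the
# threshold values and the `alert`/`brake` hooks are assumptions.
TIP_FORCE_LIMIT_N = 8.0
TROCAR_FORCE_LIMIT_N = 15.0

def supervise_forces(f_tip_n, f_trocar_n, alert, brake):
    """Alert the operator and brake the arm if either the estimated tip force
    or the estimated trocar force exceeds its threshold."""
    if f_tip_n > TIP_FORCE_LIMIT_N:
        alert("Instrument tip force limit exceeded")
        brake()
    if f_trocar_n > TROCAR_FORCE_LIMIT_N:
        alert("Trocar force limit exceeded")
        brake()
```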
Referring now to
Referring to
Referring now to
In addition,
Referring now to
Non-real-time computer 2302 further may provide user feedback 2312 to the user via user interface 2314. User feedback may include, e.g., collision notifications, positioning information and/or recommendations regarding the various components of the system, the operational mode that has been detected by the system, etc. Non-real-time computer 2302 further may provide commands 2318, e.g., high level commands, to real-time computer 2308. High-level commands may include, e.g., mode changes, trajectories, haptic barriers, user configurations, etc. Real-time computer 2308 may include robot controller 2320 programmed to provide robot commands 2322, e.g., motion or force commands, to the one or more robot arms 2324, e.g., robot arms 300. Robot controller 2320 may receive robot feedback data 2326, e.g., motion, force, and/or touchpoint data, etc., from the one or more robotic arms 2324.
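For illustration, the commands and feedback exchanged between the non-real-time and real-time layers described above might be represented with simple typed structures such as the following; the field names are assumptions and do not reflect the system's actual interfaces.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class HighLevelCommand:
    """Illustrative non-real-time -> real-time message (cf. commands 2318)."""
    mode_change: Optional[str] = None                    # e.g., "passive"
    trajectory: List[tuple] = field(default_factory=list)
    haptic_barriers: List[dict] = field(default_factory=list)
    user_configuration: Dict[str, str] = field(default_factory=dict)

@dataclass
class RobotFeedback:
    """Illustrative robot arm -> controller message (cf. robot feedback 2326)."""
    joint_positions_rad: List[float] = field(default_factory=list)
    joint_torques_nm: List[float] = field(default_factory=list)
    touchpoint_active: bool = False
```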
Referring now to
In some embodiments, the system may collect data during a procedure indicative of at least one of operator hand tremor, distance/minimum path travelled by the instrument tip, time to achieve procedure steps, and/or time to complete the procedure, and compare such data with threshold or predefined values for each of the factors to determine whether a magnitude of any one of the factors has reached a level sufficient to cause the system to warn the operator and/or sufficient to cause the system to adjust one or more operating parameters to mitigate the user's fatigue. For example, the system may eliminate or reduce tremor of the instrument tip by exerting forces on the instrument to increase the impedance or viscosity of the instrument, to avoid critical structures, and/or to apply force feedback. User fatigue may be identified when, for example, a procedure time increases beyond a threshold value for a particular procedure, the number of movements of the surgical instrument increases beyond a threshold value for a particular procedure or otherwise indicates errant or uncontrolled movements, if an operator moves an instrument into a haptic barrier a predefined number of times, if an operator exerts an excessive force on the trocar one or a predetermined number of times, etc. As described above, such data may be collected using the sensors on the robot arms and/or one or more optical scanning devices. When a particular level of user fatigue is identified by the system, the system may increase a viscosity or impedance of the instrument and/or the robot arm associated with the instrument to reduce a magnitude of movements and/or a number of movements of the surgical instrument and/or the robot arm.
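A hedged sketch of the fatigue check described above; the metric names and threshold values are purely illustrative stand-ins for the factors and predefined values mentioned in the text.

```python
# Illustrative fatigue metrics and thresholds; not the system's actual values.
FATIGUE_THRESHOLDS = {
    "tremor_amplitude_mm": 1.5,
    "path_length_ratio": 2.0,       # distance travelled / minimum path
    "haptic_barrier_hits": 5,
    "excessive_trocar_forces": 3,
}

def detect_fatigue(metrics):
    """Return the list of metrics whose measured value exceeds its threshold;
    a non-empty list may trigger a warning and/or increased viscosity."""
    return [name for name, limit in FATIGUE_THRESHOLDS.items()
            if metrics.get(name, 0) > limit]
```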
Additionally, the system may collect data regarding the speed and frequency with which the operator moves the various instruments/laparoscopes, along with estimates of how much tremor is involved in the movements, and estimate the added viscosity required to reduce tremors while not hindering the operator's motions or adding unnecessary fatigue. In some embodiments, a controller of robot arm 300 may iteratively adjust a viscosity value for a particular instrument, collect data related to the movement of the instrument, and assess whether an additional adjustment to the viscosity applied to the instrument is needed. Moreover, the system may use additional algorithms to adopt an iterative approach to optimizing a particular operational characteristic or parameter of robot arm 300, including collecting data related to the operational characteristic or parameter, changing the operational characteristic or parameter, collecting additional data, and analyzing the data to determine whether further changes should be made, e.g., based on deviations between the actual data values and preferred or optimal values of the operational characteristic or parameter.
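The iterative viscosity adjustment could be sketched, under the assumption of a simple proportional update rule (not the system's actual algorithm), as follows:

```python
# Minimal sketch of iterative viscosity tuning; the update rule, gain, and
# bounds are hypothetical.
def adjust_viscosity(viscosity, measured_tremor_mm, target_tremor_mm=0.5,
                     gain=0.2, v_min=0.5, v_max=10.0):
    """Nudge the applied viscosity up when tremor exceeds the target and down
    when motion is already smooth, staying within safe bounds."""
    error = measured_tremor_mm - target_tremor_mm
    new_viscosity = viscosity + gain * error
    return max(v_min, min(v_max, new_viscosity))
```

Each iteration of such a loop would call the helper with the latest tremor estimate and apply the returned viscosity to the arm before collecting the next batch of movement data.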
Referring now to
Moreover, centralizing procedure data may enable the running of large data analytics on a wide range of clinical procedures coming from different users. Analysis of the data may result in optimized settings for a specific procedure, including, e.g., optimized system positioning, optimal port placement, optimal algorithm settings for each robot arm, and/or detection of procedure abnormalities (e.g., excessive force, time, bleeding, etc.). These optimal settings or parameters may depend on patient and tool characteristics. As described above, a surgeon may load and use optimal settings from another surgeon or group of surgeons. This way, an optimal setup may be achieved depending on, e.g., the surgeon's level of expertise. To keep track of the various users in the distributed network of cobot systems, it may be beneficial to identify each user. As such, the user may log into the cobot system and access their profile online as necessary. This way, the user may have access to their profile anywhere and will be able to perform a clinical procedure with their settings at a different hospital location.
An example user profile may contain the user's specific settings and information, including, e.g., username, level of expertise, procedures performed, and/or region of clinical practice. In addition, the user may store procedure-specific settings such as the clinical procedure type (e.g., cholecystectomy, hernia, etc.), table orientation and height, preferred port placement, settings per assistant arm for each algorithm, patient characteristics (e.g., BMI, age, sex), and/or surgical tool characteristics and specifications (e.g., weight, length, center of gravity, etc.). The user may be able to enable his or her own profile, and optionally may enable another user's profile, such as the profile of a peer, the most representative profile of a surgeon in the user's area of practice, the most representative profile of a surgeon with a specific level of expertise, and/or the recommended profile according to patient characteristics.
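As one possible illustration only, a user profile of the kind described above might be modeled as a simple data structure; the field names and example values are hypothetical and not the system's actual schema.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserProfile:
    """Illustrative user profile; field names are assumptions."""
    username: str
    level_of_expertise: str                  # e.g., "resident", "attending"
    region_of_practice: str
    procedures_performed: List[str] = field(default_factory=list)
    procedure_settings: Dict[str, dict] = field(default_factory=dict)
    # e.g., procedure_settings["cholecystectomy"] = {
    #     "table_orientation": "reverse Trendelenburg",
    #     "port_placement": "umbilical + 3 lateral",
    #     "arm_settings": {...},
    #     "patient_characteristics": {"bmi": 27, "age": 54, "sex": "F"},
    #     "tool_specs": {"retractor": {"weight_g": 120, "length_mm": 330}},
    # }
```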
The identification of a user may be performed via password, RFID key, facial recognition, etc. Learning from a large number of procedures may result in a greater level of optimization of the cobot system setup for a given procedure. This may include, e.g., cart position, individual robot arm position, surgical table height and orientation, port placement, and/or setup joints position. These settings may be based on patient height, weight, and sex, and further may be interdependent. For example, the optimal port placement may depend on patient table orientation.
Additionally, a clinical procedure may be described as a sequence of clinical procedure steps. Learning these different steps may allow the cobot system to infer in real time the current step of a given procedure. For example, learning clinical steps from procedures may enable adjustment of algorithm settings, practical custom reminders to the user, notification of staff of an estimated procedure end time, alerts to staff if necessary equipment is not available in the room, and/or alerts to staff of the occurrence of an emergency situation.
During a clinical procedure, the surgeon will often perform simple and routine surgical tasks such as grasping, retracting, cutting, etc. Learning these different tasks may allow the cobot system to infer, in real time, the preferences and habits of the surgeon regarding a sequence of the procedure. Some algorithms of the cobot system may be tuned (i.e., adjusted and optimized) during the procedure based on this sequence recognition to help the user perform these simple surgical tasks more effectively. An example of such a task is the automated retraction of the liver during a gallbladder procedure. By aggregating the information over many cases, optimized force vectors may be developed.
Further, some complications may occur during a clinical procedure that may result in unexpected steps or surgical acts. Learning how to discriminate these unexpected events would help the cobot system enable specific safety features. In case of emergency, the robot arms may be stopped or their motion restricted depending on the level of emergency detected by the system.
Referring now to
As platform 2700 is being moved toward the patient, the scene may be directly observed by a depth mapping sensor, e.g., optical scanner 1100′, which may be mounted on platform 2700. From the depth maps observed and generated by optical scanner 1100′, key features may be identified such as, for example, the height and/or location of patient table PT, the surface of the patient's abdomen, the position and other characteristics of the surgeon, including the surgeon's height, the trocar port(s), the bases of robot arms 300a′, 300b′, e.g., base portions 302a′, 302b′ and shoulder portions 304a′, 304b′, robot arms 300a′, 300b′, and/or one or more surgical instruments coupled with the robot arms. Identification of such key features may be carried out using standard computer vision techniques such as template matching, feature tracking, edge detection, etc. As each feature is registered, its position and orientation may be assigned a local co-ordinate system and transformed into the global co-ordinate system of the system using standard transformation matrices. Once all features are transformed into a single global co-ordinate system, an optimization algorithm, e.g., least squares or gradient descent, may be used to identify the most appropriate vertical and horizontal positions of robot arms 300a′, 300b′, which may be adjusted via platform 2700, to maximize the workspace of the robot arms with respect to the insertion point on the patient. The optimal workspace may be dependent on the surgical operation to be performed and/or the surgeon's preferred position.
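A minimal sketch of the registration and placement optimization described above, assuming each feature comes with a 4x4 homogeneous transform into the global frame and simplifying the workspace objective to the distance between a nominal workspace center and the trocar point; the helper names and the grid search over candidate heights are illustrative assumptions, not the system's actual optimization.

```python
import numpy as np

def to_global(T_local_to_global, point_local):
    """Transform a 3-D point into the global co-ordinate system using a 4x4
    homogeneous transformation matrix."""
    p = np.append(np.asarray(point_local, dtype=float), 1.0)
    return (np.asarray(T_local_to_global, dtype=float) @ p)[:3]

def best_base_height(trocar_global, candidate_heights, reach_center_offset):
    """Grid-search the vertical base position that places the arm's nominal
    workspace center closest to the trocar (a stand-in for maximizing the
    usable workspace around the insertion point)."""
    trocar = np.asarray(trocar_global, dtype=float)
    offset = np.asarray(reach_center_offset, dtype=float)
    costs = [np.linalg.norm(trocar - (np.array([0.0, 0.0, h]) + offset))
             for h in candidate_heights]
    return candidate_heights[int(np.argmin(costs))]
```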
As shown in
Referring now to
In addition, the operator may adjust the vertical and horizontal position of each robot arm, as shown in
As shown in
Referring now to
As shown in
Thus, by actuating the constant tension mode, the system may continuously measure the force applied to surgical instrument SI1 by organ O, e.g., via motor current measurements of robot arm 300, and upon determining that the force applied to surgical instrument SI1 by organ O falls outside of a predetermined constant tension force threshold, e.g., an acceptable force range based on constant tension force Fconst, at a second time after the first time, the system may cause robot arm 300 to move surgical instrument SI1 in a direction to apply and maintain constant tension force Fconst on organ O via surgical instrument SI1. Accordingly, when the force applied to surgical instrument SI1 is equal to constant tension force Fconst (or within the predetermined constant tension force threshold), the system may cause robot arm 300 to remain in a static position at the position relative to organ O that maintains the constant tension on organ O. Thus, the constant tension mode may allow a surgeon to operate more efficiently and faster, as the stability of the retractor is a sine qua non condition for efficiency, and because organs may slide very easily on the retractor as soon as the user releases it.
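A minimal sketch of one control cycle of the constant tension behavior, assuming a fixed correction step along the pull direction and a travel limit corresponding to the movement threshold discussed in the next paragraph; all parameter values and names are hypothetical.

```python
import numpy as np

F_TOLERANCE_N = 0.5        # acceptable band around the captured tension force
STEP_MM = 1.0              # incremental correction per control cycle
MAX_TRAVEL_MM = 30.0       # illustrative constant tension movement threshold

def constant_tension_step(f_measured_n, f_const_n, pull_direction, travelled_mm):
    """Return the incremental tip displacement (mm, along the pull direction)
    that nudges the measured tension back toward the captured constant value,
    or a zero vector if the force is within tolerance or the travel limit has
    been reached."""
    error = f_const_n - f_measured_n
    if abs(error) <= F_TOLERANCE_N or travelled_mm >= MAX_TRAVEL_MM:
        return np.zeros(3)                     # hold position
    direction = np.asarray(pull_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    # Pull slightly more when tension has dropped, relax when it has risen.
    return np.sign(error) * STEP_MM * direction
```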
Moreover, the system may include a constant tension movement threshold in the constant tension mode, such that robot arm 300 may not cause surgical instrument SI1 to move beyond a predetermined distance from the initial position in the constant tension mode while attempting to maintain the constant tension force Fconst applied to organ O by surgical instrument SI1, to thereby avoid excessive or insufficient push/pull forces applied by surgical instrument SI1 and prevent surgical instrument SI1 from damaging nearby anatomical structures or inadvertently being withdrawn through trocar Tr1. In some embodiments, the system may generate and emit/display an alert if the tip of surgical instrument SI1 is within a predetermined distance of trocar Tr1, and/or the system may apply increased impedance to robot arm 300 when surgical instrument SI1 is within the predetermined distance of trocar Tr1. Accordingly, the system may cause robot arm 300 to cease application of the constant tension force to organ O via surgical instrument SI1 when surgical instrument SI1 is within the predetermined distance of trocar Tr1. Additionally, or alternatively, the system may apply a haptic boundary localized around the surgical site within the patient's body, such that increased impedance is applied to robot arm 300 when surgical instrument SI1 approaches the haptic boundary to prevent surgical instrument SI1 from contacting or otherwise damaging nearby anatomical structures. For example, the increased impedance may be sufficient to maintain surgical instrument SI1 in a static position.
In some embodiments, the system may determine that dissection of organ O is complete, e.g., when the piece of organ O grasped by surgical instrument SI1 is released/detached from the rest of organ O, via laparoscopic video data received from a laparoscope (not shown) having a field of view of the surgical site within the patient's body, and accordingly, the system may automatically switch robot arm 300 out of the constant tension mode, e.g., to passive mode, to thereby maintain robot arm 300 in a static position when the dissection is observed to be complete, such that robot arm 300 ceases applying the constant tension force to organ O. Additionally, or alternatively, the system may determine that dissection of organ O is complete by monitoring the force applied to surgical instrument SI1 by organ O. For example, the system may determine that dissection of organ O is complete upon determination that a rate of change in the force applied to surgical instrument SI1 by organ O exceeds a predetermined rate threshold.
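The force-based completion check could be sketched, for illustration only, as a rate-of-change test on the measured force signal; the threshold value is an assumption.

```python
# Illustrative sketch: a sudden drop or jump (high rate of change) in the
# tissue force is treated as the release of the grasped piece of tissue.
def dissection_complete(force_history_n, dt_s, rate_threshold_n_per_s=10.0):
    """Return True if the most recent force change exceeds the rate threshold,
    suggesting the retracted tissue has been released/detached."""
    if len(force_history_n) < 2:
        return False
    rate = abs(force_history_n[-1] - force_history_n[-2]) / dt_s
    return rate > rate_threshold_n_per_s
```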
In some embodiments, rather than moving robot arm 300 every time the force applied to surgical instrument SI1 by organ O falls outside of the predetermined constant tension force threshold, the system may be programmed with two or more preset configurations of robot arm 300 in the constant tension mode. For example, the system may establish the initial position of surgical instrument SI1 when applying constant tension force Fconst to organ O as a first preset configuration, and may be configured to move robot arm 300, and accordingly surgical instrument SI1, from the first preset configuration to a second preset configuration, e.g., a predetermined distance from the first preset configuration in a predetermined direction, such that surgical instrument SI1 applies and maintains an acceptable force to organ O, upon detection of one or more predefined conditions. For example, a predefined condition may be detected based on laparoscopic video data indicating that the surgical procedure, e.g., dissection of organ O, has reached a predefined phase. Alternatively, the predefined condition may be detected based on user input, determination that the force applied to surgical instrument SI1 by organ O falls outside of another predetermined constant tension force threshold, and/or determination that a rate of change in the force applied to surgical instrument SI1 by organ O exceeds a predetermined rate threshold, e.g., indicating a complete dissection of organ O. As will be understood by a person having ordinary skill in the art, the system may be programmed with more than two preset configurations, each preset configuration triggered by the detection of one or more predefined conditions. Moreover, one or more preset configurations may be established via machine learning algorithms trained via a database having data indicative of previous positions of the same type of surgical instrument by other users during similar phases of the same surgical procedure. Accordingly, as a user may adjust the position of surgical instrument SI1 at any time while robot arm 300 is in any operational mode, any adjustments made by the user may be recorded and used to further train the machine learning algorithms to further refine the one or more preset configurations.
In some embodiments, surgical instruments having one or more force sensors may be utilized with the robot arm, such that data received from the force sensors indicative of the amount of force applied to the surgical instrument by an anatomical structure may serve as the input for the system to determine whether the force applied to the surgical instrument falls outside of the predetermined constant tension force threshold. In some embodiments, when the surgical scissors are coupled to a second robot arm, the system may cause the second robot arm to move the surgical scissors to facilitate dissection of organ O, based on force measurements of robot arm 300 indicating that a dissection of organ O is occurring, laparoscopic video data indicating the same, and/or learned intensity and direction of tension forces applied to the surgical scissors or the retractor by organ O during the procedure. Moreover, as will be understood by a person having ordinary skill in the art, multiple surgical instruments, each coupled to a respective robot arm, may be used in the constant tension mode to apply and maintain a respective constant force on the target anatomical structure, which may be specific to the respective surgical instrument.
In another example, e.g., a cholecystectomy, the constant tension mode may be used during the phase of the cholecystectomy where the gallbladder is placed in a bag and needs to be removed from the body through an umbilical port. For example, when the bag is not able to pass through the trocar opening, the bag may be pulled halfway through this opening such that the gallbladder stone(s) inside the bag may be broken into smaller pieces while within the patient's body. During fragmentation of the gallbladder stone(s), the robot arm(s) coupled to the surgical instrument(s) holding the bag may be switched to the constant tension mode such that a constant tension force may be applied to and maintained on the bag via the surgical instrument(s). The constant tension mode may be useful in other surgical procedures including, for example, treatment of large ovaries, an appendix enlarged by infection, enlarged tumoral nodes after an abdominal, pelvic, or thoracic lymphadenectomy, etc.
Some implementations of the systems described herein may be configured to be controlled or manipulated remotely, e.g., via a joystick or other suitable remote control device, computer vision algorithm, force measuring algorithm, and/or by other means. However, in a preferred embodiment, the systems described herein operate without any teleoperation, e.g., the robot arm is not teleoperated via a remote surgeon console separate from the robot arm, but instead the robot arm moves in response to movement applied to the surgical instrument coupled thereto. Any robot-assisted movements applied to the surgical instrument by the system, e.g., in the robotic assist mode, are not responsive to user input received at a remote surgeon console.
While various illustrative embodiments of the invention are described above, it will be apparent to one skilled in the art that various changes and modifications may be made therein without departing from the invention. The appended claims are intended to cover all such changes and modifications that fall within the true scope of the invention.
Foreign priority data: EP Application No. 22305572.4, filed April 2022 (regional).
PCT filing data: PCT/IB2023/053972, filed April 18, 2023 (WO).