This disclosure relates to surgical robotic systems and more particularly to positioning a camera to capture images inside a body cavity of a patient during a medical procedure.
Miniaturized cameras are used during investigative medical procedures and surgical procedures such as laparoscopic surgery to produce images of a site of the procedure within a body cavity of the patient. The camera generally has a field of view that captures only a portion of the body cavity of the patient and may have a positioning mechanism for orienting the camera to change the portion of the body cavity within the field of view.
In accordance with one disclosed aspect there is provided a method for positioning a camera to capture images inside a body cavity of a patient during a medical procedure. The method involves receiving location information at a controller of a surgical system performing the medical procedure, the location information defining a location of at least one tool with respect to a body cavity frame of reference, and in response to receiving an align command signal at the controller, causing the controller to produce positioning signals operable to cause the camera to be positioned within the body cavity frame of reference to capture images of the at least one tool for display to an operator of the surgical system.
Movement of the at least one tool within the body cavity may be caused by movement signals produced by the controller based on kinematic calculations and receiving location information may involve using results of the kinematic calculations to determine the location of the at least one tool.
The method may involve receiving location signals at the controller, the location signals being indicative of an actual location of the at least one tool within the body cavity frame of reference.
Causing the controller to produce positioning signals may involve causing the controller to produce positioning signals operable to cause the camera to be positioned such that a field of view of the camera is disposed to cause a reference point associated with the at least one tool to be centered within the captured images.
The reference point may include a point on the at least one tool proximate a distal end of the at least one tool.
Receiving location information may involve receiving location information defining locations of a plurality of tools with respect to a body cavity frame of reference, and the reference point may include a point disposed in-between respective distal ends of the plurality of tools.
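Although the disclosure does not prescribe a particular computation, one way to realize a reference point disposed in-between the respective distal ends of a plurality of tools is the centroid of the tip positions. The following is an illustrative Python sketch only; the `reference_point` helper and the coordinate values are assumptions, not part of the disclosed system:

```python
# Sketch: a reference point in-between the distal ends of several tools,
# computed as the centroid of the tip positions (illustrative helper).
# Each tip position is a 3-vector in the body cavity frame of reference.

def reference_point(tip_positions):
    """Return the centroid of the given tool tip positions."""
    n = len(tip_positions)
    return tuple(sum(p[i] for p in tip_positions) / n for i in range(3))

# Two tool tips: the reference point lies halfway between them.
tips = [(10.0, 0.0, 50.0), (30.0, 0.0, 50.0)]
print(reference_point(tips))  # -> (20.0, 0.0, 50.0)
```

For two tools this reduces to the midpoint of the two distal tips, which is the case described above.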
The method may involve receiving operator input of a desired offset between the reference point and the center of the field of view of the camera and causing the controller to produce positioning signals may involve causing the controller to produce positioning signals operable to cause the camera to be positioned such that a field of view of the camera is disposed offset from the reference point by the desired offset within the captured images.
The method may further involve, while the align command signal is being received at the controller and in response to receiving operator input from an input device configured to generate input signals for controlling movement of the at least one tool, causing the controller to continue to produce movement signals for causing movement of the at least one tool while simultaneously producing camera positioning signals operable to cause the camera to follow the at least one tool within the body cavity frame of reference.
Receiving the align command signal may involve causing the controller to determine whether a camera align control has been activated by the operator.
The camera align control may involve one or more of a finger actuated switch and a foot actuated switch.
Receiving the align command signal may involve causing the controller to determine whether at least one of a pattern of movement of the at least one tool has been received at an input device configured to generate input signals for controlling movement of the at least one tool, and a pattern of movement of an end effector disposed at a distal tip of the at least one tool has been received from an input device configured to generate input signals for controlling movement of the end effector.
Receiving the align command signal may involve causing the controller to determine whether at least one of a reference point associated with the at least one tool is disposed outside of a defined central region within the captured image, and a reference point associated with a currently active one of a plurality of tools is disposed outside of a defined central region within the captured image.
In accordance with another disclosed aspect there is provided an apparatus for positioning a camera to capture images inside a body cavity of a patient during a medical procedure performed by a surgical system. The apparatus includes a controller operably configured to receive location information, the location information defining a location of at least one tool with respect to a body cavity frame of reference. The controller is configured to produce positioning signals operable to cause the camera to be positioned within the body cavity frame of reference to capture images of the at least one tool for display to an operator of the surgical system in response to receiving an align command signal.
Movement of the at least one tool within the body cavity may be caused by movement signals produced by the controller based on kinematic calculations and the controller may be configured to receive location information by using results of the kinematic calculations to determine the location of the at least one tool.
The controller may be configured to receive location signals, the location signals being indicative of an actual location of the at least one tool within the body cavity frame of reference.
The controller may be configured to produce positioning signals operable to cause the camera to be positioned such that a field of view of the camera is disposed to cause a reference point associated with the at least one tool to be centered within the captured images.
The reference point may include a point on the at least one tool proximate a distal end of the at least one tool.
The controller may be configured to receive location information defining locations of a plurality of tools with respect to a body cavity frame of reference, and the reference point may include a point disposed in-between respective distal ends of the plurality of tools.
The controller may be configured to receive operator input of a desired offset between the reference point and the center of the field of view of the camera and the controller may be further configured to cause the camera to be positioned such that a field of view of the camera is disposed offset from the reference point by the desired offset within the captured images.
While the align command signal is being received at the controller and while receiving operator input from an input device configured to generate input signals for controlling movement of the at least one tool, the controller may be configured to continue to produce movement signals for causing movement of the at least one tool while simultaneously producing camera positioning signals operable to cause the camera to follow the at least one tool within the body cavity frame of reference.
The align command signal may be produced in response to determining that a camera align control has been activated by the operator.
The camera align control may include one or more of a finger actuated switch and a foot actuated switch.
The align command signal may be produced in response to the controller determining whether at least one of a pattern of movement of the at least one tool has been received at an input device configured to generate input signals for controlling movement of the at least one tool, and a pattern of movement of an end effector disposed at a distal tip of the at least one tool has been received from an input device configured to generate input signals for controlling movement of the end effector.
The align command signal may be produced by the controller when at least one of a reference point associated with the at least one tool is disposed outside of a defined central region within the captured image, and a reference point associated with a currently active one of a plurality of tools is disposed outside of a defined central region within the captured image.
Other aspects and features will become apparent to those ordinarily skilled in the art upon review of the following description of specific disclosed embodiments in conjunction with the accompanying figures.
In drawings which illustrate disclosed embodiments,
Referring to
The instrument 106 and the drive unit 108 are shown in more detail in
In the embodiment shown the end effector 212 is a pair of forceps having opposing moveable gripper jaws 220 controlled by the associated tool drive for manipulating tissue, while the end effector 218 is a pair of curved dissecting forceps controlled by the associated tool drive for also manipulating tissue. The instrument 106 also includes a camera 222 deployed on an articulated arm 224 that is able to pan, elevate, and tilt the camera. In this embodiment the camera 222 includes a pair of spaced apart image sensors 226 and 228 for producing stereoscopic views of the surgical workspace. The camera 222 is initially positioned in-line with the insertion tube 202 prior to insertion through the incision and then deployed as shown at 206. The tools 208 and 214 are also initially positioned in-line with the insertion tube 202 prior to installation and insertion through the insertion tube and then deployed as shown at 206.
The instrument 106 without the tools 208 and 214 installed is shown in more detail in
Movement of the plurality of connected linkages 300 is actuated by drivers (not shown) housed within the drive unit 108 (shown in
Referring back to
The workstation 102 also includes a workstation processor circuit 120, which is in communication with the input devices 116 and 118 and the hand controllers 112 and 114 for receiving input from a surgeon. The instrument cart 104 also includes an instrument processor circuit 130 for controlling the instrument 106. The workstation processor circuit 120 and instrument processor circuit 130 act as controllers for controlling operations of the system 100. In this embodiment the instrument processor circuit 130 is in communication with the workstation processor circuit 120 via an interface cable 132 for transmitting signals between the workstation processor circuit 120 and the instrument processor circuit 130. In other embodiments communication between the workstation processor circuit 120 and the instrument processor circuit 130 may be wireless or via a computer network, and the workstation 102 may even be located remotely from the instrument cart 104.
The workstation 102 also includes a display 122 in communication with the workstation processor circuit 120 for displaying real time images and/or other graphical depictions of the surgical workspace. In this embodiment where the camera 222 includes the pair of spaced apart image sensors 226 and 228, the display 122 is configured to provide separate 2D stereoscopic views of the surgical workspace that provide a 3D depth effect when viewed through suitable stereoscopic spectacles worn by the surgeon.
The workstation 102 also includes a footswitch 134, which is actuable by the surgeon to provide an enablement signal to the workstation processor circuit 120. The workstation 102 also includes a plurality of footswitches 136, which are actuable by the right foot of the surgeon and provide input signals to the workstation processor circuit 120 for controlling the instrument 106.
Input signals are generated by the left and right input devices 116 and 118 in response to movement of the hand controllers 112 and 114 by a surgeon within an input device workspace associated with the left and right input devices. The manipulators 210 and 216 associated with the tools 208 and 214 spatially position the end effectors 212 and 218 of the respective tools 208 and 214 in the surgical workspace in response to the input signals.
A block diagram of the processor circuit elements of the system 100 is shown in
In this embodiment the input device 110 communicates using a USB protocol and the USB interface 254 receives input signals produced by the input device in response to movements of the hand controllers 112 and 114. The microprocessor 250 processes the input signals based on a current mapping between the input device workspace and the surgical workspace and causes the motion control interface 258 to transmit control signals, which are conveyed to the instrument processor circuit 130 via the interface cable 132. The mapping may include a scale factor that scales movements in the input device workspace to produce scaled movements in the surgical workspace. For example, a 100 mm translation in the input device workspace may be scaled by a scale factor of 0.5 to produce a 50 mm movement in the surgical workspace for fine movement.
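The workspace scaling described above can be sketched in a few lines. This is an illustrative fragment, not the actual control code; `scale_translation` is a hypothetical helper:

```python
# Sketch: scaling a hand controller translation from the input device
# workspace into the surgical workspace. A scale factor below 1.0 yields
# fine motion, as in the 0.5 example above.

def scale_translation(delta_mm, scale_factor):
    """Scale a 3-vector translation (in mm) by the workspace scale factor."""
    return tuple(scale_factor * d for d in delta_mm)

# A 100 mm translation along x scaled by 0.5 becomes a 50 mm movement.
print(scale_translation((100.0, 0.0, 0.0), 0.5))  # -> (50.0, 0.0, 0.0)
```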
The workstation processor circuit 120 receives the footswitch signals at the input/output 256 from the footswitch 134 and the plurality of footswitches 136. The workstation memory 252 includes a current buffer 320 and a previous buffer 340 including a plurality of stores for storing values associated with the control signals, as described later herein.
The instrument processor circuit 130 includes a microprocessor 280, a memory 282, a communications interface 284, and a drive control interface 286, all of which are in communication with the microprocessor.
The microprocessor 280 receives the input signals at the communications interface 284. The microprocessor 280 processes the control signals and causes the drive control interface 286 to produce drive signals for moving the tools 208 and 214.
The workstation processor circuit 120 thus acts as a master subsystem for receiving user input, while the instrument processor circuit 130 and tools 208 and 214 act as a slave subsystem in responding to the user input.
The right input device 116 is shown in greater detail in
The right hand controller 112 is mounted to the arms 402-406 to permit positioning and rotation about orthogonal axes x1, y1 and z1 of a Cartesian reference frame. The Cartesian reference frame has an origin at a point on a body of the hand controller 112 and the location of the origin defines the hand controller position 408 (i.e. at the origin). In this embodiment, the hand controller 112 is mounted on a gimbal mount 410. The arms 402-406 confine movements of the hand controller 112 and hence the hand controller position 408 to within the hemispherical input device workspace. In one embodiment the input device 116 may also be configured to generate haptic forces for providing haptic feedback to the hand controller 112 through the arms 402-406 and gimbal mount 410.
The input device 116 has sensors (not shown) that sense the position of each of the arms 402-406 and rotation of the hand controller 112 about each of the x1, y1 and z1 axes and produces signals representing the position of the hand controller in the workspace and the rotational orientation of hand controller relative to an input device Cartesian reference frame xr, yr, zr. In this embodiment, the position and orientation signals are transmitted as input signals via a USB connection 418 to the USB interface 254 of the workstation processor circuit 120.
In this embodiment, the gimbal mount 410 has a pin 412 extending downwardly from the mount and the base 400 includes a calibration opening 414 for receiving the pin. When the pin 412 is received in the opening 414 the input device 116 is located in a calibration position that is defined relative to the input device Cartesian reference frame xr, yr, zr. The input device reference frame has an xr-zr plane parallel to the base 400 and a yr axis perpendicular to the base. The zr axis is parallel to the base 400 and is coincident with an axis 416 passing centrally through the input device 116.
The input device 116 produces current hand controller position signals and current hand controller orientation signals that represent the current position and orientation of the hand controller 112. The signals may be represented by a current hand controller position vector and a current hand controller rotation matrix. The current hand controller position vector is given by:

$$\vec{P}_{MCURR} = \begin{bmatrix} x_1 & y_1 & z_1 \end{bmatrix}^T$$

where $x_1$, $y_1$, and $z_1$ represent coordinates of the hand controller position 408 (i.e. the origin of the coordinate system x1, y1, z1) relative to the input device reference frame xr, yr, zr. The current hand controller rotation matrix is given by:

$$R_{MCURR} = \begin{bmatrix} \hat{x}_1 & \hat{y}_1 & \hat{z}_1 \end{bmatrix}$$

where the columns of the matrix represent the axes of the hand controller reference frame x1, y1, z1 relative to the input device reference frame xr, yr, zr. The matrix $R_{MCURR}$ thus defines the current rotational orientation of the hand controller 112 relative to the xr, yr, zr fixed master reference frame. The current hand controller position vector $\vec{P}_{MCURR}$ and current hand controller rotation matrix $R_{MCURR}$ are transmitted as current hand controller position and current hand controller orientation signals via the USB connection 418 to the USB interface 254 of the workstation processor circuit 120. The workstation processor circuit 120 stores the three values representing the current hand controller position vector $\vec{P}_{MCURR}$ in a store 322 and the nine values representing the current hand controller rotation matrix $R_{MCURR}$ in a store 324 of the current buffer 320 of the workstation memory 252.
The right side tool 208 is shown in greater detail in
The tool 208 includes a plurality of the identical "vertebra" 550 as described in commonly owned PCT patent application PCT/CA2013/001076 entitled "ARTICULATED TOOL POSITIONER AND SYSTEM EMPLOYING SAME" filed on Dec. 20, 2013, which is incorporated herein by reference in its entirety. The vertebra 550 are operable to move with respect to each other when control wires passing through the vertebra are extended or retracted to cause movements of the manipulator 210. The position and orientation of the end effector 212 are defined relative to a fixed slave reference frame having axes xv, yv, and zv, which intersect at a point referred to as the fixed slave reference position 552. The fixed slave reference position 552 lies on a longitudinal axis 554 of the tool 208 and is contained in a plane perpendicular to the longitudinal axis and containing a distal edge of the insertion tube 202. In one embodiment the fixed slave reference frame acts as a body cavity frame of reference.
In the embodiment shown, the gripper jaws 220 of the end effector 212 are positioned and oriented within an end effector workspace. A tip of the gripper jaws 220 may be designated as an end effector position 560 defined as the origin of an end effector Cartesian reference frame x2, y2, z2. The end effector position 560 is defined relative to the fixed slave reference position 552 and the end effector may be positioned and oriented relative to the fixed slave reference frame xv, yv, zv for causing movement of the manipulator 210 and/or the end effector 212.
The current hand controller position signal $\vec{P}_{MCURR}$ and current hand controller orientation signal $R_{MCURR}$ cause movement of the end effector 212 of the tool 208 to a new end effector position and a desired new end effector orientation, which are represented by a new end effector position vector:

$$\vec{P}_{EENEW} = \begin{bmatrix} x_2 & y_2 & z_2 \end{bmatrix}^T$$

where $x_2$, $y_2$, and $z_2$ represent coordinates of the end effector position 560 within the end effector workspace relative to the xv, yv, zv fixed slave reference frame, and by a 3×3 end effector rotation matrix:

$$R_{EENEW} = \begin{bmatrix} \hat{x}_2 & \hat{y}_2 & \hat{z}_2 \end{bmatrix}$$

where the columns of the $R_{EENEW}$ matrix represent the axes of the end effector reference frame x2, y2, z2 written in the fixed slave reference frame xv, yv, zv. $R_{EENEW}$ thus defines a new orientation of the end effector 212 in the end effector workspace, relative to the xv, yv, zv fixed slave reference frame. Values for the vector $\vec{P}_{EENEW}$ and rotation matrix $R_{EENEW}$ are calculated as described later herein and stored in stores 330 and 332 of the current buffer 320 of the workstation memory 252 respectively.
When the system 100 initially starts up, the workstation processor circuit 120 sets a master base position vector $\vec{P}_{MBASE}$ equal to the current hand controller position vector $\vec{P}_{MCURR}$ and causes a definable master base rotation matrix $R_{MBASE}$ to define an orientation that is the same as the current orientation defined by the current hand controller rotation matrix $R_{MCURR}$. At startup the following operations are therefore performed:

$$\vec{P}_{MBASE} = \vec{P}_{MCURR}, \quad \text{and} \quad R_{MBASE} = R_{MCURR}.$$
For the example of the right tool 208, the hand controller 112 reference frame represented by the axes x1, y1, and z1 shown in
At startup of the system 100 there would be no previously stored values for the new end effector position vector $\vec{P}_{EENEW}$ and the new end effector rotation matrix $R_{EENEW}$, and in one embodiment these values are set to home configuration values. A home configuration may be defined that produces a generally straight manipulator 210 for the tool 208 as shown in
At startup, the following operations are therefore performed:
$$\vec{P}_{EEBASE} = \vec{P}_{EENEW} = \vec{P}_{EEPREV}, \quad \text{and} \quad R_{EEBASE} = R_{EENEW} = R_{EEPREV}.$$
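The startup assignments above might be sketched as follows. The dictionary layout, the `startup_bases` helper, and the identity home rotation are assumptions made for illustration only; they do not correspond to the actual stores 322 through 348 of the workstation memory 252:

```python
# Sketch: startup initialization of base poses (illustrative data layout).
# The master bases latch the current hand controller pose; the end effector
# base, new, and previous poses are all set to a home configuration.

IDENTITY = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))

def startup_bases(p_mcurr, r_mcurr, p_home, r_home=IDENTITY):
    """Return the pose state established at system startup."""
    return {
        "P_MBASE": p_mcurr, "R_MBASE": r_mcurr,
        "P_EEBASE": p_home, "P_EENEW": p_home, "P_EEPREV": p_home,
        "R_EEBASE": r_home, "R_EENEW": r_home, "R_EEPREV": r_home,
    }

s = startup_bases((1.0, 2.0, 3.0), IDENTITY, (0.0, 0.0, 100.0))
print(s["P_EEBASE"] == s["P_EENEW"] == s["P_EEPREV"])  # -> True
```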
The end effector reference frame represented by the axes x2, y2, and z2 shown in
Referring to
The movement process 600 begins at block 602, which directs the microprocessor 250 to determine whether the enablement signal produced by the footswitch 134 is in an active state. If at block 602 it is determined that the footswitch 134 is currently released, the enablement signal will be in the active state and the microprocessor is directed to block 604. Block 604 directs the microprocessor 250 to read new values for the current hand controller position vector $\vec{P}_{MCURR}$ and current hand controller rotation matrix $R_{MCURR}$ from the current buffer 320 of the workstation memory 252. Block 606 then directs the microprocessor 250 to calculate new end effector position signals $\vec{P}_{EENEW}$ and new end effector orientation signals $R_{EENEW}$ representing a desired end effector position 560 and desired end effector orientation, relative to the fixed slave reference position 552 and the slave base orientation (shown in
The new end effector position signals $\vec{P}_{EENEW}$ and new end effector orientation signals $R_{EENEW}$ are calculated according to the following relations:

$$\vec{P}_{EENEW} = A(\vec{P}_{MCURR} - \vec{P}_{MBASE}) + \vec{P}_{EEBASE} \qquad \text{(Eqn 1a)}$$

$$R_{EENEW} = R_{EEBASE}\, R_{MBASE}^{-1}\, R_{MCURR} \qquad \text{(Eqn 1b)}$$

where A is the scale factor that maps translations of the hand controller 112 in the input device workspace into translations of the end effector 212 in the surgical workspace.
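The relations of Eqn 1a and Eqn 1b can be sketched in plain Python as follows. This is an illustrative fragment, not the actual controller code; it exploits the fact that the inverse of a rotation matrix equals its transpose, which keeps the sketch dependency-free:

```python
# Sketch of Eqn 1a / Eqn 1b (illustrative, not the actual firmware).
# Matrices are 3x3 tuples of tuples; rotation inverse == transpose.

def transpose(m):
    return tuple(tuple(m[r][c] for r in range(3)) for c in range(3))

def matmul(a, b):
    return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(3))
                       for j in range(3)) for i in range(3))

def new_end_effector_pose(p_mcurr, p_mbase, p_eebase,
                          r_mcurr, r_mbase, r_eebase, A=0.5):
    # Eqn 1a: hand controller translation offset from the master base,
    # scaled by A, then offset from the end effector base.
    p_eenew = tuple(A * (c - b) + e
                    for c, b, e in zip(p_mcurr, p_mbase, p_eebase))
    # Eqn 1b: relative hand controller rotation composed onto the slave base.
    r_eenew = matmul(matmul(r_eebase, transpose(r_mbase)), r_mcurr)
    return p_eenew, r_eenew

I3 = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
# With coincident bases and no rotation, a 10 mm hand move scales to 5 mm.
p, r = new_end_effector_pose((10.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                             (0.0, 0.0, 0.0), I3, I3, I3)
print(p)  # -> (5.0, 0.0, 0.0)
```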
Block 608 then directs the microprocessor 250 to determine whether the enablement signal has transitioned to the inactive state. If the enablement signal has not transitioned to the inactive state at block 608, block 610 then directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the newly calculated values for $\vec{P}_{EENEW}$ and $R_{EENEW}$. When the control signals are received at the communications interface 284 of the instrument processor circuit 130, the microprocessor 280 causes drive signals to be produced to cause the end effector 212 to assume a position and orientation determined by the current position and current orientation of the hand controller 112.
Block 612 then directs the microprocessor 250 to copy the current position vector $\vec{P}_{MCURR}$ and the current rotation matrix $R_{MCURR}$ stored in stores 322 and 324 of the current buffer 320 into stores 342 ($\vec{P}_{MPREV}$) and 344 ($R_{MPREV}$) of the previous buffer 340 of the workstation memory 252. Block 612 also directs the microprocessor 250 to copy the newly calculated end effector position vector $\vec{P}_{EENEW}$ and the newly calculated end effector rotation matrix $R_{EENEW}$ into stores 346 and 348 of the previous buffer 340. Storing these values as the previously calculated end effector position vector $\vec{P}_{EEPREV}$ and previously calculated end effector rotation matrix $R_{EEPREV}$ allows a subsequent new end effector position vector $\vec{P}_{EENEW}$ and rotation matrix $R_{EENEW}$ to be calculated from the next hand controller position vector $\vec{P}_{MCURR}$ and hand controller rotation matrix $R_{MCURR}$ received from the input device 116.
If at block 608 the enablement signal has transitioned to the inactive state, the microprocessor 250 is directed to block 614. Block 614 directs the microprocessor 250 to cause the motion control interface 258 to transmit control signals based on the previously calculated values of $\vec{P}_{EEPREV}$ and $R_{EEPREV}$ in the respective stores 346 and 348 of the previous buffer 340 of the workstation memory 252. The control signals transmitted by the motion control interface 258 are thus derived from the last saved values of $\vec{P}_{EENEW}$ and $R_{EENEW}$. The instrument processor circuit 130 receives the control signals and produces drive signals at the drive control interface 286 that inhibit further movement of the tool 208.
If at block 602 it is determined that the footswitch 134 is currently depressed, the enablement signal will be in the inactive state and the microprocessor is directed to block 616, initiating the base setting process 630. The base setting process 630 (blocks 616 and 618) is executed asynchronously whenever the enablement signal produced by the footswitch 134 transitions from the active state to the inactive state. During the base setting process 630, the drive signals are maintained at the values that were in effect at the time the enablement signal transitioned to inactive at block 608. At block 616 the microprocessor 250 is directed to determine whether the enablement signal has transitioned back to the active state. While the enablement signal remains inactive (i.e. while the footswitch 134 is depressed) the control signals transmitted by the motion control interface 258 are based only on the previously calculated end effector position and orientation signals $\vec{P}_{EEPREV}$ and $R_{EEPREV}$ that were in effect before the enablement signal transitioned to inactive. If at block 616 the enablement signal remains in the inactive state, the microprocessor 250 is directed to repeat block 616 and the process is thus effectively suspended while the enablement signal remains in the inactive state. While the footswitch 134 is depressed, the surgeon may thus move the hand controller 112 to a new location to relocate the input device workspace relative to the surgical workspace.
When at block 616 the enablement signal transitions from the inactive state to the active state, the microprocessor 250 is directed to block 618. Block 618 directs the microprocessor 250 to set new base positions and orientations for the hand controller 112 and end effector 212 respectively. Block 618 directs the microprocessor 250 to cause the current values of the hand controller position vector $\vec{P}_{MCURR}$ and the hand controller rotation matrix $R_{MCURR}$ to be stored in stores 326 and 328 of the current buffer 320 of the workstation memory 252 as new values for the master base position vector $\vec{P}_{MBASE}$ and master base rotation matrix $R_{MBASE}$. Block 618 also directs the microprocessor 250 to cause the current values of the end effector position signal $\vec{P}_{EENEW}$ and the end effector orientation signal $R_{EENEW}$ to be stored in stores 334 and 336 of the current buffer 320 as the definable end effector base position vector $\vec{P}_{EEBASE}$ and definable end effector base rotation matrix $R_{EEBASE}$. Following execution of block 618, the microprocessor 250 is directed back to block 604 of the process 600, which directs the microprocessor to permit further movement of the tool 208. The control signals transmitted by the motion control interface 258 thus cause the instrument processor circuit 130 to produce drive signals at the drive control interface 286 that cause further movement of the tool 208.
The base setting process 630 thus allows the tool 208 to be immobilized by depressing the footswitch 134 while the hand controller 112 of the input device 116 is moved to a new location. When the footswitch 134 is released, control of the tool 208 resumes at the new position of the hand controller 112. The hand controller 112 may thus be repositioned as desired while the tool remains immobile, allowing a greater workspace to the surgeon and preventing unintended movements that may inflict injury to the patient.
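The clutching behaviour described above can be sketched as a small state machine. The `Clutch` class, the one-dimensional pose, and the unit scale factor are illustrative assumptions only, not the actual stores or processes of the system 100:

```python
# Sketch of the base setting (clutch) behaviour along a single axis:
# while the footswitch is depressed the tool pose is frozen; on release
# the master and slave bases are re-latched so control resumes from the
# hand controller's new location without any jump of the tool.

class Clutch:
    def __init__(self, p_mbase=0.0, p_eebase=0.0, A=1.0):
        self.p_mbase, self.p_eebase, self.A = p_mbase, p_eebase, A
        self.frozen = None

    def update(self, footswitch_down, p_mcurr, p_eeprev):
        if footswitch_down:
            self.frozen = p_eeprev           # hold the last commanded pose
            return p_eeprev
        if self.frozen is not None:          # footswitch just released:
            self.p_mbase = p_mcurr           # re-latch master base
            self.p_eebase = self.frozen      # re-latch end effector base
            self.frozen = None
        return self.A * (p_mcurr - self.p_mbase) + self.p_eebase

clutch = Clutch()
a = clutch.update(False, 10.0, 0.0)   # normal following: tool at 10.0
b = clutch.update(True, 50.0, a)      # footswitch down: tool frozen at 10.0
c = clutch.update(False, 100.0, b)    # released: bases re-latched, still 10.0
d = clutch.update(False, 105.0, c)    # hand moves 5 further: tool at 15.0
print(a, b, c, d)  # -> 10.0 10.0 10.0 15.0
```

Note how the hand controller travels from 10.0 to 100.0 while the footswitch is held, yet the tool pose never changes, mirroring the immobilization described above.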
The camera align process 650 begins at block 620, which directs the microprocessor 250 to determine whether an align command signal has been received. The align command signal may be generated by any of the input devices 116 and 118, the hand controllers 112 and 114, the footswitches 136, or other input received from the surgeon at the workstation 102. The align command signal may be generated when a camera align control has been activated by the operator. For example, in one embodiment the align command signal may be generated in response to actuation of an input button by a finger of the operator on either of the hand controllers 112 and 114 and detected in the input signals received from the input device 110. Alternatively, the align command signal may be generated by a secondary input device such as a touch screen or through the actuation of one or more of the footswitches 136 and detected in the footswitch signals received at the input/output 256. In the embodiment where a touch screen is used, an image of the surgical site may be displayed and the user can touch the area to which the camera is to be aligned, thereby generating the align command signal. In other embodiments the align command signal may be generated in response to a pre-defined or user-defined input such as a specific movement of the hand controllers 112 and 114. If an align command signal is not received, the microprocessor 250 is directed to repeat block 620 and the process 650 is thus effectively suspended waiting for the align command signal to transition to an active state.
In other embodiments the align command signal may be produced by causing the workstation processor circuit 120 to determine whether a pattern of movement of the tool or tools 208 and 214 has been received. For example, the hand controllers 112 and 114 may be moved in a pre-determined pattern, or the end effectors 212 and 218 may be moved toward each other to touch, or some other pattern of movement may be defined to cause generation of the align command signal. In response to detecting the pattern of movement, the workstation processor circuit 120 may cause the align command signal to be placed in the active state until another defined pattern of movement is detected.
In yet another embodiment, the align command signal may be set to the active state when the workstation processor circuit 120 determines that a reference point associated with either the tool 208 or 214 is disposed outside of a defined central region within the captured image. The reference point may be associated with either of the tools 208 or 214, or may be associated with a currently active tool.
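The test for whether a reference point has left a defined central region can be sketched as follows. The normalized image coordinates and the `margin` parameter are illustrative assumptions, not values specified by the disclosure:

```python
# Sketch: deciding whether a tool reference point, projected into the
# captured image, lies outside a defined central region. Coordinates are
# assumed normalized to [0, 1] on both image axes.

def outside_central_region(u, v, margin=0.25):
    """True when (u, v) falls outside the centered box spanning
    [margin, 1 - margin] on both axes of the image."""
    return not (margin <= u <= 1.0 - margin and margin <= v <= 1.0 - margin)

print(outside_central_region(0.5, 0.5))  # -> False (centered)
print(outside_central_region(0.9, 0.5))  # -> True (near the right edge)
```

When the function returns True for the reference point of the currently active tool, the align command signal would be set to the active state under this embodiment.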
If an align command signal is received by the microprocessor 250, block 622 directs the microprocessor to receive location information defining the location of either or both of the tools 208 and 214. The location information is provided by the simultaneous execution of block 606 of the process 600 and may be retrieved from the stored values of the calculated end effector position vector $\vec{P}_{EENEW}$ and calculated end effector rotation matrix $R_{EENEW}$ in the stores 330 and 332 of the current buffer 320 of the workstation processor circuit 120 shown in
The process 650 then continues at block 624, which directs the microprocessor 250 to produce camera positioning signals operable to cause the camera 222 to be positioned within the body cavity frame of reference (i.e. xv, yv, zv) to capture images. For the instrument 106 shown in
Block 624 then directs the microprocessor 250 back to block 620 to determine whether the align command signal is still in the active state. As an example, if the align command signal is produced in response to actuation of one of the footswitches 134 or 136 and the footswitch is still being depressed, then blocks 622 and 624 are repeated. Accordingly, if the tool location has changed, block 622 causes the microprocessor 250 to retrieve the new location and block 624 generates updated camera positioning signals for positioning the camera 222. Blocks 622 and 624 thus cause the workstation processor circuit 120 to produce camera positioning signals that will cause the camera to follow the tool 208 within the body cavity frame of reference (i.e. xv, yv, zv) while the align command signal is active.
If at block 620, the align command signal is not received or is no longer in the active state, the camera 222 remains in the position corresponding to the camera positioning signals last generated at block 556. The process 650 is run concurrently with other processes being executed by the microprocessor 250 and may be repeated at a fixed time interval during operation of the system 100. For example, in one embodiment the process 650 may be executed several times per second so that movement of the camera 222 while the align command signal is being generated is sufficiently smooth to prevent image jitter in the images displayed on the display 122.
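The control flow of blocks 620 through 624 amounts to a fixed-interval polling loop: check the align command signal, fetch the latest tool location, and emit updated camera positioning signals. The sketch below assumes this structure; the callable names and the update period are hypothetical stand-ins for the signals and interfaces described above.

```python
import time

def camera_align_loop(align_active, get_tool_location, position_camera,
                      cycles, period_s=0.1):
    """Sketch of blocks 620-624: on each cycle, the camera follows the
    tool only while the align command signal is in the active state.

    align_active:      callable returning True while the align command
                       signal is active (block 620)
    get_tool_location: callable returning the current tool location
                       within the body cavity frame of reference (block 622)
    position_camera:   callable that emits camera positioning signals
                       for the given target location (block 624)
    """
    for _ in range(cycles):
        if align_active():
            target = get_tool_location()
            position_camera(target)
        time.sleep(period_s)  # e.g. ~10 Hz for jitter-free following
```

With a sufficiently short period, repeated cycles cause the camera to track a moving tool smoothly, as the passage describes.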
The end effector position vector {right arrow over (P)}EENEW or {right arrow over (P)}EEPREV and end effector orientation matrix REENEW or REEPREV respectively produced at blocks 606 and 614 provide a desired location of the end effector tip 660 (shown in
Generation of motion control signals by the instrument processor circuit 130 is described with further reference to
The s-segment 700 extends from the first position 704 to a third position 706 defined as an origin of a third reference frame having axes x5, y5, and z5 and is capable of assuming a smooth s-shape when control wires (not shown) inside the s-segment 700 are pushed and pulled. The s-segment 700 has a mid-point at a second position 708, defined as the origin of a second position reference frame having axes x4, y4, and z4. The s-segment 700 has a length L1, best shown in
The distal segment 702 extends from the third position 706 to a fourth position 710 defined as an origin of a fourth reference frame having axes x6, y6, and z6. The distal segment 702 has a length L2, best shown in
Each end effector 212 and 218 also has an end effector length, which in the embodiment shown is a gripper length L3 extending from the fourth position 710 to the end effector tip position 660 defined as the origin of the axes x2, y2, and z2. The gripper length L3 is best shown in
As described in PCT/CA2013/001076, by pushing and pulling on control wires inside the manipulators 210 and 216, the s-segments 700 of the manipulators may be bent into various degrees of an s-shape, from the straight condition shown in
In addition, the distal segment 702 lies in a second bend plane containing the third position 706 and the fourth position 710. The second bend plane is at an angle δdist to the xv-zv plane of the fixed slave reference frame xv, yv, and zv. The distal segment 702 is bent in the second bend plane at an angle ϑdist. Thus, by pushing and pulling the control wires within the manipulator 210, the fourth position 710 can be placed within a volume in space about the third position 706. This volume may be referred to as the distal workspace. The combination of the s-segment workspace and the distal workspace may be referred to as the positioning device workspace, as this represents the total possible movement of the tool 208 as effected by the manipulator 210. The left side tool 214 may be similarly positioned by the manipulator 216.
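The geometry of a single bent segment parameterized by a length, a bend angle ϑ, and a bend-plane angle δ can be sketched as below. The sketch assumes the segment bends as a single circular arc (a common constant-curvature model for push-pull wire segments); the disclosure does not fix this model, so the function and its frame conventions are illustrative assumptions only.

```python
import math

def segment_tip(length, theta, delta):
    """Tip position of a segment of the given length, bent by angle
    theta in a plane at angle delta to the x-z plane, assuming the
    segment forms a single circular arc of constant curvature.
    Returns (x, y, z) in the segment's base frame, with z along the
    unbent segment axis.
    """
    if abs(theta) < 1e-9:
        return (0.0, 0.0, length)           # straight condition
    r = length / theta                       # arc radius
    reach = r * (1.0 - math.cos(theta))      # lateral offset in the bend plane
    return (reach * math.cos(delta),         # resolve offset by plane angle
            reach * math.sin(delta),
            r * math.sin(theta))
```

Sweeping ϑ and δ over their ranges traces out the reachable volume referred to above as a segment workspace.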
The distance between the fourth position 710 and the end effector position 660 is the distance between the movable portion of the distal segment 702 and the tip of the gripper 220 of the end effector 212 in the embodiment shown, i.e. the gripper length L3 shown in
In the embodiment shown, the end effector 212 includes moveable gripper jaws 220 that are rotatable about the z2 axis in the x2-y2 plane of the end effector reference frame, the angle of rotation being represented by an angle γ relative to the positive x2 axis. Finally, the gripper jaws 220 may be at any of varying degrees of openness, from fully closed to fully open (as limited by a hinge joint of the jaws). The varying degrees of openness may be defined as “G”. In summary therefore, the motion control signals are generated based on a kinematic configuration of the manipulator 210 and end effector 212 as defined by the following configuration variables:
To calculate the configuration variables, it will first be recalled that the end effector rotation matrix REENEW is a 3×3 matrix:
where the last column of REENEW is the z-axis of the end effector reference frame written relative to the fixed slave reference frame xv, yv, and zv. The values ϑdist, δdist, and γ associated with the distal segment 702 may be calculated according to the relations:
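The relations themselves are display equations not reproduced in this text. As an illustrative stand-in only, the sketch below recovers a bend angle and a bend-plane azimuth from the z-axis column of the rotation matrix using a conventional spherical decomposition; it is not necessarily the disclosure's exact set of relations.

```python
import math

def distal_angles(R):
    """Illustrative recovery of a bend angle (theta) and bend-plane
    azimuth (delta) from the last column of a 3x3 end effector
    rotation matrix R, that column being the end effector z-axis
    expressed in the fixed slave frame.  R is a 3x3 nested list;
    returns (theta, delta) in radians.
    """
    zx, zy, zz = R[0][2], R[1][2], R[2][2]
    zz = max(-1.0, min(1.0, zz))     # clamp against round-off error
    theta = math.acos(zz)            # inclination from the slave z axis
    delta = math.atan2(zy, zx)       # azimuth of the bend plane
    return theta, delta
```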
The third position 706 may then be written in terms of a vector
where ī is a unit vector in the x direction,
The vector
Taking a ratio of Eqn 8b and Eqn 8a yields:
δprox=atan2(−p3/v·
where ī and
where ī is the unit vector in the x direction. The equation Eqn 10 is Eqn 8a rewritten in the form ƒ(ϑprox)=0. The Newton-Raphson method tends to converge very quickly because in the range 0<ϑprox<π, the function has a large radius of curvature and has no local stationary points. Following the Newton-Raphson method, successive improved estimates of ϑprox can be made iteratively to satisfy equation Eqn 10 using the following relationship:
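The iterative relationship referred to above is the standard Newton-Raphson update. Since Eqn 10's ƒ(ϑprox) is given only as a display equation not reproduced in this text, the sketch below takes ƒ as a callable and estimates the derivative numerically; the function names and tolerances are assumptions.

```python
def newton_raphson(f, x0, tol=1e-10, max_iter=50, h=1e-7):
    """Generic Newton-Raphson iteration x_{n+1} = x_n - f(x_n)/f'(x_n),
    with f' estimated by a central difference.  Converges rapidly for
    a smooth f with no stationary points over the search range, as the
    text notes for 0 < theta_prox < pi.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        dfx = (f(x + h) - f(x - h)) / (2.0 * h)  # numerical derivative
        x -= fx / dfx
    return x
```

For example, applied to a stand-in equation cos(ϑ) − ϑ = 0 (which, like Eqn 10, has a single well-behaved root in the range of interest), the iteration converges in a handful of steps.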
Finally, upon determination of ϑprox, the following equation can be used to find qins:
where k is the unit vector in the z direction and
The above configuration variables are calculated from the end effector position and orientation signals {right arrow over (P)}EENEW and REENEW at block 606 or {right arrow over (P)}EEPREV and REEPREV at block 614 of the processes 600 and 630. The configuration variables generally define a pose of the manipulator 210 required to position the end effector 212 at the desired location and orientation in the end effector workspace. Configuration variables are produced for each end effector 212 and 218 of the respective right and left side tools 208 and 214. Two sets of configuration variables, referred to as the left and right configuration variables respectively, are thus produced and transmitted by the motion control interface 258 to the instrument processor circuit 130 and used by the microprocessor 280 to generate drive control signals for spatially positioning the manipulator 210 and end effector 212 of the tool 208 in the surgical workspace.
The values of the vector {right arrow over (P)}EENEW and rotation matrix REENEW calculated as described above and stored in stores 330 and 332 of the current buffer 320 of the workstation memory 252 define the location (x, y, z) of the end effector 212 of the tools 208 within the surgical workspace relative to the fixed slave reference frame xv, yv, zv (shown in
T03=A01(ϑ1)A12(ϑ2)A23(ϑ3)  Eqn 13
The transformation matrix T03 is a transformation matrix from the distal end 302 of the insertion tube 202 to the camera 222, where:
and where:
o0,x0,y0,z0 coordinate frame of the revolute joint 350;
o1,x1,y1,z1 coordinate frame of the revolute joint 352;
o2,x2,y2,z2 coordinate frame of the revolute joint 354;
o3,x3,y3,z3 coordinate frame of the camera 222;
a1 perpendicular distance from z0 to z1 (the length of the panning linkage 306);
a2 perpendicular distance from z1 to z2 (the length of the elevating linkage 308);
a3 perpendicular distance from z2 to z3 (the length of the camera tilt linkage 310);
si sin ϑi (i=1, 2, 3);
ci cos ϑi (i=1, 2, 3);
s23 sin (ϑ2+ϑ3);
c23 cos (ϑ2+ϑ3);
ϑ1 angular displacement of the panning linkage 306;
ϑ2 angular displacement of the elevating linkage 308;
ϑ3 angular displacement of the tilt linkage 310;
Ai−1,i coordinate transformation matrix from frame oi to frame oi−1
T03 coordinate transformation from the camera 222 frame o3 to the base frame o0.
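The chain of Eqn 13 can be sketched by composing one homogeneous transform per revolute joint, each a rotation by ϑi followed by a translation of the link length ai along the rotated x axis. The sketch below uses a planar simplification; the actual linkage may include axis twists between the panning, elevating, and tilt joints that are not modelled here.

```python
import math

def A(theta, a):
    """Planar homogeneous transform of one revolute joint: rotate by
    theta, then translate the link length a along the rotated x axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, a * c],
            [s,  c, a * s],
            [0,  0, 1.0  ]]

def matmul(X, Y):
    """3x3 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def T03(theta1, theta2, theta3, a1, a2, a3):
    """Eqn 13, T03 = A01(theta1) A12(theta2) A23(theta3), under the
    planar simplification described above."""
    return matmul(matmul(A(theta1, a1), A(theta2, a2)), A(theta3, a3))
```

Multiplying the chain out produces entries in terms of sums of joint angles, which is consistent with the s23 and c23 shorthand defined above.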
In one embodiment an orientation vector 380 is defined that is directed outwardly and perpendicular to a front face 382 of the camera 222. The orientation vector 380 is thus aligned with the tilt linkage 310 and provides an indication of the current orientation of the camera 222. A set of axes 384 represents the location information defining the location of the tool with respect to a body cavity frame of reference and acts as a target location for orienting the vector 380 associated with the camera 222. In
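A simple measure of how far the orientation vector 380 is from pointing at the target location 384 is the angle between the two directions, which the camera positioning signals would drive toward zero. The sketch below is an assumed formulation using a dot product; the vector representations and function name are not taken from the disclosure.

```python
import math

def misalignment(camera_dir, target_dir):
    """Angle in radians between the camera's outward orientation
    vector (cf. vector 380) and the direction toward the target
    location (cf. axes 384).  Zero means the camera is aligned."""
    dot = sum(c * t for c, t in zip(camera_dir, target_dir))
    nc = math.sqrt(sum(c * c for c in camera_dir))
    nt = math.sqrt(sum(t * t for t in target_dir))
    cos_angle = max(-1.0, min(1.0, dot / (nc * nt)))  # clamp round-off
    return math.acos(cos_angle)
```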
While specific embodiments have been described and illustrated, such embodiments should be considered illustrative of the disclosure only and not as limiting the disclosure as construed in accordance with the accompanying claims.
Entry |
---|
International Search Report and Written Opinion in Application No. PCT/US2018/045646, dated Oct. 23, 2018 in 12 pages. |
International Preliminary Report on Patentability received in co-pending International Application No. PCT/US2018/045646, dated Feb. 25, 2020, in 6 pages. |
Number | Date | Country | |
---|---|---|---|
20190060029 A1 | Feb 2019 | US |