The present disclosure relates to an endoscope system, an endoscope movement control method, and a recording medium.
An endoscope system in which an endoscope is moved by a moving device such as an electric holder has been conventionally known. In order to move the endoscope accurately, it is desirable that the coordinate system of the endoscope and the coordinate system of the moving device match each other.
For example, when an operator remotely operates an electric holder using a user interface, the operator inputs a desired moving direction of the endoscope into the user interface based on an endoscopic image. When the coordinate system of the endoscope and the coordinate system of the moving device match each other, the operator can intuitively and accurately move the endoscope in the desired direction. On the other hand, when the two coordinate systems do not match each other, the actual moving direction of the endoscope differs from the moving direction input to the user interface, which makes it difficult for the operator to intuitively and accurately move the endoscope in the desired direction.
In a robot system in which a camera system is attached to a movable arm, there is known a method for correcting the transformation between the coordinate system of a robot and the coordinate system of a camera system (see, for example, PTL 1). In PTL 1, an image of a target is captured by a camera system, and the transformation is determined from the position of the movable arm when the image is captured and the position of a feature point of the target in the image.
According to an aspect of the present disclosure, an endoscope system comprises: an insertion portion extending in a longitudinal axis direction;
According to another aspect of the present disclosure, there is provided an endoscope movement control method for controlling a robotic arm for moving an endoscope, the endoscope comprising an insertion portion extending in a longitudinal axis direction, an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis, and an optical element that is provided in the insertion portion to tilt an optical axis in a direction offset from the longitudinal axis direction, and the robotic arm holding the endoscope such that the endoscope is rotatable about the longitudinal axis, the endoscope movement control method comprising: moving the endoscope in the longitudinal axis direction by the robotic arm; detecting a first moving direction of an object in first two images by using the first two images captured by the imaging sensor before and after the movement in the longitudinal axis direction; estimating a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction; moving the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm; detecting a second moving direction of the object in second two images by using the second two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction; and estimating a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable recording medium in which an endoscope movement control program for controlling a robotic arm for moving an endoscope is stored, the endoscope comprising an insertion portion extending in a longitudinal axis direction, an imaging sensor disposed at a proximal end of the insertion portion so as to be rotatable about a longitudinal axis, and an optical element that is provided in the insertion portion to incline an optical axis in a direction offset from the longitudinal axis direction, the robotic arm holding the endoscope such that the endoscope is rotatable about the longitudinal axis, and the endoscope movement control program causing a computer to: move the endoscope in the longitudinal axis direction by the robotic arm; detect a first moving direction of an object in first two images by using the first two images captured by the imaging sensor before and after the movement in the longitudinal axis direction; estimate a first amount of rotation of the optical element about the longitudinal axis with respect to the imaging sensor based on the first moving direction; move the endoscope in a direction perpendicular to the longitudinal axis direction by the robotic arm; detect a second moving direction of the object in second two images by using the second two images captured by the imaging sensor before and after the movement in the direction perpendicular to the longitudinal axis direction; and estimate a second amount of rotation about the longitudinal axis between the robotic arm and the endoscope based on the second moving direction and the first amount of rotation.
An endoscope system 1, an endoscope movement control device 5, a method, a program, and a recording medium according to a first embodiment of the present disclosure will be described with reference to the drawings.
As shown in
The endoscope 2 is a rigid endoscope including an elongated and rigid lens tube portion (insertion portion) 2a, an optical system (optical element) 2b disposed in the lens tube portion 2a, and a camera head 2c disposed at a base end of the lens tube portion 2a. Furthermore, the endoscope 2 is an oblique-viewing type endoscope 2 having a visual axis (optical axis) B which is tilted at a predetermined angle by an optical system 2b with respect to a longitudinal axis A extending along the center in the radial direction of the lens tube portion 2a.
The lens tube portion 2a is attached to the camera head 2c so as to be rotatable around the longitudinal axis A. An operation ring 2d is fixed to the lens tube portion 2a. An operator can rotate the lens tube portion 2a around the longitudinal axis A with respect to the camera head 2c by rotating the operation ring 2d around the longitudinal axis A while holding the camera head 2c. Alternatively, the operator can also rotate the lens tube portion 2a around the longitudinal axis A with respect to the camera head 2c by rotating the camera head 2c around the longitudinal axis A while holding the operation ring 2d.
An imaging element 2e for imaging light focused by the optical system 2b is fixed inside the camera head 2c. The imaging element 2e includes an imaging surface which is arranged perpendicularly to the longitudinal axis A such that the longitudinal axis A passes through the center thereof. The imaging element 2e is an image sensor such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
The endoscope 2 is inserted into a body together with one or more treatment tools 7, and an endoscopic image (image) C including the one or more treatment tools 7 (see
The moving device 3 is a five-degree-of-freedom robot arm having a plurality of, for example, three driving joints (joints) 3M and two passive joints 3P. The moving device 3 includes, for example, a base 3a installed on a horizontal plane, and a first link 3b to be rotationally driven with respect to the base 3a around a vertical first axial line J1. Further, the moving device 3 includes a second link 3c to be rotationally driven with respect to the first link 3b around a horizontal second axial line J2, and a third link 3d to be rotationally driven with respect to the second link 3c around a horizontal third axial line J3. Further, the moving device 3 includes a fourth link 3e that is supported to be freely rotatable with respect to the third link 3d about a fourth axial line J4 extending along a plane that is orthogonal to the third axial line J3 and contains the first axial line J1, and a fifth link 3f which is supported to be rotatable with respect to the fourth link 3e around a fifth axial line J5 perpendicular to the fourth axial line J4.
The moving device 3 includes a holder 3g that is provided at the distal end of the fifth link 3f and holds the lens tube portion 2a of the endoscope 2 so as to make the lens tube portion 2a rotatable around the longitudinal axis A. The holder 3g holds the lens tube portion 2a such that the longitudinal axis A of the lens tube portion 2a passes through the intersection point between the fourth axial line J4 and the fifth axial line J5 and is perpendicular to the fifth axial line J5. The moving device 3 itself has five degrees of freedom. However, since the holder 3g holds the lens tube portion 2a so as to make the lens tube portion 2a rotatable around the longitudinal axis A, the moving device 3 has six degrees of freedom as a whole, so that the endoscope 2 can be arranged in a desired posture and at a desired position.
The endoscope 2 is moved by an interlocking operation of the three driving joints 3M and the two passive joints 3P of the moving device 3, whereby the position and posture of the endoscope 2 are three-dimensionally changed.
When using the endoscope 2, as shown in
The three-dimensional position of the distal end of the third link 3d is uniquely determined by driving the three driving joints 3M. The moving direction of the endoscope 2 is restricted by the trocar 8. Therefore, when the distal end position of the third link 3d is changed, the endoscope 2 swings around the pivot point X of the trocar 8 or moves the lens tube portion 2a in the longitudinal axis direction A inside the through-hole 8a of the trocar 8, and the two passive joints 3P rotate passively so as to follow this movement. The rotation of the lens tube portion 2a around the longitudinal axis A relative to the holder 3g is maintained in the state manually adjusted by the operator.
Each of the three driving joints 3M of the moving device 3 is equipped with an angle sensor 3h for detecting the rotation angle thereof. The angle sensor 3h is, for example, an encoder, a potentiometer, or a Hall sensor provided at each driving joint 3M.
The operating device 4 has a user interface 4a including input devices such as keys, a joystick, buttons, and a touch panel. The operator can input an instruction for moving the endoscope 2 to the operating device 4 by operating the user interface 4a. The operating device 4 transmits an operating signal based on an operation of the user interface 4a to the control device 5.
Further, the user interface 4a can accept a trigger input from the operator. As described later, the trigger is used for executing the processing of estimating the amounts of rotation β and θ of the endoscope 2.
As shown in
The control device 5 is connected to the endoscope 2, the moving device 3 (robotic arm), the operating device 4, and the display device 6 via the input interface 5d and the output interface 5e, and transmits and receives the endoscopic image C, information on the rotation angles of the driving joints 3M, signals, etc. via the interfaces 5d and 5e.
The memory 5b is, for example, a semiconductor memory including a ROM (read-only memory) area or a RAM (random access memory) area.
The storage unit 5c is a computer-readable non-transitory recording medium, and is, for example, a nonvolatile recording medium including a hard disk or a semiconductor memory such as a flash memory.
The processor 5a controls the moving device 3 based on the operation signal from the operating device 4, thereby moving the endoscope 2 according to an instruction input to the user interface 4a by the operator.
Here, the operator can rotate the endoscope 2 around the longitudinal axis A in two ways by operating the endoscope 2 attached to the holder 3g. First, the lens tube portion 2a is rotated around the longitudinal axis A with respect to the camera head 2c by manipulating the operation ring 2d, which makes it possible to change the relative angle (first amount of rotation) β between the camera head 2c and the lens tube portion 2a. Second, the lens tube portion 2a is rotated around the longitudinal axis A with respect to the holder 3g, which makes it possible to change the angle (second amount of rotation) θ representing the direction of the visual axis B of the lens tube portion 2a with respect to the holder 3g.
In order for the operator to attach the endoscope 2 to the moving device 3 and intuitively move the endoscope 2 vertically and horizontally while watching the endoscopic image C displayed on the display device 6, the amounts of rotation β and θ between the moving device 3 and the endoscope 2 must be known.
In the endoscope system 1 according to the present embodiment, the processor 5a estimates the amounts of rotation β and θ of the endoscope 2 by executing a program (endoscope movement control program) which is recorded in the storage unit 5c and read out to the memory 5b.
Hereinafter, an endoscope movement control method by the endoscope system 1 according to the present embodiment will be described with reference to the drawings.
The processing of estimating the amounts of rotation β and θ of the endoscope 2 is started by the operator inputting a trigger to the user interface 4a.
As shown in
If the processor 5a determines in step S3 that Nβ≠1 is satisfied, the processor 5a transmits a first signal for activating the moving device 3 so as to move the endoscope 2 in one direction along the longitudinal axis A (step S7). The processor 5a then determines whether the amount of movement of the endoscope 2 caused by the moving device 3 being activated with the first signal is equal to or more than a predetermined threshold value (step S8). If it is equal to or more than the threshold value, the processing is terminated.
When determining that the amount of movement of the endoscope 2 is smaller than the threshold value, the processor 5a stores a second endoscopic image C acquired by the endoscope 2 at that time into the memory 5b (step S9). Further, the processor 5a recognizes the treatment tools (moving objects) 7 in the second endoscopic image C, and stores the areas of the treatment tools 7 as treatment tool areas D into the memory 5b (step S10).
A first moving direction M1 of an object S in the endoscopic image C is detected using the first endoscopic image C and the second endoscopic image C (step S11). Specifically, the processor 5a uses a known method such as optical flow to estimate a movement vector vobj of the object S in another area E obtained by excluding the treatment tool areas D from the two endoscopic images C captured before and after the endoscope 2 is moved by the moving device 3. As shown in
Next, the processor 5a determines whether the magnitude of the estimated movement vector vobj is equal to or more than a threshold value (step S12). This determination is performed using the magnitude of any one of the estimated movement vectors vobj or the average value of the magnitudes of the estimated movement vectors vobj.
When the magnitude of the movement vector vobj is smaller than the threshold value, the processor 5a repeats the processing from step S7. When the magnitude of the movement vector vobj is equal to or more than the threshold value, the processor 5a estimates the amount of rotation β of the endoscope 2 from the estimated movement vector vobj (step S13).
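Steps S10 to S12 above can be sketched as follows, assuming that feature correspondences between the two endoscopic images C have already been tracked (for example, by optical flow) and that the treatment tool areas D are given as a boolean mask. The helper names are hypothetical and not part of the embodiment.

```python
import numpy as np

def movement_vectors_outside_tools(pts_before, pts_after, tool_mask):
    """Movement vectors v_obj of features in the area E, i.e. features
    whose origin lies outside the treatment tool areas D (True in mask)."""
    pts_before = np.asarray(pts_before, dtype=float)
    pts_after = np.asarray(pts_after, dtype=float)
    xi = np.rint(pts_before[:, 0]).astype(int)
    yi = np.rint(pts_before[:, 1]).astype(int)
    keep = ~tool_mask[yi, xi]          # exclude features inside the areas D
    origins = pts_before[keep]
    vectors = pts_after[keep] - origins
    return origins, vectors

def mean_vector_magnitude(vectors):
    """Average magnitude used for the threshold test of step S12."""
    return float(np.mean(np.linalg.norm(vectors, axis=1)))
```

If `mean_vector_magnitude` falls below the threshold, the movement of step S7 would be repeated, mirroring the loop back from step S12.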
The estimation of the amount of rotation β of the endoscope 2 is performed as follows.
First, the processor 5a calculates the intersection point (hereinafter also referred to as a vanishing point) of straight lines along the directions of a plurality of movement vectors vobj estimated for a plurality of feature points on the endoscopic image C. Specifically, as shown in
The radius L is the distance from the image center O to the vanishing point, and can be calculated using Formula (1).
Here, as shown in
The processor 5a uses, as a reference line R, a straight line which connects the position of a vanishing point Q1 for the amount of rotation β=0° and the image center O, and sets the temporary vanishing points Q1 to Q12, for example, at intersection points between the circle of the radius L and straight lines obtained by rotating the reference line R by every 30° around the image center O. As a result, for example, when the temporary vanishing point Q2 at a position where rotation of 30° is performed with respect to the reference line R is selected as the vanishing point, the amount of rotation β of the endoscope 2 can be estimated to be β=30°.
The processor 5a calculates, for each temporary vanishing point Qj, the inner product of each movement vector vi and a vector (Qj−pi) connecting the origin pi of each movement vector vi and the temporary vanishing point Qj.
In the example shown in
As a result, the processor 5a estimates the rotation angle corresponding to the selected vanishing point, 330° in the example shown in
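The selection procedure above (temporary vanishing points Qj on a circle of radius L around the image center O, evaluation by inner products, selection of the maximum) can be sketched as below. The β=0° direction of the reference line R is assumed here to be the +x direction from the image center, and the evaluation function is taken as the plain sum of inner products of unit vectors; both are assumptions standing in for the elided Formula (2), not the embodiment's exact definition.

```python
import numpy as np

def estimate_beta(origins, vectors, center, L, step_deg=30):
    """Select the temporary vanishing point Qj whose direction best agrees
    with the movement vectors v_i, and return the corresponding beta."""
    best_beta, best_score = 0, -np.inf
    for j in range(0, 360, step_deg):
        ang = np.deg2rad(j)
        Q = np.asarray(center, float) + L * np.array([np.cos(ang), np.sin(ang)])
        score = 0.0
        for p, v in zip(np.asarray(origins, float), np.asarray(vectors, float)):
            d = Q - p                  # vector from origin p_i to Qj
            score += np.dot(v / np.linalg.norm(v), d / np.linalg.norm(d))
        if score > best_score:
            best_score, best_beta = score, j
    return best_beta
```

Reducing `step_deg` corresponds to setting the temporary vanishing points at finer angular intervals, at the cost of more evaluations.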
Next, as shown in
In step S14, when determining that Nθ≠1 is satisfied, the processor 5a transmits a second signal for causing the moving device 3 to operate so as to move the endoscope 2 in one direction perpendicular to the longitudinal axis A (step S18). The processor 5a determines whether the amount of movement of the endoscope 2 caused by the moving device 3 being activated with the second signal is equal to or more than a predetermined threshold value (step S19), and when it is equal to or more than the threshold value, the processor 5a terminates the rotation amount estimation processing.
When the movement amount of the endoscope 2 is smaller than the threshold value, the processor 5a uses the amount of rotation β estimated in step S13 to calculate a movement vector vsys of the object S in the endoscopic image C, which is assumed to be caused by transmission of the second signal when the amount of rotation θ is equal to a predetermined value, for example, 0° (step S20).
Then, the processor 5a stores, into the memory 5b, the fourth endoscopic image C captured after the endoscope 2 is moved with the second signal (step S21). Further, the processor 5a recognizes the treatment tools (moving objects) 7 in the fourth endoscopic image C, and stores the areas of the treatment tools 7 as the treatment tool areas D into the memory 5b (step S22).
The second moving direction M2 of the object S in the endoscopic image C is detected using the third endoscopic image C and the fourth endoscopic image C (step S23). Specifically, the processor 5a uses a known method such as optical flow to estimate a movement vector vreal of the object S in another area E obtained by excluding the treatment tool areas D from the two endoscopic images C captured before and after the endoscope 2 is moved by the moving device 3.
Next, the processor 5a determines whether the magnitude of the estimated movement vector vreal is equal to or more than a threshold value (step S24). This determination is performed using the magnitude of any one of the estimated movement vectors vreal or the average value of the magnitudes of the estimated movement vectors vreal.
When the magnitude of the movement vector vreal is smaller than the threshold value, the processor 5a repeats the processing from step S18. When the magnitude of the movement vector vreal is equal to or more than the threshold value, the processor 5a estimates the angle between the assumed movement vector vsys and the estimated movement vector vreal, which is 180° in the example shown in
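The estimation of step S25 reduces to a signed-angle computation between the assumed vector vsys and the observed vector vreal. The direction convention (counterclockwise positive, angle measured from vsys) is an assumption for this sketch.

```python
import numpy as np

def estimate_theta(v_sys, v_real):
    """Amount of rotation theta as the angle from the assumed movement
    vector v_sys (computed for theta = 0) to the real vector v_real,
    in degrees in [0, 360)."""
    a = np.arctan2(v_real[1], v_real[0]) - np.arctan2(v_sys[1], v_sys[0])
    return float(np.rad2deg(a) % 360.0)
```

For instance, vectors pointing in exactly opposite directions yield θ = 180°, matching the example in the text.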
When the amounts of rotation β and θ of the endoscope 2 are estimated, the processor 5a uses the estimated amounts of rotation β and θ to correct a holder coordinate system Σr so that the holder coordinate system Σr and an endoscope coordinate system Σe match each other. The endoscope coordinate system Σe is a coordinate system fixed to the imaging element 2e, and the holder coordinate system Σr is a coordinate system fixed to the distal end of the holder 3g.
In one example, the endoscope coordinate system Σe is a rectangular coordinate system having an Xe-axis, a Ye-axis, and a Ze-axis which are orthogonal to one another, and the holder coordinate system Σr is a rectangular coordinate system having an Xr-axis, a Yr-axis, and a Zr-axis which are orthogonal to one another. The Xe-axis and the Xr-axis match the longitudinal axis A, the Ye-axis and the Yr-axis are parallel to the horizontal direction (left-right direction) of the endoscopic image C, and the Ze-axis and the Zr-axis are parallel to the vertical direction (up-and-down direction) of the endoscopic image C. The holder coordinate system Σr and the endoscope coordinate system Σe matching each other means that the directions of the Xe-axis and the Xr-axis match each other, the directions of the Ye-axis and the Yr-axis match each other, and the directions of the Ze-axis and the Zr-axis match each other.
However, when the endoscope 2 is rotated by the amounts of rotation β and θ around the longitudinal axis A, a shift occurs between the endoscope coordinate system Σe and the holder coordinate system Σr. When the endoscope coordinate system Σe does not match the holder coordinate system Σr as described above, it is difficult for the operator who is observing the endoscopic image C displayed on the display device 6 to intuitively and accurately move the endoscope 2 in a desired direction by operating the user interface 4a.
The processor 5a makes the holder coordinate system Σr match the endoscope coordinate system Σe by correcting the holder coordinate system Σr based on the amounts of rotation β and θ. Specifically, the processor 5a corrects a DH (Denavit-Hartenberg) parameter of the moving device 3 based on the amounts of rotation β and θ so that the holder coordinate system Σr matches the endoscope coordinate system Σe.
The processor 5a controls the moving device 3 based on the corrected holder coordinate system Σr.
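As a minimal sketch of the correction, the holder frame can be rotated about the longitudinal axis A (the shared X-axis of Σr and Σe) by the estimated offset. A single combined angle derived from β and θ is taken as input here; the embodiment's actual DH parameter correction for the moving device 3 is not reproduced.

```python
import numpy as np

def rot_x(deg):
    """Rotation matrix about the X-axis (the longitudinal axis A)."""
    r = np.deg2rad(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def corrected_holder_frame(R_holder, offset_deg):
    """Rotate the holder frame about the longitudinal axis by the
    estimated offset so that it matches the endoscope frame."""
    return R_holder @ rot_x(offset_deg)
```

Because the Xe-axis and Xr-axis already coincide with the longitudinal axis A, only this one rotation is needed to align the Y and Z axes of the two frames.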
As described above, according to the present embodiment, the amounts of rotation β and θ which are manually adjusted by the operator are estimated, and the holder coordinate system Σr is corrected using the estimated amounts of rotation β and θ. Therefore, even when the endoscope 2 is an oblique-viewing endoscope, the operator can intuitively and accurately move the endoscope 2 in a desired direction by operating the user interface 4a while observing the endoscopic image C.
Furthermore, the amounts of rotation β and θ of the endoscope 2 can be calculated from the two endoscopic images C obtained before and after the moving device 3 is activated, respectively. In other words, it is not necessary to add any equipment such as a sensor to detect the amounts of rotation β and θ, and the amounts of rotation β and θ can be estimated without any sensor.
If a device such as a sensor for detecting the amounts of rotation β and θ of the endoscope 2 were added to the moving device 3, it would be possible to easily measure the amounts of rotation β and θ. However, in this case, an existing moving device 3 cannot be used as-is, and the moving device 3 needs to be modified. Moreover, the size and weight of the moving device 3 increase, and the moving device 3 may become an obstacle to the operator. In contrast, according to the present embodiment, there is no need to modify the moving device 3, and the moving device 3 can be easily miniaturized.
Further, according to the present embodiment, the area E other than the treatment tool areas D in the endoscopic image C is used for estimating the movement vector vobj. As a result, even when a moving treatment tool 7 exists in the endoscopic image C, a highly accurate movement vector vobj that accurately represents the moving direction of the endoscope 2 can be estimated. The amounts of rotation β and θ of the endoscope 2 can be calculated based on such a highly accurate movement vector vobj, and the holder coordinate system Σr can be accurately corrected. Furthermore, based on the accurately corrected holder coordinate system Σr, the endoscope 2 can be moved by the moving device 3 in a direction that exactly corresponds to a direction input into the operating device 4 by the operator.
In the present embodiment, as shown in
The margin F is an area that extends along the outline of the treatment tool area D and surrounds the treatment tool area D, and is, for example, a belt-shaped area having a predetermined width.
In the vicinity of the treatment tool 7, the movement of the object S may be affected by the movement of the treatment tool 7. For example, the object S may partially move in the vicinity of the treatment tool 7 due to the object S being pushed or pulled by the treatment tool 7.
By excluding the area obtained by adding the margin F to the treatment tool area D, it is possible to improve the estimation accuracy of the movement vectors vobj and vreal of the object S and more accurately estimate the amounts of rotation β and θ of the endoscope 2.
Further, in the present embodiment, a plurality of temporary vanishing points Qj corresponding to the amount of rotation β are set, and the vanishing point is selected using the evaluation function Uj shown in Formula 2. In this case, the temporary vanishing points are set every 30°, but they may be set at smaller angular intervals. As a result, the amount of rotation β can be estimated with higher accuracy.
Further, instead of setting a plurality of temporary vanishing points Qj corresponding to the amount of rotation β, functions of a plurality of straight lines along movement vectors vobj of a plurality of feature points in the endoscopic image C are determined, and as shown in
Further, as described above, a gap exists in the radial direction between the lens tube portion 2a of the endoscope 2 and the through-hole 8a of the trocar 8. When the moving device 3 has the passive joint 3P, there may occur a situation that even if an attempt is made to move the lens tube portion 2a in the longitudinal axis direction A, the lens tube portion 2a shifts in a direction intersecting the longitudinal axis A by the amount corresponding to the gap. The optical flow when the lens tube portion 2a is moved in the longitudinal axis direction A is used in order to estimate the amount of rotation β. Therefore, if the lens tube portion 2a shifts in the direction intersecting the longitudinal axis A, the accuracy of estimating the vanishing point and thus the amount of rotation β deteriorates.
Therefore, the lens tube portion 2a is moved in the longitudinal axis direction A by a distance sufficiently larger than the gap between the lens tube portion 2a and the trocar 8 (for example, 20 mm or more when the gap is 2 to 3 mm). As a result, the proportion of the deviation of the movement vector vobj caused by the gap can be reduced, so that the estimation accuracy of the amount of rotation β can be improved.
Further, in order to accurately arrange the lens tube portion 2a of the endoscope 2 at a desired position and in a desired posture, a moving device 3 having six degrees of freedom may be required. In general, the moving device 3 is a robotic arm that can have six degrees of freedom; however, a moving device having three or more degrees of freedom can also be used. In the present embodiment, as described above, the holder 3g holds the endoscope 2 so that the endoscope 2 is manually rotatable around the longitudinal axis A. For this reason, depending on the posture of the endoscope 2, the entire endoscope 2 may rotate around the longitudinal axis A when the lens tube portion 2a of the endoscope 2 is moved in a direction perpendicular to the longitudinal axis A. In this case, the endoscopic images C obtained before and after the movement rotate, which makes it difficult to correctly calculate the moving direction of the object S.
Therefore, as shown in
In this case, a second signal is also calculated for a joint that would rotate the endoscope 2 around the longitudinal axis A but does not actually exist. If the moving device 3 had six degrees of freedom, transmitting the thus-calculated second signal would rotate the entire endoscope 2 around the longitudinal axis A to compensate for the posture of the endoscope 2. However, because the moving device 3 of the present embodiment does not have a driving joint around the longitudinal axis A, the posture of the endoscope 2 is not compensated.
Therefore, after the activation of the moving device 3, the processor 5a rotates the endoscopic image C stored in step S21 by the rotation angle corresponding to the second signal calculated for the joint around the longitudinal axis A that does not actually exist (step S28). As a result, an endoscopic image C equivalent to the one that would be captured if the moving device 3 had six degrees of freedom can be acquired. In other words, at least one of the two endoscopic images C acquired before and after the transmission of the second signal is rotated by image processing so that the angles of the two images around the longitudinal axis A match each other, whereby the moving direction of the object S before and after the movement can be accurately calculated.
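The compensation of step S28 is a plain rotation of the stored image about the image center. A nearest-neighbor sketch follows; the sign convention of the angle in image coordinates is an assumption, and a practical implementation would use a library routine with interpolation.

```python
import numpy as np

def rotate_image_about_center(img, deg):
    """Rotate a 2-D image about its center by deg degrees
    (nearest-neighbor resampling; out-of-range pixels become 0)."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.deg2rad(deg)
    c, s = np.cos(r), np.sin(r)
    ys, xs = np.mgrid[0:h, 0:w]
    # inverse mapping: for each output pixel, sample the source pixel
    # that the rotation carries onto it
    xsrc = c * (xs - cx) + s * (ys - cy) + cx
    ysrc = -s * (xs - cx) + c * (ys - cy) + cy
    xi = np.rint(xsrc).astype(int)
    yi = np.rint(ysrc).astype(int)
    ok = (xi >= 0) & (xi < w) & (yi >= 0) & (yi < h)
    out = np.zeros_like(img)
    out[ok] = img[yi[ok], xi[ok]]
    return out
```

Applying this to one of the two images with the angle of the virtual sixth joint makes the angles of the two images around the longitudinal axis A match before the moving direction is computed.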
Further, as described above, in the present embodiment, the two joints at the distal end of the moving device 3 are passive joints 3P, and have neither a motor nor an angle sensor. Therefore, when the three driving joints 3M are driven, the lens tube portion 2a of the endoscope 2 swings the trocar 8, and is moved along the through-hole 8a of the trocar 8 by the reaction force received from the trocar 8. Following this movement, the passive joints 3P are passively activated.
However, as described above, a gap exists between the inner surface of the through-hole 8a of the trocar 8 and the outer surface of the lens tube portion 2a of the endoscope 2. Therefore, as shown in
Therefore, as shown in
Further, in a case where the moving device 3 has the passive joints 3P, a problem may occur when the amount of rotation θ of the endoscope 2 is estimated. In other words, when there is a gap between the lens tube portion 2a and the through-hole 8a of the trocar 8, there may occur a situation in which, even if an attempt is made to move the endoscope 2 straight in a direction perpendicular to the longitudinal axis A, the passive joint 3P shifts unrestrainedly by the amount corresponding to the gap, so that the endoscope 2 cannot be moved straight.
Therefore, as shown in
Since the endoscope 2 is moved in the direction in which the passive joint 3P moves, the endoscope 2 can be moved straight in the direction perpendicular to the longitudinal axis A, and the accuracy of estimating the amount of rotation θ can be improved.
Further, in the present embodiment, in step S20, the assumed movement vector vsys is calculated, and in step S25, the intersection angle between the movement vector vsys and the estimated movement vector vreal is estimated as the amount of rotation θ of the endoscope 2. Instead of this, as shown in
In other words, first, the processor 5a stores the angle of each driving joint 3M of the moving device 3 at the time when the first endoscopic image C and the treatment tool area D are stored in steps S15 and S16 (step S32). Then, at the time point when the real movement vector vreal having sufficient magnitude is calculated, the processor 5a stores the angle of each driving joint 3M of the moving device 3 again (step S33). Thereafter, the processor 5a calculates respective movement vectors vsimu of the object S through simulations in which the amount of rotation θ is changed (step S34). The processor 5a selects a simulation in which a movement vector vsimu whose moving direction matches that of the real movement vector vreal is calculated, and adopts the amount of rotation θ used in the selected simulation as an estimated value (step S35).
In the simulation, the angles of the driving joints 3M stored in steps S32 and S33, the amount of rotation β estimated in step S13, the lengths of the links 3b, 3c, and 3d, and the distance from the distal end of the endoscope 2 to the object S are used as fixed values. The processor 5a then performs a plurality of simulations, each calculating the movement vector vsimu of the object S, while using the amount of rotation θ as a parameter.
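The parameter sweep of steps S34 and S35 can be sketched as follows. This is an illustrative sketch, not the disclosed implementation: the callable `simulate_object_motion` stands in for the kinematic simulation described above (it would internally use the stored joint angles, the amount of rotation β, the link lengths, and the endoscope-to-object distance as fixed values), and the 1-degree sweep resolution is an assumption.

```python
import math

def estimate_theta(v_real, simulate_object_motion, theta_step_deg=1.0):
    """Sweep candidate values of theta and return the one whose simulated
    movement vector v_simu best matches the direction of the real movement
    vector v_real (steps S34-S35).

    v_real: observed 2-D movement vector of the object S in the image.
    simulate_object_motion: hypothetical callable, theta_rad -> simulated
        2-D movement vector v_simu for that candidate theta.
    """
    def unit(v):
        n = math.hypot(v[0], v[1])
        return (v[0] / n, v[1] / n)

    ur = unit(v_real)
    best_theta, best_score = 0.0, -math.inf
    steps = int(360.0 / theta_step_deg)
    for i in range(steps):
        theta = math.radians(-180.0 + i * theta_step_deg)
        us = unit(simulate_object_motion(theta))
        # Cosine similarity between the two moving directions; the candidate
        # theta whose simulated direction matches the real one wins.
        score = ur[0] * us[0] + ur[1] * us[1]
        if score > best_score:
            best_theta, best_score = theta, score
    return best_theta
```

Because only the moving *direction* is compared, the magnitude of the real movement vector need not match the simulated one, which is consistent with the direction-matching criterion of step S35.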
According to the present embodiment, since the simulation is performed using the real angles of the driving joints 3M of the moving device 3 before and after the movement, it is possible to calculate the movement vector vsimu in consideration of the variation in the posture of the endoscope 2 caused by the moving device 3 having five degrees of freedom and including the passive joints 3P. Therefore, as compared with the movement vector vsys of the first embodiment, which is calculated on the assumption that the distal end of the endoscope 2 moves straight in the direction perpendicular to the longitudinal axis A, the movement vector vsimu can be calculated with high accuracy, and the accuracy of estimating the amount of rotation θ can be improved.
Further, the case where the sixth to eighth modifications are implemented individually has been described, but instead of this case, the modifications may be implemented while combining at least two of these modifications.
(Ninth modification)
Furthermore, when the endoscope 2 is inserted via the trocar 8 as described above, if the endoscope 2 is advanced excessively along the longitudinal axis A in order to estimate the amount of rotation β, the distal end of the endoscope 2 may come into contact with the object S inside the abdominal cavity. On the other hand, if the endoscope 2 is retreated excessively along the longitudinal axis A, the distal end of the endoscope 2 may enter the trocar 8, so that a sufficient optical flow cannot be obtained.
Therefore, the operator operates the moving device 3 in advance while checking the endoscopic image C, moves the endoscope 2 to a position and posture in which the distal end thereof does not come into contact with the object S inside the body cavity, and registers the three-dimensional position of the distal end of the endoscope 2 at that time as a registration point T. The processing of estimating the amount of rotation β may then be performed by moving the endoscope 2 in the retreating direction along the longitudinal axis A from a state in which the distal end of the endoscope 2 is disposed at the registration point T.
Specifically, as shown in
When the processing of estimating the amounts of rotation β and θ is started in a state where the distal end of the endoscope 2 is disposed at an arbitrary position, as shown in
When it is determined in step S39 that the distance L1 is equal to or less than the threshold value, the processor 5a calculates a distance L2 between the distal end of the endoscope 2 and the registration point T for all registration points T (step S40). Then, the processor 5a moves the distal end of the endoscope 2 to the registration point T which provides the smallest calculated distance L2 (step S41), and then executes the processing from step S2. In this case, the operating direction of the moving device 3 according to the first signal transmitted in step S7 is limited to the direction in which the endoscope 2 is retreated along the longitudinal axis A.
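Steps S40 and S41 select, from among all registered points, the one nearest to the current tip position. A minimal sketch follows; the function name and the use of Euclidean distance in a common three-dimensional coordinate representation are assumptions for illustration.

```python
import math

def nearest_registration_point(tip, registration_points):
    """Steps S40-S41: compute the distance L2 from the distal end of the
    endoscope to every registration point T and return the point with the
    smallest distance.

    tip and each registration point are (x, y, z) positions.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    # min() over the candidate points realizes "the registration point T
    # which provides the smallest calculated distance L2".
    return min(registration_points, key=lambda t: dist(tip, t))
```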
As described above, according to the present embodiment, in the processing of estimating the amount of rotation β, it is possible to prevent the inconvenience of the distal end of the endoscope 2 entering the trocar 8 and to efficiently estimate the amount of rotation β. In particular, simply by registering the registration points T, the estimation processing can be performed regardless of the position of the distal end of the endoscope 2 at the time the estimation is started. Therefore, there is an advantage that the operator does not have to manually move the endoscope 2 each time to a position where the distance L1 is ensured to be larger than the threshold value.
Furthermore, when the distal end of the endoscope 2 is moved to the registration point T by the operation of the moving device 3, the processor 5a moves the distal end of the endoscope 2 as follows. First, the endoscope 2 is retreated along the longitudinal axis A until the distance L1 between the distal end of the endoscope 2 and the pivot point X becomes a constant distance or less. Next, the posture of the endoscope 2 is adjusted so that the longitudinal axis A of the endoscope 2 is parallel to a straight line connecting a registration point T to which the endoscope 2 is requested to be moved and the pivot point X. Finally, the endoscope 2 is moved along the longitudinal axis A until the distal end of the endoscope 2 is at a distance equal to or less than a predetermined threshold value from the registration point T. As a result, even when there is an obstacle such as tissue or a treatment tool between the distal end of the endoscope 2 and the registration point T to which the endoscope 2 is requested to be moved at the start time of the processing of estimating the amount of rotation β, it is possible to move the distal end of the endoscope 2 to the registration point T without causing the distal end of the endoscope 2 to come into contact with the obstacle.
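The geometry of the second and third phases of this motion (reorienting the longitudinal axis A toward the registration point T, then advancing to within a threshold distance of T) can be expressed as a small helper. This is an illustrative sketch only; the function name and the vector representation are assumptions, and the actual joint-space control of the moving device 3 is not shown.

```python
import math

def approach_pose(pivot, target, stop_distance):
    """After the endoscope has been retreated toward the pivot point X, align
    the longitudinal axis A with the straight line from X to the registration
    point T, then advance the tip along that line until it is stop_distance
    short of T.

    pivot:  (x, y, z) of the pivot point X.
    target: (x, y, z) of the registration point T.
    Returns (axis, tip): the unit direction of axis A after reorientation and
    the final tip position.
    """
    d = [t - p for t, p in zip(target, pivot)]
    n = math.sqrt(sum(c * c for c in d))
    axis = [c / n for c in d]                      # direction of axis A
    tip = [t - stop_distance * a for t, a in zip(target, axis)]
    return axis, tip
```

Because the tip travels only along the line X-T after the retreat, it does not sweep sideways through the cavity, which reflects the obstacle-avoidance rationale described above.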
Moreover, in the present embodiment, the moving device 3 including three driving joints 3M and two passive joints 3P has been described. Instead of this moving device 3, a moving device 3 including five or more driving joints 3M may be adopted. As a result, the position and posture of the endoscope 2 can be controlled more accurately. However, since each driving joint 3M requires a motor and an angle sensor, the moving device 3 becomes larger in size. Therefore, according to the present embodiment, the minimum number of driving joints 3M required to control the position and posture of the endoscope 2 is provided, so that the moving device 3 can be miniaturized and prevented from interfering with the surgery.
Furthermore, in the present embodiment, the imaging element 2e in the camera head 2c is arranged so as to be orthogonal to the longitudinal axis A of the lens tube portion 2a and such that the longitudinal axis A passes through the center of the imaging surface. Instead of this, in a case where an optical axis B is inclined by the optical element 2b such as a mirror or a prism in the camera head 2c, the imaging element 2e may be arranged so as to be perpendicular to the optical axis B which is inclined with respect to the longitudinal axis A and such that the optical axis B passes through the center of the imaging surface.
Next, an endoscope system 1, an endoscope movement control device 5, a method, a program, and a recording medium according to a second embodiment of the present disclosure will be described. In the present embodiment, points different from the first embodiment will be described, and with respect to configurations common to the first embodiment, the same reference signs are appended, and description thereof will be omitted.
The endoscope system 1 according to the present embodiment is different from the first embodiment in that the endoscope 2 has a two-degree-of-freedom curved joint 2f at the distal end of the lens tube portion 2a as shown in
In the description of the present embodiment, as shown in
In the endoscope system 1 according to the present embodiment, the processor 5a executes a program (endoscope movement control program) which is recorded in the storage unit 5c and read out to the memory 5b, thereby estimating the amounts of rotation γ, δ, and θ of the endoscope 2.
An endoscope movement control method using the endoscope system 1 according to the present embodiment will be described below with reference to the drawings.
As shown in
The present embodiment differs from the first embodiment in estimating the amounts of rotation γ and δ (step S45).
The amount of rotation γ can be determined by setting θ=0° and calculating Formula (3) using the distance L between the vanishing point and the image center O in the positional relation shown in
Here, the distance L is determined as follows.
First, as shown in
The amount of rotation γ can be estimated by substituting the determined distance L into Formula (3). Further, the angle of a straight line connecting the calculated vanishing point and the image center O with respect to the reference line R can be estimated as the amount of rotation δ. Further, the amount of rotation θ can be estimated in the same manner as in the first embodiment.
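The two quantities extracted from the vanishing point, namely the distance L (substituted into Formula (3) to estimate γ) and the angle δ of the line connecting the image center O to the vanishing point relative to the reference line R, can be sketched as follows. The function name and the image-coordinate convention (x rightward, y downward) are assumptions for illustration; Formula (3) itself is not reproduced here.

```python
import math

def vanishing_point_offset(vanishing_point, image_center, reference_angle_rad=0.0):
    """Return (L, delta) from the computed vanishing point.

    L:     distance between the vanishing point and the image center O,
           used in Formula (3) to estimate the amount of rotation gamma.
    delta: angle of the straight line O -> vanishing point measured from
           the reference line R, estimated as the amount of rotation delta.
    """
    dx = vanishing_point[0] - image_center[0]
    dy = vanishing_point[1] - image_center[1]
    L = math.hypot(dx, dy)
    delta = math.atan2(dy, dx) - reference_angle_rad
    return L, delta
```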
As described above, according to the present embodiment, it is possible to estimate the amounts of rotation γ and δ of the two degrees of freedom of the curved joint 2f of the endoscope 2, which are manually adjusted by the operator, and the amount of rotation θ corresponding to the attachment angle of the endoscope 2, which is also manually adjusted, and to correct the holder coordinate system Σr using the estimated amounts of rotation γ, δ, and θ. Therefore, even when the endoscope 2 has the curved joint 2f, the operator can intuitively and accurately move the endoscope 2 in a desired direction by operating the user interface 4a while observing the endoscopic image C.
This application claims the benefit of U.S. Provisional Application No. 63/468,906, filed May 25, 2023, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
63468906 | May 2023 | US