MEDICAL SYSTEM AND CONTROL METHOD

Abstract
A medical system includes an endoscope that captures an image including an object, a moving device that includes a robot arm and that moves the endoscope in a body, and a controller that controls the moving device on the basis of the position of the object. The controller is configured to control the moving device in a first control mode in which the endoscope is caused to follow the object at a first speed and a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed. The controller controls the moving device in the first control mode when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope. The controller controls the moving device in the second control mode when the object is located in the predetermined three-dimensional region.
Description
BACKGROUND ART

Conventionally, a system has been proposed to move the field of view of an endoscope in a semiautonomous manner by causing the endoscope to follow an object, e.g., a surgical instrument (for example, see PTL 1).


To allow a surgeon to follow an object with ease of operation, it is desirable to suppress excessive following motion of an endoscope so that an excessive movement of the field of view is prevented. Specifically, if an endoscope follows all the movements of an object, the field of view may become unstable and cause a surgeon to feel stress. Moreover, the field of view desirably stays stationary during procedures such as blunt dissection, because a movement of the field of view may interfere with such a procedure.


In PTL 1, a permissible region is set in an image so as to extend around the central region of the image, an endoscope follows a surgical instrument to place the surgical instrument back to the central region when the surgical instrument moves out of the permissible region, and the following is terminated when the surgical instrument moves into the central region. This configuration prevents the endoscope from following the surgical instrument insofar as the surgical instrument stays in the permissible region and the central region, thereby suppressing an excessive movement of the field of view.


CITATION LIST
Patent Literature

{PTL 1} U.S. Patent Application Publication No. 2002/0156345


SUMMARY OF INVENTION

An aspect of the present invention is a medical system including an endoscope that captures an image including an object, a moving device that includes a robot arm and that moves the endoscope in a body, and a controller that controls the moving device on the basis of the position of the object, wherein the controller is configured to control the moving device in a first control mode in which the endoscope is caused to follow the object at a first speed and a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, the controller controls the moving device in the first control mode when the object is located outside a predetermined three-dimensional region set in the field of view of the endoscope, and the controller controls the moving device in the second control mode when the object is located in the predetermined three-dimensional region.


Another aspect of the present invention is a control method for controlling a movement of an endoscope on the basis of the position of an object, the endoscope capturing an image including the object, the control method including: controlling the movement of the endoscope in a first control mode in which the endoscope is caused to follow the object at a first speed, when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope; and controlling the movement of the endoscope in a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, when the object is located in the predetermined three-dimensional region.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 illustrates an appearance of a medical system according to an embodiment of the present invention.



FIG. 2 is a block diagram of the medical system illustrated in FIG. 1.



FIG. 3 illustrates a three-dimensional specific region set in the field of view of an endoscope.



FIG. 4A is an endoscope image illustrating an example of a cross section of the specific region.



FIG. 4B is an endoscope image illustrating another example of a cross section of the specific region.



FIG. 4C is an endoscope image illustrating another example of a cross section of the specific region.



FIG. 5 is an explanatory drawing of the size of the specific region on the endoscope image at depth positions X1, X2, and X3 in FIG. 3.



FIG. 6 is an explanatory drawing of a movement of a surgical instrument followed by the endoscope in the endoscope image.



FIG. 7A is an explanatory drawing of a specific example of a method for calculating the specific region.



FIG. 7B is an explanatory drawing of the specific example of the method for calculating the specific region.



FIG. 7C is an explanatory drawing of the specific example of the method for calculating the specific region.



FIG. 7D is an explanatory drawing of the specific example of the method for calculating the specific region.



FIG. 8A is a flowchart of a control method performed by a controller in FIG. 1.



FIG. 8B is a flowchart of a modification of the control method performed by the controller in FIG. 1.



FIG. 9A is an explanatory drawing of a method for setting the size of the specific region according to the viewing angle of the endoscope.



FIG. 9B is an explanatory drawing of the method for setting the size of the specific region according to the viewing angle of the endoscope.



FIG. 10 is an explanatory drawing of a modification of a movement of the surgical instrument followed by the endoscope in the endoscope image.



FIG. 11 illustrates a three-dimensional specific region in a reference example.





DESCRIPTION OF EMBODIMENT

A medical system and a control method according to an embodiment of the present invention will be described below with reference to the accompanying drawings.


As illustrated in FIG. 1, a medical system 10 according to the present embodiment includes an endoscope 1 and a surgical instrument 2 that are inserted into the body of a patient, a moving device 3 that holds the endoscope 1 and moves the endoscope 1 in the body, a controller 4 that is connected to the endoscope 1 and the moving device 3 and controls the moving device 3, and a display device 5 that displays an endoscope image.


The endoscope 1 is, for example, a rigid endoscope and includes an imaging portion 1a that has an image sensor and captures an endoscope image (see FIG. 2). The endoscope 1 captures an endoscope image D (see FIGS. 5 and 6), which includes a tip 2a of the surgical instrument 2, through the imaging portion 1a and transmits the endoscope image D to the controller 4. The imaging portion 1a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 1 and captures a stereo image, which includes information on the three-dimensional position of the tip 2a of the surgical instrument 2, as the endoscope image D.


The moving device 3 includes a robot arm 3a having a plurality of joints 3b and holds the proximal portion of the endoscope 1 at the tip portion of the robot arm 3a. In an example, the robot arm 3a has three degrees of freedom of movement: a back-and-forth linear motion along the X axis, a rotation (pitch) about the Y axis, and a rotation (yaw) about the Z axis. A rotation (roll) about the X axis is preferably added as a degree of freedom of movement. The X axis is an axis on the same straight line as an optical axis A of the endoscope 1, and the Y and Z axes are axes that are orthogonal to the optical axis A and extend in respective directions corresponding to the lateral direction and the longitudinal direction of the endoscope image D.


As illustrated in FIG. 2, the controller 4 includes at least one processor 4a like a central processing unit, a memory 4b, a storage unit 4c, an input interface 4d, an output interface 4e, and a network interface 4f.


The endoscope images D transmitted from the endoscope 1 are sequentially inputted to the controller 4 through the input interface 4d, are sequentially outputted to the display device 5 through the output interface 4e, and are displayed on the display device 5. A surgeon operates the surgical instrument 2 inserted into a body while observing the endoscope image D displayed on the display device 5, and performs an operation on an affected area in the body by using the surgical instrument 2.


The storage unit 4c is a ROM (read-only memory) or a nonvolatile recording medium such as a hard disk and stores a program and data necessary for causing the processor 4a to perform processing. The program is read in the memory 4b and is executed by the processor 4a, thereby implementing the functions of the controller 4. The functions will be described later. Some of the functions of the controller 4 may be implemented by dedicated logic circuits or the like.


The controller 4 has a manual mode and a follow-up mode. The manual mode is a mode in which an operator, e.g., a surgeon, manually operates the endoscope 1, whereas the follow-up mode is a mode in which the controller 4 causes the endoscope 1 to automatically follow the tip 2a of the surgical instrument (object) 2.


The controller 4 switches between the manual mode and the follow-up mode on the basis of an instruction from the operator. For example, the controller 4 has an AI capable of recognizing a human voice. When recognizing a voice saying “manual mode,” the controller 4 switches to the manual mode; when recognizing a voice saying “follow-up mode,” the controller 4 switches to the follow-up mode. The controller 4 may also switch between the manual mode and the follow-up mode in response to turn-on or turn-off of a manual operation switch (not illustrated) provided on the endoscope 1.


In the manual mode, for example, an operator, e.g., a surgeon can remotely operate the robot arm 3a by operating an operating device (not illustrated) connected to the controller 4.


In the follow-up mode, the controller 4 controls the moving device 3 on the basis of the three-dimensional position of the tip 2a of the surgical instrument 2, so that the endoscope 1 is caused to three-dimensionally follow the tip 2a so as to move the tip 2a toward the center of the endoscope image D and to a predetermined depth of the endoscope image D. Specifically, the controller 4 recognizes the surgical instrument 2 in the endoscope image D and calculates the three-dimensional position of the tip 2a by using the endoscope image D. The controller 4 then operates the joints 3b such that the optical axis A of the endoscope 1 moves to the tip 2a in a direction that crosses the optical axis A and the tip of the endoscope 1 moves to a position at a predetermined observation distance from the tip 2a in the depth direction extending along the optical axis A.
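The two follow-up goals in this paragraph (moving the optical axis toward the tip 2a and bringing the endoscope tip to a predetermined observation distance) can be sketched as a small geometry computation. The camera-frame convention and function names below are illustrative assumptions, not the patented implementation:

```python
import math

def follow_targets(tip_xyz, d_obs):
    """Compute follow-up goals in the endoscope camera frame (assumed:
    X along the optical axis A, Y lateral, Z longitudinal).
    Returns yaw/pitch corrections that would centre the tip on the axis,
    and the distance to advance along the axis to reach d_obs."""
    x, y, z = tip_xyz
    yaw = math.atan2(y, x)                  # rotation about the Z axis
    pitch = math.atan2(z, x)                # rotation about the Y axis
    advance = math.hypot(x, y, z) - d_obs   # > 0: move in; < 0: back away
    return yaw, pitch, advance
```

A usage example: a tip lying on the axis 100 mm away, with a desired observation distance of 80 mm, yields zero pan/tilt corrections and a 20 mm advance.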


In this case, the follow-up mode includes a first control mode in which the endoscope 1 is caused to follow the tip 2a of the surgical instrument 2 at a first speed and a second control mode in which the endoscope 1 is caused to follow the tip 2a of the surgical instrument 2 at a second speed lower than the first speed. As illustrated in FIG. 3, the controller 4 controls the moving device 3 in the first control mode when the tip 2a is located outside a predetermined specific region B, and the controller 4 controls the moving device 3 in the second control mode when the tip 2a is located in the specific region B. Thus, when the tip 2a is located in the specific region B, sensitivity for following a movement of the tip 2a by the endoscope 1 decreases, thereby suppressing excessive following motion of the endoscope 1 with respect to the tip 2a.


The specific region B is a predetermined three-dimensional region that is set in a field of view F of the endoscope 1 and has dimensions in the X direction, the Y direction, and the Z direction that are orthogonal to one another. The X direction is a depth direction parallel to the optical axis A of the endoscope 1. The Y direction and the Z direction are directions that are orthogonal to the optical axis A and are parallel respectively to the lateral direction and the longitudinal direction of the endoscope image D.


The specific region B is separated from the tip of the endoscope 1 in the X direction and is set in a part of the range of the field of view F in the X direction. Moreover, the specific region B includes the optical axis A and has a three-dimensional shape that decreases in size in cross section toward the tip of the endoscope 1. Hence, the specific region B on the endoscope image D is a central region including the center of the endoscope image D. As illustrated in FIGS. 4A to 4C, the specific region B orthogonal to the optical axis A may be rectangular, circular, or oval in cross section and may have any other shapes such as a polygon. The specific region B may be superimposed on the endoscope image D or may be hidden.


In an example, the specific region B and the endoscope image D are identical in shape in cross section. For example, when the endoscope image D is rectangular, the specific region B is also rectangular in cross section. The specific region B displayed on the endoscope image D may interfere with an observation of the endoscope image D and thus is preferably hidden. When the specific region B and the endoscope image D are identical in shape, the surgeon easily recognizes the position of the hidden specific region B.


The field of view F of the endoscope 1 is typically shaped like a cone having the vertex at or near the tip of the endoscope 1. The specific region B is preferably a frustum with the vertex shared with the field of view F of the endoscope 1. As illustrated in FIG. 5, the size and position of the displayed specific region B are fixed on the endoscope image D regardless of positions X1, X2, and X3 in the X direction.
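A frustum-shaped region B that shares its vertex with the cone of the field of view admits a simple membership test. The following sketch assumes a camera frame with X along the optical axis A, the cone vertex at the endoscope tip, and hypothetical parameter names:

```python
import math

def in_frustum_region(tip_xyz, beta_deg, x_min, x_max):
    """Check whether a point (x, y, z) lies inside a frustum-shaped
    region B with vertex half-angle beta at the endoscope tip,
    spanning depths x_min..x_max along the optical axis."""
    x, y, z = tip_xyz
    if not (x_min <= x <= x_max):   # outside the depth range of B
        return False
    r = math.hypot(y, z)            # lateral distance from the optical axis A
    # the permissible lateral radius grows with depth, so the on-image
    # size of B stays fixed regardless of the depth position
    return r <= x * math.tan(math.radians(beta_deg))
```

Because the radius bound grows linearly with depth, the projected cross section of B occupies a fixed fraction of the image, matching the fixed on-image size shown in FIG. 5.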


The size of the specific region B on the endoscope image D (that is, the size of the specific region B relative to the field of view F in cross section) is preferably 25% to 55% of the size of the endoscope image D. In the case of the specific region B like a frustum, a vertex angle β of the specific region B is preferably 25% to 55% of a viewing angle α of the endoscope 1. This configuration can place the tip 2a of the surgical instrument 2 at the center of the endoscope image D and suppress excessive following motion of the endoscope 1 with respect to the tip 2a.


When the size of the specific region B is less than 25% of the size of the endoscope image D, the effect of suppressing excessive following motion of the endoscope 1 relative to a movement of the tip 2a may become insufficient, leading to frequent movements of the field of view F. When the size of the specific region B is larger than 55% of the size of the endoscope image D, the tip 2a is frequently disposed at a position remote from the center of the endoscope image D, leading to difficulty in placing the tip 2a at the center.


As illustrated in FIGS. 3 to 5, the specific region B includes a non-following region B1 and a following region B2. The non-following region B1 is a central region of the specific region B including the optical axis A. The following region B2 is an outer region of the specific region B and surrounds the non-following region B1. Like the specific region B, the non-following region B1 has a three-dimensional shape that decreases in size in cross section toward the tip of the endoscope 1. The non-following region B1 is preferably a frustum.


As illustrated in FIG. 6, when the tip 2a of the surgical instrument 2 is disposed outside the following region B2, the controller 4 rotates, for example, the robot arm 3a about the Y axis and the Z axis so as to cause the endoscope 1 to follow the tip 2a at a first speed V1.


When the tip 2a is disposed in the non-following region B1, the controller 4 keeps the position of the endoscope 1 without causing the endoscope 1 to follow the tip 2a. Specifically, the controller 4 controls the angular velocities of the joints 3b to zero. Thus, the second speed in the non-following region B1 is zero.


When the tip 2a is disposed in the following region B2, the controller 4 continues the operation of the endoscope 1 in the previous control cycle. Specifically, when the position of the endoscope 1 is kept in the previous control cycle, the controller 4 keeps the position of the endoscope 1 also in the current control cycle. When the endoscope 1 is caused to follow the tip 2a in the previous control cycle, the controller 4 causes the endoscope 1 to follow the tip 2a also in the current control cycle. At this point, the speed of following is a second speed V2 higher than zero.


In the foregoing control, the following region B2 acts as a trigger for starting following of the tip 2a by the endoscope 1, and the non-following region B1 acts as a trigger for terminating following of the tip 2a by the endoscope 1. Specifically, when the tip 2a moves from the following region B2 to an outer region C, the endoscope 1 starts following the tip 2a. When the tip 2a enters the non-following region B1 from the outer region C through the following region B2, following of the tip 2a by the endoscope 1 is terminated.
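The start/stop hysteresis described above amounts to a three-state rule evaluated once per control cycle; a minimal sketch (the region labels are assumptions):

```python
def update_following(was_following, region):
    """One control cycle of the start/stop hysteresis.
    region: 'outer' (region C), 'following' (B2) or 'non_following' (B1)."""
    if region == 'outer':
        return True            # tip left region B: start (or continue) following
    if region == 'non_following':
        return False           # tip reached B1: terminate following
    return was_following       # in B2: continue the previous cycle's operation
```

The B2 band thus behaves like the dead band of a thermostat: crossing its outer boundary starts following and only reaching B1 stops it, which prevents rapid start/stop chatter at a single threshold.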


In the foregoing control, the first speed V1 and the second speed V2 are each kept constant, so that the following speed of the endoscope 1 is changed in two steps.


Alternatively, the first speed V1 and the second speed V2 may change according to a distance from the center of the endoscope image D to the tip 2a. For example, the controller 4 may calculate distances from the optical axis A of the endoscope 1 to the tip 2a in the Y direction and the Z direction and increase the speeds V1 and V2 according to the distances. In this case, the following speeds V1 and V2 of the endoscope 1 may continuously decrease from the outer region C to the non-following region B1.
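A distance-dependent speed profile of the kind suggested here could look as follows; the gain and cap values are arbitrary assumptions:

```python
import math

def follow_speed(dy_mm, dz_mm, v_max=1.0, gain=0.01):
    """Following speed that increases with the lateral distance of the
    tip from the optical axis A and saturates at v_max, so the speed
    decreases continuously as the tip approaches the image centre."""
    r = math.hypot(dy_mm, dz_mm)   # distance from the axis in the YZ plane
    return min(gain * r, v_max)
```

With such a profile the transition between the first and second control modes is smooth rather than stepwise, since the commanded speed falls continuously from the outer region C toward the non-following region B1.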



FIGS. 7A to 7D are explanatory drawings of a method for calculating the specific region B.


As illustrated in FIG. 7A, the controller 4 sets, as a fiducial point E, an intersection point of the optical axis A and a YZ plane P that passes through the tip 2a of the surgical instrument 2 and is perpendicular to the optical axis A. The controller 4 then defines, as the specific region B, a region like a rectangular solid or a sphere that is centered around the fiducial point E.
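Because the fiducial point E lies on the optical axis at the same depth as the tip 2a, membership in a rectangular-solid or spherical region B centered on E reduces to a lateral-offset test; a sketch under that assumption (half-sizes and radius are hypothetical parameters):

```python
import math

def in_region_about_E(tip_xyz, half_y, half_z, shape="box", radius=None):
    """Membership test using fiducial point E: E lies on the optical
    axis (y = z = 0) at the tip's own depth, so only the lateral
    offset of the tip from the axis matters."""
    _, y, z = tip_xyz
    if shape == "sphere":
        return math.hypot(y, z) <= radius
    return abs(y) <= half_y and abs(z) <= half_z
```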



FIGS. 7B and 7C are explanatory drawings of a method for calculating an actual size [mm] of the specific region B.


A size Lmax_dz [mm] of the endoscope image D in the Z direction (longitudinal direction), that is, the size of the field of view F in the Z direction, at an observation distance di (i = 1, 2, ...) is expressed by the following formula according to the geometric relationship of FIG. 7C.






Lmax_dz = di*tan(α)




where α [deg] is the viewing angle (half angle of view) of the endoscope 1. A pixel size Lmax_dz_pixel [px] of the endoscope image D in the Z direction is known and is expressed by, for example, the following formula:






Lmax_dz_pixel = 1080/2 [px]






Thus, an actual size L_dz [mm] of the specific region B in the Z direction is calculated from the following formula by using a pixel size dz [px] of the specific region B in the Z direction.






L_dz = Lmax_dz*dz/Lmax_dz_pixel






An actual size L_dy[mm] of the specific region B in the Y direction is also calculated by the same method as L_dz.


An actual size L_dx of the specific region B in the X direction is also set. For example, the actual size L_dx may be set at a fixed value regardless of the observation distance di. Alternatively, as illustrated in FIG. 7D, an actual size L_dx at a reference observation distance di (e.g., d1) may be preset and L_dx at another observation distance di (e.g., d2) may be set at a value proportionate to a change of the observation distance.
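The size calculation of FIGS. 7B and 7C can be collected into one function; this sketch assumes the 1080-pixel image height used in the example above:

```python
import math

def region_size_mm(d_i_mm, alpha_deg, dz_px, half_height_px=1080 / 2):
    """Actual Z-direction size of region B at observation distance d_i.
    Per the relations above, Lmax_dz = d_i*tan(alpha) [mm] corresponds to
    half_height_px pixels, so dz pixels map to L_dz millimetres."""
    lmax_dz = d_i_mm * math.tan(math.radians(alpha_deg))
    return lmax_dz * dz_px / half_height_px
```

For instance, with a half angle of view of 45 degrees, an observation distance of 100 mm, and a region half-height of 270 px, the region half-size is about 50 mm.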


The operation of the medical system 10 will be described below.


A surgeon performs a procedure by operating the surgical instrument 2 inserted into a body while observing the endoscope image D displayed on the display device 5. During the procedure, the surgeon switches from the manual mode to the follow-up mode or from the follow-up mode to the manual mode in response to, for example, a voice.


As indicated in FIG. 8A, when switching to the follow-up mode in step S1, the controller 4 performs the control method of steps S2 to S8 and controls the moving device 3 in the follow-up mode.


The control method includes step S2 of determining whether the tip 2a of the surgical instrument 2 is located in the specific region B and steps S3 to S8 of causing, when the position of the tip 2a is located outside the specific region B, the endoscope 1 to follow the surgical instrument 2 until the tip 2a of the surgical instrument 2 reaches the non-following region B1.


After the start of the follow-up mode (YES at step S1), the controller 4 calculates the three-dimensional position of the tip 2a by using the endoscope image D, which is a stereo image, and determines whether the tip 2a is located in the predetermined specific region B (step S2). When the tip 2a is located in the specific region B (YES at step S2), the controller 4 keeps the position of the endoscope 1 without causing the endoscope 1 to follow the surgical instrument 2. When the tip 2a is located outside the specific region B (NO at step S2), the controller 4 starts following of the surgical instrument 2 by the endoscope 1 (step S3).


In following of the surgical instrument 2, the controller 4 selects one of the first control mode and the second control mode on the basis of the position of the tip 2a. As illustrated in FIG. 6, the tip 2a is located outside the specific region B at the start of following (NO at step S4) and thus the controller 4 controls the moving device 3 in the first control mode, so that the endoscope 1 is caused to follow the tip 2a of the surgical instrument 2 at the first speed V1 so as to move the tip 2a of the surgical instrument 2 toward the center of the endoscope image D (step S5). The controller 4 controls the moving device 3 in the first control mode until the tip 2a enters the specific region B.


After the tip 2a enters the specific region B (YES at step S4), the controller 4 controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2a of the surgical instrument 2 at the second speed V2 so as to move the tip 2a of the surgical instrument 2 toward the center of the endoscope image D (step S7). Since the second speed V2 is lower than the first speed V1, the responsivity for following a movement of the tip 2a by the endoscope 1 decreases. In other words, after the tip 2a returns to the specific region B from the outer region C, the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2. The controller 4 controls the moving device 3 in the second control mode until the tip 2a enters the non-following region B1.


When the tip 2a of the surgical instrument 2 enters the non-following region B1 (YES at step S6), the controller 4 causes the endoscope 1 to finish following the surgical instrument 2 (step S8).


While the follow-up mode continues (NO at step S9), the controller 4 repeats steps S1 to S8.


To allow a surgeon to follow an object through the endoscope 1 with ease of operation, it is desirable to cause the endoscope 1 to follow the surgical instrument 2 so as to satisfy three conditions: suppressing excessive following motion, bringing the tip 2a of the surgical instrument 2 to the center of the endoscope image D, and bringing the tip 2a of the surgical instrument 2 to a proper distance in the X direction.


According to the present embodiment, the specific region B is a three-dimensional region set in the field of view F and thus can be properly designed. For example, a distance between the tip of the endoscope 1 and the specific region B in the X direction and a size of the specific region B in cross section at each position in the X direction are designed to satisfy the three conditions. This allows the endoscope 1 to follow the surgical instrument 2 with ease of operation.


Moreover, the specific region B is shaped to decrease in size in cross section toward the tip of the endoscope 1, thereby suppressing a difference in the size of the displayed specific region B on the endoscope image D between positions in the X direction. The specific region B is preferably displayed with a fixed size regardless of the position in the X direction. This can suppress excessive following motion of the endoscope 1 and place the tip 2a at the center regardless of the position of the tip 2a in the X direction.



FIG. 11 illustrates a specific region B′ as a reference example. As illustrated in FIG. 11, when the specific region B′ is formed by simply extending a two-dimensional region on the image plane of the endoscope image D, the specific region B′ extends from the tip of the endoscope 1 in the X direction. Thus, the endoscope 1 cannot be caused to follow the surgical instrument 2 such that the tip 2a is brought to a proper distance in the X direction.


Moreover, the size of the specific region B′ is fixed in cross section and thus the displayed specific region B′ on the endoscope image D changes in size according to the position in the X direction. This leads to difficulty in suppressing excessive following motion of the endoscope 1 with respect to the surgical instrument 2 while placing the tip 2a at the center. Specifically, the displayed specific region B′ decreases in size at a position X3 remote from the tip of the endoscope 1 in the X direction, so that excessive following motion of the endoscope 1 cannot be suppressed, though the tip 2a can be placed at the center. The size of the displayed specific region B′ increases at a position X1 close to the tip of the endoscope 1 in the X direction. This can suppress excessive following motion of the endoscope 1 but leads to difficulty in placing the tip 2a at the center.


In the foregoing embodiment, when the tip 2a is located in the following region B2, the controller 4 continues the operation of the endoscope 1 in the previous control cycle. Alternatively, the endoscope 1 may be caused to always follow the tip 2a at the second speed V2 higher than zero. In other words, the controller 4 may control the moving device 3 in the second control mode in either of the following cases: where the tip 2a enters the following region B2 from the outer region C, and where the tip 2a enters the following region B2 from the non-following region B1.


In this case, as indicated in FIG. 8B, the controller 4 determines whether the tip 2a is located in the non-following region B1 (step S2′). When the tip 2a moves from the non-following region B1 into the following region B2 (NO at step S2′), the controller 4 starts following of the surgical instrument 2 by the endoscope 1 (step S3).


The tip 2a is located in the following region B2 at the start of following (YES at step S4 and NO at step S6) and thus the controller 4 controls the moving device 3 in the second control mode, so that the endoscope 1 is caused to follow the tip 2a of the surgical instrument 2 at the second speed V2 so as to move the tip 2a of the surgical instrument 2 toward the center of the endoscope image D (step S7). The controller 4 controls the moving device 3 in the second control mode until the tip 2a enters the non-following region B1. As described above, since the second speed V2 is lower than the first speed V1, the endoscope 1 is prevented from excessively following a movement of the surgical instrument 2 while the tip 2a moves in the following region B2.
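In this FIG. 8B variant, the commanded behaviour depends only on which region currently contains the tip 2a, with no memory of the previous cycle; a hypothetical mapping:

```python
def commanded_mode(region):
    """FIG. 8B variant (sketch): outside B -> first mode (speed V1);
    in B2 -> second mode (speed V2); in B1 -> hold position (speed zero).
    Region labels are assumed names, not the patent's terminology."""
    modes = {"outer": "first", "following": "second", "non_following": "hold"}
    return modes[region]
```

Compared with the FIG. 8A flow, this mapping is stateless: the same region always yields the same mode, whereas FIG. 8A lets the B2 band continue whatever the previous cycle was doing.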


When the tip 2a moves out of the specific region B while being followed by the endoscope 1 in the second control mode (NO at step S4), the controller 4 switches from the second control mode to the first control mode (step S5) and controls the moving device 3 in the first control mode until the tip 2a returns to the specific region B.


In the foregoing embodiment, as illustrated in FIGS. 9A and 9B, the controller 4 may change the size of the specific region B in cross section according to the viewing angle α of the endoscope 1.


For example, in the storage unit 4c, a value of the viewing angle α is stored for each type of the endoscope 1. The controller 4 recognizes the type of the endoscope 1 held by the robot arm 3a, reads the value of the viewing angle α of the recognized type from the storage unit 4c, and sets the vertex angle β of the specific region B at a predetermined rate of the viewing angle α. For example, the vertex angle β is calculated by multiplying the value of the viewing angle α by a predetermined rate k selected from 25% to 55%. Thus, the specific region B increases in size in cross section in proportion to the viewing angle α.
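Setting the vertex angle β as a fixed fraction of the viewing angle α is a one-line computation; the guard on k reflects the 25% to 55% range given above (a sketch, not the patented implementation):

```python
def vertex_angle_deg(alpha_deg, k=0.5):
    """Vertex angle beta of region B as a fixed fraction k of the
    viewing angle alpha, so the area ratio of B to the field of view
    stays constant across endoscopes with different viewing angles."""
    if not 0.25 <= k <= 0.55:
        raise ValueError("k outside the recommended 25%-55% range")
    return k * alpha_deg
```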


With this configuration, the area ratio of the cross section of the specific region B to the cross section of the field of view F is kept constant regardless of a difference in the viewing angle α of the used endoscope 1. Hence, the displayed specific region B on the endoscope image D displayed on the display device 5 can have the same size regardless of the viewing angle α of the endoscope 1.


In the foregoing embodiment, the specific region B includes the non-following region B1 where the endoscope 1 is not caused to follow the surgical instrument 2. Alternatively, as illustrated in FIG. 10, the non-following region B1 may be absent in the specific region B. In this modification, the controller 4 causes the endoscope 1 to follow the surgical instrument 2 at the second speed V2 until the tip 2a of the surgical instrument 2 is located at the center of the endoscope image D. When the tip 2a is located at the center of the endoscope image D, the controller 4 causes the endoscope 1 to finish following the surgical instrument 2.


In the case of FIG. 6, the endoscope 1 finishes following the tip 2a when the tip 2a reaches the end of the following region B2 remote from the center of the endoscope image D. In contrast, in FIG. 10, the endoscope 1 is caused to follow the tip 2a until the tip 2a reaches the center of the endoscope image D. Thus, a procedure can be performed with the tip 2a located at the center of the endoscope image D.


In the modification of FIG. 10, the second speed V2 is preferably 50% or less of the first speed V1. The second speed V2 may remain constant or gradually decrease as the tip 2a of the surgical instrument 2 moves close to the center of the endoscope image D. If the second speed V2 is higher than 50% of the first speed V1, it is difficult to sufficiently obtain the effect of suppressing excessive following motion of the endoscope 1.
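A second speed V2 that is capped at half of V1 and gradually decreases toward the image center, as described in this modification, might be sketched as follows; the fade distance is an arbitrary assumption:

```python
def second_speed(dist_px, v1=1.0, ratio=0.5, fade_px=100.0):
    """Second speed V2, capped at ratio*v1 (ratio <= 0.5 as recommended
    above) and decreasing linearly to zero as the tip's pixel distance
    from the image centre shrinks below fade_px."""
    return ratio * v1 * min(dist_px / fade_px, 1.0)
```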


In the foregoing embodiment, the shape of the specific region B may be changed in cross section. For example, the cross-sectional shape can be selected from a rectangle, a circle, and an ellipse in FIGS. 4A to 4C, and parameters dy, dz, R, a, and b can be set to determine the size of each shape in cross section. The selection of the shape and the setting of the parameters may be manual operations performed by the surgeon or automatic operations performed by the controller 4.


With this configuration, the shape and size of the specific region B in cross section can be set according to, for example, the technique, the contents of the procedure, or the preferences of the surgeon.


In an example, in a procedure where the tip 2a frequently makes large movements in the longitudinal direction of the endoscope image D, the cross section is set as a vertically oriented ellipse in FIG. 4C. This can prevent the field of view F from excessively reacting to a longitudinal movement of the tip 2a and vibrating in the longitudinal direction, thereby keeping the field of view F stationary regardless of a longitudinal movement of the tip 2a during the procedure.


The controller 4 may recognize the type of the surgical instrument 2 or a procedure and automatically change at least one of the shape of the specific region B, the sizes of the specific region B in the X, Y, and Z directions, and the position of the specific region B according to the type of the surgical instrument 2 or the procedure. Furthermore, the controller 4 may automatically change the first speed and the second speed according to the type of the surgical instrument 2 or the procedure. For example, the controller 4 recognizes the type of the surgical instrument 2 on the basis of the endoscope image D and recognizes the type of the procedure according to the type of the surgical instrument 2.


The proper shape, size, and position of the specific region B change according to the type of the surgical instrument 2 or the procedure. With this configuration, the shape, size, and position of the specific region B can be automatically set to be suitable for the type of the surgical instrument 2 and the procedure.


In an example, when the type of the surgical instrument 2 is gripping forceps, the specific region B having a larger size in the X direction is set to be located at a larger distance from the tip of the endoscope 1. For example, a range of 90 mm to 190 mm from the tip of the endoscope 1 is set as the specific region B.


In another example, when the type of the surgical instrument 2 is an energy treatment tool, the specific region B is set to be located at a shorter distance from the tip of the endoscope 1 in order to perform an elaborate procedure. For example, a range of 60 mm to 90 mm from the tip of the endoscope 1 is set as the specific region B. Moreover, in order to prevent a movement of the field of view F during a blunt dissection operation, the size of the specific region B may be increased in cross section or the second speed may be reduced.
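The instrument-dependent placement of the specific region B described in the two examples above can be expressed as a simple lookup. The numeric ranges (90 mm to 190 mm for gripping forceps, 60 mm to 90 mm for an energy treatment tool) come from the text; the dictionary keys, the function name, and the fallback default are illustrative assumptions.

```python
# Distance ranges (mm from the tip of the endoscope 1, along the optical
# axis A) for the specific region B, keyed by recognized instrument type.
# The two listed ranges are the ones given in the examples; the default
# used for unrecognized instruments is an assumption.
REGION_RANGE_BY_INSTRUMENT = {
    "gripping_forceps": (90.0, 190.0),      # larger region, farther away
    "energy_treatment_tool": (60.0, 90.0),  # closer, for elaborate work
}

def region_range(instrument_type, default=(60.0, 190.0)):
    """Return the (near, far) extent of region B for an instrument type."""
    return REGION_RANGE_BY_INSTRUMENT.get(instrument_type, default)
```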


In still another example, the controller 4 may learn a movement of the tip 2a during a procedure and change the shape and size of the specific region B such that the motion range of the tip 2a is included in the specific region B during the procedure.


In the foregoing embodiment, a definite border may be absent between the specific region B and the outer region C. In other words, the controller 4 may continuously change the following speed according to a distance from the center of the endoscope image D to the tip 2a.


For example, the controller 4 may calculate an angular velocity Vp of a rotation about the Y axis and an angular velocity Vy of a rotation about the Z axis according to the formulas below, and rotate the robot arm 3a at the calculated angular velocities Vp and Vy. Here, py is the distance from the center of the endoscope image D to the tip 2a in the Y direction, pz is the distance from the center of the endoscope image D to the tip 2a in the Z direction, and Gy and Gz are predetermined proportionality coefficients.


Vp = Gz*pz

Vy = Gy*py
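The continuous-following variant above amounts to proportional control of the two rotation axes. A minimal sketch, with a hypothetical function name:

```python
def follow_velocities(py, pz, gy, gz):
    """Angular velocities proportional to the tip's offset from the
    image center (Vp = Gz*pz about the Y axis, Vy = Gy*py about the
    Z axis), so the following speed falls smoothly to zero as the tip
    2a approaches the center, with no definite border between the
    specific region B and the outer region C.
    """
    vp = gz * pz  # rotation about the Y axis
    vy = gy * py  # rotation about the Z axis
    return vp, vy
```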

In the foregoing embodiment, the endoscope 1 captures a three-dimensional stereo image as the endoscope image D. Alternatively, a two-dimensional endoscope image D may be captured. In this case, for example, the position of the tip 2a of the surgical instrument 2 in the X direction may be measured by another distance-measuring means, e.g., a distance sensor provided at the tip of the endoscope 1.


In the foregoing embodiment, an object to be followed by the endoscope 1 is the surgical instrument 2 but is not limited thereto. The endoscope 1 may follow any object in the endoscope image D during a surgical operation. For example, an object may be a lesion, an organ, a blood vessel, a marker, a biomedical material such as gauze, or a medical instrument other than the surgical instrument 2.










REFERENCE SIGNS LIST

1 Endoscope
1a Imaging portion
2 Surgical instrument (object)
3 Moving device
3a Robot arm
3b Joint
4 Controller
5 Display device
10 Medical system
A Optical axis
B Specific region (predetermined three-dimensional region)
B1 Non-following region (specific region)
B2 Following region (specific region)
C Outer region
D Endoscope image
F Field of view
α Viewing angle




Claims
  • 1. A medical system comprising: an endoscope that captures an image including an object; a moving device that comprises a robot arm and that moves the endoscope in a body; and a controller that controls the moving device on a basis of a position of the object, wherein the controller is configured to control the moving device in a first control mode in which the endoscope is caused to follow the object at a first speed and a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, the controller controls the moving device in the first control mode when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope, and the controller controls the moving device in the second control mode when the object is located in the predetermined three-dimensional region.
  • 2. The medical system according to claim 1, wherein the three-dimensional region orthogonal to an optical axis of the endoscope is shaped to decrease in size in cross section toward a tip of the endoscope.
  • 3. The medical system according to claim 1, wherein the endoscope is configured to capture a stereo image, and the controller calculates a three-dimensional position of the object by using the stereo image.
  • 4. The medical system according to claim 1, wherein the object is a surgical instrument, and the controller recognizes a type of the surgical instrument and the controller changes at least one of a size and a shape of the three-dimensional region orthogonal to an optical axis of the endoscope in cross section according to the type of the surgical instrument.
  • 5. The medical system according to claim 1, wherein the controller controls the moving device in the first control mode and the second control mode so as to move the object toward a center of the image.
  • 6. The medical system according to claim 1, wherein the controller controls the moving device in the first control mode until the object enters the three-dimensional region.
  • 7. The medical system according to claim 1, wherein the controller recognizes a type of a procedure, and the controller changes at least one of a size and a shape of the three-dimensional region according to the type of the procedure.
  • 8. The medical system according to claim 1, wherein the controller changes a size of the three-dimensional region orthogonal to an optical axis of the endoscope in cross section according to a viewing angle of the endoscope.
  • 9. A control method for controlling a movement of an endoscope on a basis of a position of an object, the endoscope capturing an image including the object, the control method comprising: controlling the movement of the endoscope in a first control mode in which the endoscope is caused to follow the object at a first speed, when the object is located outside a predetermined three-dimensional region set in a field of view of the endoscope; and controlling the movement of the endoscope in a second control mode in which the endoscope is caused to follow the object at a second speed lower than the first speed, when the object is located in the predetermined three-dimensional region.
TECHNICAL FIELD

The present invention relates to a medical system and a control method, and particularly relates to a medical system having the function of causing an endoscope to follow an object, and a control method thereof. The present application claims priority to U.S. Provisional Patent Application No. 63/076408, filed on Sep. 10, 2020, which is incorporated herein by reference. This application is a continuation of International Application PCT/JP2021/027564, which is hereby incorporated by reference herein in its entirety.

Provisional Applications (1)
Number Date Country
63076408 Sep 2020 US
Continuations (1)
Number Date Country
Parent PCT/JP2021/027564 Jul 2021 WO
Child 18105291 US