The present invention relates to a controller, an endoscope system, a control method, and a control program, and particularly relates to a controller, an endoscope system, a control method, and a control program, by which an endoscope image displayed on a display device is controlled.
The present application claims priority to U.S. Provisional Patent Application No. 63/076,408 filed on Sep. 10, 2020, which is incorporated herein by reference. This application is a continuation of International Application No. PCT/JP2021/033209, which is hereby incorporated by reference herein in its entirety.
Conventionally, a system has been proposed to move the field of view of an endoscope in a semiautonomous manner by causing the endoscope to follow a surgical instrument (for example, see PTL 1 and PTL 2).
An aspect of the present invention is a controller configured to control an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device, the controller including a processor, wherein the processor acquires a first image that is an image captured by the image sensor, the processor detects a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image, and the processor generates a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.
Another aspect of the present invention is an endoscope system including an endoscope, a moving device that includes a robot arm and that moves the endoscope in a subject, and the controller.
Another aspect of the present invention is a control method for controlling an image that is captured by an image sensor of an endoscope and is displayed on the display screen of a display device, the control method including: acquiring a first image that is an image captured by the image sensor; detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image, and generating a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.
Another aspect of the present invention is a non-transitory computer-readable medium having a control program stored therein, the program being for controlling an image that is captured by an image sensor of an endoscope and is displayed on a display screen of a display device, the program causing a processor to execute functions of: acquiring a first image that is an image captured by the image sensor; detecting a first angle of a surgical instrument, the first angle being an angle formed by the surgical instrument in the first image with respect to a predetermined reference line that is set with respect to a plane of the first image; and generating a second image rotated with respect to the first image on the basis of the first angle and a predetermined target angle, the surgical instrument forming a second angle in the second image with respect to the predetermined reference line such that the second angle is equal to the predetermined target angle.
A controller, an endoscope system, a control method, and a control program according to a first embodiment of the present invention will be described below with reference to the accompanying drawings.
As illustrated in
In the endoscope system 10 of
As illustrated in
The endoscope 2 is, for example, a rigid endoscope and includes an image sensor 2a. The image sensor 2a is, for example, a three-dimensional camera provided at the tip portion of the endoscope 2 and captures a stereo image, which includes a tip 6a of the surgical instrument 6, as the endoscope image A (for example, see
The endoscope image A is transmitted from the endoscope 2 to the endoscope processor 4, is subjected to necessary processing in the endoscope processor 4, is transmitted from the endoscope processor 4 to the display device 5, and is displayed on a display screen 5a of the display device 5. The display device 5 may be any display, for example, a liquid crystal display or an organic electroluminescent display. A surgeon operates the surgical instrument 6 in a body while observing the endoscope image A displayed on the display screen 5a. The display device 5 may include an audio system, for example, a speaker.
In addition to the display device 5, a user terminal that communicates with the controller 1 and the endoscope processor 4 via a communication network may be provided to display the endoscope image A at the terminal. The terminal is, for example, a notebook or laptop computer, a tablet computer, or a smartphone, but is not particularly limited thereto.
The moving device 3 includes a robot arm 3a (including an electric scope holder) that holds the endoscope 2 and three-dimensionally controls the position and orientation of the endoscope 2. The moving device 3 in
As illustrated in
The storage unit 1c is a hard disk or a nonvolatile recording medium including a semiconductor memory such as flash memory, and stores various programs including a follow-up control program (not illustrated) and an image control program (control program) 1g, as well as data necessary for the processing of the processor 1a. Processing performed by the processor 1a may be partially implemented by dedicated logic circuits or hardware, for example, an FPGA (Field-Programmable Gate Array), a SoC (System-on-a-Chip), an ASIC (Application-Specific Integrated Circuit), or a PLD (Programmable Logic Device). The processing will be described later.
The storage unit 1c may be a server, e.g., a cloud server connected via a communication network to the controller 1 provided with a communication interface, instead of a recording medium integrated in the controller 1. The communication network may be, for example, a public network such as the Internet, a dedicated line, or a LAN (Local Area Network). The connection of the devices may be wired connection or wireless connection.
Any of the at least one processor 1a, the memory 1b, the storage unit 1c, the input interface 1d, the output interface 1e, and the user interface 1f of the controller 1 may be provided in a user terminal separate from the endoscope processor 4 and the controller 1. The controller 1 may be integrated with the moving device 3.
The processor 1a performs processing according to the follow-up control program (not illustrated) read in the memory 1b to cause the endoscope 2 to follow the surgical instrument 6 to be followed. Specifically, the processor 1a acquires the three-dimensional position of the tip 6a of the surgical instrument 6 from the endoscope image A and controls the moving device 3 on the basis of the three-dimensional position of the tip 6a and the three-dimensional position of a predetermined target point set in the field of view of the endoscope 2. The target point is, for example, a point that is located on an optical axis and is disposed at a predetermined distance from the tip of the endoscope 2 in a direction parallel to the optical axis. The target point corresponds to a center point C of the endoscope image A. Thus, the controller 1 controls a movement of the endoscope 2 and causes the endoscope 2 to automatically follow the surgical instrument 6 such that the target point is disposed at the tip 6a.
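The follow-up control described above can be sketched as computing an error vector between the target point on the optical axis and the instrument tip 6a. The following Python fragment is an illustrative sketch only, not the claimed implementation; all function and variable names are assumptions, and the controller that drives the moving device 3 from this error is omitted.

```python
import numpy as np

def follow_error(tip_6a_pos, scope_tip_pos, optical_axis_unit, target_distance):
    """Follow-up control sketch: the target point lies on the optical axis
    at target_distance from the endoscope tip. Returns the 3-D error vector
    from the target point to the instrument tip 6a; a (hypothetical) arm
    controller would move the endoscope so this error goes to zero, i.e.
    so the target point is disposed at the tip 6a."""
    target_point = np.asarray(scope_tip_pos, float) + \
        target_distance * np.asarray(optical_axis_unit, float)
    return np.asarray(tip_6a_pos, float) - target_point
```

When the tip 6a sits exactly at the target point, the error is zero and the endoscope need not move.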
The processor 1a is configured to operate in a rotation mode. As illustrated in
The input interface 1d and the output interface 1e are connected to the endoscope processor 4. The controller 1 can acquire the endoscope image A from the endoscope 2 via the endoscope processor 4 and output the endoscope image A to the display device 5 via the endoscope processor 4. The input interface 1d may be directly connected to the endoscope 2 and the output interface 1e may be directly connected to the display device 5 such that the controller 1 can directly acquire the endoscope image A from the endoscope 2 and directly output the endoscope image A to the display device 5.
The user interface 1f has input devices, for example, a mouse, buttons, a keyboard, and a touch panel, through which users such as a surgeon make inputs, and receives a user input. Moreover, the user interface 1f has a means, for example, a switch, that allows a user to switch the rotation mode on and off. When the controller 1 starts, the switch is initially off. Thereafter, when the switch is turned on by a user, the user interface 1f receives the input of the turn-on of the rotation mode. When the switch is turned off by the user, the user interface 1f receives the input of the turn-off of the rotation mode.
For a proper operation of the surgical instrument 6 by the surgeon who is observing the endoscope image A, it is important to properly set the vertical direction of the endoscope image A (the orientations of subjects in the endoscope image A, for example, the surgical instrument 6 and a biological tissue) displayed on the display screen 5a. For example, it may be desirable for the surgeon that the surgical instrument 6 operated with the surgeon's right hand protrude from the right side of the endoscope image A at about 30°. During a surgical operation, however, a movement of the surgical instrument 6 by the surgeon or a change of the orientation of the endoscope 2 following the surgical instrument 6 leads to a change of the angle θ of the surgical instrument 6 in the endoscope image A. When the surgical instrument 6 in the endoscope image A is to be displayed at the target angle θt on the display screen 5a, a user, e.g., a surgeon, can start the rotation mode by turning on the switch.
A control method performed by the processor 1a in the rotation mode will be described below.
As indicated in
When the user interface 1f receives the input of the turn-on of the rotation mode (YES at step S1), the processor 1a acquires the first image A1, which is the latest endoscope image A, from the endoscope 2 as illustrated in
The processor 1a then recognizes the surgical instrument 6 and the shaft 6b in the first image A1 and detects the first angle θ1 of the surgical instrument 6 (step S3). The surgical instrument 6 and the shaft 6b are recognized by using, for example, a known image recognition technique based on deep learning. The first angle θ1 is an angle formed by a longitudinal axis B of the shaft 6b with respect to a predetermined reference line L.
The predetermined reference line L is a straight line that is set with respect to the plane of the first image A1 and forms a predetermined angle with respect to the horizontal line in the first image A1. The reference line L is fixed relative to the first image A1. In the present embodiment, the reference line L is a horizontal line passing through the center point C of the first image A1 and corresponds to a horizontal line Lh of the display screen 5a.
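Once the shaft 6b has been recognized, the first angle θ1 can be computed from any two points on the longitudinal axis B. The sketch below is illustrative only; the function name, the choice of two axis points as input, and the sign convention (counterclockwise-positive relative to the horizontal reference line L, with image y coordinates increasing downward) are assumptions, not details taken from the embodiment.

```python
import math

def shaft_angle_deg(p_proximal, p_tip):
    """Angle (degrees) formed by the shaft's longitudinal axis B with the
    horizontal reference line L.

    p_proximal, p_tip: (x, y) image coordinates of two points on the
    axis B, with y increasing downward as is usual for images. dy is
    negated so the returned angle is counterclockwise-positive in the
    conventional y-up sense."""
    dx = p_tip[0] - p_proximal[0]
    dy = p_tip[1] - p_proximal[1]
    return math.degrees(math.atan2(-dy, dx))
```

For instance, a shaft running from the image's lower left toward its upper right yields a positive angle, and a shaft parallel to line L yields 0°.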
The processor 1a then compares the first angle θ1 with the predetermined target angle θt (step S4). The target angle θt is, for example, a value determined in advance by the surgeon according to the preferences of the surgeon, a fixed value set for each case of surgery (e.g., a surgical site), or a fixed value set for each surgical instrument 6. The target angle θt is stored in advance in, for example, the storage unit 1c.
If the first angle θ1 is equal to the target angle θt (YES at step S4), the processor 1a returns to step S2 without performing steps S5 to S7. In this case, the processor 1a outputs the first image A1 to the display device 5 and displays the image on the display screen 5a.
If the first angle θ1 is different from the target angle θt (NO at step S4), the processor 1a then calculates a difference Δθ between the first angle θ1 and the target angle θt (step S5).
Subsequently, as illustrated in
In an example of step S6, the processor 1a generates the second image A2 by rotating the image sensor 2a about the optical axis by the difference Δθ. In other words, the image sensor 2a captures the second image A2 that is the endoscope image A rotated by the difference Δθ with respect to the first image A1.
In another example of step S6, the processor 1a generates the second image A2 by rotating the first image A1 by the difference Δθ through image processing.
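The image-processing variant of step S6 amounts to an in-plane rotation of the first image A1 about its center by the difference Δθ. The following NumPy-only, nearest-neighbor sketch illustrates the idea; in practice a library routine (e.g., an affine-warp function) would likely be used instead, and the function name and conventions here are assumptions.

```python
import numpy as np

def rotate_image(img, delta_deg):
    """Rotate img (H x W or H x W x C array) by delta_deg about its
    center using nearest-neighbor sampling, keeping the frame size;
    pixels with no source are left zero. A minimal stand-in for a
    library affine warp, for illustration of step S6 only."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    t = np.deg2rad(delta_deg)
    cos_t, sin_t = np.cos(t), np.sin(t)
    ys, xs = np.indices((h, w))
    # Inverse mapping: for each output pixel, find the source pixel.
    xr = cos_t * (xs - cx) + sin_t * (ys - cy) + cx
    yr = -sin_t * (xs - cx) + cos_t * (ys - cy) + cy
    xi = np.rint(xr).astype(int)
    yi = np.rint(yr).astype(int)
    valid = (xi >= 0) & (xi < w) & (yi >= 0) & (yi < h)
    out = np.zeros_like(img)
    out[valid] = img[yi[valid], xi[valid]]
    return out
```

Because the mapping is inverted per output pixel, every pixel of the second image A2 is defined, and a rotation of 0° reproduces the first image A1 exactly.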
In another example of step S6, the processor 1a generates the second image A2 by rotating the endoscope 2 by using a rotating mechanism provided at the tip of the moving device 3.
The processor 1a then outputs the generated second image A2 to the display device 5 and displays the image on the display screen 5a (step S7). The surgical instrument 6 in the second image A2 displayed on the display screen 5a forms the target angle θt with respect to the horizontal line Lh of the display screen 5a.
Until the user interface 1f receives the input of the turn-off of the rotation mode in step S8, the processor 1a regularly performs the acquisition of the first image A1 in step S2 and performs steps S3 to S7 each time an additional first image A1 is acquired. Thus, steps S2 to S7 are repeated and the angle θ of the surgical instrument 6 on the display screen 5a is kept at the target angle θt while the rotation mode is executed.
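The repetition of steps S2 to S7 can be sketched as a simple loop. All callbacks below (image acquisition, angle detection, rotation, display, and the mode switch) are hypothetical stand-ins for the components described above, and the sign of the applied rotation is an assumption.

```python
def rotation_mode_loop(acquire_image, detect_angle, rotate, display,
                       target_deg, mode_is_on):
    """Illustrative sketch of steps S2-S7 of the first embodiment.
    Loops while the rotation mode is on; when the detected first angle
    already equals the target angle, the first image is displayed as-is
    (S4 YES), otherwise a rotated second image is generated and shown."""
    while mode_is_on():
        img = acquire_image()           # S2: latest endoscope image A1
        theta1 = detect_angle(img)      # S3: first angle of instrument 6
        if theta1 == target_deg:        # S4: no rotation needed
            display(img)
            continue
        delta = theta1 - target_deg     # S5: difference between angles
        display(rotate(img, delta))     # S6-S7: generate and display A2
```

In a real system the equality test of step S4 would use a tolerance rather than exact equality, which is essentially what the fourth embodiment's threshold X provides.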
As described above, according to the present embodiment, when the first angle θ1 of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is equal to the predetermined target angle θt, the first image A1 is displayed on the display screen 5a. When the first angle θ1 is different from the predetermined target angle θt, the second image A2 is automatically generated by a rotation with respect to the first image A1 such that the second angle θ2 of the surgical instrument 6 is equal to the target angle θt, and the second image A2 is displayed on the display screen 5a instead of the first image A1.
As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the angle θ of the surgical instrument 6 on the display screen 5a is equal to the target angle θt. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon without the need for taking a surgeon’s hand off from the surgical instrument 6.
A controller, an endoscope system, a control method, and a control program according to a second embodiment of the present invention will be described below with reference to the accompanying drawings.
As illustrated in
As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.
A user interface 1f is configured to receive the inputs of a first trigger and a second trigger from a user. For example, the user interface 1f includes a first switch for a user input of the first trigger and a second switch for a user input of the second trigger.
In a rotation mode of the present embodiment, a processor 1a performs a control method shown in
The control method according to the present embodiment includes step S1, step S11 of receiving the first trigger, step S12 of acquiring a third image A3, step S13 of detecting a third angle θ3 of the surgical instrument 6 in the third image A3, step S14 of setting the target angle θt at the third angle θ3, step S15 of receiving the second trigger, and steps S2, S3, and S5 to S8.
When the user interface 1f receives the input of the turn-on of the rotation mode (YES at step S1), the processor 1a waits for the reception of the first trigger by the user interface 1f (step S11). As illustrated in
In response to the reception of the first trigger by the user interface 1f (YES at step S11), the processor 1a acquires the third image A3, which is the latest endoscope image A, from the endoscope 2 (step S12).
The processor 1a then detects the third angle θ3 of the surgical instrument 6 in the third image A3 according to the same method as step S3 (step S13). Like a first angle θ1, the third angle θ3 is an angle formed by a longitudinal axis B of a shaft 6b with respect to a predetermined reference line L.
The processor 1a then sets the target angle θt at the third angle θ3 (step S14).
Thereafter, the processor 1a waits for the reception of the second trigger by the user interface 1f (step S15). As illustrated in
In response to the reception of the second trigger by the user interface 1f (YES at step S15), the processor 1a acquires the first image, which is the latest endoscope image A (step S2), and detects the first angle θ1 of the surgical instrument 6 in the first image A1 (step S3).
Subsequently, the processor 1a calculates a difference Δθ between the first angle θ1 and the target angle θt (step S5) and generates, as illustrated in
Thereafter, as in the first embodiment, steps S2 to S7 are repeated and the angle θ of the surgical instrument 6 on the display screen 5a is kept at the target angle θt.
As described above, according to the present embodiment, the surgeon can set the target angle θt and the difference Δθ at desired values by using the surgical instrument 6 in the endoscope image A. Moreover, the surgeon can rotate the background of the surgical instrument 6 by a desired angle and properly adjust the orientation of the background. For example, when the biological tissue E lies diagonally in the endoscope image A as illustrated in
Furthermore, after the target angle θt and the difference Δθ are set, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a as in the first embodiment such that the angle θ of the surgical instrument 6 on the display screen 5a is equal to the target angle θt. Hence, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon without the need for taking a surgeon’s hand off from the surgical instrument 6.
A controller, an endoscope system, a control method, and a control program according to a third embodiment of the present invention will be described below with reference to the accompanying drawings.
As illustrated in
As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.
In a rotation mode of the present embodiment, a processor 1a controls the rotation angle of an endoscope image A displayed on a display screen 5a such that the tip 6a of the surgical instrument 6 is disposed on a predetermined horizontal line Lh passing through the center of the display screen 5a. Specifically, the processor 1a performs a control method shown in
The control method according to the present embodiment includes steps S1 and S2, step S21 of determining whether the tip 6a of the surgical instrument 6 is disposed on a predetermined reference line L, and steps S3 and S6 to S8.
As in the first embodiment, in response to the reception of the input of the turn-on of the rotation mode by a user interface 1f (YES at step S1), the processor 1a acquires a first image A1 (step S2).
The processor 1a then recognizes the surgical instrument 6 in the first image A1 and determines whether the tip 6a of the surgical instrument 6 is disposed on the predetermined reference line L (step S21). The predetermined reference line L is a horizontal line passing through a center point C of the first image A1 and corresponds to a predetermined horizontal line Lh passing through the center point of the display screen 5a.
If the tip 6a is disposed on the reference line L (YES at step S21), the processor 1a returns to step S2 without performing steps S3 and S6 to S8. In this case, the processor 1a outputs the first image A1 to the display device 5 and displays the image on the display screen 5a.
As illustrated in
Subsequently, the processor 1a generates a second image A2 that is the endoscope image A rotated by the first angle θ1 with respect to the first image A1 (step S6). In other words, in the present embodiment, a target angle θt is 0°. In the second image A2, the tip 6a has a deflection angle (second angle) of 0° and is disposed on the predetermined reference line L.
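In the third embodiment, the rotation amount is the deflection angle of the tip 6a as seen from the center point C. The sketch below is illustrative only; the function names and the y-down image coordinate convention are assumptions, and returning None to signal "already on the reference line" (the S21 YES branch) is a design choice made here for clarity.

```python
import math

def tip_deflection_deg(tip_xy, center_xy):
    """First angle (deg) in the third embodiment: the angle of the line
    from center point C to tip 6a relative to the horizontal reference
    line L, in image coordinates with y increasing downward."""
    dx = tip_xy[0] - center_xy[0]
    dy = tip_xy[1] - center_xy[1]
    return math.degrees(math.atan2(-dy, dx))

def rotation_to_reference_line(tip_xy, center_xy):
    """Sketch of steps S21, S3, and S6: return None when the tip already
    lies on the reference line L (deflection 0, target angle 0 degrees);
    otherwise return the angle by which to rotate the first image A1 so
    that the tip lands on line L."""
    theta1 = tip_deflection_deg(tip_xy, center_xy)
    return None if theta1 == 0.0 else theta1
```

A tip directly to the right of C needs no rotation, whereas a tip straight above C calls for a 90° rotation.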
Thereafter, steps S2, S21, S3, S6, and S7 are repeated, so that the angle θ of the surgical instrument 6 on the display screen 5a is kept at 0° and the tip 6a of the surgical instrument 6 is kept on the horizontal line Lh.
As described above, according to the present embodiment, when the tip 6a of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is disposed on the reference line L, the first image A1 is displayed on the display screen 5a. When the tip 6a is not disposed on the reference line L, the second image A2 rotated with respect to the first image A1 is automatically generated with the tip 6a disposed on the reference line L. The second image A2 is displayed on the display screen 5a instead of the first image A1.
As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the tip 6a of the surgical instrument 6 on the display screen 5a is disposed on the predetermined horizontal line Lh. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon without the need for taking a surgeon’s hand off from the surgical instrument 6.
In the present embodiment, the predetermined reference line L is a horizontal line passing through the center point C. The direction and position of the predetermined reference line L can be optionally changed.
For example, if the tip 6a is to be disposed on a vertical line Lv passing through the center point of the display screen 5a, the predetermined reference line L may be a vertical line passing through the center point C. Alternatively, if the tip 6a is to be disposed on an inclined straight line passing through the center point of the display screen 5a, the predetermined reference line L may be an inclined straight line passing through the center point C.
A controller, an endoscope system, a control method, and a control program according to a fourth embodiment of the present invention will be described below with reference to the accompanying drawings.
As illustrated in
As in the first embodiment, an endoscope system 10 according to the present embodiment includes a controller 1, an endoscope 2, a moving device 3, an endoscope processor 4, and a display device 5.
In the present embodiment, while the controller 1 controls the moving device 3, a processor 1a always operates in a rotation mode and performs a control method shown in
The control method according to the present embodiment includes steps S2 and S3, step S31 of determining whether the absolute value of a difference between the first angle θ1 and the target angle θt is at most a predetermined threshold value, and steps S5 to S7.
The processor 1a acquires the first image A1, which is the latest endoscope image A, from the endoscope 2 (step S2) and detects the first angle θ1 of the surgical instrument 6 in the first image A1 (step S3).
The processor 1a then determines whether the first angle θ1 is within a range of the target angle θt±X (step S31).
As illustrated in
As illustrated in
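Step S31 acts as a deadband around the target angle: no rotation occurs while the first angle stays within θt±X. The fragment below is a sketch under that reading; the function name, the None return for the in-range case, and any particular value of the threshold X are assumptions.

```python
def rotation_with_deadband(theta1_deg, target_deg, threshold_deg):
    """Sketch of step S31 of the fourth embodiment: return None when
    theta1 is within target +/- threshold (first image A1 is displayed
    unchanged); otherwise return the difference used in steps S5-S6 to
    generate the rotated second image A2."""
    if abs(theta1_deg - target_deg) <= threshold_deg:
        return None
    return theta1_deg - target_deg
```

The deadband prevents the displayed image from rotating continuously in response to every small motion of the surgical instrument 6, which could otherwise be distracting during a procedure.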
As in the first embodiment, by repeating steps S2 to S7, the endoscope image A displayed on the display screen 5a is rotated each time the first angle θ1 deviates from a range of the target angle θt±X, so that the angle of the surgical instrument 6 on the display screen 5a is kept in a predetermined range including the target angle θt.
As described above, according to the present embodiment, when the first angle θ1 of the surgical instrument 6 in the first image A1, which is the endoscope image A captured by the endoscope 2, is within a range of the target angle θt±X, the first image A1 is displayed on the display screen 5a. When the first angle θ1 deviates from a range of the target angle θt±X, the second image A2 is automatically generated by a rotation with respect to the first image A1 such that the second angle θ2 of the surgical instrument 6 is equal to the target angle θt, and the second image A2 is displayed on the display screen 5a instead of the first image A1.
As described above, the vertical directions of the endoscope images A1 and A2 to be displayed on the display screen 5a are automatically controlled by the processor 1a such that the angle of the surgical instrument 6 on the display screen 5a is within a range of the target angle θt±X. Thus, the surgeon who is operating the surgical instrument 6 does not need to manually operate the endoscope 2 to adjust the vertical direction of the endoscope image A. In other words, the endoscope images A1 and A2 can be provided in proper vertical directions so as to facilitate procedures for the surgeon without the need for taking a surgeon’s hand off from the surgical instrument 6.
Also in the present embodiment, the control method may include steps S1 and S8 as in the first to third embodiments. The processor 1a may start and end the rotation mode in response to the inputs of the turn-on and turn-off of the rotation mode to a user interface 1f.
In the foregoing embodiments, the predetermined reference line L is a horizontal line of the first image A1 but is not limited thereto. A line extending in any direction can be set as the reference line L.
For example, the predetermined reference line L may be a vertical line extending in the longitudinal direction (vertical direction) of the first image A1 or a line extending in an oblique direction of the first image A1.
In the foregoing embodiments, the processor 1a generates the second image A2 rotated about the rotation axis D that passes through the center point C of the first image A1 and is parallel to the optical axis. The direction and position of the rotation axis D can be optionally changed depending upon a user request or the like. In other words, the rotation axis D may be inclined with respect to the optical axis and may pass through a position deviated from the center point C.
In the foregoing embodiments, the processor 1a may be configured to operate in any one of a first rotation mode in which the second image A2 is generated on the basis of the first angle θ1 of the surgical instrument 6 in the first image A1 and a second rotation mode in which a fourth image A4 is generated on the basis of anatomical characteristics in the first image A1. The processor 1a performs the second rotation mode in response to the reception of the input of the turn-off of the rotation mode by the user interface 1f.
In the first rotation mode, the processor 1a performs any one of the control methods according to the first to fourth embodiments.
As illustrated in
In the storage unit 1c, a predetermined target angle θs of the anatomical characteristic G in the endoscope image A is stored for each type of the anatomical characteristic G. In the second rotation mode, the processor 1a acquires the first image A1 from the endoscope 2, detects the anatomical characteristic G of a biological tissue in the first image A1, and recognizes the type of the anatomical characteristic G. Thereafter, as illustrated in
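The per-type storage of the target angle θs suggests a simple lookup followed by the same angle-difference computation as in the first rotation mode. The sketch below is illustrative only; the mapping, the example key, and the function name are all hypothetical.

```python
def background_rotation_delta(characteristic_type, observed_deg, target_angles):
    """Second-rotation-mode sketch: target_angles is a hypothetical
    mapping from the type of anatomical characteristic G to its stored
    target angle theta_s; returns the rotation to apply to the first
    image A1 so that G is displayed at theta_s on the display screen."""
    theta_s = target_angles[characteristic_type]
    return observed_deg - theta_s
```

For example, if a characteristic of a (hypothetical) type "fold_line" is stored with θs = 10° and is observed at 25°, the image would be rotated by 15°.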
When the vertical direction of the endoscope image A displayed on the display screen 5a changes, the layout of organs and biological tissues appears different. Hence, in order to facilitate the recognition of organs or biological tissues in the endoscope image A by a surgeon, it is important to properly set the vertical direction of the endoscope image A displayed on the display screen 5a. When the first rotation mode is turned off, the processor 1a controls the vertical direction of the endoscope image A displayed on the display screen 5a in the second rotation mode. This makes it possible to properly adjust the vertical direction of the endoscope image A displayed on the display screen 5a, thereby displaying the anatomical characteristic G of the biological tissue at the predetermined target angle θs on the display screen 5a.
The first to fourth embodiments may be implemented in combination as appropriate. For example, the processor 1a may be operable in the four rotation modes described in the first to fourth embodiments. In this case, a user may input selected one of the four rotation modes to the user interface 1f, and the processor 1a may perform the inputted rotation mode.
Number | Date | Country
--- | --- | ---
63076408 | Sep 2020 | US
&nbsp; | Number | Date | Country
--- | --- | --- | ---
Parent | PCT/JP2021/033209 | Sep 2021 | WO
Child | 18105314 | | US