The present invention relates to an endoscope system, an endoscopic image generating method, and a non-transitory computer-readable recording medium for combining two acquired images of an object so that the object is perceived as a stereoscopic image.
In recent years, endoscope devices have been widely used in the medical and industrial fields. Endoscope devices used in the medical field include an elongated insertion portion inserted into a body, and have been widely used for observation of organs, therapeutic procedures using treatment instruments, surgical operations under endoscopic observation, and the like.
A common endoscope device observes a portion to be observed as a planar image. A planar image, however, provides no perspective or three-dimensional appearance when, for example, it is desired to observe minute asperities on a surface of a body cavity wall or the like, or to grasp a spatial positional relationship between organs and devices in a body cavity. Thus, in recent years, three-dimensional endoscope systems that enable three-dimensional observation of an object have been developed.
As a method for enabling three-dimensional perception of an object, there is a method in which two images having a parallax are picked up by two image pickup devices provided in the endoscope and the two images are displayed as a 3D image on a 3D monitor, for example. In this method, an observer perceives a stereoscopic image by seeing the 3D image separately with the left and right eyes using 3D observation glasses such as polarizing glasses.
Depending on the distance between the object and the objective, the stereoscopic image may be hard to recognize, and an unnatural feeling and eyestrain may result. As a method for resolving such difficulty, predetermined processing may be performed on a region that is hard to observe, such as a region that is inconsistent between the left and right images, to resolve the difficulty in recognizing the entire stereoscopic image, as disclosed in Japanese Patent Application Laid-Open Publication No. 2005-334462, Japanese Patent Application Laid-Open Publication No. 2005-58374, and Japanese Patent Application Laid-Open Publication No. 2010-57619, for example.
An endoscope system in an aspect of the present invention includes: an endoscope including a first image pickup device and a second image pickup device each configured to pick up an image of an object in a subject; a monitor configured to display a 3D image as a displayed image; a sensor configured to sense distance information that is information of a distance from an objective surface positioned at a distal end of an insertion portion of the endoscope to a predetermined observation object in the subject; and a processor, wherein the processor is configured to: generate the 3D image based on a first picked-up image picked up by the first image pickup device and a second picked-up image picked up by the second image pickup device; and change the displayed image by performing at least one of control of the endoscope, image processing for generating the 3D image, and control of the monitor, and based on the distance information, the processor controls the first picked-up image so as to change a position of a first output region included in an entire image pickup region of the first image pickup device and in which an image pickup signal for the first picked-up image is outputted, controls the second picked-up image so as to change a position of a second output region included in an entire image pickup region of the second image pickup device and in which an image pickup signal for the second picked-up image is outputted, controls the first and second picked-up images such that an interval between a center of the first output region and a center of the second output region is a first value when the distance is within a predetermined range, and controls the first and second picked-up images such that the interval is a second value different from the first value when the distance is out of the predetermined range.
An endoscopic image generating method in an aspect of the present invention is an endoscopic image generating method for generating a 3D image based on first and second picked-up images respectively picked up by first and second image pickup devices of an endoscope, the endoscopic image generating method including: sensing distance information that is information of a distance from an objective surface positioned at a distal end of an insertion portion of the endoscope to a predetermined observation object in a subject; and based on the distance information, controlling the first picked-up image so as to change a position of a first output region included in an entire image pickup region of the first image pickup device and in which an image pickup signal for the first picked-up image is outputted, controlling the second picked-up image so as to change a position of a second output region included in an entire image pickup region of the second image pickup device and in which an image pickup signal for the second picked-up image is outputted, controlling the first and second picked-up images such that an interval between a center of the first output region and a center of the second output region is a first value when the distance is within a predetermined range, and controlling the first and second picked-up images such that the interval is a second value different from the first value when the distance is out of the predetermined range.
A non-transitory computer-readable recording medium in an aspect of the present invention is a non-transitory computer-readable recording medium storing an endoscopic image processing program to be executed by a computer, wherein the endoscopic image processing program causes an endoscopic image generating system for generating a 3D image based on first and second picked-up images respectively picked up by first and second image pickup devices of an endoscope to perform: sensing distance information that is information of a distance from an objective surface positioned at a distal end of an insertion portion of the endoscope to a predetermined observation object in a subject; and based on the distance information, controlling the first picked-up image so as to change a position of a first output region included in an entire image pickup region of the first image pickup device and in which an image pickup signal for the first picked-up image is outputted, controlling the second picked-up image so as to change a position of a second output region included in an entire image pickup region of the second image pickup device and in which an image pickup signal for the second picked-up image is outputted, controlling the first and second picked-up images such that an interval between a center of the first output region and a center of the second output region is a first value when the distance is within a predetermined range, and controlling the first and second picked-up images such that the interval is a second value different from the first value when the distance is out of the predetermined range.
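Purely by way of illustration, the interval control rule common to the three aspects above may be sketched as follows. All function and parameter names in this sketch are hypothetical; the aspects above do not prescribe any particular implementation.

```python
def output_region_interval(distance_c: float,
                           predetermined_range: tuple[float, float],
                           first_value: float,
                           second_value: float) -> float:
    """Return the interval between the center of the first output region
    and the center of the second output region for a sensed distance C.
    Hypothetical helper; names and units are illustrative only."""
    lower, upper = predetermined_range
    if lower <= distance_c <= upper:
        return first_value   # distance within the predetermined range
    return second_value      # distance out of the predetermined range
```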
An embodiment of the present invention will be described below with reference to the drawings.
First, a schematic configuration of an endoscope system according to an embodiment of the present invention will be described. An endoscope system 100 according to the present embodiment is a three-dimensional endoscope system including a three-dimensional endoscope.
The endoscope system 100 includes a three-dimensional endoscope (hereinafter simply referred to as an endoscope) 1, a main-body device 2 having a function of a 3D video processor, a display unit 3 having a function of a 3D monitor, and 3D observation glasses 4 worn for seeing the display unit 3 to perceive a stereoscopic image. The endoscope 1 and the display unit 3 are connected to the main-body device 2. The 3D observation glasses 4 are configured to be able to communicate with the main-body device 2 through wired or wireless communication.
The endoscope 1 includes an insertion portion 10 inserted into a subject, an operation portion (not shown) connected to a proximal end of the insertion portion 10, and a universal cord 15 extending out from the operation portion. The endoscope 1 is connected to the main-body device 2 via the universal cord 15. The endoscope 1 may be constituted as a rigid three-dimensional endoscope in which the insertion portion 10 has a rigid tube portion, or as a flexible three-dimensional endoscope in which the insertion portion 10 has a flexible tube portion.
The endoscope 1 also includes an image pickup optical system including a first image pickup device 11 and a second image pickup device 12 that pick up images of an object in a subject, and an illumination optical system including an illumination unit 14. The image pickup optical system is provided at a distal end portion of the insertion portion 10. The image pickup optical system further includes two observation windows 11A and 12A provided on a distal end surface 10a of the insertion portion 10. The observation windows 11A and 12A constitute an end surface (hereinafter referred to as an objective surface) positioned at the distal end of the image pickup optical system. A light receiving surface of the first image pickup device 11 receives incident light from the object through the observation window 11A. A light receiving surface of the second image pickup device 12 receives incident light from the object through the observation window 12A. The first and second image pickup devices 11 and 12 are each constituted by a CCD or CMOS sensor, for example.
The illumination optical system further includes two illumination windows 14A and 14B provided on the distal end surface 10a of the insertion portion 10. The illumination unit 14 emits illumination light for illuminating the object. The illumination light is emitted from the illumination windows 14A and 14B and irradiates the object. The illumination unit 14 may be provided at a position distanced from the distal end portion of the insertion portion 10. In this case, the illumination light emitted by the illumination unit 14 is transmitted to the illumination windows 14A and 14B by a light guide provided in the endoscope 1. Alternatively, the illumination unit 14 may be constituted by a light-emitting element such as an LED provided at the distal end portion of the insertion portion 10.
The endoscope 1 further includes a distance sensing unit 13 that senses distance information that is information of a distance from the observation windows 11A and 12A, which constitute the objective surface, to a predetermined observation object 101 in the subject. In the present embodiment, the distance sensing unit 13 is provided on the distal end surface 10a of the insertion portion 10 in the same way as the observation windows 11A and 12A. The distance sensing unit 13 calculates the distance from the observation windows 11A and 12A to the observation object 101 based on a result of measuring a distance from the distal end surface 10a to the observation object 101 and a positional relationship between the observation windows 11A and 12A and the distance sensing unit 13. Hereinafter, it is assumed that the distance from the observation window 11A to the observation object 101 and the distance from the observation window 12A to the observation object 101 are equal to each other. The distance from the observation windows 11A and 12A to the observation object 101 is denoted by a symbol C.
The display unit 3 displays a 3D image generated from first and second picked-up images, which will be described later, as a displayed image. The 3D observation glasses 4 are glasses worn for seeing the 3D image displayed on the display unit 3 to observe the first picked-up image and the second picked-up image with the left and right eyes, respectively, to perceive a stereoscopic image. The display unit 3 may be a polarized 3D monitor that displays the 3D image through different polarizing filters, or may be an active shutter 3D monitor that alternately displays the first picked-up image and the second picked-up image as the 3D image, for example. The 3D observation glasses 4 are polarized glasses if the display unit 3 is a polarized 3D monitor, and are shutter glasses if the display unit 3 is an active shutter 3D monitor.
The 3D observation glasses 4 include a line-of-sight direction detecting unit 41 that detects a direction of a line of sight of a wearer. A detection result of the line-of-sight direction detecting unit 41 is sent to the main-body device 2 through wired or wireless communication.
The endoscope 1 further includes a notification unit 5 connected to the main-body device 2. The notification unit 5 will be described later.
Next, a configuration of the main-body device 2 will be described with reference to the drawings.
The displayed-image controlling unit 22 performs, on the first picked-up image picked up by the first image pickup device 11 and the second picked-up image picked up by the second image pickup device 12, predetermined image processing and processing for outputting the first and second picked-up images as a 3D image, and outputs the processed first and second picked-up images to the image generating unit 21. As the processing for outputting the first and second picked-up images as a 3D image, processing of cutting out output regions for the 3D image, processing of controlling parameters required for displaying the 3D image, and the like are performed on the first picked-up image and the second picked-up image.
The image generating unit 21 generates a 3D image based on the first and second picked-up images outputted from the displayed-image controlling unit 22, and outputs the generated 3D image to the display unit 3. In the present embodiment, the image generating unit 21 is controlled by the displayed-image controlling unit 22 to perform predetermined image processing in generating the 3D image. The details of the image processing of the image generating unit 21 will be described later.
The display unit information acquiring unit 23 acquires display unit information that is information of a display region 3a of the display unit 3 connected to the main-body device 2, and is configured to be able to acquire the display unit information from the display unit 3. The display unit information includes, as the information of the display region 3a, information of a size of the display region 3a, that is, a dimension of the display region 3a in a vertical direction and a dimension of the display region 3a in a lateral direction, for example. The display unit information acquiring unit 23 outputs the acquired display unit information to the displayed-image controlling unit 22.
The line-of-sight information sensing unit 24 receives the detection result of the line-of-sight direction detecting unit 41 of the 3D observation glasses 4, and senses line-of-sight information that is information of movement of the direction of the line of sight based on the detection result of the line-of-sight direction detecting unit 41. The line-of-sight information sensing unit 24 outputs the sensed line-of-sight information to the displayed-image controlling unit 22.
The displayed-image controlling unit 22 can display a changed 3D image on the display unit 3 by controlling at least one of the endoscope 1, the image generating unit 21, and the display unit 3, and by outputting the first and second picked-up images to the image generating unit 21 or providing the first and second picked-up images with control parameters for generating the 3D image. The displayed-image controlling unit 22 includes a display determination unit 22A that determines whether to display the 3D image on the display unit 3 based on the distance information sensed by the distance sensing unit 13. In the present embodiment, the displayed-image controlling unit 22 performs processing of changing the displayed image based on a determination result of the display determination unit 22A, the distance information sensed by the distance sensing unit 13, a content of the display unit information acquired by the display unit information acquiring unit 23, and a sensing result of the line-of-sight information sensing unit 24. The details of the processing of changing the displayed image will be described later.
The notification signal generating unit 25 generates a notification signal based on the distance information sensed by the distance sensing unit 13. For example, when the distance C from the observation windows 11A and 12A to the observation object 101 becomes a distance at which the observation object 101 is hard to recognize, the notification signal generating unit 25 generates a notification signal that notifies an observer to that effect. The notification signal generating unit 25 outputs the generated notification signal to the notification unit 5.
The notification unit 5 may be the display unit 3. In this case, the display unit 3 may display, based on the notification signal, an alert that notifies the observer that the observation object 101 has become hard to recognize.
Here, a hardware configuration of the main-body device 2 will be described with reference to the drawings. The main-body device 2 includes a processor 2A, a memory 2B, a storage device 2C, and an input/output device 2D.
The processor 2A is used to perform at least part of the functions of the image generating unit 21, the displayed-image controlling unit 22, the display unit information acquiring unit 23, the line-of-sight information sensing unit 24, and the notification signal generating unit 25. The processor 2A is constituted by an FPGA (field programmable gate array), for example. At least part of the image generating unit 21, the displayed-image controlling unit 22, the display unit information acquiring unit 23, the line-of-sight information sensing unit 24, and the notification signal generating unit 25 may be constituted as a circuit block in the FPGA.
The memory 2B is constituted by a rewritable volatile storage element such as a RAM. The storage device 2C is constituted by a rewritable non-volatile storage device such as a flash memory or a magnetic disk device. The input/output device 2D is used to send and receive signals between the main-body device 2 and an external device through wired or wireless communication.
Note that the processor 2A may be constituted by a central processing unit (hereinafter denoted as a CPU). In this case, at least part of the functions of the image generating unit 21, the displayed-image controlling unit 22, the display unit information acquiring unit 23, the line-of-sight information sensing unit 24, and the notification signal generating unit 25 may be implemented by the CPU reading out a program from the storage device 2C or another storage device that is not shown and executing the program.
The hardware configuration of the main-body device 2 is not limited to the example shown in the drawings.
Next, the processing of changing the displayed image performed by the displayed-image controlling unit 22 will be described in detail with reference to the drawings. In the present embodiment, the processing of changing the displayed image includes first to seventh processing, which will be described below.
First, operation of the display determination unit 22A will be described. As described above, the display determination unit 22A determines whether to display the 3D image on the display unit 3 based on the distance information sensed by the distance sensing unit 13. More specifically, for example, when the distance C from the observation windows 11A and 12A to the observation object 101 is within a predetermined range, the display determination unit 22A determines that the 3D image is displayed on the display unit 3. On the other hand, when the distance C is out of the predetermined range, the display determination unit 22A determines that the 3D image is not displayed on the display unit 3. The predetermined range mentioned above is hereinafter referred to as a display determination range.
The display determination range is stored in advance in the storage device 2C described above.
The display determination range is defined such that the distance C is within the display determination range when the distance C is a distance at which the observation object 101 can be comfortably observed or is a distance at which the observation object 101 is hard to recognize but the difficulty in recognizing the stereoscopic image can be resolved by performing the processing of changing the displayed image. In other words, when the observation object 101 can be comfortably observed or the difficulty in recognizing the stereoscopic image can be resolved by performing the processing of changing the displayed image, the display determination unit 22A determines that the 3D image is displayed on the display unit 3. On the other hand, when the difficulty in recognizing the stereoscopic image cannot be resolved even by performing the processing of changing the displayed image, the display determination unit 22A determines that the 3D image is not displayed on the display unit 3.
The first to sixth processing are performed when the display determination unit 22A determines that the 3D image is displayed on the display unit 3. The seventh processing is performed when the display determination unit 22A determines that the 3D image is not displayed on the display unit 3. Note that the first to sixth processing may be performed regardless of the determination of the display determination unit 22A.
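As a rough sketch of this dispatch (the function and parameter names are hypothetical, and the display determination range is assumed to be a simple closed interval):

```python
def determine_display_mode(distance_c: float,
                           display_determination_range: tuple[float, float]) -> str:
    """Return "3D" when the distance C is within the display determination
    range (the first to sixth processing may then be applied), and "2D"
    otherwise (the seventh processing is applied).  Sketch only."""
    lower, upper = display_determination_range
    return "3D" if lower <= distance_c <= upper else "2D"
```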
Note that “the observation object 101 can be comfortably observed” more specifically means that a three-dimensional image of the observation object 101 can be observed without causing unnatural feeling and eyestrain, for example.
The distance C from the observation windows 11A and 12A to the observation object 101 at which the observation object 101 can be comfortably observed has a correspondence with a position at which the three-dimensional image of the observation object 101 is perceived.
The position at which the three-dimensional image of the observation object 101 is perceived changes depending on the distance C from the observation windows 11A and 12A to the observation object 101. The distance C from the observation windows 11A and 12A to the observation object 101 is hereinafter denoted as the distance C between the observation object and the objective or simply as the distance C. As the distance C relatively decreases, the position at which the three-dimensional image of the observation object 101 is perceived becomes closer to the observer 200. As the distance C relatively increases, the position at which the three-dimensional image of the observation object 101 is perceived becomes farther from the observer 200. The distance C at which the observation object 101 can be comfortably observed is a distance such that the position at which the three-dimensional image of the observation object 101 is perceived is within the range R1 shown in the drawings.
Next, the first processing will be described. In the first processing, the displayed-image controlling unit 22 controls the first and second picked-up images acquired by the first image pickup device 11 and the second image pickup device 12 of the endoscope 1. Based on the distance information sensed by the distance sensing unit 13, the displayed-image controlling unit 22 controls the first picked-up image so as to change a position of a first output region that is included in an entire image pickup region of the first image pickup device 11 and in which an image pickup signal for the first picked-up image is outputted. The displayed-image controlling unit 22 likewise controls the second picked-up image so as to change a position of a second output region that is included in an entire image pickup region of the second image pickup device 12 and in which an image pickup signal for the second picked-up image is outputted.
The details of the first processing will be specifically described below with reference to the drawings.
A first output region 111 is set in the entire image pickup region of the first image pickup device 11, and a second output region 121 is set in the entire image pickup region of the second image pickup device 12. An interval between a center of the first output region 111 and a center of the second output region 121 is denoted by a symbol k. In the present embodiment, a first range is defined such that the distance C is within the first range when the distance C is a distance at which the observation object 101 can be comfortably observed.
In the present embodiment, when the distance C is out of the first range, that is, when the distance C is a distance at which the observation object 101 is hard to recognize, the displayed-image controlling unit 22 controls the first and second picked-up images such that the interval k between the center of the first output region 111 and the center of the second output region 121 decreases as compared to when the distance C is within the first range.
Note that when decreasing the interval k between the center of the first output region 111 and the center of the second output region 121, the displayed-image controlling unit 22 may change the interval k in a stepwise or continuous manner according to the distance C.
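For example, the stepwise or continuous change could be realized with a simple clamped ramp; this is a minimal sketch under assumed names and an assumed transition width, not the implementation of the embodiment:

```python
def interval_stepless(distance_c: float,
                      range_lower: float, range_upper: float,
                      k_first: float, k_second: float,
                      transition_width: float = 10.0) -> float:
    """Continuous variant of the interval control: k stays at k_first
    while C is within the first range and ramps linearly toward the
    smaller k_second as C moves out of the range.  The transition width
    (10.0 here, in the same unit as C) is an assumed tuning parameter."""
    if range_lower <= distance_c <= range_upper:
        return k_first
    overshoot = (range_lower - distance_c if distance_c < range_lower
                 else distance_c - range_upper)
    t = min(overshoot / transition_width, 1.0)  # 0..1 across the transition
    return k_first + t * (k_second - k_first)
```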
The above has described the first processing for the case where the distance C from the observation windows 11A and 12A to the observation object 101 changes from being within the first range to being out of the first range. Conversely, when the distance C changes from being out of the first range to being within the first range, the displayed-image controlling unit 22 increases the interval k between the center of the first output region 111 and the center of the second output region 121 back to the value used when the distance C is within the first range.
Next, the second processing will be described. In the second processing, the image generating unit 21 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the image generating unit 21 so as to change a display position of each of a left-eye image and a right-eye image of the 3D image on the display unit 3 to change a position of the three-dimensional image 102 of the observation object 101 in the depth direction based on the distance information sensed by the distance sensing unit 13.
The details of the second processing will be specifically described below with reference to the drawings.
In the present embodiment, when the distance C from the observation windows 11A and 12A to the observation object 101 is out of a predetermined second range, the displayed-image controlling unit 22 changes the display position of each of the left-eye image and the right-eye image on the display unit 3 such that a distance between the point P1 at which the observation object 101 is positioned in the left-eye image and the point P2 at which the observation object 101 is positioned in the right-eye image decreases as compared to when the distance C is within the second range. In this manner, a distance D from the display unit 3 to the point P3 at which the three-dimensional image 102 of the observation object 101 is perceived, that is, a stereoscopic depth of the three-dimensional image 102 of the observation object 101 is decreased.
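The geometry behind this change can be made explicit with a standard similar-triangles relation. The symbols below (interpupillary distance e, viewing distance L from the observer to the display unit 3, and on-screen separation p between the points P1 and P2, drawn with crossed disparity so that P3 is perceived in front of the screen) are introduced here for illustration and do not appear in the source:

```latex
% Perceived position of the fused point P3 (crossed disparity, sketch).
% e: interpupillary distance, L: viewing distance to the screen,
% p: on-screen separation between P1 (left-eye image) and P2 (right-eye image).
\[
  z = \frac{e\,L}{e + p}, \qquad D = L - z = \frac{p\,L}{e + p}.
\]
```

Here z is the distance from the observer to P3 and D is the stereoscopic depth measured from the display unit 3; shrinking p shrinks D, which is exactly the effect the second processing aims at.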
Note that the second range may be defined in the same way as the first range, for example. In this case, when the distance C is a distance at which the observation object 101 is hard to recognize, the first processing and the second processing are performed at the same time.
Alternatively, the second range may be defined such that the distance C is within the second range when the distance C is a distance at which the observation object 101 is hard to recognize but the observation object 101 can be comfortably observed by performing the first processing. If the second range is defined in this manner, when the distance C is out of the second range, that is, when the observation object 101 cannot be comfortably observed even by performing the first processing, the first processing and the second processing are performed at the same time. Note that when the distance C is a distance at which the observation object 101 is hard to recognize but the observation object 101 can be comfortably observed by performing the first processing, only the first processing is performed and the second processing is not performed.
When decreasing the distance between the point P1 and the point P2, the displayed-image controlling unit 22 may change the distance between the point P1 and the point P2 in a stepwise or continuous manner according to the distance C.
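A minimal sketch of this position change, assuming the display positions can be represented by horizontal coordinates and that a simple scaling toward the midpoint is acceptable:

```python
def shift_display_positions(p1_x: float, p2_x: float,
                            scale: float) -> tuple[float, float]:
    """Second processing, sketched: move the display positions of the
    observation object in the left-eye image (P1) and the right-eye
    image (P2) toward their common midpoint.  scale = 1.0 keeps the
    original parallax; scale = 0.0 collapses it to zero.  The scaling
    scheme is an assumption, not taken from the source."""
    mid = 0.5 * (p1_x + p2_x)
    return (mid + scale * (p1_x - mid),
            mid + scale * (p2_x - mid))
```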
Next, the third processing will be described. In the third processing, the illumination unit 14 of the endoscope 1 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the illumination unit 14 so as to change a light quantity of the illumination light based on the distance information sensed by the distance sensing unit 13.
The third processing is performed when the distance C from the observation windows 11A and 12A to the observation object 101 is a distance at which the observation object 101 cannot be comfortably observed even by performing the first processing, for example. When the distance C is relatively small, the displayed-image controlling unit 22 controls the illumination unit 14 such that the light quantity of the illumination light increases to cause halation. When the distance C is relatively large, the displayed-image controlling unit 22 controls the illumination unit 14 such that the light quantity of the illumination light decreases to darken the stereoscopic image. Note that when increasing or decreasing the light quantity of the illumination light as described above, the displayed-image controlling unit 22 may change the light quantity of illumination light in a stepwise or continuous manner according to the distance C.
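A sketch of such light quantity control, with assumed gain factors:

```python
def illumination_light_quantity(distance_c: float,
                                c_lower: float, c_upper: float,
                                nominal: float) -> float:
    """Third processing, sketched: raise the light quantity at short
    distances so that halation occurs, and lower it at long distances so
    that the stereoscopic image darkens.  The gain factors 1.5 and 0.5
    are hypothetical tuning values."""
    if distance_c < c_lower:
        return nominal * 1.5  # overexpose: the near image washes out
    if distance_c > c_upper:
        return nominal * 0.5  # underexpose: the far image darkens
    return nominal
```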
Next, the fourth processing will be described. In the fourth processing, the image generating unit 21 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the image generating unit 21 so as to perform blurring processing on the 3D image based on the distance information sensed by the distance sensing unit 13. The fourth processing is performed when the distance C from the observation windows 11A and 12A to the observation object 101 is a distance at which the observation object 101 cannot be comfortably observed even by performing the first processing, for example. Note that when performing the blurring processing, the displayed-image controlling unit 22 may change a degree of blurring in a stepwise or continuous manner according to the distance C.
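A sketch of how the degree of blurring might be scheduled against the distance C (the linear mapping and the cap are assumptions):

```python
def blurring_degree(distance_c: float,
                    c_lower: float, c_upper: float,
                    max_radius: float = 5.0) -> float:
    """Fourth processing, sketched: return a blur radius (e.g., for a
    Gaussian filter applied by the image generating unit) that grows the
    farther C is outside the comfortable range.  The cap of 5.0 pixels
    is an assumed value."""
    overshoot = max(c_lower - distance_c, distance_c - c_upper, 0.0)
    return min(overshoot, max_radius)
```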
Next, the fifth processing will be described. In the fifth processing, the image generating unit 21 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the image generating unit 21 so as to change an area of each of the left-eye image and the right-eye image of the 3D image displayed on the display unit 3 based on the display unit information acquired by the display unit information acquiring unit 23.
As the display region 3a of the display unit 3 becomes relatively larger, that is, as the dimension of the display region 3a in the vertical direction and the dimension of the display region 3a in the lateral direction relatively increase, a position of a three-dimensional image near an outer edge of a perceived range of the stereoscopic image becomes farther from the display unit 3. The fifth processing is performed when the display region 3a of the display unit 3 is larger than a predetermined threshold, for example. In this case, the displayed-image controlling unit 22 controls the image generating unit 21 so as to delete a portion near the outer edge of each of the left-eye image and the right-eye image displayed on the display unit 3 to decrease the area of each of the left-eye image and the right-eye image.
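A sketch of the area change, assuming a fixed deletion margin and a single size threshold:

```python
def edge_deletion_fraction(display_width: float, display_height: float,
                           size_threshold: float,
                           margin: float = 0.05) -> float:
    """Fifth processing, sketched: when the display region exceeds a
    predetermined size threshold, delete a fixed fraction of each of the
    left-eye and right-eye images near the outer edge (5% here, an
    assumed value); otherwise keep the full area."""
    return margin if max(display_width, display_height) > size_threshold else 0.0
```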
Next, the sixth processing will be described. In the sixth processing, the image generating unit 21 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the image generating unit 21 so as to change the display position of each of the left-eye image and the right-eye image of the 3D image on the display unit 3 based on the distance information sensed by the distance sensing unit 13 and the display unit information acquired by the display unit information acquiring unit 23. More specifically, for example, when the display region 3a of the display unit 3 is larger than a predetermined threshold and the distance C from the observation windows 11A and 12A to the observation object 101 is out of a predetermined third range, the displayed-image controlling unit 22 changes the display position of each of the left-eye image and the right-eye image on the display unit 3 such that the distance between the point P1 at which the observation object 101 is positioned in the left-eye image and the point P2 at which the observation object 101 is positioned in the right-eye image decreases, in the same manner as in the second processing.
Note that the third range may be defined in the same way as the second range, for example. When decreasing the distance between the point P1 and the point P2, the displayed-image controlling unit 22 may change the distance between the point P1 and the point P2 in a stepwise or continuous manner according to the distance C.
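The gating condition of the sixth processing can be sketched as follows (names illustrative):

```python
def sixth_processing_applies(display_width: float, display_height: float,
                             size_threshold: float,
                             distance_c: float,
                             third_range: tuple[float, float]) -> bool:
    """Sixth processing, sketched: the display positions of the left-eye
    and right-eye images are shifted (as in the second processing) only
    when the display region exceeds the size threshold AND the distance
    C is out of the third range."""
    lower, upper = third_range
    too_large = max(display_width, display_height) > size_threshold
    return too_large and not (lower <= distance_c <= upper)
```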
Next, the seventh processing will be described. As described above, the seventh processing is performed when the display determination unit 22A determines that the 3D image is not displayed on the display unit 3. In the seventh processing, the image generating unit 21 is controlled by the displayed-image controlling unit 22. The displayed-image controlling unit 22 controls the image generating unit 21 so as to generate a single 2D image based on the first and second picked-up images. For example, the image generating unit 21 may use one of the first and second picked-up images as the 2D image. The display unit 3 displays the 2D image generated by the image generating unit 21.
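A sketch of the 2D fallback, with the returned tuples standing in for the real video pipeline:

```python
def compose_displayed_image(first_image, second_image, show_3d: bool):
    """Seventh processing, sketched: when the display determination is
    negative, output a single 2D image (here, simply the first picked-up
    image) instead of composing a 3D image.  The tuple format is an
    illustrative stand-in, not the actual output of the video processor."""
    if show_3d:
        return ("3D", first_image, second_image)
    return ("2D", first_image)
```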
Note that the displayed-image controlling unit 22 may be configured to be able to perform all of the first to seventh processing, or may be configured to be able to perform the first processing and at least one of the second to seventh processing.
Next, the processing performed based on the line-of-sight information included in the processing of changing the displayed image will be described. First, operation of the line-of-sight direction detecting unit 41 of the 3D observation glasses 4 and operation of the line-of-sight information sensing unit 24 will be described with reference to the drawings.
Next, the processing performed based on the line-of-sight information will be described. The displayed-image controlling unit 22 performs the processing of changing the displayed image based on the line-of-sight information sensed by the line-of-sight information sensing unit 24. In the present embodiment, when an amount of movement of the direction of the line of sight within a predetermined period of time is greater than or equal to a predetermined threshold, the displayed-image controlling unit 22 controls the endoscope 1 and the image generating unit 21 to perform at least the first processing among the foregoing first to seventh processing. Note that the displayed-image controlling unit 22 may perform the above-mentioned processing regardless of the distance information sensed by the distance sensing unit 13 when the amount of movement of the direction of the line of sight within the predetermined period of time is greater than or equal to the predetermined threshold. Alternatively, the displayed-image controlling unit 22 may perform the above-mentioned processing when the amount of movement of the direction of the line of sight within the predetermined period of time is greater than or equal to the predetermined threshold and the distance C from the observation windows 11A and 12A to the observation object 101 is out of a predetermined range. The above-mentioned predetermined range may be a range that is narrower than the foregoing first range, for example.
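A sketch of this trigger, under the simplifying assumptions that the gaze direction can be reduced to a one-dimensional angle and that movement is accumulated as the sum of absolute frame-to-frame differences:

```python
def gaze_movement_exceeds(samples: list[tuple[float, float]],
                          now_s: float,
                          period_s: float,
                          threshold_deg: float) -> bool:
    """samples: time-ordered (timestamp in seconds, gaze angle in degrees)
    pairs from the line-of-sight direction detecting unit 41.  Returns
    True when the accumulated angular movement within the most recent
    period_s seconds reaches threshold_deg (sketch only)."""
    recent = [angle for t, angle in samples if now_s - t <= period_s]
    movement = sum(abs(b - a) for a, b in zip(recent, recent[1:]))
    return movement >= threshold_deg
```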
Next, operations and effects of the endoscope system 100 according to the present embodiment will be described. In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the first picked-up image so as to change the position of the first output region 111 and controlling the second picked-up image so as to change the position of the second output region 121 based on the distance information that is information of the distance C from the observation windows 11A and 12A to the observation object 101 (the first processing). As the interval k between the center of the first output region 111 and the center of the second output region 121 decreases, the inward angle α (the convergence angle toward the object) decreases, and as the inward angle α decreases, the size of the three-dimensional image in the depth direction decreases. Thus, the three-dimensional appearance is weakened. Therefore, according to the present embodiment, when the distance C is a distance at which the observation object 101 is hard to recognize, by controlling the first and second picked-up images such that the interval k decreases, the three-dimensional appearance of the three-dimensional image of the observation object 101 is weakened, and the difficulty in recognizing the observation object 101 can be resolved. As a result, according to the present embodiment, the difficulty in recognizing the stereoscopic image can be resolved.
A value of the interval k when the distance C from the observation windows 11A and 12A to the observation object 101 is within the foregoing first range is referred to as a first value, and a value of the interval k when the distance C is out of the first range, which value is different from the first value, is referred to as a second value. In the present embodiment, the displayed-image controlling unit 22 controls the first and second picked-up images such that the interval k is at the first value when the distance C is within the first range, and controls the first and second picked-up images such that the interval k is at the second value when the distance C is out of the first range. In the present embodiment, in particular, the first range is defined such that the distance C is within the first range when the distance C is a distance at which the observation object 101 can be comfortably observed, and the second value is smaller than the first value. Thus, according to the present embodiment, when the distance C changes from being a distance at which the observation object 101 can be comfortably observed to being a distance at which the observation object 101 is hard to recognize, the three-dimensional appearance of the three-dimensional image of the observation object 101 can be weakened, and when the distance C changes back from being a distance at which the observation object 101 is hard to recognize to being a distance at which the observation object 101 can be comfortably observed, the three-dimensional appearance of the three-dimensional image of the observation object 101 can be restored.
Note that the second value may be a single value or a plurality of values as long as the above-mentioned requirement for the second value is met.
In the present embodiment, the interval k between the center of the first output region 111 and the center of the second output region 121 is electrically changed. Thus, according to the present embodiment, as compared to a case in which a mechanism for physically changing the interval k between the center of the first output region 111 and the center of the second output region 121 is provided, a structure of the distal end portion of the insertion portion 10 of the endoscope 1 can be simplified, and the distal end portion can be made smaller.
In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the image generating unit 21 so as to change the display position of each of the left-eye image and the right-eye image of the 3D image on the display unit 3 (the second processing). Thus, according to the present embodiment, when the distance C is a distance at which the observation object 101 is hard to recognize, by changing the display position of each of the left-eye image and the right-eye image on the display unit 3 such that the distance between the point P1 at which the observation object 101 is positioned in the left-eye image and the point P2 at which the observation object 101 is positioned in the right-eye image decreases, the stereoscopic depth of the three-dimensional image of the observation object 101 can be decreased. Thus, according to the present embodiment, the difficulty in recognizing the observation object 101 can be resolved, and as a result, the difficulty in recognizing the stereoscopic image can be resolved.
Note that the distance between the point P1 and the point P2 on the display unit 3 may be defined based on, for example, the distance C from the observation windows 11A and 12A to the observation object 101, the interval k between the center of the first output region 111 and the center of the second output region 121, the interval between the left eye 201 and the right eye 202 of the observer (the pupil distance), the distance from the display unit 3 to the observer, and the like, regardless of whether to perform the second processing.
In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the illumination unit 14 so as to change the light quantity of the illumination light (the third processing). In the present embodiment, as described above, when the distance C from the observation windows 11A and 12A to the observation object 101 is a distance at which the observation object 101 cannot be comfortably observed even by performing the first processing, halation is caused or the stereoscopic image is darkened. In the present embodiment, by intentionally making the three-dimensional image of the observation object 101 harder to recognize in this manner, the difficulty in recognizing the stereoscopic image can be resolved.
In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the image generating unit 21 so as to perform blurring processing on the 3D image (the fourth processing). According to the present embodiment, when the distance C from the observation windows 11A and 12A to the observation object 101 is a distance at which the observation object 101 cannot be comfortably observed even by performing the first processing, by intentionally making the three-dimensional image of the observation object 101 harder to recognize by performing the blurring processing, the difficulty in recognizing the stereoscopic image can be resolved.
In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the image generating unit 21 so as to change the area of each of the left-eye image and the right-eye image of the 3D image displayed on the display unit 3 based on the display unit information acquired by the display unit information acquiring unit 23 (the fifth processing). In the present embodiment, as described above, by deleting a portion near the outer edge of each of the left-eye image and the right-eye image displayed on the display unit 3, a three-dimensional image near the outer edge of the perceived range of the stereoscopic image and at a portion far from the display unit 3 can be deleted. Thus, according to the present embodiment, the difficulty in recognizing the stereoscopic image can be resolved.
In the present embodiment, the displayed-image controlling unit 22 can perform processing of controlling the image generating unit 21 so as to change the display position of each of the left-eye image and the right-eye image of the 3D image on the display unit 3 based on the distance information and the display unit information (the sixth processing). In the present embodiment, as described above, when the display region 3a of the display unit 3 is larger than the predetermined threshold and the distance C from the observation windows 11A and 12A to the observation object 101 is out of the third range, by changing the display position of each of the left-eye image and the right-eye image on the display unit 3 such that the distance between the point P1 at which the observation object 101 is positioned in the left-eye image and the point P2 at which the observation object 101 is positioned in the right-eye image decreases, the stereoscopic depth of the three-dimensional image of the observation object 101 can be decreased. Thus, according to the present embodiment, the difficulty in recognizing the observation object 101 can be resolved, and as a result, the difficulty in recognizing the stereoscopic image can be resolved.
In the present embodiment, when the display determination unit 22A determines that the 3D image is not displayed on the display unit 3, the displayed-image controlling unit 22 can perform processing of controlling the image generating unit 21 so as to generate a single 2D image based on the first and second picked-up images (the seventh processing). Thus, according to the present embodiment, when the observation object 101 cannot be comfortably observed even by performing the processing of changing the displayed image, by displaying the 2D image, eyestrain or the like due to the difficulty in recognizing the stereoscopic image can be prevented.
When the observer attempts to observe a three-dimensional image that is at a position far from the display unit 3 and is hard to recognize, the image becomes out of focus, and the positions of the pupils and hence the direction of the line of sight fluctuate. Thus, a situation in which the direction of the line of sight fluctuates can be regarded as a situation in which the stereoscopic image is hard to recognize. In the present embodiment, when the amount of movement of the direction of the line of sight within the predetermined period of time is greater than or equal to the predetermined threshold, the displayed-image controlling unit 22 can control the endoscope 1 and the image generating unit 21 to perform at least the first processing among the foregoing first to seventh processing. Thus, according to the present embodiment, the difficulty in recognizing the stereoscopic image can be resolved.
When the display region 3a of the display unit 3 becomes relatively larger, a three-dimensional image of the observation object 101 positioned relatively far is perceived as being positioned farther from the observer, and a three-dimensional image of the observation object 101 positioned relatively close is perceived as being positioned closer to the observer. In this manner, as the display region 3a of the display unit 3 becomes larger, the position at which the three-dimensional image is perceived spreads farther in the depth direction and more easily falls outside the range R1 described above.
The present invention is not limited to the above-described embodiment, and various changes, modifications, and the like are possible without departing from the spirit of the present invention. For example, in the second processing, the fifth processing, and the sixth processing, the displayed-image controlling unit 22 may control the display unit 3, instead of controlling the image generating unit 21, to change the display position and area of the 3D image displayed on the display unit 3.
This application is a continuation application of PCT/JP2019/004611 filed on Feb. 8, 2019 and claims benefit of Japanese Application No. 2018-127059 filed in Japan on Jul. 3, 2018, the entire contents of which are incorporated herein by this reference.