The present disclosure relates to an endoscope apparatus and an image generation method for an endoscope apparatus.
For example, Patent Literature 1 mentioned below describes an endoscope apparatus including a light source lamp that supplies illumination light, light transmission means that transmits the illumination light, and an imaging device that captures an image of a subject illuminated with the illumination light transmitted by the light transmission means.
Patent Literature 1: Japanese Patent Application Laid-open No. 2002-119468
In an endoscope image, when depth information can be acquired in addition to an observation image, the depth information is applicable to many uses and becomes more useful as its accuracy and resolution become higher. The depth information is expected to have many applications, for example, to segmentation and registration of an organ or to object recognition. Known ranging technologies for acquiring depth information include a TOF (Time of Flight) method, a binocular parallax method, an SLAM (Simultaneous Localization and Mapping) method, a structured light method, and the like. However, the resolution of the TOF method is at a VGA level, and a high-definition image cannot be obtained. The binocular parallax method needs two lens holes at the leading end of a rigid scope, which makes it difficult to reduce the diameter thereof; if the lens diameter is reduced, the resolution or image quality deteriorates. SLAM assumes that the subject or the camera remains still, and within a living body the conditions are severe because of the motion of flexible objects, bright spots, changing illumination, and the like, which makes it difficult to track changes in feature points between images. In the structured light method, independent illumination windows for normal illumination and for structured light need to be provided at the leading end of a rigid scope, and hence the diameter is difficult to reduce. In addition, because the normal illumination and the structured illumination are switched in a time division manner in order to acquire a structured image, the frame rate of the normal image is halved, which is problematic.
In recent healthcare, the need for low invasiveness has been increasing. Because of its influence on postoperative QoL, there is a demand for minimizing the incised wound. Therefore, there is a demand for a further reduction in the diameter of an endoscope to be inserted into the human body.
The technology described in Patent Literature 1 described above relates to an endoscope apparatus including an imaging device that captures an image of a subject illuminated with illumination light transmitted by light transmission means. If depth information is to be acquired with such an apparatus, an additional irradiation window for illuminating the subject becomes necessary, which makes it difficult to reduce the diameter of the leading end of the endoscope. There is also a method of acquiring depth information from images of a plurality of viewpoints, as in a stereoscopic image; however, in this case as well, acquiring images of a plurality of viewpoints makes it difficult to reduce the diameter of the leading end of the endoscope.
In this regard, there has been a demand for simultaneously achieving both high-resolution, high-accuracy acquisition of depth information and a reduction in the diameter of the leading end of an endoscope.
According to the present disclosure, there is provided an endoscope apparatus including: an irradiation unit that irradiates a subject with illumination light and with a structured pattern for acquiring depth information of the subject; and an acquisition unit that acquires reflected light of the light emitted by the irradiation unit.
Further, according to the present disclosure, there is provided an image generation method for an endoscope apparatus, the endoscope apparatus including an irradiation unit that irradiates a subject with illumination light and with a structured pattern for acquiring depth information of the subject, and an acquisition unit that acquires reflected light of the light emitted by the irradiation unit, the method including: capturing an image of the reflected light and acquiring a captured image; and separating structured pixels of the structured pattern from the captured image and generating an observation image.
According to the present disclosure, it is possible to simultaneously achieve both high-resolution, high-accuracy acquisition of depth information and a reduction in the diameter of the leading end of an endoscope.
It should be noted that the effects described above are not necessarily limitative. Along with or in place of the above effects, any of the effects described in this specification or other effects that may be grasped from this specification may be achieved.
Hereinafter, a suitable embodiment of the present disclosure will be described in detail with reference to the accompanying drawings. It should be noted that in this specification and drawings, constituent elements having substantially the same functional configurations are denoted by the same reference symbols and overlapping description thereof will be omitted.
It should be noted that description will be given in the following order.
The fiber 104 causes illumination light to propagate within the rigid scope 100 to irradiate a subject with light. When the subject is irradiated with light from the fiber 104, reflected light from the subject reaches an imaging surface of the image sensor 202 via the relay lens 102, and an image signal is acquired. As shown in
The camera control unit 300 includes an image signal processing unit 320, a DMD drive control unit 304, a DMD 306, a fiber 308, and a light source 310. The DMD drive control unit 304 and the DMD 306 constitute a DLP-type light source (irradiation-pattern output unit 330) that can appropriately switch between observation illumination and structured-pattern illumination. As an example, the resolution of the DMD 306 is approximately 2560×1600 pixels, whereas the diameter of the fiber 308 corresponds to approximately 966 pixels, and a condenser lens is therefore provided between the DMD 306 and the fiber 308. Because the DLP method has a short response time and provides a bright light source, it is suitable for this embodiment. Meanwhile, another method such as a transmissive or reflective liquid crystal method is also applicable to this embodiment if it satisfies the requirements for response time and brightness.
The image signal processing unit 320 includes an observation-image generation unit 312, a ranging-image extraction unit 314, a subject recognition unit 316, and an irradiation-pattern generation unit 318.
When the rigid scope 100 is inserted into the inside of a human body and a subject such as an internal organ is imaged, there is a demand for acquiring distance information of the subject. In this embodiment, as shown in
Meanwhile, in a case where both of observation illumination for a normal observation and illumination independently used for structured light are performed, as shown in
Further, in a case where the camera head 200 is configured to be capable of stereoscopic vision, as shown in
In this embodiment, a single illumination window 108 is provided at the leading end of the rigid scope 100, and the observation illumination and the illumination of the structured pattern 510 are switched on the camera control unit 300 side to irradiate the subject. Subsequently, signal processing is performed such that an observation image and a structured image are separated from the image acquired by the image sensor 202. Accordingly, the distance information can be obtained from the structured pattern 510, and since it is unnecessary to separately provide an illumination window for irradiating the subject with the structured pattern 510, the diameter of the leading end of the rigid scope 100 can be reduced.
Hereinafter, switching of the structured pattern 510 will be described. In this embodiment, the structured pattern 510 is switched in chronological order. More specifically, the subject is irradiated with an irradiation pattern 500, which is obtained by inverting the phases of the structured pattern 510 and the observation illumination on a frame-by-frame basis, and the subject is imaged by the image sensor 202. A captured image 530 obtained by the imaging includes structured pixels 532 in which a position of the structured pattern 510 changes depending on the shape of the subject. The image signal processing unit 320 extracts a ranging image 540 obtained when the structured pixels 532 are separated from the captured image 530, and also generates an observation image 550 that does not include the structured pixels 532.
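As a non-limiting illustration of the frame-by-frame inversion described above, the following Python sketch builds a per-frame label map for the DMD. The function and constant names, the coarse checkerboard base mask, and the use of the DMD resolution given above are assumptions made only for this example and do not reproduce the exact pattern of the embodiment.

```python
import numpy as np

OBSERVATION = 0  # pixel driven as observation illumination
STRUCTURED = 1   # pixel driven as structured-pattern illumination

def make_irradiation_pattern(base_mask, frame_index):
    """Return a DMD-resolution label map for one frame.

    base_mask marks the pixels assigned to the structured pattern; on odd
    frames the mask is inverted, so a pixel that carried the structured
    pattern in one frame carries observation illumination in the next.
    """
    structured = base_mask if frame_index % 2 == 0 else ~base_mask
    return np.where(structured, STRUCTURED, OBSERVATION).astype(np.uint8)

# Illustrative base mask: a coarse checkerboard whose complement is used on odd frames.
ys, xs = np.indices((1600, 2560))
base = ((ys // 32 + xs // 32) % 2).astype(bool)
frame0 = make_irradiation_pattern(base, 0)
frame1 = make_irradiation_pattern(base, 1)
```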
In Step S10, whether a pixel is a structured pixel 532 can be determined on the basis of the color of that pixel. If the color of the pixel is a color that does not normally occur in a human body (e.g., blue), the pixel can be determined to be a structured pixel 532. Whether a pixel is a structured pixel 532 can also be determined on the basis of the determination result of the previous frame: as described above, since the phases of the structured pattern 510 and the observation illumination 520 are inverted on a frame-by-frame basis, a pixel that is not a structured pixel 532 in the previous frame can be determined to be a structured pixel 532 in the current frame.
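A minimal sketch of this determination is given below, assuming an RGB captured image and a simple blue-dominance test combined with the frame-parity cue. The function name, threshold, and array conventions are illustrative assumptions, not part of the embodiment.

```python
import numpy as np

def detect_structured_pixels(rgb_frame, prev_structured=None, blue_margin=40):
    """Flag structured pixels 532 by color and, when available, by frame parity."""
    r = rgb_frame[..., 0].astype(np.int16)
    g = rgb_frame[..., 1].astype(np.int16)
    b = rgb_frame[..., 2].astype(np.int16)
    by_color = (b - np.maximum(r, g)) > blue_margin   # blue rarely occurs inside the body
    if prev_structured is None:
        return by_color
    # Since the irradiation pattern is inverted every frame, a pixel that was
    # not structured in the previous frame is the candidate in the current one.
    return by_color & ~prev_structured
```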
In Step S12, the interpolated pixel value is calculated by the following methods. In a first method, among the pixels of the input frame, the value of a peripheral pixel that is not a structured pixel 532 is used as the interpolated pixel value. For example, a pixel value of a pixel 600 in the frame 1 of
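The sketch below illustrates both spatial interpolation from non-structured neighboring pixels and temporal interpolation from the corresponding pixel of a different frame, as in configurations (5) and (6) below. The function name, the 4-neighbor choice, and the mean-based filling are assumptions made for this example.

```python
import numpy as np

def interpolate_structured_pixels(frame, structured, prev_frame=None, prev_structured=None):
    """Fill structured-pixel positions to produce observation-image values."""
    out = frame.astype(np.float32).copy()
    h, w = structured.shape
    ys, xs = np.nonzero(structured)
    for y, x in zip(ys, xs):
        # Temporal interpolation: the same pixel in the previous frame was not
        # structured there because of the frame-by-frame inversion.
        if prev_frame is not None and prev_structured is not None and not prev_structured[y, x]:
            out[y, x] = prev_frame[y, x]
            continue
        # Spatial interpolation: average the non-structured 4-neighbours.
        vals = []
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not structured[ny, nx]:
                vals.append(frame[ny, nx])
        if vals:
            out[y, x] = np.mean(vals, axis=0)
    return out
```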
In order to perform the processing as described above, the observation-image generation unit 312 of the image signal processing unit 320 separates the structured pixels 532 of the structured pattern 510 from the captured image 530 and generates an observation image 550. The ranging-image extraction unit 314 extracts the structured pixels 532 of the structured pattern 510 from the captured image 530 and acquires a ranging image 540.
Regarding the structured pattern 510, the following variations are conceivable. In a first variation, the structured pattern 510 is a spatially uniform geometric pattern, and an organ 700 is irradiated with the structured pattern 510. In a second variation, as shown in
In a third variation, as shown in
Thus, the subject recognition unit 316 of the image signal processing unit 320 recognizes the subject from the captured image 530. The irradiation-pattern generation unit 318 adaptively disposes the structured pattern 510 on the basis of a result of the recognition of the subject by the subject recognition unit 316 and generates the irradiation pattern 500. More specifically, the subject recognition unit 316 recognizes the outline of the subject from the captured image 530, and the irradiation-pattern generation unit 318 disposes the structured pattern 510 at the position of the outline and generates the irradiation pattern 500. Further, the subject recognition unit 316 recognizes a pattern of the surface of the subject from the captured image 530, and the irradiation-pattern generation unit 318 disposes the structured pattern 510 so as not to coincide with that pattern and generates the irradiation pattern 500. Information of the irradiation pattern 500 is transmitted to the irradiation-pattern output unit 330 constituted by the DMD drive control unit 304 and the DMD 306, and the subject is irradiated with the irradiation pattern 500 via the fibers 104 and 308.
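As a rough, non-limiting illustration of outline-based placement, the sketch below approximates the subject outline by luminance gradients and keeps only a sparse subset of those positions for the structured pattern. The gradient threshold, the spacing, and the function name are assumptions; a similar mask test could also be used to avoid positions that coincide with strong surface patterns.

```python
import numpy as np

def pattern_positions_on_outline(gray, grad_thresh=30.0, spacing=16):
    """Place sparse structured-pattern positions along an estimated subject outline."""
    g = gray.astype(np.float32)
    gy, gx = np.gradient(g)
    outline = np.hypot(gx, gy) > grad_thresh   # crude outline estimate from luminance gradients
    grid = np.zeros_like(outline)
    grid[::spacing, ::spacing] = True          # keep the pattern sparse
    return outline & grid                      # boolean mask of pattern positions
```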
Further, regarding the color of the structured pattern 510, the following variations are conceivable. In a first variation, the color of the structured pattern 510 is set to a plain color. For example, the color is set to blue, which has a small probability of occurring in a living body (a small spectral distribution). Accordingly, in Step S10 of
After the ranging image 540 and the observation image 550 are separated from each other, depth information is estimated from the ranging image 540. The estimation of the depth information can be performed by a well-known structured light method.
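For reference, a minimal triangulation relation of the kind commonly used in structured light ranging is sketched below. The rectified projector-camera geometry, the parameter names, and the simple disparity relation are assumptions for illustration and do not reproduce the specific algorithm of the embodiment.

```python
def depth_from_displacement(x_observed, x_reference, focal_px, baseline_mm):
    """Estimate depth from the displacement of one structured-pattern feature.

    x_observed is the column at which the feature is detected in the ranging
    image, x_reference is the column it would occupy on a distant reference
    plane, focal_px is the focal length in pixels, and baseline_mm is the
    projector-camera baseline; depth follows z = f * b / d.
    """
    disparity = x_observed - x_reference
    if disparity == 0:
        return float("inf")
    return focal_px * baseline_mm / disparity
```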
Next, description will be given of an example in which irradiation is performed with a structured pattern 510 that is constant over time and uniformly sparse. In
Once the ranging image 540 has been extracted, the structured pixels 532 corresponding to the structured pattern 510 are interpolated by using surrounding pixels, and the observation image 550 can thus be obtained.
The basic processing of generating the observation image 550 is similar to that of
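Because the structured pixels occupy known, fixed positions in this variant, the separation reduces to masking the captured image with a predetermined pattern mask. A minimal sketch under that assumption follows; the function name and zero-filling convention are illustrative.

```python
import numpy as np

def split_with_fixed_mask(frame, fixed_mask):
    """Split a captured frame using a constant, known structured-pixel mask.

    The ranging image keeps only the values under the fixed mask; the
    observation image keeps everything else, and the masked-out positions are
    later filled by the spatial interpolation described above.
    """
    mask = fixed_mask if frame.ndim == 2 else fixed_mask[..., None]
    ranging = np.where(mask, frame, 0)
    observation = np.where(mask, 0, frame)
    return ranging, observation
```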
The ranging mode using the irradiation with the structured pattern 510 can be adaptively switched on and off. For example, in a case where large motion is detected in the observation image 550, the ranging mode is turned off in order to avoid negative effects caused by erroneous detection, and accordingly the irradiation with the structured pattern 510 is not performed. Further, in a case where mist occurs around the subject, ranging may be erroneously detected due to reflection of light or the like; the ranging mode is therefore turned off and the irradiation with the structured pattern 510 is not performed, so that negative effects caused by erroneous detection can be avoided. It is also possible to keep the ranging mode off in normal operation and to turn it on to perform the irradiation with the structured pattern 510 only in a scene where a precise procedure such as suturing is necessary.
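The on/off decision could be driven by simple image statistics, as in the hypothetical sketch below. The motion and saturation thresholds are assumptions, and the saturation ratio is only a rough proxy for mist or specular reflection, not the detection method of the embodiment.

```python
import numpy as np

def ranging_mode_enabled(prev_gray, curr_gray, motion_thresh=8.0, bright_ratio_thresh=0.05):
    """Decide whether to keep the ranging mode (structured-pattern irradiation) on."""
    # Mean absolute frame difference as a crude motion measure.
    motion = np.mean(np.abs(curr_gray.astype(np.float32) - prev_gray.astype(np.float32)))
    # Fraction of near-saturated pixels as a crude mist / specular-reflection cue.
    saturated = np.mean(curr_gray >= 250)
    return motion < motion_thresh and saturated < bright_ratio_thresh
```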
Hereinabove, a suitable embodiment of the present disclosure has been described in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to the above examples. It is obvious that a person having ordinary skill in the technical field of the present disclosure could arrive at various alterations or modifications within the scope of the technical ideas described in the claims, and it should be understood that they naturally come under the technical scope of the present disclosure.
The effects described in this specification are merely illustrative or exemplary, and are not limitative. That is, along with or in place of the above effects, the technology according to the present disclosure may achieve other effects that are clear to those skilled in the art from the description of this specification.
It should be noted that the following configurations also come under the technical scope of the present disclosure.
(1) An endoscope apparatus, including:
an irradiation unit that irradiates a subject with illumination light and with a structured pattern for acquiring depth information of the subject; and an acquisition unit that acquires reflected light of the light emitted by the irradiation unit.
(2) The endoscope apparatus according to (1), further including:
an imaging device that captures an image of the reflected light; and
an image signal processing unit that processes an image signal of a captured image captured by the imaging device, in which
the image signal processing unit includes an observation-image generation unit that separates structured pixels of the structured pattern from the captured image and generates an observation image.
(3) The endoscope apparatus according to (2), in which
the image signal processing unit includes a ranging-image extraction unit that extracts the structured pixels of the structured pattern from the captured image and acquires a ranging image.
(4) The endoscope apparatus according to (2), in which
the observation-image generation unit interpolates image information of pixels, from which the structured pixels are separated, and generates the observation image.
(5) The endoscope apparatus according to (4), in which
the observation-image generation unit interpolates the image information of the pixels, from which the structured pixels are separated, by using image information of a peripheral pixel.
(6) The endoscope apparatus according to (4), in which
the observation-image generation unit interpolates the image information of the pixels, from which the structured pixels are separated, by using image information of a corresponding pixel in a different frame.
(7) The endoscope apparatus according to (2), further including
an irradiation-pattern output unit that outputs the illumination light and an irradiation pattern including the structured pattern to the irradiation unit.
(8) The endoscope apparatus according to (7), in which
the irradiation-pattern output unit outputs the irradiation pattern by inverting pixels to which the structured pattern is output and pixels to which the structured pattern is not output on a frame-by-frame basis.
(9) The endoscope apparatus according to (7), in which
the irradiation-pattern output unit sets pixels to which the structured pattern is output as constant pixels that do not change for each frame, and outputs the irradiation pattern.
(10) The endoscope apparatus according to (7), in which
the irradiation-pattern output unit outputs the irradiation pattern including the structured pattern of a predetermined color.
(11) The endoscope apparatus according to (10), in which
the predetermined color is a color that does not occur inside of a human body.
(12) The endoscope apparatus according to (11), in which
the predetermined color includes blue or yellow.
(13) The endoscope apparatus according to (7), in which
the image signal processing unit includes a subject recognition unit that recognizes the subject from the captured image, and an irradiation-pattern generation unit that adaptively disposes the structured pattern on the basis of a result of the recognition of the subject by the subject recognition unit and generates the irradiation pattern.
(14) The endoscope apparatus according to (13), in which
the subject recognition unit recognizes an outline of the subject from the captured image, and
the irradiation-pattern generation unit disposes the structured pattern at a position of the outline and generates the irradiation pattern.
(15) The endoscope apparatus according to (13), in which
the subject recognition unit recognizes a pattern of a surface of the subject from the captured image, and
the irradiation-pattern generation unit disposes the structured pattern to avoid coinciding with the pattern and generates the irradiation pattern.
(16) An image generation method for an endoscope apparatus, the endoscope apparatus including an irradiation unit that irradiates a subject with illumination light and with a structured pattern for acquiring depth information of the subject, and an acquisition unit that acquires reflected light of the light emitted by the irradiation unit, the method including:
capturing an image of the reflected light and acquiring a captured image; and
separating structured pixels of the structured pattern from the captured image and generating an observation image.
Number | Date | Country | Kind |
---|---|---|---|
2017-000117 | Jan 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2017/042293 | 11/24/2017 | WO | 00 |