1. Field of the Invention
The present invention relates to an image processing system, and an imaging device, a receiving device, and an image display device thereof.
2. Description of the Related Art
In recent years, capsule body-insertable apparatuses (for example, capsule endoscopes) provided with an imaging function and a radio communication function have been proposed in the field of endoscopes, and body-insertable systems that acquire intra-subject images using such capsule endoscopes have been developed. To make intra-subject observations (examinations), a capsule endoscope is, for example, swallowed through the mouth of a subject, then moves through a body cavity, for example, inside organs such as the stomach and small intestine, following peristaltic movement, and captures intra-subject images at intervals of, for example, 0.5 s before being naturally discharged.
While the capsule endoscope moves through the inside of a subject, images captured by the capsule endoscope are received by an external image display device via an antenna arranged on the body surface of the subject. The image display device has a radio communication function for the capsule endoscope and a memory function for images, and successively stores the images received from the in-vivo capsule endoscope into a memory. A doctor or nurse can make intra-subject observations (examinations) and make a diagnosis by displaying the images accumulated in such an image display device, that is, images of the inside of the alimentary canal of the subject, on a display.
Japanese Laid-open Patent Publication No. 2006-247404 describes an in-vivo imaging device in which a plurality of individual light sources and a plurality of individual optical sensors are arranged, and in which the operation and gain of the light sources are controlled based on the quantity of light, reflected by an object while the light sources operate, that is sensed by the optical sensors.
An image processing system according to an aspect of the present invention includes an image generation unit that has two observation modes of a first observation mode for capturing an image under illumination by a first light source and a second observation mode for capturing an image under illumination by a second light source different from the first light source and generates an image to be displayed based on the image captured by selecting one of the observation modes; a brightness detection unit that detects brightness of the image captured in one observation mode; and a control unit that controls an exposure operation or image processing in the other observation mode performed subsequent to an observation in the one observation mode based on the brightness of the image detected by the brightness detection unit.
An imaging device according to another aspect of the present invention includes an image generation unit that has two observation modes of a first observation mode for capturing an image under illumination by a first light source and a second observation mode for capturing an image under illumination by a second light source different from the first light source and generates an image to be displayed based on the image captured by selecting one of the observation modes; a brightness detection unit that detects brightness of the image captured in one observation mode; a control unit that controls an exposure operation or image processing in the other observation mode performed subsequent to an observation in the one observation mode based on the brightness of the image detected by the brightness detection unit; and a transmission unit that transmits the image generated by the image generation unit.
A receiving device according to still another aspect of the present invention includes an image receiving unit that receives each of two images of an image captured under illumination by a first light source and an image captured under illumination by a second light source different from the first light source; a recording unit that records the image received by the image receiving unit in a predetermined recording region; a brightness detection unit that detects brightness of the image received by the image receiving unit; and a control unit that controls whether to allow the recording unit to record the image based on the brightness of the image detected by the brightness detection unit.
An image display device according to still another aspect of the present invention includes an image processing unit that performs predetermined image processing on each of two images of an image captured under illumination by a first light source and an image captured under illumination by a second light source different from the first light source; a brightness detection unit that detects brightness of the image; a control unit that controls the predetermined image processing on the image by the image processing unit based on the brightness of the image detected by the brightness detection unit; and a display unit that displays at least one of the image and the image on which the predetermined image processing has been performed by the image processing unit.
The above and other features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Preferred embodiments of the image processing system and the imaging device, receiving device, and image display device thereof according to the present invention will be described in detail with reference to drawings. All embodiments shown below can be combined in all their configurations or a portion thereof when appropriate.
First, before describing a capsule endoscope system, the configuration of an image processing system according to the first embodiment, which serves as the basic concept of the capsule endoscope system, will be described in detail using the drawings.
In the present embodiment, any information indicating image brightness can be used as brightness information, for example, an exposure time when the imaging unit 104 captures an image, average luminance of images acquired by the imaging unit 104, or an integral value (also called a light exposure) of the signal strength of pixels contained in a predetermined region of an acquired image.
The control unit 107 exercises control such as determining the light emission quantity (power) or light emission time of the illuminating unit 103 and selecting the type of a driven light source based on brightness information detected by the brightness detection unit 106. The control unit 107 also exercises control such as determining the exposure time by the imaging unit 104 and selecting the type (one or more of R, G, and B) of pixels of an image signal to be read similarly based on the detected brightness information. Further, the control unit 107 exercises control such as changing various parameters in image processing by the image processing unit 105 and selecting the image processing function to be executed similarly based on the detected brightness information.
In the present embodiment, as described above, appropriate control in accordance with image brightness is exercised by controlling the imaging unit 104, the illuminating unit 103, or the image processing unit 105 in the image generation unit 101 based on the acquired brightness information, so that image data itself can be generated and processing on the generated image data can be performed with stability.
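As a rough, non-limiting sketch of this kind of brightness-based control, the following Python fragment illustrates how a control unit might adjust an exposure time and a light emission quantity from the average luminance of the last acquired image. The target luminance, tolerance, and adjustment factors are assumptions introduced for illustration and do not appear in the embodiment.

    # Minimal sketch of brightness-based control; all numeric values are assumed.
    TARGET_LUMINANCE = 128      # assumed mid-scale target for 8-bit pixel values
    TOLERANCE = 16              # assumed acceptable deviation from the target

    def average_luminance(pixels):
        """Brightness information: mean pixel value of a predetermined region."""
        return sum(pixels) / len(pixels)

    def control(exposure_ms, emission_mw, pixels):
        """Raise or lower the exposure time and light emission quantity so that
        the next image moves toward the target brightness."""
        y = average_luminance(pixels)
        if y < TARGET_LUMINANCE - TOLERANCE:        # image too dark
            exposure_ms *= 1.25
            emission_mw *= 1.25
        elif y > TARGET_LUMINANCE + TOLERANCE:      # image too bright
            exposure_ms *= 0.8
            emission_mw *= 0.8
        return exposure_ms, emission_mw

    dark_region = [40] * 100                        # simulated dark image region
    print(control(0.5, 10.0, dark_region))          # longer exposure, more light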
Next, as an image processing system according to the second embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
The capsule endoscope 2 is equipped with the imaging function and radio communication function inside a capsule casing. The capsule endoscope 2 is inserted into an organ of the subject 1 through oral ingestion or the like and then successively captures in-vivo images of the subject 1 at predetermined intervals (for example, at intervals of 0.5 s) while moving through the inside of the organs of the subject 1 due to peristaltic movement or the like. More specifically, the capsule endoscope 2 alternately captures ordinary images using white light (ordinary light observation) and spectral images, such as sharp blood vessel images of the inner wall of the body cavity, generated by using special light consisting of the specific color components of blue and green (special light observation), with a plurality of repetitions of each. The capsule endoscope 2 transmits image signals of the in-vivo images of the subject 1 captured in this manner to the outside receiving device 3 by radio. The capsule endoscope 2 successively repeats this imaging operation and radio transmission operation of in-vivo images during the period between insertion into the organs of the subject 1 and discharge out of the subject 1.
The receiving device 3 is equipped with a plurality of receiving antennas 3a to 3h arranged, for example, on a body surface of the subject 1 in a distributed fashion and receives a radio signal from the capsule endoscope 2 inside the subject 1 via at least one of the plurality of receiving antennas 3a to 3h. The receiving device 3 extracts an image signal from the radio signal output from the capsule endoscope 2 to acquire image data of in-vivo images contained in the extracted image signal.
The receiving device 3 also performs various kinds of image processing on the acquired image data and stores a group of the image-processed in-vivo images in the recording medium 5 inserted in advance. The receiving device 3 also associates each image of the group of in-vivo images with time data such as the imaging time or receiving time.
The receiving antennas 3a to 3h of the receiving device 3 may be arranged, as shown in
The image display device 4 is configured like a workstation that captures various kinds of data such as a group of in-vivo images of the subject 1 via the recording medium 5 and displays various kinds of data of the captured group of in-vivo images or the like. More specifically, after the recording medium 5 removed from the receiving device 3 is inserted into it, the image display device 4 captures the saved data of the recording medium 5 to acquire various kinds of data such as a group of in-vivo images of the subject 1. The image display device 4 has a function to display the acquired in-vivo images on a display. A diagnosis is made based on the image display by the image display device 4.
The recording medium 5 is a portable recording medium to exchange data between the receiving device 3 and the image display device 4 described above. The recording medium 5 is structured to be removable from the receiving device 3 and the image display device 4 and to be able to output and record data when inserted into one of the receiving device 3 and the image display device 4. More specifically, when inserted into the receiving device 3, the recording medium 5 records a group of in-vivo images processed by the receiving device 3 and time data of each image.
The capsule endoscope 2 contains various functions inside a capsule casing 21, one end of which is covered with a dome-shaped transparent cover 20, and the illuminating unit and imaging unit are arranged on that end side. As shown in
A special light observation using the special light source 11 will be described. First, as shown in
Therefore, in the special light observation, contrast information of the blood vessel can be obtained and also a spectral image, which is a blood vessel image, can be obtained by irradiating an object with light having wavelengths of blue and green and using an imaging element having sensitivity characteristics of wavelengths of blue and green.
The illuminating unit 51 includes the ordinary light source 10 and the special light source 11 described above and a light source control circuit 61 that drives and controls the ordinary light source 10 and the special light source 11. If the same current is supplied to the ordinary light source 10 and the special light source 11, the special light source 11 emits special light whose quantity of light is smaller than that of ordinary light. The imaging unit 52 includes the above imaging element 14 and an imaging element control circuit 62 that drives and controls the imaging element 14. Further, the state detection unit 53 includes a sensor unit 63 and a sensor unit control circuit 64 that drives and controls the sensor unit 63. The sensor unit 63 is realized by at least a sensor capable of detecting whether the capsule endoscope 2 is in a liquid such as water (whether in a liquid or a gas) inside the subject 1.
The system controller 54 includes an exposure time measuring unit 71 and an observation mode controller 72. The exposure time measuring unit 71 measures the exposure time of at least an ordinary light observation as brightness information. The observation mode controller 72, on the other hand, controls the operation of an ordinary light observation mode corresponding to a first observation mode for capturing an ordinary light image and a special light observation mode corresponding to a second observation mode for capturing a special light image, based on the exposure time information measured by the exposure time measuring unit 71.
The observation mode control processing procedure by the observation mode controller 72 will be described with reference to
On the other hand, if the exposure time of ordinary light is not successively equal to the specified value or more (step S105, No), the observation mode controller 72 causes the special light source 11 to emit special light (step S106) and proceeds to step S102 to acquire a special light image by capturing the image through the imaging unit 52. That is, the observation mode controller 72 causes the imaging unit 52 to perform an operation in the special light observation mode.
Namely, even when a special light observation would be made in the preset alternating order, if, during ordinary light observations, the last exposure time and the exposure time before that are both equal to the specified value or more, in other words, if the quantity of reflected ordinary light is small, the observation mode controller 72 makes an ordinary light observation instead of the special light observation, because a special light image having sufficient brightness cannot be obtained when the quantity of reflected light is this small, even if the special light observation were performed.
Thus, while the ordinary light observation is always made in a time zone in which the ordinary light observation and special light observation are alternately made, if the exposure time during ordinary light observation is successively equal to the specified value ΔT3 or more, a special light observation immediately thereafter is not made and instead, an ordinary light observation is made. Accordingly, an ordinary light image with sufficient brightness can be obtained, instead of a special light image without sufficient brightness, leading to efficient use of power.
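A minimal Python sketch of this mode decision is given below. The value of the specified exposure time ΔT3, the two-sample history, and the function names are assumptions introduced for illustration only.

    from collections import deque

    DELTA_T3_MS = 40.0   # assumed specified value for the exposure time

    class ObservationModeController:
        """Decides whether the next frame in the alternating sequence should be
        a special light observation or fall back to an ordinary light one."""

        def __init__(self):
            # exposure times of the last two ordinary light observations
            self.ordinary_exposures = deque(maxlen=2)

        def record_ordinary_exposure(self, exposure_ms):
            self.ordinary_exposures.append(exposure_ms)

        def next_mode(self):
            # If the last two ordinary-light exposure times both reached the
            # specified value, reflected light is weak, so a special light image
            # would be too dark; make another ordinary light observation instead.
            if (len(self.ordinary_exposures) == 2 and
                    all(t >= DELTA_T3_MS for t in self.ordinary_exposures)):
                return "ordinary"
            return "special"

    controller = ObservationModeController()
    controller.record_ordinary_exposure(45.0)
    controller.record_ordinary_exposure(50.0)
    print(controller.next_mode())   # "ordinary": the special observation is skipped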
Next, as an image processing system according to the third embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to the second embodiment, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system. In the third embodiment, the special light source 11 includes a pair of wide-directivity special light sources 111 (111a to 111c) having wide directivity with regard to the optical axis of the imaging element 14 and narrow-directivity special light sources 112 (112a to 112c) having narrow directivity. As shown in
Then, if the exposure time of ordinary light is not successively equal to the specified value or more (step S205, No), the observation mode controller 72 causes the narrow-directivity special light sources 112 and the wide-directivity special light sources 111 to emit light (step S206) before proceeding to step S202 to cause an operation in special light observation mode.
On the other hand, if the exposure time of ordinary light is successively equal to the specified value or more (step S205, Yes), the observation mode controller 72 further determines whether the capsule endoscope 2 is in a liquid based on a detection result of the sensor unit 63 (step S207). If the capsule endoscope 2 is not in a liquid (step S207, No), the capsule endoscope 2 is in a gas and the observation mode controller 72 proceeds to step S201 to cause an ordinary light observation in this special light observation period. On the other hand, if the capsule endoscope 2 is in a liquid (step S207, Yes), the observation mode controller 72 causes only the wide-directivity special light sources 111 to emit light (step S208) before proceeding to step S202 to cause a special light observation. In this case, the wide-directivity light is irradiated so that a special light image of the surroundings of an object close to the capsule endoscope 2 can be obtained.
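The following Python sketch illustrates, under assumed threshold values and data structures that are not part of the embodiment, how the source selection of the third embodiment might be expressed: both directivities in the normal case, only the wide-directivity sources in a liquid, and no special light observation at all in a gas with weak reflection.

    DELTA_T3_MS = 40.0   # assumed specified exposure-time value, as before

    def select_special_light_sources(exposures_ms, in_liquid):
        """Return which special light sources to drive for the next special light
        frame. exposures_ms: last two ordinary-light exposure times;
        in_liquid: detection result of the sensor unit."""
        long_exposures = len(exposures_ms) == 2 and all(
            t >= DELTA_T3_MS for t in exposures_ms)
        if not long_exposures:
            # Normal case: drive both narrow- and wide-directivity sources.
            return {"narrow": True, "wide": True}
        if in_liquid:
            # In a liquid, nearby surroundings can still be imaged:
            # drive only the wide-directivity sources.
            return {"narrow": False, "wide": True}
        # In a gas with weak reflection: skip the special observation entirely.
        return None

    print(select_special_light_sources([45.0, 50.0], in_liquid=True))   # wide only
    print(select_special_light_sources([45.0, 50.0], in_liquid=False))  # None -> ordinary
    print(select_special_light_sources([20.0, 25.0], in_liquid=False))  # both sources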
In the above second and third embodiments, it is assumed that the exposure time measuring unit 71 measures the exposure time of the imaging unit 52, but the present invention is not limited to this and measurements may be made by associating the light emission quantity of the ordinary light sources 10 and 110 with the exposure time. In this case, instead of step S105 in the flow chart shown in
In the above second to fourth embodiments, the observation mode controller 72 performs processing to determine whether to make a special light observation or to replace a special light observation with an ordinary light observation without making the special light observation, but the present invention is not limited to this and, for example, as shown in
Next, as an image processing system according to the sixth embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to fifth embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
In the sixth embodiment, as shown in
The system controller 54 includes a light emission quantity adjustment unit 171 that makes light emission quantity adjustments of the ordinary light sources 10 and the special light sources 11 corresponding to each of ordinary light images and special light images. The system controller 54 also includes an observation mode controller 172 that exercises mode control such as switching each observation mode to capture ordinary light images and special light images.
The light emission quantity adjustment processing procedure by the light emission quantity adjustment unit 171 will be described with reference to
On the other hand, if the image is not an ordinary light image (step S501, No), the light emission quantity adjustment unit 171 adds up green (G) pixels and blue (B) pixels within a predetermined range of the special light image obtained last time (step S505). Then, the light emission quantity adjustment unit 171 determines whether the added value is within an appropriate range (step S506). If the added value is not within the appropriate range (step S506, No), the light emission quantity adjustment unit 171 makes light emission quantity adjustments of the special light sources 11 (step S507) so that the image brightness is within the appropriate range before proceeding to step S508. If the added value is within the appropriate range (step S506, Yes), the light emission quantity adjustment unit 171 directly proceeds to step S508. Then, in step S508, the light emission quantity adjustment unit 171 determines whether the light emission quantity adjustment processing has terminated and only if the processing has not terminated (step S508, No), the light emission quantity adjustment unit 171 repeats the above processing and if the processing has terminated (step S508, Yes), the light emission quantity adjustment unit 171 terminates the present processing.
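A possible Python rendering of this adjustment is sketched below. The appropriate range, the adjustment step, and the ordinary-light branch (steps S501 to S504), which is assumed by analogy to add up pixels of the ordinary light image and adjust the ordinary light sources 10, are illustrative assumptions rather than values taken from the embodiment.

    APPROPRIATE_RANGE = (50_000, 200_000)   # assumed range for the added pixel value
    STEP = 0.1                              # assumed relative adjustment step

    def added_value(image, channels):
        """Add up the listed channels of every pixel in a predetermined range.
        image: list of (r, g, b) tuples; channels: e.g. ("g", "b")."""
        index = {"r": 0, "g": 1, "b": 2}
        return sum(px[index[c]] for px in image for c in channels)

    def adjust_emission(emission, image, is_ordinary):
        """Raise or lower the light emission quantity of the corresponding sources
        when the added value leaves the appropriate range (steps S505 to S507,
        and the assumed ordinary-light counterpart)."""
        channels = ("r", "g", "b") if is_ordinary else ("g", "b")
        total = added_value(image, channels)
        low, high = APPROPRIATE_RANGE
        if total < low:
            emission *= 1.0 + STEP          # image too dark: emit more light
        elif total > high:
            emission *= 1.0 - STEP          # image too bright: emit less light
        return emission

    frame = [(120, 80, 60)] * 256           # simulated special light image region
    print(adjust_emission(10.0, frame, is_ordinary=False))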
In the sixth embodiment, light emission quantity adjustments are individually made for each of ordinary light images and special light images and thus, each image can be obtained as an image having individually appropriate brightness.
Light emission quantities of the ordinary light sources 10 and the special light sources 11 are adjusted in the sixth embodiment, but the present invention is not limited to this and the exposure time may be adjusted for each of ordinary light images and special light images.
Different addition operations are performed in steps S502 and S505 in the sixth embodiment described above, but the present invention is not limited to this and all pixels may be added up in each of steps S502 and S505. That is, the addition processing of steps S502 and S505 may be made common processing. In such a case, it is preferable to set each appropriate range in steps S503 and S506 differently.
Next, as an image processing system according to the seventh embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to sixth embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
In the seventh embodiment, as brightness information, the luminance of each of the ordinary light images and special light images is calculated by a calculation formula corresponding to the output characteristics of each image, and light emission quantity adjustments of the ordinary light sources 10 and the special light sources 11 are made based on the calculated luminance.
The light emission quantity adjustment unit 171 according to the seventh embodiment makes, like the light emission quantity adjustment unit 171 according to the sixth embodiment, light emission quantity adjustments, but performs the processing according to the light emission quantity adjustment processing procedure shown in
YW=0.30×R+0.59×G+0.11×B (1)
Then, the light emission quantity adjustment unit 171 determines whether the average luminance YW is within an appropriate range, that is, the image has appropriate brightness (step S603). If the average luminance YW is not within the appropriate range (step S603, No), the light emission quantity adjustment unit 171 makes light emission quantity adjustments of the ordinary light sources 10 (step S604) so that the image brightness is within the appropriate range before proceeding to step S608. On the other hand, if the average luminance YW is within the appropriate range (step S603, Yes), the light emission quantity adjustment unit 171 directly proceeds to step S608 to allow the currently set light emission quantity of the ordinary light sources 10 to be maintained.
On the other hand, if the image is not an ordinary light image (step S601, No), the light emission quantity adjustment unit 171 calculates average luminance based on values of green (G) pixels and blue (B) pixels within a predetermined range of the special light image obtained last time (step S605) according to Formula (2) below:
YN=0.30×G+0.70×B (2)
Formula (2) is a formula applied when red (R) pixels are output as green (G) pixels and blue (B) pixels are output as blue (B) pixels.
Then, the light emission quantity adjustment unit 171 determines whether the average luminance YN is within an appropriate range (step S606). If the average luminance YN is not within the appropriate range (step S606, No), the light emission quantity adjustment unit 171 makes light emission quantity adjustments of the special light sources 11 (step S607) so that the image brightness is within the appropriate range before proceeding to step S608. If the image brightness is within the appropriate range (step S606, Yes), the light emission quantity adjustment unit 171 directly proceeds to step S608. Then, in step S608, the light emission quantity adjustment unit 171 determines whether the light emission quantity adjustment processing has terminated and only if the processing has not terminated (step S608, No), the light emission quantity adjustment unit 171 repeats the above processing and if the processing has terminated (step S608, Yes), the light emission quantity adjustment unit 171 terminates the present processing. The appropriate range in step S603 and that in step S606 may be the same or different.
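A Python sketch of the seventh embodiment's adjustment, using Formulas (1) and (2) above, might look as follows; the appropriate range and the adjustment step are assumed values, not values specified in the embodiment.

    # Average luminance per image type (Formulas (1) and (2) in the text);
    # the appropriate range and adjustment step are assumed values.
    APPROPRIATE_RANGE = (100, 160)
    STEP = 0.1

    def average_luminance_ordinary(pixels):
        """Formula (1): YW = 0.30*R + 0.59*G + 0.11*B, averaged over the region."""
        return sum(0.30 * r + 0.59 * g + 0.11 * b for r, g, b in pixels) / len(pixels)

    def average_luminance_special(pixels):
        """Formula (2): YN = 0.30*G + 0.70*B, averaged over the region."""
        return sum(0.30 * g + 0.70 * b for _, g, b in pixels) / len(pixels)

    def adjust(emission, pixels, is_ordinary):
        y = (average_luminance_ordinary(pixels) if is_ordinary
             else average_luminance_special(pixels))
        low, high = APPROPRIATE_RANGE
        if y < low:
            emission *= 1.0 + STEP     # brighten the next frame
        elif y > high:
            emission *= 1.0 - STEP     # darken the next frame
        return emission

    region = [(130, 110, 90)] * 64
    print(adjust(10.0, region, is_ordinary=True))   # YW ~ 113.8: within range
    print(adjust(10.0, region, is_ordinary=False))  # YN ~ 96: emission raised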
In the seventh embodiment, average luminance is individually calculated using average luminance calculation formulas that are different for each of ordinary light images and special light images and light emission quantity adjustments are made based on the average luminance and thus, each image can be obtained as an image having individually appropriate brightness.
Next, as an image processing system according to the eighth embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to seventh embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
In the eighth embodiment, brightness adjustments of each piece of image data are made by performing amplification processing of pixel data corresponding to each of received ordinary light images and special light images.
The brightness adjustment processing procedure will be described with reference to the flow chart shown in
Then, the brightness adjustment unit 201 determines whether the calculated average luminance is within an appropriate range (step S703). If the average luminance is not within the appropriate range (step S703, No), the brightness adjustment unit 201 changes the amplification factor of image data by the amplification unit 206 so that the brightness of the special light image is within the appropriate range and outputs a special light image composed of image data having appropriate brightness to the signal processing unit 207 (step S704) before proceeding to step S705.
On the other hand, if the average luminance is within the appropriate range (step S703, Yes), the brightness adjustment unit 201 directly outputs each piece of pixel data to the signal processing unit 207 without amplifying the pixel data before proceeding to step S705. If, in step S701, the image is an ordinary light image (step S701, Yes), the brightness adjustment unit 201 directly proceeds to step S705. Then, in step S705, the brightness adjustment unit 201 determines whether the brightness adjustment processing has terminated and only if the processing has not terminated (step S705, No), the brightness adjustment unit 201 repeats the above processing and if the processing has terminated (step S705, Yes), the brightness adjustment unit 201 terminates the present processing.
In the eighth embodiment, amplification processing of pixel data is performed corresponding to the type of an acquired image, that is, corresponding to each of ordinary light images and special light images, and thus an image having appropriate brightness can be obtained.
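The following Python sketch illustrates one possible form of this amplification-based brightness adjustment; the luminance metric, appropriate range, target value, and gain rule are assumptions introduced for illustration.

    APPROPRIATE_RANGE = (100, 160)   # assumed allowable luminance range
    TARGET = 130                     # assumed target used to pick a gain

    def average_luminance_special(pixels):
        """Brightness of a special light image from its G and B pixels (assumed metric)."""
        return sum(0.30 * g + 0.70 * b for _, g, b in pixels) / len(pixels)

    def brightness_adjust(pixels):
        """If the special light image is outside the appropriate range, amplify
        (or attenuate) every pixel by a gain that brings it toward the target."""
        y = average_luminance_special(pixels)
        low, high = APPROPRIATE_RANGE
        if low <= y <= high or y == 0:
            return pixels                         # pass through unamplified
        gain = TARGET / y
        return [tuple(min(255, int(round(c * gain))) for c in px) for px in pixels]

    dark = [(0, 60, 40)] * 16
    print(brightness_adjust(dark)[0])             # pixels amplified toward the target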
The brightness adjustment unit 201 may further amplify pixel data by the signal processing unit 207 based on a calculation result of average luminance. The amplification unit 206 may perform, in addition to amplification, attenuation processing.
Further, in the eighth embodiment described above, the processing is described as processing to be performed inside the receiving device 3, but the present invention is not limited to this and amplification processing similar to that performed inside the receiving device 3 may be performed by the image display device 4. Naturally, amplification processing may be performed by the capsule endoscope 2.
The second to eighth embodiments described above have each been described by taking the capsule endoscope 2 as an example. After being inserted into a subject, the capsule endoscope 2 needs to exercise operation control of the observation mode on its own and thus is suitable for the application of the present invention.
Next, as an image processing system according to the ninth embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to eighth embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
The capsule endoscope 2 according to the present embodiment determines the light emission time of the ordinary light sources 10 or the special light sources 11 for the next imaging based on the brightness of the image data obtained by the last imaging. The image data obtained by the imaging is transmitted to the receiving device 3 outside the subject 1 through a radio signal by the transmitting circuit 55 via the transmitting antenna 56. The receiving device 3 records the image data received from the capsule endoscope 2 in, for example, the portable recording medium 5. At this point, the receiving device 3 operates so as not to store images whose brightness level is extremely low or high. Accordingly, images that are not effective for image interpretation of the inside of the subject 1 (images not contained within an allowable range), such as underexposed images that are dark and blurred as a whole and overexposed images that are whitened as a whole, can be discarded.
Subsequently, a capsule endoscope system according to the ninth embodiment will be described in detail together with drawings. The capsule endoscope system according to the ninth embodiment is similar to that of one of the above embodiments. In the present embodiment, however, as shown in
Next, the operation of a capsule endoscope system according to the present embodiment will be described in detail using drawings.
As shown in
Next, the capsule endoscope 2 switches the imaging mode to the ordinary light observation mode or the special light observation mode (step S905). If, for example, the current imaging mode is the ordinary light observation mode, the observation mode is switched to the special light observation mode and if the current imaging mode is the special light observation mode, the observation mode is switched to the ordinary light observation mode. Subsequently, the capsule endoscope 2 determines whether the observation mode after the switching, that is, the observation mode for the next photographing is the special light observation mode (step S906).
If, as a result of the determination in step S906, the current observation mode is the ordinary light observation mode (step S906, No), the capsule endoscope 2 detects brightness information of the image from all components of R components, G components, and B components in the ordinary light image acquired last time (step S907). Subsequently, the capsule endoscope 2 calculates the light emission time of the ordinary light sources 10 from the detected brightness information (step S908) and causes the ordinary light sources 10 to emit light for the calculated light emission time (step S909) before returning to step S903. If the light emission time calculated in step S908 is larger than a maximum value of the light emission time preset as an upper limit, the capsule endoscope 2 causes the ordinary light sources 10 to emit light, for example, for the maximum value of the light emission time.
On the other hand, if, as a result of the determination in step S906, the current observation mode is the special light observation mode (step S906, Yes), the capsule endoscope 2 detects brightness information of the image from the G components and B components, that is, the color components forming a special light image, in the ordinary light image or special light image acquired immediately before (step S910), calculates the light emission time of the special light sources 11 from the detected brightness information (step S911), and causes the special light sources 11 to emit light for the calculated light emission time (step S912) before returning to step S903. If the light emission time calculated in step S911 is larger than the maximum value of the light emission time preset as an upper limit, the capsule endoscope 2 causes the special light sources 11 to emit light, for example, for the maximum value of the light emission time.
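A Python sketch of this light emission time calculation is shown below; the target brightness, the maximum emission time, and the proportional update rule are assumptions, since the embodiment specifies only that the time is calculated from the brightness information and clipped at a preset maximum.

    MAX_EMISSION_MS = 50.0     # assumed preset upper limit of the light emission time
    TARGET_LUMINANCE = 128     # assumed target brightness

    def brightness(pixels, components):
        """Brightness information from the listed color components of the last image."""
        index = {"r": 0, "g": 1, "b": 2}
        return sum(px[index[c]] for px in pixels for c in components) / (
            len(pixels) * len(components))

    def next_emission_time(last_emission_ms, pixels, special_mode):
        """Scale the last emission time toward the target brightness and clip it
        at the preset maximum (corresponding to steps S907-S909 / S910-S912)."""
        comps = ("g", "b") if special_mode else ("r", "g", "b")
        y = max(brightness(pixels, comps), 1.0)           # avoid division by zero
        return min(last_emission_ms * TARGET_LUMINANCE / y, MAX_EMISSION_MS)

    last_image = [(90, 70, 50)] * 32
    print(next_emission_time(20.0, last_image, special_mode=False))  # ordinary light
    print(next_emission_time(20.0, last_image, special_mode=True))   # special light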
As shown in
Next, the receiving device 3 derives brightness information of an image from a pixel value of pixels contained in a predetermined region of the target image (step S925) and determines whether the brightness of the image identified from the brightness information is included in the allowable range identified in step S923 or S924 (step S926). If, as a result of the determination in step S926, the brightness of the target image is included in the allowable range (step S926, Yes), the receiving device 3 performs image processing such as synchronization processing and compression processing on the target image (step S927) and stores image data after the image processing in the recording medium 5 (step S928). On the other hand, if the brightness of the target image is not included in the allowable range (step S926, No), the receiving device 3 discards the target image data (step S929).
Then, the receiving device 3 determines whether any termination instruction of the operation has been input from, for example, a user (step S930) and if the termination instruction has been input (step S930, Yes), the receiving device 3 terminates the operation shown in
In the present embodiment, as described above, appropriate control not to store image data that does not have appropriate brightness can be performed by the receiving device 3 in a stable fashion based on the brightness of the image. As a result, various kinds of processing on image data that is not effective for image interpretation, and the storage region that such image data would otherwise occupy, can be eliminated, so that processing can be streamlined and the storage region can be used more effectively.
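The storage decision of steps S923 to S929 could be sketched in Python as follows; the allowable ranges, the brightness metric, and the storage stand-in are assumptions made for illustration.

    # Allowable brightness ranges per image type (assumed values).
    ALLOWABLE = {"ordinary": (40, 220), "special": (30, 200)}

    def derive_brightness(pixels):
        """Brightness information from pixels of a predetermined region (mean value)."""
        flat = [c for px in pixels for c in px]
        return sum(flat) / len(flat)

    def handle_received_image(image_type, pixels, storage):
        """Store the image only when its brightness is inside the allowable range
        for its type; otherwise discard it."""
        low, high = ALLOWABLE[image_type]
        y = derive_brightness(pixels)
        if low <= y <= high:
            storage.append((image_type, pixels))   # stand-in for the recording medium 5
            return "stored"
        return "discarded"

    medium = []
    print(handle_received_image("ordinary", [(10, 8, 6)] * 8, medium))   # too dark
    print(handle_received_image("special", [(0, 120, 90)] * 8, medium))  # stored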
Next, as an image processing system according to the tenth embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to ninth embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
The capsule endoscope 2 according to the present embodiment determines the light emission time of the ordinary light sources 10 or the special light sources 11 for the next imaging based on brightness of image data obtained by the last imaging. The image data obtained by the imaging is transmitted to the receiving device 3 outside the subject 1 through a radio signal by the transmitting circuit 55 via the transmitting antenna 56 and stored in predetermined storage (for example, the recording medium 5). The stored image data is loaded into the image display device 4 via a communication interface (such as USB and LAN) connecting a cradle and the image display device 4 when, for example, the receiving device 3 is connected to the cradle (not shown). The image display device 4 performs image processing functions such as the motion detection function that detects image motion (or movement of the capsule endoscope 2 predicted based on image changes) and the red detection function that determines whether there is any red portion in an image or detects a region of a red portion in an image on the input image data.
The motion detection function calculates a scalar quantity (an absolute value) of a motion vector between consecutive images and, if the quantity is larger than a preset threshold, selects the target image as a display target, that is, an image for interpretation. Images excluded from display targets are stored, for example, in a predetermined storage region while maintaining the chronological information of the consecutive images.
Cases in which a large scalar quantity is calculated include, for example, a case in which the imaging window of the capsule endoscope 2 is turned toward empty space from a state in which the imaging window is close to in-vivo tissues (hereinafter referred to as the first case) and a case in which the imaging window comes into contact with in-vivo tissues from a state in which the imaging window faces empty space (hereinafter referred to as the second case). In a state in which the imaging window is close to in-vivo tissues, an object (in-vivo tissues) can be clearly imaged with a small illuminating light quantity. Thus, in the first case, one or several images captured immediately after the imaging window is turned toward empty space will be underexposed dark images. While such dark images are not appropriate for image interpretation, their scalar quantity becomes a large value because they have a large motion vector with respect to the images captured immediately before, when the imaging window was close to in-vivo tissues. As a result, such dark images will be selected as display target images. On the other hand, the distance between the imaging unit and an object is long in a state in which the imaging window faces empty space, and thus a bright image cannot be obtained unless the object is illuminated with a large illuminating light quantity. Thus, in the second case, one or several images captured immediately after the imaging window comes close to in-vivo tissues will be overexposed, too-bright images. While such too-bright images are not appropriate for image interpretation, their scalar quantity becomes a large value because they have a large motion vector with respect to the images captured immediately before, when the imaging window faced empty space. As a result, such too-bright images will be selected as display target images.
Thus, in the present embodiment, whether to select a target image as a display target is determined based on brightness information of each image in addition to the scalar quantity of the motion vector between consecutive images. Accordingly, dark images or too-bright images that are not appropriate for image interpretation can be prevented from being selected as display targets.
For the red detection function, malfunctioning of its algorithm may be triggered by an image whose brightness is lacking or excessive. This is because the white balance of an image changes depending on the brightness level, such as the R component (red component) becoming dominant over the other components (G and B components) in a dark image. That is, if the white balance of an image is disturbed, the red detection function, which detects reddish images (images containing many red regions or images strong in the R component) by an algorithm based on the relative value of each color component, may evaluate the image differently from the colors in real space. As a result, even if red is strong in real space, an image capturing the scene may not be evaluated as an image strong in red, or even if red is not strong in real space, an image capturing the scene may be evaluated as an image strong in red.
Thus, the present embodiment is configured to perform red detection only for images having a certain level of uniform brightness. Accordingly, execution of red detection of an image whose white balance is significantly disturbed can be avoided so that the operation of the red detection function can be stabilized.
The operation of a capsule endoscope system according to the present embodiment will be described below in detail using drawings.
First, as shown in
Next, the image display device 4 allows the user to interpret intra-subject images by performing image display processing to display the images processed by using the image processing function (step S1003). Then, the image display device 4 determines whether any termination instruction of the operation has been input from, for example, a user (step S1004) and, if the termination instruction has been input (step S1004, Yes), the image display device 4 terminates the operation. On the other hand, if no termination instruction has been input (step S1004, No), the image display device 4 returns to step S1001 to perform the operation that follows. However, the step to which the image display device 4 returns is not limited to step S1001 and may be step S1002 or S1003.
Next, the motion detection function will be described as an example of the image processing function executed in step S1002 in
Next, the image display device 4 determines whether the brightness of the target image is within a preset allowable range based on the detected image brightness information (step S1013) and if the brightness of the target image is not within the allowable range (step S1013, No), sets the target image data as image data excluded from display targets (step S1017) before proceeding to step S1018.
On the other hand, if the brightness of the target image is within the allowable range (step S1013, Yes), the image display device 4 calculates a motion vector between the target image data and the image data chronologically immediately before (step S1014). Subsequently, the image display device 4 determines whether the scalar quantity (absolute value) of the calculated motion vector is equal to a preset threshold or more (step S1015) and if the scalar quantity is not equal to the preset threshold or more (step S1015, No), sets the target image data as image data excluded from display targets (step S1017) before proceeding to step S1018.
On the other hand, if the scalar quantity (absolute value) of the calculated motion vector is equal to the threshold or more (step S1015, Yes), the image display device 4 selects the target image as a display target image (step S1016). The selection of a display target image can be realized by, for example, attaching a flag indicating a display target to image data or recording an image to be displayed in a recording region such as another folder.
Then, the image display device 4 determines whether the above processing has been performed on all input image data (step S1018) and if the above processing has been performed on all input image data (step S1018, Yes), returns to the operation shown in
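A Python sketch of this brightness-gated motion selection is given below; the motion measure (a mean absolute frame difference standing in for a true motion vector), the threshold, and the allowable range are assumptions rather than the embodiment's actual values.

    THRESHOLD = 10.0                  # assumed motion-vector scalar threshold
    ALLOWABLE = (40, 220)             # assumed allowable brightness range

    def mean_luminance(frame):
        return sum(frame) / len(frame)

    def motion_scalar(prev, curr):
        """Very rough stand-in for a motion vector magnitude between frames:
        mean absolute difference of corresponding pixels."""
        return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

    def select_display_targets(frames):
        """Keep a frame only if its brightness is allowable and its motion scalar
        against the previous frame reaches the threshold."""
        selected = []
        for i in range(1, len(frames)):
            curr = frames[i]
            if not (ALLOWABLE[0] <= mean_luminance(curr) <= ALLOWABLE[1]):
                continue                                  # excluded from display
            if motion_scalar(frames[i - 1], curr) >= THRESHOLD:
                selected.append(i)
        return selected

    frames = [[100] * 16, [100] * 16, [180] * 16, [10] * 16]
    print(select_display_targets(frames))   # [2]; the dark last frame is dropped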
Next, the red detection function will be described as an example of the image processing function executed in step S1002 in
Next, the image display device 4 determines whether the brightness of the target image is within a preset allowable range based on the detected image brightness information (step S1023) and, if the brightness of the target image is not within the allowable range (step S1023, No), sets the target image data as image data excluded from red detection targets (step S1027) before proceeding to step S1028.
On the other hand, if the brightness of the target image is within the allowable range (step S1023, Yes), the image display device 4 identifies the threshold of a color evaluation function in accordance with brightness information managed in a memory (not shown) or the like in advance (step S1024) and performs red detection of the target image using the threshold (step S1025). The image display device 4 stores a detected result in the same time sequence as that of the image data (step S1026).
Then, the image display device 4 determines whether the above processing has been performed on all input image data (step S1028) and if there is image data that is not yet processed (step S1028, No), the image display device 4 returns to step S1021 and performs the operation that follows. On the other hand, if the processing has been performed on all image data (step S1028, Yes), the image display device 4 generates red bar images from the red detection results stored in the time sequence in step S1026 (step S1029) and then returns to the operation shown in
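The red detection flow of steps S1021 to S1028 could be sketched as follows in Python; the allowable range, the brightness-to-threshold table, and the color evaluation function (R dominance over G and B) are hypothetical stand-ins for the values and functions managed by the image display device 4.

    ALLOWABLE = (40, 220)    # assumed allowable brightness range for red detection
    # Hypothetical table mapping a brightness band to the threshold of the
    # color evaluation function (corresponding to step S1024).
    THRESHOLDS = [((40, 120), 1.4), ((120, 220), 1.2)]

    def mean_rgb(pixels):
        n = len(pixels)
        return tuple(sum(px[i] for px in pixels) / n for i in range(3))

    def detect_red(pixels):
        """Return True/False red detection, or None when the image is excluded
        because its brightness is outside the allowable range."""
        r, g, b = mean_rgb(pixels)
        y = (r + g + b) / 3.0
        if not (ALLOWABLE[0] <= y <= ALLOWABLE[1]):
            return None
        threshold = next(t for (lo, hi), t in THRESHOLDS if lo <= y <= hi)
        # Simple color evaluation: how dominant the R component is over G and B.
        return r / max((g + b) / 2.0, 1.0) >= threshold

    print(detect_red([(180, 70, 60)] * 8))   # reddish, allowable brightness -> True
    print(detect_red([(20, 10, 10)] * 8))    # too dark -> None (excluded)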
According to the present embodiment, as described above, appropriate control in accordance with image brightness is enabled by the image processing function being operated based on image brightness so that image processing can be performed on image data in a stable fashion.
In the tenth embodiment, the image display device 4 is configured to control the operation based on whether the value of brightness information is within a range (allowable range) of the preset upper limit and lower limit, but the present invention is not limited to this and various modifications can be made. For example, the amount of change of the value of image brightness information between consecutive images may be calculated to configure the image display device 4 to operate in accordance with the amount of change. In this case, for example, an image whose amount of change from the previous image is larger than a preset threshold may be selected as a display target image or a red detection target image.
Also in the tenth embodiment, the image display device 4 is configured to perform red detection by selecting images whose value of brightness information is included in the allowable range as targets for red detection, but the present invention is not limited to this and various modifications can be made. For example, the image display device 4 may be configured so that the red detection function changes the threshold of the color evaluation function used for red detection in accordance with the value of brightness information. Accordingly, the operating precision of the red detection function can be further improved. Correspondences between the threshold of the color evaluation function and brightness information may be derived in advance and managed in a table in a memory.
Next, as an image processing system according to the eleventh embodiment of the present invention, a capsule endoscope system using a capsule endoscope as an imaging device is taken as an example. The capsule endoscope system according to the present embodiment is, like the capsule endoscope system according to any of the second to tenth embodiments, an embodiment of the image processing system according to the first embodiment described above and the concept thereof is contained in the concept of the image processing system.
In the capsule endoscope system according to the present embodiment, for example, the capsule endoscope 2 acquires ordinary light images. An image obtained by the capsule endoscope 2 is input into the image display device 4 via the receiving device 3. The image display device 4 generates a special light image by using G components and B components from the input ordinary light image. The image display device 4 also performs predetermined image processing on the ordinary light image and special light image and presents a result of the processing and the images to the user.
If image data captured in ordinary light observation mode by using the ordinary light sources 10 contains many R components, G and B components may be insufficient. In such a case, while brightness of an ordinary light image is sufficient, brightness of a special light image generated from the ordinary light image is at a low level. Thus, in the present embodiment, the illuminating unit 51 is controlled so that G and B components in an image obtained by the next imaging are sufficient for generation of a special light image based on brightness of the image obtained by the last imaging. Accordingly, an ordinary light image and a special light image can be obtained from an image obtained in one imaging.
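A Python sketch of this single-exposure approach, with assumed allowable ranges, flag names, and brightness metrics, is given below; it only illustrates generating a special light image from the G and B components of an ordinary light image and deciding which flags of the eleventh embodiment would be attached.

    # Assumed allowable brightness ranges for attaching the flags.
    ORDINARY_RANGE = (40, 220)
    SPECIAL_RANGE = (30, 200)

    def ordinary_brightness(pixels):
        """Brightness of the ordinary light image from its R, G and B components."""
        return sum(r + g + b for r, g, b in pixels) / (3 * len(pixels))

    def special_brightness(pixels):
        """Brightness of the special light image generated from G and B components."""
        return sum(g + b for _, g, b in pixels) / (2 * len(pixels))

    def make_special_image(pixels):
        """Build a spectral (special light) image by keeping only G and B components."""
        return [(0, g, b) for _, g, b in pixels]

    def flags_for(pixels):
        flags = {}
        if ORDINARY_RANGE[0] <= ordinary_brightness(pixels) <= ORDINARY_RANGE[1]:
            flags["ordinary_light_image"] = True
        if SPECIAL_RANGE[0] <= special_brightness(pixels) <= SPECIAL_RANGE[1]:
            flags["special_light_image_generation"] = True
        return flags

    frame = [(200, 30, 20)] * 16          # strong R, weak G/B
    print(flags_for(frame))               # ordinary flag only; G/B too weak
    print(make_special_image(frame)[0])   # (0, 30, 20)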
A capsule endoscope system according to the present embodiment will be described below in detail using drawings. The capsule endoscope system according to the present embodiment is like that of one of the above embodiments. However, as shown in
Next, the operation of a capsule endoscope system according to the present embodiment will be described in detail using drawings.
As shown in
Next, the capsule endoscope 2 determines whether the value of the ordinary light image brightness information detected in step S1103 is within a preset allowable range (step S1105) and if the value is within the allowable range (step S1105, Yes), attaches an ordinary light image flag indicating that the image data is an ordinary light image effective in reading X-rays to the image data (step S1106). On the other hand, if the value of the ordinary light image brightness information is not within the allowable range (step S1105, No), the capsule endoscope 2 directly proceeds to step S1107.
Next, the capsule endoscope 2 determines whether the value of the special light image brightness information detected in step S1104 is within a preset allowable range (step S1107) and, if the value is within the allowable range (step S1107, Yes), attaches a special light image generation flag indicating that the image data is image data from which a special light image can be generated to the image data (step S1108). On the other hand, if the value of the special light image brightness information is not within the allowable range (step S1107, No), the capsule endoscope 2 directly proceeds to step S1109. Instead of the ordinary light image flag and special light image generation flag described above, the calculated ordinary light image brightness information and/or special light image brightness information may be attached to the image data.
Next, the capsule endoscope 2 transmits the image data to the receiving device 3 (step S1109). Subsequently, the capsule endoscope 2 calculates the light emission time of the ordinary light sources 10 for the next imaging from the special light image brightness information (step S1110) and emits light from the ordinary light sources 10 for the calculated light emission time (step S1111). Then, the capsule endoscope 2 returns to step S1102 and hereafter performs the same operation. If the light emission time calculated in step S1110 is larger than a maximum value of the light emission time preset as an upper limit, the capsule endoscope 2 causes the ordinary light sources 10 to emit light, for example, for the maximum value of the light emission time.
As shown in
On the other hand, if the special light image generation flag is attached to the image data (step S1122, Yes), the receiving device 3 performs predetermined image processing such as synchronization processing and compression processing on the image data (step S1123) and stores the image data after the image processing in the recording medium 5 (step S1124).
Then, the receiving device 3 determines whether any termination instruction of the operation has been input from, for example, a user (step S1126) and if the termination instruction has been input (step S1126, Yes), the receiving device 3 terminates the operation shown in
As shown in
In step S1135, the image display device 4 stores the image data. Thus, if a special light image is generated in step S1134, in addition to an ordinary light image, the image display device 4 stores the ordinary light image and special light image in step S1135.
Next, the image display device 4 determines whether the above processing has been performed on all input image data (step S1136) and if there is image data that is not yet processed (step S1136, No), the image display device 4 returns to step S1132 and performs the operation that follows. On the other hand, if the processing has been performed on all image data (step S1136, Yes), the image display device 4 determines whether any termination instruction of the operation has been input from, for example, a user (step S1137) and if the termination instruction has been input (step S1137, Yes), the image display device 4 terminates the operation. On the other hand, if no termination instruction has been input (step S1137, No), the image display device 4 returns to step S1131 to perform the operation that follows.
In the present embodiment, as described above, not only the capsule endoscope 2, but also the receiving device 3 and the image display device 4 can operate on the basis of information based on brightness (such as a flag and brightness information) attached to image data by the capsule endoscope 2 and thus, image data itself can be generated and processing on the generated image data can be performed in a stable fashion.
It is evident from the above that the embodiments described above are only examples to carry out the present invention and the present invention is not limited to these examples, various modifications in accordance with specifications are included in the scope of the present invention, and other various embodiments can be implemented further within the scope of the present invention.
For example, the brightness of images obtained by the imaging unit 52 is adjusted by controlling the exposure time of the imaging unit 52 in accordance with image brightness in the second to eighth embodiments described above, and the brightness of images obtained by the imaging unit 52 is adjusted by controlling the illumination time of the illuminating unit 51 in accordance with image brightness in the ninth to eleventh embodiments described above. However, the present invention is not limited to such examples, and it is easy for those skilled in the art to partially recombine configurations among the above embodiments, such as adjusting the brightness of images obtained by the imaging unit 52 by controlling the illumination time of the illuminating unit 51 in the second to eighth embodiments, or by controlling the exposure time of the imaging unit 52 in the ninth to eleventh embodiments; a detailed description thereof is therefore omitted here.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Foreign application priority data: Japanese Patent Application No. 2008-293834, filed November 2008, Japan (national).
This application is a continuation of PCT International Application Ser. No. PCT/JP2009/069435, filed on Nov. 16, 2009, which designates the United States and is incorporated herein by reference.
Related application data: parent application PCT/JP2009/069435, filed November 2009 (US); child application Ser. No. 12898986 (US).