The present invention relates to a technology for outputting a combined image based on an image captured with visible light and an image captured with infrared light.
In the related art, an imaging apparatus is known that performs imaging with both visible light and infrared light (non-visible light) by providing, in one optical system, a visible-light sensor that receives visible light and an infrared sensor that receives infrared light (Japanese Unexamined Patent Publication No. 2010-103740). In a low-illumination environment or the like, a color image with little noise can be acquired by combining the image data output by the visible-light sensor (a visible image) with the image data output by the infrared sensor (an infrared image).
Such a combined image includes color, so its visibility is higher than that of the infrared image; however, its color reproduction differs from that of the visible image. Accordingly, as the illumination decreases, the hue of the image delivered from a camera may change when the delivered image is switched from the visible image to the combined image. It is difficult, however, for a user to distinguish the visible image from the combined image based on image content alone. Thus, when a change in hue occurs, it is difficult to ascertain whether the change is caused by the switching between the visible image and the combined image or by a change in the surrounding environment of the imaged region.
An object of the invention is to provide a technology that makes it easy to determine whether a change in hue is caused by switching between a visible image and a combined image.
An imaging apparatus according to an aspect of the invention is an imaging apparatus capable of capturing a visible image and an infrared image. The imaging apparatus includes: a combination unit configured to combine the visible image and the infrared image to generate a combined image; and a superimposition unit configured to superimpose, on the combined image, combination information indicating a combination ratio of the visible image to the infrared image.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinafter, modes for carrying out the invention will be described in detail. The embodiments to be described below are examples given to realize the invention and should be appropriately modified or changed in accordance with configurations of apparatuses or various conditions to which the invention is applied. The invention is not limited to the following embodiments.
Hereinafter, overviews of a configuration and a function of an imaging apparatus 101 according to a first embodiment will be described with reference to the drawings.
The network 102 connects the imaging apparatus 101 to the client apparatus 103 and includes, for example, a plurality of routers, switches, and cables that meet a communication standard such as Ethernet (trademark). Any communication standard, scale, and configuration may be used as long as the network 102 enables communication between the imaging apparatus 101 and the client apparatus 103. The network 102 may be configured with, for example, the Internet, a wired local area network (LAN), a wireless LAN, a wide area network (WAN), or the like.
The client apparatus 103 is, for example, an information processing apparatus such as a personal computer (PC), a server apparatus, or a tablet apparatus. The client apparatus 103 outputs various commands related to control of the imaging apparatus 101 to the imaging apparatus 101. The imaging apparatus 101 outputs images or responses to such commands to the client apparatus 103.
Next, the details of the imaging apparatus 101 will be described. The imaging apparatus 101 is, for example, a network camera. It can capture a visible image and an infrared image and is communicably connected to the client apparatus 103 via the network 102. The imaging apparatus 101 includes an imaging unit 116, a first image processing unit 108, a second image processing unit 109, a combination unit 110, a change unit 111, an infrared illumination unit 112, an illumination control unit 113, a superimposition unit 114, and an NW processing unit 115. The imaging unit 116 can include a lens 104, a wavelength separation prism 105, a first image sensor 106, and a second image sensor 107. The lens 104 is an optical lens that forms an image from light incident from a subject. The wavelength separation prism 105 separates the light passing through the lens 104 by wavelength: more specifically, into a visible-light component with a wavelength of about 400 nm to 700 nm and an infrared component with a wavelength of about 700 nm or more.
The first image sensor 106 converts the visible light passing through the wavelength separation prism 105 into an electric signal, and the second image sensor 107 converts the infrared light passing through the wavelength separation prism 105 into an electric signal. The first image sensor 106 and the second image sensor 107 are, for example, complementary metal-oxide semiconductor (CMOS) sensors, charge-coupled devices (CCDs), or the like.
The first image processing unit 108 performs a development process on an image signal captured by the first image sensor 106 to generate a visible image, and determines the subject illumination of the visible image from a luminance signal of the visible image. The second image processing unit 109 performs a development process on an image signal captured by the second image sensor 107 to generate an infrared image. When the resolutions of the first image sensor 106 and the second image sensor 107 are different, one of the first image processing unit 108 and the second image processing unit 109 performs a resolution conversion process to equalize the resolutions of the visible image and the infrared image. In the present embodiment, an imaging apparatus that includes, for example, one optical system, two image sensors, and two image processing units will be described. It suffices that the imaging apparatus 101 can simultaneously capture a visible image and an infrared image of the same subject and generate both images; the invention is not limited to this configuration. For example, a single image sensor that outputs a plurality of image signals corresponding to visible light and infrared light may be used, or a single image processing unit may process both the image signal of the visible image and the image signal of the infrared image.
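When the sensor resolutions differ, the resolution conversion process could be sketched as follows. This is a minimal illustration assuming OpenCV; which image is resampled and the interpolation method are illustrative choices, not taken from the source.

```python
import cv2

def equalize_resolution(visible, infrared):
    # Resample the infrared image to the visible image's size when the
    # two sensors differ in resolution (which image to resample and the
    # interpolation method are illustrative assumptions).
    h, w = visible.shape[:2]
    if infrared.shape[:2] != (h, w):
        infrared = cv2.resize(infrared, (w, h), interpolation=cv2.INTER_LINEAR)
    return visible, infrared
```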
The combination unit 110 combines the visible image generated by the first image processing unit 108 and the infrared image generated by the second image processing unit 109 based on, for example, Expression (1) below to generate a combined image.
[Math. 1]
Ys = αYv + βYi
Cbs = αCbv   (1)
Crs = αCrv
Here, Ys, Cbs, and Crs indicate a luminance signal, a blue color difference signal, and a red color difference signal of the combined image, respectively. Yv, Cbv, and Crv indicate a luminance signal, a blue color difference signal, and a red color difference signal of the visible image, respectively. Yi indicates a luminance signal of the infrared image, and α and β indicate coefficients.
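As a concrete illustration of Expression (1), the following minimal NumPy sketch combines the YCbCr planes of the visible image with the infrared luminance plane. The function and array names are illustrative assumptions, not taken from the source.

```python
import numpy as np

def combine_ycbcr(y_v, cb_v, cr_v, y_i, alpha, beta):
    """Expression (1): Ys = alpha*Yv + beta*Yi; chroma is scaled by alpha."""
    y_s = alpha * y_v + beta * y_i  # combined luminance
    cb_s = alpha * cb_v             # color comes only from the visible image,
    cr_s = alpha * cr_v             # so it fades as alpha decreases
    return y_s, cb_s, cr_s
```

Note that because the chroma signals are scaled only by α, the color of the combined image weakens as the visible contribution decreases, which is why its color reproduction differs from that of the visible image.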
The change unit 111 decides the coefficients α and β in Expression (1), for example, in accordance with the luminance signal Yv of the visible image and the luminance signal Yi of the infrared image. The change unit 111 changes the combination ratio of the visible image to the infrared image by changing the coefficients α and β, and outputs the decided combination ratio to the combination unit 110.
The infrared illumination unit 112 radiates infrared light toward a subject. The illumination control unit 113 controls on/off switching of the infrared light, or its intensity, based on the combination ratio or on the combined image generated by the combination unit 110. For example, when the coefficient β of the infrared image is 0, the combined image output from the combination unit 110 consists of the visible image only; the illumination control unit 113 may therefore control the infrared illumination unit 112 such that it is turned off. The superimposition unit 114 generates combination information indicating the combination ratio of the visible image to the infrared image as an on-screen display (OSD) image and superimposes the OSD image on the combined image. The combination information is, for example, characters or a figure, and is superimposed on the combined image with a color or luminance in accordance with the combination ratio. The details of the combination information superimposed on the combined image will be described later. Here, the combination ratio may be the ratio of α to β, or may be decided based on the luminance signals of the visible image and the infrared image, as in the ratio of αYv to (1−α)Yi. The NW processing unit 115 outputs the combined image, responses to commands from the client apparatus 103, and the like to the client apparatus 103 via the network 102.
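For example, the combination ratio described above might be rendered as an OSD character string along the following lines. This is a sketch only; the function name and the textual format are hypothetical, as the source does not specify them.

```python
def osd_ratio_text(alpha, beta, y_v_mean=None, y_i_mean=None):
    # Raw coefficient ratio alpha:beta, or the luminance-weighted
    # variant alpha*Yv : (1 - alpha)*Yi mentioned above.
    if y_v_mean is None or y_i_mean is None:
        return f"VIS:IR = {alpha:.2f}:{beta:.2f}"
    return f"VIS:IR = {alpha * y_v_mean:.1f}:{(1 - alpha) * y_i_mean:.1f}"
```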
The client apparatus 103 includes a CPU 220, a ROM 221, a RAM 222, an NW processing unit 223, an input unit 224, a display unit 225, and a storage unit 226. The CPU 220 reads a program stored in the ROM 221 and performs various processes. The ROM 221 stores a boot program and the like. The RAM 222 is used as a temporary storage region such as a main memory or a work area of the CPU 220. The NW processing unit 223 outputs various commands related to control of the imaging apparatus 101 to the imaging apparatus 101 via the network 102 and receives the combined image output from the imaging apparatus 101.
The input unit 224 is, for example, a keyboard, and is used to input information to the client apparatus 103. The display unit 225 is a display medium such as a display, and displays the combined image generated by the imaging apparatus 101 together with the combination information, that is, the combination ratio of the visible image to the infrared image included in the combined image. The input unit 224 and the display unit 225 may be devices independent of the client apparatus 103 or may be included in the client apparatus 103. The storage unit 226 is, for example, a storage medium such as a hard disk or an SD card, and stores the combined image on which the combination information output from the imaging apparatus 101 is superimposed.
Hereinafter, a flow of generation of the combined image and superimposition of the combination information, which is an OSD image, will be described with reference to the drawings.
When the subject illumination is equal to or greater than t1 in S202 (YES), the illumination control unit 113 turns off the infrared illumination unit 112 in S203. Subsequently, in S204, the coefficient β of the infrared image is set to 0 in the combination unit 110, and the generated combined image is output to the superimposition unit 114. At this time, since the coefficient β of the infrared image is 0, only the visible image is consequently selected in the combination unit 110 and output to the superimposition unit 114. In the present embodiment, however, any image output from the combination unit 110, including such an image, is referred to as a combined image.
When the subject illumination in the visible image is less than t1 in S202 (NO), the illumination control unit 113 turns on the infrared illumination unit 112 in S205. Subsequently, in S206, the first image processing unit 108 determines whether the subject illumination in the visible image is equal to or greater than t2 (where t1>t2) and outputs a determination result to the combination unit 110. A method of determining the subject illumination is the same as that in S202. When the subject illumination in the visible image is equal to or greater than t2 in S206 (YES), the combination unit 110 combines the visible image and the infrared image in S207. Subsequently, in S208, the generated combined image is output to the superimposition unit 114. When the subject illumination in the visible image is less than t2 in S206 (NO), the combination unit 110 sets the coefficient α of the visible image to 0 and outputs the generated combined image to the superimposition unit 114 in S209. At this time, since the coefficient α of the visible image is 0, only the infrared image is consequently selected in the combination unit 110 and is output to the superimposition unit 114. Finally, the combination information indicating the combination ratio of the visible image to the infrared image in the combination unit 110 is superimposed on the image input to the superimposition unit 114.
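The threshold logic of S202 to S209 can be summarized in the following sketch. The blend coefficients for the intermediate case are illustrative placeholders, since the actual values are decided by the change unit 111.

```python
def decide_combination(illuminance, t1, t2):
    """Sketch of the S202-S209 flow (t1 > t2)."""
    assert t1 > t2
    if illuminance >= t1:        # S202: bright scene
        ir_on = False            # S203: infrared illumination off
        alpha, beta = 1.0, 0.0   # S204: visible image only
    elif illuminance >= t2:      # S206: intermediate illumination
        ir_on = True             # S205: infrared illumination on
        alpha, beta = 0.5, 0.5   # S207/S208: blend (placeholder values)
    else:
        ir_on = True
        alpha, beta = 0.0, 1.0   # S209: infrared image only
    return ir_on, alpha, beta
```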
Hereinafter, the details of the combination information will be described.
By superimposing the combination ratio as characters serving as the combination information, it is possible, when the hue of the image displayed on the client apparatus 103 changes, to easily determine whether the change is caused by the combination or by another reason. In the case of the combined image, the combination ratio itself can also be checked. Since the infrared image includes no color, it is easy to determine that an image is the infrared image simply by checking it on the client apparatus 103. Accordingly, for the infrared image, the combination ratio need not be superimposed on the image.
By superimposing a figure with a luminance in accordance with the combination ratio in this way, it is likewise possible, when the hue of the image displayed on the client apparatus 103 changes, to easily determine whether the change is caused by the combination or by another reason. In the illustrated example, the luminance of the superimposed figure reflects the combination ratio.
Next, a second embodiment will be described. Details not mentioned in the second embodiment are the same as those of the above-described embodiment. Hereinafter, overviews of a configuration and a function of an imaging apparatus 501 according to the second embodiment will be described with reference to the drawings.
A first image processing unit 502 calculates an average value of luminance signals of a visible image. A second image processing unit 503 calculates an average value of luminance signals of an infrared image. The details of a method of calculating an average value of luminance signals of each image will be described later. A first superimposition unit 504 superimposes first superimposition information such as characters or a figure on the visible image. A second superimposition unit 505 superimposes second superimposition information such as characters or a figure on the infrared image. The details of the first superimposition information and the second superimposition information will be described later. A combination unit 506 combines the visible image on which the first superimposition information is superimposed and the infrared image on which the second superimposition information is superimposed based on Expression (1) of the first embodiment to generate a combined image.
Hereinafter, details of the first superimposition information and the second superimposition information will be described.
When the combination unit 506 combines the visible image 601a on which the first superimposition information 601b is superimposed with the infrared image 603a on which the second superimposition information 603b is superimposed, a combined image 602a is generated. By combining the first superimposition information 601b and the second superimposition information 603b, characters whose luminance is in accordance with the combination ratio are superimposed as combination information 602b on the combined image 602a. When the combination ratio of the infrared image is 0 (that is, the coefficient β = 0), only the first superimposition information 601b is consequently superimposed as the combination information on the visible image 601a and output to the client apparatus 103. When the combination ratio of the visible image is 0 (that is, the coefficient α = 0), only the second superimposition information 603b is consequently superimposed as the combination information on the infrared image 603a and output to the client apparatus 103.
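The effect can be sketched as follows: because the markers are stamped into the luminance of each source image before Expression (1) is applied, their brightness in the output directly encodes the coefficients. The array names and the marker level are illustrative assumptions.

```python
import numpy as np

def combine_with_markers(y_v, y_i, marker_mask, alpha, beta, level=255.0):
    # Stamp the first superimposition information into the visible
    # luminance and the second into the infrared luminance, then combine.
    vis = y_v.copy()
    ir = y_i.copy()
    vis[marker_mask] = level  # first superimposition information
    ir[marker_mask] = level   # second superimposition information
    # After Expression (1) the marker luminance is (alpha + beta) * level;
    # it degenerates to alpha*level (beta = 0) or beta*level (alpha = 0).
    return alpha * vis + beta * ir
```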
In this way, by superimposing the first superimposition information and the second superimposition information on the visible image and the infrared image, respectively, and then generating the combined image, it is possible to output an image on which the combination information is superimposed to the client apparatus 103. Accordingly, when the hue of the image displayed on the client apparatus 103 changes, it is possible to easily determine whether the change is caused by the combination or by another reason.
The combination unit 506 combines the visible image 701a on which the first superimposition information 701b is superimposed with the infrared image 703a on which the second superimposition information 703b is superimposed, and generates a combined image 702a. Two figures, one being the first superimposition information 701b and the other being the second superimposition information 703b, are superimposed on the combined image 702a as the combination information 702b. The combination ratio can be checked from the luminance of each of the two figures included in the combination information 702b.
In this way, by superimposing the first superimposition information and the second superimposition information on the visible image and the infrared image, respectively, and then generating the combined image, it is possible to output an image on which the combination information is superimposed to the client apparatus 103. Accordingly, when the hue of the image displayed on the client apparatus 103 changes, it is possible to easily determine whether the change is caused by the combination or by another reason.
Next, a third embodiment will be described. Details not mentioned in the third embodiment are the same as those of the above-described embodiments. Hereinafter, overviews of configurations and functions of an imaging apparatus 801 and a client apparatus 802 according to the third embodiment will be described with reference to the drawings.
A combination unit 803 combines the visible image generated by the first image processing unit 108 and the infrared image generated by the second image processing unit 109 based on Expression (1) according to the first embodiment to generate a combined image. The combination unit 803 outputs the combined image and a combination ratio decided by the change unit 111 to an NW processing unit 805. The NW processing unit 805 outputs the combined image (video data 806) generated by the combination unit 803 and the combination ratio (metadata 807) decided by the change unit 111 to the client apparatus 802 via the network 102.
The client apparatus 802 includes an NW processing unit 808, a generation unit 811, a display unit 812, and a storage unit 813. The NW processing unit 808 receives the video data 806 and the metadata 807 output from the imaging apparatus 801 via the network 102. The generation unit 811 generates, as an OSD image, combination information indicating the combination ratio of the visible image to the infrared image from the metadata 807, and may superimpose the combination information on the video data 806 as in the first embodiment. For example, the combination information may be similar to that of the first embodiment, or may be a character string or the like from which the combination ratio can be understood. The display unit 812 is a display medium such as a display and displays the combined image and the combination information; it may display them superimposed, or may display them side by side without superimposing them. The storage unit 813 is, for example, a storage medium such as a hard disk or an SD card, and stores the combined image and the combination information.
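On the client side, generating the combination information from the received metadata might look like the following sketch. The JSON field names are hypothetical, since the source does not specify the metadata format.

```python
import json

def parse_metadata(metadata_json):
    # Hypothetical payload, e.g. '{"alpha": 0.7, "beta": 0.3}'.
    m = json.loads(metadata_json)
    return m["alpha"], m["beta"]

def combination_info_text(alpha, beta):
    # OSD string that the generation unit 811 could render over the video.
    return f"visible:infrared = {alpha:.2f}:{beta:.2f}"
```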
In this way, by displaying the combination ratio received as metadata along with the combined image, it is possible, when the hue of the image displayed on the client apparatus 802 changes, to easily determine whether the change is caused by the combination or by another reason.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2018-145090, filed Aug. 1, 2018, which is hereby incorporated by reference herein in its entirety.