The present invention relates to illuminance acquisition devices, illumination control systems, and programs.
A device for calculating (and acquiring data about), based on an image captured, the illuminance of a shooting space where an object of shooting is located has been known in the art (see Patent Literature 1, for example).
An information processor as disclosed in Patent Literature 1 obtains an index value to a sense of brightness in the shooting space by processing an image representing the distribution of illuminance measured in an image capturing area of a camera.
The shooting space may sometimes cover a window and/or a TV, for example. In such a situation, the brightness of a view of the outdoors through the window or the brightness of video on the TV may vary frequently. That is why if the shooting space covers a window or a TV, then the luminance (pixel value) of its image varies too frequently to calculate (or obtain) the brightness appropriately.
Patent Literature 1: JP 2013-168333 A
It is therefore an object of the present invention to provide an illuminance acquisition device, illumination control system, and program with the ability to reduce an error in illuminance detected from an image.
An illuminance acquisition device according to an aspect of the present invention includes a setting unit, a human sensing unit, and an illuminance acquisition unit. The setting unit is configured to set a first area for use to acquire an illuminance in an image capturing area of an image capturing unit. The human sensing unit is configured to determine, based on an image captured by the image capturing unit, whether or not there is any human in a second area of the image capturing area. The illuminance acquisition unit is configured to acquire, based on the image, a spatial illuminance in the first area.
An illumination control system according to another aspect of the present invention includes the illuminance acquisition device described above, a lighting fixture, and a controller. The controller is configured to, when the illuminance acquisition device has detected presence of any human, control light output of the lighting fixture based on the illuminance acquired by the illuminance acquisition device.
A program according to still another aspect of the present invention is designed to make a computer function as a setting unit, a human sensing unit, and an illuminance acquisition unit. The setting unit is configured to set a first area for use to acquire an illuminance in an image capturing area of an image capturing unit. The human sensing unit is configured to determine, based on an image captured by the image capturing unit, whether or not there is any human in a second area of the image capturing area. The illuminance acquisition unit is configured to acquire, based on the image, a spatial illuminance in the first area.
The illuminance acquisition device, illumination control system, and program described above allow for reducing an error in illuminance detected from an image.
The following embodiments generally relate to illuminance acquisition devices, illumination control systems, and programs, and more particularly relate to an illuminance acquisition device, illumination control system, and program for acquiring data about the illuminance of a shooting space by using an image captured.
A first embodiment of an illumination control system 1, illuminance acquisition device 10, and program will now be described with reference to
As shown in
The illuminance acquisition device 10 is a device with the function of acquiring data about the illuminance of a shooting space based on an image captured (hereinafter referred to as a “first processing function”) and the function of determining whether or not there is any human in the shooting space (hereinafter referred to as a “second processing function”). The controller 20 is a device for controlling the light output of the plurality of lighting fixtures 30 based on a processing result of the first processing function and a processing result (sensing result) of the second processing function.
A configuration for the illuminance acquisition device 10 will be described. As shown in
The setting unit 13, the illuminance acquisition unit 14, and the human sensing unit 15 include, as their major component, a microcomputer (or microcontroller), and perform these functions by executing a program stored in a memory. Note that the program may have been written in a memory in advance or may be provided after having been stored in a storage medium such as a memory card.
The image capturing unit 11 is a two-dimensional image capturing device including a solid-state image sensor in which a plurality of photodetectors are arranged two-dimensionally. The photodetectors may be implemented as charge-coupled devices (CCDs) or complementary metal-oxide-semiconductor (CMOS) devices, for example. The image capturing unit 11 outputs an image captured to the illuminance acquisition unit 14 and the human sensing unit 15.
The operating unit 12 may be implemented as a remote controller, which accepts an operating command input, outputs an operating signal representing the operating command input, and includes a display screen with a touchscreen capability. The operating unit 12 accepts the operating command input by the operator to make the setting unit 13 set an illuminance acquisition area (first area) for use to acquire the illuminance in the shooting area. In addition, the operating unit 12 also accepts the operating command input by the operator to make the setting unit 13 set a human sensing area (second area) for use to detect the presence of any human in the shooting area. On the display screen of the operating unit 12, the image capturing area (corresponding to the shooting area) may be displayed on a pixel-by-pixel basis. The operator is allowed to specify the illuminance acquisition area by touching, on the screen, the pixels that define the illuminance acquisition area. The operating unit 12 outputs, as an operating signal, information indicating the locations of all pixels specified as defining the illuminance acquisition area through the operating command input (e.g., the coordinates of the specified pixels) to the setting unit 13 as an infrared signal. Likewise, the operator specifies the human sensing area by touching, on the screen, the pixels that define the human sensing area, and the operating unit 12 outputs, as an operating signal, information indicating the locations of all pixels specified as defining the human sensing area (e.g., the coordinates of the specified pixels) to the setting unit 13 as an infrared signal. Note that the operating unit 12 may specify each of these areas on a pixel-by-pixel basis or on the basis of a rectangular area composed of a plurality of pixels, without particular limitation.
The setting unit 13 sets the illuminance acquisition area and the human sensing area in accordance with the operating command input by the operator through the operating unit 12 (i.e., in accordance with the operating signal supplied from the operating unit 12). For example, if an area A2 has been specified as the illuminance acquisition area in the image capturing area A1 in accordance with the operating command input by the operator (see
The illuminance acquisition unit 14 performs calibrations while the illuminance acquisition device 10 is installed, thereby calculating and storing a transform coefficient for transforming the luminance of an image into an illuminance thereof. Specifically, the illuminance acquisition unit 14 calculates the transform coefficient by the equation “transform coefficient=(exposure duration×gain×brightness coefficient)/luminance of image,” where the brightness coefficient is a value measured with an illuminometer.
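The calibration equation above can be sketched as follows. The function name and the sample values are illustrative assumptions; only the equation itself comes from the text.

```python
# Hypothetical sketch of the calibration described above:
# transform coefficient = (exposure duration x gain x brightness coefficient)
#                         / luminance of image,
# where the brightness coefficient is a value measured with an illuminometer.
def calc_transform_coefficient(exposure_duration, gain,
                               brightness_coefficient, image_luminance):
    return (exposure_duration * gain * brightness_coefficient) / image_luminance
```

For example, with an assumed exposure duration of 0.01 s, a gain of 2.0, a measured brightness coefficient of 500.0, and an image luminance of 80.0, the stored transform coefficient would be 0.125.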
The illuminance acquisition unit 14 calculates and acquires the illuminance of a shooting space based on the brightness values (pixel values) of all pixels included in the illuminance acquisition area that has been set by the setting unit 13. Specifically, the illuminance acquisition unit 14 stores correction coefficients α1, β1, and γ1 for an R luminance (R1), a G luminance (G1), and a B luminance (B1), respectively. The illuminance acquisition unit 14 obtains, for each of the plurality of pixels included in the illuminance acquisition area represented by a single set or plural sets of coordinates stored in the setting unit 13, an R luminance (R1), a G luminance (G1), and a B luminance (B1) associated with that pixel. For example, if the illuminance acquisition area is the area A2 shown in
The illuminance acquisition unit 14 also calculates a representative value based on the pixel values that have been calculated on a pixel-by-pixel basis in the illuminance acquisition area (e.g., the area A2 in this case). Specifically, the illuminance acquisition unit 14 calculates the average of all pixel values calculated and sets the average thus calculated as the brightness (luminance) of the image. Alternatively, instead of calculating such an average as the representative value, the illuminance acquisition unit 14 may also regard, as the representative value, a median or mode with respect to all of those pixel values calculated.
The illuminance acquisition unit 14 calculates the illuminance by multiplying, by the transform coefficient stored in advance, the image luminance thus calculated. Then, the illuminance acquisition unit 14 outputs the illuminance, calculated based on the image luminance, to the controller 20 via the communications unit 16.
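The three steps above (per-pixel corrected RGB luminance, representative value, multiplication by the stored transform coefficient) can be sketched as follows. The text does not spell out how the three corrected luminances are combined, so a weighted sum with assumed coefficients α1, β1, γ1 is used here; the sample pixel data are likewise assumptions.

```python
from statistics import mean

# Assumed correction coefficients for the R, G, and B luminances (α1, β1, γ1).
ALPHA1, BETA1, GAMMA1 = 0.299, 0.587, 0.114

def pixel_value(r1, g1, b1):
    # Assumed combination: weighted sum of the corrected RGB luminances.
    return ALPHA1 * r1 + BETA1 * g1 + GAMMA1 * b1

def acquire_illuminance(area_pixels, transform_coefficient):
    # Representative value: the average is used here; the text also
    # allows a median or mode of the calculated pixel values.
    representative = mean(pixel_value(r, g, b) for (r, g, b) in area_pixels)
    # Illuminance = image luminance x stored transform coefficient.
    return representative * transform_coefficient
```

With the two sample pixels `(100, 100, 100)` and `(50, 50, 50)` and a transform coefficient of 0.125, the sketch yields an illuminance of 9.375.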
The human sensing unit 15 determines, within the human sensing area set by the setting unit 13, whether or not there is any human in the shooting space. Specifically, in accordance with the information provided by the image capturing unit 11, the human sensing unit 15 corrects the image captured by the image capturing unit 11 into an image for sensing any human. The human sensing unit 15 stores in advance, as a background image, an image that has been captured in the shooting space with no humans present. The human sensing unit 15 determines whether or not there is any human by comparing the human sensing area in the background image with the human sensing area in the corrected image. The human sensing unit 15 outputs the sensing result to the controller 20 via the communications unit 16.
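A minimal sketch of the comparison step: the human sensing area of the stored background image is compared against the same area of the current (corrected) image, and a large enough difference is treated as presence. The per-pixel difference threshold and the changed-pixel ratio are assumptions; the text only states that the two areas are compared.

```python
def detect_human(background, current, area, diff_threshold=30, ratio=0.05):
    # Count pixels in the human sensing area whose value differs from the
    # background image by more than the (assumed) per-pixel threshold.
    changed = sum(
        1 for (x, y) in area
        if abs(current[y][x] - background[y][x]) > diff_threshold
    )
    # Presence is reported when enough of the area has changed.
    return changed > ratio * len(area)
```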
The communications unit 16 outputs the illuminance calculated by the illuminance acquisition unit 14 and the sensing result obtained by the human sensing unit 15 to the controller 20.
In response, the controller 20 performs light output control on the plurality of lighting fixtures 30 based on the illuminance calculated by the illuminance acquisition unit 14 and the sensing result obtained by the human sensing unit 15.
For example, the controller 20 may control the light output of the plurality of lighting fixtures 30 based on the illuminance provided by the illuminance acquisition unit 14 such that the shooting space will have a predetermined illuminance. Alternatively, the controller 20 may also compare the illuminance provided by the illuminance acquisition unit 14 with a predetermined threshold value and control the light output of the plurality of lighting fixtures 30 based on a result of the comparison. Specifically, when finding the illuminance provided by the illuminance acquisition unit 14 less than the predetermined threshold value, the controller 20 controls the plurality of lighting fixtures 30 so as to increase their light output. On the other hand, when finding the illuminance provided by the illuminance acquisition unit 14 greater than the predetermined threshold value, the controller 20 controls the plurality of lighting fixtures 30 so as to decrease their light output.
In addition, the controller 20 also controls the light output of the plurality of lighting fixtures 30 based on the sensing result obtained by the human sensing unit 15 (i.e., depending on whether or not there is any human in the shooting space). Specifically, if there is any human in the shooting space, the controller 20 controls the plurality of lighting fixtures 30 to turn those lighting fixtures 30 ON. On the other hand, if there are no humans in the shooting space, the controller 20 controls the plurality of lighting fixtures 30 to turn those lighting fixtures 30 OFF.
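The control rules in the two paragraphs above can be sketched as follows. Presence gates the fixtures ON/OFF, and while someone is present the light output is raised below the threshold and lowered above it. The 0-100 output scale, the step size, and the return convention are illustrative assumptions.

```python
def control_output(present, illuminance, threshold, current_output, step=10):
    if not present:
        return 0  # no human in the shooting space: fixtures OFF
    if illuminance < threshold:
        # Below the predetermined threshold: increase the light output.
        return min(100, current_output + step)
    if illuminance > threshold:
        # Above the predetermined threshold: decrease the light output.
        return max(0, current_output - step)
    return current_output
```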
The controller 20 may control the plurality of lighting fixtures 30 in various manners by changing combinations of the illuminance provided by the illuminance acquisition unit 14 and the sensing result obtained by the human sensing unit 15.
Optionally, according to this embodiment, the illuminance acquisition area and the human sensing area may be set in advance when the illuminance acquisition device 10 is going to be shipped. In that case, the operator is expected to change the preset illuminance acquisition area and the human sensing area by inputting operating commands. The illuminance acquisition area may be the same as the human sensing area. Alternatively, the human sensing area may be included in the illuminance acquisition area. Conversely, the illuminance acquisition area may be included in the human sensing area. Still alternatively, part of the illuminance acquisition area may be included in the human sensing area. Yet alternatively, there may be no overlapping area between the illuminance acquisition area and the human sensing area. In the embodiment described above, the illuminance acquisition area is defined such that pixels are contiguous with each other in the X-axis or Y-axis direction shown in
Also, in the embodiment described above, calibrations are supposed to be performed to calculate the transform coefficient when the illuminance acquisition device 10 is installed. However, this should not be construed as limiting. Rather, the illuminance acquisition device 10 may also perform calibrations and update the transform coefficient even after having been installed. For example, the illuminance acquisition device 10 may update the transform coefficient at regular intervals (of, e.g., every month, every six months, or once a year) or every time the image capturing unit 11 is inspected.
Note that the image capturing unit 11 is not an essential element for the illuminance acquisition device 10. Also, the illuminance acquisition device 10 and the controller 20 may be implemented as a single integrated device.
According to a second embodiment, the setting unit 13 has not only the function of setting an illuminance acquisition area in accordance with operating commands input by the operator but also the function of automatically setting the illuminance acquisition area, which is a major difference from the first embodiment.
The following description of the second embodiment will be focused on differences from the first embodiment. Also, in the following description, any constituent member of the second embodiment having the same function as the counterpart of the first embodiment described above will be designated by the same reference numeral as that counterpart's, and a detailed description thereof will be omitted herein.
The setting unit 13 of this embodiment has the following functions in addition to the function of setting an illuminance acquisition area in accordance with the operating commands input by the operator.
The setting unit 13 has the function of setting (or changing) the illuminance acquisition area in accordance with a variation with time in the pixel value of every pixel included in a predetermined area of an image captured (hereinafter referred to as an “area setting function”). In this case, the predetermined area may be the illuminance acquisition area specified by the operator, for example. The setting unit 13 checks each of a plurality of pixels included in the illuminance acquisition area for a variation with time in their pixel value, and removes, when finding the magnitude of the variation in the pixel value of any of the plurality of pixels greater than a predetermined threshold value, the pixel from the illuminance acquisition area. On the other hand, when finding the magnitude of the variation in the pixel value of each of those pixels less than the predetermined threshold value, the setting unit 13 regards that pixel as a pixel to be included in the illuminance acquisition area.
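The area setting function described above can be sketched as follows, using the temporal variance of each pixel's value as the measure of variation (as the fourth aspect later suggests). The `history` mapping, the variance threshold, and the sample values are assumptions.

```python
from statistics import pvariance

def refine_area(area, history, variance_threshold=100.0):
    # `history` maps each pixel's coordinates to its pixel values over
    # several captured frames. A pixel whose variation (variance) exceeds
    # the (assumed) threshold is removed from the illuminance acquisition
    # area; the remaining pixels are kept.
    return [p for p in area if pvariance(history[p]) <= variance_threshold]
```

A pixel looking at a steady wall would survive this refinement, while a pixel looking at a window or a TV screen, whose value swings widely between frames, would be removed.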
The area setting function of the setting unit 13 will be described with reference to
In this regard, it will be described with reference to
For example,
In the embodiment described above, the setting unit 13 regards, as a predetermined area, the illuminance acquisition area that has been set in accordance with the operating commands input by the operator and changes that area. However, this is only an example and should not be construed as limiting. Alternatively, the illuminance acquisition area that has been set in advance when the illuminance acquisition device 10 is going to be shipped may be regarded as a predetermined area to be changed. Still alternatively, if no illuminance acquisition areas have been set, the setting unit 13 may set the illuminance acquisition area by regarding the image capturing area A1 as the predetermined area. Furthermore, even if an illuminance acquisition area has been set, the setting unit 13 may also newly set a different illuminance acquisition area by regarding the image capturing area A1 as the predetermined area.
This allows the illuminance acquisition device 10 of this embodiment to automatically set (or change) the illuminance acquisition area.
In the embodiment described above, the setting unit 13 sets (and changes) the illuminance acquisition area based on the degree of variation in the pixel value. However, this is only an example and should not be construed as limiting. Alternatively, the setting unit 13 may remove any pixel, of which the pixel value falls out of a predetermined range, from the illuminance acquisition area. Specifically, the setting unit 13 generates a histogram of pixel values with respect to a plurality of pixels included in a predetermined area. By reference to the histogram thus generated, the setting unit 13 determines a pixel with a low pixel value and a pixel with a high pixel value as pixels falling out of the predetermined range. Next, it will be described with reference to
The setting unit 13 generates a histogram of pixel values with respect to all pixels included in the image capturing area A1 of the image capturing unit 11 (see
The setting unit 13 removes pixels falling within the area A21, which is an area of pixels with pixel values less than the first threshold value and which includes the histogram H1, as pixels with low pixel values from the illuminance acquisition area. In addition, the setting unit 13 also removes pixels falling within the area A20, which is an area of pixels with pixel values greater than the second threshold value and which includes the histogram H4, as pixels with high pixel values from the illuminance acquisition area. Furthermore, the setting unit 13 sets the area A22, which is an area of pixels with pixel values falling within the range of the first threshold value X1 to the second threshold value X2 and which includes the histograms H2 and H3, as the illuminance acquisition area (i.e., changes the illuminance acquisition area from the entire image capturing area A1 into the area A22).
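The histogram-based refinement above can be sketched as follows: pixels with values below the first threshold X1 or above the second threshold X2 are removed, and the remaining pixels form the new illuminance acquisition area. The concrete threshold values and sample data are assumptions.

```python
X1, X2 = 50, 200  # assumed first and second threshold values

def set_area_by_histogram(pixels):
    # `pixels` maps each coordinate in the predetermined area to its pixel
    # value. Pixels with low values (below X1) and high values (above X2)
    # are removed; the rest become the illuminance acquisition area.
    return [p for p, value in pixels.items() if X1 <= value <= X2]
```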
Optionally, such a change of the illuminance acquisition areas using the histogram and a change of the illuminance acquisition areas in accordance with a variation with time in pixel value as described above may be carried out in combination. In that case, the setting unit 13 checks the pixel values for their degree of variance with time, removes pixels with a significant degree of variance (with a significant degree of dispersion), and sets the illuminance acquisition area for the other pixels using a histogram (i.e., changes the illuminance acquisition areas).
Also, the setting unit 13 may also set, as the illuminance acquisition area, an area in which at least a predetermined number of pixels, among all pixels falling within the area A22 having pixel values within the range of the first threshold value X1 to the second threshold value X2 and including the histograms H2 and H3, are formed continuously. In that case, the illuminance acquisition area thus set is not made up of a plurality of dispersed areas (i.e., does not have a discontinuous distribution) but becomes a single continuous area.
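The continuity requirement above can be sketched in one dimension: among the in-range pixels, only a run of at least a predetermined number of contiguous pixels is kept, so the result is a single continuous stretch rather than scattered fragments. Restricting the sketch to a single row is a simplifying assumption; the text implies the same idea over the two-dimensional image capturing area.

```python
def longest_contiguous_run(in_range_flags, min_run):
    # Scan a row of booleans (True = pixel value within the predetermined
    # range) and keep the longest contiguous run, provided it reaches the
    # required minimum length; otherwise return an empty area.
    best, start = (0, 0), None
    for i, flag in enumerate(in_range_flags + [False]):  # sentinel ends runs
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start > best[0]:
                best = (i - start, start)
            start = None
    length, start = best
    return list(range(start, start + length)) if length >= min_run else []
```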
Furthermore, in the embodiment described above, the lower limit value X1 and upper limit value X2 of the predetermined range are supposed to be determined in advance. However, this is only an example and should not be construed as limiting. Alternatively, the lower limit value X1 and upper limit value X2 of the predetermined range may also be set based on a histogram of pixel values generated by the setting unit 13. For example, the setting unit 13 may respectively set X1 and X2 at the lower and upper limit values of a range including 70% of the pixel values of a plurality of pixels falling within a predetermined area (such as the image capturing area A1). Note that this numerical value is only an example and should not be construed as limiting.
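Setting X1 and X2 from the histogram itself can be sketched as follows, taking the lower and upper limits of a central range covering 70% of the pixel values. Placing the 70% window symmetrically (i.e., trimming 15% from each end) is an assumption; the text only states that the range includes 70% of the pixel values.

```python
def thresholds_from_histogram(values, coverage=0.70):
    # Sort the pixel values and trim an equal (assumed) share from each
    # tail so that the kept central range covers `coverage` of the pixels.
    ordered = sorted(values)
    n = len(ordered)
    cut = int(n * (1.0 - coverage) / 2.0)
    return ordered[cut], ordered[n - 1 - cut]
```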
Moreover, even if pixels with pixel values falling out of the predetermined range are removed from the illuminance acquisition area, the illuminance acquisition area that has been set in advance when the illuminance acquisition device 10 is going to be shipped may be regarded as a predetermined area to be changed. If no illuminance acquisition areas have been set, the setting unit 13 may set the illuminance acquisition area by regarding the image capturing area A1 as the predetermined area. Furthermore, even if an illuminance acquisition area has been set, the setting unit 13 may also newly set a different illuminance acquisition area by regarding the image capturing area A1 as the predetermined area.
As can be seen from the foregoing description of embodiments, an illuminance acquisition device 10 according to a first aspect of the present invention includes a setting unit 13, a human sensing unit 15, and an illuminance acquisition unit 14. The setting unit 13 is configured to set a first area (illuminance acquisition area) for use to acquire an illuminance in an image capturing area of an image capturing unit 11. The human sensing unit 15 is configured to determine, based on an image captured by the image capturing unit 11, whether or not there is any human in a second area (human sensing area) of the image capturing area. The illuminance acquisition unit 14 is configured to acquire, based on the image, a spatial illuminance in the first area.
According to this configuration, the illuminance acquisition device 10 sets a first area (illuminance acquisition area) in an image capturing area, and therefore, may remove, for example, an area where the pixel value varies frequently from the first area. This allows the illuminance acquisition device 10 to reduce an error in the illuminance detected from an image by using the first area.
In an illuminance acquisition device 10 according to a second aspect of the present invention, which is dependent on the first aspect, the setting unit 13 sets the first area in accordance with an operating signal supplied from an operating unit 12. According to this configuration, the illuminance acquisition device 10 sets the first area in accordance with operating commands input by an operator. This allows the operator to set the first area according to his or her own preference.
In an illuminance acquisition device 10 according to a third aspect of the present invention, which is dependent on the first or second aspect, the setting unit 13 removes, from the first area, any of a plurality of pixels included in the first area, when finding the magnitude of a variation with time in its pixel value to be greater than a predetermined threshold value. According to this configuration, the illuminance acquisition device 10 removes any pixel, of which the pixel value has varied significantly with time, from the first area, thus obtaining even more appropriate illuminance (brightness).
In an illuminance acquisition device 10 according to a fourth aspect of the present invention, which is dependent on the third aspect, the setting unit 13 determines, for each of the plurality of pixels, the magnitude of the variation in the pixel value by a degree of variance with time between pixel values of that pixel. According to this configuration, the illuminance acquisition device 10 determines the degree of the variation in the pixel value by a degree of variance between the pixel values, which allows the illuminance acquisition device 10 to set the first area even more appropriately.
In an illuminance acquisition device 10 according to a fifth aspect of the present invention, which is dependent on any one of the first to fourth aspects, the setting unit 13 removes, from the first area, any of a plurality of pixels included in the first area, when finding a pixel value of the pixel to fall out of a predetermined range. According to this configuration, the illuminance acquisition device 10 removes a pixel, of which the pixel value falls out of a predetermined range, from the first area. This allows the illuminance acquisition device 10 to obtain even more appropriate illuminance (brightness).
In an illuminance acquisition device 10 according to a sixth aspect of the present invention, which is dependent on the fifth aspect, the setting unit 13 determines a pixel with a high pixel value and a pixel with a low pixel value to be the pixels falling out of the predetermined range in a histogram of pixel values of the plurality of pixels. According to this configuration, the illuminance acquisition device 10 determines a pixel with a high pixel value and a pixel with a low pixel value to be pixels falling out of the predetermined range using a histogram of pixel values, which allows the illuminance acquisition device 10 to set the first area even more appropriately.
In an illuminance acquisition device 10 according to a seventh aspect of the present invention, which is dependent on the sixth aspect, the setting unit 13 sets, as the first area, an area in which among all of the plurality of pixels but the pixel with the high pixel value and the pixel with the low pixel value, at least a predetermined number of pixels are formed continuously. This configuration allows the illuminance acquisition device 10 to set a single continuous area as the first area.
An illumination control system 1 according to an eighth aspect of the present invention includes the illuminance acquisition device 10 according to any of the first to seventh aspects described above, a lighting fixture 30, and a controller 20. The controller 20 is configured to, when the illuminance acquisition device 10 has detected presence of any human, control light output of the lighting fixture 30 based on the illuminance acquired by the illuminance acquisition device 10. According to this configuration, the illumination control system 1 is able to reduce an error in the illuminance detected from an image by using the first area (illuminance acquisition area) set by the illuminance acquisition device 10. This allows the controller 20 of the illumination control system 1 to perform highly accurate light output control on the lighting fixture 30.
A program according to a ninth aspect of the present invention is designed to make a computer function as a setting unit 13, a human sensing unit 15, and an illuminance acquisition unit 14. The setting unit 13 is configured to set a first area for use to acquire an illuminance in an image capturing area of an image capturing unit 11. The human sensing unit 15 is configured to determine, based on an image captured by the image capturing unit 11, whether or not there is any human in a second area of the image capturing area. The illuminance acquisition unit 14 is configured to acquire, based on the image, a spatial illuminance in the first area. This program allows for reducing an error in illuminance detected from an image by using an illuminance acquisition area.
1 Illumination Control System
10 Illuminance Acquisition Device
11 Image Capturing Unit
13 Setting Unit
14 Illuminance Acquisition Unit
15 Human Sensing Unit
20 Controller
30 Lighting Fixture
Priority: JP 2015-187095, filed Sep 2015, Japan (national).
Filing document: PCT/JP2016/004131, filed 9/12/2016, WO.