The disclosure of Japanese Patent Application No. 2010-14238, which was filed on Jan. 26, 2010, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a congestion degree measuring apparatus. More particularly, the present invention relates to a congestion degree measuring apparatus which measures a congestion degree of one or at least two dynamic objects existing on a plane, based on a scene image outputted from a camera having an imaging surface that captures the plane.
2. Description of the Related Art
According to one example of this type of apparatus, a camera captures a plane where a human walks, from a diagonal direction. An image outputted from the camera is divided into a plurality of motion processing regions. At this time, a size of a motion processing region allocated to a close scene is made larger than a size of a motion processing region allocated to a distant scene. A motion of the image is detected in each motion processing region and is compared with a threshold value set to each motion processing region. A congestion degree of a human who walks is estimated based on a comparison result thus obtained.
However, in the above-described apparatus, the size of the motion processing regions must be varied along the perspective direction of the scene, so the work burden of the initial setting may increase.
A congestion degree measuring apparatus according to the present invention, comprises: a reproducer which reproduces a reference image representing a state in which a plane is overlooked; a taker which repeatedly takes a scene image outputted from a camera having an imaging surface that captures the plane; an allocator which allocates one or at least two second areas respectively corresponding to one or at least two first areas designated on the reference image reproduced by the reproducer to the scene image taken by the taker; and a measurer which executes a process of measuring a congestion degree of one or at least two dynamic objects existing on the plane at each second area allocated by the allocator, based on the scene image taken by the taker.
The above described features and advantages of the present invention will become more apparent from the following detailed description of the embodiment when taken in conjunction with the accompanying drawings.
With reference to
When the first area is designated on the reference image representing the state in which the plane is outlined or overlooked, the second area used for the congestion degree measurement referring to the scene image is allocated to the scene image so as to correspond to the designated first area. Thereby, the work burden of the initial setting can be reduced.
With reference to
With reference to
On a ceiling of the room, air conditioning devices D1 to D4 are installed at a predetermined distance. Each of the air conditioning devices D1 to D4 outputs air at a designated temperature in a designated air amount. A room temperature is adjusted by the air thus outputted.
When a measurement area setting mode is selected by a manipulation of an input device 18, the following processes are executed by a CPU 14p.
Firstly, a map image shown in
When a rectangular area is designated on the map image by a drag manipulation of a mouse pointer arranged in the input device 18, the variable K is incremented, and a line indicating the designated rectangular area is drawn on the map image.
The line indicating the rectangular area is drawn as shown in
Four X-Y coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3), and (X_4, Y_4) respectively corresponding to four corners of the designated rectangular area are set to a register 14r shown in
Moreover, a normalized coefficient α_K, i.e., a numerical value corresponding to the dimensions of the rectangular area designated by the drag manipulation, is described in the register 14r. The normalized coefficient α_K is obtained by dividing the dimensions of the designated rectangular area by a unit area, and is described in the register 14r corresponding to the variable K.
Upon completion of the process of setting to the register, a variable L is set to each of “1” to “4”. X-Y coordinates (X_L, Y_L) described in an L-th column that corresponds to the variable K are transformed into U-V coordinates (U_L, V_L) according to Equation 1.
Calibration parameters P11 to P33 shown in Equation 1 are equivalent to a matrix for performing a planar projective transformation between an X-Y coordinate system defining the plane FS1 and a U-V coordinate system defining the camera image. Therefore, when the X-Y coordinates (X_L, Y_L) on the plane FS1 are applied to Equation 1, the corresponding U-V coordinates (U_L, V_L) on the camera image are calculated. The U-V coordinates (U_L, V_L) thus transformed are described in the register 14r corresponding to the X-Y coordinates (X_L, Y_L), which are a transformation source.
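The planar projective transformation of Equation 1 can be sketched as follows. The matrix values below are hypothetical placeholders, not actual calibration data; in practice the parameters P11 to P33 are obtained by calibrating the camera against the plane FS1.

```python
import numpy as np

# Hypothetical calibration matrix (P11..P33); a pure translation is used
# here only so the example stays easy to follow.
P = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 20.0],
              [0.0, 0.0,  1.0]])

def xy_to_uv(x, y, P):
    """Equation 1: map X-Y coordinates on the plane to U-V coordinates
    on the camera image via homogeneous coordinates."""
    w = P @ np.array([x, y, 1.0])
    return w[0] / w[2], w[1] / w[2]  # divide out the homogeneous scale
```

Applying the function to each of the four corner coordinates (X_1, Y_1) to (X_4, Y_4) yields the four U-V coordinates delimiting the measurement area.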
A total of four transformation processes according to Equation 1 are executed, and as a result, the U-V coordinates (U_1, V_1), (U_2, V_2), (U_3, V_3), and (U_4, V_4) respectively corresponding to the X-Y coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3), and (X_4, Y_4) are calculated. Upon completion of the transformation process, the area surrounded by the U-V coordinates (U_1, V_1), (U_2, V_2), (U_3, V_3), and (U_4, V_4) is specified as the measurement area, and a line indicating the specified measurement area is drawn on the camera image. The number of pixels belonging to the measurement area out of the plurality of pixels configuring the camera image is described in the register 14r corresponding to the variable K.
Therefore, the line defining the measurement area is drawn as shown in
As a result, when a total of four drag manipulations are executed in a manner to surround each of the marks M1 to M4, the line indicating the rectangular area is drawn on the map image as shown in
When a congestion degree measuring mode is selected by the manipulation of the input device 18 after the setting of the measurement area is completed, the following processes are executed by the CPU 14p at each arrival of a measurement cycle.
Firstly, an image indicating a motion, i.e., a motion image, is detected on the camera image. Subsequently, the variable K is set to "1", and the number of pixels of the K-th measurement area (=the number of pixels described in the register 14r corresponding to the variable K) is set to a variable P_K1. Moreover, a motion image belonging to the K-th measurement area is extracted from the motion image detected on the camera image, and the number of pixels of the extracted motion image is set to a variable P_K2.
A congestion degree CR_K indicating the level of congestion of the K-th measurement area is calculated based on the thus-set variables P_K1 and P_K2 and the normalized coefficient α_K described in the register 14r. Specifically, the congestion degree CR_K is obtained by dividing the variable P_K2 by the variable P_K1 and multiplying the resulting quotient by the normalized coefficient α_K. When the congestion degree CR_K is calculated, it is determined whether or not the variable K reaches a maximum value Kmax (=the total number of the measurement areas). When the determined result is NO, the variable K is incremented, and the above-described process is executed again. When the determined result is YES, each output of the air conditioning devices D1 to D4 is controlled based on the calculated congestion degrees CR_1 to CR_Kmax. Specifically, the output of an air conditioning device close to a measurement area in which the congestion degree is high is strengthened, while the output of an air conditioning device close to a measurement area in which the congestion degree is low is weakened.
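The per-area calculation described above can be sketched as follows. The pixel counts and normalized coefficients are hypothetical example values; in the embodiment, P_K1, P_K2, and α_K are taken from the register 14r.

```python
# Hypothetical per-area values (K = 1, 2); in the embodiment these come
# from the register 14r and the motion detection on the camera image.
areas = [
    {"pixels_total": 5000, "pixels_motion": 1500, "alpha": 1.2},  # K = 1
    {"pixels_total": 4000, "pixels_motion":  400, "alpha": 0.8},  # K = 2
]

def congestion_degree(pixels_total, pixels_motion, alpha):
    # CR_K = (P_K2 / P_K1) * alpha_K
    return pixels_motion / pixels_total * alpha

# One value per measurement area, i.e. CR_1 .. CR_Kmax.
cr = [congestion_degree(a["pixels_total"], a["pixels_motion"], a["alpha"])
      for a in areas]
```

The loop over K in the embodiment corresponds to the list comprehension here; once all CR_K values are available, the device outputs can be ranked accordingly.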
With reference to
Then, a congestion degree of the measurement area MA1 is calculated based on the number of pixels of an image belonging to the measurement area MA1, the number of pixels of an image representing the humans H1 and H2, and the normalized coefficient α_1, and a congestion degree of the measurement area MA2 is calculated based on the number of pixels of an image belonging to the measurement area MA2, the number of pixels of an image representing the humans H3 and H4, and the normalized coefficient α_2. Moreover, a congestion degree of the measurement area MA3 is calculated based on the number of pixels of an image belonging to the measurement area MA3, the number of pixels of a partial image belonging to the measurement area MA3 out of an image representing the human H5, and the normalized coefficient α_3, and a congestion degree of the measurement area MA4 is calculated based on the number of pixels of an image belonging to the measurement area MA4, the number of pixels of an image representing the human H6, and the normalized coefficient α_4. As a result, the outputs of the air conditioning devices D1 and D2 are strengthened more than the outputs of the air conditioning devices D3 and D4.
The CPU 14p executes a plurality of tasks including an imaging task shown in
With reference to
When YES is determined in the step S3, the measurement area setting task is started up in a step S5, and thereafter, the process advances to a step S15. When YES is determined in the step S7, it is determined in a step S9 whether or not the measurement area is already set. When the determined result is YES, the congestion degree measuring task is started up in a step S11, and then the process advances to the step S15; when the determined result is NO, the process directly advances to the step S15. When NO is determined in both the steps S3 and S7, another process is executed in a step S13, and thereafter, the process advances to the step S15.
In the step S15, it is repeatedly determined whether or not a mode changing manipulation is performed. When a determined result is updated from NO to YES, the task that is being started up is ended in a step S17, and thereafter, the process returns to the step S3.
With reference to
In a step S25, it is determined whether or not the drag manipulation for designating the area is performed, and when a determined result is updated from NO to YES, the variable K is incremented in a step S27. In a step S29, the line defining the rectangular area designated by the drag manipulation is drawn on the map image.
In a step S31, the four X-Y coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3), and (X_4, Y_4) respectively corresponding to the four corners of the rectangular area thus designated are specified. The specified X-Y coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3), and (X_4, Y_4) are set in the register 14r corresponding to the value of the variable K.
In a step S33, the normalized coefficient α_K is calculated by dividing the dimensions of the rectangular area designated by the drag manipulation by the unit area. The calculated normalized coefficient α_K is also set to the register 14r corresponding to the value of the variable K.
Upon completion of the process of setting to the register 14r, the variable L is set to “1” in a step S35. In a subsequent step S37, the X-Y coordinates (X_L, Y_L) described in the L-th column that corresponds to the variable K are read out from the register 14r, and the read-out X-Y coordinates (X_L, Y_L) are transformed into the U-V coordinates (U_L, V_L) according to the above-described Equation 1. The transformed U-V coordinates (U_L, V_L) are described in the register 14r corresponding to the X-Y coordinates (X_L, Y_L), which are a transformation source.
In a step S39, it is determined whether or not the variable L reaches "4". When the determined result is NO, the variable L is incremented in a step S41, and then the process returns to the step S37; when the determined result is YES, the process advances to a step S43. Therefore, the processes after the step S43 are executed after the X-Y coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3), and (X_4, Y_4) are transformed into the U-V coordinates (U_1, V_1), (U_2, V_2), (U_3, V_3), and (U_4, V_4).
In the step S43, the area surrounded by the U-V coordinates (U_1, V_1), (U_2, V_2), (U_3, V_3), and (U_4, V_4) that correspond to the variable K is specified as the measurement area so as to draw the line defining the specified measurement area on the camera image. In a step S45, out of the plurality of pixels configuring the camera image, the number of pixels belonging to the measurement area specified in the step S43 is detected. The detected number of pixels is described in the register 14r corresponding to the variable K. Upon completion of the process in the step S45, it is regarded that the setting of the K-th measurement area is completed, and the process returns to the step S25.
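Step S45 requires deciding, for every pixel of the camera image, whether it falls inside the quadrilateral bounded by the four transformed U-V coordinates. One standard way to do this is a ray-casting point-in-polygon test; the sketch below is an illustrative implementation, not the one used in the embodiment.

```python
def point_in_quad(u, v, quad):
    """Ray-casting test: is the point (u, v) inside the quadrilateral
    given by four (U, V) corner coordinates?"""
    inside = False
    n = len(quad)
    for i in range(n):
        u1, v1 = quad[i]
        u2, v2 = quad[(i + 1) % n]
        # Does the edge straddle the horizontal ray through v?
        if (v1 > v) != (v2 > v):
            cross_u = (u2 - u1) * (v - v1) / (v2 - v1) + u1
            if u < cross_u:
                inside = not inside
    return inside

def count_area_pixels(width, height, quad):
    """Count how many pixel centers of a width x height camera image
    fall inside the measurement area (cf. step S45)."""
    return sum(point_in_quad(u + 0.5, v + 0.5, quad)
               for v in range(height) for u in range(width))
```

The resulting count is what would be stored in the register 14r as the pixel count of the K-th measurement area.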
With reference to
In a step S63, the ratio that the motion image belonging to the K-th measurement area occupies in the K-th measurement area is calculated based on the variables P_K1 and P_K2 respectively set in the steps S57 and S61. The ratio (=RT_K) is equivalent to the value obtained by dividing the variable P_K2 by the variable P_K1. In a step S65, the congestion degree of the K-th measurement area is calculated as "CR_K" by multiplying the calculated ratio RT_K by the normalized coefficient α_K.
When the congestion degree CR_K is calculated, in a step S67, it is determined whether or not the variable K reaches the maximum value Kmax (=the total number of the measurement areas). When the determined result is NO, the variable K is incremented in a step S69, and thereafter, the process returns to the step S57. When the determined result is YES, the process advances to a step S71 so as to control the outputs of the air conditioning devices D1 to D4 based on the congestion degrees CR_1 to CR_Kmax calculated in the step S65. Upon completion of the air-conditioning control, the process returns to the step S51.
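The control rule of step S71 is specified only as strengthening the output near crowded areas and weakening it near sparse ones. One minimal sketch is a linear mapping around the mean congestion degree; the base and gain values below are hypothetical, not part of the embodiment.

```python
def control_outputs(congestion, base=0.5, gain=1.0):
    """Map per-area congestion degrees CR_1..CR_Kmax to per-device
    output levels: above-average areas get stronger output, below-average
    areas get weaker output. A simple linear rule, not the patented one."""
    mean = sum(congestion) / len(congestion)
    return [max(0.0, base + gain * (cr - mean)) for cr in congestion]
```

This assumes one air conditioning device per measurement area, as in the embodiment where the devices D1 to D4 correspond to the measurement areas MA1 to MA4.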
As can be seen from the above-described explanation, the CPU 14p displays on the monitor 16 the map image representing the state in which the plane FS1 is outlined or overlooked (S21), and accepts on the displayed map image the drag manipulation of designating one or at least two rectangular areas from the manipulator (S25). The camera 12 has the imaging surface capturing the plane FS1, and repeatedly outputs the scene image, i.e., the camera image. The CPU 14p takes the camera image outputted from the camera 12 (S1), and allocates one or at least two measurement areas respectively corresponding to one or at least two rectangular areas designated by the drag manipulation to the camera image (S27 to S31, S35 to S43). Moreover, the CPU 14p executes the process of measuring the congestion degree of one or at least two dynamic objects existing on the plane FS1 at each measurement area allocated to the camera image, based on the camera image (S51 to S69).
The drag manipulation of designating the rectangular area is accepted on the map image representing the state in which the plane FS1 is outlined or overlooked. The measurement area used for the congestion degree measurement referring to the camera image is allocated to the camera image so as to correspond to the rectangular area designated by the drag manipulation. Thereby, the work burden of the initial setting can be reduced.
It is noted that, in this embodiment, it is assumed that the planar projective transformation is performed between the X-Y coordinate system defining the plane FS1 and the U-V coordinate system defining the camera image. However, in a case where the congestion degree of humans working in an office in which a large number of desks and chairs are placed is measured, the measuring accuracy is likely to be improved by focusing on movement at desk height rather than movement at foot level.
In this case, the calibration parameters P11 to P33 applied to Equation 1 are adjusted to values that take into consideration an offset equivalent to the height of the desk (an offset in the direction of a Z axis orthogonal to the X axis and the Y axis). If the planar projective transformation is executed by referring to the calibration parameters P11 to P33 having the thus-adjusted values, then a measurement area corresponding to a rectangular area indicated by a bold line in
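Under the usual pinhole-camera model, such adjusted parameters can be derived from a full 3x4 projection matrix: the homography for the plane Z = h keeps the first two columns and replaces the third with h times the Z column plus the translation column. This derivation and the matrix below are illustrative assumptions, not calibration data from the embodiment.

```python
import numpy as np

def homography_for_height(M, h):
    """Given a 3x4 camera projection matrix M, build the 3x3 homography
    (calibration parameters P11..P33) that maps (X, Y) on the plane Z = h
    to image coordinates: [u, v, 1]^T ~ M [X, Y, h, 1]^T."""
    M = np.asarray(M, dtype=float)
    # Columns m1, m2 stay; the plane constraint Z = h folds the Z column
    # into the translation column.
    return np.column_stack((M[:, 0], M[:, 1], h * M[:, 2] + M[:, 3]))
```

Setting h to the desk height yields the adjusted parameters; h = 0 recovers the floor-plane homography.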
Moreover, in this embodiment, the rectangular area is designated by the drag manipulation of the mouse pointer. However, instead thereof, the rectangular area may be designated by input of a numerical value indicating the U-V coordinates.
Moreover, in this embodiment, the plane FS1 is captured from the diagonally upper position by the camera 12 installed at an upper portion of the wall surface. However, instead thereof, the plane FS1 may be captured from directly above by an omnidirectional camera set on the ceiling.
Furthermore, in this embodiment, it is assumed that the output of the air conditioning device is adaptively controlled. However, instead of the output of the air conditioning device, or together with it, the output of an illuminating device (i.e., brightness) may be adaptively controlled.
Moreover, in this embodiment, the planar projective transformation referring to Equation 1 is assumed; however, a perspective projective transformation may be performed instead.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2010-14238 | Jan 2010 | JP | national |