Embodiments described herein relate generally to an energy management system.
Conventionally, a target space such as a floor, a living room, a private room, an office, a staircase, or a landing (to be referred to as a target area hereinafter) is divided into a plurality of areas, and air conditioning, illumination, and the like are controlled in accordance with the presence/absence of persons in each area. The presence/absence of persons is detected by, for example, a motion sensor.
In general, according to one embodiment, an energy management system for controlling an electrical apparatus installed in a target area, includes an image sensor and an energy management server.
The image sensor senses the target area, acquires, from a sensed image of the target area, human information representing a state of a person in the target area and environmental information concerning an environment of the target area for each of a plurality of divided areas obtained by dividing the target area, and outputs the human information and the environmental information. The energy management server is connected to the image sensor via a communication network. The energy management server executes task-ambient control for the electrical apparatus based on the human information and the environmental information for each of the divided areas output from the image sensor.
Embodiments will now be described with reference to the accompanying drawings.
A target floor 1 includes, for example, six divided areas E01 to E06. The target floor 1 is provided with a task illumination 2a and task air conditioning 2b for controlling illumination and air conditioning in a working area where a person performs necessary work (operations), and an ambient illumination 3a and ambient air conditioning 3b for controlling illumination and air conditioning in an aisle through which persons walk or in a non-working area that requires only minimum brightness and air conditioning. The task illumination 2a, the task air conditioning 2b, the ambient illumination 3a, and the ambient air conditioning 3b are examples of electrical apparatuses according to this embodiment.
Note that as for the task illumination 2a, an illuminator is installed in each of divided areas E01 to E06 or for each desk arranged in each divided area. As for the task air conditioning 2b, an air conditioner including an air outlet is installed in the ceiling or floor of each of divided areas E01 to E06.
In addition, for example, the ceiling of the target floor 1 or required portions of the floor are provided with one or a plurality of image sensors 4 for sensing the whole target floor 1 or each of the plurality of divided areas. An example in which one image sensor 4 is provided will be explained below for descriptive convenience.
When the target floor 1 is divided into, for example, the six areas E01 to E06, and the states of persons in each of divided areas E01 to E06 are detected from images sensed by the image sensor 4, the energy management system according to this embodiment controls task-ambient illumination and air conditioning based on the number of persons in each divided area, the states of the persons, and the like.
The states of persons are represented by information such as area E01=2 walking persons, area E02=2 standing persons and 1 seated person, area E03=0 persons, area E04=1 walking person, area E05=0 persons, and area E06=0 persons. Such information will generically be referred to as human information.
For example, since persons are present in both divided areas E01 and E02, the ambient illumination 3a and ambient air conditioning 3b are turned on. In divided area E02, since a person is working at the desk, the task illumination 2a and task air conditioning 2b are turned on.
On the other hand, divided areas E03 and E04 include only one person, who is walking in divided area E04. For this reason, only the ambient illumination 3a and ambient air conditioning 3b are turned on, thereby implementing energy saving control. Neither of divided areas E05 and E06 has persons. Hence, the ambient illumination 3a and ambient air conditioning 3b are turned off to implement further energy saving control.
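Purely as a non-limiting illustration of the per-area on/off rule described above, the decision can be sketched as follows in Python. The AreaState structure and the decide_task_ambient function are hypothetical names introduced only for this sketch; they are not part of the embodiment.

```python
# Minimal sketch, assuming the per-area human information is a count of
# walking, standing, and seated persons. Names and rule form are illustrative.
from dataclasses import dataclass

@dataclass
class AreaState:
    walking: int = 0
    standing: int = 0
    seated: int = 0

def decide_task_ambient(state: AreaState) -> dict:
    occupied = (state.walking + state.standing + state.seated) > 0
    working = state.seated > 0  # a person working at a desk
    return {
        "ambient_illumination": occupied,
        "ambient_air_conditioning": occupied,
        "task_illumination": working,
        "task_air_conditioning": working,
    }

# Example corresponding to the scenario described above:
areas = {
    "E01": AreaState(walking=2),
    "E02": AreaState(standing=2, seated=1),
    "E03": AreaState(),
    "E04": AreaState(walking=1),
    "E05": AreaState(),
    "E06": AreaState(),
}
for name, st in areas.items():
    print(name, decide_task_ambient(st))
```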
When the illuminance in each of divided areas E01 to E06 can be acquired, the illuminance information is also taken into consideration to implement energy saving control. As for the illumination in each divided area, for example, dimming control is executed in consideration of outside light, thereby implementing energy saving control.
Air conditioning can likewise be controlled in consideration of insolation.
The energy management system includes the image sensor 4 installed in the target floor 1, an image management system 5, and an energy management server 6. The image sensor 4 is connected to the image management system 5 and the energy management server 6 via a communication network 7 such as a LAN, a WAN, or a wireless LAN.
The image sensor 4 has a function of sensing the target floor 1 in a wide visual field, and obtaining, from a plurality of sensed frame images, a person state (human information) and illuminance information in each predetermined divided area of the target floor 1. Details will be described later. Note that information (outside light amount, and the like) concerning the environment of the target floor 1 will generically be referred to as environmental information.
The image management system 5 comprises an image management server 51 and an image-associated data accumulation database 52 for accumulating data associated with images. The image management server 51 has a function of receiving, out of the information sent from the image sensor 4, necessary information, for example, information concerning the security of the target floor 1 or information requested by a user, and accumulating it in the image-associated data accumulation database 52 together with time data.
The image management server 51 also has a function of collecting, based on an information request instruction from an input unit (not shown) such as a keyboard or a mouse, process data such as image information from the image sensor 4 in a necessary time period and human information in each divided area, and displaying the collected data.
The image-associated data accumulation database 52 accumulates image information acquired by a plurality of image sensors 4 at, for example, almost the same time, together with the human information and illuminance information associated with the image information. The image-associated data accumulation database 52 thus accumulates the information necessary for the image management server 51 to integrate the images and image-associated information acquired by the plurality of image sensors 4, to maintain the security level, or to edit the accumulated information so that it can be visually checked in response to a user request.
The energy management server 6 includes a building maintenance unit 61, an illumination controller 62, and an air-conditioning controller 63. Based on information sent from the image sensor 4, the building maintenance unit 61 determines task-ambient control concerning illumination and air conditioning in each divided area in accordance with a predetermined control rule (for example, an IF-THEN rule) or a building maintenance program that meets user demands.
The illumination controller 62 controls the task illumination 2a and the ambient illumination 3a in accordance with a task-ambient control instruction sent from the building maintenance unit 61 concerning illumination in each divided area.
The air-conditioning controller 63 controls the task air conditioning 2b and the ambient air conditioning 3b in accordance with a task-ambient control instruction sent from the building maintenance unit 61 concerning air conditioning in each divided area.
The energy management server 6 is also provided with a monitoring display unit 64 and an input unit 65 such as a keyboard or a mouse to input necessary control instructions.
The image sensor 4 includes an image sensing unit 41, an image processing unit 42, a storage device 43, and a communication unit 44 that sends predetermined output information.
As shown in the drawing, the storage device 43 stores sensed frame image data and other data, and the communication unit 44 sends predetermined output information.
Note that the storage device 43 comprises a frame image storage unit 43a, a divided area data storage unit 43b, a setting data storage unit 43c, and a process data storage unit 43d.
The divided area data storage unit 43b stores divided area data determined by the relationship between the task illumination 2a and task air conditioning 2b, the ambient illumination 3a and ambient air conditioning 3b, and a work (operation) area installed in the target floor 1. The divided area data is, for example, the data shown in the corresponding drawing.
The setting data storage unit 43c stores setting data such as an illuminance conversion formula. The process data storage unit 43d stores data necessary for image processing.
The image sensing unit 41 obtains two-dimensional image data in the target floor 1. As the image sensing unit 41, for example, a visible-light camera (for example, a CCD camera) or an infrared camera including a wide-angle lens whose angle of view is, for example, about 180° is used. Note that using an infrared camera to acquire a thermal image makes it possible to also obtain a heat distribution.
The image processing unit 42 comprises an image information acquisition unit 421, a motion distribution extraction unit 422, a first reflection unit 423, a human information acquisition unit 424, a luminance distribution extraction unit 425, a second reflection unit 426, an illuminance information acquisition unit 427, and an output unit 428.
The image information acquisition unit 421 performs preprocessing (for example, filter processing or digital image conversion processing for analog image data) of time-series frame images sensed by the image sensing unit 41 to acquire image information as a desired frame image and stores it in the frame image storage unit 43a.
The motion distribution extraction unit 422 extracts cumulative difference image information representing motion in the video from two temporally continuous frame images stored in the frame image storage unit 43a. That is, the motion distribution extraction unit 422 acquires difference image information between frames based on two pieces of time-series image information. The motion distribution extraction unit 422 binarizes the acquired difference image information based on a predetermined threshold, and accumulates a plurality of pieces of binarized difference image information, thereby extracting cumulative difference image information of a person.
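As a rough sketch of this extraction step, assuming 8-bit grayscale frames held as NumPy arrays, the binarize-and-accumulate processing might look like the following; the function name and threshold value are illustrative assumptions, not details of unit 422.

```python
import numpy as np

def cumulative_difference(frames, threshold=20):
    """Binarize inter-frame differences and accumulate them over the sequence."""
    acc = np.zeros(frames[0].shape, dtype=np.uint16)
    for prev, curr in zip(frames, frames[1:]):
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        acc += (diff > threshold).astype(np.uint16)  # 1 where motion exceeds the threshold
    return acc  # larger values indicate pixels that moved across many frame pairs
```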
The first reflection unit 423 determines the state (for example, standing still, seated, or walking) of each person at least from the cumulative difference image information obtained from the time-series image information sensed by the image sensing unit 41. The first reflection unit 423 obtains the positional coordinates of a person in, for example, a standing-still, seated, or walking state from, for example, a position serving as a base point corresponding to x=0 and y=0 of divided area E01 appearing in the cumulative difference image information, the image sensing magnification, and the numbers of pixels in the x- and y-directions of the cumulative difference image information. After that, the first reflection unit 423 reflects the person state on divided areas E01 to E09 by referring to the divided area data shown in the corresponding drawing.
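The reflection onto divided areas can likewise be sketched under the assumption that the divided area data holds one pixel-coordinate rectangle per area; this rectangle format and the helper name are assumptions, since the stored format of the divided area data is not detailed here.

```python
def assign_divided_area(x, y, divided_areas):
    """divided_areas: {'E01': (x_min, y_min, x_max, y_max), ...} in pixel coordinates."""
    for area_id, (x0, y0, x1, y1) in divided_areas.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return area_id
    return None  # the point lies outside every divided area
```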
The human information acquisition unit 424 has a function of storing the person state in the process data storage unit 43d as process data for each of divided areas E01 to E09 based on the person state (for example, standing still, seated, or walking) reflected on each of divided areas E01 to E09.
The luminance distribution extraction unit 425 has a function of extracting a luminance distribution from information about brightness appearing in a frame image acquired by the image information acquisition unit 421.
The second reflection unit 426 refers to the already determined divided area data and reflects the luminance distribution extracted by the luminance distribution extraction unit 425 on each of divided areas E01 to E09.
The illuminance information acquisition unit 427 converts the luminance distribution information into an illuminance in accordance with illuminance conversion formula data set in the setting data storage unit 43c and stores the illuminance of each of divided areas E01 to E09 in the process data storage unit 43d.
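Assuming, for the sake of illustration, that the conversion formula stored in the setting data storage unit 43c is a simple linear mapping, the conversion can be sketched as follows; the gain and offset values are placeholders, not disclosed coefficients.

```python
def luminance_to_illuminance(mean_luminance, gain=2.0, offset=0.0):
    """Convert a per-area mean luminance into an illuminance (lux) by a linear formula."""
    return gain * mean_luminance + offset

area_luminance = {"E01": 450.0, "E02": 450.0, "E04": 250.0}  # example values
area_illuminance = {a: luminance_to_illuminance(v) for a, v in area_luminance.items()}
print(area_illuminance)  # {'E01': 900.0, 'E02': 900.0, 'E04': 500.0}
```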
The output unit 428 outputs a combination of human information and illuminance information of each of divided areas E01 to E09 as output information.
The communication unit 44 sends, to the communication network 7 in accordance with a communication protocol, the information for each divided area output from the output unit 428, or reads out time-series frame images or process data from the process data storage unit 43d in response to a request instruction from the image management system 5 or the like and sends the readout information.
The operation of the above-described energy management system will be explained next with reference to the drawings.
The image sensing unit 41 of the image sensor 4 installed at a required portion of the target floor 1 senses the target floor 1 at a predetermined time interval (frame rate), extracts time-series frame images, and sends them to the image information acquisition unit 421 of the image processing unit 42.
The image information acquisition unit 421 executes preprocessing such as filter processing for removing general noise components, thereby acquiring image information (frame image) (1). This image information is stored in the frame image storage unit 43a, as described above.
After that, the image processing unit 42 executes the processing of the motion distribution extraction unit 422. The motion distribution extraction unit 422 acquires difference image information from two pieces of temporally continuous frame image information, and binarizes the acquired difference image information based on a predetermined threshold. The motion distribution extraction unit 422 accumulates a plurality of pieces of binarized difference image information, thereby extracting cumulative difference image information (2) representing motion in the video.
More specifically, if a person remains standing without moving, the motion distribution extraction unit 422 extracts cumulative difference image information (2) having, for example, a small circular portion corresponding to the head. If a person is sitting at a desk, the motion distribution extraction unit 422 extracts cumulative difference image information (2) having a small elliptical portion with little cumulative difference, covering the shoulders and arms as well as the head of the person. If a person is walking or running, the motion distribution extraction unit 422 extracts cumulative difference image information (2) having a large elliptical portion with a large area and a cumulative difference that trails like an afterimage. The cumulative difference image information (2) is sent to the first reflection unit 423.
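The qualitative cues above (size of the blob and how much motion accumulates in it) can be turned into a toy classifier as follows; the numeric thresholds are arbitrary placeholders and not values taken from the embodiment.

```python
def classify_state(blob_area_px, accumulated_motion):
    """Classify a person's state from a cumulative-difference blob (illustrative only)."""
    if blob_area_px > 5000 and accumulated_motion > 1000:
        return "walking"   # large elliptical region trailing an afterimage
    if blob_area_px > 1500:
        return "seated"    # head, shoulders, and arms with little accumulated motion
    return "standing"      # small, roughly circular head region
```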
The first reflection unit 423 acquires human area reflection information (3) by reflecting each person according to a behavior pattern on a corresponding divided area based on the positional coordinates of each person obtained from the cumulative difference image information (2) and divided areas E01 to E09 stored in the divided area data storage unit 43b, and sends the human area reflection information (3) to the human information acquisition unit 424.
The human information acquisition unit 424 acquires, from the image of the human pattern shown in (2) reflected on divided areas E01 to E09, human information (4) representing that, for example, there is one walking person in divided area E02, one person sitting at a desk in divided area E05, and one standing person in divided area E08, with no person in the remaining divided areas. The human information acquisition unit 424 stores the human information in the process data storage unit 43d and also sends it to the output unit 428.
On the other hand, the luminance distribution extraction unit 425 extracts luminance distribution information (5) from information about brightness appearing in the frame image acquired by the image information acquisition unit 421. The luminance distribution extraction unit 425 causes the second reflection unit 426 to reflect the extracted luminance distribution information (5) on divided areas E01 to E09, thereby generating luminance area reflection information (6).
Based on the thus generated luminance area reflection information (6), the illuminance information acquisition unit 427 converts the luminance into illuminance information (7) for each of divided areas E01 to E09 using a general luminance-illuminance conversion formula (conversion formula) stored in the setting data storage unit 43c, stores the illuminance information in the process data storage unit 43d, and also sends it to the output unit 428.
The output unit 428 creates output information in accordance with a predetermined divided area order and sends it to the communication network 7 via the communication unit 44. The output information represents, for example, divided area E01: 0 persons, illuminance 900 lux; divided area E02: 1 walking person, illuminance 900 lux; divided area E03: 0 persons, illuminance 900 lux; divided area E04: 0 persons, illuminance 500 lux; divided area E05: 1 seated person, illuminance 500 lux; . . . .
The output unit 428 creates the output information based on the human information and illuminance of each divided area acquired by the human information acquisition unit 424 and the illuminance information acquisition unit 427 or by reading out the human information and illuminance of each divided area temporarily stored in the process data storage unit 43d.
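A minimal sketch of assembling this output information in a fixed divided-area order is shown below; the record layout (a list of dictionaries) is assumed for illustration and does not reflect the actual message format sent on the communication network 7.

```python
AREA_ORDER = [f"E{i:02d}" for i in range(1, 10)]  # E01 .. E09

def build_output(human_info, illuminance_info):
    """human_info: {'E02': {'walking': 1}, ...}; illuminance_info: {'E02': 900, ...}"""
    return [
        {
            "area": area,
            "persons": human_info.get(area, {}),
            "illuminance_lux": illuminance_info.get(area),
        }
        for area in AREA_ORDER
    ]
```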
Note that at this time, the output information may be sent with, for example, the time data of the continuous frame images received from the image sensing unit 41, or the time data of the subsequent continuous frame images, added at its start.
Alternatively, the output information may be sent in the form shown in the corresponding drawing.
The output information sent from the communication unit 44 to the communication network 7 is sent to the energy management server 6.
When the building maintenance unit 61 receives the output information, the energy management server 6 determines, in accordance with, for example, an IF-THEN rule serving as a control rule, that a person is sitting and doing an operation in divided area E05. The energy management server 6 sends, to the air-conditioning controller 63, a control instruction to turn on the task air conditioning 2b corresponding to divided area E05, thereby turning on the task air conditioning 2b.
Upon determining that divided area E05 is dark because the illuminance is 500 lux, the energy management server 6 sends, to the illumination controller 62, a control instruction to increase the illuminance of the task illumination 2a, or to turn on the task illumination 2a in the peripheral divided area E08, thereby controlling the task illumination 2a.
Since only one person is walking in divided area E02, the energy management server 6 sends control instructions to turn on the ambient illumination 3a and ambient air conditioning 3b to the illumination controller 62 and the air-conditioning controller 63, respectively, thereby controlling the ambient illumination 3a and ambient air conditioning 3b.
That is, a rule is formed from the human behavior and illuminance conditions in divided areas E01 to E09, and the task illumination 2a, the task air conditioning 2b, the ambient illumination 3a, and the ambient air conditioning 3b are controlled in accordance with the rule, thereby implementing energy saving.
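As a sketch of how such an IF-THEN style rule might be evaluated for one divided area, the following function returns the control instructions implied by the scenario above (a seated person turns on the task equipment, low illuminance raises the task illumination, a walking person turns on the ambient equipment); the record layout and the threshold are assumptions only, not the actual control program of the building maintenance unit 61.

```python
def apply_rules(area, record, dark_threshold_lux=600):
    """Return (controller, instruction) pairs for one divided area (illustrative rule set)."""
    actions = []
    persons = record["persons"]
    lux = record["illuminance_lux"]
    if persons.get("seated", 0) > 0:
        actions.append(("air-conditioning controller 63", f"turn on task air conditioning in {area}"))
        if lux is not None and lux < dark_threshold_lux:
            actions.append(("illumination controller 62", f"raise task illumination in {area}"))
    if persons.get("walking", 0) > 0:
        actions.append(("illumination controller 62", f"turn on ambient illumination in {area}"))
        actions.append(("air-conditioning controller 63", f"turn on ambient air conditioning in {area}"))
    return actions
```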
Hence, according to the above-described embodiment, the target floor 1 is finely divided in advance. The state (for example, walking, seated, or standing) of each person obtained from the image sensor 4 is determined. The person state is reflected on each of the finely divided areas, and the output information is sent to the energy management server 6. Consequently, the energy management server 6 can finely control the task illumination 2a and task air conditioning 2b and the ambient illumination 3a and ambient air conditioning 3b in accordance with a predetermined rule while considering the state of each person and, for example, the illuminance information in each divided area.
As described above, a target floor 1 is divided into nine areas E01 to E09 based on the relationship between task illumination 2a and task air conditioning 2b, ambient illumination 3a and ambient air conditioning 3b, and a working (operation) area. The nine divided areas E01 to E09 of the frame image sensed by an image sensing unit 41 correspond to respective areas of a floor map representing the target floor 1 in the real space.
Hence, providing a conversion table between the frame image and the floor map makes it possible to convert the positional coordinates of a person on the frame image into a position on the floor map in the real space.
In the second embodiment, a map conversion table corresponding to a person state is used to convert the positional coordinates of a person on a frame image into an accurate position on the floor map in the real space. For example, a walking person conversion table 43e1 and a seated person conversion table 43e2 are prepared in a storage device 43.
Based on the human area reflection information (3) obtained by a motion distribution extraction unit 422 and a first reflection unit 423, a map position acquisition unit 429 or a human information acquisition unit 424 selects the walking person conversion table 43e1 if the person is a walking person, and selects the seated person conversion table 43e2 if the person is a seated person. The map position acquisition unit 429 or the human information acquisition unit 424 then determines the position of each person on the floor map.
The walking person conversion table 43e1 is a conversion table that defines positional coordinate data only on an aisle 11 where a walking person passes, as schematically shown in the corresponding drawing.
On the other hand, the seated person conversion table 43e2 is a conversion table that defines positional coordinate data in each desk group at which a seated person sits, as schematically shown in the corresponding drawing.
An example of processing of causing the map position acquisition unit 429 or the human information acquisition unit 424 in an image processing unit 42 to specify the position of a person on the map will be described next with reference to the flowchart.
First, based on the cumulative difference image information (2) with a video motion obtained by the motion distribution extraction unit 422 or the human area reflection information (3) reflected by the first reflection unit 423, the map position acquisition unit 429 or the human information acquisition unit 424 determines whether a person is present in any of divided areas E01 to E09 (step S1). Upon determining that a person is present, the map position acquisition unit 429 or the human information acquisition unit 424 determines whether the person is a walking person (step S2).
If the person is a walking person, the map position acquisition unit 429 or the human information acquisition unit 424 selects the walking person conversion table 43e1 from the storage device 43 (step S3), and then compares the positional coordinates of the walking person already specified by the first reflection unit 423 with the positional coordinates of the aisle 11 defined in the walking person conversion table 43e1. The map position acquisition unit 429 or the human information acquisition unit 424 determines the position on the aisle on the map from the numbers of pixels in the x- and y-directions corresponding to the difference between the two sets of positional coordinates, that is, from {positional coordinates of the walking person ± (numbers of pixels in the x- and y-directions × length of one pixel unit)} (step S4), and stores the position in a process data storage unit 43d (step S5).
On the other hand, upon determining in step S2 that the person is not a walking person, the map position acquisition unit 429 or the human information acquisition unit 424 determines whether the person is a seated person (including a standing person) (step S6). Upon determining that the person is a seated person (including a standing person), the map position acquisition unit 429 or the human information acquisition unit 424 selects the seated person conversion table 43e2 from the storage device 43 (step S7).
The map position acquisition unit 429 or the human information acquisition unit 424 compares the positional coordinates of the seated person already specified by the first reflection unit 423 with the positional coordinates of the desk 12 defined in the seated person conversion table 43e2. The map position acquisition unit 429 or the human information acquisition unit 424 determines the position of the desk 12 on the map from the numbers of pixels in the x- and y-directions corresponding to the difference between the positional coordinates of the seated person and the positional coordinates of the desk 12 (step S8), and stores the position of the desk in the process data storage unit 43d (step S5).
Subsequently, the map position acquisition unit 429 or the human information acquisition unit 424 determines whether another person is present in the same or another divided area on the image (step S9). If another person is present, the map position acquisition unit 429 or the human information acquisition unit 424 returns the process to step S2 to repetitively execute the series of processes. If no other person is present, the map position acquisition unit 429 or the human information acquisition unit 424 determines whether to continue the processing (step S10). To continue the processing, the process returns to step S1.
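Under the assumption that each conversion table is a list of entries holding both pixel coordinates and real-space map coordinates, steps S1 to S8 can be sketched as follows; the entry fields, the pixel-length constant, and the nearest-entry matching are illustrative assumptions rather than the disclosed table format.

```python
PIXEL_LENGTH_M = 0.05  # assumed real-space length of one pixel unit (placeholder)

def nearest_map_position(person_xy_px, table):
    """Match the person to the closest table entry and refine by the pixel offset."""
    px, py = person_xy_px
    best = min(table, key=lambda e: abs(e["x_px"] - px) + abs(e["y_px"] - py))
    dx_px, dy_px = px - best["x_px"], py - best["y_px"]
    # position on the map = tabulated position +/- (pixel difference x length of one pixel unit)
    return (best["map_x_m"] + dx_px * PIXEL_LENGTH_M,
            best["map_y_m"] + dy_px * PIXEL_LENGTH_M)

def locate_person(person_state, person_xy_px, walking_table_43e1, seated_table_43e2):
    """Select the conversion table by person state (steps S2, S3, S6, S7) and locate (S4, S8)."""
    table = walking_table_43e1 if person_state == "walking" else seated_table_43e2
    return nearest_map_position(person_xy_px, table)
```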
Hence, according to the above-described embodiment, the map position of a person on the screen is specified, and, for example, the positional information of the person in the real space is sent in addition to the human information and illuminance information of each of the above-described divided areas E01 to E09. Hence, for example, an energy management server 6 can specify, out of the task air conditioning 2b and the ambient air conditioning 3b arranged on the floor map provided in advance, the task air conditioning 2b or ambient air conditioning 3b whose air outlet lies closest to the position of the person. Executing control based on this result allows air-conditioning control to be performed efficiently.
In this embodiment, a plurality of conversion tables having different unit granularities are prepared. As in the second embodiment, a map position acquisition unit 429 selects a conversion table based on the person state and determines the position in the real space.
More specifically, when an image sensing unit 41 senses a target floor 1, the cumulative difference of a walking person in the image is large, and the position can be identified in a large area. On the other hand, the cumulative difference of a seated person in the image is small, and the position can be identified in a small area.
To do this, the unit granularity of the map conversion table is changed depending on the person state. For example, the unit granularity of a desk map conversion table 43e2′ (corresponding to the seated person conversion table 43e2 in the second embodiment) is made finer than that of an aisle person map conversion table 43e1′ (corresponding to the walking person conversion table 43e1).
The aisle person map conversion table 43e1′ schematically defines positional coordinate data corresponding to an aisle 11 where a walking person passes. From the positional coordinates of a walking person on the image and the positional coordinate data of each aisle 11, the position of the aisle 11 on the map can be specified.
From the positional coordinates of a seated person and the positional coordinates of each group of a plurality of desks, the desk map conversion table 43e2′ can specify the position of the desk on the map at which the seated person sits.
As in the second embodiment, upon determining from image information that the person state indicates a walking person, the map position acquisition unit 429 selects the aisle person map conversion table 43e1′. If a person is sitting, the map position acquisition unit 429 selects the desk map conversion table 43e2′. The map position acquisition unit 429 specifies the position of the person in the floor map from the positional coordinates of the walking person or seated person on the image and the positional coordinate data described in the conversion table 43e1′ or 43e2′.
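The granularity difference can be illustrated by mapping a real-space position to a grid cell whose size depends on the person state; the cell sizes below are placeholders, since the embodiment does not give numerical granularities.

```python
def map_cell(person_state, x_m, y_m):
    """Return the grid cell containing (x_m, y_m); seated persons use a finer grid."""
    cell_m = 2.0 if person_state == "walking" else 0.5  # assumed coarse/fine granularities
    return (int(x_m // cell_m), int(y_m // cell_m), cell_m)
```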
The building maintenance unit 61 of an energy management server 6 can likewise specify, out of the task air conditioning 2b and ambient air conditioning 3b arranged on the floor map, the task air conditioning 2b or ambient air conditioning 3b whose air outlet lies closest to the position of the person detected on the screen. Executing control based on this result allows air-conditioning control to be performed efficiently. In addition, since the position of a seated person can be specified more accurately than the position of a walking person, efficient task-ambient control can be performed.
In this embodiment, a storage device 43 stores, in advance, a heat value management table 43g that takes into consideration the attribute of each heat generation target (for example, aisle, desk, PC, display, or printer) including the person state (walking, seated, or standing), as shown in the corresponding drawing.
In each of divided areas E01 to E09, the total heat value of the heat generation targets present in the area is obtained in accordance with the heat value management table 43g.
The output unit 428 sends the total heat value of each of divided areas E01 to E09 to an energy management server 6 together with or separately from human information acquired by a human information acquisition unit 424. A building maintenance unit 61 of the energy management server 6 can correct temperature control of task air conditioning 2b and ambient air conditioning 3b in consideration of the total heat value of each of divided areas E01 to E09 so as to comfortably and efficiently execute air-conditioning control.
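A sketch of totaling the heat value of one divided area from such a table is given below; the wattage figures are placeholders for illustration, since the actual values are held in the heat value management table 43g.

```python
HEAT_VALUE_TABLE_W = {  # assumed heat value per heat generation target (placeholders)
    "walking": 150, "standing": 120, "seated": 100,
    "PC": 80, "display": 40, "printer": 30,
}

def total_heat_value(detected_targets):
    """detected_targets: e.g. {'seated': 2, 'PC': 2, 'display': 2} for one divided area."""
    return sum(HEAT_VALUE_TABLE_W.get(target, 0) * count
               for target, count in detected_targets.items())

print(total_heat_value({"seated": 2, "PC": 2, "display": 2}))  # 440 W in this sketch
```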
In the first embodiment, the luminance obtained in each of divided areas E01 to E09 is converted into an illuminance in accordance with an illuminance conversion formula stored in the setting data storage unit 43c. In the fifth embodiment, for example, the illuminance level corresponding to the luminance (brightness) of the image obtained by an image information acquisition unit 421 is checked for each of divided areas E01 to E09 in consideration of the layout of desks and OA equipment, and a storage device 43 stores a correspondence conversion table 43h for obtaining an optimum illuminance for each of divided areas E01 to E09.
An illuminance information acquisition unit 427 converts the average luminance obtained from the image of each of divided areas E01 to E09 acquired by a second reflection unit 426 into an illuminance of each of divided areas E01 to E09 in accordance with a conversion ratio defined in the correspondence conversion table 43h.
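Assuming the correspondence conversion table 43h simply holds one conversion ratio per divided area, the per-area conversion can be sketched as below; the ratios are placeholders only.

```python
CONVERSION_RATIO_43H = {"E01": 2.0, "E02": 1.8, "E03": 2.2}  # illustrative ratios per area

def area_illuminance(area, mean_luminance):
    """Convert a per-area mean luminance to an illuminance using the area's own ratio."""
    return CONVERSION_RATIO_43H.get(area, 2.0) * mean_luminance
```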
In this embodiment, for example, an original image sensed by an image sensor 4 or an inter-frame difference based image created by a motion distribution extraction unit 422 is stored in a process data storage unit 43d. Alternatively, a bird's-eye view (floor map) 43i of a target floor 1 is stored in a setting data storage unit 43c or the process data storage unit 43d in advance.
The detected states of persons are reflected on the original image sensed by the image sensor 4, the inter-frame difference based image, or the bird's-eye view 43i, and the resulting image is sent to the energy management server 6.
The above-described arrangement makes it possible to immediately grasp the states of persons in the target floor 1 by displaying them on, for example, a display unit 64 of the energy management server 6.
When the sensing ranges of a plurality of image sensors 4-1 and 4-2 overlap, different pieces of human information may be output from the image sensors 4-1 and 4-2 for the same area. In that case, the output information needs to be processed based on a predetermined rule.
As the output information processing rule, for example, output information acquired from an image sensed at the latest time out of the plurality of image sensors 4-1 and 4-2 may be employed. Alternatively, output information related to an image sensor, for example, the image sensor 4-1 on a side closer to the event occurrence position of a person (for example, the seated position of a person) may be given higher priority. If an illuminance value is included in output information, output information in which one of the maximum value, minimum value, and average value of the illuminance satisfies a predetermined standard may be given higher priority.
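The priority rules listed above can be sketched as a small selection function; the field names, the distance helper, and the assumption that each output carries its sensor position and timestamp are illustrative only.

```python
def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def select_output(outputs, event_position=None):
    """outputs: list of dicts with 'timestamp', 'sensor_position', and the sensed data."""
    if event_position is not None:
        # prefer the sensor closer to the position where the event (e.g. seating) occurred
        return min(outputs, key=lambda o: distance(o["sensor_position"], event_position))
    # otherwise prefer the output sensed at the latest time
    return max(outputs, key=lambda o: o["timestamp"])
```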
According to the above-described embodiment, when the sensing ranges of the plurality of image sensors 4-1 and 4-2 overlap and different pieces of output information are obtained for the overlap range, optimum information is preferentially employed in accordance with a predetermined rule. Using the optimum information for task-ambient control makes it possible to implement safer control.
In the above-described embodiments, walking, seated, or standing is determined as the state of a person. Instead, gender detection may be done based on the color and pattern of clothing, or the temperature measured by a thermometer in each divided area may be used. These pieces of information may be output as output information for each divided area.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application is a Continuation Application of PCT Application No. PCT/JP2012/062188, filed May 11, 2012 and based upon and claiming the benefit of priority from Japanese Patent Application No. 2011-108483, filed May 13, 2011, the entire contents of all of which are incorporated herein by reference.