The present invention relates to an air conditioner control apparatus, an air conditioner control system, an air conditioner control method, and a program.
Various techniques for providing a comfortable air environment using an air conditioner have been proposed.
For example, an area-specific environment management system described in Patent Document 1 includes a biological information sensor 10, an environment information sensor 20, a wakefulness degree estimation unit 30, and an environment provision unit 40. The wakefulness degree estimation unit 30 estimates, for example, a degree of wakefulness of an individual, based on biological information detected by the biological information sensor 10. The environment provision unit 40 provides an environment in each area, based on a degree of wakefulness and environment information.
Patent Document 1 describes that the invention in Patent Document 1 provides an environment for controlling a degree of wakefulness of an individual and the like, and can perform control in such a way that productivity of the individual and the like reaches a desired state.
Patent Document 1 describes that biological information sensors 10a and 10b are formed of a biological information sensor of a sheet type that is disposed on a chair, and senses a heartbeat, breathing, and a body movement of an individual sitting on the chair. Further, it is described that the biological information sensor 10 may capture a face image and the like of an individual by a camera on a front surface of a display of personal computers 3a and 3b, analyze image data, and detect a blink, a pose, a body movement, and the like.
For example, an environment facility control system described in Patent Document 2 is a system for bringing a degree of wakefulness being a physical/mental state of a user close to a degree of wakefulness being a target by using a plurality of kinds of environment facilities. An environment facility control system 1 mainly includes an air circulating apparatus 10, a ventilation apparatus 20, an aroma diffuser 30, a biological sensor 40, a remote controller 50, and an environment facility control apparatus 100.
The biological sensor 40 described in Patent Document 2 is a sensor for recognizing a degree of wakefulness of a user, and includes an electrocardiographic waveform sensor 41 that detects an electrocardiographic waveform of a user and an expression camera 42 that detects an expression of the user. The expression camera 42 is disposed in a specific position in a room in which an expression of a face of a user can be captured, and detected expression data can be transmitted to surrounding equipment such as the environment facility control apparatus 100 in a wireless manner. The environment facility control apparatus 100 can acquire information from the biological sensor 40 and the remote controller 50, and control the air circulating apparatus 10, the ventilation apparatus 20, and the aroma diffuser 30.
For example, a wakefulness degree control system described in Patent Document 3 determines a degree of wakefulness of a target person for control of a degree of wakefulness, controls a physical amount of a surrounding environment of the target person for the control of the degree of wakefulness according to a determination result, and attempts to maintain or improve the degree of wakefulness. The wakefulness degree control system includes a wakefulness degree control apparatus 100, one or more pieces of environment control equipment 200, one or more pieces of environment measurement equipment 300, and one or more pieces of wakefulness degree estimation equipment 400.
The environment measurement equipment 300 described in Patent Document 3 is equipment that measures a physical amount such as temperature and illumination, and forms numeric data. The wakefulness degree estimation equipment 400 estimates a degree of wakefulness of a target person from biological information and the like, and forms numeric data. It is described that any one or a combination of a body temperature, a moving image of a face, and a pulse wave may be used as biological information.
The wakefulness degree control apparatus 100 described in Patent Document 3 controls the environment control equipment 200 according to a degree of wakefulness of a target person. The wakefulness degree control apparatus 100 includes a surveillance control unit 181, a first acquisition unit 182, a second acquisition unit 183, a set value computation unit 184, and the like.
The surveillance control unit 181 described in Patent Document 3 acquires an equipment set value being set for the environment control equipment 200 from communication with the environment control equipment 200. The first acquisition unit 182 acquires a measured value of a physical amount being measured by the environment measurement equipment 300. The physical amount includes an air temperature such as room temperature. The second acquisition unit 183 performs communication with the wakefulness degree estimation equipment 400, and acquires an estimation value of a degree of wakefulness of a target person. The set value computation unit 184 acquires the equipment set value from the surveillance control unit 181, acquires the measured value of the physical amount from the first acquisition unit 182, acquires the wakefulness degree estimation value from the second acquisition unit 183, and computes an equipment set value, based on the values.
For example, an apparatus of a system described in Patent Document 4 is formed of three units of a detection unit 1, a decision unit 2, and a control unit 3. The detection unit 1 is used for detecting a state of an indoor place or a person, and transmits the information to the decision unit 2. The decision unit 2 decides the state of the person, based on the information received from the detection unit 1. The control unit 3 includes an interface for performing control of equipment, and performs control of the equipment.
It is described that the detection unit 1 includes, as input equipment, a plurality of cameras, microphones, temperature sensors, and humidity sensors, and is formed of an input control unit 11 that processes an input from the input equipment, detection function units 14, 15, 16, and 17 for each piece of the input equipment, and a result collection unit 13 that puts results together in various model databases 12 and puts outputs to the decision unit 2 together. A function of the detection unit 1 includes a pose detection function, a face detection function, an eye portion observation function, and a mouth portion observation function. The decision unit 2 decides, for example, how to control equipment, based on information received from the detection unit 1, and includes a control instruction function and the like.
For example, Patent Document 5 describes that clothing of a user may be determined from a captured video, and an instruction on a temperature adjustment may be provided to air conditioners 10, 10A, and 10B depending on the clothing. For example, it is described that the air conditioners 10, 10A, and 10B are instructed to be turned down when it can be confirmed that a short-sleeved shirt is worn or a lap robe is used during cooling, and the air conditioners 10, 10A, and 10B are instructed to be turned up when a long-sleeved shirt is worn.
For example, Patent Document 6 describes that, when a user U1 performs a predetermined motion (S1-1), a control apparatus 70 determines the motion of the user U1 (S1-2), recognizes that a gesture instruction condition is satisfied (S1-3), and controls a heat exchanger 4 and the like, based on the gesture instruction condition (S1-4), in an indoor unit 100 of an air circulating machine.
For example, an air conditioning system 1 in Patent Document 7 recognizes a change request for air conditioning setting by a gesture input operation of a user, and adjusts ventilation in response to the change request.
In the area-specific environment management system described in Patent Document 1, the biological information sensor 10 is provided for an individual. Thus, it takes time and effort to provide the biological information sensor 10. Particularly, when an environment for controlling a degree of wakefulness of a plurality of persons is to be provided, it may be difficult to provide the biological information sensor 10 for each of the plurality of persons. Further, when a person moves, it may be difficult to provide the biological information sensor 10 for the person. Thus, in the area-specific environment management system described in Patent Document 1, it may be difficult to make an air environment of a space comfortable.
The environment facility control system described in Patent Document 2 requires the expression camera 42 disposed in a specific position in a room in which an expression of a face of a user can be captured. It takes time and effort to install the expression camera 42 in such a specific position. Further, it may be difficult to determine a position in a room in which an expression of a face can be captured, and attach the expression camera 42 in the position.
It is described that, in the wakefulness degree control system described in Patent Document 3, a room temperature as a physical amount and a moving image of a face as biological information are used in order to compute an equipment set value. In a case where an equipment set value is computed by using only a room temperature, it may be difficult to make an air environment of a space comfortable.
Further, Patent Document 3 does not describe a method for acquiring a moving image of a face. Similarly to the expression camera 42 described in Patent Document 2, in a case where a camera disposed in a specific position in a room in which a face of a user can be captured is used in order to acquire a moving image of the face, there is a problem similar to that of the technique described in Patent Document 2.
In the system described in Patent Document 4, a pose, a face, an eye, and a mouth are detected at a single point in time, and a state such as a yawn or opening/closing of an eyelid is detected. However, for example, when equipment is controlled based on a state of a person at that single point in time, there is a risk that the equipment may be controlled based on an accidental state, and, as a result, comfort of the person may be impaired.
In the technique described in Patent Document 5, it is difficult to determine whether a person wearing either a short-sleeved shirt or a long-sleeved shirt during cooling is comfortable. Further, when an air conditioner is instructed to be turned down in a case where a person using a lap robe during cooling is confirmed, there is a risk that comfort of another person may be impaired. In other words, in the technique described in Patent Document 5, there is a possibility that an air environment of a space cannot be made comfortable.
In the techniques described in Patent Documents 6 and 7, control of the heat exchanger 4 and the like and an adjustment of ventilation are performed based on a gesture being a conscious motion of a user. Thus, there is a problem that a person needs to perform a conscious motion in order to acquire a comfortable air environment.
One example of an object of the present invention is, in view of the problem described above, to provide an air conditioner control apparatus, an air conditioner control system, an air conditioner control method, and a program that solve a problem in which it is difficult to provide a comfortable air environment without requiring a conscious action from a person.
One aspect of the present invention provides an air conditioner control apparatus including:
One aspect of the present invention provides an air conditioner control system including:
One aspect of the present invention provides an air conditioner control method including,
One aspect of the present invention provides a program causing a computer to execute:
The present invention can easily provide a comfortable air environment without requiring a conscious action from a person.
Hereinafter, one example embodiment of the present invention will be described by using drawings. Note that, in all of the drawings, a similar component has a similar reference sign, and description thereof will be appropriately omitted.
The air conditioner control apparatus 103 includes a history acquisition unit 104 and a control unit 105.
The history acquisition unit 104 processes image information including an image in which a target space is captured, and acquires appearance history information including a history of appearances of one or more persons P being active in the target space.
The control unit 105 outputs control information for controlling the air conditioner 102a, based on the appearance history information.
The air conditioner control apparatus 103 can easily provide a comfortable air environment without requiring a conscious action from a person. The air conditioner control system 100 can also easily provide a comfortable air environment without requiring a conscious action from a person.
The history acquisition unit 104 processes image information including an image in which a target space is captured, and acquires appearance history information including a history of appearances of one or more persons P being active in the target space (step S101).
The control unit 105 outputs control information for controlling the air conditioner 102a, based on the appearance history information (step S102).
The air conditioner control processing can easily provide a comfortable air environment without requiring a conscious action from a person.
A detailed example of the air conditioner control system 100 according to the present example embodiment will be described below.
The air conditioner control apparatus 103 and each of the capturing apparatuses 101a to 101c and the air conditioners 102a to 102c are connected to each other via a network N. The network N is a communication network constituted in a wired manner, a wireless manner, or a combination of the manners. The air conditioner control apparatus 103 and each of the capturing apparatuses 101a to 101c and the air conditioners 102a to 102c transmit and receive information to and from each other via the network N.
Hereinafter, the “air conditioner control system” is also expressed as a “control system”. Further, the “air conditioner control apparatus” is also expressed as a “control apparatus”.
The target space is a space being a target of capturing using the capturing apparatuses 101a to 101c. The target space may be appropriately determined. Each of the meeting room R and the corridor C according to the present example embodiment is an example of the target space. In other words, in the present example embodiment, one room and a predetermined section of the corridor C are set as the target space.
The target space is not limited to this. One or more target spaces may be set in a part or the whole of an indoor place or an outdoor place such as one or a plurality of buildings, structures, and facilities. The target space is, for example, a part or the whole of each room of a building and a structure, a predetermined section of a corridor, a predetermined section of a stair, a predetermined indoor place near an entrance, a predetermined outdoor place near an entrance, a porch, and the like. Various sizes, shapes, and the like of the target space may be set. When the target space is plural, sizes, shapes, and the like of the target spaces may be different. When the target space is plural, there may be a gap between the target spaces adjacent to each other, or the target spaces may be adjacent to each other without a gap.
Each of the capturing apparatuses 101a to 101c is, for example, a camera. The camera may be a camera that generates transmittable/receivable image information via the network N, and is, for example, an omnidirectional camera.
The capturing apparatuses 101a to 101c capture the meeting room R and a part of the corridor C. Specifically, the capturing apparatuses 101a to 101b respectively capture captured regions TSa to TSb included in the meeting room R. The capturing apparatus 101c captures a captured region TSc included in the corridor C. In other words, the capturing apparatuses 101a to 101c are capturing apparatuses installed in such a way as to be able to capture at least the captured regions TSa to TSc associated with the capturing apparatuses 101a to 101c, respectively.
The captured regions TSa to TSb according to the present example embodiment are two regions acquired by approximately dividing the entire meeting room R into two equal parts. The captured regions TSa to TSb have the same size, and are adjacent to each other without a gap in the meeting room R and without overlapping each other. The captured region TSc is a region of the corridor C adjacent to the meeting room R. In other words, the captured regions TSa to TSb are regions included in the meeting room R as the target space. The captured region TSc is a region included in the corridor C as the target space.
Hereinafter, when the captured regions TSa to TSc are not particularly distinguished, the captured regions TSa to TSc are simply expressed as a “captured region TS”. Note that, one or more captured regions TS may be set in a target space. Various sizes, shapes, and the like of each of the captured regions TS may be set. When the captured region TS is plural, sizes, shapes, and the like of the captured regions TS may be different. When the captured region TS is plural, there may be a gap between the captured regions TS adjacent to each other, the captured regions TS may be adjacent to each other without a gap, or the captured regions TS may partially overlap each other.
The capturing apparatuses 101a to 101c generate image information including images of the captured regions TSa to TSc captured by the capturing apparatuses 101a to 101c, respectively. The image information according to the present example embodiment further includes a capturing time and a region identifier (ID).
The images of the captured regions TSa to TSb are images of the meeting room R. The image of the captured region TSc is an image of the corridor C. In other words, the images captured by the capturing apparatuses 101a to 101b include an image of the meeting room R. The image captured by the capturing apparatus 101c includes an image of the corridor C.
A capturing time is a time at which capturing is performed. The capturing apparatuses 101a to 101c keep time, and include, in the image information, a capturing time being a time at which each of the capturing apparatuses 101a to 101c performs capturing.
A region ID is information for identifying each of the captured regions TSa to TSc. The region ID may be provided according to an appropriate rule. The region IDs of the captured regions TSa to TSc are assumed to be “TSa”, “TSb”, and “TSc”, respectively. The capturing apparatuses 101a to 101c hold, in advance, the region IDs of the captured regions TSa to TSc associated with the capturing apparatuses 101a to 101c, respectively, and include the region ID in the image information.
Each of the capturing apparatuses 101a to 101c transmits the generated image information to the control apparatus 103 via the network N. Specifically, the capturing apparatuses 101a to 101c respectively capture the captured regions TSa to TSc continuously at a predetermined time interval, and generate the image information. The image information continuously generated from each of the capturing apparatuses 101a to 101c constitutes a moving image in which the meeting room R and the corridor C are continuously captured.
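As a minimal sketch of the image information described above, each transmission can be modeled as a record bundling the captured image with its capturing time and region ID. The field names and the function below are illustrative assumptions, not taken from the document:

```python
import datetime

def make_image_information(image_bytes, region_id):
    """Bundle a captured frame with its capturing time and region ID.

    Illustrative sketch: the field names are assumptions, not from the document.
    """
    return {
        "image": image_bytes,  # the captured image itself
        # time at which the capturing apparatus performed capturing
        "capturing_time": datetime.datetime.now().isoformat(),
        "region_id": region_id,  # e.g. "TSa", "TSb", or "TSc"
    }

info = make_image_information(b"<jpeg bytes>", "TSa")
```

In an actual apparatus the capturing time would come from the apparatus's own clock, and the record would be serialized for transmission over the network N.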
Hereinafter, when the capturing apparatuses 101a to 101c are not particularly distinguished, the capturing apparatuses 101a to 101c are simply expressed as a “capturing apparatus 101”. Note that, the capturing apparatus 101 included in the control system 100 may be one or plural, and may be any number. Further, the time interval of capturing may be appropriately determined; when the interval is relatively long, the image information may not constitute a moving image.
Each of the air conditioners 102a to 102c is equipment, an apparatus, a facility, and the like that adjust an air environment of an air-conditioned space.
The air-conditioned space is a space in which each of the air conditioners 102a to 102c adjusts an air environment. The air-conditioned space is typically located in a vicinity (predetermined range) of each of the air conditioners 102a to 102c. The air-conditioned spaces of the air conditioners 102a to 102c according to the present example embodiment are spaces associated with the captured regions TSa to TSc, respectively. In other words, the air conditioners 102a to 102c according to the present example embodiment are associated with the captured regions TSa to TSc, respectively.
Each of the air conditioners 102a to 102c according to the present example embodiment is an air conditioner including each function of cooling, heating, dehumidifying, ventilating, and air cleaning. Each of the air conditioners 102a to 102c acquires control information from the control apparatus 103 via the network N. The air conditioners 102a to 102c adjust temperature, humidity, and the like of air in each air-conditioned space according to the acquired control information.
The control information includes an instruction to the air conditioners 102a to 102c that acquire the control information. The control information includes an instruction according to a function included in the air conditioners 102a to 102c that acquire the control information.
The instruction included in the control information includes, for example, one or a plurality of ON and OFF of a power source, setting or a change of an operation mode, setting or a change of a target temperature, setting or a change of an airflow rate, and setting or a change of a wind direction. The operation mode includes, for example, a mode (i.e., a cooling mode, a heating mode, a dehumidifying mode, a ventilating mode, and an air cleaning mode) that achieves each function of cooling, heating, dehumidifying, ventilating, and air cleaning.
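The instructions enumerated above can be sketched as a control-information record that carries only the settings actually being instructed. This is an illustrative model under assumed field names, not the document's own data format:

```python
def make_control_information(power=None, mode=None, target_temperature=None,
                             airflow_rate=None, wind_direction=None):
    """Build a control-information record containing only the instructions
    that were actually given. All field names are illustrative assumptions.
    """
    instructions = {}
    if power is not None:
        instructions["power"] = power  # "ON" or "OFF" of a power source
    if mode is not None:
        # e.g. "cooling", "heating", "dehumidifying", "ventilating", "air cleaning"
        instructions["mode"] = mode
    if target_temperature is not None:
        instructions["target_temperature"] = target_temperature
    if airflow_rate is not None:
        instructions["airflow_rate"] = airflow_rate
    if wind_direction is not None:
        instructions["wind_direction"] = wind_direction
    return instructions

ctrl = make_control_information(power="ON", mode="cooling", target_temperature=26)
```

An air conditioner receiving such a record would apply only the instructions it finds, according to the functions it actually includes.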
Hereinafter, when the air conditioners 102a to 102c are not particularly distinguished, the air conditioners 102a to 102c are simply expressed as an “air conditioner 102”. Note that, the air conditioner 102 included in the control system 100 may be one or plural, and may be any number. An example of the air conditioner 102 other than an air conditioner will be described in modification examples described below.
The control apparatus 103 is an apparatus for controlling the air conditioner 102, based on image information including an image in which a target space is captured. The control apparatus 103 includes the history acquisition unit 104, the control unit 105, a history storage unit 106, and a control pattern storage unit 107.
The history acquisition unit 104 processes image information including an image in which a target space is captured, and acquires appearance history information including a history of appearances of one or more persons P being active in the target space.
Herein, “being active” mainly means a daily activity (including a state and behavior) of the person P. It is assumed that “being active” does not include a motion and a state intended for control of the air conditioner 102, such as a gesture for controlling the air conditioner 102.
The processing on the image information being performed by the history acquisition unit 104 is image processing for detecting the person P from an image included in the image information and also acquiring appearance information about the detected person P.
Specifically, for example, as illustrated in
The person detection unit 104a continuously acquires image information including an image in which the captured regions TSa to TSc are captured from each of the capturing apparatuses 101a to 101c via the network N. The person detection unit 104a detects the person P from the image included in each piece of the image information.
The analysis unit 104b analyzes the image information acquired by the person detection unit 104a, and outputs, as an analysis result, appearance information about each person P detected by the person detection unit 104a.
The analysis unit 104b includes a face recognition function, a person type recognition function, a pose recognition function, a movement recognition function, a worn clothes recognition function, a gradient feature detection function of an image, a color feature detection function of an image, an object recognition function, and the like. Details of each of the functions will be described below. Note that, the analysis unit 104b may include at least one of the functions exemplified herein, and the like.
The analysis unit 104b analyzes image information by using the functions, and continuously acquires, as a result of the analysis, appearance information about one or more persons P being active in the target space.
As illustrated in
A person ID is information for identifying the person P included in image information.
A region ID and a capturing time are a region ID and a capturing time included in the image information being a target of an analysis.
A position is information indicating a position of the person P of the person ID associated with the position.
By referring to the person ID, the region ID, the position, and the capturing time, the appearance information can be determined as information about an appearance of a certain person at a certain place at a certain time.
A pose is information about a posture of a body of the person P. The pose includes at least one of a standing position (a pose in a standing state), a sitting position (a pose in a sitting state), a lying position (a pose in a sleeping state), and the like.
The pose may be further subdivided. As an example acquired by subdividing a sitting position, there are a sitting position on a chair (a pose in a state of sitting on a chair), sitting in an orthopneic position (a pose in a sitting state with an upper body being raised at approximately 90 degrees and leaning), sitting with legs crossed, sitting in a kneeling position, and the like. As an example acquired by subdividing a lying position, there are a lying position on a back (a pose in a sleeping state on a back), a lying position on a side (a pose in a sleeping state on a side), a lying position on a stomach (a pose in a sleeping state on a stomach), and the like.
Behavior is information about a motion of the person P. The behavior may include a static motion of the person P.
The behavior is walking, running, speaking, fanning with a fan and the like, conducting business, eating a meal, wearing clothes, taking off clothes, and the like. The behavior may be stopping in a certain pose. The behavior may be a physiological phenomenon such as a yawn, a sneeze, shivering (trembling) from cold, sweating, and a cough. The behavior may include at least one of the examples exemplified herein, and the like.
Clothing is information about clothing of the person P that can be determined from an appearance.
In the present example embodiment, the clothing is represented by using a score. The score is an example of an index representing a degree of heavy clothing or light clothing of the person P. In the present example embodiment, a higher score is assumed to represent heavier clothing. Further, the score is assumed to be an integer of 1 to 5.
Specifically, for example, a score “5” of the clothing represents heavy clothing in a state of wearing a coat or a corresponding state. A score “4” of the clothing represents, for example, a state of wearing a jacket or a corresponding state. A score “3” of the clothing represents, for example, a state of wearing a thin sweater or a corresponding state. A score “2” of the clothing represents, for example, a state of wearing a long-sleeved shirt or a corresponding state. A score “1” of the clothing represents, for example, light clothing in a state of wearing a short-sleeved shirt or a corresponding state.
Note that, the index may be, for example, a letter, a symbol, and the like. Further, the number of levels into which a degree of heavy clothing or a degree of light clothing is divided may be appropriately determined. The clothing may not be indicated by an index, and may instead be indicated by the worn clothes themselves as determined from an appearance (i.e., which of a coat, a jumper, a jacket, a sweater, a long-sleeved shirt, a short-sleeved shirt, a long-sleeved T-shirt, and a short-sleeved T-shirt is worn on top).
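The five-level scoring described above can be sketched as a simple lookup from the recognized outermost garment to a score. The garment names and the fallback value here are illustrative assumptions:

```python
# Illustrative mapping from the outermost worn garment to the five-level
# clothing score described above (5 = heaviest clothing, 1 = lightest).
CLOTHING_SCORES = {
    "coat": 5,
    "jacket": 4,
    "thin sweater": 3,
    "long-sleeved shirt": 2,
    "short-sleeved shirt": 1,
}

def clothing_score(outermost_garment):
    """Return the clothing score for a recognized garment.

    Falls back to the middle score 3 when the garment is not in the table;
    this default is an assumption for the sketch.
    """
    return CLOTHING_SCORES.get(outermost_garment, 3)
```

A "corresponding state" in the description above would be handled by mapping additional garment labels onto the same scores.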
A surrounding situation is information indicating a situation around the person P. The surrounding situation includes at least one of presence of a meal, presence of a book or a document, presence of equipment such as a personal computer, and the like.
The analysis unit 104b acquires appearance history information 106a, based on appearance information being continuously acquired. The analysis unit 104b stores the appearance history information 106a in the history storage unit 106 (see
The history storage unit 106 is a storage unit for storing the appearance history information 106a.
A time is information indicating a time corresponding to a person ID, a region ID, a position, a pose, behavior, clothing, and a surrounding situation associated with the time.
“˜” included in a time indicates that each piece of information associated with the time continues, without a change, from the time included before “˜” until the time included after “˜”.
The appearance history information 106a illustrated in
The appearance history information 106a in
Note that, the appearance history information 106a may include at least one of a pose, behavior, clothing, and a surrounding situation.
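The way continuously acquired appearance information is collapsed into appearance history entries with time ranges (the “˜” notation above) can be sketched as follows. The record structure and field names are illustrative assumptions:

```python
def build_appearance_history(appearance_records):
    """Collapse a time-ordered stream of appearance records into history
    entries, merging consecutive records whose observed fields are unchanged
    into one entry spanning a start and end time.

    Each record is a dict with a "time" key plus observed fields such as
    "person_id", "region_id", "position", "pose", "behavior", and "clothing";
    these names are assumptions for the sketch.
    """
    history = []
    for record in appearance_records:
        time = record["time"]
        fields = {k: v for k, v in record.items() if k != "time"}
        if history and history[-1]["fields"] == fields:
            # Nothing changed: extend the current entry's time range.
            history[-1]["end"] = time
        else:
            # Something changed: start a new history entry.
            history.append({"start": time, "end": time, "fields": fields})
    return history

records = [
    {"time": "10:00", "person_id": "P1", "region_id": "TSa", "pose": "sitting"},
    {"time": "10:01", "person_id": "P1", "region_id": "TSa", "pose": "sitting"},
    {"time": "10:02", "person_id": "P1", "region_id": "TSa", "pose": "standing"},
]
history = build_appearance_history(records)
```

Here the first two records merge into one entry spanning 10:00˜10:01, and the change of pose at 10:02 opens a new entry.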
The control unit 105 generates control information for controlling any one or a plurality of the air conditioners 102a to 102c, based on the appearance history information 106a. The control unit 105 transmits the generated control information to the air conditioners 102a to 102c being a target via the network N.
Specifically, the control unit 105 determines whether control of each of the air conditioners 102a to 102c is needed, based on the appearance history information 106a.
For example, the control unit 105 determines whether the control of the air conditioner 102 is needed, based on control pattern information 107a.
The control pattern storage unit 107 is a storage unit for storing the control pattern information 107a. In other words, the control unit 105 according to the present example embodiment determines whether control of the air conditioner 102 is needed, based on the control pattern information 107a stored in the control pattern storage unit 107. The control pattern information 107a may be appropriately set.
A control condition is a condition for applying a control content associated with the control condition. The control condition includes a condition related to a timing (or a season), a condition related to a space (or a region), a condition related to an appearance, and a condition related to a period of time. In
For the conditions related to a timing, an appearance, and a period of time, a timing, an appearance, and a period of time to which a control pattern is applied are respectively determined. For the conditions related to a timing and a period of time, the control unit 105 determines whether the appearance history information 106a satisfies the conditions, based on, for example, a time included in the appearance history information 106a. For the condition related to an appearance, the control unit 105 determines whether the appearance history information 106a satisfies the condition, based on, for example, an appearance (a pose, behavior, clothing, and a surrounding situation) included in the appearance history information 106a.
The control unit 105 determines whether there is the appearance history information 106a that satisfies a control condition included in a control pattern, based on, for example, the appearance history information 106a and the control pattern. When there is the appearance history information 106a that satisfies all of the conditions related to a timing, an appearance, and a period of time, the control unit 105 determines that there is the appearance history information 106a that satisfies the control condition. When there is the appearance history information 106a that satisfies the control condition, the control unit 105 determines that control is needed, and determines a region ID included in the appearance history information 106a. Then, the control unit 105 determines the air conditioner 102 being a control target, based on the determined region ID.
The control unit 105 generates control information according to a control content included in the control pattern that satisfies the condition. The control unit 105 transmits the generated control information to the air conditioner 102 being the control target via the network N.
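The flow of determining control targets described above may be sketched as follows. This is an illustrative sketch only; the record fields, the pattern structure, and the region-to-air-conditioner mapping are assumptions introduced for explanation, not part of the described configuration.

```python
# Hypothetical sketch: each appearance history record carries a region ID
# and an appearance, and a control pattern pairs an appearance condition
# with a control content. Records satisfying the condition determine the
# air conditioners to be controlled.
def find_control_targets(history, pattern, region_to_ac):
    """Return the IDs of air conditioners whose regions have a record
    satisfying the pattern's appearance condition."""
    targets = set()
    for record in history:
        if record["appearance"] == pattern["condition_appearance"]:
            targets.add(region_to_ac[record["region_id"]])
    return sorted(targets)

history = [
    {"region_id": "TSa", "appearance": "wearing clothes"},
    {"region_id": "TSb", "appearance": "business"},
]
pattern = {"condition_appearance": "wearing clothes",
           "content": "raise target temperature"}
region_to_ac = {"TSa": "AC1", "TSb": "AC2", "TSc": "AC3"}
print(find_control_targets(history, pattern, region_to_ac))  # ['AC1']
```

The generated control information would then be transmitted, via the network N, only to the air conditioners returned by such a determination.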
The functional configuration of the control system 100 according to the present example embodiment is mainly described above. Hereinafter, a physical configuration of the control system 100 according to the present example embodiment will be described.
The control system 100 is physically formed of the capturing apparatuses 101a to 101c, the air conditioners 102a to 102c, and the control apparatus 103, which are connected to one another via the network N.
Each of the capturing apparatuses 101a to 101c and the air conditioners 102a to 102c according to the present example embodiment is physically individually formed. Note that, a part or the whole of the capturing apparatuses 101a to 101c and the air conditioners 102a to 102c may be physically formed in a single manner. For example, the air conditioner 102a may physically include the capturing apparatus 101a. Further, for example, a personal computer, a terminal apparatus, or various types of equipment in a target space may include the capturing apparatuses 101a to 101c. In these cases, the air conditioner 102a, the personal computer, the terminal apparatus, or the various types of equipment includes the function of the capturing apparatus 101a.
The control apparatus 103 according to the present example embodiment is physically formed of a single apparatus. Note that, the control apparatus 103 may be physically formed of a plurality of apparatuses connected via an appropriate communication line such as the network N. For example, the control apparatus 103 may be physically divided and formed of an apparatus that performs processing on image information by the history acquisition unit 104 and an apparatus that performs other processing in the control apparatus 103.
The control apparatus 103 is physically, for example, a general-purpose computer or the like. Specifically, for example, as illustrated in
The bus 1010 is a data transmission path for allowing the processor 1020, the memory 1030, the storage device 1040, the network interface 1050, the input interface 1060, and the output interface 1070 to transmit and receive data with one another. However, a method for connecting the processor 1020 and the like to one another is not limited to bus connection.
The processor 1020 is a processor achieved by a central processing unit (CPU), a graphics processing unit (GPU), and the like.
The memory 1030 is a main storage apparatus achieved by a random access memory (RAM) and the like.
The storage device 1040 is an auxiliary storage apparatus achieved by a hard disk drive (HDD), a solid state drive (SSD), a memory card, a read only memory (ROM), or the like. The storage device 1040 stores a program module for achieving each function of the control apparatus 103. The processor 1020 reads each program module onto the memory 1030 and executes the program module, and each function associated with the program module is achieved.
The network interface 1050 is an interface for connecting the control apparatus 103 to the network N.
The input interface 1060 is an interface for a user to input information. The input interface 1060 is formed of, for example, one or a plurality of a touch panel, a keyboard, a mouse, and the like.
The output interface 1070 is an interface for providing information to a user. The output interface 1070 is formed of, for example, one or a plurality of a liquid crystal panel, an organic electro-luminescence (EL) panel, and the like.
The physical configuration of the control system 100 according to the present example embodiment is mainly described above. Hereinafter, a motion of the control system 100 according to the present example embodiment will be described.
For example, when the control apparatus 103 receives a start instruction from a user, the control apparatus 103 repeatedly performs the air conditioner control processing (see
As described above, the history acquisition unit 104 processes image information including an image in which a target space (the meeting room R and the corridor C in the present example embodiment) is captured, and acquires the appearance history information 106a including a history of appearances of one or more persons P being active in the target space (step S101).
For example, it is assumed that image information including an image in which the captured region TSa illustrated in
The person detection unit 104a acquires image information including an image in which the captured regions TSa to TSc are captured from each of the capturing apparatuses 101a to 101c via the network N (step S101a).
The person detection unit 104a detects the person P from the image included in each piece of the image information (step S101b).
In step S101b, the person detection unit 104a provides a person ID to each detected person P. The person ID is information for identifying each detected person P. The person ID may be provided according to an appropriate rule. The person ID provided herein is provisional.
For example, the person detection unit 104a detects eight persons in step S101b for the image in which the captured region TSa illustrated in
The analysis unit 104b analyzes the image information acquired in step S101a (step S101c).
Specifically, as described above, the analysis unit 104b includes the face recognition function, the person type recognition function, the pose recognition function, the movement recognition function, the worn clothes recognition function, the gradient feature detection function of an image, the color feature detection function of an image, the object recognition function, and the like.
In step S101c, the analysis unit 104b analyzes the image information by using the functions.
The face recognition function extracts a face feature value of the person P. The face recognition function may determine a position in an image of a face of the person P.
The person type recognition function extracts a human body feature value of the person P. The human body feature value indicates an overall feature such as fatness/slimness of a body shape, height, and clothing. The person type recognition function may determine a position in an image of the person P.
The pose recognition function detects a joint point of the person P, and creates a stick figure model connecting the joint point. The pose recognition function extracts a feature value of a pose by using information about the stick figure model, and estimates a pose of the person P. The pose recognition function may estimate height of the person P. The pose recognition function may determine a position in an image of the person P.
The movement recognition function determines a change in the pose of the person P, based on the feature value of the pose being extracted by the pose recognition function, and estimates a movement of the person P.
The pose recognition function and the movement recognition function may be achieved by the technique disclosed in International Patent Publication No. WO2021/084677, the technique disclosed in Zhe Cao, Tomas Simon, Shih-En Wei, Yaser Sheikh, “Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields”, The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, P. 7291-7299, and the like.
The worn clothes recognition function estimates worn clothes (such as a coat, a jumper, a jacket, a sweater, a long-sleeved shirt, a short-sleeved shirt, a long-sleeved T-shirt, and a short-sleeved T-shirt) that can be distinguished from an appearance of the person P.
The gradient feature detection function of an image uses techniques such as SIFT, SURF, RIFF, ORB, BRISK, CARD, and HOG. The color feature detection function of an image outputs data indicating a feature of a color of an image, such as a color histogram, for example.
The object recognition function is constituted by using a technique such as YOLO (extraction of a general object [for example, a car, a bicycle, a chair, and the like] and extraction of a person can be achieved), for example. The object recognition function detects an object as a surrounding situation from an image, and determines a position of the object in the image.
The analysis unit 104b acquires a pose in the appearance information by using the pose recognition function.
The analysis unit 104b acquires behavior in the appearance information by using the movement recognition function. For example, the analysis unit 104b holds behavior data in which behavior according to a movement of the person P is predetermined. The analysis unit 104b acquires behavior in the appearance information, based on a movement of the person P being estimated by using the movement recognition function, and the behavior data.
The analysis unit 104b may acquire behavior in the appearance information, based on a pose of the person P being estimated by using the pose recognition function, and a situation around the person P being estimated by using the object recognition function. For example, the analysis unit 104b may acquire behavior of conducting business, based on presence of equipment such as a personal computer, a book, a document, and the like around a sitting person.
The analysis unit 104b may acquire behavior in the appearance information, based on a movement of the person P being estimated by using the movement recognition function, and a situation around the person P being estimated by using the object recognition function. For example, the analysis unit 104b may acquire behavior of eating a meal, based on presence of a meal around the person P having a movement of moving a hand toward a mouth.
The analysis unit 104b may acquire behavior in the appearance information, based on a movement of the person P being estimated by using the movement recognition function, a pose of the person P being estimated by using the pose recognition function, and a situation around the person P being estimated by using the object recognition function. For example, the analysis unit 104b may acquire behavior of eating a meal, based on presence of a meal around the person P sitting and having a movement of moving a hand toward a mouth.
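The behavior acquisition combining a movement, a pose, and a surrounding situation may be sketched, under the assumption of a hand-written rule set, as follows. The rule contents and the function name are hypothetical; the described configuration may use learned models instead.

```python
# Hypothetical behavior rules combining a movement, a pose, and the
# objects detected around the person P (the rule set is an assumption).
def estimate_behavior(pose, movement, surroundings):
    if movement == "moving hand toward mouth" and "meal" in surroundings:
        return "eating a meal"
    if pose == "sitting position" and \
            {"personal computer", "document", "book"} & set(surroundings):
        return "business"
    return "unknown"

print(estimate_behavior("sitting position", None, ["document"]))  # business
print(estimate_behavior("sitting position", "moving hand toward mouth",
                        ["meal"]))  # eating a meal
```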
The analysis unit 104b estimates worn clothes that can be distinguished from an appearance of the person P by using the worn clothes recognition function. The analysis unit 104b holds, in advance, clothing data in which worn clothes and a score are associated with each other, for example. The analysis unit 104b refers to the clothing data, and acquires a score of clothing according to estimated worn clothes. Specifically, for example, the clothing data associate worn clothes “coat” and a score “5” with each other. For example, the clothing data associate worn clothes “short-sleeved T-shirt” and a score “1” with each other.
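The clothing data lookup may be sketched as follows. The “coat” and “short-sleeved T-shirt” entries mirror the examples above; the other entries are assumptions for illustration.

```python
# Hypothetical clothing data: worn clothes mapped to a warmth score.
# "coat" -> 5 and "short-sleeved T-shirt" -> 1 follow the description;
# the intermediate entries are assumed.
CLOTHING_DATA = {
    "coat": 5,
    "jacket": 4,
    "long-sleeved shirt": 3,
    "short-sleeved shirt": 2,
    "short-sleeved T-shirt": 1,
}

def clothing_score(worn_clothes):
    """Look up the score associated with the estimated worn clothes."""
    return CLOTHING_DATA[worn_clothes]

print(clothing_score("coat"))                   # 5
print(clothing_score("short-sleeved T-shirt"))  # 1
```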
The analysis unit 104b acquires a surrounding situation by using the object recognition function.
The analysis unit 104b determines identity of the person P between different images. For example, the analysis unit 104b determines identity of the person P between different images by using at least one of a face feature value, a human body feature value, a feature value of a pose, a feature value of behavior, a position in an image of a face of the person P, a position in an image of the person P, worn clothes, and the like.
The analysis unit 104b corrects the provisional person ID provided in step S101b in such a way that the same person has the same person ID. As a result of the correction, the person ID of each person P detected in step S101b is confirmed. Further, by determining identity of the person P, a flow line of the person P can be acquired (flow line analysis), based on the image information.
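One possible sketch of the identity determination is shown below, using only a single feature value per person and an assumed cosine-similarity threshold; the described configuration may combine several of the feature values listed above, and the function names are hypothetical.

```python
import math

# Hypothetical identity check between two detections via the cosine
# similarity of their feature values; the threshold is an assumption.
def same_person(feat_a, feat_b, threshold=0.9):
    dot = sum(a * b for a, b in zip(feat_a, feat_b))
    norm = (math.sqrt(sum(a * a for a in feat_a))
            * math.sqrt(sum(b * b for b in feat_b)))
    return dot / norm >= threshold

def confirm_person_ids(detections, known):
    """Replace each provisional person ID with the matching confirmed ID."""
    for det in detections:
        for pid, feat in known.items():
            if same_person(det["feature"], feat):
                det["person_id"] = pid
    return detections

detections = [{"person_id": "tmp1", "feature": [1.0, 0.0]}]
known = {"P3": [0.99, 0.05]}
print(confirm_person_ids(detections, known)[0]["person_id"])  # P3
```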
Each function included in the analysis unit 104b, such as the face recognition function, the person type recognition function, the pose recognition function, the movement recognition function, the worn clothes recognition function, the gradient feature detection function of an image, the color feature detection function of an image, and the object recognition function, can be achieved by using, for example, a learning model learned by performing machine learning.
The analysis unit 104b includes, for example, a learning model for each function. The learning model uses image information as an input, and outputs an estimation result according to the function. Input data to the learning model during learning are the image information. In the machine learning, supervised learning in which an estimation result according to the image information is treated as a correct answer may be performed.
Herein, the described processing on the image information is merely one example. The processing on the image information may be processing that can acquire the appearance information, and various publicly known techniques such as pattern matching may be applied.
The analysis unit 104b acquires, as a result of the analysis processing in step S101c, the appearance information about the person P detected in step S101b (step S101d).
Specifically, the analysis unit 104b sets the person ID confirmed in step S101c as “person ID” in the appearance information. The analysis unit 104b sets the region ID and the capturing time included in the image information being a target of the analysis in step S101c as “region ID” and “capturing time” in the appearance information, respectively.
The analysis unit 104b sets, as “position”, “pose”, “behavior”, “clothing”, and “surrounding situation” in the appearance information, respectively, the position, the pose, the behavior, the clothing, and the surrounding situation of the person P being estimated as the result of the analysis in step S101c.
For example, in step S101c for each piece of the image information including the image in which the captured region TSa is captured at the time T3, the analysis unit 104b determines identity between the person P detected based on the image and the persons P having the person IDs of “P1” to “P3” included in the appearance history information 106a. As a result, as illustrated in
Then, for example, as a result of the analysis processing (step S101c) on the image information including an image in which the captured region TSa is captured at the time T2 between the time T1 and the time T3, first appearance information is assumed to be acquired for the person ID “P3”. The first appearance information includes the person ID “P3”, the region ID “TSa”, a position “X3”, the time “T2”, a pose “sitting position”, behavior “business”, clothing “3”, and a surrounding situation “document”.
For example, as a result of the analysis processing (step S101c) on the image information including an image in which the captured region TSa is captured at the time T3 after the time T2, second appearance information and third appearance information are assumed to be acquired for the person ID “P3”.
The second appearance information includes the person ID “P3”, the region ID “TSa”, the position “X3”, the time “T3”, the pose “sitting position”, the behavior “business”, clothing “4”, and the surrounding situation “document”.
The third appearance information includes the person ID “P3”, the region ID “TSa”, the position “X3”, the time “T3”, the pose “sitting position”, behavior “wearing clothes”, the clothing “4”, and the surrounding situation “document”.
As a result of analyzing one piece of the image information in such a manner, when one person P has a plurality of appearances (for example, conducting business while wearing clothes), a plurality of pieces of appearance information are acquired for the one person P.
The analysis unit 104b repeats steps S101f to S101i for each piece of the appearance information acquired in step S101d (step S101e; loop A).
The analysis unit 104b determines whether there is the appearance history information 106a including the same person ID as the person ID included in the appearance information being a processing target (step S101f).
When it is determined that there is no appearance history information 106a (step S101f; No), the analysis unit 104b newly creates the appearance history information 106a, based on the appearance information being the processing target (step S101g). The analysis unit 104b stores the newly created appearance history information 106a in the history storage unit 106.
In step S101g, the analysis unit 104b newly creates the appearance history information 106a associated with the appearance information being the processing target. Specifically, the analysis unit 104b sets, as each associated item of the appearance history information 106a, each piece of the information included in the appearance information being the processing target. Herein, the analysis unit 104b sets a capturing time included in the appearance information as a time in the appearance history information 106a.
When the analysis unit 104b performs step S101g and there is unprocessed appearance information, the analysis unit 104b performs the loop A (step S101e) again. When there is no unprocessed appearance information, the analysis unit 104b ends the history acquisition processing (step S101), and returns to the air conditioner control processing.
When it is determined that there is the appearance history information 106a (step S101f; Yes), the analysis unit 104b determines whether there is a change in a position and an appearance (step S101h).
In step S101h, the analysis unit 104b determines whether there is a change in a position and an appearance included in each of the appearance information being the processing target and the appearance history information 106a including the same person ID as the person ID included in the appearance information.
When it is determined that there is no change (step S101h; No), the appearance included in the appearance history information 106a including the same person ID continues, and thus the analysis unit 104b performs the loop A (step S101e) again in a case where there is unprocessed appearance information. When there is no unprocessed appearance information, the analysis unit 104b ends the history acquisition processing (step S101), and returns to the air conditioner control processing.
For example, in step S101h for the first appearance information described above, the analysis unit 104b determines that there is no change, and ends the loop A (step S101e) for the first appearance information.
When it is determined that there is a change (step S101h; Yes), the analysis unit 104b newly creates the appearance history information 106a and/or updates the appearance history information 106a, based on the appearance information being the processing target (step S101i). Herein, “A and/or B” means any one or both of A and B, and the same applies to the description below. In other words, in step S101i, the analysis unit 104b performs any one or both of new creation of the appearance history information 106a and update of the appearance history information 106a.
For example, the second appearance information described above is assumed to be a processing target. The second appearance information indicates a change in an appearance (“clothing” in this example) in the appearance history information 106a being continued from the time T1 for the same appearance (a pose, behavior, clothing, and a surrounding situation). Thus, the analysis unit 104b updates the appearance history information 106a. Then, the analysis unit 104b newly creates the appearance history information 106a associated with the second appearance information. Processing of newly creating the appearance history information 106a is similar to step S101g described above. The analysis unit 104b stores the newly created appearance history information 106a in the history storage unit 106.
Specifically, the analysis unit 104b updates the appearance history information 106a of the person ID “P3”, the region ID “TSa”, the position “X3”, the time “T1˜”, the pose “sitting position”, the behavior “business”, the clothing “3”, and the surrounding situation “document” in step S101i. In this update, the analysis unit 104b updates time “T1˜” to times “T1˜T3” (see
Then, the analysis unit 104b creates, as the appearance history information 106a associated with the second appearance information, the appearance history information 106a including the person ID “P3”, the region ID “TSa”, the position “X3”, the time “T3”, the pose “sitting position”, the behavior “business”, the clothing “4”, and the surrounding situation “document” (see
In other words, when appearance information indicating a change in an appearance in which continuance is indicated in the appearance history information 106a is acquired, the analysis unit 104b updates a time in the appearance history information 106a in such a way as to indicate an end of the appearance indicated in the appearance history information 106a. Then, the analysis unit 104b newly creates the appearance history information 106a associated with the appearance information.
Further, for example, the third appearance information described above is assumed to be a processing target. Further, it is assumed that there is no appearance history information 106a continued from the time T1 for the same appearance (a pose, behavior, clothing, and a surrounding situation). In this case, the analysis unit 104b newly creates the appearance history information 106a associated with the third appearance information. Processing of newly creating the appearance history information 106a is similar to step S101g described above. The analysis unit 104b stores the newly created appearance history information 106a in the history storage unit 106.
Specifically, the analysis unit 104b creates, as the appearance history information 106a associated with the third appearance information, the appearance history information 106a including the person ID “P3”, the region ID “TSa”, the position “X3”, the time “T3”, the pose “sitting position”, the behavior “wearing clothes”, the clothing “4”, and the surrounding situation “document” (see
When the analysis unit 104b performs step S101i and there is unprocessed appearance information, the analysis unit 104b performs the loop A (step S101e) again. When there is no unprocessed appearance information, the analysis unit 104b ends the history acquisition processing (step S101), and returns to the air conditioner control processing.
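The update and new-creation flow of steps S101f to S101i described above may be sketched as follows, under the assumption that an ongoing appearance is marked by a time ending with “˜” (written `~` in the code) and that the field names are hypothetical.

```python
# Hypothetical sketch of steps S101f-S101i: a history record whose time
# ends with "~" is ongoing; when an acquired appearance differs, the open
# record is closed and a new record is created.
def update_history(history, info):
    for rec in history:
        if rec["person_id"] == info["person_id"] and rec["time"].endswith("~"):
            if rec["appearance"] == info["appearance"]:
                return history  # the appearance continues; no change
            rec["time"] += info["time"]  # close: e.g. "T1~" -> "T1~T3"
            break
    history.append({"person_id": info["person_id"],
                    "time": info["time"],
                    "appearance": info["appearance"]})
    return history

history = [{"person_id": "P3", "time": "T1~", "appearance": "business"}]
update_history(history, {"person_id": "P3", "time": "T3",
                         "appearance": "wearing clothes"})
print(history[0]["time"])  # T1~T3
print(len(history))        # 2
```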
As described above, the control unit 105 outputs control information for controlling the air conditioner 102, based on the appearance history information 106a (step S102).
For example, the history storage unit 106 and the control pattern storage unit 107 are assumed to respectively store the appearance history information 106a illustrated in
The control unit 105 determines whether control of the air conditioner 102 is needed, based on the appearance history information 106a and the control pattern information 107a (step S102a).
Specifically, the control unit 105 determines whether there is the appearance history information 106a that satisfies a control condition included in the control pattern information 107a, based on, for example, the appearance history information 106a and the control pattern. When there is the appearance history information 106a that satisfies the control condition, the control unit 105 determines that control is needed. When there is no appearance history information 106a that satisfies the control condition, the control unit 105 determines that control is not needed.
More specifically, when there is the appearance history information 106a that satisfies all of the conditions related to a timing, a space, an appearance, and a period of time, the control unit 105 determines that control is needed. When there is no appearance history information 106a that satisfies all of the conditions related to a timing, a space, an appearance, and a period of time, the control unit 105 determines that control is not needed.
For example, with reference to
The first appearance history information 106a satisfies a control condition (hereinafter, the control condition is referred to as a “first control condition”) including “occurrence of predetermined behavior in cold environment” as a condition related to an appearance. In this case, there is the appearance history information 106a that satisfies the control condition, and thus the control unit 105 determines that control is needed.
Herein, the first control condition does not include control conditions related to (1) a timing, (2) a space, and (3) a period of time. A determination method used when a control condition includes these conditions will be described.
(1) When the control condition includes the condition related to a timing, the control unit 105 determines whether the condition related to a timing is satisfied, based on whether “time” included in the appearance history information 106a is included in “timing” of the control condition. Specifically, when “time” is included in “timing”, the control unit 105 determines that the condition related to a timing is satisfied. When “time” is not included in “timing”, the control unit 105 determines that the condition related to a timing is not satisfied. Note that, “time” included in the appearance history information 106a is assumed to also include a date.
(2) When the control condition includes the condition related to a space, the control unit 105 determines whether the condition related to a space is satisfied, based on whether an attribute of a region indicated by “region ID” included in the appearance history information 106a is associated with an attribute indicated by “space” of the control condition.
Specifically, when an attribute of a region indicated by “region ID” is associated with an attribute indicated by “space” of the control condition (for example, when attributes coincide with each other), the control unit 105 determines that the condition related to a space is satisfied. When an attribute of a region indicated by “region ID” is not associated with an attribute indicated by “space” of the control condition (for example, when attributes do not coincide with each other), the control unit 105 determines that the condition related to a space is not satisfied.
For example, the control unit 105 determines an attribute of a region indicated by “region ID”, based on space association information being held in advance.
A target space ID is information for identifying a target space. A target space ID “R” is a target space ID of the meeting room R. A target space ID “C” is a target space ID of the corridor C.
An attribute of a space indicates an attribute of a target space and a captured region indicated by an associated target space ID and an associated region ID. For example, the space association information in
An air conditioner ID is information for identifying the air conditioners 102a to 102c. Air conditioner IDs “AC1”, “AC2”, and “AC3” are air conditioner IDs of the air conditioner 102a, the air conditioner 102b, and the air conditioner 102c, respectively.
With reference to the space association information in
(3) When the control condition includes the condition related to a period of time, the control unit 105 determines whether the condition related to a period of time is satisfied, based on “time” included in the appearance history information 106a.
Specifically, the control unit 105 performs statistical processing of a history of appearances included in the appearance history information 106a. The control unit 105 performs the statistical processing according to a control condition.
Hereinafter, an example of the statistical processing according to a control condition in which a condition related to an appearance being “state where number of persons in clothing [4] is equal to or more than threshold value K4” and a condition related to a period of time being “continuance for time TH4 or longer” are associated with each other will be described.
The control unit 105 performs the statistical processing of counting “number of persons in clothing [4]” in “meeting room” on the appearance history information 106a at a latest time. As a result of the statistical processing, the appearance history information 106a at the latest time is assumed to satisfy “number of persons in clothing [4] equal to or more than threshold value K4” being a condition related to an appearance. When the condition related to an appearance is not satisfied, the control unit 105 ends the statistical processing.
In this case, the control unit 105 performs the statistical processing of counting “number of persons in clothing [4]” in “meeting room” on the appearance history information 106a in the past. In other words, the control unit 105 performs the statistical processing related to a time change herein. As a result of the statistical processing, when a state that satisfies “number of persons in clothing [4] equal to or more than threshold value K4” “continues for time TH4 or longer”, the control unit 105 determines that the condition related to a period of time is satisfied, and ends the statistical processing. When a state that satisfies “number of persons in clothing [4] equal to or more than threshold value K4” does not “continue for time TH4 or longer”, the control unit 105 determines that the condition related to a period of time is not satisfied, and ends the statistical processing.
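The statistical processing described above may be sketched as follows. The per-time-step counts, the threshold value K4, and the expression of the time TH4 as a number of consecutive time steps are all assumptions for illustration.

```python
# Hypothetical sketch of the statistical processing: count persons in
# clothing "4" per time step, then check whether the count has stayed at
# or above K4 for TH4 consecutive steps (thresholds are assumed values).
def condition_continues(counts_by_time, k4, th4_steps):
    """counts_by_time: numbers of persons in clothing [4], oldest first.
    Returns True when the latest th4_steps counts all reach k4."""
    if len(counts_by_time) < th4_steps:
        return False
    return all(c >= k4 for c in counts_by_time[-th4_steps:])

print(condition_continues([1, 3, 4, 5], k4=3, th4_steps=3))  # True
print(condition_continues([5, 2, 4, 5], k4=3, th4_steps=3))  # False
```

The first call satisfies both the condition related to an appearance (the latest count reaches K4) and the condition related to a period of time (the state continues for TH4 steps); the second fails the continuance check.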
When it is determined that control is not needed (step S102a; No), the control unit 105 ends the control information output processing (step S102), and also ends the air conditioner control processing (see
When it is determined that control is needed (step S102a; Yes), the control unit 105 determines a region ID included in the appearance history information 106a being the base of the determination (step S102b).
For example, the control unit 105 determines the region ID “TSa” included in the first appearance history information 106a.
The control unit 105 determines the air conditioner 102 being a control target according to the region ID determined in step S102b and a control content included in the control pattern that satisfies the control condition in step S102a (step S102c).
For example, the control unit 105 refers to the control pattern information 107a, and acquires a control content associated with the first control condition. The control unit 105 determines the air conditioner 102 being the control target, based on the region ID “TSa” determined in step S102b and a control content that “target temperature of target space is increased by TP1 degree from current temperature” associated with the first control condition.
A control content has “target space” as a target of control. Thus, the control unit 105 determines “target space” including the determined region ID “TSa”, based on the space association information (see
The control unit 105 determines, as the air conditioner 102 being the control target, the air conditioner 102 associated with the determined target space, based on the space association information (see
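Steps S102b to S102c may be sketched with a hypothetical space association table as follows. The region-to-space and space-to-air-conditioner assignments below are assumptions, since the concrete table is given in a drawing.

```python
# Hypothetical space association information: target spaces, their
# captured regions, and their associated air conditioners (assumed
# assignments for the meeting room R and the corridor C).
SPACE_ASSOCIATION = [
    {"target_space_id": "R", "region_ids": ["TSa", "TSb"],
     "ac_ids": ["AC1", "AC2"]},
    {"target_space_id": "C", "region_ids": ["TSc"],
     "ac_ids": ["AC3"]},
]

def target_air_conditioners(region_id):
    """Find the target space containing the region (step S102b), then
    return its associated air conditioners (step S102c)."""
    for entry in SPACE_ASSOCIATION:
        if region_id in entry["region_ids"]:
            return entry["ac_ids"]
    return []

print(target_air_conditioners("TSa"))  # ['AC1', 'AC2']
```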
The control unit 105 generates control information according to the control content included in the control pattern that satisfies the control condition in step S102a (step S102d).
The control unit 105 transmits the control information generated in step S102d to the air conditioner 102 being the control target determined in step S102c via the network N (step S102e). The control unit 105 ends the control information output processing (step S102), and also ends the air conditioner control processing (see
The air conditioner 102 acquires, via the network N, the control information transmitted in step S102e. The air conditioner 102 operates according to the acquired control information. Thus, the air conditioner 102 can be controlled based on image information including an image in which a target space is captured.
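The flow of steps S102b to S102e described above can be sketched as follows; the space association tables, the air conditioner identifiers, and the `send` callback are hypothetical simplifications of the space association information and the network transmission.

```python
# Hypothetical space-association data: region ID -> target space -> air conditioners.
REGION_TO_SPACE = {"TSa": "meeting room"}
SPACE_TO_AIRCON = {"meeting room": ["aircon-102a", "aircon-102b"]}

def output_control_information(region_id, control_content, send):
    """Steps S102b-S102e: resolve the target air conditioners and send control info."""
    target_space = REGION_TO_SPACE[region_id]      # S102c: target space for the region
    targets = SPACE_TO_AIRCON[target_space]        # S102c: air conditioners for the space
    control_info = {"space": target_space, "content": control_content}  # S102d
    for aircon_id in targets:                      # S102e: transmit via the network
        send(aircon_id, control_info)
    return targets

sent = []
targets = output_control_information(
    "TSa",
    "increase target temperature by TP1 degrees",
    lambda aircon_id, info: sent.append((aircon_id, info)),
)
```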
For example, the appearance history information 106a is assumed to satisfy a control condition in which a condition related to a space being “doorway of meeting room” and a condition related to an appearance being “state where number of persons in clothing [5] is equal to or more than threshold value K3” are associated with each other among the control conditions illustrated in
In general, cold air may get in at a doorway of a meeting room in fall and winter, and the doorway may be colder than the inside of the meeting room located away from it. Thus, there may be more persons P in heavy clothing at the doorway of the meeting room. In such a case, a target temperature of the air conditioner 102 associated with the doorway of the meeting room can be increased by TP3 degrees from a current temperature. Therefore, an air environment comfortable to the person P at the doorway of the meeting room can be easily provided without the person P being conscious.
For example, the appearance history information 106a is assumed to satisfy a control condition in which a condition related to a space being “meeting room” and a condition related to an appearance being “state where number of persons in clothing [4] is equal to or more than threshold value K4” are associated with each other among the control conditions illustrated in
In general, wearing a suit may be regarded as good manners in a meeting. In that case, when the air conditioner 102 operates at a normal target temperature, it may feel hot. A target temperature of the air conditioner 102 associated with a meeting room can then be lowered by TP4 degrees from a standard temperature. Therefore, an air environment comfortable to the person P in the meeting room can be easily provided without the person P being conscious.
For example, the appearance history information 106a is assumed to satisfy a control condition in which a condition related to a space being “corridor” and a condition related to an appearance being “occurrence of predetermined behavior in cold environment equal to or more than threshold value K5” are associated with each other among the control conditions illustrated in
In general, when the corridor C is cold, the meeting room R connected to the corridor C may also be cold. In such a case, when the air conditioner 102 operates at a standard target temperature, the person P using the meeting room R may feel cold. According to this example, a target temperature of the air conditioner 102 associated with the meeting room can be increased by TP5 degrees from that standard target temperature. Therefore, an air environment comfortable to the person P in the meeting room can be easily provided without the person P being conscious.
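The three control patterns described above can be represented as a small table, for example as follows; the symbols K3 to K5 and TP3 to TP5 are kept as textual placeholders for the concrete values.

```python
# Illustrative encoding of the control patterns described above.
# K3-K5 and TP3-TP5 are placeholders for the concrete threshold/temperature values.
CONTROL_PATTERNS = [
    {"space": "doorway of meeting room",
     "condition": "persons in clothing [5] >= K3",
     "content": "raise target temperature by TP3 degrees from the current temperature"},
    {"space": "meeting room",
     "condition": "persons in clothing [4] >= K4",
     "content": "lower target temperature by TP4 degrees from the standard temperature"},
    {"space": "corridor",
     "condition": "cold-environment behavior occurrences >= K5",
     "content": "raise meeting-room target temperature by TP5 degrees"},
]

def patterns_for_space(space):
    """Return the control patterns whose condition is tied to the given space."""
    return [p for p in CONTROL_PATTERNS if p["space"] == space]
```

Keeping the patterns as data rather than fixed logic matches the role of the control pattern information 107a: the control unit can look up and apply patterns without hard-coding each condition.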
A “space different from a target space” means a space whose air environment is adjusted by an air conditioner 102 that does not substantially influence the target space.
One example embodiment of the present invention is described above.
According to the present example embodiment, the control apparatus 103 includes the history acquisition unit 104 and the control unit 105. The history acquisition unit 104 processes image information including an image in which a target space is captured, and acquires the appearance history information 106a including a history of appearances of one or more persons being active in the target space. The control unit 105 outputs control information for controlling the air conditioner 102, based on the appearance history information 106a.
The control apparatus can control the air conditioner 102, based on the appearance history information 106a acquired from image information including an image in which a target space is captured.
The image information can be acquired by providing an apparatus that captures a target space, and thus the appearance history information 106a can also be easily acquired. Thus, an air environment of a space can be easily controlled by using the air conditioner 102.
The image information can be acquired based on capturing of a target space, and thus the person P in the target space is hardly conscious of acquiring the image information. Thus, the air conditioner 102 can be controlled without making a person conscious.
The appearance history information 106a includes a history of appearances of one or more persons being active in a target space. Thus, how to control the air conditioner 102 can be decided based on the past appearance history information 106a. As a result, a more comfortable air environment can be provided than that by control of an air conditioner based on only a current appearance, for example.
Therefore, an air environment of a space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
According to the present example embodiment, the control unit 105 controls the air conditioner 102, based on a result of performing the statistical processing on the history of the appearances described above. Since the result of performing the statistical processing on the history of the appearances is used for control of the air conditioner 102, the air conditioner 102 can be controlled in such a way as to be able to provide a more comfortable air environment. Therefore, an air environment of a space can be made more comfortable by using the air conditioner 102.
According to the present example embodiment, the statistical processing includes statistical processing related to at least one of an attribute of a target space or a region included in the target space, an appearance of a person present in the target space or the region, and an attribute of a person present in the target space or the region. Since a result of performing the statistical processing on a history of appearances related to at least one of an attribute of a region, an appearance of a person, and an attribute of a person is used for control of the air conditioner 102, the air conditioner 102 can be controlled in such a way as to be able to provide a more comfortable air environment. Therefore, an air environment of a space can be made more comfortable by using the air conditioner 102.
According to the present example embodiment, the statistical processing includes statistical processing related to a time change in at least one of an attribute of a target space or a region included in the target space, an appearance of a person present in the target space or the region, and an attribute of a person present in the target space or the region. Since a result of performing the statistical processing related to a time change in at least one of an attribute of a region, an appearance of a person, and an attribute of a person is used for control of the air conditioner 102, the air conditioner 102 can be controlled in such a way as to be able to provide a more comfortable air environment. Therefore, an air environment of a space can be made more comfortable by using the air conditioner 102.
According to the present example embodiment, the control unit 105 controls the air conditioner 102, based on control pattern information including a control content of the air conditioner 102 according to a result of the statistical processing. Since the control pattern information is used for control of the air conditioner 102, the air conditioner 102 can be controlled in such a way as to be able to provide a more comfortable air environment. Therefore, an air environment of a space can be easily made comfortable by using the air conditioner 102.
According to the present example embodiment, the control unit 105 outputs control information for adjusting a space different from a target space, based on the appearance history information 106a.
In this way, the space different from the target space can be adjusted to be comfortable. Therefore, an air environment of the space can be made more comfortable by using the air conditioner 102.
According to the present example embodiment, there are a plurality of target spaces. The history acquisition unit 104 processes a plurality of pieces of image information, each including an image in which one of the plurality of target spaces is captured, and acquires the appearance history information 106a being a history of appearances of one or more persons being active in each of the plurality of target spaces.
In this way, the plurality of target spaces can be adjusted to be comfortable. Therefore, an air environment of the space can be made more comfortable by using the air conditioner 102.
According to the present example embodiment, the control unit 105 outputs control information for controlling at least one air conditioner 102 associated with at least one of the plurality of target spaces. In this way, an air environment of each of the target spaces can be adjusted. Therefore, the air environment of the space can be easily made comfortable by using the air conditioner 102.
The example embodiments described above may be modified as follows.
The air conditioner 102 is not limited to an air conditioner, and may be equipment, an apparatus, a facility, and the like that adjust an air environment of a space associated with the air conditioner 102.
For example, the air conditioner 102 may be a cooler, a heater, a floor heater, a humidifier, a dehumidifier, an air washer, a ventilation fan, a circulator, a fan, an aroma diffuser that sprays an aromatic oil into air, and the like.
In other words, for example, the air conditioner 102 may be equipment, an apparatus, or a facility including one of a function of cooling air and a function of heating air as a function of adjusting a temperature of air. The air conditioner 102 may be equipment, an apparatus, or a facility including one of a function of humidifying air and a function of dehumidifying air as a function of adjusting humidity of air. The air conditioner 102 may be equipment, an apparatus, or a facility including a function of cleaning air. The air conditioner 102 may be equipment, an apparatus, or a facility including a function of circulating air. The air conditioner 102 may be equipment, an apparatus, or a facility including a function of spraying substance containing an aromatic component into air. The air conditioner 102 may be equipment, an apparatus, or a facility including a plurality of any functions exemplified herein.
A target space may be set in a house, a school, and the like.
In a case of a house, an attribute of a target space may be one or a plurality of a living room, a dining room, a kitchen, a private room, a bedroom, the outside of an entrance, an earthen-floored space in an entrance, a corridor, and the like. In a case of a school, an attribute of a target space may be one or a plurality of a classroom, a lecture hall, a corridor, a laboratory, a staff room, and the like. Further, an attribute of a target space may include a degree of an area of the target space.
A control pattern in a case of a target space being set in a house may lower a target temperature of a bedroom by a predetermined temperature from a current temperature when a state of a lying position (sleeping) continues for a predetermined period of time or longer in the bedroom. In this way, in a case of attempting to sleep in the bedroom, the target temperature can be lowered. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
A control pattern in a case of a target space being set in a house may increase a target temperature of the living room by a predetermined temperature from a current temperature when a state of a lying position (sleeping) continues for a predetermined period of time or longer in the living room. In this way, in a case of sleeping in the living room, the target temperature can be increased. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
A control pattern in a case of a target space being set in a house may lower a target temperature of a bedroom by a predetermined temperature from a current temperature when a sweating state continues for a predetermined period of time or longer. In this way, in a case of sweating after a bath, after coming back home, and the like, the target temperature can be lowered. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
Appearance information and the appearance history information 106a may further include an attribute of the person P. The attribute of the person P is at least one of gender, an age group, and the like.
As a control pattern when the appearance information and the appearance history information 106a include gender and/or an age group as the attribute of the person P, the following examples are given.
A control condition of the control pattern may be a condition that the proportion of persons P who are female and/or in a predetermined age group (for example, elderly people, children, and the like) in a target space or a region included in the target space is equal to or more than a predetermined proportion. Further, a control condition of the control pattern may be continuance, for a predetermined period of time or longer, of a state where that proportion is equal to or more than the predetermined proportion.
A control content in the cases may be lowering ventilation in the target space or the region included in the target space by a predetermined level from a current state and/or increasing a target temperature in the target space or the region by a predetermined temperature from a current state.
In this way, when the person P being female and/or in the predetermined age group is present in the predetermined proportion or more in the target space or the region, ventilation of the air conditioner 102 associated with the target space or the region can be lowered, or the target temperature can be increased. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
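The attribute-based control condition described above can be sketched as follows; the person records and the proportion value are illustrative assumptions.

```python
# Illustrative check of the attribute-based control condition described above.
def attribute_condition(persons, attribute, value, min_proportion):
    """True when persons with the given attribute value reach min_proportion."""
    if not persons:
        return False
    matching = sum(1 for p in persons if p.get(attribute) == value)
    return matching / len(persons) >= min_proportion

# Hypothetical persons detected in a target space or region.
room = [{"gender": "female", "age_group": "adult"},
        {"gender": "female", "age_group": "elderly"},
        {"gender": "male", "age_group": "adult"}]
```

When the condition holds, the associated control content (lowering ventilation and/or increasing the target temperature) would be applied to the air conditioner 102 for that space or region.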
A control pattern may control the air conditioner 102 with, as a condition, movement of the person P in a target space or a region included in the target space and/or the person P going into and out of the target space or the region.
For example, a control condition of the control pattern may be continuance, for a predetermined period of time or longer, of a state where the number of persons P whose positions change in a target space or a region included in the target space is equal to or more than a predetermined number. Further, a control condition may be continuance, for a predetermined period of time or longer, of a state where the number of persons P going out from a target space or a region included in the target space and/or the number of persons P going into the target space or the region is equal to or more than a predetermined number.
A control content in the cases may be increasing ventilation in the target space or the region included in the target space by a predetermined level from a current state and/or lowering a target temperature in the target space or the region by a predetermined temperature from a current state.
In this way, when it is hot due to a movement and/or in and out of the person P, ventilation of the air conditioner 102 associated with the target space or the region can be increased, or the target temperature can be lowered. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
A control pattern may control the air conditioner 102 with, as a condition, density of the person P in a target space or a region included in the target space.
A control condition of the control pattern may be a condition that density of the person P in a target space or a region included in the target space is equal to or more than a predetermined proportion. Further, a control condition may be continuance for a predetermined period of time or longer of a state where density of the person P in a target space or a region included in the target space is equal to or more than a predetermined proportion.
A control content in the cases may be increasing ventilation in the target space or the region included in the target space by a predetermined level from a current state and/or lowering a target temperature in the target space or the region by a predetermined temperature from a current state.
In this way, when it is hot due to the persons P being crowded, and the like, ventilation of the air conditioner 102 associated with the target space or the region can be increased, or the target temperature can be lowered. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
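The density-based control condition described above can be sketched as follows, assuming density is measured in persons per square meter; the concrete threshold is an illustrative assumption.

```python
# Illustrative density-based condition: persons per square meter in a region.
def density_condition(person_count, area_m2, min_density):
    """True when the density of persons in the region reaches min_density."""
    return person_count / area_m2 >= min_density

# e.g. 8 persons in a 10 m^2 region, assumed threshold 0.5 persons/m^2
```

The region area could be taken from the attribute of the target space (the degree of an area mentioned above), so the same person count triggers control earlier in a small region than in a large one.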
A control pattern may control the air conditioner 102 with, as a condition, a change in an appearance of the person P when the person P enters a target space or a region included in the target space, or a change in an appearance after the person P enters the target space or the region.
A control condition of the control pattern may be a change in clothing to light clothing or heavy clothing when the person P enters a target space or a region included in the target space. For example, the control condition may be a condition that the person P wears a coat when the person P enters the target space or the region and takes off the coat or a sweater under the coat after the person P enters. For example, the control condition may be a condition that the person P does not wear a coat when the person P enters the target space or the region and wears the coat after the person P enters.
Further, a control condition may be a condition that the person P changes underwear within a predetermined period of time after the person P enters a target space or a region included in the target space.
A control content in this case may be switching an operation mode according to an appearance (clothing) after entering, or changing a target temperature to a predetermined temperature.
In this way, the air conditioner 102 associated with the target space or the region can be controlled according to a change in an appearance of the person P when or after the person P enters the target space or the region. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
The control apparatus 103 may acquire various types of information such as weather forecast information about a forecast of weather from a not-illustrated external apparatus. Further, the control apparatus 103 may acquire various types of information from various apparatuses installed in a building and the like in which a target space is set. For example, the control apparatus 103 may estimate current weather, based on image information from a capturing apparatus that is installed in a doorway of a building and the like in which a target space is set, and captures an outdoor place.
In this case, a control condition of a control pattern may further include a condition related to such various types of information (for example, weather forecast information and/or current weather).
In this way, the air conditioner 102 associated with a target space or a region included in the target space can be controlled according to information from an external apparatus and/or various apparatuses installed in a building and the like. Therefore, an air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
In the example embodiment, the example in which the control pattern information 107a is appropriately set is described. The control apparatus 103 may, for example, create and change the control pattern information 107a. The control apparatus 103 may use any one or a plurality of pieces of control pattern information acquired by, for example, creating and changing the control pattern information 107a for control of the air conditioner 102. When the plurality of pieces of control pattern information are used, the control apparatus 103 may use, for control of the air conditioner 102, any of the plurality of pieces of control pattern information according to a priority order determined by an appropriate method.
The motion situation acquisition unit 111 acquires motion situation information indicating a motion situation of the air conditioner 102 from each of one or the plurality of air conditioners 102 via the network N. The motion situation information includes, for example, at least one of an air conditioner ID, a current time, whether an air conditioner is operating, an operation mode when an air conditioner is operating, a target temperature, and the like. The motion situation acquisition unit 111 stores the acquired motion situation information in the motion history storage unit 112. In this way, a motion history of one or the plurality of air conditioners 102 is stored in the motion history storage unit 112.
Further, the motion situation acquisition unit 111 acquires, from each of one or the plurality of air conditioners 102 via the network N, change information about a change when setting of the air conditioner 102 is manually changed by using a not-illustrated remote controller of the air conditioner 102. The change information includes, for example, an air conditioner ID, a time (change time) at which a change is manually performed, and a content of the change. The content of the change includes one or a plurality of ON and OFF of a power source, setting or a change of an operation mode, setting or a change of a target temperature, setting or a change of an airflow rate, and setting or a change of a wind direction. The motion situation acquisition unit 111 stores the acquired change information in the motion history storage unit 112.
The motion history storage unit 112 is a storage unit for storing information about motion of the air conditioner 102. Information stored in the motion history storage unit 112 is motion situation information, change information, and the like.
The creation unit 113 creates control pattern information including a control pattern applied to one or the plurality of air conditioners 102, based on a motion history of the one or the plurality of air conditioners 102.
Specifically, the creation unit 113 acquires, as a motion history, the motion situation information stored in the motion history storage unit 112. The creation unit 113 performs statistical processing on the motion history, and creates a control pattern of the air conditioner 102 as a result of the statistical processing. The creation unit 113 stores control pattern information including the created control pattern in the control pattern storage unit 107.
Specifically, for example, the creation unit 113 acquires, based on the motion history, an average value of an operation mode, a target temperature, an airflow rate, and the like of all of the air conditioners 102 or of those in a predetermined period. By using the average values, the creation unit 113 creates first control pattern information including a standard control pattern including a standard operation mode, a standard target temperature, a standard airflow rate, and the like according to a timing. The creation unit 113 stores the first control pattern information in the control pattern storage unit 107.
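The creation of the first (standard) control pattern information can be sketched as follows; the motion-history record format and the use of an hourly timing are illustrative assumptions.

```python
from statistics import mean
from collections import defaultdict

# Illustrative motion-history records: (hour, operation_mode, target_temperature).
MOTION_HISTORY = [
    (9, "cooling", 26.0), (9, "cooling", 25.0),
    (18, "heating", 22.0), (18, "heating", 23.0),
]

def standard_pattern(history):
    """Average target temperature and most frequent mode per timing (hour)."""
    by_hour = defaultdict(list)
    for hour, mode, temp in history:
        by_hour[hour].append((mode, temp))
    pattern = {}
    for hour, entries in by_hour.items():
        modes = [m for m, _ in entries]
        pattern[hour] = {
            "mode": max(set(modes), key=modes.count),        # most frequent mode
            "target_temperature": mean(t for _, t in entries),
        }
    return pattern
```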
Specifically, for example, the creation unit 113 decides a variable (one or a plurality of threshold values K1 to K7, periods of time TH2 to TH7, temperatures TP1 to TP6, and an operation mode) included in the control pattern information 107a, based on the appearance history information 106a and the motion history.
More specifically, for example, the creation unit 113 determines the air conditioner 102 operating in a situation associated with a control condition of the control pattern, based on all pieces of the appearance history information 106a or on those in a predetermined period. The creation unit 113 acquires an average value being a value associated with the variable included in the control pattern, based on the motion history of the determined air conditioner 102. The creation unit 113 also acquires the operation mode that is most frequently adopted, based on the motion history of the determined air conditioner 102. An average value and a mode (the operation mode most frequently adopted) are one example of a method for creating a control condition based on the appearance history information 106a, and may be appropriately changed.
The creation unit 113 creates second control pattern information acquired by changing a variable of a control pattern included in the control pattern information 107a or of the standard control pattern by using at least one of the acquired average value and the most frequently adopted operation mode. The creation unit 113 stores the second control pattern information in the control pattern storage unit 107.
When setting of the air conditioner 102 is manually changed, the individual creation unit 114 creates control pattern information for the air conditioner 102 on which the change is made, based on change information about the change.
Specifically, for example, when the change information from the air conditioner 102a is acquired within a predetermined period of time after the control information is transmitted to the air conditioner 102a, the individual creation unit 114 refers to the change information. The individual creation unit 114 acquires, based on the change information, a value associated with a variable included in the control pattern applied when the control information is transmitted to the air conditioner 102a.
The individual creation unit 114 changes, by using the acquired value, the variable included in the control pattern applied when the control information is transmitted to the air conditioner 102a. As a result of the change, the individual creation unit 114 creates third control pattern information for the air conditioner 102a. The individual creation unit 114 stores the third control pattern information in the control pattern storage unit 107.
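The creation of the third (individual) control pattern information by the individual creation unit 114 can be sketched as follows; the change-information format and the time window are illustrative assumptions.

```python
# Illustrative creation of an individual (third) control pattern: when a user
# manually changes a setting shortly after control information was applied,
# adopt the manually chosen value for that air conditioner.
def create_individual_pattern(applied_pattern, change_info, window_minutes=30.0):
    """Return a per-air-conditioner pattern reflecting the manual change, or None."""
    if change_info["minutes_after_control"] > window_minutes:
        return None  # too late to attribute the change to this control
    pattern = dict(applied_pattern)
    pattern["target_temperature"] = change_info["target_temperature"]
    pattern["aircon_id"] = change_info["aircon_id"]
    return pattern

applied = {"target_temperature": 24.0, "mode": "heating"}
change = {"aircon_id": "aircon-102a", "minutes_after_control": 5.0,
          "target_temperature": 26.0}
```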
According to the present modification example, the creation unit 113 creates control pattern information, based on a result of performing the statistical processing on a motion history of the air conditioner 102a. In this way, the control pattern information according to an actual motion situation of the air conditioner 102a can be created. Therefore, an air environment of a space can be easily made comfortable by using the air conditioner 102a without a person being conscious.
Further, time and effort for a user of the control apparatus 103 to create the control pattern information in advance can be eliminated. Therefore, convenience of a user of the control apparatus 103 can be improved.
According to the present modification example, when setting of the air conditioner 102 is manually changed, the individual creation unit 114 creates individual control pattern information being the control pattern information for the air conditioner 102 on which the change is made, based on change information about the change.
In this way, the individual control pattern information according to a use environment of the air conditioner 102 being manually changed can be created. Then, the air conditioner 102 on which the change is made can be controlled according to the individual control pattern information. Therefore, an air environment of a space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
The control apparatus 103 may control the air conditioner 102 with reference to a use schedule of the meeting room R being a target space and/or a private schedule of a user of the meeting room R.
The schedule information includes one or a plurality of a subject of a schedule, a scheduled timing (including a scheduled time and a scheduled period), a place, a content (for example, a meeting), and the like.
The schedule management apparatus 215 is connected to the control apparatus 103 via the network N. The control apparatus 103 and the schedule management apparatus 215 transmit and receive information to and from each other via the network N.
As illustrated in
The schedule acquisition unit 216 acquires schedule information from the schedule management apparatus 215 via the network N.
The control unit 205 includes the function of the control unit 105 according to the example embodiment. The control unit 205 further controls the air conditioner 102, based on schedule information. For example, the control unit 205 controls the air conditioner 102, based on schedule information, before a target space is used.
Specifically, for example, when a target space is a place (for example, the meeting room R) included in schedule information, the control unit 205 transmits control information to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in the target space. The control information at this time may include, for example, the same content as that of the control information transmitted to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in the target space (for example, the meeting room R) within a predetermined period of time (for example, a previous time).
In this way, the air conditioner 102 can be controlled in a manner suitable for a target space before the target space is used, and a comfortable air environment can be adjusted in advance. Therefore, the air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
For example, when schedule information includes a schedule in which the same subject as a subject who has used a target space (for example, the meeting room R) in the past uses the target space (for example, the meeting room R) again, the control unit 205 transmits control information to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in the target space. The control information at this time may include, for example, the same content as that of the control information transmitted to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in the target space (for example, the meeting room R) when the subject has used the target space (for example, the meeting room R) in the past.
In general, when the same subject uses a target space again, a comfortable air environment is often the same as an air environment at a time of a previous use. Thus, according to this example, the air environment comfortable for the subject can be adjusted in advance. Therefore, the air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious.
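The reuse of previous control information for the same subject can be sketched as follows; the subject name, the place name, and the stored control content are hypothetical.

```python
# Illustrative pre-conditioning from schedule information: when a subject who
# used the target space before is scheduled to use it again, reuse the control
# information transmitted during the previous use.
PAST_CONTROL = {("Sales Dept.", "meeting room R"): {"target_temperature": 25.0}}

def preconditioning_control(schedule_entry):
    """Return previous control info for the (subject, place) pair, if any."""
    key = (schedule_entry["subject"], schedule_entry["place"])
    return PAST_CONTROL.get(key)
```

The control unit 205 would transmit the returned control information to the air conditioners of the scheduled place before the scheduled timing, so the room is already comfortable when the subject arrives.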
For example, the control unit 205 transmits control information to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in a target space (for example, the meeting room R), based on a kind of a subject included in a use schedule of the target space. A kind of a subject is, for example, government and municipal offices, a company, a circle, a neighborhood association, and the like.
The control information at this time may include, for example, the same content as that of the control information transmitted to the air conditioner 102 (for example, the air conditioner 102a and/or 102b) provided in the target space (for example, the meeting room R) when a subject of the same kind has used the target space (for example, the meeting room R) in the past.
In general, when subjects of the same kind use a target space, their clothing is often similar. For example, when a subject who uses the meeting room R is a government or municipal office or a company, suits are often worn. When a subject who uses the meeting room R is a circle or a neighborhood association, clothing is often casual.
Thus, by controlling the air conditioner 102 based on a kind of a subject included in a use schedule of a target space, the air conditioner 102 can be controlled in a manner according to the clothing of the subject, and a comfortable air environment can be adjusted in advance. Therefore, the air environment of the space can be easily made comfortable by using the air conditioner 102 without a person being conscious of the control.
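The kind-of-subject variant described above can be sketched in the same way, keyed by the subject kind rather than the individual subject. The kinds and temperature values below are hypothetical assumptions chosen only to illustrate the clothing-based rationale (suits versus casual wear); they are not specified by the patent.

```python
# (space, subject_kind) -> control information from a past use by that kind
kind_history = {}

def record_use_by_kind(space, kind, info):
    """Record the control information used when a subject of a given
    kind used a space."""
    kind_history[(space, kind)] = info

def control_for_kind(space, kind):
    """Return control information to pre-apply, based on the kind of
    subject in the use schedule, or None if that kind has no history."""
    return kind_history.get((space, kind))

# A company (suits) previously needed cooler air than a circle (casual wear).
record_use_by_kind("meeting_room_R", "company", {"temperature_c": 23})
record_use_by_kind("meeting_room_R", "circle", {"temperature_c": 26})
print(control_for_kind("meeting_room_R", "company"))  # {'temperature_c': 23}
```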
While the example embodiments and the modification examples of the present invention have been described with reference to the drawings, the example embodiments and the modification examples are merely exemplifications of the present invention, and various configurations other than those of the above-described example embodiments and modification examples can also be employed.
Further, although the plurality of steps (pieces of processing) are described in order in the plurality of flowcharts used in the above description, the execution order of the steps performed in each of the example embodiments is not limited to the described order. In each of the example embodiments, the order of the illustrated steps may be changed to the extent that the context is not impaired. Further, the example embodiments and the modification examples described above can be combined to the extent that their contents are not inconsistent.
A part or the whole of the above-described example embodiments may also be described as in the supplementary notes below, but is not limited thereto.
1.
An air conditioner control apparatus including:
2.
The air conditioner control apparatus according to supplementary note 1 described above, wherein
3.
The air conditioner control apparatus according to supplementary note 2 described above, wherein
4.
The air conditioner control apparatus according to supplementary note 3 described above, wherein
5.
The air conditioner control apparatus according to any one of supplementary notes 2 to 4 described above, wherein
6.
The air conditioner control apparatus according to supplementary note 5 described above, further including
7.
The air conditioner control apparatus according to supplementary note 5 or 6 described above, further including
8.
The air conditioner control apparatus according to any one of supplementary notes 1 to 7 described above, wherein
9.
The air conditioner control apparatus according to any one of supplementary notes 1 to 8 described above, wherein
10.
The air conditioner control apparatus according to any one of supplementary notes 1 to 9 described above, wherein
11.
The air conditioner control apparatus according to supplementary note 10 described above, wherein
12.
An air conditioner control system including:
13.
An air conditioner control method including,
14.
A program causing a computer to execute:
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/JP2022/002898 | 1/26/2022 | WO |  |