Shading and illumination system

Information

  • Patent Grant
  • Patent Number
    12,320,496
  • Date Filed
    Tuesday, October 26, 2021
  • Date Issued
    Tuesday, June 3, 2025
  • Inventors
  • Original Assignees
    • HELLA SONNEN- UND WETTERSCHUTZTECHNIK GMBH
  • Examiners
    • Collins; Gary
  • Agents
    • Wenderoth, Lind & Ponack, L.L.P.
Abstract
A shading and illumination system includes a shading device for shading viewing openings, an illumination device for illuminating a room, an external sensor for detecting an external parameter acting on the room, an internal sensor for detecting a 3D image of the room, a position of a person present in the room in the 3D image, and a viewing direction of the person, and a control unit for actuating the shading device and the illumination device. The shading device and the illumination device are actuatable depending on the values measured by the external sensor and by the internal sensor. A light parameter acting on the person is determinable depending on the detected viewing direction, on the detected position, on the 3D image of the room, and on the external parameter. The shading device and/or the illumination device are/is actuatable depending on the light parameter acting on the person.
Description
BACKGROUND OF THE INVENTION

The present invention relates to a shading and illumination system for a room in a building with at least one shading device for shading viewing openings, in particular windows, of the building, at least one illumination device for illuminating the room, at least one external sensor for detecting at least one external parameter acting on the room from outside, at least one internal sensor for detecting a 3D image of the room, at least one position of a person present in the room in this 3D image and a viewing direction of this at least one person, and a control unit for actuating the at least one shading device and the at least one illumination device. The at least one shading device and the at least one illumination device are actuatable by the control unit depending on the values measured by the external sensor and by the internal sensor. The invention also relates to a building automation system for monitoring, controlling and adjusting facilities in a building, wherein this building automation system has such a shading and illumination system.


Shading and illumination systems are used in a variety of ways in buildings to give the persons present therein a pleasant residential experience. In a work environment in particular, it is essential that the shading and the illumination are well balanced. This means that in the case of strong solar radiation the shading is to prevent considerable heat generation and glare, while at the same time as much daylight as possible is to be available to illuminate the room. To achieve target illumination levels, artificial light is added effectively and efficiently: in short, intelligent shading is sought.


When working in an office environment, a person is to be protected from glare as much as possible. In scientific circles and in one European standard (EN 17037), the so-called daylight glare probability (DGP for short) is considered to be an indication of glare, wherein a value of below 0.40 should be achieved. Excessive heating to above 28° C. is also to be avoided in order to protect against long-term physical harm. Protective mechanisms are already used for this (e.g. glare protection is set to active or a sunshade is lowered, or the like). In the state of the art, such protective mechanisms are based on the detection of people at their workstation. This is effected, for example, via presence or movement detection or via automatic absence processes. In this case, optical sensors from Steinel with the type designation HDP2 KNX are used, for example, with which the number of people in a room can be detected.


The company infsoft GmbH for example offers an “infrared thermopile sensor” which can be used to track movement and temperature of persons and objects. The infrared sensor measures the current temperature as well as temperature gradients within the field of vision and represents the temperature distribution in corresponding colours. So-called presence detectors based on passive infrared (PIR) detect physical presence in the internal area and are suitable for offices, public buildings, corridors, storerooms and washrooms. One of the most frequent applications is presence-dependent and energy-efficient light control. Presence detection has a considerable effect in respect of reducing the artificial light requirement. If necessary, the artificial light system is added to the daylight until the target illuminance is reached. Ultrasonic sensors detect persons, objects or fill levels with high precision and over a wide distance range, even in the presence of fog, dirt or dust.


Nowadays, a certain degree of individualization is also achieved by using localization features of smartphones. The person or occupant saves basic settings in their smartphone, which are then applied at each of their workstations/residence areas.


With regard to visual comfort, however, presence detection alone is not enough. Glare is a phenomenon which depends on the outside space with its light sources and brightness distribution, on the position in the interior space, on the viewing direction and on the entire interior of the room. Models for this are e.g. the daylight glare probability of EN 17037 or vertical illuminance levels as estimators. It is disadvantageous to close the entire façade in order to implement the glare protection function when the position of the sun is known. As a result, an increased artificial light requirement arises because of lower daylight illuminance levels, and contact with the outside world is restricted (requirement definition in EN 14501).


With regard to energy flow optimization, partial aspects are known from the scientific literature. For example, the calculation program provided by the research project DALEC (Day- and Artificial Light with Energy Calculation) makes the year-round energy and comfort characterization of buildings possible for selected working points (www.dalec.net). There, the energy flow through the façade is calculated using angle-dependent overall energy transmittances, wherein angle dependence is to be understood both as the angle of incidence and as the positioning angle.


Korean patent KR 10-1049496 B1 describes in detail that the energy flows can be optimized in a time step on the basis of external and internal data and heating loads from the building management system, taking various temporal or thermal comfort criteria into account. Based on external weather data and internal room data including information about the users (distance from the window and viewing direction as an angle to the window), control of the sunshade can be optimized on the basis of a target function between comfort and energy.


US 2018/0101733 A1 describes a method for detecting persons in front of monitors from the perspective of security and authentication.


WO 2019/183232 A1 describes a method for the zonal control of switchable glass panes on the basis of cloud cover. It is described here that a building management system (BMS) operates to maintain a comfortable environment for the users while at the same time minimizing heating and cooling costs.


US 2018/0157141 A1 describes a BMS (building management system) that displays a multipurpose controller for multistate windows.


U.S. Pat. No. 10,242,269 B2 describes the coordinative occupant detection by means of IR technology to control artificial light and other environmental parameters in a house.


U.S. Pat. No. 10,513,415 B2 describes in detail sensor technology for detecting persons and its application in passenger control.


US 2016/0195856 A1 describes a multisystem which observes the user and also his/her viewing direction. An ergonomic consideration is also effected. A display can change the position after a period of time in order to reduce a lasting stiff posture of the user.


WO 2019/183289 A1 describes a control system with a 3D outside space model.


Individual zones of windows are controlled via a switchable glass pane. The interior space calculation is effected via predefined areas in which the user is to be expected.


EP 2 760 164 B1 describes an intuitive and self-learning residential and building comfort control, as well as encrypted, wireless or wired data transmission.


WO 2016/058695 A1 discloses a device and a method for reducing the glare effect in a room of a building. The light source can be dimmed in a pixelated manner. The calculation provides that a camera is then used to investigate whether shadows are falling on the users. The ergonomic advantage is seen in the fact that complete dimming is not necessary.


SUMMARY OF THE INVENTION

The object of the present invention is now to develop a shading and illumination system which is improved compared with the state of the art. In particular, the known disadvantages are to be at least partly avoided. A control of the system which is as automated, simple and efficient as possible is to be possible.


According to the invention, a light parameter acting on the person, in particular on an eye of the person, is determined depending on the detected viewing direction, the detected position, the 3D image of the room and the at least one external parameter, wherein the at least one shading device and/or the at least one illumination device are/is actuatable depending on this light parameter acting on the person.


An individual light parameter geared to the at least one person in the room is thus determined by the system. This light parameter then serves as the basis for the shading and/or illumination. Even if several persons are present in the room, an individual (and in each case current) light parameter can be determined for each of these persons, whereupon an individual adaptation of the shading and/or illumination is effected.


The external parameter can comprise one or more values. With respect to the illumination, at least one radiometric quantity or a photometric equivalent should be incorporated in the external parameter. In addition (or alternatively) the external temperature can be incorporated in the external parameter.


Specifically, preferably the at least one external parameter is based on an irradiance and/or an illuminance and/or a radiation density (or radiation density distribution) and/or an external temperature.


The irradiance (E) (also radiant flux density) denotes the total power of the electromagnetic radiation incident on a surface, relative to the size of that surface. The irradiance is indicated in watts per square metre and is a radiometric quantity.


In contrast to this, the illuminance is the photometric equivalent. The illuminance Ev, on the one hand, describes the surface-related luminous flux which is incident on an illuminated object. The luminous intensity, on the other hand, describes the solid-angle-related luminous flux of a light source. The SI unit of illuminance is the lux (lx).


The radiance or radiation density (L) (also specific intensity) supplies detailed information about the location dependence and direction dependence of the radiation emitted by an emitter surface. The radiation density indicates what radiant power is emitted from a given point of the radiation source, in the direction given by the polar angle and the azimuth angle, per projected surface element and per solid angle element. The radiation density is indicated in watts per square metre per steradian and is a radiometric quantity. The luminance, indicated in cd/m2, forms the photometric equivalent.
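For reference, the standard textbook definitions of these radiometric quantities and their photometric counterparts can be summarized as follows; these relations are general physics conventions and are not reproduced from the patent itself:

```latex
\begin{aligned}
E   &= \frac{\mathrm{d}\Phi_e}{\mathrm{d}A}                                 &&\text{irradiance } [\mathrm{W/m^2}] \\
E_v &= \frac{\mathrm{d}\Phi_v}{\mathrm{d}A}                                 &&\text{illuminance } [\mathrm{lx}] \\
L   &= \frac{\mathrm{d}^2\Phi_e}{\mathrm{d}A\,\cos\theta\,\mathrm{d}\Omega} &&\text{radiance } [\mathrm{W/(m^2\,sr)}] \\
L_v &= \frac{\mathrm{d}^2\Phi_v}{\mathrm{d}A\,\cos\theta\,\mathrm{d}\Omega} &&\text{luminance } [\mathrm{cd/m^2}]
\end{aligned}
```

Here \Phi_e is the radiant flux, \Phi_v the luminous flux, A the receiving or emitting area, \theta the angle to the surface normal and \Omega the solid angle.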


In addition to the exemplary external parameters mentioned, the luminance, luminous intensity, radiant intensity or other radiometric quantities or their photometric equivalents can additionally (or alternatively) be used. If the external temperature is also used as an external parameter, it can, for example, be recorded in degrees Celsius or kelvins.


Furthermore, the at least one external parameter preferably represents values measured and/or calculated by the external sensor, and/or ratios based on the irradiance and/or the illuminance and/or the radiation density and/or the external temperature. This means that these values can also be partly based on calculations and can include particular relationships between different values (e.g. direct components relative to diffuse components; one quarter space relative to another quarter space).


In principle, only one external sensor needs to be provided. This sensor can by all means also be arranged inside the room of the building, as long as an adequate view of the environment outside the room is possible.


Several external sensors can also be provided, with which different values are detected or which are directed to different areas of the environment. These values can be processed together (offset) or individually.


Specifically, the at least one external sensor is a pyranometer. Reference may be made here to the “SPN1 Sunshine Pyranometer” from “Delta-T Devices”.


Values which refer to the time of day, the position of the sun and/or the geographical position of the building on the globe can also be incorporated in the external parameter. These do not have to be measured continuously but can be stored, or they can be read out or retrieved via the Internet on an ongoing basis.
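Since the sun position can be computed from the time of day and the geographical position rather than measured, a minimal sketch is given below; the simplified declination and elevation formulas are standard astronomical approximations and the latitude value is an illustrative assumption, none of which is taken from the patent:

```python
import math

def solar_elevation_deg(day_of_year: int, solar_hour: float, latitude_deg: float) -> float:
    """Approximate solar elevation angle (degrees) for a given day, local solar time and latitude.

    Uses Cooper's declination approximation; accuracy of roughly a degree, which is
    sufficient to illustrate that the sun position can be computed rather than sensed.
    """
    # Solar declination (Cooper's equation), in degrees
    declination = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (solar_hour - 12.0)

    lat = math.radians(latitude_deg)
    dec = math.radians(declination)
    ha = math.radians(hour_angle)

    sin_elev = math.sin(lat) * math.sin(dec) + math.cos(lat) * math.cos(dec) * math.cos(ha)
    return math.degrees(math.asin(sin_elev))

# Example: midsummer at solar noon, at an assumed latitude of 47.3 degrees north
print(round(solar_elevation_deg(day_of_year=172, solar_hour=12.0, latitude_deg=47.3), 1))
```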


A similar fixed value can be stored for the horizon (as seen from the location of the building). Mountains and adjacent buildings can thereby also be continually taken into account as shade providers. Adjacent buildings can be added in the event of a change and then remain fixed/static again.


With regard to the at least one internal sensor, it is in principle possible that similar radiometric or photometric quantities can be detected with this internal sensor.


It is preferably provided that the at least one internal sensor for detecting the 3D image is a ToF camera. ToF cameras are 3D camera systems using time of flight (ToF for short) techniques. To this end, the scene, in this case the room in the building, is illuminated by means of a light pulse, and for each pixel the camera measures the time that the light takes to travel to the object and back again. The time required is directly proportional to the distance. For each pixel the ToF camera thus supplies the distance of the object imaged thereon. A three-dimensional point cloud is produced by this ToF camera. The intensity of the reflected beam gives an indication of the optical reflectance of the material.
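To make the time-of-flight relation concrete, the following minimal sketch converts a per-pixel round-trip time into a depth map and unprojects it into a 3D point cloud with a simple pinhole camera model; the intrinsic parameters and image size are illustrative assumptions, not values from the patent:

```python
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def tof_to_point_cloud(round_trip_time_s: np.ndarray, fx: float, fy: float,
                       cx: float, cy: float) -> np.ndarray:
    """Convert a per-pixel round-trip time (seconds) into an (H*W, 3) point cloud.

    Depth is half the distance travelled by the light pulse; the pinhole model
    with focal lengths fx, fy and principal point (cx, cy) unprojects each pixel.
    """
    depth = C * round_trip_time_s / 2.0                 # metres, shape (H, W)
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))      # pixel coordinates
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Example with a synthetic 4x4 time map corresponding to objects roughly 3 m away
times = np.full((4, 4), 2 * 3.0 / C)
cloud = tof_to_point_cloud(times, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape, cloud[0])
```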


Also, several ToF cameras can be arranged in the room. Several point clouds can be produced thereby, which are then combined into a more complete 3D image by fusion. Other methods could also be used for the 3D generation, e.g. 4D radar or the LIDAR system.


In principle, it is possible for the internal sensor in the form of the ToF camera to also detect the position of the person in the room. In addition (or alternatively), however, a further internal sensor in the form of a presence detector can be arranged in the room. With this presence detector, several persons can be detected in the room and their position in the room can be determined. The data of the presence detector can then be combined with those of the ToF camera.


Preferably, several internal sensors are arranged in the room to detect several persons.


The data of the ToF camera and/or of the presence detector can preferably be used to also determine the viewing direction of the person(s).


Furthermore, preferably the 3D image contains surfaces arranged in a coordinate system, wherein reflectances of these surfaces of the room are determinable via the at least one internal sensor, preferably by estimation or calculation. Specifically, a particular reflection value per pixel (e.g. 340×680 VGA) can be determined here.


The parameters mentioned hitherto refer to detected values which depend directly on the locations of the respective sensors, wherein starting from these locations a good overview of the conditions outside and inside the room of the building is certainly provided.


An important factor now, however, is what the condition is like in the area occupied by the person(s). The parameters in the area of the eye or face of the person(s) are particularly important in this case.


To this end, preferably, from the external parameters and internal parameters (3D image, position and viewing direction) made available by the at least one external sensor and the at least one internal sensor, an individual light parameter is determined for that position or that area where the face (or the eyes) of the person(s) is located.


Preferably, the light parameter corresponds to an illumination parameter. An illumination parameter can be a non-photometric, biologically active parameter (which includes e.g. the melanopic potential, wherein the melanopic efficiency factor is a measure of the circadian effect of a light source).


It is particularly preferable that the illumination parameter is defined by the vertical illuminance. Here the three-phase method (Three-Phase Method for Simulating Complex Fenestration with Radiance) can be used to determine the vertical illuminance (this will be explained in greater detail further below in the description of the figures).


However, not only is the illumination in the area of the person(s) relevant, but also the glare. It is therefore alternatively (or additionally) preferable that the light parameter corresponds to a glare parameter, wherein this glare parameter is defined on the basis of a daylight glare probability or other glare metrics. The luminance and the solid angle as well as the vertical illuminance are incorporated in the calculation of this daylight glare probability. Qualitatively, the formula consists of a sum, over the glare sources, of the luminance at the respective solid angle times that solid angle times a location weighting factor, plus a contribution from the vertical illuminance. The luminance is in turn calculated from the 3D image with the surface properties (reflectance) and the data of the external parameter, depending on the viewing direction and position of the person(s).
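For orientation, the formulation of the daylight glare probability most widely cited in the literature (Wienold and Christoffersen) is given below; the patent describes the formula only qualitatively, so the coefficients shown here are taken from that literature and are not the document's own equation:

```latex
\mathrm{DGP} = 5.87\cdot 10^{-5}\,E_v
  + 9.18\cdot 10^{-2}\,\log_{10}\!\left(1 + \sum_i \frac{L_{s,i}^{2}\,\omega_{s,i}}{E_v^{1.87}\,P_i^{2}}\right)
  + 0.16
```

Here E_v is the vertical illuminance at the eye, L_{s,i} the luminance of glare source i, \omega_{s,i} its solid angle and P_i its position (location weighting) index.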


Anything that can change the radiation transmission into the building can be used as shading device. Any artificial light source can be used as illumination device.


It is preferable that the shading device is formed as a mechanical shading element (e.g. as a venetian blind, curtain, awning, roller shutter, roller blind, vertical venetian blind or pleated blind) and/or as intelligent glass.


Intelligent glass (also known as smart glass or dynamic glass) denotes tintable glazing, the light and radiation permeability of which changes due to the application of an electric voltage (electrochromism), changed light conditions (photochromism) or heating (thermochromism). A window made of intelligent glass can have several sections or areas that can be actuated differently and tinted differently.


It is particularly preferable that the shading device has several shading panels, preferably arranged in the manner of a matrix, wherein at least two, preferably at least four, shading increments can be set for each shading panel.


It can be provided here that the shading panels in totality can completely cover a (large) viewing opening.


The individual shading panels can have different sizes. In order to enable a simple actuation and a targeted and detailed shading, it is preferable that the individual shading panels have the same size. For example, the shading panels can have a size of between 1.5 m2 and 25 cm2, preferably of between 2000 cm2 and 100 cm2.


Furthermore, it is preferable that the illumination device has a lighting means (for example a light-emitting diode), a support, which is preferably integrated, suspended or standalone, and a power supply means. The lighting means can be dimmable or only have two states (off/on). The light distribution can be variable (for example the direct component and the indirect component of a standalone light can be taken into account).


The interaction of the person with the entire system (in particular with the control unit) can be effected via a computer or also via a smartphone app. These thus serve as a user interface.


The control unit (central computation unit) is preferably formed as a switch cabinet with an arithmetic unit and modules connectable to the arithmetic unit for the illumination device and for the shading device. Other inputs and outputs can also be provided.


Preferably, the user interface (including monitor and input means) is connected in terms of signalling to the control unit.


It may also be mentioned that the determination of the various parameters is effected in real time. This means that all parameters are detected afresh and updated at regular intervals, whereby movements of the person(s) can also be taken into account accordingly in the control of the system.


Protection is also sought for a building automation system for monitoring, controlling and adjusting facilities in a building with a shading and illumination system according to the invention.


In addition to the shading devices and illumination devices, facilities of this kind belonging to the building automation system can also be the air-conditioning systems, heating, electronics etc. As basic technical elements, the building automation system can have automation equipment (e.g. DDC-BAS control units), a switch cabinet, field devices (such as sensors and actuators), room automation systems, cabling and bus systems, servers and gateways as well as management and operating facilities.


The KNX standard for a field bus of the building automation can be used as a basis, for example.


Important points and preferred embodiments of the invention are mentioned again below in other words.


The invention describes a system for the model-based predictive control of a discretized shading and illumination system with the target requirement of satisfying visual ergonomic requirements and energy optimization. This discretization can be effected in a variety of ways: by a lamellar curtain, which has two different angular positions vertically; by adjacent curtains, which can be moved independently of one another; by a fabric curtain, which can be moved in strips; or by dynamic glass, which is actuatable as a pixel array, wherein two vertical sections are to be interpreted as two pixels. For example, closing a single façade pixel can be sufficient to achieve this optimum.


It is possible with the invention to prevent glare (part of visual comfort) of a person located in the room in any time resolution, from any working point and any viewing direction.


It is now possible to take the solar energy flows through the façade into account (radiometric and photometric).


The energy requirement for artificial light can also be taken into account.


In particular, the current position (x, y, z in the coordinate system) and the viewing direction (xview, yview, zview) of the person in relation to the environment are reacted to, and the optimal configurations with respect to visual ergonomics (e.g. visual comfort) and energy flows into and out of the building are found.


The sensor component (internal sensor) can be positioned anywhere in the room, thus e.g. mounted on the ceiling or clipped onto monitors.


The at least one internal sensor preferably comprises a measuring system in the form of presence or movement detectors with which even rapid human gestures or body part movements can be detected.


The localization of the persons present, in particular their head position, viewing direction and posture can be effected by sound waves or electromagnetic waves (also optically by photography) and artificial intelligence in a building-wide coordinate system.


The localization of the workstation environment in the room, in particular displays, monitors, work surfaces, tables and chairs is effected by the aforesaid technologies.


Façade properties (window position, sunshade, daylight system, etc.) are likewise detected by an aforesaid technology or e.g. on the basis of planning data.


The detection of the outside space is effected by known sensor systems, which measure direct components and diffuse components of the irradiation and illumination and measure or estimate a degree of cloudiness, preferably including the surrounding buildings from e.g. planning data or survey work.


The calculation of the photometric effects related to the viewpoint found and viewing direction of the person is effected depending on the façade configuration (windows), e.g. by backward ray tracing methods (e.g. view matrix of the Radiance 3-phase method), forward ray tracing methods, radiosity methods, estimation of the vertical illuminance or the like.


The calculation of the energy flows for each possible façade configuration is effected in the central or decentralized arithmetic unit (control unit).


The use of temperature sensors internally and externally is possible.


The optimal setting of the components (e.g. façade, artificial light) is preferably also found from the perspectives of energy requirement and visual ergonomics via the arithmetic unit according to mathematical target optimization methods.


Digital or analogue movement commands can be sent to the façade system (shading device) and artificial light system (illumination device).


The control of the system can be effected by e.g. bidirectional data exchange between illumination device, shading device and control unit. To this end, reference can be made again to the above-mentioned EP 2 760 164 B1.


The person localization can also be used for gesture control.


The internal sensor (e.g. in the form of a ToF camera) detects the geometry and reflection properties in relation to a visible reference field. This reference field has defined diffuse reflection properties and is used to calibrate the intensity image of an optical measuring device (e.g. time of flight).


From the 3D data, the model-based control system produces a model of the room, including estimated reflectances. Using the model of the outside space, derived from planning data including topography, together with the current state of the sky, the control module can select energy-optimal configurations.


The energy requirement for artificial light can be determined in a teach-in (learning) process. To this end, for example, the power consumption of the illumination device is monitored and the illuminance of the artificial light system is measured.


The daylight input is determined using common factor methods (e.g. daylight factors, 3-phase method).


In the time step, the model-based control system optimizes, for example, between various configurations (see the different switched shading panels further below in FIGS. 2 and 3).


The positioning of the lamella curtain can be discretized by different tilt angles over the curtain height: for example, 0° in the topmost area, 45° above half height, 0° below half height and raised so that the lowest area is completely open.


Switchable glass panes can be adjusted continuously in terms of transmission behaviour, from maximum to near zero. For example, the glass can be actuated separately in three areas.


The positioning of this discretizable shading device makes sense above all with a model-based automatic system. The diversity of the solution space is enormous: e.g. in the representation according to FIGS. 2 and 3, the façade element is represented by 5×4 elements (shading panels). If each of these elements supports only three degrees of freedom (open, 50% transmission, 0% transmission), the solution space comprises 3^20 (approximately 3.5 billion) configurations.


Two target functions by way of example could now be:

    • minimizing the cooling energy requirement: for this, the shading device can be closed during the day in summer and in the absence of persons. When a person is present, the energy input by artificial light is weighed against the solar input. The same movement command can then be transmitted over the entire façade or the entire shading device.
    • optimizing comfort: here the energy requirement is treated as a lower priority.


Since there is no single global optimum, the choice of solution is predefined by a preference setting of the person(s) located in the room: for example, the person can indicate that they are sensitive to glare, or, on the contrary, that they would like it to be very bright.
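A minimal sketch of how such a preference-weighted target function could look is given below; the weighting scheme, the reuse of the 0.40 DGP guideline from the background section and all numerical values are illustrative assumptions rather than the patent's prescribed method:

```python
from dataclasses import dataclass

@dataclass
class ConfigResult:
    dgp: float            # daylight glare probability for this facade configuration
    solar_gain_w: float   # solar energy input through the facade in watts
    artificial_w: float   # electrical power needed to reach the target illuminance

def cost(result: ConfigResult, glare_weight: float, energy_weight: float) -> float:
    """Weighted target function: penalize glare above the 0.40 DGP guideline and total energy use."""
    glare_penalty = max(0.0, result.dgp - 0.40)
    energy_penalty = result.solar_gain_w + result.artificial_w   # cooling-season view: both are loads
    return glare_weight * glare_penalty + energy_weight * energy_penalty

# A glare-sensitive occupant raises glare_weight; an energy-oriented setting raises energy_weight.
candidates = [ConfigResult(0.48, 900.0, 20.0), ConfigResult(0.35, 400.0, 80.0)]
best = min(candidates, key=lambda r: cost(r, glare_weight=1000.0, energy_weight=1.0))
print(best)
```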





BRIEF DESCRIPTION OF THE DRAWINGS

Further details and advantages of the present invention are explained in more detail below with reference to the embodiments represented in the drawings, in which:



FIG. 1 schematically shows a building with a shading and illumination system,



FIGS. 2 and 3 schematically show a room with differently activated shading panels,



FIG. 4 shows a field of view with a building and the sun,



FIG. 5 shows the field of view from FIG. 4 with light parameters overlaid with a matrix,



FIG. 6 schematically shows a room together with the outside world and the matrices of the 3-phase method,



FIG. 7 shows a matrix with 145 fields, and



FIG. 8 shows a formula for the 3-phase method by way of example.





DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 schematically shows a building 3 (e.g. an office block) with an external façade 11. A plurality of viewing openings 5 are formed in this external façade 11. These viewing openings 5 can be formed as windows.


The majority of these viewing openings 5 are provided with a shading device 4. It is indicated schematically that a shading device 4 in the form of a lamellar curtain (e.g. a venetian blind) is lowered.


At least one room 2 (indicated by the dashed lines) is formed in the building 3. An (artificial) illumination device 6 is arranged in this room 2.


In addition, at least one internal sensor 8 is arranged in this room 2.


Outside the room 2 (e.g. on the external façade 11) there is an external sensor 7 for detecting at least one external parameter Pout acting on the room 2 from outside.


Furthermore, a control unit 9 is provided for actuating the at least one shading device 4 and the at least one illumination device 6.


The shading device 4, the illumination device 6, the external sensor 7, the internal sensor 8 and the control unit 9 together form the shading and illumination system 1 for the room 2 of the building 3.


It is also indicated in FIG. 1 that the building 3 has a building automation system 10 for monitoring, controlling and adjusting facilities in the building 3. The shading and illumination system 1 can be an integral constituent of this building automation system 10.


A room 2 in a building 3 is represented schematically in FIG. 2, wherein this room 2 has a shading and illumination system 1.


Furniture 12 (e.g. a desk) is arranged in the room 2. A person P sits at this furniture 12 formed as a desk.


A monitor 13 is arranged on the desk. This monitor 13 can act as a user interface for the control unit 9.


An internal sensor 8 is arranged on the ceiling of the room 2. This internal sensor 8 is configured to detect a 3D image 3D of the room 2, at least one position x, y, z of the person P located in the room 2 in this 3D image 3D and a viewing direction xview, yview, zview of this at least one person P.


In addition, an illumination device 6 is located on the ceiling of the room 2, wherein two lamps of this illumination device 6 are represented in this case.


Furthermore, a further sensor 14 is arranged on the ceiling of the room 2. This sensor 14 serves with its detected values to control the luminous flux at target illuminances. This type of sensor is generally termed a look-down sensor. This sensor 14 can also be part of the internal sensor 8.


Optical properties (e.g. reflectances) of surfaces (furniture surface 15 and floor surface 16) can also be detected by the internal sensor 8.


Outside the room 2, the sun 17 is indicated schematically. The position of the sun 17 can be read from a world coordinate system and is set in relation to the room 2. Sun rays S1, S2 and S3 penetrate the room 2 through the viewing opening 5 and fall on the person P. The sun ray S1 falls directly on the face of the person P and results in glare. The sun ray S2 is reflected via the furniture surface 15 onto the person P. The sun ray S3 results in glare for the person P via a reflection on the floor surface 16.


An external sensor 7 is arranged on the outside of the room 2. The control unit 9 is indicated schematically in the room 2. The individual components of the shading and illumination system 1 have a signalling connection to one another.


Starting from the detected viewing direction xview, yview, zview and from the detected position x, y, z of the person P and from the 3D image (reference sign 3D) of the room 2 and depending on the at least one external parameter Pout, the control unit 9 is configured to determine a light parameter L acting on the person P, in particular on the face of the person P. This light parameter L is then used by the control unit 9 again as a basis for the actuation of the at least one shading device 4 and/or of the at least one illumination device 6.


In the embodiment example represented in FIG. 2, the shading device 4 is formed as a window with an intelligent glass pane. This intelligent glass pane is divided in this case into 4×5 shading panels 4.1 to 4.20. To retain clarity, only the shading panels 4.1, 4.3 and 4.14 are given reference signs. The shading panels 4.1 to 4.20 together form the shading device 4.


Each shading panel 4.1 to 4.20 can assume several—in this case six—different shading settings A, B, C, D, E and F. In the shading setting A, there is no dimming or tinting of the corresponding panel. In the shading setting F, there is maximum dimming of the corresponding panel. The shading settings B, C, D and E form different shading stages (e.g. 20%, 40%, 60% and 80%) of maximum dimming.


If the shading panels 4.1 to 4.20 are part of an intelligent glass pane—as depicted—it is possible to set them in 1% increments or finer shading increments (with 1% increments there are 101 shading settings).


However, shading panels 4.1 to 4.n can also be produced, in contrast to the representation in FIG. 2, via mechanical shading elements. Thus, a window front can be covered by several venetian blinds, for example. Each window of this window front then forms its own shading panel 4.1 to 4.n. In addition, such a single window can be subdivided again in itself. Thus, the venetian blind can be lowered to a varying extent, resulting in further shading increments. In addition, the lamellae of the venetian blind can assume different angular positions (e.g. 15°, 45° and 90°), resulting in several shading increments A to F.


As illustrated in FIG. 2, the two panels with the shading increment F serve to ensure that the rays S2 and S3, which would otherwise lead to reflection glare via surfaces of the room 2, are kept from penetrating the room 2, or at least that the incident intensity is reduced. The sun rays S1 falling directly on the person P can also be mitigated by corresponding shading of the panels. Complete dimming is not necessary here, however, since for the current viewing direction the sun appears only at the very edge of the field of view. For the detected viewing direction xview, yview, zview, the reflected sun rays S2 and S3 incident from below are more uncomfortable for the person P, which is why these are dimmed more strongly.


Of course, other settings can also be made here according to individual wishes.


Such shadings and settings are not limited to intelligent glass either. On the contrary, this can also work in the same way in the case of shading devices 4 in the form of venetian blinds, roller shutters etc.


However, not only can the shading device 4 be actuated in this way depending on the detected light parameter L, but the illumination device 6 can also be actuated correspondingly. Thus, in the event of sun rays S1 falling frontally on the face, complete dimming via the shading device 4 can be necessary, wherein, however, the illumination device 6 then has to be turned on more strongly again.


A further embodiment example of a shading/illumination system 1 for a room 2 of a building 3 is represented in FIG. 3, wherein a different setting has been made for the shading here. Based on other basic settings or based on individual settings of the person P located in the room 2, a different shading via the shading device 4 can result, in the case of the same external parameters Pout.


Thus, the shading panel 4.8 is completely dimmed with the shading increment F. Direct solar radiation onto the head of the person P is thereby prevented. At the same time, however, the shading panel 4.3 arranged above it is not dimmed at all (shading increment A). Furthermore, in these settings, the shading panels 4.12 to 4.14 are set at shading increment C, so that greater reflection via the sun ray S2 is permitted. The shading panels 4.4, 4.5, 4.9 and 4.10, however, are set to the shading increment E.


In FIGS. 4 and 5, a possible embodiment for determining the light parameter L is examined in more detail. In FIG. 4, the field of view (hemispherical, spanning 180°) of a person P, who in this case is standing in the outside space, is represented schematically: a building stands in the left area, the ground is in the lower area, and the sun 17, with an angular diameter of approximately half a degree and the much larger, very bright area surrounding it, is in the upper middle area. The brightness of the sun 17 itself has already been transferred in this representation according to FIG. 4 into a discretization of the firmament, which is why the sun 17 is represented as a square solid angle region.


A matrix can be laid over this field of view (see FIG. 5), in this case a matrix with 145 fields. This representation reproduces the view matrix of the 3-phase method in the backward ray tracing software package Radiance. The area of the sun 17 in FIG. 4 is now divided among these matrix fields. Then, as represented in FIG. 5, a particular value (e.g. the illuminance) is generated for each field and the sum of all field contributions is calculated. The glare potential can be evaluated from this. The matrix fields in the area of the sun 17 are primarily responsible for glare; the grey surfaces of the remainder of the world in the field of view, on the other hand, are not critical for glare.
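As a sketch of how such field contributions can be summed, the vertical illuminance at the eye can be approximated as the sum, over all visible patches, of luminance times solid angle times the cosine of the angle to the viewing direction; the equal-solid-angle patch layout and all numerical values below are illustrative assumptions:

```python
import numpy as np

def vertical_illuminance(luminance_cd_m2: np.ndarray,
                         solid_angle_sr: np.ndarray,
                         cos_to_view_dir: np.ndarray) -> float:
    """Approximate E_v [lx] as the sum over all visible patches of L * omega * cos(theta)."""
    return float(np.sum(luminance_cd_m2 * solid_angle_sr * np.clip(cos_to_view_dir, 0.0, None)))

# 145 patches: one very bright circumsolar patch, the rest a uniformly grey surrounding
L = np.full(145, 2_000.0)               # cd/m2 for the grey background patches (assumed)
L[72] = 1.0e7                           # discretized circumsolar patch (illustrative)
omega = np.full(145, 2 * np.pi / 145)   # hemisphere split into equal solid angles (simplification)
cos_t = np.full(145, 0.5)               # placeholder geometry factor per patch

print(round(vertical_illuminance(L, omega, cos_t)))  # result dominated by the sun patch
```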


This view matrix in FIG. 5 is to be calculated for every current viewpoint and the current viewing direction.


In FIG. 6, a room is represented schematically in relation to the outside world, wherein this is used to explain the 3-phase method.


In the 3-phase method (Radiance), the transfer from a discrete area of the sky (sky matrix S) to a calculation point in the room is calculated via three matrices. The daylight matrix D describes in discrete grids how much of the discretized sky (sky matrix S) can be seen from the façade point. The transmission matrix T in turn describes in a discretized manner how much is transferred from a half-space area outside to another half-space area inside. The view matrix V describes in a discretized manner how much arrives at the calculation point from the inner half space.


The discretization can be effected in 145 areas. To this end, such a grid is represented by way of example in FIG. 7 with 145 fields.


A calculation formula for the illuminance is indicated by way of example in FIG. 8.


The illuminance at n calculation points can be calculated by means of matrix multiplication. It is thus a function of V, T, D and S, f(V,T,D,S).
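In the notation commonly used for the Radiance 3-phase method, this matrix relation can be written as shown below; this is the standard literature form and is not necessarily identical to the example formula of FIG. 8:

```latex
\mathbf{E} = \mathbf{V}\,\mathbf{T}\,\mathbf{D}\,\mathbf{S}
```

where E is the vector of illuminances at the n calculation points, V the view matrix, T the transmission matrix of the façade, D the daylight matrix and S the discretized sky vector.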


More precise extensions of the 3-phase method are available in the scientific literature.


While the matrices V, D and S are constant within a time step, T can now be varied. The variation describes all possible settings of the shading and illumination system. For each setting a parameter is now calculated: for glare Li=f(V,T,D,S), for illuminance Ei=f(V,T,D,S), or, for the solar input, the energy input Q based on an external irradiance and the g value (angle-dependent overall energy transmittance characteristic). Depending on the target function, the optimal façade setting is selected and the shading device 4 and the illumination device 6 are finally activated.
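A minimal sketch of this per-time-step selection is given below. The toy dimensions, the uniform-transmission candidate matrices and the example target function are illustrative assumptions and do not represent the document's concrete implementation:

```python
import numpy as np

def select_facade_setting(V, D, S, candidate_Ts, target):
    """Evaluate E = V @ T @ D @ S for every candidate facade setting T and
    return the index of the setting that minimizes the given target function."""
    costs = []
    for T in candidate_Ts:
        E = V @ T @ D @ S              # illuminance vector at the calculation points
        costs.append(target(E))
    return int(np.argmin(costs))

# Toy dimensions: 3 calculation points, 10 facade patches, 145 sky patches (all assumed)
rng = np.random.default_rng(0)
V = rng.random((3, 10))
D = rng.random((10, 145))
S = rng.random(145)
candidate_Ts = [np.eye(10) * t for t in (1.0, 0.5, 0.1)]   # uniform transmission levels

# Example target: keep the maximum illuminance close to a 3000 lx ceiling (assumed criterion)
target = lambda E: abs(float(E.max()) - 3000.0)
print(select_facade_setting(V, D, S, candidate_Ts, target))
```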


The 3-phase method can be expanded to five or six phases by including further transfer properties. Other methods can, of course, also be used.


Subsequently, in the next time step, the sky state (sky matrix S) changes and the optimization is effected afresh. The view matrix V may also have changed, whereby a change in the shading or illumination can result again. For example, if the desk area is dimmed too strongly, the illumination device 6 in the area of the desk can automatically be switched brighter.


LIST OF REFERENCE SIGNS






    • 1 Shading and illumination system


    • 2 Room


    • 3 Building


    • 4 Shading device


    • 4.1 to 4.n Shading panels


    • 5 Viewing openings


    • 6 Illumination device


    • 7 External sensor


    • 8 Internal sensor


    • 9 Control unit


    • 10 Building automation system


    • 11 External façade


    • 12 Furniture (desk)


    • 13 Monitor


    • 14 Sensor


    • 15 Furniture surface


    • 16 Floor surface


    • 17 Sun

    • Pout External parameter

    • 3D 3D image

    • x, y, z Position of the person in the room

    • P Person

    • xview, yview, zview Viewing direction

    • L Light parameter

    • S1, S2, S3 Sun rays

    • A, B, C, D, E, F Shading increments

    • V View matrix

    • T Transmission matrix

    • D Daylight matrix

    • S Sky matrix




Claims
  • 1. A shading and illumination system for a room in a building, comprising: at least one shading device configured to shade viewing openings of the building; at least one illumination device configured to illuminate the room; at least one external sensor configured to detect at least one external parameter (Pout) acting on the room from outside; at least one internal sensor configured to detect a 3D image of the room, at least one position of a person present in the room in the 3D image, and a viewing direction of the person; and a control unit configured to actuate the at least one shading device and the at least one illumination device, wherein the at least one shading device and the at least one illumination device are actuatable by the control unit depending on the at least one external parameter detected by the at least one external sensor and the 3D image, the at least one position and the viewing direction detected by the at least one internal sensor, wherein the control unit is configured to determine a light parameter acting on the person based on the detected viewing direction, the detected at least one position, the detected 3D image of the room, and the detected at least one external parameter, and wherein the at least one shading device or the at least one illumination device is actuatable depending on the light parameter acting on the person.
  • 2. The shading and illumination system according to claim 1, wherein the at least one external parameter is based on an irradiance, an illuminance, a radiation density, or an external temperature.
  • 3. The shading and illumination system according to claim 2, wherein the at least one external parameter represents values measured or calculated by the external sensor, conditions based on the irradiance, the illuminance, the radiation density, or the external temperature.
  • 4. The shading and illumination system according to claim 1, wherein the at least one external sensor is a pyranometer.
  • 5. The shading and illumination system according to claim 1, wherein the at least one internal sensor configured to detect the 3D image is a ToF camera.
  • 6. The shading and illumination system according to claim 1, wherein the at least one internal sensor includes a plurality of internal sensors arranged in the room and configured to detect a plurality of persons.
  • 7. The shading and illumination system according to claim 1, wherein the 3D image contains surfaces of the room arranged in a coordinate system, wherein the at least one internal sensor is configured to determine reflectances of the surfaces of the room.
  • 8. The shading and illumination system according to claim 7, wherein the at least one internal sensor is configured to determine the reflectances of the surfaces of the room by estimation or calculation.
  • 9. The shading and illumination system according to claim 1, wherein the light parameter corresponds to an illumination parameter, and wherein the illumination parameter is defined by a vertical illuminance.
  • 10. The shading and illumination system according to claim 1, wherein the light parameter corresponds to a glare parameter, and wherein the glare parameter is defined based on a daylight glare probability or an approximation method of glare metrics.
  • 11. The shading and illumination system according to claim 1, wherein the at least one shading device is formed as a mechanical shading element, as a curtain, as an awning or as a roller shutter, or as intelligent glass.
  • 12. The shading and illumination system according to claim 11, wherein the at least one shading device is formed as the mechanical shading element, and the mechanical shading element is a venetian blind.
  • 13. The shading and illumination system according to claim 1, wherein the at least one shading device has a plurality of shading panels, and wherein at least two shading increments can be set for each shading panel.
  • 14. The shading and illumination system according to claim 13, wherein the plurality of shading panels are arranged in the manner of a matrix, and wherein at least four shading increments can be set for each shading panel.
  • 15. The shading and illumination system according to claim 1, wherein the illumination device has a light source, a support, and a power supply.
  • 16. The shading and illumination system according to claim 15, wherein the light source comprises a light-emitting diode, and wherein the support is integrated, mounted, suspended or standalone.
  • 17. The shading and illumination system according to claim 1, wherein the control unit is formed as a switch cabinet with an arithmetic unit and modules connectable to the arithmetic unit for the at least one illumination device and for the at least one shading device.
  • 18. A building automation system for monitoring, controlling and adjusting facilities in a building with the shading and illumination system according to claim 1.
  • 19. The shading and illumination system according to claim 1, wherein the at least one shading device is configured to shade windows of the building, and wherein the light parameter is a light parameter acting on an eye of the person.
Priority Claims (1)
Number Date Country Kind
A 50925/2020 Oct 2020 AT national
US Referenced Citations (330)
Number Name Date Kind
6064949 Werner et al. May 2000 A
8164818 Collins et al. Apr 2012 B2
8213074 Shrivastava et al. Jul 2012 B1
8254013 Mehtani et al. Aug 2012 B2
8643933 Brown Feb 2014 B2
8705162 Brown et al. Apr 2014 B2
8711465 Bhatnagar et al. Apr 2014 B2
8810889 Brown Aug 2014 B2
8864321 Mehtani et al. Oct 2014 B2
9019588 Brown et al. Apr 2015 B2
9030725 Pradhan et al. May 2015 B2
9081246 Rozbicki Jul 2015 B2
9081247 Pradhan et al. Jul 2015 B1
9128346 Shrivastava et al. Sep 2015 B2
9158173 Bhatnagar et al. Oct 2015 B2
9348192 Brown et al. May 2016 B2
9412290 Jack et al. Aug 2016 B2
9423664 Brown et al. Aug 2016 B2
9436054 Brown et al. Sep 2016 B2
9436055 Shrivastava et al. Sep 2016 B2
9442339 Parker et al. Sep 2016 B2
9442341 Shrivastava et al. Sep 2016 B2
9454055 Brown et al. Sep 2016 B2
9454056 Pradhan et al. Sep 2016 B2
9477131 Pradhan et al. Oct 2016 B2
9482922 Brown et al. Nov 2016 B2
9513525 Collins et al. Dec 2016 B2
9638978 Brown et al. May 2017 B2
9645465 Brown et al. May 2017 B2
9664976 Rozbicki May 2017 B2
9671665 Brown et al. Jun 2017 B2
9690162 Wilbur et al. Jun 2017 B2
9703167 Parker et al. Jul 2017 B2
9728920 Brown et al. Aug 2017 B2
9778532 Pradhan Oct 2017 B2
9885935 Jack et al. Feb 2018 B2
9897888 Bhatnagar et al. Feb 2018 B2
9910336 Parker et al. Mar 2018 B2
9921450 Pradhan et al. Mar 2018 B2
9927674 Brown et al. Mar 2018 B2
9946138 Shrivastava et al. Apr 2018 B2
D816518 Brown et al. May 2018 S
9958750 Parker et al. May 2018 B2
9995985 Parker et al. Jun 2018 B2
10001691 Shrivastava et al. Jun 2018 B2
10002297 Sengupta Jun 2018 B2
10048561 Brown Aug 2018 B2
10120258 Jack et al. Nov 2018 B2
10139696 Brown et al. Nov 2018 B2
10139697 Wilbur et al. Nov 2018 B2
10175549 Brown et al. Jan 2019 B2
10180606 Mullins et al. Jan 2019 B2
10241375 Collins et al. Mar 2019 B2
10242269 Aggarwal et al. Mar 2019 B2
10253558 Vigano et al. Apr 2019 B2
10268098 Shrivastava et al. Apr 2019 B2
10303035 Brown et al. May 2019 B2
10320231 Rozbicki Jun 2019 B2
10365531 Shrivastava et al. Jul 2019 B2
10365532 Vigano et al. Jul 2019 B2
10387221 Shrivastava et al. Aug 2019 B2
10401702 Jack et al. Sep 2019 B2
10409130 Parker et al. Sep 2019 B2
10409652 Shrivastava et al. Sep 2019 B2
10429712 Jack et al. Oct 2019 B2
10444589 Parker et al. Oct 2019 B2
10481459 Shrivastava et al. Nov 2019 B2
10495939 Brown et al. Dec 2019 B2
10503039 Jack et al. Dec 2019 B2
10513415 Fang et al. Dec 2019 B2
10514582 Jack et al. Dec 2019 B2
10514963 Shrivastava et al. Dec 2019 B2
10520784 Brown et al. Dec 2019 B2
10520785 Pradhan et al. Dec 2019 B2
10533892 Brown et al. Jan 2020 B2
10539456 Klawuhn et al. Jan 2020 B2
10539854 Brown et al. Jan 2020 B2
10591799 Brown Mar 2020 B2
10598970 Hainfellner Mar 2020 B2
10631379 Deixler et al. Apr 2020 B2
10673121 Hughes et al. Jun 2020 B2
10678103 Mullins et al. Jun 2020 B2
10684524 Collins et al. Jun 2020 B2
10690540 Brown et al. Jun 2020 B2
10704322 Vigano et al. Jul 2020 B2
10712627 Brown et al. Jul 2020 B2
10732028 Klawuhn et al. Aug 2020 B2
10747082 Shrivastava et al. Aug 2020 B2
10768582 Shrivastava et al. Sep 2020 B2
10782583 Bhatnagar et al. Sep 2020 B2
10797373 Hughes et al. Oct 2020 B2
10802372 Brown Oct 2020 B2
10809587 Brown et al. Oct 2020 B2
10809589 Brown Oct 2020 B2
10859887 Vigano et al. Dec 2020 B2
10859983 Shrivastava et al. Dec 2020 B2
10895498 Klawuhn et al. Jan 2021 B2
10895796 Pradhan et al. Jan 2021 B2
10901286 Parker et al. Jan 2021 B2
10908470 Brown et al. Feb 2021 B2
10908471 Vigano et al. Feb 2021 B2
10935864 Shrivastava et al. Mar 2021 B2
10935865 Pradhan et al. Mar 2021 B2
10942413 Vigano et al. Mar 2021 B2
10948797 Pradhan Mar 2021 B2
10949267 Shrivastava et al. Mar 2021 B2
10955718 Vigano et al. Mar 2021 B2
10956231 Shrivastava et al. Mar 2021 B2
10964320 Shrivastava et al. Mar 2021 B2
10969645 Rozbicki et al. Apr 2021 B2
10969646 Jack et al. Apr 2021 B2
10989976 Shrivastava et al. Apr 2021 B2
10989977 Shrivastava et al. Apr 2021 B2
11003041 Vigano et al. May 2021 B2
11016357 Brown et al. May 2021 B2
11054711 Shrivastava et al. Jul 2021 B2
11054792 Shrivastava et al. Jul 2021 B2
11065845 Parker et al. Jul 2021 B2
11067869 Brown et al. Jul 2021 B2
11073800 Shrivastava et al. Jul 2021 B2
11112674 Jack et al. Sep 2021 B2
11114742 Shrivastava et al. Sep 2021 B2
11126057 Brown et al. Sep 2021 B2
11353848 Madden Jun 2022 B1
20110148218 Rozbicki Jun 2011 A1
20120026573 Collins et al. Feb 2012 A1
20120062975 Mehtani et al. Mar 2012 A1
20120147449 Bhatnagar et al. Jun 2012 A1
20120182593 Collins et al. Jul 2012 A1
20120236386 Mehtani et al. Sep 2012 A1
20120239209 Brown et al. Sep 2012 A1
20120293855 Shrivastava et al. Nov 2012 A1
20120327499 Parker et al. Dec 2012 A1
20130157493 Brown Jun 2013 A1
20130271812 Brown et al. Oct 2013 A1
20130271813 Brown Oct 2013 A1
20130271814 Brown Oct 2013 A1
20130271815 Pradhan et al. Oct 2013 A1
20130278988 Jack et al. Oct 2013 A1
20140015930 Sengupta Jan 2014 A1
20140160550 Brown et al. Jun 2014 A1
20140170863 Brown Jun 2014 A1
20140192393 Bhatnagar et al. Jul 2014 A1
20140236323 Brown et al. Aug 2014 A1
20140247475 Parker et al. Sep 2014 A1
20140268287 Brown et al. Sep 2014 A1
20140303788 Sanders et al. Oct 2014 A1
20140349497 Brown et al. Nov 2014 A1
20140355097 Brown et al. Dec 2014 A1
20150002919 Jack et al. Jan 2015 A1
20150049378 Shrivastava et al. Feb 2015 A1
20150060648 Brown et al. Mar 2015 A1
20150070745 Pradhan Mar 2015 A1
20150092260 Parker et al. Apr 2015 A1
20150103389 Klawuhn et al. Apr 2015 A1
20150116811 Shrivastava et al. Apr 2015 A1
20150118869 Brown et al. Apr 2015 A1
20150185581 Pradhan et al. Jul 2015 A1
20150234369 Wen et al. Aug 2015 A1
20150270724 Rozbicki Sep 2015 A1
20150293422 Pradhan et al. Oct 2015 A1
20150346574 Collins et al. Dec 2015 A1
20150346575 Bhatnagar et al. Dec 2015 A1
20150346576 Pradhan et al. Dec 2015 A1
20160054633 Brown et al. Feb 2016 A1
20160054634 Brown et al. Feb 2016 A1
20160070151 Shrivastava et al. Mar 2016 A1
20160091769 Rozbicki Mar 2016 A1
20160109778 Shrivastava et al. Apr 2016 A1
20160124283 Brown et al. May 2016 A1
20160139477 Jack et al. May 2016 A1
20160154290 Brown et al. Jun 2016 A1
20160195856 Spero Jul 2016 A1
20160289042 Fang et al. Oct 2016 A1
20160334689 Parker et al. Nov 2016 A1
20160342060 Collins et al. Nov 2016 A1
20160342061 Pradhan et al. Nov 2016 A1
20160344148 Mullins et al. Nov 2016 A1
20160357083 Brown et al. Dec 2016 A1
20160377949 Jack et al. Dec 2016 A1
20170006948 Quinones Jan 2017 A1
20170045795 Brown et al. Feb 2017 A1
20170075183 Brown Mar 2017 A1
20170075323 Shrivastava et al. Mar 2017 A1
20170082903 Vigano et al. Mar 2017 A1
20170097259 Brown et al. Apr 2017 A1
20170097553 Jack et al. Apr 2017 A1
20170115544 Parker et al. Apr 2017 A1
20170117674 Brown et al. Apr 2017 A1
20170122802 Brown et al. May 2017 A1
20170131610 Brown et al. May 2017 A1
20170131611 Brown et al. May 2017 A1
20170146884 Vigano et al. May 2017 A1
20170168368 Brown et al. Jun 2017 A1
20170212400 Shrivastava et al. Jul 2017 A1
20170219907 Brown et al. Aug 2017 A1
20170219908 Brown et al. Aug 2017 A1
20170250163 Wilbur et al. Aug 2017 A1
20170269451 Shrivastava et al. Sep 2017 A1
20170276542 Klawuhn et al. Sep 2017 A1
20170285432 Shrivastava et al. Oct 2017 A1
20170285433 Shrivastava et al. Oct 2017 A1
20170363897 Martin Dec 2017 A1
20170364395 Shrivastava et al. Dec 2017 A1
20170365908 Hughes et al. Dec 2017 A1
20170371223 Pradhan Dec 2017 A1
20180039149 Jack et al. Feb 2018 A1
20180067372 Jack et al. Mar 2018 A1
20180088432 Shrivastava et al. Mar 2018 A1
20180090992 Shrivastava et al. Mar 2018 A1
20180095337 Rozbicki et al. Apr 2018 A1
20180101733 Sengupta Apr 2018 A1
20180129172 Shrivastava et al. May 2018 A1
20180143502 Pradhan et al. May 2018 A1
20180157140 Bhatnagar et al. Jun 2018 A1
20180157141 Brown et al. Jun 2018 A1
20180187478 Vigano et al. Jul 2018 A1
20180188627 Vigano et al. Jul 2018 A1
20180188628 Brown et al. Jul 2018 A1
20180189117 Shrivastava et al. Jul 2018 A1
20180196325 Parker et al. Jul 2018 A1
20180210307 Parker et al. Jul 2018 A1
20180239965 Aggarwal et al. Aug 2018 A1
20180267380 Shrivastava et al. Sep 2018 A1
20180301858 Mullins et al. Oct 2018 A9
20180307114 Brown et al. Oct 2018 A1
20180314100 Mullins et al. Nov 2018 A1
20180341163 Jack et al. Nov 2018 A1
20180373111 Brown Dec 2018 A1
20190011793 Jack et al. Jan 2019 A1
20190011798 Brown et al. Jan 2019 A9
20190025661 Brown et al. Jan 2019 A9
20190025662 Jack et al. Jan 2019 A1
20190049811 Shrivastava et al. Feb 2019 A9
20190049812 Brown Feb 2019 A1
20190056631 Brown et al. Feb 2019 A1
20190115786 Rozbicki Apr 2019 A1
20190121214 Wilbur et al. Apr 2019 A1
20190124749 Deixler et al. Apr 2019 A1
20190138704 Shrivastava et al. May 2019 A1
20190155122 Brown et al. May 2019 A1
20190196292 Brown et al. Jun 2019 A1
20190203528 Vigano et al. Jul 2019 A1
20190204705 Vigano et al. Jul 2019 A1
20190219881 Shrivastava et al. Jul 2019 A1
20190235343 Vigano et al. Aug 2019 A1
20190235451 Shrivastava et al. Aug 2019 A1
20190243204 Collins et al. Aug 2019 A1
20190243206 Brown et al. Aug 2019 A1
20190243207 Brown et al. Aug 2019 A1
20190250029 Zedlitz et al. Aug 2019 A1
20190267840 Rozbicki et al. Aug 2019 A1
20190271895 Shrivastava et al. Sep 2019 A1
20190294017 Vigano et al. Sep 2019 A1
20190294018 Shrivastava et al. Sep 2019 A1
20190317458 Shrivastava et al. Oct 2019 A1
20190319335 Hughes et al. Oct 2019 A1
20190324342 Jack et al. Oct 2019 A1
20190331978 Shrivastava et al. Oct 2019 A1
20190346732 Parker et al. Nov 2019 A1
20190346734 Shrivastava et al. Nov 2019 A1
20190347141 Shrivastava et al. Nov 2019 A1
20190353972 Shrivastava et al. Nov 2019 A1
20190356508 Trikha et al. Nov 2019 A1
20190384232 Casey Dec 2019 A1
20190384652 Shrivastava et al. Dec 2019 A1
20190391456 Parker et al. Dec 2019 A1
20200001687 Chow Jan 2020 A1
20200004096 Brown et al. Jan 2020 A1
20200026141 Brown et al. Jan 2020 A1
20200041861 Shrivastava et al. Feb 2020 A1
20200041963 Shrivastava et al. Feb 2020 A1
20200041967 Shrivastava et al. Feb 2020 A1
20200057346 Zedlitz et al. Feb 2020 A1
20200057421 Trikha et al. Feb 2020 A1
20200073193 Pradhan et al. Mar 2020 A1
20200080364 Shrivastava et al. Mar 2020 A1
20200089074 Pradhan et al. Mar 2020 A1
20200096831 Brown et al. Mar 2020 A1
20200150508 Patterson et al. May 2020 A1
20200150602 Trikha et al. May 2020 A1
20200200595 Klawuhn et al. Jun 2020 A1
20200209057 Brown et al. Jul 2020 A1
20200259237 Shrivastava et al. Aug 2020 A1
20200278245 Brown et al. Sep 2020 A1
20200301234 McNeil Sep 2020 A1
20200301236 Brown et al. Sep 2020 A1
20200310213 Shrivastava et al. Oct 2020 A1
20200310214 Brown Oct 2020 A1
20200318426 Vigano et al. Oct 2020 A1
20200321682 Hughes et al. Oct 2020 A1
20200348574 Bhatnagar et al. Nov 2020 A1
20200355977 Brown et al. Nov 2020 A1
20200363261 Klawuhn et al. Nov 2020 A1
20200387041 Shrivastava et al. Dec 2020 A1
20200393719 Mullins et al. Dec 2020 A1
20200393733 Brown Dec 2020 A1
20200398535 Collins et al. Dec 2020 A1
20210003899 Zedlitz et al. Jan 2021 A1
20210018880 Shrivastava et al. Jan 2021 A9
20210040789 Rozbicki et al. Feb 2021 A1
20210041759 Trikha et al. Feb 2021 A1
20210055619 Brown Feb 2021 A1
20210063834 Brown et al. Mar 2021 A1
20210063835 Vigano et al. Mar 2021 A1
20210063836 Patterson et al. Mar 2021 A1
20210072611 Brown Mar 2021 A1
20210080319 Brown et al. Mar 2021 A1
20210080793 Pradhan et al. Mar 2021 A1
20210108960 Klawuhn et al. Apr 2021 A1
20210116769 Shrivastava et al. Apr 2021 A1
20210116770 Pradhan et al. Apr 2021 A1
20210119318 Hughes et al. Apr 2021 A1
20210132458 Trikha et al. May 2021 A1
20210149266 Brown et al. May 2021 A1
20210165696 Shrivastava et al. Jun 2021 A1
20210174804 Shrivastava et al. Jun 2021 A1
20210181593 Pradhan Jun 2021 A1
20210191214 Rozbicki et al. Jun 2021 A1
20210191218 Trikha et al. Jun 2021 A1
20210191221 Shrivastava et al. Jun 2021 A1
20210208467 Shrivastava et al. Jul 2021 A1
20210208468 Jack et al. Jul 2021 A1
20210215990 Parker et al. Jul 2021 A1
20210232015 Brown et al. Jul 2021 A1
20210246719 Shrivastava et al. Aug 2021 A1
20210255519 Vigano et al. Aug 2021 A1
20210294172 Rasmus-Vorrath et al. Sep 2021 A1
20210294173 Trikha et al. Sep 2021 A1
20210294174 Brown et al. Sep 2021 A1
Foreign Referenced Citations (8)
Number Date Country
10 2014 220 818 Apr 2016 DE
2 760 164 Mar 2019 EP
10-1049496 Jul 2011 KR
2016058695 Apr 2016 WO
2016174215 Nov 2016 WO
2017174412 Oct 2017 WO
2019183232 Sep 2019 WO
2019183289 Sep 2019 WO
Non-Patent Literature Citations (6)
Entry
Research project DALEC (Day- and Artificial Light with Energy Calculation), calculation program for selected working points for year-round energy and comfort characterization of buildings, URL: https://www.uibk.ac.at/bauphysik/forschung/projects/dalec/index.html.en.
Presence or motion detection/absence automatisms: optical sensors from Steinel with the type designation “HDP2 KNX”.
Tracking of movement and temperature of people and objects: Company infsoft GmbH “Infrared Thermopile Sensor”, URL: https://www.infsoft.com/technology/condition-monitoring/presence-motion-monitoring/infrared-thermopile.
English abstract of EN 14501:2021, "Blinds and shutters - Thermal and visual comfort - Performance characteristics and classification".
English abstract of EN 17037:2018, “Daylight in buildings”.
Delta-T Devices, Outdoor sensor (pyranometer), “SPN1 Sunshine Pyranometer”, URL: https://delta-t.co.uk/product/spn1/.
Related Publications (1)
Number Date Country
20220128206 A1 Apr 2022 US