The embodiments disclosed herein relate generally to window controllers and related control logic for implementing methods of controlling tint and other functions of tintable windows (e.g., electrochromic windows).
Electrochromism is a phenomenon in which a material exhibits a reversible electrochemically-mediated change in an optical property when placed in a different electronic state, typically by being subjected to a voltage change. The optical property is typically one or more of color, transmittance, absorbance, and reflectance. One well known electrochromic material is tungsten oxide (WO3). Tungsten oxide is a cathodic electrochromic material in which a coloration transition, transparent to blue, occurs by electrochemical reduction.
Electrochromic materials may be incorporated into, for example, windows for home, commercial and other uses. The color, transmittance, absorbance, and/or reflectance of such windows may be changed by inducing a change in the electrochromic material; that is, electrochromic windows are windows that can be darkened or lightened electronically. A small voltage applied to an electrochromic device of the window will cause it to darken; reversing the voltage causes it to lighten. This capability allows control of the amount of light that passes through the windows, and presents an opportunity for electrochromic windows to be used as energy-saving devices.
Although electrochromism was discovered in the 1960s, electrochromic devices, and particularly electrochromic windows, still suffer from various problems and have not realized their full commercial potential, despite many recent advances in electrochromic technology, apparatus, and related methods of making and/or using electrochromic devices.
In one embodiment, the one or more tintable windows include only all solid state and inorganic electrochromic devices.
Certain aspects pertain to a method of determining a tint level for each zone of tintable windows of a building based on output from glare and reflection models of the building site. The method initializes and assigns attributes to a 3D model of the building site. The method also generates one or more three-dimensional occupancy regions in the 3D model and generates glare and reflection models based on the 3D model. In addition, the method determines an intersection of a three-dimensional occupancy region with three-dimensional light projections through tintable windows of each zone in the clear sky glare or reflection models, evaluates whether one or more conditions exist based on the determined intersection, and determines a tint state for each zone based on the evaluation. In one implementation, the 3D model resides on a cloud-based 3D modelling platform.
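As an illustrative sketch only (the geometry, names, and thresholds below are assumptions, not the disclosed modelling platform's implementation), the intersection step can be viewed as a ray/box test: light projections through a window are sampled as rays from points on the window along the sun direction, and a three-dimensional occupancy region is approximated by an axis-aligned box.

```python
# Hypothetical sketch: does the clear-sky light projection through a
# window intersect a 3D occupancy region? Window sample points and an
# axis-aligned occupancy box are simplifying assumptions.

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab-method ray/axis-aligned-box intersection test."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            # Ray parallel to this slab: must already lie within it.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
        if t_near > t_far:
            return False
    return True

def glare_condition(window_points, sun_direction, occupancy_box):
    """True if any sampled ray through the window enters the occupancy
    region; the calling logic would then select a darker tint state."""
    box_min, box_max = occupancy_box
    return any(ray_hits_box(p, sun_direction, box_min, box_max)
               for p in window_points)

# Sun ray through a window point at height 1.5 m, angled down into the
# room, entering an occupancy box 1-3 m from the window:
print(glare_condition([(0.0, 0.0, 1.5)], (1.0, 0.0, -0.5),
                      ((1.0, -1.0, 0.0), (3.0, 1.0, 1.2))))  # True
```

In practice the disclosed models would evaluate such intersections for each zone and each time interval to build the schedule of tint states.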
Certain aspects pertain to a system for generating a 3D model of a building site and determining a schedule of tint states for each zone of tintable windows of a building at the building site. The system comprises a network with computer readable medium and one or more processors in communication with the computer readable medium. The system further comprises a clear sky logic module stored on the computer readable medium, the clear sky logic module configured to generate a glare model and a reflection model based on the 3D model, determine a tint state for each zone at each time interval based on output from the glare model and/or the reflection model, and push, via a communication network, the schedule of tint states for each zone to a network of window controllers at the building. The network of window controllers is configured to control the tint state of each of the one or more zones of tintable windows of the building based on a minimum of the tint state from the schedule and a weather-based tint state based on one or both of infrared sensor readings and photosensor readings. In one implementation, the network is a cloud network.
Certain aspects pertain to a system for customizing spaces of a 3D model of a building site and controlling tinting of one or more zones of tintable windows of a building at the building site. The system comprises a network with one or more processors and computer readable medium in communication with the one or more processors, a communications interface configured to receive input for customizing spaces of the 3D model from one or more users and to output visualizations to the one or more users, a 3D modelling system configured to customize the 3D model based on the input received from the one or more users, and a clear sky logic module stored on the computer readable medium, the clear sky logic module configured to generate a glare model and a reflection model based on the customized 3D model, determine a tint state for each zone at each time interval based on output from the glare model and/or the reflection model, and provide a visualization of the customized 3D model to the one or more users via the communications interface. In one implementation, the network is a cloud network and the 3D modelling system resides on the cloud network.
Certain aspects pertain to a method of controlling tint of one or more zones of tintable windows of a building at a building site. The method includes receiving schedule information with a clear sky tint level for each of the zones, the schedule information derived from clear sky glare and reflection models of the building site, determining a cloud condition using one or both of photosensor readings and infrared sensor readings, calculating a weather-based tint level using the determined cloud condition, and communicating tint instructions over a network to a window controller to transition tint of the zone of tintable windows to the minimum of the clear sky tint level and the weather-based tint level. In one implementation, the clear sky glare and reflection models of the building site reside on a cloud network.
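The minimum-of-two-tint-levels decision described above can be sketched as follows. This is a hedged illustration, not the patented implementation: the integer tint scale (higher = darker), the irradiance thresholds, and the infrared sky/ambient-delta thresholds are all hypothetical placeholders.

```python
# Illustrative sketch (assumed thresholds): combine a scheduled
# clear-sky tint level with a weather-based tint level by taking the
# minimum, so cloudy weather lightens windows that the clear-sky
# schedule would otherwise darken.

def weather_based_tint(photosensor_w_m2, ir_delta_c,
                       clear_threshold_w_m2=600.0, ir_clear_delta_c=-10.0):
    """Classify the cloud condition from photosensor and infrared sensor
    readings and map it to a tint level (thresholds are hypothetical)."""
    if photosensor_w_m2 >= clear_threshold_w_m2 or ir_delta_c <= ir_clear_delta_c:
        return 4  # clear sky detected: allow the darkest weather tint
    if photosensor_w_m2 >= 200.0:
        return 2  # intermittent or thin clouds
    return 1      # overcast: keep windows light

def final_tint(clear_sky_schedule_tint, photosensor_w_m2, ir_delta_c):
    # The commanded tint is the minimum (lighter) of the scheduled
    # clear-sky tint and the weather-based tint.
    return min(clear_sky_schedule_tint,
               weather_based_tint(photosensor_w_m2, ir_delta_c))

# Schedule predicts glare (tint 4), but sensors indicate overcast:
print(final_tint(4, 120.0, 5.0))    # 1
# Schedule allows tint 2 and the sky is clear:
print(final_tint(2, 800.0, -15.0))  # 2
```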
One aspect pertains to a method of controlling tint of one or more tintable windows located between an interior and an exterior of a building. The method comprises determining a position of the sun with respect to a first tintable window and determining, using the determined position of the sun with respect to the first tintable window, a default tint state for the first tintable window. The method also comprises determining that an outside temperature is at or above a threshold temperature (e.g., at least about 40° C.) and using the determination that the outside temperature is at or above the threshold temperature to determine a modified tint state that is darker than the default tint state for the first tintable window. In addition, the method comprises providing instructions to transition the first tintable window to the modified tint state.
One aspect pertains to a system of controlling tint of one or more tintable windows located between an interior and an exterior of a building. The system includes one or more processors and a controller in communication with the one or more processors and with the tintable window. The one or more processors are configured to determine a position of the sun with respect to a first tintable window and determine, using the determined position of the sun with respect to the first tintable window, a default tint state for the first tintable window. The one or more processors are also configured to determine that an outside temperature is at or above a threshold temperature, use the determination that the outside temperature is at or above the threshold temperature to determine a modified tint state that is darker than the default tint state for the first tintable window, and provide instructions to transition the first tintable window to the modified tint state. The controller is configured to apply commands to transition the first tintable window to the modified tint state.
One aspect pertains to a method of determining a tint state of a tintable window located between an interior and an exterior of a building. The method includes determining a default tint state for the tintable window, determining that an outside temperature is above a threshold temperature, using the determination that the outside temperature is above the threshold temperature to determine a modified tint state that is darker than the default tint state for the tintable window, and providing instructions to transition the tintable window to the modified tint state.
One aspect pertains to a system for controlling tint of a tintable window located between an interior and an exterior of a building. The system includes one or more processors and a controller in communication with the one or more processors and with the tintable window. The one or more processors are configured to determine a default tint state for the tintable window, determine that an outside temperature is above a threshold temperature, use the determination that the outside temperature is above the threshold temperature to determine a modified tint state that is darker than the default tint state for the tintable window, and provide instructions to transition the tintable window to the modified tint state. The controller is configured to apply commands to transition the tintable window to the modified tint state.
One aspect pertains to a method of controlling a tint state of at least one tintable window. The method includes determining a baseline tint state for the at least one tintable window using one or more tint decision modules. The method also includes, if it is determined that an outside temperature is (i) at or above a first threshold temperature and/or (ii) at or below a second threshold temperature, determining a modified tint state that is a predefined amount darker than the baseline tint state and providing instructions to transition the at least one tintable window to the modified tint state.
One aspect pertains to a system for controlling tint of a tintable window located between an interior and an exterior of a building. The system includes one or more processors and a controller in communication with the one or more processors and with the tintable window. The one or more processors are configured to determine a baseline tint state for the at least one tintable window using one or more tint decision modules, and if it is determined that an outside temperature is (i) at or above a first threshold temperature and/or (ii) at or below a second threshold temperature, determine a modified tint state that is a predefined amount darker than the baseline tint state and provide instructions to transition the at least one tintable window to the modified tint state. The controller is configured to apply commands to transition the tintable window to the modified tint state.
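The temperature-based override described in the aspects above can be sketched in a few lines. This is a hedged sketch under stated assumptions: the 0-4 tint scale, the 40 °C high threshold (the only value suggested by the text), the low threshold, and the one-step darkening amount are illustrative, not prescribed values.

```python
# Hedged sketch: start from a baseline tint state and, when the outside
# temperature is at/above a high threshold or at/below a low threshold,
# darken the window by a predefined amount (clamped to the darkest state).

MAX_TINT = 4  # assumed darkest tint state

def modified_tint_state(baseline_tint, outside_temp_c,
                        high_threshold_c=40.0, low_threshold_c=0.0,
                        darken_by=1):
    """Return the tint state to command, darkening the baseline when the
    outside temperature crosses either threshold."""
    if outside_temp_c >= high_threshold_c or outside_temp_c <= low_threshold_c:
        # Darker tint reduces solar heat gain on hot days; on cold days,
        # a darker absorbing pane may also be desirable in some designs.
        return min(baseline_tint + darken_by, MAX_TINT)
    return baseline_tint

print(modified_tint_state(2, 45.0))  # 3 (hot day: darken one step)
print(modified_tint_state(2, 20.0))  # 2 (mild day: keep baseline)
```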
These and other features and embodiments will be described in more detail below with reference to the drawings.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the presented embodiments. The disclosed embodiments may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail to not unnecessarily obscure the disclosed embodiments. While the disclosed embodiments will be described in conjunction with the specific embodiments, it will be understood that it is not intended to limit the disclosed embodiments. It should be understood that while disclosed embodiments focus on electrochromic windows (also referred to as smart windows), the aspects disclosed herein may apply to other types of tintable windows. For example, a tintable window incorporating a liquid crystal device or a suspended particle device, instead of an electrochromic device could be incorporated in any of the disclosed embodiments.
I. Overview of Electrochromic Devices and Window Controllers
In order to orient the reader to the embodiments of systems and methods disclosed herein, a brief discussion of electrochromic devices and window controllers is provided. This initial discussion is provided for context only, and the subsequently described embodiments of systems, window controllers, and methods are not limited to the specific features and fabrication processes of this initial discussion.
A. Electrochromic Devices
A particular example of an electrochromic lite is described with reference to
After formation of the electrochromic device, edge deletion processes and additional laser scribing are performed.
After laser scribing is complete, bus bars are attached. Non-penetrating bus bar 1 is applied to the second TCO. Non-penetrating bus bar 2 is applied to an area where the device was not deposited (e.g., from a mask protecting the first TCO from device deposition), in contact with the first TCO or, in this example, where an edge deletion process (e.g., laser ablation using an apparatus having a XY or XYZ galvanometer) was used to remove material down to the first TCO. In this example, both bus bar 1 and bus bar 2 are non-penetrating bus bars. A penetrating bus bar is one that is typically pressed into and through the electrochromic stack to make contact with the TCO at the bottom of the stack. A non-penetrating bus bar is one that does not penetrate into the electrochromic stack layers, but rather makes electrical and physical contact on the surface of a conductive layer, for example, a TCO.
The TCO layers can be electrically connected using a non-traditional bus bar, for example, a bus bar fabricated with screen and lithography patterning methods. In one embodiment, electrical communication is established with the device's transparent conducting layers by silk screening (or applying via another patterning method) a conductive ink, followed by heat curing or sintering the ink. Advantages of the above-described device configuration include, for example, simpler manufacturing and less laser scribing than conventional techniques that use penetrating bus bars.
After the bus bars are connected, the device is integrated into an insulated glass unit (IGU), which includes, for example, wiring the bus bars and the like. In some embodiments, one or both of the bus bars are inside the finished IGU; however, in one embodiment one bus bar is outside the seal of the IGU and one bus bar is inside the IGU. In the latter embodiment, area 140 is used to make the seal with one face of the spacer used to form the IGU. Thus, the wires or other connection to the bus bars runs between the spacer and the glass. As many spacers are made of metal, e.g., stainless steel, which is conductive, it is desirable to take steps to avoid short circuiting due to electrical communication between the bus bar and connector thereto and the metal spacer.
As described above, after the bus bars are connected, the electrochromic lite is integrated into an IGU, which includes, for example, wiring for the bus bars and the like. In the embodiments described herein, both of the bus bars are inside the primary seal of the finished IGU.
Electrochromic devices having distinct layers as described can be fabricated as all solid state devices and/or all inorganic devices. Such devices and methods of fabricating them are described in more detail in U.S. patent application Ser. No. 12/645,111, entitled “Fabrication of Low-Defectivity Electrochromic Devices,” filed on Dec. 22, 2009, and naming Mark Kozlowski et al. as inventors, and in U.S. patent application Ser. No. 12/645,159, entitled, “Electrochromic Devices,” filed on Dec. 22, 2009 and naming Zhongchun Wang et al. as inventors, both of which are hereby incorporated by reference in their entireties. It should be understood, however, that any one or more of the layers in the stack may contain some amount of organic material. The same can be said for liquids that may be present in one or more layers in small amounts. It should also be understood that solid state material may be deposited or otherwise formed by processes employing liquid components such as certain processes employing sol-gels or chemical vapor deposition.
Additionally, it should be understood that the reference to a transition between a bleached state and colored state is non-limiting and suggests only one example, among many, of an electrochromic transition that may be implemented. Unless otherwise specified herein (including the foregoing discussion), whenever reference is made to a bleached-colored transition, the corresponding device or process encompasses other optical state transitions such as non-reflective-reflective, transparent-opaque, etc. Further, the term “bleached” refers to an optically neutral state, for example, uncolored, transparent, or translucent. Still further, unless specified otherwise herein, the “color” of an electrochromic transition is not limited to any particular wavelength or range of wavelengths. As understood by those of skill in the art, the choice of appropriate electrochromic and counter electrode materials governs the relevant optical transition.
In embodiments described herein, the electrochromic device reversibly cycles between a bleached state and a colored state. In some cases, when the device is in a bleached state, a potential is applied to the electrochromic stack 320 such that available ions in the stack reside primarily in the counter electrode 310. When the potential on the electrochromic stack is reversed, the ions are transported across the ion conducting layer 308 to the electrochromic material 306 and cause the material to transition to the colored state. In a similar way, the electrochromic device of embodiments described herein can be reversibly cycled between different tint levels (e.g., bleached state, darkest colored state, and intermediate levels between the bleached state and the darkest colored state).
Referring again to
Any material having suitable optical, electrical, thermal, and mechanical properties may be used as substrate 302. Such substrates include, for example, glass, plastic, and mirror materials. Suitable glasses include either clear or tinted soda lime glass, including soda lime float glass. The glass may be tempered or untempered.
In many cases, the substrate is a glass pane sized for residential window applications. The size of such glass pane can vary widely depending on the specific needs of the residence. In other cases, the substrate is architectural glass. Architectural glass is typically used in commercial buildings, but may also be used in residential buildings, and typically, though not necessarily, separates an indoor environment from an outside environment. In certain embodiments, architectural glass is at least 20 inches by 20 inches, and can be much larger, for example, as large as about 80 inches by 120 inches. Architectural glass is typically at least about 2 mm thick, typically between about 3 mm and about 6 mm thick. Of course, electrochromic devices are scalable to substrates smaller or larger than architectural glass. Further, the electrochromic device may be provided on a mirror of any size and shape.
On top of substrate 302 is conductive layer 304. In certain embodiments, one or both of the conductive layers 304 and 314 is inorganic and/or solid. Conductive layers 304 and 314 may be made from a number of different materials, including conductive oxides, thin metallic coatings, conductive metal nitrides, and composite conductors. Typically, conductive layers 304 and 314 are transparent at least in the range of wavelengths where electrochromism is exhibited by the electrochromic layer. Transparent conductive oxides include metal oxides and metal oxides doped with one or more metals. Examples of such metal oxides and doped metal oxides include indium oxide, indium tin oxide, doped indium oxide, tin oxide, doped tin oxide, zinc oxide, aluminum zinc oxide, doped zinc oxide, ruthenium oxide, doped ruthenium oxide and the like. Since oxides are often used for these layers, they are sometimes referred to as “transparent conductive oxide” (TCO) layers. Thin metallic coatings that are substantially transparent may also be used, as well as combinations of TCOs and metallic coatings.
The function of the conductive layers is to spread an electric potential provided by voltage source 316 over surfaces of the electrochromic stack 320 to interior regions of the stack, with relatively little ohmic potential drop. The electric potential is transferred to the conductive layers through electrical connections to the conductive layers. In some embodiments, bus bars, one in contact with conductive layer 304 and one in contact with conductive layer 314, provide the electric connection between the voltage source 316 and the conductive layers 304 and 314. The conductive layers 304 and 314 may also be connected to the voltage source 316 with other conventional means.
Overlaying conductive layer 304 is electrochromic layer 306. In some embodiments, electrochromic layer 306 is inorganic and/or solid. The electrochromic layer may contain any one or more of a number of different electrochromic materials, including metal oxides. Such metal oxides include tungsten oxide (WO3), molybdenum oxide (MoO3), niobium oxide (Nb2O5), titanium oxide (TiO2), copper oxide (CuO), iridium oxide (Ir2O3), chromium oxide (Cr2O3), manganese oxide (Mn2O3), vanadium oxide (V2O5), nickel oxide (Ni2O3), cobalt oxide (Co2O3) and the like. During operation, electrochromic layer 306 transfers ions to and receives ions from counter electrode layer 310 to cause optical transitions.
Generally, the colorization (or change in any optical property—e.g., absorbance, reflectance, and transmittance) of the electrochromic material is caused by reversible ion insertion into the material (e.g., intercalation) and a corresponding injection of a charge balancing electron. Typically, some fraction of the ions responsible for the optical transition is irreversibly bound up in the electrochromic material. Some or all of the irreversibly bound ions are used to compensate "blind charge" in the material. In most electrochromic materials, suitable ions include lithium ions (Li+) and hydrogen ions (H+) (that is, protons). In some cases, however, other ions will be suitable. In various embodiments, lithium ions are used to produce the electrochromic phenomena. Intercalation of lithium ions into tungsten oxide (WO3−y, where 0 < y ≤ ~0.3) causes the tungsten oxide to change from transparent (bleached state) to blue (colored state).
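The coloration transition described above can be summarized by the standard textbook electrochemical reaction for lithium intercalation into tungsten oxide (a conventional formulation, not one specific to this disclosure):

```latex
\underbrace{\mathrm{WO_3}}_{\text{transparent (bleached)}}
\;+\; x\,\mathrm{Li^+} \;+\; x\,e^-
\;\rightleftharpoons\;
\underbrace{\mathrm{Li}_x\mathrm{WO_3}}_{\text{blue (colored)}},
\qquad 0 < x \lesssim 0.3
```

The double arrow reflects the reversibility of the transition: driving the reaction forward (reduction) colors the material, and reversing the applied potential extracts the lithium ions and electrons and bleaches it.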
Referring again to
In some embodiments, suitable materials for the counter electrode complementary to WO3 include nickel oxide (NiO), nickel tungsten oxide (NiWO), nickel vanadium oxide, nickel chromium oxide, nickel aluminum oxide, nickel manganese oxide, nickel magnesium oxide, chromium oxide (Cr2O3), manganese oxide (MnO2), and Prussian blue.
When charge is removed from a counter electrode 310 made of nickel tungsten oxide (that is, ions are transported from counter electrode 310 to electrochromic layer 306), the counter electrode layer will transition from a transparent state to a colored state.
In the depicted electrochromic device, between electrochromic layer 306 and counter electrode layer 310, there is the ion conducting layer 308. Ion conducting layer 308 serves as a medium through which ions are transported (in the manner of an electrolyte) when the electrochromic device transitions between the bleached state and the colored state. Preferably, ion conducting layer 308 is highly conductive to the relevant ions for the electrochromic and the counter electrode layers, but has sufficiently low electron conductivity that negligible electron transfer takes place during normal operation. A thin ion conducting layer with high ionic conductivity permits fast ion conduction and hence fast switching for high performance electrochromic devices. In certain embodiments, the ion conducting layer 308 is inorganic and/or solid.
Examples of suitable ion conducting layers (for electrochromic devices having a distinct IC layer) include silicates, silicon oxides, tungsten oxides, tantalum oxides, niobium oxides, and borates. These materials may be doped with different dopants, including lithium. Lithium doped silicon oxides include lithium silicon-aluminum-oxide. In some embodiments, the ion conducting layer comprises a silicate-based structure. In some embodiments, a silicon-aluminum-oxide (SiAlO) is used for the ion conducting layer 308.
Electrochromic device 300 may include one or more additional layers (not shown), such as one or more passive layers. Passive layers used to improve certain optical properties may be included in electrochromic device 300. Passive layers for providing moisture or scratch resistance may also be included in electrochromic device 300. For example, the conductive layers may be treated with anti-reflective or protective oxide or nitride layers. Other passive layers may serve to hermetically seal electrochromic device 300.
A power source 416 is configured to apply a potential and/or current to an electrochromic stack 420 through suitable connections (e.g., bus bars) to the conductive layers 404 and 414. In some embodiments, the voltage source is configured to apply a potential of a few volts in order to drive a transition of the device from one optical state to another. The polarity of the potential as shown in
As described above, an electrochromic device may include an electrochromic (EC) electrode layer and a counter electrode (CE) layer separated by an ionically conductive (IC) layer that is highly conductive to ions and highly resistive to electrons. As conventionally understood, the ionically conductive layer therefore prevents shorting between the electrochromic layer and the counter electrode layer. The ionically conductive layer allows the electrochromic and counter electrodes to hold a charge and thereby maintain their bleached or colored states. In electrochromic devices having distinct layers, the components form a stack which includes the ion conducting layer sandwiched between the electrochromic electrode layer and the counter electrode layer. The boundaries between these three stack components are defined by abrupt changes in composition and/or microstructure. Thus, the devices have three distinct layers with two abrupt interfaces.
In accordance with certain embodiments, the counter electrode and electrochromic electrodes are formed immediately adjacent one another, sometimes in direct contact, without separately depositing an ionically conducting layer. In some embodiments, electrochromic devices having an interfacial region rather than a distinct IC layer are employed. Such devices, and methods of fabricating them, are described in U.S. Pat. No. 8,300,298 and U.S. patent application Ser. No. 12/772,075 filed on Apr. 30, 2010, and U.S. patent application Ser. Nos. 12/814,277 and 12/814,279, filed on Jun. 11, 2010—each of the three patent applications and patent is entitled “Electrochromic Devices,” each names Zhongchun Wang et al. as inventors, and each is incorporated by reference herein in its entirety.
B. Window Controllers
A window controller is used to control the tint level of the electrochromic device of an electrochromic window. In some embodiments, the window controller is able to transition the electrochromic window between two tint states (levels), a bleached state and a colored state. In other embodiments, the controller can additionally transition the electrochromic window (e.g., having a single electrochromic device) to intermediate tint levels. In some disclosed embodiments, the window controller is able to transition the electrochromic window to four or more tint levels. Certain electrochromic windows allow intermediate tint levels by using two (or more) electrochromic lites in a single IGU, where each lite is a two-state lite. This is described in reference to
As noted above with respect to
In some embodiments, the window controller is able to transition an electrochromic window having an electrochromic device capable of transitioning between two or more tint levels. For example, a window controller may be able to transition the electrochromic window to a bleached state, one or more intermediate levels, and a colored state. In some other embodiments, the window controller is able to transition an electrochromic window incorporating an electrochromic device between any number of tint levels between the bleached state and the colored state. Embodiments of methods and controllers for transitioning an electrochromic window to an intermediate tint level or levels are further described in U.S. Pat. No. 8,254,013, naming Disha Mehtani et al. as inventors, titled “CONTROLLING TRANSITIONS IN OPTICALLY SWITCHABLE DEVICES,” which is hereby incorporated by reference in its entirety.
In some embodiments, a window controller can power one or more electrochromic devices in an electrochromic window. Typically, this function of the window controller is augmented with one or more other functions described in more detail below. Window controllers described herein are not limited to those that power the electrochromic devices with which they are associated for purposes of control. That is, the power source for the electrochromic window may be separate from the window controller, in which case the controller has its own power source and directs application of power from the window power source to the window. However, it is convenient to include a power source with the window controller and to configure the controller to power the window directly, because doing so obviates the need for separate wiring for powering the electrochromic window.
Further, the window controllers described in this section are described as standalone controllers which may be configured to control the functions of a single window or a plurality of electrochromic windows, without integration of the window controller into a building control network or a building management system (BMS). Window controllers, however, may be integrated into a building control network or a BMS, as described further in the Building Management System section of this disclosure.
In
In disclosed embodiments, a building may have at least one room having an electrochromic window between the exterior and interior of the building. One or more sensors may be located on the exterior of the building and/or inside the room. In embodiments, the output from the one or more sensors may be input to the signal conditioning module 465 of the window controller 450. In some cases, the output from the one or more sensors may be input to a BMS, as described further in the Building Management Systems section. Although the sensors of depicted embodiments are shown as located on the outside vertical wall of the building, this is for the sake of simplicity, and the sensors may be in other locations, such as inside the room or on other exterior surfaces, as well. In some cases, two or more sensors may be used to measure the same input, which can provide redundancy in case one sensor fails or has an otherwise erroneous reading.
Exterior sensor 510 is a device, such as a photosensor, that is able to detect radiant light incident upon the device flowing from a light source such as the sun or from light reflected to the sensor from a surface, particles in the atmosphere, clouds, etc. The exterior sensor 510 may generate a signal in the form of electrical current that results from the photoelectric effect and the signal may be a function of the light incident on the sensor 510. In some cases, the device may detect radiant light in terms of irradiance in units of watts/m2 or other similar units. In other cases, the device may detect light in the visible range of wavelengths in units of foot candles or similar units. In many cases, there is a linear relationship between these values of irradiance and visible light.
In some embodiments, exterior sensor 510 is configured to measure infrared light. In some embodiments, an exterior photosensor is configured to measure infrared light and/or visible light. In some embodiments, an exterior photosensor 510 may also include sensors for measuring temperature and/or humidity data. In some embodiments, intelligence logic may determine the presence of an obstructing cloud and/or quantify the obstruction caused by a cloud using one or more parameters (e.g., visible light data, infrared light data, humidity data, and temperature data) determined using an exterior sensor or received from an external network (e.g., a weather station). Various methods of detecting clouds using infrared sensors are described in International Patent Application No. PCT/US17/55631, titled "INFRARED CLOUD DETECTOR SYSTEMS AND METHODS," filed Oct. 6, 2017, which designates the United States and is incorporated herein by reference in its entirety.
Irradiance values from sunlight can be predicted based on the time of day and time of year as the angle at which sunlight strikes the earth changes. Exterior sensor 510 can detect radiant light in real-time, which accounts for reflected and obstructed light due to buildings, changes in weather (e.g., clouds), etc. For example, on cloudy days, sunlight would be blocked by the clouds and the radiant light detected by an exterior sensor 510 would be lower than on cloudless days.
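The predictability of clear-sky irradiance from time of day and time of year can be illustrated with a simplified solar-position calculation. This is a hedged sketch, not the disclosed control logic; the function name, the single atmospheric transmittance factor, and the parameter values are illustrative assumptions:

```python
import math

def clear_sky_irradiance(day_of_year, hour, latitude_deg,
                         solar_constant=1361.0, transmittance=0.75):
    """Rough clear-sky irradiance (W/m^2) on a horizontal surface.

    Simplified model: solar declination from day of year, hour angle
    from local solar time, and a single lumped atmospheric
    transmittance. A real controller would use a full solar-position
    algorithm; this only shows that irradiance is predictable from
    time of day and year.
    """
    # Declination: approximately +/-23.45 deg over the year
    decl = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    # Hour angle: 15 degrees per hour from solar noon
    hour_angle = math.radians(15.0 * (hour - 12.0))
    lat = math.radians(latitude_deg)
    # Sine of the solar elevation angle
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    if sin_elev <= 0:
        return 0.0  # sun below horizon
    return solar_constant * transmittance * sin_elev
```

A measured value well below this prediction would indicate clouds or an obstruction, which is what the real-time exterior sensor reading captures.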
In some embodiments, there may be one or more exterior sensors 510 associated with a single electrochromic window 505. Outputs from the one or more exterior sensors 510 could be compared to one another to determine, for example, if one of exterior sensors 510 is shaded by an object, such as by a bird that landed on exterior sensor 510. In some cases, it may be desirable to use relatively few sensors in a building because some sensors can be unreliable and/or expensive. In certain implementations, a single sensor or a few sensors may be employed to determine the current level of radiant light from the sun impinging on the building or perhaps one side of the building. A cloud may pass in front of the sun or a construction vehicle may park in front of the setting sun. Such events will result in deviations from the amount of radiant light from the sun calculated to normally impinge on the building.
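One way the comparison of redundant sensor outputs could be realized is a median-based outlier check. This is a hypothetical sketch; the function name and relative threshold are not from the disclosure:

```python
from statistics import median

def detect_shaded_sensors(readings, rel_threshold=0.5):
    """Flag sensors whose reading falls well below the group median.

    readings: dict mapping sensor_id -> irradiance reading.
    A sensor reading below rel_threshold * median (e.g., one shaded by
    a bird or debris) is flagged so the controller can exclude it.
    """
    ref = median(readings.values())
    if ref <= 0:
        return set()  # night / no light: nothing meaningful to flag
    return {sensor_id for sensor_id, value in readings.items()
            if value < rel_threshold * ref}
```

With three sensors reading 800, 810, and 120 W/m², only the third would be flagged as likely shaded.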
Exterior sensor 510 may be a type of photosensor. For example, exterior sensor 510 may be a charge coupled device (CCD), photodiode, photoresistor, or photovoltaic cell. One of ordinary skill in the art would appreciate that future developments in photosensor and other sensor technology would also work, provided they measure light intensity and provide an electrical output representative of the light level.
In some embodiments, output from exterior sensor 510 may be input to the signal conditioning module 465. The input may be in the form of a voltage signal to signal conditioning module 465. Signal conditioning module 465 passes an output signal to the window controller 450. Window controller 450 determines a tint level of the electrochromic window 505 based on various information from the configuration file 475, output from the signal conditioning module 465, and any override values. Window controller 450 then instructs the PWM 460 to apply a voltage and/or current to electrochromic window 505 to transition to the desired tint level.
In disclosed embodiments, window controller 450 can instruct the PWM 460 to apply a voltage and/or current to electrochromic window 505 to transition it to any one of four or more different tint levels. In disclosed embodiments, electrochromic window 505 can be transitioned to at least eight different tint levels described as: 0 (lightest), 5, 10, 15, 20, 25, 30, and 35 (darkest). The tint levels may linearly correspond to visual transmittance values and solar heat gain coefficient (SHGC) values of light transmitted through the electrochromic window 505. For example, using the above eight tint levels, the lightest tint level of 0 may correspond to an SHGC value of 0.80, the tint level of 5 may correspond to an SHGC value of 0.70, the tint level of 10 may correspond to an SHGC value of 0.60, the tint level of 15 may correspond to an SHGC value of 0.50, the tint level of 20 may correspond to an SHGC value of 0.40, the tint level of 25 may correspond to an SHGC value of 0.30, the tint level of 30 may correspond to an SHGC value of 0.20, and the tint level of 35 (darkest) may correspond to an SHGC value of 0.10.
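The linear correspondence between the eight example tint levels and the SHGC values above can be expressed as a lookup table; the table reproduces SHGC = 0.80 - 0.02 * level exactly. The Python names below are illustrative only:

```python
# Example mapping of the eight tint levels to SHGC values given
# above (level 0 lightest .. level 35 darkest); names are illustrative.
TINT_TO_SHGC = {0: 0.80, 5: 0.70, 10: 0.60, 15: 0.50,
                20: 0.40, 25: 0.30, 30: 0.20, 35: 0.10}

def shgc_for_tint(level):
    """Return the SHGC for a given tint level.

    Equivalent to the linear relationship SHGC = 0.80 - 0.02 * level
    for the eight defined levels.
    """
    if level not in TINT_TO_SHGC:
        raise ValueError(f"unknown tint level: {level}")
    return TINT_TO_SHGC[level]
```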
Window controller 450 or a master controller in communication with the window controller 450 may employ any one or more predictive control logic components to determine a desired tint level based on signals from the exterior sensor 510 and/or other input. The window controller 450 can instruct the PWM 460 to apply a voltage and/or current to electrochromic window 505 to transition it to the desired tint level.
Building Management System (BMS)
The window controllers described herein are also suited for integration with a BMS, or may be within or part of a BMS. A BMS is a computer-based control system installed in a building that monitors and controls the building's mechanical and electrical equipment such as ventilation, lighting, power systems, elevators, fire systems, and security systems. A BMS consists of hardware, including interconnections by communication channels to a computer or computers, and associated software for maintaining conditions in the building according to preferences set by the occupants and/or by the building manager. For example, a BMS may be implemented using a local area network, such as Ethernet. The software can be based on, for example, internet protocols and/or open standards. One example is software from Tridium, Inc. (of Richmond, Virginia). One communications protocol commonly used with a BMS is BACnet (building automation and control networks).
A BMS is most common in a large building, and typically functions at least to control the environment within the building. For example, a BMS may control temperature, carbon dioxide levels, and humidity within a building. Typically, there are many mechanical devices that are controlled by a BMS such as heaters, air conditioners, blowers, vents, and the like. To control the building environment, a BMS may turn on and off these various devices under defined conditions. A core function of a typical modern BMS is to maintain a comfortable environment for the building's occupants while minimizing heating and cooling costs/demand. Thus, a modern BMS is used not only to monitor and control, but also to optimize the synergy between various systems, for example, to conserve energy and lower building operation costs.
In some embodiments, a window controller is integrated with a BMS, where the window controller is configured to control one or more electrochromic windows (e.g., 505) or other tintable windows. In other embodiments, the window controller is within or part of the BMS and the BMS controls both the tintable windows and the functions of other systems of the building. In one example, the BMS may control the functions of all the building systems including the one or more zones of tintable windows in the building.
In some embodiments, each tintable window of the one or more zones includes at least one solid state and inorganic electrochromic device. In one embodiment, each of the tintable windows of the one or more zones is an electrochromic window having one or more solid state and inorganic electrochromic devices. In one embodiment, the one or more tintable windows include at least one all solid state and inorganic electrochromic device, but may include more than one electrochromic device, e.g. where each lite or pane of an IGU is tintable. In one embodiment, the electrochromic windows are multistate electrochromic windows, as described in U.S. patent application Ser. No. 12/851,514, filed on Aug. 5, 2010, and entitled “Multipane Electrochromic Windows.”
Also, the BMS 605 manages a window control system 602. The window control system 602 is a distributed network of window controllers including a master controller, 603, network controllers, 607a and 607b, and end or leaf controllers 608. The end or leaf controllers 608 may be similar to the window controller 450 described with respect to
Each of controllers 608 can be in a separate location from the electrochromic window that it controls, or can be integrated into the electrochromic window. For simplicity, only ten electrochromic windows of building 601 are depicted as being controlled by the window control system 602. In a typical setting there may be a large number of electrochromic windows in a building controlled by the window control system 602. Advantages and features of incorporating window controllers as described herein with BMSs are described below in more detail and in relation to
One aspect of certain disclosed embodiments is a BMS including a multipurpose window controller as described herein. By incorporating feedback from a window controller, a BMS can provide, for example, enhanced: 1) environmental control, 2) energy savings, 3) security, 4) flexibility in control options, 5) improved reliability and usable life of other systems due to less reliance thereon and therefore less maintenance thereof, 6) information availability and diagnostics, 7) effective use of, and higher productivity from, staff, and various combinations of these, because the electrochromic windows can be automatically controlled. In some embodiments, a BMS may not be present or a BMS may be present but may not communicate with a master controller or may communicate at a high level with the master controller. In these cases, maintenance on the BMS would not interrupt control of the electrochromic windows.
According to certain disclosed examples, the systems of the BMS (e.g., BMS 605) or building network run according to daily, monthly, quarterly, or yearly schedules. For example, the lighting control system, the window control system, the HVAC, and the security system may operate on a 24-hour schedule accounting for when people are in the building during the work day. At night, the building may enter an energy savings mode, and during the day, the building systems may operate in a manner that minimizes the energy consumption of the building while providing for occupant comfort. As another example, the building systems may shut down or enter an energy savings mode over a holiday period or other time period with low building occupancy.
The BMS schedule may be combined with geographical information. Geographical information may include the latitude and longitude of the building. Geographical information also may include information about the direction that each side of the building faces. Using such information, different rooms on different sides of the building may be controlled in different manners. For example, for east-facing rooms of the building in the winter, the window controller may instruct the windows to have no tint in the morning so that the room warms up due to sunlight shining in the room and the lighting control panel may instruct the lights to be dim because of the lighting from the sunlight. The west-facing windows may be controllable by the occupants of the room in the morning because the tint of the windows on the west side may have no impact on energy savings. However, the modes of operation of the east-facing windows and the west-facing windows may switch in the evening (e.g., when the sun is setting, the west-facing windows are not tinted to allow sunlight in for both heat and lighting).
Described below is an example of a building, for example, like building 601 in
For buildings with exterior sensors, the exterior sensors may be on the roof of the building. Alternatively or additionally, the building may include an exterior sensor associated with each exterior window (e.g., exterior sensor 510 described in relation to
Regarding the methods described with respect to
In some embodiments, the output signals received include a signal indicating energy or power consumption by a heating system, a cooling system, and/or lighting within the building. For example, the energy or power consumption of the heating system, the cooling system, and/or the lighting of the building may be monitored to provide the signal indicating energy or power consumption. Devices may be interfaced with or attached to the circuits and/or wiring of the building to enable this monitoring. Alternatively, the power systems in the building may be installed such that the power consumed by the heating system, a cooling system, and/or lighting for an individual room within the building or a group of rooms within the building can be monitored.
Tint instructions can be provided to change the tint of the tintable window to the determined level of tint. For example, referring to
In some embodiments, a building including tintable windows and a BMS may be enrolled in or participate in a demand response program run by the utility or utilities providing power to the building. The program may be a program in which the energy consumption of the building is reduced when a peak load occurrence is expected. The utility may send out a warning signal prior to an expected peak load occurrence. For example, the warning may be sent on the day before, the morning of, or about one hour before the expected peak load occurrence. A peak load occurrence may be expected to occur on a hot summer day when cooling systems/air conditioners are drawing a large amount of power from the utility, for example. The warning signal may be received by the BMS of the building or by window controllers configured to control the tintable windows in the building. This warning signal can be an override mechanism that disengages the Modules A, B, and C as shown in
In some embodiments, tintable windows for the exterior windows of the building (i.e., windows separating the interior of the building from the exterior of the building), may be grouped into zones, with tintable windows in a zone being instructed in a similar manner. For example, groups of tintable windows on different floors of the building or different sides of the building may be in different zones. For example, on the first floor of the building, all of the east facing tintable windows may be in zone 1, all of the south facing tintable windows may be in zone 2, all of the west facing tintable windows may be in zone 3, and all of the north facing tintable windows may be in zone 4. As another example, all of the tintable windows on the first floor of the building may be in zone 1, all of the tintable windows on the second floor may be in zone 2, and all of the tintable windows on the third floor may be in zone 3. As yet another example, all of the east facing tintable windows may be in zone 1, all of the south facing tintable windows may be in zone 2, all of the west facing tintable windows may be in zone 3, and all of the north facing tintable windows may be in zone 4. As yet another example, east facing tintable windows on one floor could be divided into different zones. Any number of tintable windows on the same side and/or different sides and/or different floors of the building may be assigned to a zone. In embodiments where individual tintable windows have independently controllable zones, tinting zones may be created on a building façade using combinations of zones of individual windows, e.g. where individual windows may or may not have all of their zones tinted.
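Zone assignment as described above could be sketched as a grouping of window records by floor and/or facade. The record fields, function name, and default grouping key below are hypothetical, not part of the disclosure:

```python
from collections import defaultdict

def group_into_zones(windows, key=lambda w: (w["floor"], w["facing"])):
    """Group window records into zones.

    windows: iterable of dicts with at least "id", "floor", "facing".
    key: callable defining zone membership; the default groups by
    (floor, facade), matching examples such as "all east-facing
    windows on the first floor in zone 1".
    """
    zones = defaultdict(list)
    for w in windows:
        zones[key(w)].append(w["id"])
    return dict(zones)
```

Changing the key (e.g., to `lambda w: w["floor"]` or `lambda w: w["facing"]`) reproduces the floor-only and facade-only zoning examples described above.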
In some embodiments, tintable windows in a zone may be controlled by the same window controller. In some other embodiments, tintable windows in a zone may be controlled by different window controllers, but the window controllers may all receive the same output signals from sensors and use the same function or lookup table to determine the level of tint for the windows in a zone.
In some embodiments, tintable windows in a zone may be controlled by a window controller or controllers that receive an output signal from a transmissivity sensor. In some embodiments, the transmissivity sensor may be mounted proximate the windows in a zone. For example, the transmissivity sensor may be mounted in or on a frame containing an IGU (e.g., mounted in or on a mullion, the horizontal sash of a frame) included in the zone. In some other embodiments, tintable windows in a zone that includes the windows on a single side of the building may be controlled by a window controller or controllers that receive an output signal from a transmissivity sensor.
In some embodiments, a sensor (e.g., photosensor) may provide an output signal to a window controller to control the tintable windows (e.g., electrochromic window 505) of a first zone (e.g., a master control zone). The window controller may also control the tintable windows in a second zone (e.g., a slave control zone) in the same manner as the first zone. In some other embodiments, another window controller may control the tintable windows in the second zone in the same manner as the first zone.
In some embodiments, a building manager, occupants of rooms in the second zone, or other person may manually instruct (using a tint or clear command or a command from a user console of a BMS, for example) the tintable windows in the second zone (i.e., the slave control zone) to enter a tint level such as a colored state (level) or a clear state. In some embodiments, when the tint level of the windows in the second zone is overridden with such a manual command, the tintable windows in the first zone (i.e., the master control zone) remain under control of the window controller receiving output from the transmissivity sensor. The second zone may remain in a manual command mode for a period of time and then revert back to be under control of the window controller receiving output from the transmissivity sensor. For example, the second zone may stay in a manual mode for one hour after receiving an override command, and then may revert back to be under control of the window controller receiving output from the transmissivity sensor.
In some embodiments, a building manager, occupants of rooms in the first zone, or other person may manually instruct (using a tint command or a command from a user console of a BMS, for example) the windows in the first zone (i.e., the master control zone) to enter a tint level such as a colored state or a clear state. In some embodiments, when the tint level of the windows in the first zone is overridden with such a manual command, the tintable windows in the second zone (i.e., the slave control zone) remain under control of the window controller receiving outputs from the exterior sensor. The first zone may remain in a manual command mode for a period of time and then revert back to be under control of the window controller receiving output from the transmissivity sensor. For example, the first zone may stay in a manual mode for one hour after receiving an override command, and then may revert back to be under control of the window controller receiving output from the transmissivity sensor. In some other embodiments, the tintable windows in the second zone may remain in the tint level that they are in when the manual override for the first zone is received. The first zone may remain in a manual command mode for a period of time and then both the first zone and the second zone may revert back to be under control of the window controller receiving output from the transmissivity sensor.
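The timed manual-override behavior described above (a manual command takes effect, then control reverts to the sensor-driven controller after a period such as one hour) can be sketched as a small state holder. The class and method names are illustrative assumptions, not the disclosed implementation:

```python
import time

class ZoneOverride:
    """Track a manual tint override for a zone that reverts to
    automatic (sensor-driven) control after a fixed period."""

    def __init__(self, revert_after_s=3600):  # e.g., one hour
        self.revert_after_s = revert_after_s
        self._override_tint = None
        self._override_at = None

    def apply_manual(self, tint_level, now=None):
        """Record a manual override command (e.g., from a wall switch
        or a BMS user console)."""
        self._override_tint = tint_level
        self._override_at = time.time() if now is None else now

    def effective_tint(self, auto_tint, now=None):
        """Return the manual tint while the override window is open;
        otherwise clear the override and return the automatic tint."""
        now = time.time() if now is None else now
        if (self._override_at is not None
                and now - self._override_at < self.revert_after_s):
            return self._override_tint
        self._override_tint = self._override_at = None
        return auto_tint
```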
Any of the methods described herein of control of a tintable window, regardless of whether the window controller is a standalone window controller or is interfaced with a building network, may be used to control the tint of a tintable window.
Wireless or Wired Communication
In some embodiments, window controllers described herein include components for wired or wireless communication between the window controller, sensors, and separate communication nodes. Wireless or wired communications may be accomplished with a communication interface that interfaces directly with the window controller. Such interface could be native to the microprocessor or provided via additional circuitry enabling these functions.
A separate communication node for wireless communications can be, for example, another wireless window controller, an end, intermediate, or master window controller, a remote control device, or a BMS. Wireless communication is used in the window controller for at least one of the following operations: 1) programming and/or operating the tintable windows (e.g., electrochromic window 505), 2) collecting data from the tintable windows from the various sensors and protocols described herein, and 3) using the tintable windows as relay points for wireless communication. Data collected from tintable windows may also include count data such as, e.g., the number of times an electrochromic device has been activated, efficiency of the electrochromic device over time, and the like. These wireless communication features are described in more detail below.
In one embodiment, wireless communication is used to operate the associated electrochromic windows (e.g., electrochromic window 505), for example, via an infrared (IR), and/or radio frequency (RF) signal. In certain embodiments, the controller will include a wireless protocol chip, such as Bluetooth, EnOcean, WiFi, Zigbee, and the like. Window controllers may also have wireless communication via a network. Input to the window controller can be manually input by an end user at a wall switch, either directly or via wireless communication, or the input can be from a BMS of a building of which the electrochromic window is a component.
In one embodiment, when the window controller is part of a distributed network of controllers, wireless communication is used to transfer data to and from each of a plurality of electrochromic windows via the distributed network of controllers, each having wireless communication components. For example, referring again to
In some embodiments, more than one mode of wireless communication is used in the window controller distributed network. For example, a master window controller may communicate wirelessly to intermediate controllers via WiFi or Zigbee, while the intermediate controllers communicate with end controllers via Bluetooth, Zigbee, EnOcean, or other protocol. In another example, window controllers have redundant wireless communication systems for flexibility in end user choices for wireless communication.
Wireless communication between, for example, master and/or intermediate window controllers and end or leaf window controllers offers the advantage of obviating the installation of hard communication lines. This is also true for wireless communication between window controllers and the BMS or a building network. In one aspect, wireless communication in these roles is useful for data transfer to and from electrochromic windows for operating the window and providing data to, for example, a BMS for optimizing the environment and energy savings in a building. Window location data as well as feedback from sensors are synergized for such optimization. For example, granular level (window-by-window) microclimate information may be fed to a BMS and used to determine control instructions for the building systems in order to optimize the building's various environments.
Example of System for Controlling Functions of Tintable Windows
The system 700 includes a window control system 702 having a network of window controllers that can send control signals to the tintable windows to control their functions. The system 700 also includes a network 701 in electronic communication with the master controller 703. The predictive control logic, other control logic and instructions for controlling functions of the tintable window(s), sensor data, and/or schedule information regarding clear sky models can be communicated to the master controller 703 through the network 701. The network 701 can be a wired or wireless network (e.g., a cloud network). In one embodiment, the network 701 may be in communication with a BMS to allow the BMS to send instructions for controlling the tintable window(s) through the network 701 to the tintable window(s) in a building.
System 700 also includes electrochromic devices 780 of the tintable windows (not shown) and optional wall switches 790, which are both in electronic communication with the master controller 703. In this illustrated example, the master controller 703 can send control signals to electrochromic device(s) 780 to control the tint level of the tintable windows having the electrochromic device(s) 780. Each wall switch 790 is also in communication with electrochromic device(s) 780 and master controller 703. An end user (e.g., occupant of a room having the tintable window) can use the wall switch 790 to input an override tint level and other functions of the tintable window having the electrochromic device(s) 780.
In
In
Each wall switch 790 can be operated by an end user (e.g., occupant of the room) to control the tint level and other functions of the tintable window in communication with the wall switch 790. The end user can operate the wall switch 790 to communicate control signals to the EC device(s) 780 in the associated tintable window. These signals from the wall switch 790 may override signals from window control system 702 in some cases. In other cases (e.g., high demand cases), control signals from the window control system 702 may override the control signals from wall switch 790. Each wall switch 790 is also in communication with the leaf or end window controller 710 to send information about the control signals (e.g., time, date, tint level requested, etc.) sent from wall switch 790 back to master window controller 703, e.g., to be stored in memory. In some cases, the wall switches 790 may be manually operated. In other cases, the wall switches 790 may be wirelessly controlled by the end user using a remote device (e.g., cell phone, tablet, etc.) sending wireless communications with the control signals, for example, using infrared (IR) and/or radio frequency (RF) signals. In some cases, wall switches 790 may include a wireless protocol chip, such as Bluetooth, EnOcean, WiFi, Zigbee, and the like. Although the wall switches 790 depicted in
In another embodiment, system 700 also includes a multi-sensor device in electronic communication with the one or more controllers via the communication network 701 in order to communicate sensor readings and/or filtered sensor values to the controller(s).
II. General System Architecture
Actively maintaining and storing models of shadows and reflections on a building can be cumbersome and an inefficient use of computing resources at a building. The system architecture described herein does not require the window control system to actively generate these models of the building. Instead, models specific to the building site are generated and maintained on a cloud network or other network separate from the window control system. Tint schedule information derived from these models is pushed to the window control system. The window control system uses the tint schedule information derived from these predefined models, customized for the building, to make final tinting decisions implemented at the tintable windows. The models can be maintained, for example, on a cloud-based 3D modeling platform. The cloud-based 3D modeling platform can be used to generate visualizations of the 3D model of the building site to allow users to manage input for setting up and customizing the models of the building site and the corresponding final tint states applied to the tintable windows. With this system architecture, once the tint schedule information is loaded into the window control system, there is no need for modeling calculations to tie up computing power of the window control system. Tint schedule information resulting from any changes to the models on the cloud-based 3D modeling platform can be pushed to the window control system when and as needed. It would be understood that although the system architecture is generally described herein with respect to controlling tintable windows, other components at the building could additionally or alternatively be controlled with this architecture.
In various implementations, the system architecture includes cloud-based modules to setup and customize a 3D model of the building site. For example, the system architecture includes a cloud-based 3D modelling system that initializes the 3D model of the building site using architectural model(s) data as input, for example, data from an Autodesk® Revit model or other industry standard building model may be used. A 3D model in its simplest form includes exterior surfaces of structures of the building including window openings and a stripped version of the interior of the building with only floors and walls. More complex 3D models may include the exterior surfaces of objects surrounding the building as well as more detailed features of the interior and exterior of the building.
The system architecture also includes a cloud-based clear sky module that assigns reflective or non-reflective properties to the exterior surfaces of the objects in the 3D model, defines interior three-dimensional occupancy regions, assigns IDs to windows, and groups windows into zones based on input from user(s). Time-varying simulations of the resulting clear sky 3D model (i.e., the 3D model with configuration data having the assigned attributes) can be used to determine the direction of sunlight at the different positions of the sun in the sky under clear sky conditions, taking into account shadows and reflections from the objects at the building site, sunlight entering into spaces of the building through windows or other apertures, and the intersection of 3D projections of sunlight through the windows with three-dimensional occupancy regions in the building. The clear sky module uses this information to determine whether certain conditions exist for particular occupancy regions (i.e., from the perspective of the occupant) such as, for example, a glare condition, a direct and indirect reflection condition, and a passive heat condition. The clear sky module determines a clear sky tint state for each zone at each time interval based on the existence of particular conditions at that time, the tint states assigned to the conditions, and the priority of the different conditions if multiple conditions exist. Each zone has one or more tintable windows. The clear sky tint schedule information for each zone over a period of time, typically a year, is pushed to the window control system at the building (e.g., to a master controller). The window control system determines a weather-based tint state for each zone at each time interval based on sensor data such as measurements from infrared sensors and/or photosensors, or filtered sensor data such as a median/mean of rolling sensor readings taken over time.
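The priority-based selection of a clear sky tint state from whichever conditions are present can be sketched as follows. The condition names, priority ordering, and tint values are hypothetical configuration values, not values from the disclosure:

```python
# Hypothetical per-condition configuration; in the clear sky module
# these assignments and priorities come from user input.
CONDITION_PRIORITY = ["glare", "reflection", "passive_heat"]  # highest first
CONDITION_TINT = {"glare": 35, "reflection": 25, "passive_heat": 15}
DEFAULT_TINT = 0  # no condition present: lightest state

def clear_sky_tint(active_conditions):
    """Return the tint state for the highest-priority condition
    present in a zone at a given time interval."""
    for condition in CONDITION_PRIORITY:
        if condition in active_conditions:
            return CONDITION_TINT[condition]
    return DEFAULT_TINT
```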
The window control system then determines the minimum of the weather-based tint state and the clear sky tint state for each zone to set the final tint state, and sends tint instructions to implement the final tint state at the zones of the tintable windows. Thus, the window control systems described herein do not actively model the building or the 3D parameters around and inside the building; that modeling is done offline, and the computing power of the window control system is therefore primarily used to apply the tint states derived from the model, depending upon sensor or other inputs to the window control system.
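The minimum rule described above can be sketched per zone; the function name and zone identifiers are illustrative:

```python
def final_tints(weather_tints, clear_sky_tints):
    """Per-zone final tint state.

    For each zone, take the minimum (lighter) of the weather-based
    tint state and the clear sky schedule tint state, per the rule
    described above.
    """
    return {zone: min(weather_tints[zone], clear_sky_tints[zone])
            for zone in clear_sky_tints}
```

For example, a zone scheduled at tint 25 under clear sky but measuring heavy cloud cover (weather-based tint 10) would be set to 10.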
Although many examples of the control architecture and models are described herein with the 3D model platform and various models residing on a cloud network, in other implementations, one or more of the 3D model platform, models, and control modules do not reside on a cloud network. For example, one or more of the 3D model platform, models, and control modules may reside on a standalone computer or other computing device that is separate from and in communication with the window control system (e.g., window control system 840). Alternatively, in certain implementations, the control network may be an edge cloud network where the cloud is part of the window control system and/or BMS at the building of interest, at other building(s), or a combination of the building and other buildings.
Returning to
The system architecture 800 also includes a graphical user interface (GUI) 890 for communicating with customers and other users to provide application services, reports, and visualizations of the 3D model and to receive input for setting up and customizing the 3D model. Visualizations of the 3D model (also referred to herein as “3D building site visualizations”) can be provided to users and received from users through the GUI 890. The illustrated users include site operations 892 that are involved in troubleshooting at the building site and have the capability to review visualizations and edit the 3D model. The users also include a Customer Success Manager (CSM) 894 with the capability of reviewing visualizations and on-site configuration changes to the 3D model. The users also include a customer(s) configuration portal 898 in communication with various customers. Through the customer(s) configuration portal 898, the customer(s) can review various visualizations of data mapped to the 3D model and provide input to change the configuration at the building site. Some examples of input from the users include space configuration including e.g., occupancy areas in the building, 3D object definition at the building site, tint states for particular conditions, and the priority of the conditions at the building. Some examples of output provided to the users include visualizations of data on the 3D model such as visualization of tint states on the 3D model, standard reporting, and performance evaluation of the building. Certain users are depicted for illustrative purposes. It would be understood that other or additional users could be included.
A. Cloud-Based 3D Modelling System
In various implementations, the system architecture has a cloud-based 3D modelling system that can generate a 3D model (e.g., solid model, surface model, or wireframe model) of the building site on a 3D modelling platform. Various commercially-available programs can be used as the 3D modelling platform. An example of such a commercially-available program is Rhino® 3D software produced by McNeel North America of Seattle Wash. Another example of a commercially-available program is Autocad® computer-aided design and drafting software application by Autodesk® of San Rafael, Calif.
The 3D model is a three-dimensional representation of the buildings and other objects at the site of the building with the tintable windows. A building site generally refers to a region surrounding the building of interest. The region is typically defined to include all objects surrounding the building that would cause shadows or reflections on the building. The 3D model includes three-dimensional representations of the exterior surfaces of the buildings and other objects surrounding the building of interest, as well as a representation of the building of interest stripped of all elements except walls, floors, and exterior surfaces. The 3D modelling system can generate the 3D model, for example, automatically using a standard building model such as a Revit or other industry standard building model and stripping the modelled building of all its surfaces except walls, floors, and exterior surfaces with window openings. Any other object in the 3D model would be automatically stripped of all elements except exterior surfaces. As another example, the 3D model can be generated from scratch using 3D modelling software. An example of a 3D model of a building site 901 having three buildings is shown in
B. Cloud-Based Clear Sky Module
Recent installations of large numbers of tintable windows such as electrochromic windows, sometimes referred to as “smart windows,” in large-scale buildings have created an increased need for complex control and monitoring systems that can involve extensive computing resources. For example, a large-scale building deploying a high number of tintable windows may have a huge number of zones (e.g., 10,000), which would require complex and memory-intensive reflection and glare models. As these tintable windows continue to gain acceptance and are more widely deployed, they will require more sophisticated control systems and models that will involve storing and manipulating a large amount of data.
The system architecture described herein implements cloud-based modules on a 3D modelling platform to generate clear sky 3D models stored and maintained on a cloud network. These clear sky models include, for example, a glare/shadow model, a reflection model, and a passive heat model that are based on clear sky conditions at the building site. An example of a visualization of a glare/shadow model of the building site 901 in
The clear sky module includes logic that can be implemented to assign attributes to the 3D model to generate the clear sky 3D model. The clear sky module also includes logic that can be used to generate other models to determine various conditions such as, for example, a glare/shadow model, a reflection model, and a passive heat model. These models of the building site 901 can be used to generate a yearly schedule of tint states for the zones of the building that is pushed to the window control system at the building to make final tinting decisions. With this system architecture, most of the data is kept on the cloud network. Keeping the models on the cloud network allows for easy access to and customization by customers and other users. For example, visualizations of various models can be sent to the users to allow them to review and send input, for example, to set up and customize the models and/or override final tint states in the clear sky tint schedules or other system functions at the building. For example, the visualizations can be used by users to manage input used to assign attributes to the clear sky model, such as in zone management and window management as part of site setup or customization.
C. Graphical User Interface (GUI) for Site Setup and Customization
The system architecture also includes a GUI for interfacing with various customers and other users. The GUI can provide application services or reports to the users and receive input for the various models from the users. The GUI can, for example, provide visualizations of various models to the users. The GUI can also provide an interface for zone management, window management, and occupancy region definition to set up the clear sky model. The GUI can also provide an interface for entering priority data, reflective properties of exterior surfaces, override values, and other data. In addition, the users can use the GUI to customize the spaces of the 3D model, for example, after viewing visualizations of the clear sky model of the building site. Some examples of customizations include:
D. Window Control System
The system architecture described herein includes a window control system that includes a network of window controllers controlling the tint levels of the zones of tintable windows at the building. Some examples of controllers that may be included in the window control system of the system architecture are described with respect to
The window control system includes control logic for making tinting decisions and sending tint instructions to change tint levels of the windows. In certain embodiments, the control logic includes logic of a Module C and a Module D that determine a tint level for each zone based on infrared sensor and/or photosensor measurements. As discussed above, the clear sky model schedule information is pushed to the window control system. In one implementation, the control logic of the window control system determines the final tint state as the minimum value between the tint state from the yearly schedule information and the maximum tint state from Module C/D.
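For illustration, the final-tint decision described above can be sketched in Python. This is a minimal sketch under the assumption that tint states are integers where a higher value means a darker tint; the function name and parameters are hypothetical, not part of the disclosed system.

```python
def final_tint_state(schedule_tint: int, module_c_tint: int, module_d_tint: int) -> int:
    # Final tint is the minimum of the scheduled clear sky tint and the
    # maximum of the Module C and Module D tints (higher value = darker).
    return min(schedule_tint, max(module_c_tint, module_d_tint))
```

In this scheme, sensor-driven modules can only lighten the window relative to the clear sky schedule, never darken it beyond the scheduled state.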
As mentioned above, the window control system of the system architecture described herein does not generate models. The models maintained by the control architecture are specific to the building site and are maintained in the cloud network by the cloud-based modules.
E. General Process of System Architecture
As discussed above,
In
III. Clear Sky Module—Models Setup/Customizations and Generating Scheduling Information
The 3D model of a building site is initialized during a site setup process. In some implementations, the user is given the capability, e.g., through a GUI, of revising the 3D model to customize the control of the tintable windows and/or other systems in the building. These customizations can be reviewed by the user through visualizations using the 3D modelling platform. For example, customers or other users can view what has been designed for the building site after customization and how it will operate on a given day and provide “what if” scenarios. Also, different users can review the same 3D model stored on the cloud network to compare and discuss options that will cater to multiple users. For example, Customer Success Managers (CSMs) can review user locations, tint states by condition, priorities and expected behavior during clear sky conditions with the facility managers. The site setup process includes generating a 3D model of the building site and assigning attributes to the elements of the 3D model. The 3D model platform is typically used to generate a 3D model of the building site by stripping away unnecessary features from an architectural model of the building of interest and creating external surfaces of objects surrounding the building.
At operation 1520, a unique window ID is assigned to each window opening of the building of interest in the 3D model. In this window management operation, the window openings are mapped to unique window/controller IDs. In one implementation, these mappings may be validated and/or revised based on input from commissioning of the windows at installation in the building.
At operation 1530, window openings in the 3D model are grouped into zones, and zone IDs and/or zone names are assigned to the zones. In this zone management operation, window openings in the building of interest in the 3D model are mapped to zones. Each zone maps to one or more window openings.
At operation 1540, one or more 3D occupancy regions in the building of interest in the 3D model are determined. For example, the user may identify/define two-dimensional (2D) areas as occupancy areas on floors of the 3D model and also define an eye level of the occupant associated with each occupancy area. The logic of the clear sky module can generate an extrusion of each 2D occupancy area from the floor to the defined eye level to generate a 3D occupancy region in the 3D model.
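The extrusion step of operation 1540 can be sketched as follows. This is a simplified illustration assuming a polygonal occupancy area given as floor vertices; the function name and representation of the 3D region are hypothetical.

```python
def extrude_occupancy_region(area_2d, floor_z, eye_level):
    """Extrude a 2D occupancy area (a list of (x, y) floor vertices) upward
    from the floor elevation to the occupant eye level, returning the corner
    vertices of the resulting 3D occupancy region."""
    bottom = [(x, y, floor_z) for x, y in area_2d]
    top = [(x, y, floor_z + eye_level) for x, y in area_2d]
    return bottom + top
```

A four-sided occupancy area thus yields an eight-vertex prism bounded below by the floor and above by the eye-level plane.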
At operation 1550, the clear sky models that will be applied are determined and the clear sky models are run to determine the 3D projections of sunlight through the window openings. In this model management operation, the various clear sky models, e.g., glare/shadow model and reflection model, are generated. The clear sky module includes a ray tracing engine that can be used to determine the directions of rays of sunlight based on different positions of the sun in the sky throughout a day of a year or other time period. The ray tracing engine can also be used to determine the reflection direction and intensity of the reflected light from objects surrounding the building of interest from the location and reflective properties of the external surfaces of the objects surrounding the building. From these determinations, 3D projections of sunlight through the window openings in the 3D model can be determined.
At operation 1560, the amount and duration of any intersection of the 3D projections of sunlight through the window openings and the 3D occupancy region(s) is determined. For each time interval of the day, the clear sky models are run to determine the 3D projection of sunlight through the window openings and the amount of any intersection of the determined 3D projection with the 3D occupancy region(s). By determining the amount of the intersection at each time interval, the duration of the intersection can be determined.
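The duration calculation of operation 1560 can be sketched as a sum over time intervals. This is an illustrative sketch assuming per-interval overlap fractions have already been computed; the function name and the default 10-minute interval are assumptions drawn from the example interval mentioned elsewhere herein.

```python
def intersection_duration(overlap_fractions, interval_minutes=10, min_overlap=0.0):
    # Sum the length of every time interval in which the sunlight
    # projection overlaps the occupancy region by more than min_overlap.
    return sum(interval_minutes for f in overlap_fractions if f > min_overlap)
```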
At operation 1570, the conditions are evaluated based on the intersection properties determined at operation 1560. For example, a value for a glare condition for a particular zone can be determined based on the amount and duration of any intersection of the determined 3D projections through the window openings with the 3D occupancy region(s) in that zone.
At operation 1580, the priority data is applied to the condition values evaluated at operation 1570 to determine a tint state for each zone of the building over time, e.g., in a yearly schedule. For example, the process described with respect to
A. Window Management
During set up of the 3D model of the building site, each window opening is assigned a unique window ID that corresponds to a local window controller. Assigning the window opening to a window ID maps the window opening to a single window controller. Each window ID effectively represents each window controller that can be grouped into a zone. Alternatively or additionally, after installation of the windows and their controllers in a building, commissioning operations may be used to determine which window is installed in which location and paired to which window controller. These associations from the commissioning process can then be used to compare to and validate the mapping in the 3D model or update the mapping in the configuration data of the 3D model. An example of a commissioning process that can be used to determine such mappings is described in International application PCT/US2017/062634, filed on Nov. 11, 2017 and titled “AUTOMATED COMMISSIONING OF CONTROLLERS IN A WINDOW NETWORK,” which is hereby incorporated by reference in its entirety. The mapping of each window opening to a window ID may also be revised based on user customizations.
In one implementation, the user can select window openings in the 3D model on the 3D model platform and assign unique window IDs.
B. Zone Management
Each zone of a building includes one or more tintable windows. The tintable windows are represented as openings in the 3D model. The one or more tintable windows in each zone will be controlled to behave in the same way (e.g., transitioned to the same end tint state). This means that if the occupancy region(s) associated with one of the tintable windows in a zone experiences a particular condition, all the tintable windows in that zone will be controlled to react to that condition. The configuration data with attributes of the 3D model includes zone properties such as one or more of a name of the zone, an SHGC value of a representative window of the zone, and a maximum value of internal radiation.
During zone management as part of site setup or customization of the 3D model, a user can define the window openings that will be grouped together in zones and assign properties to the defined zones.
In one implementation, the user can select and group together multiple zones so that the multiple zones behave in the same way.
During zone management, each zone is assigned zone properties. Some examples of zone properties include: zone name (user defined), zone ID (system generated), IDs of windows, glass SHGC, and maximum allowable radiation into the space in watts per square meter (W/m²).
C. Generate 3D Occupancy Regions
As used herein, an occupancy region refers to a three-dimensional volume that is likely to be occupied during a particular time period. Occupancy regions are defined during site setup and can be re-defined during customization. In certain implementations, defining an occupancy region involves defining a three-dimensional volume by extruding a two-dimensional occupancy area on a surface (e.g., a floor) up to a horizontal plane at the eye level of an occupant, and assigning properties to the 3D occupancy region. Some examples of properties include occupancy region name, glare tint state (tint state if glare condition exists), direct reflection tint state (tint states for different levels of direct reflection radiation), and indirect reflection tint state (tint states for different levels of indirect reflection radiation).
In certain implementations, an occupancy region is generated on the 3D modelling platform. The user draws or otherwise defines the user location or area of activity as a two-dimensional shape (e.g., polygon) or shapes on the floor or other surface (e.g., desktop) of the 3D model and also defines an occupant eye level. An example of a two-dimensional four-sided user location drawn on the floor of a 3D model is shown in
D. Clear Sky Models
In certain implementations, a glare/shadow model, a direct reflection model, and an indirect reflection model are generated based on the 3D model of a building site. These models can be used to determine 3D projections of sunlight through the window openings of the building of interest in the 3D model over time based on clear sky conditions. A raytracing engine is used to simulate the directions of rays of sunlight at the location of the sun during each time interval. The simulations are run to evaluate different glare conditions in each of the zones of a building, such as a basic glare condition (direct radiation intersecting an occupancy region), a direct reflection glare condition (single-bounce reflection off a direct reflective surface to an occupancy region), and an indirect reflection glare condition (multiple-bounce reflection off indirect reflective surface(s) to an occupancy region). The simulations assume clear sky conditions and take into account shadowing on spaces and reflection by external objects surrounding the building. The simulations determine values of glare and other conditions in time intervals over a year or other time period. The clear sky tint schedule data includes values for each of the conditions and/or tint states for each time interval (e.g., every 10 minutes) over a time period such as a year.
Generally, the clear sky module includes logic to determine whether different conditions (e.g., glare, reflection, passive heat) exist at each zone of the building at each time interval (e.g., every ten minutes) of a time period such as a year. The clear sky module outputs tint schedule information of values for these conditions and/or associated tint states at each zone for each time interval. The value of a condition may be, for example, a binary value of 1 (condition does exist) or 0 (condition does not exist). In some cases, the clear sky module includes a raytracing engine that determines the direction of rays of sunlight (direct or reflected) based on the location of the sun at different times.
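The tint schedule information described above can be pictured as a simple tabulation of binary condition values per zone per time interval. The sketch below is illustrative only; the function name and data layout are assumptions, not the disclosed schedule format.

```python
def build_condition_schedule(zones, intervals, condition_exists):
    """Tabulate a binary condition value (1 = condition exists, 0 = does
    not) for every zone at every time interval, as in the clear sky tint
    schedule data. condition_exists(zone, interval) is a model query."""
    return {t: {z: int(condition_exists(z, t)) for z in zones} for t in intervals}
```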
In one aspect, the glare condition is evaluated based on multiple glare areas from the models in a single occupancy region. For example, light projections can intersect different occupancy areas within a single occupancy region. In one aspect, the conditions are evaluated based on multiple elevations within a single zone.
Glare Control
In certain implementations, a determination of a glare condition is made using the intersection (also referred to as an “overlap”) of a 3D projection of sunlight from the glare (absence of shadow) model and/or the direct reflection (one bounce) model with a three-dimensional (3D) occupancy region. A positive determination of basic glare from the glare model is a function of the percentage (%) of total intersection with the 3D occupancy region and the duration of the intersection. The determination of reflection glare based on the reflection model is a function of the duration of the intersection. The clear sky module includes logic for evaluating the existence (positive determination) of a glare condition based on the glare (absence of shadow) model and/or the direct reflection (one bounce) model based on surrounding objects to the building.
According to one implementation, for each zone, the logic determines from the glare model whether 3D projections of direct sunlight through the window openings of the zone intersect any of the three-dimensional occupancy regions in the zone. If the % intersection is greater than the minimum % of total intersection (the minimum threshold of overlap from the window projection into the occupancy region before a glare condition is considered) and the duration of the intersection is greater than the minimum duration of intersection (the minimum amount of time the intersection must occur before it becomes significant), then a glare condition value (e.g., 1) and the tint state associated with the glare condition are returned. If the logic determines from the glare model that a 3D projection of direct sunlight through the window openings does not intersect any of the three-dimensional occupancy regions in the zone, for example, because the zone is in a shadow, then a glare condition value (e.g., 0) and the tint state associated with no glare condition are returned. The logic takes the maximum tint state of the zones that may be linked together. If there are no intersections, a lowest tint state is returned (e.g., tint 1).
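The two-threshold glare test above can be sketched compactly. This is an illustrative sketch; the function name and threshold parameter names are hypothetical, and the specific threshold values would come from site configuration.

```python
def glare_condition_value(pct_intersection, duration_minutes,
                          min_pct_intersection, min_duration):
    # A glare condition exists only when both the percentage of total
    # intersection and the intersection duration exceed their minimums.
    if pct_intersection > min_pct_intersection and duration_minutes > min_duration:
        return 1  # glare condition exists
    return 0      # no glare condition (e.g., the zone is in shadow)
```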
In another implementation, the logic determines for each time interval, for each zone of tintable windows (collection of window openings), whether the sun is directly intersecting any of the three-dimensional occupancy regions. If any of the occupancy regions are simultaneously intersected, the output is that the condition exists. If none of the occupancy regions are intersected, the condition does not exist.
Reflected Radiation Control
The clear sky module includes logic for evaluating the existence of a reflection condition under clear sky conditions based on the models and for determining the lightest tint state that keeps the internal radiation below the maximum allowable internal radiation. The logic determines a radiation condition based on the direct normal radiation hitting the window openings of a zone. The logic determines a tint state as the clearest tint state that can keep the internal radiation below the defined threshold for that zone.
The logic determines the external normal radiation on the tintable window from the 3D model and calculates the internal radiation for each tint state by multiplying the determined level of external radiation by the glass SHGC of that tint state. The logic compares the maximum internal radiation for the zone to the calculated internal radiation for each of the tint states and chooses the lightest tint state whose calculated internal radiation does not exceed the maximum internal radiation for that zone. For example, suppose the external normal radiation from the model is 800 W/m², the maximum internal radiation is 200 W/m², and the SHGC values are T1=0.5, T2=0.25, and T3=0.1. The logic calculates the internal radiation for each tint state: Calc T1=(800)*0.5=400, Calc T2=(800)*0.25=200, and Calc T3=(800)*0.1=80. The logic would select T2, since T2 is the lightest tint state that keeps the internal radiation at or below the maximum (and is lighter than T3).
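The worked example above can be expressed as a short selection routine. This is a sketch under the assumption that a higher SHGC corresponds to a lighter tint state; the function name and dict-based representation of tint states are hypothetical.

```python
def reflection_tint_state(external_radiation, max_internal_radiation, shgc_by_tint):
    """Pick the lightest tint state whose internal radiation (external
    radiation multiplied by the glass SHGC) does not exceed the zone
    maximum; fall back to the darkest state if none qualifies."""
    # Higher SHGC = lighter tint, so iterate from lightest to darkest.
    for tint, shgc in sorted(shgc_by_tint.items(), key=lambda kv: -kv[1]):
        if external_radiation * shgc <= max_internal_radiation:
            return tint
    return min(shgc_by_tint, key=shgc_by_tint.get)  # darkest available
```

Running this on the example values (800 W/m² external, 200 W/m² maximum, SHGCs of 0.5/0.25/0.1) reproduces the selection of T2.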
In another implementation, the logic determines for each zone of windows (collection of openings), if the sun has a single bounce off of the external objects. If there is a reflection to any of the occupancy regions, then the reflection condition does exist. If reflection is not on any of the occupancy regions, the reflection condition does not exist.
Passive Heat Control
In certain implementations, the clear sky module includes logic for evaluating the existence of a passive heat condition that sets a darker tint state in the windows of a zone based on output from the clear sky models. The logic determines the external solar radiation hitting the tintable windows under clear sky conditions from the clear sky models. The logic determines the estimated clear sky heat entering the room based on the external radiation on the tintable windows. If the logic determines that the estimated clear sky heat entering the room is greater than a maximum allowable value, then the passive heat condition exists and a darker tint state is set for the zone based on the passive heat condition. The maximum allowable value may be set based on the external temperature at the building and/or user input. In one example, if the external temperature is low, the maximum allowable external radiation may be set very high to allow for an increased level of passive heat to enter the building space.
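The passive heat test can be sketched as below. The source does not give the heat-gain formula, so multiplying external radiation by the glass SHGC is a simplifying assumption here, and the function name is hypothetical.

```python
def passive_heat_condition(external_radiation, shgc, max_allowable_heat):
    # Estimate clear sky heat entering the room; the SHGC product is a
    # simplified stand-in for the actual heat-gain estimate.
    estimated_heat = external_radiation * shgc
    return estimated_heat > max_allowable_heat  # True -> set darker tint state
```

In cold weather, max_allowable_heat could be raised so that the condition rarely triggers and passive solar heating is admitted.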
E. Building Site Clear Sky Model Customizations
F. Visualizations
In certain implementations, the system architecture includes a GUI that allows the user to make changes to attributes of the clear sky model and see the resulting changes to the model and/or changes to the schedule data in visualizations on the 3D modelling platform. Visualizations of the building site on the 3D modelling platform can be used for the purposes of customization.
In one aspect, the user can see how the sun is impacting glare on a zone or group of zones by selecting the zone or group of zones and a date/time from the glare/shadow clear sky model.
In another aspect, the user can see how the sun is impacting reflection on a zone or group of zones by selecting the zone or group of zones and a date/time from the reflection model.
In another aspect, the user can see how the sun is impacting both glare and reflected light on a zone or group of zones by implementing both the reflection model and the glare/shadow clear sky model.
IV. Window Control System
Example logic modules based on infrared sensors and photosensors are described in international PCT application PCT/US17/55631, filed on Oct. 6, 2017 and titled “INFRARED CLOUD DETECTOR SYSTEMS AND METHODS,” which is hereby incorporated by reference in its entirety.
A. Control Method with Clear Sky Tint Schedule Data and Modules C and D
The operations of the illustrated iterative control logic in
At operation 2620, the control logic sets a tint level for a zone at a single instant in time ti based on one or more of a tint level calculated from the clear sky tint schedule data received from the clear sky module(s), a tint level calculated by logic Module C, and a tint level calculated by logic Module D. The control logic also performs the calculations to determine these tint levels. An example of calculations used to determine these tint levels is described in detail with respect to
In certain implementations, the control logic implemented by the window control system is predictive logic that calculates how the tintable window should transition in advance of the actual transition. In these cases, the determined tint levels are based on a future time, e.g., around or after the transition is complete. For example, the future time used in the calculations may be a time in the future that is sufficient to allow the transition to be completed after receiving the tint instructions. In these cases, the window controller can send tint instructions in the present time in advance of the actual transition. By the completion of the transition, the tintable window will have transitioned to a tint level that is desired for that future time.
Module C and Module D each determine a cloud cover condition and a tint level based on the determined cloud cover condition. The operations of Module C determine a cloud cover condition based on raw photosensor readings or a filtered photosensor value and determine a tint level based on the determined cloud cover condition. The photosensor readings/value can be sent directly to Module C as input. Alternatively, Module C can query a database for the photosensor readings/value. The photosensor readings are measurements taken by one or more photosensors, for example, of a multi-sensor device. In one aspect, the processor uses the logic of Module C to determine the cloud cover condition by comparing the photosensor readings/value with a threshold value. In one implementation, Module C can also calculate a filtered photosensor value using the raw photosensor readings. Generally, the operations of Module C determine a tint level that is the same as or lighter than the tint level in the clear sky tint schedule data from the clear sky model(s).
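A minimal sketch of the Module C behavior described above follows. The source does not specify the filter, so a median of the raw readings is used here as a stand-in; the function name, threshold semantics, and parameters are assumptions.

```python
from statistics import median

def module_c_tint(photosensor_readings, cloudy_threshold, schedule_tint, cloudy_tint):
    """Filter raw photosensor readings (median as a stand-in filter) and
    compare against a threshold: a low filtered value suggests cloud cover,
    permitting a tint the same as or lighter than the scheduled tint."""
    filtered = median(photosensor_readings)
    if filtered < cloudy_threshold:
        return min(schedule_tint, cloudy_tint)  # never darker than schedule
    return schedule_tint
```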
In certain implementations, the logic of Module D uses filtered infrared (IR) sensor values (e.g., rolling average or median values of readings) to determine a tint level for a zone of one or more electrochromic windows in a building. Module D includes logic to calculate the filtered IR sensor values based on sky temperature readings (Tsky) and ambient temperature readings from local sensors (Tamb) or from weather feed (Tweather), and/or a difference, delta (Δ), between sky temperature readings and ambient temperature readings. The raw sensor readings are input directly to Module D or retrieved from a database in response to a database query. The ambient temperature readings are measurements taken by local ambient temperature sensors, Tamb, or ambient temperature readings from weather feed, Tweather. The sky temperature readings are generally taken by one or more infrared sensors. The ambient temperature readings may be received from various sources. For example, the ambient temperature readings may be communicated from one or more ambient temperature sensors located onboard an infrared sensor and/or a standalone temperature sensor of, for example, a multi-sensor device at the building. As another example, the ambient temperature readings may be received from weather feed. Generally the operations of Module D determine a tint level that is darker or the same as the tint level determined by Module C. In other implementations, the logic of Module D uses raw infrared (IR) sensor values to determine a tint level for a zone of one or more electrochromic windows in a building.
Returning to
At operation 2650, the control logic determines whether a tint level has been determined for each zone of the building. If not, the control logic iterates to determine a final tint level for the next zone. Once a tint level has been determined for the final zone, the control signals for implementing the tint level for each zone are transmitted over a network to the power supply in electrical communication with the device(s) of the tintable windows of the zone to transition to the final tint level at operation 2660, and the control logic iterates for the next time interval, returning to operation 2610. For example, the tint level may be transmitted over a network to the power supply in electrical communication with electrochromic device(s) of the one or more electrochromic windows to transition the windows to the tint level. In certain embodiments, the transmission of tint levels to the windows of a building may be implemented with efficiency in mind. For example, if the recalculation of the tint level suggests that no change in tint from the current tint level is required, then no instructions with an updated tint level are transmitted. As another example, the control logic may recalculate tint levels for zones with smaller windows more frequently than for zones with larger windows.
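The change-only transmission optimization mentioned above can be sketched as a filter over the recalculated tint levels. The function name and dict-based zone mapping are illustrative assumptions.

```python
def tints_to_transmit(current_tints, recalculated_tints):
    # Skip transmission for zones whose recalculated tint equals the
    # current tint, so no redundant instructions cross the network.
    return {zone: tint for zone, tint in recalculated_tints.items()
            if current_tints.get(zone) != tint}
```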
In one case, the control logic in
Another example of control logic that can be used to derive tint levels from Module C and Module D is described with respect to
Also, there may be certain adaptive components of the control logic of embodiments. For example, the control logic may determine how an end user (e.g., occupant) tries to override the algorithm at particular times of day and then make use of this information in a more predictive manner to determine a desired tint level. For example, the end user may use a wall switch to override the tint level provided by the control logic to an override value at a certain time each day over a consecutive sequence of days. The control logic may receive information about these instances and change the control logic to apply the end user's override value at that time of day.
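One way to picture this adaptive component is sketched below. The source does not specify the learning rule, so this sketch simply counts repeated overrides (occurrence count is used as a simplification of "consecutive days"); the function name and log format are hypothetical.

```python
from collections import Counter

def learned_overrides(override_log, min_days=3):
    """From a log of (day, time_of_day, override_tint) entries, adopt an
    override value for any time of day at which the occupant applied the
    same override on at least min_days days."""
    counts = Counter((tod, tint) for _day, tod, tint in override_log)
    return {tod: tint for (tod, tint), n in counts.items() if n >= min_days}
```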
Examples of Module D
In certain implementations, Module D uses filtered infrared (IR) sensor values (e.g., rolling average or median values of readings) to determine a tint level for a zone of one or more electrochromic windows in a building. Module D includes logic to calculate the filtered IR sensor values using a Cloudy Offset value and sky temperature readings (Tsky) and ambient temperature readings from local sensors (Tamb) or from weather feed (Tweather), and/or a difference, delta (Δ), between sky temperature readings and ambient temperature readings. The Cloudy Offset value is a temperature offset that corresponds to the threshold values that will be used to determine the cloudy condition by the logic in Module D. The logic of Module D may be performed by the one or more processors of a network controller or a master controller. Alternatively, the logic of Module D may be performed by one or more processors of a multi-sensor device. In one case, the calculated filtered IR sensor values from Module D are saved into an IR sensor measurement database which is stored in memory. In this case, the one or more processors performing the calculations of Module D retrieve the IR sensor values as input from the IR sensor measurement database.
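The filtered IR value computation can be sketched as follows. This is an illustrative sketch: the median filter, the sign convention (a strongly negative sky/ambient delta suggesting clear sky, a delta near zero suggesting cloud cover), and the way the Cloudy Offset is applied are assumptions, and the function name is hypothetical.

```python
from statistics import median

def filtered_ir_delta(tsky_readings, tamb_readings, cloudy_offset=0.0):
    """Compute a filtered sky/ambient temperature difference, delta (Δ),
    from redundant IR sensor pairs. The Cloudy Offset shifts the value
    relative to the thresholds used to determine the cloudy condition."""
    deltas = [tsky - tamb for tsky, tamb in zip(tsky_readings, tamb_readings)]
    return median(deltas) + cloudy_offset
```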
In another implementation, the logic of Module D receives and uses raw sensor readings of measurements taken by two or more IR sensor devices at the building (e.g., of a rooftop multi-sensor device), each IR sensor device having an onboard ambient temperature sensor for measuring ambient temperature (Tamb) and an onboard infrared sensor directed to the sky for measuring sky temperature (Tsky) based on infrared radiation received within its field-of-view. Two or more IR sensor devices are typically used to provide redundancy. In one case, each infrared sensor device outputs readings of ambient temperature (Tamb) and sky temperature (Tsky). In another case, each infrared sensor device outputs readings of ambient temperature (Tamb), sky temperature (Tsky), and the difference between Tsky and Tamb, delta Δ. In one case, each infrared sensor device outputs readings of the difference between Tsky and Tamb, delta Δ. According to one aspect, the logic of Module D uses raw sensor readings of measurements taken by two IR sensor devices at the building. In another aspect, the logic of Module D uses raw sensor readings of measurements taken by 1-10 IR sensor devices at the building.
In another implementation, the logic of Module D receives and uses raw sky temperature (Tsky) readings taken by infrared sensors at the building, directed to the sky to receive infrared radiation within their fields-of-view, and ambient temperature readings from weather feed data (Tweather). The weather feed data is received from one or more weather services and/or other data sources over a communication network. Weather feed data generally includes data associated with weather conditions such as, for example, cloud coverage percentage, visibility data, wind speed data, temperature data, percentage probability of precipitation, and/or humidity. Typically, weather feed data is received in a signal through a communication network by a window controller. According to certain aspects, the window controller can send a signal with a request for the weather feed data through a communication interface over the communication network to one or more weather services. The request usually includes at least the longitude and latitude of the location of the window(s) being controlled. In response, the one or more weather services send a signal with weather feed data through the communication network through a communication interface to the window controller. The communication interface and network may be in wired or wireless form. In some cases, a weather service may be accessible through a weather website. An example of a weather website can be found at www.forecast.io. Another example is the National Weather Service (www.weather.gov). The weather feed data may be based on a current time or may be forecasted at a future time. More details regarding logic that uses weather feed data can be found in international application PCT/US16/41344, filed on Jul. 7, 2016 and titled “CONTROL METHOD FOR TINTABLE WINDOWS,” which is hereby incorporated by reference in its entirety.
Returning to
In certain implementations, Module D calculates the temperature value (Tcalc) based on sky temperature readings from two or more pairs of thermal sensors. Each pair of thermal sensors has an infrared sensor and an ambient temperature sensor. In one implementation, the thermal sensors of each pair are integral components of an IR sensor device. Each IR sensor device has an onboard infrared sensor and an onboard ambient temperature sensor. Two IR sensor devices are typically used to provide redundancy. In another implementation, the infrared sensor and ambient temperature sensor are separate.
At operation 2820, the logic of Module D calculates the temperature value (Tcalc) using:
Tcalc = minimum(Tsky1, Tsky2, . . . ) − minimum(Tamb1, Tamb2, . . . ) − Cloudy Offset   (Eqn. 1)
where Tsky1, Tsky2, . . . are temperature readings taken by the multiple infrared sensors and Tamb1, Tamb2, . . . are temperature readings taken by the multiple ambient temperature sensors. If two infrared sensors and two ambient temperature sensors are used, Tcalc = minimum(Tsky1, Tsky2) − minimum(Tamb1, Tamb2) − Cloudy Offset. Minimums of the readings from multiple sensors of the same type are used to bias the result toward lower temperature values, which indicate lower cloud cover and result in a higher tint level, thereby biasing the result toward avoiding glare.
In another implementation, the logic of Module D may switch from using a local ambient temperature sensor to using weather feed data when ambient temperature sensor readings become unavailable or inaccurate, for example, where an ambient temperature sensor is reading heat radiating from a local source such as from a rooftop. In this implementation, the temperature value (Tcalc) is calculated based on sky temperature readings and ambient temperature readings from weather feed data (Tweather). In this implementation, the temperature value is calculated as:
Tcalc = minimum(Tsky1, Tsky2, . . . ) − Tweather − Cloudy Offset   (Eqn. 2)
In another implementation, the temperature value (Tcalc) is calculated based on readings of the difference, Δ, between sky temperature and ambient temperature as measured by two or more IR sensor devices, each having an onboard infrared sensor and an onboard ambient temperature sensor. In this implementation, the temperature value is calculated as:
Tcalc = minimum(Δ1, Δ2, . . . ) − Cloudy Offset   (Eqn. 3)
where Δ1, Δ2, . . . are readings of the difference, Δ, between sky temperature and ambient temperature measured by multiple IR sensor devices.
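Eqns. 1-3 can be sketched in a single function, a minimal illustration assuming the sensor readings arrive as sequences of floats; the function and parameter names are hypothetical, not from the text:

```python
# Illustrative sketch of the Module D temperature-value (Tcalc) calculation
# per Eqns. 1-3. All names are hypothetical; the equation used depends on
# which inputs are available.
def t_calc(cloudy_offset, t_sky=None, t_amb=None, t_weather=None, deltas=None):
    """Return Tcalc per Eqn. 1, 2, or 3, depending on available inputs."""
    if deltas is not None:
        # Eqn. 3: readings of the difference (Tsky - Tamb) from IR sensor devices
        return min(deltas) - cloudy_offset
    if t_amb is not None:
        # Eqn. 1: minimums bias toward lower temperatures (less cloud cover)
        return min(t_sky) - min(t_amb) - cloudy_offset
    # Eqn. 2: fall back to weather-feed ambient temperature
    return min(t_sky) - t_weather - cloudy_offset
```

The `min(...)` calls reflect the text's use of minimums to bias the result toward lower temperature values and thus toward higher tint levels that avoid glare.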
In implementations that use Eqn. 1 and Eqn. 3, the control logic uses the difference between the sky temperature and the ambient temperature to determine the IR sensor value input to determine a cloud condition. Ambient temperature readings tend to fluctuate less than sky temperature readings. By using the difference between sky temperature and ambient temperature as input to determine tint state, the tint states determined over time may fluctuate to a lesser degree and provide a more stable tinting of the window.
In another implementation, the control logic calculates Tcalc based only on sky temperature readings from two or more infrared sensors. In this implementation, the filtered IR sensor value determined by Module D is based on sky temperature readings and not on ambient temperature readings. In this case, Module D determines a cloud condition based on sky temperature readings alone. Although the above-described implementations for determining Tcalc are based on two or more redundant sensors of each type, it should be understood that the control logic may be implemented with readings from a single sensor.
At operation 2830, the processor updates the short term box car and long term box car with the Tcalc determined in operation 2820. To update the box cars, the most recent sensor reading is added to the box cars and the oldest sensor reading is dropped out of the box cars.
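The box car update in operation 2830 amounts to a fixed-length sliding window: appending the newest reading drops the oldest. A minimal sketch, assuming illustrative box car lengths (n=3 and n=12 are not from the text):

```python
from collections import deque

# Fixed-length deques model the short and long box cars (operation 2830):
# appending a new Tcalc automatically drops the oldest reading when full.
# The box car lengths here are illustrative assumptions.
short_boxcar = deque(maxlen=3)
long_boxcar = deque(maxlen=12)

def update_boxcars(t_calc_value):
    # Add the most recent sensor reading to both box cars
    short_boxcar.append(t_calc_value)
    long_boxcar.append(t_calc_value)
```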
For Module D and other control logic described herein, filtered sensor values are used as input to making tinting decisions. Module D and other logic described herein determine filtered sensor values using short term and long term box cars (filters). A short box car (e.g., box car that employs sample values taken over 5 minutes, 10 minutes, 20 minutes, etc.) is based on a smaller number of sensor samples (e.g., n=1, 2, 3, . . . 10, etc.) relative to the larger number of sensor samples (e.g., n=10, 20, 30, 40, etc.) in a long box car (e.g., box car that employs sample values taken over 1 hour, 2 hours, etc.). A box car (illumination) value may be based on a mean, median, or other representative value of the sample values in the box car. In one example, the short box car value is a median value of sensor samples and the long box car value is a median value of sensor samples. Module D typically uses a rolling median value of sensor samples for each of the short box car value and long box car value. In another example, the short box car value is a mean value of sensor samples and the long box car value is a mean value of sensor samples. Module C typically uses filtered photosensor values that are determined from short and/or long box car values based on mean values of sensor samples.
Since a short box car value is based on a smaller number of sensor samples, short box car values more closely follow the current sensor readings than long box car values. Thus, short box car values respond to rapidly changing conditions more quickly and to a greater degree than the long box car values. Although both the calculated short and long box car values lag behind the sensor readings, short box car values lag to a lesser extent than the long box car values. Short box car values tend to react more quickly than long box car values to current conditions. A long box car can be used to smooth the response of the window controller to frequent short duration weather fluctuations like a passing cloud, while a short box car does not smooth as well but responds more quickly to rapid and significant weather changes like overcast conditions. In the case of a passing cloud condition, control logic using a long box car value will not react quickly to the short-lived passing cloud. In this case, the long box car value can be used in tinting decisions to maintain an appropriate high tint level. In the case of a fog burning off condition, it may be more appropriate to use a short term box car value in tinting decisions. In this case, the short term box car reacts more quickly to a new sunny condition after the fog burns off. By using the short term box car value to make tinting decisions, the tintable window can quickly adjust to the sunny condition and keep occupants comfortable as the fog rapidly burns off.
At operation 2840, the processor determines a short box car value (Sboxcar value) and the long box car value (Lboxcar value) based on the current sensor readings in the box cars updated at operation 2830. In this example, each box car value is calculated by taking the median value of the sensor readings in the box car after the last update made at operation 2830. In another implementation, each box car value is calculated by taking the mean value of the current sensor readings in each box car. In other implementations, other calculations of the sensor readings in each box car may be used.
In certain implementations, control logic described herein evaluates the difference between the short term box car value and the long term box car value to determine which box car value to implement in making decisions. For example, when the absolute value of the difference between the short term box car value and the long term box car value exceeds a threshold value, the short term box car value is used in the calculation. In this case, the short box car value differs from the long box car value by more than the threshold, which may indicate a short term fluctuation of large enough significance, e.g., a large cloud that may suggest transitioning to a lower tint state. If the absolute value of the difference between the short and long box car values does not exceed the threshold value, the long term box car value is used. Returning to
If the absolute value of the difference is above the delta threshold value, the Sboxcar value is assigned to the IR sensor value and the short term box car is reset to empty its values (operation 2860). If the absolute value of the difference is not above the delta threshold value, the Lboxcar value is assigned to the IR sensor value and the long term box car is reset to empty its values (operation 2870). At operation 2880, the filtered IR sensor value is saved to the IR sensor Measurement database.
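Operations 2840-2870 can be sketched as follows, assuming median box car values as the text describes for Module D; the function name is hypothetical:

```python
import statistics

# Hedged sketch of operations 2840-2870: compute the short and long box car
# values as medians, select the short box car value when it diverges from
# the long box car value by more than a delta threshold, and reset (empty)
# the box car whose value was used.
def filtered_ir_value(short_boxcar, long_boxcar, delta_threshold):
    s_value = statistics.median(short_boxcar)  # Sboxcar value
    l_value = statistics.median(long_boxcar)   # Lboxcar value
    if abs(s_value - l_value) > delta_threshold:
        short_boxcar.clear()   # operation 2860: use and reset short box car
        return s_value
    long_boxcar.clear()        # operation 2870: use and reset long box car
    return l_value
```

The returned value would then be saved to the IR sensor measurement database (operation 2880).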
Implementing Module C and/or Module D Depending on Solar Elevation
In the morning and evening, sunlight levels are low and readings taken by visible light photosensors in, for example, a multi-sensor are low values that might be considered consistent with a cloudy condition. For this reason, visible light photosensor readings taken during the morning and evening might falsely indicate a cloudy condition. In addition, any obstruction from a building or a hill/mountain may also result in a false positive indication for a cloudy condition based on visible light photosensor readings taken alone. Moreover, visible light photosensor readings taken before sunrise may result in a false positive cloudy condition if taken alone. Where control logic is predictive in determining a tint level at sunrise in advance based on visible light photosensor readings alone taken just before sunrise, a false positive cloudy condition might lead to transitioning an electrochromic window to a clear state at sunrise allowing glare in the room.
At operation 3022, the control logic determines whether the calculated solar elevation is less than 0 at a single instant in time ti. In one implementation, the control logic also determines the solar elevation based on the sun's position and calculates the sun's position based on the latitude and longitude of the building with the window(s) and the time of day, ti, and the day of the year (date). Publicly-available programs can provide the calculations for determining the sun's position.
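The solar elevation calculation from latitude, longitude, time of day, and date can be sketched with a common low-precision approximation (ignoring the equation of time and atmospheric refraction); this is an illustrative stand-in for the publicly-available programs the text mentions, not the document's own method:

```python
import math
from datetime import datetime, timezone

# Illustrative solar elevation approximation (for operation 3022), assuming
# UTC time input. Declination uses a simple cosine model; the hour angle is
# derived from longitude and clock time without the equation of time.
def solar_elevation(lat_deg, lon_deg, when_utc):
    day = when_utc.timetuple().tm_yday
    # Approximate solar declination in degrees for this day of the year
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Local solar time and hour angle (15 degrees per hour from solar noon)
    solar_time = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)
    lat, d, h = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    elevation = math.asin(math.sin(lat) * math.sin(d) +
                          math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(elevation)
```

A negative return value corresponds to the nighttime case tested at operation 3022.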
If the solar elevation is determined to be less than 0 at operation 3022, it is nighttime and the control logic sets a nighttime tint state at operation 3024. An example of a nighttime tint state is a cleared tint level which is the lowest tint state. A cleared tint level may be used as a nighttime tint state, for example, to provide security by allowing security personnel outside the building to see inside lighted room(s) of the building through the cleared windows. Another example of a nighttime tint state is a highest tint level, which can also provide privacy and/or security by not allowing others to see inside the building at nighttime when the windows are in the darkest tint state. If the solar elevation is determined to be less than 0, the control logic returns the nighttime tint state.
At operation 3030, the control logic determines whether the solar elevation is less than or equal to a lower solar elevation threshold value, Θ1 (e.g., Θ1 is about 5 degrees, about 10 degrees, about 12 degrees, or in a range of 10-15 degrees), or greater than or equal to 180 degrees minus an upper solar elevation threshold value, Θ2 (where Θ2 is about 5 degrees, about 10 degrees, about 12 degrees, or in a range of 10-15 degrees). If the solar elevation is in either of these ranges, the control logic determines that time ti is in the morning or evening.
Having determined that time ti is in the morning or evening, the control logic determines whether the solar elevation is increasing or decreasing at operation 3032. The control logic determines whether the solar elevation is increasing or decreasing by comparing the calculated solar elevation values over time. If the control logic determines that the solar elevation is increasing, it is determined to be morning and the control logic runs a morning IR sensor algorithm implementation of Module D at operation 3034. An example of a morning IR sensor algorithm that can be used is described with respect to the flowchart 3100 in
If the control logic determines that the solar elevation is not increasing (decreasing) at operation 3032, it is determined to be evening and the control logic runs an evening IR sensor algorithm implementation of Module D at operation 3036. An example of an evening IR sensor algorithm that can be used is described with respect to the flowchart 3200 illustrated in
If it is determined at operation 3030 that the solar elevation is greater than Θ1 and less than 180 − Θ2, then time ti is in the daytime region and the control logic runs a daytime algorithm that implements Module C and/or Module D to determine a tint level based on photosensor and/or infrared sensor readings (operation 3040). At operation 3040, the control logic sets the tint level to the minimum of the Module C/D tint level and the tint level from the clear sky tint schedule output from the clear sky models, and returns that minimum tint level.
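The dispatch across operations 3022-3040 can be sketched as a simple selector; the threshold defaults and the returned labels are illustrative stand-ins for the nighttime tint state and the morning, evening, and daytime algorithms:

```python
# Hedged sketch of the solar-elevation dispatch (operations 3022-3040).
# theta1 and theta2 default to illustrative 10-degree thresholds.
def pick_algorithm(elevation, elevation_increasing, theta1=10.0, theta2=10.0):
    if elevation < 0:
        return "nighttime tint state"             # operation 3024
    if elevation <= theta1 or elevation >= 180.0 - theta2:
        # morning or evening region (operation 3030)
        if elevation_increasing:                  # operation 3032
            return "morning IR sensor algorithm"  # operation 3034
        return "evening IR sensor algorithm"      # operation 3036
    return "daytime algorithm"                    # operation 3040
```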
Examples of Morning and Evening IR Sensor Algorithms of Module D
Module D queries an infrared sensor measurements database for a filtered IR sensor value and then determines a cloud condition and associated tint level based on the filtered IR sensor value. If the filtered IR sensor value is below a lower threshold value, it is a “Sunny” condition and the tint level from Module D is set to the highest tint level. If the filtered IR sensor value is above an upper threshold value, it is a “Cloudy” condition and the tint level from Module D is set to the lowest tint level. If the filtered IR sensor value is less than or equal to the upper threshold value and greater than or equal to the lower threshold value, the tint level from Module D is set to an intermediate tint level. The upper and lower threshold values used in these calculations are based on whether the morning IR sensor algorithm, evening IR sensor algorithm, or daytime algorithm is being implemented.
The control logic of the flowchart 3100 starts at operation 3110 and the filtered IR sensor value is compared with a Morning Lower threshold value to determine whether the filtered IR sensor value is less than the Morning Lower threshold value. The control logic of Module D queries an infrared sensor measurements database or other database to retrieve the filtered IR sensor value. Alternatively, the control logic calculates the filtered IR sensor value. An example of control logic that can be used to calculate the filtered IR sensor value and store the value to the infrared sensor measurements database is the control logic of Module D described with reference to the flowchart in
If it is determined at operation 3110 that the filtered IR sensor value is less than the Morning Lower threshold value, the filtered IR sensor value is in a lower region which is the “Clear” or “Sunny” region. In this case, the control logic sets the tint level from Module D to a high tint state (e.g. tint level 4) and passes the tint level from Module D (operation 3120).
If it is determined at operation 3110 that the filtered IR sensor value is not less than the Morning Lower threshold value, the control logic proceeds to determine whether the filtered IR sensor value is less than or equal to a Morning Upper threshold value and greater than or equal to a Morning Lower threshold value at operation 3130. The Morning Upper threshold is the temperature at the upper boundary of the filtered IR sensor values between the mid region (“Partly Cloudy” region) and the upper region (“Cloudy” region) that applies during the morning region of the day. In certain implementations, the Morning Upper threshold value is in the range of −20 to 20 millidegrees Celsius. In one example, the Morning Upper threshold value is 3 millidegrees Celsius.
If it is determined at operation 3130 that the filtered IR sensor value is less than or equal to the Morning Upper threshold value and greater than or equal to the Morning Lower threshold value, the filtered IR sensor value is determined to be in a mid region that is the “Partly Cloudy” region (operation 3140). In this case, the control logic sets the tint level of Module D to an intermediate tint state (e.g., tint level 2 or 3) and the tint level from Module D is passed.
If it is determined at operation 3130 that the filtered IR sensor value is not less than or equal to the Morning Upper threshold value and greater than or equal to the Morning Lower threshold value (i.e., the filtered sensor value is greater than the Morning Upper threshold value), the filtered IR sensor value is determined to be in an upper region that is the “Cloudy” or “Overcast” region (operation 3150). In this case, the control logic sets the tint level of Module D to a low tint state (e.g., tint level 2 or a lower tint level) and the tint level from Module D is passed.
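The morning IR sensor thresholding (operations 3110-3150) can be sketched as follows; the numeric tint levels follow the examples in the text (tint level 4 for the high state, 3 as one intermediate example, 2 for the low state), and the function name is hypothetical. The evening algorithm is structured the same way with Evening threshold values:

```python
# Illustrative sketch of the morning IR sensor algorithm of Module D.
# Lower filtered IR values indicate a clearer sky and a higher tint level.
def morning_tint_level(filtered_ir, morning_lower, morning_upper):
    if filtered_ir < morning_lower:
        return 4   # lower region: "Clear"/"Sunny" -> high tint (op. 3120)
    if filtered_ir <= morning_upper:
        return 3   # mid region: "Partly Cloudy" -> intermediate tint (op. 3140)
    return 2       # upper region: "Cloudy"/"Overcast" -> low tint (op. 3150)
```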
The control logic of the flowchart 3200 starts at operation 3210 and the filtered IR sensor value is compared with an Evening Lower threshold value to determine whether the filtered IR sensor value is less than the Evening Lower threshold value. The control logic of Module D queries an infrared sensor measurements database or other database to retrieve the filtered IR sensor value. Alternatively, the control logic calculates the filtered IR sensor value. An example of control logic that can be used to calculate the filtered IR sensor value and store the value to the infrared sensor measurements database is the control logic of Module D described with reference to the flowchart in
If it is determined at operation 3210 that the filtered IR sensor value is less than the Evening Lower threshold value, the control logic determines the filtered IR sensor value is in a lower region which is the “Clear” or “Sunny” region. In this case, the control logic sets the tint level from Module D to a high tint state (e.g. tint level 4) at operation 3220 and passes the tint level from Module D.
If it is determined at operation 3210 that the filtered IR sensor value is not less than the Evening Lower threshold value, the control logic proceeds to determine whether the filtered IR sensor value is less than or equal to an Evening Upper threshold value and greater than or equal to an Evening Lower threshold value at operation 3230. The Evening Upper threshold is the temperature at the upper boundary of the filtered IR sensor values between the mid region (“Partly Cloudy” region) and the upper region (“Cloudy” region) that applies during the evening region of the day. In certain implementations, the Evening Upper threshold value is in the range of −20 to 20 millidegrees Celsius. In one example, the Evening Upper threshold value is 5 millidegrees Celsius.
If it is determined at operation 3230 that the filtered IR sensor value is less than or equal to the Evening Upper threshold value and greater than or equal to the Evening Lower threshold value, the filtered IR sensor value is determined to be in a mid region that is the “Partly Cloudy” region (operation 3240). In this case, the control logic sets the tint level of Module D to an intermediate tint state (e.g., tint level 2 or 3) and the tint level from Module D is passed.
If it is determined at operation 3230 that the filtered IR sensor value is not less than or equal to the Evening Upper threshold value and greater than or equal to the Evening Lower threshold value (i.e., the filtered sensor value is greater than the Evening Upper threshold value), the filtered IR sensor value is determined to be in an upper region that is the “Cloudy” region (operation 3250). In this case, the control logic sets the tint level of Module D to a low tint state (e.g., tint level 2 or a lower tint level) and this tint level from Module D is passed.
Example of Daytime Algorithm of Module C and/or Module D
During the daytime, temperature readings taken by an infrared sensor can tend to fluctuate if the local area around the infrared sensor is heated up. For example, an infrared sensor located on a rooftop may be heated by the rooftop as it absorbs heat from the midday sun. In certain implementations, a daytime algorithm disables the use of IR sensor readings in its tinting decisions under certain circumstances and uses Module C to determine tint level from photosensor readings alone. In other circumstances, the daytime algorithm determines a first tint level based on IR sensor readings using Module D, determines a second tint level based on photosensor readings using Module C, and then sets the tint level to the maximum of the first and second tint levels.
At operation 3310, it is determined whether using IR sensor readings is enabled. In one case, the default setting for tinting control logic is to disable using the IR sensor readings unless photosensor readings are unavailable, for example, due to malfunctioning photosensors. In another case, the control logic disables using the IR sensor readings if the IR sensor data is not available, for example, due to malfunctioning IR sensors.
If it is determined at operation 3310 that using IR sensor readings is enabled, the control logic runs both the daytime IR sensor algorithm of Module D and the daytime photosensor algorithm of Module C (operation 3320). In this case, the control logic proceeds to both operation 3330 and operation 3332.
At operation 3330, the logic of a daytime IR sensor algorithm of Module D is run to determine a first tint state. A filtered IR sensor value is retrieved from an infrared sensor measurements database or other database. Alternatively, the logic of the daytime IR sensor algorithm calculates the filtered IR sensor value. An example of logic that can be used to calculate the filtered IR sensor value and store the value to the infrared sensor measurements database is the control logic of Module D described with reference to the flowchart in
At operation 3332, the logic of a daytime photosensor algorithm of Module C is run to determine a second tint level. Module C determines the second tint level based on real-time irradiance using photosensor readings. An example of control logic of Module C that can be used to determine the second tint level is described with respect to a flowchart 3500 shown in
At operation 3340, the logic of the daytime algorithm calculates the maximum of the first tint level determined by Module D based on IR sensor readings at operation 3330 and the second tint level determined by Module C based on photosensor readings at operation 3332. The tint level from the daytime algorithm is set to the maximum of the calculated first tint level and the calculated second tint level, and this tint level from Module C or D is returned.
If it is determined at operation 3310 that using IR sensor readings is not enabled, the control logic runs the daytime photosensor algorithm of Module C (operation 3350). At operation 3350, the logic of the daytime photosensor algorithm of Module C is run to determine the second tint level. In this case, the tint level from the daytime algorithm is set to the second tint level based on photosensor readings and this tint level from Module C is returned. An example of control logic of Module C that can be used to determine the second tint level is described with respect to the flowchart shown in
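The daytime algorithm's branching (operations 3310-3350) can be sketched as follows; the module functions are hypothetical stand-ins for the Module C and Module D daytime algorithms:

```python
# Hedged sketch of the daytime algorithm (operations 3310-3350).
# module_d_tint and module_c_tint are stand-in callables returning tint levels.
def daytime_tint_level(ir_enabled, module_d_tint, module_c_tint):
    if not ir_enabled:
        # operation 3350: photosensor-only path when IR readings are disabled
        return module_c_tint()
    # operations 3330/3332: run both algorithms, then take the maximum (op. 3340)
    return max(module_d_tint(), module_c_tint())
```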
Examples of Module C and Module C′
At operation 3520, the control logic uses thresholding to calculate the suggested tint level by determining whether the current filtered photosensor value has crossed one or more thresholds over a period of time. The period of time may be, for example, the time period between the current time and the last sample time taken by the photosensors or between the current time and the first of multiple sample readings previously taken. Photosensor readings may be taken on a periodic basis such as once a minute, once every 10 seconds, once every 10 minutes, etc.
In one implementation, thresholding involves using two threshold values: a lower photosensor threshold value and an upper photosensor threshold value. If it is determined that the photosensor value is higher than the upper photosensor threshold value, the photosensor value is in a higher region which is the “Clear” or “Sunny” region. In this case, the control logic determines the suggested tint level from Module C is a high tint state (e.g., tint level 4). If it is determined that the photosensor value is less than or equal to the upper photosensor threshold value and greater than or equal to the lower photosensor threshold value, the photosensor value is determined to be in a mid region that is the “Partly Cloudy” region. In this case, the control logic determines the suggested tint level from Module C is an intermediate tint state (e.g., tint level 2 or 3). If it is determined that the photosensor value is less than the lower photosensor threshold value, the photosensor value is determined to be in a lower region that is the “Cloudy” region. In this case, the control logic determines the suggested tint level from Module C is a low tint state (e.g., tint level 2 or a lower tint level).
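The photosensor thresholding can be sketched as follows, assuming that higher irradiance readings indicate a sunnier condition and using the example tint levels from the text; the function name and arguments are hypothetical:

```python
# Illustrative sketch of Module C thresholding (operation 3520): higher
# photosensor (irradiance) values map to sunnier conditions and higher tint.
def suggested_tint_level(photosensor_value, lower_threshold, upper_threshold):
    if photosensor_value > upper_threshold:
        return 4   # higher region: "Clear"/"Sunny" -> high tint
    if photosensor_value >= lower_threshold:
        return 3   # mid region: "Partly Cloudy" -> intermediate tint
    return 2       # lower region: "Cloudy" -> low tint
```

Note the ordering is inverted relative to the IR sensor algorithms, where lower (colder) sky readings indicate clearer conditions.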
If the current time ti is the first instant in time after the lockout period has ended, the control logic calculates the suggested tint level at operation 3520 based on the conditions monitored during the lockout period. The suggested tint level is based on a statistical evaluation of the input monitored during the lockout period. Various techniques can be used for the statistical evaluation of the input monitored during the wait time. One example is tint level averaging during the wait time. During the wait time, the control logic implements an operation that monitors the input and calculates the suggested tint levels. The operation then averages the determined tint levels over the wait time to determine which direction is suggested for a one tint level transition.
At operation 3525, the control logic determines whether the current time is during a lockout period. If the current time is during a lockout period, Module C does not change the tint level. During the lockout period, photosensor values of outside conditions are monitored. In addition, the control logic monitors the suggested tint levels determined by operation 3520 during the lockout period. If the current time is determined to not be during a lockout period, the control logic proceeds to operation 3530.
At operation 3530, the logic goes on to determine whether the current information suggests a tint transition. This operation 3530 compares the suggested tint level determined in operation 3520 with the current tint level applied to the one or more windows to determine whether the tint levels are different. If the suggested tint level is not different from the current tint level, the tint level is not changed.
At operation 3550, if the suggested tint level is different from the current tint level, the logic sets a new tint level that is one tint level toward the suggested tint level determined in operation 3520 (even if the suggested tint level is two or more tint levels from the current tint level). For example, if the suggested transition determined in operation 3520 is from a first tint level to a third tint level, the tint level returned by Module C is a transition of one tint level, to a second tint level.
At operation 3570, a lockout period is set to lock out transitions to other tint levels during the lockout period. During the lockout period, photosensor values of outside conditions are monitored. In addition, the control logic calculates a suggested tint level at intervals based on the conditions monitored during the lockout period. The new tint level passed from Module C is determined at operation 3550 as one tint level toward the suggested tint level determined in operation 3520.
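The one-level-toward-suggestion behavior of operations 3530-3570 can be sketched as follows; the function name and the returned flag indicating that a lockout period should start are illustrative:

```python
# Hedged sketch of operations 3530-3570: transition at most one tint level
# toward the suggested level per decision, then start a lockout period.
# Returns (new_tint_level, start_lockout).
def next_tint_level(current, suggested):
    if suggested == current:
        return current, False          # operation 3530: no transition needed
    step = 1 if suggested > current else -1
    # operation 3550: move one level toward the suggestion, even if the
    # suggested level is two or more levels away; operation 3570 then locks out
    return current + step, True
```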
At operation 3610, the processor(s) performing the logic operations of Module C′ receives as input raw photosensor readings at a current time. The photosensor readings may be received via a communication network at the building, for example, from a rooftop multi-sensor device. The received photosensor readings are real-time irradiance readings.
At operation 3620, processor(s) performing the logic operations of Module C′ calculates a photosensor value based on raw measurements taken by two or more photosensors. For example, the photosensor value may be calculated as the maximum value of measurements taken by the two or more photosensors at a single sample time.
At operation 3630, the processor(s) updates the short term box car and long term box car with the photosensor value determined in operation 3620. In Module C and other control logic described herein, filtered photosensor values are used as input to making tinting decisions. Module C and other logic described herein determine filtered sensor values using short term and long term box cars (filters). A short box car (e.g., box car that employs sample values taken over 5 minutes, 10 minutes, 20 minutes, etc.) is based on a smaller number of sensor samples (e.g., n=1, 2, 3, . . . 10, etc.) relative to the larger number of sensor samples (e.g., n=10, 20, 30, 40, etc.) in a long box car (e.g., box car that employs sample values taken over 1 hour, 2 hours, etc.). A box car (illumination) value may be based on a mean, median, or other representative value of the sample values in the box car. In one example, the short box car value is a mean value of sensor samples and the long box car value is a mean value of photosensor samples.
At operation 3640, the processor(s) determines the short box car value (Sboxcar value) and the long box car value (Lboxcar value) based on the current photosensor readings in the box cars updated at operation 3630. In this example, each box car value is calculated by taking the mean value of the photosensor readings in the box car after the last update made at operation 3630. In another example, each box car value is calculated by taking the median value of the photosensor readings in the box car after the last update made at operation 3630.
At operation 3650, the processor(s) performing the logic operations of Module C′ evaluates whether the absolute value of the difference between the Sboxcar value and the Lboxcar value is greater than a delta threshold value (|Sboxcar Value−Lboxcar Value|>Delta Threshold). In some cases, the value of the Delta Threshold is in the range of 0 millidegrees Celsius to 10 millidegrees Celsius. In one case, the value of the Delta Threshold is 0 millidegrees Celsius.
If the difference is above the delta threshold value, the Sboxcar value is assigned to the photosensor value and the short term box car is reset to empty its values (operation 3660). If the difference is not above the delta threshold value, the Lboxcar value is assigned to the photosensor value and the long term box car is reset to empty its values (operation 3670). At operation 3680, the photosensor value is saved to a database.
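The box car filtering of operations 3620–3670 can be sketched as follows. This is a minimal illustration only: the window sizes, the use of a mean as the representative box car value, and the threshold value are assumptions chosen for demonstration, not values from the specification.

```python
from collections import deque
from statistics import mean


class BoxCarFilter:
    """Sketch of the short/long box car logic (operations 3620-3670).

    Window sizes, the mean as the representative value, and the
    delta threshold are illustrative assumptions only.
    """

    def __init__(self, short_n=3, long_n=30, delta_threshold=20.0):
        self.short = deque(maxlen=short_n)  # short term box car (few samples)
        self.long = deque(maxlen=long_n)    # long term box car (many samples)
        self.delta_threshold = delta_threshold

    def update(self, *raw_readings):
        # Operation 3620: combine raw readings, e.g., take the maximum
        value = max(raw_readings)
        # Operation 3630: update both box cars with the combined value
        self.short.append(value)
        self.long.append(value)
        # Operation 3640: representative (here, mean) value of each box car
        s_val = mean(self.short)
        l_val = mean(self.long)
        # Operations 3650-3670: when short and long filters diverge
        # (rapidly changing conditions), use the responsive short value
        # and reset the short box car; otherwise use the smoother long
        # value and reset the long box car.
        if abs(s_val - l_val) > self.delta_threshold:
            self.short.clear()
            return s_val
        self.long.clear()
        return l_val
```

In this sketch, steady readings track the long box car, while a sudden jump (e.g., a passing cloud clearing) triggers the divergence branch and the filter follows the short box car instead.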
In certain implementations, control logic described herein uses filtered sensor values based on temperature readings from one or more infrared sensors and from ambient temperature sensors to determine a cloud condition in the morning and evening and/or at a time just before sunrise. The one or more infrared sensors operate generally independently of sunlight levels, allowing the tinting control logic to determine a cloud condition before sunrise and as the sun is setting, and thereby to determine and maintain a proper tint level during the morning and evening. In addition, the filtered sensor values based on the temperature readings from the one or more infrared sensors can be used to determine a cloud condition even when the visible light photosensors are shaded or otherwise obstructed.
IV. Outside Temperature
Tint decision logic may operate to maintain or improve occupant comfort while minimizing energy consumption. In some approaches, a glare condition from penetration of direct sunlight into the building interior and/or the current cloud conditions factor into decisions made by the tint decision logic. However, for buildings exposed to particularly hot outside temperatures (e.g., at or above 40° C.), additional measures may be advantageous to achieve both occupant comfort and limited energy consumption.
As indicated in the above-described embodiments, tint decisions may be based on information from various sources such as weather feeds, sensors, clear sky models, solar position models, etc. However, in some embodiments presented in this section, such tint decisions additionally employ outside temperature information, particularly for tintable windows that face in directions where they are not receiving direct sunlight (e.g., windows oriented azimuthally away from the sun's current position). For such tintable windows, the tint decision logic might, without using outside temperature information, determine that the windows should be in the clearest tint state, e.g., tint state 1, because little if any solar radiation is directly penetrating into a room of the building. For example, baseline tint decision logic may specify that a tintable window clears if it is not in a direct line of sight with the sun. But if the outside temperature is very high, e.g., 40° C. or higher, there may be so much infrared radiation passing through such a window that the room becomes uncomfortably hot very quickly. Therefore, from an indoor temperature and/or air conditioning usage perspective, a relatively clear tint state is ineffective. In other words, in a relatively clear tint state, either the room becomes uncomfortably hot or excessive air conditioning is required to maintain a comfortable temperature. Therefore, in accordance with some embodiments, when the outside temperature is over a defined threshold, the tint decision logic is configured to tint electrochromic windows to a higher tint state (lower optical transmissivity) than would otherwise be required by baseline or default tint decision logic.
Many electrochromic windows and other tintable windows are relatively transparent to infrared radiation. In other words, without tinting or in a relatively clear tint state, these windows may allow significant amounts of indirect, heat-producing solar radiation, particularly infrared radiation, to enter buildings. As an example, an electrochromic window in a clear tint state may have a transmissivity to near infrared radiation (at some wavelengths close to the visible spectrum) of about 50%.
In certain embodiments, temperature, and particularly outside temperature, is used to influence tint state decisions. In various embodiments, when the outside temperature is determined to exceed a particular threshold (or otherwise meet a set of conditions), the outside temperature is used to make the window tint state darker than it would be otherwise. However, the tint decision logic may incorporate outside temperature in many different ways. For example, where the outside temperature is particularly low (as encountered in the winter in temperate climates, at night time, and/or in polar latitudes), the tint decision logic may make the window darker than otherwise specified. This can prevent some IR radiation from escaping the building.
The outside temperature used in such logic may be monitored in various ways. One approach employs an external source of information such as a local weather station that provides weather feed data with current outside temperatures in the vicinity of the building. Another approach employs one or more sensors at or on a building. One such example employs a roof top sensor, such as a sky sensor as described in U.S. patent application Ser. No. 14/998,019 filed Oct. 6, 2015, and incorporated herein by reference in its entirety.
The outside temperature override function may be implemented in various ways. For example, the tint decision override may be implemented as one or more configuration files that apply to particular buildings, or portions of buildings, that are expected to be subject to excessive infrared radiation penetration when the sun is not in the line of sight of the building windows.
In various embodiments, the outside temperature override function described in this section may be used to override, or be used in conjunction with, tint decision logic as described in the sections above. For example, an outside temperature override may be implemented at the operation 2630 of
1. Example of an Outside Temperature Implementation
In this example, a separate temperature profile is created for each location where installation occurs, e.g., one for Las Vegas, Nevada, another for Phoenix, Arizona, a third for Anchorage, Alaska, another for Greensboro, North Carolina, etc. For each location, a temperature override profile may be created in which outside temperatures in certain ranges or above certain thresholds automatically cause the window to tint by some predefined amount beyond what the tint decision logic would otherwise set (e.g., based on a glare condition from solar penetration into the building, other information determined by clear sky models including reflection models, a cloud cover condition, etc.). This increase in tint state can be particularly defined for windows that would otherwise be in relatively clear tint states and/or for windows that are outside the sun's line of sight.
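One possible shape for such per-location temperature override profiles is sketched below. The profile structure, the location keys, and every threshold and bump value here are hypothetical illustrations, not values from the disclosure.

```python
# Hypothetical per-location temperature override profiles: each entry maps
# an outside-temperature lower bound (deg C) to an additional number of
# tint levels applied on top of the baseline tint decision. The locations,
# thresholds, and bump values are illustrative assumptions only.
OVERRIDE_PROFILES = {
    "las_vegas_nv": [(35.0, 1), (40.0, 2)],
    "phoenix_az":   [(33.0, 1), (38.0, 2), (43.0, 3)],
    "anchorage_ak": [],  # rarely hot enough to need an override
}


def tint_bump(profile_key, outside_temp_c):
    """Return the extra tint levels for a location at a given temperature.

    Thresholds in each profile are listed in ascending order, so the
    last satisfied threshold wins.
    """
    bump = 0
    for threshold, extra in OVERRIDE_PROFILES.get(profile_key, []):
        if outside_temp_c >= threshold:
            bump = extra
    return bump
```

A controller could add the returned bump to the baseline tint state for windows outside the sun's line of sight, capping at the darkest available state.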
2. Example of an Outside Temperature Implementation
In this example, an outside temperature reading overrides tint decision logic based on occupant-set preferences for particular outside temperatures in the occupants' particular rooms. In some embodiments, the occupants have only limited power to adjust default tint decision logic. For example, if an occupant's preference would require an air conditioning load exceeding a pre-set maximum, the tint decision logic may force a darker tint state than the state that would be set using the occupant's preference.
3. Example of an Outside Temperature Implementation
In this example, the tint state logic considers penetration of direct sunlight into a region of a building having a tintable window for which a tint state must be determined. For a direct line of sight, and consequently some penetration of direct solar radiation through the window, the tint state logic employs a default tint state logic to determine a baseline tint state (also referred to as “default tint state”) such as that employed in various embodiments described above. Only for tintable windows that do not have any penetration of solar radiation directly through the windows is the modified or override tint decision logic activated. In such cases, the outside temperature may be sufficiently high that the tint state control logic determines that the tint state of the window should be darker than that selected by the default tint state logic. An example of default tint state logic that can calculate penetration depth of direct sunlight into a room through a tintable window and considers the penetration depth in tint decisions is described in U.S. patent application Ser. No. 15/347,677 filed on Nov. 9, 2016, which is incorporated herein by reference in its entirety.
At operation 3710, a baseline or default tint state is determined for one or more tintable windows using baseline tint decision logic. Some examples of logic that may be included in baseline tint decision logic are, e.g., a clear sky model, Module C, Module D, etc. In one implementation, the baseline tint decision logic includes logic that determines the baseline tint state using a penetration depth of direct sunlight through a tintable window. The penetration depth may be calculated using the relative position of the sun with respect to the tintable window. In another implementation, the baseline tint decision logic includes logic that determines a position of the sun and determines the baseline tint state using the determined position of the sun. In yet another implementation, the baseline tint logic determines a baseline tint state using a glare condition value determined from clear sky tint schedule data generated by a clear sky model. In yet another implementation, the baseline tint logic determines a baseline tint state using an irradiance value, e.g., determined using a predicted clear sky irradiance and/or from a signal received from one or more sensors.
At operation 3720, it is determined whether the outside temperature is at or above a first threshold temperature (e.g., at least about 40° C.). In another implementation, alternatively or additionally, it is determined whether the outside temperature is at or below a second threshold temperature that is lower than the first threshold temperature.
If it is determined at operation 3720 that the outside temperature is at or above the first threshold temperature, this determination is used to determine a modified tint state that is darker than the baseline tint state, and tint instructions are provided, e.g., to a window controller, to transition the tintable window to the modified tint state (operation 3730). For example, the baseline tint decision logic may provide a baseline clear tint state after determining there is no glare condition on a tintable window that is outside the line of sight of direct sunlight. If the outside temperature is determined to be at or above the first threshold temperature, e.g., 40° C., it may be determined to use a modified tint state that is one or two levels higher than the clear tint state. As another example, the baseline tint decision logic may provide a dark tint state after determining that the tintable window is in the line of sight of direct sunlight and that sunlight directly penetrates into the building interior through the tintable window and overlaps an occupancy region in the room with the tintable window. In this example, if the outside temperature is determined to be at or above the first threshold temperature, e.g., 40° C., it is determined to maintain the baseline tint state.
If it is determined that the outside temperature is not at or above a first threshold temperature at operation 3720, the baseline or default tint state is maintained (operation 3740). Tint instructions may be provided to maintain the tintable window at the default tint state.
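Operations 3710 through 3740 can be summarized in the following sketch. The integer tint-level encoding, the one-level bump, the four-state scale, and the default 40° C. threshold are assumptions for illustration; the baseline tint state is whatever the baseline tint decision logic (operation 3710) produced.

```python
def apply_outside_temperature_override(baseline_tint, outside_temp_c,
                                       first_threshold_c=40.0,
                                       bump=1, max_tint=4):
    """Sketch of operations 3720-3740, assuming higher numbers are darker.

    The threshold, bump size, and 4-state scale are illustrative
    assumptions only; baseline_tint comes from baseline tint decision
    logic (operation 3710).
    """
    # Operation 3720: compare the outside temperature with the first threshold
    if outside_temp_c >= first_threshold_c:
        # Operation 3730: use a modified (darker) tint state, capped at
        # the darkest available state
        return min(baseline_tint + bump, max_tint)
    # Operation 3740: maintain the baseline/default tint state
    return baseline_tint
```

Note that when the baseline state is already the darkest state (as in the direct-glare example above), the cap leaves the baseline tint state unchanged.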
In certain embodiments described herein, control logic determines tint levels (states) based on a condition that is likely to occur at a future time (also referred to herein as a “future condition”). For example, a tint level may be determined based on the likelihood of the occurrence of a cloud condition at a future time (e.g., ti=present time+time duration such as the transition time for the one or more electrochromic windows). The future time used in these logic operations may be set to a time in the future that is sufficient to allow the transition of the window to the tint level to be completed after receiving the control instructions. In these cases, a controller can send instructions in the present time in advance of the actual transition. By the completion of the transition, the window will have transitioned to the tint level that is desired for that future time. In other embodiments, the disclosed control logic may be used to determine tint levels based on a condition occurring or likely to occur at present time, e.g., by setting the time duration to 0. For example, in certain electrochromic windows the transition time to a new tint level, e.g., to an intermediate tint level, may be very short so that sending instructions to transition to a tint level based on the present time would be appropriate.
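The future-time computation described above reduces to adding the window's transition duration to the present time; a minimal sketch, assuming the transition time is known in minutes (the function name is illustrative):

```python
from datetime import datetime, timedelta


def tint_decision_time(now: datetime, transition_minutes: float) -> datetime:
    """Return the future time t_i = present time + transition duration.

    Tint decisions are evaluated for this future time so that the window
    finishes transitioning as the predicted condition arrives. Setting
    transition_minutes to 0 evaluates for the present time, as for
    windows with very short transition times.
    """
    return now + timedelta(minutes=transition_minutes)
```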
It should be understood that techniques as described above can be implemented in the form of control logic using computer software in a modular or integrated manner. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know and appreciate other ways and/or methods to implement the disclosed techniques using hardware and a combination of hardware and software.
Any of the software components or functions described in this application may be implemented as software code to be executed by a processor using any suitable computer language such as, for example, Java, C++ or Python using, for example, conventional or object-oriented techniques. The software code may be stored as a series of instructions or commands on a computer-readable medium, such as a random access memory (RAM), a read only memory (ROM), a magnetic medium such as a hard drive or a floppy disk, or an optical medium such as a CD-ROM. Any such computer-readable medium may reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.
Although the foregoing disclosed embodiments have been described in some detail to facilitate understanding, the described embodiments are to be considered illustrative and not limiting. It will be apparent to one of ordinary skill in the art that certain changes and modifications can be practiced within the scope of the appended claims.
Although the foregoing disclosed embodiments for controlling lighting received through a window into a building's interior have been described in the context of optically switchable windows such as electrochromic windows, one can appreciate how the methods described herein may be implemented on appropriate controllers to adjust a position of a window shade, a window drapery, a window blind, or any other device that may be adjusted to limit or block light from reaching a building's interior space. In some cases, methods described herein may be used to control both the tint of one or more optically switchable windows and the position of a window shading device. All such combinations are intended to fall within the scope of the present disclosure.
One or more features from any embodiment may be combined with one or more features of any other embodiment without departing from the scope of the disclosure. Further, modifications, additions, or omissions may be made to any embodiment without departing from the scope of the disclosure. The components of any embodiment may be integrated or separated according to particular needs without departing from the scope of the disclosure.
This is a National Stage Application under 35 U.S.C. § 371 of International Patent Application PCT/US2020/047525 (designating the United States), titled “CONTROL METHODS AND SYSTEMS USING OUTSIDE TEMPERATURE AS A DRIVER FOR CHANGING WINDOW TINT STATES” and filed on Aug. 21, 2020, which claims benefit of and priority to U.S. Provisional Patent Application No. 62/891,102, titled “OUTSIDE TEMPERATURE AS A DRIVER FOR CHANGING WINDOW TINT STATE” and filed on Aug. 23, 2019. International Application PCT/US2020/047525 is also a continuation-in-part of International Application PCT/US2019/023268 (designating the United States), titled “CONTROL METHODS AND SYSTEMS USING EXTERNAL 3D MODELING AND SCHEDULE-BASED COMPUTING” and filed on Mar. 20, 2019, and a continuation-in-part of U.S. patent application Ser. No. 16/013,770, titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on Jun. 20, 2018. International Application PCT/US2019/023268 claims benefit of and priority to U.S. Provisional Patent Application No. 62/646,260, titled “METHODS AND SYSTEMS FOR CONTROLLING TINTABLE WINDOWS WITH CLOUD DETECTION” and filed on Mar. 21, 2018, and to U.S. Provisional Patent Application No. 62/666,572, titled “CONTROL METHODS AND SYSTEMS USING EXTERNAL 3D MODELING AND SCHEDULE-BASED COMPUTING” and filed on May 3, 2018. International Application PCT/US2019/023268 is also a continuation-in-part of U.S. patent application Ser. No. 16/013,770, which is a continuation of U.S. patent application Ser. No. 15/347,677, titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on Nov. 9, 2016, which is a continuation-in-part of International Application PCT/US15/29675 (designating the United States), titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on May 7, 2015, which claims priority to and benefit of U.S. Provisional Patent Application No. 61/991,375, titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on May 9, 2014. U.S. patent application Ser. No. 15/347,677 is also a continuation-in-part of U.S. patent application Ser. No. 13/772,969 (issued on May 2, 2017 as U.S. Pat. No. 9,638,987), titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on Feb. 21, 2013. Each of these applications is hereby incorporated by reference in its entirety and for all purposes.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2020/047525 | 8/21/2020 | WO | |

| Publishing Document | Publishing Date | Country | Kind |
|---|---|---|---|
| WO2021/041261 | 3/4/2021 | WO | A |
20190171081 | Zedlitz et al. | Jun 2019 | A1 |
20190230776 | Casey et al. | Jul 2019 | A1 |
20190250029 | Zedlitz et al. | Aug 2019 | A1 |
20190257143 | Nagel et al. | Aug 2019 | A1 |
20200007762 | Dallmeier | Jan 2020 | A1 |
20200057346 | Zedlitz et al. | Feb 2020 | A1 |
20200063490 | Hebeisen et al. | Feb 2020 | A1 |
20200072674 | Baker et al. | Mar 2020 | A1 |
20200096831 | Brown et al. | Mar 2020 | A1 |
20200260556 | Rozbicki et al. | Aug 2020 | A1 |
20200355977 | Brown et al. | Nov 2020 | A1 |
20200393733 | Brown | Dec 2020 | A1 |
20210003899 | Zedlitz et al. | Jan 2021 | A1 |
20210080319 | Brown et al. | Mar 2021 | A1 |
20210190991 | Frank et al. | Jun 2021 | A1 |
20210214274 | Friedman et al. | Jul 2021 | A1 |
20210325754 | Brown et al. | Oct 2021 | A1 |
20220113184 | Zedlitz | Apr 2022 | A1 |
20220214592 | Brown et al. | Jul 2022 | A1 |
20230004059 | Klawuhn et al. | Jan 2023 | A1 |
20230152654 | Klawuhn et al. | May 2023 | A1 |
20230341259 | Zedlitz et al. | Oct 2023 | A1 |
Number | Date | Country |
---|---|---|
1333807 | Jan 2002 | CN |
1359479 | Jul 2002 | CN |
1380482 | Nov 2002 | CN |
1097760 | Jan 2003 | CN |
2590732 | Dec 2003 | CN |
1534413 | Oct 2004 | CN |
1659080 | Aug 2005 | CN |
1672189 | Sep 2005 | CN |
1704556 | Dec 2005 | CN |
1822951 | Aug 2006 | CN |
201104273 | Aug 2008 | CN |
101322069 | Dec 2008 | CN |
101421558 | Apr 2009 | CN |
101438205 | May 2009 | CN |
101501757 | Aug 2009 | CN |
101600604 | Dec 2009 | CN |
101641618 | Feb 2010 | CN |
101678209 | Mar 2010 | CN |
101702036 | May 2010 | CN |
101707892 | May 2010 | CN |
101762920 | Jun 2010 | CN |
101969207 | Feb 2011 | CN |
102168517 | Aug 2011 | CN |
102203370 | Sep 2011 | CN |
102330530 | Jan 2012 | CN |
202110359 | Jan 2012 | CN |
202230346 | May 2012 | CN |
102183237 | Aug 2012 | CN |
102749781 | Oct 2012 | CN |
202794021 | Mar 2013 | CN |
103168269 | Jun 2013 | CN |
103370192 | Oct 2013 | CN |
103370490 | Oct 2013 | CN |
103370649 | Oct 2013 | CN |
103370986 | Oct 2013 | CN |
203271490 | Nov 2013 | CN |
103547965 | Jan 2014 | CN |
103649826 | Mar 2014 | CN |
103987909 | Aug 2014 | CN |
203870367 | Oct 2014 | CN |
104181612 | Dec 2014 | CN |
104321497 | Jan 2015 | CN |
104429162 | Mar 2015 | CN |
104685428 | Jun 2015 | CN |
104781493 | Jul 2015 | CN |
105143586 | Dec 2015 | CN |
105549293 | May 2016 | CN |
106103191 | Nov 2016 | CN |
106462023 | Feb 2017 | CN |
106796305 | May 2017 | CN |
106971028 | Jul 2017 | CN |
108351471 | Jul 2018 | CN |
10124673 | Nov 2002 | DE |
102014220818 | Apr 2016 | DE |
0445314 | Sep 1991 | EP |
0869032 | Oct 1998 | EP |
0920210 | Jun 1999 | EP |
1078818 | Feb 2001 | EP |
1441269 | Jul 2004 | EP |
0835475 | Sep 2004 | EP |
1510854 | Mar 2005 | EP |
1417535 | Nov 2005 | EP |
1619546 | Jan 2006 | EP |
2161615 | Mar 2010 | EP |
2357544 | Aug 2011 | EP |
2590095 | May 2013 | EP |
2764998 | Aug 2014 | EP |
3114903 | Jan 2017 | EP |
2517332 | Sep 2018 | EP |
2638429 | Feb 2021 | EP |
3026496 | Apr 2016 | FR |
2462754 | Feb 2010 | GB |
S6122897 | Feb 1986 | JP |
S6282194 | Apr 1987 | JP |
S63208830 | Aug 1988 | JP |
H02132420 | May 1990 | JP |
H0431833 | Feb 1992 | JP |
H04363495 | Dec 1992 | JP |
H05178645 | Jul 1993 | JP |
H06282194 | Oct 1994 | JP |
H1063216 | Mar 1998 | JP |
H10159465 | Jun 1998 | JP |
H10249278 | Sep 1998 | JP |
2000008476 | Jan 2000 | JP |
2000096956 | Apr 2000 | JP |
2002148573 | May 2002 | JP |
2004170350 | Jun 2004 | JP |
2004245985 | Sep 2004 | JP |
2005054356 | Mar 2005 | JP |
2005282106 | Oct 2005 | JP |
2005314870 | Nov 2005 | JP |
2006009281 | Jan 2006 | JP |
2006029027 | Feb 2006 | JP |
2007120090 | May 2007 | JP |
2007308971 | Nov 2007 | JP |
2008502949 | Jan 2008 | JP |
2008531879 | Aug 2008 | JP |
2009508387 | Feb 2009 | JP |
2009540376 | Nov 2009 | JP |
2010101151 | May 2010 | JP |
2010529488 | Aug 2010 | JP |
4694816 | Jun 2011 | JP |
4799113 | Oct 2011 | JP |
2013057975 | Mar 2013 | JP |
2016516921 | Jun 2016 | JP |
6541003 | Jul 2019 | JP |
6818386 | Jan 2021 | JP |
200412640 | Mar 2006 | KR |
100752041 | Aug 2007 | KR |
20080022319 | Mar 2008 | KR |
20090026181 | Mar 2009 | KR |
100904847 | Jun 2009 | KR |
100931183 | Dec 2009 | KR |
20100034361 | Apr 2010 | KR |
20110003698 | Jan 2011 | KR |
20110052721 | May 2011 | KR |
20110094672 | Aug 2011 | KR |
20110118783 | Nov 2011 | KR |
20130018527 | Feb 2013 | KR |
20140139894 | Dec 2014 | KR |
101815919 | Jan 2018 | KR |
29501 | May 2003 | RU |
200532346 | Oct 2005 | TW |
200920987 | May 2009 | TW |
M368189 | Nov 2009 | TW |
201029838 | Aug 2010 | TW |
201215981 | Apr 2012 | TW |
201231789 | Aug 2012 | TW |
201243470 | Nov 2012 | TW |
201248286 | Dec 2012 | TW |
I395809 | May 2013 | TW |
201447089 | Dec 2014 | TW |
WO-9632560 | Oct 1996 | WO |
WO-9816870 | Apr 1998 | WO |
WO-0209338 | Jan 2002 | WO |
WO-0213052 | Feb 2002 | WO |
WO-2004003649 | Jan 2004 | WO |
WO-2005098811 | Oct 2005 | WO |
WO-2005103807 | Nov 2005 | WO |
WO-2007016546 | Feb 2007 | WO |
WO-2007146862 | Dec 2007 | WO |
WO-2008030018 | Mar 2008 | WO |
WO-2008048181 | Apr 2008 | WO |
WO-2008147322 | Dec 2008 | WO |
WO-2009044330 | Apr 2009 | WO |
WO-2009124647 | Oct 2009 | WO |
WO-2010007988 | Jan 2010 | WO |
WO-2010079388 | Jul 2010 | WO |
WO-2010120771 | Oct 2010 | WO |
WO-2011020478 | Feb 2011 | WO |
WO-2011087677 | Jul 2011 | WO |
WO-2011087684 | Jul 2011 | WO |
WO-2011087687 | Jul 2011 | WO |
WO-2011124720 | Oct 2011 | WO |
WO-2011127015 | Oct 2011 | WO |
WO-2012079159 | Jun 2012 | WO |
WO-2012080589 | Jun 2012 | WO |
WO-2012080618 | Jun 2012 | WO |
WO-2012080656 | Jun 2012 | WO |
WO-2012080657 | Jun 2012 | WO |
WO-2012125332 | Sep 2012 | WO |
WO-2012138074 | Oct 2012 | WO |
WO-2012145155 | Oct 2012 | WO |
WO-2013059674 | Apr 2013 | WO |
WO-2013102932 | Jul 2013 | WO |
WO-2013105244 | Jul 2013 | WO |
WO-2013109881 | Jul 2013 | WO |
WO-2013130781 | Sep 2013 | WO |
WO-2013155467 | Oct 2013 | WO |
WO-2014045163 | Mar 2014 | WO |
WO-2014078429 | May 2014 | WO |
WO-2014121863 | Aug 2014 | WO |
WO-2014130471 | Aug 2014 | WO |
WO-2014134451 | Sep 2014 | WO |
WO-2014150153 | Sep 2014 | WO |
WO-2014165692 | Oct 2014 | WO |
WO-2014209812 | Dec 2014 | WO |
WO-2015023842 | Feb 2015 | WO |
WO-2015077097 | May 2015 | WO |
WO-2015095615 | Jun 2015 | WO |
WO-2015171886 | Nov 2015 | WO |
WO-2016004109 | Jan 2016 | WO |
WO-2016029156 | Feb 2016 | WO |
WO-2016029165 | Feb 2016 | WO |
WO-2016053960 | Apr 2016 | WO |
WO-2016054112 | Apr 2016 | WO |
WO-2016058695 | Apr 2016 | WO |
WO-2016085964 | Jun 2016 | WO |
WO-2016094445 | Jun 2016 | WO |
WO-2016191406 | Dec 2016 | WO |
WO-2017007942 | Jan 2017 | WO |
WO-2017059362 | Apr 2017 | WO |
WO-2017062592 | Apr 2017 | WO |
WO-2017075472 | May 2017 | WO |
WO-2017189437 | Nov 2017 | WO |
WO-2018034935 | Feb 2018 | WO |
WO-2018038972 | Mar 2018 | WO |
WO-2018039433 | Mar 2018 | WO |
WO-2018067996 | Apr 2018 | WO |
WO-2018098089 | May 2018 | WO |
WO-2018112095 | Jun 2018 | WO |
WO-2018112095 | Jul 2018 | WO |
WO-2018140495 | Aug 2018 | WO |
WO-2018157063 | Aug 2018 | WO |
WO-2019183289 | Sep 2019 | WO |
Entry |
---|
“SageGlass helps Solar Decathlon- and AIA award-winning home achieve net-zero energy efficiency” in MarketWatch.com, http://www.marketwatch.com/story/sageglass-helps-solar-decathlon-and-aia-award-winning-home-achieve-net-zero-energy-efficiency-2012-06-07, Jun. 7, 2012. |
American Chemical Society, “Solar smart window could offer privacy and light control on demand (video),” EurekAlert! Pub Release, Nov. 16, 2016 [https://www.eurekalert.org/pub_releases/2016-11/acs-ssw111616.php]. |
APC by Schneider Electric, Smart-UPS 120V Product Brochure, 2013, 8 pp. |
AU Examination Report dated Aug. 28, 2021, in AU Application No. 2020202011. |
Australian Examination Report dated Dec. 19, 2018 in AU Application No. 2017270472. |
Australian Examination Report dated Feb. 21, 2019 in AU Application No. 2018201341. |
Australian Examination Report dated Mar. 31, 2017 in AU Application No. 2014219076. |
Australian Examination Report dated May 20, 2021 in AU Application No. 2020202135. |
Australian Examination Report dated Sep. 9, 2016 in AU Application No. 2013249621. |
Australian Examination Report No. 2 dated Feb. 12, 2020 in AU Application No. 2018201341. |
Australian Notice of Acceptance for Patent Application, dated Sep. 29, 2020, for Australian Patent Application No. 2015255913. |
Australian Office Action dated Jul. 1, 2019 in AU Application No. 2015255913. |
Benson D. K. et al., “Design goals and challenges for a photovoltaic-powered electrochromic window covering”, Solar Energy Materials and Solar Cells, vol. 39, No. 2/04, Dec. 1995, pp. 203-211. |
Boltwood Cloud Sensor II by Diffraction Limited, 2016, [online], [retrieved Dec. 15, 2016]. Retrieved from the internet URL http://diffractionlimited.com/product/boltwood-cloud-sensor-ii/. |
CA Office Action dated Dec. 24, 2021, in Application No. CA2948668. |
Campbell-Burns, Peter, “Building a Cloud Sensor”, Farnham Astronomical Society, (Apr. 15, 2013), Retrieved from the internet: URL: https://www.farnham-as.co.uk/2013/04/building-a-cloud-sensor/ [retrieved on 2020-04-24]. |
Canadian Notice of Allowance dated Aug. 12, 2020 in Canadian Application No. 2,902,106. |
Canadian Notice of Allowance dated Jan. 18, 2021 in Canadian Application No. 2,902,106. |
Canadian Office Action dated Feb. 11, 2021 in CA Application No. 2,870,627. |
Canadian Office Action dated Jan. 28, 2020 in Canadian Application No. 2,902,106. |
Canadian Office Action dated Jun. 10, 2021 in CA Application No. 2,948,668. |
Chinese Notice of Allowance & Search Report dated Sep. 12, 2019 in CN Application No. 201580035315.2. |
Chinese Notice of Allowance dated Jun. 3, 2021 in CN Application No. 201680043725.6, No Translation. |
Chinese Office Action dated Apr. 5, 2016 in Chinese Application No. 201280023631.4. |
Chinese Office Action dated Aug. 19, 2019 in CN Application No. 201610645398.3. |
Chinese Office Action dated Aug. 23, 2019 in CN Application No. 201680063892.7. |
Chinese Office Action dated Dec. 1, 2016 in Chinese Application No. 201280023631.4. |
Chinese Office Action dated Dec. 16, 2020 in CN Application No. 201680063892.7, with English Translation. |
Chinese Office Action dated Dec. 19, 2018 in CN Application No. 201610645398.3. |
Chinese Office Action dated Dec. 25, 2018 in CN Application No. 201710111979.3. |
Chinese Office Action dated Feb. 2, 2021 in Chinese Application No. 201880022572.6, with English Translation. |
Chinese Office Action dated Feb. 3, 2020 in Chinese Application No. 201710600395.2, with English Translation. |
Chinese Office Action dated Feb. 9, 2018 in CN Application No. 201480022064.X. |
Chinese Office Action dated Jan. 12, 2021 in CN Application No. 201780065447.9 with Translation. |
Chinese Office Action dated Jan. 13, 2021 in Chinese Application No. 201811232377.4, with English Translation. |
Chinese Office Action dated Jan. 14, 2019 in CN Application No. 201580035315.2. |
Chinese Office Action dated Jan. 21, 2020 in Chinese Application No. 201811232377.4, with English Translation. |
Chinese Office Action dated Jul. 14, 2020 in CN Application No. 201680063892.7, with English Translation. |
Chinese Office Action dated Jul. 2, 2018 in Chinese Application No. 201710111979.3. |
Chinese Office Action dated Jun. 23, 2021 in Chinese Application No. 201811232377.4, with English Translation. |
Chinese Office Action dated Jun. 26, 2015 in Chinese Application No. 201280023631.4. |
Chinese Office Action dated Jun. 27, 2016 in Chinese Application No. 201480022064.X. |
Chinese Office Action dated Jun. 28, 2020 in CN Application No. 201680043725.6. |
Chinese Office Action dated Mar. 10, 2020 in CN Application No. 201610645398.3, with English Translation. |
Chinese Office Action dated Mar. 26, 2015 in CN Application No. 201280060910.8. |
Chinese Office Action dated May 15, 2017 in Chinese Application No. 201480022064.X. |
Chinese Office Action dated May 20, 2021 in Chinese Application No. 201710600395.2, with English Translation. |
Chinese Office Action dated Nov. 27, 2015 in Chinese Application No. 201280060910.8. |
Chinese Office Action dated Nov. 3, 2020 in Chinese Application No. 201710600395.2, with English Translation. |
Chinese Office Action dated Oct. 10, 2015 in CN Application No. 201380026428.7. |
CN Office Action dated Aug. 4, 2021, in CN Application No. 201780039437.8 with English translation. |
CN Office Action dated Aug. 17, 2021, in CN Application No. 201680063892.7 with English translation. |
CN Office Action dated Nov. 3, 2021, in Application No. 201780065447.9 with English translation. |
CN Office Action dated Nov. 8, 2021, in Application No. 201880022572.6 with English translation. |
CN Office Action dated Nov. 10, 2021, in Application No. CN201811232377.4 with English Translation. |
CN Office Action dated Nov. 24, 2021, in Application No. 201780084583.2 with English Translation. |
CN Office Action dated Oct. 29, 2021 in CN Application No. CN201710600395.2 with English translation. |
Communication re Third-Party Observation dated Dec. 4, 2014 and Third-Party Observation dated Dec. 3, 2014 in PCT/US2014/016974. |
Co-pending U.S. Appl. No. 17/573,509, filed Jan. 11, 2022. |
Decision to Grant, dated Oct. 27, 2020, for Japanese Patent Application No. JP 2019-031229, with partial translation. |
Duchon, Claude E. et al., “Estimating Cloud Type from Pyranometer Observations,” Journal of Applied Meteorology, vol. 38, Jan. 1999, pp. 132-141. |
English translation of CN201104273 description from worldwide.espacenet.com. |
English translation of JP2004170350 description from worldwide.espacenet.com. |
EP Extended Search Report dated Apr. 29, 2020 in EP Application No. 17881918.1. |
EP Extended Search Report dated Dec. 4, 2020 in EP Application No. 18756696.3. |
EP Extended Search Report dated Mar. 23, 2020 in EP Application No. 17807428.2. |
EP Extended Search Report dated May 12, 2020 in EP Application No. 17859286.1. |
EP Extended Search Report dated May 16, 2019 in EP Application No. 16852784.4. |
EP Invitation to Indicate Search dated Jun. 22, 2016 in EP Application No. 14753897.9. |
EP Office Action dated Oct. 1, 2021, in application No. EP17807428.2. |
EP Partial Supplemental Search Report dated Apr. 12, 2019 in EP Application No. 16852784.4. |
EP Search Report dated Nov. 25, 2021, in Application No. EP21171305.2. |
EPO Communication dated Sep. 2, 2015 in EP Application No. 14753897.9 re Third-Party Observations. |
European Extended Search Report dated Jan. 17, 2019 in EP Application No. 16821984.8. |
European Extended Search Report dated Jan. 18, 2019 in EP Application No. 18208971.4. |
European Extended Search Report dated Jun. 18, 2019 in EP Application No. 19165771.7. |
European Extended Search Report dated Oct. 12, 2016 in EP Application No. 14753897.9. |
European Intention to Grant, dated Jan. 18, 2021, in EP Application No. 18208971.4. |
European Intention to Grant, dated Jul. 9, 2020, in EP Application No. 18208971.4. |
European Intention to Grant, dated Mar. 23, 2021, in EP Application No. 18208971.4. |
European Intention to Grant, dated Sep. 21, 2020, in EP Application No. 19165771.7. |
European Office Action dated Dec. 12, 2017 in EP Application No. 14753897.9. |
European Office Action dated Dec. 2, 2015 in EP Application No. 12841714.4. |
European Office Action dated Jul. 15, 2019 in EP Application No. 13777540.9. |
European Office Action dated Mar. 12, 2021 in EP Application No. 16852784.4. |
European Office Action dated Mar. 20, 2020 in EP Application No. 16852784.4. |
European Office Action dated May 15, 2017 in EP Application No. EP 12841714.4. |
European Office Action dated May 3, 2021 in EP Application No. 17881918.1. |
European Office Action dated Oct. 2, 2020 in EP Application No. 13777540.9. |
European (Partial) Search Report dated Dec. 17, 2019 in EP Application No. 17807428.2. |
European Search Report dated Aug. 11, 2014 in European Application No. 12757877.1. |
European Search Report dated Jul. 23, 2014 in European Application No. 12756917.6. |
European Search Report dated Jul. 29, 2014 in European Application No. 12758250.0. |
European Search Report dated Mar. 5, 2015 in European Application No. 12841714.4. |
European Search Report dated May 11, 2016 in EP Application No. 13777540.9. |
Ex Parte Quayle Action, dated Feb. 2, 2021, in U.S. Appl. No. 16/335,222. |
Extended European Search Report dated Apr. 3, 2018 in EP Application No. 15789108.6. |
Extended European Search Report dated Oct. 13, 2021, for EP Application No. EP21163294.8. |
Extended European Search Report dated Oct. 15, 2020 in EP Application No. 20182982.7. |
Final Office Action dated Jun. 5, 2015 in U.S. Appl. No. 13/968,258. |
Graham, Steve, “Clouds & Radiation,” Mar. 1, 1999. [http://earthobservatory.nasa.gov/Features/Clouds/]. |
Haby, Jeff, “Cloud Detection (IR v. VIS),” (known as of Sep. 3, 2014) [http://theweatherprediction.com/habyhints2/512/]. |
Halio Automation Brochure, halioglass.com, dated Aug. 2019, 13 pages. |
“Halio Rooftop Sensor Kit (Model SR500),” Product Data Sheet, Kinestral Technologies, 2020, 4 pp. |
“Halio Smart-Tinting Glass System,” Product Data Sheet, Kinestral Technologies, www.kinestral.com, copyright 2017, 4 pp. |
Hoosier Energy, “How do they do that? Measuring Real-Time Cloud Activity” Hoosier Energy Current Connections, (known as of Sep. 3, 2014). (http://members.questline.com/Article.aspx?articleID=18550&accountID=196000&nl=11774). |
“How Cleantech wants to make a 2012 comeback” http://mountainview.patch.com/articles/how-cleantech-wants-to-make-a-2012-comeback, Jan. 23, 2012. |
Idso, Sherwood B., “Humidity measurement by infrared thermometry,” Remote Sensing of Environment, vol. 12, 1982, pp. 87-91. |
IN First Examination Report dated Jul. 7, 2021 in Indian Patent Application No. 201917013204. |
IN Office Action dated Dec. 17, 2021, in Application No. IN201917027304. |
Indian Office Action dated Dec. 18, 2019 in IN Application No. 2371/KOLNP/2014. |
Indian Office Action dated Jul. 9, 2020 in IN Application No. 201637038970. |
Indian Office Action dated Jun. 10, 2021, in IN Application No. 202038025893. |
Indian Office Action dated Sep. 3, 2019 in IN Application No. 3074/KOLNP/2015. |
International Preliminary Report on Patentability dated Apr. 12, 2018 in PCT/US16/55005. |
International Preliminary Report on Patentability dated Apr. 18, 2019 in PCT/US17/55631. |
International Preliminary Report on Patentability dated Dec. 13, 2018 in PCT/US17/35290. |
International Preliminary Report on Patentability dated Feb. 19, 2015 issued in PCT/US2013/053625. |
International Preliminary Report on Patentability dated Jan. 18, 2018 in PCT/US2016/041344. |
International Preliminary Report on Patentability dated Jun. 17, 2019 in PCT/US2017/066198. |
International Preliminary Report on Patentability dated May 1, 2014 in PCT/US2012/061137. |
International Preliminary Report on Patentability dated Nov. 24, 2016 in PCT/US2015/029675. |
International Preliminary Report on Patentability dated Oct. 1, 2020 issued in PCT/US2019/023268. |
International Preliminary Report on Patentability dated Oct. 23, 2014 issued in PCT/US2013/036456. |
International Preliminary Report on Patentability dated Oct. 30, 2014 issued in PCT/US2013/034998. |
International Preliminary Report on Patentability dated Oct. 30, 2014 issued in PCT/US2013/036235. |
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027742. |
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027828. |
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027909. |
International Preliminary Report on Patentability dated Sep. 3, 2015, issued in PCT/US2014/016974. |
International Preliminary Report on Patentability dated Sep. 6, 2019 issued in PCT/US2018/019737. |
International Search Report and Written Opinion dated Dec. 13, 2016 in PCT/US16/55005. |
International Search Report and Written Opinion dated Dec. 26, 2013, issued in PCT/US2013/053625. |
International Search Report and Written Opinion dated Jan. 25, 2018 in PCT/US17/55631. |
International Search Report and Written Opinion dated Jul. 11, 2013, issued in PCT/US2013/034998. |
International Search Report and Written Opinion dated Jul. 23, 2013, issued in PCT/US2013/036235. |
International Search Report and Written Opinion dated Jul. 23, 2015 in PCT/US2015/029675. |
International Search Report and Written Opinion dated Jul. 26, 2013, issued in PCT/US2013/036456. |
International Search Report and Written Opinion dated Mar. 28, 2013 in PCT/US2012/061137. |
International Search Report and Written Opinion dated Mar. 30, 2018 in PCT/US2017/066198. |
International Search Report and Written Opinion dated May 26, 2014 in PCT/US2014/016974. |
International Search Report and Written Opinion dated Nov. 25, 2020, in PCT Application No. PCT/US2020/047525. |
International Search Report and Written Opinion dated Oct. 13, 2016, issued in PCT/US2016/041344. |
International Search Report and Written Opinion dated Oct. 16, 2014, issued in PCT/US2014/043514. |
International Search Report and Written Opinion dated Sep. 24, 2012, issued in PCT/US2012/027742. |
International Search Report and Written Opinion dated Sep. 24, 2012, issued in PCT/US2012/027909. |
International Search Report and Written Opinion dated Sep. 26, 2012, issued in PCT/US2012/027828. |
International Search Report and Written Opinion dated Sep. 4, 2019, issued in PCT/US2019/023268. |
International Search Report and Written Opinion dated Sep. 8, 2017, issued in PCT/US17/35290. |
International Search Report and Written Opinion (ISA: KIPO) dated Jun. 11, 2018 issued in PCT/US2018/019737. |
Invitation to Pay Fees and Communication Relating to the Result of the Partial International Search, dated Jul. 12, 2019, issued in PCT/US2019/023268. |
Japanese Decision of Rejection dated Oct. 24, 2018 in JP Application No. JP 2015-558909. |
Japanese Office Action dated Apr. 2, 2019 in JP Application No. 2016-567021. |
Japanese Office Action dated Feb. 6, 2018 in JP Application No. 2015-558909. |
Japanese Office Action dated Jan. 27, 2021 in JP Application No. 2019-232669. |
Japanese Office Action dated Mar. 10, 2020 in JP Application No. 2019-031229. |
JP Decision to Grant a Patent dated Jul. 29, 2021, in JP Application No. 2019-232669. |
JP Office Action dated Jan. 4, 2022, in Application No. JP2020-215729. |
JP Office Action dated Oct. 12, 2021, in application No. JP2019531271 with Machine Translation. |
Kipp & Zonen, “Solar Radiation” (known as of Sep. 3, 2014) [http://www.kippzonen.com/Knowledge-Center/Theoretical-info/Solar-Radiation]. |
Kleissl, Jan et al., “Recent Advances in Solar Variability Modeling and Solar Forecasting at UC San Diego,” Proceedings, American Solar Energy Society, 2013 Solar Conference, Apr. 16-20, 2013, Baltimore, MD. |
Korean Notice of Decision to Grant dated Jun. 22, 2021 in KR Application No. KR10-2015-7026041, with English Translation. |
Korean Notice of First Refusal dated Feb. 18, 2021 in KR Application No. KR10-2015-7026041. |
Korean Notification of Provisional Rejection dated Jun. 22, 2021 in KR Application No. KR10-2016-7032512. |
Korean Office Action, dated Feb. 16, 2021, for Korean Patent Application No. 10-2020-7002032 with English Translation. |
Korean Office Action, dated Jun. 15, 2020, for Korean Patent Application No. 10-2020-7002032 with English Translation. |
Korean Office Action, dated Jun. 7, 2021, for Korean Patent Application No. 10-2020-7002032, with English Translation. |
Korean Office Action dated Mar. 30, 2020 in KR Application No. KR 10-2015-7026041, No translation. |
KR Office Action dated Dec. 23, 2021, in application No. 1020197011968 with English Translation. |
Letter dated Dec. 1, 2014 re Prior Art re U.S. Appl. No. 13/772,969 from Ryan D. Ricks representing MechoShade Systems, Inc. |
Lim, Sunnie H.N. et al., “Modeling of optical and energy performance of tungsten-oxide-based electrochromic windows including their intermediate states,” Solar Energy Materials & Solar Cells, vol. 108, Oct. 16, 2012, pp. 129-135. |
Maghrabi, A., et al., “Design and development of a simple infrared monitor for cloud detection,” Energy Conversion and Management, vol. 50, 2009, pp. 2732-2737. |
Maghrabi, A., et al., “Precipitable water vapour estimation on the basis of sky temperatures measured by a single-pixel IR detector and screen temperatures under clear skies,” Meteorological Applications, vol. 17, 2010, pp. 279-286. |
“Smart Glazing: Making smart-tinting glazing even smarter”, Daylighting: Design & Technology for Better Buildings, Issue 20 (Jan./Feb. 2020), 5 pages. |
Melexis “MLX90614 family Datasheet” (3901090614, Rev. 004), Jul. 30, 2008, 42 pp. |
Mims III, Forrest M., et al., “Measuring total column water vapor by pointing an infrared thermometer at the sky,” Bulletin of the American Meteorological Society, Oct. 2011, pp. 1311-1320. |
Morris, V.R. et al., “Deployment of an infrared thermometer network at the atmospheric radiation measurement program southern great plains climate research facility,” Sixteenth ARM Science Team Meeting Proceedings, Albuquerque, NM, Mar. 27-31, 2006, 11 pp. |
National Aeronautics & Space Administration, “Cloud Radar System (CRS),” (known as of Sep. 3, 2014), published date of Jun. 16, 2014, [http://har.gsfc.nasa.gov/index.php?section=12]. |
National Aeronautics & Space Administration, “Cloud Remote Sensing and Modeling,” (known as of Sep. 3, 2014), published date of Sep. 15, 2014, [http://atmospheres.gsfc.nasa.gov/climate/index.php?section=134]. |
“New from Pella: Windows with Smartphone-run blinds”, Pella Corp., http://www.desmoinesregister.com/article/20120114/BUSINESS/301140031/0/biggame/?odyssey=nav%7Chead, Jan. 13, 2012. |
Notice of Allowance dated Aug. 16, 2021 in U.S. Appl. No. 16/695,004. |
Notice of Allowance dated Aug. 24, 2021 in U.S. Appl. No. 16/695,004. |
Notice of Allowance, dated Jun. 18, 2020 in CN Application No. 201610645398.3, No Translation. |
Notice of Allowance dated Oct. 27, 2021 in U.S. Appl. No. 16/335,222. |
Office Action dated Oct. 6, 2014 in U.S. Appl. No. 13/968,258. |
Partial EP Supplemental Search Report dated Nov. 29, 2017 in EP Application No. 15789108.6. |
Partial European Search Report dated Jul. 6, 2021 for EP Application No. EP21163294.8. |
“SPN1 Sunshine Pyranometer,” Product Overview, Specification, Accessories and Product Resources, Delta-T Devices, May 5, 2016, 9 pp. https://www.delta-t.co.uk/product/spn1/ (downloaded Apr. 28, 2020). |
Preliminary Amendment dated Aug. 21, 2019 for U.S. Appl. No. 16/487,802. |
Preliminary Amendment dated Jul. 1, 2021, in U.S. Appl. No. 17/305,132. |
Preliminary Amendment dated Jul. 10, 2020 for U.S. Appl. No. 15/929,958. |
Preliminary Amendment dated Mar. 8, 2021, in U.S. Appl. No. 17/249,595. |
Preliminary Amendment dated Nov. 9, 2020 for U.S. Appl. No. 17/008,342. |
Preliminary Amendment No. 2, dated Dec. 9, 2020 for U.S. Appl. No. 16/695,004. |
“Remote Sensing: Clouds,” Department of Atmospheric and Ocean Science, University of Maryland, (known as of Sep. 3, 2014) [http://www.atmos.umd.edu/˜pinker/remote_sensing_clouds.htm]. |
Russian Decision to Grant with Search Report dated Feb. 28, 2018 in RU Application No. 2015139884. |
Russian Office Action dated Apr. 13, 2017 in RU Application No. 2014144632. |
Russian Office Action dated Dec. 7, 2018 in RU Application No. 2016148196. |
Russian Office Action dated Nov. 22, 2017 in RU Application No. 2014144632. |
Science and Technology Facilities Council. “Cloud Radar: Predicting the Weather More Accurately.” ScienceDaily, Oct. 1, 2008. [www.sciencedaily.com/releases/2008/09/080924085200.htm]. |
Selkowitz, S. et al., “Dynamic, Integrated Facade Systems for Energy Efficiency and Comfort,” Journal of Building Enclosure Design, Summer 2006, pp. 11-17. |
Singapore Search Report dated May 29, 2020 in SG Application No. 10201608572S. |
Singapore Supplementary Examination Report dated Dec. 7, 2016 in SG Application No. 11201406676Q. |
Smith, et al. “Measuring Cloud Cover and Brightness Temperature with a Ground Based Thermal Infrared Camera”, (Feb. 2008), American Meteorological Society, vol. 47, pp. 683-693. |
Taiwan Office Action dated May 13, 2021 in Taiwan Patent Application No. 106134521 with English Translation. |
Taiwan Office Action dated Jul. 30, 2020 in ROC (Taiwan) Pat. Appln. No. 105121480, with English Translation. |
Taiwanese Office Action dated Apr. 17, 2020 in TW Application No. TW 107102210, No translation. |
Taiwanese Office Action dated Aug. 22, 2017 in TW Application No. 103105957. |
Taiwanese Office Action dated Jan. 11, 2016 in TW Application No. 101108958. |
Taiwanese Office Action dated Jan. 30, 2019 in TW Application No. 104114812. |
Taiwanese Office Action dated Jun. 21, 2021 in TW Application No. TW 107106439, No translation. |
Taiwanese Office Action dated May 8, 2019 in TW Application No. 107122055. |
Taiwanese Office Action dated Nov. 23, 2016 in TW Application No. 105129854. |
Taiwanese Office Action dated Oct. 17, 2017 in TW Application No. 106115702. |
Taiwanese Office Action dated Sep. 11, 2020 in TW Application No. 109103256, with English Translation. |
Taiwanese Office Action dated Sep. 16, 2020 in TW Application No. 108143706, with English Translation. |
Third-Party Submission dated Feb. 2, 2015 and Feb. 18, 2015 PTO Notice re Third-Party Submission for U.S. Appl. No. 13/772,969. |
Thompson, Marcus, “Boltwood cloud sensor,” Cloudynights.com, Nov. 25, 2005, 6 pp. [online], [retrieved Dec. 15, 2016]. Retrieved from the internet URL http://www.cloudynights.com/page/articles/cat/user-reviews/photography/photography-accessories/boltwood-cloud-sensor-r1222. |
TW Notice of Allowance dated Sep. 9, 2021, in application No. TW110106134. |
TW Office Action dated Dec. 29, 2021, in application No. 110124070. |
TW Office Action dated Oct. 26, 2021 in TW Application No. TW20170143996 with English translation. |
TW Reissued Office Action dated Jul. 8, 2021, in Taiwanese Application No. 107106439. |
U.S. Corrected Notice of Allowability dated Jan. 12, 2022, in U.S. Appl. No. 16/335,222. |
U.S. Corrected Notice of Allowability dated Nov. 24, 2021, in U.S. Appl. No. 16/335,222. |
U.S. Corrected Notice of Allowance dated Feb. 22, 2022 in U.S. Appl. No. 15/762,077. |
U.S. Corrected Notice of Allowance dated Jan. 21, 2022 in U.S. Appl. No. 15/742,015. |
U.S. Final Office Action dated Apr. 30, 2012 in U.S. Appl. No. 13/049,750. |
U.S. Final Office Action dated Apr. 30, 2020 in U.S. Appl. No. 15/891,866. |
U.S. Final Office Action dated Aug. 19, 2013 in U.S. Appl. No. 13/049,756. |
U.S. Final Office Action dated Feb. 26, 2015 in U.S. Appl. No. 13/479,137. |
U.S. Final Office Action dated Jan. 11, 2019 in U.S. Appl. No. 15/891,866. |
U.S. Final Office Action dated Jan. 27, 2014 in U.S. Appl. No. 13/479,137. |
U.S. Final Office Action dated Jul. 2, 2015 in U.S. Appl. No. 13/049,756. |
U.S. Final Office Action dated Jul. 29, 2016 in U.S. Appl. No. 13/772,969. |
U.S. Final Office Action dated May 15, 2014 in U.S. Appl. No. 13/449,251. |
U.S. Final Office Action dated May 16, 2014 in U.S. Appl. No. 13/449,248. |
U.S. Non Final Office Action dated Feb. 16, 2022 in U.S. Appl. No. 15/929,958. |
U.S. Non Final Office Action dated Jan. 21, 2022, in U.S. Appl. No. 16/303,384. |
U.S. Notice of Allowability (corrected) dated Jul. 28, 2016 in U.S. Appl. No. 14/163,026. |
U.S. Notice of Allowance (corrected) dated Jun. 9, 2020 in U.S. Appl. No. 15/442,509. |
U.S. Notice of Allowance dated Apr. 13, 2015 in U.S. Appl. No. 14/657,380. |
U.S. Notice of Allowance dated Apr. 14, 2021 in U.S. Appl. No. 16/335,222. |
U.S. Notice of Allowance dated Apr. 22, 2021 in U.S. Appl. No. 15/742,015. |
U.S. Notice of Allowance dated Apr. 4, 2016 in U.S. Appl. No. 14/535,080. |
U.S. Notice of Allowance dated Aug. 3, 2021 in U.S. Appl. No. 16/335,222. |
U.S. Notice of Allowance dated Aug. 12, 2016 in U.S. Appl. No. 14/352,973. |
U.S. Notice of Allowance dated Aug. 12, 2020 in U.S. Appl. No. 16/013,770. |
U.S. Notice of Allowance dated Aug. 12, 2021 in U.S. Appl. No. 16/335,222. |
U.S. Notice of Allowance dated Aug. 16, 2018 in U.S. Appl. No. 15/349,860. |
U.S. Notice of Allowance dated Aug. 19, 2021 in U.S. Appl. No. 15/742,015. |
U.S. Notice of Allowance dated Aug. 7, 2020 in U.S. Appl. No. 15/891,866. |
U.S. Notice of Allowance dated Dec. 20, 2021, in U.S. Appl. No. 15/742,015. |
U.S. Notice of Allowance dated Dec. 22, 2016 in U.S. Appl. No. 13/772,969. |
U.S. Notice of Allowance dated Dec. 30, 2021, in U.S. Appl. No. 15/742,015. |
U.S. Notice of Allowance dated Feb. 8, 2022 in U.S. Appl. No. 15/762,077. |
U.S. Notice of Allowance dated Jan. 10, 2014 in U.S. Appl. No. 13/449,235. |
U.S. Notice of Allowance dated Jan. 12, 2018 in U.S. Appl. No. 14/932,474. |
U.S. Notice of Allowance dated Jan. 22, 2015 in U.S. Appl. No. 13/682,618. |
U.S. Notice of Allowance dated Jan. 22, 2018 in U.S. Appl. No. 15/464,837. |
U.S. Notice of Allowance dated Jan. 27, 2017 in U.S. Appl. No. 14/931,390. |
U.S. Notice of Allowance dated Jan. 8, 2016 in U.S. Appl. No. 13/049,756. |
U.S. Notice of Allowance dated Jul. 20, 2012 in U.S. Appl. No. 13/049,623. |
U.S. Notice of Allowance dated Jul. 23, 2020 in U.S. Appl. No. 16/013,770. |
U.S. Notice of Allowance dated Jun. 17, 2014 in U.S. Appl. No. 13/309,990. |
U.S. Notice of Allowance dated Jun. 22, 2016 in U.S. Appl. No. 13/049,756. |
U.S. Notice of Allowance dated Jun. 8, 2016 in U.S. Appl. No. 14/163,026. |
U.S. Notice of Allowance dated Mar. 20, 2018 in U.S. Appl. No. 15/347,677. |
U.S. Notice of Allowance dated Mar. 31, 2021 in U.S. Appl. No. 15/742,015. |
U.S. Notice of Allowance dated May 12, 2021 for U.S. Appl. No. 15/762,077. |
U.S. Notice of Allowance dated May 13, 2021 in U.S. Appl. No. 16/695,004. |
U.S. Notice of Allowance dated May 14, 2015 in U.S. Appl. No. 13/479,137. |
U.S. Notice of Allowance dated May 27, 2020 in U.S. Appl. No. 15/442,509. |
U.S. Notice of Allowance dated May 3, 2018 in U.S. Appl. No. 14/993,822. |
U.S. Notice of Allowance dated May 8, 2012 in U.S. Appl. No. 13/049,750. |
U.S. Notice of Allowance dated Nov. 16, 2018 in U.S. Appl. No. 15/349,860. |
U.S. Notice of Allowance dated Oct. 2, 2019 in U.S. Appl. No. 15/464,837. |
U.S. Notice of Allowance dated Sep. 25, 2018 in U.S. Appl. No. 15/442,509. |
U.S. Notice of Allowance dated Sep. 3, 2019 in U.S. Appl. No. 15/442,509. |
U.S. Notice of Allowance dated Sep. 6, 2019 in U.S. Appl. No. 14/993,822. |
U.S. Notice of Allowance (supplemental) dated Jun. 12, 2015 in U.S. Appl. No. 13/479,137. |
U.S. Office Action dated Apr. 14, 2016 in U.S. Appl. No. 14/163,026. |
U.S. Office Action dated Aug. 28, 2017 in U.S. Appl. No. 14/932,474. |
U.S. Office Action dated Dec. 10, 2020 in U.S. Appl. No. 16/695,004. |
U.S. Office Action dated Dec. 24, 2013 in U.S. Appl. No. 13/309,990. |
U.S. Office Action dated Feb. 22, 2016 in U.S. Appl. No. 14/535,080. |
U.S. Office Action dated Feb. 24, 2015 in U.S. Appl. No. 14/163,026. |
U.S. Office Action dated Feb. 3, 2012 in U.S. Appl. No. 13/049,750. |
U.S. Office Action dated Jan. 16, 2015 in U.S. Appl. No. 14/468,778. |
U.S. Office Action dated Jan. 18, 2013 in U.S. Appl. No. 13/049,756. |
U.S. Office Action dated Jan. 2, 2020 in U.S. Appl. No. 15/442,509. |
U.S. Office Action dated Jan. 23, 2020 in U.S. Appl. No. 15/762,077. |
U.S. Office Action dated Jan. 5, 2016 in U.S. Appl. No. 13/772,969. |
U.S. Office Action dated Jan. 5, 2018 in U.S. Appl. No. 15/442,509. |
U.S. Office Action dated Jul. 3, 2014 in U.S. Appl. No. 13/479,137. |
U.S. Office Action dated Jun. 11, 2020 in U.S. Appl. No. 16/303,384. |
U.S. Office Action dated Jun. 23, 2020 in U.S. Appl. No. 15/742,015. |
U.S. Office Action dated Jun. 3, 2015 in U.S. Appl. No. 13/449,251. |
U.S. Office Action dated Jun. 6, 2017 in U.S. Appl. No. 15/442,509. |
U.S. Office Action dated Mar. 12, 2019 in U.S. Appl. No. 15/464,837. |
U.S. Office Action dated Mar. 18, 2020 in U.S. Appl. No. 16/013,770. |
U.S. Office Action dated Mar. 27, 2012 in U.S. Appl. No. 13/049,623. |
U.S. Office Action dated Mar. 27, 2019 in U.S. Appl. No. 14/993,822. |
U.S. Office Action dated Mar. 3, 2021 in U.S. Appl. No. 16/303,384. |
U.S. Office Action dated Mar. 5, 2019 in U.S. Appl. No. 15/442,509. |
U.S. Office Action dated Mar. 8, 2017 in U.S. Appl. No. 14/993,822. |
U.S. Office Action dated May 18, 2018 in U.S. Appl. No. 15/891,866. |
U.S. Office Action dated May 30, 2018 in U.S. Appl. No. 15/464,837. |
U.S. Office Action dated Nov. 19, 2015 in U.S. Appl. No. 14/535,080. |
U.S. Office Action dated Nov. 2, 2017 in U.S. Appl. No. 15/349,860. |
U.S. Office Action dated Nov. 27, 2015 in U.S. Appl. No. 14/352,973. |
U.S. Office Action dated Nov. 27, 2015 in U.S. Appl. No. 14/163,026. |
U.S. Office Action dated Nov. 29, 2013 in U.S. Appl. No. 13/449,248. |
U.S. Office Action dated Nov. 29, 2013 in U.S. Appl. No. 13/449,251. |
U.S. Office Action dated Oct. 11, 2013 in U.S. Appl. No. 13/449,235. |
U.S. Office Action dated Oct. 11, 2017 in U.S. Appl. No. 14/993,822. |
U.S. Office Action dated Oct. 21, 2019 in U.S. Appl. No. 15/742,015. |
U.S. Office Action dated Oct. 27, 2020 in U.S. Appl. No. 15/762,077. |
U.S. Office Action dated Oct. 28, 2014 in U.S. Appl. No. 13/449,251. |
U.S. Office Action dated Oct. 6, 2014 in U.S. Appl. No. 13/049,756. |
U.S. Office Action dated Sep. 14, 2018 in U.S. Appl. No. 14/993,822. |
U.S. Office Action dated Sep. 15, 2014 in U.S. Appl. No. 13/682,618. |
U.S. Office Action dated Sep. 16, 2021, in U.S. Appl. No. 16/469,851. |
U.S. Office Action dated Sep. 19, 2019 in U.S. Appl. No. 15/891,866. |
U.S. Office Action dated Sep. 21, 2021, in U.S. Appl. No. 16/487,802. |
U.S. Office Action dated Sep. 23, 2013 in U.S. Appl. No. 13/479,137. |
U.S. Office Action dated Sep. 29, 2014 in U.S. Appl. No. 13/449,248. |
U.S. Appl. No. 17/249,595, inventors Frank et al., filed Mar. 5, 2021. |
U.S. Appl. No. 17/305,132, inventors Brown et al., filed Jun. 30, 2021. |
U.S. Preliminary Amendment dated Oct. 30, 2019 for U.S. Appl. No. 16/013,770. |
U.S. Appl. No. 63/080,899, inventor Makker et al., filed Sep. 21, 2020. |
Werner, Christian, “Automatic cloud cover indicator system,” Journal of Applied Meteorology, vol. 12, Dec. 1973, pp. 1394-1400. |
AU Office Action dated Feb. 15, 2023, in Application No. AU2021200070. |
AU Office Action dated Feb. 21, 2023, in Application No. AU2021200070. |
AU Office Action dated Feb. 22, 2022, in Application No. AU2021200070. |
AU Office Action dated Jul. 3, 2023, in application No. AU20220200523. |
AU Office Action dated Nov. 28, 2022, in AU Application No. AU2021200070. |
Australian Examination Report dated Feb. 20, 2023, in Application No. AU2017376447. |
Australian Examination Report dated Feb. 28, 2022, in Application No. 2017376447. |
CA Office Action dated Feb. 11, 2022, in Application No. CA2902106. |
CA Office Action dated Apr. 11, 2023, in Application No. CA2991419. |
CA Office Action dated Jun. 23, 2023, in Application No. CA3025827. |
CA Office Action dated Nov. 23, 2022, in Application No. CA2902106. |
CA Office Action dated Oct. 4, 2022, in Application No. CA2991419. |
CN Office Action dated Apr. 25, 2022, in Application No. CN201980027469.5 with English translation. |
CN Office Action dated Apr. 21, 2022 in Application No. CN201811232377.4 with English translation. |
CN Office Action dated Apr. 27, 2022, in Application No. CN201780039437.8 with English translation. |
CN Office Action dated Aug. 10, 2022, in Application No. CN201911184096.0 with English translation. |
CN Office Action dated Feb. 14, 2023 in Application No. CN201780084583.2 with English translation. |
CN Office Action dated Jan. 5, 2023, in Application No. CN201980027469.5 with English translation. |
CN Office Action dated Jul. 7, 2022 in Application No. CN201780084583.2 with English translation. |
CN Office Action dated Mar. 9, 2022, in Application No. CN201911184096.0 with English Translation. |
CN Office Action dated Mar. 8, 2022, in Application No. CN201680063892.7 with English translation. |
CN Office Action dated May 31, 2022, in Application No. CN201880022572.6 with English Translation. |
CN Office Action dated Oct. 14, 2022, in Application No. CN201880022572.6 with English Translation. |
CN Office Action dated Sep. 5, 2022, in Application No. CN201780039437.8 with English translation. |
EP Extended European Search report dated Jun. 1, 2023, in Application No. EP23151011.6. |
EP Extended European Search report dated May 15, 2023, in Application No. EP22201987.9. |
EP Office Action dated Mar. 4, 2022, in Application No. EP17859286.1. |
EP Office Action dated Apr. 14, 2023, in Application No. EP17859286.1. |
EP Office Action dated Jun. 30, 2023, in Application No. EP13777540.9. |
EP Office Action dated Jun. 30, 2023, in Application No. EP20768741.9. |
European Office Action dated Apr. 5, 2023 in Application No. EP21163294.8. |
European Office Action dated Feb. 22, 2023 for EP Application No. EP22197030.4. |
European Office Action dated Feb. 23, 2023 in Application No. EP17807428.2. |
European Office Action dated Mar. 18, 2022, in Application No. 13777540.9. |
European Office Action dated Sep. 5, 2022 in Application No. EP18756696.3. |
Extended European search report dated Oct. 10, 2022, in Application No. EP22161794.7. |
Humann, C., “HDR sky imaging for real time control of facades,” presented Nov. 18, 2021 at Velux Build for Life conference. 21 pages of screenshots. Retrieved from internet: https://vimeo.com/647274396. |
IN Office Action dated Dec. 18, 2019 in Application No. IN202038052140. |
IN Office Action dated Feb. 25, 2022 in Application No. IN202138016166. |
IN Office Action dated Mar. 4, 2022, in Application No. 202038052140. |
JP Decision to Grant a Patent dated Jul. 29, 2021, in JP Application No. 2019-232669 with English translation. |
JP Office Action dated Aug. 16, 2022 in Application No. JP2021-142788 with English translation. |
JP Office Action dated Jan. 4, 2022, in Application No. JP2020-215729 with English Translation. |
JP Office Action dated Jan. 10, 2023 in Application No. JP2019-531271 with English translation. |
JP Office Action dated Jul. 12, 2022, in Application No. JP2019-531271 with English translation. |
JP Office Action dated Jun. 6, 2023 in Application No. JP2022-127648 with English translation. |
JP Office Action dated Mar. 7, 2023 in Application No. JP2021-142788 with English translation. |
KR Office Action dated Apr. 27, 2022, in Application No. KR10-2016-7032512 with English Translation. |
KR Office Action dated Dec. 3, 2021, in Application No. KR1020217028534 with English translation. |
KR Office Action dated Dec. 27, 2021, in Application No. KR1020217030376 with English translation. |
KR Office Action dated Mar. 6, 2023 in Application No. KR10-2022-7028868 with English translation. |
KR Office Action dated Oct. 17, 2022 in Application No. KR10-2021-7030376 with English translation. |
PCT International Preliminary Report on Patentability and Written Opinion dated Mar. 3, 2022, issued in PCT/US2020/047525. |
Reno. M, et al., “Global Horizontal Irradiance Clear Sky Models: Implementation and Analysis”, Sandia Report, SAND2012-2389, 2012, pp. 1-67. |
Subramaniam, S., “Daylighting Simulations with Radiance using Matrix-based Methods”, Lawrence Berkeley National Laboratory, Oct. 3, 2017, 145 pages. |
Taiwanese Office Action dated Jun. 21, 2021 in TW Application No. TW107106439 with English translation. |
TW Office Action dated Sep. 15, 2022 in Application No. TW110140343 with English translation. |
TW Notice of Allowance and Search Report dated Sep. 9, 2021, in application No. TW110106134 with English Translation. |
TW Office Action dated Apr. 11, 2022, in Application No. TW106134521 with English Translation. |
TW Office Action dated Dec. 19, 2022, in Application No. TW111117328 with English translation. |
TW Office Action dated Jun. 29, 2022 in Application No. TW108109631 with English translation. |
TW Office Action dated Mar. 16, 2022, in Application No. TW106143996 with English translation. |
TW Office Action dated Mar. 23, 2023 in Application No. TW20210146592 with English translation. |
TW Office Action dated Oct. 17, 2022, in Application No. TW111114527 with English Translation. |
TW Office Action dated Sep. 13, 2022, in Application No. TW106134521 with English Translation. |
U.S. Non-Final Office Action dated Aug. 31, 2022 in U.S. Appl. No. 16/469,851. |
U.S. Non-Final Office Action dated Jul. 14, 2022 in U.S. Appl. No. 16/487,802. |
U.S. Advisory Action dated Jun. 7, 2022 in U.S. Appl. No. 16/469,851. |
U.S. Corrected Notice of Allowance dated Aug. 12, 2022 in U.S. Appl. No. 15/929,958. |
U.S. Corrected Notice of Allowance dated Nov. 3, 2022 in U.S. Appl. No. 15/929,958. |
U.S. Corrected Notice of Allowance dated Feb. 16, 2023 in U.S. Appl. No. 17/304,832. |
U.S. Corrected Notice of Allowance dated Jun. 23, 2023, in U.S. Appl. No. 16/469,851. |
U.S. Corrected Notice of Allowance dated May 18, 2022, in U.S. Appl. No. 16/303,384. |
U.S. Final Office Action dated Jan. 3, 2023 in U.S. Appl. No. 16/487,802. |
U.S. Final Office Action dated Jan. 4, 2023 in U.S. Appl. No. 17/249,595. |
U.S. Final Office Action dated Jul. 27, 2023 in U.S. Appl. No. 16/303,384. |
U.S. Final Office Action dated Mar. 10, 2022, in U.S. Appl. No. 16/487,802. |
U.S. Non-Final Office Action dated Jul. 21, 2022 in U.S. Appl. No. 17/249,595. |
U.S. Non-Final Office Action dated Dec. 6, 2022 in U.S. Appl. No. 16/303,384. |
U.S. Non-Final Office Action dated Dec. 22, 2022 in U.S. Appl. No. 17/305,132. |
U.S. Non-Final Office Action dated Jul. 18, 2023, in U.S. Appl. No. 17/931,014. |
U.S. Non-Final Office Action dated Jul. 20, 2023, in U.S. Appl. No. 17/305,132. |
U.S. Non-Final Office Action dated Mar. 14, 2023 in U.S. Appl. No. 17/008,342. |
U.S. Non-Final Office Action dated May 11, 2023, in U.S. Appl. No. 16/982,535. |
U.S. Non-Final Office Action dated May 15, 2023 in U.S. Appl. No. 16/487,802. |
U.S. Non-Final Office Action dated Oct. 13, 2022, in U.S. Appl. No. 17/304,832. |
U.S. Notice of Allowance dated Aug. 24, 2022 in U.S. Appl. No. 16/303,384. |
U.S. Notice of Allowance dated Apr. 3, 2023 in U.S. Appl. No. 16/469,851. |
U.S. Notice of Allowance dated Feb. 1, 2023 in U.S. Appl. No. 17/304,832. |
U.S. Notice of Allowance dated Jul. 26, 2022, in U.S. Appl. No. 15/929,958. |
U.S. Notice of Allowance dated Jun. 22, 2023, in U.S. Appl. No. 17/008,342. |
U.S. Notice of Allowance dated Mar. 1, 2023 in U.S. Appl. No. 15/762,077. |
U.S. Notice of Allowance dated Mar. 13, 2023 in U.S. Appl. No. 16/469,851. |
U.S. Notice of Allowance dated Mar. 16, 2023 in U.S. Appl. No. 17/305,132. |
U.S. Notice of Allowance dated Mar. 24, 2023 in U.S. Appl. No. 16/469,851. |
U.S. Notice of Allowance dated May 4, 2022 in U.S. Appl. No. 16/303,384. |
U.S. Notice of Allowance dated May 4, 2023 in U.S. Appl. No. 17/304,832. |
U.S. Notice of Allowance dated May 18, 2022, in U.S. Appl. No. 15/762,077. |
U.S. Notice of Allowance dated Sep. 5, 2023, in U.S. Appl. No. 17/249,595. |
U.S. Notice of Allowance dated Sep. 22, 2022 in U.S. Appl. No. 15/762,077. |
U.S. Office Action dated Apr. 1, 2022, in U.S. Appl. No. 16/469,851. |
U.S. Appl. No. 18/150,146, inventors Klawuhn et al., filed Jan. 4, 2023. |
U.S. Appl. No. 18/308,658, inventor Zedlitz, Jason David, filed Apr. 27, 2023. |
U.S. Restriction Requirement dated Oct. 14, 2022, in U.S. Appl. No. 17/008,342. |
Zhu, H. et al., Understanding Radiance (Brightness), Irradiance and Radiant Flux, Energetiq Technology, Inc., Technical Note #004-3-25-2011, 2018, 4 pages. |
CA Office Action dated Nov. 2, 2023 in CA Application No. CA3047093. |
CN Office Action dated Sep. 1, 2023, in application No. CN20208072995.6 with English Translation. |
CN Office Action dated Sep. 26, 2023, in Application No. CN202210751723.X with English Translation. |
EP Office Action dated Nov. 9, 2023 in EP Application No. 21171305.2. |
International Search Report and Written Opinion dated Sep. 8, 2023, in Application No. PCT/US2023/022140. |
JP Office Action dated Sep. 12, 2023, in application No. JP2022-180244 with English Translation. |
KR Office Action dated Jul. 31, 2023, in Application No. KR10-2022-7039319 with English translation. |
TW Office Action dated Oct. 24, 2023 in TW Application No. 111136120, with English Translation. |
U.S. Corrected Notice of Allowance dated Nov. 1, 2023, in U.S. Appl. No. 16/487,802. |
U.S. Final Office Action dated Dec. 18, 2023 in U.S. Appl. No. 17/931,014. |
U.S. Notice of Allowance dated Dec. 5, 2023 in U.S. Appl. No. 16/982,535. |
U.S. Notice of Allowance dated Dec. 15, 2023 in U.S. Appl. No. 16/982,535. |
U.S. Notice of Allowance dated Oct. 18, 2023 in U.S. Appl. No. 16/487,802. |
U.S. Notice of Allowance dated Sep. 29, 2023 in U.S. Appl. No. 17/008,342. |
U.S. Appl. No. 18/338,296, inventors Zedlitz, et al., filed Jun. 20, 2023. |
U.S. Appl. No. 18/486,197, inventors Brown et al., filed Oct. 13, 2023. |
| Number | Date | Country |
|---|---|---|
| 20220326584 A1 | Oct 2022 | US |
| Number | Date | Country |
|---|---|---|
| 62891102 | Aug 2019 | US |
| 62666572 | May 2018 | US |
| 62646260 | Mar 2018 | US |
| 61991375 | May 2014 | US |
|  | Number | Date | Country |
|---|---|---|---|
| Parent | 15347677 | Nov 2016 | US |
| Child | 16013770 |  | US |
|  | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/US2019/023268 | Mar 2019 | US |
| Child | 17753098 |  | US |
| Parent | 16013770 | Jun 2018 | US |
| Child | PCT/US2019/023268 |  | US |
| Parent | PCT/US2015/029675 | May 2015 | US |
| Child | 15347677 |  | US |
| Parent | 13772969 | Feb 2013 | US |
| Child | PCT/US2015/029675 |  | US |