Infrared cloud detector systems and methods

Information

  • Patent Grant
  • Patent Number
    11,674,843
  • Date Filed
    Friday, June 25, 2021
  • Date Issued
    Tuesday, June 13, 2023
Abstract
Infrared cloud detector systems and methods for detecting cloud cover conditions.
Description
FIELD

The present disclosure generally relates to arrangements of sensing elements for detecting cloud cover conditions, and in particular to infrared cloud detector systems and methods of detecting cloud cover conditions.


BACKGROUND

Detecting cloud cover can be an important part of making decisions about placing equipment into operation at, for example, a robotic observatory, since astronomers may want to detect clouds that could interfere with their observations. Conventional methods of mapping the sky to detect cloud cover depend on expensive imaging devices that typically rely on visible-light measurements.


SUMMARY

Certain aspects pertain to infrared cloud detector systems and methods of detecting cloud cover conditions thereof.


Certain aspects pertain to infrared cloud detector systems. In some aspects, an infrared cloud detector system comprises an infrared sensor configured to measure sky temperature based on infrared radiation received within its field-of-view, an ambient temperature sensor configured to measure an ambient temperature, and logic configured to determine a cloud condition based on a difference between the measured sky temperature and the measured ambient temperature.


In some aspects, an infrared cloud detector system comprises an infrared sensor configured to measure sky temperature based on infrared radiation received within its field-of-view, an ambient temperature sensor configured to measure an ambient temperature, a photosensor configured to measure intensity of visible light, and logic configured to determine a cloud condition. If a time of day is between a first time before sunrise and a second time after sunrise or between a third time before sunset and sunset, the logic is configured to determine the cloud condition based on a difference between the measured sky temperature and the measured ambient temperature. If the time of day is between the second time after sunrise and before the third time before sunset, the logic is configured to determine the cloud condition based on the measured intensity of visible light from the photosensor.


Certain aspects pertain to infrared cloud detector methods. In some aspects, an infrared cloud detector method comprises receiving a sky temperature reading from an infrared sensor and an ambient temperature reading from an ambient temperature sensor, calculating a difference between the sky temperature reading and the ambient temperature reading, and determining a cloud condition based on the calculated difference between the sky temperature reading and the ambient temperature reading.


In some aspects, an infrared cloud detector method comprises receiving a sky temperature reading from an infrared sensor, an ambient temperature reading from an ambient temperature sensor, and an intensity reading from a photosensor and determining whether a time of day is: (i) between a first time before sunrise and a second time after sunrise or between a third time before sunset and sunset; (ii) between the second time after sunrise and the third time before sunset; or (iii) after sunset and before the first time before sunrise. If the time of day is (i) or (iii), the cloud condition is determined based on a difference between the measured sky temperature and the measured ambient temperature. If the time of day is (ii), the cloud condition is determined based on the intensity reading received from the photosensor.
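The time-of-day branching in these aspects can be sketched in Python. The 45-minute windows, function names, and return labels below are illustrative assumptions, not values from the disclosure:

```python
from datetime import time

def _minutes(t: time) -> int:
    """Minutes past midnight."""
    return t.hour * 60 + t.minute

def select_cloud_logic(now: time, sunrise: time, sunset: time,
                       dawn_window: int = 45, dusk_window: int = 45) -> str:
    """Choose which reading drives the cloud-condition determination.

    Case (i): around sunrise, or shortly before sunset -> IR/ambient delta.
    Case (ii): full daylight -> visible-light photosensor.
    Case (iii): nighttime -> IR/ambient delta.
    The 45-minute windows are illustrative, not from the disclosure.
    """
    n, sr, ss = _minutes(now), _minutes(sunrise), _minutes(sunset)
    case_i = (sr - dawn_window <= n <= sr + dawn_window) or \
             (ss - dusk_window <= n <= ss)
    case_ii = sr + dawn_window < n < ss - dusk_window
    if case_i:
        return "ir_delta"       # (i) low-light transition periods
    if case_ii:
        return "photosensor"    # (ii) direct sunlight available
    return "ir_delta"           # (iii) nighttime
```

The same dispatch could instead be driven by a solar-position library; the point is only that intensity readings are trusted solely in case (ii).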


These and other features and embodiments will be described in more detail below with reference to the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a schematic representation of a side view of an infrared cloud detector system, according to some implementations.



FIG. 2A shows a graph with two plots of temperature readings taken over time by an infrared sensor of the infrared cloud detector, according to an implementation.



FIG. 2B shows a graph having two plots of ambient temperature readings taken over time by the ambient temperature sensor of the infrared cloud detector discussed with respect to FIG. 2A.



FIG. 2C shows a graph having two plots of the calculated delta between the temperature readings taken by the infrared sensor and the ambient temperature readings taken by the ambient temperature sensor of the infrared cloud detector discussed with respect to FIGS. 2A and 2B.



FIG. 3 depicts a schematic (side view) diagram of an infrared cloud detector system comprising an infrared cloud detector and a photosensor, according to an implementation.



FIG. 4A shows a perspective view of a diagrammatic representation of an infrared cloud detector system comprising an infrared cloud detector in the form of a multi-sensor, according to an implementation.



FIG. 4B shows another perspective view of the infrared cloud detector system comprising the infrared cloud detector in the form of the multi-sensor shown in FIG. 4A.



FIG. 4C shows a perspective view of some of the inner components of the multi-sensor device of the infrared cloud detector system shown in FIGS. 4A and 4B.



FIG. 5A is a graph with a plot of intensity readings taken by the visible light photosensor over time.



FIG. 5B is a graph with a plot of the difference between temperature readings taken by the infrared sensor and temperature readings taken by the ambient temperature sensor over time.



FIG. 6A is a graph with a plot of intensity readings taken by the visible light photosensor over time.



FIG. 6B is a graph with a plot of the difference between temperature readings taken by the infrared sensor over time and temperature readings taken by an ambient temperature sensor over time.



FIG. 7A is a graph with a plot of intensity readings taken by the visible light photosensor over time.



FIG. 7B is a graph with a plot of the difference between temperature readings taken by the infrared sensor and temperature readings taken by an ambient temperature sensor over time.



FIG. 8 shows a flowchart describing a method that uses temperature readings from an infrared sensor and an ambient temperature sensor to determine a cloud cover condition, according to implementations.



FIG. 9 shows a flowchart describing a method that determines a cloud cover condition using readings from an infrared sensor, an ambient temperature sensor, and a photosensor of an infrared cloud detector system, according to implementations.



FIG. 10A depicts a schematic cross-section of an electrochromic device.



FIG. 10B depicts a schematic cross-section of an electrochromic device in a bleached state (or transitioning to a bleached state).



FIG. 10C depicts a schematic cross-section of the electrochromic device shown in FIG. 10B, but in a colored state (or transitioning to a colored state).



FIG. 11A shows the penetration depth of direct sunlight into a room through an electrochromic window between the exterior and the interior of a building, which includes the room, according to an implementation.



FIG. 11B shows direct sunlight and radiation under clear sky conditions entering the room through the electrochromic window, according to an implementation.



FIG. 11C shows radiant light from the sky as may be obstructed by or reflected from objects such as, for example, clouds and other buildings, according to an implementation.



FIG. 12 depicts a flowchart showing general control logic for a method of controlling one or more electrochromic windows in a building, according to embodiments.



FIG. 13 is a diagram showing a particular implementation of one of the blocks from FIG. 12, according to an implementation.



FIG. 14A is a flowchart depicting a particular implementation of the control logic of an operation shown in FIG. 13, according to an implementation.



FIG. 14B is a flowchart depicting a particular implementation of the control logic of an operation shown in FIG. 14A, according to an implementation.





DETAILED DESCRIPTION
I. Introduction

At certain times of the day, the intensity of visible light is at a low level such as in the early morning around sunrise and in the evening just before sunset. A photosensor calibrated to measure the intensity of visible light (referred to herein as a “visible light photosensor” or generally as a “photosensor”) does not detect direct sunlight and its intensity measurements at these times of day are not effective in determining when the sky is clear (a “clear” condition) and when the sky is cloudy (a “cloudy” condition). That is, a visible light photosensor directed toward the sky at these times would measure low intensity values both during a “clear” condition and a “cloudy” condition. Consequently, the intensity measurements taken by a visible light photosensor alone cannot be used to accurately distinguish between “cloudy” and “clear” conditions at these times. If intensity measurements from a visible light photosensor alone were used to determine a “cloudy” condition (e.g., when measured intensity levels drop below a particular minimal value) in the evening at dusk just before sunset, a false “cloudy” condition could be detected. Similarly, visible light photosensor measurements are not effective in distinguishing between “cloudy” and “clear” conditions just before sunrise when there is no direct sunlight. At any of these time periods, the photosensor measurements might be used to detect a false “cloudy” condition. A controller that relies on a false “cloudy” determination from such photosensor readings could consequently implement an inappropriate control decision based on this false “cloudy” determination. For example, if photosensor readings determine a false “cloudy” condition at a time just before sunrise, a window controller that controls tint levels in an optically switchable window (e.g., electrochromic window) facing East might inappropriately clear the window allowing direct glare from the rising sun to shine into the room.


Moreover, a controller that makes decisions based primarily on current readings from a visible light photosensor does not account for historical intensity levels in the geographic region that could bear on probable current/future cloud cover conditions, for example, to make control commands in anticipation of a condition that is likely to occur. For example, there may be a historically low light level in the morning when small clouds pass through the geographic region. In this circumstance, a small cloud temporarily blocking sunlight to the photosensor would result in the same determination of a "cloudy" condition as when a large storm were rolling into the region. In this case, the passing of a small cloud could cause the controller to transition a tintable window to an inappropriately low tint level and possibly lock the optically switchable window at that level until the window can transition back to a higher (darker) tint level.


II. Infrared (IR) Cloud Detectors

Both clouds and water vapor absorb and re-emit radiation in discrete bands across the infrared (IR) spectrum. Since clouds absorb and re-emit IR radiation and a clear sky transmits IR radiation, clouds are generally warmer (have higher temperature) than clear sky. In other words, the presence of clouds generally produces an enhanced IR signal (which corresponds to an approximate black body spectrum at about ground temperature) above a signal from the clear sky. There is also the lesser effect of atmospheric humidity, which can also produce an enhanced IR signal, particularly at low elevations. Based on these distinctions, devices that measure IR radiation can be used to detect a cloud and a “cloudy” condition.


Various implementations relate to infrared cloud detectors and methods thereof that detect cloud cover based on infrared readings. The infrared cloud detectors generally include at least one infrared (IR) sensor and an ambient temperature sensor used in conjunction to take temperature readings of the sky that can be used to detect cloud cover conditions. Generally speaking, the amount of infrared radiation emitted by a medium/object that is then measured by an IR sensor varies depending on the temperature of the medium/object, the surface and other physical characteristics of the medium/object, the field-of-view of the IR sensor, and the distance between the medium/object and the IR sensor. The IR sensor converts IR radiation received within its field-of-view to a voltage/current and the voltage/current to corresponding temperature readings (e.g., digital temperature readings) of the medium/object within its field-of-view. For example, an IR sensor directed (oriented) to face the sky outputs temperature readings of a region of the sky within its field-of-view. The IR sensor can be oriented in a particular direction (e.g., azimuthal angle and altitude angle) to preferentially capture IR radiation from the geographical region of the sky within its field-of-view centered about that direction. The ambient temperature sensor measures the temperature of the ambient air surrounding the sensor. Generally, the ambient temperature sensor is located to measure the temperature of the ambient air surrounding the infrared cloud detector. The infrared cloud detector further comprises a processor that determines the difference between the temperature readings taken by the IR sensor and the ambient temperature sensor and uses this difference to detect the amount of cloud cover in the region of the sky within the field-of-view of the IR sensor.


Generally, ambient temperature readings taken by an ambient temperature sensor tend to fluctuate with changing weather conditions to a lesser extent than sky temperature readings taken by an infrared radiation sensor. For example, sky temperature readings taken by an infrared radiation sensor tend to fluctuate with high frequency during an "intermittent cloudy" condition in a fast-moving weather pattern. Certain implementations of infrared cloud detectors have logic that determines the difference between infrared sensor temperature readings (TIR) and ambient temperature readings (TA), the delta (Δ), according to Eqn. 1 to help normalize any fluctuations in the infrared sensor temperature readings (TIR). In one example, logic determines a "cloudy" condition if the delta (Δ) is determined to be above an upper threshold value (e.g., about 0 degrees Celsius), a "clear" condition if the delta (Δ) is determined to be below a lower threshold value (e.g., about −5 degrees Celsius), and an "intermittent cloudy" condition if the delta (Δ) is determined to be between the upper and lower threshold values. In another example, the logic determines a "cloudy" condition if the delta (Δ) is above a single threshold value and a "clear" condition if the delta (Δ) is below the threshold value. In one aspect, the logic can apply one or more correction factors to the delta (Δ) before determining whether it is above or below the threshold value(s). Some examples of correction factors that may be used in implementations include humidity, sun angle/elevation, and site elevation. For example, a correction factor may be applied based on the altitude and density of the clouds being detected. Lower altitude and/or higher density clouds relate more closely to ambient temperature readings than to infrared sensor readings.
Higher altitude and/or less dense clouds relate more closely to infrared sensor readings than to ambient temperature readings. In this example, a correction factor can be applied that weights the ambient temperature readings higher for lower altitude and/or higher density clouds or weights the infrared sensor readings higher for higher altitude and/or less dense clouds. In another example, a correction factor may be applied based on humidity and/or sun position to more accurately describe cloud cover and/or remove any outliers. The technical advantages of using the delta (Δ) to determine a cloud condition are illustrated with reference to FIGS. 2A-2C below.


Since sky temperature readings are generally independent of whether direct sunlight is present, temperature readings can be used by the infrared cloud detector to detect a cloud cover condition more accurately, in certain instances, than a visible light photosensor could at times when the intensity of sunlight is low (e.g., just before sunrise and in the early morning just after sunrise, or in the early evening before sunset). At these times, a visible light photosensor could potentially detect a false "cloudy" condition. According to these implementations, infrared cloud detectors can detect cloud cover with an accuracy that does not depend on whether the sun is out or whether light intensity levels are otherwise low such as, for example, just before sunrise or sunset. In these implementations, a relatively low sky temperature reading generally indicates the likelihood of a "clear" condition and a relatively high sky temperature reading generally indicates the likelihood of a "cloudy" condition (i.e., cloud cover).


In various implementations, the IR sensor of the infrared cloud detector is calibrated to measure radiant flux of long-wavelength infrared radiation within a specific range. A processor of the IR sensor or a separate processor can be used to infer temperature readings from these measurements. In one aspect, the IR sensor is calibrated to detect infrared radiation in a wavelength range of between about 8 μm and about 14 μm. In another aspect, an IR sensor is calibrated to detect infrared radiation having wavelengths above about 5 μm. In another aspect, an IR sensor is calibrated to detect infrared radiation in a wavelength range of between about 9.5 μm and about 11.5 μm. In another aspect, an IR sensor is calibrated to detect infrared radiation in a wavelength range of between about 10.5 μm and about 12.5 μm. In another aspect, an IR sensor is calibrated to detect infrared radiation in a wavelength range of between about 6.6 μm and about 20 μm. Some examples of types of IR sensors that can be used include an infrared thermometer (e.g., a thermopile), infrared radiometer, infrared pyrgeometer, infrared pyrometer, and the like. A commercially-available example of an IR sensor is the Melexis MLX90614 made by Melexis of Detroit, Mich. Another commercially-available example of an IR sensor is the TS305-11C55 Temperature Sensor made by TE Connectivity Ltd. of Switzerland. Another commercially-available example of an IR sensor is the SI-111 infrared radiometer made by Apogee Instruments.
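As a rough illustration of how a processor might infer a sky temperature reading from a thermopile-type IR sensor, the sketch below inverts a Stefan-Boltzmann-style radiometric balance commonly used for such devices. The single lumped calibration constant is an assumption; real transfer functions are device-specific:

```python
def thermopile_sky_temp_k(v_out: float, t_detector_k: float,
                          k_cal: float) -> float:
    """Invert the radiometric balance V = k_cal * (T_sky^4 - T_det^4)
    that thermopile-based IR sensors approximately obey, returning the
    sky (target) temperature in kelvin. k_cal lumps the device's gain,
    emissivity, and geometry into one per-device calibration constant.
    """
    t4 = v_out / k_cal + t_detector_k ** 4
    return t4 ** 0.25

# Round trip with an assumed calibration constant: a clear sky colder
# than the detector produces a negative thermopile voltage.
K_CAL = 1.0e-12                                  # illustrative value
v = K_CAL * (270.0 ** 4 - 290.0 ** 4)            # simulated sensor output
recovered = thermopile_sky_temp_k(v, 290.0, K_CAL)   # ≈ 270.0 K
```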


In various implementations, the infrared cloud detector has an IR sensor that is located and oriented so that its field-of-view can receive infrared radiation from a particular region of sky of interest. In one implementation, the IR sensor may be located on a roof-top of a building and oriented with its sensing surface facing vertically upward or at a small angle from vertical so that its field-of-view is of a region of the sky above or at a distance from the building.


In certain implementations, the infrared cloud detector has a protective housing and the infrared sensor is located within the housing. The housing may have a cover with one or more apertures or thinned areas that allow/restrict transmission of infrared radiation to the infrared sensor. In some cases, the cover may be formed from a plastic such as polycarbonate, polyethylene, polypropylene and/or a thermoplastic such as nylon or other polyamide, polyester or other thermoplastic, among other suitable materials. In one example, the material is a weather-resistant plastic. In other cases, the cover may be formed from a metallic material such as aluminum, cobalt or titanium, or a semi-metallic material such as alumide. In some implementations, the cover may be sloped or convex-shaped to prevent the accumulation of water. Depending on the type of material or materials used to form the cover, the cover may be 3D-printed, injection molded or formed via another suitable process or processes.


In some implementations, the cover includes one or more apertures or thinned areas to increase transmission (lessen blocking) of incident radiation or other signals to detectors within the housing. For example, the cover may include one or more apertures or thinned areas proximate infrared sensors in the housing to allow for improved transmission of incident infrared radiation to the infrared sensors. Apertures or thinned areas may also improve transmission of other signals (e.g., GPS signals) to other detecting devices within the housing. Additionally or alternatively, some or all of the cover can be formed of a light-diffusing material. In some implementations, the cover can be connected with the housing via an adhesive or with some mechanical coupling mechanism such as through the use of threads and threading or via a pressure gasket or other press-on fitting.


The field-of-view of the sensing surface of the infrared sensor is defined by its material composition and its structure. In some cases, the field-of-view of infrared sensor may be narrowed by obstructions. Some examples of obstructions include a building structure such as an overhanging or a roof-top structure, an obstruction near the building such as a tree or another building, etc. As another example, if the infrared sensor is located within a housing, structures within the housing may narrow the field-of-view.


In one aspect, a single IR sensor has a vertical, unconstrained field-of-view of about 50 degrees to about 130 degrees (±40 degrees off of vertical). In one aspect, an IR sensor has a field-of-view in a range of 50 degrees to 100 degrees. In another aspect, an IR sensor has a field-of-view in a range of 50 degrees to 80 degrees. In another aspect, an IR sensor has a field-of-view of about 88 degrees. In another aspect, an IR sensor has a field-of-view of about 70 degrees. In another aspect, an IR sensor has a field-of-view of about 44 degrees. The field-of-view of an IR sensor is typically defined as a conical volume. IR sensors typically have wider fields-of-view than visible light photosensors and are consequently capable of receiving radiation from larger regions of the sky. Since an IR sensor can take readings of larger regions of the sky, the IR sensor can be more useful in determining an approaching condition (e.g., incoming storm clouds) than a visible light photosensor, which is more limited to detecting a current condition affecting the immediate vicinity of the photosensor within its smaller field-of-view. In one aspect, a five-sensor obstructed IR sensor arrangement (e.g., in a multi-sensor configuration) of mounted sensors has four angularly mounted IR sensors, each constrained to a field-of-view of 20-70 degrees or 110-160 degrees, and one upward-facing IR sensor constrained to a field-of-view of 70-110 degrees.
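The claim that a wider field-of-view samples a larger region of the sky follows from simple cone geometry. This sketch compares footprints at cloud height; the 2 km cloud altitude and the narrow 10-degree photosensor angle are illustrative assumptions:

```python
import math

def sky_footprint_diameter(fov_degrees: float, cloud_altitude_m: float) -> float:
    """Diameter of the sky region an upward-facing sensor sees at a given
    cloud altitude, assuming a conical field-of-view centered on vertical."""
    half_angle = math.radians(fov_degrees / 2.0)
    return 2.0 * cloud_altitude_m * math.tan(half_angle)

# An 88-degree IR sensor vs. a narrow 10-degree photosensor, clouds at 2 km:
wide = sky_footprint_diameter(88.0, 2000.0)     # roughly 3.9 km across
narrow = sky_footprint_diameter(10.0, 2000.0)   # roughly 0.35 km across
```

The order-of-magnitude gap in footprint diameter is why the IR sensor can see storm clouds approaching well before they reach the photosensor's small patch of sky.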


Certain IR sensors tend to be more effective in measuring sky temperature when direct sunlight is not impinging the sensing surface. In certain implementations, the infrared cloud detector has a structure that shades direct sunlight from the sensing surface of the IR sensor or has a structure that diffuses direct sunlight (e.g., an enclosure of opaque plastic) before it impinges the sensing surface of the IR sensor. In one implementation, an IR sensor may be shaded by an overhanging structure of the building or of the infrared cloud detector. In another implementation, an IR sensor may be located within a protective housing with a diffusing material between the sensing surface of the IR sensor and the sky to diffuse any direct sunlight before it reaches the sensing surface of the IR sensor and also to provide protection from potentially harmful elements such as dirt, animals, etc. Additionally or alternatively, some implementations only use IR sensor readings taken before sunrise or after sunset to avoid the possibility of direct sunlight impinging the IR sensor. In these implementations, photosensor readings or other sensor readings may be used to detect cloud cover conditions between sunrise and sunset.


In various implementations, the infrared cloud detector has an ambient temperature sensor for measuring the temperature of the air surrounding the ambient temperature sensor. Typically, the ambient temperature sensor is located in contact with the outdoor environment (e.g., located outside of a building) to take temperature readings of the outside air. The ambient temperature sensor may be, for example, a thermistor, a thermocouple, a resistance thermometer, a silicon bandgap temperature sensor, etc. A commercially-available example of an ambient temperature sensor that can be used is the Pt100 thermometer probe made by Omega. Certain implementations include an ambient temperature sensor that is located to avoid direct sunlight from impinging its sensing surface. For example, the ambient temperature sensor may be located under an overhang or mounted underneath a structure that shades the ambient temperature sensor from direct sunlight.
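If the ambient temperature sensor is a thermistor, its resistance must be converted to a temperature. The sketch below uses the standard beta-parameter equation with constants typical of a common 10 kΩ NTC part; the specific values are illustrative, not from the disclosure:

```python
import math

def thermistor_temp_c(r_ohms: float, r0: float = 10_000.0,
                      t0_c: float = 25.0, beta: float = 3950.0) -> float:
    """Convert an NTC thermistor resistance to temperature (Celsius) via
    the beta-parameter equation:
        1/T = 1/T0 + ln(R/R0)/beta   (temperatures in kelvin)
    Defaults describe a typical 10 kOhm @ 25 C, beta = 3950 part."""
    t0_k = t0_c + 273.15
    inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
    return 1.0 / inv_t - 273.15
```

For an NTC device, resistance falls as temperature rises, so readings below the nominal 10 kΩ map to temperatures above 25 °C.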


Although many implementations of the infrared cloud detector described herein include one IR sensor and one ambient temperature sensor, it would be understood that other implementations can include more than one IR sensor and/or more than one ambient temperature sensor. For example, in one implementation, the infrared cloud detector includes two or more IR sensors for redundancy and/or to direct IR sensors to different regions of the sky. Additionally or alternatively, the infrared cloud detector may have two or more ambient temperature sensors for redundancy in another implementation. An example of a system that uses two IR sensors directed at different regions of the sky for detecting clouds can be found in international application PCT/US15/53041, filed on Sep. 29, 2015 and titled "SUNLIGHT INTENSITY OR CLOUD DETECTION WITH VARIABLE DISTANCE SENSING," which is hereby incorporated by reference in its entirety.


Various implementations of the infrared cloud detector have the basic functionality of detecting cloud cover conditions. In some cases, the infrared cloud detector can detect a “cloudy” condition and a “clear” condition. Additionally, some implementations can further differentiate a “cloudy” condition into gradations. For example, one implementation can differentiate a “cloudy” condition as either “overcast” or “intermittent clouds.” In another example, an implementation can assign different levels (e.g., 1-10) of cloudiness to the “cloudy” condition. In yet another example, an implementation can determine a future cloud condition. Additionally or alternatively, some implementations can also detect other weather conditions.


In various implementations, the infrared cloud detector comprises an IR sensor configured to take temperature readings, TIR, and an ambient temperature sensor configured to take ambient temperature readings, TA. The infrared cloud detector also includes one or more processors containing program instructions that can be executed to perform various functions of the infrared cloud detector. The processor(s) executes program instructions to determine the temperature difference, delta (Δ), between the temperature readings as provided in Eqn. 1. The processor(s) also executes program instructions to determine the cloud cover condition based on the delta (Δ). As mentioned above, using the ambient temperature readings can help normalize any rapid fluctuations in the IR sensor temperature readings in some circumstances.

Delta (Δ)=Infrared Sensor Temperature Reading (TIR)−Ambient Temperature Reading (TA)  (Eqn. 1)


In one implementation, the processor(s) executes program instructions to compare the delta (Δ) to an upper threshold value and a lower threshold value to determine a cloud cover condition. If the delta (Δ) is above the upper threshold value, a "cloudy" condition is determined. If the delta (Δ) is below the lower threshold value, a "clear" condition is determined. If the delta (Δ) is below the upper threshold value and above the lower threshold value (i.e., between the threshold values), an "intermittent" cloud cover condition is determined. Additionally or alternatively, additional factors may be used to determine a cloud cover condition when the delta (Δ) is between the threshold values. This implementation works well in the morning around dawn and in the evening around dusk to accurately determine a "cloudy" condition or a "clear" condition. Between sunrise and sunset, additional factors may be used to determine the cloud cover condition such as, for example, visible photosensor values. Some examples of additional factors include: elevation, wind speed/direction, and sun elevation/angle.
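The threshold comparison built on Eqn. 1 can be sketched as follows; the default thresholds mirror the example values given earlier (about 0 °C upper, about −5 °C lower), though a real deployment would calibrate them per site:

```python
def cloud_condition(t_ir_c: float, t_ambient_c: float,
                    upper: float = 0.0, lower: float = -5.0) -> str:
    """Classify cloud cover from the delta of Eqn. 1.

    delta = TIR - TA. A warm sky (delta above the upper threshold) means
    clouds are re-emitting IR toward the sensor; a cold sky (delta below
    the lower threshold) means IR is escaping through a clear atmosphere.
    Threshold defaults follow the example values in the text."""
    delta = t_ir_c - t_ambient_c           # Eqn. 1
    if delta > upper:
        return "cloudy"                    # sky near ambient temperature
    if delta < lower:
        return "clear"                     # sky much colder than ambient
    return "intermittent cloudy"           # between the thresholds
```

Correction factors (humidity, sun angle, site elevation) would be applied to `delta` before the comparisons, as described above.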


A. Infrared (IR) Cloud Detection Sensor Systems



FIG. 1 shows a schematic representation of a side view of a system with an infrared cloud detector 100, according to some implementations. The infrared cloud detector 100 comprises a housing 101 with a cover 102 having an aperture or thinned portion 104 at a first surface 106 of the housing 101. The housing 101 also has a second surface 108 opposing the first surface 106. The infrared cloud detector 100 further comprises an IR sensor 110 configured to take temperature readings, TIR, based on infrared radiation received within its conical field-of-view 114, an ambient temperature sensor 130 for taking ambient temperature readings, TA, and a processor 140 in communication (wired or wireless) with the IR sensor 110 and the ambient temperature sensor 130. In one aspect, the IR sensor is one of an infrared thermometer (e.g., a thermopile), infrared radiometer, infrared pyrgeometer, and infrared pyrometer. In one aspect, the ambient temperature sensor is one of a thermistor, a thermometer, and a thermocouple.


In FIG. 1, the IR sensor 110 is located behind the aperture or thinned portion 104 and within the enclosure of the housing 101. The aperture or thinned portion 104 enables the IR sensor 110 to measure infrared radiation transmitted through the aperture or thinned portion 104 and received at its sensing surface. The IR sensor 110 includes an imaginary axis 112 that is orthogonal to the sensing surface of the IR sensor 110 and passes through the center of the IR sensor 110. In the illustrated example, the IR sensor 110 is oriented so that its axis 112 is in a vertical orientation and the sensing surface is facing upward. In other examples, the IR sensor 110 can be directed so that the sensing surface is facing in another orientation to direct the IR sensor, for example, to a particular region of the sky. The IR sensor 110 has a conical field-of-view 114 through the aperture or thinned portion 104 to outside of the housing 101. In this example, the portions of the cover 102 around the aperture or thinned portion 104 are made of a material that blocks infrared radiation and the perimeter of the aperture or thinned portion 104 defines the field-of-view 114. The field-of-view 114 has an angle, a, and is centered about the axis 112. In FIG. 1, the ambient temperature sensor 130 is located and affixed to the second surface 108 of the housing 101 away from the edge to avoid direct sunlight from impinging the ambient temperature sensor 130 when the infrared cloud detector 100 is in this orientation. Although not shown, the infrared cloud detector 100 also includes one or more structures that hold the infrared sensor 110 and other components in place within the housing 101.


The infrared cloud detector 100 further comprises logic that calculates a delta (Δ) between the infrared sensor sky temperature readings (TIR) and the ambient temperature readings (TA) at each reading time and determines a cloud cover condition based on the calculated delta (Δ). During operation, the IR sensor 110 takes temperature readings, TIR, based on infrared radiation received from the region of sky within its field-of-view 114, and the ambient temperature sensor 130 takes ambient temperature readings, TA, of the ambient air surrounding the infrared cloud detector 100. The processor 140 receives signals with temperature readings, TIR, from the IR sensor 110 and signals with ambient temperature readings, TA, from the ambient temperature sensor 130. The processor 140 executes instructions stored in memory (not shown) that use the logic to calculate the delta (Δ) between the infrared sensor temperature readings (TIR) and the ambient temperature readings (TA) at a particular time to determine the cloud cover condition. For example, the processor 140 may execute instructions that determine a "cloudy" condition if the delta (Δ) at that time is above an upper threshold value, determine a "clear" condition if the delta (Δ) is below a lower threshold value, and determine an "intermittent cloudy" condition if it is determined that the delta (Δ) is between the upper threshold value and the lower threshold value. The processor 140 may also execute instructions stored in memory to perform other operations of methods described herein.
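The threshold comparison described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the threshold values and the function name are assumptions, and actual thresholds would be set empirically for a given sensor and site.

```python
# Sketch of the delta-threshold classification described above.
# The threshold values below are illustrative assumptions only.
UPPER_THRESHOLD = 0.0   # assumed example value (degrees C)
LOWER_THRESHOLD = -5.0  # assumed example value (degrees C)

def cloud_cover_condition(t_ir, t_ambient,
                          upper=UPPER_THRESHOLD, lower=LOWER_THRESHOLD):
    """Classify cloud cover from the delta between sky and ambient readings.

    A clear sky radiates little infrared back toward the sensor, so the
    sky reading T_IR falls far below ambient (a large negative delta);
    clouds radiate back and pull T_IR toward the ambient temperature.
    """
    delta = t_ir - t_ambient
    if delta > upper:
        return "cloudy"
    if delta < lower:
        return "clear"
    return "intermittent cloudy"
```

For example, a sky reading of -30 °C against an ambient of 15 °C yields a delta well below the lower threshold and so classifies as "clear".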


Although a single infrared sensor 110 is illustrated in FIG. 1, two or more infrared sensors can be used, in another implementation, for redundancy in case one malfunctions and/or is obscured by, for example, bird droppings or another environmental agent. In one implementation, two or more infrared sensors are used to face different orientations to capture IR radiation from different fields-of-view and/or at different distances from the building/structure. If two or more IR sensors are located within a housing of an infrared cloud detector 100, the IR sensors are typically offset from one another by a distance sufficient to reduce the likelihood that an obscuring agent would affect all the IR sensors. For example, IR sensors may be separated by at least about one inch or at least about two inches.


B. Comparison of Infrared Sensor Temperature Readings, Ambient Temperature Readings, and Delta Values During a Clear Day and a Day with Afternoon Clouds


As discussed above, ambient temperature readings taken by an ambient temperature sensor tend to fluctuate to a lesser extent than sky temperature readings taken by an infrared radiation sensor. Certain implementations of infrared cloud detectors have logic that determines the difference between the infrared sensor temperature readings (TIR) and the ambient temperature readings (TA), the delta (Δ), according to Eqn. 1 to help normalize any fluctuations in the infrared sensor temperature readings (TIR). By way of comparison, FIGS. 2A-2C include graphs of examples of temperature readings, TIR, taken by an infrared sensor of an infrared cloud detector according to an implementation, temperature readings, TA, taken by an ambient temperature sensor of the infrared cloud detector, and the delta (Δ) between these readings. Each graph includes two plots: a plot of readings taken during a clear day and a plot of readings taken during a day with afternoon clouds. The infrared cloud detector used in this example includes components that are similar to those described with respect to the infrared cloud detector 100 shown in FIG. 1. In this case, the infrared cloud detector is located on the rooftop of a building and the infrared sensor is oriented to face vertically upward. The infrared sensor is calibrated to measure infrared radiation in the wavelength range from about 8 μm to about 14 μm. To prevent direct sunlight from impinging on the infrared sensor, the infrared sensor is located behind a cover formed of a light-diffusing material such as a plastic (e.g., polycarbonate, polyethylene, or polypropylene) and/or a thermoplastic such as nylon or another polyamide, a polyester, or another thermoplastic, among other suitable materials.
In this example, the infrared cloud detector also comprises logic that can be used to calculate the difference, delta (Δ), between the temperature readings, TIR, taken by the IR sensor and the ambient temperature readings, TA, taken by the ambient temperature sensor of the infrared cloud detector. The logic can also be used to determine a "cloudy" condition if the delta (Δ) is at or above the upper threshold value, a "clear" condition if the delta (Δ) is at or below the lower threshold value, and an "intermittent cloudy" condition if it is determined that the delta (Δ) is between the upper and lower threshold values.



FIG. 2A shows a graph with two plots of temperature readings, TIR, taken over time by an infrared sensor of the infrared cloud detector, according to this implementation. Each of the two plots is of temperature readings, TIR, taken by the infrared sensor over a time period of a day. The first plot 110 is of temperature readings, TIR, taken by the infrared sensor during a first day with clouds in the afternoon. The second plot 112 is of temperature readings, TIR, taken by the infrared sensor during a second day that is clear all day. As shown, the temperature readings, TIR, of the first plot 110 taken during the afternoon of the first day with afternoon cloudiness are generally higher than the temperature readings, TIR, of the second plot 112 taken during the second day that is clear all day.



FIG. 2B shows a graph having two plots of ambient temperature readings, TA, taken over time by the ambient temperature sensor of the infrared cloud detector discussed with respect to FIG. 2A. Each of the two plots is of temperature readings, TA, taken by the ambient temperature sensor over a time period of a day. The ambient temperature sensor is shaded to prevent direct sunlight from impinging on it. The first plot 220 is of temperature readings taken by the ambient temperature sensor during the first day with clouds in the afternoon. The second plot 222 is of temperature readings taken by the ambient temperature sensor during the second day that is clear all day. As shown, the ambient temperature readings, TA, of the first plot 220 taken during the first day with clouds in the afternoon are at lower levels than the temperature readings, TA, of the second plot 222 taken during the second day that is clear all day.



FIG. 2C shows a graph having two plots of the calculated delta (Δ) between the temperature readings, TIR, taken by the IR sensor and the ambient temperature readings, TA, taken by the ambient temperature sensor of the infrared cloud detector discussed with respect to FIGS. 2A and 2B. Each of the two plots is of the calculated delta (Δ) over a time period of a day. The first plot 230 is the calculated delta (Δ) of the readings taken during the first day with clouds in the afternoon. The second plot 232 is the calculated delta (Δ) taken during the second day that is clear all day. The graph also includes an upper threshold value and a lower threshold value.


In FIG. 2C, the values of delta (Δ) of the second plot 232 during a time interval from just before sunrise until just after sunrise and during a time interval from just before sunset until sunset are below the lower threshold value. Using the calculated delta (Δ) values shown in the plots in FIG. 2C, the logic of the infrared cloud detector would determine a "clear" condition during these time intervals. Also, since the values of delta (Δ) of the second plot 232 are below the lower threshold value at most other times of the day, the logic of the infrared cloud detector would determine a "clear" condition for the other times as well.


In FIG. 2C, the values of delta (Δ) of the first plot 230 are above the upper threshold value for most of the afternoon, and the infrared cloud detector would determine a "cloudy" condition during the afternoon. The values of delta (Δ) of the first plot 230 are below the lower threshold value during a time interval from just before sunrise until just after sunrise and during a time interval from just before sunset until sunset. Based on these calculated delta (Δ) values, the logic of the infrared cloud detector would determine a "clear" condition during these time intervals. The values of delta (Δ) of the first plot 230 are between the lower and upper threshold values during brief transition periods in the early and late afternoon. Based on these calculated delta (Δ) values, the logic of the infrared cloud detector would determine an "intermittent cloudy" condition.


C. Infrared Cloud Detector Systems with Optional Photosensor(s)


In certain implementations, infrared cloud detector systems include an optional visible light photosensor (e.g., a photodiode) for measuring the intensity of visible light radiation during operation. These systems generally comprise an infrared sensor, an ambient temperature sensor, a visible light photosensor, and logic for determining a cloud cover condition based on readings taken by one or more of the infrared sensor, the ambient temperature sensor, and the visible light photosensor. In some cases, the infrared sensor is calibrated to measure wavelengths in the range of 8-14 μm. In some cases, the photosensor is calibrated to detect the intensity of visible light (e.g., between about 390 nm and about 700 nm) within a photopic range. The photosensor may be located in/on the same housing as the infrared sensor and the ambient temperature sensor or may be located separately. In some cases, the logic determines the cloud cover condition based on a calculated delta (Δ) value between the infrared sensor temperature readings, TIR, and the ambient temperature readings, TA, when the confidence level of the infrared sensor is high and/or the confidence level of the photosensor is low. The logic determines the cloud cover condition based on the photosensor readings when the confidence level of the infrared sensor is low and/or the confidence level of the photosensor is high.


In various implementations, an infrared cloud detector system includes logic for determining a cloud cover condition using, as input, the time of day, the day of year, temperature readings, TIR, from the infrared sensor, ambient temperature readings, TA, from the ambient temperature sensor, light intensity readings from the photosensor, the oscillation frequency of the visible light intensity readings from the photosensor, and the oscillation frequency of the temperature readings, TIR, from the infrared sensor. In some cases, the logic determines the oscillation frequency from the visible light intensity readings and/or the oscillation frequency from the temperature readings, TIR. The logic determines whether the time of day is during one of the following four time periods: (i) a time period shortly before sunrise and up to slightly after sunrise; (ii) daytime, defined as after (i) and before (iii); (iii) a time period shortly before sunset (dusk) and up until sunset; or (iv) nighttime, defined as after (iii) and before (i). In one case, the time of sunrise can be determined from measurements taken by the visible wavelength photosensor. For example, the time period (i) may end at the point where a visible light wavelength photosensor begins to measure direct sunlight, i.e., when an intensity reading of the visible light photosensor is at or above a minimum intensity value. In addition or alternatively, the time period (iii) may be determined to end at the point where the intensity reading from a visible light wavelength photosensor is at or below a minimum intensity value. In another example, the time of sunrise and/or the time of sunset may be calculated using a solar calculator based on the day of the year, and the time periods (i) and (iii) can be calculated as a defined period of time (e.g., 45 minutes) before and after the calculated times of sunrise/sunset.
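The four-period classification above can be sketched as follows, assuming sunrise and sunset times are already known (e.g., from a solar calculator) and using the 45-minute offset from the example; the function name and the returned period labels are assumptions for illustration.

```python
from datetime import datetime, timedelta

# Sketch of the four-period classification (i)-(iv) described above.
# The 45-minute offset mirrors the example in the text.
def classify_time_period(now, sunrise, sunset, offset=timedelta(minutes=45)):
    """Return 'i', 'ii', 'iii', or 'iv' for the period containing `now`."""
    if sunrise - offset <= now <= sunrise + offset:
        return "i"    # shortly before sunrise up to slightly after sunrise
    if sunrise + offset < now < sunset - offset:
        return "ii"   # daytime
    if sunset - offset <= now <= sunset:
        return "iii"  # shortly before sunset (dusk) up until sunset
    return "iv"       # nighttime
```

Note that any time after sunset and before the pre-sunrise window falls through to nighttime, matching the definition of period (iv).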
If the time of day is within time periods (i) or (iii), the confidence level of the photosensor readings tends to be low and the confidence level of the infrared sensor readings tends to be high. In this situation, the logic determines the cloud cover condition based on a calculated delta (Δ), with or without correction factors. For example, the logic may determine a "cloudy" condition if the delta (Δ) is above the upper threshold value, a "clear" condition if the delta (Δ) is below the lower threshold value, and an "intermittent cloudy" condition if the delta (Δ) is between the upper and lower threshold values. As another example, the logic may determine a "cloudy" condition if the delta (Δ) is above a single threshold value and a "clear" condition if the delta (Δ) is below the threshold value. If the time of day is during (ii) daytime, the confidence level of the photosensor readings is at a high level and the confidence level of the infrared sensor readings tends to be low. In this case, the logic may use the photosensor readings to determine the cloud cover condition as long as a calculated difference between the infrared readings and the photosensor readings stays at or below an acceptable value. For example, the logic may determine a "clear" condition if the photosensor reading is above a certain intensity level and determine a "cloudy" condition if the photosensor reading is at or below the intensity level. If the calculated difference between the infrared readings and the photosensor readings increases above the acceptable value, the confidence level of the infrared readings is increased and the logic determines the cloud cover condition based on the delta (Δ) as described above.
Alternatively or additionally, if the photosensor readings are determined to be oscillating at a frequency greater than a first defined level, the confidence level of the infrared readings is increased and the logic determines the cloud cover condition based on the delta (Δ). If the infrared readings are determined to be oscillating at a frequency greater than a second defined level, the confidence level of the photosensor readings is increased and the logic determines the cloud cover condition based on the photosensor readings. If the time of day is during (iv) nighttime, the logic may determine the cloud cover condition based on the delta (Δ) as described above.
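The selection between the delta and the photosensor readings can be sketched as below. All names are hypothetical, and the precedence among the daytime checks is not specified in the text, so the ordering here is an assumption.

```python
# Sketch of the confidence-based selection between the delta (IR minus
# ambient) and the photosensor readings described above.
def select_reading_source(period, ps_osc_freq, ir_osc_freq,
                          ir_ps_difference, acceptable_difference,
                          first_level, second_level):
    """Return "delta" or "photosensor" as the basis for the cloud condition."""
    if period in ("i", "iii", "iv"):
        return "delta"            # photosensor confidence is low near dawn/dusk/night
    # daytime (ii): prefer the photosensor unless its confidence drops
    if ps_osc_freq > first_level:
        return "delta"            # e.g., fast-moving clouds rattle the photosensor
    if ir_osc_freq > second_level:
        return "photosensor"      # noisy IR readings lower IR confidence
    if ir_ps_difference > acceptable_difference:
        return "delta"            # large disagreement raises IR confidence
    return "photosensor"
```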



FIG. 3 depicts a schematic (side view) diagram of an infrared cloud detector system 300 comprising an infrared cloud detector 310 and a photosensor 320, according to an implementation. The infrared cloud detector 310 comprises a housing 312, an infrared sensor 314 within the enclosure of the housing 312, and an ambient temperature sensor 316 also within the enclosure of the housing 312. The infrared sensor 314 is configured to take temperature readings, TIR, based on infrared radiation received from the region of sky within its conical field-of-view 315. The ambient temperature sensor 316 is configured to take ambient temperature readings, TA, of the ambient air surrounding the infrared cloud detector 310. In one aspect, the IR sensor is one of an infrared thermometer (e.g., a thermopile), an infrared radiometer, an infrared pyrgeometer, and an infrared pyrometer. In one aspect, the ambient temperature sensor is one of a thermistor, a thermometer, and a thermocouple.


The infrared cloud detector 310 is located on the roof of a building having a room 330 with a tintable window 332 (e.g., an electrochromic window with at least one electrochromic device), and the photosensor 320 is located on an exterior surface of the building. The tintable window 332 is located between the exterior and the interior of the building, which includes the room 330. FIG. 3 also shows a desk 334 in the room 330. Although the photosensor 320 is located separately from the infrared cloud detector 310 in this example, in other implementations, the photosensor 320 is located in the enclosure of the housing 312 or on the outside of the housing 312.


The infrared sensor 314 includes an imaginary axis that is perpendicular to the sensing surface of the infrared sensor 314 and passes through its center. The infrared cloud detector 310 is supported by a wedge-shaped structure that orients the infrared cloud detector 310 such that its axis is directed at an angle of inclination from a horizontal plane. Other components can be used to support the infrared cloud detector 310 in other implementations. The infrared sensor 314 is directed so that the sensing surface faces the sky and can receive infrared radiation from a region of the sky within its field-of-view 315. The ambient temperature sensor 316 is located within the enclosure of the housing 312 away from the edge and is shaded by an overhanging portion of the housing 312 to prevent direct sunlight from impinging on the sensing surface of the ambient temperature sensor 316. Although not shown, the infrared cloud detector 310 also includes one or more structures that hold its components within the housing 312.


In FIG. 3, the infrared cloud detector system 300 also includes a controller 340 with a processor that can execute instructions stored in memory (not shown) for using the logic of the infrared cloud detector system 300. The controller 340 is in communication with (wirelessly or wired) the infrared sensor 314 and the ambient temperature sensor 316 to receive signals with temperature readings. The controller 340 is also in communication with (wirelessly or wired) the photosensor 320 to receive signals with visible light intensity readings.


In some implementations, power/communication lines can extend from the building or another structure to the infrared cloud detector 310. In one implementation, the infrared cloud detector 310 includes a network interface that can couple the infrared cloud detector 310 to a suitable cable. The infrared cloud detector 310 can communicate data through the network interface to the controller 340 or another controller (e.g., a network controller and/or master controller) of the building. In some other implementations, the infrared cloud detector 310 can additionally or alternatively include a wireless network interface enabling wireless communication with one or more external controllers.


In some implementations, the infrared cloud detector 310 or other examples of infrared cloud detectors can also include a battery within or coupled with its housing to power the sensors and electrical components within. The battery can provide such power in lieu of or in addition to the power from a power supply (for example, from a building power supply). In one implementation, an infrared cloud detector further includes at least one photovoltaic cell, for example, on an outer surface of the housing. This at least one photovoltaic cell can provide power in lieu of or in addition to the power provided by any other power supply.


The infrared cloud detector system 300 further comprises logic for determining the cloud cover condition that uses, as input, the time of day, the day of year, temperature readings, TIR, from the infrared sensor 314, ambient temperature readings, TA, from the ambient temperature sensor 316, light intensity readings from the photosensor 320, the oscillation frequency of the visible light intensity readings from the photosensor 320, and the oscillation frequency of the temperature readings, TIR, from the infrared sensor 314. During operation, the infrared sensor 314 takes temperature readings, TIR, based on infrared radiation received from the region of sky within its field-of-view 315, the ambient temperature sensor 316 takes ambient temperature readings, TA, of the ambient air surrounding the infrared cloud detector 310, and the photosensor 320 takes intensity readings of visible light received at its sensing surface. The processor of the controller 340 receives signals with temperature readings, TIR, from the infrared sensor 314, signals with ambient temperature readings, TA, from the ambient temperature sensor 316, and signals with intensity readings from the photosensor 320. The processor executes instructions stored in memory for using the logic to determine the cloud cover condition based on the various inputs. An example of such logic is described above and also with reference to FIG. 9. In one implementation, the controller 340 is also in communication with and configured to control one or more building components. For example, the controller 340 may be in communication with and configured to control the tint level of the tintable window 332. In this implementation, the infrared cloud detector system 300 further comprises logic for determining control decisions for the one or more building components, e.g., the tintable window 332, based on the determined cloud cover condition.
An example of logic for determining control decisions based on a determined cloud cover condition is described in more detail with respect to FIG. 10.


Although a single infrared sensor 314, ambient temperature sensor 316, and photosensor 320 are illustrated in FIG. 3, it would be understood that the disclosure is not so limited and that additional components can be used in other implementations. For example, multiple components can be used for redundancy in case one malfunctions and/or is obscured or otherwise prevented from functioning. In another example, two or more components may be used at different locations or at different orientations to capture different information. In one implementation, two or more infrared sensors are used to face different orientations to capture infrared radiation from different fields-of-view and/or at different distances from the building/structure. In cases with multiple sensors, an average of the values from the multiple sensors may be used to determine the cloud cover condition. If two or more IR sensors are located within a housing of an infrared cloud detector 310, the IR sensors are typically offset from one another by a distance sufficient to reduce the likelihood that an obscuring agent would affect all the IR sensors. For example, IR sensors may be separated by at least about one inch or at least about two inches.
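Combining redundant sensor readings by averaging, as described above, can be sketched as follows. The convention that a malfunctioning or obscured sensor reports None is an assumption for this sketch, not a detail from the disclosure.

```python
# Sketch of averaging readings from redundant sensors. Sensors that
# have malfunctioned or are obscured are assumed to report None and
# are skipped (an assumption for illustration).
def combined_reading(readings):
    """Average the readings from redundant sensors, skipping failed ones."""
    valid = [r for r in readings if r is not None]
    if not valid:
        raise ValueError("no functioning sensors")
    return sum(valid) / len(valid)
```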


Another example of an infrared cloud detector system is described with respect to FIGS. 11A-C in Section III below.


Multi-Sensor Implementations


In certain implementations, an infrared cloud detector system includes an infrared cloud detector with a visible light photosensor in the form of a multi-sensor device with various other optional sensors and electrical components within or on its housing. Details of different examples of multi-sensor devices are described in U.S. patent application Ser. No. 14/998,019, filed on Oct. 6, 2016 and titled “MULTI-SENSOR,” which is hereby incorporated by reference in its entirety. Multi-sensor devices of these implementations are configured to be located in an environment exterior to a building in order to expose sensors to the outside environment. In some of these implementations with multi-sensor devices, power/communication lines extend from the building to the multi-sensor device. In one such case, the multi-sensor device includes a network interface that can couple the multi-sensor device to a suitable cable. The multi-sensor device can communicate data through the network interface to a local controller or controllers, a network controller, and/or a master controller of the building. In other implementations, the multi-sensor device can additionally or alternatively include a wireless network interface enabling wireless communication with one or more external controllers. In some implementations, the multi-sensor device may also include a battery within or coupled with its housing to power the sensors and electrical components within. The battery can provide such power in lieu of or in addition to the power from a power supply (for example, from a building power supply). In some implementations, the multi-sensor device further includes at least one photovoltaic cell, for example, on a surface of its housing.



FIGS. 4A, 4B, and 4C show perspective views of a diagrammatic representation of an infrared cloud detector system 400 comprising an infrared cloud detector in the form of a multi-sensor device 401, according to one such implementation. FIGS. 4A and 4B show that the multi-sensor device 401 comprises a housing 410 coupled to a mast 420. The mast 420 can function as a mounting assembly including a first end portion for coupling to a base portion 414 of the housing 410 and a second end portion for mounting to the building. In one example, the base portion 414 is fixedly attached or coupled to or with the first end portion of the mast 420 via mechanical threading or via a rubber gasket press-on. The second end portion of the mast 420 can include a mounting or attachment mechanism for mounting or attaching the mast 420 to a rooftop of the building (e.g., on the roof of the building with the room 330 shown in FIG. 3) such as, for example, to a surface of the roof, a wall on the roof, or another structure on the roof. The housing 410 includes a cover 411 that is formed of a light-diffusing material. The cover 411 also includes a thinned portion 412.



FIG. 4B also shows that the infrared cloud detector system 400 includes an ambient temperature sensor 420 located on the bottom surface of the base portion 414 of the multi-sensor device 401. The ambient temperature sensor 420 is configured to measure the ambient temperature of the external environment during operation. The ambient temperature sensor 420 is located on the bottom surface to be shaded from direct solar radiation when the infrared cloud detector system 400 is located in an outdoor environment with its upper surface facing upward. The ambient temperature sensor 420 may be, for example, a thermistor, a thermocouple, a resistance thermometer, a silicon bandgap temperature sensor, etc.



FIG. 4C shows a perspective view of some of the inner components of the multi-sensor device 401 of the infrared cloud detector system 400 shown in FIGS. 4A and 4B. As shown, the infrared cloud detector system 400 further includes a visible light sensor 440, a first infrared sensor 452 and a second infrared sensor 454. The first infrared sensor 452 and second infrared sensor 454 are located on an upper portion of the multi-sensor device 401 and positioned behind the cover 411 (shown in FIGS. 4A and 4B) formed of the light-diffusing material.


As shown in FIG. 4C, the first infrared sensor 452 has a first axis of orientation 453 that is perpendicular to its sensing surface. The second infrared sensor 454 has a second axis of orientation 455 that is perpendicular to its sensing surface. In the illustrated example, the first and second infrared sensors 452, 454 are positioned so that their axes of orientation 453, 455 face outward from the top portion of the housing 410 (shown in FIGS. 4A and 4B) in order to be able to take temperature readings during operation that are based on infrared radiation captured from above the multi-sensor device 401. The first infrared sensor 452 is separated from the second infrared sensor 454 by at least about one inch. During operation, the first and second infrared sensors 452, 454 detect infrared radiation that is radiated from any objects or medium within their field-of-view. The field-of-view is based on the physical and material properties of the first and second infrared sensors 452, 454. Based on their physical and material properties alone, some examples of infrared sensors have a field-of-view that ranges from about 50 degrees to about 80 degrees. In one particular example, an infrared sensor has a field-of-view of about 70 degrees.


The photosensor 440 has an axis of orientation 442 that is perpendicular to its sensing surface. The photosensor 440 is positioned behind the thinned portion 412 of the housing 410 as shown in FIG. 4A. The thinned portion 412 allows the photosensor 440 to receive visible light radiation through the thinned portion 412. During operation, the photosensor 440 measures the intensity of visible light received through the thinned portion 412.


In one implementation, the infrared cloud detector system 400 also includes an external controller with a processor that can execute instructions stored in memory (not shown) for using the logic of the infrared cloud detector system 400. In this implementation, the infrared cloud detector system 400 further includes logic for determining a cloud cover condition using, as input, the time of day, the day of year, temperature readings, TIR, from one or both of the infrared sensors 452, 454, ambient temperature readings, TA, from the ambient temperature sensor 420, light intensity readings from the photosensor 440, the oscillation frequency of the visible light intensity readings from the photosensor 440, and the oscillation frequency of the temperature readings, TIR, from the infrared sensors 452, 454. Examples of such logic are described herein, for example, with respect to FIGS. 8-10.


The external controller is in communication with (wirelessly or wired) the infrared sensors 452, 454 and the ambient temperature sensor 420 to receive signals with temperature readings. The controller is also in communication with (wirelessly or wired) the photosensor 440 to receive signals with visible light intensity readings. In some implementations, power/communication lines can extend from the building or another structure to the infrared cloud detector system 400. In one implementation, the infrared cloud detector system 400 includes a network interface that can couple to a suitable cable. The infrared cloud detector system 400 can communicate data through the network interface to the external controller or another controller of the building. In some other implementations, the infrared cloud detector system 400 can additionally or alternatively include a wireless network interface enabling wireless communication with one or more external controllers. In some implementations, the infrared cloud detector system 400 can also include a battery within or coupled with the housing to power the sensors and electrical components within. The battery can provide such power in lieu of or in addition to the power from a power supply (for example, from a building power supply). In some implementations, the infrared cloud detector system 400 further includes at least one photovoltaic cell, for example, on a surface of the housing.


D. Comparison of Intensity Readings from a Photosensor with Delta Values During Different Cloud Cover Conditions


As discussed above, infrared sensors can be more accurate than a visible light photosensor in detecting a "clear" condition in the early morning and evening. Direct sunlight and other conditions can, however, cause some noise that results in oscillations in the infrared sensor readings. If the frequency of these oscillations is low, the infrared sensor readings can be used to make a high confidence assessment of the cloud cover condition. Also, certain conditions (e.g., fast-moving clouds) may cause oscillations in the photosensor readings. If the frequency of oscillation is low, the photosensor readings can be used to make a high confidence assessment of the cloud cover condition during the daytime. In certain implementations, logic may determine whether the oscillations of the infrared sensor readings are of high frequency and/or whether the oscillations of the photosensor readings are of high frequency. If it is determined that the oscillations of the infrared sensor readings are of high frequency, the logic uses the photosensor readings to determine the cloud cover condition. If it is determined that the oscillations of the photosensor readings are of high frequency, the logic uses the difference between the infrared sensor readings and the ambient temperature sensor readings to determine the cloud cover condition. To illustrate technical advantages of this logic selecting the type of sensor reading to use depending on the oscillations, FIGS. 5A, 5B, 6A, 6B, 7A, and 7B include graphs of plots of intensity readings, I, taken by a visible light photosensor for comparison with the difference, delta (Δ), between temperature readings, TIR, taken by an infrared sensor and temperature readings, TA, taken by an ambient temperature sensor during different cloud cover conditions. The visible light photosensor, infrared sensor, and ambient temperature sensor are similar to those described with respect to components of the infrared cloud detector 310 shown in FIG. 3.
Each of the plots is of readings taken during the time period of a day.
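The selection logic described above can be sketched as follows (a non-limiting illustration; the mean-crossing frequency estimator, the `select_sensor` helper, and the `high_freq` threshold are illustrative assumptions rather than part of this disclosure):

```python
def crossings_per_hour(readings, period_minutes=1):
    """Estimate oscillation frequency as mean-crossings per hour.

    `readings` is a list of evenly spaced sensor samples taken
    `period_minutes` apart; a high crossing rate suggests noisy,
    rapidly oscillating readings.
    """
    if len(readings) < 2:
        return 0.0
    mean = sum(readings) / len(readings)
    crossings = sum(
        1 for a, b in zip(readings, readings[1:])
        if (a - mean) * (b - mean) < 0
    )
    hours = (len(readings) - 1) * period_minutes / 60.0
    return crossings / hours if hours else 0.0


def select_sensor(ir_readings, photo_readings, high_freq=6.0):
    """Pick which sensor's readings to trust (threshold is illustrative)."""
    if crossings_per_hour(ir_readings) > high_freq:
        return "photosensor"
    if crossings_per_hour(photo_readings) > high_freq:
        return "infrared"
    return "either"
```

With steady infrared readings and rapidly oscillating photosensor readings, `select_sensor` falls back to the infrared readings, and vice versa.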



FIGS. 5A-5B include graphs of plots of readings taken over a day that is sunny and clear all day except for a passing cloud during the middle of the daytime. FIG. 5A is a graph with a plot 510 of intensity readings, I, taken by the visible light photosensor over time. FIG. 5B is a graph with a plot 520 of the difference, delta (Δ), between temperature readings, TIR, taken by the infrared sensor and temperature readings, TA, taken by the ambient temperature sensor over time. As shown in the plot 510 of FIG. 5A, the intensity readings, I, taken by the visible light photosensor are high most of the daytime and drop with a high frequency (short period) oscillation when a cloud passes during the middle of the daytime. The plot 520 of FIG. 5B shows that the values of delta (Δ) do not increase above the lower threshold value during the entire day, which indicates a high confidence “clear” condition.



FIGS. 6A-6B include graphs of plots of readings taken over a day with frequent passing clouds from the morning until the afternoon and two slow moving clouds passing later in the afternoon. FIG. 6A is a graph with a plot 610 of intensity readings, I, taken by the visible light photosensor over time. FIG. 6B is a graph with a plot 640 of the difference, delta (Δ), between temperature readings, TIR, taken by the infrared sensor over time and temperature readings, TA, taken by an ambient temperature sensor over time. As shown in the plot 610 of FIG. 6A, the intensity readings, I, taken by the visible light photosensor have a high frequency portion 620 during the time period that the frequent clouds are passing from the morning until the afternoon. The plot 610 has a low frequency portion 630 later in the afternoon when the two slow moving clouds pass by. The plot 640 in FIG. 6B shows that the values of delta (Δ) have a high frequency during the time period that the frequent clouds are passing from the morning until the afternoon and that the values remain between the upper and lower threshold values, indicating an “intermittent cloudy” condition. The values of delta (Δ) later in the afternoon have low frequency oscillations with values between the upper and lower threshold values and also below the lower threshold value, shifting between the “intermittent cloudy” and “clear” conditions. In this case, the infrared sensor values indicate a high confidence “intermittent cloudy” condition from the morning until the afternoon and the photosensor values indicate a high confidence “intermittent cloudy” condition in the later afternoon.



FIGS. 7A-7B include graphs of plots of readings taken over time during a day that is cloudy except for a short time during the middle of the day. FIG. 7A is a graph with a plot 710 of intensity readings, I, taken by the visible light photosensor over time. FIG. 7B is a graph with a plot 720 of the difference, delta (Δ), between temperature readings, TIR, taken by the infrared sensor and temperature readings, TA, taken by an ambient temperature sensor over time. As shown in the plot 710 of FIG. 7A, the intensity readings, I, taken by the visible light photosensor are low most of the daytime and increase with a high frequency (short period) oscillation when the sky clears briefly in the middle of the day. The plot 720 of FIG. 7B shows that the values of delta (Δ) do not go below the upper threshold value during the entire day, which indicates a high confidence “cloudy” condition.


In some implementations, an infrared cloud detector system uses readings from an infrared sensor to evaluate the delta differential between the ambient temperature and the temperature reading from an infrared sensor measuring wavelengths in an infrared range, for example, wavelengths between 8 and 14 micrometers. In some cases, one or more correction factors are applied to the calculated delta differential. The delta differential provides a relative sky temperature value that can be used to classify the cloud cover condition. For example, a cloud cover condition may be determined as one of three buckets: “Clear,” “Cloudy,” and “Overcast.” In using this infrared cloud detector system, the determined cloud cover condition does not depend on whether the sun is out or whether the time is before sunrise or after sunset.
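A minimal sketch of the delta differential and three-bucket classification might look as follows (the additive correction model and the threshold values are illustrative assumptions; the disclosure does not specify them):

```python
def relative_sky_temperature(t_ir, t_ambient, corrections=()):
    """Delta differential between the IR sky temperature reading and
    the ambient temperature reading.

    `corrections` is an iterable of additive adjustment terms (e.g.,
    for humidity or sun elevation); the specific correction models
    are not given here and any values used would be illustrative.
    """
    delta = t_ir - t_ambient
    for c in corrections:
        delta += c
    return delta


def bucket(delta, clear_max=-8.0, overcast_min=0.0):
    """Classify the delta into the three buckets named in the text
    (threshold values are illustrative)."""
    if delta < clear_max:
        return "Clear"
    if delta > overcast_min:
        return "Overcast"
    return "Cloudy"
```

For example, a sky reading 15 degrees colder than ambient would fall in the “Clear” bucket under these illustrative thresholds.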


The infrared cloud detector system according to certain implementations may have one or more technical advantages. For example, during early morning and evening conditions, the infrared sensor can determine whether it is cloudy or sunny independently of visible light intensity levels. This determination of the cloud cover condition during these times, when a photosensor would be ineffective because the sun is not fully up, may provide additional context for determining a tint state of a tintable window. As another example, the infrared sensor can be used to detect the general cloud cover condition within its field-of-view. This information can be used in conjunction with photosensor readings to determine whether a “clear” or “cloudy” condition determined by the photosensor is likely to persist. For example, if the photosensor detects a sharp rise in intensity levels, which would tend to indicate a “clear” condition, but the infrared sensor indicates a “cloudy” condition, the “clear” condition is not expected to persist. Conversely, if the infrared sensor indicates a “clear” condition and the photosensor readings indicate that it is a “clear” condition, then the “clear” condition is likely to persist. As another example, on occasions where a tintable window needs to be at a steady state at sunrise, the transition needs to start at X time (e.g., transition time) before sunrise. During this time, the photosensor is ineffective as there is minimal light exposure. The IR sensor can determine the cloud conditions before sunrise to inform the control logic whether to begin the tinting process (during clear sky) or keep the tintable window clear in anticipation of a “cloudy” condition at sunrise.
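The pre-sunrise tint decision in the last example can be sketched as follows (the function name, the decision rule, and the use of a fixed transition time are illustrative assumptions):

```python
from datetime import datetime, timedelta


def start_tint_before_sunrise(ir_condition, now, sunrise, transition_time):
    """Before sunrise, when a photosensor is ineffective, decide from
    the IR-based cloud condition whether to begin tinting so the
    window reaches a steady state by sunrise. The decision rule is an
    illustrative simplification of the control logic described above.
    """
    if now < sunrise - transition_time:
        return False  # too early: starting now would finish before needed
    # Tint for a clear sky; stay clear in anticipation of a cloudy sunrise.
    return ir_condition == "clear"
```

For instance, with a 30-minute transition time, a “clear” IR determination at 20 minutes before sunrise would start the tinting process.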


III. Methods of Using Readings from at Least One Infrared Sensor and One Ambient Temperature Sensor to Determine a Cloud Cover Condition


FIGS. 8-10 show flow charts describing methods of using readings from at least one infrared sensor and one ambient temperature sensor to determine a cloud cover condition, according to various embodiments. In FIGS. 9-10, readings from at least one photosensor can also be used to determine the cloud cover condition under certain conditions. In some cases, the infrared sensor used to take temperature readings is calibrated to detect infrared radiation in about the 8 μm to 14 μm spectrum and/or has a field-of-view of about 72 degrees. In some cases, the photosensor used to take the photosensor readings is calibrated to detect intensity of visible light (e.g., between about 390 nm and about 700 nm) within a photopic range, which generally refers to light that is visible to the average human eye under well-lit conditions (e.g., a luminance level ranging between about 10 cd/m2 and about 10^8 cd/m2). Although these methods are described with respect to readings from a single infrared sensor, a single ambient temperature sensor, and/or a single photosensor, it would be understood that values from multiple sensors of a type can be used; for example, multiple sensors oriented in different directions can be used. If multiple sensors are used, the method may use a single value based on a sensor (e.g., a functioning sensor) of a particular orientation or take an average, mean, or other statistically relevant value of readings from multiple functioning sensors. In other cases, there may be redundant sensors and the infrared cloud detector may have logic that uses the values from a functioning sensor, for example, by evaluating which of the sensors are functioning and/or which are not functioning based on comparing the readings from the various sensors.
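One way to sketch the aggregation of readings from redundant sensors of one type (the median-based consistency check and the tolerance value are illustrative assumptions; the disclosure leaves the comparison rule open):

```python
from statistics import median


def aggregate_readings(readings, tolerance=5.0):
    """Combine readings from redundant sensors of one type.

    A sensor is treated as functioning if its reading lies within
    `tolerance` of the median of all readings (a simple consistency
    check standing in for comparing readings across sensors).
    Returns the mean of the functioning sensors' readings.
    """
    if not readings:
        raise ValueError("no sensor readings")
    m = median(readings)
    functioning = [r for r in readings if abs(r - m) <= tolerance]
    return sum(functioning) / len(functioning)
```

A sensor reporting 55 degrees alongside peers near 10 degrees would be excluded as non-functioning before the mean is taken.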


A. Method I



FIG. 8 shows a flowchart 800 describing a method that uses temperature readings from an infrared sensor and an ambient temperature sensor to determine a cloud cover condition, according to implementations. The infrared sensor and ambient temperature sensor of the infrared cloud detector system generally take readings (at sample times) on a periodic basis. A processor executes instructions stored in memory to perform the operations of this method. In one implementation, the infrared cloud detector system has components similar to those of the system having the infrared cloud detector 100 described with respect to FIG. 1. In another implementation, the infrared cloud detector system has components similar to those of the system with the infrared cloud detector 310 in FIG. 3.


In FIG. 8, the method starts at operation 801. At operation 810, one or more signals are received, at the processor, with a temperature reading, TIR, taken by the infrared sensor and a temperature reading, TA, taken by the ambient temperature sensor. Signals from the infrared sensor and/or ambient temperature sensor are received wirelessly and/or via wired electrical connections. The infrared sensor takes temperature readings based on infrared radiation received within its field-of-view. The infrared sensor is usually oriented toward a region of sky of interest, for example, a region above a building. The ambient temperature sensor is configured to be exposed to the outside environment to measure the ambient temperature.


At operation 820, the processor calculates the difference, delta (Δ), between the temperature reading, TIR, taken by the infrared sensor and the temperature reading, TA, taken by an ambient temperature sensor at a sample time. Optionally (denoted by dotted line), correction factors are applied to the calculated delta (Δ) (operation 830). Some examples of correction factors that may be applied include humidity, sun angle/elevation, and site elevation.


At operation 840, the processor determines whether the calculated delta (Δ) value is below a lower threshold value (e.g., −5 degrees Celsius, −2 degrees Celsius, etc.). If it is determined that the calculated delta (Δ) value is below the lower threshold value, the cloud cover condition is determined to be a “clear” condition (operation 850). During operation of the infrared cloud detector, the method then increments to the next sample time and returns to operation 810.


If it is determined that the calculated delta (Δ) is above the lower threshold value, then the processor determines whether the calculated delta (Δ) is above an upper threshold value (e.g., 0 degrees Celsius, 2 degrees Celsius, etc.) at operation 860. If it is determined that the calculated delta (Δ) is above the upper threshold value at operation 860, then the processor determines the cloud cover condition to be a “cloudy” condition (operation 870). During operation of the infrared cloud detector, the method then increments to the next sample time and returns to operation 810.


If it is determined that the calculated delta (Δ) is below the upper threshold value at operation 860, then the processor determines the cloud cover condition to be “intermittent cloudy” or another intermediate condition (operation 880). During operation of the infrared cloud detector, the method then increments to the next sample time and returns to operation 810.
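The decision logic of flowchart 800 (operations 840-880) can be sketched compactly as follows, using the example threshold values from the text (correction factors from operation 830 are omitted for brevity):

```python
def cloud_cover_condition(t_ir, t_ambient, lower=-2.0, upper=2.0):
    """Classify cloud cover from the delta between the infrared sky
    temperature reading and the ambient temperature reading.

    Threshold defaults use example values from the text (-2 and 2
    degrees Celsius); other values may be used in practice.
    """
    delta = t_ir - t_ambient
    if delta < lower:
        return "clear"          # sky much colder than ambient (operation 850)
    if delta > upper:
        return "cloudy"         # sky near or above ambient (operation 870)
    return "intermittent cloudy"  # between thresholds (operation 880)
```

A sky temperature well below ambient yields “clear”; a sky temperature near or above ambient yields “cloudy”.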


B. Method II



FIG. 9 shows a flowchart 900 describing a method that determines a cloud cover condition using readings from an infrared sensor, an ambient temperature sensor, and a photosensor of an infrared cloud detector system, according to implementations. The infrared sensor, ambient temperature sensor, and photosensor generally take readings (at sample times) on a periodic basis. The infrared cloud detector system also includes a processor that can execute instructions stored in memory to perform the operations of this method. In one implementation, the infrared sensor, ambient temperature sensor, and photosensor are similar to components of the infrared cloud detector system 300 described with respect to FIG. 3. In another implementation, the infrared sensor, ambient temperature sensor, and photosensor are similar to components of the infrared cloud detector system 400 described with respect to FIGS. 4A-4C.


In FIG. 9, the method starts at operation 901. At operation 910, one or more signals are received, at the processor, with a temperature reading, TIR, taken by an infrared sensor at a particular sample time, a temperature reading, TA, taken by the ambient temperature sensor at the sample time, and an intensity reading taken by the photosensor at the sample time. Signals from the infrared sensor, ambient temperature sensor, and photosensor are received wirelessly and/or via wired electrical connections. The infrared sensor takes temperature readings based on infrared radiation received within its field-of-view. The infrared sensor is usually oriented toward a region of sky of interest, for example, a region above a building. The ambient temperature sensor is configured to be exposed to the outside environment to measure the ambient temperature. The sensing surface of the photosensor is usually also oriented toward the region of sky of interest and direct sunlight is blocked or diffused from impinging the sensing surface.


At operation 920, the processor determines whether the time of day is during one of the following time periods: (i) a time period shortly before sunrise (e.g., starting at a first time of 45 minutes before sunrise, 30 minutes before sunrise, 20 minutes before sunrise, or other suitable amount of time before sunrise) and up to slightly after sunrise (e.g., ending at a second time of 45 minutes after sunrise, 30 minutes after sunrise, 20 minutes after sunrise, or other suitable amount of time after sunrise) and (iii) a time period shortly before sunset (dusk) (e.g., starting at a third time of 45 minutes before sunset, 30 minutes before sunset, 20 minutes before sunset, or other suitable amount of time before sunset) and up until sunset. In one case, the time of sunrise can be determined from measurements taken by the visible light photosensor. For example, the time period (i) may end at the point where the visible light photosensor begins to measure direct sunlight, i.e., where an intensity reading of the visible light photosensor is at or above a minimum intensity value. Additionally or alternatively, the time period (iii) may be determined to end at the point where the intensity reading from the visible light photosensor is at or below the minimum intensity value. In another example, the time of sunrise and/or the time of sunset may be calculated using a solar calculator and the day of the year, and the time periods (i) and (iii) can be calculated using a defined period of time (e.g., 45 minutes) before and after the calculated times of sunrise and sunset.
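Assuming precomputed sunrise and sunset times (e.g., from a solar calculator), the time-period determination at operation 920 can be sketched as follows (the period labels and the single 45-minute offset are illustrative simplifications):

```python
from datetime import datetime, timedelta


def time_period(now, sunrise, sunset, offset=timedelta(minutes=45)):
    """Classify the time of day into the periods used at operation 920.

    (i)   shortly before sunrise up to slightly after sunrise,
    (iii) shortly before sunset up until sunset,
    (ii)  daytime between periods (i) and (iii),
    (iv)  nighttime otherwise.
    The 45-minute offset is one of the example values from the text.
    """
    if sunrise - offset <= now <= sunrise + offset:
        return "i"
    if sunset - offset <= now <= sunset:
        return "iii"
    if sunrise + offset < now < sunset - offset:
        return "ii"
    return "iv"
```

For example, with sunrise at 6:00 and sunset at 20:00, a reading at 5:30 falls in period (i) and a reading at noon falls in period (ii).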


If it is determined at operation 920 that the time of day is during either of the time periods (i) or (iii), then the processor calculates the difference, delta (Δ), between the temperature reading, TIR, taken by the infrared sensor and the temperature reading, TA, taken by an ambient temperature sensor at a sample time (operation 930). Optionally (denoted by dotted line), correction factors are applied to the calculated delta (Δ) (operation 930). Some examples of correction factors that may be applied include humidity, sun angle/elevation, and site elevation.


In one embodiment, the processor also determines at operation 920 whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 920 that the time of day is either within the time period (i) or (iii) and that the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 990 to use the photosensor readings to determine the cloud cover condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and determine a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. If the system is still in operation, the method increments to the next sample time and returns to operation 910.


At operation 934, the processor determines whether the calculated delta (Δ) value is below a lower threshold value (e.g., −5 degrees Celsius, −2 degrees Celsius, etc.). If it is determined that the calculated delta (Δ) value is below the lower threshold value, the cloud cover condition is determined to be a “clear” condition (operation 936). During operation of the infrared cloud detector, the method then increments to the next sample time and returns to operation 910.


If it is determined that the calculated delta (Δ) is above the lower threshold value, then the processor determines whether the calculated delta (Δ) is above an upper threshold value (e.g., 0 degrees Celsius, 2 degrees Celsius, etc.) at operation 940. If it is determined that the calculated delta (Δ) is above the upper threshold value at operation 940, then the processor determines the cloud cover condition to be a “cloudy” condition (operation 942). If still in operation, the method increments to the next sample time and returns to operation 910.


If it is determined that the calculated delta (Δ) is below the upper threshold value at operation 940, then the processor determines the cloud cover condition to be “intermittent cloudy” or another intermediate condition (operation 950). If the system is still in operation, the method increments to the next sample time and returns to operation 910.


If it is determined at operation 920 that the time of day is not during either of the time periods (i) or (iii), then the processor determines whether the time of day is during the time period (ii), which is in the daytime after the time period (i) and before the time period (iii) (operation 960). If the processor determines at operation 960 that the time of day is during the daytime time period (ii), then the processor calculates the difference between the temperature reading, TIR, taken by the infrared sensor and the intensity reading taken by the photosensor (operation 970). At operation 980, the processor determines whether the calculated difference is within an acceptable limit. If the processor determines at operation 980 that the calculated difference is more than the acceptable limit, then the processor applies operation 930 to calculate the delta (Δ) and uses the calculated delta (Δ) to determine the cloud cover condition as discussed above.


In one embodiment, the processor also determines at operation 960 whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 960 that the time of day is within the time period (ii) and that the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 990 to use the photosensor readings to determine the cloud cover condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and determine a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. If the system is still in operation, the method increments to the next sample time and returns to operation 910.


If the processor determines at operation 980 that the calculated difference is within the acceptable limit, the photosensor reading is used to determine the cloud cover condition (operation 990). For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and determine a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. If the system is still in operation, the method increments to the next sample time and returns to operation 910.


In one embodiment, the processor also determines at operation 970 whether the photosensor readings are oscillating at a frequency greater than a first defined level and whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 980 that the calculated difference is within the acceptable limit and the processor determines that the photosensor readings are oscillating at a frequency greater than the first defined level, then the processor applies operation 930 to calculate the delta (Δ) and uses the calculated delta (Δ) to determine the cloud cover condition as discussed above. If the processor determines at operation 980 that the calculated difference is not within the acceptable limit and the processor determines that the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 990 to use the photosensor readings to determine the cloud cover condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and determine a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. If the system is still in operation, the method increments to the next sample time and returns to operation 910.


If the processor determines at operation 960 that the time of day is in the nighttime time period (iv) after the time period (iii) and before the time period (i), the processor calculates the delta (Δ) at operation 930 and uses the calculated delta (Δ) to determine the cloud cover condition as discussed above.
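Putting the branches of flowchart 900 together, a simplified dispatch might look as follows (the threshold values, the `difference_ok` agreement flag standing in for operation 980, and the omission of the oscillation-frequency overrides are illustrative simplifications):

```python
def method_ii_condition(period, delta, photo_intensity,
                        lower=-2.0, upper=2.0, min_intensity=20.0,
                        difference_ok=True):
    """Simplified sketch of the dispatch in FIG. 9.

    `period` is "i", "ii", "iii", or "iv"; `delta` is the (optionally
    corrected) difference between the IR and ambient temperature
    readings; `difference_ok` represents the operation 980 check that
    the photosensor and IR readings agree within an acceptable limit.
    """
    def from_delta():
        # Operations 934-950: threshold the IR/ambient delta.
        if delta < lower:
            return "clear"
        if delta > upper:
            return "cloudy"
        return "intermittent cloudy"

    def from_photosensor():
        # Operation 990: threshold the visible light intensity.
        return "clear" if photo_intensity > min_intensity else "cloudy"

    if period in ("i", "iii", "iv"):  # around sunrise/sunset, and nighttime
        return from_delta()
    # Daytime period (ii): trust the photosensor when it agrees with the IR sensor.
    return from_photosensor() if difference_ok else from_delta()
```

During the daytime the photosensor governs unless its reading disagrees with the infrared sensor, in which case the delta-based classification is used.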


C. Method III—Module C Algorithm that Uses Infrared Sensor, Ambient Temperature Sensor, and Photosensor Readings.


In energy efficient buildings, control logic for setting levels of building systems may consider cloud cover. For example, in buildings with optically-switchable windows, control logic may consider cloud cover in setting window optical states (e.g., tint states in an electrochromic window). Conventional systems that purport to provide this functionality typically employ expensive sensing equipment to map the entire sky and track clouds. This mapping technology can also be hampered by not being able to register clouds until there is enough light to see them. Thus, by the time the clouds are registered, building systems may not need to be adjusted.


In various implementations described herein, a cloud cover condition determined by sensor data from an infrared cloud detector system (e.g., a system of FIG. 1, system 300 in FIG. 3, system 400 in FIGS. 4A-4C, or other infrared cloud detector system described herein) can be used to set levels of building systems. As an example, this section describes control logic that uses sensor readings from sensors in an infrared cloud detector system to determine a cloud cover condition and set tint levels in one or more optically-switchable windows (e.g., electrochromic windows) of a building based on the determined cloud cover condition. Electrochromic windows have one or more electrochromic devices such as the electrochromic devices described in U.S. Pat. No. 8,764,950, issued on Jul. 1, 2014 and titled “ELECTROCHROMIC DEVICES,” and U.S. patent application Ser. No. 13/462,725, filed on May 2, 2012 and titled “ELECTROCHROMIC DEVICES,” both of which are hereby incorporated by reference in their entirety.


i) Introduction to Electrochromic Devices/Windows



FIG. 10A schematically depicts an electrochromic device 1000, in cross-section. The electrochromic device 1000 includes a substrate 1002, a first conductive layer (CL) 1004, an electrochromic layer (EC) 1006, an ion conducting layer (IC) 1008, a counter electrode layer (CE) 1010, and a second conductive layer (CL) 1014. In one implementation, the electrochromic layer (EC) 1006 comprises tungsten oxide and the counter electrode layer (CE) 1010 comprises nickel-tungsten oxide. Layers 1004, 1006, 1008, 1010, and 1014 are collectively referred to as an electrochromic stack 1020. A voltage source 1016 operable to apply an electric potential across the electrochromic stack 1020 effects transition of the electrochromic device, for example, between a bleached state (e.g., as depicted in FIG. 10B) and a colored state (e.g., as depicted in FIG. 10C). The order of layers can be reversed with respect to the substrate 1002.


In some cases, electrochromic devices having distinct layers can be fabricated as all solid state devices and/or all inorganic devices. Examples of such devices and methods of fabricating them are described in more detail in U.S. patent application Ser. No. 12/645,111, titled “Fabrication of Low-Defectivity Electrochromic Devices” and filed on Dec. 22, 2009, and in U.S. patent application Ser. No. 12/645,159 (issued as U.S. Pat. No. 8,432,603 on Apr. 30, 2013), titled “Electrochromic Devices” and filed on Dec. 22, 2009, both of which are hereby incorporated by reference in their entireties. It should be understood, however, that any one or more of the layers in the stack may contain some amount of organic material. The same can be said for liquids that may be present in one or more layers in small amounts. It should also be understood that solid state material may be deposited or otherwise formed by processes employing liquid components such as certain processes employing sol-gels or chemical vapor deposition. Additionally, it should be understood that reference to a transition between a bleached state and colored state is non-limiting and suggests only one example, among many, of an electrochromic transition that may be implemented. Unless otherwise specified herein (including the foregoing discussion), whenever reference is made to a bleached-colored transition, the corresponding device or process encompasses other optical state transitions such as non-reflective-reflective, transparent-opaque, etc. Further, the term “bleached” refers to an optically neutral state, for example, uncolored, transparent, or translucent. Still further, unless specified otherwise herein, the “color” of an electrochromic transition is not limited to any particular wavelength or range of wavelengths. As understood by those of skill in the art, the choice of appropriate electrochromic and counter electrode materials governs the relevant optical transition.


In some implementations, an electrochromic device is configured to reversibly cycle between a bleached state and a colored state. When the electrochromic device is in a bleached state, a potential is applied to the electrochromic stack 1020 such that available ions in the stack reside primarily in the counter electrode 1010. When the potential on the electrochromic stack is reversed, the ions are transported across the ion conducting layer 1008 to the electrochromic material 1006 and cause the material to transition to the colored state. In a similar way, the electrochromic device of certain implementations described herein is configured to reversibly cycle between different tint levels (e.g., bleached state, darkest colored state, and intermediate levels between the bleached state and the darkest colored state).


Referring again to FIG. 10A, a voltage source 1016 is configured to operate in conjunction with input from sensors. As described herein, the voltage source 1016 interfaces with a controller (not shown in this figure). Additionally, the voltage source 1016 may interface with an energy management system that controls the electrochromic device according to various criteria such as the time of year, time of day, and measured environmental conditions. Such an energy management system, in conjunction with large area electrochromic windows, can dramatically lower the energy consumption of a building having the electrochromic windows.


Any material having suitable optical, electrical, thermal, and mechanical properties may be used as the substrate 1002 or other substrate of an electrochromic stack described herein. Examples of suitable substrates include, for example, glass, plastic, and mirror materials. Suitable glasses include either clear or tinted soda lime glass, including soda lime float glass. The glass may be tempered or untempered. In many cases, the substrate is a glass pane sized for residential window applications. The size of such a glass pane can vary widely depending on the specific needs of the residence. In other cases, the substrate is architectural glass. Architectural glass is typically used in commercial buildings, but may also be used in residential buildings, and typically, though not necessarily, separates an indoor environment from an outdoor environment. In certain examples, architectural glass is at least 20 inches by 20 inches, and can be much larger, for example, as large as about 80 inches by 120 inches. Architectural glass is typically at least about 2 mm thick, typically between about 3 mm and about 6 mm thick. Of course, electrochromic devices are scalable to substrates smaller or larger than architectural glass. Further, the electrochromic device may be provided on a mirror of any size and shape.


On top of the illustrated substrate 1002 is a conductive layer 1004. In certain implementations, one or both of the conductive layers 1004 and 1014 is inorganic and/or solid. The conductive layers 1004 and 1014 may be made from a number of different materials, including conductive oxides, thin metallic coatings, conductive metal nitrides, and composite conductors. Typically, the conductive layers 1004 and 1014 are transparent at least in the range of wavelengths where electrochromism is exhibited by the electrochromic layer. Transparent conductive oxides include metal oxides and metal oxides doped with one or more metals. Examples of such metal oxides and doped metal oxides include indium oxide, indium tin oxide, doped indium oxide, tin oxide, doped tin oxide, zinc oxide, aluminum zinc oxide, doped zinc oxide, ruthenium oxide, doped ruthenium oxide and the like. Since oxides are often used for these layers, they are sometimes referred to as “transparent conductive oxide” (TCO) layers. Thin metallic coatings that are substantially transparent may also be used, as well as combinations of TCOs and metallic coatings.


The function of the conductive layers is to spread an electric potential provided by the voltage source 1016 over surfaces of the electrochromic stack 1020 to interior regions of the stack, with relatively little ohmic potential drop. The electric potential is transferred to the conductive layers through electrical connections to the conductive layers. In some aspects, bus bars, at least one in contact with conductive layer 1004 and at least one in contact with conductive layer 1014, provide the electric connection between the voltage source 1016 and the conductive layers 1004 and 1014. The conductive layers 1004 and 1014 may also be connected to the voltage source 1016 with other conventional means.


Overlaying the illustrated conductive layer 1004 is an electrochromic layer 1006. In some aspects, the electrochromic layer 1006 is inorganic and/or solid. The electrochromic layer may contain any one or more of a number of different electrochromic materials including metal oxides. Some examples of suitable metal oxides include tungsten oxide (WO3), molybdenum oxide (MoO3), niobium oxide (Nb2O5), titanium oxide (TiO2), copper oxide (CuO), iridium oxide (Ir2O3), chromium oxide (Cr2O3), manganese oxide (Mn2O3), vanadium oxide (V2O5), nickel oxide (Ni2O3), cobalt oxide (Co2O3) and the like. During operation, the electrochromic layer 1006 transfers ions to and receives ions from the counter electrode layer 1010 to cause reversible optical transitions. Generally, the colorization (or change in any optical property—e.g., absorbance, reflectance, and transmittance) of the electrochromic material is caused by reversible ion insertion into the material (e.g., intercalation) and a corresponding injection of a charge balancing electron. Typically some fraction of the ions responsible for the optical transition is irreversibly bound up in the electrochromic material. Some or all of the irreversibly bound ions are used to compensate “blind charge” in the material. In most electrochromic materials, suitable ions include lithium ions (Li+) and hydrogen ions (H+) (that is, protons). In some cases, however, other ions will be suitable. In various embodiments, lithium ions are used to produce the electrochromic phenomena. Intercalation of lithium ions into tungsten oxide (WO3-y (0<y≤~0.3)) causes the tungsten oxide to change from transparent (bleached state) to blue (colored state).


Referring again to FIG. 10A, in the electrochromic stack 1020, the ion conducting layer 1008 is sandwiched between the electrochromic layer 1006 and the counter electrode layer 1010. In some embodiments, the counter electrode layer 1010 is inorganic and/or solid. The counter electrode layer may comprise one or more of a number of different materials that serve as a reservoir of ions when the electrochromic device is in the bleached state. During an electrochromic transition initiated by, for example, application of an appropriate electric potential, the counter electrode layer transfers some or all of the ions it holds to the electrochromic layer, changing the electrochromic layer to the colored state. Concurrently, in the case of NiWO, the counter electrode layer colors with the loss of ions. Suitable materials for the counter electrode complementary to WO3 include nickel oxide (NiO), nickel tungsten oxide (NiWO), nickel vanadium oxide, nickel chromium oxide, nickel aluminum oxide, nickel manganese oxide, nickel magnesium oxide, chromium oxide (Cr2O3), manganese oxide (MnO2), and Prussian blue. When charge is removed from a counter electrode 1010 made of nickel tungsten oxide (that is, ions are transported from counter electrode 1010 to electrochromic layer 1006), the counter electrode layer 1010 will transition from a transparent state to a colored state.


In the illustrated electrochromic device 1000, between the electrochromic layer 1006 and the counter electrode layer 1010, there is the ion conducting layer 1008. The ion conducting layer 1008 serves as a medium through which ions are transported (in the manner of an electrolyte) when the electrochromic device transitions between the bleached state and the colored state. Preferably, the ion conducting layer 1008 is highly conductive to the relevant ions for the electrochromic and the counter electrode layers, but has sufficiently low electron conductivity that negligible electron transfer takes place during normal operation. A thin ion conducting layer with high ionic conductivity permits fast ion conduction and hence fast switching for high performance electrochromic devices. In certain aspects, the ion conducting layer 1008 is inorganic and/or solid.


Examples of suitable materials for an ion conducting layer (i.e. for electrochromic devices having a distinct IC layer) include silicates, silicon oxides, tungsten oxides, tantalum oxides, niobium oxides, and borates. These materials may be doped with different dopants, including lithium. Lithium doped silicon oxides include lithium silicon-aluminum-oxide. In some embodiments, the ion conducting layer comprises a silicate-based structure. In one aspect, a silicon-aluminum-oxide (SiAlO) is used for the ion conducting layer 1008.


In certain implementations, the electrochromic device 1000 includes one or more additional layers (not shown), such as one or more passive layers. Passive layers used to improve certain optical properties may be included in the electrochromic device 1000. Passive layers for providing moisture or scratch resistance may also be included in the electrochromic device 1000. For example, the conductive layers may be treated with anti-reflective or protective oxide or nitride layers. Other passive layers may serve to hermetically seal the electrochromic device 1000.



FIG. 10B is a schematic cross-section of an electrochromic device in a bleached state (or transitioning to a bleached state). In accordance with this illustrated example, an electrochromic device 1100 includes a tungsten oxide electrochromic layer (EC) 1106 and a nickel-tungsten oxide counter electrode layer (CE) 1110. The electrochromic device 1100 also includes a substrate 1102, a conductive layer (CL) 1104, an ion conducting layer (IC) 1108, and a conductive layer (CL) 1114. Layers 1104, 1106, 1108, 1110, and 1114 are collectively referred to as an electrochromic stack 1120. A power source 1116 is configured to apply a voltage potential and/or current to the electrochromic stack 1120 through suitable electrical connections (e.g., bus bars) to the conductive layers 1104 and 1114. In one aspect, the voltage source is configured to apply a potential of a few volts in order to drive a transition of the device from one optical state to another. The polarity of the potential as shown in FIG. 10B is such that the ions (lithium ions in this example) primarily reside (as indicated by the dashed arrow) in the nickel-tungsten oxide counter electrode layer 1110.



FIG. 10C is a schematic cross-section of the electrochromic device 1100 shown in FIG. 10B but in a colored state (or transitioning to a colored state). In FIG. 10C, the polarity of voltage source 1116 is reversed, so that the tungsten oxide electrochromic layer 1106 is made more negative to accept additional lithium ions, and thereby transition to the colored state. As indicated by the dashed arrow, lithium ions are transported across the ion conducting layer 1108 to the tungsten oxide electrochromic layer 1106. The tungsten oxide electrochromic layer 1106 is shown in the colored state or transitioning to the colored state. The nickel-tungsten oxide counter electrode 1110 is also shown in the colored state or transitioning to the colored state. As explained, nickel-tungsten oxide becomes progressively more opaque as it gives up (deintercalates) lithium ions. In this example, there is a synergistic effect where the transitions to colored states for both layers 1106 and 1110 are additive toward reducing the amount of light transmitted through the electrochromic stack and the substrate.


In certain implementations, an electrochromic device includes an electrochromic (EC) electrode layer and a counter electrode (CE) layer separated by an ionically conductive (IC) layer that is highly conductive to ions and highly resistive to electrons. As conventionally understood, the ionically conductive layer therefore prevents shorting between the electrochromic layer and the counter electrode layer. The ionically conductive layer allows the electrochromic and counter electrodes to hold a charge and thereby maintain their bleached or colored states. In electrochromic devices having distinct layers, the components form a stack which includes the ion conducting layer sandwiched between the electrochromic electrode layer and the counter electrode layer. The boundaries between these three stack components are defined by abrupt changes in composition and/or microstructure. Thus, the devices have three distinct layers with two abrupt interfaces.


In accordance with certain implementations, the counter electrode and electrochromic electrodes are formed immediately adjacent one another, sometimes in direct contact, without separately depositing an ionically conducting layer. In some implementations, electrochromic devices having an interfacial region rather than a distinct IC layer are employed. Such devices, and methods of fabricating them, are described in U.S. Pat. Nos. 8,300,298, 8,582,193, 8,764,950, and 8,764,951—each of the patents is titled “Electrochromic Devices,” and each is hereby incorporated by reference in its entirety.


In certain implementations, an electrochromic device may be integrated into an insulated glass unit (IGU) of an electrochromic window or may be in a single pane electrochromic window. For example, an electrochromic window may have an IGU including a first electrochromic lite and a second lite. The IGU also includes a spacer separating the first electrochromic lite and the second lite. The second lite in the IGU may be a non-electrochromic lite or otherwise. For example, the second lite may have an electrochromic device thereon and/or one or more coatings such as low-E coatings and the like. Either of the lites can also be laminated glass. Between the spacer and the first TCO layer of the electrochromic lite is a primary seal material. This primary seal material is also between the spacer and the second glass lite. Around the perimeter of the spacer is a secondary seal. These seals aid in keeping moisture out of the interior space of the IGU. They also serve to prevent argon or other gas that may be introduced into the interior space of the IGU from escaping. The IGU also includes bus bar wiring for connection to a window controller. In some implementations, one or both of the bus bars are inside the finished IGU; however, in one implementation, one bus bar is outside the seal of the IGU and one bus bar is inside the IGU. In the former case, an area is used to make the seal with one face of the spacer used to form the IGU. Thus, the wires or other connections to the bus bars run between the spacer and the glass. As many spacers are made of metal, e.g., stainless steel, which is conductive, it is desirable to take steps to avoid short circuiting due to electrical communication between the bus bar and connector thereto and the metal spacer.


iii) Logic for Controlling Electrochromic Devices/Windows


In some implementations, a controller (e.g., local window controller, network controller, master controller, etc.) includes intelligence control logic for calculating, determining, selecting or otherwise generating tint states for one or more optically-switchable windows (e.g., electrochromic windows) of a building. This control logic can be used to determine a cloud cover condition based on sensor data from an infrared cloud detector system at the building and use the determined cloud cover condition to determine tint states for the optically-switchable windows. This control logic can be used to implement methods for determining and controlling desired tint levels for the one or more electrochromic windows or other tintable windows to account for occupant comfort and/or energy conservation considerations. In some cases, the control logic employs one or more logic modules. FIGS. 11A-11C include diagrams depicting some general input to each of three logic Modules A, B, and C of an exemplary control logic of disclosed implementations. Additional examples of Modules A, B, and C are described in International Patent Application PCT/US16/41344, titled “CONTROL METHOD FOR TINTABLE WINDOWS,” filed on Jul. 7, 2016, and in PCT/US15/29675, titled “CONTROL METHOD FOR TINTABLE WINDOWS” and filed on May 5, 2015, each of which is hereby incorporated by reference in its entirety.



Each of FIGS. 11A-11C depicts a schematic side view of a room 1200 of a building having a desk 1201 and an electrochromic window 1205 located between the exterior and the interior of the building. Each diagram also depicts an infrared cloud detector system in accordance with one example. In other implementations, another example of an infrared cloud detector system described herein can be used. In the illustrated example, the infrared cloud detector system includes an infrared cloud detector 1230 located on the roof of the building. The infrared cloud detector 1230 comprises a housing 1232 having a cover made of light-diffusing material, an infrared sensor 1234 and a photosensor 1210 within the enclosure of the housing 1232, and an ambient temperature sensor 1236 located on a shaded surface of the housing 1232. The infrared sensor 1234 is configured to take temperature readings, TIR, based on infrared radiation received from the region of sky within its conical field-of-view 1235. The ambient temperature sensor 1236 is configured to take ambient temperature readings, TA, of the ambient air surrounding the infrared cloud detector 1230. The infrared sensor 1234 includes an imaginary axis that is perpendicular to the sensing surface of the infrared sensor 1234 and passes through its center. The infrared sensor 1234 is directed so that its sensing surface faces upward and can receive infrared radiation from a region of the sky within its field-of-view 1235. The ambient temperature sensor 1236 is located on a shaded surface to prevent direct sunlight from impinging on its sensing surface. Although not shown, the infrared cloud detector 1230 also includes one or more structures that hold its components within the housing 1232.
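The sky/ambient temperature comparison performed with readings like TIR and TA can be sketched as follows. This is an illustrative sketch only: a clear sky radiates at a much lower effective temperature than the ambient air, while cloud cover raises the infrared sky reading toward ambient. The function name and threshold values are assumptions for illustration, not values from this disclosure.

```python
# Hedged sketch of a delta-T cloud test: compare the infrared sky
# temperature reading against the ambient air temperature reading.
# The -15 C and -5 C thresholds are assumed example values.

def cloud_condition(t_sky_c, t_ambient_c,
                    clear_delta=-15.0, cloudy_delta=-5.0):
    """Classify cloud cover from the sky/ambient temperature difference (deg C)."""
    delta = t_sky_c - t_ambient_c
    if delta <= clear_delta:   # sky much colder than ambient -> clear sky
        return "clear"
    if delta >= cloudy_delta:  # sky reading near ambient -> overcast
        return "cloudy"
    return "intermittent"      # in between -> partial cloud cover
```

A reading of 0 °C sky against 20 °C ambient would classify as clear under these assumed thresholds, while an 18 °C sky reading would classify as cloudy.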


The infrared cloud detector system also includes a local window controller 1250 with a processor that can execute instructions stored in memory (not shown) for implementing the control logic to control the tint level of the electrochromic window 1205. The controller 1250 is in communication with the electrochromic window 1205 to send control signals. The controller 1250 is also in communication with (wirelessly or wired) the infrared sensor 1234 and the ambient temperature sensor 1236 to receive signals with temperature readings. The controller 1250 is also in communication with (wirelessly or wired) the photosensor 1210 to receive signals with visible light intensity readings.


According to certain aspects, power/communication lines extend from the building or another structure to the infrared cloud detector 1230. In one implementation, the infrared cloud detector 1230 includes a network interface that can couple the infrared cloud detector 1230 to a suitable cable. The infrared cloud detector 1230 can communicate data through the network interface to the controller 1250 or another controller (e.g., network controller and/or master controller) of the building. In some other implementations, the infrared cloud detector 1230 can additionally or alternatively include a wireless network interface enabling wireless communication with one or more external controllers. In some aspects, the infrared cloud detector 1230 can also include a battery within or coupled with its housing to power the sensors and electrical components within. The battery can provide such power in lieu of or in addition to the power from a power supply (for example, from a building power supply). In one aspect, the infrared cloud detector 1230 further includes at least one photovoltaic cell, for example, on an outer surface of the housing. This at least one photovoltaic cell can provide power in lieu of or in addition to the power provided by any other power supply.



FIG. 11A shows the penetration depth of direct sunlight into a room 1200 through an electrochromic window 1205 between the exterior and the interior of a building, which includes the room 1200. Penetration depth is a measure of how far direct sunlight will penetrate into the room 1200. As shown, penetration depth is measured in a horizontal direction away from the sill (bottom) of window 1205. Generally, the window defines an aperture that provides an acceptance angle for direct sunlight. The penetration depth is calculated based upon the geometry of the window (e.g., window dimensions), its position and orientation in the room, any fins or other exterior shading outside of the window, and the position of the sun (e.g. angle of direct sunlight for a particular time of day and date). Exterior shading to an electrochromic window 1205 may be due to any type of structure that can shade the window such as an overhang, a fin, etc. In FIG. 11A, there is an overhang 1220 above the electrochromic window 1205 that blocks a portion of the direct sunlight entering the room 1200 thus shortening the penetration depth.
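The geometry just described can be sketched with a simplified two-dimensional model: for a vertical window facing the sun, direct light entering at solar elevation angle reaches roughly (window height)/tan(elevation) past the sill, and an overhang trims the effective top of the window. The function, parameter names, and the simplified model are assumptions for illustration; a real implementation would also account for window orientation and solar azimuth.

```python
import math

# Illustrative-only 2-D model of penetration depth through a vertical,
# sun-facing window with an optional overhang above it.

def penetration_depth(window_top_m, sill_m, solar_elevation_deg,
                      overhang_depth_m=0.0):
    """Horizontal distance (m) that direct sun reaches past the sill."""
    alt = math.radians(solar_elevation_deg)
    if alt <= 0.0:
        return 0.0  # sun at or below the horizon: no direct penetration
    # The overhang shadows the top of the window; shift the effective top down.
    effective_top = max(window_top_m - overhang_depth_m * math.tan(alt), sill_m)
    return (effective_top - sill_m) / math.tan(alt)
```

For example, a window with its top at 2 m and sill at 1 m under 45° sun gives a 1 m penetration depth in this model; adding a 0.5 m overhang halves it.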


Module A can be used to determine a tint level that considers occupant comfort from direct sunlight through the electrochromic window 1205 onto an occupant or their activity area. The tint level is determined based on a calculated penetration depth of direct sunlight into the room and the space type (e.g., desk near window, lobby, etc.) in the room at a particular instant in time (time of day and day of year). In some cases, the tint level may also be based on providing sufficient natural lighting into the room. In some cases, the penetration depth is the value calculated at a time in the future to account for glass transition time (the time required for the window to tint, e.g., to 80%, 90% or 100% of the desired tint level). The issue addressed in Module A is that direct sunlight may penetrate so deeply into the room 1200 as to shine directly on an occupant working at a desk or other work surface in a room. Publicly available programs can provide calculation of the sun's position and allow for easy calculation of penetration depth.



FIGS. 11A-11C also show a desk 1201 in the room 1200 as an example of a space type associated with an activity area (i.e., the desk) and a location of the activity area (i.e., the location of the desk). Each space type is associated with different tint levels for occupant comfort. For example, if the activity is a critical activity such as work in an office being done at a desk or computer, and the desk is located near the window, the desired tint level may be higher than if the desk were further away from the window. As another example, if the activity is non-critical, such as the activity in a lobby, the desired tint level may be lower than for the same space having a desk.



FIG. 11B shows direct sunlight and radiation under clear sky conditions entering the room 1200 through the electrochromic window 1205, according to an implementation. The radiation may be from sunlight scattered by molecules and particles in the atmosphere. Module B determines a tint level based on calculated values of irradiance under clear sky conditions flowing through the electrochromic window 1205 under consideration. Various software, such as the open-source RADIANCE program, can be used to calculate clear sky irradiance at a certain latitude, longitude, time of year, and time of day, and for a given window orientation.



FIG. 11C shows radiant light from the sky as may be obstructed by or reflected from objects such as, for example, clouds and other buildings, according to an implementation. These obstructions and reflections are not accounted for in the clear sky radiation calculations. The radiant light from the sky is determined based on sensor data from sensors such as, for example, the infrared sensor 1234, the photosensor 1210, and the ambient temperature sensor 1236 of the infrared cloud detector system. The tint level determined by Module C is based on the sensor data. In many cases, the tint level is based on a cloud cover condition determined using sensor data from the sensors. Generally, the operations of Module B will determine a tint level that darkens (or does not change) the tint level determined by Module A and the operations of Module C will determine a tint level that lightens (or does not change) the tint level determined by Module B.


The control logic may implement one or more of the logic Modules A, B and C separately for each electrochromic window 1205 in the building. Each electrochromic window 1205 can have a unique set of dimensions, orientation (e.g., vertical, horizontal, tilted at an angle), position, associated space type, etc. A configuration file with this information and other information can be maintained for each electrochromic window 1205. The configuration file may be stored in a computer readable medium of the local window controller 1250 of the electrochromic window 1205 or in the building management system (“BMS”) described later in this disclosure. The configuration file can include information such as a window configuration, an occupancy lookup table, information about an associated datum glass, and/or other data used by the control logic. The window configuration may include information such as the dimensions of the electrochromic window 1205, the orientation of the electrochromic window 1205, the position of the electrochromic window 1205, etc. The occupancy lookup table describes tint levels that provide occupant comfort for certain space types and penetration depths. That is, the tint levels in the occupancy lookup table are designed to provide comfort to occupant(s) that may be in the room 1200 from direct sunlight on the occupant(s) or their workspace. The space type is a measure to determine how much tinting will be required to address occupant comfort concerns for a given penetration depth and/or provide comfortable natural lighting in the room. The space type parameter may take into consideration many factors. Among these factors is the type of work or other activity being conducted in a particular room and the location of the activity. Close work associated with detailed study requiring great attention might be assigned one space type, while a lounge or a conference room might have a different space type.
Additionally, the position of the desk or other work surface in the room with respect to the window is a consideration in defining the space type. For example, the space type may be associated with an office of a single occupant having a desk or other workspace located near the electrochromic window 1205. As another example, the space type may be a lobby.


In certain embodiments, one or more modules of the control logic can determine desired tint levels while accounting for energy conservation in addition to occupant comfort. These modules may determine energy savings associated with a particular tint level by comparing the performance of the electrochromic window 1205 at that tint level to a datum glass or other standard reference window. The purpose of using this reference window can be to ensure that the control logic conforms to requirements of the municipal building code or other requirements for reference windows used in the locale of the building. Often municipalities define reference windows using conventional low emissivity glass to control the amount of air conditioning load in the building. As an example of how the reference window fits into the control logic, the logic may be designed so that the irradiance coming through a given electrochromic window 1205 is never greater than the maximum irradiance coming through a reference window as specified by the respective municipality. In disclosed embodiments, control logic may use the solar heat gain coefficient (SHGC) value of the electrochromic window 1205 at a particular tint level and the SHGC of the reference window to determine the energy savings of using the tint level. Generally, the value of the SHGC is the fraction of incident light of all wavelengths transmitted through the window. Although a datum glass is described in many embodiments, other standard reference windows can be used. Generally the SHGC of the reference window (e.g., datum glass) is a variable that can be different for different geographical locations and window orientations, and is based on code requirements specified by the respective municipality.
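The SHGC comparison just described can be sketched in a few lines. This is a hedged illustration only: the function name and the simple linear model (savings proportional to the SHGC gap times incident solar flux) are assumptions, not the disclosure's method.

```python
# Illustrative sketch of estimating energy savings of a tint level
# relative to the datum glass: solar heat kept out is modeled as the
# SHGC difference times the incident irradiance. Names are assumed.

def energy_savings_w(tint_shgc, datum_shgc, incident_irradiance_w):
    """Solar heat (W) kept out relative to the datum glass (never negative)."""
    return max(0.0, (datum_shgc - tint_shgc) * incident_irradiance_w)
```

For instance, a tinted SHGC of 0.1 against a datum SHGC of 0.3 under 1000 W of incident flux would keep out roughly 200 W more than the datum glass in this simplified model.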


Generally, buildings are designed to have a heating, ventilation, and air conditioning (“HVAC”) system with the capacity to fulfill the maximum expected heating and/or air-conditioning loads required at any given instance. The calculation of required capacity may take into consideration the datum glass or reference window required in a building at the particular location where the building is being constructed. Therefore, it is important that the control logic meet or exceed the functional requirements of the datum glass in order to allow building designers to confidently determine how much HVAC capacity to put into a particular building. Since the control logic can be used to tint the window to provide additional energy savings over the datum glass, the control logic could be useful in allowing building designers to have a lower HVAC capacity than would have been required using the datum glass specified by the codes and standards.


Particular embodiments described herein assume that energy conservation is achieved by reducing air conditioning load in a building. Therefore, many of the implementations attempt to achieve the maximum tinting possible, while accounting for occupant comfort level and perhaps lighting load in a room having the window under consideration. However, in some climates, such as those at far northern and far southern latitudes, heating may be more of a concern than air conditioning. Therefore, the control logic can be modified, or even reversed in some respects, so that less tinting occurs in order to ensure that the heating load of the building is reduced.



FIG. 12 depicts a flowchart 1400 showing general control logic for a method of controlling one or more electrochromic windows (e.g., electrochromic window 1205) in a building, according to embodiments. The control logic uses one or more of Modules A, B, and C to calculate tint levels for the window(s) and sends instructions to transition the window(s) to the tint levels. The calculations in the control logic are run 1 to n times at intervals timed by the timer at operation 1410. For example, the tint level can be recalculated 1 to n times by one or more of the Modules A, B, and C and calculated for instances in time ti=t1, t2 . . . tn. Here, n is the number of recalculations performed, where n is at least 1. The logic calculations can be done at constant time intervals in some cases. In some cases, the logic calculations may be done every 2 to 5 minutes. However, tint transition for large pieces of electrochromic glass (e.g., up to 6 feet×10 feet) can take up to 30 minutes or more. For these large windows, calculations may be done on a less frequent basis such as every 30 minutes.
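The timed recalculation at operation 1410 can be sketched as a simple loop. This is an assumed, minimal illustration: the function names, the callback structure, and the idea of sleeping a fixed interval between runs are all hypothetical, not the disclosed implementation.

```python
import time

# Hedged sketch of a recalculation loop: recompute the tint level n
# times at a fixed interval, as described for operation 1410. The
# compute_tint and apply_tint callbacks are assumed stubs.

def run_control_loop(compute_tint, apply_tint, interval_s, n):
    """Recalculate and apply tint n times, sleeping interval_s between runs."""
    for i in range(n):
        tint = compute_tint(time.time())  # tint level for instant t_i
        apply_tint(tint)
        if i < n - 1:
            time.sleep(interval_s)        # e.g., 2-5 min; 30 min for large glass
```

A large window, per the text, would be run with a longer `interval_s` (such as 30 minutes) than a small one.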


At operation 1420, logic Modules A, B, and C perform calculations to determine a tint level for each electrochromic window at a single instant in time ti. These calculations can be performed by a processor of a controller. In certain embodiments, the control logic calculates how the window should transition in advance of the actual transition. In these cases, the calculations in Modules A, B, and C are based on a future time, for example, around or after transition is complete. For example, the future time used in the calculations may be a time in the future that is sufficient to allow the transition to be completed after receiving the tint instructions. In these cases, the controller can send tint instructions in the present time in advance of the actual transition. By the completion of the transition, the window will have transitioned to a tint level that is desired for that time.


At operation 1430, the control logic allows for certain types of overrides that disengage the algorithm at Modules A, B, and C and define override tint levels at operation 1440 based on some other consideration. One type of override is a manual override. This is an override implemented by an end user who is occupying a room and determines that a particular tint level (override value) is desirable. There may be situations where the user's manual override is itself overridden. An example of an override is a high demand (or peak load) override, which is associated with a requirement of a utility that energy consumption in the building be reduced. For example, on particularly hot days in large metropolitan areas, it may be necessary to reduce energy consumption throughout the municipality in order to not overly tax the municipality's energy generation and delivery systems. In such cases, the building may override the tint level from the control logic described herein to ensure that all windows have a particularly high level of tinting. Another example of an override may be if there is no occupant in the room, for example, on weekends in a commercial office building. In these cases, the building may disengage one or more Modules that relate to occupant comfort and all the windows may have a low level of tinting in cold weather and a high level of tinting in warm weather.
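The override precedence described above can be sketched as follows. This is an illustrative assumption: the ordering (peak-load override outranks a manual override, which outranks the computed tint) follows the text, but the function, the integer tint scale (0 = clear, 4 = darkest), and the unoccupied-room handling are hypothetical details.

```python
# Hedged sketch of the override handling at operations 1430/1440.
# Tint scale 0 (clear) to 4 (darkest) is an assumed convention.

def resolve_tint(module_tint, manual_override=None, peak_load=False,
                 unoccupied=False, warm_weather=True):
    """Return the tint level after applying the override rules."""
    if peak_load:
        return 4                         # utility demand: force dark tint
    if manual_override is not None:
        return manual_override           # occupant's choice wins next
    if unoccupied:
        return 4 if warm_weather else 0  # comfort modules disengaged
    return module_tint                   # otherwise, use the Modules A-C result
```

Under this sketch, a peak-load event darkens the window even if an occupant had manually chosen a lighter tint earlier.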


At operation 1450, the control signals for implementing the tint levels are transmitted over a network to the power supply in electrical communication with the electrochromic device(s) in one or more electrochromic windows in the building. In certain embodiments, the transmission of tint levels to all windows of a building may be implemented with efficiency in mind. For example, if the recalculation of a tint level suggests that no change in tint from the current tint level is required, then there is no transmission of instructions with an updated tint level. As another example, the building may be divided into zones based on window size and/or location in the building. In one case, control logic recalculates tint levels for zones with smaller windows more frequently than for zones with larger windows.
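The "transmit only on change" efficiency rule at operation 1450 can be sketched in a few lines. The helper name and callback are assumptions for illustration.

```python
# Hedged sketch of suppressing redundant transmissions: instructions
# are sent over the network only when the recalculated tint differs
# from the window's current tint. Names are assumed.

def maybe_send(current_tint, new_tint, send):
    """Send new_tint only if it differs from current_tint; report whether sent."""
    if new_tint != current_tint:
        send(new_tint)
        return True
    return False  # no change: skip the network transmission
```

Zoned recalculation could then simply call this helper more frequently for zones with smaller, faster-switching windows.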


In some embodiments, the control logic in FIG. 12 for implementing the control method(s) for multiple electrochromic windows in an entire building can be on a single device, for example, a single master window controller. This device can perform the calculations for each and every tintable window in the building and also provide an interface for transmitting tint levels to one or more electrochromic devices in individual electrochromic windows, for example, in multi-zone windows or on multiple EC lites of an insulated glass unit. Some examples of multi-zone windows can be found in PCT application No. PCT/US14/71314 titled “MULTI-ZONE EC WINDOWS,” which is hereby incorporated by reference in its entirety.


Also, there may be certain adaptive components of the control logic of embodiments. For example, the control logic may determine how an end user (e.g., occupant) tries to override the algorithm at particular times of day and make use of this information in a more predictive manner to determine desired tint levels. In one case, the end user may be using a wall switch to override the tint level provided by the control logic at a certain time each day to an override value. The control logic may receive information about these instances and change the control logic to change the tint level to the override value at that time of day.



FIG. 13 is a diagram showing a particular implementation of block 1420 from FIG. 12. This diagram shows a method of performing all three Modules A, B, and C in sequence to calculate a final tint level of a particular electrochromic window for a single instant in time ti. The final tint level may be the maximum permissible transmissivity of the window under consideration. FIG. 13 also shows some exemplary inputs and outputs of Modules A, B, and C. The calculations in Modules A, B, and C are performed by the processor of a local window controller, a network controller, or a master controller. Although certain examples describe all three Modules A, B, and C being used, other implementations may use one or more of the Modules A, B, and C or may use additional/different modules.


At operation 1470, the processor uses Module A to determine a tint level for occupant comfort to prevent direct glare from sunlight penetrating the room. The processor uses Module A to calculate the penetration depth of direct sunlight into the room based on the sun's position in the sky and the window configuration from the configuration file. The position of the sun is calculated based on the latitude and longitude of the building and the time of day and date. The occupancy lookup table and space type are input from a configuration file for the particular window. Module A outputs the Tint level from A to Module B. The goal of Module A is generally to ensure that direct sunlight or glare does not strike the occupant or his or her workspace. The tint level from Module A is determined to accomplish this purpose. Subsequent calculations of tint level in Modules B and C can reduce energy consumption and may require even greater tint. However, if subsequent calculations of tint level based on energy consumption suggest less tinting than required to avoid interfering with the occupant, the logic prevents the calculated greater level of transmissivity from being executed to assure occupant comfort.


At operation 1480, the tint level calculated in Module A is input into Module B. Generally Module B determines a tint level that darkens (or does not change) the tint level calculated in Module A. A tint level is calculated based on calculations of irradiance under clear sky conditions (clear sky irradiance). The processor of the controller uses Module B to calculate clear sky irradiance for the electrochromic window based on window orientation from the configuration file and based on latitude and longitude of the building. These calculations are also based on a time of day and date. Publicly available software such as the RADIANCE program, which is an open-source program, can provide the calculations for calculating clear sky irradiance. The SHGC of the datum glass is also input into Module B from the configuration file. The processor uses Module B to determine a tint level that is darker than the tint level from Module A and transmits less heat than the datum glass is calculated to transmit under maximum clear sky irradiance. Maximum clear sky irradiance is the highest level of irradiance for all times calculated for clear sky conditions.


At operation 1490, the tint level from Module B and the calculated clear sky irradiance are input to Module C. Sensor readings based on measurements taken by the infrared sensor(s), the ambient temperature sensor(s), and the photosensor(s) are also input to Module C. The processor uses Module C to determine the cloud cover condition based on the sensor readings and the actual irradiance. The processor also uses Module C to calculate the irradiance that would be transmitted into the room if the window were tinted to the tint level from Module B under clear sky conditions. The processor uses Module C to find an appropriate tint level for which the actual irradiance through the window, based on the cloud cover condition determined from the sensor readings, is less than or equal to the calculated irradiance through the window at the tint level from Module B. Generally, the operations of Module C determine a tint level that lightens (or does not change) the tint level determined by the operations of Module B. The tint level determined in Module C is the final tint level in this example.
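The lightening behavior of Module C can be illustrated with a short sketch. The per-condition attenuation factors and the SHGC-per-tint mapping below are placeholder assumptions for illustration only, not values from the disclosure.

```python
def module_c_tint(tint_b, clear_sky_irradiance, cloud_condition, shgc_by_tint):
    """Lighten (or keep) Module B's tint based on the sensed cloud condition.

    shgc_by_tint maps tint level -> solar heat gain coefficient
    (a lighter tint has a higher SHGC). Attenuation factors per cloud
    condition are placeholders.
    """
    attenuation = {"clear": 1.0, "intermittent": 0.6, "cloudy": 0.3}
    actual = clear_sky_irradiance * attenuation[cloud_condition]
    # Target: what Module B's tint would transmit under clear sky.
    target = shgc_by_tint[tint_b] * clear_sky_irradiance
    # Step toward lighter tints while transmitted irradiance stays <= target.
    tint = tint_b
    for lighter in range(tint_b - 1, -1, -1):
        if shgc_by_tint[lighter] * actual <= target:
            tint = lighter
        else:
            break
    return tint
```

Under a cloudy sky the actual irradiance drops, so a lighter tint can admit no more heat than Module B's tint would have admitted under clear sky, and the sketch steps toward the lightest such level.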


Much of the information input to the control logic is determined from fixed information about the latitude and longitude, time of day, and date. This information describes where the sun is with respect to the building, and more particularly with respect to the window for which the control logic is being implemented. The position of the sun with respect to the window provides information such as the penetration depth of direct sunlight into the room associated with the window. It also provides an indication of the maximum irradiance or solar radiant energy flux coming through the window. This calculated level of irradiance can be adjusted based on sensor input, which might indicate that there is a reduction due to the determined cloud cover condition or another obstruction between the window and the sun.


A program such as the open-source program RADIANCE is used to determine clear sky irradiance based on the window orientation and the latitude and longitude coordinates of the building, both for a single instant in time ti and as a maximum value over all times. The datum glass SHGC and the calculated maximum clear sky irradiance are input into Module B. Module B increases the tint level calculated in Module A in steps and picks a tint level where the Inside Irradiance is less than or equal to the Datum Inside Irradiance, where: Inside Irradiance=Tint level SHGC×Clear Sky Irradiance and Datum Inside Irradiance=Datum SHGC×Maximum Clear Sky Irradiance. However, when Module A calculates the maximum tint of the glass, Module B does not change the tint to make it lighter. The tint level calculated in Module B is then input into Module C. The calculated clear sky irradiance is also input into Module C.
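The step-up selection described above, using the two stated relations, might be sketched as follows; the tint scale and SHGC values used in the example are hypothetical.

```python
def module_b_tint(tint_a, clear_sky_irradiance, max_clear_sky_irradiance,
                  datum_shgc, shgc_by_tint):
    """Darken from Module A's tint in steps until the inside irradiance is
    at or below what the datum glass passes at maximum clear-sky irradiance.

    Inside Irradiance        = SHGC(tint) x Clear Sky Irradiance
    Datum Inside Irradiance  = Datum SHGC x Maximum Clear Sky Irradiance
    shgc_by_tint maps tint level (higher = darker) -> SHGC.
    """
    datum_inside = datum_shgc * max_clear_sky_irradiance
    darkest = max(shgc_by_tint)
    tint = tint_a
    while (shgc_by_tint[tint] * clear_sky_irradiance > datum_inside
           and tint < darkest):
        tint += 1  # darken one step; never lighten Module A's result
    return tint
```

With an assumed SHGC table of {0: 0.5, 1: 0.3, 2: 0.17, 3: 0.09}, a datum SHGC of 0.2, and a maximum clear sky irradiance of 1000 W/m², a clear sky irradiance of 800 W/m² steps the tint from level 0 up to level 2; a Module A tint of 3 is left unchanged.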


Example of Control Logic for Making Tinting Decisions Using an Infrared Cloud Detector System with a Photosensor



FIG. 14A is a flowchart 1500 depicting a particular implementation of the control logic of operation 1420 shown in FIG. 13, according to an implementation. Although this control logic is described with respect to a single window, it will be understood that the control logic can be used to control multiple windows or a zone of one or more windows.


At operation 1510, the control logic determines whether the time of day is during one of the following time periods: (i) a time period starting shortly before sunrise (e.g., starting at a first time of 45 minutes before sunrise, 30 minutes before sunrise, 20 minutes before sunrise, or another suitable amount of time before sunrise) and extending to slightly after sunrise (e.g., ending at a second time of 45 minutes after sunrise, 30 minutes after sunrise, 20 minutes after sunrise, or another suitable amount of time after sunrise); (ii) a daytime period after time period (i) and before time period (iii); or (iii) a time period starting shortly before sunset (dusk) (e.g., starting at a third time of 45 minutes before sunset, 30 minutes before sunset, 20 minutes before sunset, or another suitable amount of time before sunset) and extending up until sunset. In one case, the time of sunrise can be determined from measurements taken by the visible wavelength photosensor. For example, the time period (i) may end at the point where the visible light photosensor begins to measure direct sunlight, i.e., where an intensity reading of the visible light photosensor is at or above a minimum intensity value. Additionally or alternatively, the time period (iii) may be determined to end at the point where the intensity reading from the visible light photosensor falls to or below a minimum intensity value. In another example, the time of sunrise and/or the time of sunset may be calculated using a solar calculator and the day of the year, and the time periods (i) and (iii) can be calculated using a defined period of time (e.g., 45 minutes) before and after the calculated times of sunrise/sunset. If it is determined at operation 1510 that the time of day is not during any of the time periods (i), (ii), or (iii), then the control logic determines that the time of day is in the time period (iv) after time period (iii) and before time period (i), i.e., at nighttime.
In this case, the control logic passes a nighttime tint state (e.g., “clear”) and proceeds to operation 1560 to determine whether there is an override, for example, an override command received in a signal from an operator. If it is determined that there is an override at operation 1560, the override value is the final tint level. If it is determined that there is no override in place, the passed nighttime tint state is the final tint level. At operation 1570, a control command is sent over a network to, or directed to, the electrochromic device(s) of the window to transition the window to the final tint level, the time of day is updated, and the method returns to operation 1510. If, instead, it is determined at operation 1510 that the time of day is during one of the time periods (i), (ii), or (iii), then the time of day is between just before sunrise and sunset, and the control logic goes on to determine whether the sun azimuth is between critical angles of the tintable window at operation 1520.
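The time-period test of operation 1510 can be sketched as a simple classifier. The 45-minute offset is one of the example values given above; the function name and the use of pre-computed sunrise/sunset times (e.g., from a solar calculator) are assumptions.

```python
from datetime import datetime, timedelta

def classify_time_period(now, sunrise, sunset, offset=timedelta(minutes=45)):
    """Classify the time of day into periods (i)-(iv) of operation 1510.

    (i)   shortly before sunrise up to slightly after sunrise
    (ii)  daytime, between periods (i) and (iii)
    (iii) shortly before sunset up until sunset
    (iv)  nighttime, after sunset and before period (i)
    """
    if sunrise - offset <= now <= sunrise + offset:
        return "i"
    if sunset - offset <= now <= sunset:
        return "iii"
    if sunrise + offset < now < sunset - offset:
        return "ii"
    return "iv"
```

Periods (i) and (iii) select the IR/ambient-temperature comparison, period (ii) selects the photosensor-led daytime logic, and period (iv) selects the nighttime tint state.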


If it is determined by the control logic at operation 1520 that the sun azimuth is outside the critical angles, then Module A is bypassed, and a “clear” tint level is passed to Module B, and Module B is used to make calculations at operation 1540. If it is determined at operation 1520 that the sun azimuth is between the critical angles, the control logic in Module A is used to calculate penetration depth and an appropriate tint level based on penetration depth at operation 1530. The tint level determined from Module A is then input to Module B and Module B is used to make calculations at operation 1540.


At operation 1540, the control logic from Module B determines a tint level that darkens (or does not change) the tint level from Module A. The tint level is calculated based on calculations of irradiance under clear sky conditions (clear sky irradiance). Module B is used to calculate clear sky irradiance for the window based on window orientation from the configuration file and based on latitude and longitude of the building. These calculations are also based on a time of day and date. Publicly available software such as the RADIANCE program, which is an open-source program, can provide calculations for determining clear sky irradiance. The SHGC of the datum glass is also input into Module B from the configuration file. The processor uses the control logic of Module B to determine a tint level that is darker than the tint level from Module A and transmits less heat than the datum glass is calculated to transmit under maximum clear sky irradiance. Maximum clear sky irradiance is the highest level of irradiance for all times calculated for clear sky conditions.


At operation 1550, a tint level from Module B, the calculated clear sky irradiance and sensor readings from an infrared sensor(s), an ambient temperature sensor(s), and a photosensor(s) are input to Module C. The control logic of Module C determines the cloud cover condition based on the sensor readings and determines the actual irradiance based on the cloud cover condition. The control logic of Module C also calculates an irradiance level that would be transmitted into the room if the window were tinted to the Tint level from Module B under clear sky conditions. The control logic in Module C decreases the tint level if the determined actual irradiance through the window based on the cloud cover condition is less than or equal to the calculated irradiance through the window when tinted to the tint level from Module B. Generally the operations of Module C will determine a tint level that lightens (or does not change) the tint level determined by the operations of Module B.


At operation 1550, the control logic determines a tint level from Module C based on the sensor readings and then proceeds to operation 1560 to determine whether there is an override in place, for example, an override command received in a signal from an operator. If it is determined that there is an override at operation 1560, the override value is the final tint level. If it is determined that there is no override in place, the tint level from Module C is the final tint level. At operation 1570, a control command is sent over a network to, or directed to, the electrochromic device(s) of the window to transition the window to the final tint level, the time of day is updated, and the method returns to operation 1510.



FIG. 14B is a flowchart 1600 depicting a particular implementation of the control logic of operation 1550 shown in FIG. 14A. At operation 1610, one or more signals are received, at the processor, with a temperature reading, TIR, taken by an infrared sensor at a particular sample time, a temperature reading, TA, taken by the ambient temperature sensor at the sample time, and an intensity reading taken by the photosensor at the sample time. Signals from the infrared sensor, ambient temperature sensor, and photosensor are received wirelessly and/or via wired electrical connections. The infrared sensor takes temperature readings based on infrared radiation received within its field-of-view. The infrared sensor is usually oriented toward a region of sky of interest, for example, a region above a building. The ambient temperature sensor is configured to be exposed to the outside environment to measure the ambient temperature. The sensing surface of the photosensor is usually also oriented toward the region of sky of interest, and direct sunlight is blocked or diffused from impinging on the sensing surface.


If it is determined at operation 1620 that the time of day is during either of the time periods (i) or (iii), then the processor calculates the difference, delta (Δ), between the temperature reading, TIR, taken by the infrared sensor and the temperature reading, TA, taken by the ambient temperature sensor at the sample time (operation 1630). Optionally (denoted by dotted line), correction factors are applied to the calculated delta (Δ) (operation 1630). Some examples of correction factors that may be applied include humidity, sun angle/elevation, and site elevation.


In one embodiment, the processor also determines at operation 1620 whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 1620 that the time of day is either within the time period (i) or (iii) and the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 1690 to use the photosensor readings to determine the cloud condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. If the system is still in operation, the method increments to the next sample time and returns to operation 1610.


At operation 1634, the processor determines whether the calculated delta (Δ) value is below a lower threshold value (e.g., −5 degrees Celsius, −2 degrees Celsius, etc.). If it is determined that the calculated delta (Δ) value is below the lower threshold value, the cloud cover condition is determined to be a “clear” condition (operation 1636). During operation of the infrared cloud detector, the method then increments to the next sample time and returns to operation 1610.


If it is determined that the calculated delta (Δ) is at or above the lower threshold value, then the processor determines whether the calculated delta (Δ) is above an upper threshold value (e.g., 0 degrees Celsius, 2 degrees Celsius, etc.) at operation 1640. If it is determined that the calculated delta (Δ) is above the upper threshold value at operation 1640, then the processor determines the cloud cover condition to be a “cloudy” condition (operation 1642).


At operation 1695, the control logic determines the actual irradiance based on the cloud cover condition and calculates an irradiance level that would be transmitted into the room if the window were tinted to the Tint level from Module B under clear sky conditions. The control logic in Module C decreases the tint level from Module B if the irradiance based on the cloud cover condition is less than or equal to the calculated irradiance through the window when tinted to the tint level from Module B. The control logic then increments to the next sample time and returns to operation 1560.


If it is determined that the calculated delta (Δ) is at or below the upper threshold value at operation 1640, then the processor determines the cloud cover condition to be “intermittent cloudy” or another intermediate condition (operation 1650) and proceeds to operation 1695, described in detail above.
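Operations 1630 through 1650 amount to a two-threshold classification of the temperature difference. A compact sketch follows, using the example threshold values from the text; the correction factors are passed in as plain offsets, which is an assumption about their form.

```python
def cloud_condition_from_delta(t_ir, t_ambient, corrections=(),
                               lower=-5.0, upper=0.0):
    """Classify cloud cover from the IR sky / ambient temperature difference.

    delta = TIR - TA, optionally adjusted by correction factor offsets
    (e.g., for humidity, sun angle/elevation, site elevation; the additive
    form is a placeholder assumption). Thresholds in degrees Celsius use
    the example values from the text.
    """
    delta = t_ir - t_ambient + sum(corrections)
    if delta < lower:
        return "clear"        # cold sky: little downwelling cloud radiation
    if delta > upper:
        return "cloudy"       # warm sky: clouds radiating back toward sensor
    return "intermittent"     # between the thresholds: intermediate condition
```

A clear night sky reads much colder than ambient (large negative delta), while a cloudy sky reads close to or above ambient, which is why the sign and magnitude of the difference discriminate the conditions.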


If it is determined at operation 1620 that the time of day is not during either of the time periods (i) or (iii), then the time of day is during the daytime period (ii), and the processor calculates the difference between the temperature reading, TIR, taken by the infrared sensor and the intensity reading taken by the photosensor at operation 1670. At operation 1680, the processor determines whether the calculated difference is within an acceptable limit. If the processor determines at operation 1680 that the calculated difference is more than the acceptable limit, then the processor applies operation 1630 to calculate the delta (Δ) and uses the calculated delta (Δ) to determine the cloud cover condition as discussed above.


In one embodiment, the processor also determines at operation 1660 whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 1660 that the time of day is within the time period (ii) and that the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 1690 to use the photosensor readings to determine the cloud condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. The control logic then proceeds to operation 1695, described in detail above.


If the processor determines at operation 1680 that the calculated difference is within the acceptable limit, the photosensor reading is used to determine the cloud cover condition (operation 1690). For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. The control logic then proceeds to operation 1695, described in detail above.


In one embodiment, the processor also determines at operation 1670 whether the photosensor readings are oscillating at a frequency greater than a first defined level and whether the infrared readings are oscillating at a frequency greater than a second defined level. If the processor determines at operation 1680 that the calculated difference is within the acceptable limit but the photosensor readings are oscillating at a frequency greater than the first defined level, then the processor applies operation 1630 to calculate the delta (Δ) and uses the calculated delta (Δ) to determine the cloud cover condition as discussed above. If the processor determines at operation 1680 that the calculated difference is not within the acceptable limit but the infrared readings are oscillating at a frequency greater than the second defined level, then the processor applies operation 1690 to use the photosensor readings to determine the cloud condition. For example, the processor may determine a “clear” condition if the photosensor reading is above a certain minimum intensity level and a “cloudy” condition if the photosensor reading is at or below the minimum intensity level. The control logic then proceeds to operation 1695, described in detail above.
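The daytime branch (operations 1660 through 1690) can be summarized as a preference ordering between the two sensor types. The sketch below assumes the IR temperature and photosensor intensity have been normalized to comparable units for the agreement test, which the text does not specify; all names and limits are illustrative.

```python
def daytime_cloud_condition(t_ir, t_ambient, photo_intensity,
                            photo_oscillating, ir_oscillating,
                            agreement_limit, min_intensity,
                            lower=-5.0, upper=0.0):
    """Sketch of the daytime (period (ii)) decision.

    Prefer the photosensor when it agrees with the IR reading and is stable;
    fall back to the IR/ambient delta when the photosensor oscillates, and
    back to the photosensor when the IR readings oscillate.
    """
    agree = abs(t_ir - photo_intensity) <= agreement_limit
    use_photosensor = (agree and not photo_oscillating) or ir_oscillating
    if use_photosensor:
        return "clear" if photo_intensity > min_intensity else "cloudy"
    # Otherwise classify from the temperature difference as in operation 1630.
    delta = t_ir - t_ambient
    if delta < lower:
        return "clear"
    if delta > upper:
        return "cloudy"
    return "intermittent"
```

The oscillation checks act as sanity filters: a rapidly oscillating sensor is treated as unreliable, and the decision shifts to the other sensing modality.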


Although a single infrared sensor is described as included in the infrared cloud detector of certain implementations, two or more infrared sensors can be used, according to another implementation, for redundancy in case one malfunctions and/or is obscured by, for example, bird droppings or another environmental agent. In one aspect, two or more infrared sensors can be included that face different orientations to capture infrared radiation from different fields-of-view and/or at different distances from the building/structure. If two or more infrared sensors are located within a housing of the infrared cloud detector, the infrared sensors are typically offset from one another by a distance sufficient to reduce the likelihood that an obscuring agent would affect all the infrared sensors. For example, the infrared sensors may be separated by at least about one inch or at least about two inches.


It should be understood that the present invention as described above can be implemented in the form of control logic using computer software in a modular or integrated manner. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know and appreciate other ways and/or methods to implement the present invention using hardware and a combination of hardware and software.


Any of the software components or functions described in this application, may be implemented as software code to be executed by a processor using any suitable computer language such as, for example, Java, C++ or Perl using, for example, conventional or object-oriented techniques. The software code may be stored as a series of instructions, or commands on a computer readable medium, such as a random access memory (RAM), a read only memory (ROM), a magnetic medium such as a hard-drive or a floppy disk, or an optical medium such as a CD-ROM. Any such computer readable medium may reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.


Although the foregoing disclosed embodiments have been described in some detail to facilitate understanding, the described embodiments are to be considered illustrative and not limiting. It will be apparent to one of ordinary skill in the art that certain changes and modifications can be practiced within the scope of the appended claims.


One or more features from any embodiment may be combined with one or more features of any other embodiment without departing from the scope of the disclosure. Further, modifications, additions, or omissions may be made to any embodiment without departing from the scope of the disclosure. The components of any embodiment may be integrated or separated according to particular needs without departing from the scope of the disclosure.

Claims
  • 1. A rooftop sensor system disposed on a building, the rooftop sensor system comprising: at least one infrared sensor; and at least one ambient temperature sensor; wherein the rooftop sensor system is configured to determine a cloud condition based in part on a difference between a sky temperature reading from the at least one infrared sensor and an ambient temperature reading from the at least one ambient temperature sensor; and wherein the rooftop sensor system further comprises, or is in communication with, one or more window controllers configured to control tint of one or more tintable windows disposed in the building based in part on the cloud condition determined.
  • 2. The rooftop sensor system of claim 1, wherein the cloud condition occurs at a future time.
  • 3. The rooftop sensor system of claim 1, wherein the rooftop sensor system is configured to determine a tint state for the one or more tintable windows based on the cloud condition determined and/or a calculated irradiance.
  • 4. The rooftop sensor system of claim 3, wherein the rooftop sensor system is configured to send instructions to the one or more window controllers to transition the one or more tintable windows to the tint state determined.
  • 5. The rooftop sensor system of claim 1, wherein the at least one infrared sensor is configured to generate the sky temperature reading based on infrared radiation received within its field-of-view and/or having wavelength above 5 μm.
  • 6. The rooftop sensor system of claim 1, wherein the at least one infrared sensor is configured to detect infrared radiation having wavelength in a range between 8 μm and 14 μm.
  • 7. The rooftop sensor system of claim 1, wherein the rooftop sensor system is configured to apply one or more correction factors to the difference between the sky temperature reading and the ambient temperature reading before determining the cloud condition.
  • 8. The rooftop sensor system of claim 1, wherein the rooftop sensor system is configured to adjust the difference to account for humidity, an elevation of the sun, an angle of the sun, and/or an elevation of the building.
  • 9. The rooftop sensor system of claim 1, wherein the at least one infrared sensor comprises at least one of an infrared thermometer, an infrared radiometer, an infrared pyrgeometer, and an infrared pyrometer.
  • 10. The rooftop sensor system of claim 1, wherein the rooftop sensor system is configured to (A) determine the cloud condition is a clear condition if the difference is below a first threshold value and (B) determine the cloud condition is a cloudy condition if the difference is above a second threshold value.
  • 11. The rooftop sensor system of claim 10, wherein the rooftop sensor system is configured to determine the cloud condition is an intermediate condition if the difference is above the first threshold value and below the second threshold value.
  • 12. The rooftop sensor system of claim 1, further comprising a light diffusing material between the at least one infrared sensor and an environment outside the building.
  • 13. A rooftop sensor system disposed on a building, the rooftop sensor system comprising: at least one infrared sensor; at least one ambient temperature sensor; and at least one photosensor; wherein the rooftop sensor system is configured to determine a cloud condition: (I) based in part on a sky temperature reading from the at least one infrared sensor and an ambient temperature reading from the at least one ambient temperature sensor if a time of day is (i) between a first time before a sunrise time and a second time after the sunrise time or (ii) between a third time before a sunset time and the sunset time; and (II) based in part on an intensity reading from the at least one photosensor if the time of day is between the second time after the sunrise time and before the third time before the sunset time.
  • 14. The rooftop sensor system of claim 13, wherein the time of day is a time at which the sky temperature reading, the ambient temperature reading, and the intensity reading are taken.
  • 15. The rooftop sensor system of claim 13, wherein if the time of day is (i) between the first time before the sunrise time and the second time after the sunrise time or (ii) between the third time before the sunset time and the sunset time, the cloud condition is determined based in part on a difference between the sky temperature reading and the ambient temperature reading.
  • 16. The rooftop sensor system of claim 15, wherein the rooftop sensor system is configured to apply one or more correction factors to the difference between the sky temperature reading and the ambient temperature reading before determining the cloud condition.
  • 17. The rooftop sensor system of claim 15, wherein the rooftop sensor system is configured to adjust the difference to account for humidity, an elevation of the sun, an angle of the sun, and/or an elevation of the building.
  • 18. The rooftop sensor system of claim 15, wherein the rooftop sensor system is configured to (A) determine the cloud condition is a clear condition if the difference is below a first threshold value and (B) determine the cloud condition is a cloudy condition if the difference is above a second threshold value.
  • 19. The rooftop sensor system of claim 18, wherein the rooftop sensor system is configured to determine the cloud condition is an intermediate condition if the difference is above the first threshold value and below the second threshold value.
  • 20. The rooftop sensor system of claim 13, wherein the cloud condition occurs at a future time.
  • 21. The rooftop sensor system of claim 13, wherein the rooftop sensor system is configured to determine a tint state for one or more tintable windows based on the cloud condition determined and/or a calculated irradiance.
  • 22. The rooftop sensor system of claim 21, wherein the rooftop sensor system is configured to send instructions to one or more window controllers to transition the one or more tintable windows to the tint state determined.
  • 23. The rooftop sensor system of claim 13, wherein the at least one photosensor is configured to generate the intensity reading based on visible light received.
  • 24. The rooftop sensor system of claim 13, wherein the at least one infrared sensor is configured to generate the sky temperature reading based on infrared radiation received within its field-of-view and/or having wavelength above 5 μm.
  • 25. The rooftop sensor system of claim 13, wherein the at least one infrared sensor is configured to detect infrared radiation having wavelength in a range between 8 μm and 14 μm.
  • 26. The rooftop sensor system of claim 13, wherein the at least one infrared sensor comprises at least one of an infrared thermometer, an infrared radiometer, an infrared pyrgeometer, and an infrared pyrometer.
  • 27. The rooftop sensor system of claim 13, further comprising a light diffusing material between the at least one infrared sensor and an environment outside the building.
  • 28. A method of controlling at least one tintable window disposed in a building, the method comprising: determining a tint state for the at least one tintable window based at least in part on a cloud condition, the cloud condition determined based in part on a difference between a sky temperature reading from at least one infrared sensor and an ambient temperature reading from at least one ambient temperature sensor, wherein the at least one infrared sensor and at least one ambient temperature sensor are disposed on a rooftop of the building; and sending instructions to transition the at least one tintable window to the tint state determined.
  • 29. The method of claim 28, wherein the cloud condition occurs at a future time.
  • 30. The method of claim 28, wherein the tint state is determined in part on the cloud condition and a calculated irradiance.
  • 31. The method of claim 28, further comprising applying one or more correction factors to the difference between the sky temperature reading and the ambient temperature reading.
  • 32. The method of claim 28, further comprising adjusting the difference between the sky temperature reading and the ambient temperature reading to account for humidity, an elevation of the sun, an angle of the sun, and/or an elevation of the building.
  • 33. The method of claim 28, wherein the cloud condition is a clear condition if the difference is below a first threshold value and the cloud condition is a cloudy condition if the difference is above a second threshold value.
  • 34. The method of claim 33, wherein the cloud condition is an intermediate condition if the difference is above the first threshold value and below the second threshold value.
  • 35. The method of claim 33, wherein the first threshold value is between −5 degrees Celsius and −10 degrees Celsius and/or the second threshold value is in a range between −5 degrees Celsius to 0 degrees Celsius.
  • 36. A method of controlling at least one tintable window disposed in a building, the method comprising: determining a tint state for the at least one tintable window based at least in part on a cloud condition determined: (I) based in part on a sky temperature reading from at least one infrared sensor and an ambient temperature reading from at least one ambient temperature sensor if a time of day is (i) between a first time before a sunrise time and a second time after the sunrise time or (ii) between a third time before a sunset time and the sunset time; and (II) based in part on an intensity reading from at least one photosensor if the time of day is between the second time after the sunrise time and before the third time before the sunset time, wherein the at least one infrared sensor, the at least one ambient temperature sensor, and the at least one photosensor are disposed on a rooftop of the building; and sending instructions to transition the at least one tintable window to the tint state determined.
  • 37. The method of claim 36, wherein the cloud condition occurs at a future time.
  • 38. The method of claim 36, wherein the time of day is a time at which the sky temperature reading, the ambient temperature reading, and the intensity reading are taken.
  • 39. The method of claim 36, wherein the tint state is determined in part on the cloud condition and a calculated irradiance.
  • 40. The method of claim 36, further comprising applying one or more correction factors to a difference between the sky temperature reading and the ambient temperature reading.
  • 41. The method of claim 36, further comprising adjusting a difference between the sky temperature reading and the ambient temperature reading to account for humidity, an elevation of the sun, an angle of the sun, and/or an elevation of the building.
  • 42. The method of claim 36, wherein the cloud condition is a clear condition if a difference between the sky temperature reading and the ambient temperature reading is below a first threshold value and the cloud condition is a cloudy condition if the difference is above a second threshold value.
  • 43. The method of claim 42, wherein the cloud condition is an intermediate condition if the difference is above the first threshold value and below the second threshold value.
  • 44. The method of claim 42, wherein the first threshold value is between −5 degrees Celsius and −10 degrees Celsius and/or the second threshold value is in a range between −5 degrees Celsius to 0 degrees Celsius.
CROSS-REFERENCES TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/335,222, titled “INFRARED CLOUD DETECTOR SYSTEMS AND METHODS” and filed on Mar. 20, 2019, which is a National Stage Application under 35 U.S.C. § 371 of International PCT Application No. PCT/US17/55631 (designating the United States), titled “INFRARED CLOUD DETECTOR SYSTEMS AND METHODS” and filed on Oct. 6, 2017, which claims benefit of and priority to U.S. Provisional Application 62/453,407, filed on Feb. 1, 2017 and titled “INFRARED CLOUD DETECTOR SYSTEMS AND METHODS”; International PCT Application No. PCT/US17/55631 is a continuation-in-part of International PCT Application No. PCT/US16/55709 (designating the United States), titled “MULTI-SENSOR” and filed on Oct. 6, 2016, which is a continuation-in-part of U.S. patent application Ser. No. 14/998,019, titled “MULTI-SENSOR HAVING A LIGHT DIFFUSING ELEMENT AROUND A PERIPHERY OF A RING OF PHOTOSENSORS” and filed on Oct. 6, 2015; International PCT Application No. PCT/US17/55631 is also a continuation-in-part of U.S. patent application Ser. No. 15/287,646, titled “MULTI-SENSOR DEVICE AND SYSTEM WITH A LIGHT DIFFUSING ELEMENT AROUND A PERIPHERY OF A RING OF PHOTOSENSORS AND AN INFRARED SENSOR” and filed on Oct. 6, 2016, which is a continuation-in-part of U.S. patent application Ser. No. 14/998,019; all of these applications are hereby incorporated by reference in their entireties and for all purposes.

US Referenced Citations (434)
Number Name Date Kind
3675023 Kunke et al. Jul 1972 A
3963347 Segre et al. Jun 1976 A
4171911 Aberle et al. Oct 1979 A
D256787 Petersen Sep 1980 S
D258871 Rajotte, Jr. et al. Apr 1981 S
4355896 Frosch et al. Oct 1982 A
4491727 Appelbaum et al. Jan 1985 A
D278132 Powell Mar 1985 S
4628206 Astheimer Dec 1986 A
4690553 Fukamizu et al. Sep 1987 A
4742236 Kawakami et al. May 1988 A
5124833 Barton et al. Jun 1992 A
5170108 Peterson et al. Dec 1992 A
5204778 Bechtel Apr 1993 A
5220317 Lynam et al. Jun 1993 A
5290986 Colon et al. Mar 1994 A
5353148 Eid et al. Oct 1994 A
5365365 Ripoche et al. Nov 1994 A
5379146 Defendini Jan 1995 A
5379215 Kruhoeffer et al. Jan 1995 A
5384578 Lynam et al. Jan 1995 A
5402144 Ripoche Mar 1995 A
5451822 Bechtel et al. Sep 1995 A
5489777 Stedman et al. Feb 1996 A
5583972 Miller Dec 1996 A
5598000 Popat Jan 1997 A
5606393 Schoenherr et al. Feb 1997 A
5621526 Kuze Apr 1997 A
5656807 Packard Aug 1997 A
5663621 Popat Sep 1997 A
5670774 Hill Sep 1997 A
5673028 Levy Sep 1997 A
5694144 Lefrou et al. Dec 1997 A
5760558 Popat Jun 1998 A
5764402 Thomas et al. Jun 1998 A
5822107 Lefrou et al. Oct 1998 A
5838432 Tokuhashi et al. Nov 1998 A
5900720 Kallman et al. May 1999 A
5956012 Turnbull et al. Sep 1999 A
5973818 Sjursen et al. Oct 1999 A
5973819 Pletcher et al. Oct 1999 A
5978126 Sjursen Nov 1999 A
6002511 Varaprasad et al. Dec 1999 A
6039390 Agrawal et al. Mar 2000 A
6039850 Schulz Mar 2000 A
6055089 Schulz et al. Apr 2000 A
6064949 Werner et al. May 2000 A
6084231 Popat Jul 2000 A
6084700 Knapp et al. Jul 2000 A
6104513 Bloom Aug 2000 A
6125327 Kalenian Sep 2000 A
6130448 Bauer et al. Oct 2000 A
6130772 Cava Oct 2000 A
6163756 Baron et al. Dec 2000 A
D439532 Off et al. Mar 2001 S
6222177 Bechtel et al. Apr 2001 B1
6262831 Bauer et al. Jul 2001 B1
6266063 Baron et al. Jul 2001 B1
6269763 Woodland Aug 2001 B1
6386713 Turnbull et al. May 2002 B1
6398118 Rosen et al. Jun 2002 B1
6407468 LeVesque et al. Jun 2002 B1
6407847 Poll et al. Jun 2002 B1
6417500 Wood Jul 2002 B1
6449082 Agrawal et al. Sep 2002 B1
6466312 Li Oct 2002 B1
6471360 Rukavina et al. Oct 2002 B2
6493128 Agrawal et al. Dec 2002 B1
6535126 Lin et al. Mar 2003 B2
6567708 Bechtel et al. May 2003 B1
6614577 Yu et al. Sep 2003 B1
6795226 Agrawal et al. Sep 2004 B2
6819367 Cava Nov 2004 B1
6829511 Bechtel et al. Dec 2004 B2
6856444 Ingalls et al. Feb 2005 B2
6897936 Li et al. May 2005 B1
6940627 Freeman et al. Sep 2005 B2
6965813 Granqvist et al. Nov 2005 B2
7049602 Tokhtuev et al. May 2006 B2
7079944 Litchfield et al. Jul 2006 B2
7085609 Bechtel et al. Aug 2006 B2
7105800 Staerzl Sep 2006 B1
7111952 Veskovic Sep 2006 B2
7133181 Greer Nov 2006 B2
7215318 Turnbull et al. May 2007 B2
7277215 Greer Oct 2007 B2
7304787 Whitesides et al. Dec 2007 B2
7417397 Berman et al. Aug 2008 B2
7472590 Solheim et al. Jan 2009 B2
7542809 Bechtel et al. Jun 2009 B2
7548833 Ahmed Jun 2009 B2
7562928 Morazan Jul 2009 B1
7567183 Schwenke Jul 2009 B2
7587289 Sivertsen Sep 2009 B1
7588067 Veskovic Sep 2009 B2
7610910 Ahmed Nov 2009 B2
7714290 Kudoh May 2010 B2
7800812 Moskowitz Sep 2010 B2
7817326 Rennig et al. Oct 2010 B1
7822490 Bechtel et al. Oct 2010 B2
7873490 MacDonald Jan 2011 B2
7941245 Popat May 2011 B1
7950827 Veskovic May 2011 B2
7963675 Veskovic Jun 2011 B2
7972021 Scherer Jul 2011 B2
7977904 Berman et al. Jul 2011 B2
7990603 Ash et al. Aug 2011 B2
8004739 Letocart Aug 2011 B2
8018644 Gustavsson et al. Sep 2011 B2
8102586 Albahri Jan 2012 B2
8164818 Collins et al. Apr 2012 B2
8213074 Shrivastava et al. Jul 2012 B1
8254013 Mehtani et al. Aug 2012 B2
8270059 Friedman et al. Sep 2012 B2
8288981 Zaharchuk et al. Oct 2012 B2
8292228 Mitchell et al. Oct 2012 B2
8300298 Wang et al. Oct 2012 B2
8380393 Ohtomo Feb 2013 B1
8432603 Wang et al. Apr 2013 B2
8456729 Brown et al. Jun 2013 B2
8547624 Ash et al. Oct 2013 B2
8582193 Wang et al. Nov 2013 B2
8681073 Robbins et al. Mar 2014 B1
8705162 Brown et al. Apr 2014 B2
8723467 Berman et al. May 2014 B2
8754983 Sutton Jun 2014 B2
8764950 Wang et al. Jul 2014 B2
8764951 Wang et al. Jul 2014 B2
D712759 Forsberg et al. Sep 2014 S
8836263 Berman et al. Sep 2014 B2
8864321 Mehtani et al. Oct 2014 B2
8902486 Chandrasekhar Dec 2014 B1
8934170 Takeda et al. Jan 2015 B2
D723600 Nauli et al. Mar 2015 S
D725284 Karlsson et al. Mar 2015 S
8976440 Berland et al. Mar 2015 B2
D727467 Batiste Apr 2015 S
9016630 Mitchell et al. Apr 2015 B2
9030725 Pradhan et al. May 2015 B2
9078299 Ashdown Jul 2015 B2
9081246 Rozbicki Jul 2015 B2
9081247 Pradhan et al. Jul 2015 B1
9128346 Shrivastava et al. Sep 2015 B2
9164002 Anderson Oct 2015 B2
9226366 Orillard et al. Dec 2015 B2
D747988 Matsumiya et al. Jan 2016 S
D748508 Park et al. Feb 2016 S
9261751 Pradhan et al. Feb 2016 B2
9298203 Wenzel Mar 2016 B2
9341912 Shrivastava et al. May 2016 B2
9348192 Brown et al. May 2016 B2
D761135 Allen, Sr. et al. Jul 2016 S
9404793 Yang et al. Aug 2016 B2
9406028 Humann Aug 2016 B2
9423664 Brown et al. Aug 2016 B2
9454055 Brown et al. Sep 2016 B2
9506802 Chu et al. Nov 2016 B2
9523902 Parker Dec 2016 B2
9546515 Hall et al. Jan 2017 B2
9574934 Verbeek et al. Feb 2017 B2
D780612 Alexander et al. Mar 2017 S
9638978 Brown et al. May 2017 B2
9645465 Brown et al. May 2017 B2
9664974 Kozlowski et al. May 2017 B2
9668315 Shearer et al. May 2017 B2
9674924 Lashina et al. Jun 2017 B2
9709869 Baumann et al. Jul 2017 B2
9746181 Mori Aug 2017 B2
9807857 Huang Oct 2017 B2
9927674 Brown et al. Mar 2018 B2
9938765 Berman et al. Apr 2018 B2
D816518 Brown et al. May 2018 S
10048561 Brown Aug 2018 B2
10063815 Spivey et al. Aug 2018 B1
10234596 Frank et al. Mar 2019 B2
10241095 Coates Mar 2019 B2
10254618 Parker Apr 2019 B2
10316581 Nagel et al. Jun 2019 B1
10495939 Brown et al. Dec 2019 B2
10520784 Brown et al. Dec 2019 B2
10533892 Brown et al. Jan 2020 B2
10539456 Klawuhn et al. Jan 2020 B2
10539854 Brown et al. Jan 2020 B2
10605970 Blair et al. Mar 2020 B2
10690540 Brown et al. Jun 2020 B2
10712627 Brown et al. Jul 2020 B2
10732028 Klawuhn et al. Aug 2020 B2
10802372 Brown Oct 2020 B2
10895498 Klawuhn et al. Jan 2021 B2
10908470 Brown et al. Feb 2021 B2
10921675 Barnum et al. Feb 2021 B2
10982487 Ramirez Apr 2021 B2
11126057 Brown et al. Sep 2021 B2
11221434 Frank et al. Jan 2022 B2
11255722 Zedlitz Feb 2022 B2
11261654 Brown et al. Mar 2022 B2
11280671 Brown et al. Mar 2022 B2
11346710 Klawuhn et al. May 2022 B2
11520207 Brown et al. Dec 2022 B2
20020075472 Holton Jun 2002 A1
20020135881 Rukavina et al. Sep 2002 A1
20020144831 Kalt Oct 2002 A1
20020152298 Kikta et al. Oct 2002 A1
20030076480 Burbulla Apr 2003 A1
20030142140 Brown et al. Jul 2003 A1
20030191546 Bechtel et al. Oct 2003 A1
20030210449 Ingalls et al. Nov 2003 A1
20030210450 Yu et al. Nov 2003 A1
20030227663 Agrawal et al. Dec 2003 A1
20030227664 Agrawal et al. Dec 2003 A1
20040043760 Rosenfeld et al. Mar 2004 A1
20040047518 Tiana Mar 2004 A1
20040108191 Su et al. Jun 2004 A1
20040135989 Klebe Jul 2004 A1
20040160322 Stilp Aug 2004 A1
20050046920 Freeman et al. Mar 2005 A1
20050063036 Bechtel et al. Mar 2005 A1
20050200934 Callahan et al. Sep 2005 A1
20050225830 Huang et al. Oct 2005 A1
20050268629 Ahmed Dec 2005 A1
20050270620 Bauer et al. Dec 2005 A1
20050278047 Ahmed Dec 2005 A1
20060018000 Greer Jan 2006 A1
20060038983 Bickel et al. Feb 2006 A1
20060107616 Ratti et al. May 2006 A1
20060170376 Piepgras et al. Aug 2006 A1
20060187608 Stark Aug 2006 A1
20060207730 Berman et al. Sep 2006 A1
20060209007 Pyo et al. Sep 2006 A1
20060238860 Baun et al. Oct 2006 A1
20060245024 Greer Nov 2006 A1
20070002007 Tam Jan 2007 A1
20070012349 Gaudiana et al. Jan 2007 A1
20070055757 Mairs et al. Mar 2007 A1
20070067048 Bechtel et al. Mar 2007 A1
20070145252 Litchfield et al. Jun 2007 A1
20070162233 Schwenke Jul 2007 A1
20070221338 Meewis et al. Sep 2007 A1
20070285759 Ash et al. Dec 2007 A1
20080012755 Venkatachalam et al. Jan 2008 A1
20080018979 Mahe et al. Jan 2008 A1
20080043316 Moskowitz Feb 2008 A2
20080128586 Johnson et al. Jun 2008 A1
20080173818 Staerzl Jul 2008 A1
20080174455 Staerzl Jul 2008 A1
20090020233 Berman et al. Jan 2009 A1
20090027759 Albahri Jan 2009 A1
20090066157 Tarng et al. Mar 2009 A1
20090079349 Sibalich et al. Mar 2009 A1
20090139669 Robin Jun 2009 A1
20090143141 Wells et al. Jun 2009 A1
20090187287 Bruhnke et al. Jul 2009 A1
20090204269 Bechtel et al. Aug 2009 A1
20090231092 Maegawa et al. Sep 2009 A1
20090243732 Tarng et al. Oct 2009 A1
20090243802 Wolf et al. Oct 2009 A1
20090281820 Sarkar et al. Nov 2009 A1
20090296188 Jain et al. Dec 2009 A1
20090316671 Rolf et al. Dec 2009 A1
20090323160 Egerton et al. Dec 2009 A1
20090326709 Hooper et al. Dec 2009 A1
20100039410 Becker et al. Feb 2010 A1
20100066484 Hanwright et al. Mar 2010 A1
20100071856 Zaharchuk et al. Mar 2010 A1
20100082081 Niessen et al. Apr 2010 A1
20100100324 Caminiti et al. Apr 2010 A1
20100172009 Matthews Jul 2010 A1
20100172010 Gustavsson et al. Jul 2010 A1
20100188057 Tarng Jul 2010 A1
20100235206 Miller et al. Sep 2010 A1
20100243427 Kozlowski et al. Sep 2010 A1
20100245972 Wright Sep 2010 A1
20100245973 Wang et al. Sep 2010 A1
20100294330 Huang et al. Nov 2010 A1
20100296081 Granqvist Nov 2010 A1
20100315693 Lam et al. Dec 2010 A1
20110035061 Altonen et al. Feb 2011 A1
20110046810 Bechtel et al. Feb 2011 A1
20110063708 Letocart Mar 2011 A1
20110066302 McEwan Mar 2011 A1
20110080629 Neuman et al. Apr 2011 A1
20110148218 Rozbicki Jun 2011 A1
20110164304 Brown et al. Jul 2011 A1
20110167617 Letocart Jul 2011 A1
20110235152 Letocart Sep 2011 A1
20110249313 Letocart Oct 2011 A1
20110255142 Ash et al. Oct 2011 A1
20110266137 Wang et al. Nov 2011 A1
20110266138 Wang et al. Nov 2011 A1
20110266419 Jones et al. Nov 2011 A1
20110267674 Wang et al. Nov 2011 A1
20110267675 Wang et al. Nov 2011 A1
20110292488 McCarthy et al. Dec 2011 A1
20110295575 Levine et al. Dec 2011 A1
20110304898 Letocart Dec 2011 A1
20110304899 Kwak et al. Dec 2011 A1
20110308318 Magnussen Dec 2011 A1
20120006110 Niemann et al. Jan 2012 A1
20120007507 Niemann et al. Jan 2012 A1
20120026573 Collins et al. Feb 2012 A1
20120033287 Friedman et al. Feb 2012 A1
20120062975 Mehtani et al. Mar 2012 A1
20120070071 Rankin et al. Mar 2012 A1
20120089257 Holger Apr 2012 A1
20120091315 Moskowitz Apr 2012 A1
20120095601 Abraham et al. Apr 2012 A1
20120133315 Berman et al. May 2012 A1
20120147449 Bhatnagar et al. Jun 2012 A1
20120188627 Chen et al. Jul 2012 A1
20120190386 Anderson Jul 2012 A1
20120194895 Podbelski et al. Aug 2012 A1
20120200908 Bergh et al. Aug 2012 A1
20120236386 Mehtani et al. Sep 2012 A1
20120239209 Brown et al. Sep 2012 A1
20120261078 Adams et al. Oct 2012 A1
20120265350 Ashdown Oct 2012 A1
20120268803 Greer et al. Oct 2012 A1
20120275008 Pradhan et al. Nov 2012 A1
20120293855 Shrivastava et al. Nov 2012 A1
20120323382 Kamel et al. Dec 2012 A1
20130021659 Friedman et al. Jan 2013 A1
20130057157 Nackaerts et al. Mar 2013 A1
20130057937 Berman et al. Mar 2013 A1
20130063065 Berman et al. Mar 2013 A1
20130139804 Goldberg Jun 2013 A1
20130158790 McIntyre, Jr. et al. Jun 2013 A1
20130173926 Morese et al. Jul 2013 A1
20130242370 Wang Sep 2013 A1
20130263510 Gassion Oct 2013 A1
20130264948 Orillard et al. Oct 2013 A1
20130271812 Brown et al. Oct 2013 A1
20130271813 Brown Oct 2013 A1
20130271814 Brown Oct 2013 A1
20130271815 Pradhan et al. Oct 2013 A1
20130321923 Thuot et al. Dec 2013 A1
20140043667 Bergh et al. Feb 2014 A1
20140055014 Pan Feb 2014 A1
20140067733 Humann Mar 2014 A1
20140083413 Bibi et al. Mar 2014 A1
20140104667 Greer et al. Apr 2014 A1
20140117852 Zhai et al. May 2014 A1
20140145002 Caldeira et al. May 2014 A1
20140156079 Courtney et al. Jun 2014 A1
20140160550 Brown et al. Jun 2014 A1
20140177025 Lee et al. Jun 2014 A1
20140177028 Shrivastava et al. Jun 2014 A1
20140236323 Brown et al. Aug 2014 A1
20140259931 Plummer Sep 2014 A1
20140262057 Chambers et al. Sep 2014 A1
20140268287 Brown et al. Sep 2014 A1
20140288715 Beaujeu et al. Sep 2014 A1
20140300945 Parker Oct 2014 A1
20140303788 Sanders et al. Oct 2014 A1
20140330538 Conklin et al. Nov 2014 A1
20140371931 Lin et al. Dec 2014 A1
20150002919 Jack et al. Jan 2015 A1
20150035440 Spero Feb 2015 A1
20150049378 Shrivastava et al. Feb 2015 A1
20150060648 Brown et al. Mar 2015 A1
20150070190 Wai Fong et al. Mar 2015 A1
20150070745 Pradhan Mar 2015 A1
20150092259 Greer et al. Apr 2015 A1
20150116811 Shrivastava et al. Apr 2015 A1
20150122474 Petersen May 2015 A1
20150161515 Matsuoka Jun 2015 A1
20150177415 Bing Jun 2015 A1
20150185581 Pradhan et al. Jul 2015 A1
20150219975 Phillips et al. Aug 2015 A1
20150234945 Marceau et al. Aug 2015 A1
20150293422 Pradhan et al. Oct 2015 A1
20150316473 Kester et al. Nov 2015 A1
20150338713 Brown Nov 2015 A1
20150368967 Lundy et al. Dec 2015 A1
20160040478 Lundy et al. Feb 2016 A1
20160054633 Brown et al. Feb 2016 A1
20160054634 Brown et al. Feb 2016 A1
20160090098 Kim et al. Mar 2016 A1
20160124283 Brown et al. May 2016 A1
20160127642 Schechner et al. May 2016 A1
20160154290 Brown et al. Jun 2016 A1
20160202589 Nagel et al. Jul 2016 A1
20160203403 Nagel et al. Jul 2016 A1
20160277688 Gaskamp et al. Sep 2016 A1
20160283774 Buchanan et al. Sep 2016 A1
20170031056 Vega-Avila et al. Feb 2017 A1
20170053068 Pillai et al. Feb 2017 A1
20170075183 Brown Mar 2017 A1
20170097259 Brown et al. Apr 2017 A1
20170122802 Brown et al. May 2017 A1
20170123286 Parker May 2017 A1
20170125875 Courtney et al. May 2017 A1
20170130523 Shrivastava et al. May 2017 A1
20170168368 Brown et al. Jun 2017 A1
20170219907 Brown et al. Aug 2017 A1
20170276542 Klawuhn et al. Sep 2017 A1
20170279876 Prasad et al. Sep 2017 A1
20170293049 Frank et al. Oct 2017 A1
20170365908 Hughes et al. Dec 2017 A1
20180073712 Baaijens et al. Mar 2018 A1
20180129172 Shrivastava et al. May 2018 A1
20180157141 Brown et al. Jun 2018 A1
20180231860 Podbelski et al. Aug 2018 A1
20180284555 Klawuhn et al. Oct 2018 A1
20180307114 Brown et al. Oct 2018 A1
20180373111 Brown Dec 2018 A1
20190025661 Brown et al. Jan 2019 A9
20190171081 Zedlitz et al. Jun 2019 A1
20190230776 Casey et al. Jul 2019 A1
20190235129 Frank et al. Aug 2019 A1
20190250029 Zedlitz et al. Aug 2019 A1
20190257143 Nagel et al. Aug 2019 A1
20200057346 Zedlitz et al. Feb 2020 A1
20200063490 Hebeisen et al. Feb 2020 A1
20200072674 Baker et al. Mar 2020 A1
20200096831 Brown et al. Mar 2020 A1
20200200595 Klawuhn et al. Jun 2020 A1
20200209057 Brown et al. Jul 2020 A1
20200260556 Rozbicki et al. Aug 2020 A1
20200278245 Brown et al. Sep 2020 A1
20200355977 Brown et al. Nov 2020 A1
20200363261 Klawuhn et al. Nov 2020 A1
20200393733 Brown Dec 2020 A1
20210003899 Zedlitz et al. Jan 2021 A1
20210080319 Brown et al. Mar 2021 A1
20210108960 Klawuhn et al. Apr 2021 A1
20210190991 Frank et al. Jun 2021 A1
20210214274 Friedman et al. Jul 2021 A1
20210325754 Brown et al. Oct 2021 A1
20220026267 Brown et al. Jan 2022 A1
20220163385 Brown et al. May 2022 A1
20220214592 Brown et al. Jul 2022 A1
20220244098 Klawuhn et al. Aug 2022 A1
20220326584 Khanna et al. Oct 2022 A1
20230004059 Klawuhn et al. Jan 2023 A1
Foreign Referenced Citations (211)
Number Date Country
707305 Jun 2014 CH
1333807 Jan 2002 CN
1359479 Jul 2002 CN
1380482 Nov 2002 CN
1097760 Jan 2003 CN
2590732 Dec 2003 CN
1534413 Oct 2004 CN
1659080 Aug 2005 CN
1672189 Sep 2005 CN
1704556 Dec 2005 CN
1822951 Aug 2006 CN
200966026 Oct 2007 CN
201104273 Aug 2008 CN
101421558 Apr 2009 CN
101438205 May 2009 CN
101501757 Aug 2009 CN
101600604 Dec 2009 CN
101641618 Feb 2010 CN
101678209 Mar 2010 CN
101702036 May 2010 CN
101707892 May 2010 CN
101762920 Jun 2010 CN
101969207 Feb 2011 CN
102168517 Aug 2011 CN
102183237 Sep 2011 CN
102203370 Sep 2011 CN
102330530 Jan 2012 CN
202110359 Jan 2012 CN
202230346 May 2012 CN
102183237 Aug 2012 CN
102749781 Oct 2012 CN
202794021 Mar 2013 CN
103168269 Jun 2013 CN
103370192 Oct 2013 CN
103370490 Oct 2013 CN
103370649 Oct 2013 CN
103370986 Oct 2013 CN
203271490 Nov 2013 CN
103547965 Jan 2014 CN
103649826 Mar 2014 CN
103987909 Aug 2014 CN
203870367 Oct 2014 CN
104181612 Dec 2014 CN
104321497 Jan 2015 CN
204155059 Feb 2015 CN
104429162 Mar 2015 CN
104685428 Jun 2015 CN
204422071 Jun 2015 CN
104781493 Jul 2015 CN
105143586 Dec 2015 CN
105549293 May 2016 CN
106103191 Nov 2016 CN
106462023 Feb 2017 CN
106575064 Apr 2017 CN
106971028 Jul 2017 CN
10124673 Nov 2002 DE
102014220818 Apr 2016 DE
0067706 Feb 1991 EP
0445314 Sep 1991 EP
0869032 Oct 1998 EP
0920210 Jun 1999 EP
1078818 Feb 2001 EP
1012633 Mar 2002 EP
1441269 Jul 2004 EP
0835475 Sep 2004 EP
1510854 Mar 2005 EP
1417535 Nov 2005 EP
1619546 Jan 2006 EP
2161615 Mar 2010 EP
2357544 Aug 2011 EP
2518254 Oct 2012 EP
2590095 May 2013 EP
2764998 Aug 2014 EP
2787162 Oct 2014 EP
3114903 Jan 2017 EP
2517332 Sep 2018 EP
2638429 Feb 2021 EP
2308936 Dec 2008 ES
2462754 Feb 2010 GB
S6122897 Feb 1986 JP
S6282194 Apr 1987 JP
S63208830 Aug 1988 JP
H02132420 May 1990 JP
H0431833 Feb 1992 JP
H04363495 Dec 1992 JP
H05178645 Jul 1993 JP
H1063216 Mar 1998 JP
H10159465 Jun 1998 JP
H10249278 Sep 1998 JP
2000008476 Jan 2000 JP
2000096956 Apr 2000 JP
2002148573 May 2002 JP
2004170350 Jun 2004 JP
2004245985 Sep 2004 JP
2005054356 Mar 2005 JP
2005282106 Oct 2005 JP
2005314870 Nov 2005 JP
2006009281 Jan 2006 JP
2006029027 Feb 2006 JP
2007120090 May 2007 JP
2007308971 Nov 2007 JP
2009508387 Feb 2009 JP
2010101151 May 2010 JP
2010529488 Aug 2010 JP
4694816 Jun 2011 JP
4799113 Oct 2011 JP
2013057975 Mar 2013 JP
2014062801 Apr 2014 JP
2016516921 Jun 2016 JP
6541003 Jul 2019 JP
6818386 Jan 2021 JP
200412640 Mar 2006 KR
100752041 Aug 2007 KR
20080022319 Mar 2008 KR
20090026181 Mar 2009 KR
100904847 Jun 2009 KR
100931183 Dec 2009 KR
20100034361 Apr 2010 KR
20110003698 Jan 2011 KR
20110052721 May 2011 KR
20110094672 Aug 2011 KR
20110118783 Nov 2011 KR
20130018527 Feb 2013 KR
20140139894 Dec 2014 KR
101815919 Jan 2018 KR
29501 May 2003 RU
200532346 Oct 2005 TW
200920987 May 2009 TW
M368189 Nov 2009 TW
201029838 Aug 2010 TW
201215981 Apr 2012 TW
201217999 May 2012 TW
201220111 May 2012 TW
201231789 Aug 2012 TW
201243470 Nov 2012 TW
201248286 Dec 2012 TW
I395809 May 2013 TW
201403034 Jan 2014 TW
201447089 Dec 2014 TW
WO-9632560 Oct 1996 WO
WO-9816870 Apr 1998 WO
WO-9913359 Mar 1999 WO
WO-0209338 Jan 2002 WO
WO-0213052 Feb 2002 WO
WO-2004003649 Jan 2004 WO
WO-2005052524 Jun 2005 WO
WO-2005098811 Oct 2005 WO
WO-2005103807 Nov 2005 WO
WO-2007016546 Feb 2007 WO
WO-2007146862 Dec 2007 WO
WO-2008030018 Mar 2008 WO
WO-2008048181 Apr 2008 WO
WO-2008147322 Dec 2008 WO
WO-2009044330 Apr 2009 WO
WO-2009124647 Oct 2009 WO
WO-2010079388 Jul 2010 WO
WO-2010120771 Oct 2010 WO
WO-2011020478 Feb 2011 WO
WO-2011087684 Jul 2011 WO
WO-2011087687 Jul 2011 WO
WO-2011124720 Oct 2011 WO
WO-2011127015 Oct 2011 WO
WO-2012079159 Jun 2012 WO
WO-2012080589 Jun 2012 WO
WO-2012080618 Jun 2012 WO
WO-2012080656 Jun 2012 WO
WO-2012080657 Jun 2012 WO
WO-2012125332 Sep 2012 WO
WO-2012145155 Oct 2012 WO
WO-2013059674 Apr 2013 WO
WO-2013102932 Jul 2013 WO
WO-2013105244 Jul 2013 WO
WO-2013109881 Jul 2013 WO
WO-2013130781 Sep 2013 WO
WO-2013155467 Oct 2013 WO
WO-2013181408 Dec 2013 WO
WO-2014121863 Aug 2014 WO
WO-2014130471 Aug 2014 WO
WO-2014134451 Sep 2014 WO
WO-2014150153 Sep 2014 WO
WO-2014165692 Oct 2014 WO
WO-2014209812 Dec 2014 WO
WO-2015023842 Feb 2015 WO
WO-2015077097 May 2015 WO
WO-2015095615 Jun 2015 WO
WO-2015171886 Nov 2015 WO
WO-2016004109 Jan 2016 WO
WO-2016029156 Feb 2016 WO
WO-2016029165 Feb 2016 WO
WO-2016058695 Apr 2016 WO
WO-2016085964 Jun 2016 WO
WO-2016094445 Jun 2016 WO
WO-2016191406 Dec 2016 WO
WO-2017007942 Jan 2017 WO
WO-2017059362 Apr 2017 WO
WO-2017062592 Apr 2017 WO
WO-2017075472 May 2017 WO
WO-2017189437 Nov 2017 WO
WO-2017210346 Dec 2017 WO
WO-2018034935 Feb 2018 WO
WO-2018038972 Mar 2018 WO
WO-2018039433 Mar 2018 WO
WO-2018067996 Apr 2018 WO
WO-2018098089 May 2018 WO
WO-2018112095 Jun 2018 WO
WO-2018112095 Jul 2018 WO
WO-2018140495 Aug 2018 WO
WO-2018157063 Aug 2018 WO
WO-2019183232 Sep 2019 WO
WO-2019183289 Sep 2019 WO
WO-2021080879 Apr 2021 WO
Non-Patent Literature Citations (512)
Entry
“SageGlass helps Solar Decathlon- and AIA award-winning home achieve net-zero energy efficiency” in MarketWatch.com, http://www.marketwatch.com/story/sageglass-helps-solar-decathlon-and-aia-award-winning-home-achieve-net-zero-energy-efficiency-2012-06-07, Jun. 7, 2012.
American Chemical Society, “Solar smart window could offer privacy and light control on demand (video),” EurekAlert! Pub Release, Nov. 16, 2016 [https://www.eurekalert.org/pub_releases/2016-11/acs-ssw111616.php].
APC by Schneider Electric, Smart-UPS 120V Product Brochure, 2013, 8 pp.
AU Examination Report dated Aug. 28, 2021, in the AU Application No. 2020202011.
AU Examination Report dated Jul. 9, 2021 in AU Application No. 2020250299.
Australian Examination Report dated Aug. 20, 2020 in AU Application No. 2016334038.
Australian Examination Report dated Dec. 19, 2018 in AU Application No. 2017270472.
Australian Examination Report dated Feb. 21, 2019 in AU Application No. 2018201341.
Australian Examination Report dated Mar. 31, 2017 in AU Application No. 2014219076.
Australian Examination Report dated May 20, 2021 in AU Application No. 2020202135.
Australian Examination Report dated May 31, 2019 in AU Application No. 2016334038.
Australian Examination Report dated Sep. 9, 2016 in AU Application No. 2013249621.
Australian Examination Report No. 2 dated Feb. 12, 2020 in AU Application No. 2018201341.
Australian Notice of Acceptance for Patent Application, dated Sep. 29, 2020, for Australian Patent Application No. 2015255913.
Australian Office Action dated Jul. 1, 2019 in AU Application No. 2015255913.
Benson D. K. et al., “Design goals and challenges for a photovoltaic-powered electrochromic window covering”, Solar Energy Materials and Solar Cells, vol. 39, No. 2/04, Dec. 1995, pp. 203-211.
Boltwood Cloud Sensor II by Diffraction Limited, 2016, [online], [retrieved Dec. 15, 2016]. Retrieved from the internet URL http://diffractionlimited.com/product/boltwood-cloud-sensor-ii/.
Campbell-Burns, Peter, “Building a Cloud Sensor”, Farnham Astronomical Society, (Apr. 15, 2013), Retrieved from the internet: URL: https://www.farnham-as.co.uk/2013/04/building-a-cloud-sensor/ [retrieved on 2020-04-24].
Canadian Notice of Allowance dated Aug. 12, 2020 in Canadian Application No. 2,902,106.
Canadian Notice of Allowance dated Jan. 18, 2021 in Canadian Application No. 2,902,106.
Canadian Office Action dated Apr. 11, 2017 in CA Design Application No. 170770.
Canadian Office Action dated Feb. 11, 2021 in CA Application No. 2,870,627.
Canadian Office Action dated Jan. 28, 2020 in Canadian Application No. 2,902,106.
Canadian Office Action dated Jun. 10, 2021 in CA Application No. 2,948,668.
Canadian Office Action dated Nov. 15, 2017 in CA Design Application No. 170770.
C-Bus Multi-Sensors, posted on clipsal.com. [online] URL: http://www.clipsal.com/Trade/Products/Integrated-Systems/C-Bus-Control-and-Management-System/C-Bus-Input-Units/Sensors-and-Detectors.
Chinese Notice of Allowance & Search Report dated Sep. 12, 2019 in CN Application No. 201580035315.2.
Chinese Notice of Allowance dated Jun. 3, 2021 in CN Application No. 201680043725.6, No Translation.
Chinese Office Action dated Apr. 27, 2017 in CN Design Application No. 201630492174.4.
Chinese Office Action dated Apr. 3, 2018 in CN Application No. 201580059148.5.
Chinese Office Action dated Apr. 5, 2016 in Chinese Application No. 201280023631.4.
Chinese Office Action dated Aug. 1, 2019 in CN Application No. 201580059148.5.
Chinese Office Action dated Aug. 17, 2020 in CN Application No. 201680062708.7 with Translation.
Chinese Office Action dated Aug. 19, 2019 in CN Application No. 201610645398.3.
Chinese Office Action dated Aug. 23, 2019 in CN Application No. 201680063892.7.
Chinese Office Action dated Dec. 1, 2016 in Chinese Application No. 201280023631.4.
Chinese Office Action dated Dec. 1, 2020 in CN Application No. 201580059148.5, with English Translation.
Chinese Office Action dated Dec. 10, 2020 in CN Application No. 201580052861.7 with Translation.
Chinese Office Action dated Dec. 16, 2020 in CN Application No. 201680063892.7, with English Translation.
Chinese Office Action dated Dec. 19, 2018 in CN Application No. 201610645398.3.
Chinese Office Action dated Dec. 25, 2018 in CN Application No. 201710111979.3.
Chinese Office Action dated Feb. 2, 2019 in CN Application No. 201580059148.5.
Chinese Office Action dated Feb. 2, 2021 in Chinese Application No. 201880022572.6, with English Translation.
Chinese Office Action dated Feb. 3, 2020 in Chinese Application No. 201710600395.2, with English Translation.
Chinese Office Action dated Feb. 9, 2018 in CN Application No. 201480022064.X.
Chinese Office Action dated Jan. 12, 2021 in CN Application No. 201780065447.9 with Translation.
Chinese Office Action dated Jan. 13, 2021 in Chinese Application No. 201811232377.4, with English Translation.
Chinese Office Action dated Jan. 14, 2019 in CN Application No. 201580035315.2.
Chinese Office Action dated Jan. 21, 2020 in Chinese Application No. 201811232377.4, with English Translation.
Chinese Office Action dated Jul. 14, 2020 in CN Application No. 201680063892.7, with English Translation.
Chinese Office Action dated Jul. 2, 2018 in Chinese Application No. 201710111979.3.
Chinese Office Action dated Jun. 23, 2021 in Chinese Application No. 201811232377.4, with English Translation.
Chinese Office Action dated Jun. 26, 2015 in Chinese Application No. 201280023631.4.
Chinese Office Action dated Jun. 27, 2016 in Chinese Application No. 201480022064.X.
Chinese Office Action dated Jun. 28, 2020 in CN Application No. 201680043725.6.
Chinese Office Action dated Mar. 10, 2020 in CN Application No. 201610645398.3, with English Translation.
Chinese Office Action dated Mar. 25, 2019 in CN Application No. 201580052861.7.
Chinese Office Action dated Mar. 26, 2015 in CN Application No. 201280060910.8.
Chinese Office Action dated May 15, 2017 in Chinese Application No. 201480022064.X.
Chinese Office Action dated May 20, 2021 in Chinese Application No. 201710600395.2, with English Translation.
Chinese Office Action dated May 6, 2020 in CN Application No. 201580059148.5.
Chinese Office Action dated Nov. 20, 2019 in CN Application No. 201680062708.7 with Translation.
Chinese Office Action dated Nov. 27, 2015 in Chinese Application No. 2015032301101560.
Chinese Office Action dated Nov. 3, 2020 in Chinese Application No. 201710600395.2, with English Translation.
Chinese Office Action dated Oct. 10, 2015 in CN Application No. 201380026428.7.
Clay, R.W., et al., “A cloud monitoring system for remote sites,” Publications of the Astronomical Society of Australia, vol. 15, No. 3, Jan. 1998, pp. 332-335.
CN Office Action dated Apr. 13, 2020 in Chinese Application No. 201580052861.7, with Translation.
CN Office Action dated Aug. 4, 2021, in CN Application No. 201780039437.8 with English translation.
CN Office Action dated Aug. 17, 2021, in CN Application No. 201680063892.7 with English translation.
CN Office Action dated Feb. 23, 2021 in Chinese Application No. 201680062708.7,with Translation.
CN Rejection Decision dated Mar. 19, 2021 in Chinese Application No. 201580059148.5, without English Translation.
Communication re Third-Party Observation dated Dec. 4, 2014 and Third-Party Observation dated Dec. 3, 2014 in PCT/US2014/016974.
Decision to Grant, dated Oct. 27, 2020, for Japanese Patent Application No. JP 2019-031229, with partial translation.
Duchon, Claude E. et al., “Estimating Cloud Type from Pyranometer Observations,” Journal of Applied Meteorology, vol. 38, Jan. 1999, pp. 132-141.
English translation of CN201104273 description from worldwide.espacenet.com.
English translation of JP2004170350 description from worldwide.espacenet.com.
EP Extended Search Report dated Apr. 29, 2020 in EP Application No. 17881918.1.
EP Extended Search Report dated Dec. 4, 2020 in EP Application No. 18756696.3.
EP Extended Search Report dated Mar. 23, 2020 in EP Application No. 17807428.2.
EP Extended Search Report dated May 12, 2020 in EP Application No. 17859286.1.
EP Extended Search Report dated May 16, 2019 in EP Application No. 16852784.4.
EP Invitation to Indicate Search dated Jun. 22, 2016 in EP Application No. 14753897.9.
EP Office Action dated Feb. 3, 2021 in European Application No. 15848030.1.
EP Partial Supplemental Search Report dated Apr. 12, 2019 in EP Application No. 16852784.4.
EPO Communication dated Sep. 2, 2015 in EP Application No. 14753897.9 re Third-Party Observations.
European Examination Report dated Jul. 11, 2019 in EP Application No. 16854302.3.
European Extended Search Report dated Jan. 17, 2019 in EP Application No. 16821984.8.
European Extended Search Report dated Jan. 18, 2019 in EP Application No. 18208971.4.
European Extended Search Report dated Jun. 18, 2019 in EP Application No. 19165771.7.
European Extended Search Report dated Oct. 12, 2016 in EP Application No. 14753897.9.
European Intention to Grant, dated Jan. 18, 2021, in EP Application No. 18208971.4.
European Intention to Grant, dated Jul. 9, 2020, in EP Application No. 18208971.4.
European Intention to Grant, dated Mar. 23, 2021, in EP Application No. 18208971.4.
European Intention to Grant, dated Sep. 21, 2020, in EP Application No. 19165771.7.
European Office Action dated Aug. 27, 2019 in EP Application No. 15848030.1.
European Office Action dated Dec. 12, 2017 in EP Application No. 14753897.9.
European Office Action dated Dec. 2, 2015 in EP Application No. 12841714.4.
European Office Action dated Feb. 21, 2019 in EP Application No. 15845648.3.
European Office Action dated Jul. 15, 2019 in EP Application No. 13777540.9.
European Office Action dated Mar. 12, 2021 in EP Application No. 16852784.4.
European Office Action dated Mar. 20, 2020 in EP Application No. 16852784.4.
European Office Action dated May 15, 2017 in EP Application No. EP 12841714.4.
European Office Action dated May 3, 2021 in EP Application No. 17881918.1.
European Office Action dated Oct. 2, 2020 in EP Application No. 13777540.9.
European (Partial) Search Report dated Dec. 17, 2019 in EP Application No. 17807428.2.
European Search Report dated Aug. 11, 2014 in European Application No. 12757877.1.
European Search Report dated Jul. 23, 2014 in European Application No. 12756917.6.
European Search Report dated Jul. 29, 2014 in European Application No. 12758250.0.
European Search Report dated Mar. 5, 2015 in European Application No. 12841714.4.
European Search Report dated May 11, 2016 in EP Application No. 13777540.9.
Ex Parte Quayle Action, dated Feb. 2, 2021, in U.S. Appl. No. 16/335,222.
Examination Report No. 1, dated May 7, 2020, for Australian Patent Application No. 2015324000 3 pages.
Extended European Search Report dated Apr. 26, 2018 in EP Application No. 15845648.3.
Extended European Search Report dated Apr. 3, 2018 in EP Application No. 15789108.6.
Extended European Search Report dated Jul. 16, 2021 in EP Application No. 21150305.7.
Extended European Search Report dated May 3, 2018 in EP Application No. 15848030.1.
Extended European Search Report dated Oct. 15, 2020 in EP Application No. 20182982.7.
Final Office Action dated Jun. 5, 2015 in U.S. Appl. No. 13/968,258.
Gen5 Z-Wave Plus 6-in-1 Multisensor, posted on thesmartesthouse.com, Earliest review on Aug. 27, 2015. [online] URL: https://www.thesmartesthouse.com/products/aeotec-by-aeon-labs-z-wave-5-in-1-multisensor#shopify-product-reviews.
Graham, Steve, “Clouds & Radiation,” Mar. 1, 1999. [http://earthobservatory.nasa.gov/Features/Clouds/].
Haby, Jeff, “Cloud Detection (IR v. VIS),” (known as of Sep. 3, 2014) [http://theweatherprediction.com/habyhints2/512/].
Halio Automation Brochure, halioglass.com, dated Aug. 2019, 13 pages.
“Halio Rooftop Sensor Kit (Model SR500),” Product Data Sheet, Kinestral Technologies, 2020, 4 pp.
“Halio Smart-Tinting Glass System,” Product Data Sheet, Kinestral Technologies, www.kinestral.com, copyright 2017, 4 pp.
Hoosier Energy, “How do they do that? Measuring Real-Time Cloud Activity” Hoosier Energy Current Connections, (known as of Sep. 3, 2014). (http://members.questline.com/Article.aspx?articleID=18550&accountID=196000&nl=11774).
“How Cleantech wants to make a 2012 comeback” http://mountainview.patch.com/articles/how-cleantech-wants-to-make-a-2012-comeback, Jan. 23, 2012.
Idso, Sherwood B., “Humidity measurement by infrared thermometry,” Remote Sensing of Environment, vol. 12, 1982, pp. 87-91.
IN First Examination Report dated Jul. 7, 2021 in Indian Patent Application No. 201917013204.
IN First Examination Report dated Mar. 15, 2021 in Indian Patent Application No. 201837010260.
Indian First Examination Report dated Sep. 17, 2020 in Indian Patent Application No. 201737011527.
Indian Office Action dated Dec. 18, 2019 in IN Application No. 2371/KOLNP/2014.
Indian Office Action dated Jul. 7, 2021, in IN Application No. 201917013204.
Indian Office Action dated Jul. 9, 2020 in IN Application No. 201637038970.
Indian Office Action dated Jun. 10, 2021, in IN Application No. 202038025893.
Indian Office Action dated Sep. 5, 2019 in IN Application No. 3074/KOLNP/2015.
International Preliminary Report on Patentability dated Apr. 12, 2018 in PCT/US16/55005.
International Preliminary Report on Patentability dated Apr. 13, 2017 in PCT/US2015/052822.
International Preliminary Report on Patentability dated Apr. 13, 2017 in PCT/US2015/053041.
International Preliminary Report on Patentability dated Apr. 18, 2019 in PCT/US17/55631.
International Preliminary Report on Patentability dated Apr. 19, 2018 in PCT/US2016/055709.
International Preliminary Report on Patentability dated Dec. 13, 2018 in PCT/US17/35290.
International Preliminary Report on Patentability dated Feb. 19, 2015 issued in PCT/US2013/053625.
International Preliminary Report on Patentability dated Jan. 18, 2018 in PCT/US2016/041344.
International Preliminary Report on Patentability dated Jun. 17, 2019 in PCT/US2017/066198.
International Preliminary Report on Patentability dated May 1, 2014 in PCT/US2012/061137.
International Preliminary Report on Patentability dated Nov. 24, 2016 in PCT/US2015/029675.
International Preliminary Report on Patentability dated Oct. 23, 2014 issued in PCT/US2013/036456.
International Preliminary Report on Patentability dated Oct. 30, 2014 issued in PCT/US2013/034998.
International Preliminary Report on Patentability dated Oct. 30, 2014 issued in PCT/US2013/036235.
International Preliminary Report on Patentability dated Sep. 22, 2020 in PCT/US19/23186.
International Preliminary Report on Patentability dated Sep. 22, 2020, issued in PCT/US2019/023268.
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027742.
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027828.
International Preliminary Report on Patentability dated Sep. 26, 2013, issued in PCT/US2012/027909.
International Preliminary Report on Patentability dated Sep. 3, 2015, issued in PCT/US2014/016974.
International Preliminary Report on Patentability dated Sep. 6, 2019 issued in PCT/US2018/019737.
International Search Report and Written Opinion dated Dec. 13, 2016 in PCT/US16/55005.
International Search Report and Written Opinion dated Dec. 15, 2016 in PCT/US2016/055709.
International Search Report and Written Opinion dated Dec. 18, 2015 in PCT/US2015/053041.
International Search Report and Written Opinion dated Dec. 26, 2013, issued in PCT/US2013/053625.
International Search Report and Written Opinion dated Jan. 14, 2016 in PCT/US2015/052822.
International Search Report and Written Opinion dated Jan. 25, 2018 in PCT/US17/55631.
International Search Report and Written Opinion dated Jul. 11, 2013, issued in PCT/US2013/034998.
International Search Report and Written Opinion dated Jul. 23, 2013, issued in PCT/US2013/036235.
International Search Report and Written Opinion dated Jul. 23, 2015 in PCT/US2015/029675.
International Search Report and Written Opinion dated Jul. 26, 2013, issued in PCT/US2013/036456.
International Search Report and Written Opinion dated Mar. 28, 2013 in PCT/US2012/061137.
International Search Report and Written Opinion dated Mar. 30, 2018 in PCT/US2017/066198.
International Search Report and Written Opinion dated May 26, 2014 in PCT/US2014/016974.
International Search Report and Written Opinion dated Oct. 13, 2016, issued in PCT/US2016/041344.
International Search Report and Written Opinion dated Oct. 16, 2014, issued in PCT/US2014/043514.
International Search Report and Written Opinion dated Sep. 24, 2012, issued in PCT/US2012/027742.
International Search Report and Written Opinion dated Sep. 24, 2012, issued in PCT/US2012/027909.
International Search Report and Written Opinion dated Sep. 26, 2012, issued in PCT/US2012/027828.
International Search Report and Written Opinion dated Sep. 4, 2019, issued in PCT/US2019/023268.
International Search Report and Written Opinion dated Sep. 8, 2017, issued in PCT/US17/35290.
International Search Report and Written Opinion (ISA: KIPO) dated Jun. 11, 2018 issued in PCT/US2018/019737.
International Search Report and Written Opinion (ISA:EPO) dated Feb. 22, 2021 in PCT/US2020/056164.
International Search Report and Written Opinion (ISA:EPO) dated Jul. 5, 2019 in PCT/US19/23186.
Invitation to Pay Fees and Communication Relating to the Result of the Partial International Search, dated Jul. 12, 2019, issued in PCT/US2019/023268.
Japanese Decision of Rejection dated Oct. 24, 2018 in JP Application No. JP 2015-558909.
Japanese Office Action dated Apr. 2, 2019 in JP Application No. 2016-567021.
Japanese Office Action dated Feb. 6, 2018 in JP Application No. 2015-558909.
Japanese Office Action dated Jan. 27, 2021 in JP Application No. 2019-232669.
Japanese Office Action dated Mar. 10, 2020 in JP Application No. 2019-031229.
JP Decision to Grant a Patent dated Jul. 29, 2021, in JP Application No. 2019-232669.
Kipp & Zonen, “Solar Radiation” (known as of Sep. 3, 2014) [http://www.kippzonen.com/Knowledge-Center/Theoretical-info/Solar-Radiation].
Kleissl, Jan et al., “Recent Advances in Solar Variability Modeling and Solar Forecasting at UC San Diego,” Proceedings, American Solar Energy Society, 2013 Solar Conference, Apr. 16-20, 2013, Baltimore, MD.
Korean Notice of Decision to Grant dated Jun. 22, 2021 in KR Application No. KR10-2015-7026041, with English Translation.
Korean Notice of First Refusal dated Feb. 18, 2021 in KR Application No. KR10-2015-7026041.
Korean Notification of Provisional Rejection dated Jun. 22, 2021 in KR Application No. KR10-2016-7032512.
Korean Office Action, dated Feb. 16, 2021, for Korean Patent Application No. 10-2020-7002032 with English Translation.
Korean Office Action, dated Jun. 15, 2020, for Korean Patent Application No. 10-2020-7002032 with English Translation.
Korean Office Action, dated Jun. 7, 2021, for Korean Patent Application No. 10-2020-7002032, with English Translation.
Korean Office Action dated Mar. 30, 2020 in KR Application No. KR 10-2015-7026041, No translation.
“Sky temperature” in Designing Buildings Wiki, Apr. 26, 2015, https://www.designingbuildings.co.uk/wiki/Sky_temperature [retrieved Jul. 19, 2019].
Letter dated Dec. 1, 2014 re Prior Art re U.S. Appl. No. 13/772,969 from Ryan D. Ricks representing MechoShade Systems, Inc.
Lim, Sunnie H.N. et al., “Modeling of optical and energy performance of tungsten-oxide-based electrochromic windows including their intermediate states,” Solar Energy Materials & Solar Cells, vol. 108, Oct. 16, 2012, pp. 129-135.
Maghrabi, A., et al., “Design and development of a simple infrared monitor for cloud detection,” Energy Conversion and Management, vol. 50, 2009, pp. 2732-2737.
Maghrabi, A., et al., “Precipitable water vapour estimation on the basis of sky temperatures measured by a single-pixel IR detector and screen temperatures under clear skies,” Meteorological Applications, vol. 17, 2010, pp. 279-286.
“Smart Glazing: Making smart-tinting glazing even smarter”, Daylighting: Design & Technology for Better Buildings, Issue 20 (Jan./Feb. 2020), 5 pages.
Melexis “MLX90614 family Datasheet” (3901090614, Rev. 004), Jul. 30, 2008, 42 pp.
Merriam-Webster, Inc., “Pyrometer,” Merriam-Webster's Collegiate Dictionary 10th Edition, Springfield, MA, 2001, p. 950 (4 pp.).
Mims III, Forrest M., et al., "Measuring total column water vapor by pointing an infrared thermometer at the sky," Bulletin of the American Meteorological Society, Oct. 2011, pp. 1311-1320.
Morris, V.R. et al., “Deployment of an infrared thermometer network at the atmospheric radiation measurement program southern great plains climate research facility,” Sixteenth ARM Science Team Meeting Proceedings, Albuquerque, NM, Mar. 27-31, 2006, 11 pp.
National Aeronautics & Space Administration, “Cloud Radar System (CRS),” (known as of Sep. 3, 2014), published date of Jun. 16, 2014, [http://har.gsfc.nasa.gov/index.php?section=12].
National Aeronautics & Space Administration, “Cloud Remote Sensing and Modeling,” (known as of Sep. 3, 2014), published date of Sep. 15, 2014, [http://atmospheres.gsfc.nasa.gov/climate/index.php?section=134].
“New from Pella: Windows with Smartphone-run blinds”, Pella Corp., http://www.desmoinesregister.com/article/20120114/BUSINESS/301140031/0/biggame/?odyssey=nav%7Chead, Jan. 13, 2012.
Notice of Allowance, dated Jun. 18, 2020 in CN Application No. 201610645398.3, No Translation.
Office Action dated Oct. 6, 2014 in U.S. Appl. No. 13/968,258.
Partial EP Supplemental Search Report dated Nov. 29, 2017 in EP Application No. 15789108.6.
“SPN1 Sunshine Pyranometer,” Product Overview, Specification, Accessories and Product Resources, Delta-T Devices, May 5, 2016, 9 pp. https://www.delta-t.co.uk/product/spn1/ (downloaded Apr. 28, 2020).
Preliminary Amendment dated Aug. 21, 2019 for U.S. Appl. No. 16/487,802.
Preliminary Amendment dated Feb. 20, 2018 in U.S. Appl. No. 15/287,646.
Preliminary Amendment dated Jul. 1, 2021, in U.S. Appl. No. 17/305,132.
Preliminary Amendment dated Jul. 10, 2020 for U.S. Appl. No. 15/929,958.
Preliminary Amendment dated Mar. 12, 2020, in U.S. Appl. No. 16/695,057.
Preliminary Amendment dated Nov. 9, 2020 for U.S. Appl. No. 17/008,342.
Preliminary Amendment No. 2, dated Dec. 9, 2020 for U.S. Appl. No. 16/695,004.
“Remote Sensing: Clouds,” Department of Atmospheric and Ocean Science, University of Maryland, (known as of Sep. 3, 2014) [http://www.atmos.umd.edu/˜pinker/remote_sensing_clouds.htm].
Russian Decision to Grant with Search Report dated Feb. 28, 2018 in RU Application No. 2015139884.
Russian Office Action & Search Report dated Apr. 1, 2019 in RU Application No. 2017114996.
Russian Office Action dated Apr. 13, 2017 in RU Application No. 2014144632.
Russian Office Action dated Dec. 7, 2018 in RU Application No. 2016148196.
Russian Office Action dated Nov. 22, 2017 in RU Application No. 2014144632.
Science and Technology Facilities Council. “Cloud Radar: Predicting The Weather More Accurately.” ScienceDaily, Oct. 1, 2008. [www.sciencedaily.com/releases/2008/09/080924085200.htm].
Selkowitz, S. et al., "Dynamic, Integrated Façade Systems for Energy Efficiency and Comfort," Journal of Building Enclosure Design, Summer 2006, pp. 11-17.
Singapore Search Report dated May 29, 2020 in SG Application No. 10201608572S.
Singapore Supplementary Examination Report dated Dec. 7, 2016 in SG Application No. 11201406676Q.
Sloan, Raymond, et al., “Infrared Emission Spectrum of the Atmosphere,” Journal of the Optical Society of America, vol. 45, No. 6, Jun. 1955, pp. 455-460.
Smith, et al. “Measuring Cloud Cover and Brightness Temperature with a Ground Based Thermal Infrared Camera”, (Feb. 2008), American Meteorological Society, vol. 47, pp. 683-693.
SurroundVideo Series, "Pioneering Multi-Sensor IP Megapixel Cameras," [webpage] 10 pp. [retrieved Jul. 24, 2015] http://web.archive.org/web/20150724235343/http://www.arecontvision.com/landing-pages/surround-video/overview.php.
Taiwan Office Action dated Jul. 2, 2020 in Taiwan Patent Application No. 105132457 with Translation.
Taiwan Office Action dated May 13, 2021 in Taiwan Patent Application No. 106134521 with English Translation.
Taiwan Office Action dated Jul. 30, 2020 in ROC (Taiwan) Pat. Appln. No. 105121480, with English Translation.
Taiwanese Decision of Rejection dated Nov. 1, 2019 in TW Application No. 104131932.
Taiwanese Office Action dated Apr. 17, 2020 in TW Application No. TW 107102210, No translation.
Taiwanese Office Action dated Aug. 22, 2017 in TW Application No. 103105957.
Taiwanese Office Action dated Jan. 11, 2016 in TW Application No. 101108958.
Taiwanese Office Action dated Jan. 30, 2019 in TW Application No. 104114812.
Taiwanese Office Action dated Jun. 21, 2021 in TW Application No. TW 107106439, No translation.
Taiwanese Office Action dated Jun. 28, 2019 in TW Application No. 104131932.
Taiwanese Office Action dated May 8, 2019 in TW Application No. 107122055.
Taiwanese Office Action dated Nov. 23, 2016 in TW Application No. 105129854.
Taiwanese Office Action dated Oct. 17, 2017 in TW Application No. 106115702.
Taiwanese Office Action dated Sep. 11, 2020 in TW Application No. 109103256, with English Translation.
Taiwanese Office Action dated Sep. 16, 2020 in TW Application No. 108143706, with English Translation.
Third-Party Submission dated Feb. 2, 2015 and Feb. 18, 2015 PTO Notice re Third-Party Submission for U.S. Appl. No. 13/772,969.
Thompson, Marcus, “Boltwood cloud sensor,” Cloudynights.com, Nov. 25, 2005, 6 pp. [online], [retrieved Dec. 15, 2016]. Retrieved from the internet URL http://www.cloudynights.com/page/articles/cat/user-reviews/photography/photography-accessories/boltwood-cloud-sensor-r1222.
TW Notice of Allowance, dated Jan. 25, 2021 in Taiwanese Application No. 104131932 No translation.
TW Notice of Allowance dated Jun. 29, 2021 in Taiwan Patent Application No. 110115755, with English translation.
TW Notice of Allowance dated Mar. 4, 2021 in Taiwan Patent Application No. 105132457, No translation.
TW Reissued Office Action dated Jul. 8, 2021, in Taiwanese Application No. 107106439.
U.S. Corrected Notice of Allowance dated Aug. 19, 2021 in U.S. Appl. No. 16/871,976.
U.S. Final Office Action dated Apr. 30, 2012 in U.S. Appl. No. 13/049,750.
U.S. Final Office Action dated Apr. 30, 2020 in U.S. Appl. No. 15/891,866.
U.S. Final Office Action dated Aug. 19, 2013 in U.S. Appl. No. 13/049,756.
U.S. Final Office Action dated Feb. 21, 2019 in U.S. Appl. No. 15/287,646.
U.S. Final Office Action dated Feb. 26, 2015 in U.S. Appl. No. 13/479,137.
U.S. Final Office Action dated Jan. 11, 2019 in U.S. Appl. No. 15/891,866.
U.S. Final Office Action dated Jan. 17, 2019 in U.S. Appl. No. 14/998,019.
U.S. Final Office Action dated Jan. 27, 2014 in U.S. Appl. No. 13/479,137.
U.S. Final Office Action dated Jul. 2, 2015 in U.S. Appl. No. 13/049,756.
U.S. Final Office Action dated Jul. 29, 2016 in U.S. Appl. No. 13/772,969.
U.S. Final Office Action dated May 15, 2014 in U.S. Appl. No. 13/449,251.
U.S. Final Office Action dated May 16, 2014 in U.S. Appl. No. 13/449,248.
U.S. Final Office Action dated Sep. 26, 2018 in U.S. Appl. No. 15/514,480.
U.S. Non-Final Office Action dated Sep. 13, 2021, in U.S. Appl. No. 17/027,601.
U.S. Notice of Allowability (corrected) dated Jul. 28, 2016 in U.S. Appl. No. 14/163,026.
U.S. Notice of Allowance (corrected) dated Jun. 9, 2020 in U.S. Appl. No. 15/442,509.
U.S. Notice of Allowance dated Apr. 13, 2015 in U.S. Appl. No. 14/657,380.
U.S. Notice of Allowance dated Apr. 14, 2021 in U.S. Appl. No. 16/335,222.
U.S. Notice of Allowance dated Apr. 22, 2021 in U.S. Appl. No. 15/742,015.
U.S. Notice of Allowance dated Apr. 29, 2020 in U.S. Appl. No. 14/998,019.
U.S. Notice of Allowance dated Apr. 4, 2016 in U.S. Appl. No. 14/535,080.
U.S. Notice of Allowance dated Aug. 3, 2021 in U.S. Appl. No. 16/335,222.
U.S. Notice of Allowance dated Aug. 12, 2016 in U.S. Appl. No. 14/352,973.
U.S. Notice of Allowance dated Aug. 12, 2020 in U.S. Appl. No. 16/013,770.
U.S. Notice of Allowance dated Aug. 12, 2021 in U.S. Appl. No. 16/335,222.
U.S. Notice of Allowance dated Aug. 16, 2018 in U.S. Appl. No. 15/349,860.
U.S. Notice of Allowance dated Aug. 19, 2021 in U.S. Appl. No. 15/742,015.
U.S. Notice of Allowance dated Aug. 7, 2020 in U.S. Appl. No. 15/891,866.
U.S. Notice of Allowance dated Dec. 12, 2017 in Design U.S. Appl. No. 29/560,076.
U.S. Notice of Allowance dated Dec. 22, 2016 in U.S. Appl. No. 13/772,969.
U.S. Notice of Allowance dated Jan. 10, 2014 in U.S. Appl. No. 13/449,235.
U.S. Notice of Allowance dated Jan. 12, 2018 in U.S. Appl. No. 14/932,474.
U.S. Notice of Allowance dated Jan. 22, 2015 in U.S. Appl. No. 13/682,618.
U.S. Notice of Allowance dated Jan. 22, 2018 in U.S. Appl. No. 15/464,837.
U.S. Notice of Allowance dated Jan. 23, 2020 in U.S. Appl. No. 14/998,019.
U.S. Notice of Allowance dated Jan. 24, 2019 in U.S. Appl. No. 15/514,480.
U.S. Notice of Allowance dated Jan. 27, 2017 in U.S. Appl. No. 14/931,390.
U.S. Notice of Allowance dated Jan. 8, 2016 in U.S. Appl. No. 13/049,756.
U.S. Notice of Allowance dated Jul. 15, 2021 in U.S. Appl. No. 16/871,976.
U.S. Notice of Allowance dated Jul. 20, 2012 in U.S. Appl. No. 13/049,623.
U.S. Notice of Allowance dated Jul. 23, 2020 in U.S. Appl. No. 16/013,770.
U.S. Notice of Allowance dated Jun. 1, 2020 in U.S. Appl. No. 16/695,057.
U.S. Notice of Allowance dated Jun. 17, 2014 in U.S. Appl. No. 13/309,990.
U.S. Notice of Allowance dated Jun. 22, 2016 in U.S. Appl. No. 13/049,756.
U.S. Notice of Allowance dated Jun. 8, 2016 in U.S. Appl. No. 14/163,026.
U.S. Notice of Allowance dated Mar. 20, 2018 in U.S. Appl. No. 15/347,677.
U.S. Notice of Allowance dated Mar. 31, 2021 in U.S. Appl. No. 15/742,015.
U.S. Notice of Allowance dated May 12, 2021 in U.S. Appl. No. 15/762,077.
U.S. Notice of Allowance dated May 13, 2021 in U.S. Appl. No. 16/695,004.
U.S. Notice of Allowance dated May 14, 2015 in U.S. Appl. No. 13/479,137.
U.S. Notice of Allowance dated May 23, 2019 in U.S. Appl. No. 15/514,480.
U.S. Notice of Allowance dated May 27, 2020 in U.S. Appl. No. 15/442,509.
U.S. Notice of Allowance dated May 3, 2018 in U.S. Appl. No. 14/993,822.
U.S. Notice of Allowance dated May 8, 2012 in U.S. Appl. No. 13/049,750.
U.S. Notice of Allowance dated Nov. 1, 2018 in U.S. Appl. No. 15/513,535.
U.S. Notice of Allowance dated Nov. 16, 2018 in U.S. Appl. No. 15/349,860.
U.S. Notice of Allowance dated Oct. 15, 2020 in U.S. Appl. No. 16/946,168.
U.S. Notice of Allowance dated Oct. 2, 2019 in U.S. Appl. No. 15/464,837.
U.S. Notice of Allowance dated Sep. 25, 2018 in U.S. Appl. No. 15/442,509.
U.S. Notice of Allowance dated Sep. 3, 2019 in U.S. Appl. No. 15/442,509.
U.S. Notice of Allowance dated Sep. 6, 2019 in U.S. Appl. No. 14/993,822.
U.S. Notice of Allowance dated Sep. 6, 2019 in U.S. Appl. No. 15/514,480.
U.S. Notice of Allowance dated Sep. 9, 2019 in U.S. Appl. No. 15/287,646.
U.S. Notice of Allowance (supplemental) dated Jun. 12, 2015 in U.S. Appl. No. 13/479,137.
U.S. Office Action dated Apr. 14, 2016 in U.S. Appl. No. 14/163,026.
U.S. Office Action dated Apr. 14, 2021 in U.S. Appl. No. 16/262,775.
U.S. Office Action dated Aug. 2, 2017 in U.S. Appl. No. 14/998,019.
U.S. Office Action dated Aug. 28, 2017 in U.S. Appl. No. 14/932,474.
U.S. Office Action dated Aug. 9, 2018 in U.S. Appl. No. 15/287,646.
U.S. Office Action dated Dec. 10, 2020 in U.S. Appl. No. 16/695,004.
U.S. Office Action dated Dec. 10, 2020 in U.S. Appl. No. 16/871,976.
U.S. Office Action dated Dec. 24, 2013 in U.S. Appl. No. 13/309,990.
U.S. Office Action dated Dec. 9, 2020 in U.S. Appl. No. 16/696,887.
U.S. Office Action dated Feb. 20, 2020 in U.S. Appl. No. 16/262,775.
U.S. Office Action dated Feb. 22, 2016 in U.S. Appl. No. 14/535,080.
U.S. Office Action dated Feb. 24, 2015 in U.S. Appl. No. 14/163,026.
U.S. Office Action dated Feb. 3, 2012 in U.S. Appl. No. 13/049,750.
U.S. Office Action dated Jan. 16, 2015 in U.S. Appl. No. 14/468,778.
U.S. Office Action dated Jan. 18, 2013 in U.S. Appl. No. 13/049,756.
U.S. Office Action dated Jan. 2, 2020 in U.S. Appl. No. 15/442,509.
U.S. Office Action dated Jan. 22, 2018 in U.S. Appl. No. 15/514,480.
U.S. Office Action dated Jan. 23, 2020 in U.S. Appl. No. 15/762,077.
U.S. Office Action dated Jan. 5, 2016 in U.S. Appl. No. 13/772,969.
U.S. Office Action dated Jan. 5, 2018 in U.S. Appl. No. 15/442,509.
U.S. Office Action dated Jul. 13, 2021 in U.S. Appl. No. 16/696,887.
U.S. Office Action dated Jul. 3, 2014 in U.S. Appl. No. 13/479,137.
U.S. Office Action dated Jun. 11, 2020 in U.S. Appl. No. 16/303,384.
U.S. Office Action dated Jun. 14, 2019 in U.S. Appl. No. 14/998,019.
U.S. Office Action dated Jun. 23, 2020 in U.S. Appl. No. 15/742,015.
U.S. Office Action dated Jun. 29, 2018 in U.S. Appl. No. 15/513,535.
U.S. Office Action dated Jun. 3, 2015 in U.S. Appl. No. 13/449,251.
U.S. Office Action dated Jun. 6, 2017 in U.S. Appl. No. 15/442,509.
U.S. Office Action dated Jun. 8, 2018 in U.S. Appl. No. 14/998,019.
U.S. Office Action dated Mar. 12, 2019 in U.S. Appl. No. 15/464,837.
U.S. Office Action dated Mar. 18, 2020 in U.S. Appl. No. 16/013,770.
U.S. Office Action dated Mar. 27, 2012 in U.S. Appl. No. 13/049,623.
U.S. Office Action dated Mar. 27, 2019 in U.S. Appl. No. 14/993,822.
U.S. Office Action dated Mar. 3, 2021 in U.S. Appl. No. 16/303,384.
U.S. Office Action dated Mar. 5, 2019 in U.S. Appl. No. 15/442,509.
U.S. Office Action dated Mar. 8, 2017 in U.S. Appl. No. 14/993,822.
U.S. Office Action dated May 18, 2018 in U.S. Appl. No. 15/891,866.
U.S. Office Action dated May 30, 2018 in U.S. Appl. No. 15/464,837.
U.S. Office Action dated May 5, 2017 in Design U.S. Appl. No. 29/560,076.
U.S. Office Action dated Nov. 19, 2015 in U.S. Appl. No. 14/535,080.
U.S. Office Action dated Nov. 2, 2017 in U.S. Appl. No. 15/349,860.
U.S. Office Action dated Nov. 27, 2015 in U.S. Appl. No. 14/352,973.
U.S. Office Action dated Nov. 27, 2015 in U.S. Appl. No. 14/163,026.
U.S. Office Action dated Nov. 29, 2013 in U.S. Appl. No. 13/449,248.
U.S. Office Action dated Nov. 29, 2013 in U.S. Appl. No. 13/449,251.
U.S. Office Action dated Oct. 11, 2013 in U.S. Appl. No. 13/449,235.
U.S. Office Action dated Oct. 11, 2017 in U.S. Appl. No. 14/993,822.
U.S. Office Action dated Oct. 21, 2019 in U.S. Appl. No. 15/742,015.
U.S. Office Action dated Oct. 27, 2020 in U.S. Appl. No. 15/762,077.
U.S. Office Action dated Oct. 28, 2014 in U.S. Appl. No. 13/449,251.
U.S. Office Action dated Oct. 6, 2014 in U.S. Appl. No. 13/049,756.
U.S. Office Action dated Sep. 14, 2018 in U.S. Appl. No. 14/993,822.
U.S. Office Action dated Sep. 15, 2014 in U.S. Appl. No. 13/682,618.
U.S. Office Action dated Sep. 16, 2021, in U.S. Appl. No. 16/469,851.
U.S. Office Action dated Sep. 19, 2019 in U.S. Appl. No. 15/891,866.
U.S. Office Action dated Sep. 21, 2021, in U.S. Appl. No. 16/487,802.
U.S. Office Action dated Sep. 23, 2013 in U.S. Appl. No. 13/479,137.
U.S. Office Action dated Sep. 29, 2014 in U.S. Appl. No. 13/449,248.
U.S. Office Action dated Sep. 29, 2020 in U.S. Appl. No. 16/262,775.
U.S. Appl. No. 17/305,132, inventors Brown et al., filed Jun. 30, 2021.
U.S. Preliminary Amendment dated Oct. 30, 2019 for U.S. Appl. No. 16/013,770.
U.S. Appl. No. 63/080,899, inventors Makker et al., filed Sep. 21, 2020.
Werner, Christian, “Automatic cloud cover indicator system,” Journal of Applied Meteorology, vol. 12, Dec. 1973, pp. 1394-1400.
AU Office Action dated Nov. 28, 2022, in AU Application No. AU2021200070.
AU Office Action dated Feb. 22, 2022, in Application No. AU2021200070.
AU Office Action dated Nov. 19, 2021, in Application No. AU2020250299.
AU Office Action dated Sep. 29, 2022, in AU Application No. AU2021205049.
Australian Examination Report dated Feb. 28, 2022, in Application No. 2017376447.
CA Office Action dated Nov. 23, 2022, in Application No. CA2902106.
CA Office Action dated Feb. 11, 2022, in Application No. CA2902106.
CA Office Action dated Dec. 5, 2022 in Application No. CA3000852.
CA Office Action dated Dec. 7, 2021, in Application No. CA2963096.
CA Office Action dated Dec. 24, 2021, in Application No. CA2948668.
CA Office Action dated Oct. 4, 2022, in Application No. CA2991419.
CA Office Action dated Sep. 29, 2022, in Application No. CA2963096.
CN Office Action dated Mar. 9, 2022, in Application No. CN201911184096.0 with English Translation.
CN Office Action dated Sep. 28, 2022 in Application No. CN201980029927.9 with English translation.
CN Office Action dated Apr. 25, 2022, in Application No. CN201980027469.5 with English translation.
CN Office Action dated Feb. 25, 2022 in Application No. CN201980029927.9 with English translation.
CN Office Action dated Apr. 21, 2022 in Application No. CN201811232377.4 with English translation.
CN Office Action dated Apr. 27, 2022, in Application No. CN201780039437.8 with English translation.
CN Office Action dated Aug. 10, 2022, in Application No. CN201911184096.0 with English translation.
CN Office Action dated Jul. 7, 2022 in Application No. CN201780084583.2 with English translation.
CN Office Action dated Mar. 8, 2022, in Application No. CN201680063892.7 with English translation.
CN Office Action dated May 31, 2022, in Application No. CN201880022572.6 With English Translation.
CN Office Action dated Nov. 3, 2021, in Application No. 201780065447.9 with English translation.
CN Office Action dated Nov. 8, 2021, in Application No. 201880022572.6 with English translation.
CN Office Action dated Nov. 10, 2021, in Application No. CN201811232377.4 with English Translation.
CN Office Action dated Nov. 24, 2021, in Application No. 201780084583.2 with English Translation.
CN Office Action dated Oct. 14, 2022, in Application No. CN201880022572.6 With English Translation.
CN Office Action dated Oct. 29, 2021 in CN Application No. CN201710600395.2 with English translation.
CN Office Action dated Sep. 5, 2022, in Application No. CN201780039437.8 with English translation.
EP Office Action dated Mar. 4, 2022, in Application No. EP17859286.1.
EP Office Action dated Oct. 1, 2021, in application No. EP17807428.2.
EP Search Report dated Nov. 25, 2021, in Application No. EP21171305.2.
European Office Action dated Mar. 18, 2022, in Application No. 13777540.9.
European Office Action dated Sep. 5, 2022 in Application No. EP18756696.3.
European Office Action dated Sep. 29, 2022 in Application No. EP16854302.3.
Extended European search report dated Oct. 10, 2022, in Application No. EP22161794.7.
Extended European Search Report dated Oct. 13, 2021, for EP Application No. EP21163294.8.
Humann, C., “HDR sky imaging for real time control of facades,” presented Nov. 18, 2021 at Velux Build for Life conference. 21 pages of screenshots. Retrieved from internet: https://vimeo.com/647274396.
IN Office Action dated Aug. 5, 2022 In Application No. 202037045202.
IN Office Action dated Dec. 23, 2021, in Application No. 202138004652.
IN Office Action dated Dec. 17, 2021, in Application No. IN201917027304.
IN Office Action dated Dec. 18, 2019 in Application No. IN202038052140.
IN Office Action dated Feb. 25, 2022 in Application No. IN202138016166.
IN Office Action dated Mar. 30, 2022 in Application No. IN202138040983.
IN Office Action dated Mar. 4, 2022, in Application No. 202038052140.
International Search Report and Written Opinion dated Nov. 25, 2020, in PCT Application No. PCT/US2020/047525.
JP Office Action dated Jan. 4, 2022, in Application No. JP2020-215729 with English Translation.
JP Office Action dated Jul. 12, 2022, in Application No. JP2019-531271 with English translation.
JP Office Action dated Oct. 12, 2021, in Application No. JP2019531271 with Machine Translation.
KR Office Action dated Apr. 27, 2022, in Application No. KR10-2016-7032512 with English Translation.
KR Office Action dated Dec. 23, 2021, in application No. 1020197011968 with English Translation.
KR Office Action dated Feb. 18, 2022, in Application No. KR1020177011520 with English Translation.
Notice of Allowance dated Aug. 16, 2021 in U.S. Appl. No. 16/695,004.
Notice of Allowance dated Aug. 24, 2021 in U.S. Appl. No. 16/695,004.
Notice of Allowance dated Nov. 29, 2021 in U.S. Appl. No. 16/696,887.
Notice of Allowance dated Oct. 14, 2021 in U.S. Appl. No. 16/871,976.
Notice of Allowance dated Oct. 27, 2021 in U.S. Appl. No. 16/335,222.
Notice of Allowance dated Sep. 17, 2021 in U.S. Appl. No. 16/262,775.
Partial European Search Report dated Jul. 6, 2021 for EP Application No. EP21163294.8.
PCT International Preliminary Report on Patentability and Written Opinion dated Mar. 3, 2022, issued in PCT/US2020/047525.
PCT International Preliminary Report on Patentability dated May 5, 2022 issued in PCT/US2020/056164.
Reno, M., et al., "Global Horizontal Irradiance Clear Sky Models: Implementation and Analysis", Sandia Report, SAND2012-2389, 2012, pp. 1-67.
Subramaniam, S., “Daylighting Simulations with Radiance using Matrix-based Methods”, Lawrence Berkeley National Laboratory, Oct. 3, 2017, 145 pages.
TW Notice of Allowance and Search Report dated Sep. 9, 2021, in application No. TW110106134 with English Translation.
TW Office Action dated Sep. 15, 2022 In Application No. TW110140343 with English translation.
TW Office Action dated Apr. 11, 2022, in Application No. TW106134521 with English Translation.
TW Office Action dated Aug. 30, 2022, in Application No. TW108109593 with English translation.
TW Office Action dated Aug. 30, 2022 In Application No. TW110115143 with English translation.
TW Office Action dated Dec. 29, 2021, in application No. TW110124070 with English Translation.
TW Office Action dated Jun. 29, 2022 In Application No. TW108109631 with English translation.
TW Office Action dated Mar. 16, 2022, in Application No. TW106143996 with English translation.
TW Office Action dated Oct. 26, 2021 in TW Application No. TW20170143996 with English translation.
TW Office Action dated Sep. 13, 2022, in Application No. TW106134521 with English Translation.
U.S. Corrected Notice of Allowance dated Jan. 5, 2023 in U.S. Appl. No. 17/027,601.
U.S. Non-Final Office Action dated Aug. 31, 2022 in U.S. Appl. No. 16/469,851.
U.S. Non-Final Office Action dated Jul. 14, 2022 in U.S. Appl. No. 16/487,802.
U.S. Advisory Action dated Jun. 7, 2022 in U.S. Appl. No. 16/469,851.
U.S. Corrected Notice of Allowance dated Aug. 12, 2022 in U.S. Appl. No. 15/929,958.
U.S. Corrected Notice of Allowance dated Nov. 3, 2022 in U.S. Appl. No. 15/929,958.
U.S. Corrected Notice of Allowability dated Dec. 9, 2021 in U.S. Appl. No. 16/262,775.
U.S. Corrected Notice of Allowability dated Jan. 12, 2022, in U.S. Appl. No. 16/335,222.
U.S. Corrected Notice of Allowability dated Nov. 24, 2021, in U.S. Appl. No. 16/335,222.
U.S. Corrected Notice of Allowance dated Feb. 10, 2022 in U.S. Appl. No. 16/949,493.
U.S. Corrected Notice of Allowance dated Feb. 22, 2022 in U.S. Appl. No. 15/762,077.
U.S. Corrected Notice of Allowance dated Feb. 22, 2022 in U.S. Appl. No. 16/696,887.
U.S. Corrected Notice of Allowance dated Jan. 21, 2022 in U.S. Appl. No. 15/742,015.
U.S. Corrected Notice of Allowance dated May 4, 2022, in U.S. Appl. No. 16/949,493.
U.S. Corrected Notice of Allowance dated May 18, 2022, in U.S. Appl. No. 16/303,384.
U.S. Corrected Notice of Allowance dated Sep. 26, 2022 in U.S. Appl. No. 17/027,601.
U.S. Final Office Action dated Jan. 3, 2023 in U.S. Appl. No. 16/487,802.
U.S. Final Office Action dated Jan. 4, 2023 in U.S. Appl. No. 17/249,595.
U.S. Final Office Action dated Apr. 8, 2022, in U.S. Appl. No. 17/027,601.
U.S. Final Office Action dated Mar. 10, 2022, in U.S. Appl. No. 16/487,802.
U.S. Non Final Office Action dated Feb. 16, 2022 in U.S. Appl. No. 15/929,958.
U.S. Non Final Office Action dated Jan. 21, 2022, in U.S. Appl. No. 16/303,384.
U.S. Non-Final Office Action dated Dec. 22, 2022 in U.S. Appl. No. 17/305,132.
U.S. Non-Final Office Action dated Jul. 21, 2022 in U.S. Appl. No. 17/249,595.
U.S. Non-Final Office Action dated Dec. 6, 2022 in U.S. Appl. No. 16/303,384.
U.S. Non-Final Office Action dated Oct. 18, 2021 in U.S. Appl. No. 16/949,493.
U.S. Notice of Allowance dated Aug. 24, 2022 in U.S. Appl. No. 16/303,384.
U.S. Notice of Allowance dated Sep. 12, 2022 in U.S. Appl. No. 17/027,601.
U.S. Notice of Allowance dated Dec. 20, 2021, in U.S. Appl. No. 15/742,015.
U.S. Notice of Allowance dated Dec. 30, 2021, in U.S. Appl. No. 15/742,015.
U.S. Notice of Allowance dated Feb. 8, 2022 in U.S. Appl. No. 15/762,077.
U.S. Notice of Allowance dated Jan. 28, 2022 in U.S. Appl. No. 16/949,493.
U.S. Notice of Allowance dated Jul. 26, 2022, in U.S. Appl. No. 15/929,958.
U.S. Notice of Allowance dated May 4, 2022 in U.S. Appl. No. 16/303,384.
U.S. Notice of Allowance dated May 18, 2022, in U.S. Appl. No. 15/762,077.
U.S. Notice of Allowance dated Sep. 22, 2022 in U.S. Appl. No. 15/762,077.
U.S. Notice of Allowance dated Sep. 27, 2021 in U.S. Appl. No. 16/262,775.
U.S. Office Action dated Apr. 1, 2022, in U.S. Appl. No. 16/469,851.
U.S. Restriction Requirement dated Oct. 14, 2022, in U.S. Appl. No. 17/008,342.
CN Office Action dated Jan. 5, 2023, in Application No. CN201980027469.5 with English translation.
JP Office Action dated Jan. 10, 2023 in Application No. JP2019-531271 with English translation.
TW Office Action dated Dec. 19, 2022, in Application No. TW111117328 with English translation.
U.S. Appl. No. 18/150,146, Inventors Klawuhn et al., filed Jan. 4, 2023.
AU Office Action dated Feb. 15, 2023, in Application No. AU2021200070.
AU Office Action dated Feb. 21, 2023, in Application No. AU2021200070.
Australian Examination Report dated Feb. 20, 2023, in Application No. AU2017376447.
CN Office Action dated Feb. 14, 2023 in Application No. CN201780084583.2 with English translation.
EP Office Action dated Feb. 22, 2023, in Application No. EP22197030.4.
KR Office Action dated Mar. 6, 2023 in Application No. KR10-2022-7028868 with English translation.
U.S. Non-Final Office Action dated Apr. 12, 2023 in U.S. Appl. No. 17/450,091.
U.S. Non-Final Office Action dated Apr. 13, 2023 in U.S. Appl. No. 18/066,799.
U.S. Non-Final Office Action dated Mar. 14, 2023 in U.S. Appl. No. 17/008,342.
U.S. Notice of Allowance dated Apr. 3, 2023 in U.S. Appl. No. 16/469,851.
U.S. Notice of Allowance dated Mar. 1, 2023 in U.S. Appl. No. 15/762,077.
U.S. Notice of Allowance dated Mar. 13, 2023 in U.S. Appl. No. 16/469,851.
U.S. Notice of Allowance dated Mar. 16, 2023 in U.S. Appl. No. 17/305,132.
U.S. Notice of Allowance dated Mar. 24, 2023 in U.S. Appl. No. 16/469,851.
U.S. Restriction Requirement dated Mar. 14, 2023 in U.S. Appl. No. 17/651,013.
Related Publications (1)
Number Date Country
20220113184 A1 Apr 2022 US
Provisional Applications (1)
Number Date Country
62453407 Feb 2017 US
Continuations (1)
Number Date Country
Parent 16335222 US
Child 17304832 US
Continuation in Parts (4)
Number Date Country
Parent PCT/US2016/055709 Oct 2016 US
Child 16335222 US
Parent 15287646 Oct 2016 US
Child PCT/US2016/055709 US
Parent 14998019 Oct 2015 US
Child 15287646 US
Parent 14998019 Oct 2015 US
Child PCT/US2016/055709 US