Embodiments of this disclosure relate generally to a system for detecting conditions within a predetermined space and, more particularly, to a fiber optic detection system.
Conventional smoke detection systems operate by detecting the presence of smoke or other airborne pollutants. Upon detection of a threshold level of particles, an alarm or other signal, such as a notification signal, may be activated and operation of a fire suppression system may be initiated.
High sensitivity smoke detection systems may incorporate a pipe network consisting of one or more pipes with holes or inlets installed at positions where smoke or pre-fire emissions may be collected from a region or environment being monitored. Air is drawn into the pipe network through the inlets, such as via a fan, and is subsequently directed to a detector. In some conventional smoke detection systems, individual sensor units may be positioned at each sensing location, and each sensor unit has its own processing and sensing components.
Delays in detecting the presence of a fire may occur in conventional point smoke detectors and also in pipe network detection systems, for example due to the smoke transport time. In pipe network detection systems, due to the size of the pipe network, there is typically a time delay between when the smoke enters the pipe network through an inlet and when that smoke actually reaches the remote detector. In addition, because smoke or other pollutants initially enter the pipe network through only a few of the inlets, the smoke mixes with the clean air provided to the pipe from the remainder of the inlets. As a result of this dilution, the smoke detectable from the smoke and air mixture may not exceed the threshold necessary to indicate the existence of a fire.
According to an embodiment, a detection system for measuring one or more conditions within a predetermined area includes a fiber harness having at least one fiber optic cable for transmitting light. The at least one fiber optic cable defines a node arranged to measure the one or more conditions. A light source is coupled to the at least one fiber optic cable for emitting a modulated light to the node. The modulated light is transmitted into the predetermined area. A light sensitive device is coupled to the at least one fiber optic cable for receiving scattered light associated with the node. A control unit is operably coupled to the light source and to the light sensitive device to determine at least one of a presence and magnitude of the one or more conditions within the predetermined area.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light is emitted from the light source in a known pattern.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light is emitted from the light source in a randomly generated pattern.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light includes pulses of light that vary in width.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light includes pulses of light that vary in at least one of intensity, frequency, and phase.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light includes a continuous signal that varies in at least one of amplitude, frequency, and phase.
In addition to one or more of the features described above, or as an alternative, in further embodiments determining if the one or more conditions are present includes computing an autocorrelation.
In addition to one or more of the features described above, or as an alternative, in further embodiments determining if the one or more conditions are present includes computing a cross-correlation with another pattern.
In addition to one or more of the features described above, or as an alternative, in further embodiments the one or more conditions include smoke.
In addition to one or more of the features described above, or as an alternative, in further embodiments the predetermined area is one of a building and an avionics bay of an aircraft.
According to another embodiment, a method of measuring one or more conditions within a predetermined area includes transmitting a first pattern of light along a fiber harness and through a node of a fiber optic cable of the fiber harness. The node is arranged to measure the one or more conditions. A second pattern of scattered light associated with the node is received. The first pattern and the second pattern are used to measure at least one of the presence and magnitude of the one or more conditions within the predetermined area.
In addition to one or more of the features described above, or as an alternative, in further embodiments using the first pattern and the second pattern to measure the presence of the condition within the predetermined area includes computing a correlation between the first pattern and the second pattern.
In addition to one or more of the features described above, or as an alternative, in further embodiments using the first pattern and the second pattern to measure the presence of the condition within the predetermined area includes computing a correlation between the second pattern and a third pattern.
In addition to one or more of the features described above, or as an alternative, in further embodiments the fiber optic detection system is configured to detect a plurality of conditions and the second pattern varies in response to each of the plurality of conditions.
In addition to one or more of the features described above, or as an alternative, in further embodiments the modulated light includes pulses of light that vary in at least one of intensity, frequency, and phase.
In addition to one or more of the features described above, or as an alternative, in further embodiments the first pattern of light is a signal that varies in at least one of amplitude, frequency, and phase.
In addition to one or more of the features described above, or as an alternative, in further embodiments using the first pattern and the second pattern to measure at least one of the presence and magnitude comprises computing an autocorrelation.
In addition to one or more of the features described above, or as an alternative, in further embodiments using the first pattern and the second pattern to measure at least one of the presence and magnitude comprises computing a cross-correlation.
In addition to one or more of the features described above, or as an alternative, in further embodiments the one or more conditions include smoke.
In addition to one or more of the features described above, or as an alternative, in further embodiments the predetermined area is one of a building and an avionics bay of an aircraft.
The subject matter, which is regarded as the present disclosure, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the present disclosure are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the present disclosure, together with advantages and features, by way of example with reference to the drawings.
Referring now to the FIGS., a system 20 for detecting one or more conditions or events within a designated area is illustrated. The detection system 20 may be able to detect one or more hazardous conditions, including but not limited to the presence of smoke, fire, temperature, flame, or any of a plurality of pollutants, combustion products, or chemicals. Alternatively, or in addition, the detection system 20 may be configured to perform monitoring operations of people, lighting conditions, or objects. In an embodiment, the system 20 may operate in a manner similar to a motion sensor, such as to detect the presence of a person, occupants, or unauthorized access to the designated area for example. The conditions and events described herein are intended as an example only, and other suitable conditions or events are within the scope of the disclosure.
The detection system 20 uses light to evaluate a volume for the presence of a condition. In this specification, the term “light” means coherent or incoherent radiation at any frequency or a combination of frequencies in the electromagnetic spectrum. In an example, the photoelectric system uses light scattering to determine the presence of particles in the ambient atmosphere to indicate the existence of a predetermined condition or event. In this specification, the term “scattered light” may include any change to the amplitude/intensity or direction of the incident light, including reflection, refraction, diffraction, absorption, and scattering in any/all directions. In this example, light is emitted into the designated area; when the light encounters an object (a person, smoke particle, or gas molecule for example), the light can be scattered and/or absorbed due to a difference in the refractive index of the object compared to the surrounding medium (air). Depending on the object, the light can be scattered in all different directions. Observing any changes in the incident light, by detecting light scattered by an object for example, can provide information about the designated area including determining the presence of a predetermined condition or event.
In its most basic form, as shown in
As shown in
In another embodiment, the detection system 20 can include a plurality of nodes 34. For example, as illustrated in
In embodiments where a single light sensitive device 38 is configured to receive scattered light from a plurality of nodes 34, the control system 50 is able to localize the scattered light, i.e. identify the scattered light received from each of the plurality of nodes 34. In an embodiment, the control system 50 uses the position of each node 34, specifically the length of the fiber optic cables 28 associated with each node 34 and the corresponding time of flight (i.e. the time elapsed between when the light was emitted by the light source 36 and when the light was received by the light sensitive device 38), to associate different parts of the light signal with each of the respective nodes 34 that are connected to that light sensitive device 38. Alternatively, or in addition, the time of flight may include the time elapsed between when the light is emitted from the node and when the scattered light is received back at the node. In such embodiments, the time of flight provides information regarding the distance of the object relative to the node.
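By way of a hedged, non-limiting illustration (not part of the original disclosure), the sketch below shows one way such time-of-flight localization could be implemented in software. The fiber lengths, group index, sample rate, window width, and function names are assumed placeholders, not values from the disclosure.

```python
# Hedged sketch only (not from the disclosure): associating portions of a
# detector record with individual nodes 34 by their expected time of flight.
# Fiber lengths, group index, sample rate, and window width are assumed values.
import numpy as np

C_VACUUM = 3.0e8                    # speed of light in vacuum, m/s
GROUP_INDEX = 1.47                  # assumed group index of silica fiber
GROUP_VELOCITY = C_VACUUM / GROUP_INDEX

def expected_arrival_times(node_fiber_lengths_m):
    """Round-trip time of flight from the light source to each node and back."""
    return {node: 2.0 * length / GROUP_VELOCITY
            for node, length in node_fiber_lengths_m.items()}

def split_signal_by_node(samples, sample_rate_hz, node_fiber_lengths_m,
                         window_s=20e-9):
    """Slice the detector record into per-node windows around each arrival time."""
    per_node = {}
    for node, t_arrival in expected_arrival_times(node_fiber_lengths_m).items():
        start = max(int((t_arrival - window_s / 2) * sample_rate_hz), 0)
        stop = max(int((t_arrival + window_s / 2) * sample_rate_hz), 0)
        per_node[node] = samples[start:stop]
    return per_node

# Example: three hypothetical nodes at 5 m, 12 m and 30 m of fiber, 1 GS/s record.
lengths = {"node_1": 5.0, "node_2": 12.0, "node_3": 30.0}
record = np.random.rand(100_000)    # stand-in for a digitized detector record
windows = split_signal_by_node(record, 1e9, lengths)
```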
In an embodiment, illustrated in the cross-section of the fiber optic cable shown in
With reference now to
Alternatively, the fiber harness 30 may include a fiber optic cable (not shown) having a plurality of branches 32 integrally formed therewith and extending therefrom. The branches 32 may include only a single fiber optic core. The configuration, specifically the spacing of the nodes 34 within a fiber harness 30 may be substantially equidistant, or may vary over the length of the harness 30. In an embodiment, the positioning of each node 34 may correlate to a specific location within the designated area.
With reference now to
The detection system 20 may be configured to monitor a predetermined area such as a building. The detection system 20 may be especially utilized for predetermined areas having a crowded environment, such as a server room, as shown in
The control system 50 of the detection system 20 is utilized to manage the detection system operation and may include control of components, data acquisition, data processing and data analysis. The control system 50, illustrated in
The processor 54 may be coupled to the at least one light source 36 and the at least one light sensitive device 38 via connectors. The light sensitive device 38 is configured to convert the scattered light received from a node 34 into a corresponding signal receivable by the processor 54. In an embodiment, the signal generated by the light sensing device 38 is an electronic signal. The signal output from the light sensing device 38 is then provided to the control unit 52 for processing using an algorithm to determine whether a predefined condition is present.
The signal received by or outputted from the light sensitive device(s) 38 may be amplified and/or filtered, such as by a comparator (not shown), to reduce or eliminate irrelevant information within the signal prior to being communicated to the control unit 52 located remotely from the node 34. In such embodiments, the amplification and filtering of the signal may occur directly within the light sensing device 38, or alternatively, may occur via one or more components disposed between the light sensing device 38 and the control unit 52. The control unit 52 may control the data acquisition of the light sensitive device 38, such as by adjusting the gain of the amplifier, the bandwidth of the filters, the sampling rates, and the timing and amount of data buffering, for example.
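As a hedged illustration only (not the disclosed implementation), the following sketch shows how a gain and a band-limiting filter could be applied to a digitized detector signal before it reaches the control unit 52. The passband, filter order, gain, and sample rate are assumptions.

```python
# Hedged sketch only: amplifying and band-limiting a sampled detector signal.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def condition_signal(raw, sample_rate_hz, low_hz=0.1, high_hz=10.0, gain=5.0):
    """Apply an illustrative gain and Butterworth band-pass filter."""
    nyquist = sample_rate_hz / 2.0
    sos = butter(2, [low_hz / nyquist, high_hz / nyquist],
                 btype="band", output="sos")
    return gain * sosfiltfilt(sos, raw)

# Example: a 100 Hz-sampled record limited to an assumed 0.1-10 Hz passband.
raw = np.random.randn(10_000)
conditioned = condition_signal(raw, sample_rate_hz=100.0)
```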
With reference now to
Data representative of the output from each APD sensor 64 in the APD array 66 is periodically taken by a switch 68, or alternatively, is collected simultaneously. The data acquisition 67 collects the electronic signals from the APD and associates the collected signals with metadata. The metadata, as an example, can be time, frequency, location, or node. In an example, the electronic signals from the APD are synchronized to the laser modulation such that the electrical signals are collected for a period of time that starts when the laser is pulsed and extends to several microseconds after the laser pulse. The data will be collected and processed by the processor 54 to determine whether any of the nodes 34 indicates the existence of a predefined condition or event. In an embodiment, only a portion of the data outputted by the sensor array 66, for example the data from a first APD sensor 64 associated with a first fiber harness 30, is collected. The switch 68 is therefore configured to collect information from the various APD sensors 64 of the sensor array 66 sequentially. While the data collected from a first APD sensor 64 is being processed to determine if an event or condition has occurred, the data from a second APD sensor 64 of the sensor array 66 is collected and provided to the processor 54 for analysis. When a predefined condition or event has been detected from the data collected from one of the APD sensors 64, the switch 68 may be configured to provide additional information from the same APD sensor 64 to the processor 54 to track the condition or event.
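As a hedged sketch (assumed, not the disclosed implementation), the code below gates each APD record to a window that starts at the laser pulse and extends a few microseconds afterwards, and tags each collected record with simple metadata while polling the channels sequentially. All names and timing values are illustrative.

```python
# Hedged sketch only: gated, metadata-tagged acquisition from an array of APD channels.
import numpy as np

def gated_window(record, sample_rate_hz, pulse_time_s, window_s=5e-6):
    """Keep only the samples from the laser pulse to window_s after it."""
    start = int(pulse_time_s * sample_rate_hz)
    stop = int((pulse_time_s + window_s) * sample_rate_hz)
    return record[start:stop]

def poll_apd_array(apd_records, sample_rate_hz, pulse_times_s):
    """Collect a gated, metadata-tagged record from each APD channel in turn."""
    collected = []
    for channel, (record, t_pulse) in enumerate(zip(apd_records, pulse_times_s)):
        collected.append({
            "channel": channel,          # which APD / fiber harness (metadata)
            "pulse_time_s": t_pulse,     # when the laser fired (metadata)
            "samples": gated_window(record, sample_rate_hz, t_pulse),
        })
    return collected

# Example: two channels sampled at 1 GS/s, both pulsed at t = 0.
records = [np.random.rand(10_000), np.random.rand(10_000)]
data = poll_apd_array(records, 1e9, [0.0, 0.0])
```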
A method of operation 100 of the detection system 20 is illustrated in
Using the algorithm 58 executed by the processor 54, each of the signals representing the scattered light received by the corresponding nodes 34 is evaluated to determine whether the light at the node 34 is indicative of a predefined condition, such as smoke for example. With reference to
In an embodiment, the time of flight record is parsed and features are extracted. The time of flight record can cover a period of time. For example, a time of flight record can record light intensity over 0.001-1,000,000 nanoseconds, 0.1-100,000 nanoseconds, or 0.1-10,000 microseconds. The features extracted from the signal can include, but are not limited to, height, full width at half maximum, signal pick up time, signal drop off time, group velocity, integration, rate of change, mean, and variance, for example.
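The following is a minimal, assumed sketch of how several of the listed features might be computed from a single digitized time-of-flight record; the feature set shown and the sample-rate handling are illustrative only.

```python
# Hedged sketch only: computing several of the listed features from one
# digitized time-of-flight record using plain NumPy.
import numpy as np

def extract_features(record, sample_rate_hz):
    t = np.arange(record.size) / sample_rate_hz
    peak = float(record.max())
    above = np.flatnonzero(record >= peak / 2.0)          # samples above half max
    pick_up = float(t[above[0]]) if above.size else float("nan")
    drop_off = float(t[above[-1]]) if above.size else float("nan")
    return {
        "height": peak,
        "fwhm": drop_off - pick_up,                       # full width at half maximum
        "pick_up_time": pick_up,
        "drop_off_time": drop_off,
        "integral": float(np.trapz(record, t)),
        "rate_of_change": float(np.max(np.abs(np.gradient(record, t)))),
        "mean": float(record.mean()),
        "variance": float(record.var()),
    }

# Example on a synthetic pulse sampled at 1 GS/s.
fs = 1e9
t = np.arange(2000) / fs
pulse = np.exp(-((t - 500e-9) ** 2) / (2 * (50e-9) ** 2))
features = extract_features(pulse, fs)
```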
Through application of the data processing, illustrated schematically at block 76, the features may then be further processed by using, for example, smoothing, Fourier transforms, or cross correlation. In an embodiment, the processed data is then sent to the detection algorithm at block 78 to determine whether or not the signal indicates the presence and/or magnitude of a condition or event at a corresponding node 34. This evaluation may be a simple binary comparison that does not identify the magnitude of deviation between the characteristic and a threshold. The evaluation may also be a comparison of a numerical function of the characteristic or characteristics to a threshold. The threshold may be determined a priori or may be determined from the signal. The determination of the threshold from the signal may be called background learning. Background learning may be accomplished by adaptive filtering, model-based parameter estimation, statistical modeling, and the like. In some embodiments, if one of the identified features does not exceed a threshold, the remainder of the detection algorithm is not applied in order to reduce the total amount of processing performed during the detection algorithm. In the event that the detection algorithm indicates the presence of the condition at one or more nodes 34, an alarm or other fire suppression system may, but need not, be activated. It should be understood that the process for evaluating the data illustrated and described herein is intended as an example only and that other processes including some or all of the steps indicated in the FIG. are also contemplated herein.
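As a non-authoritative sketch of the background-learning idea described above (a simple statistical-baseline variant, one of several approaches the disclosure lists), a mean and spread learned from early samples can serve as the threshold for a binary comparison. The warm-up length and sigma multiplier are assumptions.

```python
# Hedged sketch only: "background learning" via a baseline learned from early
# samples, followed by a simple binary comparison against the learned threshold.
import numpy as np

def detect_with_learned_background(feature_values, warmup=20, n_sigmas=5.0):
    """Learn a background mean/spread, then flag later values that exceed it."""
    x = np.asarray(feature_values, dtype=float)
    baseline = x[:warmup]
    threshold = baseline.mean() + n_sigmas * baseline.std()
    return x[warmup:] > threshold       # binary: no magnitude of deviation reported

# Example: a quiet background followed by one clear excursion.
background = np.random.normal(1.0, 0.05, size=100)
readings = np.concatenate([background, [1.8]])
alarms = detect_with_learned_background(readings)   # final element is True
```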
The evaluation may also advantageously employ classifiers, including those that may be learned from the signal via deep learning techniques including, but not limited to, deep neural networks, convolutional neural networks, recursive neural networks, dictionary learning, bag of visual/depth word techniques, Support Vector Machine (SVM), Decision Trees, Decision Forests, Fuzzy Logic, and the like. The classifiers may also be constructed using Markov Model techniques, Hidden Markov Models (HMM), Markov Decision Processes (MDP), Partially Observable MDPs, Markov Decision Logic, Probabilistic Programming, and the like.
In addition to evaluating the signals generated from each node 34 individually, the processor 54 may additionally be configured to evaluate the plurality of signals or characteristics thereof collectively, such as through a data fusion operation to produce fused signals or fused characteristics. The data fusion operation may provide information related to the time and spatial evolution of an event or predetermined condition. As a result, a data fusion operation may be useful in detecting a lower level event that is insufficient to initiate an alarm at any of the nodes 34 individually. For example, in the event of a slow burning fire, the light signal generated by a small amount of smoke near each of the nodes 34 individually may not be sufficient to initiate an alarm. However, when the signals from the plurality of nodes 34 are reviewed in aggregate, the increase in light returned to the light sensitive device 38 from multiple nodes 34 may indicate the occurrence of an event or the presence of an object not otherwise detected. In an embodiment, the fusion is performed by Bayesian Estimation. Alternatively, linear or non-linear joint estimation techniques may be employed such as maximum likelihood (ML), maximum a posteriori (MAP), non-linear least squares (NNLS), clustering techniques, support vector machines, decision trees and forests, and the like.
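The sketch below illustrates, under assumed Gaussian statistics that are not taken from the disclosure, how per-node features could be fused into a single aggregate score so that a weak but widespread increase triggers a fused alarm even though no single node would. A summed log-likelihood ratio stands in here for a full Bayesian estimator.

```python
# Hedged sketch only: fusing per-node features with a summed log-likelihood
# ratio. The "clean" and "fire" statistics and the fused threshold are
# illustrative assumptions.
import numpy as np

def fused_alarm(node_features, clean_mean=0.0, fire_mean=1.5, sigma=1.0,
                fused_threshold=2.0):
    """Aggregate weak evidence from many nodes into one fused decision."""
    x = np.asarray(node_features, dtype=float)
    log_lr = ((x - clean_mean) ** 2 - (x - fire_mean) ** 2) / (2.0 * sigma ** 2)
    score = float(log_lr.sum())
    return score, score > fused_threshold

# Example: eight nodes each only slightly elevated; no single node is
# convincing on its own, but the fused score crosses the aggregate threshold.
score, alarm = fused_alarm([1.0] * 8)    # score = 3.0, alarm = True
```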
As illustrated and described above, the processor 54 is configured to analyze the signals generated by at least one light sensing device 38 relative to time. In another embodiment, the detection algorithm may be configured to apply one or more of a Fourier transform, Wavelet transform, space-time transform, Choi-Williams distribution, Wigner-Ville distribution and the like, to the signals to convert the signals from a temporal domain to a frequency domain. This transformation may be applied to the signals when the nodes 34 are being analyzed individually, when the nodes 34 are being analyzed collectively during a data fusion, or both.
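As an assumed illustration of moving a node signal from the temporal domain to the frequency domain, the sketch below computes a short-time Fourier transform (spectrogram) as a stand-in for the transforms listed above; the sample rate and test signal are placeholders.

```python
# Hedged sketch only: a time-frequency representation of a node signal.
import numpy as np
from scipy.signal import spectrogram

fs = 100.0                                   # assumed sample rate, Hz
t = np.arange(0, 60.0, 1.0 / fs)
node_signal = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(t.size)

freqs, times, power = spectrogram(node_signal, fs=fs, nperseg=1024)
# power[i, j] is the signal energy near frequency freqs[i] in time slice times[j]
```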
The relationship between the light scattering and the magnitude or presence of a condition is inferred by measuring a signal's causality and dependency. As an example, the measure of causality utilizes one or more signal features as an input and determines one or more outputs from a calculation of a hypothesis testing method, foreground ratio, second derivative, mean, or Granger Causality Test. Similarly, one or more signal features may be used as an input to evaluate the dependency of a signal. One or more outputs are selected from a calculation of a correlation, fast Fourier transform coefficients, a second derivative, or a window. The magnitude and presence of the condition is then based on the causality and dependency. The magnitude and presence of a condition may be calculated utilizing one or more evaluation approaches: a threshold, velocity, rate of change, or a classifier. The detection algorithm may include utilizing the output from the calculation of causality, dependency, or both. This output is used to indicate the presence of the condition at one or more nodes 34 and initiate a response.
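A hedged sketch of one possible dependency measure follows: a normalized, lagged cross-correlation between two node signals whose peak value and lag could feed the causality/dependency evaluation described above. It is illustrative only and not the disclosed algorithm.

```python
# Hedged sketch only: a normalized lagged cross-correlation as a dependency measure.
import numpy as np

def peak_cross_correlation(sig_a, sig_b):
    a = np.asarray(sig_a, dtype=float)
    b = np.asarray(sig_b, dtype=float)
    a = (a - a.mean()) / (a.std() * a.size)
    b = (b - b.mean()) / b.std()
    corr = np.correlate(a, b, mode="full")        # normalized cross-correlation
    k = int(np.argmax(np.abs(corr)))
    return float(corr[k]), k - (b.size - 1)       # peak value, lag in samples

# Example: two noisy measurements of the same slowly varying background.
rng = np.random.default_rng(1)
trend = 0.01 * np.cumsum(rng.standard_normal(500))
peak, lag = peak_cross_correlation(trend + 0.05 * rng.standard_normal(500),
                                   trend + 0.05 * rng.standard_normal(500))
# peak is close to 1 and lag close to 0 for strongly dependent signals
```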
Because the frequency of smoke varies within a small range, such as from about 0.01 Hz to about 10 Hz for example, evaluation of the signals with respect to frequency may effectively and accurately determine the presence of smoke within the predetermined space 82. The detection algorithm may be configured to evaluate the signals in a fixed time window to determine the magnitude of the frequency or the strength of the motion of the smoke. Accordingly, if the magnitude of a frequency component exceeds a predetermined threshold, the detection algorithm may initiate an alarm indicating the presence of a fire. In an embodiment, the predetermined threshold is about 10 Hz such that when the magnitude of the optical smoke frequency exceeds the threshold, smoke is present.
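As an assumed, non-limiting sketch, the band-limited magnitude check described above might look as follows; the window, sample rate, and magnitude threshold are placeholders rather than values from the disclosure.

```python
# Hedged sketch only: summing the spectral magnitude in the roughly 0.01-10 Hz
# band described above and comparing it to a fixed threshold.
import numpy as np

def smoke_band_magnitude(window, sample_rate_hz, band=(0.01, 10.0)):
    spectrum = np.abs(np.fft.rfft(window * np.hanning(window.size)))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / sample_rate_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return float(spectrum[in_band].sum())

def smoke_alarm(window, sample_rate_hz, magnitude_threshold=50.0):
    return smoke_band_magnitude(window, sample_rate_hz) > magnitude_threshold

# Example: a 0.5 Hz oscillation sampled at 100 Hz over a 60 s fixed window.
fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
print(smoke_alarm(2.0 * np.sin(2 * np.pi * 0.5 * t), fs))   # True
```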
In an embodiment, the algorithm 58 is configured to distinguish between different events or conditions based on the rate of change in the light scattered by the atmosphere near the node 34 and received by one or more of the nodes 34 over time. With reference to
To reduce the noise associated with each signal, the light emitting device 36 may be modulated such that the device 36 is selectively operated to generate modulated light in a specific pattern. In an embodiment, the light within the pattern may vary in intensity, width, frequency, or phase, and may comprise discrete pulses or may be continuous. The specific pattern of light may be designed to have desirable properties such as a specific autocorrelation with itself or cross-correlation with a second specific pattern. When the light is emitted in a specific pattern, the light scattered back to a corresponding light sensing device 38 should arrive in substantially the same pattern. Use of one or more specific and known patterns provides enhanced processing capabilities by allowing the system 20 to reduce overall noise. This reduction in noise, when combined with the signal processing, may result in an improved signal to noise ratio, and the total number of false events or conditions detected will decrease. Alternatively, or in addition, the device sensitivity may be improved, thereby extending the detection limits of the detection system 20. Similarly, by cross-correlating one or more second patterns, specific causes of transmitted or reflected signals may be distinguished, e.g. by Bayesian estimation of the respective cross-correlations of the received signal with the one or more second patterns.
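As a hedged illustration of the pattern-correlation idea, the sketch below emits a pseudorandom pulse pattern and cross-correlates the received record against it (a matched filter), so that a weak scattered copy of the known pattern stands out above uncorrelated noise. The pattern length, delay, amplitude, and noise level are assumptions.

```python
# Hedged sketch only: matched filtering of a received record against a known
# pseudorandom pulse pattern.
import numpy as np

rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=128).astype(float)         # known pulse pattern

delay, amplitude = 40, 0.2
received = np.zeros(512)
received[delay:delay + pattern.size] += amplitude * pattern   # weak scattered copy
received += 0.1 * rng.standard_normal(received.size)          # detector noise

correlation = np.correlate(received, pattern - pattern.mean(), mode="valid")
estimated_delay = int(np.argmax(correlation))                 # recovers ~40 samples
```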
In addition, modulation of the light signal emitted by the light source 36 may provide improved detection by determining more information about the event or condition causing the scatter in the light signal received by the node 34. For example, such modulation may allow the system 20 to more easily distinguish between a person walking through the designated area adjacent a node, as shown in
Referring now to
As shown in
While in the embodiment of
Referring now to
As shown in
Referring now to
In some embodiments, both lens 84 and mirror 86 may be utilized at node 34. Further, while in the embodiments illustrated in
In addition to smoke or dust, the system 20 may be utilized to monitor or detect pollutants such as volatile organic compounds (VOC's), particle pollutants such as PM2.5 or PM10.0 particles, biological particles, and/or chemicals or gases such as H2, H2S, CO2, CO, NO2, NO3, or the like. Multiple wavelengths may be transmitted by the light source 36 to enable simultaneous detection of smoke, as well as individual pollutant materials. For example, a first wavelength may be utilized for detection of smoke, while a second wavelength may be utilized for detection of VOC's. Additional wavelengths may be utilized for detection of additional pollutants, and using multiple wavelength information in aggregate may enhance sensitivity and provide discrimination of gas species from false or nuisance sources. In order to support multiple wavelengths, one or more lasers may be utilized to emit several wavelengths. Alternatively, the control system can provide selectively controlled emission of the light. Utilization of the system 20 for pollutant detection can lead to improved air quality in the predetermined space 82 as well as improved safety.
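As a hedged, assumed sketch of multi-wavelength discrimination, the code below compares scattering responses at two illustrative wavelength channels to separate small-particle returns from large-particle nuisance sources; the wavelength channels and decision limits are placeholders, not values from the disclosure.

```python
# Hedged sketch only: discriminating particle types from responses at two
# assumed wavelength channels.
def classify_particles(response_by_wavelength_nm):
    """response_by_wavelength_nm: e.g. {470: 0.8, 850: 0.3} scattering magnitudes."""
    blue = response_by_wavelength_nm.get(470, 0.0)
    infrared = response_by_wavelength_nm.get(850, 0.0)
    if blue < 0.1 and infrared < 0.1:
        return "clean air"
    ratio = blue / (infrared + 1e-9)     # smaller particles scatter blue more strongly
    return "small particles (e.g. smoke)" if ratio > 2.0 \
        else "large particles (e.g. dust or other nuisance source)"

# Example usage with hypothetical channel readings.
print(classify_particles({470: 0.8, 850: 0.3}))   # small particles (e.g. smoke)
```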
In some embodiments, such as shown in
In another embodiment, such as shown in
Further, as an alternative to, or in addition to, the splice connection, fused connections, or one or more solid state switching devices, optical amplifiers 96 may be placed along the fiber harness 30 to amplify signals proceeding through the fiber harness 30. The optical amplifier 96 may be located, for example as shown in
Referring now to
Referring now to
While the disclosure has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the disclosure is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the disclosure. Additionally, while various embodiments of the disclosure have been described, it is to be understood that aspects of the disclosure may include only some of the described embodiments. Accordingly, the disclosure is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
This application is a National Stage application of PCT/US2017/060900, filed Nov. 9, 2017, which claims the benefit of U.S. Provisional Application No. 62/420,885, filed Nov. 11, 2016, both of which are incorporated by reference in their entirety herein.