Embodiments relate generally to detection of gas plumes, and more particularly to automated detection of gas plumes using optical gas imaging.
Methane (CH4) is an odorless and colorless naturally occurring organic molecule, present in the atmosphere at an average ambient level of approximately 1.85 ppm as of 2018 and projected to continue climbing. While methane is found globally in the atmosphere, a significant amount is collected or “produced” through anthropogenic processes, including the exploration, extraction, and distribution of petroleum in the form of natural gas. Natural gas, an odorless and colorless gas, is a primary source of energy used to produce electricity and heat. The main component of natural gas is methane (typically about 93.9 mol % CH4). While the extraction of natural gas is a large source of methane released to the atmosphere, other major contributors include livestock farming (enteric fermentation) and solid waste and wastewater treatment (anaerobic digestion). Optical cells may be used to detect methane and other trace gases.
A system embodiment may include: a camera; a processor in communication with the camera, the processor configured to: capture at least two images of a scene with the camera; compare the captured at least two images to determine at least one of: a motion associated with a movement of the camera and a motion associated with a movement of a gas plume; apply a color to each pixel of the captured images based on at least one of: a direction of movement and a velocity of movement between the at least two captured images; subtract pixels from the colored pixels associated with the movement of the camera; and generate an image of an output of a gas plume based on the subtracted pixels and the applied color pixels.
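By way of illustration only, the following Python sketch shows one way the recited steps could be composed using OpenCV's dense optical flow (discussed further below). The grayscale-frame assumption, the flow parameters, the median-based estimate of the camera-induced motion, and the residual threshold are assumptions made for this example, not features required by any embodiment.

```python
# Illustrative sketch only; parameter values and thresholds are assumptions.
import cv2
import numpy as np

def generate_plume_image(frame_a, frame_b, residual_threshold=1.0):
    """Compare two 8-bit grayscale frames, color each pixel by motion direction
    and velocity, subtract the camera-induced dominant motion, and return an
    image in which remaining (plume-like) motion is accentuated."""
    # Per-pixel motion (dx, dy) between the two captured images.
    flow = cv2.calcOpticalFlowFarneback(frame_a, frame_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Apply a color to each pixel: direction -> hue, velocity -> brightness.
    hsv = np.zeros((*frame_a.shape, 3), dtype=np.uint8)
    hsv[..., 0] = ang * 180 / np.pi / 2
    hsv[..., 1] = 255
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)

    # Subtract pixels whose motion matches the dominant (camera) motion.
    dominant = np.median(flow.reshape(-1, 2), axis=0)
    residual = np.linalg.norm(flow - dominant, axis=2)
    hsv[residual < residual_threshold] = 0

    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
```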
In additional system embodiments, the camera may be an optical gas imaging (OGI) camera. In additional system embodiments, the OGI camera may be tuned to specific wavelengths in the infra-red (IR). In additional system embodiments, the gas plume may be a trace gas plume.
In additional system embodiments, the captured scene comprises a site with one or more structures. In additional system embodiments, the one or more structures comprise one or more potential gas sources. In additional system embodiments, the captured at least two images may be compared to determine the motion associated with the movement of the camera and the motion associated with the movement of the gas plume.
In additional system embodiments, the color may be applied to each pixel of the captured images based on the direction of movement and the velocity of movement between the at least two captured images. In additional system embodiments, the direction of movement and the velocity of movement of the camera between the at least two captured images may be different than the direction of movement and the velocity of movement of the gas plume. In additional system embodiments, the generated image of the output of the gas plume may be isolated from a background image of the at least two captured images.
A method embodiment may include: capturing at least two images of a scene with a camera; comparing, by a processor in communication with the camera, the captured at least two images to determine at least one of: a motion associated with a movement of the camera and a motion associated with a movement of a gas plume; applying, by the processor, a color to each pixel of the captured images based on at least one of: a direction of movement and a velocity of movement between the at least two captured images; subtracting, by the processor, pixels from the colored pixels associated with the movement of the camera; and generating, by the processor, an image of an output of a gas plume based on the subtracted pixels and the applied color pixels.
In additional method embodiments, the camera may be an optical gas imaging (OGI) camera. In additional method embodiments, the OGI camera may be tuned to specific wavelengths in the infra-red (IR). In additional method embodiments, the gas plume may be a trace gas plume.
In additional method embodiments, the captured scene comprises a site with one or more structures. In additional method embodiments, the one or more structures comprise one or more potential gas sources. In additional method embodiments, the captured at least two images may be compared to determine the motion associated with the movement of the camera and the motion associated with the movement of the gas plume.
In additional method embodiments, the color may be applied to each pixel of the captured images based on the direction of movement and the velocity of movement between the at least two captured images. In additional method embodiments, the direction of movement and the velocity of movement of the camera between the at least two captured images may be different than the direction of movement and the velocity of movement of the gas plume, and where the generated image of the output of the gas plume may be isolated from a background image of the at least two captured images.
Another method embodiment may include: capturing at least two images of a scene with an optical gas imaging (OGI) camera; detecting, by a processor in communication with the camera, a dominant movement direction and velocity, where the detected dominant movement direction and velocity may be based on a movement of the camera between the captured at least two images; applying, by the processor, a color to each pixel of the captured images based on a direction of movement and a velocity of movement between the at least two captured images; subtracting, by the processor, pixels from the colored pixels associated with the dominant movement direction and velocity; and generating, by the processor, an image of an output of a gas plume based on the subtracted pixels and the applied color pixels, where the gas plume may be accentuated in the generated image.
The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Like reference numerals designate corresponding parts throughout the different views. Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which:
The described technology concerns one or more methods, systems, apparatuses, and mediums storing processor-executable process steps for the detection of gas plumes using optical gas imaging.
The techniques introduced below may be implemented by programmable circuitry programmed or configured by software and/or firmware, or entirely by special-purpose circuitry, or in a combination of such forms. Such special-purpose circuitry (if any) can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
The described technology may also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), or the Internet. In a distributed computing environment, program modules or subroutines may be located in both local and remote memory storage devices. Those skilled in the relevant art will recognize that portions of the described technology may reside on a server computer, while corresponding portions may reside on a client computer (e.g., PC, mobile computer, tablet, or smart phone). Data structures and transmission of data particular to aspects of the technology are also encompassed within the scope of the described technology.
With respect to
When looking at a still image of a site, it may be challenging to detect a gas plume. For example, and with respect to
With respect to
The computing device 110 of
In one embodiment, the computing device 110 may execute steps to perform a dense optical flow algorithm, such as the Gunnar Farneback algorithm. More specifically, an image may contain a very large collection of pixels (such as the pixels of the pixel map 118), and a corresponding sequence of images may reveal the movement of those pixels from frame to frame. These pixels change constantly as one frame is replaced by another. The camera 108 may also be moving, which contributes to the overall direction in which pixels appear to move across the scene in the field of view. The optical flow algorithm executed by the computing device 110 may quantify the movement in the scene in real time. In one embodiment, the Gunnar Farneback dense optical flow algorithm uses polynomial expansion to estimate the motion that occurred between two frames. For example, consider the exact quadratic polynomial:
f_1(x) = x^T A_1 x + b_1^T x + c_1
A new signal may be constructed that has been translated by displacement d:
f_2(x) = f_1(x - d) = x^T A_2 x + b_2^T x + c_2
Equating the coefficients of the two quadratic polynomials yields:
A_2 = A_1
b_2 = b_1 - 2 A_1 d
c_2 = d^T A_1 d - b_1^T d + c_1
The translation may then be extracted by the processor of the computing device 110 to show the movement of, for instance, a gas plume in footage of the camera 108.
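Since b_2 = b_1 - 2 A_1 d, the displacement follows as d = -1/2 A_1^{-1} (b_2 - b_1). For illustration, the coefficient identities and this recovery of d can be checked numerically; the matrix and vector values in the sketch below are arbitrary examples, not values produced by any embodiment.

```python
# Numerical check of the coefficient identities and recovery of the displacement d.
import numpy as np

A1 = np.array([[2.0, 0.5],
               [0.5, 1.0]])            # symmetric, as required by the model
b1 = np.array([1.0, -2.0])
c1 = 0.3
d  = np.array([0.7, -0.4])             # true displacement between the frames

f1 = lambda x: x @ A1 @ x + b1 @ x + c1
f2 = lambda x: f1(x - d)               # the translated signal

# Coefficients of f2 predicted by A2 = A1, b2 = b1 - 2 A1 d, c2 = d^T A1 d - b1^T d + c1.
A2 = A1
b2 = b1 - 2 * A1 @ d
c2 = d @ A1 @ d - b1 @ d + c1

x = np.array([0.9, 1.6])               # any test point
assert np.isclose(f2(x), x @ A2 @ x + b2 @ x + c2)

# The displacement is extracted from the coefficients: d = -1/2 * A1^{-1} (b2 - b1).
d_est = -0.5 * np.linalg.solve(A1, b2 - b1)
assert np.allclose(d_est, d)
```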
There may be a dominant movement direction and velocity at which most of the scene appears to move due to motion of the camera 108. For instance, there is the apparent bulk movement of structures 120a, 120b, 120c and background object 122 to the left between the first image 119 and the second image 121, while the isolated gas plume 106 appears to move up and to the right. Using the Gunnar Farneback algorithm, the amount of movement may be quantified as the displacement d of a pixel from the position x in the first image 119 to the position x + d in the second image 121, and the velocity may be quantified as that displacement divided by the time between the capturing of image 119 and image 121. In one embodiment, pixels associated with the dominant movement may be given a value. In other embodiments, pixels associated with the dominant movement are given a color representation of the value.
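As a toy numeric illustration of this quantification, the sketch below builds a synthetic flow field, takes the median flow as the dominant displacement, and converts it to a velocity using an assumed frame interval; the image size, flow values, and frame rate are arbitrary assumptions.

```python
# Minimal sketch; all numbers are illustrative only.
import numpy as np

flow = np.zeros((240, 320, 2))               # per-pixel (dx, dy) in pixels/frame
flow[..., 0] = -4.0                          # most of the scene shifts left (camera pan)
flow[40:80, 200:240] = (2.0, -3.0)           # a small patch (the plume) moves up and right

# Dominant displacement d: the motion most of the scene shares due to camera movement.
dominant_d = np.median(flow.reshape(-1, 2), axis=0)      # approximately (-4, 0) pixels

dt = 1.0 / 30.0                                          # assumed time between the two captures (s)
dominant_velocity = dominant_d / dt                      # pixels per second

print(dominant_d, dominant_velocity)                     # (-4, 0) and (-120, 0)
```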
For example,
In one embodiment, the pixel values/colors of the dominant color representation 130 associated with motion of the camera 108 of
In one embodiment, even if the camera movement occurs in the same direction as the gas plume, the magnitude of the camera motion may differ from that of the gas plume motion, therefore causing a different “color” to represent the plume. As such, the gas plume will still be left on the image after the velocity subtraction. The colors left over on the image accentuate the movement of everything that does not move with the camera-induced bulk motion (or, if the camera is static, of everything that is not static). In one embodiment, the pixel value representation may be shown with arrows pointing in the direction of movement instead of colors.
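A minimal sketch of such an arrow-based representation is shown below, assuming OpenCV drawing primitives, an already-computed dense flow field, and an estimated dominant displacement; the grid spacing and residual threshold are illustrative assumptions.

```python
# Illustrative sketch of the arrow representation; spacing and thresholds are assumptions.
import cv2
import numpy as np

def draw_motion_arrows(image_bgr, flow, dominant_d, step=16, min_residual=1.0):
    """Draw one arrow per grid cell whose motion differs from the dominant
    (camera-induced) displacement, so only plume-like motion is annotated."""
    out = image_bgr.copy()
    h, w = flow.shape[:2]
    for y in range(step // 2, h, step):
        for x in range(step // 2, w, step):
            dx, dy = flow[y, x]
            # Skip pixels that move with the camera (dominant motion subtracted).
            if np.hypot(dx - dominant_d[0], dy - dominant_d[1]) < min_residual:
                continue
            cv2.arrowedLine(out, (x, y), (int(x + dx), int(y + dy)),
                            (0, 0, 255), 1, tipLength=0.3)
    return out
```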
With respect to
Information transferred via communications interface 514 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by communications interface 514, via a communication link 516 that carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular/mobile phone link, a radio frequency (RF) link, and/or other communication channels. Computer program instructions representing the block diagram and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations performed thereon to produce a computer implemented process.
Embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments. Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions. The computer program instructions when provided to a processor produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/operations specified in the flowchart and/or block diagram. Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic, implementing embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, concurrently, etc.
Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface 512. Such computer programs, when executed, enable the computer system to perform the features of the embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Such computer programs represent controllers of the computer system.
The server 630 may be coupled via the bus 602 to a display 612 for displaying information to a computer user. An input device 614, including alphanumeric and other keys, is coupled to the bus 602 for communicating information and command selections to the processor 604. Another type of user input device comprises a cursor control 616, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 604 and for controlling cursor movement on the display 612.
According to one embodiment, the functions are performed by the processor 604 executing one or more sequences of one or more instructions contained in the main memory 606. Such instructions may be read into the main memory 606 from another computer-readable medium, such as the storage device 610. Execution of the sequences of instructions contained in the main memory 606 causes the processor 604 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in the main memory 606. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the embodiments. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
The terms “computer program medium,” “computer usable medium,” “computer readable medium,” and “computer program product” are used generally to refer to media such as main memory, secondary memory, a removable storage drive, a hard disk installed in a hard disk drive, and signals. These computer program products are means for providing software to the computer system. The computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium. The computer readable medium, for example, may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. Such media are useful, for example, for transporting information, such as data and computer instructions, between computer systems. Furthermore, the computer readable medium may comprise computer readable information in a transitory state medium such as a network link and/or a network interface, including a wired network or a wireless network, that allows a computer to read such computer readable information. Computer programs (also called computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface. Such computer programs, when executed, enable the computer system to perform the features of the embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
Generally, the term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to the processor 604 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as the storage device 610. Volatile media includes dynamic memory, such as the main memory 606. Transmission media includes coaxial cables, copper wire, and fiber optics, including the wires that comprise the bus 602. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor 604 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the server 630 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to the bus 602 can receive the data carried in the infrared signal and place the data on the bus 602. The bus 602 carries the data to the main memory 606, from which the processor 604 retrieves and executes the instructions. The instructions received from the main memory 606 may optionally be stored on the storage device 610 either before or after execution by the processor 604.
The server 630 also includes a communication interface 618 coupled to the bus 602. The communication interface 618 provides a two-way data communication coupling to a network link 620 that is connected to the worldwide packet data communication network now commonly referred to as the Internet 628. The Internet 628 uses electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link 620 and through the communication interface 618, which carry the digital data to and from the server 630, are exemplary forms of carrier waves transporting the information.
In another embodiment of the server 630, the communication interface 618 is connected to a network 622 via a communication link 620. For example, the communication interface 618 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line, which can comprise part of the network link 620. As another example, the communication interface 618 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, the communication interface 618 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
The network link 620 typically provides data communication through one or more networks to other data devices. For example, the network link 620 may provide a connection through the local network 622 to a host computer 624 or to data equipment operated by an Internet Service Provider (ISP). The ISP in turn provides data communication services through the Internet 628. The local network 622 and the Internet 628 both use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link 620 and through the communication interface 618, which carry the digital data to and from the server 630, are exemplary forms of carrier waves transporting the information.
The server 630 can send/receive messages and data, including e-mail, program code, through the network, the network link 620 and the communication interface 618. Further, the communication interface 618 can comprise a USB/Tuner and the network link 620 may be an antenna or cable for connecting the server 630 to a cable provider, satellite provider or other terrestrial transmission system for receiving messages, data and program code from another source.
The example versions of the embodiments described herein may be implemented as logical operations in a distributed processing system such as the system 600 including the servers 630. The logical operations of the embodiments may be implemented as a sequence of steps executing in the server 630, and as interconnected machine modules within the system 600. The implementation is a matter of choice and can depend on the performance of the system 600 implementing the embodiments. As such, the logical operations constituting said example versions of the embodiments are referred to, for example, as operations, steps, or modules.
Similar to a server 630 described above, a client device 601 can include a processor, memory, storage device, display, input device and communication interface (e.g., e-mail interface) for connecting the client device to the Internet 628, the ISP, or LAN 622, for communication with the servers 630.
The system 600 can further include computers (e.g., personal computers, computing nodes) 605 operating in the same manner as client devices 601, where a user can utilize one or more computers 605 to manage data in the server 630.
Referring now to
The one or more vehicles 2002, 2004, 2006, 2010 may include an unmanned aerial vehicle (UAV) 2002, an aerial vehicle 2004, a handheld device 2006, and a ground vehicle 2010. In some embodiments, the UAV 2002 may be a quadcopter or other device capable of hovering, making sharp turns, and the like. In other embodiments, the UAV 2002 may be a winged aerial vehicle capable of extended flight time between missions. The UAV 2002 may be autonomous or semi-autonomous in some embodiments. In other embodiments, the UAV 2002 may be manually controlled by a user. The aerial vehicle 2004 may be a manned vehicle in some embodiments. The handheld device 2006 may be any device having one or more trace gas sensors operated by a user 2008. In one embodiment, the handheld device 2006 may have an extension for keeping the one or more trace gas sensors at a distance from the user 2008. The ground vehicle 2010 may have wheels, tracks, and/or treads in one embodiment. In other embodiments, the ground vehicle 2010 may be a legged robot. In some embodiments, the ground vehicle 2010 may be used as a base station for one or more UAVs 2002. In some embodiments, one or more aerial devices, such as the UAV 2002, a balloon, or the like, may be tethered to the ground vehicle 2010. In some embodiments, one or more trace gas sensors may be located in one or more stationary monitoring devices 2026. The one or more stationary monitoring devices may be located proximate one or more potential gas sources 2020, 2022. In some embodiments, the one or more stationary monitoring devices may be relocated.
The one or more vehicles 2002, 2004, 2006, 2010 and/or stationary monitoring devices 2026 may transmit data including trace gas data to a ground control station (GCS) 2012. The GCS may include a display 2014 for displaying the trace gas concentrations to a GCS user 2016. The GCS user 2016 may be able to take corrective action if a gas leak 2024 is detected, such as by ordering a repair of the source 2020 of the trace gas leak. The GCS user 2016 may be able to control movement of the one or more vehicles 2002, 2004, 2006, 2010 in order to confirm a presence of a trace gas leak in some embodiments.
In some embodiments, the GCS 2012 may transmit data to a cloud server 2018. In some embodiments, the cloud server 2018 may perform additional processing on the data. In some embodiments, the cloud server 2018 may provide third party data to the GCS 2012, such as wind speed, temperature, pressure, weather data, or the like.
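As a purely hypothetical illustration of the kind of record a vehicle or stationary monitoring device might transmit to the GCS 2012, consider the sketch below; the field names and units are assumptions made for illustration and are not part of the disclosure.

```python
# Hypothetical trace gas report transmitted to the GCS; fields and units are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraceGasReport:
    device_id: str               # e.g., UAV 2002, ground vehicle 2010, or monitor 2026
    timestamp_utc: float         # seconds since epoch
    latitude: float
    longitude: float
    altitude_m: float
    gas: str                     # e.g., "CH4"
    concentration_ppm: float
    wind_speed_mps: Optional[float] = None   # may instead be supplied by the cloud server 2018
```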
It is contemplated that various combinations and/or sub-combinations of the specific features and aspects of the above embodiments may be made and still fall within the scope of the invention. Accordingly, it should be understood that various features and aspects of the disclosed embodiments may be combined with or substituted for one another in order to form varying modes of the disclosed invention. Further, it is intended that the scope of the present invention is herein disclosed by way of examples and should not be limited by the particular disclosed embodiments described above.
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 63/053,376, filed Jul. 17, 2020, the contents of which are hereby incorporated by reference herein for all purposes.