Detection of suspended particles, such as airborne particles, may be important due to the impact of such particles on a range of issues, from air pollution to disease transmission. Suspended particles may cause a variety of adverse effects due to their relatively high specific surface area. Airborne nanoparticles can easily spread over a large area for extended periods, can readily enter and move within organisms, and can interact with cells and subcellular components. Detection of suspended particles may be an important step in treating gases which contain suspended particles, or in evaluating systems or equipment designed to remove suspended particles. Detection and analysis of suspended particles may be important even when the airborne particles are present at extremely low concentrations.
In general, the disclosure is directed to systems and techniques for detecting and analyzing particles suspended within a gas, such as air. A colloidal suspension of particles, whether solid or liquid, in air may be called an aerosol. In some examples, the concentration of airborne particles or airborne particles of interest (e.g., biological particles) may be so low that proper sampling is difficult and/or time consuming. As described in more detail below, the disclosed systems and techniques may use one or more particle concentrators to concentrate (e.g., increase the particle count per unit volume) an aerosol for analysis. In some examples, the particle concentrator may be configured to preferentially separate particles of interest, thereby increasing their concentration. The particles of interest may, in some examples, be particles that fall within a certain particle size range. The particle concentrator may receive the raw or unconcentrated aerosol as a bulk gas and output a particle-rich stream of gas (i.e., a concentrated aerosol) to a particle detection and analysis unit. The particle detection and analysis unit may be separate from or integral with the particle concentrator. The particle detection and analysis unit may perform image processing to detect, analyze, quantify, and/or categorize suspended particles in the particle-rich stream of gas. Thus, by concentration of particles of interest within an aerosol, the disclosed systems and techniques may detect and characterize particles such as biological particles present in the aerosol at low (e.g., less than about 1,000 particles per liter (particles/L)) or extremely low (e.g., less than about 1 particle/L) concentration, which may be beyond the capability of other particle detection techniques. “About,” as used herein, may comprise the stated value and those values within a range of 10%, 20%, or 30% of the stated value.
Furthermore, by concentrating particles present in the particle-rich stream of gas at a known ratio relative to the aerosol, the disclosed systems and techniques may more accurately and/or quickly detect, count, and/or analyze particles within the aerosol, whether the system is sampling an aerosol contained within a duct or open to the ambient conditions in a building or outdoors. Additionally, because the disclosed systems may include a particle concentrator and particle detection unit packaged together or as discrete modules, the disclosed systems may be portable and/or lightweight enough to be deployed in more applications than conventional stationary particle detection and analysis systems.
The disclosed systems and techniques may employ a particle detection and analysis unit to categorize target particle types, such as bioaerosols including bacteria, viruses, fungi, or the like. The particle concentrator may advantageously be configured to preferentially separate particles within a certain particle size range into the particle-rich stream for analysis. For example, because many biological particles define a maximum dimension that is greater than about 1 micrometer, whether by themselves or when combined with other particles present in the air, the particle concentrator may be configured to preferentially separate (e.g., by an inertia-based mechanism) a majority of particles above a particle size cutoff point (e.g., of 1 micrometer) into the particle-rich stream for analysis.
In this way, a majority of the particles of interest may be concentrated into a particle-rich stream via a particle concentrator for detection and analysis by a particle detection and analysis unit. The particle detection and analysis unit may then capture an image of at least one particle in the particle-rich stream of gas when the particle-rich stream is routed into a detection chamber.
In some examples, the disclosed system may be configured to irradiate the at least one particle using a light source. An image sensor may detect images generated by elastically scattered light and by induced fluorescence when light from the light source contacts the particle or particles. The system may include processing circuitry configured to store image data from one or more image sensors in a detection video. The captured images of induced fluorescence in the detection video may be converted to quantitative information about one or more particles. The quantitative data may include one or more of a particle count, particle concentration, image size distribution, or wavelength distribution of induced fluorescence.
In some examples, the disclosure is directed to techniques for suspended particle detection. The technique may include receiving, with a particle concentrator, an aerosol comprising particles suspended within a bulk gas. The aerosol may have a first concentration indicative of a count of particles per unit volume of the bulk gas. The technique further includes concentrating, with the particle concentrator, the aerosol to generate a particle-rich stream of gas comprising at least one particle, the particle-rich stream of gas having a second concentration greater than the first concentration. The technique also includes irradiating the at least one particle in the particle-rich stream of gas with a light source of a certain wavelength in a detection chamber. The technique also includes capturing image data relating to the at least one particle with an image sensor located within the detection chamber.
In some examples, the disclosure is directed to a system which includes a particle concentrator and a particle detection and analysis unit. The particle detection unit may include at least one light source of a certain wavelength configured to irradiate at least one particle, at least one image sensor or camera configured to capture image data relating to the at least one particle, and one or more processors. The one or more processors may be configured to obtain a frame of grayscale image data comprising luminance values of image data captured by the image sensor or camera and analyze the image data in the frame to identify at least one particle captured in the frame. To analyze the image data, the one or more processors are configured to identify pixels having luminance values that satisfy a threshold, determine particle contours of the at least one particle based on the identified pixels, and generate at least one of quantitative or qualitative information for the at least one particle based at least partially on the analyzing of the image data.
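The threshold-and-contour analysis described above can be sketched as follows. This is a minimal illustration rather than the disclosed implementation: the function name, the fixed luminance threshold, and the 4-connected flood fill are assumptions, with each connected group of above-threshold pixels standing in for one particle contour.

```python
import numpy as np

def identify_particles(gray, threshold):
    """Identify pixels whose luminance satisfies a threshold and group
    them into connected regions, one region per candidate particle.
    Returns a list of pixel-coordinate arrays, one per particle."""
    mask = gray >= threshold
    labels = np.zeros(gray.shape, dtype=int)
    regions = []
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # pixel already assigned to a particle region
        stack, pixels = [seed], []
        labels[seed] = len(regions) + 1
        # flood fill (4-connected) outward from the seed pixel
        while stack:
            r, c = stack.pop()
            pixels.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < gray.shape[0] and 0 <= nc < gray.shape[1]
                        and mask[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = len(regions) + 1
                    stack.append((nr, nc))
        regions.append(np.array(pixels))
    return regions
```

A particle count then falls out as the number of regions, and a per-particle image size as each region's pixel count.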
The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages will be apparent from the description and drawings, and from the claims.
Detecting particles suspended in the air using optical detection techniques may be challenging compared to detecting particles suspended in liquids such as in water. Furthermore, in some examples, particles of particular interest in the gas may be difficult to detect at low or extremely low concentration. Such particles of interest may be biological particles such as viruses or the like. Example cases where such particles may be present include clean processes, although other scenarios where detection of particles at low (e.g., less than about 1,000 particles/L) or extremely low (e.g., less than about 1 particle/L) concentration are considered.
For example, in the natural environment, whether outdoors or in an enclosed environment, the fraction of particles of biological origin may be small because other natural or man-made particles, such as dust particles or the like, may be present in much larger numbers. Therefore, in some applications it may be necessary to concentrate or increase the concentration of the particles of interest by several orders of magnitude before the particles can be counted, measured, categorized, and/or analyzed. Systems and techniques according to the present disclosure combine a particle concentrator with a light-based particle detection and analysis unit which may detect the induced fluorescence or enhanced light scattering signals from the particles of interest. In this way, particles of interest may be quickly and accurately measured and analyzed relative to other particle detection systems and techniques. Furthermore, systems according to the present disclosure which include a particle concentrator and a particle sensing device, which may also be called a detection and analysis unit, may be relatively easy to make, additively manufactured, portable, low-power, and/or lightweight relative to other systems.
Systems according to the present disclosure may include a particle detection and analysis unit which senses and analyzes a particle in a particle-rich stream output by a particle concentrator. Although the aerosol may contain a low or extremely low concentration of particles of interest, the particle concentrator may increase the concentration of particles of interest and output the particle-rich stream of gas for analysis by the particle detection and analysis unit. In some examples, the particle concentrator may be a concentrating virtual impactor (CVI) device that performs an inertia-based preferential particle separation into a particle-rich stream of gas and a particle-lean stream of gas.
With respect to the detection and analysis of the particle-rich gas stream, a light-based image processing technique may be employed. Particles in a suspending media can be detected by measuring fluctuations in the intensity of light scattered from moving particles, as in dynamic light scattering (DLS) measurement. This is because when particles move randomly in Brownian motion (motion caused by diffusion only), the diffusivity of suspended particles can be deduced from the autocorrelation function describing the fluctuation signals. For particles suspended in a liquid, it may be easy to maintain the motion of particles as Brownian motion, especially when the liquid is confined in a small container or in a stationary droplet. For particles suspended in the air, the detection is still challenging. It may not be practical in some instances to confine air samples in small spaces or small containers or to control the motion of the airborne particles so that the motion is caused only by their diffusion. Since airborne nanoparticles are more mobile and more prone to uncontrolled non-Brownian motion than nanoparticles suspended in liquids, techniques that can successfully detect nanoparticles in liquids, such as DLS or advanced optic microscopes, are rarely used for detecting or analyzing airborne nanoparticles.
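The DLS principle mentioned above, deducing diffusivity from intensity fluctuations, rests on the autocorrelation of the scattered-intensity signal. The following is a minimal sketch of that computation; the function name and the normalization convention (the intensity autocorrelation g2 normalized by the squared mean intensity) are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def intensity_autocorrelation(intensity, max_lag):
    """Normalized intensity autocorrelation g2(tau) = <I(t) I(t+tau)> / <I>^2.
    For purely Brownian particles, the decay rate of g2 with lag time is
    what a DLS fit converts into a diffusivity (and thence a size via the
    Stokes-Einstein relation)."""
    intensity = np.asarray(intensity, dtype=float)
    mean_sq = intensity.mean() ** 2
    return np.array([
        np.mean(intensity[: len(intensity) - lag] * intensity[lag:]) / mean_sq
        for lag in range(1, max_lag + 1)
    ])
```

For a perfectly steady signal the correlation stays at 1 for every lag; fluctuations from particle motion make it decay toward 1 from above.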
Systems and techniques according to the present disclosure may be suitable for particle detection of particles suspended in air or another gas. For instance, techniques described in this disclosure may successfully detect and analyze airborne particles. Particles may be irradiated with a light source in a detection chamber, and an imager (e.g., a color image sensor or camera) may capture image data indicative of the detection chamber at a particular point in time. The image data may be image processed (e.g., in real-time or at a later time) to capture quantitative data and/or qualitative data about at least one particle within the detection chamber. For example, quantitative data may include one or more of a particle count, particle concentration, or particle size. The particle-rich stream of gas may then be output by the particle detection and analysis unit.
Furthermore, the disclosed systems and techniques may be used to detect bioparticles suspended in a gas. Bioparticles (“bioaerosols,” when suspended in air) may be particles that include biological material. Bioaerosols may be detected by the disclosed systems and techniques because irradiation of suspended particles with light of a certain known wavelength may induce fluorescence in some types of particles and not induce fluorescence in other types of particles. For example, light of some wavelengths may induce fluorescence in bioparticles and not induce fluorescence in abiotic particles, which do not include biological material.
The disclosed system may include a light source configured to emit light at wavelengths which induce fluorescence in bioparticles and not induce fluorescence in abiotic particles. The imager may be configured to detect the induced fluorescence by filtering at least a portion of the sensed image data so that only induced fluorescence is detected. In some examples, a single imager may be used, and a portion of the captured image data may be filtered to capture the induced fluorescence of at least one particle. Alternatively, in some examples, a second imager may be included, and one imager may be configured to capture elastic scattered light scattered by the particle, where particles scatter light according to their size as demonstrated by the principles of Rayleigh scattering. In such examples, the second imager may include a filter configured to capture only induced fluorescence of the particle or particles in the detection chamber. The dominant color hue of the induced fluorescence may be used to calculate a dominant wavelength of the particle. Since the wavelength (e.g., the dominant wavelength) of certain particles is known, this wavelength may be used to categorize the detected particle or particles into, for example, bioparticles and abiotic particles, or between different categories of bioparticles. The emitted wavelength of a particle in the detection chamber may be compared to a database of known particles, and a match may allow for a particular particle species to be recognized.
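The hue-to-wavelength comparison described above might be sketched as follows. The linear hue-to-wavelength mapping, the matching tolerance, and the example database entries are all illustrative assumptions, not values taken from the disclosure.

```python
def hue_to_wavelength(hue_deg):
    """Map a dominant hue (degrees, HSV convention) to an approximate
    visible wavelength in nm. Assumes a simple linear mapping from
    hue 0 (red, ~620 nm) to hue 270 (violet, ~380 nm); real sensors
    would need a calibrated mapping."""
    hue_deg = max(0.0, min(270.0, hue_deg))
    return 620.0 - (620.0 - 380.0) * hue_deg / 270.0

def classify_particle(wavelength_nm, known, tol_nm=15.0):
    """Compare a measured dominant wavelength against a database of
    known particle emission wavelengths (name -> nm) and return the
    closest match within tolerance, else 'unclassified'."""
    best = min(known, key=lambda name: abs(known[name] - wavelength_nm))
    if abs(known[best] - wavelength_nm) <= tol_nm:
        return best
    return "unclassified"
```

A measured emission near a stored bioparticle wavelength would then be reported as that category, while emissions far from every database entry remain unclassified.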
Aerosol 103 may define a first concentration of particles. Individual particles are indicated by “x” in
Particle concentrator 101 may be a concentrating virtual impactor (CVI) device that performs an inertia-based preferential particle separation of aerosol 103 and outputs particle-rich stream of gas 105 and particle-lean stream of gas 107. Particle-rich stream of gas 105 may also be called the minor flow stream because particle-rich stream of gas 105 may have a lower volumetric flow rate than particle-lean stream of gas 107, which may be called the major flow stream. As illustrated, particle concentrator 101 may preferentially separate particles (or, a portion of the particles within a certain size range or above a particle cut-off point) such that particle-rich minor flow stream 105 contains a majority of the particles larger than the cutoff size in aerosol 103 and only a small portion of the bulk gas. Accordingly, particle-lean major flow stream 107 may contain a majority of the bulk gas in aerosol 103, but only a minority of the particles that are larger than the particle size cutoff point that were originally in aerosol 103.
In some examples, particle concentrator 101 is configured to concentrate aerosol 103 by preferentially separating particles, according to their inertia, into one of particle-lean stream 107 or particle-rich stream of gas 105. In some examples, inertial separation may include causing a majority of particles in aerosol 103 which have a maximum dimension that is above a particle size cut point in the aerosol to enter the particle-rich minor stream of gas 105, and causing a majority of particles which have a maximum dimension that is below the particle size cut point in aerosol 103 to enter the particle-lean major stream of gas 107. Furthermore, although not illustrated in
In some examples, system 100 may include one or more optional blowers 113 as a separate component or integral with particle concentrator 101. In some examples, blower 113 may be a fan configured to cause particle concentrator 101 to receive aerosol 103 and output particle-rich stream 105 from a first outlet and particle-lean stream 107 from a second outlet. In some examples, blower 113 according to the present disclosure may advantageously be run without needing a high-power supply. For example, blower 113 may be provided with power from about 10 watts to about 300 watts, such as less than about 100 watts. The power supplied to blower 113, along with other design considerations, may impact the way particle concentrator 101 preferentially separates particles by inertia. As such, the power supply may be selectively tailored to drive the blower with a relatively low pressure drop (e.g., about 2.5 kPa) across particle concentrator 101. Furthermore, the smaller size of the system 100 may also allow for inclusion of system 100 in applications where compact, portable, and/or lightweight systems are required, such as within a vehicle, boat, or aircraft, etc.
In some examples, a ratio of a volumetric flow rate of the particle-lean stream of gas to a volumetric flow rate of the particle-rich stream of gas is in a range of from about 10:1 to about 1000:1. In some examples, the concentration ratio between particle-rich stream of gas 105 and aerosol 103 is in a range of from about 10 to about 1000. Put differently, particle-rich stream 105 may contain at least about 10 times more particles per unit volume than aerosol 103. As such, particle-rich stream of gas 105 may be output into detection chamber 102 of particle detection and analysis unit 111 at a flow rate and concentration where detection and analysis of the particles may be performed. In systems that do not include particle concentrator 101, detection and analysis of particles in aerosol 103 may be difficult or impossible because the low or extremely low concentration of the particles would cause the analysis to be time-inefficient. Furthermore, since particles of interest are preferentially sorted into particle-rich stream 105 by particle concentrator 101, image capture and image processing of these particles may be performed with reduced interference from undesired particles such as dust particles which are below a particle size cut point.
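The relationship between the flow split and the enrichment described above can be illustrated with a short calculation. In an idealized virtual impactor, particles above the cut size follow the minor flow while the bulk gas splits between the two outlets, so the concentration ratio is approximately the collection efficiency multiplied by the ratio of total flow to minor flow. The function name and the ideal-behavior assumption are illustrative, not a disclosed model.

```python
def concentration_ratio(q_major, q_minor, efficiency=1.0):
    """Approximate enrichment of the minor (particle-rich) flow of an
    idealized virtual impactor. Assumes a fraction `efficiency` of the
    particles above the cut size is carried into the minor flow while
    the gas splits q_major : q_minor between the outlets."""
    q_total = q_major + q_minor
    return efficiency * q_total / q_minor
```

For example, a 99:1 major-to-minor split with ideal collection yields roughly a 100-fold increase in particle count per unit volume, consistent with the about 10 to about 1000 range stated above.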
Particle-rich flow stream 105 may be output into detection chamber 102 of particle detection and analysis unit 111. In some examples, as illustrated, particle concentrator 101 may be a separate component fluidically connected to detection chamber 102 by a pipe or tube. Detection chamber 102 may be a chamber configured to receive a particle-rich minor stream of gas 105 (e.g., air) containing a majority of suspended particles larger than a cutoff size of particle concentrator 101 from aerosol 103. The particles may be excited and/or irradiated by light source 114 in detection chamber 102. Image detection by imager 110 may be performed before outputting the stream of gas 105 into the surroundings. As such, detection chamber 102 may include one or more inlets 104 and one or more outlets 106. Blower 113 may be configured to input energy into one or more of aerosol 103, particle-rich stream of gas 105, and/or particle-lean stream of fluid 107 to move aerosol 103 through system 100. In the illustrated example, blower 113 pulls both particle-lean stream of gas 107 through particle concentrator 101 and particle-rich stream of gas 105 through particle concentrator 101 and detection chamber 102. In other examples, blower 113 may include a first blower to draw particle-lean stream of fluid 107 through particle concentrator 101 and a second small blower or suction device may draw particle-rich stream of gas 105 into detection chamber 102 and out through outlet 106. In some examples, inlet 104, outlet 106, and/or particle concentrator 101 may include one or more valves 119 or other flow regulation mechanisms.
Light source 114 may be configured to emit a beam of light 116 into detection chamber 102, and imager 110 may be configured to capture image data within detection chamber 102. In some examples, detection chamber 102 may be configured to control light within detection chamber 102, such as by allowing light source 114 to irradiate particles and blocking out other light. Therefore, detection chamber 102 may include walls or a lining which create a dark background by completely or nearly completely occluding ambient light from outside detection chamber 102, for example by reducing or eliminating light from entering detection chamber 102.
Workstation 115 may include, for example, an off-the-shelf device, such as a laptop computer, desktop computer, tablet computer, smart phone, or other similar device. In some examples, workstation 115 may be a specific purpose device. Workstation 115 may be configured to control blower 113 and/or any associated valves, imager 110, light source 114, or any other accessories and peripheral devices relating to, or forming part of, system 100.
Computing device 120 may include, for example, an off-the-shelf device such as a laptop computer, desktop computer, tablet computer, smart phone, or other similar device or may include a specific purpose device. In some examples, computing device 120 may control blower 113 and/or any associated valves, imager 110, light source 114, or any other accessories and peripheral devices relating to, or forming part of, system 100 and may interact extensively with workstation 115. Workstation 115 may be communicatively coupled to computing device 120, enabling workstation 115 to control the operation of imager 110 and receive the output of imager 110.
Graphical user interface (GUI) 130 may be configured to output instructions, images, and messages relating to at least one of a performance, position, viewing angle, image data, or the like from imager 110, light source 114, and/or pump 108. GUI 130 may include display 132. Display 132 may be configured to display outputs from any of the components of system 100, such as computing device 120. Further, GUI 130 may be configured to output information regarding imager 110, e.g., model number, type, size, etc. on display 132. Further, GUI 130 may be configured to output sample information regarding sampling time, location, volume, flow rate, concentration ratio, volumetric flow rate of minor stream 105 and/or major stream 107, or the like. GUI 130 may be configured to present options to a user that include step-by-step, on screen instructions for one or more operations of system 100. For example, GUI 130 may present an option to a user to select a file of sensed image data from imager 110 at a particular point in time or a duration in time as video image data. GUI 130 may allow a user to click rather than type to select, for example, an image data file from imager 110 for analysis, a technique selection for system 100, a mode of operation of system 100, various settings of operation of system 100 (e.g., an intensity or wavelength of light from light source 114, a zoom, angle, or frame rate of imager 110, a power supplied to blower 113, or the like), a plot or other presentation of quantitative information relating to at least one particle in detection chamber 102, or the like. As such, GUI 130 may offer a user zoom in and zoom out functions, individual particle images with size and/or wavelength distribution, imager sensor setup and preview in a large pop-up, on-board sensor and analysis control, pause and continue functions, restart and reselect functions, or the like.
Light source 114 is configured to generate beam 116 of light into detection chamber 102 to irradiate at least one particle within detection chamber 102 at a certain wavelength or wavelengths. In some examples, beam 116 may be collimated and/or focused by a lens system, and configured to beam across detection chamber 102 to a light trap 118. Light trap 118 may trap or stop beam 116 from reflecting back into detection chamber 102. In some examples, the light may be generated at the certain target wavelength. Alternatively, in some examples, light at a variety of wavelengths may be generated by light source 114, and light source 114 may include one or more filters, such as short-pass or long-pass filters configured to occlude light at certain wavelengths and prevent the occluded wavelengths from being beamed into detection chamber 102. Light source 114 may include a laser, LED, or another light generating device. Light source 114 may generate and/or employ a filter system such that beam 116 includes wavelengths less than 450 nanometers (nm), for example from about 250 nm to about 450 nm, or from about 250 nm to about 350 nm. Light at these wavelengths may induce fluorescence in target particles (e.g., bioaerosols) while not inducing, or only minimally inducing, fluorescence in other types of particles (e.g., abiotic particles). Light source 114 may be external, that is, located remotely from imager 110. In some examples, system 100 may include multiple light sources, which may use the same or different light generating techniques, and may generate one or more than one beam 116 at the same wavelength(s) or different wavelength(s).
Light source 114 may include a lens system configured to generate beam 116 as a collimated beam. A collimated beam may have light rays that are substantially parallel. In this way, beam 116 may focus on a particular region within detection chamber 102, such as a portion of detection chamber 102 where the gas stream containing suspended particles are configured to pass.
Imager 110 is configured to capture image data indicative of at least one particle in a region of interest in detection chamber 102. For example, imager 110 may include a lens system which makes imager 110 focused on a region of detection chamber 102 within beam 116 of light source 114. Imager 110 may be a single image sensor or camera, as illustrated, which may be configured to capture image data as elastic light scattering data, induced fluorescence data, or both. In some examples, one or more filters (e.g., short pass filters) may be included which may reduce or eliminate light of certain selectable wavelengths from reaching an array of image sensors within imager 110 such that imager 110 captures only induced fluorescence from at least one particle suspended within detection chamber 102.
In some examples, as discussed elsewhere, imager 110 may include more than one imager, such as a camera for sensing induced fluorescence (e.g., by filtering) and a camera for sensing elastic light scattering. Imager 110 may be configured to capture image data as a picture or frame (i.e., image data sensed at a particular point in time) or as video data. In some examples, a frame may refer to an overall matrix of image data captured by imager 110. The overall matrix may be made up of individual pixels, or multiple matrices made up of individual pixels (e.g., three image data matrices including a red matrix, a green matrix, and a blue matrix). Video data, as used herein, comprises a series of frames over a duration in time. In some examples, the video data may be a series of frames over a duration in time, and each respective frame in the series of frames may be separated in time from the adjacent frames by the same length of time.
Imager 110 may be a color image sensor or camera. Accordingly, imager 110 may include color sensors, which may be located in a sensor array. The color image sensor may be configured to detect colors in addition to black and white and capture the detected colors in one or more data matrices made up of individual pixels. Accordingly, in some examples, imager 110 may include red, green, and blue sensors, and may assign a value for red, green, and blue respectively for each pixel, creating a red matrix, a green matrix, and a blue matrix. Imager 110 or associated processing circuitry may also create an overall image data matrix. The overall image data matrix may be a sum of the red, green, and blue matrices, and/or may be the average of the red, green, and blue matrices. Imager 110 may be configured to sense, capture, store, and/or transmit image data in a data matrix as any or all of the red matrix, green matrix, blue matrix, or overall data matrix.
Each respective matrix may include a luminance value for each pixel in the data matrix. For example, the overall data matrix may include an overall luminance value for each pixel in the overall matrix, which may be based on scaling the values in red, green, and blue matrices. As one example, the overall image data matrix may include a luma for each individual pixel, which may be a weighted sum of gamma-compressed value from each of the red image data matrix, the green image data matrix, and the blue image data matrix. In some examples, the luminance value for each pixel may be based on conversion of the overall matrix to a grayscale image that includes luminance values. The techniques described in this disclosure should not be considered limited to ways in which to determine luminance values.
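The luma computation described above, a weighted sum of gamma-compressed red, green, and blue values per pixel, can be sketched using the common Rec. 601 weights. The use of these particular weights here is an assumption for illustration; as the passage notes, the disclosure is not limited to any one way of determining luminance values.

```python
import numpy as np

def luma_frame(rgb):
    """Per-pixel luma of an H x W x 3 array of gamma-compressed R, G, B
    values, using the Rec. 601 weights 0.299, 0.587, 0.114. Returns an
    H x W grayscale matrix of luminance values."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 0.299 * r + 0.587 * g + 0.114 * b
```

A pure white pixel (255, 255, 255) maps to luma 255, and the green channel dominates because human brightness perception is most sensitive to green.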
In some examples, each of the red, green, blue, and overall data matrices may include a rectangular array of pixels, such as a 1980×1080 data matrix. Processing circuitry within imager 110 or another component of system 100, such as computing device 120, may be configured to break up the overall data matrix (e.g., 1980×1080 pixels, or another matrix size) into a grid of smaller data matrices (e.g., 100×100 pixels, or another matrix size). A grid of smaller data matrices may be considered as a subset of pixels (e.g., 100×100 pixels is a subset of the 1980×1080 pixels). As described in more detail, sweeping processing across subsets of pixels may allow for efficient utilization of processing capabilities, as compared to processing the overall data matrix, while ensuring that particles are properly identified in respective subsets. However, the example techniques are not so limited, and processing of the overall data matrix is also possible, as described below.
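Sweeping processing across subsets of the overall data matrix might be sketched as a simple tiling generator. The function name and the handling of partial tiles at the frame edges are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np

def sweep_subsets(frame, tile=100):
    """Yield (row0, col0, sub) for each tile of the frame, so downstream
    processing can sweep over smaller subsets of pixels rather than the
    overall data matrix at once. Tiles at the right and bottom edges may
    be smaller than `tile` x `tile`."""
    rows, cols = frame.shape[:2]
    for r0 in range(0, rows, tile):
        for c0 in range(0, cols, tile):
            yield r0, c0, frame[r0:r0 + tile, c0:c0 + tile]
```

The returned offsets let per-tile particle coordinates be mapped back into the coordinate system of the overall frame.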
Computing device 120 may be communicatively coupled to imager 110, GUI 130, light source 114, and/or server 140, for example, by wired, optical, or wireless communications. Server 140 may be a server which may or may not be located in a particle detection laboratory, a cloud-based server, or the like. Server 140 may be configured to store image data as video data, still frame data at a particular point in time, particle information, calibration information, or the like.
In operation, aerosol 103 may be drawn or directed into nozzle 121 and receiving tube 123 using one or more blowers as described above. The volumetric flow rate of aerosol 103 may be selectively split to impact the way particles are preferentially separated into particle-rich minor flow stream 105. In some examples, blower 113 (
As illustrated in
In some examples, computing device 200 may be configured to perform image processing, control and other functions associated with workstation 115, imager 110, light source 114, blower 113, or other function of system 100 of
While processing circuitry 204 appears in computing device 200 in
Memory 202 of computing device 200 includes any non-transitory computer-readable storage media for storing data or software that is executable by processing circuitry 204 and that controls the operation of computing device 120, workstation 115, imager 110, or server 140, as applicable. In one or more examples, memory 202 may include one or more solid-state storage devices such as flash memory chips. In one or more examples, memory 202 may include one or more mass storage devices connected to the processing circuitry 204 through a mass storage controller (not shown) and a communications bus (not shown).
Although the description of computer-readable media herein refers to a solid-state storage, it should be appreciated by those skilled in the art that computer-readable storage media may be any available media that may be accessed by the processing circuitry 204. That is, computer readable storage media includes non-transitory, volatile, and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. For example, computer-readable storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store the desired information and that may be accessed by computing device 200. In one or more examples, computer-readable storage media may be stored in the cloud or remote storage and accessed using any suitable technique or techniques through at least one of a wired or wireless connection.
Memory 202 may store one or more applications 216. Applications 216 may include a gain adjuster 222, a particle contour broadener 224, a color manipulator 218, and/or other computer vision model(s) or machine learning module(s), such as a model to determine particle contours in sensed image data, broaden particle contours to determine broadened particle contours, determine a particle boundary based on the broadened particle contours, or the like. Applications 216 stored in memory 202 may be configured to be executed by processing circuitry 204 to carry out operations on imaging data 214 of at least one particle within detection chamber 102 (
Memory 202 may store imaging data 214 and excitation data 228. Imaging data 214 may be captured by one or more sensors within or separate from imager 110 (
Processing circuitry 204 is configured to generate at least one of quantitative or qualitative information for the at least one particle within detection chamber 102. The quantitative data may include one or more of a particle count, a particle size, and/or a particle concentration, and/or how these or other quantitative data change over time (e.g., from frame to frame in a video file). Example qualitative data may include one or more of a particle category (e.g., bioparticle or abiotic particle) or particle species (e.g., specific bioparticle), a particle image of a particular particle, or the like. Qualitative data may be generated by comparing imaging data 214 to stored particle data 226 and particle classifications 203. Stored particle data 226 may include calibration data of known particle size, count, concentration, category, species, or the like. Processing circuitry 204 may register imaging data 214 and/or excitation data 228 using timestamps (which may be placed in the data by, for example, imager 110, computing device 120, or workstation 115). Processing circuitry 204 may output for display by display 206, e.g., to GUI 130 of
In some examples, processing circuitry 204 may perform an analysis technique on stored imaging data 214, which may be called analysis mode operation. Processing circuitry 204 may be configured to output for display on GUI 130 of
In some examples, processing circuitry 204 may perform a real-time particle detection and analysis technique. Processing circuitry 204 may receive image data directly from imager 110, or from imaging data 214 stored in memory 202, and, in substantially real time, capture a first frame of the sensed image data representing data sensed at a first time. Substantially real-time, as used herein, may mean that the image data is captured and analyzed without stopping the imager 110, that is, during the sampling operation. Processing circuitry 204 is configured to analyze image data in the frame to identify at least one particle, convert image data within the frame to quantitative information about the at least one particle within the frame at the first time, and capture a second frame of the sensed image data representing data sensed at a second time.
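The substantially real-time mode may be sketched as a per-frame loop (a minimal illustration; `frames` stands in for a live feed from imager 110 and `count_in_frame` is a hypothetical placeholder for the per-frame analysis step):

```python
def analyze_stream(frames, count_in_frame):
    """Analyze timestamped frames one at a time without stopping acquisition."""
    results = []
    for timestamp, frame in frames:
        # Convert each frame's image data to quantitative information
        # (here, simply a particle count) as the frame arrives.
        results.append((timestamp, count_in_frame(frame)))
    return results

# Two simulated frames, each a list of per-pixel detections.
history = analyze_stream([(0.0, [1, 0, 1]), (0.5, [1, 1, 1])], sum)
```

The per-frame results may then be registered by timestamp, as described above.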
Processing circuitry 204 may be configured to execute color manipulator 218 to generate grayscale image data from color image data sensed by imager 110. Alternatively, processing circuitry 204 may facilitate receipt of grayscale image data. Regardless, grayscale image data may be obtained by processing circuitry 204 for analysis. The grayscale image data may be the overall image data matrix, which may be created by scaling of each of the red, green, and blue matrices. The resulting grayscale image data may include a luminance value for each pixel in an image data matrix, as described above.
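One way to form the overall grayscale matrix by scaling the red, green, and blue matrices may be sketched as follows (an illustrative assumption: equal weights, i.e., a simple average; a luma-style weighting would work similarly):

```python
import numpy as np

def to_grayscale(red, green, blue, weights=(1 / 3, 1 / 3, 1 / 3)):
    """Scale and sum the color matrices into one luminance matrix."""
    wr, wg, wb = weights
    return wr * red + wg * green + wb * blue

r = np.full((4, 4), 30.0)
g = np.full((4, 4), 60.0)
b = np.full((4, 4), 90.0)
gray = to_grayscale(r, g, b)   # each pixel gets a single luminance value
```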
Processing circuitry 204 may be configured to determine particle contours of at least one particle in detection chamber 102 in the sensed image data based on the luminance values of the grayscale image, or of other image data. For example, the luminance value of a particular pixel may be relatively high, indicating the presence of an irradiated particle in the location of the pixel in the grayscale image. Particle contours, as described herein, may be a particle boundary, but due to the small size and irregular shape of some particles, particle contours may in some examples only represent a feature (e.g., a spike) on a particle. In some examples, particle contours may be light spots (e.g., pixels with relatively higher luminance values) that satisfy a threshold. One example of the threshold is an average of a subset of pixels, and pixels within that subset that are greater than the threshold are part of the particle contours.
That is, processing circuitry 204 may determine that, when a particular pixel satisfies a threshold, the pixel is part of the particle contours of a particle. Adjacent pixels that all satisfy the threshold may be grouped together as a group of pixels that form an island (or a “spot”) of particle contours. In some examples, processing circuitry 204 may be configured to identify pixels having luminance values that satisfy the threshold by determining local thresholds within respective subsets of pixels (e.g., each respective small matrix in a grid of small matrices making up the overall matrix). Processing circuitry 204 may be configured to compare luminance values of pixels within each respective subset of pixels to the respective local threshold for that subset of pixels. Then, processing circuitry 204 may be configured to sweep through the subsets of pixels to identify the pixels based on the comparison, and determine particle contours by grouping the identified pixels of each of the respective subsets of pixels together as an island of particle contours. In other words, in some examples, the threshold may be assigned as the average value of a small matrix (e.g., a subset of the overall number of pixels, such as a 100×100 matrix of pixels) in which the particular pixel resides, and each individual pixel above the average of the small matrix in which it resides may be assigned as belonging to an island of particle contours.
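The local-threshold step may be sketched as follows (a minimal illustration, assuming a NumPy grayscale matrix; the tile size and the strict greater-than comparison are assumptions):

```python
import numpy as np

def local_threshold_mask(gray, tile=100):
    """Flag pixels whose luminance exceeds the average of their small matrix."""
    mask = np.zeros(gray.shape, dtype=bool)
    rows, cols = gray.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            sub = gray[r:r + tile, c:c + tile]
            # The local threshold is the average of this subset of pixels.
            mask[r:r + tile, c:c + tile] = sub > sub.mean()
    return mask

gray = np.zeros((4, 4))
gray[1, 1] = 10.0                          # one bright (irradiated) pixel
mask = local_threshold_mask(gray, tile=4)  # only the bright pixel is flagged
```

Adjacent flagged pixels may then be grouped into islands of particle contours, for example with a connected-component labeling routine.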
In some examples, the threshold may be assigned as the average luminance value of the entire image data matrix (e.g., a 1980×1080 matrix of pixels), and each individual pixel with a luminance value above the average may be assigned to a group of proximate pixels forming an island of particle contours. In some examples, the threshold may be set by a fitting function. In some examples, the fitting function may use both the small matrix in which the particle resides and the overall matrix to determine whether an individual pixel is part of the particle contours. In some examples, processing circuitry 204 may execute a fitting function to identify particular pixels within the small matrix as being part of an island of particle contours. In some examples, the fitting function may be a Gaussian function, an adaptive mean threshold, an adaptive Gaussian function, combinations thereof, or another fitting function.
In some examples, processing circuitry 204 may be configured to determine particle contours in other ways. For example, processing circuitry 204 may scan the grayscale image to find a local peak. The local peak may be found when processing circuitry 204 determines that a difference value indicative of a difference between luminance values of proximate pixels satisfies a threshold and, based on the difference value satisfying the threshold, determines that one of the pixels (e.g., the pixel with the higher luminance value) is part of the particle contours for the at least one particle. In some examples, processing circuitry 204 may scan surrounding pixels for other local peaks. In some examples, processing circuitry 204 may determine that all local peaks within a certain number of pixels from each other are part of the same island of particle contours. For example, where a local peak is found within 1, 2, 3, 4, 5, or other number of pixels of another local peak, processing circuitry 204 may connect the local peaks as part of the same particle contours.
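The peak-connecting step may be sketched as follows (an illustrative, greedy single-pass grouping; the Chebyshev distance metric and the `max_gap` of 3 pixels are assumptions, and a production routine might additionally merge islands transitively):

```python
def group_peaks(peaks, max_gap=3):
    """Connect local peaks within max_gap pixels into the same island."""
    islands = []
    for p in peaks:
        for island in islands:
            # Chebyshev distance: number of pixel steps in any direction.
            if any(max(abs(p[0] - q[0]), abs(p[1] - q[1])) <= max_gap
                   for q in island):
                island.append(p)
                break
        else:
            islands.append([p])   # peak starts a new island
    return islands

# (5, 5) and (6, 7) are within 3 pixels of each other; (40, 40) is not.
groups = group_peaks([(5, 5), (6, 7), (40, 40)])
```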
In some examples, before executing the algorithm or function configured to determine particle contours, processing circuitry 204 may be configured to reduce or eliminate macroscale differences in luminance values due to imager 110, light source 114, and/or detection chambers by executing gain adjuster 222. In some examples, gain adjuster 222 may adjust (e.g., change) the average luminance value of each individual pixel within a small matrix within the grid of small matrices. In this way, the overall image data matrix may be normalized to account for trends in average luminance values on a macro level, such that each small matrix may have the same or a similar average luminance value relative to the rest of the small matrices within the grid.
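The gain adjustment may be sketched as follows (a minimal illustration; the additive offset is an assumption, and a multiplicative gain per small matrix would work similarly):

```python
import numpy as np

def equalize_tile_means(gray, tile=100):
    """Shift each small matrix so its average matches the overall average."""
    out = gray.astype(float)
    target = out.mean()                     # overall average luminance
    rows, cols = out.shape
    for r in range(0, rows, tile):
        for c in range(0, cols, tile):
            sub = out[r:r + tile, c:c + tile]
            sub += target - sub.mean()      # flatten macroscale trends
    return out

adj = equalize_tile_means(np.array([[0.0, 0.0], [10.0, 10.0]]), tile=1)
```

After adjustment, every small matrix shares a similar average luminance, so a single thresholding strategy behaves consistently across the frame.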
It may be possible that counting each island of particle contours may result in overcounting and/or under-sizing particles, because two or more spikes or other topographical features on the same particle may show up as individual islands of particle contours in the luminance values of the image data. That is, two individual islands may be for the same particle, but appear to be for different particles, and therefore, two particles are counted for one particle. Processing circuitry 204 may be configured to execute one or more applications configured to address such possible overcounting. For example, processing circuitry 204 may broaden the determined particle contours and may determine a particle boundary based on the broadened particle contours. For example, applications 216 may include particle contour broadener 224, which may store instructions for processing circuitry 204 to execute such an operation.
Processing circuitry 204 may execute the particle contour broadener 224 application, which may be housed within memory 202 of computing device 200. Particle contour broadener 224 may be configured to adjust (e.g., change by increasing or decreasing) the luminance value for individual pixels within the overall image data matrix (e.g., 1980×1080 pixels). Particle contour broadener 224 may be configured to adjust (e.g., increase or decrease) the luminance values of the image data to assist in determining a particle boundary from sensed particle contours. For example, particle contour broadener 224 may be configured to group several small islands of particle contours together as one particle by defining a particle boundary around all of the islands. For example, particle contour broadener 224 may be configured to broaden the particle contours by assigning additional pixel points around an identified spot or island the same luminance value as a neighboring pixel, such that particle contour broadener 224 may connect small spots very close to each other into one big spot, avoiding over-counting one big particle as many small particles.
In some examples, processing circuitry 204 may determine broadened particle contours by determining that the identified pixels include a first pixel and a second pixel that are separated by a distance. Processing circuitry 204 may be configured to assign one or more pixels, within the distance, proximate to the first pixel and the second pixel approximately the same luminance value as the nearest pixel within the identified pixels to create a broadened cluster of pixels that includes the first pixel and the second pixel, and determine the particle contours based on the cluster of pixels.
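The broadening step may be sketched as a dilation over a boolean contour mask (a minimal illustration; using a boolean mask instead of raw luminance values, and a square neighborhood of `radius` pixels, are assumptions):

```python
import numpy as np

def broaden(mask, radius=2):
    """Mark every pixel within `radius` of a contour pixel as contour."""
    out = mask.copy()
    ys, xs = np.nonzero(mask)
    for y, x in zip(ys, xs):
        # Stretch each island by assigning neighboring pixels to it.
        out[max(0, y - radius):y + radius + 1,
            max(0, x - radius):x + radius + 1] = True
    return out

m = np.zeros((7, 7), dtype=bool)
m[1, 1] = m[1, 4] = True        # two small islands, 3 pixels apart
b = broaden(m, radius=2)        # the broadened islands now touch
```

Because the two broadened islands intersect, a subsequent boundary step can treat them as one particle rather than two.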
Accordingly, particle contour broadener 224 may reduce overcounting and/or under-sizing of particles: when a particle has topography that is sensed and stored as image data containing separate islands of particle contours, the broadening connects the small spots together as one larger spot, so that the multiple spots are correctly counted and sized as a single particle. In some examples, particle contour broadener 224 may be configured to broaden the sensed particle contours by increasing the luminance values of one or more pixels proximate to the sensed particle contours to define broadened particle contours. For example, each pixel within 1, 2, 3 or more pixels from a sensed local peak, or from a pixel that is part of a particle contour, may be assigned the same luminance value as the luminance value of the local peak or member pixel of a particle contour. In this way, each island of particle contours may be stretched in size to define broadened particle contours. In some examples, user input may indicate how many neighboring pixels should have their luminance value adjusted, based on user knowledge of particle size or particle topography, or by experimentation (e.g., comparison against a calibration sample of known particle size or particle size distribution).
Additionally, or alternatively, particle contour broadener 224 may execute one or more computer vision or machine learning modules to determine how sensed particle contours should be stretched to determine broadened particle contours. In some examples, a fitting function may be executed to determine broadened particle contours. In some examples, the fitting function may be a Gaussian function, an adaptive mean threshold, an adaptive Gaussian function, combinations thereof, or another fitting function.
Once processing circuitry 204 has executed particle contour broadener 224 to determine broadened particle contours, processing circuitry 204 may execute instructions to determine a particle boundary from the broadened particle contours. Stated differently, processing circuitry 204 may be configured to determine which individual islands of particle contours in the sensed image data should be grouped together and assigned as belonging to the same particle, such that the particle boundary may be determined around the islands which are part of the same particle. In some examples, determining a boundary may include determining whether the broadened particle contours intersect with another spot or island of broadened particle contours. Based on determining that there is no intersection between the broadened particle contours, processing circuitry 204 may determine that the particle contour in the image data is a boundary of a particle. Conversely, based on determining that there is intersection, processing circuitry 204 may determine that the particle contours and the other broadened particle contours together belong to the same particle and connect the islands of particle contours, such that a line or curve set by a fitting function connecting the islands forms a boundary for the particle. As such, the determination that there is intersection between the broadened particle contours may include determining that the intersecting particle contours form a boundary for the at least one particle.
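After broadening, islands that intersect become connected regions of the mask, so grouping islands into particles reduces to counting connected components. A minimal sketch (a simple 4-connected flood fill stands in for any labeling routine; pure-Python lists are used for clarity):

```python
def count_particles(mask):
    """Count connected regions of True pixels; each region is one particle."""
    seen = [[False] * len(row) for row in mask]
    count = 0
    for y in range(len(mask)):
        for x in range(len(mask[0])):
            if mask[y][x] and not seen[y][x]:
                count += 1                      # new particle found
                stack = [(y, x)]
                while stack:                    # flood-fill its island
                    cy, cx = stack.pop()
                    if (0 <= cy < len(mask) and 0 <= cx < len(mask[0])
                            and mask[cy][cx] and not seen[cy][cx]):
                        seen[cy][cx] = True
                        stack.extend([(cy + 1, cx), (cy - 1, cx),
                                      (cy, cx + 1), (cy, cx - 1)])
    return count
```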
Once a particle boundary has been determined based on the broadened particle contours, processing circuitry 204 may be configured to mark the pixels within the boundary as making up an individual particle. Processing circuitry 204 may be configured to count the marked particles, size the particles within the image data by correlating the number of pixels to a scale that maps the pixels to a map of the detection chamber and/or a zoom setting of the lens system of imager 110, and determine the concentration of particles within the gas stream based on the marked particles and sampling information. As such, processing circuitry 204 may generate quantitative information based on the determined particle contours.
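The concentration step may be sketched as follows (an illustrative calculation; the flow rate and frame interval are hypothetical sampling parameters, not values from this disclosure):

```python
def concentration_per_liter(particle_count, flow_rate_l_per_min, frame_seconds):
    """Convert a per-frame particle count into particles per liter sampled."""
    sampled_liters = flow_rate_l_per_min * frame_seconds / 60.0
    return particle_count / sampled_liters

# 12 marked particles over 30 s at 2 L/min -> 1 L sampled -> 12 particles/L.
c = concentration_per_liter(12, flow_rate_l_per_min=2.0, frame_seconds=30.0)
```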
Processing circuitry 204 may execute the color manipulator 218 application, which may be housed within memory 202 of computing device 200. Processing circuitry 204 may execute color manipulator 218 to perform color analysis on received color image data. The color image data may be from imager 110, which may be a color image sensor or a color video camera. The color image data may include colors in addition to black and white, such as one or more of red, green, and blue colors.
In some examples, color manipulator 218 may store instructions for processing circuitry 204 to perform color analysis based on the particle boundary determined from the luminance analysis technique with the grayscale image data described above. For example, color analysis may be performed using the determined particle boundary as described above. Processing circuitry 204 may be configured to use the determined particle boundary to locate a particle area in the color image data, such as by overlaying the determined particle boundary over the color image data from imager 110. Processing circuitry 204 may be configured to determine a dominant color within the particle area. In some examples, the dominant color may be the hue that appears most frequently within the particle area. In some examples, the dominant color may be the average of red, green, and blue values of pixels within the particle area. Processing circuitry 204 may convert the dominant color to the dominant wavelength of the particle by using the hue of the dominant color to calculate the wavelength of induced fluorescent light emitted by the particle. The color image data may be signals sensed at red, green, and blue pixels in a sensor array of imager 110.
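The dominant-color step may be sketched as follows (an illustrative assumption throughout: the most-frequent-hue definition of dominant color, and a rough linear hue-to-wavelength map, which is not a calibrated conversion):

```python
import colorsys
from collections import Counter

def dominant_wavelength(rgb_pixels):
    """Estimate an emission wavelength from the most frequent hue."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0]
            for r, g, b in rgb_pixels]
    # Bin hues coarsely and take the most frequent one in the particle area.
    dominant_hue = Counter(round(h, 2) for h in hues).most_common(1)[0][0]
    # Map hue 0.0 (red) .. ~0.75 (violet) onto roughly 700 nm .. 380 nm.
    return 700 - dominant_hue / 0.75 * (700 - 380)

# A particle area that is mostly blue pixels with one red outlier.
wl = dominant_wavelength([(0, 0, 255)] * 3 + [(255, 0, 0)])
```

The estimated wavelength may then be compared against a database of known particle wavelengths, as described below.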
Processing circuitry 204 may be further configured to compare the dominant wavelength of the particle to a database of known wavelengths of particles stored within memory 202 as particle data 226. Since certain particles fluoresce at known wavelengths when irradiated with beam 116 of known wavelength, processing circuitry 204 may thus determine a particle species when the dominant wavelength matches, or is within a certain tolerance of, that of a known particle species stored in the database. Similarly, memory 202 may store particle classification database(s) 203. These databases may use the dominant wavelength, size of the particle area, shape of the particle area, particle images of specific particles, or the like to classify particles by matching these features against known particle parameters stored within the database. For example, processing circuitry 204 may be configured to determine whether the particle is a bioaerosol or abiotic aerosol. Thus, processing circuitry 204 may be configured to generate qualitative information about at least one particle based on the determined particle contours.
In some examples, processing circuitry 204 may be configured to aggregate the results of frames of image data from imager 110, such as a first set of image data captured at a first time and a second set of image data captured at a second time. Processing circuitry 204 may be configured to output for display via display 206 a representation of the first set of image data, the second set of image data, or both sets of image data. In some examples, the representation of the image data may be in the form of a chart, table, or graph.
Advantageously, system 100 and its associated techniques for operation may be suitable for detecting and analyzing smaller particles and/or particles at lower concentrations than other particle detection and image processing techniques, because system 100 may process the sensed data to more accurately determine at least one of the shape, size, count, concentration, type, or species of particle.
Additionally, technique 300 of
The image data from imager 110 may be processed in any way. For example, technique 300 may optionally include receiving a frame of grayscale image data comprising luminance values of image data captured by imager 110 (310). Processing circuitry 204 may analyze the received image data to identify at least one particle within the frame (312). In some examples, technique 300 may include determining particle contours of the at least one particle based on the luminance values (314). Technique 300 optionally includes generating, by processing circuitry 204, at least one of quantitative or qualitative information for the at least one particle based on the determined particle contours (316). In some examples, technique 300 may further include irradiating particles within detection chamber 102 by projecting beam 116 into the detection chamber. Beam 116 may comprise light ray(s) with a wavelength of less than about 450 nm, such as from about 250 nm to about 350 nm. In some examples, technique 300 may include capturing imaging data 214 (
The technique of
In some examples, the technique of
In some examples, the technique of
The technique of
In some examples, the technique of
In some examples, the technique of
In some examples, as illustrated in
With continued reference to
One or more of the techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors or processing circuitry, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), graphics processing units (GPUs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing circuitry” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit comprising hardware may also perform one or more of the techniques of this disclosure.
Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various operations and functions described in this disclosure. In addition, any of the described units, circuits or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as circuits or units is intended to highlight different functional aspects and does not necessarily imply that such circuits or units must be realized by separate hardware or software components. Rather, functionality associated with one or more circuits or units may be performed by separate hardware or software components or integrated within common or separate hardware or software components.
Various examples have been described. These and other examples are within the scope of the following clauses and claims.
Clause 1. A method of suspended particle detection, the method comprising: receiving, with a particle concentrator, an aerosol comprising particles suspended within a bulk gas, the aerosol having a first concentration indicative of count of particles per unit volume of the bulk gas; concentrating, with the particle concentrator, the aerosol to generate a particle-rich stream of gas comprising at least one particle, the particle-rich stream of gas having a second concentration greater than the first concentration; irradiating the at least one particle in the particle-rich stream of gas with a light source of a certain wavelength in a detection chamber; and capturing image data relating to the at least one particle with an image sensor located within the detection chamber.
Clause 2. The method of clause 1, further comprising: obtaining a frame of grayscale image data comprising luminance values of image data captured by the image sensor or camera; analyzing the image data in the frame to identify at least one particle captured in the frame, wherein analyzing the image data comprises: identifying pixels having luminance values that satisfy a threshold; and determining particle contours of the at least one particle based on the identified pixels; and generating at least one of quantitative or qualitative information for the at least one particle based at least partially on the analyzing of the image data.
Clause 3. The method of clause 1, wherein concentrating the aerosol comprises: receiving at least a portion of the bulk gas into at least one inlet of the particle concentrator; outputting a particle-lean stream of gas as a major flow stream from a first outlet of the particle concentrator; and outputting the particle-rich stream of gas as a minor flow stream from a second outlet of the particle concentrator.
Clause 4. The method of clause 3, wherein a ratio of a volumetric flow rate of the particle-lean stream of gas to a volumetric flow rate of the particle-rich stream of gas is in a range of from about 10:1 to about 1000:1.
Clause 5. The method of clause 3 or clause 4, wherein a concentration ratio between the particle-rich stream of gas and the aerosol is in a range of from about 10 to about 1000.
Clause 6. The method of any of clauses 1-5, wherein concentrating the aerosol further comprises powering a blower, and wherein the blower causes at least a portion of the bulk gas to be received by an inlet of the particle concentrator and causes the particle-rich stream of gas to be available at the output of the particle concentrator.
Clause 7. The method of clause 6, wherein powering the blower comprises providing power to the blower in a range of from about 10 watts to about 300 watts.
Clause 8. The method of any of clauses 1-7, wherein the particle concentrator is a concentrating virtual impactor (CVI) device that performs an inertia-based preferential particle separation.
Clause 9. The method of any of clauses 1-8, wherein concentrating the aerosol comprises passing the bulk gas through a nozzle, receiving the minor flow stream at a receiving tube, and ejecting the major flow stream at a major flow exit, wherein the minor flow stream is the particle-rich stream of gas and the major flow stream is the particle-lean stream of gas.
Clause 10. The method of clause 9, wherein the major flow exit substantially surrounds the receiving tube.
Clause 11. The method of clause 9, wherein the major flow exit is disposed at an angle of from about 90 degrees to about 180 degrees from the receiving tube.
Clause 12. The method of clause 9, wherein concentrating the aerosol comprises preferentially separating at least one particle by the particle's inertia into the particle-rich stream of gas.
Clause 13. The method of clause 12, wherein preferentially separating particles comprises: causing a majority of particles in the aerosol which have a maximum dimension that is above a particle size cut point in the aerosol to enter the particle-rich minor stream of gas.
Clause 14. The method of clause 13, wherein the particle size cut point is about 1 (±0.5) micrometer or larger.
Clause 15. The method of any of clauses 1-14, wherein concentrating the aerosol comprises: between sampling the aerosol and outputting the particle-rich stream of gas, concentrating the aerosol in a first stage comprising a first set of nozzles, and further concentrating the aerosol in a second stage comprising a second set of nozzles.
Clause 16. The method of clause 15, wherein each nozzle of the first set of nozzles and each nozzle of the second set of nozzles defines an aperture that is larger than about 1 millimeter.
Clause 17. The method of clause 15 or clause 16, wherein the first set of nozzles comprises more than one nozzle and the second set of nozzles consists of a single nozzle.
Clause 18. The method of any of clauses 1-17, further comprising performing a first-pass preseparation to remove a majority of particles which have a maximum dimension above a second particle size cut point desired for concentration enhancement.
Clause 19. The method of clause 18, wherein the second particle size cut point is about 10 micrometers or larger for particles removed from an inlet stream of gas.
Clause 20. The method of clause 18 or clause 19, wherein the preseparation comprises performing an inertia-based separation in the particle concentrator.
Clause 21. The method of any of clauses 1-20, wherein the particle concentrator is formed by an additive manufacturing process.
Clause 22. The method of any of clauses 1-21, wherein the particle concentrator has a maximum dimension of less than about 150 millimeters.
Clause 23. The method of clause 2, wherein the light source is an external light source, wherein the light source comprises a laser or LED, and wherein the light source generates a beam of light with a wavelength below 450 nanometers (nm), such as from about 250 nm to about 350 nm.
Clause 24. The method of clause 2, wherein the captured image data comprises image data of at least one particle induced or enhanced by the light source.
Clause 25. The method of clause 2, wherein the image sensor or camera comprises a color image sensor or camera, such as a color video camera.
Clause 26. The method of clause 2, wherein the image data includes a red image data matrix, a green image data matrix, and a blue image data matrix, and wherein obtaining grayscale image data comprises at least one of summing or averaging each of the red image data matrix, the green image data matrix, and the blue image data matrix to form an overall image data matrix.
Clause 27. The method of clause 2, wherein identifying pixels having luminance values that satisfy the threshold comprises: determining local thresholds within respective subsets of pixels; comparing luminance values of pixels within each respective subset of pixels to the respective local threshold for that subset of pixels; and sweeping through the subsets of pixels to identify the pixels based on the comparison, and wherein determining particle contours comprises grouping the identified pixels of each of the respective subsets of pixels together as an island of particle contours.
Clause 28. The method of clause 27, wherein determining the local thresholds comprises averaging pixel values of the image data within the respective subsets of pixels.
Clause 29. The method of clause 28, further comprising identifying adjacent islands of particle contours as belonging to the same particle, wherein determining the particle contours comprises determining particle contours by fitting the data in the subsets of pixels using a fitting function.
Clause 30. The method of clause 29, wherein the fitting function is a Gaussian function.
Clause 31. The method of clause 2, wherein identifying pixels having luminance values that satisfy the threshold comprises: determining the threshold within the image data; comparing luminance values of pixels to the threshold; and
Clause 32. The method of clause 31, wherein determining the local thresholds comprises averaging pixel values of the image data within the respective subsets of pixels.
Clause 33. The method of clause 2, further comprising: applying a gain adjustment to the luminance values to determine adjusted luminance values for one or more pixels, wherein identifying pixels that satisfy the threshold comprises identifying pixels that satisfy the threshold based on the adjusted luminance values.
Clause 34. The method of clause 2, wherein the identified pixels comprise a first pixel and a second pixel that are separated by a distance, wherein determining particle contours comprises: assigning one or more pixels, within the distance, proximate to the first pixel and the second pixel approximately the same luminance value as the nearest pixel within the identified pixels to create a broadened cluster of pixels that includes the first pixel and the second pixel; and determining the particle contours based on the cluster of pixels.
Clause 35. The method of any of clauses 1-34, wherein generating at least one of quantitative or qualitative information includes generating quantitative information comprising at least one of a particle count or a particle concentration.
Clause 36. The method of any of clauses 1-35, wherein generating at least one of quantitative or qualitative information includes generating qualitative information comprising images of individual particles, sizes of the captured particles represented by the image data, and colors or dominant wavelengths of induced or enhanced light emitted from the captured particles.
Clause 37. The method of any of clauses 1-36, further comprising: selecting a file from a memory associated with the image sensor or camera; and reading a frame from the file to generate the grayscale image data.
Clause 38. The method of clause 37, wherein the file comprises video data.
Clause 39. The method of clause 37, further comprising determining whether the file contains at least one additional frame, and responsive to determining that the file contains at least one additional frame, reading a second frame from the file to generate a second set of grayscale image data.
Clause 40. The method of any of clauses 38 or 39, wherein generating at least one of quantitative or qualitative information for the at least one particle based at least partially on the determined particle contours comprises marking the at least one particle within the image data based on the determined boundary.
Clause 41. The method of clause 40, further comprising counting the marked at least one particle.
Clause 42. The method of clause 41, further comprising determining a particle concentration based on the counted at least one particle.
Clause 43. The method of any of clauses 40-42, further comprising determining the size of at least one particle within the frame based on the determined boundary.
Clause 44. The method of clause 2, further comprising: receiving color image data that includes colors in addition to black and white, wherein the color image data is from the image sensor or camera, and wherein the grayscale image data is based on the color image data; and performing color analysis on the color image data using the determined particle contours, wherein generating at least one of the quantitative or qualitative information comprises generating qualitative information based on the color analysis.
Clause 45. The method of clause 44, wherein performing color analysis comprises locating a particle area in the color image data.
Clause 46. The method of clause 45, wherein performing color analysis comprises determining a dominant color within the particle area.
Clause 47. The method of any of clauses 44-46, wherein performing color analysis comprises converting the dominant color to a dominant wavelength of the at least one particle by using the hue of the color image data to calculate the wavelength of induced fluorescent light emitted by the at least one particle.
Clause 48. The method of clause 47, wherein converting the dominant color to a dominant wavelength of at least one particle is based at least partially on signals sensed at red, green, and blue pixels in a sensor array of the image sensor.
Clause 49. The method of clause 48, further comprising comparing the dominant wavelength of at least one particle to a database of known wavelengths to determine a particle species.
Clause 50. The method of clauses 48 or 49, further comprising comparing the dominant wavelength of the at least one particle to a database of known wavelengths to determine a particle type, wherein the particle type is a bioaerosol or an abiotic aerosol.
Clause 51. The method of any of clauses 1-50, further comprising outputting, for display via a display, a representation of one or more pieces of the at least one of quantitative or qualitative information, wherein the at least one of quantitative or qualitative information comprises one or more of a particle count, a particle size, a particle concentration, a particle type, or a particle species.
Clause 52. A system configured to perform the method of any of clauses 1-51.
Clause 53. A system comprising: a particle concentrator; and a particle detection and analysis unit comprising: at least one light source of a certain wavelength configured to irradiate at least one particle; at least one image sensor or camera configured to capture an image relating to the at least one particle; and one or more processors configured to: obtain a frame of grayscale image data comprising luminance values of image data captured by the image sensor or camera; analyze the image data in the frame to identify at least one particle captured in the frame, wherein to analyze the image data, the one or more processors are configured to: identify pixels having luminance values that satisfy a threshold; and determine particle contours of the at least one particle based on the identified pixels; and generate at least one of quantitative or qualitative information for the at least one particle based at least partially on the analyzing of the image data.
Clause 54. The system of clause 53, wherein the system is further configured to perform the method of any of clauses 1-50.
Clause 55. A system comprising: a particle concentrator configured to receive an aerosol having a first concentration of particles and output a particle-rich stream of gas having a second concentration of particles to a particle sensor; and a particle sensor comprising: a light source configured to irradiate at least one particle in the particle-rich stream of gas; an image sensor configured to capture an image of the at least one particle; and processing circuitry configured to perform an image analysis algorithm to analyze the image of the at least one particle.
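The block-wise local thresholding and island grouping recited in clauses 27-30 can be illustrated with a minimal Python sketch. The block size, the offset margin above each block's mean luminance, and the function name are illustrative assumptions, not claim language.

```python
import numpy as np

def find_particle_islands(gray, block=16, offset=5.0):
    """Sweep fixed-size blocks (subsets of pixels) of a grayscale frame,
    threshold each block against its own mean luminance (a local threshold),
    and group the pixels that exceed it into per-block 'islands' of
    candidate particle pixels."""
    h, w = gray.shape
    islands = []
    for y in range(0, h, block):              # sweep through the subsets of pixels
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block]
            local_thresh = tile.mean() + offset
            rows, cols = np.nonzero(tile > local_thresh)
            if rows.size:                     # group identified pixels as one island
                islands.append([(y + r, x + c) for r, c in zip(rows, cols)])
    return islands
```

Per clauses 29-30, adjacent islands whose pixel sets touch could then be merged as belonging to the same particle and the merged data fit with a Gaussian function to determine the particle contour.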
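The counting and concentration steps of clauses 41-42 reduce to dividing the total count of marked particles by the volume of particle-rich gas sampled while the frames were captured. The flow-rate and frame-rate parameters in this sketch are hypothetical instrument values.

```python
def particle_concentration(counts_per_frame, flow_rate_lpm, frame_rate_fps):
    """Particles per liter: total particles counted across frames divided
    by the gas volume sampled during the capture interval."""
    total = sum(counts_per_frame)                        # clause 41: count marked particles
    seconds = len(counts_per_frame) / frame_rate_fps     # duration of the capture
    sampled_liters = flow_rate_lpm * seconds / 60.0      # volume at the given flow rate
    return total / sampled_liters                        # clause 42: concentration
```

Note that this yields the concentration of the particle-rich stream; recovering the concentration in the raw aerosol would further require dividing by the concentrator's enrichment factor.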
This application claims the benefit of U.S. Provisional Patent Application No. 63/597,607, filed Nov. 9, 2023, and U.S. Provisional Patent Application No. 63/448,573, filed Feb. 27, 2023, the entire contents of each of which are incorporated herein by reference.
This invention was made with government support under W9124P-23-P-0024 awarded by the Army Research Lab. The government has certain rights in the invention.
Number | Date | Country
---|---|---
63448573 | Feb 2023 | US
63597607 | Nov 2023 | US