The present invention relates to image analysis and chemical detection, and more specifically, to determining a presence or concentration of a chemical in a sample based on image analysis.
Chemical detection is an ongoing challenge for several federal agencies and industries. Compounds of interest range from illicit drugs and opioids to explosives, chemical or biological warfare agents, environmental pollutants, foodborne pathogens, and viruses such as the virus that causes COVID-19. Many detection methods utilize color changes to indicate the presence of chemicals of interest (e.g., color tests and immunoassays). Color test protocols used in the laboratory are typically performed in a spot plate or test tube format whereby a color change is visually examined when an evidence sample is added to specific chemical reagents (different reagents change color in the presence of different drug compounds or drug classes). Color tests are the predominant screening technique used in laboratory seized drug workflows. For example, the 4-aminophenol (4-AP) color test can be used to determine whether a Cannabis sample is hemp-rich or THC-rich. Color tests are advantageous due to their low cost, portability, and simplicity.
However, color tests have several disadvantages: (1) the large number of, and variability in, compounds of interest, (2) the subjective nature of color interpretation, (3) only one substance can be manually tested at a time, (4) no storage of data for traceability, recall, or review, (5) lack of automation, and (6) no integration into a laboratory information management system (LIMS).
In addition to color tests, lateral flow immunoassays (LFIs) can be used for detection of an antibody or antigen in a chemical sample. In LFIs, fluid movement is driven along a cellulose strip based on capillary action, or wicking. In general, instead of using chemical reagents that change color in the presence of specific drugs of interest due to chemical structure, drug immunoassays and LFIs use capture antibodies of a complementary chemical structure labeled with an indicator that changes color when a substrate is added for detection. Similar to color tests, the advantages of LFIs are simplicity, low cost, rapid analysis times, and no need for external instrumentation.
LFIs, however, suffer from the same limitations as color tests. LFIs are routinely used in urine drug testing, where they provide an intuitive method for analysis due to the ubiquity of other LFIs, such as pregnancy tests, and commercially available LFIs have also been used as field presumptive tests by law enforcement. Incorporating drug immunoassays to augment current color tests may improve the sensitivity, reliability, reproducibility, and accuracy of current workflows. Notably, including more than one drug test specific to a particular drug of interest can provide increased confidence in the results.
The prevalence of color tests for presumptive screening is due to their low cost, quick turnaround time, and the simplicity with which color changes are observed visually. Limitations that arise in color test operation can include subjectivity, human factors, improper use, manual procedures, incorrectly recorded results, uncontrolled interferents, and the multiple individual color tests required for drug identification, all providing potential for user error or unreliable results. The susceptibility of color testing to these errors presents a need for improving the sensitivity, reliability, reproducibility, and accuracy of current workflows. Subsequent forensic laboratory analysis is performed for confirmation of presumptive testing, which includes replication of the color test. In addition to the challenges listed above, current technology lacks the ability to be automated and/or integrated with laboratory information management systems (LIMS).
Even though forensic laboratories have implemented more efficient processes and digitization, color testing has remained largely stagnant. Moreover, recent legislation has created an additional and unique color test challenge. The 2018 Farm Bill redefined hemp as Cannabis containing less than 0.3% Δ9-tetrahydrocannabinol (THC) and removed hemp from the controlled substances list. Prior to this legislation, forensic laboratory workflows focused solely on identification of THC rather than determining THC concentrations. The strain of this change halted Cannabis evidence processing in many laboratories until new protocols were developed, creating pressure to rapidly validate a presumptive color test, confirmatory method, new standard operating procedures, and any necessary training materials, all while juggling current caseload demands.
The 4-aminophenol (4-AP) color test semi-quantitatively identifies cannabidiol (CBD)- or THC-rich Cannabis. The rapid adoption and implementation of this new 4-AP color test into forensic laboratory workflows for Cannabis highlights a preference to continue utilizing color tests in seized drug processing when possible. For example, currently, if a sample is seized for subsequent analysis within a forensic laboratory, an analytical workflow following recommendations from the Scientific Working Group for the Analysis of Seized Drugs is used to achieve a sufficient level of selectivity for a scientifically supported conclusion. This is generally a multistep process that includes screening and confirmation. The United Nations Office on Drugs and Crime's International Collaborative Exercises (ICE) reported color testing as the most common screening method.
Other methods, such as Raman spectroscopy and ion mobility spectrometry (IMS), also have limitations resolving drug mixtures, which may result from, for example, weak intensity of Raman scattering with Raman spectroscopy and competitive ionization or unresolvable signals with IMS. Specifically, such current solutions cannot detect a minor illicit drug component in a drug sample that is majority excipient or cutting agent.
Drug use takes a costly toll on communities throughout the United States, and laboratory processing of seized drug evidence aids in enforcing public safety and controlling crime. As mentioned above, evolving drug landscapes and regulations can create analytical challenges for seized drug processing and detection within the criminal justice and forensic science communities.
These challenges present a need for improved methods and systems for more reliable, reproducible, accurate, and unbiased evidence processing results.
Disclosed herein are systems, methods, and devices for determining a presence or concentration of a chemical in a sample based on image analysis. In one embodiment, a computing system for image analysis includes one or more hardware computer processors configured to execute software instructions and one or more storage devices storing software instructions for execution by the one or more hardware computer processors configured to cause the computing system to perform the following steps. The computing system receives imaging data of at least one sample droplet of a sample and at least one reagent. One or more imaging properties are identified from the imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The computing system stores the imaging data and the result.
In some embodiments, the reagent is associated with an immunoassay, a colorimetric reaction assay, or another chemical indicator. Colorimetric reactions can include color changes based on chemicals and chemical indicators. Chemical indicators can include, for example, pH tests, and may be distinct from chemicals that change the composition of a chemical of interest or of a chemical reagent used, or that precipitate out, any of which can allow for a color change. Additionally, a colorimetric reaction assay may be different from an immunoassay or other bioassays, which are based on antibodies or antigens.
In some embodiments, the computing system is configured to identify the imaging properties based on a subset of the imaging data.
In some embodiments, the computing system is configured to identify one or more image pixels to be analyzed and one or more image pixels to be excluded from being analyzed.
In some embodiments, the computing system is configured to generate a result based on the identified imaging properties by applying criteria associated with the reagent. It is appreciated that various criteria may be used to generate a result without departing from the scope of the subject matter disclosed herein. For example, the criteria can include a threshold value where applying the threshold may include generating a positive result if the concentration is above the threshold and a negative result if the concentration is below the threshold. Applying the criteria can also include quantification which, rather than using a threshold value, measures the response fit to a curve (e.g., linear) and extrapolates the concentration.
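By way of a non-limiting sketch (all threshold and calibration values are hypothetical, and the function names are illustrative, not part of the disclosed system), the two forms of criteria described above might look like:

```python
def classify_by_threshold(concentration, threshold):
    """Generate a positive result if the concentration is above the
    threshold and a negative result if it is below."""
    return "positive" if concentration > threshold else "negative"


def quantify_by_linear_fit(response, slope, intercept):
    """Extrapolate a concentration from a measured response using a
    linear calibration curve: response = slope * concentration + intercept."""
    return (response - intercept) / slope


# Hypothetical values for illustration only.
qualitative = classify_by_threshold(0.5, 0.3)          # "positive"
quantitative = quantify_by_linear_fit(1.2, 0.04, 0.0)  # 30.0 concentration units
```

The threshold approach yields a binary call, while the curve-fit approach returns a numeric concentration; a given reagent's criteria could use either.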
In some embodiments, the system also includes a sensor configured to capture the imaging data.
In some embodiments, the sensor includes one or more of: a microscope, a spectrophotometer, a camera, and a digital imaging device.
In some embodiments, the identified imaging properties include at least one of: a size, a location, and a color.
In some embodiments, the imaging data is convertible into a viewable image.
In some embodiments, the imaging data is a Raster image file.
In some embodiments, the Raster image file is one of: a jpeg, a tiff, a gif, a png, a RAW, a Bitmap, an encapsulated postscript (EPS), and a portable document format (PDF).
In some embodiments, the system includes an additional device configured to at least one of: transport, aliquot, split, combine, or mix the at least one sample droplet and the at least one reagent.
In some embodiments, the additional device is one of: a test tube, a containment tube, a PCR tube, a multiple-well plate, a spot plate, a cuvette, a lateral flow assay, a test strip, pH paper, a substrate material, a reagent 3D printed into a polymeric material, a microfluidic system, and a digital microfluidic system.
In some embodiments, the digital microfluidic system includes a microfluidic device and a microprocessor coupled to the microfluidic device. The microfluidic device includes a substrate, one or more microfluidic channels arranged on or within the substrate, and one or more electrodes arranged on or within the substrate. A first portion of the microfluidic device is configured to receive a reagent and a second portion of the microfluidic device is configured to receive a sample droplet. The microprocessor coupled to the microfluidic device is configured to combine sample droplets and reagents by applying one or more electric potentials to the one or more electrodes.
In another embodiment, a method implemented on a computing device includes receiving imaging data of at least one sample droplet of a sample and at least one reagent. One or more imaging properties are identified from the imaging data and a result is generated indicating the presence or concentration of a chemical in the sample based on the identified imaging properties. The imaging data and the result are stored.
The subject matter described herein also includes a digital microfluidic system and associated method(s) for chemical detection. According to one embodiment, the digital microfluidic system includes a microfluidic device and a microprocessor coupled to the microfluidic device. The microfluidic device includes a substrate, one or more microfluidic channels arranged on or within the substrate, and one or more electrodes arranged on or within the substrate. At least one reagent is introduced to a first portion of the microfluidic device and at least one sample droplet is introduced to a second portion of the microfluidic device. The microprocessor is configured to combine the at least one sample droplet with the at least one reagent included on the microfluidic device. Combining the at least one sample droplet with the at least one reagent includes applying one or more electric potentials to the electrodes of the microfluidic device. A sensor, including any hardware or other device capable of capturing imaging data, is used to capture imaging data of the at least one sample droplet and the at least one reagent. One or more imaging properties, including color data or other properties, are identified from the captured imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The captured imaging data and the result are stored.
According to another embodiment, a method for chemical detection using a digital microfluidic system includes combining at least one sample droplet with at least one reagent included on a digital microfluidic device of the digital microfluidic system. Combining a sample droplet with a reagent includes applying one or more digitally controlled electric potentials to electrodes of the digital microfluidic device. Utilizing a sensor, imaging data of the sample droplet(s) and the reagent(s) is captured. Imaging data can be captured before and/or after the sample droplet(s) and the reagent(s) are combined. One or more imaging properties are then identified from the captured imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The captured imaging data and the result are stored.
The subject matter described herein includes a computing device for determining a presence or concentration of a chemical in a sample based on image analysis. In one embodiment, the computing system includes one or more hardware computer processors configured to execute software instructions and one or more storage devices storing software instructions for execution by the one or more hardware computer processors configured to cause the computing system to perform the following steps. The computing system receives imaging data of at least one sample droplet of a sample and at least one reagent. One or more imaging properties are identified from the imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The computing system stores the imaging data and the result.
In other embodiments, the computing system may include or be associated with one or more other devices configured to transport, aliquot, split, combine, or mix sample droplet(s) and reagent(s). Such devices can include: a test tube, a containment tube, a PCR tube, a multiple-well plate, a spot plate, pH paper, a substrate material, a reagent 3D printed into a polymeric material, or a digital microfluidic system. In an embodiment where the computing system includes or is associated with a digital microfluidic system, the digital microfluidic system can include a microfluidic device and a microprocessor coupled to the microfluidic device. The microfluidic device includes a substrate, one or more microfluidic channels arranged on or within the substrate, and one or more electrodes arranged on or within the substrate. A first portion of the microfluidic device is configured to receive the at least one reagent and a second portion of the microfluidic device is configured to receive the at least one sample droplet. The microprocessor is configured to combine the at least one sample droplet and the at least one reagent by applying one or more electric potentials to the one or more electrodes.
Referring to
The computing system 100 receives imaging data 108, where the imaging data 108 is of at least one sample droplet of a sample and at least one reagent, from one or more image capture sensors 112, such as digital camera 114, spectrophotometer 116, or microscope 118. The one or more other devices 122 are configured to transport, aliquot, split, combine, or mix sample droplet(s) and reagent(s).
Alternatively, the computing system 100 may receive imaging data 108 from an external imaging data source 120, such as a database containing raw imaging data or viewable image files previously captured. The image capture sensor 112 may be associated with (e.g., co-located with) an additional device 122, such as a spot plate 124, test tube 126, or digital microfluidic system 128, for capturing imaging data of sample(s) and reagent(s) in the additional device.
One or more imaging properties are identified from the imaging data 108 and a result 110 is generated. The result 110 indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The computing system 100 stores the imaging data 108 and the result 110.
At step 200, the computing system 100 receives imaging data of at least one sample droplet of a sample and at least one reagent.
For example, receiving the imaging data may include capturing the imaging data from a sensor such as a microscope, a spectrophotometer, a camera, or a digital imaging device. The imaging data may include a Raster image file, such as a jpeg, a tiff, a gif, or a png. Additionally, the imaging data may be converted into a viewable image. The imaging data may also be received from a source other than directly from a sensor, such as a database containing imaging data that was previously captured or otherwise generated.
The imaging data may be associated with a sample, one or more sample droplets, and/or one or more reagents before and/or after being mixed. Typically, at least one sample droplet and at least one reagent are at least one of: transported, aliquoted, split, combined, or mixed prior to receiving the imaging data in a test tube, a spot plate, a digital microfluidic system, or any other suitable apparatus or material described herein. This allows for a change in the color of the sample when combined with the reagent(s) for indicating the presence or concentration of a chemical in the sample. This color information is an example of imaging data.
For example, in an embodiment where sample droplets and reagents are combined using a digital microfluidic system, the digital microfluidic system may include a microfluidic device and microprocessor coupled to the microfluidic device. The microfluidic device may include a substrate, one or more microfluidic channels arranged on or within the substrate, one or more electrodes arranged on or within the substrate, a first portion of the microfluidic device configured to receive the at least one reagent, and a second portion of the microfluidic device configured to receive the at least one sample droplet. The microprocessor coupled to the microfluidic device may be configured to combine the at least one sample droplet and the at least one reagent by applying one or more electric potentials to the one or more electrodes.
At step 202, the computing system 100 identifies one or more imaging properties from the imaging data. The identified imaging properties include at least one of: a size, a location, and a color. As used herein, “imaging data” and “image data” may be used interchangeably to refer to data produced by an optical or electronic sensor. “Imaging properties” may include both properties of an image file itself as well as properties of its production. In some cases, “image properties” may be used to refer to properties such as the length, width, and file size of the image file, while “imaging properties” may be used to refer to properties such as the type of light, aperture, ISO, and camera model used to capture the imaging data. It is understood, however, that “imaging properties” includes both the properties of an image and the properties of its production unless otherwise specified.
The imaging properties identified from the imaging data may be based on either all of the imaging data or a subset of the imaging data. For example, identifying one or more imaging properties from the imaging data may include identifying one or more image pixels to be analyzed and one or more image pixels to be excluded from being analyzed.
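As a minimal sketch of pixel selection (the 4×4 grayscale values and the region-of-interest mask are hypothetical), only pixels inside the droplet region are analyzed while background pixels are excluded:

```python
# Hypothetical 4x4 grayscale image: the droplet occupies columns 2-3 of
# rows 0-2, and the remaining pixels are background.
image = [
    [10, 10, 200, 210],
    [10, 10, 205, 208],
    [10, 10, 198, 202],
    [10, 10,  10,  10],
]

# Mask of pixels to analyze (True) versus exclude (False).
mask = [[col >= 2 and row < 3 for col in range(4)] for row in range(4)]

analyzed = [image[r][c] for r in range(4) for c in range(4) if mask[r][c]]
mean_intensity = sum(analyzed) / len(analyzed)  # mean over droplet pixels only
```

The mean intensity here reflects only the droplet region; in practice the mask could come from droplet localization rather than fixed coordinates.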
In some embodiments, the imaging data may be converted into a viewable image, such as a Raster image file that can include a jpeg, a tiff, a gif, a png, or other image format.
At step 204, the computing system 100 generates a result. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. For example, the result may indicate the presence of THC in a sample.
At step 206, the computing system 100 stores the imaging data and the result. For example, color data for a sample captured both before and after being combined with a reagent may be stored along with the result indicating whether this color change indicated the presence or concentration of a chemical in the sample. This may allow for auditing of the result based on a review of the underlying imaging data.
The subject matter described herein may also include a digital microfluidic system 128 (also referred to herein as a digital microfluidic “platform”) and associated method(s) for chemical detection. According to one embodiment, the digital microfluidic system 128 includes a microfluidic device and a microprocessor coupled to the microfluidic device. The microfluidic device includes a substrate, one or more microfluidic channels arranged on or within the substrate, and one or more electrodes arranged on or within the substrate. At least one reagent is introduced to a first portion of the microfluidic device and at least one sample droplet is introduced to a second portion of the microfluidic device. The microprocessor is configured to combine the at least one sample droplet with the at least one reagent included on the microfluidic device. Combining the at least one sample droplet with the at least one reagent includes applying one or more electric potentials to the electrodes of the microfluidic device. A sensor is used to capture imaging data of the at least one sample droplet and the at least one reagent. One or more imaging properties are identified from the captured imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The captured imaging data and the result are stored.
While at least one sample droplet and at least one reagent are combined to process a sample, the digital microfluidic system 128 may also perform at least one of the following additional or intermediate steps: transporting, aliquoting, splitting, combining, or mixing the at least one sample droplet and the at least one reagent. This allows for many combinations of sample droplets and reagents and, as a result, allows samples to be processed faster than conventional methods and systems with improved automation.
The captured color and image data may not be a viewable image. For example, unprocessed camera sensor data stored in RAW photo formats may include an m-by-n array of pixels (where m and n are the dimensions of the sensor) in which each pixel contains light intensity values and color channel information (e.g., red, green, or blue). This unprocessed sensor data can be converted into a viewable image, such as a jpg, png, or tiff. All raster and other file types are encompassed, including .bmp, .jpeg, .gif, and .mp4, and videos or still-frame images may be identified. Identifying one or more imaging properties from the captured imaging data may include identifying one or more image pixels to be analyzed and identifying one or more image pixels to be excluded from being analyzed. For example, unwanted background features or lighting reflections may be excluded from the image analysis because they do not represent the color response of the sample droplet(s) after being mixed with the reagent(s). The result generated based on the identified imaging properties can be based on applying criteria associated with each of the one or more reagents. For example, a threshold may be defined for classifying the identified image pixels to be analyzed as either a first color or a second color.
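A first-color/second-color classification of the analyzed pixels might be sketched as follows (the RGB tuples and the simple channel comparison are hypothetical; real sensor data would first be demosaiced and background-corrected):

```python
# Hypothetical RGB tuples for the pixels selected for analysis.
pixels = [(20, 30, 220), (25, 28, 210), (230, 40, 30)]


def classify_pixel(rgb):
    """Classify a pixel as the first color ("blue") or the second ("red")
    by comparing channel intensities."""
    r, g, b = rgb
    return "blue" if b > r else "red"


counts = {"blue": 0, "red": 0}
for p in pixels:
    counts[classify_pixel(p)] += 1

# A majority vote over the analyzed pixels gives the droplet's color call.
result = max(counts, key=counts.get)  # "blue"
```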
According to another embodiment, a method for chemical detection using a digital microfluidic system includes combining at least one sample droplet with at least one reagent included on a digital microfluidic device of the digital microfluidic system. Combining a sample droplet with a reagent includes applying one or more digitally controlled electric potentials to electrodes of the digital microfluidic device. Utilizing a sensor, imaging data of the sample droplet(s) and the reagent(s) is captured. Imaging data can be captured before and/or after the sample droplet(s) and the reagent(s) are combined. One or more imaging properties are then identified from the captured imaging data and a result is generated. The result indicates the presence or concentration of a chemical in the sample based on the identified imaging properties. The captured imaging data and the result are stored.
In contrast to conventional methods and systems that require subjective color interpretation, allow only one substance to be manually tested at a time, do not provide data storage for traceability, recall, or review, lack automation, and cannot be integrated into a LIMS, the presently disclosed methods and systems allow for objective color interpretation, parallel substance testing, storage of data for traceability, recall, or review, automation, and integration into a LIMS.
The disclosure herein includes a two-pronged approach for addressing the shortcomings of current color testing systems and methods. First, a digital microfluidic platform (also referred to herein as a digital microfluidic “system”) is used to perform color tests that includes automation, objective interpretation of results, performing multiple color tests in parallel, and collecting images of the results. Second, drug immunoassays are integrated into the platform to augment color tests for more reliable, reproducible, accurate, and unbiased results. This may be accomplished within a single system, or may alternatively be accomplished within a first system that conducts color tests and a second system that conducts immunoassays.
A digital microfluidic system may be approximately the size of a tissue box and use replaceable microfluidic devices that are approximately the size of a credit card. Additional specifications and parameters can be determined through design and experimentation. Microfluidic devices (also commonly known as microfluidic “chips”), droplet testing kits (various dye and buffer solutions), digital microprocessors, and software (computer-executable instructions for controlling droplets that can be optimized in house) may be part of a given DMF platform or system as a compiled kit. Incorporation of a camera or other imaging device with off-the-shelf DMF systems and parts is within the scope of the present disclosure.
Using a digital microfluidic (DMF) device or chip enables digital manipulation of droplets across electrodes, which could be in the form of an array, where a different color test can be performed at each droplet. A DMF system or platform, as disclosed herein, includes a DMF device that is integrated with a sensor for capturing imaging data, a processor for controlling the DMF device and performing analysis, and optionally a memory for storing the imaging data and the analysis results. The sensor, such as a digital camera, is used for automated imaging and objective interpretation of color test results. Current color tests for seized drugs can be translated onto a DMF platform, sampling can be optimized for performing multiple color tests in parallel, and sampling and sample processing can be integrated for multiplexed analysis.
Additionally, objective image analysis methods for each color test are integrated into an algorithm for unbiased analysis and data integrity. Image analysis is a quantitative metric for measuring sensitivity, accuracy, and reproducibility and is considered an improvement on current color testing evaluations. In addition to the DMF platform being configured for image capture, objective image analysis methods may be optimized for each color test. The analysis method can then be used as a quantitative metric for evaluating and optimizing color test results.
Lastly, routine color tests can be augmented with drug immunoassays to incorporate more than one test per illicit drug for increased reliability of screening results. This may include translating immunoassays onto the DMF platform, optimizing and refining the image analysis method, and implementing a multiplexed method for performing immunoassays alongside the color tests.
A droplet can be split by charging two electrodes on opposite sides of a droplet on an uncharged electrode. In the same way a droplet on an uncharged electrode will move towards an adjacent, charged electrode, a droplet will move towards each neighboring active electrode. The droplet can thus be split by gradually changing the potential of the electrodes.
Droplets can also be merged into one droplet using the same concept applied to splitting an existing droplet with electrodes. An aqueous droplet resting on an uncharged electrode can move towards a charged electrode where droplets will join and merge into one droplet.
Discrete droplets can be transported in a highly controlled way using an array of electrodes. In the same way droplets move from an uncharged electrode to a charged electrode, or vice versa, droplets can be continuously transported along the electrodes by sequentially energizing the electrodes. Since droplet transportation involves an array of electrodes, multiple electrodes can be programmed to selectively apply a voltage to each electrode for better control over transporting multiple droplets.
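The sequential energizing described above can be sketched as a simple control loop (the `apply_voltage` callback and one-dimensional electrode indexing are hypothetical abstractions of the device interface, not the disclosed hardware):

```python
def transport(droplet_pos, target_pos, apply_voltage):
    """Move a droplet along a 1-D electrode array by sequentially charging
    the electrode adjacent to the droplet in the direction of travel."""
    path = []
    while droplet_pos != target_pos:
        step = 1 if target_pos > droplet_pos else -1
        next_electrode = droplet_pos + step
        apply_voltage(next_electrode)  # charge the neighboring electrode
        droplet_pos = next_electrode   # droplet moves toward the charged electrode
        path.append(droplet_pos)
    return path


energized = []
path = transport(0, 3, energized.append)  # electrodes 1, 2, 3 energized in order
```

Running several such loops with independent position state mirrors the selective, per-electrode voltage control that allows multiple droplets to be transported at once.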
For example, referring to
As mentioned above, DMF devices have several advantages over conventional color tests. First, each droplet serves as a color test “chamber” without requiring any tubes; moreover, several droplets can be analyzed in parallel. Second, the generic insulated electrode array format allows for flexibility, and methods can be easily reconfigured and adapted using software. Third, several fluid manipulations, including transporting, aliquoting, combining, and mixing (analogous to pipetting steps), can all be preprogrammed for automation. Fourth, digital microfluidic devices are approximately the size of a credit card with droplets less than a microliter in volume.
When compared with traditional color testing systems and methods, DMF offers faster turnaround time (e.g., 3× faster) because sample and reagent movements are completed within seconds and tests are performed simultaneously in parallel rather than sequentially. Additionally, when compared with other screening technologies, such as direct analysis in real-time mass spectrometry (DART-MS), DMF may be significantly less expensive (e.g., approximately 100×). DMF can also offer cost savings (approximately 4-6×) compared with other techniques, such as dipsticks, because other colorimetric-based products (e.g., dipsticks) require multiple cartridges for sample identification and use a wicking format of cellulose membrane or paper-like material.
Digital microfluidic technology offers several advantages to address challenges with color testing. In addition to those listed above, microfluidics inherently offers scaled-down size, rapid analysis, minimal sample consumption, low cost, closed systems, and simple operating procedures. Because DMF allows for flexibility in the generic control of several individual droplets in parallel, DMF devices can perform multiple tests at once using the same sample input.
Individual color tests are integrated into the microfluidic operational method for multiplexed sample processing described herein. Digital control of sample and color test reagents using DMF control software is performed using a combination of microfluidic droplet manipulations, including sample splitting, reagent addition, combining, mixing, and transporting. Then, a sampling method is used for aliquoting, or sample splitting, within the DMF platform 128 so each sample droplet can be used to perform a different color test, all in parallel. Sample splitting in this format is used analogously to pipetting, whereby sample is taken from one tube multiple times to add to several different spot plate wells (or test tubes) to perform different color tests. After sample is added to the DMF platform, a DMF method will be optimized for moving sample to locations on the microfluidic device for mixing with the different color test reagents. Turbulent mixing is used to generate homogenous color changes that are ready for subsequent imaging and analysis.
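The aliquoting-and-pairing step described above might be planned as follows (the sample identifier and reagent names are hypothetical placeholders; 4-AP is the only reagent named elsewhere herein):

```python
def plan_parallel_tests(sample_id, reagents):
    """Route one aliquot of the sample to each color-test reagent so the
    tests can run in parallel on the DMF device."""
    return [(f"{sample_id}-aliquot-{i}", reagent)
            for i, reagent in enumerate(reagents)]


# Hypothetical reagent names for illustration.
tests = plan_parallel_tests("S1", ["reagent-A", "reagent-B", "reagent-C", "4-AP"])
# Each aliquot is paired with a different reagent, e.g. ("S1-aliquot-0", "reagent-A").
```

This mirrors pipetting the same sample into several spot plate wells, each holding a different color test reagent.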
Samples and reagents may be introduced to the DMF device in a variety of ways. For example, a microfluidic device may be configured for detection of a particular chemical by being pre-loaded or pre-stored within the device with one or more reagents. For example, reagents 606a, 606b, 606c, 606d may be introduced at a first portion 602 (e.g., top) of the microfluidic device 128. Alternatively, the DMF device 128 may not be pre-loaded with reagents to allow customization of the reagents used. Similarly, samples may be introduced to the DMF device in a variety of ways. In
As discussed above, droplets and reagents may be moved via one or more droplet cells 612 and combined to form mixed droplets 610a or 610b having desired combinations of reagents 606a, 606b, 606c, 606d and sample 608.
A schematic illustration of these processes on a DMF platform is illustrated in
Sampling begins when a seized drug or other substance to be tested for the presence or absence of a target chemical is added to the DMF platform. The sample is extracted and dissolved so that it may be easier to process (e.g., divide into droplets). Next, the sample solution is divided (aliquoted) into a plurality of droplets for multiple color tests.
Next, sample processing includes combining the sample aliquots with different color test reagents. The combined sample and reagents are mixed to produce a homogeneous color. This may aid in image processing so that a combined sample has a consistent color without significant color variation. Finally, the resulting color changes of the combined samples are imaged. For example, a camera or other optical sensor may capture visual data (e.g., color images) of the combined sample droplets.
In one example, the analysis and results begins by establishing a baseline and subtracting a background from all images. For example, reflections on the surface of a droplet, portions of the DMF device surrounding that droplet, or extraneous material within the droplet may all be subtracted from the images in order to produce images that only contain the desired color data for each droplet. Next, one or more pre-defined thresholds, corresponding to each color test, are applied to the images. For example, a color value above a first pre-defined threshold may indicate a color that is clearly blue and not green, while a color value below a second pre-defined threshold may indicate a color that is clearly green and not blue. A color value that is between the first pre-defined threshold and the second pre-defined threshold may indicate a color that is neither clearly blue nor clearly green (i.e., inconclusive). It may be appreciated, however, that the multi-step image analysis process mentioned above may not be required, depending on the application.
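The two-threshold decision described above may be sketched as follows. This is an illustrative example only; the function name, threshold values, and color labels are hypothetical and not part of the disclosed implementation.

```python
def classify_color_value(value, lower, upper):
    """Classify a measured color value against two pre-defined thresholds.

    Values above `upper` are treated as clearly blue, values below
    `lower` as clearly green, and values in between as inconclusive.
    """
    if value > upper:
        return "blue"
    if value < lower:
        return "green"
    return "inconclusive"
```

For example, with hypothetical thresholds of 0.4 and 0.6, a measured value of 0.9 would classify as blue, 0.2 as green, and 0.5 as inconclusive.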
Lastly, the objective analysis and results, including any image files captured, may be automatically stored locally within the DMF platform and/or uploaded to a remote device. For example, results of the automated analysis may include a determination as to whether the sample has tested positive for a target chemical. This result may be displayed on the display of the DMF platform. Alternatively, or additionally, the DMF platform may include a wired or wireless communications device that allows the DMF platform to connect to a remote computer over a network, such as the internet. In a wireless example, the DMF platform may include either a cellular radio or a Wi-Fi radio for connecting to the internet. A remote data storage device, such as a database, may store results, imagery, and analysis data for multiple DMF platforms and multiple DMF tests. In a wired example, the DMF platform may be connected directly to another data storage device via USB.
Collecting images of the resultant color changes aids in both developing an objective image analysis method for detection as well as guiding optimization and validation. Because color test interpretation is inherently subjective (visual), assigning a numerical value using color data (e.g., RGB) can provide an unbiased quantitative metric for evaluating results and performance metrics (e.g., detection limits, selectivity, error rates). The image analysis protocol disclosed herein may be refined by generating drug calibration curves using each corresponding color test to evaluate various color models for improved detection (e.g., HSB, CMYK, and LAB color models). Analysis can be individually optimized for each color test, then integrated into a single protocol allowing for an algorithm to perform automated analysis in parallel, indicating the results of each color test without practitioner interpretation required. Results can be exported into a simplified data file with key parameters.
Using objective imaging devices, including smartphones and cameras, image capture can be seamlessly implemented into a forensic laboratory workflow without disruption. Applying image capture to a forensic drug chemist workflow can decrease the burden on the color interpretation and can allow for improved record keeping opportunities. As mentioned above, these collected images can be used for an objective interpretation of color test by assigning a numerical value associated with the color change utilizing several color models (e.g., RGB color model). These numerical values can then be associated with a result from a color test and inform standard operating procedures (SOPs) and use-case improvements based on results from several different sample types, including challenge samples. These image analysis procedures can be applied to collected color test images shared from collaborating forensic laboratories.
In the present disclosure, additional image features may be used for improved analysis, automation, and detection. For example, image correction using background pixels (those not needed for detection) can account for lighting or other unintended differences between images. This can be accomplished using white balancing or other color correction techniques. Additionally, specific pixels of interest can be selected for analysis to exclude any unexpected or outlier pixels. This process can be accomplished using color thresholding or other pixel selection techniques to specify or exclude certain color ranges. Both of these described processes can be controlled in an automated and unbiased format for integration into the analysis protocol.
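The image correction and pixel selection processes described above may be sketched as follows. This is a simplified illustration operating on lists of RGB tuples rather than full image arrays; the function names and the single-channel selection criterion are hypothetical.

```python
def white_balance(pixels, reference_white):
    """Scale each RGB channel so a known-white background patch maps to
    pure white, correcting for lighting differences between images.

    `pixels` is a list of (r, g, b) tuples in 0-255; `reference_white`
    is the average color measured from the white background region.
    """
    gains = [255.0 / max(c, 1) for c in reference_white]
    return [tuple(min(255, round(c * g)) for c, g in zip(p, gains))
            for p in pixels]


def select_pixels(pixels, channel, lo, hi):
    """Keep only pixels whose chosen channel (0=R, 1=G, 2=B) falls in
    [lo, hi], excluding unexpected or outlier pixels before analysis."""
    return [p for p in pixels if lo <= p[channel] <= hi]
```

In practice, a color-thresholding step like `select_pixels` could exclude, for example, dark plant fragments from a well region before the color statistics are computed.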
As shown in
The sigmoidal curve in
Once the DMF platform has been adapted and utilized for color testing, the methods disclosed herein can be applied for use with routine drug immunoassays. Similar to color testing, immunoassays may test for drugs including Cannabis/THC, methamphetamine, cocaine, heroin, and fentanyl. Translating immunoassays onto the DMF platform may follow a similar process to translating color tests onto the DMF platform discussed above. However, unlike color tests where each separate reagent indicator binds or detects the drug compounds using differing mechanisms, the overall immunoassay mechanism is the same across all drugs and kits. It is appreciated that various immunoassay formats may be translated on the DMF platform without departing from the scope of the subject matter described herein where such immunoassay formats include, but are not limited to, sandwich, competitive, and antigen-down formats.
In one example, routine immunoassay kits are competitive assays that utilize the competition between enzyme-labeled drug and free drug from the sample for a fixed amount of drug-specific antibody binding sites. This creates an indirect relationship between drug concentration and enzyme activity. In terms of a color change in this example, as the concentration of drug in the sample increases, the color response decreases. Since the objective image analysis method is based on an overall change in color response, any change in color can be detected. Additionally, by having the same mechanism for all immunoassays (and only the drug-specific antibody and enzyme-labeled drug changes), translating each immunoassay onto the DMF platform is more streamlined.
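The indirect relationship between drug concentration and color response in a competitive assay may be sketched with a standard four-parameter logistic model. This is an illustrative model only; the parameter values and function signature are hypothetical and do not correspond to any specific assay kit.

```python
def competitive_response(drug_conc, ic50, top=1.0, bottom=0.0, slope=1.0):
    """Four-parameter logistic model of a competitive immunoassay:
    the color response decreases as free drug concentration increases,
    because free drug displaces enzyme-labeled drug from a fixed
    number of antibody binding sites."""
    if drug_conc <= 0:
        return top  # no free drug: maximum color response
    return bottom + (top - bottom) / (1.0 + (drug_conc / ic50) ** slope)
```

At the assay midpoint (concentration equal to `ic50`), the modeled response is halfway between `top` and `bottom`, and the response falls monotonically as concentration increases.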
In one embodiment, the DMF method for sample and reagent control only needs to be optimized for one assay and then the same can be applied for the remaining assays. Correspondingly, the image analysis method only needs to be optimized with one assay and then can be applied for use with all assays. Then, the image analysis method can be implemented to determine detection limits and related studies.
In another embodiment, the DMF method for sample and reagent control will need to be optimized for all immunoassays and the image analysis method will need to be optimized and applied for each assay. This is because each assay may have different detection limits, etc. to factor in.
As mentioned previously, drawbacks of the simplistic and subjective visual analysis approach routinely used with color tests, such as variations in the interpretation of a perceived color, may be overcome using image analysis, where the inherent color information in an image resulting from a 4-AP color change can provide an objective approach for interpreting and reporting the results. For example, image analysis compatible with current laboratory workflows was explored for the 4-AP test. THC and cannabidiol (CBD) may be successfully detected down to 0.05 mg mL−1 using an image analysis approach. Additionally, threshold values may be defined for objectively interpreting and reporting results using drug standards and 35 different Cannabis plant samples of varying THC to CBD ratios. This new objective analysis approach will be discussed in greater detail below.
The method may begin by selecting chemicals. Chemicals may include Δ-9-tetrahydrocannabinol and cannabidiol at 20 mg mL−1 in ethanol, Δ-9-tetrahydrocannabinol, cannabidiol, Δ9-tetrahydrocannabinolic acid A, Δ-8-tetrahydrocannabinol, and cannabinol at 1 mg mL−1 in methanol, 4-aminophenol, hydrochloric acid, ethanol, and sodium hydroxide. 4-AP reagents may be stored in amber bottles in a 4° C. refrigerator. Reagent A may consist of 75 mg 4-AP, 248.75 mL ethanol, and 1.25 mL 2M hydrochloric acid, and Reagent B may consist of 12 g NaOH, 120 mL deionized water, and 280 mL ethanol.
Next, the method may include color test operation. When using standard drug solutions, the 4-AP color test may be performed by first adding 10 μL of standard drug solution to a polystyrene spot plate well, followed by 500 μL of Reagent A and 100 μL of Reagent B sequentially. When using Cannabis plant reference samples, a scoop tool may be used to add sample to the spot plate well, followed by the addition of 4-AP reagents using the identical standard drug solution procedure. The amount of Cannabis sample collected using the scoop tool may be measured as ≈2 mg.
Next, the method may include identifying reference Cannabis samples. Homogenized Cannabis reference samples may be either hemp, marijuana, or a prepared mixture of hemp and marijuana to provide a range of THC to CBD concentration ratios. Each reference sample may be quantified using liquid chromatography with a photodiode array detector for CBD, CBDA, total CBD, THC, THCA, and total THC compositions.
Finally, the method may include imaging and analysis. The imaging setup may use a smartphone positioned directly above the spot plate on a stand for consistent imaging. Before imaging a set of color tests, the smartphone camera may be put into AE-AF lock mode using spot plate wells filled with water to lock the focus and exposure settings. Images may be extracted from the smartphone for downstream processing. Resultant color changes within the spot plate wells of each image may be cropped to include only one well and saved as TIF files. The cropped images may be processed to correct image variabilities or subtract background image features, as needed, and specified color data may be extracted using either ImageJ or MATLAB. Hue values may be reported as arbitrary units and used for data plots and defining threshold values.
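Extracting a hue value from the pixels of a cropped well image may be sketched as follows using Python's standard `colorsys` module in place of ImageJ or MATLAB. The function name is hypothetical, and the simple arithmetic mean shown here ignores hue wraparound near red (hue values near both 0 and 1 appear red, as discussed below).

```python
import colorsys


def mean_hue(pixels):
    """Average hue (arbitrary units, 0-1) over the selected pixels of a
    cropped well image. `pixels` is a list of (r, g, b) tuples in 0-255.

    Note: a plain mean ignores hue wraparound near red, where values
    near 0 and 1 represent similar colors.
    """
    hues = [colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)[0]
            for r, g, b in pixels]
    return sum(hues) / len(hues)
```

For example, a region of pure blue pixels yields a hue of approximately 0.667 arb. units, while pure red yields 0.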
Referring to
Defining and validating threshold values for analysis is the predominant technique used for objectively interpreting and reporting results in a drug "present" or "not present" (yes/no) format. Predefined thresholds associated with the resultant color changes for the 4-AP test can be used to determine if a Cannabis sample is either THC- or CBD-rich. Since the resultant color changes from the 4-AP color test can vary, unlike color tests that result in a monotone or singular color change, a standard curve was generated over a wide range of concentration ratios of THC to CBD to determine a method for image analysis and defining thresholds. Standard solutions were prepared at consistent THC concentration (1.0 mg mL−1 or 0.1% w/v THC) and varied CBD concentrations to generate THC to CBD concentration ratios of 15:1, 10:1, 5:1, 4:1, 3:1, 2:1, 1.5:1, 1:1, 1:1.5, 1:2, 1:3, 1:4, 1:5, 1:10, 1:15, and 1:40. The 4-AP color test was performed by first adding the THC to CBD concentration ratio mixture to a spot plate well, followed by the addition of Reagent A and then Reagent B. Images were taken of the resultant color changes at 1 min, 2 min, 3 min, and 5 min timepoints to measure any temporal differences in the color changes.
Referring to
The limit of detection (LOD) for THC with the 4-AP test using image analysis was empirically determined to be 0.05 mg mL−1 for both analysis formats without image processing (or using the raw smartphone images) and with image processing (image correction and pixel selection). The LOD was defined by a hue value greater than 3-times the standard deviation (+3σ) of the hue value for 0 mg mL−1 THC at each timepoint.
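The +3σ detection rule described above may be sketched as follows. This is an illustrative example only; the function names and the blank hue values used below are hypothetical.

```python
import statistics


def detection_threshold(blank_hues):
    """Hue threshold for the limit of detection: the mean hue of the
    blank (0 mg/mL) measurements plus 3 times their standard deviation."""
    return statistics.mean(blank_hues) + 3 * statistics.stdev(blank_hues)


def is_detected(sample_hue, blank_hues):
    """A sample is considered detected when its hue exceeds the
    blank-derived threshold."""
    return sample_hue > detection_threshold(blank_hues)
```

For example, blank replicates of 0.10, 0.11, and 0.12 arb. units give a threshold of approximately 0.14 arb. units, so a sample hue of 0.20 would register as detected while 0.12 would not.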
Table 1 below shows empirically determined LODs for THC and CBD at each imaging timepoint (1 min, 2 min, 3 min, and 5 min) with and without image processing.
Predefined threshold values can be used for simple analysis of an unknown sample. Based on the standard curve data for the 3 min timepoint shown in
Several ground Cannabis samples with known THC and CBD concentrations were used. To simplify test operation towards increased ease-of-use in a forensic laboratory, a ≈2 mg scoop measuring tool was used for adding Cannabis plant samples to the spot plates. After the Cannabis samples were added, the 4-AP color test was performed following the standard solution protocol (500 μL of Reagent A and then 100 μL of Reagent B). Image analysis results for the Cannabis samples are shown in
Threshold values for a THC-rich Cannabis sample were defined as the average hue value plus and minus 3-times the standard deviation. Unknown Cannabis samples that result in hue values that fall within 0.505 arb. units to 0.672 arb. units can be considered THC-rich. Although some samples resulted in a hue value within the THC-rich threshold value range and contained a total THC greater than 0.3%, these samples were not included in the threshold calculation because of the high total % CBD content.
Due to the process that linearizes hue data, values near 0 and 1 arb. units are both red in color, requiring two threshold value ranges. Therefore, the first range of threshold values for a CBD-rich Cannabis sample was defined as the average hue value plus and minus 3-times the standard deviation from a first set of samples, and the second range of threshold values was defined in the same manner from a second set of samples. Unknown Cannabis samples that result in hue values that fall within 0.093 arb. units to 0.106 arb. units or 0.828 arb. units to 1.026 arb. units can be considered CBD-rich.
Values that fall between 0.675 arb. units and 0.828 arb. units can be considered Cannabis samples with similar THC and CBD concentrations and would likely result in a purple-colored sample. Another sample that fell within this range was a marijuana sample with increased CBD content, which may have resulted in a value within this mixed-color range. Lastly, a marijuana sample that fell below the THC-rich threshold value (0.489±0.009 arb. units) may be explained by the resultant colored product being green, rather than the expected blue color for the 4-AP test.
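The threshold-range classification described above may be sketched as follows, using the hue ranges stated in this section. The function name is hypothetical, and hue values falling in the small gaps between the stated ranges (e.g., between 0.672 and 0.675 arb. units) are treated as inconclusive here by assumption.

```python
def classify_cannabis(hue):
    """Classify a 4-AP color test result from its hue value (arb. units)
    using the predefined threshold ranges for THC-rich, CBD-rich, and
    mixed (similar THC and CBD) samples; anything else is inconclusive."""
    if 0.505 <= hue <= 0.672:
        return "THC-rich"
    if 0.093 <= hue <= 0.106 or 0.828 <= hue <= 1.026:
        return "CBD-rich"  # two ranges due to hue wraparound near red
    if 0.675 < hue < 0.828:
        return "similar THC and CBD"
    return "inconclusive"
```

For example, a hue of 0.60 arb. units classifies as THC-rich, 0.90 as CBD-rich, 0.75 as similar THC and CBD, and 0.30 as inconclusive.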
Thus,
As discussed above, the 4-AP color test can determine THC- and CBD-rich drug samples to address recent legislation redefining hemp and removing it from the controlled substances list. Although color tests can offer a method for simple qualitative analysis, drawbacks can include variations in the interpretation of a perceived color. Images are uniquely compatible for use with color tests because of the inherent color information captured within a photo.
An objective image analysis method using a smartphone camera to capture photos can be applied for the detection of THC and CBD-rich samples using the 4-AP test for interpreting and reporting results with quantitative values. Image post-processing may also be performed prior to extracting color information for analysis and interpreting results. Image processing may be used to correct image variabilities and exclude background, environmental features, large Cannabis plant fragments, or blank samples from the image analysis and results. Using objective image analysis, both CBD and THC may be detected down to 0.05 mg mL−1. Additionally, image analysis of 4-AP resultant color changes over varying ratios of THC and CBD (15:1-1:40 THC:CBD) with a constant THC concentration and over varying concentrations of THC and CBD (0%-1% w/v THC and CBD) with a constant 1:1 ratio may show the impact of total concentration in addition to the relative THC to CBD ratio on the results.
In summary, image analysis was applied to 35 different Cannabis samples of known THC and CBD concentrations and ratios. Threshold values were then defined based on these known Cannabis samples to demonstrate use of image analysis as an objective method for reporting results as either THC-rich, CBD-rich, similar THC and CBD concentrations, or unexpected (i.e., inconclusive) results.
It is appreciated that, in some embodiments, the computing system disclosed herein, having one or more hardware computer processors configured to execute software instructions, may utilize machine learning to perform the image analysis disclosed herein.
Machine learning (ML) is the use of computer algorithms that can improve automatically through experience and by the use of data. Machine learning algorithms build a model based on sample data, known as training data, to make predictions or decisions without being explicitly programmed to do so. Machine learning algorithms are used where it is unfeasible to develop conventional algorithms to perform the needed tasks.
In certain embodiments, instead of or in addition to performing the functions described herein manually, the system may perform some or all of the functions using machine learning or artificial intelligence. Thus, in certain embodiments, machine learning-enabled software relies on unsupervised and/or supervised learning processes to perform the functions described herein in place of a human user.
Machine learning may include identifying one or more data sources and extracting data from the identified data sources. Instead of or in addition to transforming the data into a rigid, structured format, in which certain metadata or other information associated with the data and/or the data sources may be lost, incorrect transformations may be made, or the like, machine learning-based software may load the data in an unstructured format and automatically determine relationships between the data. Machine learning-based software may identify relationships between data in an unstructured format, assemble the data into a structured format, evaluate the correctness of the identified relationships and assembled data, and/or provide machine learning functions to a user based on the extracted and loaded data, and/or evaluate the predictive performance of the machine learning functions (e.g., “learn” from the data).
In certain embodiments, machine learning-based software assembles data into an organized format using one or more unsupervised learning techniques. Unsupervised learning techniques can identify relationships between data elements in an unstructured format. In certain embodiments, machine learning-based software can use the organized data derived from the unsupervised learning techniques in supervised learning methods to respond to analysis requests and to provide machine learning results, such as a classification, a confidence metric, an inferred function, a regression function, an answer, a prediction, a recognized pattern, a rule, a recommendation, or other results. Supervised machine learning, as used herein, comprises one or more modules, computer executable program code, logic hardware, and/or other entities configured to learn from or train on input data, and to apply the learning or training to provide results or analysis for subsequent data.
Machine learning-based software may include a model generator, a training data module, a model processor, a model memory, and a communication device. Machine learning-based software may be configured to create prediction models based on the training data. In some embodiments, machine learning-based software may generate decision trees. For example, machine learning-based software may generate nodes, splits, and branches in a decision tree. Machine learning-based software may also calculate coefficients and hyperparameters of a decision tree based on the training data set. In other embodiments, machine learning-based software may use Bayesian algorithms or clustering algorithms to generate prediction models. In yet other embodiments, machine learning-based software may use association rule mining, artificial neural networks, and/or deep learning algorithms to develop models. In some embodiments, to improve the efficiency of the model generation, machine learning-based software may utilize hardware optimized for machine learning functions, such as an FPGA.
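The node-and-split decision-tree generation described above may be illustrated with a one-split decision stump trained on labeled hue measurements. This is a toy sketch only; the function names and the training data shown in the usage note are hypothetical, and a production system would use a full tree-learning library rather than this exhaustive search.

```python
def train_stump(samples):
    """Fit a one-split decision stump on [(value, label), ...] pairs:
    choose the split point that minimizes misclassifications, assigning
    the majority label to each side of the split."""
    def majority(labels):
        return max(set(labels), key=labels.count)

    values = sorted({v for v, _ in samples})
    best_split, best_err, best_rule = None, len(samples) + 1, None
    for lo, hi in zip(values, values[1:]):
        split = (lo + hi) / 2  # candidate split between adjacent values
        left = [lab for v, lab in samples if v <= split]
        right = [lab for v, lab in samples if v > split]
        err = sum(lab != majority(left) for lab in left)
        err += sum(lab != majority(right) for lab in right)
        if err < best_err:
            best_split, best_err = split, err
            best_rule = (majority(left), majority(right))
    return best_split, best_rule


def predict(split, rule, value):
    """Apply the trained stump to a new measurement."""
    return rule[0] if value <= split else rule[1]
```

For example, training on hypothetical labeled hues [(0.10, "CBD-rich"), (0.12, "CBD-rich"), (0.60, "THC-rich"), (0.65, "THC-rich")] learns a split that separates the two classes.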
The system disclosed herein may also be implemented as a client/server type architecture but may also be implemented using other architectures, such as cloud computing, software as a service model (SaaS), a mainframe/terminal model, a stand-alone computer model, a plurality of non-transitory lines of code on a computer readable medium that can be loaded onto a computer system, a plurality of non-transitory lines of code downloadable to a computer, and the like.
The system may be implemented as one or more computing devices that connect to, communicate with, and/or exchange data with each other over a link. Each computing device may be a processing unit-based device with sufficient processing power, memory/storage and connectivity/communications capabilities to connect to and interact with the system. For example, each computing device may be an Apple iPhone or iPad product, a Blackberry or Nokia product, a mobile product that executes the Android operating system, a personal computer, a tablet computer, a laptop computer and the like, and the system is not limited to operate with any particular computing device. The link may be any wired or wireless communications link that allows the one or more computing devices and the system to communicate with each other. In one example, the link may be a combination of wireless digital data networks that connect to the computing devices and the Internet. The system may be implemented as one or more server computers (all located at one geographic location or in disparate locations) that execute a plurality of lines of non-transitory computer code to implement the functions and operations of the system as described herein. Alternatively, the system may be implemented as a hardware unit in which the functions and operations of the back-end system are programmed into a hardware system. In one implementation, the one or more server computers may use Intel® processors, run the Linux operating system, and execute Java, Ruby, regular expressions, Flex 4.0, SQL, etc.
In some embodiments, each computing device may further comprise a display and a browser application so that the display can display information generated by the system. The browser application may be a plurality of non-transitory lines of computer code executed by a processing unit of the computing device. Each computing device may also have the usual components of a computing device such as one or more processing units, memory, permanent storage, wireless/wired communication circuitry, an operating system, etc.
The system may further comprise a server (that may be software based or hardware based) that allows each computing device to connect to and interact with the system such as sending information and receiving information from the computing devices that is executed by one or more processing units. The system may further comprise software- or hardware-based modules and database(s) for processing and storing content associated with the system, metadata generated by the system for each piece of content, user preferences, and the like.
In one embodiment, the system includes one or more processors, server, clients, data storage devices, and non-transitory computer readable instructions that, when executed by a processor, cause a device to perform one or more functions. It is appreciated that the functions described herein may be performed by a single device or may be distributed across multiple devices.
When a user interacts with the system, the user may use a frontend client application. The client application may include a graphical user interface that allows the user to select one or more digital files. The client application may communicate with a backend cloud component using an application programming interface (API) comprising a set of definitions and protocols for building and integrating application software. As used herein, an API is a connection between computers or between computer programs that is a type of software interface, offering a service to other pieces of software. A document or standard that describes how to build or use such a connection or interface is called an API specification. A computer system that meets this standard is said to implement or expose an API. The term API may refer either to the specification or to the implementation.
Software-as-a-service (SaaS) is a software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. SaaS is typically accessed by users using a thin client, e.g., via a web browser. SaaS is considered part of the nomenclature of cloud computing.
Many SaaS solutions are based on a multitenant architecture. With this model, a single version of the application, with a single configuration (hardware, network, operating system), is used for all customers ("tenants"). To support scalability, the application is installed on multiple machines (called horizontal scaling). The term "software multitenancy" refers to a software architecture in which a single instance of software runs on a server and serves multiple tenants. Systems designed in such a manner are often called shared (in contrast to dedicated or isolated). A tenant is a group of users who share a common access with specific privileges to the software instance. With a multitenant architecture, a software application is designed to provide every tenant a dedicated share of the instance, including its data, configuration, user management, tenant individual functionality, and non-functional properties.
The backend cloud component described herein may also be referred to as a SaaS component. One or more tenants which may communicate with the SaaS component via a communications network, such as the Internet. The SaaS component may be logically divided into one or more layers, each layer providing separate functionality and being capable of communicating with one or more other layers.
Cloud storage may store or manage information using a public or private cloud. Cloud storage is a model of computer data storage in which the digital data is stored in logical pools. The physical storage spans multiple servers (sometimes in multiple locations), and the physical environment is typically owned and managed by a hosting company. Cloud storage providers are responsible for keeping the data available and accessible, and the physical environment protected and running. People and/or organizations buy or lease storage capacity from the providers to store user, organization, or application data. Cloud storage services may be accessed through a co-located cloud computing service, a web service API, or by applications that utilize the API.
The following is a description of an exemplary use case illustrating steps for performing image analysis and chemical detection using a software application implementing the functionality disclosed herein, including determining a presence or concentration of a chemical in a sample. In this example, the 4-aminophenol (4-AP) color test is used to screen and classify Cannabis as either cannabidiol (CBD)- or tetrahydrocannabinol (THC)-rich based on the relative THC-to-CBD concentration ratio. Drawbacks of the simplistic and subjective visual approach routinely used with color tests can include variations in the interpretation of a perceived color, whether a consequence of the sample, user, or test chemistry. As an alternative to visual interpretation, objective interpretation can be achieved by implementing imaging and a corresponding image analysis. The software described herein can support the interpretation of the 4-AP color test and may not make recommendations about the testing procedure.
It is appreciated that while this example applies to spot plates rather than test tubes, it is not limited to spot plates. After the 4-AP test Reagent B is added, a photo may be taken of the resultant color after 2-3 minutes using an imaging device (e.g., a camera or smartphone).
Next, an objective image analysis platform may use predefined threshold values to interpret resultant color changes. The predefined threshold values applied for the 4-AP test may be associated with the following expectant results and color changes:
THC-rich: if the concentration of THC is greater than CBD, then the color test results in a blue color.
CBD-rich: if the concentration of CBD is greater than THC, then the color test results in a pink color.
Similar THC and CBD: similar concentrations can result in mixed blue and pink (e.g., purple).
Blank: no THC or CBD present, or the concentration is below the detection limits.
Inconclusive: any resultant color change not associated with the expected results above (e.g., contaminant or different cannabinoid present).
The operation of the software program may begin by importing an image, image file, or imaging data. In one embodiment, one photo is analyzed at a time.
Next, the user may select a white area to color balance the photo. An ideal selection may be an area of white printer paper within the image. A white area may be selected using the cursor. Once an area is selected, a box labeled ‘W’ may appear. This box can then be moved or sized using the cursor and clicking on the circles around the box while dragging to resize. In the example shown, an area of white printer paper within the image is selected.
Next, the user may select one or more well(s) of interest for analysis by clicking on the center of the well(s). The selection may be an area fully encompassed by the resultant colored solution. Once a well is selected, a numbered box may appear. This box can then be moved or sized using the cursor or clicking on the circles around the box while dragging to resize. An example selection that may lead to inadequate results is an area partially containing both the resultant colored solution and the spot plate. The selection sizing feature can be used to size or move the selection box to only contain the resultant colored solution. In the example below, an area fully encompassed by the resultant colored solution is selected for the three used wells. Once a selection is made, notes can be added to each selection as a tag. Tags may be limited to 33 characters and no more than 3 tags per selection in one embodiment. Example tags include case number, evidence number, date, FSSP name, test information, etc.
Finally, the user can review the classifications within the program or, optionally, classifications can be exported and viewed in an Excel file using the ‘Export’ feature. An exported Excel spreadsheet may include various information such as selection numbering, tags added, selection images, the detected color, and the classification (i.e., THC-rich, CBD-rich, Blank, Inconclusive). Additionally, the date of the export may be automatically populated. Other fields can be used to record additional relevant information, as desired.
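The export step can be sketched with the same columns the disclosure describes. The disclosure exports to an Excel spreadsheet; for a self-contained illustration this sketch writes the analogous rows as CSV using only the standard library. The function name `export_results`, the column labels, and the selection-record keys are hypothetical.

```python
# Hedged sketch of the 'Export' step: selection number, tags, detected
# color, classification, and an auto-populated export date.
import csv
import datetime
import io

def export_results(selections):
    """Render selection records as CSV text (stand-in for Excel export)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Selection", "Tags", "Detected color",
                     "Classification", "Export date"])
    today = datetime.date.today().isoformat()  # auto-populated on export
    for i, sel in enumerate(selections, start=1):
        writer.writerow([i, "; ".join(sel["tags"]), sel["color"],
                         sel["classification"], today])
    return buf.getvalue()
```

Because each row carries the tags (case number, evidence number, etc.) alongside the classification and export date, the exported file provides the traceability and review record that visual color tests alone lack.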
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware,
resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” “platform,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium (including, but not limited to, non-transitory computer readable storage media). A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
This application is a continuation of PCT patent application no. PCT/US2023/063121 titled “METHODS, SYSTEMS, AND DEVICES FOR DETERMINING A PRESENCE OR CONCENTRATION OF A CHEMICAL IN A SAMPLE BASED ON IMAGE ANALYSIS,” filed Feb. 23, 2023, which claims the benefit of priority of U.S. provisional patent application No. 63/314,532 titled “DIGITAL MICROFLUIDIC PLATFORM FOR AUTOMATED CHEMICAL DETECTION,” filed Feb. 28, 2022, both of which are incorporated herein in their entireties by this reference.
Number | Date | Country
---|---|---
63314532 | Feb 2022 | US
| Number | Date | Country
---|---|---|---
Parent | PCT/US23/63121 | Feb 2023 | WO
Child | 18810821 | | US