Methods for constructing a color composite image

Information

  • Patent Grant
  • Patent Number
    10,868,958
  • Date Filed
    Thursday, February 13, 2020
  • Date Issued
    Tuesday, December 15, 2020
Abstract
A method is provided for constructing a composite image. The method comprises illuminating an object with light of a particular spectral band, capturing a digital image of the illuminated object using a monochromatic image sensor of an imaging device to obtain a monochrome image, repeating the steps of illuminating and capturing to obtain a plurality of monochrome images of the object illuminated by light of a plurality of different spectral bands, processing the plurality of monochrome images to generate image data for one or more output channels, and generating a color composite image from the image data. The color composite image comprises the one or more output channels.
Description
FIELD OF THE INVENTION

The present invention relates to digital imaging and more specifically, to methods for constructing a color composite image.


BACKGROUND

Generally speaking, digital imaging devices fall into one of two categories: monochromatic imaging devices and color imaging devices. Monochromatic imaging devices employ a single (broad or narrow) spectral illumination band paired with a monochromatic image sensor (i.e., an image sensor that does not use multiple colored, spatially separate red-green-blue (RGB) filters) for capturing black-and-white images. Color imaging devices employ a single broad visible spectral band paired with a color-filtered image sensor for capturing color images using an RGB filter pattern. The three output channels (RGB) of the color images are displayed on an industry-standard RGB monitor. Monochromatic image sensors, however, offer higher sensitivity and better resolution than color image sensors. For example, barcode scanners using color image sensors may suffer performance drawbacks.


Therefore, a need exists for methods for constructing color composite images using a monochromatic image sensor. A further need exists for constructing color composite images with higher sensitivity and better resolution.


SUMMARY

Accordingly, in one aspect, the present invention embraces a method for constructing a composite image. The method comprises illuminating an object with light of a particular spectral band, capturing a digital image of the illuminated object using a monochromatic image sensor of an imaging device to obtain a monochrome image, repeating the steps of illuminating and capturing to obtain a plurality of monochrome images of the object illuminated by light of a plurality of different spectral bands, processing the plurality of monochrome images to generate image data for one or more output channels, and generating a color composite image from the image data. The color composite image comprises the one or more output channels.


In another aspect, the present invention embraces a method for constructing a color composite image. The method comprises capturing a plurality of digital monochrome images with a monochromatic image sensor, processing the plurality of digital monochrome images to generate image data for one or more output channels, and generating the color composite image from the image data. Each digital monochrome image in the plurality of digital monochrome images is captured while the object is illuminated with light of a different spectral band. The color composite image comprises the one or more output channels.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the present invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 graphically depicts an exemplary imaging device for performing the steps of a method for constructing a color composite image, according to various embodiments of the present invention;



FIG. 2 is a schematic block diagram of the components of the exemplary imaging device shown in FIG. 1, according to various embodiments of the present invention;



FIGS. 3 and 4 graphically depict two cross sections of the exemplary imaging device shown in FIG. 1, according to various embodiments of the present invention;



FIG. 5 graphically depicts a cross section of the exemplary imaging device of FIG. 1, illustrating exemplary locations of the various light sources and the monochromatic image sensor, according to various embodiments of the present invention;



FIG. 6 graphically illustrates exemplary spectral profiles of the various light sources used in methods for constructing a color composite image, according to various embodiments of the present invention; and



FIG. 7 is a flow diagram of a method for constructing a color composite image, according to various embodiments of the present invention.





DETAILED DESCRIPTION

The present invention embraces methods for constructing a color composite image. Various embodiments provide different ways of combining different color channels to view different sources of image data across the electromagnetic spectrum.


As used herein, the term “true-color composite image” refers to an image that approximates the range of vision of the human eye, and hence a “true-color composite image” appears close to what one would expect to see in a normal photograph. A “false color composite image” is an image that uses visible colors to represent portions of the electromagnetic spectrum outside the typical range of vision, allowing the image to provide data that is otherwise invisible to the naked eye. A “composite image” is a combined image made up of more than one constituent image. False color images may be used to enhance, contrast, or distinguish details. In contrast to a true color image, a false color image sacrifices natural color rendition to ease the detection of features that are not readily discernible otherwise, for example, the use of near infrared for the detection of vegetation in images. While a false color image can be created using solely the visible spectrum, some or all of the image data used may be from electromagnetic (EM) radiation outside the visible spectrum (e.g., infrared or ultraviolet). The choice of spectral bands is governed by the physical properties of the object under investigation. As the human eye uses three spectral bands, three spectral bands are commonly combined into a false color image. At least two spectral bands are needed for a false color encoding, and it is possible to combine more bands into the three visual RGB bands, with the eye's ability to discern three channels being the limiting factor. For a true color image, the red, green, and blue (RGB) spectral bands from the camera are mapped to the corresponding RGB channels of the image, yielding an RGB-to-RGB mapping. For a false color image, this relationship is changed. The simplest false color encoding is to take an RGB image in the visible spectrum but map it differently, e.g., GBR-to-RGB. With a false color composite image, at least one of the red, green, and blue channels is supplemented by data that does not correspond to its originally intended color band (for example, [R,G,B]=[IR,G,UV]).
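By way of illustration, the following is a minimal sketch of the [R,G,B]=[IR,G,UV] mapping described above, written in Python with numpy; the per-band normalization is an assumption added here so that each band remains visible when displayed, and is not prescribed by the present description.

    import numpy as np

    def false_color_composite(ir_band, g_band, uv_band):
        """Map three monochrome band images to the R, G, and B output
        channels of a false color composite: [R, G, B] = [IR, G, UV]."""
        def normalize(band):
            # Scale each band to 0-255 independently so no band dominates.
            band = band.astype(np.float64)
            span = band.max() - band.min()
            return (band - band.min()) / span * 255.0 if span > 0 else band

        # Stack the bands along the last axis in R, G, B order.
        rgb = np.dstack([normalize(ir_band), normalize(g_band), normalize(uv_band)])
        return rgb.astype(np.uint8)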


Referring now to FIG. 1, an imaging device 10 is depicted. The imaging device 10 may be a single, hand-held device such as the hand-held imaging device depicted in FIG. 1. In various embodiments, the imaging device 10 is not handheld, but rather integrated with a supermarket slot scanner, fixedly mounted on a counter top, mounted as an overhead document imager, etc. The imaging device 10 may include a trigger 34 as interface circuitry 57. Activating the trigger 34 (e.g., by pushing a button or touching a specific area on the imaging device 10 (i.e., handheld imager)) causes the imaging device 10 to capture images. Additional components may be included as interface circuitry 57 of FIG. 2 (e.g., indicator LEDs, etc.).


Referring now to FIG. 2, according to various embodiments of the present invention, the imaging device 10 has an imaging subsystem 12 used in combination with multiplexed light sources 16 (e.g., see also FIGS. 3 and 5) of an illumination subsystem 14 to capture digital images of an object 15 (including but not limited to produce, retail merchandise, personal identification, currency, etc.). The imaging subsystem 12 and illumination subsystem 14 are contained within a housing 20 of the imaging device 10. The imaging device 10 generally comprises some or all of the following features: (i) positioning indication/guidance for operator use (i.e., aiming subsystem 40); (ii) a display 22 (FIG. 2); (iii) a communications subsystem 60 communicatively coupled to a point of sale (POS); and (iv) an energy storage/charging scheme. The imaging subsystem 12 and illumination subsystem 14 (FIG. 2) are typically optimized for a particular distance at which the object is illuminated and captured. The imaging device 10 allows a user to image objects in a variety of positions and orientations.


The imaging device 10 uses various colored light sources 16 (or light sources 16 with different spectral profiles) to illuminate the object. Filters 24 (FIG. 2) may be used to create light having different spectral profiles when combined with a broadband (e.g., white light) light source (or light-source combination). The filtering results from the filter's particular transmission, absorption, or reflectance characteristics.


The imaging device 10 may use filters 24 of different construction and/or composition. For example, colored plastic (or glass) may be used or multilayer interference filters may be used. Colored plastic (or glass) filters are relatively insensitive to angular orientation, whereas interference filters may be highly sensitive to angular orientation. Control of the illumination's spectral profile (e.g., color) may be accomplished by controlling the filters 24 and/or the light sources 16 in the imaging device 10. In various embodiments, the filter (or filters) 24 may be positioned in front of a light source 16 and mechanically moved in and out of position to change the spectral profile of the illumination. In various embodiments, a multilayer filter may be positioned in front of a light source 16 and mechanically rotated to change the spectral profile of the illumination. This filter-tuning approach is especially useful for making very fine changes in peak emission wavelength. In various embodiments, diffractive optical elements (e.g., gratings) may be used to produce illumination having different spectral profiles. In various embodiments, multiple light sources 16 (e.g., FIGS. 3 and 5) may be used to produce illumination of various spectral profiles, such as shown in FIG. 6. These multiple light sources may be individually controlled (i.e., turned on and off in various combinations) to produce different illumination spectral profiles.


In various embodiments embraced by the present invention, the various images may be obtained using optical filters 26 positioned in front of a monochromatic image sensor (i.e., in the return path) of the imaging subsystem 12. A benefit to using optical filters in this way is that the spectral profile of the light reaching the monochromatic image sensor 28 is controlled, even if ambient light levels vary (e.g., vary in intensity, color, etc.). The optical filters 26 used in the return path (i.e., receive path) of the imaging subsystem 12 may be of various constructions and/or compositions. For example, colored (dyed) plastic, colored glass, or interference (i.e., multilayer, dichroic, etc.) filters may be used. Colored plastic and glass filters are relatively insensitive to angular orientation, whereas interference filters may be highly sensitive to angular orientation.


In various embodiments, multiple optical filters 26 may be placed in the return path and may be mechanically moved in and out of position to change the spectral profile of the light reaching the monochromatic image sensor 28. In various embodiments, the angular orientation of an interference filter in front of the monochromatic image sensor 28 may be changed to tune the spectral profile precisely. Similarly, diffractive optical elements (e.g., gratings) may be used to filter the light reaching the monochromatic image sensor.


The present invention embraces multispectral imaging of objects with the imaging device 10 to construct color composite images. Multiple light sources 16 and/or filters 24 may be used to provide illumination having various spectral profiles. For each illumination, the imaging subsystem 12 may be controlled (i.e., exposure control) to capture digital images. The present invention embraces different methods for controlling multiple illumination devices (i.e., strings of LEDs, LED arrays, etc.), each having a different spectral profile. The multispectral illumination could be a simple RGB set of LEDs (three separate LED colors) or it could be a hyperspectral set of LEDs extending into the UV or IR ranges. Any combination of these LEDs can be flashed simultaneously or separately to capture different color image data using the monochrome image sensor.
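As a rough sketch of such a multiplexed capture sequence (in Python, with set_leds and capture_frame as hypothetical stand-ins for whatever LED-driver and sensor calls a real device exposes):

    def capture_band_set(device, band_combinations, exposure_ms=10):
        """Capture one monochrome frame per illumination combination.
        band_combinations may hold single bands (e.g., [["red"], ["green"],
        ["blue"]]) or hyperspectral combinations extending into the UV or IR
        (e.g., [["uv"], ["red", "green", "blue"], ["ir"]])."""
        frames = {}
        for bands in band_combinations:
            device.set_leds(on=bands)          # flash this LED combination
            frames[tuple(bands)] = device.capture_frame(exposure_ms=exposure_ms)
            device.set_leds(on=[])             # all LEDs off between frames
        return frames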


The present invention embraces minimizing specular reflections from an object by controlling polarization of the illumination light and the light detected by the monochromatic image sensor 28. Specifically, the illumination light may be polarized in a particular direction and the light captured by the monochromatic image sensor is polarized in a direction orthogonal to the particular direction (if polarizers 30 and 32 are used). In this way, the light reflected from the object is filtered (i.e., by its polarization) to remove the polarization of the illuminating light. As diffuse reflected light is largely unpolarized, a portion of the diffuse reflected light will reach the monochromatic image sensor 28. As the specular reflected light is largely polarized in the direction of the illumination, the specular reflected light will be substantially blocked. In various embodiments, a linear polarizer may be positioned in front of the illumination subsystem and a crossed polarizer may be positioned in front of the monochromatic image sensor. In this way, very little light from the illuminator or from specular reflection is detected by the monochromatic image sensor.


Still referring to FIG. 2, according to various embodiments of the present invention, the imaging device 10 further comprises a processor 36 (also referred to herein as processing circuitry) communicatively coupled to the imaging subsystem 12 and the illumination subsystem 14. The processor 36 is configured by software 38 (stored, for example, in a storage device 42 or memory 44 of the imaging device 10) to activate one or more of the light sources 16 in the illumination subsystem 14 to illuminate the object 15 with light of a particular spectral band, capture a digital image of the illuminated object using the monochromatic image sensor to obtain a monochrome image, repeat the illuminating and capturing until a plurality of digital monochrome images of the object have been captured, process the plurality of monochrome images to generate image data for one or more output channels of a color composite image, and generate the color composite image from the image data. The storage device 42 of FIG. 2 is also depicted as including an operating system 46. “Output channels” may also be referred to herein as “display channels”. Output channels comprise image data representing color. A color image comprises more than one output channel.


In various embodiments of the present invention, the imaging device 10 further comprises an aiming subsystem 40 capable of projecting two different targeting patterns, one for each of two modes of operation. In a first mode, one light pattern will be projected into the field of view (FOV) of the imaging device 10. If the mode of operation is changed, a different pattern will be projected. The targeting pattern may alert the operator of the mode and/or the mode change. The aiming subsystem 40 may be communicatively coupled to a mode-selection switch and has one or more aiming-light sources 41 and optics 43 for projecting (i) a first targeting pattern into the field of view when the imaging device is in indicia reading mode and (ii) a second targeting pattern into the field of view when the imaging device is in a color composite image construction mode as hereinafter described. The one or more aiming-light sources 41 may include a first laser for radiating light for the first targeting pattern and a second laser for radiating light for the second targeting pattern.


The aiming subsystem 40 may project the targeting pattern into the field of view using a variety of technologies (e.g., aperture, diffractive optical element (DOE), shaping optics, etc.), referred to collectively as projection optics 43 (FIG. 2). A combination of technologies may also be used to create the two targeting patterns. In various embodiments, two diffractive rectangular patterns may be used. For barcodes, a pattern with a square aspect ratio could be projected, while for color composite image construction mode a pattern with a selected aspect ratio may be projected (e.g., a 2×1 aspect ratio). In various embodiments, a red line pattern may be projected for barcodes, while a green line pattern may be projected for the color composite image construction mode. In various embodiments, a red rectangular area for barcodes may be projected from an LED, while a green crosshair is projected for color composite image construction mode from a DOE. The present invention envisions any combination of technology and patterns that produce easily visualized modes of operation.


The imaging device 10 envisioned by the present invention requires significant energy to provide the high-intensity illumination and fast image-capture necessary for operation. As a result, the current consumption required by the imaging device may exceed the current limits (e.g., 500 milliamps) of a typical power source 62 (e.g., USB) (FIG. 2). For example, current consumption of the illumination subsystem may exceed the power limits of a USB connector if multiple illuminations/image-captures are required.


The imaging device 10 may store energy in an optional energy storage element 50 (FIG. 2) during periods of rest (i.e., nonoperation) and then use the stored energy for illumination, when high current is required. In various embodiments, the optional energy storage element 50 is at least one super-capacitor capable of supplying the illumination subsystem energy without depleting the energy necessary for other operations (e.g., scanning). A typical super-capacitor has enough energy capacity for a sequence of illuminations (i.e., “flashes”) before charging is required. In various embodiments, the optional energy storage element 50 may be a rechargeable battery. The battery may be charged when image capture is not required and then may be used to provide energy for the sequences of “flashes” during image capture.
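A back-of-the-envelope calculation shows why a super-capacitor can cover such a sequence of flashes; all numbers below are illustrative assumptions, not values taken from this description:

    # Assumed values: a 1 F super-capacitor charged to 5 V, and a flash
    # drawing 2 A at 4 V for 10 ms per exposure.
    capacitance_f = 1.0
    voltage_v = 5.0
    stored_j = 0.5 * capacitance_f * voltage_v ** 2   # E = C*V^2/2 = 12.5 J

    per_flash_j = 2.0 * 4.0 * 0.010                   # I*V*t = 0.08 J per flash

    # Roughly 150 flashes per charge, ignoring losses and the fact that
    # only part of the stored energy is usable before the voltage sags.
    print(stored_j / per_flash_j)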


The present invention also embraces integrating the optional energy storage element (or elements) 50 outside the housing 20 of the imaging device 10. For example, the storage element 50 may be incorporated inside the power/data cable of the imaging device 10. In this case, efficient charging may be accomplished using a current limiting resistor directly from the power source. The storage element may also be distributed along the cable, using the length of the cable and multiple layers to create a “cable battery” or “cable capacitor”.


While various components of an exemplary imaging device (such as imaging device 10 of FIG. 1) are depicted in FIG. 2, it is to be understood that there may be fewer or more components in the imaging device 10, and their location within and/or outside the imaging device may vary.


Referring now to FIG. 7, according to various embodiments of the present invention, a method 700 for constructing a color composite image comprises illuminating, with an activated imaging device 10, the object 15 with light of a particular spectral band (step 710) (“applied illumination”). The object may be illuminated with light of a single spectral band or with light of one or more spectral bands at the same time. Illuminating the object with light of the particular spectral band comprises illuminating the object with at least one of visible and near-visible light. Illuminating the object with light of the particular spectral band may comprise illuminating the object with light having a bandwidth between about 10 to about 100 nanometers (i.e., individual spectral bands (in which the color is the same across the width of the spectral band) may be about 10 to about 100 nm wide). The number of different individual spectral bands used to illuminate the object is theoretically unlimited. Standard spectral bands may include red, green, and blue spectral bands, but other visible and/or non-visible spectral bands may be used to illuminate the object. Other individual spectral bands may include, for example, an ultraviolet spectral band, an infrared spectral band, a yellow spectral band, etc.


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image continues by capturing a digital image of the illuminated object using the monochromatic image sensor 28 of the imaging device to obtain a monochrome image (step 720). A “monochrome image” is a digital representation of image data that does not have more than one spectral band. However, a monochrome image can be viewed on a defined color scale (e.g., a pink scale or a rainbow scale) as well as the typical grayscale.


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image continues by repeating the illuminating and capturing steps to obtain a plurality of monochrome images of the object illuminated by light of a plurality of different spectral bands (i.e., capturing a plurality of digital monochrome images with a monochromatic image sensor, each digital monochrome image in the plurality of digital monochrome images illuminated with a different spectral band) (step 740). The captured images may be stored.


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a composite image may continue, prior to processing the plurality of monochrome images to generate image data for one or more output channels, by analyzing the plurality of monochrome images (step 745). The plurality of monochrome images may be analyzed to determine if any (one or more) of the monochrome images needs to be aligned (step 746) and/or if at least one of an illumination setting and a camera setting for at least one monochrome image of the plurality of monochrome images needs adjustment (step 749).


If the results of the analysis (i.e., step 745) indicate a sub-optimal image (or images), at least one of the illumination setting and the camera setting for the at least one monochrome image may be adjusted based on the analysis (step 749). In step 749, illumination and/or camera settings of the imaging device 10 may be adjusted based on the analysis. In various embodiments of the present invention, the control methods provide variable sequences, durations, and intensities for multi-wavelength illumination. For example, the illumination may be controlled by adjusting the current for each LED array using DACs, programmable LED drivers (via a serial interface), or PWM controls (duty cycle). In another example, the illumination may be controlled by adjusting the illumination time independently for each of the LED arrays. In another example, the illumination may be controlled by activating different LED arrays in a sequence or activating different LED arrays at the same time. In another example, the illumination may be controlled by adjusting the exposure time of the monochromatic image sensor 28 synchronously with the illumination time and dependent on the type or spectral profile of the LED arrays.
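As one hedged example of duty-cycle control (in Python, with pwm_write as a hypothetical stand-in for the actual LED-driver call):

    def pwm_write(channel, duty_cycle):
        """Hypothetical placeholder for the hardware PWM driver call."""
        ...

    def set_led_intensity(channel, fraction):
        # Average LED current scales with the PWM duty cycle, so clamping
        # the requested fraction to [0, 1] bounds the drive current.
        duty = max(0.0, min(1.0, fraction))
        pwm_write(channel, duty_cycle=duty)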


Other parameters that may be adjusted include illumination pulse characteristics (e.g., frequency, duty cycle, waveform), analog or digital gain, and sensor exposure time. The adjustment is determined by analyzing the images for image quality, such as brightness and signal-to-noise ratio. After adjustment, the method 700 for constructing a color composite image may return to step 710 as depicted before once again obtaining a plurality of monochrome images of the object illuminated by light of different spectral bands (step 740). Step 745 may be omitted in its entirety. The method 700 for constructing a color composite image may proceed directly from step 740 to step 746, from step 740 to step 748, or from step 740 to step 750 as depicted in FIG. 7.
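A minimal sketch of such an image-quality check follows (in Python with numpy); the target brightness, tolerance, and SNR threshold are illustrative assumptions:

    import numpy as np

    def needs_adjustment(img, target_mean=128.0, tol=0.25, min_snr=10.0):
        """Flag a monochrome frame whose brightness or signal-to-noise
        ratio falls outside acceptable bounds (steps 745/749)."""
        img = img.astype(np.float64)
        mean = img.mean()
        snr = mean / (img.std() + 1e-9)   # crude global SNR estimate
        too_dark = mean < target_mean * (1.0 - tol)
        too_bright = mean > target_mean * (1.0 + tol)
        return too_dark or too_bright or snr < min_snr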


If the images are determined to be optimal (i.e., requiring no adjustment of an illumination setting and/or a camera setting), the method 700 for constructing a color composite image may proceed to step 746 of “Aligning one or more monochrome images” before proceeding to step 748 or the processing step (step 750). Step 748 may be omitted. In step 746, one or more of the monochrome images obtained in step 740 may be aligned to each other digitally.
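One common way to align frames digitally is phase correlation; the following is a minimal sketch (in Python with numpy) that recovers an integer translation only, whereas a real device may also need to handle rotation or scale:

    import numpy as np

    def align_by_phase_correlation(ref, img):
        """Shift img so that it registers with ref."""
        F_ref = np.fft.fft2(ref.astype(np.float64))
        F_img = np.fft.fft2(img.astype(np.float64))
        cross = F_ref * np.conj(F_img)
        cross /= np.abs(cross) + 1e-12        # normalized cross-power spectrum
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Interpret peaks past the midpoint as negative displacements.
        if dy > ref.shape[0] // 2:
            dy -= ref.shape[0]
        if dx > ref.shape[1] // 2:
            dx -= ref.shape[1]
        return np.roll(img, (dy, dx), axis=(0, 1))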


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image may continue by adjusting for contaminating noise in the one or more monochrome images (step 748). Adjusting for contaminating noise may occur during the processing step 750. Contaminating noise is from ambient light. Adjusting for contaminating noise comprises capturing a digital image of the object using the monochromatic image sensor without applied illumination to obtain a reference monochrome image that is representative of contaminating noise from ambient light. Capturing the digital image of the object without applied illumination (but with ambient illumination) may occur simultaneously with step 740. An unknown amplitude (A) of the contaminating noise in a particular monochrome image is determined from the difference in exposure time between that monochrome image and the reference monochrome image. The unknown amplitude may be the same or different in different monochrome images. The contaminating noise is then subtracted from the one or more monochrome images of the plurality of monochrome images (i.e., “subtracting” the contaminating noise comprises subtracting a scaled proportion of the reference monochrome image from each of the captured images, the scaled proportion being determined by the ratio of the captured image exposure to the reference monochrome image exposure). Capturing an image with only ambient illumination is useful because the ambient exposure itself is uncontrolled and can therefore be called noise; subtracting it from the other channels removes the uncontrolled component, leaving only the controlled component. The optional “light source sensing subsystem” 70 and photodiode 72 of FIG. 2 can be used for subtracting ambient light from the other channels; the light source sensing subsystem 70 detects an additional aspect of the ambient light in order to make the subtraction more effective.
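In Python with numpy, the exposure-scaled subtraction described above reduces to a few lines (a minimal sketch):

    import numpy as np

    def subtract_ambient(frame, frame_exposure_ms, ref, ref_exposure_ms):
        """Remove the ambient-light component from an illuminated frame
        using a reference frame captured with no applied illumination.
        The ambient contribution grows with exposure time, so the
        reference is scaled by the ratio of the two exposures."""
        scale = frame_exposure_ms / ref_exposure_ms
        corrected = frame.astype(np.float64) - scale * ref.astype(np.float64)
        return np.clip(corrected, 0, None)    # the estimate may overshoot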


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image continues by processing the plurality of monochrome images (after step 740) to generate image data for one or more output channels of the color composite image to be generated (step 750). The color composite image that is generated in step 760 from the image data as hereinafter described comprises the one or more output channels. The plurality of monochrome images is processed using an editing computer program that may be stored in the memory 44 of the imaging device 10, or elsewhere (e.g., on a scanner, a PC, or a display device such as a tablet, etc.).


Processing the plurality of monochrome images to generate image data comprises linearly combining individual spectral bands using the editing computer program; higher order combinations thereof may also be used to generate the image data. Processing the plurality of monochrome images to generate image data comprises processing one or more monochrome images of the plurality of monochrome images by at least one of:

    • mapping each particular spectral band in the one or more monochrome images to an output channel of the one or more output channels;
    • adding a percentage of one monochrome image to another monochrome image;
    • subtracting a percentage of one monochrome image from a different monochrome image;
    • multiplying one monochrome image by a different monochrome image;
    • dividing one monochrome image by a different monochrome image;
    • applying a positive or negative multiplier to the one or more monochrome images;
    • applying a positive or negative offset value to the one or more monochrome images; and
    • applying a positive or negative exponent to the one or more monochrome images.


The primary elements of the step “processing digital images” would be independently scaling the different color images to the proper brightness and doing any color channel combinations desired to produce the true or false color image of interest (for example, subtracting off an ambient light component from the color channel).


The particular spectral bands can be mapped to the three visible output channels: red, green, and blue. For mapping each particular spectral band in the one or more monochrome images to an output channel of the one or more output channels, the simplest example is mapping the red-green-blue spectral bands in the one or more monochrome images to the red-green-blue output channels of a red-green-blue color composite image. Starting with a monochrome image (a grayscale image) and using computer software, a color is assigned to each of the individual spectral bands. For example, infrared and UV spectral bands may be used for illuminating the object. The human eye is not sensitive to either infrared or ultraviolet. Therefore, to construct a viewable color composite image that includes image data about captured infrared light, the image data must be represented with colors that can be seen, i.e., red, green, and blue. Therefore, image data about the infrared light can be assigned the color red, green, or blue. Red, green, or blue can be used to represent any of the wavelength ranges, so many color combinations are possible; by making images with different band combinations, an individual can see more than would otherwise be visible. There can be images of the same scene, taken with light of different wavelengths, and the wavelength ranges (spectral bands) are combined to generate the color composite image. Normalization of the output channels by one particular channel may be desired if there is a common spatially structured noise in the images. A normalized output channel is a single channel multiplied by the inverse of another channel (or divided by the other channel) and multiplied by a positive factor. Another example is simple channel substitution, in which one color channel is substituted for another to generate a false color composite image. For example, an infrared spectral band may be substituted for red.
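Both normalization and substitution are simple per-pixel arithmetic, for example (in Python with numpy; the small epsilon is an added guard against division by zero):

    import numpy as np

    def normalize_channel(channel, by, factor=1.0):
        """Normalize one output channel by another: divide elementwise
        and scale by a positive factor, suppressing spatially structured
        noise common to both channels."""
        return factor * channel.astype(np.float64) / (by.astype(np.float64) + 1e-9)

    def substitute_red_with_ir(rgb, ir_band):
        """Channel substitution: replace the red channel with the
        infrared band to produce a false color composite."""
        out = rgb.copy()
        out[..., 0] = ir_band
        return out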


Adding a percentage of one monochrome image to another monochrome image comprises multiplying the pixel values of one monochrome image by a constant and adding those values to the corresponding pixels of another monochrome image.


Subtracting a percentage of one monochrome image from a different monochrome image comprises multiplying the pixel values of one monochrome image by a constant and subtracting those values from the corresponding pixels of another monochrome image.


Multiplying one monochrome image by a different monochrome image comprises multiplying the pixel values of one monochrome image by the corresponding pixel values of another monochrome image.


Dividing one monochrome image by a different monochrome image comprises dividing the pixel values of one monochrome image by the corresponding pixel values of another monochrome image.


Applying a positive or negative multiplier to the one or more monochrome images comprises multiplying the pixel values of the one or more monochrome images by a constant. A positive multiplier is any constant positive real number whereas a negative multiplier is any constant negative real number.


Applying a positive or negative offset value to the one or more monochrome images comprises adding a constant value to the pixel values of the one or more monochrome images.


Applying a positive or negative exponent to the one or more monochrome images comprises raising the pixel values of the one or more monochrome images to a constant power.
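Gathering the operations defined above into one place, a sketch in Python with numpy (the constants 0.25, 1.5, 20, and 2.2 are illustrative only):

    import numpy as np

    def demonstrate_operations(a_u8, b_u8):
        """Apply the per-pixel operations above to monochrome images a and b."""
        a = a_u8.astype(np.float64)
        b = b_u8.astype(np.float64)
        results = {
            "added":      a + 0.25 * b,                # add a percentage of b to a
            "subtracted": a - 0.25 * b,                # subtract a percentage of b
            "product":    a * b / 255.0,               # multiply, rescaled to 8-bit range
            "ratio":      a / (b + 1e-9),              # divide, guarding against zeros
            "scaled":     1.5 * a,                     # positive (or negative) multiplier
            "offset":     a + 20.0,                    # positive (or negative) offset
            "powered":    (a / 255.0) ** 2.2 * 255.0,  # constant exponent
        }
        # Clip back into the displayable 8-bit range.
        return {k: np.clip(v, 0, 255).astype(np.uint8) for k, v in results.items()}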


One or more of these processing steps may be repeated the same or a different number of times. Processing the one or more monochrome images may result in an adjustment to at least one of the hue, saturation, lightness, chroma, intensity, contrast, and brightness of the composite image.


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image continues by either generating a color composite image from the image data (step 760) or by generating additional image data for the one or more output channels from the image data (step 755) prior to generating the color composite image (step 760). Thus, the color composite image may be generated from image data and/or additional image data. Generating additional image data for the one or more output channels comprises using the editing computer software. Generating the additional image data comprises at least one of:

    • adding image data together;
    • subtracting image data from other image data;
    • multiplying image data by other image data;
    • dividing image data by other image data;
    • applying a positive or negative multiplier to image data;
    • applying a positive or negative offset value to image data; and
    • applying a positive or negative exponent to image data.


Adding image data together comprises adding the pixel values of image data to the pixel values of other image data.


Subtracting image data from other image data comprises subtracting the pixel values of image data from the pixel values of other image data.


Multiplying image data by other image data comprises multiplying the pixel values of image data by the pixel values of other image data.


Dividing image data by other image data comprises dividing the pixel values of image data by the pixel values of other image data.


Applying a positive or negative multiplier to image data comprises multiplying the pixel values of image data by a constant.


Applying a positive or negative offset value to image data comprises adding a constant to the pixel values of image data.


Applying a positive or negative exponent to image data comprises raising the pixel values of image data to a constant power.


Generating additional image data results in adjusting at least one of hue, saturation, lightness, chroma, intensity, contrast, and brightness of the color composite image.


Generating the color composite image from the image data and, optionally, the additional image data comprises assigning the image data and additional image data to the one or more output channels. Step 750 of processing the one or more monochrome images comprises altering, scaling, and combining spectral bands before assigning them to the one or more output channels. Assigning includes re-assigning to one or more different output channels. The one or more output channels may be the color display channels (red, green, and blue) that have visible wavelengths. Wavelengths we see as green are about 525-550 nanometers (nm); wavelengths we see as red are about 630-700 nm; wavelengths seen as blue are about 450-495 nm. To generate the color composite image, three output channels may be produced, but the input to those three channels can be more than three spectral bands, as previously noted. Various embodiments embrace how the output channels are generated.


While three output channels (red, green, and blue output channels) have been described, it may be possible to add a yellow output channel to the red, green, and blue output channels, resulting in a composite image that has a richer color than a composite image with just red, green, and blue output channels. The display 22 must be capable of handling four output channels.


In various embodiments, several “colors” of illumination may be flashed at once. While a reflected image may be obtained by illuminating the object with several narrow bands of light, with each illumination band being associated with an individual image as described above, various embodiments may be directed to illuminating with several bands of light at once, thereby allowing image combination (and false color image construction) with fewer input monochrome images.


At least one image obtained by flashing several colors of illumination can be combined with another image (from a similar multi-band illumination). The images can then be mathematically combined (through addition, subtraction, and/or scaled/multiplicative combinations, etc., as described above in the processing and generating steps) to derive a false color image. This approach could achieve similar “false color” images and feature detection (as previously described), but would require fewer overall (input) images and fewer exposures. One example would be to generate the first image while illuminating with red, green, and blue, then generate a second image with blue illumination only, and then mathematically combine the two images.
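Following the example just given, the combination itself is a single subtraction (a minimal sketch in Python with numpy):

    import numpy as np

    def red_green_from_two_frames(frame_rgb_illum, frame_blue_illum):
        """First frame captured under simultaneous red+green+blue
        illumination, second under blue illumination only; subtracting
        isolates the red+green response without a third exposure."""
        diff = frame_rgb_illum.astype(np.float64) - frame_blue_illum.astype(np.float64)
        return np.clip(diff, 0, 255).astype(np.uint8)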


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a composite image continues by displaying the color composite image (step 770). The color composite image is displayed on, for example, the display 22 for visualization.


Still referring to FIG. 7, according to various embodiments of the present invention, the method 700 for constructing a color composite image may further comprise adjusting a mode of operation (step 780) in the device (a sketch of this logic follows the list below) by:


analyzing the object in the captured digital image for a machine-readable indicium (e.g., a barcode);


detecting the presence or absence of the machine-readable indicium within the captured digital image;


wherein if the machine-readable indicium is detected in the captured digital image, operating the device in the indicia-reading mode wherein digital images are automatically acquired and processed to read indicia;


wherein if the machine-readable indicium is not detected in the captured digital image, operating the imaging device in the color composite image construction mode wherein the digital images are automatically captured to generate the color composite image.
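In outline, the mode-selection logic sketched above amounts to the following (in Python, with detect_indicium as a hypothetical stand-in for a real barcode decoder):

    def detect_indicium(frame):
        """Hypothetical placeholder; a real implementation would run a
        barcode/indicia decoder over the captured frame."""
        return False

    def select_mode(device, frame):
        if detect_indicium(frame):
            device.mode = "indicia_reading"   # auto-acquire and decode indicia
        else:
            device.mode = "color_composite"   # auto-capture frames for compositing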


From the foregoing, it is to be appreciated that various embodiments provide methods for constructing a color composite image. Various embodiments provide different ways of combining different color channels to view different sources of image data across the electromagnetic spectrum.


  • U.S. Patent Application Publication No. 2016/0292477;
  • U.S. Patent Application Publication No. 2016/0294779;
  • U.S. Patent Application Publication No. 2016/0306769;
  • U.S. Patent Application Publication No. 2016/0314276;
  • U.S. Patent Application Publication No. 2016/0314294;
  • U.S. Patent Application Publication No. 2016/0316190;
  • U.S. Patent Application Publication No. 2016/0323310;
  • U.S. Patent Application Publication No. 2016/0325677;
  • U.S. Patent Application Publication No. 2016/0327614;
  • U.S. Patent Application Publication No. 2016/0327930;
  • U.S. Patent Application Publication No. 2016/0328762;
  • U.S. Patent Application Publication No. 2016/0330218;
  • U.S. Patent Application Publication No. 2016/0343163;
  • U.S. Patent Application Publication No. 2016/0343176;
  • U.S. Patent Application Publication No. 2016/0364914;
  • U.S. Patent Application Publication No. 2016/0370220;
  • U.S. Patent Application Publication No. 2016/0372282;
  • U.S. Patent Application Publication No. 2016/0373847;
  • U.S. Patent Application Publication No. 2016/0377414;
  • U.S. Patent Application Publication No. 2016/0377417;
  • U.S. Patent Application Publication No. 2017/0010141;
  • U.S. Patent Application Publication No. 2017/0010328;
  • U.S. Patent Application Publication No. 2017/0010780;
  • U.S. Patent Application Publication No. 2017/0016714;
  • U.S. Patent Application Publication No. 2017/0018094;
  • U.S. Patent Application Publication No. 2017/0046603;
  • U.S. Patent Application Publication No. 2017/0047864;
  • U.S. Patent Application Publication No. 2017/0053146;
  • U.S. Patent Application Publication No. 2017/0053147;
  • U.S. Patent Application Publication No. 2017/0053647;
  • U.S. Patent Application Publication No. 2017/0055606;
  • U.S. Patent Application Publication No. 2017/0060316;
  • U.S. Patent Application Publication No. 2017/0061961;
  • U.S. Patent Application Publication No. 2017/0064634;
  • U.S. Patent Application Publication No. 2017/0083730;
  • U.S. Patent Application Publication No. 2017/0091502;
  • U.S. Patent Application Publication No. 2017/0091706;
  • U.S. Patent Application Publication No. 2017/0091741;
  • U.S. Patent Application Publication No. 2017/0091904;
  • U.S. Patent Application Publication No. 2017/0092908;
  • U.S. Patent Application Publication No. 2017/0094238;
  • U.S. Patent Application Publication No. 2017/0098947;
  • U.S. Patent Application Publication No. 2017/0100949;
  • U.S. Patent Application Publication No. 2017/0108838;
  • U.S. Patent Application Publication No. 2017/0108895;
  • U.S. Patent Application Publication No. 2017/0118355;
  • U.S. Patent Application Publication No. 2017/0123598;
  • U.S. Patent Application Publication No. 2017/0124369;
  • U.S. Patent Application Publication No. 2017/0124396;
  • U.S. Patent Application Publication No. 2017/0124687;
  • U.S. Patent Application Publication No. 2017/0126873;
  • U.S. Patent Application Publication No. 2017/0126904;
  • U.S. Patent Application Publication No. 2017/0139012;
  • U.S. Patent Application Publication No. 2017/0140329;
  • U.S. Patent Application Publication No. 2017/0140731;
  • U.S. Patent Application Publication No. 2017/0147847;
  • U.S. Patent Application Publication No. 2017/0150124;
  • U.S. Patent Application Publication No. 2017/0169198;
  • U.S. Patent Application Publication No. 2017/0171035;
  • U.S. Patent Application Publication No. 2017/0171703;
  • U.S. Patent Application Publication No. 2017/0171803;
  • U.S. Patent Application Publication No. 2017/0180359;
  • U.S. Patent Application Publication No. 2017/0180577;
  • U.S. Patent Application Publication No. 2017/0181299;
  • U.S. Patent Application Publication No. 2017/0190192;
  • U.S. Patent Application Publication No. 2017/0193432;
  • U.S. Patent Application Publication No. 2017/0193461;
  • U.S. Patent Application Publication No. 2017/0193727;
  • U.S. Patent Application Publication No. 2017/0199266;
  • U.S. Patent Application Publication No. 2017/0200108; and
  • U.S. Patent Application Publication No. 2017/0200275.


In the specification and/or figures, typical embodiments of the present invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method for constructing a composite image, the method comprising: illuminating an object with light of a particular spectral band; capturing a digital image of the illuminated object using a monochromatic image sensor of an imaging device to obtain a monochrome image; repeating the steps of illuminating and capturing to obtain a plurality of monochrome images of the object illuminated by light of a plurality of different spectral bands; aligning the plurality of monochrome images; analyzing the plurality of monochrome images; determining whether at least one of an illumination setting and a camera setting for the plurality of monochrome images needs adjustment; adjusting at least one of the illumination setting and the camera setting for the plurality of monochrome images depending upon the determination; processing the plurality of monochrome images to generate image data for one or more output channels; and generating a color composite image from the image data, the color composite image comprising the one or more output channels.
  • 2. The method according to claim 1, wherein illuminating the object with light of the particular spectral band comprises at least one of: illuminating the object with light of a single spectral band or illuminating the object with light of one or more spectral bands at the same time.
  • 3. The method according to claim 1, wherein illuminating the object with light of the particular spectral band comprises illuminating the object with at least one of visible and near-visible light.
  • 4. The method according to claim 1, wherein illuminating the object with light of the particular spectral band comprises illuminating the object with light having a bandwidth between about 10 and about 100 nanometers.
  • 5. The method according to claim 1, further comprising, after adjusting at least one of the illumination setting and the camera setting: repeating the steps of illuminating and capturing to obtain the plurality of monochrome images of the object illuminated by light of the plurality of different spectral bands.
  • 6. The method according to claim 1, further comprising: capturing another digital image of the object using the monochromatic image sensor without applied illumination to obtain a reference monochrome image that is representative of contaminating noise from ambient light; determining an unknown amplitude (A) of the contaminating noise based on a difference in exposure time between a particular monochrome image and the reference monochrome image, with the difference in exposure time used to determine the unknown amplitude of the contaminating noise between the particular monochrome image and the reference monochrome image; and subtracting the contaminating noise in one or more monochrome images of the plurality of monochrome images.
  • 7. The method according to claim 6, wherein the unknown amplitude is different in different monochrome images.
  • 8. The method according to claim 1, wherein processing the plurality of monochrome images to generate the image data comprises processing one or more monochrome images of the plurality of monochrome images by at least one of: mapping each particular spectral band in the one or more monochrome images to an output channel of the one or more output channels; adding a percentage of one monochrome image to another monochrome image; subtracting the percentage of the one monochrome image from a different monochrome image; multiplying the one monochrome image by a different monochrome image; dividing the one monochrome image by a different monochrome image; applying a positive or negative multiplier to the one or more monochrome images; applying a positive or negative offset value to the one or more monochrome images; and applying a positive or negative exponent to the one or more monochrome images.
  • 9. The method according to claim 8, further comprising generating additional image data for the one or more output channels from the image data, wherein generating the additional image data comprises at least one of: adding the image data together; subtracting the image data from other image data; multiplying the image data by other image data; dividing the image data by other image data; applying a positive or negative multiplier to the image data; applying a positive or negative offset value to the image data; and applying a positive or negative exponent to the image data.
  • 10. The method according to claim 1, wherein generating the color composite image from the image data comprises: assigning the image data to the one or more output channels.
  • 11. The method according to claim 9, wherein generating the color composite image from the image data comprises generating the color composite image from the image data generated from processing the one or more monochrome images and from the additional image data.
  • 12. The method according to claim 11, wherein processing the one or more monochrome images and generating the additional image data results in adjusting at least one of hue, saturation, lightness, chroma, intensity, contrast, and brightness of the color composite image.
  • 13. A method for constructing a color composite image comprising: capturing a plurality of digital monochrome images with a monochromatic image sensor, each digital monochrome image in the plurality of digital monochrome images illuminated with a different spectral band; aligning the plurality of digital monochrome images; analyzing the plurality of digital monochrome images; determining whether at least one of an illumination setting and a camera setting for the plurality of digital monochrome images needs adjustment; adjusting at least one of the illumination setting and the camera setting for the plurality of digital monochrome images depending upon the determination; processing the plurality of digital monochrome images to generate image data for one or more output channels; and generating the color composite image from the image data, the color composite image comprising the one or more output channels.
  • 14. The method according to claim 13, wherein processing the plurality of digital monochrome images to generate the image data comprises processing one or more digital monochrome images of the plurality of digital monochrome images by at least one of: mapping each particular spectral band in the one or more digital monochrome images to an output channel of the one or more output channels; adding a percentage of one digital monochrome image to another digital monochrome image; subtracting the percentage of the one digital monochrome image from a different digital monochrome image; multiplying the one digital monochrome image by a different digital monochrome image; dividing the one digital monochrome image by a different digital monochrome image; applying a positive or negative multiplier to the one or more digital monochrome images; applying a positive or negative offset value to the one or more digital monochrome images; and applying a positive or negative exponent to the one or more digital monochrome images.
  • 15. The method according to claim 14, further comprising generating additional image data for the one or more output channels from the image data, wherein generating the additional image data comprises at least one of: adding the image data together; subtracting the image data from other image data; multiplying the image data by other image data; dividing the image data by other image data; applying a positive or negative multiplier to the image data; applying a positive or negative offset value to the image data; and applying a positive or negative exponent to the image data.
  • 16. The method according to claim 13, wherein generating the color composite image from the image data comprises: assigning the image data to the one or more output channels.
  • 17. The method according to claim 15, wherein generating the color composite image from the image data comprises generating the color composite image from the image data generated from processing the one or more digital monochrome images and from the additional image data.
  • 18. The method according to claim 17, wherein processing the one or more digital monochrome images and generating the additional image data results in adjusting at least one of hue, saturation, lightness, chroma, intensity, contrast, and brightness of the color composite image.
  • 19. The method according to claim 14, wherein adding the percentage of the one digital monochrome image to another digital monochrome image comprises multiplying pixel values of the one digital monochrome image by a constant and adding result values of the multiplication to the corresponding pixels of another digital monochrome image.
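
By way of illustration only, the band-sequential capture recited in claims 1 and 13 can be sketched in a few lines of Python. The band list, the `illuminate` and `capture_monochrome` functions, and the frame dimensions below are hypothetical placeholders for device-specific illumination and sensor APIs, not part of the patented method.

```python
import numpy as np

# Illustrative (center_nm, bandwidth_nm) pairs; values are placeholders only.
SPECTRAL_BANDS_NM = [(450, 30), (530, 30), (620, 30)]

def illuminate(center_nm, bandwidth_nm):
    """Hypothetical stand-in for a device call that switches on an
    illumination source of the requested spectral band."""
    pass

def capture_monochrome(exposure_ms=10.0):
    """Hypothetical stand-in for a monochromatic-sensor read; a random
    frame is synthesized so the sketch runs end to end."""
    rng = np.random.default_rng()
    return rng.integers(0, 4096, size=(480, 640), dtype=np.uint16)

def acquire_stack():
    """Illuminate with each band in turn, capturing one monochrome frame per band."""
    frames = []
    for center, width in SPECTRAL_BANDS_NM:
        illuminate(center, width)            # light the object with one band
        frames.append(capture_monochrome())  # capture while that band is on
    return frames

frames = acquire_stack()  # one monochrome image per spectral band
```

Each pass through the loop pairs one spectral band with one monochrome exposure, yielding the plurality of monochrome images that the later claims process into output channels.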
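
Claim 6 removes ambient ("contaminating") light using a reference frame captured with no applied illumination. A minimal sketch follows, under one plausible reading of the claim: the ambient contribution grows linearly with exposure time, so the unknown amplitude A is estimated as the ratio of the frame's exposure time to the reference's. The function name and the linear-scaling model are editorial assumptions, not language from the specification.

```python
import numpy as np

def subtract_ambient(frame, frame_exposure_ms, reference, reference_exposure_ms):
    """Subtract ambient-light noise from an illuminated monochrome frame.

    Assumes the ambient signal scales linearly with exposure time, so the
    unknown amplitude A is the ratio of the two exposure times.
    """
    a = frame_exposure_ms / reference_exposure_ms            # amplitude A
    corrected = frame.astype(np.float64) - a * reference.astype(np.float64)
    return np.clip(corrected, 0, None)                       # clip negative residue

# Usage: a 5 ms illuminated frame cleaned with a 20 ms ambient-only frame.
frame = np.full((4, 4), 900.0)       # illuminated capture
reference = np.full((4, 4), 400.0)   # capture with no applied illumination
clean = subtract_ambient(frame, 5.0, reference, 20.0)  # A = 0.25, removes 100 counts
```

Consistent with claim 7, A differs from one monochrome image to another whenever their exposure times differ.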
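
The pixel-wise operations enumerated in claims 8, 9, 14, 15, and 19 map directly onto elementwise array arithmetic. The sketch below runs each listed operation once on NumPy arrays; the 30% weighting, the 0.5 exponent, and the operand pairings are arbitrary illustration values, not values taken from the patent.

```python
import numpy as np

red_band = np.random.rand(480, 640)  # monochrome image captured under one band
nir_band = np.random.rand(480, 640)  # monochrome image captured under another band

# Claim 19: adding a percentage of one image to another -- multiply pixel
# values by a constant, then add the results to the other image's pixels.
blended = 0.30 * red_band + nir_band

# Remaining operations enumerated in claims 8/9 (and 14/15):
difference = nir_band - 0.30 * red_band    # subtract a percentage of one image
product    = red_band * nir_band           # multiply one image by another
eps = 1e-9                                 # guard against divide-by-zero
ratio      = nir_band / (red_band + eps)   # divide one image by another
scaled     = -1.5 * red_band               # positive or negative multiplier
shifted    = red_band + 0.1                # positive or negative offset value
gamma      = np.power(red_band, 0.5)       # positive or negative exponent
```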
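
Finally, claims 10 and 16 assign the processed image data to the one or more output channels. A minimal sketch, assuming normalized floating-point inputs and an 8-bit RGB target; the `compose` helper and the channel pairings are hypothetical.

```python
import numpy as np

def compose(channel_r, channel_g, channel_b):
    """Assign three planes of processed image data to the R, G, and B
    output channels of a color composite image."""
    stack = np.dstack([channel_r, channel_g, channel_b])  # H x W x 3
    stack = np.clip(stack, 0.0, 1.0)                      # keep values displayable
    return (stack * 255).astype(np.uint8)                 # 8-bit RGB composite

# Usage with illustrative data in [0, 1]:
r = np.random.rand(480, 640)  # e.g., a band ratio mapped to red
g = np.random.rand(480, 640)  # e.g., a raw band mapped to green
b = np.random.rand(480, 640)  # e.g., a processed band mapped to blue
composite = compose(r, g, b)  # shape (480, 640, 3), dtype uint8
```

Because any channel may hold raw, combined, or derived image data, this final assignment step is where the false-color character of the composite is chosen.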
US Referenced Citations (674)
Number Name Date Kind
6081612 Gutkowicz-Krusin et al. Jun 2000 A
6151424 Hsu Nov 2000 A
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Van et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8556180 Liou Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein, Jr. Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682038 Blair Mar 2014 B2
8682077 Longacre, Jr. Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz, Sr. Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue et al. Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein, Jr. Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 El et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber et al. Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9082031 Liu et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz, Sr. et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
D737321 Lee Aug 2015 S
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham, IV Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein, Jr. Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Smith Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
D754205 Nguyen et al. Apr 2016 S
D754206 Nguyen et al. Apr 2016 S
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
9319548 Showering et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey et al. May 2016 B2
9360304 Xue et al. Jun 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390304 Chang et al. Jul 2016 B2
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein, Jr. Aug 2016 B2
9412242 Van et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van et al. Aug 2016 B2
9423318 Liu et al. Aug 2016 B2
9424454 Tao et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9436860 Smith et al. Sep 2016 B2
9443123 Hejl Sep 2016 B2
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9487113 Schukalski Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
9894298 Solh Feb 2018 B1
10110805 Pomerantz et al. Oct 2018 B2
10325436 Van et al. Jun 2019 B2
20050111694 Loce et al. May 2005 A1
20070063048 Havens et al. Mar 2007 A1
20080185432 Caballero et al. Aug 2008 A1
20090134221 Zhu et al. May 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100265880 Rautiola et al. Oct 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168511 Kotlarsky et al. Jul 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120194692 Mers et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20120228382 Havens et al. Sep 2012 A1
20120248188 Kearney Oct 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130082104 Kearney et al. Apr 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedrao Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20130332996 Fiala et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140034734 Sauerwein, Jr. Feb 2014 A1
20140036229 Hsu Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140037196 Blair Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140097249 Gomez et al. Apr 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140106725 Sauerwein, Jr. Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140191684 Valois Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140225926 Mathers Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140253686 Wong et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 Digregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150039878 Barten Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150178523 Gelay et al. Jun 2015 A1
20150178537 El et al. Jun 2015 A1
20150178685 Krumel et al. Jun 2015 A1
20150181109 Gillet et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150212565 Murawski et al. Jul 2015 A1
20150213647 Laffargue et al. Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150220901 Gomez et al. Aug 2015 A1
20150227189 Davis et al. Aug 2015 A1
20150236984 Sevier Aug 2015 A1
20150239348 Chamberlin Aug 2015 A1
20150242658 Nahill et al. Aug 2015 A1
20150248572 Soule et al. Sep 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150261643 Caballero et al. Sep 2015 A1
20150264624 Wang et al. Sep 2015 A1
20150268971 Barten Sep 2015 A1
20150269402 Barber et al. Sep 2015 A1
20150288689 Todeschini et al. Oct 2015 A1
20150288896 Wang Oct 2015 A1
20150310243 Ackley et al. Oct 2015 A1
20150310244 Xian et al. Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150312780 Wang et al. Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160025697 Alt et al. Jan 2016 A1
20160026838 Gillet et al. Jan 2016 A1
20160026839 Qu et al. Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160070944 McCloskey et al. Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue et al. Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160125873 Braho et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Franz Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George et al. Jun 2016 A1
20160189284 Mellott et al. Jun 2016 A1
20160189288 Todeschini et al. Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 Dipiazza et al. Jun 2016 A1
20160192051 Dipiazza et al. Jun 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggerty et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Wilz et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160316190 McCloskey et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Germaine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress, Jr. Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 D'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delario et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Celinder et al. Jul 2017 A1
20170193727 Van et al. Jul 2017 A1
20170199266 Rice et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
20180025529 Wu et al. Jan 2018 A1
Foreign Referenced Citations (1)
Number Date Country
2013163789 Nov 2013 WO
Non-Patent Literature Citations (6)
Entry
Examiner initiated interview summary (PTOL-413B) dated Dec. 4, 2019 for U.S. Appl. No. 15/725,753.
Non-Final Rejection dated May 2, 2019 for U.S. Appl. No. 15/725,753.
Notice of Allowance and Fees Due (PTOL-85) dated Dec. 4, 2019 for U.S. Appl. No. 15/725,753.
Notice of Allowance and Fees Due (PTOL-85) dated Mar. 19, 2020 for U.S. Appl. No. 15/725,753.
Requirement for Restriction/Election dated Dec. 13, 2018 for U.S. Appl. No. 15/725,753.
Notice of Allowance and Fees Due (PTOL-85) dated Apr. 15, 2020 for U.S. Appl. No. 15/725,753.
Related Publications (1)
Number Date Country
20200186704 A1 Jun 2020 US
Continuations (1)
Number Date Country
Parent 15725753 Oct 2017 US
Child 16789690 US