Artificial generation of color blood smear image

Information

  • Patent Grant
    12189112
  • Patent Number
    12,189,112
  • Date Filed
    Thursday, December 10, 2020
  • Date Issued
    Tuesday, January 7, 2025
Abstract
Apparatus and methods are described for use with a blood sample. Using a microscope (24), three images of a microscopic imaging field of the blood sample are acquired, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging. Using at least one computer processor (28), an artificial color microscopic image of the microscopic imaging field is generated, by mapping the first one of the three images to a red channel of the artificial color microscopic image, mapping a second one of the three images to a second color channel of the artificial color microscopic image, and mapping a third one of the three images to a third color channel of the artificial color microscopic image. Other applications are also described.
Description
FIELD OF EMBODIMENTS OF THE INVENTION

Some applications of the presently disclosed subject matter relate generally to analysis of bodily samples, and in particular, to optical density and microscopic measurements that are performed upon blood samples.


BACKGROUND

In some optics-based methods (e.g., diagnostic, and/or analytic methods), a property of a biological sample, such as a blood sample, is determined by performing an optical measurement. For example, the density of a component (e.g., a count of the component per unit volume) may be determined by counting the component within a microscopic image. Similarly, the concentration and/or density of a component may be measured by performing optical absorption, transmittance, fluorescence, and/or luminescence measurements upon the sample. Typically, the sample is placed into a sample carrier and the measurements are performed with respect to a portion of the sample that is contained within a chamber of the sample carrier. The measurements that are performed upon the portion of the sample that is contained within the chamber of the sample carrier are analyzed in order to determine a property of the sample.


SUMMARY OF EMBODIMENTS

In accordance with some applications of the present invention, a plurality of images of a microscopic imaging field of a blood sample are acquired, each of the images being acquired using respective, different imaging conditions. Typically, at least one of the images is a brightfield image that is acquired under violet lighting conditions (e.g., under lighting by light at a wavelength within the range of 400 nm-450 nm). Further typically, at least one of the images is a fluorescent image. A computer processor combines data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image. For some applications, the computer processor runs a neural network such as to combine the images to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image. Typically, one or more color models such as RGB, CIE, HSV, and/or a combination thereof is used to generate the artificial color microscopic image.


Typically, the image that was acquired under brightfield, violet lighting conditions is mapped to a red channel of the artificial color microscopic image. Further typically, the image is converted to a negative contrast image before being mapped to the red channel. For some applications, the result of mapping the negative contrast image of the image acquired under brightfield, violet lighting conditions to the red channel is that red blood cells have an appearance similar to that of red blood cells in a color smear image (e.g., similar to those generated using Giemsa or Wright-Romanowsky smear staining).


For some applications, three images are acquired under respective imaging modalities. For example, in addition to the image acquired under brightfield violet lighting conditions, two fluorescent images may be acquired. For example, the two fluorescent images may be acquired after exciting the blood sample with light at respective wavelength bands (e.g., UV light, and blue light). Alternatively, the two fluorescent images may be acquired after exciting the sample with light at the same wavelength band, but using respective, different emission filters. Typically, the second image is mapped to a second color channel of the artificial color microscopic image, and the third image is mapped to a third color channel of the artificial color microscopic image. For example, when an RGB color model is used, the first image may be mapped to the red channel (as described above), the second image mapped to the green channel, and the third image mapped to the blue channel.
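By way of illustration only (this sketch is not part of the original disclosure), the following Python/NumPy snippet shows one way such a three-channel mapping could be implemented, assuming three registered 8-bit grayscale images of the same imaging field have already been acquired; the function and variable names are hypothetical.

import numpy as np

def make_artificial_rgb(violet_brightfield, fluor_a, fluor_b):
    # Combine three single-channel uint8 images of equal shape into an
    # artificial RGB image. The violet brightfield image is contrast-inverted
    # before being mapped to the red channel (so hemoglobin-rich red blood
    # cells appear red); the two fluorescent images are mapped to the green
    # and blue channels. Illustrative only.
    red = 255 - violet_brightfield
    green = fluor_a
    blue = fluor_b
    return np.dstack([red, green, blue]).astype(np.uint8)

# Example usage with synthetic data standing in for acquired images:
h, w = 480, 640
violet = np.random.randint(0, 256, (h, w), dtype=np.uint8)
fluor_uv = np.random.randint(0, 256, (h, w), dtype=np.uint8)
fluor_blue = np.random.randint(0, 256, (h, w), dtype=np.uint8)
rgb = make_artificial_rgb(violet, fluor_uv, fluor_blue)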


There is therefore provided, in accordance with some applications of the present invention, a method for use with a blood sample, the method including:

    • using a microscope, acquiring three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging; and
    • using at least one computer processor, generating an artificial color microscopic image of the microscopic imaging field, by:
      • mapping the first one of the three images to a red channel of the artificial color microscopic image;
      • mapping a second one of the three images to a second color channel of the artificial color microscopic image; and
      • mapping a third one of the three images to a third color channel of the artificial color microscopic image.


In some applications, the first one of the three images is an image acquired under off-focus, violet-light brightfield imaging conditions.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a neural network to generate the artificial color microscopic image of the microscopic imaging field.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, mapping the first one of the three images to the red channel of the artificial RGB microscopic image includes generating a negative contrast image of the first one of the three images and mapping the negative contrast image to the red channel of the artificial RGB microscopic image.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a blood sample, the apparatus including:

    • a microscope configured to acquire three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging;
    • an output device; and
    • at least one computer processor configured to generate an artificial color microscopic image of the microscopic imaging field upon the output device, by:
      • mapping the first one of the three images to a red channel of the artificial color microscopic image,
      • mapping a second one of the three images to a second color channel of the artificial color microscopic image, and
      • mapping a third one of the three images to a third color channel of the artificial color microscopic image.


In some applications, the microscope is configured to acquire the first one of the three images under off-focus, violet-light brightfield imaging conditions.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a neural network.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, the computer processor is configured to map the first one of the three images to the red channel of the artificial RGB microscopic image by generating a negative contrast image of the first one of the three images and mapping the negative contrast image to the red channel of the artificial RGB microscopic image.


There is further provided, in accordance with some applications of the present invention, a method for use with a blood sample, the method including:

    • using a microscope, acquiring a plurality of images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions; and
    • using at least one computer processor, combining data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image.


In some applications, combining data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image includes using a neural network to combine data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image.


In some applications, combining data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image includes using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a blood sample, the apparatus including:

    • a microscope configured to acquire a plurality of images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions;
    • an output device; and
    • at least one computer processor configured to combine data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field upon the output device that appears like a color smear image.


In some applications, the computer processor is configured to combine data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image using a neural network.


In some applications, the computer processor is configured to combine data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


There is further provided, in accordance with some applications of the present invention, a method for use with a blood sample, the method including:

    • using a microscope, acquiring three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions; and
    • using at least one computer processor, generating an artificial color microscopic image of the microscopic imaging field, by:
      • generating normalized versions of each of the images, such as to remove pixels within the image having an intensity that is below a threshold; and
      • mapping the normalized version of each one of the images to a respective, different channel within an additive color model.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a neural network to generate the artificial color microscopic image of the microscopic imaging field.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, generating normalized versions of each of the images includes, for at least one of the images:

    • determining a maximum intensity within the image; and
    • removing all pixels having an intensity that is less than half of the maximum intensity.


In some applications, generating normalized versions of each of the images further includes, for the at least one of the images:

    • generating an intensity histogram of the image; and
    • for each pixel within the image that has an intensity that is at least equal to half of the maximum intensity:
      • identifying a closest local maximum in the intensity histogram having an intensity that is greater than half of the maximum intensity within the image; and
      • normalizing the intensity of the pixel based upon the difference between the maximum intensity and the intensity of the local maximum.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a blood sample, the apparatus including:

    • a microscope configured to acquire three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions;
    • an output device; and
    • at least one computer processor configured to generate an artificial color microscopic image of the microscopic imaging field upon the output device, by:
      • generating normalized versions of each of the images, such as to remove pixels within the image having an intensity that is below a threshold, and
      • mapping the normalized version of each one of the images to a respective, different channel within an additive color model.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a neural network.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, the computer processor is configured to generate the normalized versions of each of the images by, for at least one of the images:

    • determining a maximum intensity within the image; and
    • removing all pixels having an intensity that is less than half of the maximum intensity.


In some applications, the computer processor is configured to generate the normalized versions of each of the images by, for the at least one of the images:

    • generating an intensity histogram of the image; and
    • for each pixel within the image that has an intensity that is at least equal to half of the maximum intensity:
      • identifying a closest local maximum in the intensity histogram having an intensity that is greater than half of the maximum intensity within the image; and
      • normalizing the intensity of the pixel based upon the difference between the maximum intensity and the intensity of the local maximum.


There is further provided, in accordance with some applications of the present invention, a method for use with a blood sample, the method including:

    • using a microscope, acquiring three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions; and
    • using at least one computer processor, generating an artificial color microscopic image of the microscopic imaging field, by:
      • mapping each one of the images to a respective, different channel within an additive color model to generate an initial color image; and
      • generating a normalized version of the initial color image, such as to remove pixels within the image having an intensity that is below a threshold.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a neural network to generate the artificial color microscopic image of the microscopic imaging field.


In some applications, generating the artificial color microscopic image of the microscopic imaging field includes using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, generating the normalized version of the initial color image includes:

    • determining a maximum intensity within the initial color image; and
    • removing all pixels having an intensity that is less than half of the maximum intensity.


In some applications, generating the normalized version of the initial color image further includes:

    • generating an intensity histogram of the image; and
    • for each pixel within the initial color image that has an intensity that is at least equal to half of the maximum intensity:
      • identifying a closest local maximum in the intensity histogram having an intensity that is greater than half of the maximum intensity within the image; and
      • normalizing the intensity of the pixel based upon the difference between the maximum intensity and the intensity of the local maximum.


There is further provided, in accordance with some applications of the present invention, apparatus for use with a blood sample, the apparatus including:

    • a microscope configured to acquire three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions;
    • an output device; and
    • at least one computer processor configured to generate an artificial color microscopic image of the microscopic imaging field upon the output device, by:
      • mapping each one of the images to a respective, different channel within an additive color model to generate an initial color image, and
      • generating a normalized version of the initial color image, such as to remove pixels within the image having an intensity that is below a threshold.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a neural network.


In some applications, the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.


In some applications, the computer processor is configured to generate the normalized version of the initial color image by:

    • determining a maximum intensity within the initial color image; and
    • removing all pixels having an intensity that is less than half of the maximum intensity.


In some applications, the computer processor is configured to generate the normalized version of the initial color image by:

    • generating an intensity histogram of the image, and
    • for each pixel within the initial color image that has an intensity that is at least equal to half of the maximum intensity:
      • identifying a closest local maximum in the intensity histogram having an intensity that is greater than half of the maximum intensity within the image, and
      • normalizing the intensity of the pixel based upon the difference between the maximum intensity and the intensity of the local maximum.


The present invention will be more fully understood from the following detailed description of embodiments thereof, taken together with the drawings, in which:





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram showing components of a biological sample analysis system, in accordance with some applications of the present invention;



FIGS. 2A, 2B, and 2C are schematic illustrations of an optical measurement unit, in accordance with some applications of the present invention;



FIGS. 3A, 3B, and 3C are schematic illustrations of respective views of a sample carrier that is used for performing both microscopic measurements and optical density measurements, in accordance with some applications of the present invention; and



FIGS. 4A, 4B, 4C, and 4D are flowcharts showing steps of methods that are performed, in accordance with some applications of the present invention.





DETAILED DESCRIPTION OF EMBODIMENTS

Reference is now made to FIG. 1, which is a block diagram showing components of a biological sample analysis system 20, in accordance with some applications of the present invention. Typically, a biological sample (e.g., a blood sample) is placed into a sample carrier 22. While the sample is disposed in the sample carrier, optical measurements are performed upon the sample using one or more optical measurement devices 24. For example, the optical measurement devices may include a microscope (e.g., a digital microscope), a spectrophotometer, a photometer, a spectrometer, a camera, a spectral camera, a hyperspectral camera, a fluorometer, a spectrofluorometer, and/or a photodetector (such as a photodiode, a photoresistor, and/or a phototransistor). For some applications, the optical measurement devices include dedicated light sources (such as light emitting diodes, incandescent light sources, etc.) and/or optical elements for manipulating light collection and/or light emission (such as lenses, diffusers, filters, etc.).


A computer processor 28 typically receives and processes optical measurements that are performed by the optical measurement device. Further typically, the computer processor controls the acquisition of optical measurements that are performed by the one or more optical measurement devices. The computer processor communicates with a memory 30. A user (e.g., a laboratory technician, or an individual from whom the sample was drawn) sends instructions to the computer processor via a user interface 32. For some applications, the user interface includes a keyboard, a mouse, a joystick, a touchscreen device (such as a smartphone or a tablet computer), a touchpad, a trackball, a voice-command interface, and/or other types of user interfaces that are known in the art. Typically, the computer processor generates an output via an output device 34. Further typically, the output device includes a display, such as a monitor, and the output includes an output that is displayed on the display. For some applications, the processor generates an output on a different type of visual, text, graphics, tactile, audio, and/or video output device, e.g., speakers, headphones, a smartphone, or a tablet computer. For some applications, user interface 32 acts as both an input interface and an output interface, i.e., it acts as an input/output interface. For some applications, the processor generates an output on a computer-readable medium (e.g., a non-transitory computer-readable medium), such as a disk, or a portable USB drive, and/or generates an output on a printer.


Reference is now made to FIGS. 2A, 2B, and 2C, which are schematic illustrations of an optical measurement unit 31, in accordance with some applications of the present invention. FIG. 2A shows an oblique view of the exterior of the fully assembled device, while FIGS. 2B and 2C show respective oblique views of the device with the cover having been made transparent, such that components within the device are visible. For some applications, one or more optical measurement devices 24 (and/or computer processor 28 and memory 30) are housed inside optical measurement unit 31. In order to perform the optical measurements upon the sample, sample carrier 22 is placed inside the optical measurement unit. For example, the optical measurement unit may define a slot 36, via which the sample carrier is inserted into the optical measurement unit. Typically, the optical measurement unit includes a stage 64, which is configured to support sample carrier 22 within the optical measurement unit. For some applications, a screen 63 on the cover of the optical measurement unit (e.g., a screen on the front cover of the optical measurement unit, as shown) functions as user interface 32 and/or output device 34.


Typically, the optical measurement unit includes microscope system 37 (shown in FIGS. 2B-C) configured to perform microscopic imaging of a portion of the sample. For some applications, the microscope system includes a set of light sources 65 (which typically include a set of brightfield light sources (e.g. light emitting diodes) that are configured to be used for brightfield imaging of the sample, and a set of fluorescent light sources (e.g. light emitting diodes) that are configured to be used for fluorescent imaging of the sample), and a camera (e.g., a CCD camera, or a CMOS camera) configured to image the sample. Typically, the optical measurement unit also includes an optical-density-measurement unit 39 (shown in FIG. 2C) configured to perform optical density measurements (e.g., optical absorption measurements) on a second portion of the sample. For some applications, the optical-density-measurement unit includes a set of optical-density-measurement light sources (e.g., light emitting diodes) and light detectors, which are configured for performing optical density measurements on the sample. For some applications, each of the aforementioned sets of light sources (i.e., the set of brightfield light sources, the set of fluorescent light sources, and the set of optical-density-measurement light sources) includes a plurality of light sources (e.g. a plurality of light emitting diodes), each of which is configured to emit light at a respective wavelength or at a respective band of wavelengths.


Reference is now made to FIGS. 3A and 3B, which are schematic illustrations of respective views of sample carrier 22, in accordance with some applications of the present invention. FIG. 3A shows a top view of the sample carrier (the top cover of the sample carrier being shown as being opaque in FIG. 3A, for illustrative purposes), and FIG. 3B shows a bottom view (in which the sample carrier has been rotated around its short edge with respect to the view shown in FIG. 3A). Typically, the sample carrier includes a first set 52 of one or more sample chambers, which are used for performing microscopic analysis upon the sample, and a second set 54 of sample chambers, which are used for performing optical density measurements upon the sample. Typically, the sample chambers of the sample carrier are filled with a bodily sample, such as blood, via sample inlet holes 38. For some applications, the sample chambers define one or more outlet holes 40. The outlet holes are configured to facilitate filling of the sample chambers with the bodily sample, by allowing air that is present in the sample chambers to be released from the sample chambers. Typically, as shown, the outlet holes are located longitudinally opposite the inlet holes (with respect to a sample chamber of the sample carrier). For some applications, the outlet holes thus provide a more efficient mechanism of air escape than if the outlet holes were to be disposed closer to the inlet holes.


Reference is made to FIG. 3C, which shows an exploded view of sample carrier 22, in accordance with some applications of the present invention. For some applications, the sample carrier includes at least three components: a molded component 42, a glass layer 44 (e.g., a glass sheet), and an adhesive layer 46 configured to adhere the glass layer to an underside of the molded component. The molded component is typically made of a polymer (e.g., a plastic) that is molded (e.g., via injection molding) to provide the sample chambers with a desired geometrical shape. For example, as shown, the molded component is typically molded to define inlet holes 38, outlet holes 40, and gutters 48 which surround the central portion of each of the sample chambers. The gutters typically facilitate filling of the sample chambers with the bodily sample, by allowing air to flow to the outlet holes, and/or by allowing the bodily sample to flow around the central portion of the sample chamber.


For some applications, a sample carrier as shown in FIGS. 3A-C is used when performing a complete blood count on a blood sample. For some such applications, the sample carrier is used with optical measurement unit 31 configured as generally shown and described with reference to FIGS. 2A-C. For some applications, a first portion of the blood sample is placed inside first set 52 of sample chambers (which are used for performing microscopic analysis upon the sample, e.g., using microscope system 37 (shown in FIGS. 2B-C)), and a second portion of the blood sample is placed inside second set 54 of sample chambers (which are used for performing optical density measurements upon the sample, e.g., using optical-density-measurement unit 39 (shown in FIG. 2C)). For some applications, first set 52 of sample chambers includes a plurality of sample chambers, while second set 54 of sample chambers includes only a single sample chamber, as shown. However, the scope of the present application includes using any number of sample chambers (e.g., a single sample chamber or a plurality of sample chambers) within either the first set of sample chambers or within the second set of sample chambers, or any combination thereof. The first portion of the blood sample is typically diluted with respect to the second portion of the blood sample. For example, the diluent may contain pH buffers, stains, fluorescent stains, antibodies, sphering agents, lysing agents, etc. Typically, the second portion of the blood sample, which is placed inside second set 54 of sample chambers, is a natural, undiluted blood sample. Alternatively or additionally, the second portion of the blood sample may be a sample that underwent some modification, including, for example, one or more of dilution (e.g., dilution in a controlled fashion), addition of a component or reagent, or fractionation.


For some applications, one or more staining substances are used to stain the first portion of the blood sample (which is placed inside first set 52 of sample chambers) before the sample is imaged microscopically. For example, the staining substance may be configured to stain DNA with preference over staining of other cellular components. Alternatively, the staining substance may be configured to stain all cellular nucleic acids with preference over staining of other cellular components. For example, the sample may be stained with Acridine Orange reagent, Hoechst reagent, and/or any other staining substance that is configured to preferentially stain DNA and/or RNA within the blood sample. Optionally, the staining substance is configured to stain all cellular nucleic acids but the staining of DNA and RNA are each more prominently visible under some lighting and filter conditions, as is known, for example, for Acridine Orange. Images of the sample may be acquired using imaging conditions that allow detection of cells (e.g., brightfield) and/or imaging conditions that allow visualization of stained bodies (e.g. appropriate fluorescent illumination). Typically, the first portion of the sample is stained with Acridine Orange and with a Hoechst reagent. For example, the first (diluted) portion of the blood sample may be prepared using techniques as described in U.S. Pat. No. 9,329,129 to Pollak, which is incorporated herein by reference, and which describes a method for preparation of blood samples for analysis that involves a dilution step, the dilution step facilitating the identification and/or counting of components within microscopic images of the sample. For some applications, the first portion of the sample is stained with one or more stains that cause platelets within the sample to be visible under brightfield imaging conditions and/or under fluorescent imaging conditions, e.g., as described hereinabove. For example, the first portion of the sample may be stained with methylene blue and/or Romanowsky stains.


Referring again to FIG. 2B, typically, sample carrier 22 is supported within the optical measurement unit by stage 64. Further typically, the stage has a forked design, such that the sample carrier is supported by the stage around the edges of the sample carrier, but such that the stage does not interfere with the visibility of the sample chambers of the sample carrier by the optical measurement devices. For some applications, the sample carrier is held within the stage, such that molded component 42 of the sample carrier is disposed above the glass layer 44, and such that an objective lens 66 of a microscope unit of the optical measurement unit is disposed below the glass layer of the sample carrier. Typically, at least some light sources 65 that are used during microscopic measurements that are performed upon the sample (for example, light sources that are used during brightfield imaging) illuminate the sample carrier from above the molded component. Further typically, at least some additional light sources (not shown) illuminate the sample carrier from below the sample carrier (e.g., via the objective lens). For example, light sources that are used to excite the sample during fluorescent microscopy may illuminate the sample carrier from below the sample carrier (e.g., via the objective lens).


Typically, prior to being imaged microscopically, the first portion of blood (which is placed in first set 52 of sample chambers) is allowed to settle such as to form a monolayer of cells, e.g., using techniques as described in U.S. Pat. No. 9,329,129 to Pollak, which is incorporated herein by reference. For some applications, the first portion of blood is a cell suspension and the chambers belonging to the first set 52 of chambers each define a cavity 55 that includes a base surface 57 (shown in FIG. 3C). Typically, the cells in the cell suspension are allowed to settle on the base surface of the sample chamber of the carrier to form a monolayer of cells on the base surface of the sample chamber. Subsequent to the cells having been left to settle on the base surface of the sample chamber (e.g., by having been left to settle for a predefined time interval), at least one microscopic image of at least a portion of the monolayer of cells is typically acquired. Typically, a plurality of images of the monolayer are acquired, each of the images corresponding to an imaging field that is located at a respective, different area within the imaging plane of the monolayer. Typically, an optimum depth level at which to focus the microscope in order to image the monolayer is determined, e.g., using techniques as described in U.S. Pat. No. 10,176,565 to Greenfield, which is incorporated herein by reference. For some applications, respective imaging fields have different optimum depth levels from each other.


It is noted that, in the context of the present application, the term monolayer is used to mean a layer of cells that have settled, such as to be disposed within a single focus level of the microscope. Within the monolayer there may be some overlap of cells, such that within certain areas there are two or more overlapping layers of cells. For example, red blood cells may overlap with each other within the monolayer, and/or platelets may overlap with, or be disposed above, red blood cells within the monolayer.


For some applications, the microscopic analysis of the first portion of the blood sample is performed with respect to the monolayer of cells. Typically, the first portion of the blood sample is imaged under brightfield imaging, i.e., under illumination from one or more light sources (e.g., one or more light emitting diodes, which typically emit light at respective spectral bands). Further typically, the first portion of the blood sample is additionally imaged under fluorescent imaging. Typically, the fluorescent imaging is performed by exciting stained objects (i.e., objects that have absorbed the stain(s)) within the sample by directing light toward the sample at known excitation wavelengths (i.e., wavelengths at which it is known that stained objects emit fluorescent light if excited with light at those wavelengths), and detecting the fluorescent light. Typically, for the fluorescent imaging, a separate set of light sources (e.g., one or more light emitting diodes) is used to illuminate the sample at the known excitation wavelengths.


As described with reference to US 2019/0302099 to Pollak, which is incorporated herein by reference, for some applications, sample chambers belonging to set 52 (which is used for microscopy measurements) have different heights from each other, in order to facilitate different measurands being measured using microscope images of respective sample chambers, and/or different sample chambers being used for microscopic analysis of respective sample types. For example, if a blood sample, and/or a monolayer formed by the sample, has a relatively low density of red blood cells, then measurements may be performed within a sample chamber of the sample carrier having a greater height (i.e., a sample chamber of the sample carrier having a greater height relative to a different sample chamber having a relatively lower height), such that there is a sufficient density of cells, and/or such that there is a sufficient density of cells within the monolayer formed by the sample, to provide statistically reliable data. Such measurements may include, for example, red blood cell density measurements, measurements of other cellular attributes (such as counts of abnormal red blood cells, red blood cells that include intracellular bodies (e.g., pathogens, Howell-Jolly bodies), etc.), and/or hemoglobin concentration. Conversely, if a blood sample, and/or a monolayer formed by the sample, has a relatively high density of red blood cells, then such measurements may be performed upon a sample chamber of the sample carrier having a relatively low height, for example, such that there is a sufficient sparsity of cells, and/or such that there is a sufficient sparsity of cells within the monolayer of cells formed by the sample, that the cells can be identified within microscopic images. For some applications, such methods are performed even without the variation in height between the sample chambers belonging to set 52 being precisely known.


For some applications, based upon the measurand that is being measured, the sample chamber within the sample carrier upon which to perform optical measurements is selected. For example, a sample chamber of the sample carrier having a greater height may be used to perform a white blood cell count (e.g., to reduce statistical errors which may result from a low count in a shallower region), white blood cell differentiation, and/or to detect more rare forms of white blood cells. Conversely, in order to determine mean corpuscular hemoglobin (MCH), mean corpuscular volume (MCV), red blood cell distribution width (RDW), red blood cell morphologic features, and/or red blood cell abnormalities, microscopic images may be obtained from a sample chamber of the sample carrier having a relatively low height, since in such sample chambers the cells are relatively sparsely distributed across the area of the region, and/or form a monolayer in which the cells are relatively sparsely distributed. Similarly, in order to count platelets, classify platelets, and/or extract any other attributes (such as volume) of platelets, microscopic images may be obtained from a sample chamber of the sample carrier having a relatively low height, since within such sample chambers there are fewer red blood cells which overlap (fully or partially) with the platelets in microscopic images, and/or in a monolayer.


In accordance with the above-described examples, it is preferable to use a sample chamber of the sample carrier having a lower height for performing optical measurements for measuring some measurands within a sample (such as a blood sample), whereas it is preferable to use a sample chamber of the sample carrier having a greater height for performing optical measurements for measuring other measurands within such a sample. Therefore, for some applications, a first measurand within a sample is measured, by performing a first optical measurement upon (e.g., by acquiring microscopic images of) a portion of the sample that is disposed within a first sample chamber belonging to set 52 of the sample carrier, and a second measurand of the same sample is measured, by performing a second optical measurement upon (e.g., by acquiring microscopic images of) a portion of the sample that is disposed within a second sample chamber of set 52 of the sample carrier. For some applications, the first and second measurands are normalized with respect to each other, for example, using techniques as described in US 2019/0145963 to Zait, which is incorporated herein by reference.


Typically, in order to perform optical density measurements upon the sample, it is desirable to know the optical path length, the volume, and/or the thickness of the portion of the sample upon which the optical measurements were performed, as precisely as possible. Typically, an optical density measurement is performed on the second portion of the sample (which is typically placed into second set 54 of sample chambers in an undiluted form). For example, the concentration and/or density of a component may be measured by performing optical absorption, transmittance, fluorescence, and/or luminescence measurements upon the sample.


Referring again to FIG. 3B, for some applications, sample chambers belonging to set 54 (which is used for optical density measurements) typically define at least a first region 56 (which is typically deeper) and a second region 58 (which is typically shallower), the height of the sample chambers varying between the first and second regions in a predefined manner, e.g., as described in US 2019/0302099 to Pollak, which is incorporated herein by reference. The heights of first region 56 and second region 58 of the sample chamber are defined by a lower surface that is defined by the glass layer and by an upper surface that is defined by the molded component. The upper surface at the second region is stepped with respect to the upper surface at the first region. The step between the upper surface at the first and second regions provides a predefined height difference Δh between the regions, such that even if the absolute height of the regions is not known to a sufficient degree of accuracy (for example, due to tolerances in the manufacturing process), the height difference Δh is known to a sufficient degree of accuracy to determine a parameter of the sample, using the techniques described herein, and as described in US 2019/0302099 to Pollak, which is incorporated herein by reference. For some applications, the height of the sample chamber varies from the first region 56 to the second region 58, and the height then varies again from the second region to a third region 59, such that, along the sample chamber, first region 56 defines a maximum height region, second region 58 defines a medium height region, and third region 59 defines a minimum height region. For some applications, additional variations in height occur along the length of the sample chamber, and/or the height varies gradually along the length of the sample chamber.
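As a rough illustration of why only the height difference needs to be known accurately (this derivation is not given in the text and assumes simple Beer-Lambert absorption; all names and numbers are hypothetical), absorbance readings from the two regions can be subtracted so that the unknown absolute heights cancel:

def concentration_from_two_regions(a_deep, a_shallow, delta_h_cm, epsilon):
    # Illustrative only. Assuming Beer-Lambert behavior, A = epsilon * c * h,
    # the readings from the deeper and shallower regions give
    # a_deep - a_shallow = epsilon * c * delta_h, so the concentration c can
    # be estimated from the predefined height difference delta_h alone,
    # without knowing the absolute heights of the two regions.
    return (a_deep - a_shallow) / (epsilon * delta_h_cm)

# Example: absorbances of 0.80 and 0.35, delta_h of 0.01 cm, epsilon of 9.0
c = concentration_from_two_regions(0.80, 0.35, 0.01, 9.0)  # = 5.0 (arbitrary units)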


As described hereinabove, while the sample is disposed in the sample carrier, optical measurements are performed upon the sample using one or more optical measurement devices 24. Typically, the sample is viewed by the optical measurement devices via the glass layer, glass being transparent at least to wavelengths that are typically used by the optical measurement device. Typically, the sample carrier is inserted into optical measurement unit 31, which houses the optical measurement device while the optical measurements are performed. Typically, the optical measurement unit houses the sample carrier such that the molded component is disposed above the glass layer, and such that the optical measurement device is disposed below the glass layer of the sample carrier and is able to perform optical measurements upon the sample via the glass layer. The sample carrier is formed by adhering the glass layer to the molded component. For example, the glass layer and the molded component may be bonded to each other during manufacture or assembly (e.g. using thermal bonding, solvent-assisted bonding, ultrasonic welding, laser welding, heat staking, adhesive, mechanical clamping and/or additional substrates). For some applications, the glass layer and the molded component are bonded to each other during manufacture or assembly using adhesive layer 46.


For some microscopy applications, microscopic images of imaging fields are acquired using a plurality of different imaging modalities. For example, as described hereinabove, brightfield images may be acquired under illumination of the sample at several, respective, different wavelength bands. The brightfield images may be acquired while cells (e.g., a monolayer of the cells) are in focus or out of focus. Alternatively or additionally, fluorescent images are acquired by exciting stained objects (i.e., objects that have absorbed the stain(s)) within the sample by directing light toward the sample at known excitation wavelengths (i.e., wavelengths at which it is known that stained objects emit fluorescent light if excited with light at those wavelengths), and detecting the fluorescent light. Respective fluorescent images are acquired by exciting the sample with light at respective, different wavelength bands, or by exciting the sample with light at a given wavelength band and then using emission filters that filter light that is emitted from the sample at respective wavelength bands.


Typically, the computer processor analyzes the microscopic images and/or other data relating to the sample (e.g., optical absorption measurements), in order to determine properties of the sample. For some applications, the computer processor additionally outputs images of the sample to a user via output device 34. It may be challenging, though, for a human observer to extract useful information from the images, especially if that information is contained in the overlap between images that were acquired using respective, different imaging modalities and these images are overlaid upon each other as black-and-white or grayscale images. For example, in order to verify that an element is an intraerythrocytic parasite, it may be helpful to see a single image in which the parasite candidate is visible and red blood cells are visible. The red blood cells are typically visible in brightfield images (e.g., brightfield images acquired under violet illumination), whereas the parasites are typically visible in fluorescent images. Therefore, it is helpful to see such images overlaid upon each other, but in which elements from the respective imaging modalities are visible without interfering with each other. Similarly, in order to see morphological features of white blood cells (which can help in the classification of an element as a white blood cell, and/or as a given type of white blood cell), it is typically helpful to see respective fluorescent images acquired under respective fluorescent illumination conditions overlaid upon each other.


Therefore, in accordance with some applications of the present invention, a plurality of images of a microscopic imaging field of a blood sample are acquired, each of the images being acquired using respective, different imaging conditions. Typically, at least one of the images is a brightfield image that is acquired under violet lighting conditions (e.g., under lighting by light at a wavelength within the range of 400 nm-450 nm). For some applications, the brightfield image is an off-focus image that is acquired under violet lighting conditions. Further typically, at least one of the images is a fluorescent image. A computer processor combines data from each of the plurality of images such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image. Typically, one or more color models such as RGB, CIE, HSV, and/or a combination thereof is used to generate the artificial color microscopic image.


Typically, the image that was acquired under brightfield, violet lighting conditions is mapped to a red channel of the artificial color microscopic image. Further typically, the image is converted to a negative contrast image before being mapped to the red channel. For some applications, the result of mapping the negative contrast image of the image acquired under brightfield, violet lighting conditions to the red channel is that red blood cells have an appearance similar to that of red blood cells in a color smear image (e.g., similar to those generated using Giemsa or Wright-Romanowsky smear staining). Typically, the brightfield image that was acquired under violet lighting conditions is used in the aforementioned manner, since violet light is absorbed strongly by hemoglobin and therefore red blood cells appear as red once the contrast of the image is made negative and the image is mapped to the red channel.


For some applications, three images are acquired under respective imaging modalities. For example, in addition to the image acquired under brightfield violet lighting conditions, two fluorescent images may be acquired. For example, the two fluorescent images may be acquired after exciting the blood sample with light at respective wavelength bands. Alternatively, the two fluorescent images may be acquired after exciting the sample with light at the same wavelength band, but using respective, different emission filters. Typically, the second image is mapped to a second color channel of the artificial color microscopic image, and the third image is mapped to a third color channel of the artificial color microscopic image. For example, when an RGB color model is used, the first image may be mapped to the red channel (as described above), the second image mapped to the green channel, and the third image mapped to the blue channel. For some applications, one of the second and third images is acquired while the sample is excited using light (e.g., UV light) that causes cell nuclei (e.g., DNA of the cell nuclei) to fluoresce. Alternatively or additionally, a second one of the second and third images is acquired while the sample is excited using light (e.g., blue light) that causes RNA and/or cytoplasm to fluoresce. For some applications, imaging modalities are used that are similar to those used in images generated using Giemsa or Wright-Romanowsky smear staining.


For some applications, each of the fluorescent images is acquired using a relatively long exposure time. For example, this may be used in order to visualize reticulocytes as well as platelets. Alternatively, one of the fluorescent images may be acquired using a relatively short exposure time, and the other one of the fluorescent images may be acquired using a relatively long exposure time. The long and short exposure fluorescent images typically contain different information. The images acquired using the short exposure are typically optimized to provide data relating to white blood cells and other high intensity objects, while the images acquired using the long exposure time are typically optimized to provide data relating to low intensity objects such as reticulocytes, platelets, parasites, ghost cells, etc.


For some applications, the short-exposure-time images are combined with the long-exposure-time images into a single fluorescent image (for example, by replacing over-exposed regions in the long-exposure-time image with the corresponding region in the short-exposure-time image). For some applications, the resultant composite image (and/or a composite image that is generated using a different composite-image-generation technique) is mapped to one of the channels of an artificial color image, e.g., using the techniques described hereinabove.
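A minimal sketch of one such compositing step follows (the assumptions here are not taken from the disclosure: both fluorescent images are registered 8-bit arrays, saturation indicates over-exposure, and the exposure ratio is known; the names are hypothetical).

import numpy as np

def composite_exposures(long_exp, short_exp, exposure_ratio, sat_level=250):
    # Replace over-exposed pixels of the long-exposure fluorescent image with
    # the corresponding short-exposure pixels, rescaled by the exposure ratio
    # so that intensities remain comparable. Illustrative only.
    out = long_exp.astype(np.float32)
    overexposed = long_exp >= sat_level
    out[overexposed] = short_exp[overexposed].astype(np.float32) * exposure_ratio
    return np.clip(out, 0, 255).astype(np.uint8)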


For some applications, a neural network is used in the generation of an artificial color image. In some cases, an artificial color image generated using the methods described hereinabove may have different characteristics from the type of images that are commonly used in the field. For example, such images may differ from standard images in color, intensity resolution, shading, etc. For some applications, a convolutional neural network is used to generate an image that is more similar to standard images in the field, such that the image has a similar appearance to that of a color smear image (e.g., similar to an image generated using Giemsa or Wright-Romanowsky smear staining).
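For illustration only, a small fully convolutional network of the kind that could be trained for such image-to-image refinement is sketched below using PyTorch; the architecture, layer sizes, and the absence of any training code are assumptions rather than details taken from the disclosure.

import torch
import torch.nn as nn

class SmearStyleNet(nn.Module):
    # Toy fully convolutional network that maps a 3-channel artificial color
    # image to a 3-channel image intended to resemble a stained smear. A real
    # system might use a deeper network (e.g., a U-Net) trained on paired or
    # unpaired smear images; this is illustrative only.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.net(x)

# Example: run an untrained network on a dummy artificial color image.
model = SmearStyleNet()
dummy = torch.rand(1, 3, 256, 256)  # batch of one 256x256 RGB image
smear_like = model(dummy)           # output values in [0, 1]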


For some applications, one or more of the images that are mapped to the color image is normalized. For example, the image may be normalized by dividing the image by a background map. Alternatively or additionally, a function of the image (such as optical density) may be used in the color image. For some applications, the displayed color image is normalized such that relevant features are of similar magnitude in all of the channels. For some applications, one or more of the original images, and/or the displayed color image is normalized by determining a maximum intensity within the image, removing all pixels having an intensity that is less than a given proportion of the maximum intensity (e.g., less than half of the maximum intensity), and renormalizing the pixel intensity as described below. For some applications, one or more of the original images, and/or the displayed color image is normalized in the following manner. An intensity histogram of the image is generated. For each pixel within the image that has an intensity that is at least equal to half of the maximum intensity, a closest local maximum in the intensity histogram having an intensity that is greater than half of the maximum intensity within the image is identified. The intensity of the pixel is then normalized based upon the difference between the maximum intensity and the intensity of the local maximum. For example, a given pixel may be assigned an intensity based upon the following formula:

INp=N*(Ip−Vmin)/(Vmax−Vmin), for Vmin<=Ip<=Vmax;
INp=N for Ip>Vmax;
INp=0 for Ip<Vmin

where:

    • INp is the normalized intensity of the pixel,
    • N is an integer (e.g., 255),
    • Ip is the original intensity of the pixel,
    • Vmax is the maximum intensity within the image, and
    • Vmin is the intensity of the closest local maximum having an intensity that is greater than half of the maximum intensity within the image.
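A minimal sketch of one possible reading of this normalization follows, assuming 8-bit grayscale input and taking Vmin to be the lowest histogram local maximum above half the image's maximum intensity; this interpretation and the helper names are assumptions, not details from the disclosure.

import numpy as np

def normalize_channel(img, n=255):
    # Normalize one grayscale image per the scheme described above (one
    # possible interpretation). Pixels below half the maximum intensity are
    # zeroed; remaining pixels are rescaled between Vmin (a histogram local
    # maximum above half the maximum intensity) and Vmax (the image maximum).
    img = img.astype(np.float32)
    v_max = img.max()
    if v_max == 0:
        return np.zeros_like(img, dtype=np.uint8)
    half = v_max / 2.0

    hist, bin_edges = np.histogram(img, bins=256, range=(0, 255))
    # Local maxima: bins whose count exceeds both neighboring bins, restricted
    # to intensities above half the maximum intensity.
    local_max_bins = [
        i for i in range(1, 255)
        if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]
        and bin_edges[i] > half
    ]
    # Fall back to the half-maximum level if no local maximum is found.
    v_min = bin_edges[local_max_bins[0]] if local_max_bins else half

    out = np.zeros_like(img)
    mask = img >= half
    scaled = n * (img - v_min) / (v_max - v_min)
    out[mask] = np.clip(scaled[mask], 0, n)
    return out.astype(np.uint8)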


Reference is now made to FIGS. 4A-D, which are flowcharts showing steps of methods that are performed, in accordance with some applications of the present invention, using the techniques described hereinabove.


Referring to FIG. 4A, for some applications, in step 100, a plurality of images of a microscopic imaging field of the blood sample are acquired, each of the images being acquired using respective, different imaging conditions. Subsequently, in step 102, data from each of the plurality of images are combined such as to generate an artificial color microscopic image of the microscopic imaging field that appears like a color smear image. Step 102 is typically performed by computer processor 28.


Referring to FIG. 4B, for some applications, in step 110, three images of a microscopic imaging field of the blood sample are acquired using the microscope, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging. Subsequently, in step 112, an artificial color microscopic image of the microscopic imaging field is generated, by mapping the first one of the three images to a red channel of the artificial color microscopic image (sub-step 114), mapping a second one of the three images to a second color channel of the artificial color microscopic image (sub-step 116), and mapping a third one of the three images to a third color channel of the artificial color microscopic image (sub-step 118). Step 112, and sub-steps 114-118, are typically performed by computer processor 28.


Referring to FIG. 4C, for some applications, in step 120, three images of a microscopic imaging field of the blood sample are acquired using the microscope, each of the images being acquired using respective, different imaging conditions. Subsequently, in step 122, an artificial color microscopic image of the microscopic imaging field is generated, by generating normalized versions of each of the images, such as to remove pixels within the image having an intensity that is below a threshold (sub-step 124), and mapping the normalized version of each one of the images to a respective, different channel within an additive color model (sub-step 126). Step 122, and sub-steps 124-126, are typically performed by computer processor 28.


Referring to FIG. 4D, for some applications, in step 130, three images of a microscopic imaging field of the blood sample are acquired using the microscope, each of the images being acquired using respective, different imaging conditions. Subsequently, in step 132, an artificial color microscopic image of the microscopic imaging field is generated, by mapping each one of the images to a respective, different channel within an additive color model to generate an initial color image (sub-step 134), and generating a normalized version of the initial color image, such as to remove pixels within the image having an intensity that is below a threshold (sub-step 136). Step 132, and sub-steps 134-136, are typically performed by computer processor 28.
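
By way of non-limiting illustration, the two orderings (normalizing each image before mapping, per FIG. 4C, and mapping before normalizing the initial color image, per FIG. 4D) may be sketched in Python using the illustrative helpers defined above; the synthetic input images and the per-channel normalization of the initial color image are assumptions of the sketch:

import numpy as np

# Synthetic single-channel acquisitions standing in for the three images of steps 120/130.
rng = np.random.default_rng(0)
violet_bf, img2, img3 = (rng.integers(0, 256, size=(64, 64)).astype(np.float32) for _ in range(3))

# FIG. 4C order: normalize each image (sub-step 124), then map to channels (sub-step 126).
normalized = [normalize_channel(im) for im in (violet_bf, img2, img3)]
rgb_4c = make_artificial_rgb(*normalized)

# FIG. 4D order: map to channels first (sub-step 134), then normalize the
# initial color image, here channel by channel (sub-step 136).
initial_rgb = make_artificial_rgb(violet_bf, img2, img3).astype(np.float32)
rgb_4d = np.stack([normalize_channel(initial_rgb[..., c]) for c in range(3)], axis=-1).astype(np.uint8)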


For some applications, the apparatus and methods described herein are applied to a biological sample, such as blood, saliva, semen, sweat, sputum, vaginal fluid, stool, breast milk, bronchoalveolar lavage, gastric lavage, tears, and/or nasal discharge, mutatis mutandis. The biological sample may be from any living creature, and is typically from warm-blooded animals. For some applications, the biological sample is a sample from a mammal, e.g., from a human body. For some applications, the sample is taken from domestic animals, zoo animals, or farm animals, including but not limited to dogs, cats, horses, cows, and sheep. Alternatively or additionally, the biological sample is taken from animals that act as disease vectors, such as deer or rats.


For some applications, the apparatus and methods described herein are applied to a non-bodily sample. For some applications, the sample is an environmental sample, such as a water (e.g., groundwater) sample, surface swab, soil sample, air sample, or any combination thereof, mutatis mutandis. For some applications, the sample is a food sample, such as a meat sample, dairy sample, water sample, wash-liquid sample, beverage sample, and/or any combination thereof.


For some applications, the sample as described herein is a sample that includes blood or components thereof (e.g., a diluted or non-diluted whole blood sample, a sample including predominantly red blood cells, or a diluted sample including predominantly red blood cells), and parameters are determined relating to components in the blood such as platelets, white blood cells, anomalous white blood cells, circulating tumor cells, red blood cells, reticulocytes, Howell-Jolly bodies, etc.


Applications of the invention described herein can take the form of a computer program product accessible from a computer-usable or computer-readable medium (e.g., a non-transitory computer-readable medium) providing program code for use by or in connection with a computer or any instruction execution system, such as computer processor 28. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Typically, the computer-usable or computer readable medium is a non-transitory computer-usable or computer readable medium.


Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random-access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.


A data processing system suitable for storing and/or executing program code will include at least one processor (e.g., computer processor 28) coupled directly or indirectly to memory elements (e.g., memory 30) through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. The system can read the inventive instructions on the program storage devices and follow these instructions to execute the methodology of the embodiments of the invention.


Network adapters may be coupled to the processor to enable the processor to become coupled to other processors or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.


Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the C programming language or similar programming languages.


It will be understood that algorithms described herein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer (e.g., computer processor 28) or other programmable data processing apparatus, create means for implementing the functions/acts specified in the algorithms described in the present application. These computer program instructions may also be stored in a computer-readable medium (e.g., a non-transitory computer-readable medium) that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart blocks and algorithms. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the algorithms described in the present application.


Computer processor 28 is typically a hardware device programmed with computer program instructions to produce a special purpose computer. For example, when programmed to perform the algorithms described herein, computer processor 28 typically acts as a special purpose artificial-image-generation computer processor. Typically, the operations described herein that are performed by computer processor 28 transform the physical state of memory 30, which is a real physical article, to have a different magnetic polarity, electrical charge, or the like depending on the technology of the memory that is used.


The apparatus and methods described herein may be used in conjunction with apparatus and methods described in any one of the following patents or patent applications, all of which are incorporated herein by reference:

    • U.S. Pat. No. 9,522,396 to Bachelet;
    • U.S. Pat. No. 10,176,565 to Greenfield;
    • U.S. Pat. No. 10,640,807 to Pollak;
    • U.S. Pat. No. 9,329,129 to Pollak;
    • U.S. Pat. No. 10,093,957 to Pollak;
    • U.S. Pat. No. 10,831,013 to Yorav-Raphael;
    • U.S. Pat. No. 10,843,190 to Bachelet;
    • U.S. Pat. No. 10,482,595 to Yorav-Raphael;
    • U.S. Pat. No. 10,488,644 to Eshel;
    • WO 17/168411 to Eshel;
    • US 2019/0302099 to Pollak;
    • US 2019/0145963 to Zait; and
    • WO 19/097387 to Yorav-Raphael.


It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof that are not in the prior art, which would occur to persons skilled in the art upon reading the foregoing description.

Claims
  • 1. A method for use with a blood sample, the method comprising: using a microscope, acquiring three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging; and using at least one computer processor, generating an artificial color microscopic image of the microscopic imaging field, by: mapping the first one of the three images to a red channel of the artificial color microscopic image; mapping a second one of the three images to a second color channel of the artificial color microscopic image; and mapping a third one of the three images to a third color channel of the artificial color microscopic image.
  • 2. The method according to claim 1, wherein the first one of the three images is an image acquired under off-focus, violet-light brightfield imaging conditions.
  • 3. The method according to claim 1, wherein generating the artificial color microscopic image of the microscopic imaging field comprises using a neural network to generate the artificial color microscopic image of the microscopic imaging field.
  • 4. The method according to claim 1, wherein generating the artificial color microscopic image of the microscopic imaging field comprises using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.
  • 5. The method according to claim 1, wherein mapping the first one of the three images to the red channel of the artificial RGB microscopic image comprises generating a negative contrast image of the first one of the three images and mapping the negative contrast image to the red channel of the artificial RGB microscopic image.
  • 6. Apparatus for use with a blood sample, the apparatus comprising: a microscope configured to acquire three images of a microscopic imaging field of the blood sample, each of the images being acquired using respective, different imaging conditions, and the first one of the three images being acquired under violet-light brightfield imaging; an output device; and at least one computer processor configured to generate an artificial color microscopic image of the microscopic imaging field upon the output device, by: mapping the first one of the three images to a red channel of the artificial color microscopic image, mapping a second one of the three images to a second color channel of the artificial color microscopic image, and mapping a third one of the three images to a third color channel of the artificial color microscopic image.
  • 7. The apparatus according to claim 6, wherein the microscope is configured to acquire the first one of the three images under off-focus, violet-light brightfield imaging conditions.
  • 8. The apparatus according to claim 6, wherein the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a neural network.
  • 9. The apparatus according to claim 6, wherein the computer processor is configured to generate the artificial color microscopic image of the microscopic imaging field using a color model selected from the group consisting of: RGB, CIE, HSV, and a combination thereof.
  • 10. The apparatus according to claim 6, wherein the computer processor is configured to map the first one of the three images to the red channel of the artificial RGB microscopic image by generating a negative contrast image of the first one of the three images and mapping the negative contrast image to the red channel of the artificial RGB microscopic image.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a US national phase application of PCT Application No. PCT/IB2020/061736 to Halperin (published as WO 21/116962), which claims priority from U.S. Provisional Patent Application No. 62/946,988 to Halperin et al., filed Dec. 12, 2019, entitled "Artificial generation of a color blood smear image," which is incorporated herein by reference.

PCT Information
    Filing Document: PCT/IB2020/061736; Filing Date: Dec. 10, 2020; Country: WO
    Publishing Document: WO 2021/116962; Publishing Date: Jun. 17, 2021; Country: WO; Kind: A
US Referenced Citations (426)
Number Name Date Kind
3203768 Tiller et al. Aug 1965 A
3603156 Konkol Sep 1971 A
3676076 Grady Jul 1972 A
3786184 Pieters Jan 1974 A
3916205 Kleinerman Oct 1975 A
3967056 Yata et al. Jun 1976 A
4030888 Yamamoto Jun 1977 A
4076419 Kleker Feb 1978 A
4097845 Bacus Jun 1978 A
4199748 Bacus Apr 1980 A
4209548 Bacus Sep 1980 A
4350884 Vollath Sep 1982 A
4453266 Bacus Jun 1984 A
4454235 Johnson Jun 1984 A
4494479 Drury et al. Jan 1985 A
4580895 Patel Apr 1986 A
4700298 Palcic et al. Oct 1987 A
4761381 Blatt et al. Aug 1988 A
4774192 Terminiello et al. Sep 1988 A
4803352 Bierleutgeb Feb 1989 A
4849430 Fleet Jul 1989 A
4851330 Kohne Jul 1989 A
4902101 Fujihara et al. Feb 1990 A
5001067 Coleman et al. Mar 1991 A
5064282 Curtis Nov 1991 A
5229265 Tometsko Jul 1993 A
5300779 Hillman et al. Apr 1994 A
5331958 Oppenheimer Jul 1994 A
5430542 Shepherd et al. Jul 1995 A
5470751 Sakata et al. Nov 1995 A
5499097 Ortyn et al. Mar 1996 A
5566249 Rosenlof et al. Oct 1996 A
5625706 Lee et al. Apr 1997 A
5663057 Drocourt et al. Sep 1997 A
5671288 Wilhelm et al. Sep 1997 A
5672861 Fairley et al. Sep 1997 A
5674457 Williamsson et al. Oct 1997 A
5745804 Iwane Apr 1998 A
5782770 Mooradian et al. Jul 1998 A
5827190 Palcic et al. Oct 1998 A
5834217 Levine et al. Nov 1998 A
5932872 Price Aug 1999 A
5948686 Wardlaw et al. Sep 1999 A
5978497 Lee et al. Nov 1999 A
5985595 Krider et al. Nov 1999 A
5993702 Davis Nov 1999 A
6005964 Reid et al. Dec 1999 A
6007996 McNamara et al. Dec 1999 A
6027695 Oldenburg Feb 2000 A
6064474 Lee May 2000 A
6074879 Zelmanovic Jun 2000 A
6101404 Yoon Aug 2000 A
6235536 Wardlaw May 2001 B1
6262423 Hell et al. Jul 2001 B1
6262798 Shepherd et al. Jul 2001 B1
6320979 Melen Nov 2001 B1
6330348 Kerschmann et al. Dec 2001 B1
6339472 Hafeman Jan 2002 B1
6350613 Wardlaw et al. Feb 2002 B1
6448024 Bruegger Sep 2002 B1
6519355 Nelson Feb 2003 B2
6554788 Hunley et al. Apr 2003 B1
6582964 Samsoondar et al. Jun 2003 B1
6611777 Samsoondar Aug 2003 B2
6632681 Chu Oct 2003 B1
6658143 Hansen Dec 2003 B2
6664528 Cartlidge et al. Dec 2003 B1
6711516 Samsoondar Mar 2004 B2
6799119 Voorhees et al. Sep 2004 B1
6819408 Scrivens et al. Nov 2004 B1
6831733 Pettersson Dec 2004 B2
6834237 Noergaard et al. Dec 2004 B2
6836559 Abdel-fattah Dec 2004 B2
6842233 Narisada Jan 2005 B2
6866823 Wardlaw Mar 2005 B2
6872930 Cartlidge et al. Mar 2005 B2
6898451 Wuori May 2005 B2
6903323 Cartlidge et al. Jun 2005 B2
6929953 Wardlaw et al. Aug 2005 B1
6949384 Samsoondar Sep 2005 B2
6955872 Maples et al. Oct 2005 B2
6956650 Boas Oct 2005 B2
6989891 Braig et al. Jan 2006 B2
7027628 Gagnon Apr 2006 B1
7030351 Wasserman Apr 2006 B2
7034883 Rosenqvist Apr 2006 B1
7105795 Cartlidge et al. Sep 2006 B2
7132636 Cartlidge Nov 2006 B1
7133547 Marcelpoil Nov 2006 B2
7151246 Fein et al. Dec 2006 B2
7155049 Wetzel Dec 2006 B2
7248716 Fein et al. Jul 2007 B2
7274810 Reeves et al. Sep 2007 B2
7283217 Ikeuchi Oct 2007 B2
7288751 Cartlidge et al. Oct 2007 B2
7305109 Gagnon Dec 2007 B1
7324694 Chapoulaud Jan 2008 B2
7329537 Qiu Feb 2008 B2
7338168 Cartlidge et al. Mar 2008 B2
7344890 Perez et al. Mar 2008 B2
7346205 Walker, Jr. Mar 2008 B2
7369696 Arini et al. May 2008 B2
7387898 Gordon Jun 2008 B1
7411680 Chang Aug 2008 B2
7417213 Krief et al. Aug 2008 B2
7385168 Cartlidge et al. Sep 2008 B2
7425421 Dertinger Sep 2008 B2
7439478 Cartlidge et al. Oct 2008 B2
7450223 Ikeuchi Nov 2008 B2
7450762 Morell Nov 2008 B2
7460222 Kalveram Dec 2008 B2
7490085 Walker et al. Feb 2009 B2
7493219 Qi Feb 2009 B1
7580120 Hamada Aug 2009 B2
7599893 Sapir Oct 2009 B2
7601938 Cartlidge et al. Oct 2009 B2
7602954 Marcelpoil Oct 2009 B2
7605356 Krief Oct 2009 B2
7609369 Simon-Lopez Oct 2009 B2
7630063 Padmanabhan Dec 2009 B2
7633604 Ikeuchi Dec 2009 B2
7638748 Krief et al. Dec 2009 B2
7663738 Johansson Feb 2010 B2
7668362 Olson Feb 2010 B2
7692131 Fein et al. Apr 2010 B2
7697764 Kataoka Apr 2010 B2
7702181 Gouch Apr 2010 B2
7706862 Alfano et al. Apr 2010 B2
7713474 Schulman et al. May 2010 B2
7747153 Ibaraki Jun 2010 B2
7765069 Ostoich Jul 2010 B2
7777869 Nerin Aug 2010 B2
7787109 Dosmann et al. Aug 2010 B2
7796797 Nakaya et al. Sep 2010 B2
7863552 Cartlidge et al. Jan 2011 B2
7869009 Dosmann et al. Jan 2011 B2
7894047 Hamada Feb 2011 B2
7911617 Padmanabhan Mar 2011 B2
7925070 Sumida Apr 2011 B2
7929121 Wardlaw Apr 2011 B2
7933435 Hunter Apr 2011 B2
7936913 Nordell May 2011 B2
7951599 Levine May 2011 B2
7995200 Matsumoto Aug 2011 B2
7998435 Reed Aug 2011 B2
8000511 Perz Aug 2011 B2
8044974 Sumida Oct 2011 B2
8045782 Li Oct 2011 B2
8055471 Qi Nov 2011 B2
8064680 Ramoser Nov 2011 B2
8077296 Wardlaw Dec 2011 B2
8081303 Levine Dec 2011 B2
8105554 Kanigan et al. Jan 2012 B2
8125643 Hansen Feb 2012 B2
D655421 Lee et al. Mar 2012 S
8131035 Grady Mar 2012 B2
8131052 Alexandrov Mar 2012 B2
8150114 Svanberg Apr 2012 B2
8154713 Simon-Lopez Apr 2012 B2
8165385 Reeves Apr 2012 B2
8175353 Westphal May 2012 B2
8179597 Namba May 2012 B2
8184273 Dosmann May 2012 B2
8192995 Zhang et al. Jun 2012 B2
8216832 Battrell et al. Jul 2012 B2
8224058 Lindberg Jul 2012 B2
8269954 Levine Sep 2012 B2
8280134 Hoyt Oct 2012 B2
8310659 Wardlaw Nov 2012 B2
8320655 Sarachan Nov 2012 B2
8327724 Fairs Dec 2012 B2
8331642 Zerfass Dec 2012 B2
8339586 Zahniser Dec 2012 B2
8345227 Zahniser Jan 2013 B2
8351676 Dai Jan 2013 B2
8363221 Hansen Jan 2013 B2
8379944 Grady Feb 2013 B2
8406498 Ortyn Mar 2013 B2
8428331 Dimarzio Apr 2013 B2
8432392 Kim Apr 2013 B2
8477294 Zahniser Jul 2013 B2
8481303 Faris et al. Jul 2013 B2
8488111 Zahniser Jul 2013 B2
8491499 Choi et al. Jul 2013 B2
8526704 Dobbe Sep 2013 B2
8570496 Chen Oct 2013 B2
8582924 De La Torre-bueno Nov 2013 B2
8638427 Wardlaw Jan 2014 B2
8712142 Rajpoot Apr 2014 B2
8736824 Matsui May 2014 B2
8744165 Liu Jun 2014 B2
8778687 Levine Jul 2014 B2
8787650 Marugame Jul 2014 B2
8792693 Satish Jul 2014 B2
8837803 Wang et al. Sep 2014 B2
8849024 Shinoda Sep 2014 B2
8873827 Mcculloch Oct 2014 B2
8877458 Maurer Nov 2014 B2
8878923 Henderson Nov 2014 B2
8885154 Wardlaw Nov 2014 B2
8885912 Sui Nov 2014 B2
8891851 Spaulding Nov 2014 B2
8922761 Zahniser Dec 2014 B2
8942458 Takahashi Jan 2015 B2
8964171 Zahniser Feb 2015 B2
8992750 Beaty Mar 2015 B1
8994930 Levine Mar 2015 B2
9012868 Courtney et al. Apr 2015 B2
9041792 Van Leeuwen May 2015 B2
9050595 Miller et al. Jun 2015 B2
9064301 Zie et al. Jun 2015 B2
9046473 Levine Sep 2015 B2
9176121 Winkelman et al. Nov 2015 B2
9186843 Chan et al. Nov 2015 B2
9240043 Christiansen Jan 2016 B2
9322767 Ehrenkranz Apr 2016 B2
9329129 Pollak et al. May 2016 B2
9342734 Lin et al. May 2016 B2
9404852 Braig et al. Aug 2016 B2
9470609 Wimberger-friedl Oct 2016 B2
9477875 Ohya Oct 2016 B2
9522396 Bachelet Dec 2016 B2
9528978 Yamada Dec 2016 B2
9588033 Zahniser et al. Mar 2017 B2
9767343 Jones et al. Sep 2017 B1
9820990 Pak et al. Nov 2017 B2
9933363 Danuser et al. Apr 2018 B2
9934571 Ozaki Apr 2018 B2
9976945 Kendall et al. May 2018 B2
10024858 Smith et al. Jul 2018 B2
10061972 Champlin Aug 2018 B2
10093957 Pollak et al. Oct 2018 B2
10169861 Ozaki et al. Jan 2019 B2
10176565 Greenfield Jan 2019 B2
10281386 Hsu et al. May 2019 B2
10395368 Berezhna Aug 2019 B2
10482595 Yorav-Raphael Nov 2019 B2
10488644 Eshel Nov 2019 B2
10508983 Kendall et al. Dec 2019 B2
10527635 Bhatia Jan 2020 B1
10640807 Pollak May 2020 B2
10663712 Eshel May 2020 B2
10843190 Bachelet Nov 2020 B2
11099175 Zait et al. Aug 2021 B2
11199690 Eshel Dec 2021 B2
11609413 Yorav-Raphael et al. Mar 2023 B2
20020009711 Wada et al. Jan 2002 A1
20020028158 Wardlaw et al. Mar 2002 A1
20020028471 Oberhardt Mar 2002 A1
20030017085 Kercso et al. Mar 2003 A1
20030161514 Curry Aug 2003 A1
20030170613 Straus Sep 2003 A1
20030197925 Hamborg Oct 2003 A1
20030208140 Pugh Nov 2003 A1
20030224522 de Jong Dec 2003 A1
20030227612 Fein et al. Dec 2003 A1
20030227673 Nakagawa Dec 2003 A1
20030231791 Torre-Bueno et al. Dec 2003 A1
20040240050 Ogihara Feb 2004 A1
20040241677 Lin et al. Feb 2004 A1
20040054283 Corey et al. Mar 2004 A1
20040122216 Nielsen Jun 2004 A1
20040132171 Rule et al. Jul 2004 A1
20040170312 Soenksen Sep 2004 A1
20040185447 Maples et al. Sep 2004 A1
20040218804 Affleck et al. Nov 2004 A1
20050089208 Dong et al. Apr 2005 A1
20050109959 Wasserman et al. May 2005 A1
20050175992 Aberl et al. Aug 2005 A1
20050286800 Gouch Dec 2005 A1
20060002817 Bohm Jan 2006 A1
20060003458 Golovchenko et al. Jan 2006 A1
20060045505 Zeineh Mar 2006 A1
20060051778 Kallick Mar 2006 A1
20060063185 Vannier Mar 2006 A1
20060079144 Klisch et al. Apr 2006 A1
20060187442 Chang et al. Aug 2006 A1
20060190226 Jojic et al. Aug 2006 A1
20060222567 Kloepfer et al. Oct 2006 A1
20060223052 MacDonald et al. Oct 2006 A1
20060223165 Chang et al. Oct 2006 A1
20070252984 Van Beek et al. Jan 2007 A1
20070054350 Walker, Jr. Mar 2007 A1
20070076190 Nakaya et al. Apr 2007 A1
20070161075 Gleich Jul 2007 A1
20070172956 Magari et al. Jul 2007 A1
20070231914 Deng et al. Oct 2007 A1
20070243117 Wardlaw et al. Oct 2007 A1
20070250301 Vaisberg et al. Oct 2007 A1
20080019584 Lindberg et al. Jan 2008 A1
20080020128 van Ryper Jan 2008 A1
20080059135 Murugkar et al. Mar 2008 A1
20080118399 Fleming May 2008 A1
20080187466 Wardlaw et al. Aug 2008 A1
20080212069 Goldberg et al. Sep 2008 A1
20080260369 Ibaraki Oct 2008 A1
20080273776 Krief et al. Nov 2008 A1
20080305514 Alford et al. Dec 2008 A1
20090066934 Gao et al. Mar 2009 A1
20090074282 Pinard et al. Mar 2009 A1
20090075324 Pettersson Mar 2009 A1
20090086314 Namba Apr 2009 A1
20090088336 Burd et al. Apr 2009 A1
20090128618 Fahn et al. May 2009 A1
20090185734 Lindberg et al. Jul 2009 A1
20090191098 Beard et al. Jul 2009 A1
20090195688 Henderson et al. Aug 2009 A1
20090213214 Yamada Aug 2009 A1
20090258347 Scott Oct 2009 A1
20090269799 Winkelman et al. Oct 2009 A1
20090291854 Wiesinger-Mayr et al. Nov 2009 A1
20100003265 Scheffler Jan 2010 A1
20100068747 Herrenknecht Mar 2010 A1
20100104169 Yamada Apr 2010 A1
20100112631 Hur et al. May 2010 A1
20100120129 Amshey et al. May 2010 A1
20100136556 Friedberger et al. Jun 2010 A1
20100136570 Goldberg et al. Jun 2010 A1
20100152054 Love Jun 2010 A1
20100157086 Segale et al. Jun 2010 A1
20100172020 Price Jul 2010 A1
20100192706 Fairs Aug 2010 A1
20100232675 Ortyn et al. Sep 2010 A1
20100234703 Sterling et al. Sep 2010 A1
20100253907 Korb Oct 2010 A1
20100254596 Xiong Oct 2010 A1
20100256918 Chen et al. Oct 2010 A1
20100265323 Perz Oct 2010 A1
20100272334 Yamada et al. Oct 2010 A1
20100295998 Sakai et al. Nov 2010 A1
20100300563 Ramunas et al. Dec 2010 A1
20110007178 Kahlman Jan 2011 A1
20110009163 Fletcher Jan 2011 A1
20110030458 Park et al. Feb 2011 A1
20110059481 Wardlaw et al. Mar 2011 A1
20110102571 Yoneyama May 2011 A1
20110123398 Carrhilo et al. May 2011 A1
20110144480 Lu et al. Jun 2011 A1
20110149097 Danuser et al. Jun 2011 A1
20110151502 Kendall et al. Jun 2011 A1
20110178716 Krockenberger et al. Jul 2011 A1
20110212486 Yamada Sep 2011 A1
20110243794 Wardlaw Oct 2011 A1
20110249910 Henderson Oct 2011 A1
20110275111 Pettigrew et al. Nov 2011 A1
20110301012 Dolecek et al. Dec 2011 A1
20120002195 Wu et al. Jan 2012 A1
20120021951 Hess et al. Jan 2012 A1
20120030618 Leong et al. Feb 2012 A1
20120044342 Hing et al. Feb 2012 A1
20120058504 Li et al. Mar 2012 A1
20120092477 Kawano et al. Apr 2012 A1
20120120221 Dong et al. May 2012 A1
20120169863 Bachelet et al. Jul 2012 A1
20120225446 Wimberger-friedl et al. Sep 2012 A1
20120237107 Tawfik et al. Sep 2012 A1
20120275671 Eichhorn et al. Nov 2012 A1
20120312957 Loney et al. Dec 2012 A1
20120320045 Yao Dec 2012 A1
20130023007 Zahniser et al. Jan 2013 A1
20130078668 Levine et al. Mar 2013 A1
20130130262 Battrell et al. May 2013 A1
20130169948 Xie Jul 2013 A1
20130170730 Yu et al. Jul 2013 A1
20130176551 Wardlaw et al. Jul 2013 A1
20130177974 Mamghani et al. Jul 2013 A1
20130203082 Gonda et al. Aug 2013 A1
20130273968 Rhoads et al. Oct 2013 A1
20130284924 Mizuochi et al. Oct 2013 A1
20130290225 Kamath et al. Oct 2013 A1
20130323757 Poher Dec 2013 A1
20140139625 Mathuis et al. May 2014 A1
20140139630 Kowalevicz May 2014 A1
20140185906 Ding et al. Jul 2014 A1
20140186859 Calderwood et al. Jul 2014 A1
20140205176 Obrien et al. Jul 2014 A1
20140270425 Kenny et al. Sep 2014 A1
20140273064 Smith et al. Sep 2014 A1
20140347459 Greenfield et al. Nov 2014 A1
20140347463 Lin Nov 2014 A1
20140353524 Danuser et al. Dec 2014 A1
20150037806 Pollak et al. Feb 2015 A1
20150124082 Kato et al. May 2015 A1
20150183153 Chan et al. Jul 2015 A1
20150190063 Zakharov et al. Jul 2015 A1
20150246170 Miao et al. Sep 2015 A1
20150278575 Allano et al. Oct 2015 A1
20150302237 Ohya et al. Oct 2015 A1
20150316477 Pollak et al. Nov 2015 A1
20160042507 Turner Feb 2016 A1
20160146750 Hughes et al. May 2016 A1
20160168614 Hunt Jun 2016 A1
20160187235 Fine Jun 2016 A1
20160208306 Pollak et al. Jul 2016 A1
20160246046 Yorav-Raphael et al. Aug 2016 A1
20160250312 Longley Sep 2016 A1
20160279633 Bachelet et al. Sep 2016 A1
20170052110 Malissek et al. Feb 2017 A1
20170115271 Xie et al. Apr 2017 A1
20170146558 Ishii et al. May 2017 A1
20170160185 Minemura et al. Jun 2017 A1
20170191945 Zhang et al. Jul 2017 A1
20170218425 Chen et al. Aug 2017 A1
20170292905 Obrien et al. Oct 2017 A1
20170307496 Zahniser et al. Oct 2017 A1
20170326549 Jones et al. Nov 2017 A1
20170328924 Jones Nov 2017 A1
20180080885 Ginsberg et al. Mar 2018 A1
20180246313 Eshel et al. Aug 2018 A1
20180259318 Yelin et al. Sep 2018 A1
20180296102 Satish et al. Oct 2018 A1
20180297024 Tran Oct 2018 A1
20190002950 Pollak et al. Jan 2019 A1
20190087953 Yorav-Raphael Mar 2019 A1
20190110718 Brittenham Apr 2019 A1
20190130567 Greenfield May 2019 A1
20190145963 Zait May 2019 A1
20190266723 Blanchard et al. Aug 2019 A1
20190302099 Pollak Oct 2019 A1
20190347467 Ohsaka et al. Nov 2019 A1
20200034967 Yorav-Raphael Jan 2020 A1
20200049970 Eshel Feb 2020 A1
20200111209 Greenfield Apr 2020 A1
20200249458 Eshel Aug 2020 A1
20200300750 Eshel Sep 2020 A1
20220189016 Barnes et al. Jun 2022 A1
Foreign Referenced Citations (101)
Number Date Country
2655024 Nov 2014 CA
1918501 Feb 2007 CN
101403650 Apr 2009 CN
101501785 Aug 2009 CN
102282467 Dec 2011 CN
104094118 Oct 2014 CN
105556276 Nov 2018 CN
0073551 Mar 1983 EP
0479231 Apr 1992 EP
1 381 229 Jan 2004 EP
1698883 Sep 2006 EP
2145684 Jan 2010 EP
2 211 165 Jul 2010 EP
3001174 Mar 2016 EP
3123927 Feb 2017 EP
3482189 May 2019 EP
1 873 232 Feb 2020 EP
2329014 Mar 1999 GB
60-162955 Aug 1985 JP
61-198204 Sep 1986 JP
7-504038 Apr 1995 JP
H08-313340 Nov 1996 JP
9-54083 Feb 1997 JP
H11-500832 Jan 1999 JP
H11-73903 Mar 1999 JP
2000-199845 Jul 2000 JP
2002-516982 Jun 2002 JP
2004-144526 May 2004 JP
2004-257768 Sep 2004 JP
2006-506607 Feb 2006 JP
2006-301270 Nov 2006 JP
2007-40814 Feb 2007 JP
2009-180539 Aug 2009 JP
2009-233927 Oct 2009 JP
2009-268432 Nov 2009 JP
2011-95225 May 2011 JP
2013-515264 May 2013 JP
2013-541767 Nov 2013 JP
2014-41139 Mar 2014 JP
2015-57600 Mar 2015 JP
2016-70658 May 2016 JP
2016-528506 Sep 2016 JP
2017-209530 Nov 2017 JP
2018-525611 Sep 2018 JP
6952683 Oct 2021 JP
2402006 Oct 2010 RU
1985005446 Dec 1985 WO
1996001438 Jan 1996 WO
1996012981 May 1996 WO
1996013615 May 1996 WO
2000006765 Feb 2000 WO
2000052195 Sep 2000 WO
2000055572 Sep 2000 WO
2003056327 Jul 2003 WO
2003065358 Aug 2003 WO
2003073365 Sep 2003 WO
2003081525 Oct 2003 WO
2004020112 Mar 2004 WO
2004111610 Dec 2004 WO
2005121863 Dec 2005 WO
2006121266 Nov 2006 WO
2008063135 May 2008 WO
2010036827 Apr 2010 WO
2010056740 May 2010 WO
2010116341 Oct 2010 WO
2010126903 Nov 2010 WO
2010137543 Dec 2010 WO
2011056658 May 2011 WO
2011076413 Jun 2011 WO
2011123070 Oct 2011 WO
2011143075 Nov 2011 WO
2012000102 Jan 2012 WO
2012029269 Mar 2012 WO
2012030313 Mar 2012 WO
2012090198 Jul 2012 WO
2012154333 Nov 2012 WO
2013041951 Mar 2013 WO
2013098821 Jul 2013 WO
2013102076 Jul 2013 WO
2014146063 Sep 2014 WO
2014159620 Oct 2014 WO
2014188405 Nov 2014 WO
2015001553 Jan 2015 WO
2015029032 Mar 2015 WO
2015089632 Jun 2015 WO
2016021311 Feb 2016 WO
2016030897 Mar 2016 WO
2016203320 Dec 2016 WO
2017046799 Mar 2017 WO
2017168411 Oct 2017 WO
2017195205 Nov 2017 WO
2017195208 Nov 2017 WO
2018009920 Jan 2018 WO
2018102748 Jun 2018 WO
2019035084 Feb 2019 WO
2019097387 May 2019 WO
2019102277 May 2019 WO
2019198094 Oct 2019 WO
2021079305 Apr 2021 WO
2021079306 Apr 2021 WO
2021116962 Jun 2021 WO
Non-Patent Literature Citations (231)
Entry
Sawhney, A.K., et al., “Erythrocyte Alterations Induced by Malathion in Channa punctatus (Bloch)”, Bull. Environ. Contam. Toxicol., 2000, vol. 64, pp. 395-405.
Communication dated Mar. 20, 2023 issued by the European Patent Office in application No. 22209948.3.
Communication dated Feb. 22, 2023 issued by the Canadian Patent Office in application No. 3,081,669.
Communication dated Mar. 17, 2023 issued by the United States Patent and Trademark Office in U.S. Appl. No. 17/083,647.
Communication dated Mar. 2, 2023 issued in the Canadian Patent Office in application No. 3,018,536.
Communication dated Mar. 27, 2023 issued by the Brazilian Patent Office in application No. BR122020017765-9.
Communication dated Mar. 27, 2023 issued by the United States Patent and Trademark Office in U.S. Appl. No. 16/763,810.
Communication dated Mar. 3, 2023 issued by the United States Patent and Trademark Office in U.S. Appl. No. 17/082,483.
Communication dated Mar. 3, 2022 issued by the United States Patent and Trademark Office in U.S. Appl. No. 17/063,320.
Communication dated Mar. 7, 2023 issued by the Japanese Patent Office in application No. 2021-157849.
Communication dated Nov. 25, 2022 issued by the United States Patent and Trademark Office in U.S. Appl. No. 17/082,483.
Onodera, M., “Organ Derangement”, Medicina, Sep. 9, 2005, vol. 42, No. 9, pp. 1582-1584 (5 pages total).
Notice of Allowance dated Apr. 12, 2023 issued by the United States Patent and Trademark Office in U.S. Appl. No. 16/088,321.
Tyulina, O., et al., “Erythrocyte and plasma protein modification in alcoholism: A possible role of acetaldehyde”, Biochimica et Biophysica Acta, vol. 1762, 2006, pp. 558-563 (6 pages total).
Takakusaki, T., “Shape Change of Red Cell Ghost and ATP”, The KITAKANTO Medical Journal, 1960, vol. 10, Issue 4, pp. 522-531 (11 pages total).
Hirota, T., et al., “Kusanon A® Poisoning Complicated by Heinz Body Hemolytic Anemia”, Japanese Association for Acute Medicine Magazine, vol. 12, No. 12, Dec. 15, 2001, (1 page total).
Communication dated Jun. 8, 2023, issued by the Canadian Patent Office in application No. 3,160,692.
Communication dated Jun. 9, 2023 issued by the Canadian Patent Office in application No. 3,160,688.
Communication dated Jun. 22, 2023 issued by the Canadian Office Action in application No. 3,160,697.
Communication dated Jul. 3, 2023 issued by the United States Patent and Trademark Office in U.S. Appl. No. 17/568,858.
Communication dated Jul. 12, 2023 issued by the Canadian Patent Office in application No. 3,155,820.
Communication dated Jul. 17, 2023 issued by the Canadian Patent Office in application No. 3,155,821.
United States Second Notice of Allowance issued Aug. 23, 2023 in U.S. Appl. No. 17/490,767.
United States Notice of Allowance issued May 15, 2023 in U.S. Appl. No. 17/490,767.
An Examination Report issued Aug. 16, 2023, in Australian Patent Application No. 2018369859.
Canadian Office Action issued Aug. 25, 2023 in Application No. 3,160,702.
United States Office Action issued May 30, 2023 in U.S. Appl. No. 17/082,615.
A Hearing Notice issued by the Indian Patent Office on Aug. 29, 2023 for IN 201817012117.
A Hearing Notice issued by the Indian Patent Office on Sep. 1, 2023 for IN 201817036130.
United States Office Action issued Sep. 14, 2023 in U.S. Appl. No. 17/083,647.
United States Office Action issued Sep. 14, 2023 in U.S. Appl. No. 17/063,320.
Office Action dated Dec. 7, 2023 which issued during the prosecution of Canadian Application No. 3,081,669.
A Summons to an Oral Hearing issued by the European Patent Office on Oct. 5, 2023 for Application No. 17728277.9.
A Decision to Refuse issued on Oct. 3, 2023 for Japanese Patent Application No. 2021-157849.
Canadian Office Action issued Oct. 13, 2023 in Application No. 2,998,829.
United States Notice of Allowance issued Oct. 12, 2023 in U.S. Appl. No. 17/568,858.
United States Office Action issued Oct. 17, 2023 in U.S. Appl. No. 17/082,615.
A Hearing Notice issued by the Indian Patent Office on Nov. 9, 2023 for IN 201817040226.
Canadian Office Action issued Dec. 19, 2023 in Application No. 3,018,536.
An Office Action dated Dec. 21, 2023 which issued during the prosecution of Brazilian Application No. 112018 072627 3.
United States Office Action issued Jan. 9, 2024 in U.S. Appl. No. 18/203,109.
Notice of Allowance issued for U.S. Appl. No. 16/763,810 on Feb. 8, 2024.
United States Office Action dated Feb. 29, 2024 in U.S. Appl. No. 17/083,647.
Canadian Office Action dated Mar. 4, 2024 in Application No. 3022770.
Canadian Office Action dated Jan. 12, 2023 in Application No. 3022770.
International Search Report and Written Opinion dated Mar. 11, 2024 in Application No. PCT/IB2023/062469.
Canadian Office Action dated Mar. 6, 2024 in Application No. 3160688.
Canadian Office Action dated Mar. 11, 2024 in Application No. 3160692.
United States Office Action dated Mar. 22, 2024 in U.S. Appl. No. 17/360,503.
Canadian Office Action dated Apr. 3, 2024 in Application No. 3160697.
Canadian Office Action dated Apr. 19, 2024 in Application No. 3155820.
New Zealand Office Action dated Apr. 23, 2024 in Application No. 787743.
New Zealand Office Action dated Apr. 24, 2024 in Application No. 787745.
European Office Action dated Apr. 29, 2024 in Application No. 20800326.9.
European Office Action dated May 6, 2024 in Application No. 20 800 325.1.
United States Office Action dated May 7, 2024 in U.S. Appl. No. 17/770,339.
European Office Action dated May 8, 2024 in Application No. 20828314.3.
Kerem Delikoyun, et al., “2 Deep learning-based cellular image analysis for intelligent medical diagnosis” , De Gruyter, 2021, (4 pages) https://www.degruyter.com/document/doi/10.1515/9783110668322-002/html.
C.Briggs, et al., “ICSH Guidelines for the evaluation of blood cell analysers including those used for differential leucocyte and reticulocyte counting”, International Journal of Laboratory Hematology, 2014, vol. 36, pp. 613-627 (15 pages).
An Office Action dated May 16, 2024 which issued during the prosecution of U.S. Appl. No. 17/063,320.
An Office Action dated May 29, 2024 which issued during the prosecution of Korean Application No. 10-2022-7017082.
A Chinese Office Action dated May 9, 2024 which issued during the prosecution of Application No. 202080085933.9.
Notice of Allowance issued for Canadian Application No. 3,155,821 on May 21, 2024.
An Office Action dated May 29, 2024 which issued during the prosecution of Korean Application No. 10-2022-7017081.
“Blood specimens: Microscopic Examination”, Centers for Disease Control and Prevention CDC, Diagnostic Procedures, 2009 <http://www.dpd.cdc.gov/dpdx/HTML/Frames/DiagnosticProcedures/body_dp_bloodexamin.htm>.
Agero, U., Mesquita, L.G., Neves, B.R.A., Gazzinelli, R.T. and Mesquita, O.N., 2004. Defocusing microscopy. Microscopy research and technique, 65(3), pp. 159-165.
Ahirwar, Neetu et al., “Advanced Image Analysis Based System for Automatic Detection and Classification of Malarial Parasite in Blood Images,” International Journal of Information Technology and Knowledge Management Jan.-Jun. 2012, vol. 5, No. 1, pp. 59-64, Serial Publications Pvt. Ltd, India.
An International Search Report and a Written Opinion both dated Jan. 23, 2017. which issued during the prosecution of Applicant's PCT/IL2016/051025.
An International Search Report and Written Opinion in International Application No. PCT/IB2018/058861, issued on Apr. 8, 2019.
An International Search Report and Written Opinion, dated Aug. 30, 2017 from the International Bureau in counterpart International application No. PCT/IL2017/050526.
An International Search Report and Written Opinion, dated Aug. 8, 2017 from the International Bureau in counterpart International application No. PCT/IL2017/050523.
An International Search Report and Written Opinion, dated May 18, 2017 from the International Bureau in counterpart International application No. PCT/IL2017/050363.
Anand, A., et al. “Automatic Identification of Malaria-Infected RBC with Digital Holographic Microscopy Using Correlation Algorithms.” Photonics Journal, IEEE 4.5 (2012): 1456-1464.
Bacus, J.W., 1985. Cytometric approaches to red blood cells. Pure and Applied AL Chemistry, 57(4), pp. 593-598.
Biéler, Sylvain et al. “Improved detection of Trypanosoma brucei by lysis of red blood cells, concentration and LED fluorescence microscopy”; Acta Tropica; vol. 121, Issue 2, Feb. 2012, pp. 135-140.
Bravo-Zanoguera, M.E., Laris, C.A., Nguyen, L.K., Oliva, M. and Price, J.H., 2007. Dynamic autofocus for continuous-scanning time-delay-and-integration image acquisition in automated microscopy. Journal of biomedical optics, 12(3), pp. 034011-034011.
Brenner et al., An Automated Microscope for Cytologic Research a Preliminary Evaluation, [The Journal of Histochecmistry and Cytochemistry, vol. 24, No. 1, pp. 100-111, 1976.
Briggs, C., et al., “Continuing developments with the automated platelet count”, Blackwell Publishing Ltd, International Journal of Laboratory Hematology, Jan. 18, 2007, pp. 77-91, vol. 29 (15 pages total).
Centers for Disease Control and Prevention. DPDx—Laboratory Identification of Parasitic Diseases of Public Health Concern <http://www.cdc.gov/dpdx/diagnosticProcedures/blood/microexam.html>, Nov. 29, 2013.
Cervantes, Serena , Jacques Prudhomme, David Carter, Krishna G Gopi, Qian Li, Young-Tae Chang, and Karine G Le Roch, High-content live cell imaging with RNA probes: advancements in high-throughput antimalarial drug discovery, BMC Cell Biology 2009, 10:45, https://bmcmolcellbiol.biomedcentral.com/track/pdf/10.1186/1471-2121-10-45 (Jun. 10, 2009).
Chiodini, P.L. et al., “Rapid diagnosis of malaria by fluorescence microscopy”; The Lancet, vol. 337, Issue 8741, p. 624-625, Mar. 9, 1991.
Chong, Shau Poh, Shilpa Pant, and Nanguang Chen. “Line-scan Focal Modulation Microscopy for Rapid Imaging of Thick Biological Specimens.” SPIE/OSA/IEEE Asia Communications and Photonics. International Society for Optics and Photonics, 2011.
Emma Eriksson et al: “Automated focusing of nuclei for time lapse experiments on single cells using holographic optical tweezers”, Optics Express , vol. 17, No. 7 , Mar. 2-4, 2009, pp. 5585-5594.
F. Boray Tek et al. “Parasite detection and identification for automated thin blood film malaria diagnosis”; Computer Vision and Image Understanding vol. 114, Issue 1, Jan. 2010, pp. 21-32.
Fohlen-Walter, Anne PhD, et al., “Laboratory Identification of Cryoglobulinemia From Automated Blood Cell Counts, Fresh Blood Samples, and Blood Films”, American Society for Clinical Pathology, Am J Clin Pathol, 2002, pp. 606-614, vol. 117 (9 pages total).
Frean, John. “Microscopic Determination of Malaria Parasite Load: Role of Image Analysis.” Microscopy: Science, technology, Applications, and Education (2010): 862-866.
Gallo, V., Skorokhod, O.A., Schwarzer, e, and Arese, P. “Simultaneous determination of phagocytosis of Plasmodium falciparum-parasitized and non-parasitized red blood cells by flow cytometry”; Malaria Journal 2012 11:428.
Garcia, et al. “Laboratory Diagnosis of Blood-borne Parasitic Diseases; Approved Guideline”; NCCLS Documents M115-a, Jun. 2000.
Gordon, Andrew et al. “Single-cell quantification of molecules” Nature Methods 4, Jan. 21, 2007, pp. 175-181.
Gordon, Andrew et al. Supplementary Note to Gordon et al: “Single-cell quantification of molecules” Nature Methods, Jan. 21, 2007, pp. 1-35.
Groen, F.C.A., et al. “A comparison of different focus functions for use in autofocus algorithms”, Cytometry, Alan Liss, New York, US vol. 6, No. 2, Mar. 1, 1985 (Mar. 1, 1985) pp. 81-91.
Guy, Rebecca, Paul Liu, Peter Pennefather and Ian Crandall, “The use of fluorescence enhancement to improve the microscopic diagnosis of falciparum malaria”, Malaria Journal 2007 6:89, https://malariajournal.biomedcentral.com/articles/10.1186/1475-2875-6-89, (Jul. 9, 2007).
Houri-Yafin, A., et al. “An enhanced computer vision platform for clinical diagnosis of malaria” Malaria Control and Elimination, 2016, p. 138, Vo. 5, Issue 1, omics International, India.
Jager et al. “Five-minute Giemsa stain for rapid detection of malaria parasites in blood smears”, Tropical Doctor, Vo. 41, pp. 33-35, Jan. 2011.
Jahanmehr,S A H et al., “Simple Technique for Fluorescence Staining of Blood Cells with Acridine Orange”, Journal of Clinical Pathology, Feb. 12, 1987, pp. 926-929 (4 pages total).
Joanny, Fanny, Helda Jana, and Benjamin Mordmllera, “In Vitro Activity of Fluorescent Dyes against Asexual Blood Stages of Plasmodium falciparum” DOI: 10.1128/AAC.00709-12.
Kawamoto, F. and P.F.Billingsley, “Rapid diagnosis of malaria by fluorescence microscopy”, Parasitology Today, 8.2 (1992): 69-71.
Kawamoto,F. “Rapid diagnosis of malaria by fluorescence microscopy with light microscope and interference filter”, The Lancet, vol. 337, pp. 200-202, Jan. 26, 1991.
Keiser, J. et al., “Acridine Orange for malaria diagnosis: its diagnostic performance, its promotion and implementation in Tanzania, and the implications for malaria control”, Annals of Tropical Medicine and parasitology, 96.7 (2002): 643-654.
Knesel, “Roche Image Analysis Systems, Inc.”, Acta Cytologica, vol. 40, pp. 60-66, (1996).
Kumar, Amit et al. “ Enhanced Identification of Malarial Infected Objects using Otsu Algorithm from Thin Smear Digital Images.” International Journal of Latest Research in Science and Technology vol. 1,Issue 2 :p. Nos. 159-163, Jul.-Aug. 2012).
Le, Minh-Tam et al., “A novel semi-automatic image processing approach to determine Plasmodium falciparum parasitemia in Giemsa-stained thin blood smears”, BMC Cell Biology, published Mar. 28, 2008.
Leif, “Methods for Preparing Sorted Cells as Monolayer Specimens”, Springer Lab Manuals, Section 7—Chapter 5 pp. 592-619, (2000).
Life Technologies Corporation, “Counting blood cells with Countless Automated Cell Counter” found at http://www.lifetechnologies.com/content/dam/LifeTech/migration/files/cell-tissue-analysis/pdfs.par.83996.file.dat/w-082149-countless-application-blood-cells.pdf, four pages, (2009).
Matcher, S.J., et al. “Use of the water absorption spectrum to quantify tissue chromophore concentration changes in near-infrared spectroscopy”, Physics in Medicine & Biology, vol. 39, No. 1, 1994 pp. 177-196, IOP Publishing Ltd., UK.
Mendiratta, D.K. et al. Evaluation of different methods for diagnosis of P. falciparum malaria; Indian J Med Microbiol. Jan. 2006;24(1):49-51.
Merchant et al. , “Computer-Assisted Microscopy”, The essential guide to image processing, Chapter 27, pp. 777-831, Academic Press, (2009).
Moody , “Rapid Diagnostic Tests for Malaria Parasites”, Clinical Microbiology Reviews, vol. 15, No. 1, pp. 66-78, 12 (2002).
Moon S, Lee S, Kim H, Freitas-Junior LH, Kang M, Ayong L, et al. (2013) An Image Analysis Algorithm for Malaria Parasite Stage Classification and Viability Quantification. PLoS ONE 8(4): e61812. https://doi.org/10.1371/journal.pone.0061812.
Ortyn, William E.,et al. “Extended Depth of Field Imaging for High Speed Cell Analysis.” Cytometry Part A 71.4, 2007): 215-231.
Osibote, O. A., et al. “Automated focusing in bright-field microscopy for tuberculosis detection”, Journal of Microscopy 240.2 (2010)pp. 155-163.
Pasini, Erica M. et al. “A novel live-dead staining methodology to study malaria parasite viability”; Malaria Journal 2013 12:190.
Piruska, Aigars et al., “The autofluorescence of plastic materials and chips measured under laser irradiation” Lab on a Chip, 2005, 5, 1348-1354, published Nov. 1, 2005.
Poon et al., “Automated Image Detection and Segmentation in Blood Smears”, [Cytometry 1992 13:766-774].
Purwar, Yashasvi, et al. “Automated and Unsupervised Detection of Malarial Parasites in Microscopic Images.” Malaria Journal 10.1 (2011): 364.
Rappaz, Benjamin et al., “Comparative study of human erythrocytes by digital holographic microscopy, confocal microscopy, and impedance volume analyzer” Cytometry Part A, 2008, vol. 73, Issue 10, pp. 895-903, John Wiley & Sons, US.
Roma, P. M. S., et al . “Total three-dimensional imaging of phase objects using defocusing microscopy: Application to red blood cells.” Applied Physics Letters 104,25 (2014): 251107.
Ross, Nicholas E., et al., “Automated image processing method for the diagnosis and classification of malaria on thin blood smears”, Medical and Biological Engineering and Computing, May 2006, vol. 44, Issue 5, pp. 427-436, Springer Publishing Company, US.
Sheikh , H., Bin Zhu, Micheli-Tzanakou, E. (1996) “Blood cell identification using neural networks.” Proceedings of the IEEE 22nd Annual Northeast Bioengineering Conference; pp. 119-120.
Shen, Feimo, Louis Hodgson and Klaus Hahn, “Digital autofocus method for automated microscopy”, Methods in Enzymology vol. 414, 2006, pp. 620-632.
Shute G. T. and T. M. Sodeman, “Identification of malaria parasites by fluorescence microscopy and acridine orange staining”, Bulletin of the World Health Organ. 1973; 48(5): 591-596.
Sun, Yu, S. Duthaler and B.J. Nelson, “Autofocusing algorithm selection in computer microscopy”, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.
Tek, F. Boray, Andrew G. Dempster, and Izzet Kale. “Computer Vision for Microscopy Diagnosis of Malaria.” Malaria Journal 8.1 (2009): 153.
Thung, Ferdian, and Iping Supriana Suwardi. “Blood Parasite Identification Using Feature Based Recognition.” Electrical Engineering and Informatics (ICEEI), 2011 International Conference on. IEEE, 2011.
U.S. Appl. No. 61/427,809, filed Dec. 29, 2010. (CAT-1).
U.S. Appl. No. 61/870,106, filed Aug. 26, 2013. (CAT-2).
U.S. Appl. No. 62/042,388, filed Aug. 27, 2014 (CAT-2).
UNITAID Malaria Diagnostic Technology and Market Landscape, 2nd Edition (2014).
Wissing, Frank et al. “Illumination of the Malaria Parasite Plasmodium falciparum Alters Intracellular pH”, Implications for Live Cell Imaging; published Jul. 24, 2002,JBS Papers in Press, vol. 277 No. 40, pp. 27747-37755.
Wright, J H. “A Rapid Method for the Differential Staining of Blood Films and Malarial Parasites” Journal of medical research vol. 7,1 (1902): 138-44.
Wu, Caicai et al., “Feasibility study of the spectroscopic measurement of oxyhemoglobin using whole blood without pre-treatment”, The Analyst, Mar. 1998, pp. 477-481, vol. 123 (5 pages total).
Wu, Qiang, Fatima Mechant, and Kenneth Castleman. Microscope Image Processing. Chapter 16, Autofocusing, pp. 441-467, Academic Press, 2010.
Xu, Lili, Asok Chaudhuri, “Plasmodium yoelii: A differential fluorescent technique using Acridine Orange to identify infected erythrocytes and reticulocytes in Duffy knockout mouse”, Experimental Parasitology, vol. 110, Issue 1, May 2005, pp. 80-87, https://www.sciencedirect.com/science/article/pii/S001448940500038X, (May 31, 2005).
Yang, Ming, and Li Luo. “A Rapid Auto-Focus Method in Automatic Microscope.” Signal Processing, 2008. ICSP 2008. 9th International Conference on. IEEE,2008.
Yao, LN et al. “Pathogen identification and clinical diagnosis for one case infected with Babesia”. Zhongguo ji sheng chong xue yu ji sheng chong bing za zhi Chinese journal of parasitology parasitic diseases, Aug. 2012.
Yazdanfar, S., Kenny, K.B., Tasimi, K., Corwin, A.D., Dixon, E.L. and Filkins, R.J., 2008. Simple and robust image-based autofocusing for digital microscopy. Optics express, 16(12), pp. 8670-8677.
Zahniser et al., Automated Slide Preparation System for the Clinical Laboratory, Cytometry, vol. 26, No. 10, pp. 60-64, (1996).
A European Examination Report dated Dec. 9, 2019. which issued during the prosecution of Applicant's European App No. 16782094.3.
Notice of Allowance dated Mar. 2, 2020, which issued during the prosecution of U.S. Appl. No. 16/657,473.
A European Examination Report dated Feb. 1, 2019. which issued during the prosecution of Applicant's European App No. 17717000.8.
A European Examination Report dated Sep. 3, 2019. which issued during the prosecution of Applicant's European App No. 17717000.8.
A European Examination Report dated Apr. 8, 2020. which issued during the prosecution of Applicant's European App No. 17717000.8.
A European Examination Report dated Apr. 6, 2020. which issued during the prosecution of Applicant's European App No. 17726036.1.
A European Examination Report dated Feb. 11, 2020. which issued during the prosecution of Applicant's European App No. 17728277.9.
An Office Action dated Aug. 24, 2020 for U.S. Appl. No. 16/098,893.
A Chinese Office Action and dated May 22, 2020. which issued during the prosecution of Chinese Application No. 201680053431.1.
A Restriction Requirement issued by the USPTO on Aug. 24, 2020 for U.S. Appl. No. 16/088,321.
Saraswat, et al. “Automated microscopic image analysis for leukocytes identification: A survey”, ABV—Indian Institute of Information Technology and Management, Gwalior, India.
Hiremath, P.S,. et al., “Automated Identification and Classification of White Blood Cells (Leukocytes) in Digital Microscopic Images”, IJCA Special Issue on “Recent Trends in Image Processing and Pattern Recognition” RTIPPR, 2010.
Witt, et al. “Establishing traceability of photometric absorbance values for accurate measurements of the haemoglobin concentration in blood.”,Metrologia 50 (2013) 539-548.
Putzu, et al., “Leucocyte classification for leukaemia detection using image processing techniques.”, Artificial Intelligence in Medicine, vol. 63, No. 3, Nov. 1, 2014.
Varga, et al., “An automated scoring procedure for the micronucleus test by image analysis,”, Mutagenesis vol. 19 No. 5 pp. 391-397, 2004.
Ran, Qiong et al. “Spatial-spectral blood cell classification with microscopic hyperspectral imagery” Proc. SPIE 10461, AOPC 2017: Optical Spectroscopy and Imaging, 1046102 (Oct. 24, 2017).
Omucheni et al. “Application of principal component analysis to multispectral-multimodal optical image analysis for malaria diagnostics”, Malaria Journal 2014, 13:485 http://www.malariajournal.com/content/13/1/485.
Ben-Suliman-2018-Computerized Counting-Based System for Acute Lymphoblastic Leukemia Detection in Microscopic Blood Images: 27th International Conference on Artificial Neural Networks, Rhodes, Greece, Oct. 4-7, 2018, Proceedings, Part II.
An Office Action dated Dec. 8, 2020 for Japanese Patent Application No. 2018/512961.
An Examination Report issued on Dec. 7, 2020 for Australian Patent Application No. 2016322966.
An Office Action dated Jan. 11, 2021 for U.S. Appl. No. 16/098,893.
An Examination Report issued on Apr. 29, 2021 for Australian Patent Application No. 2016322966.
International Search Report issued for PCT Application No. PCT/IB2020/059924 on Mar. 22, 2021.
International Search Report issued for PCT Application No. PCT/IB2020/059925 on Mar. 26, 2021.
Invitation to pay fees and Partial Search Report issued for PCT Application No. PCT/IB2020/059924 on Jan. 28, 2021.
Invitation to pay fees and Partial Search Report issued for PCT Application No. PCT/IB2020/059925 on Feb. 4, 2021.
A Japanese Office Action dated Mar. 30, 2021. which issued during the prosecution of Application No. 2018/558180.
An Office Action dated Mar. 9, 2021 for U.S. Appl. No. 16/088,321.
An Office Action dated Jan. 29, 2021 for U.S. Appl. No. 16/099,270.
Bovik, Alan C., et. “The Essential Guide to Image Processing”, Chapter 27, “Computer Assisted Microscopy”,pp. 777-831; Academic Press, 2009 (Merchant).
Communication dated Nov. 18, 2014 from the Canadian Patent Office in application No. 2,655,024.
Laboratory diagnosis of blood-borne parasitic diseases: approved guideline, 2000—NCCLS (CAT-1).
Price Jeffrey H. and David A. Gough, “Comparison of phase-contrast and fluorescence digital autofocus for scanning microscopy”, Cytometry 16.4 (1994) 283-297.
Vink, J.P. et al., “An automatic vision-based malaria diagnosis system”, Journal of Microscopy 250.3 (2013):166-178.
An International Search Report and Written Opinion for Application No. PCT/IB2020/061731 issued on Feb. 22, 2021.
Invitation to pay fees and Partial Search Report issued for PCT Application No. PCT/IB2020/061732 on Mar. 10, 2021.
Invitation to pay fees and Partial Search Report issued for PCT Application No. PCT/IB2020/061736 on Mar. 12, 2021.
Invitation to pay fees and Partial Search Report issued for PCT Application No. PCT/IB2020/061728 on Mar. 15, 2021.
International Search Report issued for PCT Application No. PCT/IB2020/061724 on Mar. 10, 2021.
An International Search Report and Written Opinion for PCT Application No. PCT/IB2020/061732 mailed on May 7, 2021.
International Search Report and Written Opinion for PCT Application No. PCT/IB2020/061728 mailed on May 7, 2021.
International Search Report and Written Opinion for PCT Application No. PCT/IB2020/061736 mailed on May 3, 2021.
Non-Final Office Action dated Jun. 17, 2021 which issued during the prosecution of U.S. Appl. No. 16/851,410.
A Final Office Action dated Jun. 17, 2021 which issued during the prosecution of U.S. Appl. No. 16/088,321.
Notice of Allowance dated May 19, 2021 which issued during the prosecution of U.S. Appl. No. 16/099,270.
A Restriction Requirement issued by the USPTO on Oct. 19, 2020 for U.S. Appl. No. 16/099,270.
An Extended European Search Report issued for European Patent Application No. 21164814.2 on Jun. 9, 2021.
Third Office Action dated Jul. 12, 2021 which issued during the prosecution of Chinese Patent Application No. 201680053431.1.
Non-Final Office Action dated Jul. 27, 2021, which issued during the prosecution of U.S. Appl. No. 16/851,686.
Non-Final Office Action dated Aug. 19, 2021, which issued during the prosecution of U.S. Appl. No. 16/098,893.
Non-Final Office Action dated Sep. 1, 2021 which issued during the prosecution of U.S. Appl. No. 16/088,321.
First Office Action dated Aug. 4, 2021 which issued during the prosecution of Chinese Patent Application No. 201780027908.3.
An Examination Report dated Mar. 4, 2021 which issued during the prosecution of Indian Patent Application No. 201817036130.
An Examination Report dated May 5, 2021 which issued during the prosecution of Indian Patent Application No. 201817012117.
Non-Final Office Action dated Oct. 6, 2021, which issued during the prosecution of U.S. Appl. No. 17/063,320.
Notice of Allowance dated Aug. 3, 2021, which issued during the prosecution of U.S. Appl. No. 16/851,410.
Notice of Allowance dated Nov. 5, 2021, which issued during the prosecution of U.S. Appl. No. 16/851,410.
Notice of Allowance dated Nov. 10, 2021, which issued during the prosecution of U.S. Appl. No. 16/851,686.
Supplemental Notice of Allowance dated Nov. 12, 2021, which issued during the prosecution of U.S. Appl. No. 16/851,686.
A European Examination Report issued for European Patent Application No. 17728277.9 on Dec. 23, 2021.
Notice of Allowance dated Jan. 21, 2022, which issued during the prosecution of U.S. Appl. No. 16/098,893.
An Office Action dated Feb. 16, 2022 which issued during the prosecution of U.S. Appl. No. 16/088,321.
An Office Action dated May 6, 2022 which issued during the prosecution of U.S. Appl. No. 16/763,810.
A Non-Final Office Action dated May 26, 2022 which issued during the prosecution of U.S. Appl. No. 17/083,775.
An Office Action dated May 31, 2022 which issued during the prosecution of U.S. Appl. No. 17/083,659.
Examination Report issued by the Indian Patent Office on Jun. 28, 2022 in Indian Patent Application No. 202047019700.
A Japanese Office Action dated Jul. 1, 2024 which issued during the prosecution of Application No. 2022-521112.
A Japanese Office Action dated Jul. 3, 2024 which issued during the prosecution of Application No. 2022-521238.
Non-Final Office Action dated Jul. 8, 2024 which issued during the prosecution of U.S. Appl. No. 18/397,324.
Notice of Allowance issued for U.S. Appl. No. 18/203,109 on Jun. 13, 2024.
A Chinese Office Action dated May 23, 2024, which issued during the prosecution of Application No. 202080073583.4.
A Chinese Office Action dated May 23, 2024, which issued during the prosecution of Application No. 202080073623.5.
An Office Action dated Jul. 12, 2022, which issued during the prosecution of U.S. Appl. No. 16/088,321.
An Office Action dated Aug. 2, 2022, which issued during the prosecution of Japanese Patent Application No. 2021-145455.
An Examination Report dated Aug. 25, 2022, which issued during the prosecution of Australian Patent Application No. 2017263807.
An Office Action dated Aug. 30, 2022 which issued during the prosecution of Japanese Patent Application No. 2020-526176.
An Office Action dated Sep. 13, 2022 which issued during the prosecution of Japanese Patent Application No. 2021-157849.
Hideto Miura, “How to regard as how to consider the poikilocyte in urine an erroneous decision factor”, Modern Medical Laboratory, Sep. 1, 2002, vol. 30, No. 9, pp. 862-864 (6 pages total).
Jun Hashimoto, “Morphological Studies of Urinary Red Blood Cells in Renal and Urinary Tract Disorders (II) Use of Wright's Stain in Differential Diagnosis between Renal and Urinary Tract Disorders” Kawasaki Medical Congress Magazine, Mar. 1989, vol. 15, No. 1, pp. 94-101 (9 pages total).
D F Birch et al., “The research on the differential diagnosis of the kidney urinary tract obstacle by glomerular or non-glomerular”, Lancet, Oct. 20, 1979, vol. 2, No. 8147, pp. 845-846 (3 pages total).
A First Examination Report dated Sep. 19, 2022, which issued during the prosecution of Indian Patent Application No. 201817040226.
An Office Action dated Oct. 3, 2022 which issued during the prosecution of U.S. Appl. No. 16/763,810.
An Office Action dated Oct. 25, 2022 which issued during the prosecution of Canadian Application No. 2,998,829.
An Office Action dated Oct. 5, 2022 which issued during the prosecution of Brazilian Application No. 112018005099-7.
An Office Action dated Nov. 25, 2022 which issued during the prosecution of Brazilian Application No. 122020017765-9.
An Office Action dated Dec. 9, 2022 which issued during the prosecution of United States U.S. Appl. No. 17/083,647.
An Office Action dated Dec. 28, 2022 which issued during the prosecution of Russian Patent Application No. 2022112399.
An Office Action dated Dec. 28, 2022 which issued during the prosecution of Russian Patent Application No. 2022112393.
An Office Action dated Jan. 6, 2023 which issued during the prosecution of U.S. Appl. No. 17/063,320.
An Office Action dated Sep. 2, 2022 which issued during the prosecution of U.S. Appl. No. 17/063,320.
An Office Action dated Jan. 5, 2023 which issued during the prosecution of Chinese Patent Application No. 201880079888.9.
An Examination Report dated Jan. 23, 2023, which issued during the prosecution of Australian Patent Application No. 2022200112.
An Office Action dated Jan. 19, 2023 which issued during the prosecution of U.S. Appl. No. 17/490,767.
A Japanese Office of Action dated Nov. 5, 2024, which issued during the prosecution of JP Application No. 2022-534369.
Related Publications (1)
Number Date Country
20230028360 A1 Jan 2023 US
Provisional Applications (1)
Number Date Country
62946988 Dec 2019 US