Cell Recognition

Abstract
Disclosed herein are cell differentiating systems for differentiating cells. The systems comprise an input means for receiving a reconstructed image of a cell based on a holographic image of the cell in suspension, and a cell recognition means for determining cell characterization features from the reconstructed image of the cell for characterization of the cell. The cell recognition means thereby is configured for determining, as cell recognition features, image moments.
Description
FIELD OF THE INVENTION

The present invention relates to the field of biological image processing and cell classification. More specifically, it relates to systems and methods for label-free counting and/or classifying of cells, for example in lens-free holographic imaging flow cytometry.


BACKGROUND OF THE INVENTION

Cell characterization plays an important role in medical diagnostics. For example, white blood cells (WBCs), also known as leukocytes, play important roles in the immune system by helping to fight against pathogens. Recognition of WBCs can assist in diagnosing many diseases, such as leukemia, AIDS and some common viral infections. Differentiation of WBCs into three parts, i.e. T-lymphocytes, granulocytes and monocytes, has long been a topic of particular interest to hematologists.


Conventional white blood cell recognition usually requires several essential image processing steps. Feature analysis and classification have been investigated for cell identification using different image modalities. The earliest cell classification attempts are based on segmentation. Bao et al. obtained quantitative measurements from cell images, without separating the nucleus from the cytoplasm, based on microscopic images of blood smears. As an extension, Gelsema et al. performed image segmentation based on the principle of multiple grey level thresholding. The extracted features express geometry, optical density and texture properties. Theera-Umpon et al. used information from the nucleus alone to classify white blood cells using microscopic images of bone marrow smears. They analyzed a set of white-blood-cell-nucleus-based features using mathematical morphology. Tabrizi et al. extracted color, morphological and textural features from the segmented nucleus and cytoplasm using blood smears for a five-part WBC differential. These studies have shown the advantages of automated WBC identification using light microscopy compared to semi-automatic methods or manual investigation. Unfortunately, image quality can suffer from variations in illumination, uneven staining and color mixing, making these methods sensitive to error or variation. Several studies demonstrate the capability of cell identification using hyperspectral imaging, multispectral imaging or Raman spectroscopic imaging technologies, which provide both spatial and spectral information. However, these studies also use blood smear samples, which may change the actual cell morphology.


Conventional blood analysis is usually performed using bulky and expensive microscopes or hematology analyzers. Recently, lens-free imaging technology has matured and started acting as a competitor. Without using any optical lenses, lens-free imaging techniques offer advantages in portability, scalability and cost-effectiveness. An imaging flow cytometer can capture high-content cell images and can provide much more information than just the single feature of a fluorescent label. Chen et al. extracted both optical and morphology features from high-throughput quantitative images of cells in suspension. Dannhauser extracted 3D morphology features of mononuclear cells captured by a quantitative phase imaging holographic system, which consists of complicated microscopy equipment. Lens-free imaging techniques bring new possibilities and challenges to blood flow cytometry. One of the recently developed lens-free flow cytometers is based on digital holographic microscopy. Whereas the technique provides the appropriate hardware for imaging cells, an accurate and efficient cell identification technique is still required.


SUMMARY OF THE INVENTION

It is an object of the present invention to provide efficient methods and systems for classifying cells.


It is an advantage of embodiments of the present invention that a cell classification system is provided allowing cells in suspension to be classified, i.e. where the cells are not influenced by imaging them in a smear. It is an advantage that classification can be performed on images of cells that are less distorted than images of the same cells obtained in smears.


It is an advantage of embodiments of the present invention that an automated classification of cells is obtained, e.g. an automated, highly accurate, at least 3-part white blood cell classification based on lens-free holographic images. It is an advantage that such an automated classification can be based on a machine learning algorithm.


It is an advantage of embodiments of the present invention that the images used may be of fluorescently labeled samples, without biological purification of the samples being required.


It is an advantage of embodiments of the present invention that the most important features are used for cell recognition, resulting in a reduced computational cost for cell recognition, since accurate cell recognition can be performed without evaluating the full feature space.


The above objective is accomplished by a method and device according to the present invention.


The present invention relates to a cell differentiating system for differentiating cells, the system comprising an input means for receiving a reconstructed image of a cell based on a holographic image of the cell in suspension, and a cell recognition means for determining cell characterization features from the reconstructed image of the cell for characterization of the cell,


wherein the cell recognition means is configured for determining, as cell recognition features, image moments, the image moments being defined as






$$m_{ij}=\sum_{x,y} f(x,y)\,x^i\,y^j$$ and

$$\mu_{ij}=\sum_{x,y} f(x,y)\,(x-\bar{x})^i\,(y-\bar{y})^j$$

where

$$\bar{x}=\frac{m_{10}}{m_{00}},\qquad \bar{y}=\frac{m_{01}}{m_{00}},$$

and

$$\eta_{ij}=\frac{\mu_{ij}}{m_{00}^{(i+j)/2+1}}$$

wherein i and j are orders and f(x, y) is the pixel intensity located at (x, y).


The system according to embodiments of the present invention is suitable for differentiating cells. It is an advantage of embodiments of the present invention that the system is especially suitable for differentiating white blood cells. The system may be especially suitable for differentiating white blood cells into T-lymphocytes, granulocytes and monocytes. The system may be especially suitable for providing 3-part white blood cell (WBC) recognition based on reconstructions of images of WBCs in suspension. It is an advantage of embodiments of the present invention that, for WBCs in suspension, accurate recognition can be obtained by using image moments.


Advantageously, the image moments may be geometric moments.


The cell recognition means may be configured for determining at least the following image moments as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21. It is an advantage of embodiments of the present invention that for at least some datasets, accurate and fast cell recognition can be obtained by using the above identified image moments as cell recognition features. It thereby is an advantage that only a limited set of image moments needs to be determined, while still getting accurate results for at least some datasets.
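
By way of illustration only, and not as a description of the claimed implementation, the moment definitions above can be evaluated directly with a few lines of NumPy. The sketch below is one possible implementation under stated assumptions: the function name image_moments, the choice of x as the column index and y as the row index, and the truncation at third-order moments are all hypothetical.

```python
import numpy as np

def image_moments(f, max_order=3):
    """Raw (m), central (mu) and normalized central (eta) moments of a 2-D
    intensity image f, following the definitions given above (sketch only)."""
    f = np.asarray(f, dtype=float)
    y, x = np.mgrid[0:f.shape[0], 0:f.shape[1]]  # y: row index grid, x: column index grid
    m, mu, eta = {}, {}, {}
    for i in range(max_order + 1):
        for j in range(max_order + 1 - i):
            m[(i, j)] = np.sum(f * x**i * y**j)            # raw moments
    xb, yb = m[(1, 0)] / m[(0, 0)], m[(0, 1)] / m[(0, 0)]   # image centroid
    for i in range(max_order + 1):
        for j in range(max_order + 1 - i):
            mu[(i, j)] = np.sum(f * (x - xb)**i * (y - yb)**j)        # central moments
            eta[(i, j)] = mu[(i, j)] / m[(0, 0)] ** ((i + j) / 2 + 1)  # normalized central moments
    return m, mu, eta
```

From the returned dictionaries, the subset m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21 identified above could then be assembled into a feature vector.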


The cell recognition means furthermore may be configured for determining the Zernike moment Z20 as cell recognition feature:







$$Z_{n,m}=\frac{n+1}{\pi}\sum_{x}\sum_{y} f(x,y)\,V^{*}_{n,m}(\rho,\theta)$$

for x² + y² ≤ 1, and

$$V_{n,m}(\rho,\theta)=R_{n,m}(\rho)\exp(jm\theta)$$


wherein n are non-negative integers representing the order of the Zernike polynomials and m represents the repetitions of the Zernike polynomials, which satisfy the constraint that (n−|m|) is even and |m|≤n. θ is the azimuthal angle and ρ is the radial distance, 0≤ρ≤1. Rn,m are the radial polynomials







$$R_{n,m}(\rho)=\sum_{s=0}^{(n-\lvert m\rvert)/2}\frac{(-1)^{s}\,(n-s)!}{s!\left(\frac{n+\lvert m\rvert}{2}-s\right)!\left(\frac{n-\lvert m\rvert}{2}-s\right)!}\,\rho^{\,n-2s}.$$







It is an advantage of embodiments of the present invention that for at least some datasets the accuracy for cell recognition can be significantly improved by using one additional Zernike moment.
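
Purely as a hedged illustration of the Zernike moment defined above, the following sketch evaluates Z_{n,m} by direct summation over image pixels mapped onto the unit disk. The centring of the disk on the image, the normalization of pixel coordinates and the use of math.factorial for the radial polynomial are assumptions; the description does not prescribe a particular implementation.

```python
import numpy as np
from math import factorial

def radial_poly(n, m, rho):
    """R_{n,m}(rho) as given in the equation above."""
    m = abs(m)
    R = np.zeros_like(rho)
    for s in range((n - m) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s)
                * factorial((n + m) // 2 - s)
                * factorial((n - m) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    return R

def zernike_moment(f, n, m):
    """Z_{n,m} of image f by direct summation over pixels inside the unit disk
    (sketch; the disk is assumed to be centred on the image)."""
    f = np.asarray(f, dtype=float)
    H, W = f.shape
    y, x = np.mgrid[0:H, 0:W]
    xn = 2.0 * (x - (W - 1) / 2) / (W - 1)   # map columns to [-1, 1]
    yn = 2.0 * (y - (H - 1) / 2) / (H - 1)   # map rows to [-1, 1]
    rho = np.hypot(xn, yn)
    theta = np.arctan2(yn, xn)
    inside = rho <= 1.0                      # keep only pixels with x^2 + y^2 <= 1
    V_conj = radial_poly(n, m, rho) * np.exp(-1j * m * theta)
    return (n + 1) / np.pi * np.sum(f[inside] * V_conj[inside])

# Z20 as used in the text corresponds to n = 2, m = 0:
# z20 = zernike_moment(reconstructed_amplitude, 2, 0)
```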


The cell recognition means furthermore may be configured for also determining at least the following image moments as cell recognition features: m30 and μ20. It is an advantage of embodiments of the present invention that for at least some datasets the accuracy for cell recognition can be significantly improved by use of additional image moments.


The cell recognition means furthermore may be configured for also determining at least the following image moments as cell recognition features: m02, m10, m20, m21, η02 and η30. It is an advantage of embodiments of the present invention that the cell recognition technique results in a high accuracy for cell recognition in different datasets by using the identified image moments. The cell recognition means furthermore may be configured for determining the Zernike moments Z20, Z40, Z60 and Z80 as cell recognition features, with







$$Z_{n,m}=\frac{n+1}{\pi}\sum_{x}\sum_{y} f(x,y)\,V^{*}_{n,m}(\rho,\theta)$$

for x² + y² ≤ 1 and

$$V_{n,m}(\rho,\theta)=R_{n,m}(\rho)\exp(jm\theta)$$


wherein n are non-negative integers representing the order of the Zernike polynomials and m represents the repetitions of the Zernike polynomials, which satisfy the constraint that (n−|m|) is even and |m|≤n. θ is the azimuthal angle, ρ is the radial distance, 0≤ρ≤1, and Rn,m are the radial polynomials







$$R_{n,m}(\rho)=\sum_{s=0}^{(n-\lvert m\rvert)/2}\frac{(-1)^{s}\,(n-s)!}{s!\left(\frac{n+\lvert m\rvert}{2}-s\right)!\left(\frac{n-\lvert m\rvert}{2}-s\right)!}\,\rho^{\,n-2s}.$$


The cell recognition means furthermore may be configured for determining the Hu moments hu1, hu5 and hu7, with






$$hu_1=\eta_{20}+\eta_{02}$$

$$hu_5=(\eta_{30}-3\eta_{12})(\eta_{30}+\eta_{12})\left[(\eta_{30}+\eta_{12})^2-3(\eta_{21}+\eta_{03})^2\right]+(3\eta_{21}-\eta_{03})(\eta_{21}+\eta_{03})\left[3(\eta_{30}+\eta_{12})^2-(\eta_{21}+\eta_{03})^2\right]$$

$$hu_7=(3\eta_{21}-\eta_{03})(\eta_{30}+\eta_{12})\left[(\eta_{30}+\eta_{12})^2-3(\eta_{21}+\eta_{03})^2\right]-(\eta_{30}-3\eta_{12})(\eta_{21}+\eta_{03})\left[3(\eta_{30}+\eta_{12})^2-(\eta_{21}+\eta_{03})^2\right]$$

as cell recognition features.


The cell recognition means furthermore may be configured for determining the cell diameter as cell recognition feature. The cell recognition means furthermore may be configured for determining the cell ridge as cell recognition feature.


The system furthermore may comprise a cell classification means for classifying the cell as a white blood cell. The cell classification means furthermore may be adapted for classifying the cell as a T-lymphocyte, a granulocyte or a monocyte based on the determined features of the cell determined using the cell recognition means.


It is an advantage of embodiments according to the present invention that accurate classification of cells can be performed.


The input means may comprise a hologram data acquisition system for acquiring hologram data of a cell in suspension.


The input means may comprise a data preprocessing system configured for removing a background from the hologram data and/or for normalizing an illumination intensity of the hologram data.


The input means may comprise an image reconstruction system.


The cell recognition means may be configured for determining cell recognition features being exactly the image moments m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21 and the Zernike moment Z20.


The cell recognition means may be configured for determining cell recognition features being exactly the image moments m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20, η21, m30 and μ20, the Zernike moment Z20 and the diameter.


The cell recognition means may be configured for determining cell recognition features being exactly the image moments m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20, η21, m30, μ20, m02, m10, m20, m21, μ02 and η30, the Zernike moments Z20, Z40, Z60 and Z80, the Hu moments hu1, hu5 and hu7, and the diameter and ridge.


The present invention also relates to a diagnostic device comprising a system as described above for analyzing cells. The diagnostic device may be adapted for analyzing blood cells, such as for example white blood cells.


The present invention also relates to a method for differentiating cells, the method comprising receiving a reconstructed image of a cell based on a holographic image of the cell in suspension, and determining cell characterization features from the reconstructed image of the cell for characterization of the cell, wherein the cell recognition features comprise image moments, the image moments being defined as






the spatial moments

$$m_{ij}=\sum_{x,y} f(x,y)\,x^i\,y^j,$$

the central moments

$$\mu_{ij}=\sum_{x,y} f(x,y)\,(x-\bar{x})^i\,(y-\bar{y})^j,\qquad\text{where}\quad \bar{x}=\frac{m_{10}}{m_{00}},\quad \bar{y}=\frac{m_{01}}{m_{00}},$$

and the normalized central moments

$$\eta_{ij}=\frac{\mu_{ij}}{m_{00}^{(i+j)/2+1}}$$







wherein j and i are orders. f(x, y) is the pixel intensity located at (x, y).


The cells may be white blood cells. The method may be adapted for classifying white blood cells into T-lymphocytes, granulocytes and monocytes.


The method furthermore may comprise, based on the determined cell recognition features, identifying whether a white blood cell is a T-lymphocyte, a granulocyte or a monocyte.


Advantageously, the image moments may be geometric moments.


Determining cell characterization features from the reconstructed image of the cell may comprise determining at least the following image moments as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21.


Determining cell characterization features from the reconstructed image of the cell may comprise determining the Zernike moment Z20 as cell recognition feature:







$$Z_{n,m}=\frac{n+1}{\pi}\sum_{x}\sum_{y} f(x,y)\,V^{*}_{n,m}(\rho,\theta)$$

for x² + y² ≤ 1, and

$$V_{n,m}(\rho,\theta)=R_{n,m}(\rho)\exp(jm\theta)$$


wherein n are non-negative integers representing the order of the Zernike polynomials and m represents the repetitions of the Zernike polynomials, which satisfy the constraint that (n−|m|) is even and |m|≤n. θ is the azimuthal angle and ρ is the radial distance, 0≤ρ≤1. Rn,m are the radial polynomials







$$R_{n,m}(\rho)=\sum_{s=0}^{(n-\lvert m\rvert)/2}\frac{(-1)^{s}\,(n-s)!}{s!\left(\frac{n+\lvert m\rvert}{2}-s\right)!\left(\frac{n-\lvert m\rvert}{2}-s\right)!}\,\rho^{\,n-2s}.$$







Determining cell characterization features from the reconstructed image of the cell may comprise determining at least the following image moments as cell recognition features: m30 and μ20.


Determining cell characterization features from the reconstructed image of the cell may comprise determining at least the following image moments as cell recognition features: m02, m10, m20, m21, η02 and η30.


Determining cell characterization features from the reconstructed image of the cell may comprise determining the Zernike moments Z20, Z40, Z60 and Z80 as cell recognition features, with







$$Z_{n,m}=\frac{n+1}{\pi}\sum_{x}\sum_{y} f(x,y)\,V^{*}_{n,m}(\rho,\theta)$$

for x² + y² ≤ 1 and

$$V_{n,m}(\rho,\theta)=R_{n,m}(\rho)\exp(jm\theta)$$


wherein n are non-negative integers representing the order of the Zernike polynomials and m represents the repetitions of the Zernike polynomials, which satisfy the constraint that (n−|m|) is even and |m|≤n. θ is the azimuthal angle, ρ is the radial distance, 0≤ρ≤1, and Rn,m are the radial polynomials








$$R_{n,m}(\rho)=\sum_{s=0}^{(n-\lvert m\rvert)/2}\frac{(-1)^{s}\,(n-s)!}{s!\left(\frac{n+\lvert m\rvert}{2}-s\right)!\left(\frac{n-\lvert m\rvert}{2}-s\right)!}\,\rho^{\,n-2s},$$




determining the Hu moments hu1, hu5 and hu7, with






$$hu_1=\eta_{20}+\eta_{02}$$

$$hu_5=(\eta_{30}-3\eta_{12})(\eta_{30}+\eta_{12})\left[(\eta_{30}+\eta_{12})^2-3(\eta_{21}+\eta_{03})^2\right]+(3\eta_{21}-\eta_{03})(\eta_{21}+\eta_{03})\left[3(\eta_{30}+\eta_{12})^2-(\eta_{21}+\eta_{03})^2\right]$$

$$hu_7=(3\eta_{21}-\eta_{03})(\eta_{30}+\eta_{12})\left[(\eta_{30}+\eta_{12})^2-3(\eta_{21}+\eta_{03})^2\right]-(\eta_{30}-3\eta_{12})(\eta_{21}+\eta_{03})\left[3(\eta_{30}+\eta_{12})^2-(\eta_{21}+\eta_{03})^2\right],$$

and


determining the diameter and ridge as cell recognition features.


The method may comprise classifying the cell as a white blood cell. Classifying may be classifying the cell as a T-lymphocyte, a granulocyte or a monocyte based on the determined features of the cell.


Receiving a reconstructed image of a cell may comprise acquiring hologram data of a cell in suspension.


Receiving a reconstructed image of a cell may comprise removing a background from the hologram data and/or normalizing an illumination intensity of the hologram data.


Receiving a reconstructed image of a cell may comprise reconstructing the image based on the hologram data, after removing the background and/or normalizing the illumination intensity.


Particular and preferred aspects of the invention are set out in the accompanying independent and dependent claims. Features from the dependent claims may be combined with features of the independent claims and with features of other dependent claims as appropriate and not merely as explicitly set out in the claims.


For purposes of summarizing the invention and the advantages achieved over the prior art, certain objects and advantages of the invention have been described herein above. Of course, it is to be understood that not necessarily all such objects or advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other objects or advantages as may be taught or suggested herein.


The above and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention will now be described further, by way of example, with reference to the accompanying drawings.



FIG. 1 illustrates a schematic overview of components of a cell differentiating system according to an embodiment of the present invention.



FIG. 2 shows a flow diagram of the WBC recognition pipeline, as can be used in an embodiment of the present invention.



FIG. 3 shows conventional bright field microscope and lens-free, holographically reconstructed amplitude and phase images of three WBC types, the latter being used in embodiments of the present invention.



FIG. 4 shows a measurement of a WBC edge and extracted edge features, as can be used in embodiments of the present invention.



FIG. 5 shows the resulting p-values of one-way ANOVA for each feature extracted and differentiated for possible pairs of white blood cells, illustrating advantages of embodiments of the present invention.



FIG. 6 shows the strength of standardized coefficients obtained by linear discriminant analysis for all three white blood cells, illustrating advantages of embodiments of the present invention.



FIG. 7 compares classification accuracies for linear SVM for grouped features, illustrating advantages of embodiments of the present invention.



FIG. 8 illustrates the marginal increase in classification accuracy of linear SVM as the number of selected features is increased, illustrating advantages of embodiments of the present invention.





The drawings are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn on scale for illustrative purposes. The dimensions and the relative dimensions do not necessarily correspond to actual reductions to practice of the invention.


Any reference signs in the claims shall not be construed as limiting the scope.


In the different drawings, the same reference signs refer to the same or analogous elements.


DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

The present invention will be described with respect to particular embodiments and with reference to certain drawings but the invention is not limited thereto but only by the claims.


The terms first, second and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a sequence, either temporally, spatially, in ranking or in any other manner. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.


Moreover, directional terminology such as top, bottom, front, back, leading, trailing, under, over and the like in the description and the claims is used for descriptive purposes with reference to the orientation of the drawings being described, and not necessarily for describing relative positions. Because components of embodiments of the present invention can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration only, and is in no way intended to be limiting, unless otherwise indicated. It is, hence, to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other orientations than described or illustrated herein.


It is to be noticed that the term “comprising”, used in the claims, should not be interpreted as being restricted to the means listed thereafter; it does not exclude other elements or steps. It is thus to be interpreted as specifying the presence of the stated features, integers, steps or components as referred to, but does not preclude the presence or addition of one or more other features, integers, steps or components, or groups thereof. Thus, the scope of the expression “a device comprising means A and B” should not be limited to devices consisting only of components A and B. It means that with respect to the present invention, the only relevant components of the device are A and B.


Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.


Similarly it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.


Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.


It should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to include any specific characteristics of the features or aspects of the invention with which that terminology is associated.


In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.


In a first aspect, the present invention relates to a cell differentiating system for differentiating cells. Whereas in the description of particular embodiments the system will be mainly described and illustrated with reference to a system for differentiating white blood cells and even more particularly for differentiating white blood cells into T-lymphocytes, granulocytes and monocytes, embodiments of the present invention are not limited thereto. More particularly, the system is suitable for characterization or differentiation of cells in flow in general. According to the present invention, the cell differentiating system comprises an input means for receiving a reconstructed image of a cell based on a holographic image of the cell in suspension. The input means may be configured for receiving reconstructed images from an external source but may alternatively comprise an imaging means, such as for example a lens-free imaging flow cytometer, and an image reconstruction means for obtaining image reconstructions of the cell. The input means may be adapted for receiving images from single cells, e.g. single cells in suspension. The input means also may be adapted for receiving images from single cells in flow. The cell thereby typically may be imaged using a spherical point radiation source, such as for example a spherical point source laser. It also is an advantage of embodiments of the present invention that no segmentation of individual cells is required. Furthermore, the cell differentiating system also comprises a cell recognition means for determining cell characterization features from the reconstructed image of the cell for characterization of the cell. The cell recognition means according to embodiments of the present invention is configured for determining, as cell recognition features, image moments for the reconstructed image. These image moments may be geometric moments or a group of moments comprising such geometric moments. The image moments are defined as






$$m_{ij}=\sum_{x,y} f(x,y)\,x^i\,y^j$$ and

$$\mu_{ij}=\sum_{x,y} f(x,y)\,(x-\bar{x})^i\,(y-\bar{y})^j$$ where

$$\bar{x}=\frac{m_{10}}{m_{00}},\qquad \bar{y}=\frac{m_{01}}{m_{00}},\qquad\text{and}\qquad \eta_{ij}=\frac{\mu_{ij}}{m_{00}^{(i+j)/2+1}}$$







wherein j and i are orders and wherein f(x, y) is a digital image.


In some embodiments, the cell recognition means is adapted for determining at least the following image moments as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21. It thereby is an advantage that these moments may result in very efficient cell differentiation, e.g. if the features used are limited to these moments or to a group of features comprising these moments. In some embodiments also the Zernike moment Z20 is added as cell recognition feature. Still other embodiments further include the image moments m30 and μ20. In yet other embodiments, also the image moments m02, m10, m20, m21, η02 and η30 are included.


Still other embodiments also include the Zernike moments Z20, Z40, Z60 and Z80 as cell recognition features, the Hu moments hu1, hu5 and hu7 and/or the diameter and/or ridge as cell recognition features. In some embodiments at least the following image moments were selected, or exactly the following image moments were selected: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21 and Z20. In other embodiments at least the following image moments were selected, or exactly the following image moments were selected: m00, m01, m11, m12, m30, μ03, μ12, μ20, μ21, η03, η12, η20 and η21 and Z20. In some embodiments, the cell recognition features are limited to those explicitly mentioned in the respective embodiments described above. It was surprisingly found that with the limited features indicated above for the different embodiments it was possible to obtain an accurate cell differentiation in a computationally efficient way. Such features can be selected based on linear discriminant analysis. These models take into account the labeled information (cell classes) of the cells, resulting in highly relevant features for differentiating different cell types, such as for example the three subtypes of white blood cells. The identification of the cell features may be based on a particular feature selection, as will be illustrated in a particular example described further below. Some of the selected features have a physical meaning. The moment m00 is representative of the mass of the cell image, which in the image corresponds to the sum of the grey levels. The moment m01 helps to describe the centroid of the image, i.e.







$$\bar{y}=\frac{m_{01}}{m_{00}}.$$





Alternatively or additionally, the moment m10 can be used, which also assists in describing the centroid of the image, i.e.







$$\bar{x}=\frac{m_{10}}{m_{00}}.$$





The moments μ03, μ12, μ21, and optionally μ30 and μ20, are invariant with respect to translations. The moments η03, η12, η20 and η21 are invariant with respect to both translation and scale, and the moment Z20 is invariant with respect to rotation. According to some embodiments, the cell differentiating system also comprises a cell classification means for classifying the cells using the cell recognition features. By way of illustration, FIG. 1 shows a cell identification system 100 comprising an input means 110 for receiving a reconstructed image of a cell based on a holographic image of the cell in suspension, a cell recognition means 120 and a cell classification means 130. In some embodiments, the cell recognition means 120 and the cell classification means 130 may be combined. The cell differentiating system 100 may be implemented as software as well as hardware. Such software may run on a processor. It is an advantage of embodiments of the present invention that the required amount of computing power for the processor can be limited, since only a specific selection of recognition features is required, as indicated above. As indicated above, the system may be adapted for classifying cells in suspension or flow. It thereby is an advantage that holographic images can be used of cells in suspension or flow, in contrast to cells in a smear, since classifying cells in a smear requires more processing of the blood sample and/or may result in a less accurate classification. The input means thus may be adapted for receiving holographic images from, or may itself comprise, an imaging setup comprising a fluidic or microfluidic compartment, e.g. a channel, that is adapted for receiving a suspension with the cells or a flow with the cells, as well as a holographic imaging means. Advantageously, the imaging means, the holographic image processing means and the classifying system are combined in a single system, so as to allow for real-time imaging and classification. Further features and advantages of embodiments of the present invention will be illustrated with reference to a particular example described further below.
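
A minimal structural sketch of how the three means of FIG. 1 could be wired together in software is given below. All names (CellDifferentiatingSystem, differentiate) and the use of an sklearn-style classifier object are assumptions made for illustration; they do not describe the claimed implementation.

```python
from dataclasses import dataclass
from typing import Any, Callable
import numpy as np

@dataclass
class CellDifferentiatingSystem:
    """Structural sketch of FIG. 1 (hypothetical names): input means 110 yields a
    reconstructed image, recognition means 120 turns it into a feature vector,
    and classification means 130 assigns a WBC subtype."""
    reconstruct: Callable[[np.ndarray], np.ndarray]        # input means 110
    extract_features: Callable[[np.ndarray], np.ndarray]   # cell recognition means 120
    classifier: Any                                         # cell classification means 130, e.g. a fitted sklearn model

    def differentiate(self, hologram: np.ndarray) -> str:
        image = self.reconstruct(hologram)
        features = np.atleast_2d(self.extract_features(image))
        # e.g. 'T-lymphocyte', 'granulocyte' or 'monocyte'
        return self.classifier.predict(features)[0]
```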


The present invention, in another aspect, also relates to a diagnostic device comprising a cell differentiating system for analyzing cells such as for example white blood cells. The cell differentiating system thereby is a cell differentiating system as described in the first aspect, comprising the same features and advantages as embodiments described in the first aspect. Aside from the cell differentiating system—or where appropriate some components may be part of the cell differentiating system as well—the diagnostic device typically may comprise a lens-free imaging flow cytometer. The flow cytometer may be compact. One example of a lens-free imaging flow cytometer that may be used is based on a holographic set-up comprising a point source illumination source, such as for example a waveguide, a microfluidic chip and a high-speed camera for imaging, such as for example a CMOS camera. In a particular example, immersion oil and a glass window are placed between the point source, the microfluidic chip and the camera, in order to create a complete refractive index-matched system. Radiation, e.g. laser light, is scattered by cells traveling through the main microfluidic channel. From the scattered and unscattered radiation an interference pattern or hologram is captured. Stroboscopic illumination is employed to avoid motion blur for the cells in flow in the system. In order to identify a cell, its hologram is numerically diffracted back to the channel plane and the resulting reconstruction is subsequently handled by the cell differentiating system. It will be clear that the particular flow cytometer described is only an example of a system that can be used in embodiments of the present invention for obtaining images of cells in flow, and therefore the present invention is not limited to this particular example. The diagnostic device has the advantage that it can be used as an inexpensive and portable cell analyzer, e.g. a haematology analyzer, that can be utilized at the point-of-care in emergency settings or in resource-limited settings.


In a third aspect the present invention relates to a method for differentiating cells. The method may for example be used for characterizing white blood cells, although embodiments are not limited thereto and can be applied more generally to cells in flow. The method comprises receiving a reconstructed image of a cell based on a holographic image of the cell in suspension, and determining cell characterization features from the reconstructed image of the cell for characterization of the cell. The cell recognition features comprise image moments being defined as






the spatial moments

$$m_{ij}=\sum_{x,y} f(x,y)\,x^i\,y^j,$$

the central moments

$$\mu_{ij}=\sum_{x,y} f(x,y)\,(x-\bar{x})^i\,(y-\bar{y})^j,\qquad\text{where}\quad \bar{x}=\frac{m_{10}}{m_{00}},\quad \bar{y}=\frac{m_{01}}{m_{00}},$$

and the normalized central moments

$$\eta_{ij}=\frac{\mu_{ij}}{m_{00}^{(i+j)/2+1}}$$







wherein j and i are orders. f(x, y) is the pixel intensity of the digital image located at (x, y). In particular embodiments, at least the following image moments are selected as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21. In some embodiments at least the following image moments were selected, or exactly the following image moments were selected: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21 and Z20. In other embodiments at least the following image moments were selected, or exactly the following image moments were selected: m00, m01, m11, m12, m30, μ03, μ12, μ20, μ21, η03, η12, η20 and η21 and Z20. In some embodiments, the cell recognition features are limited to those explicitly mentioned in the respective embodiments described above. It was surprisingly found that with the limited features indicated above for the different embodiments it was possible to obtain an accurate cell differentiation in a computationally efficient way. Such features can be selected based on linear discriminant analysis. These models take into account the labeled information (cell classes) of the cells, resulting in highly relevant features for differentiating different cell types, such as for example the three subtypes of white blood cells.


The method, in some embodiments furthermore comprises the step of classifying the cell. The method further may comprise method steps corresponding with the functionality of components of the system described in system embodiments of the first aspect of the present invention. The method may be implemented as computer program product and may be stored on a processor. The present invention thus also includes a computer program product which performs the method according to embodiments of the present invention when executed on a computing device. One configuration of a processor that can include such a computer program product may for example include at least one programmable computing component coupled to a memory subsystem that includes at least one form of memory, e.g., RAM, ROM, and so forth. It is to be noted that the computing component or computing components may be a general purpose, or a special purpose computing component. Thus, one or more aspects of the present invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.


In another aspect, the present invention relates to a data carrier for carrying a computer program product for performing cell identification, e.g. using a system or device as described above. Such a data carrier may comprise a computer program product tangibly embodied thereon and may carry machine-readable code for execution by a programmable processor. The present invention thus relates to a carrier medium carrying a computer program product that, when executed on computing means, provides instructions for executing any of the functions as described for the devices and systems as described above. The term “carrier medium” refers to any medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as a storage device which is part of mass storage. Common forms of computer readable media include, a CD-ROM, a DVD, a memory chip or any other medium from which a computer can read. The computer program product can also be transmitted via a carrier wave in a network, such as a LAN, a WAN or the Internet. Transmission media can take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a bus within a computer.


By way of illustration, embodiments of the present invention not being limited thereto, features and advantages of particular embodiments of the invention will be described with reference to test results illustrating recognizing and classifying three types of human white blood cells (WBCs) using lens-free holographic imaging in combination with a flow cytometer. These three types may be T-lymphocytes, granulocytes, and monocytes. A comprehensive image database for these three WBCs is thus established and is well balanced with respect to the fractional population sizes of the three types. The test results also illustrate how feature selection and classification algorithms can be obtained using machine learning techniques, whereby it is a stringent requirement for accuracy to have sufficient data. The obtained feature selection and classification algorithms illustrate the possibility for a fully automated, high-throughput blood analyzer such as compact flow cytometers as described in some embodiments of the present invention.


For a highly accurate cell recognition device based on a classification algorithm, the predictions with respect to a specific cell type made by such an algorithm have to agree to a very high degree with a standard. In particular embodiments of the present invention such a standard may be represented by a reference measurement performed on the same blood sample and used to label the WBCs according to their subtype. In the present example, the labeling of WBCs in a reference measurement was achieved by fluorescent tagging. Fluorophores stain the three WBC types differently since they are attached to antibodies that bind selectively to specific subtypes out of all the possible WBCs. Typical antibodies for fluorescent labeling may be anti-CD3, anti-CD14, and anti-CD15. Under the influence of an external light source, the fluorophores are excited and re-emit at a different wavelength. Provided that there is little cross-contamination, the wavelength of the light emitted by a single cell in a run, or the re-emission event itself as a consequence of the target-cell-specific binding, is indicative of its type; hence fluorescent labeling leads to a standard of high quality.


In the present example, a cell preparation step was performed that enables the simultaneous recording of a fluorescent reference signal at the time the label-free imaging is performed on the cell passing across an imaging region of the flow cytometer. Blood donors were identified and blood was drawn from them under the consent of a supervising medical and ethical committee. The whole blood was divided into 2 mL aliquots which were then incubated with either the anti-CD3, anti-CD14, or anti-CD15 phycoerythrin-conjugated antibodies (BD Biosciences). Thus WBC subtype specific tagging by a fluorophore was obtained for lymphocytes, monocytes, and granulocytes, respectively. It is an advantage to use phycoerythrin molecules for staining, as those give rise to a very bright emission signal due to their high absorption coefficient and quantum efficiency. Moreover, phycoerythrin emission spectra peak around 570 nm wavelengths, to which CMOS imagers are very responsive. Thereafter the stained aliquots were washed to remove unwanted debris, plasma, platelets, and other undesired components present in human whole blood that may be sources of contamination during the reference measurements of the investigated WBCs. Next, the washed samples were suspended in BD FACS Lyse solution in order to lyse the red blood cells still present in the aliquots. This was followed by another washing step which removes the lysed red blood cells and possible debris. Then the samples were suspended in a running buffer comprising PBS, 0.5% BSA, and 2 mM EDTA. In a last step, the cell concentration in each aliquot was adjusted to the same level of 3*10^6 cells/mL so as to have a balanced number of WBC subtypes in each sample. This adjustment was achieved by counting WBCs with a Scepter cell counter (Merck Millipore). Adjusting the cell concentrations helped expedite the acquisition of a balanced image dataset of the three WBC types.


In the present example, the image acquisition of the prepared samples containing stained WBCs was performed with an imaging 532 nm laser beam focused above a transparent microfluidic flow channel through which the WBCs pass one at a time, thus illuminating the cells. Hydrodynamic focusing was used to align and load single WBCs into the focusing flow channel. The light of the illuminating laser that is transmitted, diffracted, or scattered by the fluidic channel and the cells was recorded by a CMOS imager, e.g. a CMOS camera or CMOS sensor, that may be positioned underneath the flow channel. Other optical elements may be placed between the flow channel and the CMOS imager, such as wavelength filters and thin dichroic mirrors that may improve signal quality in cases in which an additional fluorescent emission of light takes place which superposes the image-forming light signal. It is to be noticed that typically in embodiments of the present invention such additional fluorescent light will be absent because the WBCs are imaged and classified in a label-free manner. Nevertheless, for the sake of creating a reference, in the present example cross-referencing with fluorescent radiation was used. In the diagnostic system of the present example there was no lens, mirror or other bulky optical component required for holographic image formation on a compact CMOS sensor. This lens-free imaging has the advantage of reducing costs related to the design and fabrication of lenses or lens systems, reducing the weight of the imaging flow cytometer, and removing aberrations imparted by lens elements that degrade image quality. In the present example the flow channel was a fully integrated microfluidic flow channel, which further reduces size, cost, and weight of the imaging flow cytometer. This may make such a device scalable, cheaper and easy to transport.


In the present example the 532 nm imaging laser beam was focused by a 20× microscope objective. The same objective was used for focusing a 488 nm excitation laser beam onto the WBCs inside the transparent flow channel so as to trigger the emission of a fluorescent signal that identifies the WBC type. A separate, fast photodetector was used for this purpose such that a fluorescent emission event was quickly registered and the CMOS camera triggered so as to rapidly record holographic images of the WBC of interest before the WBC left the imaging area of the flow cytometer. This has the advantage that only WBCs of interest are imaged and those images stored on a hard drive for later image processing, drastically reducing the storage requirements as compared to continuous recording of holographic images.


In the present example, the fluorescent signal was recorded as well and correlated to the holographic image record, serving as an additional feature for WBC recognition and classification. An FPGA was used and configured in such a way that it compared the measured, integrated fluorescent signal to a reference value set by the user, above which it starts recording the measured value of the fluorescent signal, turns off the excitation laser beam, turns on the imaging laser beam, and triggers the capture of holographic images by the CMOS camera. The FPGA was used to control the exact timing of those events very rapidly by running a LabView program on it. It is of advantage to capture two consecutive holographic images in such a configuration and to set a delay that allows the imaged cell to leave the imaging area of the flow cytometer. This way of proceeding ensures that the two consecutive images captured can be used to perform background removal and hence reduce image artifacts and enhance contrast during image reconstruction. This also alleviates the problem of changing illumination conditions due to laser misalignment, thermal variations of the laser focal point, multiple reflections, and stray light, as these typically occur on timescales larger than the delay between the two consecutive holographic image captures; hence these effects may be accounted for by background subtraction.


Reference will now be made to FIGS. 2 and 3. In experiments 5000 holographic, referenced images have been acquired 211 by applying the described cell preparation and image acquisition steps. Data preprocessing 210 was another step that was performed in preparing data as input for cell differentiating systems according to embodiments of the present invention. This typically precedes the reconstruction of microscopy images 220 from the underlying holographic images. Background removal 212 was also performed and was enabled by the two consecutively captured holographic images. It was used to significantly cancel unwanted variations across the recorded images, e.g. those caused by slowly changing illumination conditions. Another measure aiming at a reduction in variation across image data was the implementation of an intensity normalization step 213, i.e. all the holograms were normalized with respect to the same illumination intensity profile.
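
A hedged sketch of the preprocessing steps 212 and 213 is given below, assuming the two consecutively captured holograms described earlier (one containing the cell, one captured after the cell has left the field of view). The particular normalization chosen here (zero mean, unit standard deviation) is an assumption, since the example only states that all holograms were normalized to the same illumination intensity profile.

```python
import numpy as np

def preprocess_hologram(holo_with_cell, holo_background):
    """Background removal using a second, cell-free hologram, followed by a simple
    intensity normalization (sketch only; not the exact procedure of the example)."""
    holo = holo_with_cell.astype(float) - holo_background.astype(float)  # cancel slowly varying background
    holo -= holo.mean()                                                  # centre the intensities
    scale = holo.std()
    return holo / scale if scale > 0 else holo                           # bring all holograms to a common scale
```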


The angular spectrum method, as the reconstruction method of choice, was used in the current example to propagate the holographic image back to the focal plane. It is an advantage of digital holographic imaging techniques that the focal plane may easily be shifted to an optimal plane at which the reconstruction is performed and results in the sharpest microscopic images of the cell. The focusing step 221 is shown in FIG. 2. However, a varying focal plane, as a consequence of varying focal points of the illuminating laser beam or the axial displacement of the cells within the flow channel, leads to a varying magnification factor for the reconstructed cell images. In the present example the resulting variations of pixel sizes were eliminated by rescaling to a uniform pixel size of typically 5 μm. This was achieved by calibrating the holographic imaging setup in such a way that the image magnification factor is known or may be inferred or interpolated, by placing a characterized target of known dimensions at different planes, recording and reconstructing the holographic images, and comparing the apparent and actual size of the target at different axial positions. The reconstruction 222 followed by the pixel scaling calibration 223 is also shown in FIG. 2. Moreover, one may strive for even cleaner reconstructed microscopy images by removing the out-of-focus contribution of the twin image. The latter contribution may be estimated by inspection of the fringe pattern just outside the reconstructed cell. Digital holographic reconstruction methods have the additional advantage that the reconstructed image is complex and may be decomposed into an amplitude image, as usually seen in bright field microscopy, and a quantitative phase image, similar to microscopy images seen in phase contrast microscopes. FIG. 3 shows exemplary bright field images of the three WBC types in a blood smear, their corresponding amplitude images as obtained through holographic reconstruction, as well as the phase images as obtained through holographic reconstruction.
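
The angular spectrum propagation used for reconstruction 222 could look roughly as follows. The 532 nm wavelength matches the example, but the sensor pixel size, the sign convention for the back-propagation distance z and the suppression of evanescent components are assumptions made for this sketch.

```python
import numpy as np

def angular_spectrum_propagate(hologram, z, wavelength=532e-9, pixel_size=1.12e-6):
    """Numerically propagate a preprocessed hologram over a distance z with the
    angular spectrum method; returns amplitude and quantitative phase images.
    Sketch only; pixel_size and the sign of z are assumed parameters."""
    H, W = hologram.shape
    fx = np.fft.fftfreq(W, d=pixel_size)                 # spatial frequencies along x
    fy = np.fft.fftfreq(H, d=pixel_size)                 # spatial frequencies along y
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))       # evanescent components suppressed
    transfer = np.exp(1j * kz * z)                       # free-space transfer function
    field = np.fft.ifft2(np.fft.fft2(hologram) * transfer)
    return np.abs(field), np.angle(field)                # amplitude and phase reconstructions
```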


In the present example numerous features 231 with regard to the morphological, biological, or optical aspects of the WBC were extracted during the recognition phase 230. Such features may be related to the geometry of the reconstructed WBC images, for instance by calculating geometric moments mij of the reconstructed WBC amplitude image 32 having pixel intensities f(x,y) for a pixel located at the point (x,y), as indicated in Eq. 1a. Since these raw moments lack desirable invariance properties, also centered moments μij, Eq. 1b, and scale-invariant moments ηij, Eq. 1c, were introduced; such moments exploit symmetries in the image and have the advantage of effectively reducing image dimensions and speeding up image processing, while still encoding the most relevant information of the image in condensed form. There is a total of 24 such moment coefficients that are independent and include moments up to the third degree.






$$m_{ij}=\sum_{x,y} f(x,y)\,x^i\,y^j\qquad\text{(Eq. 1a)}$$

$$\mu_{ij}=\sum_{x,y} f(x,y)\,(x-m_{10}/m_{00})^i\,(y-m_{01}/m_{00})^j\qquad\text{(Eq. 1b)}$$

$$\eta_{ij}=\mu_{ij}/\mu_{00}^{\,1+(i+j)/2},\quad i+j\ge 2\qquad\text{(Eq. 1c)}$$


Furthermore rotationally invariant Hu moments were also determined, which may be another set of geometric extracted features 231. There are seven such Hu moments, which derive from the moments up to degree three. A further set of geometric moments establishing extracted features 231 of the reconstructed WBC images are the set of fixed order Zernike polynomials. Zernike polynomials have the advantage that they form an orthogonal, rotation invariant basis and therefore are less redundant than the corresponding geometric moments of corresponding degree. In the present example 24 such Zernike polynomials were used as extracted features 231 relating to the cell's morphology. Other morphological features 231 were extracted, such as the reconstructed cell diameter or the cell ridge along its edge, as a measure of the WBC size and linear internal patterns. Cell edges were detected by the zero-crossing of second order directional derivatives operating on the reconstructed WBC phase image. A circle was fitted to the detected edge points so as to derive a cell diameter. Two other features 231 were extracted based on the edge characteristics proper to the reconstructed cell. This is shown in FIG. 4. Drawing normal lines 41 across the cell edges 42 of a reconstructed amplitude image 40 of the WBC, one detects a sudden drop in intensities 43 as one progresses outwards along such a normal. This is because edges typically appear dark in reconstructed holographic amplitude images 32. Averaging the minimum intensities 44 occurring in the drop region 43 over the number of normals drawn around the cell circumferential edge 42, and ignoring possible outliers, e.g. due to the presence of granules close to the cell edge 42, allowed an edge intensity feature 231 to be extracted. Moreover an edge width feature 231 was extracted by measuring the average full width at half minimum 45 around the cell's circumferential edge 42. Therefore a total of 60 extracted features 231 were obtained in the present example.
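
A rough, assumption-laden sketch of the two edge features (average edge intensity and average edge width) follows; it presumes that the edge points and cell centre have already been obtained from the zero-crossing edge detection and circle fit described above, and the width estimate used here is only a crude full-width-at-half-minimum proxy rather than the exact measurement of the example.

```python
import numpy as np

def edge_features(amplitude, edge_points, centre, n_samples=180, half_len=10):
    """Average edge intensity and edge width along normals drawn across the cell edge
    (sketch; edge_points and centre come from a separate detection step not shown)."""
    depths, widths = [], []
    step = max(1, len(edge_points) // n_samples)
    for (ex, ey) in edge_points[::step]:
        nx, ny = ex - centre[0], ey - centre[1]                  # outward normal direction
        norm = np.hypot(nx, ny) or 1.0
        nx, ny = nx / norm, ny / norm
        t = np.linspace(-half_len, half_len, 2 * half_len + 1)   # sample along the normal
        xs = np.clip(np.round(ex + t * nx).astype(int), 0, amplitude.shape[1] - 1)
        ys = np.clip(np.round(ey + t * ny).astype(int), 0, amplitude.shape[0] - 1)
        profile = amplitude[ys, xs]
        depths.append(profile.min())                             # dark dip at the cell edge
        half_min = (profile.min() + profile.max()) / 2
        widths.append(np.count_nonzero(profile <= half_min))     # crude width of the dip in samples
    return float(np.mean(depths)), float(np.mean(widths))
```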


In the present example, a feature selection stage 232 followed the feature extraction stage 231 during the WBC recognition phase 230. Feature selection 232 aims at reducing the number of characterizing features that are necessary for a successful classification 233 of the WBCs. Feature selection 232 is a crucial advantage in high-throughput flow cytometry applications, in which reconstructing the required number of holographic images per second is already a computationally intensive operation; every extra feature extraction 231 with only little impact on the classification result 233 leads to computational overhead that can be avoided by not extracting those features 231. The feature selection illustrates why embodiments of the present invention can be beneficial with respect to accuracy and selectiveness. Another beneficial factor of feature selection 232 may be a faster classification 233 of WBCs. This may be of importance in cell sorting applications in integrated flow cytometry devices with limited latency between the imaging trigger event and the sorting or separating event during which the cell is deviated from its original path according to its type. At high flow speeds and short, compact distances a decision has to be taken within this short latency period; the decision may involve the image reconstruction 221, feature extraction 231, feature selection 232, and classification 233 stages for instance. Feature selection 232 may also enable weighting extracted features 231 according to their importance in discriminating between WBC types. In general it may prove useful to center and normalize all the extracted features 231 before starting the selection stage 232 or classification stage 233.


In experiments on WBC recognition 230, the total set of reconstructed images 221 and the features 231 extracted from them is randomly split into a training and a testing set. Typically 75% of the available data is used for training and 25% for testing the trained feature selection 232 and classification 233 algorithm. The statistical effect of sampling from a limited dataset may be mitigated by the use of shuffle-split validation, which resamples and splits the total available data 10 times into a test and a validation subset. The accuracy achieved by the classification method 233 may be averaged accordingly over the ten folds, whereby classification accuracy is defined as the percentage ratio of correctly classified WBC types to the total amount of WBC samples in the test fold.


In the present example analysis of variance, ANOVA, was used to assess the importance of extracted features 231. This was achieved by performing a series of statistical t-tests and affirming or rejecting the null hypothesis of equal group means for a particular feature variable. More specifically, the null hypothesis was rejected for the statistical p-value being smaller than a fixed level of significance, e.g. p<0.05. Useful features 231 were thus selected if the corresponding p-value was small, as they introduce differences in the group means that are significant. Likewise, extracted features 231 with large p-values under ANOVA testing tend to affirm the null hypothesis and were interpreted as having little impact on discriminating between WBC groups. The top graph of FIG. 5 lists the p-values that were derived from 60 extracted features 231 by performing ANOVA as the selection rule 232. Furthermore, ANOVA for feature selection 232 was performed using different pairings of WBC types. This is illustrated in FIG. 5 in the last three graphs for all possible pairings of WBC groups, viz. T-cells and granulocytes, T-cells and monocytes, and granulocytes and monocytes. Filled circles in FIG. 5 correspond to extracted features that perform well under linear discriminant analysis, LDA, as selection criterion 232 too.
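
The ANOVA-based selection rule 232 can be sketched with scikit-learn's f_classif, which performs a one-way ANOVA F-test per feature column. The 0.05 significance level follows the example above, while the function name anova_select is hypothetical.

```python
import numpy as np
from sklearn.feature_selection import f_classif

def anova_select(X, y, alpha=0.05):
    """One-way ANOVA per extracted feature (columns of X) against the WBC labels y;
    features whose p-value falls below the significance level are retained (sketch)."""
    f_stats, p_values = f_classif(X, y)
    selected = np.where(p_values < alpha)[0]   # indices of significant features
    return selected, p_values
```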


In the example LDA was also used as selection rule 232. LDA is a widely used machine learning algorithm for dimensionality reduction. It projects high dimensional data onto lower dimensional spaces in which the data can be visualized or classified more easily. LDA was used as projection and feature dimension limiting algorithm 232 and as classification algorithm 233 at the same time. This has the advantage of limiting the complexity of the computer code that runs the instructions for cell recognition 233 on a data processing device, e.g. an ASIC or computer. LDA makes the assumption of homoscedastic, normally distributed classes and fits separating hyperplanes between the projected clusters of WBC types. The linearity of the separating hyperplanes is an attractive feature for implementing LDA as a selection rule 232 and classification method 233, as the decision hyperplanes are easily implemented as dot products of vectors: a weight vector characterizing the position and orientation of the separating hyperplane and the feature vector 231 which combines all the extracted features. The importance of features was measured by introducing standardized coefficients of the discriminating functions. A strongly positive standardized coefficient favored the variance between WBC groups, and the associated feature may be selected for classification. FIG. 6 illustrates the coefficient weights as obtained by using LDA on 60 extracted features 231. Some coefficients associated with specific features were beyond unity, indicating correlations among features in this particular case. In the present example the three features associated with the three strongest coefficients in each WBC group were selected for classification; in another embodiment the six strongest features in each group were selected for classification. For the latter case experiments have shown overlapping features among the three WBC groups, reducing their effective number to twelve. It is therefore an advantage of LDA as selection rule 232 that it significantly reduces the number of relevant extracted features 231, e.g. from 60 to twelve.
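
A sketch of the LDA-based ranking follows; it ranks centred and scaled features by the magnitude of the per-class LDA coefficients and keeps the strongest ones per class, merging overlaps. The exact standardization of the coefficients used in the example is not detailed, so this is an approximation of the selection rule rather than the rule itself.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

def lda_rank_features(X, y, top_k=6):
    """Keep the top_k strongest LDA coefficients per WBC class, merged across classes
    (sketch of the selection rule 232; coefficient standardization is approximated
    by standardizing the input features)."""
    Xs = StandardScaler().fit_transform(X)            # centre and normalize the extracted features
    lda = LinearDiscriminantAnalysis().fit(Xs, y)
    selected = set()
    for class_coefs in np.atleast_2d(lda.coef_):      # one coefficient vector per class
        strongest = np.argsort(-np.abs(class_coefs))[:top_k]
        selected.update(strongest.tolist())
    return sorted(selected)                            # indices of retained features
```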


In the present example, the classification algorithm 233 for WBC recognition 230 based on the set of selected features 232 could be selected from LDA, quadratic discriminant analysis (QDA), support vector machines (SVMs), nearest neighbor algorithms, random forests, decision trees, naïve Bayes, AdaBoost, or any other suitable classification algorithm, as is appreciated by a person skilled in this field. The overall best performing algorithm in this particular experiment was SVM-linear, a linear-kernel SVM. Support vector machines, like LDA, may be advantageous in cases where linear separating planes are easily implemented and rapidly evaluated. Moreover, use of SVM or LDA may require only a small set of model parameters to be stored, e.g. the support vectors, or the class centroids and discriminant function coefficients, respectively. Parameter look-up and retrieval is thus fast in such examples, leading to short latency periods. SVM has the advantage of supporting the kernel trick, which implies that it may achieve classification 233 with good separability in high-dimensional nonlinear embedding spaces without the explicit embedding functions having to be computed.
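A sketch of the classification step 233 with a linear-kernel SVM on the selected features is given below; it also makes explicit that, for a linear kernel, the decision values reduce to dot products between stored weight vectors and the feature vector 231. Names such as selected_idx are placeholders, and the sketch is not the exact implementation used in the experiments.

# Sketch of the classification step 233 with SVM-linear; illustrative only.
import numpy as np
from sklearn.svm import SVC

def train_linear_svm(X_train, y_train, selected_idx):
    """Train a linear-kernel SVM on the selected feature columns only."""
    clf = SVC(kernel="linear")
    clf.fit(X_train[:, selected_idx], y_train)
    return clf

def decision_values(clf, x):
    # For a linear kernel, the decision function is a dot product between a
    # stored weight vector and the feature vector plus a bias, one value per
    # class pair (one-vs-one).
    return x @ clf.coef_.T + clf.intercept_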



FIG. 7 shows the classification accuracies of SVM-linear for the present example for the case of grouped extracted features 232, e.g. cell diameter and ridge, edge features (intensity and width) only, geometric moments, Hu moments, and Zernike moments.
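A per-group comparison of this kind could be reproduced with a sketch such as the following; the column-index groups are placeholders and do not reflect the actual layout of the feature vector 231.

# Sketch of the per-group accuracy comparison underlying FIG. 7; the index
# groups below are assumptions, not the real feature layout.
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

feature_groups = {
    "diameter_and_ridge": [0, 1],
    "edge_intensity_and_width": [2, 3],
    "geometric_moments": list(range(4, 20)),
    "hu_moments": list(range(20, 27)),
    "zernike_moments": list(range(27, 31)),
}

def accuracy_per_group(X, y, groups=feature_groups):
    """Average shuffle-split accuracy of SVM-linear for each feature group."""
    cv = ShuffleSplit(n_splits=10, test_size=0.25, random_state=0)
    scores = {}
    for name, cols in groups.items():
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores[name] = cross_val_score(clf, X[:, cols], y, cv=cv).mean()
    return scores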



FIG. 8 shows another feature selection strategy 232, which focuses on the experimental increase in classification accuracy of SVM-linear as more and more features 231 are included into the set of selected features. The features 231 to be included are ranked according to their LDA standardized coefficients in descending order, as described before. Features of WBC groups may overlap. In this particular experiment, the first six dominant features of each WBC group led to almost optimal classification scores; the curves saturate quickly to a plateau value, and the marginal gain in accuracy from adding further, less significant features is very small.
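The saturation experiment of FIG. 8 could be sketched as follows, assuming ranked_idx holds the feature indices sorted by decreasing LDA coefficient magnitude; the names and the shuffle-split settings are illustrative assumptions rather than the exact experimental setup.

# Sketch of the accuracy-versus-feature-count experiment of FIG. 8.
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def accuracy_vs_feature_count(X, y, ranked_idx):
    """ranked_idx: feature indices sorted by descending LDA coefficient magnitude."""
    cv = ShuffleSplit(n_splits=10, test_size=0.25, random_state=0)
    scores = []
    for k in range(1, len(ranked_idx) + 1):
        cols = list(ranked_idx[:k])  # include the k strongest features
        clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
        scores.append(cross_val_score(clf, X[:, cols], y, cv=cv).mean())
    return scores  # typically saturates after the first few dominant features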


In this example, three-part WBC differential is achieved by lens-free and label-free holographic imaging of cells suspended in a flow. Accuracies of 99% were obtained in experiments in which fluorescent labeling was used to establish a reference. Fast and fully automated cell recognition at high throughput may be achieved by fast holographic image reconstruction combined with the described advantages of feature reduction and properly designed classification algorithms.

Claims
  • 1. A cell differentiation system for differentiating cells, the system comprising: (a) an input means for receiving a reconstructed image of a cell based on a holographic image of the cell in suspension; and (b) a cell recognition means for determining cell characterization features from the reconstructed image of the cell for characterization of the cell;
  • 2. The system according to claim 1, wherein the cell recognition means is configured for determining at least the following image moments as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21.
  • 3. The system according to claim 2, wherein the cell recognition means is further configured for determining the Zernike moment Z20 as cell recognition feature, with the Zernike moments of an image being defined as:
  • 4. The system according to claim 2, wherein the cell recognition means is further configured for also determining at least the diameter as cell recognition feature.
  • 5. The system according to claim 2, wherein the cell recognition means is further configured for also determining at least the following image moments as cell recognition features: m30 and μ20.
  • 6. The system according to claim 2, wherein the cell recognition means is further configured for determining the Zernike moments Z20, Z40, Z60 and Z80 as cell recognition features, with
  • 7. The system according to claim 2, wherein the cell recognition means is further configured for also determining at least the following image moments as cell recognition features: m02, m10, m20, m21, η02 and η30.
  • 8. The system according to claim 2, wherein the cell recognition means is further configured for determining the Hu moments hu1, hu5, and hu7, wherein hu1 = η20 + η02, hu5 = (η30 − 3η12)(η30 + η12)[(η30 + η12)² − 3(η21 + η03)²] + (3η21 − η03)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²], and hu7 = (3η21 − η03)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²] − (η30 − 3η12)(η21 + η03)[3(η30 + η12)² − (η21 + η03)²], and ridge, as cell recognition features.
  • 9. The system according to claim 1, wherein the system further comprises a cell classification means for classifying the white blood cell as a T-lymphocyte, a granulocyte, or a monocyte based on the determined features of the white blood cell determined using the cell recognition means.
  • 10. The system according to claim 1, wherein the input means comprises a hologram data acquisition system for acquiring hologram data of a white blood cell in suspension.
  • 11. The system according to claim 1, wherein the input means comprises a data preprocessing system configured for removing a background from the hologram data and/or for normalizing an illumination intensity of the hologram data.
  • 12. The system according to claim 1, wherein the input means comprises an image reconstruction system.
  • 13. The system according to claim 1, the system being adapted for differentiating white blood cells into T-lymphocytes, granulocytes, and monocytes.
  • 14. A diagnostic device comprising the system according to claim 1, for analyzing cells.
  • 15. A method for differentiating cells, the method comprising: (a) receiving a reconstructed image of a cell based on a holographic image of the cell in suspension; and (b) determining cell characterization features from the reconstructed image of the cell for characterization of the cell; wherein the cell recognition features comprise image moments, the image moments being defined as: the spatial moments mij = Σx,y(f(x,y)·x^i·y^j), and the central moments μij = Σx,y(f(x,y)·(x−x̄)^i·(y−ȳ)^j), where x̄ and ȳ denote the coordinates of the image centroid.
  • 16. The method according to claim 15, wherein determining cell characterization features comprises determining the following image moments as cell recognition features: m00, m01, m11, m12, μ03, μ12, μ21, η03, η12, η20 and η21, and Z20, wherein the Zernike moments of an image are defined as:
  • 17. The method according to claim 16, wherein determining cell characterization features further comprises determining the image moments m30, μ20, and the cell diameter.
  • 18. The method according to claim 15, wherein the method further comprises, based on the determined cell recognition features, identifying whether a cell is a white blood cell.
  • 19. The method according to claim 15, wherein the method further comprises, based on the determined cell recognition features, identifying whether a white blood cell is a T-lymphocyte, a granulocyte, or a monocyte.
Priority Claims (1)
Number: 17181569.9 | Date: Jul 2017 | Country: EP | Kind: regional

PCT Information
Filing Document: PCT/EP2018/069324 | Filing Date: 7/16/2018 | Country: WO | Kind: 00