The instant invention relates to methods and installations for obtaining an image of a sample emitting a light signal from within its inside.
In the field of pharmaceutical imaging, detection of light emitted from inside an animal (often a mammal) has become an effective way of qualifying the occurrence of a phenomenon under study taking place inside the animal.
It is an object of the present invention to provide an improved system and method by which the detected light signal can be accurately associated with a region of the animal from which it is emitted.
To this aim, it is provided a method for obtaining an image of a sample having an external surface enclosing an inside, a light signal being emitted from within said inside, the method comprising:
(a) providing at least two positioning images, each comprising detection data related to the external surface of the sample,
(b) providing, for at least one time of an observation period, a light-emission image of the sample comprising data related to the light signal emitted from within said inside of the sample,
(c) on each of said positioning images, detecting an external contour of said sample and a landmark pattern integral with the sample,
(d) defining a transformation to be applied to the light-emission image from the detected landmark positions,
(e) obtaining a referenced light-emission image by applying said transformation onto said light-emission image.
By the use of landmarks integral with the animal, it becomes easier to associate the detected light signal with the part of the animal from which it is emitted.
According to another aspect, the invention relates to a corresponding imaging installation and software.
In some embodiments, one might also use one or more of the features as defined in the dependent claims.
Other characteristics and advantages of the invention appear from the following description of three embodiments thereof, given by way of non-limiting example, and with reference to the accompanying drawings.
In the drawings:
In the various figures, like references designate elements that are identical or similar.
The marking device 100 is for example an electronically controlled printing device comprising a support 101 adapted to receive the animal 2, for example previously anesthetized. A module 102 comprising a printing head 103 and an imaging camera 104 is carried at the end of an arm 105 movable with respect to the support 101 along two displacement axes X, Y in a plane parallel to the support above the animal 2. The printing head is in fluid communication with an ink tank 106 providing ink to the printing head. A computerized control unit 107 controls the displacement of the arm 105 in the X-Y plane and the emission of an ink drop at suitable locations of the animal 2. Suitable locations are for example determined by a user viewing the output of the imaging camera 104 on a display screen and selecting the locations of the ink drops.
Of course, the landmarks could be of any suitable shape, such as regularly spaced dots, lines, or any other suitable pattern. Further, the arm 105 could be made to move vertically out of the X-Y plane, for example keeping the printing-head-to-animal distance constant.
Further, it should be noted that other embodiments of marking devices are possible, provided the formed marks are made integral with the sample, i.e. will move with the sample when the sample moves.
The imaging apparatus described herein is a luminescence imaging apparatus, e.g. a bioluminescence imaging apparatus, i.e. designed to take an image of a sample 2, such as, in particular, a small laboratory animal, e.g. a mammal, emitting light from inside its body. By light is understood electromagnetic radiation having a wavelength between 300 nm and 1300 nm, and preferably between 400 nm and 900 nm.
For example, said light is generated due to a chemical reaction inside the body of the small animal. In order to obtain the chemical reaction, it is possible, for example, to use a small laboratory animal that has been genetically modified to include a gene encoding for a protein that presents the particularity of emitting light, the gene being expressed under the control of a suitable promoter upon an event.
Before placing the laboratory animal 2 in the imaging apparatus 1, or even before placing it in the marking device, the event is generated. The quantity of light given off locally is representative of the quantity of produced protein, and thus makes it possible to locally measure the level of expression of the gene.
In particular, if it is desired to check whether the gene in question is expressed particularly in response to a given event, it is possible to implement the measurement explained above firstly for a small laboratory animal 2 for which the event has been triggered, and secondly for a small laboratory animal 2 for which the event has not been triggered, in order to compare the signals emitted by the two animals.
Alternatively, the experiment in question can, for example, consist in measuring the muscular activity generated by an event in a laboratory animal, by detecting the quantity of light emitted by the coelenterazine-aequorin substrate-photoprotein pair which reacts with a given complementary chemical entity. For example, the entity in question is calcium arriving in the proximity of the photoprotein at the axons.
Since such events have a very fast time signature, it is useful to obtain information relating to the reaction rate rapidly.
According to a possible embodiment, the present method is used when imaging a moving animal. A moving animal can be either awake and running in the imaging apparatus, or still (for example anesthetized). In this latter case, the animal's movement is mainly due to breathing.
The apparatus described herein can also be used to implement a method of performing imaging by delayed luminescence or phosphorescence. During such a method, a molecule adapted to emit light by phosphorescence for a time that is sufficiently long, of the order of a few minutes, is illuminated ex-vivo in order to trigger said phosphorescence. The molecule is then introduced into a small laboratory animal and can be used as a light tracer. The concentration of the molecule in a location of the organism, e.g. because a certain reaction takes place at that location, and because the molecule in question participates in said reaction, is detectable by the apparatus described below and makes it possible to characterize the reaction in question quantitatively or qualitatively.
As shown in
Due to the above-described reaction, the small laboratory animal 2 naturally emits a first light signal that carries information relating to the luminescence of the small animal. In addition, due to the illumination generated by the light source 8, a second positioning light signal, corresponding substantially to the incident illumination from the light source 8 being reflected by the small laboratory animal 2, is also emitted in the enclosure 5. Said second light signal can also include a portion corresponding to the autofluorescence of the sample 2 due to the illumination by the light source 8.
Said first and second light signals combine to form a combined light signal arriving at the detecting device 9 shown outlined in dashed lines in
In the first embodiment shown with reference to
In the example shown, the light source 8 emits incident illumination continuously towards the stage so that the combined light signal corresponds to a spectral combination of the first light signal (carrying the luminescence information) and of the second light signal. The combined light signal is separated by a separator plate 12, which separates the signals on the basis of their wavelengths. For example, such a separator plate is a dichroic mirror or a mirror of the “hot mirror” type that separates visible from infrared. The light signal carrying the luminescence information is transmitted substantially in full towards the first detector 10, whereas the second light signal is transmitted substantially in full to the second detector 11.
In order to be sure that only the signal carrying the luminescence information reaches the first detector 10, it is also possible to dispose a filter 13 at the inlet of the first detector 10, which filter is adapted to prevent the wavelengths that do not correspond to that signal from reaching the first detector 10.
In practice, in order to be certain that the signal reaching the first detector 10 corresponds only to the luminescence from the inside of the sample 2, provision is made for the autofluorescence signal emitted by the sample under the effect of the light source 8 to present a wavelength that is different from the wavelength of the signal in question. To this end, it is possible to choose to work with a light source 8 that emits incident illumination presenting an adapted spectrum, distributed beyond the range of wavelengths emitted by luminescence. For example, it is possible to use infrared illumination centered on a wavelength substantially equal to 800 nanometers (nm) when the luminescence spectrum presents a longest wavelength of 700 nm or shorter.
Other variations are possible, where the illumination is synchronized with the acquisition of the light-emission images by periodically shuttering the light-emission detecting camera.
As shown in
In similar manner, at the start of each time frame, the signal generated by the first detector 10 is stored in a first memory 20, as are the co-ordinates relating to each pixel. A processor unit 15 is adapted to read the data stored in the first and second memories 20, 21, so as to store it and/or to display the corresponding images on the display 4.
However, it may be preferable not to read the data measured at the first detector 10 for each time frame, but rather once every n time frames, where n is greater than 1, in order to allow the light-emission signal to accumulate and thus improve the signal-to-noise ratio.
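By way of non-limiting illustration, the accumulation over n time frames can be sketched as follows (Python with NumPy; the function name and frame representation are illustrative assumptions, not part of the claimed method):

```python
import numpy as np

def accumulate_frames(frames, n):
    """Sum consecutive light-emission frames in groups of n.

    `frames` is a sequence of 2-D arrays, one per time frame; reading the
    first detector once every n time frames is modelled by summing each
    group of n frames into a single accumulated image.
    """
    frames = [np.asarray(f, dtype=float) for f in frames]
    groups = [frames[i:i + n] for i in range(0, len(frames), n)]
    return [np.sum(g, axis=0) for g in groups]
```

Since shot noise grows roughly as the square root of the signal, summing n frames improves the signal-to-noise ratio by a factor of about the square root of n.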
Once the five images coming from the second detector 11 for the five instants t1, t2, t3, t4 and t5, and the five images coming from the first detector 10 for these instants have all been recorded, the processor unit 15 can, on the basis of the five photographic positioning representations delivered by the second detector 11, express, in a frame of reference attached to the sample at a reference time, the light-emission representations from inside the sample. For example, t3 is set as the reference time, and the displacement field T1-3 to which the sample 2 has been subjected between t1 and t3 is extracted from the photographic representations delivered by the second detector 11 for t1 and t3. This displacement field T1-3 is then applied to the light-emission image obtained from the first detector 10 for time t1, said processing providing, from the light-emission image of t1, a light-emission image for t1 expressed in the sample frame of reference at t3. It should be mentioned that T1-3 could be expressed as the composition T2-3 ∘ T1-2, where T2-3 is the field of displacement to which the sample has been subjected between t2 and t3 and where T1-2 is the field of displacement to which the sample has been subjected between t1 and t2.
A similar processing is performed for the images obtained at t2, t4 and t5, whereby the fields of displacement T2-3, T4-3 and T5-3 are determined. By applying these fields of displacement to the respective detected light-emission representations at t2, t4 and t5, one obtains five light-emission images, each expressed in the sample frame of reference at t3. These five images are summed as shown on the bottom of
Then, a similar process can be performed at t4 taking into account images from t2, t3, t4, t5, and t6 (not shown, detected after t5). Fields of displacement T2-4, T3-4, T5-4 and T6-4 are used. T2-4 is expressed as T3-4 ∘ T2-3, and T6-4 as T5-4 ∘ T6-5. Among these, T2-3, T3-4 and T5-4 are known from the previous calculation and need not be re-calculated. In particular, T3-4 = T4-3^-1, the inverse of T4-3.
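The bookkeeping of composition and inversion of displacement fields can be sketched as follows, by way of non-limiting example. The sketch restricts the fields to 2-D affine transformations in homogeneous coordinates (an assumption made for brevity; the method above also covers non-rigid fields), so that composition becomes a matrix product and inversion a matrix inverse:

```python
import numpy as np

# Displacement fields modelled as 3x3 homogeneous 2-D affine matrices
# (an illustrative simplification; the full method allows non-rigid fields).
def compose(t_bc, t_ab):
    """Return T_ac = T_bc o T_ab (T_ab is applied first, then T_bc)."""
    return t_bc @ t_ab

def invert(t_ab):
    """Return T_ba, e.g. T3-4 as the inverse of T4-3."""
    return np.linalg.inv(t_ab)

# Fields between consecutive times are cached; any Ti-j is then derived
# by composition without re-running the registration, e.g. T1-3 = T2-3 o T1-2.
cache = {
    (1, 2): np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 0.5], [0.0, 0.0, 1.0]]),
    (2, 3): np.array([[1.0, 0.0, -1.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]),
}
t13 = compose(cache[(2, 3)], cache[(1, 2)])  # net translation (1.0, 0.5)
```

This reuse of already-computed fields is what makes it unnecessary to re-calculate T2-3, T3-4 and T5-4 when moving from reference time t3 to reference time t4.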
Hence, the geographical location of the landmarks, in the x-y frame of reference of the detector 11, is memorized in the memory of the computerized unit for the time t1.
The same image treatment is performed for the image obtained for the sample at time t3, so that the geographical locations, in the x-y frame of reference, of the landmarks at time t3, M1,3, M2,3, . . . , Mn,3, are also stored in this memory.
It should be noted that all the detected landmarks are, in these images, enclosed by the external contour of the animal for each time.
First of all, a rigid transformation between t1 and t3 is estimated. This rigid transformation can be roughly estimated from the displacement of the barycentre of the detected outlines between t1 and t3.
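By way of non-limiting illustration, the barycentre-based rough estimate can be sketched as follows (the outlines are assumed to be given as lists of 2-D points; the function name is illustrative):

```python
import numpy as np

def rough_rigid_shift(outline_t1, outline_t3):
    """Rough rigid estimate: translation of the outline barycentre
    between t1 and t3 (the rotation is refined at a later stage)."""
    c1 = np.mean(np.asarray(outline_t1, dtype=float), axis=0)
    c3 = np.mean(np.asarray(outline_t3, dtype=float), axis=0)
    return c3 - c1
```

This coarse translation serves as the starting point for the finer registration described below, reducing the risk of the optimization converging to a wrong local minimum.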
The obtained geographical locations M1,1, M2,1, . . . , Mn,1 at time t1 are represented by crosses on the left side of
A field of displacement T1-3 suitable for making the contour and/or points obtained for t1 and the contour and/or points obtained for t3 coincide is calculated.
For example, the field of displacement to be calculated is composed of a rigid displacement (global rotation), and of a global deformation which can for example be expressed as the combination of a plurality of Eigen deformation modes. An example of a method for determining the field of deformation comprises defining a similarity criterion between the image at t3 and a virtual image based on the image at t1 to which a candidate transformation has been applied. When a predefined threshold is reached, the parameters of the current candidate transformation are memorized.
For example, the similarity criterion (or energy) is made up of a similarity criterion on the outlines (for example based on the distance maps of the shape) and of a similarity criterion on the landmarks (for example using a nearest-neighbour algorithm). An optical-flow term representing the grey levels of the images can be added to the energy criterion. The parameters of the transformation which minimize the energy criterion are determined, for example by a gradient descent method. The transformation can be parameterized in any known way, such as a linear matrix, a thin plate spline model, a free-form deformation function or the like.
On the right side of
The calculated field of deformation T1-3 is applied onto the light emission image obtained for time t1 in order to obtain a light-emission image corresponding to light emitted during t1, expressed in the frame of reference of the sample at time t3 (so-called “referenced light-emission image”).
The process of
In the above example, the sampling times of the light-emission images and of the positioning images were the same. However, in other embodiments, it is contemplated that one does not necessarily have a light-emission image for each positioning image and/or that the positioning and light-emission images are not necessarily exactly simultaneous. For example, each light-emission image could be acquired in between two positioning images. Suitable interpolations of the calculated field of displacement can then be used in order to obtain a result similar to the one described above.
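By way of non-limiting illustration, such an interpolation can be sketched as a linear blend of the displacement fields bracketing the light-emission time (an assumption valid for small inter-frame motion; the fields are represented here as dense NumPy arrays):

```python
import numpy as np

def interpolate_field(field_a, field_b, alpha):
    """Linearly interpolate two dense displacement fields.

    If a light-emission image is acquired between the positioning images
    at times a and b, a field for the intermediate time is approximated
    as (1 - alpha) * T_a + alpha * T_b, with alpha in [0, 1] giving the
    fractional position of the light-emission time between a and b.
    """
    return (1.0 - alpha) * np.asarray(field_a) + alpha * np.asarray(field_b)
```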
Further, the time of reference at which the referenced light emission image is expressed does not necessarily correspond to a time at which a positioning image is detected. For example, the invention could be implemented from four imaging times of an observation period, or any other suitable number of times of an observation period.
In a second embodiment, as shown on
As shown on
As explained above in relation to
The three-dimensional position, in the frame of reference U, V, W of the enclosure, of each of the points Mi of the animal's surface at time t1 is calculated from the detected bi-dimensional positions on both images obtained respectively from both detectors. Knowing the geographical positions of the cameras in the enclosure, the three-dimensional coordinates of the points can be stereoscopically determined from the offset, between the two images, of the points on the two images, for example by applying one of the methods described in "Structure from stereo—a review", Dhond et al., IEEE Transactions on Systems, Man and Cybernetics, November/December 1989, Volume 19, Issue 6, pp 1489-1510. This calculation makes it possible to obtain, roughly, the three-dimensional outer surface of the animal as shown on
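A standard structure-from-stereo step of this kind is linear (DLT) triangulation, sketched below by way of non-limiting example. It assumes the two cameras are calibrated, i.e. their 3x4 projection matrices P1 and P2 in the U, V, W frame are known:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one surface point Mi.

    x1 and x2 are the pixel coordinates of the same point in the two
    views; P1 and P2 are the 3x4 camera projection matrices. The point
    is recovered as the null vector of the stacked constraint matrix.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # 3-D point in the U, V, W frame
```

Applying this to every detected landmark and contour point yields the rough three-dimensional outer surface of the animal mentioned above.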
If the 3D surface of the animal is projected into a plane, the field of displacement between the image obtained by the camera 10A and the projected 3D image can be calculated as described above. This field of displacement can then be applied to the light-emission image (for example obtained along the same line of sight as the one of the camera 10A) in order to express the light emission image in an undistorted frame of reference. Further, the light emission signal as calculated according to the first embodiment and as shown on
The resulting three-dimensional surface representation of the animal and the three-dimensional surface light-emission image can be displayed superimposed.
The above-mentioned stereoscopy calculation could be performed for each time of the observation period, or for one time only, for example if one does not wish to take into account the displacement of the animal during the imaging period.
It should be noted that the field of deformation for one of the cameras between a first time and a reference time of the observation period could be calculated as described above with relation to
In another variation, it should be noted that a three-dimensional field of displacement as obtained in relation to
It should be mentioned that more than two positioning cameras could be used, with different angles of sight, in order to obtain the 3D surface positioning representation of the mammal.
As shown on
Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/IB08/52202 | 3/13/2008 | WO | 00 | 9/13/2010