Gating with anatomically varying durations

Information

  • Patent Grant
  • Patent Number
    10,964,075
  • Date Filed
    Sunday, October 16, 2016
  • Date Issued
    Tuesday, March 30, 2021
Abstract
A method for reconstructing a radioactive emission image of an overall volume having first and second volumetric regions, each volumetric region having respectively independent dynamic characteristics. The method comprises the following steps: a) obtaining radioactive emissions from the overall volume, including the volumetric regions, b) reconstructing an initial radioactive emission image of the volumetric region according to the radioactive emissions, c) segmenting the initial radioactive emission image to delineate the first and second volumetric regions, and d) separately reconstructing the first and the second volumetric regions according to the respectively independent dynamic characteristics.
Description
FIELD AND BACKGROUND OF THE INVENTION

The present invention relates to a method and an apparatus for image reconstruction in nuclear medicine imaging and, more particularly, but not exclusively to image reconstruction in nuclear medicine imaging using gating techniques.


Radionuclide imaging aims at obtaining an image of a radioactively labeled substance, that is, a radiopharmaceutical, within the body, following administration, generally by injection. The substance is chosen so as to be picked up by active pathologies to a different extent from the amount picked up by the surrounding, healthy tissue; in consequence, the pathologies are operative as radioactive-emission sources and may be detected by radioactive-emission imaging. Pathology may appear as a concentrated source of high radiation, that is, a hot region, as may be associated with a tumor, or as a region of low-level radiation, which is nonetheless above the background level, as may be associated with carcinoma.


A reversed situation is similarly possible. Dead tissue has practically no uptake of radiopharmaceuticals, and is thus operative as a cold region.


The mechanism of localization of a radiopharmaceutical in a particular organ of interest depends on various processes in the organ of interest such as antigen-antibody reactions, physical trapping of particles, receptor site binding, removal of intentionally damaged cells from circulation, and transport of a chemical species across a cell membrane and into the cell by a normally operative metabolic process. A summary of the mechanisms of localization by radiopharmaceuticals is found in http://www(dot)lunis(dot)luc(dot)edu/nucmed/tutorial/radpharm/i(dot)htm.


The particular choice of a radionuclide for labeling antibodies depends upon the chemistry of the labeling procedure and the isotope nuclear properties, such as the number of gamma rays emitted, their respective energies, the emission of other particles such as beta or positrons, the isotope half-life, and the decay scheme.


In PET imaging, positron-emitting radioisotopes are used for labeling, and the imaging camera detects coincidence photons, the gamma pair of 0.511 MeV, traveling in opposite directions. Each one of the coincident detections defines a line of sight, along which annihilation takes place. As such, PET imaging collects emission events which occurred in an imaginary tubular section enclosed by the PET detectors. A gold standard for PET imaging is PET NH3 rest myocardial perfusion imaging with N-13-ammonia (NH3), at a dose level of 740 MBq, with attenuation correction. Yet, since the annihilation gamma is of 0.511 MeV regardless of the radioisotope, PET imaging does not provide spectral information and does not differentiate between radioisotopes.


In SPECT imaging, primarily gamma emitting radioisotopes are used for labeling, and the imaging camera is designed to detect the actual gamma emission, generally in an energy range of approximately 11-511 keV. Generally, each detecting unit, which represents a single image pixel, has a collimator that defines the solid angle from which radioactive emission events may be detected.


Because PET imaging collects emission events in the imaginary tubular section enclosed by the PET detectors, while SPECT imaging is limited to the solid collection angles defined by the collimators, PET imaging generally has a higher sensitivity and spatial resolution than does SPECT. Therefore, the gold standard for spatial and time resolutions in nuclear imaging is defined for PET. For example, there is a gold standard for PET imaging for at-rest myocardial perfusion with N-13-ammonia (NH3), at a dose of 740 MBq with attenuation correction.


Conventional SPECT cameras generally employ an Anger camera, in which a single-pixel scintillation detector, such as NaI(Tl), LSO, GSO, CsI, CaF, or the like, is associated with a plurality of photomultipliers. Dedicated algorithms provide a two dimensional image of the scintillations in the single pixel scintillation detector. There are several disadvantages to this system, for example:


1. The dedicated algorithms associated with the single-pixel detector cannot reach the accuracy of a two-dimensional image produced by a plurality of single-pixel detectors;


2. The single-pixel detector is a rigid unit, which does not have the flexibility of motion of a plurality of small detectors, each with independent motion; and


3. A single hot spot may cause the single pixel detector of the Anger camera to saturate, whereas when a plurality of single pixel detectors is employed, saturation is localized to a few pixels and does not affect the whole image.


Other SPECT cameras which employ a plurality of single pixel detectors are also known.


U.S. Pat. No. 6,628,984, to Weinberg, issued on Sep. 30, 2003 and entitled, “Handheld camera with tomographic capability,” describes a tomographic imaging system, which includes a moveable detector or detectors capable of detecting gamma radiation; one or more position sensors for determining the position and angulation of the detector(s) in relation to a gamma ray emitting source; and a computational device for integrating the position and angulation of the detector(s) with information as to the energy and distribution of gamma rays detected by the detector and deriving a three dimensional representation of the source based on the integration. A method of imaging a radiation emitting lesion located in a volumetric region of interest also is disclosed.


U.S. Pat. No. 6,242,743, to DeVito, et al., issued on Jun. 5, 2001 and entitled, “Non-orbiting tomographic imaging system,” describes a tomographic imaging system which images ionizing radiation such as gamma rays or x rays and which: 1) can produce tomographic images without requiring an orbiting motion of the detector(s) or collimator(s) around the object of interest, 2) produces smaller tomographic systems with enhanced system mobility, and 3) is capable of observing the object of interest from sufficiently many directions to allow multiple time-sequenced tomographic images to be produced. The system consists of a plurality of detector modules which are distributed about or around the object of interest and which fully or partially encircle it. The detector modules are positioned close to the object of interest thereby improving spatial resolution and image quality. The plurality of detectors view a portion of the patient or object of interest simultaneously from a plurality of positions. These attributes are achieved by configuring small modular radiation detector with high-resolution collimators in a combination of application-specific acquisition geometries and non-orbital detector module motion sequences composed of tilting, swiveling and translating motions, and combinations of such motions. Various kinds of module geometry and module or collimator motion sequences are possible, and several combinations of such geometry and motion are shown. The geometric configurations may be fixed or variable during the acquisition or between acquisition intervals. Clinical applications of various embodiments of U.S. Pat. No. 6,242,743 include imaging of the human heart, breast, brain or limbs, or small animals. Methods of using the non-orbiting tomographic imaging system are also included.


U.S. Pat. No. 5,939,724, to Eisen, et al., issued on Aug. 17, 1999, and entitled, “Light weight-camera head and—camera assemblies containing it,” describes a lightweight gamma-camera head, assemblies, and kits that embody it. The gamma-camera head has a detector assembly which includes an array of room temperature, solid state spectroscopy grade detectors each associated with a collimator and preamplifier, which detectors and associated collimators and preamplifiers are arranged in parallel rows extending in a first direction and suitably spaced from each other in a second direction normal to the first direction, each of the parallel detector rows holding a plurality of detectors. The head may optionally have an electric motor for moving the detector in the second direction and optionally also in the first direction, either stepwise or continuously.


U.S. Pat. No. 6,525,320, to Juni, issued on Feb. 25, 2003, and entitled, “Single photon emission computed tomography system,” describes a single photon emission computed tomography system, which produces multiple tomographic images of the type representing a three-dimensional distribution of a photon-emitting radioisotope. The system has a base including a patient support for supporting a patient such that a portion of the patient is located in a field of view. A longitudinal axis is defined through the field of view. A detector module is adjacent the field of view and includes a photon-responsive detector. The detector is an elongated strip with a central axis that is generally parallel to the longitudinal axis. The detector is operable to detect if a photon strikes the detector. The detector can also determine a position along the length of the strip where a photon is detected. A photon-blocking member is positioned between the field of view and the detector. The blocking member has an aperture slot for passage of photons aligned with the aperture slot. The slot is generally parallel to the longitudinal axis. A line of response is defined from the detector through the aperture. A displacement device moves either the detector module or the photon-blocking member relative to the other so that the aperture is displaced relative to the detector and the line of response is swept across at least a portion of the field of view.


U.S. Pat. No. 6,271,525, to Majewski, et al., issued on Aug. 7, 2001, and entitled, “Mini gamma camera, camera system and method of use,” describes a gamma camera, which comprises essentially and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described a system wherein the output supplied by the high resolution, position sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.


U.S. Pat. No. 6,271,524, to Wainer, et al., issued on Aug. 7, 2001 and entitled, “Gamma ray collimator,” describes a gamma ray collimator assembly comprising collimators of different gamma ray acceptance angles. For example, the acceptance angle of a first collimator may be between 0.2 and 5 degrees, and the acceptance angle of a second collimator may be between about 5 and 30 degrees.


U.S. Pat. No. 6,212,423, to Krakovitz, issued on Apr. 3, 2001 and entitled, “Diagnostic hybrid probes,” describes a hybrid nuclear and ultrasonic probe, comprising a cylindrical outer casing surrounding a nuclear probe, which comprises two scintillator plates intersecting perpendicularly, each of the scintillator plates having a plurality of parallel collimators; and an ultrasonic probe situated between said casing at the intersection of said scintillator plates.


List mode data acquisition is known in PET studies, and enables the determination of coincidence. It relates to recording every radiation event together with data pertinent to that event, which includes:


i. the time the radiation event impinged upon a detector pixel, with respect to a clock, with respect to a time bin, or with respect to another time definition, for example, a time interval between two clock signals; and


ii. the detector pixel location with respect to a coordinate system, at the time of the impinging.


The knowledge of time and location enables the determination of coincidence counts, namely photon counts that arrive substantially simultaneously, 180 degrees apart.


The time and location data may be stamped onto the radiation-event data packet, for example, as a header or as a footer, or otherwise associated with the radiation-event data packet, as known.
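By way of illustration only, the following Python sketch shows how coincidence counts could be extracted from time-stamped list-mode data of the kind described above; the event layout, the 10 ns window, and the function name are assumptions made for the example, not details taken from this disclosure.

```python
# Illustrative sketch: pairing list-mode events into coincidence counts.
# Each event is assumed to be a (time_ns, detector_id, position) tuple, sorted by time;
# the 10 ns window is an arbitrary example value.

def find_coincidences(events, window_ns=10.0):
    """Return lines of response for events detected substantially simultaneously."""
    coincidences = []
    i = 0
    while i < len(events) - 1:
        t0, det0, pos0 = events[i]
        t1, det1, pos1 = events[i + 1]
        if (t1 - t0) <= window_ns and det0 != det1:
            # The two detections define a line of response between pos0 and pos1.
            coincidences.append((pos0, pos1))
            i += 2  # both events of the pair are consumed
        else:
            i += 1
    return coincidences
```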


The time-stamped data available in PET studies may further be used for perfusion studies, where the timing of physiological processes of short durations, that is, durations shorter than about half the time span between heartbeats, is important. Perfusion studies usually involve a sequence of continuous acquisitions, each of which may represent data acquisition duration of about 10-30 seconds, although longer durations are sometimes employed. Data from each of the frames is independently reconstructed to form a set of images that can be visualized and used to estimate physiological parameters. This approach involves selection of the set of acquisition times, where one must choose between collecting longer scans with good counting statistics but poor temporal resolution, or shorter scans that are noisy but preserve temporal resolution.


US Patent Application 2003010539, to Tumer, et al., published on Jun. 5, 2003, and entitled, “X-ray and gamma ray detector readout system,” describes a readout electronics scheme, under development for high resolution, compact PET (positron emission tomography) imagers, using time tagging, based on LSO (lutetium ortho-oxysilicate, Lu2SiO5) scintillator and avalanche photodiode (APD) arrays.


There is some work relating to timing data in SPECT systems, employing Anger cameras.


U.S. Pat. No. 5,722,405, to Goldberg, issued on Mar. 3, 1998, and entitled, “Method and apparatus for acquisition and processing of event data in semi list mode,” describes a system for acquisition, processing and display of gated SPECT imaging data for use in diagnosing Coronary Artery Disease (CAD) in nuclear medicine, employing an Anger camera, and provides a physician with two parameters for evaluating CAD: information relating to the distribution of blood flow within the myocardium (perfusion) and information relating to myocardium wall motion (function). One aspect provides the physician with a display of functional images representing quantitative information relating to both perfusion and function with respect to selected regions of interest of the subject heart at end-diastole and end-systole segments of the cardiac cycle. The functional display consists of arcs of varied width depending on wall motion and color coded to illustrate degrees of myocardial perfusion for different pie shaped sections of a selected region of interest within a given short axis slice of reconstructed volumetric region data. Another aspect provides a series of display images allowing facilitated access, display, and comparison of the numerous image frames of the heart that may be collected during gated SPECT sessions. U.S. Pat. No. 5,722,405 also teaches the ability to define and recall parameter files representative of data acquisition and processing parameters and protocol for use in gated SPECT studies and includes a semi-list processing mode to increase efficiency of data acquisition within a camera computer system.


U.S. Pat. No. 7,026,623, to Oaknin, et al., issued on Apr. 11, 2006, and entitled, “Efficient single photon emission imaging,” describes a method of diagnostic imaging in a shortened acquisition time for obtaining a reconstructed diagnostic image of a portion of a body of a human patient who has been administered with dosage of radiopharmaceutical substance radiating gamma rays, using SPECT and an Anger camera. The method comprises acquiring photons emitted from said portion of the body, by means of a detector capable of converting the photons into electric signals, wherein the total time of photon acquiring is substantially shorter than the clinically acceptable acquisition time; processing said electric signals by a position logic circuitry and thereby deriving data indicative of positions on said photon detector crystal, where the photons have impinged the detector; and reconstructing an image of a spatial distribution of the pharmaceutical substance within the portion of the body by iteratively processing said data. For example, the method includes effective acquisition time of less than 10 minutes, or less than 8 minutes, and acquiring photons in a list-mode procedure.


Current techniques record data with SPECT and electrocardiogram (ECG), and perform some gating to the data which is captured by the SPECT detectors, to incorporate the global and regional atrial and ventricular function and assessment of the relationship of perfusion to regional function.


Gated images are used to overcome distortions, such as motion artifacts, which are caused by motion of the heart during image acquisition. Gated images are needed because the physical model used for reconstruction assumes that the imaged objects are static. In gated imaging, photon counting takes into account the portion of the heart contraction cycle within which a photon is measured. Gating enables the reconstruction of an anatomical structure which is subject to periodic motion, by enabling image acquisition only when the structure has reached the same configuration. Cardiac gating is usually synchronized to the recorded electrocardiogram (ECG) signal, which indicates the current configuration of the heart. The period between a certain repetitive wave, such as the R-wave, and the subsequent wave is divided into several time segments, called “frames”, which are usually spaced evenly. Each photon which is detected during one of the frames is collected and associated with the related frame.


In gated imaging, each frame generates a single dataset. The collection of all the datasets belonging to all the frames is defined as a “dynamic” dataset.


The dynamic dataset is created by dividing the time span from one R-wave to the next R-wave into M frames that usually have an identical duration. Each detected photon is accumulated into the dataset of one of the M frames. Each dataset of the M datasets contains data relevant to a defined portion (“snapshot”) of the cardiac cycle.
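By way of illustration only, the following Python sketch shows the binning just described: the R-R interval is divided into M evenly spaced frames and each detected photon is accumulated into the dataset of the frame in which it falls. The event layout and function names are assumptions made for the example.

```python
# Illustrative sketch: dividing the R-R interval into M frames and accumulating
# each detected photon into the dataset of its frame. Event and variable names
# are assumed for the example.

import bisect

def bin_events_by_cardiac_phase(events, r_wave_times, M):
    """events: list of (time, detector_pixel); r_wave_times: sorted R-wave timestamps."""
    datasets = [[] for _ in range(M)]
    for t, pixel in events:
        k = bisect.bisect_right(r_wave_times, t) - 1
        if k < 0 or k + 1 >= len(r_wave_times):
            continue  # photon falls outside a complete R-R interval
        rr = r_wave_times[k + 1] - r_wave_times[k]
        phase = (t - r_wave_times[k]) / rr        # 0 <= phase < 1 within the cycle
        frame = min(int(phase * M), M - 1)        # M evenly spaced frames
        datasets[frame].append((t, pixel))
    return datasets
```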


Usually, during the image reconstruction process, each one of the gated datasets of the M frames is processed independently by a suitable reconstruction algorithm; see Leahy R. et al., Computed tomography, in: Handbook of Image and Video Processing, Bovik A. (ed.), Academic Press, 2000, pp. 771-787; J. Kay, The EM algorithm in medical imaging, Stat. Meth. Med. Res., 6(1):55-75, January 1997; J. A. Fessler, Statistical image reconstruction methods for transmission tomography, Handbook of Medical Imaging, Volume 2, pages 1-70, SPIE, Bellingham, 2000; R. M. Leahy et al., Statistical approaches in quantitative positron emission tomography, Statistics and Computing, 10(2):147-65, April 2000; M. Defrise, A short reader's guide to 3D tomographic reconstruction, Computerized Medical Imaging and Graphics, 25(2):113-6, March 2001; S. Vandenberghe, Y. D'Asseler, et al., Iterative reconstruction algorithms in nuclear medicine, Computerized Medical Imaging and Graphics, 25(2):105-11, March 2001; G. L. Zeng, Image reconstruction, a tutorial, Computerized Medical Imaging and Graphics, 25(2):97-103, March 2001; and R. M. Lewitt et al., Overview of methods for image reconstruction from projections in emission computed tomography, Proc. IEEE, 91(9):1588-611, October 2003, which are incorporated herein by reference in their entirety.


A common practice in gated SPECT reconstruction is to divide the gated dynamic dataset into M ‘non-gated’ datasets. Each one of the datasets includes data from a single frame i. The reconstruction of each frame is performed independently using the relevant dataset.


In particular, once the emission data is obtained, the data is processed to reconstruct the intensity distribution within the measured volumetric region. The reconstruction process is generally complex, due to the large quantity of data that must be processed in order to obtain an accurate reconstruction. The following prior art statistical model may be used to perform reconstruction.


We assume an intensity distribution, I, defined over an input overall volume U, where U denotes a set of basic elements, such as pixels in two dimensions and voxels in three dimensions, and I(u) is the intensity of a given basic element u∈U. A detecting unit positioned on a radiation-emission-measuring probe, such as a PET detector or the like, takes a series of measurements y=(yt), t=1, . . . , T, from different positions and orientations around the overall volume U. The geometrical and physical properties of the detecting unit, together with its position and orientation in a given measurement t, determine the detection probability ϕt(u) of a photon emitted from location u in time t. Thus, the effective intensity of location u, as viewed by the detecting unit during measurement t, is ϕt(u)I(u).


The random count Xt(u) of photons that are emitted from location u and detected in measurement t is modeled by a Poisson process with mean ϕt(u)I(u). The total count of photons detected in measurement t is Yt=Σu∈U Xt(u), and the reconstruction problem is to reconstruct the intensities (I(u))u∈U from the measurements (yt), t=1, . . . , T.
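By way of illustration only, the following Python sketch (using NumPy) draws counts from the generative model just described, with Xt(u) distributed as Poisson(ϕt(u)I(u)) and yt obtained as the sum over u; the array names are assumptions made for the example.

```python
# Illustrative sketch of the generative model: X_t(u) ~ Poisson(phi[t, u] * I[u]),
# and the observed total y_t is the sum over u. phi is a (T, |U|) matrix of
# detection probabilities; intensity is the vector (I(u)).

import numpy as np

def simulate_counts(phi, intensity, rng=None):
    rng = rng or np.random.default_rng()
    x = rng.poisson(phi * intensity[None, :])   # per-voxel counts X_t(u), shape (T, |U|)
    return x.sum(axis=1)                        # observed totals y_t, shape (T,)
```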


The 2-D Radon transform is a mathematical relationship that may be used for reconstructing the emission intensities of the overall volume U when the set of measurements (yt), t=1, . . . , T, is unconstrained. The Radon transform is not statistical and does not take into account the Poissonian nature of the counts. In addition, it models the views as line projections. The Radon transform maps the spatial domain (x,y) to the Radon domain (p,ϕ). For a fixed projection angle, the Radon transform is simply a projection of the object. A technique known in the art as filtered back-projection (FBP) uses a back-projection operator and the inverse of the Radon transform to reconstruct the intensity distribution in the overall volume U from the measurements.


The basic, idealized problem solved by the FBP approach is to reconstruct an image from its Radon transform. The Radon transform, when properly defined, has a well-defined inverse. However, in order to invert the transform one needs measured data spanning 180°. In many medical imaging situations, the positioning of the detecting unit relative to the emitting object is constrained, so that complete measured data is not available. Reconstruction based on filtered back-projection is therefore of limited use for medical imaging. Maximum likelihood (ML) and Maximum A Posteriori (MAP) estimation methods, which address the statistical nature of the counts, have been found to provide better image reconstructions than FBP.


Limited-angle tomography is a reconstruction technique in the related art which reconstructs an image from projections acquired over a limited range of angular directions. The success of the reconstruction process depends upon the extent of the angular range acquired compared with the angular range of the missing projections. Any reconstruction from a limited range of projections potentially results in spatial distortions (artifacts) in the image. Limited angle techniques can be applied for both the Radon transform and the statistical models, but better results are generally achieved within the statistical framework. While it is known that the severity of the artifacts increases with the increasing angular range of the missing projections, limited-angle tomography does not provide information on which projections should be used in order to most effectively reconstruct the image.


ML estimation is a widely used method in the related art for reconstructing an image from a constrained set of measurements. A parameterization of the generative model described above is obtained by assigning an intensity I(u) to every voxel in U. The likelihood of the observed data y=(yt)t, given the set of parameters I={I(u):u∈U} is:














L(y|I) = ln P(y|I) = ln Πt P(yt|I) = Σt ln P(Σu xt(u) | I)
= Σt ln Poisson(yt | Σu ϕt(u)I(u))
= Σt { -Σu ϕt(u)I(u) + yt ln(Σu ϕt(u)I(u)) - ln(yt!) }   (1)







Note that the lower and upper bounds of an indexing variable, such as the voxel u and the time index t, are omitted in the following description when they are clear from the context.
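By way of illustration only, the following Python sketch evaluates the log-likelihood of Equation 1 for a given intensity vector; the ln(yt!) term is computed with scipy's gammaln, and the array names are assumptions made for the example.

```python
# Illustrative sketch: numerical evaluation of the log-likelihood of Equation 1.
# phi: (T, |U|) detection-probability matrix; intensity: (|U|,); y: (T,) counts.

import numpy as np
from scipy.special import gammaln  # ln(y!) = gammaln(y + 1)

def log_likelihood(y, phi, intensity):
    lam = phi @ intensity                                   # lambda_t = sum_u phi_t(u) I(u)
    return np.sum(-lam + y * np.log(lam) - gammaln(y + 1.0))
```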


There is currently no analytic way to solve Eqn. 1 for the maximum of the likelihood function. However, optimization methods that find local maxima of the likelihood are known. One such method is the Expectation-Maximization (EM) process.


Since the data generated by the model is only partially observable by our measurements, a basic ingredient of the EM formalism is to define a set of random variables that completely defines the data generated by the model. In the current case, since Yt=Σu Xt(u), the set of variables {Xt(u): u∈U; t=1, . . . , T} is such a set; the generated data is x=(xt)t, where xt=(xt(u))u, and the observed data y is completely determined by x. The main tool in the EM formalism is the complete data likelihood:














ln P(x|I) = ln Πt P(xt|I) = Σt ln Πu Poisson(xt(u) | ϕt(u)I(u))
= Σt Σu { -ϕt(u)I(u) + xt(u) ln(ϕt(u)I(u)) - ln(xt(u)!) }   (2)







Since the likelihood depends on the complete data, which is only partially observable, we take its expectation with respect to the distribution of the unobserved data, given the current set of hypothesized parameters (i.e., the current estimator). The result is a function Q(I|I′) which assigns a likelihood to each set I of model parameters, given the current set I′ and the observed data y:














Q(I|I′) = E[ln P(x|I) | y; I′]
= Σt Σu { -ϕt(u)I(u) + E[xt(u) | yt; I′] ln(ϕt(u)I(u)) + C }   (3)







where C is a term which is independent of the intensities I. The function Q(I|I′) is maximized by the following new estimates:












I(u) = (1 / Σt ϕt(u)) Σt E[xt(u) | yt; I′] ,   for every u∈U.   (4)







The expectation in Equation 4 is obtained as follows:














P(Xt(u)=xt(u) | yt; I) = P(Yt=yt | xt(u); I) · P(Xt(u)=xt(u) | I) / P(Yt=yt | I)
= Poisson(yt - xt(u) | Σv≠u ϕt(v)I(v)) · Poisson(xt(u) | ϕt(u)I(u)) / Poisson(yt | Σv ϕt(v)I(v))
= Binomial(xt(u) | ϕt(u)I(u) / Σv ϕt(v)I(v) ; yt)   (5)







It follows that








E[xt(u) | yt; I′] = yt ϕt(u)I′(u) / Σv ϕt(v)I′(v),





and hence the EM iteration is:










I(u) = (1 / Σt ϕt(u)) Σt yt ϕt(u)I′(u) / Σv ϕt(v)I′(v)   (6)







It is provable that each EM iteration improves the likelihood. Thus, given a random starting estimator, the EM algorithm iterates the above improvement step until it converges to a local maximum of the likelihood. Several random starts increase the chance of finding a globally good estimator.
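By way of illustration only, the following Python sketch implements the EM iteration of Equation 6 and repeats it from a uniform starting estimator; the function names and fixed iteration count are assumptions made for the example.

```python
# Illustrative sketch of the EM iteration of Equation 6.
# phi: (T, |U|) detection-probability matrix; y: (T,) measured counts.

import numpy as np

def em_iteration(y, phi, current):
    """Map the current estimate I' to the new estimate I."""
    projections = phi @ current            # sum_v phi_t(v) I'(v), one value per measurement t
    backproj = phi.T @ (y / projections)   # sum_t phi_t(u) y_t / projections_t
    sensitivity = phi.sum(axis=0)          # sum_t phi_t(u)
    return current * backproj / sensitivity

def reconstruct_ml(y, phi, n_iter=50):
    estimate = np.ones(phi.shape[1])       # uniform (or random) starting estimator
    for _ in range(n_iter):
        estimate = em_iteration(y, phi, estimate)
    return estimate
```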


It is usually desired to maximize the expected posterior probability (given a proper prior) rather than the expected likelihood. In that case we assume a prior probability on the intensities P(I)=ΠuP(I(u)). A proper conjugate prior for the Poisson distribution is the Gamma distribution:










P(I(u)) = Gamma(I(u) | αu; βu) = (βu^(αu+1) / Γ(αu+1)) I(u)^(αu) e^(-βu I(u))   (7)







Now the maximization is done on Q(I|I′)=E[ln P(x|I)P(I) | y; I′]. Plugging the Gamma prior into Q, and solving for I(u), we get the following EM iteration for the maximum posterior estimation:










I(u) = (αu + Σt E[xt(u) | yt; I′]) / (βu + Σt ϕt(u))   (8)
= (1 / (βu + Σt ϕt(u))) [ αu + Σt yt ϕt(u)I′(u) / Σv ϕt(v)I′(v) ]   (9)









The EM update step can be formulated in matrix notation as follows. Let Φ be the matrix of the projections [ϕt(u)]t,u, and let I, I′, y, α, and β be represented as column vectors. Equation 8 can be written in vector and matrix notation as:









I = (α + I′ · Φ^T(y / ΦI′)) / (β + Φ^T 1)   (10)







where the explicit multiplication and division denote element-wise operations, and where 1 is a vector (of the appropriate length) consisting solely of 1's.
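By way of illustration only, the following Python sketch applies one update of Equation 10 with NumPy arrays, using element-wise multiplication and division as stated above; setting α and β to zero recovers the maximum-likelihood update of Equation 6. The names are assumptions made for the example.

```python
# Illustrative sketch of the matrix-form update of Equation 10. Phi is the (T, |U|)
# projection matrix, alpha and beta hold the Gamma-prior parameters per voxel, and
# multiplication/division between vectors are element-wise, as in the text.

import numpy as np

def map_em_update(y, Phi, current, alpha, beta):
    numerator = alpha + current * (Phi.T @ (y / (Phi @ current)))
    denominator = beta + Phi.T @ np.ones_like(y)
    return numerator / denominator
```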


Limited computational resources (i.e., when the entire projection matrix Φ cannot be kept in memory) may require breaking the update computation according to a partition of Φ into a set of sub-matrices (Φi). In that case the intensities can be updated gradually (using only one sub-matrix at each step) according to the following computation:









I = (α + I′ · Σi Φi^T(yi / Φi I′)) / (β + Σi Φi^T 1)   (11)







where yi denotes the vector of observations that are obtained using the views of Φi.
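By way of illustration only, the following Python sketch accumulates the update of Equation 11 over a partition of Φ into sub-matrices, so that only one (Φi, yi) pair is handled at a time; the names are assumptions made for the example.

```python
# Illustrative sketch of the partitioned update of Equation 11: per-block
# contributions are accumulated one sub-matrix at a time, so the full projection
# matrix never needs to be held in memory at once.

import numpy as np

def partitioned_map_em_update(sub_blocks, current, alpha, beta):
    """sub_blocks: iterable of (Phi_i, y_i) pairs forming a partition of Phi."""
    numerator = np.array(alpha, dtype=float)
    denominator = np.array(beta, dtype=float)
    for Phi_i, y_i in sub_blocks:
        numerator += current * (Phi_i.T @ (y_i / (Phi_i @ current)))
        denominator += Phi_i.T @ np.ones_like(y_i, dtype=float)
    return numerator / denominator
```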


In order to achieve a reconstructed image which is adequate for medical diagnostic and treatment purposes, a high-resolution image of the tested object must be obtained. When high-resolution detecting units are used, their efficiency is relatively low, and the detecting units must remain at each position for a relatively long time in order to achieve a high probability of detection. Since during medical testing, measurements are generally performed at many locations as the detecting unit is moved relative to the observed organ, the testing procedure generally requires a long time and is physically and emotionally difficult for the patient. Additionally, reconstruction is based upon a large quantity of data, and is a lengthy and computationally complex process.


Reference is now made to FIG. 13, which is a schematic flowchart that illustrates steps of a typical prior art gated image reconstruction method. In FIG. 13, i denotes a frame counter and M denotes the number of frames. Usually, after all the M datasets are fetched, as shown at 2, and i is set to 1, as shown at 4, the dataset that corresponds with frame i is loaded into the processing unit, as shown at 6. During the following step, as shown at 8, the processing unit is used to perform a frame reconstruction according to the dataset that corresponds with frame i, using any suitable reconstruction algorithm known in the art, such as the aforementioned EM algorithm, ordered subset expectation maximization (OSEM), algebraic reconstruction techniques (ART), or FBP.


As shown at 12, after the reconstruction is completed, the frame counter i is incremented by 1. If i is larger than M and there are no more frames, the process ends. If i is not larger than M, the next dataset, which corresponds with the subsequent frame, is loaded for reconstruction. In such a manner, the generation of a static image of the heart in a specific configuration becomes possible.
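By way of illustration only, the following Python sketch captures the prior-art flow of FIG. 13, in which each of the M gated datasets is reconstructed independently; reconstruct_frame stands for any suitable algorithm (EM, OSEM, ART, or FBP) and its name is an assumption made for the example.

```python
# Illustrative sketch of the loop of FIG. 13: each of the M gated datasets is
# reconstructed independently. reconstruct_frame is a placeholder for any suitable
# reconstruction algorithm.

def reconstruct_all_frames(gated_datasets, reconstruct_frame):
    """Return one reconstructed image per frame, i = 1 .. M."""
    return [reconstruct_frame(dataset) for dataset in gated_datasets]
```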


However, a known problem of such a method is the high computational load that is needed for its execution. Even when using an ordered subset method, such as the aforementioned OSEM, which is capable of reducing the computational load while giving similar results, the computational power needed to obtain a good image reconstruction is still quite high. Such a high computational load results in a longer reconstruction time, which reduces the throughput of the processing unit and requires more expensive processing hardware.


There is thus a widely recognized need for, and it would be highly advantageous to have, a method and an apparatus for image reconstruction in nuclear medicine imaging devoid of the above limitations.


SUMMARY OF THE INVENTION

According to one aspect of the present invention there is provided a method for iteratively reconstructing a volumetric image of an overall volume from radioactive emissions, the method comprising:


a) obtaining radioactive emissions from the overall volume, the overall volume comprising at least a part of a body organ or other body portion;


b) using the radioactive emissions to reconstruct an initial volumetric image of the overall volume, the initial volumetric image containing an initial location and initial shape of the at least a part of a body organ or other body portion and an initial estimation of number of photons emitted from the at least a part of a body organ or other body portion; and


c) reconstructing a further volumetric image from the initial volumetric image by an iterative process using object implantation for refining reconstruction, wherein the object implantation includes:


providing a model of at least a portion of the overall volume, the model including a general location and shape of the at least a part of a body organ or other body portion and an expected number of photons emitted from the at least a part of a body organ or other body portion;


replacing, at the general location, at least a portion of the initial volumetric image with the general shape of the at least a part of a body organ or other body portion, based on the model;


determining an improved estimation of a number of photons emitted from the at least a portion of the initial volumetric image, based on the expected number of photons, wherein the improved estimation is an increase in number of photons over the initial estimation; and


replacing the initial estimation of number of photons with the improved estimation, wherein the object implantation is used one or more times during the iterative process, each time for providing a better starting point for performing a next iteration of the iterative process, whereby the improved estimation is used to redistribute photon counts in an iteration.


Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.


Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps may be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention may be implemented as a chip or a circuit. As software, selected steps of the invention may be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention may be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the Office upon request and payment of the necessary fee.


The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in order to provide what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.


In the drawings:



FIGS. 1A-1D schematically illustrate a dynamic SPECT camera, in accordance with embodiments of the present invention;



FIGS. 2A and 2B schematically illustrate the camera structure with the assemblies, in accordance with an embodiment of the present invention.



FIGS. 3A-3D schematically illustrate viewing positions, in accordance with embodiments of the present invention.



FIGS. 4A-4F schematically illustrate stereo views and cross views, in accordance with embodiments of the present invention.



FIGS. 5A and 5B illustrate experimental radiopharmaceutical data, as known;



FIGS. 5C-5F illustrate cardiac gating, in accordance with embodiments of the present invention;



FIGS. 6A-6I illustrate an intracorporeal dynamic SPECT camera, in accordance with embodiments of the present invention;



FIG. 7 illustrates assembly-damping parameters, in accordance with embodiments of the present invention;



FIGS. 8A and 8B schematically illustrate grid and anatomical construction of voxels, in accordance with embodiments of the present invention;



FIGS. 9A-9J present experimental data, obtained by the dynamic SPECT camera, in accordance with embodiments of the present invention;



FIG. 10 presents experimental data, obtained by the dynamic SPECT camera, in accordance with embodiments of the present invention;



FIG. 11 illustrates components of the dynamic SPECT camera, in accordance with embodiments of the present invention;



FIG. 12 illustrates an electrical scheme, in accordance with embodiments of the present invention;



FIG. 13 is a schematic flowchart that illustrates steps of a typical prior art gated image reconstruction method;



FIG. 14 is a schematic illustration of an apparatus for reconstructing a radioactive emission image of an input overall volume having dynamic and static volumetric regions, according to a preferred embodiment of the present invention;



FIG. 15 is a schematic isometric view of the input overall volume that is depicted in FIG. 14, according to one embodiment of the present invention;



FIG. 16 is a schematic cross-sectional view of the input overall volume of FIG. 15, according to one embodiment of the present invention;



FIG. 17 is a flowchart that depicts a method for reconstructing an input overall volume using anatomically varying time-bin lengths, according to one embodiment of the present invention;



FIG. 18 is a graphical representation of a one-dimensional vector of voxels that represents the reconstruction of the input overall volume, according to one embodiment of the present invention;



FIG. 19 is another flowchart that depicts another method for reconstructing an input overall volume using anatomically varying time-bin lengths, according to another embodiment of the present invention; and



FIG. 20 is a graphical representation of a position of two selected sub-regions in two sequential frames, according to another embodiment of the present invention.





DESCRIPTION OF SPECIFIC EMBODIMENTS OF THE INVENTION

The present invention relates to an apparatus and a method for reconstructing a radioactive emission image of an overall volume having dynamic and static volumetric regions. The reconstruction is based on gated images with anatomically varying time-bin lengths. The apparatus and the method are designed to allow the segmentation of the radioactive emission image into gated and non-gated regions. In such a manner, the reconstructions of radioactive emissions from the dynamic volumetric region and the static volumetric region are carried out separately. Thus, the high computational throughput that is needed in order to reconstruct a dynamic volumetric region, such as the heart, using time binning techniques has less or no effect on the reconstruction of the static volumetric region, as further described below. The disclosed apparatus comprises a number of detectors, such as PET or SPECT detectors, which are designed for obtaining radioactive emissions from the overall volume, and an image reconstruction module that is designed for generating radioactive emission images of the overall volume according to the obtained radioactive emissions. The apparatus further comprises a segmentation module that is designed for segmenting an initial radioactive emission image into gated and non-gated regions, according to the dynamic and static volumetric regions of the overall volume. The image reconstruction module is designed to separately reconstruct the gated and non-gated regions in the radioactive emission image according to radioactive emissions from the dynamic and static volumetric regions, respectively.


The method for reconstructing a radioactive emission image of an overall volume having static and dynamic volumetric regions comprises several steps. During the first step, radioactive emissions are obtained from the overall volume. Then, an initial radioactive emission image of the overall volume is reconstructed according to the radioactive-emission data. In the following step, the initial radioactive emission image is segmented into gated and non-gated regions, respectively according to the dynamic and static volumetric regions. During the last step, the radioactive emission image is reconstructed, wherein the gated region is reconstructed according to radioactive emissions from the dynamic volumetric region and the non-gated region is separately reconstructed according to radioactive emissions from the static volumetric region.


In another embodiment, only the dynamic volumetric region is reconstructed using time binning. The static volumetric region is reconstructed according to the initial radioactive emission image. Preferably, time binning of different anatomical segments has dynamically varying time-bin lengths, as further described below.
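By way of illustration only, the following Python sketch outlines the flow described above: an initial reconstruction, segmentation into gated and non-gated regions, per-frame reconstruction of the dynamic region using time binning, and reuse of the initial image for the static region. All function names and signatures are assumptions made for the example, not the apparatus' actual interfaces.

```python
# Illustrative sketch of the described method. All callables are placeholders:
# reconstruct(dataset, region=None) -> image,
# segment(image) -> (dynamic_mask, static_mask),
# bin_by_frame(emissions, M) -> list of M gated datasets.

def reconstruct_with_anatomical_gating(emissions, M, reconstruct, segment, bin_by_frame):
    initial_image = reconstruct(emissions, region=None)        # initial reconstruction of the overall volume
    dynamic_mask, static_mask = segment(initial_image)         # segmentation into gated / non-gated regions
    gated_datasets = bin_by_frame(emissions, M)                # time binning for the dynamic region only
    dynamic_frames = [reconstruct(d, region=dynamic_mask)      # per-frame reconstruction of the gated region
                      for d in gated_datasets]
    static_image = initial_image * static_mask                 # static region reused from the initial image
    return static_image, dynamic_frames
```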


The principles and operation of the dynamic SPECT camera according to aspects of the present invention may be better understood with reference to the drawings and accompanying descriptions.


The principles and operation of an apparatus and method according to the present invention may be better understood with reference to the drawings and accompanying description.


Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.


Reference is now made to FIG. 14, which is a schematic illustration of an apparatus 990 for reconstructing a radioactive emission image of an input overall volume 1002 having dynamic 1003 and static 1002 volumetric regions, according to a preferred embodiment of the present invention. In one embodiment of the present invention, the input overall volume 1002 is the thorax, the static volumetric region is the related viscus, and the dynamic volumetric region is the heart and the area that confines it. The apparatus 990, which is preferably a SPECT camera, has a number of detecting units 993, such as SPECT detectors. Each one of the detectors 993 is designed for obtaining radiation that is emitted from the input overall volume 1002, as described below, and for generating radioactive-emission data accordingly. The apparatus 990 comprises an image reconstruction module 992 that is connected to the detecting units 993. The image reconstruction module 992 is designed for generating radioactive emission images according to the radioactive-emission data. The images are preferably gated, as described below. The apparatus 990 further comprises a segmentation module 991 that is designed for segmenting an initial radioactive emission image, which has been generated by the image reconstruction module 992, into gated and non-gated regions according to the dynamic and static volumetric regions. The gated and non-gated regions are used by the image reconstruction module 992 for separately reconstructing the dynamic and static volumetric regions, as further described below in the anatomically varying time-bin lengths section.


Reference is now made to a more elaborate description of a preferred apparatus 990.


Dynamic SPECT Camera


Design Description of the Dynamic SPECT Camera


As described above, in one embodiment of the present invention the apparatus 990 is a dynamic SPECT camera. Hereinbelow, a description is given of a dynamic SPECT camera with temporal and spatial resolutions which meet and even outperform those of PET, and with a high spectral resolution not available in PET.


Temporal resolution, as used herein, relates to a minimal acquisition time for a tomographic reconstruction image of a predetermined volumetric region, for example 15×15×15 cubic centimeters, and predetermined spatial resolution, for example, 10×10×10 cubic millimeters. The minimal acquisition time may be, for example, 30 seconds, 10 seconds, or 1 second.


Reference is now made to FIGS. 1A-1D, which schematically illustrate a dynamic SPECT camera 10 that is configured for capturing gated images and non-gated images, in accordance with embodiments of the present invention. The dynamic SPECT camera 10 comprises: an overall structure 15, which defines proximal and distal ends with respect to a body 100; and a number of assemblies 20, for example, 6, 9, or 16 assemblies 20, arranged on the overall structure 15, forming an array 25 of the assemblies 20. Each one of the assemblies 20 comprises a number of detecting units 12. Each detecting unit 12 includes a single-pixel detector 14 for detecting radioactive emissions and a dedicated collimator 16, attached to the single-pixel detector 14 at the proximal end thereof, for defining a solid collection angle δ for the detecting unit 12.


Additionally, each assembly 20 comprises an assembly motion provider 40, configured for providing the assembly 20 with individual assembly motion, with respect to the overall structure 15, during the acquisition of radioactive-emission data for a tomographic image.


The dynamic SPECT camera 10 further includes a timing mechanism 30, in communication with each single-pixel detector 14, configured for enabling time binning of the radioactive emissions impinging upon each single-pixel detector 14 into periods which are not greater than substantially 30 seconds. As the timing mechanism 30 can control each one of the single-pixel detectors 14 separately, each one of the single-pixel detectors 14 can be configured according to a different time binning scheme. In one embodiment of the present invention, the time binning scheme which is applied to a certain detector is determined according to the region in the input overall volume that the detector is designed to detect.
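By way of illustration only, the following Python sketch assigns a time-binning period per detecting unit according to the region it views, as described above; the region labels and bin durations are assumptions made for the example, not values from this disclosure.

```python
# Illustrative sketch: choosing a time-binning period per detecting unit according
# to the region it views. Region labels and durations are example values only.

def assign_time_binning(detector_regions, dynamic_bin_s=0.1, static_bin_s=30.0):
    """detector_regions: mapping of detector id -> 'dynamic' or 'static'."""
    schemes = {}
    for detector_id, region in detector_regions.items():
        # Units viewing the dynamic region (e.g. the heart) get short bins;
        # units viewing static tissue can use much longer bins.
        schemes[detector_id] = dynamic_bin_s if region == "dynamic" else static_bin_s
    return schemes
```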


The dynamic SPECT camera 10 further includes a position tracker 50, which is designed for providing information on the position and orientation of each detecting unit 12, with respect to the overall structure 15, substantially at all times, during the individual assembly motion.


The dynamic SPECT camera 10 is configured for acquiring a tomographic reconstruction image of a region of interest of about 15×15×15 cubic centimeters, for example, of a target organ 110, such as a heart or a stomach, during an acquisition period no greater than 300 seconds, at a spatial resolution of at least 10×10×10 cubic millimeters.


It will be appreciated that the time period may be no greater than 200 seconds, 100 seconds, 60 seconds, 30 seconds, 10 seconds, or 1 second.


Additionally, the dynamic SPECT camera 10 is configured for acquiring a series of tomographic reconstruction images of a region of interest, as a function of time, at a rate of at least a tomographic reconstruction image every 300 seconds.


Again, the rate may further be every 200 seconds, 100 seconds, 60 seconds, 30 seconds, 10 seconds, or 1 second.


In accordance with embodiments of the present invention, the individual assembly motion may be, for example, an assembly oscillatory sweeping motion, as described by an arrow 60. Additionally or alternatively, the individual assembly motion may be a first oscillatory lateral motion, as described by an arrow 80. Additionally or alternatively, the individual assembly motion may be a second oscillatory lateral motion, orthogonal to the first, as described by an arrow 90. Thus, the assembly motion provider 40 may comprise between one and three motion providing units, for the different assembly motions.


Alternatively, the individual assembly motion is an assembly oscillatory sweeping motion, as described by an arrow 60, while the array 25 moves with either the first or the second oscillatory lateral motions, described by the arrows 80 and 90, or with both.


Additionally, the detecting units 12 may be grouped into square or rectangular blocks 18, for example, of 4×4 detecting units 12, as seen in FIG. 1A, or of 16×16, 64×64, 64×128 or another number of detecting units 12. Furthermore, the blocks 18 may be provided with individual block oscillatory sweeping motion, as described by an arrow 70, with respect to the overall structure 15, during the acquisition of radioactive-emission data for a tomographic image. Preferably, the block oscillatory sweeping motion is orthogonal to, or at an angle to the assembly oscillatory sweeping motion, described by the arrow 60. Thus, the assembly motion provider 40 may further comprise a dedicated block motion providing unit, in communication with each block of an assembly.


A control unit 55 may be integrated with the dynamic SPECT camera 10, to form a single physical unit, or in communication with the dynamic SPECT camera 10.


A spectral selection mechanism 56, in communication with each of the detecting units 12, is discussed hereinbelow, under the heading, “dynamically varying spectral bins.”


The body 100 may be a human or an animal, and the region of interest, or the target organ 110 may be a heart, a brain, a breast, a stomach, a GI tract, a colon, a prostate, a uterus, a cervix, a vagina, a throat, a gland, a lymph node, a portion of skin, a portion of bone, portion of another tissue, or another body portion.


As seen in FIGS. 1A and 1B, a reference x;y;z coordinate system illustrates a preferred orientation of the dynamic SPECT camera 10 with respect to the body 100, wherein z runs along a length of the body 100. For convenience, the assembly axis along the assembly length will be referred to as the assembly longitudinal axis, and the assembly axis along the assembly width will be referred to as the assembly traverse axis.


Preferably, the assemblies 20 are long and narrow columns, arranged longitudinally against the body 100, wherein the oscillatory sweeping motion, described by an arrow 60, is about the z-axis. It will be appreciated that other arrangements are similarly possible.


As seen in FIG. 1C, illustrating a cross-sectional view in the x-y plane, preferably, the assemblies 20 are arranged in an arc or an arc-like structure, about the body 100, maintaining a shape that follows the body contours, so as to keep as close as possible to the body 100.



FIG. 1D provides details of the detecting unit 12. The collimator has a length L, a collection angle δ, and a septa thickness τ. The single pixel detector is preferably a square of sides D and a detector thickness τd.


Preferred dimensions for the detecting unit 12 may be, for example, 2.46 mm×2.46 mm, and the solid collection angle δ may be at least 0.005 steradians. Generally, there may be 16×64 detecting units 12 per block 18.


The detector 14 is preferably a room temperature, solid-state CdZnTe (CZT) detector, which is among the more promising detectors currently available. It may be obtained, for example, from IMARAD IMAGING SYSTEMS LTD., of Rehovot, ISRAEL, 76124, www(dot)imarad(dot)com, or from eV Products, a division of II-VI Corporation, Saxonburg Pa., 16056, or from another source. Alternatively, another solid-state detector such as CdTe, HgI, Si, Ge, or the like, or a combination of a scintillation detector (such as NaI(Tl), LSO, GSO, CsI, CaF, or the like) and a photomultiplier, or another detector as known, may be used, preferably with a photomultiplier tube for each single-pixel detector 14 and collimator 16, for accurate spatial resolution.


Reference is further made to FIGS. 2A and 2B, which schematically illustrate the structure 15 with the assemblies 20, in accordance with an embodiment of the present invention. As seen, the assemblies 20 are arranged in an arc of an angle α, around the body 100, and move in the assembly oscillatory sweeping motion, about the z-axis, so as to provide a plurality of views of the heart 110, from many positions along the x-y plane.


As seen in FIGS. 2A and 2B, the dynamic camera 10 is configured for simultaneous acquisition by the assemblies 20, each scanning the same region of interest from a different viewing position, thus achieving both shorter acquisition time and better edge definitions.


Preferably, the structure 15 conforms to the contours of the body 100, to maintain substantial contact or near contact with the body.


The embodiment of FIGS. 2A and 2B illustrates a single type of motion, namely assembly oscillatory sweeping motion about the z-axis, as described by the arrow 60 (FIG. 1A). In some cases, additional motions or views from additional directions may be desirable, as illustrated in FIGS. 3A-3D, hereinbelow.


Reference is further made to FIGS. 3A-3D, which schematically illustrate viewing positions, in accordance with embodiments of the present invention.



FIG. 3A illustrates a cylindrical target organ 110, with a cylindrical radioactive emission source 115 therein.


As seen in FIG. 3B, a view along the x-axis will observe the cylindrical radioactive emission source 115 as a bar 115.


As seen in FIG. 3C, a view along the y-axis will similarly observe the cylindrical radioactive emission source 115 as a bar 115, thus not adding new information to the view along the x-axis.


It will be appreciated that in the present example, any view along the x-y plane will observe the radioactive emission source 115 as a bar 115.


As seen in FIG. 3D, a view along the z-axis will observe the cylindrical radioactive emission source 115 as a circle 115, adding new information to the views along the x and y axes.


As FIGS. 3A-3D illustrate, at times, views along two axes may be insufficient for a three-dimensional definition of an object, and it may be beneficial to include views with a component along the third axis. For the sake of definition, views along two axes will be referred to as stereo views, while views that include a component of the third axis will be referred to as cross views, since they intersect the planar stereo views.


Reference is further made to FIGS. 4A-4F, which schematically illustrate stereo views and cross views, in accordance with embodiments of the present invention.



FIG. 4A illustrates the body 100 with a single assembly 20 arranged for viewing, for example, the heart 110. The assembly 20 is afforded with assembly oscillatory sweeping motion about the z-axis, as described by the arrow 60, and preferably with first and second orthogonal oscillatory lateral motions, described by the arrows 80 and 90, respectively.


As seen in FIG. 4B, the assembly oscillatory sweeping motion about the z-axis, described by the arrow 60, produces views 65 in the x-y planes. The first and second orthogonal oscillatory lateral motions, described by the arrows 80 and 90, augment these with additional views 65 in the x-y planes. The purpose of the first and second oscillatory lateral motions is to compensate for "dead areas," that is, structural areas and other areas that do not participate in the detection, within the assembly 20 and between the assemblies 20, so as to provide complete coverage of the body 100 by the array 25 (FIG. 1A). These motions produce views substantially in the x-y plane. It will be appreciated that there is a component of viewing in a third axis, due to the solid collection angle of the collimator 16. Yet this component is rather small.


Returning to FIG. 4A, the blocks 18 of the assembly 20 may be further afforded with block oscillatory sweeping motion, described by the arrow 70 and preferably orthogonal to the assembly oscillatory sweeping motion described by the arrow 60.


As seen in FIG. 4C, the block oscillatory sweeping motion, described by the arrow 70, produces cross views 75, which supplement views 65, by providing components of the third axis, namely, the z-axis. As illustrated in FIGS. 3A-3D, hereinabove, the views 75 may add additional information, not available or barely available in the views 65 along the x-y planes.



FIGS. 4D and 4F illustrate an alternative mode for acquiring the cross views 75. Accordingly, the dynamic camera 10 further includes assemblies 22, arranged at an angle β to the assemblies 20, and moving with an assembly oscillatory sweeping motion, described by an arrow 62, so as to provide the cross views 75.


It should be noted that the detectors of the dynamic camera 10 do not have to be arranged in arrays. In one embodiment of the present invention the detectors are scattered in front of the body so as to provide complete coverage of the body internal overall volume. The detectors can be scattered in a certain structure or in an arbitrary order.


The Position Tracker 50


The position tracker 50 is configured for providing information on the position and orientation of each detecting unit 12, with respect to the overall structure 15, substantially at all times, during the individual assembly motion.


In accordance with a preferred embodiment of the present invention, the position tracker 50 relates to software and/or hardware that receives information from the motion provider 40 and calculates the position and orientation of each detecting unit 12, based on that information. Preferably, the calculation is performed within the control unit 55.


Alternatively, position sensors, as known, may be used for determining the position and angular orientation of each detecting unit 12.


Alternatively still, a combination of information from the motion provider 40 and position sensors may be employed.


The Timing Mechanism 30


The timing mechanism 30 associates timing information with the radioactive emission data impinging the single-pixel detectors 14 of the detecting units 12. Preferably, the timing mechanism 30 includes a single clock used for all of the single-pixel detectors 14 in the dynamic SPECT camera 10, so that timing information is synchronized for the camera as a whole. The timing information is collected at the single-pixel level, so that time binning may be performed for the emission data collected by each pixel. Exemplary methods for associating timing information with the radioactive emission data include:


1) Time stamping—Each event, impinging on a given single-pixel detector 14 at a given time is stamped with a time of detection and a pixel identification. Stamping may be performed by any manner known in the art, for example as a data packet header or footer. The time-stamped, pixel stamped radioactive emission data may be binned, per time and per pixel, by the control unit 55.


2) Time binning—In an alternate approach, timing information is provided for a cumulative count collected from each single-pixel detector 14 over a fixed time interval, for example, 0.001 seconds, 1 second, or 10 seconds, rather than for individual events. Each time bin is then stamped with a time stamp or sequential number and pixel identification. One technique for performing time binning is to insert a periodic clock pulse into the data stream. The interval between the clock pulses equals the minimum bin length. Thus, periodic pulses every 0.001 seconds may lead to bin lengths of 0.001 seconds or greater, for example, 1 second, or 10 seconds.


The timing mechanism 30 is used by the reconstruction module in order to allow the separate reconstruction of the dynamic and static volumetric regions. The timing mechanism 30 allows the reconstruction module to apply time binning with a certain bin length to the dynamic volumetric region and time binning with another bin length to the static volumetric region.
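For illustration only, the following Python sketch shows how time-stamped emission events might be binned per pixel, in the spirit of the time-binning approach described above. The event tuples, the clock interval, and the chosen bin length are assumptions for demonstration and are not part of the camera's actual data format.

    # Illustrative sketch only: per-pixel time binning of time-stamped emission events.
    from collections import defaultdict

    CLOCK_INTERVAL = 0.001      # seconds between inserted clock pulses (minimum bin length)
    BIN_LENGTH = 1.0            # chosen bin length, a multiple of CLOCK_INTERVAL

    def bin_events(events, bin_length=BIN_LENGTH):
        """events: iterable of (time_stamp_seconds, pixel_id) pairs.
        Returns {(pixel_id, bin_index): count}, a cumulative count per pixel per time bin."""
        counts = defaultdict(int)
        for t, pixel in events:
            counts[(pixel, int(t // bin_length))] += 1
        return dict(counts)

    # Example: three events on two pixels, binned into 1-second bins.
    events = [(0.0004, 17), (0.4500, 17), (1.2000, 42)]
    print(bin_events(events))   # {(17, 0): 2, (42, 1): 1}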


Time Scale Considerations


Dynamic studies, aimed at obtaining kinetic parameters, require the acquisition of fully reconstructed images at a rate that is at least about twice the frequency of the sampled kinetic process. For example, for adult humans, blood circulates through the body at a rate of about 1 cycle per minute. Thus, sampling a process affected by blood circulation should take place at a rate of at least two samplings per minute. Preferably, sampling should be at a much greater rate, for example, 6 samplings or 10 samplings per minute, that is, about every 10 seconds or about every 6 seconds.


Additionally, based on FIGS. 5A and 5B, according to Garcia et al. (Am. J. Cardiol. 51st Annual Scientific Session, 2002), showing physiological behavior of different radiopharmaceuticals, dynamic studies for Tc-99m teboroxime are best performed within about the first 100 seconds after administration, and better still, within the first 60 seconds after administration.


Moreover, based on FIGS. 5A and 5B, the dynamic behavior of a radiopharmaceutical in the body varies as a function of time, depending on the radiopharmaceutical and on the time elapsed since its administration. For example, myocardial perfusion of Tc-99m teboroxime shows a very steep uptake between about the first 10-15 seconds and the first 50-60 seconds, followed by a more gradual washout, after the first 60 seconds. The rate of sampling of Tc-99m teboroxime, during the first 60 seconds after administration, should be adjusted to the very steep uptake, for example, a sampling rate of every second. For radiopharmaceuticals with a slower dynamic behavior, a slower sampling rate may be sufficient.


It will be appreciated that a dynamic analysis requires precise knowledge of the time of administration.


Obtaining the Time of Administration of a Radiopharmaceutical


As noted hereinabove, precise knowledge of the time of administration of a radiopharmaceutical is important both in order to evaluate physiological processes made visible by the radiopharmaceutical, with respect to the time of the radiopharmaceutical's entry to the body and in order to perform the evaluation at an optimal period, with respect to the radiopharmaceutical's cycle in the body.


There are several methods for acquiring precise knowledge of the time of administration of the radiopharmaceutical, as follows:


1. providing communication means between an administration device, for example, a syringe or an IV device, and the dynamic SPECT camera 10, and communicating the precise time of administration, vis-a-vis a clock, by the administration device to the dynamic SPECT camera 10. This method may be employed for example, when administration takes place when the patient is positioned at the dynamic SPECT camera 10, for imaging.


2. providing communication means between the administration device, the dynamic SPECT camera 10, and a third unit, for example, a control system or a hospital's ERP system, communicating the precise time of administration, vis-a-vis a clock, by the administration device to the third unit, and reporting the precise time of administration by the third unit to the dynamic SPECT camera 10. This method may be employed, for example, when administration takes place at a different location than the imaging station.


3. allowing the dynamic SPECT camera 10 to image the site of administration, for example, the arm of the patient, while administration takes place, while employing the timing mechanism 30 of the dynamic SPECT camera 10. A marker, for example, a line of radioactive ink, may be drawn, for example, on the patient's arm or on the administration device, for defining the time of administration as the time the radiopharmaceutical first crosses the marker. Alternatively, observation of a flow of the radiopharmaceutical in the administration device or in the patient's vein may be used to determine the time of administration.


4. Observing a transparent administration device, for example, with a video camera, associated with a clock, may be employed for defining a time of administration based on the radiopharmaceutical distribution in the administration device, or based on the time the radiopharmaceutical first crosses a marker, visible by the video camera. Communication between the video camera and the dynamic SPECT camera 10, or between the video camera, the dynamic SPECT camera 10, and a third unit will provide the information to the dynamic SPECT camera 10.


In accordance with embodiments of the present invention, the administration may include various administration profiles, for example, bolus, continuous drip, or sinusoidal.


Spatial and Temporal Resolution


In order to meet the time scale considerations, described hereinabove, the dynamic SPECT camera 10 according to embodiments of the present invention is designed at least for acquiring a tomographic reconstruction image of about 15×15×15 cubic centimeters, which is approximately the volumetric region of a heart, at a predetermined spatial resolution of at least 10×10×10 cubic millimeters, at an acquisition time no greater than about 30 seconds. Preferably, the acquisition time is no greater than about 10 seconds, and more preferably, the acquisition time is no greater than about 1 second.


Additionally, the spatial resolution of the tomographic reconstruction image may be at least 7×7×7 cubic millimeters, or better yet, at least 4×4×4 cubic millimeters, or better still, at least 1×1×1 cubic millimeters.


Anatomically Varying Time-Bin Lengths


As discussed in the background section, time binning is needed in order to generate a clear image of a dynamic organ, such as the heart, or a section thereof. Though time binning allows the acquisition of a clear image of the heart, it has at least one major disadvantage: the reconstruction of the image using time binning requires high computational throughput. Thus, binning images of the entire input overall volume may provide a clear image of the heart, but at a high computational cost. Reconstruction using anatomically varying time-bin lengths can be used to reduce the computational throughput of the time binning.


While some body organs, such as the kidney, the lung, or the liver, are relatively static, so as to enable imaging over a period of time that allows acquiring a statistically significant number of counts, the heart moves relatively rapidly, at about 80-100 beats per minute on average. In one embodiment of the present invention, as the static region does not have to be gated to provide a clear image, only the dynamic region, which preferably contains the heart, is gated. In such an embodiment, fewer voxels are gated and therefore the computational complexity is reduced.


In such an embodiment, different areas in the body can be gated at a rate that is adjusted according to a respective level of activity. For example, the heart, which has a high level of activity, can be gated using a large number of bins; the visceral background, which is relatively static, is gated using one or two bins; and the stomach, which has a higher level of activity than the visceral background but a lower level of activity than the heart, is gated using a limited number of bins.


As described below, performing gated image reconstruction using anatomically varying time-bin lengths improves the reconstruction quality, reduces the reconstruction time, or both. The improvement is an outcome of a reduction in the needed computational resources.


Reference is now made to FIGS. 15 and 16, which are respectively a schematic isometric view of the input overall volume 1001 segmented into dynamic and static volumetric regions 1003, 1002, as depicted in FIG. 14, and a schematic cross-sectional view of the segmented input overall volume 1001A taken along line III-III, according to one embodiment of the present invention.


As described below, a radioactive emission image of the input overall volume 1001 is segmented into a non-gated region, which includes non-gated voxels, in accordance with the static volumetric region 1002, and into a gated region, which includes gated voxels, in accordance with the dynamic volumetric region 1003. Preferably, the dynamic volumetric region 1003 is adjusted to delimit a dynamic organ, such as the human heart, which is schematically represented by a hollow sphere 1004. Preferably, the dynamic volumetric region 1003 is larger than the apparent volumetric region of the heart 1004 to account for segmentation errors. It should be noted that the dynamic volumetric region 1003 may be adjusted to contain other human and animal internal organs, such as the stomach. In FIG. 16, the hatched region 1003A represents the cross-section of the dynamic volumetric region 1003 and the annular crosshatched region 1004A schematically represents a cross-section through the heart muscle of the heart 1004. The region 1002A represents a cross-section of the static volumetric region 1002.


It should be noted that the cubical shapes of the dynamic volumetric region 1003 and the static volumetric region 1002 are not obligatory, and the segmentation into regions may be performed using differently shaped volumetric regions. Preferably, the dynamic and the static volumetric regions 1003, 1002 have several non-connected parts. For example, the dynamic volumetric region 1003 may be a spherical volumetric region, an ellipsoid of revolution, an ellipsoid with a hole that represents the blood inside the heart, a cylindrical volumetric region, or any other type of suitable regularly shaped or non-regularly shaped volumetric region. Preferably, the dynamic volumetric region 1003 comprises non-connected components, which may be referred to as sub-volumetric regions.


Reference is now made jointly to FIG. 15, previously described, and to FIG. 17, which is a flowchart that depicts a method for reconstructing an input overall volume using anatomically varying time-bin lengths, according to one embodiment of the present invention.


During the first step, as shown at 1301, radiation emitted from the input overall volume 1001 is captured by the SPECT detectors and recorded, as described above. The captured radiation is used to generate a set of gated images, which are used to overcome distortions such as motion artifacts. As described above, each gated image is generated by a photon counting that takes into account the portion of the heart contraction cycle within which a photon is measured. The number of photons hitting the detector within a specific integration time is calculated and used as raw data, which may be referred to as datasets.
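For illustration only, the following Python sketch shows one way a detected photon might be assigned to a gate according to the portion of the heart contraction cycle within which it was measured. The R-peak times, the number of gates, and the event times are assumptions for demonstration; the actual gating signal and its processing may differ.

    # Illustrative sketch: assigning detected photons to cardiac gates according to the
    # portion of the contraction (RR) cycle in which each photon was measured.
    import bisect

    def assign_gate(event_time, r_peaks, num_gates):
        """Return the gate index (0..num_gates-1) for a photon detected at event_time,
        given a sorted list of R-peak times; returns None outside the covered cycles."""
        i = bisect.bisect_right(r_peaks, event_time) - 1
        if i < 0 or i + 1 >= len(r_peaks):
            return None
        cycle_start, cycle_end = r_peaks[i], r_peaks[i + 1]
        phase = (event_time - cycle_start) / (cycle_end - cycle_start)
        return min(int(phase * num_gates), num_gates - 1)

    r_peaks = [0.00, 0.80, 1.62, 2.41]          # seconds, roughly 75 beats per minute
    print(assign_gate(0.20, r_peaks, 16))       # an early-cycle gate
    print(assign_gate(1.55, r_peaks, 16))       # a late-cycle gate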


Then, as shown at step 1302, the captured datasets are firstly used in an initial reconstruction process in which an initial estimation image is generated. Preferably, a non-gated reconstruction is used to provide a reconstruction that estimates the static intensity distribution.


In the following step, as shown at step 1303, the initial estimation image is segmented into a gated region and a non-gated region that respectively define the boundaries of the dynamic and static volumetric regions.


The segmentation of the input overall volume 1001 into dynamic and static volumetric regions 1003, 1002 is performed using a suitable image segmentation method or process.


Preferably, the input overall volume 1001 is further segmented into one or more other segments, such as the liver. Such segments may be joined to the dynamic or to the static volumetric regions 1003, 1002 according to the nature of the activity level of the segment. For instance, the liver may be joined to the static volumetric region.


The segmentation into static and dynamic volumetric regions may be performed using a number of possible methods. In one embodiment of the present invention, a system user marks the boundaries of the dynamic volumetric region that comprises the gated voxels. In such an embodiment, the reconstructed image is displayed on a screen of a user interface that allows the system user to delimit the dynamic volumetric region. Though the captured image is blurry, as it is not gated, it provides the system user with a perceptual image of the outlines of the internal organs in the input overall volume, including the heart, the liver, the spleen, the kidneys, and the aorta. In such an embodiment, the system user segments the captured image into gated and non-gated regions according to their level of activity, thereby defining the gated and non-gated regions. Preferably, the system user segments the heart as a gated region.


In one embodiment of the present invention, the segmentation is based on a voxel value threshold that reflects a certain percentage of the maximal reconstruction value. In such an embodiment, voxels of the reconstructed image having a value above the threshold are presumed to be voxels that depict the heart and are tagged as gated voxels of the dynamic volumetric region, and voxels of the input overall volume 1001 having a value below the threshold are tagged as non-gated voxels of the static volumetric region. Preferably, regions in the captured image are segmented according to predefined characteristics. For example, the liver region, which can be characterized as a very large segment residing in the lower part of an image that depicts the thorax, is identified and segmented as the static volumetric region 1002 or a section thereof.


Preferably, the predefined threshold is defined according to the radiation intensity of the visceral background of the input overall volume 1001. In such an embodiment, the radiation intensity of the visceral background is estimated before the segmentation process. Such an initial estimation can be performed using median or linear filters, such as Gaussian and moving average filters. Each one of the voxels of the input overall volume 1001 with a value that is well above the estimated background radiation is tagged as a gated voxel. Each one of the voxels of the input overall volume 1001 with a value that is below the estimated background radiation is tagged as a non-gated voxel.
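For illustration only, the following Python sketch tags voxels of an initial non-gated reconstruction as gated (dynamic) or non-gated (static) by comparing them with a background estimate. Here the background is estimated simply as the median voxel value, a crude stand-in for the median or linear filtering mentioned above; the factor defining "well above background" and the synthetic test volume are assumptions for demonstration.

    # Illustrative sketch: threshold-based tagging of gated (dynamic) versus non-gated voxels.
    import numpy as np

    def segment_gated_region(initial_image, factor=3.0):
        """initial_image: 3-D array of reconstructed voxel intensities.
        Returns a boolean mask Z_dyn: True for gated (dynamic) voxels, False otherwise."""
        background = np.median(initial_image)          # crude visceral-background estimate
        return initial_image > factor * background     # voxels well above background are gated

    image = np.random.default_rng(0).poisson(5.0, size=(32, 32, 32)).astype(float)
    image[12:20, 12:20, 12:20] += 40.0                 # a bright, heart-like cluster
    z_dyn = segment_gated_region(image)
    z_stat = ~z_dyn                                    # Z_stat(u) = 1 - Z_dyn(u)
    print(int(z_dyn.sum()), "gated voxels of", z_dyn.size)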


In one embodiment of the present invention, the segmentation is performed according to morphological segmentation methods that are adjusted according to the volumetric characteristics of the segmented volumetric regions. For example, the heart, which has convex faces, can be segmented using a top-hat transform.


In one embodiment of the present invention, the segmentation is performed according to the growth rate of regions of the input overall volume 1001. In such a manner, regions such as the heart may be identified. In one embodiment, voxels having a high growth rate are clustered as a group of voxels that depicts the heart.


In one embodiment of the present invention, the faces of the heart are identified. Such identification may be performed using an objective function with two parts. One part of the objective function is dependent on the organ border smoothness alone and the other part is dependent on the edge strength near a defined border, see M. Kass, A. Witkin, and D. Terzopoulos. Snakes: active contour models, International Conference on Computer Vision, pages 259-268, 1987, which is incorporated in its entirety by reference into the specification.


For clarity, Zdyn(u) and Zstat(u) respectively denote the dynamic volumetric region and the static volumetric region of the captured image. The dynamic and static volumetric regions respectively define the boundaries of gated and non-gated regions in the radioactive emission image that depicts the input overall volume. It should be noted that though only two volumetric regions are exemplified hereinbelow, the overall volume may be segmented into any number of volumetric regions, such as three, four, or ten volumetric regions, etc.


Preferably, after the gated and non-gated regions of the radioactive emission image have been segmented according to the dynamic and static volumetric regions, different resolutions are used for gated and non-gated voxels. In such an embodiment, the computational load of the reconstruction may be reduced by using large voxels with a low resolution in the static volumetric region 1002 and small voxels in the gated volumetric region.


Preferably, various morphological methods, such as dilation, closing, and the like, are used after the initial segmentation to expand the dynamic volumetric region 1003. The broadening of the dynamic volumetric region 1003 is done in order to ensure that if the segmentation has been made according to an organ in a contracted state, the dynamic volumetric region 1003 still encompasses the organ in an expanded state.


After the static and dynamic volumetric regions have been segmented during the initial reconstruction process, time binning of the dynamic volumetric region of the input overall volume is performed and a separate reconstruction of the static and the dynamic volumetric regions is enabled. As shown at 1304-1306, the reconstruction is based on an iterative process in which the time binning of gated images of the dynamic volumetric region is enabled.


For clarity, I0(u) denotes an input image I, which is preferably constant, that depicts u∈U voxels, t denotes a certain detector, g denotes a certain gate in a set of G gates, such as 8, 16, and 24, ϕt(u) denotes a standard functional matrix that depicts the detection probability of a photon emitted from location u∈U to be detected by detector t, st denotes the sensitivity of the detector t, Ttg denotes the integration time of detector t for gate g, Ig(u) denotes a set of G gated reconstructed images, ytg denotes the number of photons that are emitted from voxel u and detected in detector t at gate g.


Tt denotes the integration time of detector t and is calculated as follows:

Tt=ΣgTtg


Istat(u) and Idyng(u) respectively denote static and dynamic region images, where Istat(u) and Idyng(u) are mutually exclusive as Istat(u)·Idyng(u)=0, ∀u.


Zdyn(u) and Zstat(u) respectively denote static and dynamic regions in I, as defined in the aforementioned segmentation process. Preferably, Zdyn(u) and Zstat(u) are defined as follows:








Zdyn(u)=1, if u∈dynamic region; Zdyn(u)=0, otherwise

Zstat(u)=1−Zdyn(u)









Preferably, before the input overall volume I is iteratively reconstructed, a few preliminary sub-steps are taken. During the first sub-step, Istat(u) and Idyng(u) are initialized as follows:

Istat(u)=I0(u)·(1−Zdyn(u))
Idyng(u)=I0(u)·Zdyn(u)


Preferably, if Zdyn(u)=0, the size of Idyng(u) is reduced.


As described above, Zdyn(u) and Zstat(u), which respectively confine the dynamic and static volumetric regions, are defined at step 1303.


During the second sub-step the scale is calculated as follows:


scaleO.S.g(u)=Σt∈O.S.st·Ttg·ϕt(u)

scaleO.S.(u)=ΣgscaleO.S.g(u)

After the preliminary steps have been completed, the reconstruction of the input overall volume according to the time binning process commences. Preferably, during the reconstruction, Istat(u) and Idyng(u) are calculated for each voxel u∈U in the input overall volume.


During each iteration of the time binning process, as shown at 1305, the gated and non-gated regions that represent the static and dynamic images Istat(u), Idyng(u) are updated. The updating of the regions is calculated according to a deviation between the number of photons that have been detected by the SPECT detectors and an estimation of this number, as described below. During each one of the iterations, the gated voxels of the dynamic volumetric region are binned according to the number of gates and the non-gated voxels are binned only once. As the non-gated voxels are binned only once, the computational complexity of the process is relatively low. The separation between the static and dynamic volumetric regions improves the computational efficiency and reduces the statistical variance.


In particular, in order to calculate Istat(u) and Idyng(u), a number of sub-iterations take place. First, ŷstat,t is calculated as follows:

ŷstat,t=stTtΣuϕt(u)Istat(u)


Where ŷstat,t denotes an estimation of the number of photons that are emitted from the voxels u∈U and detected by detector t, wherein values of voxels from the dynamic volumetric region are zeroed. It should be noted that the sensitivity parameter and the integration time of detector t are taken into account at some stage in the calculation.


Then, ŷdyn,tg is calculated as follows:

ŷdyn,tg=stTtgΣuϕt(u)Idyng(u)


Where ŷdyn,tg denotes an estimation of the number of photons that are emitted at gate g from voxels u∈U and detected in detector t, wherein values of voxels from the static volumetric region are zeroed. It should be noted that the sensitivity parameter and the integration time of detector t for gate g are taken into account at some stage in the calculation.


ŷtg is calculated according to ŷdyn,tg and ŷstat,t, as follows:

ŷtgstat,tdyn,tg


Where ŷtg denotes an estimation of the number of photons that are emitted from a certain voxel u∈U and detected by detector t at gate g. It should be noted that unlike the calculation of ytg, the calculation of ŷtg does not take into account the integration time and the sensitivity factor.


Then, for each gate g, the numerator numg(u) is evaluated as follows:








numg(u)=Σt st·Ttg·ϕt(u)·(ytg/ŷtg−1)







Where numg(u) sums, over all the detectors, the deviation between the number of photons that are emitted from voxel u and detected in detector t at gate g and the estimation thereof, wherein the sensitivity and the integration time of each detector t are taken into account. It should be noted that the calculation can be directly extended to an ordered sets method or any of its variations by summing the deviation over subsets of the group of detectors.


Based thereupon, the numerator num(u) is evaluated as follows:

num(u)=Σgnumg(u)


Where num(u) is a sum of all the numerators that are evaluated for every g∈G.


Istat(u) and Idyng(u) are updated according to the calculation of the aforementioned scales and numerators, as follows:












Istat(u)=Istat(u)+(num(u)/scale(u))·Istat(u)

Idyng(u)=Idyng(u)+(numg(u)/scaleg(u))·Idyng(u)

















The updated Istat(u) and Idyng(u) are stored and used during the next iteration, as shown at step 1306. Steps 1303-1306 are repeated iteratively until the reconstruction of the input overall volume has reached a desired quality.


Preferably, in order to determine whether the reconstruction has reached a desired quality, as shown at 1306, the number of gated voxels with activity above a predefined threshold is checked. For example, the number of gated voxels with an activity level within 20% of the maximal gated voxel intensity value is checked.


When the time binning process has been completed, as shown at 1306, a gated reconstructed image can be generated as follows:

Ig(u)=Istat(u)+Idyng(u)
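For illustration only, the following Python sketch (using NumPy with small, arbitrary array sizes and synthetic counts) outlines one iteration of the separated static/dynamic update along the lines of the formulas above. The array shapes, the test data, and the numerical safeguards are assumptions for demonstration; ordered subsets, sparse storage of ϕt(u), attenuation, and the stopping criterion are omitted.

    # Illustrative sketch: one iteration of the separated static/dynamic (gated) update.
    import numpy as np

    rng = np.random.default_rng(0)
    T, U, G = 64, 200, 8                       # detectors, voxels, gates (illustrative sizes)

    phi = rng.random((T, U)) * 0.01            # phi_t(u): detection probabilities
    s = rng.uniform(0.8, 1.2, size=T)          # s_t: detector sensitivities
    T_tg = np.full((T, G), 0.1)                # T_t^g: per-gate integration times
    T_t = T_tg.sum(axis=1)                     # T_t = sum_g T_t^g

    z_dyn = np.zeros(U, dtype=bool)            # Z_dyn(u): gated (dynamic) voxels
    z_dyn[:40] = True

    I0 = np.ones(U)                            # constant input image I_0(u)
    I_stat = I0 * ~z_dyn                       # I_stat(u) = I_0(u) * (1 - Z_dyn(u))
    I_dyn = np.tile(I0 * z_dyn, (G, 1))        # I_dyn^g(u) = I_0(u) * Z_dyn(u), per gate g

    y = rng.poisson(5.0, size=(T, G)).astype(float)        # y_t^g: measured counts (synthetic)

    scale_g = np.einsum('t,tg,tu->gu', s, T_tg, phi)       # scale_g(u) = sum_t s_t T_t^g phi_t(u)
    scale = scale_g.sum(axis=0)                            # scale(u) = sum_g scale_g(u)

    for _ in range(1):                                     # one iteration of steps 1304-1306
        y_hat_stat = s * T_t * (phi @ I_stat)              # yhat_stat,t
        y_hat_dyn = (s * T_tg.T) * (phi @ I_dyn.T).T       # yhat_dyn,t^g, shape (G, T)
        y_hat = y_hat_stat[None, :] + y_hat_dyn            # yhat_t^g = yhat_stat,t + yhat_dyn,t^g
        ratio = y.T / np.maximum(y_hat, 1e-12)             # y_t^g / yhat_t^g, shape (G, T)
        num_g = np.einsum('t,tg,tu,gt->gu', s, T_tg, phi, ratio - 1.0)
        num = num_g.sum(axis=0)
        I_stat = I_stat + (num / np.maximum(scale, 1e-12)) * I_stat
        I_dyn = I_dyn + (num_g / np.maximum(scale_g, 1e-12)) * I_dyn

    I_g = I_stat[None, :] + I_dyn                          # gated reconstructed images I^g(u)
    print(I_g.shape)                                       # (G, U)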


Reference is now made to FIG. 18, which is a graphical representation of a one-dimensional vector Ic of voxels that represents the reconstruction of the input overall volume. Preferably, all the non-gated voxels Istat(u) that represent static regions of the input overall volume U are arranged first 1101 within the vector Ic. The non-gated voxels are followed by gated voxels that comprise a set of different frames, in consecutive order, arranged in clusters. Each cluster represents the dynamic volumetric region of the input overall volume at a certain frame. The frames are denoted by 1102A, . . . , 1102G. The frames can be arranged in any predefined order.
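For illustration only, the following Python sketch shows one way the arrangement of FIG. 18 might be realized in memory: the non-gated voxels stored once, followed by one cluster of gated voxels per frame. The sizes and the ordering of the frames are assumptions for demonstration.

    # Illustrative sketch: packing the reconstruction into a single vector Ic,
    # non-gated (static) voxels first, then one cluster of gated voxels per frame/gate.
    import numpy as np

    U, G = 200, 8
    z_dyn = np.zeros(U, dtype=bool)
    z_dyn[:40] = True

    I_stat = np.random.default_rng(3).random(U) * ~z_dyn      # static image, zero on dynamic voxels
    I_dyn = np.random.default_rng(4).random((G, U)) * z_dyn   # one dynamic image per gate

    static_part = I_stat[~z_dyn]                               # non-gated voxels, stored once
    dynamic_parts = [I_dyn[g][z_dyn] for g in range(G)]        # gated voxels, one cluster per frame
    Ic = np.concatenate([static_part] + dynamic_parts)

    print(len(Ic))   # (U - N_dyn) + G * N_dyn = 160 + 8*40 = 480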


As described above, ϕt(u) is a standard functional matrix that depicts the detection probability of a photon emitted from a voxel u∈U to be detected by a detector t. Since ϕt(u) is a sparse matrix, the number of math operations can be reduced by defining ϕtg(u) which is zero wherever Idyng(u) is zero.
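For illustration only, the following Python sketch demonstrates the sparsity argument: zeroing the functional matrix outside the dynamic region leaves the forward projection of Idyng(u) unchanged while reducing the number of stored elements. The matrix contents and sizes are assumptions for demonstration.

    # Illustrative sketch: restricting the (sparse) functional matrix to the dynamic region.
    import numpy as np
    from scipy.sparse import csr_matrix

    T, U = 64, 200
    rng = np.random.default_rng(1)
    phi_dense = rng.random((T, U)) * (rng.random((T, U)) < 0.05)   # sparse-ish probabilities
    z_dyn = np.zeros(U, dtype=bool)
    z_dyn[:40] = True

    phi = csr_matrix(phi_dense)                       # phi_t(u), stored sparsely
    phi_dyn = csr_matrix(phi_dense * z_dyn)           # zero wherever the voxel is outside the dynamic region

    I_dyn_g = np.ones(U) * z_dyn
    print(phi.nnz, phi_dyn.nnz)                             # fewer stored elements in the restricted matrix
    print(np.allclose(phi @ I_dyn_g, phi_dyn @ I_dyn_g))    # identical forward projection of I_dyn^g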


Reference is now made jointly to FIG. 15, previously described, and to FIG. 19, which is a flowchart that depicts another method for reconstructing an input overall volume using anatomically varying time-bin lengths, according to another embodiment of the present invention. In the method depicted in FIG. 19, the static region is estimated only once, according to the initial reconstruction process step.


The method depicted in FIG. 19 is based on the assumption that the non-gated static region equals the average of the gated dynamic region image reconstructions. Though the assumption is not accurate, it is expected to be sufficient for the reconstruction of the input overall volume. As the static region is calculated only once, the memory usage and the computational complexity decrease.


Steps 1301 and 1302 are as depicted in FIG. 17. During steps 1301 and 1302, an initial estimation image I(u) is obtained. Then, as shown at 1310 and 1311, I(u) is segmented into dynamic and static regions, preferably according to the following equations:

Istat(u)=I(u)·(1−Zdyn(u))
Idyng(u)=I(u)·Zdyn(u), for each g∈G


In order to reduce the computational complexity of the following iterative process, ŷstat,t is evaluated in advance as follows:

ŷstat,t=stTtΣuϕt(u)Istat(u)


The previously described method uses the standard functional matrix ϕt(u), which is a representation of the probability of detecting a photon emitted from location u∈U by a detector t. The calculation of ϕt(u) requires high computational complexity, as all the voxels of the input overall volume have to be calculated. In order to reduce the computational complexity, a standard functional matrix ϕt,dyn(u), which is limited to dynamic voxels, is used. The limited standard functional matrix is defined as follows:








ϕt,dyn(u)=ϕt(u), if u∈dynamic region; ϕt,dyn(u)=0, otherwise








Then, for each g∈G, the scale on the dynamic region scaleg(u) is evaluated, as described above.


During the following step, as shown at 1312, the dynamic volumetric region Idyng(u) is calculated. In particular, in order to calculate Idyng(u), a number of sub-iterations take place. First, ŷdyn,tg is calculated using ϕt,dyn(u) as follows:

ŷdyn,tg=stTtgΣuϕt,dyn(u)Idyng(u)


Then, based on ŷdyn,tg and on ŷstat,t, which has been calculated in step 1310, ŷtg is calculated as follows:

ŷtgstat,tdyn,tg


Where ŷtg denotes an estimation of the number of photons that are emitted from a certain voxel u∈U and detected by detector t at gate g. It should be noted that unlike ŷdyn,tg, ŷstat,t is not recalculated during the iterative process.


Then, for each gate g, the numerator numg(u) is evaluated as follows:








numg(u)=Σt st·Ttg·ϕt(u)·(ytg/ŷtg−1)







Based on the scale, which has been calculated before the iterative process, and on the numerator, which is calculated according to radiation emitted from the dynamic region and captured by the detectors, Idyng(u) is updated as follows:








Idyng(u)=Idyng(u)+(numg(u)/scaleg(u))·Idyng(u)








The updated Idyng(u) is stored and used during the next iteration, as shown at step 1316.


Then, as shown at 1313, the input overall volume is reconstructed using the updated dynamic region and the static region. As shown at 1314, steps 1311-1314 are repeated iteratively until the reconstruction of the input overall volume has reached a desired quality. At the end of each one of the iterations, the dynamic region is updated, as described above.


When the iterative process has been completed, as shown at 1314, a gated reconstructed image can be generated as follows:

Ig(u)=Istat(u)+Idyng(u)


The gated voxels in the dynamic volumetric region 1003 may represent ischemic regions of the heart. The radiation emitted from such ischemic regions may have specific radiation patterns, such as a center with low radiation. Thus, such regions can be reconstructed using morphological closing methods or by taking into account the typical shape of the heart (one way is by fitting an ellipsoid to the edges in the image, but other methods may also be used).


Reference is now made to FIG. 20, which is a graphical representation of a position of two selected sub-regions in two sequential frames. As described above, the reconstruction of the dynamic volumetric region is based on time binning of a number of consecutive frames that depict a dynamic organ, such as the heart. As each frame is based on a number of gated images, it involves a high computational load.


In one embodiment of the present invention, the set of frames is a set of sequential images that depict the heart. As all the frames depict the same input overall volume and as the heart has an expected movement pattern, one or more frames can be used to estimate another. In such a manner, fewer frames are calculated and therefore the computational complexity of the reconstruction decreases. Preferably, during the time binning, geometrical information from one or more prior frames assists in the reconstruction of one or more subsequent frames. Such geometric prior methods are generally well known and therefore are not described here in greater detail.


For example, FIG. 20 depicts two sequential frames, frame i−1 1054 and frame i 1056, which are taken from a sequence of M frames. The sub-regions 1057A and 1059A in frame 1054 schematically represent two different regions of the heart 1055. The regions 1057B and 1059B of frame 1056 schematically represent the respective positions of regions 1057A and 1059A in frame 1056. The change in the positions is an outcome of the movement of the heart 1055. The vector T1 represents the movement of the region 1057B relative to the region 1057A and the vector T2 represents the movement of the region 1059B relative to the region 1059A. T1 and T2 can be used to estimate the positions of these regions in one or more additional frames, in some of the embodiments of the present invention. See, Green P, Bayesian Reconstructions from Emission Tomography Data using a Modified EM Algorithm, IEEE Trans. on Medical Imaging, vol. 9, no. 1, March 1990, pp. 84-93, and Fessler J, 2004 NSS/MIC statistical image reconstruction short course notes entitled "Statistical Methods For Image Reconstruction", www(dot)eecs(dot)umich(dot)edu/fessler/papers/files/talkIO4/mic(dot)notes(dot)Pdf, which are incorporated in their entirety by reference into the specification. Preferably, voxels that represent the same anatomical location are stated to be alike. Therefore, voxels from different frames that represent the same anatomical location can form a clique, or a neighborhood, as defined by the Gibbs prior equation. The strength of applying such a geometric prior may depend on the movement amplitude and on the gate phase.
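For illustration only, the following Python sketch shows a simplified use of an estimated movement vector, such as T1, to predict where a sub-region of a prior frame will appear in a subsequent frame. This is only a crude stand-in for the geometric prior methods cited above; the frame size, the region, and the vector are assumptions for demonstration.

    # Illustrative sketch: predicting a sub-region's position in the next frame by shifting
    # the previous frame by an estimated movement vector.
    import numpy as np
    from scipy.ndimage import shift

    def predict_next_frame_region(frame, translation):
        """Shift a frame by the estimated per-region translation vector (dy, dx)."""
        return shift(frame, translation, order=1, mode="constant", cval=0.0)

    frame_prev = np.zeros((32, 32))
    frame_prev[10:14, 10:14] = 1.0                     # a sub-region of the heart in frame i-1
    t1 = (2.0, 1.0)                                    # estimated movement vector (illustrative)
    predicted = predict_next_frame_region(frame_prev, t1)
    print(np.unravel_index(np.argmax(predicted), predicted.shape))   # shifted position in frame i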


Dynamically Varying Time-Bin Lengths


There are times when dynamically varying time-bin lengths are desired. For example, Tc-99m-teboroxime has an uptake curve (FIG. 5B) which is very steep during the uptake and which becomes less so during the washout. Thus, different time-bin lengths may be desired for different portions of the Tc-99m-teboroxime uptake curve. Similarly, different radiopharmaceuticals have different uptake curves, and dedicated time-bin lengths may be desired for each radiopharmaceutical, and for different portions of their respective uptake curves. Moreover, the cardiac RR cycle has very steep periods, during the rise and fall of the R peak (FIG. 5F), followed by periods that are nearly flat as a function of time. Again, time bin lengths of different durations may be employed for the different portions of the RR cycle. Furthermore, while the actual region of interest, for example, the heart, requires imaging at a very high level of accuracy, adjacent regions, for example, the chest muscle, may be of lesser interest, and may be viewed at time bins of greater lengths. Additionally, continuous acquisition mode may require shorter time-bin lengths than stop and shoot mode.


For example, the actual rise and fall of the R peak may be gated at time bins of 10 milliseconds, while the nearly leveled U-wave may be gated at 100 milliseconds. Similarly, while the heart muscle may be gated at an average time bin of 50 milliseconds, the adjacent chest muscle may be gated at time bins of 1 second and longer. It will be appreciated that other values may similarly be employed.


In accordance with embodiments of the present invention, a lookup system of recommended time-bin lengths may be provided, for specifying recommended time-bin lengths as functions of one or more of the following:


a specific region of interest;


an administered radiopharmaceutical;


time elapsed since the administration of the radiopharmaceutical;


cardiac state with respect to an RR cycle;


a view of the detecting unit 12, with respect to the region of interest;


patient general data; and


data acquisition mode.


The lookup system may be, for example, tables or curves.


Thus the dynamic SPECT camera 10 may be configured for time binning at dynamically varying time-bin lengths, by providing communication between the timing mechanism 30 and the lookup system, wherein the timing mechanism is configured for selecting a recommended time-bin length from the lookup system, for each time bin.
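For illustration only, the following Python sketch shows one possible form of such a lookup system. The entries echo the examples given above (about 10 milliseconds around the R peak, about 50 milliseconds for the heart muscle on average, and about 1 second for the chest muscle); the remaining keys and the fallback value are assumptions for demonstration.

    # Illustrative sketch of a lookup system for recommended time-bin lengths.
    RECOMMENDED_BIN_LENGTHS_SEC = {
        # (region of interest, cardiac state) -> recommended time-bin length in seconds
        ("heart", "r_peak"):   0.010,
        ("heart", "u_wave"):   0.100,
        ("heart", "average"):  0.050,
        ("chest_muscle", None): 1.000,
    }

    def recommended_bin_length(region, cardiac_state=None, default=1.0):
        """Return a recommended time-bin length for the given region and cardiac state."""
        for key in ((region, cardiac_state), (region, None)):
            if key in RECOMMENDED_BIN_LENGTHS_SEC:
                return RECOMMENDED_BIN_LENGTHS_SEC[key]
        return default

    print(recommended_bin_length("heart", "r_peak"))       # 0.01
    print(recommended_bin_length("chest_muscle"))          # 1.0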


Clearly, if the input image has been segmented into gated and non-gated regions, as described above, only the gated region is gated at time bins of dynamically varying time-bin lengths.


Dynamically Varying Spectral Bins


It is sometimes of value to image only a specific spectral bin so as to eliminate scatter or contributions from other radiopharmaceuticals. Additionally, it may be of value to image several spectral bins simultaneously, for different radiopharmaceuticals, wherein different groups of detecting units are dedicated to different spectral bins.


Thus, the dynamic SPECT camera 10 may be configured for dynamically determining a spectral energy bin for each detecting unit 12, as follows:


providing a spectral selection mechanism 56 (FIG. 1A), for enabling a selection of a spectral energy bin to be used for each detecting unit 12, independently from the other detecting units 12; and


a lookup system of recommended spectral energy bin values, as functions of at least one of a specific region of interest, an administered radiopharmaceutical, time since the administration of the radiopharmaceutical, a view of the detecting unit with respect to the region of interest, and patient's details;


wherein the spectral selection mechanism 56 is further configured for dynamically determining the spectral energy bin for each detecting unit, as functions of the specific region of interest, the administered radiopharmaceutical, the time elapsed since the administration of the radiopharmaceutical, the view of the detecting unit with respect to the region of interest, and patients' details, from the lookup system.


The spectral energy bin is designed to include a primary photon energy ±10%, or the primary photon energy ±7%, or the primary photon energy ±5%.
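For illustration only, the following Python sketch computes such a spectral energy bin as the primary photon energy plus or minus a given percentage; the Tc-99m photopeak of about 140.5 keV used in the example is a standard value, and the function itself is an assumption for demonstration.

    # Illustrative sketch: computing a spectral energy bin around a primary photon energy.
    def spectral_bin(primary_energy_kev, percent):
        """Return (low, high) bounds of an energy window of primary energy +/- percent."""
        half_width = primary_energy_kev * percent / 100.0
        return primary_energy_kev - half_width, primary_energy_kev + half_width

    for pct in (10, 7, 5):
        low, high = spectral_bin(140.5, pct)
        print(f"+/-{pct}%: {low:.1f}-{high:.1f} keV")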


Additionally, at least two radiopharmaceuticals may be administered and viewed by different groups of detecting units, each group being configured for a different spectral energy bin, so as to view each radiopharmaceutical in the same region independently of the other radiopharmaceutical.


The spectral selection mechanism may be a hardware unit or software.


The spectral selection may be performed during data acquisition, or later.


Intracorporeal Dynamic SPECT Camera


Referring further to the drawings, FIGS. 6A-6I describe the dynamic SPECT camera 10 as an intracorporeal dynamic SPECT camera 10, which includes a single assembly 20, preferably configured for oscillatory sweeping motion around its longitudinal axis—the z axis, as described by the arrow 60. The blocks 18 may be further configured for oscillatory sweeping motion in an orthogonal direction, as described by the arrows 70. An end block 18′ may be further configured for motion, for example, as described by the arrow 70′. It will be appreciated that other motions are similarly possible, for example, oscillatory lateral motions, or rotational motions. For example, the arrow 90 describes the oscillatory lateral motion along the z axis of the assembly 20.


An ultrasound transducer 45 may be included with the intracorporeal dynamic SPECT camera 10.


Other features of the intracorporeal dynamic SPECT camera 10 are as described for the dynamic SPECT camera 10 of FIGS. 1A-1D.



FIG. 6A illustrates the intracorporeal dynamic SPECT camera 10 as a single rigid unit, for example, for rectal or vaginal insertion. FIG. 6C illustrates the intracorporeal dynamic SPECT camera 10 as having an intracorporeal portion 44, an extracorporeal portion 42, and a cable 46, for example, for insertion into the esophagus.



FIGS. 6D and 6E illustrate motions of the blocks 18, as described by the arrows 70. FIGS. 6F-6I illustrate motion of the assembly 20, as described by the arrow 60.


Reconstruction with Object Implantation


As described above, the reconstruction of the radioactive emission image is based on datasets that have been acquired from a certain overall volume, such as the thorax, with objects having known volume and structure, such as the heart. As the reconstructed volumetric region has a known structure and comprises organs with estimated structure, relative location, and volume, the throughput of the reconstruction can be reduced.


The reconstruction process is an iterative process. During each step, the reconstruction of the overall volume and one or more volumetric regions thereof are being refined. Preferably, one or more object models, which are defined according to an image, such as a CT or an MRI image, an anatomical atlas, or other accurate reconstructions of respective objects, are used to improve and enhance the reconstruction process.


Object implantation proceeds as follows: after a few iterations, which provide a general idea of both:


i. the location and general shape of an organ in question, such as a heart, lungs, a stomach, visceral background elements, etc.; and


ii. an estimation of the expected number of photons that are emitted from different portions of the organ in question,


the general shape and photon counts of the organ in question are replaced by an implanted model, based on a CT image, an MRI image, an anatomical atlas, or the like, thereby providing both:

i. a better definition of edges between the organ in question and the surroundings; and


ii. some analytical evaluation of the expected number of photons that are emitted from different portions of the organ in question, based on the first few iterations; for example, given that the organ is a heart, average count values for the blood and for the heart muscle, respectively, may be used for the different areas of the model. It will be appreciated that an anatomic construction of voxels may be employed with the voxel implantation.


In this manner, object implantation improves the reconstruction that is based on counting statistics.


It will be appreciated that object implantation may be employed once or several times during the reconstruction process, each time, providing a better starting point for the next iteration.


Object implantation addresses the problem that, during the first steps of the reconstruction, a blurry radioactive emission image of the overall volume is obtained, as described above. An organ, such as the heart, can be identified in the blurry radioactive emission image according to a cluster of voxels with expected values in an expected relative position. The values of voxels in such a cluster can be adjusted or changed according to a respective object model. For example, if after a certain number of iterations a cluster of voxels in the upper right section of the overall volume has voxels with a certain average expected value, the cluster can be identified as the heart. As it is known that the number of photons emitted from voxels of the heart is relatively high, the values of voxels in the related cluster are adjusted or changed to relatively high values, according to the object model. In such a manner, the actual shape of the heart can be reconstructed more efficiently.


Reconstruction Using a Minimal Number of Gray Levels


Strictly speaking, variations in radioactive emission activity between different voxels can be infinite, one voxel showing a photon count of 17,584 for a given time period, and another a photon count of 18,900 for the same time period. Yet, to a doctor interested in identifying background muscles, heart muscles, or blood, and further interested in differentiating between healthy heart muscle, ischemic muscle, and dead tissue, a few levels of gray, for example, between 5 and 10 levels, or gradations, of gray, may be sufficient. Thus, reconstruction need not be carried out in order to evaluate an accurate photon count per voxel, but merely to determine the level of gray, from amongst 5-10 levels of gray, per voxel.


As described above, the reconstruction of the radioactive emission image is based on datasets that have been acquired from a certain overall volume. The reconstruction is performed by summing up the photons that are emitted from voxels of the overall volume. The sums of the emitted photons are translated to gray level values, which are used to reconstruct the radioactive emission image. Preferably, in order to determine whether the reconstruction has reached a desired quality, the number of voxels with activity above a predefined threshold is checked. For example, the number of voxels with an activity level within 20% of the maximal voxel intensity value is checked. Other criteria may be determined in order to evaluate whether the reconstruction has reached a desired quality. Threshold values are preferably chosen empirically to yield accurate reconstruction of the overall volume. This analysis applies to gated and ungated regions alike.


Preferably, the values of the voxels are mapped to a limited number of gray level values, such as 5, 7, 8, 9, or 10. By limiting the number of gray level values, a coarser radioactive emission image is generated and the computational load of the reconstruction process is reduced. Though such a limitation reduces the sharpness and the contrast level of the radioactive emission image, the coarsened radioactive emission image still retains enough information to allow a physician to identify ischemic regions in the overall volume. It should be noted that such a mapping may be used separately on one or more volumetric regions of the overall volume, preferably according to the dynamic characteristics thereof.
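For illustration only, the following Python sketch maps voxel values onto a small number of gray levels; the synthetic test volume and the choice of 8 levels are assumptions for demonstration.

    # Illustrative sketch: quantizing reconstructed voxel values to a few gray levels.
    import numpy as np

    def quantize_gray_levels(volume, levels=8):
        """Map voxel values linearly onto integer gray levels 0..levels-1."""
        vmin, vmax = float(volume.min()), float(volume.max())
        if vmax == vmin:
            return np.zeros_like(volume, dtype=np.uint8)
        normalized = (volume - vmin) / (vmax - vmin)
        return np.minimum((normalized * levels).astype(np.uint8), levels - 1)

    volume = np.random.default_rng(2).poisson(20.0, size=(16, 16, 16)).astype(float)
    coarse = quantize_gray_levels(volume, levels=8)
    print(np.unique(coarse))     # at most 8 distinct gray levels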


Image Acquisition Modes


In accordance with embodiments of the present invention, several image acquisition modes are available, as follows:


In a continuous acquisition mode, also referred to as fanning, data is acquired while the array, the assembly, or the block is in continuous motion. Continuous acquisition mode may apply also to oscillatory motions, although strictly speaking there is a momentary pause with each change of direction. This mode leads to some blurring of the data, but it does not require the array, assembly, or block to stabilize between periods of motion and periods of stationary data acquisition.


In a stop and shoot acquisition mode, incremental travels are followed by stationary acquisition intervals. This mode leads to better resolution, yet it requires a damping period, to allow the array, assembly, or block to stabilize between periods of motion and periods of stationary data acquisition, as discussed hereinbelow, under the heading, “Stability and Damping Time”.


Interlacing is a fast stop and shoot acquisition mode with oscillatory motion, for example, sweeping oscillatory motion, wherein rather than stopping at each predetermined location with each sweep, the odd locations are visited on a right sweep and the even locations are visited on a left sweep, or vice versa, so that each sweeping direction stops at different locations.


Prescanning relates to a fast prescan of a subject undergoing diagnosis, to identify a region-of-interest, and thereafter collect higher quality data from the region-of-interest. A prescan according to the present invention may be performed by the dynamic SPECT camera 10, preferably, with interlacing, or in a continuous mode, or by any other imaging device, including, for example, ultrasound or MRI.


Stability and Damping Time


Stop and shoot acquisition mode involves discontinuities in motion between travel and shooting modes, and at the beginning of each shooting mode, the assemblies 20 must be allowed to stabilize till vibrations are less than about ±0.25 mm, so as not to interfere with the acquisition.


Prior art SPECT cameras must allow for a damping time of about 5 seconds, but the dynamic SPECT camera 10, according to embodiments of the present invention reaches stability in about 1 second or less.



FIG. 7 schematically illustrates the assembly 20, according to embodiments of the present invention. The damping time for the assembly 20 may be described as:

Damping Time=C×[(1/12)·M·(T²+W²)+M·X0²],

wherein:


M is the mass of the assembly 20;


T is the thickness of the assembly 20;


W is the width of the assembly 20;


X0 is the axis of rotation; and


C is a constant that depends on the braking force applied to the assembly 20.


The factor 1/12 is calculated assuming the assembly proximal end is tangential to the sweeping path.


As the damping time equation illustrates, the damping time is highly dependent on both the axis of rotation X0 and the mass of the assembly 20.


In the present case, the axis of rotation is that of the sweeping motion described by the arrow 60 (FIG. 1A), which is considerably shorter than an axis of rotation around the body 100.


Similarly, the mass of a single assembly is far less than that of a conventional SPECT camera.


Possible values for the assembly 20, according to embodiments of the present invention may be:


Weight of the assembly 20≈1.5 kg.


Thickness of the assembly 20≈5 cm.


Width of the assembly 20≈7 cm.


As such, the assembly is designed with a damping time constant of under 50 msec, during which the vibration amplitude subsides to under 0.25 mm.
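For illustration only, the following Python sketch evaluates the bracketed term of the damping time expression with the example values above; X0 and the braking constant C are not specified in the text, so the X0 value used here is an assumption for demonstration.

    # Illustrative sketch: evaluating the bracketed term (1/12)*M*(T**2 + W**2) + M*X0**2.
    M = 1.5          # kg, mass of the assembly
    T = 0.05         # m,  thickness (about 5 cm)
    W = 0.07         # m,  width (about 7 cm)
    X0 = 0.10        # m,  assumed distance of the rotation axis (illustrative only)

    bracket = (1.0 / 12.0) * M * (T**2 + W**2) + M * X0**2
    print(f"{bracket:.6f} kg*m^2")    # damping time = C * bracket, for some braking constant C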


It will be appreciated that the present example applies to both extracorporeal and intracorporeal dynamic cameras.


Stationary Dynamic SPECT Camera


It may be desired to perform imaging, especially prescanning with a stationary camera, that is without motion, for the following reasons:


1. in continuous acquisition mode, the blurring produced by the motion is eliminated;


2. in stop and shoot acquisition mode, the time spent in motion is avoided, as are the vibrations, associated with the discontinuities between the motions and the stationary intervals.


In general, a stationary camera does not provide sufficient viewing positions and detecting units, yet the camera may be specifically designed to provide those, to a desired level.


Preferably, the assemblies 20 are positioned at optimal positions prior to imaging, and imaging takes place while the camera is stationary.


Thus, in accordance with embodiments of the present invention, there is provided a stationary dynamic SPECT camera 10, which is described herein with reference to FIGS. 1A-1D. The stationary dynamic SPECT camera 10 comprises:


the overall structure 15, which defines proximal and distal ends with respect to a body;


the first plurality of the assemblies 20, arranged on the overall structure 15, forming an array 25 of the assemblies 20, each assembly 20 comprising:

    • a second plurality of detecting units 12, each detecting unit 12 including:
    • a single-pixel detector 14, for detecting radioactive emissions; and
    • a dedicated collimator 16, attached to the single-pixel detector, at the proximal end thereof, for defining a solid collection angle δ for the detecting unit; and
    • an assembly motion provider 40, configured for providing the assembly 20 with individual assembly motion with respect to the overall structure, prior to the acquisition of radioactive-emission data;


a position-tracker 50, configured for providing information on the position and orientation of each of the detecting units 12, with respect to the overall structure 15, during the individual motion,


the stationary dynamic SPECT camera 10 being configured for acquiring a tomographic reconstruction image of a region of interest while stationary, for the whole duration of the tomographic image acquisition.


Preferably, the region of interest is about 15×15×15 cubic centimeters, and the tomographic image may be acquired during an acquisition time of 60 seconds, at a spatial resolution of at least 20×20×20 cubic millimeter.


Additionally, the tomographic image may be acquired during an acquisition time of 30 seconds, at a spatial resolution of at least 20×20×20 cubic millimeter.


Furthermore, the tomographic image may be acquired during an acquisition time of 60 seconds, at a spatial resolution of at least 10×10×10 cubic millimeter.


Additionally, the tomographic image may be acquired during an acquisition time of 30 seconds, at a spatial resolution of at least 10×10×10 cubic millimeter.


Preferably, the structure 15 conforms to the contours of the body 100, for acquisition with substantial contact or near contact with the body.


Additionally, the assemblies 20 in the array 25 are configured to provide stereo views in a plane and cross views.


Anatomic Construction of Voxels


Anatomic construction of voxels avoids the smearing effect of a rigid voxel grid construction, where different tissue types, for example, blood and muscle, appear in a same voxel. This is important especially for perfusion studies, where the perfusion of blood into the tissue is sought.


Reference is now made to FIGS. 8A and 8B, which schematically illustrate a rigid voxel grid construction and an anatomic construction of voxels, respectively, in accordance with the present invention.



FIGS. 8A and 8B illustrate a heart 200, having atria 202 and 204, chambers 206 and 208, and a muscle 218.


As seen in FIG. 8A, a rigid voxel construction 202 causes smearing of the different tissue types. However, as seen in FIG. 8B, blood and muscle tissues are anatomically divided into different voxels, allowing an accurate study of perfusion. The atria and chambers are divided into an anatomic voxel system 222, or into an anatomic voxel system 224, while the muscle is divided into a relatively coarse voxel system 226, or into a finer voxel system 228, as desired. It will be appreciated that the anatomic voxels may vary in volumetric region. For example, since ischemia is not relevant to the atria and chambers, they may be divided into coarse voxels, while the heart muscle may be divided into fine voxels.


As further seen in FIG. 8B, the rigid voxel construction 202 may still be applied to the surrounding chest muscle.


It will be appreciated that parametric equations, such as F(1) and F(2), may be created and used in the anatomic construction of the voxels.


The following describes methods for obtaining the anatomic construction of voxels.


A first method for the anatomic construction of voxels includes:


providing a structural image of a region of interest, such as a heart;


constructing an anatomic system of voxels, for the region of interest, in which voxel boundaries are aligned with boundaries of structural objects of the region of interest, based on the structural image;


performing radioactive-emission imaging of the region of interest, utilizing the anatomic system of voxels; and


performing reconstruction of the radioactive-emission imaging, utilizing the anatomic system of voxels.


Preferably, the structural image is provided by a structural imager, selected from the group consisting of 2-D ultrasound, 3-D ultrasound, planar x-rays, CT x-rays, and MRI.


Additionally, the structural imager is co-registered to a radioactive-emission imaging camera which performs the radioactive-emission imaging.


Moreover, attenuation correction of the radioactive-emission imaging may be performed, based on the structural image.


Furthermore, the structural image and the radioactive-emission image, constructed with the anatomic voxels, may be displayed together.


Alternatively, the structural imager is not co-registered to a radioactive-emission imaging camera which performs the radioactive-emission imaging, and the method further includes corrections for misregistration.


Alternatively still, the structural image is provided from a lookup system, which is preferably corrected for the patient's details.


It will be appreciated that the anatomic construction of voxels may be based on fitting the boundaries of the structural objects to parametric equations and utilizing the parametric equations in the constructing of the anatomic system of voxels.


Additionally, the anatomic system of voxels includes voxels of varying volumetric regions, depending on their anatomic position and relevance.


Furthermore, the method includes time binning of the radioactive emissions to time periods not greater than substantially 30 seconds, or not greater than substantially 10 seconds, or not greater than substantially 1 second.


Additionally, the anatomic system of voxels includes voxels of varying volumetric regions, depending on the relevance of their dynamic activity.


An alternative method for the anatomic construction of voxels relates to the use of the radioactive-emission imaging itself for the anatomic construction, as follows:


providing a first system of voxels for a region of interest;


obtaining radioactive-emission data from the region of interest;


performing a first reconstruction, based on the radioactive-emission data and the first system of voxels, to obtain a first image;


correcting the first system of voxels, by aligning voxel boundaries with object boundaries, based on the first image; thus obtaining a second system of voxels;


performing a second reconstruction, based on the radioactive-emission data and the second system of voxels, thus obtaining a second image.


Alternatively, a set of radioactive emission data is obtained, possibly with a second injection, in order to concentrate the viewing on the anatomic voxels, as follows:


providing a first system of voxels for a region of interest;


obtaining a first set of radioactive-emission data from the region of interest;


performing a first reconstruction, based on the first set of the radioactive-emission data and the first system of voxels, to obtain a first image;


correcting the first system of voxels, by aligning voxel boundaries with object boundaries, based on the first image; thus obtaining a second system of voxels, which is anatomically based;


obtaining a second set of radioactive-emission data from the region of interest, based on the second system of voxels, which is anatomically based; and


performing a second reconstruction, based on the second set of the radioactive-emission data and the second system of voxels, thus obtaining a second image.


Anatomic Modeling


Bull's Eye, or polar map, is a semi-automatic method for the quantification and evaluation of coronary artery disease from SPECT tomograms obtained by marking the myocardium with Tl-201 or Tc-99m-MIBI. The polar map is computed from cross-sectional slices of the Left Ventricle (LV). For each slice, the center and a radius of search that contains the LV are determined, and the LV is divided into radial sectors. The maximum count value of each sector is computed, generating a profile. Profiles are plotted as concentric circles onto the map. The resulting map is a compression of 3D information (LV perfusion) onto a single 2D image.


Yet the bull's eye or polar map is reconstructed from a rigid geometry of voxels, for example, of 5×5×5 mm, or 4×4×4 mm, which cuts across tissue types, thus providing smeared information.


A voxel division that is based on an anatomical structure would be highly preferred, as it would allow the measurement of processes within and across anatomical features, substantially without the smearing effect. For example, if specific voxels are used to define blood regions, and others are used to define muscle regions, then diffusion across boundary membranes and other processes may be evaluated, substantially without a smearing effect.


An anatomical model is based on voxels that follow anatomical structures, which may be shaped, for example, as a sphere, a tube, or a shell segment, rather than as the standard cube.


When combined with a camera of high resolution and sensitivity and with gated measurements, anatomic modeling would be clearly advantageous over standard, rigid modeling, especially for kinetic studies, which are meaningful only with respect to specific tissue types.


In accordance with embodiments of the present invention, the polar map may be produced with a reduced number of amplitudes, or levels, for example, 7 levels of severity, or 5 levels of severity, from healthy to severe.


Kinetic Modeling


As part of the imaging and analysis processes, the camera may be able to produce a time series of 2D or 3D images, showing the reconstructed intensity in the overall volume and its changes over time.


Likewise, it may be desirable not to reconstruct the entire volumetric region but only limited segments of interest. In those segments, the resolution of the segment definition may be very important in order to minimize the partial volumetric region effect, which results in a biased estimate of the kinetic process.


In an exemplary embodiment, the analysis of the kinetic process may be performed after the reconstruction of the intensity in the entire volumetric region, or in the selected segments, has been done for a series of time points. In that case, each segment or location in the overall volume (u) has a list of intensity (I) values in time (t), and the list I(u,t) may be further analyzed to fit a parametric kinetic model.


Such a parametric kinetic model may be of a variety of kinds, depending on the modeling of the biological process. Examples of such models may be found in PCT/IL2005/001173.


In a simplistic example, the model may be

I(u,t)=B(t)·(1−e^(−k1(u)·t))·e^(−k2(u)·t)

where B(t) is a concentration in the blood, whether obtained from imaging a segment which is pure blood (e.g., a major blood vessel, or a volumetric region within the heart chamber), or known from other sources (an injection profile, other invasive or non-invasive measurements of the blood, etc.). k1(u) is the time constant representing a process of uptake into the tissue at segment u, and k2(u) is the time constant representing a process of washout from the tissue at segment u.


There may be many other models, and for example the equation above may take other forms such as

I(u,t)=B(t)*F1(k1(u),τ)*F2(k2(u),τ)

where * stands for either a multiplication operation or, in most cases, convolution, and F1 and F2 represent processes. In an example, the effect of such a process on the intensity may be modeled, in linear cases, by convolution of the intensity in the blood with an impulse response of a linear process Fi(ki(u), τ). Each of these may include one or more time constants ki(u), and the time profile is described as a function of time τ. There may be one or more such processes Fi, for example 1 (e.g., uptake or decay only), 2 (e.g., simultaneous uptake and clearance processes), 3 (e.g., a combination with accumulation or metabolism), 4 or more.


A fitting process may be used between the reconstructed intensity, in the overall volume and time, and the parametric models mentioned above.


In another example, the parametric model may be incorporated into the reconstruction process. In this case, it is not necessary to perform reconstruction of the intensities I(u,t) in the overall volume and time and then use that information to extract the time constants of the biological processes ki(u).


Instead, the imaging equation

yn(t)~Poisson([Σu φn(u)·I(u,t)])

may be explicitly replaced with the model of the intensities

yn(t)~Poisson([Σu φn(u)·B(t)*F1(k1(u),τ)*F2(k2(u),τ)])

(where yn(t) is the number of photons measured from a viewing position n, with a probability function of the view φn(u)).


In this case, the reconstruction process (e.g., by Maximum Likelihood, Expectation Maximization, or other equation-solving techniques) is used to recover the best-fitting values of ki(u), instead of first recovering I(u,t) and then ki(u).


In some embodiments of the present invention, with the use of a camera directly intended to perform dynamic studies, the ability to avoid interim recovery of intensities in the 3D overall volume at various time periods may be a benefit, as the design of the scanning is optimized for the reconstruction of the kinetic parameters, and not necessarily for image quality at each time point.


Active Vision


The camera of the present invention may further include active vision, which relates to a method of radioactive-emission measurements of a body structure, comprising:


performing radioactive-emission measurements of the body structure, at a predetermined set of views;


analyzing the radioactive-emission measurements; and


dynamically defining further views for measurements, based on the analyzing.


Active vision may be used, for example, to better define an edge, by changing a view direction, to direct a saturating detecting unit away from a hot spot, to change the duration at a certain location, when a greater number of counts are required, or when sufficient counts have been obtained.


Reconstruction Stabilizer


The method of reconstruction employed by the present invention may further include a method for stabilizing the reconstruction of an imaged volumetric region, comprising:


performing an analysis of the reliability of reconstruction of a radioactive-emission density distribution of said volumetric region from radiation detected over a specified set of views; and


defining modifications to at least one of a reconstruction process and a data collection process to improve said reliability of reconstruction, in accordance with said analysis.


Additionally, the method may include calculating a measure of said reliability of reconstruction, said measure of reliability of reconstruction being for determining a necessity of performing said modifications.


Furthermore, the method may include:


providing a detection probability matrix defining a respective detection probability distribution of said volumetric region for each of said views;


calculating the singular values of said detection probability matrix; and


identifying singular values as destabilizing singular values.


Additionally, the method may include calculating a condition number of said probability matrix as a measure of said reliability of reconstruction.


It will be appreciated that this approach may result in non-uniform voxels, wherein the voxel volumetric region may increase or decrease as necessary to increase the reliability of the reconstruction.


View Selection


The present invention further utilizes a method of optimal view selection, as follows:


providing said volumetric region to be imaged;


modeling said volumetric region;


providing a collection of views of said model;


providing a scoring function, by which any set of at least one view from said collection is scorable with a score that rates information obtained from said volumetric region by said set;


forming sets of views and scoring them, by said scoring function; and


selecting a set of views from said collection, based on said scoring function for imaging said volumetric region.


Additionally, zooming in onto a suspected pathology may be performed by a two-step view selection, wherein once the suspected pathology is observed, that region of the volumetric region is modeled anew and a new collection of views is obtained specifically for the suspected pathology.


Experimental Results


Reference is now made to FIGS. 9A-9J, which schematically illustrate cardiac imaging of Tc-99m-Teboroxime, with the dynamic camera 10 in accordance with aspects of the present invention. The significance of the experimental data provided herein is the ability to successfully image Teboroxime, which, as FIG. 5B illustrates, is washed out of the body very quickly.



FIG. 9A provides anatomical landmarks, as follows:

    • Left Ventricle (LV)
    • Right Ventricle (RV)
    • Left Atrium (LA)
    • Right Atrium (RA)



FIG. 9B is a dynamic study input of blood pool, myocardium, and body timed activity.



FIG. 9C is a film-strip representation of a dynamic SPECT study, as follows:

    • First 2 minutes after Tc99m-Teboroxime* injection, 10 s/frame
    • Mid-ventricular slices (upper row: SA lower row: HLA)


Note how the intense blood pool activity at the center of the heart chambers gradually clears, while the myocardial uptake gradually intensifies.



FIG. 9D is a film-strip representation of a dynamic SPECT study, as follows:

    • First 4 minutes after Tc99m-Teboroxime* injection, 10 s/frame
    • Mid-ventricular slices (upper row: SA lower row: HLA)


Note how the intense blood pool activity at the center of the heart chambers gradually clears, while the myocardial uptake gradually intensifies.



FIG. 9E is a Movie representation of a dynamic SPECT study (SA), as follows:

    • First 4 minutes after Tc99m-Teboroxime* injection, 10 s/frame
    • Mid-ventricular SA slices.


Note how the intense blood pool activity gradually clears in the LV and RV cavities.


Note: The myocardial uptake gradually intensifies (the thin-walled RV is less intense).



FIG. 9F is a Movie representation of a dynamic SPECT study (SA), as follows:

    • First 4 minutes after Tc99m-Teboroxime* injection, 10 s/frame
    • Mid-ventricular SA slices.


Note how the intense blood pool activity gradually clears in the LV, RV, LA and RA cavities.


Note: The myocardial uptake gradually intensifies (the thin-walled RV and atria are less intense).



FIG. 9G is a Movie representation of a dynamic SPECT study (fast).



FIG. 9H is a Movie representation of a dynamic SPECT study (slow).



FIG. 9I represents volumetric region segmentation for separate tissue flow dynamics measurement.



FIG. 9J represents measured kinetic curves.



FIG. 10 is another experiment, illustrating time binning at a rate of 0.001 seconds.


Electronic Scheme for Fast Throughput


High-sensitivity detecting units, such as the room-temperature, solid-state CdZnTe (CZT) detectors utilized in the present embodiments, must be discharged frequently, as their high sensitivity can lead to rapid saturation. When a given detector saturates, the output count for the associated pixel no longer accurately reflects the number of incoming photons, but rather the maximum number that the detector is capable of absorbing. This inaccuracy may lead to errors during reconstruction. It is therefore important to perform readout often enough to avoid detector saturation.


The data channel from the assembly 20 (or the assembly 20 readout circuitry) to the signal processing components must be fast enough to handle the large quantities of data which are obtained from the detecting units 12.


The electronic scheme of the present embodiments preferably includes one or more of the following solutions for performing frequent detector unit readout, while maintaining high data throughput to prevent data channel saturation.


In a preferred embodiment, the dynamic SPECT camera 10 includes a parallel readout unit for performing parallel readout of emission count data. Parallel readout requires less time than serial readout (in which the pixels are read out in sequence), as multiple pixels may be read out in a single cycle without losing the information of the individual pixel counts. The readout rate can thus be increased without loss of data.


Parallel readout may be performed at many levels. Reference is now made to FIG. 11, which illustrates various levels of detector unit organization at which parallel readout may be performed. The present exemplary embodiment shows a single detector array 25, which includes three assemblies 20. Each assembly includes a plurality of blocks 18 of detector units 12. Each detecting unit 12 includes a single-pixel detector (FIG. 1D).


The parallel readout unit preferably performs parallel readout at the level of one or more of:


a) detecting units 12, each with its single-pixel detector 14;


b) blocks 18, which include a plurality of detecting units 12;


c) assemblies 20, which include a plurality of blocks 18; and


d) array 25, which includes a plurality of assemblies 20.


When the parallel readout unit performs parallel readout at the level of the detecting units 12, count values are read out in parallel from each of the electrically insulated single-pixel detectors 14. Each single-pixel detector 14 is discharged at readout, and the photon collection process begins anew.


When the parallel readout unit performs parallel readout at the level of the block 18, count values from each of the detecting units 12 are read out serially, however multiple blocks 18 are read out in parallel. This approach is less complex to implement than parallel readout of the detecting units 12, although it results in a certain reduction in readout rate to accommodate the serial readout. Again, the single-pixel detectors 14 are discharged at readout.


Similarly, when the parallel readout unit performs parallel readout at the level of the assembly 20, count values from each of the detecting units 12 in the assembly 20 are read out serially, however multiple assemblies 20 are read out in parallel.


Parallel readout preferably includes multiple detection, amplification and signal processing paths for each of the pixels, thereby avoiding saturation due to a single localized high emission area—“hot spot”. This is in contrast with the Anger camera, in which multiple collimators are associated with a single-pixel scintillation detector, and saturation of the scintillation detector may occur even due to a localized hot spot.



FIG. 12 illustrates an exemplary embodiment of parallel readout in the dynamic SPECT camera 10. Radioactive emissions are detected by pixelated CZT crystals, where each crystal is divided into 256 pixels. The crystal is part of a ‘CZT MODULE’ (B), which also includes two ASICs, each receiving events from 128 pixels. The ASIC is an OMS ‘XAIM3.4’, made by Orbotech Medical Systems, Rehovot, Israel, together with the CZT crystal. The two ASICs share a common output and transmit the data to an ‘ADC PCB’ (C), which handles four ‘CZT MODULEs’ (B) in parallel. Thus, a total of 1024 pixels are presently channeled through one ADC board. The system is capable of further increasing the accepted event rate by channeling every two ASICs through a single ADC. The ‘ADC PCB’ (C) transmits the data to the ‘NRG PCB’ (D), which handles ten ‘ADC PCB’s (C) in parallel, but could be further replicated should one want to further decrease “dead time”. The ‘NRG PCB’ (D) transmits the data to the ‘PC’ (E), where it is stored.


All in all, in the present embodiment, forty CZT MODULEs, which contain a total of 10,240 pixels, transmit in parallel to the PC.


The bottleneck, and hence the only constraint, of the system data flow is the ASICs in the ‘CZT MODULE’ (B) and the connection to the ‘ADC PCB’s (C):


1. An ASIC (128 pixels) can process one photon hit within 3.5 µsec, or about 285,000 events/sec over 128 pixels, i.e., over 2,200 events/px/sec, an exceedingly high rate.


2. Two ASICs share the same output, and hence coincident event outputs from the two ASICs in a ‘CZT MODULE’ (B) will cause a collision and information loss. The duration of an event output from the ASIC is 1 µsec.


When the readout is rapid, the rate at which the radiation emission data is read out of the single-pixel detectors 14 may be greater than the rate at which it may be output to the processor. One known solution for managing a difference between data arrival and data processing rates is to use a buffer. The buffer provides temporary storage of the incoming data, which is retrieved at a later time.


A buffered readout configuration can result in the loss of timing information, unless active steps are taken to preserve the time information associated with the collected emission data, for example, as taught hereinabove, under the heading, “The Timing Mechanism 30.”


In accordance with embodiments of the present invention, timing information is preserved. The electrical scheme may include a buffer which stores emission data along with timing information for each data item or group of data items (in the case where emission data from several detectors was obtained at substantially the same time, for example due to parallel readout), and an identification of the associated detector unit. Utilizing a buffer ensures that emission data may be collected from the detectors frequently enough to avoid saturation, even when the data channel throughput is limited. In stop and shoot mode, for example, the emission count data may be stored in the buffer for retrieval while the detector head is moving to the next location. Accurate reconstruction may thus be performed.


The camera readout circuitry is preferably designed to provide fast readout and detecting unit discharge. Fast readout circuitry may include fast analog and digital circuitry, fast A/D converters, pipelined readout, and so forth.


After the emission data has been read out of the single-pixel detectors 14, it may be necessary to convey the data to a processor for reconstruction as a single or limited number of data streams. The camera electronic scheme may include a multiplexer, for combining two or more emission data streams into a single data stream. The emission data may thus be conveyed to the processor over one physical link (or alternately over a reduced number of parallel links). For each radioactive emission event, the multiplexer includes both the timing information and an identification of the single-pixel detector 14 supplying the event. The multiplexed data may later be de-multiplexed by the processor, and reconstruction may be performed with complete information for each data item, including for example, total counts per single-pixel detector 14, per time bin, single-pixel detector location and orientation, and the time bin. Parallel readout may thus be performed, even when the collected data is to be output over a single data link.


Sensitivity Considerations


It will be appreciated that dynamic imaging with a SPECT camera has been attempted in the past, unsuccessfully, primarily because prior-art SPECT cameras are not sensitive enough to provide tomographic reconstruction images, for example, of a heart, with sufficient object resolution, for example, 10×10×10 cubic millimeters, in less than a minute.


As a case in point, U.S. Pat. No. 7,026,623, to Oaknin, et al., filed on Jan. 7, 2004, issued on Apr. 11, 2006, and entitled, “Efficient single photon emission imaging,” describes a method of diagnostic imaging in a shortened acquisition time for obtaining a reconstructed diagnostic image of a portion of a body of a human patient who was administered with a dosage of a radiopharmaceutical substance radiating gamma rays, using an Anger Camera and SPECT imaging. The method includes effective acquisition times from less than 14 minutes to less than 8 minutes. Oaknin, et al., do not claim an effective acquisition time of less than 7 minutes. Yet, in view of the section entitled, “Time Scale Considerations,” hereinabove, a sampling rate of about 8 minutes is far too slow for myocardial perfusion studies, where a sampling rate of at least two tomographic reconstruction images per minute, that is, about one every 30 seconds, is desired, and furthermore, where processes occur at rates of several seconds, and must be sampled at rates of a second or less, as seen in FIG. 5B.


The dynamic SPECT camera 10 in accordance with embodiments of the present invention achieves sensitivity sufficient for the required sampling rates of between every 30 seconds and every half a second, by combining several features, specifically intended to increase sensitivity, as follows:


a collimator 16 with a solid collection angle δ of at least 0.005 steradians, for a fast collection rate and high sensitivity, wherein the loss in resolution is compensated for by one or a combination of the following factors:


i. motion in a stop and shoot acquisition mode, at very small incremental steps, of between about 0.01 degrees and about 0.75 degrees.


ii. simultaneous acquisition by the assemblies 20, each scanning the same region of interest from a different viewing position, thus achieving both shorter acquisition time and better edge definitions.


iii. the structure 15 conforming to the body contours, for acquisition with substantial contact or near contact with the body.


Definition of a Clinically-Valuable Image


In consequence, the dynamic SPECT camera 10 is capable of producing a “clinically-valuable image” of an intra-body region of interest (ROI) containing a radiopharmaceutical, while fulfilling one or more of the following criteria:


1. the dynamic SPECT camera 10 is capable of acquiring at least one of every 5,000 photons emitted from the ROI during the image acquisition procedure, such as at least one of every 4,000, 3,000, 2,500, 2,000, 1,500, 1,200, 1,000, 800, 600, 400, 200, 100, or 50 photons emitted from the ROI. In one particular embodiment, the camera is capable of acquiring at least one of every 2,000 photons emitted from the ROI during the image acquisition procedure;


2. the dynamic SPECT camera 10 is capable of acquiring at least 200,000 photons, such as at least 500,000, 1,000,000, 2,000,000, 3,000,000, 4,000,000, 5,000,000, 8,000,000, or 10,000,000 photons, emitted from a portion of the ROI having a volume of no more than 500 cc, such as a volume of no more than 400 cc, 300 cc, 200 cc, 150 cc, 100 cc, or 50 cc. In one particular embodiment, the camera is capable of acquiring at least 1,000,000 photons emitted from a portion of the ROI having a volume of no more than 200 cc;


3. the dynamic SPECT camera 10 is capable of acquiring an image of a resolution of at least 7×7×7 mm, such as at least 6×6×6 mm, 5×5×5 mm, 4×4×4 mm, 4×3×3 mm, or 3×3×3 mm, in at least 50% of the reconstructed volume, wherein the radiopharmaceutical as distributed within the ROI has a range of emission-intensities I (which is measured as emitted photons/unit time/volume), and wherein at least 50% of the voxels of the reconstructed three-dimensional emission-intensity image of the ROI have inaccuracies of less than 30% of range I, such as less than 25%, 20%, 15%, 10%, 5%, 2%, 1%, or 0.5% of range I. For example, the radiopharmaceutical may emit over a range from 0 photons/second/cc to 10^5 photons/second/cc, such that the range I is 10^5 photons/second/cc, and at least 50% of the voxels of the reconstructed three-dimensional intensity image of the ROI have inaccuracies of less than 15% of range I, i.e., less than 1.5×10^4 photons/second/cc. For some applications, the study produces a parametric image related to a physiological process occurring in each voxel. In one particular embodiment, the image has a resolution of at least 5×5×5 mm, and at least 50% of the voxels have inaccuracies of less than 15% of range I;


4. the dynamic SPECT camera 10 is capable of acquiring an image, which has a resolution of at least 7×7×7 mm, such as at least 6×6×6 mm, 5×5×5 mm, 4×4×4 mm, 4×3×3 mm, or 3×3×3 mm, in at least 50% of the reconstructed volume, wherein the radiopharmaceutical as distributed within the ROI has a range of emission-intensities I (which is measured as emitted photons/unit time/volume), and wherein at least 50% of the voxels of the reconstructed three-dimensional emission-intensity image of the ROI have inaccuracies of less than 30% of range I, such as less than 25%, 20%, 15%, 10%, 5%, 2%, 1%, or 0.5% of range I. For example, the radiopharmaceutical may emit over a range from 0 photons/second/cc to 10^5 photons/second/cc, such that the range I is 10^5 photons/second/cc, and at least 50% of the voxels of the reconstructed three-dimensional intensity image of the ROI have inaccuracies of less than 15% of range I, i.e., less than 1.5×10^4 photons/second/cc. For some applications, the study produces a parametric image related to a physiological process occurring in each voxel. In one particular embodiment, the image has a resolution of at least 5×5×5 mm, and at least 50% of the voxels have inaccuracies of less than 15% of range I;


5. the dynamic SPECT camera 10 is capable of acquiring an image, which has a resolution of at least 20×20×20 mm, such as at least 15×15×15 mm, 10×10×10 mm, 7×7×7 mm, 5×5×5 mm, 4×4×4 mm, 4×3×3 mm, or 3×3×3 mm, wherein values of parameters of a physiological process modeled by a parametric representation have a range of physiological parameter values I, and wherein at least 50% of the voxels of the reconstructed parametric three-dimensional image have inaccuracies less than 100% of range I, such as less than 70%, 50%, 40%, 30%, 25%, 20%, 15%, 10%, 5%, 2%, 1%, or 0.5% of range I. For example, the physiological process may include blood flow, the values of the parameters of the physiological process may have a range from 0 to 100 cc/minute, such that the range I is 100 cc/minute, and at least 50% of the voxels of the reconstructed parametric three-dimensional image have inaccuracies less than 25% of range I, i.e., less than 25 cc/minute. In one particular embodiment, the image has a resolution of at least 5×5×5 mm, and at least 50% of the voxels have inaccuracies of less than 25% of range I; and/or


6. the dynamic SPECT camera 10 is capable of acquiring an image, which has a resolution of at least 7×7×7 mm, such as at least 6×6×6 mm, 5×5×5 mm, 4×4×4 mm, 4×3×3 mm, or 3×3×3 mm, in at least 50% of the reconstructed volume, wherein, if the radiopharmaceutical is distributed substantially uniformly within a portion of the ROI with an emission-intensity I+/−10% (which is defined as emitted photons/unit time/volume), at least 85% of the voxels of the reconstructed three-dimensional emission-intensity image of the portion of the ROI have inaccuracies of less than 30% of intensity I, such as less than 25%, 20%, 15%, 10%, 5%, 2%, 1%, or 0.5% of intensity I. For example, the radiopharmaceutical may be distributed within a volumetric region with a uniform emission-intensity I of 10^5 photons/second/cc, and at least 85% of the voxels of the reconstructed three-dimensional intensity image of the volumetric region have inaccuracies of less than 15% of intensity I, i.e., less than 1.5×10^4 photons/second/cc. For some applications, the same definition may apply to a study which produces a parametric image related to a physiological process occurring in each voxel. In one particular embodiment, the image has a resolution of at least 5×5×5 mm, and at least 50% of the voxels have inaccuracies of less than 15% of intensity I.


It is expected that during the life of this patent many relevant dynamic SPECT cameras will be developed and the scope of the term dynamic SPECT camera is intended to include all such new technologies a priori.


As used herein the term “substantially” refers to ±10%.


As used herein the term “about” refers to ±30%.


Additional objects, advantages, and novel features of the present invention will become apparent to one ordinarily skilled in the art upon examination of the following examples, which are not intended to be limiting. Additionally, each of the various embodiments and aspects of the present invention as delineated hereinabove and as claimed in the claims section below finds experimental support in the following examples.


It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.


Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.


It is expected that during the life of this patent many relevant devices and systems will be developed and the scope of the terms herein, particularly of the terms SPECT detectors, processing unit, communication, and images are intended to include all such new technologies a priori.



Claims
  • 1. A method for reconstructing a radioactive emission image of an overall volume in a human body having first and second volumetric regions, each volumetric region having respectively independent dynamic characteristics, the method comprising: a) obtaining a structural image of the overall volume in the human body, including said volumetric regions; b) delineating said structural image of the overall volume including said first and second volumetric regions, to delineate said first and second volumetric regions; c) obtaining radioactive emissions from the overall volume in the human body, including said volumetric regions and including timing data, using a plurality of detectors each performing oscillatory motion; d) reconstructing, using said radioactive emissions, a series of 3D images with time by: separately reconstructing said first volumetric region and said second volumetric region using said timing data, where different gating is employed for said first and said second volumetric regions, said different gating according to said respectively independent dynamic characteristics; and using a model for at least one of said volumetric regions, said model describing a relationship, with time, for said at least one volumetric region, between radioactive emission intensities and one or more of: concentration of radioactive tracer in the blood, uptake time constant of the tissue and washout time constant of the tissue.
  • 2. The method of claim 1, wherein said separately reconstructing is an iterative process including using object implantation for refining reconstruction of at least one of the first and second volumetric regions.
  • 3. The method of claim 2, wherein at least one of the volumetric regions comprises at least a part of a body organ or other body portion; and wherein said object implantation includes: providing a model of only a part of said overall volume in the human body, said model comprising a general location and shape of, and expected number of photons emitted from, said body organ or said at least part of body portion; replacing, at a general location in said initial radioactive emission image corresponding to said model general location, initial radioactive emission image voxels with said model general shape and initial radioactive emission image voxel photon counts, based on said expected number of photons from said model, where said replacing is one or more times during said iterative process, each time for providing a better starting point for performing next iteration of said reconstruction of at least one of the first and second volumetric regions.
  • 4. The method of claim 1, wherein said obtaining radioactive emissions further includes obtaining timing data by a method selected from the group consisting of time stamping, time binning, and a combination thereof.
  • 5. The method of claim 4, wherein said separately reconstructing includes separately reconstructing said first and second volumetric regions according to said respectively independent dynamic characteristics of said timing data.
  • 6. The method of claim 4, wherein said timing data comprises first timing data for said first volumetric region and second timing data for said second volumetric region; wherein said separately reconstructing comprises: reconstructing said first volumetric region using said first timing data; and reconstructing said second volumetric region using said second timing data.
  • 7. The method of claim 6, including: obtaining a second set of radioactive emissions from the overall volume in the human body, including said volumetric regions, after the reconstructing the initial radioactive emission image, the second set including timing data, wherein said separately reconstructing includes separately reconstructing said first and second volumetric regions according to said respectively independent dynamic characteristics, from said second set.
  • 8. The method of claim 7, wherein said first and second volumetric regions have higher dynamic activity than surrounding tissue; wherein obtaining a second set of radioactive emissions, further includes independently directing detecting resources to said first and second volumetric regions, so as to concentrate detecting resources on the volumetric region of greater dynamic activity.
  • 9. The method of claim 1, wherein separately reconstructing further includes independently carrying reconstruction of said first and second volumetric regions, to different levels of accuracy, so as to concentrate reconstruction resources on a volumetric region of greater dynamic activity.
  • 10. The method of claim 1, wherein said first volumetric region confines a human heart.
  • 11. The method of claim 1, wherein said volumetric regions comprises independent sub-volumetric regions, wherein during said separately reconstructing, each one of said independent sub-volumetric regions is reconstructed separately.
  • 12. The method of claim 1, wherein said radioactive emissions are obtained using a single photon emission computed tomography (SPECT) camera.
  • 13. The method of claim 12, wherein said SPECT camera comprises a plurality of detecting units.
  • 14. The method of claim 13, wherein time binning is independently enabled to each detecting unit from said plurality of detecting units.
  • 15. The method of claim 13, wherein time binning is independently enabled to one or more sub-groups of said plurality of detecting units from said plurality of detecting units.
  • 16. The method of claim 13, wherein each said detecting unit includes: a single-pixel detector, for detecting radioactive emissions; anda dedicated collimator, attached to the single-pixel detector, at the proximal end thereof, for defining a solid collection angle δ for the detecting unit.
  • 17. The method of claim 1, wherein said separately reconstructing comprises reconstructing an initial radioactive emission image which is represented by a system of voxels.
  • 18. The method of claim 17, further comprising: constructing an anatomical construction of voxels, for said system of voxels, in which voxel boundaries are aligned with boundaries of structural objects of said first volumetric region, based on said structural image,said separately reconstructing being based on said anatomical construction of voxels.
  • 19. The method of claim 18, wherein the anatomical construction of voxels includes voxels of varying volumetric regions, depending on their anatomic position and relevance.
  • 20. The method of claim 1, wherein said structural image is provided by a source, selected from the group consisting of 2-D ultrasound, 3-D ultrasound, planar x-rays, CT x-rays, MRI, and an anatomical atlas.
  • 21. The method of claim 1, wherein said obtaining radioactive emissions comprises time binning said radioactive emissions using different time bin lengths for said first and said second volumetric regions.
  • 22. The method of claim 1, wherein said obtaining a structural image of the overall volume includes obtaining the structural image of the overall volume, including the separate volumetric regions, wherein the first volumetric region has a first amount of dynamic activity and the second volumetric region has a second amount of dynamic activity, and wherein the first volumetric region moves in the overall volume relative to the second volumetric region.
  • 23. The method of claim 1 comprising, after said separately reconstructing, repeating said separately reconstructing until a desired quality of reconstruction of the overall volume has been reached.
  • 24. The method of claim 23, wherein it is determined that the desired quality of reconstructions of the overall volume has been reached when a number of gated voxels with activity is above a predefined threshold.
  • 25. The method of claim 1, wherein said separately reconstructing includes separately reconstructing said first volumetric region and reconstructing said second volumetric region according to said respectively independent dynamic characteristics of said respective first and second volumetric regions; wherein said method further includes combining said first and second reconstructed volumetric regions into a single image.
  • 26. The method of claim 1, wherein said separately reconstructing comprises using at least two different said models for different portions of said at least one of said volumetric regions.
  • 27. The method of claim 1, wherein said structural image is a first reconstruction based on said radioactive emissions of said overall volume.
  • 28. The method of claim 1, wherein said model describes a relationship, between radioactive emission intensities and concentration of radioactive tracer in the blood, uptake time constant of the tissue, and washout time constant of the tissue.
  • 29. The method of claim 1, wherein said using said model comprises generating, for said at least one volumetric region, a three dimensional relationship with time between said radioactive emission intensities and said one or more of: concentration of radioactive tracer in the blood, uptake time constant of the tissue and washout time constant of the tissue.
Priority Claims (2)
Number Date Country Kind
171346 Oct 2005 IL national
172349 Nov 2005 IL national
RELATED APPLICATIONS

This application is a Continuation of U.S. patent application Ser. No. 12/087,150, filed on Dec. 22, 2008 which is a National Phase of PCT Patent Application No. PCT/IL2006/001511 having International Filing Date of Dec. 28, 2006, which claims the benefit of U.S. Provisional Patent Application No. 60/816,970 filed on Jun. 28, 2006, 60/800,846 filed on May 17, 2006, 60/800,845 filed on May 17, 2006, 60/799,688 filed on May 11, 2006, 60/763,458 filed on Jan. 31, 2006, 60/754,199 filed on Dec. 28, 2005. U.S. patent application Ser. No. 12/087,150 is also a Continuation of U.S. patent application Ser. No. 11/607,075 filed on Dec. 1, 2006 now U.S. Pat. No. 8,094,894. U.S. patent application Ser. No. 12/087,150 is also a Continuation of PCT Patent Application No. PCT/IL2006/001291 having International Filing Date of Nov. 9, 2006, which also claims the benefit of U.S. Provisional Patent Application No. 60/800,846 filed on May 17, 2006, 60/800,845 filed on May 17, 2006, 60/799,688 filed on May 11, 2006 and 60/754,199 filed on Dec. 28, 2005. U.S. patent application Ser. No. 12/087,150 is also a Continuation of PCT Patent Application No. PCT/IL2006/000834 having International Filing Date of Jul. 19, 2006, which also claims benefit of U.S. Provisional Patent Application No. 60/816,970 filed on Jun. 28, 2006, 60/800,846 filed on May 17, 2006, 60/800,845 filed on May 17, 2006, 60/799,688 filed on May 11, 2006 and 60/763,458 filed on Jan. 31, 2006. U.S. patent application Ser. No. 12/087,150 is also a Continuation of PCT Patent Application No. PCT/IL2006/000840 having International Filing Date of Jul. 19, 2006, which also claims benefit of U.S. Provisional Patent Application No. 60/816,970 filed on Jun. 28, 2006, 60/800,846 filed on May 17, 2006, 60/800,845 filed on May 17, 2006, 60/799,688 filed on May 11, 2006 and 60/763,458 filed on Jan. 31, 2006. U.S. patent application Ser. No. 12/087,150 is also a Continuation of PCT Patent Application No. PCT/IL2006/000562 having International Filing Date of May 11, 2006, which also claims the benefit of U.S. Provisional Patent Applications No. 60/763,458 filed on Jan. 31, 2006. U.S. patent application Ser. No. 12/087,150 is also a Continuation of PCT Patent Application No. PCT/IL2006/000059 having International Filing Date of Jan. 15, 2006. U.S. patent application Ser. No. 12/087,150 is also a Continuation of U.S. patent application Ser. No. 12/084,559 filed on Nov. 26, 2008, now U.S. Pat. No. 7,705,316 which is a National Phase of PCT Patent Application No. PCT/IL2006/001291 having International Filing Date on Nov. 9, 2006, which is a Continuation-in-Part (CIP) of PCT Patent Application Nos. PCT/IL2006/000840 and PCT/IL2006/000834, both having International Filing Date on Jul. 19, 2006, PCT/IL2006/000562 having International Filing Date on May 11, 2006, PCT/IL2006/000059 having International Filing Date on Jan. 15, 2006, PCT/IL2005/001215 having International Filing Date on Nov. 16, 2005 and PCT/IL2005/001173 having International Filing Date on Nov. 9, 2005. PCT Patent Application No. PCT/IL2006/001291 also claims the benefit of priority of U.S. Provisional Patent Application No. 60/816,970 filed on Jun. 28, 2006, 60/800,846 and 60/800,845, both filed on May 17, 2006, 60/799,688 filed on May 11, 2006, 60/763,458 filed on Jan. 31, 2006, 60/754,199 filed on Dec. 28, 2005, 60/750,597 and 60/750,334, both filed on Dec. 15, 2005, 60/750,287 filed on Dec. 13, 2005 and 60/741,440 filed on Dec. 2, 2005. PCT Patent Application No. 
PCT/IL2006/001291 also claims the benefit of priority of Israel Patent Application No. 172349, filed on Nov. 27, 2005. PCT Patent Application No. PCT/IL2005/001173 is a Continuation-in-Part (CIP) of PCT Patent Application Nos. PCT/IL2005/000575 and PCT/IL2005/000572, both having International Filing Date on Jun. 1, 2005, and PCT/IL2005/000048 having International Filing Date on Jan. 13, 2005. PCT Patent Application No. PCT/IL2005/001173 also claims the benefit of priority of U.S. Provisional Patent Application Nos. 60/720,652 and 60/720,541, both filed on Sep. 27, 2005, 60/720,034 filed on Sep. 26, 2005, 60/702,979 filed on Jul. 28, 2005, 60/700,753 and 60/700,752, both filed on Jul. 20, 2005, 60/700,318, 60/700,317, and 60/700,299 all filed on Jul. 19, 2005, 60/691,780 filed on Jun. 20, 2005, 60/675,892 filed on Apr. 29, 2005, 60/648,690 filed on Feb. 2, 2005, 60/648,385 filed on Feb. 1, 2005, 60/640,215 filed on Jan. 3, 2005, 60/636,088 filed on Dec. 16, 2004, 60/635,630 filed on Dec. 14, 2004, 60/632,515 filed on Dec. 3, 2004, 60/632,236 filed on Dec. 2, 2004, 60/630,561 filed on Nov. 26, 2004, 60/628,105 filed on Nov. 17, 2004 and 60/625,971 filed on Nov. 9, 2004. PCT Patent Application No. PCT/IL2005/001173 also claims the benefit of priority of Israel Patent Application No. 171346 filed on Oct. 10, 2005. PCT Patent Application No. PCT/IL2005/000575 claims the benefit of priority of U.S. Provisional Patent No. 60/575,369 filed on Jun. 1, 2004. U.S. patent application Ser. No. 12/084,559 is also a Continuation-in-Part (CIP) of U.S. patent application Ser. No. 11/034,007 filed on Jan. 13, 2005, now U.S. Pat. No. 7,176,466, which claims the benefit of priority of U.S. Provisional Patent Application No. 60/535,830 filed on Jan. 13, 2004. The contents of the above applications are all incorporated herein by reference.

US Referenced Citations (787)
Number Name Date Kind
630611 Knapp et al. Aug 1899 A
2776377 Anger Jan 1957 A
3340866 Noeller Sep 1967 A
3446965 Ogier et al. May 1969 A
3535085 Shumate et al. Oct 1970 A
3684887 Hugonin Aug 1972 A
3690309 Pluzhnikov et al. Sep 1972 A
3719183 Schwartz Mar 1973 A
3739279 Hollis Jun 1973 A
3971362 Pope et al. Jul 1976 A
3978337 Nickles et al. Aug 1976 A
3988585 O'Neill et al. Oct 1976 A
4000502 Butler et al. Dec 1976 A
4015592 Bradley-Moore Apr 1977 A
4055765 Gerber et al. Oct 1977 A
4061919 Miller et al. Dec 1977 A
4095107 Genna et al. Jun 1978 A
4165462 Macovski et al. Aug 1979 A
4181856 Bone Jan 1980 A
4278077 Mizumoto Jul 1981 A
4289969 Cooperstein et al. Sep 1981 A
4291708 Frei et al. Sep 1981 A
4296785 Vitello et al. Oct 1981 A
4302675 Wake et al. Nov 1981 A
4364377 Smith Dec 1982 A
4383327 Kruger May 1983 A
4476381 Rubin Oct 1984 A
4492119 Dulapa et al. Jan 1985 A
4503331 Kovacs, Jr. et al. May 1985 A
4521688 Yin Jun 1985 A
4529882 Lee Jul 1985 A
H12 Bennett et al. Jan 1986 H
4580054 Shimoni Apr 1986 A
4584478 Genna et al. Apr 1986 A
4595014 Barrett et al. Jun 1986 A
4674107 Urban et al. Jun 1987 A
4679142 Lee Jul 1987 A
4689041 Corday et al. Aug 1987 A
4689621 Kleinberg Aug 1987 A
4709382 Sones Nov 1987 A
4710624 Alvarez et al. Dec 1987 A
4731536 Rische et al. Mar 1988 A
4773430 Porath Sep 1988 A
4782840 Martin, Jr. et al. Nov 1988 A
4791934 Brunnett Dec 1988 A
4801803 Denen et al. Jan 1989 A
4828841 Porter et al. May 1989 A
4834112 Machek et al. May 1989 A
4844067 Ikada et al. Jul 1989 A
4844076 Lesho et al. Jul 1989 A
4853546 Abe et al. Aug 1989 A
4854324 Hirschman et al. Aug 1989 A
4854330 Evans, III et al. Aug 1989 A
4867962 Abrams Sep 1989 A
4893013 Denen et al. Jan 1990 A
4893322 Hellmick et al. Jan 1990 A
4919146 Rhinehart et al. Apr 1990 A
4924486 Weber et al. May 1990 A
4928250 Greenberg et al. May 1990 A
4929832 Ledly May 1990 A
4938230 Machek et al. Jul 1990 A
4951653 Fly et al. Aug 1990 A
4959547 Carroll et al. Sep 1990 A
4970391 Uber, III Nov 1990 A
4995396 Inaba et al. Feb 1991 A
5014708 Hayashi et al. May 1991 A
5018182 Cowan et al. May 1991 A
5032729 Charpak Jul 1991 A
5033998 Corday et al. Jul 1991 A
5039863 Matsuno et al. Aug 1991 A
5042056 Hellmick et al. Aug 1991 A
5070877 Mohiuddin et al. Dec 1991 A
5070878 Denen Dec 1991 A
5088492 Takayama et al. Feb 1992 A
5115137 Andersson-Engels et al. May 1992 A
5119818 Carroll et al. Jun 1992 A
5132542 Bassalleck et al. Jul 1992 A
5142557 Toker et al. Aug 1992 A
5145163 Cowan et al. Sep 1992 A
5151598 Denen Sep 1992 A
5170055 Carroll et al. Dec 1992 A
5170439 Zeng et al. Dec 1992 A
5170789 Narayan et al. Dec 1992 A
5196796 Misic et al. Mar 1993 A
5210421 Gullberg et al. May 1993 A
5243988 Sieben et al. Sep 1993 A
5246005 Carroll et al. Sep 1993 A
5249124 DeVito Sep 1993 A
5252830 Weinberg Oct 1993 A
5254101 Trombley, III Oct 1993 A
5258717 Misic et al. Nov 1993 A
5263077 Cowan et al. Nov 1993 A
5279607 Schentag et al. Jan 1994 A
5284147 Hanaoka et al. Feb 1994 A
5299253 Wessels Mar 1994 A
5304165 Haber et al. Apr 1994 A
5307808 Dumoulin et al. May 1994 A
5307814 Kressel et al. May 1994 A
5309959 Shaw et al. May 1994 A
5317506 Coutre et al. May 1994 A
5317619 Hellmick et al. May 1994 A
5323006 Thompson et al. Jun 1994 A
5329976 Haber et al. Jul 1994 A
5334141 Carr et al. Aug 1994 A
5338936 Gullberg et al. Aug 1994 A
5349190 Hines et al. Sep 1994 A
5350355 Sklar Sep 1994 A
5355087 Claiborne et al. Oct 1994 A
5365069 Eisen et al. Nov 1994 A
5365928 Rhinehart et al. Nov 1994 A
5367552 Peschmann Nov 1994 A
5377681 Drane Jan 1995 A
5381791 Qian Jan 1995 A
5383456 Arnold et al. Jan 1995 A
5383858 Reilly et al. Jan 1995 A
5386446 Fujimoto et al. Jan 1995 A
5387409 Nunn et al. Feb 1995 A
5391877 Marks Feb 1995 A
5395366 D'Andrea Mar 1995 A
5396531 Hartley Mar 1995 A
5399868 Jones et al. Mar 1995 A
5404293 Weng et al. Apr 1995 A
5415181 Hofgrefe et al. May 1995 A
5431161 Ryals et al. Jul 1995 A
5435302 Lenkinski et al. Jul 1995 A
5436458 Tran et al. Jul 1995 A
5441050 Thurston et al. Aug 1995 A
5448073 Jeanguillaume Sep 1995 A
5451232 Rhinehart et al. Sep 1995 A
5472403 Cornacchia et al. Dec 1995 A
5475219 Olson Dec 1995 A
5475232 Powers et al. Dec 1995 A
5476095 Schnall et al. Dec 1995 A
5479969 Hardie et al. Jan 1996 A
5481115 Hsieh et al. Jan 1996 A
5484384 Fearnot Jan 1996 A
5489782 Wernikoff Feb 1996 A
5493595 Schoolman Feb 1996 A
5493805 Penuela et al. Feb 1996 A
5494036 Uber, III et al. Feb 1996 A
5501674 Trombley, III et al. Mar 1996 A
5517120 Misik et al. May 1996 A
5519221 Weinberg May 1996 A
5519222 Besett May 1996 A
5519931 Reich May 1996 A
5520182 Leighton et al. May 1996 A
5520653 Reilly et al. May 1996 A
5521506 Misic et al. May 1996 A
5524622 Wilson Jun 1996 A
5536945 Reich Jul 1996 A
5545899 Tran et al. Aug 1996 A
5559335 Zeng et al. Sep 1996 A
5565684 Gullberg et al. Oct 1996 A
5569181 Heilman et al. Oct 1996 A
5572132 Pulyer et al. Nov 1996 A
5572999 Funda et al. Nov 1996 A
5579766 Gray Dec 1996 A
5580541 Wells et al. Dec 1996 A
5585637 Bertelsen et al. Dec 1996 A
5587585 Eisen et al. Dec 1996 A
5591143 Trombley, III et al. Jan 1997 A
5600145 Plummer Feb 1997 A
5604531 Iddan et al. Feb 1997 A
5610520 Misic Mar 1997 A
5617858 Taverna et al. Apr 1997 A
5629524 Stettner et al. May 1997 A
5630034 Oikawa et al. May 1997 A
5635717 Popescu Jun 1997 A
5657759 Essen-Moller Aug 1997 A
5672877 Liebig et al. Sep 1997 A
5677539 Apotovsky et al. Oct 1997 A
5682888 Olson et al. Nov 1997 A
5687250 Curley et al. Nov 1997 A
5687542 Lawecki et al. Nov 1997 A
5690691 Chen et al. Nov 1997 A
5692640 Caulfield et al. Dec 1997 A
5694933 Madden et al. Dec 1997 A
5695500 Taylor et al. Dec 1997 A
5716595 Goldenberg Feb 1998 A
5717212 Fulton et al. Feb 1998 A
5727554 Kalend et al. Mar 1998 A
5729129 Acker Mar 1998 A
5732704 Thurston et al. Mar 1998 A
5739508 Uber, III Apr 1998 A
5741232 Reilly et al. Apr 1998 A
5742060 Ashburn Apr 1998 A
5744805 Raylman et al. Apr 1998 A
5757006 De Vito et al. May 1998 A
5779675 Reilly et al. Jul 1998 A
5780855 Pare et al. Jul 1998 A
5781442 Engleson et al. Jul 1998 A
5784432 Kurtz et al. Jul 1998 A
5786597 Lingren et al. Jul 1998 A
5795333 Reilly et al. Aug 1998 A
5799111 Guissin Aug 1998 A
5800355 Hasegawa Sep 1998 A
5803914 Ryals et al. Sep 1998 A
5806519 Evans, III et al. Sep 1998 A
5808203 Nolan, Jr. et al. Sep 1998 A
5810008 Dekel et al. Sep 1998 A
5810742 Pearlman Sep 1998 A
5811814 Leone et al. Sep 1998 A
5813985 Carroll Sep 1998 A
5818050 Dilmanian et al. Oct 1998 A
5821541 Tuemer Oct 1998 A
5825031 Wong et al. Oct 1998 A
5827219 Uber, III et al. Oct 1998 A
5828073 Zhu et al. Oct 1998 A
5833603 Kovacs et al. Nov 1998 A
5838009 Plummer et al. Nov 1998 A
5840026 Uber, III et al. Nov 1998 A
5841141 Gullberg et al. Nov 1998 A
5842977 Lesho et al. Dec 1998 A
5843037 Uber, III Dec 1998 A
5846513 Carroll et al. Dec 1998 A
5847396 Lingren et al. Dec 1998 A
5857463 Thurston et al. Jan 1999 A
5871013 Wainer et al. Feb 1999 A
5873861 Hitchins et al. Feb 1999 A
5880475 Oka et al. Mar 1999 A
5882338 Gray Mar 1999 A
5884457 Ortiz et al. Mar 1999 A
5885216 Evans, III et al. Mar 1999 A
5891030 Johnson et al. Apr 1999 A
5893397 Peterson et al. Apr 1999 A
5899885 Reilly et al. May 1999 A
5900533 Chou May 1999 A
5903008 Li May 1999 A
5910112 Judd et al. Jun 1999 A
5911252 Cassel Jun 1999 A
5916167 Kramer et al. Jun 1999 A
5916197 Reilly et al. Jun 1999 A
5920054 Uber, III Jun 1999 A
5927351 Zhu et al. Jul 1999 A
5928150 Call Jul 1999 A
5932879 Raylman et al. Aug 1999 A
5938639 Reilly et al. Aug 1999 A
5939724 Eisen et al. Aug 1999 A
5944190 Edelen Aug 1999 A
5944694 Hitchins et al. Aug 1999 A
5947935 Rhinehart et al. Sep 1999 A
5953884 Lawecki et al. Sep 1999 A
5954668 Uber, III et al. Sep 1999 A
5961457 Raylman et al. Oct 1999 A
5967983 Ashburn Oct 1999 A
5973598 Beigel Oct 1999 A
5974165 Giger et al. Oct 1999 A
5984860 Shan Nov 1999 A
5987350 Thurston Nov 1999 A
5993378 Lemelson Nov 1999 A
5997502 Reilly et al. Dec 1999 A
6002134 Lingren Dec 1999 A
6002480 Izatt et al. Dec 1999 A
6017330 Hitchins et al. Jan 2000 A
6019745 Gray Feb 2000 A
6021341 Scibilia et al. Feb 2000 A
6026317 Verani Feb 2000 A
6037595 Lingren Mar 2000 A
6040697 Misic Mar 2000 A
6042565 Hirschman et al. Mar 2000 A
RE36648 Uber, III et al. Apr 2000 E
6046454 Lingren et al. Apr 2000 A
6048334 Hirschman et al. Apr 2000 A
6052618 Dahlke et al. Apr 2000 A
6055450 Ashburn Apr 2000 A
6055452 Pearlman Apr 2000 A
RE36693 Reich May 2000 E
6063052 Uber et al. May 2000 A
D426891 Beale et al. Jun 2000 S
D426892 Beale et al. Jun 2000 S
6072177 McCroskey et al. Jun 2000 A
6076009 Raylman et al. Jun 2000 A
6080984 Friesenhahn Jun 2000 A
D428491 Beale et al. Jul 2000 S
6082366 Andra et al. Jul 2000 A
6090064 Reilly et al. Jul 2000 A
6091070 Lingren et al. Jul 2000 A
6096011 Trombley, III et al. Aug 2000 A
6107102 Ferrari Aug 2000 A
6115635 Bourgeois Sep 2000 A
6129670 Burdette et al. Oct 2000 A
6132372 Essen-Moller Oct 2000 A
6135955 Madden et al. Oct 2000 A
6135968 Brounstein Oct 2000 A
6137109 Hayes Oct 2000 A
6145277 Lawecki et al. Nov 2000 A
6147352 Ashburn Nov 2000 A
6147353 Gagnon et al. Nov 2000 A
6148229 Morris, Sr. et al. Nov 2000 A
6149627 Uber, III Nov 2000 A
6155485 Coughlin et al. Dec 2000 A
6160398 Walsh Dec 2000 A
6162198 Coffey et al. Dec 2000 A
6172362 Lingren et al. Jan 2001 B1
6173201 Front Jan 2001 B1
6324418 Crowley et al. Jan 2001 B1
6184530 Hines et al. Feb 2001 B1
6189195 Reilly et al. Feb 2001 B1
6194715 Lingren et al. Feb 2001 B1
6194725 Colsher et al. Feb 2001 B1
6194726 Pi et al. Feb 2001 B1
6197000 Reilly et al. Mar 2001 B1
6202923 Boyer et al. Mar 2001 B1
6203775 Torchilin et al. Mar 2001 B1
6205347 Morgan et al. Mar 2001 B1
6212423 Krakovitz Apr 2001 B1
6223065 Misic et al. Apr 2001 B1
6224577 Dedola et al. May 2001 B1
6226350 Hsieh May 2001 B1
6229145 Weinberg May 2001 B1
6232605 Soluri et al. May 2001 B1
6233304 Hu et al. May 2001 B1
6236050 Turner May 2001 B1
6236878 Taylor et al. May 2001 B1
6236880 Raylman et al. May 2001 B1
6239438 Schubert May 2001 B1
6240312 Alfano et al. May 2001 B1
6241708 Reilly et al. Jun 2001 B1
6242743 DeVito Jun 2001 B1
6242744 Soluri et al. Jun 2001 B1
6242745 Berlad et al. Jun 2001 B1
6246901 Benaron Jun 2001 B1
6252924 Davantes et al. Jun 2001 B1
6258576 Richards-Kortum et al. Jul 2001 B1
6259095 Bouton et al. Jul 2001 B1
6261562 Xu et al. Jul 2001 B1
6263229 Atalar et al. Jul 2001 B1
6269340 Ford et al. Jul 2001 B1
6270463 Morris, Sr. et al. Aug 2001 B1
6271524 Wainer et al. Aug 2001 B1
6271525 Majewski et al. Aug 2001 B1
6280704 Schutt et al. Aug 2001 B1
6281505 Hines et al. Aug 2001 B1
6308097 Pearlman Oct 2001 B1
6310968 Hawkins et al. Oct 2001 B1
6315981 Unger Nov 2001 B1
6317623 Griffiths et al. Nov 2001 B1
6317648 Sleep et al. Nov 2001 B1
6318630 Coughlin et al. Nov 2001 B1
6322535 Hitchins et al. Nov 2001 B1
6323648 Belt et al. Nov 2001 B1
RE37487 Reilly et al. Dec 2001 E
D452737 Nolan, Jr. et al. Jan 2002 S
6336913 Spohn et al. Jan 2002 B1
6339652 Hawkins et al. Jan 2002 B1
6339718 Zatezalo et al. Jan 2002 B1
6344745 Reisker et al. Feb 2002 B1
6346706 Rogers et al. Feb 2002 B1
6346886 de la Huerga Feb 2002 B1
RE37602 Uber, III et al. Mar 2002 E
6353227 Boxen Mar 2002 B1
6356081 Misic Mar 2002 B1
6368331 Front et al. Apr 2002 B1
6371938 Reilly et al. Apr 2002 B1
6375624 Uber, III et al. Apr 2002 B1
6377838 Iwanczyk et al. Apr 2002 B1
6381349 Zeng et al. Apr 2002 B1
6385483 Uber, III et al. May 2002 B1
6388244 Gagnon May 2002 B1
6388257 Gagnon et al. May 2002 B1
6388258 Berlad et al. May 2002 B1
6392235 Barrett et al. May 2002 B1
6396273 Misic May 2002 B2
6397098 Uber, III et al. May 2002 B1
6399951 Paulus et al. Jun 2002 B1
6402717 Reilly et al. Jun 2002 B1
6402718 Reilly et al. Jun 2002 B1
6407391 Mastrippolito et al. Jun 2002 B1
6408204 Hirschman Jun 2002 B1
6409987 Cardin et al. Jun 2002 B1
6415046 Kerut, Sr. Jul 2002 B1
6420711 Tuemer Jul 2002 B2
6425174 Reich Jul 2002 B1
6426917 Tabanou et al. Jul 2002 B1
6429431 Wilk Aug 2002 B1
6431175 Penner et al. Aug 2002 B1
6432089 Kakimi et al. Aug 2002 B1
6438401 Cheng et al. Aug 2002 B1
6439444 Shields, II Aug 2002 B1
6440107 Trombley, III et al. Aug 2002 B1
6442418 Evans, III et al. Aug 2002 B1
6448560 Turner Sep 2002 B1
6453199 Kobozev Sep 2002 B1
6459925 Nields et al. Oct 2002 B1
6459931 Hirschman Oct 2002 B1
6468261 Small et al. Oct 2002 B1
6469306 Van Dulmen et al. Oct 2002 B1
6471674 Emig et al. Oct 2002 B1
6480732 Tanaka et al. Nov 2002 B1
6484051 Daniel Nov 2002 B1
6488661 Spohn et al. Dec 2002 B1
6490476 Townsend et al. Dec 2002 B1
6504157 Juni Jan 2003 B2
6504178 Carlson et al. Jan 2003 B2
6504899 Pugachev et al. Jan 2003 B2
6506155 Sluis et al. Jan 2003 B2
6510336 Daghighian et al. Jan 2003 B1
6512374 Misic et al. Jan 2003 B1
6516213 Nevo Feb 2003 B1
6519569 White et al. Feb 2003 B1
6520930 Critchlow et al. Feb 2003 B2
6522945 Sleep et al. Feb 2003 B2
6525320 Juni Feb 2003 B1
6525321 Juni Feb 2003 B2
6592520 Peszynski et al. Mar 2003 B1
6541763 Lingren et al. Apr 2003 B2
6545280 Weinberg et al. Apr 2003 B2
6549646 Yeh et al. Apr 2003 B1
6560354 Maurer et al. May 2003 B1
6562008 Reilly et al. May 2003 B1
6563942 Takeo et al. May 2003 B2
6565502 Bede et al. May 2003 B1
6567687 Front et al. May 2003 B2
6574304 Hsieh et al. Jun 2003 B1
6575930 Trombley, III et al. Jun 2003 B1
6576918 Fu et al. Jun 2003 B1
6583420 Nelson et al. Jun 2003 B1
6584348 Glukhovsky Jun 2003 B2
6585700 Trocki et al. Jul 2003 B1
6587710 Wainer Jul 2003 B1
6591127 McKinnon Jul 2003 B1
6589158 Winkler Aug 2003 B2
6602488 Daghighian Aug 2003 B1
6607301 Glukhovsky et al. Aug 2003 B1
6611141 Schulz et al. Aug 2003 B1
6614453 Suri et al. Sep 2003 B1
6620134 Trombley, III et al. Sep 2003 B1
6627893 Zeng et al. Sep 2003 B1
6628983 Gagnon Sep 2003 B1
6628984 Weinberg Sep 2003 B2
6630735 Carlson et al. Oct 2003 B1
6631284 Nutt et al. Oct 2003 B2
6632216 Houzego et al. Oct 2003 B2
6633658 Dabney et al. Oct 2003 B1
6638752 Contag et al. Oct 2003 B2
6643537 Zatezalo et al. Nov 2003 B1
6643538 Majewski et al. Nov 2003 B1
6652489 Trocki et al. Nov 2003 B2
6657200 Nygard et al. Dec 2003 B2
6662036 Cosman Dec 2003 B2
6664542 Ye et al. Dec 2003 B2
6670258 Carlson et al. Dec 2003 B2
6671563 Engleson et al. Dec 2003 B1
6673033 Sciulli et al. Jan 2004 B1
6674834 Acharya et al. Jan 2004 B1
6676634 Spohn et al. Jan 2004 B1
6677182 Carlson et al. Jan 2004 B2
6677755 Belt et al. Jan 2004 B2
6680750 Tournier et al. Jan 2004 B1
6694172 Gagnon et al. Feb 2004 B1
6697660 Robinson Feb 2004 B1
6699219 Emig et al. Mar 2004 B2
6704592 Reynolds et al. Mar 2004 B1
6713766 Garrard et al. Mar 2004 B2
6714012 Belt et al. Mar 2004 B2
6714013 Misic Mar 2004 B2
6716195 Nolan, Jr. et al. Apr 2004 B2
6722499 Reich Apr 2004 B2
6723988 Wainer Apr 2004 B1
6726657 Dedig et al. Apr 2004 B1
6728583 Hallett Apr 2004 B2
6731971 Evans, III et al. May 2004 B2
6731989 Engleson et al. May 2004 B2
6733477 Cowan et al. May 2004 B2
6733478 Reilly et al. May 2004 B2
6734416 Carlson et al. May 2004 B2
6734430 Soluri et al. May 2004 B2
6737652 Lanza et al. May 2004 B2
6737866 Belt et al. May 2004 B2
6740882 Weinberg et al. May 2004 B2
6743202 Hirschman et al. Jun 2004 B2
6743205 Nolan, Jr. et al. Jun 2004 B2
6747454 Belt Jun 2004 B2
6748259 Benaron et al. Jun 2004 B1
6751500 Hirschman et al. Jun 2004 B2
6765981 Heumann Jul 2004 B2
6766048 Launay et al. Jul 2004 B1
6771802 Patt et al. Aug 2004 B1
6774358 Hamill et al. Aug 2004 B2
6776977 Liu Aug 2004 B2
6787777 Gagnon et al. Sep 2004 B1
6788758 De Villiers Sep 2004 B2
6798206 Misic Sep 2004 B2
6808513 Reilly et al. Oct 2004 B2
6809321 Rempel Oct 2004 B2
6813868 Baldwin et al. Nov 2004 B2
6821013 Reilly et al. Nov 2004 B2
6822237 Inoue et al. Nov 2004 B2
6833705 Misic Dec 2004 B2
6838672 Wagenaar et al. Jan 2005 B2
6841782 Balan et al. Jan 2005 B1
6843357 Bybee et al. Jan 2005 B2
6851615 Jones Feb 2005 B2
6866654 Callan et al. Mar 2005 B2
6870175 Dell et al. Mar 2005 B2
6881043 Barak Apr 2005 B2
6888351 Belt et al. May 2005 B2
6889074 Uber, III et al. May 2005 B2
6897658 Belt et al. May 2005 B2
6906330 Blevis et al. Jun 2005 B2
D507832 Yanniello et al. Jul 2005 S
6915170 Engleson et al. Jul 2005 B2
6915823 Osborne et al. Jul 2005 B2
6917828 Fukuda Jul 2005 B2
6921384 Reilly et al. Jul 2005 B2
6928142 Shao et al. Aug 2005 B2
6935560 Andreasson et al. Aug 2005 B2
6936030 Pavlik et al. Aug 2005 B1
6937750 Natanzon et al. Aug 2005 B2
6939302 Griffiths et al. Sep 2005 B2
6940070 Turner Sep 2005 B2
6943355 Shwartz et al. Sep 2005 B2
6957522 Baldwin et al. Oct 2005 B2
6958053 Reilly Oct 2005 B1
6963770 Scarantino et al. Nov 2005 B2
6970735 Uber, III et al. Nov 2005 B2
6972001 Emig et al. Dec 2005 B2
6974443 Reilly et al. Dec 2005 B2
6976349 Baldwin et al. Dec 2005 B2
6984222 Hitchins et al. Jan 2006 B1
6985870 Martucci et al. Jan 2006 B2
6988981 Hamazaki Jan 2006 B2
6994249 Peterka et al. Feb 2006 B2
7009183 Wainer et al. Mar 2006 B2
7011814 Suddarth et al. Mar 2006 B2
7012430 Misic Mar 2006 B2
7017622 Osborne et al. Mar 2006 B2
7018363 Cowan et al. Mar 2006 B2
7019783 Kindem et al. Mar 2006 B2
7102138 Belvis et al. Mar 2006 B2
7025757 Reilly et al. Apr 2006 B2
7026623 Oaknin et al. Apr 2006 B2
7043063 Noble et al. May 2006 B1
7103204 Celler et al. Sep 2006 B1
7127026 Amemiya et al. Oct 2006 B2
7142634 Engler et al. Nov 2006 B2
7145986 Wear et al. Dec 2006 B2
7147372 Nelson et al. Dec 2006 B2
7164130 Welsh et al. Jan 2007 B2
7176466 Rousso et al. Feb 2007 B2
7187790 Sabol et al. Mar 2007 B2
7217953 Carlson May 2007 B2
7256386 Carlson et al. Aug 2007 B2
7291841 Nelson et al. Nov 2007 B2
7327822 Sauer et al. Feb 2008 B2
7359535 Salla et al. Apr 2008 B2
7373197 Daghighian et al. May 2008 B2
7394923 Zou et al. Jul 2008 B2
7444010 De Man Oct 2008 B2
7468513 Charon et al. Dec 2008 B2
7470896 Pawlak et al. Dec 2008 B2
7490085 Walker et al. Feb 2009 B2
7495225 Hefetz et al. Feb 2009 B2
7502499 Grady Mar 2009 B2
7570732 Stanton et al. Aug 2009 B2
7592597 Hefetz et al. Sep 2009 B2
7620444 Le et al. Nov 2009 B2
7627084 Jabri et al. Dec 2009 B2
7652259 Kimchy et al. Jan 2010 B2
7671331 Hefetz Mar 2010 B2
7671340 Uribe et al. Mar 2010 B2
7672491 Krishnan et al. Mar 2010 B2
7680240 Manjeshwar et al. Mar 2010 B2
7705316 Rousso et al. Apr 2010 B2
7734331 Dhawale et al. Jun 2010 B2
7826889 David et al. Nov 2010 B2
7831024 Metzler et al. Nov 2010 B2
7835927 Schlotterbeck et al. Nov 2010 B2
7872235 Rousso et al. Jan 2011 B2
7894650 Weng et al. Feb 2011 B2
7968851 Rousso et al. Jun 2011 B2
8013308 Guerin et al. Sep 2011 B2
8055329 Kimchy et al. Nov 2011 B2
8111886 Rousso et al. Feb 2012 B2
8158951 Bal et al. Apr 2012 B2
8163661 Akiyoshi et al. Apr 2012 B2
8204500 Weintraub et al. Jun 2012 B2
8338788 Zilberstein et al. Dec 2012 B2
8440168 Yang et al. May 2013 B2
8489176 Ben-David et al. Jul 2013 B1
8565860 Kimchy et al. Oct 2013 B2
8620046 Nagler et al. Dec 2013 B2
8909325 Kimchy et al. Dec 2014 B2
20010016029 Turner Aug 2001 A1
20010020131 Kawagishi et al. Sep 2001 A1
20010035902 Iddan et al. Nov 2001 A1
20010049608 Hochman Dec 2001 A1
20020068864 Bishop et al. Jun 2002 A1
20020072784 Sheppard, Jr. et al. Jun 2002 A1
20020085748 Baumberg Jul 2002 A1
20020087101 Barrick et al. Jul 2002 A1
20020099295 Gil et al. Jul 2002 A1
20020099310 Kimchy et al. Jul 2002 A1
20020099334 Hanson et al. Jul 2002 A1
20020103429 DeCharms Aug 2002 A1
20020103431 Toker et al. Aug 2002 A1
20020145114 Inoue et al. Oct 2002 A1
20020148970 Wong et al. Oct 2002 A1
20020165491 Reilly Nov 2002 A1
20020168094 Kaushikkar et al. Nov 2002 A1
20020168317 Daghighian et al. Nov 2002 A1
20020172405 Schultz Nov 2002 A1
20020179843 Tanaka et al. Dec 2002 A1
20020183645 Nachaliel Dec 2002 A1
20020188197 Bishop et al. Dec 2002 A1
20020191734 Kojima et al. Dec 2002 A1
20020198738 Osborne Dec 2002 A1
20030001098 Stoddart et al. Jan 2003 A1
20030001837 Baumberg Jan 2003 A1
20030006376 Tumer Jan 2003 A1
20030013950 Rollo Jan 2003 A1
20030013966 Barnes et al. Jan 2003 A1
20030139661 Barnes et al. Jan 2003 A1
20030038240 Weinberg Feb 2003 A1
20030055685 Cobb et al. Mar 2003 A1
20030063787 Natanzon et al. Apr 2003 A1
20030071219 Motomura et al. Apr 2003 A1
20030081716 Turner May 2003 A1
20030136912 Juni Jun 2003 A1
20030135388 Martucci et al. Jul 2003 A1
20030144322 Kozikowski et al. Jul 2003 A1
20030147887 Wang et al. Aug 2003 A1
20030158481 Stotzka et al. Aug 2003 A1
20030174804 Bulkes et al. Sep 2003 A1
20030178559 Hamill et al. Sep 2003 A1
20030183226 Brand et al. Oct 2003 A1
20030189174 Tanaka et al. Oct 2003 A1
20030191430 D'Andrea et al. Oct 2003 A1
20030202629 Dunham et al. Oct 2003 A1
20030208117 Shwartz et al. Nov 2003 A1
20030215122 Tanaka Nov 2003 A1
20030215124 Li Nov 2003 A1
20030216631 Bloch et al. Nov 2003 A1
20030219149 Vailaya et al. Nov 2003 A1
20040003001 Shimura Jan 2004 A1
20040010397 Barbour et al. Jan 2004 A1
20040015075 Kimchy et al. Jan 2004 A1
20040021065 Weber Feb 2004 A1
20040044282 Mixon et al. Mar 2004 A1
20040051368 Caputo et al. Mar 2004 A1
20040054248 Kimchy et al. Mar 2004 A1
20040054278 Kimchy et al. Mar 2004 A1
20040065838 Tumer Apr 2004 A1
20040075058 Blevis et al. Apr 2004 A1
20040081623 Eriksen et al. Apr 2004 A1
20040082918 Evans et al. Apr 2004 A1
20040084340 Morelle et al. May 2004 A1
20040086437 Jackson et al. May 2004 A1
20040101176 Mendonca et al. May 2004 A1
20040101177 Zahlmann et al. May 2004 A1
20040116807 Amrami et al. Jun 2004 A1
20040120557 Sabol Jun 2004 A1
20040122311 Cosman Jun 2004 A1
20040125918 Shanmugavel et al. Jul 2004 A1
20040138557 Le et al. Jul 2004 A1
20040143449 Behrenbruch et al. Jul 2004 A1
20040144925 Stoddart et al. Jul 2004 A1
20040153128 Suresh et al. Aug 2004 A1
20040162492 Kobayashi Aug 2004 A1
20040171924 Mire et al. Sep 2004 A1
20040183022 Weinberg Sep 2004 A1
20040184644 Leichter et al. Sep 2004 A1
20040193453 Butterfield et al. Sep 2004 A1
20040195512 Crosetto Oct 2004 A1
20040204646 Nagler et al. Oct 2004 A1
20040205343 Forth et al. Oct 2004 A1
20040210126 Hajaj et al. Oct 2004 A1
20040238743 Gravrand et al. Dec 2004 A1
20040251419 Nelson et al. Dec 2004 A1
20040253177 Elmaleh et al. Dec 2004 A1
20040258201 Hayashida Dec 2004 A1
20040263865 Pawlak et al. Dec 2004 A1
20050001170 Juni Jan 2005 A1
20050006589 Young et al. Jan 2005 A1
20050020898 Vosniak et al. Jan 2005 A1
20050020915 Bellardinelli et al. Jan 2005 A1
20050023474 Persyk et al. Feb 2005 A1
20050029277 Tachibana Feb 2005 A1
20050033157 Klein et al. Feb 2005 A1
20050049487 Johnson et al. Mar 2005 A1
20050055174 David et al. Mar 2005 A1
20050056788 Juni Mar 2005 A1
20050074402 Cagnolini et al. Apr 2005 A1
20050107698 Powers et al. May 2005 A1
20050107914 Engleson et al. May 2005 A1
20050108044 Koster May 2005 A1
20050113945 Engleson et al. May 2005 A1
20050113960 Karau et al. May 2005 A1
20050117029 Shiomi Jun 2005 A1
20050121505 Metz et al. Jun 2005 A1
20050123183 Schleyer Jun 2005 A1
20050131270 Weil et al. Jun 2005 A1
20050145797 Oaknin et al. Jul 2005 A1
20050148869 Masuda Jul 2005 A1
20050149350 Kerr et al. Jul 2005 A1
20050156115 Kobayashi et al. Jul 2005 A1
20050173643 Tumer Aug 2005 A1
20050187465 Motomura et al. Aug 2005 A1
20050198800 Reich Sep 2005 A1
20050203389 Williams Sep 2005 A1
20050205792 Rousso et al. Sep 2005 A1
20050205796 Bryman Sep 2005 A1
20050207526 Altman Sep 2005 A1
20050211909 Smith Sep 2005 A1
20050215889 Patterson, II Sep 2005 A1
20050234424 Besing et al. Oct 2005 A1
20050247893 Fu et al. Nov 2005 A1
20050253073 Joram et al. Nov 2005 A1
20050261936 Silverbrook et al. Nov 2005 A1
20050261937 Silverbrook et al. Nov 2005 A1
20050261938 Silverbrook et al. Nov 2005 A1
20050266074 Zilberstein et al. Dec 2005 A1
20050277833 Williams, Jr. Dec 2005 A1
20050277911 Stewart et al. Dec 2005 A1
20050278066 Graves et al. Dec 2005 A1
20050288869 Kroll et al. Dec 2005 A1
20060000983 Charron Jan 2006 A1
20060033028 Juni Feb 2006 A1
20060036157 Tumer Feb 2006 A1
20060072799 McLain Apr 2006 A1
20060074290 Chen et al. Apr 2006 A1
20060104519 Stoeckel et al. May 2006 A1
20060109950 Arenson et al. May 2006 A1
20060122503 Burbank et al. Jun 2006 A1
20060145081 Hawman Jul 2006 A1
20060160157 Zuckerman Jul 2006 A1
20060188136 Ritt et al. Aug 2006 A1
20060214097 Wang et al. Sep 2006 A1
20060237652 Kimchy et al. Oct 2006 A1
20060257012 Kaufman et al. Nov 2006 A1
20070081700 Blumenfeld et al. Apr 2007 A1
20070116170 De Man et al. May 2007 A1
20070133852 Collins et al. Jun 2007 A1
20070156047 Nagler et al. Jul 2007 A1
20070166227 Liu et al. Jul 2007 A1
20070183582 Baumann et al. Aug 2007 A1
20070189436 Goto et al. Aug 2007 A1
20070194241 Rousso et al. Aug 2007 A1
20070265230 Rousso et al. Nov 2007 A1
20080001090 Ben-Haim et al. Jan 2008 A1
20080029704 Hefetz et al. Feb 2008 A1
20080033291 Rousso et al. Feb 2008 A1
20080036882 Uemura et al. Feb 2008 A1
20080039721 Shai et al. Feb 2008 A1
20080042067 Rousso et al. Feb 2008 A1
20080128626 Rousso et al. Jun 2008 A1
20080137938 Zahniser Jun 2008 A1
20080230702 Rousso et al. Sep 2008 A1
20080230705 Rousso et al. Sep 2008 A1
20080237482 Shahar et al. Oct 2008 A1
20080260228 Dichterman et al. Oct 2008 A1
20080260580 Helle et al. Oct 2008 A1
20080260637 Dickman Oct 2008 A1
20080277591 Shahar et al. Nov 2008 A1
20090001273 Hawman Jan 2009 A1
20090018412 Schmitt Jan 2009 A1
20090078875 Rousso et al. Mar 2009 A1
20090112086 Melman Apr 2009 A1
20090152471 Rousso et al. Jun 2009 A1
20090190807 Rousso et al. Jul 2009 A1
20090201291 Ziv et al. Aug 2009 A1
20090236532 Frach et al. Sep 2009 A1
20090304582 Rousso et al. Dec 2009 A1
20100006770 Balakin Jan 2010 A1
20100021378 Rousso et al. Jan 2010 A1
20100102242 Burr et al. Apr 2010 A1
20100121184 Dhawale et al. May 2010 A1
20100140483 Rousso et al. Jun 2010 A1
20100202664 Busch et al. Aug 2010 A1
20100245354 Rousso et al. Sep 2010 A1
20120106820 Rousso et al. May 2012 A1
20120172699 Nagler et al. Jul 2012 A1
20120248320 Wangerin et al. Oct 2012 A1
20120326034 Sachs et al. Dec 2012 A1
20130051643 Jackson et al. Feb 2013 A1
20130114792 Zilberstein et al. May 2013 A1
20130308749 Zilberstein et al. Nov 2013 A1
20140151563 Rousso et al. Jun 2014 A1
20140163368 Rousso et al. Jun 2014 A1
20140187927 Nagler et al. Jul 2014 A1
20140193336 Rousso et al. Jul 2014 A1
20140200447 Rousso et al. Jul 2014 A1
20140249402 Kimchy et al. Sep 2014 A1
20160253826 Ziv et al. Sep 2016 A9
20170000505 Gordon et al. Jan 2017 A1
20170007193 Nagler et al. Jan 2017 A1
20180235557 Rousso et al. Aug 2018 A1
Foreign Referenced Citations (67)
Number Date Country
1516429 Dec 1969 DE
19814199 Oct 1999 DE
19815362 Oct 1999 DE
0273257 Jul 1988 EP
0525954 Feb 1993 EP
0526970 Feb 1993 EP
0543626 May 1993 EP
0592093 Apr 1994 EP
0697193 Feb 1996 EP
0813692 Dec 1997 EP
0887661 Dec 1998 EP
1237013 Sep 2002 EP
2031142 Apr 1980 GB
59-141084 Aug 1984 JP
61-026879 Feb 1986 JP
01-324568 Jun 1986 JP
03-121549 May 1991 JP
04-151120 May 1992 JP
06-109848 Apr 1994 JP
07-059763 Mar 1995 JP
07-141523 Jun 1995 JP
08-292268 Nov 1996 JP
10-260258 Sep 1998 JP
11-072564 Mar 1999 JP
2003-098259 Apr 2003 JP
WO 9200402 Jan 1992 WO
WO 9742524 Nov 1997 WO
WO 1998016852 Apr 1998 WO
WO 9903003 Jan 1999 WO
WO 9930610 Jun 1999 WO
WO 9939650 Aug 1999 WO
WO 0010034 Feb 2000 WO
WO 0018294 Apr 2000 WO
WO 0022975 Apr 2000 WO
WO 0025268 May 2000 WO
WO 0031522 Jun 2000 WO
WO 0038197 Jun 2000 WO
WO 0062093 Oct 2000 WO
WO 0189384 Nov 2001 WO
WO 0216965 Feb 2002 WO
WO 02058531 Aug 2002 WO
WO 02075357 Sep 2002 WO
WO 03073938 Sep 2003 WO
WO 03086170 Oct 2003 WO
WO 2004004787 Jan 2004 WO
WO 2004016166 Feb 2004 WO
WO 2004032151 Apr 2004 WO
WO 2004042546 May 2004 WO
WO 2004113951 Dec 2004 WO
WO 2005002971 Jan 2005 WO
WO 2005059592 Jun 2005 WO
WO 2005059840 Jun 2005 WO
WO 2005067383 Jul 2005 WO
WO 2005104939 Nov 2005 WO
WO 2005118659 Dec 2005 WO
WO 2005119025 Dec 2005 WO
WO 2006042077 Apr 2006 WO
WO 2006051531 May 2006 WO
WO 2006054296 May 2006 WO
WO 2006075333 Jul 2006 WO
WO 2006129301 Dec 2006 WO
WO 2007010534 Jan 2007 WO
WO 2007010537 Jan 2007 WO
WO 2007054935 May 2007 WO
WO 2007074467 Jul 2007 WO
WO 2008010227 Jan 2008 WO
WO 2008075362 Jun 2008 WO
Non-Patent Literature Citations (389)
Zaidi, H., et al. “Fuzzy clustering-based segmented attenuation correction in whole-body PET imaging.” Physics in Medicine & Biology 47.7 (2002): 1143.
Parker, Geoffrey JM, et al. “Probing tumor microvascularity by measurement, analysis and display of contrast agent uptake kinetics.” Journal of Magnetic Resonance Imaging 7.3 (1997): 564-574.
Boucher, Luc, et al. “Respiratory gating for 3-dimensional PET of the thorax: feasibility and initial results.” Journal of Nuclear Medicine 45.2 (2004): 214-219.
Advisory Action Before the Filing of an Appeal Brief dated Apr. 14, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363. (3 pages).
Applicant-Initiated Interview Summary dated Oct. 6, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/100,082. (3 Pages).
Communication Pursuant to Article 94(3) EPC dated Jan. 5, 2018 From the European Patent Office Re. Application No. 05703091.8. (4 Pages).
Communication Pursuant to Article 94(3) EPC dated Nov. 10, 2017 From the European Patent Office Re. Application No. 05747259.9. (3 Pages).
Communication Pursuant to Article 94(3) EPC dated May 15, 2017 From the European Patent Office Re. Application No. 01951883.6. (10 Pages).
Communication Pursuant to Rule 164(1) EPC [Supplementary Partial European Search Report] dated Mar. 1, 2017 From the European Patent Office Re. Application No. 06700631.2. (7 pages).
Communication Pursuant to Rule 164(1) EPC [Supplementary Partial European Search Report] dated Feb. 17, 2017 From the European Patent Office Re. Application No. 05803158.4. (8 Pages).
Corrected European Search Opinion dated Sep. 19, 2017 From the European Patent Office Re. Application No. 06700631.2. (5 Pages).
Invitation Pursuant to Rule 62a(1) EPC dated Apr. 7, 2017 From the European Patent Office Re. Application No. 10171259.4. (2 Pages).
Notice of Allowance dated Dec. 6, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/100,082. (10 pages).
Official Action dated Feb. 2, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363. (9 pages).
Official Action dated Feb. 8, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/100,082. (182 pages).
Official Action dated Jun. 9, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/100,082. (41 Pages).
Official Action dated Jul. 17, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363. (21 pages).
Official Action dated Apr. 21, 2017 From the US Patent and Trademark Office Re. U.S. Appl. No. 15/184,041. (76 pages).
Partial European Search Report dated Jul. 21, 2017 From the European Patent Office Re. Application No. 10171259.4. (13 Pages).
Supplementary European Search Report and the European Search Opinion dated Jun. 6, 2017 From the European Patent Office Re. Application No. 06700631.2. (10 Pages).
Supplementary European Search Report and the European Search Opinion dated Jun. 14, 2017 From the European Patent Office Re. Application No. 05803158.4. (13 Pages).
Chiao et al. “Compartment Analysis of Technetium-99m-Teboroxime Kinetics Employing Fast Dynamic SPECT at Rest and Stress”, The Journal of Nuclear Medicine, XP055347116, 35(8): 1265-1273, Aug. 1994. Abstract, Sections ‘Imaging Protocol’, ‘Data Analysis’, ‘Quantification of Myocardial Blood Flow—Next Step’.
Garcia et al. “Diagnostic Performance of an Expert System for the Interpretation of Myocardial Perfusion SPECT Studies”, The Journal of Nuclear Medicine, XP055347227, 42(8): 1185-1191, Aug. 2001. Abstract, Section ‘Data Analysis and Expert System Interpretation’.
Graham et al. “Quantitation of SPECT Performance: Report of Task Group 4, Nuclear Medicine Committee”, American Association of Physicists in Medicine, AAPM Report No. 52, XP055341256, 22(4): 401-409, Apr. 1995. Abstract, Section II. B. ‘Spatial Resolution’, Section II. C. ‘Tomographic Uniformity’, Section III. A.2. ‘Spatial Resolution’, Section III. A.3. ‘System Performance: Tomographic Uniformity and Contrast’.
Advisory Action Before the Filing of an Appeal Brief dated Jul. 12, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Advisory Action before the Filing of an Appeal Brief dated May 21, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,653.
Advisory Action Before the Filing of an Appeal Brief dated Feb. 26, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/989,223.
Amendment After Allowance Under 37 CFR 1.312 dated Sep. 13, 2010 to Notice of Allowance dated Jul. 22, 2010 From the US Patent and Trademark Office Re.: U.S. Appl. No. 11/794,799.
Appeal Brief Dated Jan. 19, 2010 to Notice of Appeal of Nov. 16, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,301.
Applicant-Initiated Interview Summary dated May 1, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Applicant-Initiated Interview Summary dated May 9, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Applicant-Initiated Interview Summary dated Jun. 11, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/214,960.
Applicant-Initiated Interview Summary dated Mar. 20, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Applicant-Initiated Interview Summary dated Jan. 28, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Applicant-Initiated Interview Summary dated Jan. 29, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/345,773.
Applicant-Initiated Interview Summary dated Mar. 21, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/514,785.
Communication Pursuant to Article 94(3) EPC dated Mar. 2, 2011 From the European Patent Office Re. Application No. 06756259.5.
Communication Pursuant to Article 94(3) EPC dated May 8, 2014 From the European Patent Office Re. Application No. 05803689.8.
Communication Pursuant to Article 94(3) EPC dated Oct. 10, 2014 From the European Patent Office Re. Application No. 05803689.8.
Communication Pursuant to Article 94(3) EPC dated Jun. 11, 2012 From the European Patent Office Re. Application No. 06756259.5.
Communication Pursuant to Article 94(3) EPC dated May 12, 2010 From the European Patent Office Re. Application No. 06809851.6.
Communication Pursuant to Article 94(3) EPC dated Nov. 12, 2012 From the European Patent Office Re. Application No. 06756258.7.
Communication Pursuant to Article 94(3) EPC dated Apr. 16, 2010 From the European Patent Office Re. Application No. 01951883.6.
Communication Pursuant to Article 94(3) EPC dated Sep. 2013 From the European Patent Office Re.: Application No. 06832278.3.
Communication Pursuant to Article 94(3) EPC dated Oct. 17, 2014 From the European Patent Office Re. Application No. 06809851.6.
Communication Pursuant to Article 94(3) EPC dated Nov. 18, 2011 From the European Patent Office Re. Application No. 05803689.8.
Communication Pursuant to Article 94(3) EPC dated Oct. 21, 2009 From the European Patent Office Re.: Application No. 02716285.8.
Communication Pursuant to Article 94(3) EPC dated Jul. 22, 2009 From the European Patent Office Re.: Application No. 06809851.6.
Communication Pursuant to Article 94(3) EPC dated Sep. 2011 From the European Patent Office Re. Application No. 06756258.7.
Communication Pursuant to Article 94(3) EPC dated Nov. 25, 2013 From the European Patent Office Re. Application No. 06756258.7.
Communication Pursuant to Article 94(3) EPC dated Oct. 26, 2012 From the European Patent Office Re. Application No. 05803689.8.
Communication Pursuant to Article 94(3) EPC dated May 29, 2012 From the European Patent Office Re. Application No. 05803689.8.
Communication Pursuant to Article 96(2) EPC dated Jun. 19, 2006 From the European Patent Office Re.: Application No. 03810570.6.
Communication Pursuant to Article 96(2) EPC dated Aug. 30, 2007 From the European Patent Office Re. Application No. 03810570.6.
Communication Relating to the Results of the Partial International Search dated Apr. 18, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL2006/001291.
Communication Relating to the Results of the Partial International Search dated May 21, 2008 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL2007/001588.
Examination Report dated Jun. 22, 2011 From the Government of India, Patent Office, Intellectual Property Building Re. Application No. 2963/CHENP/2006.
International Preliminary Report on Patentability dated Apr. 7, 2009 From the International Bureau of WIPO Re. Application No. PCT/IL2007/000918.
International Preliminary Report on Patentability dated Jan. 13, 2009 From the International Bureau of WIPO Re. Application No. PCT/IL2006/000834.
International Preliminary Report on Patentability dated May 14, 2008 From the International Bureau of WIPO Re. Application No. PCT/IL2006/001291.
International Preliminary Report on Patentability dated May 15, 2007 From the International Bureau of WIPO Re. Application No. PCT/IL2005/001173.
International Preliminary Report on Patentability dated Apr. 16, 2009 From the International Bureau of WIPO Re. Application No. PCT/IL2007/000918.
International Preliminary Report on Patentability dated Jun. 21, 2007 From the International Bureau of WIPO Re. Application No. PCT/IL2005/000575.
International Preliminary Report on Patentability dated Jan. 22, 2009 From the International Bureau of WIPO Re.: Application No. PCT/IL2006/000834.
International Preliminary Report on Patentability dated May 22, 2007 From the International Preliminary Examining Authority Re.: Application No. PCT/IL06/00059.
International Preliminary Report on Patentability dated May 22, 2008 From the International Bureau of WIPO Re. Application No. PCT/IL2006/001291.
International Preliminary Report on Patentability dated May 24, 2007 From the International Bureau of WIPO Re.: Application No. PCT/IL2005/001173.
International Preliminary Report on Patentability dated Apr. 26, 2007 From the International Bureau of WIPO Re.: Application No. PCT/IL2005/000394.
International Preliminary Report on Patentability dated Jan. 31, 2008 From the International Bureau of WIPO Re.: Application No. PCT/IL2006/000840.
International Search Report dated Oct. 10, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/00059.
International Search Report dated Jul. 25, 2008 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL2007/001588.
International Search Report dated Feb. 1, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00048.
International Search Report dated Jul. 1, 2008 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/00834.
International Search Report dated Jul. 1, 2008 From the International Searching Authority Re. Application No. PCT/IL2006/000834.
International Search Report dated Nov. 1, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/00840.
International Search Report dated Jul. 2, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL2006/001291.
International Search Report dated Jul. 2, 2007 From the International Searching Authority Re. Application No. PCT/IL2006/001291.
International Search Report dated Aug. 3, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/001173.
International Search Report dated Aug. 3, 2006 From the International Searching Authority Re. Application No. PCT/IL2005/001173.
International Search Report dated May 11, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/001215.
International Search Report dated Sep. 11, 2002 From the International Searching Authority Re. Application No. PCT/IL01/00638.
International Search Report dated Sep. 12, 2002 From the International Searching Authority of the Patent Cooperation Treaty Re: Application No. PCT/IL02/00057.
International Search Report dated Oct. 15, 2008 From the International Searching Authority Re. Application No. PCT/IL2007/000918.
International Search Report dated Oct. 15, 2008 From the International Searching Authority Re. Application No. PCT/IL07/00918.
International Search Report dated Mar. 18, 2004 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL03/00917.
International Search Report dated Mar. 23, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00572.
International Search Report dated May 24, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00575.
International Search Report dated Mar. 26, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00394.
Interview Summary dated Mar. 25, 2011 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/836,223.
Interview Summary dated May 31, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,301.
Invitation to Pay Additional Fees dated Feb. 15, 2007 From the International Searching Authority Re. Application No. PCT/IL05/00575.
Notice of Appeal and Pre-Appeal Brief Dated Jan. 4, 2010 to Official Action dated Sep. 2, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/343,792.
Notice of Appeal Dated Nov. 16, 2009 to Official Action dated Jul. 15, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,301.
Notice of Non-Compliant Amendment dated Feb. 14, 2011 From the US Patent and Trademark Office Re: U.S. Appl. No. 10/616,307.
Notice of Non-Compliant Amendment dated Jul. 15, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363.
Notice of Panel Decision From Pre-Appeal Brief Review dated Feb. 29, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/836,223.
Office Action dated Dec. 2, 2007 From the Israeli Patent Office Re. Application No. 158442.
Office Action dated Jan. 2, 2006 From the Israeli Patent Office Re. Application No. 154323.
Office Action dated Sep. 4, 2007 From the Israeli Patent Office Re.: Application No. 157007.
Office Action dated Jul. 17, 2007 From the Israeli Patent Office Re. Application No. 154323 and Its Translation Into English.
Office Action dated Jul. 17, 2007 From the Israeli Patent Office Re. Application No. 154323.
Official Action dated Jun. 1, 2006 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/686,536.
Official Action dated Mar. 1, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/794,799.
Official Action dated Mar. 1, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,307.
Official Action dated Nov. 1, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/728,383.
Official Action dated Sep. 1, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 11/794,799.
Official Action dated Jul. 2, 2004 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Mar. 2, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/836,223.
Official Action dated Mar. 2, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,617.
Official Action dated Sep. 2, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/343,792.
Official Action dated Aug. 3, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,690.
Official Action dated May 3, 2007 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/240,239.
Official Action dated Aug. 4, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/082,314.
Official Action dated Sep. 4, 2008 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/533,568.
Official Action dated Aug. 5, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/309,479.
Official Action dated Jul. 5, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Sep. 5, 2002 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/084,559.
Official Action dated Sep. 5, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/947,198.
Official Action dated Mar. 6, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/792,856.
Official Action dated Feb. 7, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,653.
Official Action dated Jan. 7, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,307.
Official Action dated Jul. 7, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/533,568.
Official Action dated Oct. 7, 2008 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Oct. 7, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/750,057.
Official Action dated Oct. 7, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/932,872.
Official Action dated Apr. 8, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,690.
Official Action dated Dec. 8, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/132,320.
Official Action dated Dec. 8, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Dec. 8, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,690.
Official Action dated Jan. 8, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Jul. 8, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated Apr. 9, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated Mar. 9, 2011 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Aug. 10, 2007 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/836,223.
Official Action dated Feb. 10, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated Nov. 10, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,307.
Official Action dated Nov. 10, 2010 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/836,223.
Official Action dated Oct. 10, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated Apr. 11, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/309,479.
Official Action dated Aug. 11, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Jul. 11, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,683.
Official Action dated Mar. 11, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/607,075.
Official Action dated Mar. 11, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/345,719.
Official Action dated Oct. 11, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/976,852.
Official Action dated Dec. 12, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Official Action dated Jul. 12, 2007 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Jul. 12, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Aug. 13, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/769,826.
Official Action dated Dec. 13, 2007 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated May 13, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated May 13, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Official Action dated Sep. 13, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/976,852.
Official Action dated Aug. 14, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Official Action dated May 14, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Apr. 15, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Apr. 15, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/214,960.
Official Action dated Dec. 15, 2006 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,301.
Official Action dated Dec. 15, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Official Action dated Dec. 15, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated Feb. 15, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Official Action dated Jul. 15, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Jul. 15, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Mar. 15, 2004 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/765,316.
Official Action dated Mar. 15, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Official Action dated Nov. 15, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 13/345,773.
Official Action dated Sep. 15, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,307.
Official Action dated Sep. 15, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/836,223.
Official Action dated Sep. 15, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/147,682.
Official Action dated Apr. 16, 2012 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/836,223.
Official Action dated Dec. 16, 2008 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/343,792.
Official Action dated Feb. 16, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/747,378.
Official Action dated Sep. 16, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/727,464.
Official Action dated Jan. 17, 2006 From the United States Patent and Trademark Office Re.: U.S. Appl. No. 11/034,007.
Official Action dated Jun. 17, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated May 18, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/989,223.
Official Action dated May 18, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/514,785.
Official Action dated Apr. 19, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/750,057.
Official Action dated Dec. 19, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Official Action dated Jan. 19, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated Jul. 19, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/607,075.
Official Action dated Jul. 19, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Mar. 19, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/240,239.
Official Action dated Apr. 20, 2006 From the United States Patent and Trademark Office Re. U.S. Appl. No. 10/240,239.
Official Action dated Apr. 20, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated Dec. 20, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/309,479.
Official Action dated Jul. 20, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,617.
Official Action dated Jun. 21, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/309,479.
Official Action dated Mar. 21, 2008 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/533,568.
Official Action dated Sep. 21, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated Dec. 22, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/514,785.
Official Action dated Feb. 22, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,307.
Official Action dated Apr. 23, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/932,987.
Official Action dated Dec. 23, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/727,464.
Official Action dated Feb. 23, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Jun. 23, 2006 From the United States Patent and Trademark Office Re. U.S. Appl. No. 09/727,464.
Official Action dated May 23, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,307.
Official Action dated May 23, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated May 23, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,690.
Official Action dated Nov. 23, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Jun. 25, 2008 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Sep. 25, 2006 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Aug. 26, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363.
Official Action dated Mar. 26, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/147,682.
Official Action dated May 26, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/214,960.
Official Action dated Nov. 26, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/240,239.
Official Action dated Oct. 26, 2011 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/836,223.
Official Action dated Apr. 27, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/836,223.
Official Action dated Jul. 27, 2010 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,307.
Official Action dated Oct. 27, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/656,548.
Official Action dated Apr. 28, 2010 From the US Patent and Trademark Office Re.: U.S. Appl. No. 10/616,301.
Official Action dated Aug. 28, 2009 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/240,239.
Official Action dated Dec. 28, 2010 From the US Patent and Trademark Office Re.: U.S. Appl. No. 11/607,075.
Official Action dated Dec. 28, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Official Action dated Feb. 28, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Jan. 28, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 09/641,973.
Official Action dated Jun. 28, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/628,074.
Official Action dated Apr. 29, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 11/980,690.
Official Action dated Oct. 29, 2015 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/214,960.
Official Action dated Jul. 30, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,683.
Official Action dated Jul. 30, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/343,792.
Official Action dated Nov. 30, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/989,223.
Official Action dated Oct. 30, 2009 From the US Patent and Trademark Office Re.: U.S. Appl. No. 11/980,690.
Official Action dated Sep. 30, 2008 From the US Patent and Trademark Office Re. U.S. Appl. No. 10/616,301.
Official Action dated Sep. 30, 2010 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/798,017.
Official Action dated Aug. 31, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,653.
Official Action dated Jan. 31, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Official Action dated Jul. 31, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/667,793.
Proceeding Further With the European Patent Application Pursuant to Rule 70(2) EPC Dated Nov. 10, 2015 From the European Patent Office Re. Application No. 05703091.8.
Proceeding Further With the European Patent Application Pursuant to Rule 70(2) EPC Dated Dec. 11, 2015 From the European Patent Office Re. Application No. 05747259.9.
Restriction Official Action dated Nov. 8, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/309,479.
Restriction Official Action dated Mar. 9, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/976,852.
Restriction Official Action dated Sep. 12, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,653.
Restriction Official Action dated Apr. 13, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/989,223.
Restriction Official Action dated Nov. 15, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 11/980,683.
Restriction Official Action dated Aug. 16, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/448,473.
Restriction Official Action dated Apr. 21, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/058,363.
Second International Search Report dated Jun. 1, 2009 From the International Searching Authority Re.: Application No. PCT/IL07/00918.
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC Dated Jan. 16, 2009 From the European Patent Office Re.: Application No. 03810570.6.
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC Dated Nov. 29, 2012 From the European Patent Office Re. Application No. 06756259.5.
Supplementary European Search Report and the European Search Opinion dated Nov. 13, 2012 From the European Patent Office Re. Application No. 06728347.3.
Supplementary European Search Report and the European Search Opinion dated Mar. 16, 2011 From the European Patent Office Re. Application No. 05803689.8.
Supplementary European Search Report dated Dec. 12, 2005 From the European Patent Office Re Application No. 03810570.6.
Supplementary European Search Report dated Oct. 22, 2015 From the European Patent Office Re. Application No. 05703091.8.
Supplementary European Search Report dated Nov. 25, 2015 From the European Patent Office Re. Application No. 05747259.9.
Supplementary Partial European Search Report and the European Search Opinion dated Oct. 16, 2009 From the European Patent Office Re.: Application No. 06756259.5.
Supplementary Partial European Search Report dated Sep. 4, 2007 From the European Patent Office Re. Application No. 02716285.8.
Supplementary Partial European Search Report dated Nov. 11, 2008 From the European Patent Office Re. Application No. 01951883.6.
Supplementary Partial European Search Report dated Nov. 20, 2007 From the European Patent Office Re. Application No. 02716285.8.
Translation of Office Action dated May 13, 2005 From the Patent Office of the People's Republic of China Re. Application No. 01817689.5.
Written Opinion dated Feb. 1, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00048.
Written Opinion dated Jul. 1, 2008 From the International Searching Authority Re. Application No. PCT/IL06/00834.
Written Opinion dated Nov. 1, 2007 from the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/00840.
Written Opinion dated Jul. 2, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL2006/001291.
Written Opinion dated Aug. 3, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/001173.
Written Opinion dated Oct. 10, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re. Application No. PCT/IL06/00059.
Written Opinion dated Oct. 15, 2008 From the International Searching Authority Re. Application No. PCT/IL07/00918.
Written Opinion dated Mar. 23, 2006 From the International Searching Authority of the Patent Cooperation Treaty Re. Application No. PCT/IL05/00572.
Written Opinion dated May 24, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00575.
Written Opinion dated Jul. 25, 2008 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/001173.
Written Opinion dated Mar. 26, 2007 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL05/00394.
Aoi et al. “Absolute Quantitation of Regional Myocardial Blood Flow of Rats Using Dynamic Pinhole SPECT”, IEEE Nuclear Science Symposium and Medical Imaging Conference Record, 3: 1780-1783, 2002. Abstract, Figs.
Beekman et al. “Efficient Fully 3-D Iterative SPECT Reconstruction With Monte Carlo-Based Scatter Compensation”, IEEE Transactions on Medical Imaging, 21(8): 867-877, Aug. 2002.
Berman et al. “D-SPECT: A Novel Camera for High Speed Quantitative Molecular Imaging: Initial Clinical Results”, The Journal of Nuclear Medicine, 47(Suppl.1): 131P, 2006.
Berman et al. “Dual-Isotope Myocardial Perfusion SPECT With Rest Thallium-201 and Stress Tc-99m Sestamibi”, Nuclear Cardiology, 12(2): 261-270, May 1994.
Berman et al. “Myocardial Perfusion Imaging With Technetium-99m-Sestamibi: Comparative Analysis of Available Imaging Protocols”, The Journal of Nuclear Medicine, 35: 681-688, 1994.
Bloch et al. “Application of Computerized Tomography to Radiation Therapy and Surgical Planning”, Proceedings of the IEEE, 71(3): 351-355, Mar. 1983.
Borges-Neto et al. “Perfusion and Function at Rest and Treadmill Exercise Using Technetium-99m-Sestamibi: Comparison of One- and Two-Day Protocols in Normal Volunteers”, The Journal of Nuclear Medicine, 31(7): 1128-1132, Jul. 1990.
Bowsher et al. “Treatment of Compton Scattering in Maximum-Likelihood, Expectation-Maximization Reconstructions of SPECT Images”, Journal of Nuclear Medicine, 32(6): 1285-1291, 1991.
Bromiley et al. “Attenuation Correction in PET Using Consistency Conditions and a Three-Dimensional Template”, IEEE Transactions on Nuclear Science, XP002352920, 48(4): 1371-1377, 2001. p. 1376, col. 2, § 2.
Brown et al. “Method for Segmenting Chest CT Image Data Using An Anatomical Model: Preliminary Results”, IEEE Transactions on Medical Imaging, 16(6): 828-839, Dec. 1997.
Brzymialkiewicz et al. “Evaluation of Fully 3-D Emission Mammotomography With a Compact Cadmium Zinc Telluride Detector”, IEEE Transactions on Medical Imaging, 24(7): 868-877, Jul. 2005.
Cancer Medicine “Radiolabeled Monoclonal Antibodies. Historical Perspective”, Cancer Medicine, 5th Ed., Sec.16: Principles of Biotherapeutics, Chap.65: Monoclonal Serotherapy, 2000.
Charland et al. “The Use of Deconvolution and Total Least Squares in Recovering a Radiation Detector Line Spread Function”, Medical Physics, 25(2): 152-160, Feb. 1998. Abstract Only!
Chengazi et al. “Imaging Prostate Cancer With Technetium-99m-7E11-05.3 (CYT-351)”, Journal of Nuclear Medicine, 38: 675-682, 1997.
Corstens et al. “Nuclear Medicine's Role in Infection and Inflammation”, The Lancet, 354: 765-770, 1999.
Day et al. "Localization of Radioiodinated Rat Fibrinogen in Transplanted Rat Tumors", Journal of the National Cancer Institute, 23(4): 799-812, 1959.
DeGrado et al. “Topics in Integrated Systems Physiology. Tracer Kinetic Modeling in Nuclear Cardiology”, Journal of Nuclear Cardiology, 7: 686-700, 2000.
Del Guerra et al. “An Integrated PET-SPECT Small Animal Imager: Preliminary Results”, Nuclear Science Symposium, IEEE Records, 1: 541-544, 1999.
Dewaraja et al. "Accurate Dosimetry in [131]I Radionuclide Therapy Using Patient-Specific, 3-Dimensional Methods for SPECT Reconstruction and Absorbed Dose Calculation", The Journal of Nuclear Medicine, 46(5): 840-849, May 2005.
Di Bella et al. “Automated Region Selection for Analysis of Dynamic Cardiac SPECT Data”, IEEE Transactions on Nuclear Science, XP011087693, 44(3): 1355-1361, Jun. 1997. Section II, B), Blood and Liver Correlations, Section III, Results, Figs.4, 5.
Dillman “Radiolabeled Anti-CD20 Monoclonal Antibodies for the Treatment of B-Cell Lymphoma”, Journal of Clinical Oncology, 20(16): 3545-3557, Aug. 15, 2002.
Ellestad “Stress Testing: Principles and Practice”, XP008143015, 5th Edition, p. 432, Jan. 1, 2003.
Erbil et al. “Use and Limitations of Serum Total and Lipid-Bound Sialic Acid Concentrations as Markers for Colorectal Cancer”, Cancer, 55: 404-409, 1985.
Garcia et al. "Accuracy of Dynamic SPECT Acquisition for Tc-99m Teboroxime Myocardial Perfusion Imaging: Preliminary Results", American College of Cardiology, 51st Annual Scientific Session, Atlanta, Georgia, USA, 8 P., 2002.
GE Healthcare "Myoview™: Kit for the Preparation of Technetium Tc99m Tetrofosmin for Injection. Diagnostic Radiopharmaceutical. For Intravenous Use Only. Rx Only", GE Healthcare, Product Sheet, 4 P., Aug. 2006.
Gilland et al. "Long Focal Length, Asymmetric Fan Beam Collimation for Transmission Acquisition With A Triple Camera SPECT System", IEEE Transactions on Nuclear Science, XP011087666, 44(3): 1191-1196, Jun. 1, 1997.
Gugnin et al. "Radiocapsule for Recording the Ionizing Radiation in the Gastrointestinal Tract", UDC 615.417:616.34-005.1-073.916-71, All-Union Scientific-Research Institute of Medical Instrument Design, Moscow. Translated from Meditsinskaya Tekhnika, 1: 21-25, Jan.-Feb. 1972.
Hassan et al. “A Radiotelemetry Pill for the Measurement of Ionising Radiation Using a Mercuric Iodide Detector”, Physics in Medicine and Biology, 23(2): 302-308, 1978.
Hayakawa et al. “A PET-MRI Registration Technique for PET Studies of the Rat Brain”, Nuclear Medicine & Biology, 27: 121-125, 2000. p. 121, col. 1.
Herrmann et al. "Mitochondrial Proteome: Altered Cytochrome C Oxidase Subunit Levels in Prostate Cancer", Proteomics, XP002625778, 3(9): 1801-1810, Sep. 2003.
Hoffman et al. “Intraoperative Probes and Imaging Probes”, European Journal of Nuclear Medicine, 26(8): 913-935, 1999.
Huesman et al. “Kinetic Parameter Estimation From SPECT Cone-Beam Projection Measurements”, Physics in Medicine and Biology, 43(4): 973-982, 1998.
Jan et al. “Preliminary Results From the AROPET”, IEEE Nuclear Science Symposium Conference Record, Nov. 4-10, 2001, 3: 1607-1610, 2001.
Jeanguillaume et al. "From the Whole-Body Counting to Imaging: The Computer Aided Collimation Gamma Camera Project (CACAO)", Radiation Protection Dosimetry, 89(3-4): 349-352, 2000. & RSNA 2000 Infosystem, 87th Scientific Assembly and Annual Meeting, Chicago, Illinois, 2000.
Jessup “Tumor Markers—Prognostic and Therapeutic Implications for Colorectal Carcinoma”, Surgical Oncology, 7: 139-151, 1998.
Kadrmas et al. “Static Versus Dynamic Teboroxime Myocardial Perfusion SPECT in Canines”, IEEE Transactions on Nuclear Science, 47(3): 1112-1117, Jun. 2000.
Kinahan et al. “Attenuation Correction for A Combined 3D PET/CT Scanner”, Medical Physics, 25(10): 2046-2053, Oct. 1998.
Kojima et al. “Quantitative Planar Imaging Method for Measurement of Renal Activity by Using A Conjugate-Emission Image and Transmission Data”, Medical Physics, 27(3): 608-615, 2000. p. 608.
Krieg et al. “Mitochondrial Proteome: Cancer-Altered Metabolism Associated With Cytochrome C Oxidase Subunit Level Variation”, Proteomics, XP002625779, 4(9): 2789-2795, Sep. 2004.
Kwok et al. "Feasibility of Simultaneous Dual-Isotope Myocardial Perfusion Acquisition Using a Lower Dose of Sestamibi", European Journal of Nuclear Medicine, 24(3): 281-285, Mar. 1997.
Lange et al. “EM Reconstruction Algorithms for Emission and Transmission Tomography”, Journal of Computer Assisted Tomography, 8(2): 306-316, Apr. 1984.
Lavallee et al. "Building A Hybrid Patient's Model for Augmented Reality in Surgery: A Registration Problem", Computers in Biology and Medicine, 25(2): 149-164, 1995.
Li et al. "A HOTLink/Networked PC Data Acquisition and Image Reconstruction System for a High Resolution Whole-Body PET With Respiratory or ECG-Gated Performance", IEEE Nuclear Science Symposium and Medical Imaging Conference, Norfolk, VA, USA, Nov. 10-16, 2002, XP010663724, 2: 1135-1139, Nov. 10, 2002. p. 1137, First col. 2nd §.
Lin et al. “Improved Sensor Pills for Physiological Monitoring”, NASA Technical Brief, JPL New Technology Report, NPO-20652, 25(2), 2000.
Links “Advances in SPECT and PET Imaging”, Annals in Nuclear Medical Science, 13(2): 107-120, Jun. 2000.
Mao et al. “Human Prostatic Carcinoma: An Electron Microscope Study”, Cancer Research, XP002625777, 26(5): 955-973, May 1966.
McJilton et al. "Protein Kinase Cε Interacts With Bax and Promotes Survival of Human Prostate Cancer Cells", Oncogene, 22: 7958-7968, 2003.
Mettler et al. “Legal Requirements and Radiation Safety”, Essentials of Nuclear Medicine Imaging, 2nd Ed., Chap. 13: 323-331, 1985.
Meyers et al. “Age, Perfusion Test Results and Dipyridamole Reaction”, Radiologic Technology, XP008142909, 73(5): 409-414, May 1, 2002.
Molinolo et al. “Enhanced Tumor Binding Using Immunohistochemical Analyses by Second Generation Anti-Tumor-Associated Glycoprotein 72 Monoclonal Antibodies versus Monoclonal Antibody B72.3 in Human Tissue”, Cancer Research, 50: 1291-1298, 1990.
Moore et al. “Quantitative Multi-Detector Emission Computerized Tomography Using Iterative Attenuation Compensation”, Journal of Nuclear Medicine, XP002549083, 23(8): 706-714, Aug. 1982. Abstract, p. 707, Section ‘The Multi-Detector Scanner’, First §.
Mori et al. "Overexpression of Matrix Metalloproteinase-7 mRNA in Human Colon Carcinomas", Cancer, 75: 1516-1519, 1995.
Ogawa et al. "Ultra High Resolution Pinhole SPECT", IEEE Nuclear Science Symposium, 2: 1600-1604, 1998.
Ohno et al. “Selection of Optimum Projection Angles in Three Dimensional Myocardial SPECT”, IEEE Nuclear Science Symposium Conference Record 2001, 4: 2166-2169, 2001.
Ohrvall et al. "Intraoperative Gamma Detection Reveals Abdominal Endocrine Tumors More Efficiently Than Somatostatin Receptor Scintigraphy", 6th Conference on Radioimmunodetection and Radioimmunotherapy of Cancer, Cancer, 80: 2490-2494, 1997.
Pardridge et al. “Tracer Kinetic Model of Blood-Brain Barrier Transport of Plasma Protein-Bound Ligands”, Journal of Clinical Investigation, 74: 745-752, 1984. Suppl. IDS in 27480.
Patton et al. “D-SPECT: A New Solid State Camera for High Speed Molecular Imaging”, The Journal of Nuclear Medicine, 47(Suppl.1): 189P, 2006.
Pellegrini et al. “Design of Compact Pinhole SPECT System Based on Flat Panel PMT”, IEEE Nuclear Science Symposium Conference Record, 3: 1828-1832, 2003.
Pharmalucence "Kit for the Preparation of Technetium Tc99m Sulfur Colloid Injection for Subcutaneous, Intraperitoneal, Intravenous, and Oral Use", Pharmalucence Inc., Reference ID: 2977567, Prescribing Information, 10 P., Jul. 2011.
Piperno et al. “Breast Cancer Screening by Impedance Measurements”, Frontiers Med. Biol. Engng., 2(2): 11-17, 1990.
Pluim et al. “Image Registration by Maximization of Combined Mutual Information and Gradient Information”, IEEE Transactions on Medical Imaging, 19(8): 1-6, 2000.
Qi et al. “Resolution and Noise Properties of MAP Reconstruction for Fully 3-D PET”, IEEE Transactions on Medical Imaging, XP002549082, 19(5): 493-506, May 2000. p. 493, col. 2, Lines 10-21, p. 495, col. 1, Last §.
Quartuccia et al. "Computer Assisted Collimation Gamma Camera: A New Approach to Imaging Contaminated Tissues", Radiation Protection Dosimetry, 89(3-4): 343-348, 2000.
Rajshekhar "Continuous Impedance Monitoring During CT-Guided Stereotactic Surgery: Relative Value in Cystic and Solid Lesions", British Journal of Neurosurgery, 6: 439-444, 1992.
Rockmore et al. “A Maximum Likelihood Approach to Emission Image Reconstruction From Projections”, IEEE Transactions on Nuclear Science, 23(4): 1428-1432, Aug. 1976.
Saltz et al. “Interim Report of Randomized Phase II Trial of Cetuximab/Bevacizumab/Irinotecan (CB1) Versus Cetuximab/Bevacizumab (CB) in Irinotecan-Refractory Colorectal Cancer”, Gastrointestinal Cancer Symposium, Hollywood, FL, USA, Jan. 27-29, 2005, American Society of Clinical Oncology, Abstract 169b, 4P., 2005.
Sands et al. “Methods for the Study of the Metabolism of Radiolabeled Monoclonal Antibodies by Liver and Tumor”, The Journal of Nuclear Medicine, 28: 390-398, 1987.
Seret et al. “Intrinsic Uniformity Requirements for Pinhole SPECT”, Journal of Nuclear Medicine Technology, 34(1): 43-47, Mar. 2006.
Sharir et al. "D-SPECT: High Speed Myocardial Perfusion Imaging: A Comparison With Dual Detector Anger Camera (A-SPECT)", The Journal of Nuclear Medicine, 48(Suppl.2): 51P, # 169, 2007.
Shepp et al. “Maximum Likelihood Reconstruction for Emission Tomography”, IEEE Transactions on Medical Imaging, MI-1: 113-122, Oct. 1982.
Sitek et al. “Reconstruction of Dynamic Renal Tomographic Data Acquired by Slow Rotation”, The Journal of Nuclear Medicine, 42(11): 1704-1712, Nov. 2001.
Smither "High Resolution Medical Imaging System for 3-D Imaging of Radioactive Sources With 1 mm FWHM Spatial Resolution", Proceedings of the SPIE, Medical Imaging 2003: Physics of Medical Imaging, 5030: 1052-1060, Jun. 9, 2003.
Solanki “The Use of Automation in Radiopharmacy”, Hospital Pharmacist, 7(4): 94-98, Apr. 2000.
Stoddart et al. “New Multi-Dimensional Reconstructions for the 12-Detector, Scanned Focal Point, Single-Photon Tomograph”, Physics in Medicine and Biology, XP020021960, 37(3): 579-586, Mar. 1, 1992. p. 582, § 2-p. 585, § 1.
Storey et al. “Tc-99m Sestamibi Uptake in Metastatic Prostate Carcinoma”, Clinical Nuclear Medicine, XP009145398, 25(2): 133-134, Feb. 2000.
Studen “Compton Camera With Position-Sensitive Silicon Detectors”, Doctoral Thesis, University of Ljubljana, Faculty of Mathematics and Physics, 36 P, 2005.
Takahashi et al. “Attenuation Correction of Myocardial SPECT Images With X-Ray CT: Effects of Registration Errors Between X-Ray CT and SPECT”, Annals of Nuclear Medicine, 16(6): 431-435, Sep. 2002.
Tornai et al. “A 3D Gantry Single Photon Emission Tomograph With Hemispherical Coverage for Dedicated Breast Imaging”, Nuclear Instruments & Methods in Physics Research, Section A, 497: 157-167, 2003.
Trikha et al. “Monoclonal Antibodies as Therapeutics in Oncology”, Current Opinion in Biotechnology, 13: 609-614, 2002.
Volkow et al. “Imaging the Living Human Brain: Magnetic Resonance Imaging and Positron Emission Tomography”, Proc. Natl. Acad. Sci. USA, 94: 2787-2788, Apr. 1997.
Weldon et al. “Quantification of Inflammatory Bowel Disease Activity Using Technetium-99m HMPAO Labelled Leucocyte Single Photon Emission Computerised Tomography (SPECT)”, Gut, 36: 243-250, 1995.
Wilson et al. "Non-Stationary Noise Characteristics for SPECT Images", Proceedings of the Nuclear Science Symposium and Medical Imaging Conference, Santa Fe, NM, USA, Nov. 2-9, 1991, XP010058168, p. 1736-1740, Nov. 2, 1991. p. 1736, col. 2, Lines 4-6.
Wong et al. “Segmentation of Dynamic PET Images Using Cluster Analysis”, IEEE Transactions on Nuclear Science, XP002347001, 49(1): 200-207, Feb. 2002. Introduction, Section I, Last Para, Section II, A, Segmentation Scheme, Section II, C, Validation Study, Figs.2, 5.
Wu et al. “ECG-Gated Pinhole SPECT in Mice With Millimeter Spatial Resolution”, IEEE Transactions on Nuclear Science, 47(3): 1218-1221, Jun. 2000.
Xu et al. “Quantitative Expression Profile of Androgen-Regulated Genes in Prostate Cancer Cells and Identification of Prostate-Specific Genes”, International Journal of Cancer, 92: 322-328, 2001.
Yu et al. “Using Correlated CT Images in Compensation for Attenuation in PET Image Reconstruction”, Proceedings of the SPIE, Applications of Optical Engineering: Proceedings of OE/Midwest '90, 1396: 56-58, 1991.
Zaidi et al. “Magnetic Resonance Imaging-Guided Attenuation and Scatter Corrections in Three-Dimensional Brain Positron Emission Tomography”, Medical Physics, 30(5): 937-948, May 2003.
Zaidi et al. “MRI-Guided Attenuation Correction in 3D Brain PET”, Neuroimage Human Brain Mapping 2002 Meeting, 16(2): Abstract 504, Jun. 2002.
Zhang et al. “An Innovative High Efficiency and High Resolution Probe for Prostate Imaging”, The Journal of Nuclear Medicine, 68: 18, 2000. Abstract.
Zhang et al. "Potential of a Compton Camera for High Performance Scintimammography", Physics in Medicine and Biology, XP020024019, 49(4): 617-638, Feb. 21, 2004.
Restriction Official Action dated Oct. 20, 2016 From the US Patent and Trademark Office Re. U.S. Appl. No. 14/100,082.
Notice of Allowance dated Jul. 19, 2018 From the US Patent and Trademark Office Re. U.S. Appl. No. 15/953,461. (312 pages).
Communication Pursuant to Article 94(3) EPC dated Mar. 8, 2010 From the European Patent Office Re. Application No. 06832278.3.
Communication Pursuant to Article 94(3) EPC dated Sep. 12, 2014 From the European Patent Office Re. Application No. 06832278.3.
Communication Pursuant to Article 94(3) EPC dated Sep. 16, 2013 From the European Patent Office Re.: Application No. 06832278.3.
Communication Pursuant to Article 94(3) EPC dated Sep. 17, 2012 From the European Patent Office Re. Application No. 06832278.3.
Communication Pursuant to Article 94(3) EPC dated Sep. 23, 2011 From the European Patent Office Re.: Application No. 06832278.3.
International Preliminary Report on Patentability dated Jan. 22, 2009 From the International Bureau of WIPO Re.: Application No. PCT/IL2006/001511.
International Search Report dated Jul. 11, 2008 From the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/01511.
Invitation to Pay Additional Fees dated Jul. 10, 2008 From the International Searching Authority Re.: Application No. PCT/IL06/01511.
Official Action dated Aug. 2, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Dec. 4, 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Jun. 12, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Sep. 12, 2011 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Dec. 16, 2013 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Jun. 2014 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Official Action dated Jan. 23, 2012 From the US Patent and Trademark Office Re. U.S. Appl. No. 12/087,150.
Supplementary Partial European Search Report and the European Search Opinion dated Dec. 15, 2009 From the European Patent Office Re.: Application No. 06832278.3.
Written Opinion dated Jul. 11, 2008 from the International Searching Authority of the Patent Cooperation Treaty Re.: Application No. PCT/IL06/01511.
Bacharach et al. "Attenuation Correction in Cardiac Positron Emission Tomography and Single-Photon Emission Computed Tomography", Journal of Nuclear Cardiology, 2(3): 246-255, 1995.
Bracco Diagnostics “Cardiotec®: Kit for the Preparation of Technetium Tc 99m Teboroxime. For Diagnostic Use”, Bracco Diagnostics Inc., Product Sheet, 2 P., Jul. 2003.
Bracco Diagnostics “Techneplex®: Kit for the Preparation of Technetium Tc 99m Pentetate Injection. Diagnostic—for Intravenous Use”, Bracco Diagnostics™ Product Sheet, 5 P., Jun. 1995.
Gilland et al. “A 3D Model of Non-Uniform Attenuation and Detector Response for Efficient Iterative Reconstruction in SPECT”, Physics in Medicine and Biology, XP002558623, 39(3): 547-561, Mar. 1994. p. 549-550, Section 2.3 ‘Active Voxel Reconstruction’, p. 551, Lines 4-8.
Gilland et al. “Simultaneous Reconstruction and Motion Estimation for Gated Cardiac ECT”, IEEE Transactions on Nuclear Science, XP011077797, 49(5): 2344-2349, Oct. 1, 2002. p. 2344, Section ‘Introduction’, First §.
Handrick et al. “Evaluation of Binning Strategies for Tissue Classification in Computed Tomography Images”, Medical Imaging 2006: Image Processing, Proceedings of the SPIE, 6144: 1476-1486, 2006.
Jin et al. “Reconstruction of Cardiac-Gated Dynamic SPECT Images”, IEEE International Conference on Image Processing 2005, ICIP 2005, Sep. 11-14, 2005, 3: 1-4, 2005.
Johnson et al. "Analysis and Reconstruction of Medical Images Using Prior Information", Lecture Notes in Statistics, Case Studies in Bayesian Statistics, II: 149-228, 1995.
Mallinckrodt “Kit for the Preparation of Technetium Tc 99m Sestamibi Injection”, Mallinckrodt Inc., Product Sheet, 2 P., Sep. 8, 2008.
Mallinckrodt “OctreoScan®: Kit for the Preparation of Indium In-111 Pentetreotide. Diagnostic—for Intravenous Use. Rx Only”, Mallinckrodt Inc., Product Sheet, 2 P., Oct. 25, 2006.
Ouyang et al. “Incorporation of Correlated Structural Images in PET Image Reconstruction”, IEEE Transactions of Medical Imaging, 13(4): 627-640, Dec. 1994.
Reutter et al. “Direct Least Squares Estimation of Spatiotemporal Distributions From Dynamic SPECT Projections Using a Spatial Segmentation and Temporal B-Splines”, IEEE Transactions on Medical Imaging, 19(5): 434-450, 2000.
Reutter et al. “Kinetic Parameter Estimation From Attenuated SPECT Projection Measurements”, IEEE Transactions on Nuclear Science, 45(6): 3007-3013, 1998.
Thorndyke et al. “Reducing Respiratory Motion Artifacts in Positron Emission Tomography Through Retrospective Stacking”, Medical Physics, 33(7): 2632-2641, Jul. 2006.
Uni Magdeburg “Attenuation Map”, University of Magdeburg, Germany, Retrieved From the Internet, Archived on Jul. 31, 2002.
Toennies et al. “Scatter Segmentation in Dynamic SPECT Images Using Principal Component Analysis”, Progress in Biomedical Optics and Imaging, 4(23): 507-516, 2003.
Zaidi et al. “Determination of the Attenuation Map in Emission Tomography”, Journal of Nuclear Medicine, 44(2): 291-315, 2003.
Related Publications (1)
Number Date Country
20170039738 A1 Feb 2017 US
Provisional Applications (33)
Number Date Country
60816970 Jun 2006 US
60800846 May 2006 US
60800845 May 2006 US
60799688 May 2006 US
60763458 Jan 2006 US
60754199 Dec 2005 US
60750597 Dec 2005 US
60750334 Dec 2005 US
60750287 Dec 2005 US
60741440 Dec 2005 US
60720652 Sep 2005 US
60720541 Sep 2005 US
60720034 Sep 2005 US
60702979 Jul 2005 US
60700753 Jul 2005 US
60700752 Jul 2005 US
60700318 Jul 2005 US
60700317 Jul 2005 US
60700299 Jul 2005 US
60691780 Jun 2005 US
60675892 Apr 2005 US
60648690 Feb 2005 US
60648385 Feb 2005 US
60640215 Jan 2005 US
60636088 Dec 2004 US
60635630 Dec 2004 US
60632515 Dec 2004 US
60632236 Dec 2004 US
60630561 Nov 2004 US
60628105 Nov 2004 US
60625971 Nov 2004 US
60575369 Jun 2004 US
60535830 Jan 2004 US
Continuations (8)
Number Date Country
Parent 12087150 US
Child 15294737 US
Parent 11607075 Dec 2006 US
Child 12087150 US
Parent PCT/IL2006/001291 Nov 2006 US
Child 11607075 US
Parent PCT/IL2006/000834 Jul 2006 US
Child PCT/IL2006/001291 US
Parent PCT/IL2006/000840 Jul 2006 US
Child PCT/IL2006/000834 US
Parent PCT/IL2006/000562 May 2006 US
Child PCT/IL2006/000840 US
Parent PCT/IL2006/000059 Jan 2006 US
Child PCT/IL2006/000562 US
Parent 12084559 US
Child PCT/IL2006/000059 US
Continuation in Parts (10)
Number Date Country
Parent PCT/IL2006/000840 Jul 2006 US
Child 12084559 US
Parent PCT/IL2006/000834 Jul 2006 US
Child PCT/IL2006/000840 US
Parent PCT/IL2006/000562 May 2006 US
Child PCT/IL2006/000834 US
Parent PCT/IL2006/000059 Jan 2006 US
Child PCT/IL2006/000562 US
Parent PCT/IL2005/001215 Nov 2005 US
Child PCT/IL2006/000059 US
Parent PCT/IL2005/001173 Nov 2005 US
Child PCT/IL2005/001215 US
Parent PCT/IL2005/000575 Jun 2005 US
Child PCT/IL2005/001173 US
Parent PCT/IL2005/000572 Jun 2005 US
Child PCT/IL2005/000575 US
Parent PCT/IL2005/000048 Jan 2005 US
Child PCT/IL2005/000572 US
Parent 11034007 Jan 2005 US
Child 12084559 US