The subject matter disclosed herein relates generally to systems and methods for guided interventional procedures.
Interventional procedures involve the insertion of a device into a patient to perform a medical task. For example, a needle or probe may be inserted into a patient to perform a biopsy or ablation. The path of the needle or probe may be selected to maintain the needle in a position that avoids impact on blood vessels, nerves, or organs. However, such positioning may be time consuming. For example, a number of computed tomography (CT) scans may be performed at different stages during the insertion of the needle or probe as part of a “guess and check” approach to determining needle position and path. Additionally, as the number of CT scans increases, so does the radiation dose received by the patient.
In one embodiment, a system is provided that includes an interventional device, a tracking device, an optical tracking system, at least one processor, and a display unit. The interventional device includes an insertion portion and an exterior portion. The insertion portion is configured for use inside of a patient to perform an interventional procedure. The tracking device is disposed proximate to the exterior portion of the interventional device. The optical tracking system is configured to cooperate with the tracking device to provide tracking imaging information corresponding to a location and orientation of the tracking device. At least one processor is configured to correlate the tracking imaging information with an anatomical image of the patient to provide a combined image during a medical task for which at least a portion of the insertion portion of the interventional device is disposed inside of the patient. The display unit is configured to display the combined image.
In another embodiment, a method is provided that includes inserting an interventional device into a patient. The interventional device includes an insertion portion and an exterior portion. The insertion portion is configured for use inside of a patient to perform an interventional procedure. The interventional device has associated therewith a tracking device that is disposed proximate to the exterior portion of the interventional device. The method also includes acquiring tracking imaging information with an optical tracking system in cooperation with the tracking device. The optical tracking information corresponds to a location and orientation of the tracking device. Further, the method includes correlating the tracking imaging information with an anatomical image of the patient to provide a combined image while the insertion portion of the interventional device is disposed inside of the patient. Also, the method includes displaying the combined image.
In another embodiment, a tangible and non-transitory computer readable medium is provided that includes one or more computer software modules configured to direct one or more processors to: acquire tracking imaging information with an optical tracking system, where the optical tracking information corresponds to a location and orientation of a tracking device, with the tracking device associated with an interventional device, wherein the interventional device comprises an insertion portion and an exterior portion, with the insertion portion configured for use inside of a patient to perform an interventional procedure; correlate the tracking imaging information with an anatomical image of the patient to provide a combined image while the insertion portion of the interventional device is disposed inside of the patient; and display the combined image.
The following detailed description of certain embodiments will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. For example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor or a block of random access memory, hard disk, or the like) or multiple pieces of hardware. Similarly, the programs may be stand alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. It should be understood that the various embodiments are not limited to the arrangements and instrumentality shown in the drawings. It should be further understood that the figures illustrate example embodiments of the present disclosure. Variations, such as replacing or modifying one or more functional blocks, are possible to achieve similar results.
As used herein, the terms “system,” “unit,” or “module” may include a hardware and/or software system that operates to perform one or more functions. For example, a module, unit, or system may include a computer processor, controller, or other logic-based device that performs operations based on instructions stored on a tangible and non-transitory computer readable storage medium, such as a computer memory. Alternatively, a module, unit, or system may include a hard-wired device that performs operations based on hard-wired logic of the device. Various modules or units shown in the attached figures may represent the hardware that operates based on software or hardwired instructions, the software that directs hardware to perform the operations, or a combination thereof. It may be noted that wireless devices and wireless data transmission may also be utilized.
“Systems,” “units,” or “modules” may include or represent hardware and associated instructions (e.g., software stored on a tangible and non-transitory computer readable storage medium, such as a computer hard drive, ROM, RAM, or the like) that perform one or more operations described herein. The hardware may include electronic circuits that include and/or are connected to one or more logic-based devices, such as microprocessors, processors, controllers, or the like. These devices may be off-the-shelf devices that are appropriately programmed or instructed to perform operations described herein from the instructions described above. Additionally or alternatively, one or more of these devices may be hard-wired with logic circuits to perform these operations. Again, wireless devices and wireless data transmission may also be utilized.
As used herein, an element or step recited in the singular and preceded with the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising” or “having” an element or a plurality of elements having a particular property may include additional elements not having that property.
Also as used herein, the phrase “reconstructing an image” is not intended to exclude embodiments in which data representing an image is generated, but a viewable image is not. As used herein, the term “image” broadly refers to both viewable images and data representing a viewable image. It may be noted that various embodiments generate, or are configured to generate, at least one viewable image.
Various embodiments provide systems and methods for tracking of interventional devices (e.g., biopsy needle, ablation probe) in 2D or 3D space using motion capture and/or virtual reality techniques. For example, ceiling-mounted cameras or laser systems may be employed. In some embodiments, a fiducial marker is placed on a patient or patient table, and used to calculate probe or needle position relative to the patient on a continuous, ongoing basis as the probe or needle position changes. The determined position may then be used to display the current or updated needle position as an overlay on a screen or other display along with anatomical information of the patient (e.g., anatomical information acquired during an X-ray, CT, MR, or ultrasound imaging scan). In some embodiments, a practitioner (e.g., surgeon or interventional radiologist) may plan an optimal or preferred probe or needle path before beginning an interventional procedure, with the display providing guidance along the optimal or preferred path during the interventional procedure, as well as displaying when the probe or needle is in the correct position. Accordingly, various embodiments save time and reduce radiation dose (e.g., from X-ray or CT scans) relative to conventional “guess and check” approaches. Further, various embodiments provide for more complex needle or probe insertion paths relative to conventional approaches that are limited to axial or near-axial slices due to limitations in patient dose and/or image reconstruction time.
Various embodiments provide improved tracking of devices used in connection with interventional procedures. A technical effect of at least one embodiment includes reduction of time required for interventional procedures. A technical effect of at least one embodiment includes reduction of dosage for imaging in connection with interventional procedures. A technical effect of at least one embodiment includes improved ease and accuracy of directing an interventional device inside a patient.
As seen in FIG. 1, the system 100 includes a CT acquisition unit 110, a processing unit 120, an interventional device 130, a tracking device 140, an optical tracking system 150, a display unit 160, and a fiducial marker 170.
It may be noted that in various embodiments, the anatomical image may be understood as static, and generated using imaging information acquired before insertion of the interventional device 130 into the patient 102, whereas the tracking imaging information may be understood as dynamic, and updated to provide live needle or probe tracking as the interventional device 130 is advanced toward a target or desired location within the patient 102. It may be noted that various embodiments may include additional components, or may not include all of the components shown in FIG. 1.
The CT acquisition unit 110 is configured to acquire CT imaging information of the patient 102 (or a portion thereof), and the processing unit 120 is configured to reconstruct or generate an anatomical image using the CT imaging information acquired via the CT acquisition unit 110. It may be noted that, while a CT acquisition unit 110 is depicted in the illustrated embodiment, in other embodiments the component represented by the acquisition unit 110 may utilize one or more other imaging modalities alternatively or additionally. For example, an ultrasound imaging acquisition unit may be employed. As another example, a multi-modality acquisition unit may be employed. For example, both ultrasound and CT imaging information may be acquired. In some embodiments, ultrasound may be used for a first portion of the patient 102 and CT used for a second portion of the patient 102. The first and second portions may overlap or may not overlap in different embodiments.
The depicted CT acquisition unit 110 includes an X-ray source 112 and a CT detector 114. (For additional information regarding example CT systems, see FIG. 9 and the related discussion herein.)
Generally, X-rays from the X-ray source 112 may be guided to an object to be imaged (e.g., patient 102) through a source collimator and bowtie filter. The object to be imaged in various embodiments is a human patient, or a portion thereof (e.g., portion of patient 102 associated with an interventional procedure to be performed). The source collimator may be configured to allow X-rays within a desired field of view (FOV) to pass through to the object to be imaged while blocking other X-rays. The bowtie filter may be configured to absorb radiation from the X-ray source 112 to control distribution of X-rays passed to the object to be imaged.
X-rays that pass through the object to be imaged are attenuated by the patient 102 and received by the CT detector 114 (which may have a detector collimator associated therewith), which detects the attenuated X-rays and provides imaging information to the processing unit 120. The processing unit 120 may then reconstruct an image of the scanned portion of the patient 102 using the imaging information (or projection information) provided by the CT detector 114.
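By way of illustration only, the relationship between the detected X-rays and the projection information referenced above can be expressed with the Beer-Lambert law: the line integral of attenuation along a ray is the negative logarithm of the ratio of detected to unattenuated intensity. The following is a minimal sketch; the intensity values are assumed placeholders, and the disclosure does not prescribe any particular computation:

```python
import numpy as np

# Illustrative sketch (not from the disclosure): Beer-Lambert relation between
# detected intensity and the projection value for a single ray.
I0 = 1.0e6   # assumed unattenuated beam intensity at a detector element
I = 2.5e5    # assumed intensity measured after attenuation by the patient

# Line integral of the attenuation coefficient along the ray:
projection_value = -np.log(I / I0)   # ~1.386 for these placeholder values
```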
In the illustrated embodiment, the X-ray source 112 is configured to rotate about an object being imaged. For example, the X-ray source 112 and the CT detector 114 may be positioned about a bore 118 of the gantry 116 and rotated about the patient 102 prior to performance of an interventional procedure. For example, the patient 102 may be positioned on a table 115 outside of the bore 118. Then, the table 115 and patient 102 may be advanced into the bore 118 to acquire CT imaging information. Once the acquisition is complete, the table 115 and patient 102 may be retracted from the bore 118, and the interventional procedure performed with the patient 102 outside of the bore 118. The CT imaging information may be collected as a series of views that together make up a rotation or portion thereof. Each view or projection may have a view duration during which information (e.g., counts) is collected for the particular view. The view duration for a particular view defines a CT information acquisition period for that particular view. For example, each rotation may be made up of about 1000 views or projections, with each view or projection having a duration or length of about 1/1000 of a complete rotation. The X-ray source may be turned on and off to control the acquisition time. For example, to perform an imaging scan of a complete rotation, the X-ray source may be turned on at a particular rotational position of the gantry and turned off when the X-ray source returns to the particular rotational position after a complete rotation. It may be noted that in the illustrated embodiment, the CT acquisition unit 110 is depicted in proximity to other components of the system 100. However, in other embodiments, the anatomical image may be generated from an acquisition unit (e.g., CT and/or ultrasound) that is remotely located away from a location at which the interventional procedure is performed, and may be understood as not forming a part of system 100 in various embodiments.
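As a worked example of the view timing described above, using the passage's figure of about 1000 views per rotation and an assumed 0.5 s rotation period (an illustrative value not taken from the disclosure):

```python
# Worked example of per-view timing; the rotation period is an assumption.
rotation_period_s = 0.5       # assumed gantry rotation time
views_per_rotation = 1000     # example figure from the passage

view_duration_s = rotation_period_s / views_per_rotation   # acquisition period per view
angle_per_view_deg = 360.0 / views_per_rotation            # angular increment per view

print(f"view duration: {view_duration_s * 1e3:.2f} ms")    # 0.50 ms
print(f"angle per view: {angle_per_view_deg:.2f} deg")     # 0.36 deg
```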
The interventional device 130 is configured for use inside of the patient 102 for performance of an interventional procedure. As one example, the interventional device 130 may be configured as a needle with a hollow conduit configured for extraction of a sample during a biopsy procedure. As another example, the interventional device 130 may be configured as a probe that delivers energy to a desired location to perform an ablation. The depicted interventional device 130 includes an insertion portion 132 and an exterior portion 134. The insertion portion 132 is generally configured to be used inside of the patient 102. Generally, at least a part of the insertion portion 132 will be disposed inside of the patient 102 during performance of an interventional procedure. The exterior portion 134 is configured for use outside of the patient 102. For example, the exterior portion 134 may include a handle for manipulating the interventional device.
The tracking device 140 is disposed proximate to the exterior portion 134 of the interventional device 130. For example, the tracking device 140 may be part of a handle of the exterior portion 134, or attached to a handle of the exterior portion 134. As another example, the tracking device 140 may be affixed to the exterior portion 134 of the interventional device 130 (e.g., via adhesive, or via one or more fasteners). In other embodiments, the tracking device 140 may be disposed near, but not on, the interventional device 130. For example, the tracking device 140 may be affixed to a portion of the insertion portion 132 that is adjacent to or near the exterior portion 134 (e.g., a portion of the insertion portion 132 that is not inserted into the patient 102, and remains outside of the patient 102 during performance of an interventional procedure). Generally, the tracking device 140 is configured as or includes one or more tracking features (e.g., IR sensor chip, reflector) that may be used to determine position and orientation of the tracking device 140 (and, accordingly, of the interventional device 130 associated with the tracking device 140). It may be noted that the tracking device 140 may be integrally formed with the interventional device 130, or may be removable from the interventional device 130. Further, in embodiments where the tracking device has electronics or circuitry associated therewith, the electronics or circuitry may be mounted to the interventional device 130 or may be located off-board of the interventional device 130 (e.g., coupled to the tracking device 140 via a wire or cable that may be connected and disconnected). Locating electronics or circuitry off-board of the interventional device 130 in various embodiments reduces weight and improves ease of handling the interventional device 130, and also allows the electronics or circuitry to be removed if the interventional device 130 is to be left in the patient 102 for an imaging procedure (e.g., within the bore 118 of the CT acquisition unit 110).
In some embodiments, the tracking device 140 is integrally formed with the interventional device 130.
In some embodiments, the tracking device 140 includes an adaptor that is configured to be removably attached to the interventional device 130, allowing the tracking device 140 and the interventional device 130 to be separable.
With continued reference to FIG. 1, the optical tracking system 150 is configured to cooperate with the tracking device 140 to provide tracking imaging information corresponding to a location and orientation of the tracking device 140 (and, accordingly, of the interventional device 130 with which the tracking device 140 is associated).
It may be noted that the particular component (or components) of the system 100 that acquires the tracking imaging information may vary in different embodiments. In some embodiments, the optical tracking system 150 may provide signals that are detected by the tracking device 140. For example, the tracking device 140 in various embodiments uses IR sensors on the tracking device 140 that receive IR signals from the optical tracking system 150, and measures a time difference between a synchronization pulse and signals received at each IR sensor, with the timing information used to determine the position and orientation of the tracking device 140 (and the interventional device 130). In some embodiments, the optical tracking system 150 may provide signals that are reflected off the tracking device 140 (or a portion thereof) using reflectors, with the optical tracking system 150 detecting the reflected signals. The reflected signals are then used to determine the position and orientation of the tracking device 140 (and the interventional device 130). It may further be noted that the position and orientation of the tracking device 140 (and the interventional device 130) in some embodiments, for example, may be determined by the tracking device 140 and/or the optical tracking system 150, with the determined position and orientation information then provided to the processing unit 120. In other embodiments, raw data (such as timing information) may be provided from the optical tracking system 150 and/or the tracking device 140 to processing unit 120, with the position and orientation of the tracking device 140 (and interventional device 130) determined by the processing unit 120. It may be noted that the optical tracking system 150 in various embodiments may include cameras.
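By way of illustration only, the timing-based determination described above may be sketched as a swept-beam scheme, in which the delay between the synchronization pulse and each sensor hit encodes the beam angle at that sensor. The sweep period, hit times, and sensor names below are assumptions, and the disclosure does not tie the optical tracking system 150 to this particular scheme:

```python
import numpy as np

# Illustrative sketch of the timing-to-angle step for a swept-beam optical
# tracker; all values and names here are assumptions, not from the disclosure.
SWEEP_PERIOD_S = 1.0 / 60.0   # assumed: the emitter sweeps the room 60x per second

def sweep_angle(t_sync: float, t_hit: float) -> float:
    """Angle of the rotating beam when it struck a sensor, from the time
    elapsed between the synchronization pulse and the sensor hit."""
    return 2.0 * np.pi * (t_hit - t_sync) / SWEEP_PERIOD_S

# One angle per IR sensor on the tracking device; with the sensors' known
# positions in the device frame, the device pose can then be recovered by
# solving a small nonlinear least-squares problem (not shown here).
t_sync = 0.000000
hits = {"sensor_0": 0.004321, "sensor_1": 0.004398, "sensor_2": 0.004455}
angles = {name: sweep_angle(t_sync, t) for name, t in hits.items()}
for name, a in angles.items():
    print(f"{name}: {np.degrees(a):.2f} deg")
```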
Generally, the processing unit 120 in various embodiments correlates the tracking imaging information acquired via the optical tracking system 150 (and tracking device 140) with an anatomical image of the patient 102 to provide a combined image. In various embodiments, the processing unit 120 correlates the tracking imaging information using a fiducial marker 170 that is disposed at a predetermined position relative to the patient 102. In the illustrated embodiment, the fiducial marker 170 is disposed on the patient 102 at a predetermined, measured, or otherwise known location. In other embodiments, the fiducial marker may be placed on a support table or other structure at a known position relative to the patient. Generally, the location of the fiducial marker 170 is used to register the tracking imaging information with the anatomical information. For example, in some embodiments, the fiducial marker 170 may include a feature or tag that is visible in both the anatomical image and the tracking imaging information. In other embodiments, the fiducial marker 170 may be visible in the tracking imaging information (e.g., including a tracking device or feature, or otherwise visible to the optical tracking system 150), and placed at a known position relative to a landmark present in the anatomical image. Multiple fiducial markers may be employed for redundancy and/or improved accuracy of registration of the tracking imaging information to the anatomical image.
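By way of illustration only, registering the tracking imaging information to the anatomical image from corresponding fiducial positions may be sketched as a rigid point-set registration (the Kabsch / orthogonal Procrustes method). The disclosure does not prescribe this particular algorithm, and the coordinates below are made-up placeholders:

```python
import numpy as np

# Illustrative sketch: rigid registration of the optical-tracking frame to the
# anatomical image frame from corresponding fiducial positions.

def rigid_register(pts_track: np.ndarray, pts_image: np.ndarray):
    """Least-squares rotation R and translation t with pts_image ~= R @ pts_track + t."""
    c_t, c_i = pts_track.mean(axis=0), pts_image.mean(axis=0)
    H = (pts_track - c_t).T @ (pts_image - c_i)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_i - R @ c_t
    return R, t

# Three or more non-collinear fiducials fix the transform; extras add the
# redundancy and accuracy noted above. Placeholder coordinates in mm:
fid_track = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
fid_image = np.array([[5.0, 2.0, 1.0], [5.0, 12.0, 1.0], [-5.0, 2.0, 1.0]])
R, t = rigid_register(fid_track, fid_image)
```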
In some embodiments, the fiducial marker 170 includes a ring or annular structure.
With continued reference to FIG. 1, generally speaking, in various embodiments, the anatomical image is acquired, provided, or produced before insertion of the interventional device 130 into the patient 102. For example, the processing unit 120 may reconstruct or generate the anatomical image using imaging information (e.g., CT imaging information, or ultrasound imaging information, among others) acquired with an imaging system or unit. In the illustrated embodiment, the processing unit 120 acquires CT imaging information from the acquisition unit 110 and reconstructs the anatomical image using the CT imaging information. For example, the CT imaging information may be acquired in projection space and transformed into a reconstructed image in image space.
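By way of illustration only, the projection-space-to-image-space transformation may be sketched with filtered back-projection on a synthetic phantom. This stands in for whatever reconstruction the processing unit 120 actually performs (the disclosure does not specify an algorithm), and it assumes the scikit-image library is available:

```python
import numpy as np
from skimage.transform import radon, iradon

# Illustrative sketch: simulate projection data for a simple phantom, then
# reconstruct an image from projection space via filtered back-projection.
phantom = np.zeros((128, 128))
phantom[48:80, 48:80] = 1.0                       # simple square "anatomy"

angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)            # simulated projection-space data
anatomical_image = iradon(sinogram, theta=angles, filter_name="ramp")
```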
In various embodiments, the processing unit 120 modifies the anatomical image or provides additional or supplemental information. For example, in some embodiments, the anatomical image is modified, or the combined image is modified, so that the combined image includes a marked entry point and/or marked path corresponding to a desired route for the insertion portion. For example, the anatomical image may be displayed to a practitioner via the display unit 160 or other display. Then, based on the internal structures depicted in the anatomical image and the desired end position of the interventional device 130, the practitioner may provide a desired path (and/or entry point) for the interventional device 130 to arrive at a target destination while avoiding internal structures such as blood vessels or organs. Then, during insertion of the interventional device 130, the desired path may be displayed along with the anatomical image and tracking imaging information to allow a live comparison of the actual measured or determined position of the interventional device 130 with the desired path. This comparison may be used to guide insertion of the interventional device 130 without requiring CT scans at intermediate steps between insertion and the final, desired end position for performance of the interventional procedure.
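By way of illustration only, such a live comparison may be sketched as follows, modeling the marked path as a straight segment from entry point to target; the deviation metric and all coordinates are illustrative assumptions, not values or methods from the disclosure:

```python
import numpy as np

# Illustrative sketch: deviation of the live tip position from a planned
# entry-to-target segment, plus fractional progress along the path.

def deviation_from_path(tip: np.ndarray, entry: np.ndarray, target: np.ndarray):
    """Perpendicular distance from the tip to the planned segment, and the
    fraction of the path covered (0 at entry, 1 at target)."""
    axis = target - entry
    length = np.linalg.norm(axis)
    u = axis / length
    along = np.dot(tip - entry, u)                 # progress along the path
    closest = entry + np.clip(along, 0.0, length) * u
    return np.linalg.norm(tip - closest), along / length

# Placeholder coordinates in the registered anatomical-image frame (mm):
dist_mm, progress = deviation_from_path(
    tip=np.array([12.0, 4.5, 30.0]),
    entry=np.array([10.0, 5.0, 0.0]),
    target=np.array([15.0, 3.0, 60.0]),
)
```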
It may be noted that in various embodiments, the desired path may be displayed in 2 and/or 3 dimensions. For example, in some embodiments, the display unit 160 (e.g., under control of the processing unit 120) is configured to show plural views of the marked path.
Returning to FIG. 1, in the illustrated embodiment, the processing unit 120 includes a memory 122. The memory 122 may include one or more computer readable storage media. For example, the process flows and/or flowcharts discussed herein (or aspects thereof) may represent one or more sets of instructions that are stored in the memory 122 for direction of operations of the processing unit 120 and system 100.
The display unit 160 is configured to provide information (e.g., a combined image as discussed herein) to a user. The display unit 160 may include one or more of a screen, a touchscreen, a printer, or the like. In some embodiments, the display unit 160 may include a wearable headset, for example for use with a VR or AR display.
As discussed herein, in various embodiments, plural optical signal sources and/or detectors may be utilized. FIG. 8 illustrates a flowchart of a method 800 for performing a guided interventional procedure in accordance with various embodiments. The method 800 may be performed, for example, using structures or aspects of various embodiments discussed herein (e.g., the system 100).
At 802, a patient is positioned (e.g., on a bed, table, or other support). The patient may also be prepared for an interventional procedure. For example, the patient may be sedated. In the illustrated embodiment, the bed, table, or support is located proximate to and configured for use with an imaging system (e.g., CT imaging system), but it may be noted that in other embodiments imaging information may be acquired at a remote location. At 804, a fiducial marker (e.g., fiducial marker 170) is placed. The fiducial marker in various embodiments may be placed directly on the patient, or in other embodiments may be placed on the table or other structure at a known position relative to the patient (e.g., relative to the portion of the patient being imaged). The fiducial marker is placed for later use in registering imaging information corresponding to internal anatomy of the patient with tracking information corresponding to the position and orientation of an interventional device.
At 806, the patient and support are advanced into the bore of a CT acquisition unit that includes an X-ray source and a detector. It may be noted that in alternate embodiments, a different modality (e.g., ultrasound) may be used additionally or alternatively. At 808, CT imaging information is acquired via the CT acquisition unit. At 810, an anatomical image is generated or reconstructed using the CT imaging information. At 812, the patient and support are retracted out of the bore.
At 814, an entry point and/or marked path are added to the anatomical image. The marked path corresponds to a desired route for an insertion portion of an interventional device within the patient. For example, a practitioner viewing the anatomical image may provide inputs which are used to add the marked path to the anatomical image. It may be noted that this step may be performed using the same system or a separate system.
At 816, tracking imaging information is acquired with an optical tracking system (e.g., optical tracking system 150). The tracking information corresponds to a location and orientation of the tracking device. With the tracking device at a known spatial relationship with respect to the interventional device, and the tracking device's position and orientation determined using the tracking imaging information acquired in cooperation with the optical tracking system, the position and orientation of the interventional device may be determined. It may be noted that the tracking information may be acquired on an ongoing basis during insertion of the interventional device, with the determined position and orientation of the interventional device updated on an ongoing basis. It may be noted that the tracking imaging information may be initially acquired while the interventional device is outside of the patient. For example, a combined image as discussed herein may be provided before insertion of the interventional device to help guide insertion at a desired point of insertion and at a desired direction or angle of insertion.
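By way of illustration only, because the tracking device sits on the exterior portion, the position of the (untracked) insertion tip may be inferred from the tracked pose plus a fixed, known offset in the device frame. The pose and offset values below are placeholders, not values from the disclosure:

```python
import numpy as np

# Illustrative sketch: map the tip's device-frame coordinates into the
# tracking frame using the tracked pose (rotation R, translation t).

def tip_position(R: np.ndarray, t: np.ndarray, tip_offset: np.ndarray) -> np.ndarray:
    """World-frame tip position from the tracked pose and a rigid offset."""
    return R @ tip_offset + t

R = np.eye(3)                                # tracked orientation (rotation matrix)
t = np.array([100.0, 50.0, 20.0])            # tracked location (mm)
tip_offset = np.array([0.0, 0.0, -150.0])    # assumed: tip 150 mm from the handle
tip = tip_position(R, t, tip_offset)         # re-evaluated as new poses stream in
```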
At 818, an interventional device (e.g., interventional device 130) is inserted into the patient. For example, an incision may be made at a desired location, and the interventional device inserted into the incision. The interventional device may be configured, for example, for use during a biopsy or ablation. It may be noted that the initial insertion may be quite shallow, with further insertion performed using tracking information as discussed herein. The interventional device includes an insertion portion configured for use inside of a patient (e.g., a needle or probe), and an exterior portion. A tracking device (e.g., tracking device 140) including at least one tracking feature (e.g., IR sensor, reflector) is associated with the interventional device and disposed proximate to the exterior portion of the interventional device. For example, tracking features may be integrally formed with a handle of the interventional device. As another example, an adaptor with one or more tracking features may be removably secured to the interventional device.
At 820, the tracking imaging information is correlated with the anatomical image to provide a combined image. The tracking imaging information and the anatomical image may be correlated using one or more fiducial markers as discussed herein. At 822, the combined image is displayed. The combined image may be displayed on a screen, for example, or, as another example, to a practitioner wearing a headset. The combined image in various embodiments shows the position and orientation of the interventional device overlaid with or relative to the anatomical information. Accordingly, a practitioner viewing the combined image is provided with a visual representation of the interventional device within the interior of the patient, along with surrounding or neighboring anatomical structures. The combined image may also show additional information, such as background or environmental information depicting a surrounding environment, supplemental information including a desired path for the interventional device and/or a projected path of the interventional device based on its current position and orientation, historical information including patient data, or instructional information regarding performance of a given interventional procedure. With the tracking information acquired on an ongoing basis and the position and orientation of the interventional device updated on an ongoing basis, the displayed position of the interventional device may also be updated.
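By way of illustration only, one simple way to render such a combined image is to draw the registered device position and marked path over a grayscale anatomical slice; the layout and data below are illustrative assumptions, not a prescribed display format:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative sketch: anatomy as a grayscale slice with the registered tip
# position and planned path drawn on top. All data are placeholders.
anatomy = np.random.default_rng(0).normal(size=(128, 128))   # stand-in slice
entry, target = (20, 30), (100, 90)                          # pixel coords
tip = (55, 58)                                               # current tracked tip

fig, ax = plt.subplots()
ax.imshow(anatomy, cmap="gray")
ax.plot([entry[0], target[0]], [entry[1], target[1]], "--", label="planned path")
ax.plot(*tip, "ro", label="tracked tip")
ax.legend()
plt.show()
```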
At 824, the interventional device is guided further into the patient using the combined display. As the position and orientation of the interventional device is updated generally continuously or on an ongoing basis, the combined display provides visual feedback for the changing position and orientation of the interventional device as it is further introduced into the patient. For example, the progress of the interventional device may be tracked relative to internal structures, with appropriate adjustments made to avoid the internal structures as the interventional device is advanced. As another example, the progress of the interventional device may be tracked relative to a desired path and/or target location, with appropriate adjustments made to match the actual path of the interventional device, as measured or determined, to the desired path and/or target location.
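By way of illustration only, one feedback cue such a display might surface during advancement is the angle between the needle's tracked axis and the planned path direction; the disclosure does not specify this metric, and the vectors below are placeholders:

```python
import numpy as np

# Illustrative guidance cue (an assumption, not from the disclosure): angle
# between the tracked needle axis and the planned path direction.

def angle_to_path_deg(needle_dir: np.ndarray, path_dir: np.ndarray) -> float:
    cosang = np.dot(needle_dir, path_dir) / (
        np.linalg.norm(needle_dir) * np.linalg.norm(path_dir))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Placeholder direction vectors in the registered anatomical-image frame:
print(angle_to_path_deg(np.array([0.05, -0.02, 1.0]), np.array([0.0, 0.0, 1.0])))
```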
At 826, with the interventional device advanced to the target location (e.g., as determined using the combined display), the position of the interventional device may be confirmed. For example, the patient, with the interventional device still in place, may be imaged to provide an updated anatomical image. In the illustrated embodiment, the patient may be re-advanced into the bore of the CT acquisition unit, CT imaging information acquired, and a CT image reconstructed showing the position of the interventional device within the patient. If the interventional device is in the desired position, then an interventional procedure may be performed. However, if the interventional device is not in the desired position, the position may be adjusted and re-confirmed until the desired position is achieved. It may be noted that in some embodiments the tracking device and/or associated electronics or circuitry may be detached from the interventional device before the patient is advanced into the CT bore, or before the imaging scan to check the position of the interventional device is performed.
At 828, the interventional procedure is performed. With the interventional device in the desired position, it may be used to perform an interventional procedure, such as an ablation or, as another example, a biopsy. After the interventional procedure is complete, the interventional device may be withdrawn from the patient.
Various methods and/or systems (and/or aspects thereof) described herein may be implemented using a medical imaging system. For example, FIG. 9 provides a schematic diagram of a CT imaging system 900 in accordance with various embodiments.
The CT imaging system 900 includes a gantry 910 that has the X-ray source 912 that projects a beam of X-rays toward the detector array 914 on the opposite side of the gantry 910. A source collimator 913 and a bowtie filter are provided proximate the X-ray source 912. In various embodiments, the source collimator 913 may be configured to provide wide collimation as discussed herein. The detector array 914 includes a plurality of detector elements 916 that are arranged in rows and channels that together sense the projected X-rays that pass through a subject 917. The imaging system 900 also includes a computer 918 that receives the projection data from the detector array 914 and processes the projection data to reconstruct an image of the subject 917. The computer 918, for example, may include one or more aspects of the processing unit 120, or be operably coupled to one or more aspects of the processing unit 120. In operation, operator supplied commands and parameters are used by the computer 918 to provide control signals and information to reposition a motorized table 922. More specifically, the motorized table 922 is utilized to move the subject 917 into and out of the gantry 910. Particularly, the table 922 moves at least a portion of the subject 917 through a gantry opening (not shown) that extends through the gantry 910. Further, the table 922 may be used to move the subject 917 vertically within the bore of the gantry 910.
The depicted detector array 914 includes a plurality of detector elements 916. Each detector element 916 produces an electrical signal, or output, that represents the intensity of an impinging X-ray beam and hence allows estimation of the attenuation of the beam as it passes through the subject 917. During a scan to acquire the X-ray projection data, the gantry 910 and the components mounted thereon rotate about a center of rotation 940.
Rotation of the gantry 910 and the operation of the X-ray source 912 are governed by a control mechanism 942. The control mechanism 942 includes an X-ray controller 944 that provides power and timing signals to the X-ray source 912 and a gantry motor controller 946 that controls the rotational speed and position of the gantry 910. A data acquisition system (DAS) 948 in the control mechanism 942 samples analog data from the detector elements 916 and converts the data to digital signals for subsequent processing. An image reconstructor 950 receives the sampled and digitized X-ray data from the DAS 948 and performs high-speed image reconstruction. The reconstructed images are input to the computer 918, which stores the images in a storage device 952. The computer 918 may also receive commands and scanning parameters from an operator via a console 960 that has a keyboard. An associated visual display unit 962 allows the operator to observe the reconstructed image and other data from the computer 918. It may be noted that one or more of the computer 918, controllers, or the like may be incorporated as part of a processing unit such as the processing unit 120 discussed herein.
The operator supplied commands and parameters are used by the computer 918 to provide control signals and information to the DAS 948, the X-ray controller 944 and the gantry motor controller 946. In addition, the computer 918 operates a table motor controller 964 that controls the motorized table 922 to position the subject 917 in the gantry 910. Particularly, the table 922 moves at least a portion of the subject 917 through the gantry opening.
In various embodiments, the computer 918 includes a device 970, for example, a CD-ROM drive, DVD drive, magnetic optical disk (MOD) device, or any other digital device, including a network connecting device such as an Ethernet device, for reading instructions and/or data from a tangible non-transitory computer-readable medium 972 (which excludes signals), such as a CD-ROM, a DVD, or another digital source such as a network or the Internet, as well as yet-to-be-developed digital means. In another embodiment, the computer 918 executes instructions stored in firmware (not shown). The computer 918 is programmed to perform functions described herein; as used herein, the term computer is not limited to just those integrated circuits referred to in the art as computers, but broadly refers to computers, processors, microcontrollers, microcomputers, programmable logic controllers, application specific integrated circuits, and other programmable circuits, and these terms are used interchangeably herein.
In the exemplary embodiment, the X-ray source 912 and the detector array 914 are rotated with the gantry 910 within the imaging plane and around the subject 917 to be imaged such that the angle at which an X-ray beam 974 intersects the subject 917 constantly changes. A group of X-ray attenuation measurements, i.e., projection data, from the detector array 914 at one gantry angle is referred to as a “view” or “projection.” A “scan” of the subject 917 comprises a set of views made at different gantry angles, or view angles, during one or more revolutions of the X-ray source 912 and the detector array 914. In a CT scan, the projection data is processed to reconstruct an image that corresponds to a three-dimensional volume taken of the subject 917. It may be noted that, in some embodiments, an image may be reconstructed using less than a full revolution of data. For example, with a multi-source system, substantially less than a full rotation may be utilized. Thus, in some embodiments, a scan (or slab) corresponding to a 360 degree view may be obtained using less than a complete revolution.
It should be noted that the various embodiments may be implemented in hardware, software or a combination thereof. The various embodiments and/or components, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or processors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus. The computer or processor may also include a memory. The memory may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a solid-state drive, optical disk drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
As used herein, the term “computer” or “module” may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), application specific integrated circuits (ASICs), logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of the term “computer”.
The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
The set of instructions may include various commands that instruct the computer or processor as a processing machine to perform specific operations such as the methods and processes of the various embodiments. The set of instructions may be in the form of a software program. The software may be in various forms such as system software or application software and which may be embodied as a tangible and non-transitory computer readable medium. Further, the software may be in the form of a collection of separate programs or modules, a program module within a larger program or a portion of a program module. The software also may include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to operator commands, or in response to results of previous processing, or in response to a request made by another processing machine.
As used herein, a structure, limitation, or element that is “configured to” perform a task or operation is particularly structurally formed, constructed, or adapted in a manner corresponding to the task or operation. For purposes of clarity and the avoidance of doubt, an object that is merely capable of being modified to perform the task or operation is not “configured to” perform the task or operation as used herein. Instead, the use of “configured to” as used herein denotes structural adaptations or characteristics, and denotes structural requirements of any structure, limitation, or element that is described as being “configured to” perform the task or operation. For example, a processing unit, processor, or computer that is “configured to” perform a task or operation may be understood as being particularly structured to perform the task or operation (e.g., having one or more programs or instructions stored thereon or used in conjunction therewith tailored or intended to perform the task or operation, and/or having an arrangement of processing circuitry tailored or intended to perform the task or operation). For the purposes of clarity and the avoidance of doubt, a general purpose computer (which may become “configured to” perform the task or operation if appropriately programmed) is not “configured to” perform a task or operation unless or until specifically programmed or structurally modified to perform the task or operation.
As used herein, the terms “software” and “firmware” are interchangeable, and include any computer program stored in memory for execution by a computer, including RAM memory, ROM memory, EPROM memory, EEPROM memory, and non-volatile RAM (NVRAM) memory. The above memory types are exemplary only, and are thus not limiting as to the types of memory usable for storage of a computer program.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the various embodiments without departing from their scope. While the dimensions and types of materials described herein are intended to define the parameters of the various embodiments, they are by no means limiting and are merely exemplary. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the various embodiments should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112(f) unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
This written description uses examples to disclose the various embodiments, including the best mode, and also to enable any person skilled in the art to practice the various embodiments, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the various embodiments is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if the examples have structural elements that do not differ from the literal language of the claims, or the examples include equivalent structural elements with insubstantial differences from the literal language of the claims.