It is sometimes necessary to join two images together to make a larger, composite image. These images typically have some overlapping features in common which must be aligned properly to create the larger image. Typically, in-plane translational offsets must be calculated to align the images. The problem is compounded when dealing with 3D volume data. For example, to diagnose and analyze scoliosis using MRI, separate 3D volume data sets of the upper and lower spine are acquired in order to achieve the necessary field-of-view while preserving necessary image quality and detail. In this case, both in-plane and out-of-plane translational offsets must be computed to align the volume data.
Presently used techniques register two images from differing modalities (e.g., MRI and CT) for the purpose of fusing the data. Both modalities cover the same volume of anatomy and several control points are used to identify common features. Having to use two different modalities with several control points increases the complexity of these techniques.
Of utility, then, are methods and systems that reduce the complexity of prior art systems and processes for joining volume image data from a single modality.
In one aspect, the present invention provides one or more processes and a user interface that allows for 3D alignment of volume image data. This aspect of the invention is used to join together images from the same modality for the purpose of generating a composite image with a larger field-of-view but with the resolution and quality of individual scans.
In another aspect, the present invention preferably requires only one control point placed on a single image from each data set. Given this single data point, 3D volume data is combined to form a larger 3D volume. For example, where one set of image data is displayed on screen and shows a common area of anatomy with a second set of image data, a control point within the common area is selected. Once that selection is made, a search is done through the volumetric overlapping areas. The two image data sets are joined based on minimization of an error function generated from the area surrounding the control point in both volumetric data sets. The area surrounding the control point is referred to as the control point's neighborhood. The neighborhood can be in-plane (two dimensional) or it can include neighboring slices (three dimensional). In the case of two dimensions it may be rectangular and in the case of three dimensions it may be a rectangular prism (rectangular parallelepiped). The size of the neighborhood is chosen to encompass distinguishing features in the images. In the case of MRI spinal images the neighborhood is chosen to be slightly larger than the dimensions of lumbar vertebrae and adjacent discs. The similarity of the pixel intensities of the neighborhoods of an upper control point and a lower control point is determined by computing the error function. The two neighborhoods are then offset from each other in two (or three) dimensions and the error function is re-computed. This process is continued for other offsets, the limits of which are predetermined. Those offsets that produce the minimum error function are chosen as the best overlapping match and the two image volumes are joined using them.
In other embodiments, more than one data point may be used to form a larger 3D volume. More specifically, two 3D volume data sets may be combined into one larger volume; e.g., in the case of MRI diagnosis and analysis of scoliosis, 3D image data of the upper and lower spine are acquired. These two image sets are loaded into the “joining” software program. The center slice of each set is displayed, and the images may be adjusted in brightness and contrast using controls provided through a computer interface, e.g., a mouse, stylus, touch screen, etc. If these images are not desired, other images may be selected by computer control, e.g., scroll bars, either in sync or independently. The images are displayed above and below each other vertically. If their order is incorrect, they may be swapped.
In another aspect of the present invention an apparatus is provided. The apparatus comprises a memory containing executable instructions and a processor. The processor is preferably programmed using the instructions to receive pixel data associated with a first magnetic resonance image; receive pixel data associated with a second magnetic resonance image; and detect a common area between the first and second images. Further in accordance with this aspect of the present invention, the processor is also programmed to combine the images together at one or more places along the common area by processing the first and second image data using the square of the normalized intensity difference between at least one group of pixels in the first image data and another group of pixels in the second image data, and to display the combined image.
Further still in accordance with this aspect of the present invention, the processor is further programmed to display data indicating the offset between the first and second magnetic resonance images in forming the combined image. In addition, in detecting the common area the processor is programmed to search volumetric data associated with the first and second magnetic resonance images based on one or more control points selected in the range.
Further in accordance with this aspect of the present invention, the processor is further programmed to display at least two views of the combined image with one view being orthogonal to the other.
Further still in accordance with this aspect of the present invention, the processor combines the images based on in-plane and out-of-plane offsets.
Turning now to the drawings, a process for joining two 3D image volumes is illustrated. The process may proceed in either a manual mode 222 or an automatic mode 224.
More specifically, with regard to manual mode 222, an adjustment is made of the slices associated with the top image, block 230. This may require a vertical image adjustment 232 and a horizontal image adjustment 238. Once these adjustments are made, the images are aligned and joined 244 as is discussed in further detail below.
If the manual mode 222 is not selected, the process proceeds in automatic mode 224. In automatic mode, after the images showing the best overlapping features are selected into the display, an “Auto Join” feature is then selected, block 250, preferably via software. Using a mouse or other control, a user may then place a cursor on one of the features in the upper image, block 254. (Although this description is done with respect to upper and lower images, it should be apparent to one skilled in the art that the images may be arranged in other orientations, e.g., side-by-side, in lieu of an upper/lower orientation.) In this regard, the feature may be a particular portion of the anatomy, such as a particular vertebra. A second cursor is placed over the corresponding feature in the lower image, block 260, and the 3D volume data is then joined automatically 244 using the best in-plane and out-of-plane offsets. The offsets are computed using an algorithm that minimizes the square of the normalized intensity differences between the pixel neighborhoods of the two cursor locations. The pixel neighborhoods may be in-plane (two-dimensional) or a volume (three-dimensional). That is, in 3D:
E = Σ(i ∈ Ni) Σ(j ∈ Nj) Σ(k ∈ Nk) [G(i,j,k)/AG(i,j,k) − H(i,j,k)/AH(i,j,k)]²

where
Ni is the pixel neighborhood in the i-direction;
Nj is the pixel neighborhood in the j-direction;
Nk is the pixel neighborhood in the k-direction;
G(i,j,k) is the pixel value at i, j, k for the first or upper data;
H(i,j,k) is the pixel value at i, j, k for the second or lower data;
AG(i,j,k) is the average intensity of the pixel neighborhood centered at i, j, k for the upper data; and
AH(i,j,k) is the average intensity of the pixel neighborhood centered at i, j, k for the lower data.
The average intensities, AG(i,j,k) and AH(i,j,k), are used to normalize the pixel data so as to mitigate differences in common feature intensity levels between the two data sets. The combination of i, j and k which minimizes the function yields the offset values used to translate the images before joining the pixel data.
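A minimal sketch of this error computation in C follows. It is not the specification's routine, merely an illustration under stated assumptions: volumes are flat arrays of size nx*ny*nz indexed x-fastest, neighborhoods are symmetric half-extents (ni, nj, nk) about each control point, and bounds checking is omitted for brevity (the caller must keep both neighborhoods inside their volumes).

```c
#include <stddef.h>

/* Fetch one voxel from a flat, x-fastest volume of width nx and height ny. */
static double voxel(const double *v, int nx, int ny, int x, int y, int z)
{
    return v[(size_t)z * nx * ny + (size_t)y * nx + x];
}

/* Mean intensity of the neighborhood centered at (cx, cy, cz) with
   half-extents (ni, nj, nk); this plays the role of AG or AH above. */
static double neighborhood_mean(const double *v, int nx, int ny,
                                int cx, int cy, int cz,
                                int ni, int nj, int nk)
{
    double sum = 0.0;
    int count = 0;
    for (int k = -nk; k <= nk; k++)
        for (int j = -nj; j <= nj; j++)
            for (int i = -ni; i <= ni; i++) {
                sum += voxel(v, nx, ny, cx + i, cy + j, cz + k);
                count++;
            }
    return sum / count;
}

/* Sum, over the neighborhood, of the squared difference of mean-normalized
   intensities: the error function of the text. G is the upper volume with
   control point (gx, gy, gz); H is the lower volume with (hx, hy, hz). */
double normalized_error(const double *G, const double *H,
                        int nx, int ny,
                        int gx, int gy, int gz,
                        int hx, int hy, int hz,
                        int ni, int nj, int nk)
{
    double ag = neighborhood_mean(G, nx, ny, gx, gy, gz, ni, nj, nk);
    double ah = neighborhood_mean(H, nx, ny, hx, hy, hz, ni, nj, nk);
    double err = 0.0;
    for (int k = -nk; k <= nk; k++)
        for (int j = -nj; j <= nj; j++)
            for (int i = -ni; i <= ni; i++) {
                double g = voxel(G, nx, ny, gx + i, gy + j, gz + k) / ag;
                double h = voxel(H, nx, ny, hx + i, hy + j, hz + k) / ah;
                err += (g - h) * (g - h);
            }
    return err;
}
```

Because both neighborhoods are divided by their own mean, two regions that differ only by a global intensity scale score a near-zero error, which is the stated purpose of the normalization.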
As discussed above, the neighborhood is chosen based on distinguishing features in the images. In effect, this results in a determination based on distance. If too large an area is chosen, the computational complexity increases. Once the control points are chosen, a search is performed to determine similarities of the control point neighborhoods. The pixel intensities are used in the above error function to determine the similarities between the two neighborhoods. The images are then offset from each other and the error function is recalculated. The change in the error function value provides an indication of whether the offset is converging toward an optimal point. In other words, the offsets that minimize the above error function provide the best overlapping match.
Next, the normalized error function is computed based on overlapping neighborhood pixel intensities, block 274. As discussed above, the normalized error function is computed using the above equation. The newly computed normalized error function is compared to the minimum error function, block 276. If the new error function is less than the minimum error function, the minimum error function is replaced by the new error function, block 280, and the x, y and z offsets are incremented, block 282. If the new error function is not less than the minimum error function, the process proceeds directly to block 282.
The process then determines whether the volume search is completed, block 284. That is, a determination is made as to whether the pertinent volume of neighborhood pixel data has been fully searched. If the volume search is incomplete, the process returns to block 270 and repeats. If the volume search is completed, the first and second images are joined or stitched together, block 290. A “C” routine that computes the offsets is provided below as part of the specification.
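As a sketch (not the verbatim “C” routine referenced in the specification), the search of blocks 270 through 290 can be expressed as an exhaustive scan over candidate offsets within predetermined limits, keeping the offsets that produce the minimum error. The callback type, the demonstration error function, and the symmetric search limits below are illustrative assumptions.

```c
#include <stddef.h>

/* Error function for a candidate offset (dx, dy, dz); ctx carries any
   volume data the scoring function needs. */
typedef double (*error_fn)(int dx, int dy, int dz, void *ctx);

typedef struct { int dx, dy, dz; double error; } offset_result;

/* Scan all candidate offsets within the predetermined limits (blocks
   270-284), tracking the minimum error (blocks 276-282); the returned
   offsets are those used to join the volumes (block 290). */
offset_result find_best_offset(error_fn err, void *ctx,
                               int max_dx, int max_dy, int max_dz)
{
    offset_result best = { 0, 0, 0, err(0, 0, 0, ctx) };
    for (int dz = -max_dz; dz <= max_dz; dz++)
        for (int dy = -max_dy; dy <= max_dy; dy++)
            for (int dx = -max_dx; dx <= max_dx; dx++) {
                double e = err(dx, dy, dz, ctx);
                if (e < best.error) {
                    best.error = e;
                    best.dx = dx;
                    best.dy = dy;
                    best.dz = dz;
                }
            }
    return best;
}

/* Demonstration-only error surface with a known minimum at (1, -2, 0),
   standing in for the normalized-intensity error of the text. */
static double demo_error(int dx, int dy, int dz, void *ctx)
{
    (void)ctx;
    double a = dx - 1.0, b = dy + 2.0, c = dz;
    return a * a + b * b + c * c;
}
```

In practice the error callback would evaluate the normalized squared-intensity difference between the control-point neighborhoods at the candidate offset; the exhaustive scan guarantees the global minimum within the predetermined limits, at a cost proportional to the number of candidate offsets.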
An example of a user interface and a composite image formed using the present invention is shown in the accompanying drawings.
A block diagram of a computer system on which the features of the invention discussed above may be executed is shown in the accompanying drawings.
Memory 154 stores information accessible by processor 152, including instructions 180 that may be executed by the processor 152 and data 182 that may be retrieved, manipulated or stored by the processor. The memory may be of any type capable of storing information accessible by the processor, such as a hard drive, memory card, ROM, RAM, DVD, CD-ROM, or other write-capable or read-only memory. The memory 154 may contain machine executable instructions or other software programs that embody the methods discussed above.
The processor may comprise any number of well known general purpose processors, such as processors from Intel Corporation or AMD. Alternatively, the processor may be a dedicated controller such as an ASIC.
The instructions 180 may comprise any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts or source code) by the processor. Further, the terms “instructions,” “steps” and “programs” may be used interchangeably herein. The instructions may be stored in object code form for direct processing by the processor, or in any other computer language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. As shown below, the functions, methods and routines of instructions may comprise source code written in “C”.
Data 182 may be retrieved, stored or modified by processor 152 in accordance with the instructions 180. The data may be stored as a collection of data and will typically comprise the MRI image or pixel data discussed above. For instance, although the invention is not limited by any particular data structure, the data may be stored in computer registers, in a database as a table having a plurality of different fields and records, XML documents, or flat files. The data may also be formatted in any computer readable format such as, but not limited to, binary values, ASCII or EBCDIC (Extended Binary-Coded Decimal Interchange Code). Moreover, the data may comprise any information sufficient to identify the relevant information, such as descriptive text, proprietary codes, pointers, references to data stored in other memories (including other network locations) or information which is used by a function to calculate the relevant data.
Although the processor and memory are functionally illustrated as being within the same block, it will be understood that they may actually comprise multiple processors and memories that may or may not be stored within the same physical housing.
In another aspect, the methods, programs, software or instructions described above may be executed via a server 110. This is desirable where many users need to access and view the same data, such as in a hospital or in non-collocated facilities. Server 110 communicates with one or more client computers 150, 151, 153. Each client computer may be configured as discussed above. Each client computer may be a general purpose computer, intended for use by a person, having all the internal components normally found in a personal computer such as a central processing unit (CPU), display 160, CD-ROM, hard-drive, mouse, keyboard, speakers, microphone, modem and/or router (telephone, cable or otherwise) and all of the components used for connecting these elements to one another. Moreover, computers in accordance with the systems and methods described herein may comprise any device capable of processing instructions and transmitting data to and from humans and other computers, including network computers lacking local storage capability, PDAs with modems and Internet-capable wireless phones. In addition to a mouse, keyboard and microphone, other means for inputting information from a human into a computer are also acceptable such as a touch-sensitive screen, voice recognition, etc.
The server 110 and client computers 150, 151, 153 are capable of direct and indirect communication, such as over a network. Although only a few computers are depicted, it should be appreciated that a typical system can include a large number of connected computers.
The information may also be transmitted over a global or private network, or directly between two computer systems, such as via a dial-up modem. In other aspects, the information may be transmitted in a non-electronic format and manually entered into the system.
It should be understood that the operations discussed above do not have to be performed in the precise order described. Rather, various steps may be handled in a different order or simultaneously.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims.
This application claims the benefit of the filing date of U.S. Provisional Application No. 61/126,752, filed May 7, 2008, the disclosure of which is hereby incorporated herein by reference.
| Number | Date | Country |
|---|---|---|
| 61126752 | May 2008 | US |