Segmentation is the process of extracting anatomic configurations from images. Many applications in medicine require segmentation of standard anatomy in volumetric images acquired from CT, MRI and other imaging devices. Clinicians, or other professionals, often use segmentation for treatment planning. For example, segmentation may be used for radiation therapy planning such that radiation treatment may be delivered, at a desired dosage, to a target tissue. Currently, segmentation can be performed manually, in which the clinician examines individual image slices and manually draws two-dimensional contours of a relevant organ in each slice. The hand-drawn contours are then combined to produce a three-dimensional representation of the relevant organ. Alternatively, the clinician may use an automatic algorithm for segmentation.
Most structures, however, are still delineated manually slice-by-slice in volumetric medical datasets. Segmentation by hand is tedious and time-consuming, requiring significant expert knowledge to execute. For example, for some applications such as radiotherapy in the head and neck region, the segmentation step is one of the main limitations on patient throughput in the clinical workflow. Generally, the clinician must select a slice of the image in which the structure is clearly visible, and window-level settings may be manually adjusted such that a particular region of the image is more clearly visible. Subsequently, the contouring process is continued in adjacent slices. As the image contrast often changes from slice to slice, visualization settings such as the window-level setting must be adjusted accordingly for each slice. Manually adjusting the window-level settings for each subsequent image slice, or for various regions of a single image slice, is time-consuming and tedious.
A method for automatic contrast enhancement for contouring. The method includes displaying a volumetric image slice to be analyzed, receiving a delineation of a target anatomic structure in the volumetric image slice, identifying a region of interest based upon an area being delineated in the volumetric image slice, analyzing voxel intensity values in the region of interest, and determining an appropriate window-level setting based on the voxel intensity values.
A system having a display that displays a volumetric image slice to be analyzed, a user interface capable of accepting a user delineation of a target anatomic structure in the volumetric image slice, and a processor that identifies a region of interest based on the user delineation and analyzes voxel intensity values of the region of interest to determine an appropriate window-level setting.
A computer-readable storage medium including a set of instructions executable by a processor. The set of instructions is operable to display a volumetric image slice to be analyzed, receive a delineation of a target anatomic structure in the volumetric image slice, identify a region of interest based upon an area being delineated in the volumetric image slice, analyze voxel intensity values in the region of interest, and determine an appropriate window-level setting based on the voxel intensity values.
The exemplary embodiments set forth herein may be further understood with reference to the following description and the appended drawings, wherein like elements are referred to with the same reference numerals. The exemplary embodiments relate to a system and method for segmentation of a standard anatomy in volumetric images acquired from CT, MRI, etc. In particular, exemplary embodiments set forth herein describe a method for automatically adjusting image visualization in volumetric images such that a target structure may be easily distinguished from neighboring structures.
The processor 102 identifies a region of interest, in a step 240. The region of interest is identified as a portion of the image, determined by the area of the image in which the user is drawing. The processor 102 then analyzes voxel intensity values in the region of interest, in a step 250. The voxel intensity values are analyzed via an intensity histogram generated by the processor 102. It is well known in the art that a voxel represents a volume element in a three-dimensional space. The intensity histogram is a graph showing the number of voxels in an image at each intensity value found in that image. It will be understood by those of skill in the art that a wide range of intensity values may exist in a grayscale image, from the darkest of blacks to the lightest of whites. For example, as shown in
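The histogram analysis of step 250 can be sketched as follows. This is only an illustrative NumPy sketch, not the disclosed processor logic; the function name `roi_intensity_histogram` and the `n_bins` parameter are assumptions for the example.

```python
import numpy as np

def roi_intensity_histogram(image_slice, roi_mask, n_bins=256):
    """Compute an intensity histogram over the region of interest.

    image_slice: 2-D array of voxel intensity values (one slice of the volume)
    roi_mask:    boolean array of the same shape; True marks ROI voxels
    Returns (counts, bin_edges): the number of ROI voxels falling in each
    intensity bin, i.e. the intensity histogram of step 250.
    """
    roi_values = image_slice[roi_mask]
    counts, bin_edges = np.histogram(roi_values, bins=n_bins)
    return counts, bin_edges

# Example: a synthetic 8x8 slice with a bright 4x4 square as the ROI
slice_ = np.zeros((8, 8))
slice_[2:6, 2:6] = 200.0
mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True
counts, edges = roi_intensity_histogram(slice_, mask, n_bins=4)
```

Only the 16 voxels inside the mask contribute to the histogram; voxels outside the region of interest are ignored, which is what lets the subsequent window-level derivation focus on the structure being drawn.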
Based upon the voxel intensity values analyzed in step 250, an appropriate window-level setting for optimal visibility of the region of interest is derived in a step 260. The window-level settings are, for example, defined based on maximum and minimum image gray values in the region of interest. An interval between the maximum and minimum gray values may be mapped to an interval available for display in a linear fashion. For example, the minimum value may be mapped to pure black while the maximum value may be mapped to pure white to provide greater contrast for the image displayed on the display 104. Alternatively, a non-linear transfer function may be used to enhance display contrast along edges of the displayed image. As shown in
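The linear mapping described above can be sketched in NumPy, assuming an 8-bit display range of 0 to 255 so that the ROI minimum maps to pure black and the ROI maximum to pure white. The function name `apply_window_level` is hypothetical.

```python
import numpy as np

def apply_window_level(image_slice, roi_mask):
    """Linearly map the ROI's gray-value range onto the display range.

    The minimum gray value in the region of interest maps to 0 (pure
    black) and the maximum to 255 (pure white); intermediate values are
    scaled linearly, maximizing display contrast for the ROI.
    """
    roi = image_slice[roi_mask]
    lo, hi = float(roi.min()), float(roi.max())
    if hi == lo:  # uniform region: no contrast to stretch
        return np.zeros_like(image_slice, dtype=np.uint8)
    scaled = (image_slice - lo) / (hi - lo)
    # Values outside the ROI's range are clipped to black or white
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# Example: ROI gray values span 100..300
slice_ = np.linspace(100.0, 300.0, 64).reshape(8, 8)
mask = np.ones((8, 8), dtype=bool)
displayed = apply_window_level(slice_, mask)
```

A non-linear transfer function, as the alternative above mentions, would replace the `scaled` line with, for example, a sigmoid of the distance from the window center to steepen contrast near edges.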
It will also be understood by those of skill in the art that the window-level settings may be adjusted prior to any delineation of the target anatomic structure by the user. Where the target anatomic structure is not delineated, the method 200 moves directly from step 220 to step 240, identifying the region of interest as the entire displayed image slice. Voxel intensity values of the entire image slice are analyzed in the step 250, appropriate window-level settings derived in the step 260, and the window-level settings adapted accordingly, in the step 270. Once the window-level setting has been adjusted, the method 200 returns to the step 230 so that the user may begin delineation of the target anatomic structure.
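That pre-delineation branch can be sketched as a simple ROI-selection helper; `region_of_interest` is a hypothetical name for illustration, assuming the delineation arrives as a boolean mask.

```python
import numpy as np

def region_of_interest(image_slice, delineation_mask=None):
    """Step 240: identify the region of interest.

    When the user has begun delineating, the ROI follows the
    delineation; before any delineation exists, the ROI is the
    entire displayed image slice.
    """
    if delineation_mask is None:
        return np.ones(image_slice.shape, dtype=bool)
    return delineation_mask

slice_ = np.zeros((4, 4))
whole = region_of_interest(slice_)                # nothing drawn yet: whole slice
partial = region_of_interest(slice_, slice_ > 1)  # with a (toy) delineation mask
```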
Once the target anatomic structure has been completely delineated by the user in the image slice, the image slice, including markings showing the delineated target anatomic structure, may be stored in the memory 108 in a step 280, such that the series of images may be later processed for contouring after each of the image slices in the series of images has been delineated by the user. The appropriate window-level setting corresponding to the image slice may also be stored in the memory 108.
The window-level may automatically adjust from slice to slice so long as there is an additional slice in the series of images to be analyzed and drawn. Thus, where there is an additional slice, the method 200 returns to the step 220, in which the processor 102 displays on the display 104 the slice of the image that is to be analyzed. It will be understood by those of skill in the art that, for each cycle of the method 200, the processor 102 displays a slice of the series of images that has not yet been analyzed, such that each subsequent cycle displays a new slice of the series of images. It will further be understood by those of skill in the art that the method 200 will return to the step 220 from the step 280 N−1 times. On the Nth time, however, the method 200 continues to a step 290, where the processor 102 processes the delineated target anatomic structure in each of the image slices of the series of images to contour the three-dimensional target anatomic structure.
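The slice-by-slice cycle can be sketched as a loop. The `delineate` callback stands in for the interactive drawing step, and all names here are assumptions for illustration rather than the disclosed implementation.

```python
import numpy as np

def contour_series(slices, delineate):
    """Cycle of method 200 over a series of N image slices.

    For each slice the window-level is re-derived (the contrast often
    changes from slice to slice), the user delineates the structure on
    the adjusted display, and after the Nth slice the per-slice
    delineations are combined into the 3-D structure (step 290).
    """
    delineations = []
    for image_slice in slices:                  # steps 220-280, repeated N times
        lo, hi = image_slice.min(), image_slice.max()
        # steps 250-270: adapt the window-level before the user draws
        windowed = (image_slice - lo) / (hi - lo) if hi > lo else image_slice
        delineations.append(delineate(windowed))  # step 230: user drawing
    return np.stack(delineations)               # step 290: 3-D contour volume

volume = [np.full((4, 4), v) for v in (50.0, 150.0, 250.0)]
contour = contour_series(volume, lambda s: s >= 0)  # toy "delineation"
```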
It will be understood by those of skill in the art that the user may adjust parameter settings according to user preferences, via the user interface 106. For example, the user may determine whether to delineate the target anatomic structure in each of the image slices in the series of images such that only a subset of image slices in the series of images may be delineated. It will also be understood by those of skill in the art that the user may also make manual adjustments to window-level settings, when necessary. Manual adjustments to window-level settings may be incorporated into the automatic adaptation of the window-level setting, as described above.
It is noted that the exemplary embodiments or portions of the exemplary embodiments may be implemented as a set of instructions stored on a computer readable storage medium, the set of instructions being executable by a processor.
It will be apparent to those skilled in the art that various modifications may be made to the disclosed exemplary embodiments and methods, and that alternatives may be devised, without departing from the spirit or the scope of the disclosure. Thus, it is intended that the present disclosure cover modifications and variations provided they come within the scope of the appended claims and their equivalents.
It is also noted that the claims may include reference signs/numerals in accordance with PCT Rule 6.2(b). However, the present claims should not be considered to be limited to the exemplary embodiments corresponding to the reference signs/numerals.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/IB2010/050734 | 2/18/2010 | WO | 00 | 9/29/2011 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2010/113047 | 10/7/2010 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
5042077 | Burke | Aug 1991 | A |
6353674 | Dewaele | Mar 2002 | B1 |
7149333 | Pieper et al. | Dec 2006 | B2 |
7158692 | Chalana et al. | Jan 2007 | B2 |
7218763 | Belykh et al. | May 2007 | B2 |
7236618 | Chui et al. | Jun 2007 | B1 |
7565000 | Capolunghi et al. | Jul 2009 | B2 |
8285010 | Rowe | Oct 2012 | B2 |
20010033283 | Liang et al. | Oct 2001 | A1 |
20060098010 | Dwyer et al. | May 2006 | A1 |
20070116335 | Capolunghi et al. | May 2007 | A1 |
20100130860 | Yamagata | May 2010 | A1 |
20110313285 | Fallavollita et al. | Dec 2011 | A1 |
20120269413 | Hautvast et al. | Oct 2012 | A1 |
Number | Date | Country |
---|---|---|
03617698 | Feb 2005 | JP |
Entry |
---|
R. Sakellaropoulos et al.; "An Image Visualization Tool in Mammography"; Informatics for Health and Social Care; vol. 24, Issue 1, Jan. 1999; p. 1 of 1. |
Number | Date | Country | |
---|---|---|---|
20120032953 A1 | Feb 2012 | US |
Number | Date | Country | |
---|---|---|---|
61165025 | Mar 2009 | US |