The present invention relates to a technology for automatically setting a procedure for extracting a target area from within an image by a combination of a plurality of image processes.
The progress in diagnostic imaging apparatuses and the like has led to a sharp increase in the number of medical images and the amount of medical information, with the result that huge volumes of medical images and medical information are being accumulated. Meanwhile, the increase in stored volumes has also increased the burden on the clinicians and radiologists who use medical images for diagnosis. The result is a situation in which the accumulated medical images and medical information are not fully utilized.
In order to utilize medical images effectively and to increase the quality of diagnosis or treatment, a method has been proposed for determining, in advance, a plurality of image processes to be applied to a single medical image and a procedure for implementing those processes (see Patent Document 1, for example).
The document discloses an apparatus by which an analysis protocol (image analyzing procedure) to be applied to image data from a diagnostic imaging apparatus (such as a computed tomography (CT) apparatus) is determined in accordance with the purpose of examination and the examined region, and a desired processing result is obtained through an image process using parameters acquired by preprocessing. Specifically, the document discloses a technique for selecting an image process implementation procedure in advance based on image data and image-accompanying information, and for carrying out the procedure in sequence.
In the case of the apparatus according to the above document, the order of implementation of the image processes is automatically determined before the image process (image analysis) is started; namely, the implementation order is fixed in advance. Thus, when the user wishes to modify the content of a process during the image process, the user needs to input an instruction for each change in process content. In particular, when the image process currently being carried out does not yield the desired processing result, it may become necessary to change the subsequent process content.
However, as long as such separate operation inputs by the user are required, the burden on the user cannot be reduced.
Based on a detailed analysis of the above problem, the present inventors provide a mechanism such that the content of image processes to be sequentially applied to a process object image can be automatically determined.
According to the present invention, the content of an image process to be carried out in the next and subsequent rounds is automatically determined based on a history of results of image processes applied to the process object image up to the immediately preceding round.
According to the present invention, the content of an image process for the next and subsequent rounds can be automatically determined by referring to the history of image process results that are stored in large volumes. Thus, the operational burden on the user when extracting a target area from the process object image through an image process can be decreased.
Other problems, configurations, and effects will become apparent from a reading of the following description of embodiments.
In the following, embodiments of the present invention will be described with reference to the drawings. The mode for carrying out the present invention is not limited to the following embodiments, and various modifications may be made within the technical scope of the present invention.
The image processing apparatuses described below are all based on the assumption that a plurality of image processes is applied in sequence in order to extract a target area from a process object image. The image processing apparatuses according to the various embodiments are common in that a database is searched for an image process procedure with a history similar to the history of processing results acquired up to the immediately preceding round, and the content of an image process to be applied next is automatically determined from the search result. Specifically, the image processing apparatuses statistically determine the image process to be applied next based on a large amount of information about the image process procedures used in the past that are stored in the database.
In the database, the results of successive judgments made by technical experts based on experience, processing results, and the like are stored as image process procedures. Thus, it is statistically meaningful for target area extraction to search for a past image process procedure whose history is similar to the history of processing results obtained for the process object image currently being processed, and to apply, as is, the image process used in the next round of the detected procedure to the current process. In the image processing apparatuses according to the embodiments, this determination process is repeated to automatically extract the target areas from the process object image.
In the process flow model database 102, image process procedures carried out in the past and image process procedures registered as standard models are stored. In the present specification, a “procedure” refers to information specifying an implementation order of a plurality of image processes. According to the present embodiment, the image process procedure includes a history (which may hereafter be referred to as “procedure feature quantities”) of processing results (which may hereafter be referred to as “target area feature quantities”) obtained upon carrying out each image process.
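For illustration only, one record in the process flow model database 102 might be represented as in the following sketch. The class and field names are hypothetical assumptions introduced here for explanation and are not part of the disclosed apparatus.

```python
# Hypothetical sketch of one record in the process flow model database 102.
# Each record pairs the per-round processing results (target area feature
# quantities) with the image process carried out in each round.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ProcessFlowModel:
    # One (size, number) pair of target area feature quantities per round.
    history: List[Tuple[float, int]] = field(default_factory=list)
    # Name of the image process carried out in each round, in order.
    processes: List[str] = field(default_factory=list)

# Example: a two-round procedure that ended with a region growing step.
model = ProcessFlowModel(
    history=[(120.0, 4), (95.0, 3)],
    processes=["level set (general-purpose)", "region growing (general-purpose)"],
)
```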
In the image database 103, image data as a process object are stored. According to the present embodiment, medical image data are stored. For example, contrast enhanced CT data are stored. Of course, the image data are not limited to contrast enhanced CT data.
The image processing apparatus 100 includes an image processing unit 121, a target area feature quantity extraction unit 111, a target area feature quantity storage unit 112, and a next image process determination unit 120. According to the present embodiment, the image processing apparatus 100 includes a computer as a basic configuration, and the respective processing units are implemented as programs executed on the computer.
The image processing unit 121 provides the function of applying an image process designated by an image process 204 to an examination image 200 or to the result image obtained by the image process of the immediately preceding round. A program corresponding to each image process is stored in a storage area (not illustrated), and is read and executed when the image process is carried out. The image processing unit 121 includes a storage area for storing the process object image (such as the examination image 200) and a program work area. The image processing unit 121 outputs a final processing result 206 to the image display apparatus 104. Thus, the image processing unit 121 is also provided with a function related to the user interface.
The target area feature quantity extraction unit 111 provides the function of extracting target area feature quantities (size and number of target areas) 202 from the result image obtained by the image process by the image processing unit 121. The target area feature quantity storage unit 112 provides a storage area for storing the extracted target area feature quantities 202. The storage area may include a semiconductor storage device or a hard disk device.
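As a minimal sketch, assuming the target area data 201 arrive as a binary mask and that connected-component labeling is an acceptable way to delimit individual target areas (the specification does not prescribe a method), the extraction of the two feature quantities might look as follows:

```python
# Minimal sketch of target area feature quantity extraction. The use of
# scipy's connected-component labeling is an illustrative assumption.
import numpy as np
from scipy import ndimage

def extract_feature_quantities(mask: np.ndarray) -> tuple[float, int]:
    """Return (size, number) of target areas for a binary mask.

    Here "size" is taken as the total pixel count of all target areas;
    the specification leaves the exact definition open.
    """
    labeled, num_areas = ndimage.label(mask)       # label connected regions
    total_size = float(np.count_nonzero(labeled))  # pixels in all target areas
    return total_size, num_areas
```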
The next image process determination unit 120 provides the function of comparing the procedure feature quantities 203, which specify the changes in the target area feature quantities 202 between rounds, with a past process flow model 205, and of determining the image process 204 to be applied to the process object image next.
In the following, the details of the content of an image diagnosis assisting process carried out by the image processing apparatus 100 according to the first embodiment will be described.
First, a doctor as an operator selects a process object image from the image database 103 (process 300). Specifically, a contrast enhanced CT image is selected.
Then, the doctor makes an initial setting for procedure feature quantities (process 301). The initial setting is the process of determining initial values 350 of the procedure feature quantities, i.e., the size and the number of the target areas. According to the present embodiment, both are initialized to “0”.
After the procedure feature quantities are determined, the next image process determination unit 120 carries out a process of determining the image process to be carried out next (process 302(1)). Because the initial values are “0” in the first round and there is as yet no amount of change in the procedure feature quantities, the image processing unit 121 is notified of a general-purpose image process (level set algorithm) for ischemic liver cancer extraction. As a result, the image processing unit 121 carries out an extraction process to which the level set algorithm is applied, for example (process 303(1)).
The image processing unit 121 transfers information about areas determined to be target areas 260 based on the processing result of the process to the target area feature quantity extraction unit 111 as target area data 201. The target area feature quantity extraction unit 111 extracts the target area feature quantities (i.e., size and number) contained in the process object image from the given target area data 201 (process 304(1)). The extracted target area feature quantities 202 are stored in the target area feature quantity storage unit 112.
Thereafter, the next image process determination unit 120 searches the target area feature quantity storage unit 112 and extracts the amounts of change in the target area feature quantities (size and number) as procedure feature quantities 203 (process 305(1)).
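For illustration, the amounts of change might be derived from the stored history as in the following sketch (a hypothetical helper; the history is read from the target area feature quantity storage unit 112):

```python
# Sketch of process 305: the procedure feature quantities 203 as the amounts
# of change in (size, number) between consecutive rounds.
def amounts_of_change(history: list[tuple[float, int]]) -> list[tuple[float, int]]:
    """Differences of (size, number) between each round and the one before it."""
    return [
        (curr[0] - prev[0], curr[1] - prev[1])
        for prev, curr in zip(history, history[1:])
    ]

# With the initial values (0, 0) of process 301, a first-round result of
# (120.0, 4) yields the change [(120.0, 4)].
```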
Next, the next image process determination unit 120 compares the extracted procedure feature quantities 203 with preset threshold values 351 (process 306(1)). When the procedure feature quantities 203 are less than the threshold values, the change from the preceding round is judged to be sufficiently small, and the image processing unit 121 outputs the current result image to the image display apparatus 104 as the final processing result 206.
On the other hand, when the procedure feature quantities 203 are equal to or greater than the threshold values, the next image process determination unit 120 searches the process flow model database 102 for a past process flow model 205 whose history is similar to the history of the procedure feature quantities 203 acquired so far, and determines the image process registered for the next round of the detected model as the image process 204 to be applied next.
Thus, according to the present embodiment, the processes 302 to 306 are repeatedly carried out until the procedure feature quantities 203 become lower than the predetermined threshold values 351. Namely, as long as a negative result is acquired in the process 306, the process flow model with a high degree of similarity to the history of the procedure feature quantities 203 acquired up to the time of each round of the process 302 is extracted from the process flow model database 102, and the image process registered for the next round of that process flow model is given to the image processing unit 121 as the image process 204 to be applied next.
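The repetition over the processes 302 to 306 can be summarized by the following condensed sketch, which reuses the helpers sketched above; run_image_process and find_most_similar_model are hypothetical stand-ins for the image processing unit 121 and the database search of the next image process determination unit 120.

```python
# Illustrative sketch of the loop over processes 302 to 306.
def extract_target_areas(image, database, thresholds):
    history = [(0.0, 0)]                     # initial values 350: size = number = 0
    process = "level set (general-purpose)"  # first-round general-purpose process
    while True:
        result = run_image_process(process, image)                 # process 303
        history.append(extract_feature_quantities(result))         # process 304
        size_change, num_change = amounts_of_change(history)[-1]   # process 305
        if abs(size_change) < thresholds[0] and abs(num_change) < thresholds[1]:
            return result                                          # process 306: done
        # Negative result in process 306: search for the process flow model
        # whose history is most similar and adopt its next registered process.
        model = find_most_similar_model(database, amounts_of_change(history))
        process = model.processes[len(history) - 1]  # process for the next round
        image = result
```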
By carrying out such a process, after the initial setting operation by the operator, the image processing apparatus 100 according to the present embodiment can automatically determine each image process until a desired processing result is obtained, and apply the image process to the process object image.
A specific example of the operation of the process carried out when automatically determining the next image process based on the procedure feature quantities 203 will be described.
In this example, two process flow models 402A and 402B are stored in the process flow model database 102, and each model holds procedure feature quantities 403A and 403B for the rounds carried out in the past. Also, in the next image processes 404A and 404B, the content of the image process carried out in the round following the rounds corresponding to the procedure feature quantities 403A and 403B is stored.
In this case, the next image process is uniquely determined upon detection of a process flow model whose procedure feature quantities have a high similarity degree with those that have appeared for the image currently being processed.
Preferably, a process flow model in which the procedure feature quantities for all of the implemented rounds and the image process carried out in each of those rounds are recorded may be used. In this case, the procedure feature quantities of the process flow models may be referenced within the range of rounds up to the round immediately before the round for which the determination is to be made, and, upon detection of a process flow model with a high similarity degree, the image process carried out in the next round of the detected model may be read by the next image process determination unit 120.
According to the present embodiment, the next image process determination unit 120 calculates the similarity degree between the process flow model 205 and the procedure feature quantities 203 based on, for example, a sum of squared differences of two corresponding procedure feature quantities. In this case, the smaller the sum of squared differences, the higher the similarity degree. Obviously, the similarity degree calculating method is not limited to the sum of squared differences and may include, for example, a sum of absolute differences.
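A minimal sketch of this calculation follows. The mapping of the sum of squared differences onto a similarity scale is an assumption for illustration; the specification states only that a smaller sum means a higher similarity degree.

```python
# Sketch of the similarity degree based on a sum of squared differences
# between two corresponding procedure feature quantity histories.
def sum_of_squared_differences(a, b):
    """Sum over the shared rounds of squared differences of (size, number) changes."""
    return sum((x - y) ** 2 for pa, pb in zip(a, b) for x, y in zip(pa, pb))

def similarity_degree(a, b):
    """Illustrative mapping to a 0-100 scale: 100 for complete agreement."""
    return 100.0 / (1.0 + sum_of_squared_differences(a, b))
```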
As described above, by adopting the image processing apparatus 100 according to the first embodiment, when the target areas are to be automatically extracted from the process object image, the operator, after inputting initial conditions, can extract the required target areas from within the process object image without performing any additional operation. Accordingly, the image process content correcting operation by the operator, which is still often required during an image process in conventional apparatuses, can be eliminated. As a result, the operational burden on the operator can be decreased, and the time before the target areas are extracted can be reduced.
According to the first embodiment, the initial values of the procedure feature quantities are set by the process 301, and the image process that is carried out in the first round is determined by those initial values. Obviously, the first-round image process may change depending on the initial values given. However, the determination is made by the next image process determination unit 120, and the operator's intention is not reflected in it.
Meanwhile, according to the present embodiment, the operator can specifically select or designate the image process that is carried out in the first round via the input device 105 in the process 307. Preferably, the designation may be carried out prior to the process 300, and the initial process input 207 that is inputted in advance may be taken into the next image process determination unit 120 in the process 307.
According to the present embodiment, the operator can select level set (general-purpose), filter (cyst removal), or level set (treatment mark), for example, as the initial process input 207.
As described above, by adopting the image processing apparatus 100 according to the second embodiment, an image process desired by the operator can be selected or designated as the initial round image process. Thus, in addition to the effect of the first embodiment, an image processing apparatus that can provide an image process in accordance with the operator's intention can be implemented.
According to the present embodiment, another process function that may preferably be implemented in the next image process determination unit 120 of the image processing system will be described.
According to the present embodiment, the next image process determination unit 120 determines the image process to be applied to the process object image next through the following process (process 3021).
First, the next image process determination unit 120 compares the procedure feature quantities 203 acquired with respect to the process object image with the process flow model 205, and calculates the similarity degree for each of the process flow models 402A and 402B. The similarity degree is an index expressed as a percentage: 100% when there is complete agreement and 0% when there is complete disagreement.
Next, the next image process determination unit 120 determines the priority order of the process flow models by using the reliability and the similarity degree. According to the present embodiment, the priority is a weighted average of the reliability and the similarity degree, normalized so that its maximum is 100. When the reliability of a process flow model is A1 with weight w1, and its similarity degree is A2 with weight w2, the priority may be calculated by the following expression.
Priority = (w1·A1 + w2·A2)/(w1 + w2)
If the weights are w1 = w2 = 1, the priority of each process flow model is the simple average of its reliability and its similarity degree.
In this case, the priority order is opposite to the priority order of the first embodiment. Namely, the process flow model 402B has the first priority order, and the process flow model 402A has the second priority order. Thus, the next image process determination unit 120 outputs region growing (general-purpose), stored as the next image process 404B of the process flow model 402B, to the image processing unit 121.
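For illustration, the priority calculation used above can be written as the following sketch (a hypothetical helper, with the reliability and the similarity degree both on a 0 to 100 scale):

```python
# Sketch of the priority calculation: a weighted average of reliability A1
# and similarity degree A2, which never exceeds 100.
def priority(a1: float, a2: float, w1: float = 1.0, w2: float = 1.0) -> float:
    return (w1 * a1 + w2 * a2) / (w1 + w2)

# With w1 = w2 = 1, a model with high reliability can outrank a model with a
# somewhat higher similarity degree, reversing the order of the first embodiment.
```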
As described with reference to the present embodiment, by introducing an index indicating the reliability of the algorithm with respect to each process flow model as the object of similarity determination, the operator can be presented with an extraction result of higher accuracy than in the first embodiment.
According to the present embodiment, another process function that may preferably be implemented in the next image process determination unit 120 of the image processing system will be described.
Of course, as a prerequisite, each process flow model stored in the process flow model database 102 includes, in its procedure feature quantities, the image process algorithm carried out in each round. In addition to the algorithm itself, the parameters used in each round are also stored.
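For illustration, the process flow model record sketched earlier might be extended as follows so that the algorithm and its parameters are recorded for every round (the names are hypothetical assumptions, not part of the disclosure):

```python
# Hypothetical extension of the process flow model record: each round stores
# the image process algorithm and the parameters used, so that the
# implementation order of algorithms can enter the similarity search.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class AlgorithmStep:
    name: str                     # e.g. "level set (general-purpose)"
    parameters: Dict[str, float]  # parameters used in that round

@dataclass
class ProcessFlowModelWithAlgorithms:
    history: List[Tuple[float, int]]  # procedure feature quantities per round
    steps: List[AlgorithmStep]        # algorithm and parameters, in order
```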
According to the present embodiment, the operator can be provided with a result with high extraction accuracy in which the order of implementation of the image process algorithm is taken into consideration.
The present invention is not limited to the foregoing embodiments but may include various modifications. For example, the foregoing embodiments have been described in detail to facilitate an understanding of the present invention, and the present invention is not necessarily limited to embodiments having all of the details described. A part of one embodiment may be substituted by a configuration of another embodiment, or a configuration of the other embodiment may be incorporated into a configuration of the one embodiment. With regard to a part of the configuration of an embodiment, additions, deletions, or substitutions may be made.
The configurations, functions, processing units, process means and the like described above may be partly or entirely implemented in the form of hardware, such as an integrated circuit. The configurations, functions and the like described above may be implemented in the form of software, such as a program for implementing the respective functions that is interpreted and executed by a processor. Programs, tables, files, and other information for implementing the respective functions may be stored in a storage device such as a memory, a hard disk, or a solid state drive (SSD), or a storage medium such as an IC card, an SD card, or a DVD.
The illustrated control lines and information lines are only those believed necessary for description purposes, and do not represent all of the control lines or information lines required in a product. It may be considered that, in practice, almost all elements are mutually connected.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2011-116145 | May 2011 | JP | national |

| Filing Document | Filing Date | Country | Kind | 371c Date |
|---|---|---|---|---|
| PCT/JP2012/062892 | 5/21/2012 | WO | 00 | 1/29/2014 |