The present disclosure relates to an image processing apparatus, an image processing method, and a non-transitory storage medium storing an image processing program.
In recent years, in a radiography apparatus using radiation such as X-rays or gamma rays, in order to observe a diseased part in more detail, tomosynthesis imaging has been proposed in which imaging is performed by moving a radiation source and irradiating a subject with radiation at a plurality of radiation source positions, and tomographic images in which desired tomographic planes are highlighted are derived from the plurality of projection images acquired by the imaging. In the tomosynthesis imaging, the plurality of projection images are acquired by moving the radiation source in parallel with a radiation detector, or so as to draw a circular or elliptical arc, according to characteristics of the imaging apparatus and the required tomographic images, and imaging the subject at the plurality of radiation source positions. The tomographic images are then derived by reconstructing the projection images using a back projection method such as a simple back projection method or a filtered back projection method.
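As a concrete illustration of the simple back projection method mentioned above, the following is a minimal "shift-and-add" sketch for a single tomographic plane. It assumes an idealized geometry in which the radiation source moves parallel to the detector and magnification is ignored; all names (`shift_and_add`, `source_ys`, and so on) are illustrative and do not appear in the disclosure.

```python
import numpy as np

def shift_and_add(projections, source_ys, source_z, plane_z):
    """Simple back projection ("shift-and-add") for one tomographic plane.

    Each projection is shifted by the parallax of the plane at height
    plane_z above the detector, then the shifted images are averaged:
    structures lying in that plane align and are emphasized, while
    structures in other planes are spread out and blurred.
    """
    recon = np.zeros_like(projections[0], dtype=float)
    for proj, sy in zip(projections, source_ys):
        # Parallax shift (in pixels) of a point at height plane_z for a
        # source displaced by sy at height source_z above the detector.
        shift = sy * plane_z / (source_z - plane_z)
        recon += np.roll(proj, int(round(shift)), axis=0)
    return recon / len(projections)
```

Repeating this for a stack of plane heights yields the set of tomographic images; the filtered back projection method differs only in applying a reconstruction filter to the projections before the shift-and-add step.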
By deriving such a tomographic image on a plurality of tomographic planes of the subject, it is possible to separate structures overlapping each other in a depth direction in which the tomographic planes are aligned. Therefore, it is possible to find a lesion which is unlikely to be detected in the two-dimensional image acquired by simple imaging in the related art. The simple imaging is an imaging method for acquiring one two-dimensional image, which is a transmission image of a subject, by emitting radiation to the subject once.
On the other hand, the tomosynthesis imaging also has a problem in that the reconstructed tomographic image is blurred due to an influence of a body movement or the like of the subject caused by the time difference between imaging at each of the plurality of radiation source positions. In a case where the tomographic image is blurred as described above, particularly in a case where the breast is the subject, it is difficult to find a lesion such as a minute calcification, which is useful for early detection of breast cancer.
For this reason, a method of correcting body movement in the case of deriving a tomographic image from a projection image acquired by tomosynthesis imaging has been proposed. For example, WO2020/067475A proposes a method of detecting at least one feature point in a derived tomographic image, deriving a misregistration amount between a plurality of projection images based on a body movement of a subject by using, as a reference, the feature point in a corresponding tomographic plane corresponding to the tomographic image from which the feature point is detected, reconstructing the plurality of projection images by correcting the misregistration amount, and deriving a corrected tomographic image in which an influence of the body movement is corrected.
In the method disclosed in WO2020/067475A, the feature point is detected from the tomographic image. However, the feature point is not always detected on the projection image. For example, even in a case where a feature point is present with high contrast in the tomographic image, the contrast of the feature point is low in the projection image, and as a result, in some cases, it is difficult to detect the feature point. In a case where it is difficult to detect the feature point in the projection image, the feature point on the tomographic image and the feature point on the projection image cannot be accurately associated with each other, and in that case, a body movement cannot be accurately corrected.
The present disclosure has been made in view of the above circumstances, and an object of the present disclosure is to provide an image processing apparatus, an image processing method, and a non-transitory storage medium storing an image processing program capable of acquiring a tomographic image with high image quality in which a body movement is accurately corrected.
According to the present disclosure, there is provided an image processing apparatus comprising: at least one processor, in which the processor is configured to: acquire a plurality of projection images that are generated by performing, by an imaging apparatus, tomosynthesis imaging of relatively moving a radiation source with respect to a detection surface of a detection unit and irradiating a subject with radiation at a plurality of radiation source positions due to movement of the radiation source, the plurality of projection images corresponding to the plurality of radiation source positions; derive a plurality of feature-structure projection images by extracting a specific structure from the plurality of projection images; derive a plurality of feature-structure tomographic images respectively for a plurality of tomographic planes of the subject by reconstructing the plurality of feature-structure projection images; detect at least one feature structure from the plurality of feature-structure tomographic images; and derive a corrected tomographic image for at least one tomographic plane of the subject by correcting misregistration between the plurality of projection images due to a body movement of the subject by using, as a reference, the feature structure in a corresponding tomographic plane corresponding to the feature-structure tomographic image from which the feature structure is detected, and reconstructing the plurality of projection images.
The “relatively moving a radiation source with respect to a detection unit” includes a case of moving only the radiation source, a case of moving only the detection unit, and a case of moving both the radiation source and the detection unit.
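The sequence of operations recited above can be summarized as a pipeline. The following sketch is purely illustrative of the data flow; the four callables are placeholders for the extraction, reconstruction, detection, and correction steps, and none of the names come from the disclosure.

```python
def derive_corrected_tomogram(projections, extract, reconstruct, detect, correct):
    """Overall flow of the disclosed processing, expressed as a pipeline.

    extract     : projection image -> feature-structure projection image
    reconstruct : feature-structure projection images -> feature-structure
                  tomographic images
    detect      : feature-structure tomographic images -> feature structures
    correct     : original projection images + feature structures
                  -> corrected tomographic image
    """
    feature_projs = [extract(p) for p in projections]   # per projection image
    feature_tomos = reconstruct(feature_projs)          # per tomographic plane
    features = detect(feature_tomos)                    # reference structures
    return correct(projections, features)               # misregistration-corrected
```

The key point of the flow is that the feature structures are detected from the feature-structure tomographic images, but the final corrected tomographic image is reconstructed from the original projection images.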
In the image processing apparatus according to the present disclosure, the specific structure may be at least one of a line structure or a point structure.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to extract at least one of the line structure or the point structure based on a concentration degree of a gradient vector representing a gradient of pixel values in the projection image.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: derive, in the corresponding tomographic plane, a misregistration amount between the plurality of projection images due to the body movement of the subject by using, as a reference, the feature structure; and derive the corrected tomographic image by correcting the misregistration amount and reconstructing the plurality of projection images.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: detect a plurality of feature structures from the plurality of feature-structure tomographic images; determine whether or not the corresponding tomographic plane corresponding to the feature-structure tomographic image from which each of the plurality of feature structures is detected is a focal plane; and derive the misregistration amount in the corresponding tomographic plane determined as the focal plane.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to detect, as the feature structure, a point at which a specific threshold value condition is satisfied in the feature-structure tomographic image.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: update the feature-structure tomographic image by reconstructing the feature-structure projection images while correcting the misregistration; detect an updated feature structure from the updated feature-structure tomographic image; update the misregistration amount using the updated feature structure; and repeat the update of the feature-structure tomographic image, the update of the feature structure, and the update of the misregistration amount.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: update the feature-structure tomographic image by reconstructing the feature-structure projection images while correcting the misregistration; detect an updated feature structure from the updated feature-structure tomographic image based on an updated threshold value condition; update the misregistration amount by using the updated feature structure; and repeat the update of the feature-structure tomographic image, the update of the feature structure based on the updated threshold value condition, and the update of the misregistration amount.
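The iterative update described in the two preceding paragraphs can be sketched structurally as follows. `reconstruct`, `detect`, and `estimate_shift` are placeholders for the reconstruction, feature detection, and misregistration derivation steps, and a fixed iteration count stands in for whatever convergence criterion an implementation would use; none of these names come from the disclosure.

```python
def refine_misregistration(feature_projs, reconstruct, detect, estimate_shift,
                           n_iters=3):
    """Iteratively refine per-projection misregistration amounts.

    reconstruct(projs, shifts)     -> updated feature-structure tomographic image
    detect(tomo)                   -> updated feature structure
    estimate_shift(projs, feature) -> updated shifts (one per projection)
    """
    shifts = [0.0] * len(feature_projs)
    for _ in range(n_iters):
        tomo = reconstruct(feature_projs, shifts)        # update tomographic image
        feature = detect(tomo)                           # update feature structure
        shifts = estimate_shift(feature_projs, feature)  # update misregistration
    return shifts
```

Each pass reconstructs with the current correction, so the feature structure sharpens and the misregistration estimate improves, which is the rationale for repeating the three updates.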
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: derive a tomographic-plane projection image corresponding to each of the plurality of projection images by projecting the plurality of projection images onto the corresponding tomographic plane based on a positional relationship between the radiation source position and the detection unit when performing imaging for each of the plurality of projection images; and derive, in the corresponding tomographic plane, as the misregistration amount between the plurality of projection images, a misregistration amount between a plurality of the tomographic-plane projection images based on the body movement of the subject, by using, as a reference, the feature structure.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: set a local region corresponding to the feature structure in the plurality of tomographic-plane projection images; and derive the misregistration amount based on the local region.
The term “local region” is a region including the feature structure in the tomographic image or the tomographic-plane projection image, and can be a region having any size smaller than the tomographic image or the tomographic-plane projection image.
The local region needs to be larger than the range of the body movement. The body movement may be as large as approximately 2 mm. Therefore, in a case of a tomographic image or a tomographic-plane projection image in which the size of one pixel is 100 μm square, the local region may be set to, for example, a region of 50×50 pixels or 100×100 pixels around the feature structure.
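Working through the numbers above as a quick check, assuming the stated 100 μm pixel pitch and an approximately 2 mm maximum body movement:

```python
# Pixel pitch of 100 μm (= 0.1 mm) and a body movement of up to about 2 mm:
pixel_pitch_mm = 0.1
max_body_movement_mm = 2.0
movement_px = max_body_movement_mm / pixel_pitch_mm  # 2.0 / 0.1 = 20 pixels

# A 50x50 (or 100x100) pixel local region therefore covers the whole
# movement range around the feature structure with margin to spare.
local_region_px = 50
assert local_region_px > movement_px
```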
The term “region around the feature structure in the local region” means a region that is smaller than the local region and includes the feature structure in the local region.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: set a plurality of first local regions including the feature structure in the plurality of tomographic-plane projection images; set a second local region including the feature structure in a tomographic image from which the feature structure is detected; derive misregistration amounts of the plurality of first local regions with respect to the second local region, as temporary misregistration amounts; and derive the misregistration amount based on a plurality of the temporary misregistration amounts.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to derive the temporary misregistration amounts based on a region around the feature structure in the second local region.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: derive a plurality of the tomographic images as target tomographic images by reconstructing the plurality of projection images excluding a target projection image corresponding to a target tomographic-plane projection image that is a target for deriving the misregistration amount; and derive the misregistration amount for the target tomographic-plane projection image by using the target tomographic image.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: perform image quality evaluation of a region of interest including the feature structure in the corrected tomographic image; and determine whether the derived misregistration amount is appropriate or inappropriate based on a result of the image quality evaluation.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: derive a plurality of tomographic images by reconstructing the plurality of projection images; perform image quality evaluation of a region of interest including the feature structure in the tomographic image; compare the result of the image quality evaluation for the corrected tomographic image with a result of the image quality evaluation for the tomographic image; and determine a tomographic image of which the result of the image quality evaluation is better as a final tomographic image.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to: derive an evaluation function for performing image quality evaluation of a region of interest including the feature structure in the corrected tomographic image; and derive the misregistration amount for optimizing the evaluation function.
Further, in the image processing apparatus according to the present disclosure, the subject may be a breast.
Further, in the image processing apparatus according to the present disclosure, the processor may be configured to change a search range in derivation of a misregistration amount according to at least one of a density of a mammary gland, a size of the breast, an imaging time of the tomosynthesis imaging, a compression pressure of the breast in the tomosynthesis imaging, or an imaging direction of the breast.
According to the present disclosure, there is provided an image processing method comprising: acquiring a plurality of projection images that are generated by performing, by an imaging apparatus, tomosynthesis imaging by relatively moving a radiation source with respect to a detection surface of a detection unit and irradiating a subject with radiation at a plurality of radiation source positions due to movement of the radiation source, the plurality of projection images corresponding to the plurality of radiation source positions; deriving a plurality of feature-structure projection images by extracting a specific structure from the plurality of projection images; deriving a plurality of feature-structure tomographic images respectively for a plurality of tomographic planes of the subject by reconstructing the plurality of feature-structure projection images; detecting at least one feature structure from the plurality of feature-structure tomographic images; and deriving a corrected tomographic image for at least one tomographic plane of the subject by correcting misregistration between the plurality of projection images based on a body movement of the subject by using, as a reference, the feature structure in a corresponding tomographic plane corresponding to the feature-structure tomographic image from which the feature structure is detected, and reconstructing the plurality of projection images.
According to the present disclosure, there is provided a non-transitory storage medium storing a program causing a computer to execute an image processing, the image processing comprising: acquiring a plurality of projection images that are generated by performing, by an imaging apparatus, tomosynthesis imaging by relatively moving a radiation source with respect to a detection surface of a detection unit and irradiating a subject with radiation at a plurality of radiation source positions due to movement of the radiation source, the plurality of projection images corresponding to the plurality of radiation source positions; deriving a plurality of feature-structure projection images by extracting a specific structure from the plurality of projection images; deriving a plurality of feature-structure tomographic images respectively for a plurality of tomographic planes of the subject by reconstructing the plurality of feature-structure projection images; detecting at least one feature structure from the plurality of feature-structure tomographic images; and deriving a corrected tomographic image for at least one tomographic plane of the subject by correcting misregistration between the plurality of projection images due to a body movement of the subject by using, as a reference, the feature structure in a corresponding tomographic plane corresponding to the feature-structure tomographic image from which the feature structure is detected, and reconstructing the plurality of projection images.
According to the present disclosure, it is possible to acquire a high-quality tomographic image in which the body movement is accurately corrected.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
The imaging apparatus 10 comprises an arm part 12 that is connected to a base (not illustrated) by a rotation shaft 11. An imaging table 13 is attached to one end of the arm part 12, and a radiation irradiation unit 14 is attached to the other end of the arm part 12 so as to face the imaging table 13. The arm part 12 is configured such that only the end to which the radiation irradiation unit 14 is attached can be rotated. Therefore, the imaging table 13 is fixed and only the radiation irradiation unit 14 can be rotated. The rotation of the arm part 12 is controlled by the console 2.
A radiation detector 15, such as a flat panel detector, is provided in the imaging table 13. The radiation detector 15 has a detection surface 15A that detects radiation such as X-rays. In addition, a circuit board including a charge amplifier that converts a charge signal read from the radiation detector 15 into a voltage signal, a correlated double sampling circuit that samples the voltage signal output from the charge amplifier, and an analog-to-digital (AD) conversion unit that converts the voltage signal into a digital signal is provided in the imaging table 13. The radiation detector 15 is an example of a detection unit. In the present embodiment, the radiation detector 15 is used as the detection unit; however, the detection unit is not limited to the radiation detector 15 as long as it can detect radiation and convert the radiation into an image.
The radiation detector 15 can repeatedly perform recording and reading of a radiation image, and may be a so-called direct-type radiation detector that directly converts radiation such as X-rays into charges, or a so-called indirect-type radiation detector that first converts radiation into visible light and then converts the visible light into a charge signal. As a method for reading a radiation image signal, it is desirable to use a so-called thin film transistor (TFT) reading method, which reads a radiation image signal by turning a TFT switch on and off, or a so-called optical reading method, which reads a radiation image signal by irradiating a target with read light. However, the reading method is not limited thereto, and other methods may be used.
An X-ray source 16 that is a radiation source is accommodated in the radiation irradiation unit 14. The console 2 controls a timing when the X-ray source 16 emits an X-ray, which is radiation, and X-ray generation conditions of the X-ray source 16, that is, selection of a target and filter materials, a tube voltage, an irradiation time, and the like.
Further, the arm part 12 is provided with a compression plate 17 that is arranged above the imaging table 13 and presses and compresses the breast M, a support portion 18 that supports the compression plate 17, and a moving mechanism 19 that moves the support portion 18 in a vertical direction.
The console 2 has a function of controlling the imaging apparatus 10 using, for example, an imaging order and various kinds of information acquired from a radiology information system (RIS) (not illustrated) or the like through a network, such as a wireless communication local area network (LAN), and commands or the like directly issued by an engineer or the like. Specifically, the console 2 acquires a plurality of projection images as described below by causing the imaging apparatus 10 to perform tomosynthesis imaging of the breast M. As an example, in the present embodiment, a server computer is used as the console 2.
The image storage system 3 is a system that stores image data such as a radiation image and a tomographic image which are obtained by imaging of the imaging apparatus 10. The image storage system 3 extracts image data corresponding to a request from the console 2 or the image processing apparatus 4 from the stored image data, and transmits the extracted image data to a device that is a source of the request. A specific example of the image storage system 3 is a picture archiving and communication system (PACS).
Next, the image processing apparatus according to the first embodiment will be described. First, a hardware configuration of the image processing apparatus according to the first embodiment will be described.
The storage 23 is implemented by a hard disk drive (HDD), a solid state drive (SSD), a flash memory, and the like. An image processing program 22 to be installed in the image processing apparatus 4 is stored in the storage 23 as a storage medium. The CPU 21 reads the image processing program 22 from the storage 23, expands the image processing program 22 in the memory 26, and executes the expanded image processing program 22.
The image processing program 22 is stored in a storage device of a server computer connected to the network or in a network storage in a state of being accessible from the outside, and is downloaded and installed in the computer that configures the image processing apparatus 4 in response to a request. Alternatively, the image processing program is distributed by being recorded on a recording medium such as a digital versatile disc (DVD) or a compact disc read only memory (CD-ROM), and is installed in a computer that configures the image processing apparatus 4 from the recording medium.
Next, a functional configuration of the image processing apparatus according to the first embodiment will be described.
The image acquisition unit 31 acquires a plurality of projection images generated in a case where the console 2 causes the imaging apparatus 10 to perform tomosynthesis imaging. The image acquisition unit 31 acquires a plurality of projection images from the console 2 or the image storage system 3 via the network I/F 27.
Here, the tomosynthesis imaging in the console 2 will be described. When performing the tomosynthesis imaging for generating a tomographic image, the console 2 moves the X-ray source 16 by rotating the arm part 12 around the rotation shaft 11. Further, the console 2 performs control to irradiate the breast M, which is the subject, with X-rays under predetermined imaging conditions for tomosynthesis imaging at a plurality of radiation source positions by the movement of the X-ray source 16. Further, the console 2 acquires a plurality of projection images Gi (i=1 to n; n is the number of radiation source positions, for example, n=15) at the plurality of radiation source positions which are obtained by detection of the X-rays transmitted through the breast M by the radiation detector 15.
The structure extraction unit 32 derives a plurality of feature-structure projection images by extracting a specific structure from each of the plurality of projection images Gi. Examples of the specific structure include at least one of a line structure or a point structure included in the breast M. Examples of the line structure include a mammary gland, a spicula, and a blood vessel. Examples of the point structure include a calcification, an intersection of a plurality of mammary glands, and an intersection of blood vessels. The point structure is not limited to a fine point, and a region having a predetermined area is also included in the point structure. In the following description, it is assumed that both the line structure and the point structure are used as the specific structure. In order to extract the line structure and the point structure, in the present embodiment, a concentration degree of a gradient vector representing a gradient of pixel values of the projection image Gi is used, which is described in Yoshinaga et al., “Evaluation Method of Concentration Degree and Convergence Index Filter”, Medical Imaging Technology, Vol. 19, No. 3, 2001.
The method of extracting the line structure using the concentration degree is a method of obtaining a gradient vector on both sides of a search line in a certain direction of the projection image Gi by using a line concentration degree filter, evaluating a concentration degree at which the gradient vector is concentrated, and extracting a search line having a high evaluation value as the line structure. The method of extracting the point structure using the concentration degree is a method of obtaining a gradient vector in a certain direction of the projection image Gi by using a point concentration degree filter, evaluating a concentration degree at which the gradient vector is concentrated, and extracting a point having a high evaluation value as the point structure. By using the concentration degree of the vector, it is possible to extract the point structure or the line structure from the projection image Gi regardless of the contrast of the projection image Gi.
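The point concentration evaluation described above can be sketched as follows. This is a simplified illustration of the convergence-index idea, not the exact filter of the cited paper; the square neighborhood, the normalization, and all names are assumptions. For each pixel, the gradient vectors around it are compared with the directions pointing at that pixel, so a point structure scores highly even when its contrast is low.

```python
import numpy as np

def point_convergence_index(image, radius=5, eps=1e-8):
    """Concentration degree of gradient vectors toward each pixel.

    For every pixel, the gradient vectors in a (2*radius+1)-square
    neighbourhood are compared with the unit vectors pointing at that
    pixel; the mean cosine of the angle between them is the concentration
    (convergence) index.  A bright point structure attracts the
    surrounding gradients and scores close to 1, largely independently
    of how low its contrast is.
    """
    gy, gx = np.gradient(image.astype(float))
    h, w = image.shape
    out = np.zeros((h, w))
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    mask = (ys != 0) | (xs != 0)          # exclude the centre pixel itself
    norm = np.sqrt(ys ** 2 + xs ** 2)
    safe = np.where(mask, norm, 1.0)
    uy, ux = -ys / safe, -xs / safe       # unit vectors toward the centre
    for y in range(radius, h - radius):
        for x in range(radius, w - radius):
            ngy = gy[y - radius:y + radius + 1, x - radius:x + radius + 1]
            ngx = gx[y - radius:y + radius + 1, x - radius:x + radius + 1]
            gnorm = np.hypot(ngy, ngx) + eps
            cos = (ngy * uy + ngx * ux) / gnorm
            out[y, x] = cos[mask].mean()
    return out
```

Because the index depends only on gradient directions, not magnitudes, thresholding it extracts point structures from the projection image Gi regardless of local contrast, as the text notes.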
It is preferable to set a coefficient of the concentration degree filter such that a line structure having an actual dimension of approximately 1 mm and a point structure having an actual dimension of approximately 0.5 mm are extracted. Thereby, it is possible to prevent noise included in the projection image Gi from being extracted as the line structure or the point structure.
In addition, the structure extraction unit 32 may extract an edge, an intersection of edges, a corner of edges, or the like included in the projection image Gi as the point structure by using an algorithm such as a Harris' corner detection method, scale-invariant feature transform (SIFT), features from accelerated segment test (FAST), or speeded up robust features (SURF).
In addition, the structure extraction unit 32 may extract at least one of the line structure or the point structure included in the projection image Gi by using a learning model obtained by machine learning.
The reconstruction unit 33 derives a feature-structure tomographic image in each of a plurality of tomographic planes of the breast M by reconstructing a plurality of feature-structure projection images SGi. In addition, the reconstruction unit 33 derives a tomographic image in which a desired tomographic plane of the breast M is emphasized by reconstructing all or some of the plurality of projection images Gi while correcting misregistration as described below. In addition, the reconstruction unit 33 derives a tomographic image in which a desired tomographic plane of the breast M is emphasized by reconstructing all or some of the plurality of projection images Gi without correcting a misregistration amount, as necessary.
Specifically, the reconstruction unit 33 derives a plurality of feature-structure tomographic images SDj (j=1 to m), one for each of the plurality of tomographic planes of the breast M.
The feature structure detection unit 34 detects at least one feature structure from a plurality of the feature-structure tomographic images SDj. Examples of the feature structure include a point structure and a line structure included in the feature-structure tomographic images SDj.
The feature structure detection unit 34 detects a point-shape structure such as a calcification, as a point structure, that is, as a feature structure, from the feature-structure tomographic image SDk by using a known algorithm. In addition, the feature structure detection unit 34 may detect, as a feature structure, a point such as an edge, an intersection of edges, or a corner of an edge included in the feature-structure tomographic image SDk by using an algorithm such as Harris' corner detection method, SIFT, FAST, or SURF. For example, the feature structure detection unit 34 detects the point-shape structure E1 included in the feature-structure tomographic image SDk as a feature structure F1.
As described above, the feature structure is not limited to the point structure. On the other hand, even in a case where a line structure is detected as the feature structure, for example, the brightness satisfying a specific threshold value condition can be used as a reference, or a known algorithm can be appropriately used. In addition, in order to detect the feature structure, an algorithm of computer aided diagnosis (CAD) may be used.
Here, for the sake of explanation, only one feature structure F1 is detected from one feature-structure tomographic image SDk. However, it is preferable to detect a plurality of feature structures. For example, all of the point structures included in the feature-structure tomographic image SDk, that is, the point-shape structures E1 to E3 and the intersections E4 and E5, may be detected as feature structures.
The projection unit 35 projects the plurality of projection images Gi on the corresponding tomographic plane, which is the tomographic plane corresponding to the tomographic image from which the feature structure F1 is detected, based on the positional relationship between the radiation source position and the radiation detector 15 when performing imaging, for each of the plurality of projection images Gi. Thereby, the projection unit 35 derives tomographic-plane projection images GTi corresponding to each of the plurality of projection images Gi. Hereinafter, derivation of the tomographic-plane projection images GTi will be described. In the present embodiment, since the feature structure is detected in each of the plurality of tomographic images Dj, the plurality of projection images Gi are respectively projected on the plurality of tomographic planes Tj corresponding to the plurality of tomographic images Dj. Thereby, the tomographic-plane projection images GTi are derived.
The tomographic image derived from the projection image Gi and the tomographic plane Tj is an image consisting of a plurality of pixels that are discretely arranged in a two-dimensional shape at a predetermined sampling interval, that is, an image in which pixels are arranged at grid points having a predetermined sampling interval.
Here, a relationship of the coordinates (sxi, syi, szi) of the radiation source position at the radiation source position Si, the coordinates (pxi, pyi) of the pixel position Pi in the projection image Gi, and the coordinates (tx, ty, tz) of the projection position on the tomographic plane Tj is represented by the following Expression (1). In the present embodiment, a z-axis is set to a direction orthogonal to the detection surface 15A of the radiation detector 15, a y-axis is set to a direction parallel to a direction in which the X-ray source 16 moves on the detection surface of the radiation detector 15, and an x-axis is set to a direction perpendicular to the y-axis.
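Expression (1) itself is not reproduced in this text. Under the geometry just described, with the detection surface at z = 0, it can be reconstructed from the similar triangles formed by the ray from the radiation source position Si through the point on the tomographic plane Tj; the following form is an assumption consistent with the surrounding description, not a verbatim copy of the disclosure's Expression (1):

```latex
% Ray from S_i = (sx_i, sy_i, sz_i) through (t_x, t_y, t_z) reaches the
% detector plane z = 0 at the ray parameter \lambda satisfying
% sz_i + \lambda (t_z - sz_i) = 0, i.e. \lambda = sz_i / (sz_i - t_z), so
px_i = sx_i + \frac{sz_i}{sz_i - t_z}\,(t_x - sx_i), \qquad
py_i = sy_i + \frac{sz_i}{sz_i - t_z}\,(t_y - sy_i) \tag{1}
```

Solving these relations for the plane coordinates gives $t_x = sx_i + (px_i - sx_i)(sz_i - t_z)/sz_i$ and the analogous expression for $t_y$, which is the inversion used in the next step.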
Therefore, by setting pxi and pyi in Expression (1) to the pixel position of the projection image Gi, and solving Expression (1) for tx and ty, the projection position on the tomographic plane Tj on which the pixel value of the projection image Gi is projected can be calculated. Therefore, by projecting the pixel value of the projection image Gi at the calculated projection position on the tomographic plane Tj, the tomographic-plane projection image GTi is derived.
In this case, the intersection of the tomographic plane Tj and the straight line connecting the radiation source position Si and the pixel position on the projection image Gi may not coincide with a pixel position on the tomographic plane Tj. For example, as illustrated in
As the interpolation calculation, a linear interpolation calculation that weights the pixel values of the projection image at the projection positions according to the distances between the pixel position and the plurality of projection positions around the pixel position can be used. In addition, any method such as a non-linear bicubic interpolation calculation, which uses more pixel values of projection positions around the pixel position, or a B-spline interpolation calculation can be used. Alternatively, instead of the interpolation calculation, the pixel value at the projection position closest to the pixel position may be used as the pixel value at the pixel position. Thereby, for the projection image Gi, the pixel values at all of the pixel positions of the tomographic plane Tj can be obtained. In the present embodiment, for each of the plurality of projection images Gi, the tomographic-plane projection image GTi having the pixel values obtained at all of the pixel positions of the tomographic plane Tj in this way is derived. Therefore, in one tomographic plane, the number of tomographic-plane projection images GTi matches the number of projection images Gi.
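One concrete form of the linear interpolation calculation is bilinear sampling, which weights the four pixel values surrounding a fractional position by distance. A minimal sketch, assuming image coordinates in pixel units (the function name and the clamping of border indices are illustrative, not part of the embodiment):

```python
import numpy as np

def bilinear_sample(image, x, y):
    """Bilinearly interpolate `image` at fractional coordinates (x, y).

    (x, y) are in pixel units; the base indices are clamped so that the
    four-neighbor stencil stays inside the image.
    """
    h, w = image.shape
    x0 = int(np.clip(np.floor(x), 0, w - 2))
    y0 = int(np.clip(np.floor(y), 0, h - 2))
    dx, dy = x - x0, y - y0
    # Weight the four surrounding pixel values by distance.
    return (image[y0, x0] * (1 - dx) * (1 - dy)
            + image[y0, x0 + 1] * dx * (1 - dy)
            + image[y0 + 1, x0] * (1 - dx) * dy
            + image[y0 + 1, x0 + 1] * dx * dy)
```

The nearest-neighbor alternative mentioned above simply replaces the weighted sum with the single closest pixel value.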
The misregistration amount derivation unit 36 derives a misregistration amount between the plurality of tomographic-plane projection images GTi based on the body movement of the breast M during the tomosynthesis imaging. First, the misregistration amount derivation unit 36 sets a local region corresponding to the feature structure F1 as a region of interest for the plurality of tomographic-plane projection images GTi. Specifically, the local region having a predetermined size centered on the coordinate position of the feature structure F1 is set as the region of interest.
As illustrated in
Further, the misregistration amount derivation unit 36 performs registration of the regions of interest R1 to R3. In the present embodiment, the registration of the other regions of interest is performed by using, as a reference, the region of interest that is set in the tomographic-plane projection image (reference tomographic-plane projection image) for the reference projection image (referred to as Gc) acquired at the radiation source position Sc at which the optical axis X0 of the X-rays from the X-ray source 16 is perpendicular to the radiation detector 15.
Here, it is assumed that the region of interest R2 illustrated in
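The shift-vector derivation described above can be sketched as an exhaustive integer-pixel search, assuming normalized cross-correlation as the similarity measure and a square search range (the function name, search radius, and use of wrap-around shifting are illustrative):

```python
import numpy as np

def estimate_shift(reference, target, search=5):
    """Estimate the integer (dy, dx) shift of `target` relative to
    `reference` by exhaustive search over a +/- `search` pixel range,
    keeping the shift that maximizes normalized cross-correlation."""
    a = reference - reference.mean()
    a_norm = np.sqrt((a * a).sum())
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Wrap-around shift keeps array sizes equal (a simplification).
            shifted = np.roll(target, (dy, dx), axis=(0, 1))
            b = shifted - shifted.mean()
            denom = a_norm * np.sqrt((b * b).sum())
            if denom == 0:
                continue
            score = (a * b).sum() / denom
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

The returned (dy, dx) pair plays the role of the shift vector of a region of interest with respect to the reference region of interest.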
In a case where the misregistration amount is derived, a search range in a case of deriving the misregistration amount may be changed depending on at least one of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast.
Here, in a case where the density of the mammary gland is low, the amount of fat in the breast M is large, and thus the body movement tends to be large in a case of imaging. Also, in a case where the breast M is large, the body movement tends to be large in a case of imaging. In addition, as the tomosynthesis imaging time is longer, the body movement during imaging tends to be larger. In addition, in a case where the compression pressure of the breast M is low, the body movement also tends to be large during imaging. Further, in a case where the imaging direction of the breast M is a medio-lateral oblique (MLO) direction, the body movement during imaging tends to be larger than in a cranio-caudal (CC) direction.
Therefore, preferably, the misregistration amount derivation unit 36 changes a search range in a case of deriving the misregistration amount by receiving, from the input device 25, the input of at least one piece of information of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast M. Specifically, in a case where the body movement tends to increase, the wide search range H2 illustrated in
In the above, for the sake of explanation, the misregistration amount between the plurality of tomographic-plane projection images GTi is derived for one feature structure F1 on one tomographic plane Tj. In practice, however, as illustrated in
In addition, the reconstruction unit 33 derives a corrected tomographic image Dhj in which the body movement is corrected by reconstructing the projection images Gi while correcting the misregistration amount derived in this way. Specifically, in a case where the back projection method is used for reconstruction, the misregistration is corrected based on the derived misregistration amount such that a pixel of the projection image Gi in which the misregistration occurs is back-projected at the position where the corresponding pixels of the other projection images are projected.
Instead of deriving the misregistration amounts for the plurality of different feature structures F, one misregistration amount may be derived from the plurality of different feature structures F. In this case, the region of interest is set for each of the plurality of different feature structures F, and the misregistration amount is derived on an assumption that the entire region of interest moves in the same direction by the same amount. In this case, the misregistration amount may be derived such that a representative value (for example, an average value, a median value, or a maximum value) of the correlation for all of the regions of interest between the tomographic-plane projection images is maximized, the tomographic-plane projection images being targets of the derivation of the misregistration amount. Here, in a case where a signal-to-noise ratio of each feature structure F in the tomographic-plane projection image is not very good, the accuracy in the derivation of the misregistration amount deteriorates. On the other hand, by deriving one misregistration amount from the plurality of different feature structures F in this way, even in a case where the signal-to-noise ratio of each feature structure F is not very good, the accuracy in the derivation of the misregistration amount can be improved.
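Deriving one misregistration amount from a plurality of feature structures might be sketched as follows, assuming the representative value is the average normalized cross-correlation over all regions of interest (the function names and ROI handling are hypothetical):

```python
import numpy as np

def common_shift(ref_rois, tgt_rois, search=3):
    """Find a single (dy, dx) shift that maximizes a representative value
    (here the average normalized cross-correlation) over all pairs of
    regions of interest, assuming the whole set moves rigidly together."""
    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return (a * b).sum() / denom if denom else 0.0
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # Representative value over all ROIs for this candidate shift.
            score = np.mean([ncc(r, np.roll(t, (dy, dx), axis=(0, 1)))
                             for r, t in zip(ref_rois, tgt_rois)])
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

Using a median or maximum instead of `np.mean` would correspond to the other representative values mentioned above.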
The three-dimensional space of the breast M represented by the plurality of tomographic images Dj may be divided into a plurality of three-dimensional regions, and one misregistration amount may be derived from the plurality of feature structures F in the same manner as described above for each region.
Further, in the present embodiment, the reconstruction unit 33 derives the plurality of tomographic images Dj by reconstructing the plurality of projection images Gi without correcting the misregistration amount.
The display control unit 37 displays the corrected tomographic image that is derived on the display 24.
Further, it is preferable that the tomographic image Dj and the corrected tomographic image Dhj display the same cross section. In a case of switching the tomographic plane to be displayed according to an instruction from the input device 25, it is preferable to link the tomographic plane to be displayed in the tomographic image Dj and the corrected tomographic image Dhj. In addition to the tomographic image Dj and the corrected tomographic image Dhj, the projection image Gi may be displayed.
The operator can confirm the success or failure of the body movement correction by looking at the display screen 40. Further, in a case where the body movement is too large, even in a case where the tomographic image is derived by performing reconstruction while correcting the misregistration amount as in the present embodiment, the body movement cannot be corrected accurately, and the body movement correction may fail. In such a case, the tomographic image Dj may have a higher image quality than the corrected tomographic image Dhj due to the failure of the body movement correction. Therefore, the input device 25 may receive an instruction to store any of the tomographic image Dj or the corrected tomographic image Dhj, and the instructed image may be stored in the storage 23 or an external storage device.
Next, processing performed in the first embodiment will be described.
Subsequently, the reconstruction unit 33 derives the feature-structure tomographic images SDj in each of the plurality of tomographic planes of the breast M by reconstructing the plurality of feature-structure projection images SGi (step ST3). Next, the feature structure detection unit 34 detects at least one feature structure from a plurality of the feature-structure tomographic images SDj (step ST4). Further, the projection unit 35 projects the plurality of projection images Gi on the corresponding tomographic plane corresponding to the tomographic image from which the feature structure F1 is detected, based on the positional relationship between the radiation source position and the radiation detector 15 in a case of imaging each of the plurality of projection images Gi, and derives the tomographic-plane projection image GTi corresponding to each of the plurality of projection images Gi (step ST5).
Next, the misregistration amount derivation unit 36 derives the misregistration amount between the plurality of tomographic-plane projection images GTi (step ST6). Further, the reconstruction unit 33 derives the corrected tomographic image Dhj by reconstructing the plurality of projection images Gi while correcting the misregistration (step ST7). Moreover, the display control unit 37 displays the corrected tomographic image Dhj on the display 24 (step ST8), and the processing is ended. The corrected tomographic image Dhj that is derived is transmitted to the image storage system 3, and is stored in the image storage system 3.
Here, since imaging is performed a plurality of times in a case where the tomosynthesis imaging is performed, the irradiation dose of the radiation in one imaging operation is small in order to reduce the exposure dose. As a result, the projection image Gi contains a large amount of noise. For this reason, a structure such as calcification in the breast M may be buried in noise in the projection image Gi because the contrast of the structure is reduced depending on how the structures overlap with each other. Therefore, in a case where the feature structure is detected from a tomographic image derived by reconstructing the plurality of projection images Gi, it may not be possible to accurately associate the feature structure detected from the tomographic image with the structure corresponding to the feature structure included in the projection image. As a result, it may not be possible to accurately correct the misregistration of the projection images Gi using the feature structure.
In the first embodiment, the feature-structure projection image SGi is derived by extracting the specific structure, such as a line structure and a point structure, from the projection image, and the feature-structure tomographic image SDj is derived by reconstructing the feature-structure projection image SGi. The feature structure is detected from the feature-structure tomographic image SDj. Here, since the feature-structure tomographic image SDj is derived from the feature-structure projection image SGi, it is guaranteed that the structure corresponding to the feature structure detected from the feature-structure tomographic image SDj is included in the feature-structure projection image SGi and the projection image Gi. Therefore, according to the first embodiment, the misregistration amount between the plurality of projection images Gi can be appropriately derived by using the detected feature structure. As a result, according to the present embodiment, it is possible to acquire the high-quality corrected tomographic image Dhj in which the influence of the body movement is reduced.
Further, in the first embodiment, the feature structure is detected from a plurality of the feature-structure tomographic images SDj, instead of the projection image Gi or the tomographic-plane projection image GTi. Here, the feature-structure tomographic image SDj includes only the structure included in the corresponding tomographic plane Tj. For this reason, the structure on another tomographic plane included in the projection image Gi is not included in the feature-structure tomographic image SDj. Thus, according to the first embodiment, the feature structure can be accurately detected without being affected by the structures on other tomographic planes. Therefore, the misregistration amount between the plurality of projection images Gi can be appropriately derived. As a result, according to the present embodiment, the high-quality corrected tomographic image Dhj in which the influence of the body movement is reduced can be acquired.
Next, a second embodiment of the present disclosure will be described. A configuration of an image processing apparatus according to the second embodiment is the same as the configuration of the image processing apparatus according to the first embodiment illustrated in
In the second embodiment, the misregistration amount derivation unit 36 derives the misregistration amount between the tomographic-plane projection images GTi based on the temporary misregistration amount. Specifically, as in the first embodiment, the misregistration amount is derived by using, as a reference, the projection image acquired at the reference radiation source position Sc at which the optical axis X0 of the X-rays from the X-ray source 16 is perpendicular to the radiation detector 15. Here, in a case where the tomographic-plane projection image GT2 derived from the projection image G2 is the reference tomographic-plane projection image, the misregistration amount derivation unit 36 derives the misregistration amount between the tomographic-plane projection image GT1 and the tomographic-plane projection image GT2 as a difference value Vf1−Vf2 of the shift vectors Vf1 and Vf2 of the regions of interest R1 and R2 with respect to the region of interest Rf0. In addition, the misregistration amount derivation unit 36 derives the misregistration amount between the tomographic-plane projection image GT3 and the tomographic-plane projection image GT2 as a difference value Vf3−Vf2 of the shift vectors Vf3 and Vf2 of the regions of interest R3 and R2 with respect to the region of interest Rf0.
As described above, in the second embodiment, the temporary misregistration amounts of the regions of interest R1 to R3 that are set in the tomographic-plane projection images GTi with respect to the region of interest Rf0 that is set in the feature-structure tomographic image SDj are derived, and the misregistration amount between the tomographic-plane projection images GTi is derived based on the temporary misregistration amounts. Here, since the region of interest Rf0 is set in the feature-structure tomographic image SDj, unlike the projection image Gi, only the structure on the tomographic plane from which the feature-structure tomographic image SDj is acquired is included. Therefore, according to the second embodiment, the influence of the structures included in tomographic planes other than the tomographic plane in which the feature structure is set is reduced, and thus the misregistration amount between the plurality of projection images Gi can be accurately derived. As a result, according to the second embodiment, a high-quality corrected tomographic image Dhj in which the influence of the body movement is reduced can be acquired.
In the second embodiment, as in the first embodiment, a search range in a case of deriving the misregistration amount may be changed depending on at least one of a density of a mammary gland for the breast M, a size of the breast M, an imaging time of the tomosynthesis imaging, a compression pressure of the breast M in a case of the tomosynthesis imaging, or an imaging direction of the breast M.
Further, in the second embodiment, the shift vectors Vf1 to Vf3 of the regions of interest R1 to R3 with respect to the region of interest Rf0 are derived as the temporary misregistration amounts. On the other hand, in this case, a peripheral region Ra0 that is smaller than the region of interest Rf0 may be set around the feature structure F1 of the region of interest Rf0 as illustrated in
Further, in the second embodiment, the region of interest Rf0 is set in the feature-structure tomographic image SDj. On the other hand, the feature-structure tomographic images to be derived may be different for each of the tomographic-plane projection images GTi for derivation of the temporary misregistration amount. Specifically, it is preferable to derive the feature-structure tomographic image excluding the target projection image corresponding to the target tomographic-plane projection image that is a target for derivation of the temporary misregistration amount. Hereinafter, this case will be described as a third embodiment.
In a case where the temporary misregistration amount for the projection image G2 is derived, the reconstruction unit 33 derives a feature-structure tomographic image (referred to as SDj_2) by reconstructing the feature-structure projection images SG1, and SG3 to SG15 other than the feature-structure projection image SG2 derived from the projection image G2. In addition, the feature structure detection unit 34 detects the feature structure from the feature-structure tomographic image SDj_2, and the projection unit 35 derives the tomographic-plane projection images GT1 to GT15 from the projection images G1 to G15. The misregistration amount derivation unit 36 sets the region of interest Rf0_2 in the feature-structure tomographic image SDj_2, and derives the shift vector Vf2 of the region of interest R2 that is set in the tomographic-plane projection image GT2, as the temporary misregistration amount.
In addition, the temporary misregistration amounts for all of the tomographic-plane projection images GTi are derived by sequentially changing the target tomographic-plane projection image. As in the second embodiment, the misregistration amount between the tomographic-plane projection images GTi is derived based on the temporary misregistration amounts.
As described above, according to the third embodiment, the temporary misregistration amount is derived using the feature-structure tomographic image that is not affected by the target projection image. Therefore, the temporary misregistration amount can be more accurately derived, and as a result, the misregistration amount can be accurately derived.
In the third embodiment, in a case of reconstructing the feature-structure tomographic image excluding the feature-structure projection image for the target projection image, as shown in the following Expression (2), the feature-structure tomographic image may be derived by subtracting the corresponding pixel value Gp of the feature-structure projection image SGi derived from the target projection image Gi from the pixel value Dp of each pixel of the feature-structure tomographic image SDj, which is derived by reconstructing all of the feature-structure projection images SGi, and multiplying the result by n/(n−1). Although the method of Expression (2) is simple, it reduces the amount of calculation for deriving the feature-structure tomographic image excluding the feature-structure projection image for the target projection image. Therefore, the processing for deriving the temporary misregistration amount can be performed at high speed.
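Expression (2) itself is not reproduced in this excerpt, but assuming the simple back projection averages the n back-projected feature-structure projection images, the leave-one-out image satisfies an exact subtract-and-rescale identity in the spirit of the described method (the array shapes, and the interpretation of the subtracted term as the target's per-pixel contribution divided by n, are assumptions):

```python
import numpy as np

def leave_one_out(backprojections, k):
    """Reconstruct the feature-structure tomographic image excluding the
    k-th feature-structure projection image via the subtract-and-rescale
    shortcut: (SD - B_k / n) * n / (n - 1), where SD is the mean of all n
    back-projected images and B_k / n is the k-th image's contribution."""
    n = len(backprojections)
    SD = backprojections.mean(axis=0)
    return (SD - backprojections[k] / n) * n / (n - 1)
```

The result equals a full reconstruction from the remaining n − 1 images, but costs only one subtraction and one scaling per pixel.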
Next, a fourth embodiment will be described. A configuration of the image processing apparatus according to the fourth embodiment is the same as the configuration of the image processing apparatus according to the first embodiment illustrated in
In a case where a determination result in step ST17 is "No", the reconstruction unit 33 updates the feature-structure tomographic image by reconstructing the plurality of feature-structure projection images SGi while correcting the misregistration amount (step ST18). Then, the process returns to step ST14, and the processing of step ST14 to step ST17 is performed again. In this case, in the processing of step ST14, the feature structure detection unit 34 updates the threshold value Th1 that was used in a case of detecting the feature structure in the first processing of step ST14. In the present embodiment, since the pixel value of the feature-structure tomographic image SDj has a smaller value as the brightness is higher, the updated feature structure is detected by using an updated threshold value Th2, which is smaller than the threshold value Th1 used in the first processing of step ST14.
In addition, in the processing of step ST15, the projection unit 35 projects the plurality of projection images Gi onto the corresponding tomographic plane corresponding to the tomographic image from which the updated feature structure F1 is detected, based on the positional relationship between the radiation source position and the radiation detector 15 when performing imaging for each of the plurality of projection images Gi, and derives the updated tomographic-plane projection images GTi corresponding to each of the plurality of projection images Gi. In addition, in the processing of step ST16, the misregistration amount derivation unit 36 derives the updated misregistration amount between the plurality of updated tomographic-plane projection images GTi.
In a case where the pixel value of the feature-structure tomographic image SDj has a larger value as the brightness is higher, the updated feature structure is detected by using the updated threshold value Th2, which is larger than the threshold value Th1 used in the first processing of step ST14.
In a case where a determination result in step ST17 is No, processing of step ST18 and step ST14 to step ST16 is repeated until the determination result in step ST17 is Yes. In this case, the feature structure detection unit 34 detects the feature structure from the updated feature-structure tomographic image SDj by using the updated threshold value.
In a case where the determination result in step ST17 is Yes, the reconstruction unit 33 derives the corrected tomographic images Dhj by reconstructing the plurality of projection images Gi while correcting the updated misregistration amount (step ST19). In addition, the display control unit 37 displays the corrected tomographic image Dhj on the display 24 (step ST20), and the processing is ended. The corrected tomographic image Dhj that is derived is transmitted to the image storage system 3, and is stored in the image storage system 3.
As described above, in the fourth embodiment, the feature-structure tomographic image SDj is updated by reconstructing the feature-structure projection images SGi while correcting the misregistration amount. The updated feature structure is detected from the updated feature-structure tomographic image by using the updated threshold value. The misregistration amount is updated by using the updated feature structure. The update of the feature-structure tomographic image, the detection of the updated feature structure by using the updated threshold value, and the update of the misregistration amount are repeated until the misregistration amount converges. Therefore, the misregistration due to the body movement can be removed more appropriately and efficiently, and thus, it is possible to acquire a higher-quality tomographic image.
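The repeat-until-convergence control described above can be sketched as a generic fixed-point iteration, where `update` stands in for one pass of reconstruction, feature detection, and misregistration derivation, and the tolerance plays the role of the convergence check (the callable, tolerance, and iteration budget are illustrative, not part of the embodiment):

```python
def refine(initial, update, tol=1e-3, max_iter=50):
    """Generic sketch of the repeat-until-convergence loop: apply `update`
    until the estimate changes by less than `tol` (convergence) or the
    iteration budget is exhausted, then return the latest estimate."""
    estimate = initial
    for _ in range(max_iter):
        new_estimate = update(estimate)
        if abs(new_estimate - estimate) < tol:
            return new_estimate
        estimate = new_estimate
    return estimate
```

For instance, `refine(1.0, lambda x: (x + 2 / x) / 2)` converges to the square root of 2, illustrating the fixed-point behavior; in the embodiment the estimate would be the misregistration amount, and each update would also shrink the detection threshold.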
Also in the second embodiment and the third embodiment, the processing of updating the misregistration amount may be repeated until the misregistration amount converges as in the fourth embodiment.
In addition, in the fourth embodiment, while updating the threshold value, the update of the feature-structure tomographic image, the detection of the updated feature structure by using the updated threshold value, and the update of the misregistration amount are repeated until the misregistration amount converges. On the other hand, the present disclosure is not limited thereto. The update of the feature-structure tomographic image, the detection of the updated feature structure, and the update of the misregistration amount may be repeated without updating the threshold value.
In addition, in the fourth embodiment, the processing of updating the misregistration amount is repeated until the misregistration amount converges. On the other hand, the present disclosure is not limited thereto. The processing of updating the misregistration amount may be repeated a predetermined number of times.
Further, in the above embodiments, the misregistration amount derived by the misregistration amount derivation unit 36 may be compared with a predetermined threshold value, and only in a case where the misregistration amount exceeds the threshold value, the tomographic image may be reconstructed while correcting the misregistration amount. The threshold value may be set to a value at which it can be said that there is no influence of the body movement on the tomographic image without correcting the misregistration amount. In this case, as illustrated in
In the above embodiments, in order to easily derive the misregistration amount and the temporary misregistration amount, the regions of interest are set in the feature-structure tomographic image SDj and the tomographic-plane projection image GTi, and the movement direction and the movement amount of the region of interest are derived as the shift vector, that is, the misregistration amount and the temporary misregistration amount. On the other hand, the present disclosure is not limited thereto. The misregistration amount may be derived without setting the region of interest.
Next, a fifth embodiment of the present disclosure will be described.
Here, in the tomographic images acquired by the tomosynthesis imaging, a reflected glare of a structure occurs in tomographic images other than the tomographic image in which the structure is actually present. This is called a ripple artifact.
Here, in a case where the feature structure F detected by the feature structure detection unit 34 from the feature-structure tomographic image SDj of the corresponding tomographic plane is the ripple artifact, the feature structure F is blurred and widely spread. For this reason, in a case where such a feature structure F is used, the misregistration amount cannot be accurately derived.
Therefore, in the fifth embodiment, the focal plane determination unit 38 determines whether or not the corresponding tomographic plane for the feature-structure tomographic image SDj from which the feature structure F is detected is a focal plane. The projection unit 35 derives the tomographic-plane projection images GTi in the corresponding tomographic plane determined as a focal plane, and the misregistration amount derivation unit 36 derives the misregistration amount. Specifically, the misregistration amount is derived by using the feature structure detected from the feature-structure tomographic image in the corresponding tomographic plane determined as a focal plane. Hereinafter, the determination as to whether or not the corresponding tomographic plane is a focal plane will be described.
The focal plane determination unit 38 derives corresponding points corresponding to the feature structures in the plurality of feature-structure tomographic images SDj for the feature structures detected by the feature structure detection unit 34.
Therefore, in a case where the position of the tomographic plane in which the feature structure F3 is detected is the position P0 illustrated in
The projection unit 35 derives the tomographic-plane projection image GTi only in the corresponding tomographic plane determined as the focal plane, as in the above embodiments. The misregistration amount derivation unit 36 derives the misregistration amount of the tomographic-plane projection image GTi in the corresponding tomographic plane determined as the focal plane. That is, the misregistration amount derivation unit 36 derives the misregistration amount of the tomographic-plane projection image GTi by using the feature structure detected in the corresponding tomographic plane determined as the focal plane.
Next, processing performed in the fifth embodiment will be described.
In a case where the feature structure detection unit 34 detects a plurality of feature structures, the focal plane determination unit 38 determines whether or not a corresponding tomographic plane, which corresponds to the feature-structure tomographic image from which each of the plurality of feature structures is detected by the feature structure detection unit 34, is the focal plane (focal plane determination; step ST25). In addition, the projection unit 35 derives the tomographic-plane projection image GTi in the corresponding tomographic plane determined as the focal plane (step ST26), and the misregistration amount derivation unit 36 derives the misregistration amount by using the feature structure detected in the feature-structure tomographic image of the corresponding tomographic plane determined as the focal plane (step ST27).
Further, the reconstruction unit 33 derives the corrected tomographic image Dhj by reconstructing the plurality of projection images Gi while correcting the misregistration amount (step ST28). In addition, the display control unit 37 displays the corrected tomographic image Dhj on the display 24 (step ST29), and the processing is ended. The corrected tomographic image Dhj that is derived is transmitted to the image storage system 3, and is stored in the image storage system 3.
As described above, in the fifth embodiment, the misregistration amount is derived in the corresponding tomographic plane determined as the focal plane. Therefore, the misregistration amount can be accurately derived without being affected by the ripple artifact, and as a result, the corrected tomographic image Dhj in which the misregistration is accurately corrected can be derived.
In the fifth embodiment, whether or not the corresponding tomographic plane is the focal plane is determined by using the result obtained by plotting the pixel values of the feature structure and the corresponding points. On the other hand, the determination as to whether or not the corresponding tomographic plane is the focal plane is not limited thereto. Between the feature structure and the ripple artifact, a difference in contrast with surrounding pixels is larger for the feature structure than for the ripple artifact. Therefore, a difference in contrast with surrounding pixels may be derived for the feature structure and the corresponding points, and in a case where the contrast for the feature structure is the maximum, the corresponding tomographic plane in which the feature structure is detected may be determined as the focal plane. In addition, in a case where the feature structure is present in the focal plane, the pixel value at the position corresponding to the feature structure has a small variation between the projection images. On the other hand, in a case where the feature structure is not present in the focal plane, the pixel value at the position corresponding to the feature structure in the projection image may represent a structure other than the structure corresponding to the feature structure, and as a result, the pixel value has a large variation between the projection images. Therefore, a variance value of the pixel value corresponding to the feature structure between the projection images Gi may be derived, and in a case where the variance value is equal to or smaller than a predetermined threshold value, the corresponding tomographic plane in which the feature structure is detected may be determined as the focal plane.
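The variance-based test described above might be sketched as follows, assuming the pixel values at the position corresponding to the feature structure have already been collected from every projection image (the function name and threshold are illustrative):

```python
import numpy as np

def is_focal_plane(values_across_projections, variance_threshold):
    """Return True if the variance of the pixel values collected at the
    feature-structure position over all projection images is small, i.e.
    the same physical structure is likely seen in every projection and
    the corresponding tomographic plane can be treated as a focal plane."""
    return bool(np.var(values_across_projections) <= variance_threshold)
```

A ripple artifact, which mixes in pixel values from other structures, would produce a large variance and fail this test.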
Further, the focal plane determination unit 38 may include a discriminator that is obtained by machine learning, receives the feature structure and the pixel value in the periphery of the feature structure, and outputs a determination result as to whether or not the corresponding tomographic plane in which the feature structure is detected is the focal plane. In this case, the discriminator may determine whether or not the corresponding tomographic plane in which the feature structure is detected is the focal plane.
Next, a sixth embodiment of the present disclosure will be described.
The misregistration amount determination unit 39 sets, for the image quality evaluation, the regions of interest Rh1 and Rh2 centered on the coordinate positions of the plurality (here, two) of the feature structures F4 and F5 included in the corrected tomographic image Dhj illustrated in
In a case where the misregistration correction is appropriately performed by deriving the misregistration amount appropriately, the image blurriness of the corrected tomographic image Dhj decreases, and the high-frequency components increase. On the other hand, in a case where the misregistration correction is inappropriate because the derived misregistration amount is not appropriate, the image blurriness of the corrected tomographic image Dhj increases, and the high-frequency components decrease. Therefore, in the sixth embodiment, the misregistration amount determination unit 39 performs the image quality evaluation based on the magnitudes of the high-frequency components. That is, the misregistration amount determination unit 39 determines whether or not the sum of the magnitudes of the high-frequency components of all of the regions of interest Rh1 and Rh2, which are derived as above, is equal to or larger than a predetermined threshold value Th20. In a case where the sum is equal to or larger than the threshold value Th20, the misregistration amount determination unit 39 determines that the misregistration amount is appropriate, and in a case where the sum is smaller than the threshold value Th20, the misregistration amount determination unit 39 determines that the misregistration amount is inappropriate. In a case where the misregistration amount determination unit 39 determines that the misregistration amount is inappropriate, the reconstruction unit 33 derives the tomographic images Dj by reconstructing the plurality of projection images Gi without correcting the misregistration amount. In addition, the display control unit 37 displays the tomographic image Dj before correction on the display 24 instead of the corrected tomographic image Dhj. In this case, instead of the corrected tomographic image Dhj, the tomographic image Dj before correction is transmitted to an external storage device.
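The threshold determination on the sum of the high-frequency magnitudes might be sketched as follows, assuming a simple 4-neighbour Laplacian as the high-frequency filter. The function names and the comparison structure are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def laplacian_magnitude(roi):
    """Sum of absolute values of a 4-neighbour Laplacian over a
    region of interest; a proxy for its high-frequency content."""
    roi = np.asarray(roi, dtype=float)
    lap = (roi[1:-1, :-2] + roi[1:-1, 2:] +
           roi[:-2, 1:-1] + roi[2:, 1:-1] - 4.0 * roi[1:-1, 1:-1])
    return float(np.abs(lap).sum())

def misregistration_amount_is_appropriate(rois, th20):
    """Return True when the summed high-frequency magnitude over all
    regions of interest (e.g. Rh1 and Rh2) is equal to or larger than
    the threshold Th20, i.e. the corrected image is sharp enough."""
    total = sum(laplacian_magnitude(r) for r in rois)
    return total >= th20
```

A sharp (high-frequency) region passes the check, while a flat (blurred) region fails it, matching the decision logic described above.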
Next, the processing performed in the sixth embodiment will be described.
In a case where the misregistration amount is appropriate, the display control unit 37 displays the corrected tomographic images Dhj on the display 24 (step ST39), and the processing is ended. The corrected tomographic image Dhj that is derived is transmitted to the image storage system 3, and is stored in the image storage system 3. On the other hand, in a case where the misregistration amount is inappropriate, the reconstruction unit 33 derives the tomographic images Dj by reconstructing the plurality of projection images Gi without correcting the misregistration amount (step ST40). In addition, the display control unit 37 displays the tomographic images Dj on the display 24 (step ST41), and the processing is ended. In this case, the tomographic image Dj is transmitted to the image storage system 3, and is stored in the image storage system 3.
Here, in a case where the misregistration amount is derived by the misregistration amount derivation unit 36, an appropriate misregistration amount may not be derived due to the influence of the structure other than the feature structure. In the sixth embodiment, the image quality evaluation is performed on the corrected tomographic image Dhj, and the determination as to whether the misregistration amount is appropriate or inappropriate is performed based on the result of the image quality evaluation. Therefore, it is possible to appropriately determine whether the derived misregistration amount is appropriate or inappropriate. Further, in a case where it is determined that the misregistration amount is inappropriate, the tomographic image Dj before correction is displayed or stored. Thus, it is possible to reduce a possibility of performing an erroneous diagnosis due to the corrected tomographic image Dhj derived based on the inappropriate misregistration amount.
In the sixth embodiment, the image quality evaluation is performed based on the magnitude of the high-frequency components of the region of interest that is set in the corrected tomographic image Dhj. On the other hand, the present disclosure is not limited thereto. The reconstruction unit 33 may derive a plurality of tomographic images Dj by reconstructing a plurality of projection images Gi without performing the misregistration correction. The misregistration amount determination unit 39 may further perform the image quality evaluation of the region of interest including the feature structure in the tomographic image Dj, compare the result of the image quality evaluation for the corrected tomographic image Dhj and the result of the image quality evaluation for the tomographic image Dj, and determine the tomographic image having higher image quality as the final tomographic image. Here, the final tomographic image is a tomographic image displayed on the display 24 or transmitted to an external device and stored.
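The alternative described above, comparing the image quality of the corrected and uncorrected tomographic images and keeping the better one, can be sketched as follows. The sharpness score (sum of absolute 4-neighbour Laplacian responses) and the function name are assumptions for illustration.

```python
import numpy as np

def select_final_tomographic_image(dhj_roi, dj_roi):
    """Compare an image-quality evaluation of the region of interest
    in the corrected image Dhj and in the uncorrected image Dj, and
    return a label plus the region with the higher score. The final
    tomographic image is the one displayed or stored."""
    def sharpness(img):
        a = np.asarray(img, dtype=float)
        lap = (a[1:-1, :-2] + a[1:-1, 2:] +
               a[:-2, 1:-1] + a[2:, 1:-1] - 4.0 * a[1:-1, 1:-1])
        return float(np.abs(lap).sum())
    if sharpness(dhj_roi) >= sharpness(dj_roi):
        return ("Dhj", dhj_roi)
    return ("Dj", dj_roi)
```

With this selection, the corrected image is preferred only when the misregistration correction actually improved (or at least preserved) the evaluated image quality.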
Even in the fifth embodiment and the sixth embodiment, the update of the misregistration amount may be repeated as in the fourth embodiment.
Further, also in the fifth embodiment and the sixth embodiment, the misregistration amount derived by the misregistration amount derivation unit 36 may be compared with a predetermined threshold value, and only in a case where the misregistration amount exceeds the threshold value, the tomographic image may be reconstructed while correcting the misregistration amount.
Next, a seventh embodiment of the present disclosure will be described.
In the seventh embodiment, the evaluation function derivation unit 50 derives a high-frequency image for the region of interest corresponding to the feature structure F, the region of interest being set with respect to the tomographic-plane projection image GTi by the misregistration amount derivation unit 36. The derivation of the high-frequency image may be performed, as in the misregistration amount determination unit 39 according to the sixth embodiment, by performing filtering processing using a Laplacian filter and deriving a secondary differential image. It is assumed that the pixel value of the derived high-frequency image in the region of interest is qkl, where k represents the k-th projection image and l represents the l-th pixel in the region of interest.
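The derivation of the secondary differential values qkl by Laplacian filtering can be sketched as follows; the kernel, the function name, and the pixel ordering are illustrative assumptions.

```python
import numpy as np

# Standard 3x3 Laplacian kernel (secondary differential operator).
LAPLACIAN_KERNEL = np.array([[0, 1, 0],
                             [1, -4, 1],
                             [0, 1, 0]], dtype=float)

def high_frequency_pixels(roi):
    """Apply a 3x3 Laplacian filter to the region of interest of one
    tomographic-plane projection image GTi and return the secondary
    differential (high-frequency) pixel values, flattened so that the
    index l runs over the pixels of the region of interest."""
    a = np.asarray(roi, dtype=float)
    h, w = a.shape
    out = np.empty((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            out[y, x] = (a[y:y + 3, x:x + 3] * LAPLACIAN_KERNEL).sum()
    return out.ravel()
```

Calling this once per projection image k yields the full set of values qkl used by the evaluation function.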
Here, it is assumed that a transformation matrix for correcting the misregistration amount is Wk and a transformation parameter in the transformation matrix is θk. The transformation parameter θk corresponds to the misregistration amount. In this case, the image quality evaluation value of the region of interest corresponding to the feature structure F in the corrected tomographic image Dhj can be regarded as an added value of the magnitudes of the high-frequency image of the region of interest after misregistration correction in each of the projection images Gi. By deriving the transformation parameter θk, that is, the misregistration amount such that the added value is the maximum, the corrected tomographic image Dhj in which the misregistration amount is appropriately corrected can be derived.
Therefore, the evaluation function derivation unit 50 derives the evaluation function shown in the following Expression (3). The evaluation function Ec shown in Expression (3) is an evaluation function for obtaining the transformation parameter θk that minimizes the value in the parentheses on the right side, which is preceded by a minus sign, thereby maximizing the above-described addition result. The evaluation function shown in Expression (3) has a plurality of local solutions. Therefore, a constraint condition is applied to the range and the average value of the transformation parameter θk. For example, a constraint condition is applied such that the average of the transformation parameters θk for all of the projection images is 0. More specifically, in a case where the transformation parameter θk is a movement vector representing parallel movement, a constraint condition is applied such that the average value of the movement vectors for all of the projection images Gi is 0. In addition, in the seventh embodiment, the misregistration amount derivation unit 36 derives the transformation parameter θk that minimizes the evaluation function Ec shown in the following Expression (3), that is, the misregistration amount.
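Expression (3) itself is not reproduced here; the sketch below merely illustrates the structure described above under simplifying assumptions: θk is taken to be a parallel-movement shift vector, the high-frequency magnitude is a 4-neighbour Laplacian, and the zero-mean constraint is enforced by subtracting the average shift. All names are hypothetical.

```python
import numpy as np

def evaluation_ec(projections, shifts):
    """Illustrative evaluation function Ec: the negated sum, over all
    projection images k and all pixels l of the shifted region of
    interest, of the high-frequency magnitudes. The minus sign means
    that minimizing Ec maximizes the sharpness sum."""
    total = 0.0
    for img, (dy, dx) in zip(projections, shifts):
        roi = np.roll(np.roll(np.asarray(img, dtype=float), dy, 0), dx, 1)
        lap = (roi[1:-1, :-2] + roi[1:-1, 2:] +
               roi[:-2, 1:-1] + roi[2:, 1:-1] - 4.0 * roi[1:-1, 1:-1])
        total += np.abs(lap).sum()
    return -total

def enforce_zero_mean(shifts):
    """Constraint condition: the average of the shift vectors
    (transformation parameters θk) over all projection images is 0."""
    s = np.asarray(shifts, dtype=float)
    return s - s.mean(axis=0)
```

A solver would search over candidate shift sets, project each candidate through `enforce_zero_mean`, and keep the set giving the smallest `evaluation_ec`; the constraint removes the ambiguity among the multiple local solutions noted above.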
As described above, in the seventh embodiment, the image processing apparatus further comprises the evaluation function derivation unit 50 that derives an evaluation function for performing image quality evaluation for a region of interest including the feature structure in the corrected tomographic image Dhj, and the misregistration amount derivation unit 36 derives the misregistration amount for optimizing the evaluation function. Therefore, it is possible to reduce a possibility of performing an erroneous diagnosis due to the corrected tomographic image Dhj derived based on the inappropriate misregistration amount.
In the above embodiments, in order to easily derive the misregistration amount and the temporary misregistration amount, the regions of interest are set in the tomographic image Dj and the tomographic-plane projection image GTi, and the movement direction and the movement amount of the region of interest are derived as the shift vectors, that is, the misregistration amount and the temporary misregistration amount. On the other hand, the present disclosure is not limited thereto. The misregistration amount may be derived without setting the region of interest.
Further, in the above embodiments, the tomographic-plane projection image GTi is derived by the projection unit 35, and the misregistration amount between the tomographic-plane projection images GTi is derived by the misregistration amount derivation unit 36. On the other hand, the present disclosure is not limited thereto. The misregistration amount between the projection images Gi may be derived without deriving the tomographic-plane projection image GTi. In this case, the projection unit 35 is unnecessary in the above embodiments. Further, the misregistration amount derivation unit 36 may derive the misregistration amount based on the positional relationship of the projection images Gi in the corresponding tomographic plane corresponding to the tomographic image from which the feature structure F is detected.
Further, in the embodiments described above, the subject is the breast M, but the present disclosure is not limited thereto. It is needless to say that any part such as the chest or the abdomen of the human body may be the subject.
In the embodiments described above, for example, the following various processors can be used as the hardware structures of processing units that execute various kinds of processing, such as the image acquisition unit 31, the structure extraction unit 32, the reconstruction unit 33, the feature structure detection unit 34, the projection unit 35, the misregistration amount derivation unit 36, the display control unit 37, the focal plane determination unit 38, the misregistration amount determination unit 39, and the evaluation function derivation unit 50. The various processors include, as described above, a CPU, which is a general-purpose processor that functions as various processing units by executing software (a program); a programmable logic device (PLD), which is a processor of which the circuit configuration can be changed after manufacturing, such as a field programmable gate array (FPGA); and a dedicated electric circuit, which is a processor having a circuit configuration specifically designed to execute specific processing, such as an application specific integrated circuit (ASIC).
One processing unit may be configured by one of these various processors, or may be configured by a combination of two or more processors having the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA). In addition, a plurality of processing units may be configured by one processor.
As an example in which the plurality of processing units are configured by one processor, firstly, as represented by a computer such as a client or a server, a form in which one processor is configured by a combination of one or more CPUs and software and the processor functions as the plurality of processing units may be adopted. Secondly, as represented by a system on chip (SoC) or the like, a form in which a processor that realizes the function of the entire system including the plurality of processing units by one integrated circuit (IC) chip is used may be adopted. As described above, the various processing units are configured by using one or more of the various processors as a hardware structure.
Further, as the hardware structure of the various processors, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined may be used.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2022-035379 | Mar 2022 | JP | national |
This application is a continuation application of International Application No. PCT/JP2022/046280, filed on Dec. 15, 2022, which is incorporated herein by reference in its entirety. Further, this application claims priority from Japanese Patent Application No. 2022-035379, filed on Mar. 8, 2022, the disclosure of which is incorporated by reference herein in its entirety.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2022/046280 | Dec 2022 | WO |
| Child | 18820266 | | US |