This application is based upon and claims the benefit of priority from Chinese Patent Application No. 202210109999.8, filed on Jan. 29, 2022, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a medical image processing apparatus, a medical image processing method, and a recording medium.
General medical images include ultrasonic (US) images, computed tomography (CT) images, magnetic resonance (MR) images, and the like. A medical image processing apparatus needs to perform a registration on different types of medical images according to the needs of different medical practices.
For example, some surgeries and examinations need to be guided by ultrasonic images, but the ultrasonic images have problems such as low contrast and high noise. Therefore, it may be required to fuse ultrasonic images with medical images (for example, MR images) other than the ultrasonic images to more accurately position a certain organ (or site) of the body. The image fusion usually establishes a correspondence relation between a coordinate system of the ultrasonic image and a coordinate system of the MR image by a rigid registration of the ultrasonic image and the MR image.
However, when collecting the ultrasonic image, a probe needs to be held in close contact with a site to be scanned, and in this case, an organ to be detected is likely to be deformed, and an image including such deformation may adversely affect the accuracy of rigid registration. Particularly, when the ultrasonic image includes a relatively severe deformation and a part of the contours (edges) of a deformed organ is similar to a part of the contours (edges) of another organ in the MR image, the organ in the ultrasonic image and the other organ in the MR image may be erroneously registered. This may result in rigid registration failures and image fusion failures.
A medical image processing apparatus according to the present embodiment includes processing circuitry. The processing circuitry extracts a registration target area corresponding to a registration target organ from an ultrasonic image by acquiring edges of the registration target organ from the ultrasonic image among a plurality of types of medical images including the registration target organ. The processing circuitry calculates the degree of deformation at a plurality of positions in the registration target area. The processing circuitry performs a rigid registration on the plurality of types of medical images on the basis of the degree of deformation calculated for each of the positions and the similarity between the types of medical images at the positions.
A medical image processing apparatus 1 according to a first embodiment is described below.
The input interface 20 is implemented by a trackball, a switch button, a mouse, a keyboard, a touch pad for performing an input operation by touching an operation surface, a touch screen with integrated display screen and touch pad, non-contact input circuitry using an optical sensor, voice input circuitry, or the like, for making various settings. The input interface 20 is connected to the processing circuitry 10, converts an input operation received from an operator into an electrical signal, and outputs the electrical signal to the processing circuitry 10. In
The display 22 is connected to the processing circuitry 10 and displays various information and various image data output from the processing circuitry 10. For example, the display 22 is implemented by a liquid crystal monitor, a cathode ray tube (CRT) monitor, a touch panel, or the like. For example, the display 22 displays a graphical user interface (GUI) for receiving operator's instructions, various display images, and various processing results by the processing circuitry 10. The display 22 is an example of a display unit. In
The communication interface 21 is a network interface card (NIC) or the like, and communicates with other devices. For example, the communication interface 21 is connected to the processing circuitry 10, collects image data from an ultrasonic diagnostic apparatus, which is an ultrasonic system, and from an X-ray computed tomography (CT) apparatus and a magnetic resonance imaging (MRI) apparatus, which are modalities other than the ultrasonic system, and outputs the collected image data to the processing circuitry 10.
The storage circuitry 100 is connected to the processing circuitry 10 and stores therein various data. Specifically, the storage circuitry 100 stores therein at least various medical images for image registration, a fusion image obtained after registration, or the like. For example, the storage circuitry 100 is implemented by a semiconductor memory element such as a random-access memory (RAM) and a flash memory, a hard disk, an optical disc, or the like. The storage circuitry 100 stores therein computer programs corresponding to respective processing functions performed by the processing circuitry 10. In
The processing circuitry 10 controls each component of the medical image processing apparatus 1 according to an input operation received from the operator via the input interface 20.
For example, the processing circuitry 10 is implemented by a processor. As illustrated in
The term “processor” used in the above description means, for example, circuitry such as a central processing unit (CPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), or a programmable logic device (for example, a simple programmable logic device (SPLD), a complex programmable logic device (CPLD), and a field programmable gate array (FPGA)). When the processor is, for example, a CPU, the processor reads out and executes the computer programs stored in the storage circuitry 100 to implement the functions. On the other hand, when the processor is, for example, an ASIC, the computer programs are directly incorporated in the circuitry of the processor instead of being stored in the storage circuitry 100. Each processor of the present embodiment is not limited to being configured as a single piece of circuitry, and one processor may be configured by combining a plurality of pieces of independent circuitry to implement the functions thereof. The plurality of components in
The medical image processing apparatus 1 in
The processing contents of the image acquisition function 200, the feature extraction function 300, the target area extraction function 400, the deformation degree calculation function 500, the image rigid registration function 600, and the display control function 700 performed by the processing circuitry 10 are described below.
The image acquisition function 200 acquires two or more types of medical images as registration target images. The registration target image may be an image acquired from the storage circuitry 100 or an image acquired during surgery by an imaging apparatus such as an ultrasonic imaging apparatus. For example, when performing prostate puncture biopsy surgery, MR images (3D images) of a prostate need to be acquired in advance and stored in the storage circuitry 100 as medical images other than ultrasonic images, and then ultrasonic images (3D images) need to be acquired after the start of surgery.
The feature extraction function 300 extracts image features from the registration target image. Feature maps obtained by extracting the image features are, for example, gradation maps, gradation histograms, dispersion maps, gradient maps, and the like.
The target area extraction function 400 extracts a registration target area corresponding to a registration target organ from an ultrasonic image by acquiring edges of the registration target organ from the ultrasonic image among a plurality of types of medical images including the registration target organ. The extraction can be performed using an edge segmentation algorithm, a deep learning area extraction algorithm, or the like in the related art. The target area extraction function 400 is an example of a “target area extraction unit”.
The deformation degree calculation function 500 calculates the degree of deformation for each position in the registration target area. The “position” in the present embodiment may be a sub-area including one pixel point or a plurality of pixel points in the registration target area. That is, the deformation degree calculation function 500 may calculate the degree of deformation for each pixel point or for each sub-area. The degree of deformation includes at least one of a parameter indicating the unevenness of an edge (that is, edge unevenness parameter), a parameter indicating the smoothness of the edge (that is, edge smoothness parameter), and a parameter indicating distance from an organ to a pressing object (that is, probe distance parameter). The method of calculating the degree of deformation is described later. The deformation degree calculation function 500 is an example of a “deformation degree calculation unit”.
The image rigid registration function 600 performs rigid registration on the types of medical images on the basis of the degree of deformation for each position calculated at the positions and the similarity between the types of medical images at the positions. In the rigid registration process, a position of optimal similarity is found by adjusting the amounts of rotation and translation between the images. To reduce the influence of a deformed portion of the registration target organ on the registration, deformation coefficients are added to a similarity calculation, a large weight is given to an area with a small deformation, and a small weight is given to an area with a large deformation. The method of calculating the similarity is described later. The image rigid registration function 600 is an example of an “image rigid registration unit”.
The display control function 700 causes the display 22 to display a fusion image representing a registration result that is a result of the rigid registration. In addition to displaying the fusion image, the registration reliability of different positions in the registration results is preferably displayed on the basis of the similarity of a plurality of positions in the fusion image. As a display method, when causing the display 22 to display the registration reliability, the display control function 700 presents the level of the reliability in different colors for different positions in the fusion image. By marking, in different colors, positions in the fusion image where deformation is small and the registration reliability is high and positions in the fusion image where deformation is large and the registration reliability is low, the reliability of each position in the fusion image can be intuitively presented to a user. Display examples of the fusion images are described later. The display control function 700 is an example of a “display control unit”.
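The color-coded presentation of registration reliability described above can be sketched as follows; the two thresholds and the green/yellow/red palette are illustrative assumptions, as the embodiment does not fix a specific palette:

```python
def reliability_color(similarity, lo=0.2, hi=0.8):
    """Map a per-position similarity value to a display color (R, G, B).

    The thresholds `lo`/`hi` and the palette are assumptions for this
    sketch, not values specified by the embodiment.
    """
    if similarity >= hi:
        return (0, 255, 0)      # high reliability: small deformation
    if similarity >= lo:
        return (255, 255, 0)    # intermediate reliability
    return (255, 0, 0)          # low reliability: large deformation
```

An overlay built this way lets the user see at a glance which parts of the fusion image can be trusted.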
An example of a registration of an ultrasonic image and an MR image including a prostate is described below.
An image acquisition process of step S11 in
A feature extraction process at step S12 in
A target area extraction process of step S13 in
A deformation degree calculation process at step S14 in
The edge unevenness parameter is described below.
The reason why the degree of deformation is calculated using the unevenness of the edge is that, when the prostate, which is the registration target organ in the present embodiment, is not pressed, the overall contour of the organ in a normal state is roughly chestnut-shaped as illustrated in an axial view of an MR image in
When calculating the edge unevenness coefficient, the positive direction of the organ contour is first specified; for the prostate in the present embodiment, the positive direction is the direction in which the contour is convex outward. In such a case, an edge unevenness coefficient d1(p) is calculated by Equation 1 below:
where p is any one point in the prostate area and q is an edge point closest to the point p. convex(q) is the unevenness of the point q (1 in the positive direction in which the contour is convex outward and 0 in the negative direction in which the contour is concave inward), and distance(p,q) is the distance between the point p and the point q. T is a preset distance threshold and C is a preset fixed coefficient.
As can be seen from Equation 1 above, the closer the point p is to the edge, the greater the influence received from the unevenness of the edge, but when the distance between the point p and the point q is greater than the distance threshold T, the unevenness of the edge has no influence on the deformation coefficient.
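Equation 1 itself is not reproduced here, so the sketch below only encodes the behavior described around it: the influence of the edge grows as the point p approaches it and vanishes beyond the distance threshold T. The attenuation form and the sign convention (a concave, i.e. deformed, edge yields a larger coefficient) are assumptions:

```python
import math

def edge_unevenness_coefficient(p, edge_points, convexity, T=20.0, C=1.0):
    """Plausible form of the edge unevenness coefficient d1(p).

    p           : (x, y) point inside the organ area
    edge_points : list of (x, y) contour points
    convexity   : per edge point, 1 if convex outward (positive
                  direction), 0 if concave inward
    T, C        : preset distance threshold and fixed coefficient
    """
    dists = [math.dist(p, q) for q in edge_points]
    i = min(range(len(dists)), key=dists.__getitem__)  # nearest edge point q
    if dists[i] > T:
        return 0.0          # beyond T the edge unevenness has no influence
    # assumption: concave (deformed) edge points raise the coefficient,
    # linearly attenuated with distance to the edge
    return C * (1.0 - convexity[i]) * (1.0 - dists[i] / T)
```

A point right next to a concave edge segment thus receives a coefficient close to C, while points near convex (normal) contour segments or far from the edge receive zero.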
The edge smoothness parameter is described below.
The reason why the degree of deformation is calculated based on the smoothness of the edge is that, when the prostate, which is the registration target organ, is not pressed, the edge of the organ in the normal state is smooth and has a small curvature, but when the organ is deformed by the pressure of the ultrasonic probe, the curvature of the edge increases.
When calculating the edge smoothness coefficient, the edge curvature is added to the deformation coefficient. In such a case, an edge smoothness coefficient d2(p) is calculated by Equation 2 below:
where p is any one point in the prostate area and q is an edge point closest to the point p. curvature(q) is the curvature of the point q, and distance(p,q) is the distance between the point p and the point q. T is the preset distance threshold and C is the preset fixed coefficient.
As can be seen from Equation 2 above, the closer the point p is to the edge, the greater the influence received from the edge curvature, but when the distance between the point p and the point q is greater than the distance threshold T, the edge curvature has no influence on the deformation coefficient.
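Mirroring the unevenness case, one plausible form of Equation 2 uses the curvature of the nearest edge point, attenuated with distance and cut off beyond the threshold T; the exact attenuation is an assumption of this sketch:

```python
import math

def edge_smoothness_coefficient(p, edge_points, curvature, T=20.0, C=1.0):
    """Plausible form of the edge smoothness coefficient d2(p):
    the curvature of the nearest edge point q, weighted down as the
    distance from p to q grows and zero beyond the threshold T."""
    dists = [math.dist(p, q) for q in edge_points]
    i = min(range(len(dists)), key=dists.__getitem__)  # nearest edge point q
    if dists[i] > T:
        return 0.0          # beyond T the edge curvature has no influence
    return C * curvature[i] * (1.0 - dists[i] / T)
```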
The probe distance parameter is described below.
The reason why the degree of deformation is calculated based on the distance to the probe is because an area closer to the probe is more likely to be pressed and deformed.
When calculating the probe distance coefficient, a large deformation coefficient is given to an area close to the probe and a small deformation coefficient is given to an area far from the probe. In such a case, a probe distance coefficient d3(p) is calculated by Equation 3 below:
d3(p)=decay_function(distance(p,probe)) (Equation 3)
where p is any one point in the prostate area, distance(p,probe) is the distance between the point p and the probe, and decay_function is a decay function. The decay function decay_function may be an exponential decay function e^(k/x) or a power decay function x^k, where k is a decay parameter.
As can be seen from Equation 3 above, the closer the point p is to the probe, the greater the influence on the deformation coefficient, but the farther the point p is from the probe, the smaller the influence on the deformation coefficient.
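Equation 3 can be sketched directly; the default decay kind and parameter k below are illustrative assumptions, since the embodiment only names the two decay families:

```python
import math

def probe_distance_coefficient(dist, kind="power", k=-1.0):
    """Probe distance coefficient d3(p) = decay_function(distance(p, probe)).

    Large near the probe and small far from it.  `kind` and `k` are
    assumptions: the embodiment names an exponential decay e^(k/x) and
    a power decay x^k with decay parameter k.
    """
    x = max(float(dist), 1e-6)      # avoid division by zero at the probe
    if kind == "exp":
        return math.exp(k / x)      # e^(k/x): large for small x when k > 0
    return x ** k                   # x^k: decays with distance when k < 0
```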
In the present embodiment, an optimal combination of deformation coefficients is generated by adjusting the weights of the above three types of deformation coefficients. An optimal combination d(p) of deformation coefficients is calculated by Equation 4 below:

d(p)=Σi wi·di(p) (Equation 4)
where di(p) is an ith kind of deformation coefficient and wi is a weight according to the ith kind of deformation coefficient. The proportion occupied by each deformation coefficient may be set in advance on the basis of empirical values, or may be set on the basis of results of machine learning or the like. For example, weights of the edge unevenness coefficient, the edge smoothness coefficient, and the probe distance coefficient are 0.3, 0.3, and 0.4, respectively.
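The weighted combination of Equation 4, with the example weights given above, reduces to:

```python
def combined_deformation(d1, d2, d3, weights=(0.3, 0.3, 0.4)):
    """Equation 4: d(p) = sum_i w_i * d_i(p), here with the example
    weights 0.3 / 0.3 / 0.4 for the edge unevenness, edge smoothness
    and probe distance coefficients."""
    w1, w2, w3 = weights
    return w1 * d1 + w2 * d2 + w3 * d3
```

In practice the weights would come from empirical values or from machine learning, as the text notes.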
The description is continued by referring now back to
In the registration process, the position of optimal similarity is found by adjusting the amounts of rotation and translation between the images. To reduce the influence of a deformed portion of the registration target organ on the registration, deformation coefficients are added to the similarity calculation, and weighted similarity is obtained by giving a large weight to an area with a small deformation and giving a small weight to an area with a large deformation.
The weighted similarity considering the deformation coefficients is calculated by Equation 5 below:
similarity_w(p,q)=similarity(p,q)·d(p)^(−1) (Equation 5)
where p is any one point on the ultrasonic image and q is any one point on the MR image. similarity(p,q) is the similarity between the point p and the point q, and d(p) is the deformation coefficient of the point p calculated at the deformation degree calculation step S14.
More specifically, the similarity(p,q) indicates the similarity between any one point p in the feature map of the ultrasonic image and any one point q in the feature map of the MR image. In the present embodiment, since the gradient images of the ultrasonic image and the MR image are obtained at step S12, the similarity(p,q) is, for example, an included angle of a gradient feature vector between the point p in the gradient image of the ultrasonic image and the point q in the gradient image of the MR image.
When the gradient feature vector of the point p is vp=(v1p, v2p, v3p) and the gradient feature vector of the point q is vq=(v1q, v2q, v3q), the similarity between the two gradient images is calculated by Equation 6 below:

similarity(p,q)=(vp·vq)/(|vp|·|vq|) (Equation 6)
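One realization of this angle-based similarity is the cosine of the included angle between the two gradient feature vectors (an assumption of this sketch, since the text only states that the included angle is used):

```python
import math

def gradient_angle_similarity(vp, vq):
    """Cosine of the included angle between two 3-D gradient feature
    vectors: 1.0 for parallel vectors, 0.0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(vp, vq))
    norm = math.hypot(*vp) * math.hypot(*vq)
    return dot / norm if norm else 0.0   # zero vectors carry no gradient
```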
When the feature map is not a gradient image but a gradation map, the similarity(p,q) is, for example, a value of mutual information between the point p in the gradation map of the ultrasonic image and the point q in the gradation map of the MR image.
When the feature map is not a gradient image but a dispersion map, the similarity(p,q) is, for example, a correlation coefficient between the point p in the dispersion map of the ultrasonic image and the point q in the dispersion map of the MR image.
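The search for the position of optimal weighted similarity can be sketched as a toy translation-only loop; the per-pixel similarity used here (negative absolute feature difference) is a stand-in for the gradient-angle, mutual-information, or correlation measures above, and the `+ 1.0` regularization of the deformation weighting is an assumption:

```python
def rigid_search_2d(us, mr, defo, max_shift=3):
    """Toy translation-only rigid search over two 2-D feature maps
    (lists of lists of equal shape); a full rigid registration would
    also search over rotations.

    `defo` holds the per-pixel deformation coefficients d(p) of the
    ultrasonic image; areas with a large coefficient contribute with a
    smaller weight, as in the weighted similarity.
    """
    h, w = len(us), len(us[0])
    best_score, best_shift = float("-inf"), (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(h):
                for x in range(w):
                    qy, qx = y + dy, x + dx
                    if 0 <= qy < h and 0 <= qx < w:
                        sim = -abs(us[y][x] - mr[qy][qx])  # similarity(p, q)
                        score += sim / (defo[y][x] + 1.0)  # down-weight deformed areas
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift
```

With distinct feature values, the loop recovers the true offset between the two maps because only the correct shift drives every weighted per-pixel penalty to zero.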
In
On the other hand, in
When the ultrasonic image and the MR image are registered by each step of the present embodiment, the degree of deformation for each position in the ultrasonic image is calculated at step S14, and when the similarity is calculated at step S15, a large weight is given to an area with a small degree of deformation and a small weight is given to an area with a large degree of deformation. Therefore, even though a portion with a large deformation exists in the registration target area, successful registration is achieved and a fusion image in
As a comparative example,
In the fusion image in
The description is continued by referring now back to
At step S16, the display control function 700 causes the display 22 to display the fusion image, and also preferably causes the display 22 to display the registration reliability of different positions in the registration result on the basis of the similarity of a plurality of positions in the fusion image. For example, when displaying the registration reliability, the display control function 700 presents the level of the reliability in different colors for different positions in the fusion image.
Effects of the medical image processing apparatus 1 according to the first embodiment are described below.
First, the medical image processing apparatus 1 according to the first embodiment calculates the degree of deformation at a plurality of positions in a registration target area in an ultrasonic image, and performs a rigid registration on a plurality of types of medical images on the basis of the degree of deformation calculated for each of the positions and the similarity between the types of medical images at the positions. Thus, in the present embodiment, even when the deformation of a registration target organ included in the ultrasonic image is large, interference of a portion with a large deformation with the rigid registration can be suppressed, and the accuracy of the rigid registration of the ultrasonic image and an MR image can be improved.
In the present embodiment, the rigid registration is performed by calculating weighted similarity so that a large weight is given to an area with a small deformation and a small weight is given to an area with a large deformation. Thus, in the present embodiment, adverse effects on a registration result of an area with a large degree of deformation can be automatically reduced, contribution to a registration result of an area with no deformation or a small degree of deformation can be reinforced, and interference of the portion with a large deformation in the ultrasonic image with the rigid registration is more reliably suppressed.
Furthermore, in the present embodiment, by selecting at least one of the edge unevenness parameter, the edge smoothness parameter, and the probe distance parameter, the accuracy of the calculated degree of deformation can be improved by using deformation parameters according to the features of the registration target organ. Moreover, in the present embodiment, an optimum combination of deformation coefficients is generated by adjusting the weights of two or more types of coefficients according to the state of deformation of the registration target organ. Thus, in the present embodiment, deformation parameters more adapted to the features of the registration target organ can be obtained.
A medical image processing apparatus 1 according to a second embodiment is described below with reference to
The deformation vector field generation function 800 performs a non-rigid registration on a plurality of types of medical images (two types of medical images in the present embodiment) at a plurality of positions to generate deformation vector fields. A deformation vector field between two types of medical images can be obtained using a non-rigid registration algorithm (for example, an ICP algorithm or the like) in the related art. The deformation vector field generation function 800 is an example of a “deformation vector field generation unit”.
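The correspondence step that ICP-style algorithms iterate can be sketched as a nearest-neighbour displacement field between two contour point sets. This is a simplification: a full non-rigid registration alternates this matching with a transform update and regularization:

```python
import math

def displacement_field(src_pts, dst_pts):
    """For each source contour point, the vector to its nearest
    destination point -- the matching step of an ICP-style
    registration, shown in isolation."""
    field = []
    for p in src_pts:
        q = min(dst_pts, key=lambda d: math.dist(p, d))
        field.append((q[0] - p[0], q[1] - p[1]))
    return field
```

The magnitudes of these vectors are exactly what the deformation degree correction function 900 consumes next.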
The deformation degree correction function 900 corrects the degree of deformation at the positions, which is calculated by the deformation degree calculation function 500, on the basis of the magnitudes of vectors in the deformation vector fields corresponding to the two types of medical images, so that as the magnitude of the vector increases, the degree of deformation increases. The deformation degree correction function 900 is an example of a “deformation degree correction unit”.
The image rigid registration function 600 in the second embodiment performs the rigid registration again on the types of medical images on the basis of the degree of deformation for each position corrected at the positions and the similarity between the two types of medical images at the positions.
In
A deformation degree correction process at step S18 in
As an example, the deformation degree correction function 900 corrects the optimal combination d(p) of deformation coefficients calculated at the deformation degree calculation step S14, according to Equation 7 below:
d(p)=d(p)+λ·|transform_field(p)| (Equation 7)
where p is any one point in the ultrasonic image, |transform_field(p)| is the magnitude of the deformation vector field at the point p, and λ is a weight coefficient. The weight coefficient λ may be set in advance on the basis of empirical values or may be set on the basis of results of machine learning or the like. For example, the weight coefficient λ can be selected in the range of [0.01, 0.2].
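Equation 7 applied to a single point, with the vector magnitude computed from the deformation vector field, reads:

```python
import math

def correct_deformation(d_p, field_vec, lam=0.05):
    """Equation 7: d(p) <- d(p) + lambda * |transform_field(p)|.

    `field_vec` is the deformation vector at the point p; the default
    lambda is one value from the stated [0.01, 0.2] range."""
    return d_p + lam * math.hypot(*field_vec)
```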
An image rigid re-registration process at step S19 in
In this way, the medical image processing apparatus 1 according to the second embodiment generates the deformation vector field by the non-rigid registration and corrects the optimal combination d(p) of deformation coefficients by using the magnitude of a deformation amount represented by the magnitude of each vector. Thus, in the present embodiment, a more accurate optimal combination d(p) of deformation coefficients can be obtained, and the accuracy of image registration can be further improved.
Although some embodiments have been described, the above-described embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and variations thereof fall within the scope and gist of the invention and within the scope of the invention defined in the technical proposal and equivalents thereof.
For example, each of the above-described embodiments is described using, as an example, a registration of an ultrasonic image and an MR image including a prostate; however, the embodiments can of course also be applied to a registration of an ultrasonic image and a CT image.
In the above-described first and second embodiments, the fusion image is displayed at step S16 as a result of the registration at step S15 or step S19; however, the embodiments are not limited thereto. For example, in the process at step S16, not only the fusion image but also an image obtained by further fusing the fusion image with another medical image may be displayed. Instead of displaying the fusion image, the fusion image may be further analyzed and the result of the analysis process may be displayed.
Step S16 may not be performed, and as a result of the registration at step S15 or step S19, a correspondence relation between an ultrasonic image coordinate system and an MR image coordinate system is obtained. This correspondence relation is stored and used, for example, in the form of a transformation matrix.
According to at least one of the above-described embodiments, even when a registration target organ included in an ultrasonic image is highly deformed, a registration of the ultrasonic image and other types of medical images can be achieved.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---
202210109999.8 | Jan 2022 | CN | national |