This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2023-104435, filed Jun. 26, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to a medical image processing apparatus, a method, and a non-transitory computer readable medium.
Catheter-based left atrial appendage closure (LAAC) has recently been attracting attention as a measure for stroke prevention. In LAAC, a treatment device is placed in the left atrial appendage (LAA) to prevent blood flow and clot formation in the LAA. To determine the size and placement position of the treatment device to be placed in the LAA, measurement based on morphological information about the LAA and the ostium of the LAA is to be made. The LAA ostium refers to the border between the LAA and the left atrium (LA), a region at the entrance of the LAA. A section perpendicular to the axis of the LAA in the ostium region will be referred to as an ostium plane. The ostium plane is a two-dimensional plane, and the ostium region is drawn as a circle on the ostium plane. There are several previous papers discussing means for identifying the ostium plane in a computed tomography (CT) image.
The use of conventional organ segmentation techniques is being contemplated as a method for identifying a reference plane or a reference line based on a region of interest (e.g., LAA), like the ostium plane described above. However, the method using organ segmentation techniques can have difficulty in identifying a reference plane or a reference line with clinically sufficient accuracy. That is, the treatment device may be unable to be actually placed along the identified reference plane or the identified reference line.
A medical image processing apparatus according to an exemplary embodiment includes processing circuitry. The processing circuitry is configured to extract a region of interest from a medical image. The processing circuitry is configured to identify a reference plane or a reference line based on the region of interest. The processing circuitry is configured to correct the identified reference plane or the identified reference line based on a morphology of the region of interest.
Exemplary embodiments of a medical image processing apparatus, a method, and a non-transitory computer readable medium according to the present application will be described in detail below with reference to the accompanying drawings. The medical image processing apparatus, method, and non-transitory computer readable medium according to the present application are not limited to the following exemplary embodiments. In the following description, similar components are denoted by the same reference numerals. A redundant description thereof will be omitted.
The medical image diagnostic apparatus 1 captures an image of a subject to generate a medical image. The medical image diagnostic apparatus 1 transmits the generated medical image to various apparatuses on the network. Examples of the medical image diagnostic apparatus 1 include an X-ray diagnostic device, an X-ray computed tomography (CT) device, a magnetic resonance imaging (MRI) device, an ultrasonic diagnostic device, a single photon emission computed tomography (SPECT) device, and a positron emission tomography (PET) device.
The medical image storage apparatus 2 stores various medical images related to subjects. Specifically, the medical image storage apparatus 2 receives medical images from the medical image diagnostic apparatus 1 via the network and stores the medical images in a storage circuit in the medical image storage apparatus 2. For example, the medical image storage apparatus 2 is implemented by a computer device, such as a server or a workstation. As an example, the medical image storage apparatus 2 is implemented by a picture archiving and communication system (PACS) and stores medical images in a format compliant with Digital Imaging and Communications in Medicine (DICOM).
The medical image processing apparatus 3 performs various types of image processing on medical images collected from subjects. Specifically, the medical image processing apparatus 3 receives medical images from the medical image diagnostic apparatus 1 or the medical image storage apparatus 2 via the network, and performs various types of information processing using the medical images. For example, the medical image processing apparatus 3 is implemented by a computer device, such as a server or a workstation.
The medical image processing apparatus 3 includes, for example, a communication interface 31, an input interface 32, a display 33, storage circuitry 34, and processing circuitry 35.
The communication interface 31 controls transmission and reception of various types of data communicated between the medical image processing apparatus 3 and other apparatuses connected via the network. Specifically, the communication interface 31 is connected to the processing circuitry 35, and transmits data received from another apparatus to the processing circuitry 35 or transmits data received from the processing circuitry 35 to another apparatus. The communication interface 31 is implemented by, for example, a network card, a network adaptor, or a network interface controller (NIC).
The input interface 32 receives input operations for various instructions and various types of information from a user. Specifically, the input interface 32 is connected to the processing circuitry 35, converts the input operations received from the user into electrical signals, and transmits the electrical signals to the processing circuitry 35. For example, the input interface 32 is implemented by a trackball, a switch button, a mouse, a keyboard, a touchpad on which input operations are performed by touching the operation surface, a touchscreen into which a display screen and a touchpad are integrated, a noncontact input interface using an optical sensor, and/or a voice input interface. As employed herein, the input interface 32 is not limited to interfaces including physical operation components, such as a mouse and a keyboard. Examples of the input interface 32 may include an electrical signal processing circuit that receives an electrical signal corresponding to an input operation from an external input device provided separately from the medical image processing apparatus 3 and transmits this electrical signal to the processing circuitry 35.
The display 33 displays various types of information and various types of data. Specifically, the display 33 is connected to the processing circuitry 35, and displays various types of information and various types of data received from the processing circuitry 35. For example, the display 33 is implemented by a liquid crystal display, a cathode ray tube (CRT) display, or a touchscreen.
The storage circuitry 34 stores various types of data and various programs. Specifically, the storage circuitry 34 is connected to the processing circuitry 35, and stores data received from the processing circuitry 35 or reads stored data and transmits the read data to the processing circuitry 35. For example, the storage circuitry 34 is implemented by a semiconductor memory device, such as a random access memory (RAM) and a flash memory, a hard disk, and/or an optical disc.
The processing circuitry 35 controls the entire medical image processing apparatus 3. For example, the processing circuitry 35 performs various types of processing based on input operations received from the user via the input interface 32. For example, the processing circuitry 35 receives data transmitted from another apparatus via the communication interface 31 and stores the received data in the storage circuitry 34. For example, the processing circuitry 35 transmits data received from the storage circuitry 34 to the communication interface 31 and thereby transmits the data to another apparatus. For example, the processing circuitry 35 displays data received from the storage circuitry 34 on the display 33.
The configuration example of the medical image processing apparatus 3 according to the present exemplary embodiment has been described above. For example, the medical image processing apparatus 3 according to the present exemplary embodiment is installed in a medical facility such as a hospital or a clinic, and supports various diagnoses and treatment planning made by users, such as doctors. More specifically, during treatment planning, the medical image processing apparatus 3 corrects, based on the morphology of a region of interest, a reference plane or reference line that serves as a reference position in measuring the region of interest or determining a treatment device, and thereby enables identification of a reference plane or reference line suitable for clinical use. Here, the reference plane or the reference line is a position derived from the region of interest. The medical image processing apparatus 3 with such a configuration will now be described in detail.
For example, as illustrated in
The control function 351 generates various graphical user interfaces (GUIs) and/or various types of display information based on operations made via the input interface 32, and controls display of the generated GUIs and/or display information on the display 33. For example, the control function 351 displays results of processing performed with various functions on the display 33. The control function 351 can also generate various display images based on medical images acquired by the image acquisition function 352 and display the display images.
The image acquisition function 352 acquires a medical image of a subject from the medical image diagnostic apparatus 1 or the medical image storage apparatus 2 via the communication interface 31. Specifically, the image acquisition function 352 acquires a medical image containing morphological information about a three-dimensional anatomical structure of the region of interest to be processed. Here, the image acquisition function 352 can acquire a plurality of medical images obtained by capturing a plurality of three-dimensional images in a time direction.
The image acquisition function 352 acquires CT images, ultrasonic images, MRI images, X-ray images, or angiographic images as the foregoing medical images. The processing circuitry 35 receives medical images of the subject from the medical image diagnostic apparatus 1 or the medical image storage apparatus 2 by performing the image acquisition function 352 described above, and stores the received medical images in the storage circuitry 34.
The extraction function 353 extracts the region of interest from a medical image acquired by the image acquisition function 352. The processing to be performed by the extraction function 353 will be described in detail below.
The identification function 354 identifies a reference plane or reference line based on the region of interest. The processing to be performed by the identification function 354 will be described in detail below.
The morphology acquisition function 355 acquires the morphology of a treatment device to be placed in the region of interest. The processing to be performed by the morphology acquisition function 355 will be described in detail below.
The correction function 356 corrects the identified reference plane or reference line based on the morphology of the region of interest. Specifically, the correction function 356 corrects the reference plane or the reference line based on a relationship between the morphology of the region of interest and that of the treatment device. For example, the correction function 356 sets a candidate placement position of the treatment device based on the reference plane or the reference line, and corrects the reference plane or the reference line based on a relationship between the state of the treatment device placed at the candidate placement position and the morphology of the region of interest. The processing to be performed by the correction function 356 will be described in detail below.
The processing circuitry 35 described above is implemented by, for example, a processor. In such a case, the foregoing various processing functions are stored in the storage circuitry 34 in the form of computer-executable programs. The processing circuitry 35 implements the functions corresponding to the respective programs by reading the programs stored in the storage circuitry 34 and executing the programs. In other words, the processing circuitry 35 comes to have the processing functions illustrated in
Next, a processing procedure of the medical image processing apparatus 3 will be described with reference to
For example, as illustrated in
In step S102, the extraction function 353 extracts the region of interest included in the acquired medical image. This processing is implemented, for example, by the processing circuitry 35 calling the program corresponding to the extraction function 353 from the storage circuitry 34 and executing the program.
In step S103, the identification function 354 identifies a reference plane or a reference line based on the extracted region of interest. This processing is implemented, for example, by the processing circuitry 35 calling the program corresponding to the identification function 354 from the storage circuitry 34 and executing the program.
In step S104, the morphology acquisition function 355 acquires morphological information indicating the morphology of the treatment device to be placed in the region of interest. This processing is implemented, for example, by the processing circuitry 35 calling the program corresponding to the morphology acquisition function 355 from the storage circuitry 34 and executing the program.
In step S105, the correction function 356 determines whether to correct the reference plane or reference line identified in step S103. If the correction function 356 determines that the reference plane or reference line is to be corrected (YES in step S105), the processing proceeds to step S106. In step S106, the correction function 356 performs correction processing on the reference plane or reference line identified in step S103. If, in step S105, the correction function 356 determines that the reference plane or reference line is not to be corrected (NO in step S105), the processing is ended. This processing is implemented, for example, by the processing circuitry 35 calling the program corresponding to the correction function 356 from the storage circuitry 34 and executing the program.
Details of each process performed by the medical image processing apparatus 3 will now be described. The following description deals with a case where the region of interest is the left atrial appendage (LAA). However, this is not restrictive, and the present exemplary embodiment can be targeted for any living organ to which a treatment is applied by placing a device in part or all of the structure. For example, the aortic valve or the mitral valve may be the region of interest. In the following description, a case where a CT image captured by an X-ray CT device is obtained as a three-dimensional medical image will be described as an example.
As described in step S101 of
The operation of the medical image acquisition processing in step S101 may be started based on the user's instruction provided via the input interface 32 as described above, or automatically started. In the latter case, for example, the image acquisition function 352 monitors the medical image storage apparatus 2, and each time a new medical image is stored, automatically acquires the medical image.
Here, the image acquisition function 352 may evaluate the newly stored medical image against a preset acquisition condition, and if the medical image satisfies the acquisition condition, the image acquisition function 352 may perform the acquisition processing. For example, an acquisition condition based on which the state of the medical image is determinable is stored in the storage circuitry 34, and the image acquisition function 352 evaluates the newly stored medical image based on the acquisition condition stored in the storage circuitry 34.
For example, the storage circuitry 34 stores, as the acquisition condition, “acquisition of a medical image captured using an imaging protocol intended for the heart” or “acquisition of a magnified and reconstructed medical image”, or a combination of these. The image acquisition function 352 acquires a medical image that satisfies the foregoing acquisition condition.
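Such a condition check can be held as a dictionary of required attribute values and compared against the metadata of each newly stored image. The sketch below is a minimal illustration; the attribute keys and values (e.g., "protocol", "cardiac") are hypothetical placeholders, not actual DICOM tags:

```python
def satisfies_acquisition_condition(meta, conditions):
    """Return True if the image metadata satisfies every required attribute.

    meta: dict of image attributes, e.g. {"protocol": "cardiac", "recon": "magnified"}
    conditions: dict of required attribute values (the stored acquisition condition)
    """
    return all(meta.get(key) == value for key, value in conditions.items())


# Example: acquire only images captured with a cardiac imaging protocol.
condition = {"protocol": "cardiac"}
print(satisfies_acquisition_condition({"protocol": "cardiac"}, condition))
```

A combined condition ("cardiac protocol" and "magnified reconstruction") is expressed simply by adding entries to the `conditions` dictionary.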
As described in step S102 of
As another example, the extraction function 353 can extract the region of interest based on an anatomical structure drawn on the CT image with a known region extraction technique. For example, the extraction function 353 extracts the region of interest in the CT image using Otsu's binarization, region growing, snake algorithm, graph cut, or mean shift based on CT values.
As another example, the extraction function 353 can extract the region (coordinate information) of the LAA in the CT image by using a trained model for the region of interest (LAA), constructed using machine learning techniques (including deep learning) based on training data prepared in advance. The region of interest extraction processing described above does not necessarily need to be applied to the entire image. For example, an area that relates to the region of interest and is greater than the region of interest and smaller than the entire image (e.g., if the region of interest is the LAA, the heart area or the left atrium area) may be identified, and the foregoing known region extraction techniques and/or machine learning techniques may be applied to only the identified area to extract the region representing the region of interest.
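As an illustration of restricting a classical extraction technique to a sub-area, the sketch below applies Otsu's threshold only inside a bounding box around the region of interest. It is a deliberately simple pure-NumPy toy implementation of one of the listed techniques; a production system would likely use an optimized library routine, and the bounding-box convention is an assumption:

```python
import numpy as np


def otsu_threshold(values):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=256)
    centers = (edges[:-1] + edges[1:]) / 2.0
    total = hist.sum()
    sum_all = float((hist * centers).sum())
    best_t, best_var = centers[0], -1.0
    w0, sum0 = 0.0, 0.0
    for i in range(256):
        w0 += hist[i]
        if w0 == 0 or w0 == total:
            continue
        sum0 += hist[i] * centers[i]
        m0 = sum0 / w0                       # mean of class below threshold
        m1 = (sum_all - sum0) / (total - w0)  # mean of class above threshold
        var_between = w0 * (total - w0) * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t


def extract_roi_mask(volume, bbox):
    """Threshold only inside a (z0, z1, y0, y1, x0, x1) bounding box."""
    z0, z1, y0, y1, x0, x1 = bbox
    sub = volume[z0:z1, y0:y1, x0:x1]
    t = otsu_threshold(sub.ravel())
    mask = np.zeros(volume.shape, dtype=bool)
    mask[z0:z1, y0:y1, x0:x1] = sub > t
    return mask
```

Restricting the computation to the identified heart or left-atrium area both reduces cost and keeps unrelated high-intensity structures from skewing the threshold.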
As described in step S103 of
The identification function 354 can acquire various types of information as the information indicating the ostium plane. For example, the identification function 354 can acquire coordinate information about at least three points included in the border between the LAA and the left atrium (LA) as the information indicating the ostium plane. As another example, the identification function 354 can acquire, as the information indicating the ostium plane, a conversion matrix for converting the position of a characteristic section (e.g., the most cephalic or caudal horizontal section) in the CT image from which the LAA is extracted into the position of the ostium plane. As another example, the identification function 354 can acquire the center coordinates of the ostium area (the area corresponding to the entrance of the LAA) and vector information from a characteristic point (e.g., the point where all the X-, Y-, and Z-coordinates are 0) in the original CT image to the center coordinates of the ostium area.
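The three-point representation determines the plane directly: the cross product of two in-plane edge vectors gives the normal. A minimal sketch (the function name and the n·x = d convention are illustrative):

```python
import numpy as np


def plane_from_points(p1, p2, p3):
    """Return (unit normal n, offset d) of the plane n . x = d through three points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)  # normal to both in-plane edge vectors
    norm = np.linalg.norm(n)
    if norm == 0:
        raise ValueError("the three points are collinear and define no plane")
    n = n / norm
    return n, float(n @ p1)
```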
The identification function 354 can identify the ostium plane using various methods. For example, the identification function 354 can identify, as the ostium plane, an area manually specified using existing image display software based on a procedure discussed in Non-Patent Literature 1. As another example, the identification function 354 can estimate an area representing an anatomical structure relating to the ostium plane based on a shape model of the anatomical structure constructed by learning training data prepared in advance using machine learning techniques, and identify the ostium plane based on the position of the estimated anatomical structure. For example, the identification function 354 estimates the positions of the coumadin ridge and/or the bifurcation of the left main coronary artery into the left circumflex branch as anatomical structures relating to the ostium plane, and identifies the ostium plane from the estimated positions of these anatomical structures. Furthermore, the identification function 354 can calculate a two-dimensional plane by applying the least squares method to the closed curve of the border between the LAA area extracted in step S102 and the LA area, and identify the two-dimensional plane as the ostium plane.
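The least-squares identification mentioned last can be sketched as fitting a plane to the border-curve points by singular value decomposition of the centered coordinates. This is one standard formulation of a total-least-squares plane fit, offered as an illustration rather than as the specific implementation used here:

```python
import numpy as np


def fit_plane(points):
    """Least-squares plane through a set of 3-D points, e.g. samples of the
    closed border curve between the LAA area and the LA area.

    Returns (centroid, unit normal). The normal is the right singular vector
    of the centered points with the smallest singular value, i.e. the
    direction of least variance.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]
```

The centroid can then serve as the center of the ostium area, and the normal as the local axis of the LAA at the ostium.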
(Processing for Acquiring Morphological Information about Treatment Device)
As described in step S104 of
Morphological information about treatment devices that are already in clinical use or under clinical research is made public by the manufacturers and vendors of the treatment devices. The medical image processing apparatus 3 records such published morphological information in the storage circuitry 34 in advance, in association with names or other identification information about the treatment devices. The morphology acquisition function 355 acquires the corresponding morphological information from the storage circuitry 34. Alternatively, for example, the user may define morphological information about a treatment device not registered in advance, and store the defined morphological information in the storage circuitry 34.
For example, the storage circuitry 34 stores, in advance, morphological information about treatment devices for use in left atrial appendage closure (LAAC) for the LAA (the region of interest). As discussed in Non-Patent Literature: Asmarats, Lluis, and Josep Rodes-Cabau, “Percutaneous left atrial appendage closure: current devices and clinical outcomes”, Circulation: Cardiovascular Interventions 10.11 (2017): e005359, there are various types of treatment devices for use in LAAC. Morphological information about all such devices may be stored in the storage circuitry 34.
The morphology acquisition function 355 acquires morphological information about a treatment device or devices for use in LAAC from the storage circuitry 34. For example, the morphology acquisition function 355 can obtain the morphological information about all the treatment devices for use in LAAC. As another example, the morphology acquisition function 355 can acquire only the morphological information about one or more predetermined treatment devices. Alternatively, the morphology acquisition function 355 can acquire only the morphological information about a type of treatment device selected by the user via a not-illustrated GUI. If the same type of treatment device has a plurality of size variations, the morphological information about all the sizes may be acquired. The morphological information about a user-specified or predetermined size of treatment device may be acquired. Examples of the predetermined size include a maximum size and a minimum size.
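One plausible way to organize this lookup is a catalogue keyed by device type and size. The device names, sizes, and compression values below are invented placeholders for illustration, not real product data:

```python
# Hypothetical catalogue of LAAC device morphologies, keyed by (name, size).
# Real entries would come from manufacturer/vendor documentation.
DEVICE_CATALOGUE = {
    ("DeviceA", 21): {"diameter_mm": 21.0, "length_mm": 14.5, "max_compression": 0.20},
    ("DeviceA", 24): {"diameter_mm": 24.0, "length_mm": 16.0, "max_compression": 0.20},
    ("DeviceB", 22): {"diameter_mm": 22.0, "length_mm": 18.0, "max_compression": 0.15},
}


def get_morphology(name, size=None):
    """Return morphological info for one size, or for all sizes of a device type."""
    if size is not None:
        return {(name, size): DEVICE_CATALOGUE[(name, size)]}
    return {key: value for key, value in DEVICE_CATALOGUE.items() if key[0] == name}
```

Acquiring "all sizes of a user-selected type" then corresponds to calling `get_morphology` without the `size` argument.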
Treatment devices are designed to deform to some extent under compression. The morphological information about the treatment devices can thus include deformation information about their deformable ranges.
As described in conjunction with step S105 of
In such a case, as illustrated in
Suppose, for example, that the expected placement position (region D2 in the diagram) of the treatment device is calculated as illustrated in
The correction function 356 performs the determination processing based on a geometric relationship between the morphological information about the treatment device and the morphological information about the LAA. For example, the correction function 356 performs the foregoing determination processing by calculating the distance between the contour of the treatment device and that of the LAA and comparing the distance with a predetermined determination criterion. The determination criterion for performing the determination processing may be set and stored in the storage circuitry 34 in advance. Alternatively, the determination method and possible candidates for the determination criterion (e.g., both
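As one concrete, simplified instance of such a geometric determination on the two-dimensional ostium plane, the check below flags the plane for correction when any device-contour point falls outside the LAA contour. It uses a ray-casting containment test rather than a distance threshold, so it is a sketch of one possible criterion, not the specific one used by the apparatus:

```python
def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test for a 2-D contour on the ostium plane."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def needs_correction(device_contour, laa_contour):
    """Correct the reference plane if any device point lies outside the LAA."""
    return any(not point_in_polygon(p, laa_contour) for p in device_contour)
```

A distance-based criterion, as described above, would instead compare the nearest-point distance between the two contours against the stored determination criterion.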
In the description provided as an example in conjunction with
Since cardiac structures, such as that of the LAA, change their morphology over time, the relationship between the treatment device and the region of interest may be calculated as a four-dimensional feature amount based on a plurality of structures of the region of interest at different points in time. In such a case, the image acquisition function 352, in step S101, acquires medical images at a plurality of points in time, which are collected over time. The extraction function 353 extracts the region of interest (such as the LAA) from each of the medical images at the plurality of points in time. The identification function 354 identifies the reference plane or reference line at each point in time.
The correction function 356 corrects the reference plane or reference line based on the relationship between the morphology of the region of interest and that of the treatment device in each time phase. Specifically, the correction function 356 calculates the relationship between the region of interest and the treatment device at each point in time based on the morphology of the treatment device, the morphology of the region of interest at each point in time, and the reference plane (or reference line), and determines whether to correct the reference plane (or reference line) based on the calculations. For example, if the correction function 356 determines that the reference plane or reference line is to be corrected at at least one point in time, the correction function 356 determines to make the correction. If the correction function 356 determines that the reference plane or reference line is not to be corrected at any of the points in time, the correction function 356 performs control so that the processing is ended.
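The per-phase decision rule described above (correct if the fit fails in at least one phase) can be sketched generically. Here `check` stands for any single-phase determination, such as that of step S105; the signature is an assumption for illustration:

```python
def correct_over_phases(device_morphology, roi_morphologies_by_phase, check):
    """Return True (i.e., correct the reference plane/line) if the
    single-phase fit check fails in at least one time phase.

    check(device_morphology, roi_morphology) -> True when correction is
    needed in that phase.
    """
    return any(check(device_morphology, roi) for roi in roi_morphologies_by_phase)
```

Because `any` short-circuits, the evaluation can stop at the first phase that violates the criterion.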
As described in step S106 of
Here, the correction function 356 can determine the correction position (the position of the plane P2 in
The foregoing repetition of the correction processing and the determination processing may be controlled to end based on a preset number of repetitions or processing time. Specifically, the correction function 356 can perform control so that the processing is ended without correction when the preset number of repetitions or processing time is reached.
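A minimal sketch of this bounded repetition, assuming the candidate plane is parameterized by a scalar offset along the LAA axis and `fits` encapsulates the determination of step S105 (both assumptions for illustration):

```python
def search_correction(offset0, step, fits, max_iter=20):
    """Shift the reference plane along the axis in fixed steps until the
    device fits, giving up after max_iter repetitions.

    fits(offset) -> True when the determination says no correction is needed
    at that plane position. Returns the corrected offset, or None if no
    position satisfied the condition within the repetition budget.
    """
    offset = offset0
    for _ in range(max_iter):
        if fits(offset):
            return offset
        offset += step
    return None
```

Returning `None` corresponds to the "no correction position satisfying the condition" case handled by the message display described next.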
The medical image processing apparatus 3 can display a message if there is no correction position satisfying the condition. Specifically, if the correction function 356 determines that there is no correction position satisfying the condition, the control function 351 displays, on the display 33, display information indicating that fact.
In the foregoing exemplary embodiment, the morphological information about all or some of the types or sizes of treatment devices determined in advance or specified by the user is described as being acquired in step S104. However, the exemplary embodiment is not limited thereto. For example, the type of treatment device about which the morphological information is to be acquired may be determined based on the morphological information about the region of interest acquired in step S102 and/or morphological information about a structure of interest acquired in step S103. For example, the morphology acquisition function 355 acquires the morphological information about the treatment device of the closest size based on a major diameter or a minor diameter of the elliptical closed curve that the ostium area forms on the ostium plane (the closed curve corresponding to the entrance of the LAA on the ostium plane).
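Selecting the catalogued size closest to the measured ostium diameter is a simple nearest-value lookup; the sizes below are illustrative, not real product sizes:

```python
def closest_device_size(available_sizes_mm, ostium_diameter_mm):
    """Pick the catalogued device size nearest to the measured ostium
    diameter (e.g., the major or minor axis of the ostium ellipse)."""
    return min(available_sizes_mm, key=lambda s: abs(s - ostium_diameter_mm))


# Example: a 25.0 mm measured diameter against a hypothetical size lineup.
print(closest_device_size([21, 24, 27, 31], 25.0))
```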
Aside from the foregoing processing, the medical image processing apparatus 3 can display various types of display information. For example, the control function 351 can display the corrected reference plane or reference line and the uncorrected position in a juxtaposed or superposed manner. For example, as illustrated in
Moreover, the medical image processing apparatus 3 can display the positions before and after the correction, and prompt the user to accept or reject the displayed correction. In such a case, the control function 351 displays on the display 33 a GUI for receiving an operation as to whether to accept the correction, and receives the user's determination via the input interface 32. If the operation to accept the correction is received, the control function 351 ends the processing. If the operation to accept the correction is not received (the correction is not accepted), the control function 351 performs control such that the processing is ended or returns to step S106. In step S106, the control function 351 controls correction under different conditions. Here, the control function 351 may display a GUI for prompting the user to set the different conditions.
The control function 351 further displays a GUI for receiving user operations as illustrated in a display area 331 of the display example of
As illustrated in the display area 331 of
In the foregoing exemplary embodiment, the morphological information about a single treatment device is described as being acquired in step S104. However, the exemplary embodiment is not limited thereto. For example, morphological information about a plurality of treatment devices may be acquired. In such a case, the morphology acquisition function 355 acquires the morphological information about each of the treatment devices. The correction function 356 performs the operations in steps S105 and S106 on each of the treatment devices.
With the plurality of treatment devices processed as described above, the control function 351 can display the processing results related to the respective treatment devices. For example, the control function 351 displays the determinations as to whether to correct the plane P1 with the respective treatment devices placed on the plane P1 illustrated in
In the foregoing exemplary embodiment, the determination processing of step S105 with deformation information not being used as the morphological information about the treatment device has been described. A fourth modification deals with determination processing using deformation information. In such a case, for example, the morphology acquisition function 355 acquires morphological information including deformation information about the treatment device, as illustrated in
For example, as illustrated in
If the contour of the deformed treatment device (contour of the deformed region D1) falls within the contour of the LAA, the correction function 356 determines that correction of the plane P1 is unnecessary. If part of the deformed region D1 still lies outside the contour of the LAA, the correction function 356 determines that the plane P1 is to be corrected.
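Reducing this deformation check to diameters gives a deliberately simplified sketch: real devices deform anisotropically, so the single uniform compression factor used here is an assumption made for illustration:

```python
def fits_with_compression(device_diameter_mm, max_compression, laa_diameter_mm):
    """Check whether the device, compressed within its allowed range, can fit
    inside the LAA cross-section at the candidate placement position.

    max_compression: maximum allowed fractional reduction in diameter
    (e.g., 0.2 means the device may be compressed to 80% of its nominal size).
    """
    min_diameter = device_diameter_mm * (1.0 - max_compression)
    return min_diameter <= laa_diameter_mm
```

If even the fully compressed morphology does not fit, the determination corresponds to the case where part of the deformed region D1 still lies outside the LAA contour, and the plane P1 is to be corrected.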
As described above, according to the first exemplary embodiment, the extraction function 353 extracts the region of interest from the medical image. The identification function 354 identifies the reference plane or the reference line based on the region of interest. The correction function 356 corrects the identified reference plane or reference line based on the morphology of the region of interest. The medical image processing apparatus 3 according to the first exemplary embodiment can thus correct the reference plane or the reference line based on the region of interest depending on the morphology of the region of interest, thus enabling identification of a reference plane or reference line suitable for clinical use.
According to the first exemplary embodiment, the morphology acquisition function 355 acquires the morphology of the treatment device to be placed in the region of interest. The correction function 356 corrects the reference plane or reference line based on the relationship between the morphology of the region of interest and the morphology of the treatment device. The medical image processing apparatus 3 according to the first exemplary embodiment can thus make a correction based on the morphology of the treatment device to be placed in the region of interest and the morphology of the region of interest, thus enabling identification of a reference plane or reference line suitable for clinical use.
According to the first exemplary embodiment, the correction function 356 sets candidate placement position(s) of the treatment device based on the reference plane or reference line. The correction function 356 corrects the reference plane or reference line based on the relationship between the state of the treatment device when placed at the candidate placement position and the morphology of the region of interest. The medical image processing apparatus 3 according to the first exemplary embodiment can thus make the correction based on the expected placement position of the treatment device, thus enabling identification of a reference plane or reference line more suitable for clinical use.
According to the first exemplary embodiment, the morphology acquisition function 355 further acquires the deformation information about the treatment device. The correction function 356 acquires the deformed morphology of the treatment device based on the deformation information, and corrects the reference plane or reference line based on the relationship between the deformed morphology of the treatment device and the morphology of the region of interest. The medical image processing apparatus 3 according to the first exemplary embodiment can thus make the correction based on the properties of the treatment device to be used, enabling identification of a reference plane or reference line more suitable for clinical use.
According to the first exemplary embodiment, the extraction function 353 extracts the region of interest from each of medical images in a plurality of time phases, collected over time. The correction function 356 corrects the reference plane or reference line based on the relationship between the morphology of the region of interest in each time phase and the morphology of the treatment device. The medical image processing apparatus 3 according to the first exemplary embodiment can thus identify a reference plane or reference line more suitable for clinical use even when the region of interest changes its morphology over time.
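As an illustration of checking the treatment device against the morphology of the region of interest in every time phase, the following hypothetical acceptance test requires the device diameter to oversize the ostium diameter measured in each phase by a bounded ratio, so the device remains anchored as the ostium changes shape over the cardiac cycle. The ratio bounds are placeholder values, not values taken from the disclosure.

```python
def placement_acceptable(device_diameter, phase_diameters,
                         min_ratio=1.1, max_ratio=1.3):
    """Hypothetical per-phase acceptance test: the device diameter must
    exceed each measured ostium diameter by at least min_ratio but not
    more than max_ratio, in every time phase."""
    return all(min_ratio * d <= device_diameter <= max_ratio * d
               for d in phase_diameters)
```

If the test fails for the current reference plane, the correction function could move to a corrected plane and re-evaluate.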
A second exemplary embodiment of the present disclosure will be described. In the first exemplary embodiment, the reference plane or reference line is described as being corrected based on the morphology of the region of interest and the morphology of the treatment device. The second exemplary embodiment deals with a case where the reference plane or reference line is corrected based on the morphology of the region of interest and the morphology of a structure of interest included in the region of interest.
The structure of interest identification function 357 identifies a structure of interest included in the region of interest. Specifically, the structure of interest identification function 357 identifies a structure of interest that is a partial anatomical structure included in the region of interest (a region of importance within the region of interest) to be reflected in setting the reference plane or reference line. The processing to be performed by the structure of interest identification function 357 will be described in detail below.
The correction function 356 according to the second exemplary embodiment corrects the reference plane or reference line based on the morphology of the region of interest and the morphology of the structure of interest. Specifically, the correction function 356 determines whether to correct the reference plane or reference line based on the morphology of the region of interest, the morphology of the structure of interest, and their positional relationship with the reference plane or reference line, and performs correction processing depending on a result of the determination. The processing to be performed by the correction function 356 will be described in detail below.
Next, a processing procedure of the medical image processing apparatus 3a will be described with reference to
For example, as illustrated in
In step S204, the structure of interest identification function 357 identifies a structure of interest included in the region of interest. This processing is implemented, for example, by the processing circuitry 35a calling a program corresponding to the structure of interest identification function 357 from storage circuitry 34 and executing the program.
In step S205, the correction function 356 determines whether to correct the identified reference plane or reference line. If the correction function 356 determines that the reference plane or reference line is to be corrected (YES in step S205), the processing proceeds to step S206. In step S206, the correction function 356 performs correction processing on the identified reference plane or reference line. If, in step S205, the correction function 356 determines that the reference plane or reference line is not to be corrected (NO in step S205), the processing is ended. Such processing is implemented, for example, by the processing circuitry 35a calling a program corresponding to the correction function 356 from the storage circuitry 34 and executing the program.
Details of the processes performed by the medical image processing apparatus 3a will now be described. The following description deals with a case where the region of interest is the LAA.
As described in step S204 of
As another example, the structure of interest identification function 357 can identify the structure of interest based on an anatomical structure drawn on the CT image using known region extraction techniques (such as Otsu's binarization, region growing, snake algorithm, graph cut, and mean shift based on CT values). As another example, the structure of interest identification function 357 can identify the lobe region (coordinate information) in the CT image using a trained model for the structure of interest (lobe), constructed using machine learning techniques (including deep learning) based on training data prepared in advance.
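As one concrete instance of the region extraction techniques listed above, Otsu's binarization selects the threshold that maximizes between-class variance of the intensity histogram. The following sketch assumes the CT values are supplied as a NumPy array; the function names are illustrative only and do not appear in the disclosure.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold that maximizes between-class variance
    (Otsu's method) over a 1-D array of intensity values."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()          # probability mass per bin
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                            # weight of the lower class
    w1 = 1.0 - w0                                # weight of the upper class
    mu0 = np.cumsum(p * centers)                 # unnormalized lower-class mean
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        m0 = mu0 / w0
        m1 = (mu_total - mu0) / w1
        between = w0 * w1 * (m0 - m1) ** 2       # between-class variance
    between = np.nan_to_num(between)             # empty classes contribute 0
    return centers[np.argmax(between)]

def binarize(image, threshold):
    """Binary mask of voxels above the threshold (e.g., contrast-enhanced blood)."""
    return image > threshold
```

In practice, a production system would more likely call an existing implementation (e.g., from an image processing library) than reimplement the histogram analysis.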
The foregoing method for identifying the structure of interest may be performed on only the vicinity of the region of interest (LAA) extracted in step S202. For example, only the interior of a rectangular solid circumscribing the region of interest may be set as the target of the identification processing (the range to be searched for the structure of interest). The interior of the region of interest may be excluded from the identification processing. Such ranges may be combined so that only a range in the vicinity of and outside the region of interest is subjected to the identification processing.
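The search-range restriction described above can be sketched as follows, assuming the region of interest is given as a binary NumPy mask: the search mask is the interior of a box circumscribing the region of interest (plus a margin), with the region of interest itself excluded. The function name and the margin parameter are illustrative.

```python
import numpy as np

def search_range_mask(roi_mask, margin=5):
    """Mask of voxels to be searched for the structure of interest:
    inside a box circumscribing the region of interest (with a margin),
    but outside the region of interest itself."""
    coords = np.argwhere(roi_mask)
    lo = np.maximum(coords.min(axis=0) - margin, 0)
    hi = np.minimum(coords.max(axis=0) + margin + 1, roi_mask.shape)
    box = np.zeros_like(roi_mask, dtype=bool)
    box[tuple(slice(l, h) for l, h in zip(lo, hi))] = True
    return box & ~roi_mask.astype(bool)   # vicinity of, but outside, the ROI
```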
As described in step S205 of
For example, in
If no structure of interest is identified by the structure of interest identification function 357, the correction function 356 may determine that correction of the reference plane or the reference line is unnecessary.
As illustrated in step S206 of
In the foregoing example, a single structure of interest is described as being identified. However, the exemplary embodiment is not limited thereto, and two or more structures of interest may be identified. Structures of interest are not limited to ones that every subject has, such as the lobe mentioned above; they may be ones that only a specific subject has.
As described above, according to the second exemplary embodiment, the structure of interest identification function 357 identifies the structure of interest included in the region of interest. The correction function 356 corrects the reference plane or reference line based on the morphology of the region of interest and the morphology of the structure of interest. The medical image processing apparatus 3a according to the second exemplary embodiment can thus correct the reference plane or reference line based on the region of interest depending on the morphology of the structure of interest included in the region of interest, and enables identification of a reference plane or reference line suitable for clinical use.
The foregoing exemplary embodiments have dealt with the cases where a reference plane (e.g., plane P1) is subjected to correction as a reference position in measuring the region of interest or determining the treatment device during treatment planning. However, the exemplary embodiments are not limited thereto, and a reference line may be subjected to correction as a reference position in measuring the region of interest or determining the treatment device during treatment planning. In such a case, for example, two lines each passing through two of three points included in the border between the LA and the left ventricle may be generated, and the generated two lines may be subjected to correction.
In the foregoing exemplary embodiments, the LAA is described as the region of interest. However, the exemplary embodiments are not limited thereto, and the region of interest may be the aortic valve or the mitral valve.
If the region of interest is the aortic valve, the present method is applicable to processing for correcting the reference plane in replacing the aortic valve with an artificial valve. In such a case, for example, the extraction function 353 extracts the anatomical structure of the aortic valve and its vicinity in the CT image as the region of interest. The identification function 354 identifies, based on the morphology of the extracted aortic valve, a reference plane in the CT image serving as a reference in implantation of an artificial valve. The morphology acquisition function 355 acquires the morphological information about the artificial valve. The correction function 356 determines whether to correct the reference plane based on the morphology of the region of interest and the morphology of the artificial valve, and corrects the reference plane depending on a result of the determination.
For example, the correction function 356 determines whether to correct the reference plane based on the positional relationship between the artificial valve after deployment and the entrance of the coronary artery included in the region of interest when the artificial valve is deployed on the reference plane identified by the identification function 354. For example, if the distance between the artificial valve and the entrance of the coronary artery is less than a threshold, the correction function 356 determines that the reference plane is to be corrected, and corrects the reference plane so that the distance between the artificial valve and the entrance of the coronary artery is greater than or equal to the threshold.
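One way to realize the correction just described is to represent the reference plane by a point and a normal vector and, when the clearance to the coronary artery entrance is below the threshold, translate the plane along its normal until the clearance equals the threshold. The following is a geometric sketch under that simplifying assumption; it is not presented as the specific correction procedure of the disclosure.

```python
import numpy as np

def correct_reference_plane(point, normal, ostium, threshold):
    """If the distance from the coronary artery entrance (ostium) to the
    reference plane is below the threshold, translate the plane along its
    normal so the distance equals the threshold; otherwise return the
    plane unchanged. The plane is given as (point, normal)."""
    n = np.asarray(normal, float)
    n = n / np.linalg.norm(n)
    d = float(np.dot(np.asarray(ostium, float) - np.asarray(point, float), n))
    if abs(d) >= threshold:
        return np.asarray(point, float), n      # sufficient clearance already
    # Shift the plane away from the ostium along the normal so that the
    # signed distance keeps its sign but has magnitude `threshold`.
    side = 1.0 if d >= 0 else -1.0
    shift = d - side * threshold                # signed translation along n
    return np.asarray(point, float) + shift * n, n
```

A fuller implementation would also account for the deployed extent of the artificial valve rather than treating the plane alone.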
If the region of interest is the mitral valve, the present method is applicable to processing for correcting the reference plane in placement of MITRACLIP (registered trademark) on the mitral valve. In such a case, for example, the extraction function 353 extracts the mitral valve included in the CT image as the region of interest. The identification function 354 identifies a reference plane serving as a reference in placement of MITRACLIP based on the morphology of the extracted mitral valve. The morphology acquisition function 355 acquires the morphological information regarding MITRACLIP. The correction function 356 determines whether to correct the reference plane based on the morphology of the mitral valve and the morphology of MITRACLIP, and corrects the reference plane depending on the determination.
For example, the correction function 356 determines whether to correct the reference plane identified by the identification function 354 based on the positional relationship between MITRACLIP and the mitral valve after deployment when MITRACLIP is deployed on the reference plane. For example, if the size of MITRACLIP is greater than the leaflet length of the mitral valve along the reference plane, the correction function 356 determines that the reference plane is to be corrected, and corrects the reference plane so that the size of MITRACLIP is less than the leaflet length of the mitral valve along the reference plane.
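The determination and correction for MITRACLIP can be illustrated as a search over candidate tilt angles of the reference plane, with the leaflet-length measurement abstracted as a callable standing in for measurement on the CT image. This is a hypothetical sketch of the size comparison described above, not the procedure of the disclosure.

```python
def correct_plane_angle(device_size, measure_leaflet_length, candidate_angles):
    """Return the first candidate tilt angle of the reference plane at
    which the device fits (device size < leaflet length measured along
    the correspondingly tilted plane), or None if no candidate fits.

    measure_leaflet_length : callable mapping a tilt angle to the leaflet
                             length measured along the tilted plane
    """
    for angle in candidate_angles:
        if device_size < measure_leaflet_length(angle):
            return angle
    return None
```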
The processing circuitry described in each of the foregoing exemplary embodiments may be configured by combining a plurality of independent processors, and the processing functions may be implemented by the processors executing the programs. The processing functions of the processing circuitry may be implemented on a single processing circuit or a plurality of processing circuits in an appropriately distributed or integrated manner. The processing functions of the processing circuitry may be implemented by a combination of hardware, such as a circuit, and software. While the programs corresponding to the processing functions have been described as being stored in the single storage circuitry 34, the exemplary embodiments are not limited thereto. For example, the programs corresponding to the processing functions may be stored in a distributed manner across a plurality of storage circuits, and the processing circuitry may be configured to read the programs from the storage circuits and execute the programs.
In the foregoing exemplary embodiments, the components set forth in this specification are described as being implemented by respective functions of a processing circuitry. However, the exemplary embodiments are not limited thereto. For example, instead of the function-based implementation described in the exemplary embodiments, the components set forth in the specification may be implemented by only hardware, only software, or a combination of hardware and software.
The term “processor” used in the description of the foregoing exemplary embodiments refers to a circuit such as a central processing unit (CPU), a graphics processing unit (GPU), an application specific integrated circuit (ASIC), or a programmable logic device (e.g., a simple programmable logic device [SPLD], a complex programmable logic device [CPLD], or a field programmable gate array [FPGA]). Instead of storing the programs in the storage circuit, the programs may be directly built in the processor circuit. In such a case, the processor implements the functions by reading the built-in programs in its own circuit and executing the programs. The processors according to the exemplary embodiments are not limited to a single-circuit configuration. A plurality of independent circuits may be combined into a processor that implements the functions.
A medical image processing program to be executed by the processor is provided as a built-in program in a read-only memory (ROM) or other storage circuits. This medical image processing program may be provided recorded on a computer-readable non-transitory storage medium such as a compact disk read-only memory (CD-ROM), a flexible disk (FD), a compact disk recordable (CD-R), or a digital versatile disk (DVD), in a file format installable or executable on the apparatuses. The medical image processing program may be stored on a computer connected to a network such as the Internet, and provided or distributed through downloading over the network. For example, the medical image processing program includes modules including the foregoing processing functions. In terms of actual hardware, a CPU reads the medical image processing program from a storage medium such as a ROM and executes the program, whereby the modules are loaded into and generated on a main storage device.
In the foregoing exemplary embodiments and modifications, the components of the apparatuses illustrated in the diagrams are functional concepts and do not necessarily need to be physically configured as illustrated in the diagrams. In other words, the specific forms of distribution or integration of the apparatuses are not limited to the illustrated ones, and all or part of the apparatuses can be functionally or physically distributed or integrated into any units depending on various loads and usages. All or part of the processing functions for the apparatuses to perform can be implemented by a CPU and programs to be analyzed and executed by the CPU, or as wired logic hardware.
All or part of the processes described to be automatically performed in the foregoing exemplary embodiments and modifications can be manually performed. All or part of the processes described to be manually performed can be automatically performed using known methods. Furthermore, the processing procedures, control procedures, specific names, and information including various types of data and parameters described above or illustrated in the drawings can be freely modified unless otherwise specified.
According to at least one of the exemplary embodiments described above, a reference plane or reference line suitable for clinical use can be identified.
While several exemplary embodiments have been described, these exemplary embodiments are presented by way of example only, and not intended to limit the scope of the invention. The exemplary embodiments can be practiced in various other forms, and various omissions, replacements, and modifications can be made without departing from the gist of the invention. Such exemplary embodiments and their variations are encompassed by the scope and gist of the invention, as well as by the invention set forth in the claims and the range of equivalency thereof.
Number | Date | Country | Kind
---|---|---|---
2023-104435 | Jun 2023 | JP | national