This application claims priority from Japanese Patent Application No. 2023-147944, filed Sep. 12, 2023, the disclosure of which is incorporated herein by reference in its entirety.
The present disclosure relates to an image processing apparatus, a radiography system, and a non-transitory storage medium storing a program.
In recent years, in order to promote early detection of breast cancer, image diagnosis using a radiography apparatus that performs imaging of a breast (so-called mammography) has attracted attention. In mammography, tomosynthesis imaging has been proposed, in which imaging is performed by moving a radiation source and irradiating the breast with radiation at a plurality of radiation source positions, and tomographic images in which desired tomographic planes are emphasized are generated by reconstructing the plurality of projection images acquired by the imaging. In tomosynthesis imaging, the radiation source is moved in parallel with a radiation detector, or along a circular or elliptical arc, in accordance with characteristics of the imaging apparatus and the required tomographic image, and a plurality of projection images are acquired by imaging the breast at the plurality of radiation source positions. A tomographic image is then generated by reconstructing the projection images using a back projection method, such as a simple back projection method or a filtered back projection method, or an iterative reconstruction method.
The tomographic images are generated on a plurality of tomographic planes of the breast, and thus it is possible to separate structures that overlap in the depth direction in which the tomographic planes of the breast are arranged. Therefore, it is possible to find an abnormal part, such as a lesion, that is unlikely to be detected in a two-dimensional image (hereinafter, referred to as a simple two-dimensional image) acquired by conventional simple imaging, in which a subject is irradiated with radiation from a predetermined direction.
As a technology using the tomographic images in this way, JP2014-128716A discloses a technology of generating a pseudo two-dimensional image (hereinafter, referred to as a synthesized two-dimensional image) corresponding to a simple two-dimensional image by synthesizing a plurality of tomographic images acquired by tomosynthesis imaging and having different distances (positions in a height direction) from a detection surface of a radiation detector to a radiation source, using an addition method, an averaging method, a maximum intensity projection method, a minimum intensity projection method, or the like.
Meanwhile, in a case of creating a synthesized two-dimensional image, it is desirable to emphasize the structures of interest as much as possible. However, in a case where the structures of interest overlap with each other, it is difficult to determine priorities of the structures of interest. Depending on the assigned priority, a structure of interest that should be referred to may become faint or disappear in the synthesized two-dimensional image.
For this reason, U.S. Pat. No. 11,445,993B discloses a technology of detecting an object using an independent recognition module for each type of the object (structure of interest). This technology discloses that, in a case where there is a possibility that a first object and a second object overlap with each other, a weight of the object is calculated from a recognition result of a type of the object and a synthesized two-dimensional image is created according to the weight.
Meanwhile, in a case where a plurality of structures of interest are present in the synthesized two-dimensional image, in many cases, the structures of interest to be focused on differ depending on the mutual relationship between the plurality of structures of interest. For example, in a case where a plurality of calcifications are present, in many cases, it is more important to focus on calcifications that exist densely than on calcifications that exist alone. In addition, in many cases, calcifications that exist along the mammary glands are more important than other calcifications.
On the other hand, in the technology disclosed in U.S. Pat. No. 11,445,993B, it is possible to set the structures of interest to be focused on according to the types of the structures of interest. However, since a mutual relationship between the structures of interest is not considered, there is a problem in that it is not possible to determine superiority and inferiority of the structures of interest having the same type in consideration of the mutual relationship.
The present disclosure has been made in view of the above circumstances, and an object of the present disclosure is to provide an image processing apparatus, a radiography system, and a non-transitory storage medium storing a program capable of determining superiority and inferiority of structures of interest having the same type in consideration of a mutual relationship between the structures of interest.
In order to achieve the above object, according to a first aspect of the present disclosure, there is provided an image processing apparatus comprising: at least one processor, in which the processor is configured to: acquire projection images obtained by imaging a subject with radiation at a plurality of angles; reconstruct a plurality of tomographic images from the acquired projection images; detect a plurality of structures of interest from the plurality of reconstructed tomographic images; and determine a priority for each of the structures of interest having the same type among the plurality of detected structures of interest by using a mutual relationship with other structures of interest.
According to a second aspect of the present disclosure, in the image processing apparatus according to the first aspect, the processor is configured to create a synthesized two-dimensional image of the subject by using the determined priority.
According to a third aspect of the present disclosure, in the image processing apparatus according to the first aspect, the mutual relationship is at least one of a closeness relationship with the other structures of interest in terms of three-dimensional distance or a size relationship with the other structures of interest.
According to a fourth aspect of the present disclosure, in the image processing apparatus according to the third aspect, the mutual relationship is both the closeness relationship and the size relationship.
According to a fifth aspect of the present disclosure, in the image processing apparatus according to the first aspect, the structure of interest is at least one type of a calcification, a tumor, a spicula, or a mammary gland.
According to a sixth aspect of the present disclosure, in the image processing apparatus according to the first aspect, the processor is configured to detect the structure of interest by using a machine learning model which is obtained by performing machine learning in advance and in which information indicating the tomographic image is set as input information and information indicating the plurality of structures of interest is set as output information.
According to a seventh aspect of the present disclosure, in the image processing apparatus according to the first aspect, the processor is configured to detect the structure of interest by performing pattern matching using a template image of a target structure of interest.
In addition, in order to achieve the above object, according to an eighth aspect of the present disclosure, there is provided a radiography system comprising: the image processing apparatus according to the first aspect; and a mammography apparatus that acquires projection images to be used by the image processing apparatus.
Further, in order to achieve the above object, according to a ninth aspect of the present disclosure, there is provided a non-transitory storage medium storing a program causing a computer to execute an image process, the image process comprising: acquiring projection images obtained by imaging a subject with radiation at a plurality of angles; reconstructing a plurality of tomographic images from the acquired projection images; detecting a plurality of structures of interest from the plurality of reconstructed tomographic images; and determining a priority for each of the structures of interest having the same type among the plurality of detected structures of interest by using a mutual relationship with other structures of interest.
According to a tenth aspect of the present disclosure, there is provided a method executed by a computer, the method comprising: acquiring projection images obtained by imaging a subject with radiation at a plurality of angles; reconstructing a plurality of tomographic images from the acquired projection images; detecting a plurality of structures of interest from the plurality of reconstructed tomographic images; and determining a priority for each of the structures of interest having the same type among the plurality of detected structures of interest by using a mutual relationship with other structures of interest.
According to the present disclosure, it is possible to determine superiority and inferiority of structures of interest having the same type in consideration of a mutual relationship between the structures of interest.
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings.
As illustrated in
The mammography apparatus 1 comprises an arm part 12 that is connected to a base (not illustrated) by a rotation shaft 11. An imaging table 13 is attached to one end of the arm part 12, and a radiation emitting unit 14 is attached to the other end of the arm part 12 so as to face the imaging table 13. The arm part 12 is configured such that only the end to which the radiation emitting unit 14 is attached can be rotated. Therefore, the imaging table 13 is fixed and only the radiation emitting unit 14 can be rotated.
A radiation detector 15, such as a flat panel detector, is provided in the imaging table 13. The radiation detector 15 has a radiation detection surface 15A. In addition, a circuit board including a charge amplifier that converts a charge signal read from the radiation detector 15 into a voltage signal, a correlated double sampling circuit that samples the voltage signal output from the charge amplifier, and an analog-to-digital (AD) conversion unit that converts the voltage signal into a digital signal is provided in the imaging table 13.
A radiation source 16 is accommodated in the radiation emitting unit 14. The radiation source 16 emits X-rays as radiation. The console 2 controls a timing when the radiation source 16 emits the radiation and radiation generation conditions of the radiation source 16, that is, selection of a target and filter materials, a tube voltage, an irradiation time, and the like.
Further, the arm part 12 is provided with a compression plate 17 that is disposed above the imaging table 13 and presses and compresses the breast M, a support part 18 that supports the compression plate 17, and a moving mechanism 19 that moves the support part 18 in a vertical direction in
The console 2 has a function of controlling the mammography apparatus 1 using, for example, an imaging order and various kinds of information acquired from a radiology information system (RIS) (not illustrated) or the like through a network, such as a wireless communication local area network (LAN), and instructions or the like directly issued by an engineer or the like. Specifically, the console 2 acquires a plurality of projection images as described below by causing the mammography apparatus 1 to perform tomosynthesis imaging of the breast M. As an example, in the present embodiment, a server computer is used as the console 2.
The image storage system 3 is a system that stores image data, such as the projection images obtained by imaging of the mammography apparatus 1. The image storage system 3 extracts, from the stored images, an image corresponding to a request from, for example, the console 2 or the image processing apparatus 4, and transmits the extracted image to the device that is the source of the request. A specific example of the image storage system 3 is a picture archiving and communication system (PACS).
Next, the image processing apparatus 4 according to the present embodiment will be described. First, a hardware configuration of the image processing apparatus 4 according to the present embodiment will be described with reference to
The storage 23 is implemented by a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or the like. An image processing program 22 to be installed in the image processing apparatus 4 is stored in the storage 23 as a storage medium. The CPU 21 reads out the image processing program 22 from the storage 23, loads the image processing program 22 into the memory 26, and executes the loaded image processing program 22.
The image processing program 22 is stored in a storage device of a server computer connected to the network or in a network storage in a state of being accessible from the outside, and is downloaded and installed in the computer that configures the image processing apparatus 4 in response to a request. Alternatively, the image processing program 22 is distributed in a state of being recorded on a recording medium, such as a digital versatile disc (DVD) or a compact disc read only memory (CD-ROM), and is installed in the computer that configures the image processing apparatus 4 from the recording medium.
Hereinafter, a functional configuration of the image processing apparatus 4 according to the present embodiment will be described.
As illustrated in
The image acquisition unit 30 acquires a plurality of projection images acquired by causing the console 2 to perform tomosynthesis imaging on the breast M by the mammography apparatus 1. The image acquisition unit 30 acquires the projection images from the console 2 or the image storage system 3 via the network I/F 27. Further, the tomographic image reconstruction unit 31 reconstructs a plurality of tomographic images from the projection images acquired by the image acquisition unit 30.
Here, the tomosynthesis imaging and the generation of the tomographic image will be described. In a case of performing tomosynthesis imaging for generating a tomographic image, the console 2 moves the radiation source 16 by rotating the arm part 12 around the rotation shaft 11, irradiates the breast M as a subject with radiation at a plurality of radiation source positions obtained by the movement of the radiation source 16 according to predetermined imaging conditions for tomosynthesis imaging, and causes the radiation detector 15 to detect the radiation passing through the breast M. Thereby, the console 2 acquires a plurality of projection images Gi (i=1 to n, where n is the number of radiation source positions and is, for example, n=15) at the plurality of radiation source positions.
In
In addition, the tomographic image reconstruction unit 31 generates a tomographic image in which the desired tomographic planes of the breast M are emphasized by reconstructing the plurality of projection images Gi. Specifically, the image processing apparatus 4 generates a plurality of tomographic images Dj (j=1 to m) on each of the plurality of tomographic planes of the breast M as illustrated in
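As a rough illustration of the simple back projection mentioned in the background, the following is a minimal shift-and-add sketch in Python. It is not the actual reconstruction code of the tomographic image reconstruction unit 31; the function name, the parallel-shift approximation of the imaging geometry, and the parameters are assumptions introduced only for illustration.

```python
import numpy as np

def shift_and_add(projections, source_angles_deg, plane_heights_mm, pixel_pitch_mm):
    """Simplified shift-and-add (simple back projection) for tomosynthesis.

    projections: array of shape (n, H, W), one projection Gi per source angle.
    Returns tomographic images Dj of shape (m, H, W), one per tomographic plane.
    """
    n, height, width = projections.shape
    tomos = np.zeros((len(plane_heights_mm), height, width), dtype=np.float32)
    for j, z in enumerate(plane_heights_mm):
        for i, theta in enumerate(np.deg2rad(source_angles_deg)):
            # Lateral shift (in pixels) of a structure at height z seen from this
            # source angle; a parallel-beam approximation of the real geometry.
            shift_px = int(round(z * np.tan(theta) / pixel_pitch_mm))
            # np.roll wraps around at the edges; edge handling is ignored here.
            tomos[j] += np.roll(projections[i], shift_px, axis=1)
        tomos[j] /= n  # averaging the aligned projections emphasizes plane z
    return tomos
```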
The structure-of-interest detection unit 32 detects a structure of interest from the plurality of tomographic images Dj. In the present embodiment, a calcification, a tumor, a spicula, and a mammary gland included in the breast M are detected as the structure of interest.
The structure-of-interest detection unit 32 according to the present embodiment detects a structure of interest by using a machine learning model 32A that is configured with a neural network obtained by performing machine learning, such as deep learning, using training data so as to extract a structure of interest from each of the tomographic images Dj. That is, the machine learning model 32A according to the present embodiment is a machine learning model in which information indicating the tomographic image is set as input information and information indicating a plurality of types of structures of interest is set as output information, and is obtained by performing machine learning in advance.
As the machine learning model 32A, for example, a well-known machine learning model, such as a convolutional neural network (CNN) or a support vector machine (SVM), can be used.
In the present embodiment, although a single machine learning model 32A that can detect a plurality of types of structures of interest is applied, the present disclosure is not limited thereto. For example, the machine learning model 32A may be prepared and applied for each type of the structures of interest to be detected.
As described above, in the present embodiment, the structure of interest is detected by the machine learning model 32A. On the other hand, the present disclosure is not limited thereto. For example, an aspect in which the structure of interest is detected from the tomographic image Dj using a known computer-aided diagnosis (CAD) algorithm may be used. In the CAD algorithm, a probability (likelihood) indicating that a pixel in the tomographic image Dj belongs to the structure of interest is derived, and a pixel whose probability is equal to or higher than a predetermined threshold value is detected as the structure of interest. A CAD algorithm is prepared for each type of the structures of interest. In a case where CAD is applied in the present embodiment, a CAD algorithm for tumor detection, a CAD algorithm for spicula detection, a CAD algorithm for calcification detection, and a CAD algorithm for mammary gland detection are prepared.
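A minimal Python sketch of this likelihood-thresholding step is shown below, assuming the CAD algorithm has already produced a per-pixel probability map for one tomographic slice. The function name, the use of scipy for connected-component grouping, and the threshold value are illustrative assumptions and not the actual CAD implementation.

```python
import numpy as np
from scipy import ndimage

def detect_by_likelihood(prob_map, threshold=0.5):
    """Threshold a per-pixel likelihood map and report each connected group
    of above-threshold pixels as one detected structure of interest."""
    mask = prob_map >= threshold
    labels, n = ndimage.label(mask)  # group contiguous above-threshold pixels
    centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    # Return (row, column) centers of the detected candidate structures.
    return [(float(y), float(x)) for y, x in centers]
```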
In addition, the detection of the structure of interest is not limited to detection using the machine learning model 32A or detection using the CAD. For example, the structure of interest may be detected from the tomographic image Dj by performing pattern matching using a template image of a structure of interest to be detected or a filtering process using a filter for detecting the structure of interest. Here, in a case where the pattern matching is applied, an image that is already acquired for the type of the structure of interest to be extracted is applied as the template image.
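For the pattern-matching alternative, a sketch using normalized cross-correlation is given below, assuming OpenCV is available. The function name, the score threshold, and the way candidates are reported are illustrative assumptions rather than the embodiment's actual matching procedure.

```python
import cv2
import numpy as np

def detect_by_template(tomo_slice, template, score_threshold=0.7):
    """Detect candidate structures of interest in one tomographic slice by
    normalized cross-correlation with a previously acquired template image."""
    response = cv2.matchTemplate(tomo_slice.astype(np.float32),
                                 template.astype(np.float32),
                                 cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(response >= score_threshold)
    h, w = template.shape
    # Convert top-left match positions to candidate centers in slice coordinates.
    return [(x + w // 2, y + h // 2) for y, x in zip(ys, xs)]
```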
In addition, in the present embodiment, the structure of interest is detected using only the machine learning model 32A, but the present disclosure is not limited thereto. For example, a form in which the structure of interest is detected by using both the machine learning model 32A and the pattern matching may be used. Further, in this form, a method of detecting the structure of interest may be switched according to the type of the structure of interest to be detected.
As described above, in the present embodiment, all of the calcification, the tumor, the spicula, and the mammary gland are applied as the structures of interest to be detected, but the present disclosure is not limited thereto. For example, a combination excluding any one or more of the calcification, the tumor, the spicula, and the mammary gland may be applied as the structures of interest to be detected.
The priority determination unit 33 determines a priority of each of the structures of interest having the same type among the plurality of structures of interest that are detected by the structure-of-interest detection unit 32, by using a mutual relationship with the other structures of interest.
In the present embodiment, as the mutual relationship, both a closeness relationship with the other structures of interest in three-dimensional distance and a size relationship with the other structures of interest are applied, but the present disclosure is not limited thereto. For example, a form in which only one of the closeness relationship or the size relationship is applied as the mutual relationship may be adopted. A method of determining the priority by the priority determination unit 33 will be described in detail below.
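Ahead of that detailed description, the following is a minimal Python sketch of how such a priority could be computed from the two relationships. The representation of a detected structure (a dictionary with hypothetical 'center_mm' and 'size_mm' fields), the neighborhood radius, the size threshold, and the way the two relationships are combined are all assumptions introduced for illustration and are not taken from the embodiment.

```python
import numpy as np

def determine_priority(structures, cluster_radius_mm=10.0, small_size_mm=2.0):
    """Assign a two-level priority (True = prioritized) to structures of the
    same type from the closeness relationship (number of neighbors within a
    three-dimensional distance) and the size relationship (relatively small
    structures prioritized, as in the present embodiment)."""
    centers = np.array([s['center_mm'] for s in structures], dtype=float)
    sizes = np.array([s['size_mm'] for s in structures], dtype=float)
    priorities = []
    for i in range(len(structures)):
        # Closeness relationship: does this structure have nearby neighbors?
        dists = np.linalg.norm(centers - centers[i], axis=1)
        n_neighbors = int(np.sum(dists <= cluster_radius_mm)) - 1  # exclude self
        # Size relationship: relatively small structures are prioritized here.
        is_small = sizes[i] <= small_size_mm
        priorities.append(n_neighbors > 0 or is_small)
    return priorities
```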
The image creation unit 34 creates a synthesized two-dimensional image of the subject (in the present embodiment, the breast M) by using the priority determined by the priority determination unit 33.
The image creation unit 34 according to the present embodiment creates a synthesized two-dimensional image by taking a weighted average for each pixel, in which a weight value (for example, 0.9) for a tomographic image including a structure of interest determined to have a high priority by the priority determination unit 33 is set larger than a weight value (for example, 0.1) for the other tomographic images. Here, the method of creating the synthesized two-dimensional image is not limited to such an averaging method, and an addition method, a maximum intensity projection method, a minimum intensity projection method, or the like may be applied as a method of creating the synthesized two-dimensional image.
For example, in a case where the minimum intensity projection method is applied as a method of creating the synthesized two-dimensional image, among the tomographic images to be projected, only tomographic images including the structures of interest having a high priority may be projected.
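As a concrete illustration of the weighted-average method of the present embodiment, the following is a minimal Python sketch. It mirrors the 0.9/0.1 example above, although the per-slice (rather than per-region) weighting and the function signature are simplifying assumptions.

```python
import numpy as np

def synthesize_2d(tomos, high_priority_slices, w_high=0.9, w_low=0.1):
    """Create a synthesized two-dimensional image as a per-pixel weighted
    average of the tomographic images, giving a larger weight to slices that
    contain a structure of interest determined to have a high priority.

    tomos: array of shape (m, H, W); high_priority_slices: iterable of indices j.
    """
    m = tomos.shape[0]
    weights = np.full(m, w_low, dtype=np.float32)
    weights[list(high_priority_slices)] = w_high
    # Weighted average over the slice axis, normalized by the sum of weights.
    return np.tensordot(weights, tomos, axes=(0, 0)) / weights.sum()
```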
As described above, in the present embodiment, the synthesized two-dimensional image is applied as the image created by the image creation unit 34, but the present disclosure is not limited thereto. For example, a form in which a slab image that is an image including thickness information is applied as the image created by the image creation unit 34 may be adopted. As a form example in this case, a form in which only the tomographic images including the structures of interest having a high priority are used to create the slab images may be used.
The display controller 35 displays the synthesized two-dimensional image generated by the image creation unit 34 on the display 24. The synthesized two-dimensional image is created by setting the weight value for a tomographic image including a structure of interest having a high priority to be larger than the weight values for the other tomographic images. Therefore, the synthesized two-dimensional image displayed here is an extremely high-quality image in which the structure of interest having a high priority is emphasized.
The image created by the image creation unit 34 is not limited to the synthesized two-dimensional image and the slab image described above, and may be an image for performing display of linking the synthesized two-dimensional image created from the plurality of tomographic images Dj with the corresponding tomographic images Dj.
That is, in general, the number of tomographic images may be 100 or more depending on the application. Therefore, in a case of reading the tomographic images, there is a problem in that a relatively long reading time is required.
Therefore, in this form, the CPU 21 first creates a synthesized two-dimensional image as described above, and displays the synthesized two-dimensional image. Accordingly, an operator refers to the displayed synthesized two-dimensional image, and in a case where a lesion is suspected in the structure of interest displayed on the synthesized two-dimensional image, designates the structure of interest. In response to the designation, the CPU 21 displays the tomographic image including the designated structure of interest. Thereby, a reading time of the tomographic image can be significantly shortened.
Next, a method of determining the priority by the priority determination unit 33 according to the present embodiment will be specifically described with reference to
In the example illustrated in
In this state, in a case where the synthesized two-dimensional image is created by synthesizing all of the tomographic images Dj at the same ratio (weight value), as illustrated in
Therefore, as described above, the priority determination unit 33 according to the present embodiment determines the priority of each of the structures of interest having the same type by using the predetermined closeness relationship and the predetermined size relationship.
The example illustrated in
On the other hand, the example illustrated in
Further, the example illustrated in
As described above, in the present embodiment, the priority determined by the priority determination unit 33 is reflected in the tomographic image Dj, and then the synthesized two-dimensional image is created. Therefore, the synthesized two-dimensional image in which the structure of interest having a high priority is emphasized is created. On the other hand, the present disclosure is not limited thereto. For example, first, a synthesized two-dimensional image may be temporarily created without considering the priority. Then, the synthesized two-dimensional image may be processed such that the structure of interest having a higher priority is emphasized, and thus, the final synthesized two-dimensional image may be created. As a method of emphasizing the structure of interest in this form, a form in which only a region of the structure of interest (hereinafter, referred to as an “emphasis target region”) is set to have higher brightness as compared with a region around the emphasis target region, a form in which only the emphasis target region is set to a different display state such as blink display or reverse display, a form in which only the emphasis target region is displayed by being surrounded by an external rectangular frame or an external circular frame, and the like can be exemplified.
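For the post-hoc emphasis alternative described above, a minimal Python sketch of the external rectangular frame form is given below, assuming an 8-bit grayscale synthesized image and OpenCV. The region format and the frame color are illustrative assumptions.

```python
import cv2

def emphasize_regions(synth_2d_u8, regions):
    """Draw an external rectangular frame around each high-priority emphasis
    target region of an already created synthesized two-dimensional image
    (assumed to be an 8-bit grayscale array).

    regions: iterable of (x, y, w, h) boxes in image pixel coordinates.
    """
    out = cv2.cvtColor(synth_2d_u8, cv2.COLOR_GRAY2BGR)
    for (x, y, w, h) in regions:
        cv2.rectangle(out, (x, y), (x + w, y + h), color=(0, 0, 255), thickness=2)
    return out
```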
Next, processing performed in the present embodiment will be described.
In a case where the input device 25 receives an instruction to start the processing from the operator, execution of the image processing program 22 is started, and thus, the image processing illustrated in
First, the image acquisition unit 30 acquires the projection images Gi by reading the projection images Gi from the storage 23, the projection images Gi being obtained by imaging the subject (in the present embodiment, the breast M) with the radiation emitted from the radiation source 16 at a plurality of angles (step S100). Next, the tomographic image reconstruction unit 31 reconstructs the plurality of tomographic images Dj from the acquired projection images Gi (step S102).
Next, the structure-of-interest detection unit 32 detects a plurality of structures of interest from the plurality of reconstructed tomographic images Dj (step S104). Next, the priority determination unit 33 determines a priority for each of the plurality of detected structures of interest having the same type by using the mutual relationship with the other structures of interest (step S106).
In addition, the image creation unit 34 creates a synthesized two-dimensional image from the plurality of tomographic images Dj by using the determined priority (step S108). Subsequently, the display controller 35 displays the synthesized two-dimensional image on the display 24 (step S110), and the processing is ended.
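Tying the steps together, the following is a minimal end-to-end sketch of steps S100 to S110 in Python, reusing the earlier sketches. The helper `detect_structures` is a hypothetical stand-in for the detection step of the structure-of-interest detection unit 32, and the 'slice_index' field is an assumption about how a detection records the tomographic slice it was found in.

```python
def run_image_processing(projections, angles_deg, plane_heights_mm, pixel_pitch_mm):
    """End-to-end sketch of the image processing of the present embodiment."""
    # S100: the projection images Gi are assumed to have been acquired and passed in.
    # S102: reconstruct tomographic images Dj from the projection images Gi.
    tomos = shift_and_add(projections, angles_deg, plane_heights_mm, pixel_pitch_mm)
    # S104: detect structures of interest (hypothetical detector stand-in).
    structures = detect_structures(tomos)
    # S106: determine priorities from the mutual relationships.
    priorities = determine_priority(structures)
    # S108: synthesize a two-dimensional image, weighting high-priority slices.
    high_slices = {s['slice_index'] for s, p in zip(structures, priorities) if p}
    synth = synthesize_2d(tomos, high_slices)
    # S110: the synthesized two-dimensional image is handed off for display.
    return synth
```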
As described above, with the image processing apparatus according to the present embodiment, the projection images obtained by imaging the subject with the radiation at a plurality of angles are acquired, and the plurality of tomographic images are reconstructed from the acquired projection images. The plurality of structures of interest are detected from the plurality of reconstructed tomographic images, and the priority is determined for each of the plurality of detected structures of interest having the same type by using the mutual relationship with the other structures of interest. Therefore, it is possible to determine the superiority and inferiority of the structures of interest having the same type in consideration of the mutual relationship between the structures of interest.
Further, according to the image processing apparatus according to the present embodiment, the synthesized two-dimensional image of the subject is created by using the determined priority. Therefore, it is possible to further improve visibility of the structure of interest in the synthesized two-dimensional image as compared with a case where the determined priority is not used.
In addition, according to the image processing apparatus according to the present embodiment, the mutual relationship is set as at least one of the closeness relationship with other structures of interest in three-dimensional distance or the size relationship with other structures of interest. Therefore, in a case where the closeness relationship is applied as the mutual relationship, it is possible to obtain the priority of the structure of interest to be prioritized from the viewpoint of the closeness relationship, and in a case where the size relationship is applied as the mutual relationship, it is possible to obtain the priority of the structure of interest to be prioritized from the viewpoint of the size relationship.
Further, according to the image processing apparatus according to the present embodiment, at least one type of a calcification, a tumor, a spicula, or a mammary gland is applied as the structure of interest. Therefore, it is possible to determine the superiority and inferiority of the type of the applied structure of interest in consideration of the mutual relationship between the structures of interest.
Further, according to the image processing apparatus according to the present embodiment, the structure of interest is detected by using a machine learning model obtained by performing machine learning in advance, in which information indicating a tomographic image is set as input information and information indicating a plurality of structures of interest is set as output information. Therefore, it is possible to detect the structure of interest with higher accuracy as compared with a case of detecting the structure of interest by performing pattern matching.
Further, in the image processing apparatus according to the present embodiment, an aspect in which the structure of interest is detected by performing pattern matching using a template image of a target structure of interest is adopted. Therefore, according to the aspect, it is possible to more simply detect the structure of interest as compared with a case of detecting the structure of interest by using a machine learning model.
In the embodiment, a case where the CPU 21 provided in the image processing apparatus 4 is applied as the processor according to the technology of the present disclosure has been described. On the other hand, the present disclosure is not limited thereto. For example, the CPU provided in any of the mammography apparatus 1, the console 2, and the image storage system 3 may be applied as a processor according to the technology of the present disclosure.
In addition, in the embodiment, a case where the closeness relationship and the size relationship are each independently applied has been described. On the other hand, the present disclosure is not limited thereto. For example, a form in which the closeness relationship and the size relationship are applied in combination may be used. Examples of this form include a form in which a priority is first determined based on the closeness relationship and then the priority is determined based on the size relationship for a plurality of the structures of interest having the same priority, a form in which, on the contrary, a priority is first determined based on the size relationship and then the priority is determined based on the closeness relationship for a plurality of the structures of interest having the same priority, and the like.
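A minimal sketch of the first combined form (closeness first, with size as a tiebreaker) could look as follows in Python. The structure fields and the neighborhood radius are the same hypothetical values used in the earlier priority sketch, and only the ordering logic is illustrated.

```python
import numpy as np

def combined_priority_order(structures, cluster_radius_mm=10.0):
    """Order structures of the same type first by the closeness relationship
    (more three-dimensional neighbors first) and, for ties, by the size
    relationship (smaller first). Returns indices from highest to lowest
    priority."""
    centers = np.array([s['center_mm'] for s in structures], dtype=float)

    def sort_key(i):
        dists = np.linalg.norm(centers - centers[i], axis=1)
        n_neighbors = int(np.sum(dists <= cluster_radius_mm)) - 1  # exclude self
        return (-n_neighbors, structures[i]['size_mm'])

    return sorted(range(len(structures)), key=sort_key)
```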
In addition, in the embodiment, the case where the structures of interest are focused on only the calcifications in
In addition, in the embodiment, the case where the relationship, in which the structure of interest having a relatively small size is prioritized, is applied as the size relationship has been described. On the other hand, the present disclosure is not limited thereto. For example, a form in which a relationship in which the structure of interest having a relatively large size is prioritized is applied as the size relationship may be used.
In addition, in the embodiment, the case where the priority having only two stages of priority and non-priority is applied as the priority has been described. On the other hand, the present disclosure is not limited thereto. For example, a form in which a priority having three or more stages is applied as the priority may be used.
In addition, in the embodiment, for example, as hardware structures of processing units that execute various kinds of processing, such as the image acquisition unit 30, the tomographic image reconstruction unit 31, the structure-of-interest detection unit 32, the priority determination unit 33, the image creation unit 34, and the display controller 35, the following various processors can be used. The various processors include, in addition to the CPU that is a general-purpose processor that executes software (a program) to function as various processing units, a programmable logic device (PLD) that is a processor of which a circuit configuration can be changed after manufacture, such as a field programmable gate array (FPGA), and a dedicated electric circuit that is a processor having a circuit configuration designed exclusively for executing a specific process, such as an application specific integrated circuit (ASIC).
One processing unit may be configured by one of the various processors, or may be configured by a combination of the same or different types of two or more processors (for example, a combination of a plurality of FPGAs or a combination of the CPU and the FPGA). In addition, a plurality of processing units may be configured by one processor.
As an example in which the plurality of processing units are configured by one processor, first, as represented by a computer such as a client or a server, a form in which one processor is configured by a combination of one or more CPUs and software and the processor functions as the plurality of processing units may be adopted. Second, as represented by a system on chip (SoC) or the like, a form in which a processor that implements the functions of the entire system including the plurality of processing units with one integrated circuit (IC) chip is used may be adopted. In this way, the various processing units are formed using one or more of the above-mentioned various processors as hardware structures.
Further, the hardware structure of these various processors is, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
In addition, in the embodiment described above, the aspect in which the image processing program 22 is stored (installed) in advance in the storage 23 of the image processing apparatus 4 has been described, but the present disclosure is not limited thereto. The image processing program 22 may be provided in a form recorded in a recording medium such as a compact disc read-only memory (CD-ROM), a digital versatile disc read-only memory (DVD-ROM), and a universal serial bus (USB) memory. In addition, the image processing program 22 may be configured to be downloaded from an external device via a network.
From the above description, the invention described in the following Appendices can be understood.
An image processing apparatus comprising:
The image processing apparatus according to Appendix 1,
The image processing apparatus according to Appendix 1 or 2,
The image processing apparatus according to Appendix 3,
The image processing apparatus according to any one of Appendixes 1 to 4,
The image processing apparatus according to any one of Appendixes 1 to 5,
The image processing apparatus according to any one of Appendixes 1 to 6,
A radiography system comprising:
A program causing a computer to execute a process comprising: