The present application claims priority from Japanese Patent Application No. 2022-076306, filed on May 2, 2022, the entire disclosure of which is incorporated herein by reference.
The present disclosure relates to an image processing device, method, and program.
An endoscope is inserted into a lumen, such as a bronchus or a digestive organ, of a subject, and an endoscopic image of the inside of the lumen is acquired to observe the lumen. In addition, a biopsy treatment is also performed in which a tissue at a site suspected to be a lesion found in the endoscopic image is collected with a treatment tool, such as forceps, attached to a distal end of the endoscope. In a case of performing such a treatment using the endoscope, it is important that the endoscope accurately reaches a target position in the subject. Therefore, a positional relationship between the endoscope and a human body structure is grasped by continuously irradiating the subject with radiation from a radiation source during the treatment and performing fluoroscopic imaging to display the acquired fluoroscopic image in real time. However, it is difficult to grasp a depth inside the subject from the fluoroscopic image. In addition, in a case in which the lesion is small, the lesion may be difficult to see in the endoscopic image, so that a success rate of collecting the tissue of the lesion is reduced.
Therefore, a small ultrasonic observation device is mounted on the distal end of the endoscope, a lesion outside the bronchial wall is confirmed by ultrasound from the inside of the bronchus, and a tissue is collected while confirming whether a treatment tool for collecting the tissue is in contact with the lesion. However, even in a case in which such an endoscope is used, a positional relationship between the treatment tool and the endoscope is confirmed by using the fluoroscopic image, so that it is difficult to collect the tissue with a complete grasp of the positional relationship.
In order to solve such a problem, a marker made of a material that does not transmit radiation is attached to the distal end of the endoscope, and a position and a posture of the endoscope are grasped by using a marker image included in the fluoroscopic image (for example, refer to JP2010-522597A).
In the method disclosed in JP2010-522597A, although it is easy to grasp the position and the posture of the endoscope in the fluoroscopic image, a relationship between a position of the lesion and the position of the endoscope remains unclear.
The present disclosure has been made in view of the above circumstances, and an object of the present disclosure is to facilitate a grasp of a positional relationship between a distal end of an endoscope and a lesion.
An image processing device according to the present disclosure comprises: at least one processor, in which the processor is configured to: sequentially acquire a plurality of radiation images of a subject having a body cavity into which an ultrasonic endoscope, to which an ultrasonic imaging device and a radiation-impermeable marker are attached, is inserted; sequentially acquire a plurality of two-dimensional ultrasound images corresponding to the plurality of radiation images, which are acquired by the ultrasonic imaging device; recognize a position and a posture of the ultrasonic endoscope in the body cavity based on the marker included in each of the plurality of radiation images; and derive a three-dimensional ultrasound image from the plurality of two-dimensional ultrasound images based on the position and the posture of the ultrasonic endoscope recognized with respect to the plurality of radiation images.
In the image processing device according to the present disclosure, the processor may be configured to: perform registration between the radiation image and the three-dimensional ultrasound image; and superimpose and display the registered three-dimensional ultrasound image on the radiation image.
In addition, in the image processing device according to the present disclosure, the processor may be configured to: extract the body cavity into which the ultrasonic endoscope is inserted from a three-dimensional image of the subject acquired in advance; correct the position and the posture of the ultrasonic endoscope according to a shape of the extracted body cavity; and derive a three-dimensional ultrasound image from the plurality of two-dimensional ultrasound images based on the corrected position and posture.
An image processing method according to the present disclosure comprises: sequentially acquiring a plurality of radiation images of a subject having a body cavity into which an ultrasonic endoscope, to which an ultrasonic imaging device and a radiation-impermeable marker are attached, is inserted; sequentially acquiring a plurality of two-dimensional ultrasound images corresponding to the plurality of radiation images, which are acquired by the ultrasonic imaging device; recognizing a position and a posture of the ultrasonic endoscope in the body cavity based on the marker included in each of the plurality of radiation images; and deriving a three-dimensional ultrasound image from the plurality of two-dimensional ultrasound images based on the position and the posture of the ultrasonic endoscope recognized with respect to the plurality of radiation images.
An image processing program according to the present disclosure causes a computer to execute a process comprising: sequentially acquiring a plurality of radiation images of a subject having a body cavity into which an ultrasonic endoscope, to which an ultrasonic imaging device and a radiation-impermeable marker are attached, is inserted; sequentially acquiring a plurality of two-dimensional ultrasound images corresponding to the plurality of radiation images, which are acquired by the ultrasonic imaging device; recognizing a position and a posture of the ultrasonic endoscope in the body cavity based on the marker included in each of the plurality of radiation images; and deriving a three-dimensional ultrasound image from the plurality of two-dimensional ultrasound images based on the position and the posture of the ultrasonic endoscope recognized with respect to the plurality of radiation images.
According to the present disclosure, it is possible to easily confirm the position of the lesion included in the radiation image by using the three-dimensional ultrasound image.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. First, a configuration of a medical information system to which an image processing device according to a first embodiment is applied will be described.
The computer 1 includes the image processing device according to the first embodiment, and an image processing program of the first embodiment is installed on the computer 1. The computer 1 is installed in a treatment room in which a subject is treated as described below. The computer 1 may be a workstation or a personal computer directly operated by a medical worker who performs a treatment, or may be a server computer connected thereto via a network. The image processing program is stored in a storage device of the server computer connected to the network or in a network storage so as to be accessible from the outside, and is downloaded and installed on the computer 1 used by a doctor in response to a request. Alternatively, the image processing program is distributed by being recorded on a recording medium such as a digital versatile disc (DVD) or a compact disc read only memory (CD-ROM), and is installed on the computer 1 from the recording medium.
The three-dimensional image pick-up device 2 is a device that generates a three-dimensional image representing a treatment target site of a subject H by imaging the site, and is specifically a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a positron emission tomography (PET) device, or the like. The three-dimensional image including a plurality of tomographic images, which is generated by the three-dimensional image pick-up device 2, is transmitted to and stored in the image storage server 4. In addition, in the present embodiment, the treatment target site of the subject H is a lung, and the three-dimensional image pick-up device 2 is the CT device. A CT image including a chest portion of the subject H is acquired in advance as a three-dimensional image by imaging the chest portion of the subject H before a treatment on the subject H as described below, and is stored in the image storage server 4.
The fluoroscopic image pick-up device 3 includes a C-arm 3A, an X-ray source 3B, and an X-ray detector 3C. The X-ray source 3B and the X-ray detector 3C are attached to opposite end parts of the C-arm 3A. In the fluoroscopic image pick-up device 3, the C-arm 3A is configured to be rotatable and movable such that the subject H can be imaged from any direction. As will be described below, the fluoroscopic image pick-up device 3 sequentially acquires X-ray images of the subject H by performing fluoroscopic imaging in which the subject H is continuously irradiated with X-rays at a predetermined frame rate during the treatment on the subject H, and the X-rays transmitted through the subject H are sequentially detected by the X-ray detector 3C. In the following description, the X-ray images that are sequentially acquired will be referred to as fluoroscopic images. The fluoroscopic image is an example of a radiation image according to the present disclosure. In addition, the X-ray is an example of radiation according to the present disclosure.
The image storage server 4 is a computer that stores and manages various types of data, and comprises a large-capacity external storage device and database management software. The image storage server 4 communicates with another device via the wired or wireless network 5 and transmits and receives image data and the like. Specifically, various types of data including image data of the three-dimensional image acquired by the three-dimensional image pick-up device 2, the fluoroscopic image acquired by the fluoroscopic image pick-up device 3, and an ultrasound image acquired by an ultrasonic endoscope device 6 which will be described below are acquired via the network, and managed by being stored in a recording medium such as a large-capacity external storage device. A storage format of the image data and the communication between the respective devices via the network 5 are based on a protocol such as digital imaging and communication in medicine (DICOM).
In the present embodiment, it is assumed that a biopsy treatment is performed in which, while fluoroscopic imaging of the subject H is performed, a part of a lesion such as a pulmonary nodule existing in the lung of the subject H is collected to examine the presence or absence of a disease in detail. For this reason, the fluoroscopic image pick-up device 3 is disposed in a treatment room for performing a biopsy. In addition, the ultrasonic endoscope device 6 is installed in the treatment room. The ultrasonic endoscope device 6 comprises an endoscope 7, to whose distal end an ultrasound probe and a treatment tool such as a puncture needle are attached.
In the present embodiment, in order to perform a biopsy of the lesion, an operator inserts the endoscope 7 into the bronchus of the subject H and picks up fluoroscopic images of the subject H with the fluoroscopic image pick-up device 3. While the picked-up fluoroscopic images are displayed in real time, the operator confirms a distal end position of the endoscope 7 in the subject H in the fluoroscopic image and moves the distal end of the endoscope 7 to a target position of the lesion.
Here, a lung lesion such as a pulmonary nodule occurs outside the bronchus rather than inside the bronchus. Therefore, after moving the distal end of the endoscope 7 to the target position, the operator picks up an ultrasound image from the inner surface of the bronchus toward the outside with the ultrasound probe, displays the ultrasound image, and performs a treatment of collecting a part of the lesion by using the treatment tool while confirming a position of the lesion in the ultrasound image.
In this case, a position and a posture of the distal end of the endoscope 7 can be recognized by an appearance of the marker 8 attached to the distal end of the endoscope 7 in the fluoroscopic image. Regarding the posture, in a case in which three axes are spatially set as shown in
Therefore, in a case of picking up an ultrasound image, the operator can determine, from the position and the shape of the marker 8 included in the fluoroscopic image, the position and the posture of the distal end of the endoscope 7 in a state in which the lesion is included in the ultrasound image, and can reliably collect the lesion by making the treatment tool reach the lesion while maintaining the position of the distal end.
On the other hand, in a case in which an ultrasonic endoscope on which no treatment tool is mounted is used, after the position of the lesion is confirmed, an endoscope on which the treatment tool is mounted is inserted into the subject to collect the lesion tissue. In this case, in a case in which the same marker 8 is also attached to the endoscope on which the treatment tool is mounted, the operator can easily recall the position of the lesion by relying on the marker 8 included in the fluoroscopic image, so that the endoscope on which the treatment tool is mounted can be inserted to the position of the lesion and the tissue of the lesion can be reliably collected.
Next, the image processing device according to the first embodiment will be described.
The storage 13 is realized by, for example, a hard disk drive (HDD), a solid state drive (SSD), a flash memory, and the like. An image processing program 12 is stored in the storage 13 as a storage medium. The CPU 11 reads out the image processing program 12 from the storage 13, expands the image processing program 12 in the memory 16, and executes the expanded image processing program 12.
Next, a functional configuration of the image processing device according to the first embodiment will be described.
The image acquisition unit 21 sequentially acquires, at a predetermined frame rate, a plurality of fluoroscopic images T0 acquired by the fluoroscopic image pick-up device 3 during the treatment of the subject H. In addition, the image acquisition unit 21 sequentially acquires, at a predetermined frame rate, a plurality of ultrasound images U0 corresponding to the plurality of fluoroscopic images T0, which are acquired by the ultrasonic endoscope device 6. The ultrasound image acquired by the ultrasonic endoscope device 6 is an example of the two-dimensional ultrasound image of the present disclosure. In the following description, the ultrasound image means a two-dimensional ultrasound image unless otherwise noted.
The recognition unit 22 recognizes the position and the posture of the endoscope 7 in the bronchus based on an image of the marker 8 (hereinafter, referred to as a marker image) included in the fluoroscopic image T0. Since the marker 8 is radiation-impermeable, the marker image appears as a region of high brightness (low density) in the fluoroscopic image T0. Therefore, the marker image can be detected from the fluoroscopic image T0 by using threshold processing, a trained model, or the like. Here, based on the annular marker 8C as shown in
The recognition unit 22 sets one of the fluoroscopic images T0 that are sequentially acquired as a reference fluoroscopic image Tb, detects the marker image from the reference fluoroscopic image Tb, and recognizes a position and a posture of the marker image. The position and the posture of the marker image in the reference fluoroscopic image Tb are referred to as a reference position and a reference posture. The reference position need only be specified, for example, by the operator using the input device 15 to designate a first branch position of the bronchus, a position near the lesion, or the like.
After acquiring the reference fluoroscopic image Tb, the recognition unit 22 recognizes the position and the posture of the marker image in the fluoroscopic images T0 that are sequentially acquired. Thus, the position and the posture of the endoscope 7 with reference to the reference position are sequentially recognized in the fluoroscopic images T0. The recognition unit 22 may recognize the position and the posture of the marker image by using the chess board marker 8B as an auxiliary in addition to the annular marker 8C.
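As a purely illustrative sketch of the threshold-based marker detection described above, the following Python code finds the marker image as the largest high-brightness connected region in a normalized fluoroscopic frame. The function name, the array interface, and the threshold value are assumptions for illustration, not part of the disclosure.

```python
# Illustrative sketch of threshold-based marker detection. The threshold
# value and the array interface are assumptions.
import numpy as np
from scipy import ndimage

def detect_marker(fluoro_frame: np.ndarray, threshold: float = 0.85):
    """Return the centroid (row, col) of the largest bright region."""
    # Normalize the frame to [0, 1] so that a single threshold applies.
    lo, hi = fluoro_frame.min(), fluoro_frame.max()
    f = (fluoro_frame - lo) / (hi - lo + 1e-8)
    mask = f > threshold                       # candidate marker pixels
    labels, n = ndimage.label(mask)            # connected components
    if n == 0:
        return None                            # no marker found
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    marker_label = int(np.argmax(sizes)) + 1   # largest component = marker
    return ndimage.center_of_mass(mask, labels, marker_label)
```

In practice, the posture would additionally be estimated from the detected shape (for example, the ellipse into which the annular marker 8C is projected), and a trained model could replace the thresholding.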
The derivation unit 23 derives a three-dimensional ultrasound image UV0 from the plurality of ultrasound images U0 based on the position and the posture of the endoscope 7 recognized for the plurality of fluoroscopic images T0.
As shown in
Then, the derivation unit 23 derives a three-dimensional ultrasound image UV12, as shown in
The derivation unit 23 derives a three-dimensional ultrasound image UV0 by repeating the above-described processing for the ultrasound images whose acquisition times are adjacent to each other.
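A minimal sketch of this derivation is shown below: each two-dimensional ultrasound image is scattered into a common volume according to the rigid pose (rotation and translation) recognized for the corresponding fluoroscopic frame, and overlapping contributions are averaged. The voxel spacing, the pose format, and all names are assumptions, and the interpolation between temporally adjacent slices described above is omitted for brevity.

```python
# Illustrative sketch: build the volume UV0 by scattering each 2D slice U0
# into volume coordinates using the recognized rigid pose. The pose format
# and voxel spacing are assumptions.
import numpy as np

def accumulate_slice(volume, counts, slice_2d, rotation, translation):
    """Scatter one 2D ultrasound slice into the 3D volume using a rigid pose."""
    h, w = slice_2d.shape
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Slice pixels are assumed to lie on the z = 0 plane of the probe.
    pts = np.stack([rows.ravel(), cols.ravel(), np.zeros(h * w)], axis=1)
    pts = pts @ rotation.T + translation       # probe -> volume coordinates
    idx = np.round(pts).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
    idx, vals = idx[inside], slice_2d.ravel()[inside]
    np.add.at(volume, (idx[:, 0], idx[:, 1], idx[:, 2]), vals)
    np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return volume, counts

# After all slices have been accumulated, average overlapping contributions:
#     volume[counts > 0] /= counts[counts > 0]
```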
The registration unit 24 performs registration between the three-dimensional ultrasound image UV0 derived by the derivation unit 23 and the fluoroscopic image T0. To this end, the registration unit 24 projects the three-dimensional ultrasound image UV0, which is derived from the ultrasound images U0 acquired so far, in an imaging direction of the latest fluoroscopic image T0 to obtain a two-dimensional projection ultrasound image UT0. As a projection method, any projection method such as maximum value projection or minimum value projection can be used.
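For instance, in the simplified case in which one volume axis coincides with the imaging direction, a maximum value projection reduces to a single reduction along that axis, as in the sketch below; an arbitrary imaging direction would first require resampling the volume. The names are assumptions.

```python
# Illustrative sketch: maximum value projection of the volume UV0 along one
# axis, assumed here to coincide with the imaging direction of T0.
import numpy as np

def project_volume(uv0: np.ndarray, axis: int = 0) -> np.ndarray:
    """Obtain the two-dimensional projection ultrasound image UT0."""
    # Use uv0.min(axis=axis) instead for a minimum value projection.
    return uv0.max(axis=axis)
```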
Then, the registration unit 24 performs registration between the two-dimensional projection ultrasound image UT0 and the fluoroscopic image T0. For the registration, any method such as rigid body registration or non-rigid body registration can be used.
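As one hedged example of such registration, the sketch below optimizes a translation-only rigid alignment between UT0 and T0 by maximizing normalized cross-correlation; a practical system would typically also optimize rotation, or use a non-rigid method. The similarity measure and optimizer are assumptions.

```python
# Illustrative sketch: translation-only rigid registration of UT0 onto T0,
# scored by normalized cross-correlation. The measure and optimizer are
# assumptions; images are assumed to have the same shape.
import numpy as np
from scipy import ndimage, optimize

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two same-sized images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def register_translation(ut0: np.ndarray, t0: np.ndarray) -> np.ndarray:
    """Find the (dy, dx) shift of UT0 that best matches T0."""
    cost = lambda s: -ncc(ndimage.shift(ut0, s, order=1), t0)
    result = optimize.minimize(cost, x0=[0.0, 0.0], method="Powell")
    return result.x                            # optimal (dy, dx)
```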
The display control unit 25 superimposes the registered two-dimensional projection ultrasound image UT0 on the fluoroscopic image T0 and displays the superimposed image on the display 14.
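The superimposition itself can be as simple as alpha blending of the two intensity-normalized images, as in the following sketch; the blending weight is an assumption.

```python
# Illustrative sketch: superimpose the registered projection image UT0 on
# the fluoroscopic image T0 by alpha blending. The weight is an assumption.
import numpy as np

def superimpose(t0: np.ndarray, ut0_registered: np.ndarray,
                alpha: float = 0.4) -> np.ndarray:
    """Blend the registered projection into the fluoroscopic frame."""
    t = t0.astype(float) / (t0.max() + 1e-8)
    u = ut0_registered.astype(float) / (ut0_registered.max() + 1e-8)
    return (1.0 - alpha) * t + alpha * u       # composite image for display
```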
Next, a process performed in the first embodiment will be described.
Then, the registration unit 24 performs registration between the three-dimensional ultrasound image UV0 and the latest fluoroscopic image T0 (step ST4), the display control unit 25 superimposes and displays, on the fluoroscopic image T0, the registered three-dimensional ultrasound image UV0, that is, the two-dimensional projection ultrasound image UT0 (step ST5), and the process returns to step ST1.
As described above, in the present embodiment, the three-dimensional ultrasound image UV0 is derived from the plurality of ultrasound images U0 based on the position and the posture of the endoscope 7 recognized for the plurality of fluoroscopic images T0. By using such a three-dimensional ultrasound image UV0, the position of the lesion included in the fluoroscopic image T0 can be easily confirmed.
In particular, by superimposing and displaying the three-dimensional ultrasound image UV0 on the fluoroscopic image T0, a positional relationship between the distal end of the endoscope 7 included in the fluoroscopic image T0 and the lesion included in the three-dimensional ultrasound image UV0 can be easily grasped. Therefore, in a case in which the tissue of the lesion is collected for a biopsy, an accuracy of collecting the tissue from the lesion can be improved based on the positional relationship between the distal end of the endoscope 7 included in the fluoroscopic image T0 and the lesion included in the three-dimensional ultrasound image UV0.
Next, a second embodiment of the present disclosure will be described.
In the second embodiment, the image acquisition unit 21 acquires a three-dimensional image V0 of the subject H from the image storage server 4 in response to an instruction given by the operator via the input device 15 before a treatment.
The extraction unit 26 extracts a body cavity into which the endoscope 7 is inserted from the three-dimensional image V0. In the second embodiment, since the endoscope 7 is inserted into the bronchus, the extraction unit 26 extracts the bronchus from the three-dimensional image V0. To this end, the extraction unit 26 first extracts a lung region from the three-dimensional image V0. As a method of extracting the lung region, any method can be used, such as threshold processing based on a histogram of the signal values of the pixels in the three-dimensional image V0, or a region growing method based on a seed point indicating the lung. Note that a discriminator which has been subjected to machine learning to extract the lung region may be used.
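As a hedged illustration of such threshold-based extraction, the sketch below assumes that V0 is a CT volume in Hounsfield units and keeps the largest air-like connected components; the threshold value and the component selection are assumptions, and the background air would need to be excluded in a practical implementation.

```python
# Illustrative sketch: threshold-based lung extraction from a CT volume V0
# in Hounsfield units. The threshold and component selection are assumptions.
import numpy as np
from scipy import ndimage

def extract_lung_region(v0_hu: np.ndarray) -> np.ndarray:
    """Binary mask of the lung region."""
    air_like = v0_hu < -400                    # lung parenchyma is air-like
    labels, n = ndimage.label(air_like)
    sizes = ndimage.sum(air_like, labels, range(1, n + 1))
    # Keep the two largest components (left and right lungs). A practical
    # implementation would first remove the background air component, for
    # example by discarding components that touch the volume border.
    keep = np.argsort(sizes)[-2:] + 1
    return np.isin(labels, keep)
```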
Then, the extraction unit 26 extracts a graph structure of a bronchial region included in the lung region extracted from the three-dimensional image V0, as a three-dimensional bronchial region. As a method of extracting the bronchial region, for example, the method disclosed in JP2010-220742A can be used, in which the graph structure of the bronchus is extracted using a Hessian matrix, the extracted graph structure is classified into a starting point, an end point, a branch point, and sides, and the starting point, the end point, and the branch point are connected by the sides to extract the bronchial region. Note that the method of extracting the bronchial region is not limited thereto.
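Once a centerline graph of the bronchus is available, the classification into end points and branch points mentioned above reduces to inspecting node degrees, as in this minimal sketch; the adjacency-dictionary representation is an assumption, and the Hessian-based centerline extraction itself is not reproduced here.

```python
# Illustrative sketch: classify nodes of a bronchial centerline graph by
# degree. The adjacency-dictionary representation is an assumption.
def classify_nodes(adjacency: dict[int, set[int]]):
    """Split graph nodes into end points and branch points."""
    end_points = [n for n, nbrs in adjacency.items() if len(nbrs) == 1]
    branch_points = [n for n, nbrs in adjacency.items() if len(nbrs) >= 3]
    return end_points, branch_points
```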
The correction unit 27 corrects the position and the posture of the endoscope 7 according to a shape of the extracted bronchus. To this end, the correction unit 27 performs a process of matching a coordinate system of the three-dimensional image V0 with a coordinate system of a distal end position of the endoscope 7. For example, the two coordinate systems are matched by performing coordinate transformation of the three-dimensional coordinates of the distal end position of the endoscope 7 such that the coordinate system of the endoscope 7 matches the coordinate system of the three-dimensional image V0.
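Such a coordinate transformation can be expressed as a single 4x4 homogeneous matrix applied to the tip position, as sketched below; how the matrix is obtained (for example, from the marker-based recognition) is outside this fragment, and all names are assumptions.

```python
# Illustrative sketch: map the endoscope tip position into the coordinate
# system of V0 with a 4x4 homogeneous transformation. How endo_to_v0 is
# obtained is an assumption outside this fragment.
import numpy as np

def to_volume_coords(tip_xyz: np.ndarray, endo_to_v0: np.ndarray) -> np.ndarray:
    """Transform a 3D tip position into the coordinate system of V0."""
    p = np.append(tip_xyz, 1.0)                # homogeneous coordinate
    return (endo_to_v0 @ p)[:3]
```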
Then, the correction unit 27 determines whether or not the distal end position of the endoscope 7 is in the bronchus. In a case in which the distal end position of the endoscope 7 is not in the bronchus, the correction unit 27 corrects the recognized position and posture of the endoscope 7 such that the distal end position of the endoscope 7 is located in the bronchus. On the other hand, in a case in which the distal end position of the endoscope 7 is in the bronchus, the correction unit 27 does not correct the position and the posture of the endoscope 7.
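One simple realization of this correction, sketched below under stated assumptions, snaps an outside tip position to the nearest voxel of the extracted bronchial mask using a Euclidean distance transform; the voxel-index interface is an assumption.

```python
# Illustrative sketch: snap the tip to the nearest bronchial voxel when it
# falls outside the extracted bronchial region. Interfaces are assumptions.
import numpy as np
from scipy import ndimage

def correct_tip(tip_idx, bronchus_mask: np.ndarray):
    """Return the tip voxel index, corrected into the bronchus if needed."""
    if bronchus_mask[tuple(tip_idx)]:
        return np.asarray(tip_idx)             # already inside; no correction
    # For every outside voxel, get the index of the nearest bronchial voxel.
    nearest = ndimage.distance_transform_edt(
        ~bronchus_mask, return_distances=False, return_indices=True)
    return np.array([axis[tuple(tip_idx)] for axis in nearest])
```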
In a case in which the position and the posture of the endoscope 7 are corrected, the derivation unit 23 derives the three-dimensional ultrasound image UV0 based on the corrected position and posture of the endoscope. In a case in which the position and the posture of the endoscope 7 are not corrected, the derivation unit 23 derives the three-dimensional ultrasound image UV0 based on the position and the posture of the endoscope recognized by the recognition unit 22.
Next, a process performed in the second embodiment will be described.
In a case in which negative determination is made in step ST15, the correction unit 27 corrects the recognized position and posture of the endoscope 7 (step ST16). Subsequently, the derivation unit 23 derives the three-dimensional ultrasound image UV0 from the plurality of ultrasound images U0 based on the position and the posture of the endoscope 7 corrected for the plurality of fluoroscopic images T0 (step ST17).
In a case in which positive determination is made in step ST15, the process proceeds to step ST17, and the derivation unit 23 derives the three-dimensional ultrasound image UV0 from the plurality of ultrasound images U0 based on the position and the posture of the endoscope 7 recognized for the plurality of fluoroscopic images T0.
Then, the registration unit 24 performs registration between the three-dimensional ultrasound image UV0 and the latest fluoroscopic image T0 (step ST18), the display control unit 25 superimposes and displays, on the fluoroscopic image T0, the registered three-dimensional ultrasound image UV0, that is, the two-dimensional projection ultrasound image UT0 (step ST19), and the process returns to step ST11.
As described above, in the second embodiment, the position and the posture of the endoscope are corrected in a case in which the position of the endoscope is not in the bronchus, so that an accuracy of recognition of the position of the endoscope can be improved. Therefore, the positional relationship between the distal end of the endoscope 7 included in the fluoroscopic image T0 and the lesion included in the three-dimensional ultrasound image UV0 can be accurately grasped, and as a result, the accuracy of collecting the tissue from the lesion can be improved.
In each of the above-described embodiments, as shown in
In addition, in each of the above-described embodiments, the processing in a case in which the lesion of the lung is collected by using a bronchial endoscope is described, but the present disclosure is not limited thereto. For example, the image processing device according to the present embodiment can also be applied in a case in which an ultrasonic endoscope is inserted into a digestive organ such as a stomach to perform a biopsy of a tissue of the pancreas, the liver, or the like.
In addition, in each of the above-described embodiments, for example, as a hardware structure of a processing unit that executes various types of processing such as the image acquisition unit 21, the recognition unit 22, the derivation unit 23, the registration unit 24, the display control unit 25, the extraction unit 26, and the correction unit 27, various types of processors shown below can be used. The various types of processors include, as described above, a CPU which is a general-purpose processor that executes software (program) to function as various types of processing units, as well as a programmable logic device (PLD) which is a processor having a circuit configuration that can be changed after manufacturing such as a field programmable gate array (FPGA), a dedicated electrical circuit which is a processor having a circuit configuration exclusively designed to execute specific processing such as an application specific integrated circuit (ASIC), and the like.
One processing unit may be configured of one of the various types of processors, or a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, or a combination of a CPU and an FPGA). Further, a plurality of processing units may be configured of one processor.
As an example of configuring a plurality of processing units with one processor, first, there is a form in which, as typified by computers such as a client and a server, one processor is configured by combining one or more CPUs and software, and the processor functions as a plurality of processing units. Second, there is a form in which, as typified by a system on chip (SoC), a processor that implements the functions of an entire system including a plurality of processing units with one integrated circuit (IC) chip is used. As described above, the various types of processing units are configured using one or more of the various types of processors as a hardware structure.
Furthermore, as the hardware structure of the various types of processors, more specifically, an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined can be used.