The present disclosure relates to an ultrasound observation apparatus for observing a tissue to be observed, using ultrasound waves, a method of operating an ultrasound observation apparatus, and an ultrasound observation apparatus operation program.
Needle biopsy, which is performed using a puncture needle when diagnosing a tissue to be observed using ultrasound waves, can collect tissue from only a very narrow range in a single puncture motion, and the amount of tissue that can be collected is small. To collect tissue from a wide range, the needle biopsy may be performed a plurality of times at different positions on the same cross section. Further, to increase the amount of tissue collected, the puncture needle may be reciprocated a plurality of times at the same position.
When performing the needle biopsy a plurality of times, it is not easy to grasp the position where the needle biopsy has been performed after the puncture needle has been taken out from the tissue to be observed. To solve this problem, conventionally, a technology of storing a position that the puncture needle has reached within a tissue and displaying the position on a monitor is disclosed (see, for example, JP 2009-189500 A).
An ultrasound observation apparatus according to one aspect of the present disclosure generates an ultrasound image based on an ultrasound echo acquired by an ultrasound probe provided with an ultrasound transducer that transmits an ultrasound wave to an observation target and receives an ultrasound wave reflected at the observation target, and includes: a puncture needle detection unit configured to detect an image of a puncture needle displayed in the ultrasound image; a motion extraction unit configured to extract a linear motion at a point of the puncture needle based on a history of the image of the puncture needle in the ultrasound image; and a composite image generation unit configured to generate a composite image by generating loci of the linear motions extracted by the motion extraction unit and superimposing the loci on the ultrasound image, the composite image generation unit generating the composite image by using the loci of the linear motions having passed through a common section a plurality of times among the loci of the linear motions extracted by the motion extraction unit.
Moreover, an ultrasound observation apparatus according to one aspect of the present disclosure generates an ultrasound image based on an ultrasound echo acquired by an ultrasound probe provided with an ultrasound transducer that transmits an ultrasound wave to an observation target and receives an ultrasound wave reflected at the observation target, and includes: a puncture needle detection unit configured to detect an image of a puncture needle displayed in the ultrasound image; a motion extraction unit configured to extract a linear motion at a point of the puncture needle based on a history of the image of the puncture needle in the ultrasound image; and a composite image generation unit configured to generate a composite image by generating loci of the linear motions extracted by the motion extraction unit and superimposing the loci on the ultrasound image, the composite image generation unit generating a composite image displaying loci of the linear motions during time from when the puncture needle detection unit detects the image of the puncture needle to when the puncture needle detection unit stops detecting the image of the puncture needle, and loci of the linear motions during time from when the puncture needle detection unit detects the image of the puncture needle again to when the puncture needle detection unit stops detecting the image of the puncture needle, in different display forms from each other.
The above and other features, advantages and technical and industrial significance of this disclosure will be better understood by reading the following detailed description of presently preferred embodiments of the disclosure, when considered in connection with the accompanying drawings.
Hereinafter, embodiments will be described with reference to the accompanying drawings.
The ultrasound endoscope 2 transmits ultrasound waves to a subject that is an observation target and receives the ultrasound waves reflected at the subject. The ultrasound endoscope 2 includes a tubular insertion portion 21 that is inserted into the subject, an operating unit 22 that is provided to a proximal end portion of the insertion portion 21, is held by a user, and receives an operation input from the user, a universal cord 23 extending from the operating unit 22 and including a plurality of signal cables, an optical fiber that transmits illumination light generated by the light source device 6, and the like, and a connector 24 provided on an end portion of the universal cord 23 on an opposite side of the operating unit 22. For example, the ultrasound endoscope 2 observes the digestive tract (esophagus, stomach, duodenum, or large intestine) and the respiratory tract (trachea or bronchus) of the subject as the observation target, and various types are known depending on the observation target.
The insertion portion 21 includes a distal end portion 211 in which an ultrasound transducer is provided, a rigid portion 212 externally covered with a rigid member connected to a proximal end side of the distal end portion 211, a bend portion 213 provided to a proximal end side of the rigid portion 212 and bendable according to the operation input received by the operating unit 22, and a flexible tube portion 214 provided to a proximal end side of the bend portion 213 and externally covered with a member having flexibility. A treatment tool channel 215 that is an insertion passage into which a treatment tool including a puncture needle is inserted is formed inside the insertion portion 21 (the treatment tool channel is illustrated by the broken line in
The configuration of the ultrasound diagnosis system 1 is continuously described with reference to
The ultrasound observation apparatus 3 transmits and receives an electrical signal to and from the ultrasound endoscope 2 via the ultrasound cable. The ultrasound observation apparatus 3 applies predetermined processing to an electrical echo signal received from the ultrasound endoscope 2 to generate an ultrasound image or the like. Details of the function and configuration of the ultrasound observation apparatus 3 will be described below with reference to the block diagram of
The camera control unit 4 applies predetermined processing to an image signal received from the ultrasound endoscope 2 through the electrical cable to generate an endoscope image.
The display device 5 is configured using liquid crystal, organic electro luminescence (EL), or the like, and receives data of the ultrasound image generated by the ultrasound observation apparatus 3, the endoscope image generated by the camera control unit 4, and the like and displays the images.
The light source device 6 generates the illumination light for illuminating an inside of the subject and supplies the illumination light to the ultrasound endoscope 2 via the light guide. The light source device 6 also incorporates a pump for sending water and air.
The transmitting and receiving unit 31 transmits a pulse transmission drive wave signal to the ultrasound transducer 211a based on a predetermined waveform and transmission timing. Further, the transmitting and receiving unit 31 receives an electrical echo signal from the ultrasound transducer 211a. The transmitting and receiving unit 31 also has functions to transmit various control signals outputted by the control unit 37 to the ultrasound endoscope 2, and receive various types of information including an ID for identification from the ultrasound endoscope 2 and transmit the information to the control unit 37.
The signal processing unit 32 applies known processing such as band-pass filtering, envelope detection, and logarithmic conversion to the echo signal to generate digital ultrasound image reception data, and outputs the generated data. The signal processing unit 32 is realized using a general-purpose processor such as a central processing unit (CPU), or a dedicated integrated circuit that executes a specific function, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
The puncture needle detection unit 34 detects the puncture needle displayed in the ultrasound image by image processing, and writes and stores coordinates of the position of the point of the detected puncture needle in the ultrasound image together with information of detection time to a puncture needle information storage unit 381 included in the storage unit 38. The puncture needle detection unit 34 detects, for example, a region having a large luminance value as the puncture needle by analyzing the luminance of pixels of the ultrasound image. Note that the puncture needle detection unit 34 may detect the puncture needle by performing pattern matching using the ultrasound image of the puncture needle stored in advance by the storage unit 38.
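The luminance-based detection described above can be sketched as follows. This is a minimal illustration only; the function name, the threshold value, and the choice of the deepest bright pixel as the tip are assumptions for the sketch and are not specified in the disclosure:

```python
import numpy as np

def detect_needle_tip(frame, threshold=200):
    """Return (row, col) of an estimated needle tip in a B-mode frame,
    or None when no needle-like region is found.

    The puncture needle appears as a region of high-luminance pixels.
    Here the tip is taken as the bright pixel farthest along the
    insertion direction (largest row index); the threshold of 200 is
    an illustrative placeholder, not a value from the disclosure.
    """
    bright = np.argwhere(frame >= threshold)  # coordinates of bright pixels
    if bright.size == 0:
        return None
    # Tip = bright pixel deepest in the image (maximum row index).
    tip = bright[np.argmax(bright[:, 0])]
    return int(tip[0]), int(tip[1])
```

As noted above, pattern matching against a stored needle template could replace the simple thresholding used here.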
The motion extraction unit 35 extracts a motion that forms a linear locus in the ultrasound image based on the puncture needle information stored in the puncture needle information storage unit 381. At this time, the motion extraction unit 35 extracts movement of the puncture needle from the start to the end in the same direction as a single motion. By having the motion extraction unit 35 extract only motions that form a linear locus in this way, a locus produced when the puncture direction of the puncture needle 100 is changed by changing the rising angle of the rising base 212e is deleted, for example. Note that "linear" referred to here includes not only the case where the point of the puncture needle moves on one straight line in the ultrasound image but also the case where the point of the puncture needle moves within a long and narrow rectangular region having a preset small width in a direction orthogonal to the moving direction along the straight line.
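The two conditions above — monotonic advance in one direction and confinement to a narrow rectangle about the line — can be expressed as a small test over the stored tip-position history. The function name and the width value are illustrative assumptions:

```python
import numpy as np

def is_linear_motion(points, max_width=3.0):
    """Decide whether a history of tip positions forms a single linear
    motion: all points lie within a narrow rectangle (perpendicular
    deviation from the start-to-end line at most max_width) and the
    points advance monotonically along that line.

    max_width is an illustrative placeholder for the preset small
    width mentioned in the text.
    """
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return False
    d = pts[-1] - pts[0]
    norm = np.linalg.norm(d)
    if norm == 0:
        return False
    u = d / norm                             # unit vector along the motion
    rel = pts - pts[0]
    along = rel @ u                          # progress along the line
    across = rel @ np.array([-u[1], u[0]])   # perpendicular deviation
    monotonic = np.all(np.diff(along) >= 0)  # start-to-end, same direction
    narrow = np.all(np.abs(across) <= max_width)
    return bool(monotonic and narrow)
```

A history that reverses direction (as when the rising angle is changed to re-aim the needle) fails the monotonicity check and would be split into separate motions or discarded.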
The image generation unit 36 includes an ultrasound image generation unit 361 that generates ultrasound image data based on the reception data, and a composite image generation unit 362 that generates a composite image by generating the locus of the linear motion of the puncture needle extracted by the motion extraction unit 35 and superimposing the locus on the ultrasound image.
The ultrasound image generated by the ultrasound image generation unit 361 is a B-mode image obtained by converting amplitude into luminance.
The composite image generation unit 362 generates a composite image by superimposing the locus of the linear motion extracted by the motion extraction unit 35 on the ultrasound image.
The composite image generation unit 362 generates the composite image by arranging lines representing the locus of the linear motion extracted by the motion extraction unit 35 while adjusting a display area of the image of the puncture needle in accordance with change of the position of the observation target, of each frame, displayed in the B-mode image.
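The adjustment that keeps the locus aligned with the moving observation target can be sketched as a simple translation of the stored locus points. This is a sketch under the assumption that a landmark on the observation target is tracked between frames; a practical alignment may also require rotation or scaling:

```python
def corrected_locus(locus, ref_origin, cur_origin):
    """Shift stored locus points so that their position relative to the
    observation target is preserved when the target moves between
    frames.

    ref_origin: a landmark position (e.g. a tracked tissue feature) in
                the frame where the locus was recorded (assumption).
    cur_origin: the same landmark in the current frame.
    Points are (row, col) tuples; translation-only correction.
    """
    dr = cur_origin[0] - ref_origin[0]
    dc = cur_origin[1] - ref_origin[1]
    return [(r + dr, c + dc) for r, c in locus]
```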
In a case where the puncture needle detection unit 34 again detects the image of the puncture needle after the composite image generation unit 362 generates the composite image, the composite image generation unit 362 stops generation of the composite image, and the B-mode image is displayed on the display device 5. Alternatively, the composite image generation unit 362 may generate a composite image displaying the locus in a form that does not disturb the visibility of the puncture needle, such as displaying the locus by a broken line or in a less visible color.
In a case where a freeze button of the operating unit 22 receives an operation input after the composite image generation unit 362 generates the composite image, the composite image generation unit 362 may stop generation of the composite image. In this case, when canceling the freeze upon receipt of an operation input again by the freeze button, the composite image generation unit 362 may resume the generation of the composite image.
The control unit 37 includes a display controller 371 that controls display of the display device 5. The display controller 371 causes the display device 5 to display the various images generated by the image generation unit 36.
The control unit 37 is realized using a general-purpose processor such as a CPU having arithmetic and control functions, or a dedicated integrated circuit such as ASIC or FPGA. In the case where the control unit 37 is realized by the general-purpose processor or the FPGA, the control unit 37 reads various programs and data stored in the storage unit 38 from the storage unit 38, and executes various types of arithmetic processing related to the operation of the ultrasound observation apparatus 3 to collectively control the ultrasound observation apparatus 3. In the case where the control unit 37 is configured from the ASIC, the control unit 37 may independently execute various types of processing, or may execute the various types of processing using various data stored in the storage unit 38. In the first embodiment, the control unit 37 and part of the signal processing unit 32, the puncture needle detection unit 34, the motion extraction unit 35, and the image generation unit 36 can be configured from a common general-purpose processor, a dedicated integrated circuit, or the like.
The storage unit 38 includes the puncture needle information storage unit 381 that stores information of the point position of the image of the puncture needle detected by the puncture needle detection unit 34 together with the information of the detection time of the image of the puncture needle and the like, as puncture needle information. The puncture needle information stored in the puncture needle information storage unit 381 is deleted under control of the control unit 37 when the puncture needle detection unit 34 starts detecting the image of the puncture needle again after having stopped detecting the image of the puncture needle.
The storage unit 38 stores various programs including an operation program for executing an operation method of the ultrasound observation apparatus 3. The various programs including an operation program can also be recorded on a computer readable recording medium such as a hard disk, a flash memory, a CD-ROM, a DVD-ROM, or a flexible disk and widely distributed. Note that the above-described various programs can also be acquired by being downloaded via a communication network. The communication network referred to here is realized by, for example, an existing public line network, a local area network (LAN), or a wide area network (WAN), and may be a wired or wireless network.
The storage unit 38 is realized using a read only memory (ROM) in which the various programs and the like are installed in advance, a random access memory (RAM) in which arithmetic parameters and data of processing, and the like are stored, and the like.
First, the transmitting and receiving unit 31 receives an echo signal that is a measurement result of an observation target by the ultrasound transducer 211a from the ultrasound endoscope 2 (Step S1).
Next, the transmitting and receiving unit 31 applies predetermined reception processing to the echo signal received from the ultrasound transducer 211a (Step S2). To be specific, the transmitting and receiving unit 31 amplifies (STC correction) the echo signal and then applies processing such as filtering or A/D conversion to the amplified signal.
After that, the ultrasound image generation unit 361 generates a B-mode image, using the echo signal processed by the transmitting and receiving unit 31, and outputs data of the B-mode image to the display device 5 (Step S3).
The puncture needle detection unit 34 performs processing of detecting the image of the puncture needle displayed in the B-mode image, using the generated B-mode image (Step S4). When the puncture needle detection unit 34 detects the image of the puncture needle in the B mode image (Step S4: Yes), and when the composite image generation unit 362 has generated a composite image in a previous frame (Step S5: Yes), the ultrasound observation apparatus 3 proceeds to Step S6.
In Step S6, the display controller 371 deletes (non-displays) the locus or changes the display method. The puncture needle detection unit 34 then writes and stores the information of the point position of the image of the detected puncture needle together with the information of detection time and the like to the puncture needle information storage unit 381 (Step S7). The information of the point position of the image of the puncture needle is represented by coordinates in the B-mode image, for example. Step S6 corresponds to a situation in which the puncture needle is newly inserted while the display device 5 is displaying the composite image. In Step S6, the locus of the previous puncture needle may be deleted, or the display method may be changed so that the locus can be identified as the previous locus.
Next, the display controller 371 performs control to cause the display device 5 to display the B-mode image (Step S8).
After that, when the input unit 33 receives an input of a signal instructing termination (Step S9: Yes), the ultrasound observation apparatus 3 terminates the series of processing. On the other hand, when the input unit 33 does not receive the input of a signal instructing termination (Step S9: No), the ultrasound observation apparatus 3 returns to Step S1.
When the puncture needle detection unit 34 detects the image of the puncture needle in the B-mode image (Step S4: Yes), and when the composite image generation unit 362 has not generated the composite image in the previous frame (Step S5: No), the ultrasound observation apparatus 3 proceeds to Step S7.
The case in which the puncture needle detection unit 34 does not detect the image of the puncture needle in the B-mode image (Step S4: No) in Step S4 will be described. In this case, when the composite image generation unit 362 has not generated the composite image in the previous frame (Step S10: No), the ultrasound observation apparatus 3 proceeds to Step S11. This situation corresponds to a situation in which the puncture needle detection unit 34 has never detected the puncture needle, or a situation in which the puncture needle detection unit 34 has continued to detect the image of the puncture needle up to one previous frame and stops detecting the image of the puncture needle at this frame.
In Step S11, when the puncture needle information storage unit 381 stores detection data of the puncture needle (Step S11: Yes), that is, when the puncture needle information storage unit 381 has stored the information of the point position of the image of the puncture needle in a plurality of frames up to the previous frame in succession, the motion extraction unit 35 extracts the linear motion at the point of the puncture needle based on the history of the point position of the image of the puncture needle (Step S12).
In Step S11, when the puncture needle information storage unit 381 does not store the detection data of the puncture needle (Step S11: No), the ultrasound observation apparatus 3 proceeds to Step S8 described above. This situation corresponds to a situation in which the puncture needle detection unit 34 has never detected the puncture needle.
After that, the composite image generation unit 362 generates the locus of the linear motion extracted by the motion extraction unit 35, and superimposes the locus on the B-mode image to generate a composite image (Step S13). At this time, the composite image generation unit 362 generates the composite image by performing correction to hold a relative positional relationship between the locus of the linear motion at the point of the puncture needle and the observation target of the B-mode image.
When the composite image generation unit 362 generates the locus in this frame (Step S14: Yes), the control unit 37 deletes the detection data of the puncture needle stored in the puncture needle information storage unit 381 (Step S15).
Next, the display controller 371 performs control to cause the display device 5 to display the composite image generated by the composite image generation unit 362 (Step S16). The composite image displayed by the display device 5 is, for example, the composite image 102 illustrated in
After Step S16, the ultrasound observation apparatus 3 proceeds to Step S9 described above.
In Step S10, when the composite image generation unit 362 has generated the composite image in the previous frame (Step S10: Yes), the ultrasound observation apparatus 3 proceeds to Step S13. This situation corresponds to a situation in which the undetected state of the image of the puncture needle has continued since at least the previous frame. In this situation, in Step S13, the composite image generation unit 362 generates the composite image using the newly generated B-mode image and the locus generated in the previous frame.
In Step S14, when the composite image generation unit 362 has not generated the locus in this frame (Step S14: No), that is, when the composite image generation unit 362 has generated the locus in a frame prior to the present frame, the ultrasound observation apparatus 3 proceeds to Step S16.
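The branching through Steps S4 to S16 described above can be condensed into per-frame decision logic. The function, the state dictionary, and its key names are illustrative assumptions introduced only to summarize the flow; they do not appear in the disclosure:

```python
def process_frame(tip, state):
    """One-frame decision logic mirroring Steps S4-S16.

    tip:   detected tip position (row, col) or None when no needle is
           detected in this frame (Step S4).
    state: holds the stored tip history, the generated locus, and
           whether a composite image was shown in the previous frame.
    Returns which image to display: 'b_mode' or 'composite'.
    """
    if tip is not None:                        # S4: Yes
        if state['composite_shown']:           # S5: Yes -> S6
            state['locus'] = None              # delete or restyle prior locus
            state['composite_shown'] = False
        state['history'].append(tip)           # S7: store tip position
        return 'b_mode'                        # S8
    # Needle not detected (S4: No)
    if state['composite_shown']:               # S10: Yes -> S13, S16
        return 'composite'                     # reuse locus of prior frame
    if state['history']:                       # S11: Yes -> S12, S13
        state['locus'] = list(state['history'])  # extract motion, build locus
        state['history'].clear()               # S15: delete detection data
        state['composite_shown'] = True
        return 'composite'                     # S16
    return 'b_mode'                            # S11: No -> S8
```

Running successive frames through this sketch reproduces the behavior described above: the B-mode image is shown while the needle is visible, and the composite image with the locus appears once the needle leaves the image.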
According to the first embodiment described above, the composite image is generated by extracting the linear motion at the point of the puncture needle based on the history of the image of the puncture needle in the ultrasound image, and generating the locus of the extracted linear motion and superimposing the locus on the ultrasound image. Therefore, the position where the puncture needle has been moved a plurality of times in the subject can be accurately grasped.
In the first embodiment, when the puncture needle detection unit 34 stops detecting the image of the puncture needle, the composite image generation unit 362 starts generation of the composite image, and after that, when the puncture needle detection unit 34 detects the image of the puncture needle again, the composite image generation unit 362 stops generation of the composite image. Therefore, according to the first embodiment, the user can confirm the history of the first needle biopsy during time from when the puncture needle is taken out once to when the second needle biopsy is performed, and can more accurately grasp the position where a tissue is to be collected in the second needle biopsy.
First Modification
According to the first modification described above, only the loci including an overlapping portion in a plurality of motions are displayed, so a portion having a high possibility of collecting a tissue can be shown. Therefore, a user can more reliably specify a place having a high possibility of collecting a tissue.
Note that, in the first modification, loci up to a predetermined place in descending order of the number of times of overlap may be displayed.
Second Modification
According to the second modification described above, only the loci including an overlapping portion in a plurality of motions are displayed, together with the numbers of times of overlap. Therefore, the user can more easily grasp a portion having a high possibility of collecting a tissue.
Note that, in the second modification, a composite image that displays the colors and line types of the loci in different forms according to the number of times of overlap may be generated instead of displaying the number of times of overlap. Further, in the second modification, loci up to a predetermined place in descending order of the number of times of overlap may be displayed, similarly to the first modification.
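The selection and counting used by the first and second modifications can be sketched as follows. Representing each locus as a set of quantized grid cells is an assumption made for the sketch; the disclosure does not specify a representation:

```python
from collections import Counter

def overlapping_loci(loci):
    """Keep only loci that pass through a section traversed by two or
    more loci, and report the overlap count per section.

    loci: list of sets, each set holding the grid cells (e.g. quantized
          pixel coordinates) covered by one linear locus (assumption).
    Returns (kept_loci, overlap_counts), where overlap_counts maps each
    shared cell to the number of loci passing through it.
    """
    counts = Counter(cell for locus in loci for cell in locus)
    shared = {cell for cell, n in counts.items() if n >= 2}
    kept = [locus for locus in loci if locus & shared]
    return kept, {cell: counts[cell] for cell in shared}
```

The first modification would display only `kept`; the second modification would additionally annotate (or color) each shared cell with its count from `overlap_counts`.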
Third Modification
According to the third modification, in a case where a user wants to puncture another region in the next biopsy motion, the user can easily grasp the region to be newly punctured.
Note that, in a case where a puncture needle detection unit 34 detects an image of a puncture needle again after a composite image generation unit 362 generates a composite image displaying loci, as described in the first embodiment and the first and second modifications, the composite image generation unit 362 may start generation of a composite image with the contour display described in the third modification. In this case, the composite image generation unit 362 may resume generation of the composite image displaying loci when the puncture needle detection unit 34 stops detecting the image of the puncture needle. The composite image generation unit 362 may also start generation of the composite image with the contour display when receiving an operation input of a freeze button of an operating unit 22, and may resume generation of the composite image displaying loci when receiving a re-operation input of the freeze button.
A second embodiment is characterized in displaying a locus of an image of a puncture needle in an ultrasound image substantially in real time. A configuration of an ultrasound observation apparatus according to the second embodiment is similar to that of the ultrasound observation apparatus 3 described in the first embodiment.
Following Step S23, a puncture needle detection unit 34 performs processing of detecting an image of a puncture needle displayed in a B-mode image using the generated B-mode image (Step S24). When the puncture needle detection unit 34 detects the image of the puncture needle in the B-mode image (Step S24: Yes), the puncture needle detection unit 34 writes and stores information of a point position of the image of the detected puncture needle together with information of detection time and the like to a puncture needle information storage unit 381 (Step S25).
After that, when the puncture needle information storage unit 381 stores detection data of the puncture needle (Step S26: Yes), that is, when the puncture needle information storage unit 381 has stored the information of the point position of the image of the puncture needle in a plurality of frames up to the previous frame in succession, a motion extraction unit 35 extracts a linear motion at the point of the puncture needle based on a history of the point position of the image of the puncture needle (Step S27). Meanwhile, when the puncture needle information storage unit 381 does not store the puncture needle information (Step S26: No), the ultrasound observation apparatus 3 proceeds to Step S30 described below.
After Step S27, a composite image generation unit 362 generates a composite image by superimposing the locus of the linear motion extracted by the motion extraction unit 35 on an ultrasound image (Step S28).
Next, a display controller 371 performs control to cause a display device 5 to display the composite image generated by the composite image generation unit 362 (Step S29).
After that, when an input unit 33 receives an input of a signal instructing termination (Step S30: Yes), the ultrasound observation apparatus 3 terminates the series of processing. On the other hand, when the input unit 33 does not receive the input of a signal instructing termination (Step S30: No), the ultrasound observation apparatus 3 returns to Step S21.
A case in which the puncture needle detection unit 34 does not detect the image of the puncture needle in the B-mode image (Step S24: No) will be described. In this case, when the ultrasound observation apparatus 3 has generated the composite image in the previous frame (Step S31: Yes), the control unit 37 deletes the detection data of the puncture needle stored in the puncture needle information storage unit 381 (Step S32).
After that, the display controller 371 performs control to cause the display device 5 to display the B-mode image generated in Step S23 (Step S33). As described above, in the second embodiment, the locus of the puncture needle is not displayed when the image of the puncture needle is not included in the B-mode image. After Step S33, the ultrasound observation apparatus 3 proceeds to Step S30.
In Step S31, when the ultrasound observation apparatus 3 has not generated a composite image in the previous frame (Step S31: No), the ultrasound observation apparatus 3 proceeds to Step S33.
Note that the display controller 371 may cause the display device 5 to display the composite image in Step S33 without performing the processing of Step S32. In that case, when the puncture needle is newly detected, the display controller 371 deletes and non-displays the locus or changes the display method.
According to the second embodiment described above, the composite image is generated by extracting the linear motion at the point of the puncture needle based on the history of the image of the puncture needle in the ultrasound image, and generating the locus of the extracted linear motion and superimposing the locus on the ultrasound image. Therefore, the position where the puncture needle has been moved a plurality of times in the subject can be accurately grasped.
Further, according to the second embodiment, the linear locus of the image of the puncture needle can be grasped substantially in real time. Therefore, a position where a tissue is to be collected next can be specified in one-time needle biopsy.
The embodiments for carrying out the present disclosure have been described. However, the present disclosure should not be limited only by the above-described first and second embodiments. For example, in the first and second embodiments, when the puncture needle detection unit 34 starts detecting the image of the puncture needle again after stopping detecting the image of the puncture needle, the puncture needle information storage unit 381 may retain the information of the point position of the image of the puncture needle stored so far, and may store newly acquired information with identifiable additional information appended. In this case, the composite image generation unit 362 may generate a composite image displaying loci of the puncture needle respectively corresponding to the old and new information, that is, loci of the puncture needle in different needle biopsies, in an identifiable manner.
Further, as an ultrasound probe, an extracorporeal ultrasound probe that irradiates a body surface of a subject with ultrasound waves may be applied. The extracorporeal ultrasound probe is usually used for observing abdominal organs (liver, gall bladder, and bladder), breast (especially mammary gland), and thyroid gland.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the disclosure in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2015-133871 | Jul 2015 | JP | national
This application is a continuation of PCT international application Ser. No. PCT/JP2016/060573, filed on Mar. 30, 2016, which designates the United States and is incorporated herein by reference, and which claims the benefit of priority from Japanese Patent Application No. 2015-133871, filed on Jul. 2, 2015, incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2016/060573 | Mar 2016 | US
Child | 15841582 | | US