The present application claims priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2022-154712, filed on Sep. 28, 2022. The above application is hereby expressly incorporated by reference, in its entirety, into the present application.
The present invention relates to an ultrasound diagnostic apparatus and a control method for an ultrasound diagnostic apparatus that are used to observe a blood vessel.
Conventionally, dialysis, which is a treatment for artificially removing an unnecessary substance from the blood vessel of a subject, has been performed in place of kidney function. In dialysis, since an upper limb of the subject undergoes frequent punctures with a thick needle, various changes, such as stenosis, occlusion, or tortuosity of the blood vessel, often occur in the blood vessel of the subject. In a case where such changes have occurred in the blood vessel of the subject, a puncture position in the blood vessel may not be easily selected. Therefore, in many cases, a so-called shunt map that records a position and a state of the blood vessel is created, and puncture is performed while confirming the created shunt map. Since an operator often creates the shunt map by hand, the accuracy and the comprehensiveness of information in the shunt map depend on the operator's proficiency level, and creating the shunt map requires a significant amount of time.
Here, in order to easily select the puncture position in the blood vessel regardless of the operator's proficiency level, for example, a technology for automatically acquiring the position of the blood vessel of the subject imaged by a so-called ultrasound diagnostic apparatus, as disclosed in JP2014-221175A, JP2014-217745A, and JP2019-076748A, has been developed. JP2014-221175A, JP2014-217745A, and JP2019-076748A disclose a technology for acquiring a position of a blood vessel in a subject by capturing an ultrasound image while capturing an optical image of the subject and the ultrasound probe. JP2014-221175A and JP2019-076748A further disclose a technology for superimposing and displaying the ultrasound image including the blood vessel on the optical image in which the subject is captured.
Meanwhile, in a blood vessel of a subject undergoing a dialysis treatment, not only changes in a position and a shape in a plan view but also changes in a position and a shape in a depth direction often occur. In JP2014-221175A, JP2014-217745A, and JP2019-076748A, since the position and the shape of the blood vessel in the depth direction are not considered, the operator may not be able to easily select the puncture position in the blood vessel even in a case where these technologies are used.
The present invention has been made in order to solve such a conventional problem, and an object of the present invention is to provide an ultrasound diagnostic apparatus and a control method for an ultrasound diagnostic apparatus capable of easily selecting a puncture position in a blood vessel.
According to the following configuration, the above-described object can be achieved.
[1] An ultrasound diagnostic apparatus comprising:
[2] The ultrasound diagnostic apparatus according to [1], further comprising:
[3] The ultrasound diagnostic apparatus according to [2],
[4] The ultrasound diagnostic apparatus according to any one of [1] to [3],
[5] The ultrasound diagnostic apparatus according to [4], further comprising:
[6] The ultrasound diagnostic apparatus according to [5], further comprising:
[7] The ultrasound diagnostic apparatus according to any one of [4] to [6],
[8] The ultrasound diagnostic apparatus according to any one of [1] to [7],
[9] The ultrasound diagnostic apparatus according to any one of [1] to [8], further comprising:
[10] The ultrasound diagnostic apparatus according to any one of [1] to [9], further comprising:
[11] The ultrasound diagnostic apparatus according to any one of [1] to [10], further comprising:
[12] The ultrasound diagnostic apparatus according to any one of [1] to [11], further comprising:
[13] The ultrasound diagnostic apparatus according to [4], further comprising:
[14] The ultrasound diagnostic apparatus according to [13], further comprising:
[15] The ultrasound diagnostic apparatus according to [14], further comprising:
[16] The ultrasound diagnostic apparatus according to any one of to [15], further comprising:
[17] A control method for an ultrasound diagnostic apparatus, comprising:
In the present invention, an ultrasound diagnostic apparatus comprises: an ultrasound probe; a probe position detection unit configured to detect a position of the ultrasound probe; an image acquisition unit configured to acquire an ultrasound image of a subject using the ultrasound probe; a monitor; a three-dimensional data generation unit configured to generate three-dimensional ultrasound data of a blood vessel of the subject based on a plurality of frames of the ultrasound images acquired by the image acquisition unit while performing scanning with the ultrasound probe along a major axis direction of the blood vessel and the position of the ultrasound probe detected by the probe position detection unit; a blood vessel detection unit configured to detect the blood vessel from the three-dimensional ultrasound data generated by the three-dimensional data generation unit; a depth detection unit configured to detect a depth from a body surface to the blood vessel of the subject based on the blood vessel detected by the blood vessel detection unit; and a blood vessel image generation unit configured to generate a blood vessel image depicting the blood vessel within the subject based on the position of the ultrasound probe detected by the probe position detection unit, the blood vessel detected by the blood vessel detection unit, and the depth detected by the depth detection unit, and to display the blood vessel image on the monitor. Therefore, the user can easily select the puncture position in the blood vessel.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
The description of configuration requirements to be described below is made based on a representative embodiment of the present invention, but the present invention is not limited to such an embodiment.
In the present specification, a numerical range represented by “to” means a range including numerical values described before and after “to” as a lower limit value and an upper limit value, respectively.
In the present specification, “same” and “identical” include an error range generally allowed in the technical field.
The ultrasound probe 1 includes a transducer array 11. A transmission and reception circuit 12 is connected to the transducer array 11.
The apparatus main body 2 includes an image generation unit 21 connected to the transmission and reception circuit 12 of the ultrasound probe 1. A display controller 22 and a monitor 23 are sequentially connected to the image generation unit 21. The apparatus main body 2 comprises an optical image analysis unit 24 connected to the optical camera 3. A three-dimensional data generation unit 25 is connected to the image generation unit 21 and the optical image analysis unit 24. A blood vessel detection unit 26 is connected to the three-dimensional data generation unit 25. A depth detection unit 27 is connected to the three-dimensional data generation unit 25 and the blood vessel detection unit 26. A blood vessel image generation unit 28 is connected to the blood vessel detection unit 26 and the depth detection unit 27. The optical image analysis unit 24 and the blood vessel image generation unit 28 are connected to the display controller 22. In addition, a main body controller 29 is connected to the optical camera 3, the transmission and reception circuit 12, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, and the blood vessel image generation unit 28. An input device 30 is connected to the main body controller 29.
In addition, the transmission and reception circuit 12 and the image generation unit 21 constitute an image acquisition unit 31. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, and the main body controller 29 constitute a processor 32 for the apparatus main body 2. Further, the optical camera 3 and the optical image analysis unit 24 constitute a probe position detection unit 33.
The transducer array 11 of the ultrasound probe 1 includes a plurality of ultrasound transducers one-dimensionally or two-dimensionally arranged. Each of these ultrasound transducers transmits an ultrasound wave in accordance with a drive signal supplied from the transmission and reception circuit 12 and receives an ultrasound echo from a subject to output a signal based on the ultrasound echo. For example, each ultrasound transducer is composed of a piezoelectric body consisting of a piezoelectric ceramic represented by lead zirconate titanate (PZT), a polymer piezoelectric element represented by polyvinylidene difluoride (PVDF), a piezoelectric single crystal represented by lead magnesium niobate-lead titanate (PMN-PT), or the like, and electrodes formed at both ends of the piezoelectric body.
The transmission and reception circuit 12 transmits the ultrasound wave from the transducer array 11 and generates a sound ray signal based on a reception signal acquired by the transducer array 11, under the control of the main body controller 29. As shown in
The pulser 41 includes, for example, a plurality of pulse generators, and adjusts an amount of delay of each of drive signals and supplies the drive signals to the plurality of ultrasound transducers such that ultrasound waves transmitted from the plurality of ultrasound transducers of the transducer array 11 form an ultrasound beam based on a transmission delay pattern selected according to a control signal from the main body controller 29. In this way, in a case where a pulsed or continuous-wave voltage is applied to the electrodes of the ultrasound transducers of the transducer array 11, each piezoelectric body expands and contracts to generate a pulsed or continuous-wave ultrasound wave from each of the ultrasound transducers, whereby an ultrasound beam is formed from the combined wave of these ultrasound waves.
The transmitted ultrasound beam is reflected in, for example, a target such as a site of the subject and propagates toward the transducer array 11 of the ultrasound probe 1. The ultrasound echo propagating toward the transducer array 11 in this way is received by each of the ultrasound transducers constituting the transducer array 11. In this case, each of the ultrasound transducers constituting the transducer array 11 receives the propagating ultrasound echo to expand and contract to generate a reception signal, which is an electrical signal, and outputs these reception signals to the amplification section 42.
The amplification section 42 amplifies the signal input from each of the ultrasound transducers constituting the transducer array 11 and transmits the amplified signal to the AD conversion section 43. The AD conversion section 43 converts the signal transmitted from the amplification section 42 into digital reception data. The beam former 44 performs so-called reception focus processing by applying a delay to each piece of reception data received from the AD conversion section 43 and adding the delayed pieces of reception data. By this reception focus processing, the pieces of reception data converted by the AD conversion section 43 are phase-added, and a sound ray signal in which the focus of the ultrasound echo is narrowed down is acquired.
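The reception focus processing described above can be pictured as a delay-and-sum operation. The following is a purely illustrative sketch (not part of the configuration of the embodiment) written in Python with NumPy; the element geometry, sound velocity, and sampling frequency used here are hypothetical values chosen only for the example.

```python
import numpy as np

def delay_and_sum(rx_data, element_x, focus_x, focus_z, c=1540.0, fs=40e6):
    """Illustrative reception focus processing: delay the reception data of
    each channel so that echoes from the focal point align in time, then
    sum across channels to obtain one sample of the sound ray signal.
    rx_data:   (n_elements, n_samples) digital reception data
    element_x: (n_elements,) lateral element positions [m] (hypothetical)
    focus_x, focus_z: focal point coordinates [m]; c: assumed sound velocity
    """
    n_elements, n_samples = rx_data.shape
    # Two-way propagation time per element: transmit path (depth) plus return path.
    return_path = np.sqrt((element_x - focus_x) ** 2 + focus_z ** 2)
    delays = (focus_z + return_path) / c
    sample_idx = np.clip(np.round(delays * fs).astype(int), 0, n_samples - 1)
    aligned = rx_data[np.arange(n_elements), sample_idx]  # delayed sample per channel
    return aligned.sum()                                  # phase-added sound ray sample
```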
As shown in
The signal processing section 45 generates a B-mode image signal, which is tomographic image information regarding tissues within the subject, by performing, on the sound ray signal received from the transmission and reception circuit 12, correction of attenuation caused by the propagation distance according to the depth of the reflection position of the ultrasound wave using a sound velocity value set by the main body controller 29, and then performing envelope detection processing.
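As a minimal sketch of the kind of processing attributed to the signal processing section 45 (the attenuation coefficient and parameter values below are hypothetical, and the actual section is not limited to this form), attenuation correction followed by envelope detection could be written as follows.

```python
import numpy as np
from scipy.signal import hilbert

def b_mode_line(sound_ray, fs=40e6, c=1540.0, alpha_db_per_m=50.0):
    """Illustrative B-mode line formation: depth-dependent gain compensating
    attenuation with distance, then envelope detection and log compression.
    All parameter values are hypothetical."""
    n = sound_ray.size
    depth = np.arange(n) * c / (2.0 * fs)              # reflection depth of each sample [m]
    gain = 10.0 ** (alpha_db_per_m * depth / 20.0)     # larger gain for deeper reflections
    corrected = sound_ray * gain
    envelope = np.abs(hilbert(corrected))              # envelope detection
    return 20.0 * np.log10(envelope + 1e-12)           # log-compressed amplitude [dB]
```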
The DSC 46 converts (raster-converts) the B-mode image signal generated by the signal processing section 45 into an image signal in accordance with a normal television signal scanning method.
The image processing section 47 performs various types of necessary image processing such as gradation processing on the B-mode image signal input from the DSC 46 and then sends out the B-mode image signal to the display controller 22 and the three-dimensional data generation unit 25. Hereinafter, the B-mode image signal that has been subjected to image processing by the image processing section 47 is referred to as an ultrasound image.
The display controller 22 performs predetermined processing on the ultrasound image or the like generated by the image generation unit 21 and displays the ultrasound image or the like on the monitor 23, under the control of the main body controller 29.
The monitor 23 performs various kinds of display under the control of the display controller 22. The monitor 23 can include a display device such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display, for example.
The optical camera 3 includes, for example, an image sensor, such as a so-called charge coupled device (CCD) image sensor or a so-called complementary metal-oxide-semiconductor (CMOS) image sensor, and images a body surface of the subject and the ultrasound probe 1 disposed on the body surface of the subject to acquire an optical image. The optical camera 3 sends out the acquired optical image to the optical image analysis unit 24.
The optical image analysis unit 24 detects a position of the ultrasound probe 1 on the body surface of the subject by analyzing, for example, the optical image captured by the optical camera 3. For example, in a case where a marker for detecting the position of the ultrasound probe 1, for example, a so-called augmented reality marker (AR marker) or the like, is attached to the ultrasound probe 1, the optical image analysis unit 24 can detect the position of the ultrasound probe 1 by recognizing the marker.
In addition, the optical image analysis unit 24 stores, for example, a plurality of template images representing the ultrasound probe 1 and the body surface of the subject, and can detect the ultrasound probe 1 and the body surface of the subject by searching the optical image through a template matching method using the plurality of template images and can also detect the position of the ultrasound probe 1 on the body surface of the subject. Further, the optical image analysis unit 24 includes, for example, a machine learning model that has learned a large number of optical images showing general ultrasound probes and body surfaces of subjects, and can also detect the position of the ultrasound probe 1 on the body surface of the subject by using the machine learning model.
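A minimal sketch of the template matching approach, assuming OpenCV and a hypothetical probe template image, is shown below; marker recognition or a machine learning model, as described above, could be used in its place.

```python
import cv2

def detect_probe_position(optical_image, probe_template, threshold=0.7):
    """Illustrative detection of the ultrasound probe in an optical image by
    template matching. The template and the match threshold are hypothetical."""
    gray = cv2.cvtColor(optical_image, cv2.COLOR_BGR2GRAY)
    tmpl = cv2.cvtColor(probe_template, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(gray, tmpl, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None                                    # no sufficiently confident match
    h, w = tmpl.shape
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)  # probe center in image coordinates
```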
The three-dimensional data generation unit 25 generates three-dimensional ultrasound data of the blood vessel of the subject based on a plurality of frames of the ultrasound images including a minor axis image of the blood vessel, which are acquired by the image acquisition unit 31 while performing scanning with the ultrasound probe 1 along a major axis direction of the blood vessel, that is, along a flow direction of the blood vessel, and the position of the ultrasound probe 1 detected by the probe position detection unit 33. Here, the minor axis image of the blood vessel refers to an image of a tomographic plane perpendicular to the major axis direction of the blood vessel.
In this case, the three-dimensional data generation unit 25 can generate the three-dimensional ultrasound data by arranging the plurality of frames of ultrasound images U showing a blood vessel B along a major axis direction D based on the position of the ultrasound probe 1 detected by the probe position detection unit 33, as shown in
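One way to picture this arrangement is the following illustrative sketch, which assumes (hypothetically) a straight scan path, frames of identical size, and a fixed slice spacing; interpolation between slices is omitted for brevity.

```python
import numpy as np

def build_volume(frames, probe_positions, spacing_mm=0.5):
    """Illustrative three-dimensional ultrasound data generation: short-axis
    frames are placed into slots along the major axis direction D according
    to the detected probe position, and overlapping frames are averaged."""
    frames = np.asarray(frames, dtype=float)               # (n_frames, depth, width)
    positions = np.asarray(probe_positions, dtype=float)   # probe position along D [mm]
    n_slots = int(np.ceil((positions.max() - positions.min()) / spacing_mm)) + 1
    volume = np.zeros((n_slots,) + frames.shape[1:])
    counts = np.zeros(n_slots)
    for frame, pos in zip(frames, positions):
        slot = int(round((pos - positions.min()) / spacing_mm))
        volume[slot] += frame
        counts[slot] += 1
    nonzero = counts > 0
    volume[nonzero] /= counts[nonzero][:, None, None]       # average frames sharing a slot
    return volume                                           # axis 0 follows the major axis direction D
```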
The blood vessel detection unit 26 detects the blood vessel B from the three-dimensional ultrasound data by analyzing the three-dimensional ultrasound data generated by the three-dimensional data generation unit 25. The blood vessel detection unit 26 can detect, for example, a cross-section of the blood vessel B in the three-dimensional ultrasound data through the template matching method for a cross-section perpendicular to the major axis direction D and can detect the three-dimensional blood vessel B using a set of detection results thereof. In addition, the blood vessel detection unit 26 can also detect the three-dimensional blood vessel B from the three-dimensional ultrasound data by using a machine learning model that has learned a large amount of three-dimensional ultrasound data and the three-dimensional blood vessels B present within the large amount of three-dimensional ultrasound data. Further, the blood vessel detection unit 26 can detect the blood vessel B by detecting a blood vessel wall or a blood vessel lumen. Here, the blood vessel lumen refers to a spatial region present inside the blood vessel wall.
The depth detection unit 27 detects a depth from the body surface to the blood vessel B of the subject over the entire length of the detected blood vessel B based on the blood vessel B detected by the blood vessel detection unit 26. The depth detection unit 27 can detect the depth from the body surface to the blood vessel B of the subject by measuring, over the entire length of the detected blood vessel B, a distance in a depth direction between an upper end part of the ultrasound data, that is, the body surface, and the center of the blood vessel lumen, or a shortest distance in the depth direction between the upper end part of the ultrasound data, that is, the body surface, and the blood vessel wall, in the three-dimensional ultrasound data.
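A minimal sketch of this measurement, assuming the blood vessel lumen has already been detected as a binary mask in the three-dimensional ultrasound data and that the upper edge of the data corresponds to the body surface, is given below; the pixel size is hypothetical.

```python
import numpy as np

def vessel_depth_profile(lumen_mask, pixel_size_mm=0.1):
    """Illustrative depth detection: for each position along the major axis
    direction D (axis 0), measure the distance in the depth direction (axis 1)
    from the upper end of the ultrasound data, i.e. the body surface, to the
    center of the detected blood vessel lumen."""
    n_slices = lumen_mask.shape[0]
    depths = np.full(n_slices, np.nan)
    for i in range(n_slices):
        rows, _ = np.nonzero(lumen_mask[i])
        if rows.size:
            depths[i] = rows.mean() * pixel_size_mm  # body surface (row 0) to lumen center
    return depths                                     # depth [mm] over the length of the vessel
```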
The blood vessel image generation unit 28 generates, for example, as shown in
In the blood vessel image C, a display form can be changed according to the depth of the blood vessel B, such as using, in addition to a position and a shape of the blood vessel B along a plane represented by the optical image Q, that is, in a plan view, for example, color intensity, chroma saturation, transparency, brightness, a display form of a color or a contour line, or the like to represent the depth of the blood vessel B. For example, in a case where the depth of the blood vessel B is represented by the color intensity, the blood vessel image C can represent that a location with a darker color corresponds to a deeper position of the blood vessel B, and the blood vessel image C can represent that a location with a lighter color corresponds to a shallower position of the blood vessel B. The user can easily grasp the position, the shape, and the depth of the blood vessel B by confirming the blood vessel image C shown on the monitor 23 and can easily select the puncture position in the dialysis treatment or the like.
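For illustration only, one possible mapping from depth to color intensity in the blood vessel image C could look like the sketch below; the color channel and depth range are hypothetical.

```python
import numpy as np

def depth_to_color(vessel_mask_2d, depth_map_mm, max_depth_mm=15.0):
    """Illustrative rendering of the blood vessel image C in which deeper
    locations of the blood vessel B are drawn with a darker color and
    shallower locations with a lighter color."""
    h, w = vessel_mask_2d.shape
    image = np.zeros((h, w, 3), dtype=np.uint8)                # RGB blood vessel image
    norm = np.clip(depth_map_mm / max_depth_mm, 0.0, 1.0)      # 0 = shallow, 1 = deep
    intensity = (255 * (1.0 - 0.8 * norm)).astype(np.uint8)    # darker for deeper vessel
    image[..., 2] = np.where(vessel_mask_2d, intensity, 0)     # draw the vessel in shades of blue
    return image
```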
The main body controller 29 controls each unit of the apparatus main body 2 and the ultrasound probe 1 in accordance with a program recorded in advance, or the like.
The input device 30 accepts the input operation from the examiner and sends out input information to the main body controller 29. The input device 30 is composed of, for example, a device for the examiner to perform an input operation, such as a keyboard, a mouse, a trackball, a touchpad, or a touch panel.
Although the processor 32 including the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, and the main body controller 29 may be composed of a central processing unit (CPU) and a control program for causing the CPU to perform various types of processing, the processor 32 may be composed of a field programmable gate array (FPGA), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a graphics processing unit (GPU), or other integrated circuits (ICs), or may be composed of a combination thereof.
In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, and the main body controller 29 of the processor 32 can also be configured by being integrated partially or entirely into one CPU or the like.
Next, an example of the operation of the ultrasound diagnostic apparatus according to Embodiment 1 will be described using the flowchart of
First, in step S1, the main body controller 29 controls each unit of the ultrasound diagnostic apparatus to start scanning of the ultrasound probe 1. In this case, the main body controller 29 can start the scanning of the ultrasound probe 1, for example, with an input of an instruction to start the scanning of the ultrasound probe 1 from the user via the input device 30, as a trigger. From then on, the user performs the scanning with the ultrasound probe 1 while moving the ultrasound probe 1 along the major axis direction D of the blood vessel B in a state in which the ultrasound probe 1 is in contact with the body surface A of the subject such that the minor axis image of the blood vessel B of the subject is captured.
Next, in step S2, the probe position detection unit 33 detects the position of the ultrasound probe 1. In this case, the optical camera 3 of the probe position detection unit 33 images the body surface A of the subject and the ultrasound probe 1 to acquire the optical image Q, and the optical image analysis unit 24 analyzes the optical image Q, whereby the position of the ultrasound probe 1 on the body surface A of the subject is detected. In a case where a marker for detecting the position of the ultrasound probe 1 is attached to the ultrasound probe 1, the optical image analysis unit 24 can detect the position of the ultrasound probe 1 by recognizing the marker. In addition, the optical image analysis unit 24 can also detect the position of the ultrasound probe 1 through a template matching method or a method using a machine learning model.
In step S3, the image acquisition unit 31 acquires the ultrasound image U. In this case, the transducer array 11 of the ultrasound probe 1 transmits the ultrasound beam into the subject and receives the ultrasound echo from the inside of the subject, thereby generating the reception signal. The transmission and reception circuit 12 of the image acquisition unit 31 performs so-called reception focus processing on the reception signal to generate the sound ray signal, under the control of the main body controller 29. The sound ray signal generated by the transmission and reception circuit 12 is sent out to the image generation unit 21. The image generation unit 21 generates the ultrasound image U using the sound ray signal sent out from the transmission and reception circuit 12.
Here, since the processing of steps S2 and S3 is performed substantially at the same time, the position of the ultrasound probe 1 detected in step S2 and the position of the ultrasound probe 1 in a case where the ultrasound image U is acquired in step S3 can be regarded as the same.
In step S4, the main body controller 29 determines whether or not to end the scanning of the ultrasound probe 1. The main body controller 29 can determine to end the scanning of the ultrasound probe 1, for example, in a case where an instruction to end the scanning of the ultrasound probe 1 is input by the user via the input device 30. The main body controller 29 can determine to continue the scanning of the ultrasound probe 1, for example, in a case where an instruction to end the scanning of the ultrasound probe 1 is not input by the user via the input device 30.
In a case where it is determined in step S4 to continue the scanning of the ultrasound probe 1, the process returns to step S2, and the position of the ultrasound probe 1 is newly detected. In subsequent step S3, the ultrasound image U is newly acquired, and the process proceeds to step S4. In this way, processing of steps S2 to S4 is repeated as long as it is determined in step S4 to continue the scanning of the ultrasound probe 1. In a case where it is determined in step S4 to end the scanning of the ultrasound probe 1, the process proceeds to step S5.
In step S5, the three-dimensional data generation unit 25 generates the three-dimensional ultrasound data of the blood vessel B based on the position of the ultrasound probe 1 detected in the repetition of steps S2 to S4 and the plurality of continuous frames of the ultrasound images U acquired in the repetition of steps S2 to S4. For example, as shown in
In step S6, the blood vessel detection unit 26 detects the blood vessel B from the three-dimensional ultrasound data by analyzing the three-dimensional ultrasound data generated in step S5. The blood vessel detection unit 26 can detect, for example, a cross-section of the blood vessel B in the three-dimensional ultrasound data through the template matching method for a cross-section perpendicular to the major axis direction D and can detect the three-dimensional blood vessel B using a set of detection results thereof. In addition, the blood vessel detection unit 26 can also detect the three-dimensional blood vessel B through a method using a machine learning model. In this case, the blood vessel detection unit 26 can detect the blood vessel B by detecting a blood vessel wall or a blood vessel lumen.
In step S7, the depth detection unit 27 detects the depth from the body surface A to the blood vessel B of the subject over the entire length of the detected blood vessel B based on the three-dimensional blood vessel B detected in step S6. The depth detection unit 27 can detect the depth from the body surface A to the blood vessel B of the subject by measuring, over the entire length of the detected blood vessel B, the distance in the depth direction between the upper end part of the ultrasound data, that is, the body surface A, and the center of the blood vessel lumen, or the shortest distance in the depth direction between the upper end part of the ultrasound data, that is, the body surface A, and the blood vessel wall, in the three-dimensional ultrasound data.
In step S8, the blood vessel image generation unit 28 generates, for example, as shown in
In step S9, the optical camera 3 acquires the optical image Q in which the body surface A of the subject is captured, under the control of the main body controller 29.
In step S10, the blood vessel image generation unit 28 superimposes the blood vessel image C generated in step S8 on, for example, as shown in
The user can easily grasp the position, the shape, and the depth of the blood vessel B within the subject by confirming the blood vessel image C displayed on the monitor 23 in this manner and can easily select the puncture position in the dialysis treatment or the like.
In a case where the processing of step S10 is completed, the operation of the ultrasound diagnostic apparatus following the flowchart of
From the above, with the ultrasound diagnostic apparatus of Embodiment 1 of the present invention, the three-dimensional data generation unit 25 generates the three-dimensional ultrasound data of the blood vessel B of the subject based on the plurality of frames of the ultrasound images U acquired by the image acquisition unit 31 while performing scanning with the ultrasound probe 1 along the major axis direction D of the blood vessel B and the position of the ultrasound probe 1 detected by the probe position detection unit 33, the blood vessel detection unit 26 detects the blood vessel B from the three-dimensional ultrasound data generated by the three-dimensional data generation unit 25, the depth detection unit 27 detects the depth from the body surface A to the blood vessel B of the subject based on the three-dimensional blood vessel B detected by the blood vessel detection unit 26, and the blood vessel image generation unit 28 generates the blood vessel image C depicting the blood vessel B within the subject based on the position of the ultrasound probe 1 detected by the probe position detection unit 33, the three-dimensional blood vessel B detected by the blood vessel detection unit 26, and the depth detected by the depth detection unit 27 and displays the blood vessel image C on the monitor 23. Therefore, the user can easily grasp the position, the shape, and the depth of the blood vessel B within the subject by confirming the blood vessel image C and can easily select the puncture position in the dialysis treatment or the like.
Although a case where the transmission and reception circuit 12 is provided in the ultrasound probe 1 has been described, the transmission and reception circuit 12 may be provided in the apparatus main body 2.
In addition, although a case where the image generation unit 21 is provided in the apparatus main body 2 has been described, the image generation unit 21 may be provided in the ultrasound probe 1.
Further, the apparatus main body 2 may be a so-called stationary type, a portable type that is easy to carry, or a so-called handheld type that is composed of, for example, a smartphone or a tablet type computer. As described above, the type of the device that constitutes the apparatus main body 2 is not particularly limited.
Further, although a case where the image generation unit 21 generates the B-mode image as the ultrasound image U has been described, it is also possible to further generate a so-called Doppler image. In this case, the apparatus main body 2 can comprise, for example, an image generation unit 21A having a configuration shown in
The image generation unit 21A comprises the signal processing section 45 and a quadrature detection section 51 connected to the transmission and reception circuit 12. Similar to the image generation unit 21 shown in
The quadrature detection section 51 mixes the sound ray signal generated by the transmission and reception circuit 12 with a carrier signal having a reference frequency to perform quadrature detection on the sound ray signal, thereby converting the sound ray signal into complex data.
The high-pass filter 52 functions as a so-called wall filter and removes a frequency component derived from the motion of the body tissue within the subject, from the complex data generated by the quadrature detection section 51.
The high-speed Fourier transformation section 53 performs frequency analysis by performing a Fourier transform on the complex data of a plurality of sample points, obtains the blood flow velocity, and generates a spectrum signal.
The Doppler image generation section 54 generates a Doppler image by aligning the spectrum signals generated by the high-speed Fourier transformation section 53 on a time axis and representing the magnitude of each frequency component as brightness. In the Doppler image, the horizontal axis indicates time, the vertical axis indicates the Doppler shift frequency, that is, the flow velocity, and the brightness of the waveform represents the power of each frequency component.
The complex data memory 55 stores the complex data converted from the sound ray signal by the quadrature detection section 51.
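The chain from the quadrature detection section 51 to the Doppler image generation section 54 can be summarized by the following illustrative sketch. The carrier frequency, pulse repetition frequency, and ensemble length are hypothetical, and for brevity the slow-time ensemble is taken by decimating a single line, whereas in practice it would be obtained from repeated transmissions.

```python
import numpy as np

def doppler_spectrum(sound_ray, f0=5e6, fs=40e6, prf=4e3, n_ensemble=64):
    """Illustrative Doppler processing: quadrature detection into complex
    data, a crude wall (high-pass) filter removing slow tissue motion, and
    a Fourier transform of the ensemble to obtain one spectrum line whose
    magnitude is displayed as brightness."""
    t = np.arange(sound_ray.size) / fs
    complex_data = sound_ray * np.exp(-2j * np.pi * f0 * t)          # mix with the carrier
    gate = complex_data[:: int(fs / prf)][:n_ensemble]               # slow-time samples at one range gate
    gate = gate - gate.mean()                                        # wall filter (remove DC / slow motion)
    spectrum = np.fft.fftshift(np.fft.fft(gate * np.hanning(gate.size)))
    power = 20.0 * np.log10(np.abs(spectrum) + 1e-12)                # brightness of each frequency component
    freqs = np.fft.fftshift(np.fft.fftfreq(gate.size, d=1.0 / prf))  # Doppler shift frequency axis [Hz]
    return freqs, power
```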
Since the Doppler image is an image of a location where blood flow is present, the blood vessel lumen can be clearly captured in the minor axis image of the blood vessel B even in a case where, for example, the blood vessel wall is thickened or a plaque has formed in the blood vessel B and caused stenosis of the blood vessel B.
The three-dimensional data generation unit 25 can acquire the B-mode image and the Doppler image, which are generated by the image generation unit 21A, and can generate the three-dimensional ultrasound data based on the plurality of continuous frames of the B-mode images and the plurality of continuous frames of the Doppler images.
The blood vessel detection unit 26 can detect the blood vessel lumen from three-dimensional ultrasound data based on the B-mode image and the Doppler image. Since the image of the blood vessel lumen is clearly captured in the Doppler image, the blood vessel detection unit 26 can improve the detection accuracy of the blood vessel lumen by detecting the blood vessel lumen based on both the B-mode image and the Doppler image, as compared with the blood vessel lumen detected based only on the B-mode image.
In a case where the blood vessel detection unit 26 detects the blood vessel lumen based on both the B-mode image and the Doppler image and the depth detection unit 27 detects the distance in the depth direction from the body surface A of the subject to the center of the blood vessel lumen as the depth of the blood vessel B, it is possible to improve the detection accuracy of the depth of the blood vessel B.
In Embodiment 1, a case where the position of the ultrasound probe 1 is detected based on the optical image Q has been described, but the detection method of the ultrasound probe 1 is not limited to this.
The apparatus main body 2B is obtained by removing the optical image analysis unit 24 and providing a main body controller 29B instead of the main body controller 29 with respect to the apparatus main body 2 in Embodiment 1 shown in
The position sensor 33B attached to the ultrasound probe 1 is a sensor that detects the position of the ultrasound probe 1. The position sensor 33B can use, for example, a predetermined position as a reference to detect relative coordinates from the reference position as the position of the ultrasound probe 1. As the position sensor 33B, for example, a so-called magnetic sensor, an acceleration sensor, a gyro sensor, a global positioning system (GPS) sensor, a geomagnetic sensor, or the like can be used. Information on the position of the ultrasound probe 1 detected by the position sensor 33B is transmitted to the three-dimensional data generation unit 25 and the blood vessel image generation unit 28. The three-dimensional data generation unit 25 generates the three-dimensional ultrasound data of the blood vessel B of the subject based on the plurality of frames of the ultrasound images U including the minor axis image of the blood vessel B, which are acquired by the image acquisition unit 31 while performing scanning with the ultrasound probe 1 along the major axis direction D of the blood vessel B and the position of the ultrasound probe 1 detected by the position sensor 33B.
The blood vessel image generation unit 28 generates, for example, as shown in
From the above, even in a case where the position of the ultrasound probe 1 is detected by the position sensor 33B, the blood vessel image generation unit 28 generates the blood vessel image C depicting the blood vessel B within the subject based on the position of the ultrasound probe 1 detected by the position sensor 33B, the three-dimensional blood vessel B detected by the blood vessel detection unit 26, and the depth detected by the depth detection unit 27 and displays the blood vessel image C on the monitor 23, similar to a case where the position of the ultrasound probe 1 is detected based on the optical image Q. Therefore, the user can easily grasp the position, the shape, and the depth of the blood vessel B within the subject by confirming the blood vessel image C and can easily select the puncture position in the dialysis treatment or the like.
For example, a subject undergoing a dialysis treatment undergoes frequent punctures with a thick needle in the blood vessel B, which may lead to abnormalities, such as thickening of the blood vessel wall and stenosis of the blood vessel B. Therefore, it is desirable to avoid such an abnormal location in a case of puncturing the blood vessel B of the subject with a needle. In that respect, the ultrasound diagnostic apparatus of the embodiment of the present invention can also automatically detect the abnormal location such that the user does not puncture the abnormal location.
The apparatus main body 2C comprises a main body controller 29C instead of the main body controller 29 and further comprises a blood vessel size calculation unit 61 and an abnormal location detection unit 62 (first abnormal location detection unit), with respect to the apparatus main body 2 in Embodiment 1. In the apparatus main body 2C, the blood vessel size calculation unit 61 is connected to the blood vessel detection unit 26 and the main body controller 29C. The abnormal location detection unit 62 is connected to the blood vessel size calculation unit 61. The abnormal location detection unit 62 is connected to the display controller 22. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29C, the blood vessel size calculation unit 61, and the abnormal location detection unit 62 constitute a processor 32C for the apparatus main body 2C.
The blood vessel size calculation unit 61 calculates the thickness of the blood vessel wall or the diameter of the blood vessel lumen over the entire length of the blood vessel B detected by the blood vessel detection unit 26, based on the blood vessel wall and the blood vessel lumen detected by the blood vessel detection unit 26. The blood vessel size calculation unit 61 can calculate the thickness of the blood vessel wall or the diameter of the blood vessel lumen over the entire length of the blood vessel B detected by the blood vessel detection unit 26 by, for example, measuring the blood vessel wall and the blood vessel lumen of the three-dimensional blood vessel B in the plane perpendicular to the major axis direction D.
The abnormal location detection unit 62 has predetermined change rate threshold values for a spatial change rate of the thickness of the blood vessel wall in the major axis direction D and a spatial change rate of the diameter of the blood vessel lumen in the major axis direction D and detects a location where the spatial change rate of the thickness of the blood vessel wall or the diameter of the blood vessel lumen calculated by the blood vessel size calculation unit 61 exceeds the change rate threshold value.
The abnormal location detection unit 62 can calculate the spatial change rate of the thickness of the blood vessel wall in the major axis direction D by differentiating the thickness of the blood vessel wall with respect to the position of the blood vessel B in the major axis direction D in the relationship between the position of the blood vessel B in the major axis direction D and the thickness of the blood vessel wall at that position. In addition, the abnormal location detection unit 62 can calculate the spatial change rate of the diameter of the blood vessel lumen in the major axis direction D by differentiating the diameter of the blood vessel lumen with respect to the position of the blood vessel B in the major axis direction D in the relationship between the position of the blood vessel B in the major axis direction D and the diameter of the blood vessel lumen at that position.
Here, in a case where abnormalities, such as thickening of the blood vessel wall and stenosis, have occurred in the blood vessel B of the subject, the spatial change rate of the thickness of the blood vessel wall or the diameter of the blood vessel lumen rapidly increases. Therefore, a location where these spatial change rates exceed the change rate threshold values can be regarded as an abnormal location where abnormalities, such as thickening of the blood vessel wall and stenosis, have occurred.
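A minimal sketch of this detection, assuming the lumen diameter has already been sampled at regular intervals along the major axis direction D and using a hypothetical change rate threshold value, is shown below.

```python
import numpy as np

def detect_abrupt_changes(lumen_diameter_mm, spacing_mm=0.5, threshold_mm_per_mm=0.3):
    """Illustrative abnormal location detection: differentiate the lumen
    diameter with respect to position along the major axis direction D and
    flag locations where the absolute spatial change rate exceeds the
    change rate threshold value."""
    diameter = np.asarray(lumen_diameter_mm, dtype=float)
    change_rate = np.gradient(diameter, spacing_mm)        # d(diameter)/d(position) along D
    abnormal = np.abs(change_rate) > threshold_mm_per_mm
    return np.nonzero(abnormal)[0]                         # indices along D to highlight on the monitor
```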
The abnormal location detection unit 62 displays the location where the spatial change rate of the thickness of the blood vessel wall or the diameter of the blood vessel lumen calculated by the blood vessel size calculation unit 61 exceeds the change rate threshold value, for example, as shown in
From the above, with the ultrasound diagnostic apparatus of Embodiment 3, the blood vessel size calculation unit 61 calculates the thickness of the blood vessel wall or the diameter of the blood vessel lumen over the entire length of the blood vessel B detected by the blood vessel detection unit 26, and the abnormal location detection unit 62 detects the location where the spatial change rate of the thickness of the blood vessel wall or the diameter of the blood vessel lumen calculated by the blood vessel size calculation unit 61 exceeds the change rate threshold value, and displays the location on the monitor 23. Therefore, the user can easily grasp the position of the blood vessel B, where the needle puncture should be avoided, and can easily select an appropriate position to be punctured with the needle in the blood vessel B.
The ultrasound diagnostic apparatus of Embodiment 3 has a configuration in which the blood vessel size calculation unit 61 and the abnormal location detection unit 62 are added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration may be employed in which the blood vessel size calculation unit 61 and the abnormal location detection unit 62 are added to the ultrasound diagnostic apparatus of Embodiment 2. Even in this case as well, the user can easily grasp the position of the blood vessel B, where the needle puncture should be avoided, by confirming the display of the abnormal location and can puncture an appropriate position with the needle.
The ultrasound diagnostic apparatus of the embodiment of the present invention can also display a three-dimensional image of the blood vessel B on the monitor 23 such that the user can easily determine an appropriate position to be punctured with the needle.
In the apparatus main body 2D, the three-dimensional image generation unit 63 is connected to the three-dimensional data generation unit 25, the blood vessel image generation unit 28, and the main body controller 29D. The three-dimensional image generation unit 63 is connected to the display controller 22. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29D, and the three-dimensional image generation unit 63 constitute a processor 32D for the apparatus main body 2D.
The three-dimensional image generation unit 63 generates, as shown in
The three-dimensional image generation unit 63 can also display the three-dimensional ultrasound image J of the blood vessel B on the monitor 23 at a rotational position in accordance with a viewpoint or a rotation angle designated by the user via the input device 30.
From the above, with the ultrasound diagnostic apparatus of Embodiment 4, the three-dimensional image generation unit 63 generates, as shown in
The three-dimensional image generation unit 63 can also reconstruct, from the three-dimensional ultrasound data, two-dimensional images of the blood vessel B in which the blood vessel B is viewed from a plurality of directions, and can display the two-dimensional images on the monitor 23 together with the three-dimensional ultrasound image J of the blood vessel B. As a result, the user can grasp the shape and the depth of the blood vessel B in more detail and can easily select an appropriate position in the needle puncture.
In addition, the ultrasound diagnostic apparatus of Embodiment 4 has a configuration in which the three-dimensional image generation unit 63 is added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration may be employed in which the three-dimensional image generation unit 63 is added to the ultrasound diagnostic apparatuses of Embodiments 2 and 3. Even in this case as well, the user can grasp the shape and the depth of the blood vessel B in detail and can easily select an appropriate position in the needle puncture.
Depending on the user's proficiency level, there may be a case where the puncture direction and the puncture angle of the needle cannot be easily decided on even though the position suitable for the needle puncture in the blood vessel B can be grasped. In that respect, the ultrasound diagnostic apparatus of the embodiment of the present invention can also automatically calculate a recommended needle puncture direction and a recommended puncture angle.
In the apparatus main body 2E, the recommended puncture pathway calculation unit 64 is connected to the three-dimensional data generation unit 25 and the main body controller 29E. The recommended puncture pathway calculation unit 64 is connected to the display controller 22. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29E, and the recommended puncture pathway calculation unit 64 constitute a processor 32E for the apparatus main body 2E.
The recommended puncture pathway calculation unit 64 calculates, for example, as shown in
In this case, the recommended puncture pathway calculation unit 64 can detect, for example, from the three-dimensional ultrasound data, the flow direction of the blood vessel B and the position of any abnormal location, such as thickening of the blood vessel wall or stenosis of the blood vessel B, as viewed from above the body surface A of the subject, and can calculate, as the recommended puncture direction at the puncture location designated by the user via the input device 30 for the blood vessel image C, a direction that is parallel to the flow direction of the blood vessel B and whose progression direction does not include the abnormal location.
In addition, the recommended puncture pathway calculation unit 64 can reconstruct, for example, a major axis image of the blood vessel B at the puncture location designated by the user via the input device 30 for the blood vessel image C from the three-dimensional ultrasound data, and can calculate, as the recommended puncture angle, a certain angle range with respect to the flow direction of the blood vessel B in the major axis image.
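Purely as an illustration of the idea, and under deliberately simplified assumptions (a plan-view centerline of the blood vessel B, known abnormal locations lying on one side of the puncture location, and a hypothetical fixed angle range), a sketch of the calculation could be:

```python
import numpy as np

def recommended_pathway(centerline_xy, puncture_idx, abnormal_idx, angle_range_deg=(10.0, 20.0)):
    """Illustrative recommended puncture pathway: the puncture direction is
    taken parallel to the local flow direction of the blood vessel B at the
    designated location and flipped if the known abnormal locations lie
    ahead; the puncture angle is returned as a fixed range with respect to
    the flow direction."""
    pts = np.asarray(centerline_xy, dtype=float)                 # (n, 2) centerline in plan view
    i = puncture_idx
    direction = pts[min(i + 1, len(pts) - 1)] - pts[max(i - 1, 0)]
    direction = direction / (np.linalg.norm(direction) + 1e-12)  # unit vector along the flow direction
    abnormal = list(abnormal_idx)
    if abnormal and min(abnormal) > i:
        direction = -direction                                   # progress away from downstream abnormal locations
    return direction, angle_range_deg
```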
From the above, with the ultrasound diagnostic apparatus of Embodiment 5, the recommended puncture pathway calculation unit 64 calculates, for example, as shown in
The ultrasound diagnostic apparatus of Embodiment 5 has a configuration in which the recommended puncture pathway calculation unit 64 is added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration may be employed in which the recommended puncture pathway calculation unit 64 is added to the ultrasound diagnostic apparatuses of Embodiments 2 to 4. Even in this case as well, the user can easily grasp the appropriate puncture direction and the appropriate puncture angle for the decided-on puncture position by confirming the recommended puncture pathway.
An artery may be located in the vicinity of a vein within the subject. In that respect, the ultrasound diagnostic apparatus of the embodiment of the present invention can also display, for example, the vein and the artery in display forms different from each other so as to facilitate the user in avoiding the artery in the needle puncture.
In the apparatus main body 2F, the blood vessel determination unit 65 is connected to the blood vessel detection unit 26 and the main body controller 29F. The blood vessel determination unit 65 is connected to the blood vessel image generation unit 28. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29F, and the blood vessel determination unit 65 constitute a processor 32F for the apparatus main body 2F.
The blood vessel determination unit 65 determines whether the blood vessel B detected by the blood vessel detection unit 26 is the artery or the vein. Normally, since the artery exhibits temporal variations in blood vessel diameter due to the pulsation of the heart, the blood vessel determination unit 65 can determine the blood vessel B that exhibits temporal variations in blood vessel diameter as the artery and can determine the blood vessel B that does not exhibit temporal variations in blood vessel diameter as the vein, for example, based on the plurality of continuous frames of the ultrasound images U. The blood vessel determination unit 65 can also determine the artery and the vein using, for example, a machine learning model that has learned a large number of ultrasound images U in which arteries are captured and a large number of ultrasound images U in which veins are captured.
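A minimal sketch of the temporal variation criterion, with a hypothetical variation threshold, is shown below; as noted above, a machine learning model could be used instead.

```python
import numpy as np

def classify_vessel(diameter_per_frame_mm, variation_threshold_mm=0.3):
    """Illustrative artery/vein determination: a blood vessel whose diameter
    varies over the plurality of continuous frames (due to the pulsation of
    the heart) is determined to be an artery, otherwise a vein."""
    d = np.asarray(diameter_per_frame_mm, dtype=float)
    variation = d.max() - d.min()              # peak-to-peak temporal variation of the diameter
    return "artery" if variation > variation_threshold_mm else "vein"
```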
The blood vessel image generation unit 28 can generate the blood vessel image C in which a depiction form of the blood vessel B is changed according to a determination result by the blood vessel determination unit 65. In this case, the blood vessel image generation unit 28 can generate the blood vessel image C in which an artery B1 and a vein B2 are depicted in colors different from each other, for example, as shown in
From the above, with the ultrasound diagnostic apparatus of Embodiment 6, the blood vessel determination unit 65 determines whether the blood vessel B detected by the blood vessel detection unit 26 is the artery B1 or the vein B2, and the blood vessel image generation unit 28 generates the blood vessel image C in which the depiction form of the blood vessel B is changed according to the determination result by the blood vessel determination unit 65. Therefore, the user can easily grasp the positions of the artery B1 and of the vein B2 by confirming the display of the monitor 23 and can easily select an appropriate puncture position in the blood vessel B.
The ultrasound diagnostic apparatus of Embodiment 6 has a configuration in which the blood vessel determination unit 65 is added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration may be employed in which the blood vessel determination unit 65 is added to the ultrasound diagnostic apparatuses of Embodiments 2 to 5. Even in this case as well, the user can easily grasp the positions of the artery B1 and the vein B2 by confirming the display of the monitor 23 and can easily select an appropriate puncture position in the blood vessel B.
A nerve bundle may be located in the vicinity of the blood vessel B within the subject. In that respect, the ultrasound diagnostic apparatus of the embodiment of the present invention can also display, for example, the nerve bundle located in the vicinity of the blood vessel B such that the risk of the needle reaching the nerve bundle can be reduced.
In the apparatus main body 2G, the nerve bundle detection unit 66 is connected to the three-dimensional data generation unit 25 and the main body controller 29G. The nerve bundle detection unit 66 is connected to the display controller 22. In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29G, and the nerve bundle detection unit 66 constitute a processor 32G for the apparatus main body 2G.
The nerve bundle detection unit 66 detects the nerve bundle from the three-dimensional ultrasound data generated by the three-dimensional data generation unit 25 and displays, for example, as shown in
From the above, with the ultrasound diagnostic apparatus of Embodiment 7, the nerve bundle detection unit 66 detects the nerve bundle N from the three-dimensional ultrasound data generated by the three-dimensional data generation unit 25 and displays the nerve bundle N on the monitor 23. Therefore, the user can easily grasp the position of the nerve bundle N located in the vicinity of the blood vessel B and can select an appropriate puncture position in the blood vessel B so as to avoid the nerve bundle N.
The ultrasound diagnostic apparatus of Embodiment 7 has a configuration in which the nerve bundle detection unit 66 is added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration may be employed in which the nerve bundle detection unit 66 is added to the ultrasound diagnostic apparatuses of Embodiments 2 to 6. Even in this case as well, the user can easily grasp the position of the nerve bundle N located in the vicinity of the blood vessel B and can select an appropriate puncture position in the blood vessel B so as to avoid the nerve bundle N.
The ultrasound diagnostic apparatus of the embodiment of the present invention can automatically compare, for example, an observation result of the blood vessel B in a past examination with an observation result of the blood vessel B in a current examination such that the user can easily grasp the abnormal location of the blood vessel B of the subject.
In the apparatus main body 2H, the data memory 67 is connected to the image generation unit 21, the blood vessel image generation unit 28, and the main body controller 29H. The blood vessel size calculation unit 68 is connected to the blood vessel detection unit 26 and the main body controller 29H. The blood vessel size calculation unit 68 is connected to the data memory 67. The abnormal location detection unit 69 is connected to the data memory 67, the blood vessel size calculation unit 68, and the main body controller 29H. The abnormal location detection unit 69 is connected to the display controller 22. The association unit 70 is connected to the blood vessel image generation unit 28, the data memory 67, and the main body controller 29H. The finding location display unit 71 is connected to the association unit 70 and the main body controller 29H. The finding location display unit 71 is connected to the display controller 22. The past image display unit 72 is connected to the data memory 67, the finding location display unit 71, and the main body controller 29H. The past image display unit 72 is connected to the display controller 22. The similar image display unit 73 is connected to the image generation unit 21, the data memory 67, and the main body controller 29H. The similar image display unit 73 is connected to the display controller 22.
In addition, the image generation unit 21, the display controller 22, the optical image analysis unit 24, the three-dimensional data generation unit 25, the blood vessel detection unit 26, the depth detection unit 27, the blood vessel image generation unit 28, the main body controller 29H, the blood vessel size calculation unit 68, the abnormal location detection unit 69, the association unit 70, the finding location display unit 71, the past image display unit 72, and the similar image display unit 73 constitute a processor 32H for the apparatus main body 2H.
The blood vessel size calculation unit 68 calculates the thickness of the blood vessel wall or the diameter of the blood vessel lumen over the entire length of the blood vessel B detected by the blood vessel detection unit 26, based on the blood vessel wall and the blood vessel lumen detected by the blood vessel detection unit 26, in the same manner as in the blood vessel size calculation unit 61 in Embodiment 3.
The data memory 67 stores the ultrasound image U generated by the image generation unit 21, the thickness of the blood vessel wall or the diameter of the blood vessel lumen calculated by the blood vessel size calculation unit 68, the blood vessel image C generated by the blood vessel image generation unit 28, a finding location in the blood vessel B input by the user via the input device 30 in relation to the blood vessel image C, and a finding content regarding the finding location, in relation to each other. The data memory 67 also stores a past ultrasound image U, a past blood vessel image C, a past finding location in the blood vessel B of the subject, and a finding content regarding the finding location.
Here, as the data memory 67, for example, recording media such as a flash memory, a hard disk drive (HDD), a solid state drive (SSD), a flexible disk (FD), a magneto-optical disk (MO disk), a magnetic tape (MT), a random access memory (RAM), a compact disc (CD), a digital versatile disc (DVD), a secure digital card (SD card), or a universal serial bus memory (USB memory) can be used.
The abnormal location detection unit 69 has a predetermined lumen diameter threshold value for a difference between a diameter of the blood vessel lumen in the past blood vessel image C, that is, a diameter of the blood vessel lumen stored in relation to the past blood vessel image C in the data memory 67, and a diameter of the blood vessel lumen in the current blood vessel image C, that is, a diameter of the blood vessel lumen calculated by the blood vessel size calculation unit 68 and related to the current blood vessel image C, and detects a location where the difference exceeds the lumen diameter threshold value.
Here, in a case where abnormalities, such as thickening of the blood vessel wall or stenosis, occur in the blood vessel B of the subject, a change in the diameter of the blood vessel lumen occurs. Therefore, at the same position of the blood vessel B, a location where the difference between the diameter of the past blood vessel lumen and the diameter of the current blood vessel lumen exceeds the lumen diameter threshold value can be regarded as the abnormal location where abnormalities, such as thickening of the blood vessel wall and stenosis, have occurred.
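A minimal sketch of this comparison, assuming the past and current lumen diameter profiles have already been associated position by position (see the association unit 70 described below) and using a hypothetical lumen diameter threshold value, is shown below.

```python
import numpy as np

def compare_with_past(past_diameter_mm, current_diameter_mm, lumen_diameter_threshold_mm=1.0):
    """Illustrative past/current comparison: at each corresponding position
    of the blood vessel B, the difference between the past and current lumen
    diameters is compared with the lumen diameter threshold value."""
    past = np.asarray(past_diameter_mm, dtype=float)
    current = np.asarray(current_diameter_mm, dtype=float)
    diff = np.abs(current - past)                              # change in lumen diameter at the same position
    return np.nonzero(diff > lumen_diameter_threshold_mm)[0]   # locations to display as abnormal
```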
The abnormal location detection unit 69 can, for example, as shown in
The association unit 70 associates the position of the past blood vessel image C stored in the data memory 67 with the position of the current blood vessel image C generated by the blood vessel image generation unit 28. In this case, the association unit 70 associates the past finding location and the past finding content, which are stored in relation to the past blood vessel image C in the data memory 67, with the current blood vessel image C together with the past blood vessel image C.
As shown in
The user can easily select an appropriate puncture position by confirming the past finding location displayed on the monitor 23 in this manner.
The past image display unit 72 displays the past finding content and the past ultrasound image U corresponding to the current location, which is displayed on the monitor 23 by the finding location display unit 71 and designated by the user via the input device 30, on the monitor 23. As shown in
The past image display unit 72 can display the past ultrasound image U corresponding to the current location designated by the user on the monitor 23 by referring to the data memory 67, but, for example, in a case where an ultrasound image U representing the major axis image or the like of the blood vessel B reconstructed based on the three-dimensional ultrasound data generated by the three-dimensional data generation unit 25 is stored in the data memory 67, the reconstructed ultrasound image U can also be displayed on the monitor 23.
The user can grasp the state of the blood vessel B of the subject in more detail by confirming the past finding content and the past ultrasound image U displayed on the monitor 23 by the past image display unit 72 and can select an appropriate puncture position.
The similar image display unit 73 refers to the data memory 67 to search the plurality of frames of the ultrasound images U acquired in the current examination for an ultrasound image U similar to the past ultrasound image U displayed by the past image display unit 72 and displays the ultrasound image U on the monitor 23. The user can use a temporal change in the past finding location in selecting an appropriate puncture position as a reference by confirming the past ultrasound image U displayed by the past image display unit 72 and the current ultrasound image U displayed by the similar image display unit 73.
From the above, with the ultrasound diagnostic apparatus of Embodiment 8, the observation result of the blood vessel B in the past examination and the observation result of the blood vessel B in the current examination are automatically compared by the abnormal location detection unit 69, the finding location display unit 71, the past image display unit 72, and the similar image display unit 73, and the result is displayed on the monitor 23. Therefore, the user can easily select an appropriate puncture position by confirming the comparison result displayed on the monitor 23.
In addition, the ultrasound diagnostic apparatus of Embodiment 8 has a configuration in which the data memory 67, the blood vessel size calculation unit 68, the abnormal location detection unit 69, the association unit 70, the finding location display unit 71, the past image display unit 72, and the similar image display unit 73 are added to the ultrasound diagnostic apparatus of Embodiment 1, but a configuration can also be employed in which the data memory 67, the blood vessel size calculation unit 68, the abnormal location detection unit 69, the association unit 70, the finding location display unit 71, the past image display unit 72, and the similar image display unit 73 are added to the ultrasound diagnostic apparatuses of Embodiments 2 and 4 to 7, and a configuration can also be employed in which the data memory 67, the association unit 70, the finding location display unit 71, the past image display unit 72, and the similar image display unit 73 are added to the ultrasound diagnostic apparatus of Embodiment 3. Even in these cases as well, the user can easily select an appropriate puncture position by confirming the comparison result displayed on the monitor 23.