The disclosure relates to an imaging system, and more particularly to an ultrasonic imaging system.
Ultrasound imaging is now widely used clinically for tissue diagnosis. The crystal of a conventional ultrasonic diagnostic probe can be cut linearly into a one-dimensional (1D) array, so directional electronic phase focusing can be performed to create a two-dimensional (2D) sectional image (ultrasonic image).
Since conventional ultrasound imaging can only generate 2D ultrasonic images, one conventional approach to obtaining a three-dimensional (3D) ultrasonic image moves an ultrasonic probe over the target to acquire multiple sectional images corresponding to different locations in sequence, and then performs numerical operations on the acquired sectional images to construct the 3D ultrasonic image. Alternatively, an array ultrasonic probe with 2D cutting may be used to acquire the sectional images corresponding to different locations by exciting the probe elements row by row. However, the probe used in the first approach may be expensive because of the high complexity of its mechanical design, and the probe used in the second approach may be even more expensive.
Obtaining 3D anatomical information is critical for clinical interventional judgment. In this disclosure, two possible approaches to providing 3D anatomical information for image-guided intervention are proposed. The first is to obtain 3D anatomical information via real-time reconstruction of 3D ultrasonic images. The second is to superimpose a 2D real-time ultrasonic image onto a high-resolution 3D medical image.
Therefore, an object of the disclosure is to provide an ultrasonic imaging system that is used to construct a 3D ultrasonic image.
According to the disclosure, the ultrasonic imaging system includes an ultrasonic probe and a processing unit electrically coupled to the ultrasonic probe. The ultrasonic probe is operable at multiple different tilt angles that are defined by coplanar lines to send ultrasonic signals into a test target and to receive reflected ultrasonic signals corresponding to the ultrasonic signals from the test target. The processing unit controls the ultrasonic probe to send the ultrasonic signals and to receive the reflected ultrasonic signals, and is configured to generate a plurality of 2D ultrasonic images that respectively correspond to the different tilt angles based on the reflected ultrasonic signals, and to generate a 3D ultrasonic image based on the 2D ultrasonic images and the different tilt angles.
Another object of the disclosure is to provide an ultrasonic imaging system that can construct a 3D ultrasonic image and superimpose the constructed 3D ultrasonic image onto a 3D medical image.
According to the disclosure, the ultrasonic imaging system includes an ultrasonic probe, a processing unit electrically coupled to the ultrasonic probe, a first pattern fixed on the ultrasonic probe, a second pattern to be disposed on the test target, a storage unit electrically coupled to the processing unit, an image capturing unit electrically coupled to the processing unit, and a display unit electrically coupled to the processing unit. The ultrasonic probe is operable at multiple different tilt angles that are defined by coplanar lines to send ultrasonic signals into a test target and to receive reflected ultrasonic signals corresponding to the ultrasonic signals from the test target. The processing unit controls the ultrasonic probe to send the ultrasonic signals and to receive the reflected ultrasonic signals, and is configured to generate a plurality of 2D ultrasonic images that respectively correspond to the different tilt angles based on the reflected ultrasonic signals, and to generate a 3D ultrasonic image based on the 2D ultrasonic images and the different tilt angles. The second pattern has a predefined fixed positional relationship with the test target. The storage unit stores a 3D image related to the test target, a first positional relationship between the first pattern and each of the 2D ultrasonic images, and a second positional relationship between the second pattern and the test target. The image capturing unit is disposed to capture images of the test target, the first pattern and the second pattern in a real time manner. The processing unit is further configured to obtain a first spatial position-orientation of the first pattern based on the first pattern in the images captured by the image capturing unit, and to acquire a spatial location of the 3D ultrasonic image based on the first positional relationship and the first spatial position-orientation. The processing unit is further configured to obtain a second spatial position-orientation of the second pattern based on the second pattern in the images captured by the image capturing unit, and to acquire a spatial location of the test target based on the second positional relationship and the second spatial position-orientation. The processing unit is further configured to superimpose the 3D ultrasonic image and the 3D image stored in the storage unit together based on the spatial location of the 3D ultrasonic image and the spatial location of the test target.
Yet another object of the disclosure is to provide an ultrasonic imaging system that can superimpose a 2D ultrasonic image onto a 3D medical image.
According to the disclosure, the ultrasonic imaging system includes an ultrasonic probe, a processing unit electrically coupled to the ultrasonic probe, a first pattern fixed on the ultrasonic probe, a second pattern to be disposed on the test target, a storage unit electrically coupled to the processing unit, an image capturing unit electrically coupled to the processing unit, and a display unit electrically coupled to the processing unit. The ultrasonic probe is operable to send ultrasonic signals into a test target and to receive reflected ultrasonic signals corresponding to the ultrasonic signals from the test target. The processing unit controls the ultrasonic probe to send the ultrasonic signals and to receive the reflected ultrasonic signals, and is configured to generate a 2D ultrasonic image based on the reflected ultrasonic signals. The second pattern has a predefined fixed positional relationship with the test target. The storage unit stores a 3D image related to the test target, a first positional relationship between the first pattern and the 2D ultrasonic image, and a second positional relationship between the second pattern and the test target. The image capturing unit is disposed to capture images of the test target, the first pattern and the second pattern in a real time manner. The processing unit is further configured to obtain a first spatial position-orientation of the first pattern based on the first pattern in the images captured by the image capturing unit, and to acquire a spatial location of the 2D ultrasonic image based on the first positional relationship and the first spatial position-orientation. The processing unit is further configured to obtain a second spatial position-orientation of the second pattern based on the second pattern in the images captured by the image capturing unit, and to acquire a spatial location of the test target based on the second positional relationship and the second spatial position-orientation. The processing unit is further configured to superimpose the 2D ultrasonic image and the 3D image stored in the storage unit together based on the spatial location of the 2D ultrasonic image and the spatial location of the test target.
Other features and advantages of the disclosure will become apparent in the following detailed description of the embodiment(s) with reference to the accompanying drawings, of which:
Before the disclosure is described in greater detail, it should be noted that where considered appropriate, reference numerals or terminal portions of reference numerals have been repeated among the figures to indicate corresponding or analogous elements, which may optionally have similar characteristics.
Referring to the drawings, the first embodiment of the ultrasonic imaging system according to this disclosure is shown to include an ultrasonic probe 1, an inertial measurement unit (IMU) 2, a processing unit 3 and a display unit 4, and is operable to scan a test target.
The ultrasonic probe 1 may be a conventional ultrasonic probe, and is operable at multiple different tilt angles that are defined by coplanar lines to send ultrasonic signals into the test target and to receive reflected ultrasonic signals corresponding to the ultrasonic signals from the test target. It should be noted that the ultrasonic probe 1 may be held in a user's hand to operate at different tilt angles in some embodiments, or may be operated using a special mechanical device to change among the different tilt angles more steadily in other embodiments.
The IMU 2 is mounted to the ultrasonic probe 1 in such a way that the IMU 2 tilts at a same angle as the ultrasonic probe 1, and is configured to detect acceleration components respectively corresponding to three axial directions that are defined with respect to the IMU 2. In this embodiment, the acceleration components include a first acceleration component, a second acceleration component, and a third acceleration component that respectively correspond to a first axial direction, a second axial direction, and a third axial direction that are perpendicular to each other. The tilt angle is defined to be an angle between the third axial direction and a direction of the gravitational acceleration, and can be anywhere between −90° and 90°. In this embodiment, when the tilt angle is 0°, the third axial direction is parallel to the normal vector 91, but this disclosure is not limited in this respect. The tilt angle, the gravitational acceleration, and the acceleration components have the following relationships:

G=√(A1²+A2²+A3²)  (1)

φ=sin−1(A1/G)  (2)

where G represents a magnitude of the gravitational acceleration, A1 represents a magnitude of the first acceleration component, A2 represents a magnitude of the second acceleration component, A3 represents a magnitude of the third acceleration component, and φ represents the tilt angle. Equation (2) assumes that the ultrasonic probe 1 tilts within a plane containing the first axial direction, so that the sign of A1 determines the sign of the tilt angle. In other words, the tilt angle of the ultrasonic probe 1 can be calculated using the equations (1) and (2).
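As a minimal illustrative sketch of how equations (1) and (2) could be evaluated in software (assuming the IMU reports signed components in m/s² while the probe is held still, so that gravity dominates the measurement; the function name and the clamped arcsine are illustrative choices, not prescribed by the disclosure):

```python
import math

def tilt_angle(a1: float, a2: float, a3: float) -> float:
    """Estimate the probe tilt angle, in degrees, from one static IMU reading.

    a1, a2, a3: signed acceleration components along the IMU's three axes.
    Returns a value in [-90, 90], per equations (1) and (2).
    """
    g = math.sqrt(a1 ** 2 + a2 ** 2 + a3 ** 2)          # equation (1)
    ratio = max(-1.0, min(1.0, a1 / g))                  # guard against noise
    return math.degrees(math.asin(ratio))                # equation (2)

# Example: about a quarter of gravity projects onto the first axis.
print(tilt_angle(2.45, 0.0, 9.49))   # ~14.5 degrees
```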
The processing unit 3 may be a processor of a computer, a digital signal processor (DSP), or any other kind of processing chip having computational capability, but this disclosure is not limited in this respect. The processing unit 3 is electrically coupled to the ultrasonic probe 1 and the IMU 2. When the ultrasonic probe 1 is in operation, the processing unit 3 receives the acceleration components detected by the IMU 2, controls the ultrasonic probe 1 to send the ultrasonic signals and to receive the reflected ultrasonic signals, and then generates a 2D ultrasonic image based on the reflected ultrasonic signals thus received. The 2D ultrasonic image may be a brightness mode (B-Mode) image that is obtainable using a conventional ultrasonic probe, and corresponds to a tilt angle the ultrasonic probe 1 was at when the 2D ultrasonic image was generated. Therefore, the processing unit 3 would generate a plurality of 2D ultrasonic images respectively corresponding to multiple different tilt angles based on the reflected ultrasonic signals received thereby when the ultrasonic probe 1 changes among these different tilt angles during operation. Subsequently, the processing unit 3 calculates, for each of the 2D ultrasonic images, the corresponding tilt angle based on the acceleration components received when the ultrasonic probe 1 was at the corresponding tilt angle (or when the 2D ultrasonic image was generated), and generates a 3D ultrasonic image based on the 2D ultrasonic images and the corresponding tilt angles thus calculated. It is noted that, in some embodiments, it may be the IMU 2 that calculates the tilt angle, and this disclosure is not limited in this respect.
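The per-frame pairing described above can be sketched as follows, reusing tilt_angle from the previous example; the helper name collect_tagged_frames and the list-based interfaces are hypothetical stand-ins for the probe and IMU drivers:

```python
from typing import List, Tuple
import numpy as np

def collect_tagged_frames(
    images: List[np.ndarray],
    accel_samples: List[Tuple[float, float, float]],
) -> List[Tuple[np.ndarray, float]]:
    """Pair each 2D B-mode frame with the tilt angle computed from the
    IMU sample captured when that frame was generated, so that the 3D
    reconstruction step can place every frame at its corresponding angle."""
    return [
        (image, tilt_angle(a1, a2, a3))
        for image, (a1, a2, a3) in zip(images, accel_samples)
    ]
```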
Referring to the drawings, during operation, the ultrasonic probe 1 is tilted about a rotation center relative to the test target, with the tilt angle varying between a greatest positive tilt angle φmax and a greatest negative tilt angle φmin. Each of the 2D ultrasonic images extends from a distance R to a distance R+h away from the rotation center, where h is the maximum height of each of the 2D ultrasonic images.
For ease of calculation, in this embodiment, the greatest positive tilt angle φmax and the greatest negative tilt angle φmin may have the same magnitude but opposite signs. For example, in a case where the greatest positive tilt angle is 60 degrees, the greatest negative tilt angle would be −60 degrees, but this disclosure is not limited thereto. In other cases, the greatest positive tilt angle can be about 90 degrees, and the greatest negative tilt angle can be about −90 degrees.
A maximum width (denoted as W in the drawings) of the 3D ultrasonic image corresponds to the maximum width of each of the 2D ultrasonic images, and a maximum height and a maximum length of the 3D ultrasonic image satisfy the following relationships:
H=h+R(1−sin(φcri)) (3)
L=2(h+R)|cos(φcri)| (4)
where h represents a maximum height of each of the 2D ultrasonic images, H represents a maximum height of the 3D ultrasonic image, L represents a maximum length of the 3D ultrasonic image, and φcri represents the absolute value of the one of the tilt angles (respectively corresponding to the 2D ultrasonic images) that has the greatest magnitude.
Each of the 2D ultrasonic images corresponds to a respective 2D coordinate system which is defined by an x-axis and a y-axis, and in which the maximum width of the 2D ultrasonic image refers to the maximum width of the 2D ultrasonic image in a direction of the x-axis, and the maximum height of the 2D ultrasonic image refers to the maximum height of the 2D ultrasonic image in a direction of the y-axis. The 3D ultrasonic image corresponds to a 3D coordinate system which is defined by an X-axis, a Y-axis and a Z-axis. As exemplified in the drawings, each of the 2D ultrasonic images can be mapped into the 3D coordinate system based on the corresponding tilt angle, so as to construct the 3D ultrasonic image.
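The geometry can be sketched as follows: volume_extents implements equations (3) and (4) directly, while map_pixel_to_3d places a pixel of one 2D ultrasonic image into the 3D coordinate system. The frame conventions in map_pixel_to_3d (y-axis pointing away from the probe face, rotation center a distance R beyond the top edge of the image, tilt as a rotation about the X-axis) are illustrative assumptions of this sketch, not fixed by the text above.

```python
import math
import numpy as np

def volume_extents(h: float, R: float, phi_cri_deg: float):
    """Maximum height H and maximum length L of the 3D ultrasonic image,
    per equations (3) and (4); W equals the width of the 2D images."""
    phi = math.radians(phi_cri_deg)
    H = h + R * (1.0 - math.sin(phi))        # equation (3)
    L = 2.0 * (h + R) * abs(math.cos(phi))   # equation (4)
    return H, L

def map_pixel_to_3d(x: float, y: float, phi_deg: float, R: float) -> np.ndarray:
    """Map a point (x, y) of a 2D image acquired at tilt angle phi_deg into
    the 3D coordinate system, under the frame assumptions stated above."""
    phi = math.radians(phi_deg)
    r = R + y                      # radial distance from the rotation center
    Y = r * math.sin(phi)          # lateral displacement caused by the tilt
    Z = r * math.cos(phi)          # depth below the rotation center
    return np.array([x, Y, Z])     # X coincides with the image's x-axis

# Example: h = 40 mm, R = 10 mm, fan half-angle 60 degrees.
print(volume_extents(40.0, 10.0, 60.0))   # ~ (41.3, 50.0)
```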
In other embodiments, the ultrasonic imaging system may acquire the tilt angle of the ultrasonic probe in ways other than using the IMU. For example, the ultrasonic imaging system may include a camera, and the ultrasonic probe may be provided with a barcode or a specific pattern thereon. Image recognition techniques may be applied to an image of the barcode or the specific pattern captured by the camera in order to obtain Euler angles of the ultrasonic probe, and then to acquire a corresponding tilt angle accordingly. In another example, the ultrasonic imaging system may include two cameras, and use an angular difference between the cameras to construct a location of the ultrasonic probe in the 3D space, thereby obtaining the Euler angles and the tilt angle of the ultrasonic probe. In yet another example, the ultrasonic imaging system may include an electromagnetic tracker that uses magnetic induction to identify three-dimensional directions, so as to obtain the Euler angles and the tilt angle of the ultrasonic probe.
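As one concrete possibility for the single-camera variant (the disclosure names no particular library or marker type), the sketch below uses OpenCV's ArUco module from opencv-contrib-python; the marker dictionary and the Z-Y-X Euler convention are illustrative choices:

```python
import cv2
import numpy as np

def probe_euler_angles(frame, camera_matrix, dist_coeffs, marker_len_m):
    """Estimate Euler angles (degrees) of a marker attached to the probe
    from one camera frame. Uses the classic ArUco functions; newer OpenCV
    releases may require the cv2.aruco.ArucoDetector class instead."""
    aruco = cv2.aruco
    dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
    corners, ids, _ = aruco.detectMarkers(frame, dictionary)
    if ids is None:
        return None                          # no marker found in this frame
    rvecs, tvecs, _ = aruco.estimatePoseSingleMarkers(
        corners, marker_len_m, camera_matrix, dist_coeffs)
    rot, _ = cv2.Rodrigues(rvecs[0])         # rotation vector -> 3x3 matrix
    # Z-Y-X Euler angles extracted from the rotation matrix.
    yaw = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))
    pitch = np.degrees(np.arcsin(-rot[2, 0]))
    roll = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))
    return yaw, pitch, roll
```

The tilt angle would then be taken from whichever Euler angle corresponds to the probe's tilting plane, depending on how the marker is mounted.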
The display unit 4 is exemplified as a screen that is electrically coupled to the processing unit 3 for displaying the 3D ultrasonic image, or for displaying the 3D ultrasonic image and the 2D ultrasonic images simultaneously. In some embodiments, the processing unit 3 may be capable of generating a sectional image by taking a sectional view of the 3D ultrasonic image in any desired direction, of performing image processing on the sectional image, and of causing the display unit 4 to display the sectional image and the result of the image processing at the same time.
The processing unit 3 may perform image processing on the sectional image to generate functional images of, for example, entropy-based imaging, Doppler imaging, strain imaging, Nakagami imaging, and so on. The functional images of Doppler imaging may show blood flow. The functional images of strain imaging may be provided for Young's modulus measurement to identify elasticity of tissue. The functional images of entropy-based imaging or Nakagami imaging may provide analysis of regularity in structural arrangement of tissue. The processing unit 3 can cause the display unit 4 to simultaneously display the sectional image and at least one of the functional images, the 3D ultrasonic image and the 2D ultrasonic images, thereby providing various different ultrasound-based images for inspection by medical professionals.
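As an illustration of one common formulation of entropy-based imaging (the disclosure does not spell out its exact estimator), the sketch below computes a sliding-window Shannon entropy map over a B-mode envelope image; the window size and bin count are arbitrary illustrative parameters:

```python
import numpy as np

def entropy_map(envelope: np.ndarray, win: int = 9, bins: int = 32) -> np.ndarray:
    """Sliding-window Shannon entropy of a B-mode envelope image.

    Higher entropy indicates a less regular structural arrangement of
    tissue within the window."""
    half = win // 2
    padded = np.pad(envelope.astype(float), half, mode="reflect")
    out = np.zeros(envelope.shape, dtype=float)
    lo, hi = padded.min(), padded.max()
    for i in range(envelope.shape[0]):
        for j in range(envelope.shape[1]):
            patch = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(patch, bins=bins, range=(lo, hi))
            p = hist / hist.sum()
            p = p[p > 0]                         # ignore empty bins
            out[i, j] = -np.sum(p * np.log2(p))  # Shannon entropy (bits)
    return out
```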
Referring to the drawings, the second embodiment of the ultrasonic imaging system 200 is adapted to be used with an intervention tool 10, and includes the ultrasonic probe 1, the display unit 4, a processing unit 5, a storage unit 6, an image capturing unit 7, a first pattern 81, a second pattern 82 and a third pattern 83.
The first pattern 81 is fixed on the ultrasonic probe 1.
The second pattern 82 is disposed on the test target 92 in such a way that the second pattern 82 has a predefined fixed positional relationship with the test target 92.
The third pattern 83 is disposed on the intervention tool 10.
Each of the first pattern 81, the second pattern 82 and the third pattern 83 includes one or more one-dimensional barcodes, one or more two-dimensional barcodes, or a specific pattern that is adapted for acquiring, via image recognition, a fixed normal vector (i.e., a normal vector with a fixed initial point, representing a spatial position and a spatial orientation) of the specific pattern. The fixed normal vector may carry information of spatial position, orientation, and angle in the 3D space.
In this embodiment, the first pattern 81 is exemplified to include four square two-dimensional barcodes, the second pattern 82 is exemplified to include eight coplanar two-dimensional barcodes that are disposed at two opposite sides of the test surface of the test target 92, and the third pattern 83 is exemplified to include one two-dimensional barcode that is attached to the intervention tool 10. However, these numbers and arrangements of the barcodes are merely exemplary, and this disclosure is not limited thereto.
The storage unit 6 is electrically coupled to the processing unit 5, and stores a 3D image related to the test target 92, a first positional relationship between the first pattern 81 and each of the 2D ultrasonic images, a second positional relationship between the second pattern 82 and the test target 92, and a third positional relationship between the third pattern 83 and the intervention tool 10. The 3D image has a high resolution, and may be a medical image of, for example, computerized tomography (CT), magnetic resonance imaging (MRI), etc. The first positional relationship between the first pattern 81 and each of the 2D ultrasonic images is fixed because the first pattern 81 is fixed on the ultrasonic probe 1 and moves along with the ultrasonic probe 1. The second positional relationship between the second pattern 82 and the test target 92 is fixed since the second pattern 82 is positioned on the test target 92 in a predefined manner. The third positional relationship between the third pattern 83 and the intervention tool 10 is fixed since the third pattern 83 is positioned on the intervention tool 10. Accordingly, the first positional relationship, the second positional relationship and the third positional relationship are predesigned or known parameters in this embodiment.
The image capturing unit 7 (e.g., a digital camera) is electrically coupled to the processing unit 5, and is disposed to capture images of the test target 92, the first pattern 81, the second pattern 82 and the third pattern 83 in a real time manner. That is, the test target 92, the first pattern 81, the second pattern 82 and the third pattern 83 are all covered by a field of view of the image capturing unit 7. In this embodiment, the image capturing unit 7 is mounted to the ultrasonic probe 1, but this is not essential for this embodiment as long as the image captured by the image capturing unit 7 can include the test target 92, the first pattern 81 and the second pattern 82 at the same time. For example, the image capturing unit 7 can be mounted to the test target 92 or the intervention tool 10 in other embodiments. A number of lenses of the image capturing unit 7 is determined in view of the image recognition and analysis techniques used, so as to ensure successful identification of a position and an orientation of the first pattern 81 (referred to as first spatial position-orientation hereinafter, and denoted as a fixed normal vector (V1) of a plane corresponding to the first pattern 81), of a position and an orientation of the second pattern 82 (referred to as second spatial position-orientation hereinafter, and denoted as a fixed normal vector (V2)), and of a position and an orientation of the third pattern 83 (referred to as third spatial position-orientation hereinafter, and denoted as a fixed normal vector (V3)).
The processing unit 5 obtains the first spatial position-orientation (V1) of the first pattern 81 based on the first pattern 81 in images captured by the image capturing unit 7, obtains the second spatial position-orientation (V2) of the second pattern 82 based on the second pattern 82 in the images captured by the image capturing unit 7, and obtains the third spatial position-orientation (V3) of the third pattern 83 based on the third pattern 83 in the images captured by the image capturing unit 7. In one embodiment, the processing unit 5 is a part of the image capturing unit 7.
In more detail, each of the images captured by the image capturing unit 7 contains all of the two-dimensional barcodes of the plurality of patterns 81, 82, 83, and each two-dimensional barcode may include at least three identification points that are disposed at specific positions (e.g., edges, corners, the center, etc.) of the two-dimensional barcode, respectively. When the processing unit 5 successfully identifies the identification points, the processing unit 5 uses predetermined or known spatial/positional relationships among the image capturing unit 7 and the identification points to acquire positional information of each of the identification points in the 3D space, and assigns spatial coordinates to each of the identification points accordingly.
Subsequently, for each of the two-dimensional barcodes, the processing unit 5 calculates a spatial vector for any two of the identification points of the two-dimensional barcode. The at least three identification points of the two-dimensional barcode may correspond to at least two distinct spatial vectors that are coplanar with the two-dimensional barcode. The processing unit 5 then calculates a cross product of two of the at least two spatial vectors for the two-dimensional barcode, thereby acquiring a fixed normal vector for the two-dimensional barcode. In another embodiment, the processing unit 5 calculates cross products for any two of the spatial vectors for the two-dimensional barcode, and acquires an average of the cross products to obtain a representative fixed normal vector for the two-dimensional barcode. In one implementation, one of the identification points of the two-dimensional barcode may be disposed at the center of the two-dimensional barcode, so the fixed normal vector calculated based on two spatial vectors corresponding to the central one of the identification points would be located at the center of the two-dimensional barcode. In other cases, if each two-dimensional barcode is below a certain size (sufficiently small) and has at least a certain number of identification points (sufficient number of identification points), the representative fixed normal vector of the two-dimensional barcode acquired based on the average of the cross products would be close to the center of the two-dimensional barcode. The processing unit 5 calculates an average of the fixed normal vectors (or the representative fixed normal vectors) obtained for the two-dimensional barcodes of the first pattern 81 to obtain the first spatial position-orientation (V1) of the first pattern 81, calculates an average of the fixed normal vectors (or the representative fixed normal vectors) obtained for the two-dimensional barcodes of the second pattern 82 to obtain the second spatial position-orientation (V2) of the second pattern 82, and calculates an average of the fixed normal vectors (or the representative fixed normal vectors) obtained for the two-dimensional barcodes of the third pattern 83 to obtain the third spatial position-orientation (V3) of the third pattern 83.
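The cross-product procedure just described can be sketched as follows; the function operates on the 3D coordinates already assigned to the identification points of one two-dimensional barcode, and the sign-alignment step is an illustrative detail added so that averaged cross products do not cancel:

```python
from typing import Tuple
import numpy as np

def barcode_normal(points: np.ndarray) -> Tuple[np.ndarray, np.ndarray]:
    """Fixed normal vector of one two-dimensional barcode.

    points: (N, 3) array of 3D coordinates of the identification points,
    with N >= 3. Returns (initial point, unit normal): spatial vectors are
    formed between identification points, cross products of vector pairs
    are taken, and the normalized results are averaged."""
    center = points.mean(axis=0)            # initial point of the normal
    vecs = points[1:] - points[0]           # spatial vectors from point 0
    normals = []
    for a in range(len(vecs)):
        for b in range(a + 1, len(vecs)):
            n = np.cross(vecs[a], vecs[b])
            norm = np.linalg.norm(n)
            if norm < 1e-9:
                continue                    # skip near-collinear pairs
            if normals and np.dot(n, normals[0]) < 0:
                n = -n                      # keep one consistent orientation
            normals.append(n / norm)
    normal = np.mean(normals, axis=0)
    return center, normal / np.linalg.norm(normal)
```

A pattern's position-orientation (e.g., V1) can then be taken as the average of the centers and normals over all of that pattern's barcodes.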
In this embodiment, since the second spatial position-orientation (V2) is obtained based on the eight two-dimensional barcodes, each of which has a set of known spatial coordinates, the processing unit 5 can acquire representative spatial coordinates of the first spatial position-orientation (V1) (e.g., coordinates of an initial point of the fixed normal vector (V1)) in the 3D space with the second pattern 82 serving as a positional reference.
After acquiring the first spatial position-orientation (V1) based on the first pattern 81 in the images captured by the image capturing unit 7, the processing unit 5 acquires a spatial location of a corresponding 2D ultrasonic image based on the first positional relationship and the first spatial position-orientation (V1). After acquiring the second spatial position-orientation (V2) based on the second pattern 82 in the images captured by the image capturing unit 7, the processing unit 5 acquires a spatial location of the test target 92 based on the second positional relationship and the second spatial position-orientation (V2). After acquiring the third spatial position-orientation (V3) based on the third pattern 83 in the images captured by the image capturing unit 7, the processing unit 5 acquires a spatial location of the intervention tool 10 based on the third positional relationship and the third spatial position-orientation (V3). Subsequently, the processing unit 5 superimposes the 2D ultrasonic image and the 3D image stored in the storage unit 6 together based on the spatial location of the 2D ultrasonic image and the spatial location of the test target 92, superimposes an image of the intervention tool 10 on the 3D image based on the spatial location of the intervention tool 10 and the spatial location of the test target 92, and causes the display unit 4 that is electrically coupled to the processing unit 5 to display the resultant image.
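In homogeneous-transform terms, the superimposition just described amounts to composing the camera-frame poses recovered from the patterns with the stored positional relationships. The sketch below is a minimal illustration; the frame names and the choice of 4x4 matrices are assumptions of this example rather than requirements of the disclosure:

```python
import numpy as np

def pose_to_matrix(origin: np.ndarray, normal: np.ndarray,
                   in_plane: np.ndarray) -> np.ndarray:
    """Build a 4x4 camera-frame pose from a fixed normal vector and one
    in-plane direction of a pattern (illustrative frame construction)."""
    z = normal / np.linalg.norm(normal)
    x = in_plane - np.dot(in_plane, z) * z   # orthogonalize against z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, :3] = np.column_stack((x, y, z))
    T[:3, 3] = origin
    return T

def ultrasound_in_target_frame(T_cam_probe: np.ndarray,
                               T_probe_image: np.ndarray,
                               T_cam_target: np.ndarray) -> np.ndarray:
    """Pose of the 2D ultrasonic image in the test target's frame.

    T_cam_probe:   pose of the first pattern 81 (from V1).
    T_probe_image: the first positional relationship (pattern -> image),
                   a known, fixed transform.
    T_cam_target:  pose of the test target 92, from V2 combined with the
                   second positional relationship.
    The result places the ultrasonic image in the same frame as the
    stored 3D image, enabling the superimposition described above."""
    return np.linalg.inv(T_cam_target) @ T_cam_probe @ T_probe_image
```

The intervention tool 10 is handled the same way, substituting V3 and the third positional relationship.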
It is noted that, in some embodiments where the image of the intervention tool 10 is not required to be shown in the resultant image, the third pattern 83 may be omitted.
Furthermore, in some implementations of the second embodiment, the ultrasonic imaging system 200 may generate a 3D ultrasonic image using the method introduced in the first embodiment, and the processing unit 5 may superimpose the 3D ultrasonic image and the 3D image stored in the storage unit 6 together based on a spatial location of the 3D ultrasonic image and the spatial location of the test target 92. It is noted that since the 3D ultrasonic image is generated based on the 2D ultrasonic images obtained at multiple different tilt angles, the spatial location of the 3D ultrasonic image can be acquired based on the first positional relationship.
In summary, the first embodiment according to this disclosure uses the IMU 2 to acquire the tilt angle of the ultrasonic probe 1, so as to generate the 3D ultrasonic image based on the 2D ultrasonic images obtained at different tilt angles. The first embodiment can easily be applied to conventional mid-end and low-end ultrasonic imaging systems with low cost and low complexity. The second embodiment according to this disclosure uses the image capturing unit 7 and the preset patterns 81, 82, 83 to acquire positional relationships among the 3D medical image, the 2D/3D ultrasonic image and the intervention tool 10, so as to superimpose the 3D medical image, the 2D/3D ultrasonic image and the image of the intervention tool 10 together. The resultant image may have both the high resolution of the 3D medical image and the immediacy of the 2D/3D ultrasonic image, thereby facilitating clinical diagnosis and treatment.
In the description above, for the purposes of explanation, numerous specific details have been set forth in order to provide a thorough understanding of the embodiment(s). It will be apparent, however, to one skilled in the art, that one or more other embodiments may be practiced without some of these specific details. It should also be appreciated that reference throughout this specification to “one embodiment,” “an embodiment,” an embodiment with an indication of an ordinal number and so forth means that a particular feature, structure, or characteristic may be included in the practice of the disclosure. It should be further appreciated that in the description, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of various inventive aspects, and that one or more features or specific details from one embodiment may be practiced together with one or more features or specific details from another embodiment, where appropriate, in the practice of the disclosure.
While the disclosure has been described in connection with what is (are) considered the exemplary embodiment(s), it is understood that this disclosure is not limited to the disclosed embodiment(s) but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.
This application is a divisional patent application (Ser. No. 18/117,437) of U.S. patent application Ser. No. 16/864,530, filed in May 2020, which claims priority to Taiwanese Invention Patent Application No. 108132547, filed on Sep. 10, 2019.