This application claims priority to Korean Patent Application No. 10-2009-0124913 filed on Dec. 15, 2009 and Korean Patent Application No. 10-2010-0121158 filed on Dec. 1, 2010, the entire disclosures of which are incorporated herein by reference.
The present disclosure generally relates to ultrasound systems, and more particularly to an ultrasound system having an apparatus for selecting a slice image of a three-dimensional ultrasound image through a control volume unit and a method of selecting a two-dimensional slice image of the three-dimensional ultrasound image.
Three-dimensional ultrasound probes may acquire three-dimensional volume images (hereinafter referred to as 3D ultrasound images) by steering and triggering transducer elements to emit ultrasound signals and receiving echo ultrasound signals reflected from a target object. The representation of such a 3D ultrasound image may vary according to how the transducer elements in the 3D ultrasound probe are steered. Referring to
A cross-section of the 3D ultrasound image 10 may be obtained in the form of a two-dimensional (2D) ultrasound image (i.e., 2D slice image) at a region of interest (ROI) of a target object (not shown), which may be selected by an operator.
Referring to
For example, selecting a 2D slice image 11′ shown in
Various embodiments of an ultrasound system having an apparatus for selecting a slice image of a 3D ultrasound image and a method of selecting a 2D slice image from a 3D ultrasound image are provided. In one embodiment of the present disclosure, by way of non-limiting example, the ultrasound system comprises: a 3D ultrasound image acquisition unit configured to acquire a 3D ultrasound image of a target object; a 2D slice image selection unit including a control volume unit, the 2D slice image selection unit being configured to be rotated and/or moved by an operator; and a processor coupled to the 3D ultrasound image acquisition unit and the 2D slice image selection unit. At least one selection plane is formed by the control volume unit as a reference plane for selecting at least one 2D slice image from the 3D ultrasound image. The processor is configured to extract at least one 2D slice image corresponding to the at least one selection plane from the 3D ultrasound image. The 3D ultrasound image or the at least one selection plane is rotated and/or moved together with the control volume unit.
The at least one selection plane may be fixed while the control volume unit and the 3D ultrasound image are rotated and/or moved together. Further, the processor matches a coordinate system of the control volume unit to a coordinate system of the 3D ultrasound image to rotate and/or move the 3D ultrasound image and the control volume unit together relative to the at least one selection plane. Alternatively, the 3D ultrasound image may be fixed while the control volume unit and the at least one selection plane are rotated and/or moved together, and the processor matches a coordinate system of the control volume unit to a coordinate system of the at least one selection plane to rotate and/or move the at least one selection plane and the control volume unit together relative to the 3D ultrasound image.
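The coordinate matching described above can be illustrated with a small sketch. This is only an illustration under assumed conventions (the function names `match` and `follow` and the 3×3 rotation-matrix poses are hypothetical, not part of the disclosed system): at match time the processor records the fixed offset between the two frames, and afterwards the 3D image pose simply follows the control volume pose, while the selection plane, expressed in the fixed world frame, does not move.

```python
import numpy as np

def match(image_pose, control_pose):
    """Record the fixed offset between the two frames at match time."""
    return image_pose @ np.linalg.inv(control_pose)

def follow(offset, control_pose):
    """After matching, the image pose follows the control volume pose."""
    return offset @ control_pose

I = np.eye(3)
offset = match(image_pose=I, control_pose=I)   # matched while both are aligned
new_control = np.array([[0., -1., 0.],
                        [1.,  0., 0.],
                        [0.,  0., 1.]])        # operator turned the grip 90 degrees
new_image = follow(offset, new_control)        # the 3D image rotates identically
```

Because the selection plane is defined in the fixed world frame, rotating the matched pair sweeps different cross-sections of the volume through that plane.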
The shape of the control volume unit corresponds to the shape of the 3D ultrasound image.
The 2D slice image selection unit may include: an orientation and position recognition unit mounted on the control volume unit; a grip coupled to the control volume unit; and an operation button formed on the grip. The orientation and position recognition unit is configured to detect the rotation and/or movement of the control volume unit to form an orientation and position signal of the control volume unit. The operation button is configured to receive input data for operations of the 2D slice image selection unit from the operator.
The orientation and position recognition unit may include a sensor that is configured to detect the rotation and/or movement of the control volume unit to form detection signals. Further, the processor may be configured to generate the orientation and position signal of the control volume unit based on the detection signals.
The processor may include: a matching unit configured to match the coordinate system of the control volume unit to the coordinate system of the 3D ultrasound image; an image processing unit configured to change the orientation and position of the 3D ultrasound image or the at least one selection plane corresponding to the changed orientation and position of the control volume unit based on the orientation and position signal; and a 2D slice image extraction unit configured to extract at least one 2D slice image corresponding to the at least one selection plane from the 3D ultrasound image.
If the operation button receives input data of a first operation, then the matching unit matches the coordinate system of the control volume unit to the coordinate system of the 3D ultrasound image. If the operation button receives input data of a second operation, then the 2D slice image extraction unit extracts the at least one 2D slice image corresponding to the at least one selection plane from the 3D ultrasound image.
Further, in one embodiment of the present disclosure, the method of selecting a 2D slice image from a 3D ultrasound image comprises the following steps: a) acquiring a 3D ultrasound image of a target object; b) matching a coordinate system of the 3D ultrasound image to a coordinate system of a control volume unit configured to be moved and/or rotated by an operator, wherein at least one selection plane is formed by the control volume unit as a reference plane for selecting at least one 2D slice image from the 3D ultrasound image; c) detecting orientation and position of the control volume unit; d) rotating and/or moving the 3D ultrasound image or the at least one selection plane together with the control volume unit; and e) extracting at least one 2D slice image corresponding to the at least one selection plane from the 3D ultrasound image.
The at least one selection plane may be fixed while the control volume unit and the 3D ultrasound image are rotated and/or moved together. Alternatively, the 3D ultrasound image may be fixed while the control volume unit and the at least one selection plane are rotated and/or moved together.
A detailed description may be provided with reference to the accompanying drawings. One of ordinary skill in the art may realize that the following description is illustrative only and is not in any way limiting. Other illustrative embodiments may readily suggest themselves to such skilled persons having the benefit of this disclosure.
Referring to
The 3D ultrasound image acquisition unit 110 may be configured to transmit ultrasound signals to a target object and receive reflected ultrasound signals, i.e., ultrasound echo signals, from the target object to acquire ultrasound data thereof. The organization of the 3D ultrasound image acquisition unit 110 will be described later with reference to
Referring to
The transmit signal formation unit 111 may be configured to form transmit signals in consideration of positions and focusing points of the transducer elements. The transmit signal formation unit 111 may be configured to form the transmit signals sequentially and repeatedly. Thus, the transmit signal formation unit 111 may be configured to form the transmit signals for obtaining image frames Fi (1≤i≤N, N being an integer) as shown in
In response to the transmit signals from the transmit signal formation unit 111, the ultrasound probe 112 may be configured to convert the transmit signals into corresponding ultrasound signals and transmit them to the target object. The ultrasound probe 112 may be further configured to receive ultrasound echo signals reflected from the target object to form receive signals, which may be analog signals. In an exemplary embodiment, the ultrasound probe 112 may include at least one of a 3D mechanical probe, a 2D array probe and the like.
In response to the receive signals from the ultrasound probe 112, the beam former 113 may be configured to convert the receive signals from analog to digital to form digital signals corresponding thereto. The beam former 113 may be further configured to receive-focus the digital signals in consideration of the positions and focusing points of the transducer elements in the ultrasound probe 112 to form a receive-focus beam.
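Receive focusing of the kind performed by the beam former 113 is commonly implemented as delay-and-sum. The following is a minimal sketch under simplifying assumptions (integer per-channel delays, no apodization; the names `delay_and_sum` and `delays_samples` are hypothetical): each digitized channel is time-shifted by its focusing delay so the echoes align, and the aligned channels are summed into one receive-focused beam.

```python
import numpy as np

def delay_and_sum(rf, delays_samples):
    """Simplified delay-and-sum receive focusing.

    rf:             (n_elements, n_samples) digitized receive signals
    delays_samples: per-element focusing delays, in samples

    Each channel is advanced by its delay and the aligned channels
    are summed into a single receive-focused beam.
    """
    n_samples = rf.shape[1]
    beam = np.zeros(n_samples)
    for ch, d in zip(rf, delays_samples):
        beam += np.roll(ch, -d)   # advance the channel by its delay
    return beam

rf = np.zeros((2, 16))
rf[0, 5] = 1.0                    # echo arrives at sample 5 on channel 0
rf[1, 7] = 1.0                    # and at sample 7 on channel 1
beam = delay_and_sum(rf, delays_samples=[5, 7])
```

After the shifts, both echoes coincide at sample 0 and add coherently, which is the essence of forming a receive-focused beam.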
The ultrasound data formation unit 114 may be configured to form ultrasound data based on the receive-focus beam from the beam former 113. For example, the ultrasound data formation unit 114 may be configured to form the ultrasound data corresponding to the respective frames Fi (1≤i≤N) shown in
The volume data formation unit 115 may be configured to form volume data 210 shown in
The image formation unit 116 may be configured to render the volume data from the volume data formation unit 115 to form a 3D ultrasound image. In an exemplary embodiment, rendering of the image formation unit 116 may include ray-casting rendering, surface rendering and the like.
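As a toy illustration of the rendering step (not the disclosed renderer), a maximum-intensity projection along a volume axis is the simplest special case of ray casting: every axis-aligned "ray" keeps its brightest sample. The function name `mip_render` is hypothetical.

```python
import numpy as np

def mip_render(volume, axis=0):
    """Toy maximum-intensity projection along one volume axis.

    A full ray-casting renderer accumulates samples along arbitrary
    rays; taking the maximum along an axis is the simplest special
    case and is enough to show the idea.
    """
    return volume.max(axis=axis)

vol = np.zeros((4, 4, 4))
vol[2, 1, 3] = 7.0                 # one bright voxel in the volume
image = mip_render(vol, axis=0)    # 4x4 projection image
```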
Referring back to
Referring to
The operation button 124 may be provided on the grip 123 to receive input data from the operator. For example, the input data may include first and second input data, the first input data containing data for matching the coordinate system of the control volume unit to that of the 3D ultrasound image, and the second input data containing data for selecting a final 2D slice image.
As shown in
If the operator rotates the control volume unit 121 about the b-axis (represented in
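Rotations of the control volume unit about its three axes can be written as standard rotation matrices. In this sketch the a-, b- and c-axes are assumed (purely for illustration) to correspond to x, y and z, and the function names are hypothetical.

```python
import numpy as np

def rot_x(t):
    """Rotation about the a-axis (assumed here to be x)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]], dtype=float)

def rot_y(t):
    """Rotation about the b-axis (assumed here to be y)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]], dtype=float)

def rot_z(t):
    """Rotation about the c-axis (assumed here to be z)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=float)

# Composing two 90-degree turns, as an operator might do with the grip.
R = rot_y(np.pi / 2) @ rot_x(np.pi / 2)
```

Because matrix multiplication composes the turns in order, the combined matrix R describes the control volume's net orientation after both motions.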
In an exemplary embodiment, the control volume unit 121 may have the shape of a rectangular solid, for example. In another embodiment, the control volume unit 121 may have a fan shape similar to the shape of the 3D ultrasound image, which may have curved top and bottom portions. With the fan shape, the operator may intuitively match the 3D ultrasound image to the control volume unit 121.
The orientation and position recognition unit 122 may be mounted within the control volume unit 121 to recognize the orientation and position thereof. In an exemplary embodiment, the processor 130 may be connected to the 3D ultrasound image acquisition unit 110 and the 2D slice image selection unit 120, and the orientation and position recognition unit 122 may include a sensor (not shown) configured to detect the rotation and/or movement of the control volume unit 121 to form corresponding detection signals. The sensor may include, for example, an inertial sensor, a gyro sensor, an acceleration sensor and the like. The processor 130, which may include a micro controller unit (MCU), may form orientation and position signals for determining the orientation and/or position of the control volume unit 121 based on the detection signals from the sensor.
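As a rough sketch of how detection signals might become an orientation signal (a real MCU would typically fuse gyro and accelerometer data; this simplified, hypothetical example only integrates angular-rate samples):

```python
def integrate_gyro(rates_deg_s, dt):
    """Integrate angular-rate samples (deg/s) into an angle (deg).

    Simple rectangular integration: each sample contributes
    rate * dt to the accumulated rotation angle.
    """
    angle = 0.0
    for r in rates_deg_s:
        angle += r * dt
    return angle

# Ten samples of 90 deg/s over one second -> roughly a quarter turn.
angle = integrate_gyro([90.0] * 10, dt=0.1)
```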
The grip 123 may have the shape of a stick projecting from one surface of the control volume unit 121. Such a shape may allow the operator to grasp the grip 123 and change the orientation and/or position of the control volume unit 121 of the 2D slice image selection unit 120.
Referring back to
Referring to
In response to an orientation and position signal from the 2D slice image selection unit 120, the image processing unit 132 may be configured to change the orientation and position of the 3D ultrasound image (i.e., to move and rotate the 3D ultrasound image) so that they correspond to the orientation and position of the control volume unit 121, which are changed according to the orientation and position signal. For example, as shown in
The operation button 124 may be provided on the grip 123 to activate operations. In an exemplary embodiment, the operations may include a first operation defined as the first input data and a second operation defined as the second input data. For example, when the operation button 124 is activated in response to the first operation (i.e., the first input data is provided), the matching unit 131 may match the coordinate system of the control volume unit 121 to that of the 3D ultrasound image 10. When the operation button 124 is activated in response to the second operation (i.e., the second input data is provided), the 2D slice image extraction unit 133 may extract a 2D slice image on a slice corresponding to the fixed selection plane 121a from the 3D ultrasound image 10.
Referring back to
Referring to
When a first operation of the operation button 124 is activated (i.e., the first input data is provided from the operator) (S104), the matching unit 131 of the processor 130 may match the coordinate system of the control volume unit 121 to that of the 3D ultrasound image (S106). The 2D slice image extraction unit 133 may extract a first 2D slice image corresponding to the selection plane 121a from the 3D ultrasound image of which the coordinate system is matched to that of the control volume unit 121 (S108). The display unit 140 may display the matched 3D ultrasound image and the first 2D slice image (S110).
The processor 130 may determine whether the operator has rotated and/or moved the control volume unit 121 based on the detection signals from the orientation and position recognition unit 122 (S112). If it is determined that the control volume unit 121 has been moved and/or rotated, then the processor 130 may form an orientation and position signal representing the amount of rotation and/or movement of the control volume unit 121 (S114). Otherwise, if it is determined that the control volume unit 121 has not been moved and/or rotated, then the processor 130 may not form the orientation and position signal.
In response to the orientation and position signal, the image processing unit 132 may change the orientation and position of the 3D ultrasound image to match the changed orientation and position of the control volume unit 121 (S116).
When a second operation of the operation button 124 is activated (i.e., the second input data is provided from the operator) (S118), the 2D slice image extraction unit 133 may extract a final 2D slice image corresponding to the selection plane 121a from the rotated and/or moved 3D ultrasound image (S120). Thereafter, the display unit 140 may display the final 2D slice image (S122).
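The extraction step can be sketched as sampling the volume on the selection plane. In this minimal, hypothetical example the selection plane is assumed to be axis-aligned with the (already rotated) volume, so extraction reduces to indexing; for an oblique plane one would instead resample the volume along the plane, e.g. with trilinear interpolation.

```python
import numpy as np

def extract_slice(volume, k):
    """Extract the 2D slice at depth index k (nearest-neighbor).

    Assumes the fixed selection plane is axis-aligned with the
    volume; oblique planes would require interpolated resampling.
    """
    return volume[:, :, k]

vol = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
sl = extract_slice(vol, k=1)       # the 2D slice image at depth 1
```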
Although matching the coordinate system of the control volume unit 121 to that of the 3D ultrasound image 10 is described in the exemplary embodiment above, it may alternatively be possible that the coordinate system of the 3D ultrasound image 10 shown in
In this embodiment, in response to the orientation and position signal, the processor 130 may change the orientation and position of the selection plane 121a′ as shown in
According to the embodiments described above, a single selection plane is used to select a 2D slice image. However, there may be a plurality of selection planes to select a plurality of 2D slice images at one time.
According to the present disclosure, the operator can change the orientation and position of the 3D ultrasound image by changing the orientation and/or position of the control volume unit. Thus, the operator can easily select a slice to be displayed by using the control volume unit without performing any complex button operations.
Although the present disclosure has been described with reference to a number of illustrative embodiments, it should be understood that various other modifications and embodiments can be devised by those skilled in the art that will fall within the spirit and scope of the principles of this disclosure. More particularly, numerous variations and modifications are possible in the component parts and/or arrangements of the subject combination arrangement within the scope of the disclosure, the drawings and the appended claims. In addition to variations and modifications in the component parts and/or arrangements, alternative uses will also be apparent to those skilled in the art.