Embodiments described herein relate generally to an ultrasonic diagnostic apparatus.
In recent years, in medical image diagnosis, alignment between sets of three-dimensional image data acquired by medical image diagnostic apparatuses (an X-ray computed tomography apparatus, a magnetic resonance imaging apparatus, an ultrasonic diagnostic apparatus, an X-ray diagnostic apparatus, a nuclear medical diagnostic apparatus, etc.) has been performed by various methods.
For example, alignment between three-dimensional (3D) ultrasonic image data and other three-dimensional (3D) medical image data is performed by acquiring, with use of an ultrasonic probe to which a position sensor is attached, three-dimensional image data to which position information is added, and by using this position information together with the position information added to the other 3D medical image data.
Besides, alignment between three-dimensional CT (Computed Tomography) image data and three-dimensional MR (magnetic resonance) image data is performed by analyzing the respective image data, specifying a region which functions as a landmark, and making the specified regions correspond to each other.
The conventional methods of alignment between 3D ultrasonic image data and 3D medical image data (three-dimensional CT or MR image data acquired by a medical image diagnostic apparatus) have the following problems.
To begin with, the alignment operation with a CT or MR image has to be performed manually with the ultrasonic probe. A displacement therefore occurs mainly in the angular components, and the alignment precision over the entirety of a region-of-interest tends to be lowered. In addition, performing the alignment by finding, in the 3D ultrasonic image data, a structure common to the CT image or MR image depends on the user's skill, and thus the precision of the alignment varies. Moreover, a tissue, a blood vessel, or blood appears differently in the CT image or MR image than in the ultrasonic image, and, in the case of ultrasonic imaging, a structure beyond a gas or a deep portion of a bone cannot be viewed. Furthermore, a 3D ultrasonic image covers a very small volume region compared to CT or MR, so only a part of the structure is included in the ultrasonic image.
In CT or MR, the direction of an image is kept constant by the bed. The direction of an image in 3D ultrasonic image data, however, varies freely depending on how the ultrasonic probe is applied. Thus, in the alignment with the CT image or MR image, both the positional displacement and the angular displacement increase, and a wide search range has to be set for the alignment. If the search range is set to be large, however, the alignment is highly likely to be trapped at a local optimal point and fail, and the success rate decreases. Accordingly, it is difficult to perform image alignment between the CT or MR image and the ultrasonic image. In research organizations and in ultrasonic diagnostic apparatuses, attempts have been made to perform such image alignment, but these attempts have been unsuccessful, and quality sufficient for practical use has not been secured. In ultrasonic diagnostic apparatuses, diagnosis is mostly conducted with two-dimensional tomographic images, and 3D ultrasonic image data is scarce; this also hinders alignment between the CT or MR image and the ultrasonic image. Furthermore, when alignment between sets of 3D ultrasonic image data is considered, the alignment becomes alignment between small volumes with a large degree of freedom in position and direction, and it is difficult to secure overlap between the data; a small overlap means that few common structures are included. Image alignment between sets of 3D ultrasonic image data has therefore not been widely researched and has not been put to practical use.
From the above points, the success rate of image alignment between 3D ultrasonic image data and 3D medical image data by the conventional methods is low, and such image alignment cannot be said to be practical.
In general, according to one embodiment, an ultrasonic diagnostic apparatus includes processing circuitry. The processing circuitry acquires position information relating to an ultrasonic probe and an ultrasonic image. The processing circuitry acquires ultrasonic image data which is obtained by transmission and reception of ultrasonic waves from the ultrasonic probe at a position where the position information is acquired, the ultrasonic image data being associated with the position information. The processing circuitry executes associating between a first coordinate system relating to the position information and a second coordinate system relating to medical image data. The processing circuitry executes image alignment between an ultrasonic image based on the associated ultrasonic image data and a medical image based on the medical image data.
Hereinafter, an ultrasonic diagnostic apparatus and an ultrasonic diagnosis support program according to embodiments will be described with reference to the accompanying drawings. In the embodiments to be described below, it is assumed that the parts denoted by like reference numerals perform the same operations, and overlapping descriptions will be omitted as needed.
The position sensor system 30 is a system for acquiring three-dimensional position information of the ultrasonic probe 70 and an ultrasonic image. The position sensor system 30 includes a position sensor 31 and a position detection device 32.
The position sensor system 30 acquires three-dimensional position information of the ultrasonic probe 70 by attaching, for example, a magnetic sensor, an infrared sensor or a target for an infrared camera, as the position sensor 31 to the ultrasonic probe 70. A gyro sensor (angular velocity sensor) may be built in the ultrasonic probe 70, and this gyro sensor may acquire the three-dimensional position information of the ultrasonic probe 70. In addition, the position sensor system 30 may photograph the ultrasonic probe 70 by a camera, and may subject the photographed image to an image recognition process, thereby acquiring the three-dimensional position information of the ultrasonic probe 70. The position sensor system 30 may hold the ultrasonic probe 70 by robotic arms, and may acquire the position of the robotic arms in the three-dimensional space as the position information of the ultrasonic probe 70.
In the description below, a case is described, by way of example, in which the position sensor system 30 acquires position information of the ultrasonic probe 70 by using the magnetic sensor. Specifically, the position sensor system 30 further includes a magnetism generator (not shown) including, for example, a magnetism generating coil. The magnetism generator forms a magnetic field toward the outside, with the magnetism generator itself being set as the center. A magnetic field space, in which position precision is ensured, is defined in the formed magnetic field. Thus, it should suffice if the magnetism generator is disposed such that a living body, which is a target of an ultrasonic examination, is included in the magnetic field space in which position precision is ensured. The position sensor 31, which is attached to the ultrasonic probe 70, detects a strength and a gradient of a three-dimensional magnetic field which is formed by the magnetism generator. Thereby, the position and direction of the ultrasonic probe 70 are acquired. The position sensor 31 outputs the detected strength and gradient of the magnetic field to the position detection device 32.
The position detection device 32 calculates, based on the strength and gradient of the magnetic field which were detected by the position sensor 31, for example, a position of the ultrasonic probe 70 (a position (x, y, z) and a rotational angle (θx, θy, θz) of a scan plane) in a three-dimensional space with the origin set at a predetermined position. At this time, the predetermined position is, for example, a position where the magnetism generator is disposed. The position detection device 32 transmits position information relating to the calculated position (x, y, z, θx, θy, θz) to the main body device 10.
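Although the embodiment does not prescribe any particular internal representation, the six calculated values (x, y, z, θx, θy, θz) are conveniently handled as a single homogeneous transformation matrix. The following is a minimal sketch in Python with NumPy; the radian units and the Z-Y-X rotation order are illustrative assumptions, and the actual convention of the position detection device 32 may differ.

```python
import numpy as np

def pose_to_matrix(x, y, z, theta_x, theta_y, theta_z):
    """Pack a probe pose (x, y, z, theta_x, theta_y, theta_z) into a 4x4
    homogeneous transform from probe coordinates to the coordinate system
    whose origin is at the predetermined position (e.g. the magnetism
    generator). Angles in radians; Z-Y-X order is an assumption."""
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])   # rotation about X
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # rotation about Y
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])   # rotation about Z
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx    # orientation of the scan plane
    t[:3, 3] = (x, y, z)        # position of the probe
    return t
```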
In the meantime, the position information can be imparted to the ultrasonic image data by associating, by time synchronization or the like, the position information acquired as described above with the ultrasonic image data generated from the ultrasonic waves transmitted and received by the ultrasonic probe 70.
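As one conceivable realization of this time synchronization (a sketch only, not the prescribed method), each image frame can be paired with the sensor sample whose timestamp is nearest, assuming both streams are stamped on a shared clock:

```python
import numpy as np

def attach_positions(frame_times, sensor_times, sensor_poses):
    """Nearest-timestamp association of sensor poses with image frames.
    sensor_times must be sorted; interpolating between the two nearest
    sensor samples would be a natural refinement."""
    frame_times = np.asarray(frame_times, dtype=float)
    sensor_times = np.asarray(sensor_times, dtype=float)
    idx = np.searchsorted(sensor_times, frame_times)
    idx = np.clip(idx, 1, len(sensor_times) - 1)
    left_is_closer = (frame_times - sensor_times[idx - 1]
                      ) < (sensor_times[idx] - frame_times)
    idx = np.where(left_is_closer, idx - 1, idx)
    return [sensor_poses[i] for i in idx]   # one pose per frame
```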
The ultrasonic probe 70 includes a plurality of piezoelectric transducers, a matching layer provided on the piezoelectric transducers, and a backing material for preventing the ultrasonic waves from propagating backward from the piezoelectric transducers. The ultrasonic probe 70 is detachably connected to the main body device 10. Each of the plurality of piezoelectric transducers generates an ultrasonic wave based on a driving signal supplied from the ultrasonic transmitting circuitry 11 included in the main body device 10. In addition, buttons, which are pressed at a time of an offset process (to be described later), at a time of a freeze of an ultrasonic image, and the like, may be disposed on the ultrasonic probe 70.
When the ultrasonic probe 70 transmits ultrasonic waves to a living body P, the transmitted ultrasonic waves are sequentially reflected by a discontinuity surface of acoustic impedance of the living tissue of the living body P, and received by the plurality of piezoelectric transducers of the ultrasonic probe 70 as a reflected wave signal. The amplitude of the received reflected wave signal depends on an acoustic impedance difference on the discontinuity surface by which the ultrasonic waves are reflected. Note that the frequency of the reflected wave signal generated when the transmitted ultrasonic pulses are reflected by moving blood or the surface of a cardiac wall or the like shifts depending on the velocity component of the moving body in the ultrasonic transmission direction due to the Doppler effect. The ultrasonic probe 70 receives the reflected wave signal from the living body P, and converts it into an electrical signal.
As described above, since the position sensor 31 is attached to the ultrasonic probe 70 according to the present embodiment, the position information at a time when the ultrasonic probe 70 three-dimensionally scans the living body P can be detected. Specifically, the ultrasonic probe 70 according to the present embodiment is a one-dimensional array probe, including a plurality of ultrasonic transducers, which two-dimensionally scans the living body P. In the meantime, the ultrasonic probe 70, to which the position sensor 31 is attached, may be a mechanical four-dimensional probe (a three-dimensional probe of a mechanical swing method) in which a one-dimensional array probe and a motor for swinging it are provided in an enclosure and the ultrasonic transducers are swung at a predetermined angle (swing angle); a tilt scan or rotational scan is thereby performed mechanically, and the living body P is three-dimensionally scanned. Besides, the ultrasonic probe 70 may be a two-dimensional array probe in which a plurality of ultrasonic transducers are arranged in a matrix, or a 1.5-dimensional array probe in which a plurality of transducers that are one-dimensionally arranged are divided into plural parts.
The main body device 10 illustrated in
The ultrasonic transmitting circuitry 11 is a processor which supplies a driving signal to the ultrasonic probe 70. The ultrasonic transmitting circuitry 11 is realized by, for example, trigger generating circuitry, delay circuitry, and pulser circuitry. The trigger generating circuitry repeatedly generates, at a predetermined rate frequency, rate pulses for forming transmission ultrasonic waves. The delay circuitry imparts, to each rate pulse generated by the trigger generating circuitry, a delay time for each piezoelectric transducer which is necessary for determining transmission directivity by converging the ultrasonic waves generated from the ultrasonic probe 70 into a beam. The pulser circuitry applies a driving signal (driving pulse) to the ultrasonic probe 70 at a timing based on the rate pulse. By varying the delay time imparted to each rate pulse by the delay circuitry, the transmission direction from the piezoelectric transducer surface can be adjusted arbitrarily.
The ultrasonic receiving circuitry 12 is a processor which executes various processes on the reflected wave signal which the ultrasonic probe 70 receives, and generates a reception signal. The ultrasonic receiving circuitry 12 is realized by, for example, amplifier circuitry, an A/D converter, reception delay circuitry, and an adder. The amplifier circuitry executes a gain correction process by amplifying, on a channel-by-channel basis, the reflected wave signal which the ultrasonic probe 70 receives. The A/D converter converts the gain-corrected reflected wave signal to a digital signal. The reception delay circuitry imparts a delay time, which is necessary for determining reception directivity, to the digital signal. The adder adds a plurality of digital signals to which the delay time was imparted. By the addition process of the adder, a reception signal is generated in which a reflected component from a direction corresponding to the reception directivity is emphasized.
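The reception directivity described above corresponds to the classic delay-and-sum operation. The following sketch computes one beamformed sample for a given receive focus; it is illustrative only (the actual circuitry operates on streaming channel data), and assumes a linear array and a nominal speed of sound.

```python
import numpy as np

def delay_and_sum(rf, element_x, focus_x, focus_z, fs, c=1540.0):
    """rf: (channels, samples) digitized reflected-wave signals.
    element_x: element positions along the array [m]; fs: sampling rate [Hz];
    c: assumed speed of sound [m/s]. Each channel is delayed according to its
    distance to the focus and the delayed samples are added, emphasizing the
    reflected component from the direction of the focus."""
    dist = np.hypot(element_x - focus_x, focus_z)   # element-to-focus distance
    delays = (focus_z + dist) / c                   # transmit path + receive path
    sample_idx = np.clip(np.round(delays * fs).astype(int), 0, rf.shape[1] - 1)
    return rf[np.arange(rf.shape[0]), sample_idx].sum()
```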
The B-mode processing circuitry 13 is a processor which generates B-mode data, based on the reception signal received from the ultrasonic receiving circuitry 12. The B-mode processing circuitry 13 executes an envelope detection process and a logarithmic amplification process on the reception signal received from the ultrasonic receiving circuitry 12, and generates data (B-mode data) in which the signal strength is expressed by the magnitude of brightness. The generated B-mode data is stored in a RAW data memory (not shown) as B-mode RAW data on a two-dimensional ultrasonic scanning line.
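The envelope detection and logarithmic amplification can be sketched as follows; this is a simplified model of the B-mode chain, assuming one RF line at a time and an illustrative 60 dB display dynamic range.

```python
import numpy as np
from scipy.signal import hilbert

def b_mode_line(rf_line, dynamic_range_db=60.0):
    """Envelope detection via the analytic signal, followed by logarithmic
    compression mapping signal strength to display brightness (0..255)."""
    envelope = np.abs(hilbert(rf_line))
    env_db = 20.0 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-12)
    level = np.clip((env_db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return (level * 255).astype(np.uint8)
```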
The Doppler-mode processing circuitry 14 is a processor which generates a Doppler waveform and Doppler data, based on the reception signal received from the ultrasonic receiving circuitry 12. The Doppler-mode processing circuitry 14 extracts a blood flow signal from the reception signal, generates a Doppler waveform from the extracted blood flow signal, and generates data (Doppler data) in which information, such as a mean velocity, dispersion and power, is extracted from the blood flow signal with respect to multiple points.
The three-dimensional processing circuitry 15 is a processor which can generate three-dimensional image data with position information, based on the data generated by the B-mode processing circuitry 13 and the Doppler-mode processing circuitry 14. When the ultrasonic probe 70, to which the position sensor 31 is attached, is the one-dimensional array probe or 1.5-dimensional array probe, the three-dimensional processing circuitry 15 adds the position information of the ultrasonic probe 70, which is calculated by the position detection device 32, to the B-mode RAW data stored in the RAW data memory. In addition, the three-dimensional processing circuitry 15 may generate two-dimensional image data which is composed of pixels, by executing RAW-pixel conversion, and may add the position information of the ultrasonic probe 70, which is calculated by the position detection device 32, to the generated two-dimensional image data.
Furthermore, the three-dimensional processing circuitry 15 generates three-dimensional image data (hereinafter referred to as “volume data”) which is composed of voxels in a desired range, by executing RAW-voxel conversion, which includes an interpolation process with spatial position information being taken into account, on the B-mode RAW data stored in the RAW data memory. The position information of the ultrasonic probe 70, which is calculated by the position detection device 32, is added to the volume data. Similarly, when the ultrasonic probe 70, to which the position sensor 31 is attached, is the mechanical four-dimensional probe (three-dimensional probe of the mechanical swing method) or the two-dimensional array probe, the position information is added to the two-dimensional RAW data, two-dimensional image data and three-dimensional image data.
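The RAW-voxel conversion can be pictured as scattering the positioned frames into an isotropic voxel grid. The sketch below uses nearest-voxel accumulation with averaging; the interpolation process taking spatial position information into account would in practice be more elaborate (hole filling, distance weighting). It assumes each frame's 4x4 transform (e.g. built from the probe pose as above) already includes the pixel spacing.

```python
import numpy as np

def frames_to_volume(frames, frame_to_world, grid_origin, voxel_size, grid_shape):
    """frames: list of 2D arrays; frame_to_world: list of 4x4 transforms
    mapping pixel (col, row, 0, 1) to world coordinates. Overlapping
    contributions are averaged into the voxel grid."""
    grid_origin = np.asarray(grid_origin, dtype=float).reshape(3, 1)
    vol = np.zeros(grid_shape, dtype=np.float32)
    hits = np.zeros(grid_shape, dtype=np.int32)
    for img, t in zip(frames, frame_to_world):
        rows, cols = np.mgrid[0:img.shape[0], 0:img.shape[1]]
        pix = np.stack([cols.ravel().astype(float), rows.ravel().astype(float),
                        np.zeros(img.size), np.ones(img.size)])  # frame plane z=0
        world = (t @ pix)[:3]                               # points in 3D space
        ijk = np.round((world - grid_origin) / voxel_size).astype(int)
        ok = np.all((ijk >= 0) & (ijk < np.array(grid_shape)[:, None]), axis=0)
        i, j, k = ijk[:, ok]
        np.add.at(vol, (i, j, k), img.ravel()[ok])
        np.add.at(hits, (i, j, k), 1)
    return vol / np.maximum(hits, 1)    # average where samples overlap
```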
The three-dimensional processing circuitry 15 generates rendering image data by applying a rendering process to the generated volume data.
The display processing circuitry 17 executes various processes, such as dynamic range, brightness, contrast and γ-curve corrections, and RGB conversion, on various image data generated in the three-dimensional processing circuitry 15, thereby converting the image data to a video signal. The display processing circuitry 17 causes the display 50 to display the video signal. In the meantime, the display processing circuitry 17 may generate a user interface (GUI: Graphical User Interface) for an operator to input various instructions via the input interface 21, and may cause the display 50 to display the GUI. For example, a CRT display, a liquid crystal display, an organic EL display, an LED display, a plasma display, or any other display known in the present technical field may be used as the display 50, as needed.
The internal storage 18 includes, for example, a storage medium which can be read by a processor, such as a magnetic or optical storage medium, or a semiconductor memory. The internal storage 18 stores a control program for realizing ultrasonic transmission/reception, a control program for executing an image process, and a control program for executing a display process. In addition, the internal storage 18 stores diagnosis information (e.g. patient ID, doctor's findings, etc.), a diagnosis protocol, a body mark generation program, and data such as a conversion table for presetting a range of color data for use in imaging, with respect to each of regions of diagnosis. Besides, the internal storage 18 may store anatomical illustrations, for example, an atlas, relating to the structures of internal organs in the body.
In addition, the internal storage 18 stores two-dimensional image data, volume data and rendering image data which were generated by the three-dimensional processing circuitry 15, in accordance with a storing operation which is input via the input interface 21. Furthermore, in accordance with a storing operation which is input via the input interface 21, the internal storage 18 may store two-dimensional image data with position information, volume data with position information and rendering image data with position information which were generated by the three-dimensional processing circuitry 15, along with the order of operations and the times of operations. The internal storage 18 can transfer the stored data to an external device via the communication interface 22.
The image memory 19 includes, for example, a storage medium which can be read by a processor, such as a magnetic or optical storage medium, or a semiconductor memory. The image memory 19 stores image data corresponding to a plurality of frames immediately before a freeze operation which is input via the input interface 21. The image data stored in the image memory 19 is, for example, successively displayed (cine-displayed).
The image database 20 stores image data which is transferred from the external device 40. For example, the image database 20 acquires, from the external device 40, past image data relating to the same patient, which was acquired in past diagnosis, and stores the past image data. The past image data includes ultrasonic image data, CT (Computed Tomography) image data, MR image data, PET (Positron Emission Tomography)-CT image data, PET-MR image data, and X-ray image data.
The image database 20 may store desired image data by reading in image data which is stored in storage media such as an MO, CD-R and DVD.
The input interface 21 accepts various instructions from the user via the input device 60. The input device 60 is, for example, a mouse, a keyboard, a panel switch, a slider switch, a trackball, a rotary encoder, an operation panel, and a touch command screen (TCS). The input interface 21 is connected to the control circuitry 23, for example, via a bus, converts an operation instruction, which is input from the operator, to an electric signal, and outputs the electric signal to the control circuitry 23. In the present specification, the input interface 21 is not limited to input interface which is connected to physical operation components such as a mouse and a keyboard. Examples of the input interface 21 include processing circuitry of an electric signal, which receives, as a wireless signal, an electric signal corresponding to an operation instruction that is input from an external input device provided separately from the ultrasonic diagnostic apparatus 1, and outputs this electric signal to the control circuitry 23.
The communication interface 22 is connected, for example, wirelessly, to the position sensor system 30, and receives the position information which is transmitted from the position detection device 32. In addition, the communication interface 22 is connected to the external device 40 via the network 100 or the like, and executes data communication with the external device 40. The external device 40 is, for example, a database of a PACS (Picture Archiving and Communication System), which is a system for managing the data of various kinds of medical images, or a database of an electronic medical record system for managing electronic medical records to which medical images are added. In addition, the external device 40 may be, for example, any of various kinds of medical image diagnostic apparatuses other than the ultrasonic diagnostic apparatus 1 according to the present embodiment, such as an X-ray CT apparatus, an MRI (Magnetic Resonance Imaging) apparatus, a nuclear medical diagnostic apparatus, and an X-ray diagnostic apparatus. In the meantime, the standard of communication with the external device 40 may be any standard. An example of the standard is DICOM (Digital Imaging and Communications in Medicine).
The control circuitry 23 is, for example, a processor which functions as a central unit of the ultrasonic diagnostic apparatus 1. The control circuitry 23 executes a control program stored in the internal storage 18, thereby realizing the functions corresponding to the program. Specifically, the control circuitry 23 executes a position information acquisition function 101, a data acquisition function 102, a sensor alignment function 103, a region determination function 104, an image alignment function 105, and a synchronization control function 106.
By executing the position information acquisition function 101, the control circuitry 23 acquires position information relating to the ultrasonic probe 70 from the position sensor system 30 via the communication interface 22.
By executing the data acquisition function 102, the control circuitry 23 acquires ultrasonic image data from the three-dimensional processing circuitry 15, and generates ultrasonic image data with position information, by associating the ultrasonic image data and the position information.
By executing the sensor alignment function 103, the control circuitry 23 associates the coordinate system of the position sensor and the coordinate system of 3D medical image data. As regards the ultrasonic image data, after the position information is defined in the position sensor coordinate system, the ultrasonic image data with position information and the 3D medical image data are aligned. The sensor alignment function 103 is thus a function of alignment between 3D medical images in the sensor coordinate system. Since the ultrasonic image data can take a free direction and position relative to a 3D medical image or another 3D ultrasonic image, the search range for image alignment would otherwise have to be increased. By first executing alignment in the coordinate system of the position sensor with the sensor alignment function 103, however, a rough adjustment of the alignment between the 3D medical image data can be performed. The image alignment of the next step can then be performed in a state in which the difference in position and rotation between the 3D medical image data is decreased. In other words, the sensor alignment has a function of suppressing the difference in position and rotation between the 3D medical image data to within the capture range of the image alignment algorithm.
By executing the region determination function 104, the control circuitry 23 receives, for example, an input to the input device 60 from the user via the input interface 21, and determines, based on the input, region information which serves as a reference for image alignment in at least one of the ultrasonic image and medical image.
By executing the image alignment function 105, the control circuitry 23 executes image alignment between an ultrasonic image based on the ultrasonic image data and a medical image based on the medical image data, the ultrasonic image data and medical image data being associated by the sensor alignment function 103.
By executing the synchronization control function 106, the control circuitry 23 synchronizes a real-time ultrasonic image, which is an image based on ultrasonic image data newly acquired by the ultrasonic probe 70, with a medical image based on the medical image data corresponding to the real-time ultrasonic image, based on the relationship between the first coordinate system and the second coordinate system that was determined by the completion of the image alignment, and displays the real-time ultrasonic image and the medical image in an interlocking manner.
The position information acquisition function 101, data acquisition function 102, sensor alignment function 103, region determination function 104, image alignment function 105 and synchronization control function 106 may be assembled as the control program. Alternatively, dedicated hardware circuitry, which can execute these functions, may be assembled in the control circuitry 23 itself, or may be assembled in the main body device 10 as circuitry to which the control circuitry 23 can refer.
The control circuitry 23 may be realized by an application-specific integrated circuit (ASIC) in which this dedicated hardware circuitry is assembled, a field programmable gate array (FPGA), a complex programmable logic device (CPLD), or a simple programmable logic device (SPLD).
Next, referring to
An upper part of
In step S201, for example, the user sweeps the ultrasonic probe 70 three-dimensionally over the living body. Thereby, three-dimensional image data is acquired as stack data. A repetitive three-dimensional scan is enabled by an electronic scan in which a mechanical 4D probe or a two-dimensional array probe is used as the ultrasonic probe 70. Thus, it is possible to acquire four-dimensional ultrasonic image data including a time axis, that is, three-dimensional image data acquired successively in time.
In step S202, since the plurality of two-dimensional ultrasonic image data (tomographic images) constituting the acquired stack data are acquired at mutually different coordinates, a coordinate system which can be used commonly among the respective tomographic images is introduced. The three-dimensional ultrasonic image data are therefore reconstructed (re-sampled) as isotropic voxels, and volume data is obtained.
In step S203, the volume data is rendered, i.e., projected from three dimensions onto a two-dimensional plane for display. Examples of the rendering method include the MPR (Multi-Planar Reconstruction/Reformation) method, the MIP (Maximum Intensity Projection) method, and the VR (Volume Rendering) method.
The MPR method is a method of creating a tomographic image in an arbitrary direction. A pixel value is calculated by interpolating a voxel value near a designated tomographic plane. The MPR method is useful in that a cross section, which cannot be viewed by normal ultrasonic imaging, can be observed. Normally, in order to grasp a stereoscopic structure, three cross sections, which are a combination of a designated cross section and two cross sections perpendicular to the designated cross section, are displayed at the same time.
The MIP method is a display method in which the voxel values existing on a straight line between a point of view and the projection plane are checked, and the maximum of these voxel values is projected onto the projection plane. This method is useful, for example, in stereoscopic depiction of a blood vessel image obtained by a color Doppler method or of a contrast echo image in an ultrasonic contrast echo method. However, since depth information disappears in the MIP method, images created at varied angles need to be rotated and cine-displayed.
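A minimal sketch of the MIP method follows, together with the rotated views used for cine display; the axis-aligned projection simplifies the point-of-view geometry described above, and the function names are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def mip(volume, axis=2):
    """Project the maximum voxel value along each ray parallel to `axis`."""
    return volume.max(axis=axis)

def mip_cine_views(volume, angles_deg):
    """Because depth information disappears in a single MIP, render MIPs of
    the volume rotated to several angles for cine display."""
    return [mip(rotate(volume, a, axes=(0, 1), reshape=False, order=1))
            for a in angles_deg]
```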
The VR method is a method in which a virtual physical phenomenon is simulated: uniform light is emitted from a virtual screen, and the emitted light is reflected, attenuated and absorbed by a three-dimensional object which is expressed by voxel values. Transmitted light and reflected light are updated at intervals of a fixed step from a start point on the virtual screen. At the time of the update process, an opacity corresponding to the voxel value is set. Thereby, various expressions can be realized, ranging from the surface to the internal structure of the living body. In particular, this method is excellent in depicting fine structures.
(Alignment Between Ultrasonic Image Data)
Referring to a flowchart of
In step S301, the ultrasonic probe 70 of the ultrasonic diagnostic apparatus according to the present embodiment is operated. Thereby, the control circuitry 23, which executes the data acquisition function 102, acquires ultrasonic image data of a living body region (also referred to as “target region”) in the vicinity of the liver cancer that is the treatment target. In addition, the control circuitry 23, which executes the position information acquisition function 101, acquires the position information of the ultrasonic probe 70 at the time of acquiring the ultrasonic image data from the position sensor system 30, and generates the ultrasonic image data with position information.
In step S302, the control circuitry 23 or three-dimensional processing circuitry 15 executes three-dimensional reconstruction of the ultrasonic image data by the above-described procedure illustrated in
Thereafter, a stage is assumed in which the treatment has progressed, the operation has been finished, and the effect of the treatment is to be determined.
In step S303, like step S301, the control circuitry 23, which executes the position information acquisition function 101 and the data acquisition function 102, acquires the position information of the ultrasonic probe and ultrasonic image data. Like the operation before the treatment, the ultrasonic probe 70 is operated on the target region after the treatment, and the control circuitry 23 acquires the ultrasonic image data of the target region, acquires the position information of the ultrasonic probe 70 from the position sensor system, and generates the ultrasonic image data with position information.
In step S304, like step S302, the control circuitry 23 or three-dimensional processing circuitry 15 generates volume data (also referred to as “second volume data”) of the ultrasonic image data with position information, by using the acquired ultrasonic image data and position information.
In step S305, based on the acquired position information of the ultrasonic probe 70 and ultrasonic image data, the control circuitry 23, which executes the sensor alignment function 103, executes sensor alignment between the coordinate system (also referred to as “first coordinate system”) of the first volume data and the coordinate system (also referred to as “second coordinate system”) of the second volume data, so that the positions of the target regions may generally match. Both the position of the first volume data and the position of the second volume data are commonly described in the position sensor coordinate system. Accordingly, the alignment can directly be executed based on the position information added to each volume data.
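Because both volumes carry poses in the common position sensor coordinate system, this rough alignment reduces to composing those poses. A sketch follows, assuming each volume stores a 4x4 grid-to-sensor transform such as the one built from the probe pose earlier; no image content is needed at this stage.

```python
import numpy as np

def sensor_alignment(t_first_to_sensor, t_second_to_sensor):
    """Transform mapping second-volume grid coordinates into first-volume
    grid coordinates, obtained purely from the position information added
    to each volume data."""
    return np.linalg.inv(t_first_to_sensor) @ t_second_to_sensor
```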
In step S306, if the living body does not move during the period from the acquisition of the first volume data to the acquisition of the second volume data, a good alignment state can be obtained by only the sensor alignment. In this case, parallel display of ultrasonic images in step S308 in
The details of the image alignment will be described later with reference to
In step S308, the control circuitry 23 instructs, for example, the display processing circuitry 17 to parallel-display the ultrasonic image before the treatment, which is based on the first volume data, and the ultrasonic image after the treatment, which is based on the second volume data. By the above, the alignment process between ultrasonic image data is completed.
Next, referring to a flowchart of
In step S401, the control circuitry 23 converts the coordinates of one of the first volume data and the second volume data; in this example, the second volume data. The coordinate conversion may be executed based on at least six parameters, namely the rotations and translations in the X direction, Y direction and Z direction, and, if necessary, based on nine parameters which additionally include three shearing directions.
In step S402, the control circuitry 23 checks the coordinate-converted region. Specifically, for example, the control circuitry 23 excludes data outside the volume data region. The control circuitry 23 may, at the same time, generate an array (mask) in which the inside of the region is expressed by “1” and the outside of the region by “0”. Alternatively, the control circuitry 23 may set a specific pixel value (e.g. 255) for the outside of the region, and may represent the brightness by 0 to 254.
In step S403, the control circuitry 23 calculates a characteristic amount (feature value) relating to the similarity between the first volume data and the second volume data. The characteristic amount is, for example, the brightness value of a voxel.
In step S404, the control circuitry 23 calculates an evaluation function of the displacement between the first volume data and the second volume data. As the evaluation function, use may be made of, for example, a difference between the brightness values calculated in step S403, a correlation, or a mutual information amount; alternatively, a region with the highest similarity may be searched for after matching structural information of brightness between the volume data.
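For instance, a histogram-based mutual information amount can serve as the evaluation function. The sketch below assumes the two volumes have already been resampled onto a common grid and masked to their overlap (step S402); the bin count is an illustrative choice.

```python
import numpy as np

def mutual_information(vol_a, vol_b, bins=32):
    """Mutual information of the joint brightness histogram; larger values
    indicate a better match between the two volume data."""
    joint, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)     # marginal of vol_a
    py = pxy.sum(axis=0, keepdims=True)     # marginal of vol_b
    nonzero = pxy > 0
    return float((pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])).sum())
```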
In step S405, the control circuitry 23 determines whether or not the evaluation function meets an optimal value reference. If the evaluation function meets the optimal value reference, the process advances to step S407. If the evaluation function fails to meet the optimal value reference, the process advances to step S406. Whether the optimal value reference is met may be determined such that the evaluation function is judged to meet the reference at a time point when no further improvement of the similarity measure can be expected.
In step S406, the control circuitry 23 changes the conversion parameter in accordance with the result of the optimal value reference. When no further improvement of the similarity measure can be expected, the similarity measure may have fallen into a local solution. The similarity measure at this time is, as a matter of course, less than that of the optimal solution, and this can be judged by comparing its ratio to the similarity measure at a time of a large displacement with the corresponding ratio at an empirically recognized optimal solution. If it is determined that the similarity measure has fallen into a local solution, the parameter is slightly changed from the position at that time, and the optimization is executed once again. Thereby, the similarity measure can be expected to reach the optimal solution. For example, in the case of a downhill simplex method, the change of the parameter is implemented by making the initially set simplex larger than the previous one.
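A sketch of this optimization loop (steps S401 to S406) using SciPy's Nelder-Mead (downhill simplex) solver follows; the six-parameter vector, the restart jitter, and the simplex enlargement factor are illustrative assumptions rather than prescribed values.

```python
import numpy as np
from scipy.optimize import minimize

def initial_simplex(center, scale):
    """Simplex of side `scale` around `center`; enlarging `scale` widens
    the search, as described above for escaping a local solution."""
    n = center.size
    simplex = np.tile(center, (n + 1, 1))
    simplex[1:] += np.eye(n) * scale
    return simplex

def optimize_alignment(cost, params0, restarts=3, scale=1.0, jitter=0.5, seed=0):
    """Minimize `cost` (e.g. negative mutual information as a function of the
    six rigid parameters); on each retry, perturb the best parameters and
    restart from a larger initial simplex."""
    params0 = np.asarray(params0, dtype=float)
    best = minimize(cost, params0, method="Nelder-Mead",
                    options={"initial_simplex": initial_simplex(params0, scale)})
    rng = np.random.default_rng(seed)
    for _ in range(restarts):
        scale *= 2.0                                  # larger simplex than before
        start = best.x + rng.uniform(-jitter, jitter, size=best.x.size)
        trial = minimize(cost, start, method="Nelder-Mead",
                         options={"initial_simplex": initial_simplex(start, scale)})
        if trial.fun < best.fun:
            best = trial
    return best
```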
In step S407, the control circuitry 23 determines a displacement amount, and makes a correction by the displacement amount. Thus, the image alignment process is completed. The image alignment illustrated in
A left image in
Next, referring to
A left image in
(Correction of Displacement Due to Body Motion or Respiratory Time Phase)
A second embodiment will be described with reference to
During a treatment, in some cases, a large displacement occurs between ultrasonic image data in the position sensor coordinate system due to a body motion, and this displacement exceeds the correctable range of the image alignment. There is also a case in which the transmitter of the magnetic field is moved to a position near the patient, from the standpoint of maintaining the magnetic field strength. In such cases, even after the coordinate system of the sensor is associated by the sensor alignment function 103, a large displacement may remain between the ultrasonic image data. In connection with such a case, a flowchart of
The user designates, in the respective ultrasonic images, corresponding points indicative of a living body region, the points corresponding between the ultrasonic image based on the first volume data and the ultrasonic image based on the second volume data. The corresponding points may be designated by, for example, moving a cursor on the screen with the operation panel through the user interface generated by the display processing circuitry 17, or, in the case of a touch screen, by directly touching the corresponding points on the screen. In an example of
In the meantime, a region of a predetermined range in the corresponding living body region may be determined as a corresponding region. Also in the case of designating the corresponding region, the control circuitry 23 may execute a process similar to that in the case of the corresponding points.
Furthermore, although the example of correcting the displacement due to the body motion or respiratory time phase has been illustrated, the corresponding points or corresponding regions may be determined in order for the user to designate a region-of-interest (ROI) in the image alignment.
Like the first embodiment, after the displacement between the ultrasonic images was corrected by step S702 of
After the input of the instruction for image alignment, the display processing circuitry 17 parallel-displays the ultrasonic images which are aligned in step S308 of
(Alignment Between Ultrasonic Image Data and Medical Image Data Other than Ultrasonic Image)
A third embodiment will be described.
Hereinafter, a description will be given of a case of executing alignment between medical image data obtained by other modalities, such as CT image data, MR image data, X-ray image data and PET image data, and ultrasonic image data which is currently acquired by using the ultrasonic probe 70. In the description below, the case is assumed in which MR image data is used as the medical image data.
Referring to a flowchart of
In step S901, the control circuitry 23 reads out 3D medical image data from the image database 20.
In step S902, the control circuitry 23 executes associating between the sensor coordinate system of the position sensor system 30 and the coordinate system of the 3D medical image data.
In step S903, the control circuitry 23, which executes the position information acquisition function 101 and the data acquisition function 102, associates the ultrasonic image data, which is acquired by the ultrasonic probe 70, and the position information at a time when the ultrasonic image data is acquired, thereby acquiring ultrasonic image data with position information.
In step S904, the control circuitry 23 or three-dimensional processing circuitry 15 generates volume data of the ultrasonic image data with position information.
In step S905, like step S307, the control circuitry 23, which executes the image alignment function 105, executes alignment between the volume data and the 3D medical image data.
In step S906, the display processing circuitry 17 parallel-displays the ultrasonic image based on the volume data and the medical image based on the 3D medical image data.
Next, referring to
Referring to
The user puts the ultrasonic probe 70 on the body surface of the living body in the direction of the axial cross section. The user confirms, by visual observation, whether the ultrasonic probe 70 is in the direction of the axial cross section. When the ultrasonic probe 70 is on the living body in the direction of the axial cross section, the user performs a registration operation, such as clicking on the operation panel or pressing the button. Thereby, the control circuitry 23 acquires and associates the sensor coordinates given by the position information of the sensor of the ultrasonic probe in this state and the MR data coordinates of the position of the MPR plane of the MR data. The axial cross section in the MR image data of the living body can thus be converted to position sensor coordinates and recognized. Thereby, the alignment (matching of the directions of the coordinate axes of the coordinate systems) illustrated in
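In effect, the registration click equates the probe's scan-plane orientation (sensor coordinates) with the axial MPR plane orientation (MR data coordinates). A sketch of the resulting rotation between the two coordinate systems follows, assuming both orientations are available as 3x3 rotation matrices; the function and parameter names are illustrative.

```python
import numpy as np

def register_axes(r_probe_in_sensor, r_axial_in_mr):
    """Rotation mapping MR data directions into position sensor directions,
    fixed by declaring the two planes parallel at the moment of the click.
    Translation along the body axis may still be displaced and is corrected
    afterward by the corresponding-point (mark alignment) step."""
    return r_probe_in_sensor @ np.linalg.inv(r_axial_in_mr)
```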
Next, referring to
After the completion of the alignment, even in the state in which a displacement remains in the position in the body axis direction, the user can observe the MPR plane of the MR and the real-time ultrasonic image in an interlocking manner by moving the ultrasonic probe 70.
While viewing the real-time ultrasonic image 1103 which is displayed on the monitor, the user scans the ultrasonic probe 70, thereby causing the monitor to display a target region (or an ROI) such as the center of the region for alignment or a structure. Thereafter, the user designates the target region as a corresponding point 1201 by the operation panel or the like. In the example of
Next, the user moves the MPR cross section of the MR by moving the ultrasonic probe 70, and displays the cross-sectional image of the MR image which corresponds to the cross section including the corresponding point 1201 of the ultrasonic image designated by the user. When the corresponding cross-sectional image of the MR image is displayed, the user designates, as a corresponding point 1202, a target region (or an ROI) on the cross-sectional image of the MR image, such as the center of the region for alignment or a structure, by the operation panel or the like. At this time, the system acquires and stores the position information of the corresponding point 1202 in the coordinate system of the MR data.
The control circuitry 23, which executes the region determination function, corrects a displacement between the coordinate system of the MR image data and the sensor coordinate system, based on the position of the designated corresponding point in the sensor coordinate system and the position of the designated corresponding point in the coordinate system of the MR data. Specifically, for example, based on a difference between the corresponding point 1201 and corresponding point 1202, the control circuitry 23 corrects a displacement between the coordinate system of the MR image data and the sensor coordinate system, and aligns the coordinate systems. Thereby, the process of mark alignment of
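The correction itself can be as simple as shifting the translation part of the MR-to-sensor transform so that the two designated points coincide. A sketch follows, with the points as 3-vectors and the rotation assumed already fixed by the preceding axis registration; the names are illustrative.

```python
import numpy as np

def mark_alignment(t_mr_to_sensor, point_1201_sensor, point_1202_mr):
    """Shift the MR-to-sensor transform by the residual between the point
    designated on the ultrasonic image (sensor coordinates) and the point
    designated on the MR cross section (MR data coordinates)."""
    t = t_mr_to_sensor.copy()
    p_sensor = np.asarray(point_1201_sensor, dtype=float)
    p_mr = np.asarray(point_1202_mr, dtype=float)
    p_mr_in_sensor = t[:3, :3] @ p_mr + t[:3, 3]    # point 1202 in sensor coords
    t[:3, 3] += p_sensor - p_mr_in_sensor           # make the two points coincide
    return t
```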
Next, referring to a schematic view of
After the completion of the position correction, the user manually operates the ultrasonic probe 70 with respect to the region including the target region, while referring to the three-dimensional MR image data, and acquires the ultrasonic image data with position information.
Next, the user presses the switch for image alignment, and image alignment is executed. By the process thus far, the position of the MR data and the position of the ultrasonic data are made to generally match, and the MR data and the ultrasonic data include the common target. Thus, the image alignment operation performs well. An example of the ultrasonic image display after the image alignment will be described with reference to
As illustrated in
In the third embodiment, the MR 3D image data has been described by way of example. However, the third embodiment is similarly applicable to other 3D medical image data of CT, X-ray, ultrasonic, PET, etc. The associating between the coordinate system of the 3D medical data and the coordinate system of the position sensor has been described in the steps of alignment and mark alignment illustrated in
(Synchronous Display Between Ultrasonic Image and Medical Image)
A fourth embodiment will be described.
If the above-described sensor alignment and image alignment are completed, the relationship between the coordinate system of the medical image (the MR coordinate system in this example) and the position sensor coordinate system is determined. The display processing circuitry 17 refers to the position information of the real-time (live) ultrasonic image acquired while the user freely moves the ultrasonic probe 70 after the completion of the alignment process, and can thereby display the corresponding MPR cross section of the MR. The corresponding cross sections of the highly precisely aligned MR image and the real-time ultrasonic image can thus be displayed in an interlocking manner (also referred to as “synchronous display”). Synchronous display can also be executed between 3D ultrasonic images by the same method. Specifically, a 3D ultrasonic image which was acquired in the past and a real-time 3D ultrasonic image can be synchronously displayed. In step S308 of
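A sketch of the lookup performed per live frame follows (the names are assumptions): given the live probe pose from the sensor and the MR-to-sensor relationship fixed by the completed alignment, the current scan plane is expressed in MR data coordinates, and the display side resamples the MR volume on that plane.

```python
import numpy as np

def live_plane_in_mr(t_probe_to_sensor, t_mr_to_sensor):
    """Transform placing the current scan plane in MR data coordinates; the
    MPR cross section resampled on this plane is shown next to (interlocked
    with) the real-time ultrasonic image."""
    return np.linalg.inv(t_mr_to_sensor) @ t_probe_to_sensor
```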
A fifth embodiment will be described. As illustrated in a flowchart of
In step S1703 in the flowchart of
The control circuitry 23, which executes the region determination function 104, executes sensor alignment by associating the coordinates of the corresponding point in the data coordinates of the MR, and the coordinates of the corresponding point in the position sensor coordinates.
The control circuitry 23, which executes the image alignment function 105, executes image alignment between the ultrasonic image and medical image, based on the region information. In the state in which the sensor alignment was executed, the user instructs image alignment, for example, by the operation panel. Based on the corresponding region, the control circuitry 23 reads in the CT 3D image data and 3D ultrasonic image data, and executes a process by an image alignment algorithm.
According to the above-described embodiment, the coordinate systems of medical images including ultrasonic image data, which differ in time of acquisition and position of acquisition, are associated based on the ultrasonic image data acquired by scanning the ultrasonic probe 70, to which position information is added by the position sensor system, and the image alignment is executed based on this associating. Thereby, the success rate of the image alignment is increased, and an ultrasonic image and a medical image which have been easily and exactly aligned can be presented to the user. In addition, since the sensor coordinate system and the coordinate system of the medical image for which the image alignment is completed are synchronized, the MPR cross section of the 3D medical image and the real-time ultrasonic tomographic image can be synchronously displayed in interlock with the scan of the ultrasonic probe 70. Specifically, an exact comparison between the medical image and the ultrasonic image can be realized, and the objectivity of ultrasonic diagnosis can be improved.
In the above-described embodiments, the position sensor systems, which utilize magnetic sensors, have been described.
(Modifications of Sensor Alignment Unit)
There are various modifications of the sensor alignment function illustrated in
A first embodiment of the sensor alignment unit is as follows. The alignment target region of the 3D medical image data is extracted from the ultrasonic image acquired by the operation of the ultrasonic probe 70. Thus, the sensor alignment unit associates the position sensor coordinates of this ultrasonic image and the coordinates of the corresponding 3D medical image data. This was described in the flowchart of
A second embodiment of the sensor alignment unit relates to a case in which the 3D medical image data is 3D ultrasonic image data with position information of the position sensor. The flowchart of
When 3D ultrasonic image data is acquired by moving the ultrasonic probe 70, the relationship in position or direction between the 3D ultrasonic image data can be grasped by the common transmitter coordinates, and the alignment can be executed.
A third embodiment of the sensor alignment unit is a case in which another magnetic sensor is disposed on the body surface.
The number of robotic arms, which are used as the position sensor system illustrated in
(Modifications of Ultrasonic Image Data)
In the above, the 3D ultrasonic image data with position information was illustrated as the ultrasonic image data by way of example. However, the ultrasonic image data may be a 2D tomographic image with position information. In the flow of the image alignment process of
The ultrasonic image data may be 3D ultrasonic image data or 4D ultrasonic image data, which are acquired by electronic scan by a mechanical swing-type 4D probe (mechanical 4D probe) with position information, or a 2D array probe.
As illustrated in
(Modifications of Region Determination Function)
There are various embodiments of the region determination function illustrated in
A first embodiment of the region determination function is illustrated in
In
A second embodiment of the region determination function is illustrated in
In the region determination function which determines the region information for alignment, image patterns of regions suited for alignment may be prepared in a database in advance, and the 3D medical image data may be automatically searched for regions matching these patterns.
In the example of
Thereby, the doctor can obtain information relating to the quality of the alignment, etc. Based on this quality information, it is conceivable for the doctor to cancel the alignment process, or to retry the alignment process by changing conditions.
Furthermore, the following function of the system is conceivable. The system prepares, in advance, judgment criteria for the positional movement amount, the angular movement amount, an evaluation value of a similarity function of the alignment, a similarity of the images, and the amount or ratio of the overlapping region between the 3D medical image data. When the range of a set reference is exceeded, the system automatically cancels the alignment process.
As the similarity function, various evaluation functions, such as a mutual information amount and a cross-correlation amount, are conceivable. As the image similarity, various evaluation functions, such as a brightness difference value, are conceivable.
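A sketch of such an automatic cancellation check follows; the quantity names and limit values are purely illustrative assumptions, not values prescribed by the embodiment.

```python
def should_cancel(result, limits):
    """Return (cancel?, per-criterion pass map). `result` holds the judged
    quantities of an alignment attempt; `limits` holds the preset references."""
    passed = {
        "translation": result["translation_mm"] <= limits["max_translation_mm"],
        "rotation": result["rotation_deg"] <= limits["max_rotation_deg"],
        "similarity": result["similarity"] >= limits["min_similarity"],
        "overlap": result["overlap_ratio"] >= limits["min_overlap_ratio"],
    }
    return (not all(passed.values())), passed

# Example: cancel when the solution moved implausibly far.
cancel, detail = should_cancel(
    {"translation_mm": 80.0, "rotation_deg": 5.0,
     "similarity": 0.4, "overlap_ratio": 0.6},
    {"max_translation_mm": 50.0, "max_rotation_deg": 30.0,
     "min_similarity": 0.2, "min_overlap_ratio": 0.3})
```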
The control circuitry 23, which executes the image alignment function, may additionally include a function of detecting a noise region in the 3D medical image data or ultrasonic image data, and excluding the noise region from the alignment calculation.
In the ultrasonic images illustrated in
It is conceivable that the control circuitry 23, which executes the image alignment function 105, detects a region having a common structure in the 3D medical image data or ultrasonic image data, and executes the image alignment calculation on that region. In image alignment, a blood vessel structure is an important alignment structure.
As illustrated in
Although the flowchart illustrated in
Furthermore, the process of correcting a displacement due to a body motion or respiratory time phase, which is illustrated in
The term “processor” used in the above description means, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or circuitry such as an ASIC (Application Specific Integrated Circuit) or a programmable logic device (e.g. an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), or an FPGA (Field Programmable Gate Array)). The processor realizes functions by reading out and executing programs stored in the storage circuitry. In the meantime, each processor of the embodiments is not limited to a configuration as single circuitry; a plurality of independent circuitries may be combined and configured as a single processor to realize its functions. Furthermore, a plurality of structural elements in
In the above description, the case is assumed in which the alignment between the ultrasonic image data and medical image data is the alignment between two data. However, the alignment between three or more data may be executed. For example, currently scanned ultrasonic image data, previously captured ultrasonic image data, and CT 3D image data may be aligned and displayed in parallel.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application is a division of and claims the benefit of priority under 35 U.S.C. § 120 from U.S. application Ser. No. 15/718,578, filed Sep. 28, 2017, which claims the benefit of priority under 35 U.S.C. § 119 from the prior Japanese Patent Application No. 2016-195129, filed Sep. 30, 2016, the entire contents of all of which are incorporated herein by reference.
Related U.S. application data: parent application Ser. No. 15/718,578, filed Sep. 2017 (US); child application Ser. No. 18/243,153 (US).