This application claims the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2012-0001144, filed on Jan. 4, 2012, in the Korean Intellectual Property Office, the entire disclosure of which is incorporated herein by reference for all purposes.
1. Field
The following description relates to methods of generating an elasticity image and elasticity image generating apparatuses.
2. Description of Related Art
Medical devices configured to create cross-sectional images showing internal structures in a human body, such as ultrasonic imaging devices, X-ray imaging devices, computed tomography (CT) devices, and magnetic resonance imaging (MRI) devices, have been developed to improve patient convenience and to expedite the diagnosis of disease.
By way of a probe, ultrasonic imaging devices transmit an ultrasound signal to a predetermined part in a human body from a surface of the human body. The devices obtain an image of blood flow or a section of a soft tissue in the human body based on information determined from an ultrasound echo signal reflected from the soft tissue. Such ultrasonic imaging devices display a reflection coefficient of the ultrasound echo signal as a brightness at each point on a screen to generate a two-dimensional (2D) brightness (B)-mode image. Ultrasonic imaging devices are small, display images in real time, and have no risk of X-ray radiation exposure.
Since an abnormal tissue, such as a tumor or a cancer, has a reflection coefficient similar to that of a normal tissue but has a degree of elasticity greater than that of the normal tissue, the abnormal tissue may be more accurately displayed based on an elasticity imaging technique that generates an elasticity image indicating elasticity of a tissue. Such an elasticity imaging technique may accurately detect a tissue, such as a cyst, having a degree of elasticity less than that of a normal tissue and a pathological tissue, such as a cancer, having a degree of elasticity greater than that of the normal tissue.
In one general aspect, a method of generating an elasticity image includes detecting corresponding partial areas from examined areas of a first elasticity image and a second elasticity image from among a plurality of elasticity images indicating elasticity of examined areas in a subject, and generating data of a third elasticity image, comprising combining data of the first elasticity image with data of the second elasticity image based on the corresponding partial areas of the first elasticity image and the second elasticity image.
The method may further include that the detecting of the corresponding partial areas includes detecting a feature value of a first partial area that is a portion of the examined area of the first elasticity image from the data of the first elasticity image, and detecting a second partial area corresponding to the first partial area from the examined area of the second elasticity image based on the detected feature value of the first partial area.
The method may further include that the detecting of the feature value includes converting elasticity values of pixels of the first elasticity image into differential values between the elasticity values of the pixels of the first elasticity image and elasticity values of adjacent pixels, and detecting differential values of pixels of the first partial area of the first elasticity image from among differential values of the pixels of the first elasticity image as feature values.
The method may further include that the detecting of the second partial area is based on a similarity between the detected feature value of the first partial area and the data of the second elasticity image.
The method may further include that the first partial area is determined based on a positional relationship between the examined area of the first elasticity image and the examined area of the second elasticity image.
The method may further include that the detecting of the corresponding partial areas includes converting elasticity values of pixels of the plurality of elasticity images into differential values to generate differential data of the plurality of elasticity images, detecting differential values of pixels of a first partial area that is a portion of the examined area of the first elasticity image from differential data of the first elasticity image as feature values, and detecting an area having a greatest similarity to a feature value of the first partial area from differential data of the second elasticity image as a second partial area of the second elasticity image corresponding to the first partial area of the first elasticity image.
The method may further include that a partial area of the differential data of the second elasticity image that allows a sum of errors between differential values of pixels of the first partial area from among the differential data of the first elasticity image and differential values of pixels of the partial area of the differential data of the second elasticity image to be equal to or less than a preset allowance error is detected as the second partial area.
The method may further include that the first elasticity image is combined with the second elasticity image by matching pixels of the corresponding partial areas of the first elasticity image and the second elasticity image to generate the data of the third elasticity image.
The method may further include that the generating of the data of the third elasticity image includes correcting the data of any one of the first elasticity image and the second elasticity image based on a correlation between features of the corresponding partial areas of the first elasticity image and the second elasticity image to obtain corrected data, and combining the corrected data with the data of the remaining one of the first elasticity image and the second elasticity image.
The method may further include that the correcting of the data of any one of the first elasticity image and the second elasticity image is based on a correlation obtained between features of elasticity values of pixels of the corresponding partial areas of the first elasticity image and the second elasticity image.
The method may further include that the correlation between the features of the elasticity values of the pixels of the corresponding partial areas of the first elasticity image and the second elasticity image is obtained from a ratio of averages of the elasticity values of the pixels of the corresponding partial areas of the first elasticity image and the second elasticity image.
The method may further include that the generated data of the third elasticity image indicates elasticity of an examined area including the examined area of the first elasticity image and the examined area of the second elasticity image.
The method may further include determining a position of an examined area of the third elasticity image in all areas of an elasticity image display device, and displaying the position of the examined area of the third elasticity image on the elasticity image display device.
In another general aspect, an elasticity image generating apparatus includes an image processor configured to detect corresponding partial areas from examined areas of a first elasticity image and a second elasticity image from among a plurality of elasticity images indicating elasticity of examined areas in a subject, and generate data of a third elasticity image by combining data of the first elasticity image with data of the second elasticity image based on the corresponding partial areas of the first elasticity image and the second elasticity image.
The apparatus may further include a storage unit configured to store therein information used to detect the corresponding partial areas of the first elasticity image and the second elasticity image.
The apparatus may further include that the image processor includes a partial area detecting unit, a correcting unit, and a combining unit, the partial area detecting unit being configured to detect a feature value of a first partial area that is a portion of the examined area of the first elasticity image from the data of the first elasticity image, the partial area detecting unit being configured to detect a second partial area corresponding to the first partial area from the examined area of the second elasticity image based on the detected feature value of the first partial area, the correcting unit being configured to correct the data of any one of the first elasticity image and the second elasticity image based on a correlation between features of the corresponding partial areas of the first elasticity image and the second elasticity image to obtain corrected data, and the combining unit being configured to combine the corrected data with the data of the remaining one of the first elasticity image and the second elasticity image.
The apparatus may further include that the partial area detecting unit is further configured to convert elasticity values of pixels of the first elasticity image into differential values between the elasticity values of the pixels of the first elasticity image and elasticity values of adjacent pixels, and detect differential values of pixels of the first partial area of the first elasticity image from among differential values of the pixels of the first elasticity image as feature values.
The apparatus may further include that the partial area detecting unit is further configured to detect as the second partial area of the second elasticity image a partial area of differential data of the second elasticity image that allows a sum of errors between differential values of pixels of the first partial area from among differential data of the first elasticity image and differential values of pixels of the partial area of the differential data of the second elasticity image to be equal to or less than an allowance error.
The apparatus may further include that the correcting unit is further configured to correct the data of any one of the first elasticity image and the second elasticity image based on a correlation obtained by the correcting unit, the correlation being between features of elasticity values of pixels of the corresponding partial areas of the first elasticity image and the second elasticity image.
The apparatus may further include an ultrasound image data generating unit configured to generate brightness (B)-mode ultrasound image data, and an image combining unit configured to combine the B-mode ultrasound image data with the data of the third elasticity image to generate a combined ultrasound image.
In yet another general aspect, there is provided a computer-readable recording medium having embodied thereon a program configured to execute a method of generating an elasticity image, the method including detecting corresponding partial areas from examined areas of a first elasticity image and a second elasticity image from among a plurality of elasticity images indicating elasticity of examined areas in a subject, and generating data of a third elasticity image, comprising combining data of the first elasticity image with data of the second elasticity image based on the corresponding partial areas of the first elasticity image and the second elasticity image.
In still another general aspect, a method of generating an elasticity image includes detecting corresponding first and second partial areas based on data of a first elasticity image indicating elasticity of a first examined area in a subject and data of a second elasticity image indicating elasticity of a second examined area in the subject, the first partial area being a portion of an examined area of the first elasticity image, the second partial area being a portion of an examined area of the second elasticity image, and generating data of a third elasticity image, comprising combining the data of the second elasticity image with data of a corrected image obtained from the data of the first elasticity image.
Other features and aspects may be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the systems, apparatuses and/or methods described herein will be suggested to those of ordinary skill in the art. In addition, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
The elasticity image data generating device 10 generates a source signal and transmits the source signal, via a probe 11 connected to the elasticity image data generating device 10, to an area of examination in the body of the patient 40. In this case, the source signal may be any of various signals known to one of ordinary skill in the art, such as an ultrasound signal and an X-ray signal.
The elasticity image data generating device 10 generates pieces of elasticity image data indicating elasticity of different examined areas in the body of the patient 40 based on a response signal from the probe 11. That is, as the medical expert moves the probe 11 in order to diagnose a disease of the patient 40, the elasticity image data generating device 10 sequentially generates pieces of elasticity image data indicating the elasticity of the different examined areas in the body of the patient 40.
In an example, the elasticity imaging system 100 generates an elasticity image based on a response signal obtained when the probe 11 contacting a surface of the body of the patient 40 repeatedly compresses and relaxes the body of the patient 40 to transmit and receive an ultrasound wave. In another example, the probe 11 includes transducers, each of which is configured to transmit and receive an ultrasound signal. Since the ultrasound signal causes a target tissue in the body of the patient 40 to be displaced, in yet another example, elasticity image data is generated by detecting information about the displacement of the target tissue due to the ultrasound signal based on a response signal to the ultrasound signal.
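By way of illustration only, the following minimal sketch shows one common way, not necessarily the way used by the elasticity image data generating device 10, in which such displacement information may be turned into strain (elasticity) values: the axial strain is estimated as the spatial gradient of the displacement along depth. The function name and parameters are hypothetical.

```python
import numpy as np

def strain_from_displacement(displacement, dz):
    """Estimate axial strain as the depth-wise gradient of tissue
    displacement (a common elastography step; illustrative only).

    displacement: 2-D array of axial displacements per pixel.
    dz: spacing between samples along the depth axis.
    Under the same compression, stiffer tissue deforms less and
    thus shows lower strain.
    """
    return np.gradient(displacement, dz, axis=0)
```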
In an example, the examined area in the body of the patient 40 is a section or a portion of an organ, such as a liver or a kidney. Alternatively, in another example, the examined area in the body of the patient 40 is a section or a portion of a breast of a woman, a womb of a pregnant woman, amniotic fluid in a womb, or a section or a portion of an embryo. The elasticity image display device 30 displays an elasticity image received from the elasticity image generating device 20. Examples of the elasticity image display device 30 include any device known to one of ordinary skill in the art that is configured to display an elasticity image on a screen or on paper, but are not limited thereto.
Elasticity image data generated by the elasticity image data generating device 10 is limited, by the characteristics of the probe 11, to the field of view visible to the probe 11 at any one point in time. Here, the field of view refers to the area that the probe 11 may view when the probe 11 is resting at a predetermined position on the body of the patient 40. In an example, an ultrasonic version of the elasticity imaging system 100 generates a cross-sectional elasticity image of a section of a liver of a patient in real time.
The image processor 21 receives pieces of elasticity image data indicating elasticity of different examined areas from the elasticity image data generating device 10, combines the received pieces of elasticity image data, and generates, from the combined pieces, an elasticity image indicating elasticity of an examined area having a wider field of view. In an example, contiguously input pieces of the received elasticity image data share a common examined area.
The image processor 21 includes a partial area detecting unit 22, a correcting unit 23, and a combining unit 24. In an example, the image processor 21 includes dedicated chips configured to perform the functions of the partial area detecting unit 22, the correcting unit 23, and the combining unit 24, or dedicated programs stored in the storage unit 25 and a general-purpose central processing unit (CPU) configured to perform those functions.
In an example, various pieces of data generated during an operation of the image processor 21 are stored in the storage unit 25. In another example, pieces of elasticity image data and elasticity image data generated by combining the pieces of elasticity image data are stored in the storage unit 25. In addition, in yet another example, pieces of information needed to combine the pieces of elasticity image data are stored in the storage unit 25. Examples of the storage unit 25 known to one of ordinary skill in the art include a hard disk drive, a read-only memory (ROM), a random access memory (RAM), a flash memory, and a memory card.
In an example, the partial area detecting unit 22 detects the second partial area 321, corresponding to the first partial area 311 of the first elasticity image 31, from the examined area of the second elasticity image 32 based on a similarity between elasticity values of pixels of a partial area that is a portion of the examined area of the second elasticity image 32 and elasticity values of pixels of the first partial area 311 of the first elasticity image 31. In an example of this case, the similarity of partial areas between the first and second elasticity images 31 and 32 is determined by comparing elasticity values of pixels of the first and second partial areas 311 and 321, or feature values of those pixels. In a further example of this case, the feature values refer to differential values between elasticity values of pixels of the first and second partial areas 311 and 321 and elasticity values of adjacent pixels.
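By way of illustration only, a minimal sketch of how such differential feature values might be computed follows; the choice of horizontal neighbor differences and the function name are assumptions for illustration, not the definitive implementation of the partial area detecting unit 22.

```python
import numpy as np

def differential_features(elasticity):
    """Convert per-pixel elasticity values into differential values,
    i.e., differences between each pixel and an adjacent pixel.

    elasticity: 2-D array of elasticity values of an elasticity image.
    Returns an array of the same shape holding, for each pixel, the
    difference from its right-hand neighbor (the last column is zero).
    """
    diff = np.zeros_like(elasticity, dtype=float)
    diff[:, :-1] = elasticity[:, 1:] - elasticity[:, :-1]
    return diff
```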
The correcting unit 23 of the image processor 21 corrects data of the first elasticity image 31 based on a detection result of the partial area detecting unit 22. That is, the correcting unit 23 corrects the data of the first elasticity image 31 based on a correlation between the first partial area 311 of the first elasticity image 31 and the second partial area 321 of the second elasticity image 32. In an example of this case, the correcting unit 23 corrects elasticity values of all pixels included in the first elasticity image 31, or elasticity values of pixels 333 corresponding to areas other than the first partial area 311 from among the data of the first elasticity image 31.
The combining unit 24 of the image processor 21 generates data of the third elasticity image 34 by combining data of the second elasticity image 32 with data of a corrected image 33 obtained from the data of the first elasticity image 31 based on a detection result of the partial area detecting unit 22. The combining unit 24 generates the third elasticity image 34 by combining the second elasticity image 32 with the corrected image 33, matching pixels of the partial areas 321 and 331 of the second elasticity image 32 and the corrected image 33, respectively. The third elasticity image 34 includes a combined area 343 obtained from an area 333 at the left of the first partial area 331 of the corrected image 33, and combined areas 341 and 342 obtained from the second partial area 321 and a right area 322 of the second elasticity image 32.
When the medical expert diagnoses a disease of the patient 40 while moving the probe 11 rightward, the examined area of the second elasticity image 32 is located at the right of the examined area of the first elasticity image 31, and, thus, the partial area detecting unit 22 determines rightmost pixels from among the pixels of the first elasticity image 31 as the first partial area 811.
On the contrary, when the medical expert diagnoses a disease of the patient 40 while moving the probe 11 leftward, the examined area of the second elasticity image 32 is located at the left of the examined area of the first elasticity image 31, and, thus, the partial area detecting unit 22 determines leftmost pixels A from pixels of the first elasticity image 31 as the first partial area 811. In addition, in an example, when the medical expert diagnoses a disease of the patient 40 while moving the probe 11 upward or downward, the examined area of the second elasticity image 32 is located at the top or bottom of the examined area of the first elasticity image 31, and, thus, the partial area detecting unit 22 determines uppermost pixels D or lowermost pixels C from the pixels of the first elasticity image 31 as the first partial area 811.
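A minimal sketch of this direction-dependent selection follows, assuming, for illustration, a fixed strip width; the function name, the direction labels, and the width parameter are hypothetical.

```python
def first_partial_area(image, direction, width=8):
    """Select the edge strip of the first elasticity image expected to
    overlap the second elasticity image, given the probe's direction of
    motion (illustrative sketch; 'width' is an assumed strip size)."""
    if direction == "right":  # second image lies to the right: rightmost pixels
        return image[:, -width:]
    if direction == "left":   # second image lies to the left: leftmost pixels
        return image[:, :width]
    if direction == "down":   # second image lies below: lowermost pixels
        return image[-width:, :]
    if direction == "up":     # second image lies above: uppermost pixels
        return image[:width, :]
    raise ValueError("unknown probe direction")
```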
The partial area detecting unit 22 detects (54) a second partial area 821, corresponding to the determined first partial area 811 of the differential data 81 of the first elasticity image, from among an examined area of the differential data 82 of the second elasticity image, based on a similarity between the differential values of the pixels of the determined first partial area 811 and differential values of pixels of the differential data 82 of the second elasticity image.
In an example of this case, the partial area detecting unit 22 sets a moving window, having the same matrix size as the pixels of the determined first partial area 811 of the differential data 81 of the first elasticity image, over the pixels of the differential data 82 of the second elasticity image, and detects the area having the greatest similarity to the feature values of the determined first partial area 811 as the second partial area 821 of the second elasticity image corresponding to the first partial area 811 of the first elasticity image.
In an example, the partial area detecting unit 22 obtains errors between the differential values of the pixels of the first partial area 811 of the differential data 81 of the first elasticity image and the differential values of the pixels of a partial area of the differential data 82 of the second elasticity image, and detects that partial area as the second partial area 821 of the second elasticity image when the sum of the errors is minimal and equal to or less than a preset allowance error. In another example, if the sum of the errors exceeds the allowance range, the partial area detecting unit 22 searches for another area in the second elasticity image within the allowance range by moving to new coordinates.
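By way of illustration only, the following sketch shows one possible form of this moving-window search, using a sum of absolute errors as the similarity measure; the function name, the error measure, and the exhaustive scan are assumptions for illustration rather than the definitive implementation.

```python
import numpy as np

def find_second_partial_area(first_area_diff, second_diff, allowance):
    """Slide a window of the same matrix size as the first partial area
    over the differential data of the second elasticity image and return
    the window offset whose summed error is minimal, provided that the
    minimal sum does not exceed the preset allowance error; otherwise
    return None so the search can move on to other coordinates."""
    h, w = first_area_diff.shape
    best_offset, best_err = None, np.inf
    for i in range(second_diff.shape[0] - h + 1):
        for j in range(second_diff.shape[1] - w + 1):
            window = second_diff[i:i + h, j:j + w]
            err = np.abs(window - first_area_diff).sum()
            if err < best_err:
                best_offset, best_err = (i, j), err
    return best_offset if best_err <= allowance else None
```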
Free-hand elastography, a currently used elasticity imaging technique for detecting a stress distribution in a subject, allows a user to apply a non-uniform force directly with a probe, so the force applied when the first elasticity image is measured may differ from the force applied when the second elasticity image is measured. In an example, detecting a partial area of the second elasticity image based on feature values that are differential values between elasticity values of the elasticity images reduces the risk of falsely detecting the partial area due to such a change in the applied force.
That is, although elasticity values of partial areas of different elasticity images vary, in an example, to an extent with the compression or relaxation of a probe when a medical expert diagnoses a disease, elasticity values of adjacent tissues in a human body do not change abruptly and remain locally consistent, so they are robust against a change in the extent to which the probe compresses and relaxes. In this regard, in another example, when differential values of elasticity values are used as feature values, even though the elasticity values of a first elasticity image and a second elasticity image change, the differential values of elasticity values of adjacent pixels in the first elasticity image remain similar to the differential values of elasticity values of adjacent pixels in the second elasticity image, and, thus, the risk of falsely detecting a partial area due to a difference between the elasticity values of the first elasticity image and the second elasticity image is reduced.
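The hypothetical numbers below make this concrete: scaling an image's elasticity values, as a stronger compression might, shifts the raw values far more than it shifts the adjacent-pixel differentials, so matching on differentials is less easily fooled.

```python
import numpy as np

a = np.array([[10.0, 12.0, 15.0],
              [11.0, 13.0, 16.0]])   # first image (hypothetical elasticity values)
b = 1.2 * a                          # same tissue measured under stronger compression

raw_error  = np.abs(b - a).sum()                     # 15.4: raw values diverge
diff_error = np.abs(np.diff(b, axis=1)
                    - np.diff(a, axis=1)).sum()      # 2.0: differentials stay close
```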
The correcting unit 23 obtains the correlation between the corresponding partial areas as the ratio of the sums of their elasticity values, as shown in Equation 1.

W=Σbij/Σaij  (1)

In Equation 1, W denotes a correlation, Σbij denotes a sum of the elasticity values of the pixels of the second partial area 721 of the data 72 of the second elasticity image, and Σaij denotes a sum of the elasticity values of the pixels of the first partial area 711 of the data 71 of the first elasticity image.
The correcting unit 23 corrects (62) any one of the data 71 of the first elasticity image and the data 72 of the second elasticity image based on the obtained correlation between the first partial area 711 of the data 71 of the first elasticity image and the second partial area 721 of the data 72 of the second elasticity image. In an example, when the second elasticity image is set to a main image and the first elasticity image is set to a sub-image, the correcting unit 23 corrects the data 71 of the first elasticity image. On the contrary, in another example, when the first elasticity image is set to a main image and the second elasticity image is set to a sub-image, the correcting unit 23 corrects the data 72 of the second elasticity image.
In an example, the correcting unit 23 corrects the data 71 of the first elasticity image according to Equation 2.

cij=W·aij  (2)
In Equation 2, cij denotes corrected data obtained from the data 71 of the first elasticity image, W denotes a correlation, and aij denotes the data 71 of the first elasticity image. In an example of this case, based on the correlation between the first partial area 711 of the data 71 of the first elasticity image and the second partial area 721 of the data 72 of the second elasticity image, the correcting unit 23 corrects the elasticity values of all pixels included in the data 71 of the first elasticity image, or the elasticity values of pixels of areas other than the first partial area from among the data 71 of the first elasticity image.
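A minimal sketch combining Equations 1 and 2 follows; the function and argument names are hypothetical, and the sketch assumes the second elasticity image is the main image, so that the first (sub) image is the one corrected.

```python
import numpy as np

def correct_sub_image(sub_image, sub_area, main_area):
    """Correct the sub-image so its elasticity scale matches the main
    image: W is the ratio of the sums of the elasticity values of the
    corresponding partial areas (Equation 1), and every pixel of the
    sub-image is scaled by W (Equation 2: cij = W * aij)."""
    W = main_area.sum() / sub_area.sum()  # Equation 1: W = sum(bij) / sum(aij)
    return W * sub_image                  # Equation 2 applied to all pixels
```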
The combining unit 24 generates (63) the data 73 of the third elasticity image by combining the data 72 of the second elasticity image with the corrected data obtained from the data 71 of the first elasticity image. The data 73 of the third elasticity image is generated by combining a corrected area 731 obtained from areas other than the first partial area 711 of the data 71 of the first elasticity image with the second partial area 721 and a right area from among the data 72 of the second elasticity image.
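By way of illustration only, a sketch of this combining step follows, assuming purely horizontal probe motion and images of equal height; 'offset_cols', the column at which the second image's examined area begins inside the corrected image, is a hypothetical parameter obtained from the partial-area search.

```python
import numpy as np

def combine_images(corrected, second, offset_cols):
    """Stitch the corrected image and the second elasticity image into a
    third image with a wider field of view by matching the pixels of
    their corresponding partial areas (illustrative, horizontal case)."""
    height = corrected.shape[0]
    width = offset_cols + second.shape[1]                # widened field of view
    third = np.zeros((height, width))
    third[:, :offset_cols] = corrected[:, :offset_cols]  # area left of the overlap
    third[:, offset_cols:] = second                      # overlap plus right area
    return third
```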
In a further example, the imaging system may include a transmission/reception unit configured to separately transmit and receive a B-mode image signal used to generate a B-mode ultrasound image and an elasticity image signal used to generate an elasticity image. In an additional example, the B-mode ultrasound image is generated by performing B-mode processing, such as logarithmic compression or envelope detection, on data of a response signal obtained from a probe. The B-mode processing is obvious to one of ordinary skill in the art, and thus a detailed explanation thereof will not be given.
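A minimal sketch of this well-known B-mode chain, envelope detection followed by logarithmic compression, is given below for a single scan line; the function name, the Hilbert-transform envelope detector, and the dynamic-range parameter are illustrative assumptions rather than the device's actual processing.

```python
import numpy as np
from scipy.signal import hilbert

def b_mode_line(rf_line, dynamic_range_db=60.0):
    """Form one B-mode scan line from raw RF echo data."""
    envelope = np.abs(hilbert(rf_line))               # envelope detection
    envelope = envelope / envelope.max()              # normalize to [0, 1]
    db = 20.0 * np.log10(np.maximum(envelope, 1e-6))  # logarithmic compression
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
```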
The units described herein may be implemented using hardware components and software components, such as, for example, microphones, amplifiers, band-pass filters, analog-to-digital converters, and processing devices. A processing device may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a processor, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a field programmable array, a programmable logic unit, a microprocessor, or any other device capable of responding to and executing instructions in a defined manner. The processing device may run an operating system (OS) and one or more software applications that run on the OS. The processing device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, the description of a processing device is used as singular; however, one skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, a processing device may include multiple processors or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors. As used herein, a processing device configured to implement a function A includes a processor programmed to run specific software. In addition, a processing device configured to implement a function A, a function B, and a function C may include configurations, such as, for example, a processor configured to implement all of functions A, B, and C; a first processor configured to implement function A and a second processor configured to implement functions B and C; a first processor configured to implement function A, a second processor configured to implement function B, and a third processor configured to implement function C; a first processor configured to implement functions A and B and a second processor configured to implement function C; a first processor configured to implement functions A, B, and C and a second processor configured to implement functions A, B, and C; and so on.
The software components may include a computer program, a piece of code, an instruction, or some combination thereof, for independently or collectively instructing or configuring the processing device to operate as desired. Software and data may be embodied permanently or temporarily in any type of machine, component, physical or virtual equipment, computer storage medium or device, or in a propagated signal wave capable of providing instructions or data to or being interpreted by the processing device. The software also may be distributed over network-coupled computer systems so that the software is stored and executed in a distributed fashion. For example, the software and data may be stored by computer-readable recording media. Computer-readable recording media may include any data storage device that can store data which can be thereafter read by a computer system or processing device. Examples of computer-readable recording media include read-only memory (ROM), random-access memory (RAM), CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices. In addition, functional programs, codes, and code segments that accomplish the examples disclosed herein can be easily construed by programmers skilled in the art to which the examples pertain based on and using the flow diagrams and block diagrams of the figures and their corresponding descriptions as provided herein.
Program instructions to perform a method described herein, or one or more operations thereof, may be recorded, stored, or fixed in computer-readable storage media. The program instructions may be implemented by a computer. For example, the computer may cause a processor to execute the program instructions. The media may include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable storage media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media, such as CD-ROM disks and DVDs; magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as that produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. The program instructions, that is, software, may be distributed over network-coupled computer systems so that the software is stored and executed in a distributed fashion. For example, the software and data may be stored by one or more computer-readable storage media. Further, the described unit to perform an operation or a method may be hardware, software, or some combination of hardware and software. For example, the unit may be a software package running on a computer or the computer on which that software is running.
A number of examples have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.