Embodiments described herein relate generally to a medical image display apparatus which displays image data for the execution of image diagnosis.
Imaging apparatuses which generate image data by imaging objects include mammography apparatuses, which image the breasts by irradiating them with X-rays, and X-ray diagnostic apparatuses, X-ray CT apparatuses, and MRI apparatuses, which image various regions such as the chest. In addition, an imaging apparatus can perform image processing by digitizing images. The imaging apparatus can provide images having a plurality of features by performing a plurality of types of image processing on the image data (original image data) obtained before image processing. When performing diagnosis while displaying the image data obtained by the imaging apparatus on a monitor, the user sometimes performs radiographic interpretation by comparing the image data currently displayed on the monitor (current image data) with image data (differently processed image data) obtained by image processing different from that performed for the current image data, instead of making a decision by using only the current image data. In addition, the user sometimes performs radiographic interpretation by using a magnifying glass (loupe) function, which selects a lesion region (region of interest) on the current image data, enlarges only the selected region of interest, and window-displays the enlarged image data.
When the user finds a region of interest on the current image data, it is necessary to switch to the differently processed image data and compare the two. This makes the user lose sight of the region of interest, because he/she must take his/her eyes off the current image data, and diagnosis takes much time. Furthermore, using the loupe function displays the enlarged image data so that it overlaps the region of interest, which makes the region of interest difficult to grasp.
A medical image display apparatus according to an embodiment includes an input unit, a display unit, and an image data processing unit. The input unit inputs designation of a region of interest on a first medical image obtained by imaging an object. The display unit displays the first medical image and, adjacently to the region of interest, an enlarged medical image obtained by enlarging an image in the region of interest. The image data processing unit decides a position at which the enlarged medical image is displayed, based on a position of the region of interest on the first medical image and a position of the first medical image.
An embodiment will be described below with reference to the accompanying drawings.
The image diagnostic system 600 includes an image display apparatus (medical image display apparatus) 100 which displays the image data generated by the imaging apparatus 200, the image data processed by the image processing apparatus 300, and the image data stored in the image storage apparatus 400. The imaging apparatus 200, the image processing apparatus 300, the image storage apparatus 400, and the image display apparatus 100 transmit and receive image data via a network 500.
The image display apparatus 100 includes a transmission/reception unit 10 which transmits and receives image data to and from the imaging apparatus 200, the image processing apparatus 300, and the image storage apparatus 400, and an image data storage unit 20 which stores the image data received by the transmission/reception unit 10. The image display apparatus 100 includes an image data processing unit 30 which performs processing for displaying the image data received by the transmission/reception unit 10 from each apparatus, and the image data stored in the image data storage unit 20, and a display unit 40 which displays the image data processed by the image data processing unit 30.
The image display apparatus 100 also includes an input unit 50 including input devices such as a mouse, keyboard, and joystick for input operation to designate a region of interest on image data processed by the image data processing unit 30, input operation to display the image data processed by the image data processing unit 30 on the display unit 40, and the like, and a control unit 60 which controls the transmission/reception unit 10, the image data storage unit 20, the image data processing unit 30, and the display unit 40.
An example of the operation of the image diagnostic system 600 will be described below with reference to
The imaging apparatus 200 generates image data by imaging an object and transmits the data to the image processing apparatus 300. Using, as the image data (first image data) for diagnosis, the image data transmitted from the imaging apparatus 200 or image data obtained by processing it, the image processing apparatus 300 generates the second image data (second medical image) and the third image data (third medical image) by performing two different types of image processing on the first image data (first medical image). The image processing apparatus 300 transmits the first, second, and third image data to the image storage apparatus 400. The image storage apparatus 400 stores the first, second, and third image data transmitted from the image processing apparatus 300 in association with each other. The first image data is, for example, a radiographic image.
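By way of illustration only (this sketch is not part of the described apparatus), the generation and association of the three image data sets could look as follows; the two processing functions are hypothetical stand-ins for whatever two types of image processing the image processing apparatus 300 actually applies:

```python
def sharpen(img):
    # Hypothetical processing type 1: boost each interior pixel by its
    # difference from the 4-neighbour mean (a crude edge enhancement).
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            mean = (img[r-1][c] + img[r+1][c] + img[r][c-1] + img[r][c+1]) / 4.0
            out[r][c] = img[r][c] + (img[r][c] - mean)
    return out

def tone_curve(img):
    # Hypothetical processing type 2: square-root tone compression.
    return [[v ** 0.5 for v in row] for row in img]

# A tiny stand-in for the first image data (e.g. a radiograph).
first = [[0.0, 0.25, 0.0], [0.25, 1.0, 0.25], [0.0, 0.25, 0.0]]
# The storage apparatus keeps the three data sets in association with each other.
record = {"first": first, "second": sharpen(first), "third": tone_curve(first)}
```

Storing the three images in one record mirrors the association maintained by the image storage apparatus 400, so a later request for the first image data can also retrieve the second and third.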
The transmission/reception unit 10 of the image display apparatus 100 transmits request information to the image storage apparatus 400 in accordance with an input to request image data from the input unit 50. The image storage apparatus 400 transmits the first image data and the second and third image data associated with the first image data to the image display apparatus 100 in accordance with the request information from the image display apparatus 100. The transmission/reception unit 10 receives the first, second, and third image data transmitted from the image storage apparatus 400 in accordance with the transmission of the request information. The image data storage unit 20 stores the first image data and the second and third image data in association with each other. Note that the second and third image data may be images obtained by imaging an object from a plurality of angles.
The user then operates the input unit 50 to perform input to set display conditions as follows: setting a display image to the first image data 21, setting a different image to the second image data 22, and setting the display mode for the different image to the superimposition display mode. In this case, the image data processing unit 30 reads out the first image data 21 from the image data storage unit 20. The image data processing unit 30 then places a movable cursor for designating a region of interest on the first image data 21. The image data processing unit 30 then arranges the first image data 21 and the cursor at predetermined positions and outputs them to the display unit 40. The display unit 40 displays the first image data 21 and the cursor output from the image data processing unit 30.
The user then operates, for example, the mouse via the input unit 50 to move the cursor 74 to a region of interest, for example, a region containing the data of a lesion, on the first image data 21. In addition, when the user designates the region of interest and performs input operation for the display of a different image by pressing the left button of the mouse, the image data processing unit 30 places the first image data 21 at a position corresponding to the first display area 411. In addition, as shown in
In this case, data 221 of the same region (the same position) as the region of interest surrounded by the frame 75 on the first image data 21 is read out from the second image data 22 stored in the image data storage unit 20. The image data processing unit 30 places, adjacently to, for example, one of the four sides of the frame 75, the readout different image data obtained as a region of interest on the second image data 22 based on the superimposition display mode set as a display condition. The image data processing unit 30 places the entire region of the different image data at a position where it is superimposed on the first image data 21. The image data processing unit 30 outputs the first image data 21, the frame 75, and the different image data to the display unit 40. The display unit 40 displays the first image data 21 and the frame 75 and the different image data which are superimposed on the first image data 21.
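As a non-limiting sketch (the coordinates and helper below are hypothetical, not part of the embodiment), reading out the same region from the second image data and placing it adjacent to one side of the frame could be expressed as:

```python
def crop_roi(image, x, y, w, h):
    """Extract the data of the same region (same position) from an image,
    given the frame's top-left corner (x, y) and size (w, h)."""
    return [row[x:x + w] for row in image[y:y + h]]

# A 6x6 stand-in for the second image data, with a distinct value per pixel.
second = [[r * 10 + c for c in range(6)] for r in range(6)]

# Frame 75 at x=2, y=1, width 3, height 2: read the matching region out.
roi = crop_roi(second, 2, 1, 3, 2)

# Place the readout data adjacent to one of the frame's four sides,
# here just below the frame, superimposed on the first image data.
overlay_pos = (2, 1 + 2)  # (x, y) immediately below the frame
```

Because the crop uses the frame's own coordinates on the first image data, the different image always shows the anatomically identical region of the second image data.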
Note that the image data processing unit 30 generates a tomosynthesis image based on the images obtained by imaging the object from a plurality of angles. More specifically, the image data processing unit 30 generates a tomosynthesis image by predetermined processing based on a plurality of images respectively corresponding to a plurality of angles relative to the object. The predetermined processing is, for example, a shift addition method or FBP (Filtered BackProjection) method. For example, the image data processing unit 30 generates a plurality of tomosynthesis images by changing a slice of the object at predetermined intervals. The image data processing unit 30 causes the image data storage unit 20 to store the generated tomosynthesis images. The image data processing unit 30 may generate different image data by using a tomosynthesis image as the second or third image data.
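The shift addition method mentioned above can be sketched, purely as an illustration with hypothetical 1-D projection data (a real implementation would operate on 2-D projections, and the FBP method would additionally filter each projection before back-projection):

```python
def shift_add(projections, shifts):
    """Reconstruct one slice profile by shifting each projection so that
    structures in the target slice align, then averaging (shift addition)."""
    h = len(projections[0])
    acc = [0.0] * h
    for proj, s in zip(projections, shifts):
        for i in range(h):
            j = i + s          # per-angle shift for the chosen slice depth
            if 0 <= j < h:
                acc[i] += proj[j]
    n = len(projections)
    return [v / n for v in acc]

# Three hypothetical projections of the same point taken from three angles;
# the point appears at a different position in each projection.
projections = [[0, 0, 1, 0, 0], [0, 1, 0, 0, 0], [0, 0, 0, 1, 0]]
shifts = [0, -1, 1]            # shifts that bring the target slice into focus
slice_profile = shift_add(projections, shifts)
```

Structures in the selected slice add coherently while structures at other depths are smeared out; varying the shifts at predetermined intervals yields the plurality of tomosynthesis images described above.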
Note that an enlarged medical image obtained by enlarging a region of interest on the first image data (first medical image) may be used as the above different image data. In this case, the image data processing unit 30 generates an enlarged medical image in accordance with the setting (designation) of a region of interest. For example, the image data processing unit 30 decides a position at which the enlarged medical image is to be displayed, based on the position of the region of interest on the first image data (first medical image) and the position of the first image data. Note that different image data may be image data obtained by executing tone processing, frequency processing, and the like for the first medical image.
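One possible way to decide the display position of the enlarged medical image from the region-of-interest position and the first medical image's extent (a minimal sketch; the magnification factor, gap, and the right-then-left fallback are assumptions, not limitations of the embodiment) is:

```python
def place_enlarged(image_w, roi_x, roi_w, gap=4):
    """Decide the x position of the enlarged medical image: place it to the
    right of the frame, or flip it to the left side when the frame is so
    close to the right end of the first image that it no longer fits."""
    enlarged_w = roi_w * 2                 # assumed 2x magnification
    right = roi_x + roi_w + gap
    if right + enlarged_w <= image_w:
        return right                       # fits adjacently on the right
    return roi_x - gap - enlarged_w        # flip to the left side
```

This captures the behavior described later for the right end portion of the first image data, where the different image data is arranged near a side other than the right side of the frame.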
Superimposing and displaying the frame 75 surrounding the region of interest on the first image data 21 on the display unit 40 in this manner allows the user to easily grasp the region of interest on the first image data 21. In addition, superimposing and displaying the frame 75 and the different image data 24 arranged near the frame 75 on the first image data 21 on the display unit 40 allows the user to observe the different image data 24 without losing sight of the region of interest on the first image data 21. Furthermore, since the user can observe the region of interest on the first image data 21 and the different image data 24 without changing the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24.
In this case, when the user performs input operation to erase the different image by releasing the left button of the mouse of the input unit 50, the display unit 40 displays the screen 70 in
Assume that the user has input a display condition with the input unit 50 to change the different image set on the second image data 22 into an enlarged image (enlarged medical image) while the display unit 40 displays the screen 70 in
Superimposing and displaying the frame 75 and different image data 24a arranged near the frame 75 on the first image data 21 on the display unit 40 in this manner allows the user to observe the different image data 24a without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24a without changing the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24a.
Subsequently, when the user performs input operation to move the frame 75 displayed in the screen 70a in
Superimposing the moved frame 75a on the first image data 21 and displaying it on the display unit 40 in this manner allows the user to easily grasp the region of interest of the first image data 21. In addition, superimposing the moved frame 75a and different image data 24b arranged near the frame 75a on the first image data 21 and displaying them on the display unit 40 allows the user to observe the different image data 24b without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24b without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24b.
When the frame 75a reaches the right end portion of the first image data 21, the display unit 40 displays the first image data 21, and the frame and different image data which are located on the right end portion and superimposed on the first image data 21.
Superimposing and displaying the frame 75b moved to the right end portion of the first image data 21 on the display unit 40 in this manner allows the user to easily grasp the region of interest of the first image data 21. In addition, superimposing the frame 75b located on the right end portion and the different image data 24c arranged near a side other than the right side of the frame 75b on the first image data 21 and displaying them on the display unit 40 allows the user to observe the different image data 24c without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24c without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24c.
In addition, superimposing each of the frames 75, 75a, and 75b and each of the different image data 24, 24a, 24b, and 24c on the first image data 21 also makes it possible to display the image data on a display unit whose maximum displayable area is only as large as the first display area 411, which is smaller than the display area 41 of the display unit 40.
Assume that, while the screen 70 in
In this case, as shown in
Displaying again, on the display unit 40, a region of interest on the first image data 21 for which the different image data has once been displayed, upon identifying that region of interest, makes it possible to prevent oversight of a region of interest that has not yet been displayed as different image data. It also avoids the waste of carelessly designating the same region of interest twice.
Assume that the user has input a display condition with the input unit 50 to change the different image display mode set to the superimposition display mode into the second display area mode while the display unit 40 displays the screen 70 in
Note that if the different image data 24 has a size large enough to protrude from the second display area 412 when being displayed, the image data processing unit 30 reduces the different image data 24 to allow its entire region to be included in the second display area 412. Assume that in the following description, the different image data 24 has a size that allows its entire region to be included in the second display area 412.
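The reduction of the different image data to fit inside the second display area can be sketched as a simple fit-to-area scale factor (an illustration only; the embodiment does not prescribe a particular scaling rule):

```python
def fit_scale(img_w, img_h, area_w, area_h):
    """Scale factor that shrinks the different image data so that its
    entire region is included in the display area; never enlarges."""
    return min(1.0, area_w / img_w, area_h / img_h)
```

Taking the minimum over both axes guarantees that no part of the different image data protrudes from the second display area in either direction.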
In this case, when the user performs input operation to move the frame 75 to the upper side or the lower side on the first image data 21 while pressing the left button of the mouse of the input unit 50, the image data processing unit 30 reads out, from the second image data 22, the data of the same region as that surrounded by the moved frame 75 on the first image data 21. The image data processing unit 30 generates different image data from the readout second image data 22. The image data processing unit 30 then places the different image data at a position in the second display area 412 synchronized with the moved frame 75. The image data processing unit 30 then outputs the first image data 21, the moved frame 75, and the different image data to the display unit 40. The display unit 40 displays the first image data 21 and the moved frame 75 on the first image data 21 in the first display area 411. In addition, the display unit 40 displays, in the second display area 412, the different image data associated with the region of interest, which moves in the same direction as the frame 75 synchronously with it and changes synchronously with it.
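The synchronous behavior just described, in which moving the frame re-reads the matching region of the second image data so the different image changes with the frame, could be sketched as follows (hypothetical class and sizes; purely illustrative):

```python
class SyncedDisplay:
    """Minimal sketch: each frame movement re-crops the second image data
    so the different image in the second display area tracks the frame."""

    def __init__(self, second_image, roi_w, roi_h):
        self.second = second_image
        self.w, self.h = roi_w, roi_h
        self.x = self.y = 0
        self.different = self._crop()

    def _crop(self):
        return [row[self.x:self.x + self.w]
                for row in self.second[self.y:self.y + self.h]]

    def move_frame(self, dx, dy):
        # Clamp so the frame stays inside the image (assumed same size
        # as the first image data for simplicity).
        self.x = max(0, min(self.x + dx, len(self.second[0]) - self.w))
        self.y = max(0, min(self.y + dy, len(self.second) - self.h))
        self.different = self._crop()  # changes synchronously with the frame

# A 5x5 stand-in for the second image data, with a distinct value per pixel.
display = SyncedDisplay([[r * 10 + c for c in range(5)] for r in range(5)], 2, 2)
display.move_frame(1, 2)   # drag the frame right 1, down 2
```

Re-cropping on every movement event is what keeps the different image data associated with the region of interest as the frame moves.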
In this manner, the display unit 40 displays the frame 75 surrounding the region of interest on the first image data 21 arranged in the first display area 411 and the different image data 24 arranged in the second display area 412 near the frame 75. This makes it possible to observe the different image data 24 without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24 without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24. Furthermore, since the user can observe the entire region of the first image data 21 without making the different image data 24 cover it, it is possible to easily move the frame 75 to another region of interest.
Assume next that the user has input a display condition with the input unit 50 to change and set the different image display mode set to the superimposition display mode into the third display area mode while the display unit 40 displays the screen 70 in
Note that if the different image data 24 has a size large enough to protrude from the third display area 413 when being displayed, the image data processing unit 30 reduces the different image data 24 to allow its entire region to be included in the third display area 413. Assume that in the following description, the different image data 24 has a size that allows its entire region to be included in the third display area 413.
In this case, when the user performs input operation to move the frame 75 to the left side or the right side on the first image data 21 while pressing the left button of the mouse of the input unit 50, the image data processing unit 30 reads out, from the second image data 22, the data of the same region as that surrounded by the moved frame 75 on the first image data 21. The image data processing unit 30 generates different image data from the readout second image data 22. The image data processing unit 30 then places the different image data at a position in the third display area 413 linked to the moved frame 75. The image data processing unit 30 then outputs the first image data 21, the moved frame 75, and the different image data to the display unit 40. The display unit 40 displays the first image data 21 and the moved frame 75 on the first image data 21 in the first display area 411. In addition, the display unit 40 displays, in the third display area 413, the different image data associated with the region of interest, which moves in the same direction as the frame 75 synchronously with it and changes synchronously with it.
In this manner, the display unit 40 displays the frame 75 surrounding the region of interest on the first image data 21 arranged in the first display area 411 and the different image data 24 arranged in the third display area 413 near the frame 75. This makes it possible to observe the different image data 24 without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24 without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24. Furthermore, since the user can observe the entire region of the first image data 21 without making the different image data 24 cover it, it is possible to easily move the frame 75 to another region of interest.
Assume next that the user has input display conditions with the input unit 50, while the display unit 40 displays the screen 70 in
In this case, when the user performs input operation to move the frame 75 to the upper side or the lower side on the first image data 21 while pressing the left button of the mouse of the input unit 50, the display unit 40 displays the first image data 21 and the frame 75 moved to one side on the first image data 21 in the first display area 411. In addition, the display unit 40 displays, in the second display area 412, the different image data obtained from the second image data 22 associated with the region of interest, which moves in the same direction as the frame 75 synchronously with it and changes synchronously with it. In addition, the display unit 40 displays, in the third display area 413, the different image data obtained from the third image data 23 associated with the region of interest, which remains below the frame 75 and changes synchronously with the frame 75.
In addition, when the user performs input operation to move the frame 75 to the left side or the right side on the first image data 21 while pressing the left button of the mouse of the input unit 50, the display unit 40 displays the first image data 21 and the frame 75 moved to one side on the first image data 21 in the first display area 411. In addition, the display unit 40 displays, in the second display area 412, the different image data obtained from the second image data 22 associated with the region of interest which is stopped at the right of the frame 75 and changes synchronously with the frame 75. Furthermore, the display unit 40 displays, in the third display area 413, the different image data obtained from the third image data 23 associated with the region of interest which is moved in the same direction as that of the frame 75 synchronously with the frame 75 and changes synchronously with the frame 75.
In this manner, the display unit 40 displays the frame 75 surrounding the region of interest on the first image data 21 arranged in the first display area 411 and the two different image data 24 and 25 arranged in the second and third display areas 412 and 413 near the frame 75. This makes it possible to observe the two different image data 24 and 25 without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and the different image data 24 and 25 without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with the different image data 24 and 25. Furthermore, since the user can observe the entire region of the first image data 21 without making the different image data 24 and 25 cover it, it is possible to easily move the frame 75 to another region of interest.
According to the above embodiment, it is possible to easily grasp a region of interest on the first image data 21 by superimposing each of the frames 75, 75a, and 75b surrounding the region of interest on the first image data 21 and displaying them on the display unit 40. In addition, superimposing each of the frames 75, 75a, and 75b and each of the different image data 24, 24a, 24b, and 24c arranged near the frame on the first image data 21 and displaying them on the display unit 40 allows the user to observe each of the different image data 24, 24a, 24b, and 24c without losing sight of the region of interest on the first image data 21. In addition, since the user can observe the region of interest on the first image data 21 and each of the different image data 24, 24a, 24b, and 24c without moving the direction of the eyes, it is possible to easily compare the region of interest on the first image data 21 with each of the different image data 24, 24a, 24b, and 24c.
In addition, it is possible to easily grasp a region of interest on the first image data 21 by displaying, on the display unit 40, the frame 75 surrounding the region of interest on the first image data 21 arranged in the first display area 411, together with the different image data 24 arranged near the frame 75 in the second or third display area 412 or 413, or the different image data 24 and 25 arranged in the second and third display areas 412 and 413. Furthermore, it is possible to observe the different image data 24, or the different image data 24 and 25, without losing sight of the region of interest on the first image data 21, and to observe both without moving the direction of the eyes. It is therefore possible to easily compare the region of interest on the first image data 21 with the different image data 24, or with the different image data 24 and 25. Furthermore, since the entire region of the first image data 21 can be observed without being covered by the different image data 24 or the different image data 24 and 25, it is possible to easily move the frame 75 to another region of interest.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2012-228462 | Oct 2012 | JP | national |
2013-207749 | Oct 2013 | JP | national |
This application is a Continuation application of PCT Application No. PCT/JP2013/077003, filed Oct. 3, 2013 and based upon and claiming the benefit of priority from the Japanese Patent Application No. 2012-228462, filed Oct. 15, 2012 and the Japanese Patent Application No. 2013-207749, filed Oct. 2, 2013, the entire contents of all of which are incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/JP2013/077003 | Oct 2013 | US |
Child | 14457144 | US |