1. Field of the Invention
The present invention relates to a technique for implementing faster changes of the display area in an image viewer.
2. Description of the Related Art
In the field of pathology, a virtual slide system, which images a test sample placed on a preparation and digitizes the image to enable pathological diagnosis on a display, is used instead of an optical microscope, which is a conventional pathological diagnostic tool. By digitizing the pathological diagnosis using the virtual slide system, a conventional image of a test sample captured by an optical microscope can be handled as digital data. Thereby improved convenience can be expected in terms of explanations to a patient by using digital images, sharing rare cases, quickening remote diagnosis, and making education and training efficient.
An image viewer for visualizing the image data digitized by a virtual slide system serves as the interface which directly connects the system and a pathologist (user), hence the ease of use of the image viewer has a major influence on the operability of the system itself.
The operability of a conventional image viewer, however, is sometimes inferior to that of a microscope. For example, the size of pathological image data is normally huge, and can be several times to several hundred times the size of the area displayed on screen, depending on the case. If such huge image data is displayed on the image viewer, and the display data is updated by inputting an instruction such as scrolling, the response of the screen display to the input is delayed in some cases. This response delay often irritates the user of the image viewer, and could be a reason for dampening the popularity of virtual slide systems. Therefore improving the display responsiveness of the image viewer eliminates one barrier to a pathologist using the virtual slide system, and is very important.
Improving the display responsiveness of the image viewer is not a problem limited to the virtual slide system, but is a problem that is widely recognized. To solve this problem, image data of a volume that exceeds the screen display area is normally stored in advance in a buffer memory of which access speed is relatively fast, and when the screen is updated by scrolling or the like, the next display data is read from the buffer memory. Herein below, the processing of reading the image data or a part of the image data in advance into a buffer area, of which access speed is fast, in order to improve the responsiveness of the screen display, is called “buffering”. This processing is also called “caching”, and the memory used for caching may be called a “cache memory” or a “cache”. In this description, the terms buffer and cache have the same meaning.
Buffering is a well known processing method, but in standard buffering, an area larger than the screen display area is read with equal margins in the vertical and horizontal directions. In other words, in this case the area that is actually displayed on screen is located at the center of the image data area which was read into the buffer memory. This method is effective when vertical and horizontal scrolling is used to change the image displayed on screen. However the responsiveness of the image viewer may drop if the scrolling amount of one event is large. This is because the area to be displayed on screen next moves outside the buffered area, and therefore the image data must be read from a storage device which is slower than the buffer memory. To prevent this state, more buffer capacity could be secured, but this does not mean that the utilization efficiency of the buffer memory improves, and most of the buffered data ends up being wasted. In other words, generally speaking, it is necessary to use the buffer memory space as efficiently as possible in order to improve the display responsiveness of the image viewer.
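As a non-limiting illustration of the standard symmetric buffering described above, the following Python sketch computes the buffered region as the display area extended by an equal margin in all directions (the function name and parameters are illustrative assumptions, not from the specification):

```python
def buffered_region(x, y, width, height, margin):
    """Return the symmetric buffer region around the on-screen area.

    (x, y) is the top-left corner of the display area; the buffered
    region extends by `margin` pixels equally on all four sides, so
    the displayed area sits at its center.
    """
    return (x - margin, y - margin, width + 2 * margin, height + 2 * margin)
```

A large scroll that jumps outside this symmetric region forces a slow read from storage, which is exactly the weakness the invention addresses.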
The essential cause of these problems is that the next display image cannot be accurately predicted. In other words, if the next display image can be predicted with high probability, then the utilization efficiency of the buffered data can be improved.
In Japanese Patent Application Laid-open No. 2006-113801, for example, if text data is included in the image data, the content of the image is analyzed first using the characteristic that the text is read in a specific direction, then data to be buffered is selected. Thereby the technique disclosed in Japanese Patent Application Laid-open No. 2006-113801 implements both a faster scrolling display and a more efficient use of memory.
In the technique disclosed in Japanese Patent Application Laid-open No. 2006-113801, the reading direction of the text is predetermined, therefore buffering that utilizes this characteristic is effective. However, in the case of an image (e.g. a pathological image) that does not include semantic data such as text, a direction cannot be uniquely determined from the characteristics of the image, hence there is no method for controlling buffering that implements both a faster speed and a more efficient use of memory. Furthermore, the technique of Japanese Patent Application Laid-open No. 2006-113801 is effective for displaying an area adjacent to the current display image by scrolling, but responsiveness drops when an image located at a position distant from the current display image is displayed.
With the foregoing in view, it is an object of the present invention to provide a technique to improve the display responsiveness and the utilization efficiency of the buffer memory in pathological diagnosis.
A first aspect of the present invention resides in an image data-forming apparatus that generates display image data from pathological image data, the display image data being generated for a viewer having a function of displaying a part of a pathological image and changing the display target part according to an operation by a user,
the image data-forming apparatus comprising:
an image data buffer unit that can temporarily store a part of the pathological image data;
an image data output unit that generates display image data from the pathological image data, and that uses data stored in the image data buffer unit when data required for generating the display image data is stored in the image data buffer unit; and
a buffering control unit that implements control of storing data on a part of the pathological image in the image data buffer unit, wherein
the pathological image data includes data on a plurality of annotations attached to certain positions in the pathological image, and
the buffering control unit controls buffering based on information on the pathological diagnosis linked to each of the plurality of annotations.
A second aspect of the present invention resides in an image data-forming method comprising:
an image data buffer step of temporarily storing a part of the pathological image data in an image data buffer unit;
an image data output step of generating display image data from the pathological image data for a viewer having a function of displaying a part of the pathological image and changing the display target part according to an operation by the user, and using the data stored in the image data buffer unit when the data required for generating the display image data is stored in the image data buffer unit;
a step of acquiring data on a plurality of annotations, included in the pathological image data and attached to certain positions in the pathological image; and
a step of implementing control of storing a part of the pathological image in the image data buffer unit based on the information on the pathological diagnosis linked to each of the plurality of annotations.
A third aspect of the present invention resides in a non-transitory computer readable storage medium storing a program that causes an image data-forming apparatus to execute:
an image data buffer step of temporarily storing a part of the pathological image data in an image data buffer unit;
an image data output step of generating display image data from the pathological image data for a viewer having a function of displaying a part of the pathological image and changing the display target part according to an operation by the user, and using the data stored in the image data buffer unit when the data required for generating the display image data is stored in the image data buffer unit;
a step of acquiring data on a plurality of annotations, included in the pathological image data and attached to certain positions in the pathological image; and
a step of implementing control of storing a part of the pathological image in the image data buffer unit based on the information on the pathological diagnosis linked to each of the plurality of annotations.
According to this invention, the display responsiveness and the utilization efficiency of the buffer memory can be improved in pathological diagnosis.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
The present invention relates to a technique for improving the display responsiveness upon moving the display area in an image viewer that has a function to display an image corresponding to a part of original image data, and to change (e.g. scrolling, jumping) the display portion according to the operation by the user. In concrete terms, according to the present invention, data to be buffered is appropriately controlled in an apparatus that generates image data for display (display image data) from the original image data, thereby the utilization efficiency of the buffer memory is improved and as a result, the display responsiveness is comprehensively improved.
More particularly, according to the present invention, original image data is handled as a group of block image data, a buffering priority is determined for each block image data, and the block image data having the highest priority is buffered preferentially. This buffering priority is determined based on the result of analyzing information of annotations attached to the image. Here “annotation” refers to a marker that is attached to an individual position (point or area) in the image. Some information on pathological diagnosis can be linked to a specific annotation. (Information linked to an annotation is called “annotation information”.) Any type of information may be included in annotation information. Typical examples include the creation date of the annotation, its creator, the type of annotation, its importance (state of lesion and stage of disease), diagnostic remarks, and other comments. If an annotation is created by automatic diagnostic software or the like, diagnostic information created by such software and information that identifies the software may be included in the annotation information. In this description, various information other than comment information is called “attribute information”. Depending on the context, the marker and the annotation information together may be called an “annotation”.
In a virtual slide system which will be described in the following embodiment, important information, such as remarks based on pathological diagnosis, is written as a comment in an annotation (in this description, the remarks based on the pathological diagnosis and the related attribute information are collectively called “pathological diagnostic information”). Therefore if an annotation is included in an image, it is more likely that the image around the annotation is viewed by an image viewer, compared with other portions of the image. Hence if the image around an annotation is buffered in advance, the display responsiveness can be improved when the user requests to display the peripheral area thereof. This effect is conspicuous when the display request is for an area distant from the currently displayed area, which cannot be displayed immediately by scrolling alone. In other words, the position information of the annotations is utilized for predicting the data to be buffered. Further, if only annotations that satisfy a certain condition are targeted for buffering, then buffering with good memory utilization efficiency can be implemented even when the number of annotations is large or when the capacity of the buffer memory is limited.
The image processing system according to Embodiment 1 will be described with reference to the drawings. The method shown in this embodiment is implemented under the system configuration depicted in
In this embodiment, the image-forming apparatus 102 can be implemented by, for example, a computer having the hardware configuration shown in
304 is an image data output unit which generates display image data, using the image data received from the image data storage unit 302 or the image data buffer unit 303, to be outputted to the image display apparatus 103. 305 is a screen update information input unit, which accepts input by the user via an input device such as a mouse, and transmits a screen update command to the buffering control unit 306 based on this input. The buffering control unit 306 has a function to control data to be stored in the image data buffer unit 303. The buffering control unit 306 transmits a data transfer command to the image data storage unit 302, or receives display area information from a display area information input unit 310, and transmits a buffering control command to the image data buffer unit 303 based on this information. Further, the buffering control unit 306 transmits a command to analyze the annotation information, attached to the image data, to an annotation information analysis unit 308, and receives buffering priority information from a buffering priority setting unit 309.
307 is an annotation analysis condition setting unit, which generates an analysis condition required for annotation information analysis based on the user input, for example, and transmits the analysis condition to the annotation information analysis unit 308. Instead of user input, the annotation analysis condition setting unit 307 may generate the analysis condition using other information held in the image-forming apparatus 102, such as organ information attached to the image and viewer information of the image.
The annotation information analysis unit 308 analyzes the annotation information in accordance with the analysis command received from the buffering control unit 306, and transmits the result to the buffering priority setting unit 309. The annotation information analysis unit 308 also transmits a priority calculation command to the buffering priority setting unit 309. The buffering priority setting unit 309 calculates the buffering priority based on this information, and transmits the result to the buffering control unit 306. The display area information input unit 310 acquires display area information held by an OS or the like, and transmits the result to the buffering control unit 306.
All the image data constituting the pathological image data 401 is composed of block image data 410 as the minimum unit. In this embodiment, such processing as data transfer is executed in units of this block image data 410. Each block image data 410 may be an independent image file, or the image data 406 to 409 may be independent image files, with the block image data 410 defined as a unit for handling them internally. Every block image data can be uniquely specified using an index (i, j, k, l) constituted by a plurality of integers. Here i denotes an index in the x axis direction, j denotes an index in the y axis direction, k denotes an index in the z axis (depth) direction, and l denotes an index in the magnification direction. When two block image data have matching i and j indices, they are block image data at the same position on the xy plane if the respective l indices also match. All l indices of the block image data belonging to the same depth image group have the same value.
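The block indexing scheme above can be sketched as follows; this is an illustrative Python model (the class and method names are assumptions, not from the specification) showing how the four-integer index uniquely identifies a block and how xy-position equality depends on i, j and the magnification index l:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BlockIndex:
    """Uniquely identifies one block image: i and j are the x/y block
    positions, k the depth (z) layer, l the magnification level."""
    i: int
    j: int
    k: int
    l: int

    def same_xy_position(self, other: "BlockIndex") -> bool:
        # Two blocks occupy the same position on the xy plane when
        # their i, j and magnification (l) indices all match; the
        # depth index k may differ.
        return (self.i, self.j, self.l) == (other.i, other.j, other.l)
```

Because the dataclass is frozen (hashable), such indices can serve directly as keys in a buffer dictionary.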
The pathological image data shown in
503 to 508 are annotations linked to an image. Each annotation is a circular graphic, and the size thereof is larger than the block image data 410. In some cases, however, the graphic expressing an annotation may have a shape other than a circle, and the size thereof may be smaller than the block image data 410. Annotation data corresponding to the annotations 503 to 508 is attached to the pathological image data 401 that includes the image data 408. The annotation data of this embodiment is constituted by, for example, an ID number, creator name, creation date, the individual who updated the data last, the date of last update, and a comment. Table 1 shows an example of the annotation data.
In this embodiment, the annotations 503 to 508 correspond to the IDs 1 to 6 of the annotation data respectively.
509 to 515 are the target areas to be buffered, and the size of each area (the amount of margin provided when the area is buffered) can be specified or changed in advance by the user or by a program. Hereafter 509 to 515 are each referred to as a “buffering target area”. There are two types of buffering target areas: a first buffering target area 509 linked to the screen display area 501, and second buffering target areas 510 to 515 related to the positions where annotations are attached.
In this embodiment, the first buffering target area 509 is defined as an area that is wider than the display data area 502 by one block image data area in vertical and horizontal directions respectively, and excludes the display data area 502. In other words, the first buffering target area 509 is a group of one row of block image data around the display data area 502. If necessary however, the display data area 502 may be included in the buffering target area 509. In this embodiment, the second buffering target areas 510 to 515 are defined as areas that are larger than the minimum area, required to completely contain the corresponding annotations 503 to 508, by one block image data area respectively in the vertical and horizontal directions.
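The first buffering target area defined above — a one-block-wide ring of block image data surrounding the display data area but excluding it — can be sketched as follows in Python (the function and index names are illustrative assumptions):

```python
def ring_blocks(i0, j0, i1, j1):
    """Block (i, j) indices of the one-block-wide ring around the
    rectangle of blocks [i0..i1] x [j0..j1], excluding the rectangle
    itself.  This mirrors the first buffering target area, which
    surrounds the display data area without containing it.
    """
    outer = {(i, j)
             for i in range(i0 - 1, i1 + 2)
             for j in range(j0 - 1, j1 + 2)}
    inner = {(i, j)
             for i in range(i0, i1 + 1)
             for j in range(j0, j1 + 1)}
    return outer - inner
```

The second buffering target areas around each annotation could be built the same way, using the minimum block rectangle containing the annotation as the inner rectangle and keeping the whole enlarged rectangle (inner blocks included) as the target.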
The definition range of the buffering target area is not limited to the above example, but may be changed if necessary. For example, in
In
(Image-Forming Processing)
First the buffering control unit 306 performs the initial setting (step S601). In step S601, the processing shown in
After step S601, the annotation analysis condition setting unit 307 determines whether an annotation analysis condition update request was inputted to the image-forming apparatus 102 by user input or the like (condition determination step S602). The annotation analysis condition update request is an operation to change the condition which determines which information included in the annotation information is focused on, and how the priority is determined. For example, the information to be focused on (e.g. creator of the annotation, date or date and time of creation, date or date and time of update, keyword included in a comment) and the condition to determine the priority are provided as an annotation analysis condition. In concrete terms, various specification methods can be used, such as increasing the priority of annotations created by the creator “Dr. oo”, increasing the priority of annotations whose date and time of creation are newer, and increasing the priority of annotations whose comments include a specific keyword. In the case of user input, an analysis condition input screen is displayed on the image display apparatus, for example, and the information to be focused on and the condition to determine the priority are selected or inputted by the user using such an input device as a mouse. The annotation analysis condition setting unit 307 may also set the analysis condition automatically. For example, information attached to the pathological image data (e.g. sample staining method, type of organ) is extracted, and keywords corresponding to this information are set as the analysis condition.
If it is determined that the annotation analysis condition update request was inputted in the condition determination step S602, the annotation analysis condition setting unit 307 updates the annotation analysis condition based on this input (step S603). In concrete terms, data of the annotation analysis condition stored in the storage unit 202 is updated. If the result of the condition determination step S602 is NO, then processing advances to step S607.
Then the annotation information analysis unit 308 analyzes the annotation information according to the annotation analysis condition (step S604). In other words, the annotation information analysis unit 308 searches the annotation information of all the annotations included in the pathological image data, and extracts the annotations that match the analysis condition. The specific analysis procedure and its result may differ depending on the content of the provided analysis condition. For example, if the analysis condition specifies the “creator of the annotation”, then only those annotations created by the specified creator are selected from all the annotations. If the analysis condition is “most recent last update date”, then all the annotations are rearranged (sorted) in order of most recent last update. If the analysis condition is a keyword in a comment, only annotations that include the keyword are selected, or annotations may be sorted in order of their degree of matching with the keyword. The analysis result of the annotation information analysis unit 308 (e.g. a list of the ID numbers of the extracted and sorted annotations) is transferred to the buffering priority setting unit 309.
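The filter-and-sort behaviors of step S604 can be sketched as follows; this is an illustrative Python model only (the record layout and condition keys are assumptions, not from the specification):

```python
def analyze_annotations(annotations, condition):
    """Filter and/or sort annotation records per the analysis
    condition and return the matching annotation IDs.

    `annotations` is a list of dicts with 'id', 'creator', 'updated'
    and 'comment' keys; `condition` may specify a creator to match, a
    keyword the comment must contain, and/or most-recent-update
    ordering.
    """
    result = list(annotations)
    if "creator" in condition:
        result = [a for a in result if a["creator"] == condition["creator"]]
    if "keyword" in condition:
        result = [a for a in result if condition["keyword"] in a["comment"]]
    if condition.get("sort_by_latest_update"):
        # ISO-formatted date strings sort chronologically as text
        result.sort(key=lambda a: a["updated"], reverse=True)
    return [a["id"] for a in result]
```

The returned ID list corresponds to the analysis result that is transferred to the buffering priority setting unit 309.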
Then the buffering priority setting unit 309 determines the buffering target area based on the analysis result in step S604. In other words, the buffering target areas 510 to 515 are set so as to include each annotation portion with reference to the position information of the annotations extracted in step S604. Then the buffering priority setting unit 309 determines the buffering priority of each block image data that exists in the buffering target areas 510 to 515, and updates the content of the current buffering priority table (step S605). The buffering priority table is a table that is created in step S707 in the initial setting step S601. Step S707 will be described later.
The buffering priority table is a table constituted by the ID number of each block image data in the buffering target areas, the ID number of the corresponding buffering target area, and the buffering priority thereof. The ID number of each block image data corresponds one-to-one with the index number that uniquely specifies each block image data. In this embodiment, the ID number of the buffering target area 509 is “0”, and the ID numbers of the buffering target areas 510 to 515 match the ID numbers of the annotations to which these buffering target areas belong (in other words, the integers 1 to 6 are assigned sequentially). In this embodiment, the value of the buffering priority is a positive integer, and the greater the value, the higher the priority; however, if necessary, a priority assignment different from this definition, such as positive or negative real numbers, may be used. Table 2 shows an example of a buffering priority table.
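The table structure can be modeled as rows of (block image ID, buffering target area ID, priority); the following Python sketch uses illustrative IDs and priority values, not the contents of Table 2:

```python
# One row per block image in a buffering target area.
# Area ID 0 is the display-area ring (509); IDs 1 onward follow the
# annotation IDs (areas 510 to 515).  All values are illustrative.
priority_table = [
    # (block image ID, buffering target area ID, priority)
    ("blk_12_07", 0, 10),
    ("blk_30_18", 1, 6),
    ("blk_31_18", 1, 6),
    ("blk_44_02", 4, 3),
]


def sort_by_priority(table):
    """Order rows with the highest buffering priority first, as done
    when the buffer content is updated."""
    return sorted(table, key=lambda row: row[2], reverse=True)
```

Sorting by descending priority is the first step of the buffer content update described later (step S801).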
Then based on the content of the buffering priority table updated in step S605, the buffering control unit 306 updates the data stored in the image data buffer unit 303 (this is called the “buffer content”) (step S606). If it is unnecessary to update the buffer content, such as the case when the result, based on the updated analysis condition, matches with the result based on the analysis condition before update, the processing in step S606 may be omitted to ensure efficient processing. There are some methods that can be used for the processing in step S606, and
After step S606, the buffering control unit 306 determines whether a screen update request was inputted to the image-forming apparatus 102 by some means (condition determination step S607). The screen update request is, for example, a screen scroll instruction or a screen switching instruction inputted from such an input device as a mouse. If no screen update request is inputted, processing returns to step S602.
When it is determined that the screen update request was inputted in step S607, the buffering control unit 306 updates the screen display area information based on this input (step S608). Here “screen display area information” refers to the information on the position and display area of the screen display area 501 based on the image data 408.
Then the buffering control unit 306 updates the display data area information based on the updated screen display area information (step S609). Here “display data area information” refers to the position and area of the display data area 502 based on the image data 408.
Then the buffering control unit 306 determines whether the block image data required for screen display exists in the image data buffer unit 303 (condition determination step S610). If the determination result of the condition determination step S610 is “YES”, then processing advances to step S612, where the image data output unit 304 reads the block image data from the image data buffer unit 303, generates the display image data, and transfers the data to the image display apparatus 103. If the determination result of the condition determination step S610 is “NO”, on the other hand, processing advances to step S611, where the image data output unit 304 reads the block image data from the image data storage unit 302, generates the display image data, and transfers the data to the image display apparatus 103. The block image data itself may be transferred as the display image data, or the display image data may be generated by performing the necessary image processing (e.g. resolution conversion, gradation correction, color correction, enhancement processing, combining processing, format conversion) on the block image data.
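The buffer-hit decision of steps S610 to S612 reduces to a simple fast-path/slow-path lookup; the following Python sketch uses plain dictionaries as stand-ins for the buffer unit and the storage unit (an illustrative simplification, not the actual unit interfaces):

```python
def fetch_block(block_id, buffer, storage):
    """Read a block from the fast buffer when present (steps S610 and
    S612), otherwise fall back to the slower storage (step S611).

    `buffer` and `storage` are dicts mapping block IDs to block data;
    these stand in for the image data buffer unit 303 and the image
    data storage unit 302.
    """
    if block_id in buffer:       # condition determination step S610
        return buffer[block_id]  # fast path: buffer hit (S612)
    return storage[block_id]     # slow path: read from storage (S611)
```

The display responsiveness gain of the whole scheme comes from maximizing how often this lookup takes the fast path.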
Then the buffering control unit 306 updates the buffering target area information (step S613). Here “buffering target area information” refers to information on the position and shape of the buffering target area 509 and the buffering area range based on the image data 408. To simplify description in this embodiment, it is assumed that the positional relationship between the display data area 502 and the buffering target area 509, and the shape of the buffering target area 509 do not change. However if the position of the buffering target area 509 changes, it is possible that the buffering target area 509 and any of the buffering target areas 510 to 515 may overlap.
Then the buffering priority setting unit 309 updates the content of the current buffering priority table (step S614). If the buffering target area 509 does not overlap with any of the buffering target areas 510 to 515, a predetermined priority is set for the block image data belonging to the buffering target area 509. For example, a higher priority is always assigned to block image data belonging to the buffering target area 509 than to other block image data. This method is based on the assumption that when the screen display area 501 moves, it is more likely to be moved by scrolling in the vertical or horizontal direction than by jumping to a distant area. If the buffering target area 509 overlaps with any of the buffering target areas 510 to 515, the priority is set by regarding the block image data existing in the overlapping area as belonging to the buffering target area 509.
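The priority assignment of step S614, including the overlap rule, can be sketched as follows; the function name and the numeric priority values are illustrative assumptions:

```python
def assign_priorities(ring, annotation_areas, ring_priority=100):
    """Assign a buffering priority to each block.

    `ring` is the set of blocks in the display-area ring (area 509);
    `annotation_areas` is a list of (priority, block set) pairs for
    the annotation areas (510 to 515).  Ring blocks always outrank
    annotation blocks, and a block lying in an overlap is treated as
    belonging to the ring, per step S614.
    """
    priorities = {}
    for area_priority, blocks in annotation_areas:
        for b in blocks:
            # if two annotation areas overlap, keep the higher value
            priorities[b] = max(priorities.get(b, 0), area_priority)
    for b in ring:
        # ring assignment last, so it wins in any overlap
        priorities[b] = ring_priority
    return priorities
```

Assigning the ring last is what implements the rule that overlapping blocks are treated as belonging to the buffering target area 509.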
Then the buffering control unit 306 updates the buffer content stored in the image data buffer unit 303 based on the content of the buffering priority table updated in step S614 (step S615). According to the algorithm described in
After step S615, the buffering control unit 306 checks whether a command to terminate the program was inputted (condition determination step S616). If the result of the condition determination step S616 is “YES”, the program is terminated, and if “NO”, then processing continues returning to the condition determination step S602.
The above description is an overview of the flow of the image-forming processing according to this embodiment.
(Initial Setting Step S601)
First the buffering control unit 306 acquires the screen display area information from the display area information input unit 310 (step S701). This information indicates the display screen resolution.
Then the buffering control unit 306 sets the position of the screen display area based on the image data 408 at the depth and magnification to be displayed (step S702). From the display screen resolution acquired in step S701 and the position set in step S702, the position and area of the screen display area in the image data 408 are determined. The depth, magnification and area of the initial display can be set, for example, to an area at the center of the image data 408 whose depth and magnification are intermediate values in the pathological image data 401.
Then the buffering control unit 306 sets the display data area in the screen display area determined in step S702 (step S703). In step S703, the position and area of the display data area are determined.
Then the buffering control unit 306 sets the buffer memory capacity (step S704). The buffer memory capacity is set by the user or the program according to the capacity of the storage unit 202 included in the image-forming apparatus.
Then the positions of the annotations 503 to 508 linked to the image data 408 are acquired (step S705). This corresponds to acquiring the annotation data attached to the pathological image data 401 and acquiring the center positions and radii of the annotations 503 to 508 in the image data 408.
Then the position, area and shape of each buffering target area 509 to 515 are set (step S706). This setting is as described above.
Then the buffering control unit 306 creates the buffering priority table to store the buffering priority of each block image data existing in the buffering target areas 509 to 515 (step S707). As described in step S605, the buffering priority table is a table constituted by the ID number of each block image data in the buffering target areas 509 to 515, the ID number of the corresponding buffering target area, and the buffering priority thereof. At this point, the value of the buffering priority has not yet been calculated, therefore an appropriate value, such as “0”, is set.
Then under control of the buffering control unit 306, the image data output unit 304 reads the block image data of an area corresponding to the display data area 502 from the image data storage unit 302, and transfers this data to the image display apparatus 103 (step S708).
Then based on the information on the display data area 502 transferred to the image display apparatus 103, the buffering priority setting unit 309 updates the buffering priority table (step S709). In step S709, a predetermined priority is set to only those elements corresponding to the block image data belonging to the buffering target area 509 (that is, only block image data around the area displayed in step S708) in each element of the buffering priority table. The processing in step S709 is the same as in step S614.
Then the buffering control unit 306 updates the content of the buffer based on the buffering priority which was set in step S709 (step S710). The processing in step S710 is the same as in step S606 or S615.
The initial setting processing S601 then ends.
(Buffer Content Update Step S606, S615 and S710)
First the buffering control unit 306 sorts all the elements of the buffering priority table (hereafter called “priority table”) in the sequence of higher priority (step S801).
Then the buffering control unit 306 provides an integer type variable i, and sets variable i to “0” (step S802).
The buffering control unit 306 checks whether the block image data corresponding to the i-th element in the priority table already exists in the buffer (condition determination step S803). For example, a list of ID numbers corresponding to the block image data existing in the buffer is created in advance, and the ID number corresponding to the i-th block image data is collated with this list. If the result of the condition determination step S803 is “YES”, the buffering control unit 306 attaches a mark to the i-th element in the priority table (step S804), and also attaches a mark to the block image data detected in the buffer (step S805). This mark is preferably implemented as a flag. If the result of the condition determination step S803 is “NO”, processing advances to step S806.
In step S806, the buffering control unit 306 increases the value of the variable i by 1.
In the condition determination step S807, the buffering control unit 306 checks whether the value of the variable i is equal to or greater than the number of elements of the priority table. If the result of the condition determination step S807 is “NO”, processing returns to the condition determination step S803. If the result of the condition determination step S807 is “YES”, the buffering control unit 306 deletes the marked elements in the priority table (step S808). As a result, only elements that indicate the block image data having high priority and that do not exist in the buffer (in other words, block image data that should be added to the buffer) remain in the priority table.
After step S808, the buffering control unit 306 deletes the block image data other than marked images in the buffer (step S809). As a result, block image data which need not be buffered is deleted, and free space is created in the buffer.
In step S810, the buffering control unit 306 substitutes “0” for the variable i.
The buffering control unit 306 checks whether there is free space in the buffer (condition determination step S811). If the result of the condition determination step S811 is “YES”, the buffering control unit 306 transfers the block image data corresponding to the i-th element of the priority table from the image data storage unit 302 to the image data buffer unit 303 (step S812).
Then the buffering control unit 306 increases the value of the variable i by 1 (step S813), and checks whether the value of the variable i is less than the number of elements of the priority table (condition determination step S814). If the result of the condition determination step S814 is “YES”, processing returns to step S811 to continue the buffer content update processing. If the result of the condition determination step S814 is “NO”, this means that no block image data to be buffered remains, and the buffer content update processing is terminated. The case when the result of the condition determination step S811 is “NO” indicates the state where image data can no longer be stored in the buffer, therefore the buffer content update processing is terminated.
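The sequence of steps S801 to S814 can be sketched, purely as an illustration, in the following fragment; the dictionary-based buffer and storage, and counting the capacity in blocks, are assumptions made for this sketch:

```python
# Hedged sketch of the buffer content update (steps S801-S814).
def update_buffer(priority_table, buffer, storage, capacity):
    # S801: sort all elements in descending order of priority.
    priority_table.sort(key=lambda e: e["priority"], reverse=True)

    # S802-S807: mark elements whose block image data already exists in
    # the buffer, and remember the corresponding data in the buffer.
    buffered_ids = set(buffer)        # IDs of images already in the buffer
    marked_blocks = set()
    for entry in priority_table:      # variable i runs over all elements
        if entry["block_id"] in buffered_ids:    # S803
            entry["marked"] = True               # S804
            marked_blocks.add(entry["block_id"]) # S805

    # S808: delete marked elements; only high-priority data not yet in
    # the buffer (data that should be added) remains in the table.
    priority_table[:] = [e for e in priority_table if not e.get("marked")]

    # S809: delete unmarked block image data from the buffer to free space.
    for block_id in list(buffer):
        if block_id not in marked_blocks:
            del buffer[block_id]

    # S810-S814: while free space remains, transfer block image data from
    # the image data storage to the buffer in priority order.
    for entry in priority_table:
        if len(buffer) >= capacity:   # S811 "NO": buffer is full
            break
        bid = entry["block_id"]
        buffer[bid] = storage[bid]    # S812

# Example: six blocks in storage, blocks 0 and 5 buffered, a capacity of
# three blocks, and blocks 0-3 prioritized in descending order.
storage = {i: f"block {i}" for i in range(6)}
buffer = {0: storage[0], 5: storage[5]}
table = [{"block_id": i, "priority": 10 - i} for i in range(4)]
update_buffer(table, buffer, storage, capacity=3)
# Block 5 is evicted, blocks 1 and 2 are added until the buffer fills.
```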
The above is a detailed description of the buffer content update processing in steps S606, S615 and S710.
(Example of Buffering)
Now how block image data is stored in the image data buffer unit 303 when the above processing operations are performed will be described.
It is assumed that there is a block image data group to which the buffering priorities have been assigned as described above.
The block image data to be stored in the buffer can either be compressed data or uncompressed data. If the focus is more on increasing the speed of the display processing, it is preferable to decompress the compressed data during program wait time, and store the decompressed data in the buffer in advance. If the focus is more on decreasing the buffer memory capacity, it is preferable to store data in the compressed state.
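This tradeoff can be illustrated as follows, assuming zlib compression and dummy block data (both assumptions of this sketch, not of the embodiment):

```python
import zlib

raw_block = b"\x00" * 4096            # dummy uncompressed block image data
compressed = zlib.compress(raw_block) # dummy compressed block image data

# Speed-first: decompress during program wait time and buffer the
# decompressed data in advance, so display processing needs no decode.
buffer_fast = {0: zlib.decompress(compressed)}

# Memory-first: keep the compressed bytes in the buffer and decompress
# on demand, decreasing the buffer memory capacity that is consumed.
buffer_small = {0: compressed}

assert len(buffer_small[0]) < len(buffer_fast[0])
```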
In this embodiment, the buffering priority is set using only one piece of attribute information, that is, the creator, but the analysis may be performed by combining a plurality of pieces of attribute information. For example, a plurality of creators may be specified, or creator information, date information, annotation type information and the like may be specified. To combine a plurality of pieces of attribute information, this information may be combined by an AND condition or by an OR condition.
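A minimal sketch of such an AND/OR combination could look like the following; the annotation fields, creator names and conditions are illustrative assumptions:

```python
# Illustrative annotations with a plurality of attribute fields.
annotations = [
    {"creator": "Dr. A", "date": "2014-05-14", "type": "arrow"},
    {"creator": "Dr. B", "date": "2013-11-06", "type": "circle"},
]

# A plurality of attribute conditions to be combined.
conditions = [
    lambda a: a["creator"] == "Dr. A",   # creator information
    lambda a: a["type"] == "circle",     # annotation type information
]

def matches(annotation, mode="AND"):
    # Combine the conditions by an AND condition or by an OR condition.
    results = [c(annotation) for c in conditions]
    return all(results) if mode == "AND" else any(results)

# A higher buffering priority could then be assigned to the image data
# around every annotation for which matches() returns True.
prioritized = [a for a in annotations if matches(a, mode="OR")]
```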
By this embodiment, image data can be buffered at better memory utilization efficiency according to the characteristics of the pathological diagnosis, compared with the prior art, and the display responsiveness of the image viewer improves as a result. In concrete terms, the annotation data attached to the pathological image is analyzed, and a higher buffering priority is assigned to the image data around an annotation of which the significance for the user is estimated to be high, whereby buffering processing with higher memory utilization efficiency than the prior art becomes possible.
Embodiment 2 of the present invention will now be described with reference to the drawings.
The system configuration, hardware configuration, functional blocks and processing algorithms are the same as Embodiment 1, but the annotation analysis condition and the image data buffering method are different.
In this embodiment, the annotation data in Table 1 is analyzed, and a higher buffering priority is set to the image data around an annotation of which the last update date is the most recent.
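This recency rule might be sketched as follows; the annotation IDs, field names and dates are assumptions made for the sketch:

```python
from datetime import date

# Illustrative annotations with a last update date attribute.
annotations = [
    {"id": 1, "last_update": date(2013, 11, 6)},
    {"id": 2, "last_update": date(2014, 5, 14)},
    {"id": 3, "last_update": date(2014, 1, 1)},
]

# Sort so the most recently updated annotation comes first, then assign
# a higher buffering priority to more recently updated annotations.
by_recency = sorted(annotations, key=lambda a: a["last_update"], reverse=True)
priority = {a["id"]: len(by_recency) - rank
            for rank, a in enumerate(by_recency)}
```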
By the buffering control of this embodiment, buffering processing of which the memory utilization efficiency is even better than the prior art and the method of Embodiment 1 can be implemented. In concrete terms, the image data around an annotation having a high priority can be buffered according to the size of the buffer area. In particular, the effect is even more clearly demonstrated when the number of annotations linked to the pathological image is very large.
Embodiment 3 of the present invention will now be described with reference to the drawings.
The system configuration, hardware configuration, functional blocks and processing algorithms are the same as Embodiment 1 and Embodiment 2, but the annotation analysis method and image data buffering method are different.
In Embodiment 1 and Embodiment 2, the annotation is analyzed using such attribute information as creator of annotation and last update date, but in this embodiment, the annotation is analyzed using comment information.
Table 3 shows the annotation data used for this embodiment. The annotation data in Table 3 is the same as the annotation data in Table 1, but the comment content is more detailed in Table 3.
In this embodiment, the comments in Table 3 are analyzed, and a higher buffering priority is set to the image data around an annotation that includes the phrase “liver cells” or “fat deposition” in its comment. Further, a higher priority is assigned to an annotation which includes both of these phrases than to an annotation which includes only one of them.
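This phrase-based rule could be sketched as follows; the annotation IDs and comment texts are illustrative, not taken from Table 3:

```python
# Phrases whose presence in a comment raises the buffering priority.
PHRASES = ("liver cells", "fat deposition")

def comment_priority(comment):
    # 0 = neither phrase, 1 = one phrase, 2 = both phrases, so an
    # annotation including both phrases outranks one including only one.
    return sum(phrase in comment for phrase in PHRASES)

# Illustrative annotation comments keyed by annotation ID.
comments = {
    10: "liver cells show fat deposition",
    11: "liver cells look normal",
    12: "no findings",
}
priorities = {aid: comment_priority(text) for aid, text in comments.items()}
```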
In this embodiment, the buffering priority is set using only the comment information in the annotation, but the comment information and the attribute information may be combined to perform the analysis. A plurality of pieces of attribute information may also be combined.
By the buffering control of this embodiment, buffering processing of which the memory utilization efficiency is even better than the prior art and the methods of Embodiment 1 and Embodiment 2 can be implemented. In concrete terms, the buffering is performed using the remark information written in a comment of an annotation, whereby the image data around the annotation which is most likely to be referred to can be buffered. In particular, the effect is even more clearly demonstrated when the number of annotations linked to the pathological image is very large.
<Other System Configurations>
In Embodiment 1 to Embodiment 3, the image processing system is constituted by the image-capturing apparatus, the image-forming apparatus and the image display apparatus, but the system configuration of the present invention is not limited to this example.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., non-transitory computer-readable medium). Therefore, the computer (including the device such as a CPU or MPU), the method, the program (including a program code and a program product), and the non-transitory computer-readable medium recording the program are all included within the scope of the present invention.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2013-230366, filed on Nov. 6, 2013 and Japanese Patent Application No. 2014-100828, filed on May 14, 2014, which are hereby incorporated by reference herein in their entirety.