The present invention relates to an information processing technology for supporting diagnosis based on tissue specimen images obtained by capturing body tissues.
In the field of the technology described above, as disclosed in Patent Document 1, a technology has been known that color-codes and displays the angle of the longitudinal direction of the nucleus of a signet-ring cell, so as to facilitate determination of the shape of the nucleus of the signet-ring cell. Further, Patent Document 2 discloses a technology for dividing a tissue specimen image into grid-like areas and determining and displaying the importance of each divided area.
However, in the technologies described in the above Patent Documents, a pathologist cannot determine at a glance at which level and in which range the feature values for pathological diagnosis are distributed, while observing tissue specimen images.
An object of the present invention is to provide a technology for solving the problems described above.
In order to achieve the object, according to the present invention, there is provided an information processing apparatus which supports diagnosis based on a tissue specimen image obtained by capturing body tissues, and includes an area generation unit that divides at least one feature value of the tissue specimen image into a plurality of levels based on the magnitude of the feature value, and generates an area on the tissue specimen image belonging to each level; and an overlay image generation unit that associates the area of each level which is generated by the area generation unit with an image which is processed to have the same shape and the same positional relation as the area and in which a magnitude relation of the feature value is identifiable, and generates an overlay image.
In order to achieve the object, according to the present invention, there is provided a control method of an information processing apparatus which supports diagnosis based on a tissue specimen image obtained by capturing body tissues, and includes an area generation step of dividing at least one feature value of the tissue specimen image into a plurality of levels based on the magnitude of the feature value, and generating an area on the tissue specimen image belonging to each level; and an overlay image generation step of associating the area of each level which is generated in the area generation step with an image which is processed to have the same shape and the same positional relation as the area and in which a magnitude relation of the feature value is identifiable, and generating an overlay image.
In order to achieve the object, according to the present invention, there is provided a control program of an information processing apparatus which supports diagnosis based on a tissue specimen image obtained by capturing body tissues, and causes a computer to execute an area generation step of dividing at least one feature value of the tissue specimen image into a plurality of levels based on the magnitude of the feature value, and generating an area on the tissue specimen image belonging to each level; and an overlay image generation step of associating the area of each level which is generated in the area generation step with an image which is processed to have the same shape and the same positional relation as the area and in which a magnitude relation of the feature value is identifiable, and generating an overlay image.
In order to achieve the object, according to the present invention, there is provided an information processing system which supports diagnosis based on a tissue specimen image obtained by capturing body tissues, and includes an input unit that inputs the captured tissue specimen image; an area generation unit that divides at least one feature value of the tissue specimen image into a plurality of levels based on the magnitude of the feature value, and generates an area on the tissue specimen image belonging to each level; an overlay image generation unit that associates the area of each level which is generated by the area generation unit with an image which is processed to have the same shape and the same positional relation as the area and in which a magnitude relation of the feature value is identifiable, and generates an overlay image; and a superimposing and displaying unit that superimposes the overlay image generated by the overlay image generation unit on the tissue specimen image and displays the superimposed image.
In order to achieve the object, according to the present invention, there is provided an information processing method which supports diagnosis based on a tissue specimen image obtained by capturing body tissues, and includes an input step of inputting the captured tissue specimen image; an area generation step of dividing at least one feature value of the tissue specimen image into a plurality of levels based on the magnitude of the feature value, and generating an area on the tissue specimen image belonging to each level; an overlay image generation step of associating the area of each level which is generated in the area generation step with an image which is processed to have the same shape and the same positional relation as the area and in which a magnitude relation of the feature value is identifiable and generating an overlay image; and a superimposing and displaying step of superimposing the overlay image generated in the overlay image generation step on the tissue specimen image and displaying the superimposed image.
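For illustration only, the level division and area generation described in the above aspects may be sketched as follows. This is a hypothetical Python sketch; the function names, thresholds, and data layout are assumptions for explanation and are not part of the present disclosure.

```python
# Hypothetical sketch: divide a per-pixel feature value into a plurality of
# levels based on its magnitude, and collect, for each level, the set of
# pixels (the "area") belonging to that level.

def classify_levels(feature_map, thresholds):
    """Assign each pixel a level based on the magnitude of its feature value.

    feature_map: 2-D list of feature values.
    thresholds: ascending upper bounds; a value <= thresholds[i] gets level i,
    and values above the last bound get the highest level.
    """
    levels = []
    for row in feature_map:
        level_row = []
        for value in row:
            level = len(thresholds)  # default: highest level
            for i, bound in enumerate(thresholds):
                if value <= bound:
                    level = i
                    break
            level_row.append(level)
        levels.append(level_row)
    return levels

def areas_by_level(levels):
    """Group pixel coordinates by level, preserving positional relations."""
    areas = {}
    for y, row in enumerate(levels):
        for x, level in enumerate(row):
            areas.setdefault(level, []).append((x, y))
    return areas
```

The areas returned by `areas_by_level` keep the original pixel coordinates, which is what later allows an overlay image built from them to coincide positionally with the tissue specimen image.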
According to the present exemplary embodiment, a pathologist can determine at a glance at which level and in which range the feature values for pathological diagnosis are distributed, while observing tissue specimen images.
The above-described object and other objects, features, and advantages will become more apparent from the preferred exemplary embodiments described below and the accompanying drawings.
Hereinafter, exemplary embodiments of the present invention will be described in detail by way of examples with reference to drawings. However, constituent elements described in the following exemplary embodiments are merely illustrative, and the technical scope of the present invention is not intended to be limited only thereto.
An information processing apparatus 100 as a first exemplary embodiment of the present invention will be described using
As shown in
According to the present exemplary embodiment, a pathologist can determine at a glance at which level and in which range the feature values for pathological diagnosis are distributed, while observing tissue specimen images.
Next, an information processing system according to a second exemplary embodiment of the present invention will be described. In the present exemplary embodiment, the information processing apparatus 100 sets areas depending on feature values or the levels of the feature values within the tissue specimen images to be diagnosed by the pathologist, and associates each area with an image which is processed to have the same shape and the same positional relation as the area and in which the level of the feature value is identifiable by colors and patterns. Then, the information processing apparatus 100 generates an overlay image including an allocation image and transmits the overlay image to the communication terminal of the pathologist. The communication terminal of the pathologist superimposes the overlay image on the tissue specimen image and displays the superimposed image.
The present exemplary embodiment supports the pathologist in moving smoothly to subsequent operations on the tissue specimen image to be diagnosed, for example, the selection of an attention area or the expansion of an area to be diagnosed in detail.
The information processing system 200 includes an information processing apparatus 210, which is a pathological diagnosis support apparatus, and communication terminals 230, which are operable by pathologists 240 and receive pathological diagnosis support; these are connected through a network 250. In addition, the network 250 may be a LAN in a hospital, or a public line or wireless communication connected to the outside of the hospital.
The information processing apparatus 210 includes a communication control unit 211 that controls communication with the communication terminals 230 through the network 250. Through the communication control unit 211, the tissue specimen images received from communication terminals 230 by a tissue specimen image reception unit 212 are stored in a tissue specimen image storage unit 213. Then, the information processing apparatus 210 obtains the feature values of the stored tissue specimen images by referring to information of a feature value database 215 (hereinafter, referred to as DBs: see
One feature value or a plurality of feature values may be used, as in
An area generation unit 216 divides, with reference to a level classification DB 217 (see
The communication terminal 230 superimposes the received overlay image on the tissue specimen image that it has transmitted to the information processing apparatus 210, and displays the superimposed image. Here, since the overlay image is generated while maintaining the relative positional relations among the plurality of areas as described above, the overlay image coincides in positional relation with the tissue specimen image from which the areas were generated. Therefore, when the images are superimposed, the communication terminal 230 can align their positions. In addition, although only the overlay image is transmitted in the present exemplary embodiment, the information processing apparatus 210 may instead transmit a superimposed image in which the tissue specimen image is superimposed on the overlay image. However, it is desirable to transmit only the overlay image in view of communication traffic.
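The superimposed display on the communication terminal side may be sketched, for illustration, as a per-pixel blend. This assumes RGB pixels and a simple alpha blend, neither of which the present disclosure mandates; an overlay pixel of `None` stands for a position where no area was generated.

```python
def superimpose(tissue_pixel, overlay_pixel, alpha=0.4):
    """Blend one overlay pixel onto one tissue-image pixel.

    Because the overlay preserves the shape and positional relations of the
    generated areas, the two images align pixel-for-pixel. Where no area was
    generated (overlay_pixel is None), the tissue image is left untouched.
    """
    if overlay_pixel is None:
        return tissue_pixel
    return tuple(
        round((1 - alpha) * t + alpha * o)
        for t, o in zip(tissue_pixel, overlay_pixel)
    )
```

Applying this function to every pixel pair of the aligned images yields the superimposed display; the `alpha` value controls how strongly the allocation image colors the underlying tissue.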
A superimposed display of the overlay image on the display screen of the communication terminal 230 in the present exemplary embodiment will be described with reference to
The following is an example of the feature value DB 215 that is prepared in advance to analyze the feature value with reference to
The configuration example 215-1 of the feature value DB stores in association with each body part, conditions such as a size 411 of the nucleus, a uniformity 412 of the nucleus, a distribution 413 of a chromatin, a distribution 414 of a nucleolus, and a shape 415 of the nucleus, and a score 410 for a nuclear grade (magnitude of feature value) with which the above conditions are associated.
The configuration example 215-2 of the feature value DB stores, in association with each body part, conditions such as an array 421 of cells, a shape 422 of a gland tube, and a size disparity 423 of nuclei, and a score 420 for a degree of differentiation (magnitude of feature value) with which the above conditions are associated. Generally, the degree of differentiation is classified into well differentiated, moderately differentiated, and poorly differentiated states, which are obtained by dividing the level. In this case, the level is already divided, and thus images may be allocated to the divided states as they are.
The configuration example 215-3 of the feature value DB stores in association with each body part, conditions such as a shape 431 of a gland tube including a tubular shape or a linear shape, the number 432 of cell nuclei in the gland tube, a distribution 433 of cell nuclei in a bottom portion area, and a score 430 for a structural (gland tube) grade (magnitude of feature value) with which the above conditions are associated. The details of such gland tube grade are described in Japanese Unexamined Patent Publication No. 2010-281636.
The configuration example 215-4 of the feature value DB stores in association with each body part, conditions such as a percentage 441 of mucus present in the lesion, a percentage or distribution 442 of tissues floating in the mucus other than the mucus and a signet ring cell-like grade 443, and a score 440 for a degree of mucus (magnitude of feature value) with which the above conditions are associated. In addition, see, for example, Patent Document 1 for an extraction method of a mucus area.
The configuration example 215-5 of the feature value DB stores, in association with each body part, conditions such as a nuclear grade 451 and the number 452 of occurrences of nuclear division (mitosis), and a score 450 for a nuclear grade (magnitude of feature value) with which the above conditions are associated. Further, the configuration example 215-5 stores a condition of a structural grade 461 and a score 460 for a histological grade (magnitude of feature value) with which the above conditions are associated, in addition to the nuclear grade 451 and the number 452 of occurrences of nuclear division. In addition, the structural grade 461 includes, for example, a degree of gland tube formation.
The level classification DB 217 stores a feature value 501 including each feature value or a combination of a plurality of feature values, and a level value associated with the score range 502.
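As an illustration of the level classification DB 217, the lookup may be sketched as follows. The table contents (feature names, score ranges, level values) are hypothetical examples, not values from the present disclosure.

```python
# Hypothetical sketch of the level classification DB: each entry maps a
# feature value name and an inclusive score range to a level value.
LEVEL_CLASSIFICATION_DB = [
    {"feature": "nuclear grade", "score_range": (0, 3), "level": 1},
    {"feature": "nuclear grade", "score_range": (4, 6), "level": 2},
    {"feature": "nuclear grade", "score_range": (7, 9), "level": 3},
]

def lookup_level(feature, score, db=LEVEL_CLASSIFICATION_DB):
    """Return the level whose score range contains the given score."""
    for entry in db:
        low, high = entry["score_range"]
        if entry["feature"] == feature and low <= score <= high:
            return entry["level"]
    return None
```

The area generation unit 216 would apply such a lookup to each analyzed feature value to decide which level's area a given portion of the image belongs to.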
The allocation image DB 219 stores color information in association with the level 601. In
Further, the pattern information is stored in association with the level 601. In
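The color and pattern associations of the allocation image DB 219 may be sketched, for illustration, as a small table. The RGB values and pattern names below are assumptions chosen so that the magnitude relation of the feature value remains identifiable, as the present exemplary embodiment requires.

```python
# Hypothetical allocation-image table: each level is associated with a color
# and a pattern, so the magnitude relation of the feature value can be
# identified by color, or by pattern when color is unavailable.
ALLOCATION_IMAGE_DB = {
    1: {"color": (0, 0, 255), "pattern": "dots"},        # low level
    2: {"color": (255, 255, 0), "pattern": "stripes"},   # middle level
    3: {"color": (255, 0, 0), "pattern": "crosshatch"},  # high level
}

def allocation_for(level):
    """Return the color and pattern allocated to a level."""
    return ALLOCATION_IMAGE_DB[level]
```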
An example of the area information 216a which is output to the overlay image generation unit 218 by the area generation unit 216 will be described with reference to
In an example of area information 216a-1, in association with a line 711, a start pixel coordinate 712 and an end pixel coordinate 713 which are contained in an area on the line are stored, and the feature value 714 and the level 715 of the area are stored. In addition, as the line 711, all lines crossing the area generated by the area generation unit 216 are stored.
In another example of area information 216a-2, in association with an area 721, a feature value 722 and a level 723 are stored, singularities for forming the area are stored as a start pixel coordinate 724 and an end pixel coordinate 725, and a curve function 726 connecting the singularities is stored. In addition, the curve function 726 may be stored as, for example, a spline curve or the like and the parameters thereof. In the present example, only the area generated by the area generation unit 216 is stored.
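The line-based representation of the area information 216a-1 may be sketched as a run-length encoding, for illustration. The helper below is hypothetical and assumes each area is contiguous on every line it crosses, an assumption made for simplicity rather than stated in the present disclosure.

```python
# Hypothetical sketch of the line-based area representation (cf. 216a-1):
# for each image line an area crosses, store the start and end pixel
# coordinates of the area on that line.
def encode_area_by_lines(pixels):
    """pixels: iterable of (x, y) pixel coordinates belonging to one area.
    Returns {line: (start_x, end_x)}. Assumes the area is contiguous on
    every line it crosses (an illustrative simplification).
    """
    runs = {}
    for x, y in pixels:
        if y in runs:
            start, end = runs[y]
            runs[y] = (min(start, x), max(end, x))
        else:
            runs[y] = (x, x)
    return runs
```

Together with the feature value and level stored per area, such runs are sufficient for the communication terminal to redraw the area in its original shape and position.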
In an example 218a-1 of the overlay image information, in association with a line 811, a start pixel coordinate 812 and an end pixel coordinate 813 included in an area on the line are stored, and the allocation image 814 allocated to the area by the overlay image generation unit 218 is stored. In addition, as the line 811, all lines crossing the areas generated by the area generation unit 216 are stored and transmitted to the communication terminal 230.
In an example of overlay image information 218a-2, in association with an area 821, the allocation image 822 allocated to the area by the overlay image generation unit 218 is stored, singularities forming the area are stored as a start pixel coordinate 823 and an end pixel coordinate 824, and a curve function 825 connecting the singularities is stored. In addition, the curve function 825 may be stored, for example, as a spline curve and the parameters thereof. In the present example, only the areas generated by the area generation unit 216 are stored and transmitted to the communication terminal 230.
In
A RAM 940 is a random access memory used by the CPU 910 as a work area for temporary storage. An area for storing data required for realizing the present exemplary embodiment is secured in the RAM 940. 941 is an area for storing the tissue specimen image that has been received through the network 250 from the communication terminal 230 of the pathologist. 942 is an area for storing information for specifying the tissue specimen image 941, such as a communication terminal ID of the communication terminal 230 that has transmitted the tissue specimen image 941 and a pathologist ID. The information 942 for specifying the tissue specimen image 941 includes, for example, a patient ID, the body part from which the tissue specimen is taken, gender, age, medical history, and the like. 943 is an area for storing the feature value which is calculated by the feature value analysis. 944 is an area for storing levels classified on the basis of the calculated feature value 943, and the information of an area having the level (see
A storage 950 stores databases and various parameters, or the following data or programs required for realizing the present exemplary embodiment. 215 is a feature value DB (see
In addition, in
First, in step S1001, the information processing apparatus 210 determines whether or not the received data is a tissue specimen image received from any one of the communication terminals 230. If the received data is not the tissue specimen image, the process proceeds to another process.
If the received data is the tissue specimen image, the process proceeds to step S1003, and the information processing apparatus 210 acquires the communication terminal ID (for example, IP address, or the like) of the communication terminal 230 which has transmitted the tissue specimen image, and the information (pathologist IDs, patient IDs, body parts, and the like) for specifying the tissue specimen image. In step S1005, the information processing apparatus 210 stores the received tissue specimen image.
In step S1007, the information processing apparatus 210 performs the analysis process on the feature value while referring to the feature value DB 215. Next, in step S1009, the information processing apparatus 210 performs the area generation process while referring to the level classification DB 217. Next, in step S1011, the information processing apparatus 210 performs the overlay image generation process while referring to the allocation image DB 219. Then, in step S1013, the information processing apparatus 210 transmits the generated overlay image back to the communication terminal 230 which has transmitted the tissue specimen image.
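Steps S1001 to S1013 above may be sketched end-to-end as follows. This is a hypothetical illustration: the message layout and helper names (`analyze`, `generate_areas`, `generate_overlay`) are assumptions standing in for the feature value analysis, area generation, and overlay image generation processes.

```python
# Hypothetical end-to-end sketch of steps S1001-S1013: receive a tissue
# specimen image, analyze feature values, generate level areas, build the
# overlay image, and return it to the originating terminal.
def handle_received_data(data, analyze, generate_areas, generate_overlay):
    if data.get("kind") != "tissue_specimen_image":    # S1001: not an image,
        return None                                    # hand off elsewhere
    terminal_id = data["terminal_id"]                  # S1003: acquire IDs
    image = data["image"]                              # S1005: store image
    features = analyze(image)                          # S1007: feature value DB
    areas = generate_areas(features)                   # S1009: level classification DB
    overlay = generate_overlay(areas)                  # S1011: allocation image DB
    return {"to": terminal_id, "overlay": overlay}     # S1013: transmit back
```

The three injected callables correspond to the feature value analysis unit, the area generation unit 216, and the overlay image generation unit 218 of the present exemplary embodiment.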
In general, when a tissue specimen image is acquired for supporting pathological diagnosis, a tissue specimen image of low resolution is first acquired for performing a rough diagnosis, and then a tissue specimen image of high resolution is acquired if a detailed diagnosis is required. This procedure may also be applied to the procedure of the present exemplary embodiment. Alternatively, if support that gives the pathologist a hint for detailed diagnosis is desired, an overlay image may be generated only from the tissue specimen image of low resolution. In contrast, if support that shows the direction of diagnosis or evaluates the pathologist's diagnosis result is desired, it is preferable that an overlay image be generated by performing a preliminary diagnosis using the tissue specimen image of high resolution.
Next, an information processing system according to the third exemplary embodiment of the present invention will be described. The information processing system according to the present exemplary embodiment is different from that of the second exemplary embodiment in that there are three feature values to be analyzed, and Red (R), Green (G), and Blue (B), the three primary colors of light, are allocated to the three feature values. As a result, combinations of the levels of the three feature values are displayed in different colors. In addition, although Red is allocated to the feature value of a cell, Green to the feature value of a gland tube, and Blue to the feature value of mucus in the present exemplary embodiment, the three feature values and the allocation of colors are not limited to those of the present example.
According to the present exemplary embodiment, the levels of three feature values can be determined at the same time from a tendency of color (reddish, bluish, whitish, and the like). Therefore, proper selection of the three feature values and allocation of the colors enhances comprehensive determination, based on hue, across a plurality of feature values.
In addition, only the configuration characteristic of the present exemplary embodiment is described; other configurations and operations are the same as in the second exemplary embodiment, and thus the detailed description is not repeated.
The information processing apparatus 1110 of
In a feature value analysis unit 1114 and a feature value DB 1115 of the information processing apparatus 1110, the feature values are limited to three: a nuclear feature value, a gland tube feature value, and a mucus feature value; otherwise, there is no significant difference in the configuration.
The overlay image generation unit 1118 generates three overlay images with reference to the stored allocation image DB 1119, based on the allocation of three feature values and three primary colors which are selected in advance. The overlay image transmission unit 1120 transmits the three generated overlay images to the communication terminal 230 through the network 250.
In the allocation image DB 1119, the color 1203 and the brightness 1204 of three primary colors are stored in association with a feature value 1201 and a level 1202. In the present example, as the feature value 1201, a nucleus, a gland tube and mucus are stored and respectively associated with Red (R), Green (G) and Blue (B).
In the overlay image information 1118a, a generated area 1302 is stored for each overlay number 1301 for identifying three overlay images corresponding to three primary colors. A brightness 1303, a start pixel coordinate 1304 and an end pixel coordinate 1305, which represent an outline of an area, and a curve function 1306 are stored in association with the area 1302.
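The per-pixel combination of the three feature values in the present exemplary embodiment may be sketched as follows. The level-to-brightness mapping below is an illustrative assumption; the present disclosure only requires that each feature value's level control the brightness of its allocated primary color.

```python
# Hypothetical sketch of the third exemplary embodiment: the levels of three
# feature values (here nucleus -> R, gland tube -> G, mucus -> B) are mapped
# to brightnesses of the three primary colors and combined per pixel, so that
# a tendency of color (reddish, bluish, whitish, ...) reveals all three
# levels at once.
BRIGHTNESS = {0: 0, 1: 85, 2: 170, 3: 255}  # illustrative mapping

def combine_levels(nucleus_level, gland_level, mucus_level):
    """Combine three feature-value levels into one RGB overlay pixel."""
    return (BRIGHTNESS[nucleus_level],
            BRIGHTNESS[gland_level],
            BRIGHTNESS[mucus_level])
```

For example, a pixel where all three feature values are at their highest level appears whitish, while a pixel dominated by the nuclear feature value appears reddish.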
Next, an information processing system according to the fourth exemplary embodiment of the present invention will be described. The information processing system according to the present exemplary embodiment is different from that of the second exemplary embodiment in that the pathologist 240 can select, from the communication terminal 230, the feature value to be analyzed and the allocation image to be allocated to the feature value. In addition, although the present exemplary embodiment shows a configuration in which the pathologist can select both the feature value and the allocation image, a configuration in which only one of them can be selected is also possible.
According to the present exemplary embodiment, it is possible to analyze and display the feature value which is desired by the pathologist and to allow the pathologist to determine at a glance the feature value and the level to which the pathologist has to pay attention.
In addition, only the configuration characteristic of the present exemplary embodiment is described; other configurations and operations are the same as in the second exemplary embodiment, and thus the detailed description is not repeated.
A feature value selection information reception unit 1401 of the information processing apparatus 1410 receives information of the feature value selection instruction by the pathologist 240 that is transmitted from the communication terminal 230 through the network 250. The feature value selection unit 1402 analyzes the feature value according to the selection by the pathologist 240, which is received by the feature value selection information reception unit 1401.
Further, an allocation image selection information reception unit 1403 receives the information of the allocation image selection instruction by the pathologist 240 which is transmitted from the communication terminal 230 through the network 250. The result of reception is reported to the allocation image DB 219, the allocation image selected by the pathologist 240 is associated with each feature value, and the overlay image is generated.
1510 in
First, in step S1601, the communication terminal 230 acquires a tissue specimen image. The tissue specimen image may be read from a scanner (not shown) connected to the communication terminal 230, or acquired through a storage medium or the like. In step S1603, the communication terminal 230 transmits the acquired tissue specimen image to the information processing apparatus 1410. In step S1605, the information processing apparatus 1410 stores the received tissue specimen image. Subsequently, in step S1607, the information processing apparatus 1410 transmits a screen for inquiring about the feature value selection and the allocation of the allocation image to the communication terminal 230.
The communication terminal 230 waits, in step S1609, for the feature value selection and the allocation image selection by the pathologist 240, and when a selection is made, the process proceeds to step S1611. The communication terminal 230 acquires the information on the selected feature value and allocation image in step S1611, and transmits the acquired information back to the information processing apparatus 1410 in step S1613.
In step S1615, the information processing apparatus 1410 performs an analysis process of the feature value selected by the pathologist 240. Subsequently, in step S1617, the information processing apparatus 1410 performs an area generation process of the level corresponding to the feature value. Next, in step S1619, the information processing apparatus 1410 performs a generation process of an overlay image in which the allocation image selected by the pathologist 240 is allocated to each area. Then, in step S1621, the information processing apparatus 1410 transmits the overlay image generated according to the feature value and the allocation image selected by the pathologist 240 to the communication terminal 230.
In step S1623, the communication terminal 230 superimposes the received overlay image on the tissue specimen image that it has transmitted, and displays the superimposed image. The pathologist 240 determines an area to be further diagnosed in detail and an area to be expanded and displayed, with reference to the displayed superimposed image. In addition, in step S1625, the pathologist 240 determines whether or not the displayed superimposed image is the desired result; when the pathologist selects different feature values or allocation images again by operating the communication terminal 230, the process returns to step S1609 and is repeated.
In addition, the tissue specimen image may be transmitted simultaneously with the feature value selection information and the allocation image selection information. Further, an inquiry for feature value selection information and an inquiry for allocation image selection may be performed in different steps.
First, the RAM 1740 is different from the RAM in
Further, the storage 1750 is different from the storage in
In step S1801, the information processing apparatus 1410 transmits a screen for inquiring about a feature value and an allocation image to the communication terminal 230. Then, in step S1803, the information processing apparatus 1410 waits to receive the selection information of the feature value and the allocation image from the communication terminal 230, and when it is received, the process proceeds to step S1805. In step S1805, the information processing apparatus 1410 stores the received selection information of the feature value and the allocation image. In subsequent steps S1007 to S1011, the information processing apparatus 1410 performs the respective processes of feature value analysis, area generation, and overlay image generation using the feature value and the allocation image selected by the pathologist 240. In step S1013, the information processing apparatus 1410 transmits the generated overlay image to the communication terminal 230.
In step S1807, the information processing apparatus 1410 waits for input from the pathologist 240 indicating whether the result is "OK", that is, whether the desired result has been achieved from the selection of the feature value and the allocation image. If the input is not OK, the process returns to step S1801, and the information processing apparatus 1410 again waits for the selection information of the feature value and the allocation image from the communication terminal 230, and repeats the process described above.
Next, an information processing system according to the fifth exemplary embodiment of the present invention will be described. An information processing system according to the present exemplary embodiment is different from the fourth exemplary embodiment in that the selection of the feature value and the allocation image are not performed by the pathologist 240 but performed automatically by the information processing apparatus based on the specific information of a tissue specimen image.
According to the present exemplary embodiment, since a suitable feature value and a suitable allocation image are selected from the tissue specimen image without selection by the pathologist, the feature value and the level to which the pathologist has to pay attention can be determined objectively at a glance.
In addition, only the characteristic configuration of the present exemplary embodiment is described, and other configurations and operations are the same as in the fourth exemplary embodiment. Therefore, the detailed description is not repeated.
The tissue specimen image specific information reception unit 1901 of the information processing apparatus 1910 receives specific information for specifying the tissue specimen image which is transmitted from the communication terminal 230 through the network 250. The specific information includes a pathologist ID, a patient ID, body parts, gender, age, medical history, and the like. In addition, a configuration is possible in which the information processing apparatus 1910 can acquire other information from the pathological diagnosis support history DB 1903, based on the pathologist ID and the patient ID.
With reference to the pathological diagnosis support history DB 1903, a feature value and allocation image determination unit 1902 automatically determines the feature value and the allocation image from the received specific information, using a table for determination 1902a. The feature value and allocation image determination unit 1902 selects the feature value in the feature value selection unit 1402 according to the determined feature value and the allocation image, and selects the allocation image allocated from the allocation image DB 219.
2010 in
The table for determination 1902a stores a selected feature value 2106 and a selected allocation image 2107 in association with a pathologist ID 2101, a patient ID 2102, a patient attribute 2103, a body part 2104 to be taken, and a pathological diagnosis support history 2105. Based on the table for determination 1902a, the feature value and allocation image determination unit 1902 determines the selected feature value and the selected allocation image for the received tissue specimen image.
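The automatic determination using the table for determination 1902a may be sketched as a lookup, for illustration. The keys and values below (body parts, history labels, selected feature values) are hypothetical examples, not entries prescribed by the present disclosure.

```python
# Hypothetical sketch of the table for determination 1902a: specific
# information about a tissue specimen image automatically selects a feature
# value and an allocation image, with no selection by the pathologist.
DETERMINATION_TABLE = [
    {"body_part": "stomach", "history": "none",
     "feature": "degree of mucus", "allocation": "color"},
    {"body_part": "breast", "history": "follow-up",
     "feature": "histological grade", "allocation": "pattern"},
]

def determine(body_part, history, table=DETERMINATION_TABLE):
    """Return (selected feature value, selected allocation image)."""
    for row in table:
        if row["body_part"] == body_part and row["history"] == history:
            return row["feature"], row["allocation"]
    return None
```

In the actual apparatus, the lookup key would also include the pathologist ID, the patient attributes, and the pathological diagnosis support history, as stored in the table for determination 1902a.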
In step S2201, the information processing apparatus 1910 acquires specific information including the pathologist ID, the patient ID, and the like. Subsequently, in step S2203, the information processing apparatus 1910 determines a feature value and an allocation image from the acquired specific information. The following procedures are the same as in
Next, an information processing system according to the sixth exemplary embodiment of the present invention will be described. The information processing system according to the present exemplary embodiment is different from that of the second exemplary embodiment in that, after the overlay image is superimposed on the tissue specimen image and displayed on the communication terminal 230, a specified area is expanded and displayed at a magnification according to its feature value, in response to an area expansion instruction from the pathologist 240. In addition, although the present exemplary embodiment shows an example of displaying the expanded image in a separate area of the screen of the communication terminal operated by the pathologist, the expanded image may be displayed on a separate screen, or displayed at the instructed position of the tissue specimen image, like a magnifying glass.
According to the present exemplary embodiment, when the pathologist determines the feature value and the level of the tissue specimen image to which attention has to be paid and then instructs expansion and display of a desired area, the area can be expanded and displayed at a magnification according to the feature value of the instructed area. This can eliminate the need for manual magnification adjustment by the pathologist and reduce the pathologist's workload.
In addition, since only the characteristic configuration of the present exemplary embodiment is described here, the other configurations and operations are the same as in the second exemplary embodiment, and their detailed description is not repeated.
An expanded area information reception unit 2301 of an information processing apparatus 2310 receives an area instruction on the screen of the communication terminal 230, on which the overlay image transmitted by the overlay image transmission unit 220 is superimposed and displayed. That is, if the pathologist 240 designates an area of the overlay image displayed on the communication terminal 230, the communication terminal 230 transmits the area information together with the expansion instruction to the information processing apparatus 2310.
A magnification selection unit 2302 selects a magnification using a magnification selection table 2302a, according to the feature value obtained from the area generation unit 216 that corresponds to the area information received from the communication terminal 230. An expanded image generation unit 2303 expands the corresponding area of the tissue specimen image at the magnification selected by the magnification selection unit 2302. Then, the expanded image generation unit 2303 transmits expansion transmission data 2300a, containing the magnification information and the expanded image of the corresponding area, back to the communication terminal 230. In addition, if an application capable of expanding an area at the received magnification runs on the communication terminal 230, the expanded image generation unit 2303 is not an essential constituent element. In the first place, since the communication terminal 230 holds the tissue specimen image at the highest resolution, it is desirable, in view of communication traffic, to configure the communication terminal 230 to expand the area at the magnification corresponding to the received feature value.
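The expansion performed by a unit such as the expanded image generation unit 2303 amounts to cropping the instructed region and scaling it by the selected magnification. The following is a minimal stand-in using nearest-neighbour replication on a plain 2D list of pixel values; the function name and integer-magnification restriction are illustrative assumptions, and a real system would use a proper imaging library.

```python
def expand_region(image, top, left, height, width, magnification):
    """Crop a (top, left, height, width) region from `image` (a 2D list of
    pixel values) and enlarge it by an integer `magnification` using
    nearest-neighbour replication. A hypothetical stand-in for unit 2303."""
    # Crop the instructed region.
    crop = [row[left:left + width] for row in image[top:top + height]]
    # Replicate each pixel `magnification` times horizontally,
    # and each resulting row `magnification` times vertically.
    expanded = []
    for row in crop:
        scaled_row = [px for px in row for _ in range(magnification)]
        expanded.extend([scaled_row[:] for _ in range(magnification)])
    return expanded
```

For instance, expanding a 2x2 region at magnification 2 yields a 4x4 block in which each source pixel occupies a 2x2 patch.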
It is assumed that the pathologist 240 selects the area 2411 as an area to be expanded and diagnosed in detail from the overlay image of the superimposed image.
In the magnification selection table 2302a, a feature value 2502 obtained from the area generation unit 216 for an area 2501 specified by the pathologist 240 is stored in association with a magnification 2503. In addition, although not shown, information associating feature values with magnifications is prepared in advance in the magnification selection unit 2302; this information may instead be stored in a separate DB. Using this information, the magnification selection unit 2302 can obtain the magnification associated with the feature value 2502 that is obtained from the area generation unit 216 for the area 2501 specified by the pathologist 240.
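One plausible shape for the magnification selection table 2302a is a threshold table in which larger feature values select larger magnifications, so that areas that need the most attention are expanded the most. The thresholds, the [0, 1] feature-value range, and the magnification figures below are purely illustrative assumptions.

```python
# Hypothetical magnification selection: higher feature values (e.g. a
# suspicion score in [0, 1]) map to higher magnifications, mirroring the
# role of table 2302a. Thresholds and magnifications are illustrative only.
MAGNIFICATION_TABLE = [
    (0.8, 40),   # feature value >= 0.8 -> 40x
    (0.5, 20),   # feature value >= 0.5 -> 20x
    (0.0, 10),   # otherwise            -> 10x
]

def select_magnification(feature_value):
    """Return the first magnification whose threshold the value meets."""
    for threshold, magnification in MAGNIFICATION_TABLE:
        if feature_value >= threshold:
            return magnification
    return MAGNIFICATION_TABLE[-1][1]
```

A table like this is trivially editable per pathologist or per body part, which fits the table-driven configuration style the other embodiments use.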
If it is determined that an instruction for area expansion has been received, the information processing apparatus 2310 proceeds to step S2703 and acquires the area information from the received expanded area information. Next, the information processing apparatus 2310 acquires the feature value information from the area generation unit 216, using the acquired area information. In step S2705, the information processing apparatus 2310 selects a magnification according to the feature value information corresponding to the acquired area information, using the magnification selection table 2302a. Finally, in step S2707, the information processing apparatus 2310 transmits either the magnification alone or the expanded area image to the communication terminal 230.
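The steps S2703 to S2707 above can be sketched as a single request handler. Everything here is a hypothetical illustration: the request and area dictionaries, the `terminal_can_expand` flag (modelling the case where the terminal itself can expand and only the magnification needs to be sent), and the inline threshold rule standing in for table 2302a; the actual scaling of the cropped region is omitted for brevity.

```python
def handle_expansion_request(request, feature_values, image):
    """Sketch of steps S2703-S2707: look up the feature value for the
    requested area, select a magnification, and return either the
    magnification alone or the magnification plus the area image.
    All names and the threshold rule are illustrative assumptions."""
    area = request["area"]                     # S2703: area information
    feature = feature_values[area["id"]]       # feature value from unit 216
    # S2705: magnification according to the feature value (hypothetical rule)
    magnification = 40 if feature >= 0.8 else 20 if feature >= 0.5 else 10
    if request.get("terminal_can_expand"):     # S2707: terminal expands itself
        return {"magnification": magnification}
    # Otherwise crop the area to send back (scaling omitted for brevity).
    crop = [row[area["left"]:area["left"] + area["width"]]
            for row in image[area["top"]:area["top"] + area["height"]]]
    return {"magnification": magnification, "expanded_image": crop}
```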
Next, an information processing system according to the seventh exemplary embodiment of the present invention will be described. The information processing system according to the present exemplary embodiment differs from that of the second exemplary embodiment in that the same allocation image is allocated and displayed for a feature value and level that are common to a plurality of tissue specimen images.
According to the present exemplary embodiment, it is possible to determine at a glance, across a plurality of tissue specimen images, the feature value or the level to which the pathologist has to pay attention.
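The core of this embodiment is that the mapping from a (feature value, level) pair to an allocation image is shared across all specimen images rather than recomputed per image. A minimal sketch, assuming a small pool of named allocation images standing in for entries in the allocation image DB 219 (the pool contents and assignment order are hypothetical):

```python
# Hypothetical pool of allocation images from the allocation image DB 219.
ALLOCATION_POOL = ["hatch_red", "dot_blue", "stripe_green", "cross_yellow"]

class SharedAllocator:
    """Assigns the same allocation image to the same (feature, level) pair,
    regardless of which tissue specimen image the pair appears in."""
    def __init__(self):
        self._assigned = {}   # (feature, level) -> allocation image

    def allocate(self, feature, level):
        key = (feature, level)
        if key not in self._assigned:
            # First time this pair is seen: take the next pool entry.
            self._assigned[key] = ALLOCATION_POOL[len(self._assigned)
                                                  % len(ALLOCATION_POOL)]
        return self._assigned[key]
```

Because the assignment dictionary outlives any single image, a pair first seen in one specimen image reuses the same allocation image when it recurs in another.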
In addition, since only the characteristic configuration of the present exemplary embodiment is described here, the other configurations and operations are the same as in the second exemplary embodiment, and their detailed description is not repeated.
(Screen in which an Overlay Image is Superimposed on a Plurality of Tissue Specimen Images)
The area information 216a-3 has the same configuration as in
The overlay image information 218a-3 has the same configuration as
Hitherto, the exemplary embodiments of the present invention have been described in detail; however, a system or a device in which the features included in the respective exemplary embodiments are combined in any manner also falls within the scope of the present invention.
Further, the present invention may be applied to a system composed of a plurality of devices, or to a single device. Further, the present invention is applicable when a control program for realizing the functions of the exemplary embodiments is supplied to the device or the system directly or remotely. Accordingly, in order for a computer to perform the functions of the present invention, a control program installed in the computer, a medium storing the control program, and a World Wide Web (WWW) server from which the control program is downloaded are also included in the scope of the present invention.
This application claims priority based on Japanese Patent Application No. 2011-179094 filed on Aug. 18, 2011, which is incorporated herein by reference in its entirety.
Number | Date | Country | Kind
---|---|---|---
2011-179094 | Aug 2011 | JP | national

Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/JP2012/005182 | 8/17/2012 | WO | 00 | 2/14/2014