1. Field of the Invention
The present invention relates to an image processing apparatus, a control method for the same, an image processing system, and a program.
2. Description of the Related Art
Virtual slide systems that capture a virtual slide image by imaging a specimen on a slide using a digital microscope and display the virtual slide image on a monitor to allow observation have been receiving attention (see Japanese Patent Application Laid-Open No. 2011-118107).
Image presentation techniques enabling efficient display of reduced or magnified images having large data sizes have been known (see Japanese Patent Application Laid-Open No. 2011-170480).
In the virtual slide system disclosed in Japanese Patent Application Laid-Open No. 2011-118107, when there are a plurality of specimens on a slide, it is necessary to perform screening of the individual specimens on a specimen-by-specimen basis with care not to overlook any specimen, making the specimen observation burdensome.
The display technique disclosed in Japanese Patent Application Laid-Open No. 2011-170480 can reduce the possibility of overlooking individual specimens, but it does not reduce the burden of screening the individual specimens.
The present invention provides an image processing apparatus with which the burden in specimen observation (or screening) can be lightened in cases where there are a plurality of specimens on a slide.
According to a first aspect of the present invention, there is provided an image processing apparatus configured to generate a display image used to display on a display apparatus a captured image captured by imaging a slide on which a specimen is placed by an imaging apparatus, comprising:
an acquisition unit configured to acquire an overall image generated from the captured image for displaying the entirety of the slide and a magnified image generated from the captured image for displaying a portion of the specimen in a magnified manner; and
a generation unit configured to generate a display image containing the overall image and the magnified image,
wherein the magnified image is a rotated image rotated relative to the overall image on the basis of specimen information about a feature of the specimen displayed in the magnified manner.
According to a second aspect of the present invention, there is provided a control method for an image processing apparatus configured to generate a display image used to display on a display apparatus a captured image captured by imaging a slide on which a specimen is placed by an imaging apparatus, comprising:
an acquisition step of acquiring an overall image generated from the captured image for displaying the entirety of the slide and a magnified image generated from the captured image for displaying a portion of the specimen in a magnified manner; and
a generation step of generating a display image containing the overall image and the magnified image,
wherein the magnified image is a rotated image rotated relative to the overall image on the basis of specimen information about a feature of the specimen displayed in the magnified manner.
According to a third aspect of the present invention, there is provided a program that causes a computer to control an image processing apparatus configured to generate a display image used to display on a display apparatus a captured image captured by imaging a slide on which a specimen is placed by an imaging apparatus, the program causing the computer to execute:
an acquisition step of acquiring an overall image generated from the captured image for displaying the entirety of the slide and a magnified image generated from the captured image for displaying a portion of the specimen in a magnified manner; and
a generation step of generating a display image containing the overall image and the magnified image,
wherein the magnified image is a rotated image rotated relative to the overall image on the basis of specimen information about a feature of the specimen displayed in the magnified manner.
According to a fourth aspect of the present invention, there is provided an image processing apparatus configured to generate a display image used to display on a display apparatus a captured image captured by imaging a slide on which a plurality of specimens are placed by an imaging apparatus, comprising:
an acquisition unit configured to acquire an overall image generated from the captured image for displaying the entirety of the slide, a specimen image for displaying the entirety of a selected specimen among the plurality of specimens, and a magnified image for displaying a portion of the specimen displayed by the specimen image in a magnified manner; and
a generation unit configured to generate a display image containing the overall image, the specimen image, and the magnified image,
wherein the specimen image is an image rotated relative to the overall image on the basis of specimen information about a feature of the specimen displayed by the specimen image, and the magnified image is an image not rotated relative to the specimen image.
According to a fifth aspect of the present invention, there is provided a control method for an image processing apparatus configured to generate a display image used to display on a display apparatus a captured image captured by imaging a slide on which a plurality of specimens are placed by an imaging apparatus, comprising:
an acquisition step of acquiring an overall image generated from the captured image for displaying the entirety of the slide, a specimen image for displaying the entirety of a selected specimen among the plurality of specimens, and a magnified image for displaying a portion of the specimen displayed by the specimen image in a magnified manner; and
a generation step of generating a display image containing the overall image, the specimen image, and the magnified image,
wherein the specimen image is an image rotated relative to the overall image on the basis of specimen information about a feature of the specimen displayed by the specimen image, and the magnified image is an image not rotated relative to the specimen image.
The present invention can reduce the burden on a user in performing observation (screening) of specimens in cases where there are a plurality of specimens on a slide.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In the following, embodiments of the present invention will be described with reference to the drawings.
The image processing apparatus according to the present invention can be used in an image processing system including an imaging apparatus and a display apparatus. Such an image processing system will be described with reference to
The imaging apparatus 101 is a virtual slide scanner that performs imaging at a plurality of different positions in a two-dimensional plane to output digital image data of a plurality of two-dimensional images. The imaging apparatus 101 uses a solid-state imaging element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) to acquire two-dimensional images. The virtual slide scanner serving as the imaging apparatus 101 may be replaced by a digital microscope apparatus constituted by an ordinary optical microscope and a digital camera attached to the eyepiece of the optical microscope.
The image processing apparatus 102 is an apparatus having the function of generating, responsive to a user's request, data (display data) of an image to be displayed on the display apparatus 103 from data of a plurality of original images (captured images) acquired through the imaging apparatus 101. The image processing apparatus 102 has hardware resources such as a CPU (Central Processing Unit), a RAM (Random Access Memory), a storage device, an operation unit, and various interfaces. The image processing apparatus 102 is constituted by a general-purpose computer or workstation. The storage device is, for example, a large-capacity information storage device such as a hard disk drive, in which an operating system (OS) and the program(s) and data used to implement the various kinds of processing described later are stored. The above-described functions are carried out by the CPU loading the program(s) and data as needed into the RAM from the storage device and executing the program(s). The operation unit includes a keyboard and/or mouse or the like, which is used by the user to input various commands to the image processing apparatus 102.
The display apparatus 103 is a display such as a CRT (Cathode-Ray Tube) or liquid crystal display, which is used to display images (images for observation) based on display data generated by the image processing apparatus 102.
The data server 104 is a server in which diagnosis reference information (data relevant to standards of diagnosis) that serves as a guideline for the user in diagnosing specimens is stored. The diagnosis reference information is updated whenever needed to reflect the latest knowledge of pathological diagnosis. The data server 104 is configured to update its stored content in line with the updating of the diagnosis reference information.
The imaging apparatus 101 is basically composed of an illumination unit 201, a stage 202, a stage control unit 205, an image forming optical system 207, an imaging unit 210, a developing unit 219, a preliminary measurement unit 220, a main control system 221, and an external apparatus I/F 222.
The illumination unit 201 is a unit that illuminates a slide 206 placed on the stage 202 uniformly with light. The illumination unit 201 includes a light source, an illumination optical system, and a control system for driving the light source. The stage 202 is driven under control of the stage control unit 205 so as to be capable of shifting in the three axial directions, i.e. the X, Y, and Z directions. The slide 206 is prepared by placing a slice of tissue or a smear of cells to be observed on a slide glass and fixing it under a cover glass with a mounting agent.
The stage control unit 205 includes a drive control system 203 and a stage drive mechanism 204. The drive control system 203 receives commands from the main control system 221 to perform drive control for the stage 202. The direction of shift and the amount of shift of the stage 202 are determined based on position information and thickness information (or distance information) about the specimen obtained by measurement performed by the preliminary measurement unit 220 and on a command input by the user if needed. The stage drive mechanism 204 drives the stage 202 according to commands from the drive control system 203.
The image forming optical system 207 is a lens unit that forms an optical image of the specimen on the slide 206 on an imaging sensor 208.
The imaging unit 210 includes the imaging sensor 208 and an analogue front end (AFE) 209. The imaging sensor 208 is a one-dimensional or two-dimensional image sensor, such as a CCD or CMOS device, that converts a two-dimensional optical image into an electrical physical quantity by photoelectric conversion. In the case where the imaging sensor 208 is a one-dimensional sensor, a two-dimensional image is obtained by electrically scanning along the main scanning direction while moving the stage 202 along the sub-scanning direction. The imaging sensor 208 outputs an electrical signal having a voltage value correlating with the light intensity. In the case where a color image is to be captured, a single image sensor to which a color filter having a Bayer arrangement is attached may be used, for example. The imaging unit 210 drives the stage 202 along the X axis direction and the Y axis direction to capture divisional images of the specimen.
The AFE 209 is a circuit that converts an analog signal output by the imaging sensor 208 into a digital signal. The AFE 209 includes an H/V driver, a CDS (Correlated Double Sampling) circuit, an amplifier, an AD converter, and a timing generator, which are described below. The H/V driver converts a vertical synchronizing signal and a horizontal synchronizing signal for driving the imaging sensor 208 into voltages required to drive the sensor.
The CDS is a correlated double sampling circuit for removing fixed pattern noise.
The amplifier is an analog amplifier that adjusts the gain of the analog signal from which noise has been removed by the CDS.
The AD converter converts an analog signal into a digital signal. In the case where the resolution of the data that the imaging apparatus 101 finally outputs is 8 bits, the AD converter may convert the analog signal into digital data quantized in, generally, 10 to 16 bits to ensure precision in processing in a later stage (e.g. the developing unit 219) and output the digital data. The data obtained by converting signals output by the imaging sensor in this way is referred to as RAW data. The RAW data is developed in the developing unit 219 in a later stage.
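The bit-depth relationship described above can be sketched as follows. This is an illustrative sketch only, not the apparatus's actual circuit behavior; the function names and the choice of 12-bit RAW quantization are assumptions for the example.

```python
# Illustrative sketch: quantizing a normalized analog value to a 12-bit
# RAW code, then reducing it to the 8-bit resolution of the final output.
# The 12-bit depth is an assumed value within the 10-16 bit range above.
def quantize(analog_value, bits=12, full_scale=1.0):
    """Quantize a normalized analog value (0.0 to 1.0) to an integer code."""
    levels = (1 << bits) - 1
    clipped = min(max(analog_value, 0.0), full_scale)
    return round(clipped / full_scale * levels)

def to_8bit(raw_code, raw_bits=12):
    """Reduce a RAW code to 8 bits by discarding the extra precision."""
    return raw_code >> (raw_bits - 8)

raw_code = quantize(1.0)      # full-scale 12-bit code
out_code = to_8bit(raw_code)  # corresponding 8-bit output value
```

The extra RAW bits preserve precision for the later correction stages (e.g. black correction and gamma correction) before the final reduction to 8 bits.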
The timing generator generates a signal for adjusting the timing of the imaging sensor 208 and the timing of the developing unit 219 in the later stage.
In the case where a CCD is used as the imaging sensor 208, the above-described AFE 209 is indispensable. On the other hand, in the case where a CMOS image sensor capable of outputting digital signals is used, the CMOS image sensor itself has the above-described function of the AFE 209. There is also provided an imaging controller, not shown in the drawings, that controls the imaging sensor 208. The imaging controller controls the timing and operations of the imaging sensor 208, such as the shutter speed, the frame rate, and the region of interest (ROI).
The developing unit 219 includes a black correction unit 211, a demosaicing unit 212, a white balance adjusting unit 213, an image composing unit 214, a filter processing unit 216, a gamma correction unit 217, and a compression processing unit 218.
The black correction unit 211 performs processing of subtracting black correction data obtained in the shaded state from the RAW data for each pixel.
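The per-pixel subtraction performed by the black correction unit 211 can be sketched as follows; this is a minimal illustration, and the clipping behavior is an assumption, not a detail stated above.

```python
import numpy as np

# Illustrative sketch of black correction: subtracting black correction
# data (captured in the shaded state) from the RAW data for each pixel.
def black_correct(raw, black):
    # Widen to a signed type before subtracting, then clip at zero so that
    # noise in the black frame cannot produce negative pixel values
    # (the clipping is an assumption made for this sketch).
    return np.clip(raw.astype(np.int32) - black.astype(np.int32), 0, None)
```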
The demosaicing unit 212 performs processing of generating image data of respective colors of red (R), green (G), and blue (B) from the RAW data of the Bayer arrangement. The demosaicing unit 212 calculates the respective values of red, green, and blue in a target pixel by performing interpolation using the values in the pixels (including pixels of the same color and pixels of different colors) in the vicinity of the target pixel in the RAW data. The demosaicing unit 212 also performs correction processing (or interpolation) for defective pixels.
In the case where the imaging sensor 208 does not have a color filter and picks up a monochromatic image, the demosaicing processing is not needed, and the demosaicing unit 212 performs the correction processing for defective pixels.
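The neighborhood interpolation performed by the demosaicing unit 212 can be illustrated by the simplest case: estimating the green value at a non-green pixel position by averaging its green neighbors. This is a minimal bilinear-style sketch, not the apparatus's actual algorithm, and the RGGB arrangement is an assumption.

```python
import numpy as np

# Minimal sketch of Bayer interpolation: the green value at a red (or
# blue) pixel position is estimated as the mean of the up-to-four green
# neighbors, as in bilinear demosaicing. Assumes an RGGB arrangement.
def green_at(raw, y, x):
    """raw: 2-D RAW array in a Bayer arrangement; (y, x): a non-green site."""
    h, w = raw.shape
    candidates = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    vals = [raw[j, i] for j, i in candidates if 0 <= j < h and 0 <= i < w]
    return sum(vals) / len(vals)  # edge pixels use fewer neighbors
```

The full demosaicing process repeats this kind of interpolation for all three color planes, also using same-color neighbors where appropriate.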
The white balance adjusting unit 213 performs processing of adjusting the gains for the respective colors of red, green, and blue in accordance with the color temperature of the illumination unit 201 to reproduce desirable white. In the case where a monochromatic image is processed, the white balance adjusting processing is not needed.
The imaging apparatus 101 according to this embodiment divides an area to be imaged (i.e. the area over which the slide exists) into small regions each having a size over which the imaging sensor 208 can capture an image in a single imaging session and performs imaging for the small regions on a region-by-region basis. The image composing unit 214 performs processing of stitching the plurality of images captured by the above-described divisional imaging to generate large-size image data representing the entire area to be imaged (i.e. the entirety of the slide). In this embodiment, it is assumed that the size of the entire area to be imaged is larger than the size of the region over which the imaging sensor can capture an image in a single imaging session. Thus, the imaging apparatus 101 generates data of a single two-dimensional image in which the entire area to be imaged (or the entirety of the slide) is captured by performing processing of stitching the plurality of images captured by divisional imaging.
Here, it is assumed for example that a square area of 10 mm×10 mm on the slide 206 is to be imaged at a resolution of 0.25 μm. Then, the number of pixels along one side of the area is 10 mm/0.25 μm = 40,000, and hence the total number of pixels is 40,000² = 1,600,000,000 (1.6 billion). If the number of pixels of the imaging sensor 208 is 10 mega (10 million) pixels, in order to obtain image data of 1.6 billion pixels, it is necessary to divide the entire area to be imaged (the entirety of the slide) into (1.6 billion)/(10 million) = 160 divisional regions and to perform imaging for the respective divisional regions.
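The arithmetic above can be reproduced directly (Python, for illustration only, under the same assumed numbers):

```python
import math

# Reproducing the divisional-imaging arithmetic under the stated
# assumptions: a 10 mm x 10 mm area imaged at 0.25 um resolution,
# with a 10-megapixel imaging sensor.
side_px = round(10e-3 / 0.25e-6)    # pixels along one side: 40,000
total_px = side_px ** 2             # total pixels: 1.6 billion
sensor_px = 10_000_000              # pixels captured per imaging session
regions = math.ceil(total_px / sensor_px)   # divisional regions needed
```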
Exemplary methods of stitching data of a plurality of images include stitching the plurality of divisional images while aligning them based on information about the position of the stage 202, stitching the plurality of divisional images with reference to corresponding points or lines in the divisional images, and stitching the plurality of divisional images based on positional information of the divisional images. Using interpolation processing such as zeroth-order interpolation, linear interpolation, or higher-order interpolation in stitching the images can lead to smoother stitching. In this embodiment, it is assumed that a single image having a large data amount is generated by the imaging apparatus 101. However, the image processing apparatus 102 may instead perform the processing of stitching the divisional images captured by divisional imaging by the imaging apparatus 101 to generate a single image having a large data amount.
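The first stitching method mentioned above, alignment based on stage position information, can be sketched as follows. This is a simplified illustration that assumes the stage positions translate exactly into pixel offsets, with no overlap blending or interpolation.

```python
import numpy as np

# Illustrative sketch of stitching divisional images by placing each
# one on a canvas at the pixel offset derived from the stage position.
def stitch(tiles):
    """tiles: list of (y_offset, x_offset, 2-D image) entries."""
    height = max(y + t.shape[0] for y, x, t in tiles)
    width = max(x + t.shape[1] for y, x, t in tiles)
    canvas = np.zeros((height, width), dtype=tiles[0][2].dtype)
    for y, x, t in tiles:
        canvas[y:y + t.shape[0], x:x + t.shape[1]] = t
    return canvas
```

A practical implementation would additionally blend or interpolate across the seams, as noted above, to make the stitching smoother.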
The filter processing unit 216 is a digital filter that performs processing of suppressing high-frequency components contained in the image, removing noise, and increasing the apparent sharpness.
The gamma correction unit 217 performs processing of giving inverse characteristics to the image taking into consideration the tone reproduction characteristics of common display devices, and performs tone conversion adapted to the characteristics of human eyesight by tone compression in high-luminance parts and/or processing of dark parts. In this embodiment, in order to produce an image to be used for the purpose of morphological observation, tone conversion suitable for the composing processing and display processing in later stages is applied to the image data.
The compression processing unit 218 performs compression encoding in order to improve the efficiency of transmission of the large-size two-dimensional image data and to reduce the data amount for storage. As compression methods for still images, standardized encoding schemes such as JPEG (Joint Photographic Experts Group), and JPEG 2000 and JPEG XR, which were developed by improving or advancing JPEG, are widely known. The compression processing unit 218 also performs processing of reducing the two-dimensional image data and generates multi-layer image data. The multi-layer image data will be described later with reference to
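The reduction processing that produces the multi-layer image data can be sketched as repeated 2× downscaling of the stitched image. This is an illustrative sketch only; the 2× factor and the 2×2 averaging filter are assumptions standing in for whatever reduction the apparatus actually uses.

```python
import numpy as np

# Sketch of generating multi-layer (pyramid) image data by repeated
# 2x reduction; each 2x2 block of the finer layer is averaged to form
# one pixel of the coarser layer (an assumed reduction filter).
def build_layers(image, n_layers):
    layers = [image.astype(np.float64)]
    for _ in range(n_layers - 1):
        img = layers[-1]
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2  # even crop
        img = img[:h, :w]
        reduced = (img[0::2, 0::2] + img[0::2, 1::2] +
                   img[1::2, 0::2] + img[1::2, 1::2]) / 4.0
        layers.append(reduced)
    return layers[::-1]   # lowest resolution first
```

Each resulting layer would then be divided into blocks and compression-encoded (e.g. as JPEG) to form the compressed image blocks described later.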
The preliminary measurement unit 220 is a unit that performs measurement for obtaining information about the position of a specimen on the slide 206 and information about the distance to a desired focus position, and for calculating a parameter used for light quantity adjustment in accordance with the thickness of the specimen. This measurement is a preliminary measurement performed before the imaging (or main measurement) for acquiring the virtual slide image. By obtaining information about the object of imaging (i.e. the slide) with the preliminary measurement unit 220 before the main measurement, imaging can be performed efficiently. A two-dimensional imaging sensor having a resolving power lower than that of the imaging sensor 208 is used to obtain position information in a two-dimensional plane. The preliminary measurement unit 220 obtains information about the position of the specimen on the X-Y plane from a captured image. A laser displacement meter or a Shack-Hartmann sensor is used to obtain the distance information and the thickness information.
The main control system 221 is configured to control the units described in the foregoing. The control functions of the main control system 221 and the developing unit 219 are implemented in a control circuit having a CPU, a ROM (Read-Only Memory), and a RAM. Specifically, programs and data are stored in the ROM, and the functions of the main control system 221 and the developing unit 219 are carried out by the CPU that executes the programs while using the RAM as a work memory.
As the ROM, a device such as an EEPROM (Electronically Erasable and Programmable Read-Only Memory) or a flash memory is used. As the RAM, a DRAM (Dynamic RAM) device such as a DDR3 DRAM is used for example. Alternatively, the function of the developing unit 219 may be implemented in an ASIC (Application Specific Integrated Circuit) as a dedicated hardware device.
The external apparatus I/F 222 is an interface for transmitting the multi-layer image data generated by the developing unit 219 to the image processing apparatus 102. The imaging apparatus 101 and the image processing apparatus 102 are connected by an optical communication cable. Alternatively, a general-purpose interface such as USB or Gigabit Ethernet (registered trademark) may be used to connect the imaging apparatus 101 and the image processing apparatus 102.
The apparatus performing the image processing may be, for example, a personal computer (PC). The PC has a control unit 301, a main memory 302, a sub-memory 303, a graphics board 304, an internal bus 305 interconnecting the above-mentioned units, a LAN I/F 306, a storage device I/F 307, an external device I/F 309, an operation I/F 310, and an input/output I/F 313.
The control unit 301 accesses the main memory 302 and the sub-memory 303 when needed and performs overall control of all the blocks of the PC while executing various computations.
The main memory 302 and the sub-memory 303 are constituted by RAMs. The main memory 302 serves as a working area for the control unit 301, temporarily storing the OS, various programs under execution, and various data to be processed, such as data used in display data generation. The main memory 302 and the sub-memory 303 also serve as storage areas for image data. The DMA (Direct Memory Access) function of the control unit 301 enables high-speed transmission of image data between the main memory 302 and the sub-memory 303 and between the sub-memory 303 and the graphics board 304.
The graphics board 304 outputs the result of image processing to the display apparatus 103. The display apparatus 103 is a display device utilizing, for example, liquid crystal or EL (Electro-Luminescence). In this embodiment, the display apparatus 103 is an external apparatus connected to the image processing apparatus 102. However, the image processing apparatus and the display apparatus may be constructed as a single integrated apparatus. A notebook PC can constitute such an integrated apparatus.
To the input/output I/F 313 are connected the data server 104 via the LAN I/F 306, a storage device 308 via the storage device I/F 307, the imaging apparatus 101 via the external device I/F 309, and a keyboard 311 and a mouse 312 via the operation I/F 310. The imaging apparatus 101 is, for example, a virtual slide scanner or a digital microscope.
The storage device 308 is an auxiliary storage device, which permanently stores, as firmware, the OS, the programs to be executed by the control unit 301, and various parameters. The data and information stored in the storage device 308 can be read out via the storage device I/F 307. The storage device 308 also serves as a storage area for the multi-layer image data sent from the imaging apparatus 101. As the storage device 308, a magnetic disk drive such as an HDD (Hard Disk Drive), or a semiconductor device using a flash memory such as an SSD (Solid State Drive), may be used.
The keyboard 311 and the mouse 312 have been mentioned above as input devices connected to the operation I/F 310 by way of example; however, the screen of the display apparatus 103 can also be adapted to constitute an input device, for example, by the use of a touch panel or the like. When this is the case, the touch panel serving as an input device is integrated with the display apparatus 103.
The control unit 301 is composed of a user input information obtaining unit 401, an image data acquisition control unit 402, a multi-layer image data acquisition unit 403, a display data generation control unit 404, a display candidate image data acquisition unit 405, a display candidate image data generation unit 406, and a display image data transfer unit 407.
The user input information obtaining unit 401 obtains command information input by the user through the keyboard 311 and/or the mouse 312 via the operation I/F 310. Examples of the command information include start and termination of image display, and scroll, reduction, and magnification of the display image.
The image data acquisition control unit 402 controls, based on information input by the user, read-out of image data from the storage device 308 and development of the image data into the main memory 302. The image data acquisition control unit 402 estimates changes of the displayed region (i.e. the image region to be actually displayed on the display apparatus) on the basis of various user input information such as start and termination of image display, and scroll, reduction, and magnification of the display image. Then, the image data acquisition control unit 402 specifies an image area (first display candidate region) the image data of which is needed to generate an image of the displayed region.
If the main memory 302 is not holding image data of the first display candidate region, the image data acquisition control unit 402 instructs the multi-layer image data acquisition unit 403 to read out image data of the first display candidate region from the storage device 308 and to develop the read-out image data into the main memory 302. Because the read-out of the image data from the storage device 308 takes processing time, it is desirable that the first display candidate region be set to be as large as possible to reduce the overhead necessitated by this processing.
The multi-layer image data acquisition unit 403 reads out image data from the storage device 308 and develops the read-out image data into the main memory 302 according to control instructions by the image data acquisition control unit 402.
The display data generation control unit 404 controls read-out of image data from the main memory 302, processing of the read-out image data, and transfer of the image data to the graphics board 304 on the basis of information input by the user. The display data generation control unit 404 estimates changes of the displayed region on the basis of user input information such as start and termination of image display, and scroll, reduction, and magnification of the display image. Moreover, the display data generation control unit 404 specifies an image region (a second display candidate region) whose image data is needed in generating the image of the displayed region and an image region (displayed region) to be actually displayed on the display apparatus 103.
If the sub-memory 303 is not holding the image data of the second display candidate region, the display data generation control unit 404 instructs the display candidate image data acquisition unit 405 to read out the image data of the second display candidate region from the main memory 302. Furthermore, the display data generation control unit 404 instructs the display candidate image data generation unit 406 as to the image data processing method responsive to a scroll request.
The display data generation control unit 404 instructs the display image data transfer unit 407 to read out the image data of the displayed region from the sub-memory 303. The read-out of image data from the main memory 302 can be done at a speed higher than the read-out of image data from the storage device 308. Therefore, the area of the second display candidate region may be set smaller than the area of the above-mentioned first display candidate region. Thus, the relationship of the areas of the above-mentioned first display candidate region, second display candidate region, and displayed region is as follows: the first display candidate region ≧ the second display candidate region ≧ the displayed region.
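The containment relationship stated above can be expressed as a simple check. This is a sketch for illustration; representing regions as (left, top, right, bottom) tuples is an assumption of the example, not a detail of the apparatus.

```python
# Sketch of the relationship: displayed region <= second display
# candidate region <= first display candidate region. Regions are
# assumed to be (left, top, right, bottom) tuples in pixel coordinates.
def contains(outer, inner):
    """True if `inner` lies entirely within `outer`."""
    return (outer[0] <= inner[0] and outer[1] <= inner[1] and
            outer[2] >= inner[2] and outer[3] >= inner[3])

first = (0, 0, 4000, 4000)       # read from storage into the main memory
second = (500, 500, 3000, 3000)  # decompressed into the sub-memory
displayed = (1000, 1000, 2000, 2000)  # actually shown on the display
```

Keeping the first region largest amortizes the slow storage read-out, while the smaller second region keeps the faster main-memory read-out and decompression cheap.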
The display candidate image data acquisition unit 405 reads out the image data of the second display candidate region from the main memory 302 and transfers the read-out image data to the display candidate image data generation unit 406 according to control instructions by the display data generation control unit 404.
The display candidate image data generation unit 406 executes decompression of the compressed image data of the display candidate region to develop the image data into the sub-memory 303.
The display image data transfer unit 407 reads out the image data of the displayed region from the sub-memory 303 and transfers the read-out image data to the graphics board 304 according to control instructions by the display data generation control unit 404. The DMA function enables high-speed transmission of image data between the sub-memory 303 and the graphics board 304.
The first image layer 501 is an image having the lowest resolution among the four image layers, which is used as a thumbnail image or the like. The second image layer 502 and the third image layer 503 are images having medium resolutions, which are used for large-area observation of the virtual slide image. The fourth image layer 504 is an image having the highest resolution, which is used when the virtual slide image is observed in detail.
Each image layer is constituted by a collection of a certain number of blocks of compressed images. For example, in the case of JPEG compression, each compressed image block is a single JPEG image. In the illustrated case, the first image layer 501 is composed of one compressed image block, the second image layer 502 is composed of four compressed image blocks, the third image layer 503 is composed of 16 (sixteen) compressed image blocks, and the fourth image layer 504 is composed of 64 (sixty-four) compressed image blocks.
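The block counts above follow a powers-of-four progression, since each successive layer doubles the resolution in each dimension. A one-line sketch (assuming, as in the illustrated case, that the first layer is a single block):

```python
# Block count per image layer: doubling the resolution in each of the
# two dimensions quadruples the number of compressed image blocks.
def blocks_in_layer(layer_index):
    """layer_index: 0 for the first (lowest resolution) image layer."""
    return 4 ** layer_index

counts = [blocks_in_layer(i) for i in range(4)]
```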
Differences in the resolution are analogous to differences in the optical magnification in the microscope observation. Specifically, observation of the first image layer 501 displayed on the display apparatus corresponds to microscope observation at a low magnification, and observation of the fourth image layer 504 displayed on the display apparatus corresponds to microscope observation at a high magnification. For example, if the user wishes to observe a specimen in detail, he/she may cause the display apparatus to display the fourth image layer 504 for observation.
In the illustrated case, nine specimens are attached to the slide 206, and an individual specimen 602 is one of them. In the case of a biopsy (removal of tissue from a living body for diagnostic examination) of the stomach, liver, or the like, a plurality of specimens are placed on one slide in some cases, as shown in
For example, the image processing apparatus 102 may be equipped with dedicated hardware for implementing the function of the image presentation application. Alternatively, an extension board equipped with such hardware may be attached to the image processing apparatus 102 to enable the image processing apparatus 102 to execute the image presentation application. The source from which the image presentation application is provided is not limited to an external storage device; the image presentation application may instead be downloaded through a network.
In
The second image 702 and the third image 703 can be considered to be a base image and a derivative image, which are in a first reduction-magnification relationship. The third image 703 and the first image 701 can also be considered to be a base image and a derivative image, which are in a second reduction-magnification relationship. This way of image presentation enables efficient observation of the specimen. Important features of the way of image presentation according to the present invention will be described below with reference to
The “Magnification” and “Depth” menus are used to set whether or not to display information about the magnification and the depth of the image presented as the first image 701. In the exemplary screen shown in
The “Tool Bar” menu is used to set whether or not to display a tool bar that contains tools for copying, cutting, and pasting images.
The “Status” menu is used to set whether or not to display a status panel for displaying information about the image format, coordinate information of the position designated by a mouse pointer on the image, etc.
The “Image List” menu is used to set whether or not to display an image list for displaying a list of image files in the folder.
The “Navigator” menu will be described below.
The “Slide” menu is used to set whether or not to display an entire image of the slide, including the label, captured in the preliminary measurement.
The “Full-Screen” menu is used to set whether or not to display the first image 701 in full screen on the display apparatus 103.
The “Navigator” menu is used to set whether or not to display the second image 702 and the third image 703 as navigation screens. The “Navigator” menu has a lower menu layer including “Two Window Navigation”, “One Window Navigation”, and “No Navigation”.
When “Two Window Navigation” is selected, the image presentation is performed in the mode in which the second image 702 (image of the slide) and the third image 703 (image of an individual specimen) are displayed in addition to the first image 701 (magnified image), as is the case with the exemplary screen shown in
When “One Window Navigation” is selected, the image presentation is performed in the mode in which the second image 702 (image of the slide) is displayed in addition to the first image 701 (magnified image). In this mode, the specimen designation frame 704 is not displayed in the second image 702, and only the magnified region designation frame 705 is displayed. In this case, the designation of the region to be displayed as the first image 701 is performed using the second image 702. The image presentation application may be configured in such a way that both the specimen designation frame 704 and the magnified region designation frame 705 are displayed in the second image 702. Such a configuration will be described later in the third embodiment.
When “No Navigation” is selected, only the first image 701 is displayed, and the screens of the second image 702 and the third image 703 are not displayed.
The “Two Window Navigation” menu has a further lower layer including three setting items, which are “Auto-Rotation ON”, “Manual Rotation ON”, and “Rotation Mode OFF”.
When “Auto-Rotation ON” or “Manual Rotation ON” is selected, the first image (magnified image), the specimen designation frame in the second image, and the third image (image of an individual specimen) are displayed in a rotated state according to the shape or condition of the individual specimen or according to a rotating operation made by the user. The above-mentioned “Auto-Rotation ON” and “Manual Rotation ON” are collectively referred to as “Rotation Mode ON”. This will be described in detail later with reference to
When “Rotation Mode OFF” is selected, the image presentation is performed without applying image rotation to the multi-layer image data received from the imaging apparatus 101 as shown in
On the other hand, in the case of the image presentation with the rotation mode ON, a rotated image of the individual specimen 602 is displayed as the third image 703. The rotation of the image of the individual specimen 602 is performed based on the shape of the individual specimen 602. In
The third image 905 (with the rotation mode ON) is an image of the individual specimen 602 rotated in the above-described manner. In the second image 904 (with the rotation mode ON), the specimen designation frame 704 is rotated in accordance with the rotation of the individual specimen 602 in the third image 905. The magnified region designation frame 705 appearing in the third image 905 (with the rotation mode ON) has the same frame shape as the magnified region designation frame 705 in
The rotation of the first image, the third image, and the frame in the second image specifying a region in the third image effected based on the shape of the specimen as shown in
In step S1001, the control unit 301 makes a determination as to whether or not there are a plurality of specimens on the slide 206. This step is executed in the preliminary measurement. For instance, information about the number of specimens is written or electronically recorded on the label 601 beforehand at the time of preparation of the slide 206, and the information on the label 601 is read in the preliminary measurement to acquire the information about the number of specimens.
The information about the number of specimens is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the number of specimens from the imaging apparatus 101, the storage device 308, or the apparatus in the network and makes a determination as to whether or not there are a plurality of specimens on the slide 206 on the basis of this information.
Alternatively, an image of the slide 206 captured by imaging in the preliminary measurement may be stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. In this case, the control unit 301 of the image processing apparatus 102 may retrieve the image of the slide 206 from the imaging apparatus 101, the storage device 308, or the apparatus in the network and determine the number of specimens by image processing.
In step S1002, the control unit 301 makes a determination as to whether or not auto-rotation is set to ON in the image presentation mode setting of the image presentation application. Auto-rotation can be set to ON by the user with the display menu described above with reference to
Responsive to the input of the command, the control unit 301 executing the image presentation application sets the rotation mode in the image presentation mode setting and executes processing for drawing the application screen according to the setting. The setting information of the present image presentation mode of the image presentation application is stored in the main memory 302 or the sub-memory 303, and the control unit 301 can make a determination as to the setting of the image presentation mode on the basis of the information stored in the memory.
In step S1003, the control unit 301 accepts a user's command for selecting an individual specimen to acquire information about the individual specimen selected by the user. Specifically, the user performs an operation of selecting an individual specimen that he/she wishes to observe using the keyboard 311 and/or the mouse 312 in the window in which the second image is displayed in the application screen. Responsive to the operation, a command for selecting one of the individual specimens is input to the image processing apparatus 102 through the operation I/F 310.
In step S1004, the control unit 301 acquires information about the position of the geometric centroid of the individual specimen selected in step S1003. The position of the geometric centroid of each of the individual specimens has been computed beforehand in the preliminary measurement, and information about the position of the geometric centroid is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the position of the geometric centroid from the imaging apparatus 101, the storage device 308, or the apparatus in the network.
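The geometric centroid computed in the preliminary measurement can be obtained from a binary specimen mask as the mean position of the specimen pixels. The sketch below is an illustration under assumed conventions (the mask format, with 1 marking specimen pixels, and the function name are not from the original text):

```python
# Hedged sketch: geometric centroid of an individual specimen, given a
# binary mask as a 2-D list where 1 = specimen pixel and 0 = background.
def centroid(mask):
    """Return the (row, col) centroid of all nonzero pixels."""
    total = rsum = csum = 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                total += 1
                rsum += r
                csum += c
    if total == 0:
        raise ValueError("mask contains no specimen pixels")
    return rsum / total, csum / total
```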
In step S1005, the control unit 301 acquires information about the longest diameter axis of the individual specimen selected in step S1003. The longest diameter axis of each of the individual specimens has been computed beforehand in the preliminary measurement, and this information is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the longest diameter axis of the individual specimen from the imaging apparatus 101, the storage device 308, or the apparatus in the network. As described with reference to
In step S1006, the control unit 301 computes a rotation angle of the individual specimen from the information about the position of the geometric centroid acquired in step S1004 and the information about the longest diameter axis acquired in step S1005. The rotation angle of the individual specimen is computed as such an angle that makes the longest diameter axis horizontal in the window as illustrated in
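Making the longest diameter axis horizontal amounts to rotating by the negative of the axis's inclination. A minimal sketch, assuming the axis is given by its two endpoints (an assumed representation, not stated in the original text):

```python
import math

# Hedged sketch: rotation angle (degrees) that makes the segment from
# p0 to p1 (the longest diameter axis) horizontal in the window.
def rotation_angle(p0, p1):
    dx = p1[0] - p0[0]
    dy = p1[1] - p0[1]
    return -math.degrees(math.atan2(dy, dx))
```

For an axis inclined at 45 degrees this returns −45 degrees, i.e. the rotation that lays the axis flat.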
In step S1007, the control unit 301 performs drawing processing for drawing the specimen designation frame in the second image and rotation processing for rotating the individual specimen in the third image and then performs processing for presenting an image reflecting the result of above-mentioned drawing processing and rotation processing. The control unit 301 also performs processing for presenting as the first image a magnified image of the region designated by the magnified region designation frame in the third image. Specifically, the control unit 301 retrieves image data of the region designated by the magnified region designation frame in the third image and applies rotation processing on the retrieved image data in accordance with the rotation angle computed in step S1006 to generate the first image.
The first image is a part of the fourth image layer in the multi-layer image shown in
Therefore, the processing in steps S1004 through S1007 need not be performed on the image of the individual specimen after it is selected in step S1003; instead, the processing in steps S1004 through S1007 is performed in advance for each of the individual specimens. It is preferred that the rotated images be held in the storage device 308. The timing of performing the processing in steps S1004 through S1007 for each of the individual specimens may be, for example, immediately after imaging. When this is the case, the processing in steps S1004 through S1007 may be performed either in the imaging apparatus 101 or in the image processing apparatus 102.
On the other hand, in the image presentation with the rotation mode ON, an image in which the individual specimen 602 is rotated is displayed as the third image 703. The rotation of the individual specimen 602 is performed based on the condition of the individual specimen 602. In
The rotation of the first image, the third image, and the frame in the second image specifying a region in the third image effected based on the condition of the specimen as shown in
In step S1201, the control unit 301 makes a determination as to whether or not there are a plurality of specimens on the slide 206. This step is executed in the preliminary measurement. For instance, information about the number of specimens is written or electronically recorded on the label 601 beforehand at the time of preparation of the slide 206, and the information on the label 601 is read in the preliminary measurement to acquire the information about the number of specimens.
The information about the number of specimens is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the number of specimens from the imaging apparatus 101, the storage device 308, or the apparatus in the network and makes a determination as to whether or not there are a plurality of specimens on the slide 206 on the basis of this information.
Alternatively, an image of the slide 206 captured by imaging in the preliminary measurement may be stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. In this case, the control unit 301 of the image processing apparatus 102 may retrieve the image of the slide 206 from the imaging apparatus 101, the storage device 308, or the apparatus in the network and determine the number of specimens by image processing.
In step S1202, the control unit 301 makes a determination as to whether or not auto-rotation is set to ON in the image presentation mode setting of the image presentation application. Auto-rotation can be set to ON by the user with the display menu described above with reference to
Responsive to the input of the command, the control unit 301 executing the image presentation application sets the rotation mode in the image presentation mode setting and executes processing for drawing the application screen according to the setting. The setting information of the present image presentation mode of the image presentation application is stored in the main memory 302 or the sub-memory 303, and the control unit 301 can make a determination as to the setting of the image presentation mode on the basis of the information stored in the memory.
In step S1203, the control unit 301 accepts a user's command for selecting an individual specimen to acquire information about the individual specimen selected by the user. Specifically, the user performs an operation of selecting an individual specimen that he/she wishes to observe using the keyboard 311 and/or the mouse 312 in the window in which the second image is displayed in the application screen. Responsive to the operation, a command for selecting one of the individual specimens is input to the image processing apparatus 102 through the operation I/F 310.
In step S1204, the control unit 301 acquires information about a suspected cancerous area of the individual specimen selected in step S1203. The suspected cancerous area of the individual specimen is marked in advance in screening conducted by a cytotechnologist. One method of attaching a mark in an analog fashion is to draw a mark directly on the slide 206 using a pen to indicate a suspected cancerous area. Another method is to display an image captured by imaging the slide 206 on a viewer and to add an annotation digitally on the viewer. Screening is a preliminary observation, which may be performed on an image having a low magnification. The marking information of the suspected cancerous area of the individual specimen is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the marking information of the suspected cancerous area of the individual specimen from the imaging apparatus 101, the storage device 308, or the apparatus in the network.
In step S1205, the control unit 301 computes a rotation angle of the individual specimen on the basis of the information about the suspected cancerous area acquired in step S1204. The rotation angle of the individual specimen is computed as such an angle that causes the suspected cancerous area to be located at an upper left position in the window as shown in
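One way to compute such an angle is to rotate the specimen so that the vector from the specimen centroid to the marked area points toward the upper left of the window. The sketch below is an illustration under assumed conventions (image coordinates with x rightward and y downward, so "upper left" is the −135 degree direction; all names are hypothetical):

```python
import math

# Hedged sketch: angle (degrees) that rotates the specimen so that the
# vector from the specimen centroid to the suspected cancerous area
# points toward the upper left of the window.
def angle_to_upper_left(centroid, lesion):
    current = math.degrees(math.atan2(lesion[1] - centroid[1],
                                      lesion[0] - centroid[0]))
    target = -135.0  # upper-left direction in x-right, y-down coordinates
    # Normalize the difference into the range [-180, 180).
    return (target - current + 180.0) % 360.0 - 180.0
```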
In step S1206, the control unit 301 performs drawing processing for drawing the specimen designation frame in the second image and rotation processing for rotating the individual specimen in the third image and then performs processing for presenting an image reflecting the result of above-mentioned drawing processing and rotation processing. The control unit 301 also performs processing for presenting as the first image a magnified image of the region designated by the magnified region designation frame in the third image. Specifically, the control unit 301 retrieves image data of the region designated by the magnified region designation frame in the third image and applies rotation processing on the retrieved image data in accordance with the rotation angle computed in step S1205 to generate the first image.
The image rotation based on the specimen shape described with reference to
In this embodiment, the slide 206 is imaged by the imaging apparatus 101, and images captured by imaging are stored in the storage device 308 of the image processing apparatus 102. A cytotechnologist or pathologist conducts screening as to the condition of individual specimens using the stored images. It is preferred that the result of screening be stored in some form such as a digital annotation or analog marking so that it can be utilized in the image rotation processing.
In cytological diagnosis, screening is typically conducted by a cytotechnologist, and thereafter diagnosis by a pathologist is conducted. In the screening by the cytotechnologist, a preliminary examination of the characteristics of an individual specimen is typically conducted. The result of this preliminary examination may include specimen characteristics information for use in the image rotation processing based on the specimen characteristics according to this embodiment. This enables the pathologist to conduct diagnosis with an image rotated according to the specimen characteristics by the image presentation application. Therefore, an advantageous effect of the present invention, namely a reduction of the burden on the pathologist in observing the specimen, can be expected to be achieved.
The image rotation based on the specimen characteristics may also be performed automatically. For example, the condition of the individual specimen may be determined by image processing or other processing (namely, suspected lesion may be extracted mechanically) on the basis of clinical findings of a portion from which the specimen of the slide 206 was taken. For example, because there is nuclear enlargement and disordered cell arrangement in an area suspected to be cancerous, such an area tends to appear darker in an HE (hematoxylin and eosin) stained image than the normal area. It is possible based on this tendency to find (or distinguish) a suspected lesion (i.e. suspected cancerous area) in an individual specimen by extracting an area in the HE stained image in which the brightness is lower than a reference value by image processing. Thus, rotation of the image of the individual specimen may be performed automatically based on information about the specimen characteristics (e.g. the position of a suspected cancerous area) obtained by image processing or the like.
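The brightness-based extraction described above can be sketched as a simple thresholding step. The reference value and image format (a 2-D list of per-pixel brightness values) below are illustrative assumptions:

```python
# Hedged sketch of the tendency described above: flag pixels of an HE
# stained image whose brightness falls below a reference value as a
# candidate suspected lesion (suspected cancerous area).
def suspected_lesion_mask(brightness, reference=100):
    """Return a binary mask marking pixels darker than the reference."""
    return [[1 if v < reference else 0 for v in row] for row in brightness]
```

In practice the mask would be cleaned up (e.g. small isolated regions removed) before its centroid is used as the position of the suspected cancerous area.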
In this embodiment, there has been described an illustrative case in which the position of a suspected cancerous area is used as information about the specimen characteristics. The information about the specimen characteristics is not limited to this; it may also be information about a suspected lesion such as inflammation or a tumor.
The circumscribed rectangle shown in
In the case described here, image rotation of the individual specimen 602 is performed based on the rotational angle of the smallest circumscribed rectangle 1301. In other words, the image of the individual specimen 602 is rotated based on the rotational angle of the circumscribed rectangle of the individual specimen 602 at which the area of the circumscribed rectangle becomes smallest. An existing algorithm can be used as an algorithm for computing the smallest circumscribed rectangle based on the shape of the individual specimen 602. The shape of the individual specimen 602 can be acquired automatically by, for example, image processing based on contrast.
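As a stand-in for the "existing algorithm" mentioned above (typically rotating calipers over the convex hull), the smallest circumscribed rectangle can be found by a coarse brute-force search over rotation angles; this sketch is an illustration only, with the angular step as an assumed parameter:

```python
import math

# Hedged sketch: rotation angle (degrees) at which the axis-aligned
# bounding box of the specimen's outline points has the smallest area,
# i.e. the angle of the smallest circumscribed rectangle.
def min_rect_angle(points, step=1.0):
    best_angle, best_area = 0.0, float("inf")
    a = 0.0
    while a < 90.0:  # the rectangle orientation repeats every 90 degrees
        t = math.radians(a)
        xs = [x * math.cos(t) - y * math.sin(t) for x, y in points]
        ys = [x * math.sin(t) + y * math.cos(t) for x, y in points]
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if area < best_area:
            best_area, best_angle = area, a
        a += step
    return best_angle, best_area
```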
In step S1401, the control unit 301 makes a determination as to whether or not there are a plurality of specimens on the slide 206. This step is executed in the preliminary measurement. For instance, information about the number of specimens is written or electronically recorded on the label 601 beforehand at the time of preparation of the slide 206, and the information on the label 601 is read in the preliminary measurement to acquire the information about the number of specimens.
The information about the number of specimens is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 acquires the information about the number of specimens from the imaging apparatus 101, the storage device 308, or the apparatus in the network and makes a determination as to whether or not there are a plurality of specimens on the slide 206 on the basis of this information.
Alternatively, an image of the slide 206 captured by imaging in the preliminary measurement may be stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. In this case, the control unit 301 of the image processing apparatus 102 may retrieve the image of the slide 206 from the imaging apparatus 101, the storage device 308, or the apparatus in the network and determine the number of specimens by image processing.
In step S1402, the control unit 301 makes a determination as to whether or not auto-rotation is set to ON in the image presentation mode setting of the image presentation application. Auto-rotation can be set to ON by the user with the display menu described before with reference to
In step S1403, the control unit 301 accepts a user's command for selecting an individual specimen to acquire information about the individual specimen selected by the user. Specifically, the user performs an operation of selecting an individual specimen that he/she wishes to observe using the keyboard 311 and/or the mouse 312 in the window in which the second image is displayed in the application screen. Responsive to the operation, a command for selecting one of the individual specimens is input to the image processing apparatus 102 through the operation I/F 310. The process of selecting an individual specimen will be described later (see
The processing in steps S1401 to S1403 is the same as that in steps S1001 to S1003 in
In step S1404, the control unit 301 retrieves information about the smallest circumscribed rectangle of the individual specimen selected in step S1403. The smallest circumscribed rectangle of each of the individual specimens has been calculated in advance in the preliminary measurement, and information about the smallest circumscribed rectangle is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the smallest circumscribed rectangle from the imaging apparatus 101, the storage device 308, or the apparatus in the network.
In step S1405, the control unit 301 computes the rotation angle of the individual specimen on the basis of the smallest circumscribed rectangle acquired in step S1404. The rotation angle of the individual specimen is computed as such an angle that makes the longer side or the shorter side of the smallest circumscribed rectangle parallel to a side of the window.
In step S1406, the control unit 301 performs drawing processing for drawing the specimen designation frame in the second image and rotation processing for rotating the individual specimen in the third image and then performs processing for presenting an image reflecting the result of above-mentioned drawing processing and rotation processing. The control unit 301 also performs processing for presenting as the first image a magnified image of the region designated by the magnified region designation frame in the third image. Specifically, the control unit 301 retrieves image data of the region designated by the magnified region designation frame in the third image and applies rotation processing on the retrieved image data in accordance with the rotation angle computed in step S1405 to generate the first image.
The first image is a part of the fourth image layer in the multi-layer image shown in
While the embodiment of the present invention configured to rotate the image automatically based on the shape or condition of the specimen has been described in the foregoing with reference to
When “Manual Rotation ON” is set, the user can place the mouse pointer on the image of an individual specimen he/she wishes to rotate and drag the mouse to input a command for rotating the individual specimen designation frame to the image processing apparatus 102.
(Image Presentation with Image Orientation Indicator)
The broken lines indicating the individual specimen selection boundaries 1603 may either be actually displayed in the second image 1601 as shown in
In cases where the size of the second image displayed is relatively small in relation to the entire application screen displayed in the screen of the display apparatus 103 as shown in
Users (i.e. cytotechnologists, pathologists, etc.) have their own preferences in the way of shifting the observation area 1702.
Examples of operations for shifting the observation area 1702 in the image presentation application include dragging with the mouse and operations of arrow keys of the keyboard. Although dragging with the mouse can shift the observation area 1702 in desired directions (e.g. shifts indicated by oblique arrows in
The image presentation application may be configured to allow the user to set and store a preferred scrolling direction in advance and to select it upon observation.
In specimen observation (or screening), users commonly examine a normal area and an abnormal area (lesion) in comparison and contrast with each other. Whether a user begins the observation with a normal area or an abnormal area in the individual specimen varies from user to user. Beginning the observation with a normal area may facilitate recognition of an abnormal area (lesion) in some cases, and beginning the observation (or screening) with an abnormal area may facilitate recognition of a normal area in other cases. Therefore, rotating the image based on the condition of the specimen as described with reference to
For example, if the user likes to begin the observation with an abnormal area and from the upper left part of the specimen, it is preferred to rotate the image of the individual specimen 602 in such a way that a suspected cancerous area 1101 is located at an upper left position as shown in
In the following, a case where image rotation is applied to a depth image will be described as an application of the first embodiment (in which the present invention is applied to a two-dimensional image).
A specimen 1805 is a slice of tissue or a smear of cells to be observed. In
Each image in each layer is constituted by a collection of a certain number of blocks of compressed images. For example, in the case of JPEG compression, each compressed image block is a single JPEG image. In the illustrated case, each image in the first layer depth image group 1801 is composed of one compressed image block, each image in the second layer depth image group 1802 is composed of four compressed image blocks, each image in the third layer depth image group 1803 is composed of 16 (sixteen) compressed image blocks, and each image in the fourth layer depth image group 1804 is composed of 64 (sixty-four) compressed image blocks.
Differences in the resolution are analogous to differences in the optical magnification in the microscope observation. Specifically, observation of an image in the first layer depth image group 1801 displayed on the display apparatus corresponds to microscope observation at a low magnification, and observation of an image in the fourth layer depth image group 1804 displayed on the display apparatus corresponds to microscope observation at a high magnification. For example, if the user wishes to observe a specimen in detail, he/she may cause the display apparatus to display images in the fourth layer depth image group 1804 for observation.
An exemplary case in which image rotation is applied to a certain single depth image will be described with reference to
The imaging apparatus 101 images the slide 206 to capture depth images in the fourth layer. The imaging apparatus 101 or the image processing apparatus 102 performs processing to generate images in the layers of lower resolutions (i.e. the depth images in the first to third layers) from the images in the layer of the highest resolution (i.e. the depth images in the fourth layer). The images in the layers of lower resolutions are stored in the storage device 308.
The second image 904 is an image in the first layer. Specifically, the image having high focusing quality over its entire area is selected as the second image 904 from among the images in the first layer depth image group. The third image 905 is generated from a depth image in a layer having a higher resolution than the first layer, namely a depth image in one of the second, third, and fourth layers. As the third image 905, a depth image having high focusing quality over the individual specimen 602 is selected. Therefore, it is not necessary that the depth of the second image 904 and the depth of the third image 905 be the same. For example, there may be a case where the depth image selected as the second image 904 is the second depth image 1809 while the depth image selected as the third image 905 is the third depth image 1810. In such a case, image rotation processing for the third image 905 is applied to a depth image having a depth different from the depth of the second image 904.
The focusing quality of an image can be determined based on the image contrast. The image contrast E can be calculated by the following equation:
E = Σ(L(m, n+1) − L(m, n))² + Σ(L(m+1, n) − L(m, n))²,
where L(m, n) is the brightness component of each pixel, m represents the position of the pixel with respect to the Y direction, and n represents the position of the pixel with respect to the X direction.
The first term in the right side of the equation represents the brightness difference between adjacent pixels along the X direction, and the second term represents the brightness difference between the adjacent pixels along the Y direction. The image contrast E can be calculated as the sum of squares of the brightness differences between adjacent pixels along the X direction and the Y direction.
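The contrast measure E can be transcribed directly, assuming the image is given as a 2-D list of brightness values L[m][n] with m indexing the Y direction and n indexing the X direction as defined above:

```python
# Direct transcription of the contrast measure E defined above.
def contrast(L):
    rows, cols = len(L), len(L[0])
    e = 0
    for m in range(rows):
        for n in range(cols - 1):      # first term: differences along X
            e += (L[m][n + 1] - L[m][n]) ** 2
    for m in range(rows - 1):
        for n in range(cols):          # second term: differences along Y
            e += (L[m + 1][n] - L[m][n]) ** 2
    return e
```

A uniform image yields E = 0, while a sharply focused image with strong brightness transitions yields a large E.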
As the first image 906, a depth image having high focusing quality in the region designated by the magnified region designation frame 705 in the third image 905 is selected from among the depth images in the fourth layer depth image group having the highest resolution. Therefore, there may be cases where the depth of the first image 906 and the depth of the third image 905 are different from each other. For example, there may be a case where the depth image selected as the first image 906 is the fourth depth image 1811 while the depth image selected as the third image 905 is the third depth image 1810. In such a case, a depth image having a depth different from that of the third image 905 is retrieved to display the first image 906 as a magnified image of the region designated by the magnified region designation frame 705 in the third image 905.
As described above, for displaying the first, second, and third images, images having high image qualities in the respective regions displayed in the first, second, and third images are selected from the images in the depth image groups. The region displayed in the first image is the region designated by the magnification region designation frame in the third image, the region displayed in the second image is the entire area of the slide 206, and the region displayed in the third image is a region containing the entirety of the individual specimen 602 selected in the second image.
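The selection rule described above reduces to picking, among the depth images covering a given region, the one with the highest focus score. A minimal sketch, where `focus_score` is assumed to be a contrast function such as the measure E defined earlier:

```python
# Hedged sketch: among candidate depth images for a region, select the
# index of the one whose focus score (e.g. the contrast measure E) is
# highest.
def select_sharpest(depth_images, focus_score):
    return max(range(len(depth_images)),
               key=lambda i: focus_score(depth_images[i]))
```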
In the above-described illustrative embodiment, the information about the number of specimens on the slide, the specimen shape, the specimen characteristics, the smallest circumscribed rectangle, etc. is obtained by preliminary measurement and stored and held in an apparatus such as the imaging apparatus, the image processing apparatus, or an apparatus on the network. This information may be added to the multi-layer image data as metadata and sent and received together with the multi-layer image data between the imaging apparatus, the image processing apparatus, and apparatuses on the network.
The second embodiment is an application of the present invention to presentation of a three-dimensional specimen image.
The image processing apparatus according to the present invention can be used in an image processing system including an imaging apparatus and a display apparatus. The configuration of the image processing system, the functional blocks of the imaging apparatus in the image processing system, the hardware construction of the image processing apparatus, the functional blocks of the control unit of the image processing apparatus, the structure of multi-layer image data, and the construction of the slide are the same as those described in the description of the first embodiment and will not be described further.
The first embodiment is directed to a flat specimen and is suitably applied to pathological tissue diagnosis. Specimens used in tissue diagnosis are as thin as approximately four micrometers and can be regarded as planar specimens. On the other hand, the second embodiment is directed to a three-dimensional specimen and is suitably applied to pathological cytodiagnosis. Specimens used in cytodiagnosis have thicknesses from a few tens of micrometers to 100 micrometers and can be regarded as three-dimensional specimens. The second embodiment is characterized by its method of image presentation for the cross section (main cross section) that the user wishes to observe and provides advantages in reducing the burden on the user in specimen observation (screening).
For the sake of simplicity, the specimen model 1901 is assumed to be an axisymmetric solid constructed as a combination of cuboids and cones. The main axis 2102 passes through the apexes of the two cones at both ends. The specimen model 1901 has the main axis 2102 and the main plane 2103 as shown in
In the following, there will be described an illustrative case in which an image of the main cross section 2104 of the specimen model 1901 simulating an overlapped cell aggregate is displayed for observation by the user. In the case of a different specimen model, the method of determining a cross section to be displayed for observation by the user would be different. For example, in the case of a three-dimensional specimen in an Indian file arrangement (i.e. cells arranged in a row), the specimen may be projected onto a two-dimensional plane in such a way that its axis (i.e. the main axis in
The second image 2202 is not necessarily an image containing a plurality of individual specimens, but it may be an image containing one individual specimen. In the case where the second image 2202 contains a plurality of individual specimens, the image presentation application may be configured to allow the user to select one of the individual specimens at his/her discretion and to indicate the selected individual specimen by a specimen designation frame.
In
The second image 2202 and the third image 2203 can be considered to be a base image and a derivative image, which are in a first reduction-magnification relationship. The third image 2203 and the first image 2201 can also be considered to be a base image and a derivative image, which are in a second reduction-magnification relationship. Presenting the magnified image of the main cross section of the second image 2202 as the first image 2201 enables efficient observation of the specimen. This method is based on the idea that the main cross section contains a large amount of information about the three-dimensional specimen.
In step S2301, the control unit 301 makes a determination as to whether or not auto-rotation is set to ON in the image presentation mode setting of the image presentation application. Auto-rotation can be set to ON by the user with the display menu described before with reference to
Responsive to the input of the command, the control unit 301 executing the image presentation application sets the rotation mode in the image presentation mode setting and executes processing for drawing the application screen according to the setting. The setting information of the present image presentation mode of the image presentation application is stored in the main memory 302 or the sub-memory 303, and the control unit 301 can make a determination as to the setting of the image presentation mode on the basis of the information stored in the memory.
In step S2302, the control unit 301 accepts a user's command for selecting an individual three-dimensional specimen to acquire information about the individual three-dimensional specimen selected by the user. Specifically, the user performs an operation of selecting an individual three-dimensional specimen that he/she wishes to observe using the keyboard 311 and/or the mouse 312 in the window in which the second image is displayed in the application screen. Responsive to the operation, a command for selecting one of the individual three-dimensional specimens is input to the image processing apparatus 102 through the operation I/F 310. In this embodiment, there are not necessarily a plurality of specimens on the slide 206. If there is only one individual three-dimensional specimen on the slide 206, this step can be skipped.
In step S2303, the control unit 301 acquires information about the position of the geometric centroid of the individual three-dimensional specimen selected in step S2302. The position of the geometric centroid of each of the individual three-dimensional specimens has been computed beforehand, and information about the position of the geometric centroid is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the position of the geometric centroid of each of the individual three-dimensional specimens from the imaging apparatus 101, the storage device 308, or the apparatus in the network.
In step S2304, the control unit 301 acquires information about the main axis of the individual three-dimensional specimen selected in step S2302. The main axis of each of the individual three-dimensional specimens has been computed beforehand, and this information is stored and held in the imaging apparatus 101, the storage device 308, or an apparatus in the network. The control unit 301 of the image processing apparatus 102 retrieves the information about the main axis of each of the individual three-dimensional specimens from the imaging apparatus 101, the storage device 308, or the apparatus in the network.
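The specification states only that the geometric centroid and main axis are computed beforehand; one plausible way such precomputation might look, assuming the specimen is available as a boolean occupancy mask over the depth stack, is a coordinate mean for the centroid and principal component analysis for the main axis. The function names and the mask representation are assumptions made for this sketch.

```python
import numpy as np

def centroid_3d(mask):
    """Geometric centroid (z, y, x) of an individual three-dimensional
    specimen given as a boolean occupancy mask over the depth stack."""
    return np.argwhere(mask).mean(axis=0)

def main_axis_3d(mask):
    """Main axis direction as the principal eigenvector (PCA) of the
    covariance of the specimen's occupied voxel coordinates."""
    pts = np.argwhere(mask).astype(float)
    pts -= pts.mean(axis=0)
    cov = pts.T @ pts / len(pts)
    w, v = np.linalg.eigh(cov)          # eigenvalues in ascending order
    return v[:, -1]                     # direction of largest spread
```

The results would then be stored with the multi-layer image data (for example as metadata, as noted in the first embodiment) so that steps S2303 and S2304 reduce to a lookup.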
In step S2305, the control unit 301 computes the main cross section of the individual three-dimensional specimen from the information about the position of the geometric centroid acquired in step S2303 and the information about the main axis acquired in step S2304.
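One way the main cross section of step S2305 might be parameterized, under the assumption that the desired cutting plane passes through the centroid and contains both the main axis and the depth (Z) direction, is sketched below. The function name and the choice of spanning vectors are assumptions for illustration; a main axis parallel to the depth direction is a degenerate case this sketch does not handle.

```python
import numpy as np

def main_cross_section_frame(centroid, main_axis):
    """Define a main cross-section plane of a 3-D specimen: the plane
    through the centroid that contains the main axis and the depth (Z)
    direction. Returns (origin, u, v, normal): the origin, two in-plane
    unit vectors spanning the plane, and the plane normal.
    Assumes main_axis is not parallel to the Z axis."""
    a = np.asarray(main_axis, float)
    a /= np.linalg.norm(a)
    z = np.array([0.0, 0.0, 1.0])       # depth direction
    n = np.cross(a, z)                  # normal: perpendicular to both
    n /= np.linalg.norm(n)
    v = np.cross(n, a)                  # second in-plane axis
    return np.asarray(centroid, float), a, v, n
```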
In step S2306, the control unit 301 performs processing for drawing the main plane in the second image and processing for forming an image of the main cross section of the individual three-dimensional specimen. The control unit 301 applies the rotation processing described in the first embodiment to the image of the main cross section thus formed to present the resultant image as the third image. Moreover, the control unit 301 performs processing for presenting as the first image a magnified image of the region designated by a magnified region designation frame in the third image. Specifically, the control unit 301 retrieves image data of the region designated by the magnified region designation frame in the third image and applies rotation processing to it to generate the first image.
The first image corresponds to the fourth image layer in the multi-layer image data shown in
According to this embodiment, an image of the three-dimensional specimen in a cross section different from the cross sections of the depth images captured by imaging is presented in a way that can reduce the burden on the user during observation. The depth images captured by imaging are, for example, images of the three-dimensional specimen in cross sections parallel to the surface of the slide, but the cross section of the three-dimensional specimen that the user wishes to observe is not necessarily the same as a cross section in which a depth image is captured.
According to this embodiment, a cross sectional image of the three-dimensional specimen in a cross section different from the cross sections in which the depth images are captured can be generated from the three-dimensional specimen image constructed from the plurality of depth images. Therefore, a cross sectional image of the three-dimensional specimen in a cross section that the user wishes to observe can be presented.
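Generating such a cross-sectional image amounts to resampling the volume formed by the stack of depth images on an arbitrary plane. A minimal sketch using nearest-neighbour sampling is shown below; the function name, the `(x, y, z)` origin convention, and the zero fill outside the volume are assumptions, and a practical implementation would likely use trilinear interpolation instead.

```python
import numpy as np

def slice_volume(volume, origin, u, v, shape):
    """Sample a cross-sectional image from a stack of depth images
    (volume[z, y, x]) on the plane through 'origin' (x, y, z) spanned
    by unit vectors u and v. Nearest-neighbour sampling; points that
    fall outside the volume read as 0."""
    h, w = shape
    origin = np.asarray(origin, float)
    u, v = np.asarray(u, float), np.asarray(v, float)
    out = np.zeros((h, w), dtype=volume.dtype)
    for i in range(h):
        for j in range(w):
            p = origin + (i - h // 2) * v + (j - w // 2) * u
            x, y, z = p.round().astype(int)
            if (0 <= z < volume.shape[0] and 0 <= y < volume.shape[1]
                    and 0 <= x < volume.shape[2]):
                out[i, j] = volume[z, y, x]
    return out
```

Passing the origin, u, and v of the main cross-section plane computed in step S2305 would yield the main cross-sectional image formed in step S2306.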
According to this embodiment, the cross sectional image of the three-dimensional specimen thus generated is rotated based on the shape and/or condition of the specimen, and the rotated image is presented. This leads to a reduction in the number of times of oblique shift of the observation area during screening, whereby user's trouble in operations such as scrolling can be reduced.
In this embodiment, an illustrative case has been described in which a cross sectional image of the three-dimensional specimen in the main cross section is generated from the three-dimensional specimen image for presentation. This illustrative example is based on the assumption that the main cross section contains a large amount of information about the three-dimensional specimen. However, the cross section of the three-dimensional specimen in which an image to be presented is generated is not limited to this.
As the third embodiment, there will be described a case in which the screen of the image presentation application is composed of two windows.
The image processing apparatus according to the present invention can be used in an image processing system including an imaging apparatus and a display apparatus. The configuration of the image processing system, the functional blocks of the imaging apparatus in the image processing system, the hardware construction of the image processing apparatus, the functional blocks of the control unit of the image processing apparatus, the structure of multi-layer image data, and the construction of the slide are the same as those described in the description of the first embodiment and will not be described further.
In the first and second embodiments, the screen of the image presentation application is composed of three windows in which a first image (a magnified image), a second image (an overall image of the slide), and a third image (an image of an individual specimen) are displayed respectively. As the third embodiment, there will be described an exemplary image presentation application whose screen is composed of two windows, in which the first image and the second image are displayed but the third image is not. The method of rotating an image for presentation and the processing implementing it are the same as those described above in the first and second embodiments. Specifically, image rotation is performed based on the shape or condition of the specimen or on the smallest circumscribed rectangle. The third embodiment differs from the first and second embodiments in the method of presentation of the rotated image.
The second image 2402 further allows the user to designate a region of the individual specimen 602 to be displayed as the first image 2401 in a magnified manner. The region thus designated is indicated by a magnified region designation frame 2405. The specimen designation frame 2404 and the magnified region designation frame 2405 are rotated based on the shape, condition, or smallest circumscribed rectangle of the individual specimen 602 and displayed in the rotated orientation as in the above-described embodiments. The first image 2401, which is a magnified image of the region designated by the magnified region designation frame 2405, is also a rotated image generated by rotating an original image.
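The rotation based on the smallest circumscribed rectangle can be sketched as computing the angle that brings the rectangle's longer side onto the horizontal, so that screening can proceed with horizontal scrolling rather than oblique shifts. This is an assumed illustration; the function name `align_rect_horizontal` and the corner-list input are not from the specification.

```python
import numpy as np

def align_rect_horizontal(corners):
    """Given the four corners of a specimen's smallest circumscribed
    rectangle (in display coordinates, in order around the rectangle),
    return the rotation matrix that makes its longer side horizontal,
    together with the corners rotated about the rectangle's center."""
    c = np.asarray(corners, float)
    e1, e2 = c[1] - c[0], c[2] - c[1]        # two adjacent edges
    long_edge = e1 if np.linalg.norm(e1) >= np.linalg.norm(e2) else e2
    theta = -np.arctan2(long_edge[1], long_edge[0])
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    center = c.mean(axis=0)
    return R, (c - center) @ R.T + center
```

The same matrix R would be applied to the specimen designation frame, the magnified region designation frame, and the image data itself, so that the frames and the rotated first image stay consistent.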
In
Therefore, as described with reference to
While
As described above, it is possible to perform image presentation in such a way as to allow the user to see an overall image of the three-dimensional specimen and the cross sectional image displayed as the first image at the same time. This image presentation allows the user to easily know where in a cross section inside the three-dimensional specimen the region displayed as the magnified first image 2401 is located. Furthermore, since the magnified image displayed as the first image 2401 is rotated based on the shape, condition, and/or inclination of a cross section inside the three-dimensional specimen displayed in the second image 2402, the user's trouble in operations for shifting the observation area can be reduced.
In the illustrative case shown in
In this embodiment, the image presentation with only the first image (magnified image) and the second image (image of the slide) can make the window layout of the image presentation application simpler as compared to the first and second embodiments. Moreover, since the oblique shift of the observation area can be reduced by the image rotation based on the shape or condition of the specimen or a cross section of the specimen, the burden on the user can be reduced.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., non-transitory computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2012-287576, filed on Dec. 28, 2012, which is hereby incorporated by reference herein in its entirety.