1. Field of the Invention
The present invention relates to an image processing apparatus, an image processing system, an image processing method, and a program for processing a virtual slide image.
2. Description of the Related Art
A virtual slide system has been popularized which can acquire a virtual slide image by imaging a sample on a slide with a digital microscope, and can display and observe the virtual slide image on a monitor (see Japanese Patent Application Laid-Open No. 2011-118107).
In addition, a document management apparatus capable of discriminating generators of annotations added to document data from each other has been discussed (see Japanese Patent Application Laid-Open No. 11-25077).
Diagnostic criteria and diagnostic classifications are updated as needed and are not unified among countries or areas. When a user views again a sample diagnosed according to an old diagnostic criterion, or reviews for study a sample of another country diagnosed based on a different diagnostic criterion, it may not be possible to determine which diagnostic criterion underlies the contents of an annotation describing the basis of the diagnosis.
The present invention is directed to an image processing apparatus capable of easily discriminating an annotation added according to a different diagnostic criterion.
According to an aspect of the present invention, an image processing apparatus includes a data acquisition unit configured to acquire data of a virtual slide image, and a display control unit configured to display the virtual slide image and a plurality of annotations added to the virtual slide image on a display apparatus, wherein data pieces of the plurality of annotations include diagnostic criterion information of the plurality of annotations, respectively, and wherein the display control unit groups at least two of the plurality of annotations based on the diagnostic criterion information, causes display forms of the plurality of annotations to differ from each other based on the diagnostic criterion information, and displays the plurality of annotations on the virtual slide image.
In the image processing apparatus according to the aspect of the present invention, when an annotation is added, not only the contents of the annotation itself but also the diagnostic criterion information is stored together with them, and the correspondence relation therebetween is prepared as a list. Therefore, the diagnostic criterion can easily be identified when the annotation is indicated.
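For illustration only, the following is a minimal Python sketch of such an annotation list; the class, field names, and sample values are assumptions made for this example and are not prescribed by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    text: str        # contents of the annotation itself
    criterion: str   # diagnostic criterion information
    group_id: int    # group of annotations sharing one criterion

# Contents and diagnostic criterion information are kept together,
# and their correspondence relation is held as a list.
annotation_list = [
    Annotation("stage BII observed", "cancer classification Japanese code I", 0),
    Annotation("stage CII-a observed", "cancer classification international code I", 1),
]

# When an annotation is indicated, its diagnostic criterion is read
# from the same record, so the criterion is easily identified.
for a in annotation_list:
    print(f"{a.text} [criterion: {a.criterion}, group {a.group_id}]")
```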
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
An image processing apparatus according to an exemplary embodiment of the present invention can be used in an image processing system that includes an imaging apparatus and a display apparatus. The image processing system will be described with reference to the corresponding drawing.
The imaging apparatus 101 is a virtual slide apparatus that captures a plurality of two-dimensional images at different positions in a two-dimensional plane direction and outputs digital images. A solid-state image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor is used to acquire the two-dimensional images. Instead of the virtual slide apparatus, the imaging apparatus 101 may be configured by a digital microscope device in which a digital camera is mounted on the eyepiece portion of a general optical microscope.
The image processing apparatus 102 is an apparatus that has, for example, a function for generating, from a plurality of pieces of original image data acquired from the imaging apparatus 101, data to be displayed on the display apparatus 103 in response to a request from a user. The image processing apparatus 102 is configured by a general-purpose computer or a workstation that includes hardware resources such as a central processing unit (CPU), a random access memory (RAM), a storage device, an operation unit, and various interfaces (I/Fs). The storage device is a large-capacity information storage device such as a hard disk drive, and stores the programs and data used to realize each process described below, an operating system (OS), and the like. Each function described above is realized when the CPU loads the necessary program and data from the storage device into the RAM and executes the program. The operation unit includes a keyboard or a mouse and is used by an operator to input various instructions.
The display apparatus 103 displays an observation image which is an arithmetic operation result of the image processing apparatus 102 and includes a cathode-ray tube (CRT) or a liquid crystal display.
The data server 104 stores diagnostic criterion information (data relevant to a diagnostic criterion) that serves as a guideline when a user diagnoses a sample. The diagnostic criterion information is updated as needed to suit the current situation of pathological diagnosis, and the data server 104 updates its storage contents according to that update. The diagnostic criterion information will be described below with reference to the corresponding drawing.
The imaging apparatus 101 generally includes an illumination unit 201, a stage 202, a stage control unit 205, an imaging optical system 207, an imaging unit 210, a development processing unit 219, a pre-measurement unit 220, a main control system 221, and a data output unit 222.
The illumination unit 201 uniformly illuminates, with light, a slide 206 disposed on the stage 202. The illumination unit 201 includes a light source, an illumination optical system, and a light source driving control system. The driving of the stage 202 is controlled by the stage control unit 205, so that the stage 202 can be moved in the three axis directions of XYZ. The slide 206 is a member in which a section of tissue or a smeared cell to be observed is attached onto a slide glass and fixed under a cover glass together with a mounting medium.
The stage control unit 205 includes a driving control system 203 and a stage driving mechanism 204. The driving control system 203 receives an instruction of the main control system 221 and controls the driving of the stage 202. A movement direction, a movement amount, and the like of the stage 202 are determined based on position information and thickness information (distance information) of a sample measured by the pre-measurement unit 220, and an instruction from a user, as necessary. The stage driving mechanism 204 drives the stage 202 in response to an instruction of the driving control system 203.
The imaging optical system 207 is a lens unit that forms an optical image of the sample on the slide 206 onto the image sensor 208.
The imaging unit 210 includes an image sensor 208 and an analog front end (AFE) 209. The image sensor 208 is a one-dimensional or two-dimensional image sensor that converts a two-dimensional optical image into an electrical physical quantity through photoelectric conversion; for example, a CCD or CMOS device is used. A one-dimensional image sensor can acquire a two-dimensional image by scanning in the scan direction. The image sensor 208 outputs an electrical signal having a voltage value corresponding to the intensity of light. When a color image is desired as a captured image, for example, a single-plate image sensor on which a color filter with a Bayer array is mounted may be used. The imaging unit 210 captures divided images of the sample by driving the stage 202 in the XY-axis directions.
The AFE 209 is a circuit that converts the analog signal output from the image sensor 208 into a digital signal. The AFE 209 includes a horizontal/vertical (H/V) driver, a correlated double sampling (CDS) circuit, an amplifier, an analog-to-digital (AD) converter, and a timing generator, which are described below.
The H/V driver converts the vertical synchronization signal and the horizontal synchronization signal for driving the image sensor 208 into the potentials necessary to drive the sensor. The CDS is a correlated double sampling circuit that removes fixed-pattern noise. The amplifier is an analog amplifier that adjusts the gain of the analog signal from which the noise has been removed by the CDS. The AD converter converts an analog signal into a digital signal. When the final-stage output of the imaging apparatus is 8 bits, the AD converter quantizes the analog signal into digital data of about 10 to 16 bits in consideration of processing in the subsequent stage, and outputs the digital data. The converted sensor output data is referred to as RAW data. The RAW data is developed by the development processing unit 219 in the subsequent stage. The timing generator generates a signal for adjusting the timing of the image sensor 208 and the timing of the development processing unit 219 in the subsequent stage.
When a CCD sensor is used as the image sensor 208, the AFE 209 is essential. However, when a CMOS image sensor capable of outputting digital data is used, the function of the AFE 209 described above is included in the sensor itself. Although not illustrated, an imaging control unit that controls the image sensor 208 is also provided, and it controls the operation of the image sensor 208, such as the shutter speed, frame rate, region of interest (ROI), and operation timing.
The development processing unit 219 includes a black correction unit 211, a white balance adjustment unit 212, a de-mosaicing processing unit 213, an image synthesis processing unit 214, a filter processing unit 216, a γ correction unit 217, and a compression processing unit 218. The black correction unit 211 performs a process for subtracting, from each pixel of the RAW data, black correction data obtained at the time of light shielding. The white balance adjustment unit 212 performs a process of reproducing a desirable white color by adjusting the gain of each of the red, green, and blue (RGB) colors according to the color temperature of the light from the illumination unit 201. More specifically, white balance correction is applied to the RAW data that has undergone the black correction. When a monochromatic image is handled, the white balance adjustment process is not necessary.
The de-mosaicing processing unit 213 performs a process for generating image data of each of the RGB colors from the RAW data with the Bayer array. The de-mosaicing processing unit 213 calculates the value of each RGB color of a target pixel by interpolating the values of peripheral pixels (including pixels of the same color and pixels of different colors) in the RAW data. The de-mosaicing processing unit 213 also performs an interpolation process for correcting defective pixels. When the image sensor 208 does not include a color filter and obtains a monochromatic image, the de-mosaicing process is not necessary.
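To make the black correction and white balance steps above concrete, here is a minimal NumPy sketch under assumed array shapes and gain handling; a real pipeline would index the RGB gains by Bayer color site rather than use a flat gain map.

```python
import numpy as np

def develop_raw(raw, black, wb_gain):
    """Black correction followed by white balance on a RAW frame.

    raw     : H x W RAW frame from the image sensor
    black   : H x W black correction data captured with the light shielded
    wb_gain : H x W per-pixel gain map (illustrative; a real pipeline
              applies the RGB gains per Bayer color site)
    """
    corrected = raw - black          # black correction: per-pixel subtraction
    balanced = corrected * wb_gain   # white balance: per-color gain adjustment
    return np.clip(balanced, 0.0, None)

raw = np.random.randint(0, 4096, (4, 4)).astype(np.float64)  # toy 12-bit frame
black = np.full((4, 4), 64.0)                                # fixed-pattern offset
wb_gain = np.ones((4, 4))                                    # unity gains in the sketch
developed = develop_raw(raw, black, wb_gain)
```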
The image synthesis processing unit 214 performs a process for generating large-capacity image data of a desired imaging range by joining divided pieces of image data acquired by the image sensor 208. In general, since the range over which a sample is present is larger than the imaging range that an existing image sensor acquires in one imaging operation, one piece of two-dimensional image data is generated by joining the divided pieces of image data. For example, assuming that a 10 mm square range on the slide 206 is imaged with a resolution of 0.25 μm, the number of pixels on one side is 10 mm/0.25 μm = 40 thousand pixels, and the total number of pixels is the square of that, namely 1.6 billion pixels. To acquire image data of 1.6 billion pixels using the image sensor 208 having 10 M (10 million) pixels, the imaging range must be divided into 1.6 billion/10 million = 160 regions.
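The tile-count arithmetic above can be written out directly; the following short Python sketch simply recomputes the numbers in the example.

```python
side_mm = 10.0               # imaged range on the slide: 10 mm square
resolution_um = 0.25         # resolution: 0.25 um per pixel
sensor_pixels = 10_000_000   # image sensor 208 with 10 M pixels

pixels_per_side = side_mm * 1000.0 / resolution_um  # 40,000 pixels per side
total_pixels = pixels_per_side ** 2                 # 1.6 billion pixels in total
num_regions = total_pixels / sensor_pixels          # 160 divided regions

print(int(pixels_per_side), int(total_pixels), int(num_regions))
# -> 40000 1600000000 160
```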
As methods for joining a plurality of pieces of image data, there are a method of aligning and joining the pieces based on the position information of the stage 202, a method of joining the pieces by matching points or lines that correspond to each other among the divided images, and a method of joining based on the position information of the divided pieces of image data. When the pieces are joined, they can be connected smoothly by interpolation processing such as zero-order interpolation, linear interpolation, or higher-order interpolation. Although the present exemplary embodiment assumes that one piece of large-capacity image data is generated, the image processing apparatus 102 may instead join the divided images when the display data is generated, as a rough sketch of the first method below illustrates.
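The sketch below illustrates the first method (placement based on stage position information) by pasting each divided image at its stage-derived pixel offset, which amounts to zero-order placement without blending; the tile sizes and offsets are assumptions for the example.

```python
import numpy as np

def join_by_stage_position(tiles, offsets, canvas_shape):
    """Join divided images by pasting each tile at the pixel offset
    derived from the stage position (zero-order placement, no blending)."""
    canvas = np.zeros(canvas_shape, dtype=tiles[0].dtype)
    for tile, (y, x) in zip(tiles, offsets):
        h, w = tile.shape[:2]
        canvas[y:y + h, x:x + w] = tile  # later tiles overwrite any overlap
    return canvas

# Four 100 x 100 tiles covering a 200 x 200 area.
tiles = [np.full((100, 100), i, dtype=np.uint8) for i in range(4)]
offsets = [(0, 0), (0, 100), (100, 0), (100, 100)]
stitched = join_by_stage_position(tiles, offsets, (200, 200))
```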
The filter processing unit 216 is a digital filter that realizes suppression of a high-frequency component contained in an image, removal of noise, and emphasis of sharpness.
The γ correction unit 217 performs processing for adding, to an image, the inverse of the gradation expression characteristics of a general display device, or performs gradation conversion suited to human visual characteristics through gradation compression of high-luminance portions or dark-portion processing. According to the present exemplary embodiment, since an image is acquired for the purpose of morphological observation, gradation conversion suitable for the synthesis processing and display processing in the subsequent stages is applied to the image data.
The compression processing unit 218 performs compression encoding to transmit the large-capacity two-dimensional image data efficiently and to reduce its capacity for storage. As methods for compressing a still image, standard encoding schemes such as JPEG (Joint Photographic Experts Group), JPEG 2000, which improves and extends JPEG, and JPEG XR are widely known.
The pre-measurement unit 220 performs preliminary measurement to calculate the position information of the sample on the slide 206, distance information to a desired focal position, and parameters for light-quantity adjustment attributable to the thickness of the sample. By causing the pre-measurement unit 220 to acquire this information before the main measurement (the acquisition of captured image data), imaging can be performed without waste. A two-dimensional imaging sensor with a resolution lower than that of the image sensor 208 is used to acquire the position information in the two-dimensional plane, and the pre-measurement unit 220 determines the position of the sample on the XY plane from the acquired image. A laser displacement meter or a Shack-Hartmann measuring device is used to acquire the distance information and the thickness information.
The main control system 221 has a function for controlling the various units described above. The control functions of the main control system 221 and the development processing unit 219 are realized by a control circuit that includes a CPU, a ROM, and a RAM. More specifically, a program and data are stored in the ROM, and the functions of the main control system 221 and the development processing unit 219 are realized by causing the CPU to execute the program using the RAM as a work memory. For example, a device such as an electrically erasable and programmable read only memory (EEPROM) or a flash memory is used as the ROM. For example, a dynamic random access memory (DRAM) device such as double data rate 3 (DDR3) is used as the RAM. The function of the development processing unit 219 may be substituted with a unit configured by an application specific integrated circuit (ASIC) as a dedicated hardware device.
The data output unit 222 is an interface configured to transmit RGB color images generated by the development processing unit 219 to the image processing apparatus 102. The imaging apparatus 101 and the image processing apparatus 102 are connected to each other by an optical communication cable. Alternatively, a general-purpose interface such as a Universal Serial Bus (USB) or a Gigabit Ethernet (registered trademark) is used.
The image processing apparatus 102 includes an image data acquisition unit 301, a memory storage unit (memory) 302, a user input information acquisition unit 303, an annotation data generation unit 305, a diagnostic criterion information acquisition unit 306, a display data generation control unit 309, a display image data acquisition unit 310, a display data generation unit 311, and a display data output unit 312.
The image data acquisition unit 301 acquires image data captured by the imaging apparatus 101. The image data described here refers to at least one of divided image data pieces of RGB colors obtained by dividing and imaging a sample and one piece of two-dimensional image data obtained by synthesizing the divided image data pieces. The divided image data pieces may be monochromatic image data.
The memory storage unit 302 imports, records, and stores image data acquired from an external device via the image data acquisition unit 301.
The user input information acquisition unit 303 acquires, via an operation unit such as the mouse or keyboard, instructions to update the display image data, such as a change of the display position or enlarged or reduced display, and information input to a display application used at the time of diagnosis, such as the addition of an annotation. Further, the user input information acquisition unit 303 acquires an instruction from a user to acquire diagnostic criterion information.
The diagnostic criterion information acquisition unit 306 acquires, from the data server 104, the diagnostic criterion information serving as a guideline for a user who performs diagnosis. Since the latest diagnostic criterion information is stored in the data server 104, the corresponding diagnostic criterion information is acquired in response to a diagnostic criterion information acquisition instruction from the user. The diagnostic criterion information will be described below with reference to the corresponding drawing.
The annotation data generation unit 305 generates, as a list, the text information input as an annotation and obtained by the user input information acquisition unit 303, together with the diagnostic criterion information acquired by the diagnostic criterion information acquisition unit 306. The structure of the list will be described below with reference to the corresponding drawing.
The display data generation control unit 309 corresponds to a display control unit that controls generation of the display data in response to a user's instruction acquired by the user input information acquisition unit 303. The display data is mainly formed by the image data and annotation display data.
The display image data acquisition unit 310 acquires the image data necessary for display from the memory storage unit 302 under the control of the display data generation control unit 309.
The display data generation unit 311 generates the display data to be displayed on the display apparatus 103 based on an annotation data list generated by the annotation data generation unit 305 and the image data acquired by the display image data acquisition unit 310.
The display data output unit 312 outputs the display data generated by the display data generation unit 311 to the display apparatus 103 which is an external device.
The CPU 401 appropriately accesses the RAM 402 and the like as necessary, and comprehensively controls all of the blocks of the PC while performing various arithmetic operations. The RAM 402 is used as a work area of the CPU 401 and temporarily stores the OS, the various programs in progress, and various data to be processed, for example, in the generation of display data. The storage device 403 is an auxiliary storage device that records and reads information, and fixedly stores the OS, firmware such as the programs to be executed by the CPU 401, and various parameters. A magnetic disk drive such as a hard disk drive (HDD), or a semiconductor device using a flash memory such as a solid-state drive (SSD), is used as the storage device 403.
The data server 104 is connected to the data input and output I/F 405 via the LAN I/F 406. The display apparatus 103 is connected to the data input and output I/F 405 via a graphics board. The imaging apparatus 101 is connected to the data input and output I/F 405 via an external device I/F 408. A keyboard 410 or a mouse 411 is connected to the data input and output I/F 405 via an operation I/F 409.
The display apparatus 103 may include, for example, a liquid crystal or electro-luminescence (EL) display device. The display apparatus 103 is assumed to be connected as an external device; however, the PC may be integrated with the display apparatus, as in a notebook PC, for example.
The keyboard 410 and a pointing device such as the mouse 411 are assumed as devices connected to the operation I/F 409. However, the screen of the display apparatus 103 may also serve as a direct input device such as a touch panel, in which case the touch panel is integrated with the display apparatus 103.
A processing flow of the addition and indication of an annotation in the image processing apparatus according to the exemplary embodiment will be described with reference to a flowchart in the corresponding drawing.
In step S501, image data to be displayed on the display apparatus is acquired from the memory storage unit 302.
In step S502, display data to be displayed on the display apparatus 103 is generated based on the acquired image data. The generated display data is displayed on the display apparatus 103.
In step S503, it is determined whether the displayed screen is to be updated in response to an instruction from a user. More specifically, an update of the screen includes, for example, a change in the display position to display image data located outside the displayed screen. If it is necessary to update the screen (YES in step S503), the processing returns to step S501, and the screen is updated through the acquisition of image data and the generation of display data. If a screen update request is not received (NO in step S503), the processing proceeds to step S504.
In step S504, it is determined whether an instruction or a request to add an annotation is received from a user. If the instruction to add the annotation is received (YES in step S504), the processing proceeds to step S505. Whereas, if the instruction to add the annotation is not received (NO in step S504), the processing for adding the annotation is skipped, and the processing proceeds to step S506.
In step S505, various types of processing associated with the addition of the annotation are performed. The processing includes, for example, storing the contents of the annotation input using the keyboard 410 or the like, linking the contents of the annotation with the diagnostic criterion information, which is the characteristic feature of the present exemplary embodiment, and displaying a warning when an annotation is added according to a different diagnostic criterion. The details of the processing will be described below with reference to the corresponding drawing.
In step S506, it is determined whether a request for indicating the added annotation is made. If the request for indicating the annotation is made from the user (YES in step S506), the processing proceeds to step S507. Whereas, if the request is not made (NO in step S506), the processing returns to step S503, and the subsequent processing is repeated. The processing flow is chronologically described. However, reception of the screen update request which is the change in the display position, the addition of the annotation, and the indication of the annotation may be performed at any timing, for example, simultaneously or sequentially.
In step S507, processing for effectively indicating the annotation to the user is performed in response to the indication request. The details of the processing in step S507 will be described below with reference to the corresponding drawing.
The addition of the annotation and the indication of the annotation are thus performed according to the above-described processing flow.
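As one way to read steps S501 to S507 together, the following Python sketch arranges them as a single event loop; every method name on the assumed app object is hypothetical and only mirrors the flowchart, not the embodiment's actual interfaces.

```python
def main_loop(app):
    """Event-loop sketch of steps S501 to S507 (hypothetical app object)."""
    image = app.acquire_image_data()                # S501: acquire image data
    app.show(app.generate_display_data(image))      # S502: generate and display
    while True:
        if app.screen_update_requested():           # S503: screen update?
            image = app.acquire_image_data()        # back through S501 and S502
            app.show(app.generate_display_data(image))
            continue
        if app.annotation_add_requested():          # S504: add annotation?
            app.add_annotation()                    # S505 (see the next flow)
        if app.annotation_indicate_requested():     # S506: indicate annotation?
            app.indicate_annotations()              # S507: effective indication
```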
In step S601, it is determined whether the diagnostic criterion is changed. If the diagnostic criterion is changed (YES in step S601), the processing proceeds to step S602. Whereas, if the diagnostic criterion is not changed (NO in step S601), the processing skips step S602 and proceeds to step S603. A situation in which the diagnostic criterion is not changed includes a case in which the diagnostic criterion does not need to be changed from the initial setting when an annotation is first added. A situation in which the diagnostic criterion is changed includes a case in which a user views again a sample diagnosed according to an old diagnostic criterion, and a case in which a user reviews, for study, a sample of another country according to a different diagnostic criterion.
The diagnostic criterion is established by collecting the diagnostic classifications of the respective organs and is in line with the current conditions of each country and area. A diagnostic classification indicates a stage of disease of each organ. For example, in the case of stomach cancer, the diagnostic classification defined by a cancer classification code α, which is the diagnostic criterion of a given area, may differ from the diagnostic classification defined by a cancer classification code β, which is the diagnostic criterion of another area. The diagnostic criterion and the diagnostic classification will be described below with reference to the corresponding drawing.
In step S602, the diagnostic criterion is set. The diagnostic criterion is set according to the current conditions of the user, such as the area where the user makes a diagnosis or the recommendation of a community to which the user belongs.
In step S603, it is determined whether the addition of the annotation based on the different diagnostic criterion is performed on the same sample. If the addition of the annotation is performed based on the different diagnostic criterion (YES in step S603), the processing proceeds to step S604. Whereas, if the addition of the annotation is performed based on the same diagnostic criterion (NO in step S603), the processing proceeds to step S605 while skipping step S604.
In step S604, a warning is displayed against the addition of an annotation based on a different diagnostic criterion to the same sample. (That is, when an annotation based on a second diagnostic criterion, which is different from a first diagnostic criterion, is input to a virtual slide image to which an annotation based on the first diagnostic criterion has been added, the warning is displayed on the display apparatus.) The display of the warning will be described below with reference to the corresponding drawing.
In step S605, diagnostic criterion information is acquired. Since the diagnostic criterion information updated according to the current situation of pathological diagnosis is stored in the data server 104, the image processing apparatus 102 acquires the diagnostic criterion set by the user in step S602 from the data server 104.
In step S606, the diagnostic classification according to the diagnostic criterion is displayed. When it is determined in step S603 that the addition of the annotation is performed on the same sample based on a different diagnostic criterion, the correspondence of the diagnostic classifications is displayed in response to an instruction from the user. (That is, when an annotation based on the second diagnostic criterion, which is different from the first diagnostic criterion, is input to a virtual slide image to which an annotation based on the first diagnostic criterion has been added, the correspondence of the diagnostic classifications between the first diagnostic criterion and the second diagnostic criterion is displayed on the display apparatus.) The display of the correspondence of the diagnostic classifications will be described below with reference to the corresponding drawing.
In step S607, the contents of the annotation input with the keyboard 410 are acquired. The acquired text information is used when the annotation is indicated.
In step S608, annotation data is generated based on the diagnostic criterion set in step S602 and the text information input as the annotation acquired in step S607.
In step S609, if the annotation is added for the first time, an annotation data list is newly generated based on the annotation data generated in step S608. If the annotation data list is already present, the values and contents of the list are updated. The information stored in the list is the diagnostic criterion information and the text information input as the annotation. The structure of the annotation data list will be described below with reference to the corresponding drawing.
The addition of the annotation and the generation of the annotation data are performed according to the above-described processing flow.
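Read as code, steps S601 to S609 might look like the following Python sketch; the app and server method names are assumptions introduced for this illustration and do not name actual components of the embodiment.

```python
def add_annotation(app, server):
    """Sketch of steps S601 to S609 (hypothetical app/server objects)."""
    if app.criterion_change_requested():                    # S601: criterion changed?
        app.current_criterion = app.select_criterion()      # S602: set the criterion
    if app.same_sample_has_other_criterion(app.current_criterion):  # S603
        app.show_warning("An annotation based on a different "
                         "diagnostic criterion exists for this sample.")  # S604
    info = server.fetch_criterion_info(app.current_criterion)  # S605: from data server
    app.display_classification(info)                         # S606: show classification
    text = app.read_annotation_text()                        # S607: keyboard input
    entry = {"text": text, "criterion": app.current_criterion}  # S608: annotation data
    app.annotation_list.append(entry)                        # S609: create/update list
```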
The sample thumbnail image 703 indicates the position and size of the display area 705 of the sample image data within the entire image of the sample. The position and size can be grasped from the frame of the detailed display area 704. The detailed display area 704 can be set directly by a user's instruction from an input device such as a touch panel or the externally connected mouse 411, and can be set and updated in response to an operation for moving, enlarging, or reducing the display area of the displayed image. The sample image data for detailed observation is displayed in the display area 705 of the sample image data. In the display area 705, an enlarged or reduced image is displayed by moving the display area (i.e., selecting and moving a partial area to be observed in the entire sample image) or by changing the display magnification in response to an operation and instruction from the user.
The diagnostic criterion and the diagnostic classification, as well as the structure of the annotation data list, are illustrated in the corresponding drawings.
Since the structure of the annotation data list includes the group ID, an expression method of indicating an annotation can be changed and displayed for each group ID.
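For instance, the display form could be switched per group ID as in this minimal sketch; the style table, the draw callback, and the group_id and position fields are assumptions made for the example.

```python
# Display form per group ID; the styles here are illustrative assumptions.
GROUP_STYLES = {0: "red", 1: "blue"}  # e.g., one color per criterion group

def display_annotations(annotation_list, draw):
    """Draw each annotation with the display form of its group."""
    for a in annotation_list:
        style = GROUP_STYLES.get(a["group_id"], "black")
        draw(text=a["text"], position=a["position"], color=style)

annotations = [
    {"text": "stage BII", "group_id": 0, "position": (120, 80)},
    {"text": "stage CII-a", "group_id": 1, "position": (200, 150)},
]
display_annotations(annotations, draw=lambda **kw: print(kw))
```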
A diagnostic classification correspondence 1001 indicates a correspondence between the stage of progression (1002) of stomach cancer of the cancer classification Japanese code I and the stage of progression (1003) of stomach cancer of the cancer classification international code I. The stage of progression 1002 is classified into five stages, BI to BV. On the other hand, the stage of progression 1003 is classified into eight stages, CI to CV. The stage BII is further classified into stages CII-a and CII-b. Likewise, the stage BIII is further classified into stages CIII-a and CIII-b, and the stage BIV is further classified into stages CIV-a and CIV-b. The diagnostic classification correspondence 1001 may be displayed in the information area 702 or a separate window, and a user can operate to display or hide the diagnostic classification correspondence 1001, as necessary.
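The correspondence 1001 described above maps naturally onto a small lookup table; the following Python sketch encodes exactly the stage correspondence given in the text.

```python
# Stage of progression 1002 (Japanese code I) -> stage of progression
# 1003 (international code I), as described for the correspondence 1001.
CLASSIFICATION_CORRESPONDENCE = {
    "BI": ["CI"],
    "BII": ["CII-a", "CII-b"],
    "BIII": ["CIII-a", "CIII-b"],
    "BIV": ["CIV-a", "CIV-b"],
    "BV": ["CV"],
}

def corresponding_stages(stage_b):
    """Return the international-code stages matching a Japanese-code stage."""
    return CLASSIFICATION_CORRESPONDENCE.get(stage_b, [])
```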
The diagnostic classification correspondence screen is not limited to this example. With regard to two diagnostic criteria that do not clearly have a correspondence relation, a text sentence may be displayed to inform the user of that fact. Alternatively, only a difference in the diagnostic criteria which requires special attention may be extracted and displayed in a text sentence.
Accordingly, when a user performs diagnosis based on the different diagnostic criterion or the user confirms a sample diagnosed based on the different diagnostic criterion, the user can easily comprehend the correspondence relation.
As described above, by linking with an electronic medical record system, a user can easily confirm the correspondence between the sample image and the text information about the diagnostic findings based on the different diagnostic criterion.
According to the present exemplary embodiment, not only the contents of an annotation but also the diagnostic criterion information is stored together with them, and the correspondence relation therebetween is prepared as a list at the time of adding the annotation, so that a user can easily discriminate the diagnostic criterion when the annotation is indicated.
In addition, the present exemplary embodiment can easily deal with the update of the diagnostic criterion by updating the contents of a database thereof.
Further, according to the present exemplary embodiment, a warning is issued to the user that an annotation is added based on the different diagnostic criterion, so that the user can avoid adding an annotation based on the different diagnostic criterion which the user does not intend to apply. Furthermore, when the user intentionally adds an annotation based on the different diagnostic criterion, the user can be clearly reminded of the intention.
Furthermore, according to the present exemplary embodiment, by indicating the diagnostic classification correspondence, the user can easily comprehend the correspondence relation, for example, when the user performs diagnosis based on the different diagnostic criterion or when the user confirms the sample diagnosed based on the different diagnostic criterion.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or an MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment. For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
This application claims priority from Japanese Patent Application No. 2011-286782 filed Dec. 27, 2011, which is hereby incorporated by reference herein in its entirety.