Digital microscope.
In various settings, examination of biological specimens is required for diagnostic purposes. Generally speaking, pathologists and other diagnosticians collect samples from patients and use microscopes and other devices to assess the samples at the cellular level. Numerous steps are typically involved in pathology and other diagnostic processes, including collecting biological samples such as blood and tissue, processing the samples, preparing microscope slides, staining, examination, re-testing or re-staining, collecting additional samples, re-examining samples, and ultimately offering diagnostic findings.
The examination of a biological sample generally involves magnification of the sample, or a region of interest of the sample, and an assessment by a pathologist or diagnostician. Traditionally, this is done by placing a slide containing a sample on a microscope and examining a magnified view of the tissue sample, or a region of interest of the tissue sample, through the microscope. More recently, digital microscopes have been developed in which a sample, particularly a sample on a microscope slide, is placed in an instrument and a magnified digital image of the sample or a region of interest of the sample is captured and displayed on a monitor, such as a thin-film liquid crystal display monitor. While viewing a sample or region of interest of a sample on a display rather than through the lens of a microscope may benefit the pathologist or other diagnostician, the time it takes to scan a magnified image and display that image often presents an inconvenient or significant delay, particularly when multiple samples need to be processed (magnified).
Connected to computer 110 is display 120, configured to display information transmitted from computer 110. Display 120 is, for example, a thin-film liquid crystal display monitor based on S-IPS or PVA technology. A 24-inch or greater color monitor is a representative example. Alternatively, two, three or more displays can be connected to computer 110 to provide a user with more information in a more structured way. For example, one display may provide the image of a sample having a hematoxylin and eosin (H&E) stain, while another shows images of the same case using a different type of staining method, and a third may show clinical data from other disciplines such as clinical chemistry, hematology or radiology. Also connected to computer 110 are keyboard 130, mouse 1400A and mouse 1400B. In one embodiment, mouse 1400A is a conventional two-dimensional mouse and mouse 1400B is a three-dimensional mouse such as the 3DConnexion Space Navigator™. Three-dimensional mouse 1400B may, for example, be used to navigate the environment and mouse 1400A may, for example, be used to select, create or edit.
Computer 110 may have an intranet and/or internet connection 145 to allow remote operation of system 100 and/or to connect to a network operating system.
Connected to computer 110 in this embodiment of system 100 is digital microscope 150. Digital microscope 150 may include one or several imaging systems including sensor 160 and sensor 165, optical imaging sub-system 168 and optical imaging sub-system 170, autofocus optics and illumination. Each imaging system may have a different optical resolution or range of resolutions. At least one optical system may reach a magnification m≤1. The system may also provide high resolution with magnifications m>1. The digital microscope also includes stage 180, translatable in x-, y- and z-directions, and control electronics.
Digital microscope 150 may be operated as a bright field and/or fluorescence microscope. In the case of a bright field operation, a sensor or sensors sense(s) the absorption of the sample and captures an image of the sample on the stage 180 with a light source on the opposite side of sensor(s) with respect to the sample. As shown in
In one embodiment of an imaging system, sensor 160 is configured to sense and capture an image of a sample, such as an image of a slide or a portion of a slide on stage 180. Optical imaging sub-system 168 in digital microscope 150 includes lens or objective 1680 that focuses light from light source 195 of an illumination sub-system on sensor 160. Light from light source 195 is emitted through an opening in stage 180, through a slide on stage 180. Mirror 1682 of optical imaging sub-system 168 directs the light to lens or objective 1680. Sensor 160 may capture such an image by sensing the image without magnification (m=1) or with a magnification less than one (m<1) through optical imaging sub-system 168. In one embodiment, the locations of optical imaging sub-system 168 and sensor 160 are fixed. Mirror 1682 may be moved in an x- and a y-direction by xy-stepper motor 172 and in a z-direction by z-stepper motor 174.
Computer 110 receives signals representative of a sensed image from sensor 160 and generates an image for display and displays such generated image on display 120.
In one embodiment, sensor 165 is similar to sensor 160. Sensor 165 is configured to capture an image of a sample, such as an image of a slide or a portion of a slide on stage 180. Sensor 165 captures such an image through optical imaging sub-system 170 (m>1). Optical imaging sub-system 170 in digital microscope 150 may include multiple objectives. Objective 1700A, objective 1700B and objective 1700C are shown. Objective 1700A is, for example, an infinity-corrected type from Carl Zeiss having a magnification of 2.5×. Objective 1700B is, for example, an infinity-corrected type from Carl Zeiss having a 20 times (20×) magnification. Objective 1700C is, for example, a Carl Zeiss A-plan objective having a 40 times (40×) magnification. Interchanging these objectives yields different optical systems, where each system results in a different pixel resolution (e.g., two microns for 2.5× magnification and 250 nanometers for 20× magnification). As needed, other objectives can be substituted or more objectives can be added. The individual objectives are movable in an x- and a y-direction by xy-stepper motor 172, allowing a particular objective to be associated with sensor 165 and a slide on stage 180 when desired. Representatively, the objectives and mirror 1682 may individually be connected to a track and motorized to move along the track, actuated into position when desired.
Disposed between sensor 165 and optical imaging sub-system 170, in one embodiment, may be automatic focusing system 167 including a beam splitter, an autofocus detector and autofocus illumination. An infrared filter and tube lens may also be disposed between sensor 165 and optical imaging sub-system 170, such as between automatic focusing system 167 and sensor 165.
In one embodiment, when capturing images through optical imaging sub-system 170, microscope 150 uses light source 196 of illumination sub-system positioned beneath stage 180. Light source 196 may be similar to light source 195. Associated with light source 196 (i.e., disposed between light source 196 and stage 180) and included in the illumination sub-system are motorized apertures or diaphragms 197 providing Köhler illumination that improves the specimen illumination.
Computer 110 receives signals representative of a sensed image from sensor 165 and generates an image for display. The generated image is displayed on display 120.
In the above-described embodiment, multiple sensors (sensor 160, sensor 165) are described to capture an image of a sample on a slide. In another embodiment of an imaging system, system 100 includes a single sensor configured to capture an image of a slide or a portion of a slide without magnification (m=1) or with a magnification less than one (m<1) and to capture an image or portion of an image through magnifying optical imaging sub-system 170 (m>1). In this embodiment, a single sensor may be utilized in connection with interchangeable optics (e.g., optical imaging sub-systems 168 and 170). Similarly, in the above embodiment, instead of light source 195 for optical imaging sub-system 168 and light source 196 for optical imaging sub-system 170, a single light source may be used for both imaging systems.
In one embodiment, digital microscope 150 includes control unit 175. Control unit 175 is connected to computer 110. Control unit 175 is also connected to the various components of digital microscope 150 to control an operation of digital microscope 150 based on signals received from computer 110. Control unit 175 representatively controls xy-stepper motor 172, z-stepper motor 174, light source 195, light source 196, motorized apertures or diaphragms 197, optical imaging sub-system 168, optical imaging sub-system 170, sensor 160 and sensor 165.
Referring to digital microscope 150 operated as a bright field microscope, in one embodiment, where a tissue sample is on a slide having a label with patient identification information and/or other information printed on it, including, for example, the type of stain or process to which the sample was subjected, digital microscope 150 can sense and capture such information. Using a light source beneath the label to illuminate it, however, may not make the label information visible. Accordingly, in one embodiment, second light source 198 is utilized to illuminate the slide or a label portion of the slide so that the data on a slide label may be sensed via reflectance. An image of the label may be captured with, for example, sensor 160 and optical imaging sub-system 168.
Referring again to
Referring again to digital microscope 150 of system 100, the microscope includes stage 180. In one embodiment, stage 180 is sized to handle one or more slides. In one embodiment, a slide tray including four slides may be contained on stage 180.
Referring to
Referring to
In operation, digital microscope 150 uses one of sensors 160 and 165 to sense and capture images of a sample or a region of interest of a sample on the slide. The sensor captures slide images of the sample and transmits those images as digital signals to computer 110, and such signals are displayed on display 120. In one embodiment, when capturing an image and displaying that image on display 120, it may not be necessary to store the image for future retrieval. Instead, the image is transmitted from sensor 160 or sensor 165 to computer 110 and, absent some instruction from a user or from the system to take another action, the images are refreshed, representatively at a refresh rate on the order of several images per second. The refresh rate may vary. If the microscope is not performing any action, for example, there is no need to refresh the image.
In one embodiment, sensor 160 captures an image of a sample on a slide with a magnification of one or less (m≤1). In other words, where a magnification is less than one (m<1), optical imaging sub-system 168 projects an unmagnified or demagnified image of the sample on the sensor. Representatively, sensor 160 is smaller than a slide (e.g., a sensor is approximately 3 to 4 millimeters in diameter while a slide is approximately 25 millimeters by 76 millimeters). Optical imaging sub-system 168 includes an objective that projects a larger field of view on sensor 160.
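Because the sensor is much smaller than the slide, the demagnification needed to project a given field of view onto the sensor follows directly from the two sets of dimensions. A minimal sketch using the representative dimensions above (the function name is hypothetical and not part of the described system):

```python
def demagnification_for_fov(sensor_mm, target_mm):
    """Return the magnification m (m < 1 demagnifies) that fits the
    target field of view onto the sensor in both dimensions."""
    # The more constrained axis determines the usable magnification.
    return min(sensor_mm[0] / target_mm[0], sensor_mm[1] / target_mm[1])

# A ~3 mm x 4 mm sensor imaging a 25 mm x ~25.3 mm area
# (roughly a third of a 25 mm x 76 mm slide).
m = demagnification_for_fov((3.0, 4.0), (25.0, 25.3))
```

With these illustrative numbers the limiting axis is the 25 mm slide width against the 3 mm sensor dimension, giving m = 0.12 (a demagnification).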
In one embodiment, system 100 creates an overview image of an entire sample on a slide or a portion of an entire sample. The overview image is an image captured without magnification as described above (i.e., a magnification of one or less than one). An advantage to capturing an overview image is the speed at which it can be captured. For example, an image of a full slide can be captured on the order of one to two seconds, while capturing a magnified image may take on the order of 20 seconds or more.
As noted above, a sensor is smaller than a slide and typically smaller than a sample or a portion of a sample on the slide. In order to obtain an acceptable resolution of an image, such as an overview image, an area that individual pixels of the sensor represent is reduced. In one embodiment, to obtain an acceptable overview image of a sample on the slide, sensor 160 will take multiple images and stitch those images together. For example, in one embodiment, a slide or an image sample such as an entire sample on a slide is divided into thirds with a sensor capturing light through a third of the desired area for the overview image (e.g., a third of the usable area of a slide). To coordinate the capture of light representative of a third of a sample, stage 180, in one embodiment, is moved to a desired portion within the field of view of sensor 160.
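The three-capture scheme above can be sketched as stepping the stage so that each exposure covers a successive third of the slide. The helper below is a hypothetical illustration of the stage positions involved, not the system's actual control code:

```python
def strip_origins(slide_len_mm, n_strips):
    """Stage positions (mm along the slide's long axis) placing the
    sensor's field of view over successive strips of the slide."""
    step = slide_len_mm / n_strips
    return [i * step for i in range(n_strips)]

# A standard 76 mm slide captured in three strips.
origins = strip_origins(76.0, 3)
```

In practice each strip would be widened slightly so adjacent captures overlap, as described below for stitching.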
Referring to
In one embodiment, microscope 150 and system 100 are calibrated using a reference slide carrier such that the nominal positions of slides are known within a defined tolerance. The defined tolerance is a result of the xy coordinate system of stage 180 (±p); the mechanical tolerances of slide carrier 210 and its position when inserted into microscope 150 (±q); and the mechanical tolerances of the slide cavities in slide carrier 210 that accept slides (±r). The defined tolerance considering these factors is p+q+r. An overview image of a slide, in one embodiment, consists of three overlapping images, with the field of view of each image and the overlap selected to accommodate the defined tolerance and to collectively capture an image of the entire slide. In other words, since the images obtained by sensor 160 will be stitched, in one embodiment, stage 180 is translated from one field of view of sensor 160 to a different field of view such that the different field of view overlaps another field of view (e.g., a previously imaged field of view). In one embodiment, the overlap is at least 20 pixels more than the maximum tolerance of the stage, slide carrier and cavities within the slide carrier (e.g., 20 to 50 pixels).
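The overlap rule above — combined tolerance p+q+r expressed in pixels, plus a margin of at least 20 pixels — can be sketched as follows. The tolerance values are illustrative assumptions, not figures from the text; only the 5.4 μm pixel size and the 20-pixel margin come from the description:

```python
def required_overlap_px(p_mm, q_mm, r_mm, pixel_um, margin_px=20):
    """Overlap (in overview-image pixels) between adjacent fields of
    view: the combined positioning tolerance p+q+r converted to
    pixels, plus a fixed stitching margin."""
    tolerance_mm = p_mm + q_mm + r_mm
    tolerance_px = tolerance_mm * 1000.0 / pixel_um  # mm -> um -> pixels
    return int(round(tolerance_px)) + margin_px

# e.g. 0.05 mm stage, 0.05 mm carrier and 0.05 mm cavity tolerances
# (illustrative), with 5.4 um overview pixels.
overlap = required_overlap_px(0.05, 0.05, 0.05, 5.4)
```

With these assumed tolerances the combined 0.15 mm error corresponds to about 28 pixels, so roughly 48 pixels of overlap would be requested.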
In the description of image capture of adjacent images, in one embodiment, the system requires overlap of the captured images. Overlap is shown in
As noted above, in one embodiment, the stitching of portions of an image to assemble the overview image is performed as the images are captured. Although
System 100 can establish approximately where a sample is on a slide, as well as the location of significant features, based on the field of view images used to assemble an overview image. For example, each pixel of an overview image represents a specific area, e.g., 5.4 μm×5.4 μm. Further, the assembled image has the pixel dimensions of sensor 160, which can be described in terms of x- and y-coordinates, e.g., 2504 pixels in the x-direction by 3324 pixels in the y-direction. With this information, a selection by mouse 1400A or mouse 1400B of a position in the overview image is a selection of a pixel or pixels in that image. Since the size of each pixel is known, system 100 can determine the number of pixels in the x-direction and the y-direction by which a selected position (e.g., the position of a feature of the image) is offset from a starting position, e.g., an edge of the image. An approximate coordinate system for the image may thus be established such that system 100 can identify the location of an area including a feature represented by a particular pixel or pixels.
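The pixel-to-position arithmetic just described is simply the pixel index times the known pixel size. A minimal sketch (the 5.4 μm pixel size is the example value from the text; the function name is hypothetical):

```python
def pixel_to_offset_um(px, py, pixel_um=5.4):
    """Physical offset (um) of a selected pixel from the image origin,
    e.g., the top-left edge of the overview image."""
    return (px * pixel_um, py * pixel_um)

# A mouse selection landing on pixel (1000, 500) of the overview image.
dx, dy = pixel_to_offset_um(1000, 500)
```

The selected feature would thus sit roughly 5.4 mm and 2.7 mm from the image edge, which is enough to position the stage for a magnified capture.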
As noted above, generally a slide has a label fixed to a surface. In one embodiment, it is desirable to have an overview image including not only the sample on the slide but also the slide label. Because a slide label will obstruct light introduced in digital microscope 150 from below the slide, digital microscope 150 includes sensor 198 that captures an image of the slide label in reflectance.
Where sensor 160 and optical imaging sub-system 168 are positioned over a sample on a slide to capture an image of that slide, a user of system 100 can electronically “zoom” in to increase the resolution. In one embodiment, system 100 initially captures an overview image of a sample on a slide (e.g., an overview image of the entire sample) with sensor 160 and an image of the slide label. The initial overview image may be stitched together as described above. In one embodiment, the initial image is displayed at a relatively large sensor to display pixel ratio (multiple pixels of sensor 160 mapped to a pixel of display 120). It is appreciated that pixels on display 120 are generally larger than those on sensor 160. For example, pixels on sensor 160 have a size on the order of five microns while the pixel size of display 120 is on the order of 0.5 millimeters.
In one example, the initial overview image is displayed at a sensor to display pixel ratio of four to one or greater. A user then uses mouse 1400A to select a region of interest in a sample. The user can then zoom in to increase the image resolution at that particular point or that particular region of interest and/or increase the magnification. Representatively, to zoom in electronically at a selected region of interest (selected by mouse 1400A and located by system 100 as described above), a user directs mouse 1400B to zoom in and, in response, system 100 will modify a sensor to display pixel ratio from, for example, four to one toward one to one or more (i.e., map fewer sensor pixels to individual display pixels). It is appreciated that as the individual sensor pixels are mapped to more display pixels, the image will appear to the user as being magnified as the display area of the region of interest on display 120 increases. A user can accept and save the image at any desired ratio of sensor pixel to display pixel.
At some point, a threshold resolution will be reached (e.g., a one-to-one sensor to display pixel ratio). If the user wishes to continue to zoom in on the region of interest, in response, system 100 will automatically switch from an optical magnification of one or less to the next higher magnification in microscope 150. In an embodiment where a separate sensor is associated with magnifying optics, system 100 will automatically switch to sensor 165 and position magnifying optical imaging sub-system 170 over the region of interest. Representatively, when the threshold resolution of the image is reached with sensor 160 and optical imaging sub-system 168, system 100 will switch to magnifying the image through objective lens 1700A. Objective lens 1700A provides, for example, 2.5× magnification.
Upon switching to capturing an image through magnifying optical imaging sub-system 170, the sensor to display pixel ratio will once again begin at a pixel ratio greater than one to one (e.g., four to one). A user can accept that captured image and save that captured image or continue to zoom in and accept and save the image at any desired ratio. Continuing to zoom in again initially involves modifying a sensor to display pixel ratio from a sensor to display pixel ratio greater than one to one towards a ratio of one to one or more. Once the threshold resolution is reached, system 100 will change objectives from objective 1700A to objective 1700B with the next higher optical magnification. In one embodiment, objective 1700B is a 20× magnification. Continued zooming follows the same action.
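The zoom behavior described above — reducing the sensor-to-display pixel ratio toward the one-to-one threshold, then stepping to the next higher objective and restarting the electronic zoom — can be sketched as a small state machine. All names, the 4:1 restart ratio and the halving step are illustrative assumptions, not the system's actual implementation:

```python
MAGNIFICATIONS = [1.0, 2.5, 20.0, 40.0]  # m<=1 overview optics, then objectives

def zoom_in(ratio, mag_idx):
    """One zoom step: halve the sensor-to-display pixel ratio until the
    1:1 threshold; at the threshold, switch to the next higher
    magnification and restart the electronic zoom at 4:1."""
    if ratio > 1:
        return ratio // 2, mag_idx
    if mag_idx + 1 < len(MAGNIFICATIONS):
        return 4, mag_idx + 1
    return ratio, mag_idx  # already at the highest magnification

state = (4, 0)           # start: 4:1 ratio through the overview optics
state = zoom_in(*state)  # electronic zoom to 2:1
state = zoom_in(*state)  # electronic zoom to 1:1 (threshold)
state = zoom_in(*state)  # switch to the 2.5x objective, back at 4:1
```

At the top magnification the state simply stops advancing, mirroring the text's point that continued zooming follows the same pattern at each objective.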
The above discussion involved user interaction to direct system 100 to electronically zoom in and/or increase magnifications. In another embodiment, system 100 may do this automatically. For example, system 100 may be configured to perform the above operations to capture saved images at different resolutions and/or magnifications.
In one embodiment, when a slide is placed in digital microscope 150, system 100 immediately creates an overview image with a magnification of one or less than one. A user can zoom in where sensor 160 is capturing an image from a sample on the slide as described above, or a user can alternatively capture a greater magnification of a region of interest on the sample by directing the system to increase magnification. One way a user does this is by using mouse 1400A to select a region of interest on the overview image and indicating the desired magnification. In the latter case, it is appreciated that a user may make the selection on an overview image whether or not a particular sample/slide is currently the slide of which sensor 160 may be capturing an image. For example, where sensor 160 is currently capturing an image of slide 320A (see
Using the example of a user wanting a magnified view of a portion of a sample on slide 320B, initially a saved overview image of slide 320B will be displayed on display 120. If, for example, a user wants a magnified image of a portion of the image (e.g., a region of interest), the user can drag mouse 1400A to a desired position of display 120 showing a sample on slide 320B and then indicate to system 100 the desired area of magnification by clicking mouse 1400A. As noted before, a specific coordinate system of the overview image may not be saved. However, system 100 knows the approximate location selected by a user because it knows where on display 120 the user indicated (e.g., clicked), it knows the size of individual pixels in the overview image (e.g., 50 μm×50 μm) and it knows the pixel dimensions of the image (e.g., 3324×2504 pixels). Since the system previously identified slide 320B in slide carrier 210 and the approximate location of the sample on the slide, system 100 will know approximately where the region of interest is and can capture a magnified view of it. Similarly, if a magnified image including a region of interest had previously been saved, system 100 can retrieve that image based on the user indicating the region of interest on the overview image. In other words, the ability to identify a region of interest by a pixel position in an overview image applies not only to the overview image of a sample but to any other image of that sample.
In one embodiment, system 100 allows a user to place an annotation on an image and save the location of the annotation. For example, a user may wish to identify a region of interest in a 20× image by an arrow (an annotation) pointing at the region of interest. In one embodiment, a user locates the region of interest at 20× magnification, then moves mouse 1400A to the point where an annotation is desired. The user indicates to system 100 that an annotation or object is to be placed at a location (e.g., by previously selecting (clicking on) an icon in the display browser) and clicks mouse 1400A to place the annotation. To place the annotation, system 100 must locate the point in the 20× image. Similar to locating points in an overview image, system 100 can find the point because it knows the pixel size of the sensor (e.g., 5.4 μm×5.4 μm), from which the pixel size at 20× may be determined (5.4/20×1000 = 270 nm), and it knows the pixel dimensions (e.g., 3324 pixels×2504 pixels). Based on this information, system 100 can locate and store information about the location of a point (e.g., a pixel or pixels) in the 20× magnified view as well as in the overview image of the slide.
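The 270 nm figure above is just the sensor pixel pitch divided by the optical magnification, converted to nanometers. A minimal sketch of that arithmetic (the function name is hypothetical):

```python
def pixel_size_nm(sensor_pixel_um, magnification):
    """Size of the sample area one sensor pixel covers, in nanometers:
    sensor pixel pitch divided by the optical magnification."""
    return sensor_pixel_um / magnification * 1000.0

# 5.4 um sensor pixels through the 20x objective.
size = pixel_size_nm(5.4, 20)
```

The same formula reproduces the earlier figures as well, e.g., roughly two microns per pixel through the 2.5× objective.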
Where slide information including an annotation or object of a point or region of interest of a slide is saved in system 100 and the slide is removed from digital microscope 150 and slide carrier 210, system 100 can also find the point or region of interest when the slide is reinserted in microscope 150. As described earlier, there are error tolerances associated with stage 180, slide carrier 210 and cavities within slide carrier 210. These error tolerances could affect alignment of an annotation to a particular point or region of interest of a sample on a slide when the slide is removed and then reinserted in slide carrier 210 and microscope 150. To account for this potential alignment error, in one embodiment, system 100 captures a new image of the slide or a portion of the slide and compares the new image to the saved image with the annotation. For example, system 100 may take a new image of the slide label or a corner of the sample and overlay that image on the saved image of the label or the corner of the sample, respectively. If the images are not aligned, system 100 rotates and/or linearly positions the new image until it is aligned. In making this adjustment, system 100 stores information about the adjustment and uses this stored information to find where an annotation is located in the new view of the image. In a simple example, a new image of a sample on a slide reinserted into slide carrier 210 and microscope 150 is determined to be offset three pixel lengths in the x-direction from the original image of the slide including the annotation. When indicating the annotation in the new image, system 100 knows the x-direction position of the annotation in the old image and then moves the annotation three pixels to the right to locate the annotation in the new image.
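The relocation step in the simple example — shifting a saved annotation by the measured alignment offset — can be sketched as follows (names are hypothetical; estimating the offset itself, e.g., by correlating the label images, is outside this sketch):

```python
def relocate_annotation(old_xy, offset_px):
    """Shift a saved annotation position by the alignment offset
    measured between the saved image and the newly captured image."""
    return (old_xy[0] + offset_px[0], old_xy[1] + offset_px[1])

# New image found to be 3 pixels off in the x-direction, as in the
# example from the text.
new_pos = relocate_annotation((120, 200), (3, 0))
```

A rotation component would require applying the stored rotation as well, but the translation-only case captures the example given.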
With regard to saving images (e.g., an overview image, a magnified image), in one embodiment, a single assembled image of a sample is saved. In another embodiment, a hierarchy of images of the sample is saved. In one embodiment, a hierarchy of images of a sample is created based on a sensor to display pixel ratio. In this embodiment, a highest ranked image in the hierarchy is an image having a one-to-one pixel ratio (full resolution sample mapping each sensor pixel with each display pixel). One or more lower ranked images of increasing sensor to display pixel ratio (a sensed image is displayed on display 120 such that one sensor pixel is mapped to more than one display pixel, e.g., 2:1, 4:1, etc.) make up the remainder of the hierarchy of images. Each of the full resolution sample and the one or more lower ranked images may be stored together in a data set.
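Such a hierarchy is essentially a set of progressively downsampled versions of the full-resolution image, keyed by sensor-to-display pixel ratio. A minimal sketch, storing only the per-level pixel dimensions (the function name and the choice of levels are illustrative):

```python
def build_hierarchy(full_width, full_height, ratios=(1, 2, 4)):
    """Image hierarchy keyed by sensor-to-display pixel ratio: the 1:1
    entry is the full-resolution image; higher ratios hold the
    correspondingly downsampled dimensions."""
    return {r: (full_width // r, full_height // r) for r in ratios}

# Sensor dimensions from the earlier example.
levels = build_hierarchy(3324, 2504)
```

In a real data set each level would hold pixel data rather than just dimensions, so a view at a given zoom ratio can be served without resampling the full-resolution image each time.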
For objectives with high magnification, the depth of field (the z-direction range within which objects are in focus) is relatively small. The z-direction range may be so small (e.g., 1 μm) that a single image capture cannot capture all objects in a sample having a thickness, for example, on the order of 10 μm. To capture as many objects as possible, in one embodiment, system 100 may capture several images at focal planes separated by a depth of field. In this example, system 100 may capture 10 images, moving stage 180 one micron in the z-direction between each capture. Such an operation results in 10 image planes representing a z-direction stack, or z-stack, of the sample.
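The z-stack capture plan reduces to a list of stage z-positions spaced one depth of field apart. A minimal sketch with the example values from the text (the function name is hypothetical):

```python
def z_stack_positions(start_um, step_um, count):
    """Stage z-positions for a z-stack: `count` focal planes separated
    by one depth of field (`step_um`)."""
    return [start_um + i * step_um for i in range(count)]

# 10 planes at 1 um spacing span a ~10 um thick sample.
planes = z_stack_positions(0.0, 1.0, 10)
```

The microscope would capture one image at each position, yielding the 10-plane z-stack described above.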
In another embodiment, a coordinate system for each slide may be established using a label image. In one embodiment, a slide label may be printed with more than one perceivable dot or point.
In one embodiment, a data set including a saved image or hierarchy of images of a sample, a z-stack, a coordinate system for that image, if any, and separately saved label image is assembled in a memory of computer 110. Such a data set may also contain comments or annotations (including markings on an image made by a user) and content of label (e.g., interpretation of label).
Having described certain components of system 100, a brief description of operation is now presented. In one embodiment, use of system 100 is software driven. In other words, a machine or computer readable medium is provided in computer 110 containing program instructions that, when executed, will carry out the various methods of operation described.
In one embodiment, a method of operation is illustrated in
As a starting point, slide carrier 210 may be loaded into digital microscope 150 and placed on stage 180. Sensors may be located on stage 180 to sense a slide carrier. Computer 110 is responsive to such sensors. When computer 110 senses slide carrier 210 on stage 180, in one embodiment, system 100 has three modes: a live mode; a scanning mode; and a viewing mode. The live mode and viewing mode are interactive modes in that they include user interaction. The scanning mode may be operated interactively or be fully automated with specific predefined parameters or configurations for scanning (saving) images. For example, in the scanning mode, slide(s) may be loaded onto a slide carrier and inserted into a digital microscope and the system will sense and save one or more images of samples on a slide.
In the example where a user chooses a live mode by selecting “Live” 610, computer 110 will direct digital microscope 150 to move slide carrier 210 to a loading position, such as by extending slide carrier 210 out of the instrument so it is accessible by a user. At this time, when slide carrier 210 is accessible, computer 110 may indicate that the slide carrier may be removed from digital microscope 150 and loaded with one or more slides. One way to indicate this is by an announcement on monitor 120 (block 505,
In the embodiment where slide carrier 210 has four slide cavities (slide cavities 220A, 220B, 220C and 220D), a user may place up to four slides onto slide carrier 210. After placing one or more slides on slide carrier 210, the carrier is loaded into the instrument and the instrument pulls the carrier inside and senses its presence and location (block 510,
Once slide carrier 210 is placed within digital microscope 150, system 100 determines the number and position of slides inserted into slide cavities 220A, 220B, 220C and/or 220D (block 515,
The screen shot in
Main part 730 of screen 700 is divided into slide selection section 740, profile selection section 750 and slide information section 760. Slide selection section 740 shows a sketch of slide carrier 210 (see
Referring to profile selection section 750 of the GUI, the section allows the user to select specific, predefined scan profiles that are optimized and customized image acquisitions for a specific stain or, in an embodiment utilizing fluorescence imaging, a specific fluorescence marker. For example, a specific profile for an H&E stain might be a 20× image. Section 750 allows a user to make a selection from the tool bar and be offered a choice of a 20× view and scan. Slide and label information is presented in slide information section 760 and provides, in one embodiment, identifying information about the patient as well as the process step(s) to which a sample was subjected in preparing it for microscopic analysis.
Referring to the screen shot in
In an instance where a user selects a “live mode,” a user may select a particular slide or slides based on the displayed overview images (block 540,
Views can be added to and removed from the screen via selection in a journal. The journal is a collection of all objects in a case and can be found on the right-hand side of the screen. At any time, only one view is active. All controls reflect the settings of the active view. Any operation will only affect this specific image. The images from the other scans are “frozen.” To switch focus to another slide, the user needs to click on one of the displayed images or a label in the journal.
In title bars 810 and 910 of the screen shots shown in
In each view, the magnification of the image, a scale indicator and a focus control may be displayed.
Using keyboard 130, mouse 1400A or 3D mouse 1400B, the user is able to navigate within the scanned image data (block 550,
Each view can be viewed in a full screen mode. In this mode, the entire screen is covered by the view. Only the magnification of the image, the scale indicator and the navigation control are displayed in the image. A user can leave this full screen mode by pressing the “esc” key on keyboard 130.
At any time during the navigation, a user may save the displayed image (block 560,
On the right hand side of the GUI in
Navigation control 850/950 shows an overview image of the active slide. The control indicates the position of the active view (e.g., the live image that can be refreshed at a desired refresh rate). By clicking on the control, the position can be moved. The control also indicates the positions of the annotations and scans which have been added to the slide.
Below navigation control 850/950, journal 860/960 is located. Journal 860/960 represents the structure of a case. In one embodiment, journal 860/960 contains slide information, annotations and comments for all slides in a case. Slides can be added to and removed from the case. The slides are grouped into two parts. In the upper part of the list, slides in the instrument can be found. A user can examine these slides live. In this case, the user can add scans to these slides by scanning selected areas of the slides. In the lower part of the list (as viewed), pre-scanned slides from the memory in computer 110 are presented.
In the structure of journal 860/960, each slide has by default three objects: a label image, the slide information and an overview image. The user can add additional objects (annotations, comments, bookmarks and high-resolution scans) to the journal. A user can choose one of the entries in journal 860/960 to jump to a specific image position.
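The journal structure just described can be sketched as a simple data model: every slide entry starts with the three default objects (label image, slide information, overview image), and the user may append annotations, comments, bookmarks and high-resolution scans. The dictionary layout and names are illustrative assumptions.

```python
DEFAULT_OBJECTS = ("label image", "slide information", "overview image")
USER_OBJECT_KINDS = ("annotation", "comment", "bookmark", "high-resolution scan")

def new_slide_entry(slide_id):
    # each slide has the three default objects from the start
    return {"slide": slide_id, "objects": list(DEFAULT_OBJECTS)}

def add_object(entry, kind):
    # only the object kinds named in the description may be added
    assert kind in USER_OBJECT_KINDS
    entry["objects"].append(kind)

entry = new_slide_entry("S-001")
add_object(entry, "annotation")
add_object(entry, "high-resolution scan")
```

Selecting any of these entries would then look up the stored image position associated with the object and recenter the active view there.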
In one embodiment, journal 860/960 is a starting point for evaluation of specific regions or for the creation of a report on the case under consideration. Journal 860/960 also contains the “save session” button. By selecting this button, system 100 stores the session (including all slides, annotations and settings) to memory in computer 110. The file contains the labels, the overview image and the high-resolution scans defined in the journal. A user can restore the session at a later time.
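The save/restore round trip described above amounts to serializing the journal state to a file and reading it back. A hedged sketch follows; the JSON layout is an assumption for illustration, not the actual on-disk session format used by system 100.

```python
import json
import os
import tempfile

def save_session(journal, path):
    """Persist the session (slides, annotations, settings) to disk."""
    with open(path, "w") as fh:
        json.dump(journal, fh)

def restore_session(path):
    """Load a previously saved session."""
    with open(path) as fh:
        return json.load(fh)

journal = {
    "slides": [{"id": "S-001", "annotations": ["tumour margin"]}],
    "settings": {"magnification": 20},
}
path = os.path.join(tempfile.mkdtemp(), "session.json")
save_session(journal, path)
restored = restore_session(path)
```

In practice the session file would also reference the label images, the overview image and the high-resolution scan data named in the journal, which are far too large to inline in a metadata file like this.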
Below journal 860/960 in
Following optional modification of image parameters, a user may save the session. A user may also abort the session; in that case, the user exits the live mode. If the user has modified entries in journal 860/960 that are not yet saved, system 100 asks the user whether to save the session. Subsequently, system 100 moves to the “slide (group) selection” screen, and the user can continue with the next slide(s).
If a user has defined a high-resolution scan region in the live mode, the high-resolution scan function is accessed. Before the scan starts, the user selects the scan parameters (resolution, acquisition parameters and z-stack parameters).
After the high-resolution scan is started, system 100 moves to the high-resolution scan display.
Following a high-resolution scan, system 100 automatically reverts to the “live mode” screen. The completed scan is inserted into the journal and is located below the corresponding slide.
In another embodiment, system 100 can be directed to perform scans of multiple slides, such as high-resolution scans of each slide in slide carrier 180. In an embodiment where slide carrier 180 holds four slides, a user places up to four slides onto slide carrier 180 and inserts the carrier into digital microscope 150. Digital microscope 150 pulls slide carrier 180 inside and determines the slide carrier type, the number of slides and their positions. Automatic scanning of more than four slides is possible in combination with, for example, an external slide loader. In that case, system 100 communicates with the slide loader so that slides to be scanned are automatically loaded and retrieved. For example, a commercially available slide loader can be associated with digital microscope 150. Such slide loaders are configured to receive and transfer slides. Accordingly, a transfer mechanism of a slide loader can load a slide or slides onto slide carrier 210, and a transfer mechanism (e.g., a pick-and-place robotic grabber) associated with or connected to digital microscope 150 can return slides to the slide loader after scanning. The transferring, scanning and imaging may be controlled by controller 110.
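The loader-driven batch described above is, at its core, a loop that fills the four-position carrier, scans each loaded slide, and returns the slides before loading the next group. A minimal sketch, with the loader and microscope reduced to illustrative stand-ins:

```python
CARRIER_CAPACITY = 4  # the embodiment above: slide carrier 180 holds 4 slides

def scan_batch(slides):
    """Scan an arbitrary number of slides in carrier-sized groups."""
    scanned = []
    for i in range(0, len(slides), CARRIER_CAPACITY):
        carrier = slides[i:i + CARRIER_CAPACITY]  # loader fills the carrier
        for slide in carrier:                     # microscope scans each slide
            scanned.append(f"scan:{slide}")
        # carrier contents are returned to the loader here
    return scanned

result = scan_batch(["s1", "s2", "s3", "s4", "s5", "s6"])
```

A real implementation would interleave loading and scanning (the loader stages the next carrier while the current one is being imaged), but the grouping logic is the same.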
In an automatic or reflex imaging mode, system 100 may be instructed to acquire scans of either the complete slide or selected areas of the slide, either immediately or at a later time selected by the user. Scanning of the slides in the reflex imaging mode is facilitated by an automatic slide loader. In addition to selecting complete or partial slide scans, a user may select scan parameters such as, but not limited to, magnification and z-stack. A user can also select more than one scan at different magnifications or using other user-defined parameters. Default scan parameters can be programmed into system 100 to reflect differences between clinicians and/or tissue types. Completion of the reflex imaging of a slide or group of slides, as in a case, may be automatically signaled to one or more devices, for example, via internet connection 145 and to computer 110, to enable rapid sign-out of the cases (see
System 100 presents a user with selections for the scanning resolution and the predefined image parameter set such as described above in the live mode. Moreover, the user can define z-stack parameters (number of planes, distance between the planes and the focus offset from the focal plane).
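The z-stack parameters named above (number of planes, distance between planes, focus offset from the focal plane) determine a set of z positions at which images are acquired. A worked sketch follows; centring the stack on the offset focal plane is an assumption, as the specification does not fix how the planes are distributed.

```python
def z_stack_positions(focal_plane_z, n_planes, spacing, focus_offset=0.0):
    """Return the z position of each acquisition plane, centred on the
    detected focal plane shifted by the user's focus offset."""
    centre = focal_plane_z + focus_offset
    half = (n_planes - 1) / 2.0
    return [centre + (i - half) * spacing for i in range(n_planes)]

# 5 planes, 2 um apart, offset 1 um from a detected focal plane at z = 100 um
planes = z_stack_positions(100.0, 5, 2.0, focus_offset=1.0)
```

With these example values the stack spans 97 um to 105 um, symmetric about the offset focal plane at 101 um.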
The user can modify the save parameters (file name and scan directory); the default parameters are defined in the system settings. For the file name, the user can either use the label content or define a file name. If a name is defined, in one embodiment, system 100 appends a progressive number to distinguish between the slides.
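The progressive-number rule can be sketched directly: a user-defined base name gets a running counter per slide. The underscore separator, 1-based numbering and directory layout are illustrative assumptions.

```python
def slide_file_names(base_name, n_slides, directory="scans"):
    """Generate one file name per slide by appending a progressive number
    to the user-defined base name."""
    return [f"{directory}/{base_name}_{i}" for i in range(1, n_slides + 1)]

names = slide_file_names("case42", 3)
```

When the label content is used instead, no counter is needed, since each slide's label is already unique within the case.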
After the scan parameters are defined, system 100 starts to scan slide carrier 180 slide by slide. For each slide, a low-resolution overview image and the label image are taken. Following the capture of the overview and label images for each slide, system 100 captures high-resolution images. In one embodiment, the GUI on display 120 changes to the first high-resolution scan screen. System 100 may depict the overview and label images of the current slide. It automatically identifies the tissue regions on the slide and indicates the detected tissue regions in the overview image. Subsequently, the focal plane for the tissue regions on the slide is determined. The progress of these steps is indicated on the screen.
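The per-slide sequence just described can be summarized as a pipeline: capture a low-resolution overview and a label image, detect tissue regions in the overview, determine a focal plane, then scan each detected region at high resolution. A hedged sketch with each step reduced to a stub emitting a progress record:

```python
def scan_slide(slide_id, tissue_regions):
    """Run the per-slide steps in order, recording progress for the GUI."""
    steps = []
    steps.append(("overview image", slide_id))        # low-resolution overview
    steps.append(("label image", slide_id))           # slide label capture
    steps.append(("tissue detection", len(tissue_regions)))
    steps.append(("focal plane", slide_id))           # focus determination
    for region in tissue_regions:                     # high-resolution pass
        steps.append(("high-resolution scan", region))
    return steps

progress = scan_slide("S-001", ["region A", "region B"])
```

Emitting one record per step mirrors how the on-screen progress indication can track the pipeline, and the loop over detected regions is why only tissue-bearing areas, not the whole slide, are imaged at high resolution.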
In one embodiment, during the actual scan of the detected tissue regions, depending on the system settings, display 120 displays either the image of the current scan position or the entire current scan region, in which the image builds up.
The scanned high-resolution images for a slide are stored in memory of computer 110 and can be accessed later by a user using the viewing mode. After all tissue regions on the slide are scanned, system 100 proceeds with the next slide. After all slides have been scanned, system 100 ejects slide carrier 180 and moves to the entry screen.
In a viewing mode, a user can retrieve and view stored images of previously scanned samples (slides), including multiple slides that may make up a case. In this manner, a user can view the images on display 120 directly connected to computer 110 and digital microscope 150. Alternatively, through intranet/internet connection 145 (
In the preceding detailed description, the invention is described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This application is a non-provisional application and claims the benefit of the earlier filing date of U.S. Provisional Patent Application No. 61/375,703, filed Aug. 20, 2010, which is incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
20120044342 A1 | Feb 2012 | US |
Number | Date | Country | |
---|---|---|---|
61375703 | Aug 2010 | US |