Method and apparatus for automatic staining of tissue samples

Information

  • Patent Grant
  • 8257968
  • Patent Number
    8,257,968
  • Date Filed
    Friday, December 19, 2003
  • Date Issued
    Tuesday, September 4, 2012
Abstract
The present invention concerns an apparatus (1) for staining tissue samples, said apparatus (1) including a reagent section (2) for reagent containers (3); at least one staining section for tissue samples; a robotic head (22) or robotic element (20) that may move reagent to a predetermined tissue sample, said robotic element (20) being moveable above the reagent and the staining sections; a control element (85) that may manage a staining process; and a 2-D optical sensor (86) to detect two-dimensional image data of a relevant property and that can feed the captured image data to the control element (85). By providing the robotic element (20) with a 2-D optical sensor (86), a common image processor may be provided having multiple functions. By using a 2-D optical image processing system, the control system of the apparatus may easily be adapted to read various types of data presentations, just as actual images of sections of the apparatus may be identified in order to assess the condition of the apparatus. The optical sensor may be used to automatically identify the slides and the reagent containers present in the apparatus, just as the optical sensor may be used for checking if a slide is misplaced at or absent from a slide position, etc.
Description
TECHNICAL FIELD

The present invention relates to an apparatus and a method for automatic staining of tissue samples. It may further relate to systems for sample processing and data acquisition, data maintenance, and data retrieval for sample processing. Applications to which the present invention may especially relate include immunohistochemistry, in-situ hybridization, fluorescent in-situ hybridization, special staining, and cytology, as well as potentially other chemical and biological applications.


BACKGROUND

Tissue sample processing in immunohistochemical (IHC) applications and in other chemical and biological analyses, such as in-situ hybridization, special staining and cytology, may require one or more processing sequences or protocols as part of an analysis of one or more samples. The sample processing sequences or protocols may be defined by the individual or organization requesting an analysis, such as a pathologist or histologist of a hospital, or may be defined by dictates of a particular analysis to be performed, e.g. standardized protocols defined by an organization.


In preparation for sample analysis, a biological sample may be acquired by known sample acquisition techniques and may comprise tissues which in some applications may even be one or more isolated cells. The tissue sample may be accommodated on a sample carrier such as a slide or perhaps a microscope slide.


For example, immunologic applications may require processing sequences or protocols that comprise steps such as deparaffinisation, target retrieval, and staining. Previously, in some applications, these steps may have been performed manually, potentially resulting in a time consuming protocol and necessitating personnel to be actively involved in sample processing. In particular relating to the staining process, various devices for automated staining of tissue slides are known, as attempts have been made to automate sample processing to address the need for expedient sample processing and less manually burdensome operation.


Aspects of the present invention may be especially applicable to sample processing having one or a plurality of processing steps to be performed on one, a portion, or an entirety of samples, such protocols identified in some instances by the individual carriers presenting the samples. Aspects of the present invention may be especially applicable to immunohistochemistry (IHC) techniques, as well as in-situ hybridization (ISH) and fluorescent in-situ hybridization (FISH), especially techniques incorporating the staining of samples.


Embodiments of the invention may further relate to automated control systems for sample processing. Embodiments may also be directed to data acquisition, data maintenance, data retrieval for sample processing, especially information sharing of processing protocol and processing status, such as for individual samples or multiple batch processing, sample diagnostic features, and real-time or adaptive capabilities for multiple batch processing.


U.S. Pat. No. 5,839,091 discloses an apparatus for automatic tissue staining where microscope slides are arranged in a number of rows and reagent vials are stored in a section next to this slide section. A robotic head picks up a predetermined amount of reagent from a bottle and deposits this amount of reagent on a predetermined slide and blows the liquid off the slides according to a control program. This program is run on a computer that is coupled to the staining apparatus. The apparatus is loaded with a number of slides, and each slide and its position is registered in the computer and a staining sequence is selected. The program also receives data relating to the reagents and their position in the reagent section. On the basis of these slide and reagent position data, the program calculates a staining run and controls the robotic motion in the apparatus.


U.S. Pat. No. 6,352,861 discloses a carousel-type automatic staining apparatus in which the slides are arranged on a rotatable carousel slide support and the reagents are similarly arranged on a rotatable carousel reagent support above the slide support. A particular slide is then rotated to a delivery zone and a particular reagent vial is also rotated to this position and reagent is dispensed onto the slide. The slides and the reagent bottles are provided with bar codes and associated bar code readers are provided to identify the slides and the reagents respectively. A blowing zone and an identifying zone are also provided at the periphery of the slide carousel. The slide bar codes identify the slide samples and the particular immunohistochemical processes required for the particular samples. A reagent bar code reader is positioned to scan the reagent bar codes on the reagent bottles. The scanned information from the slide bar code reader and the reagent bar code reader is fed into a computer and correlated with the indexed position of the slide and the reagent carousel, respectively. This information is used to rotate the slide carousel and the reagent carousel to place the correct reagent bottle in the dispense zone for each slide treatment step for each slide.


A drawback of the automated staining apparatus described in '091 is that the position of each of the tissue slides and each of the reagent vials in the slide section and in the reagent section, respectively, must be entered manually into the computer, since the control program cannot check the location of the particular slides and reagent vials. This involves the risk that a misplaced slide is treated with the wrong staining protocol and makes the apparatus very inflexible in use.


Although the '861 patent uses a bar code identification of the slides and reagents, this carousel-type apparatus is time consuming in running the staining protocols, since each step in a protocol involves rotating the carousel with the entire reagent inventory and the slide carousel with all the slides. These rotations are time consuming and make this type of apparatus unsuitable for running larger numbers of slides. Moreover, the bar codes can only carry a small amount of data, typically simply an identification code, which means that the control computer must be provided with corresponding data associated with the identification codes.


DISCLOSURE OF INVENTION

It is an object of the present invention to provide an automatic tissue sample processing apparatus of the initially mentioned kind, with automatic identification of the inventory of reagents and slides present in the machine. Another object is to provide identification of relevant properties of the apparatus to allow for automatic preparatory checks before a staining process of newly loaded slides is initiated.


In one embodiment these objects are achieved by an apparatus of the initially mentioned kind wherein a robotic element, perhaps with a robotic head, is provided with an optical sensor, or perhaps a 2-D optical sensor means for detecting two-dimensional image data of a relevant property and with the capability of feeding the captured image data to the control means.


The invention also provides a method of identifying at least one property in an automatic staining apparatus perhaps including at least one slide array and a reagent array and a robotic element or perhaps robotic means for performing staining of the slides also using reagents;

    • said method including in one embodiment the steps of
      • providing optical sensor means on the robotic head of the robotic means,
      • moving the optical sensor means on said robotic head to a predetermined position,
      • recording relevant image data at said position, and
      • feeding said image data to a control system for manipulating the staining process according to said image data.


Furthermore, the invention concerns a method of staining tissue samples in an automatic staining apparatus perhaps including at least one slide array and a reagent array, and a robotic element or perhaps robotic means for performing staining of the slides also using reagents according to tissue sample specific staining protocols; said method including in one embodiment the steps of:

    • providing optical sensor means on the robotic head of the robotic means,
    • moving the optical sensor means on said robotic head to a predetermined position,
    • recording relevant image data at said position by said optical sensor means;
    • feeding said image data to a control system for manipulating the staining process according to said image data; and
    • staining a tissue sample also using reagent from a reagent container.
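
Purely as an illustration, and not as part of the patent disclosure itself, the sequence of steps listed above can be pictured as a short control loop. The sketch below is a hypothetical Python outline; all class, function, and parameter names (ControlSystem, run_staining_step, move_head, capture_image) are assumptions introduced for clarity only.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Position:
    x: float
    y: float

class ControlSystem:
    """Stand-in for the control element managing the staining process."""
    def interpret(self, image: bytes) -> str:
        # Decode the slide label from the captured image (decoder omitted here).
        return "protocol-A"

    def dispense(self, protocol: str, where: Position) -> None:
        print(f"dispensing reagent for {protocol} at ({where.x}, {where.y})")

def run_staining_step(move_head: Callable[[Position], None],
                      capture_image: Callable[[], bytes],
                      control: ControlSystem,
                      label_pos: Position,
                      slide_pos: Position) -> None:
    move_head(label_pos)                  # move the optical sensor on the robotic head
    image = capture_image()               # record relevant image data at that position
    protocol = control.interpret(image)   # feed image data to the control system
    move_head(slide_pos)
    control.dispense(protocol, slide_pos)  # stain the sample per the decoded protocol

# Example use with dummy motion and camera callables:
run_staining_step(lambda p: None, lambda: b"", ControlSystem(),
                  Position(10.0, 5.0), Position(10.0, 40.0))
```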


In embodiments, the automatic staining apparatus 1, that is any apparatus that stains with at least some automated operation, may include a reagent container 3. One or more reagent containers 3 may be positioned in a reagent section 2 of the automatic staining apparatus 1. The automatic staining apparatus may also include a tissue sample 74, which may be placed on a slide therein. A plurality of slides with tissue samples thereon may also be positioned in a slide section 5 of the automatic staining apparatus 1. The invention, in embodiments, may also include a robotic element 20, some type of control element, and even an optical sensor 86, perhaps an image-capture 2-D optical sensor. As can be easily understood, the control element 85 may be a computer, software routine, or merely a particular programmable processor functionality.


As mentioned, the present invention may provide for the capability of optically sensing a two-dimensional image. This can occur through an image-capture 2-D sensor which may provide a two-dimensional image of an element in the automatic staining apparatus 1. By providing the robotic element or perhaps a robotic head with a 2-D optical sensor or means, as but one embodiment, a common image processing means is able to have multiple functions. By using a 2-D optical image processing system, the control system of the apparatus may easily be adapted to read various types of data presentations, just as actual images of elements or of sections of the apparatus may be identified in order to assess the condition of the apparatus. The optical sensor or optical sensor means may be used to automatically identify the slides and the reagent containers present in the apparatus, just as the optical sensor or optical sensor means may be used for checking if a slide is misplaced at or absent from a certain slide position, etc.


An optical sensor provides a staining apparatus according to the invention with a hitherto unseen flexibility and possibility of automating the identification functions in a staining apparatus. By utilizing a CCD camera or the like, perhaps on the robotic head or even the robotic element, individual identification means for each of the identification tasks may no longer be required. This means that control as well as maintenance of the apparatus is facilitated. The software controlling the apparatus may be adapted to include automated identification of various properties and conditions of the apparatus, including slide and reagent information. By a method of identifying relevant properties in the staining apparatus and a method of performing the staining process according to the invention, the automated staining process may be less time-consuming and more qualitative checks may be included without losing any significant speed in the slide staining operations.
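
Because a single head-mounted 2-D sensor serves several identification tasks, the control software can simply route each captured frame to the appropriate interpretation routine. The following is a hedged, illustrative sketch only; the task names and handler functions are assumptions, not elements of the patent.

```python
from enum import Enum, auto
from typing import Callable, Dict

class ImageTask(Enum):
    READ_SLIDE_LABEL = auto()
    READ_REAGENT_LABEL = auto()
    CHECK_SLIDE_PRESENCE = auto()
    LOCATE_REFERENCE_MARK = auto()

# One camera, many functions: each task maps to its own image-processing routine.
HANDLERS: Dict[ImageTask, Callable[[bytes], object]] = {
    ImageTask.READ_SLIDE_LABEL:     lambda img: "decode 2-D symbology on a slide label",
    ImageTask.READ_REAGENT_LABEL:   lambda img: "decode 2-D symbology on a reagent bottle",
    ImageTask.CHECK_SLIDE_PRESENCE: lambda img: "detect whether a slide occupies the position",
    ImageTask.LOCATE_REFERENCE_MARK: lambda img: "find calibration reference features",
}

def process_frame(task: ImageTask, image: bytes):
    """Dispatch one captured frame from the head-mounted camera to its handler."""
    return HANDLERS[task](image)
```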


In one embodiment of the invention, the reagent section accommodates a plurality of reagent containers stationarily arranged in a plurality of rows. Similarly, the tissue samples are accommodated on slides that are stationarily arranged in a plurality of rows in the at least one staining section or slide section during the staining process. The layout of these sections is such that it presents a substantially planar platform work area for the robotic head, which is moveable along the X- and Y-axes. In a particularly preferred embodiment, a row of slides and/or reagents can be removed and replaced without interfering with the staining process.


In another preferred embodiment, the apparatus comprises at least two staining sections separated by a reagent section, that is they may be arranged so that at least some of the tissue samples are closer to at least some of the reagent containers. Hereby, the movements required by the robotic head in order to reach all the slides may be significantly limited and the capacity of the staining apparatus can hereby be increased, just as a reduction in the time for running the staining protocols or other advantages may be achieved. It is further realized that these shorter processing times or other advantages may also be achieved by this layout of the slide and reagent sections without a vision system, e.g. an optical sensor.


In other preferred embodiments of the invention, the optical sensor may be a camera or may perhaps include a CCD element. By the term “camera” it should be understood that any image capture apparatus is intended, whether or not it uses film, plates, memory, or any type of electronic media, and whether or not it images light, visible electromagnetic radiation, or even non-visible electromagnetic radiation, such as is now well known. By recording the relevant image, relevant image data, or even digital image data, computer processing of this data in the control system may be carried out quickly by known image processing capabilities already available. Moreover, by using this digital technology relatively complex images can be recorded with high resolution, just as a fast recording of several identifications, e.g. labels on an entire row of slides, may be achieved as the robotic head may be moved across the slide labels in a continuous movement, so stop and start time for each slide identification may be avoided. However, by the invention it is realized that other image sensors, e.g. solid state sensors or perhaps CMOS sensors, could also be used depending on the requirements for image resolution.
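
As one way of picturing that continuous-pass reading, the software could request one frame per label position while the head traverses a row. This is a minimal sketch under assumed geometry; capture_at and the pitch value are hypothetical and not taken from the patent.

```python
from typing import Callable, List, Tuple

def sweep_row(capture_at: Callable[[float, float], bytes],
              row_y: float, first_x: float, pitch: float,
              n_slides: int) -> List[Tuple[int, bytes]]:
    """Capture one frame per slide label during a single pass along a row.

    capture_at(x, y) is an assumed camera helper returning image bytes taken
    on the fly; pitch is the assumed center-to-center slide spacing in mm.
    """
    frames = []
    for i in range(n_slides):
        x = first_x + i * pitch
        frames.append((i, capture_at(x, row_y)))  # no stop/start between labels
    return frames
```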


As indicated above, the optical sensor may be adapted to record the individual reagent containers or bottles and slides present in the apparatus. While of course it may image larger areas, or perhaps even the entire device, it may be configured for individual imaging either electronically, optically, or positionally. Regardless, as a result of the imaging capability, predetermined positions of the slides or reagent containers or bottles that are loaded into the automatic staining apparatus may not be required, since the apparatus may be adapted to automatically identify new slides and reagent bottles once they are loaded into the apparatus.
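
Purely as an illustrative sketch of such an automatic inventory pass (the helper read_label_at and the position map are assumptions, not part of the patent), the control software might do something like the following:

```python
from typing import Callable, Dict, Optional, Tuple

def scan_inventory(read_label_at: Callable[[float, float], Optional[str]],
                   positions: Dict[str, Tuple[float, float]]) -> Dict[str, Optional[str]]:
    """Visit every slide/reagent position and decode whatever label is found.

    read_label_at(x, y) is an assumed helper returning the decoded label
    string, or None when the position is empty. The result replaces any
    manual entry of slide and reagent locations.
    """
    inventory = {}
    for name, (x, y) in positions.items():
        inventory[name] = read_label_at(x, y)
    return inventory

def detect_new_items(previous: dict, current: dict) -> dict:
    """Items present now that were absent (or different) in the last scan."""
    return {k: v for k, v in current.items()
            if v is not None and previous.get(k) != v}
```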


In an embodiment, the reagent containers and the slides may be provided with an optical identification element. For example, a reagent container may be provided with a reagent optical identification element and a slide may be provided with a slide optical identification element. The reagent identification elements may contain machine readable data concerning the reagent type as well as other relevant data relating to the reagent in the bottle, and the slide identification elements may contain data concerning the tissue sample, such as identification of the patient, the staining protocol, etc. An optical identification element may include reiterated information or perhaps even redundant information. This may include information that is repeated or even partially repeated and may even include information that may or may not be in different versions which may relate to similar information.


The optical identification element, or alternatively optical identification means, may be on or even mounted on the reagent container or on the slides in such a manner that the optical identification element is readable by the optical sensor. By being positioned “on” it is intended that any manner of association be encompassed; thus it should be understood that separate attachment or surface mounting is not required. Similarly, by being “above” it should be understood that this may exist not only in a sense such as with respect to gravity, but also in a figurative sense such as roughly perpendicularly above a surface or the like. In an embodiment, the optical identification element may be readable from above by the optical sensor or alternatively means. Furthermore, the optical identification element may be provided on a label, which is perhaps adhesively attachable to a specific slide or reagent bottle. Hereby, the labels or perhaps adhesive labels may be presented to the optical sensor means on the robotic head above the slides and the reagent bottles, facilitating the reading of the optical identification means. By providing the optical identification means on a printed label which is attached to the slide or the reagent bottle, respectively, individual labels may be prepared on site: the relevant data may be entered into a computer and a corresponding label carrying said relevant data may be printed on an associated label printer.


In an embodiment of the invention, one type of optical identification element may be a two-dimensional high-resolution symbology code, e.g. of the so-called “INFOGLYPH™” type. The optical identification may also be more generically a two-dimensional symbology. Two-dimensional symbology may be representative of data including, but not limited to: tissue sample related data, patient identification data, staining protocol data, reagent related data, reagent type data, reagent volume related data, reagent durability related data, and the like data. By encoding the relevant information into numerous tiny, individual graphic elements, typically small 45° diagonal lines as short as 0.02 mm (1/100 inch), a high-resolution, high-contrast encoded information label may be achieved which is printable on a printer and readable by a high-resolution camera. This type of encoded 2-D symbology label may be provided in different colors and in a variety of materials.
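
The data fields listed above can be thought of as a small structured payload that the 2-D symbology carries. The sketch below is hypothetical and only illustrates the idea of a serialized payload with a simple checksum standing in for the redundancy mentioned earlier; the field names and format are assumptions, and the rendering of the payload into an actual glyph or matrix pattern is omitted.

```python
import hashlib
import json

def build_slide_payload(sample_id: str, patient_id: str, protocol: str) -> str:
    """Serialize slide-label fields; a real system would render this as 2-D symbology."""
    fields = {"sample": sample_id, "patient": patient_id, "protocol": protocol}
    body = json.dumps(fields, sort_keys=True)
    checksum = hashlib.sha1(body.encode()).hexdigest()[:8]  # crude stand-in for redundancy
    return body + "|" + checksum

def parse_slide_payload(payload: str) -> dict:
    """Recover the fields and verify the redundancy check."""
    body, checksum = payload.rsplit("|", 1)
    if hashlib.sha1(body.encode()).hexdigest()[:8] != checksum:
        raise ValueError("label data failed redundancy check")
    return json.loads(body)
```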


Alternatively, the optical identification means or optical identification element may be a data matrix code or even a one-dimensional bar code, i.e. an identification code with a pattern of vertical bars whose width and spacing identify the marked item. An advantage of using an optical sensor capable of reading 2-D symbology is that the apparatus may be capable of reading any kind of optical identifier, as this only requires an adaptation in the software processing the captured, perhaps digital, image.
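
In software terms, supporting several symbologies then reduces to trying a set of decoders against the same captured image. The following is only a schematic illustration of that software adaptation; the decoder functions are placeholders, not real decoding implementations from the patent.

```python
from typing import Callable, List, Optional

# Each decoder returns the decoded string, or None if the image is not its symbology.
# The decoder bodies are placeholders; real ones would perform the image processing.
def try_data_matrix(img: bytes) -> Optional[str]: return None
def try_glyph_code(img: bytes) -> Optional[str]: return None
def try_bar_code(img: bytes) -> Optional[str]: return None

DECODERS: List[Callable[[bytes], Optional[str]]] = [
    try_data_matrix, try_glyph_code, try_bar_code,
]

def decode_any(image: bytes) -> Optional[str]:
    """Run the same captured image through every registered decoder."""
    for decoder in DECODERS:
        result = decoder(image)
        if result is not None:
            return result
    return None  # unreadable or unlabeled
```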


In an embodiment, an optical identification element label may include a two-dimensional (2-D) symbology zone and even at least one human readable text zone, each as conceptually depicted in FIG. 5. Hereby, an extra visual inspection of the label by the operator may be provided for verification of the printed label.


In a more advanced usage of the 2-D image capturing capability, the image processing capability or image processor element may be adapted to identify the texture or outline of the tissue sample itself captured by the optical sensor and may use said image-captured tissue property as an individual identification of the tissue sample. The optical sensor may be configured to identify desired features of the tissue samples such as but not limited to the texture, outline, a visual property, or even an individual feature of a tissue sample. Of course, various different features or properties may be identified as desirable to detect or perhaps identify, a property which may include any attribute, characteristic, or the like. This embodiment could make the use of slide labels obsolete, as the tissue texture itself or at least a predefined section thereof (with or without magnification) could be used as an identifier for a list of data in the control software.
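
Using the tissue's own appearance as an identifier amounts to computing a stable fingerprint from a predefined region of the image and using it as a lookup key. The sketch below is an illustrative assumption only: it uses a crude grid of mean intensities, whereas a real system would presumably use a more robust texture or outline descriptor.

```python
import hashlib
from typing import List

def tissue_fingerprint(pixels: List[List[int]], grid: int = 8) -> str:
    """Reduce a grayscale image (list of pixel rows, values 0-255) to a coarse
    grid of mean intensities and hash it, giving a repeatable identifier."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            ys = range(gy * h // grid, (gy + 1) * h // grid)
            xs = range(gx * w // grid, (gx + 1) * w // grid)
            block = [pixels[y][x] for y in ys for x in xs]
            cells.append(sum(block) // max(len(block), 1))
    return hashlib.md5(bytes(c % 256 for c in cells)).hexdigest()

# The fingerprint could then key a table of sample data in the control software:
# sample_registry[tissue_fingerprint(image)] = {"patient": "...", "protocol": "..."}
```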


In one preferred embodiment of the invention, the optical sensor may be a moveable optical sensor which may be moveable along the areas above the staining and the reagent sections, and said optical sensor may be adapted to determine the presence of and the positions of slides in the at least one staining section. This may be facilitated by having the optical sensor movable in response to or perhaps on a robotic element. Once a new set of slides is loaded into the apparatus, this feature would allow the staining apparatus according to this embodiment of the invention to automatically determine where the slides are positioned so that the optimal scheduling of treatment steps can be calculated. The optical sensor may even determine the approximate location and the approximate area of a tissue sample. Furthermore, this capability may also provide the apparatus control software with a warning if a slide is not correctly positioned or other irregularities have occurred during the loading of the slides.
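
One hypothetical way the control software could use such presence data is sketched below: schedule only the occupied positions and flag an irregular load. The irregularity test here (a gap within a rack) is purely an illustrative stand-in for whatever image-based check an actual system would apply.

```python
from typing import Dict, List, Optional, Tuple

def check_slide_layout(observed: Dict[Tuple[int, int], Optional[str]]
                       ) -> Tuple[List[Tuple[int, int]], List[str]]:
    """observed maps (rack, slot) -> decoded label, or None if the slot is empty.

    Returns the positions to schedule plus warnings for irregular loads.
    """
    to_schedule = [pos for pos, label in observed.items() if label]
    warnings = []
    racks = {r for r, _ in observed}
    for rack in racks:
        slots = sorted(s for (r, s), label in observed.items()
                       if r == rack and label)
        if slots and slots != list(range(slots[0], slots[0] + len(slots))):
            warnings.append(f"rack {rack}: non-contiguous loading, check slide seating")
    return to_schedule, warnings
```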


In another embodiment of the invention, the optical sensor may be adapted to locate pre-selected reference locations for self-calibration of the robotic control system or robotic element controlling the movements of the robotic head. The camera can be used to teach the robotic arm critical locations necessary to calibrate the system, allowing the apparatus to properly position the robotic head at all required positions and locations within the platform work area. If the apparatus has been moved or otherwise been tampered with, e.g. due to maintenance, etc., this feature may provide the staining apparatus according to the invention with the capability of self-calibrating the robotic motion control system, e.g. if the slides are arranged in racks (intended to broadly encompass any locationally tied collection) by checking if the slide rack fits correctly into a receiving element in the apparatus, and/or by determining the position of predefined reference components of the apparatus.


In another embodiment of the invention, the optical sensor may be a camera adapted to record an image of the finalized tissue sample after said tissue sample has been subjected to a staining protocol, for recording an image of the manipulated tissue sample. Hereby, a picture or digital image of the stained tissue sample may be recorded, preferably in a high resolution, for later examination or for sending this digitized picture to a remote location for examination. Accordingly, in embodiments the present invention may provide for storing an image relevant to the process of staining tissue samples. This may include images both before and after staining or some other operation, of course. Also, this feature of the invention may provide for archiving images of the about-to-be-stained or the stained tissue samples for later verification of the tissue sample analysis or identification, should this be required. Thus the invention may automatically facilitate a user activity such as those mentioned. To understand the various possibilities, the automatic facilitation may be of activities including, but not limited to, later accessing a historical image of a stained tissue sample, remotely accessing an image of a stained tissue sample, archiving an image of a stained tissue sample, later accessing a historical image of an unstained tissue sample, remotely accessing an image of an unstained tissue sample, archiving an image of an unstained tissue sample, and the like activities.
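
As a purely illustrative sketch of such archiving (the directory layout, file names, and metadata fields are assumptions, not part of the disclosure), storing a before or after image with its associated data might look like this:

```python
import json
import time
from pathlib import Path

def archive_image(archive_dir: str, sample_id: str, stage: str, image: bytes) -> Path:
    """Store one image ('unstained' or 'stained') plus metadata for later or remote review."""
    root = Path(archive_dir)
    root.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%dT%H%M%S")
    img_path = root / f"{sample_id}_{stage}_{stamp}.png"
    img_path.write_bytes(image)
    meta = {"sample": sample_id, "stage": stage, "captured": stamp,
            "image_file": img_path.name}
    (root / f"{sample_id}_{stage}_{stamp}.json").write_text(json.dumps(meta))
    return img_path
```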


A sensor may be provided in some embodiments that may automatically identify information from one or more slides or reagent containers. In some embodiments, protocol information may be provided by the adaptive sample processing control system. The sample processing system may process one or more slides, or one or more batches of slides, concurrently, sequentially, or in any other temporal fashion, potentially in accordance with protocol information provided by a slide having a sample or provided by the adaptive sample processing control system. Sample batches or individual slides may be inserted or removed during processing protocol steps by the control and monitoring accomplished by the adaptive sample processing control system.


Another embodiment of the present invention that may achieve the foregoing and other objects of invention may comprise a method of sample processing, comprising the steps of: accessing at least one of a plurality of drawers, providing at least one sample carrier retainment assembly configured with at least one sample carrier, configuring at least one of the drawers with the at least one sample carrier retainment assembly, and adaptively processing the sample carriers. The step of adaptive processing may automate the processing of samples and may allow for either or both continuous or batch processing of slides, and may afford multiple independent slide processing and in some embodiments redundant slide processing to process each slide independently.


Embodiments of the invention may further comprise a method of automated sample processing, comprising the steps of: acquiring protocol information, transmitting the protocol information to at least one sample processing system, adaptively processing samples, and acquiring sample processing information from the step of adaptively processing. Furthermore, embodiments may provide: maintaining the protocol information, maintaining the sample processing information, and information sharing of protocol information and sample processing information. These and other method steps may be provided for individual samples or multiple batch processing, sample diagnostic features, and real-time or adaptive capabilities for multiple batch processing.


Many aspects of the invention are applicable to immunohistochemistry (IHC) techniques, as well as in-situ hybridization (ISH), fluorescent in-situ hybridization (FISH), special staining of samples, and microarrays, especially techniques incorporating target retrieval or the staining of samples.


Support should be understood to exist for the following aspects and embodiments of the invention:

    • An automatic staining apparatus comprising:
      • at least one reagent container;
      • at least one sample;
      • a robotic element adapted to affect said reagent container and said sample;
      • a control element to which said robotic element is responsive; and
      • an image-capture 2-D optical sensor configured to two dimensionally image at least one element in said automatic staining apparatus.
    • A method of identifying at least one property in an automatic staining apparatus comprising the steps of:
      • providing at least one sample;
      • providing at least one reagent container;
      • providing a robotic element adapted to affect said reagent container and said sample;
      • optically sensing a two dimensional image of at least one element in said automatic staining apparatus;
      • recording relevant image data; and
      • feeding said image data to a control element to which said robotic element is responsive.
    • A method of staining samples in an automatic staining apparatus comprising the steps of:
      • providing at least one sample;
      • providing at least one reagent container;
      • providing a robotic element adapted to affect said reagent container and said sample;
      • providing an optical sensor responsive to said robotic element and adapted to sense a two dimensional image of at least one element in said automatic staining apparatus;
      • recording relevant image data; and
      • feeding said image data to a control element to which said robotic element is responsive.
    • An automatic staining apparatus comprising:
      • at least one reagent container;
      • at least one sample;
      • a robotic element adapted to affect said reagent container and said sample;
      • a control element to which said robotic element is responsive; and
      • a multifunction optical sensor configured to sense at least one element in said automatic staining apparatus.





BRIEF DESCRIPTION OF DRAWINGS

In the following the invention is described with reference to the accompanying drawings, in which:



FIG. 1 is a schematic perspective view of a staining apparatus according to the preferred embodiment of the invention;



FIG. 2 is a top view of the work area in the staining apparatus shown in FIG. 1;



FIG. 3 is a detailed view of the robotic element in the staining apparatus according to some embodiments of the invention;



FIG. 4 is a top view of a reagent bottle with optical identification means;



FIG. 5 is a microscope slide with an optical identifier label thereon;



FIG. 6 is an example of a lay-out of this label; and



FIGS. 7 to 10 are examples of various kinds of optical identifying means on the slides.





MODE(S) FOR CARRYING OUT THE INVENTION

An automatic staining apparatus 1 according to the invention is shown in FIGS. 1 and 2. The automatic staining apparatus 1 comprises a rectangular frame 4 surrounding a reagent station or section 2 comprising an array of reagent bottle or container compartments, wherein in each compartment a reagent vial or reagent container 3 is placed, and first and second slide sections 5 wherein a number of separate racks 6 are placed, and where each rack 6 comprises a number of microscope slides 7 mounted side by side in the rack 6. A plurality of reagent containers or even slides may be placed in any desired order as an array. In the embodiment shown, each rack may hold up to 8 slides, but the rack may be designed to hold any suitable number of slides. With eight racks arranged side by side, the shown embodiments may hold up to 64 slides 7, each having a sample, e.g. a tissue mounted on the upper side of the slide, so that reagent may be applied from above to the sample on each slide. The sample processed may be any material, but is most likely a biologic material such as a biological sample or a biological specimen, perhaps such as a histological sample, e.g. tissue and cell specimens, cells, collections of cells, or tissue samples, the definition to include cell lines, proteins and synthetic peptides, tissues, cell preps, cell preparations, blood, bodily fluids, bone marrow, cytology specimens, blood smears, thin-layer preparations, and microarrays. It should also be understood to include slide-based biological samples.
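
The rows-of-racks layout maps naturally onto a coordinate lookup that control software could use to position the head over any slide. The sketch below is only an illustration of that mapping; every dimension (origin, rack pitch, slot pitch) is an assumed placeholder value, not a dimension taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SlideLayout:
    origin_x: float = 0.0     # assumed X of rack 0, slot 0 (reference corner)
    origin_y: float = 0.0
    rack_pitch: float = 30.0  # assumed mm between adjacent racks
    slot_pitch: float = 27.0  # assumed mm between adjacent slides in a rack
    racks: int = 8
    slots_per_rack: int = 8   # 8 x 8 = 64 slides, as in the embodiment shown

    def slide_position(self, rack: int, slot: int) -> Tuple[float, float]:
        if not (0 <= rack < self.racks and 0 <= slot < self.slots_per_rack):
            raise IndexError("no such slide position")
        return (self.origin_x + rack * self.rack_pitch,
                self.origin_y + slot * self.slot_pitch)
```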


As mentioned, the present invention may include a robotic element, which may somehow affect the reagent container and tissue sample. Thus any sort of action to, action resulting from, or merely information from the reagent container or tissue sample may be facilitated through the robotic element. The robotic element, in embodiments, may be adapted to perform staining of the slides with (including as a result of or in conjunction with) the reagent application or the like. The robot arm or robotic element 20 may also move reagent from a reagent container to a predetermined tissue sample. For example, a robotic element 20 for moving a probe 10 in X and Y (as well as Z) direction as indicated by the arrows X and Y is arranged above the frame 4 of the staining apparatus. A robot arm may position the probe 10 above all reagent vials 3 as well as above all the slides 7, and may further operate the probe 10 to aspirate portions of reagent contained in any of the vials 3, to transfer the portion of reagent and apply it to any of the slides 7 in order to provide a selected staining or treatment of the sample on each slide 7. A control element may manage a staining process by controlling the entire process or even any portion of it. By use of a suitable control element or alternatively a control means, e.g. a capability within a computer (not shown) having the appropriate software and input data for the purpose, this staining apparatus 1 may be able to automatically stain or treat samples requiring different staining or treatment reagents and processes.


As shown in FIGS. 1 and 3, the probe 10 is accommodated in a robotic head 22 and is manipulated by the robotic element 20. The probe 10 is raised to an upper position (in a Z direction) where it is clear of the vials 3 underneath the probe 10, but the robot may include means or element in the robotic head 22 for lowering the probe 10 in order to dip the probe tip into the content of a selected reagent vial 3 and to aspirate a selected amount of reagent for the selected staining or treatment process. In an embodiment, the present invention may include providing an optical sensor 86 on a robotic element and perhaps moving the optical sensor to a predetermined position through action of the robotic element. As but one example, the robotic head 22 may be provided with an optical sensor 86, perhaps even a CCD camera 25 pointing downwards. An optical sensor may be positioned on or perhaps more broadly in response to the robotic element. After the optical sensor is positioned, image data may be recorded at the location at which the optical sensor is established.


In some embodiments a robotic element 20 or even a robotic head 22 may include a variety of components, including but not limited to a push tool 38 that may be connected to an air cylinder 39, a probe 10 that may be responsive to a probe movement element 36 which may even be connected to a syringe pump 37, and an optical sensor 86 as shown in FIG. 3.


In embodiments, the optical sensor may detect two-dimensional image data of a relevant property. It may also be adapted to sense a two-dimensional image of an element in general. The camera may be utilized to determine status information of the slides and the reagent bottles and other features of the apparatus in the work area, for example reading a code provided on a reagent container to determine the reagent type and the reagent location within the system. The camera may also determine the status of the tissue sample carriers, for example the location of a particular slide, or informational indicia, such as a code, that indicate information about the tissue sample presented on the slide or the processing protocol to be performed. A camera may be used for diagnostic purposes. In some embodiments, the sample may be scanned for further analysis, potentially by a computer. The present invention may include, in embodiments, a computer image biological analysis element or perhaps even biologically analysing image data of a sample with a computer.


As previously discussed, the invention may include recording a variety of relevant image data.


Importantly, this may include recording element calibration reference points, or perhaps even robotic element calibration reference positions on or in the apparatus. As mentioned, the invention may also provide for recording slide identification image data and reagent identification image data. A significant aspect of an embodiment is the possibility of recording an optical identification element of a particular slide or perhaps merely recording information relevant to an element. Such information may include information concerning the tissue sample, of course. Similarly, optical identification may be recorded on a reagent container that may include information concerning the reagent contained therein. It may provide for recording a two-dimensional symbology on a slide or even on a reagent container. Two-dimensional symbology recorded on a slide may represent data including, but not limited to: tissue sample related data, patient identification data, staining protocol data, or the like. Two-dimensional symbology recorded on a reagent container may represent data including, but not limited to: reagent related data, reagent type data, reagent volume related data, reagent durability related data, or the like. It may also provide a connection element through which captured image data may be transferred to the control element. It may include feeding the image data to a control element so that the robotic element may respond. After the relevant image data has been recorded, and perhaps as a result of feeding that data to the control element, the invention may manipulate a staining or other process according to that relevant image data. Thus the invention may perform staining of slides according to tissue specific protocols.
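
The several kinds of recorded image data described above could share a single record shape that the connection element passes to the control element. The sketch below is hypothetical; the record fields and the queue-based hand-off are assumptions made only to illustrate the data flow.

```python
import time
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Tuple

class ImageKind(Enum):
    CALIBRATION_REFERENCE = auto()
    SLIDE_IDENTIFICATION = auto()
    REAGENT_IDENTIFICATION = auto()
    SAMPLE_IMAGE = auto()

@dataclass
class CapturedImage:
    kind: ImageKind
    position: Tuple[float, float]  # (x, y) where the head recorded the frame
    data: bytes                    # raw 2-D image data from the optical sensor
    decoded: dict = field(default_factory=dict)  # e.g. protocol, patient, reagent type
    timestamp: float = field(default_factory=time.time)

def feed_to_control(record: CapturedImage, queue: List[CapturedImage]) -> None:
    """Stand-in for the connection element: hand the record to the control element."""
    queue.append(record)
```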


The staining apparatus 1 of the present embodiment further comprises a probe washing station 8 and a reagent mixer 9, and the robotic element 20 is furthermore arranged to transfer the probe to the washing station 8 as well as to the reagent mixer 9.


As shown in FIG. 4, the reagent bottle 3 may be provided with an area 30 on a surface on which to mount an optical identification element. This optical identifier may be an adhesive label 31 carrying encoded information about the content of the bottle 3, such as reagent type, date of manufacture, expiry date, etc. The encoded information could be in the form of a data matrix code, an INFOGLYPH™ code or any other kind of 2-D code, and could in principle also be a simple 1-D code, i.e. a bar code. Additionally, the label 31 may also be provided with human readable text to aid the operator handling the reagent bottles e.g. during loading of bottles into the staining apparatus.



FIG. 5 shows a slide 7 with a label 71 mounted thereon. One layout of the label 71 is shown in FIG. 6. The label 71 may be an adhesive optical identifier, which may be prepared for the particular slide and printed on a label printer (not shown) or any other suitable printing device. It is even possible that in a particular situation, if a batch of slides is to be subjected to the same treatment, a series of identical labels could be provided for the slides. The label 71 may comprise an area 72 for encoded information about the tissue sample on the slide 7, such as patient data, date and file number, the staining protocol and/or the series of process steps. Furthermore, the label 71 may be provided with one or more rows 73 of human readable text and/or blank space for the laboratory personnel preparing the slides to write on the slide label.


FIGS. 7 to 9 show various kinds of data-encoded symbology for the label 71 (either for the entire label 71 as shown or only for the label area 72 (see FIG. 6)).


In FIG. 7, an example of a 2-D symbology of the INFOGLYPH™ type is shown. This may include perhaps even an information carpet type of symbology. This type of 2-D symbology is advantageous since it can carry a large amount of optically machine-readable information. Making use of a high-resolution camera, this type of symbology may be readable in a high resolution and a large amount of information can be encoded therein. The symbology may be printed with tiny diagonal lines in different directions or perhaps even colors and can easily be read by a CCD camera or the like.



FIG. 8 shows an example of a data matrix code that can be used as an alternative to the INFOGLYPH™ symbology. The data matrix is similarly readable with a CCD camera but may not carry as much data in the encoding as the INFOGLYPH™. However, it is easier to print, as it may require a lower resolution, making it a simple and cost-effective solution if less identification data on the slides and the reagent bottles is required. A yet simpler solution is shown in FIG. 9, where the symbology is a conventional one-dimensional bar code. In principle this means that only a bar code scanner is required for reading the slide and reagent bottle information, but by using a 2-D sensor, the possibility of self-calibration and monitoring the installation of slides and reagents in the staining apparatus may be enhanced.


In an embodiment, the optical identifiers on the slides and on the reagent bottles are the same type. This may facilitate the image processing of the identification process in the staining apparatus.


A different approach to identifying the individual slides, or a way of facilitating the new capabilities of confirming identification or storing confirmatory information, may be to record the contour and/or the texture of the tissue sample 74 itself, such as shown in FIG. 10. Utilizing the high resolution of the image that can be recorded by the camera, the unique features of the tissue sample itself can be used as a graphical identifier of the slide. Furthermore, an image of the stained tissue sample can be recorded so that a digital representation of the tissue sample is produced. This digital image can be sent electronically to remote locations for instant examination and/or archived for later examination. This may provide the staining apparatus with a unique flexibility in use and may introduce new and advantageous methods of analyzing the tissue samples.


Besides identifying the microscope slides and the reagent bottles in the staining apparatus, the 2-D optical sensor can also be used for self-calibration of the apparatus, e.g. after maintenance, or if the apparatus has been disassembled or moved to another location. By identifying critical locations within the apparatus from an image captured by the camera, the image processing software can compare the captured image with a reference image to determine if certain critical components in the apparatus are offset from their predetermined positions, e.g. if a slide rack or a slide is slightly offset, and if so, a set of correction data for the robotic motion control system may be calculated and used for calibrating the apparatus. If the correction needed exceeds a certain size, a warning could be automatically issued to an operator to ensure that the apparatus does not malfunction during the processing of the slides. Furthermore, this image analysis system could also be used for determining if a slide is present or dislocated in the rack in order to produce a warning signal.
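
At its core, that comparison reduces to computing an offset between the expected and the detected position of a reference feature, applying it as correction data, and warning when it is too large. The sketch below is illustrative only; the threshold value and function interface are assumptions rather than parameters from the patent.

```python
from typing import Tuple

def calibration_correction(expected: Tuple[float, float],
                           detected: Tuple[float, float],
                           max_offset_mm: float = 2.0) -> Tuple[float, float]:
    """Offset to add to robot coordinates so the detected reference mark lands
    where the control software expects it; raise a warning if it is too large."""
    dx = expected[0] - detected[0]
    dy = expected[1] - detected[1]
    if (dx * dx + dy * dy) ** 0.5 > max_offset_mm:
        raise RuntimeError("calibration offset exceeds limit - operator check required")
    return dx, dy
```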


By the invention, it is realised that a variety of changes of the above description of some preferred embodiments of the invention may be made without departing from the scope of the invention as set forth in the claims. As can be easily understood, the basic concepts of the present invention may be embodied in a variety of ways. It involves both staining techniques as well as various systems, assemblies, and devices to accomplish staining and other functions. In this application, the staining techniques are also disclosed as part of the results shown to be achieved by the various systems, assemblies, and devices described and as steps that are inherent to utilization. They should be understood to be the natural result of utilizing the devices as intended and described. In addition, while some devices are disclosed, it should be understood that these not only accomplish certain methods but also can be varied in a number of ways. Importantly, as to all of the foregoing, all of these facets should be understood to be encompassed by this disclosure.


The reader should be aware that the specific discussion may not explicitly describe all embodiments possible; many alternatives are implicit. It also may not fully explain the generic nature of the invention and may not explicitly show how each feature or element can actually be representative of a broader function or of a great variety of alternative or equivalent elements. Again, these are implicitly included in this disclosure. Where the invention is described in device-oriented terminology, each element of the device implicitly performs a function. Apparatus claims may not only be included for the device described, but also method or process claims may be included to address the functions the invention and each element performs. Neither the description nor the terminology is intended to limit the scope of the disclosure.


It should also be understood that a variety of changes may be made without departing from the essence of the invention. Such changes are also implicitly included in the description. They still fall within the scope of this invention. A broad disclosure encompassing both the explicit embodiment(s) shown, the great variety of implicit alternative embodiments, and the broad methods or processes and the like are encompassed by this disclosure and may be relied upon to support additional claims for presentation in this or subsequent patent application.


Further, each of the various elements of the invention and claims may also be achieved in a variety of manners. This disclosure should be understood to encompass each such variation, be it a variation of an embodiment of any apparatus embodiment, a method or process embodiment, or even merely a variation of any element of these. Particularly, it should be understood that as the disclosure relates to elements of the invention, the words for each element may be expressed by equivalent apparatus terms or method terms—even if only the function or result is the same. Such equivalent, broader, or even more generic terms should be considered to be encompassed in the description of each element or action. Such terms can be substituted where desired to make explicit the implicitly broad coverage to which this invention is entitled. As but one example, it should be understood that all actions may be expressed as a means for taking that action or as an element which causes that action. Similarly, each physical element disclosed should be understood to encompass a disclosure of the action which that physical element facilitates. Regarding this last aspect, as but one example, the disclosure of a “sensor” should be understood to encompass disclosure of the act of “sensing”—whether explicitly discussed or not—and, conversely, were there effectively disclosure of the act of “sensing”, such a disclosure should be understood to encompass disclosure of a “sensor” and even a “means for sensing”. It should also be understood that in jurisdictions where specific language may be construed as limiting, as but one example in the United States where some interpretations of “means for” elements can be construed narrowly, broader equivalent language (such as “element” or the like) may be used to avoid the narrow interpretation and should be understood as encompassed by this specification. Such changes and alternative terms are to be understood to be explicitly included in the description.


Any patents, patent applications, publications, or other references mentioned in this application for patent are hereby incorporated by reference. In addition, as to each term used it should be understood that unless its utilization in this application is inconsistent with such interpretation, common dictionary definitions should be understood as incorporated for each term and all definitions, alternative terms, and synonyms such as contained in the Random House Webster's Unabridged Dictionary, second edition are hereby incorporated by reference. Finally, any priority case for this application is hereby appended and hereby incorporated by reference.


Thus, the applicant(s) should be understood to have support to claim at least: i) each of the sample processing systems and subsystems as herein disclosed and described, ii) the related methods disclosed and described, iii) similar, equivalent, and even implicit variations of each of these systems, assemblies, devices and methods, iv) those alternative designs which accomplish each of the functions shown as are disclosed and described, v) those alternative designs and methods which accomplish each of the functions shown as are implicit to accomplish that which is disclosed and described, vi) each feature, component, and step shown as separate and independent inventions, vii) the applications enhanced by the various systems or components disclosed, viii) the resulting products produced by such systems or components, and ix) methods and systems, assemblies, devices, and apparatuses substantially as described hereinbefore and with reference to any of the accompanying examples, x) the various combinations and permutations of each of the elements disclosed, xi) each potentially dependent claim or concept as a dependency on each and every one of the independent claims or concepts presented, xii) processes performed with the aid of or on a computer as described throughout the above discussion, xiii) a programmable system as described throughout the above discussion, xiv) a computer readable memory encoded with data to direct a computer comprising means or elements which function as described throughout the above discussion, xv) a computer configured as herein disclosed and described, xvi) individual or combined subroutines and programs as herein disclosed and described, xvii) the related methods disclosed and described, xviii) similar, equivalent, and even implicit variations of each of these systems and methods, xix) those alternative designs which accomplish each of the functions shown as are disclosed and described, xx) those alternative designs and methods which accomplish each of the functions shown as are implicit to accomplish that which is disclosed and described, xxi) each feature, component, and step shown as separate and independent inventions, and xxii) the various combinations and permutations of each of the above.


Further, if or when used, the transitional phrase “comprising” or the like is used to maintain the “open-end” claims herein, according to traditional claim interpretation. Thus, unless the context requires otherwise, it should be understood that the term “comprise” or variations such as “comprises” or “comprising” or the like, are intended to imply the inclusion of a stated element or step or group of elements or steps but not the exclusion of any other element or step or group of elements or steps. Such terms should be interpreted in their most expansive form so as to afford the applicant the broadest coverage legally permissible.


Any claims set forth at any time are hereby incorporated by reference as part of this description of the invention, and the applicant expressly reserves the right to use all of or a portion of such incorporated content of such claims as additional description to support any of or all of the claims or any element or component thereof, and the applicant further expressly reserves the right to move any portion of or all of the incorporated content of such claims or any element or component thereof from the description into the claims or vice-versa as necessary to define the matter for which protection is sought by this application or by any subsequent continuation, division, or continuation-in-part application thereof, or to obtain any benefit of, reduction in fees pursuant to, or to comply with the patent laws, rules, or regulations of any country or treaty, and such content incorporated by reference shall survive during the entire pendency of this application including any subsequent continuation, division, or continuation-in-part application thereof or any reissue or extension thereon.

Claims
  • 1. An automatic staining apparatus comprising: at least one removable reagent container positioned on a reagent rack within a reagent section; at least one slide positioned within a slide section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the reagent section is situated to enable the at least one removable reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process; wherein the robotic element comprises an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; and a control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one removable reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 2. An apparatus according to claim 1, wherein the optical sensor is adapted to locate pre-selected reference features for self-calibration of the robotic element.
  • 3. An apparatus according to claim 1, wherein a sample is placed on the at least one slide, and wherein the optical sensor is adapted to record an image of the finalized sample after said sample has been subjected to the staining process.
  • 4. An apparatus according to claim 3, further comprising: at least one element provided on the at least one removable reagent container and the at least one slide; wherein at least one element comprises an element selected from a group consisting of: a two-dimensional high-resolution symbology code, a datamatrix code, a bar code, an adhesive label, a two dimensional symbology zone, and a human readable text zone.
  • 5. An apparatus according to claim 3, wherein the optical sensor is configured to identify a feature selected from a group consisting of: the texture of the sample, the outline of the sample, a visual property of the sample, and an individual identification feature of the sample.
  • 6. A method of identifying at least one property in an automatic staining apparatus comprising the steps of: providing at least one sample on a slide positioned within a slide section; providing at least one reagent container positioned on a reagent rack within a reagent section; wherein a robotic element is configured to move above the slide section and above the reagent section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the at least one reagent container is added to or removed from the apparatus without interrupting dispensing of at least one reagent during the staining process; providing the robotic element with an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; recording relevant image data; recording calibration reference points of the apparatus; and feeding said image data to a control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 7. A method of staining samples in an automatic staining apparatus comprising the steps of: providing at least one sample on a slide, the slide being positioned in a slide section within slide racks; providing at least one reagent container positioned on a reagent rack within a reagent section; wherein a robotic element is configured to move above the slide section and above the reagent section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the at least one reagent container is added to or removed from the apparatus without interrupting dispensing of at least one reagent during the staining process; providing the robotic element with an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; recording relevant image data; recording calibration reference positions for said slide racks; and feeding said image data to a control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 8. An automatic staining apparatus comprising: at least one reagent container positioned on a reagent rack within a reagent section;at least one sample on a slide, the slide being positioned within a slide section;wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element;wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process;wherein the robotic element comprises an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process and locate pre-selected reference features for self-calibration of the robotic element; anda control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 9. An automatic staining apparatus comprising: at least one reagent container on a reagent rack in a reagent section;at least one first sample contained on a slide in a first slide section;at least one second sample contained on a slide in a second slide section, wherein said first slide section and said second slide section are separated by said reagent section;wherein a robotic element is configured to move above the reagent section and above the first and second slide sections during a staining process and wherein the reagent rack is removable below the plane of the robotic element;wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process; anda control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 10. An automatic staining apparatus comprising: at least one reagent container positioned on a reagent rack within a reagent section;at least one sample placed on a slide, the slide being positioned within a slide section;wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element;wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process;wherein the robotic element comprises an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process and image at least one optical identification element; anda control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the-staining process using the optical identification element and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 11. An apparatus according to claim 10 wherein the optical identification element has reiterated information, said reiterated information comprising multiple instances of reiterated information.
  • 12. An apparatus according to claim 11 wherein said reiterated information comprises redundant information.
  • 13. An apparatus according to claim 11 wherein said optical identification element comprises a two-dimensional high-resolution symbology code.
  • 14. An apparatus according to claim 11 wherein said optical identification element comprises a datamatrix code.
  • 15. An apparatus according to claim 11 wherein said optical identification element comprises a bar code.
  • 16. An automatic staining apparatus comprising: at least one reagent container positioned on a reagent rack within a reagent section; at least one sample on a slide, the slide being positioned within a slide section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process; wherein the robotic element comprises an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; a computer image biological analysis element; and a control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process, wherein the optical sensor records a first image of the at least one sample before staining and records a second image of the sample after staining, and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 17. An apparatus according to claim 16 wherein said optical sensor comprises a camera.
  • 18. An apparatus according to claim 17, wherein said camera comprises a CCD element.
  • 19. A method of identifying at least one property in an automatic staining apparatus comprising the steps of: providing at least one sample, the sample being placed on a slide in a removable slide rack, the slide rack being positioned within a slide section; providing at least one reagent container positioned on a reagent rack within a reagent section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the at least one reagent container is added to or removed from the apparatus without interrupting dispensing of at least one reagent during the staining process; providing the robotic element with an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; recording relevant image data; feeding said image data to a control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container; and biologically analysing image data of said at least one sample with a computer.
  • 20. A method according to claim 19, wherein said optical sensor comprises a camera.
  • 21. A method according to claim 20, wherein said camera comprises a CCD element.
  • 22. A method of staining tissue samples in an automatic staining apparatus comprising the steps of: providing at least one removable sample on at least one slide positioned within a slide section; providing at least one reagent container positioned on a reagent rack within a reagent section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the at least one reagent container is added to or removed from the apparatus without interrupting dispensing of at least one reagent during the staining process; providing the robotic element with an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; recording relevant image data; feeding said image data to a control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container; and biologically analysing image data of said at least one sample with a computer.
  • 23. A method according to claim 22, wherein said optical sensor comprises a camera.
  • 24. A method according to claim 22, wherein said optical sensor comprises a CCD element.
  • 25. A method according to claim 22, further comprising a step of storing an image relevant to the staining process.
  • 26. An automatic staining apparatus comprising: at least one reagent container positioned on a reagent rack within a reagent section; at least one sample, the sample being placed on a slide positioned within a slide section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process; wherein the robotic element comprises a multifunction optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; a computer image biological analysis element; and a control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 27. An apparatus according to claim 26, wherein said optical sensor comprises a camera.
  • 28. An apparatus according to claim 26, wherein said optical sensor comprises a CCD element.
  • 29. An apparatus according to claim 26, further comprising a stored image relevant to the staining process.
  • 30. An automatic staining apparatus comprising: at least one removable reagent container positioned on a reagent rack within a reagent section; at least two staining sections separated by the reagent section; at least one sample placed on a slide in a slide rack, the slide rack being positioned within the staining sections; wherein a robotic element is configured to move above the reagent section and above the staining sections during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the reagent section is situated to enable the at least one reagent container to be added to or removed from the apparatus without interrupting the movement of the robotic element during dispensing of at least one reagent during the staining process; wherein the robotic element comprises an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; and a control element to which the robotic element is responsive, the control element configured to monitor insertion or removal of the at least one removable reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
  • 31. A method of staining tissue samples in an automatic staining apparatus comprising the steps of: providing at least one slide positioned within a slide section; providing at least one removable reagent container positioned on a reagent rack within a reagent section; wherein a robotic element is configured to move above the reagent section and above the slide section during a staining process and wherein the reagent rack is removable below the plane of the robotic element; wherein the at least one reagent container is added to or removed from the apparatus without interrupting dispensing of at least one reagent during the staining process; providing the robotic element with an optical sensor configured to automatically identify new slides and reagent bottles loaded into the apparatus during the staining process; recording relevant image data; and feeding said image data to a control element to which said robotic element is responsive, the control element configured to monitor insertion or removal of the at least one removable reagent container during the staining process and to continue movement of the robotic element and dispensing of at least one reagent during insertion or removal of the at least one removable reagent container.
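A common thread of claims 6 through 10 and 16 through 31 above is that reagent racks can be inserted or removed below the plane of the robotic element while a control element keeps the robot moving and dispensing. The following minimal, non-limiting sketch (Python) shows one way such concurrent monitoring could be organized; the names ReagentSection, monitor_reagent_section, and staining_loop are hypothetical and are not part of the claimed subject matter.

```python
# Non-limiting sketch: a control element watches the reagent section for rack
# insertion/removal while the robotic element keeps dispensing uninterrupted.
import threading
import time
from dataclasses import dataclass, field


@dataclass
class ReagentSection:
    """Tracks which reagent racks are currently seated below the robot plane."""
    racks: dict = field(default_factory=dict)              # position -> rack id
    lock: threading.Lock = field(default_factory=threading.Lock)

    def insert(self, position, rack_id):
        with self.lock:
            self.racks[position] = rack_id                  # rack slides in under the robot

    def remove(self, position):
        with self.lock:
            self.racks.pop(position, None)


def monitor_reagent_section(section, stop_event, on_change):
    """Control-element thread: detect insertion/removal without pausing the robot."""
    with section.lock:
        known = dict(section.racks)
    while not stop_event.is_set():
        with section.lock:
            current = dict(section.racks)
        if current != known:
            on_change(known, current)                       # e.g. trigger optical re-identification
            known = current
        time.sleep(0.05)                                    # polling; a sensor interrupt would also work


def staining_loop(section, stop_event, steps):
    """Robotic element keeps moving and dispensing regardless of rack changes."""
    for slide_id, reagent in steps:
        if stop_event.is_set():
            break
        # Move above the slide section, aspirate from whichever rack holds `reagent`,
        # and dispense onto the slide; never blocked by the monitor thread.
        print(f"dispensing {reagent} onto slide {slide_id}")
        time.sleep(0.2)


if __name__ == "__main__":
    section = ReagentSection()
    stop = threading.Event()
    watcher = threading.Thread(
        target=monitor_reagent_section,
        args=(section, stop, lambda old, new: print("rack change:", old, "->", new)),
    )
    watcher.start()
    section.insert("A1", "rack-0007")        # hot insertion while the run is active
    staining_loop(section, stop, [("S1", "hematoxylin"), ("S2", "DAB")])
    section.remove("A1")                     # hot removal; dispensing was never paused
    time.sleep(0.2)                          # give the monitor time to report the removal
    stop.set()
    watcher.join()
```

In this arrangement the dispensing loop never waits on the monitor; the monitor only reports changes so that, for example, a newly inserted rack can be re-imaged and identified.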
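Claims 4 and 10 through 15 above contemplate an optical identification element, such as a datamatrix code or bar code, carrying reiterated and therefore redundant information. The sketch below is a minimal, non-limiting illustration of how reiterated copies read from one camera frame could be reconciled by majority vote; decode_symbols is a hypothetical placeholder for whatever symbology reader the installed camera provides, and resolve_identity is likewise an assumed helper name.

```python
# Non-limiting sketch: reconcile reiterated copies of an optical identification element.
from collections import Counter
from typing import List, Optional


def decode_symbols(image) -> List[str]:
    """Hypothetical wrapper around a datamatrix/bar-code reader; returns all decoded strings."""
    raise NotImplementedError("replace with the reader supplied with the camera hardware")


def resolve_identity(decoded: List[str]) -> Optional[str]:
    """Treat the reiterated copies as redundant information: accept the majority reading."""
    if not decoded:
        return None                        # nothing readable: flag the slide/container as unknown
    value, votes = Counter(decoded).most_common(1)[0]
    # Require agreement of more than half of the copies before trusting the identification.
    return value if votes > len(decoded) / 2 else None


# Example with already-decoded strings, one copy misread by the sensor:
readings = ["SLIDE-00042", "SLIDE-00042", "SLIDE-0OO42"]
print(resolve_identity(readings))          # -> "SLIDE-00042"
```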
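Claims 3 and 16 above involve recording an image of the sample before staining and a second image after staining, so that a computer image biological analysis element can assess the finalized sample. The following minimal, non-limiting sketch shows one crude way such a before/after comparison might be quantified; the functions mean_optical_density and stain_uptake are assumed names, and any real analysis would be considerably more involved.

```python
# Non-limiting sketch: compare pre- and post-staining images via a crude optical-density measure.
import numpy as np


def mean_optical_density(image: np.ndarray) -> float:
    """Approximate mean optical density from an 8-bit grayscale image of the sample area."""
    gray = image.astype(np.float64)
    # Avoid log(0); 255 is taken as the unstained, fully transmitting reference level.
    return float(np.mean(-np.log10(np.clip(gray, 1, 255) / 255.0)))


def stain_uptake(before: np.ndarray, after: np.ndarray) -> float:
    """Difference in mean optical density between the post- and pre-staining images."""
    return mean_optical_density(after) - mean_optical_density(before)


# Synthetic example: the stained image is darker, so the uptake measure is positive.
pre = np.full((64, 64), 230, dtype=np.uint8)
post = np.full((64, 64), 120, dtype=np.uint8)
print(f"stain uptake (delta OD): {stain_uptake(pre, post):.3f}")
```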
Parent Case Info

This application is the United States National Stage of International Application No. PCT/US2003/040518, filed Dec. 19, 2003, which claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Application No. 60/435,601, filed Dec. 20, 2002, each hereby incorporated by reference herein.

PCT Information
Filing Document: PCT/US03/40518; Filing Date: 12/19/2003; Country: WO; Kind: 00; 371(c) Date: 6/14/2005
Publishing Document: WO2004/058950; Publishing Date: 7/15/2004; Country: WO; Kind: A
Related Publications (1)
Number Date Country
20060088928 A1 Apr 2006 US
Provisional Applications (1)
Number Date Country
60435601 Dec 2002 US