This application incorporates by reference three-dimensional (3D) images in the form of movies in .wmv format contained on compact discs filed concurrently herewith. Each compact disc is being filed in duplicate. The 3D images represent specimens upon which extended-depth confocal microscopy has been employed in accordance with an example embodiment of the present invention. The specimens are transgenic mouse tissue specimens imaged at high spatial resolution and at significant depths and volumes.
The following files are contained on the compact discs:
To view the images, one must use Microsoft Windows Media Player, version 10 or equivalent.
Ever since van Leeuwenhoek developed the first microscope nearly 400 years ago, scientists have wanted to use a microscope to view the fine details of complex structures. However, microscopes can generally reveal structures only at or near the surface of specimens. The ability to observe below the surface of a specimen has remained limited, made possible to some extent only by a histologist's ability to slice the specimen into thin slices, thereby bringing deep structures to the surface. In so doing, the precise spatial relationships between structures within the slices are altered, making it difficult or impossible to describe these relationships within the intact specimen. The 1980s brought the confocal microscope and the ability to image specimens emitting fluorescent light up to 100 microns deep. This was followed in the 1990s by two-photon microscopy, which extended the range to 300 microns. An advanced and expensive application of two-photon microscopy allows imaging up to 1 mm, but light scattering still limits the resolution at which structures may be viewed. Light scattering may thereby eliminate the ability to resolve fine structures, such as cellular details. Ultimately, it is cellular details that are of most interest to microscopists.
The summary that follows describes some of the example embodiments included in this disclosure. The information is proffered to provide a fundamental level of comprehension of aspects of this disclosure.
An example embodiment of the present invention includes a system and corresponding method for generating a three-dimensional image of a specimen comprising: an objective, optical elements, a sectioning device, a programmable stage, a programmable focus controller, and a sensor. The objective may be spaced a distance from the specimen at which at least part of the specimen is within the in-focus plane of the objective. The optical elements may direct incident light from a light source along an incident light path to multiple regions of the in-focus plane of the objective. Directing light to the multiple regions may illuminate separate portions of the specimen, and the emitted light may include separate beams of emitted light corresponding to those specimen portions. Directing light may also include serially directing incident light to each region to illuminate separately each specimen portion within a corresponding one of the regions, which may include scanning the specimen with the incident light to sequentially illuminate separate portions of the specimen. The multiple regions of the in-focus plane of the objective may have a thickness substantially equal to a depth of field of the objective. The incident light may cause the specimen, at the in-focus plane, to produce emitted light responsive to the incident light. The optical elements may also direct the emitted light along a return light path.
The sectioning device may be configured to section the specimen. The programmable stage may be in an operative arrangement with the objective and sectioning device and configured to support and move the specimen. The specimen may be moved relative to the objective to image at least one area of the specimen and relative to the sectioning device to section the specimen in a cooperative manner with the sectioning device. The programmable focus controller may change the distance between the objective and programmable stage to move the in-focus plane of the objective within the specimen. The sensor may be in optical communication with the return light path to detect the emitted light from the multiple regions of the in-focus plane of the objective and to generate signals representative of detected emitted light.
The programmable stage (stage) may reposition the specimen relative to the objective to bring an area of the specimen previously outside the field of view of the objective to within the field of view of the objective. The stage may position the specimen relative to the objective to produce partial overlap between three-dimensional images of contiguous areas of the specimen in at least one of two perpendicular dimensions. The partial overlap may be in the X-axis or Y-axis.
Another example embodiment of the system and method may further include a programmable focus controller to change the distance between the stage and the sectioning device to define how much depth of the specimen is to be sectioned, which may be less than the imaging depth to produce partial overlap in contiguous three-dimensional images of the same field of view before and after sectioning. The programmable focus controller may also move the objective relative to the stage, or vice versa, to change the distance between the objective and specimen to bring more portions of the specimen within the in-focus plane of the objective.
The system and method may include an image and sectioning tracker to determine a distance and tilt between the in-focus plane of the objective and a sectioning plane of the sectioning device to support accurate imaging and sectioning. The image and sectioning tracker may also determine the position of the surface of the specimen after sectioning to use as a reference in a next imaging and sectioning.
The system and method may further include an imaging controller configured to cause the objective to image contiguous areas of the specimen with partial overlap and to cause the programmable stage to move in a cooperative manner with the sectioning device to section the specimen between imaging of the contiguous areas. The imaging controller may also cause the programmable stage to repeat the imaging and sectioning a multiple number of times. The contiguous areas are contiguous in the X-, Y- or Z-axis relative to the objective.
The system and method may further include a reconstruction unit, an identification unit, a feature matching unit, an offset calculation unit, and a processing unit. The reconstruction unit may be used to reconstruct multiple three-dimensional images based upon multiple sets of two-dimensional images based on signals representative of the detected light. The identification unit may identify features in the multiple three-dimensional images. The feature matching unit may determine matching features in contiguous three-dimensional images. The offset calculation unit may calculate offsets of the matching features to generate an alignment vector or matrix. The processing unit may process the contiguous three-dimensional images as a function of the alignment vector or matrix to generate adjusted data representing an adjusted three-dimensional image.
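By way of illustration only, the offset calculation described above may be sketched in software. The following minimal Python example (the disclosure does not specify a particular algorithm; phase correlation is used here as one concrete stand-in for the feature matching and offset calculation units) estimates an integer alignment vector between two overlapping three-dimensional image stacks:

```python
import numpy as np

def alignment_offset(stack_a, stack_b):
    """Estimate the integer (z, y, x) offset of stack_a relative to
    stack_b by phase correlation, an illustrative stand-in for the
    feature matching and offset calculation units described above."""
    f_a = np.fft.fftn(stack_a)
    f_b = np.fft.fftn(stack_b)
    cross = f_a * np.conj(f_b)
    cross /= np.abs(cross) + 1e-12          # keep phase information only
    corr = np.fft.ifftn(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Offsets past half the stack size wrap around to negative shifts.
    return tuple(int(p) - s if int(p) > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))

# Example: stack_b is stack_a circularly shifted by (1, 2, 3) voxels.
rng = np.random.default_rng(0)
stack_a = rng.random((16, 32, 32))
stack_b = np.roll(stack_a, shift=(1, 2, 3), axis=(0, 1, 2))
print(alignment_offset(stack_b, stack_a))  # (1, 2, 3)
```

In practice, the offset calculation unit could equally be feature-based; the recovered vector would then be applied by the processing unit to bring contiguous three-dimensional images into register.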
The system and method may further include a display unit to display the adjusted three-dimensional image.
The system and method may further include a transmit unit to transmit data, representing two-dimensional images, representing layers of the specimen within the imaging depth of the objective, via a network to a reconstruction server to reconstruct a three-dimensional image of the specimen at a location in the network apart from the sensor. A data storage unit may also be used to store data representing the two-dimensional or three-dimensional images.
The system and method may further include an imaging controller configured to cause the programmable stage to move the specimen to the sectioning device or to cause a different programmable stage, in operative relationship with the sectioning device, to move the sectioning device to the specimen.
The system and method may further include a storage container and a reporting unit. The storage container may store sections removed from the specimen to enable a person or machine to identify aspects of the sections and generate a correlation of the aspects of the sections with images representing layers in the specimen. The reporting unit may report results of the correlation. The system may also include a staining unit to enable the person or machine to stain the sections removed from the specimen to correlate the sections stored with the respective images of the sections.
In an example embodiment of the present invention, the sectioning device oscillates a blade relative to a blade holder in a substantially uni-dimensional manner.
In accordance with the present invention, the objective and programmable stage are components of a microscope selected from a group consisting of: an epifluorescence microscope, a confocal microscope, and a multi-photon microscope. Additionally, the sensor may detect fluorescent light emitted by the specimen at select wavelengths of a spectrum of the emitted light. The sensor may also include a detector selected from a group consisting of: a photo-multiplier tube (PMT) and a solid-state detector, such as a photo-diode or a charge-coupled device (CCD) array. The “specimen” may be tissue from a human, an animal, or a plant.
Another example embodiment of the present invention includes a method for providing data for healthcare comprising: generating a three-dimensional image of a specimen from a patient by reconstructing multiple two-dimensional images of layers of the specimen and transmitting data representing the three-dimensional image via a network to the patient or a person associated with the healthcare for the patient. The “patient” may be a human, animal, or plant.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
A description of example embodiments of the invention follows.
Investigations into the mechanisms underlying neural development, such as growth and differentiation, are enhanced by an ability to develop images of neural structure at a microscopic level. To label cells selectively, neuronal tracers can be injected at specific sites in the nervous system. In addition, transgenic mice are available that express fluorescent proteins in subsets of neurons. Techniques for imaging fluorescent structures in thick specimens include confocal and multi-photon microscopy; however, light scattering limits the depth at which signals can be acquired with high resolution. Electron microscopy and standard histology techniques overcome the limitations due to light scattering. Nevertheless, these techniques are not commonly used to reconstruct images of structures in thick specimens because of the difficulty of collecting, aligning, and segmenting serial sections. A need remains for improved techniques to image three-dimensional cellular structures in thick specimens.
Further, current techniques for imaging a large tissue volume rely largely on use of a tissue slicing device to render the volume into thin slices, each of which can be imaged separately. Imaging thin slices is necessary using current techniques because, as mentioned above, the light used to generate the image penetrates only a short distance into a specimen; therefore, structures located below the specimen surface cannot be visualized until they are brought to the surface by removal of structures above the structures of interest. Images from each thin slice may then be reconstructed into a three-dimensional volume using computer applications. A problem with such techniques is that, on sectioning the specimen, the resulting slice can be significantly distorted, leaving the images of such slices with no consistent spatial relationship to one another. This distortion renders three-dimensional reconstruction of the volume difficult or impossible if the structure is complex, and even when the structure can be rendered, the three-dimensional reconstruction may be inaccurate or incomplete.
Described herein are example embodiments of a system and corresponding method that are designed to facilitate imaging and sectioning (e.g., slicing) of large volumes of biological tissue specimens in a way that allows for seamless three-dimensional reconstruction of the tissue volume. Reconstruction of large tissue volumes is of value and interest to scientists, for example, to increase understanding of spatial relationships and prospects for functional interaction between cells and their processes. Example embodiments of the present invention are of major significance because they allow scientists to understand the organization of large numbers of cells in their natural configuration and provide an ability to perform high-resolution spatial mapping of large three-dimensional tissue volumes.
Embodiments of the present invention described herein address shortcomings of current techniques used to generate three-dimensional images of structures in a thick specimen by providing a novel approach to developing images of thick specimens using a combination of a laser scanning microscope system and a sectioning device. The approach is based on block face imaging of a specimen. An example embodiment of the present invention is based on a development of a miniature microtome and the use of precision programmable stages to move the specimen relative to the microtome or vice versa and realign the specimen with respect to an imaging system. Imaging through use of the example embodiment or other example embodiments as presented herein is flexible and promises to be useful in basic research investigations of synaptic connectivity and projection pathways and also useful in other contexts, such as hospitals, physician offices, pathology laboratories, central diagnostic facilities, and so forth. Images of specimen fluorescence may be developed at the resolution limit of a light microscope using very high Numerical Aperture (NA) objectives. A system and method according to example embodiments of the present invention may also be used to reconstruct images of cellular structures in different organs, for example, muscle and liver.
An example embodiment of the present invention overcomes problems due to sectioning (e.g., slicing) specimens by imaging tissue of interest (also referred to herein as “sections”) before it is sectioned. By doing so, all structures within the tissue retain their original spatial relationship with one another. After imaging into the volume of the tissue of interest, a slice may be removed from the top (i.e., imaging side of the tissue of interest) that is physically thinner than the depth of tissue that was imaged. The slice may be discarded or put through a staining process whose results may then be compared to an image of the slice. The newly exposed tissue surface may then be re-imaged and, subsequently, another tissue section may be taken off the top. Three-dimensional reconstruction of the large tissue volume is possible in this circumstance because: a) the tissue block face is much less prone to distortion due to sectioning, so adjacent structures retain their original spatial relationship to one another and alignment of adjacent series of images can be performed, and b) sets of images are effectively “thicker” than the tissue slice removed, so adjacent sets of images overlap one-another and edge structures appear in adjacent image series. Because edge structures appear in adjacent image series, alignment and reconstruction of the tissue volume can be performed. Additionally, an example embodiment of the present invention may be employed using existing microscope systems. An example embodiment of the present invention does not require that the specimen be cleared, meaning the specimen is not subjected to a process to dehydrate the specimen by replacing water with a polar solvent in the specimen. Hence, the specimen may be imaged in its natural configuration.
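The image-then-section cycle described above may be illustrated with a minimal sketch. In the following Python example, the imaging depth (70 microns) and slice thickness (50 microns) are assumed values chosen only to show how imaging deeper than the cut produces overlapping image stacks:

```python
# Assumed numbers for illustration only: each imaging pass covers
# 70 microns of depth, while each cut removes a 50-micron slice,
# leaving a 20-micron overlap between successive image stacks.
IMAGING_DEPTH_UM = 70
SLICE_THICKNESS_UM = 50

def stack_z_ranges(n_cycles):
    """Return the absolute (top, bottom) depth range, in microns,
    imaged on each image-then-section cycle."""
    ranges = []
    surface = 0  # depth of the current block face
    for _ in range(n_cycles):
        ranges.append((surface, surface + IMAGING_DEPTH_UM))
        surface += SLICE_THICKNESS_UM  # sectioning exposes a new face
    return ranges

print(stack_z_ranges(3))  # [(0, 70), (50, 120), (100, 170)]
# Consecutive stacks share 20 microns of depth, so edge structures
# appear in adjacent image series and can be used for alignment.
```

Because each pair of consecutive stacks shares a slab of tissue, structures in the overlap region anchor the alignment and reconstruction of the full volume.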
Example embodiments of a system or method in accordance with the present invention enable high-resolution three-dimensional imaging and reconstruction of specimens having small or large volumes, where the actual volume that can be imaged is limited only by the size of the structure that can be mounted for sectioning and by computer power and memory for imaging and reconstruction.
Examples of specimens include biological specimens of interest, such as animal or human brain (or part thereof) or skeletal muscle (or part thereof). The system and methods may be used on any soft tissue or structure that can be sectioned and imaged, including most animal or human tissues and organs and also plant “tissues.”
Information gleaned from rendered three-dimensional images may be used to gain new insight into spatial relationships between component cells of the tissue of interest and can thereby promote a new and deeper understanding of the way in which cells interact to produce a functional system. A system and method of the present invention may be used in a research laboratory to provide information on the organization of normal cell systems in a controlled environment and also allow for an investigation of cell organization in pathological or abnormal situations in research animals or in tissues surgically removed from animals or humans for subsequent processing and visualization in laboratory and non-laboratory environments. Examples of such use include, but are not limited to: examination and reconstruction of cancer cells invading host tissue, benign and malignant growths in relationship to host structures, tissue damaged by trauma or usage, and congenitally abnormal tissues and structures.
While the embodiments discussed herein are detailed using examples involving animal tissue, an example embodiment of the present invention may also be entirely suitable for similar purposes in reconstructing spatial details and relationships in tissues from plants, bryophytes, fungi, lichens, etc. Further, the present invention may be useful as a means for providing the data to enable detailed three-dimensional reconstruction of any specimen that is soft enough and of a consistency that it may be sectioned and imaged. An example of such a usage may be in the sectioning, imaging and subsequent three-dimensional reconstruction of a piece of fabric, perhaps showing details of damaged fibers and reliable data on the trajectory of a penetrating object, such as a bullet or blade. In short, an example embodiment of the present invention may be used with any soft tissue specimen removed from an animal, human, or plant.
In brief, an example embodiment of the present invention provides a programmable stage that, in addition to its normal use in microscopy, may be used as an integral component of (i.e., may operate in a cooperative manner with) a specimen sectioning device that removes surface portions (e.g., sections) of the specimen. The thickness of the surface portions that are removed may be selected by changing the distance between the specimen and the sectioning device using the programmable focus controller. Changing the position of the sectioning plane of the sectioning device in relation to the specimen may include moving the sectioning device in the Z-axis relative to the specimen or moving the specimen in the Z-axis relative to the sectioning device using the programmable stage.
Use of a programmable microscope stage may allow for removal of surface portions in a controlled and automated manner and may also allow the user (e.g., person or machine) to reposition the specimen precisely under the microscope objective to image the regions or areas of the specimen previously imaged or to be newly imaged. For an example embodiment of the present invention to be automated, a specimen bath may be included to allow for the specimen to be submerged in a fluid. The specimen bath may also be used for collecting sections for further processing (e.g., staining) and analysis. The thickness of the portions of the specimen that are imaged may be greater than the thickness of the portions that are removed, allowing overlap between successive image stacks of the same regions (see
The way in which the selected surface portions of the specimen are removed by the sectioning device may vary. In an exemplary embodiment, the sectioning device may be mounted in a fixed position, and the specimen may be moved on a programmable stage to the sectioning device. Alternatively, the specimen may be in a fixed position on the microscope stage, and the sectioning device may be directed on a programmable stage to the specimen. Some embodiments of the present invention do not require physical modifications of an existing microscope system; software control for automation of imaging and sectioning need only be implemented in a modified or new form. Some example embodiments may be employed with any confocal or multi-photon microscope system that has an upright stand and a programmable stage because the sectioning device is sufficiently small to work with most if not all of today's motorized stage microscope systems without modification.
In
The nosepiece 104 may hold one or more microscope objectives 105, which allows for easy selection of each microscope objective 105. In
When the incident light illuminates the specimen at the in-focus plane of the objective 105, fluorescence emission occurs, and at least a portion of the emitted light is received and directed to the scanhead 103. The scanhead 103 may include a detector (not shown) that detects the emitted light and, in turn, produces a corresponding electrical signal, which may be captured and processed to render two-dimensional (2D) images (not shown) of multiple (for example, 100) layers of a section of the specimen corresponding to the number of movements of the in-focus plane of the objective 105 within the section of the specimen. The set of 2D images may themselves be rendered into a three-dimensional (3D) image. It should be understood that the rendering of the 2D or 3D images may be performed internally in the microscope system 100, if it is configured with an image processor for such a purpose, locally at a computer in communication via a wired, wireless, or fiber optic communications bus or link, or remotely via a network (not shown). Once the image (or just raw data) is captured, a thin section of the specimen may be removed from the block surface of the specimen by moving the microscope-based programmable stage 113 from an imaging location beneath the microscope objective 105 toward the manipulator and blade assembly 115 at a section removal location. The manipulator and blade assembly 115 may be connected to another motorized stage 117 for local movement, optionally in X, Y, or Z coordinate axes 101, or global movement to move the manipulator and blade assembly 115 to the specimen at the imaging area for sectioning. The sectioning device 115 may also be attached to the nosepiece 104, and sectioning may thus occur immediately adjacent to the imaging location.
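The rendering of a set of 2D layer images into a 3D image, as described above, may be sketched as follows. This is an illustrative Python fragment only; the layer count and image dimensions are assumed, and the layers are taken to be equally spaced 2D arrays of identical shape:

```python
import numpy as np

def stack_layers(layers):
    """Assemble 2D layer images (top to bottom) into one 3D image
    volume indexed as (z, y, x), as a renderer might do internally."""
    return np.stack(layers, axis=0)

# For example, 100 layers (matching the example layer count in the
# text) of assumed 64 x 64 pixel images:
layers = [np.zeros((64, 64)) for _ in range(100)]
volume = stack_layers(layers)
print(volume.shape)  # (100, 64, 64)
```

The resulting volume may be rendered locally, at a computer connected by a communications link, or remotely via a network, as the text notes.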
Once a section of desired thickness has been removed from the specimen, the microscope-based programmable stage 113 may return the specimen to its original position under the objective 105, and the process of imaging and sectioning may be repeated until all areas, optionally in X, Y, or Z coordinate axes 101, of interest for the investigation have been imaged.
The objective 105 may be coupled to a programmable focus controller (not shown), which is configured to change the distance between the objective 105 and programmable stage 113 to move the in-focus plane of the objective 105 within the specimen.
Both programmable stages 113, 117 may include X-, Y- and Z-axis substages configured to move the specimen in at least one respective axis. In some embodiments, the Z-axis substage may position the in-focus plane within the specimen during imaging or a blade 116 of the sectioning device 115 within the specimen during sectioning, within a tolerance of 1 micron or other suitable tolerance. It should be understood that the tolerance may be based on mechanical, electrical, sampling, or other forms of error contributing to system tolerance.
Continuing to refer to
Conventional confocal and two-photon microscope systems are unable to acquire high-resolution images more than approximately 100 microns and 300 microns deep into tissue, respectively. Image quality deteriorates quickly at greater depths due to light scattering by overlying tissue. With traditional histology methods, tissue can be cut into a series of thin sections (typically 3 microns to 5 microns) which are then stained and imaged. The alignment of images is difficult, however, because of section warping. Methods that directly image a block surface eliminate the need for image alignment. In surface imaging microscopy (SIM), tissue is labeled with fluorescent dyes and embedded in resin. The block surface is repeatedly imaged and sectioned using a fluorescence microscope equipped with a wide-field camera and an integrated microtome, for example, with a glass or diamond knife. An advantage of the SIM technique is that the axial resolution can be made the same as the resolution of the light microscope in the X and Y coordinate axes. A disadvantage is that while some dyes remain fluorescent after tissue is dehydrated and embedded, GFP does not. Another existing method uses a two-photon laser to serially image and ablate a tissue specimen. A major disadvantage of the two-photon laser method is its speed because, in its current configuration, the maximum scan rate is limited to 5 mm per second. Tissue is ablated typically in 10-micron sections. Thus, the time required to remove 70 microns of tissue in a 1 mm by 1 mm square is at least 23 minutes. In contrast, high-resolution imaging and sectioning of a large tissue specimen by employing an example embodiment of the present invention is done in significantly less time, such as less than 5 minutes for a 1 cm by 1 cm block.
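The 23-minute estimate above can be reproduced arithmetically. The following Python sketch assumes the laser raster-scans 1-mm lines spaced 1 micron apart (the spacing is an assumption; the text gives only the 5 mm per second scan rate, the 10-micron ablation depth per pass, and the resulting time):

```python
# Reproducing the timing estimate for the two-photon ablation method.
scan_rate_mm_per_s = 5.0      # maximum scan rate from the text
area_mm = 1.0                 # 1 mm x 1 mm square
line_spacing_um = 1.0         # assumed raster line spacing
section_um = 10.0             # tissue ablated per pass
total_depth_um = 70.0         # total tissue to remove

lines_per_pass = area_mm * 1000 / line_spacing_um   # 1000 lines
path_per_pass_mm = lines_per_pass * area_mm         # 1000 mm of travel
passes = total_depth_um / section_um                # 7 ablation passes
total_s = passes * path_per_pass_mm / scan_rate_mm_per_s
print(round(total_s / 60, 1))  # 23.3 (minutes)
```

Under these assumptions the method needs roughly 23 minutes for a 1 mm by 1 mm area, consistent with the figure quoted in the text.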
The specimen, e.g., brain tissue that is fixed and embedded in agarose, may be positioned on a suitably configured microscope stage. The specimen may be directed on the programmable microscope stage to the position of the sectioning device, which may include a manipulator and blade assembly that may be driven by a programmable stage, as illustrated in
In another embodiment of the present invention, the specimen is fixed with a stronger fixative, such as glutaraldehyde. This fixative may stiffen the cellular structure of the specimen, which may be bound together weakly by connective tissue. Furthermore, multiple fixatives applied together or in sequence may achieve the desired stiffness while having certain optical advantages, for example, reduced autofluorescence. For example, a muscle specimen may be fixed with a mixture of paraformaldehyde and glutaraldehyde. The muscle specimen then has adequate stiffness and optical characteristics to allow both sectioning and imaging.
The ability to remove portions of the specimen in sections with constant thickness depends on the type of tissue and the thickness to be cut. Fixation adequate for intended cutting therefore varies. For example, stronger fixation may be required for muscle versus brain. The variability in section thickness may also depend on cutting speed; however, variability in section thickness may be difficult to predict. In any case, the quality of sectioning may be improved by drawing the specimen over the sectioning device slowly, for example, at roughly 3 min per cm to 4 min per cm.
The fixation may be applied to the specimen as generally well known in the art, such as by immersing the specimen in an aqueous solution of the fixative, removing the specimen from the solution, post-fixing the specimen, then rinsing and embedding the specimen in agarose. Solutions of fixatives suitable for use according to an example embodiment of the present disclosure are known, and an example is described in the Exemplifications Section herein below.
Now that an example system, description of imaging and sectioning of a specimen, and results of imaging have been presented, details of the system, including a network embodiment, and methods for use thereof are presented in reference to
Continuing to refer to
The scanning/de-scanning mechanism 629 may divide the in-focus plane 623 of the objective 627 into a plurality of regions 639 (e.g., 512×512 grid regions) and serially direct the incident light beam 625 to each region 639. For illustrative purposes, a collection of regions 639 are shown in a top view of the in-focus plane 623 at an enlarged scale. An object tile 621 of the specimen, which may be positioned in a region 639 of the in-focus plane 623, may absorb incident light beams 625 and emit fluorescence light 632. Although the in-focus plane 623 is identified as a plane, it should be understood that the in-focus plane 623 actually has a thickness proportional to the depth of field of the objective 627. Likewise, each region 639 has a thickness t (i.e., a distance from top to bottom), which may be proportional to the depth of field of the objective 627 and extends into the specimen up to an imaging depth, as described in reference to
Continuing to refer to
There are several embodiments of a general method of creating three-dimensional images of thick specimens in accordance with the present invention. The specimen may be positioned in the optical field of view and may be visualized using fluorescence optics. As described supra, the scanning/de-scanning mechanism 629 may divide the in-focus plane 623 of the objective 627 into a plurality of grid regions (regions) 639. The regions 639 may be any sort of regular pattern, as desired, that is suitable for imaging the specimen. Moreover, any equivalent means of dividing an in-focus plane 623 of an objective 627 of a laser scanning microscope system 620 into a plurality of grid, discrete or continuous regions 639 conducive to imaging the specimen may also be employed. In one embodiment, the grid regions 639 of the in-focus plane 623 of the objective 627 are of a thickness proportional to the depth of field of the objective 627.
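The serial direction of the incident beam to each grid region may be sketched as a simple raster enumeration. The following illustrative Python fragment (the 512×512 grid size is taken from the example above; the raster order itself is an assumption, since any regular pattern may be used) yields the regions one at a time:

```python
def serial_scan(n_rows=512, n_cols=512):
    """Yield (row, col) indices of grid regions in raster order;
    each yielded region would be illuminated in turn and its
    emitted fluorescence detected to build one 2D image."""
    for row in range(n_rows):
        for col in range(n_cols):
            yield (row, col)

# A small 2 x 3 grid, for illustration:
print(list(serial_scan(2, 3)))
# [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
```

Any equivalent enumeration of discrete or continuous regions conducive to imaging the specimen could be substituted for the raster order shown here.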
Continuing to refer to
In summary,
Manipulations of an example embodiment of the present invention may be used to change the in-focus plane as desired, and new grid regions may be established in the new plane of focus so that other select regions of the specimen may be excited by the incident light. Serial manipulations may be used to change the in-focus plane, thereby allowing sequential imaging, as desired, of portions of the specimen in each succeeding in-focus plane.
The blade 968 in the blade holder 960 may be used to remove a portion of the thickness of the volume of a specimen, which includes cutting a section in an oscillatory manner (e.g., in a substantially linear direction with regard to the blade holder 960). The blade 968 may be configured to cut sequential sections of the specimen with thicknesses between about 1 micron and 50 microns, 1 micron and 10 microns, and 2 microns and 4 microns. The blade 968 may be moved relative to the specimen or the blade holder 960 within a tolerance of less than 1 micron in the Z-axis. The fasteners 973 and pins 975, 979 are used for example purposes only; any appropriate means of fastening, securing, or interconnecting the components of the blade holder 960 or manipulator 981 known by one skilled in the art may be employed. As an alternative embodiment, the blade 968 may include a non-vibrating diamond or glass blade to cut sequential sections of the specimen embedded in wax or resin with thicknesses between 50 nm and 200 nm, or 0.5 microns and 5 microns. The blade 968 may be moved relative to the specimen or the blade holder 960 within a tolerance of less than 50 nm.
Client computer(s)/devices 1053 and server computer(s) 1054 provide processing, storage, and input/output devices executing application programs and the like. Client computer(s)/devices 1053 can also be linked through communications network 1055 to other computing devices, including other client devices/processes 1053 and server computer(s) 1054. For example, a client computer 1053 may be in communication with an imaging station 1051, which transmits raw data or 2D or 3D image data 1052 to the client computer 1053. The client computer 1053 then directs the raw data or 2D or 3D image data 1052 to the network 1055. Additionally, a 3D reconstruction server 1054 may receive 2D images 1056 from the network 1055, from which it reconstructs a 2D or 3D image(s) 1057 that is sent via the network 1055 to a 3D image display unit on a client computer 1053. Communications network 1055 can be part of a remote access network, a global network (e.g., the Internet), a worldwide collection of computers, local area or wide area networks, and gateways that currently use respective protocols (TCP/IP, Bluetooth, etc.) to communicate with one another. Other electronic device/computer network architectures are suitable.
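To make the client-to-server exchange concrete, the Python sketch below shows one possible wire format for shipping a 2D image from a client computer 1053 to a reconstruction server 1054 over TCP/IP: a fixed-size length prefix, a small JSON header, then compressed pixel bytes. The format and field names are illustrative assumptions, not part of the disclosed system:

```python
import json
import struct
import zlib

# Hypothetical wire format: 8-byte big-endian prefix giving header and
# payload lengths, then a JSON header with image dimensions, then the
# zlib-compressed pixel bytes.

def encode_image_message(pixels: bytes, width: int, height: int) -> bytes:
    """Pack raw pixel bytes and dimensions into a transmittable message."""
    header = json.dumps({"w": width, "h": height}).encode()
    payload = zlib.compress(pixels)
    return struct.pack(">II", len(header), len(payload)) + header + payload

def decode_image_message(message: bytes):
    """Recover the pixel bytes and dimensions from a received message."""
    header_len, payload_len = struct.unpack(">II", message[:8])
    header = json.loads(message[8:8 + header_len])
    pixels = zlib.decompress(message[8 + header_len:8 + header_len + payload_len])
    return pixels, header["w"], header["h"]
```

Any framing that survives the transport (here a length prefix over a stream socket) would serve equally well; the point is only that 2D image data 1056 can be serialized on one host and reconstructed on another.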
In one embodiment, the processor routines 1071 and 2D data images 1073 or 3D data images 1074 are a computer program product (generally referenced 1071), including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROM's, CD-ROM's, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the invention system. Computer program product 1071 can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection. In other embodiments, the invention programs are a computer program propagated signal product 1057 embodied on a propagated signal on a propagation medium (e.g., a radio wave, an infrared wave, a laser wave, a sound wave, or an electrical wave propagated over a global network such as the Internet, or other network(s)). Such carrier medium or signals provide at least a portion of the software instructions for the present invention routines/program 1071.
In alternative embodiments, the propagated signal is an analog carrier wave or digital signal carried on the propagated medium. For example, the propagated signal may be a digitized signal propagated over a global network (e.g., the Internet), a telecommunications network, or other network. In one embodiment, the propagated signal is a signal that is transmitted over the propagation medium over a period of time, such as the instructions for a software application sent in packets over a network over a period of milliseconds, seconds, minutes, or longer. In another embodiment, the computer readable medium of computer program product 1071 is a propagation medium that the computer system 1053 may receive and read, such as by receiving the propagation medium and identifying a propagated signal embodied in the propagation medium, as described above for computer program propagated signal product.
Generally speaking, the term “carrier medium” or transient carrier encompasses the foregoing transient signals, propagated signals, propagated medium, storage medium and the like.
For example, the present invention may be implemented in a variety of computer architectures.
Another embodiment of the present invention employs a nosepiece (not shown; see, e.g., nosepiece 104).
While this invention has been particularly shown and described with reference to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.
Transgenic Mice. Mice that expressed cytoplasmic YFP under the neuron-specific Thy1 promoter (YFP-H line) or both cyan fluorescent protein (CFP) and YFP (cross of CFP-S and YFP-H lines) were used for all experiments (protocol approved by the Faculty of Arts and Sciences' Institutional Animal Care and Use Committee, IACUC, at Harvard University). Adult and neonatal mice were anesthetized by subcutaneous injection of a mixture of ketamine and xylazine (17.39 mg/ml K, 2.61 mg/ml X; dose = 0.1 ml/20 g). For fixation of brain, mice were transcardially perfused with 3% paraformaldehyde. For fixation of muscle, mice were perfused with a mixture of 2% paraformaldehyde and 0.75% glutaraldehyde. The stronger fixation allowed muscle to be cut with minimal tearing. Brain was post-fixed for at least 3 hours before being removed from the skull. Muscle was surgically removed and post-fixed for 1 hour. The tissue was thoroughly rinsed in PBS (3 times, 15 minutes per rinse). Muscle was then incubated with Alexa-647-conjugated α-bungarotoxin (2.5 micrograms per ml for 12 hours at 4° C.; Invitrogen) to label acetylcholine receptors and rinsed thoroughly with PBS. Finally, the tissue was embedded in 8% low-melting-temperature agarose, and the agarose block was attached to a polylysine-coated slide using super glue. Care was taken to keep the agarose hydrated with PBS to prevent shape changes due to drying.
Imaging. Tissue specimens were imaged using a multi-photon microscope system (FV1000-MPE on a BX61 upright stand, Olympus America, Inc.) equipped with a precision XY stage (Prior) and a high-NA dipping cone objective (20x 0.95 NA XLUMPFL20XW, Olympus America, Inc.). Image stacks were acquired from just below the cut surface of the block to a depth determined by the light scattering properties of the fixed tissue, typically 50 microns to 100 microns for confocal imaging. The field of view was enlarged by acquiring tiled image stacks. The position of each image stack was controlled precisely by translating the block on the programmable microscope stage. The overlap between tiled stacks was typically 2%. The center coordinates of each image stack were recorded to allow repeat imaging of the same regions. CFP and YFP were excited with the 440 nm and 514 nm laser lines, respectively. The receptor labeling was excited with 633 nm laser light. The channels were imaged sequentially.
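Given a typical 2% overlap between adjacent tiles, the stage targets for a tiling pattern follow directly from the field size. A minimal Python sketch of that calculation (the grid layout, units, and function name are illustrative assumptions, not taken from the instrument software):

```python
def tile_centers(n_x, n_y, field_um, overlap_frac=0.02):
    """Center coordinates for an n_x-by-n_y grid of tiled image stacks.

    Adjacent tiles are spaced so that they overlap by `overlap_frac`
    of the field width, matching the ~2% overlap used for stitching.
    Returns (x, y) stage offsets in microns, row by row.
    """
    step = field_um * (1.0 - overlap_frac)
    return [(ix * step, iy * step) for iy in range(n_y) for ix in range(n_x)]
```

Recording these centers, as described above, is what permits repeat imaging of the same regions after each sectioning pass.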
Sectioning. Sections were cut by drawing the block under an oscillating-blade cutting tool, using the programmable stage to move the block relative to the cutting tool in a controlled and precise manner. The block was raised and lowered relative to the blade (High Profile 818 Blade, Leica Microsystems) by adjusting the microscope focus. The focus position was recorded after each slice. Section thickness was controlled by changing the focus (i.e., stage height) a known amount relative to the recorded position. The precision of the sectioning was determined by moving the block back under the objective and imaging the cut surface. The programmable stage made it straightforward to move back to the same region repeatedly. If the cutting speed was slow (approximately 3 to 4 minutes per centimeter), the sectioning was very consistent. Sections were cut reliably as thin as 25 microns. The cut surface was within 2 microns of the expected height. Blade chatter was roughly 2 microns to 4 microns for brain and 10 microns for muscle. The sections were typically discarded but could be collected for further analysis or processing if required.
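The thickness-control bookkeeping above can be expressed in a few lines: each cut height is the previously recorded focus position plus one section thickness, and each cut surface is then verified against its expected height. A Python sketch under those assumptions (function names are hypothetical):

```python
def plan_cuts(start_focus_um, section_um, n_sections):
    """Stage (focus) heights at which successive sections are cut.

    Each cut raises the block by one section thickness relative to the
    focus position recorded after the previous slice.
    """
    heights = []
    focus = start_focus_um
    for _ in range(n_sections):
        focus += section_um
        heights.append(focus)
    return heights

def within_tolerance(measured_um, expected_um, tol_um=2.0):
    """Check the imaged cut surface against the expected height.

    The default 2-micron tolerance matches the agreement reported above.
    """
    return abs(measured_um - expected_um) <= tol_um
```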
Image Alignment. Large volumes were reconstructed seamlessly from image stacks that overlapped in X, Y and Z directions. After acquiring one set of tiled image stacks, a section was removed from the top surface of the block that was physically thinner than the depth that was just imaged. Structures that were imaged deep in the first set of image stacks were then re-imaged near the surface in the second set. This process of imaging and sectioning was repeated until all structures of interest were completely visualized. There was very little distortion as a result of sectioning; therefore, precision alignment was straightforward. Montages were created by stitching together the sets of tiled image stacks (overlapping in X and Y). A final 3D image was produced by merging the successive montages (overlapping in Z). The tiled stacks were aligned by identifying a structure that was present at an edge of two adjacent stacks in any image plane. The image stacks were merged by shifting one relative to the other in X and Y and discarding data from one or the other stack where there was overlap. Successive montages were merged by discarding image planes from the bottom of the first montage that overlapped with the planes at the top of the next montage. The montages were then aligned by shifting the first plane of the second montage relative to the final plane of the first montage. The remaining planes of the second montage were aligned automatically by applying the same shift as for the first plane.
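The Z-merge step described above, discarding the duplicated bottom planes of the first montage and applying a single XY shift to every plane of the second, can be sketched as follows. For illustration each plane is a sparse dict of `{(x, y): value}`; the data structure and function name are assumptions, not the authors' software:

```python
def merge_montages(first, second, n_overlap, shift):
    """Merge two successive montages (lists of Z planes, top to bottom).

    Planes at the bottom of `first` that duplicate the top of `second`
    are discarded, and every plane of `second` is translated by the
    (dx, dy) shift found by aligning its first plane to the last kept
    plane of `first`. Planes are dicts of {(x, y): value} here purely
    for illustration.
    """
    kept = first[:len(first) - n_overlap]      # drop duplicated bottom planes
    dx, dy = shift
    shifted = [{(x + dx, y + dy): v for (x, y), v in plane.items()}
               for plane in second]            # same shift applied to all planes
    return kept + shifted
```

Applying one shift to all planes of the second montage is what makes the alignment automatic once the first plane has been registered by eye or by feature matching.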