This invention generally relates to the imaging of objects and, more particularly, to devices and methods for obtaining both two and three-dimensional images of objects for use in inspection and analysis of the objects.
There are currently numerous non-invasive imaging techniques that can be used to produce images of a given object for use in inspection, analysis, and the like. Such techniques include X-rays, magnetic resonance imaging (“MRI”), computed tomography (“CT” or “microtomography”) scans, ultrasound, and optical imaging using structured light, among others.
As an example, definitive diagnosis of cancers such as breast cancer is typically accomplished through the surgical removal (e.g., biopsy) of the suspicious tissue (e.g., lesion) by a surgeon for further examination by a radiologist and/or pathologist. After a surgeon has appropriately identified a location of a possible lesion, the surgeon proceeds to excise tissue (e.g., object) that includes the lesion and then verify that the entirety of the suspicious area is within the margins of the excised tissue. In this regard, a radiologist or the like will often x-ray or otherwise image the excised tissue specimen from multiple views (e.g., orthogonal views) to confirm appropriate tissue margins. Once the tissue margins have been confirmed, the surgeon may then appropriately mark or otherwise indicate where on the excised tissue specimen a pathologist should focus during subsequent analysis and diagnosis.
In the event that the area of interest is too close or even contacts the tissue margins, the surgeon may need to excise additional tissue. Accordingly, it is important for the radiologist and surgeon to have confidence from the various images of the tissue specimen that the tissue margins are sufficient and that all potentially cancerous or worrisome tissue is fully contained within the specimen to limit the number of further tissue excisions.
As another example, objects such as printed circuit boards and other electrical devices are known to include a substrate with numerous tiny external electrical bonds whereby an electrical lead is soldered to the device. Because these kinds of bonds are often on the order of a few microns in diameter, they typically cannot be inspected with the naked eye and thus must be magnified in some manner to inspect the bonds and other components for flaws and the like. Manufacturers thus often x-ray such electrical devices to identify any internal defects in the devices as part of nondestructive testing (NDT) of the devices.
Disclosed herein is a cabinet imaging system that can automatically obtain and provide two and three-dimensional digital images of various types of objects (e.g., tissue specimens, animals, electrical devices, etc.) for use in analysis thereof in a manner free of manual repositioning of the objects as well as free of movement of an electromagnetic radiation source and detector within or relative to the housing of the cabinet. While much of this disclosure will be in the context of objects such as tissue specimens, it is to be understood that the disclosed cabinet imaging system and related methods can be used to obtain orthogonal and reconstructed three-dimensional images of objects in various other contexts such as medical (e.g., small animals), manufacturing (e.g., electrical devices), research, security, defense, and the like.
In one aspect, a cabinet for use in obtaining images of an object includes a housing having a plurality of walls that surround an interior chamber, an imaging detector positioned relative to the housing, a source of electromagnetic radiation (e.g., x-ray tube or the like) positioned relative to the housing and that is configured to emit a beam of electromagnetic radiation along a first axis towards the imaging detector, an object receiving surface disposed within the interior chamber for receiving an object thereon, and a motion control mechanism for moving the object receiving surface along a second axis relative to the source and the first axis, where the first and second axes are non-parallel and non-perpendicular.
For instance, the motion control mechanism may include a first linear drive that is configured to move the object receiving surface along the second axis. The first linear drive may include a sliding member that is configured to slide along the second axis, and where the object receiving surface is interconnected to the sliding member. The motion control mechanism may also include a rotary drive that is interconnected to the object receiving surface and the sliding member of the first linear drive, where the rotary drive is configured to rotate the object receiving surface about a rotational axis that is perpendicular to the first axis.
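Because the first and second axes are non-parallel and non-perpendicular, a single stroke of the first linear drive simultaneously changes the object's distance from the source (and thus its magnification) and its lateral position relative to the beam. The following is a minimal Python sketch of that decomposition under an assumed oblique angle between the axes; the function name and the example values are illustrative only, not system parameters:

```python
import math

def decompose_travel(d_mm: float, theta_deg: float) -> tuple[float, float]:
    """Split a stage move of d_mm along the oblique second axis into its
    component along the beam axis (changing magnification) and its
    component across the beam axis (changing lateral position)."""
    theta = math.radians(theta_deg)  # assumed angle between the two axes
    return d_mm * math.cos(theta), d_mm * math.sin(theta)

along_beam, across_beam = decompose_travel(d_mm=40.0, theta_deg=30.0)
print(f"toward source: {along_beam:.1f} mm, across beam: {across_beam:.1f} mm")
```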
In one arrangement, the housing may include a false floor within the interior chamber that divides the interior chamber into a first chamber and a second chamber, where the object receiving surface is disposed within the first chamber. For instance, at least a portion of the motion control mechanism may be disposed in the second chamber and/or in the first chamber.
In another aspect, a method for use in imaging an object in a cabinet includes operating a motion control apparatus in a cabinet to move an object within the cabinet along a first axis relative to a source of electromagnetic radiation and an imaging detector from a first position on the first axis to a second position on the first axis, and triggering the source to emit a beam of electromagnetic radiation along a second axis through the object in its second position towards the detector, where the first and second axes are non-parallel and non-perpendicular.
Various refinements may exist of the features noted in relation to the various aspects. Further features may also be incorporated in the various aspects. These refinements and additional features may exist individually or in any combination, and various features of the aspects may be combined. In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following descriptions.
For a more complete understanding of the present invention and further advantages thereof, reference is now made to the following Detailed Description, taken in conjunction with the drawings, in which:
Reference will now be made to the accompanying drawings, which assist in illustrating the various pertinent features of the various novel aspects of the present disclosure. In this regard, the following description is presented for purposes of illustration and description. Furthermore, the description is not intended to limit the inventive aspects to the forms disclosed herein. Consequently, variations and modifications commensurate with the following teachings, and skill and knowledge of the relevant art, are within the scope of the present inventive aspects.
With initial reference to
The system 100 generally includes a shielded imaging cabinet 200, a computing system 300 (e.g., server, desktop computer, etc., including processor(s), memory, etc.), and one or more peripherals 400 electrically interconnected to the computing system 300 such as input devices 404 (e.g., keyboard, mouse), output devices 408 (e.g., monitor), and the like. The computing system 300 may generally be configured to receive input from a technician, physician, or the like regarding an object to be imaged (e.g., patient information, object information, etc.) and store the same, initiate an imaging procedure based at least in part on the received input (e.g., trigger an x-ray source to emit x-rays through the object for receipt at an x-ray detector), move an object imaging platform on which the object is disposed into one or more positions within the cabinet 200 (as discussed more fully below), receive and process signals from the x-ray detector, and generate various 2D and 3D images of the object for presentation to the physician or the like (e.g., on output device/monitor 408) for use in tissue margin verification. The computing system 300 may allow the physician or the like to view the 2D and 3D images on a screen and slice through the 3D image at almost any position to see internal details of the same.
While the computing system 300 is illustrated as being separate from the cabinet 200, the computing system 300 may in other arrangements be appropriately combined with the cabinet 200 into a single unit. In other arrangements, the computing system 300 may be disposed remote from the cabinet 200, such as in a separate room or even geographically remote and in communication therewith by one or more networks (e.g., LAN, WAN, Internet), or may be distributed among a plurality of computing systems (e.g., servers, networks, etc.). In any case, all references to “computing system” or similar herein are intended to encompass one or more processors or processor cores that are configured to execute one or more sets of computer-readable instructions to carry out the various determinations and functionalities disclosed herein (e.g., determining a position of an object within the interior chamber 208 of the cabinet 200, triggering motion control apparatus 500 to move the object within the cabinet based on the determined position, triggering electromagnetic radiation source 220 to emit one or more electromagnetic radiation beams 222 through the object, generating image data sets based on electromagnetic radiation beams received at detector 224, and the like, as discussed herein).
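By way of non-limiting illustration, the following Python sketch models these computing-system responsibilities as a simple controller; the class and method names (CabinetController, XRaySource.emit(), and so on) are assumptions made for illustration rather than the actual software of system 100, and all hardware interaction is stubbed out:

```python
from dataclasses import dataclass, field

class XRaySource:
    """Stand-in for source 220; a real driver would pulse the x-ray tube."""
    def emit(self) -> None:
        print("source 220: emitting beam 222 along axis 244")

class Detector:
    """Stand-in for detector 224; a real driver would return pixel data."""
    def read_frame(self) -> list:
        return [[0.0]]  # placeholder frame

@dataclass
class MotionControl:
    """Stand-in for motion control apparatus 500 (linear + rotary drives)."""
    position_mm: float = 0.0
    angle_deg: float = 0.0
    def translate_to(self, mm: float) -> None:
        self.position_mm = mm
    def rotate_to(self, deg: float) -> None:
        self.angle_deg = deg % 360.0

@dataclass
class CabinetController:
    """Coordinates the components, as computing system 300 is described to do."""
    source: XRaySource = field(default_factory=XRaySource)
    detector: Detector = field(default_factory=Detector)
    motion: MotionControl = field(default_factory=MotionControl)
    def acquire(self) -> list:
        self.source.emit()
        return self.detector.read_frame()
```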
Broadly, the cabinet 200 includes a housing 204 that generally defines an interior chamber 208 for receiving an object (e.g., tissue specimen) on an object receiving surface 216 of an object holder 212 (e.g., platform, table, stage, etc.) that is movable within the interior chamber 208 relative to a source 220 of electromagnetic radiation (e.g., beam 222) and an imaging detector 224. The imaging detector 224 is configured to receive electromagnetic radiation emitted from the source 220 after passing through an object (not shown) received on the object receiving surface 216. In one arrangement, the object holder 212 may include one or more walls that extend upwardly away from the object receiving surface 216 to form a container for the object. The object holder 212 may be constructed from any appropriate radiolucent or low radio-density material (e.g., as one example, polymeric foam) to substantially eliminate or at least reduce attenuation of the beam of electromagnetic radiation passing through the object holder 212; this arrangement thus substantially eliminates or at least reduces the likelihood of the object holder 212 appearing in an image of the object and correspondingly increases the quality (e.g., contrast, resolution, etc.) of the image (e.g., for use in verifying tissue margins, identifying suspicious locations or areas in the excised tissue specimen to be subsequently analyzed by a pathologist, and/or the like).
The housing 204 may generally include any appropriate arrangement of walls 228, electromagnetic shielding (e.g., lead sheets, etc.), brackets, and other componentry (not all shown in the interest of clarity) to define the interior chamber 208, limit electromagnetic radiation from escaping or leaking from the housing 204, and non-movably secure the source 220 and detector 224 relative to the housing 204 (i.e., the source 220 and detector 224 are non-movable relative to the walls 228, brackets, etc. of the housing 204 during imaging procedures). Furthermore, the housing 204 includes a shielded access member 232 (e.g., door) that is movable between an open position (as shown in
With reference now to
In the context of cancer diagnosis and the like, it is important for an excised tissue specimen to remain in a substantially constant shape and/or a substantially undisturbed position with respect to some particular reference point or device (e.g., relative to a tray or carrier used to transport the specimen) from excision up to and including pathologist diagnosis. For instance, reshaping of the tissue specimen (e.g., compressing, folding, etc.) between the taking of first and second orthogonal images (e.g., for use in tissue margin detection) through manual repositioning of the specimen by a technician or the like can make accurate tissue margin analysis difficult or even impossible. Furthermore, obtaining three-dimensional images of specimens has become an important technique for use in tissue margin verification; it involves obtaining a plurality of images about an outer periphery of the specimen and then reconstructing (e.g., through digital processing) the plurality of images into a three-dimensional data set and a corresponding three-dimensional image which can be manipulated by a technician or the like to analyze the tissue margins. While some existing imaging cabinets include electromagnetic radiation sources and/or detectors that move relative to a specimen that remains stationary within the cabinet for use in obtaining two-dimensional orthogonal images and three-dimensional specimen images, such existing cabinets require componentry, space, and the like to accommodate the moving sources and/or detectors, which can result in a larger footprint of the cabinet, among other inefficiencies.
In this regard, the cabinet 200 includes a motion control mechanism or apparatus 500 for moving an object received or placed on the object receiving surface 216 relative to the source 220, the detector 224, and the axis 244 along which the beams 222 are emitted from the source 220 to the detector 224. In one embodiment, the motion control mechanism 500 may include a rotary drive 504 that is configured to rotate the object holder 212 (and thus the object receiving surface 216 and an object received thereon) about a rotational axis 508 that is substantially perpendicular to the axis 244 along which the beams 222 travel.
For instance, the rotary drive 504 may include a motor 510 that is configured to rotate a shaft assembly 512 in first and/or second directions about the rotational axis 508 under control of the computing system 300. The shaft assembly 512 may be rigidly or non-movably attached to the object holder 212 in any appropriate manner such that rotation of the shaft assembly 512 induces simultaneous corresponding rotation of the object holder 212 (and thus the object receiving surface 216 and the object placed thereon) about the rotational axis 508.
In operation, and after an object has been placed on the object receiving surface 216, the computing system 300 may trigger the source 220 to emit a beam 222 of electromagnetic radiation along the axis 244 through the object for receipt at the detector 224, whereupon the received electromagnetic radiation signals may be appropriately processed by the computing system 300 or the like for generation of an image of the object with the object in a first rotational position. As discussed above, orthogonal and/or three-dimensional imaging of the object may be used to verify tissue margins in the case of tissue specimens, detect defects in the case of electrical devices, and the like. In this regard, the computing system 300 may trigger the motion control apparatus 500 to rotate the object receiving surface 216 and object by 90° about the rotational axis 508 from the first rotational position to a second rotational position and then trigger the source 220 to emit a beam 222 of electromagnetic radiation along the axis 244 through the object for receipt at the detector 224 for generation of another (orthogonal) image of the object with the object in the second rotational position.
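A minimal sketch of this acquire-rotate-acquire flow follows, assuming hypothetical acquire and rotate_to callables that would be bound to the actual source/detector and rotary drive 504 (the names and the stand-in lambdas are illustrative only):

```python
from typing import Callable, Iterable, List

def orthogonal_sequence(
    acquire: Callable[[], object],
    rotate_to: Callable[[float], None],
    angles: Iterable[float] = (0.0, 90.0),
) -> List[object]:
    """One projection per rotational position (e.g., first and second positions)."""
    images = []
    for angle in angles:
        rotate_to(angle)          # rotary drive 504 about rotational axis 508
        images.append(acquire())  # source 220 fires; detector 224 reads out
    return images

# Example with stand-in hardware functions:
frames = orthogonal_sequence(acquire=lambda: "frame", rotate_to=lambda a: None)
print(len(frames))  # 2 orthogonal views
```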
Additionally or alternatively, the computing system 300 may trigger the motion control apparatus 500 to rotate the object receiving surface 216 and object and simultaneously trigger the source 220 to emit a beam 222 of electromagnetic radiation along the axis 244 through the object as it is rotating about the rotational axis 508. The computing system 300 may be configured to receive and process detected electromagnetic radiation signals from the detector 224 as the object is rotating about the rotational axis 508 to generate a plurality of two-dimensional images (e.g., several times per second or more) which may then be reconstructed by the computing system 300 or the like into a three-dimensional data set and a corresponding three-dimensional image of the object. The three-dimensional images can be used in combination with or separate from the two-dimensional images as part of tissue margin verification, defect detection, and the like.
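The reconstruct-from-rotation principle can be illustrated with the short Python sketch below using NumPy and recent scikit-image; note that this is a simplified parallel-beam approximation (radon/iradon filtered backprojection), whereas the cone beam 222 described herein would in practice call for a cone-beam algorithm such as FDK, and the phantom is a stand-in object:

```python
import numpy as np
from skimage.transform import radon, iradon

# Stand-in "object": a rectangular block in an otherwise empty field.
phantom = np.zeros((128, 128))
phantom[40:90, 50:80] = 1.0

theta = np.linspace(0.0, 180.0, 180, endpoint=False)  # one view per degree
sinogram = radon(phantom, theta=theta)                # simulated 2D projections
reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")
print(reconstruction.shape)  # a reconstructed cross-sectional slice
```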
In some situations, a maximum outline of the object may not substantially fill the area of the beam (e.g., where the area of the beam extends within a reference plane that is substantially parallel to the first and second side walls 236, 248 of the housing 204 and perpendicular to the beam axis 244) due to the size or dimensions of the object, due to the positioning of the object receiving surface 216 relative to the beam axis 244, and/or the like, which may result in inaccurate or distorted images of the object. In another characterization, the centroid of the object may not substantially intersect the beam axis 244, or the centroid may substantially intersect the beam axis 244 but the object may be positioned too far away from the source 220 to obtain images of an appropriate magnification.
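For context, the dependence of magnification on object position follows the standard point-source projection relation, in which magnification equals the source-to-detector distance (SDD) divided by the source-to-object distance (SOD); this is general imaging geometry rather than a parameter of the disclosed system, and the distances below are illustrative:

```python
def magnification(sdd_mm: float, sod_mm: float) -> float:
    """Point-source projection: detector distance over object distance."""
    return sdd_mm / sod_mm

# Moving the object halfway toward the source doubles the projected size:
print(magnification(sdd_mm=600.0, sod_mm=300.0))  # 2.0x
print(magnification(sdd_mm=600.0, sod_mm=150.0))  # 4.0x
```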
In this regard, the motion control apparatus 500 (under control of the computing system 300) may be configured to linearly move the object receiving surface 216 and object thereon along an axis relative to the source 220, the detector 224, and the beam axis 244 so as to move the centroid of the object into substantial intersection with the beam axis 244 and/or to move the object closer to the source 220 for use in obtaining higher quality images of the object. In one arrangement and as shown in
The motor or driving mechanism of the linear drive 516 may be rigidly fixed to the housing 204 (e.g., to the walls 228 or other fixed structures) in any appropriate manner. As shown in
To allow the motion control apparatus 500 to be connected to the object holder 212 across the false floor 260, the false floor 260 may include an elongated opening or slot 262 extending along a longitudinal axis 264 that is disposed within a reference plane (not shown) along with the axis 524 of the linear drive 516. In this regard, a portion of the object holder 212 and/or the shaft assembly 512 (as shown, the shaft assembly 512) may be configured to slide within the slot 262 as the motion control apparatus 500 moves the sliding member 520 and thus the object receiving surface 216 (via the shaft assembly 512) along their respective axes 524, 217.
In the embodiment of
To more fully understand the various functionalities of the disclosed system, additional reference will now be made to
The system 100 may initially be powered on and calibrated 604 and appropriate object information (e.g., patient name and ID number, body portion from where specimen excised, part number, etc.) may be inputted 608 into the cabinet 200 and/or computing system 300 in any appropriate manner. For instance, a power switch 280 may be manipulated by a user into an “on” position and a screen 282 on the cabinet (or in other location) may provide a status of the cabinet 200 (e.g., calibrating, ready, in use, etc.). Also for example, one or more of the input devices 404 may be manipulated to input patient information regarding the imaging procedure into the computing system 300 which may be displayed on the output device 408 in any appropriate manner.
The object O may eventually be placed 612 into the interior chamber 208 (e.g., the first interior chamber 2081). For instance, the access member 232 may be opened into the position shown in
In one arrangement, the method 600 may include starting 616 the imaging procedure and proceeding to take 620 orthogonal images of the object O and present the same on the output device/monitor 408 for review. For instance, after the system has been calibrated and the object O placed into the interior chamber 208, the technician may initiate the orthogonal imaging by using the input device(s) 404 in any appropriate manner to trigger the computing system 300 to conduct the orthogonal imaging. Specifically, the computing system 300 may trigger the source 220 to emit a beam 222 of electromagnetic radiation along axis 244 through the object O with the object in a first rotational position so as to be received at the detector 224, whereupon the computing system 300 may appropriately process the signals received from the detector 224 to generate a first orthogonal image of the object O. The corresponding data may be saved in any appropriate manner and the image may be displayed on output device 408 and/or another output device. In the case where the beam 222 is a cone beam or the like as illustrated in
After the first orthogonal image has been obtained, the computing system 300 or the like may trigger the motion control mechanism 500 (e.g., the rotary drive 504) to rotate the object holder 212 (and thus the object receiving surface 216 and object O) by 90° about rotational axis 508 from a first rotational position to a second rotational position. The computing system 300 may then trigger the source 220 to emit a beam 222 of electromagnetic radiation along axis 244 through the object O with the object O in the second rotational position so as to be received at the detector 224, whereupon the computing system 300 may appropriately process the signals received from the detector 224 to generate a second orthogonal image of the object O. Again, the corresponding data may be saved in any appropriate manner (e.g., in storage 312 of
As discussed previously, an increase in the distance between the centroid (e.g., geometrical center) of an object and the beam axis 244 can sometimes result in an increase in the level of distortion in generated images of the object (e.g., in the case of three-dimensional imaging of the object). Furthermore, when the maximum outline of an object fails to substantially fill or encompass a substantial entirety of the area of the beam 222 (e.g., where the area of the beam extends within a reference plane that is substantially parallel to the first and second side walls 236, 248 of the housing 204 and perpendicular to the beam axis 244), the object may be positioned at a less than optimal magnification point within the interior chamber 208 relative to the source 220.
For instance,
Accordingly, the method 600 may include determining a position of the object O relative to the source 220 and the beam axis 244 (e.g., how far the centroid C is from the beam axis 244 and the source 220) and then automatically triggering the motion control apparatus 500, based on the determined position, to move 624 the object O from a first position to a second position whereby its centroid C is closer to or intersects the beam axis 244 and/or so that the centroid C is closer to the source 220. In another characterization, the method 600 may include moving 624 the object O so that its maximum outline more fully fills the area of the beam 222 (e.g., consumes more of the area of the beam 222). For instance, after determining the position of the object O relative to the beam axis 244 and/or the source 220, the computing system 300 may trigger the linear drive 516 (see
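One plausible way to derive such a corrective move is sketched below, assuming the object silhouette is segmented from a 2D radiograph by a simple intensity threshold and that the beam axis 244 projects to the detector's center column; the threshold rule, pixel pitch, and function name are illustrative assumptions, not disclosed system parameters:

```python
import numpy as np
from scipy import ndimage

def centroid_offset_mm(radiograph: np.ndarray, pixel_mm: float = 0.1) -> float:
    """Signed lateral offset of the object centroid from the detector center column."""
    mask = radiograph < radiograph.mean()   # object attenuates the beam (darker)
    _, col = ndimage.center_of_mass(mask)   # (row, col) centroid of silhouette
    center_col = (radiograph.shape[1] - 1) / 2.0
    return (col - center_col) * pixel_mm

# Stand-in radiograph: bright background, attenuating object left of center.
image = np.ones((100, 100)); image[30:70, 10:40] = 0.2
offset = centroid_offset_mm(image)
print(f"command linear drive: move {-offset:+.1f} mm to center centroid C")
```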
In one arrangement, and with the object O in a first or initial position such as in
While
With reference back to
In this regard, and upon determination of a second position to which the object is to be moved, the computing system 300 may trigger the motion control apparatus 500 to move the object receiving surface 216 along axis 218 and/or along axis 219 to move the object receiving surface 216 and object O into their respective second positions. For instance, the embodiment of
In one arrangement, the computing system 300 may trigger the motion control apparatus 500 to rotate the object receiving surface 216 and object O by at least one full revolution or 360°. In other arrangements, however, the computing system 300 may trigger the motion control apparatus 500 to rotate the object receiving surface 216 and object O by more than a full revolution or even less than a full revolution (e.g., in the latter case, by 180°, by 270°, etc.), obtain a plurality of images during such rotation, and generate three-dimensional data sets for display of corresponding three-dimensional images.
While the moving step 624 was discussed as occurring after the step 620 of obtaining orthogonal images of the object, the moving step 624 may in some embodiments occur before the orthogonal imaging step 620. In some arrangements, the computing system 300 may be configured to obtain 2D (e.g., orthogonal) and 3D images of the object at two or more different magnification levels or two or more different linear positions of the object within the interior chamber 208. After obtaining 2D or 3D image data sets of the object at one magnification level or position, the computing system 300 may be configured to subsequently trigger the motion control apparatus 500 to move the object to a different position along axis 290, whereupon the computing system 300 may then trigger the source 220 and detector 224 to obtain further image data of the object. In one arrangement, the system 100 may be configured to obtain, store, and transmit high resolution digital images that are compliant with the Digital Imaging and Communications in Medicine (DICOM) standard.
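As a non-limiting illustration of DICOM-compliant storage, the sketch below writes one acquired frame to a DICOM file using the open-source pydicom library (version 2.x API assumed); the patient values and pixel data are placeholders, and the Secondary Capture SOP class is chosen only for simplicity:

```python
import datetime
import numpy as np
from pydicom.dataset import FileDataset, FileMetaDataset
from pydicom.uid import ExplicitVRLittleEndian, generate_uid

SC_SOP_CLASS = "1.2.840.10008.5.1.4.1.1.7"  # Secondary Capture Image Storage

meta = FileMetaDataset()
meta.MediaStorageSOPClassUID = SC_SOP_CLASS
meta.MediaStorageSOPInstanceUID = generate_uid()
meta.TransferSyntaxUID = ExplicitVRLittleEndian

ds = FileDataset("specimen.dcm", {}, file_meta=meta, preamble=b"\x00" * 128)
ds.is_little_endian, ds.is_implicit_VR = True, False  # pydicom 2.x convention
ds.SOPClassUID = SC_SOP_CLASS
ds.SOPInstanceUID = meta.MediaStorageSOPInstanceUID
ds.Modality = "OT"                                   # "other" modality
ds.PatientName, ds.PatientID = "DOE^JANE", "12345"   # placeholder patient data
ds.StudyDate = datetime.date.today().strftime("%Y%m%d")

frame = (np.random.rand(64, 64) * 4095).astype(np.uint16)  # stand-in image
ds.Rows, ds.Columns = frame.shape
ds.SamplesPerPixel, ds.PhotometricInterpretation = 1, "MONOCHROME2"
ds.BitsAllocated, ds.BitsStored, ds.HighBit, ds.PixelRepresentation = 16, 12, 11, 0
ds.PixelData = frame.tobytes()

ds.save_as("specimen.dcm", write_like_original=False)
```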
For instance, the memory device 304 may include one or more imaging sequences 316 such as orthogonal imaging sequences 320 and 3D imaging sequences 324 that are configured to be executed by the processing unit 308 to trigger the electromagnetic source 220 to emit beams of electromagnetic radiation and to collect signals from the detector 224 for use in generating and storing corresponding imaging data sets 332 in storage 312 and displaying the same on an output device 408 (e.g., monitor). The memory device 304 may also include one or more magnification/object movement sequences 328 that are configured to be executed by the processing unit 308 to trigger the motion control apparatus 500 to rotate the object receiving surface 216 and object O about rotational axis 508 and/or move the object receiving surface 216 and object O along one or more of the above-discussed axes as part of imaging of the object O. Any appropriate patient data 336 (e.g., name, ID, object location, etc.) may also be stored in any appropriate format or structure.
The processing unit 308 may execute the various sequences 316, 328 independently or concurrently as appropriate, consistent with the teachings presented herein. It is to be understood that the various sequences 316, 328, etc. (logic, computer-readable instructions) may be retrieved from any appropriate non-volatile storage (e.g., storage 312 or elsewhere) and loaded into memory 304 for execution by the processing unit 308. In one arrangement, the memory device 304 and processing unit 308 may function as a controller that is configured to trigger one or more components of the system 100 (e.g., motion control apparatus 500, source 220, etc.) based on inputs from a user (e.g., to initiate an imaging sequence), based on measurements or readings obtained by the computing system 300, etc.
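One plausible way to organize such stored sequences is as a registry of callables that the controller dispatches by name, as sketched below; the registry layout and sequence names are assumptions for illustration rather than the actual structure of sequences 316, 320, 324, 328:

```python
from typing import Callable, Dict

# Illustrative registry: each stored sequence is a callable the controller
# (memory 304 + processing unit 308) can look up and execute on demand.
SEQUENCES: Dict[str, Callable[[], None]] = {
    "orthogonal_2d": lambda: print("running orthogonal imaging sequence 320"),
    "tomo_3d": lambda: print("running 3D imaging sequence 324"),
    "reposition": lambda: print("running magnification/movement sequence 328"),
}

def run_sequence(name: str) -> None:
    """Dispatch a stored sequence, e.g., in response to user input."""
    SEQUENCES[name]()

run_sequence("orthogonal_2d")
run_sequence("tomo_3d")
```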
The description herein has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit the invention to the form disclosed herein. Consequently, variations and modifications commensurate with the above teachings, and skill and knowledge of the relevant art, are within the scope of the present invention. For instance, the system 100 may include any appropriate arrangement (e.g., position encoder, other indicator(s), etc.) that allows the computing system 300 to determine the angular or rotational position of the object holder 212 about the rotational axis 508. In one arrangement, the object holder 212 may be configured to be mounted to the motion control mechanism (e.g., to the shaft assembly 512) only in a particular rotational or angular position (e.g., through the use of keys and corresponding slots). In this case, and assuming an object is placed onto the object receiving surface 216 in a particular orientation relative thereto (e.g., relative to a grid or other indicator(s) on the object receiving surface 216), the computing system 300 may be able to present such object orientation information to a user on a display along with the generated images. As an example, the grid or other indicator may indicate to a user how the object is to be positioned on the object receiving surface 216 so that the computing system 300 can more accurately present such orientation information to the user on the display with the generated images. For instance, an image of a human body may be superimposed on the object receiving surface 216 to indicate to a user that the portion of the specimen closest to the patient's head should be positioned closest to the head on the superimposed human body.
In one arrangement, one or more orientation indicators or marks may be provided on or in the object holder 212 that are configured to at least partially inhibit transmission of electromagnetic radiation therethrough so that a corresponding indication or mark appears in the generated image to provide information regarding the orientation of the object relative to the indication/mark to a user (e.g., relative to a human body). In another arrangement, the computing system 300 may be configured to digitally superimpose one or more orientation indicators, marks, graphics, and/or the like into or about the generated image(s) of the object.
As mentioned, embodiments disclosed herein can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus (processors, cores, etc.). The computer-readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter affecting a machine-readable propagated signal, or a combination of one or more of them. In addition to hardware, code that creates an execution environment for the computer program in question may be provided, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
A computer program (also known as a program, software, software application, script, or code) used to provide the functionality described herein can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
While this disclosure contains many specifics, these should not be construed as limitations on the scope of the disclosure or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the disclosure. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
This application is a National Stage Application of PCT/US2018/050490, filed on Sep. 11, 2018, which claims the benefit of U.S. Provisional Patent App. No. 62/556,566, entitled “IMAGING SYSTEM WITH ADAPTIVE OBJECT MAGNIFICATION,” and filed on Sep. 11, 2017, the entireties of which are incorporated herein by reference as if set forth in full. To the extent appropriate, a claim of priority is made to each of the above disclosed applications.