A primary purpose of cancer surgery and surgical pathology is to remove tumor tissue. Complete removal of the tumor and all associated tumor tissue is important: if a tumor is not completely removed, a second surgery may be required. This produces unneeded patient anxiety and risk, delays the start of second-stage therapies, and incurs additional financial costs.
In order to completely remove all tumor tissue, surgeons and pathologists work together during surgery to assess tumor margins such that no malignant cells remain in the wound bed. Tumor margins are the healthy tissue surrounding the tumor, more specifically, the distance between the tumor tissue and the edge of the surrounding tissue removed along with the tumor. Ideally, the margins are selected so that the risk of leaving tumor tissue within the patient is low.
Intraoperative pathology consultation guides the cancer surgeon during surgery in assessing whether the tumor and its margin have been completely excised. Current surgical pathology consultation relies on palpation, visualization, and anatomical features to select sampling areas for frozen section analysis, and is often limited to 20-30 minutes. Intraoperative pathology consultation starts with gross examination of the excised tissues to choose areas for sampling. This is followed by dissection, embedding, frozen sectioning, and staining of 5-μm sections of the excised tissue.
The limited surgical consultation timeframe allows only a small portion of the resected tissue to be interpreted by the pathologist. This time crunch can result in sampling errors which negatively impact the patient outcome.
There exists a need in the art for better intraoperative tumor tissue analysis.
Generally disclosed is a device that provides nearly full-rotation (360° azimuth plus a limited elevation-angle range) three-dimensional (3D) images of a tumor sample that has been systemically labeled with a fluorescent dye. A first camera takes full-color pictures of the tumor sample at different azimuth and elevation angles; a second camera captures fluorescence images of the sample at the same angles while the sample is illuminated at an excitation wavelength. Because of the low light levels of fluorescence, sensitivity-improving approaches, such as active cooling, imaging with a monochrome sensor, long exposure durations, and electronic noise suppression, can be used with the second camera.
The resulting full-color and fluorescence images are either overlaid or used to make 3D computer models that are overlaid in space. For making 3D models, calibration marks on a window platform help the software generate precise 3D grids and sub-volumes in order to determine the position of each surface in the images.
The imager can generate a full-rotation 3D image of a sample and present it as an interactive animation as well as an augmented reality 3D model to help localize diseased regions around the whole surgical sample.
A graphical user interface (GUI) aids a surgeon by presenting the images in a filmstrip manner on a touch screen device. A surgeon can swipe the filmstrip left or right in order to rotate views in azimuth around the sample. By swiping up or down, the surgeon can access views of the sample taken from higher or lower in elevation.
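The gesture-to-view mapping described above can be sketched as follows. This is an illustrative sketch only: the 22.5-degree azimuth step and the three elevation tiers are assumptions chosen for the example, not parameters specified for the device.

```python
# Hypothetical filmstrip navigation logic: horizontal swipes step the azimuth
# view angle, vertical swipes step the elevation tier. The step size and tier
# angles below are illustrative assumptions.

AZIMUTH_STEP_DEG = 22.5                 # assumed angular spacing between frames
ELEVATION_TIERS = (-35.0, 0.0, 35.0)    # assumed tilt angles, in degrees

def next_view(azimuth_idx, elev_idx, swipe):
    """Return the (azimuth index, elevation index) after one swipe gesture."""
    n_azimuth = int(360 / AZIMUTH_STEP_DEG)  # 16 frames per full rotation
    if swipe == "left":
        azimuth_idx = (azimuth_idx + 1) % n_azimuth   # rotate one step, wrap around
    elif swipe == "right":
        azimuth_idx = (azimuth_idx - 1) % n_azimuth
    elif swipe == "up":
        elev_idx = min(elev_idx + 1, len(ELEVATION_TIERS) - 1)  # clamp at top tier
    elif swipe == "down":
        elev_idx = max(elev_idx - 1, 0)                         # clamp at bottom tier
    return azimuth_idx, elev_idx
```

Azimuth wraps around (a full rotation), while elevation is clamped to the available tiers, matching the nearly-full-rotation-plus-limited-elevation geometry of the imager.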
Also provided is a touch pen comprising a pen body and a pen tip. The pen tip is attached to an end of the pen body. The touch pen is sterile and does not dispense ink. In some embodiments, the pen body comprises stainless steel. In some embodiments, the pen tip is detachable from the pen body and replaceable with a second pen tip. In some embodiments, the touch pen further comprises a pen cover. The pen cover encloses the pen body and the pen tip, and is sterile.
Due to divergence, excitation light excites fluorescence less at farther points on the sample. This can be corrected for by determining the distance of each surface from the fluorescence excitation light source and adjusting the resulting reading's intensity.
The device will help localize diseased tissues to reduce sampling errors and re-examination, and ultimately reduce local cancer recurrence and second-surgery costs. The device can be used for image-guided pathology to screen resected tissue or triage biopsies in conjunction with image-guided surgery.
Intraoperative consultations are among the most stressful parts of a pathologist's duties. With a strict time limit, few or no ancillary studies available, and only touch and visual cues for selecting a few tissue sections to analyze, the pathologist must make a diagnosis whose clinical consequences are significant.
Imaging modalities currently used in patient care include magnetic resonance imaging (MRI), ultrasound, radiography (X-ray), single-photon emission computed tomography (SPECT), and positron emission tomography (PET). These imaging techniques are not generally used to localize disease in tissues for margin determinations. The reason for this is in part that the resolution offered by these five imaging technologies is not sufficient. Further, the non-specific contrast agents typically used by MRI, ultrasound, and X-ray computed tomography do not target tumors. Additionally, the use of radioactive isotopes in SPECT and PET imaging introduces safety and wellness concerns and is financially costly for both patients and care providers.
An alternative pseudo-tomography imaging system embodiment can be used to satisfy the critical criteria for clinical utility. The proposed technology can provide the speed and throughput, the resolution, and the sensitivity and specificity needed to enhance intraoperative pathology consultation. The described embodiments can also be used within the current workflow of surgeries without significant alterations.
An imaging device as described herein can relieve diagnostic stress while helping to visualize and identify suspicious areas in lesion samples. It is foreseeable that the device will be used to assist intraoperative consultation on primary tumors and biopsies of head and neck cancers, lymph nodes, hepatobiliary, breast, female genital tract, skin tumors, etc. The proposed technology would also provide cost savings to the healthcare system by reducing reoperations due to recurrence. It is notable that this invention can provide a new paradigm in cancer surgery without causing a deviation from the current workflow.
The optical engine 101 can also comprise a preview camera 106. The preview camera 106 can be a true-color zoom camera module used to take preview images and videos in real-time at user-selected view angles.
A laser system 107 can be positioned and configured to illuminate sample 103 with an excitation wavelength. In some embodiments, the excitation wavelength is within the second range of wavelengths, to which the monochromatic camera is insensitive.
All three of the cameras 104, 105, and 106 can be configured to have a depth of focus that coincides with the location of the sample 103 on the pseudo-tomography platform 102. The illumination light fields and fields of view of the cameras 104, 105, and 106 can further be configured, along with the depths of focus, to provide an imaging resolution of 100 line pairs per inch.
The optical engine 101 can also comprise a cooling line 109. The cooling line 109 can be used to provide liquid or other coolant to camera 105 in order to maintain a preferred camera temperature and increase camera sensitivity.
The tomography platform 102 allows nearly full-rotation sample scanning and positioning for tomography imaging and sample loading. This is accomplished through tilting and rotation of the sample stage 107 holding the sample 103. Tilting of the sample stage 107 allows projection views of the top and bottom of the sample 103.
In some embodiments, and as shown in
In some embodiments, a first imaging camera is used for collecting an image over a broad spectrum of visible wavelengths. In some embodiments, the first camera is a grayscale camera. In some embodiments, the first camera is a full color camera.
In some embodiments, the second imaging camera is a cooled camera configured to capture fluorescence images at very low light levels. In some embodiments, a cooling fan, liquid cooling elements, and other cooling systems can be used to cool the second imaging camera. The camera can be optimized for collecting near infrared wavelength photons.
The imaging cameras are configured to take pictures of the sample from the same orientation and at the same focal depth as one another. There are several different ways to achieve this.
The dichroic mirror 403 can be configured to reflect a majority of light outside of a particular range of wavelengths. The dichroic mirror 403 can transmit light within the particular range of wavelengths to camera 402, which can be a cooled monochromatic camera.
In some embodiments, a pivot location offset from the first mirror 508 is used to move the first mirror between its first 503 and second 504 positions. In some embodiments, a slide with detents is used to move the first mirror 508 between its first 503 and second 504 positions. The slide can be made from metal, plastic, or other suitable rigid materials.
In some embodiments, a pivot location offset from the first mirror 608 is used to move the first mirror between its first 603 and second 604 positions. In some embodiments, a slide with detents is used to move the first mirror 608 between its first 603 and second 604 positions. The slide can be made from metal, plastic, or other suitable rigid materials.
In some embodiments, the sample is imaged with a near infrared imaging system that provides a nearly full-rotation 3D tomographic image of surgical samples. The tomographic image can allow for the surgeon and pathologist to efficiently review putative disease areas in an augmented reality display with both fluorescence and reflective imaging modalities. The imaging device can incorporate one or both of true color and fluorescence imaging information with tomographic imaging information. Information associated with each imaging modality can be captured with a nearly 360° sample rotation stage and a 4-degree-of-freedom motion control system.
The nearly full-rotation 3D pseudo-tomography imager can provide an efficient visualization of fluorescence signal at nearly every corner and nearly every facet on a true color 3D sample model. An animation of rotation images representing a 3D reconstructed model displayed on an interactive interface can present to the surgeon and pathologist the results in augmented reality to quickly identify disease areas on a surgical sample such as a tumor margin or biopsy.
The proposed near-infrared (NIR) fluorescence pseudo-tomography imaging system comprises an NIR optical engine for excitation and image collection, a pseudo-tomography platform for sample positioning to generate 3D images, a sample delivery system to optimize throughput, and an interactive user interface for efficient image review.
Short wavelength NIR imaging around 700 to 800 nm has been well accepted in research applications due to advantages such as the availability of clinical and translational optical agents, high signal-to-noise ratios, low native tissue/bio-material fluorescence (autofluorescence), and low extinction at the excitation and emission wavelengths. Commercial products have been developed for small animal fluorescence imaging in the NIR range.
The imaging system optical engine can also include laser sources in dual directions, white light LED illumination, specialized filter systems, a cooled imaging camera, and noise suppression technology. The laser sources can have wavelengths of, for example, one or more of 488 nm, 512 nm, 533 nm, 550 nm, 580 nm, 633 nm, 650 nm, 675 nm, 690 nm, 743 nm, 785 nm, or 806 nm. In some embodiments, the laser sources are 690 nm and 785 nm for 700-nm and 800-nm channels, respectively. LED illumination can have wavelengths in one or more ranges from 450 nm to 850 nm, from 450 nm to 650 nm, from 550 nm to 750 nm, from 650 nm to 850 nm, from 450 nm to 550 nm, from 500 nm to 600 nm, from 550 nm to 650 nm, from 600 nm to 700 nm, from 650 nm to 750 nm, from 700 nm to 800 nm, or from 750 nm to 850 nm. In some embodiments, the LED illumination is at 670 nm and 760 nm for NIR dye excitation. In some embodiments, the LED illumination is broadband illumination at visible wavelengths for white light illumination. Filter sets are designed to isolate excitation wavelength(s) with optimized emission collection for corresponding imaging agents (e.g., IRDye 800CW and ICG).
The imaging system can offer high dynamic range and optimized signal-to-noise ratios with uniform illumination of the imaging field. The imaging dynamic range can be >6 logs (22-bit) in one imaging capture cycle. This dynamic range is higher than those (8-16 bit) commonly offered by other fluorescence imaging technologies. To relieve the processing load of high dynamic range imaging, one can apply floating-point values (instead of long integer values) for each pixel to maximize dynamic range without increasing image size. Imaging and processing speed therefore remains competitive.
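One common way such a high dynamic range can be achieved in a single capture cycle is to merge a short and a long exposure into one floating-point value per pixel. The sketch below is an illustrative assumption about the approach, not the device's documented pipeline; the 16-bit saturation count is likewise assumed.

```python
# Illustrative sketch (not the device's actual pipeline): merge a short and a
# long exposure of the same pixel into one float radiance value scaled to a
# common radiometric unit, preserving wide dynamic range in a single float.

SATURATION = 65535  # assumed 16-bit sensor full-scale count

def merge_exposures(short_px, long_px, exposure_ratio):
    """Merge two exposures of one pixel into a single float value.

    short_px, long_px: raw sensor counts; exposure_ratio: long/short exposure time.
    """
    if long_px < SATURATION:
        # Long exposure is unsaturated: use it for best SNR, rescaled to the
        # short exposure's radiometric unit.
        return long_px / exposure_ratio
    # Long exposure clipped: fall back to the unsaturated short exposure.
    return float(short_px)
```

A dim pixel keeps sub-count precision and a bright pixel keeps full scale, both within one 32-bit float per pixel, which is why image size does not grow with dynamic range.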
In some embodiments, dark noise calibration is used to give imaging results high signal linearity for quantifiable evaluation. For example, testing on subcutaneous tumors (IRDye 800CW-labeled agent, 15 nmol) of a mouse model has demonstrated linear imaging signal versus tumor tissue weight (Pearson r=0.97 with P-value<0.0001).
In some embodiments, uniform illumination and field flattening correction technologies for laser(s) and LED (light emitting diode) white light channels are used to ensure the reliability and reproducibility of imaging throughout the whole imaging field. The uniformity and reliability can be such that coefficients of variation of approximately 0.04% are observed for repeated measurements at the same positions and <3% for measurements at different positions in the illumination field.
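Field flattening of the kind described above is commonly implemented by dividing each pixel by a normalized reference image of a uniformly reflective or fluorescent target. The sketch below shows this standard technique under that assumption; plain Python lists stand in for real image arrays, and the reference-acquisition procedure is not specified by the source.

```python
# Minimal flat-field (field flattening) correction sketch: divide each pixel
# by a normalized reference ("flat") image of a uniform target so that
# illumination roll-off across the field is compensated.

def flat_field_correct(image, flat):
    """Correct illumination non-uniformity given a flat reference image."""
    # Normalize the flat so its mean is 1.0, preserving overall signal level.
    n = sum(len(row) for row in flat)
    mean_flat = sum(sum(row) for row in flat) / n
    return [
        [px * mean_flat / ref for px, ref in zip(img_row, flat_row)]
        for img_row, flat_row in zip(image, flat)
    ]
```

A region the flat shows as over-illuminated is scaled down and an under-illuminated region scaled up, which is what makes repeated measurements comparable across positions in the field.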
In some embodiments, a pseudo-tomography imager can be capable of detecting indocyanine green (ICG), IRDye 800CW, and other dyes with emission in the 800-nm range. ICG is a clinically approved NIR fluorescence imaging contrast agent. In a dilute aqueous solution, ICG shows an absorption peak around the wavelength 780 nm and an emission peak around 822 nm.
As a systemically administered imaging contrast agent, ICG offers a favorable safety profile and has been approved by the U.S. Food and Drug Administration (FDA) for imaging of lymphatic flow, evaluation of hepatic function and liver blood flow, cardiac output, and other indications. Clinical trials for intraoperative imaging of tumors with ICG have been conducted on a variety of cancers. The use of ICG in tumor detection may be limited because ICG does not contain a reactive functional group for conjugation to target molecules specific to tumor tissues. Nevertheless, clinical studies show ICG may be beneficial for tumor tissue localization in sentinel lymph nodes and in the liver, due to pooling effects of dye in the lymphatic system and uptake of ICG by hepatic parenchymal cells. Focal accumulation or a fluorescent rim in hepatocellular carcinoma can be visualized by an embodiment imaging device to indicate regions of interest in resected tissues prior to frozen analysis for intraoperative consultation. As the only NIR imaging agent currently approved by the FDA, ICG permits rapid clinical translation of the proposed imaging device to intraoperative consultation.
LI-COR's IRDye 800CW has been used in clinical studies and trials for the detection of tumors. It offers reactive groups for easy conjugation with antibodies and superior solubility in aqueous solutions. IRDye 800CW advantages include that the dye is manufactured under cGMP and suitable for human use currently in investigational studies. A rodent toxicity study with IRDye 800CW carboxylate showed no toxic effects under the conditions of the study. A non-human primate toxicity study was completed in 2013 and showed no clinically significant toxicities from cetuximab-IRDye 800CW. Multiple human studies and clinical trials of IRDye 800CW-conjugates as imaging agents are being conducted. To date, the study results and clinical trial outcomes with IRDye 800CW-conjugates are encouraging.
FDA-approved anti-epidermal growth factor receptor (EGFR) monoclonal antibodies, cetuximab and panitumumab, and the anti-vascular endothelial growth factor (VEGF) antibody, bevacizumab, have been conjugated to IRDye 800CW for tumor-targeted imaging. A systemic dose escalation study has shown safety and efficacy in a clinical trial. Other similar moieties can also be attached to the dye, and some of these additional conjugates may be in current clinical trials.
Sensitivity and specificity (reducing false negatives) are key metrics to validate the imaging technologies applied in a 3D imager. One study demonstrated that the presently disclosed imaging module can produce superior sensitivity, specificity, detection limits and signal contrast. In this study, nude mice (Charles River Lab) received HNSCC (head and neck squamous carcinoma) cells to generate xenograft tumors. Imaging experiments were performed 48-96 hours following a systemic injection of panitumumab-IRDye 800CW. IgG-IRDye 800CW was used to assess non-specific binding. It was shown that the new imaging technology gives high tumor-to-background ratio with low nonspecific or native fluorescence background. The high contrast and tumor-to-background ratio is due at least in part to linear background suppression, high dynamic range technologies, and other imaging advances.
In some embodiments, a substantially uniform illumination area (3% variation) and imaging depth of focus will accommodate a 3-inch cube with resolution of ~125 μm. Other embodiments establish an imaging volume of a >4-inch cube.
In contrast with conventional small animal fluorescence imaging, which provides images of partial 3D faces, the disclosed systems, devices, and methods can offer nearly full 3D imaging, presenting nearly full or Euler animated rotations of the resulting reconstructed model. In some embodiments, the pseudo-tomography imager provides fluorescence imaging and reflective white light imaging co-localized in a nearly full-rotation 3D image.
In some embodiments, a tomography platform is coordinated with the optical engine to provide the nearly full-rotation 3D imaging result. The imaging volume of the tomography system is a 3D space that accommodates the resected tissue sample. The imaging volume of this system is defined by the illumination fields of the excitation light, the depth of focus of the objective, and the field of view of the imaging head. The imaging volume can have a depth (in the x-axis) of, for example, 1 inch, 1.5 inches, 2 inches, 2.5 inches, 3 inches, 3.5 inches, 4 inches, 4.5 inches, or 5 inches. The imaging volume can have a cross-section (in the x-y plane) of, for example, 7×8 cm², 7×10 cm², 7×12 cm², 8.5×8 cm², 8.5×10 cm², 8.5×12 cm², 10×8 cm², 10×10 cm², or 10×12 cm². The imaging resolution can be, for example, about 50 line pairs per inch, about 60 line pairs per inch, about 70 line pairs per inch, about 80 line pairs per inch, about 90 line pairs per inch, about 100 line pairs per inch, about 110 line pairs per inch, about 120 line pairs per inch, about 130 line pairs per inch, about 140 line pairs per inch, about 150 line pairs per inch, about 200 line pairs per inch, about 250 line pairs per inch, about 300 line pairs per inch, about 350 line pairs per inch, about 400 line pairs per inch, about 450 line pairs per inch, about 500 line pairs per inch, about 550 line pairs per inch, or about 600 line pairs per inch.
The tomography platform is equipped with rotational motors and stages to control the view angle and position of a sample within the imaging volume. By rotating a sample in two degrees of freedom, an imager can efficiently provide a nearly full-rotation 3D image. The first rotation is a nearly 360-degree movement about the z-axis (roll) relative to the sample to collect images at serial view angles. The second rotation is tilting about the y-axis (pitch) for imaging at different perspectives. Tilting of the sample stage allows projection views from the top and bottom of the sample via a transparent glass window. Rotation combinations allow nearly the entire sample to be imaged. Translational movements of the sample stage in the X-Y plane allow the registration of the sample to the center of the imaging volume.
To collect pertinent imaging projections along a sample for 3D reconstruction, the tomography platform rotates the object in two degrees of freedom. To provide comprehensive coverage of sample features, the tilting angle is typically in the range from 7.5 degrees to 45 degrees, depending on the complexity of the sample. With sample holding structures, such as pins, clamps, or stops, larger tilting angles can be achieved. A rolling step of 22.5 degrees and a tilting angle of ±35 degrees in an embodiment can offer a nearly full-rotation animation for 3D inspection.
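The view-angle schedule above can be enumerated as follows. The rolling step and the ±35-degree tilt extremes come from the embodiment just described; the inclusion of a 0-degree middle tilt is an illustrative assumption.

```python
# Sketch of the projection-view schedule: a 22.5-degree rolling step combined
# with tilt positions at -35, 0, and +35 degrees (the middle tilt is an
# assumed addition for illustration).

def view_schedule(roll_step_deg=22.5, tilt_angles_deg=(-35.0, 0.0, 35.0)):
    """Enumerate (roll, tilt) angle pairs covering nearly the full sample."""
    n_rolls = int(360 / roll_step_deg)  # 16 roll positions for a 22.5-degree step
    return [
        (i * roll_step_deg, tilt)
        for tilt in tilt_angles_deg
        for i in range(n_rolls)
    ]
```

With these defaults the schedule yields 16 roll positions at each of 3 tilts, i.e. 48 projection views per sample, which bounds the per-sample imaging time for a given exposure duration.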
For many practical applications, an imaging device should be capable of imaging a tissue having a size of ~3-inch diameter or larger with resolution better than 150 μm. The sample stage can be designed with lightweight material to accommodate large samples. In some embodiments, the sample stage has a custom-made glass window with marking dimensions at its edges.
In some embodiments, the sample to be imaged is supported by a platform or stage having a transparent portion or window. In some embodiments, the entire platform or stage is transparent.
The window can be transparent at the working wavelengths for both reflective light and fluorescence imaging. To accommodate a large size sample, the window can be custom made to a shape that is wider than the projection size of the imaging volume or the footprint of a target sample.
The material of the window can be borosilicate based glass, or other transparent material. The surface could be treated or coated for optical (anti-reflection, transparency, absorption purposes) or surface functional (hydrophobic or hydrophilic properties, marks, barcodes, etc.) requirements.
A circle on the window can be used to mark the border of the suggested imaging area. Tick marks along the window can provide reference scales to users. The tick marks can also allow the reconstruction software to identify and calculate dimensions with references extending into the spatial volume.
Due to the divergent propagation of excitation light along the depth of imaging volume (x-axis), the imaging analysis may need to apply a correction factor in volume to compensate for decay along the sample.
For large-area illumination, the excitation light travels along the propagation axis toward an object with controlled divergence or a defined illumination pattern. In the figures, the excitation radiation comes from a point source and diverges toward the imaging volume for fluorescence excitation. If a signal spot is located at one end of an object, the energy density of the excitation wavefront at the area of the signal spot varies as the object is rotated to a different view angle. For a point source (isotropic radiator), the energy density at a distance y follows the inverse-square law (I ∝ 1/y²). For a controlled divergence to achieve uniform illumination at an x-y plane, the function of light intensity along the propagation axis can be specified or measured. With a known function of energy density at a distance y, a correction factor can be applied to the signal reconstruction model. The 3D information collected via the reflective light imaging gives the geometric information at the spots of interest. The correction factor for the fluorescence signal can then be implemented to improve the reliability and quantifiability of the imaging result. For a complicated model, scattering and extinction of the signal light passing through part of the tissue object can be considered to further improve the fidelity of reconstructed results under superficial layers.
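For the point-source case, the correction reduces to rescaling each reading by the squared source-to-surface distance. The sketch below assumes a point source and omits the scattering and extinction terms mentioned above; the reference-distance convention is an illustrative choice.

```python
# Minimal sketch of the inverse-square divergence correction: a fluorescence
# reading taken at distance y from a point excitation source is rescaled to
# what it would read at a reference distance, using per-surface distances
# recovered from the reflective-light 3D geometry.

def correct_fluorescence(raw_intensity, distance, ref_distance=1.0):
    """Rescale a fluorescence reading to its value at ref_distance."""
    if distance <= 0:
        raise ValueError("distance must be positive")
    # Excitation energy density falls off as 1/distance**2, so multiplying by
    # (distance / ref_distance)**2 undoes the decay.
    return raw_intensity * (distance / ref_distance) ** 2
```

For a measured (non-point-source) intensity profile, the squared-distance factor would be replaced by the inverse of that profile evaluated at each surface distance.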
In some embodiments, a tomographic imager can generate a series of images corresponding to different view angles automatically in a short period of time. Collocated serial images of two modalities are obtained by overlaying images of reflective light and fluorescence together. Software code is used to compile the serial imaging results into an animation demonstrating the nearly full-rotation 3D views of a sample. The animation offers real-time 3D perspectives of nearly the entire sample. For the 3D reconstruction of the sample model and the interactive display of the 3D results, commercial 3D reconstruction algorithms can be used.
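One simple way the two modalities can be overlaid, shown per pixel below, is to map the fluorescence signal to a pseudocolor and alpha-blend it over the reflective image. The green pseudocolor and 50% opacity are arbitrary illustrative choices, not values taken from the source.

```python
# Sketch of one possible two-modality overlay: a normalized fluorescence
# value is pseudocolored and alpha-blended onto a reflective RGB pixel.

def overlay_pixel(reflect_rgb, fluor_value, alpha=0.5):
    """Blend one pseudocolored fluorescence value onto a reflective RGB pixel.

    reflect_rgb: (r, g, b) in 0..255; fluor_value: normalized 0..1 signal;
    alpha: opacity of the fluorescence layer.
    """
    fluor_rgb = (0.0, 255.0 * fluor_value, 0.0)  # green pseudocolor map
    return tuple(
        round((1 - alpha) * r + alpha * f)
        for r, f in zip(reflect_rgb, fluor_rgb)
    )
```

Applying this to every pixel of each roll/tilt frame yields the collocated serial images that the animation then plays back in sequence.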
In some embodiments, the imaging process of an entire sample at 85 μm resolution can be done in 20 min or less. In some embodiments, a 3D animation can be generated within a minute or less.
In
In
In
Furthermore, an augmented reality display interface will allow the user to review 3D views by rolling and tilting the tablet display in which a gyroscope package senses the motion and provides the information to interact with the image output.
A 3D reconstruction algorithm can offer a full-rotation 3D model for an augmented reality application. In the reconstruction algorithm, one can apply volume carving to form the spatial volume, together with feature recognition techniques to locate common features. A projection method known as "silhouette projection" maps an image into a virtual volume by carving away the voxels outside the silhouette projection at each view angle. The overall geometry of an object can then be derived. By matching common features, an object can be verified and incorporated into the model. After the model is created, an intensity scalar related to the adjusted fluorescence color map of each view is retrieved for 3D rendering. Coordinates, features, and scalars defined in the volume reconstruction process are used to match 2D and 3D results in rendering the signal distribution. Tissue-mimicking phantoms and tissue constructs can be used for algorithm validation.
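The carving step above can be illustrated with a toy example: voxels whose projection falls outside an object silhouette at any view are removed. The sketch below uses two axis-aligned views on a tiny grid purely for illustration; a real reconstruction would project voxels through calibrated camera models at each roll/tilt angle.

```python
# Toy silhouette-based volume carving: keep only voxels consistent with the
# object silhouette seen from two orthogonal, axis-aligned views.

def carve(voxels, silhouette_xy, silhouette_xz):
    """Keep voxels whose projections lie inside both silhouettes."""
    return {
        (x, y, z)
        for (x, y, z) in voxels
        if (x, y) in silhouette_xy   # projection seen from the z-axis view
        and (x, z) in silhouette_xz  # projection seen from the y-axis view
    }

# Start from a full 3x3x3 voxel grid.
grid = {(x, y, z) for x in range(3) for y in range(3) for z in range(3)}
# A 1x1 silhouette from the top and a 1x3 strip from the side carve the
# 27-voxel grid down to a single 3-voxel column.
column = carve(grid, {(1, 1)}, {(1, z) for z in range(3)})
```

Each additional view angle can only remove voxels, so the carved volume converges from the outside toward the object's visual hull as more projections are added.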
As current tablet displays commonly have a refresh rate of ≥60 Hz and a response time of <8 ms, the input lag of an augmented reality 3D model is determined by the application algorithm. Input lag can be measured, and a lag of ≤100 ms can be expected for a typical 3D model, so that the user will not experience noticeable latency while viewing a model interactively.
The animation can be automatically played or can be interactively controlled. For example, a surgeon can stop the animation at a selected frame, zoom in, or tilt up and down to see more of the sample.
In a surgical workflow, a surgeon performing an operation touches only sterilized tools. In some surgical procedures, a technologist or other staff member assists the surgeon by helping to manipulate information presented on the display of an instrument. However, actions taken by the staff may not accurately or effectively accord with the verbal commands and requests of the surgeon. As a result, there can be a benefit to enabling surgeons to work with a display or instrument directly. Touching instruments such as a computer, keyboard, display panel, or cabinet imager can break sterility, however, and create contamination problems. The use of a sterile touch pen to operate a display or interface on a screen can therefore assist in maintaining a sterile environment in an operating room.
The pen body 4201 can be made of disposable and pre-sterilized material intended for one-time or limited-time use. The pen body 4201 can also be made of sterilizable material intended for repeated use, with sterilization occurring prior to each use. In some embodiments, one or both of the pen body 4201 and the pen tip 4202 comprise a metal. In some embodiments, the metal is stainless steel. In some embodiments, the pen tip 4202 is detachable from the pen body 4201. The pen tip 4202 can be made of disposable and pre-sterilized material intended for one-time or limited-time use. The touch pen can be enclosed in a pen cover 4203 that is made of disposable and pre-sterilized material intended for one-time or limited-time use. In some embodiments, the pen body 4201 and pen tip 4202 are not sterile, but the pen cover 4203 is sterile. In some embodiments, the touch pen can dispense ink. In some embodiments, the touch pen does not dispense ink.
An embodiment can use a high-performance monochrome imaging camera with a CCD sensor. With advances in CMOS imaging sensors, CMOS sensors may become the norm in fluorescence applications.
Although the foregoing invention has been described in some detail by way of illustration and example for purposes of clarity of understanding, one of skill in the art will appreciate that certain changes and modifications may be practiced within the scope of the appended claims. In addition, each reference provided herein is incorporated by reference in its entirety to the same extent as if each reference was individually incorporated by reference.
This application is a continuation of U.S. patent application Ser. No. 15/192,771, filed Jun. 24, 2016 (U.S. Pat. No. 10,379,048 to be issued Aug. 13, 2019), which claims the benefit of U.S. Provisional Patent Application Nos. 62/185,407, filed Jun. 26, 2015, 62/325,588, filed Apr. 21, 2016, and 62/339,657, filed May 20, 2016, all of which are incorporated by reference in their entireties for all purposes.
20160217609 | Kornilov et al. | Jul 2016 | A1 |
20160335984 | Wu | Nov 2016 | A1 |
20170020627 | Tesar et al. | Jan 2017 | A1 |
20170059487 | Wang | Mar 2017 | A1 |
20170176338 | Wu et al. | Jun 2017 | A1 |
20170309063 | Wang | Oct 2017 | A1 |
20170336706 | Wang | Nov 2017 | A1 |
20170367582 | Wang | Dec 2017 | A1 |
20180020920 | Ermilov et al. | Jan 2018 | A1 |
20180140197 | Wang et al. | May 2018 | A1 |
20180180550 | Franjic et al. | Jun 2018 | A1 |
20180209924 | Sasazawa et al. | Jul 2018 | A1 |
20180228375 | Kim et al. | Aug 2018 | A1 |
20180242939 | Kang et al. | Aug 2018 | A1 |
20190079010 | Bawendi et al. | Mar 2019 | A1 |
20190298303 | Bingley et al. | Oct 2019 | A1 |
Number | Date | Country |
---|---|---|
101301192 | Nov 2008 | CN |
101401722 | Apr 2009 | CN |
101460953 | Jun 2009 | CN |
101984928 | Mar 2011 | CN |
102048525 | May 2012 | CN |
203677064 | Jul 2014 | CN |
103082997 | Oct 2015 | CN |
102011104216 | Dec 2012 | DE |
2455891 | May 2012 | EP |
2514125 | Nov 2014 | GB |
02304366 | Dec 1990 | JP |
0924052 | Jan 1997 | JP |
09236755 | Sep 1997 | JP |
10123054 | May 1998 | JP |
2000304688 | Nov 2000 | JP |
2004531729 | Oct 2004 | JP |
2008298861 | Dec 2008 | JP |
2009008739 | Jan 2009 | JP |
2013253983 | Dec 2013 | JP |
6585728 | Sep 2019 | JP |
20130096910 | Sep 2013 | KR |
2006113908 | Oct 2006 | WO |
2007030424 | Mar 2007 | WO |
2013166497 | Nov 2013 | WO |
2014094142 | Jun 2014 | WO |
2015023990 | Feb 2015 | WO |
2016014252 | Jan 2016 | WO |
2016073569 | May 2016 | WO |
2016100214 | Jun 2016 | WO |
2016210340 | Dec 2016 | WO |
2017184940 | Oct 2017 | WO |
2017200801 | Nov 2017 | WO |
2017223378 | Dec 2017 | WO |
2018098162 | May 2018 | WO |
Entry |
---|
Mondal et al., “Real-time Fluorescence Image-Guided Oncologic Surgery,” Adv. Cancer Res. 2014; 123: 171-211. (Year: 2014). |
JP2017-562263, “Office Action,” dated Mar. 3, 2020, 11 pages. |
Artec Eva Fast Handheld 3D Scanner for Professionals, Artec 3D, Available online at: http://www.artec3d.com/hardware/artec-evat/, Accessed from Internet on Apr. 19, 2016, 6 pages. |
BioVision Digital Specimen Radiography (DSR) System, Bioptics Inc., Premarket Notification 510(k) Summary, May 2009, 5 pages. |
BioVision Surgical Specimen Radiography System, Faxitron Bioptics LLC, Available online at: http://www.faxitron.com/medical/products/biovision.html, Accessed from Internet on Apr. 26, 2016, 2 pages. |
Clinical Video Management and Visible Light Documentation, Orpheus Medical, The Examiner's Attention is Directed to Slide 11, Feb. 3, 2016, 17 pages. |
Every Cancer Tells a Story If You Have the Tools to Read It, PerkinElmer, Solutions for Cancer Research, AACR Annual Meeting, Available online at: http://go.perkinelmer.com/webmail/32222/179460051/9c4865b118d5295e96e973a5b6c28bad, Apr. 18-22, 2015, 1 page. |
Every Cancer Tells a Story if You Have the Tools to Read it, PerkinElmer, Available online at: http://go.perkinelmer.com/1/32222/2015-03-26/3rww9?utm_content=LST-AACR-GLOQ, Accessed from Internet on Apr. 15, 2015, 2 pages. |
Imaging Modules, TomoWave Laboratories, Available online at: http://www.tomowave.com/imagingmodules.html, Accessed from Internet on Mar. 23, 2016, 1 page. |
Optical Scatter Imaging System for Surgical Specimen Margin Assessment During Breast Conserving Surgery, Project Information NIH Research Portfolio Online Reporting Tools, Project No. 1R01CA192803-01, Accessed from Internet on Apr. 18, 2016, 2 pages. |
Faxitron—“Path Vision,” Faxitron Bioptics LLC, Available online at: http://www.faxitron.com/medical/products/pathvision.html, Accessed from Internet on Apr. 18, 2016, 2 pages. |
U.S. Appl. No. 15/049,970, Notice of Allowability dated Oct. 14, 2016, 4 pages. |
Badawi et al., Real-Time Tissue Assessment During Surgical Procedures, UC Davis Office of Research, Tech ID: 24307, 1 page. |
Fang et al., Combined Optical and X-ray Tomosynthesis Breast Imaging, Radiology, vol. 258, No. 1, Jan. 2011, pp. 89-97. |
Japanese Application No. 2017-544296, "Office Action," dated May 7, 2019, 5 pages (2 pages of Original Document and 3 pages of English Translation). |
Kleiner et al., Classification of Ambiguous Nerve Fiber Orientations in 3D Polarized Light Imaging, Medical Image Computing and Computer-Assisted Intervention, vol. 15, Part 1, 2012, pp. 206-213. |
Lamberts et al., Tumor-Specific Uptake of Fluorescent Bevacizumab-IRDye800CW Microdosing in Patients with Primary Breast Cancer: A Phase I Feasibility Study, Clinical Cancer Research, Personalized Medicine and Imaging, American Association for Cancer Research, Nov. 9, 2016, 41 pages. |
Lee et al., Fusion of Coregistered Cross-Modality Images Using a Temporally Alternating Display Method, Medical & Biological Engineering & Computing, Springer, vol. 38, No. 2, Mar. 1, 2000, pp. 127-132. |
International Application No. PCT/US2016/018972, International Search Report and Written Opinion dated Jun. 23, 2016, 10 pages. |
International Application No. PCT/US2016/039382, International Search Report and Written Opinion dated Sep. 13, 2016, 14 pages. |
International Application No. PCT/US2017/028769, International Search Report and Written Opinion dated Sep. 22, 2017, 19 pages. |
International Application No. PCT/US2017/031740, International Search Report and Written Opinion dated Sep. 19, 2017, 25 pages. |
International Application No. PCT/US2017/038860, International Search Report and Written Opinion dated Sep. 22, 2017, 12 pages. |
International Application No. PCT/US2017/062812, International Preliminary Report on Patentability dated Jun. 6, 2019, 7 pages. |
International Application No. PCT/US2018/027978, International Search Report and Written Opinion dated Jul. 12, 2018, 13 pages. |
Sturm et al., CopyMe3D: Scanning and Printing Persons in 3D, Medical Image Computing and Computer-Assisted Intervention—MICCAI 2015: 18th International Conference, Sep. 3-6, 2013, pp. 405-414. |
Wu et al., Rotational Imaging Optical Coherence Tomography for Full-Body Mouse Embryonic Imaging, Journal of Biomedical Optics, vol. 21, No. 2, Feb. 2016, pp. 026002-1-026002-9. |
PCT/US2016/018972, “Invitation to Pay Additional Fees and, Where Applicable, Protest Fee,” dated Apr. 1, 2016, 2 pages. |
PCT/US2016/039382, “International Preliminary Report on Patentability,” dated May 23, 2017, 12 pages. |
PCT/US2017/028769, “International Preliminary Report on Patentability,” dated Nov. 1, 2018, 13 pages. |
PCT/US2017/028769, “Invitation to Pay Additional Fees and, Where Applicable, Protest Fee,” dated Jul. 26, 2017, 12 pages. |
PCT/US2017/031740, “International Preliminary Report on Patentability,” dated Nov. 29, 2018, 18 pages. |
PCT/US2017/031740, “Invitation to Pay Additional Fees and, Where Applicable, Protest Fee,” dated Jul. 28, 2017, 23 pages. |
U.S. Appl. No. 16/456,511, “Corrected Notice of Allowability,” dated Aug. 11, 2020, 5 pages. |
EP17721277.6, “Office Action,” dated Jun. 18, 2020, 6 pages. |
Application No. CN201680037184.6, "Office Action," dated Sep. 21, 2020, 10 pages. |
EP16815434.2, “Office Action,” dated Apr. 23, 2020, 6 pages. |
CN201680037184.6, “Office Action,” dated Mar. 27, 2020, 21 pages. |
CN201680037184.6, "Office Action," dated Oct. 24, 2019, 12 pages. |
PCT/US2018/027978, "International Preliminary Report on Patentability," dated Nov. 7, 2019, 9 pages. |
Number | Date | Country | |
---|---|---|---|
20190353595 A1 | Nov 2019 | US |
Number | Date | Country | |
---|---|---|---|
62185407 | Jun 2015 | US | |
62325588 | Apr 2016 | US | |
62339657 | May 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15192771 | Jun 2016 | US |
Child | 16529463 | US |