The present disclosure relates to devices, systems, and methods for tumor visualization and removal. The disclosed devices, systems, and methods may also be used to stage tumors and to assess surgical margins and specimens such as tissue margins, excised tissue specimens, and tissue slices of excised tumors and margins on tissue beds/surgical beds from which a tumor and/or tissue has been removed. The disclosed devices, systems, and methods may also be used to identify one or more of residual cancer cells, precancerous cells, and satellite lesions and to provide guidance for removal and/or treatment of the same. The disclosed devices may be used to obtain materials to be used for diagnostic and planning purposes.
Surgery is one of the oldest types of cancer therapy and is an effective treatment for many types of cancer. Oncology surgery may take different forms, dependent upon the goals of the surgery. For example, oncology surgery may include biopsies to diagnose or determine a type or stage of cancer, tumor removal to remove some or all of a tumor or cancerous tissue, exploratory surgery to locate or identify a tumor or cancerous tissue, debulking surgery to reduce the size of or remove as much of a tumor as possible without adversely affecting other body structures, and palliative surgery to address conditions caused by a tumor such as pain or pressure on body organs.
In surgeries in which the goal is to remove the tumor(s) or cancerous tissue, surgeons often face uncertainty in determining if all cancer has been removed. The surgical bed, or tissue bed, from which a tumor is removed, may contain residual cancer cells, i.e., cancer cells that remain in the surgical margin of the area from which the tumor is removed. If these residual cancer cells remain in the body, the likelihood of recurrence and metastasis increases. Often, the suspected presence of the residual cancer cells, based on examination of surgical margins of the excised tissue during pathological analysis of the tumor, leads to a secondary surgery to remove additional tissue from the surgical margin.
For example, breast cancer, the most prevalent cancer in women, is commonly treated by breast conservation surgery (BCS), e.g., a lumpectomy, which removes the tumor while leaving as much healthy breast tissue as possible. Treatment efficacy of BCS depends on the complete removal of malignant tissue while leaving enough healthy breast tissue to ensure adequate breast reconstruction, which may be poor if too much breast tissue is removed. Visualizing tumor margins under standard white light (WL) operating room conditions is challenging due to low tumor-to-normal tissue contrast, resulting in reoperation (i.e., secondary surgery) in approximately 23% of patients with early-stage invasive breast cancer and 36% of patients with ductal carcinoma in situ. Re-excision is associated with a greater risk of recurrence, poorer patient outcomes (including reduced breast cosmesis), and increased healthcare costs. Positive surgical margins (i.e., margins containing cancerous cells) following BCS are also associated with decreased disease-specific survival.
Current best practice in BCS involves palpation and/or specimen radiography and, rarely, intraoperative histopathology to guide resection. Specimen radiography evaluates excised tissue margins using x-ray images, and intraoperative histopathology (touch-prep or frozen) evaluates small samples of specimen tissue for cancer cells; both are limited by the time delay they cause (~20 min) and by inaccurate co-localization of a positive margin on the excised tissue to the surgical bed. Thus, there is an urgent clinical need for a real-time, intraoperative imaging technology to assess excised specimen and surgical bed margins and to provide guidance for visualization and removal of one or more of residual cancer cells, precancerous cells, and satellite lesions.
The present disclosure may solve one or more of the above-mentioned problems and/or may demonstrate one or more of the above-mentioned desirable features. Other features and/or advantages may become apparent from the description that follows.
In accordance with one aspect of the present disclosure, a method of assessing surgical margins and/or specimens is disclosed. The method comprises, subsequent to administration of a compound configured to induce porphyrins in cancerous tissue cells, positioning a distal end of a handheld, white light and fluorescence-based imaging device adjacent to a surgical margin. The method also includes, with the handheld device, substantially simultaneously exciting and detecting autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells of the surgical margin. And, based on a presence or an amount of fluorescence emissions of the induced porphyrins detected in the tissue cells of the surgical margin, determining whether the surgical margin is substantially free of at least one of precancerous cells, cancerous cells, and satellite lesions.
In accordance with another aspect of the present disclosure, a method of visualizing a tissue of interest in a patient is disclosed. The method comprises administering to the patient, in a diagnostic dosage, a non-activated, non-targeted compound configured to induce porphyrins in cancerous tissue. The method further comprises, between about 15 minutes and about 6 hours after administering the compound, removing tissue containing the induced porphyrins from the patient, wherein removing the tissue creates a surgical cavity. The method also includes, with a handheld white light and fluorescence-based imaging device, viewing a surgical margin of at least one of the removed tissue cells, one or more sections of the removed tissue cells, and the surgical cavity to visualize any induced porphyrins contained in tissues of the surgical margin.
In accordance with yet another aspect of the present disclosure, a handheld, white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins is disclosed. The device comprises a body having a first end portion configured to be held in a user's hand and a second end portion configured to direct light onto a surgical margin. The body contains at least one excitation light source configured to excite autofluorescence emissions of tissue cells and fluorescence emissions of induced porphyrins in tissue cells of the surgical margin. The body also contains a filter configured to prevent passage of reflected excitation light and permit passage of emissions having a wavelength corresponding to autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells. The body further contains an imaging lens, an image sensor configured to detect the filtered autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells of the surgical margin, and a processor configured to receive the detected emissions and to output data regarding the detected filtered autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells of the surgical margin. In accordance with one example embodiment, the filter in the body may be mechanically moved into and out of place in front of the image sensor.
In accordance with a further aspect of the present disclosure, a kit for white light and fluorescence-based visualization of cancerous cells in a surgical margin is disclosed. The kit comprises a handheld, white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins and a non-targeted, non-activated compound configured to induce porphyrins in cancerous tissue cells.
In accordance with another aspect of the present disclosure, a multispectral system for visualizing cancerous cells in surgical margins is disclosed. The system comprises a handheld, white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins; a display device configured to display data output by the processor of the handheld device; and a wireless real-time data storage and pre-processing device.
In accordance with yet another aspect of the present disclosure, a kit for white light and fluorescence-based visualization of cancerous cells in a surgical margin includes a handheld, white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins and a plurality of tips configured to be exchangeable with a tip portion on the handheld device, wherein each tip includes at least one light source.
In accordance with another aspect of the present disclosure, a handheld, white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins is disclosed. The device comprises a body having a first end portion configured to be held in a user's hand and a second end portion configured to direct light onto a surgical margin. The body contains at least one excitation light source configured to excite autofluorescence emissions of tissue cells and fluorescence emissions having a wavelength of between about 600 nm and about 660 nm in precancerous cells, cancerous cells, and satellite lesions of the surgical margin after exposure to an imaging or contrast agent. The body also contains a filter configured to prevent passage of reflected excitation light and permit passage of emissions having a wavelength corresponding to autofluorescence emissions of tissue cells and fluorescence emissions between about 600 nm and about 660 nm in tissue cells of the surgical margin. The body further contains an imaging lens, an image sensor configured to detect the filtered autofluorescence emissions of tissue cells and fluorescence emissions between about 600 nm and about 660 nm in tissue cells of the surgical margin, and a processor configured to receive the detected emissions and to output data regarding the detected filtered autofluorescence emissions of tissue cells and fluorescence emissions between about 600 nm and about 660 nm in tissue cells of the surgical margin.
In accordance with a further aspect of the present disclosure, a method of assessing surgical margins is disclosed. The method comprises, subsequent to administration of a compound configured to induce emissions of between about 600 nm and about 660 nm in cancerous tissue cells, positioning a distal end of a handheld, white light and fluorescence-based imaging device adjacent to a surgical margin. The method also includes, with the handheld device, substantially simultaneously exciting and detecting autofluorescence emissions of tissue cells and fluorescence emissions of the induced wavelength in tissue cells of the surgical margin. And, based on a presence or an amount of fluorescence emissions of the induced wavelength detected in the tissue cells of the surgical margin, determining whether the surgical margin is substantially free of at least one of precancerous cells, cancerous cells, and satellite lesions.
In accordance with yet another aspect of the present disclosure, a method of assessing surgical margins is disclosed. The method comprises, subsequent to the administration to a patient of a non-activated, non-targeted compound configured to induce porphyrins in cancerous tissue cells, and with a white light and fluorescence-based imaging device for visualizing at least one of precancerous cells, cancerous cells, and satellite lesions in surgical margins, illuminating tissue cells of a surgical margin in the patient with an excitation light. The method further includes detecting fluorescence emissions from tissue cells in the surgical margin that contain induced porphyrins and displaying in real-time the tissue cells from which fluorescence emissions were detected to guide surgical assessment and/or treatment of the surgical margin.
In accordance with yet another aspect of the present disclosure, a method of assessing lymph nodes is disclosed. The method comprises, subsequent to administration of a compound configured to induce porphyrins in cancerous tissue cells, substantially simultaneously exciting and detecting fluorescence of the induced porphyrins in tissue cells of a target lymph node. The method further includes based on an amount of fluorescence of the induced porphyrins detected in the tissue cells of the target lymph node, determining whether the lymph node is substantially free of cancerous cells.
In accordance with yet another aspect of the present disclosure, a method of predicting an amount of fibrosis in a tissue sample is disclosed. The method comprises receiving RGB data of fluorescence of the tissue sample responsive to illumination with excitation light; and based on a presence or an amount of fluorescence emitted by the tissue sample, calculating a percentage of green fluorescence, a density of the green fluorescence, and a mean green channel intensity of the green fluorescence in the tissue sample.
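The three green-fluorescence quantities named above can be sketched as follows. This is a minimal illustration, not an implementation specified in this disclosure; the green-dominance ratio and intensity floor used to segment green pixels are assumptions chosen only for the example.

```python
import numpy as np

def green_fluorescence_metrics(rgb, intensity_floor=50, dominance=1.2):
    """Compute green-fluorescence metrics from an RGB fluorescence image.

    rgb: HxWx3 uint8 array. intensity_floor and dominance are illustrative
    assumptions for isolating green-dominant pixels, not disclosed values.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    # Pixels whose green channel exceeds a floor and dominates red and blue
    green_mask = (g > intensity_floor) & (g > dominance * r) & (g > dominance * b)
    total_pixels = rgb.shape[0] * rgb.shape[1]
    # Percentage of green fluorescence: fraction of green-dominant pixels
    pct_green = 100.0 * green_mask.sum() / total_pixels
    # Density: summed green intensity normalized by total image area
    density = g[green_mask].sum() / total_pixels if green_mask.any() else 0.0
    # Mean green channel intensity over the green-fluorescent region
    mean_green = g[green_mask].mean() if green_mask.any() else 0.0
    return pct_green, density, mean_green
```

In this sketch, a higher percentage and density of green fluorescence in the sample would correspond to a larger predicted fibrotic fraction.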
In accordance with yet another aspect of the present disclosure, a method of correlating tissue types identified in a sample is disclosed. The method comprises receiving a digitized section of a tissue sample from a surgical bed, a surgical margin, or an excised tissue specimen that was exposed to a histological stain and to a compound configured to induce porphyrins in tissue cells. The method further comprises selecting a tissue category for analyzing the tissue sample, determining a first area value for one or more stained portions in the tissue sample, determining a second area value based on fluorescence emitted by the tissue sample when illuminated by excitation light, wherein the first area value and the second area value correspond to the selected tissue category, and comparing the first area value with the second area value.
In accordance with yet another aspect of the present disclosure, a method of quantifying color contrast in a fluorescence emission of a tissue sample is disclosed. The method comprises inputting an RGB image of the tissue sample, the tissue sample being previously exposed to a compound configured to induce porphyrins in tissue cells. The method further comprises converting the RGB image into a data set, calculating a first average color intensity in the tissue sample and corresponding values in the data set, calculating a second average color intensity in the tissue sample and corresponding values in the data set, plotting x and y coordinates on a chromaticity diagram for the first average color intensity and the second average color intensity, and connecting the coordinates with a vector.
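The chromaticity-plotting step can be sketched as follows, under the assumption that the RGB image uses standard sRGB encoding. The sRGB-to-XYZ conversion matrix and gamma expansion are standard colorimetric conventions, not values specified in this disclosure; the length of the connecting vector then serves as one possible scalar measure of color contrast.

```python
import numpy as np

# Linear sRGB -> CIE XYZ (D65 white point) conversion matrix
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

def rgb_to_xy(rgb):
    """Map an average RGB triplet (0-255) to CIE 1931 xy chromaticity coordinates."""
    c = np.asarray(rgb, dtype=float) / 255.0
    # sRGB gamma expansion to linear light
    c = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    X, Y, Z = SRGB_TO_XYZ @ c
    s = X + Y + Z
    return (X / s, Y / s) if s else (0.0, 0.0)

def contrast_vector(mean_rgb_1, mean_rgb_2):
    """Connect two average color intensities with a vector on the chromaticity
    diagram; the vector's length quantifies the color contrast between them."""
    x1, y1 = rgb_to_xy(mean_rgb_1)
    x2, y2 = rgb_to_xy(mean_rgb_2)
    return (x2 - x1, y2 - y1), float(np.hypot(x2 - x1, y2 - y1))
```

For example, the first average color intensity might be taken over a red-fluorescing (porphyrin-containing) region and the second over a green-fluorescing (connective tissue) region, yielding a long vector for high-contrast samples.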
The present disclosure can be understood from the following detailed description either alone or together with the accompanying drawings. The drawings are included to provide a further understanding, and are incorporated in and constitute a part of this specification. The drawings illustrate one or more exemplary embodiments of the present disclosure and together with the description serve to explain various principles and operations.
Existing margin assessment technologies focus on the excised sample to determine whether surgical margins include residual cancer cells. These technologies are limited by their inability to accurately spatially co-localize a positive margin detected on the excised sample to the surgical bed, a limitation the present disclosure overcomes by directly imaging the surgical cavity.
Other non-targeted techniques for reducing re-excisions include studies which combine untargeted margin shaving with standard of care BCS. While this technique may reduce the overall number of re-excisions, the approach includes several potential drawbacks. For example, larger resections are associated with poorer cosmetic outcomes and the untargeted removal of additional tissues is contradictory to the intention of BCS. In addition, the end result of using such a technique appears to be in conflict with the recently updated ASTRO/SSO guidelines, which defined positive margins as ‘tumor at ink’ and found no additional benefit of wider margins. Moran M S, Schnitt S J, Giuliano A E, Harris J R, Khan S A, Horton J et al., “Society of Surgical Oncology-American Society for Radiation Oncology consensus guideline on margins for breast-conserving surgery with whole-breast irradiation in stages I and II invasive breast cancer,” Ann Surg Oncol. 2014. 21(3):704-716. A recent retrospective study found no significant difference in re-excisions following cavity shaving relative to standard BCS. Pata G, Bartoli M, Bianchi A, Pasini M, Roncali S, Ragni F., “Additional Cavity Shaving at the Time of Breast-Conserving Surgery Enhances Accuracy of Margin Status Examination,” Ann Surg Oncol. 2016. 23(9):2802-2808. Should margin shaving ultimately be found effective, FL-guided surgery may be used to refine the process by adding the ability to target specific areas in a surgical margin for shaving, thus turning an untargeted approach, which indiscriminately removes additional tissue, into a targeted approach that is more in line with the intent of BCS.
The present application discloses devices, systems, and methods for fluorescent-based visualization of tumors, including in vivo and ex vivo visualization and/or assessment of tumors, multifocal disease, and surgical margins, and intraoperative guidance for removal of residual tumor, satellite lesions, precancerous cells, and/or cancer cells in surgical margins. In certain embodiments, the devices disclosed herein are handheld and are configured to be at least partially positioned within a surgical cavity. In other embodiments, the devices are portable, without wired connections. However, it is within the scope of the present disclosure that the devices may be larger than a handheld device, and instead may include a handheld component. In such embodiments, it is contemplated that the handheld component may be connected to a larger device housing or system by a wired connection.
Also disclosed are methods for intraoperative, in vivo imaging using the device and/or system. The imaging device may be multispectral. It is also contemplated that the device may be hyperspectral. In addition to providing information regarding the type of cells contained within a surgical margin, the disclosed devices and systems also provide information regarding the location (i.e., anatomical context) of cells contained within a surgical margin. In addition, methods of providing guidance for intraoperative treatment of surgical margins using the device are disclosed, for example, fluorescence-based image guidance of resection of a surgical margin. The devices, systems, and methods disclosed herein may be used on subjects that include humans and animals.
In accordance with one aspect of the present disclosure, some disclosed methods combine use of the disclosed devices and/or systems with administration of a non-activated, non-targeted compound configured to induce porphyrin in tumor/cancer cells, precancer cells, and/or satellite lesions. For example, the subject may be given a diagnostic dose (i.e., not a therapeutic dose) of a compound (imaging/contrast agent) such as the pro-drug aminolevulinic acid (ALA). As understood by those of ordinary skill in the art, dosages of ALA less than 60 mg/kg are generally considered diagnostic, while dosages greater than 60 mg/kg are generally considered therapeutic. As disclosed herein, the diagnostic dosage of ALA may be greater than 0 mg/kg and less than 60 mg/kg, between about 10 mg/kg and about 50 mg/kg, or between about 20 mg/kg and about 40 mg/kg, and may be administered to the subject in a dosage of about 5 mg/kg, about 10 mg/kg, about 15 mg/kg, about 20 mg/kg, about 25 mg/kg, about 30 mg/kg, about 35 mg/kg, about 40 mg/kg, about 45 mg/kg, about 50 mg/kg, or about 55 mg/kg. The ALA may be administered orally, intravenously, via aerosol, via immersion, via lavage, and/or topically. Although a diagnostic dosage is contemplated for visualization of the residual cancer cells, precancer cells, and satellite lesions, it is within the scope of the present disclosure to use the disclosed devices, systems, and methods to provide guidance during treatment and/or removal of these cells and/or lesions. In such a case, the surgeon's preferred method of treatment may vary based on the preferences of the individual surgeon. Such treatments may include, for example, photodynamic therapy (PDT). In cases where PDT or other light-based therapies are contemplated as a possibility, administration of a higher dosage of ALA, i.e., a therapeutic dosage rather than a diagnostic dosage, may be desirable. In these cases, the subject may be prescribed a dosage of ALA higher than about 60 mg/kg.
The ALA induces porphyrin formation (protoporphyrin IX (PpIX)) in tumor/cancer cells (
In exemplary embodiments, the non-activated, non-targeted compound configured to induce porphyrin in tumor/cancer cells, precancer cells, and/or satellite lesions is administered to a subject between about 15 minutes and about 6 hours before surgery, about 1 hour and about 5 hours before surgery, between about 2 hours and about 4 hours before surgery, or between about 2.5 hours and about 3.5 hours before surgery. These exemplary time frames allow sufficient time for the ALA to be converted to porphyrins in tumor/cancer cells, precancer cells, and/or satellite lesions. The ALA or other suitable compound may be administered orally, intravenously, via aerosol, via immersion, via lavage, and/or topically.
In cases where the administration of the compound is outside of the desired or preferred time frame, it is possible that PpIX may be further induced (or induced for the first time if the compound was not administered prior to surgery) by, for example, applying the compound via an aerosol composition, i.e., spraying it into the surgical cavity or onto the excised tissue (before or after sectioning for examination). Additionally or alternatively, the compound may be administered in a liquid form, for example as a lavage of the surgical cavity. Additionally or alternatively, with respect to the removed specimen, PpIX may be induced in the excised specimen if it is immersed in the liquid compound, such as liquid ALA, almost immediately after excision. The sooner the excised tissue is immersed, the better the chance that PpIX or additional PpIX will be induced in the excised tissue.
During surgery, the tumor is removed by the surgeon, if possible. The handheld, white light and fluorescence-based imaging device is then used to identify, locate, and guide treatment of any residual cancer cells, precancer cells, and/or satellite lesions in the surgical bed from which the tumor has been removed. The device may also be used to examine the excised tumor/tissue specimen to determine if any tumor/cancer cells and/or precancer cells are present on the outer margin of the excised specimen. The presence of such cells may indicate a positive margin, to be considered by the surgeon in determining whether further resection of the surgical bed is to be performed. The location of any tumor/cancer cells identified on the outer margin of the excised specimen can be used to identify a corresponding location on the surgical bed, which may be targeted for further resection and/or treatment. This may be particularly useful in situations in which visualization of the surgical bed itself does not identify any residual tumor/cancer cells, precancer cells, or satellite lesions.
In accordance with one aspect of the present disclosure, a handheld, white light and fluorescence-based imaging device for visualization of tumor/cancer cells is provided. The white light and fluorescence-based imaging device may include a body sized and shaped to be held in and manipulated by a single hand of a user. An exemplary embodiment of the handheld white light and fluorescence-based imaging device is shown in
The device may be configured to be used with a surgical drape or shield. For example, the inventors have found that image quality improves when ambient and artificial light are reduced in the area of imaging. This may be achieved by reducing or eliminating the ambient and/or artificial light sources in use. Alternatively, a drape or shield may be used to block at least a portion of ambient and/or artificial light from the surgical site where imaging is occurring. In one exemplary embodiment, the shield may be configured to fit over the second end of the device and be moved on the device toward and away from the surgical cavity to vary the amount of ambient and/or artificial light that can enter the surgical cavity. The shield may be cone or umbrella shaped. Alternatively, the device itself may be enclosed in a drape, with a clear sheath portion covering the end of the device configured to illuminate the surgical site with white light and excitation light.
In some embodiments, the device may include provisions to facilitate attachment of a drape to support sterility of the device. For example, the drape may provide a sterile barrier between the non-sterile device contained in the drape and the sterile field of surgery, thereby allowing the non-sterile device, fully contained in the sterile drape, to be used in a sterile environment. The drape may cover the device and may also provide a darkening shield that extends from a distal end of the device and covers the area adjacent the surgical cavity to protect the surgical cavity area from light infiltration from sources of light other than the device.
The drape or shield may comprise a polymer material, such as polyethylene, polyurethane, or other polymer materials. In some embodiments, the drape or shield may be coupled to the device with a retaining device. For example, the device may include one or more grooves that are configured to interact with one or more features on the drape or shield, in order to retain the drape or shield on the device. Additionally or alternatively, the drape or shield may include a retaining ring or band to hold the drape or shield on the device. The retaining ring or band may include a resilient band, a snap ring, or a similar component. In some embodiments, the drape or shield may be suitable for one-time use.
The drape or shield may also include or be coupled with a hard optical window that covers a distal end of the device to ensure accurate transmission of light emitted from the device. The window may include a material such as polymethyl methacrylate (PMMA) or other rigid, optically transparent polymers, glass, silicone, quartz, or other materials.
The drape or shield may not influence or alter the excitation light of the device. The window of the drape or shield may not autofluoresce under 405 nm or IR/NIR excitations. Additionally, the material of the drape or shield may not interfere with wireless signal transfers to or from the device.
Other variations of a drape or shield configured to reduce or remove ambient and/or artificial light may be used as will be understood by those of ordinary skill in the art.
Additionally or alternatively, the handheld white light and fluorescence-based imaging device may include a sensor configured to identify whether lighting conditions are satisfactory for imaging. For example, the device may include an ambient light sensor that is configured to indicate when ambient lighting conditions are sufficient to permit fluorescence imaging, as fluorescence imaging may only be effective in an adequately dark environment. The ambient light sensor may provide feedback to the clinician on the ambient light level. Additionally, the ambient light level measured prior to the system entering fluorescence imaging mode can be stored in picture metadata; this light level could be useful during post-imaging analysis. The ambient light sensor could also be useful during white light imaging mode to enable the white light LED or to control its intensity.
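The ambient-light gating and metadata logic just described can be sketched as below. The lux threshold, field names, and dictionary-based metadata record are illustrative assumptions for the example only; no specific threshold or metadata format is specified in this disclosure.

```python
def fluorescence_mode_ready(ambient_lux, max_lux=1.0):
    """Return True when ambient light is low enough for fluorescence imaging.

    The 1 lux default threshold is an illustrative assumption, not a
    value specified in the disclosure.
    """
    return ambient_lux <= max_lux

def capture_metadata(ambient_lux):
    """Record the ambient light level measured before entering fluorescence
    imaging mode, for storage in picture metadata and later post-imaging
    analysis (hypothetical field names)."""
    return {
        "ambient_lux": ambient_lux,
        "fl_ready": fluorescence_mode_ready(ambient_lux),
    }
```

In a device firmware, the same reading could also drive the white light LED during WL imaging mode, e.g., raising its intensity as ambient light falls.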
The device may further include, contained within the body of the device, at least one excitation light source configured to excite autofluorescence emissions of tissue cells and fluorescence emissions of induced porphyrins in tissue cells of the surgical margin, surgical bed, or excised tissue specimen. Although use of the device is discussed herein for purposes of examination of surgical margins and/or beds after tissue has been excised and to examine excised tissue specimens, it is contemplated by the inventors and is within the scope of the present application that the devices may be used during excision of the primary tumor, for example as a guide to distinguish between tumor and non-cancerous tissue. Additionally or alternatively, the devices of the present application could also be used to guide removal of satellite lesions and/or tumors. Thus, the device may also be used to make real-time adjustments during a surgical procedure.
As shown in
The excitation light source may provide a single wavelength of excitation light, chosen to excite tissue autofluorescence emissions, autofluorescence of other biological components such as fluids, and fluorescence emissions of induced porphyrins in tumor/cancer cells contained in a surgical margin of the excised tumor/tissue and/or in a surgical margin of a surgical bed from which tumor/tissue cells have been excised. In one example, the excitation light may have wavelengths in the range of about 350 nm-about 600 nm, or about 350 nm-about 450 nm and about 550 nm-about 600 nm, or, for example, 405 nm or 572 nm. See
The excitation light source may be configured to provide two or more wavelengths of excitation light. The wavelengths of the excitation light may be chosen for different purposes, as will be understood by those of skill in the art. For example, by varying the wavelength of the excitation light, it is possible to vary the depth to which the excitation light penetrates the surgical bed. As depth of penetration increases with a corresponding increase in wavelength, it is possible to use different wavelengths of light to excite tissue below the surface of the surgical bed/surgical margin. In one example, excitation light having wavelengths in the range of 350 nm-450 nm, for example about 405 nm±10 nm, and excitation light having wavelengths in the range of 550 nm to 600 nm, for example about 572 nm±10 nm, may penetrate the tissue forming the surgical bed/surgical margin to different depths, for example, about 500 μm-about 1 mm and about 2.5 mm, respectively. This will allow the user of the device, for example a surgeon or a pathologist, to visualize tumor/cancer cells at the surface of the surgical bed/surgical margin and the subsurface of the surgical bed/surgical margin. See
Additionally or alternatively, an excitation light having a wavelength in the near infrared/infrared range may be used, for example, excitation light having a wavelength of between about 760 nm and about 800 nm, for example about 760 nm±10 nm or about 780 nm±10 nm. In addition, to penetrate the tissue to a deeper level, this type of light source may be used in conjunction with a second type of imaging/contrast agent, such as an infrared (IR) dye (e.g., IRDye 800 or indocyanine green (ICG)). See
Thus, the excitation light source may comprise one or more light sources configured to emit excitation light causing the target tissue containing induced porphyrins to fluoresce, allowing a user of the device, such as a surgeon, to identify the target tissue (e.g., tumor, cancerous cells, satellite lesions, etc.) by the color of its fluorescence. Additional tissue components may fluoresce in response to illumination with the excitation light. In at least some examples, additional tissue components will fluoresce in different colors than the target tissue containing the induced porphyrins, allowing the user of the device (e.g., surgeon) to distinguish between the target tissue and other tissues. For example, when the excitation light source emits light having a wavelength of about 405 nm, the target tissue containing induced porphyrins will fluoresce a bright red color. Connective tissue (e.g., collagen, elastin, etc.) within the same surgical site, margin, bed, or excised specimen, which may surround and/or be adjacent to the target tissue, when illuminated by the same excitation light, will fluoresce a green color. Further, adipose tissue within the same surgical site, margin, bed, or excised specimen, which may surround and/or be adjacent to the target tissue and/or the connective tissue, when illuminated by the same excitation light, will fluoresce a pinkish-brown color. Addition of other wavelengths of excitation light may provide the user (e.g., surgeon) with even more information regarding the surgical site, margin, surgical bed, or excised specimen. For example, addition of an excitation light source configured to emit excitation light at about 572 nm will reveal the above tissues in the same colors, but at a depth below the surface of the surgical site, surgical margin, surgical bed, or excised specimen.
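The color-based discrimination described above can be sketched as a simple hue test on image pixels. The hue breakpoints and category labels are illustrative assumptions for the example, not thresholds specified in this disclosure; a practical system would be calibrated against the actual emission spectra.

```python
import colorsys

def classify_pixel(r, g, b):
    """Rough hue-based tissue discrimination under 405 nm excitation,
    following the colors described above (red = induced porphyrins,
    green = connective tissue). Hue breakpoints are illustrative assumptions.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.1:
        return "background"            # too dark to classify
    deg = h * 360.0
    if deg >= 330 or deg < 20:
        return "porphyrin (red)"       # target tissue with induced PpIX
    if 80 <= deg < 160:
        return "connective (green)"    # collagen/elastin autofluorescence
    return "other/adipose"             # e.g., pinkish-brown adipose tissue
```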
Alternatively or in addition, another excitation light source, configured to emit excitation light at about 760 nm, will allow the user (e.g., surgeon) to identify areas of vascularization within the surgical site, surgical margin, surgical bed, or surgical specimen. With the use of an NIR dye (e.g., IRDye 800 or ICG), the vascularization will appear fluorescent in the near infrared (NIR) wavelength band, in contrast to surrounding tissues that do not contain the NIR dye. For example, the vascularization may appear bright white, grey, or purple in contrast to a dark black background. The device may include additional light sources, such as a white light source for white light (WL) imaging of the surgical margin/surgical bed/tissue specimen/lumpectomy sample. In at least some instances, for example during a BCS such as a lumpectomy, removal of the tumor will create a cavity which contains the surgical bed/surgical margin. WL imaging can be used to obtain an image or video of the interior of the cavity and/or the surgical margin and provide visualization of the cavity. The WL imaging can also be used to obtain images or video of the surgical bed or excised tissue sample. The WL images and/or video provide anatomical and topographical reference points for the user (e.g., surgeon). Under WL imaging, the surgical bed or excised tissues provide useful information to the user (e.g., surgeon and/or pathologist). For example, the WL image can indicate areas of the tissue that contain adipose (fat) tissue, which appear yellow in color, connective tissue, which typically appears white in color, as well as areas of blood, which appear bright red or dark red. Additionally, moisture, charring from cauterization, staining with chromogenic dyes, and intraoperative or other exogenous objects (e.g., marking of margins, placement of wire guides) can be visualized in the WL images.
Furthermore, the WL image may provide context in order to interpret corresponding FL images. For example, an FL image may provide ‘anatomical context’ (i.e., background tissue autofluorescence), and the corresponding WL image may allow the user to better understand what is shown in the FL image (e.g., an image of a surgical cavity as opposed to an excised specimen). The WL image also lets the user colocalize a fluorescent feature in an FL image to its anatomical location under white light illumination.
The white light source may include one or more white light LEDs. Other sources of white light may be used, as appropriate. As will be understood by those of ordinary skill in the art, white light sources should be stable and reliable, and not produce excessive heat during prolonged use.
The body of the device may include controls to permit switching/toggling between white light imaging and fluorescence imaging. The controls may also enable use of various excitation light sources together or separately, in various combinations, and/or sequentially. The controls may cycle through a variety of different light source combinations, may sequentially control the light sources, may strobe the light sources or otherwise control timing and duration of light source use. The controls may be automatic, manual, or a combination thereof, as will be understood by those of ordinary skill in the art.
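The cycling and sequential control of light sources described above can be sketched as a simple state machine. This is a minimal illustration only; the source identifiers and combinations below are hypothetical, as the disclosure does not specify a firmware interface:

```python
from itertools import cycle

class LightController:
    """Cycles through predefined light-source combinations, as the device
    controls might: singly, in groups, or in a repeating pattern."""

    def __init__(self, combinations):
        self._cycle = cycle(combinations)  # wraps around automatically
        self.active = set()

    def toggle_next(self):
        # Advance to the next predefined combination and return it.
        self.active = set(next(self._cycle))
        return self.active

# Hypothetical combinations: white light alone, 405 nm alone,
# then 405 nm together with 760 nm.
ctrl = LightController([("WL",), ("EX_405",), ("EX_405", "EX_760")])
assert ctrl.toggle_next() == {"WL"}
assert ctrl.toggle_next() == {"EX_405"}
assert ctrl.toggle_next() == {"EX_405", "EX_760"}
assert ctrl.toggle_next() == {"WL"}  # cycle wraps around
```

A timed or strobed mode could be layered on top of the same structure by calling `toggle_next` from a timer loop.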
The body of the device may also contain a spectral filter configured to prevent passage of reflected excitation light and permit passage of emissions having wavelengths corresponding to autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells. In one example embodiment, an mCherry filter may be used, which may permit passage of emissions having wavelengths corresponding to red fluorescence emissions (both autofluorescence and induced porphyrin emissions) and green autofluorescence emissions, wherein the red band captures adipose tissue autofluorescence emissions and PpIX emissions and the green band captures connective tissue autofluorescence emissions. As shown in
The handheld white light and fluorescence-based imaging device also includes an imaging lens and an image sensor. The imaging lens or lens assembly may be configured to focus the filtered autofluorescence emissions and fluorescence emissions on the image sensor. Wide-angle and fish-eye imaging lenses are examples of suitable lenses. A wide-angle lens may provide a view of 180 degrees. The lens may also provide optical magnification. A very high resolution (e.g., micrometer level) is desirable for the imaging device, such that it is possible to make distinctions between very small groups of cells. This is desirable to achieve the goal of maximizing the amount of healthy tissue retained during surgery while maximizing the potential for removing substantially all residual cancer cells, precancer cells, and satellite lesions. The image sensor is configured to detect the filtered autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells of the surgical margin, and the image sensor may be tuned to accurately represent the spectral color of the porphyrin fluorescence and tissue autofluorescence. The image sensor may have 4K video capability as well as autofocus and optical zoom capabilities. CCD or CMOS imaging sensors may be used. In one example, a CMOS sensor combined with a filter may be used, i.e., a hyperspectral image sensor, such as those sold by Ximea Company. Example filters include a visible light filter (https://www.ximea.com/en/products/hyperspectral-cameras-based-on-usb3-xispec/mg022hg-im-sm4x4-vis) and an IR filter (https://www.ximea.com/en/products/hyperspectral-cameras-based-on-usb3-xispec/mg022hg-im-sm5x5-nir). The handheld device also may contain a processor configured to receive the detected emissions and to output data regarding the detected filtered autofluorescence emissions of tissue cells and fluorescence emissions of the induced porphyrins in tissue cells of the surgical margin.
The processor may have the ability to run simultaneous programs seamlessly (including but not limited to, wireless signal monitoring, battery monitoring and control, temperature monitoring, image acceptance/compression, and button press monitoring). The processor interfaces with internal storage, buttons, optics, and the wireless module. The processor also has the ability to read analog signals.
The device may also include a wireless module and be configured for completely wireless operation. It may utilize a high throughput wireless signal and have the ability to transmit high definition video with minimal latency. The device may be both Wi-Fi and Bluetooth enabled—Wi-Fi for data transmission, Bluetooth for quick connection. The device may utilize a 5 GHz wireless transmission band operation for isolation from other devices. Further, the device may be capable of running as a soft access point, which eliminates the need for a connection to the internet and keeps the device and module connected in isolation from other devices which is relevant to patient data security.
The device may be configured for wireless charging and include inductive charging coils. Additionally or alternatively, the device may include a port configured to receive a charging connection.
In accordance with one aspect of the present disclosure, an example embodiment of a handheld, multispectral imaging device 100, in accordance with the present teachings, is shown in
As illustrated in
In embodiments of the device 100 in which the tip 116 is removable and exchangeable, it is envisioned that kits containing replacement tips could be sold. Such kits may be provided in combination with the device itself, or may include one or more compounds or dyes to be used with the types of light sources included on the tips contained in the kit. For example, a kit with a 405 nm light source tip might include ALA, while a kit with a 405 nm light source and a 760 nm light source tip might include both ALA and IRdye 800 and/or ICG. Other combinations of light sources and compounds will be apparent to those of ordinary skill in the art.
In some embodiments, the device may include a polarized filter. The polarizing feature may be part of the spectral filter or a separate filter incorporated into the spectral filter. The spectral filter/imaging filter may be a polarized filter, for example, a linear or circular polarized filter combined with optical wave plates. This may permit imaging of tissue with minimized specular reflections (e.g., glare from white light imaging) as well as enable imaging of fluorescence polarization and/or anisotropy-dependent changes in connective tissue (e.g., collagen and elastin). Additionally, the polarized filter may allow a user to better visualize the contrast between different fluorescent colors, and thus better visualize the boundary between different tissue components (e.g., connective vs. adipose vs. tumor). Stated another way, the polarizing filter may be used for better boundary definition under FL imaging. The polarized filter may also improve image contrast between the tissue components for WL and FL images.
This embodiment allows for easy switching between fluorescence (with filter) and white light (no filter) imaging. In addition, both sensors may capture images of the exact same field of view at the same time, and the images may be displayed side-by-side on the display. 3D stereoscopic imaging is possible, using both image sensors at the same time with the filter removed from the second sensor, making it possible to provide a 3D representation of the surgical cavity. In addition, other functions such as monochrome and full color imaging are possible with the filter removed from the second sensor. The monochrome and full color images can be combined, with the monochrome sensor providing enhanced detail when combined with the full color image.
In each of the embodiments described above, the camera module/image sensor may be associated with camera firmware contained on a processor of the device. The processor is incorporated into the electronics board of the device, as is a wireless module as described above. The camera firmware collects data from the imaging sensor, performs lossless data compression and re-sampling as required, packages image and video data appropriate to the transmission protocol defined by the soft access point, timestamps data packages for synchronization with audio annotation data where applicable, and transmits the data to be received by a wireless hub in real time.
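The compression, packaging, and timestamping steps performed by the camera firmware can be illustrated in rough outline. The packet layout below is an assumption for illustration only, not the transmission protocol defined by the soft access point:

```python
import json
import time
import zlib

def package_frame(pixel_bytes: bytes, frame_id: int) -> bytes:
    """Losslessly compress a raw frame and prepend a timestamped header,
    so the frame can later be synchronized with audio annotation data."""
    compressed = zlib.compress(pixel_bytes, level=6)  # lossless compression
    header = json.dumps({
        "frame_id": frame_id,
        "timestamp": time.time(),   # for synchronization with annotations
        "raw_len": len(pixel_bytes),
    }).encode() + b"\n"
    return header + compressed

def unpack_frame(packet: bytes):
    """Inverse of package_frame: recover the metadata and the raw frame."""
    header, compressed = packet.split(b"\n", 1)
    meta = json.loads(header)
    data = zlib.decompress(compressed)
    assert len(data) == meta["raw_len"]  # integrity check
    return meta, data

meta, data = unpack_frame(package_frame(b"\x00\x01" * 512, frame_id=7))
assert meta["frame_id"] == 7
assert data == b"\x00\x01" * 512
```

In a real device the packet would be handed to the wireless module for transmission to the hub; here the round trip simply demonstrates that compression and packaging are reversible.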
The handheld, multispectral imaging device is configured to be operatively coupled with a wireless hub 1200. As shown in
The display 1280 may be any display that can be utilized in a surgical suite or in a lab. The display 1280 includes firmware configured to transmit image, video and audio data via a wired connection to an external display monitor, display video data in real time with image capture indication, display images from different light sources side by side up command, and integrate with external augmented reality and virtual reality systems to prepare/adjust display settings as per user preference.
Together, the handheld multispectral imaging device 100, the wireless hub 1200, and the display 1280 form a system 1300 configured to permit intraoperative visualization of tumor and surgical margins. The system may include other components as well. For example, as shown in
As shown in
In accordance with the present teachings, an exemplary method of using the device 100 will now be described. Prior to surgery, the patient is prescribed a diagnostic dosage of a non-activated, non-targeted compound configured to induce porphyrins in tumor/cancer tissue cells, such as ALA. The dosage may comprise, for example, about 5 mg/kg, about 10 mg/kg, about 15 mg/kg, about 20 mg/kg, about 25 mg/kg, about 30 mg/kg, about 35 mg/kg, about 40 mg/kg, about 45 mg/kg, about 50 mg/kg, or about 55 mg/kg. As also discussed above, it is possible to administer a dosage greater than about 60 mg/kg. The patient is provided with instructions to consume the compound between about 15 min and about 6 hours prior to surgery, between about 1 and about 5 hours prior to surgery, or between about 2 and about 4 hours before surgery. If the patient is unable to take the compound orally, it may be administered intravenously. Additionally or alternatively, as previously discussed, it is possible to administer the compound as an aerosol or a lavage during surgery.
The pro-drug aminolevulinic acid (ALA) induces porphyrin formation in tumor/cancer tissue cells via the process illustrated in
In one example, oral 5-ALA was dissolved in water and administered by a study nurse between 2-4 h before surgery in patients at dosages of 15 or 30 mg/kg 5-ALA. The PRODIGI device, used in clinical trials described herein is also described in U.S. Pat. No. 9,042,967, entitled “Device and method for wound imaging and monitoring,” which is hereby incorporated by reference in its entirety.
Approximately 2-4 hours after 5-ALA or a similar compound is administered, the surgery begins. In this application, surgical processes are described relative to BCS. However, the scope of the present application is not so limited and is applicable to surgeries and pathological analyses for all types of cancer, including for example, breast cancer, brain cancer, colorectal cancer, squamous cell carcinoma, skin cancer, prostate cancer, melanoma, thyroid cancer, ovarian cancer, cancerous lymph nodes, cervical cancer, lung cancer, pancreatic cancer, head and neck cancer, or esophageal cancer. Additionally, the methods and systems disclosed herein may be used with regard to cancers in animals other than humans, for example, in canines or felines. The methods and systems may be applicable with, for example, mast cell tumors, melanoma, squamous cell carcinoma, basal cell tumors, tumors of skin glands, hair follicle tumors, epitheliotropic lymphoma, mesenchymal tumors, benign fibroblastic tumors, blood vessel tumors, lipomas, liposarcomas, lymphoid tumors of the skin, sebaceous gland tumors, and soft tissue sarcomas in canines and felines.
The surgeon begins by locating the tumor and subsequently removing the tumor. As discussed above, the surgeon may use the imaging device for location of the tumor, especially in cases where the tumor comprises many tumor nodules. Additionally, the surgeon may also use the imaging device during resection of the tumor to look at margins as excision is taking place (in a manner substantially the same as that described below). After the surgeon removes the tumor/cancerous tissue, the distal end 114 of the device 100, including at least the tip 116 and end face 118, is inserted through the surgical incision into the surgical cavity from which the tumor/cancerous tissue has been removed. The surgeon operates the controls on the proximal portion of the device, held in the surgeon's hand, to actuate the white light source and initiate white light imaging (WL imaging) of the surgical cavity and surgical bed. During WL imaging, the spectral filter is not engaged, and light reflected from the surfaces of the surgical cavity passes through the wide-angle imaging lens and is focused on the camera module/image sensor in the body 110 of the device 100. The processor and/or other circuitry on the electronics board transmits the image data (or video data) to the wireless hub 1200, wherein the data is stored and/or pre-processed and transmitted to the display 1280. The surgeon/device operator may move the tip of the device around in the surgical cavity as necessary to image the entire cavity (or as much of the cavity as the surgeon desires to image). In some embodiments, the distal end portion of the device may be articulatable and is controlled to articulate the distal end portion, thereby changing the angle and direction of the white light incidence in the cavity as needed to image the entire cavity. Articulation of the distal end portion may be achieved by various means, as will be understood by those of ordinary skill in the art.
For example, the distal end may be manually articulatable or it may be articulatable by mechanical, electromechanical, or other means.
Subsequent to WL imaging, the surgeon/device operator toggles a switch or otherwise uses the controls to turn off the white light source and actuate one or more of the excitation light sources on the device 100. The excitation light source(s) may be engaged individually, in groups, or all at once. The excitation light source(s) may be engaged sequentially, in a timed manner, or in accordance with a predetermined pattern. As the excitation light source(s) is actuated, excitation light is directed onto the surgical bed of the surgical cavity, exciting autofluorescence emissions from tissue and fluorescence emissions from induced porphyrins in tumor/cancer tissue cells located in the surgical margin. The imaging lens on the end face 118 of the device 100 focuses the emissions, and those emissions that fall within wavelength ranges permitted passage by the spectral filter pass through the filter to be received by the camera module/image sensor within the device body 110. The processor and/or other circuitry on the electronics board transmits the image data (or video data) to the wireless hub 1200, wherein the data is stored and/or pre-processed and transmitted to the display 1280. Thus, the surgeon may observe the captured fluorescence images on the display in real time as the surgical cavity is illuminated with the excitation light. This is possible due to the substantially simultaneous excitation and detection of the fluorescence emissions. As the surgeon observes the fluorescence images, it is possible to command the display of the white light image of the same locality in a side-by-side presentation on the display. In this way, it is possible for the surgeon to gain context as to the location/portion of the surgical cavity/surgical bed or margin being viewed. This allows the surgeon to identify the location of any red fluorescence in the cavity/margin, which may be attributable to residual cancer cells in the cavity/margin.
In addition to red fluorescence, the FL imaging may also capture green fluorescence representative of connective tissue such as collagen. In some cases, very dense connective tissue in the breast will autofluoresce a bright green color. This allows the surgeon to identify areas of dense connective tissue and differentiate them from dark areas, which may represent vasculature/vascularization (dark due to the absorption of light); more highly vascularized tissue may potentially represent vascularization associated with cancerous cells. Additionally, by viewing the autofluorescence of the connective tissue in conjunction with any red fluorescence, the surgeon is given context regarding the location of the red fluorescence that may represent residual cancer cells. This context may be used to inform the surgeon's decision regarding further treatment and/or resection of the surgical bed/surgical margin as well as for decisions regarding reconstruction procedures.
As with WL imaging, during FL imaging, the surgeon/device operator may move the tip of the device around in the surgical cavity as necessary to image the entire cavity (or as much of the cavity as the surgeon desires to image). In some embodiments, the distal end portion of the device may be articulatable and is controlled to articulate the distal end portion, thereby changing the angle and direction of the excitation light incidence in the cavity as needed to image the entire cavity. Articulation of the distal end portion may be achieved by various means, as will be understood by those of ordinary skill in the art. For example, the distal end may be manually articulatable or it may be articulatable by mechanical, electromechanical, or other means.
Although this process is described with WL imaging occurring prior to FL imaging, it is possible to reverse the process and/or to perform FL imaging without WL imaging.
In addition to viewing the surgical margins of a surgical cavity, the disclosed handheld multispectral imaging device may also be used to observe lymph nodes that may be exposed during the surgical procedure. By viewing lymph nodes prior to removal from the subject's body, it is possible to observe, using the device 100, red fluorescence emissions from cells containing induced porphyrins that are within the lymph node. Such an observation is an indication that the tumor/cancer cells have metastasized, indicating that the lymph nodes should be removed and that additional treatment may be necessary. Use of the imaging device in this manner allows the device to act as a staging tool, to verify the stage of the cancer and/or to stage the cancer dependent upon the presence or absence of red fluorescence emissions due to induced porphyrins in the lymph node. Such a process may also be used on lymph nodes that have already been removed from the subject, to determine whether tumor/cancer cells are contained within the removed lymph nodes. Independent of the process used, in vivo, ex vivo or in vitro, the information obtained can be used to inform the surgeon's decisions regarding further treatment and/or interventions.
In addition to looking at the surgical cavity and the lymph nodes, there is also value in imaging the removed tumor. The outer surface (surgical margin) of the tumor can be imaged, looking to identify cancer cells, precancer cells, and satellite lesions. The removed tissue can also be viewed with the imaging device after sectioning.
In accordance with another aspect of the present disclosure, it is contemplated that the intensity of the fluorescence from the induced porphyrins detected may be used as a guide to determine an optimal time frame for photodynamic therapy (PDT). For example, it is possible to monitor the intensity of the fluorescence emitted by the porphyrins, determine when it is at its peak, and perform PDT at that time for optimal results.
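This monitoring idea can be sketched minimally, assuming periodic mean-intensity samples are available from the imaging device (the readings below are hypothetical values in arbitrary units):

```python
def porphyrin_peak_time(samples):
    """Given (time, mean_fluorescence_intensity) samples, return the time
    at which intensity peaked -- a candidate window for performing PDT."""
    return max(samples, key=lambda s: s[1])[0]

# Hypothetical readings: hours after ALA administration vs. intensity (a.u.)
readings = [(1.0, 0.2), (2.0, 0.55), (3.0, 0.8), (4.0, 0.72), (5.0, 0.5)]
assert porphyrin_peak_time(readings) == 3.0  # peak at ~3 h in this sketch
```

A practical implementation would likely smooth the readings and confirm a sustained decline before declaring a peak, rather than taking the raw maximum.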
Under standard WL, differentiating between regions of breast adipose and connective tissues is challenging. FL imaging reveals consistent autofluorescent (AF) characteristics of histologically validated adipose and connective tissues, which appear pale pink and bright green, respectively, under 405 nm excitation. When combined with 5-ALA red FL, the differing emission spectra of normal tissue AF and PpIX are easily distinguishable visually (see
AF mammary ductoscopy using blue light illumination can spectrally differentiate between healthy duct luminal tissue AF (bright green) and invasive breast tumor tissue. The clinicians' imaging data demonstrates bright green AF in areas of healthy breast tissue. Moreover, the clinical findings with 5-ALA demonstrate that both en face FL imaging and endoscopic FL imaging are clinically feasible.
During one clinical trial, tumor AF intensity and distribution were heterogeneous. Qualitatively, tumors appeared visually brighter than, darker than, or of low contrast relative to surrounding normal breast tissue. In addition, mottled green FL was common among the specimens, both in the demarcated tumor and in areas of normal tissue, likely due to interspersed connective tissue. Endogenous tumor AF was inconsistent across different patient resection specimens and hence is not a reliable intrinsic FL biomarker for visual identification of tumors within surgical breast tumor specimens (i.e., not all tumors are brighter compared to surrounding normal tissues).
Overall, differences in tumor AF signals may represent differences in the composition of each tumor and the surrounding normal regions. It is possible that brighter tumors contain more fibrous connective tissue and as a result had a characteristic bright green AF signature. However, in cases where the healthy surrounding tissue was also highly fibrous with dense connective tissue, the tumor and normal AF signal were similar and could not be distinguished from each other, resulting in low contrast of the tumor relative to normal tissue.
Blood is known to increase absorption of 405 nm light, resulting in decreased emission. Intact specimens were rinsed with saline prior to imaging to remove surface blood; however, once bread-loafed, blood in tumor vessels may have affected the AF intensity of tumor sections. Therefore, it is possible that darker tumors had lower connective tissue content and higher vascularity.
In patients receiving 5-ALA, PpIX FL was lower in areas of normal connective and adipose tissue relative to tumor tissue. While the diagnostic measures for detecting tumor were not significantly improved in the higher 5-ALA group, the inventors did see an increase in the median concentration of tumor PpIX relative to the lower 5-ALA group.
The inventors found connective tissue (collagen) was characterized by green AF (525 nm peak) when excited by 405 nm light. Accordingly, necrotic areas that were also highly fibrotic were characterized by green AF. Additionally, collagen and elastin found in the intimal and adventitial layers of tumor-associated vasculature exhibited bright green AF. Broad AF emission between 500 nm and 600 nm was observed in adipocytes located in both healthy and tumor tissues. This is likely due to the broad emission spectrum of lipo-pigments. Under macroscopic imaging with an alternative embodiment of the imaging device, the broad 500-600 nm FL emission characteristic of adipocytes is spectrally and visually distinct from the narrow red (635 nm peak) FL emission characteristic of tumor-localized PpIX. Thus, tumor cells containing PpIX are distinguishable from a background of fatty breast tissues.
Multispectral or multiband fluorescence images using 405 nm (e.g., +/−5 nm) excitation, and detecting ALA-induced porphyrin FL between 600-750 nm, can be used to differentiate between connective tissues, adipose tissues, muscle, bone, blood, nerves, diseased, precancerous and cancerous tissues.
Device and method can be used to visualize microscopic and macroscopic tumor foci (from a collection of cells to mm-sized or larger lesions) at the surface or immediately below the surface of a resected specimen (lumpectomy, mastectomy, lymph node) and/or surgical cavity, and this can lead to:
Better visualization of tumor foci/lesions against a background of healthy or inflamed or bloody tissues;
Faster detection of microscopic tumor foci/lesions using FL imaging compared with conventional methods;
Real-time visual guidance from FL images/video for the clinician to remove the FL tumor foci/lesions during surgery;
Confirmation of more complete tumor removal following FL imaging (reduction of porphyrin FL or its absence after FL-guided surgery can indicate more (or all) of the tumor has been removed);
FL images can be used to target biopsy of suspicious premalignant or malignant tissues in real time;
FL imaging can also identify macroscopic and microscopic tumor foci/lesions in lymphatic tissues during surgery, including lymph nodes;
Area of PpIX red fluorescence indicates extent of tumour burden in lymph node;
Detect subsurface tumor lesions during or after a surgical procedure;
Differentiation between low, moderate and high mitotic index tumor lesions based on porphyrin FL intensity and color;
FL images and video with audio annotation to document completeness of tumour removal;
Can be correlated with pathology report and used to plan re-excision, reconstructive surgery;
FL images/video can be used to plan treatment of focal x-ray radiation or implantation of brachytherapy seed treatment in breast or other types of cancer;
Improve margin assessment by detecting microscopic residual tumor foci/lesions; and
Connective tissues fluoresce green, while premalignant and malignant tissues fluoresce red.
FL imaging can be used in combination with FL point spectroscopy, Raman spectroscopy and imaging, mass spectrometry measurements, hyperspectral imaging, histopathology, MRI, CT, ultrasound, photoacoustic imaging, terahertz imaging, infrared FL imaging, OCT imaging, polarized light imaging, time-of-flight imaging, bioluminescence imaging, FL microscopy for examining ex vivo tissues and/or the surgical cavity for the purpose of detecting diseased tissue, diagnosing said diseased tissue, confirming the presence of healthy tissues, guiding surgery (or radiation or chemotherapy or cell therapies in the case of patients with cancer).
In addition to the ability to identify cancer or precancer cells, the image data gathered through use of the devices and methods disclosed herein can be used for several purposes.
In accordance with one aspect of the present disclosure, the image data gathered using the devices and methods disclosed herein may be useful in the identification of fibrosis. Fibrosis refers to a thickening or increase in the density of breast connective tissue. Fibrous breast tissues include ligaments, supportive tissues (stroma), and scar tissues. Breast fibrosis is caused by hormonal fluctuations, particularly in levels of estrogen, and can be more acute just before the menstrual cycle begins. Sometimes these fibrous tissues become more prominent than the fatty tissues in an area of the breast, possibly resulting in a firm or rubbery bump. Fibrosis may also develop after breast surgery or radiation therapy. The breast reacts to these events by becoming inflamed, leaking proteins, cleaning up dead breast cells, and laying down extra fibrous tissue. Fibrous tissue becomes thinner with age and fibrocystic changes recede after menopause.
In the fluorescence RGB images collected with the PRODIGI camera in the breast ALA study, connective tissue in the breast appears as green fluorescence. This is expected, as it reflects the wavelengths emitted by collagen when excited with 405 nm light, and collagen is the primary component of connective tissue. Therefore, by characterizing and quantifying the green autofluorescence in the images, a correlation to connective tissue fibrosis can be performed.
Based on the above observed correlation between the clinician examination of the lumpectomy specimens and the green fluorescence in the imaged tissue, it is possible to utilize such images of breast tissue to predict an amount of fibrosis in the tissues. The flowchart in
In accordance with the present teachings, an RGB image of interest is input. Next, as shown in blue, the software converts the RGB image to HSV format (Hue, Saturation, and Value). It is also contemplated that other color spaces could be used, for example, CMYK and HSL. Those of skill in the art will understand that other color spaces are possible as well. As discussed further below, the HSV format may be used to determine the percentage of green autofluorescence and the density of green autofluorescence in the image. The Hue, Saturation, and Value channels are then separated from the HSV image. All values in the Hue channel are multiplied by 360 to obtain radial values of hues from 0 degrees to 360 degrees. On the RGB image, a region of interest (ROI) can be identified using a freehand drawing tool in MATLAB. A user may draw the region of interest, which covers the entire specimen slice in the image minus the background and adjacent slices. The software may then create a binary mask of the region of interest. Next, the software may calculate the area of the region of interest in mm2 by calibrating the absolute area of each pixel in that image using the ruler tag in the image, in order to determine the area of the whole slice. The software may then locate all pixels with autofluorescent green color by thresholding the hue values (70<Hue<170), which is the range of hues observed with the autofluorescent connective tissue.
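The HSV thresholding steps above can be sketched in Python rather than MATLAB. This is a simplified per-pixel illustration, assuming the image is given as a flat list of RGB tuples and the ROI as a boolean mask of the same length (the pixel values below are hypothetical):

```python
import colorsys

def green_af_stats(rgb_pixels, roi_mask, mm2_per_pixel):
    """Fraction and absolute area of autofluorescent-green pixels inside a
    region of interest, using the 70 < Hue < 170 threshold (0-360 degree
    scale) described for connective-tissue autofluorescence."""
    roi = green = 0
    for (r, g, b), inside in zip(rgb_pixels, roi_mask):
        if not inside:
            continue  # skip background / adjacent slices
        roi += 1
        # colorsys returns hue in [0, 1); scale by 360 for degrees
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if 70 < h * 360 < 170:
            green += 1
    return {
        "percent_green": 100.0 * green / roi if roi else 0.0,
        "green_area_mm2": green * mm2_per_pixel,  # from ruler calibration
    }

# Two green pixels and one red pixel inside the ROI; one masked-out pixel.
pixels = [(20, 200, 40), (30, 180, 60), (220, 30, 30), (0, 0, 0)]
mask = [True, True, True, False]
stats = green_af_stats(pixels, mask, mm2_per_pixel=0.01)
assert abs(stats["percent_green"] - 200 / 3) < 1e-6
assert abs(stats["green_area_mm2"] - 0.02) < 1e-9
```

A production implementation would operate on whole image arrays (e.g., with vectorized color-space conversion) rather than per-pixel loops, but the thresholding logic is the same.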
Next, as shown in yellow in
As shown in pink in
Alternatively, instead of using HSV, as shown in green in
As shown in gray in
In addition to fibrosis, predictive determinations regarding the composition of tissues or percentages of other types of tissues within the images can be made based on color in the images.
Images collected by the device are displayed as a composite color image. When imaging is performed in fluorescence mode (405 nm illumination with capture of emitted light in the range of 500-550 nm and 600-660 nm), composite images contain a spectrum of colors resulting from the emission of green light (500-550 nm), red light (600-660 nm), or a combination thereof. The wavelength(s) (corresponding to the color) of light emitted from the target are a result of the presence of specific fluorescent molecules. For example, PpIX (a product of 5-ALA metabolism) present in tumors appears red fluorescent, while collagen, a component of normal connective tissue, appears green fluorescent. When a mixture of different fluorescent molecules is present in the tissue, the resultant color in the composite image is due to a combination of the different emitted wavelengths. The concentration/density and intrinsic fluorescent properties (some fluorescent molecules have stronger intrinsic fluorescent intensity) of each type of fluorescent molecule present in the target tissue will affect the resultant fluorescent color.
Color can be used to assist in classifying the different types of tissues contained within collected images.
Analysis of the fluorescent color (including features such as hue, luminosity, and saturation) can provide information about the type and relative amount of different tissue(s) in the target tissue (i.e., what proportion of the target tissue is tumor vs. connective tissue). Luminosity in particular is useful in interpreting fluorescence images, given that tissues with a similar hue can be differentiated visually (and through image analysis) by differences in luminosity. For example, in breast tissue specimens, fat appears pale pink while PpIX fluorescent tumors can appear as a range of intensities of red. In some cases, PpIX tumor fluorescence will have the same hue as background normal fat tissue; however, differences in luminosity will make the PpIX in tumors appear brighter. In addition, subtle differences in color characteristics which are not visually perceptible in the composite images may also be calculated using image analysis software to interpret differences in tissue composition or identify the presence of specific tissue components.
The relationship between fluorescence color and tissue composition allows the user to interpret the composite color image/video (i.e., the user will know what type of tissue he/she is looking at) as well as provides the user with additional information, not otherwise obvious under white light examination, to guide clinical decisions. For example, if the target tissue appears bright red fluorescent to the user (e.g., surgeon), the user will understand that this means there is a high density of tumor cells in that area and may choose to act on the information by removing additional tissue from the surgical cavity. Conversely, if the tissue appears weakly red fluorescent, the user may decide not to remove additional tissue but rather take a small piece of tissue to confirm the presence of tumor microscopically.
Thus, in this sense, the redness of the fluorescence may be considered predictive of tissue type and the presence of disease. In addition to looking at the color contained in the image, the clinician, surgeon, or other medical staff looking at the images may also look at the pattern or “texture” of the image. Further, not only is the intensity of a single color relevant, but combinations of colors together also provide information to the clinician. For example, the identification of green in the image can be an identifier of normal, healthy connective tissue in the image, such as collagen or elastin. The pattern that color makes may also provide an indication regarding the density of the tissue. For example, patchy or mottled green may indicate diffuse connective tissue while solid green may be indicative of dense connective tissue. Similarly, a large solid mass of red may indicate focal tumor or disease, while red dots spread throughout the image may be indicative of multifocal disease.
As noted above, seeing the interaction of or the position of the colors relative to one another can also provide information to the clinician. When red fluorescence and green fluorescence are in an image together, it is possible to see the extent of disease (red fluorescence) within healthy tissue (green fluorescence). Further, the positioning of the red (disease) relative to the green (healthy tissue) can guide a clinician during intervention to remove or resect the disease. Red and green together can also delineate the boundary between diseased and healthy tissue and provide context of the healthy tissue anatomy to guide resection. The combination of these colors together also provides feedback to the surgeon/clinician during interventions such as resection. That is, as the diseased tissue is removed or otherwise destroyed, the visual representation in red and green will change. As the red disappears and green becomes more prevalent, the surgeon will be receiving affirmative feedback that the disease is being removed, allowing the surgeon to evaluate the effectiveness of the intervention in real-time. This is applicable to many types of image-guided interventions including, for example, laparoscopy, resection, biopsy, curettage, brachytherapy, high-frequency ultrasound ablation, radiofrequency ablation, proton therapy, oncolytic virus, electric field therapy, thermal ablation, photodynamic therapy, radiotherapy, ablation, and/or cryotherapy.
When looking at color/texture/pattern of the image, it is possible for the clinician to differentiate tissue components such as connective tissue, adipose tissue, tumor, and benign tumor (hyperplastic lesions). In one aspect, the clinician can get an overall picture of healthy tissue versus diseased tissue (green versus red), and then, within diseased tissue, potentially differentiate between benign disease and malignant disease based on the intensity of the red fluorescence (benign=weak intensity, malignant=strong intensity). Non-limiting examples of benign disease that may be identified include fibroadenoma, hyperplasia, lobular carcinoma in situ, adenosis, fat necrosis, papilloma, fibrocystic disease, and mastitis.
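The green-versus-red triage described above can be expressed as a simple decision rule. The sketch below is illustrative only: the intensity thresholds are hypothetical placeholders, not values from the disclosure, and the category labels merely follow the benign=weak, malignant=strong heuristic stated in the text.

```python
# Hypothetical thresholds (0-255 scale) for mean red-channel fluorescence
# intensity within a region; real thresholds would be determined empirically.
WEAK_RED = 60
STRONG_RED = 160

def classify_region(mean_green, mean_red):
    """Coarse triage following the described logic: green dominance suggests
    healthy tissue; weak red suggests possible benign disease; strong red is
    suspicious for malignancy."""
    if mean_green > mean_red:
        return "healthy"
    if mean_red >= STRONG_RED:
        return "suspected malignant"
    if mean_red >= WEAK_RED:
        return "suspected benign"
    return "indeterminate"

print(classify_region(180, 40))   # → healthy
print(classify_region(30, 200))   # → suspected malignant
print(classify_region(30, 90))    # → suspected benign
```

Any such output would guide, not replace, histopathological confirmation.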
Looking at the red and green fluorescence together can also assist clinicians in targeting biopsies and curettage.
When looking at lymph nodes, it may be possible to identify subclinical disease and/or overt disease. The fluorescence image can be used to identify metastatic disease in the lymphatic system, the vascular system, and the interstitial space including infiltrate disease.
Based on the above examples, the features present in a multispectral image can be used to classify tissue and to determine the effectiveness of interventions. In particular, it is possible to do the following using the features found in a multispectral image:
Analysis of the images obtained herein may be performed by software running on the devices described herein or on separate processors. Examples of image analysis and appropriate software may be found, for example, in U.S. Provisional Patent Application No. 62/625,611, filed Feb. 2, 2018 and entitled “Wound Imaging and Analysis” and in international patent application no. PCT/CA2019/000002 filed on Jan. 15, 2019 and entitled “Wound Imaging and Analysis,” the entire content of each of which is incorporated herein by reference.
The multispectral images collected by the devices disclosed in this application, in accordance with the methods described in this application, may lead to the ability to do the following:
In accordance with another aspect of the present disclosure, a method of quantifying the fluorescence images obtained with the disclosed handheld multispectral device (first method of
The method of quantifying the fluorescence images (referred to herein as the first method) will be discussed first and will reference various steps identified in
For example, in the first method of
As shown in
As shown in steps 5-8 of
The imaging software may also allow a user to modify the imaging software's classification of the tissue sample via real-time tuning. For example, a user may view the imaging software's classification of the tissue sample (step 6). In one example, the imaging software may classify areas in the tissue sample as including connective tissue and the remaining areas as being background non-tissue. The user may then create a region of interest (ROI) around any histologically normal structures that are misclassified (step 7). For example, the user may identify one or more portions of the areas classified as connective tissue that are actually background non-tissue. Thus, the user may identify one or more areas in which the imaging device misclassified the portions as connective tissue. Such an identification may be used to refine/improve the imaging device in order to improve its accuracy in correctly identifying tissue. The user may also highlight additional areas of interest in the tissue sample in order to further refine/improve the accuracy of each tissue category (step 8).
In step 9 of the first method of
It is also contemplated that the imaging device may perform the analysis in step 9 (for the first method) on only a specific portion of the tissue sample, for example, on a specific region of interest within the tissue sample. In some embodiments, the region of interest may be a particular area of the tissue sample that is, for example, about one-third the size of the total tissue sample. In other embodiments, the region of interest may be an area of the tissue sample that is within a specific distance from the imaged surface.
The imaging software may extract area values (e.g. mm2) for each of the selected tissue categories (step 10) of
In the second method of
A method of determining the accuracy of the fluorescence images obtained with the disclosed handheld multispectral device (second method of
As shown in
As shown in steps 5-8 of
The imaging software may also allow a user to modify the imaging software's classification of the tissue sample via real-time tuning. For example, a user may view the imaging software's classification of the tissue sample (step 6). In one example, the imaging software may classify areas in the tissue sample as including connective tissue and the remaining areas as being background non-tissue. The user may then create a region of interest (ROI) around any histologically normal structures that are misclassified (step 7). For example, the user may identify one or more portions of the areas classified as connective tissue that are actually background non-tissue. Thus, the user may identify one or more areas in which the imaging device misclassified the portions as connective tissue. Such an identification may be used to refine/improve the imaging device in order to improve its accuracy in correctly identifying tissue. The user may also highlight additional areas of interest in the tissue sample in order to further refine/improve the accuracy of each tissue category (step 8).
In step 9 of the second method of
It is also contemplated that the imaging device may perform the analysis in step 9 (for the second method) on only a specific portion of the tissue sample, for example, on a specific region of interest within the tissue sample. In some embodiments, the region of interest may be a particular area of the tissue sample that is, for example, about one-third the size of the total tissue sample. In other embodiments, the region of interest may be an area of the tissue sample that is within a specific distance from the imaged surface.
The imaging software may extract area values (e.g. mm2) for each of the selected tissue categories (step 10) in the second method of
In another example, the imaging software may determine, for example, that the H&E stain shows that the tissue sample includes 35% connective tissue while the disclosed multispectral device shows that the tissue sample includes 25% connective tissue. In this example, the imaging software may then determine that the multispectral device is not accurate in its determination of identifying connective tissue and needs refinement, or that the imaging software itself needs refinement in its determination of identifying connective tissue (because the first area value is not equal to the second area value).
In order to determine the percent of each tissue category in the tissue sample, the imaging device may use the area values, as discussed above. For example, in order to calculate the relative percentage of a given tissue category, the imaging device may divide the area value of that tissue category by the area classified as normal tissue. The area classified as normal tissue may also include any region of interest specifically identified by the user as being normal tissue, as discussed above.
The imaging device may also use the area values, as discussed above, to determine a ratio of two components. For example, to determine a ratio of tumor tissue to connective tissue. Thus, the imaging device may divide the area value of the tissue classified as tumor tissue with the area value of the tissue classified as connective tissue.
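The percentage and ratio calculations described in the two preceding paragraphs can be sketched as follows. The area values here are hypothetical placeholders, not measurements from the disclosure; the arithmetic mirrors the described steps of dividing a category's area by the normal-tissue area, and dividing tumor area by connective-tissue area.

```python
# Hypothetical per-category area values in mm2 (illustrative numbers only).
areas_mm2 = {"connective": 120.0, "adipose": 200.0, "tumor": 80.0}

# Area classified as normal tissue, which would also include any region of
# interest the user specifically identified as normal (here, connective + adipose).
normal_area = areas_mm2["connective"] + areas_mm2["adipose"]

# Relative percentage of a given tissue category: its area divided by the
# area classified as normal tissue.
percent_connective = 100.0 * areas_mm2["connective"] / normal_area

# Ratio of two components, e.g., tumor tissue to connective tissue.
tumor_to_connective = areas_mm2["tumor"] / areas_mm2["connective"]

print(percent_connective)             # → 37.5
print(round(tumor_to_connective, 2))  # → 0.67
```

The same arithmetic applies regardless of which categories the user selects, so the sketch generalizes to blood, abnormal tissue, or other categories mentioned below.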
As discussed above, the data from the H&E stain is compared/correlated with the fluorescence images (step 12). This may be used to determine the accuracy of the disclosed multispectral device. Thus, a user may determine that the multispectral device accurately detects the presence and amount of tumor tissue but fails to accurately detect the presence and/or amount of connective tissue. Such may be helpful to refine the multispectral device.
The disclosed multispectral device may be refined by altering the optical filter of the device. For example, the transmission band of the optical filter may be varied in order to alter the detected fluorescence. Such may allow, for example, less green fluorescence to be viewed, which may more accurately correlate to the actual presence of connective tissue in the biopsy.
In some embodiments, the disclosed imaging device may be used with adipose tissue that produces, for example, a pinkish brown fluorescence emission. In this example, a user would select the tissue category of adipose tissue. In other embodiments, tissue categories such as blood and abnormal tissue (e.g., tumor, cancerous cells, lesions, benign tumor, and hyperplastic lesions) may be selected.
After a first tissue category is selected, a user may then select a second tissue category. The imaging software would then create a new first area value and a new second area value for the second tissue category. The software may then compare the new first area value and the new second area value, as discussed above with regard to the first and second area values.
It is also contemplated that the disclosed imaging software allows a user to determine if the multispectral device needs refinement without a high level of expertise by the user. Thus, the imaging software provides an easy and automated system to determine if the multispectral device needs refinement.
The imaging software can be used with devices other than the disclosed multispectral device. Thus, the imaging software may be used with a variety of devices in order to determine the accuracy of each device and whether it needs refinement.
It is contemplated that the steps of
In accordance with another aspect of the present disclosure, a method of quantifying color contrast is disclosed. For example, the method may be used to quantify the fluorescence color contrast between tumor tissue and normal tissue. Thus, the average color intensity of the tumor tissue is compared with the average color intensity of the normal tissue. In some embodiments, the method may be used to quantify the fluorescence color contrast between different intensities of connective tissue. Thus, the average color intensity of a first area of the connective tissue is compared with the average color intensity of a second area of the connective tissue. Such color contrasts may not be reliably perceived by a user's eye. For example, both the first and second areas may have a green autofluorescence that is so similar that a user's eye may not be able to discern the difference in color between the two areas. Thus, the method of
The method is illustrated in the flow chart of
As shown in step 1 of
The imaging software may also display the region of interest (ROI) in the tissue sample (step 3). For example, the region of interest may be demarcated by the user on a corresponding white light image of the tissue. The imaging software may then display this same region of interest in the RGB image. In one example, the region of interest may be a specific area that includes a high level of connective tissue or tumor tissue. In another example, the region of interest may include both tumor tissue and normal tissue. It is also contemplated that more than one region of interest may be used.
As shown in Step 4 of
In step 5, the imaging software may create a binary mask of the RGB image. As discussed further below, the binary mask may be used to determine the XYZ values from the RGB image. The binary mask may be created for only the area(s) specified by the region of interest. Next, the imaging software may calculate a mean RGB value and a mean XYZ value (step 6). For example, the imaging software may create a mean RGB value on a green fluorescence portion of the connective tissue and a corresponding XYZ value. The mean RGB value may be, for example, an average green intensity in the region of interest, and the mean XYZ value may be, for example, a corresponding tristimulus value.
Next, in step 7, the imaging software may derive the mean ‘x’ and ‘y’ parameters from the tristimulus values calculated in step 6. The ‘x’ value may be calculated according to the following formula: x=X/(X+Y+Z), and the ‘y’ value may be calculated according to the following formula: y=Y/(X+Y+Z). In step 8, a user may plot the ‘x’ and ‘y’ coordinates on a chromaticity diagram to represent the mean color of the specified tissue sample. For example, the specified tissue sample may have a green fluorescence color with a wavelength of 520 nm on the chromaticity diagram.
In some embodiments, the imaging software may create two ‘x’ and ‘y’ coordinates on the chromaticity diagram. The two coordinates may originate from the same tissue sample such that one coordinate correlates to tumor tissue and the other coordinate correlates to normal tissue. In other embodiments, one coordinate may correlate to tumor tissue in a first area of the tumor and the other coordinate correlates to tumor tissue in a second area of the same tumor.
As shown in step 9, the imaging software may then connect the two coordinates with a vector. In one example, a first coordinate has a wavelength of 520 nm (green) and a second coordinate has a wavelength of 640 nm (red) on the chromaticity diagram (so that the coordinates represent healthy and tumor tissue, respectively). A vector may connect these two coordinates. Then, in step 10, the imaging software may measure the Euclidean distance between the first and second coordinates. The Euclidean distance may provide an indication as to the color contrast between the green and red fluorescence colors in the RGB image. Thus, the Euclidean distance provides a method/system to quantify the color contrast between the green (normal tissue) and red (tumor tissue). Such may allow a user to easily distinguish the normal tissue in the specimen from the tumor tissue and to quantify the difference. A larger difference may be indicative of tumor tissue with a higher density, whereas a smaller difference may be indicative of tumor tissue with a lower density. Additionally or alternatively, a larger difference may be indicative of a higher dose of ALA in the patient.
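Steps 7 through 10 can be sketched numerically. The tristimulus values below are hypothetical placeholders chosen to stand in for a greenish (normal) region and a reddish (tumor) region; the chromaticity projection uses the x=X/(X+Y+Z) and y=Y/(X+Y+Z) formulas given above.

```python
import math

def chromaticity(X, Y, Z):
    """Project CIE XYZ tristimulus values to (x, y) chromaticity coordinates
    using x = X/(X+Y+Z) and y = Y/(X+Y+Z)."""
    total = X + Y + Z
    return X / total, Y / total

# Hypothetical mean tristimulus values for the two regions (illustrative only).
normal_xy = chromaticity(20.0, 60.0, 20.0)  # greenish normal tissue
tumor_xy = chromaticity(60.0, 30.0, 10.0)   # reddish tumor tissue

# Euclidean distance between the two chromaticity coordinates quantifies the
# green-red color contrast (step 10).
contrast = math.dist(normal_xy, tumor_xy)
print(round(contrast, 3))  # → 0.5
```

A larger distance would indicate a stronger color contrast between the two regions, consistent with the denser-tumor or higher-ALA-dose interpretations discussed above.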
In some embodiments, both the first and second coordinates may represent tumor tissue. Thus, the first coordinate may have a wavelength of 640 nm on the chromaticity diagram and the second coordinate may have a wavelength of 700 nm on the chromaticity diagram. Therefore, the second coordinate may correlate to tissue that has a darker red appearance than the first coordinate. The Euclidean distance between these two coordinates may allow a user to confirm that a color contrast does indeed exist between the two samples (which may be hard to ascertain based upon a user's vision alone). More specifically, the Euclidean distance may confirm that the two tissue samples are indeed different shades of red. Additionally, based upon the Euclidean distance, the imaging software may determine that the tissue sample with the darker shade of red (the second coordinate) has a higher density of tumor cells than the tissue sample with the lighter shade of red (the first coordinate). Such may allow a user to quantitatively determine the relative densities of tumor cells in one or more specified areas. In some examples, the tissue sample with the lighter shade of red may correspond to benign tissue, while the tissue sample with the darker shade of red may correspond to malignant tissue. Thus, the imaging system may allow a user to quantitatively determine whether a tissue sample is benign or malignant.
As shown in step 11 of
In step 13, the imaging system may output a chromaticity diagram for each of the three groups (control group, low dose ALA, and high dose ALA), as shown in FIG. 25. Each chromaticity diagram may include two points connected by a vector that depicts the distance between mean tumor color and mean normal tissue color within each group. A user may then compare the chromaticity diagrams for the three groups to quantitatively assess the differences.
It is contemplated that the steps of
It will be appreciated by those ordinarily skilled in the art having the benefit of this disclosure that the present disclosure provides various exemplary devices, systems, and methods for intraoperative or ex vivo visualization of tumors and/or residual cancer cells on surgical margins. Further modifications and alternative embodiments of various aspects of the present disclosure will be apparent to those skilled in the art in view of this description.
Furthermore, the devices and methods may include additional components or steps that were omitted from the drawings for clarity of illustration and/or operation. Accordingly, this description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the general manner of carrying out the present disclosure. It is to be understood that the various embodiments shown and described herein are to be taken as exemplary. Elements and materials, and arrangements of those elements and materials, may be substituted for those illustrated and described herein, parts and processes may be reversed, and certain features of the present disclosure may be utilized independently, all as would be apparent to one skilled in the art after having the benefit of the description herein. Changes may be made in the elements described herein without departing from the spirit and scope of the present disclosure and following claims, including their equivalents.
Furthermore, this description's terminology is not intended to limit the present disclosure. For example, spatially relative terms—such as “beneath,” “below,” “lower,” “above,” “upper,” “bottom,” “right,” “left,” “proximal,” “distal,” “front,” and the like—may be used to describe one element's or feature's relationship to another element or feature as illustrated in the figures. These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of a device in use or operation in addition to the position and orientation shown in the drawings. For the purposes of this specification and appended claims, unless otherwise indicated, all numbers expressing quantities, percentages or proportions, and other numerical values used in the specification and claims, are to be understood as being modified in all instances by the term “about” if they are not already. Accordingly, unless indicated to the contrary, the numerical parameters set forth in the following specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by the present disclosure. At the very least, and not as an attempt to limit the application of the doctrine of equivalents to the scope of the claims, each numerical parameter should at least be construed in light of the number of reported significant digits and by applying ordinary rounding techniques.
Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the present disclosure are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain errors necessarily resulting from the standard deviation found in their respective testing measurements. Moreover, all ranges disclosed herein are to be understood to encompass any and all sub-ranges subsumed therein.
It is noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the,” and any singular use of any word, include plural referents unless expressly and unequivocally limited to one referent. As used herein, the term “include” and its grammatical variants are intended to be non-limiting, such that recitation of items in a list is not to the exclusion of other like items that can be substituted or added to the listed items.
It should be understood that while the present disclosure has been described in detail with respect to various exemplary embodiments thereof, it should not be considered limited to such, as numerous modifications are possible without departing from the broad scope of the appended claims, including the equivalents they encompass.
This application claims priority to Provisional Application No. 62/625,967, filed on Feb. 2, 2018, to Provisional Application No. 62/625,983, filed on Feb. 3, 2018, and to Provisional Application No. 62/793,843, filed on Jan. 17, 2019, the entire content of each of which is incorporated by reference herein.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/CA2019/000015 | 2/1/2019 | WO | 00 |

| Number | Date | Country |
|---|---|---|
| 62793843 | Jan 2019 | US |
| 62625983 | Feb 2018 | US |
| 62625967 | Feb 2018 | US |