Systems, devices, and methods for fluorescence imaging are disclosed. In particular, the systems, devices, and methods may capture, interpret, and/or modify images using a fluorescence imaging system to identify or assist in the identification of characteristics related to a target. In various applications, for example, the target may be a wound and the systems, devices, and methods herein may be used to identify, quantify, and/or differentiate bacteria present in the wound.
Wound care is a major clinical challenge. Healing and chronic non-healing wounds are associated with a number of biological tissue changes including inflammation, necrosis, production of exudate, bleeding, proliferation, remodeling of connective tissues, and, a common major concern, bacterial presence, growth, and infection. A portion of wound infections are not clinically apparent and contribute to the growing personal, emotional, and economic burdens associated with wound care, especially in aging populations. For example, Pseudomonas aeruginosa and Staphylococcus aureus are species of bacteria that are prevalent in hospital settings and are common causes of bacterial infection. Currently, the clinical gold standard of wound assessment includes direct visual inspection of the wound site under white light illumination for classical signs and symptoms of infection. This is often combined with a swab culture or tissue biopsy sample for laboratory testing.
Certain medical specialties (e.g., cardiology, oncology, neurology, orthopedics, etc.) rely on particular imaging modalities (e.g., x-ray, ultrasound, magnetic resonance imaging (MRI), computed tomography (CT) scans, etc.) to assist with the diagnosis and assessment. Clinicians in such specialties may use advanced and established methods of interpreting the images. In wound care specialties, by contrast, the standard of care has not historically relied on such imaging modalities and no such advanced or established methods of interpreting the images exist. While some clinicians may use cameras to capture images of a wound in a standard photographic format, these formats do not identify or expose any bacterial information within the wound.
Qualitative and subjective visual assessment provides only a gross view of the wound site and does not provide information about underlying biological, biochemical, and molecular changes that are occurring at the tissue and cellular level. Moreover, bacteria are invisible to the unaided eye, resulting in suboptimal wound sampling and an inability to appropriately track changes in bacterial growth in the wound site. This can impede healing and timely selection of the optimal antimicrobial treatment. In addition, it may be difficult to differentiate certain markers of bacterial presence from similar markers caused by non-bacterial sources. For example, a fluorescence image may contain reflections from non-bacterial sources (e.g., tattoos, fingernails, toenails, jewelry, background environment, etc.) which appear to be similar in color to the fluorescence that would be expected from certain strains of bacteria. Furthermore, very high or very low bacterial loads may be difficult or impossible to detect with a high degree of confidence using comparative devices due to insufficient signal-to-noise ratios (SNRs) and/or image saturation. These situations may result in the misidentification of a wound as containing bacteria when it does not (perhaps resulting in medically unnecessary treatments) and/or the misidentification of a wound as being free from bacteria when it is in fact infected (potentially delaying treatment).
Therefore, there exists a need for systems, devices, and methods for capturing medical images (such as fluorescence images of wounds) which may reliably indicate the presence of regions that potentially have bacterial levels above a certain threshold, thereby reducing the morbidity and mortality associated with, in particular, chronic wounds.
In view of these and other circumstances, the present disclosure provides for methods, systems, and devices to provide point-of-care interpretation of fluorescence images to assist with the identification of regions of bacteria.
In one aspect of the present disclosure, there is provided a portable, hand-held device comprising: an illumination device including at least one excitation light source and a driver configured to drive the at least one excitation light source to sequentially produce a plurality of output intensities; an imaging device configured to capture a plurality of fluorescence images of a target surface respectively corresponding to the plurality of output intensities; a memory; and a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, and combine the selected image portions to generate a composite image.
In another aspect of the present disclosure, there is provided a system comprising: a display device; an illumination device including an excitation light source and a driver configured to drive the excitation light source to sequentially produce a plurality of output intensities; an imaging device configured to capture a plurality of fluorescence images of a target surface respectively corresponding to the plurality of output intensities; a housing; and circuitry disposed within the housing, the circuitry including a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, and combine the selected image portions to generate a composite image.
In yet another aspect of the present disclosure, there is provided a fluorescence image interpretation method, comprising: capturing a plurality of fluorescence images of a target surface, including sequentially for a plurality of different values of a drive parameter: driving an excitation light source to produce excitation light at an output intensity corresponding to the drive parameter, and capturing a respective fluorescence image of the target surface, wherein the fluorescence image includes an emission response of the target surface to the excitation light; co-registering the plurality of fluorescence images; dividing an image area into a plurality of sections; for each of the plurality of sections, selecting an image portion from one of the plurality of fluorescence images; and combining the selected image portions to generate a composite image.
In this manner, aspects of the present disclosure provide for improvements in at least the technical field of fluorescence imaging, as well as the related technical fields of image processing, medical devices, biophotonics, wound treatment, and the like. Additional aspects of the present disclosure will be set forth in part in the description which follows, and in part will be obvious from the description or may be learned by practice of the present disclosure. The aspects of the present disclosure, and advantages which arise therefrom, may be realized and attained by means of the elements and combinations particularly pointed out in the appended claims and their equivalents.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the present disclosure and claims.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
These and other aspects of the present disclosure are described with respect to the attached drawings, in which:
Reference will now be made in detail to various exemplary embodiments, examples of which are illustrated in the accompanying drawings. The various exemplary embodiments are not intended to limit the disclosure. To the contrary, the disclosure is intended to cover alternatives, modifications, and equivalents of the exemplary embodiments. In the drawings and the description, similar elements are provided with similar reference numerals. It is to be noted that the features explained individually in the description can be mutually combined in any technically expedient manner and disclose additional embodiments of the present disclosure.
The present disclosure provides devices, systems, and computer-implemented methods which provide for fluorescence imaging in which a light source and/or an image sensor is modulated to capture a series of fluorescence images respectively corresponding to different levels of excitation light and/or different exposure periods. The different images may then be mosaicked and/or stitched together to generate a composite image capable of rendering a wide range of component (e.g., bacteria) concentrations in a target surface. While the following description is presented with regard to an exemplary implementation in which the target surface is a wound, devices, systems, and computer-implemented methods in accordance with the present disclosure may be used to detect contamination on non-organic target surfaces in addition to organic tissue surfaces. As such, the target surface may be, without limitation, a wound, a tissue sample, a surgical instrument, a surgical suite, a medical device, forensic equipment, food processing equipment, and the like.
In comparative devices and systems, a fluorescence imaging operation may capture only a single fluorescence image corresponding to a single excitation light source power level and a single exposure period (herein also referred to as a “nominal fluorescence image” or a fluorescence image obtained at a “nominal power level” or a “nominal exposure period”). Such comparative devices and systems may thus be capable of detecting bacterial loads above a certain threshold level. However, if the bacterial load is below this threshold level, the nominal power level and/or exposure period may be insufficient to provide an adequate SNR for a particular area of the target surface, thus resulting in a bacterial infection that goes undetected. If the bacterial load is sufficiently above this threshold level, the signal may become saturated in the resulting image, thus rendering it difficult or impossible to adequately classify or categorize the source of the signal.
Thus, devices, systems, and computer-implemented methods in accordance with the present disclosure may provide for improved confidence in fluorescence signature interpretation, may result in a reduced occurrence of false negatives and/or false positives, and may be used to provide a qualitative or semi-qualitative estimate of bacterial load.
Although the present disclosure is described primarily with regard to imaging of wound components (e.g., collagen, blood and/or vasculature, oxygenation and/or perfusion, bone, adipose tissue, exudate, and bacteria) and identification of characteristics captured in fluorescent images, the devices, systems, and methods disclosed herein can also be used to capture images and identify characteristics of excised tissue, such as cancerous tissue (e.g., lumpectomy for breast cancer surgery) captured in fluorescent images. In use with excised tissue, the devices and methods could be used to identify characteristics such as, for example, tissue components, tumor size, tumor edge, tumor boundaries, and tissue vascularization shown in fluorescent images. Moreover, while the present disclosure is described primarily with regard to certain wavelengths of fluorescence and corresponding or associated bacterial species, and so on, the present disclosure is not so limited. In practical implementations, the devices, systems, and methods herein may be recalibrated for any bacterial species based on a unique fluorescence emission signature of a given wavelength(s) or wavelength range(s) for the bacterial species, and similarly, can be used with other tissue components (normal or abnormal) for which a unique fluorescence signature is known or for which a signature of a unique biomarker or of a combination of biomarkers contained therein is known. The present disclosure may additionally be implemented with regard to non-bacterial signatures, including but not limited to viruses, fungi, yeast, and/or other microorganisms. The devices, systems, and methods disclosed herein are not limited to use with fluorescence images, and instead may be used to analyze other types of images such as white-light images.
Exemplary wound monitoring devices described herein include hand-held/portable optical digital imaging devices having specific excitation light sources and optical filters (e.g., low-pass filters, high-pass filters, band-pass filters, multi-band filters, polarization filters, etc.) attached thereto, although in some implementations the hand-held/portable optical digital imaging devices described herein may be filterless. Such exemplary devices may be configured with optical heads to permit multiple different use cases, including but not limited to endoscopic imaging, and in some implementations such devices may be modular. These devices include but are not limited to those described in International Patent Application Publication WO 2009/140757 A1, International Patent Application Publication WO 2019/148268 A1, International Patent Application Publication WO 2020/0148725 A1, and/or International Patent Application Publication WO 2020/0148726 A1, each of which is incorporated by reference herein in its entirety. Using imaging devices and systems further described herein, fluorescence of components in a wound due to exposure to excitation light may be imaged and analyzed. For example, in a wound having a bacterial presence caused by or containing, for example, Pseudomonas aeruginosa, the Pseudomonas aeruginosa fluoresce with a specific spectral signature, i.e., one or more bands of wavelengths with known peaks, when subjected to excitation light. The excitation light may comprise any light having a known wavelength or range of wavelengths with known peaks, such as a peak at 405 nm. Capturing and analyzing this data permits identification of bacterial presence in general, as well as identification of the presence of specific types of bacteria. In order to identify, type, and quantify the bacterial presence as well as additional characteristics of the wound, the devices and systems are trained.
One example of a wound monitoring device is a portable, handheld imaging system that includes an imaging device having two or more cameras (i.e., camera sensors) and a processor coupled to the imaging device for analyzing the images captured from the camera sensors to perform algorithms or other operations as will be described in more detail below. The imaging device, for example, includes a first, primary camera sensor and a second, secondary camera sensor. The first, primary camera sensor and the second, secondary camera sensor may be configured to capture standard, white light (WL) images, fluorescent (FL) images, near infrared (NIR) images, or infrared (IR) images. The sensors may be so configured by use with dedicated filters or filters selectable from a plurality of filters associated with the imaging device (e.g., filter wheel, tunable filters, etc.), in which the filters may be wavelength filters (e.g., low-pass filters, high-pass filters, band-pass filters, multi-band filters, etc.) and/or polarization filters. Thus, the method disclosed herein may be used to measure features captured in WL, FL, NIR, or IR images. In some implementations, to permit determination of the parallax value of a primary and secondary image (taken, respectively, by the primary and secondary camera sensors), the first camera sensor is separated from the second camera sensor by a predetermined, fixed separation distance. In other implementations, to permit depth measurement, a time-of-flight or other depth imaging sensor may be provided. In some implementations, the imaging device may include a thermal image sensor to capture thermal images.
The imaging device 120 includes at least one image sensor, such as, for example, a first image sensor 121, a second image sensor 122, and a third image sensor 123. Each of the first image sensor 121, the second image sensor 122, and the third image sensor 123 may individually be implemented as image sensors that may be used for one or more of WL, FL, IR, and thermal imaging. In one example, the first image sensor 121 and the third image sensor 123 are together configured for stereoscopic white-light imaging, and the second image sensor 122 is configured for fluorescence imaging. In another example, the first image sensor 121 and the third image sensor 123 are together configured for stereoscopic fluorescence imaging, and the second image sensor 122 is configured for white-light imaging. In yet another example, the first image sensor 121 is configured for white-light imaging, the second image sensor 122 is configured for fluorescence imaging of a first wavelength or wavelength range, and the third image sensor 123 is configured for fluorescence imaging of a second wavelength or wavelength range. The physical arrangement (i.e., ordering) of the first image sensor 121, the second image sensor 122, and the third image sensor 123 may also be different from that shown in
The housing 130 may include a physical user interface, such as a power button, one or more input buttons, and so on. The housing 130 may also include various input and output ports, such as wired charging ports, inductive charging ports, universal serial bus (USB) ports, and/or other peripheral ports. As shown in
Device 200 further includes a rocker switch 211 enabling switching between a standard imaging mode and a fluorescence imaging mode. For instance, device 200 captures real-time images (e.g., in JPG format), and videos (e.g., in MOV format) using both standard and fluorescent imaging modes. The standard imaging mode is generally used for standard photography, i.e., to capture RGB images and videos of targets illuminated with standard white light. The fluorescence imaging mode is used to capture RGB images and videos of targets illuminated with light having known peak wavelengths and intended to generate fluorescence from specific targets being excited by the light. Consequently, device 200 further includes LEDs 212 that have specific wavelengths or ranges of wavelengths for illuminating targets when in fluorescence imaging mode, as well as a camera lens 213 enabling image and video capture, a range finder sensor 214 for detecting an optimal distance from a wound or surrounding skin, and an ambient light sensor 215 for detecting optimal lighting conditions for the fluorescence imaging mode. Further, device 200 includes a holding contour 217 for allowing a user to grip the device securely, and a charging port 218 enabling device charging using a standard or proprietary power adapter.
The fluorescence image sensor 310 is configured to capture one or a plurality of fluorescence images of a target surface (e.g., a wound), and may be implemented as or including a photoelectric conversion device which receives incident electromagnetic radiation (e.g., light) and converts the radiation into signal charges which may be used to generate an image of the field-of-view of the image sensor. Such an image sensor may have, for example, a complementary metal-oxide semiconductor (CMOS) architecture, a charge-coupled device (CCD) architecture, and so on. The photoelectric conversion device may include an array of a plurality of individual pixel circuits, each of which includes a photosensitive element such as a photodiode. The image sensor may include additional circuitry such as driving circuits, timing circuits, memory circuits, control circuits, output circuits, power circuits, buses, and the like. For example, the fluorescence image sensor 310 may include a timing control circuit configured to modulate the length of time during which the photosensitive elements of the pixel circuits accumulate charge, i.e., to modulate an exposure period for images captured by the fluorescence image sensor 310. One example of a fluorescence image sensor 310 which includes a timing control circuit is illustrated in
As illustrated in
Returning to
The excitation light source 320 may be or include one or more light emitting elements that produce excitation light or illumination. The excitation light or illumination may be broadband (e.g., 300-1200 nm) or may be narrowband or multi-narrowband (e.g., one or a combination of 300-350 nm, 325-375 nm, 350-400 nm, 375-425 nm, 400-450 nm, 425-475 nm, 450-500 nm, 475-525 nm, 500-550 nm, 525-575 nm, 550-600 nm, 575-625 nm, 600-650 nm, 625-675 nm, 650-700 nm, 675-725 nm, 700-750 nm, 725-775 nm, 750-800 nm, 775-825 nm, 800-850 nm, 825-875 nm, 850-900 nm, 875-925 nm, 900-950 nm, 925-975 nm, 950-1000 nm, 975-1025 nm, 1000-1050 nm, 1025-1075 nm, 1050-1100 nm, 1075-1125 nm, 1100-1150 nm, 1125-1175 nm, or 1150-1200 nm), for example, monochromatic or white light having a wavelength peak of 400-450 nm, or any other combination of single or multiple wavelengths (e.g., wavelengths in the ultraviolet/visible/near infrared/infrared ranges), to illuminate a target object (e.g., a wound or other area of interest) in order to elicit an optical signal (e.g., fluorescence). For example, the excitation light source 320 may be blue or violet LED arrays emitting light at about 405 nm (e.g., ±5 nm, ±10 nm, ±15 nm, or ±20 nm), and may be coupled with additional band-pass filters centered at about 405 nm to remove/minimize the side spectral bands of light from the LED array output so as not to cause light leakage into the imaging detector with its own optical filters. It should be understood that the foregoing wavelengths are merely exemplary and not exhaustive. In some implementations, the narrowband or multi-narrowband wavelengths may have any peak wavelength between 300 nm and 1200 nm and are not limited to a bandwidth of 50 nm. For example, the above list of wavelengths may include wavelengths within 15 nm of the above values, such that “about 400-450 nm” includes 385-450 nm, 415-450 nm, 400-435 nm, 400-465 nm, 403-444 nm, and so on. The excitation light source 320 may further or alternatively comprise a laser diode and/or filtered lights arranged in a variety of geometries. The device 300 may include a mechanism or apparatus (e.g., a heatsink or a cooling fan) to dissipate heat and cool the excitation light source 320. The device 300 may include a system or device (e.g., an optical band-pass filter) to remove any undesirable wavelengths of light from the excitation light source 320 used to illuminate the object being imaged. The excitation light source 320 may include a light source driver operatively connected to the light emitting elements included in the excitation light source 320. The light source driver may, for example, receive a control signal (such as from the controller 330) and generate a corresponding drive signal which causes the light emitting elements to produce excitation light or illumination of a corresponding intensity, pattern, duration, etc. One example of an excitation light source 320 which includes a light source driver and light emitting elements is illustrated in
As illustrated in
Returning to
The display device 340 may be any type of display, including but not limited to a liquid crystal display (LCD), a quantum dot display, an organic light-emitting diode (OLED) display, a thin-film transistor (TFT) display, and the like. The display device 340 may be configured to provide real-time display of the field-of-view of the fluorescence image sensor 310, to display images stored in memory 350 or remotely stored, to provide graphical overlays, and so on. The display device 340 may include a touch panel to permit input from a clinician as a user interface. The display device 340 may be configured to present one or more graphical user interfaces (GUIs).
The UI 360 may include physical and virtual mechanisms by which a user interacts with the device 300, including physical buttons, dials, switches, and the like; interactive icons displayed on the display device 340; and/or one or more GUIs. The power supply 370 may be an AC/DC power supply, a compact battery bank, or a rechargeable battery pack. Additionally or alternatively, the device 300 may be adapted for connecting to an external power supply. The communication circuitry 380 may include wired communication circuitry and/or wireless communication circuitry to permit communications between the device 300 and external devices. The communication circuitry 380 may be configured to communicate using wired communication media such as coaxial cables, USB cables, Ethernet cables, and so on; and/or using wireless communication protocols such as Wi-Fi, Bluetooth, Near Field Communication (NFC), 4G cellular communications, 5G wireless communications, and so on.
The I/O circuitry 390 may be or include one or more of: an interface for a head-mounted display; an interface for an external printer; an interface for a tablet computer, laptop computer, desktop computer, or other computer device; an interface for a device allowing the use of extra memory; and an interface for a microphone. The device 300 may have a housing that houses all the components in one entity. The housing may be equipped with a means of securing any digital imaging device within it. The housing may be designed to be hand-held, compact, and/or portable. The housing may be one or more enclosures or separate housings.
The device 300 may be configured with application programs (e.g., stored in the memory 350 and loaded by the controller 330 for execution) to perform various algorithms, including those to provide excitation light source modulation and/or fluorescence image sensor modulation for fluorescence images, image processing, image interpretation, and so on.
In
In response to the predetermined trigger, the fluorescence image sensor 310 is set to the first exposure period t1 and a first fluorescence image is captured. Then, the fluorescence image sensor 310 is set to the second exposure period t2 and a second fluorescence image is captured. This repeats until the Nth fluorescence image is taken at the maximum exposure period tN. In this manner, the “maximum” exposure period tN does not necessarily mean the maximum exposure period of which the fluorescence image sensor is capable, but instead refers to the longest exposure period for the fluorescence imaging operation. After the Nth fluorescence image is captured, the exposure period may be reset and a white light image may be captured.
In one particular example, N=4, t1=25 milliseconds (ms), t2=75 ms, t3=100 ms, and t4=200 ms. In this example, when a user initiates a fluorescence imaging operation (e.g., by pressing a button on the UI 360 or by entering a command via a GUI presented on the display device 340), the wound monitoring device 300 captures four fluorescence images over ** seconds. The fluorescence imaging may be automatically followed by a white light imaging operation as noted above, as well as by various image processing operations as will be described in more detail below.
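For illustration, the exposure-period modulation sequence just described can be sketched in Python. This is a minimal sketch, assuming a hypothetical `sensor` object whose `set_exposure_ms()`, `capture()`, and `reset_exposure()` methods stand in for the timing control circuit and image sensor interface of the fluorescence image sensor 310; it is not an actual device API.

```python
# Minimal sketch of exposure-period modulation (hypothetical sensor API).
EXPOSURE_PERIODS_MS = [25, 75, 100, 200]  # t1..t4 from the example above (N=4)

def capture_exposure_series(sensor):
    """Capture one fluorescence image per exposure period, in sequence."""
    images = []
    for t_ms in EXPOSURE_PERIODS_MS:
        sensor.set_exposure_ms(t_ms)     # timing control circuit sets the charge accumulation time
        images.append(sensor.capture())  # capture a fluorescence image at this exposure period
    sensor.reset_exposure()              # reset before the subsequent white-light capture
    return images
```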
In
In response to the predetermined trigger, the excitation light source 320 is driven to the first output power level P1 and a first fluorescence image is captured. Then, the excitation light source 320 is driven to the second output power level P2 and a second fluorescence image is captured. This repeats until the excitation light source 320 is driven to the maximum output power level PN and the Nth fluorescence image is taken. In this manner, the “maximum” output power level PN does not necessarily mean the maximum output power level of which the excitation light source 320 is capable, but instead refers to the highest output power for the fluorescence imaging operation. After the Nth fluorescence image is captured, driving of the excitation light source 320 stops such that the output power level returns to zero. At this point (or, in other implementations, prior to capture of the first fluorescence image), a white light image may be captured.
In one particular example, N=4, P1=0.44 milliwatts (mW), P2=2.3 mW, P3=4.2 mW, and P4=6.2 mW. In another particular example, N=6, P1=0.44 mW, P2=1.44 mW, P3=2.45 mW, P4=3.46 mW, P5=4.5 mW, and P6=5.5 mW. In these examples, when a user initiates a fluorescence imaging operation (e.g., by pressing a button on the UI 360 or by entering a command via a GUI presented on the display device 340), the wound monitoring device 300 captures four or six fluorescence images over two or three seconds, respectively. The fluorescence imaging operation may be automatically followed by a white light imaging operation as noted above, as well as by various image processing operations as will be described in more detail below.
In some implementations, LED modulation and exposure period modulation may be used together. However, it is also contemplated that only LED modulation or only exposure period modulation may be utilized. Moreover, a combination of LED and exposure period modulation may be used, for example to capture a first set of images at a first output power level for n1 different exposure periods, followed by a second set of images at a second power level for n2 different exposure periods, and so on. In still other implementations, the output power level of an LED may be iteratively increased to the maximum level PN (holding the exposure period constant), after which the exposure period may be iteratively increased to capture additional images at the maximum power level PN. In yet other implementations, LED modulation, exposure period modulation, or both may be implemented in a successive manner with different power levels and/or exposure periods in order to provide coarse modulation followed by fine modulation. For example, after a first iteration of LED modulation from power level P1 to P5, a second iteration of LED modulation may be performed from power level P1′ to P5′, where each of P1′ to P5′ is between P3 and P4.
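A combined schedule of the kind described above may be sketched as nested sweeps, with the output power level as the coarse variable and the exposure period as the fine variable. The `driver` and `sensor` objects and their methods below are hypothetical placeholders, and the power levels and exposure periods are example values only.

```python
# Minimal sketch of combined LED and exposure-period modulation
# (hypothetical driver/sensor API; values are examples only).
POWER_LEVELS_MW = [0.44, 2.3, 4.2, 6.2]   # P1..PN (coarse modulation)
EXPOSURE_PERIODS_MS = [25, 75, 100, 200]  # exposure periods per power level (fine modulation)

def capture_combined_series(driver, sensor):
    """Capture a set of images at each output power level, one per exposure period."""
    images = []
    for p_mw in POWER_LEVELS_MW:
        driver.set_output_power_mw(p_mw)  # drive the excitation light source
        for t_ms in EXPOSURE_PERIODS_MS:
            sensor.set_exposure_ms(t_ms)
            images.append(sensor.capture())
    driver.set_output_power_mw(0.0)       # return the output power level to zero
    return images
```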
Returning to
The process of co-registering images may include various operations to determine the spatial relationship between the pixels of one fluorescence image and the pixels of another fluorescence image. For example, the controller 330 may be configured to determine that the upper-left pixel (0,0) of the first fluorescence image corresponds to a different pixel (x,y) of the second fluorescence image. The pixel correspondence may be determined for multiple pixels in the first and second fluorescence images, such that the controller 330 may be able to determine whether and to what degree the wound monitoring device 300 was moved (i.e., translated, rotated about the imaging axis, rotated about other perpendicular axes, and so on) between the capture of the first fluorescence image and the capture of the second fluorescence image. The co-registration operations may be performed for each fluorescence image. Afterward, the controller 330 may be configured to identify a rectangular area of the target surface that is present in all fluorescence images. This rectangular area may be identified as an “image area” for later processing. In some implementations, the process of co-registering images includes a white light image and/or a thermal image in addition to the fluorescence images.
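One plausible way to implement such co-registration, sketched below with OpenCV, is to match keypoints between each fluorescence image and a reference image and to estimate a homography from the matches. This is an illustrative approach only and is not necessarily the registration algorithm employed by the controller 330; 8-bit BGR inputs are assumed.

```python
import cv2
import numpy as np

def coregister(reference, moving):
    """Warp `moving` into the pixel frame of `reference` using ORB keypoints and a homography."""
    orb = cv2.ORB_create(nfeatures=2000)
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(moving, cv2.COLOR_BGR2GRAY), None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]
    dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    # RANSAC discards mismatched keypoints (e.g., from specular reflections)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = reference.shape[:2]
    return cv2.warpPerspective(moving, H, (w, h))
```

The rectangular image area common to all co-registered images could then be taken as the intersection of the warped image footprints.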
The process of dividing the image may include various operations to section each fluorescence image. For example, the controller 330 may be configured to divide the image area for each fluorescence image into a plurality of tiles. The tiles may be regularly sized and shaped (e.g., a plurality of rectangles of equal size) or irregularly sized and shaped (e.g., polygonal tessellation across the image area). If the tiles are irregularly sized and shaped, the tiles may be larger in areas that are comparatively featureless in the fluorescence images (e.g., portions of the image area where there is little signal and/or where the signal changes little) and smaller in areas that include features (e.g., small areas of brightness or areas where the signal changes quickly from pixel to pixel). In some implementations, the white light image may be used to determine which portions of the image area should have larger tiles and which portions of the image area should have smaller tiles. In other implementations, one or more fluorescence images may additionally or alternatively be used. Regardless of whether the tiles are regular or irregular, the tiling pattern is the same for the image area of each fluorescence image.
The process of selecting a best area may include various operations to determine, on a tile-by-tile basis, which fluorescence image conveys the best information on the status of the target area. For example, the controller 330 may be configured, for a given tile, to determine the fluorescence signal present in each fluorescence image. The fluorescence signal may be analyzed in terms of either the absolute signal or the SNR of the fluorescence image. Using the SNR instead of the absolute signal may permit the controller 330 to account for instances where the signal is saturated. The fluorescence image having the best value of the fluorescence signal (e.g., the highest SNR) may be flagged or otherwise identified as the best image for the given tile.
The process of mosaicking may include various operations to generate an output image consisting of components selected from multiple different fluorescence images. For example, the controller 330 may be configured to generate a blank image having a dimension corresponding to the image area and having a tiling pattern corresponding to the tiling pattern used for the fluorescence images. For each tile, the controller 330 may be configured to use the image data corresponding to the fluorescence image identified as the best image. The output of this process may be referred to as a “mosaicked image.”
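Taken together, the tile division, per-tile selection, and mosaicking described in the three preceding paragraphs might be sketched as follows. This sketch assumes regular rectangular tiles, 8-bit co-registered input images, and a simple mean-to-standard-deviation SNR estimate with a crude saturation guard; the tiling scheme and quality metric are illustrative assumptions rather than prescribed choices.

```python
import numpy as np

def mosaic_best_tiles(images, tiles_y=8, tiles_x=8):
    """Build a mosaic by picking, per tile, the co-registered image with the best SNR."""
    stack = np.stack(images).astype(np.float32)     # shape (N, H, W, C)
    out = np.zeros_like(stack[0])
    h, w = stack.shape[1:3]
    ys = np.linspace(0, h, tiles_y + 1, dtype=int)  # tile boundaries (rows)
    xs = np.linspace(0, w, tiles_x + 1, dtype=int)  # tile boundaries (columns)
    for y0, y1 in zip(ys[:-1], ys[1:]):
        for x0, x1 in zip(xs[:-1], xs[1:]):
            tile = stack[:, y0:y1, x0:x1]
            signal = tile.mean(axis=(1, 2, 3))      # per-image mean signal in this tile
            snr = signal / (tile.std(axis=(1, 2, 3)) + 1e-6)
            snr[signal > 250] = -np.inf             # treat near-saturated tiles (8-bit) as unusable
            best = int(np.argmax(snr))              # index of the best source image for this tile
            out[y0:y1, x0:x1] = tile[best]
    return out.astype(images[0].dtype)
```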
The process of stitching may include various operations to smooth or otherwise remove discontinuities present in the mosaicked image. For example, the image data selected for one tile and the image data selected for an adjacent tile may originate from different source fluorescence images. In such instances, discontinuities may exist at the border between the neighboring tiles where one or more parameters (e.g., one or more of the Hue, Saturation, and Value channels in the HSV color space) abruptly jump from one pixel to the next. To remove such discontinuities, the controller 330 may be configured to modify the image data such that the discontinuous parameters for pixels in the vicinity of the border (e.g., within ten pixels, within fifty pixels, and so on) smoothly transition from one end of the vicinity to the other. The output of this process may be referred to as a “stitched image.”
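One illustrative way to perform such stitching is to feather a band of pixels around each internal tile border, blending the mosaicked image with a blurred copy of itself; the band width and Gaussian blend below are arbitrary choices, a 3-channel image is assumed, and `ys` and `xs` are tile boundary arrays such as those computed in the previous sketch.

```python
import cv2
import numpy as np

def smooth_seams(mosaic, ys, xs, band=10):
    """Blend a band of +/- `band` pixels around each internal tile border."""
    out = mosaic.astype(np.float32)
    blurred = cv2.GaussianBlur(out, (0, 0), sigmaX=band / 2.0)
    mask = np.zeros(mosaic.shape[:2], np.float32)
    for y in ys[1:-1]:                              # internal horizontal borders
        mask[max(0, y - band):y + band, :] = 1.0
    for x in xs[1:-1]:                              # internal vertical borders
        mask[:, max(0, x - band):x + band] = 1.0
    mask = cv2.GaussianBlur(mask, (0, 0), sigmaX=band / 2.0)[..., None]
    return (out * (1.0 - mask) + blurred * mask).astype(mosaic.dtype)
```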
The process of interpreting may include various operations to automatically analyze a processed image (e.g., the mosaicked image and/or the stitched image) to determine the presence and/or concentration of one or more components of the target surface. For example, the controller 330 may be configured to perform a color analysis and/or a textural analysis of the processed image. In one particular example, the color analysis includes converting the processed image to an HSV color space (if it is not already in said color space) to generate a first converted image; comparing a saturation parameter of the first converted image to a first saturation threshold condition, wherein the first saturation threshold condition is based on a correlation between a first color and a first bacterial fluorescence signature; comparing a value parameter of the first converted image to a first value threshold condition, wherein the first value threshold condition is based on the correlation between the first color and the first bacterial fluorescence signature; flagging areas of the processed image where the saturation parameter satisfies the first saturation threshold condition and the value parameter satisfies the first value threshold condition; unflagging the flagged areas which are smaller than a size threshold; and outlining the remaining flagged areas on the processed image, thereby generating an overlay image. The color analysis may be repeated for multiple colors, each having its own saturation threshold conditions and/or value threshold conditions based on the correlations between the individual color and different bacterial fluorescence signatures. In other examples of the present disclosure, a color space other than HSV may be used in the color analysis, including but not limited to HSL, CIEXYZ, CIELUV, L*a*b*, RGB, YCbCr, YUV, LCh, CMY, CMYK, and custom color spaces. Where color spaces other than HSV are used, the threshold conditions (e.g., the hue thresholds, the saturation thresholds, and/or the value thresholds) may be converted to the other color spaces directly. Where the color space used in the color analysis is the same as the color space in which the input image has been captured (e.g., both RGB), then the conversion operation may be omitted.
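A minimal sketch of such a color analysis is given below, assuming an 8-bit BGR input and a single hypothetical color band. The hue, saturation, value, and size thresholds are placeholder values only; in practice, the threshold conditions would be derived from the correlation between a given color and a given bacterial fluorescence signature.

```python
import cv2
import numpy as np

def flag_color(processed_bgr, hue_lo=40, hue_hi=70, sat_min=80, val_min=60, min_area=50):
    """Flag areas matching one color band, then unflag areas below a size threshold."""
    hsv = cv2.cvtColor(processed_bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    mask = ((h >= hue_lo) & (h <= hue_hi) &   # hue band for the target color (placeholder)
            (s >= sat_min) & (v >= val_min)).astype(np.uint8)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    for i in range(1, n):                     # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            mask[labels == i] = 0             # unflag areas smaller than the size threshold
    return mask  # remaining areas can be outlined with cv2.findContours to form the overlay
```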
In a particular example, the textural analysis includes converting the processed image to grayscale to generate a second converted image; comparing, on a pixel-by-pixel basis, an intensity parameter of the second converted image to an intensity threshold; temporarily flagging regions within a predetermined pixel distance of pixels in which the intensity parameter exceeds the intensity threshold; converting the processed image to an L*a*b* color space (if it is not already in said color space), thereby generating a third converted image; within each channel of the third converted image, determining a respective gradient of the temporarily flagged regions; and permanently flagging those ones of the temporarily flagged regions in which the respective gradient for each channel of the third converted image exceeds a gradient threshold. In other examples of the present disclosure, a color space other than L*a*b* may be used in the textural analysis, including but not limited to HSV, CIEXYZ, CIELUV, RGB, YCbCr, YUV, LCh, CMY, CMYK, and custom color spaces. Where color spaces other than L*a*b* are used, the thresholds (e.g., the intensity thresholds and/or the gradient thresholds) may be selected and/or generated anew based on the characteristics of the other color spaces. Where the color space used in the textural analysis is the same as the color space in which the input image has been captured (e.g., both RGB), then the conversion operation may be omitted. Where the color space used in the textural analysis is the same as the color space used in the color analysis, then one of the conversion operations may be omitted and/or the two conversion operations may be combined (depending on the order in which the color analysis and the textural analysis occur and/or whether the analyses are conducted in series or parallel). An “overlay image” may refer to a combination of an overlay with the processed image itself, in which the overlay includes contours drawn as a result of the color and/or textural analysis. Alternatively, an “overlay image” may refer to the overlay itself as an element that is shown, stored, etc. separate from the processed image itself, such that no modifications are made to the processed image.
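Similarly, the textural analysis might be sketched as follows, with placeholder values for the intensity threshold, pixel distance, and gradient threshold; Sobel gradient magnitudes are used here as one assumed way of computing the per-channel gradients in the L*a*b* image.

```python
import cv2
import numpy as np

def flag_texture(processed_bgr, intensity_thresh=200, dist=5, grad_thresh=12.0):
    """Keep bright regions only where all three L*a*b* channels vary strongly."""
    gray = cv2.cvtColor(processed_bgr, cv2.COLOR_BGR2GRAY)
    bright = (gray > intensity_thresh).astype(np.uint8)
    kernel = np.ones((2 * dist + 1, 2 * dist + 1), np.uint8)
    candidate = cv2.dilate(bright, kernel)    # temporarily flag pixels within `dist`
    lab = cv2.cvtColor(processed_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    grads = []
    for c in range(3):                        # gradient magnitude for each L*a*b* channel
        gx = cv2.Sobel(lab[..., c], cv2.CV_32F, 1, 0)
        gy = cv2.Sobel(lab[..., c], cv2.CV_32F, 0, 1)
        grads.append(np.hypot(gx, gy))
    n, labels = cv2.connectedComponents(candidate, connectivity=8)
    keep = np.zeros_like(candidate)
    for i in range(1, n):
        region = labels == i
        if all(g[region].mean() > grad_thresh for g in grads):
            keep[region] = 1                  # permanently flag this region
    return keep
```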
The textural analysis may be used to differentiate between bacterial sources of apparent fluorescence and non-bacterial sources of apparent fluorescence (e.g., bone, tendon, tattoo ink, and so on). The individual bacterial signature may correspond to a bacterial concentration depending on the output power level and/or exposure period at which the corresponding fluorescence image was captured (e.g., 10² colony forming units per gram (cfu/g) or higher if the bacteria was detected only at high output power levels and/or long exposure periods, 10⁷ cfu/g or higher if the bacteria was detected even at low output power levels and/or short exposure periods, 10⁴ cfu/g or higher if the bacteria was first detected at intermediate output power levels and/or medium exposure periods) and may indicate the presence of any one of the following: gram negative aerobic species, including Pseudomonas aeruginosa, Escherichia coli, Proteus mirabilis, Proteus vulgaris, Enterobacter cloacae, Serratia marcescens, Acinetobacter baumannii, Klebsiella pneumoniae, Klebsiella oxytoca, Morganella morganii, Stenotrophomonas maltophilia, Citrobacter koseri, Citrobacter freundii, Aeromonas hydrophila, Alcaligenes faecalis, and Pseudomonas putida; gram positive aerobic species, including Staphylococcus aureus, Staphylococcus epidermidis, Staphylococcus lugdunensis, Staphylococcus capitis, Corynebacterium striatum, Bacillus cereus, and Listeria monocytogenes; and/or anaerobic species, including Bacteroides fragilis, Clostridium perfringens, Peptostreptococcus anaerobius, Propionibacterium acnes, and Veillonella parvula. This list is intended to be exemplary only and does not encompass all species to which the identification and differentiation operations may be applied.
The operations of determining a concentration of the component may result in a qualitative or semi-qualitative estimate of the concentration. For example, the concentration may categorize an estimated bacterial load for one or more identified bacterial species as none, occasional, light, moderate, or heavy. The estimate may be based on which fluorescence image was earlier identified as the best image for areas determined to correspond to the bacterial species. For example, if a particular area of bacterial contamination corresponds to an image area where the best fluorescent image was taken at a low output power level of the excitation light source 320 or a short exposure period of the fluorescence image sensor 310, it may be determined that the bacterial load is heavy. In contrast, if a particular area of bacterial contamination corresponds to an image area where the best fluorescent image was taken at a high output power level of the excitation light source 320 or a long exposure period of the fluorescence image sensor 310, it may be determined that the bacterial load is occasional or light.
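By way of illustration only, such an estimate might reduce to a mapping from the index of the best source image (ordered from the lowest to the highest drive level) to a load category; the category boundaries below are assumptions made for the sketch, not values taken from the disclosure.

```python
def estimate_load(best_index, n_levels):
    """Map the best source image index (0 = lowest power/shortest exposure) to a load category."""
    if n_levels < 2 or not 0 <= best_index < n_levels:
        raise ValueError("expected n_levels >= 2 and 0 <= best_index < n_levels")
    frac = best_index / (n_levels - 1)  # 0.0 = detected at the lowest drive level
    if frac <= 0.25:
        return "heavy"                  # fluoresced even under weak excitation
    if frac <= 0.5:
        return "moderate"
    if frac <= 0.75:
        return "light"
    return "occasional"
```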
Depending on the particular implementation (i.e., depending on whether the stitching and/or overlay processes are performed), any one or more selected from the mosaicked image, the stitched image, the overlay image, or combinations thereof may be output and/or stored as a “composite image,” for example on the display device 340 and/or in the memory 350. The composite image may also include indicators, such as labels, contours, arrows, or other interface elements which operate to identify an area of the target surface which meets a predetermined condition, such as an area corresponding to the concentration of various components in the target area. These interface elements may be directly overlaid on the composite image and/or may be included in a border area. The composite image may also include metadata identifying the source fluorescence image for each of the plurality of tiles.
At operation 810, the user positions the wound monitoring device relative to a target surface, such as a wound or other portion of the skin surface of a patient. The positioning may be accomplished with the assistance of a range finder included in the wound monitoring device; for example, a GUI on a display of the wound monitoring device may be configured to provide a visual indication to the user regarding whether the wound monitoring device is positioned at an appropriate distance from the target surface for imaging.
Once the wound monitoring device is in position (e.g., held appropriately by the user), the user may initiate operation 820 to sequentially capture a plurality of images. Operation 820 includes a series of sub-operations, examples of which are illustrated in
In response to the imaging command, at operation 920 or 1020 the wound monitoring device initializes an index variable i to 1. In the case of exposure period modulation, operation 930 is performed. At operation 930, a timing controller of the fluorescence image sensor controls an image sensor array of the fluorescence image sensor, such that the fluorescence image sensor is configured to capture images with an exposure period ti. In the case of LED modulation, operations 1030 and 1040 are performed. At operation 1030, a light source driver (e.g., an LED driver) of the wound monitoring device sets its output power level to the corresponding output power level Pi. At operation 1040, the light source driver drives an excitation light source (e.g., an LED) of the wound monitoring device to produce excitation light at an intensity corresponding to the output power level Pi. In either case, synchronously with the emission by the excitation light source, at operation 940 or 1050 the fluorescence image sensor of the wound monitoring device captures a fluorescence image.
At operation 950 or 1060, it is determined whether the fluorescence imaging procedure has completed, for example by determining if the index i=N. If the fluorescence imaging procedure is not completed, at operation 960 or 1070 the index i is incremented and the method returns to operation 930 or 1030. Operations 930-940 or 1030-1050 are repeated for the exposure period ti or output power level Pi corresponding to the new value of the index i. If at operation 950 or 1060 it is determined that the fluorescence imaging procedure is complete, the fluorescence imaging subroutine exits at operation 970 or 1080. In the case of LED modulation, at operation 1080 the driving of the excitation light source stops such that the output power level returns to zero.
To perform the repeated iterations of operations 1030, 1040, and 1050 (i.e., in the case of LED modulation), the light source driver may provide a drive signal in the form of a monotonically-increasing stepped waveform, such that each successive fluorescence image corresponds to a higher output power level. In other implementations, the drive signal may be a monotonically-decreasing stepped waveform beginning with the maximum output power level PN for the first fluorescence image and ending with the lowest output power level P1 for the Nth fluorescence image. The stepped waveform need not, however, be monotonic, and the output power levels may be presented in any order desired. In still other implementations, the drive signal may have a pulsed waveform in which the excitation light source is driven to the first output power level P1 for capture of the first fluorescence image, driving is subsequently stopped such that the drive signal returns to zero, then the excitation light source is driven to the second output power level P2 for capture of the second fluorescence image, driving is subsequently stopped such that the drive signal returns to zero, and so on. Of course, if the drive signal has a pulsed waveform, the pulses may be arranged in order of increasing amplitude from P1 to PN, in order of decreasing amplitude from PN to P1, or in any other order. In yet other implementations, the drive signal may have a ramp waveform which steadily increases or decreases. In such implementations, the output power level would not be constant throughout the entire exposure period for image capture, and thus the output power level may refer to the average power level for the given image capture period.
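The stepped and pulsed drive waveforms described above may be sketched as sampled sequences, for example as follows; the sampling granularity and amplitude units are arbitrary, and a ramp waveform could be produced analogously with `numpy.linspace`.

```python
import numpy as np

def stepped_waveform(levels, samples_per_step):
    """Stepped waveform: hold each output power level for one capture period."""
    return np.repeat(np.asarray(levels, dtype=float), samples_per_step)

def pulsed_waveform(levels, samples_per_step):
    """Pulsed waveform: return the drive signal to zero between capture periods."""
    segments = []
    for level in levels:
        segments.append(np.full(samples_per_step, float(level)))  # on period
        segments.append(np.zeros(samples_per_step))               # off period between pulses
    return np.concatenate(segments)

# e.g., stepped_waveform([0.44, 2.3, 4.2, 6.2], 100) for the four-level example above
```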
After operation 820 has been completed (e.g., after all operations of
Once all fluorescence images and the white light image have been captured, at operation 840 a composite image is generated. Operation 840 includes a series of sub-operations, an example of which is illustrated in
At operation 1110, the fluorescence images are co-registered. Operation 1110 may include various operations to determine the spatial relationship between the pixels of one fluorescence image and the pixels of another fluorescence image. For example, the wound monitoring device may be configured to determine that the upper-left pixel (0,0) of the first fluorescence image corresponds to a different pixel (x,y) of the second fluorescence image. The pixel correspondence may be determined for multiple pixels in the first and second fluorescence images, such that the wound monitoring device may be able to determine whether and to what degree the wound monitoring device was moved (i.e., translated, rotated about the imaging axis, rotated about other perpendicular axes, and so on) between the capture of the first fluorescence image and the capture of the second fluorescence image. The co-registration operations may be performed for each fluorescence image. Afterward, the wound monitoring device may be configured to identify a rectangular area of the target surface that is present in all fluorescence images. This rectangular area may be identified as an “image area” for later processing. In some implementations, the process of co-registering images includes the white light image and/or a thermal image in addition to the fluorescence images.
At operation 1120, the image area is divided into M tiles, where M is a positive integer greater than 1. The tiles may be regularly sized and shaped (e.g., a plurality of rectangles of equal size) or irregularly sized and shaped (e.g., polygonal tessellation across the image area). If the tiles are irregularly sized and shaped, the tiles may be larger in areas that are comparatively featureless in the fluorescence images (e.g., portions of the image area where there is little signal and/or where the signal changes little) and smaller in areas that include features (e.g., small areas of brightness or areas where the signal changes quickly from pixel to pixel). In some implementations, the white light image may be used to determine which portions of the image area should have larger tiles and which portions of the image area should have smaller tiles. In other implementations, one or more fluorescence images may additionally or alternatively be used. Regardless of whether the tiles are regular or irregular, the tiling pattern is the same for the image area of each fluorescence image.
For each tile, it is determined which fluorescence image provides the best representation of the target area and/or conveys the best information on the status of the target area. This is represented by a series of operations 1130 to 1160 which are performed for each tile j, where j is an index variable running from 1 to M. At operation 1130 the wound monitoring device initializes the index variable j to 1. Then, at operation 1140, the wound monitoring device may determine which fluorescence image should be selected for tile j. This may include an operation of determining the fluorescence signal present in each fluorescence image. The fluorescence signal may be analyzed in terms of either the absolute signal or the SNR of the fluorescence image. Using the SNR instead of the absolute signal may permit the method to account for instances where the signal is saturated. The fluorescence image having the best value of the fluorescence signal (e.g., the highest SNR) may be flagged or otherwise identified as the best image for the given tile.
At operation 1150, it is determined whether a fluorescence image has been selected for each tile, for example by determining if the index j=M. If there remain any tiles which have not yet been processed, at operation 1160 the index j is incremented and the method returns to operation 1140. Operations 1140 and 1150 are then repeated for the tile represented by the new value of the index j. If at operation 1150 it is determined that all tiles have been processed, at operation 1170 a mosaic image is generated.
Operation 1170 may include various operations to generate an output image consisting of components selected from multiple different fluorescence images. For example, operation 1170 may include generating a blank image having a dimension corresponding to the image area and having a tiling pattern corresponding to the tiling pattern used for the fluorescence images. For each tile, the method may be configured to use the image data corresponding to the fluorescence image identified as the best image. The output of this process may be referred to as a “mosaicked image.”
In some instances, the image data selected for one tile and the image data selected for an adjacent tile may originate from different source fluorescence images. In such instances, discontinuities may exist at the border between the neighboring tiles where one or more parameters (e.g., one or more of the Hue, Saturation, and Value channels in the HSV color space) abruptly jump from one pixel to the next. Therefore, the mosaicked image may be processed according to operation 1180 to stitch the adjacent tiles together. Operation 1180 may include various operations to smooth or otherwise remove discontinuities present in the mosaicked image. To remove such discontinuities, the image data may be modified such that the discontinuous parameters for pixels in the vicinity of the border (e.g., within ten pixels, within fifty pixels, and so on) smoothly transition from one end of the vicinity to the other. The output of this process may be referred to as a “stitched image.”
Where desired, the mosaicked image and/or the stitched image may be further processed to provide for automated image interpretation and/or the generation and overlay of various indicators at operation 1190. Operation 1190 may include various operations to automatically analyze a processed image (e.g., the mosaicked image and/or the stitched image) to determine the presence and/or concentration of one or more components of the target surface. For example, the method may include performing a color analysis and/or a textural analysis of the processed image. In one particular example, the color analysis includes converting the processed image to an HSV color space (if it is not already in said color space) to generate a first converted image; comparing a saturation parameter of the first converted image to a first saturation threshold condition, wherein the first saturation threshold condition is based on a correlation between a first color and a first bacterial fluorescence signature; comparing a value parameter of the first converted image to a first value threshold condition, wherein the first value threshold condition is based on the correlation between the first color and the first bacterial fluorescence signature; flagging areas of the processed image where the saturation parameter satisfies the first saturation threshold condition and the value parameter satisfies the first value threshold condition; unflagging the flagged areas which are smaller than a size threshold; and outlining the remaining flagged areas on the processed image, thereby generating an overlay image. The color analysis may be repeated for multiple colors, each having its own saturation threshold conditions and/or value threshold conditions based on the correlations between the individual color and different bacterial fluorescence signatures. In other examples of the present disclosure, a color space other than HSV may be used in the color analysis, including but not limited to HSL, CIEXYZ, CIELUV, L*a*b*, RGB, YCbCr, YUV, LCh, CMY, CMYK, and custom color spaces. Where color spaces other than HSV are used, the threshold conditions (e.g., the hue thresholds, the saturation thresholds, and/or the value thresholds) may be converted to the other color spaces directly. Where the color space used in the color analysis is the same as the color space in which the input image has been captured (e.g., both RGB), then the conversion operation may be omitted.
In a particular example, the textural analysis includes converting the processed image to grayscale to generate a second converted image; comparing, on a pixel-by-pixel basis, an intensity parameter of the second converted image to an intensity threshold; temporarily flagging regions within a predetermined pixel distance of pixels in which the intensity parameter exceeds the intensity threshold; converting the processed image to an L*a*b* color space (if it is not already in said color space), thereby generating a third converted image; within each channel of the third converted image, determining a respective gradient of the temporarily flagged regions; and permanently flagging those ones of the temporarily flagged regions in which the respective gradient for each channel of the third converted image exceeds a gradient threshold. In other examples of the present disclosure, a color space other than L*a*b* may be used in the textural analysis, including but not limited to HSV, CIEXYZ, CIELUV, RGB, YCbCr, YUV, LCh, CMY, CMYK, and custom color spaces. Where color spaces other than L*a*b* are used, the thresholds (e.g., the intensity thresholds and/or the gradient thresholds) may be selected and/or generated anew based on the characteristics of the other color spaces. Where the color space used in the textural analysis is the same as the color space in which the input image has been captured (e.g., both RGB), then the conversion operation may be omitted. Where the color space used in the textural analysis is the same as the color space used in the color analysis, then one of the conversion operations may be omitted and/or the two conversion operations may be combined (depending on the order in which the color analysis and the textural analysis occur and/or whether the analyses are conducted in series or parallel). An “overlay image” may refer to a combination of an overlay with the processed image itself, in which the overlay includes contours drawn as a result of the color and/or textural analysis. Alternatively, an “overlay image” may refer to the overlay itself as an element that is shown, stored, etc. separate from the processed image itself, such that no modifications are made to the processed image.
The textural analysis may be used to differentiate between bacterial sources of apparent fluorescence and non-bacterial sources of apparent fluorescence (e.g., bone, tendon, tattoo ink, and so on). The individual bacterial signature may correspond to a bacterial concentration depending on the output power level at which the corresponding fluorescence image was captured (e.g., 10² cfu/g or higher if the bacteria was detected only at high output power levels, 10⁷ cfu/g or higher if the bacteria was detected even at low output power levels, 10⁴ cfu/g or higher if the bacteria was first detected at intermediate output power levels) and may indicate the presence of any one of the following: gram-negative aerobic species, including Pseudomonas aeruginosa, Escherichia coli, Proteus mirabilis, Proteus vulgaris, Enterobacter cloacae, Serratia marcescens, Acinetobacter baumannii, Klebsiella pneumoniae, Klebsiella oxytoca, Morganella morganii, Stenotrophomonas maltophilia, Citrobacter koseri, Citrobacter freundii, Aeromonas hydrophila, Alcaligenes faecalis, and Pseudomonas putida; gram-positive aerobic species, including Staphylococcus aureus, Staphylococcus epidermidis, Staphylococcus lugdunensis, Staphylococcus capitis, Corynebacterium striatum, Bacillus cereus, and Listeria monocytogenes; and/or anaerobic species, including Bacteroides fragilis, Clostridium perfringens, Peptostreptococcus anaerobius, Propionibacterium acnes, and Veillonella parvula. This list is intended to be exemplary only and does not encompass all species to which the identification and differentiation operations may be applied.
Operation 890 may include the generation of a qualitative or semi-quantitative estimate of the concentration. For example, the estimate may categorize the bacterial load for one or more identified bacterial species as none, occasional, light, moderate, or heavy. The estimate may be based on which fluorescence image was earlier identified as the best image for areas determined to correspond to the bacterial species. For example, if a particular area of bacterial contamination corresponds to an image area where the best fluorescence image was taken at a low output power level of the excitation light source, it may be determined that the bacterial load is heavy. In contrast, if a particular area of bacterial contamination corresponds to an image area where the best fluorescence image was taken at a high output power level of the excitation light source, it may be determined that the bacterial load is occasional or light.
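A minimal sketch of this mapping follows; the category boundaries are illustrative only and are not taken from the disclosure.

```python
def estimate_bacterial_load(best_image_power_level: str) -> str:
    """Map the excitation power level of the 'best' image for an area
    to a semi-quantitative load category (illustrative mapping only)."""
    mapping = {
        "low": "heavy",               # fluoresced even under weak excitation
        "intermediate": "moderate",   # first detected at intermediate power
        "high": "occasional/light",   # required strong excitation to appear
    }
    return mapping.get(best_image_power_level, "none")
```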
Depending on the particular implementation (i.e., depending on whether the stitching and/or overlay processes are performed), any one or more selected from the mosaicked image, the stitched image, the overlay image, or combinations thereof may be output and/or stored as a “composite image,” for example on the display of the wound monitoring device and/or in a memory of the wound monitoring device. The composite image may also include indicators, such as labels, contours, arrows, or other interface elements which operate to identify an area of the target surface which meets a predetermined condition, such as an area corresponding to the concentration of various components in the target area. These interface elements may be directly overlaid on the composite image and/or may be included in a border area. The composite image may also include metadata identifying the source fluorescence image for each of the plurality of tiles.
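One possible, purely illustrative realization of the per-tile selection, combination, and metadata recording described above is sketched below. The grid geometry and the mean-intensity score are hypothetical stand-ins for the per-signature selection described earlier in the disclosure.

```python
import numpy as np

def build_composite(images, grid=(4, 4)):
    # `images` is assumed to be a list of co-registered, single-channel
    # fluorescence frames of identical shape, one per output power level.
    h, w = images[0].shape[:2]
    rows, cols = grid
    composite = np.zeros_like(images[0])
    metadata = []
    for r in range(rows):
        for c in range(cols):
            ys = slice(r * h // rows, (r + 1) * h // rows)
            xs = slice(c * w // cols, (c + 1) * w // cols)
            # Stand-in score: mean intensity within the tile; the
            # disclosure's per-signature scoring would go here instead.
            scores = [frame[ys, xs].mean() for frame in images]
            best = int(np.argmax(scores))
            composite[ys, xs] = images[best][ys, xs]
            metadata.append({"tile": (r, c), "source_image_index": best})
    return composite, metadata
```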
The systems, devices, and methods described above thus may provide for fluorescence imaging capable of detecting a wider range of parameters for potential areas of interest (e.g., a wider range of bacterial concentrations) and/or may provide information to accurately, reliably, and/or easily identify potential areas of interest (e.g., bacterial infections) in fluorescence images, thereby improving treatment and reducing errors.
While the above description is presented with regard to fluorescence imaging at a single wavelength of excitation light, in practice the above operations and processes may be performed successively for a plurality of different wavelengths or wavelength bands of excitation light. For example, the above operations and processes may be performed once at a particular wavelength in order to identify and/or classify a first bacterial species, again at a different wavelength in order to identify and/or classify a second bacterial species, and so on.
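A simple control loop captures this idea; the wavelength list and the two callables below are hypothetical stand-ins for the hardware control and the capture-and-analysis operations described above.

```python
from typing import Callable, Dict, Iterable

def multi_wavelength_survey(
    set_wavelength: Callable[[float], None],
    capture_and_analyze: Callable[[], dict],
    bands_nm: Iterable[float] = (405.0,),
) -> Dict[float, dict]:
    # Repeat the full capture/analysis pipeline once per excitation band,
    # e.g., to identify or classify a different bacterial species per band.
    results = {}
    for wavelength in bands_nm:
        set_wavelength(wavelength)
        results[wavelength] = capture_and_analyze()
    return results
```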
The exemplary systems, devices, and methods described herein may be performed under the control of a processing system executing computer-readable codes embodied on a non-transitory computer-readable recording medium or communication signals transmitted through a transitory medium. The computer-readable recording medium may be any data storage device that can store data readable by a processing system, and may include both volatile and nonvolatile media, removable and non-removable media, and media readable by a database, a computer, and various other network devices.
Examples of the computer-readable recording medium include, but are not limited to, read-only memory (ROM), random-access memory (RAM), electrically erasable programmable ROM (EEPROM), flash memory or other memory technology, holographic media or other optical disc storage, magnetic storage including magnetic tape and magnetic disk, and solid state storage devices. The computer-readable recording medium may also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. The communication signals transmitted through a transitory medium may include, for example, modulated signals transmitted through wired or wireless transmission paths.
Illustrative examples of systems, methods, and devices described herein are provided below. An embodiment of a system, method, and/or device described herein may include any one or more, and any combination of, the clauses described below:
Clause 1. A portable, hand-held device, comprising: an illumination device including at least one excitation light source and a driver configured to drive the at least one excitation light source to sequentially produce a plurality of output intensities; an imaging device configured to capture a plurality of fluorescence images of a target surface, each one of the plurality of fluorescence images respectively corresponding to one of the plurality of output intensities; a memory; and a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, and combine the selected image portions to generate a composite image.
Clause 2. The device according to clause 1, wherein the driver is configured to modulate the at least one excitation light source by controlling one or more of a current through the at least one excitation light source, a voltage across the excitation light source, or a duty cycle of a pulse-width modulation signal provided to the excitation light source.
Clause 3. The device according to clause 1 or clause 2, wherein the driver is configured to drive the at least one excitation light source according to a monotonically increasing or monotonically decreasing output waveform having a plurality of steps, wherein an amplitude of each one of the plurality of steps corresponds to a respective one of the plurality of output intensities.
Clause 4. The device according to clause 1 or clause 2, wherein the driver is configured to drive the at least one excitation light source according to an output waveform having a plurality of pulses, wherein an amplitude of each of the pulses corresponds to a respective one of the plurality of output intensities.
Clause 5. The device according to any one of clauses 1 to 4, wherein the at least one excitation light source is configured to output excitation light having a wavelength of 405 nm±20 nm.
Clause 6. The device according to any one of clauses 1 to 5, wherein the plurality of output intensities is between four and six output intensities, inclusive.
Clause 7. The device according to any one of clauses 1 to 6, wherein the processor is further configured to quantitatively estimate a concentration of a component of the target surface.
Clause 8. The device according to any one of clauses 1 to 7, wherein the processor is configured to select the image portion for a given one of the plurality of sections by: for each of the plurality of fluorescence images, determining a respective fluorescence signal corresponding to a signature of a component of the target surface; and selecting, as the selected image portion, the corresponding one of the plurality of fluorescence images for which the respective fluorescence signal is highest.
Clause 9. The device according to clause 7 or clause 8, wherein the component of the target surface is a bacterial species.
Clause 10. The device according to clause 9, wherein the bacterial species is at least one of Pseudomonas aeruginosa, Escherichia coli, Proteus mirabilis, Proteus vulgaris, Enterobacter cloacae, Serratia marcescens, Acinetobacter baumannii, Klebsiella pneumoniae, Klebsiella oxytoca, Morganella morganii, Stenotrophomonas maltophilia, Citrobacter koseri, Citrobacter freundii, Aeromonas hydrophila, Alcaligenes faecalis, Pseudomonas putida, Staphylococcus aureus, Staphylococcus epidermidis, Staphylococcus lugdunensis, Staphylococcus capitis, Corynebacterium striatum, Bacillus cereus, Listeria monocytogenes, Bacteroides fragilis, Clostridium perfringens, Peptostreptococcus anaerobius, Propionibacterium acnes, and/or Veillonella parvula.
Clause 11. The device according to any one of clauses 1 to 10, wherein the processor is further configured to generate at least one interface element on the composite image.
Clause 12. The device according to clause 11, wherein the at least one interface element is configured to identify an area of the target surface which meets a predetermined condition.
Clause 13. The device according to clause 12, wherein the predetermined condition is that a bacterial concentration in the area exceeds a predetermined threshold.
Clause 14. The device according to any one of clauses 11 to 13, wherein the at least one interface element includes an overlay to highlight a portion of the composite image.
Clause 15. The device according to any one of clauses 1 to 14, further comprising a display device configured to display the composite image.
Clause 16. The device according to any one of clauses 1 to 15, wherein the memory is configured to store the composite image.
Clause 17. The device according to any one of clauses 1 to 16, wherein the imaging device includes a first image sensor configured to detect wavelengths between 470 nm and 520 nm, inclusive.
Clause 18. The device according to any one of clauses 1 to 17, wherein the imaging device includes a second image sensor configured to detect wavelengths between 600 nm and 660 nm, inclusive.
Clause 19. The device according to any one of clauses 1 to 18, wherein the imaging device is configured to capture the plurality of fluorescence images at a rate of approximately two per second.
Clause 20. The device according to any one of clauses 1 to 19, wherein the composite image includes metadata identifying the selected one of the plurality of fluorescence images for each of the plurality of sections.
Clause 21. The device according to any one of clauses 1 to 20, wherein combining the selected image portions includes smoothing discontinuities between adjacent ones of the plurality of sections.
Clause 22. The device according to any one of clauses 1 to 21, wherein the processor includes a trained machine learning algorithm configured to perform the operation of selecting the image portion and/or the operation of combining the selected image portions.
Clause 23. The device according to any one of clauses 1 to 22, wherein the imaging device is further configured to capture a white light image of the target surface in the absence of an output from the at least one excitation light source.
Clause 24. A system, comprising: a display device; an illumination device including an excitation light source and a driver configured to drive the excitation light source to sequentially produce a plurality of output intensities; an imaging device configured to capture a plurality of fluorescence images of a target surface respectively corresponding to the plurality of output intensities; a housing; and circuitry disposed within the housing, the circuitry including a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, combine the selected image portions to generate a composite image, and output the composite image to the display device.
Clause 25. A fluorescence imaging method, comprising: capturing a plurality of fluorescence images of a target surface, including sequentially for a plurality of different values of a drive parameter: driving an excitation light source to produce excitation light at an output intensity corresponding to the drive parameter, and capturing a respective fluorescence image of the target surface, wherein the fluorescence image includes an emission response of the target surface to the excitation light; co-registering the plurality of fluorescence images; dividing an image area into a plurality of sections; for each of the plurality of sections, selecting an image portion from one of the plurality of fluorescence images; and combining the selected image portions to generate a composite image.
Clause 26. A portable, hand-held device, comprising: an illumination device including at least one excitation light source configured to produce light at an output intensity; an imaging device including an image sensor array and a timing controller configured to drive the image sensor array to sequentially capture a plurality of fluorescence images of a target surface respectively corresponding to a plurality of exposure periods; a memory; and a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, and combine the selected image portions to generate a composite image.
Clause 27. The device according to clause 26, wherein the at least one excitation light source is configured to output excitation light having a wavelength of 405 nm±20 nm.
Clause 28. The device according to clause 26 or clause 27, wherein the plurality of exposure periods is between four and six exposure periods, inclusive.
Clause 29. The device according to any one of clauses 26 to 28, wherein the processor is further configured to quantitatively estimate a concentration of a component of the target surface.
Clause 30. The device according to any one of clauses 26 to 29, wherein the processor is configured to select the image portion for a given one of the plurality of sections by: for each of the plurality of fluorescence images, determining a respective fluorescence signal corresponding to a signature of a component of the target surface; and selecting, as the selected image portion, the corresponding one of the plurality of fluorescence images for which the respective fluorescence signal is highest.
Clause 31. The device according to clause 29 or clause 30, wherein the component is a bacterial species.
Clause 32. The device according to clause 31, wherein the bacterial species is at least one of Pseudomonas aeruginosa, Escherichia coli, Proteus mirabilis, Proteus vulgaris, Enterobacter cloacae, Serratia marcescens, Acinetobacter baumannii, Klebsiella pneumoniae, Klebsiella oxytoca, Morganella morganii, Stenotrophomonas maltophilia, Citrobacter koseri, Citrobacter freundii, Aeromonas hydrophila, Alcaligenes faecalis, Pseudomonas putida, Staphylococcus aureus, Staphylococcus epidermidis, Staphylococcus lugdunensis, Staphylococcus capitis, Corynebacterium striatum, Bacillus cereus, Listeria monocytogenes, Bacteroides fragilis, Clostridium perfringens, Peptostreptococcus anaerobius, Propionibacterium acnes, and/or Veillonella parvula.
Clause 33. The device according to any one of clauses 26 to 32, wherein the processor is further configured to generate at least one interface element on the composite image.
Clause 34. The device according to clause 33, wherein the at least one interface element is configured to identify an area of the target surface which meets a predetermined condition.
Clause 35. The device according to clause 34, wherein the predetermined condition is that a bacterial concentration in the area exceeds a predetermined threshold.
Clause 36. The device according to any one of clauses 26 to 35, further comprising a display device configured to display the composite image.
Clause 37. The device according to any one of clauses 26 to 36, wherein the memory is configured to store the composite image.
Clause 38. The device according to any one of clauses 26 to 37, wherein the imaging device includes a first image sensor configured to detect wavelengths between 470 nm and 520 nm, inclusive.
Clause 39. The device according to any one of clauses 26 to 38, wherein the imaging device includes a second image sensor configured to detect wavelengths between 600 nm and 660 nm, inclusive.
Clause 40. The device according to any one of clauses 26 to 39, wherein the imaging device is configured to capture the plurality of fluorescence images at a rate of approximately two per second.
Clause 41. The device according to any one of clauses 26 to 40, wherein the composite image includes metadata identifying the selected one of the plurality of fluorescence images for each of the plurality of sections.
Clause 42. The device according to any one of clauses 26 to 41, wherein combining the selected image portions includes smoothing discontinuities between adjacent ones of the plurality of sections.
Clause 43. The device according to any one of clauses 26 to 42, wherein the processor includes a trained machine learning algorithm configured to perform the operation of selecting the image portion and/or the operation of combining the selected image portions.
Clause 44. The device according to any one of clauses 26 to 43, wherein the imaging device is further configured to capture a white light image of the target surface in the absence of an output from the at least one excitation light source.
Clause 45. A system, comprising: a display device; an illumination device including an excitation light source configured to produce light at an output intensity; an imaging device including an image sensor array and a timing controller configured to drive the image sensor array to sequentially capture a plurality of fluorescence images of a target surface respectively corresponding to a plurality of exposure periods; a housing; and circuitry disposed within the housing, the circuitry including a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, combine the selected image portions to generate a composite image, and output the composite image to the display device.
Clause 46. A fluorescence imaging method, comprising: capturing a plurality of fluorescence images of a target surface, including sequentially for a plurality of different values of an image capture parameter: driving an excitation light source to produce excitation light at an output intensity, setting an exposure period of an image sensor to a value corresponding to the image capture parameter, and capturing a respective fluorescence image of the target surface, wherein the fluorescence image includes an emission response of the target surface to the excitation light; co-registering the plurality of fluorescence images; dividing an image area into a plurality of sections; for each of the plurality of sections, selecting an image portion from one of the plurality of fluorescence images; and combining the selected image portions to generate a composite image.
Clause 47. A portable, hand-held device, comprising: an illumination device including at least one excitation light source and a driver configured to drive the at least one excitation light source to sequentially produce at least one output intensity; an imaging device including an image sensor array and a timing controller configured to drive the image sensor array to sequentially capture a plurality of fluorescence images of a target surface respectively corresponding to at least one exposure period; a memory; and a processor configured to: co-register the plurality of fluorescence images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of fluorescence images, and combine the selected image portions to generate a composite image.
Clause 48. An imaging method, comprising: capturing a plurality of images of a target surface, including sequentially for a plurality of different values of a drive parameter: driving a light source to produce light at an output intensity corresponding to the drive parameter, and capturing a respective image of the target surface, wherein the image includes an optical response of the target surface to the light; co-registering the plurality of images; dividing an image area into a plurality of sections; for each of the plurality of sections, selecting an image portion from one of the plurality of images; and combining the selected image portions to generate a composite image.
Clause 49. The method according to clause 48, wherein respective ones of the plurality of images are fluorescence images, white-light images, or thermal images.
Clause 50. A portable, hand-held device, comprising: an illumination device including at least one light source configured to produce light at an output intensity; an imaging device including an image sensor array and a timing controller configured to drive the image sensor array to sequentially capture a plurality of images of a target surface respectively corresponding to a plurality of exposure periods; a memory; and a processor configured to: co-register the plurality of images, divide an image area into a plurality of sections, for each of the plurality of sections, select an image portion from one of the plurality of images, and combine the selected image portions to generate a composite image.
The above description and associated figures teach the best mode of the disclosed devices, systems, and methods, and are intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent to those skilled in the art upon reading the above description. The scope should be determined, not with reference to the above description, but instead with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into future embodiments. In sum, it should be understood that the application is capable of modification and variation.
All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, the use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
The Abstract is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
The present Application claims priority to U.S. Provisional Application No. 63/482,892, filed in the United States Patent and Trademark Office on Feb. 2, 2023, the entire contents of which are incorporated by reference herein.