Wounds that are not healing may indicate severe underlying disease patterns. Impediments to wound healing include wound size, ischemia, and infection. Infected wounds are typically examined for staging and treatment selection via summarizing indices such as the Wound, Ischemia, and foot Infection (WIfI) classification system. Bacterial colonization deprives the chronic wound of oxygen and nutrients and can prolong the inflammatory phase, i.e., prevent the wound from healing even when perfusion deficits have been treated. Therefore, frequent perfusion/oxygenation monitoring needs to be accompanied by measures for monitoring the infection status over time to pinpoint root causes of the chronic wound, to select care options, and to respond to changes with relevant treatment. A positive prognosis for a patient requires targeted treatment along with monitoring of disease development and treatment impact over time.
Current methods for quantifying infection status require biopsies, organism culturing, or bulky hardware, or suffer from confounding factors outside an in-vitro context. Current methods are generally inapplicable to a monitoring use case involving frequent measurements, as a result is typically not available for several days after biopsies and organism culturing. Alternative optical approaches either are invasive and require the injection of infrared (IR) dyes or fluorescent agents, or leverage vibrational spectroscopic imaging of substrates colonized by sampled bacteria. Vibrational imaging attempts to determine the exact type and number of bacteria by comparing the spectral signature of the bacteria to a large reference database of pathogens for molecular identification. While in-vitro spectroscopy still takes several hours, corresponding in-vivo attempts usually suffer from a weak signal and overlapping confounding factors. Furthermore, vibrational spectroscopic imaging systems are typically coupled to fiber-optical probes or confocal microscopes, which makes the imaging approach very local, bulky, and therefore impractical for bedside monitoring of larger wound surfaces.
As a result of the background described above, there is an open need for rapid and contactless/remote monitoring of infection status.
According to an aspect of the present disclosure, a device includes a light source, a camera, a memory and a processor. The light source emits emission light to a wound at a plurality of wavelengths. The camera detects backscattered light generated based on the emission light. The memory stores instructions. The processor executes the instructions. When executed by the processor, the instructions cause the device to record an image based on the backscattered light; quantify a degree of infection of the wound based on the image; and output an indication of the degree of infection.
According to another aspect of the present disclosure, a method for quantifying a degree of an infection includes emitting, by a light source of a device, emission light to a wound at a plurality of wavelengths; detecting, by a camera of the device, backscattered light generated based on the emission light; recording an image based on the backscattered light; quantifying a degree of infection of the wound based on the image; and outputting, from the device, an indication of whether the wound is infected.
According to another aspect of the present disclosure, a system includes a light source, a camera, a memory and a processor. The light source emits emission light to a wound at a plurality of wavelengths. The camera detects backscattered light generated based on the emission light. The memory stores instructions. The processor executes the instructions. When executed by the processor, the instructions cause the system to record an image based on the backscattered light; quantify a degree of infection of the wound based on the image; and output an indication of the degree of infection.
The example embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
In the following detailed description, for the purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments. Nonetheless, systems, devices, materials and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. The defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept.
The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. As used in the specification and appended claims, the singular forms of terms ‘a’, ‘an’ and ‘the’ are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms “comprises”, and/or “comprising,” and/or similar terms when used in this specification, specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Unless otherwise noted, when an element or component is said to be “connected to”, “coupled to”, or “adjacent to” another element or component, it will be understood that the element or component can be directly connected or coupled to the other element or component, or intervening elements or components may be present. That is, these and similar terms encompass cases where one or more intermediate elements or components may be employed to connect two elements or components. However, when an element or component is said to be “directly connected” to another element or component, this encompasses only cases where the two elements or components are connected to each other without any intermediate or intervening elements or components.
The present disclosure, through one or more of its various aspects, embodiments and/or specific features or sub-components, is thus intended to bring out one or more of the advantages as specifically noted below. For purposes of explanation and not limitation, example embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. However, other embodiments consistent with the present disclosure that depart from specific details disclosed herein remain within the scope of the appended claims. Moreover, descriptions of well-known apparatuses and methods may be omitted so as to not obscure the description of the example embodiments. Such methods and apparatuses are within the scope of the present disclosure.
As described herein, contactless and rapid monitoring of wound infection status may be provided using multispectral optical imaging or hyperspectral optical imaging. Optical backscatter from the multispectral optical imaging or hyperspectral optical imaging is influenced by the presence of bacteria in the wound, such that the optical backscatter across a spectrum of different wavelengths may be exploited to determine status of the wound. The optical backscatter may be recorded via a camera and then analyzed by a controller to determine whether a wound is infected.
As described herein, wounds can be monitored for infections using a single contactless device or system. The device or system is configured for quick use, such that the device or system may be used on an ad-hoc basis or on a persistent basis.
The system in
The camera 170 may be capable of recording a multi-spectral backscattered image. The camera 170 is configured to spatially resolve and spectrally resolve backscatter. The camera 170 is configured to determine, from the recorded backscattered image, which light intensity is recorded at which wavelength. The multivariate information per pixel in the camera image provides backscattered light intensity information in different spectral ranges/wavelengths. The camera 170 is configured to provide spectral per-pixel resolution either by sensor design or by time multiplexing, in which spectral bins are sequentially filtered out of the incoming light by rapidly changing bandpass filters. The camera 170 may also output one image or a sequence of images.
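The time-multiplexed acquisition described above can be sketched as follows. This is an illustrative helper, not part of the disclosure; `assemble_spectral_cube` and its arguments are hypothetical names, assuming one camera frame is recorded per bandpass filter setting:

```python
import numpy as np

def assemble_spectral_cube(frames, wavelengths_nm):
    """Stack sequentially filtered camera frames into a (H, W, bands) cube.

    frames:         list of 2-D intensity arrays, one per bandpass setting.
    wavelengths_nm: center wavelength of each filter, same order as frames.
    Returns the cube with bands sorted by wavelength, plus the sorted
    wavelengths, so cube[y, x] is the per-pixel backscatter spectrum.
    """
    cube = np.stack(frames, axis=-1).astype(float)
    order = np.argsort(wavelengths_nm)          # sort bands by wavelength
    return cube[..., order], np.asarray(wavelengths_nm)[order]
```

With this layout, each pixel carries the multivariate spectral information described in the text as a simple vector along the last axis.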
The system in
The light source 180 may be a multi-wavelength light source or a hyper-wavelength light source. A multi-wavelength light source may be configured to provide light that can be detected by a multi-spectral camera. A hyper-wavelength light source may be configured to provide light that can be detected by a hyper-spectral camera. A multi-wavelength light source may emit light at fewer wavelengths (e.g., a smaller band or a smaller set) than a hyper-spectral wavelength light source. In embodiments based on
The backscattered reflective light in the filtered spectrum is input to a multivariate regression model which outputs an infection status. The multivariate regression model may be implemented via instructions stored in a memory and executed by a processor, such as by a controller described herein. The multivariate regression model may be obtained from empirical evidence. The ground truth for the regression model may be derived from alternative measurement means. An output of the infection status may be, for example, a binary output such as green/red or infected/not infected, or a granular output such as organisms per gram of tissue. All types of infected or non-infected wounds will result in backscattered light. The backscattered light detected by the cameras described herein is evaluated based on changes in the spectrum that indicate the presence of scatterers or absorbers which have the optical properties of bacteria. The backscattered spectrum may also be measured and evaluated relative to a non-wound reference location. Examples of bacteria in wound exudate which result in the backscattered reflective light detected by cameras described herein are Pseudomonas aeruginosa and Staphylococcus aureus, with widths of approximately 0.1-1 μm and lengths of approximately 1-5 μm.
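As an illustrative sketch of the multivariate-regression idea (not the empirically trained model of the disclosure), a simple linear least-squares fit from per-sample backscatter spectra to an infection measure could look like the following; all function names and the decision threshold are assumptions:

```python
import numpy as np

def fit_infection_regressor(spectra, labels):
    """Least-squares fit mapping backscatter spectra to an infection score.

    spectra: (N, B) array, one backscatter spectrum per training sample.
    labels:  (N,) ground-truth infection measure (e.g. organisms per gram
             of tissue, obtained from biopsy and culturing).
    Returns weights of a linear model -- a toy stand-in for the trained
    multivariate regression model.
    """
    X = np.hstack([spectra, np.ones((len(spectra), 1))])  # add bias term
    w, *_ = np.linalg.lstsq(X, labels, rcond=None)
    return w

def predict_infection(spectrum, w, threshold=0.5):
    """Score one spectrum and map it to a binary infected/not-infected output."""
    score = float(np.dot(np.append(spectrum, 1.0), w))
    return score, ("infected" if score >= threshold else "not infected")
```

In practice the disclosure anticipates a nonlinear, convolutional model; the linear fit above only shows the input/output contract of a multivariate regressor.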
In
The device 100B includes a controller 150, a first interface 156, a second interface 157, a third interface 158, a fourth interface 159, a camera 170, a light source 180, and a handle 190. The device 100B may be a single, integrated device that can be used to detect wound infection status quickly by emitting light from the light source 180, detecting the backscattered reflective light at the camera 170, and processing the backscattered reflective light at the controller 150.
The controller 150 is further depicted in
The first interface 156, the second interface 157, the third interface 158 and the fourth interface 159 may include ports, disk drives, wireless antennas, or other types of receiver circuitry. The first interface 156, the second interface 157, the third interface 158 and the fourth interface 159 may also include user interfaces such as thumbwheels, buttons, knobs and interactive touch-displays.
The camera 170 may be or include multiple cameras, such as an optical camera, an RGB camera, and/or a thermal camera. The camera 170 is a sensor sensitive to the desired band of wavelengths which are either predetermined or selected via a user interface. The output of the camera 170 is a spectrally and spatially resolved image of the wound area.
The light source 180 may alone be considered an imaging unit, or may be considered an imaging unit in combination with the camera 170. The light source 180 irradiates a broad band of wavelengths onto the wound area. The incident spectrum of the light source 180 is either predetermined, or is determined from frequent or even constant measurements. Knowledge of the incident spectrum is used to calibrate camera images recorded by the camera 170 against the input signal. For example, the camera images recorded by the camera 170 may comprise the backscatter spectrum, and may be normalized by knowledge of the amount of light emitted at each wavelength of the incident spectrum.
The light from the light source 180 is diffusely backscattered from the wound and recorded by the camera 170.
Although not shown in
The handle 190 is representative of handles or similar mechanisms by which a user may hold and move a portable electronic device.
Although not illustrated, the device 100B in
The controller 150 includes a memory 151 and a processor 152. The memory 151 stores instructions which are executed by the processor 152. The processor 152 executes the instructions. When the controller 150 is provided as a stand-alone unit, the controller 150 may also include one or more of the first interface 156, the second interface 157, the third interface 158, and the fourth interface 159, such as to connect the controller 150 to other elements of a device or system.
The controller 150 may perform some of the operations described herein directly and may implement other operations described herein indirectly. For example, the controller 150 may directly process the backscatter light detected from the wound, and determine based on the backscatter light whether the wound is infected. The controller 150 may also directly control emissions of light by the light source 180. The controller 150 may indirectly control other operations such as by generating and transmitting content to be displayed on an external display. Accordingly, the processes implemented by the controller 150 when the processor 152 executes instructions from the memory 151 may include steps not directly performed by the controller 150.
In embodiments herein, the controller 150 may control the camera 170, the light source 180, other types of cameras described below, and the processing of the multivariate regression model. The controller 150 may be used to provide functions additional to the core functions described herein. For example, when a reference spectrum is available, the reference spectrum may be incorporated into the inference process via normalization of the measurement spectrum in the pre-processing stage or the reference spectrum may be passed to the regression model as an extra input in order to identify the calibration function in a manner that is data-driven from empirical experience.
The regression model is trained to operate as a spatially resolved classifier insofar as the chemical and structural composition of wounds (whether infected or not) differs from normal skin tissue. An example of the regression model is a convolutional neural network trained to segment the wound area given the spatially resolved backscatter spectrum as input. The regression model may also be trained to incorporate a reference backscatter spectrum of healthy skin tissue of the same patient. Thereby, the wound area can be calculated by the controller 150 implementing the regression model without requiring an RGB/grey-value camera. The wound area estimate may then be used for predicting a mean pathogen density as a single crisp measure for the infection extent. The mean pathogen density may be expressed as pathogen number per area.
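The mean pathogen density computation described above can be sketched as follows, assuming the regression model has already produced per-pixel pathogen counts and a wound segmentation mask; the names and the per-pixel area parameter are illustrative assumptions:

```python
import numpy as np

def mean_pathogen_density(per_pixel_count, wound_mask, pixel_area_mm2):
    """Mean pathogen density over the segmented wound area.

    per_pixel_count: (H, W) predicted pathogen counts per pixel.
    wound_mask:      (H, W) boolean wound segmentation (e.g. from the
                     convolutional segmentation model in the text).
    pixel_area_mm2:  physical area covered by one pixel, assumed known
                     from the imaging geometry.
    Returns pathogens per mm^2 -- the single crisp measure of infection
    extent expressed as pathogen number per area.
    """
    wound_area = wound_mask.sum() * pixel_area_mm2
    if wound_area == 0:
        return 0.0  # no wound pixels segmented
    return float(per_pixel_count[wound_mask].sum() / wound_area)
```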
In
A variety of additional or alternative features may be provided for the embodiments of
The RGB camera 172 is representative of an RGB/grey-value camera with approximately the same field-of-view as the backscatter detector provided by the optical camera 176. The images acquired by the RGB camera 172 may be used to automatically delineate the wound via, for example, a convolutional neural network trained to segment wounds. The images may also be used to calculate the area of the wound, and the calculated wound area may subsequently be passed to the inference unit for predicting the bacteria density. The bacteria of interest are bacteria with sizes in or above the incident wavelength range, as these are the bacteria which influence the backscatter behavior. The backscattered light proportion per wavelength may be a function of the wound tissue properties and the chemical composition, number and size of the organisms.
In
A variety of additional or alternative features may be provided for the embodiments of
The thermal camera 174 may operate as a temperature measurement device. For example, the thermal camera may be or include a thermographic sensor configured to capture a thermographic profile of the wound. The thermal camera 174 may also capture the body surface area temperature outside of the wound and/or the room temperature, as a reference. The thermal camera 174 is incorporated into the device 100D as an element to ensure realization of the device 100D as a fully contactless device. Temperature measurements from the thermal camera 174 may add additional information on the inflammatory status and metabolic activity of the wound. A measurement process based on thermography using the thermal camera 174 and optical sensing using the optical camera 176 may include the acquisition of a reference image from a healthy skin patch, such as a skin patch that is relatively devoid of hair and relatively planar. The reference image may serve as a reference to rule out major confounding factors such as current skin surface temperature based on seasonal individual biases, skin type, skin age, etc.
In the embodiments based on
The method of
At S210, a regression model is trained. The regression model may be trained for application to images. An example of such a trained regression model is a trained neural network. The regression model may be trained using empirical optical measurements which are linked to target values obtained from clinical standard methods. The regression model may also be trained using Monte-Carlo backscatter simulations from tissue models including different amounts of pathogen scatterers. Alternatively, training of the regression model may be supported by use of such Monte-Carlo backscatter simulations from tissue models including different amounts of pathogen scatterers. The trained regression model may output target classifications that are binary, such as infection/no infection, or that are ranges, such as a discrete number reflective of ranges of organisms-per-gram-tissue.
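A minimal sketch of generating simulated training data, standing in for the Monte-Carlo backscatter simulations mentioned above; the toy simulator, wavelength grid, and count threshold are assumptions for illustration, not the disclosed simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_backscatter(n_scatterers, wavelengths_um, noise=0.01):
    """Toy stand-in for a Monte-Carlo backscatter simulation: intensity
    grows with scatterer count and falls with wavelength (the lambda^-4
    trend of small-particle scattering). Purely illustrative data."""
    base = wavelengths_um ** -4.0
    return n_scatterers * base + rng.normal(0.0, noise, size=base.shape)

# Build a labeled training set: simulated spectra -> binary infection target.
wl = np.linspace(0.4, 1.0, 8)          # assumed wavelength grid, in um
counts = rng.integers(0, 50, size=200)
X = np.array([simulate_backscatter(c, wl) for c in counts])
y = (counts > 20).astype(float)        # "infected" above an assumed count
```

A regressor or classifier trained on `(X, y)` pairs like these would play the role of the model trained at S210, with real training using empirical measurements or physically faithful simulations instead.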
At S220, an emission light spectrum and a backscatter light spectrum are each set. An interface on the device 100B, the device 100C or the device 100D may be used to select a range of wavelengths of the backscattered light to use in detecting the backscattered light. The method of
At S230, a reference image is captured. The reference image may be an image of skin where no wound is present, and is preferably an image of skin that is relatively planar and hairless.
At S240, emission light is emitted. The emission light is emitted at the emission light spectrum set at S220.
At S250, backscatter light is detected. The backscatter light is detected at the backscatter light spectrum set at S220. The backscatter signal will depend on the chemical composition, number, and size of organisms in the wound. While the impact of the chemical composition and size and/or shape on the backscatter intensity spectrum will mainly reflect the presence of organisms, the number of scatterers in the wound exudate will scale the backscatter response with the grade of infection. These aspects are reflected by the Rayleigh equation, reproduced below as Equation (1). In the Rayleigh equation, the backscattered intensity over a spectrum of different wavelengths λ (at a given distance R to the scatterer and scattering angle θ) depends on the refractive index n as well as the sixth power of the particle diameter d.
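The Rayleigh equation referenced as Equation (1) does not survive in this copy of the text. The standard Rayleigh scattering intensity expression, consistent term-by-term with the dependencies described above (distance R, scattering angle θ, wavelength λ, refractive index n, and the sixth power of the particle diameter d), is:

```latex
I = I_0 \, \frac{1 + \cos^{2}\theta}{2R^{2}}
    \left( \frac{2\pi}{\lambda} \right)^{4}
    \left( \frac{n^{2}-1}{n^{2}+2} \right)^{2}
    \left( \frac{d}{2} \right)^{6}
    \qquad (1)
```

The λ⁻⁴ factor explains why the backscattered intensity varies strongly across the monitored spectrum, and the d⁶ factor explains why organism size dominates the backscatter response.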
Since the sizes of bacterial organisms are within or larger than the wavelength band used for monitoring (e.g., including Pseudomonas aeruginosa and Staphylococcus aureus, ca. 0.1-1 μm wide, 1-5 μm long), principles of Mie scattering would be predominant. Nonetheless, leaving all other parameters but the particle diameter fixed, the backscatter response (independent of the model that describes it) is higher in the presence (or at a higher number) of scatterers that are larger than the wavelength compared to regimes with scatterers smaller than the incident wavelength.
At S260, backscatter light is normalized. For a specific wavelength or spectral bin of emitted light, the backscattered light may be normalized by the amount of emitted light at the wavelength/bin. Intensities may vary from light source to light source. Part of the normalization may include a recording of an image with the light source switched off, to obtain a reference of the background illumination to be subtracted before relating the leftover backscatter to the source light.
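The normalization steps above, dark-frame subtraction followed by per-bin division by the emitted source spectrum, can be sketched as follows; all names are illustrative assumptions:

```python
import numpy as np

def normalize_backscatter(raw, dark, source_spectrum, eps=1e-9):
    """Normalize a recorded spectral cube against source and background.

    raw:             (H, W, B) recorded backscatter cube.
    dark:            (H, W, B) image recorded with the light source off,
                     serving as the background-illumination reference.
    source_spectrum: (B,) emitted light amount per spectral bin.
    """
    corrected = np.clip(raw - dark, 0.0, None)   # remove background light
    return corrected / (source_spectrum + eps)   # per-bin normalization
```

The `eps` guard only avoids division by zero in bins where the source emits no light; such bins carry no usable backscatter information anyway.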
At S270, an image is recorded. Although not specified in
At S280, the recorded image from S270 is compared with the reference image from S230.
After S280, the regression model trained at S210 may be applied to features extracted from the image recorded at S270. For example, a backscatter image may be pre-processed before information is input to the regression model. Pre-processing may involve spatial filtering or temporal filtering. Moreover, an image may comprise a sequence of images, so that temporal features, such as pulsation as seen in photoplethysmography (PPG), can be deduced from the images. Local indications for infection may be detected as well as global indications for infection, and application of the regression model to a full image may result in one indication of infection being output. Feature extraction may also involve, for example, a spatial masking of the relevant wound area so that the extracted global optical backscatter information can be tailored to the relevant area. Additionally, although not required, a thermographic profile of the wound may be input to the regression model.
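The spatial-masking step of feature extraction might be sketched as follows, assuming a normalized spectral cube and a wound mask are available; the particular summary statistics are an illustrative choice:

```python
import numpy as np

def extract_global_features(cube, wound_mask):
    """Masked global backscatter features, as in the pre-processing step:
    restrict the cube to the wound area, then summarize each spectral bin.

    cube:       (H, W, B) normalized backscatter cube.
    wound_mask: (H, W) boolean mask of the relevant wound area.
    Returns the per-bin mean and standard deviation over the masked region,
    a compact global descriptor tailored to the wound area only.
    """
    pixels = cube[wound_mask]                    # (N, B) wound pixels only
    return pixels.mean(axis=0), pixels.std(axis=0)
```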
Pre-processing may also be provided in the form of a known method to estimate physical or optical properties from the backscattered light. Examples of the physical or optical properties which can be estimated by known pre-processing methods which may be performed consistent with the teachings herein include estimates of hemoglobin or water content, such as from e.g. inverse Monte-Carlo model optimization.
The regression model may be applied by a controller 150, and specifically by a processor executing instructions from a memory. The controller 150 may be or include an inference unit that takes a multivariate input and translates the multivariate input to a target value describing the infection status. The multivariate input includes the spectrally resolved images and may include a temperature measure as well as a reference spectrum from the reference image taken at S230. The controller 150 may exploit the 2-dimensional image topology by convolutional architectures to fit a nonlinear function to empirical data.
The target value defines the type of the inference problem and may be one of the following options, described as follows from basic to elaborate. The most basic target value may be a binary value which classifies an entire image into infected or not infected. The next most basic target value may be a classification for the entire image into a measure of the extent of infection. The extent of infection may be proportional or equal to organisms-per-gram-tissue, and may be discretized into coarse ranges such as not infected, slightly infected, moderately infected, or highly infected, depending on clinical implications. In more elaborate schemes, the regression model may output a classification for each pixel or group of pixels, either as a binary classification or as a relative measure of the extent of infection. An indication of whether the wound is infected may be output for each of a plurality of subsets of pixel values used to generate an image.
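The coarse-range discretization could be sketched as below; the numeric boundaries are placeholders chosen for illustration, since the text leaves the clinical cut-offs open:

```python
def infection_grade(organisms_per_gram, bounds=(1e3, 1e5, 1e7)):
    """Map an organisms-per-gram-tissue density to coarse ranges.

    The boundary values are illustrative placeholders, not clinical
    thresholds; the appropriate cut-offs depend on clinical implications
    as stated in the text.
    """
    labels = ("not infected", "slightly infected",
              "moderately infected", "highly infected")
    for bound, label in zip(bounds, labels):
        if organisms_per_gram < bound:
            return label
    return labels[-1]    # at or above the highest boundary
```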
Empirical data and ground truth for the regression model may be obtained from alternative mechanisms which are considered highly reliable, e.g., so-called gold standards. The alternative mechanisms may not be capable of the rapid and remote monitoring made possible by the teachings herein, but yield a clinically established/accepted measure of higher detail than required according to the teachings herein. As examples, the traditional mechanisms used to establish ground truth may not be capable of rapid and remote monitoring due to the time required for analysis, the nature of the bulky/contact-based device(s) used for such mechanisms, and the limited ability to take only local measurements.
As examples of the empirical data used to obtain the regression model, tissue biopsy and culturing may yield densities per type of bacterium per wound. Densities per type may be expressed as organisms-per-gram-tissue. The data of densities per type may be joined into a single organism density to obtain a binary target. As described above, binary targets may be applicable to entire images, but not for individual pixel values or subsets of pixel values in the images. As another example, vibrational spectroscopy may yield one or more number(s) per measurement location, either in-vivo or in-vitro. As with densities per type, the results of vibrational spectroscopy may be joined into a single density. The measurement locations may be (e.g. manually) registered with the camera image recorded as reference image at S230 and then used subsequently for inference. The information measured on a subset of locations across the image/wound area may be “smeared” across the image, such as by using convolutional filters, so as to obtain a ground truth of the full resolution. The smearing technique assumes that the infection status does not vary abruptly across the wound surface, and the technique may be more valid when the ground truth measure is quantized into coarse ranges compared to discrete numerical outputs.
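A minimal stand-in for the smearing of sparse ground-truth measurements across the wound image, using a nearest-measurement fill rather than the convolutional filters mentioned; the underlying smoothness assumption (infection status does not vary abruptly across the wound surface) is the same, and all names are illustrative:

```python
import numpy as np

def smear_ground_truth(shape, locations, values):
    """Spread sparse point measurements over a full-resolution map.

    shape:     (H, W) of the target ground-truth map.
    locations: list of (row, col) measurement positions, registered to
               the reference image as described in the text.
    values:    measured density at each location.
    Each pixel takes the value of its nearest measurement point.
    """
    H, W = shape
    rr, cc = np.mgrid[0:H, 0:W]
    locs = np.array(locations, dtype=float)       # (K, 2)
    vals = np.asarray(values, dtype=float)
    # distance from every pixel to every measurement point: (H, W, K)
    d = np.sqrt((rr[..., None] - locs[:, 0]) ** 2
                + (cc[..., None] - locs[:, 1]) ** 2)
    return vals[np.argmin(d, axis=-1)]            # nearest-point fill
```

A Gaussian or convolutional smear would produce smoother transitions; the nearest-point version only illustrates how sparse labels become a dense ground-truth map.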
The methodologies used to collect empirical data for the ground truth are each aimed at resolving the pathogen type, whereas the output from the multivariate regression model used herein may be less detailed. To build a robust inference module for the infection status/extent, the multivariate regression model is trained on cases with a variety of wounds such that sufficient variability in the concentration, type, shape, and size of pathogens and other confounding factors is covered. The factors in the variability of wounds used to establish the ground truth may be factors not otherwise taken into account in the reference image taken at S230 or any dedicated model, and the factors in the variability of wounds used to establish the ground truth may exclude factors such as skin type, wound age, and wound depth.
At S299, an indication is output. The indication may be output as a display, as an audio tone, by a color indication such as by individual light-emitting diodes (LEDs), or by other mechanisms perceptible to a human. The output is an indication of infection status. The inference result may be finally displayed to the user, such as on the device 100B that outputs the light, captures the backscatter spectrum, and applies the multivariate regression model. Depending on the chosen target value, the display may be realized as a colormap overlay that indicates the regression output/infection status per pixel on top of the wound image. An indication of whether the wound is infected may be output for each of a plurality of subsets of pixel values used to generate an image. The display may alternatively be realized as a color range/bar indicator which visually classifies the infection status into an evaluated context. For example, the display may output green for no infection and red for highly infected and action needed.
In a bedside or home monitoring scenario, the infection status indicator can be combined with a threshold and warning system that triggers the patient to see an expert for more detailed analysis.
At S292, a multivariate input of 2-dimensional (2D) image topology is input to an inference unit. A camera records backscattered light from the wound. At each pixel, the camera may resolve the sensed backscattered light intensity into contributions stemming from different wavelengths. In other words, multiple variables are derived across pixels (space) and possibly time in the instance of multiple camera images. In space, the pixels are subject to a specific 2-dimensional topology. An example of a specific 2-dimensional topology is a 2-dimensional regular grid of pixels, where a pixel neighborhood is defined. The multivariate input may also be or include other types of information not particularly related to the camera. Other types of information may include, for example, temperature, a reference spectrum from areas outside of the wound, patient-specific parameters such as age and skin type, and so on.
At S294, a regression model is fit to the 2-dimensional image topology. A regression model is fit to data that empirically describes the functional relationship between multivariate backscatter signal and infection status. The regression model thereby fits a non-linear function to the data. By fitting the nonlinear function, the regression model (predicting an indicator either per pixel or for the whole image) may exploit the neighborhood information provided by the 2-dimensional topology of the image. The regression model may predict an indicator either per pixel or for an entire image. An example of S294 is a 2-dimensional convolutional neural network fusing information from a local neighborhood as defined by its filter kernels.
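How a convolutional filter kernel fuses a pixel with its local neighborhood, as described for S294, can be illustrated with a minimal 2-D convolution; this is not the disclosed network, only the neighborhood-fusion principle its layers rely on:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Minimal 2-D convolution with 'valid' padding: each output value
    fuses an input pixel with its neighborhood, exactly the mechanism a
    convolutional layer uses to exploit the 2-D image topology."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # weighted sum over the (kh x kw) neighborhood at (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```

A trained network stacks many such kernels with learned weights and nonlinearities; the sketch shows only how neighborhood information enters each output value.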
At S296, best-fit classifications are output from the regression model as an infection status.
The method of
Referring to
In a networked deployment, the computer system 400 operates in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 400 can also be implemented as or incorporated into various devices, such as the devices described herein including a stationary device or a mobile device, a mobile computer, a laptop computer, a tablet computer, or any other machine capable of executing a set of software instructions (sequential or otherwise) that specify actions to be taken by that machine. The computer system 400 can be incorporated as or in a device that in turn is in an integrated system that includes additional devices. In an embodiment, the computer system 400 can be implemented using electronic devices that provide voice, video or data communication. Further, while the computer system 400 is illustrated in the singular, the term “system” shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of software instructions to perform one or more computer functions.
As illustrated in
The term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems. The term computing device should likewise be interpreted to include a collection or network of computing devices, each including a processor or processors. A program comprises software instructions executed by one or more processors, which may be within the same computing device or distributed across multiple computing devices.
The computer system 400 further includes a main memory 420 and a static memory 430, which communicate with each other and with the processor 410 via a bus 408. Either or both of the main memory 420 and the static memory 430 may be considered representative examples of the memory 151 of the controller 150 in
“Memory” is an example of a computer-readable storage medium. Computer memory is any memory which is directly accessible to a processor. Examples of computer memory include, but are not limited to RAM memory, registers, and register files. References to “computer memory” or “memory” should be interpreted as possibly being multiple memories. The memory may for instance be multiple memories within the same computer system. The memory may also be multiple memories distributed amongst multiple computer systems or computing devices.
As shown, the computer system 400 further includes a video display unit 450, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid-state display, or a cathode ray tube (CRT), for example. Additionally, the computer system 400 includes an input device 460, such as a keyboard/virtual keyboard or touch-sensitive input screen or speech input with speech recognition, and a cursor control device 470, such as a mouse or touch-sensitive input screen or pad. The computer system 400 also optionally includes a disk drive unit 480, a signal generation device 490, such as a speaker or remote control, and/or a network interface device 440.
In an embodiment, as depicted in
In an embodiment, dedicated hardware implementations, such as application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays and other hardware components, are constructed to implement one or more of the methods described herein. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules. Accordingly, the present disclosure encompasses software, firmware, and hardware implementations. Nothing in the present application should be interpreted as being implemented or implementable solely with software and not hardware such as a tangible non-transitory processor and/or memory.
In accordance with various embodiments of the present disclosure, the methods described herein may be implemented using a hardware computer system that executes software programs. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Virtual computer system processing may implement one or more of the methods or functionalities as described herein, and a processor described herein may be used to support a virtual processing environment.
Accordingly, contactless wound monitoring enables a monitoring device to rapidly monitor the infection status of a wound surface by describing the wound infection status either in binary form or as a rough quantification. The contactless wound monitoring described herein also allows trends in the severity of infection to be monitored over time, which in turn provides valuable information for assessing the chances of success of a current treatment strategy and for adjusting that strategy (e.g., by increasing a drug dose) when appropriate. Contactless wound monitoring may be used to rapidly obtain the status of a wound, such as when elaborate information on the nature of any wound infection is not particularly sought. Contactless wound monitoring should minimally impact workflow while serving as a gating process to indicate trends and any need for further investigation. Contactless wound monitoring may be used at home, at the bedside, during interventions in the catheterization lab, and in a variety of other contexts. In clinical use, contactless wound monitoring may be employed in a variety of fields including peripheral vessel disease, surgical wounds, and/or dermatology applications. A single device used for the contactless wound monitoring may be both mobile and portable, or may be integrated into a catheterization lab, such as by integrating the device with a workstation.
Although contactless wound monitoring has been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of contactless wound monitoring in its aspects. Although contactless wound monitoring has been described with reference to particular means, materials and embodiments, contactless wound monitoring is not intended to be limited to the particulars disclosed; rather contactless wound monitoring extends to all functionally equivalent structures, methods, and uses such as are within the scope of the appended claims.
The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of the disclosure described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
One or more embodiments of the disclosure may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any particular invention or inventive concept. Moreover, although specific embodiments have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b) and is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter may be directed to less than all of the features of any of the disclosed embodiments. Thus, the following claims are incorporated into the Detailed Description, with each claim standing on its own as defining separately claimed subject matter.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2022/085185 | 12/9/2022 | WO |

Number | Date | Country
---|---|---
63291559 | Dec 2021 | US