Automated or partially automated anatomical surface assessment methods, devices and systems

Information

  • Patent Grant
  • Patent Number
    12,039,726
  • Date Filed
    Wednesday, May 20, 2020
  • Date Issued
    Tuesday, July 16, 2024
Abstract
Devices, systems and methods for assessing anatomical surface features are described herein. In some embodiments, a method of assessing a surface feature on a patient's skin surface includes (i) capturing one or more data sets from a patient's skin surface including the skin surface feature, and (ii) determining outline data of the skin surface feature based on at least one of the one or more data sets. The method can further include determining one or more confidence attributes associated with the determined outline data.
Description
FIELD

The present technology is generally related to devices, systems and methods for assessing anatomical surface features.


BACKGROUND

Various techniques have been used to monitor wounds, ulcers, sores, lesions, tumours etc. (herein referred to collectively as “wounds”) both within hospitals and for outpatients. Typically these wounds are flat or concave and up to 150 millimetres (around 6 inches) across, though larger wounds are also possible. Manual techniques are typically labour-intensive and require examination and contact by skilled personnel. Such measurements may be inaccurate and there may be significant variation between measurements made by different personnel. Further, these approaches may not preserve any visual record for review by an expert or for subsequent comparison.


A number of techniques for the automated monitoring of wounds have been proposed. A common approach is to place a reference object next to the wound, capture an image, and determine the size of the wound utilising the scale of the reference object. However, placing a reference object near a wound is often undesirable: it adds a cumbersome step for the user and risks contamination of the wound. Further, if the reference object is not in the plane of the wound, is oblique to it, or if the wound itself is not planar, any area calculation will be in error. A reference object may also be misplaced.


In general, the outline of the wound may be determined manually by a user, or automatically by the system. In some cases, automated determination of wound outline has been problematic (e.g. the automated wound outline has not been of sufficient quality to allow accurate or at least repeatable assessment of the wound). In such cases manual input of a wound outline by a user may be preferred. It would be desirable to provide improvements in automated determination of surface feature outline.


The Applicant devised methods for assessment of surface features such as wounds, as disclosed in its earlier applications and patents, including: U.S. Pat. No. 8,755,053, filed May 11, 2009; U.S. Pat. No. 9,179,844, filed Nov. 27, 2012; U.S. patent application Ser. No. 15/144,722; U.S. Provisional Patent Application No. 62/423,709, filed 17 Nov. 2016; PCT publication WO2018/185560; U.S. Provisional Patent Application No. 62/481,612; U.S. patent application Ser. No. 15/816,862; and US Patent Publication No. 2018/0132726, all of which are incorporated herein by reference in their entireties.


It is an object of the present technology to provide improved devices, systems, and/or methods for assessing and/or treating wounds and/or other anatomical features.


SUMMARY

A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Outline data of the skin surface feature may be determined based on at least one of the one or more data sets. One or more confidence attributes associated with the determined outline data may be determined.


The determined outline data may include a determined outline of the surface feature.


The one or more confidence attributes may include a confidence attribute for the determined outline as a whole. The one or more confidence attributes may include a plurality of segment confidence attributes, each associated with a segment of the determined outline. The one or more confidence attributes may include a plurality of point confidence attributes, each associated with a point on the determined outline.


The determined outline or part of the determined outline may be rejected if one or more of the confidence attributes are unacceptable. Part of the determined outline may be rejected if one or more of the segment confidence attributes are unacceptable. Part of the determined outline may be rejected if one or more of the point confidence attributes are unacceptable.


User input for replacement or modification of the rejected determined outline or rejected part of the determined outline may be solicited.


The determined outline data may include sub-region outline data for one or more sub-regions of the surface feature.


The one or more confidence attributes may include one or more of: one or more confidence attributes for the sub-region outline data for each of the one or more sub-regions; a plurality of sub-region segment confidence attributes, each associated with a segment of the determined sub-region outline data; and a plurality of sub-region point confidence attributes, each associated with a point in the determined sub-region outline data.


One or more tissue types associated with each of the sub-regions may be determined. A tissue confidence attribute associated with the determined tissue type for each of the sub-regions may be determined.


Determination of the confidence attribute may be based at least in part on distances between points in the determined outline data.


The determined outline data may be processed, wherein determination of the confidence attribute is based at least in part on distances between points in the determined or processed outline data and points in the one or more data sets. Processing the determined outline data may include resampling the determined outline data. The points in the one or more data sets may include points with a predetermined tissue type classification.


The one or more confidence attributes and/or data based on the one or more confidence attributes may be stored.


The one or more confidence attributes and/or data based on the one or more confidence attributes may be displayed.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Outline data of the skin surface feature may be determined based on at least one of the one or more data sets. One or more confidence attributes associated with the determined outline data may be determined. At least part of an outline that is based at least in part on the determined outline data may be displayed. One or more visual representations of the one or more confidence attributes may be displayed.


A representation of the skin surface and/or skin surface feature may be displayed.


Displaying one or more visual representations of the one or more confidence attributes may include displaying one or more of: a numerical confidence indicator, a confidence symbol indicator, a colored confidence indicator, a patterned confidence indicator, and a confidence scale indicator.


A plurality of visual representations of the one or more confidence attributes may be displayed, at least some of the displayed visual representations being associated with segments of the displayed outline.


Displaying at least part of an outline may include displaying only outline segments with confidence attributes that are acceptable.


User input of outline data for outline segments with confidence attributes that are unacceptable may be solicited.


A user may be enabled to accept or reject the displayed outline or segments of the displayed outline. A user may be enabled to modify the displayed outline or segments of the displayed outline.


One or more revised confidence attributes associated with the modified outline may be determined, with the modified outline being displayed and one or more visual representations of the one or more revised confidence attributes being displayed.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Outline data for the skin surface feature may be determined based on at least one of the one or more data sets. One or more confidence attributes associated with the determined outline data may be determined. Based on the one or more confidence attributes, user input of replacement or modified outline data may be solicited.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. A region of interest (‘ROI’) within at least one of the data sets may be defined, the skin surface feature being positioned within the ROI. Data within the ROI may be analyzed to determine outline data of the skin surface feature.


A user may define the ROI.


Alternatively, the ROI may be automatically defined based on one or more known characteristics of a capture device used in the capturing of one or more data sets. The ROI may be automatically defined by extracting data corresponding to the patient's skin surface from background data. The ROI may be automatically defined based on a first approximation of a boundary of the surface feature. The ROI may be automatically defined based on historic data associated with the same wound or same patient.


The ROI may be based on a historic ROI. The ROI may be based on historic outline data. The ROI may be based on known fiducials on the skin surface.


User approval of the ROI may be solicited. User modification of the ROI may be enabled.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Outline data for the skin surface feature may be determined. The outline data may be smoothed to produce a set of smoothed outline points.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Outline data may be determined for the skin surface feature. The outline data may include a first set of outline points. The outline data may be processed to produce a reduced set of outline points, the reduced set of outline points having a smaller number of outline points than the first set of outline points.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. A subset of skin surface feature outline points may be determined in at least one of the one or more data sets. From the identified subset of skin surface feature outline points and data from the one or more data sets, extrapolation may be performed to determine a set of outline data for the skin surface feature.


A user may manually identify the subset of skin surface feature outline points. A user may manually identify the subset of skin surface feature outline points by identifying one or more discrete outline points. A user may manually identify the subset of skin surface feature outline points by identifying one or more outline segments each including a plurality of outline points.


A user may be enabled to classify outline points in the identified subset of outline points as definite points or non-definite points.


A method of assessing a surface feature on a patient's skin surface may include capturing, in one or more first color spaces, one or more images of a patient's skin surface including the skin surface feature. At least some of the one or more images may be converted to produce one or more converted images in one or more further color spaces, different to the one or more first color spaces. Outline data for the skin surface feature may be determined based on the one or more converted images.


The one or more further color spaces may include one or more monochrome color spaces. The one or more further color spaces may include a normalized green color space. The one or more color spaces may include one or more of: normalized red, normalized blue, red, green, blue, UV, Infrared, CIE Lab, RGB, HSL, HSV, CIE Luv, YCbCr, CMYK, enhanced red, and grayscale color spaces.


A method of assessing a surface feature on a patient's skin surface may include capturing one or more data sets from a patient's skin surface including the skin surface feature. Historical information relating to the patient and/or skin surface feature may be retrieved from storage. Based on the retrieved historical information, an outlining method may be selected from a plurality of available methods. Outline data of the skin surface feature may be determined based on at least one of the one or more data sets, using the selected outlining method.


The historical information may include one or more of: skin surface characteristics, surface feature characteristics, surface feature type, surface feature etiology, anatomical site, imaging metrics, previous dressing type, tissue type characteristics, 3D surface feature information, expected or assessed pressure or friction around the surface feature, and historical outline information.


A method of assessing one or more surface features on the skin surfaces of one or more patients may include determining outline data of a first skin surface feature based on one or more captured data sets. A processor may modify an outline determination method. A processor may determine outline data of a further skin surface feature based on one or more further captured data sets and according to the modified outline determination method.


The processor may modify the outline determination method based on information obtained from the determining of outline data of the first skin surface feature.


The determining of outline data of the first skin feature may include a processor determining the outline data of the first skin surface feature based on one or more captured data sets and according to the outline determination method.


A user may accept or modify the determined outline data of the first skin surface feature, and the processor may modify the outline determination method based at least in part on the user acceptance or modification of the determined outline data of the first skin surface feature.


Code embodying the modified outline determination method may be stored in memory.


It is acknowledged that the terms “comprise”, “comprises” and “comprising” may, under varying jurisdictions, be attributed with either an exclusive or an inclusive meaning. For the purpose of this specification, and unless otherwise noted, these terms are intended to have an inclusive meaning—i.e., they will be taken to mean an inclusion of the listed components which the terms directly reference, and possibly also of other non-specified components or elements.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute part of the specification, illustrate embodiments of the present technology and, together with the general description of the present technology given above and the detailed description of embodiments given below, serve to explain the principles of the present technology, in which:



FIG. 1 is a diagram showing some of the components typically incorporated in at least some of the computer systems and other devices on which the facility executes.



FIG. 2 is a flow diagram providing an overview of one example of an outline determination method.



FIG. 3 shows an image including a surface feature and Region of Interest according to one example.



FIG. 4 shows an image including a surface feature, ROI and structured light pattern according to one example.



FIG. 5 shows an image including a skin surface and surface feature, according to one example.



FIG. 6A shows a surface feature outline before and after smoothing according to one example.



FIG. 6B is an expanded view of a portion of FIG. 6A.



FIG. 7 shows a smoothed or filtered outline according to one example.



FIG. 8 shows outline data according to one example.



FIG. 9A shows outline data according to a further example.



FIG. 9B is an expanded view of a portion of FIG. 9A.



FIG. 10 shows outline data according to another example.



FIG. 11A shows outline data according to yet another example.



FIG. 11B is an expanded view of a portion of FIG. 11A.



FIG. 12 is a flow diagram illustrating one example of confidence determination.



FIG. 13 is a simplified screenshot showing one example of display of data.



FIG. 14 is a simplified screenshot showing one example of display of data including outline and global confidence data.



FIG. 15 is a simplified screenshot showing one example of display of data including outline data, local confidence data and a user prompt.



FIG. 16 is a simplified screenshot showing a further example of display of data including outline data and global confidence data.





DETAILED DESCRIPTION

Overview


Described herein is a software facility for automatically assessing an anatomical surface feature (“the facility”), such as a wound, and for managing information related to assessed anatomical surface features across a range of patients and institutions. While the following discussion liberally employs the term “wound” to refer to the anatomical surface feature(s) being assessed, the present devices, systems, and methods may be straightforwardly applied to anatomical surface features of other types, such as ulcers, sores, lesions, tumors, bruises, burns, moles, psoriasis, keloids, skin cancers, erythema, cellulitis, and the like. Similarly, a wide variety of users may use the facility, including doctors, nurses, technologists, or any other caregiver of the patient, or the patient.


As discussed in greater detail below, the facility may be implemented using readily available portable computing devices, thereby taking advantage of existing device capabilities without the need for separate hardware attachments (although in some embodiments auxiliary hardware devices may be used). As used herein, the terms “computer” and “computing device” generally refer to devices that have a processor and non-transitory memory, as well as any data processor or any device capable of communicating with a network. Data processors include programmable general-purpose or special-purpose microprocessors, programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), system on chip (SOC) or system on module (SOM) (“SOC/SOM”), an ARM class CPU with embedded Linux or Android operating system or the like, or a combination of such devices. Computer-executable instructions may be stored in memory, such as random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such components. Computer-executable instructions may also be stored in one or more storage devices, such as magnetic or optical-based disks, flash memory devices, or any other type of non-volatile storage medium or non-transitory medium for data. Computer-executable instructions may include one or more program modules, which include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types.


Anatomical Surface Feature Assessment



FIG. 1 is a block diagram showing a sample environment having multiple components in which the facility executes. The environment 100 may include one or more capture devices 102, 102a, 102b, 102c, one or more personal computing devices 104, one or more server computers 106, and one or more persistent storage devices 108. The capture device 102, 102a, 102b, 102c and the personal computing device 104 communicate (wirelessly or through a wired connection) with the server computer 106 through a network 140 such as, for example, a Local Area Network (LAN), a Wide Area Network (WAN), and/or the Internet. In the embodiment shown in FIG. 1, the capture device 102, 102a, 102b, 102c may or may not communicate directly with the personal computing device 104. For example, the capture device 102, 102a, 102b, 102c may communicate wirelessly with a first base station or access point 142 using a wireless mobile telephone standard, such as the Global System for Mobile Communication (GSM), or another wireless standard, such as IEEE 802.11, and the first base station or access point 142 communicates with the server computer 106 via the network 140. Likewise, the computing device 104 may communicate wirelessly with a second base station or access point 144 using a wireless mobile telephone standard, such as the Global System for Mobile Communication (GSM), or another wireless standard, such as IEEE 802.11, and the second base station or access point 144 communicates with the server computer 106 via the network 140. In some embodiments, confidential patient data generated by the capture device 102, 102a, 102b, 102c is only temporarily stored locally, or not at all, and instead is permanently stored at the storage device 108 associated with the server computer 106. The facility can be practiced on any of the computing devices disclosed herein (e.g., one or more personal computing devices 104, one or more server computers 106, etc.), and may include an interface module that generates graphical user interfaces (GUIs) to allow users to access the facility (as described in greater detail below with reference to FIGS. 2-29).


The personal computing device 104 can include one or more portable computing devices 120 (e.g., a smart phone, a laptop, a tablet, etc.) and/or one or more desktop computing devices 122. During data capture with the capture device 102, 102a, 102b, 102c at the point-of-care, the personal computing device 104 may also be present (i.e., in the same treatment room), or the personal computing device 104 may be remote (i.e., outside of the treatment room but in the same treatment facility, outside of the treatment room and remote from the treatment facility, etc.). The desktop computing devices 122, if utilized, are typically associated with a particular property, e.g., a medical treatment center 124 (e.g., a hospital, a doctor's office, a clinic, etc.). The portable computing devices 120 and desktop computing devices 122 communicate with each other and the server computer 106 through networks including, for example, the Internet. In some instances the portable computing devices 120 and desktop computing devices 122 may communicate with each other through other wireless protocols, such as near field or Bluetooth.


The capture device 102, 102a, 102b, 102c is a handheld, portable imaging device that includes one or more sensing devices for generating data characterizing the wound (“wound data”) at the point-of-care. In the embodiment shown in FIG. 1, the capture device 102 may include an image sensor 110 (e.g., a digital camera), a depth sensor 112 (also known as a “range imager”), and a computing device 116 (shown schematically) in communication with the image sensor 110 and the depth sensor 112.


Alternatively, in some embodiments the capture device may be a smartphone 102a, tablet 102b or other capture device including an image sensor 110 and a computing device 116. The capture device may, for example, be an iPhone, iPad, Android phone, other smartphone, tablet, etc.


In some embodiments a capture device 102c may include a 3D camera 126. For the purposes of this specification, a 3D camera 126 is a device that creates a topographic map of the surface structure as seen from the viewpoint of the camera. Suitable 3D cameras may include stereo cameras, fully sampled structured light cameras, fully sampled time of flight cameras, depth from focus 3D cameras, light field (or plenoptic) 3D cameras and holographic cameras.


The capture device may include one, two or some other number of cameras, and/or structured light arrangements, 3D cameras etc. In such embodiments a separate personal computing device 104 may not be required. As discussed below, an auxiliary device may be mounted to or connected to a capture device 102, 102a, 102b, 102c.


Further, a variety of different capture devices 102, 102a, 102b, 102c may be used in the environment 100. Different types of capture device 102, 102a, 102b may be used to capture data in relation to the same wound, at the same or different times, by the same or different users. The data from a variety of different capture devices 102, 102a, 102b may be processed in the facility.


The capture device 102, 102a, 102b, 102c is also in wireless communication with the server computer 106 (e.g., via the network 140). The image sensor 110 is configured to generate image data of the wound (e.g., pixels containing RGB color data—or data in a similar or equivalent color space such as HSV or HSL). In some embodiments, a hyperspectral image sensor may be used, i.e. a sensor that has sensitivity outside the conventional RGB visible spectrum range. In some embodiments a spectrometer or array of spectrometers may be used to provide greater color resolution than a conventional RGB sensor. A spectrometer or array of spectrometers may be particularly useful in analysis of fluorescence. Where provided, the depth sensor 112 is configured to generate depth data characterizing the depth or topography of the wound. For example, in some embodiments the depth sensor 112 is a structured light device configured to emit structured light (e.g., one or more lasers, DLP projectors, film projectors, etc. where the emitted light may be infra-red, visible, ultraviolet, etc.) in a predetermined arrangement toward the wound. In such embodiments, for example, the depth sensor 112 may comprise three laser elements (labeled 112a-112c) spaced apart around a circumference of the capture device 102. The laser elements 112a-112c have a fixed positional relationship with respect to one another, and also with respect to the image sensor 110. Together the laser elements 112a-112c can be configured to create a structured light pattern (e.g., a laser point(s), a laser fan(s), etc.) In some embodiments the laser elements do not need to be symmetrically arranged. In other embodiments, the depth sensor 112 can include other suitable devices for range imaging, such as an ultrasonic sensor, a stereo camera, a plenoptic camera, a time-of-flight camera, an ISM band miniature radar, etc.


The capture device 102, 102a, 102b, 102c also includes a rechargeable power source and an actuator 118 (e.g., a button, a switch, touch screen etc.) for initiating data capture. When a user presses the actuator 118, the computing device 116 activates the image sensor 110 (and the depth sensor 112 if included) to generate data. The computing device 116 then communicates the captured data to the remote server computer 106 for further processing by the facility. In some embodiments, the computing device 116 may partially or completely process the captured data before communicating the captured data to the remote server computer 106. In some embodiments, the capture device 102, 102a, 102b, 102c may partially or completely process the captured data before communicating with the computing device 116. In some embodiments, the computing device 116 wirelessly communicates with the server computer 106 (e.g., over a network). Such a cordless arrangement can be advantageous as it allows the user greater freedom of movement with the capture device 102, 102a, 102b, 102c, which can be especially beneficial when trying to access certain anatomical locations. Also, the absence of a cord reduces the surface area available at the point-of-care on which bacteria and/or other unwanted microorganisms may bind and travel. In some embodiments, the capture device 102, 102a, 102b, 102c may be permanently cordless (i.e., no input port), and in other embodiments, the capture device 102, 102a, 102b, 102c may be configured to detachably receive an electronic connector, such as a power cord or a USB cord, or a permanently attached retracting cord may be provided. The computing device 116 may automatically transfer the captured data to the remote server computer 106 (e.g., over the network 140) at the moment the data is captured. In certain embodiments, however, the computing device 116 may not be in communication with the network 140; in such scenarios, the captured data may be temporarily stored in the volatile and/or non-volatile memory of the capture device 102, 102a, 102b, 102c for later transfer to the server computer 106.


The capture device 102, 102a, 102b, 102c may include additional features for enhancing data collection from the wound, such as one or more light sources 114 (e.g., a light emitting diode (LED), an incandescent light source, an ultraviolet light source, a flash etc.) for illuminating the wound before or during data capture, an indicator (not shown) configured to provide a visual and/or audio signal (e.g., images, text, lights, etc.) to the user, a thermal camera, a video camera, and/or one or more input/output devices (e.g., a microphone, a speaker, a port for communicating electrically with external components, such as a power source, the personal computing device 104, etc.). In some embodiments, the capture device 102, 102a, 102b, 102c is configured for wireless charging, e.g., via a dock or cradle (not shown). In such embodiments, the charging cradle may also serve as an access point for the network 140. As discussed in greater detail below with reference to FIGS. 6A-6B, the capture device 102, 102a, 102b, 102c and/or image sensor 110 may also be configured to capture images of barcodes and/or QR codes displayed on the computing device 104, such as a barcode and/or a QR code that enable the capture device 102, 102a, 102b, 102c to be configured to connect to the network 140.


In some embodiments, the capture device 102, 102a, 102b, 102c may have other configurations than that shown in FIG. 1. For example, although the image sensor 110, depth sensor 112, and computing device 116 are shown as part of a single component and/or within the same housing, in other embodiments, any or all of the image sensor 110, the depth sensor 112, and the computing device 116 can be separate components. Likewise, in some embodiments, the capture device 102, 102c does not include separate image and depth sensors, and instead includes a stereo camera that is configured to generate both image data and depth data. In other embodiments the capture device may include a display-less imaging device connected (by wired or wireless link) to a display device. Additional details regarding suitable capture devices 102 and methods of use can be found in U.S. Pat. No. 8,755,053, filed May 11, 2009; U.S. Pat. No. 9,179,844, filed Nov. 27, 2012; U.S. patent application Ser. No. 15/144,722; U.S. Provisional Patent Application No. 62/423,709, filed 17 Nov. 2016; PCT publication WO2018/185560; U.S. Provisional Patent Application No. 62/481,612; U.S. patent application Ser. No. 15/816,862; and US Patent Publication No. 2018/0132726, all of which are incorporated herein by reference in their entireties.


As discussed above, the facility may include an interface module that generates graphical user interfaces (GUIs) to allow users to access the facility. The interface module also provides application programming interfaces (APIs) to enable communication and interfacing with the facility. APIs may be used by other applications, web portals, or distributed system components to use the system. For example, an application operating on a personal computing device may use an API to interface with system servers and receive capture data from the system. The API may utilize, for example, Representational State Transfer (REST) architecture and Simple Object Access Protocol (SOAP) protocols.


In some embodiments the capture device 102, 102a, 102b, 102c may include one or more further data gathering components, such as a positioning module (e.g. GPS), Inertial Measurement Unit, temperature sensor etc. Alternatively, such functions may in some embodiments be provided by a separate auxiliary module configured for attachment to the capture device 102, 102a, 102b, 102c.


Any of the capture devices 102, 102a, 102b, 102c and/or the personal computing devices 104 may provide access to video and/or audio communications, including video conferencing. Any of the capture devices 102, 102a, 102b, 102c and/or the personal computing devices 104 may provide access to remote medical expert systems. A remote medical expert may review or assess data captured by the capture device 102, 102a, 102b, 102c in real time, or at a later time.


The facility may provide for automated billing based on usage and/or data gathered by the capture devices 102, 102a, 102b, 102c. The facility may also maintain inventory information for capture devices 102, 102a, 102b, 102c.


The following methods may be implemented using appropriate facility software running on the capture device 102, 102a, 102b, 102c, personal computing device 104, and/or server computer 106 and/or further computing devices within the environment. In some embodiments, methods may be implemented through an application running on a capture device 102, 102a, 102b, 102c. Methods may be implemented across two or more devices and/or computers. For example, a method may include data capture steps implemented on a capture device 102, 102a, 102b, 102c, data analysis steps implemented on the capture device 102, 102a, 102b, 102c and/or personal computing device 104 and/or server computer 106, and data storage steps implemented on server computer 106 and persistent storage devices 108. In some embodiments, the personal computing device 104 may be omitted.


In one embodiment a surface feature outline may be automatically determined by the facility, or more specifically by a processor, based on data captured from the patient's anatomical surface. FIG. 2 is a flow chart illustrating this process. In FIG. 2, data is captured at block 200. Optional data processing may be performed at block 201. An outline of an anatomical surface feature may be determined at block 202. Resulting outline and associated data may be stored and/or displayed at block 203.


In general, a physical surface feature may be considered to have a ‘boundary’. The determined outline should match the boundary to an accuracy that is satisfactory for the application.


Data capture may involve capture of data from the patient's anatomical surface using any suitable device or combination of devices.


Such capture devices may be contained within a single instrument, or may be contained in different instruments. Data may be captured by any one or more of the devices and methods discussed herein and in the Applicant's applications and patents, including U.S. Pat. No. 8,755,053, filed May 11, 2009; U.S. Pat. No. 9,179,844, filed Nov. 27, 2012; U.S. patent application Ser. No. 15/144,722; and U.S. Provisional Patent Application No. 62/423,709 filed 17 Nov. 2016; PCT publication WO2018/185560; U.S. Provisional Patent Application No. 62/481,612; U.S. patent application Ser. No. 15/816,862; and US Patent Publication No. 2018/0132726, the entire contents of all of which are hereby incorporated by reference herein, or from any other suitable device or devices.


Preliminary data processing may include any desired preliminary processing in advance of outline determination. For example, preliminary data processing may include reduction of the data set by a processor using a manually or automatically determined Region of Interest (ROI). Preliminary data processing may include conversion of captured data by a processor into one or more color spaces. For example, captured image data may be converted into a normalized green (normG) color space or any other desired color space. Color spaces may include red, green, blue, infrared, ultraviolet color spaces, color spaces based on other wavelength regions or data, or any color space based on a combination of any two or more of red, green, blue, infrared, ultraviolet or other wavelength region captured data. Preliminary data processing may include data cleaning. For example, the captured data may be processed to reduce or reject noise.


In some embodiments, outline determination may be carried out manually by a user entering an outline on any suitable device. For example, a user may draw an outline using a touch screen. However, in other embodiments, outline determination may be performed automatically by a processor. In still further embodiments outline determination may be performed by a combination of automated processing and user input and/or user modification of data.


Outline determination may include an initial determination of outline data and optionally any processing of the determined outline data to produce processed outline data. Outline determination may include thresholding within the image data to determine regions (and possibly sub-regions) within the image data. Such segmentation can be performed many different ways using pixel-based, edge-based, or region-based methods that are fully automatic or require seeding and/or user interaction. The wound outline may be determined based on segmentation or the segmentation may result from determination of the wound outline.


Outline determination may include data cleaning. For example, small adjacent regions of like properties may be consolidated. Other noise reducing or rejecting methods may be applied. Outline determination may include resampling of outline data to form a resampled outline data set. Resampling may include manual or automatic addition of further outline data points. Outline determination may include snapping of outline data points. Outline determination may include determination of one or more confidence attributes. Outline determination may include smoothing of outline data. Outline determination may include simplification of outline data.


Storage of outline data may be storage in any suitable memory—local, central or remote. Display of outline data may include display on any suitable device—local, central or remote. Outline data may be displayed together with any data resulting from data capture, preliminary data processing and/or outline determination, including for example captured image data and/or confidence data. Outline data may be displayed together with historic data (e.g. previously captured data, or data based on previously captured data such as historic outline data). This may be helpful in displaying data indicative of trends over time.


Some possible preliminary data processing methods will now be described in further detail.


A Region of Interest (“ROI”) may be defined. This may help to limit the processing required to data within the ROI and/or to exclude background regions. The ROI may include the anatomical surface feature and surrounding surface external to the surface feature. FIG. 3 shows a surface feature 300 on a patient's leg 301. A user may be prompted to enter a Region of Interest using any suitable user input (e.g. mouse, pointer, touchscreen etc). In the example shown, a user has dragged and resized an oval ROI indicator 302 over the image. At least some subsequent processing of data may be limited to this user-defined ROI.


In further embodiments, the ROI may be automatically defined by a processor. In structured light devices, such as the Applicant's devices described in U.S. Pat. No. 9,179,844 the structured light pattern may be used in ROI definition. For example, in the case of a ‘star’ shaped structured light pattern having a plurality of structured light elements (usually stripes) meeting at a central point, the user may be trained or instructed to locate the central point within the surface feature, as shown in FIG. 4. Here, a ‘star’ shaped laser pattern 403 is projected onto a surface feature 400 on a skin surface 401. The ROI may then be automatically defined by a processor by using the structured light data to make a first determination of surface feature boundary data. For example, by starting at the central point and looking outwards along the structured light feature it is possible to detect a transition in 3D shape (or in color, texture or some other characteristic) corresponding to the boundary of a wound. In the example shown this will yield six boundary points. An ROI may then be automatically defined based on the obtained data, on approximate surface feature location and/or dimensions. A margin of error may be allowed. For example, in the example shown the ROI may be defined as an oval 402 centered on the laser line crossing point 405 and having an area twice that of the approximate wound area calculated from the six boundary points 404. Any other suitable margins of error, ROI shapes and structured light patterns may be used. Alternatively, the structured light pattern may be used in the same way, but instead of or in addition to creating an ROI, the boundary data points may be used as a starting point or ‘seed’ for outlining.
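By way of illustration only, the oval ROI described above might be constructed from the detected boundary points along the lines of the following sketch. The helper name, the shoelace area estimate, and the way the oval's aspect ratio is chosen are assumptions made for illustration, not the facility's actual implementation:

```python
import numpy as np

def oval_roi_from_boundary_points(centre, boundary_pts, area_factor=2.0):
    """Sketch: build an elliptical ROI centred on the structured-light
    crossing point, sized to `area_factor` times the approximate wound
    area enclosed by the detected boundary points.

    `centre` is (x, y); `boundary_pts` is an (N, 2) array ordered around
    the wound (e.g. the six transition points found along the laser fan).
    Returns (centre, semi_axis_x, semi_axis_y) describing the oval.
    """
    pts = np.asarray(boundary_pts, dtype=float)
    x, y = pts[:, 0], pts[:, 1]

    # Approximate wound area via the shoelace formula over the boundary points.
    wound_area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

    # Shape the oval to the spread of the points, then scale so that its
    # area (pi * a * b) equals area_factor * wound_area.
    spread_x = max(x.max() - x.min(), 1e-6)
    spread_y = max(y.max() - y.min(), 1e-6)
    aspect = spread_x / spread_y
    b = np.sqrt(area_factor * wound_area / (np.pi * aspect))
    a = aspect * b
    return np.asarray(centre, dtype=float), a, b
```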


An ROI may also be automatically defined by a processor by foreground extraction. For example, in FIG. 5 image data including an anatomical feature 500 may be processed to differentiate between a foreground feature, i.e. the patient's leg surface 501, and the background region 502. The foreground may be considered the ROI, or may be further processed to define the ROI.
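As one hedged example of foreground extraction (not necessarily the method used by the facility), an off-the-shelf segmentation routine such as OpenCV's GrabCut could be seeded with a loose rectangle around the limb; the function name and parameter choices below are illustrative assumptions:

```python
import cv2
import numpy as np

def foreground_roi(image_bgr, rect):
    """Sketch: extract a foreground mask (e.g. the patient's leg) from the
    background using GrabCut, seeded with a rectangle `rect` = (x, y, w, h)
    that loosely contains the limb. Returns a binary mask where 1 marks
    definite/probable foreground pixels, which could serve as (or seed) the ROI.
    """
    mask = np.zeros(image_bgr.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)   # internal GrabCut state
    fgd_model = np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, rect, bgd_model, fgd_model, 5,
                cv2.GC_INIT_WITH_RECT)
    # GC_FGD / GC_PR_FGD mark definite / probable foreground pixels.
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return fg.astype(np.uint8)
```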


The ROI may also be automatically defined by a processor based on historical data. For example, a manually or automatically defined ROI may be stored. At a later data capture for the same patient, the ROI may be the same as, or based on, the stored ROI. Alternatively, an ROI may be determined based on a previously determined surface feature outline. For example, a wound outline may be determined. At a later data capture, the ROI may be defined as the previously determined wound outline plus a margin (e.g. 50% greater area). An ROI may be determined based on other historic information such as wound location and/or wound type. For example, a stage 1 diabetic foot ulcer (DFU) may almost always be less than a certain size; the ROI can therefore be initially set to the maximum expected size and located based on any suitable captured or historic information.


The ROI may also be located and/or dimensioned based on one or more known “anatomical fiducials”, such as markings or other permanent or semi-permanent skin features, moles, freckles, birthmarks, skin wrinkles, creases, lines, tattoos (permanent or temporary tattoos), scars or other permanent or long-term, fixed points or markings on the patient's body. In some embodiments, a permanent or semi-permanent marking may be added (using e.g. a pen, tattoo etc) in order to provide an anatomical fiducial. This may be done in addition to existing anatomical fiducials, or in a surface region where no or insufficient suitable anatomical fiducials are found.


Further, information from the capture process may be used. For example, in some devices (such as the Applicant's Silhouette Lite+ device) the device may display or project a region (e.g. by displaying a region on the device screen during data capture), which the user aligns around the surface feature. This displayed or projected region may be used as the ROI, or as the basis for determination of the ROI. For example, the ROI may be determined as the displayed or projected region plus a margin, e.g. 10% by area.


Further, in other embodiments an ROI may be determined (manually or automatically) that straddles a boundary of the surface feature, or is contained within the boundary of the surface feature. In some embodiments a small ROI may be determined, down to a single point, such as a small region or single point within the wound boundary. From here a wound boundary may be determined by looking outwards from the ROI.


Any suitable combination of the above ROI definition methods may be used.


Preliminary data processing may also involve conversion of captured data by a processor into one or more different color spaces. For example, where RGB image data is captured, it may be converted into a monochrome color space.


Some wound beds (e.g. in diabetic foot ulcers—DFUs) have been found to exhibit a lack of green color, and in one embodiment captured RGB data may be converted into a normalized green (normG) color space. For example, the normalized green value Ĝ at pixel location i,j may be calculated as:








\hat{G}_{i,j} = \frac{G_{i,j}}{\sqrt{R_{i,j}^2 + G_{i,j}^2 + B_{i,j}^2}}

where R, G and B represent red, green and blue channels respectively.


DFUs tend to have a red wound bed and yellow calloused wound surround. The use of a normG color space provides good contrast between wound bed and wound surround, since the wound bed will show a low value in normG space (i.e. an absence of green) while the wound surround will show higher values in normG space.
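A minimal sketch of the normG conversion described above is given below; the function name and the handling of all-zero pixels are illustrative assumptions:

```python
import numpy as np

def normalized_green(rgb):
    """Sketch: convert an RGB image (H x W x 3, any numeric dtype) into the
    normalized-green (normG) monochrome space described above:
    G / sqrt(R^2 + G^2 + B^2), with 0 where all channels are zero.
    Wound-bed pixels (little green) give low values; the surrounding
    calloused skin gives higher values.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    norm = np.sqrt(r ** 2 + g ** 2 + b ** 2)
    return np.divide(g, norm, out=np.zeros_like(g), where=norm > 0)
```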


Other color spaces may be used, and different color spaces may be suitable for different types of wound, tissue, anatomical surface features, surface feature shapes or other characteristics. Further, any suitable combination or set of color spaces may be used. For example, normalized red, normalized blue, red, green, blue, UV, Infrared, CIE Lab, RGB, HSL, HSV, CIE Luv, YCbCr, CMYK, enhanced red, grayscale color spaces may be used. Color spaces may be monochrome color spaces (e.g. greyscale, red, normalized red etc) or color spaces with two or more channels (e.g. RGB, HSV etc). Where more complex capture devices are used (e.g. devices such as spectrometers) other or more complex color spaces may be used.


Where a plurality of color spaces are used, these may be weighted differently.


In further embodiments color space data may be used as a diagnostic tool. For example, captured data may be converted into one or more color spaces, with the converted data being analyzed for characteristics indicative of a surface feature property (e.g. surface feature type and/or stage, tissue types etc.).


Preliminary data processing may include any suitable initial cleaning or noise-reduction processes. In one embodiment, small connected regions may be consolidated. This is largely a noise-rejection measure to remove small erroneously detected wound regions. The consolidation of regions may continue until the cumulative area of the largest region or regions is above an appropriately defined threshold.
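The following sketch shows one simplified way such noise rejection could be performed, discarding connected 'wound' regions below an assumed minimum area; the area threshold and function name are illustrative assumptions rather than the facility's actual rule (which, as noted above, may instead retain the largest region or regions until a cumulative-area threshold is met):

```python
import numpy as np
from scipy import ndimage

def reject_small_regions(wound_mask, min_area_px=50):
    """Sketch: noise rejection by discarding small connected 'wound' regions.
    `wound_mask` is a binary array; connected regions smaller than
    `min_area_px` pixels are removed. The threshold is a tuning parameter.
    """
    labels, n = ndimage.label(wound_mask)
    if n == 0:
        return wound_mask
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0                      # ignore the background label
    keep = sizes >= min_area_px
    return keep[labels]               # boolean mask of retained wound pixels
```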


Some possible outline determination methods will now be described in further detail. These methods may be applied whether or not any of the preliminary processing methods are applied.


The image data may be subjected to any suitable thresholding method. For example, in the case of monochrome normG data, thresholding may be used to distinguish wound regions (low normG values) from surrounding tissue (high normG values). This thresholding may classify pixels into ‘wound’ and ‘non-wound’ classes or categories. Any suitable thresholding method, for example Otsu thresholding, may be used, and may be based on any suitable data.
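For illustration, a minimal Otsu-style threshold over normG values might look like the following sketch; the function name and the usage line are assumptions, not the facility's actual code:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Sketch: Otsu's method on a 1-D array of (e.g. normG) values.
    Returns the threshold that maximizes the between-class variance;
    pixels below the threshold could be labelled 'wound', the rest 'non-wound'.
    """
    hist, edges = np.histogram(values.ravel(), bins=bins)
    hist = hist.astype(np.float64)
    centres = 0.5 * (edges[:-1] + edges[1:])

    weight1 = np.cumsum(hist)                       # pixels in the lower class
    weight2 = weight1[-1] - weight1                 # pixels in the upper class
    cum_mean = np.cumsum(hist * centres)
    mean1 = cum_mean / np.maximum(weight1, 1e-12)
    mean2 = (cum_mean[-1] - cum_mean) / np.maximum(weight2, 1e-12)

    between_var = weight1 * weight2 * (mean1 - mean2) ** 2
    return centres[np.argmax(between_var)]

# Usage sketch: wound_mask = norm_g_image < otsu_threshold(norm_g_image)
```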


Further, thresholding may be used to create more than two classes or categories. Multilevel thresholding may be used. Further, thresholding in more than one color space may be used. Continuous or multi-level thresholding across several types of data or a plurality of color spaces may be used. For example, thresholding in normG space may be used to define a ‘wound bed’ region, expected to have low normG values. Thresholding in RGB color space may be used to define a ‘bone’ region, expected to appear white in RGB data; and/or to define a ‘necrotic’ region, with necrotic tissue expected to appear black in RGB data. The combination of thresholding in these color spaces may therefore allow several tissue type regions to be determined. Identification of some tissue type regions may rely on a combination of thresholding in two or more data spaces.


Thresholding in any captured data space or in any converted data space may be used. Captured or converted data spaces may include color spaces (including UV and IR color spaces), fluorescence imaging color spaces, thermal data spaces, physical spaces including 3D spaces and any other suitable captured or converted data space. Thresholding may be applied to image histograms. Local thresholding methods may be used, in which the threshold is variable over the image or ROI. The threshold may vary dynamically over the image or ROI, or in any other suitable manner.


An initial surface feature outline may be automatically determined by a processor based, e.g. on the outside of a wound region, or on a border between wound and non-wound regions. A determined outline may include a single outline (e.g. around a simple wound) or may be a complex outline including a plurality of distinct outline portions (e.g. an outline portion around the outside of a wound and one or more outline portions around ‘islands’ of healthy tissue within the wound; or outline portions around several distinct regions of wound tissue; or outline portions defining sub-regions of different tissue types).


An initial outline, or a plurality of initial outlines, may be determined by any suitable method. In one embodiment, the convex hull of the wound region may be determined. The convex hull is the smallest convex set containing the wound region. The initially determined outline may be iterated or resampled to provide a refined wound outline. Any version of the outline may be displayed to a user, who may be given the opportunity to modify the outline, e.g. by movement of outline points or segments, deletion of outline points or segments, or addition of outline points or segments. Such changes may be incorporated and the outline iterated. The user may be provided further opportunity for modification of the outline with each iteration, or at least some of the iterations.
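The convex-hull step could be sketched as follows; the function name and the use of SciPy's ConvexHull are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import ConvexHull

def initial_outline_from_mask(wound_mask):
    """Sketch: take the convex hull of all pixels classified as 'wound'
    as an initial (coarse) outline, returned as an ordered array of
    (x, y) vertices. Later steps would resample, snap and refine it.
    """
    ys, xs = np.nonzero(wound_mask)
    points = np.column_stack([xs, ys]).astype(float)
    hull = ConvexHull(points)
    return points[hull.vertices]      # hull vertices in counter-clockwise order
```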


User modifications of the outline may be classified as definite (or high confidence) modifications and non-definite (or hint, possible or low confidence) modifications. ‘Non-definite’ modifications may not necessarily be honored by subsequent processing. ‘Definite’ modifications may be required to be honored by subsequent processing.


The outline may be further modified by automated snapping of outline points. In one embodiment, each outline point may be moved, or snapped, to the nearest pixel location of a pixel that has been identified as a ‘wound’ pixel. The nearest pixel location may be determined in any suitable manner, e.g. based on a simple Euclidean distance. If more than one outline point snaps to a particular pixel location, the duplicates may be removed. In further embodiments, the snapping of points may be in a specific direction rather than to the closest ‘wound’ pixel. For example, outline points could be snapped directly inwards towards the centre of a structured light pattern such as that shown in FIG. 4, i.e. to the closest ‘wound’ pixel on a line extending between the structured light centre and the outline point.
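A minimal sketch of the snapping step (nearest ‘wound’ pixel by Euclidean distance, with duplicates removed) is given below; the function name and the use of a k-d tree are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def snap_outline_to_wound(outline_pts, wound_mask):
    """Sketch: move ('snap') each outline point to the nearest pixel that
    was classified as 'wound', using plain Euclidean distance, and drop
    duplicates where several outline points snap to the same pixel.
    """
    ys, xs = np.nonzero(wound_mask)
    wound_px = np.column_stack([xs, ys]).astype(float)
    tree = cKDTree(wound_px)
    _, idx = tree.query(np.asarray(outline_pts, dtype=float))
    snapped = wound_px[idx]
    # Keep the first occurrence of each snapped pixel, preserving outline order.
    _, first = np.unique(idx, return_index=True)
    return snapped[np.sort(first)]
```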


In some embodiments, the generation of the initial raw outline may produce a very detailed outline, often with many hundreds of points. This is because the resampling and snapping approach described above follows the outline of the thresholded image to individual pixel level. The highly sampled, detailed outline may be problematic. The perimeter may be inflated since the path or integral around the outline follows the single pixel jumps. That is, the perimeter may be measured as longer than it is. Further, outlines returned to the user may be difficult to manually adjust or alter (if this functionality is provided) since there are so many points close together. Outline processing and handling may also be intensive. More detailed outlines may also require more storage space. Operations like determining regions of different tissue types (e.g. islands of healthy tissue within a surrounding wound) and dealing with overlapping outlines may become very intensive.


To reduce these effects, the outline may be simplified to reduce the point count of the outline. Rationalization where every nth point is retained may be used, but this is simplistic and can result in lost detail. Instead, filtering followed by simplification (feature preserving decimation) may be performed.


Filtering, e.g. in the form of Savitzky-Golay filtering (though any suitable filtering method may be used), may be used to smooth the outline.


Savitzky-Golay filtering approximates the underlying function within the moving window (w=2n+1) not by a constant (such as a moving average) but by a polynomial of higher order (m). It does a good job of preserving detail, which is often lost or flattened when some other smoothing techniques are employed.


This filtering step produces a much more ‘organic’ outline. However, the filtered outline may be defined with sub-pixel coordinates and the number of points may not have been reduced. Sub-pixel locations may be problematic in some applications where the data is to be stored using integer pixel locations.


Each filtered (smoothed) outline point O_i^s is given by:







O_i^s = \frac{\sum_{k=-n}^{n} A_k \, O_{i+k}}{\sum_{k=-n}^{n} A_k}

where the weighting coefficients (A_{-n}, A_{-(n-1)}, . . . , A_{n-1}, A_n) are the Savitzky-Golay coefficients. This moving polynomial fit may be handled in the same way as a weighted moving average. In some applications, smoothing coefficients for a polynomial of order 3 and window size of 31 (m=3, n=15, w=31) may be suitable.
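For illustration, the smoothing step could be performed with a standard Savitzky-Golay routine applied to each coordinate of the closed outline, as in the following sketch; the function name and the periodic ('wrap') boundary handling are assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_outline(outline_pts, window=31, polyorder=3):
    """Sketch: Savitzky-Golay smoothing of a closed outline.
    `outline_pts` is an (N, 2) array of (x, y) points; each coordinate is
    filtered with a moving polynomial fit. mode='wrap' treats the outline
    as periodic so the start/end join is smoothed too. The result has
    sub-pixel coordinates and the same number of points, so a separate
    simplification step is still needed.
    """
    pts = np.asarray(outline_pts, dtype=float)
    x = savgol_filter(pts[:, 0], window, polyorder, mode='wrap')
    y = savgol_filter(pts[:, 1], window, polyorder, mode='wrap')
    return np.column_stack([x, y])
```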



FIGS. 6A and 6B show an outline in dashed line, with a filtered or smoothed outline in solid line. FIG. 6B is an enlarged view of the section marked by box 600 in FIG. 6A.


A simplification step may be performed to reduce the number of outline vertices. This may include removing points in long segments where the points roughly line up. The allowable deviation from a truly straight line may be specified by a tolerance (ε) parameter. After removal of redundant locations the final subset of locations may be rounded to the nearest pixel. Any suitable simplification method may be used. In some applications the Douglas-Peucker algorithm, with ε=1 or another suitable value, may be used for simplification.


The Douglas-Peucker algorithm, also known as Iterative End-Points Fitting, recursively divides the piecewise linear curve. It starts by creating a straight-line segment that connects the start and end points. It then locates the point furthest from this initial line segment. If this point is closer than ε to the line segment, then all unmarked points between the start and end points can be discarded without the error exceeding ε. However, if the furthest point is greater than ε from the initial line segment, that point must be kept. Two new line segments are formed from the start point to the new point and the new point to the end point. The recursion process is then repeated for each new line segment and so on until all points have been checked and kept or discarded.
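A compact recursive sketch of Douglas-Peucker simplification is given below; the function name and the suggested treatment of closed outlines are illustrative assumptions:

```python
import numpy as np

def douglas_peucker(points, epsilon=1.0):
    """Sketch: Douglas-Peucker simplification of an open polyline.
    `points` is an (N, 2) array; points closer than `epsilon` to the
    local start-end chord are discarded. A closed outline could be split
    into two halves and each half simplified separately.
    """
    pts = np.asarray(points, dtype=float)
    if len(pts) < 3:
        return pts

    start, end = pts[0], pts[-1]
    chord = end - start
    chord_len = np.hypot(chord[0], chord[1])
    if chord_len == 0:                           # degenerate: start == end
        dists = np.linalg.norm(pts - start, axis=1)
    else:
        # Perpendicular distance of every point from the start-end chord.
        dists = np.abs(chord[0] * (pts[:, 1] - start[1])
                       - chord[1] * (pts[:, 0] - start[0])) / chord_len

    idx = int(np.argmax(dists))
    if dists[idx] <= epsilon:
        return np.array([start, end])            # everything in between is dropped
    # Keep the furthest point and recurse on the two halves.
    left = douglas_peucker(pts[:idx + 1], epsilon)
    right = douglas_peucker(pts[idx:], epsilon)
    return np.vstack([left[:-1], right])
```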



FIG. 7 shows a smoothed or filtered outline in solid line, with the black squares being the retained simplified points. In this example, the outline point count has reduced from 1078 to 76.


Generally, using smoothing and simplification of the outline may avoid abrupt outline portions and reduce the point count by removal of near-aligned points. This results in a much reduced complexity of the outline. The outline may take on a more ‘organic’ shape similar to outlines drawn by users. Further, the point count may be adjusted by altering parameters, especially parameters of the simplification process. The perimeter errors mentioned above may be reduced since the fractal outline issue is largely removed.


One or more quality or confidence attributes (hereafter “confidence attributes”) associated with the outline, or parts of the outline, may be determined by a processor. Confidence attributes may include one or more overall confidence attributes for the outline and/or one or more confidence attributes for one or more segments of the outline and/or one or more confidence attributes for each outline point. The confidence attributes may vary in different outline portions. The confidence attributes may provide an indication of how well the determined outline follows the edge of the ‘wound’ pixels. Poor confidence outlines or segments of outlines with poor confidence may not be displayed to the user.


Any suitable method of determining confidence attributes may be used. However, in one embodiment, the quality of the outline may be determined by looking at the distance between the points making up the outline (distance metric) and how closely the outline follows the underlying thresholded image (snap metric). In one embodiment, a good quality wound outline may be considered one that accurately follows the wound boundary and is evenly sampled (i.e. does not contain big jumps and jagged edges).


The distance metric may be calculated as the RMS (root mean square) distance between adjacent outline points in the outline before any smoothing or simplification. The snap metric may be calculated as the RMS distance between the resampled outline points snapped onto the thresholded image, also before any smoothing or simplification. Alternatively, instead of RMS methods, any other suitable method of determining the distance and/or snap metrics may be used.


For each metric, the per-point distances may be calculated and then the RMS (root mean square) of each set of distances computed. The RMS may be used because it is sensitive to large errors. However, as the per-point distances are calculated for each metric, it may also be possible to report on these individually.


A high confidence wound outline should contain no excessive jumps. The distance metric is the RMS of the Euclidean distance between successive points in the wound outline, i.e. the length of the line segment connecting them. For a high confidence outline this should typically lie between 1 and √2 times the pixel spacing.


If the outline contains N points denoted $O_1, O_2, \ldots, O_N$, the Euclidean distance between neighbouring outline points $O_n = (O_n^x, O_n^y)$ and $O_{n+1} = (O_{n+1}^x, O_{n+1}^y)$ may be given by:

$$d_{E_n} = \sqrt{\left(O_{(n+1) \bmod N}^x - O_n^x\right)^2 + \left(O_{(n+1) \bmod N}^y - O_n^y\right)^2}$$


The distance metric may be given by:

$$\text{Distance Metric} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} d_{E_i}^2}$$

FIG. 8 shows an example in which a wound region 800 (shown as a white region) and a non-wound region 801 (shown as a grey region) have been identified. Outline points 802 (shown as diamond symbols) are generally regular and close together. Line segments connecting the points are generally short. The RMS distance between points in this example is 1.3017 pixels.



FIGS. 9A and 9B show a further example in which a wound region 900 and a non-wound region 901 have been identified. FIG. 9B is an expanded view of the part of FIG. 9A indicated by rectangle 902. In this example, there are large jumps between the outline points 903 resulting in some large line segments. For example, in FIG. 9B relatively long line segments 904 can be seen between data points. The RMS distance between points is 2.4543 pixels. This example will therefore have a higher distance metric than the example of FIG. 8.


As an alternative to the above calculation, confidence could be reported per individual line segment, or even per individual point, instead of the overall RMS.
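As an illustration, a minimal sketch of the distance metric computation is given below, assuming the (unsmoothed, unsimplified) outline is held as an (N, 2) NumPy array of pixel coordinates and that the outline is closed, so the last point connects back to the first.

```python
# Sketch only: RMS length of the segments joining successive outline points.
import numpy as np

def distance_metric(outline):
    outline = np.asarray(outline, dtype=float)
    diffs = np.roll(outline, -1, axis=0) - outline   # O_{(n+1) mod N} - O_n
    seg_lengths = np.hypot(diffs[:, 0], diffs[:, 1])
    return float(np.sqrt(np.mean(seg_lengths ** 2)))
```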


The distance an outline point must move to snap onto the closest point on the thresholded image provides a measure of how closely it follows the region(s) of likely ‘wound’ locations. If it follows the thresholded ‘wound’ locations exactly then the snap metric will be zero. The resampled outline points may be used for this calculation. The impact of large jumps in the outline would be hidden if the resampled outline were not used, as the ends of the large segments are close to the thresholded image.


If the resampled outline contains N points denoted $O_1$ to $O_N$, the snap distance between outline point $O_n = (O_n^x, O_n^y)$ and the closest point on the thresholded image $CP_n = (CP_n^x, CP_n^y)$ may be given by:

$$d_{S_n} = \sqrt{\left(O_n^x - CP_n^x\right)^2 + \left(O_n^y - CP_n^y\right)^2}$$


The snap metric may be given by:

$$\text{Snap Metric} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} d_{S_i}^2}$$

FIG. 10 shows an example in which a wound region 1000 (shown as a white region) and a non-wound region 1001 (shown as a grey region) have been identified. The resampled outline contains no large jumps to the wound region. The distance between each point 1002 and the closest point on the thresholded image is short. The RMS snap distance is 0.3071 pixels.



FIGS. 11A and 11B show a further example in which a wound region 1100 and a non-wound region 1101 have been identified. FIG. 11B is an expanded view of the part of FIG. 11A indicated by rectangle 1102. There are large jumps required to move some of the outline points to the boundary of the wound region. In this example the resampled points 1103 require large pixel distances to snap to the thresholded image. The RMS snap distance is 1.4370 pixels.


Again, confidence could be reported per outline pixel by looking at individual snap distances instead of the overall RMS.
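A sketch of the snap metric is shown below, under the assumption that the thresholded image is available as a boolean ‘wound’ mask and that the snap target for each outline point is the nearest ‘wound’ pixel; the Euclidean distance transform of the non-wound pixels gives that distance directly. The resampled outline is assumed to be an (N, 2) array of (x, y) pixel coordinates.

```python
# Sketch only: RMS of the distances needed to snap outline points onto 'wound' pixels.
import numpy as np
from scipy.ndimage import distance_transform_edt

def snap_metric(resampled_outline, wound_mask):
    outline = np.rint(np.asarray(resampled_outline)).astype(int)
    wound = np.asarray(wound_mask, dtype=bool)
    dist_to_wound = distance_transform_edt(~wound)   # 0 on wound pixels
    rows = np.clip(outline[:, 1], 0, wound.shape[0] - 1)
    cols = np.clip(outline[:, 0], 0, wound.shape[1] - 1)
    snap_dists = dist_to_wound[rows, cols]
    return float(np.sqrt(np.mean(snap_dists ** 2)))
```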


A sample of wound outlines may be visually examined by a user and classified as high, medium and low confidence wound outlines. A set of thresholds may be determined for low, medium and high confidence based on these classifications.


Confidence attributes may therefore include an indication of confidence level, such as ‘low’, ‘medium’ or ‘high’; or ‘acceptable’/‘unacceptable’ etc. Confidence attributes may include numerical values, such as a percentage confidence or a confidence score on any suitable scale.
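Purely as an illustration of mapping the two metrics onto a confidence level, the sketch below assumes a pair of placeholder thresholds per metric; in practice the thresholds would come from a visually classified sample of outlines as described above.

```python
# Sketch only: map distance and snap metrics to a 'high'/'medium'/'low' label.
def confidence_label(distance_metric, snap_metric,
                     dist_thresholds=(1.5, 2.5), snap_thresholds=(0.5, 1.0)):
    """Return 'high', 'medium' or 'low' for a wound outline (placeholder thresholds)."""
    def level(value, thresholds):
        low, high = thresholds
        if value <= low:
            return 2      # contributes towards 'high' confidence
        if value <= high:
            return 1      # 'medium'
        return 0          # 'low'
    combined = min(level(distance_metric, dist_thresholds),
                   level(snap_metric, snap_thresholds))
    return {2: 'high', 1: 'medium', 0: 'low'}[combined]
```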


Confidence attributes may be displayed to a user using words or numbers, or any other suitable indicator. For example, confidence may be displayed using colors (e.g. red for low, orange for medium, and green for high confidence), star ratings, bar displays or any other suitable mechanism for display of the confidence. Audio cues may be associated with the confidence attribute. For example, different sounds may be played based on the determined confidence attributes. Other cues and/or indicators may be used, for example haptic cues or indicators.


Further, in some embodiments such cues and/or indicators may be provided during any user entry of outline data, tissue classification data etc that may be allowed. Such cues and/or indicators may assist a user in entering data (e.g. on a touchscreen, pointer or other user input device) by giving feedback on determined or expected confidence for the data being entered. Such feedback may be given in real time during user entry of data or after user entry of data. For example, feedback may be given by any suitable cues and/or indicators to assist a user in staying in a high confidence ‘channel’ when drawing wound outline data.


A high confidence outline may be automatically accepted or the user may be prompted to review and/or accept the outline. Lower confidence outlines may be either rejected or returned to the user for user input of further outline points. The manual user addition of outline points may include addition of a number of discrete outline points, addition of an outline segment or segments, or manual input of an entire outline.


Potentially low confidence regions of the outline could prompt further or additional processing in these areas.



FIG. 12 is a flow chart illustrating one possible manner of iterative outline refinement. Other methods may also be suitable.


At block 1200, the outline and confidence are determined, using any of the methods described above, or any other suitable method.


The outline and confidence may be displayed to a user at block 1201. The outline may be displayed superimposed or overlaid on image data for the wound. The image data may be captured image data or processed image data. The image data may be processed from a plurality of captured images (e.g. RGB, UV and/or Infrared images). The outline may be displayed as a line with or without the display of outline points. Alternatively, outline points may be displayed without connecting lines. Any other suitable display of the outline may be used. The confidence may be displayed using words (e.g. high, medium, low, good, bad, OK etc) or numbers, or any other suitable indicator. Confidence may be displayed using visual indicators such as colors (e.g. red for low, orange for medium, and green for high confidence), star ratings, bar displays or any other suitable mechanism for display of the confidence. Audio cues may be associated with the confidence attribute. For example, different sounds may be played based on the determined confidence attributes.


As the confidence may vary around the outline, a plurality of confidence values may be displayed for different segments of the outline. The confidence data may be retained down to the pixel level if appropriate.


At block 1202 the user may examine the displayed outline and confidence. The user may accept the displayed outline, in which case the outline is stored at block 1203.


Alternatively, the user may reject the outline in its entirety. In this case, the user may manually input an outline at block 1205. The manually entered outline may be stored at block 1203. Confidence values for the manually entered outline may also be calculated and stored, if desired, by any of the methods discussed herein.


As a further alternative, the user may reject part of the outline. At block 1206, the user may manually enter additional outline data, such as points or segments, or may otherwise modify the outline (e.g. by dragging parts of the outline to a new location). The resulting modified outline may return to block 1200 for further iteration of the outline and confidence.
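The flow of blocks 1200 to 1206 might be sketched as follows. The callables passed in (compute_outline_and_confidence, display, get_user_decision, get_manual_outline, get_user_edits, store) are hypothetical stand-ins for the outlining, user-interface and storage layers and are not part of the method described above.

```python
# Sketch only: iterative outline refinement loop corresponding to FIG. 12.
def refine_outline(data_sets, compute_outline_and_confidence, display,
                   get_user_decision, get_manual_outline, get_user_edits, store):
    outline, confidence = compute_outline_and_confidence(data_sets)      # block 1200
    while True:
        display(outline, confidence)                                     # block 1201
        decision = get_user_decision()                                   # block 1202
        if decision == 'accept':
            store(outline, confidence)                                   # block 1203
            return outline
        if decision == 'reject_all':
            outline = get_manual_outline()                               # block 1205
            store(outline, None)   # confidence for the manual outline may optionally be computed
            return outline
        # Partial rejection (block 1206): the user adds points/segments or drags parts
        # of the outline; the modified outline is fed back into block 1200.
        edits = get_user_edits(outline)
        outline, confidence = compute_outline_and_confidence(data_sets, seed=edits)
```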


The user's input of outline points or modifications may be solicited by the system. In particular, the system may solicit user input where the confidence is unacceptable (e.g. below a threshold). For example, a segment of the outline may have a confidence value less than a threshold. This may be indicated by marking that outline segment in red (or some other color or line pattern, visual effect etc, or simply be omitting outline segments with confidence values lower than the threshold), prompting the user to modify or replace that outline segment.


Further, only a partial outline having a confidence value that is acceptable (e.g. above a threshold) may be displayed to a user. The system may solicit the user's input in relation to the ‘missing’ portions of the outline, for which the confidence value was below a threshold.


In some embodiments, regardless of the confidence values, the user may override part or all of the computed outline. In other embodiments the user may override only those parts of the outline with a confidence below a threshold.


Confidence thresholds are treated in this specification as thresholds above which confidence is good, high, acceptable etc, and below which confidence is poor, low, unacceptable etc. However, other methods of determining whether a confidence is acceptable or unacceptable may be suitable. Further, the skilled reader will recognize that equivalent arrangements may include ‘rejection’ methods, e.g. where data with a rejection value above a rejection threshold may be rejected (similar to low confidence data) while data with a rejection value below the rejection threshold may be accepted (similar to high confidence data). Such variations and equivalents are to be considered confidence methods and are intended to fall within the scope of the claims.


Once the outline is stored, further processing of the data may be performed at block 1204.



FIGS. 13 to 16 illustrate one possible manner of displaying data to a user, whether on a capture device or other device. FIG. 13 shows a display window 1300 in which an image 1301 is displayed. The image may include a surface feature 1302 on a patient's skin surface 1303. An ROI 1304 may also be shown. A thumbnail display 1305 may also be included, allowing a user to select one of a plurality of images for display as the main image.



FIG. 14 shows a display 1400 in which a generated outline 1401 is displayed within an ROI 1402. The outline may or may not be overlaid with an image of the skin surface. In the example shown, a portion of the generated outline (indicated generally by arrow 1403) has been determined to have a confidence value below a threshold. That portion 1403 has therefore not been displayed (although, as noted above, in other embodiments low confidence outline portions may still be displayed, and may be displayed in a manner distinguishing them from acceptable confidence outline portions). FIG. 14 also shows a global confidence indicator 1404. In this embodiment the global confidence indicator is displayed as a combination of the text ‘No good’ and a hatched pattern. In other embodiments any suitable display or notification of confidence may be used. FIG. 14 also shows a ‘Measurements’ box 1405, in which calculated values of perimeter and area for the surface feature are displayed. Any other desired data may be displayed.



FIG. 15 shows a further display 1500 in which a generated outline 1501 is displayed within an ROI 1502. The outline may or may not be overlaid with an image of the skin surface. Here, two lower confidence outline portions 1503, 1504 are indicated by dashed line (though any suitable visual indicator could be used). A user prompt 1506 is also displayed, prompting a user to edit the generated outline. Edits may be performed in any suitable manner using any suitable user input device. A measurements box 1507 is also displayed.



FIG. 16 shows a further display 1600 in which an outline 1601 is displayed within an ROI 1602. The outline may or may not be overlaid with an image of the skin surface. A global confidence indicator 1603 (‘OK’) indicates that the global confidence is acceptable, or above a threshold. A measurements box 1604 is also displayed. A display such as that of FIG. 16, with an acceptable global confidence indicator, may result directly from generation of an outline in captured data, or via user editing of a generated outline with unacceptable global confidence (e.g., from user edits of an outline such as that of FIG. 14 or FIG. 15).


The above method provides iterative processing to allow the incorporation of optional user defined edits, together with overall user acceptance of the wound outline.


In any of these methods, once the wound is segmented from the image, further segmentation can be performed on the pixels within the wound boundary. This allows further classification of tissue. This further segmentation may be determined automatically by a processor based on any of the available image data, including RGB data, thermal data, fluorescence data and/or three-dimensional image data and any data based on captured data (such as e.g. converted color space data). The boundaries may be based on changes in color, temperature, fluorescence or surface roughness, for example.


Again, the user's input on this further segmentation may be solicited by the system, and the user may be given the opportunity to add segmentation data or modify the automatically determined segmentation data, for example by tracing on a touch screen or other user input device.


For each sub-region, a determination may be made as to the tissue types present in that sub-region. Tissue types may include, for example: Granulation tissue, Slough, Eschar, Bone, Ligament, Healthy tissue (e.g. islands or tissue outside of wound). Tissue types may be selected manually by a user, for example from a menu of available tissue types. Alternatively, tissue types present in a sub-region may be determined automatically by analysis of the image and other captured data (including any of 3D, thermal, fluorescence data etc.). Previously gathered data for specific sub-regions may be overlaid or otherwise compared with current data, to allow changes or trends for that sub-region to be seen.


A determination may also be made as to the proportions of tissue types present in that sub-region. For example, by visual inspection a user may determine that granulation tissue occupies around 40% of a sub-region, slough around 30% and healthy tissue around 30%. The user may enter these values via any suitable user interface, for example, via sliders, text boxes etc. Alternatively, proportions of tissue types present in a sub-region may be determined automatically by analysis of the image data.
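A minimal sketch of the automatic alternative is shown below, assuming a per-pixel tissue classification map (integer labels) and a boolean mask selecting the sub-region; the label codes and names are illustrative only.

```python
# Sketch only: estimate tissue-type proportions within a sub-region.
import numpy as np

TISSUE_LABELS = {1: 'granulation', 2: 'slough', 3: 'eschar', 4: 'healthy'}  # illustrative codes

def tissue_proportions(tissue_map, subregion_mask):
    labels = np.asarray(tissue_map)[np.asarray(subregion_mask, dtype=bool)]
    total = labels.size
    if total == 0:
        return {}
    return {name: float(np.count_nonzero(labels == code)) / total
            for code, name in TISSUE_LABELS.items()}
```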


Validation may be performed by comparison with ground truth (supervised evaluation) using a number of metrics, such as the Dice, Jaccard or Hausdorff metrics, or any other suitable methods. Unsupervised evaluation does not require comparison with ground truth, but is based solely on a given segmentation.
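A short sketch of the Dice and Jaccard overlap metrics is given below, assuming both the automatically determined segmentation and the ground-truth segmentation are available as boolean masks of the same shape.

```python
# Sketch only: Dice and Jaccard overlap between a predicted and a ground-truth mask.
import numpy as np

def dice_and_jaccard(pred_mask, truth_mask):
    pred = np.asarray(pred_mask, dtype=bool)
    truth = np.asarray(truth_mask, dtype=bool)
    intersection = np.count_nonzero(pred & truth)
    union = np.count_nonzero(pred | truth)
    total = np.count_nonzero(pred) + np.count_nonzero(truth)
    dice = 2.0 * intersection / total if total else 1.0
    jaccard = intersection / union if union else 1.0
    return dice, jaccard
```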


This comparison with ground truth may involve comparison of automatically determined outlines with expert/clinician generated outlines. The ground truth or clinician outline may not generally be available in many applications. However, such comparisons may be valuable in, for example, machine learning. For example, the system may learn the outlining from health professionals and adjust its methods over time. Such learning processes may take place at any suitable scale—globally across all customers, specific to a customer, or even specific to a clinician. This may lead to adjustments in automated outlining globally, or for specific customers, institutions or users.


In addition, where an automatically determined wound outline is deemed unacceptable, e.g. where the user overrides all or part of the automatically determined outline, the user entered outline and corresponding automatically determined outline may provide inputs to machine learning methods. These methods may include but are not limited to Support Vector Machines, Artificial Neural Networks, Random Forest Decision Trees etc. This may lead to fine tuning of input parameters, color spaces, thresholding methods, confidence metrics, smoothing methods, simplification methods or highlight the need for an alternative approach. For example, if the user often or always overrides some or all of the outline for a specific wound type (e.g. Venous Leg Ulcers) the system may adjust its methods to automatically include alternative or additional processing or any other available information for these wound types. Similarly, where both user and automatic tissue type classifications are available, machine learning could be applied to better determine the different tissue types for future classifications.


In one embodiment the process of machine learning may be based on minimizing a cost function based on the difference between a user entered outline and an automatically determined outline. The machine learning process may adjust the weightings of various color spaces and/or other metrics and re-compute the automatically determined outline. If the difference (cost) of the new outline is smaller the color space weightings may be adjusted for subsequent wound outlining calculations. The process of adjusting the weighting coefficients from previous outlining calculations may be performed gradually over time to ensure the weightings form an average of many outlining datasets.
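A heavily simplified sketch of this cost-driven adjustment is given below. The cost here is taken as the RMS nearest-point distance from the automatic outline to the user-entered outline, and recompute_outline is a hypothetical callable that re-runs the automatic outlining with candidate color-space weightings; the blend factor keeps the stored weightings as a gradual average over many outlining datasets.

```python
# Sketch only: keep candidate color-space weightings if they reduce the outline cost.
import numpy as np

def outline_cost(auto_outline, user_outline):
    auto = np.asarray(auto_outline, dtype=float)
    user = np.asarray(user_outline, dtype=float)
    d = np.linalg.norm(auto[:, None, :] - user[None, :, :], axis=2)
    return float(np.sqrt(np.mean(np.min(d, axis=1) ** 2)))

def update_weights(weights, candidate_weights, user_outline, recompute_outline, blend=0.1):
    weights = np.asarray(weights, dtype=float)
    candidate_weights = np.asarray(candidate_weights, dtype=float)
    current_cost = outline_cost(recompute_outline(weights), user_outline)
    new_cost = outline_cost(recompute_outline(candidate_weights), user_outline)
    if new_cost < current_cost:
        # Blend gradually so the stored weightings remain an average over many datasets.
        weights = (1.0 - blend) * weights + blend * candidate_weights
    return weights
```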


In one embodiment the machine learning process could include information from a wider set of patient metrics, typically retrieved from a health provider's Electronic Medical Records (EMR) system. In this case the machine learning algorithm could be used to establish wound outline and tissue classification regions from a wider set of input information. For example, a machine learning algorithm may establish a pattern or correlation between a patient's medication and the overall shape of a wound.


Fully automated outlining and/or identification of sub-regions may eliminate user involvement and produce more accurate results than manual segmentation. However, in some applications implementation of such fully automated methods may prove challenging and may not always provide satisfactory results, e.g. due to image complexity. Assisted, interactive or semi-automatic methods may be used in some embodiments. These may involve a user providing a suitable input.


In some embodiments a user may define a closed initial contour or initial segmentation, used as a basis for automated determination of outline data and/or sub-region data.


In some embodiments a user may input initial points, regions etc that are identified as ‘object’ and/or ‘foreground’ and/or ‘background’ points, regions etc. The user may identify distinct points, or may identify a larger number of points or a region by e.g. dragging a pointer or ‘brush’, or a finger on a touchscreen, or selecting a region by dragging a shape, or by any other suitable user input method.


In some embodiments a user may input a small number of points that are ‘outline points’. From these points the system may ‘extrapolate’ the outline by following regions of the skin surface with similar characteristics.


In any of these methods, identification of an ROI may not be required.


In one embodiment, a user may identify a small number (perhaps 2 to 10) of points on the surface feature boundary in any suitable data set. The system may identify the image structure at each of the selected points (e.g. from the normalized-green map, 3D shape gradient, or any other suitable imaging modality such as UV etc.). The system may then identify further regions of the image with equivalent structure and ‘extrapolate’ the outline.


The resulting outline data may be further processed by any of the methods described herein.


Outlining methods have been described mainly with reference to image data. However, any suitable data sets, captured using any suitable capture device, may be used. For example, suitable data sets may include three-dimensional data, three-dimensional image data, thermal image data, fluorescence image data, texture image data, color image data, monochrome image data, UV data, Infrared data, data captured from a single device pose, data captured from a plurality of device poses, data sets under different lighting conditions (e.g. lighting from different angles, lighting of different spectral range or wavelength, lighting at different color temperatures—these may be adjusted using the lighting built into capture devices or using external devices) or any other suitable data.


Suitable data capture devices may include any one or more of: cameras, including color and monochrome cameras, 3D scanners, 3D cameras, structured light devices, thermal cameras, UV cameras, Infrared cameras, or any other suitable capture device.


Where more than one data set is captured, the method may include registration between data sets.


Further, any suitable further data may be incorporated into any suitable stage of the method, e.g. in order to improve accuracy, efficiency or aid method selection. Such additional information may include any one or more of: imaging metrics, 3D information of the surface and/or surface feature, the 3D shape of many surface features across many patients, anatomical site of the surface feature (e.g. calf, foot), wound type or etiology (e.g. pressure ulcer: stage 3, venous ulcer, diabetic), wound margin characteristics (e.g. sloping, inflamed), surrounding skin characteristics (e.g. inflamed, healthy, dry), contributing factors (e.g. pressure, friction), wound appearance or tissue types (e.g. % granulating, necrotic tissue etc in last assessment), previous dressing type, one or more previous outlines for the surface feature, any other suitable real time data or information, any other suitable historical data or information.


Object segmentation (identification of surface feature/non-surface feature and sub-regions such as tissue type regions) may be improved by utilizing any available or prior information. This may include any of the information described in the preceding paragraph and/or any one or more of: user interaction, intensity, color, texture, boundary and edge information, shape, topology, location, length, area, number of regions, anatomical knowledge etc. Further, different methods of data capture, preliminary data processing, ROI determination, outline determination and/or display may be used based on the available information.


For example, some data sets may be used only where the confidence achieved through analysis of a first set of data sets is insufficient. For example, if the confidence achieved through analysis of an RGB image data set is insufficient, a 3D data set may also be analysed.


The selection of an ROI determination method may be based on whether the capture relates to a new wound (in which case there is no previous information to use), or an existing wound (in which case a previous wound outline or a previous ROI could be used in determination of a new ROI). The selection of an ROI determination method may also be based on the capture device used and/or the data sets available. For example, some capture devices may capture only RGB data, others may capture RGB and 3D data; others may project a region onto the skin surface, etc.


The selection of color spaces could be a function of wound location. For example, if the wound is located on a patient's foot, the system may determine that this is most likely a diabetic foot ulcer and select a normalized green color space. A wound on the patient's shin may use an alternative processing (e.g. multiple weighted color spaces followed by multi-level thresholding etc).


Selection of color space may also be based on wound history. For example, if a previous measurement indicated the presence of specific tissue types, one or more color spaces may be selected appropriate to those tissue types.


The selection of a thresholding method may be a function of metrics of image pixels within the ROI. For example, if the histogram of the pixel levels has a bimodal distribution, the system may use Otsu thresholding to classify the pixels into ‘wound’ and ‘non-wound’. If the histogram of the pixel levels is unimodal, the system may use an alternative thresholding method.
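One possible sketch of this selection is shown below, with a simple and purely illustrative peak-counting heuristic standing in for a bimodality test and Otsu's method taken from scikit-image; the unimodal fallback shown here is a mean threshold, though any alternative thresholding method could be substituted.

```python
# Sketch only: choose a thresholding method based on the ROI histogram shape.
import numpy as np
from skimage.filters import threshold_otsu

def choose_threshold(roi_pixels, bins=64):
    roi_pixels = np.asarray(roi_pixels)
    hist, _ = np.histogram(roi_pixels, bins=bins)
    smoothed = np.convolve(hist, np.ones(5) / 5.0, mode='same')
    peaks = [i for i in range(1, bins - 1)
             if smoothed[i] > smoothed[i - 1] and smoothed[i] >= smoothed[i + 1]
             and smoothed[i] > 0.1 * smoothed.max()]
    if len(peaks) >= 2:                       # histogram looks bimodal: use Otsu
        return threshold_otsu(roi_pixels)
    return float(np.mean(roi_pixels))         # unimodal: simple fallback threshold
```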


The following examples are illustrative of several embodiments of the present technology:


1. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • determining outline data of the skin surface feature based on at least one of the one or more data sets; and
    • determining one or more confidence attributes associated with the determined outline data.


2. A method as in example 1, wherein the determined outline data includes a determined outline of the surface feature.


3. A method as in example 2, wherein at least one of the one or more confidence attributes include a confidence attribute for the determined outline as a whole.


4. A method as in example 2 or example 3, wherein the one or more confidence attributes include a plurality of segment confidence attributes, each associated with a segment of the determined outline.


5. A method as in any one of examples 2-4, wherein the one or more confidence attributes include a plurality of point confidence attributes, each associated with a point on the determined outline.


6. A method as in any one of examples 2-5, including rejecting the determined outline or part of the determined outline if one or more of the confidence attributes are unacceptable.


7. A method as in any one of examples 2-6, including rejecting part of the determined outline if one or more of the segment confidence attributes are unacceptable.


8. A method as in any one of examples 2-7, including rejecting part of the determined outline if one or more of the point confidence attributes are unacceptable.


9. A method as in example 6, including soliciting user input for replacement or modification of the rejected determined outline or rejected part of the determined outline.


10. A method as in any one of examples 1-9, wherein the determined outline data includes sub-region outline data for one or more sub-regions of the surface feature.


11. A method as in example 10, wherein the one or more confidence attributes include one or more of: one or more confidence attributes for the sub-region outline data for each of the one or more sub-regions; a plurality of sub-region segment confidence attributes, each associated with a segment of the determined sub-region outline data; and a plurality of sub-region point confidence attributes, each associated with a point in the determined sub-region outline data.


12. A method as in example 10 or example 11, including determining one or more tissue types associated with each of the sub-regions.


13. A method as in example 12, including determining a tissue confidence attribute associated with the determined tissue type for each of the sub-regions.


14. A method as in any one of examples 1-13 wherein determination of the confidence attribute is based at least in part on distances between points in the determined outline data.


15. A method as in any one of examples 1-14, including processing the determined outline data, wherein determination of the confidence attribute is based at least in part on distances between points in the determined or processed outline data and points in the one or more data sets.


16. A method as in example 15 wherein processing the determined outline data includes resampling the determined outline data.


17. A method as in example 15 or example 16 wherein the points in the one or more data sets include points with a predetermined tissue type classification.


18. A method as in any one of examples 1-17 including storing the one or more confidence attributes and/or data based on the one or more confidence attributes.


19. A method as in any one of examples 1-18 including displaying the one or more confidence attributes and/or data based on the one or more confidence attributes.


20. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • determining outline data of the skin surface feature based on at least one of the one or more data sets;
    • determining one or more confidence attributes associated with the determined outline data;
    • displaying at least part of an outline that is based at least in part on the determined outline data; and
    • displaying one or more visual representations of the one or more confidence attributes.


21. A method as in example 20, including displaying a representation of the skin surface and/or skin surface feature.


22. A method as in example 20 or example 21, wherein displaying one or more visual representations of the one or more confidence attributes includes displaying one or more of: a numerical confidence indicator, a confidence symbol indicator, a colored confidence indicator, a patterned confidence indicator, and a confidence scale indicator.


23. A method as in any one of examples 20-22, including displaying a plurality of visual representations of the one or more confidence attributes, at least some of the displayed visual representation being associated with segments of the displayed outline.


24. A method as in any one of examples 20-23, wherein displaying at least part of an outline includes displaying only outline segments with confidence attributes that are acceptable.


25. A method as in any one of examples 20-24, including soliciting user input of outline data for outline segments with confidence attributes that are unacceptable.


26. A method as in any one of examples 20-25, including enabling a user to accept or reject the displayed outline or segments of the displayed outline.


27. A method as in any one of examples 20-26, including enabling a user to modify the displayed outline or segments of the displayed outline.


28. A method as claimed in example 27, including:

    • determining one or more revised confidence attributes associated with the modified outline;
    • displaying the modified outline; and
    • displaying one or more visual representations of the one or more revised confidence attributes.


29. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • determining outline data for the skin surface feature based on at least one of the one or more data sets;
    • determining one or more confidence attributes associated with the determined outline data; and
    • based on the one or more confidence attributes, soliciting user input of replacement or modified outline data.


30. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • defining a region of interest (‘ROI’) within at least one of the data sets, the skin surface feature being positioned within the ROI; and
    • analyzing data within the ROI to determine outline data of the skin surface feature.


31. A method as in example 30, including a user defining the ROI.


32. A method as claimed in example 30 or example 31, including automatically defining the ROI based on one or more known characteristics of a capture device used in the capturing of one or more data sets.


33. A method as in any one of examples 30-32, including automatically defining the ROI by extracting data corresponding to the patient's skin surface from background data.


34. A method as in any one of examples 30-33, including automatically defining the ROI based on a first approximation of a boundary of the surface feature.


35. A method as in any one of examples 30-34, including automatically defining the ROI based on historic data associated with the same wound or same patient.


36. A method as in example 35, wherein the ROI is based on a historic ROI.


37. A method as in example 35, wherein the ROI is based on historic outline data.


38. A method as in example 35, wherein the ROI is based on known fiducials on the skin surface.


39. A method as in any one of examples 30-38, including soliciting user approval of the ROI.


40. A method as in any one of examples 30-39, including enabling user modification of the ROI.


41. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • determining outline data for the skin surface feature; and
    • smoothing the outline data to produce a set of smoothed outline points.


42. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • determining outline data for the skin surface feature, wherein the outline data includes a first set of outline points;
    • processing the outline data to produce a reduced set of outline points, the reduced set of outline points having a smaller number of outline points than the first set of outline points.


43. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • identifying a subset of skin surface feature outline points in at least one of the one or more data sets; and
    • extrapolating from the identified subset of skin surface feature outline points and data from the one or more data sets to determine a set of outline data for the skin surface feature.


44. A method as in example 43 including a user manually identifying the subset of skin surface feature outline points.


45. A method as in example 44 wherein a user manually identifies the subset of skin feature outline points by identifying one or more discrete outline points.


46. A method as in example 44 wherein a user manually identifies the subset of skin feature outline points by identifying one or more outline segments each including a plurality of outline points.


47. A method as in example 44 including enabling a user to classify outline points in the identified subset of outline points as definite points or non-definite points.


48. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing, in one or more first color spaces, one or more images of a patient's skin surface including the skin surface feature;
    • converting at least some of the one or more images to produce one or more converted images in one or more further color spaces, different to the one or more first color spaces; and
    • determining outline data for the skin surface feature based on the one or more converted images.


49. A method as in example 48 wherein the one or more further color spaces include one or more monochrome color spaces.


50. A method as in example 48 or example 49 wherein the one or more further color spaces include a normalized green color space.


51. A method as in example 48 wherein the one or more color spaces include one or more of: normalized red, normalized blue, red, green, blue, UV, Infrared, CIE Lab, RGB, HSL, HSV, CIE Luv, YCbCr, CMYK, enhanced red, and grayscale color spaces.


52. A method of assessing a surface feature on a patient's skin surface, including:

    • capturing one or more data sets from a patient's skin surface including the skin surface feature;
    • retrieving from storage historical information relating to the patient and/or skin surface feature;
    • based on the retrieved historical information, selecting an outlining method from a plurality of available methods; and
    • determining outline data of the skin surface feature based on at least one of the one or more data sets, using the selected outlining method.


53. A method as in example 52 wherein the historical information includes one or more of: skin surface characteristics, surface feature characteristics, surface feature type, surface feature etiology, anatomical site, imaging metrics, previous dressing type, tissue type characteristics, 3D surface feature information, expected or assessed pressure or friction around the surface feature, and historical outline information.


54. A method of assessing one or more surface features on the skin surfaces of one or more patients, including:

    • determining outline data of a first skin surface feature based on one or more captured data sets;
    • a processor modifying an outline determination method; and
    • a processor determining outline data of a further skin surface feature based on one or more further captured data sets and according to the modified outline determination method.


55. A method as in example 54, wherein the processor modifies the outline determination method based on information obtained from the determining of outline data of the first skin surface feature.


56. A method as claimed in example 54 or example 55 wherein the determining of outline data of the first skin feature includes a processor determining the outline data of the first skin surface feature based on one or more captured data sets and according to the outline determination method.


57. A method as in example 55 or example 56 wherein a user accepts or modifies the determined outline data of the first skin surface feature and the processor modifies the outline determination method based at least in part on the user acceptance or modification of the determined outline data of the first skin surface feature.


58. A method as in any one of examples 54-57 including storing in memory code embodying the modified outline determination method.


The skilled reader will understand that many variations on the manner of selection of data capture methods, preliminary data processing, ROI determination, outline determination and/or display are possible.


Color spaces may also be selected based on the contrast in a number of different color spaces—for example, one or more color spaces with the highest contrast, or acceptable contrast, may be used.
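A minimal sketch of contrast-based selection is given below, assuming the candidate ‘color spaces’ are single channels derived from an RGB ROI and that a range-normalized standard deviation is an adequate contrast measure; both assumptions are illustrative only.

```python
# Sketch only: pick the candidate channel with the highest contrast within the ROI.
import numpy as np

def best_channel_by_contrast(rgb_roi):
    rgb = np.asarray(rgb_roi, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6
    candidates = {
        'red': r,
        'green': g,
        'normalized_green': g / total,
        'grayscale': 0.299 * r + 0.587 * g + 0.114 * b,
    }
    def contrast(channel):
        span = np.ptp(channel)
        return float(np.std(channel) / span) if span else 0.0
    contrasts = {name: contrast(chan) for name, chan in candidates.items()}
    return max(contrasts, key=contrasts.get), contrasts
```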


Algorithm learning techniques may also be used. These may be based on any suitable inputs, including e.g. the user's acceptance, modification or rejection of outline data. For example, the final accepted outline may be compared to the initial outline data and any areas of agreement/disagreement between the two may be used to tune the outline determination method. Tuning may involve adjustment of weighting coefficients from each of the various inputs (color spaces, 3d shape etc), adjustments to the way the confidence is calculated, or any other suitable modification.


Various methods have been described above. The skilled reader will understand that the order of method steps may be varied where appropriate. The skilled reader will also understand that further processing or method steps may be incorporated. Such variations are intended to fall within the scope of the invention.


Any data captured or generated by these methods may be stored for later analysis, in determining trends (such as trends in wound area or volume over time), or for use in subsequent data capture. Further, any data captured or generated by these methods may be displayed to a user in any suitable form on any suitable device.


Any automated step may be performed by a processor or processors at any suitable part of the facility. Automated steps may be carried out locally, centrally or remotely, by processors on the capture device, a local user device, central computers or elsewhere in the facility.


Any software, code embodying method steps, modifications to such code, as well as any captured or processed or calculated data may be stored in any suitable memory anywhere in the facility, including local, central or remote memory.


While the present technology has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in detail, it is not the intention of the Applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the present technology in its broader aspects is not limited to the specific details, representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of the Applicant's general inventive concept.

Claims
  • 1. A method of assessing a surface feature on a patient's skin surface, including: capturing one or more data sets from a patient's skin surface including the skin surface feature; determining outline data of the skin surface feature based on at least one of the one or more data sets; and determining one or more confidence attributes associated with the determined outline data; wherein determination of the confidence attribute is based at least in part on one or more of: distances between points in the determined outline data; and distances between points in the determined or processed outline data and points in the one or more data sets.
  • 2. A method as claimed in claim 1, wherein at least one of the one or more confidence attributes include one or more of: a confidence attribute for the determined outline as a whole; a plurality of segment confidence attributes, each associated with a segment of the determined outline; and a plurality of point confidence attributes, each associated with a point on the determined outline.
  • 3. A method as claimed in claim 2, including rejecting the determined outline or part of the determined outline if one or more of the confidence attributes are unacceptable.
  • 4. A method as claimed in claim 3, including soliciting user input for replacement or modification of the rejected determined outline or rejected part of the determined outline.
  • 5. A method as claimed in claim 1, wherein the determined outline data includes sub-region outline data for one or more sub-regions of the surface feature.
  • 6. A method as claimed in claim 5, wherein the one or more confidence attributes include one or more of: one or more confidence attributes for the sub-region outline data for each of the one or more sub-regions; a plurality of sub-region segment confidence attributes, each associated with a segment of the determined sub-region outline data; and a plurality of sub-region point confidence attributes, each associated with a point in the determined sub-region outline data.
  • 7. A method as claimed in claim 5, including determining one or more tissue types associated with each of the sub-regions.
  • 8. A method as claimed in claim 7, including determining a tissue confidence attribute associated with the determined tissue type for each of the sub-regions.
  • 9. A method as claimed in claim 1 including storing the one or more confidence attributes and/or data based on the one or more confidence attributes.
  • 10. A method as claimed in claim 1 including displaying the one or more confidence attributes and/or data based on the one or more confidence attributes.
  • 11. A method of assessing a surface feature on a patient's skin surface, including capturing one or more data sets from a patient's skin surface including the skin surface feature; determining outline data of the skin surface feature based on at least one of the one or more data sets; determining one or more confidence attributes associated with the determined outline data; displaying at least part of an outline that is based at least in part on the determined outline data; and displaying one or more visual representations of the one or more confidence attributes.
  • 12. A method as claimed in claim 11, including displaying a representation of the skin surface and/or skin surface feature.
  • 13. A method as claimed in claim 11, wherein displaying one or more visual representations of the one or more confidence attributes includes displaying one or more of: a numerical confidence indicator, a confidence symbol indicator, a colored confidence indicator, a patterned confidence indicator, and a confidence scale indicator.
  • 14. A method as claimed in claim 11, including displaying a plurality of visual representations of the one or more confidence attributes, at least some of the displayed visual representation being associated with segments of the displayed outline.
  • 15. A method as claimed in claim 11, wherein displaying at least part of an outline includes displaying only outline segments with confidence attributes that are acceptable.
  • 16. A method as claimed in claim 11, including enabling a user to modify the displayed outline or segments of the displayed outline, determining one or more revised confidence attributes associated with the modified outline; displaying the modified outline; and displaying one or more visual representations of the one or more revised confidence attributes.
  • 17. A method of assessing a surface feature on a patient's skin surface, including: capturing one or more data sets from a patient's skin surface including the skin surface feature; determining outline data of the skin surface feature based on at least one of the one or more data sets, wherein the outline data includes a first set of outline points; and determining one or more confidence attributes associated with the determined outline data, wherein determination of the confidence attribute is based at least in part on one or more of: distances between points in the determined outline data; and distances between points in the determined or processed outline data and points in the one or more data sets; processing the outline data to produce a reduced set of outline points, the reduced set of outline points having a smaller number of outline points than the first set of outline points; smoothing one or more of the first set of outline points and the reduced set of outline points.
  • 18. A method as claimed in claim 17, wherein at least one of the one or more confidence attributes include one or more of: a confidence attribute for the determined outline as a whole; a plurality of segment confidence attributes, each associated with a segment of the determined outline; and a plurality of point confidence attributes, each associated with a point on the determined outline.
  • 19. A method as claimed in claim 18, including rejecting the determined outline or part of the determined outline if one or more of the confidence attributes are unacceptable.
  • 20. A method as claimed in claim 19, including soliciting user input for replacement or modification of the rejected determined outline or rejected part of the determined outline.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/850,377, filed May 20, 2019, and titled “AUTOMATED OR PARTIALLY AUTOMATED ANATOMICAL SURFACE ASSESSMENT METHODS, DEVICES AND SYSTEMS,” which is incorporated herein by reference in its entirety.

PCT Information
  • Filing Document: PCT/IB2020/000420; Filing Date: 5/20/2020; Country: WO
  • Publishing Document: WO2020/234653; Publishing Date: 11/26/2020; Country: WO; Kind: A
US Referenced Citations (569)
Number Name Date Kind
3259612 Peter Jul 1966 A
3335716 Alt et al. Aug 1967 A
4090501 Chaitin May 1978 A
4170987 Anselmo et al. Oct 1979 A
4236082 Butler Nov 1980 A
4505583 Konomi Mar 1985 A
4515165 Carroll May 1985 A
4535782 Zoltan Aug 1985 A
4556057 Hiruma et al. Dec 1985 A
4724480 Hecker et al. Feb 1988 A
4736739 Flaton Apr 1988 A
4768513 Suzuki Sep 1988 A
4773097 Suzaki et al. Sep 1988 A
4821117 Sekiguchi Apr 1989 A
4839807 Doi et al. Jun 1989 A
4851984 Doi et al. Jul 1989 A
4894547 Leffell et al. Jan 1990 A
4930516 Alfano et al. Jun 1990 A
4957114 Zeng et al. Sep 1990 A
4979815 Tsikos Dec 1990 A
4996994 Steinhauer et al. Mar 1991 A
D315901 Knowles Apr 1991 S
5003977 Suzuki et al. Apr 1991 A
5016173 Kenet et al. May 1991 A
5036853 Jeffcoat et al. Aug 1991 A
5080100 Trotel Jan 1992 A
5157461 Page Oct 1992 A
5174297 Daikuzono Dec 1992 A
5241468 Kenet Aug 1993 A
5270168 Grinnell Dec 1993 A
5319550 Griffith Jun 1994 A
5363854 Martens et al. Nov 1994 A
5369496 Alfano et al. Nov 1994 A
5396331 Kitoh et al. Mar 1995 A
5408996 Salb Apr 1995 A
5421337 Richards-Kortum et al. Jun 1995 A
5515449 Tsuruoka et al. May 1996 A
5519208 Esparza et al. May 1996 A
5528703 Lee Jun 1996 A
5531520 Grimson et al. Jul 1996 A
5532824 Harvey et al. Jul 1996 A
5561526 Huber et al. Oct 1996 A
5588428 Smith et al. Dec 1996 A
5590660 MacAulay et al. Jan 1997 A
5603318 Heilbrun et al. Feb 1997 A
5627907 Gur et al. May 1997 A
5644141 Hooker et al. Jul 1997 A
5648915 Mckinney et al. Jul 1997 A
5673300 Reckwerdt et al. Sep 1997 A
5689575 Sako et al. Nov 1997 A
5699798 Hochman et al. Dec 1997 A
5701902 Vari et al. Dec 1997 A
5717791 Labaere et al. Feb 1998 A
D393068 Kodama Mar 1998 S
5740268 Nishikawa et al. Apr 1998 A
5749830 Kaneko et al. May 1998 A
5784162 Cabib et al. Jul 1998 A
5791346 Craine et al. Aug 1998 A
5799100 Clarke et al. Aug 1998 A
5810014 Davis et al. Sep 1998 A
5836872 Kenet et al. Nov 1998 A
5910972 Ohkubo et al. Jun 1999 A
5921937 Davis et al. Jul 1999 A
5946645 Rioux et al. Aug 1999 A
5957837 Raab Sep 1999 A
5967797 Maldonado Oct 1999 A
5967979 Taylor et al. Oct 1999 A
5969822 Fright et al. Oct 1999 A
5974165 Giger et al. Oct 1999 A
6032070 Flock et al. Feb 2000 A
6081612 Gutkowicz-Krusin et al. Jun 2000 A
6081739 Lemchen Jun 2000 A
6091995 Ingle et al. Jul 2000 A
6101408 Craine et al. Aug 2000 A
6208749 Gutkowicz-Krusin et al. Mar 2001 B1
6215893 Leshem et al. Apr 2001 B1
6265151 Canter et al. Jul 2001 B1
6266453 Hibbard et al. Jul 2001 B1
6272278 Takahata et al. Aug 2001 B1
6278793 Gur et al. Aug 2001 B1
6307957 Gutkowicz-Krusin et al. Oct 2001 B1
6324417 Cotton Nov 2001 B1
D453350 Fenton Feb 2002 S
6359513 Kuo et al. Mar 2002 B1
6359612 Peter Mar 2002 B1
D455166 Raad Apr 2002 S
6381026 Schiff et al. Apr 2002 B1
6381488 Dickey et al. Apr 2002 B1
6392744 Holec May 2002 B1
6396270 Smith May 2002 B1
6413212 Raab Jul 2002 B1
6421463 Poggio et al. Jul 2002 B1
6427022 Craine et al. Jul 2002 B1
6491632 Taylor Dec 2002 B1
6567682 Osterweil et al. May 2003 B1
6594388 Gindele et al. Jul 2003 B1
6594516 Steckner et al. Jul 2003 B1
6603552 Cline et al. Aug 2003 B1
6611617 Crampton Aug 2003 B1
6611833 Johnson Aug 2003 B1
6631286 Pfeiffer et al. Oct 2003 B2
6648820 Sarel Nov 2003 B1
6671349 Griffith Dec 2003 B1
6678001 Elberbaum Jan 2004 B1
6690964 Bieger et al. Feb 2004 B2
6715675 Rosenfeld Apr 2004 B1
6754370 Hall-Holt et al. Jun 2004 B1
6770186 Rosenfeld et al. Aug 2004 B2
6798571 Wetzel et al. Sep 2004 B2
6809803 O'Brien et al. Oct 2004 B1
6810279 Mansfield et al. Oct 2004 B2
6816606 Wetzel et al. Nov 2004 B2
6816847 Toyama Nov 2004 B1
6862410 Miyoshi Mar 2005 B2
6862542 Lockhart et al. Mar 2005 B2
6873340 Luby Mar 2005 B2
6873716 Bowker Mar 2005 B1
6879394 Amblard et al. Apr 2005 B2
6907193 Kollias et al. Jun 2005 B2
6915073 Seo Jul 2005 B2
6922523 Merola et al. Jul 2005 B2
6941323 Galperin Sep 2005 B1
6961517 Merola et al. Nov 2005 B2
6968094 Gallagher Nov 2005 B1
6993169 Wetzel et al. Jan 2006 B2
7006223 Mullani Feb 2006 B2
7013172 Mansfield et al. Mar 2006 B2
7015906 Olschewski et al. Mar 2006 B2
7027153 Mullani Apr 2006 B2
7040536 Rosenfeld May 2006 B2
7054674 Cane et al. May 2006 B2
7064311 Jung et al. Jun 2006 B2
7068828 Kim et al. Jun 2006 B2
7068836 Rubbert et al. Jun 2006 B1
7074509 Rosenfeld et al. Jul 2006 B2
7103205 Wang et al. Sep 2006 B2
7106885 Osterweil et al. Sep 2006 B2
7127094 Elbaum et al. Oct 2006 B1
7127280 Dauga Oct 2006 B2
7128894 Tannous et al. Oct 2006 B1
7130465 Muenzenmayer et al. Oct 2006 B2
7136191 Kaltenbach et al. Nov 2006 B2
D533555 Odhe et al. Dec 2006 S
7155049 Wetzel et al. Dec 2006 B2
7162063 Craine et al. Jan 2007 B1
7167243 Mullani Jan 2007 B2
7167244 Mullani Jan 2007 B2
7181363 Ratti et al. Feb 2007 B2
7194114 Schneiderman Mar 2007 B2
7212660 Wetzel et al. May 2007 B2
7227621 Lee et al. Jun 2007 B2
7233693 Momma Jun 2007 B2
D547347 Kim Jul 2007 S
7248724 Gutenev Jul 2007 B2
D554682 Martinez Nov 2007 S
7295226 Meron et al. Nov 2007 B1
7298881 Giger et al. Nov 2007 B2
D561804 Asai Feb 2008 S
7347365 Rowe Mar 2008 B2
7376346 Merola et al. May 2008 B2
7400754 Jung et al. Jul 2008 B2
7421102 Wetzel et al. Sep 2008 B2
7426319 Takahashi Sep 2008 B2
7440597 Rowe Oct 2008 B2
7450783 Talapov et al. Nov 2008 B2
7460250 Keightley et al. Dec 2008 B2
7474415 Lin et al. Jan 2009 B2
7487063 Tubic et al. Feb 2009 B2
7489799 Nilsen et al. Feb 2009 B2
7495208 Czarnek et al. Feb 2009 B2
7496399 Maschke Feb 2009 B2
7509861 Masotti et al. Mar 2009 B2
7538869 Treado et al. May 2009 B2
7545963 Rowe Jun 2009 B2
D597205 Koch Jul 2009 S
7580590 Lin et al. Aug 2009 B2
7581191 Rice et al. Aug 2009 B2
7587618 Inui et al. Sep 2009 B2
7595878 Nelson et al. Sep 2009 B2
7603031 Viaud et al. Oct 2009 B1
D603441 Wada Nov 2009 S
7613335 McLennan et al. Nov 2009 B2
7620211 Browne et al. Nov 2009 B2
7647085 Cane et al. Jan 2010 B2
7668350 Rowe Feb 2010 B2
7684589 Nilsen et al. Mar 2010 B2
7724379 Kawasaki et al. May 2010 B2
7729747 Stranc et al. Jun 2010 B2
7735729 Rowe Jun 2010 B2
7738032 Kollias et al. Jun 2010 B2
7751594 Rowe et al. Jul 2010 B2
7765487 Cable Jul 2010 B2
7819311 Rowe et al. Oct 2010 B2
7869641 Wetzel et al. Jan 2011 B2
7876948 Wetzel et al. Jan 2011 B2
7881777 Docherty et al. Feb 2011 B2
7894645 Barsky Feb 2011 B2
7912320 Minor Mar 2011 B1
7912534 Grinvald et al. Mar 2011 B2
7916834 Piorek et al. Mar 2011 B2
7931149 Gilad et al. Apr 2011 B2
7951395 Lee et al. May 2011 B2
8000776 Gono Aug 2011 B2
8019801 Robb et al. Sep 2011 B1
8026942 Payonk et al. Sep 2011 B2
8071242 Rosenfeld et al. Dec 2011 B2
8078262 Murphy et al. Dec 2011 B2
8094294 Treado et al. Jan 2012 B2
8105233 Abou El Kheir Jan 2012 B2
D653687 Yu Feb 2012 S
8123704 Richards Feb 2012 B2
8150500 Goldman et al. Apr 2012 B2
8161826 Taylor Apr 2012 B1
8165357 Rowe Apr 2012 B2
8184873 Rowe et al. May 2012 B2
D662122 Goodwin Jun 2012 S
D664655 Daniel et al. Jul 2012 S
8213695 Zouridakis Jul 2012 B2
8218862 Demirli et al. Jul 2012 B2
8218873 Boncyk et al. Jul 2012 B2
8218874 Boncyk et al. Jul 2012 B2
8224077 Boncyk et al. Jul 2012 B2
8224078 Boncyk et al. Jul 2012 B2
8224079 Boncyk et al. Jul 2012 B2
8229185 Ennis et al. Jul 2012 B2
8238623 Stephan et al. Aug 2012 B2
8306334 Paschalakis et al. Nov 2012 B2
8326031 Boncyk et al. Dec 2012 B2
8335351 Boncyk et al. Dec 2012 B2
8437544 Boncyk et al. May 2013 B2
8457395 Boncyk et al. Jun 2013 B2
8463030 Boncyk et al. Jun 2013 B2
8463031 Boncyk et al. Jun 2013 B2
8465762 Lee et al. Jun 2013 B2
8467600 Boncyk et al. Jun 2013 B2
8467602 Boncyk et al. Jun 2013 B2
8478036 Boncyk et al. Jul 2013 B2
8478037 Boncyk et al. Jul 2013 B2
8480641 Jacobs Jul 2013 B2
8488880 Boncyk et al. Jul 2013 B2
8494264 Boncyk et al. Jul 2013 B2
8498460 Patwardhan Jul 2013 B2
8520942 Boncyk et al. Aug 2013 B2
8533879 Taylor Sep 2013 B1
8548245 Boncyk et al. Oct 2013 B2
8548278 Boncyk et al. Oct 2013 B2
8582817 Boncyk et al. Nov 2013 B2
8588476 Spicola, Jr. Nov 2013 B1
8588527 Boncyk et al. Nov 2013 B2
D697210 Delaney et al. Jan 2014 S
8638986 Jiang et al. Jan 2014 B2
8661915 Taylor Mar 2014 B2
8712193 Boncyk et al. Apr 2014 B2
8718410 Boncyk et al. May 2014 B2
8734342 Cable May 2014 B2
8755053 Fright et al. Jun 2014 B2
8768052 Kawano Jul 2014 B2
8773508 Daniel et al. Jul 2014 B2
8774463 Boncyk et al. Jul 2014 B2
8787621 Spicola, Sr. et al. Jul 2014 B2
8787630 Rowe Jul 2014 B2
8795169 Cosentino et al. Aug 2014 B2
8798368 Boncyk et al. Aug 2014 B2
8800386 Taylor Aug 2014 B2
8814841 Hartwell Aug 2014 B2
8824738 Boncyk et al. Sep 2014 B2
8837868 Boncyk et al. Sep 2014 B2
8842941 Boncyk et al. Sep 2014 B2
8849380 Patwardhan Sep 2014 B2
D714940 Kim Oct 2014 S
8855423 Boncyk et al. Oct 2014 B2
8861859 Boncyk et al. Oct 2014 B2
8867839 Boncyk et al. Oct 2014 B2
8873891 Boncyk et al. Oct 2014 B2
8875331 Taylor Nov 2014 B2
8885983 Boncyk et al. Nov 2014 B2
8892190 Docherty et al. Nov 2014 B2
8904876 Taylor et al. Dec 2014 B2
8913800 Rowe Dec 2014 B2
8923563 Boncyk et al. Dec 2014 B2
D720864 Behar et al. Jan 2015 S
8938096 Boncyk et al. Jan 2015 B2
8939918 Richards Jan 2015 B2
8948459 Boncyk et al. Feb 2015 B2
8948460 Boncyk et al. Feb 2015 B2
D724216 Gant et al. Mar 2015 S
8997588 Taylor Apr 2015 B2
9014513 Boncyk et al. Apr 2015 B2
9014514 Boncyk et al. Apr 2015 B2
9014515 Boncyk et al. Apr 2015 B2
9020305 Boncyk et al. Apr 2015 B2
9025813 Boncyk et al. May 2015 B2
9025814 Boncyk et al. May 2015 B2
9031278 Boncyk et al. May 2015 B2
9036947 Boncyk et al. May 2015 B2
9036948 Boncyk et al. May 2015 B2
9041810 Ecker et al. May 2015 B2
D735879 Behar et al. Aug 2015 S
9110925 Boncyk et al. Aug 2015 B2
9116920 Boncyk et al. Aug 2015 B2
9135355 Boncyk et al. Sep 2015 B2
9141714 Boncyk et al. Sep 2015 B2
9148562 Boncyk et al. Sep 2015 B2
D740945 Booth Oct 2015 S
9154694 Boncyk et al. Oct 2015 B2
9154695 Boncyk et al. Oct 2015 B2
9167800 Spicola, Jr. Oct 2015 B2
9179844 Fright et al. Nov 2015 B2
9186053 Viola Nov 2015 B2
9196067 Freed Nov 2015 B1
9224205 Tsin et al. Dec 2015 B2
9235600 Boncyk et al. Jan 2016 B2
9244943 Boncyk et al. Jan 2016 B2
9262440 Boncyk et al. Feb 2016 B2
9268197 Digregorio et al. Feb 2016 B1
9285323 Burg et al. Mar 2016 B2
9288271 Boncyk et al. Mar 2016 B2
9311520 Burg et al. Apr 2016 B2
9311540 Ecker et al. Apr 2016 B2
9311552 Boncyk et al. Apr 2016 B2
9311553 Boncyk et al. Apr 2016 B2
9311554 Boncyk et al. Apr 2016 B2
9317769 Boncyk et al. Apr 2016 B2
9324004 Boncyk et al. Apr 2016 B2
9330326 Boncyk et al. May 2016 B2
9330327 Boncyk et al. May 2016 B2
9330328 Boncyk et al. May 2016 B2
9330453 Soldatitsch et al. May 2016 B2
9342748 Boncyk et al. May 2016 B2
D758608 Behar et al. Jun 2016 S
9377295 Fright et al. Jun 2016 B2
9395234 Cosentino et al. Jul 2016 B2
9399676 Schurpf et al. Jul 2016 B2
9438775 Powers Sep 2016 B2
9451928 Falco et al. Sep 2016 B2
9525867 Thomas et al. Dec 2016 B2
9528941 Burg et al. Dec 2016 B2
9607380 Burg et al. Mar 2017 B2
D783838 Zhao et al. Apr 2017 S
9684815 Walch Jun 2017 B2
9690904 Zizi Jun 2017 B1
9697415 Jo Jul 2017 B2
9730629 White Aug 2017 B2
9808206 Zhao et al. Nov 2017 B1
9818193 Smart Nov 2017 B2
9824271 Whritenor Nov 2017 B2
9861285 Fright et al. Jan 2018 B2
9863811 Burg Jan 2018 B2
9955910 Fright et al. May 2018 B2
9958383 Hall et al. May 2018 B2
9972077 Adiri et al. May 2018 B2
9996923 Thomas Jun 2018 B2
10013527 Fairbairn et al. Jul 2018 B2
D827827 Canfield et al. Sep 2018 S
10068329 Adiri et al. Sep 2018 B2
D831197 Scruggs et al. Oct 2018 S
10117617 Cantu et al. Nov 2018 B2
10130260 Patwardhan Nov 2018 B2
10143425 Zhao et al. Dec 2018 B1
D837388 Dacosta et al. Jan 2019 S
10267743 Burg et al. Apr 2019 B2
10307382 Jung et al. Jun 2019 B2
10362984 Adiri et al. Jul 2019 B2
10368795 Patwardhan Aug 2019 B2
10426402 Shi Oct 2019 B2
10559081 Omer et al. Feb 2020 B2
D877931 Dacosta et al. Mar 2020 S
10607340 Kim et al. Mar 2020 B2
10614623 D'alessandro Apr 2020 B2
10617305 Patwardhan et al. Apr 2020 B2
10652520 Otto et al. May 2020 B2
10674953 Baker et al. Jun 2020 B2
10692214 Bisker Jun 2020 B2
10702160 Patwardhan Jul 2020 B2
10775647 Joy et al. Sep 2020 B2
10777317 Fairbairn et al. Sep 2020 B2
D898921 Dacosta et al. Oct 2020 S
D899604 Dacosta et al. Oct 2020 S
10835204 Caluser Nov 2020 B2
D903863 Dacosta et al. Dec 2020 S
10874302 Fright et al. Dec 2020 B2
10880488 Tashayyod et al. Dec 2020 B2
10940328 Lv et al. Mar 2021 B2
11116407 Dickie et al. Sep 2021 B2
11134848 Bala et al. Oct 2021 B2
11138707 Yeo et al. Oct 2021 B2
11250945 Fairbairn et al. Feb 2022 B2
11315245 Moore Apr 2022 B2
20020054297 Lee et al. May 2002 A1
20020149585 Kacyra et al. Oct 2002 A1
20020197600 Maione et al. Dec 2002 A1
20030004405 Townsend et al. Jan 2003 A1
20030006770 Smith Jan 2003 A1
20030031383 Gooch Feb 2003 A1
20030036751 Anderson et al. Feb 2003 A1
20030085908 Luby May 2003 A1
20030164841 Myers Sep 2003 A1
20030164875 Myers Sep 2003 A1
20030229514 Brown Dec 2003 A2
20030231793 Crampton Dec 2003 A1
20040013292 Raunig Jan 2004 A1
20040014165 Keidar et al. Jan 2004 A1
20040059199 Thomas et al. Mar 2004 A1
20040080497 Enmei Apr 2004 A1
20040117343 Johnson Jun 2004 A1
20040136579 Gutenev Jul 2004 A1
20040146290 Kollias et al. Jul 2004 A1
20040201694 Gartstein et al. Oct 2004 A1
20040225222 Zeng et al. Nov 2004 A1
20040264749 Skladnev et al. Dec 2004 A1
20050012817 Hampapur et al. Jan 2005 A1
20050027567 Taha Feb 2005 A1
20050033142 Madden et al. Feb 2005 A1
20050084176 Talapov et al. Apr 2005 A1
20050094262 Spediacci et al. May 2005 A1
20050111757 Brackett et al. May 2005 A1
20050154276 Barducci et al. Jul 2005 A1
20050190988 Feron Sep 2005 A1
20050237384 Jess et al. Oct 2005 A1
20050259281 Boust Nov 2005 A1
20050273011 Hattery et al. Dec 2005 A1
20050273267 Maione Dec 2005 A1
20060008178 Seeger et al. Jan 2006 A1
20060012802 Shirley Jan 2006 A1
20060036135 Kern Feb 2006 A1
20060036156 Lachaine et al. Feb 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060055943 Kawasaki et al. Mar 2006 A1
20060058665 Chapman Mar 2006 A1
20060072122 Hu et al. Apr 2006 A1
20060073132 Congote Apr 2006 A1
20060089553 Cotton Apr 2006 A1
20060098876 Buscema May 2006 A1
20060135953 Kania et al. Jun 2006 A1
20060151601 Rosenfeld Jul 2006 A1
20060159341 Pekar et al. Jul 2006 A1
20060204072 Wetzel et al. Sep 2006 A1
20060210132 Christiansen et al. Sep 2006 A1
20060222263 Carlson Oct 2006 A1
20060268148 Kollias et al. Nov 2006 A1
20060269125 Kalevo et al. Nov 2006 A1
20060293613 Fatehi et al. Dec 2006 A1
20070065009 Ni et al. Mar 2007 A1
20070097381 Tobiason et al. May 2007 A1
20070125390 Afriat et al. Jun 2007 A1
20070129602 Bettesh et al. Jun 2007 A1
20070229850 Herber Oct 2007 A1
20070273894 Johnson Nov 2007 A1
20070276195 Xu et al. Nov 2007 A1
20070276309 Xu et al. Nov 2007 A1
20080006282 Sukovic Jan 2008 A1
20080021329 Wood et al. Jan 2008 A1
20080045807 Psota et al. Feb 2008 A1
20080088704 Wendelken et al. Apr 2008 A1
20080098322 Champion et al. Apr 2008 A1
20080126478 Ferguson et al. May 2008 A1
20080165357 Stem Jul 2008 A1
20080232679 Hahn Sep 2008 A1
20080246759 Summers Oct 2008 A1
20080275315 Oka et al. Nov 2008 A1
20080285056 Blayvas Nov 2008 A1
20080312642 Kania et al. Dec 2008 A1
20080312643 Kania et al. Dec 2008 A1
20090116712 Al-Moosawi et al. May 2009 A1
20090118720 Black et al. May 2009 A1
20090221874 Vinther Sep 2009 A1
20090225333 Bendall Sep 2009 A1
20090234313 Mullejeans et al. Sep 2009 A1
20100004564 Jendle Jan 2010 A1
20100020164 Perrault Jan 2010 A1
20100091104 Sprigle et al. Apr 2010 A1
20100111387 Christiansen, II et al. May 2010 A1
20100113940 Sen et al. May 2010 A1
20100121201 Papaioannou May 2010 A1
20100149551 Malinkevich Jun 2010 A1
20100156921 McLennan et al. Jun 2010 A1
20100191126 Al-Moosawi et al. Jul 2010 A1
20100278312 Ortiz Nov 2010 A1
20110032349 San Matias et al. Feb 2011 A1
20110102550 Daniel et al. May 2011 A1
20110125028 Wood et al. May 2011 A1
20110190637 Knobel et al. Aug 2011 A1
20120035469 Whelan et al. Feb 2012 A1
20120059266 Davis et al. Mar 2012 A1
20120078088 Whitestone et al. Mar 2012 A1
20120078113 Whitestone et al. Mar 2012 A1
20120226152 Porikli Sep 2012 A1
20120253200 Stolka et al. Oct 2012 A1
20120265236 Wesselmann Oct 2012 A1
20120275668 Chou et al. Nov 2012 A1
20130051651 Leary et al. Feb 2013 A1
20130162796 Bharara et al. Jun 2013 A1
20130335545 Darling Dec 2013 A1
20140048687 Drzymala et al. Feb 2014 A1
20140088402 Xu Mar 2014 A1
20140354830 Schafer et al. Dec 2014 A1
20150077517 Powers Mar 2015 A1
20150089994 Richards Apr 2015 A1
20150142462 Vaidya et al. May 2015 A1
20150150457 Wu et al. Jun 2015 A1
20150214993 Huang Jul 2015 A1
20150250416 LaPlante et al. Sep 2015 A1
20150265236 Garner et al. Sep 2015 A1
20150270734 Davison et al. Sep 2015 A1
20160100790 Cantu et al. Apr 2016 A1
20160157725 Munoz Jun 2016 A1
20160178512 Hall et al. Jun 2016 A1
20160206205 Wu et al. Jul 2016 A1
20160259992 Knodt et al. Sep 2016 A1
20160261133 Wang Sep 2016 A1
20160262659 Fright et al. Sep 2016 A1
20160275681 D'alessandro Sep 2016 A1
20160284084 Gurcan et al. Sep 2016 A1
20160338594 Spahn et al. Nov 2016 A1
20170076446 Pedersen et al. Mar 2017 A1
20170079577 Fright et al. Mar 2017 A1
20170084024 Gurevich Mar 2017 A1
20170085764 Kim et al. Mar 2017 A1
20170086940 Nakamura Mar 2017 A1
20170127196 Blum et al. May 2017 A1
20170236273 Kim et al. Aug 2017 A1
20170258340 Przybyszewski et al. Sep 2017 A1
20170262985 Finn et al. Sep 2017 A1
20170303790 Bala et al. Oct 2017 A1
20170303844 Baker et al. Oct 2017 A1
20180132726 Dickie et al. May 2018 A1
20180214071 Fright et al. Aug 2018 A1
20180252585 Burg Sep 2018 A1
20180279943 Budman et al. Oct 2018 A1
20180296092 Hassan et al. Oct 2018 A1
20180303413 Hassan et al. Oct 2018 A1
20180322647 Harrington et al. Nov 2018 A1
20180336720 Larkins et al. Nov 2018 A1
20190133513 Patwardhan May 2019 A1
20190240166 Jung et al. Aug 2019 A1
20190273890 Christiansen, II et al. Sep 2019 A1
20190290187 Adiri et al. Sep 2019 A1
20190298183 Burg et al. Oct 2019 A1
20190298252 Patwardhan Oct 2019 A1
20190307337 Little et al. Oct 2019 A1
20190307400 Zhao et al. Oct 2019 A1
20190310203 Burg et al. Oct 2019 A1
20190336003 Patwardhan Nov 2019 A1
20190350535 Zhao et al. Nov 2019 A1
20190369418 Joy et al. Dec 2019 A1
20200014910 Larkins Jan 2020 A1
20200121245 Barclay et al. Apr 2020 A1
20200126226 Adiri et al. Apr 2020 A1
20200126227 Adiri et al. Apr 2020 A1
20200196962 Zhao et al. Jun 2020 A1
20200209214 Zohar et al. Jul 2020 A1
20200211193 Adiri et al. Jul 2020 A1
20200211228 Adiri et al. Jul 2020 A1
20200211682 Zohar et al. Jul 2020 A1
20200211693 Adiri et al. Jul 2020 A1
20200211697 Adiri et al. Jul 2020 A1
20200225166 Burg et al. Jul 2020 A1
20200234444 Budman et al. Jul 2020 A1
20200286600 De Brouwer et al. Sep 2020 A1
20200297213 Patwardhan Sep 2020 A1
20200359971 Zhao et al. Nov 2020 A1
20200364862 Dacosta et al. Nov 2020 A1
20200383631 Canfield et al. Dec 2020 A1
20210000387 Zizi Jan 2021 A1
20210004995 Burg et al. Jan 2021 A1
20210068664 Fright et al. Mar 2021 A1
20210219907 Fright et al. Jul 2021 A1
20210386295 Dickie et al. Dec 2021 A1
20220270746 Fairbairn et al. Aug 2022 A1
Foreign Referenced Citations (92)
Number Date Country
549703 Mar 2012 AT
110326029 Oct 2019 CN
2642841 Mar 1978 DE
3420588 Dec 1984 DE
4120074 Jan 1992 DE
355221 Feb 1990 EP
552526 Jul 1993 EP
650694 May 1995 EP
1210906 Jun 2002 EP
1248237 Oct 2002 EP
1351036 Oct 2003 EP
1303267 Apr 2004 EP
1584405 Oct 2005 EP
1611543 Jan 2006 EP
1467706 Mar 2007 EP
1946567 Jul 2008 EP
119660 May 2009 EP
2272047 Mar 2012 EP
2883037 Jun 2015 EP
3114462 Jan 2017 EP
3143378 Mar 2017 EP
3160327 May 2017 EP
2750673 Aug 2017 EP
3251332 Dec 2017 EP
3270770 Jan 2018 EP
3286695 Feb 2018 EP
3364859 Aug 2018 EP
3365057 Aug 2018 EP
3371779 Sep 2018 EP
3371780 Sep 2018 EP
3381015 Nov 2019 EP
3586195 Jan 2020 EP
3589187 Jan 2020 EP
3602501 Feb 2020 EP
3555856 Apr 2020 EP
3655924 May 2020 EP
3371781 Sep 2020 EP
3707670 Sep 2020 EP
4183328 May 2023 EP
2384086 Jun 2012 ES
2570206 Mar 1986 FR
2458927 Nov 2012 GB
2544263 May 2017 GB
2544460 May 2017 GB
2544725 May 2017 GB
2545394 Jun 2017 GB
2557633 Jun 2018 GB
2557928 Jul 2018 GB
2559977 Aug 2018 GB
2559978 Aug 2018 GB
2011516849 May 2011 JP
5467404 Apr 2014 JP
293713 Sep 1997 NZ
588740 Jul 2012 NZ
2000003210 Jan 2000 WO
2000030337 May 2000 WO
2002001143 Jan 2002 WO
2002065069 Aug 2002 WO
2002093450 Nov 2002 WO
2004092874 Oct 2004 WO
2004095372 Nov 2004 WO
2005033620 Apr 2005 WO
2006078902 Jul 2006 WO
2007029038 Mar 2007 WO
2007043899 Apr 2007 WO
2007059780 May 2007 WO
2008033010 Mar 2008 WO
2008039539 Apr 2008 WO
2008048424 Apr 2008 WO
2008057056 May 2008 WO
2008071414 Jun 2008 WO
2008080385 Jul 2008 WO
2009046218 Apr 2009 WO
2009122200 Oct 2009 WO
2010048960 May 2010 WO
2012146720 Nov 2012 WO
2016069463 May 2016 WO
2016199134 Dec 2016 WO
2017077276 May 2017 WO
2017077277 May 2017 WO
2017077279 May 2017 WO
2017089826 Jun 2017 WO
2018109453 Jun 2018 WO
2018109479 Jun 2018 WO
2018154271 Aug 2018 WO
2018154272 Aug 2018 WO
2018185560 Oct 2018 WO
2019239106 Dec 2019 WO
2019239147 Dec 2019 WO
2020014779 Jan 2020 WO
2020141346 Jul 2020 WO
2020251938 Dec 2020 WO
Non-Patent Literature Citations (167)
Non-Final Office Action mailed Oct. 24, 2022, U.S. Appl. No. 17/398,883, 18 pages.
Afromowitz, et al., “Multispectral Imaging of Burn Wounds: A New Clinical Instrument for Evaluating Burn Depth”, IEEE Transactions on Biomedical Engineering, vol. 35, No. 10, pp. 842-850; Oct. 1988.
Ahn et al., “Advances in Wound Photography and Assessment Methods,” Advances in Skin & Wound Care, Feb. 2008, pp. 85-93.
Ahroni, JH et al., “Reliability of computerized wound surface area determinations” Wounds: A Compendium of Clinical Research and Practice, No. 4, (1992) 133-137.
Anderson, R., et al. “The Optics of Human Skin”, The Journal of Investigative Dermatology, vol. 77, No. 1, pp. 13-19; Jul. 1981.
Armstrong, DG et al “Diabetic foot ulcers: prevention, diagnosis and classification” Am Fam Physician Mar. 15, 1998: 57 (6) :1325-32, 1337-8.
Bale, S, Harding K, Leaper D. An Introduction to Wounds. Emap Healthcare Ltd 2000.
Beaumont, E et al “RN Technology Scorecard: Wound Care Science at the Crossroads” American Journal of Nursing Dec. 1998 98(12):16-18, 20-21.
Bergstrom, N, Bennett MA, Carlson CE. Treatment of Pressure Ulcers: Clinical Practice Guideline No. 15. Rockville, MD: U.S. Department of Health and Human Services. Public Health Service, Agency for Health Care Policy and Research 1994: 95-0652: [O].
Berriss 1997: Automatic Quantitative Analysis of Healing Skin Wounds using Colour Digital Image Processing: William Paul Berriss, Stephen John Sangwine [E].
Binder, et al., “Application of an artificial neural network in epiluminescence microscopy pattern analysis of pigmented skin lesions: a pilot study”, British Journal of Dermatology 130; pp. 460-465; 1994.
Bland, JM et al “Measurement error and correlation coefficients” BMJ Jul. 6, 1996: 313 (7048) :41-2.
Bland, JM et al “Measurement error” BMJ Jun. 29, 1996; 312 (7047) :1654.
Bohannon Richard; Barbara A Pfaller Documentation of Wound Surface Area from Tracings of Wound Perimeters [E].
Bolton, L., “Re Measuring Wound Length, Width, and Area: Which Technique?” Letters, Advances in Skin & Wound Care, pp. 450-452, vol. 21, No. 10.
Bostock, et al, Toward a neural network based system for skin cancer diagnosis; IEEE Conference on Artificial neural Networks, ISBN: 0-85296-573-7, pp. 215-219, May 1993.
BPG2005: Assessment and Management of Foot Ulcers for People with Diabetes: Nursing Best Practice Guidelines, Toronto, Ontario [E], Mar. 2013.
Briers, J.D., “Laser speckle contrast imaging for measuring blood flow,” Optica Applicata, 2007, pp. 139-152, vol. XXXVII, No. 1-2.
Briggs Corporation: Managed care making photo documentation a wound care standard. Wound care solutions product catalog 1997.
Brown, G “Reporting outcomes for Stage IV pressure ulcer healing; a proposal” Adv Skin Wound Care (2000)13:277-83.
Callieri 2003: Callieri M, Cignoni P, Pingi P, Scopigno R. Derma: Monitoring the evolution of skin lesions with a 3D system, VMV 2003. 8th International Fall Workshop, Vision, Modeling, and Visualization 2003, Nov. 19-21, 2003, Munich, Germany [E].
Campana: XML-based synchronization of mobile medical devices [E], 2002, 2 Pages.
Cardinal et al., “Early healing rates and wound area measurements are reliable predictors of later complete wound closure,” Wound Rep. Reg., 2008, pp. 19-22, vol. 16.
Cardinal et al., “Wound shape geometry measurements correlate to eventual wound healing,” Wound Rep. Reg., 2009, pp. 173-178, vol. 17.
Cascinelli, N., et al. “Results obtained by using a computerized image analysis system designed as an aid to diagnosis of cutaneous melanoma”, Melanoma Research, vol. 2, pp. 163-170, 1992.
Cleator et al., “Mobile wound care: Transforming care through technology,” Rehab & Community Care Medicine, Winter 2008, pp. 14-15.
Collins, C et al “The Role of Ultrasound in Lower Extremity Wound Management” International Journal of Lower Extremity Wounds (2002) 1: 229-235.
Daubechies, I., “The Wavelet Transform, Time-Frequency Localization and Signal Analysis”, IEEE Trans Inform Theory, vol. 36, No. 5, pp. 961-1005; Sep. 1990.
De Vet, HC et al “Current challenges in clinimetrics” J Clin Epidemiol Dec 2003; 56 (12) :1137-41.
De Vet, H C., et al., "When to use agreement versus reliability measures", J Clin Epidemiol 59 (10), (Oct. 2006), 1033-9.
Debray, M., Couturier P, Greuillet F, Hohn C, Banerjee S, Gavazzi G, Franco A. “A preliminary study of the feasibility of wound telecare for the elderly.” Journal of Telemedicine & Telecare 2001: 7(6): 353-8. [A].
Dowsett, C. et al., “Triangle of Wound Assessment—made easy”, Wounds Asia (www.woundasia.com), May 2015.
Duckworth et al., “A Clinically Affordable Non-Contact Wound Measurement Device,” 2007, pp. 1-3.
Duff, et al. (2003), Loftus Hills A, Morrell C 2000 Clinical Guidelines for the management of venous leg ulcers: Implementation Guide. Royal College of Nursing; 2000: 001 (213): 1-48. [E].
Ercal, F., "Detection of Skin Tumor Boundaries in Color Images", IEEE Transactions on Medical Imaging, vol. 12, No. 3, pp. 624-627, Sep. 1993.
Ercal, F., et al. "Neural Network Diagnosis of Malignant Melanoma From Color Images", IEEE Transactions on Biomedical Engineering, vol. 41, No. 9, pp. 837-845, Sep. 1994.
Ferrell, B “Pressure ulcers. Assessment of healing” Clin Geriatr Med (1997)13:575-87.
Fette, A.M., “A clinimetric analysis of wound measurement tools,” World Wide Wounds, 2006, [retrieved on Jul. 26, 2006]. Retrieved from the Internet: <URL: http://www.worldwidewounds.com/2006/January/Fette/Clinimetric-Ana . . . >, 6 pages.
Fitzpatrick et al., “Evaluating patient-based outcome measures for use in clinical trials,” Health Technology Assessment, 1998, vol. 2, No. 14, 86 pages.
Flahr et al., “Clinimetrics and Wound Science,” Wound Care Canada, 2005, pp. 18-19, 48, vol. 3, No. 2.
Flanagan, M. “Improving accuracy of wound measurement in clinical practice” Ostomy Wound Manage Oct. 2003, 49(10):28-40.
Flanagan, M., "Wound measurement: can it help us to monitor progression to healing?" J Wound Care May 2003, 12(5): 189-94.
Gethin et al., “Wound Measurement: the contribution to practice,” EWMA Journal, 2007, pp. 26-28, vol. 7, No. 1.
Gilman, T “Wound outcomes: the utility of surface measures” Int J Low Extrem Wounds Sep. 2004; 3 (3) :125-32.
Goldman, RJ “The patientcom, 1 year later” Adv Skin Wound Care Nov.-Dec. 2002; 15 (6) :254, 256.
Goldman, RJ et al “More than one way to measure a wound: An overview of tools and techniques” Adv Skin Wound Care (2002) 15:236-45.
Golston, et al. “Automatic Detection of Irregular Borders in Melanoma and Other Skin Tumors”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 199-203, 1992.
Graaf, R., et al. “Optical properties of human dermis in vitro and in vivo”, Applied Optics, vol. 32, No. 4, pp. 435-447, Feb. 1, 1993.
Greene, A., “Computer image analysis in the diagnosis of melanoma”, Journal of the American Academy of Dermatology; vol. 31, No. 6, pp. 958-964, 1994.
Griffin, JW et al “A comparison of photographic and transparency-based methods for measuring wound surface area” Phys Ther Feb. 1993; 73 (2) :117-22.
Haghpanah et al., “Reliability of Electronic Versus Manual Wound Measurement Techniques,” Arch Phys Med Rehabil, Oct. 2006, pp. 1396-1402, vol. 87.
Hansen 1997: Wound Status Evaluation Using Color Image Processing: Gary L. Hansen, Ephraim M. Sparrow, Jaydeep Y. Kokate, Keith J. Leland, and Paul A. Iaizzo [E].
Hayes 2003:Hayes S, Dodds, S. Digital photography in wound care. Nursing Times 2003:9(42):48-9. [A].
Herbin, et al, Color Quantitation Through Image Processing in Dermatology; IEEE Transaction on Medical Imaging, vol. 9, Issue 3, pp. 262-269, Sep. 1990.
Hibbs, P “The economics of pressure ulcer prevention” Decubitus Aug. 1988; 1 (3) :32-8.
Houghton 2000: Houghton PE, Kincaid CB, Campbell KE, Woodbury MG, Keast DH. Photographic assessment of the appearance of chronic pressure and leg ulcers. Ostomy Wound management 2000: 46(4): 20-6, 28-30. [A].
HSA Global, “Mobile Wound Care”, Marketing material (2009).
Huang, C., et al.“Border irregularity: atypical moles versus melanoma”, Eur J Dermatol, vol. 6, pp. 270-273, Jun. 1996.
Iakovou, D. et al., “Integrated sensors for robotic laser welding,” Proceedings of the Third International WLT-Conference on Lasers in Manufacturing, Jun. 2005, pp. 1-6.
International Search Report and Written Opinion for International Application No. PCT/US2004/028445 filed Sep. 1, 2004.
International Search Report and Written Opinion mailed Jan. 23, 2019, International Application No. PCT/IB2018/000447, 20 pages.
International Search Report and Written Opinion mailed Jul. 2, 2019, International Application No. PCT/IB2018/001572, 17 pages.
International Search Report and Written Opinion mailed Mar. 1, 2007, International Application No. PCT/NZ2006/000262, 12 pages.
Johnson, JD (1995) Using ulcer surface area and volume to document wound size, J Am Podiatr Med Assoc 85(2), (Feb. 1995), 91-5.
Jones, et al, An Instrument to Measure the Dimension of Skin Wounds; IEEE Transaction on Biomedical Engineering, ISSN: 0018-9294; vol. 42, Issue 5, pp. 464-470, May 1995.
Jones, TD "Improving the Precision of Leg Ulcer Area Measurement with Active Contour Models", PhD Thesis (1999) http://www.comp.glam.ac.uk/pages/staff/tjones/ThesisOL/Title.htm.
Jones, TD et al “An active contour model for measuring the area of leg ulcers” IEEE Trans Med Imaging Dec. 2000, 19(12):1202-10.
Kecelj-Leskovec et al., “Measurement of venous leg ulcers with a laser-based three-dimensional method: Comparison to computer planimetry with photography,” Wound Rep Reg, 2007, pp. 767-771, vol. 15.
Kenet, R., et al. “Clinical Diagnosis of Pigmented Lesions Using Digital Epiluminescence Microscopy”, Arch Dermatol, vol. 129, pp. 157-174; Feb. 1993.
Khashram et al., "Effect of TNP on the microbiology of venous leg ulcers: a pilot study," J Wound Care, Apr. 2009, pp. 164-167, vol. 18, No. 4.
Kloth, LC et al “A Randomized Controlled Clinical Trial to Evaluate the Effects of Noncontact Normothermic Wound Therapy on Chronic Full-thickness Pressure Ulcers” Advances in Skin & Wound Care Nov./Dec. 2002, 15(6):270-276.
Korber et al., “Three-dimensional documentation of wound healing: First results of a new objective method for measurement,” JDDG, Oct. 2006, (Band 4), pp. 848-854.
Koren, et al, Interactive Wavelet Processing and Techniques Applied to Digital Mammography; IEEE Conference Proceedings, ISBN: 0-7803-3192-3; vol. 3, pp. 1415-1418, May 1996.
Kovesi, P., “Image Features From Phase Congruency”, University of Western Australia, pp. 1-30; Technical Report 9/4, Revised Jun. 1995.
Krouskop, TA et al “A noncontact wound measurement system” J Rehabil Res Dev May-Jun. 2002, 39(3):337-45.
Kumar et al., “Wound Image Analysis Classifier For Efficient Tracking Of Wound Healing Status,” Signal & Image Processing: An International Journal (SIPIJ), vol. 5, No. 2, Apr. 2014, pp. 15-27.
Kundin 1989: Kundin JI. A new way to size up a wound. American Journal of Nursing 1989: (2):206-7.
Lakovou, D. et al., "Integrated sensors for robotic laser welding," Proceedings of the Third International WLT-Conference on Lasers in Manufacturing, Jun. 2005, pp. 1-6.
Langemo et al., "Measuring Wound Length, Width, and Area: Which Technique?", Advances in Skin & Wound Care, Jan. 2008, pp. 42-45, vol. 21, No. 1.
Langemo, DK et al “Comparison of 2 Wound Volume Measurement Methods” Advances in Skin & Wound Care Jul./Aug. 2001, vol. 14(4), 190-196.
Langemo, DK et al “Two-dimensional wound measurement: comparison of 4 techniques” Advances in Wound Care Nov.-Dec. 1998, 11(7):337-43.
Laughton, C et al “A comparison of four methods of obtaining a negative impression of the foot” J Am Podiatr Med Assoc May 2002; 92 (5) :261-8.
Lee, et al, A Multi-stage Segmentation Method for Images of Skin Lesions; IEEE Conference Proceedings on Communication, Computers, and Signal Processing, ISBN 0-7803-2553-2, pp. 602-605, May 1995.
Levoy, et al. “The Digital Michelangelo Project: 3D Scanning Of Large Statues,” ACM, 2000.
Lewis 1997: Lewis P, McCann R, Hidalgo P, Gorman M. Use of store and forward technology for vascular nursing teleconsultation service. Journal of Vascular Nursing 1997. 15(4): 116-23. [A].
Lewis, JS, Achilefu S, Garbow JR, Laforest R, Welch MJ., Small animal imaging: current technology and perspectives for oncological imaging, Radiation Sciences, Washington University School of Medicine, Saint Louis, MO, USA, Eur J Cancer. Nov. 2002;38(16):2173-88.
Li, D. 2004, Database design and implementation for wound measurement system. Biophotonics, 2004: 42-43. [E].
Liu et al., "Wound measurement by curvature maps: a feasibility study," Physiol. Meas., 2006, pp. 1107-1123, vol. 27.
Lorimer, K “Continuity through best practice: design and implementation of a nurse-led community leg-ulcer service” Can J Nurs Res Jun. 2004, 36(2): 105-12.
Lowery et al., “Technical Overview of a Web-based Telemedicine System for Wound Assessment,” Advances in Skin & Wound Care, Jul./Aug. 2002, pp. 165-169, vol. 15, No. 4.
Lowson, S., "The safe practitioner: Getting the record straight: the need for accurate documentation," J Wound Care, Dec. 2004, vol. 13, No. 10, [retrieved on Dec. 17, 2004]. Retrieved from the Internet: <URL: http://www.journalofwoundcare.com/nav?page=jowc.article&resource=1455125>, 2 pages.
Lucas, C., "Pressure ulcer surface area measurement using instant full-scale photography and transparency tracings," Advances in Skin & Wound Care, Jan./Feb. 2002, [retrieved on Jul. 28, 2006]. Retrieved from the Internet: <URL: http://www.findarticles.com/p/articles/mi_qa3977/is_200201/ai_n904 . . . >, 7 pages.
Lunt, M.J., “Review of duplex and colour Doppler imaging of lower-limb arteries and veins,” World Wide Wounds, 2000, [retrieved on Apr. 17, 2005]. Retrieved from the Internet: <URL: http://www.worldwidewounds.com/2000/sept/Michael-Lunt/Dopple . . . >, 6 pages.
Maglogiannis et al., “A system for the acquisition of reproducible digital skin lesions images,” Technol and Health Care, 2003, pp. 425-441, vol. 11.
Malian et al., “MEDPHOS: A New Photogrammetric System for Medical Measurement,” 2004, Commission V, WG V/3, 6 pages.
Mallat, S., et al. "Characterization of signals from multiscale edges", IEEE Trans Pattern Anal and Machine Intell; 14:710-732; 1992.
Marchesini, R., et al. “In vivo Spectrophotometric Evaluation of Neoplastic and Non-Neoplastic Skin Pigmented Lesions. III. CCD Camera-Based Reflectance Imaging”, Photochemistry and Photobiology, vol. 62, No. 1, pp. 151-154; 1995.
Marjanovic et al., “Measurement of the volume of a leg ulcer using a laser scanner,” Physiol. Meas., 1998, pp. 535-543, vol. 19.
Mashburn et al., “Enabling user-guided segmentation and tracking of surface-labeled cells in time-lapse image sets of living tissues,” Cytometry A., NIH Public Access, May 1, 2013, pp. 1-17.
Mastronicola et al., "Burn Depth Assessment Using a Tri-stimulus Colorimeter," Wounds, ISSN: 1044-7946, Sep. 2005, pp. 255-258, vol. 17, No. 9.
McCardle, J., "Visitrak: wound measurement as an aid to making treatment decisions," The Diabetic Foot, Winter 2005, [retrieved on Mar. 30, 2008]. Retrieved from the Internet: <URL: http://findarticles.com/p/articles/mi_mOMDQ/is_4_8/ai_n16043804/print>, 4 pages.
Menzies, S., “The Morphologic Criteria of the Pseudopod in Surface Microscopy”, Arch Dermatol, vol. 131, pp. 436-440, Apr. 1995.
Molnar et al., “Use of Standardized, Quantitative Digital Photography in a Multicenter Web-based Study,” 2009, ePlasty, pp. 19-26, vol. 9.
Nachbar, et al., "The ABCD rule of dermatoscopy", Journal of the American Academy of Dermatology, vol. 30, No. 4, pp. 551-559, Apr. 1994.
National Pressure Ulcer Advisory Panel, “FAQ: Photography for pressure ulcer documentation,” 1 1P56, 4 pages.
National Pressure Ulcer Advisory Panel, Position Statement, 1998, [retrieved on Jan. 6, 2005]. Retrieved from the Internet: <URL: http://www.npuap.org/>, 2 pages (Pressure Ulcer Healing Chart attached, 2 pages).
Oduncu et al., “Analysis of Skin Wound Images Using Digital Color Image Processing: A Preliminary Communication,” Lower Extremity Wounds, 2004, pp. 151-156, vol. 3, No. 3.
Pages, Jordi, et al., “Plane-to-plane positioning from image-based visual serving and structured light,” Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 28-Oct. 2, 2004, pp. 1004-1009.
Patete et al., “A non-invasive, three-dimensional, diagnostic laser imaging system for accurate wound analysis,” Physiol. Meas., 1996, pp. 71-79, vol. 17.
Payne, C., “Cost benefit comparison of plaster casts and optical scans of the foot for the manufacture of foot orthoses,” AJPM, 2007, pp. 29-31, vol. 41, No. 2.
Pehamberger, H., et al. “In vivo epiluminescence microscopy of pigmented skin lesions. I. Pattern analysis of pigmented skin lesions”, Journal of American Academy of Dermatology, vol. 17, No. 4, pp. 571-583, Oct. 1987.
Plassman, et al. "Problems of Assessing Wound Size," Wound Healing Research Unit, University of Wales College of Medicine, Cardiff CF4 4XN, Wales, UK (1993) (Unpublished).
Plassmann et al., “MAVIS: a non-invasive instrument to measure area and volume of wounds,” Medical Engineering & Physics, 1998, pp. 332-338, vol. 20.
Plassmann, P., “Recording Wounds—Documenting Woundcare,” Medical Computing Group, 1998, pp. 1-31.
Plaza et al., “Minimizing Manual Image Segmentation Turn-Around Time for Neuronal Reconstruction by Embracing Uncertainty,” Plos One, vol. 7, Issue 9, Sep. 2012, pp. 1-14.
Rogers et al., “Measuring Wounds: Which Stick to Use?”, Podiatry Management, Aug. 2008, pp. 85-90.
Romanelli et al., "Technological Advances in Wound Bed Measurements," Wounds, 2002, pp. 58-66, vol. 14, No. 2, [retrieved on Apr. 8, 2005]. Retrieved from the Internet: <URL: http://www.medscape.com/viewarticle/430900_print>, 8 pages.
Russell, L., “The importance of wound documentation & classification,” British J Nursing, 1999, pp. 1342-1354, vol. 8, No. 20.
Salcido, R., “The Future of Wound Measurement,” Advances in Skin & Wound Care, Mar./Apr. 2003, pp. 54, 56, vol. 13, No. 2.
Salcido, R., "Pressure Ulcers and Wound Care," Physical Medicine and Rehabilitation, eMedicine, 2006, [retrieved on]. Retrieved from the Internet: <URL: http://www.emedicine.com/pmr/topic179.htm>, 25 pages.
Salmhofer, et al., “Wound teleconsultation in patients with chronic leg ulcers,” 2005.
Sani-Kick et al., “Recording and Transmission of Digital Wound Images with the Help of a Mobile Device,” 2002, 2 pages.
Santamaria et al., “The effectiveness of digital imaging and remote expert wound consultation on healing rates in chronic lower leg ulcers in the Kimberley region of Western Australia,” Primary Intention, May 2004, pp. 62-70, vol. 12, No. 2.
Schindewolf, et al. “Comparison of classification rates for conventional and dermatoscopic images of malignant and benign melanocytic lesions using computerized colour image analysis”, Eur J Dermatol, vol. 3, No. 4, pp. 299-303, May 1993.
Schindewolf, T., et al. “Classification of Melanocytic Lesions with Color and Texture Analysis Using Digital Image Processing”, The International Academy of Cytology, Analytical and Quantitative Cytology and Histology, vol. 15, No. 1, pp. 1-11, Feb. 1993.
Schindewolf, T., et al. “Evaluation of different image acquisition techniques for a computer vision system in the diagnosis of malignant melanoma”, Journal of the American Academy of Dermatology, vol. 31, No. 1, pp. 33-41, Jul. 1994.
Schultz et al., "Wound bed preparation: a systematic approach to wound management," Wound Repair and Regeneration, Mar./Apr. 2003, pp. S1-S28, vol. 11, No. 2, Supplement.
Shaw et al., "An Evaluation of Three Wound Measurement Techniques in Diabetic Foot Wounds," Diabetes Care, 2007, [retrieved on Mar. 30, 2008]. Retrieved from the Internet: <URL: http://care.diabetesjournals.org/cgi/content/full/30/10/2641?ck=nck>, 5 pages.
Sheehan et al., “Percent Change in Wound Area of Diabetic Foot Ulcers Over a 4-Week Period Is a Robust Predictor of Complete Healing in a 12-Week Prospective Trial,” Diabetes Care, Jun. 2003, pp. 1879-1882, vol. 26, No. 6.
Sheng, Chao, Brian W. Pogue, Hamid Dehghani, Julia A. O'Hara, P. J. Hoopes, Numerical light dosimetry in murine tissue: analysis of tumor curvature and angle of incidence effects upon fluence in the tissue, Proc. SPIE, vol. 4952, 39 (2003), DOI:10.1117/12.474081, Online Publication Date: Jul. 28, 2003.
Smith & Nephew, “Leg ulcer guidelines: a pocket guide for practice,” National Guideline Clearinghouse, U.S. Dept of Health & Human Services, 2002, [retrieved on Jan. 10, 2012]. Retrieved from the Internet: <URL: http://guidelines.gov/content.aspx?id=9830&search=Pressure+Ulcer>, 17 pages.
Smith & Nephew, "Visitrak Wound Measurement Device," Wound Management, [retrieved on Apr. 7, 2005]. Retrieved from the Internet: <URL: http://wound.smith-nephew.com/us/node.asp?NodeId=3120>, 7 pages.
Smith & Nephew, “Guidelines for the Management of Leg Ulcers in Ireland” www.smith-nephew.com.
Smith et al., “Three-Dimensional Laser Imaging System for Measuring Wound Geometry,” Lasers in Surgery and Medicine, 1998, pp. 87-93, vol. 23.
Sober, et al., “Computerized Digital Image Analysis: An Aid for Melanoma Diagnosis”, The Journal of Dermatology, vol. 21, pp. 885-890, 1994.
Solomon et al., "The use of video image analysis for the measurement of venous ulcers," British J Dermatology, 1995, pp. 565-570, vol. 133.
Steiner, A., “In vivo epiluminescence microscopy of pigmented skin lesions. II. Diagnosis of small pigmented skin lesions and early detection of malignant melanoma”, Journal of the American Academy of Dermatology, vol. 17, No. 4, pp. 584-591; Oct. 1987.
Stoecker, et al. “Automatic Detection of Asymmetry in Skin Tumors”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 191-197, 1992.
Takiwaki, et al., “A rudimentary system for automatic discrimination among basic skin lesions on the basis of color analysis of video images”, Journal of the American Academy of Dermatology, vol. 32, No. 4, pp. 600-604, Apr. 1995.
Tellez, R., “Managed Care Making Photo Documentation a Wound Care Standard,” Wound Care, 1997, [retrieved on Aug. 29, 2005]. Retrieved from the Internet: <URL: http://woundcare.org/newsvol2n4/art.htm>, 2 pages.
Thali, M.J., et al. “Optical 3D surface digitizing in forensic medicine: 3D documentation of skin and bone injuries.” Forensic Science International. 2003.
Thawer et al., "A Comparison of Computer-Assisted and Manual Wound Size Measurement," Ostomy Wound Management, Oct. 2002, pp. 46-53, vol. 48, No. 10.
Treuillet et al., “Three-Dimensional Assessment of Skin Wounds Using a Standard Digital Camera,” IEEE Transactions on Medical Imaging, May 2009, pp. 752-762, vol. 28, No. 5.
Umbaugh et al., “Automatic Color Segmentation Algorithms with Application to Skin Tumor Feature Identification”, IEEE Engineering in Medicine and Biology, pp. 75-82, Sep. 1993.
Umbaugh, et al., “An Automatic Color Segmentation Algorithm with Application to Identification of Skin Tumor Borders”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 227-235, May-Jun. 1992.
Umbaugh, et al., “Automatic Color Segmentation of Images with Application to Detection of Variegated Coloring in Skin Tumors”, IEEE Engineering in Medicine and Biology Magazine, Dec. 1989, pp. 43-52.
Van Zuijlen et al., "Reliability and Accuracy of Practical Techniques for Surface Area Measurements of Wounds and Scars," Lower Extremity Wounds, 2004, pp. 7-11, vol. 3, No. 1.
Vermolen et al., “A simplified model for growth factor induced healing of circular wounds,” 2005, pp. 1-15.
Voigt, H., et al. “Topodermatographic Image Analysis for Melanoma Screening and the Quantitative Assessment of Tumor Dimension Parameters of the Skin”, Cancer, vol. 75, No. 4, Feb. 15, 1995.
Walker, N, Rogers A, Birchall N, Norton R, MacMahon S. Leg ulcers in New Zealand: age at onset, recurrence and provision of care in an urban population. NZ Med J; 2002; 115(1156):286-9.
Walker, N, Vandal A, Holden K, Rogers A, Birchall N, Norton R, Triggs C, MacMahon S. Does capture-recapture analysis provide more reliable estimates of the incidence and prevalence of leg ulcers in the community? Aust NZJ Public Health 2002; 26(5):451-5.
Walker, N., Rodgers A, Birchall N, Norton R, MacMahon S. The occurrence of leg ulcers in Auckland: results of a population-based study. NZ Med J; 2002: 115(1151): 159-162.
Wallenstein et al., “Statistical analysis of wound-healing rates for pressure ulcers,” Amer J Surgery, Jul. 2004 (Supplement), pp. 73S-78S, vol. 188.
Wang et al., "A comparison of digital planimetry and transparency tracing based methods for measuring diabetic cutaneous ulcer surface area," Zhongguo Xiu Fu Chong Jian Wai Ke Za Zhi, May 2008, pp. 563-566, vol. 22, No. 5, [retrieved on Sep. 15, 2009]. Retrieved from the Internet: <URL: http://www.ncbi.nlm.nih.gov/pubmed/18630436?ordinalpos=1&itool=E . . . >, 1 page.
Wendelken et al., “Key Insights On Mapping Wounds With Ultrasound,” Podiatry Today, Jul. 2008, [retrieved on Jul. 14, 2008]. Retrieved from the Internet: <URL: http://www.podiatrytoday.com/article/5831>, 5 pages.
Wilbright, W.A., The Use of Telemedicine in the Management of Diabetes-Related Foot Ulceration: A Pilot Study, Advances in Skin & Wound Care, Jun. 2004, [retrieved on Jul. 28, 2006]. Retrieved from the Internet: <URL: http://www.findarticles.com/p/articles/mi_qa3977/is_200406/ai_n942 . . . >, 6 pages.
Wild et al., “Wound healing analysis and measurement by means of colour segmentation,” ETRS Poster Presentation V28, Sep. 15, 2005, V28-I 7, 1 page.
Williams, C., “The Verge Videometer wound measurement package,” British J Nursing, Feb./Mar. 2000, pp. 237-239, vol. 9, No. 4.
Woodbury et al., Pressure ulcer assessment instruments: a critical appraisal, Ostomy Wound Management, May 1999, pp. 48-50, 53-55, vol. 45, No. 5, [retrieved on Dec. 8, 2005]. Retrieved from the Internet: <URL: http://gateway.ut.ovid.com.ezproxy.otago.ac.nz/gw2/ovidweb.cgi>, 2 pages.
Woodbury, M.G., “Development, Validity, Reliability, and Responsiveness of a New Leg Ulcer Measurement Tool,” Advances in Skin & Wound Care, May 2004, [retrieved on Jul. 28, 2006]. Retrieved from the Internet.
Zhao, et al., The Classification of the Depth of Burn Injury Using Hybrid Neural Network; IEEE Conference on Engineering in Medicine and Biology Society, ISBN 0-7803-2475-7; vol. 1, pp. 815-816, Sep. 1995.
Zimmet, “Venous Leg Ulcers: Evaluation and Management,” American College of Phlebology. 1998.
Notice of Allowance mailed Feb. 24, 2021, U.S. Appl. No. 15/816,862, 18 pages.
Non-Final Office Action mailed Nov. 24, 2021, U.S. Appl. No. 16/500,785, 34 pages.
Extended European Search Report for European Application No. EP22204772.2 filed Apr. 3, 2018, mailed Apr. 13, 2023, 6 pages.
Final Office Action mailed Jul. 13, 2023, U.S. Appl. No. 17/398,883, 22 pages.
Non-Final Office Action mailed Mar. 29, 2023, U.S. Appl. No. 17/100,615, 10 pages.
Related Publications (1)
Number Date Country
20220215538 A1 Jul 2022 US
Provisional Applications (1)
Number Date Country
62850377 May 2019 US