Automatically assessing an anatomical surface feature and securely managing information related to the same

Information

  • Patent Grant
  • Patent Number
    11,923,073
  • Date Filed
    Monday, January 3, 2022
  • Date Issued
    Tuesday, March 5, 2024
Abstract
A facility for procuring and analyzing information about an anatomical surface feature from a caregiver that is usable to generate an assessment of the surface feature is described. The facility displays information about the surface feature used in the assessment of the surface feature. The facility obtains user input and/or data generated by an image capture device to assess the surface feature or update an existing assessment of the surface feature.
Description
TECHNICAL FIELD

The present technology is generally related to devices, systems, and methods for assessing anatomical surface features and securely managing information related to the same.


BACKGROUND

Various techniques have been used to monitor anatomical surface features, such as wounds, ulcers, sores, lesions, tumors, etc. (herein referred to collectively as “surface features”) both within hospitals and outside hospitals (e.g., domiciliary-based care, primary care facilities, hospice and palliative care facilities, etc.). Wounds, for example, are typically concave and up to about 250 millimeters across. Manual techniques are often labor-intensive and require examination and contact by skilled personnel. Such measurements may be inaccurate, and there may be significant variation between measurements made by different personnel. Further, these approaches may not preserve any visual record for review by an expert or for subsequent comparison. Accordingly, there is a need for improved systems for assessing surface features.





BRIEF DESCRIPTION OF THE DRAWINGS

Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure.



FIG. 1 is a diagram showing some of the components typically incorporated in at least some of the computer systems and other devices on which the facility executes.



FIG. 2 is a flow diagram showing steps typically performed by the facility to automatically assess an anatomical surface feature of a human patient.



FIG. 3 is a display diagram showing a sample display typically presented by the facility to permit the user to enter a username and password to access the facility.



FIG. 4 is a display diagram showing a sample display typically presented by the facility to permit the user to select an existing patient profile and/or create a new patient profile.



FIG. 5A is a display diagram showing a sample display typically presented by the facility to display surface feature information for a selected patient and enable the user to capture additional images of the surface feature.



FIG. 5B is a display diagram showing a sample display typically presented by the facility to enable the user to automatically couple a particular capture device to the facility.



FIG. 6 is a display diagram showing a sample display typically presented by the facility to display one or more captured images and enable the user to assign a new image to a pre-existing wound.



FIG. 7 is a display diagram showing a sample display typically presented by the facility to display a captured image of a surface feature and enable the user to outline the perimeter of the surface feature within the captured image.



FIG. 8 is a display diagram showing a portion of the display shown in FIG. 7 after the facility has determined one or more measurements characterizing the surface feature and has displayed those measurements to the user.



FIG. 9 is a display diagram showing a sample display typically presented by the facility to present an updated surface feature assessment for a selected patient.



FIGS. 10A-10D are display diagrams showing sample displays typically presented by the facility to enable the user to provide notes characterizing the surface feature.



FIG. 11 is a display diagram showing a sample display typically presented by the facility when the user selects the create report button.



FIGS. 12A-12D are display diagrams showing a sample display typically presented by the facility to display a report characterizing the surface feature.





DETAILED DESCRIPTION

Overview


Described herein is a software facility for automatically assessing an anatomical surface feature (“the facility”), such as a wound, and for managing information related to assessed anatomical surface features across a range of patients and institutions. While the following discussion liberally employs the term “wound” to refer to the anatomical surface feature(s) being assessed, those skilled in the art will appreciate that the facility may be straightforwardly applied to anatomical surface features of other types, such as ulcers, sores, lesions, tumors, bruises, burns, moles, psoriasis, keloids, skin cancers, erythema, cellulitis, and the like. Similarly, a wide variety of users may use the facility, including doctors, nurses, technologists, or any other caregiver of the patient.


As used herein, the terms “computer” and “computing device” generally refer to devices that have a processor and non-transitory memory, as well as any data processor or any device capable of communicating with a network. Data processors include programmable general-purpose or special-purpose microprocessors, programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), system on chip (SOC) or system on module (SOM) (“SOC/SOM”), an ARM class CPU with embedded Linux or Android operating system or the like, or a combination of such devices. Computer-executable instructions may be stored in memory, such as random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such components. Computer-executable instructions may also be stored in one or more storage devices, such as magnetic or optical-based disks, flash memory devices, or any other type of non-volatile storage medium or non-transitory medium for data. Computer-executable instructions may include one or more program modules, which include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types.


Anatomical Surface Feature Assessment



FIG. 1 is a block diagram showing a sample environment having multiple components in which the facility executes. The environment 100 may include one or more capture devices 102, one or more personal computing devices 104, one or more server computers 106, and one or more persistent storage devices 108. The capture device 102 and the personal computing device 104 communicate (wirelessly or through a wired connection) with the server computer 106 through a network 140 such as, for example, a Local Area Network (LAN), a Wide Area Network (WAN), and/or the Internet. In the embodiment shown in FIG. 1, the capture device 102 does not communicate directly with the personal computing device 104. For example, the capture device 102 may communicate wirelessly with a first base station or access point 142 using a wireless mobile telephone standard, such as the Global System for Mobile Communication (GSM), or another wireless standard, such as IEEE 802.11, and the first base station or access point 142 communicates with the server computer 106 via the network 140. Likewise, the computing device 104 may communicate wirelessly with a second base station or access point 144 using a wireless mobile telephone standard, such as the Global System for Mobile Communication (GSM), or another wireless standard, such as IEEE 802.11, and the second base station or access point 144 communicates with the server computer 106 via the network 140. As such, confidential patient data generated by the capture device 102 is only temporarily stored locally, or not at all, and instead is permanently stored at the storage device 108 associated with the server computer 106. The facility can be practiced on any of the computing devices disclosed herein (e.g., one or more personal computing devices 104, one or more server computers 106, etc.), and may include an interface module that generates graphical user interfaces (GUIs) to allow users to access the facility (as described in greater detail below with reference to FIGS. 3-12D).
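
To make the data flow concrete, the following is a minimal sketch (not the patented implementation) of a capture device posting wound data directly to the server computer over the network, bypassing the personal computing device; the endpoint URL and payload fields are illustrative assumptions.

```python
import requests

SERVER_URL = "https://server.example.com/api/captures"  # hypothetical endpoint

def upload_capture(image_bytes: bytes, depth_bytes: bytes, patient_id: str) -> None:
    """Send one capture straight to the server; nothing is written to local disk."""
    response = requests.post(
        SERVER_URL,
        files={
            "image": ("image.png", image_bytes, "image/png"),
            "depth": ("depth.bin", depth_bytes, "application/octet-stream"),
        },
        data={"patient_id": patient_id},
        timeout=10,
    )
    response.raise_for_status()  # surface transport errors immediately
```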


The personal computing device 104 can include one or more portable computing devices 120 (e.g., a smart phone, a laptop, a tablet, etc.) and/or one or more desktop computing devices 122. During data capture with the capture device 102 at the point-of-care, the personal computing device 104 may also be present (i.e., in the same treatment room), or the personal computing device 104 may be remote (i.e., outside of the treatment room but in the same treatment facility, outside of the treatment room and remote from the treatment facility, etc.). The desktop computing devices 122, if utilized, are typically associated with a particular property, e.g., a medical treatment center 124 (e.g., a hospital, a doctor's office, a clinic, etc.). The portable computing devices 120 and desktop computing devices 122 communicate with each other and the server computer 106 through networks including, for example, the Internet. In some instances the portable computing devices 120 and desktop computing devices 122 may communicate with each other through other wireless protocols, such as near field communication or Bluetooth.


The capture device 102 is a handheld, portable imaging device that includes one or more sensing devices for generating data characterizing the wound (“wound data”) at the point-of-care. In the embodiment shown in FIG. 1, the capture device 102 includes an image sensor 110 (e.g., a digital camera), a depth sensor 112 (also known as a “range imager”), and a computing device 116 (shown schematically) in communication with the image sensor 110 and the depth sensor 112. The computing device 116 is also in wireless communication with the server computer 106 (e.g., via the network 140). The image sensor 110 is configured to generate image data of the wound (e.g., pixels containing RGB color data), and the depth sensor 112 is configured to generate depth data characterizing the depth or topography of the wound. For example, in some embodiments the depth sensor 112 is a structured light device configured to emit structured light (e.g., one or more lasers, DLP projectors, film projectors, etc. where the emitted light may be infra-red, visible, ultraviolet, etc.) in a predetermined arrangement toward the wound. In such embodiments, for example, the depth sensor 112 may comprise three laser elements (labeled 112a-112c) spaced apart around a circumference of the capture device 102. The laser elements 112a-112c have a fixed positional relationship with respect to one another, and also with respect to the image sensor 110. Together the laser elements 112a-112c can be configured to create a structured light pattern (e.g., a laser point(s), a laser fan(s), etc.). In other embodiments, the depth sensor 112 can include other suitable devices for range imaging, such as an ultrasonic sensor, a stereo camera, a plenoptic camera, a time-of-flight camera, etc.
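
As a schematic illustration of the underlying geometry rather than an account of the patented device, the pinhole-camera sketch below shows how a laser element at a fixed baseline from the image sensor can recover range by triangulation: the laser dot's pixel displacement from its zero-parallax position is inversely proportional to depth. The focal length and baseline values are invented.

```python
def triangulate_depth_mm(pixel_offset: float,
                         focal_length_px: float = 1400.0,  # assumed intrinsics
                         baseline_mm: float = 40.0) -> float:
    """Depth of a laser dot from its image-plane displacement (pinhole model)."""
    if pixel_offset <= 0:
        raise ValueError("dot must be displaced from its zero-parallax position")
    return focal_length_px * baseline_mm / pixel_offset

# e.g. a 200-pixel displacement with these parameters puts the surface at
# 280 mm, inside the 200-350 mm working range mentioned later on.
print(triangulate_depth_mm(200.0))  # -> 280.0
```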


The capture device 102 also includes a rechargeable power source and an actuator 118 (e.g., a button, a switch, etc.) for initiating data capture. When a user presses the button 118, the computing device 116 simultaneously activates both the image sensor 110 and the depth sensor 112 to generate both the image data and the depth data. The computing device 116 then communicates the image data and the depth data to the remote server computer 106 for further processing by the facility. In some embodiments, the computing device 116 wirelessly communicates with the server computer 106 (e.g., over a network). Such a cordless arrangement can be advantageous as it allows the user greater freedom of movement with the capture device 102, which can be especially beneficial when trying to access certain anatomical locations. Also, the absence of a cord reduces the surface area available at the point-of-care on which bacteria and/or other unwanted microorganisms may bind and travel. In some embodiments, the capture device 102 may be permanently cordless (i.e., no input port), and in other embodiments, the capture device 102 may be configured to detachably receive an electronic connector, such as a power cord or a USB cord. The computing device 116 may automatically transfer the captured data to the remote server computer 106 (e.g., over the network 140) at the moment the data is captured. In certain embodiments, however, the computing device 116 may not be in communication with the network 140; in such scenarios, the captured data may be temporarily stored in the volatile and/or non-volatile memory of the capture device 102 for later transfer to the server computer 106.
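
The store-and-forward behavior described above might be sketched as follows; the transport function and queue policy are assumptions for illustration, not the device's actual firmware. Captures are sent immediately when the network is reachable and held in device memory for later transfer otherwise.

```python
from collections import deque

class CaptureBuffer:
    def __init__(self, send_fn):
        self._send = send_fn     # e.g. upload_capture from the earlier sketch
        self._pending = deque()  # volatile, on-device queue

    def on_capture(self, data_set: dict, network_up: bool) -> None:
        """Transfer immediately if possible; otherwise hold for later."""
        if network_up:
            self._send(data_set)
        else:
            self._pending.append(data_set)

    def flush(self) -> None:
        """Drain the queue once connectivity to the server is restored."""
        while self._pending:
            self._send(self._pending.popleft())
```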


The capture device 102 may include additional features for enhancing data collection of the wound, such as one or more light sources 114 (e.g., a light emitting diode (LED), an incandescent light source, an ultraviolet light source, etc.) for illuminating the wound before or during data capture, an indicator (not shown) configured to provide a visual and/or audio signal (e.g., images, text, lights, etc.) to the user, a thermal camera, a video camera, and/or one or more input/output devices (e.g., a microphone, a speaker, a port for communicating electrically with external components, such as a power source, the personal computing device 104, etc.). In some embodiments, the capture device 102 is configured for wireless charging, e.g., via a dock or cradle (not shown). In such embodiments, the charging cradle may also serve as an access point for the network 140. As discussed in greater detail below with reference to FIGS. 5A-5B, the capture device 102 and/or image sensor 110 may also be configured to capture images of barcodes and/or QR codes displayed on the computing device 104, such as a barcode and/or a QR code that enable the capture device 102 to connect to the network 140.


Those skilled in the art will appreciate that the capture device 102 may have other configurations than that shown in FIG. 1. For example, although the image sensor 110, depth sensor 112, and computing device 116 are shown as part of a single component and/or within the same housing, in other embodiments, any or all of the image sensor 110, the depth sensor 112, and the computing device 116 can be separate components. Likewise, in some embodiments, the capture device 102 does not include separate image and depth sensors, and instead includes a stereo camera that is configured to generate both image data and depth data. Additional details regarding suitable capture devices 102 and methods of use can be found in U.S. Pat. No. 8,755,053, filed May 11, 2009 and U.S. Pat. No. 9,179,844, filed Nov. 27, 2012, both of which are incorporated herein by reference in their entireties.


As discussed above, the facility may include an interface module that generates graphical user interfaces (GUIs) to allow users to access the facility. The interface module also provides application programming interfaces (APIs) to enable communication and interfacing with the facility. APIs may be used by other applications, web portals, or distributed system components to access the system. For example, an application operating on a personal computing device may use an API to interface with system servers and receive capture data from the system. The API may utilize, for example, Representational State Transfer (REST) architecture and Simple Object Access Protocol (SOAP) protocols.
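
For example, a client application might retrieve capture data over a REST-style API along the following lines; the route, bearer-token authentication, and response shape are illustrative assumptions rather than the facility's documented interface.

```python
import requests

API_BASE = "https://server.example.com/api/v1"  # hypothetical base URL

def get_captures(patient_id: str, token: str) -> list[dict]:
    """Fetch the capture data sets recorded for one patient."""
    resp = requests.get(
        f"{API_BASE}/patients/{patient_id}/captures",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g. [{"capture_id": ..., "taken_at": ...}, ...]
```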



FIG. 2 is a flow diagram showing steps typically performed by the facility to assess a wound of a patient and/or manage data (including metadata) related to the wound. At step 202, the facility provides a display that solicits the user to enter login information (e.g., a username and password) to permit the user to access the facility and/or the storage device 108 (see, for example, FIG. 3). At step 204, the facility solicits the user to identify the patient having a wound needing assessment (see, for example, FIG. 4). If the patient is new to the database, the facility enables the user to create a new data structure (or profile) for the patient, and a new data structure (or profile) for the wound that is associated with the patient's profile. If the patient already exists in the database, then the facility displays a unique identifier for the patient or otherwise enables the user to access the already-existing patient profile. Once the facility has identified the patient, the facility solicits the user to identify the wound to be reviewed and/or assessed (see, for example, FIG. 5A). If the wound has not yet been entered into the patient profile, the facility enables the user to create a new data structure (or profile) for the wound and associates that data structure with the already-existing patient profile. If the wound already exists in the patient profile, then the facility displays a unique identifier for the particular wound or otherwise enables the user to access the already-existing wound profile. At some point after the facility has identified the corresponding patient and wound profile, the facility receives new image data, new depth data, and new outline data characterizing the wound and assigns the new data to the wound profile (steps 208-210). As shown at steps 212-214, the facility then analyzes this data to determine one or more wound measurements (such as wound area and wound volume), and displays the wound measurements to the user.
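
The patient and wound profiles created in this flow could be modeled as in the sketch below; the class and field names are hypothetical, chosen only to mirror the get-or-create behavior described above.

```python
from dataclasses import dataclass, field

@dataclass
class WoundProfile:
    wound_id: str                                  # e.g. anatomical label "A"
    data_sets: list = field(default_factory=list)  # image + depth + outline data

@dataclass
class PatientProfile:
    patient_id: str
    wounds: dict = field(default_factory=dict)     # wound_id -> WoundProfile

    def get_or_create_wound(self, wound_id: str) -> WoundProfile:
        """Reuse an existing wound profile or create a new one, as in FIG. 2."""
        return self.wounds.setdefault(wound_id, WoundProfile(wound_id))
```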


Those skilled in the art will appreciate that the steps shown in FIG. 2 may be altered in a variety of ways. For example, the order of the steps may be rearranged, substeps may be performed in parallel, shown steps may be omitted, other steps may be included, etc.



FIGS. 3-12D contain sample displays presented by the facility in some embodiments in performing portions of the method shown in FIG. 2. In the following discussion, the user interacts with the facility through a web-based interface, and thus all tasks or modules of the facility are performed at a remote server computer (such as server computer 106). However, the facility can also be practiced in distributed computing environments, where tasks or modules of the facility are performed by multiple remote processing devices (such as the personal computing device 104 and the server computer 106), which are linked through a communications network, such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), or the Internet. For example, those skilled in the relevant art will recognize that portions of the facility may reside on a server computer, while corresponding portions reside on a remote or personal computing device. In such a distributed computing environment, program modules or subroutines may be located in and executed on both local and remote memory storage devices. Aspects of the facility described herein may be stored or distributed on tangible, non-transitory computer-readable media, including magnetic and optically readable and removable computer discs, or stored in firmware in chips (e.g., EEPROM chips). Alternatively, aspects of the facility may be distributed electronically over the Internet or over other networks (including wireless networks).


To begin a wound assessment, a caregiver may first provide a username and password to gain access to the interface module. For example, FIG. 3 is a display diagram showing a sample display 300 typically presented by the facility that solicits the user to enter a username and password to access the programs and data stored at the storage device 108. As shown in FIG. 3, the display 300 includes a username field 302 and a password field 304.



FIG. 4 is a display diagram showing a sample display 400 typically presented by the facility to permit the user to select an existing patient profile and/or create a new patient profile. The display 400 includes a generic search field 412 for searching for a particular patient by name, birthday, unique identifier, and/or assessment date. The display 400 further includes a control 404 to create a new patient profile. The display 400 may also include an existing patient table 402 listing existing patient profiles in rows 421-424. Each row is divided into the following sortable and/or filterable columns: a first name column 431 containing the first name of an already-assessed patient, a last name column 432 containing the last name of an already-assessed patient, a date-of-birth column 433 containing the date of birth of an already-assessed patient, a patient ID column 434 containing the patient ID of an already-assessed patient, a “#” column 435 containing the number of wound assessments performed on the already-assessed patient, and an active orders column 436 indicating the number of orders for new assessments that are currently pending. Orders for new assessments might come, for example, from an Electronic Medical Records (EMR) system attached to the server computer 106. In the sample display 400 shown in FIG. 4, row 421 indicates that patient Harriet Jones has at least one wound that has been previously assessed by the facility, that Harriet Jones' date of birth is Jan. 15, 1976 and patient ID is 9990532, and that Harriet Jones has two assessments completed but no orders for another assessment. (It will be appreciated that the patient information used in the displays and examples herein is fictitious.) While the contents of patient table 402 are included to pose a comprehensible example, those skilled in the art will appreciate that the facility can use a patient table having columns corresponding to different and/or a larger number of attributes, as well as a larger number of rows to accommodate additional patients. Attributes that may be used include, for example, number of wounds actively being monitored, date and/or time of the most recent assessment, date and/or time of the first assessment, name or other identifier of the caregiver that gave the last assessment, etc. For a variety of reasons, certain values may be omitted from the patient table.
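
A toy sketch of the sortable and filterable table logic follows, using the fictitious row from FIG. 4; the column keys are invented for illustration.

```python
patients = [
    {"first": "Harriet", "last": "Jones", "dob": "1976-01-15",
     "patient_id": "9990532", "assessments": 2, "active_orders": 0},
]

def sort_patients(rows, column, reverse=False):
    """Return rows ordered by one of the table's columns."""
    return sorted(rows, key=lambda row: row[column], reverse=reverse)

def filter_patients(rows, **criteria):
    """Keep rows whose columns match every given value exactly."""
    return [r for r in rows if all(r.get(k) == v for k, v in criteria.items())]

print(filter_patients(sort_patients(patients, "last"), last="Jones"))
```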


When the user clicks on one of the rows 421-424 of already-assessed patients listed (i.e., to select a particular patient profile), the facility displays additional information on the selected patient. For example, FIG. 5A is a display diagram showing a sample wound information display 500 typically presented by the facility for review of wounds already being monitored or assessed by the facility. Display 500 includes a patient identifier 502, a wound identifier area 504, a visit area 508, an image area 510, an analytics area 514, and a wound measurement area 518. The wound identifier area 504 includes a button 505 for adding a new wound profile, as well as buttons 506a-506c, each of which correspond to a wound of the patient that is being monitored. As shown in FIG. 5A, the buttons 506a-506c identify the corresponding monitored wound by anatomical reference. For example, button A refers to a wound found at or near the sacrum of the patient. The buttons 506a-506c may also be color-coded. When the user selects one of the buttons 506a-506c, the facility displays information related to the selected wound, such as the most recent wound image 510, analytics 514 showing progress of the selected wound, and wound measurements 518. The sample display 500 of FIG. 5A shows a display after a user has selected wound A (by clicking on button 506a).


The visit area 508 displays the dates and times of previous assessments for the selected wound. The user may select a particular visit to review the wound assessment from the selected visit. The visit area includes a button 509 for creating a new visit for the selected wound. The wound image area 510 displays an image of the wound W. The default image is that from the most recent assessment (or visit), although the user can view wound images from earlier visits by selecting the particular visit in the visit area 508. The wound image area 510 can include buttons 512 that allow the user to manipulate the displayed image. In some embodiments, the wound image area 510 may display a three-dimensional model of the wound. The wound measurement area 518 displays one or more wound measurements, such as wound area, wound volume, wound outline, the maximum depth of the wound, the minimum depth of the wound, etc. In display 500, the wound measurement area 518 is blank, as the user has not yet generated new capture data. As shown in the display 500′ of FIG. 9 (discussed below), the facility will display the measurements once the user has captured new wound data. The analytics area 514 displays additional information to the user based on the facility's analysis of the wound measurements over time. For example, the analytics area 514 of the display 500 shows a graph plotting the measured area of wound A over time. The graph indicates that wound A has been previously assessed two times, and that the area of the wound increased between those assessments. The analytics area 514 may include a button 516 that, when selected, displays a drop-down menu of wound measurement options that may be plotted over time and displayed for the user.
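
The area-over-time plot in the analytics area could be produced along these lines; the visit dates and area values below are invented for the sketch.

```python
import matplotlib.pyplot as plt

visit_dates = ["2016-03-01", "2016-04-01"]  # fictitious assessment dates
areas_cm2 = [4.1, 5.3]                      # fictitious wound areas

plt.plot(visit_dates, areas_cm2, marker="o")
plt.xlabel("Assessment date")
plt.ylabel("Wound area (cm$^2$)")
plt.title("Wound A: area over time")
plt.show()
```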


In some embodiments, the facility may enable the user to couple (i.e., pair, associate, etc.) the capture device 102 to the facility so that the capture device 102 may automatically send captured data to a specific patient profile in the storage device 108 that is accessible by the facility for processing. In FIG. 5A, for example, the display 500 includes a button 522 that, when selected by the user, causes the facility to dynamically generate coupling information. FIG. 5B is a display diagram showing a sample display 550 having coupling information 556 that may be presented by the facility to the user. In FIG. 5B, the coupling information 556 is displayed as a QR code. In other embodiments, however, the coupling information 556 can have other configurations, such as a bar code, text that the user may enter into an input element of the capture device 102, etc. The user may scan or otherwise detect the coupling information 556 with the capture device 102 (e.g., via the image sensor 110 of the capture device 102) to automatically place the capture device 102 in communication with a particular patient profile in the storage device 108. The coupling information 556, for example, can include a network identifier and password (e.g., an SSID and SSID password), an Internet address (e.g., the URL for the interface module), and a patient identifier for directing the capture device 102 to the particular patient profile at the storage device 108. Once the capture device 102 is wirelessly coupled to the server computer 106 and/or storage device 108, the user can position the capture device 102 near the wound being assessed and begin capturing wound data. Each time the user actuates the actuator 118 (FIG. 1) on the capture device 102, the capture device 102 automatically populates the patient profile at the storage device 108 with a new data set. The user may select button 554 to view the captured images and/or data. Each new data set may be displayed by the facility as an unassigned image button, as discussed in greater detail below with reference to FIG. 6.
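
One plausible encoding of the coupling information, assuming a JSON payload and the third-party qrcode package, is sketched below; the field names are illustrative, not the facility's actual schema.

```python
import json
import qrcode

payload = json.dumps({
    "ssid": "clinic-net",       # network identifier
    "ssid_password": "s3cret",  # network password
    "url": "https://server.example.com/api/captures",  # interface address
    "patient_id": "9990532",    # routes captures to the right profile
})
qrcode.make(payload).save("coupling.png")  # image scanned by the capture device
```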


It will be appreciated that the coupling information 556 may be encrypted to protect the patient's confidential information. For example, in the embodiment shown in FIG. 5B, the QR code 556 is encrypted such that a screen shot of the display 550 will not provide the coupling information and/or provide access to a recipient of the screen shot.
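
One way such protection could work, offered purely as an assumption rather than the patented scheme, is to encrypt the QR payload with a symmetric key pre-provisioned on authorized capture devices, so a screenshot of the code is useless to anyone without the key.

```python
from cryptography.fernet import Fernet

payload = b'{"ssid": "clinic-net", "patient_id": "9990532"}'  # illustrative
device_key = Fernet.generate_key()  # provisioned on authorized devices out-of-band
cipher = Fernet(device_key)

token = cipher.encrypt(payload)          # the QR code would encode this token
assert cipher.decrypt(token) == payload  # only key holders can recover it
```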


The present technology includes additional methods for coupling the capture device 102 to the facility. For example, in some embodiments, some or all of the coupling information may be received from an identifier on the patient, such as a patient wristband. As another example, in some embodiments the facility may display a list of capture devices 102 available on the network 140 (FIG. 1) (e.g., a plurality of capture devices 102 may be dispersed amongst different examination rooms in a medical treatment center, and at least a portion of those devices may be coupled to the same wireless network). The facility may display (e.g., via the interface module) each of the capture devices 102 available, and enable the user to select the appropriate capture device 102 for pairing. The facility may then solicit the user to input a patient identifier and/or additional information for connecting the capture device 102 to the facility and/or server computer 106. In yet another embodiment, the capture device 102 may be placed in direct communication with the personal computing device 104 (e.g., via a USB cord), and the user may input pairing information into a local program executing on the personal computing device 104 and/or a web-based interface executing on the remote server computer 106.


In some embodiments (not shown), the wound information display 500 includes a button that, when selected by the user, causes a live image from the capture device 102 (FIG. 1) to be displayed on the interface module. For example, should the personal computing device 104 also be present at the point-of-care, the user may utilize the display of the personal computing device 104 to properly position the capture device 102 relative to the wound. In other embodiments, the capture device 102 has a display and/or the facility does not display a live image of the wound on the personal computing device 104 during image capture. The facility may further include a button (not shown) on the image capture display (not shown) that the user may select, once finished capturing wound data, to begin assigning the captured images (containing the image data and the depth data) to the corresponding wound profile. For example, FIG. 6 is a display diagram showing a sample display 600 typically presented by the facility to display one or more captured images and enable the user to assign newly-captured wound data to a pre-existing wound. As shown in FIG. 6, the display 600 includes a new image area 604, a wound identifier area 612 having buttons 614 (similar to buttons 506a-506c in FIG. 5A), an image area 616, and a trace button 618. The new image area 604 includes an “unassigned” area 603 having buttons 606a-606e, each of which corresponds to a new or unassigned image. Each of the buttons 606a-606e may include a small preview image of the captured image so that the user can easily distinguish between the new images. To assign a new image to a wound profile, the user may drag and drop the button 606a-606e over the corresponding button 614a-614c in the wound area. The facility also displays a discard button 615 for the user to remove unwanted images.


Once the captured data has been assigned to a corresponding wound profile, the user may select the trace button 618, which causes the facility to display an interface that solicits the user to manually trace the outline of the wound. In certain embodiments, the facility utilizes image processing techniques to automatically determine an outline of the wound. In such embodiments, the facility may nevertheless solicit the user to view and edit the automatically generated outline. FIG. 7, for example, is a display diagram showing a sample display 700 typically presented by the facility to display a captured image of a wound W and solicit the user to trace the outline of the wound within the captured image. The display 700 includes an image identifier area 704 and an editing area 705. The identifier area includes buttons 706a-706c (only buttons 706b and 706c shown in FIG. 7) that identify the image (with a preview image) and wound (e.g., by color and/or by reference letters “B” and “C” that correspond to a particular wound profile). When a user selects a button 706a-706c, an enlarged view of the image appears in the editing area 705. The editing area 705 includes editing buttons 710 that enable the user to edit or manipulate the image. Button 710a, for example, enables the user to click various points around the outline of the wound W, and the facility automatically displays a line (line “a”) connecting the points. Button 710b enables the user to draw a continuous line (line “b”) around the outline of the wound W. As shown in FIG. 7, the resulting outline can be a combination of one or both techniques. In other embodiments, the facility may include other editing techniques. For example, in a particular embodiment, the facility may display a suggested outline for the user and one or more buttons for adjusting the automatically generated line.
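
The description leaves the automatic outlining technique open; one standard approach is threshold-and-contour extraction, sketched below with OpenCV. Treating the wound as the largest dark region after Otsu thresholding is an assumption made only for this sketch.

```python
import cv2
import numpy as np

def suggest_outline(image_bgr: np.ndarray) -> np.ndarray:
    """Return an Nx2 array of perimeter points for the largest dark region."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    return largest.reshape(-1, 2)  # proposed perimeter the user may then adjust
```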


Once the outline data is generated (whether automatically by the facility, manually from the user's input, or a combination of both), the facility then utilizes the captured data (image data and depth data) and the outline data to determine one or more wound measurements, which in some embodiments may include generating a three-dimensional model of the wound. The facility may then update the corresponding wound identifier button 706a-706c to display the determined wound measurements. FIG. 8, for example, is a display diagram showing an enlarged view of the button 706c shown in the display 700 of FIG. 7 after being updated to include the newly determined wound measurements 802.
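
Two of the named measurements can be illustrated with elementary formulas: planar area from the traced outline via the shoelace formula, and volume by integrating depth over the outlined region. The pixel scale and depth conventions below are assumptions made for the sketch, not the facility's documented method.

```python
import numpy as np

def polygon_area(outline_xy: np.ndarray, mm_per_px: float) -> float:
    """Shoelace area of the traced outline, converted to square millimetres."""
    x, y = outline_xy[:, 0], outline_xy[:, 1]
    area_px = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
    return area_px * mm_per_px ** 2

def wound_volume(depth_mm: np.ndarray, inside_mask: np.ndarray,
                 mm_per_px: float) -> float:
    """Sum per-pixel depth below the skin plane over the outlined region (mm^3)."""
    return float(depth_mm[inside_mask].sum()) * mm_per_px ** 2
```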



FIG. 9 is an updated version 500′ of the sample wound information display 500 (FIG. 5A) after the facility has updated the measurement area 518′, image area 510′, and analytics area 514′ to include the new capture data, outline data, and wound measurements. Updated graph 514′, for example, includes an additional data point (as compared to the graph of FIG. 5A).


The facility may optionally include a button 520 that enables the user to enter additional notes characterizing a wound and/or the patient. For example, FIGS. 10A-10D are display diagrams showing a sample display 1000 presented by the facility to the user when the user selects button 520. The display 1000 includes an identifier area 1004 having a patient button 1003 and wound identifier buttons 1006a-c (similar to buttons 506a-c), and a notation area 1005. After selecting any of the patient buttons 1003 and wound identifier buttons 1006a-c, the facility displays a notation area 1005 that solicits the user for input regarding the selected patient and/or wound. Display 1000 shows a sample notation area 1005 after the user has selected the patient button 1003. As shown in FIGS. 10A-10D, the notation area 1005 can include various topic areas 1010, 1020, 1030, 1040, 1050, 1060, 1070, and 1080 (referred to collectively as “topic areas 1010-1080”), such as admission and discharge, medical history, investigation history, wound appearance, wound margins, surrounding skin, exudate, and wound pain. It will be appreciated that the facility may display more or fewer notation topics. The facility may divide each of the topic areas 1010-1080 into one or more sub-topics (the sub-topics for topic area 1020 are labeled 1022a, 1022b, etc.), and each of the sub-topics may include one or more input options (the input options for sub-topic 1022a are labeled 1024a, etc.). In some embodiments, the facility may include a text field (not shown) that enables the user to enter personalized notes. When the user is finished, the user may select the “done” button 1090 to return to the updated wound information display 500′ (FIG. 9).


The facility also enables the user to generate a report for one or more of the wounds, or for the patient generally. For example, FIG. 11 is a display diagram showing a sample display 1100 typically presented by the facility when the user selects the “create report” button 526 on the wound information display 500, 500′, and FIGS. 12A-12D are display diagrams showing a sample display 1200 typically presented by the facility to display the generated report.


Although the facility is described above as providing the user a means of interacting by displaying one or more selectable buttons, it will be appreciated that other means of interacting with or controlling the facility, personal computing device, and/or capture device are within the scope of the present technology. For example, in some embodiments, in addition to or in place of any of the buttons described above, the user may interact with the facility with audio (via a microphone) and/or tactile commands (e.g., a foot control, etc.).


In those embodiments of capture devices 102 having a plurality of structured light elements (such as lasers), the capture device 102 may be configured to capture data based on a predetermined capture sequence (for example, instructions executed by the computing device 116 (FIG. 1)). An example capture sequence utilized by the capture device 102 starts with an RGB (or texture) frame captured while the light source 114 (FIG. 1) emits light. The RGB frame is followed by capture of individual frames for each structured light element (a frame is captured while each structured light element is turned on by itself), as well as a frame for the combination of structured light elements. During capture of the RGB frame, the structured light elements 112 do not emit light. Likewise, during capture of the structured light frames, the light source 114 does not emit light. However, post-capture processing methods require that only one type of light source (regardless of the number of light sources per type) is enabled during the capture sequence. Existing imaging devices meet this requirement by configuring the image sensor to operate in a “single frame capture mode” (i.e., pauses are inserted between consecutive frames of the capture sequence). However, because of the added pauses, “single frame capture mode” results in an increase in total capture time. To address this shortcoming, the image sensor 110 may be configured to operate in video mode (continuous streaming), which reduces capture times because it eliminates the pauses and “pipelines” the image processing steps. Suitable image sensors that operate in video mode include, for example, a 5-megapixel Omnivision OV5640 image sensor. Such image sensors, however, often utilize a rolling shutter (i.e., only a fraction of the sensor rows are exposed to light at any instant). For some of the image sensors that utilize a rolling shutter, the end of the current frame and the beginning of the next frame overlap, and thus two disjoint fractions of the sensor rows are exposed to light at the same instant. This overlap results in cross-talk between successive frames. To avoid the potential for such cross-talk to reduce the utility of the captured frames, a specific capture sequence is employed and one or more multiplicative binary masks are applied to the structured light frames that reject the cross-talk from the previous frame (and also the next frame in cases where illumination for the next frame is pre-enabled) and also range gate the laser returns to certain depths (e.g., between 200 mm and 350 mm). In these and other embodiments, the image sensor 110 may additionally or alternatively be configured to operate in a “single frame capture mode.” Additionally, because the capture device 102 is a handheld device and relative motion inevitably occurs between the capture device 102 and the patient over the course of the capture sequence, the facility and/or capture device 102 of the present technology may include one or more motion compensation algorithms to reduce or eliminate the negative effect of such motion on the resulting image and/or data quality.
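
The mask-and-gate idea might be sketched as follows; the array shapes and the bookkeeping of which sensor rows are contaminated by cross-talk are assumptions, with only the 200-350 mm gate taken from the description above.

```python
import numpy as np

def apply_row_mask(frame: np.ndarray, bad_rows: np.ndarray) -> np.ndarray:
    """Multiplicative binary mask: zero out rows contaminated by the
    neighbouring frame of a rolling-shutter overlap."""
    mask = np.ones(frame.shape[0], dtype=frame.dtype)
    mask[bad_rows] = 0
    return frame * mask[:, None]

def range_gate(depths_mm: np.ndarray,
               near_mm: float = 200.0, far_mm: float = 350.0) -> np.ndarray:
    """Reject laser returns outside the plausible capture distance."""
    return depths_mm[(depths_mm >= near_mm) & (depths_mm <= far_mm)]
```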


Conclusion


It will be appreciated by those skilled in the art that the above-described facility may be straightforwardly adapted or extended in various ways. For example, the facility may use a variety of user interfaces to collect various information usable in assessing surface features from caregivers and other users, and a variety of user interfaces to display the resulting assessments. While the foregoing description makes reference to particular embodiments, the scope of the invention is defined solely by the claims that follow and the elements recited therein.


While computing devices configured as described above are typically used to support the operation of the facility, one of ordinary skill in the art will appreciate that the facility may be implemented using devices of various types and configurations, and having various components. For example, in some instances the capture devices 102 and the personal computing devices 104 may communicate directly with each other (in addition to communicating with the server computer 106) through a wired or wireless connection. Such a configuration could provide the user a live image of the wound faster and/or provide a higher quality live preview image. In such embodiments, suitable restrictions can be administered when sending and storing patient data to ensure confidentiality. In another variation, the capture device 102 is only in communication (wired or wirelessly) with the computing device 104, and the computing device 104 communicates with the server computer 106 (e.g., via cellular data protocols), thereby serving as a pass-through for patient data without permanently storing the patient data. In yet another variation, the facility may route communications from the capture devices 102 and the computing devices 104 through a common access point, rather than the two separate access points (first and second access points 142, 144) shown in FIG. 1. Additionally, the facility may provide the user with audit information for the assessments (e.g., who performed the assessments, who accessed the assessments, etc.).


Although not required, aspects of the present technology have been described in the general context of computer-executable instructions, such as routines executed by a general-purpose computer, a personal computer, a server, or other computing system. The present technology can also be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein.

Claims
  • 1. A computer-implemented method for evaluating an anatomical surface feature (“surface feature”) of a patient, the method comprising: capturing, using a capture device, video data including the surface feature; transmitting the captured video data from the capture device to a remote server computer; at a personal computing device, displaying on a web-based interface live image data based on the captured video data; the capture device capturing one or more data sets characterizing the surface feature; according to instructions executed at a processor of the remote server computer and/or the personal computing device, determining one or more measurements of the surface feature based on the one or more data sets; storing, at a non-volatile storage device remote from the capture device, the determined one or more measurements; and displaying the determined one or more measurements on the web-based interface.
  • 2. The method of claim 1, including displaying the live image data in response to a user request.
  • 3. The method of claim 1, including displaying a live image button on a display of the personal computing device and displaying the live image data in response to a user actuation of the live image button.
  • 4. The method of claim 1, including a user positioning the capture device using the displayed live image data.
  • 5. The method of claim 1, wherein the one or more data sets are captured in response to one or more user capture instructions.
  • 6. The method of claim 1 wherein none of the video data, data sets or determined one or more measurements are stored in a non-volatile memory of the capture device.
  • 7. The method of claim 1 wherein none of the video data, data sets or determined one or more measurements are stored at a non-volatile memory of the personal computing device.
  • 8. The method of claim 1 wherein the capture device only communicates with the personal computing device through the server computer.
  • 9. The method of claim 1 wherein the capture device does not include a display.
  • 10. The method of claim 1, including displaying at the personal computing device a surface feature information display associated with the surface feature, the surface feature information display including a live image button, and displaying the live image data in response to a user actuation of the live image button.
  • 11. The method of claim 10, including storing the one or more data sets in a surface feature profile associated with the surface feature.
  • 12. The method of claim 10, wherein displaying the determined one or more measurements on a web-based interface includes displaying the determined one or more measurements within the surface feature information display.
  • 13. A computer-implemented method for evaluating an anatomical surface feature (“surface feature”) of a patient, the method comprising: capturing, using a capture device, video data including the surface feature; transmitting the captured video data from the capture device to a remote server computer; at a personal computing device, displaying on a web-based interface live image data based on the captured video data, wherein the capture device only communicates with the personal computing device through the server computer; the capture device capturing one or more data sets characterizing the surface feature; according to instructions executed at a processor of the remote server computer and/or the personal computing device, determining one or more measurements of the surface feature based on the one or more data sets; storing, at a non-volatile storage device remote from the capture device, the determined one or more measurements; and displaying the determined one or more measurements on the web-based interface.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 17/005,260, filed Aug. 27, 2020, now pending, which is a divisional of U.S. patent application Ser. No. 15/974,433, filed May 8, 2018, now U.S. Pat. No. 10,777,317, which is a continuation of U.S. patent application Ser. No. 15/144,722, filed May 2, 2016, now U.S. Pat. No. 10,013,527, the disclosure of each of which are incorporated herein by reference in their entireties.

US Referenced Citations (553)
Number Name Date Kind
3259612 Peter Jul 1966 A
3335716 Alt et al. Aug 1967 A
4090501 Chaitin May 1978 A
4170987 Anselmo et al. Oct 1979 A
4236082 Butler Nov 1980 A
4505583 Konomi Mar 1985 A
4515165 Carroll May 1985 A
4535782 Zoltan Aug 1985 A
4556057 Hiruma et al. Dec 1985 A
4724480 Hecker et al. Feb 1988 A
4736739 Flaton Apr 1988 A
4768513 Suzuki Sep 1988 A
4773097 Suzaki et al. Sep 1988 A
4821117 Sekiguchi Apr 1989 A
4839807 Doi et al. Jun 1989 A
4851984 Doi et al. Jul 1989 A
4894547 Leffell et al. Jan 1990 A
4930516 Alfano et al. Jun 1990 A
4957114 Zeng et al. Sep 1990 A
4979815 Tsikos Dec 1990 A
4996994 Steinhauer et al. Mar 1991 A
D315901 Knowles Apr 1991 S
5003977 Suzuki et al. Apr 1991 A
5016173 Kenet et al. May 1991 A
5036853 Jeffcoat et al. Aug 1991 A
5080100 Trotel Jan 1992 A
5157461 Page Oct 1992 A
5174297 Daikuzono Dec 1992 A
5241468 Kenet Aug 1993 A
5270168 Grinnell Dec 1993 A
5319550 Griffith Jun 1994 A
5363854 Martens et al. Nov 1994 A
5369496 Alfano et al. Nov 1994 A
5396331 Kitoh et al. Mar 1995 A
5408996 Salb Apr 1995 A
5421337 Richards-Kortum et al. Jun 1995 A
5515449 Tsuruoka et al. May 1996 A
5519208 Esparza et al. May 1996 A
5528703 Lee Jun 1996 A
5531520 Grimson et al. Jul 1996 A
5532824 Harvey et al. Jul 1996 A
5561526 Huber et al. Oct 1996 A
5588428 Smith et al. Dec 1996 A
5590660 MacAulay et al. Jan 1997 A
5603318 Heilbrun et al. Feb 1997 A
5627907 Gur et al. May 1997 A
5644141 Hooker et al. Jul 1997 A
5648915 Mckinney et al. Jul 1997 A
5673300 Reckwerdt et al. Sep 1997 A
5689575 Sako et al. Nov 1997 A
5699798 Hochman et al. Dec 1997 A
5701902 Vari et al. Dec 1997 A
5717791 Labaere et al. Feb 1998 A
D393068 Kodama Mar 1998 S
5740268 Nishikawa et al. Apr 1998 A
5749830 Kaneko et al. May 1998 A
5784162 Cabib et al. Jul 1998 A
5791346 Craine et al. Aug 1998 A
5799100 Clarke et al. Aug 1998 A
5810014 Davis et al. Sep 1998 A
5836872 Kenet et al. Nov 1998 A
5910972 Ohkubo et al. Jun 1999 A
5921937 Davis et al. Jul 1999 A
5946645 Rioux et al. Aug 1999 A
5957837 Raab Sep 1999 A
5967797 Maldonado Oct 1999 A
5967979 Taylor et al. Oct 1999 A
5969822 Fright et al. Oct 1999 A
5974165 Giger et al. Oct 1999 A
6032070 Flock et al. Feb 2000 A
6081612 Gutkowicz-Krusin et al. Jun 2000 A
6081739 Lemchen Jun 2000 A
6091995 Ingle et al. Jul 2000 A
6101408 Craine et al. Aug 2000 A
6208749 Gutkowicz-Krusin et al. Mar 2001 B1
6215893 Leshem et al. Apr 2001 B1
6265151 Canter et al. Jul 2001 B1
6266453 Hibbard et al. Jul 2001 B1
6272278 Takahata et al. Aug 2001 B1
6278793 Gur et al. Aug 2001 B1
6307957 Gutkowicz-Krusin et al. Oct 2001 B1
6324417 Cotton Nov 2001 B1
D453350 Fenton Feb 2002 S
6359513 Kuo et al. Mar 2002 B1
6359612 Peter Mar 2002 B1
D455166 Raad Apr 2002 S
6381026 Schiff et al. Apr 2002 B1
6381488 Dickey et al. Apr 2002 B1
6392744 Holec May 2002 B1
6396270 Smith May 2002 B1
6413212 Raab Jul 2002 B1
6421463 Poggio et al. Jul 2002 B1
6427022 Craine et al. Jul 2002 B1
6491632 Taylor Dec 2002 B1
6567682 Osterweil et al. May 2003 B1
6594388 Gindele et al. Jul 2003 B1
6594516 Steckner et al. Jul 2003 B1
6603552 Cline et al. Aug 2003 B1
6611617 Crampton Aug 2003 B1
6611833 Johnson Aug 2003 B1
6631286 Pfeiffer et al. Oct 2003 B2
6648820 Sarel Nov 2003 B1
6671349 Griffith Dec 2003 B1
6678001 Elberbaum Jan 2004 B1
6690964 Bieger et al. Feb 2004 B2
6715675 Rosenfeld Apr 2004 B1
6754370 Hall-Holt et al. Jun 2004 B1
6770186 Rosenfeld et al. Aug 2004 B2
6798571 Wetzel et al. Sep 2004 B2
6809803 O'Brien et al. Oct 2004 B1
6810279 Mansfield et al. Oct 2004 B2
6816606 Wetzel et al. Nov 2004 B2
6816847 Toyama Nov 2004 B1
6862410 Miyoshi Mar 2005 B2
6862542 Lockhart et al. Mar 2005 B2
6873340 Luby Mar 2005 B2
6873716 Bowker Mar 2005 B1
6879394 Amblard et al. Apr 2005 B2
6907193 Kollias et al. Jun 2005 B2
6915073 Seo Jul 2005 B2
6922523 Merola et al. Jul 2005 B2
6941323 Galperin Sep 2005 B1
6961517 Merola et al. Nov 2005 B2
6968094 Gallagher Nov 2005 B1
6993169 Wetzel et al. Jan 2006 B2
7006223 Mullani Feb 2006 B2
7013172 Mansfield et al. Mar 2006 B2
7015906 Olschewski et al. Mar 2006 B2
7027153 Mullani Apr 2006 B2
7040536 Rosenfeld May 2006 B2
7054674 Cane et al. May 2006 B2
7064311 Jung et al. Jun 2006 B2
7068828 Kim et al. Jun 2006 B2
7068836 Rubbert et al. Jun 2006 B1
7074509 Rosenfeld et al. Jul 2006 B2
7103205 Wang et al. Sep 2006 B2
7106885 Osterweil et al. Sep 2006 B2
7127094 Elbaum et al. Oct 2006 B1
7127280 Dauga Oct 2006 B2
7128894 Tannous et al. Oct 2006 B1
7130465 Muenzenmayer et al. Oct 2006 B2
7136191 Kaltenbach et al. Nov 2006 B2
D533555 Odhe et al. Dec 2006 S
7155049 Wetzel et al. Dec 2006 B2
7162063 Craine et al. Jan 2007 B1
7167243 Mullani Jan 2007 B2
7167244 Mullani Jan 2007 B2
7181363 Ratti et al. Feb 2007 B2
7194114 Schneiderman Mar 2007 B2
7212660 Wetzel et al. May 2007 B2
7227621 Lee et al. Jun 2007 B2
7233693 Momma Jun 2007 B2
D547347 Kim Jul 2007 S
7248724 Gutenev Jul 2007 B2
D554682 Martinez Nov 2007 S
7295226 Meron et al. Nov 2007 B1
7298881 Giger et al. Nov 2007 B2
D561804 Asai Feb 2008 S
7347365 Rowe Mar 2008 B2
7376346 Merola et al. May 2008 B2
7400754 Jung et al. Jul 2008 B2
7421102 Wetzel et al. Sep 2008 B2
7426319 Takahashi Sep 2008 B2
7440597 Rowe Oct 2008 B2
7450783 Talapov et al. Nov 2008 B2
7460250 Keightley et al. Dec 2008 B2
7474415 Lin et al. Jan 2009 B2
7487063 Tubic et al. Feb 2009 B2
7489799 Nilsen et al. Feb 2009 B2
7495208 Czarnek et al. Feb 2009 B2
7496399 Maschke Feb 2009 B2
7509861 Masotti et al. Mar 2009 B2
7538869 Treado et al. May 2009 B2
7545963 Rowe Jun 2009 B2
D597205 Koch Jul 2009 S
7580590 Lin et al. Aug 2009 B2
7581191 Rice et al. Aug 2009 B2
7587618 Inui et al. Sep 2009 B2
7595878 Nelson et al. Sep 2009 B2
7603031 Viaud et al. Oct 2009 B1
D603441 Wada Nov 2009 S
7613335 McLennan et al. Nov 2009 B2
7620211 Browne et al. Nov 2009 B2
7647085 Cane et al. Jan 2010 B2
7668350 Rowe Feb 2010 B2
7684589 Nilsen et al. Mar 2010 B2
7724379 Kawasaki et al. May 2010 B2
7729747 Stranc et al. Jun 2010 B2
7735729 Rowe Jun 2010 B2
7738032 Kollias et al. Jun 2010 B2
7751594 Rowe et al. Jul 2010 B2
7765487 Cable Jul 2010 B2
7819311 Rowe et al. Oct 2010 B2
7869641 Wetzel et al. Jan 2011 B2
7876948 Wetzel et al. Jan 2011 B2
7881777 Docherty et al. Feb 2011 B2
7894645 Barsky Feb 2011 B2
7912320 Minor Mar 2011 B1
7912534 Grinvald et al. Mar 2011 B2
7916834 Piorek et al. Mar 2011 B2
7931149 Gilad et al. Apr 2011 B2
7951395 Lee et al. May 2011 B2
8000776 Gono Aug 2011 B2
8019801 Robb et al. Sep 2011 B1
8026942 Payonk et al. Sep 2011 B2
8071242 Rosenfeld et al. Dec 2011 B2
8078262 Murphy et al. Dec 2011 B2
8094294 Treado et al. Jan 2012 B2
8105233 Abou El Kheir Jan 2012 B2
D653687 Yu Feb 2012 S
8123704 Richards Feb 2012 B2
8150500 Goldman et al. Apr 2012 B2
8161826 Taylor Apr 2012 B1
8165357 Rowe Apr 2012 B2
8184873 Rowe et al. May 2012 B2
D662122 Goodwin Jun 2012 S
D664655 Daniel et al. Jul 2012 S
8213695 Zouridakis Jul 2012 B2
8218862 Demirli et al. Jul 2012 B2
8218873 Boncyk et al. Jul 2012 B2
8218874 Boncyk et al. Jul 2012 B2
8224077 Boncyk et al. Jul 2012 B2
8224078 Boncyk et al. Jul 2012 B2
8224079 Boncyk et al. Jul 2012 B2
8229185 Ennis et al. Jul 2012 B2
8238623 Stephan et al. Aug 2012 B2
8306334 Paschalakis et al. Nov 2012 B2
8326031 Boncyk et al. Dec 2012 B2
8335351 Boncyk et al. Dec 2012 B2
8437544 Boncyk et al. May 2013 B2
8457395 Boncyk et al. Jun 2013 B2
8463030 Boncyk et al. Jun 2013 B2
8463031 Boncyk et al. Jun 2013 B2
8465762 Lee et al. Jun 2013 B2
8467600 Boncyk et al. Jun 2013 B2
8467602 Boncyk et al. Jun 2013 B2
8478036 Boncyk et al. Jul 2013 B2
8478037 Boncyk et al. Jul 2013 B2
8480641 Jacobs Jul 2013 B2
8488880 Boncyk et al. Jul 2013 B2
8494264 Boncyk et al. Jul 2013 B2
8498460 Patwardhan Jul 2013 B2
8520942 Boncyk et al. Aug 2013 B2
8533879 Taylor Sep 2013 B1
8548245 Boncyk et al. Oct 2013 B2
8548278 Boncyk et al. Oct 2013 B2
8582817 Boncyk et al. Nov 2013 B2
8588476 Spicola, Jr. Nov 2013 B1
8588527 Boncyk et al. Nov 2013 B2
D697210 Delaney et al. Jan 2014 S
8638986 Jiang et al. Jan 2014 B2
8661915 Taylor Mar 2014 B2
8712193 Boncyk et al. Apr 2014 B2
8718410 Boncyk et al. May 2014 B2
8734342 Cable May 2014 B2
8755053 Fright et al. Jun 2014 B2
8768052 Kawano Jul 2014 B2
8773508 Daniel et al. Jul 2014 B2
8774463 Boncyk et al. Jul 2014 B2
8787621 Spicola, Sr. et al. Jul 2014 B2
8787630 Rowe Jul 2014 B2
8795169 Cosentino et al. Aug 2014 B2
8798368 Boncyk et al. Aug 2014 B2
8800386 Taylor Aug 2014 B2
8814841 Hartwell Aug 2014 B2
8824738 Boncyk et al. Sep 2014 B2
8837868 Boncyk et al. Sep 2014 B2
8842941 Boncyk et al. Sep 2014 B2
8849380 Patwardhan Sep 2014 B2
D714940 Kim Oct 2014 S
8855423 Boncyk et al. Oct 2014 B2
8861859 Boncyk et al. Oct 2014 B2
8867839 Boncyk et al. Oct 2014 B2
8873891 Boncyk et al. Oct 2014 B2
8875331 Taylor Nov 2014 B2
8885983 Boncyk et al. Nov 2014 B2
8892190 Docherty et al. Nov 2014 B2
8904876 Taylor et al. Dec 2014 B2
8913800 Rowe Dec 2014 B2
8923563 Boncyk et al. Dec 2014 B2
D720864 Behar et al. Jan 2015 S
8938096 Boncyk et al. Jan 2015 B2
8939918 Richards Jan 2015 B2
8948459 Boncyk et al. Feb 2015 B2
8948460 Boncyk et al. Feb 2015 B2
D724216 Gant et al. Mar 2015 S
8997588 Taylor Apr 2015 B2
9014513 Boncyk et al. Apr 2015 B2
9014514 Boncyk et al. Apr 2015 B2
9014515 Boncyk, V et al. Apr 2015 B2
9020305 Boncyk et al. Apr 2015 B2
9025813 Boncyk et al. May 2015 B2
9025814 Boncyk et al. May 2015 B2
9031278 Boncyk et al. May 2015 B2
9036947 Boncyk et al. May 2015 B2
9036948 Boncyk et al. May 2015 B2
9041810 Ecker et al. May 2015 B2
D735879 Behar et al. Aug 2015 S
9110925 Boncyk et al. Aug 2015 B2
9116920 Boncyk et al. Aug 2015 B2
9135355 Boncyk et al. Sep 2015 B2
9141714 Boncyk et al. Sep 2015 B2
9148562 Boncyk et al. Sep 2015 B2
D740945 Booth Oct 2015 S
9154694 Boncyk et al. Oct 2015 B2
9154695 Boncyk et al. Oct 2015 B2
9167800 Spicola, Jr. Oct 2015 B2
9179844 Fright et al. Nov 2015 B2
9186053 Viola Nov 2015 B2
9196067 Freed et al. Nov 2015 B1
9224205 Tsin et al. Dec 2015 B2
9235600 Boncyk et al. Jan 2016 B2
9244943 Boncyk et al. Jan 2016 B2
9262440 Boncyk et al. Feb 2016 B2
9268197 Digregorio et al. Feb 2016 B1
9285323 Burg et al. Mar 2016 B2
9288271 Boncyk et al. Mar 2016 B2
9311520 Burg et al. Apr 2016 B2
9311540 Ecker et al. Apr 2016 B2
9311552 Boncyk et al. Apr 2016 B2
9311553 Boncyk et al. Apr 2016 B2
9311554 Boncyk et al. Apr 2016 B2
9317769 Boncyk et al. Apr 2016 B2
9324004 Boncyk et al. Apr 2016 B2
9330326 Boncyk et al. May 2016 B2
9330327 Boncyk et al. May 2016 B2
9330328 Boncyk et al. May 2016 B2
9330453 Soldatitsch et al. May 2016 B2
9342748 Boncyk et al. May 2016 B2
D758608 Behar et al. Jun 2016 S
9377295 Fright et al. Jun 2016 B2
9395234 Cosentino et al. Jul 2016 B2
9399676 Schurpf et al. Jul 2016 B2
9438775 Powers Sep 2016 B2
9451928 Falco et al. Sep 2016 B2
9525867 Thomas et al. Dec 2016 B2
9528941 Burg et al. Dec 2016 B2
9607380 Burg et al. Mar 2017 B2
D783838 Zhao et al. Apr 2017 S
9690904 Zizi Jun 2017 B1
9808206 Zhao et al. Nov 2017 B1
9818193 Smart Nov 2017 B2
9861285 Fright et al. Jan 2018 B2
9863811 Burg Jan 2018 B2
9955910 Fright et al. May 2018 B2
9972077 Adiri et al. May 2018 B2
9996923 Thomas Jun 2018 B2
10013527 Fairbairn et al. Jul 2018 B2
D827827 Canfield et al. Sep 2018 S
10068329 Adiri et al. Sep 2018 B2
D831197 Scruggs et al. Oct 2018 S
10117617 Cantu et al. Nov 2018 B2
10130260 Patwardhan Nov 2018 B2
10143425 Zhao et al. Dec 2018 B1
D837388 Dacosta et al. Jan 2019 S
10267743 Burg et al. Apr 2019 B2
10307382 Jung et al. Jun 2019 B2
10362984 Adiri et al. Jul 2019 B2
10368795 Patwardhan Aug 2019 B2
10559081 Omer et al. Feb 2020 B2
RE47921 Patwardhan Mar 2020 E
D877931 Dacosta et al. Mar 2020 S
10614623 D'alessandro Apr 2020 B2
10617305 Patwardhan et al. Apr 2020 B2
10652520 Otto et al. May 2020 B2
10674953 Baker et al. Jun 2020 B2
10692214 Bisker Jun 2020 B2
10702160 Patwardhan Jul 2020 B2
10775647 Joy et al. Sep 2020 B2
10777317 Fairbairn et al. Sep 2020 B2
D898921 Dacosta et al. Oct 2020 S
D899604 Dacosta et al. Oct 2020 S
D903863 Dacosta et al. Dec 2020 S
10874302 Fright et al. Dec 2020 B2
11116407 Dickie et al. Sep 2021 B2
11134848 Bala et al. Oct 2021 B2
11250945 Fairbairn et al. Feb 2022 B2
20020054297 Lee et al. May 2002 A1
20020149585 Kacyra et al. Oct 2002 A1
20020197600 Maione et al. Dec 2002 A1
20030004405 Townsend et al. Jan 2003 A1
20030006770 Smith Jan 2003 A1
20030031383 Gooch Feb 2003 A1
20030036751 Anderson et al. Feb 2003 A1
20030085908 Luby May 2003 A1
20030164841 Myers Sep 2003 A1
20030164875 Myers Sep 2003 A1
20030229514 Brown Dec 2003 A2
20030231793 Crampton Dec 2003 A1
20040013292 Raunig Jan 2004 A1
20040014165 Keidar et al. Jan 2004 A1
20040059199 Thomas et al. Mar 2004 A1
20040080497 Enmei Apr 2004 A1
20040117343 Johnson Jun 2004 A1
20040136579 Gutenev Jul 2004 A1
20040146290 Kollias et al. Jul 2004 A1
20040201694 Gartstein et al. Oct 2004 A1
20040225222 Zeng et al. Nov 2004 A1
20040264749 Skladnev et al. Dec 2004 A1
20050012817 Hampapur et al. Jan 2005 A1
20050027567 Taha Feb 2005 A1
20050033142 Madden et al. Feb 2005 A1
20050084176 Talapov et al. Apr 2005 A1
20050094262 Spediacci et al. May 2005 A1
20050111757 Brackett et al. May 2005 A1
20050154276 Barducci et al. Jul 2005 A1
20050190988 Feron Sep 2005 A1
20050237384 Jess et al. Oct 2005 A1
20050259281 Boust Nov 2005 A1
20050273011 Hattery et al. Dec 2005 A1
20050273267 Maione Dec 2005 A1
20060008178 Seeger et al. Jan 2006 A1
20060012802 Shirley Jan 2006 A1
20060036135 Kern Feb 2006 A1
20060036156 Lachaine et al. Feb 2006 A1
20060044546 Lewin et al. Mar 2006 A1
20060055943 Kawasaki et al. Mar 2006 A1
20060058665 Chapman Mar 2006 A1
20060072122 Hu et al. Apr 2006 A1
20060073132 Congote Apr 2006 A1
20060089553 Cotton Apr 2006 A1
20060098876 Buscema May 2006 A1
20060135953 Kania et al. Jun 2006 A1
20060151601 Rosenfeld Jul 2006 A1
20060159341 Pekar et al. Jul 2006 A1
20060204072 Wetzel et al. Sep 2006 A1
20060210132 Christiansen et al. Sep 2006 A1
20060222263 Carlson Oct 2006 A1
20060268148 Kollias et al. Nov 2006 A1
20060269125 Kalevo et al. Nov 2006 A1
20060293613 Fatehi et al. Dec 2006 A1
20070065009 Ni et al. Mar 2007 A1
20070097381 Tobiason et al. May 2007 A1
20070125390 Afriat et al. Jun 2007 A1
20070129602 Bettesh et al. Jun 2007 A1
20070229850 Herber Oct 2007 A1
20070273894 Johnson Nov 2007 A1
20070276195 Xu et al. Nov 2007 A1
20070276309 Xu et al. Nov 2007 A1
20080006282 Sukovic Jan 2008 A1
20080021329 Wood et al. Jan 2008 A1
20080045807 Psota et al. Feb 2008 A1
20080088704 Wendelken et al. Apr 2008 A1
20080098322 Champion et al. Apr 2008 A1
20080126478 Ferguson et al. May 2008 A1
20080165357 Stern Jul 2008 A1
20080232679 Hahr Sep 2008 A1
20080246759 Summers Oct 2008 A1
20080275315 Oka et al. Nov 2008 A1
20080285056 Blayvas Nov 2008 A1
20080312642 Kania et al. Dec 2008 A1
20080312643 Kania et al. Dec 2008 A1
20090116712 Al-Moosawi et al. May 2009 A1
20090118720 Black et al. May 2009 A1
20090221874 Vinther Sep 2009 A1
20090225333 Bendal Sep 2009 A1
20090234313 Mullejeans et al. Sep 2009 A1
20100004564 Jendle Jan 2010 A1
20100020164 Perrault Jan 2010 A1
20100091104 Sprigle et al. Apr 2010 A1
20100111387 Christiansen, II et al. May 2010 A1
20100113940 Sen et al. May 2010 A1
20100121201 Papaioannou May 2010 A1
20100149551 Malinkevich Jun 2010 A1
20100156921 McLennan et al. Jun 2010 A1
20100191126 Al-Moosawi et al. Jul 2010 A1
20100278312 Ortiz Nov 2010 A1
20110102550 Daniel et al. May 2011 A1
20110125028 Wood et al. May 2011 A1
20110190637 Knobel et al. Aug 2011 A1
20120035469 Whelan et al. Feb 2012 A1
20120059266 Davis et al. Mar 2012 A1
20120078088 Whitestone et al. Mar 2012 A1
20120078113 Whitestone et al. Mar 2012 A1
20120253200 Stolka et al. Oct 2012 A1
20120265236 Wesselmann et al. Oct 2012 A1
20120275668 Chou et al. Nov 2012 A1
20130051651 Leary et al. Feb 2013 A1
20130162796 Bharara et al. Jun 2013 A1
20130335545 Darling Dec 2013 A1
20140048667 Drzymala et al. Feb 2014 A1
20140088402 Xu Mar 2014 A1
20140354830 Schafer et al. Dec 2014 A1
20150077517 Powers Mar 2015 A1
20150089994 Richards Apr 2015 A1
20150142462 Vaidya et al. May 2015 A1
20150150457 Wu et al. Jun 2015 A1
20150214993 Huang Jul 2015 A1
20150250416 LaPlante et al. Sep 2015 A1
20150265236 Garner Sep 2015 A1
20150270734 Davison et al. Sep 2015 A1
20160259992 Knodt et al. Sep 2016 A1
20160100790 Cantu et al. Apr 2016 A1
20160157725 Munoz Jun 2016 A1
20160206205 Wu et al. Jul 2016 A1
20160261133 Wang Sep 2016 A1
20160262659 Fright et al. Sep 2016 A1
20160275681 D'alessandro Sep 2016 A1
20160284084 Gurcan et al. Sep 2016 A1
20160338594 Spahn et al. Nov 2016 A1
20170076446 Pedersen et al. Mar 2017 A1
20170079577 Fright et al. Mar 2017 A1
20170084024 Gurevich Mar 2017 A1
20170085764 Kim et al. Mar 2017 A1
20170086940 Nakamura Mar 2017 A1
20170127196 Blum et al. May 2017 A1
20170258340 Przybyszewski et al. Sep 2017 A1
20170262985 Finn et al. Sep 2017 A1
20170303790 Bala et al. Oct 2017 A1
20170303844 Baker et al. Oct 2017 A1
20180132726 Dickie et al. May 2018 A1
20180214071 Fright et al. Aug 2018 A1
20180252585 Burg Sep 2018 A1
20180279943 Budman et al. Oct 2018 A1
20180296092 Hassan et al. Oct 2018 A1
20180303413 Hassan et al. Oct 2018 A1
20180322647 Harrington et al. Nov 2018 A1
20180336720 Larkins et al. Nov 2018 A1
20190133513 Patwardhan May 2019 A1
20190240166 Jung et al. Aug 2019 A1
20190290187 Adiri et al. Sep 2019 A1
20190298183 Burg et al. Oct 2019 A1
20190298252 Patwardhan Oct 2019 A1
20190307337 Little et al. Oct 2019 A1
20190307400 Zhao et al. Oct 2019 A1
20190310203 Burg et al. Oct 2019 A1
20190336003 Patwardhan Nov 2019 A1
20190350535 Zhao et al. Nov 2019 A1
20190369418 Joy et al. Dec 2019 A1
20200014910 Larkins Jan 2020 A1
20200121245 Barclay et al. Apr 2020 A1
20200126226 Adiri et al. Apr 2020 A1
20200126227 Adiri et al. Apr 2020 A1
20200196962 Zhao et al. Jun 2020 A1
20200209214 Zohar et al. Jul 2020 A1
20200211193 Adiri et al. Jul 2020 A1
20200211228 Adiri et al. Jul 2020 A1
20200211682 Zohar et al. Jul 2020 A1
20200211693 Adiri et al. Jul 2020 A1
20200211697 Adiri et al. Jul 2020 A1
20200225166 Burg et al. Jul 2020 A1
20200234444 Budman et al. Jul 2020 A1
20200286600 De Brouwer et al. Sep 2020 A1
20200297213 Patwardhan Sep 2020 A1
20200359971 Zhao et al. Nov 2020 A1
20200364862 Dacosta et al. Nov 2020 A1
20200383631 Canfield et al. Dec 2020 A1
20210000387 Zizi Jan 2021 A1
20210004995 Burg et al. Jan 2021 A1
20210068664 Fright et al. Mar 2021 A1
20210219907 Fright et al. Jul 2021 A1
20210386295 Dickie et al. Dec 2021 A1
20220215538 Robinson et al. Jul 2022 A1
Foreign Referenced Citations (92)
Number Date Country
549703 Mar 2012 AT
110326029 Oct 2019 CN
2642841 Mar 1978 DE
3420588 Dec 1984 DE
4120074 Jan 1992 DE
355221 Feb 1990 EP
552526 Jul 1993 EP
650694 May 1995 EP
1210906 Jun 2002 EP
1248237 Oct 2002 EP
1351036 Oct 2003 EP
1303267 Apr 2004 EP
1584405 Oct 2005 EP
1611543 Jan 2006 EP
1467706 Mar 2007 EP
1946567 Jul 2008 EP
119660 May 2009 EP
2272047 Mar 2012 EP
2883037 Jun 2015 EP
3114462 Jan 2017 EP
3143378 Mar 2017 EP
3160327 May 2017 EP
2750673 Aug 2017 EP
3251332 Dec 2017 EP
3270770 Jan 2018 EP
3286695 Feb 2018 EP
3364859 Aug 2018 EP
3365057 Aug 2018 EP
3371779 Sep 2018 EP
3371780 Sep 2018 EP
3381015 Nov 2019 EP
3586195 Jan 2020 EP
3589187 Jan 2020 EP
3602501 Feb 2020 EP
3555856 Apr 2020 EP
3655924 May 2020 EP
3371781 Sep 2020 EP
3707670 Sep 2020 EP
4183328 May 2023 EP
2384086 Jun 2012 ES
2570206 Mar 1986 FR
2458927 Nov 2012 GB
2544263 May 2017 GB
2544460 May 2017 GB
2544725 May 2017 GB
2545394 Jun 2017 GB
2557633 Jun 2018 GB
2557928 Jul 2018 GB
2559977 Aug 2018 GB
2559978 Aug 2018 GB
2011516849 May 2011 JP
5467404 Apr 2014 JP
293713 Sep 1997 NZ
588740 Jul 2012 NZ
WO2000003210 Jan 2000 WO
WO2000030337 May 2000 WO
WO2002001143 Jan 2002 WO
WO2002065069 Aug 2002 WO
WO2002093450 Nov 2002 WO
WO2004092874 Oct 2004 WO
WO2004095372 Nov 2004 WO
WO2005033620 Apr 2005 WO
WO2006078902 Jul 2006 WO
WO2007029038 Mar 2007 WO
WO2007043899 Apr 2007 WO
WO2007059780 May 2007 WO
WO2008033010 Mar 2008 WO
WO2008039539 Apr 2008 WO
WO2008048424 Apr 2008 WO
WO2008057056 May 2008 WO
WO2008071414 Jun 2008 WO
WO2008080385 Jul 2008 WO
WO2009046218 Apr 2009 WO
WO2009122200 Oct 2009 WO
WO2010048960 May 2010 WO
WO2012146720 Nov 2012 WO
WO2016069463 May 2016 WO
WO2016199134 Dec 2016 WO
WO2017077276 May 2017 WO
WO2017077277 May 2017 WO
WO2017077279 May 2017 WO
WO2017089826 Jun 2017 WO
WO2018109453 Jun 2018 WO
WO2018109479 Jun 2018 WO
WO2018154271 Aug 2018 WO
WO2018154272 Aug 2018 WO
WO2018185560 Oct 2018 WO
WO2019239106 Dec 2019 WO
WO2019239147 Dec 2019 WO
WO2020014779 Jan 2020 WO
WO2020141346 Jul 2020 WO
WO2020251938 Dec 2020 WO
Non-Patent Literature Citations (166)
Afromowitz, et al., “Multispectral Imaging of Burn Wounds: A New Clinical Instrument for Evaluating Burn Depth”, IEEE Transactions on Biomedical Engineering, vol. 35, No. 10, pp. 842-850; Oct. 1988.
Ahn et al., “Advances in Wound Photography and Assessment Methods,” Advances in Skin & Wound Care, Feb. 2008, pp. 85-93.
Ahroni, JH et al., “Reliability of computerized wound surface area determinations” Wounds: A Compendium of Clinical Research and Practice, No. 4, (1992) 133-137.
Anderson, R., et al. “The Optics of Human Skin”, The Journal of Investigative Dermatology, vol. 77, No. 1, pp. 13-19; Jul. 1981.
Armstrong, DG et al “Diabetic foot ulcers: prevention, diagnosis and classification” Am Fam Physician (Mar. 15, 1998); 57 (6) :1325-32, 1337-8.
Bale, S, Harding K, Leaper D. An Introduction to Wounds. Emap Healthcare Ltd 2000.
Beaumont, E et al “RN Technology Scorecard: Wound Care Science at the Crossroads” American Journal of Nursing (Dec. 1998); 98(12):16-18, 20-21.
Bergstrom, N, Bennett MA, Carlson CE. Treatment of Pressure Ulcers: Clinical Practice Guideline No. 15. Rockville, MD: U.S. Department of Health and Human Services. Public Health Service, Agency for Health Care Policy and Research 1994: 95-0652: [O].
Berriss 1997: Automatic Quantitative Analysis of Healing Skin Wounds using Colour Digital Image Processing: William Paul Berriss, Stephen John Sangwine [E].
Binder, et al., “Application of an artificial neural network in epiluminescence microscopy pattern analysis of pigmented skin lesions: a pilot study”, British Journal of Dermatology 130; pp. 460-465; 1994.
Bland, JM et al “Measurement error and correlation coefficients” BMJ (Jul. 6, 1996); 313 (7048) :41-2.
Bland, JM et al “Measurement error” BMJ (Jun. 29, 1996); 312 (7047) :1654.
Bohannon, Richard; Barbara A. Pfaller. Documentation of Wound Surface Area from Tracings of Wound Perimeters [E].
Bolton, L., “Re: Measuring Wound Length, Width, and Area: Which Technique?” Letters, Advances in Skin & Wound Care, pp. 450-452, vol. 21, No. 10.
Bostock, et al, Toward a neural network based system for skin cancer diagnosis; IEEE Conference on Artificial neural Networks, ISBN: 0-85296-573-7, pp. 215-219, May 1993.
BPG2005: Assessment and Management of Foot Ulcers for People with Diabetes: Nursing Best Practice Guidelines, Toronto, Ontario [E], Mar. 2013.
Briers, J.D., “Laser speckle contrast imaging for measuring blood flow,” Optica Applicata, 2007, pp. 139-152, vol. XXXVII, No. 1-2.
Briggs Corporation: Managed care making photo documentation a wound care standard. Wound care solutions product catalog 1997.
Brown, G “Reporting outcomes for Stage IV pressure ulcer healing: a proposal” Adv Skin Wound Care (2000)13:277-83.
Callieri 2003: Callieri M, Cignoni P, Pingi P, Scopigno R. Derma: Monitoring the evolution of skin lesions with a 3D system, VMV 2003. 8th International Fall Workshop, Vision, Modeling, and Visualization 2003, Nov. 19-21, 2003, Munich, Germany [E].
Campana: XML-based synchronization of mobile medical devices [E], 2002, 2 Pages.
Cardinal et al., “Early healing rates and wound area measurements are reliable predictors of later complete wound closure,” Wound Rep. Reg., 2008, pp. 19-22, vol. 16.
Cardinal et al., “Wound shape geometry measurements correlate to eventual wound healing,” Wound Rep. Reg., 2009, pp. 173-178, vol. 17.
Cascinelli, N., et al. “Results obtained by using a computerized image analysis system designed as an aid to diagnosis of cutaneous melanoma”, Melanoma Research, vol. 2, pp. 163-170, 1992.
Cleator et al., “Mobile wound care: Transforming care through technology,” Rehab & Community Care Medicine, Winter 2008, pp. 14-15.
Collins, C et al “The Role of Ultrasound in Lower Extremity Wound Management” International Journal of Lower Extremity Wounds (2002) 1: 229-235.
Daubechies, I., “The Wavelet Transform, Time-Frequency Localization and Signal Analysis”, IEEE Trans Inform Theory, vol. 36, No. 5, pp. 961-1005; Sep. 1990.
De Vet, HC et al “Current challenges in clinimetrics” J Clin Epidemiol (Dec. 2003); 56 (12) :1137-41.
De Vet, H C., et al., “When to use agreement versus reliability measures”, J Clin Epidemiol 59 (10), (Oct. 2006), 1033-9.
Debray, M., Couturier P, Greuillet F, Hohn C, Banerjee S, Gavazzi G, Franco A. “A preliminary study of the feasibility of wound telecare for the elderly.” Journal of Telemedicine & Telecare 2001: 7(6): 353-8. [A].
Dowsett, C. et al., “Triangle of Wound Assessment—made easy”, Wounds Asia (www.woundasia.com), May 2015.
Duckworth et al., “A Clinically Affordable Non-Contact Wound Measurement Device,” 2007, pp. 1-3.
Duff, et al. (2003), Loftus Hills A, Morrell C 2000, Clinical Guidelines for the management of venous leg ulcers: Implementation Guide. Royal College of Nursing; 2000: 001 (213): 1-48. [E].
Ercal, F., “Detection of Skin Tumor Boundaries in Color Images”, IEEE Transactions on Medical Imaging, vol. 12, No. 3, pp. 624-627, Sep. 1993.
Ercal, F., et al. “Neural Network Diagnosis of Malignant Melanoma From Color Images”, IEEE Transactions on Biomedical Engineering, vol. 41, No. 9, pp. 837-845, Sep. 1994.
Ferrell, B “Pressure ulcers. Assessment of healing” Clin Geriatr Med (1997) 13:575-87.
Fette, A.M., “A clinimetric analysis of wound measurement tools,” World Wide Wounds, 2006, [retrieved on Jul. 26, 2006]. Retrieved from the Internet: <URL: http://www.worldwidewounds.com/2006/January/Fette/Clinimetric-Ana . . . >, 6 pages.
Fitzpatrick et al., “Evaluating patient-based outcome measures for use in clinical trials,” Health Technology Assessment, 1998, vol. 2, No. 14, 86 pages.
Flahr et al., “Clinimetrics and Wound Science,” Wound Care Canada, 2005, pp. 18-19, 48, vol. 3, No. 2.
Flanagan, M. “Improving accuracy of wound measurement in clinical practice” Ostomy Wound Manage (Oct. 2003), 49(10):28-40.
Flanagan, M., “Wound measurement: can it help us to monitor progression to healing?” J Wound Care (May 2003); 12(5):189-94.
Gethin et al., “Wound Measurement: the contribution to practice,” EWMA Journal, 2007, pp. 26-28, vol. 7, No. 1.
Gilman, T “Wound outcomes: the utility of surface measures” Int J Low Extrem Wounds (Sep. 2004); 3 (3) :125-32.
Goldman, RJ “The patientcom, 1 year later” Adv Skin Wound Care (Nov.-Dec. 2002); 15 (6) :254, 256.
Goldman, RJ et al “More than one way to measure a wound: An overview of tools and techniques” Adv Skin Wound Care (2002) 15:236-45.
Golston, et al. “Automatic Detection of Irregular Borders in Melanoma and Other Skin Tumors”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 199-203, 1992.
Graaf, R., et al. “Optical properties of human dermis in vitro and in vivo”, Applied Optics, vol. 32, No. 4, pp. 435-447, Feb. 1, 1993.
Greene, A., “Computer image analysis in the diagnosis of melanoma”, Journal of the American Academy of Dermatology; vol. 31, No. 6, pp. 958-964, 1994.
Griffin, JW et al “A comparison of photographic and transparency-based methods for measuring wound surface area” Phys Ther (Feb. 1993); 73 (2) :117-22.
Haghpanah et al., “Reliability of Electronic Versus Manual Wound Measurement Techniques,” Arch Phys Med Rehabil, Oct. 2006, pp. 1396-1402, vol. 87.
Hansen 1997: Wound Status Evaluation Using Color Image Processing: Gary L. Hansen, Ephraim M. Sparrow, Jaydeep Y. Kokate, Keith J. Leland, and Paul A. Iaizzo [E].
Hayes 2003: Hayes S, Dodds S. Digital photography in wound care. Nursing Times 2003: 99(42):48-9. [A].
Herbin, et al, Color Quantitation Through Image Processing in Dermatology; IEEE Transaction on Medical Imaging, vol. 9, Issue 3, pp. 262-269, Sep. 1990.
Hibbs, P “The economics of pressure ulcer prevention” Decubitus (Aug. 1988); 1 (3) :32-8.
Houghton 2000: Houghton PE, Kincaid CB, Campbell KE, Woodbury MG, Keast DH. Photographic assessment of the appearance of chronic pressure and leg ulcers. Ostomy Wound management 2000: 46(4): 20-6, 28-30. [A].
HSA Global, “Mobile Wound Care”, Marketing material (2009).
Huang, C., et al. “Border irregularity: atypical moles versus melanoma”, Eur J Dermatol, vol. 6, pp. 270-273, Jun. 1996.
Iakovou, D. et al., “Integrated sensors for robotic laser welding,” Proceedings of the Third International WLT-Conference on Lasers in Manufacturing, Jun. 2005, pp. 1-6.
International Search Report and Written Opinion for International Application No. PCT/US2004/028445 filed Sep. 1, 2004.
International Search Report and Written Opinion dated Jan. 23, 2019, International Application No. PCT/IB2018/000447, 20 pages.
International Search Report and Written Opinion dated Jul. 2, 2019, International Application No. PCT/IB2018/001572, 17 pages.
International Search Report and Written Opinion dated Mar. 1, 2007, International Application No. PCT/NZ2006/000262, 12 pages.
Johnson, JD (1995) Using ulcer surface area and volume to document wound size, J Am Podiatr Med Assoc 85(2), (Feb. 1995), 91-5.
Jones, et al, An Instrument to Measure the Dimension of Skin Wounds; IEEE Transaction on Biomedical Engineering, ISSN: 0018-9294; vol. 42, Issue 5, pp. 464-470, May 1995.
Jones, TD “Improving the Precision of Leg Ulcer Area Measurement with Active Contour Models”, PhD Thesis (1999) http://www.comp.glam.ac.uk/pages/staff/tjones/ThesisOL/Title.htm.
Jones, TD et al “An active contour model for measuring the area of leg ulcers” IEEE Trans Med Imaging (Dec. 2000); 19(12):1202-10.
Kecelj-Leskovec et al., “Measurement of venous leg ulcers with a laser-based three-dimensional method: Comparison to computer planimetry with photography,” Wound Rep Reg, 2007, pp. 767-771, vol. 15.
Kenet, R., et al. “Clinical Diagnosis of Pigmented Lesions Using Digital Epiluminescence Microscopy”, Arch Dermatol, vol. 129, pp. 157-174; Feb. 1993.
Khashram et al., “Effect of TNP on the microbiology of venous leg ulcers: a pilot study,” J Wound Care, Apr. 2009, pp. 164-167, vol. 18, No. 4.
Kloth, LC et al “A Randomized Controlled Clinical Trial to Evaluate the Effects of Noncontact Normothermic Wound Therapy on Chronic Full- thickness Pressure Ulcers” Advances in Skin & Wound Care (Nov.-Dec. 2002), 15(6):270-276.
Korber et al., “Three-dimensional documentation of wound healing: First results of a new objective method for measurement,” JDDG, Oct. 2006, (Band 4), pp. 848-854.
Koren, et al, Interactive Wavelet Processing and Techniques Applied to Digital Mammography; IEEE Conference Proceedings, ISBN: 0-7803-3192-3; vol. 3, pp. 1415-1418, May 1996.
Kovesi, P., “Image Features From Phase Congruency”, University of Western Australia, pp. 1-30; Technical Report 9/4, Revised Jun. 1995.
Krouskop, TA et al “A noncontact wound measurement system” J Rehabil Res Dev (May-Jun. 2002) 39(3):337-45.
Kumar et al., “Wound Image Analysis Classifier For Efficient Tracking Of Wound Healing Status,” Signal & Image Processing: An International Journal (SIPIJ), vol. 5, No. 2, Apr. 2014, pp. 15-27.
Kundin 1989: Kundin JI. A new way to size up a wound. American Journal of Nursing 1989: 89(2):206-7.
Langemo et al., “Measuring Wound Length, Width, and Area: Which Technique?”, Advances in Skin & Wound Care, Jan. 2008, pp. 42-45, vol. 21, No. 1.
Langemo, DK et al “Comparison of 2 Wound Volume Measurement Methods” Advances in Skin & Wound Care (Jul.-Aug. 2001), vol. 14(4), 190-196.
Langemo, DK et al “Two-dimensional wound measurement: comparison of 4 techniques” Advances in Wound Care (Nov.-Dec. 1998), 11(7):337-43.
Laughton, C et al “A comparison of four methods of obtaining a negative impression of the foot” J Am Podiatr Med Assoc (May 2002); 92 (5) :261-8.
Lee, et al, A Multi-stage Segmentation Method for Images of Skin Lesions; IEEE Conference Proceedings on Communication, Computers, and Signal Processing, ISBN 0-7803-2553-2, pp. 602-605, May 1995.
Levoy, et al. “The Digital Michelangelo Project: 3D Scanning Of Large Statues,” ACM, 2000.
Lewis 1997: Lewis P, McCann R, Hidalgo P, Gorman M. Use of store and forward technology for vascular nursing teleconsultation service. Journal of Vascular Nursing 1997. 15(4): 116-23. [A].
Lewis, JS, Achilefu S, Garbow JR, Laforest R, Welch MJ., Small animal imaging: current technology and perspectives for oncological imaging, Radiation Sciences, Washington University School of Medicine, Saint Louis, MO, USA, Eur J Cancer. Nov. 2002;38(16):2173-88.
Li, D. 2004, Database design and implementation for wound measurement system. Biophotonics, 2004: 42-43. [E].
Liu et al., “Wound measurement by curvature maps: a feasibility study,” Physiol. Meas., 2006, pp. 1107-1123, vol. 27.
Lorimer, K “Continuity through best practice: design and implementation of a nurse-led community leg-ulcer service” Can J Nurs Res (Jun. 2004) 36(2):105-12.
Lowery et al., “Technical Overview of a Web-based Telemedicine System for Wound Assessment,” Advances in Skin & Wound Care, Jul./Aug. 2002, pp. 165-169, vol. 15, No. 4.
Lowson, S., “The safe practitioner: Getting the record straight: the need for accurate documentation,” J Wound Care, Dec. 2004, vol. 13, No. 10, [retrieved on Dec. 17, 2004]. Retrieved from the Internet: <URL: http://www.journalofwoundcare.com/nav?page=jowc.article&resource=1455125>, 2 pages.
Lucas, C., “Pressure ulcer surface area measurement using instant full-scale photography and transparency tracings,” Advances in Skin & Wound Care, Jan./Feb. 2002, [retrieved on Jul. 28, 2006]. Retrieved from the Internet: <URL: http://www.findarticles.com/p/articles/mi_qa3977/is_200201/ai_n904 . . . >, 7 pages.
Lunt, M.J., “Review of duplex and colour Doppler imaging of lower-limb arteries and veins,” World Wide Wounds, 2000, [retrieved on Apr. 17, 2005] Retrieved from the Internet: <URL: http://www.worldwidewounds.com/2000/sept/Michael-Lunt/Dopple . . . >, 6 pages.
Maglogiannis et al., “A system for the acquisition of reproducible digital skin lesions images,” Technol and Health Care, 2003, pp. 425-441, vol. 11.
Malian et al., “MEDPHOS: A New Photogrammetric System for Medical Measurement,” 2004, Commission V, WG V/3, 6 pages.
Mallat, S., et al. “Characterization of signals from multiscale edges”, IEEE Trans Pattern Anal Mach Intell; 14:710-732; 1992.
Marchesini, R., et al. “In vivo Spectrophotometric Evaluation of Neoplastic and Non-Neoplastic Skin Pigmented Lesions. III. CCD Camera-Based Reflectance Imaging”, Photochemistry and Photobiology, vol. 62, No. 1, pp. 151-154; 1995.
Marjanovic et al., “Measurement of the volume of a leg ulcer using a laser scanner,” Physiol. Meas., 1998, pp. 535-543, vol. 19.
Mashburn et al., “Enabling user-guided segmentation and tracking of surface-labeled cells in time-lapse image sets of living tissues,” Cytometry A., NIH Public Access, May 1, 2013, pp. 1-17.
Mastronicola et al., “Burn Depth Assessment Using a Tri-stimulus Colorimeter,” Wounds—ISSN: 1044-7946, Sep. 2005, pp. 255-258, vol. 17, No. 9.
McCardle, J., “Visitrak: wound measurement as an aid to making treatment decisions,” The Diabetic Foot, Winter 2005, [retrieved on Mar. 30, 2008]. Retrieved from the Internet: <URL: http://findarticles.com/p/articles/mi_mOMDQ/is_4_8/ai_n16043804/print>, 4 pages.
Menzies, S., “The Morphologic Criteria of the Pseudopod in Surface Microscopy”, Arch Dermatol, vol. 131, pp. 436-440, Apr. 1995.
Molnar et al., “Use of Standardized, Quantitative Digital Photography in a Multicenter Web-based Study,” 2009, ePlasty, pp. 19-26, vol. 9.
Nachbar, et al., “The ABCD rule of dermatology”, Journal of the American Academy of Dermatology, vol. 30, No. 4, pp. 551-559, Apr. 1994.
National Pressure Ulcer Advisory Panel, “FAQ: Photography for pressure ulcer documentation,” 11P56, 4 pages.
National Pressure Ulcer Advisory Panel, Position Statement, 1998, [retrieved on Jan. 6, 2005]. Retrieved from the Internet: <URL: http://www.npuap.org/>, 2 pages (Pressure Ulcer Healing Chart attached, 2 pages).
Oduncu et al., “Analysis of Skin Wound Images Using Digital Color Image Processing: A Preliminary Communication,” Lower Extremity Wounds, 2004, pp. 151-156, vol. 3, No. 3.
Pages, Jordi, et al., “Plane-to-plane positioning from image-based visual servoing and structured light,” Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 28-Oct. 2, 2004, pp. 1004-1009.
Sani-Kick et al., “Recording and Transmission of Digital Wound Images with the Help of a Mobile Device,” 2002, 2 pages.
Santamaria et al., “The effectiveness of digital imaging and remote expert wound consultation on healing rates in chronic lower leg ulcers in the Kimberley region of Western Australia,” Primary Intention, May 2004, pp. 62-70, vol. 12, No. 2.
Schindewolf, et al. “Comparison of classification rates for conventional and dermatoscopic images of malignant and benign melanocytic lesions using computerized colour image analysis”, Eur J Dermatol, vol. 3, No. 4, pp. 299-303, May 1993.
Schindewolf, T., et al. “Classification of Melanocytic Lesions with Color and Texture Analysis Using Digital Image Processing”, The International Academy of Cytology, Analytical and Quantitative Cytology and Histology, vol. 15, No. 1, pp. 1-11, Feb. 1993.
Schindewolf, T., et al. “Evaluation of different image acquisition techniques for a computer vision system in the diagnosis of malignant melanoma”, Journal of the American Academy of Dermatology, vol. 31, No. 1, pp. 33-41, Jul. 1994.
Schultz et al., “Wound bed preparation: a systematic approach to wound management,” Wound Repair and Regeneration, Mar./Apr. 2003, pp. S1-S28, vol. 11, No. 2, Supplement.
Shaw et al., “An Evaluation of Three Wound Measurement Techniques in Diabetic Foot Wounds,” Diabetes Care, 2007, [retrieved on Mar. 30, 2008]. Retrieved from the Internet: <URL: http://care.diabetesjournals.org/cgi/content/full/30/10/2641?ck=nck>, 5 pages.
Sheehan et al., “Percent Change in Wound Area of Diabetic Foot Ulcers Over a 4-Week Period Is a Robust Predictor of Complete Healing in a 12-Week Prospective Trial,” Diabetes Care, Jun. 2003, pp. 1879-1882, vol. 26, No. 6.
Sheng, Chao, Brian W. Pogue, Hamid Dehghani, Julia A. O'Hara, P. J. Hoopes, Numerical light dosimetry in murine tissue: analysis of tumor curvature and angle of incidence effects upon fluence in the tissue, Proc. SPIE, vol. 4952, 39 (2003), DOI: 10.1117/12.474081, Online Publication Date: Jul. 28, 2003.
Smith & Nephew, “Leg ulcer guidelines: a pocket guide for practice,” National Guideline Clearinghouse, U.S. Dept of Health & Human Services, 2002, [retrieved on Jan. 10, 2012]. Retrieved from the Internet: <URL: http://guidelines.gov/content.aspx?id=9830&search=Pressure+Ulcer>, 17 pages.
Smith & Nephew, “Visitrak Wound Measurement Device,” Wound Management, [retrieved on Apr. 7, 2005]. Retrieved from the Internet: <URL: http://wound.smith-nephew.com/us/node.asp?NodeId=3120>, 7 pages.
Smith & Nephew, “Guidelines for the Management of Leg Ulcers in Ireland” www.smith-nephew.com.
Smith et al., “Three-Dimensional Laser Imaging System for Measuring Wound Geometry,” Lasers in Surgery and Medicine, 1998, pp. 87-93, vol. 23.
Sober, et al., “Computerized Digital Image Analysis: An Aid for Melanoma Diagnosis”, The Journal of Dermatology, vol. 21, pp. 885-890, 1994.
Solomon et al., “The use of video image analysis for the measurement of venous ulcers,” British J Dermatology, 1995, pp. 565-570, vol. 133.
Steiner, A., “In vivo epiluminescence microscopy of pigmented skin lesions. II. Diagnosis of small pigmented skin lesions and early detection of malignant melanoma”, Journal of the American Academy of Dermatology, vol. 17, No. 4, pp. 584-591; Oct. 1987.
Stoecker, et al. “Automatic Detection of Asymmetry in Skin Tumors”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 191-197, 1992.
Takiwaki, et al., “A rudimentary system for automatic discrimination among basic skin lesions on the basis of color analysis of video images”, Journal of the American Academy of Dermatology, vol. 32, No. 4, pp. 600-604, Apr. 1995.
Tellez, R., “Managed Care Making Photo Documentation a Wound Care Standard,” Wound Care, 1997, [retrieved on Aug. 29, 2005]. Retrieved from the Internet: <URL: http://woundcare.org/newsvol2n4/art.htm>, 2 pages.
Thali, M.J., et al. “Optical 3D surface digitizing in forensic medicine: 3D documentation of skin and bone injuries.” Forensic Science International. 2003.
Thawer et al., “A Comparison of Computer-Assisted and Manual Wound Size Measurement,” Ostomy Wound Management, Oct. 2002, pp. 46-53, vol. 48, No. 10.
Treuillet et al., “Three-Dimensional Assessment of Skin Wounds Using a Standard Digital Camera,” IEEE Transactions on Medical Imaging, May 2009, pp. 752-762, vol. 28, No. 5.
Umbaugh et al., “Automatic Color Segmentation Algorithms with Application to Skin Tumor Feature Identification”, IEEE Engineering in Medicine and Biology, pp. 75-82, Sep. 1993.
Umbaugh, et al., “An Automatic Color Segmentation Algorithm with Application to Identification of Skin Tumor Borders”, Computerized Medical Imaging and Graphics, vol. 16, No. 3, pp. 227-235, May-Jun. 1992.
Umbaugh, et al., “Automatic Color Segmentation of Images with Application to Detection of Variegated Coloring in Skin Tumors”, IEEE Engineering In Medicine and Biology Magazine, Dec. 1989, pp. 43-52.
Van Zuijlen et al., “Reliability and Accuracy of Practical Techniques for Surface Area Measurements of Wounds and Scars,” Lower Extremity Wounds, 2004, pp. 7-11, vol. 3, No. 1.
Vermolen et al., “A simplified model for growth factor induced healing of circular wounds,” 2005, pp. 1-15.
Voigt, H., et al. “Topodermatographic Image Analysis for Melanoma Screening and the Quantitative Assessment of Tumor Dimension Parameters of the Skin”, Cancer, vol. 75, No. 4, Feb. 15, 1995.
Walker, N, Rogers A, Birchall N, Norton R, MacMahon S. Leg ulcers in New Zealand: age at onset, recurrence and provision of care in an urban population. NZ Med J; 2002; 115(1156):286-9.
Walker, N, Vandal A, Holden K, Rogers A, Birchall N, Norton R, Triggs C, MacMahon S. Does capture-recapture analysis provide more reliable estimates of the incidence and prevalence of leg ulcers in the community? Aust NZJ Public Health 2002; 26(5):451-5.
Walker, N., Rodgers A, Birchall N, Norton R, MacMahon S. The occurrence of leg ulcers in Auckland: results of a population-based study. NZ Med J; 2002: 115 (1151): 159-162.
Wallenstein et al., “Statistical analysis of wound-healing rates for pressure ulcers,” Amer J Surgery, Jul. 2004 (Supplement), pp. 73S-78S, vol. 188.
Wang et al., “A comparison of digital planimetry and transparency tracing based methods for measuring diabetic cutaneous ulcer surface area,” Zhongguo Xiu Fu Chong Jian Wai Ke Za Zhi, May 2008, pp. 563-566, vol. 22, No. 5, [retrieved on Sep. 15, 2009]. Retrieved from the Internet: <URL: http://www.ncbi.nlm.nih.gov/pubmed/18630436?ordinalpos=1&itool=E . . . >, 1 page.
Wendelken et al., “Key Insights on Mapping Wounds With Ultrasound,” Podiatry Today, Jul. 2008, [retrieved on Jul. 14, 2008]. Retrieved from the Internet: <URL: http://www.podiatrytoday.com/article/5831>, 5 pages.
Wilbright, W.A., The Use of Telemedicine in the Management of Diabetes-Related Foot Ulceration: A Pilot Study, Advances in Skin & Wound Care, Jun. 2004, [retrieved on Jul. 28, 2006]. Retrieved from the Internet: <URL: http://www.findarticles.com/p/articles/mi_qa3977/is_200406/ai_n942 . . . >, 6 pages.
Wild et al., “Wound healing analysis and measurement by means of colour segmentation,” ETRS Poster Presentation V28, Sep. 15, 2005, V28-17, 1 page.
Williams, C., “The Verge Videometer wound measurement package,” British J Nursing, Feb./Mar. 2000, pp. 237-239, vol. 9, No. 4.
Woodbury et al., Pressure ulcer assessment instruments: a critical appraisal, Ostomy Wound Management, May 1999, pp. 48-50, 53-55, vol. 45, No. 5, [retrieved on Dec. 8, 2005]. Retrieved from the Internet: <URL: http://gateway.ut.ovid.com.ezproxy.otago.ac.nz/gw2/ovidweb.cgi>, 2 pages.
Zhao, et al, The Classification of the Depth of Burn Injury Using Hybrid Neural Network; IEEE Conference on Engineering in Medicine and Biology Society, ISBN 0-7803-2475-7; vol. 1, pp. 815-816, Sep. 1995.
Zimmet, “Venous Leg Ulcers: Evaluation and Management,” American College of Phlebology. 1998.
Notice of Allowance dated Feb. 24, 2021, U.S. Appl. No. 15/816,862, 18 pages.
Non-Final Office Action dated Nov. 24, 2021, U.S. Appl. No. 16/500,785, 34 pages.
Non-Final Office Action dated Oct. 24, 2022, U.S. Appl. No. 17/398,883, 18 pages.
Extended European Search Report for European Application No. EP22204772.2 filed Apr. 3, 2018, dated Apr. 13, 2023, 6 pages.
Final Office Action dated Jul. 13, 2023, U.S. Appl. No. 17/398,883, 22 pages.
Non-Final Office Action dated Mar. 29, 2023, U.S. Appl. No. 17/100,615, 10 pages.
Patete et al., “A non-invasive, three-dimensional, diagnostic laser imaging system for accurate wound analysis,” Physiol. Meas., 1996, pp. 71-79, vol. 17.
Payne, C., “Cost benefit comparison of plaster casts and optical scans of the foot for the manufacture of foot orthoses,” AJPM, 2007, pp. 29-31, vol. 41, No. 2.
Pehamberger, H., et al. “In vivo epiluminescence microscopy of pigmented skin lesions. I. Pattern analysis of pigmented skin lesions”, Journal of American Academy of Dermatology, vol. 17, No. 4, pp. 571-583, Oct. 1987.
Plassmann, et al. “Problems of Assessing Wound Size,” Wound Healing Research Unit, University of Wales College of Medicine, Cardiff CF4 4XN, Wales, UK (1993) (Unpublished).
Plassmann et al., “MAVIS: a non-invasive Instrument to measure area and volume of wounds,” Medical Engineering & Physics, 1998, pp. 332-338, vol. 20.
Plassmann, P., “Recording Wounds—Documenting Woundcare,” Medical Computing Group, 1998, pp. 1-31.
Plaza et al., “Minimizing Manual Image Segmentation Turn-Around Time for Neuronal Reconstruction by Embracing Uncertainty,” PLOS One, vol. 7, Issue 9, Sep. 2012, pp. 1-14.
Rogers et al., “Measuring Wounds: Which Stick to Use?”, Podiatry Management, Aug. 2008, pp. 85-90.
Romanelli et al., “Technological Advances in Wound Bed Measurements,” Wounds, 2002, pp. 58-66, vol. 14, No. 2, [retrieved on Apr. 8, 2005]. Retrieved from the Internet: <URL: http://www.medscape.com/viewarticle/430900_print>, 8 pages.
Russell, L., “The importance of wound documentation & classification,” British J Nursing, 1999, pp. 1342-1354, vol. 8, No. 20.
Salcido, R., “The Future of Wound Measurement,” Advances in Skin & Wound Care, Mar./Apr. 2003, pp. 54, 56, vol. 13, No. 2.
Salcido, R., “Pressure Ulcers and Wound Care,” Physical Medicine and Rehabilitation, eMedicine, 2006, [retrieved on]. Retrieved from the Internet: <URL: http://www.emedicine.com/pmr/topic179.htm>, 25 pages.
Salmhofer, et al., “Wound teleconsultation in patients with chronic leg ulcers,” 2005.
Related Publications (1)
Number Date Country
20220270746 A1 Aug 2022 US
Divisions (1)
Number Date Country
Parent 15974433 May 2018 US
Child 17005260 US
Continuations (2)
Number Date Country
Parent 17005260 Aug 2020 US
Child 17567270 US
Parent 15144722 May 2016 US
Child 15974433 US