The present disclosure relates to a hand hygiene compliance (HHC) system that, in addition to monitoring hand hygiene, provides data entry and messaging capabilities that allow healthcare workers to optimize their workflow and, in the process, improve the level of care they provide to each of their patients. More specifically, the HHC system includes a control unit that is associated with a hand hygiene dispenser and configured to enable use of a touch or touch-free user interface each time the control unit detects a parameter indicating use of the dispenser by an individual, wherein icons on the user interface allow the individual to, without limitation, communicate, enter, or update patient care information.
In 2002, the Centers for Medicare and Medicaid Services (CMS) asked the Agency for Healthcare Research and Quality (AHRQ) to develop a survey that measures a patient's perception of the level of care received during a hospital stay. The survey, now commonly referred to as the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey, includes twenty-seven (27) questions and is used to publicly report hospital performance (quality of care as perceived by patients). Accordingly, consumers (that is, potential patients) may rely on public reports of hospital performance to select a hospital. Further, in order to avoid losing as much as two percent (2%) of its Medicare/Medicaid reimbursement, a hospital must provide each patient with the HCAHPS survey at the time the patient is discharged.
Of the 27 questions included in the HCAHPS survey, a select number are related to pain management (that is, the frequency with which healthcare workers inquire about a patient's level of pain). Currently, when pain is assessed inside a patient's room, it is communicated in written form or called into a nurse's station. As such, because data related to pain management is not recorded electronically at the time of assessment, there exists a likelihood that said data will be forgotten or recorded incorrectly. This is problematic because this data serves as the only check against HCAHPS scores, specifically pain management scores.
The issue of healthcare-associated infections (HAIs) is well known within and outside the healthcare community. To date, many studies have been conducted in an effort to ascertain effective ways to reduce the occurrence of HAIs, and the clear majority find thorough cleansing of one's hands upon entering and exiting a patient's room to be the single most effective way to prevent the spread of HAIs. As a result, in an attempt to improve patient care, many hospitals have installed HHC systems to monitor healthcare workers' compliance with hand hygiene protocols. However, since HHC systems are limited to monitoring hand hygiene, which accounts for only one of a plurality of factors affecting patient care, the return-on-investment (ROI) for these systems has yet to be fully optimized.
Therefore, in order to improve documentation of pain management and other similar quality metrics encompassed in the HCAHPS survey, hospitals must implement systems that improve some or all of the many factors affecting patient care. Thus, there is a need for a system that combines the asset tracking capabilities of a real-time locating system (RTLS), the messaging capabilities of a nurse call system, and the hand hygiene monitoring capabilities of a HHC system.
The present disclosure may address one or more of the problems and deficiencies discussed above. However, it is contemplated that the disclosure may prove useful in addressing other problems and deficiencies in a number of technical areas. Therefore, the present disclosure should not necessarily be construed as limited to addressing any of the particular problems or deficiencies discussed herein.
Embodiments of the present disclosure provide a HHC system that, in addition to monitoring hand hygiene, provides data entry, messaging, and asset tracking capabilities which allow healthcare workers to optimize their workflow and, in the process, improve the quality of care they provide to each of their patients. In a preferred embodiment, the HHC system includes a communications network capable of detecting the presence of a person having a wearable tag, preferably in the form of a Radio Frequency Identification (RFID) tag, and monitoring whether the person washed his or her hands upon entering and exiting a patient's room. The HHC system also includes a control unit (that is, a device equipped with a sensor and communications devices) which further includes a feedback device in the form of a display and the hardware necessary to detect the wearable tag and communicate with a communications network, such as a wireless computer network. Through the communications network, the control unit may communicate with devices throughout the hospital, including, without limitation, servers, tablets, PDAs, cellular phones, desktop computers at an administrator's desk or nurses' station, or any other like device now existing or hereinafter developed.
The control unit is associated with a hand hygiene dispenser and is programmed to enable use of a touch or touch-free user interface each time the control unit detects a parameter indicating use of the hand hygiene dispenser by a person. In particular, icons displayed on the touch or touch-free user interface allow the person to, without limitation, communicate, enter, obtain, or update workflow information through the selection of one or more icon(s) displayed on the feedback device. In other words, the user interface cannot receive input unless and until the person complies with hand hygiene protocols by using the dispenser. Additionally, the term “icons” is used broadly to refer to a graphic or textual element displayed on the feedback device, the selection of which may execute a command, a macro, or cause new icons to be displayed on the feedback device.
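The gating behavior described above — the user interface refuses input unless and until a dispense event is detected, and re-locks after inactivity — can be sketched as a small state machine. This is a minimal illustration only; the class and method names (`ControlUnit`, `on_dispense`, `handle_selection`) and the 30-second idle timeout are assumptions for the example, not values from the disclosure.

```python
import time

LOCKOUT = object()  # sentinel: interface refuses input

class ControlUnit:
    """Illustrative sketch of a dispenser-gated user interface."""

    def __init__(self, idle_timeout=30.0):
        self.idle_timeout = idle_timeout  # assumed seconds before the UI re-locks
        self.enabled_at = None            # None => interface locked

    def on_dispense(self, now=None):
        # A dispense event (the "parameter indicating use") unlocks the UI.
        self.enabled_at = now if now is not None else time.monotonic()

    def is_enabled(self, now=None):
        if self.enabled_at is None:
            return False
        now = now if now is not None else time.monotonic()
        if now - self.enabled_at > self.idle_timeout:
            self.enabled_at = None        # re-lock after inactivity
            return False
        return True

    def handle_selection(self, icon, now=None):
        # Icon input is rejected unless hand hygiene was performed first.
        if not self.is_enabled(now):
            return LOCKOUT
        self.enabled_at = now if now is not None else time.monotonic()
        return f"selected:{icon}"
```

The key design point is that unlocking is driven solely by the dispense event, so compliance is a precondition for every interaction rather than a parallel, ignorable prompt.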
In one embodiment, healthcare workers enter or update existing patient information by selecting one or more icons of a touch-screen user interface (TUI) displayed on the feedback device of a control unit. More specifically, upon detecting use of the hand hygiene dispenser by a healthcare worker wearing a tag, the control unit enables use of the TUI to allow the healthcare worker to enter or update existing patient information, such as without limitation, a pain status indicator for a patient. The control unit preferably is programmed to prohibit use of the TUI and its associated icons unless and until a person complies with hand hygiene protocols by using the dispenser. Thus, if a healthcare worker needs to access the TUI to perform a required task, then the healthcare worker must comply with hand hygiene protocols. Otherwise, the healthcare worker cannot perform her job.
In another embodiment, the user interface is touch-free and, while enabled, allows healthcare workers to select icons displayed on the feedback device without physically touching the display. More specifically, the control unit includes a gesture-sense system which includes a plurality of transmitters, a receiver, and a controller. The transmitters can be configured to transmit a light-based signal, a heat-based signal, or a sound-based signal. The receiver measures reflected signals from an object, such as a user's hand, over a predetermined amount of time to detect motion of the object. The controller is associated with the receiver and uses an algorithm to match motion of the object to one of a plurality of predefined gestures which may include, without limitation, a right swipe, left swipe, hover, or enter gesture. In the event the object's motion matches one of the predefined touch-free gestures, the controller executes an action in response to the gesture. As an example of an action, the controller may change the selection status of an icon by moving a selection indicator (which may be represented by highlighting the icon) left or right or directly to a particular icon, or may execute the command associated with the icon, which may cause the controller to perform a function, macro, or modify those icons currently displayed on the feedback device.
Further, in response to detecting one or more gestures, the control unit communicates data over the communications network to the server, wherein the data may include, without limitation, the icon or sequence of icons selected. Upon receiving the data, a processor associated with the server is programmed to execute instructions specific to the data. Alternatively, in other embodiments, the control unit may include a processor that is programmed to execute instructions specific to the data.
These and other embodiments of the present disclosure will become readily apparent to those skilled in the art from the following detailed description of the embodiments having reference to the attached figures, the disclosure not being limited to any particular embodiment(s) disclosed.
The various embodiments of the present disclosure and their advantages may be understood by referring to
As used herein, “processing workflow information” means executing instructions in response to one or more icons selected from a user interface displayed on a feedback device associated with a control unit, wherein the control unit or a server in communication with the control unit may be configured to process workflow information. Likewise, the following terms shall be construed in the following manner: “entering workflow information” means receiving input from a person, wherein input is related to workflow information and includes, without limitation, entering new workflow information or updating existing workflow information; and “communicating workflow information” means to distribute workflow information to devices on the communications network or directly to a person through a communications interface, such as a feedback device on a control unit. The term “transmitters” broadly refers to any device operable to transmit a light-based, sound-based, or heat-based signal. The term “receiver” broadly refers to devices operable to measure signals reflected off an object in addition to ambient light levels in a room or area. The term “device” broadly refers to tablets, smart phones, PDAs, personal computers, servers and any other like device now existing or hereafter developed. Finally, the term “pain status inquiry event” refers to verbal or non-verbal communications between a person (e.g. healthcare worker) and a patient regarding the level of pain, if any, that the patient may be experiencing.
In
Referring to
At step (220), control branches again based upon actions of the person. If an icon on the TUI is not selected within a predetermined interval of time, then control branches to step (225) and the control unit (110) disables use of the TUI. Conversely, if an icon on the TUI is selected within the predetermined interval of time, then control branches to step (230) and, as a response to the icon most recently selected, the graphics processor (130) performs a function, macro, or generates new icons to display on the feedback device (120). At step (235), the graphics processor (130) updates the TUI in response to the icon most recently selected. At step (240), control branches again based upon actions of the person. If additional icons are selected, then iterations of steps (230) and (235) are executed until the predetermined interval of time passes without an icon of the TUI being selected. Once this condition is satisfied, control branches to step (245) and the communications device (140) communicates data over the communications network to the server (150), wherein the server (150) processes workflow information. Alternatively, in other embodiments, the control unit (110) may be programmed or configured to process workflow information.
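The control flow of steps (220) through (245) can be sketched as a simple session loop: accept selections until the predetermined interval elapses with no input, then transmit the session data for processing. The function names and callback interface below are illustrative assumptions, not part of the disclosure.

```python
def run_tui_session(get_selection, apply_icon, send_to_server, timeout=15.0):
    """Sketch of the TUI session loop (steps 220-245); names are assumptions.

    get_selection(timeout) -> icon name, or None when the interval elapses
    apply_icon(icon)       -> update the displayed menu (steps 230/235)
    send_to_server(icons)  -> communicate session data (step 245)
    """
    selected = []
    while True:
        icon = get_selection(timeout)
        if icon is None:              # steps 220/240: interval elapsed, no input
            break
        apply_icon(icon)              # steps 230/235: function, macro, or new icons
        selected.append(icon)
    if selected:
        send_to_server(selected)      # step 245: server processes workflow info
    return selected
```

For example, a session in which a worker records a pain score and notes the patient was asleep would accumulate both selections and transmit them as a single batch when the timeout fires.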
Referring now to
As shown in
The control unit (300) also transmits results (that is, the pain status indicator) of the pain status inquiry event to a server (not shown) via a wired or wireless network, wherein the server assigns a timestamp for the event and stores the pain status indicator in memory associated with the server. It is understood that the aforementioned results may be the numerical value or range of numerical values assigned to the pain status indicator, or a unique code associated with the pain status indicator. Alternatively, the control unit (300) may be programmed to assign the timestamp and store results locally on a memory associated with the control unit (300). Still further, the control unit (300) may be programmed to transmit results from memory to the server over a wired or wireless network.
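A minimal sketch of the server-side handling described above — receive the pain status indicator, assign a timestamp, and store the event — might look as follows. The record layout, field names, and `record_pain_status` function are assumptions for illustration only.

```python
import time

def record_pain_status(indicator, worker_id, server_store, clock=time.time):
    """Illustrative sketch: the server assigns a timestamp to the reported
    pain status indicator and stores the event. Schema is assumed."""
    event = {
        "type": "pain_status_inquiry",
        "indicator": indicator,   # numeric value, range, or unique code
        "worker": worker_id,
        "timestamp": clock(),     # server-assigned timestamp
    }
    server_store.append(event)
    return event
```

The injectable `clock` parameter is a sketch-level convenience; the same shape would apply whether the timestamp is assigned by the server or locally by the control unit (300).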
A report based on results may be generated by authorized personnel and viewed on a device, such as without limitation, a laptop or desktop computer, smartphone, PDA, or any other like device now existing or developed hereafter. More specifically, the report may include, without limitation, the number of pain status inquiry events performed over a predetermined interval of time (e.g., interval of time spanning a patient's admission to said patient's discharge) along with the pain status indicator recorded for each event. Further, the report may also include the number of instances where a pain status inquiry event proved unsuccessful due to the patient being asleep or otherwise unavailable. Still further, the report may include the name of a healthcare worker associated with each pain status inquiry event. The report may be compared against Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores so as to identify any lapses in implementing a hospital's pain management protocol. Still further, nurse managers may use the report to educate those individuals (that is, healthcare workers) that do not adhere to an established protocol regarding pain status inquiry events.
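The report contents described above (event counts over an interval, unsuccessful inquiries, per-worker attribution) amount to a straightforward aggregation over stored events. The sketch below assumes the illustrative event schema from the disclosure's description; none of the field names are dictated by it.

```python
def summarize_pain_events(events, start, end):
    """Illustrative aggregation for a pain-management report.

    Assumes each event is a dict with 'timestamp', 'indicator', and
    'worker' keys; an indicator of None marks an unsuccessful inquiry
    (e.g., patient asleep or otherwise unavailable).
    """
    in_window = [e for e in events if start <= e["timestamp"] <= end]
    return {
        "inquiries": len(in_window),
        "unsuccessful": sum(1 for e in in_window if e["indicator"] is None),
        "by_worker": {w: sum(1 for e in in_window if e["worker"] == w)
                      for w in {e["worker"] for e in in_window}},
    }
```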
The control unit (300) may be programmed to monitor a time lapse since a pain status indicator (323) was entered or most recently updated. As follows, if the time lapse exceeds a predetermined value, the control unit (300) may be programmed to generate a notification on the feedback device (320), wherein the notification (e.g., audio or visual notification) prompts healthcare workers within a predetermined proximity of the control unit (300) to perform a pain status inquiry event.
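The time-lapse check above reduces to a single comparison against the predetermined value. In the sketch below, the four-hour threshold is an assumed example of that predetermined value, not a figure from the disclosure.

```python
def needs_pain_inquiry(last_update, now, max_lapse=4 * 3600):
    """Sketch: True when the time since the pain status indicator was
    entered or last updated exceeds the predetermined value (assumed
    here to be four hours), or when no indicator has been entered."""
    return last_update is None or (now - last_update) > max_lapse
```

When this returns True, the control unit would raise the audio or visual notification prompting a nearby healthcare worker to perform a pain status inquiry event.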
Referring now to
By recording the ratio of R1 to R2 as well as the amplitude of R1 and R2, the controller can detect motion of the object (505) towards or away from the gesture sense system (510). For example, if the ratio of R1 to R2 remains substantially the same over a series of measurements, but the amplitudes measured for R1 and R2 increase or decrease, then the controller (535) interprets this as motion towards the gesture sense system (510) or away from the gesture sense system (510), respectively. Accordingly, motion of the object (505) towards the gesture sense system (510) is interpreted by the controller (535) as an enter gesture used to select an icon on a menu of icons displayed on the feedback device (540). Further, as discussed in more detail below, in addition to detecting motion in the Z-axis, the gesture sense system (510) is operable to detect motion of the object (505) in both the X- and Y-axes.
As an example, a positive motion in the X-axis can be interpreted as a right swipe, while a negative motion in the X-axis can be interpreted as a left swipe. Likewise, positive motion in the Z-axis can be interpreted as an enter gesture, and, although not shown, it is understood that one or more of the transmitters (520), (525) may be positioned along the Y-axis, rather than along the X-axis, to detect vertical motion of an object. The rate of movement may also be measured. For example, a higher rate of movement may correspond to a fast scroll while a slower rate of movement may correspond to a slow scroll. Further, once the controller (535) correlates the object's motion to one of a plurality of predefined touch-free gestures, the controller (535) sends a command to the graphics processor (544) to execute a function, macro, or modify the list of icons on a touch-free menu, a process discussed in more detail below.
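The ratio-and-amplitude logic above can be sketched as a small classifier over time-ordered (R1, R2) reflected-signal measurements. The thresholds, the mapping of a rising ratio to a right swipe, and the function name are all illustrative assumptions; the actual direction mapping depends on transmitter geometry not specified here.

```python
def classify_gesture(samples, ratio_tol=0.15):
    """Rough sketch of the gesture matching described above.

    `samples` is a time-ordered list of (R1, R2) reflected-signal
    amplitudes. A steady ratio with rising total amplitude is read as
    motion toward the sensors (enter gesture); a shifting ratio is read
    as lateral X-axis motion (a swipe). Thresholds are assumptions.
    """
    if len(samples) < 2:
        return None
    ratios = [r1 / r2 for r1, r2 in samples]
    amps = [r1 + r2 for r1, r2 in samples]
    # Ratio steady but amplitude rising => motion toward the sensors.
    if max(ratios) - min(ratios) <= ratio_tol:
        if amps[-1] > amps[0]:
            return "enter"
        if amps[-1] < amps[0]:
            return "retreat"
        return "hover"
    # Ratio shifting toward one transmitter => lateral (X-axis) motion.
    return "right_swipe" if ratios[-1] > ratios[0] else "left_swipe"
```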
Alternatively, in another embodiment, the control unit may be equipped with a capture device in the form of a camera, which may be used to visually monitor motion of a user. Further, the control unit may be programmed (e.g., with image or motion recognition software) to interpret motion of the user as controls that can be used to affect a touch-free menu displayed on a feedback device associated with the control unit. As such, a user may use her movements to navigate to or select one or more icons on the touch-free menu. In this particular embodiment, the control unit is programmed to enable the camera only after detecting use of a hand hygiene dispenser associated with the control unit. In other words, the user must comply with hand hygiene protocols before gaining access to the touch-free menu.
Referring now to
Similarly, as shown in
Referring again to
As shown in
Referring now to
At step (1015), if the gesture matches one of a plurality of predefined gestures, then control proceeds to step (1020) and the controller (435) sends a message to the graphics processor (444) to display the touch-free menu (450) on the feedback device (440). Next, at step (1025), control branches based upon actions of the person. If a second touch-free gesture is not detected by the gesture sense system (410), control branches to step (1030) and the control unit (400) disables use of the touch-free menu (450) after a predetermined interval of time. Conversely, if a second touch-free gesture is detected, then control branches to step (1035).
At step (1035), control branches again according to which predefined touch-free gesture the controller (435) matches with the second touch-free gesture. If the second touch-free gesture is a left swipe, then the controller (435) sends a message to the graphics processor (444) at step (1040) to shift a selection indicator left or up on the touch-free menu. If the second touch-free gesture is a right swipe, then the controller (435) sends a message to the graphics processor (444) at step (1045) to shift the selection indicator right or down. If the second touch-free gesture is an enter gesture, then the controller (435) sends a message to the graphics processor (444) at step (1050) to select whatever icon is currently highlighted by the selection indicator. It is understood that any combination of steps (1040), (1045), and (1050) may occur until a predetermined interval of time passes during which the gesture-sense system (410) is unable to detect a touch-free gesture that matches one of the predefined gestures in step (1035). When this end condition is met, control reverts to step (1030).
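The branching at steps (1040) through (1050) is a simple dispatch on the matched gesture: shift the selection indicator, or select the highlighted icon. The sketch below assumes the menu is a list of icon names and the selection indicator is an index; the function name and return convention are illustrative, not from the disclosure.

```python
def dispatch_gesture(gesture, menu, selected):
    """Sketch of steps (1040)-(1050): returns (new_index, selected_icon).

    selected_icon is None unless an enter gesture selects the icon
    currently highlighted by the selection indicator.
    """
    if gesture == "left_swipe":                     # step 1040: shift left/up
        return max(selected - 1, 0), None
    if gesture == "right_swipe":                    # step 1045: shift right/down
        return min(selected + 1, len(menu) - 1), None
    if gesture == "enter":                          # step 1050: select icon
        return selected, menu[selected]
    return selected, None                           # unmatched: no action
```

Repeated calls model the iteration described above, continuing until the predetermined interval passes without a matched gesture.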
The use of the terms “a” and “an” and “the” and similar referents in the context of describing the present disclosure (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by the context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the present disclosure and does not pose a limitation on the scope of the disclosure unless otherwise claimed. Also, no language in the specification should be construed as indicating any non-claimed element as essential to practicing the present disclosure.
Further, one of ordinary skill in the art will recognize that a variety of approaches for communicating workflow information with a HHC system may be employed without departing from the teachings of the present disclosure. Therefore, the foregoing description is considered in all respects to be illustrative and not restrictive.
This application is a continuation-in-part (CIP) application that claims the benefit of, and priority to, U.S. application Ser. No. 13/736,945, filed on Jan. 9, 2013.
Relationship | Number | Date | Country
---|---|---|---
Parent | 13736945 | Jan 2013 | US
Child | 14463621 | | US