This invention concerns a distributed patient monitoring system for visually monitoring patients and patient parameters using a plurality of portable processing devices in different remote locations.
Monitoring of patients, particularly patients in critical care, is a burdensome and labor-intensive task. This problem has been addressed by use of a centralized monitoring facility enabling a physician at a workstation of the centralized monitoring facility to monitor patient vital signs together with video and audio. One known remote centralized patient monitoring system, described in U.S. Pat. No. 6,804,656, provides fixed-location, static centralized monitoring of ICUs by a physician. The centralized monitoring employs a single command center, and a workstation provides a single display area operated by clinical personnel. However, this arrangement is fixed in location, inflexible in performance and architecture, and fails to accommodate high-bandwidth communication of patient related data. A system according to invention principles addresses these deficiencies and related problems.
A distributed patient monitoring system enables visual monitoring of patients and patient parameters using live motion video and audio data presented on multiple portable processing devices in different remote locations, in response to user selection of a specific patient related item in an image showing specific patient electronic medical record data or a patient census list, for example. A distributed patient monitoring system for visually monitoring patients and patient parameters using portable processing devices in different remote locations includes a monitoring processor. The monitoring processor is responsive to user initiated commands from multiple different portable processing devices in different remote locations and includes an input processor and a data processor. The input processor acquires vital sign parameters and associated video data representative of multiple sequences of video images of corresponding multiple different patients. The data processor processes the vital sign parameters and associated video data to provide processed first video and audio data representing an image sequence and supporting two-way audio communication, the image sequence including a composite image with a first area showing live video of a selected first patient and a second area presenting vital sign parameters of the selected first patient together with ancillary clinical data (e.g., laboratory results, physician notes). The data processor also processes the vital sign parameters and associated video data to provide processed second video data representing an image sequence including a composite image with a first area showing live video of a selected second patient and a second area presenting vital sign parameters of the selected second patient.
A communication network has sufficient bandwidth to communicate the processed first video data and second video data to first and second portable processing devices, respectively, of the multiple different portable processing devices in different remote locations, in response to commands received from the first and second portable processing devices respectively.
A distributed patient monitoring system enables a user to visually monitor patients and patient parameters using live motion video and audio data presented on multiple portable processing devices in different remote locations comprising distributed personal command centers. A mobile or stationary clinician in a healthcare enterprise monitors live motion video and audio data of a patient within a hospital room presented using a Web browser on a wireless tablet personal computer, palm pilot or other portable device. Execution of an individual command center application is initiated from within a patient specific display image view presenting specific patient electronic medical record data, in response to user selection of a specific patient related item or an item in a patient census list, for example. Patient identifier information is employed in acquiring video and audio data of a particular patient using association of the patient identifier with patient medical information and specific room and bed identifiers. The system in one embodiment advantageously employs a mobile hardware unit that enables viewing of any patient in the healthcare enterprise. Mobile units are located within an enterprise using radio frequency identification. Radio frequency identification tags placed on a mobile unit transmit location representative data to a centralized processor which associates the particular enterprise location with a patient location determined from the health information system. This enables mapping the location of the mobile viewing unit to a particular patient clinical record, thereby enabling a user to view video and hear audio directly from a patient bedside when selected via the health information system.
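The location-to-record mapping described above can be sketched as two lookups: the RFID reading places a mobile unit at an enterprise location, and the health information system census resolves that location to a patient. The table contents, unit identifiers and function name below are illustrative assumptions, not part of the specification.

```python
# Sketch of mapping a mobile unit's RFID-reported location to a patient
# record. The lookup tables and identifiers are hypothetical; in the system
# the association is maintained by the health information system.

# RFID tag readings: mobile unit id -> enterprise location id
rfid_locations = {"mobile-unit-7": "ICU-3-bed-B"}

# Health information system census: location id -> patient identifier
his_census = {"ICU-3-bed-B": "patient-12345"}

def patient_for_mobile_unit(unit_id):
    """Resolve the patient currently co-located with a mobile viewing unit."""
    location = rfid_locations.get(unit_id)
    if location is None:
        return None
    return his_census.get(location)
```

Once the patient identifier is resolved, the viewer can open the corresponding clinical record and the unit's bedside video and audio feeds.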
Multiple clinicians at multiple locations are able to concurrently view patient information as well as communicate verbally via an audio link with occupants of the patient room. Similarly, multiple viewing clinicians can, in turn and based on an on-line collaborative mechanism, alternately perform remote pan-tilt-zoom operation of the camera in the patient room. Patient parameters including vital sign data (heart rate, blood pressure, blood oxygen saturation, etc., including data normally taken and displayed from within patient flow sheets) are also visible in an electronic medical record display image view. A health information system application analyzes individual patient vital sign data by comparing discrete values (values that are validated by a nurse for inclusion in a patient record) with predetermined thresholds.
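The threshold comparison described above amounts to checking each nurse-validated discrete value against a predetermined normal range. The ranges and parameter names below are illustrative assumptions only, not clinical values from the specification.

```python
# Sketch of comparing nurse-validated discrete vital sign values against
# predetermined thresholds. Threshold ranges are illustrative assumptions.

thresholds = {
    "heart_rate": (50, 120),   # beats per minute
    "spo2": (90, 100),         # blood oxygen saturation, percent
    "systolic_bp": (90, 160),  # mmHg
}

def check_vitals(validated_vitals):
    """Return the names of parameters falling outside their normal range."""
    out_of_range = []
    for name, value in validated_vitals.items():
        low, high = thresholds[name]
        if not (low <= value <= high):
            out_of_range.append(name)
    return out_of_range
```

Parameters flagged here would drive the alert image element the system adds to the composite display.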
The system enables one or more clinicians to view live motion video and to transmit and receive audio via a Web-enabled plug-in software component that allows the user to view a specific patient as part of the normal patient care management process. Video and audio data are acquired from cameras located within patient rooms and are associated with specific patients via virtual electronic linkages that permit associating patient demographic information (including standard patient identifiers) with a patient location. This information is used to launch a patient specific data image display view via a Web-based electronic health record as a child process that acquires patient specific information and searches for this information to display within a Web browser on either a wired or wirelessly communicating computing device. Video and audio representative data derived at the point of care are also captured using a wireless mobile embodiment, thereby enabling viewing of any patient at any location within a healthcare enterprise. The patient information is viewed using a Web-based computer application that allows one or more distributed users to view a patient at any time from substantially any location within a healthcare organization. Multiple clinicians can view multiple patients concurrently or individually. In addition, patient parameter information is displayed and is visible to multiple clinicians concurrently. Discrete patient parameter (e.g., vital sign) information (validated by a nurse for inclusion within a patient health record) is processed using a rules engine to assess whether the parameter values fall within normal ranges or meet certain thresholds.
The system provides a visual and audio link with patients through a Web-enabled browser and displays this information through a context-based link that enables clinicians to view specific patients without requiring them to select the patients from a census list, thereby facilitating rapid review of the patients, and the parameters, within their care. Web-based accessibility from within the patient record allows for remote viewing and collaboration among healthcare professionals virtually anywhere within a healthcare enterprise, advantageously obviating the need for a clinician to return to, or contact, a centrally located command center.
A processor, as used herein, operates under the control of an executable application to (a) receive information from an input information device, (b) process the information by manipulating, analyzing, modifying, converting and/or transmitting the information, and/or (c) route the information to an output information device. In specific embodiments a processor determines location of a mobile video and audio unit at a patient bedside and provides the capability for multiple viewing healthcare professionals to concurrently view and communicate verbally with a patient or healthcare providers present at the patient bedside. A processor may use, or comprise the capabilities of, a controller or microprocessor, for example. The processor may operate with a display processor or generator. A display processor or generator is a known element for generating signals representing display images or portions thereof. A processor and a display processor may comprise a combination of hardware, firmware, and/or software.
An executable application, as used herein, comprises code or machine readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input. An executable procedure is a segment of code or machine readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters. A user interface (UI), as used herein, comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
The UI also includes an executable procedure or executable application. The executable procedure or executable application conditions the display processor to generate signals representing the UI display images. These signals are supplied to a display device which displays the image for viewing by the user. The executable procedure or executable application further receives signals from user input devices, such as a keyboard, mouse, light pen, touch screen or any other means allowing a user to provide data to a processor. The processor, under control of an executable procedure or executable application, manipulates the UI display images in response to signals received from the input devices. In this way, the user interacts with the display image using the input devices, enabling user interaction with the processor or other device. The functions and process steps (e.g., of
A workflow processor, as used herein, processes data to determine tasks to add to a task list, to remove from a task list, or to modify tasks incorporated on, or for incorporation on, a task list. A task list is a list of tasks for performance by a worker, a device, or a combination of both. A workflow processor may or may not employ a workflow engine. A workflow engine, as used herein, is a processor executing in response to predetermined process definitions that implement processes responsive to events and event associated data. The workflow engine implements processes in sequence and/or concurrently, responsive to event associated data, to determine tasks for performance by a device and/or worker and to update task lists of a device and a worker to include determined tasks. A process definition is definable by a user and comprises a sequence of process steps including one or more of start, wait, decision and task allocation steps for performance by a device and/or worker, for example. An event is an occurrence affecting operation of a process implemented using a process definition. The workflow engine includes a process definition function that allows users to define a process that is to be followed and includes an Event Monitor, which captures events occurring in a Healthcare Information System. A processor in the workflow engine tracks which processes are running, for which patients, and what step needs to be executed next, according to a process definition, and includes a procedure for notifying clinicians of a task to be performed, through their worklists (task lists), and a procedure for allocating and assigning tasks to specific users or specific teams.
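The event-to-task behavior described above can be sketched minimally: an event monitor receives an event, the matching step of a user-defined process definition fires, and the resulting task is appended to the assignee's task list. The event names, task text and team identifier below are hypothetical, and a real workflow engine would additionally handle sequencing, wait and decision steps.

```python
# Minimal sketch of a workflow engine dispatching on a user-defined process
# definition: an event arrives, the matching step fires, and the resulting
# task is appended to the assigned worker's task list (worklist).

task_lists = {"nurse-team-A": []}

# Process definition: event name -> (task description, assignee)
process_definition = {
    "vital_sign_alert": ("Review patient vital signs", "nurse-team-A"),
}

def handle_event(event_name, patient_id):
    """Event monitor callback: allocate the task the definition prescribes."""
    step = process_definition.get(event_name)
    if step is None:
        return  # no process defined for this event
    task, assignee = step
    task_lists[assignee].append(f"{task} ({patient_id})")

handle_event("vital_sign_alert", "patient-12345")
```

The dictionary-based process definition stands in for the user-editable definitions the specification describes; it illustrates the allocation step only.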
Server 20 includes monitoring processor 15, data processor 29, input processor 27, authentication processor 39, workflow processor 34 and rules processor 19. Monitoring processor 15 is responsive to user initiated commands from multiple different portable processing devices 12 and 14 in different remote locations and includes input processor 27 and data processor 29. Workflow processor 34 initiates, tracks and monitors task sequences performed by personnel and systems in response to events. Input processor 27 acquires vital sign parameters and associated video data representative of multiple sequences of video images of corresponding multiple different patients. Data processor 29 processes vital sign parameters and associated video data to provide processed first video data representing an image sequence including a composite image with a first area showing live video of a selected first patient and a second area presenting vital sign parameters of the selected first patient. Data processor 29 similarly provides processed second video data representing an image sequence including a composite image with a first area showing live video of a selected second patient and a second area presenting vital sign parameters of the selected second patient. Data processor 29 processes the vital sign parameters and associated video data to provide processed first and second video data by encoding the video data with a compression function compatible with MPEG-4, MPEG-2 or DIVX, for example.
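The composite image the data processor produces can be sketched as a two-area payload: a live video area tagged with the codec in use, and a vital sign area. The field names and function below are illustrative assumptions; actual encoding would be performed by an MPEG-4-compatible codec, not shown here.

```python
# Sketch of assembling the composite image payload described above: a first
# area referencing live video and a second area carrying vital signs.
# Field names, stream references and codec handling are illustrative only.

def build_composite(patient_id, video_stream_ref, vitals, codec="MPEG-4"):
    """Combine a live video reference and vitals into one display payload."""
    return {
        "patient": patient_id,
        "first_area": {"live_video": video_stream_ref, "codec": codec},
        "second_area": {"vital_signs": vitals},
    }

frame = build_composite("patient-12345", "camera-3/stream",
                        {"heart_rate": 72, "spo2": 97})
```

One such payload would be produced per selected patient, so devices 12 and 14 can each receive a composite for a different patient concurrently.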
Input processor 27 acquires audio data of multiple different patients, data processor 29 processes the audio data to provide processed audio data by encoding the audio data with a compression function, and communication network 21 has sufficient bandwidth to communicate the processed first video data, second video data and audio data to first and second portable processing devices 12 and 14 respectively. Network 21 has sufficient bandwidth to convey video and audio data between rooms and other devices of the network. Input processor 27 acquires the audio data from multiple different microphones in patient rooms associated with multiple different patients and similarly acquires the associated video data from multiple different cameras in patient rooms associated with the multiple different patients. Further, input processor 27 acquires the vital sign parameters from patient monitoring devices attached to the multiple different patients.
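The bandwidth requirement above is simply the sum of all concurrently viewed streams. The bitrates below are illustrative assumptions (the specification does not state figures), chosen as plausible values for MPEG-4 video and compressed audio.

```python
# Illustrative bandwidth check: the network must carry the aggregate bitrate
# of all concurrent video+audio sessions. Bitrates are assumptions, not
# measurements (e.g., ~1.5 Mbps MPEG-4 video, ~64 kbps audio per session).

def required_bandwidth_mbps(viewers, video_mbps=1.5, audio_mbps=0.064):
    """Aggregate bitrate for a number of concurrent video+audio sessions."""
    return viewers * (video_mbps + audio_mbps)

def network_sufficient(viewers, capacity_mbps):
    """True if the network capacity covers all concurrent sessions."""
    return required_bandwidth_mbps(viewers) <= capacity_mbps
```

Under these assumed bitrates, ten concurrent clinicians need roughly 15.6 Mbps, comfortably within a typical 100 Mbps hospital LAN segment.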
Communication network 21 has sufficient bandwidth to communicate the processed first video data and second video data to first and second portable processing devices 12 and 14 respectively of the multiple different portable processing devices in different remote locations in response to commands received from the first and second portable processing devices 12 and 14 respectively. Rules processor 19 applies rules to the vital sign parameters to identify an alert condition indicating a significant patient clinical condition or change of clinical condition. Data processor 29 processes data representing the alert condition for inclusion in the processed first video data and the composite image includes an image element indicating the alert condition. Further, authentication processor 39 enables a user to obtain access authorization to access patient data in response to entry of identification data using any portable processing device of the multiple portable processing devices in different remote locations.
System 10 supports distributed patient monitoring, without centralization, so a clinician may view the patients themselves via video, view their vital signs and listen to an individual patient or multiple selected patients from anywhere within an enterprise. Multiple clinicians may view multiple patients or a single patient concurrently through wireless or hand-held portable devices 12 and 14. Rules processor 19 analyzes validated discrete patient parameters by comparing patient vital sign parameters with predetermined thresholds. User interface processor 26 employs a Web browser application supporting viewing of video transmitted through existing hospital network 21 as MPEG-4 (for example) compatible compressed images. Patient rooms 41 incorporate equipment including cameras connected via network 21 to portable devices 12 and 14 and one or more servers (e.g., server 20). Portable devices 12 and 14 incorporate a virtual camera controller Web compatible application allowing a clinician to control pointing, zoom, focus and iris of patient room cameras in pan, tilt and zoom operations via a Web browser in wireless portable devices 12 and 14.
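The virtual camera controller's pan, tilt and zoom commands can be sketched as values clamped to the camera's mechanical limits before being issued. The ranges and command shape below are hypothetical; any real camera defines its own limits and control protocol.

```python
# Sketch of the virtual camera controller: the browser-side application
# issues pan/tilt/zoom values that are clamped to the camera's limits.
# Ranges and the command structure are hypothetical assumptions.

PAN_RANGE = (-170, 170)   # degrees
TILT_RANGE = (-30, 90)    # degrees
ZOOM_RANGE = (1.0, 10.0)  # optical zoom factor

def clamp(value, lo, hi):
    """Hold a requested value within the camera's supported range."""
    return max(lo, min(hi, value))

def ptz_command(pan, tilt, zoom):
    """Build a camera control command with values held within range."""
    return {
        "pan": clamp(pan, *PAN_RANGE),
        "tilt": clamp(tilt, *TILT_RANGE),
        "zoom": clamp(zoom, *ZOOM_RANGE),
    }
```

In the collaborative scheme described above, such commands would be accepted from one clinician at a time, with control passed among viewers in turn.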
Data is transmitted via network switch 436 from encoder units 430, 432 and 434 to an application server (e.g., application server 76
System 10 displays live motion video and audio information of patients within a health care enterprise in a Web-browser based window such that viewing of patient information may occur concurrently on multiple (distributed) wirelessly communicating computers, computing tablets or other stationary or mobile processing devices. The system enables a user to virtually control the viewing field of cameras located within patient rooms via a Web-browser based application that is downloaded from a remote application server and to adjust video views of multiple patients using the Web-browser based application. The system decodes and displays compressed video information of patients acquired from raw camera video feeds via a mobile computing platform. Rules processor 19 processes and analyzes patient parameters and laboratory test results, compares parameters with predetermined thresholds and notifies users as to whether values collected and validated by clinical staff (e.g., nursing) fall within or outside of acceptable ranges.
In step 917, authentication processor 39 enables a user to obtain access authorization to access patient data in response to entry of identification data using any portable processing device of the multiple portable processing devices in different remote locations. In step 919, user interface 26, in response to the access authorization, enables a user to initiate execution of a clinical information application providing a user with a clinical application display image identifying multiple different patients in corresponding multiple different locations and to select a particular patient of the multiple different patients in the clinical application display image. A display processor in user interface 26 in step 923 initiates generation of data representing an image sequence (processed video) for presentation in a composite image including a first area showing live video of a selected particular patient and a second area presenting vital sign parameters of the particular patient, in response to user selection of an image element associated with the particular patient of the multiple different patients in the clinical application display image. The image element associated with the selected patient comprises a hyperlink presented in a list of different patients in the display image provided by the clinical information application.
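The authorization step above can be sketched as a credential check that yields a token gating subsequent patient-data requests from any portable device. The credential store, hashing choice and token format are illustrative assumptions only; the specification does not prescribe an authentication mechanism.

```python
# Sketch of the access-authorization step: a user on any portable device
# submits identification data and receives a token for patient-data access.
# Credential store, hashing and token scheme are illustrative assumptions.
import hashlib

# Hypothetical credential store holding salted-free SHA-256 digests
# (a real deployment would use salted, purpose-built password hashing).
credentials = {"dr.smith": hashlib.sha256(b"s3cret").hexdigest()}

def authorize(user, password):
    """Return an access token on valid identification data, else None."""
    stored = credentials.get(user)
    if stored is None:
        return None
    if hashlib.sha256(password.encode()).hexdigest() != stored:
        return None
    return f"token-{user}"
```

A valid token would then accompany the hyperlink-driven request that launches the patient's composite video view.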
A first portable processing device 12 of multiple portable processing devices has a user interface 26 that enables a user, in response to access authorization, to initiate execution of a clinical information application providing a user with a clinical application display image identifying multiple different patients in corresponding different locations. Device 12 user interface 26 also enables a user to select a first patient of the multiple different patients in a clinical application display image and display an image sequence including a composite image comprising a first area showing live video of the first patient and a second area presenting vital sign parameters of the first patient. Similarly, second portable processing device 14 of the multiple portable processing devices has a user interface 26 that enables a user, concurrently with operation of the first portable processing device and in response to access authorization, to initiate execution of a clinical information application providing a user with a clinical application display image identifying multiple different patients in corresponding multiple different locations. Device 14 user interface 26 also enables a user to select a second patient of the multiple different patients in the clinical application display image and display an image sequence including a composite image comprising a first area showing live video of the second patient and a second area presenting vital sign parameters of the second patient.
In step 926, rules processor 19 applies rules to the vital sign parameters to identify an alert condition indicating a significant patient clinical condition or change of clinical condition, and the composite image includes an image element indicating the alert condition. The process of
The systems and processes of
This is a non-provisional application of provisional application Ser. No. 60/910,674 filed Apr. 9, 2007 and of provisional application Ser. No. 60/911,302 filed Apr. 12, 2007, by J. R. Zaleski.