This disclosure relates to health care and more particularly to an interactive visualization of healthcare information.
Electronic medical records (EMRs) are used in the healthcare industry to facilitate storage, retrieval and modification of health care information records. The change from paper records to EMR-based systems is being accelerated, at least in part in the U.S., due to the American Recovery and Reinvestment Act of 2009. The EMR is used to document aspects of patient care and billing for healthcare services, typically resulting in voluminous data being stored and accessed for each patient. The user interfaces for EMR systems tend to be quite rigid. For example, the user interfaces are often modeled on the paper charts that they were intended to replace. Additionally, use of such EMR systems can oftentimes be frustrating to healthcare providers due to the voluminous amounts of data stored in an EMR database.
As one example, a system for visualizing health information can include a repository interface to access health data objects for a given patient from an electronic health record (EHR) repository. Association data can be stored in memory separate from the EHR repository, the association data representing a link between selected health data objects for the given patient. A visualization engine can dynamically generate an interactive graphical map representing selected health data objects as graphical elements and representing links between the selected health data objects as graphical connections between related graphical elements based on the association data.
As another example, a non-transitory computer readable medium can store instructions for performing a method. The method can include accessing health data objects for a given patient from an electronic health record (EHR) system. The method can also include storing association data to represent a link between health data objects for the given patient, the association data being stored separately from the EHR system. The method can also include dynamically generating an interactive graphical map representing selected health data objects as graphical elements and representing links between the selected health data objects as graphical connections between related graphical elements based on the association data.
The diagnostic mapping system 12 includes a repository interface 14 that is programmed to access health care data objects for one or more patients from an electronic health record (EHR) repository 16. For example, the repository interface 14 can be programmed to pull (e.g., retrieve) data from the EHR repository 16 in response to instructions from a visualization engine 18. Additionally, the repository interface 14 can include methods and functions programmed to push data to the EHR repository 16, also in response to instructions from the visualization engine 18.
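As a minimal sketch of how such a repository interface might be structured, the following Python example assumes a hypothetical `ehr_client` with generic `query` and `store` methods; the actual interface, object schema, and transport would depend on the EHR system in use.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class HealthDataObject:
    """Generic container for a health data object pulled from an EHR repository."""
    object_id: str
    object_type: str          # e.g., "diagnosis", "lab", "order", "risk_factor"
    patient_id: str
    payload: Dict[str, Any] = field(default_factory=dict)


class RepositoryInterface:
    """Pulls and pushes health data objects on behalf of the visualization engine."""

    def __init__(self, ehr_client):
        # ehr_client is any object exposing query/store methods for the EHR repository
        # (an assumption for this sketch).
        self._ehr = ehr_client

    def pull(self, patient_id: str, object_types: List[str]) -> List[HealthDataObject]:
        """Retrieve the requested object types for a patient from the EHR repository."""
        records = self._ehr.query(patient_id=patient_id, types=object_types)
        return [HealthDataObject(r["id"], r["type"], patient_id, r) for r in records]

    def push(self, objects: List[HealthDataObject]) -> None:
        """Write objects (e.g., encounter data, notes) back to the EHR repository."""
        self._ehr.store([o.payload | {"id": o.object_id, "type": o.object_type}
                         for o in objects])
```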
The diagnostic mapping system 12 can be implemented in a variety of healthcare environments including hospitals, private practices, networks of hospitals or the like. Accordingly, the number of patients and the amount of data stored in the EHR repository 16 can vary depending upon the implementation of the system 10. It is to be understood and appreciated that in a given network or enterprise the EHR repository 16 can correspond to one or more different types of EHR systems that may be implemented in different locations or for different portions of the given network or enterprise. Accordingly, the interface 14 can be extensible and appropriately programmed to selectively push and pull data for each such EHR system that may be utilized.
The visualization engine 18 is programmed to generate an interactive graphical map representing selected health data objects from the EHR repository 16 as graphical elements. Additionally, the visualization engine 18 can represent associations between corresponding health data objects as corresponding graphical connections in the graphical map. The associations between the health data objects are stored as part of local storage 22, which can be implemented as a data structure or database that is separate from the EHR repository 16. For example, the local storage 22 can include association data 24 that is stored in memory separate from the EHR repository 16. The association data can be generated in response to a user input and/or based on information stored in the EHR repository. The association data 24 can thus include link data that represents the links between underlying health data objects (from the EHR repository 16) for each given patient.
The association data 24 can also include relevance data. The relevance data can be a quantitative value that corresponds to a relevance between two or more health data objects for each link. The relevance data can define a confidence value that the health data objects being linked are related. The value can be stored as an integer or floating point value that maps graphically to a distance parameter between the associated pair of graphical elements, for example. As other supporting evidence is collected and analyzed, the relevance data for each of the graphical elements can be dynamically updated accordingly. The distance parameter can be adjusted according to the display resolution capabilities of the output device where the map is being displayed. The distance parameter can correspond to an on-screen distance between graphical elements and/or affect other display parameters (e.g., brightness, thickness, color, size and the like) that can graphically demonstrate a confidence of the causal relationship between elements. While the examples shown herein are demonstrated as two-dimensional, it is appreciated that the concepts are equally applicable to three-dimensional interactive graphical maps and four-dimensional maps (e.g., the fourth dimension being time). For instance, the graphical elements and links can be arranged hierarchically in three dimensions according to their relative importance in driving a diagnosis for the given patient.
The relevance value can be computed and assigned automatically. Alternatively, the relevance data can be specified in response to a user input (e.g., a provider may set the relevance by adjusting a distance between graphical objects, including overriding a computed value) based on professional judgment. The visualization engine 18 can generate the graphical map 20, based on the relevance data, such as to graphically differentiate a relevance (e.g., importance) of selected health data objects and the links between each pair of associated data objects. The relevance data can be set in response to a user input (e.g., specifying the relevance explicitly and/or graphically), based on data stored in the EHR repository 16, or can be determined from other sources.
The visualization engine 18 can graphically represent the relevance of such objects and corresponding links in a variety of different ways. For example, the visualization engine 18 can define the graphical connections between the graphical health data elements based on the association data 24, such as by varying the type or form of graphical connection between the graphical health objects. As an example, the visualization engine 18 can generate the graphical connection with a relative graphically-represented parameter, which is stored as part of the association data. Relative in this context refers to how a given graphical connection is visualized in a graphical display when compared to how one or more other graphical connections in the same graphical display are concurrently visualized. For example, the relative graphically-represented parameter can include a relative length parameter, a relative thickness parameter, or a combination of different relative parameters that can graphically represent the determined relevance defined by the relevance data. As a further example, for a given diagnosis, graphical objects corresponding to different contributing factors to the diagnosis can also be graphically displayed with different relative parameters, such as relative sizes and/or distances apart from the diagnosis object, depending on each factor's contribution to the diagnosis.
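As an illustration of how the relevance data might drive such relative parameters, the following sketch (with assumed value ranges and an assumed [0, 1] relevance scale) maps a relevance value to an on-screen connection length and thickness:

```python
from dataclasses import dataclass


@dataclass
class ConnectionStyle:
    length_px: float     # on-screen distance between linked elements
    thickness_px: float  # line weight of the graphical connection


def style_for_relevance(relevance: float,
                        min_len: float = 40.0, max_len: float = 240.0,
                        min_thick: float = 1.0, max_thick: float = 6.0) -> ConnectionStyle:
    """Map a relevance value in [0, 1] to a connection style.

    Higher relevance draws the linked elements closer together and renders a
    thicker connection; the ranges here are illustrative defaults only.
    """
    r = max(0.0, min(1.0, relevance))
    return ConnectionStyle(
        length_px=max_len - r * (max_len - min_len),
        thickness_px=min_thick + r * (max_thick - min_thick),
    )


# Example: a strongly supported link versus a weakly supported one.
print(style_for_relevance(0.9))   # short, thick connection
print(style_for_relevance(0.2))   # long, thin connection
```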
Additionally, the visualization engine 18 can represent the graphical elements in the graphical map 20 based on object data and metadata that is stored in memory as part of the association data 24 in the memory storage 22. Graphical elements in the map 20 can include iconic-type or other predefined graphical representations for different types of patient health data objects. For example, health data objects such as diagnoses and supporting evidence such as lab data, orders, radiology information, and risk factors can be represented graphically as different icons that include graphical and/or textual information. For example, the object data can be stored as part of the local storage 22 for each of the health data objects, which are retrieved from the EHR repository or created within the diagnostic mapping system 12.
Metadata (e.g., data that describes the data objects and data associations) can also be stored as part of the association data 24. Such metadata can thus be utilized to provide additional information about a given element or graphical connection (e.g., corresponding to a link between data objects). For instance, corresponding metadata can be employed to present information in a textual and/or graphical manner in response to hovering a pointing element over or otherwise selecting a given graphical element or connection. The additional information presented based on the metadata associated with such selected element can be graphically presented in a superimposed relationship or adjacent to the selected element, such as a pop-up window or other form of representation.
The visualization engine 18 also includes user controls 26 to provide for user interaction with graphical elements and links that are presented as part of the graphical map 20. The user controls 26 allow an authorized user to create new health data objects, such as corresponding to a diagnosis or problem (e.g., from a problem list stored in the EHR repository 16) or supporting evidence, as well as links between such evidence and the diagnosis. The user controls 26 can be programmed to modify the interactive graphical map 20 in response to user inputs, such as can be made via a pointing element that is controlled by a user input device (e.g., a mouse, touch-screen, or other human machine interface). Thus, the user controls 26 are programmed to provide for user interaction and manipulation of the interactive graphical map 20 and its various components that are presented as part of the graphical map.
As disclosed herein, each of the graphical elements and graphical connections between corresponding elements correspond to health data objects and associations (e.g., relationships or links) between such objects. Thus, as a user manipulates the graphical objects and links or creates or deletes graphical elements and links from the map 20, each action can be stored as encounter data 28 indicating a corresponding effect on the underlying health data objects and one or more relationships to other health data objects. In this way, each instance of manipulation or adjustment, creation or deletion of a graphical element or graphical connection between elements can be stored as part of the encounter data (e.g., log data) corresponding to the underlying health data objects and relationships represented thereby. The log data thus can be used to provide a detailed record of the decision-making process. In this way, not only does the diagnostic mapping system 12 provide a visualization of a current (or historical) diagnosis and contributing factors (e.g., represented by the state of the graphical elements and connections), but it also can store the encounter data to record each intermediate step (corresponding to additions, subtractions or changes) that occurred to arrive at such diagnosis.
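A minimal sketch of how such encounter (log) data could be captured is shown below; the event fields and action names are illustrative assumptions rather than a prescribed schema:

```python
import json
import time
from dataclasses import dataclass, asdict
from typing import List, Optional


@dataclass
class EncounterEvent:
    """One logged manipulation of the graphical map (add, remove, modify, link, unlink)."""
    timestamp: float
    user_id: str
    action: str                     # e.g., "create_element", "create_link", "delete_link"
    object_ids: List[str]           # underlying health data objects affected
    detail: Optional[str] = None    # free-form description, e.g., a new relevance value


class EncounterLog:
    """Accumulates encounter data so the decision-making process can be replayed."""

    def __init__(self) -> None:
        self._events: List[EncounterEvent] = []

    def record(self, user_id: str, action: str, object_ids: List[str],
               detail: Optional[str] = None) -> None:
        self._events.append(EncounterEvent(time.time(), user_id, action, object_ids, detail))

    def to_json(self) -> str:
        """Serialize the log, e.g., for pushing back to the EHR repository."""
        return json.dumps([asdict(e) for e in self._events], indent=2)
```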
The visualization engine 18 can also include a document generator 30 that is programmed to generate the encounter data 28 by capturing a process of clinical decision-making in response to each addition, subtraction or modification to the graphical elements and graphical connections in the graphical map 20. It is to be understood that some graphical elements displayed in the graphical map may not be associated or linked with other elements and that the document generator 30 can also store encounter data reflecting modifications or other graphical actions that are performed on such unassociated graphical elements.
By way of further example, the document generator 30 can include a coder 31 that is programmed to generate clinical codes and/or billing codes in response to each user graphical interaction with the system 12. For instance, when a diagnosis or supporting evidence is dragged onto another graphical element, the corresponding diagnosis engine 34 can execute a set of rules to acquire necessary information and details that may be required to comply with clinical and billing coding regulations or standards. The coder 31 can be implemented to be self-learning or to infer codes for each diagnosis and user interaction via the user controls 26. The coder 31 thus can generate corresponding codes and store such codes in the local storage 22 in response to interactions entered by a user.
For example, the document generator 30 can store the encounter data using a variety of standard codes. Thus, the coder can be programmed to generate corresponding codes according to the coding systems utilized by the healthcare enterprise using the system 10, such as diagnostic codes (e.g., ICD-10, ICD-9, ICPC-2 and the like), procedure codes (e.g., HCPCS, CPT, ICD-10-PCS, ICD-9-CM and the like), pharmaceutical codes (e.g., ATC, NDC, DIN and the like), topographical codes, outcome codes (NOC) or other relevant codes, including known or yet to be developed coding systems. In this way, the rules can be programmed and executed by the document generator 30 to ensure that the most detailed code(s) for diagnosis and billing purposes can be generated.
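The following sketch illustrates one way a coder such as coder 31 might resolve validated diagnosis labels to diagnostic codes; the code table shown is a small illustrative sample, and a deployed coder would use the enterprise's adopted coding systems and rules:

```python
from typing import Dict, List, Optional

# Illustrative mapping from diagnosis concepts to diagnostic codes; a real coder
# would consult the coding system(s) adopted by the healthcare enterprise.
DIAGNOSIS_CODE_TABLE: Dict[str, str] = {
    "pneumonia": "ICD-10 J18.9",
    "acute kidney failure": "ICD-10 N17.9",
    "heart failure": "ICD-10 I50.9",
}


def code_for_diagnosis(diagnosis_label: str,
                       overrides: Optional[Dict[str, str]] = None) -> Optional[str]:
    """Return the most specific code available for a validated diagnosis label.

    Site-specific overrides (e.g., learned or user-confirmed codes) take priority
    over the default table; None signals that the coder should elicit further
    details from the user before a compliant code can be generated.
    """
    table = {**DIAGNOSIS_CODE_TABLE, **(overrides or {})}
    return table.get(diagnosis_label.strip().lower())


def codes_for_encounter(validated_diagnoses: List[str]) -> Dict[str, Optional[str]]:
    """Generate codes for every validated diagnosis in the encounter."""
    return {d: code_for_diagnosis(d) for d in validated_diagnoses}
```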
In addition to generating codes, the document generator 30 can also construct other supporting evidence (e.g., severity information or the like) over a broad clinical spectrum that can be stored as part of the encounter data 28. The document generator 30, for example, can add such information to a patient encounter in response to user interactions with the graphical map (e.g., making and/or breaking links). Alternatively or additionally, the document generator 30 can be programmed to elicit such information from the user via corresponding user input GUI elements (e.g., presenting a text user-entry form or the like). Such a user input GUI element can be partially (or wholly) populated with information based on the graphical map (according to health data objects and the association data being represented), which pre-populated information can require validation by the user. Once such encounter data has been generated, including codes and related supporting evidence, the system 12 can employ the repository interface 14 to push the data to be stored in the EHR repository 16 such as for billing and/or clinical purposes.
The document generator 30 can also be utilized to create notes or other freeform entry of information (e.g., text, audio, or audio-video) that a user may enter into the system 10 via the corresponding user controls 26. Such notes or other information can be stored as part of the encounter data 28. The visualization engine 18 can send the encounter data 28 to the EHR repository via the repository interface 14 (e.g., via HL7 or other application layer protocol) to push back log data and notes data that may be stored as corresponding health data objects for a given patient encounter.
The document generator 30 can also be programmed to assemble or generate a user perceptible type of document (e.g., a report) based on the encounter data 28 that can be stored in the local storage 22. For example, the encounter data can be stored in a known format (e.g., XML), which the document generator 30 can utilize to create a corresponding user perceptible document (e.g., a PDF, a Microsoft Word document or the like). Such user perceptible document can be created based on metadata for links between the health data objects, corresponding to the graphical connections in the graphical map 20.
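As a rough illustration, the sketch below renders encounter data stored in a simple XML form into a plain-text report; the XML layout and field names are assumptions, not a defined schema:

```python
import xml.etree.ElementTree as ET

# An assumed minimal XML layout for encounter data; the actual schema would be
# defined by the diagnostic mapping system and its EHR integration.
SAMPLE_ENCOUNTER_XML = """
<encounter patient="12345">
  <event time="2012-04-20T10:02:00" action="create_link"
         source="Lab 3" target="Diagnosis 1" relevance="0.8"/>
  <event time="2012-04-20T10:05:30" action="validate_element" target="Diagnosis 1"/>
</encounter>
"""


def encounter_report(xml_text: str) -> str:
    """Render encounter data as a plain-text report of the decision-making steps."""
    root = ET.fromstring(xml_text)
    lines = [f"Encounter report for patient {root.get('patient')}"]
    for event in root.findall("event"):
        parts = [event.get("time"), event.get("action"), event.get("target")]
        if event.get("source"):
            parts.insert(2, f"{event.get('source')} ->")
        if event.get("relevance"):
            parts.append(f"(relevance {event.get('relevance')})")
        lines.append("  " + " ".join(p for p in parts if p))
    return "\n".join(lines)


print(encounter_report(SAMPLE_ENCOUNTER_XML))
```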
By way of further example, the document generator 30 can generate the document to include user perceptible representation of the health data objects for diagnostic concepts that are represented by the graphical elements in the graphical map 20. The document can also include health data objects for lab data as well as health data objects for interventions, which can be represented as graphical elements in the interactive graphical map 20. In this way, the document generator 30 can provide the encounter data in one or more forms, which may depend upon the EHR system and user requirements. The form may also be selected by the user via corresponding user controls 26. The corresponding user perceptible document can thus provide additional supporting proof of patient management and/or review of patient medical data that is recorded and logged as part of the encounter data in response to and corresponding to the user interactions with the graphical map 20.
The visualization engine 18 also includes a display control 32 that controls the graphical appearance of the graphical elements and graphical connections in the graphical map 20 based on the association data 24 and user data 40. The display control 32 can also control animation of elements and connections in the graphical map 20.
As an example, the display control 32 can operate in an animation mode to animate the graphical map for a given patient over a period of time based upon the health object data obtained from the EHR repository (corresponding to the graphical elements) and based on the association data 24 as a function of temporal data that is stored with the association data 24 and the health data objects. In this way, temporal changes in the interactive graphical map over one or more patient encounters can be visualized graphically to represent the medical decision-making process over one or more selected periods of time for a given patient. For example, by entering such animation mode, the graphical map 20 can graphically re-create the decision-making process for a given patient, such as based on the encounter data mentioned above. The animation and playback of the decision-making process can help a reviewer (e.g., the user or a supervisor or team) better understand the underlying thought process and decisions made by the caregiver. The amount of time or number of patient encounters for which the animation is displayed for the graphical map 20 can be selected by a given user in response to a user input.
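A simplified sketch of such replay is shown below: logged events within a selected time window are applied in order to rebuild successive map states; the event tuple layout and action names are assumptions:

```python
from typing import Dict, List, Set, Tuple

# Each logged event: (timestamp, action, element_or_link_id); action names are assumed.
Event = Tuple[float, str, str]


def replay(events: List[Event], start: float, end: float) -> List[Dict[str, Set[str]]]:
    """Rebuild successive map states for events falling in the selected time window."""
    state: Dict[str, Set[str]] = {"elements": set(), "links": set()}
    frames: List[Dict[str, Set[str]]] = []
    for ts, action, target in sorted(events):
        if not (start <= ts <= end):
            continue
        if action == "add_element":
            state["elements"].add(target)
        elif action == "remove_element":
            state["elements"].discard(target)
        elif action == "add_link":
            state["links"].add(target)
        elif action == "remove_link":
            state["links"].discard(target)
        # Snapshot after each event so the decision process can be stepped through.
        frames.append({k: set(v) for k, v in state.items()})
    return frames
```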
The visualization engine 18 can also include a diagnosis engine 34 to determine a diagnosis based on health objects retrieved from the repository 16 and local storage 22. For example, the diagnosis engine 34 can be programmed to generate the map or a portion thereof such that graphical connections between selected graphical elements in the interactive graphical map represent association data in relation to one or more diagnoses relating to the health data objects.
The diagnosis engine can employ a rules engine that is programmed to evaluate the health data objects for a given patient by applying a set of predetermined rules. The rules can be programmed based on a best-practices approach or other criteria that may vary for a given application of the system 12. The diagnosis engine 34 can also graphically suggest an association as a suggested graphical connection between graphical elements corresponding to a given diagnostic relationship between a potentially related set of health data objects based on application of the rules to the health data objects represented by the graphical elements for a given patient encounter. A potentially related set of health data objects can comprise two or more health data objects for diagnostic concepts, health data objects for lab data, health data objects for interventions or other supporting evidence that may be entered into the system via the user controls 26 or obtained from the EHR repository 16 or another source (e.g., medical devices, monitoring equipment or the like) for a given patient. The diagnosis engine 34 can represent the relationship between two or more such potentially related health data objects as a graphical connection between such respective graphical elements for objects according to metadata that is stored as part of the association data 24.
A suggested graphical connection or suggested diagnosis can be implemented in a variety of forms, such as, for example, blinking, animation, dotted lines, different color graphics or other methods to differentiate the suggested link from an actual association that has been validated by a user. The suggested graphical link can remain differentiated from other graphical connections until validated or invalidated by a user. For example, a user can validate a suggested link or diagnosis by clicking on it or otherwise marking it via the user controls 26. Each interaction via the user controls 26, including for validating and invalidating new graphical elements or links between elements, can be recorded and stored as medical decision making information as part of the encounter data 28 as disclosed herein. In this way, such interactions by the user with the graphical map 20 can create a log of patient management and review of clinical data for a given patient that can be stored as the encounter data 28. As disclosed herein, the encounter data or a selected portion thereof can be pushed to the EHR repository 16 via the repository interface 14.
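The sketch below illustrates one simple form such rule-driven suggestion could take, proposing a diagnosis link only when all of a rule's required evidence types are present for the encounter; the rule contents and data shapes are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Dict, List, Set


@dataclass
class SuggestedLink:
    diagnosis: str
    evidence_ids: List[str]
    validated: bool = False   # rendered in a differentiated form until a user validates it


# Assumed rule form: a diagnosis is suggested when all required evidence types are present.
RULES: Dict[str, Set[str]] = {
    "acute blood loss anemia": {"recent surgery", "hematocrit trending down"},
}


def suggest_links(evidence_by_type: Dict[str, str]) -> List[SuggestedLink]:
    """Apply the rule set to the evidence present for a patient encounter.

    evidence_by_type maps an evidence type (e.g., "recent surgery") to the id of
    the health data object that supplies it.
    """
    suggestions: List[SuggestedLink] = []
    for diagnosis, required in RULES.items():
        if required.issubset(evidence_by_type):
            ids = [evidence_by_type[t] for t in sorted(required)]
            suggestions.append(SuggestedLink(diagnosis, ids))
    return suggestions
```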
By way of further example, the visualization engine 18 can also include a health element generator 36 and a link generator 38. The health element generator 36 can be programmed to generate a new graphical element for a corresponding health data object for a given patient. The health element generator 36 can be programmed to generate the health data element as a potential element in response to an evaluation of supporting evidence and health data objects by the diagnosis engine 34. User controls 26 can be employed to validate a corresponding new suggested health data object represented by the graphical element on the map 20. The corresponding health data object for such graphical element can be provided to the EHR repository 16 via the repository interface 14.
The health element generator 36 can be programmed to automatically generate such health data elements based on the analysis of health data objects from the EHR repository 16 and association data 24 for the given patient. The generation of health data elements and links can be constrained to a current patient encounter or it may also encompass historical data for the patient. Such automatically generated health data elements can be graphically differentiated until validated or invalidated in response to a user input.
Alternatively or additionally, the health element generator 36 can be programmed to generate new graphical elements and corresponding health data objects in response to a user input via the user controls 26. Once such new elements are generated in response to user controls they can be automatically presumed to be validated (having been manually—not automatically—generated). The manually generated health elements can thus result in a corresponding health data object being created and stored in the encounter data 28 as well as being pushed to the EHR repository 16 for the patient encounter.
Similarly, the link generator 38 can be utilized to automatically create and/or suggest links between graphical health elements in the map to indicate an association or causal connection between corresponding health data objects and supporting evidence. For example, the diagnosis engine 34 can evaluate a set of health data objects and supporting evidence for a given patient and, based upon such analysis, determine if potentially relevant associations exist. The link generator 38 thus can present the suggested link to the user as a graphical connection between graphical health elements in a graphically differentiated form until validated or invalidated by the user via the user controls 26. A user can also manually generate a link between health elements or destroy a link between health elements via the user controls 26. Each interaction of generating or invalidating links between health elements can be recorded as part of the encounter data 28, which may also be pushed into the EHR repository 16, as disclosed herein.
The visualization system 10 can also employ user data 40 stored in the local storage 22. The user data 40 can store information relating to each authorized user of the system. For example, the user data 40 can include role data and preference data for each user. The role data can be stored in memory for each of the users and be utilized to vary or control the content and organization of the interactive graphical map 20 for a user based upon the role data. For example, each user can be assigned a given role, such as a physician, nurse, patient, or other technical professional and, depending upon the role, different types of information may be presented in a graphical map. In addition to different types of information, information may be presented in different ways depending upon the sophistication or technical expertise of the user defined by the role data. For example, more technical information may be provided for a physician than for a patient, who can also be a user. Additionally, different users in a given category may have information presented differently depending on each user's role data, such as identifying a particular interest or area of specialization. For example, a pulmonologist can have the graphical map 20 appear differently (with the same or different information) from the graphical map generated for the same patient where the user's role is defined as a cardiologist. The visualization engine 18 can employ the display control 32 to flex or morph the graphical map 20 based on the role data for each respective user. Additionally, a greater level of authorization and access to different types of information can be provided based on the role data.
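As an illustration of how role data might flex the map content, the following sketch filters and orders graphical elements according to an assumed role profile table:

```python
from typing import Dict, List

# Illustrative role profiles: which object types a role sees and in what order.
ROLE_PROFILES: Dict[str, List[str]] = {
    "physician": ["diagnosis", "lab", "radiology", "order", "risk_factor"],
    "nurse":     ["order", "diagnosis", "lab"],
    "patient":   ["diagnosis"],          # simplified, less technical view
}


def elements_for_role(role: str, elements: List[dict]) -> List[dict]:
    """Return only the element types permitted for the role, in the role's preferred order."""
    allowed = ROLE_PROFILES.get(role, ROLE_PROFILES["patient"])
    visible = [e for e in elements if e["type"] in allowed]
    return sorted(visible, key=lambda e: allowed.index(e["type"]))


# Example: the same map data is flexed differently for a physician and a patient.
elements = [{"type": "lab", "label": "BNP"}, {"type": "diagnosis", "label": "Heart failure"}]
print(elements_for_role("physician", elements))
print(elements_for_role("patient", elements))
```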
Preference data, which can also be stored in memory, can be utilized to set individual user preferences for the arrangement and structure of information that the visualization engine 18 presents in the interactive graphical map 20. For example, preference data can be set automatically by the diagnostic mapping system 12 based upon a given user's prior usage, which is stored as part of the preference data. The display control 32 thus can select and control the graphical representation of health data objects for use in generating the interactive graphical map and arrange such graphical elements in the map for a given instance according to the user preference data of a given user that is currently logged into the system. The system 12 can learn preferences and how to arrange objects based upon repeated changes made by a given user. For example, the system 10 can infer or employ machine learning from log data that can be stored in memory in response to user interactions.
The user data 40 can also be utilized to establish access to the diagnostic mapping system 12 via a plurality of different types of devices, each of which may be presented the data differently, such as depending upon the display capabilities of such device. Each device can still employ the user controls 26 to generate new graphical elements, modify existing elements or to generate links or modify existing links in the graphical map 20. The manner in which such controls are implemented and accessed by a user can vary depending upon the device.
Additionally, the rules engine 42 can generate new rules which can be globally implemented within the system 12 or be user defined (e.g., part of the user data) to provide more flexibility to each user. For example, the rules engine 42 can learn and apply a unique set of rules for each user based on previous system usage data that can be stored in the user data 40.
The diagnosis engine 34 can also include an object relevance calculator 46 that can compute a confidence value indicative of how related the health data objects are. The object relevance calculator 46 can compute the relevance and provide the confidence value based upon the association data or metadata that is provided with the respective health data objects. The relevance between health data objects thus can be stored as relevance data as part of the association data 24.
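One possible (assumed) formulation of such a relevance calculation combines term overlap between objects with their temporal proximity, as sketched below; the weighting and time window are illustrative choices:

```python
from datetime import datetime
from typing import Set


def relevance(evidence_terms: Set[str], diagnosis_terms: Set[str],
              evidence_time: datetime, diagnosis_time: datetime,
              window_hours: float = 72.0) -> float:
    """Combine term overlap and temporal proximity into a confidence value in [0, 1].

    The 50/50 weighting and the 72-hour window are illustrative choices; the
    relevance calculator could equally be driven by rules or a learned model.
    """
    overlap = len(evidence_terms & diagnosis_terms) / max(len(diagnosis_terms), 1)
    hours_apart = abs((evidence_time - diagnosis_time).total_seconds()) / 3600.0
    recency = max(0.0, 1.0 - hours_apart / window_hours)
    return 0.5 * overlap + 0.5 * recency


score = relevance({"creatinine", "renal"}, {"renal", "failure"},
                  datetime(2012, 4, 20, 8), datetime(2012, 4, 20, 14))
print(round(score, 2))   # about 0.71 with these illustrative inputs
```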
In addition to the rules engine 42 being applied to new health data objects, the rules engine 42 can be programmed to analyze health data objects and links between health data objects in the graphical map 20 in response to user manipulation or modification thereof. That is, the rules engine 42 can reapply relevant rules 44 to evaluate an existing set of elements in the graphical map following changes in links and other metadata that may be effected in response to the user manipulation via the user controls. This can be done to suggest additional links or perhaps suggest additional health data objects that may be determined to be pertinent based upon the aggregate set of health data objects represented by elements in the graphical map 20.
The diagnosis engine 34 can also employ rules to obtain additional information related to a given diagnosis. Examples of cascading logic that can be utilized as rules for generating diagnoses are shown in Appendix A.
Additionally, the rules engine 42 can learn new associations between graphical elements and store such as new diagnosis rules in the rule set 44, such as in response to user validation or creation of a diagnosis data element and its association with supporting evidence data elements on the interactive graphical map.
The diagnosis engine 34 can also include a prediction function 48 that can be programmed (e.g., with a predictive model) to predict a likelihood of a patient's outcome, such as a diagnosis, length of stay, readmission risk, patient satisfaction or other outcomes for a patient or group of patients. In addition to predicting patient outcomes, the prediction function 48 can be utilized to generate a prediction for administrative conditions. Administrative conditions can include quantifiable information about various parts of a facility or institution, such as admissions, capacity, scheduled surgery, number of open beds or other conditions that may be monitored by administrative personnel or executive staff. The type of prediction algorithms and models that can be utilized can vary according to the type of condition or outcome being predicted and the type of information to be presented by the diagnostic mapping system 12. One example of a prediction model that can be utilized is disclosed in U.S. patent application Ser. No. 13/451,984, filed Apr. 20, 2012, and entitled PREDICTIVE MODELING, which is incorporated herein by reference.
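As a rough sketch of a prediction function, the example below estimates a readmission risk with a logistic function over a few hand-picked features; the features and weights are assumptions for illustration and do not reflect the referenced predictive model:

```python
import math
from typing import Dict

# Illustrative coefficients for a readmission-risk estimate; a deployed prediction
# function would use a validated model rather than these assumed weights.
WEIGHTS: Dict[str, float] = {
    "prior_admissions": 0.6,
    "num_active_diagnoses": 0.3,
    "length_of_stay_days": 0.1,
}
BIAS = -3.0


def readmission_risk(features: Dict[str, float]) -> float:
    """Return an estimated readmission probability in [0, 1] for a patient."""
    z = BIAS + sum(WEIGHTS[name] * features.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))


print(round(readmission_risk(
    {"prior_admissions": 2, "num_active_diagnoses": 3, "length_of_stay_days": 5}), 2))
# Prints approximately 0.4 with these assumed inputs.
```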
In view of the foregoing, the following examples demonstrate interactive graphical maps that can be generated by the diagnostic mapping system.
Diagnosis 1 is demonstrated as being supported by supporting evidence (e.g., health data objects) represented in the map 100 by graphical elements 110, 112 and 114. For instance, element 110 corresponds to lab results (Lab 3) and is associated with Diagnosis 1 via graphical connection 116. Supporting evidence graphical element 112 is demonstrated as Real Time Data 1 (RT Data 1) and is associated with Diagnosis 1 via graphical connection 118. The OTHER EVIDENCE graphical element 114 is also associated with Diagnosis 1 (graphical element 102) via the graphical connection 120. As disclosed herein, each of the graphical elements 110, 112 and 114 can correspond to health data objects, such as can be stored in an EHR repository and/or in local storage.
Similarly, Diagnosis 2 is demonstrated in this example as being supported by its own set of supporting evidence via corresponding graphical connections.
The graphical map 100 also includes a diagnosis engine user interface element 140. The diagnosis engine element 140 can be utilized by a user to create new links or diagnoses, such as in response to activating a NEW user interface element 142. For instance, a user can employ a pointing element 144 to activate the user interface element 142. Further modifications can be made to the interactive graphical map via the pointing element or via other means, which may vary depending upon the type of user device. For instance, elements can be accessed and manipulated via touch screen, keyboard or other user input devices. A set of predicted results can also be generated and displayed in the interactive graphical map, as demonstrated at 148, based on applying corresponding predictive models to the health data objects represented in the graphical map 100.
As disclosed herein, in addition to adding elements to the map 100, a user can also remove graphical elements and connections via corresponding user controls. For example, a link or element can be deleted by dragging it into a trash user interface element 150. Those skilled in the art will understand and appreciate that other mechanisms can be utilized for deleting, such as highlighting and clicking on the object to bring up a corresponding drop-down menu, or highlighting an object by clicking on it and then deleting it via a delete key on the keyboard.
By way of example, suggested graphical elements and suggested links can also be demonstrated in the interactive graphical map 100.
As disclosed herein, a user can validate or invalidate each suggested piece of supporting evidence (graphical elements) and each association (connections) presented in the interactive graphical map 100, such as via user controls that can be accessed via the pointing element 144 or other user input mechanisms.
Another example demonstrates an interactive graphical map that can be generated for a given patient. In this example, an acute chronic renal failure diagnosis is represented by a corresponding graphical element along with its supporting evidence.
Similarly, the acute chronic systolic heart failure diagnosis (represented by graphical element 204) is supported by evidentiary health data, which are represented in this example as including a creatinine graphical element 218, a B-type Natriuretic Peptide (BNP) graphical element 220, an ejection fraction graphical element 222 and a congestive heart failure graphical element 224. Also associated with the diagnosis element 204 is an allergy interface element 226 demonstrating an allergy to a given medication, in this example Lisinopril.
A diagnosis can also provide supporting evidence for or otherwise be associated with another diagnosis. In this example, acute chronic renal failure supports or is supported by the acute chronic systolic heart failure diagnosis, which association is demonstrated by a corresponding graphical connection. The acute blood loss anemia diagnosis (represented by graphical element 208) is also associated with the element 204. Supporting evidence for the acute blood loss anemia diagnosis is provided via a recent surgery graphical element 228 and lab results, corresponding to a hematocrit with a graphical indication that it is trending downward, via graphical element 230.
As another example, an interactive graphical map 300 can be generated to represent patient and operational information for a healthcare facility, including a facility map 302 in which individual patients are represented by respective patient icons 306.
Some common issues can be presented in a summary manner. For example, information describing a number of cases of pneumonia, as well as alerts such as a strep infection, can be represented in the map. Each of these indications can be drilled down to obtain more detailed information, such as by clicking on or otherwise activating the corresponding user interface element. Similarly, detailed information about each of the patients identified by the respective icons in the facility map 302 can be accessed by activating the respective patient icons 306.
The interactive graphical map 300 can also include a forecasting user interface element 310. The forecasting user interface element can employ one or more prediction functions, such as to forecast or predict conditions associated with the facility. A graphical slider or other like interface element 312 can be provided to selectively adjust the time period for which each prediction is computed, demonstrated in this example as ranging between twelve and thirty-six hours. Other types of ranges and timeframes can also be utilized for forecasting.
Additional facility indicators can also be provided at 314, 316 and 318. For example, indicator 314 can provide a graphical user interface element providing information about patient census information (CEN) and an indication of which way such parameter is trending. Element 316 can provide information about admissions (ADM) over the forecast period as well as indicate a current trend in such parameter. The user interface element 318 can provide information about open beds (OPN) in the facility as well as indicate current trends associated with the number of open beds. A PATIENT LIST user interface element 320 can also be provided to represent specific information about the patients in the facility, which further may be drilled down upon as shown and described herein.
Information in the administration screen 350 can be based upon historical data as well as scheduling information that can be stored in an associated scheduling system accessible by the systems and methods shown and described herein. In addition to displaying plots of selected information, a timeline or caliper can be provided that can be moved across a given dashboard element to provide information for such selected time period. Additionally, predicted information can also be displayed for each of the dashboard elements. A user can also modify what information is presented in the screen 350 via corresponding selection user interface elements.
The system can also provide a patient status GUI screen 420. The patient status GUI 420 can provide current information and/or historic information for the user. The patient status GUI 420 can be displayed (e.g., graphically and/or via text) in relation to appropriate icons or other graphical indicators representing a selected set of parameters being monitored for the respective patient.
The architecture 500 also employs one or more user devices 522, each of which may include a user interface 524. The user interface 524 can be programmed for accessing the system 502 and implementing the functions and methods shown and described herein. For example, in response to a user input provided via the user interface 524, the visualization engine 506 can employ the repository interface 508 to access data from an EHR system 526 in which EHR data 528 is stored. The visualization engine 506 thus can employ the repository interface 508 to retrieve health data objects and other information from the EHR system 526 as well as from one or more other data sources 530 for generating an interactive graphical map 510, such as shown and described herein.
There can be different groups of health data objects stored in the EHR 526 that can be utilized by the visualization engine. For instance, the health data objects can include problem data objects representing problems that form a problem list for each given patient. There can also be intervention data objects representing interventions initiated by a user for the given patient. As another example, clinical data objects representing clinical data acquired for the given patient can also be stored in the EHR system 526.
For example, the visualization engine 506 can receive one or more lab values, one or more orders, radiology information and risk factors as inputs for a given patient. Based upon corresponding rules (e.g., cascading logic such as the examples in Appendix A), one or more corresponding diagnoses supported by such inputs can be suggested in the interactive graphical map 510.
The system 502 can also employ one or more device interfaces 514 for monitoring one or more monitoring devices 532. Monitoring devices 532 can monitor any health related condition in real time to provide real time patient data indicative of a biological parameter of a patient, such as disclosed herein. The parameter can correspond to supporting evidence that can be programmatically associated with one or more diagnoses.
The system 502 can also communicate (e.g., retrieve and send) information relative to one or more other services 533. Such other services, for example, can include billing systems, insurance systems (internal to the organization or third party insurers), Personal Health Records, scheduling systems, admission discharge transfer (ADT) systems, prediction services, patient health portals or the like. In this way, the system can leverage information from a variety of resources and present users with current information that can be relevant to each patient or to groups of patients. The current information (as well as historical data) can be utilized to populate the interactive map with supporting evidence for one or more diagnoses that can be computed by the visualization engine 506 or manually created in response to a user input, as disclosed herein.
The system 500 further may employ a messaging system 534 for sending messages and alerts to one or more predetermined individuals that can be programmed into the system 502. The type of messaging may include, for example, email, alphanumeric paging, telephone, PA announcement or any combination of these or other message types. For example, if an actual or predicted condition is outside of an expected parameter, the system 502 can trigger an alert to instruct the messaging system 534 to issue one or more messages to appropriate personnel (e.g., caregivers) so that appropriate action can be taken.
In still other examples, the system 500 may operate in an investigational or study mode in which health objects may be retrieved from the EHR and utilized for purposes of study or evaluation. However, in such mode, data is not sent back to the EHR system 526 for a given patient. Instead, the user can manipulate data elements and connections, add new interventions, clinical data and problems and allow the system to graphically demonstrate how the health data objects are related and how changes or new data might affect diagnoses.
As will be appreciated by those skilled in the art, portions of the invention may be embodied as a method, data processing system, or computer program product. Accordingly, these portions of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Furthermore, portions of the invention may be a computer program product on a computer-usable storage medium having computer readable program code on the medium. Any suitable computer-readable medium may be utilized including, but not limited to, static and dynamic storage devices, hard disks, optical storage devices, and magnetic storage devices.
Certain embodiments of the invention are described herein with reference to flowchart illustrations of methods, systems, and computer program products. It will be understood that blocks of the illustrations, and combinations of blocks in the illustrations, can be implemented by computer-executable instructions. These computer-executable instructions may be provided to one or more processors of a general purpose computer, special purpose computer, or other programmable data processing apparatus (or a combination of devices and circuits) to produce a machine, such that the instructions, which execute via the processor, implement the functions specified in the block or blocks.
These computer-executable instructions may also be stored in computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory result in an article of manufacture including instructions which implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
In this regard and in view of the foregoing structural and functional features described above, an example method will be better appreciated with reference to the following description.
At 608, a determination is made whether the interactive map is modified. The modifications to the map can be made automatically, in response to additional health data for the given patient, such as can be obtained from the EHR system, other services or devices. Additionally or alternatively, the modifications can be made in response to a user input. The modifications can correspond to changes in properties of currently displayed elements, validating or invalidating suggested links or elements as disclosed herein. If no changes are made, the method can return to 602. If changes are made, the method proceeds to 610 in which encounter data corresponding to such changes can be stored. The encounter data thus can provide a record of medical decision making, as disclosed herein. The encounter data can also be sent to the EHR system. From 610, the method can return to 602 and continue accordingly.
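The sketch below ties these method steps together as a simple loop, using the hypothetical component names from the earlier sketches (repository interface, visualization engine, encounter log); the method names on `repo`, `engine`, and `log` are assumptions:

```python
def run_mapping_session(repo, engine, log, patient_id: str, user_id: str) -> None:
    """Loop: access health data objects, (re)generate the map, and record modifications.

    repo, engine, and log stand in for the repository interface, visualization
    engine, and encounter log; their method names here are assumptions.
    """
    while engine.session_active():
        objects = repo.pull(patient_id, ["diagnosis", "lab", "order", "risk_factor"])
        engine.render_map(objects)                       # generate/refresh the interactive map
        change = engine.next_modification(timeout=1.0)   # user input or automatic update
        if change is None:
            continue                                     # no change: keep displaying the map
        log.record(user_id, change.action, change.object_ids, change.detail)
        repo.push(log.as_health_data_objects())          # send encounter data back to the EHR
```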
What have been described above are examples. It is, of course, not possible to describe every conceivable combination of components or methodologies, but one of ordinary skill in the art will recognize that many further combinations and permutations are possible. Accordingly, the invention is intended to embrace all such alterations, modifications, and variations that fall within the scope of this application, including the appended claims. As used herein, the term “includes” means includes but not limited to, the term “including” means including but not limited to. The term “based on” means based at least in part on. Additionally, where the disclosure or claims recite “a,” “an,” “a first,” or “another” element, or the equivalent thereof, it should be interpreted to include one or more than one such element, neither requiring nor excluding two or more such elements.
This application claims the benefit of U.S. Provisional Patent Application No. 61/484,902, filed May 11, 2011, and entitled DIAGNOSTIC MAPPING, the entire contents of which is incorporated herein by reference.