Many medical devices in use today are unable to interface with an electronic medical record, or are very difficult or costly to interface. Hospitals are reluctant to spend the large amounts of money needed to buy replacement medical devices that are able to interface with an electronic medical record because the existing medical devices are adequate for all other purposes. Because the devices cannot interface with the electronic medical record, hospitals are forced to physically store a paper printout from the medical device, or take the labor-intensive step of scanning the printout into the electronic medical record.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The present invention is defined by the claims.
Embodiments of the present invention are directed to methods, computer systems, and computer storage media for use in interfacing a medical device with an electronic medical record. As mentioned above, some medical devices are unable to connect and interface with an electronic medical record. The present invention enables these devices to interface with the electronic medical record by using a camera to capture an image of an output of the medical device. The image is analyzed to generate a result which is then stored in the electronic medical record.
Embodiments are described in detail below with reference to the attached drawing figures.
The subject matter of the present invention is described with specificity herein to meet statutory requirements. However, the description itself is not intended to limit the scope of this patent. Rather, the inventors have contemplated that the claimed subject matter might also be embodied in other ways, to include different steps or combinations of steps similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the terms “step” and/or “block” might be used herein to connote different elements of methods employed, the terms should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly stated.
Embodiments of the present invention are directed to methods, computer systems, and computer storage media for use in interfacing a medical device with a digital record, such as an electronic medical record (EMR). Some medical devices are unable to connect and interface with an EMR. In brief and at a high level, the present invention enables these devices to interface with the EMR by using a camera to capture an image of an output of the medical device. The image is analyzed to generate a result which is then stored in the EMR.
As will be evident from the discussion of the figures below, aspects hereof provide an improvement in computer functionality and/or other technology, including improvements to the output capabilities of non-connected medical devices. As mentioned, certain medical devices are unable to connect and interface with an EMR. Such devices are referred to herein as “non-connected” or “non-networked” devices. For example, a non-connected device may be a legacy device that lacks the capability to connect to a communications network and/or other devices (e.g., the device may not have the ability to connect via Wi-Fi®, Bluetooth®, or another wired or wireless communication means). Additionally or alternatively, a non-connected device may have such capability, but may be unconnected for any number of reasons, including security concerns. For example, a medical device may have the capability to connect to a communications network and/or other devices, but for security reasons, the device may not actually be connected to a communications network or other devices. Additionally or alternatively, the device may be connected to a communications network and/or other devices, generally, but the device may not be permitted to send data to, receive data from, or otherwise communicate with a particular remote database that stores confidential information (e.g., the device may communicate with another device via a Bluetooth® or Wi-Fi® connection, but the device may not have authorization to communicate with a healthcare provider's system). Accordingly, for any number of reasons, a non-connected device is unable to connect and interface with a remote system and/or digital records included therein (such as EMRs), and the output generated by such devices cannot be provided directly to the remote system and/or records.
Conventional approaches to the technical problem described above are time, labor, and/or resource intensive. For example, some conventional approaches require that the output generated by a non-connected device be manually printed, scanned, and saved in a digital record for a patient. This is only possible if the device generates a printed output, and even then, the visual quality of the output may be compromised as a result of printing and scanning. Additionally or alternatively, conventional approaches may require that a result associated with the output be manually entered into a digital record (e.g., a vital signs result indicated by the device output must be typed into the patient's record). These conventional approaches are not only time and labor intensive, but they also introduce the possibility of errors created by human subjectivity (e.g., a human must exercise judgment to associate a scanned result with the appropriate patient or manually read and transcribe a result). Other conventional approaches require that the non-connected device be replaced by a networked device or reconfigured to be network compatible. This is expensive and wasteful, because the non-connected device may be fit for other purposes, such as monitoring, diagnostic, or treatment purposes.
In contrast to these conventional approaches, aspects hereof solve the aforementioned technical problem by improving the output capabilities of a non-connected device. This improvement is achieved through a non-conventional arrangement of various components, including a camera, a medical device interfacing system, the non-connected device, and/or a remote system in which patient data is stored. For example, the solutions described herein facilitate capturing the output of a non-connected device and providing that output to a digital record for a patient. The non-conventional systems and methods described herein address the shortcomings of conventional approaches in numerous ways. For example, the solutions described herein remove the need for human subjectivity and thus eliminate the potential for associated errors. The solutions described herein also expand the types of devices that may interface with a digital record. For example, conventional approaches require that the non-connected device be associated with a printer so that an output can be printed or scanned. By contrast, embodiments described herein provide for using a camera to capture an output displayed on a screen associated with a non-connected device, which means the output need not be printed. Additionally, the output displayed on the screen and captured by the camera may be dynamic and/or static in nature; such output may not be conducive to printing and may thus be incompatible with conventional approaches. The solutions described herein also eliminate the waste associated with replacing and/or reconfiguring existing non-connected devices.
Accordingly, in one embodiment, the present invention is directed toward one or more computer storage media having computer-executable instructions embodied thereon that, when executed, facilitate a method of interfacing a medical device with an EMR. An image of an output of the medical device is received from a camera associated with the medical device. The image is analyzed to generate a result which is stored in the EMR.
In another embodiment, the present invention is directed toward a system to interface a medical device with an EMR. The system comprises a camera to record an image of an output of the medical device and one or more computing devices having at least one processor. The one or more computing devices comprise an image collector component that receives the image from the camera, and a determining component that determines a region of the image to be analyzed. In addition, there is an analyzer component that analyzes the region and generates a result, and a storing component that stores the result in the EMR.
In yet another embodiment, the present invention is directed toward one or more computer storage media having computer-executable instructions embodied thereon that, when executed, facilitate a method of interfacing a medical device with an EMR. An image of an output of the medical device is received from a camera associated with the medical device. A region of the image to be analyzed is determined based upon an identity of the medical device. The region is analyzed, and a result for the region is generated. Validation of the result for the region is received to provide a validated result, and the validated result is stored in the EMR.
Having briefly described embodiments of the present invention, an exemplary computing environment suitable for use in implementing embodiments of the present invention is described below.
The present invention might be operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that might be suitable for use with the present invention include personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above-mentioned systems or devices, and the like.
The present invention might be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Exemplary program modules comprise routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types. The present invention might be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules might be located in association with local and/or remote computer storage media (e.g., memory storage devices).
With continued reference to
The control server 102 typically includes therein, or has access to, a variety of computer-readable media. Computer-readable media can be any available media that might be accessed by control server 102, and includes volatile and nonvolatile media, as well as removable and nonremovable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by control server 102. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
The control server 102 might operate in a computer network 106 using logical connections to one or more remote computers 108. Remote computers 108 might be located at a variety of locations in a medical or research environment, including clinical laboratories (e.g., molecular diagnostic laboratories), hospitals and other inpatient settings, veterinary environments, ambulatory settings, medical billing and financial offices, hospital administration settings, home healthcare environments, and clinicians' offices. Clinicians may comprise a treating physician or physicians; specialists such as surgeons, radiologists, cardiologists, and oncologists; emergency medical technicians; physicians' assistants; nurse practitioners; nurses; nurses' aides; pharmacists; dieticians; microbiologists; laboratory experts; laboratory technologists; genetic counselors; researchers; veterinarians; students; and the like. The remote computers 108 might also be physically located in nontraditional medical care environments so that the entire healthcare community might be capable of integration on the network. The remote computers 108 might be personal computers, servers, routers, network PCs, peer devices, other common network nodes, or the like and might comprise some or all of the elements described above in relation to the control server 102. The devices can be personal digital assistants or other like devices.
Computer networks 106 comprise local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. When utilized in a WAN networking environment, the control server 102 might comprise a modem or other means for establishing communications over the WAN, such as the Internet. In a networking environment, program modules or portions thereof might be stored in association with the control server 102, the data store 104, or any of the remote computers 108. For example, various application programs may reside on the memory associated with any one or more of the remote computers 108. It will be appreciated by those of ordinary skill in the art that the network connections shown are exemplary and other means of establishing a communications link between the computers (e.g., control server 102 and remote computers 108) might be utilized.
In operation, an organization might enter commands and information into the control server 102 or convey the commands and information to the control server 102 via one or more of the remote computers 108 through input devices, such as a keyboard, a pointing device (commonly referred to as a mouse), a trackball, or a touch pad. Other input devices comprise microphones, satellite dishes, scanners, or the like. Commands and information might also be sent directly from a remote healthcare device to the control server 102. In addition to a monitor, the control server 102 and/or remote computers 108 might comprise other peripheral output devices, such as speakers and a printer.
Although many other internal components of the control server 102 and the remote computers 108 are not shown, such components and their interconnection are well known. Accordingly, additional details concerning the internal construction of the control server 102 and the remote computers 108 are not further disclosed herein.
Turning now to
The system environment 200 includes a medical device interfacing system 210, a camera 212, an electronic medical record (EMR) 214, an end-user computing device 216, and a network 218. Each of the components 210, 212, 214, and 216 may be in communication with each other via the network 218. The network 218 may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs). Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet. Accordingly, the network 218 is not further described herein.
The camera 212 may comprise any camera capable of recording or capturing an image and interfacing with a computing device such as, for example, one of the remote computers 108 of
The camera 212 may be associated with a patient and/or a medical device. The association may occur in several different ways. For example, the camera 212 may have software that enables the camera 212 to be associated with a medical device which, in turn, is associated with a patient. In some embodiments, the camera 212 may be physically attached to a medical device. Therefore, if it is known that the medical device is associated with a patient by, for example, associating a product identification of the medical device with a patient identification, then it can be assumed that the camera 212 is associated with the patient because it is physically attached to the medical device. This association may be strengthened by positioning the camera 212 that is attached to the medical device so that it captures or records the product identification of the medical device. For example, the camera 212 may be positioned so that it captures a serial number of the medical device, or a bar code of the medical device. The recorded product identification of the medical device can be used as a check to verify the camera-to-patient association.
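By way of illustration, and not limitation, the following simplified Python sketch shows one way the camera-to-patient association check described above might be performed. The identifiers, association tables, and function name are hypothetical assumptions made for this example only and are not drawn from any particular embodiment.

```python
# Hypothetical sketch of the camera-to-patient association check.
# The association tables and identifiers are invented for illustration.

# Registered associations: camera -> medical device -> patient.
CAMERA_TO_DEVICE = {"camera-212": "SN-48213"}
DEVICE_TO_PATIENT = {"SN-48213": "patient-0042"}


def verify_association(camera_id: str, serial_read_from_image: str) -> str | None:
    """Return the patient ID if the device serial captured in the image
    matches the device registered for this camera; otherwise return None."""
    registered_serial = CAMERA_TO_DEVICE.get(camera_id)
    if registered_serial is None or registered_serial != serial_read_from_image:
        return None  # the camera-to-patient association cannot be verified
    return DEVICE_TO_PATIENT.get(registered_serial)


# The serial number would come from analyzing the captured image
# (e.g., reading a label or decoding a bar code in the camera's field of view).
print(verify_association("camera-212", "SN-48213"))  # -> patient-0042
print(verify_association("camera-212", "SN-99999"))  # -> None
```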
In another embodiment, a clinician may manually associate the camera 212 with a medical device and a patient by manually entering association information into, for example, the medical device interfacing system 210. For instance, the clinician may be interested in obtaining a one-time measurement of blood oxygen levels of a patient using a pulse oximeter. The clinician manually associates the camera 212 and the pulse oximeter device with the patient and obtains an image; the clinician then manually disassociates the camera 212 and the pulse oximeter device from the patient. The image obtained in this manner is known to be associated with the pulse oximeter device and the patient.
In one embodiment of the invention, the camera 212 may be positioned so that its field of view captures the entire output of a medical device. The entire output may subsequently be analyzed to generate a series of results. Alternatively, selected regions of the output may be analyzed to generate results. In another embodiment, the camera 212 may be positioned so that its field of view captures a discrete region of the output. The region, when subsequently analyzed, may generate one result.
The EMR 214 may comprise electronic clinical documents such as images, clinical notes, summaries, reports, analyses, or other types of electronic medical documentation relevant to a particular patient's condition and/or treatment. Electronic clinical documents contain various types of information relevant to the condition and/or treatment of a particular patient and can include information relating to, for example, patient identification information, images, physical examinations, vital signs, past medical histories, surgical histories, family histories, histories of present illnesses, current and past medications, allergies, symptoms, past orders, completed orders, pending orders, tasks, lab results, other test results, patient encounters and/or visits, immunizations, physician comments, nurse comments, other caretaker comments, and a host of other relevant clinical information. The electronic clinical documents must be authenticated or signed to be considered valid, legal medical records. Authenticated medical records stored in the EMR 214 are searchable and accessible for use in patient care. However, the EMR 214 may also store unvalidated data and/or images, which may still be used by clinicians in the decision-making process.
The end-user computing device 216 may be any type of computing device such as, for example, any of the remote computers 108 of
The medical device interfacing system 210 shown in
Components of the medical device interfacing system 210 may include, without limitation, a processing unit, internal system memory, and a suitable system bus for coupling various system components, including one or more data stores for storing information (e.g., files and metadata associated therewith). The medical device interfacing system 210 typically includes, or has access to, a variety of computer-readable media.
While the medical device interfacing system 210 is illustrated as a single unit, it will be appreciated that the medical device interfacing system 210 is scalable. For example, the medical device interfacing system 210 may in actuality include a plurality of computing devices in communication with one another. Moreover, the EMR 214, or portions thereof, may be included within, for instance, the medical device interfacing system 210 as a computer-storage medium. The single unit depictions are meant for clarity, not to limit the scope of embodiments in any form.
As shown in
In one embodiment of the invention, the image collector component 220 is configured to receive an image from the camera 212, where the camera 212 has recorded the image from an output of a medical device. The output of the medical device may be in the form of a visual display of a video or a static image, or a printout from the medical device. The output may comprise alphanumeric characters or one or more wave forms and may include the entire output of the medical device or only a portion of it. In another embodiment of the invention, the image collector component 220 may be configured to receive an image from a foreign or third-party system (via some type of electronic health information transmission protocol such as, for example, HL7). For example, wave form data may be received by the image collector component 220 from a foreign system via the HL7 protocol. Any and all such variations are within the scope of embodiments of the present invention.
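By way of illustration, and not limitation, a minimal sketch of an image collector accepting output from either source might look as follows; the transport details (including any HL7 interface) are abstracted away, and all function and field names are hypothetical.

```python
# Hypothetical sketch of an image collector that accepts either an image
# recorded by the camera or wave form data forwarded by a foreign system.
# Transport details (e.g., the HL7 interface) are intentionally omitted.

collected_outputs: list[dict] = []


def collect_from_camera(camera_id: str, image_bytes: bytes) -> None:
    collected_outputs.append(
        {"source": "camera", "camera_id": camera_id, "payload": image_bytes}
    )


def collect_from_foreign_system(system_id: str, waveform_samples: list[float]) -> None:
    collected_outputs.append(
        {"source": "foreign-system", "system_id": system_id, "payload": waveform_samples}
    )
```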
The image collector component 220 may, in one embodiment, be configured to determine an identity of the medical device with which the camera 212 is associated. This can be done, for example, by determining an Internet Protocol (IP) address of the camera 212. The image collector component 220 can then access a data store (for example, the data store 104 of
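By way of illustration, and not limitation, the lookup described above might resemble the following sketch, in which a camera IP address is resolved to a device identity and an associated patient; the registry contents and field names are hypothetical.

```python
# Hypothetical registry mapping a camera's IP address to the identity of the
# medical device (and patient) it is associated with. All entries and field
# names are invented for illustration.

DEVICE_REGISTRY = {
    "10.20.30.41": {
        "device_type": "vital signs monitor",
        "manufacturer": "ExampleMed",
        "model": "VS-100",
        "patient_id": "patient-0042",
    },
}


def identify_device(camera_ip: str) -> dict | None:
    """Look up the device identity registered for the given camera IP address."""
    return DEVICE_REGISTRY.get(camera_ip)


device = identify_device("10.20.30.41")
print(device["device_type"] if device else "unknown device")  # -> vital signs monitor
```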
In some embodiments, the image collector component 220 may be configured to temporarily store the image upon receiving the image and prior to analyzing the image. Images stored in this manner may still be accessible by clinicians for decision-making purposes. For example, the image collector component 220 may temporarily store the image in the EMR 214 or in an intermediate data store. Still further, the image may be temporarily stored based on an identity of the medical device. The identity of the medical device may include the type of medical device (e.g., a vital signs monitor), the company that produced the medical device, the model or serial number of the medical device, and the like. In yet another aspect, the image may be stored in a work queue that is prioritized based on the identity of the medical device. For example, images received from vital signs monitors may have a higher priority in the work queue than images from an optical instrument used for everyday eye exams. Images may also be prioritized based on a clinical status of a patient (stable, critical, etc.), a clinician identity, time until discharge, and the like.
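By way of illustration, and not limitation, such a prioritized work queue could be sketched as follows; the priority weights, device types, and function names are hypothetical assumptions made for this example only.

```python
import heapq
import itertools

# Hypothetical priority weights: lower values are processed sooner.
DEVICE_PRIORITY = {"vital signs monitor": 0, "spirometer": 1, "eye exam instrument": 5}
STATUS_PRIORITY = {"critical": 0, "stable": 2}

_counter = itertools.count()  # tie-breaker so equal priorities keep arrival order
work_queue: list[tuple[int, int, str]] = []


def enqueue_image(image_id: str, device_type: str, patient_status: str) -> None:
    priority = DEVICE_PRIORITY.get(device_type, 9) + STATUS_PRIORITY.get(patient_status, 3)
    heapq.heappush(work_queue, (priority, next(_counter), image_id))


def next_image() -> str | None:
    return heapq.heappop(work_queue)[2] if work_queue else None


enqueue_image("img-001", "eye exam instrument", "stable")
enqueue_image("img-002", "vital signs monitor", "critical")
print(next_image())  # -> img-002 (vital signs image for a critical patient comes first)
```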
In some embodiments of the invention, the image collector component 220 may be configured to cause the camera 212 to record images. The image collector component 220 may cause the camera 212 to record images at fixed intervals or upon determining that an image has changed in some material way beyond just background noise. For example, it may be determined that a threshold number of pixels that comprise the image have changed.
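By way of illustration, and not limitation, a pixel-change trigger of this kind might be sketched as follows (assuming the NumPy library and grayscale frames of equal size); the noise floor and threshold values are hypothetical.

```python
import numpy as np

# Hypothetical change-detection trigger: record a new image only when more
# than a threshold fraction of pixels differs from the previous frame by more
# than a noise floor, so ordinary background noise does not trigger a capture.

NOISE_FLOOR = 15         # per-pixel intensity difference treated as noise
CHANGE_THRESHOLD = 0.02  # fraction of pixels that must change to trigger a capture


def frame_changed(previous: np.ndarray, current: np.ndarray) -> bool:
    """Return True when enough pixels have changed to warrant recording the frame."""
    diff = np.abs(current.astype(int) - previous.astype(int))
    changed_fraction = float(np.mean(diff > NOISE_FLOOR))
    return changed_fraction > CHANGE_THRESHOLD
```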
The determining component 222 is configured for determining a region of the image to be analyzed. This determination is dependent upon the identity of the medical device as determined by, for example, the image collector component 220. In one embodiment, the determining component 222 may determine that the entire image, and not just a region of the image, should be analyzed. For instance, the entire output of an electrocardiogram (EKG) monitor contains useful information. The determining component 222 determines that an entire image of the output should be analyzed based on the identity of the medical device—an EKG monitor.
In another embodiment, the determining component 222 may determine that a region of the image needs to be analyzed. For example, some regions of the image contain data that is not particularly useful for helping clinicians make decisions regarding patient care. But other regions of the image contain useful data. Again, the identity of the medical device determines which regions of the image have useful data that should be analyzed. By way of illustrative example, a spirometer measures the volume of air inspired and expired by the lungs. The output of the spirometer may be a printout with a wave form where the peak of the wave form indicates maximum exhalation and the trough of the wave form indicates maximum inspiration. A clinician would be interested in data regarding the peaks and troughs but not necessarily data associated with other parts of the wave form.
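By way of illustration, and not limitation, the device-dependent choice of regions might be represented as a simple lookup, as in the sketch below; the device types and pixel coordinates are hypothetical.

```python
# Hypothetical map from device identity to the output regions worth analyzing,
# expressed as (left, top, right, bottom) pixel boxes. An EKG monitor yields
# the whole frame; a spirometer yields only the bands containing wave form
# peaks and troughs. All coordinates are invented for illustration.

REGIONS_OF_INTEREST: dict[str, list[tuple[int, int, int, int]] | None] = {
    "ekg monitor": None,  # None means analyze the entire image
    "spirometer": [(0, 40, 640, 120), (0, 300, 640, 380)],
}


def regions_to_analyze(
    device_type: str, image_size: tuple[int, int]
) -> list[tuple[int, int, int, int]]:
    width, height = image_size
    boxes = REGIONS_OF_INTEREST.get(device_type)
    return [(0, 0, width, height)] if boxes is None else boxes


print(regions_to_analyze("ekg monitor", (640, 480)))  # -> [(0, 0, 640, 480)]
```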
The analyzer component 224 is configured to analyze an image and/or determined region and generate results as outputs. One result may be generated or multiple results may be generated. Images and/or regions containing alphanumeric text may be analyzed using optical character recognition (OCR) technology, while images and/or regions that consist of wave forms may be analyzed by using waveform analysis including Fourier analysis. The results may include numerical values or text.
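By way of illustration, and not limitation, the two analysis paths might be sketched as follows, assuming the Pillow, NumPy, and pytesseract libraries merely as stand-ins for any suitable OCR and signal-processing tools; the function names are hypothetical.

```python
import numpy as np
from PIL import Image
import pytesseract  # assumed OCR dependency; any OCR engine could be substituted


def analyze_text_region(region: Image.Image) -> str:
    """OCR an image region containing alphanumeric characters and return the text."""
    return pytesseract.image_to_string(region).strip()


def dominant_frequency(samples: np.ndarray, sample_rate: float) -> float:
    """Return the dominant frequency (Hz) of a wave form using Fourier analysis."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])
```

In practice, the wave form samples would first be extracted from the image of the trace; that extraction step is omitted from this sketch.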
In one embodiment, when a medical device is always associated with the camera 212 or when a clinician manually associates a medical device with the camera 212, the analyzer component 224 analyzes the image and generates results without, for example, the image collector component 220 determining an identity of the medical device. In another embodiment, the analyzer component 224 analyzes the image and generates results based on knowledge of the identity of the device as determined by, for example, the image collector component 220. In yet another embodiment, the analyzer component 224 analyzes a region of the image and generates results, where the region analyzed is determined by, for example, the determining component 222 based on an identity of the medical device.
In one aspect of the invention, a tagging component (not shown) is configured to tag the image or a region of the image with a result(s). Thus, a clinician viewing the image would see tags corresponding to the result(s). In the example given above regarding the spirometer, a clinician could access the spirometer readout image from, for example, the EMR 214 and view the image. The peaks of the image would be tagged with a FEV1 value, and the troughs of the image would be tagged with a FIV1 value.
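By way of illustration, and not limitation, a tag might be represented as a small record associating a region of the image with a labeled result, as in the following sketch; the field names and spirometry values are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical representation of a tag attached to a region of an image.


@dataclass
class Tag:
    region: tuple[int, int, int, int]  # (left, top, right, bottom) in pixels
    label: str
    value: str


# Invented example: a spirometer readout image tagged at its peaks and troughs.
image_tags = {
    "spirometer-img-007": [
        Tag(region=(0, 40, 640, 120), label="FEV1", value="3.1 L"),
        Tag(region=(0, 300, 640, 380), label="FIV1", value="2.8 L"),
    ],
}
```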
In yet another aspect of the invention, the validation component 226 is configured to present the image, and/or region, and the result(s) to a user and receive validation from the user indicating that the result(s) is valid for the image and/or region. The validation component 226 may also be configured to present an image with a tagged region to a user and receive validation of the tagged region from the user indicating that the tagged region is valid for the image. The user may be a clinician involved in the patient's care. Or the user may be an “image transcriptionist.” If the image does not contain sensitive patient information, it is possible for the transcription process to be performed outside of patient/provider confidentiality guidelines. Once the clinician and/or the image transcriptionist has verified that the result is valid for the image and/or region, or that the tagged region is valid for the image, the image can be signed. Once the image is signed or authenticated it is considered a valid, signed medical record that can be accessed from the EMR 214 and used to care for the patient.
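By way of illustration, and not limitation, the validation step could be sketched as follows, where a reviewer's approval marks a stored result as validated and signed; the record structure and names are hypothetical.

```python
# Hypothetical sketch of the validation step: a result is presented alongside
# its image, and only after the reviewer confirms it is the record marked as
# validated ("signed"). All names are invented for illustration.


def validate_result(record: dict, reviewer: str, approved: bool) -> dict:
    """Mark a stored result as validated and signed if the reviewer approves it."""
    if approved:
        record["validated"] = True
        record["signed_by"] = reviewer
    return record


record = {"result": "FEV1 3.1 L", "validated": False}
validate_result(record, reviewer="clinician-17", approved=True)
print(record)  # -> {'result': 'FEV1 3.1 L', 'validated': True, 'signed_by': 'clinician-17'}
```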
The storing component 228 is configured to store the result(s) in, for example, the EMR 214. In one aspect, the storing component 228 is configured to store the result(s) and/or the image in the EMR 214. In yet another aspect, the storing component 228 is configured to store the image and/or any tagged regions in the EMR 214. In one embodiment, the storing component 228 may be configured to identify a proper location in the EMR 214 for storing the result(s). The location may be dependent upon what type of medical device generated the result(s).
The storing component 228 may store the result(s), the image, and/or the tagged region in the EMR 214 either before or after receiving validation. If stored before validation, a clinician or image transcriptionist can still access the image or results from the EMR 214 (by using, for example, a patient identification) and use them for decision-making purposes; the result may then be validated and re-stored in the EMR 214. If stored after receiving validation, the record is considered a valid, signed medical record. At this point, the image can be searched and accessed by a clinician and used to guide decisions regarding patient care.
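By way of illustration, and not limitation, a storage helper reflecting this behavior might look like the following sketch, in which the storage location depends on the device type and a result can be stored before validation and re-stored once validated; the section names and data layout are hypothetical.

```python
from datetime import datetime, timezone

# Hypothetical EMR storage helper: the storage location depends on the type of
# device that generated the result, and a result may be stored before validation
# and re-stored as a signed record after validation. All names are invented.

EMR_SECTION_BY_DEVICE = {"vital signs monitor": "vitals", "spirometer": "pulmonary"}
emr_store: dict[str, dict[str, list[dict]]] = {}


def store_result(patient_id: str, device_type: str, result: str, validated: bool = False) -> None:
    section = EMR_SECTION_BY_DEVICE.get(device_type, "general")
    entry = {
        "result": result,
        "validated": validated,
        "stored_at": datetime.now(timezone.utc).isoformat(),
    }
    emr_store.setdefault(patient_id, {}).setdefault(section, []).append(entry)


store_result("patient-0042", "spirometer", "FEV1 3.1 L")                  # accessible pre-validation
store_result("patient-0042", "spirometer", "FEV1 3.1 L", validated=True)  # re-stored after sign-off
```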
Turning now to
At step 312, the image is analyzed to generate a result. In one embodiment, the entire image is analyzed to generate one or more results. In another embodiment, an identity of the medical device associated with the camera is determined before the image is analyzed. This may be done by, for example, the image collector component 220 of
Turning now to
Continuing with respect to step 410, an image may be received from a foreign or third-party system via some type of electronic health information transmission protocol. Images received in this manner may include information detailing an identity of a medical device that produced the image.
At step 412, a region of the image to be analyzed is determined based upon an identity of the medical device. This determination may be made by, for example, the determining component 222 of
At step 414, the region of the image is analyzed. Optical character recognition may be used if the region contains alphanumeric characters, and waveform analysis may be used if the region includes wave forms. At step 416, a result is generated for the analyzed region. The result may be a numerical value or text. In one embodiment, the image and/or region is tagged with the result.
At step 418, validation of the result is received to provide a validated result. Validation may be received after presenting the image and/or region and the result to a user and receiving validation from the user indicating that the result is valid for the image and/or region. The user may be a clinician involved in patient care, or, provided patient privacy is maintained, the user may be an image transcriptionist who validates the image outside of the normal healthcare setting. If the image and/or region has been tagged with the result, the tagged image and/or region may be validated.
At step 420, the validated result is stored in an EMR such as the EMR 214 of
The present invention has been described in relation to particular embodiments, which are intended in all respects to be illustrative rather than restrictive. Further, the present invention is not limited to these embodiments, but variations and modifications may be made without departing from the scope of the present invention.
This patent application is a continuation-in-part of U.S. application Ser. No. 14/844,932, filed Sep. 3, 2015, entitled “Medical Device Interfacing Using a Camera,” and having Attorney Docket No. CRNI.240748, which is a continuation of U.S. application Ser. No. 13/167,269, filed Jun. 23, 2011, entitled “Medical Device Interfacing Using a Camera,” and having Attorney Docket No. CRNI.161843. The entirety of the disclosures of each of U.S. application Ser. No. 13/167,269 and U.S. application Ser. No. 14/844,932 is hereby incorporated by reference.
|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 13167269 | Jun 2011 | US |
| Child | 14844932 |  | US |

|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 14844932 | Sep 2015 | US |
| Child | 15905184 |  | US |