The present disclosure relates generally to systems and methods for computer-based medical charting and, more specifically, to systems and methods for computer-based medical charting employing an ink-over interface with structured data capture.
A medical chart can provide a comprehensive, standardized, graphical view of clinical data (e.g., diagnostic information and/or therapeutic information) related to at least one area of a patient's body. Traditional medical charts have employed a pen and paper interface that allows medical professionals to make annotations in locations approximating clinical reality (e.g., the annotation can be made in the proper spatial orientation). However, as the medical community has embraced the digital world (e.g., electronic medical records), traditional pen and paper interfaces have become obsolete. Electronic medical charts have been developed with keyboard and mouse interfaces to meet the demands of the digital world. However, keyboard and mouse interfaces allow for imprecise annotations that do not approximate clinical reality. Additionally, medical professionals can find the keyboard and mouse interface cumbersome for documenting clinical findings and treatment plans.
The present disclosure relates generally to systems and methods for computer-based medical charting and, more specifically, to systems and methods for computer-based medical charting employing an ink-over interface with structured data capture. The ink-over interface allows an input that resembles a traditional pen and paper interface, while providing the computational abilities of the digital world. The structured data capture allows the input to be stored and categorized in a structured format (e.g., for categorization in an electronic health record (EHR) of a patient).
In one aspect, the present disclosure can include a system that can enter clinical data on an electronic medical chart. The system can include a non-transitory memory storing computer-executable instructions and a processor that executes the computer-executable instructions to at least: receive a graphical data input from an ink-over interface, wherein the graphical data input comprises an annotation associated with at least a portion of a patient's body displayed by the electronic medical chart; perform optical symbol recognition to identify the annotation; determine information associated with the annotation based on information stored in a recognition engine; and store the information associated with the annotation as structured data associated with the at least the portion of the patient's body in an electronic health record.
In another aspect, the present disclosure can include a method for entering clinical data into an electronic medical chart. The method can include steps that can be performed by a system that includes a processor. The steps can include: receiving a graphical data input from an ink-over interface, wherein the graphical data input comprises an annotation associated with at least a portion of a patient's body displayed by the electronic medical chart; performing optical symbol recognition to identify the annotation; determining information associated with the annotation based on information stored in a recognition engine; and storing the information associated with the annotation as structured data associated with the at least the portion of the patient's body in an electronic health record.
In a further aspect, the present disclosure can include an electronic medical charting system. The electronic medical charting system can include an ink-over interface and a computing device associated with the ink-over interface. The ink-over interface can be configured to receive a graphical data input comprising an annotation associated with at least a portion of a patient's body. The computing device can include a non-transitory memory storing computer-executable instructions; and a processor that executes the computer-executable instructions to at least: detect a gesture associated with the ink-over interface; receive the graphical data input from the ink-over interface based on detection of the gesture; perform optical symbol recognition to identify an annotation within the graphical data input; determine information associated with the annotation based on information stored in a recognition engine; and store the information associated with the annotation as structured data associated with the at least the portion of the patient's body in an electronic health record.
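Taken together, these aspects describe one processing flow: capture a graphical data input from the ink-over interface, identify the annotation by optical symbol recognition, look the annotation up in a recognition engine, and store the result as structured data in an electronic health record. The following Python sketch is a minimal, illustrative rendering of that flow only; every name in it (Annotation, StructuredDatum, RecognitionEngine, recognize_symbol, chart_annotation) is hypothetical and is not defined by the present disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Annotation:
    """A recognized annotation tied to a location on the charted body."""
    symbol: str        # a standardized charting symbol
    color: str         # color often carries clinical meaning
    body_region: str   # anatomic region the annotation was drawn over


@dataclass
class StructuredDatum:
    """A fixed-field record suitable for storage in an electronic health record."""
    body_region: str
    condition: str
    author: Optional[str] = None


class RecognitionEngine:
    """Maps (symbol, color) pairs to clinical meanings; the contents are illustrative."""

    def __init__(self) -> None:
        self._patterns = {("X", "red"): "lesion observed"}

    def lookup(self, symbol: str, color: str) -> Optional[str]:
        return self._patterns.get((symbol, color))


def recognize_symbol(graphical_data_input: bytes) -> Optional[Annotation]:
    """Stand-in for the optical symbol recognition step; a real system would analyze the ink strokes."""
    return Annotation(symbol="X", color="red", body_region="left retina")


def chart_annotation(graphical_data_input: bytes,
                     engine: RecognitionEngine,
                     ehr: List[StructuredDatum]) -> Optional[StructuredDatum]:
    """Receive a graphical data input, recognize it, and store the result as structured data."""
    annotation = recognize_symbol(graphical_data_input)
    if annotation is None:
        return None
    meaning = engine.lookup(annotation.symbol, annotation.color)
    if meaning is None:
        return None  # unmatched patterns are handled separately (see the pattern-failed handling later)
    datum = StructuredDatum(body_region=annotation.body_region, condition=meaning)
    ehr.append(datum)  # stand-in for a write to the electronic health record
    return datum


if __name__ == "__main__":
    record: List[StructuredDatum] = []
    print(chart_annotation(b"", RecognitionEngine(), record))
```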
The foregoing and other features of the present disclosure will become apparent to those skilled in the art to which the present disclosure relates upon reading the following description with reference to the accompanying drawings, in which:
In the context of the present disclosure, the singular forms “a,” “an” and “the” can also include the plural forms, unless the context clearly indicates otherwise.
The terms “comprises” and/or “comprising,” as used herein, can specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups.
As used herein, the term “and/or” can include any and all combinations of one or more of the associated listed items.
Additionally, although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Thus, a “first” element discussed below could also be termed a “second” element without departing from the teachings of the present disclosure. The sequence of operations (or acts/steps) is not limited to the order presented in the claims or figures unless specifically indicated otherwise.
As used herein, the term “medical charting” can refer to a process in which a medical professional lists and describes information related to the health of at least a portion of a patient's body. The information can be graphically summarized and organized on a medical chart. In some instances, the information can include diagnostic information and/or therapeutic information.
As used herein, the term “medical chart” can refer to a graphical tool that presents a comprehensive view of at least a portion of a patient's body and information related to the health of the at least the portion of the patient's body. The information can be displayed, for example, via annotations.
As used herein, the terms “electronic” and “computer-based” when used in connection with “medical charting” and “medical chart” can refer to data entry corresponding to the “medical charting” and “medical chart” being performed using a computing device that includes at least a non-transitory memory.
As used herein, the terms “electronic health record (EHR)” and “electronic medical record” can refer to a digital version of a patient's paper medical chart. The EHR can be stored in a repository that is accessible to different medical professionals associated with the patient.
As used herein, the term “annotations” can refer to diagrammatic indications on at least a portion of a patient's body chart that reflect the information related to the health of at least a portion of a patient's body in a standardized manner. The standardized annotations can include text, numbers, and/or symbols in different colors, where specific combinations of text, numbers, symbols, and colors can represent different conditions. The standardization allows the annotations to be understood by different medical professionals (either specific to a certain specialty or uniform across specialties).
As used herein, the term “interface” can refer to software and/or hardware that allow a user (e.g., a medical professional) to communicate with a computing device.
As used herein, the term “ink-over interface” can refer to a software or hardware interface that allows a user (e.g., a medical professional) to enter graphical data input to a computing device. The graphical data input can be written and/or drawn in one or more colors on the ink-over interface using an input device. In some instances, an ink-over interface can be implemented on a touch screen device (e.g., a tablet computing device, a smart phone device, a laptop computing device, etc.) and the input device can be a stylus, a finger, or the like.
As used herein, the term “graphical data input” can refer to an input on an ink-over interface including one or more annotations. In some instances, the graphical data input can be initiated and/or ended based on one or more gestures on the ink-over interface.
As used herein, the term “structured data” can refer to data that resides in a fixed field within a stored record (e.g., a relational database). In some instances, structured data can include data related to an annotation, such as a condition, a procedure, a medical history, a medical professional's name, etc.
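Purely as an illustration of such a fixed-field record, the snippet below stores annotation-derived data in a small relational table; the column names are assumptions and are not fields prescribed by the present disclosure.

```python
import sqlite3

# Illustrative only: a fixed-field table for annotation-derived structured data.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE annotation_data (
        id          INTEGER PRIMARY KEY,
        body_region TEXT NOT NULL,  -- e.g., "left retina"
        condition   TEXT NOT NULL,  -- e.g., "lesion observed"
        procedure   TEXT,           -- planned or completed procedure, if any
        author      TEXT            -- medical professional's name
    )
""")
conn.execute(
    "INSERT INTO annotation_data (body_region, condition, author) VALUES (?, ?, ?)",
    ("left retina", "lesion observed", "Dr. Example"),
)
print(conn.execute("SELECT body_region, condition, author FROM annotation_data").fetchall())
```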
As used herein, the term “medical professional” can refer to a person involved in a medical exam or procedure that can employ a medical chart, including, but not limited to, doctors, physician assistants, nurse practitioners, nurses, medical students, and other medical staff.
As used herein, the term “patient” can refer to any warm-blooded organism including, but not limited to, a human being, a pig, a rat, a mouse, a dog, a cat, a goat, a sheep, a horse, a monkey, an ape, a rabbit, a cow, etc.
The present disclosure relates generally to systems and methods for electronic medical charting and, more specifically, to systems and methods for electronic medical charting employing an ink-over interface with structured data capture. The ink-over interface allows an input that resembles a traditional pen and paper interface, while providing a link to the digital world. The systems and methods of the present disclosure can solve problems inherent to electronic medical charting with keyboard and mouse interfaces, which are not intuitive and can produce imprecise annotations. In contrast to traditional electronic medical charting, the systems and methods of the present disclosure provide an intuitive ink-over interface (e.g., analogous to traditional pen and paper interfaces) for entry of standardized annotations, while also meeting the existing information needs of medical professionals (e.g., by employing optical symbol recognition and a recognition engine to create structured data related to the annotations). The electronic medical chart of the present disclosure can be used for medical record documentation (e.g., in the patient's EHR). Examples of fields where the systems and methods of the present disclosure can be used include podiatry (e.g., for charting diabetic foot assessments), dermatology (e.g., for charting skin lesions), and ophthalmology (e.g., for annotation of retinal health).
One aspect of the present disclosure can include a system for computer-based medical charting. One example of such a system is shown in
The computer program instructions can also be stored in a non-transitory computer-readable memory that can direct the computing device 8 to function in a particular manner, such that the instructions stored in the non-transitory computer-readable memory produce an article of manufacture including instructions, which implement the functions specified in the block diagrams and associated description.
The computer program instructions can also be loaded onto the computing device 8 to cause a series of operational steps to be performed to produce a computer-implemented process such that the instructions that execute on the computing device 8 provide steps for implementing the functions of the components specified in the block diagrams and the associated description.
Accordingly, functionalities of the computing device 8 and/or the system 10 can be embodied at least in part in hardware and/or in software (including firmware, resident software, micro-code, etc.). Furthermore, aspects of the computing device 8 and/or the system 10 can take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. A computer-usable or computer-readable medium can be any non-transitory medium (i.e., not a transitory signal) that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable or computer-readable medium can be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium can include the following: a portable computer diskette; a random access memory; a read-only memory; an erasable programmable read-only memory (or Flash memory); and a portable compact disc read-only memory.
As shown in
In some instances, the electronic medical chart can be displayed by the display 6 with one or more views of a portion of the electronic medical chart. For example, one view can include a zoomed in view of one or more portions of the patient's body from the electronic medical chart. In this case, the graphical data input can be based on a gesture that includes a symbol drawn on the zoomed in view of the portion of the patient's body at a certain location where an annotation is made or is going to be made.
The electronic medical chart (EC) can be used in any field of medicine, dentistry, veterinary medicine, or the like. The respective annotations can be standard annotations, widely accepted across the field, that include symbols, colors, numbers, and text to document clinical conditions. The electronic medical chart (EC) can include patient information (e.g., name, date of birth, contact information, chronic medical conditions, a photograph, etc.).
As an example, the electronic medical chart (EC) can be used in the field of ophthalmology. An example of an electronic medical chart (EC) that can be utilized in ophthalmology can include one or more fundus drawings that can assist the medical professional in making annotations. For example, various annotations (with different symbols, numbers, and text) can be made in different colors to track a record of anterior segment and/or retinal disease progress. Various annotations can be made in various colors to represent various disease states and/or treatments of the retina. The anatomic region of the eye where the medical professional has made the annotation can be automatically recognized. The annotation can be stored as structured data on the electronic medical chart (EC) or in the clinical note of the electronic health record (EHR).
In another example, the electronic medical chart (EC) can be used in the field of podiatry (e.g., to document a foot exam for an injury or diabetic issues). In this case, the palpated points and other clinical information can be charted on the foot directly through the ink-over interface. The anatomic region of the foot or skin where the medical professional has made the annotation can be automatically recognized. The annotation can be stored as structured data on the electronic medical chart (EC) or in the clinical note of the electronic health record (EHR).
In yet another example, the electronic medical chart (EC) can be used in the field of dermatology. For example, accepted annotations can be used when documenting a specific region, shape, and color of moles, birthmarks, other identifying marks, or other clinical conditions. The medical professional can chart directly onto a map of the human body. The anatomic region of these annotations can be automatically recognized, and the annotations can be captured as structured data, which can be stored in an electronic health record (EHR) of the patient.
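In each of these examples, the anatomic region under an annotation is recognized automatically. One way such recognition could be approximated, assuming the chart image carries a named region map, is to hit-test the annotation's coordinates against region bounds; the region names and coordinates below are placeholders rather than part of the disclosure.

```python
from typing import Dict, Optional, Tuple

# Hypothetical region map for a chart image: region name -> (x_min, y_min, x_max, y_max).
REGION_MAP: Dict[str, Tuple[float, float, float, float]] = {
    "left forearm": (100.0, 220.0, 160.0, 360.0),
    "right forearm": (340.0, 220.0, 400.0, 360.0),
}


def anatomic_region_at(x: float, y: float) -> Optional[str]:
    """Return the named anatomic region containing the annotation's centroid, if any."""
    for name, (x0, y0, x1, y1) in REGION_MAP.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None


print(anatomic_region_at(120.0, 300.0))  # -> "left forearm"
```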
In each of these specialties, the electronic medical chart (EC) can also include a plurality of selectable tools that can be utilized in connection with the ink-over interface 4. In some instances, the tools can include a pen tool with different selectable colors and an eraser tool. The display can also include a history (e.g., related to previous annotations on the portion of the patient's body). In some instances, the display can also include one or more x-ray images (e.g., from the electronic health record (EHR)). The display can also include one or more actions that can be performed on the electronic medical chart (EC) (e.g., select the portion of the patient's body, annotate the portion of the patient's body, edit the history of the portion of the patient's body, select a different portion of the patient's body, view previous versions of the chart, etc.).
The electronic medical chart (EC) can be displayed in one or more of a plurality of different views. For example, an area of interest mode can be activated by a tap gesture on the portion of the body. An additional window can be displayed with a zoomed-in version of the portion of the body (and may include additional surrounding portions of the body). For example, the zoomed-in version can include a 2x zoom area of the selected portion of the patient's body. A double-tap gesture can allow the zoomed-in version to be further zoomed in (e.g., 4x). An annotation can be made on the zoomed-in version of the portion of the patient's body, and gestures related to the annotation can be used to identify the annotation. The history related to the portion of the patient's body can be updated with structured data related to the annotation (e.g., with progress notes showing the progression of the portion of the patient's body over time).
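As a rough sketch of the view behavior described above, the class below tracks the zoom level of an area-of-interest window in response to tap and double-tap gestures. The 2x and 4x levels follow the example in the text; the class and method names are assumptions.

```python
class AreaOfInterestView:
    """Tracks the zoom state of an area-of-interest window over a selected body portion."""

    def __init__(self) -> None:
        self.zoom = 1       # 1 = full chart, 2 = area-of-interest window, 4 = further zoomed
        self.region = None  # name of the selected portion of the body, if any

    def on_tap(self, region: str) -> None:
        """A tap on a portion of the body opens the zoomed-in (2x) window for that portion."""
        self.region = region
        self.zoom = 2

    def on_double_tap(self) -> None:
        """A double-tap zooms the area-of-interest window in further (e.g., 4x)."""
        if self.zoom == 2:
            self.zoom = 4


view = AreaOfInterestView()
view.on_tap("left foot")
view.on_double_tap()
print(view.region, view.zoom)  # left foot 4
```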
Referring again to
One or more of the components can include instructions that are stored in a non-transitory memory 22 and executed by a processor 24. Each of the components can be in a communicative relationship with one or more of the other components, the processor 24, and/or the non-transitory memory 22 (e.g., via a direct or indirect electrical, electromagnetic, optical, or other type of wired or wireless communication) such that an action from the respective component causes an effect on one or more of the other components and/or on the electronic medical chart (EC).
The ink-over interface receiver 12 can be configured to receive a graphical data input (GDI) from an ink-over interface 4. In some instances, the graphical data input (GDI) can include an annotation associated with one or more portions of the patient's body on a medical chart. For example, the annotation can include one or more of: a location on the portion of the patient's body associated with the annotation, information related to an existing condition, diagnostic information, therapeutic information, and information related to a planned treatment or procedure. In some instances, the annotation can relate to a previous diagnosis, treatment, or procedure.
A state diagram 32 showing the operation of the ink-over interface receiver 12 in the receipt of a graphical data input (GDI) is illustrated in
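The specific states of the receiver are shown in the referenced state diagram and are not reproduced here. Purely as an assumed illustration of gesture-bounded capture (consistent with a graphical data input being initiated and ended by gestures), such a receiver might be modeled as follows; the state names are invented for the sketch.

```python
from enum import Enum, auto
from typing import Any, List


class ReceiverState(Enum):
    IDLE = auto()       # waiting for an initiating gesture
    CAPTURING = auto()  # strokes are being collected into a graphical data input
    COMPLETE = auto()   # the graphical data input is ready for recognition


class InkOverInterfaceReceiver:
    """Collects strokes between an initiating gesture and a terminating gesture."""

    def __init__(self) -> None:
        self.state = ReceiverState.IDLE
        self.strokes: List[Any] = []

    def on_gesture_start(self) -> None:
        if self.state is not ReceiverState.CAPTURING:
            self.strokes.clear()
            self.state = ReceiverState.CAPTURING

    def on_stroke(self, stroke: Any) -> None:
        if self.state is ReceiverState.CAPTURING:
            self.strokes.append(stroke)

    def on_gesture_end(self) -> List[Any]:
        """Finish capture and return the collected strokes as the graphical data input."""
        self.state = ReceiverState.COMPLETE
        return list(self.strokes)
```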
Referring again to
If a match for the annotation is not found in the recognition engine 18 (e.g., at element 66), the annotation can be moved to a pattern-failed state. The annotation analyzer 14 can prompt the medical professional to complete the annotation correctly by starting an annotation assistant (e.g., at element 67). The annotation assistant can prompt the medical professional to clear the current annotation, persist the image as unstructured data, and/or make recommendations to guide the medical professional to an alternate means of creating the structured data (SD).
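A hedged sketch of this match-or-fallback behavior follows: when no match is found, the annotation enters a pattern-failed state and the assistant offers options like those listed above. The function name, option strings, and dictionary-based engine are illustrative only.

```python
from typing import Dict, Optional, Tuple


def resolve_annotation(symbol: str, color: str,
                       engine: Dict[Tuple[str, str], str]) -> dict:
    """Try to match an annotation; fall back to the annotation assistant on failure."""
    meaning: Optional[str] = engine.get((symbol, color))
    if meaning is not None:
        return {"state": "matched", "structured_data": {"condition": meaning}}

    # Pattern-failed state: offer the assistant's options instead of silently dropping the ink.
    return {
        "state": "pattern_failed",
        "options": [
            "clear the current annotation",
            "persist the image as unstructured data",
            "follow a recommendation to create the structured data another way",
        ],
    }


engine = {("X", "red"): "lesion observed"}
print(resolve_annotation("?", "blue", engine)["state"])  # pattern_failed
```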
Referring again to
When the structured data is accepted as associated with the portion of the patient's body, it can persist with the portion of the patient's body (e.g., through different views, different zoom levels, different charts, and the like). In some instances, the structured data can be stored in an electronic health record (EHR). The electronic health record (EHR) can include additional information that can provide a complete medical history for the patient (e.g., x-rays). In other instances, the structured data can be included in the electronic medical chart (EC). The electronic medical chart (EC) can be transmitted to the display 6 and the structured data (SD) can be visually displayed with the rest of the electronic medical chart (EC).
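One way such persistence could be arranged, offered here only as an assumption rather than a description of any particular EHR interface, is to key the accepted structured data by body region so that it is available regardless of the view or zoom level that produced it:

```python
from collections import defaultdict
from typing import Dict, List


class ChartStore:
    """Keeps structured data keyed by body region so it survives view and zoom changes."""

    def __init__(self) -> None:
        self._by_region: Dict[str, List[dict]] = defaultdict(list)

    def accept(self, body_region: str, structured_datum: dict) -> None:
        """Record an accepted structured datum against its body region."""
        self._by_region[body_region].append(structured_datum)

    def for_region(self, body_region: str) -> List[dict]:
        """Everything recorded for a region, regardless of which view created it."""
        return list(self._by_region[body_region])


store = ChartStore()
store.accept("left retina", {"condition": "lesion observed", "author": "Dr. Example"})
print(store.for_region("left retina"))
```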
Another aspect of the present disclosure can include a method for electronic medical charting. An example of a method 70 that can detect annotations entered via an ink-over interface is shown in
The methods 70 and 80 of
One or more blocks of the respective flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. The computer program instructions can be stored in memory and provided to a processor of a general purpose computer, special purpose computer, and/or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer and/or other programmable data processing apparatus, create mechanisms for implementing the steps/acts specified in the flowchart blocks and/or the associated description. In other words, the steps/acts can be implemented by a system comprising a processor that can access the computer-executable instructions that are stored in a non-transitory memory.
The methods 70 and 80 of the present disclosure may be embodied in hardware and/or in software (including firmware, resident software, micro-code, etc.). Furthermore, aspects of the present disclosure may take the form of a computer program product on a computer-usable or computer-readable storage medium having computer-usable or computer-readable program code embodied in the medium for use by or in connection with an instruction execution system. A computer-usable or computer-readable medium may be any non-transitory medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
Referring to
At step 74, a graphical data input (GDI) can be received from the ink-over interface (e.g., by the annotation analyzer 14 via the ink-over interface receiver 12) upon detection of the gesture. The graphical data input can include the completed annotation. At step 76, information associated with the graphical data input (GDI) can be determined (e.g., based on a pattern recognition process by the annotation analyzer 14) based on information stored in a recognition engine (e.g., recognition engine 18).
The information associated with the graphical data input (GDI) can be used to create structured data associated with the annotation (e.g., by structured data unit 16). For example, the received annotation can undergo a pattern recognition process to match the annotation to stored information (e.g., within the recognition engine 18). If a match is found, structured data (SD) related to the annotation can be created (e.g., by structured data unit 16) from information associated with the identified annotation (e.g., stored in the recognition engine 18). In some instances (e.g., when the annotation matches a lesser-known pattern), the medical professional can select one or more procedures and/or diagnoses associated with the annotation to create the associated structured data. If a match for the annotation cannot be found (e.g., within recognition engine 18), the medical professional can be prompted to complete the annotation correctly (e.g., prompt the medical professional to clear the current annotation, persist the image as unstructured data, and/or make recommendations to guide the medical professional to an alternate means of creating the structured data (SD)).
Referring now to
From the above description, those skilled in the art will perceive improvements, changes and modifications. Such improvements, changes and modifications are within the skill of one in the art and are intended to be covered by the appended claims.
This application is a Continuation-in-Part of U.S. patent application Ser. No. 14/779,408, filed Sep. 23, 2015, entitled “Systems and Methods for Tooth Charting”, which is a U.S. National Stage filing under 35 USC 371, claiming priority to Serial No. PCT/US2014/032601, filed Apr. 2, 2014, and also claims the benefit of U.S. Provisional Application No. 61/808,871, filed Apr. 5, 2013 and U.S. Provisional Application No. 61/876,242, filed Sep. 11, 2013, both entitled “Intelligent Tooth Charting Interface”. These applications are hereby incorporated by reference in their entirety for all purposes.
Provisional Applications: U.S. Provisional Application No. 61/808,871, filed Apr. 2013 (US); U.S. Provisional Application No. 61/876,242, filed Sep. 2013 (US).
Related U.S. Application Data: Parent application Ser. No. 14/779,408, filed Sep. 2015 (US); child application Ser. No. 14/882,693 (US).