Automated code feedback system

Information

  • Patent Grant
  • Patent Number
    11,282,596
  • Date Filed
    Friday, November 16, 2018
  • Date Issued
    Tuesday, March 22, 2022
  • Examiners
    • Zhang; Leshui
  • Agents
    • Blueshift IP, LLC
    • Plotkin; Robert
Abstract
A computer system automatically generates and provides real-time feedback to a healthcare provider about a selected Evaluation and Management (E/M) level to assign to a patient encounter. The system provides the feedback while the healthcare provider is creating the clinical note that documents the patient encounter. The system may also automatically generate and suggest E/M levels to the healthcare provider for approval by the healthcare provider while the healthcare provider is creating the clinical note.
Description
BACKGROUND

When a healthcare provider provides healthcare services to a patient in a patient encounter, the provider must create accurate, precise, and complete clinical documentation in order to provide the patient with high quality care, and in order for the provider to receive proper and timely payment for services rendered. Facts, findings, and observations about the patient's history, current physiological state, treatment options, and need for medical follow-up must be recorded. Physicians and other healthcare providers typically document patient encounters in a clinical note, which may be stored in a document and/or Electronic Medical Record (EMR) system.


In addition to providing patient care, healthcare providers must be paid for the services they provide. The services are reported with codes developed by the American Medical Association (AMA) and published in the Current Procedural Terminology (CPT), with yearly updates to reflect new and emerging medical technology. A specific subsection of CPT, Evaluation and Management (E/M), was established to report provider services that do not involve diagnostic or therapeutic procedures. These services typically take the form of physician office visits, consultative services, or daily inpatient hospital visits, but can include services provided in locations such as emergency rooms, hospice facilities, and nursing homes. According to the Evaluation and Management Service Guidelines as published in the American Medical Association's Current Procedural Terminology, “The levels of E/M services encompass the wide variations in skill, effort, time, responsibility, and medical knowledge required for the prevention or diagnosis and treatment of illness or injury and the promotion of optimal health.” Low E/M levels represent problem-focused patient encounters with straightforward medical decision making, whereas high levels represent comprehensive encounters with medical decision making of high complexity. The documentation of the patient encounter that is created by the healthcare provider serves as the evidentiary record of the service provided, in support of the E/M level assigned to the patient encounter.


One set of E/M standards is entitled, “1995 Documentation Guidelines for Evaluation and Management Services,” available online at https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/95Docguidelines.pdf, and hereby incorporated by reference herein. Another set of E/M standards is entitled, “1997 Documentation Guidelines for Evaluation and Management Services,” available online at https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/97Docguidelines.pdf, and hereby incorporated by reference herein. As described in those two documents, and merely for purposes of example and without limitation, descriptors for the levels of E/M services may recognize seven components which may be used, in any combination, to define the levels of E/M services, namely: history, examination, medical decision making, counseling, coordination of care, nature of presenting problem, and time.


In existing systems, the E/M level for a particular patient encounter typically is selected by providing the clinical note from the patient encounter to a medical coding expert, who applies a complex set of standardized rules to the clinical note to select an appropriate E/M level, which in turn is used to determine the payment for the services rendered by the provider. The physician's and/or medical coder's judgment about the patient's condition may also be taken into account when selecting an E/M level to apply to the patient encounter. Existing professional fee coding techniques thus require a coding professional with a thorough understanding of E/M guidelines to analyze the clinical documentation and to determine the level of service given to a patient by a provider of medical care.


As the description above illustrates, selecting an appropriate E/M level to apply to a patient encounter is a complex, tedious, and time-consuming process, involving a combination of rules and human judgment, and requiring a human medical coding expert with a thorough understanding of E/M guidelines to analyze the clinical documentation in order to determine the level of service given to the patient by the provider.


SUMMARY

A computer system automatically generates and provides real-time feedback to a healthcare provider about a selected Evaluation and Management (E/M) level to assign to a patient encounter. The system provides the feedback while the healthcare provider is creating the clinical note that documents the patient encounter. The system may also automatically generate and suggest E/M levels to the healthcare provider for approval by the healthcare provider while the healthcare provider is creating the clinical note.


Other features and advantages of various aspects and embodiments of the present invention will become apparent from the following description and from the claims.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a dataflow diagram of a computer system for automatically generating and providing real-time feedback to a healthcare provider 102 about Evaluation and Management (E/M) codes selected by the healthcare provider 102 according to one embodiment of the present invention.



FIG. 2 is a flowchart of a method performed by the system of FIG. 1 according to one embodiment of the present invention.





DETAILED DESCRIPTION

Referring to FIG. 1, a dataflow diagram is shown of a computer system 100 for automatically generating and providing real-time feedback to a healthcare provider 102 about Evaluation and Management (E/M) codes selected by the healthcare provider 102 according to one embodiment of the present invention. Referring to FIG. 2, a flowchart is shown of a method 200 performed by the system 100 of FIG. 1 according to one embodiment of the present invention.


An audio capture component 106 captures the speech 104 of the healthcare provider 102 (e.g., a physician) during or after a patient encounter (FIG. 2, operation 202). The healthcare provider 102 may, for example, dictate a report of the patient encounter, while the patient encounter is occurring and/or after the patient encounter is completed, in which case the speech 104 may be the speech of the healthcare provider 102 during such dictation. Embodiments of the present invention, however, are not limited to capturing speech that is directed at the audio capture component 106 or otherwise intended for use in creating documentation of the patient encounter. For example, the speech 104 may be natural speech of the healthcare provider 102 during the patient encounter, such as speech of the healthcare provider 102 that is part of a dialogue between the healthcare provider 102 and the patient. Regardless of the nature of the speech 104, the audio capture component 106 captures some or all of the speech 104 and produces, based on the speech 104, an audio output signal 108 representing some or all of the speech 104. The audio capture component 106 may use any of a variety of known techniques to produce the audio output signal 108 based on the speech 104.
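By way of a concrete (and purely illustrative) example, the following Python sketch captures microphone audio and stores it as a WAV file, which could serve as the audio output signal 108. The third-party sounddevice library, the 16 kHz sample rate, and the fixed recording duration are assumptions made for the sketch; the system described here may use any audio capture mechanism.

    # Minimal audio-capture sketch (illustrative assumptions: the
    # third-party "sounddevice" library, 16 kHz mono audio, and a
    # fixed recording duration).
    import wave

    import sounddevice as sd

    SAMPLE_RATE = 16000  # 16 kHz is a common rate for speech recognition
    DURATION_S = 10      # fixed duration, for illustration only

    def capture_speech(path: str) -> None:
        # Record mono 16-bit audio from the default input device.
        frames = sd.rec(int(DURATION_S * SAMPLE_RATE),
                        samplerate=SAMPLE_RATE, channels=1, dtype="int16")
        sd.wait()  # block until the recording is finished
        # Persist the captured audio (the analogue of the audio signal 108).
        with wave.open(path, "wb") as wav:
            wav.setnchannels(1)
            wav.setsampwidth(2)  # 16-bit samples
            wav.setframerate(SAMPLE_RATE)
            wav.writeframes(frames.tobytes())

    capture_speech("encounter_audio.wav")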


Although not shown in FIG. 1, the speech 104 may include not only speech of the healthcare provider 102 but also speech of one or more additional people, such as one or more additional healthcare providers (e.g., nurses) and the patient. For example, the speech 104 may include the speech of both the healthcare provider 102 and the patient as the healthcare provider 102 engages in a dialogue with the patient as part of the patient encounter.


The audio capture component 106 may be or include any of a variety of well-known audio capture components, such as microphones, which may be standalone or integrated within or otherwise connected to another device (such as a smartphone, tablet computer, laptop computer, or desktop computer).


The system 100 also includes an automatic speech recognition (ASR) and natural language understanding (NLU) component 110, which applies automatic speech recognition and natural language understanding to the audio signal 108 to produce a structured note 112, which contains both text 114 representing some or all of the words in the audio signal 108 and concepts 116 extracted from the audio signal 108 and/or the text 114 (FIG. 2, operation 204). The ASR/NLU component 110 may, for example, perform the functions disclosed herein using any of the techniques disclosed in U.S. Pat. No. 7,584,103 B2, entitled, “Automated Extraction of Semantic Content and Generation of a Structured Document from Speech,” and U.S. Pat. No. 7,716,040, entitled, “Verification of Extracted Data,” which are hereby incorporated by reference herein. The ASR/NLU component 110 may encode concepts in the structured note 112 using, for example, the Systematized Nomenclature of Medicine (SNOMED).


The ASR/NLU component 110 may use the structured note 112 to further identify International Classification of Diseases (ICD) diagnoses as well as CPT procedures. The ASR/NLU component 110 may use a set of rules to identify the chief complaint (CC) and one or more E/M concepts in order to classify the subcomponents and components of the structured note 112 that make up the history, exam, and medical decision-making (MDM) sections of the E/M documentation guidelines.


The ASR/NLU component 110 may be implemented in any of a variety of ways, such as in one or more software programs installed and executing on one or more computers. Although the ASR/NLU component 110 is shown as a single component in FIG. 1 for ease of illustration, in practice the ASR/NLU component may be implemented in one or more components, such as components installed and executing on separate computers.


The structured note 112 may take any of a variety of forms, such as any one or more of the following, in any combination: a text document (e.g., word processing document), a structured document (e.g., an XML document), and a database record (e.g., a record in an Electronic Medical Record (EMR) system). Although the structured note 112 is shown as a single element in FIG. 1 for ease of illustration, in practice the structured note 112 may include one or more data structures. For example, the text 114 and the concepts 116 may be stored in distinct data structures. The structured note 112 may include data representing correspondences (e.g., links) between the text 114 and the concepts 116. For example, if the concepts 116 include a concept representing an allergy to penicillin, the structured note 112 may include data pointing to or otherwise representing text within the text 114 which represents an allergy to penicillin (e.g., “Patient has an allergy to penicillin”).
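One possible realization of such a structured note is sketched below in Python. The dataclass layout, the character-offset linking scheme, and the SNOMED-style code shown are illustrative assumptions, not a format prescribed by this description.

    # Illustrative data-structure sketch for the structured note 112.
    # The field names and offset-based links are assumptions.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Concept:
        code: str    # SNOMED-style concept code (illustrative)
        label: str   # human-readable description
        start: int   # character offsets into the note text that...
        end: int     # ...locate the evidence for this concept

    @dataclass
    class StructuredNote:
        text: str = ""                                          # the text 114
        concepts: List[Concept] = field(default_factory=list)   # concepts 116

        def evidence(self, concept: Concept) -> str:
            # Follow the link from a concept back to its supporting text.
            return self.text[concept.start:concept.end]

    note = StructuredNote(text="Patient has an allergy to penicillin.")
    note.concepts.append(
        Concept(code="91936005", label="Allergy to penicillin",
                start=0, end=36))
    print(note.evidence(note.concepts[0]))  # -> the linked span of text 114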


The system 100 also includes an automatic Evaluation and Management (E/M) code generation module 118, which may automatically generate one or more suggested E/M codes 120 based on the structured note 112 and an E/M calculation module 122 (FIG. 2, operation 206). Although the E/M code generation module 118 may generate one or more of the suggested E/M codes 120 after the structured note 112 has been generated in its entirety, this is not a limitation of the present invention. The E/M code generation module 118 may, for example, generate one or more of the suggested E/M codes 120 while the structured note 112 is being generated and before the entire structured note 112 has been generated, e.g., while any of one or more of the following is occurring:

    • while the healthcare provider 102 is speaking (to produce the speech 104) and before the healthcare provider 102 has produced all of the speech 104;
    • while the audio capture component 106 is capturing the speech and before the audio capture component 106 has captured all of the speech 104;
    • while the audio capture component 106 is generating the audio signal 108 and before the audio capture component 106 has generated all of the audio signal 108; and
    • while the ASR/NLU component 110 is processing the audio signal 108 to produce the structured note 112 and before the ASR/NLU component 110 has produced all of the structured note 112.


The E/M code generator 118 may, for example, generate a first one of the suggested E/M codes 120 based on a first portion of the structured note 112 after the ASR/NLU component 110 has generated only that first portion of the structured note 112, after which the ASR/NLU component 110 may generate a second portion of the structured note 112, after which the E/M code generator 118 may generate, based on the first and second portions of the structured note 112, a second one of the suggested E/M codes 120, and so on. As merely a few examples, the E/M code generator 118 may generate a first one of the suggested E/M codes 120 within 100 milliseconds, 200 milliseconds, 500 milliseconds, 1 second, 2 seconds, or 5 seconds of the first portion of the structured note 112 being generated. In this sense, the E/M code generator 118 may generate the suggested E/M codes 120 in real-time or near real-time.
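A minimal sketch of this incremental behavior follows; the suggest_em_code function is a hypothetical stand-in for the E/M code generator 118, and the portions are stand-ins for partial output of the ASR/NLU component 110.

    # Incremental-suggestion sketch. suggest_em_code() is hypothetical:
    # a real generator would apply the E/M calculation module 122 to
    # the accumulated structured note rather than its length.
    import time
    from typing import List

    def suggest_em_code(note_so_far: str) -> str:
        return "99212" if len(note_so_far) < 200 else "99214"

    def stream_suggestions(portions: List[str]) -> None:
        note_so_far = ""
        for portion in portions:  # portions arrive as the ASR/NLU emits them
            note_so_far += portion
            start = time.monotonic()
            code = suggest_em_code(note_so_far)  # re-evaluate on each portion
            elapsed_ms = (time.monotonic() - start) * 1000
            print(f"suggested {code} ({elapsed_ms:.0f} ms after portion)")

    stream_suggestions(["Chief complaint: cough. ",
                        "Detailed history and exam documented. " * 5])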


The E/M code generation module 118 generates the suggested E/M codes 120 “automatically” in the sense that the E/M code generation module 118 generates the suggested E/M codes 120 without receiving input from the healthcare provider 102 or other user, other than by receiving data derived from the speech 104 in the form of the structured note 112. The E/M code generation module 118, in other words, receives the structured note 112 and, in response to receiving the structured note 112, generates the suggested E/M codes 120 automatically based on the structured note 112 and the E/M calculation module 122 without receiving or relying on any user input in the process of generating the suggested E/M codes 120.


The E/M calculation module 122 codifies professional fee guidelines for generating E/M codes based on clinical notes, such as the structured note 112. The E/M calculation module 122 standardizes coding practices and improves efficiency as well as quality for coders and auditors. This includes using real-time NLU (such as the M*Modal Natural Language Processing (NLP) engine, available from M*Modal of Franklin, Tenn.) to identify clinical findings in the structured note 112, in real-time or near real-time, while the structured note 112 is being created. The E/M calculation module 122 may also use as input, for example, information relating to the patient encounter, such as the length of the encounter, the duration of the audio signal 108, and the admitting diagnosis for the patient. The E/M calculation module 122 may also use as input the subcomponents and components of the structured note 112 that were previously identified, as described above.
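As a hedged illustration of how such guidelines might be codified, the sketch below scores the three key components (history, examination, and MDM) and maps them to an office-visit level. The four-point component scales and the "two of three components" rule are a simplified reading of the 1995 guidelines for established patients, not the actual rules of the E/M calculation module 122.

    # Simplified E/M level calculation sketch (established patient,
    # office visit). The scales and the two-of-three rule are a
    # simplified reading of the 1995 guidelines, for illustration only.
    HISTORY = ["problem focused", "expanded problem focused",
               "detailed", "comprehensive"]
    EXAM = HISTORY  # the examination component uses the same four levels
    MDM = ["straightforward", "low", "moderate", "high"]

    def em_level(history: str, exam: str, mdm: str) -> int:
        scores = sorted([HISTORY.index(history),
                         EXAM.index(exam),
                         MDM.index(mdm)], reverse=True)
        # Established patients: two of the three key components must
        # meet or exceed the level, so the second-highest score governs.
        return 2 + scores[1]  # levels 2-5, i.e., codes 99212-99215

    print("suggested code: 9921%d"
          % em_level("detailed", "expanded problem focused", "low"))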


The system 100 provides the suggested E/M codes 120 to the healthcare provider 102 for review (FIG. 2, operation 208). The system 100 may generate output representing the suggested E/M codes 120 in any of a variety of ways, such as by displaying text representing one or more of the suggested E/M codes 120. In response to being prompted by the system 100 to review the suggested E/M codes 120, the healthcare provider 102 provides input 124 indicating whether the healthcare provider 102 approves or disapproves of the suggested E/M codes 120. The system 100 receives the input 124, and may automatically incorporate any E/M codes approved by the healthcare provider 102 into the structured note 112 and not incorporate any E/M codes not approved by the healthcare provider 102 into the structured note 112 (FIG. 2, operation 210).
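Operations 208 and 210 might be sketched as follows; the console prompt is a stand-in for whatever review interface the system 100 actually exposes, and the dictionary-based note is an assumption of the sketch.

    # Review/approval sketch for operations 208 and 210. The console
    # prompt and the dictionary-based note are illustrative stand-ins.
    from typing import Dict, List

    def approve(code: str) -> bool:
        # Hypothetical review step: the provider accepts or rejects.
        return input(f"Approve suggested E/M code {code}? [y/n] ") == "y"

    def incorporate_approved(suggested: List[str], note: Dict) -> None:
        for code in suggested:
            if approve(code):  # the provider's input 124
                note.setdefault("em_codes", []).append(code)
            # Codes that are not approved are simply not incorporated.

    note = {"text": "..."}
    incorporate_approved(["99213"], note)
    print(note)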


The approval process implemented by operations 208 and 210 may be performed in real-time or near real-time. For example, the E/M code generator 118 may provide one or more of the suggested E/M codes to the healthcare provider 102, and the healthcare provider 102 may provide the approval input 124:

    • after the healthcare provider 102 has produced some of the speech 104 and before the healthcare provider 102 has produced all of the speech 104;
    • while the audio capture component 106 is capturing the speech 104 and before the audio capture component 106 has captured all of the speech 104;
    • while the audio capture component 106 is generating the audio signal 108 and before the audio capture component 106 has generated all of the audio signal 108;
    • while the ASR/NLU component 110 is processing the audio signal 108 to produce the structured note 112 and before the ASR/NLU component 110 has produced all of the structured note 112; and
    • after the E/M code generation module 118 has produced some of the E/M codes 120 and before the E/M code generation module 118 has produced all of the E/M codes 120.


As described above, the E/M code generator 118 may automatically generate suggested E/M codes 120 for review and approval by the healthcare provider 102. Additionally, or alternatively, the healthcare provider 102 may input suggested E/M codes 120 to the system 100, in response to which the system 100 may evaluate the suggested E/M codes 120 input by the healthcare provider 102 and provide feedback to the healthcare provider 102 on those suggested E/M codes 120.


In some embodiments, the system 100 automatically generates the suggested E/M codes 120 and the healthcare provider 102 reviews those codes 120 but does not manually enter any suggested E/M codes 120 to the system 100. In other embodiments, the healthcare provider 102 manually enters the suggested E/M codes 120 into the system 100 and the system 100 does not automatically generate any of the suggested E/M codes 120. In yet other embodiments, the system 100 automatically generates some of the suggested E/M codes 120 and the healthcare provider 102 manually enters some of the suggested E/M codes 120.


The healthcare provider 102 may enter some or all of the suggested E/M codes 120 at any of the times described above, such as while the structured note 112 is being generated and before the entire structured note 112 has been generated. For example, the healthcare provider 102 may dictate the structured note 112 and, after describing part of the patient encounter, provide input specifying a suggested E/M code in connection with the part of the patient encounter that was just dictated, and then continue to dictate additional parts of the structured note.


The system 100 may include an E/M code evaluator 126, which may receive the E/M codes 120 input by the healthcare provider 102 and perform any of a variety of functions on those E/M codes 120. For example, the E/M code evaluator 126 may automatically generate, and provide to the healthcare provider 102, feedback 128 on the healthcare provider 102's suggested E/M codes 120, such as recommendations to keep or change the suggested E/M codes 120 to different E/M codes 120, or to document the patient encounter more fully to match the healthcare provider 102's suggested E/M codes 120. The E/M code evaluator 126 may, for example, determine that the structured note 112 justifies assigning a higher level E/M code than one of the healthcare provider 102's suggested E/M codes 120 and, in response to that determination, recommend (via the feedback 128) that the higher level E/M code replace the healthcare provider 102's suggested E/M code. Conversely, the E/M code evaluator 126 may, for example, determine that an E/M level represented by one of the healthcare provider 102's suggested E/M codes 120 is not justified by the structured note 112 and, in response to that determination, recommend (via the feedback 128) that an E/M code having a lower level than the healthcare provider 102's suggested E/M code be used to replace the healthcare provider 102's suggested E/M code. In cases in which the system 100 recommends that the E/M level be increased, the system 100 may provide output to the healthcare provider 102 indicating the type of documentation that is needed to justify the recommended higher E/M level.
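The comparison at the heart of such feedback might look like the following sketch, where justified_code stands for the level the evaluator derives from the structured note 112; the message texts and the extraction of a level from the final digit of an office-visit code are assumptions of the sketch.

    # Feedback sketch for the E/M code evaluator 126. Reading the
    # level from the last digit of an office-visit code (99213 -> 3)
    # is an illustrative assumption.
    def level_of(code: str) -> int:
        return int(code[-1])

    def feedback(suggested_code: str, justified_code: str) -> str:
        s, j = level_of(suggested_code), level_of(justified_code)
        if j > s:
            return (f"Note supports {justified_code}; consider replacing "
                    f"{suggested_code} with the higher-level code.")
        if j < s:
            return (f"Note does not justify {suggested_code}; consider "
                    f"{justified_code}, or document the encounter more fully.")
        return f"{suggested_code} is consistent with the documentation."

    print(feedback("99212", "99214"))  # recommend raising the level
    print(feedback("99215", "99213"))  # recommend lowering, or documenting more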


Even if the E/M code evaluator 126 determines that the suggested E/M codes 120 are not justified by the structured note 112, the E/M code evaluator 126 may not recommend that further action be taken (e.g., that the healthcare provider 102 provide additional documentation to justify the E/M codes 120 or that the E/M codes 120 be sent to a billing coding specialist for review). For example, there are many situations in which more verbose documentation would not impact the current E/M level. If the system 100 determines that one of the suggested E/M codes is not justified by the structured note 112, and also determines that providing additional documentation within the structured note 112 would not increase the level of the suggested E/M code, then the system 100 may automatically decide not to request or suggest documentation changes, because such changes would not impact the E/M level, thereby avoiding wasting the healthcare provider 102's time to add more documentation.
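That decision can be expressed as a simple guard, sketched below with a hypothetical max_attainable_level parameter representing the highest level that any additional documentation could support:

    # Sketch of the guard that avoids wasting the provider's time.
    # max_attainable_level is hypothetical: an estimate of the highest
    # level that additional documentation could possibly support.
    def should_request_more_documentation(suggested_level: int,
                                          justified_level: int,
                                          max_attainable_level: int) -> bool:
        if justified_level >= suggested_level:
            return False  # the note already justifies the suggested code
        # Only interrupt the provider if more documentation could raise
        # the justified level to the suggested one.
        return max_attainable_level >= suggested_level

    print(should_request_more_documentation(4, 3, 3))  # False: no point asking
    print(should_request_more_documentation(4, 3, 4))  # True: worth asking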


As another example, the E/M code evaluator 126 may automatically decide, based on the healthcare provider 102's E/M codes 120, to provide those E/M codes 120 to a billing coding specialist for review, or to recommend that the E/M codes 120 be provided to a billing coding specialist for review. For example, the E/M code evaluator 126 may automatically decide to recommend that the E/M codes 120 be provided to a billing coding specialist for review in response to determining that the structured note 112 does not justify one or more of the suggested E/M codes 120. As yet another example, the healthcare provider 102 may provide input to the system 100 indicating that the E/M code generator 118's automatically generated codes 120 should be provided to a billing coding specialist for review.


While the healthcare provider 102 is dictating the structured note 112, the system 100 may determine whether a particular aspect of the patient encounter has been documented sufficiently for billing purposes. In response to determining that it has, the system 100 may notify the healthcare provider 102, so that the healthcare provider 102 can stop dictating additional information about that aspect of the patient encounter and move on to other aspects of the patient encounter.


As yet another example, the system 100 may provide the healthcare provider 102 with the option to provide input indicating that the suggested E/M codes 120 should be sent to a billing coding specialist for review. In response to receiving such input (e.g., the pressing of a “Send to Coding” button by the healthcare provider 102), the system 100 may send the suggested E/M codes 120 to a billing coding specialist for review (whether the suggested E/M codes 120 were generated automatically by the system 100 or manually by the healthcare provider 102).


Embodiments of the present invention have a variety of advantages. For example, existing systems require E/M codes to be generated on the back-end, i.e., as a post-process after the healthcare provider 102 has finished speaking (i.e., producing the speech 104), and typically after the structured note 112 has been generated. Such an approach can be highly inefficient and prone to error for a variety of reasons. As just one example, if the billing coding specialist who is producing the E/M codes based on the structured note 112 has any questions about the patient encounter, that specialist typically must contact the healthcare provider 102 with questions, often long after the patient encounter has concluded. In this situation, the memory of the healthcare provider 102 may have faded and the billing coding specialist incurs a delay in generating the required E/M codes even if the healthcare provider is able to answer the billing coding specialist's questions.


In contrast, embodiments of the present invention allow the healthcare provider to review and approve of automatically-generated E/M codes 120 on the front end, i.e., at the time the structured note 112 is being created, while the healthcare provider 102's memory of the patient encounter is fresh and while the healthcare provider 102 is working on creating the structured note 112. Embodiments of the present invention do this, in part, by generating the suggested E/M codes 120 and/or the feedback 128 during the patient encounter and while the structured note 112 is being generated. As a result of such real-time code and feedback generation by the system 100, the healthcare provider 102 does not need to become involved in the E/M code generation process again at a later time, and the involvement of a separate billing coding specialist may be minimized or eliminated. Even if the healthcare provider 102 does route the structured note 112 to a billing coding specialist for review and final approval, embodiments of the present invention make it more likely that the billing coding specialist will receive all necessary information from the healthcare provider 102, because of the real-time feedback and guidance that the system 100 provides to the healthcare provider 102 at the time of creation of the structured note 112.


Real-time generation of the codes 120 and feedback 128 is not a feature that is present in prior art systems and is not performed manually by humans in existing systems. Such real-time features are only made possible by the computer-implemented techniques disclosed herein, which perform processing automatically and quickly enough to enable the codes 120 and feedback 128 to be generated in real-time, e.g., while the patient encounter is ongoing and before the creation of the structured note is complete. Embodiments of the present invention, therefore, are inherently rooted in computer technology and do not merely automate manual processes.


Another advantage of embodiments of the present invention is that they use computer technology to generate E/M codes that have higher quality than the E/M codes that are manually generated by the healthcare provider 102, and do so while the patient is still receiving treatment from the healthcare provider 102. Embodiments of the present invention may, for example, alert the healthcare provider 102 when the manually-generated E/M codes 120 are not sufficiently justified by the content of the structured note 112 in progress, thereby giving the healthcare provider 102 the opportunity to obtain additional information necessary to justify the manually-generated E/M codes 120 while the patient is still present and the patient encounter is still ongoing, whereas such an opportunity would be lost if the system 100 did not alert the healthcare provider 102 until after the patient encounter had concluded. The automatic determination by a computer that a human-generated E/M code is not sufficiently justified by the content of the structured note 112 is a step that is not present in the prior art and is not merely an automation of an otherwise manual process.


Although the description herein refers to E/M codes, this is merely an example and not a limitation of the present invention. More generally, embodiments of the present invention may be applied to other kinds of codes, such as hierarchical condition codes, critical care codes, and preventive medicine codes.


The invention may also provide the option to automatically correct documentation with missing information, not by applying a rigid “Exam Template,” but by using NLU to determine an appropriate place to add the missing information and adding it automatically, or by suggesting it and then adding it in response to user approval.


In real-time, while the note is being generated, the invention may tell the provider that some area of the encounter has been sufficiently documented (for the purpose of justifying an E/M code), so that the provider can move on if there is no clinical need to provide further documentation. This saves time for the provider.


The invention can use other data sources to determine the expected E/M level, and thereby provide better critiques if the documentation is lower than needed for the expected E/M level. For example, the invention may use the length of time the provider spent with the patient to identify an expected E/M level, and then ask the provider for more documentation if the current documentation does not support that expected level.
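A sketch of such a time-based check follows; the minute thresholds are illustrative placeholders, not the typical times published in CPT:

    # Time-based expected-level sketch. The thresholds are illustrative
    # placeholders, not the CPT typical times.
    EXPECTED_LEVEL_BY_MINUTES = [(40, 5), (25, 4), (15, 3), (0, 2)]

    def expected_level(encounter_minutes: int) -> int:
        for minutes, level in EXPECTED_LEVEL_BY_MINUTES:
            if encounter_minutes >= minutes:
                return level
        return 2

    def critique(documented_level: int, encounter_minutes: int) -> str:
        exp = expected_level(encounter_minutes)
        if documented_level < exp:
            return (f"Encounter length suggests level {exp}, but the "
                    f"documentation currently supports only level "
                    f"{documented_level}; consider documenting more fully.")
        return "Documentation is consistent with the encounter length."

    print(critique(documented_level=3, encounter_minutes=30))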


It is to be understood that although the invention has been described above in terms of particular embodiments, the foregoing embodiments are provided as illustrative only, and do not limit or define the scope of the invention. Various other embodiments, including but not limited to the following, are also within the scope of the claims. For example, elements and components described herein may be further divided into additional components or joined together to form fewer components for performing the same functions.


Any of the functions disclosed herein may be implemented using means for performing those functions. Such means include, but are not limited to, any of the components disclosed herein, such as the computer-related components described below.


The techniques described above may be implemented, for example, in hardware, one or more computer programs tangibly stored on one or more computer-readable media, firmware, or any combination thereof. The techniques described above may be implemented in one or more computer programs executing on (or executable by) a programmable computer including any combination of any number of the following: a processor, a storage medium readable and/or writable by the processor (including, for example, volatile and non-volatile memory and/or storage elements), an input device, and an output device. Program code may be applied to input entered using the input device to perform the functions described and to generate output using the output device.


Embodiments of the present invention include features which are only possible and/or feasible to implement with the use of one or more computers, computer processors, and/or other elements of a computer system. Such features are either impossible or impractical to implement mentally and/or manually. For example, embodiments of the present invention use computerized automatic speech recognition and natural language understanding components to automatically recognize and understand speech, in order to generate a structured note which contains both text and data representing concepts embodied in that text. Such components are inherently computer-implemented and provide a technical solution to the technical problem of automatically generating documents based on speech.


Any claims herein which affirmatively require a computer, a processor, a memory, or similar computer-related elements, are intended to require such elements, and should not be interpreted as if such elements are not present in or required by such claims. Such claims are not intended, and should not be interpreted, to cover methods and/or systems which lack the recited computer-related elements. For example, any method claim herein which recites that the claimed method is performed by a computer, a processor, a memory, and/or similar computer-related element, is intended to, and should only be interpreted to, encompass methods which are performed by the recited computer-related element(s). Such a method claim should not be interpreted, for example, to encompass a method that is performed mentally or by hand (e.g., using pencil and paper). Similarly, any product claim herein which recites that the claimed product includes a computer, a processor, a memory, and/or similar computer-related element, is intended to, and should only be interpreted to, encompass products which include the recited computer-related element(s). Such a product claim should not be interpreted, for example, to encompass a product that does not include the recited computer-related element(s).


Each computer program within the scope of the claims below may be implemented in any programming language, such as assembly language, machine language, a high-level procedural programming language, or an object-oriented programming language. The programming language may, for example, be a compiled or interpreted programming language.


Each such computer program may be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a computer processor. Method steps of the invention may be performed by one or more computer processors executing a program tangibly embodied on a computer-readable medium to perform functions of the invention by operating on input and generating output. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, the processor receives (reads) instructions and data from a memory (such as a read-only memory and/or a random access memory) and writes (stores) instructions and data to the memory. Storage devices suitable for tangibly embodying computer program instructions and data include, for example, all forms of non-volatile memory, such as semiconductor memory devices, including EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROMs. Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits) or FPGAs (Field-Programmable Gate Arrays). A computer can generally also receive (read) programs and data from, and write (store) programs and data to, a non-transitory computer-readable storage medium such as an internal disk (not shown) or a removable disk. These elements will also be found in a conventional desktop or workstation computer as well as other computers suitable for executing computer programs implementing the methods described herein, which may be used in conjunction with any digital print engine or marking engine, display monitor, or other raster output device capable of producing color or gray scale pixels on paper, film, display screen, or other output medium.


Any data disclosed herein may be implemented, for example, in one or more data structures tangibly stored on a non-transitory computer-readable medium. Embodiments of the invention may store such data in such data structure(s) and read such data from such data structure(s).

Claims
  • 1. A method performed by at least one computer processor executing computer program instructions stored on at least one non-transitory computer-readable medium, the method comprising:
    (1) at an audio capture component:
      (1)(a) capturing speech of a healthcare provider and speech of a patient;
      (1)(b) producing an audio signal representing the speech of the healthcare provider and the speech of the patient;
    (2) at an automatic speech recognition/natural language understanding (ASR/NLU) component, generating, based on a first portion of the audio signal, a first data structure containing: (a) a first text within a document, the first text representing a first speech of the healthcare provider and a first speech of the patient; (b) a first concept; and (c) a first association between the first text and the first concept;
    (3) at a code generator module, after (2) and before (4):
      (3)(a) generating a first code automatically based on the first text and the first concept; and
      (3)(b) incorporating the first code into the document;
    (4) after (2) and (3) and before (5), at the automatic speech recognition/natural language understanding (ASR/NLU) component, generating, based on a second portion of the audio signal, a second data structure containing: (a) a second text within the document, the second text representing a second speech of the healthcare provider and a second speech of the patient; (b) a second concept; and (c) a second association between the second text and the second concept; and
    (5) at the code generator module, after (2), (3), and (4):
      (5)(a) generating a second code automatically based on the second text and the second concept; and
      (5)(b) incorporating the second code into the document.
  • 2. The method of claim 1, wherein the first code comprises an evaluation and management (E/M) code.
  • 3. A non-transitory computer-readable medium comprising computer program instructions executable by at least one computer processor to perform a method, the method comprising:
    (1) at an audio capture component:
      (1)(a) capturing speech of a healthcare provider and speech of a patient;
      (1)(b) producing an audio signal representing the speech of the healthcare provider and the speech of the patient;
    (2) at an automatic speech recognition/natural language understanding (ASR/NLU) component, generating, based on a first portion of the audio signal, a first data structure containing: (a) a first text within a document, the first text representing a first speech of the healthcare provider and a first speech of the patient; (b) a first concept; and (c) a first association between the first text and the first concept;
    (3) at a code generator module, after (2) and before (4):
      (3)(a) generating a first code automatically based on the first text and the first concept; and
      (3)(b) incorporating the first code into the document;
    (4) after (2) and (3) and before (5), at the automatic speech recognition/natural language understanding (ASR/NLU) component, generating, based on a second portion of the audio signal, a second data structure containing: (a) a second text within the document, the second text representing a second speech of the healthcare provider and a second speech of the patient; (b) a second concept; and (c) a second association between the second text and the second concept; and
    (5) at the code generator module, after (2), (3), and (4):
      (5)(a) generating a second code automatically based on the second text and the second concept; and
      (5)(b) incorporating the second code into the document.
  • 4. The non-transitory computer-readable medium of claim 3, wherein the first code comprises an evaluation and management (E/M) code.
US Referenced Citations (165)
Number Name Date Kind
5065315 Garcia Nov 1991 A
5148366 Buchanan Sep 1992 A
5483443 Milstein et al. Jan 1996 A
5664109 Johnson et al. Sep 1997 A
5823948 Ross et al. Oct 1998 A
6006183 Lai et al. Dec 1999 A
6182039 Rigazio Jan 2001 B1
6345249 Ortega Feb 2002 B1
6377922 Brown Apr 2002 B2
6738784 Howes May 2004 B1
6834264 Lucas Dec 2004 B2
7233938 Carus Jun 2007 B2
7236968 Seki Jun 2007 B2
7313515 Crouch Dec 2007 B2
7447988 Ross Nov 2008 B2
7519529 Horvitz Apr 2009 B1
7584103 Fritsch Sep 2009 B2
7613610 Zimmerman Nov 2009 B1
7716040 Koll May 2010 B2
7725330 Rao May 2010 B2
7869998 Fabbrizio et al. Jan 2011 B1
7885811 Zimmerman Feb 2011 B2
8024196 Wodtke Sep 2011 B1
8050938 Green, Jr. Nov 2011 B1
8311854 Stanley Nov 2012 B1
8463673 Koll Jun 2013 B2
8468167 Sathyanarayana Jun 2013 B2
8583439 Kondziela Nov 2013 B1
8706521 Ramarajan Apr 2014 B2
8781829 Koll Jul 2014 B2
8781853 Green, III Jul 2014 B2
8805703 Martin Aug 2014 B2
9082310 Koll Jul 2015 B2
9275643 Koll Mar 2016 B2
9424523 Koll Aug 2016 B2
9679077 Jaganathan et al. Jun 2017 B2
9704099 Koll Jul 2017 B2
10156956 Koll Dec 2018 B2
10325296 Koll Jun 2019 B2
10877620 Koll et al. Dec 2020 B2
10950329 Koll et al. Mar 2021 B2
11043306 El Saadawi Jun 2021 B2
20010042080 Ross Nov 2001 A1
20020029161 Brodersen et al. Mar 2002 A1
20020065854 Pressly May 2002 A1
20020077819 Girardo Jun 2002 A1
20020099717 Bennett Jul 2002 A1
20020120466 Finn Aug 2002 A1
20020198741 Randazzo Dec 2002 A1
20030074248 Braud Apr 2003 A1
20030133156 Cragun Jul 2003 A1
20030154085 Kelley Aug 2003 A1
20030181790 David et al. Sep 2003 A1
20040044546 Moore Mar 2004 A1
20040078215 Dahlin et al. Apr 2004 A1
20040128163 Goodman et al. Jul 2004 A1
20040172297 Rao et al. Sep 2004 A1
20040240720 Brantley Dec 2004 A1
20040243545 Boone et al. Dec 2004 A1
20040254816 Myers Dec 2004 A1
20050071194 Bormann et al. Mar 2005 A1
20050102140 Davne et al. May 2005 A1
20050137910 Rao Jun 2005 A1
20050158767 Haskell Jul 2005 A1
20050171819 Keaton et al. Aug 2005 A1
20050203775 Chesbrough Sep 2005 A1
20050228815 Carus et al. Oct 2005 A1
20050240439 Covit et al. Oct 2005 A1
20050251422 Wolfman Nov 2005 A1
20050273360 Drucker Dec 2005 A1
20060020492 Cousineau Jan 2006 A1
20060020493 Cousineau Jan 2006 A1
20060020886 Agrawal et al. Jan 2006 A1
20060036472 Crockett Feb 2006 A1
20060041428 Fritsch et al. Feb 2006 A1
20060122865 Preiss Jun 2006 A1
20060129435 Smitherman et al. Jun 2006 A1
20060277073 Heilbrunn Dec 2006 A1
20070013968 Ebaugh Jan 2007 A1
20070016450 Bhora Jan 2007 A1
20070016451 Tilson Jan 2007 A1
20070033073 Tajaliawal Feb 2007 A1
20070050187 Cox Mar 2007 A1
20070067185 Halsted Mar 2007 A1
20070088564 March Apr 2007 A1
20070106751 Moore May 2007 A1
20070112599 Liu May 2007 A1
20070118410 Nadai May 2007 A1
20070143141 Villasenor et al. Jun 2007 A1
20070192143 Krishnan et al. Aug 2007 A1
20070198907 Degala Aug 2007 A1
20070203708 Polcyn Aug 2007 A1
20070226211 Heinze et al. Sep 2007 A1
20070276649 Schubert Nov 2007 A1
20070299651 Koll Dec 2007 A1
20070299665 Koll Dec 2007 A1
20080004505 Kapit Jan 2008 A1
20080016164 Chandra Jan 2008 A1
20080028300 Krieger et al. Jan 2008 A1
20080077451 Anthony Mar 2008 A1
20080141117 King et al. Jun 2008 A1
20080249374 Morita Oct 2008 A1
20090048833 Fritsch et al. Feb 2009 A1
20090070290 Nye Mar 2009 A1
20090077658 King et al. Mar 2009 A1
20090187407 Soble Jul 2009 A1
20090192800 Brandt Jul 2009 A1
20090192822 Regulapati et al. Jul 2009 A1
20090228299 Kangarloo et al. Sep 2009 A1
20090271218 Mok Oct 2009 A1
20090287678 Brown Nov 2009 A1
20100063907 Savani Mar 2010 A1
20100094657 Stern et al. Apr 2010 A1
20100099974 Desai Apr 2010 A1
20100100570 Constantin et al. Apr 2010 A1
20100125450 Michaelangelo May 2010 A1
20100138241 Ruark Jun 2010 A1
20100250236 Jagannathan et al. Sep 2010 A1
20100278453 King Nov 2010 A1
20100299135 Fritsch et al. Nov 2010 A1
20100299320 Claud et al. Nov 2010 A1
20110043652 King et al. Feb 2011 A1
20110055688 Isidore Mar 2011 A1
20110239146 Dutta et al. Sep 2011 A1
20110295864 Betz et al. Dec 2011 A1
20110301978 Shiu Dec 2011 A1
20120010900 Kaniadakis Jan 2012 A1
20120016690 Ramarajan et al. Jan 2012 A1
20120041950 Koll Feb 2012 A1
20120065987 Farooq et al. Mar 2012 A1
20120166220 Baldwin Jun 2012 A1
20120185275 Loghmani Jul 2012 A1
20120215551 Flanagan Aug 2012 A1
20120215782 Jagannathan Aug 2012 A1
20120239429 Corfield Sep 2012 A1
20120323598 Koll Dec 2012 A1
20130144651 Rao Jun 2013 A1
20130159408 Winn Jun 2013 A1
20130226617 Mok Aug 2013 A1
20130238330 Casella dos Santos Sep 2013 A1
20140006431 Jagannathan Jan 2014 A1
20140019128 Riskin et al. Jan 2014 A1
20140047375 Koll Feb 2014 A1
20140108047 Kinney Apr 2014 A1
20140164197 Koll Jun 2014 A1
20140278553 Fritsch Sep 2014 A1
20140288970 Lee et al. Sep 2014 A1
20140324423 Koll Oct 2014 A1
20140343963 Fritsch Nov 2014 A1
20150088504 Jagannathan Mar 2015 A1
20150134349 Vdovjak et al. May 2015 A1
20150278449 Laborde Oct 2015 A1
20150371145 Koll Dec 2015 A1
20160147955 Shah May 2016 A1
20160179770 Koll Jun 2016 A1
20160267232 Koll Sep 2016 A1
20160294964 Brune Oct 2016 A1
20160335554 Koll Nov 2016 A1
20170270626 Koll Sep 2017 A1
20180040087 Koll Feb 2018 A1
20180081859 Snider Mar 2018 A1
20180101879 Koll Apr 2018 A1
20180276188 Koll Sep 2018 A1
20190026436 Bender Jan 2019 A1
20190065008 Koll Feb 2019 A1
Foreign Referenced Citations (29)
Number Date Country
1361522 Nov 2003 EP
2030198 Mar 2009 EP
2030196 Sep 2018 EP
3571608 Nov 2019 EP
H09106428 Apr 1997 JP
2005267358 Sep 2005 JP
2006509295 Mar 2006 JP
4037250 Jan 2008 JP
2008108021 May 2008 JP
2009211157 Sep 2009 JP
2011118538 Jun 2011 JP
5986930 Aug 2016 JP
6023593 Oct 2016 JP
6078057 Jan 2017 JP
2005122002 Dec 2005 WO
2009143395 Nov 2009 WO
2011100474 Aug 2011 WO
2011106776 Sep 2011 WO
2011100474 Jan 2012 WO
2012048306 Apr 2012 WO
2012177611 Dec 2012 WO
2012177611 Dec 2012 WO
2014120501 Aug 2014 WO
2015079354 Jun 2015 WO
2015136404 Sep 2015 WO
2015187604 Dec 2015 WO
2016149003 Sep 2016 WO
2018136417 Jul 2018 WO
2019103930 May 2019 WO
Non-Patent Literature Citations (59)
Entry
Adam E.J et al., “ESR guidelines for the communication of urgent and unexpected findings” European Society of Radiology (ESR), 2011, vol. 3, Issue (1), pp. 1-3.
Anonymous: “Medical transcription—Wikipedia”, Feb. 13, 2010, XP055465109, Retrieved from the Internet: URL: https://en.wikipedia.org/w/index.php?title=Medical_transcription&oldid=343657066 [Retrieved on Apr. 6, 2018].
Sarma, Arup et al., “Context-based Speech Recognition Error Detection and Correction,” Proceedings of HLT-NAACL 2004: Short Papers, May 2004, 4 pages.
Communication Pursuant to Article 94(3) EPC, mailed Jun. 27, 2019, in EPO application No. 14762803.6, 11 pages.
Daugherty B et al., “Tracking Incidental Findings”, Radiology Today, Jul. 2014, vol. 15, No. 7, p. 6.
Dimick, Chris, “Quality Check: An Overview of Quality Measures and Their Uses,” Journal of AHIMA 81, No. 9 (Sep. 2010); 34-38, Retrieved on May 14, 2013 from http://library.ahima.org/xpedio/groups/public/documents/ahima/bok1_047952.hcsp?DocName=bok1_047952.
Examination Report received in Canadian patent application No. 2,791,292 dated Aug. 2, 2019, 7 pages.
Examiner's Report dated Jun. 27, 2019 in Canadian Patent Application No. 2,881,564, 6 pages.
First Examination Report in Indian patent application No. 2186/MUMNP/2012 dated Jul. 19, 2019, 7 pages.
First Examination Report dated Aug. 29, 2019 in Indian patent application No. 448/MUMNP/2013, 6 pages.
International Preliminary Report on Patentability, dated Aug. 1, 2019 in International Patent Application No. PCT/US2018/013868, 9 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2018/013868, dated Jun. 18, 2018, 14 pages.
Non-Final Rejection dated Jul. 22, 2019 in U.S. Appl. No. 14/941,445 of Detlef Koll, filed Nov. 13, 2015, 29 pages.
Non-Final Rejection dated Sep. 16, 2019 for U.S. Appl. No. 15/993,958 of Detlef Koll, filed May 31, 2018, 65 pages.
Notice from the European Patent Office dated Oct. 1, 2007 concerning business methods, Journal of the European Patent Office, XP007905525, vol. 30, No. 11, pp. 592-593 (dated Oct. 1, 2007).
OpenVPMS, Follow-up tasks, Submitted by Matt C on Fri, Sep. 17, 2010, Available at: https://openvpms.org/project/followup-task-lists-enhancements.
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC mailed Jun. 12, 2019 in European Patent Application No. 11748231.5, 11 pages.
Yildiz M.Y. et al., “A text processing pipeline to extract recommendations from radiology reports”, Journal of Biomedical Informatics, 2013, vol. 46, pp. 354-362.
Examiner's Report dated Nov. 29, 2019 in Canadian patent application No. 2,839,266, 4 pages.
First Examination Report dated Dec. 24, 2019, in Indian patent application No. 336/DELNP/2014, 7 pages.
Non-Final Rejection dated Jan. 9, 2020 for U.S. Appl. No. 15/788,522 of Detlef Koll, filed Oct. 19, 2017, 55 pages.
International Search Report and Written Opinion received for PCT Patent Application No. PCT/US2018/061517, dated Mar. 7, 2019, 10 pages.
Decision to Refuse European Application dated Oct. 24, 2019 in European Patent Application No. 11748231.5, 19 pages.
Final Rejection dated Nov. 5, 2019 for U.S. Appl. No. 14/941,445 of Detlef Koll, filed Nov. 13, 2015, 19 pages.
Final Rejection dated Nov. 27, 2019 for U.S. Appl. No. 15/872,532 of Gilan El Saadawi, filed Jan. 16, 2018, 33 pages.
First Examination Report dated Oct. 7, 2019 in Indian patent application No. 7449/DELNP/2012, 9 pages.
“xPatterns Clinical Auto-Coding,” Atigeo, Whitepaper, pp. 8 (2014).
Centers for Medicare & Medicaid Services, “Medicare Physician Guide: 1995 Documentation Guidelines for Evaluation and Management Services,” 1995, 16 pages [online: https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/95Docguidelines.pdf].
Centers for Medicare & Medicaid Services, “Medicare Physician Guide: 1997 Documentation Guidelines for Evaluation and Management Services,” 1997, 49 pages [online: https://www.cms.gov/Outreach-and-Education/Medicare-Learning-Network-MLN/MLNEdWebGuide/Downloads/97Docguidelines.pdf].
Examiner's Report dated Dec. 20, 2018 in Canadian patent application No. 2,839,266, 4 pages.
Examination Report dated Feb. 4, 2020, in Canadian patent application No. 2,904,656, 5 pages.
Final Rejection dated Mar. 16, 2020 for U.S. Appl. No. 15/993,958 of Detlef Koll, filed May 31, 2018, 33 pages.
Non-Final Rejection dated Mar. 19, 2020 for U.S. Appl. No. 16/174,503 of Detlef Koll, filed Oct. 30, 2018, 43 pages.
Non-Final Rejection dated Mar. 6, 2020 in U.S. Appl. No. 14/941,445 of Detlef Koll, filed Nov. 13, 2015, 16 pages.
Summons to Attend Oral Proceedings Pursuant to Rule 115(1) EPC mailed Mar. 24, 2020, by the European Patent Office in patent application No. 13809956.9, 10 pages.
Examination Report dated Jun. 18, 2019, in Canadian patent application No. 2,811,942, 6 pages.
Examiner's Report dated Apr. 15, 2019 in Canadian patent application No. 2,875,584, 4 pages.
Non-Final Rejection dated Jun. 19, 2019 in U.S. Appl. No. 15/872,532 of Gilan El Saadawi, filed Jan. 16, 2018, 54 pages.
Notice of Allowance dated Sep. 11, 2020 for U.S. Appl. No. 16/174,503 of Detlef Koll, filed Oct. 30, 2018, 33 pages.
Examination Report received in Canadian patent application No. 2,791,292 dated Mar. 9, 2020, 3 pages.
Non-Final Rejection dated Apr. 3, 2020 for U.S. Appl. No. 15/872,532 of Gilan El Saadawi, filed Jan. 16, 2018, 22 pages.
Non-Final Rejection dated Apr. 6, 2020 for U.S. Appl. No. 15/616,884 of Detlef Koll, filed Jun. 7, 2017, 7 pages.
Non-Final Rejection dated Jun. 10, 2020 for U.S. Appl. No. 15/993,958 of Detlef Koll, filed May 31, 2018, 14 pages.
Final Rejection dated Jul. 14, 2020 for U.S. Appl. No. 15/788,522 of Detlef Koll, filed Oct. 19, 2017, 33 pages.
Final Rejection dated Jul. 21, 2020 in U.S. Appl. No. 15/872,532 of Gilan El Saadawi, filed Jan. 16, 2018, 38 pages.
Second Examiner's Report dated May 20, 2020, in Canadian patent application No. 2,875,584, 5 pages.
Extended European Search Report dated Sep. 25, 2020, in European patent application No. 18741689.6, 14 pages.
Final Rejection dated Jan. 21, 2021 for U.S. Appl. No. 15/993,958 of Detlef Koll, filed May 31, 2018, 21 pages.
Final Rejection dated Oct. 29, 2020 for U.S. Appl. No. 15/616,884 of Detlef Koll, filed Jun. 7, 2017, 9 pages.
Hassanpour Saeed et al., “Information extraction from multi-institutional radiology reports,” Artificial Intelligence in Medicine, Elsevier NL, vol. 66, Oct. 3, 2015, pp. 29-39.
Meliha Yetisgen-Yildiz et al., “A text processing pipeline to extract recommendations from radiology reports,” Journal of Biomedical Informatics, vol. 46, No. 2, Jan. 24, 2013, pp. 354-362.
Notice of Allowance dated Nov. 3, 2020 for U.S. Appl. No. 14/941,445 of Detlef Koll, filed Nov. 13, 2015, 42 pages.
Sayon Dutta et al., “Automated Detection Using Natural Language Processing of Radiologists Recommendations for Additional Imaging of Incidental Findings,” Annals of Emergency Medicine, vol. 62, No. 2, Aug. 1, 2013, pp. 162-169.
Third Examiner's Report dated Apr. 22, 2021, in Canadian patent application No. 2,875,584, 4 pages.
Decision of Refusal dated Apr. 19, 2021, by the European Patent Office in patent application No. 13809956.9, 12 pages.
Notice of Allowance dated Sep. 30, 2021 for U.S. Appl. No. 15/616,884 of Detlef Koll, filed Jun. 7, 2017, 25 pages.
Notice of Allowance dated Mar. 9, 2021 for U.S. Appl. No. 15/872,532 of Gilan El Saadawi, filed Jan. 16, 2018, 26 pages.
Non-Final Rejection dated Apr. 1, 2021 for U.S. Appl. No. 15/616,884 of Detlef Koll, filed Jun. 7, 2017, 99 pages.
Extended European Search Report dated Jul. 15, 2021, in European patent application No. 18880181.5, 9 pages.
Related Publications (1)
Number Date Country
20190156926 A1 May 2019 US
Provisional Applications (1)
Number Date Country
62590114 Nov 2017 US