Embodiments of the present disclosure relate to improving the healthcare experience of patients, and more specifically, to improving the experience between a patient and a healthcare provider using augmented or virtual reality to supplement a medical communication interaction. Patients facing surgery or other medical procedures may lack the ability to express their thoughts or ask questions during important interactions with their healthcare provider (e.g., during a pre-operative check-up). In many cases, fear, anxiety, embarrassment, language barriers, and/or cultural differences may contribute to the patient's difficulties in expressing their thoughts or questions during these interactions. The healthcare provider may explain to the patient information regarding their condition and treatment plan/options, but may be unaware that the patient is understanding very little due to any of the above factors.
According to embodiments of the present disclosure, systems, methods, and computer program products for improving the experience between patients and healthcare providers are provided. In one or more embodiments of the invention, a method includes measuring a first physiological property of a patient during a communication interaction with a healthcare provider, where the first physiological property is selected from the group consisting of: natural language communicated by the patient, eye movement, breathing rate, and body position. The method further includes determining whether the first physiological property is indicative of a negative interaction between the healthcare provider and the patient. When the first physiological property meets a threshold value indicative of a negative interaction, the method includes providing an indication to the healthcare provider that the communication interaction is negative. The method further includes providing an artificial reality display to supplement the communication interaction. The artificial reality display includes an augmented reality (AR) system or a virtual reality (VR) system and demonstrates a medical procedure to the patient.
In one or more embodiments of the invention, a system includes a computing node having a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor of the computing node to cause the processor to perform a method including measuring a first physiological property of a patient during a communication interaction with a healthcare provider, where the first physiological property includes natural language communicated by the patient, eye movement, breathing rate, and body position. The method further includes determining whether the first physiological property is indicative of a negative interaction between the healthcare provider and the patient. When the first physiological property meets a threshold value indicative of a negative interaction, the method includes providing an indication to the healthcare provider that the communication interaction is negative. The method further includes generating, for display on an artificial reality display, a medical procedure to the patient, the artificial reality display supplementing the communication interaction and comprising an augmented reality (AR) system or a virtual reality (VR) system.
In one or more embodiments of the invention, a computer program product is provided for improving the experience between a patient and a healthcare provider. The computer program product includes a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause the processor to perform a method including measuring a first physiological property of a patient during a communication interaction with a healthcare provider, where the first physiological property includes natural language communicated by the patient, eye movement, breathing rate, and body position. The method further includes determining whether the first physiological property is indicative of a negative interaction between the healthcare provider and the patient. When the first physiological property meets a threshold value indicative of a negative interaction, the method includes providing an indication to the healthcare provider that the communication interaction is negative. The method further includes generating, for display on an artificial reality display, a medical procedure to the patient, the artificial reality display supplementing the communication interaction and comprising an augmented reality (AR) system or a virtual reality (VR) system.
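The measure-compare-notify flow recited in the embodiments above can be sketched in code as follows. This is a minimal illustration only: the property names, threshold values, and function names are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """A single physiological/behavioral reading taken during the interaction."""
    property_name: str  # e.g., "breathing_rate" (assumed naming)
    value: float

# Assumed per-property thresholds at or above which the interaction is flagged
# as negative; real values would be clinically determined.
NEGATIVE_THRESHOLDS = {
    "breathing_rate": 20.0,  # breaths per minute
    "eye_movement": 0.8,     # normalized saccade-rate score
}

def is_negative_interaction(m: Measurement) -> bool:
    """Return True when the measured property meets its negative-interaction threshold."""
    threshold = NEGATIVE_THRESHOLDS.get(m.property_name)
    return threshold is not None and m.value >= threshold

def notify_provider(m: Measurement) -> str:
    """Produce the provider-facing indication described in the method."""
    if is_negative_interaction(m):
        return f"ALERT: negative interaction signal ({m.property_name}={m.value})"
    return "OK"
```

In this sketch, a property that is not in the threshold table is simply never flagged; a fuller implementation would combine several properties before notifying the provider.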
In the clinical practice of medicine, healthcare providers (e.g., doctors, nurses, physical therapists, or other medical professionals) traditionally conduct a series of consultations and/or examinations for patients who are planning to have a surgical procedure performed. During these consultations and/or examinations, the healthcare provider(s) may explain to the patient information regarding the patient's condition, pre-operative preparation steps, the surgical procedure, and post-operative expectations. In many cases, because of emotional factors such as fear, anxiety, anger, and/or embarrassment and other external factors such as language barriers and/or cultural differences, patients may have difficulty expressing their thoughts or questions during these interactions with their healthcare provider(s). The healthcare provider(s) may explain to the patient information regarding their condition and treatment plan/options, but may be unaware that the patient is understanding very little due to any of the above factors. This may not be the result of poor explanation on the healthcare provider's part, but rather of the mental state of the patient affecting their ability to process information regarding their treatment.
The systems, methods, and computer program products of the present invention allow healthcare providers to gain insight into whether or not their patients are fully understanding in-person communications/interactions regarding their healthcare. In particular, the systems, methods, and computer program products of the present invention may constantly monitor in-person communication interactions between the healthcare provider and the patient, through an analysis of a multitude of behavioral and physiological factors, to provide a discreet notification to the healthcare provider if it is determined that the patient is exhibiting signs of confusion or a lack of understanding. Non-limiting examples of a healthcare communication interaction between a healthcare provider and a patient may include a pre-operative check-up or evaluation or an annual physical wellness exam. During this communication interaction, the healthcare provider may explain to the patient information such as, for example, a medical diagnostic test, results of a medical diagnostic test, a surgical procedure, pre-operative preparation instructions, the post-operative recovery process, or other healthcare-related information. These types of discussions can be stressful for patients, and this stress may make it difficult for the patient to adequately express their thoughts or questions to the healthcare provider.
The system of the present invention includes at least one recording device to record the communication interaction between the healthcare provider and the patient. The recording device may include a camera, such as, for example, a digital camera, a stereoscopic camera, a plenoptic camera, a thermal camera, a fish-eye camera or any other suitable recording device as is known in the art. The recording device may also include one or more microphones, either separately or integrated into the device. The camera(s) may be used to record images and/or video of the interaction while the microphone may be used to record audio data associated with the interaction between the health care provider and the patient. Audio data may include, for example, natural language from either or both of the healthcare provider and the patient.
The system of the present invention may process the recorded images, video, and/or audio data using computer vision techniques, audio processing techniques, and/or natural language processing techniques as is known in the art. From the recorded images, video, and/or audio data, the system may measure one or more physiological or behavioral factors/properties of one or both of the patient and the healthcare provider during the interaction. For example, the system may measure the natural language of the patient or the healthcare provider (e.g., intonation, sentiment, complexity of vocabulary, and/or length of time each individual is talking). As another example, the system may measure physiological factors/properties such as eye movement or blinking rate, breathing rate, and/or body position. For example, body position may include the posture or limb motion of the patient and/or the healthcare provider. Behavioral factors/properties may also be measured. For example, the system may measure pathing of the patient, such as if the patient is pacing in a room, where excessive pacing may indicate that the patient is stressed or anxious.
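As one illustration of the natural-language factors above, the share of talk time per participant can be computed from diarized speech segments. The tuple-based data model here is an assumption made for illustration only, not the disclosure's actual pipeline.

```python
def talk_time_shares(segments):
    """Given diarized segments as (speaker, start_sec, end_sec) tuples,
    return each speaker's fraction of total talk time.

    A provider dominating the conversation (or a patient who barely speaks)
    is one of the natural-language factors the system may measure.
    """
    totals = {}
    for speaker, start, end in segments:
        totals[speaker] = totals.get(speaker, 0.0) + (end - start)
    grand = sum(totals.values()) or 1.0  # avoid division by zero on empty input
    return {speaker: t / grand for speaker, t in totals.items()}
```

For example, a provider speaking 60 seconds against a patient's 20 seconds yields shares of 0.75 and 0.25, which downstream logic could flag as an imbalanced exchange.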
The system of the present invention may determine whether one or more of the above-listed physiological or behavioral factors/properties is indicative of a negative interaction. Generally, a negative interaction is one in which the patient is not able to fully express themselves to their healthcare provider or understand their communications with their healthcare provider due to certain circumstances (e.g., emotional or communication-related). A negative interaction may occur, for example, when a patient is stressed or anxious about an upcoming surgical procedure and is not fully understanding a discussion with their healthcare provider about the surgical procedure. In various embodiments, certain behaviors or physiological properties may be associated with negative interactions, such as, for example, lack of eye contact, too much eye contact, too much tactile interaction, too little tactile interaction, hyperventilation, a body position indicative of anxiety, hiding a body part, not responding when engaged by the healthcare provider, and/or responding to a different topic.
In various embodiments, the system of the present invention may determine, using audio processing of recorded audio data (and/or computer vision processing of the motion of the patient's mouth), that the patient is not asking enough questions and thus determine that the interaction is negative. In various embodiments, the system of the present invention may determine that the patient is asking an excessive number of questions out of anxiety and thus determine that the interaction is negative. In various embodiments, the system of the present invention may determine through computer vision that the patient's posture is indicative of fear or anxiousness and thus determine that the interaction is negative.
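The individual determinations above could feed a simple rule-based scorer that combines signals into one negative-interaction decision. The signal names, weights, and threshold below are illustrative assumptions, not values from the disclosure.

```python
# Assumed weights for behavioral signals associated with a negative interaction.
SIGNAL_WEIGHTS = {
    "low_question_rate": 0.4,       # patient not asking enough questions
    "anxious_question_burst": 0.3,  # excessive questions driven by anxiety
    "fearful_posture": 0.5,         # posture indicative of fear/anxiousness
    "hyperventilation": 0.6,
}

def interaction_score(active_signals):
    """Sum the weights of currently detected signals (unknown signals score 0)."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in active_signals)

def is_negative(active_signals, threshold=0.5):
    """Flag the interaction as negative once the combined score meets the threshold."""
    return interaction_score(active_signals) >= threshold
```

A production system would more likely use a trained classifier over these features; the additive rule here just makes the combination logic concrete.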
Once the system determines that the interaction is negative, the system may provide a discreet indication to the healthcare provider that the communication interaction is negative. For example, the healthcare provider may have an electronic accessory, such as smart glasses, a smart watch, or a mobile phone/device to which the system can “push” a notification of a negative communication interaction. In various embodiments, the notification may be a binary indication such as a light or a symbol (e.g., an emoji, an up/down arrow, a thumbs up/down). In various embodiments, the indicator may be text (e.g., a text message), a numerical scale (e.g., 1 to 10), a sliding scale, or a color bar scale (e.g., green/yellow/red).
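The notification styles described (binary symbol, 1-to-10 numerical scale, green/yellow/red color bar) could all be derived from a single 0-to-1 interaction score, as in this sketch. The cut-off values are assumptions chosen for illustration.

```python
def notification(score: float) -> dict:
    """Map a 0-to-1 negative-interaction score onto the notification styles
    described in the disclosure. Cut-offs are illustrative assumptions."""
    return {
        # Binary indication, e.g. a thumbs up/down symbol on smart glasses.
        "binary": "thumbs_down" if score >= 0.5 else "thumbs_up",
        # Numerical scale clamped to 1..10.
        "scale_1_to_10": max(1, min(10, round(score * 10))),
        # Color bar scale for a smart watch or mobile device.
        "color": "red" if score >= 0.7 else "yellow" if score >= 0.4 else "green",
    }
```

All three representations are computed together so the provider's accessory can render whichever style it supports.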
In various embodiments, the system of the present invention may determine the quality of interactions between several individuals at the same time (e.g., family members of the patient). In this case, the system may provide one or more notifications to the healthcare provider that members other than the patient are displaying behavioral or physiological patterns indicative of a negative interaction.
Once a negative communication interaction is determined between the healthcare provider and the patient, the system of the present invention may direct the healthcare provider to supplement the interaction with an artificial reality device, such as an augmented reality (AR) system or a virtual reality (VR) system. In various embodiments, the system may determine that the AR system is to be used. In various embodiments, the system may determine that the VR system is to be used. The AR system and/or the VR system may be used to demonstrate a surgical procedure or other medical procedure to the patient to facilitate a better understanding of the procedure. The system may retrieve the patient's medical record from, for example, an electronic health/medical record (EHR/EMR) database as is known in the art, including one or more medical images (e.g., x-ray, CT, MRI, and/or ultrasound) corresponding to an anatomical structure of the patient (e.g., a bone, an organ, a muscle, a tendon, a ligament, and/or another anatomical structure).
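Retrieving the relevant imaging from the EHR/EMR record might look like the following sketch. The record schema, field names, and modality list are assumptions for illustration; real systems would query a standards-based store (e.g., DICOM/FHIR).

```python
# Assumed imaging modalities of interest, matching those named in the disclosure.
MODALITIES_OF_INTEREST = {"x-ray", "CT", "MRI", "ultrasound"}

def images_for_anatomy(record: dict, anatomy: str) -> list:
    """Return imaging studies from a patient record that depict the requested
    anatomical structure. The dict schema here is hypothetical."""
    return [
        study
        for study in record.get("imaging_studies", [])
        if study.get("anatomy") == anatomy
        and study.get("modality") in MODALITIES_OF_INTEREST
    ]
```

The filtered studies would then be handed to the 3D reconstruction step described below.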
The system may generate, from the one or more medical images, a three-dimensional simulation of the patient's anatomy in the artificial reality device (e.g., the AR system and/or the VR system). Once the three-dimensional simulation of the patient's anatomy is generated, the system may generate a three-dimensional (3D) simulation of a surgical/medical procedure (e.g., hip/knee replacement, cardiac bypass surgery, stent placement, tumor removal, spinal fusion, biopsy procedure, or diagnostic procedure) in the artificial reality device (e.g., the AR system and/or the VR system) to be viewed by the patient with the healthcare provider. During or after the 3D simulation, the healthcare provider may guide the patient through a more detailed explanation of the surgical/medical procedure or treatment. In various embodiments, the three-dimensional simulation may include a simulation of the effects of a medical treatment plan, such as a simulation of the effect of a particular drug treatment on the patient (e.g., a simulation of the effect of statin drugs on cholesterol buildup or of an anti-inflammatory drug on inflamed tissue). The system may generate one or more labels for relevant anatomical features in the three-dimensional simulation of the patient's anatomy and display the label(s) to the patient in the artificial reality device. The system may provide an interactive dialogue explaining the simulation of the surgical/medical procedure to the patient. For example, when the patient touches a particular feature of the 3D simulation, the artificial reality device provides a tailored description of the anatomical feature, condition, or surgical step to the patient. The artificial reality device may take into account demographics of the patient and provide an explanation in language that is comprehensible to the patient. During the interactive dialogue, the system may monitor for any questions asked by the patient.
Upon determining that the patient has asked a question, the system may automatically pause the display of the three-dimensional simulation so that the healthcare provider may address any questions or issues that the patient has.
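The touch-to-describe dialogue above could be backed by a simple lookup keyed by anatomical feature and explanation style. The feature names and description text in this sketch are invented for illustration; a real system would draw them from curated clinical content tailored to the patient's demographics.

```python
# Hypothetical content store: per-feature descriptions at different reading levels.
DESCRIPTIONS = {
    "median_nerve": {
        "plain": "The nerve in your wrist that gives feeling to your fingers.",
        "clinical": "The median nerve, which traverses the carpal tunnel.",
    },
}

def describe_feature(feature: str, style: str = "plain") -> str:
    """Return a description of a touched anatomical feature, tailored to the
    requested style, falling back to plain language when needed."""
    entry = DESCRIPTIONS.get(feature)
    if entry is None:
        return "No description available."
    return entry.get(style, entry["plain"])
```

Selecting `style` based on patient demographics is one way the device could keep its explanations comprehensible, as the disclosure suggests.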
If the simulation of the surgical procedure is viewed in the VR system, after the simulation is complete, the system may transition to the AR system. The system may continue to use the AR system during the remainder of the healthcare communication interaction between the healthcare provider and the patient.
The system may take into account demographics of the patient and/or healthcare provider. For example, the system may use age, income, geographic location, race, ethnicity, or other suitable demographic information to aid in determining whether the patient is involved in the healthcare communication interaction.
In various embodiments, the system of the present invention may generate a simulation of a surgical/medical procedure using the 3D simulation of the patient's anatomy. In various embodiments, the system may transition from an AR system to a VR system for the surgical/medical procedure simulation. For example, the system may generate a simulation of a carpal tunnel surgery of the wrist.
During the 3D simulation of the surgical/medical procedure, if the patient asks a question, the system may detect that a question has been asked and automatically pause the simulation until the patient's question has been answered. The system may use data gathered from the recorded images, video, and/or audio data to determine when a question has been asked. For example, the system may determine that the patient's voice has a rising inflection, which may be indicative of a question, and pause the surgical/medical simulation. The system may resume playback of the surgical/medical procedure simulation when, for example, a voice command is provided by the healthcare provider indicating that the question has been answered. The healthcare provider may also supplement the surgical/medical procedure simulation with additional details, as needed.
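The pause-on-question and resume-on-command behavior described above can be modeled as a small state machine. The crude rising-pitch check and the literal command string here are assumptions for illustration; a real system would use proper prosody analysis and speech recognition.

```python
class SimulationPlayer:
    """Minimal state machine for pausing a procedure simulation when the
    patient appears to ask a question and resuming on provider command."""

    def __init__(self):
        self.paused = False

    def on_patient_utterance(self, pitch_contour):
        """pitch_contour: sequence of pitch samples (Hz) over the utterance.
        A rising inflection at the end may indicate a question (crude check)."""
        if len(pitch_contour) >= 2 and pitch_contour[-1] > pitch_contour[0]:
            self.paused = True

    def on_provider_command(self, command: str):
        """Resume when the provider indicates the question has been answered."""
        if command == "question answered":
            self.paused = False
```

In use, the player pauses after a rising-pitch utterance and resumes only on the provider's voice command, matching the flow described above.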
In various embodiments, a method of performing a medical procedure of the present invention may include measuring physiological properties of a patient during a communication period with a physician, the physiological properties including at least one of: natural language communicated by the patient, eye movement, breathing rate, and body position. The method may further include determining whether the patient's physiological properties meet a threshold value. The method may further include providing an artificial reality display, the artificial reality display demonstrating a medical procedure to the patient when the patient's physiological properties are below the threshold value. The artificial reality display may include a virtual reality or augmented reality perspective of a patient's anatomical feature, including annotations of the anatomical features depicted. Demonstration of the medical procedure may pause upon the patient asking a question. The physician may be provided at least one indicator representing the patient's understanding of the medical procedure.
In computing node 510 there is a computer system/server 512, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 512 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.
Computer system/server 512 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer system/server 512 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
Bus 518 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Computer system/server 512 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 512, and it includes both volatile and non-volatile media, removable and non-removable media.
System memory 528 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 530 and/or cache memory 532. Computer system/server 512 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 534 can be provided for reading from and writing to non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media can be provided. In such instances, each can be connected to bus 518 by one or more data media interfaces. As will be further depicted and described below, memory 528 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
Program/utility 540, having a set (at least one) of program modules 542, may be stored in memory 528 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment. Program modules 542 generally carry out the functions and/or methodologies of embodiments of the invention as described herein.
Computer system/server 512 may also communicate with one or more external devices 514 such as a keyboard, a pointing device, a display 524, etc.; one or more devices that enable a user to interact with computer system/server 512; and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 512 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 522. Still yet, computer system/server 512 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 520. As depicted, network adapter 520 communicates with the other components of computer system/server 512 via bus 518. It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer system/server 512. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems, etc.
The present disclosure may be embodied as a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
In various embodiments, off-the-shelf VR systems are optionally used with additional external compatible sensors to track various elements in multiple fields including, e.g., motion tracking, cognitive challenges, speech recognition, stability, facial expression recognition, and biofeedback.
Speech recognition can include, but is not limited to, fluent speech, ability to imitate, and pronunciation.
The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.