A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the United States Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
The present invention relates generally to a training system employing computer simulation and immersive virtual reality for instructing and evaluating the progress of a person performing a skill-oriented task and, more particularly, to a simulator for instructing and evaluating performance of a skill-oriented task such as, for example, providing direct health care and/or assisting in providing care for a patient's needs, including perineal care, in a healthcare facility such as, for example, a residential facility, healthcare office, hospital or trauma center or facility, as well as on a scene of an event such as, for example, a motor vehicle accident, natural or manmade disaster, concert or other entertainment performance, and the like, or during transportation therefrom to the healthcare facility, where care is provided under non-critical and/or critical conditions.
Generally speaking, training is needed for a person to acquire and/or maintain skills necessary for performing a skill-oriented task such as, for example, providing care to and/or assisting patients in a healthcare facility in addressing their direct health care needs. Health care needs include providing and/or assisting with perineal care for patients unable or unwilling to properly clean private areas such as, for example, the genitals of both male and female patients, which can be particularly prone to infection.
In healthcare facilities, patient care and safety are mission-critical tasks. Many medical practitioners (e.g., doctors and nurses) undergo years of educational and practical (e.g., “on-the-job”) training to acquire, refine, and/or maintain skills needed in the healthcare industry. The training necessary for these advanced medical practitioners to acquire and/or maintain their skills applies also to other medical professionals such as, for example, emergency medical technicians (EMTs), licensed practical nurses (LPNs), and certified nursing assistants, also referred to as nurse's aides or patient care assistants (collectively referred to herein as CNAs), as well as individuals providing home health aid. These medical professionals typically work directly with a patient and/or with a nurse to assist in rendering patient care, which can include many physical tasks of patient care. Tasks include, for example, bathing, grooming, and feeding patients; responding to patients' requests for assistance with positioning in bed, transport to restroom facilities, and the like; cleaning a patient as well as the patient's bedding and a patient's room or portion thereof; checking and restocking medical supplies located in proximity to the patient being cared for; taking, or assisting other medical practitioners in taking, a patient's vital signs (e.g., temperature, blood pressure, and the like) or in administering medicine; and similar medical and patient care tasks. Often, an important part of each task performed includes documenting medical records so that other medical practitioners rendering care to a patient are fully informed of the patient's condition and what has been provided to the patient in a given time period.
Traditionally, these medical professionals (e.g., the EMTs, LPNs, CNAs, home health aid providers) acquire their skills initially in a classroom or other instructional setting, followed by working in a supervised, practical setting where some patient interaction occurs in a type of apprenticeship or “on-the-job” type training environment with another more experienced medical practitioner (e.g., a nurse or more experienced EMT, LPN, or CNA). As can be appreciated, there is a constant need for qualified and experienced medical practitioners at all levels of patient care. Accordingly, there is a great demand for systems to assist in training medical practitioners.
Accordingly, there is a need for training systems and methods using computer simulation and immersive virtual reality which permit evaluation of the progress in obtaining and/or maintaining skills needed by a medical practitioner such as, for example, EMTs, LPNs, CNAs, or home health aid providers, who assist patients in healthcare or residential facilities with their direct health care needs.
The present invention is directed to a simulator for skill-oriented training of a task. The simulator includes a head-mounted display unit (HMDU) wearable by an operator operating the simulator. The HMDU has at least one of a camera, a speaker, a display device, and a HMDU sensor. The speaker and the display device provide audio and visual output to the operator. The simulator also includes one or more controllers operable by the operator. The controllers each have at least one controller sensor. The controller sensor and the HMDU sensor cooperate to measure and to output one or more signals representing spatial positioning, angular orientation, speed and direction of movement data of the HMDU and/or the one or more controllers relative to a simulated patient as the operator performs a healthcare task. The simulator also includes a data processing system operatively coupled to the HMDU and the one or more controllers. The data processing system includes a processor and memory operatively coupled to the processor with a plurality of executable algorithms stored therein. The processor is configured by the executable algorithms to determine coordinates of a position, an orientation, and a speed and a direction of movement of the one or more controllers in relation to the patient as the operator takes actions to perform the healthcare task based on the one or more signals output from the HMDU sensor and the controller sensor of each of the one or more controllers. The processor is also configured to model the actions taken by the operator to perform the healthcare tasks to determine use of healthcare equipment and supplies and changes in condition of the patient and the used healthcare equipment and supplies in relation to the actions taken.
The processor renders the patient, the used healthcare equipment and supplies, the condition of the patient, changes to the condition of the patient, changes to the used healthcare equipment and supplies, and sensory guidance as to the performance of the healthcare tasks from the actions taken by the operator in a three-dimensional virtual training environment. The processor is further configured to simulate in real-time the three-dimensional virtual training environment depicting the rendered patient, the rendered used healthcare equipment and supplies, the rendered changes to the condition of the patient, the rendered changes to the used healthcare equipment and supplies, and the rendered sensory guidance as the operator performs the healthcare task in the training environment.
In one embodiment, the rendered patient, the rendered used healthcare equipment and supplies, the rendered changes to the condition of the patient, the rendered changes to the used healthcare equipment and supplies, and the rendered sensory guidance are exhibited in near real-time to the operator within the training environment on the display device of the HMDU to provide in-process correction and reinforcement of preferred performance characteristics as the operator performs the healthcare task. In one embodiment, the rendered sensory guidance includes a plurality of visual, audio and tactile indications of performance by the operator as compared to optimal values for performance.
In one embodiment, the simulator further includes an avatar or portion thereof, manipulated and directed by the operator with the one or more controllers to take the actions to perform the healthcare task in the training environment. In one embodiment, the portion of the avatar includes virtual hands.
In one embodiment, the operator of the simulator further includes a plurality of operators undertaking the skill-oriented training as a group cooperating to perform the healthcare task within the three-dimensional virtual training environment. In still another embodiment, the operator is one of a medical professional and an individual providing home health aid. In one embodiment, the medical professional includes at least one of an emergency medical technician (EMT), a licensed practical nurse (LPN), and a certified nursing assistant, nurse's aide, or patient care assistant, referred to herein as a CNA.
In one embodiment, the sensory guidance exhibited to the operator and/or others includes one or more of visual, audio, and tactile indications of performance by the operator operating the one or more controllers relative to the patient as compared to optimal values for performance of the healthcare task or tasks currently being performed by the operator. In one embodiment, the visual indications of performance include an indication, instruction, and/or guidance of the optimal values for preferred performance of the healthcare task currently being performed by the operator. In one embodiment, the audio indications of performance include an audio tone output by the at least one speaker of the HMDU. In still another embodiment, the audio tone is a reaction by the patient to the healthcare task or tasks currently being performed by the operator.
In yet another embodiment, the simulator further includes a display device operatively coupled to the data processing system such that an instructor may monitor the performance by the operator of the healthcare task. In one embodiment, the visual indications include a score or grade established by the instructor for the operator in the performance by the operator of the healthcare task as compared to a set of performance criteria defining standards of acceptability. In one embodiment, the established score or grade is a numeric value based on how close the operator's performance is to the optimum defined by the set of performance criteria. In one embodiment, the score or grade further includes rewards including certification levels and achievements highlighting the operator's results and/or progress as compared to the set of performance criteria and to other operators. In still another embodiment, the score or grade and rewards for one or more of the operators are at least one of shared electronically, posted on a website or bulletin board, and distributed over social media sites.
In one embodiment of the simulator, the data processing system is configured to provide a review mode for evaluating the operator's performance of the healthcare task. In one embodiment, when in the review mode the data processing system is further configured to provide reports of the operator's performance. In one embodiment, the data processing system is further configured to provide the review mode to at least one of the operator of the controller, an instructor overseeing the skill-oriented training, and other operators undergoing the skill-oriented training. In one embodiment, the simulator is portable as a self-contained modular assembly. In one embodiment, the data processing system of the simulator is further configured to provide one or more modes for assigning characteristics of at least one of the operator, the patient, and the environmental setting where the healthcare task is performed.
Referring now to the Figures, which are exemplary embodiments, and wherein like elements are numbered alike.
In one embodiment, the simulator 20 permits training and evaluating the operator's performance of a task, namely, using one or more controllers 60, for example, one or more handheld controllers 60 (e.g., a righthand and lefthand controller), to take actions by manipulating and directing a position and movement of an avatar 120 (
In one embodiment, the one or more handheld controllers 60 include a Pico Neo 3 controller of Qingdao Pico Technology Co., Ltd. dba Pico Immersive Pte. Ltd (Qingdao, China) (Pico Neo is a registered trademark of Qingdao Pico Technology Co., Ltd.). In one embodiment, the one or more handheld controllers 60 include an Oculus Quest 2 and/or an Oculus Rift controller of Facebook Technologies, LLC (Menlo Park, California) (Oculus Quest and Oculus Rift are registered trademarks of Facebook Technologies, LLC). In another embodiment, the one or more handheld controllers 60 include a Vive Pro Series controller of HTC Corporation (Taoyuan City, Taiwan) (Vive is a registered trademark of HTC Corporation). In still another embodiment, it is within the scope of the present invention for the simulator 20 to be implemented in a controller-free embodiment, for example, where a user's hands and gestures made therewith (e.g., grasping, picking up and moving objects, pinching, swiping, and the like) are identified and tracked (e.g., with cameras and sensors within the virtual healthcare environment 100) rather than actions and movement initiated by the user with a handheld controller in the environment 100.
As described herein, the operator 10 using the one or more controllers 60 alone or with one or more other input devices 53 (described below) manipulates and directs the avatar 120 to navigate through the virtual healthcare environment 100 and to take actions, for example, with the virtual hands 122, on objects 104 (e.g., the health care tools, equipment, PPE, and/or supplies) rendered therein, to perform tasks within the virtual healthcare environment 100. A tracking system within each of the one or more controllers 60 spatially senses and tracks movement of the respective controller 60 (e.g., speed, direction, orientation, spatial location, and the like) as directed by the operator 10 in performing one or more tasks in providing and/or assisting the resident or patient 102 with his/her healthcare needs, for example, perineal care needs. The healthcare training simulator 20 collects, determines, and/or stores data and information (described below) defining the movement of the one or more controllers 60 including its speed, direction, orientation, and the like, as well as the impact of such movement and actions within the virtual healthcare environment 100 such as, for example, as health care equipment and/or supplies 104 are used and the condition of the patient 102 changes (e.g., improves) as the operator 10 renders care to the patient 102.
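Purely by way of illustration, the derivation of speed and direction of movement from sampled controller positions described above can be sketched as follows; the `ControllerSample` structure, its field names, and the sampling convention are assumptions made for illustration and are not part of the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class ControllerSample:
    """One timestamped position sample from a controller's tracking sensors (illustrative)."""
    t: float  # sample time, seconds
    x: float  # spatial position, meters
    y: float
    z: float

def movement_between(a: ControllerSample, b: ControllerSample):
    """Derive speed (m/s) and a unit direction vector from two consecutive samples."""
    dt = b.t - a.t
    dx, dy, dz = b.x - a.x, b.y - a.y, b.z - a.z
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    speed = distance / dt if dt > 0 else 0.0
    # Zero vector when the controller has not moved between samples.
    direction = (dx / distance, dy / distance, dz / distance) if distance > 0 else (0.0, 0.0, 0.0)
    return speed, direction
```

In such a sketch, the simulator would apply this computation to each new sensor sample to obtain the per-controller movement data that is stored and later evaluated.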
Referring to
As should be appreciated, the objects 104 within the 3-D virtual VRNA healthcare training environment 100 include, for example, health care tools and/or equipment, PPEs, and/or supplies. It should also be appreciated that the 3-D virtual healthcare training environment 100 not only depicts the simulated patient 102 but also a condition of and/or symptoms and/or reaction exhibited by the simulated patient 102 undergoing treatment, including, for example, changes in conditions, symptoms, and/or reactions of the patient 102 before, during, and after care. In one embodiment, the depicted condition and/or symptoms of simulated patient 102 are related to perineal care and may include, for example, effects from episodes of incontinence, bedsores, skin ulcers, or the like. The operator 10 interacts within the virtual reality provided in the 3-D virtual healthcare training environment 100, for example, to view and otherwise sense (e.g., see, feel, hear, and optionally smell) the patient 102 and/or their condition, the avatar 120 and/or virtual hands 122, and the resulting actions he/she is directing to the simulated patient 102, their condition and changes thereto, and the objects 104 used (e.g., health care tools, equipment, PPEs, and/or supplies) and changes thereto, as he/she performs the healthcare tasks. In one embodiment, multiple operators 10 are present simultaneously within the 3-D virtual healthcare training environment 100 and cooperate to provide and assist in providing healthcare to the patient 102.
The interaction (individual operator and/or group of operators) is monitored, and data and information therefrom is recorded and stored (e.g., in a memory device) to permit performance evaluation by the operator 10, an instructor or certification agent 12, and/or other operators/healthcare trainees present during training or otherwise monitoring or cooperating to provide healthcare within the 3-D virtual healthcare training environment 100 at or from another location remote from where the training is being conducted, as is described in further detail below.
In one embodiment, the healthcare training simulator 20 generates audio, visual, and other forms of sensory output, for example, vibration, workplace disturbance (e.g., noise, smells, interruption from other medical practitioners and/or patient visitors, etc.), environmental conditions (e.g., lighting) and the like, to simulate senses experienced by the operator 10, individually and as a group of operators, as if the healthcare procedure is being performed in a real-world healthcare setting. For example, the training simulator 20 simulates experiences that the operator 10 (individual) and/or operators 10 (group) may encounter when performing the healthcare task “in the field,” e.g., outside of the training environment and in a healthcare work environment. As shown in
In one embodiment, input and output devices of the HMDU 40 and each of the one or more controllers 60 such as, for example, the cameras 42, the sensors 44 (e.g., tracking sensors), the display 46, and the speakers 48 of the HMDU 40, and sensors 62 (e.g., tracking sensors), control buttons or triggers 64, and haptic devices 66 of the controller 60 (e.g., rumble packs to simulate weight and/or vibration) that impart forces, vibrations and/or motion to the operator 10 of the controllers 60, and external input and output devices such as speakers 55, are incorporated into the conventional form factors. Signals from these input and output devices (as described below) are input signals and provide data to the processing system 50. The data is processed and provided to permit a thorough evaluation of the healthcare training procedure including the actions taken by the operator 10 in performing healthcare and equipment and/or supplies used therein.
As should be appreciated, the HMDU 40 and the one or more controllers 60 provide a plurality of inputs to the healthcare training simulator 20. The plurality of inputs includes, for example, spatial positioning (e.g., proximity or distance), orientation (e.g., angular relationship), and movement (e.g., direction and/or speed) data and information for tracking the position of one or more of the HMDU 40 and the one or more controllers 60 relative to the simulated patient 102 and the objects 104 (e.g., healthcare tools, equipment, PPEs, and supplies) within the 3-D virtual healthcare training environment 100. The HMDU 40 and the one or more controllers 60 may include sensors (e.g., the tracking sensors 44 and 62) that track the movement of the operator 10 operating the controllers 60. In one embodiment, the sensors 44 and 62 may include, for example, magnetic sensors, mounted to and/or within the HMDU 40 and the controllers 60 for measuring spatial position, angular orientation, and movement within the 3-D virtual healthcare training environment 100. In one embodiment, the sensors 44 and 62 of the HMDU 40 and the controllers 60 are components of a six-degree-of-freedom (e.g., x, y, z for linear direction, and pitch, yaw, and roll for angular direction) tracking system 110. In one embodiment, the tracking system is an “inside-out” positional tracking system, where one or more cameras and/or sensors are located on the device being tracked (e.g., the HMDU 40 and controllers 60) and the device “looks out” to determine how its spatial positioning, orientation, and movement has changed in relation to the external environment to reflect changes (e.g., in spatial positioning, orientation, and movement) within the 3-D virtual healthcare training environment 100. Examples of systems employing such inside-out positional tracking include, for example, the aforementioned Oculus Quest, Oculus Rift, and Vive controllers and HMDUs.
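As a simplified sketch only, the six-degree-of-freedom coordinates tracked by the sensors 44 and 62 can be represented and updated as shown below; the `Pose6DoF` name and the degree-based angle convention are illustrative assumptions, not elements recited in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Six-degree-of-freedom coordinates: linear position plus angular orientation."""
    x: float      # linear coordinates, meters
    y: float
    z: float
    pitch: float  # angular coordinates, degrees
    yaw: float
    roll: float

def apply_delta(pose: Pose6DoF, dx=0.0, dy=0.0, dz=0.0,
                dpitch=0.0, dyaw=0.0, droll=0.0) -> Pose6DoF:
    """Fold a tracked change in position and orientation (as an inside-out
    tracking update would report it) into the current pose, wrapping the
    angular coordinates into the range [0, 360)."""
    return Pose6DoF(
        pose.x + dx, pose.y + dy, pose.z + dz,
        (pose.pitch + dpitch) % 360.0,
        (pose.yaw + dyaw) % 360.0,
        (pose.roll + droll) % 360.0,
    )
```

Under this sketch, each tracked device (the HMDU 40 and each controller 60) would maintain one such pose that the tracking system updates as the device moves.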
In another embodiment, the tracking system is an “outside-in” positional tracking system, where one or more cameras and/or sensors are fixedly located in the environment (e.g., at one or more stationary locations) and on the device being tracked (e.g., the HMDU 40 and controllers 60), and the spatial positioning, orientation, and movement of the device being tracked is determined in relation to the stationary locations within the 3-D virtual healthcare training environment 100. An example of a system employing such outside-in positional tracking includes, for example, a Polhemus PATRIOT™ Tracking System, model number 4A0520-01, from the Polhemus company (Colchester, Vermont USA).
It should be appreciated that it is within the scope of the present invention to employ other tracking systems for locating the HMDU 40 and/or the controllers 60 in relation to the patient 102 within the 3-D virtual VRNA healthcare training environment 100. For example, in some embodiments the training simulator 20 includes a capability to automatically sense dynamic spatial properties (e.g., positions, orientations, and movements) of the HMDU 40 and/or the controllers 60 during performance of one or more tasks in providing and/or assisting in providing care, namely, the operator's positioning and movement in rendering care consistently and in a preferred manner. The training simulator 20 further includes the capability to automatically track the sensed dynamic spatial properties of the HMDU 40 and/or one or more of the controllers 60 over time and automatically capture (e.g., electronically capture) the tracked dynamic spatial properties thereof during the performance of the healthcare tasks.
As shown in
In one embodiment, as illustrated in
In one embodiment, illustrated in
In one embodiment, the computing device 52 of the processing system 50 invokes one or more algorithms or subsystems 132 that are stored in the internal memory 130 or hosted at a remote location such as, for example, a processing device (e.g., one of the processing systems 93) or in one of the data storage devices 96 or 150 operatively coupled to the computing device 52. From data and information provided by the HMDU 40 and one or more controllers 60, the one or more algorithms or subsystems 132 are executed by the CPU of computing device 52 to direct the computing device 52 to determine coordinates of a position, an orientation, and a speed and direction of movement of the operator 10 (e.g., via data and information received from the sensors 44 and 62 of the HMDU 40 and/or controllers 60) to model, render, and simulate the 3-D virtual training environment 100 depicting the rendered avatar 120 and/or the virtual hands 122, the patient 102, and/or the other objects 104 (e.g., the health care tools, equipment and/or supplies) with virtual imagery as the operator 10 performs the healthcare tasks.
In one embodiment, the algorithms or subsystems 132 include, for example, a tracking engine 134, a physics engine 136, and a rendering engine 138. The tracking engine 134 receives input, e.g., data and information, from the healthcare training environment 100 such as a spatial position (e.g., proximity and distance), and/or an angular orientation, as well as a direction, path and/or speed of movement of the sensors 44 and 62 of the HMDU 40 and/or the one or more controllers 60, respectively, in relation to the patient 102 and the objects 104 in the training environment 100 as provided by the sensors 44 and 62 of the HMDU 40 and/or each of the one or more controllers 60. The tracking engine 134 processes the input and provides coordinates to the physics engine 136. The physics engine 136 models the actions directed by the operator and/or operators 10 in performing healthcare tasks to the patient 102, the use of the objects 104 (e.g., the health care tools, equipment and/or supplies) in the performance of such tasks, and changes to the condition of the patient and to the used healthcare equipment and supplies within the virtual healthcare environment 100 based on the received input and/or coordinates from the tracking engine 134. The physics engine 136 provides the modeled actions performed by the operator and/or operators 10 to the rendering engine 138. The processing system 50 then executes the algorithms of the rendering engine 138 to render the avatar 120 and/or the virtual hands 122 for the operator and/or operators 10, the patient 102, the patient's condition, the use of the objects 104 (e.g., the health care tools, equipment and/or supplies) in the performance of such tasks, and changes to the condition of the patient and to the used healthcare equipment and supplies in a three-dimensional (3-D) virtual healthcare training environment 100 in response to the modeled performance of the healthcare tasks.
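The tracking-to-physics-to-rendering flow of the subsystems 132 can be sketched, in greatly simplified form, as follows; the per-frame data shapes, state keys, and class methods shown here are illustrative assumptions and do not represent the actual engines 134, 136, and 138.

```python
class TrackingEngine:
    """Converts raw sensor readings into coordinates in the training environment."""
    def process(self, sensor_readings):
        # A real tracking engine would filter and fuse sensor data; this
        # sketch simply passes each device's coordinates through.
        return [{"device": r["device"], "coords": r["coords"]} for r in sensor_readings]

class PhysicsEngine:
    """Models the effect of operator actions on the patient and supplies."""
    def __init__(self, state):
        self.state = state  # e.g., {"wipes_remaining": 10, "patient_clean": False}

    def model(self, tracked):
        for item in tracked:
            # Illustrative rule: a cleaning action occurs when the right
            # controller is tracked near the patient, consuming one supply item.
            if item["device"] == "right_controller" and item["coords"].get("near_patient"):
                self.state["wipes_remaining"] -= 1
                self.state["patient_clean"] = True
        return self.state

class RenderingEngine:
    """Renders the updated scene state once per frame."""
    def render(self, state):
        return f"patient_clean={state['patient_clean']} wipes={state['wipes_remaining']}"

def simulate_frame(tracking, physics, rendering, sensor_readings):
    # Tracking -> physics -> rendering, executed each simulation frame.
    return rendering.render(physics.model(tracking.process(sensor_readings)))
```

The point of the sketch is the ordering: coordinates flow from tracking to physics, modeled consequences flow from physics to rendering, and the cycle repeats every frame.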
The processing system 50 then simulates, in real-time, the 3-D virtual healthcare training environment 100 depicting the rendered avatar 120 and/or the virtual hands 122 of the operator and/or operators 10, the simulated patient 102, the used objects 104, and the changes to the condition and/or reaction of the patient and/or the used healthcare equipment and supplies with virtual imagery as the operator and/or operators 10 perform the healthcare tasks.
In one embodiment, the operating environment of the VRNA healthcare training simulator 20 is developed using a Unity™ game engine (Unity Technologies, San Francisco, California USA; and Unity IPR ApS, Copenhagen, DENMARK) and operates on the Windows™ (Microsoft Corporation, Redmond, Washington USA) platform. It should be appreciated that the VRNA healthcare training simulator 20 may also operate on a portable computing processing system, for example, the aforementioned processing systems 93 including PDAs, iPads, tablet computers, mobile radio telephones, smartphones (e.g., Apple™ iPhone™ device, Google™ Android™ device, etc.), or the like. It should be appreciated that one or more of the algorithms or subsystems 132 described herein (e.g., the tracking engine 134, the physics engine 136, and the rendering engine 138) may access the data storage device 150 to retrieve and/or store data and information 152 including data and information describing training and/or lesson plans 154 including skill-oriented tasks, steps, or activities in providing care and/or in assisting patients with direct healthcare needs, performance criteria 156 (e.g., proper techniques for performing and/or assisting in performing a healthcare task), data and information from one or more instances of performance of healthcare tasks 158 by one or more healthcare trainees (e.g., operators 10), scores and/or performance evaluation data for individual 160 and/or groups 162 of healthcare trainees (e.g., one or more healthcare trainees/operators 10), and healthcare simulation data as well as variables and/or parameters 164 used by the healthcare training simulator 20.
It should be appreciated that the input data and information is processed by the computing device 52 in near real-time such that the position, distance, orientation, path, direction, and speed of movement of the HMDU 40 and/or one or more controllers 60 is depicted as the operator and/or operators 10 are performing one or more healthcare tasks. Further aspects of the training simulator 20 are described in detail below.
It also should be appreciated that the input data and information include one or more variables or parameters set by the operator 10 on healthcare tools or equipment such as, for example, one or more settings for medical devices that measure, as is known in the art, temperature, blood pressure, or the like, of the patient 102 undergoing care. Moreover, the operator 10 may enter parameters, measurements, tasks performed, the condition of a patient as observed by the operator 10, and the like, in electronic medical records to simulate the documenting of care administered to the patient 102 as the operator 10 performs healthcare tasks within the 3-D virtual training environment 100. In effect, the tracking engine 134, the physics engine 136, and the rendering engine 138 simulate actions taken by the operator and/or operators 10 in performing healthcare tasks in a non-virtual environment. In one embodiment, the actions taken by the operator and/or operators 10 in performing healthcare tasks are evaluated and compared to preferred and/or proper techniques for performing and/or assisting in performing healthcare tasks (e.g., performance criteria 156). The actions of the operator and/or operators 10 can then be viewed, for example, in one or more review or evaluation modes, a specific instructional mode, and/or a playback mode, where the actions of the operator 10 are shown to the operator 10 (e.g., the healthcare trainee or trainees), the instructor or certification agent 12, and/or other healthcare trainees.
For example, the actions of the operator and/or operators 10, and, for example, the acceptability thereof in performing healthcare tasks with preferred and/or proper technique, reflect the level of skill of the operator and/or operators 10 individually and as a group. As can be appreciated, good technique typically results in acceptable actions in performing healthcare tasks, and less than good technique may result in an unacceptable action in performing healthcare tasks. The evaluation, and various review modes thereof (described herein), allows the operator 10, an instructor or certification agent 12, and/or others (e.g., other healthcare trainees) to evaluate the technique and actions used in performing healthcare tasks in a virtual setting, as captured and stored by the training simulator 20, for example, as performance data 158, and to make in-process adjustments to, or to maintain, the preferred or proper technique being performed and/or to be performed in a next healthcare performance. The evaluation compares the demonstrated techniques to acceptable performance criteria for the task (e.g., the performance criteria 156) and ultimately the acceptability of the tasks performed by the operator and/or operators 10 to the patient 102. In one embodiment, the operator's performance as he/she completes one or more skill-oriented tasks, steps, or activities in providing care and/or in assisting patients with direct healthcare needs (e.g., within the training and/or lesson plans 154) is monitored and graded, scored, or otherwise evaluated in comparison to preferred or proper techniques for performing and/or assisting in performing the healthcare task (e.g., in accordance with the performance criteria 156).
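A minimal sketch of grading an operator's performance against the performance criteria 156 follows; it assumes, purely for illustration, that criteria are expressed as per-metric optimal values with tolerances, a representation not stated in the disclosure.

```python
def score_performance(measured, criteria, weights=None):
    """Score measured performance metrics against per-metric optimal values.

    Each metric scores 100 when the measured value equals the optimal value,
    falling linearly to 0 at the stated tolerance; the overall score is a
    weighted average across metrics.
    criteria: {metric_name: (optimal_value, tolerance)}
    measured: {metric_name: measured_value}
    """
    weights = weights or {name: 1.0 for name in criteria}
    total_weight = 0.0
    weighted_sum = 0.0
    for name, (optimal, tolerance) in criteria.items():
        deviation = abs(measured[name] - optimal)
        metric_score = max(0.0, 100.0 * (1.0 - deviation / tolerance))
        weighted_sum += weights[name] * metric_score
        total_weight += weights[name]
    return weighted_sum / total_weight
```

For instance, under these assumptions, an operator finishing a task in 75 seconds against an optimal 60 seconds with a 30-second tolerance would score 50 on that metric.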
The grade, score, and/or other evaluation information (e.g., comments from the instructor 12), and the operator's progress in obtaining a requisite level of knowledge or skill in a task or tasks, may be stored in the data storage device 150 as, for example, scores and/or performance evaluation data for an individual 160 and/or for one or more groups 162 of healthcare trainees. In one embodiment, the review modes may be utilized to evaluate an operator's knowledge of acceptable and/or unacceptable aspects of a previous performance by the operator and/or operators 10 or by an actual or theoretical third-party operator. For example, a review mode may present a number of deficiencies in a performance of one or more healthcare tasks and query the operator 10 to identify the type or nature of the deficiency in the performance, possible reasons for the deficiency, and/or how to correct the deficiency going forward or in remedial operations.
It should be appreciated that it is also within the scope of the present invention for the review modes to provide tutorials, e.g., audio-video examples, illustrating setup and use of healthcare equipment and supplies typically used in the healthcare industry, acceptable performance techniques using the same, common deficiencies and ways to reduce or eliminate the same, and the like. It should also be appreciated that, as described herein, the VRNA healthcare training simulator 20 can be used for training, developing, maintaining, and improving not only performance of healthcare treatment procedures but also other skills such as, for example, workplace safety, patient privacy, team building and group performance skills, and the like.
It should further be appreciated that the VRNA healthcare training simulator 20 may be implemented as a project-based system wherein an individual instructor, certification agent, or the like, may define their own performance characteristics (e.g., elapsed time, preferred and/or proper performance techniques, requisite level of knowledge or skill to attain a rating or certification, and the like) and/or criteria including those unique to the instructor, agent and/or a given healthcare facility. In such embodiments, the operator and/or operators 10 are evaluated (e.g., individually and as a group) in accordance with the unique performance characteristics and/or criteria. In one embodiment, as described herein, the healthcare training simulator 20 is operatively coupled to the Learning Management System (LMS) 170. The LMS 170 may access the data storage device 150 that stores data and information 152 used by the healthcare training simulator 20.
In one embodiment, the healthcare training simulator 20 is operatively coupled to an Artificial Intelligence (AI) engine 190. The AI engine 190 is operatively coupled, directly or through the network 90, to the computing device 50 and/or the LMS 170. In one embodiment, the AI engine 190 accesses and analyzes data and information 152 within the LMS 170 and/or data storage device 150, including the performance criteria 156, the performance data 158, scores and/or evaluation data for individuals 160 and/or groups 162, and the like, for one or more of the operators 10 and identifies, for example, successes or deficiencies in performance by individual and/or groups of operators 10, successes or deficiencies of instructors in terms of how their trainees performed, and the like. In one embodiment, the AI engine 190 determines common deficiencies and/or trends in deficiencies and recommends modifications to existing and/or new lesson plans, tasks, and activities (e.g., the stored lesson plans 154), and/or to the performance criteria 156, with an aim of minimizing and/or substantially eliminating the identified and/or determined deficiencies through performance of the improved and/or new lesson plans and evaluation thereof by improved and/or new performance criteria 156. It should be appreciated that the AI engine 190 may access and analyze performance data on-demand or iteratively to provide continuous learning improvements over predetermined and/or prolonged periods. In one embodiment, the AI engine 190 interacts with the operator and/or operators 10 (e.g., respective avatars), for example, as an in-scene instructor (e.g., a senior medical practitioner), or to provide and/or to enhance interaction to be more realistic of actual conditions in a healthcare facility or in an interior or exterior scene of an event (e.g., motor vehicle accident, critical natural or manmade disaster, concert or other entertainment performance, and the like) under ordinary daily and/or emergency conditions.
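By way of a non-limiting illustration, one simple form of the trend analysis described above is a frequency count of deficiency types across a group of operators, flagging those that recur often enough to warrant a lesson-plan modification. The record format, function name, and threshold below are hypothetical:

```python
from collections import Counter

# Hypothetical deficiency log: (operator_id, deficiency_type) pairs, as might
# be derived from the performance data 158 for a group of operators.
deficiencies = [
    ("op-1", "hand_hygiene_skipped"),
    ("op-2", "hand_hygiene_skipped"),
    ("op-3", "glove_change_missed"),
    ("op-1", "hand_hygiene_skipped"),
]

def recommend_lesson_focus(records, threshold=2):
    """Return deficiency types occurring at least `threshold` times
    across the group, most frequent first."""
    counts = Counter(d for _, d in records)
    return [d for d, n in counts.most_common() if n >= threshold]

print(recommend_lesson_focus(deficiencies))  # ['hand_hygiene_skipped']
```

An AI engine performing this analysis iteratively could then surface the flagged deficiency types as candidate emphases for revised lesson plans 154.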
It should be appreciated that the scene of the event and simulated patient interaction may include in-transport care as the patient is being moved, e.g., driven or flown, from an accident site to a hospital or other trauma center.
For example, when the operator 10 selects the Male Patient element 206A of the start-up page GUI 204 of
In one embodiment, the series of GUIs 200 further include, for example, GUIs 216 and 218 of
As shown in
As shown in GUIs 224, 226, 228, 230, 232, 234, 236, and 238 of
As also shown in GUI 232, 234, 236, 238, 240, and 242 of
As shown in GUIs 240, 242, 244, and 246 of
It should be appreciated that one aspect of providing the aforementioned cleaning tasks is for a healthcare practitioner to assess a preexisting or newly developed condition of the resident/patient undergoing care. For example, it is not uncommon for bedridden residents/patients to develop skin ulcers from lack of movement or mobility, leading to poor blood flow in areas of the body and, as a result, loss of the outer layers of skin, redness, and, in extreme cases, open sores, wounds, and ulcers. In one embodiment, as shown on GUI 360 of
In one embodiment, the VRNA training simulator 20 provides a series of the GUIs 200 to, for example, monitor and evaluate a healthcare trainee (e.g., the operator 10) performing such medical or patient care tasks as taking, or assisting other medical practitioners in taking, a patient's vital signs (e.g., temperature, blood pressure, blood glucose level, blood flow, and the like) and/or being administered medicine. For example, as shown in GUIs 300, 302, 304, 306, and 308 of
In one embodiment, the VRNA training simulator 20 also provides a series of the GUIs 200 to, for example, monitor and evaluate the healthcare trainee (e.g., the operator 10) measuring, reading, and recording other vital signs of the resident/patient 102 such as, e.g., a patient's blood glucose level at a patient's finger 102B with a simulated glucose meter 104G on a GUI 320 of
In one embodiment, the VRNA training simulator 20 may introduce one or more tests or quizzes to, for example, periodically evaluate the knowledge and skill of the healthcare trainee (e.g., the operator 10) in completing a healthcare task. For example, as illustrated in
In one embodiment, the healthcare provided to residents/patients may include, for example, physical therapy. For example, as shown on GUIs 350 and 354 of
As should be appreciated from the description presented herein, the VRNA healthcare training simulator 20 implements the 3-D virtual healthcare training environment 100 for training and re-training healthcare trainees operating the system to gain and/or further refine a plurality of healthcare skills. For example, the healthcare skills within the VRNA healthcare training simulator 20 include, but are not limited to, the following:
As can be appreciated by those skilled in healthcare practice, hand hygiene is important as it prevents the spread of germs, thus protecting both the caregiver and the persons receiving care from the caregiver. Accordingly, as shown in GUI 370 of
In one embodiment, the VRNA training simulator 20 may capture and record (e.g., via the tracking sensors 44 and 62) one or more paths of travel of the one or more controllers 60 as the operator 10 manipulates one of the objects 104 (e.g., the healthcare tools, equipment, and supplies) used by operators 10 in providing care, and/or of the HMDU 40 as an indication of the operator's physical movement within and about the 3-D virtual healthcare training environment 100. In one embodiment, the training simulator 20 may generate, for example, in a review and/or evaluation mode, a line as a visual indication of the one or more captured and recorded paths of travel of the objects 104 and/or the operator 10 to demonstrate the position and/or orientation thereof as a performance measurement tool. In one embodiment, such a performance measurement tool may be used, for example, to demonstrate proper and efficient, and/or improper and inefficient performance of healthcare procedures conducted by the operator 10. In one embodiment, the visual indication of two or more paths of travel may be color coded or otherwise made visually distinct, may include a legend or the like, depicting and individually identifying each of the paths of travel, and/or may include one or more visual cues (e.g., a starting point, cone, arrow, or icon, numeric or alphanumeric character, and the like) illustrating aspects of the paths of travel such as, for example, speed, direction, orientation, and the like.
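By way of a non-limiting illustration, the capture of a path of travel and the derivation of a per-segment speed (an aspect a review mode could color code or annotate as described above) may be sketched as follows. The sampling format and function name are hypothetical:

```python
import math

# Hypothetical path-of-travel capture: the tracking sensors 44 and 62 are
# assumed to yield timestamped 3-D positions (t, x, y, z) for a controller
# 60 or the HMDU 40; per-segment speed is one cue a renderer could depict.

def segment_speeds(samples):
    """samples: list of (t, x, y, z) tuples, in time order.
    Returns the speed of each segment in distance units per second."""
    speeds = []
    for (t0, *p0), (t1, *p1) in zip(samples, samples[1:]):
        speeds.append(math.dist(p0, p1) / (t1 - t0))
    return speeds

path = [(0.0, 0, 0, 0), (1.0, 3, 4, 0), (2.0, 3, 4, 12)]
print(segment_speeds(path))  # [5.0, 12.0]
```

Segments could then be binned by speed (or by deviation from a reference path) and drawn in distinct colors, with the legend and visual cues described above identifying each path.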
As should be appreciated, it is within the scope of the present invention to provide more and/or different sensory indications (e.g., visual graphs and icons, audio and/or tactile indications) to illustrate, for example, both favorable and/or unfavorable aspects of the performance of healthcare procedures by the operator 10 (e.g., healthcare trainee) within the 3-D virtual healthcare training environment 100. The inventors have discovered that this in-process, real-time sensory guidance (e.g., the visual, audio and/or tactile indications) can improve training of the operator 10 by influencing and/or encouraging in-process changes by the operator 10 such as positioning (e.g., proximity and/or angle) of the one or more controllers 60 in relation to the patient 102. As can be appreciated, repeated performance at, or within a predetermined range of, optimal performance characteristics develops and/or reinforces skills necessary for performing a skill-oriented task. Accordingly, the training simulator 20 and its real-time evaluation and sensory guidance toward optimal performance characteristics are seen as advantages over conventional training techniques. Furthermore, in some embodiments, the performance characteristics associated with the operator 10 and/or the quality characteristics associated with the healthcare virtually rendered thereby may be used to provide a measure or score of a capability of the operator 10, where a numeric score is provided based on how close to optimum (e.g., preferred, guideline, or ideal) the operator 10 is for particular tracked procedures and the like.
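By way of a non-limiting illustration, a proximity-to-optimum score of the kind described above may map the deviation of a measured characteristic from its optimal value onto a 0-100 scale, with full credit inside a tolerance band and a linear falloff outside it. The function name, band widths, and example values are hypothetical:

```python
# Hypothetical scoring sketch: full credit within +/-tolerance of the
# optimal value, zero credit at +/-zero_at, linear in between.

def proximity_score(measured, optimal, tolerance, zero_at):
    """Return a 0-100 score reflecting closeness of `measured` to `optimal`."""
    dev = abs(measured - optimal)
    if dev <= tolerance:
        return 100.0
    if dev >= zero_at:
        return 0.0
    return 100.0 * (zero_at - dev) / (zero_at - tolerance)

# e.g., a controller angle of 35 degrees versus an optimal 30 degrees,
# with a +/-2 degree tolerance band and a 12 degree cutoff
print(proximity_score(35, 30, 2, 12))  # 70.0
```

Scores for several tracked characteristics (proximity, angle, speed, and the like) could be combined into the overall capability measure of the operator 10.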
As described above, the healthcare training simulator 20 tracks, captures or records, and utilizes various cues and sensory indications to exhibit both favorable and/or unfavorable aspects of the healthcare procedures being performed by the operator 10. In one aspect of the invention, the simulator 20 compares an operator's performance, including the tools, equipment, and supplies 104 used as well as the controller 60 movement (e.g., speed, direction or path, orientation, distance), to a set of performance criteria established by, for example, the instructor or certification agent 12 and/or healthcare industry standards of acceptability. In one embodiment, the training simulator 20 based evaluation yields scores and/or rewards (e.g., certification levels, achievement badges, and the like) highlighting the operator's progress and/or results as compared to the set of performance criteria and, in one embodiment, as compared to other healthcare trainees. The scoring may be determined and/or presented both on an in-process and/or on a completed-task basis. As noted above, the scoring may include evaluations of the operator's actions in manipulating the patient 102 and/or objects 104 by movement of the one or more controllers 60 (e.g., speed, orientation, distance) as the operator 10 performs a healthcare procedure and tasks therein, as well as the operator's performance with respect to other parameters such as, for example, elapsed time, efficiency, and resulting patient condition and/or improved condition (e.g., perceived good and bad results).
In one embodiment, scoring and/or rewards are stored by the VRNA healthcare simulator 20, for example, within the aforementioned performance data 158, individual and group scores 160 and 162, as compared to performance criteria 156 of the data storage device 150 for one or more trainee/operators 10. In one embodiment, the scoring and/or rewards may be downloaded and transferred out of the training simulator 20 such as, for example, via a USB port (e.g., port 58) on the computing device 52. In one embodiment, scoring and/or rewards for one or more trainees (e.g., the operators 10) may be shared electronically, for example, included in electronic mail messages, posted on a portal accessible by one or more healthcare facilities or the like, websites and bulletin boards, and over social media sites. In one embodiment, shown in GUIs 400, 404, 408, and 412 of
In one embodiment, a healthcare trainee may “earn” an award, commendation, and/or badge when the trainee's score in performing an activity meets or exceeds one or more predetermined thresholds. As such, the awards, commendations, and badges are in recognition for superlative performance, e.g., performance at or above one or more predetermined performance thresholds. In one embodiment, the performance thresholds may be set in accordance with, for example, institutional, state, or federal competency requirements as well as other regulatory and/or certifying agencies or the like. In one embodiment, trainees can upload and publish their scores 405 via the network 90 to, for example, social networking websites such as, for example, Facebook®, Twitter®, or the like. The publication is seen to enhance trainee interest, engagement and, further, foster a level of competition that may drive trainees to build advanced skills in order to obtain a “leader” position among his/her classmates and/or peers.
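By way of a non-limiting illustration, the award of a badge at one or more predetermined thresholds may be sketched as follows. The threshold values and badge names are hypothetical, and in practice would be set per the institutional, state, federal, or certifying-agency requirements noted above:

```python
# Hypothetical thresholds, highest first; actual values would be set by an
# institution, regulator, or certifying agency.
BADGES = [(95, "gold"), (85, "silver"), (75, "bronze")]

def award_for(score):
    """Return the badge earned at the highest threshold met, or None."""
    for threshold, badge in BADGES:
        if score >= threshold:
            return badge
    return None

print(award_for(88))  # silver
```

Earned badges could then be stored alongside the scores 160 and 162 and published or shared as described above.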
As noted above, it is within the scope of the present invention for the administrator of the VRNA simulator 20, the instructor or certification agent 12, and/or the operator or user 10 to selectively vary characteristics or physical features of the simulated resident or patient 102 such as gender, hair, skin tone, skin color, height, weight, and the like, clothing or medical gown worn by the patient 102, and medical condition, including mental and/or physical conditions, symptoms, and/or disabilities of the resident or patient 102 such as, for example, an amputated limb or limbs, physical deformities, injuries, wounds, or other medical illnesses, diseases, handicaps, and/or special health care needs, and the like. For example, in one embodiment as illustrated in
In one aspect of the present invention, the VRNA healthcare training simulator 20 is portable (e.g., transferable) as a self-contained modular assembly 400 (
In one aspect of the present invention, the VRNA healthcare training simulator 20 is customizable (e.g., modifiable and/or adjustable) to assign particular characteristics of an operator, e.g., height, spoken language, and the like, and/or environmental settings where healthcare is to be performed, e.g., urban versus rural setting and a particular healthcare facility's room configurations (e.g., single versus multiple resident/patient occupancy, equipment present, display of instructional, informational, and/or hazard/warning postings or displays (e.g., specific PPE required for access)). For example, in one embodiment, the VRNA healthcare training simulator 20 includes a configuration mode, depicted in GUI 450 of
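By way of a non-limiting illustration, the customizable operator and environment settings described above may be gathered into a configuration record such as the following. The field names, defaults, and values are hypothetical and do not reflect an actual API of the simulator:

```python
from dataclasses import dataclass

# Hypothetical configuration sketch for the customization mode described
# above; field names and defaults are illustrative only.

@dataclass
class SimulatorConfig:
    operator_height_cm: float = 170.0
    spoken_language: str = "en"
    setting: str = "urban"            # "urban" or "rural"
    room_occupancy: int = 1           # single vs. multiple resident/patient
    required_ppe: tuple = ("gloves",)  # posted PPE requirements for access

cfg = SimulatorConfig(spoken_language="es", room_occupancy=2,
                      required_ppe=("gloves", "gown", "mask"))
print(cfg.setting, cfg.room_occupancy)  # urban 2
```

A configuration GUI would populate such a record, which the simulator could then use to instantiate the 3-D virtual healthcare training environment 100 accordingly.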
While the invention has been described with reference to various exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims.
This application claims benefit of and priority under 35 U.S.C. § 119(e) to copending, U.S. Patent Application Ser. No. 63/336,490, filed Apr. 29, 2022, the disclosure of which is hereby incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
63336490 | Apr 2022 | US