One or more embodiments of the present invention relate to a haptic-mixed reality intravenous needle insertion simulation assembly. One or more embodiments of the present invention relate to a corresponding system. One or more embodiments of the present invention relate to corresponding methods.
A common hospital procedure is the insertion of an intravenous catheter (IVC). Infusion therapy utilizing intravenous catheters provides a route to administer life-sustaining fluids, electrolyte replacement, and pharmacological agents. Intravenous catheters also allow for extracting blood for testing and diagnostic purposes. Unfortunately, not all IVC insertions are successful, especially on the first attempt.
Venipuncture skills are among the most challenging for a novice nurse to master during training and the transition into practice. Poor success rates have been attributed to confidence issues, improper angle of insertion, and lack of opportunities.
Educational opportunities to train IVC skills are typically performed on plastic manikin arms, which generally fail to provide the realism and variability required to achieve mastery. In addition, teaching this skill requires many consumable products and costly medical devices, e.g., single-use intravenous (IV) catheters and the manikin arms.
Therefore, there is a need in the art for improvements for enhancing learning and training for intravenous catheter insertion.
A first embodiment provides an assembly for training a user for intravenous (IV) catheter insertion, the assembly including a mixed reality display adapted to display a virtual patient and a virtual needle, the mixed reality display including a hand tracking module; a first haptic device including a stylus assembly with a needle assembly; a second haptic device adapted to allow the user to stabilize a hand; and a hand motion controller adapted to track the second haptic device in conjunction with the hand tracking module of the mixed reality display.
A further embodiment provides a system which includes the assembly coupled with a computer.
An additional embodiment provides a method utilizing the system, the method including providing the system having a first set of insertion conditions; utilizing the system to simulate inserting the virtual needle into an arm of the virtual patient with the first set of insertion conditions; adjusting the system to a second set of insertion conditions; and utilizing the system to simulate inserting the virtual needle into the arm of the virtual patient with the second set of insertion conditions.
One or more embodiments of the present invention relate to a haptic-mixed reality intravenous (HMR-IV) needle insertion simulation assembly. One or more embodiments of the present invention relate to a corresponding system. One or more embodiments of the present invention relate to corresponding methods. One or more embodiments of the system of the present invention include a bimanual haptic interface fully integrated into a mixed reality and/or virtual reality system, with programmable variabilities that reflect real clinical environments. Advantageously, embodiments of the present invention disclosed herein offer the ability to enhance learning and training for intravenous catheter insertion.
Embodiments of the present invention allow users, such as nursing students and healthcare professionals, to practice intravenous (IV) needle insertion into a simulated/virtual arm. The practice serves to improve the psychomotor skill development necessary for enhancing learning to achieve mastery. Embodiments of the present invention allow for multiple attempts at IV needle insertion under a variety of insertion conditions. The insertion conditions can be varied for the skin, such as differences in color, texture, stiffness, and friction. The insertion conditions can be varied for a vein, such as differences in size, shape, location depth, stiffness, and friction. The insertion conditions may be varied in order to provide sufficient realism and variability, which aids in accounting for the differences in these conditions among human patients.
As further described herein, the IV insertion simulation allows for a variety of different insertion conditions. Moreover, a force-profile-based haptic rendering algorithm provides realistic haptic feedback to the user while inserting the needle into the virtual vein. Also, improved haptic glove tracking is disclosed, which utilizes a hand tracking sensor. Embodiments of the present invention can further include calibrating the haptic and mixed reality system with a calibration box using a camera sensor. The system and methods were also tested to verify their realism.
With particular reference to the Figures, haptic-mixed reality intravenous needle insertion simulation assembly 12, which may also be referred to as simulation assembly 12 or assembly 12, includes a hand motion controller 16, a mixed reality display 18, a first haptic device 20, and a second haptic device 22.
The hand motion controller 16 works in conjunction with a hand tracking module of the mixed reality display 18 to achieve accurate global hand tracking. An exemplary hand motion controller 16 is the controller available under the name Leap Motion Controller. Another exemplary hand motion controller 16 is the controller available under the name Vive tracker attached to a haptic glove and tracked by a corresponding base station. As discussed further herein, one or more embodiments of the system include a hybrid global hand tracking system, which can combine the hand motion controller 16 and the hand tracking of the mixed reality display 18, which may also be referred to as combining two depth sensors. The hybrid global hand tracking system can accurately track one or more of the first haptic device and the second haptic device. That is, the hybrid global hand tracking system can accurately track a haptic device (e.g., glove) and can simulate grasping a virtual hand with force feedback.
The mixed reality display 18 allows a user to see their real hands in conjunction with a virtual image 24 (shown in the Figures).
The first haptic device 20 should allow the user to simulate inserting a needle, normally with their dominant hand. An exemplary first haptic device 20 is a stylus haptic device available under the name Geomagic Touch. Other 3DOF or 6DOF desktop haptic devices may be suitable. As discussed further herein, the first haptic device 20, in conjunction with a force-profile-based haptic rendering, is able to mimic the real, tactile feeling of IV needle insertion. The first haptic device 20 can include, or can be used in conjunction with, a force sensor to achieve the force-profile-based haptic rendering.
The first haptic device 20, which may be referred to as a modified haptic needle interface 20, can be or can include a stylus 21 adapted to be used by a dominant hand of the user, the stylus acting as a simulated intravenous catheter. The stylus 21, which can be referred to as stylus assembly 21, can include a stylus end 23 which includes a needle assembly 25. The needle assembly 25 may have its sharp needle end removed for safety purposes. The first haptic device 20 as a stylus assembly 21 may be a desktop haptic stylus assembly. In one or more embodiments, the first haptic device 20 and the force-profile-based haptic rendering are sufficient for mimicking the real, tactile feeling of IV needle insertion, such that the system 10 can be devoid of a manikin arm. In other embodiments, the system 10 might include a manikin arm or other assembly for physically imitating a human arm, for insertion of a simulated needle therein.
The second haptic device 22 should allow for the user to stabilize their other hand, normally their non-dominant hand. The second haptic device 22 can be a haptic glove. That is, the second haptic device 22 as a haptic glove can be adapted to be worn on a non-dominant hand of the user. Exemplary second haptic devices 22 include haptic gloves available under the names Dexmo and SenseGlove.
The combination of the first haptic device 20 and the second haptic device 22 may be referred to as bimanual haptic simulation. The first haptic device 20 and the second haptic device 22 may be referred to as being nonhomogeneous but complementary haptic devices. To simulate IV needle insertion with realistic variable conditions such as human skin characteristics (e.g., color, textures, roughness), vein characteristics (e.g., shape, size, thickness, location), and different types of IV catheters, the bimanual haptic simulation is integrated with a mixed reality system, as discussed elsewhere herein, for conducting a bimanual IV needle insertion procedure.
The components of assembly 12 are coupled with computer 14, as shown in the Figures.
The particular connections between the respective components of assembly 12 and the computer 14, as shown in the Figures, can be any suitable connections known to the skilled person.
The haptic-mixed reality IV simulation system (HMR-IV Sim) 10 is shown in the Figures.
The built-in shader, which can be a physically based shader, can be applied to increase the realism of interactions in the graphic rendering by adjusting one or more of the following parameters: metallic: 0; smoothness: 0.5; normal map scale: 1; specular highlights and reflection enabled. In addition, graphic rendering parameters simulating variable conditions (e.g., colors and textures of the skin and veins; vein size, shape, and location; IV needle size and blood drawing) can be developed and set to allow for flexibility in programming for practice with various conditions.
These parameters should be synchronized with the haptic rendering to achieve a realistic simulated environment that considers both the visual and haptic perceptions expected in real-world scenarios. For example, to control the variability of vein size and location, the graphic rendering interface can be programmed to allow selection from 10 mm to 7 mm for the vein diameter and from 5.5 mm to 7 mm for the vein location (depth under the skin surface).
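By way of illustration only, the programmable insertion conditions might be organized as a simple parameter structure, as in the following Python sketch. All field names and default values here are assumptions for illustration, not the actual engine-side implementation described herein.

```python
from dataclasses import dataclass

# A minimal sketch of programmable insertion conditions; names and
# default values are illustrative assumptions only.
@dataclass
class InsertionConditions:
    skin_color: str = "tan"        # e.g., "white", "tan", "black"
    skin_stiffness: float = 0.5    # normalized haptic stiffness [0, 1]
    skin_friction: float = 0.3     # normalized friction [0, 1]
    vein_diameter_mm: float = 7.0  # selectable vein diameter
    vein_depth_mm: float = 5.5     # vein location depth under the skin
    vein_stiffness: float = 0.7    # normalized haptic stiffness [0, 1]
    tattoo: str = "none"           # "none" (easy), "pattern" (medium), "anchor" (hard)

# Example: a harder trial with a small, deep vein under a tattoo.
hard_trial = InsertionConditions(vein_diameter_mm=2.0, vein_depth_mm=7.0,
                                 tattoo="anchor")
```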
For haptic rendering, a suitable algorithm can be utilized for the haptic devices 20, 22 in terms of basic colliders and force feedback. To create a feeling of realistic needle insertion, a force-profile-based needle insertion algorithm can be developed, as further described herein. In the algorithm, variable conditions were implemented. However, rather than on graphic variables, the focus can be on stiffness for the vein and skin, due to its importance in creating a haptic feeling that mimics real-world experience. These stiffness parameters can be implemented to be adjustable, allowing for two distinguishable values for skin and vein based on a discrimination threshold estimated to measure a human differential threshold for haptic stiffness discrimination in the presence of MR graphic objects. A hand motion controller 16 can be employed to achieve accurate global hand tracking in combination with the hand tracker of the mixed reality display 18. In one or more embodiments, the hand motion controller 16 can be mounted to a center (e.g., right above a camera) of the mixed reality display 18.
Calibration of the mixed reality display 18, which can display both virtual and real objects, can be performed to spatially synchronize the virtual needle 28 with the real needle assembly 25 attached to the haptic stylus device 20, 21 in motion. Without calibration, there can be discrepancies in position and direction between the virtual needle 28 and real needle assembly 25, leading to misperceptions of the needle insertion task in the mixed reality system. Accuracy of the calibration should be considered, since simulated IV needle insertion requires hand-eye coordination as well as fine hand/finger motor skill in terms of locating and inserting the thin virtual needle 28.
The coordinate system of the mixed reality display 18, which may be referred to as a head-mounted display (HMD) 18, can be a right-handed coordinate system that enables tracking of the HMD position in the real world. Once the user defines a stage that represents the room in the real world, the stage defines a stage origin, which is a spatial coordinate system centered at the user's position and orientation. The HMD position can be tracked based on the stage origin. For this reason, a consistent world coordinate system should be used for calibration within the virtual world, and haptic coordinates should be used to simulate the needle insertion with an overlaid syringe on the haptic stylus.
The calibration method relies on the initial positions of the camera of the mixed reality display 18 and the haptic stylus device 21, 25, which are both measured in the real-world coordinate system. To fix the initial positions of the mixed reality display 18 and the haptic stylus device 21, 25, a device-positioning board (not shown) can be utilized. The device-positioning board can be an acrylic panel (e.g., width: 41 cm; height: 26.5 cm; depth: 26.5 cm) with positioning guides that allow the mixed reality display 18 and the haptic stylus device 21, 25 to be positioned at predefined positions when the system 10 starts. In this way, the synchronization of different coordinate systems can be well maintained, even when the mixed reality display 18 is dynamically in use by the user.
Calibration data computed through this process can be pre-stored in a step 36 and automatically loaded when the HMR-IV Sim system 10 starts, as shown in the Figures.
To estimate an accurate transform operation from the coordinate system of the haptic device 21, 25 to the eye view of the mixed reality display 18, an MR camera calibration method using Direct Linear Transform (DLT) can be combined with coordinate system synchronization. Due to the offset between the mixed reality display 18 camera position and both eye view positions, the holographic object is overlapped onto the real object in the eye view, and the mixed reality display 18 camera view shows the gap between the hologram and the real object. The root mean square error of the gap from all real control points to virtual control points on the projected image can be found. An example is (−24.5, −31.5) pixels, (−20.9, −21.1) pixels, and (−24.1, −28.25) pixels for a captured image from 0, −45, and 45 degrees, respectively.
The DLT-based calibration method requires at least 6 control points as pairs of corresponding points between 3D (x, y, z) and 2D image coordinates (u, v). The control points, which are 3D points in the real-world coordinate system and projected pixel points on the camera image plane, can be obtained by measurements and then used for estimating the unknown parameters (called DLT parameters) of a 3 by 4 homogeneous camera matrix P by solving linear equations ([u; v; w]=P*[X; Y; Z; W]). The projection matrix P, including the estimated parameters, maps any 3D points to the camera image plane, which becomes the mixed reality display 18 view.
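As a non-limiting illustration of this step, the following Python sketch estimates the 3 by 4 projection matrix from at least 6 control-point correspondences using a standard singular-value-decomposition formulation of the DLT; it is a textbook sketch, not the specific implementation of the system.

```python
import numpy as np

def dlt_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 camera projection matrix P from >= 6 control points
    (3D real-world points paired with 2D pixel points) via the DLT."""
    assert len(points_3d) >= 6 and len(points_3d) == len(points_2d)
    rows = []
    for (x, y, z), (u, v) in zip(points_3d, points_2d):
        # Each correspondence contributes two linear equations in the
        # 12 unknown DLT parameters.
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z, -u])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z, -v])
    a = np.asarray(rows, dtype=float)
    # The null-space vector (smallest singular value) holds the 12 DLT parameters.
    _, _, vt = np.linalg.svd(a)
    return vt[-1].reshape(3, 4)

def project(p, point_3d):
    """Map a 3D point to pixel coordinates: [u; v; w] = P * [X; Y; Z; 1]."""
    u, v, w = p @ np.append(point_3d, 1.0)
    return np.array([u / w, v / w])
```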
A checkerboard box 40 (shown in the Figures) can be utilized to obtain the control points.
The gyro sensor can be used to maintain the position and orientation of the mixed reality display 18 camera in this step. The origin of the real-world coordinate system is the left front corner of the synchronization board, and for the axes, the right-handed coordinate system can be used. The control points can then be used for computing an initial projection matrix using the DLT method. In this step, 2 points from each side, or a total of 6 points, can be selected to create the matrices, and the matrix with minimum error can be selected as the initial matrix. Then, the initial matrix can be iteratively optimized using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) optimization algorithm, which finds the minimum reprojection errors in the camera image coordinates, as shown in the Figures.
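Continuing the sketch above, the iterative refinement might be expressed with SciPy's BFGS implementation, minimizing the root mean square reprojection error over the control points; `project` refers to the illustrative helper in the prior sketch, and the whole block is an assumption-laden sketch rather than the system's actual code.

```python
import numpy as np
from scipy.optimize import minimize

def refine_projection(p_init, points_3d, points_2d):
    """Refine an initial DLT matrix by minimizing the RMS reprojection
    error in camera image coordinates using BFGS."""
    pts2 = np.asarray(points_2d, dtype=float)

    def rms_error(params):
        p = params.reshape(3, 4)
        proj = np.array([project(p, x) for x in points_3d])
        return np.sqrt(np.mean(np.sum((proj - pts2) ** 2, axis=1)))

    result = minimize(rms_error, p_init.ravel(), method="BFGS")
    return result.x.reshape(3, 4)
```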
The next step of the calibration process can be computing the homogeneous transformation matrices between 3D coordinate systems, namely the real world, the virtual world, and the haptic stylus system, as each has a different coordinate system. For the coordinate synchronization, four 4×4 transformation matrices (T_WH, T_HW, T_UW, T_WU) can be computed using 12 corresponding points between paired coordinate systems (e.g., the real world to the virtual world, the virtual world to the haptic stylus, the real world to the haptic stylus), and the matrices can then be optimized with BFGS. The cost function of the BFGS is the root mean square error over all 12 points transformed by the 4×4 homogeneous matrix parameters.
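A least-squares fit of one such 4×4 homogeneous transform from corresponding point pairs might look like the following sketch (illustrative only; the described calibration further refines the result with BFGS, analogously to the reprojection refinement above).

```python
import numpy as np

def fit_homogeneous_transform(src_points, dst_points):
    """Fit a 4x4 homogeneous transform mapping one coordinate system to
    another from corresponding point pairs (e.g., the 12 pairs described)
    by linear least squares."""
    src = np.asarray(src_points, dtype=float)
    dst = np.asarray(dst_points, dtype=float)
    src_h = np.hstack([src, np.ones((len(src), 1))])  # N x 4 homogeneous
    # Solve src_h @ X = dst for the top three rows of the matrix.
    x, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    t = np.eye(4)
    t[:3, :] = x.T
    return t  # apply as: t @ [x, y, z, 1]
```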
The mixed reality display 18 may include a hand-tracking algorithm 42, which may be referred to as a hand-tracking module 42, for hand-gesture-based interactions through a mixed reality toolkit (MRTK) (shown as "MRTK2" in the Figures).
A hand-tracking algorithm can be utilized that automatically selects either sensor to accurately obtain tracking in a dynamic mixed reality scene. The algorithm can place a higher priority on the hand motion controller 16 because it may demonstrate better performance tracking the haptic glove (e.g., facing down) while in motion. For one suitable tracking algorithm, synchronization of the two coordinate systems can be done first so that hand-tracking data from the two different sensors can be aligned in a single coordinate system. To achieve this, the same method of coordinate system synchronization described above can be applied, using 12 control points sampled from both coordinate systems for estimating a 4 by 4 transformation matrix that maps the hand motion controller 16 sensor coordinates to the mixed reality display 18 coordinates with a minimum error (e.g., RMSE: 8.6 mm).
The hand-tracking performance may also be affected by illumination and by camera viewing angles, which are determined by the location of the depth sensor. The location of the hand motion controller 16 sensor may therefore be considered. The location of the hand motion controller 16 can be at the center of the mixed reality display 18 (i.e., while worn on the head).
The success rate of the grasping gesture tracking should be sufficient to achieve realistic haptic grasping feedback. This may include configuring the tracking such that any tracking failure frames in the grasping gesture occur in the middle of the motion (i.e., not near the end, where haptic force feedback is computed). This may also include configuring the system to activate the local finger tracking of the haptic glove to compute accurate force feedback once the user's hand has been successfully located at the desired grasping position. In this way, the global hand tracking of the haptic glove can be improved. The global hand-tracking module 44 can be designed to select either hand tracker 16, 18 based on the tracking status (i.e., failure or success) of each. The virtual hand position (i.e., palm center) of the haptic glove can be updated with the tracking information of the selected tracker. If both trackers happen to fail to track the haptic glove 22, the algorithm can hold the virtual hand at the last updated position.
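The per-frame selection logic described above might be sketched as follows; the `TrackerSample` structure and its fields are hypothetical stand-ins for the data reported by each sensor, not an actual sensor API.

```python
from typing import NamedTuple, Optional, Tuple

class TrackerSample(NamedTuple):
    tracked: bool                        # did this sensor see the glove this frame?
    pose: Optional[Tuple[float, float, float]]  # palm center in HMD coordinates

def select_hand_pose(motion_controller: TrackerSample,
                     hmd_tracker: TrackerSample,
                     last_pose: Tuple[float, float, float]):
    """Hybrid global hand tracking: prefer the hand motion controller
    (better at tracking a downward-facing glove in motion), fall back to
    the headset tracker, and hold the last updated pose if both fail."""
    if motion_controller.tracked:
        return motion_controller.pose
    if hmd_tracker.tracked:
        return hmd_tracker.pose
    return last_pose
```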
In one or more embodiments, two haptic rendering schemes can be developed to simulate realistic IVC needle insertion using the modified 3 DOF (degrees of freedom) haptic stylus 20 and grasping a virtual hand or arm using the exoskeleton glove 22, which can be an 11 DOF glove.
For the IVC needle insertion haptic rendering, a force-profile-based haptic rendering algorithm 46 can be created. The force-profile-based haptic rendering algorithm 46, which may also be referred to as module 46, can include two parts. First, a multilayer mesh-based force rendering framework can be created and optimized for simulating IVC needle insertion with adjustable parameters (e.g., stiffness and friction). Second, the optimum values of the adjustable parameters can be determined using force-profile-based data analysis and user feedback.
The multilayer mesh-based framework can use multiple 3D mesh objects (e.g., skin, vein, needle) at different layers to create different haptic feelings at the skin layer 26 and vein layer 32, respectively, as graphically illustrated in the Figures.
The resisting force of the haptic cylinder object 48 is computed only when an end point 50 of the virtual needle 28 is moved into the skin surface 26, which is detected by the position sensor of the first haptic device 20. The resisting force can be tuned by stiffness and friction. Once the haptic cylinder 48 is activated, the needle 28 is not able to break the collider of the cylinder 48 until the end of insertion. Insertion feedback forces can be computed using a virtual proxy model implemented in an open library 30 (e.g., the OpenHaptics library), which can include adjustments to provide haptic rendering parameters (e.g., stiffness, friction, pop-through).
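As one illustrative stand-in for such a force computation (a penalty-style sketch, not the OpenHaptics virtual proxy implementation itself), a resisting force with a spring term governed by the adjustable stiffness and a velocity-dependent friction term might be written as:

```python
import numpy as np

def needle_resisting_force(tip_pos, entry_point, needle_dir, velocity,
                           stiffness, friction):
    """Penalty-style resisting force once the needle end point has moved
    into the skin: a spring term along the insertion axis scaled by the
    adjustable stiffness, plus a velocity-dependent friction (drag) term
    opposing tip motion. Illustrative assumptions throughout."""
    needle_dir = needle_dir / np.linalg.norm(needle_dir)
    depth = np.dot(tip_pos - entry_point, needle_dir)  # penetration depth
    if depth <= 0.0:
        return np.zeros(3)              # tip has not entered the skin yet
    spring = -stiffness * depth * needle_dir   # resists deeper insertion
    drag = -friction * velocity                # opposes tip motion
    return spring + drag
```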
A second step can be determining the optimum values of haptic parameters when inserting into the skin 26 or vein 32. To objectively complete this process, force-profile-based data analysis can be conducted. A measured force profile can be used as the reference to be similarly reproduced by the multilayer mesh-based force rendering. This can include estimating the optimum value of stiffness that causes a realistic haptic feeling for the skin 26 and vein 32. Force profiling can be conducted using a high-precision force sensor attached to a real IVC needle during insertion into a manikin arm.
Two force profiles were recorded for the skin and vein (shown in the Figures).
A haptic glove rendering module 52 can also be implemented using open libraries 30 (e.g., OpenHaptics and the Dexmo SDK in Unity) to simulate grasping a virtual hand while a virtual needle is inserted. As mentioned above, global hand tracking can be combined with the haptic finger tracking function. Resisting forces (e.g., max 0.5 N) can be computed when the virtual fingers of the haptic glove touch the virtual hand surface for grasping. This force computation can be implemented using the virtual proxy model.
Unlike the visual variables (color, texture, size, shape, etc.), stiffness (hardness) generally determines the haptic variability of the skin and vein. However, a change in stiffness is invisible and completely unknown until the user feels it using a haptic device. Since embodiments include replacing the end part 23 of the haptic stylus 21 with a real IV needle 25, a discrimination threshold of haptic stiffness can be found when using a real needle in the context of fully immersive mixed reality. This may also be referred to as haptic perception data.
A haptic perception data input step 54 is shown in the Figures.
As discussed herein, the simulation can include variability of insertion conditions. The skin conditions can include one or more of color, texture, stiffness, friction, and the presence or absence of tattoos. The vein conditions can include one or more of size, shape, location depth, stiffness, and friction. Other conditions for the skin and vein can include one or more of dark skin, large veins, rolling veins, excess hair, geriatric, can palpate, tattoos, light skin, small vein, thick skin, cannot visualize vein, smooth skin, superficial veins, can visualize veins, cannot palpate, thin skin, and deeper veins.
To simulate vein size and/or depth, a skin deformation script can be created to raise the mesh vertices on the hand where the virtual vein 32 would be, as sketched below. This can cause a bulge over the vein 32, giving it an appearance of protruding from the hand, simulating a more prominent vein. A blue line can be added to the color map and the normal map over the vein area to further simulate vein protrusion. The albedo value of the model material can be adjusted to change the skin color. Exemplary skin colors which can be varied include white, tan, and black. Another factor can be the presence or absence of tattoos. This can include variables of three levels of difficulty for the tattoos: no tattoo (easy), pattern tattoo (medium), and anchor tattoo (hard). Applying the tattoos can include changing the color map to a hand texture with the tattoo overlaid. The many variables can be adjusted to combine into various overall difficulty levels, and such variability of the difficulty level can be useful for mirroring the variability of real-world scenarios. In one or more embodiments, this can include manipulating three variables to provide enough variability to create sufficient scenarios.
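The skin deformation described above might be sketched geometrically as follows; the function name, the y-up convention, and the cosine falloff are illustrative assumptions rather than the actual script.

```python
import numpy as np

def raise_vein_bulge(vertices, vein_start, vein_end, radius, height):
    """Raise mesh vertices near the vein path to create a bulge over the
    vein, simulating a more prominent vein. `vertices` is an N x 3 array;
    the y axis is assumed to point out of the skin surface."""
    seg = vein_end - vein_start
    seg_len2 = np.dot(seg, seg)
    out = vertices.copy()
    for i, v in enumerate(vertices):
        # Distance from the vertex to the vein segment.
        t = np.clip(np.dot(v - vein_start, seg) / seg_len2, 0.0, 1.0)
        d = np.linalg.norm(v - (vein_start + t * seg))
        if d < radius:
            # Smooth falloff: full height on the vein line, zero at the edge.
            out[i, 1] += height * 0.5 * (1.0 + np.cos(np.pi * d / radius))
    return out
```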
Though aspects of one or more methods are disclosed elsewhere herein, additional details of one or more methods are disclosed here. As mentioned above, embodiments of the system 10 disclosed herein offer the ability for a step 56/method 56 (shown in the Figures) of enhancing learning and training for intravenous catheter insertion.
A method utilizing the system disclosed herein can include a first step of providing the system with a first set of insertion conditions. The user can then utilize the system to simulate inserting the virtual needle into an arm of the virtual patient with the first set of insertion conditions. The system can then be adjusted to a second set of insertion conditions. The user can then further simulate inserting the virtual needle into the arm of the virtual patient with the second set of insertion conditions. These steps can be repeated for still further insertion conditions.
In one or more embodiments, in order to assist with maintaining stable rendering, the system can be devoid of deformable motion.
The hand-tracking performance was analyzed based on the location of a depth sensor. An experiment was performed to find the best location of a Leap Motion sensor for two mounting scenarios: on a desk versus on the head of a user wearing a HoloLens 2 device. Success rates of tracking a haptic glove were compared with the glove being worn and tested with different hand postures and gestures. The different hand postures and gestures included facing up, facing down, and grasping a virtual hand. The single sensors were analyzed, as well as the combination according to the details disclosed herein. The success rate of hand tracking was computed using the metric SR=t/n, where SR denotes the success rate of hand tracking, t denotes the number of frames during which the glove is tracked, and n denotes the total number of frames. The results are shown in Table 1 below. The results suggest the center of the mixed-reality (MR) headset (on the head) was the best location.
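The metric can be transcribed directly; the frame counts in the example below are placeholders, not experimental data.

```python
def tracking_success_rate(tracked_frames: int, total_frames: int) -> float:
    """SR = t / n: the fraction of frames in which the glove was tracked."""
    return tracked_frames / total_frames

# Example: a glove tracked in 540 of 600 frames gives SR = 0.9.
print(tracking_success_rate(540, 600))
```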
An evaluation experiment was conducted to measure the usability of the HMR-IV needle simulation system disclosed herein with human subjects, from novices to experts, for an IVC needle insertion task using two hands. Twenty participants took part in the experiment. Nine of them were experts who had formal training and had performed more than five successful IV insertions with real patients, while eleven were novices who had no formal IV training or insertions.
The HMR-IV insertion simulation system was developed on a 64-bit Windows desktop PC (Intel® Core™ i7-9900K CPU from Intel, Santa Clara, CA, USA; 32 GB RAM; NVIDIA RTX 2070), with a Geomagic Touch haptic device (right hand), a Dexmo haptic glove (left hand), and a Microsoft HoloLens 2. The participants were given time to sufficiently familiarize themselves with the IVC needle insertion system after learning the usage protocol. For novice participants, a short tutorial about practicing IVC needle insertion (grip, insertion angles, a pop feeling in the vein, etc.) was also provided.
In the main experiment, participants were asked to repeat an IV needle insertion 64 times (trials) with different variabilities. For each trial, variability conditions (vein location, vein size (diameter), and haptic skin and vein stiffness, each with 2 distinguishable levels) were randomized. Participants were also asked to use earplugs to block external noise. For each trial, instruction texts (start, end, actions to take, etc.) were sequentially displayed through the HoloLens display for participants to complete the experiment without assistance. To guide the target touch positions with the haptic devices, visual graphic feedback was provided. For the haptic glove, a semitransparent hand image overlaid on the haptic glove turned from red to green, and for the IV needle, a target insertion area was highlighted by an oval. For each trial, participants were asked to wait for 5 seconds without motion when they were confident about a successful insertion into the virtual vein. It took on average 35 min for each participant to complete the experiment.
To measure the usability of the IV needle insertion simulation system, both quantitative and qualitative data analyses were conducted. For quantitative data analysis of success rates of needle insertion, the insertion angles (5 to 30 degrees), task completion time (start and end), and distance (the needle tip end to the vein center) were measured by functions developed in the system. A success rate was calculated by a formula: the number of successful insertions divided by total attempts. The requirements of a successful insertion, defined by an expert using evidence-based practices set forth by the Infusion Nurses Society, include (1) an insertion angle between 5 to 30 degrees (ideally 20 degrees) and (2) the needle tip staying inside the vein once inserted. For qualitative data analysis, subjective responses were analyzed to measure the usability of the system.
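A direct transcription of the stated success criteria and success-rate formula might look as follows; the trial-tuple format is an assumption for illustration only.

```python
def insertion_successful(angle_deg: float, tip_inside_vein: bool) -> bool:
    """Success per the stated criteria: insertion angle between 5 and 30
    degrees and the needle tip staying inside the vein once inserted."""
    return 5.0 <= angle_deg <= 30.0 and tip_inside_vein

def success_rate(trials):
    """Number of successful insertions divided by total attempts;
    `trials` is an iterable of (angle_deg, tip_inside_vein) tuples."""
    results = [insertion_successful(a, inside) for a, inside in trials]
    return sum(results) / len(results)

# Example: two good insertions and one too-steep attempt -> 2/3.
print(success_rate([(20.0, True), (12.0, True), (40.0, True)]))
```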
To verify the real-time computing performance of the developed system, the update rate of the entire system was measured while the haptic system was running at an update rate over 1 kHz, which can be a minimum requirement for real-time haptic rendering. For the update rate measurement, total frames were divided by the elapsed time from the start of needle insertion on the skin to the end. This was repeated 10 times and then averaged. The result was 56 frames per second (0.018 s per frame), which is sufficiently fast to conduct the bimanual IV needle insertion procedure in real time. The system calibration described above was conducted to compute the transformation matrices (shown in the Figures).
The quantitative results were based on measurements (success rate, completion time, distance from the needle tip to the vein center, insertion angle) automatically recorded in the system during the experiment. The data between two groups (novice and expert) were compared and further analyzed under a statistical analysis using a t-test to see if there was any significant difference between the two groups.
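Such a comparison can be reproduced with a standard independent two-sample t-test, e.g., via SciPy; the sample values below are placeholders for illustration, not the experimental data.

```python
from scipy import stats

# Comparing novice vs. expert measurements (e.g., completion times in
# seconds); these numbers are placeholders, not recorded results.
novice_times = [42.1, 38.5, 51.0, 47.3, 44.8]
expert_times = [30.2, 28.7, 35.1, 31.9, 29.4]

# Independent two-sample t-test, as used to check for a significant
# difference between the two groups.
t_stat, p_value = stats.ttest_ind(novice_times, expert_times)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```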
In light of the foregoing, the present invention advances the art by providing improvements for enhancing learning and training for intravenous catheter insertion. While particular embodiments of the invention are disclosed herein, the invention is not limited thereto or thereby inasmuch as variations will be readily appreciated by those of ordinary skill in the art. The scope of the invention shall be appreciated from the claims that follow.
This application claims the benefit of U.S. Provisional Application No. 63/442,218, filed on Jan. 31, 2023, which is incorporated herein by reference.
This invention was made with government support under grant/contract 2118380 awarded by National Science Foundation. The government has certain rights in the invention.