Embodiments of the present disclosure relate to intravenous catheters. More specifically, embodiments of the present disclosure relate to ultrasound aided positioning of an intravenous catheter.
Intravenous catheters may be used to administer various medications into the blood stream of a patient. In particular, peripheral intravenous (IV) therapy is a common procedure performed in modern hospitals by medical personnel. However, various catheter parameters affect the blood flow velocity rate of the patient, thereby complicating the selection of an appropriately sized catheter. Additionally, in many cases it is difficult for medical personnel to locate and select a suitable blood vessel for placement of the catheter. Poor insertion techniques and improper placement of an intravenous catheter may result in any of premature catheter failure, medical complications, and treatment delays.
The positioning of the catheter within the vein of the patient also affects the blood flow velocity rate. As such, it may be difficult to achieve a proper hemodilution ratio for the patient for effective administration of the medication, which may lead to various complications. For example, a hemodilution ratio that is high leads to greater dilution of medication in the vascular system of the patient. Conversely, a hemodilution ratio that is too low may prevent proper integration or dilution of the medication into the patient's blood stream.
About one third of patients are recognized as having ‘difficult access’ in terms of intravenous insertion. Further, a lack of necessary skill among medical personnel often results in multiple insertion attempts, and training for proper insertion techniques is time consuming, expensive, and challenged by staff turnover.
Embodiments of the present disclosure solve the above-mentioned problems by providing a method and system for monitoring and positioning a catheter within a vein of a patient using ultrasound technology. Further still, embodiments of the present disclosure provide artificial intelligence for identifying anatomical structures within real-time ultrasound data, as well as training, guiding, and instructing medical personnel through an intravenous insertion procedure.
In some aspects, the techniques described herein relate to a method of inserting an intravenous catheter into a blood vessel of a patient, the method including receiving a first set of ultrasound data associated with the blood vessel diameter from an ultrasound device prior to an insertion of the intravenous catheter, estimating a first blood flow velocity rate associated with the blood vessel based on the first set of ultrasound data, receiving a second set of ultrasound data associated with the blood vessel depth from the ultrasound device before insertion of the intravenous catheter, identifying a plurality of positioning parameters of the intravenous catheter based at least in part on the second set of ultrasound data, the plurality of positioning parameters including an angle of insertion of the intravenous catheter and a depth of the vein.
In some aspects, the techniques described herein relate to a method, wherein the second set of ultrasound data includes a cross-section image or diameter of the blood vessel and the intravenous catheter, and a side image of the blood vessel and the intravenous catheter.
In some aspects, the techniques described herein relate to a method, further including automatically detecting one or more edges of the blood vessel within the first set of ultrasound data.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages of the present disclosure will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.
Embodiments of the present disclosure are described in detail below with reference to the attached drawing figures.
The drawing figures do not limit the present disclosure to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure.
The following detailed description references the accompanying drawings that illustrate specific embodiments in which the present disclosure can be practiced. The embodiments are intended to describe aspects of the present disclosure in sufficient detail to enable those skilled in the art to practice the present disclosure. Other embodiments can be utilized and changes can be made without departing from the scope of the present disclosure. The following detailed description is, therefore, not to be taken in a limiting sense. The scope of the present disclosure is defined only by the appended claims, along with the full scope of equivalents to which such claims are entitled.
In this description, references to “one embodiment,” “an embodiment,” or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment,” “an embodiment,” or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the technology can include a variety of combinations and/or integrations of the embodiments described herein.
Embodiments of the present disclosure provide improvements to intravenous catheter insertion by employing ultrasound technology. Real-time ultrasound data may be utilized to provide a visual display to medical personnel to aid in vein selection, catheter selection, catheter positioning, and other logistics and operational selections associated with intravenous catheter insertion. Further, embodiments are contemplated in which machine learning and other forms of artificial intelligence such as computer vision may be implemented to provide any of identification of anatomical structures, instruct medical personnel through an insertion procedure, select a suitable blood vessel and catheter, as well as other functions as will be described in further detail below.
In some embodiments, the system 100 further comprises a device 120 coupled to the ultrasound device 110. In some embodiments, the device 120 comprises at least one processor such as processor 122, as shown, for processing ultrasound data collected by the ultrasound device 110. In some embodiments, the device 120 comprises a controller, computer, or other form of control unit or similar device. In some embodiments, the device 120 further comprises a data store 124 configured to store data associated with the device 120. For example, in some embodiments, the data store 124 stores ultrasound data captured by the ultrasound device 110.
In some embodiments, the device 120 may be communicatively coupled to a network 126 such as a wired or wireless network. For example, in some embodiments, the device 120 may be coupled to a cloud data store. Accordingly, the device 120 may be configured to transmit at least a portion of data collected by the ultrasound device 110 or other devices to the cloud data store via the network 126 such that the data may be stored remotely and/or transmitted elsewhere. Additionally, it should be understood that the communication connection with the network 126 may be bidirectional, for example, such that data may be received by the device 120 from the network 126.
In some embodiments, a machine learning model may be included on the device 120, for example, the device 120 may include a machine learning algorithm stored on the data store 124. Said machine learning model may be trained based on historical ultrasound data and may be configured for any combination of identifying anatomical structures within real-time ultrasound data, instructing and/or guiding medical personnel through a catheter insertion operation, or grading medical personnel on insertion operation technique.
In some embodiments, a user device 128 may be included. The user device 128 may comprise any of a mobile device such as a smart phone, tablet, laptop computer, or another user device such as a desktop computer. In some embodiments, the user device 128 includes a processor, data store, and display configured to display a user interface. In some embodiments, a software application may be executed on the user device 128. For example, the user device 128 may store and execute an application for aiding in a catheter insertion procedure. In some embodiments, the user device 128 may be communicatively coupled to the device 120 via any of a wired connection or a wireless connection such as network 126.
In some embodiments, the user device 128 may be configured to automatically and/or manually record and store information associated with the catheter insertion procedure. For example, the user device 128 may be configured for hands-free operation such that a medical personnel may operate the user device 128 and provide information to the user device 128 by uttering verbal phrases. As one example, the medical personnel may verbally read out outputs from the ultrasound device such as any of a vein diameter, hemodilution ratio, catheter to vein ratio, blood flow velocity in centimeters per second, physician order for an infusion flow rate of medication to be delivered, distance from the skin surface to the top of the vein, angle of insertion, and IV catheter gauge, as well as other parameters associated with the catheter insertion procedure such that the user device 128 can record and store the parameters. Here, the user device 128 may be configured with voice recognition to identify verbally uttered phrases from the medical personnel and record ultrasound and other insertion data within a data store. Said hands-free operation to record parameters allows the medical personnel to continue the insertion procedure while maintaining aseptic no touch technique (ANTT) to thereby reduce touch contamination during the insertion procedure. Alternatively, or additionally, in some embodiments, one or more insertion parameters such as, number of insertion attempts, angle of insertion, diameter, depth, blood flow velocity, vein purchase, and final tip position may be automatically detected and recorded based on real-time ultrasound data.
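By way of non-limiting illustration, the sketch below shows one possible way recognized utterances might be parsed into stored parameters; the phrase patterns, parameter names, and units are assumptions made for illustration rather than a required vocabulary.

```python
import re

# Illustrative patterns for phrases a medical personnel might utter; the exact
# vocabulary and units are assumptions for this sketch.
PHRASE_PATTERNS = {
    "vein_diameter_mm": re.compile(r"vein diameter (\d+(?:\.\d+)?) millimeters?"),
    "vein_depth_mm": re.compile(r"vein depth (\d+(?:\.\d+)?) millimeters?"),
    "blood_flow_cm_s": re.compile(r"blood flow (\d+(?:\.\d+)?) centimeters? per second"),
    "insertion_angle_deg": re.compile(r"angle of insertion (\d+(?:\.\d+)?) degrees?"),
    "catheter_gauge": re.compile(r"catheter gauge (\d+)"),
}

def parse_uttered_phrase(transcript: str) -> dict:
    """Map a speech-recognition transcript to parameter name/value pairs."""
    record = {}
    for name, pattern in PHRASE_PATTERNS.items():
        match = pattern.search(transcript.lower())
        if match:
            record[name] = float(match.group(1))
    return record

# Example: parameters spoken while maintaining aseptic no touch technique.
print(parse_uttered_phrase("Vein diameter 3.4 millimeters, angle of insertion 15 degrees"))
# {'vein_diameter_mm': 3.4, 'insertion_angle_deg': 15.0}
```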
In some embodiments, an ultrasound application may be included on the user device 128 to automatically receive and provide parameters and other information associated with the insertion procedure. For example, the user device 128 may receive a distance from the skin surface to the surface of the selected vein and use the distance value as an input to calculate a recommended catheter length based on a selected angle of insertion. In some embodiments, one or more trigonometric functions may be used to determine a recommended catheter length. Further, in some embodiments, the user device 128 may be configured to automatically select a particular catheter device from a catheter data store based on one or more input parameters.
Alternatively, or additionally, in some embodiments, the user device 128 may be configured to automatically update and store parameters based on a communication connection with the device 120 and ultrasound device 110. For example, a real-time image stream of ultrasound data may be received from the ultrasound device 110 and the user device 128 or device 120 may be configured to analyze the real-time ultrasound data to automatically determine one or more parameters. Said one or more parameters are then stored within a data store of the respective device and/or transmitted to the network 126 for remote cloud storage. Further still, it should be understood that action may be taken based at least in part on the one or more parameters. For example, embodiments are contemplated in which a particular catheter device is selected based on the one or more parameters, such as distance from the surface of the skin to the vein and suggested catheter length, and the selected catheter device is automatically ordered.
In some embodiments, a plurality of instructions may be generated and provided to a medical personnel through the user device 128 or the device 120 to guide the medical personnel through the ultrasound and insertion procedure. In a particular example, a medical personnel may first be instructed to position a probe of the ultrasound device 110 on a forearm of a patient such that a potential insertion site may be identified within the patient's forearm. Subsequent instructions may be provided to reposition the ultrasound device 110, for example, to locate another potential insertion site or to provide a better image of the first potential insertion site. Further, in some embodiments, the medical personnel may be instructed to move the ultrasound device 110 to another portion of the patient's body responsive to determining that one or more potential insertion sites in the patient's forearm are below a predetermined insertion score threshold. For example, the medical personnel may be instructed to place the ultrasound device 110 on the patient's other arm.
In some embodiments, the ultrasound 200 may be generated for display on a display of the user device 128 or the device 120. For example, the ultrasound 200 may be viewed by a medical personnel in real-time prior to and during a catheter insertion procedure. Further, in some embodiments, the ultrasound images may be augmented with additional information. For example, one or more anatomical structures may be identified within the ultrasound image and indicated via augmentation of the display.
In some embodiments, a catheter to vein ratio may be deduced or estimated from the cross-sectional ultrasound image 202. For example, the catheter to vein ratio may include a comparison of the catheter diameter to the inner vein diameter of the vein. In some embodiments, a specific type of catheter 208 may be selected to achieve a suitable catheter to vein ratio.
In some embodiments, the side-view ultrasound image 204 shows side view visualizations of the vein 206 and the catheter 208. In some embodiments, the side-view ultrasound image 204 may visually indicate an angle of insertion of the catheter 208. In some embodiments, the cross-sectional ultrasound image 202 and the side-view ultrasound image 204 may be received and displayed in real-time to aid a user in inserting and positioning the catheter 208. Additionally, in some embodiments, markings may be included showing a measurement scale adjacent to the cross-sectional ultrasound image 202 and the side-view ultrasound image 204. For example, incremental length measurements may be included on a side of the ultrasound 200, as shown.
In some embodiments, any of the insertion parameters described above may be suggested based on parameters of the vein 206. For example, the length of the catheter, the angle of the catheter, and the depth of the catheter into the vein may be suggested based on the size of the vein and a blood flow velocity rate of the vein. In some embodiments, the parameters may be determined to achieve a suggested hemodilution ratio for proper administration of a medication into the patient's blood stream.
In some embodiments, a length for the catheter may be selected based on a depth of vein and angle of insertion for the catheter using trigonometric functions. For example, the Pythagorean theorem may be employed to calculate a suitable length of the catheter based on the depth of center for the vein and the travel across the length of the vein based on the angle of insertion.
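By way of non-limiting illustration, the following sketch shows one possible form of such a trigonometric calculation, assuming the path from the skin entry point to the top of the vein equals the vein depth divided by the sine of the insertion angle and that roughly two thirds of the catheter dwells in the vein; the function name and default fraction are illustrative assumptions.

```python
import math

def recommended_catheter_length_cm(vein_depth_cm: float,
                                   insertion_angle_deg: float,
                                   fraction_in_vein: float = 2 / 3) -> float:
    """Suggest a catheter length from vein depth and angle of insertion.

    The distance travelled from the skin surface to the top of the vein is
    treated as the hypotenuse opposite the vein depth (depth / sin(angle));
    the remainder of the catheter is assumed to dwell inside the vein.
    """
    skin_to_vein_cm = vein_depth_cm / math.sin(math.radians(insertion_angle_deg))
    return skin_to_vein_cm / (1.0 - fraction_in_vein)

# Example: a vein 0.6 cm deep approached at about 15 degrees.
print(round(recommended_catheter_length_cm(0.6, 15.0), 2))  # ≈ 6.95 cm
```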
In some embodiments, the user interface 400 includes indicators of one or more preset parameters 402. For example, an angle of insertion 404 and a catheter length 406 may be listed on the user interface 400, as shown. The angle of insertion 404 may refer to the actual angle or the recommended target angle of the catheter 208 within the vein 206. In some embodiments, the angle of insertion 404 may be the same as the angle α shown in
In some embodiments, the user interface 400 further comprises indications of one or more calculated parameters 408. For example, the one or more calculated parameters 408 may include any of a vein depth 410, a catheter length in vein 412, an entry point from center of vein 414, and a vein diameter 415, each of which may be listed on the user interface 400, as shown. In some embodiments, the one or more calculated parameters 408 may be calculated automatically, for example, based at least in part on data from the ultrasound 200. In some embodiments, a suggested catheter length in vein is approximately two thirds of the length of the catheter.
In some embodiments, additional mark-up may be included in the user interface 400 overlaid onto the cross-sectional ultrasound image 202 and the side-view ultrasound image 204, as shown. In some embodiments, the additional mark-up may highlight the angle of the catheter 208 and the center of the vein 206, as shown.
In some embodiments, the user interface 500 may include a variety of selectable pages and actions. For example, the user interface 500 may include an add user selection option 502, a user management selection option 504, a patient management selection option 506, an analytics selection option 508, and a logout selection option 510. In some such embodiments, each of the selection options 502-510 may be associated with a different user interface page or action. For example, the add user selection option 502 may be used to access an add user interface page operable to add a new user to a user database of the application. Further, the user management selection option 504 may be used to edit information within the user database such as user parameters and/or to remove user information from the user database.
In some embodiments, the patient management selection option 506 may be used to access a patient management interface page operable to access and edit patient data stored in a patient data store. Embodiments are contemplated in which patient data may be automatically stored within the patient data store. For example, ultrasound data, as described above may be automatically recorded and stored within the patient data store and associated with a patient profile of the patient. Further, in some embodiments, the analytics selection option 508 may be used to access an analytics interface page operable to view and monitor analytics data associated with a plurality of patients.
In some embodiments, the user interface 500 includes a user list 512, as shown. For example, the user list 512 may be accessed as part of the user management page accessed via the user management selection option 504. The user list 512 may include various user information. For example, the user list 512 may include a set of user information relating to a respective plurality of users, such as, a user ID 514, a tenant name, a facility name 516, a username 518, a first name 520, a last name 522, an email address 524, a phone number 526, and one or more user roles 528. In some embodiments, the user information may be collected when a new user is added. Further, in some embodiments, the user interface 500 allows the user information to be added and updated after a user account is established.
At step 602, a first set of ultrasound data is received prior to insertion of a catheter 208. In some embodiments, the first set of ultrasound data may include ultrasound imagery relating to one or more veins of the patient. Accordingly, the ultrasound data may be used to select a vein for insertion of the catheter. In some embodiments, one or more edges of the vein may be detected based on the received ultrasound data. At step 604, a blood flow velocity rate of the vein may be estimated based at least in part on the first set of ultrasound data. The blood flow velocity rate may be an initial volumetric blood flow velocity rate calculated prior to insertion of the catheter. In some embodiments, a volumetric blood flow velocity rate may be calculated by multiplying the velocity of the blood flow by the cross-sectional area of the vein, which is a function of π and the vein diameter.
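By way of non-limiting illustration, the sketch below estimates volumetric flow as the mean velocity multiplied by the circular cross-sectional area implied by the vein diameter; the unit choices and function name are assumptions made for illustration.

```python
import math

def volumetric_flow_ml_min(mean_velocity_cm_s: float, vein_diameter_cm: float) -> float:
    """Estimate volumetric blood flow as velocity times cross-sectional area.

    The vein cross-section is approximated as a circle, pi * (diameter / 2) ** 2,
    and cm^3/s is converted to mL/min.
    """
    area_cm2 = math.pi * (vein_diameter_cm / 2.0) ** 2
    return mean_velocity_cm_s * area_cm2 * 60.0  # 1 cm^3 == 1 mL

# Example: 5 cm/s mean velocity in a 0.4 cm diameter vein.
print(round(volumetric_flow_ml_min(5.0, 0.4), 1))  # ≈ 37.7 mL/min
```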
In some embodiments, any combination of the first set of ultrasound data and the estimated blood flow velocity rate may be used to select an appropriately sized catheter for the vein. For example, in some embodiments, a size of the vein may be identified from the first set of ultrasound data. The diameter of the vein may be used to select an appropriately sized catheter. For example, in some embodiments, the catheter may be selected to achieve a desired catheter to vein ratio, as described above. For example, in some embodiments, a catheter diameter may be selected to complement a diameter of the vein. In some embodiments, a suggested catheter length may be calculated based on the ultrasound data of the vein. In some embodiments, the suggested catheter parameters, such as catheter length and catheter gauge size, may be retrieved from a catheter/vein chart based on one or more vein parameters.
In some embodiments, one or more suggested positioning parameters or other catheter parameters associated with the intravenous catheter may be suggested based at least in part on the first set of ultrasound data. In some embodiments, the one or more suggested positioning parameters may be displayed to a user within a user interface such as the user interface 400 or user interface 500 described above.
At step 606, a second set of ultrasound data is received after insertion of the catheter 208 into the vein 206. At step 608, one or more positioning parameters of the catheter 208 are identified. In some embodiments, the one or more positioning parameters may be identified based at least in part on the second set of ultrasound data including any of an angle of insertion, depth in vein, accuracy to center of vein, and a number of other positioning parameters relating to the insertion of the catheter 208 within the vein 206.
At step 610, an adjusted blood flow velocity rate is estimated based at least in part on the second set of ultrasound data after insertion of the catheter 208 within the vein 206. In some embodiments, the adjusted blood flow velocity rate comprises an infusion flow rate associated with the blood flow velocity rate after insertion of the catheter. Accordingly, the infusion flow rate may be calculated similarly to the volumetric blood flow velocity rate described above. At step 612, the adjusted blood flow velocity rate is compared to the initial blood flow velocity rate. In some embodiments, the blood flow velocity rates are compared to monitor the effects of the inserted catheter on the blood flow velocity rate. For example, in some cases, the insertion of the catheter impedes the blood flow velocity rate. However, it may be desirable that the effects of the catheter insertion on the blood flow velocity rate are minimized. Accordingly, comparing the blood flow velocity rates allows for adjustment of the catheter positioning to thereby optimize the positioning of the catheter such that the blood flow velocity rate is not significantly impeded.
At step 614, a hemodilution ratio is estimated. In some embodiments, the hemodilution ratio may be estimated based at least in part on the ultrasound data and the estimated blood flow velocity rate. In some embodiments, the hemodilution ratio may be calculated by dividing the initial volumetric blood flow velocity rate by the infusion blood flow velocity rate. Accordingly, in some embodiments, the hemodilution ratio may comprise a metric comparing the blood flow velocity rates before and after insertion of the catheter. In some embodiments, the hemodilution ratio represents the dilution of medication into the blood stream. In some embodiments, a 3:1 ratio is desirable.
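By way of non-limiting illustration, one possible form of this calculation and a comparison against a 3:1 target is sketched below; the helper names and example values are illustrative assumptions.

```python
def hemodilution_ratio(native_flow_ml_min: float, infusion_flow_ml_min: float) -> float:
    """Divide the pre-insertion volumetric blood flow by the infusion flow rate."""
    return native_flow_ml_min / infusion_flow_ml_min

def meets_target_ratio(ratio: float, target: float = 3.0) -> bool:
    """Check the estimated ratio against a desired ratio such as 3:1."""
    return ratio >= target

flow_before = 37.7      # mL/min, estimated from the first set of ultrasound data
infusion_rate = 10.0    # mL/min, ordered infusion flow rate (illustrative)
ratio = hemodilution_ratio(flow_before, infusion_rate)
print(round(ratio, 2), meets_target_ratio(ratio))  # 3.77 True
```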
In some embodiments, various parameters may be monitored associated with insertion of the catheter. For example, a catheter to vein ratio may be determined based on comparing the catheter gauge and selected vein diameter. For example, in some embodiments, a catheter to vein ratio of 40% or less is desirable. Further, a suggested catheter length may be determined based on vein depth and angle of insertion of the catheter. In some embodiments, it may be desirable that 65% of the catheter is inserted into the vein.
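By way of non-limiting illustration, the checks described above might be expressed as in the following sketch; the 40% and 65% figures come from this description, while the function names, the use of outer catheter diameter, and the example values are illustrative assumptions.

```python
def catheter_to_vein_ratio(catheter_od_mm: float, vein_diameter_mm: float) -> float:
    """Ratio of catheter outer diameter to vein inner diameter."""
    return catheter_od_mm / vein_diameter_mm

def ratio_is_acceptable(ratio: float, max_ratio: float = 0.40) -> bool:
    """A catheter to vein ratio of 40% or less is treated as desirable."""
    return ratio <= max_ratio

def vein_purchase_fraction(length_in_vein_cm: float, catheter_length_cm: float) -> float:
    """Fraction of the catheter dwelling inside the vein (target about 65%)."""
    return length_in_vein_cm / catheter_length_cm

# Example: a 20-gauge catheter (about 1.1 mm OD) in a 3.5 mm vein, with 3.2 cm of
# a 4.8 cm catheter inside the vein; the specific values are illustrative.
ratio = catheter_to_vein_ratio(1.1, 3.5)
print(round(ratio, 2), ratio_is_acceptable(ratio))          # 0.31 True
print(round(vein_purchase_fraction(3.2, 4.8), 2) >= 0.65)   # True
```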
In some embodiments, the ultrasound data may be stored and used in a simulation training program. For example, a catheter insertion simulation may be generated using historical ultrasound data. Accordingly, the catheter insertion simulation may be used to train medical personnel in catheter insertion.
The data system 700 may comprise health system data store 702, as shown. The health system data store 702 may include health system data from any of a plurality of health system sources such as historical ultrasound data from ultrasound providers, medical patient data, policies and procedures from health systems, user data, and other suitable health system data.
The health system data store 702 may be communicatively coupled to a data model 704, as shown. For example, the data model 704 may be generated based at least in part on the health system data from the health system data store 702. In some embodiments, the data model 704 includes training data for training one or more machine learning models. In some embodiments, the data model 704 comprises any of dwell time analysis data, medical complication data, care and maintenance data, as well as medical personnel productivity data.
At least one machine learning model such as machine learning model 706 may be included. For example, the machine learning model 706 may include a computer vision machine learning model configured to identify one or more anatomical structures within a set of real-time ultrasound data. For example, the machine learning model 706 may be trained with training data from the data model 704 or from data received directly from the health system data store 702. Alternatively, or additionally, in some embodiments, the machine learning model 706 may comprise any combination of an analysis and recommendation model, a training model, a guidance model, and a tracking model.
In some embodiments, the machine learning model 706 may be communicatively coupled to one or more research and standards data stores 708. The research and standards data stores may store a plurality of information relating to medical research and medical standards such as clinical research data, medical evidence, and medical information from other data sources.
The machine learning model 706 may output any combination of analysis 710 configured to analyze and provide recommendations in real-time to medical personnel, training 712 configured to train medical personnel, guidance 714 configured to provide real-time instructions and guidance to medical personnel, and tracking 716 configured to monitor medical procedures and provide real-time care and maintenance tracking.
In some embodiments, the machine learning model is configured to determine one or more parameters for the insertion procedure. For example, one or more suggested parameters for the catheter may be selected based on the real-time ultrasound data. Further, the machine learning model may be configured to determine a suitable angle of insertion and position for insertion of the catheter based on the ultrasound data. Further still, in some embodiments, the machine learning model is configured to select a suitable vein for insertion based on the real-time ultrasound data.
In some embodiments, the machine learning model comprises a generative artificial intelligence model configured to generate training content based on collected ultrasound data. For example, the machine learning model may analyze historical training of medical personnel and medical personnel performance in order to provide medical training content optimized for successful performance. As a specific example, if a particular image or instruction was presented to a plurality of medical personnel during training and those medical personnel received a high performance score, that image or instruction may be selected for inclusion in subsequent training content. Additionally, the machine learning model utilizing generative artificial intelligence may be operable to access the one or more research and standards data stores 708 to check and improve training content based on information in the one or more research and standards data stores 708. In some embodiments, a persistent set of medical personnel training content may be stored and continuously updated based on acquired medical evidence and ultrasound data over time. Accordingly, training content may improve over time and methodology may be updated and improved over time to enhance and standardize catheter insertion procedures.
In some embodiments, a suggested target slope may be generated for display and overlaid onto the real-time ultrasound image. For example, the suggested target slope may be selected based at least in part on one or more parameters associated with the insertion site and may achieve a low angle of insertion to improve blood flow after insertion. Accordingly, the medical personnel is able to visually compare the target slope angle and the actual angle of insertion of the catheter needle during insertion.
The user interface 800A includes a real-time ultrasound image 802, as shown. For example, the real-time ultrasound image 802 may include an ultrasound image captured by the ultrasound device 110 of a potential catheter insertion site of a patient. As such, the real-time ultrasound image 802 may be captured and displayed prior to selection and/or insertion of the catheter 102.
In some embodiments, one or more anatomical structures are identified within the real-time ultrasound image 802. For example, a computer vision algorithm may be used to identify anatomical structures within the real-time ultrasound image 802 and provide an indication to medical personnel to locate and indicate said anatomical structures. The anatomical structures may include any combination of one or more veins 804, one or more arteries 806 (or other blood vessels), and one or more nerve bundles 808. Further, in some embodiments, other anatomical structures may be identified such as any of valves and muscle tissue. The identification of the anatomical structures may aid in guiding the medical personnel through successful vein selection for insertion.
In some embodiments, the anatomical structures may be highlighted or otherwise visually indicated through augmentation of the ultrasound image. For example, in some embodiments, each type of anatomical structure may be highlighted with a different color within the user interface 800A such that the medical personnel is able to distinguish between structures and other objects present in the ultrasound image. Further, in some embodiments, the information indicative of the identified anatomical structures from the real-time ultrasound image 802 may be transmitted to one or more other machine learning models for automatic selection and/or suggestion of a particular vein for insertion. The computer vision algorithm may be further configured to determine one or more parameters associated with the identified anatomical structures and the intravenous catheter such as, for example, diameter, depth, blood flow velocity, angle of insertion, vein purchase, and catheter tip location.
In some embodiments, additional information may be included within the user interface 800A. For example, in some embodiments, the user interface 800A provides real-time instructions 810 configured to guide a medical personnel through an intravenous catheter insertion procedure. As an example, the real-time instructions 810 may include a first instruction for guiding the medical personnel through properly positioning a probe of the ultrasound device 110 and a second instruction for subsequent repositioning of the probe. The real-time instructions 810 may be provided and updated by the user interface 800A in real-time while the medical personnel prepares for and performs the insertion procedure.
Additionally, in some embodiments, the user interface 800A further includes one or more real-time parameters 812. For example, the real-time parameters may include parameters measured, calculated, or estimated from the ultrasound data such as a vein diameter, vein pressure, and vein depth of a selected vein. Further, in some embodiments, the user interface 800A comprises one or more interface actuators such as interface actuator 814 for allowing the medical personnel or other users to interact with the user interface 800A. For example, the interface actuator 814 may comprise a next interface button configured to proceed to a subsequent user interface or set of instructions. Here, the user may select the next button once the listed instructions have been completed to proceed with further guidance from the user interface 800A.
In some embodiments, the user interface 800B includes a lengthwise view 820 of a selected vein. Accordingly, the medical personnel may be able to visualize an angle and positioning of the vein within the body of the patient. As such, the lengthwise view 820 may be utilized to assess the suitability of the vein for intravenous catheter insertion.
Similar to user interface 800A, the user interface 800B may include real-time instructions 810, for example, instructions for positioning the ultrasound probe, ensuring a suitable view of the vein, and initializing a pulsed Doppler signal transmission. Accordingly, in some embodiments, pulsed Doppler signal data 822 may be included within the user interface 800B such that the medical personnel is able to simultaneously view the lengthwise view 820 of the selected vein and the pulsed Doppler signal data 822.
Additionally, the user interface 800B may include one or more real-time parameters such as the vein diameter and vein depth. In some embodiments, the user interface 800B includes a blood flow rate 824 and a hemodilution rate 826. The blood flow rate 824 and hemodilution rate 826 may be determined based at least in part on the real-time ultrasound data using any suitable method as described above.
In some embodiments, a suggested angle of insertion 828 for insertion of the catheter is included, as shown. In some embodiments, the angle of insertion may be determined based at least in part on an angle of the vein. For example, a default angle of insertion may be about 15 degrees from the lengthwise axis of the vein. However, in some embodiments, the angle of insertion may be adjusted based on a variety of factors.
The table 900A includes a first column listing a plurality of medical personnel 906 for which information has been recorded. The table 900A further includes a current status 908, a number of insertions 910, a number of maintenance operations 912, a percentage of first insertion success 914, and a training status 916 for each respective medical personnel of the plurality of medical personnel 906. In some embodiments, the current status 908 represents an overall proficiency of the medical personnel in catheter insertion operations.
The user interface 900B may be presented to a user such as a medical administrator to monitor and review information relating to the plurality of medical personnel. Accordingly, the medical administrator may use the user interface 900B to consider a proficiency of the overall medical staff or particular individuals relating to intravenous catheter insertion procedures. Further, the user interface 900B may be used to determine which medical personnel are equipped to perform said insertion procedures and which staff units should be associated with insertion procedures. Additionally, in some embodiments, the information depicted in table 900A and user interface 900B may be provided to the machine learning model to automatically determine updates to training and to properly assign insertion procedures among available medical personnel.
In some embodiments, the user interface 900B further comprises a staff unit breakdown diagram 926 showing a breakdown of each staff unit of medical personnel and indicating a number of medical personnel that have achieved a proficient score for catheter insertion out of a total number of medical personnel.
In some embodiments, further machine learning and artificial intelligence techniques may be employed to improve training for medical personnel. For example, stored ultrasound data from a previous insertion procedure may be reviewed by a machine learning model to identify errors and provide improvement recommendations to medical personnel.
In some embodiments, a machine learning model or other computing or control device is operable to generate a suggestion to retrain one or more medical personnel based on the data presented in either of table 900A or user interface 900B. For example, if a first insertion success 914 for a particular medical personnel is below a predetermined threshold, that medical personnel may be selected for retraining. Further, in some embodiments, the machine learning model may be trained to identify a particular error or action that led to the low success rate of the medical personnel and cater training content around that particular error or action. Further, in some embodiments, one or more medical personnel may be selected to perform a particular insertion procedure based on the medical personnel training and performance data. For example, if a patient record indicates that a patient has had difficulties with insertion in the past, one or more medical personnel with insertion performance above a predetermined performance threshold may be automatically selected to perform the insertion procedure. In some embodiments, medical personnel may be selected based further on medical personnel availability.
In some embodiments, the insertion record 930 includes a visual insertion representation 932 showing a catheter needle and insertion region of the patient, as well as indications of the vein depth, vein diameter, and angle of insertion of the catheter. Further, the insertion record 930 may include insertion parameters 934 indicating a hemodilution ratio, catheter vein ratio, and suggested catheter length. In some embodiments, patient information 936 is included on the insertion record 930, as shown.
In some embodiments, the insertion record 930 includes an insertion evaluation portion 938 including insertion parameters such as IV location, vein identification method, vein diameter, velocity of blood flow, vein depth, etc., as well as a catheter properties portion 940 including catheter information associated with the selected catheter such as a catheter gauge, catheter length, catheter brand, and an angle of insertion of the catheter.
Additionally, in some embodiments, the insertion record 930 further comprises an accessories portion 942 including additional information associated with the insertion procedure such as a skin prep technique, dressing type, securement, needleless connector, and hub disinfection technique, as well as an indication of whether an antimicrobial sponge was used. Further, an insertion site photo 944 may be included, as shown, to provide an actual visual representation of the insertion region after the insertion procedure.
The insertion record 930 described above and information included therein may be stored within a medical data store. For example, in some embodiments, the insertion record 930 may be added to a patient profile within a medical system database such that the insertion record 930 may be reviewed at a later time in regard to the patient, for example, prior to a subsequent insertion procedure. Similarly, in some embodiments, the insertion record 930 may be added to a medical personnel profile to monitor performance of the medical personnel.
At step 1002, a set of training data is received. The training data may comprise any combination of historical ultrasound data, augmented ultrasound data, and other metadata. For example, in some embodiments, the training data may comprise real ultrasound data received from a plurality of historical ultrasounds and augmented ultrasound data produced by augmenting the real ultrasound data to extend the data set and increase the volume of training data. Further, in some embodiments, the training data may comprise any combination of ultrasound data such as ultrasound images, ultrasound parameters, catheter selections, medical personnel data, and other data relevant to ultrasound procedures and catheter insertion procedures. In some embodiments, the set of training data may be received from one or more training data stores. For example, the set of training data may comprise a plurality of ultrasound images received from one or more medical entities and other health system data stores.
At step 1004, a machine learning model is trained based on the set of training data. For example, the machine learning model may be trained with historical ultrasound data to identify anatomical structures and other structures within subsequent real-time ultrasound data. For example, a catheter device may be identified within ultrasound images of the training data such that a similar catheter device may be identified within real-time ultrasound data by the machine learning model after training.
In some embodiments, the machine learning model may comprise a computer-vision algorithm configured to classify one or more objects within a real-time ultrasound image. Here, the computer-vision algorithm may be trained with a plurality of historical ultrasound images within the training data. For example, the computer-vision algorithm may comprise a classification model configured to classify portions of real-time ultrasound images. In some embodiments, other forms of computer vision techniques are also included. For example, any combination of image filters and other image identification techniques may be used. In some embodiments, the computer vision technique used may include any one of or combination of image segmentation, object detection, edge detection, pattern detection, image classification, and feature matching.
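By way of non-limiting illustration, one possible prototype of such a classification model is sketched below using PyTorch; the network architecture, patch size, label scheme, and random stand-in tensors are assumptions made for illustration rather than the disclosed model.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in tensors: grayscale ultrasound patches and structure labels
# (0 = background, 1 = vein, 2 = artery, 3 = nerve bundle). Real training would
# use historical ultrasound images from the training data stores.
patches = torch.randn(512, 1, 64, 64)
labels = torch.randint(0, 4, (512,))

# Small convolutional classifier over 64x64 patches.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 4),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

loader = DataLoader(TensorDataset(patches, labels), batch_size=32, shuffle=True)
for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```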
At step 1102, real-time ultrasound data is received. The real-time ultrasound data may be received from an ultrasound device 110 including an ultrasound probe configured to capture ultrasound images. In some embodiments, additional machine learning techniques may be applied to aid in positioning of the ultrasound device 110 and ultrasound probe to improve the image quality of the real-time ultrasound image data. In some embodiments, the real-time ultrasound data may be received from any suitable communication connection with the ultrasound device 110. For example, the ultrasound data may be transmitted via a wired or wireless connection, such as over a network or Bluetooth or other form of radio frequency connection.
In some embodiments, the real-time ultrasound data may be received prior to selection and insertion of an intravenous catheter. Alternatively, in some embodiments, the real-time ultrasound data is received after or during insertion of the intravenous catheter. Further, embodiments are contemplated in which the real-time ultrasound data may be received prior to selection and insertion of the catheter, as well as during and after insertion. For example, a first set of real-time ultrasound data may be received prior to insertion and a second set of real-time ultrasound data may be received after or during insertion.
At step 1104, one or more anatomical structures are identified within the real-time ultrasound data. In some embodiments, the one or more anatomical structures are identified using the machine learning model and/or computer vision algorithm, as described above. In some embodiments, the one or more anatomical structures identified include any one of, or combination of: a blood vessel, a vein, a nerve bundle, as well as other forms of anatomical structures present in the ultrasound image data. Additionally, or alternatively, in some embodiments, one or more other structures may be identified within the real-time ultrasound data such as, for example, a catheter device may be identified within a real-time ultrasound image.
In some embodiments, the one or more anatomical structures are identified by providing an array of pixel values from the real-time ultrasound image data to the computer vision algorithm. The computer vision algorithm analyzes the array of pixel values and identifies the one or more anatomical structures based on classification of respective portions of the array of pixel values. Similarly, a catheter or catheter needle may be identified within the real-time ultrasound data based on a portion of the array of pixel values. For example, the catheter needle may be identified based on a set of pixel values corresponding to a straight line.
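By way of non-limiting illustration, the sketch below applies OpenCV Hough transforms to a grayscale frame to look for roughly circular vessel cross-sections and a straight-line needle echo; the synthetic frame and parameter values are assumptions made for illustration, and a trained model could be substituted for these classical techniques.

```python
import cv2
import numpy as np

# Synthetic stand-in for one grayscale ultrasound frame (real frames would come
# from the real-time ultrasound data stream).
frame = np.zeros((256, 256), dtype=np.uint8)
cv2.circle(frame, (100, 140), 20, 255, 2)        # vessel-like cross-section
cv2.line(frame, (160, 40), (240, 110), 255, 2)   # needle-like straight echo

blurred = cv2.GaussianBlur(frame, (5, 5), 0)

# Candidate vessel cross-sections appear as roughly circular edges.
circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                           param1=80, param2=20, minRadius=10, maxRadius=60)

# A catheter needle tends to appear as a bright, straight line of pixels.
edges = cv2.Canny(blurred, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                        minLineLength=50, maxLineGap=5)

print("vessel candidates:", 0 if circles is None else len(circles[0]))
print("needle candidates:", 0 if lines is None else len(lines))
```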
Additionally, in some embodiments, the computer vision algorithm may be configured to score a plurality of potential veins or other blood vessels. Here, an insertion score may be provided for each vein identified within the real-time ultrasound data. The insertion score may be determined based on any combination of vein depth, vein diameter, blood flow rate, and other information received directly or indirectly from the real-time ultrasound data. Subsequently, the insertion scores may be compared to suggest or automatically select a particular vein for insertion. In some embodiments, a predetermined insertion score threshold is contemplated such that a potential insertion site or vein may be avoided based on receiving an insertion score below the predetermined insertion score threshold. Conversely, a particular vein or insertion site may be selected responsive to determining that the insertion score is above the predetermined insertion score threshold or above the predetermined insertion score threshold by a particular amount. Further, in some embodiments, vein suitability may be determined based at least in part on a hemodilution rate associated with the vein.
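By way of non-limiting illustration, a simplified scoring and threshold-based selection is sketched below; the weights, ideal values, and threshold are illustrative assumptions rather than a disclosed scoring formula.

```python
def insertion_score(vein: dict, weights: dict = None, ideal: dict = None) -> float:
    """Score a candidate vein from depth, diameter, and blood flow rate.

    Each parameter is compared to an assumed ideal value and the closeness
    values are combined with weights; higher scores are better.
    """
    weights = weights or {"depth_mm": 0.3, "diameter_mm": 0.4, "flow_ml_min": 0.3}
    ideal = ideal or {"depth_mm": 5.0, "diameter_mm": 4.0, "flow_ml_min": 40.0}
    score = 0.0
    for key, weight in weights.items():
        closeness = 1.0 - min(abs(vein[key] - ideal[key]) / ideal[key], 1.0)
        score += weight * closeness
    return score

candidates = [
    {"name": "cephalic", "depth_mm": 4.5, "diameter_mm": 3.8, "flow_ml_min": 35.0},
    {"name": "basilic", "depth_mm": 9.0, "diameter_mm": 2.0, "flow_ml_min": 18.0},
]
THRESHOLD = 0.6  # predetermined insertion score threshold (illustrative)

scored = [(insertion_score(v), v["name"]) for v in candidates]
best_score, best_name = max(scored)
print(best_name if best_score >= THRESHOLD else "no suitable vein")  # cephalic
```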
Additionally, in some embodiments, automatic error checking and other operations may be applied based on the computer vision analysis of the real-time ultrasound data. For example, a notification that the ultrasound device is faulty may be generated based on the analysis of the real-time ultrasound data responsive to detecting an error within the image data using the computer vision algorithm. In some such embodiments, the computer vision algorithm may be further trained based on historical faulty ultrasound data such that said errors may be identified. Further, one or more false artifacts may be removed or flagged within the real-time ultrasound image data. For example, common artifacts may be identified via training of the computer vision algorithm such that the artifacts may be automatically removed from the real-time image to prevent confusion for the medical personnel.
At step 1106, a blood flow is automatically estimated based on the real-time ultrasound data. For example, a blood flow of a selected vein or a potential vein considered for selection may be estimated based on one or more images from the real-time ultrasound data. In some embodiments, the blood flow may be estimated based on a vein diameter identified within the ultrasound data. For example, the vein diameter may be identified by the computer vision algorithm after the vein is identified.
At step 1108, an intravenous catheter is selected based at least in part on the real-time ultrasound data. In some embodiments, the intravenous catheter is selected from a plurality of catheters. For example, a catheter supplier or inventory data store may be accessed to review a plurality of catheters such that a suitable catheter may be selected. For example, an inventory system or catheter supplier data store may store catheter information relating to a respective plurality of catheters including catheter lengths, catheter codes/part numbers, catheter types, catheter brands, and other catheter information. In some embodiments, the catheter may be selected based on a variety of parameters such as a vein diameter of a selected vein, a selected angle of insertion, a catheter length, and other insertion parameters. A default angle of insertion of about 15 degrees may be included. However, in some embodiments, the angle of insertion may be adjusted based on one or more other insertion parameters such as a recommended hemodilution rate. Further, in some embodiments, the catheter inventory system may be automatically updated responsive to selection of a particular catheter. In some such embodiments, for example, a catheter may be automatically ordered responsive to the selection.
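By way of non-limiting illustration, the inventory lookup described above might resemble the following sketch, which filters an in-memory catalog by required length and by the catheter to vein ratio constraint; the catalog entries, field names, and thresholds are assumptions made for illustration.

```python
# Illustrative in-memory stand-in for an inventory or catheter supplier data store.
CATALOG = [
    {"part": "CATH-22-25", "gauge": 22, "od_mm": 0.9, "length_cm": 2.5},
    {"part": "CATH-20-30", "gauge": 20, "od_mm": 1.1, "length_cm": 3.0},
    {"part": "CATH-20-48", "gauge": 20, "od_mm": 1.1, "length_cm": 4.8},
    {"part": "CATH-18-48", "gauge": 18, "od_mm": 1.3, "length_cm": 4.8},
]

def select_catheter(vein_diameter_mm: float, required_length_cm: float,
                    max_ratio: float = 0.40):
    """Pick the shortest catalog entry that is long enough and keeps the
    catheter to vein ratio at or below the desired maximum."""
    suitable = [c for c in CATALOG
                if c["od_mm"] / vein_diameter_mm <= max_ratio
                and c["length_cm"] >= required_length_cm]
    return min(suitable, key=lambda c: c["length_cm"]) if suitable else None

# Example: a 3.5 mm vein requiring roughly 4.5 cm of catheter.
print(select_catheter(3.5, 4.5))
# {'part': 'CATH-20-48', 'gauge': 20, 'od_mm': 1.1, 'length_cm': 4.8}
```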
In some embodiments, a catheter check is contemplated. For example, a request may be generated to instruct the medical personnel to scan a universal product code associated with the catheter device. As such, a confirmation may be provided that the medical personnel has the correct catheter device prior to insertion. Alternatively, if the catheter device is incorrect, a notification may be generated for the medical personnel to locate the correct catheter device.
At step 1110, one or more suggested positioning parameters are generated for insertion of the catheter. In some embodiments, the one or more suggested positioning parameters are determined based at least in part on the real-time ultrasound data and the one or more anatomical structures identified within the real-time ultrasound data. In some embodiments, the one or more suggested positioning parameters are generated for display in real-time to a human operator such as a medical personnel performing the insertion procedure. Alternatively, or additionally, in some embodiments, the one or more suggested parameters may be provided audibly, for example, one or more verbal instructions or suggestions may be provided via one or more speakers associated with the ultrasound device 110 or the user device 128.
Alternatively, in some embodiments, the one or more suggested positioning parameters may be presented to medical personnel via an augmented reality display. For example, a medical personnel may wear a head mounted display configured to display an overlay including ultrasound data, suggestions, and other information relevant to the insertion procedure. In some embodiments, the real-time ultrasound data may be generated for display on the head mounted display overlaid with ultrasound information such as indications of identified anatomical structures, an indication of the catheter position, as well as any other measured and/or calculated parameters associated with the insertion procedure, such as, the angle of insertion, a suggested positioning change, a vein diameter, a blood flow rate, or another parameter. In some embodiments, the user interface 800A or 800B may be displayed on a display of a head mounted display device. Similarly, the indications, information, and images described above may be displayed on another form of display that is not configured to be head mounted. For example, an augmented ultrasound image may be displayed on a computer screen or a screen of a mobile device in real-time prior to and during an insertion procedure.
It should be understood that, in some embodiments, additional steps may be included in either of method 1000 or method 1100. For example, in some embodiments, an additional step for retraining the machine learning model may be included. Here, a subsequent set of training data may be received and the machine learning model may be further trained based on the subsequent set of training data. In some embodiments, the subsequent set of training data comprises ultrasound data from a plurality of catheter insertion procedures. Further still, in some embodiments, additional steps are included for storing and transmitting the real-time ultrasound data. For example, the real-time ultrasound data may be stored within an ultrasound data store for any of retraining the machine learning model, monitoring medical personnel performance, reviewing procedure parameters, and training medical personnel.
In some embodiments, a variety of filters and other augmentations of the real-time ultrasound data are contemplated. For example, in some embodiments, a three-dimensional model may be generated based on the real-time ultrasound data. In some such embodiments, the three-dimensional model provides a three-dimensional representation of the insertion region and may be provided and accessible in real-time to the medical personnel. For example, instructions and suggestions may be provided to guide a nurse or other medical personnel through the insertion procedure. Additionally, in some embodiments, the three-dimensional model may be stored within a data store such that the model may be used for subsequent monitoring or analysis of historical insertion procedures.
Further, in some embodiments, a warning or other notification may be generated and transmitted in real-time to the medical personnel, for example, to warn the medical personnel of a bad insertion site. Here, an insertion score may be generated for a plurality of respective insertion sites identified within the ultrasound data. The insertion scores may then be compared to determine an optimal or suggested insertion region along with a plurality of suggested insertion parameters for completing the insertion procedure. For example, a particular insertion site may receive a low insertion score responsive to identification of one or more nerve bundles adjacent to the insertion site. Accordingly, a warning or notification informing the medical personnel of the nerve bundles or indicating the low insertion score may be presented to the medical personnel. Said notification may be provided via any suitable form of notification described herein such as via visual display on a display device or an audible message from an audio device.
Additionally, in some embodiments, any of a plurality of instructions may be generated and provided to the medical personnel during the catheter insertion procedure. For example, in some embodiments, an instruction may be provided to the medical personnel including a suggested dwell time for the catheter device. The suggested dwell time may be selected based on any combination of the blood flow rate, the hemodilution rate, and information associated with a particular drug administered to the patient. For example, the machine learning model may automatically access a data store including pharmaceutical information relating to a particular drug being administered and determine a recommended dwell time based at least in part on the information. Alternatively, or additionally, the machine learning model may also consider patient specific factors such as a patient height, patient weight, or patient age. Further, instructions may be generated for a subsequent catheter removal procedure to guide medical personnel through removal of an inserted catheter.
Additionally, embodiments are contemplated in which ultrasound data and other patient-specific insertion data may be stored, for example, to suggest a particular vein or insertion site in a subsequent procedure for the patient or a similar patient. For example, a vein may be selected and suggested based on stored data of a successful insertion procedure. Conversely, a vein may be removed from consideration responsive to an unsuccessful insertion attempt recorded for the patient.
Although the present disclosure has been described with reference to the embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the present disclosure as recited in the claims.
This non-provisional patent application claims priority benefit, with regard to all common subject matter, of earlier-filed U.S. Provisional Patent Application No. 63/410,712, filed on Sep. 28, 2022, and entitled “ULTRASOUND AIDED POSITIONING OF AN INTRAVENOUS CATHETER.” The identified earlier-filed provisional patent application is hereby incorporated by reference in its entirety into the present application.