The present disclosure relates generally to medication delivery to patients and, in some non-limiting embodiments or aspects, to a sensor assembly for monitoring push medication delivery to patients from syringes.
Barcoded Medication Administration (BCMA) is an inventory control system that uses barcodes to reduce or prevent human errors in the distribution of prescription medications at hospitals. A goal of BCMA is to make sure that patients are receiving the correct medications at the correct time by electronically validating and documenting medications. The information encoded in barcodes allows for the comparison of the medication being administered with what was ordered for the patient.
Currently, BCMA is the only safety check performed near the patient to reduce or prevent misadministration of medications and aid in documentation in the electronic medical record/electronic health record (EMR/EHR). Accordingly, there is a monitoring gap for drugs delivered via syringe in that an act of connection to a patient's IV line/catheter and dispensing or delivery of a drug may not be monitored.
Accordingly, provided are improved systems, devices, products, apparatus, and/or methods for monitoring push medication delivery to patients from syringes.
According to some non-limiting embodiments or aspects, provided is a sensor assembly, including: an image capture device configured to capture one or more images; a user feedback device configured to provide an audible and/or visual output to a user; a mechanical connector configured to connect the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; and at least one processor coupled to a memory and configured to: obtain the one or more images captured by the image capture device; determine, based on the one or more images, information associated with a medication delivery for a patient; and control, based on the information, the user feedback device to provide the audible and/or visual output to the user.
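By way of non-limiting illustration only, the following sketch outlines the capture-analyze-feedback flow described above. The class and function names are placeholders introduced solely for this example and do not reflect any required implementation.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class DeliveryInfo:
    medication_ok: Optional[bool] = None   # correct drug and dose for this patient?
    lumen_ok: Optional[bool] = None        # correct lumen for this delivery?


class FeedbackDevice:
    """Stands in for the LED/buzzer/display user feedback device."""

    def confirm(self) -> None:
        print("GREEN LED: medication delivery verified")

    def alert(self) -> None:
        print("RED LED + BUZZER: check medication or lumen")


def determine_delivery_info(images: List[bytes]) -> DeliveryInfo:
    # Placeholder for identifier-element decoding and comparison logic.
    return DeliveryInfo(medication_ok=True, lumen_ok=True)


def process_capture(images: List[bytes], feedback: FeedbackDevice) -> DeliveryInfo:
    """Obtain images, determine delivery information, and drive user feedback."""
    info = determine_delivery_info(images)
    if info.medication_ok is False or info.lumen_ok is False:
        feedback.alert()
    else:
        feedback.confirm()
    return info


if __name__ == "__main__":
    process_capture([b"frame-0"], FeedbackDevice())
```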
In some non-limiting embodiments or aspects, the image capture device is configured to automatically capture the one or more images in response to a connection of the mechanical connector to the lumen and/or the syringe.
In some non-limiting embodiments or aspects, the sensor assembly further includes: a housing including the image capture device, the user feedback device, and the processor, wherein the mechanical connector extends from a sidewall of the housing, wherein a field-of-view of the image capture device is fixed relative to the housing and/or the mechanical connector, and wherein, with the mechanical connector connecting the sensor assembly to the lumen and/or the syringe connected to the lumen, a field-of-view of the image capture device includes the syringe.
In some non-limiting embodiments or aspects, the mechanical connector includes an opening in a sidewall extending from the housing, and wherein the opening is configured to receive the lumen and/or a barrel of the syringe.
In some non-limiting embodiments or aspects, the image capture device is configured to automatically capture the one or more images in response to the lumen and/or the barrel of the syringe being received in the opening.
In some non-limiting embodiments or aspects, the mechanical connector includes a rigid deformable clip.
In some non-limiting embodiments or aspects, the mechanical connector includes a butterfly-style clip.
In some non-limiting embodiments or aspects, the image capture device is configured to automatically capture the one or more images in response to an actuation of the butterfly-style clip.
In some non-limiting embodiments or aspects, the mechanical connector includes arcuate jaws configured to be actuated by one or more buttons, each arcuate jaw connected at a first end to a hinge and including an inwardly facing surface configured to enclose around an exterior surface of the lumen and/or the syringe, the hinge configured to urge second ends of the arcuate jaws together, and the one or more buttons configured to cause the arcuate jaws to separate from each other in response to an actuation of the one or more buttons.
In some non-limiting embodiments or aspects, the image capture device is configured to automatically capture the one or more images in response to the actuation of the one or more buttons.
In some non-limiting embodiments or aspects, the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a patient identifier element associated with the patient; and determining, based on the detected patient identifier element, patient information associated with the patient, wherein the patient information includes an indication of a type and/or a dosage of a medication to be delivered to the patient and/or a unique identifier of a medication delivery lumen to be used to deliver the medication to the patient.
In some non-limiting embodiments or aspects, the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a syringe identifier element associated with the syringe; and determining, based on the detected syringe identifier element, syringe information associated with the syringe, wherein the syringe information includes a type and/or a dosage of a medication contained in the syringe.
In some non-limiting embodiments or aspects, the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: determining, based on the patient information and the syringe information, whether the medication contained in the syringe is a correct type and/or a correct dosage of the medication to be delivered to the patient; and controlling, based on whether the medication contained in the syringe includes the correct type and/or the correct dosage of the medication to be delivered to the patient, the user feedback device to provide the audible and/or visual output to the user.
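By way of non-limiting illustration only, the following sketch shows one way such a correctness check could be expressed; the field names and dose tolerance are assumptions made for this example.

```python
from dataclasses import dataclass


@dataclass
class PatientInfo:
    ordered_medication: str     # from the patient identifier element / medical order
    ordered_dose_mg: float


@dataclass
class SyringeInfo:
    medication: str             # decoded from the syringe identifier element
    dose_mg: float


def medication_matches(patient: PatientInfo, syringe: SyringeInfo,
                       dose_tolerance_mg: float = 0.0) -> bool:
    """True when the syringe holds the ordered medication at the ordered dose."""
    correct_type = syringe.medication.strip().lower() == patient.ordered_medication.strip().lower()
    correct_dose = abs(syringe.dose_mg - patient.ordered_dose_mg) <= dose_tolerance_mg
    return correct_type and correct_dose


# A mismatch would drive the audible and/or visual alert on the user feedback device.
print(medication_matches(PatientInfo("medication A", 10.0), SyringeInfo("medication A", 10.0)))
print(medication_matches(PatientInfo("medication A", 10.0), SyringeInfo("medication B", 10.0)))
```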
In some non-limiting embodiments or aspects, the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a lumen identifier element associated with the lumen; and determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen.
In some non-limiting embodiments or aspects, the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: determining, based on the lumen information and the syringe information, whether the lumen is the medication delivery lumen to be used to deliver the medication to the patient; and controlling, based on whether the lumen includes the medication delivery lumen to be used to deliver the medication to the patient, the user feedback device to provide the audible and/or visual output to the user.
In some non-limiting embodiments or aspects, the one or more images include a series of images including the syringe during the medication delivery for the patient, and wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a movement of a stopper and/or a plunger rod of the syringe; determining, based on the detected movement, at least one delivery parameter of the following delivery parameters: a patency of the lumen; an average rate of delivery of the medication; a current rate of the delivery of the medication; a steadiness index of the current rate of delivery of the medication; a volume of the medication delivered; a completion of the medication delivery for the patient; or any combination thereof; and controlling, based on the at least one delivery parameter, the user feedback device to provide the audible and/or visual output to the user.
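By way of non-limiting illustration only, the following sketch derives example delivery parameters from a series of timestamped stopper positions; the linear position-to-volume conversion and the steadiness definition are assumptions made for this example.

```python
from statistics import pstdev
from typing import List, Tuple


def delivery_parameters(samples: List[Tuple[float, float]],
                        ml_per_mm: float,
                        barrel_volume_ml: float) -> dict:
    """samples: (time_s, stopper_displacement_mm) pairs from image analysis."""
    if len(samples) < 2 or samples[-1][0] <= samples[0][0]:
        return {"volume_ml": 0.0, "avg_rate_ml_s": 0.0, "current_rate_ml_s": 0.0,
                "steadiness": 1.0, "complete": False}
    (t0, x0), (t_end, x_end) = samples[0], samples[-1]
    volume_ml = (x_end - x0) * ml_per_mm
    avg_rate = volume_ml / (t_end - t0)
    # Instantaneous rates between consecutive frames
    rates = [((x2 - x1) * ml_per_mm) / (t2 - t1)
             for (t1, x1), (t2, x2) in zip(samples, samples[1:]) if t2 > t1]
    current_rate = rates[-1] if rates else avg_rate
    # Steadiness index: 1.0 means a perfectly constant delivery rate (assumed definition)
    steadiness = 1.0 - (pstdev(rates) / avg_rate if rates and avg_rate else 0.0)
    return {"volume_ml": volume_ml, "avg_rate_ml_s": avg_rate,
            "current_rate_ml_s": current_rate, "steadiness": max(0.0, steadiness),
            "complete": volume_ml >= barrel_volume_ml}


# 0.2 mL per mm of plunger travel, 3 mL barrel: 2 mL delivered at a steady 1 mL/s.
print(delivery_parameters([(0.0, 0.0), (1.0, 5.0), (2.0, 10.0)], 0.2, 3.0))
```

A prolonged period with no detected displacement after delivery has started could, for example, be flagged as a possible loss of lumen patency.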
In some non-limiting embodiments or aspects, the one or more images include a series of images including one or more flush syringes connected to the lumen during a flush procedure, and wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a lumen identifier element associated with the lumen; determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen; detecting, in the one or more images, one or more flush syringe identifier elements associated with the one or more flush syringes; determining, based on the one or more detected flush syringe identifier elements, flush syringe information associated with the one or more flush syringes, wherein the flush syringe information includes a number of flush syringes used for the flush procedure; and determining, based on the lumen information and the flush syringe information, that the flush procedure has been performed for the lumen, a number of flush syringes used for the flush procedure, and/or a type of the flush procedure performed (e.g., continuous flushing, partial flushing, pulsatile flushing, etc.).
In some non-limiting embodiments or aspects, the sensor assembly further includes: wireless communications circuitry configured to wirelessly communicate with an external computing device, wherein the at least one processor is configured to control the wireless communication circuitry to wirelessly communicate the information associated with the medication delivery for the patient to the external computing device.
According to some non-limiting embodiments or aspects, provided is a system, including: an external computing device; and one or more sensor assemblies, each sensor assembly of the one or more sensor assemblies including: an image capture device configured to capture one or more images, a user feedback device configured to provide an audible and/or visual output to a user, and a mechanical connector configured to connect the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; and a wireless communication link configured to facilitate communication between the external computing device and the one or more sensor assemblies, wherein the external computing device includes a processor coupled to a memory and configured to, for each sensor assembly: receive the one or more images captured by the image capture device; determine, based on the one or more images, information associated with a medication delivery for a patient; and control, based on the information, via the wireless communication link, the user feedback device of the sensor assembly to provide the audible and/or visual output to the user.
In some non-limiting embodiments or aspects, the one or more sensor assemblies include a plurality of sensor assemblies.
According to some non-limiting embodiments or aspects, provided is a method, including: capturing, with an image capture device of a sensor assembly, one or more images, wherein the sensor assembly includes a mechanical connector connecting the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; obtaining, with at least one processor of the sensor assembly, the one or more images captured by the image capture device; determining, with the at least one processor of the sensor assembly, based on the one or more images, information associated with a medication delivery for a patient; and controlling, with the at least one processor of the sensor assembly, based on the information, a user feedback device of the sensor assembly to provide an audible and/or visual output to a user.
In some non-limiting embodiments or aspects, the method further includes: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to a connection of the mechanical connector to the lumen and/or the syringe.
In some non-limiting embodiments or aspects, the sensor assembly further includes a housing including the image capture device, the user feedback device, and the at least one processor, wherein the mechanical connector extends from a sidewall of the housing, wherein a field-of-view of the image capture device is fixed relative to the housing and/or the mechanical connector, and wherein, with the mechanical connector connecting the sensor assembly to the lumen and/or the syringe connected to the lumen, a field-of-view of the image capture device includes the syringe.
In some non-limiting embodiments or aspects, the mechanical connector includes an opening in a sidewall extending from the housing, and wherein the opening is configured to receive the lumen and/or a barrel of the syringe.
In some non-limiting embodiments or aspects, the method further includes: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to the lumen and/or the barrel of the syringe being received in the opening.
In some non-limiting embodiments or aspects, the mechanical connector includes a rigid deformable clip.
In some non-limiting embodiments or aspects, the mechanical connector includes a butterfly-style clip.
In some non-limiting embodiments or aspects, the method further includes: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to an actuation of the butterfly-style clip.
In some non-limiting embodiments or aspects, the mechanical connector includes arcuate jaws configured to be actuated by one or more buttons, each arcuate jaw connected at a first end to a hinge and including an inwardly facing surface configured to enclose around an exterior surface of the lumen and/or the syringe, the hinge configured to urge second ends of the arcuate jaws together, and the one or more buttons configured to cause the arcuate jaws to separate from each other in response to an actuation of the one or more buttons.
In some non-limiting embodiments or aspects, the method further includes: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to the actuation of the one or more buttons.
In some non-limiting embodiments or aspects, the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a patient identifier element associated with the patient; and determining, based on the detected patient identifier element, patient information associated with the patient, wherein the patient information includes an indication of a type and/or a dosage of a medication to be delivered to the patient and/or a unique identifier of a medication delivery lumen to be used to deliver the medication to the patient.
In some non-limiting embodiments or aspects, the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a syringe identifier element associated with the syringe; and determining, based on the detected syringe identifier element, syringe information associated with the syringe, wherein the syringe information includes a type and/or a dosage of a medication contained in the syringe.
In some non-limiting embodiments or aspects, the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: determining, based on the patient information and the syringe information, whether the medication contained in the syringe is a correct type and/or a correct dosage of the medication to be delivered to the patient; and controlling, based on whether the medication contained in the syringe includes the correct type and/or the correct dosage of the medication to be delivered to the patient, the user feedback device to provide the audible and/or visual output to the user.
In some non-limiting embodiments or aspects, the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a lumen identifier element associated with the lumen; and determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen.
In some non-limiting embodiments or aspects, the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: determining, based on the lumen information and the syringe information, whether the lumen is the medication delivery lumen to be used to deliver the medication to the patient; and controlling, based on whether the lumen includes the medication delivery lumen to be used to deliver the medication to the patient, the user feedback device to provide the audible and/or visual output to the user.
In some non-limiting embodiments or aspects, the one or more images include a series of images including the syringe during the medication delivery for the patient, and wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a movement of a stopper and/or a plunger rod of the syringe; determining, based on the detected movement, at least one delivery parameter of the following delivery parameters: a patency of the lumen; an average rate of delivery of the medication; a current rate of the delivery of the medication; a steadiness index of the current rate of delivery of the medication; a volume of the medication delivered; a completion of the medication delivery for the patient; or any combination thereof; and controlling, based on the at least one delivery parameter, the user feedback device to provide the audible and/or visual output to the user.
In some non-limiting embodiments or aspects, the one or more images include a series of images including one or more flush syringes connected to the lumen during a flush procedure, and wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a lumen identifier element associated with the lumen; determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen; detecting, in the one or more images, one or more flush syringe identifier elements associated with the one or more flush syringes; determining, based on the one or more detected flush syringe identifier elements, flush syringe information associated with the one or more flush syringes, wherein the flush syringe information includes a number of flush syringes used for the flush procedure; and determining, based on the lumen information and the flush syringe information, that the flush procedure has been performed for the lumen, a number of flush syringes used for the flush procedure, and/or a type of the flush procedure performed (e.g., continuous flushing, partial flushing, pulsatile flushing, etc.).
In some non-limiting embodiments or aspects, the method further includes: controlling, with the at least one processor of the sensor assembly, wireless communications circuitry of the sensor assembly to wirelessly communicate the information associated with the medication delivery for the patient to an external computing device.
Further embodiments or aspects are set forth in the following numbered clauses:
Clause 1. A sensor assembly, comprising: an image capture device configured to capture one or more images; a user feedback device configured to provide an audible and/or visual output to a user; a mechanical connector configured to connect the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; and at least one processor coupled to a memory and configured to: obtain the one or more images captured by the image capture device; determine, based on the one or more images, information associated with a medication delivery for a patient; and control, based on the information, the user feedback device to provide the audible and/or visual output to the user.
Clause 2. The sensor assembly of clause 1, wherein the image capture device is configured to automatically capture the one or more images in response to a connection of the mechanical connector to the lumen and/or the syringe.
Clause 3. The sensor assembly of any of clause 1 and clause 2, further comprising: a housing including the image capture device, the user feedback device, and the processor, wherein the mechanical connector extends from a sidewall of the housing, wherein a field-of-view of the image capture device is fixed relative to the housing and/or the mechanical connector, and wherein, with the mechanical connector connecting the sensor assembly to the lumen and/or the syringe connected to the lumen, a field-of-view of the image capture device includes the syringe.
Clause 4. The sensor assembly of any of clauses 1-3, wherein the mechanical connector includes an opening in a sidewall extending from the housing, and wherein the opening is configured to receive the lumen and/or a barrel of the syringe.
Clause 5. The sensor assembly of any of clauses 1-4, wherein the image capture device is configured to automatically capture the one or more images in response to the lumen and/or the barrel of the syringe being received in the opening.
Clause 6. The sensor assembly of any of clauses 1-5, wherein the mechanical connector includes a rigid deformable clip.
Clause 7. The sensor assembly of any of clauses 1-6, wherein the mechanical connector includes a butterfly-style clip.
Clause 8. The sensor assembly of any of clauses 1-7, wherein the image capture device is configured to automatically capture the one or more images in response to an actuation of the butterfly-style clip.
Clause 9. The sensor assembly of any of clauses 1-8, wherein the mechanical connector includes arcuate jaws configured to be actuated by one or more buttons, each arcuate jaw connected at a first end to a hinge and including an inwardly facing surface configured to enclose around an exterior surface of the lumen and/or the syringe, the hinge configured to urge second ends of the arcuate jaws together, and the one or more buttons configured to cause the arcuate jaws to separate from each other in response to an actuation of the one or more buttons.
Clause 10. The sensor assembly of any of clauses 1-9, wherein the image capture device is configured to automatically capture the one or more images in response to the actuation of the one or more buttons.
Clause 11. The sensor assembly of any of clauses 1-10, wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a patient identifier element associated with the patient; and determining, based on the detected patient identifier element, patient information associated with the patient, wherein the patient information includes an indication of a type and/or a dosage of a medication to be delivered to the patient and/or a unique identifier of a medication delivery lumen to be used to deliver the medication to the patient.
Clause 12. The sensor assembly of any of clauses 1-11, wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a syringe identifier element associated with the syringe; and determining, based on the detected syringe identifier element, syringe information associated with the syringe, wherein the syringe information includes a type and/or a dosage of a medication contained in the syringe.
Clause 13. The sensor assembly of any of clauses 1-12, wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: determining, based on the patient information and the syringe information, whether the medication contained in the syringe is a correct type and/or a correct dosage of the medication to be delivered to the patient; and controlling, based on whether the medication contained in the syringe includes the correct type and/or the correct dosage of the medication to be delivered to the patient, the user feedback device to provide the audible and/or visual output to the user.
Clause 14. The sensor assembly of any of clauses 1-13, wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a lumen identifier element associated with the lumen; and determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen.
Clause 15. The sensor assembly of any of clauses 1-14, wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: determining, based on the lumen information and the syringe information, whether the lumen is the medication delivery lumen to be used to deliver the medication to the patient; and controlling, based on whether the lumen includes the medication delivery lumen to be used to deliver the medication to the patient, the user feedback device to provide the audible and/or visual output to the user.
Clause 16. The sensor assembly of any of clauses 1-15, wherein the one or more images include a series of images including the syringe during the medication delivery for the patient, and wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a movement of a stopper and/or a plunger rod of the syringe; determining, based on the detected movement, at least one delivery parameter of the following delivery parameters: a patency of the lumen; an average rate of delivery of the medication; a current rate of the delivery of the medication; a steadiness index of the current rate of delivery of the medication; a volume of the medication delivered; a completion of the medication delivery for the patient; or any combination thereof; and controlling, based on the at least one delivery parameter, the user feedback device to provide the audible and/or visual output to the user.
Clause 17. The sensor assembly of any of clauses 1-16, wherein the one or more images include a series of images including one or more flush syringes connected to the lumen during a flush procedure, and wherein the at least one processor is configured to determine, based on the one or more images, the information associated with the medication delivery for the patient by: detecting, in the one or more images, a lumen identifier element associated with the lumen; determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen; detecting, in the one or more images, one or more flush syringe identifier elements associated with the one or more flush syringes; determining, based on the one or more detected flush syringe identifier elements, flush syringe information associated with the one or more flush syringes, wherein the flush syringe information includes a number of flush syringes used for the flush procedure; and determining, based on the lumen information and the flush syringe information, that the flush procedure has been performed for the lumen, a number of flush syringes used for the flush procedure, and/or a type of the flush procedure performed (e.g., continuous flushing, partial flushing, pulsatile flushing, etc.).
Clause 18. The sensor assembly of any of clauses 1-17, further comprising: wireless communications circuitry configured to wirelessly communicate with an external computing device, wherein the at least one processor is configured to control the wireless communication circuitry to wirelessly communicate the information associated with the medication delivery for the patient to the external computing device.
Clause 19. A system, comprising: an external computing device; and one or more sensor assemblies, each sensor assembly of the one or more sensor assemblies including: an image capture device configured to capture one or more images, a user feedback device configured to provide an audible and/or visual output to a user, and a mechanical connector configured to connect the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; and a wireless communication link configured to facilitate communication between the external computing device and the one or more sensor assemblies, wherein the external computing device includes a processor coupled to a memory and configured to, for each sensor assembly: receive the one or more images captured by the image capture device; determine, based on the one or more images, information associated with a medication delivery for a patient; and control, based on the information, via the wireless communication link, the user feedback device of the sensor assembly to provide the audible and/or visual output to the user.
Clause 20. The system of clause 19, wherein the one or more sensor assemblies include a plurality of sensor assemblies.
Clause 21. A method, comprising: capturing, with an image capture device of a sensor assembly, one or more images, wherein the sensor assembly includes a mechanical connector connecting the sensor assembly to a lumen and/or a syringe configured to be connected to the lumen; obtaining, with at least one processor of the sensor assembly, the one or more images captured by the image capture device; determining, with the at least one processor of the sensor assembly, based on the one or more images, information associated with a medication delivery for a patient; and controlling, with the at least one processor of the sensor assembly, based on the information, a user feedback device of the sensor assembly to provide an audible and/or visual output to a user.
Clause 22. The method of clause 21, further comprising: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to a connection of the mechanical connector to the lumen and/or the syringe.
Clause 23. The method of any of clause 21 and clause 22, wherein the sensor assembly further includes a housing including the image capture device, the user feedback device, and the at least one processor, wherein the mechanical connector extends from a sidewall of the housing, wherein a field-of-view of the image capture device is fixed relative to the housing and/or the mechanical connector, and wherein, with the mechanical connector connecting the sensor assembly to the lumen and/or the syringe connected to the lumen, a field-of-view of the image capture device includes the syringe.
Clause 24. The method of any of clauses 21-23, wherein the mechanical connector includes an opening in a sidewall extending from the housing, and wherein the opening is configured to receive the lumen and/or a barrel of the syringe.
Clause 25. The method of any of clauses 21-24, further comprising: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to the lumen and/or the barrel of the syringe being received in the opening.
Clause 26. The method of any of clauses 21-25, wherein the mechanical connector includes a rigid deformable clip.
Clause 27. The method of any of clauses 21-26, wherein the mechanical connector includes a butterfly-style clip.
Clause 28. The method of any of clauses 21-27, further comprising: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to an actuation of the butterfly-style clip.
Clause 29. The method of any of clauses 21-28, wherein the mechanical connector includes arcuate jaws configured to be actuated by one or more buttons, each arcuate jaw connected at a first end to a hinge and including an inwardly facing surface configured to enclose around an exterior surface of the lumen and/or the syringe, the hinge configured to urge second ends of the arcuate jaws together, and the one or more buttons configured to cause the arcuate jaws to separate from each other in response to an actuation of the one or more buttons.
Clause 30. The method of any of clauses 21-29, further comprising: automatically capturing, with the image capture device of the sensor assembly, the one or more images in response to the actuation of the one or more buttons.
Clause 31. The method of any of clauses 21-30, wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a patient identifier element associated with the patient; and determining, based on the detected patient identifier element, patient information associated with the patient, wherein the patient information includes an indication of a type and/or a dosage of a medication to be delivered to the patient and/or a unique identifier of a medication delivery lumen to be used to deliver the medication to the patient.
Clause 32. The method of any of clauses 21-31, wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a syringe identifier element associated with the syringe; and determining, based on the detected syringe identifier element, syringe information associated with the syringe, wherein the syringe information includes a type and/or a dosage of a medication contained in the syringe.
Clause 33. The method of any of clauses 21-32, wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: determining, based on the patient information and the syringe information, whether the medication contained in the syringe is a correct type and/or a correct dosage of the medication to be delivered to the patient; and controlling, based on whether the medication contained in the syringe includes the correct type and/or the correct dosage of the medication to be delivered to the patient, the user feedback device to provide the audible and/or visual output to the user.
Clause 34. The method of any of clauses 21-33, wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a lumen identifier element associated with the lumen; and determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen.
Clause 35. The method of any of clauses 21-34, wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: determining, based on the lumen information and the syringe information, whether the lumen is the medication delivery lumen to be used to deliver the medication to the patient; and controlling, based on whether the lumen includes the medication delivery lumen to be used to deliver the medication to the patient, the user feedback device to provide the audible and/or visual output to the user.
Clause 36. The method of any of clauses 21-35, wherein the one or more images include a series of images including the syringe during the medication delivery for the patient, and wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a movement of a stopper and/or a plunger rod of the syringe; determining, based on the detected movement, at least one delivery parameter of the following delivery parameters: a patency of the lumen; an average rate of delivery of the medication; a current rate of the delivery of the medication; a steadiness index of the current rate of delivery of the medication; a volume of the medication delivered; a completion of the medication delivery for the patient; or any combination thereof; and controlling, based on the at least one delivery parameter, the user feedback device to provide the audible and/or visual output to the user.
Clause 37. The method of any of clauses 21-36, wherein the one or more images include a series of images including one or more flush syringes connected to the lumen during a flush procedure, and wherein the determining, with the at least one processor of the sensor assembly, based on the one or more images, the information associated with the medication delivery for the patient includes: detecting, in the one or more images, a lumen identifier element associated with the lumen; determining, based on the detected lumen identifier element, lumen information associated with the lumen, wherein the lumen information includes a unique identifier of the lumen; detecting, in the one or more images, one or more flush syringe identifier elements associated with the one or more flush syringes; determining, based on the one or more detected flush syringe identifier elements, flush syringe information associated with the one or more flush syringes, wherein the flush syringe information includes a number of flush syringes used for the flush procedure; and determining, based on the lumen information and the flush syringe information, that the flush procedure has been performed for the lumen, a number of flush syringes used for the flush procedure, and/or a type of the flush procedure performed (e.g., continuous flushing, partial flushing, pulsatile flushing, etc.).
Clause 38. The method of any of clauses 21-37, further comprising: controlling, with the at least one processor of the sensor assembly, wireless communications circuitry of the sensor assembly to wirelessly communicate the information associated with the medication delivery for the patient to an external computing device.
These and other features and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structures and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of limits. As used in the specification and the claims, the singular form of “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
Additional advantages and details of embodiments or aspects of the present disclosure are explained in greater detail below with reference to the exemplary embodiments that are illustrated in the accompanying schematic figures, in which:
It is to be understood that the present disclosure may assume various alternative variations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification, are simply exemplary and non-limiting embodiments or aspects. Hence, specific dimensions and other physical characteristics related to the embodiments or aspects disclosed herein are not to be considered as limiting.
For purposes of the description hereinafter, the terms “end,” “upper,” “lower,” “right,” “left,” “vertical,” “horizontal,” “top,” “bottom,” “lateral,” “longitudinal,” and derivatives thereof shall relate to the present disclosure as it is oriented in the drawing figures. However, it is to be understood that the present disclosure may assume various alternative variations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification, are simply exemplary embodiments or aspects of the present disclosure. Hence, specific dimensions and other physical characteristics related to the embodiments or aspects of the embodiments disclosed herein are not to be considered as limiting unless otherwise indicated.
No aspect, component, element, structure, act, step, function, instruction, and/or the like used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more” and “at least one.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, a combination of related and unrelated items, etc.) and may be used interchangeably with “one or more” or “at least one.” Where only one item is intended, the term “one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least partially on” unless explicitly stated otherwise.
As used herein, the terms “communication” and “communicate” refer to the receipt or transfer of one or more signals, messages, commands, or other type of data. For one unit (e.g., any device, system, or component thereof) to be in communication with another unit means that the one unit is able to directly or indirectly receive data from and/or transmit data to the other unit. This may refer to a direct or indirect connection that is wired and/or wireless in nature. Additionally, two units may be in communication with each other even though the data transmitted may be modified, processed, relayed, and/or routed between the first and second unit. For example, a first unit may be in communication with a second unit even though the first unit passively receives data and does not actively transmit data to the second unit. As another example, a first unit may be in communication with a second unit if an intermediary unit processes data from one unit and transmits processed data to the second unit. It will be appreciated that numerous other arrangements are possible.
It will be apparent that systems and/or methods, described herein, can be implemented in different forms of hardware, software, or a combination of hardware and software. The actual specialized control hardware or software code used to implement these systems and/or methods is not limiting of the implementations. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code, it being understood that software and hardware can be designed to implement the systems and/or methods based on the description herein.
Some non-limiting embodiments or aspects are described herein in connection with thresholds. As used herein, satisfying a threshold may refer to a value being greater than the threshold, more than the threshold, higher than the threshold, greater than or equal to the threshold, less than the threshold, fewer than the threshold, lower than the threshold, less than or equal to the threshold, equal to the threshold, etc.
As used herein, the term “computing device” or “computer device” may refer to one or more electronic devices that are configured to directly or indirectly communicate with or over one or more networks. The computing device may be a mobile device, a desktop computer, or the like. Furthermore, the term “computer” may refer to any computing device that includes the necessary components to receive, process, and output data, and normally includes a display, a processor, a memory, an input device, and a network interface.
As used herein, the term “server” may refer to or include one or more processors or computers, storage devices, or similar computer arrangements that are operated by or facilitate communication and processing for multiple parties in a network environment, such as the Internet, although it will be appreciated that communication may be facilitated over one or more public or private network environments and that various other arrangements are possible. Further, multiple computers, e.g., servers, or other computerized devices directly or indirectly communicating in the network environment may constitute a “system.” As used herein, the term “data center” may include one or more servers, or other computing devices, and/or databases.
As used herein, the term “mobile device” may refer to one or more portable electronic devices configured to communicate with one or more networks. As an example, a mobile device may include a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer (e.g., a tablet computer, a laptop computer, etc.), a wearable device (e.g., a watch, pair of glasses, lens, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices. The terms “client device” and “user device,” as used herein, refer to any electronic device that is configured to communicate with one or more servers or remote devices and/or systems. A client device or user device may include a mobile device, a network-enabled appliance (e.g., a network-enabled television, refrigerator, thermostat, and/or the like), a computer, and/or any other device or system capable of communicating with a network.
As used herein, the term “application” or “application program interface” (API) refers to computer code, a set of rules, or other data stored on a computer-readable medium that may be executed by a processor to facilitate interaction between software components, such as a client-side front-end and/or server-side back-end for receiving data from the client. An “interface” refers to a generated display, such as one or more graphical user interfaces (GUIs) with which a user may interact, either directly or indirectly (e.g., through a keyboard, mouse, etc.).
Referring to
Medical device 102 may include a lumen 102a (e.g., an IV line, a catheter, etc.) and/or a syringe 102b configured to be connected to the lumen 102a (e.g., via a needleless connector, etc.). Syringe 102b may include syringe barrel 112, plunger rod 114, and/or stopper 116 (see, e.g.,
Identifier element 118 (e.g., a tag, a label, a code, etc.) may be associated with (e.g., removably attached to, permanently attached to, integrated with, implemented on, etc.) medical device 102 (e.g., lumen 102a, syringe 102b, etc.) and/or a patient (e.g., a patient wristband, etc.). For example, identifier element 118 may encapsulate a device identifier associated with a type and/or contents of medical device 102 associated with identifier element 118, and/or uniquely identify medical device 102 associated with identifier element 118 from other medical devices, and/or indicate an orientation of the medical device 102 within environment 100 and/or with respect to another medical device (e.g., a fluid flow path direction through a medical device, an input or inlet position and an output or outlet position of a medical device, etc.). As an example, identifier element 118 may encapsulate a unique patient identifier associated with a specific patient and/or patient information, such as the unique patient identification number, drug allergies, blood type, patient vital signs, lab results, current disease states and/or clinical diagnoses, drugs previously administered, medical orders (e.g., medication, dose, route of administration, treatment schedule, etc.), historic patient information (e.g., disease state, clinical diagnosis, dosing history, etc.), and/or the like. Identifier element 118 may include at least one of the following: a colored pattern, a reflective pattern, a fluorescent pattern, a predetermined shape and color, an LED pattern, a barcode (e.g., a 1D barcode, a 2D barcode, etc.), a fiducial marker (e.g., an AprilTag, etc.), a hologram marker, or any combination thereof, which may encapsulate the device identifier and/or patient identifier.
Sensor assembly 104 may include housing 140 and mechanical connector 142. Housing 140 may include (e.g., house, encompass, etc.) image capture device 144, processor 146, wireless communications circuitry 148, user feedback device 150, power source 152, memory 154, and/or real-time clock (RTC) 156.
Mechanical connector 142 may be configured to connect or attach (e.g., removably attach, etc.) housing 140 to an exterior surface of medical device 102 (e.g., to lumen 102a, to barrel 112 of syringe 102b, etc.). For example, housing 140 may be removably attached to mechanical connector 142, permanently attached to mechanical connector 142, and/or integrated with mechanical connector 142. As an example, mechanical connector 142 may extend from a sidewall of housing 140. Further details regarding non-limiting embodiments or aspects of mechanical connector 142 are provided below with regard to
Image capture device 144 may include a camera, such as a digital monocular camera, an infrared camera, a stereo camera, a single camera, and/or the like, configured to capture one or more images. For example, image capture device 144 may be configured to capture a plurality of images over a period of time (e.g., as a series of images, etc.). In some non-limiting embodiments or aspects, image capture device 144 is configured to automatically capture the one or more images in response to a connection of mechanical connector 142 to lumen 102a and/or syringe 102b. In some non-limiting embodiments or aspects, a field-of-view of image capture device 144 is fixed relative to housing 140 and/or mechanical connector 142 such that, with mechanical connector 142 connecting housing 140 of sensor assembly 104 to lumen 102a and/or syringe 102b connected to lumen 102a, the field-of-view of image capture device 144 includes lumen 102a and/or syringe 102b. For example, the fixed field-of-view of image capture device 144 may include identifier elements 118 on lumen 102a and/or syringe 102b.
Processor 146 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), etc.) may be programmed and/or configured to obtain the one or more images captured by image capture device 144; determine, based on the one or more images, information associated with a medication delivery for a patient; and control, based on the information, user feedback device 150 to provide an audible and/or visual output to a user. In some non-limiting embodiments or aspects, processor 146 may include a low power microcontroller unit (MCU). For example, information associated with a medication delivery for a patient may include patient information, lumen information, syringe information, flush syringe information, and/or the like.
Processor 146 may be programmed and/or configured to process the one or more images using one or more object detection techniques (e.g., a deep learning technique, an image processing technique, an image segmentation technique, etc.) to identify identifier element 118, to determine whether syringe 102b is connected to lumen 102a, and/or to detect movement of an actuation mechanism or fluid expulsion mechanism, such as plunger rod 114 and/or stopper 116, during a fluid delivery procedure using syringe 102b. For example, a deep learning technique may include a bounding box technique that generates a box label for objects (e.g., identifier element 118, stopper 116, plunger rod 114, etc.) of interest in images, an image masking technique (e.g., Mask R-CNN, etc.) that captures specific shapes of objects (e.g., identifier element 118, stopper 116, plunger rod 114, etc.) in images, a trained neural network that identifies objects (e.g., identifier element 118, stopper 116, plunger rod 114, etc.) in images, and/or the like. As an example, an image processing technique may include a cross-correlation image processing technique, an image contrasting technique, a binary or colored filtering technique, and/or the like.
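By way of non-limiting illustration only, the following sketch applies normalized cross-correlation (template matching) with OpenCV to locate a stopper-like patch in a frame; the synthetic frame and detection threshold are assumptions made for this example and do not reflect the disclosure's specific algorithm.

```python
import cv2
import numpy as np

# Synthetic grayscale frame: gradient background plus a textured "stopper" patch
frame = np.tile(np.arange(160, dtype=np.uint8), (120, 1))
frame[40:60, 60:80] = 200              # stopper body
frame[45:55, 65:75] = 50               # darker core so the patch has texture
template = frame[40:60, 60:80].copy()  # stopper appearance template

# Normalized cross-correlation of the template against every frame location
scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)

if best_score > 0.8:                   # assumed detection threshold
    x, y = best_xy
    print(f"Stopper located at pixel ({x}, {y}), score {best_score:.2f}")
    # Tracking this location from frame to frame yields plunger displacement over time.
else:
    print("Stopper not found in this frame")
```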
In some non-limiting embodiments or aspects, identifier element 118 may include an AprilTag. For example, identifier element 118 may include an AprilTag V3 of type customTag 48h12, which enables using AprilTag V3 detection to determine a unique ID, which may indicate a type of medical device 102 associated with identifier element 118 (e.g., in leading digits, etc.), a unique serial number for that specific medical device 102 (e.g., in the trailing digits, etc.), and/or a location (e.g., x, y, and z coordinates, directional vectors for Z, Y, and X axes, etc.) of identifier element 118 in a field-of-view (FOV) of image capture device 144. However, non-limiting embodiments or aspects are not limited thereto, and identifier element 118 may include a QR code, a barcode (e.g., a 1D barcode, a 2D barcode, etc.), an Aztec code, a Data Matrix code, an ArUco marker, a colored pattern, a reflective pattern, a fluorescent pattern, a predetermined shape and/or color (e.g., a red pentagon, a blue hexagon, etc.), an LED pattern, a hologram, and/or the like that encapsulates an identifier associated with a type of medical device 102 associated with identifier element 118, uniquely identifies medical device 102 associated with identifier element 118 from other medical devices, and/or encapsulates pose information associated with a 3D position of identifier element 118.
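By way of non-limiting illustration only, the following sketch decodes a detected tag ID into a device type (leading digits) and a serial number (trailing digits); the digit split and the type codes are assumptions made solely for this example.

```python
from typing import Tuple

DEVICE_TYPES = {   # hypothetical leading-digit codes for this example
    10: "lumen",
    20: "medication syringe",
    30: "flush syringe",
    40: "patient wristband",
}


def decode_tag_id(tag_id: int, type_digits: int = 2) -> Tuple[str, int]:
    """Split a detected tag ID into (device type, unique serial number)."""
    text = str(tag_id)
    type_code = int(text[:type_digits])
    serial = int(text[type_digits:] or "0")
    return DEVICE_TYPES.get(type_code, "unknown"), serial


print(decode_tag_id(200417))   # -> ('medication syringe', 417)
```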
Wireless communications circuitry 148 may be configured to wirelessly communicate with computing device 106. For example, wireless communications circuitry 148 may be configured to communicate information associated with a medication delivery for a patient to computing device 106. In some non-limiting embodiments or aspects, wireless communications circuitry 148 includes one or more computing devices, chips, contactless transmitters, contactless transceivers, NFC transmitters, RFID transmitters, contact-based transmitters, Bluetooth® transceivers, and/or the like that enable wireless communications circuitry 148 to receive information directly from and/or communicate information directly to computing device 106 via a short-range wireless communication connection (e.g., a communication connection that uses an NFC protocol, a communication connection that uses radio-frequency identification (RFID), a communication connection that uses a Bluetooth® wireless technology standard, and/or the like).
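By way of non-limiting illustration only, the following sketch reports delivery information to an external computing device over Bluetooth Low Energy using the third-party bleak library; the device address, characteristic UUID, and payload format are placeholders, and the disclosure does not prescribe this particular stack.

```python
import asyncio
import json

from bleak import BleakClient  # third-party BLE client library

EXTERNAL_DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                    # placeholder address
REPORT_CHARACTERISTIC = "0000ffe1-0000-1000-8000-00805f9b34fb"   # placeholder UUID


async def report_delivery(info: dict) -> None:
    """Serialize delivery information and write it to the external device."""
    payload = json.dumps(info).encode("utf-8")
    async with BleakClient(EXTERNAL_DEVICE_ADDRESS) as client:
        await client.write_gatt_char(REPORT_CHARACTERISTIC, payload)


if __name__ == "__main__":
    asyncio.run(report_delivery({"medication_verified": True, "volume_ml": 2.0}))
```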
User feedback device 150 may be configured to provide audible and/or visual output to a user. For example, user feedback device 150 may include at least one of the following: a display (e.g., an LCD display, etc.), a light-emitting diode (LED), a plurality of LEDs, an audio output device (e.g., a buzzer, a speaker, etc.), a tactile or haptic feedback device (e.g., a vibrator, etc.), or any combination thereof.
Power source 152 may be configured to power image capture device 144, processor 146, wireless communications circuitry 148, user feedback device 150, memory 154, and/or RTC 156. For example, power source 152 may include a battery (e.g., a rechargeable battery, a disposable battery, etc.), an energy harvester (e.g., an energy harvester configured to derive energy from one or more external sources, such as electromagnetic energy, solar energy, thermal energy, wind energy, salinity gradients, kinetic energy, and/or the like, etc.), or any combination thereof.
Memory 154 may include random access memory (RAM), read-only memory (ROM), and/or another type of dynamic or static storage device (e.g., flash memory, magnetic memory, optical memory, etc.) that stores information and/or instructions for use by processor 146. For example, sensor assembly 104 may perform one or more processes described herein based on processor 146 executing software instructions stored by a computer-readable medium, such as memory 154. A computer-readable medium (e.g., a non-transitory computer-readable medium) is defined herein as a non-transitory memory device. A non-transitory memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices. When executed, software instructions stored in memory 154 may cause processor 146 to perform one or more processes described herein. Additionally or alternatively, hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments or aspects described herein are not limited to any specific combination of hardware circuitry and software.
RTC 156 may include a computer clock (e.g., in the form of an integrated circuit, etc.) that keeps track of the current time. For example, processor 146 may use RTC 156 to time stamp images and/or information associated with a medication delivery for a patient.
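As a non-limiting illustration, the following is a minimal sketch of time stamping a captured image record; the host clock stands in here for RTC 156.

```python
# Minimal sketch of time stamping a captured image record; the host clock is
# used here as a stand-in for RTC 156.
from datetime import datetime, timezone


def stamp_image_record(image_bytes: bytes) -> dict:
    """Attach a UTC timestamp to a captured image for later EMR documentation."""
    return {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "image": image_bytes,
    }
```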
Base 105 may be configured to hold sensor assembly 104 and/or recharge a battery of power source 152 in sensor assembly 104 for reuse of sensor assembly 104. Further details regarding non-limiting embodiments or aspects of mechanical connector 142 are provided below with regard to
In some non-limiting embodiments or aspects, sensor assembly 104 may include a client device or a user device. For example, sensor assembly 104 may include one or more computing systems including one or more processors (e.g., one or more computing devices, one or more server computers, one or more mobile computing devices, one or more tablet computers, etc.). As an example, sensor assembly 104 may include a tablet computer or mobile computing device, such as an Apple® iPad, an Apple® iPhone, an Android® tablet, an Android® phone, and/or the like. For example, sensor assembly 104 may include a smartphone or a tablet carried by a nurse and/or another computing device (e.g., a nurse's workstation, a bedside station, etc.) located in a patient care area for managing patient care and/or EMR documentation. As an example, a display of such a computing device may function as user feedback device 150, such as to provide medication confirmation, administration information and/or guidance (e.g., delivery rate, contraindications, etc.), and/or dose administration rate feedback. In such an example, one or more of image capture device 144, processor 146, wireless communications circuitry 148, user feedback device 150, power source 152, memory 154, and/or real-time clock (RTC) 156 may be implemented by such a computing device, which need not include mechanical connector 142, and/or such a computing device may be in communication with one or more of image capture device 144, processor 146, wireless communications circuitry 148, user feedback device 150, power source 152, memory 154, and/or real-time clock (RTC) 156 in housing 140 connected to medical device 102 via mechanical connector 142. For example, such a computing device may be self-supported (e.g., via a phone or tablet stand, via a mobile nurse's workstation, etc.) at a location near the patient in a manner that enables the nurse to have a hand on medical device 102. In this way, if the user is already carrying a smartphone or a tablet to interact with the EMR, such a device may be used to implement sensor assembly 104 and/or one or more components of sensor assembly 104 (e.g., as an additional display or user feedback device, etc.).
Computing device 106 may include one or more devices capable of receiving information and/or data from sensor assembly 104 and/or one or more other computing devices and/or communicating information and/or data to sensor assembly 104 and/or the one or more other computing devices. For example, computing device 106 may include a server, a group of servers, a mobile device, a group of mobile devices, a receiving unit, a router device, a bridge device, a hub device, and/or the like. In some non-limiting embodiments or aspects, computing device 106 includes one or more computing devices, chips, contactless transmitters, contactless transceivers, NFC transmitters, RFID transmitters, Bluetooth® transceivers, contact based transmitters, and/or the like that enables computing device 106 to receive information directly from and/or communicate information directly to wireless communications circuitry 148 of sensor assembly 104 via a short range wireless communication connection (e.g., a communication connection that uses NFC protocol, a communication connection that uses Radio-frequency identification (RFID), a communication connection that uses a Bluetooth® wireless technology standard, and/or the like). In some non-limiting embodiments or aspects, computing device 106 may include and/or upload information and/or data to an electronic data management system, such as a hospital record system, a system used during a clinical trial to collect the trial related information, and/or the like.
The number and arrangement of devices and systems shown in
Referring now to
Image capture device 144 may be located in a portion of the sidewall of housing 140 extending away from mechanical connector 142 in a direction perpendicular and/or at an angle to the length of the cylinder shape of mechanical connector 142 including opening 202. For example, a field-of-view of image capture device 144 may include a fixed field-of-view relative to housing 140 and/or mechanical connector 142 such that image capture device 144 may capture images of an area immediately proximate to and extending away from opening 202 at a first end of the cylinder shape closer to image capture device 144 than a second end of the cylinder shape opposite the first end (see, e.g.,
As shown in
In some non-limiting embodiments or aspects, mechanical connector 142 including opening 202 may include a rigid deformable clip. For example, the cylinder-shaped sidewall 204 of housing 140 surrounding opening 202 may be configured to clip onto an exterior surface of lumen 102a and/or barrel 112 of syringe 102b. As an example, the cylinder-shaped sidewall 204 may include a material forming the open cylinder shape that is configured to deform around and grasp an exterior surface of lumen 102a and/or barrel 112 of syringe 102b when mechanical connector 142 including opening 202 is pressed onto lumen 102a and/or barrel 112 of syringe 102b.
In some non-limiting embodiments or aspects, image capture device 144 may be configured to automatically capture the one or more images in response to lumen 102a and/or barrel 112 of syringe 102b being received in opening 202. For example, the cylinder-shaped sidewall 204 of housing 140 within opening 202 may include a sensor (e.g., a mechanical sensor or button, a proximity sensor, an optical sensor, etc.) configured to detect a presence of lumen 102a and/or barrel 112 of syringe 102b within opening 202 and, in response thereto, provide a signal to processor 146 to control image capture device 144 to initiate capture of the one or more images.
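By way of non-limiting illustration only, the following is a minimal sketch of such edge-triggered capture; the sensor-read and frame-capture callables are hypothetical stand-ins for the hardware interfaces of sensor assembly 104.

```python
# Minimal sketch of edge-triggered capture: begin capturing when a presence
# sensor in opening 202 transitions from "empty" to "occupied". The callables
# passed in are hypothetical stand-ins for the actual hardware interfaces.
import time
from typing import Callable, List


def wait_for_insertion_and_capture(
    read_presence: Callable[[], bool],
    capture_frame: Callable[[], bytes],
    num_frames: int = 5,
    poll_interval_s: float = 0.05,
) -> List[bytes]:
    """Block until the presence sensor reports an insertion, then capture frames."""
    was_present = False
    while True:
        is_present = read_presence()
        if is_present and not was_present:
            # Rising edge: lumen 102a and/or barrel 112 was just received in
            # opening 202, so initiate capture of the one or more images.
            return [capture_frame() for _ in range(num_frames)]
        was_present = is_present
        time.sleep(poll_interval_s)
```

The same edge-triggered pattern could be driven by actuation of a clip or button instead of a presence sensor, as in the implementations described below.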
Referring now to
Referring now to
In a same or similar manner to implementation 300, image capture device 144 may be located in a portion of the sidewall of housing 140 extending away from mechanical connector 142 in a direction perpendicular and/or at an angle to a space between jaws 402. For example, a field-of-view of image capture device 144 may include a fixed field-of-view relative to housing 140 and/or mechanical connector 142 such that image capture device 144 may capture images of an area immediately proximate to and extending away from a first side of jaws 402 closer to image capture device 144 than a second side of jaws 402 opposite the first side (see, e.g.,
Again in a same or similar manner to implementation 300, and as shown in
In some non-limiting embodiments or aspects, image capture device 144 may be configured to automatically capture the one or more images in response to actuation of handles 406 (e.g., in response to actuation of the butterfly-style clip, etc.). For example, housing 140 (e.g., at the hinge between jaws 402, etc.) may include a sensor (e.g., a mechanical sensor or button, a proximity sensor, an optical sensor, etc.) configured to detect an actuation of the jaws 402 and/or a presence of lumen 102a and/or barrel 112 of syringe 102b between the jaws 402 and, in response thereto, provide a signal to processor 146 to control image capture device 144 to initiate capture of the one or more images.
Referring now to
In a same or similar manner to implementations 300 and/or 400, image capture device 144 may be located in a portion of the sidewall of housing 140 extending away from mechanical connector 142 in a direction perpendicular and/or at an angle to a space between arcuate jaws 502. For example, a field-of-view of image capture device 144 may include a fixed field-of-view relative to housing 140 and/or mechanical connector 142 such that image capture device 144 may capture images of an area immediately proximate to and extending away from a first side of arcuate jaws 502 closer to image capture device 144 than a second side of arcuate jaws 502 opposite the first side (see, e.g.,
Again in a same or similar manner to implementations 300 and/or 400, and as shown in
In some non-limiting embodiments or aspects, image capture device 144 may be configured to automatically capture the one or more images in response to actuation of the one or more buttons 504. For example, housing 140 may include a sensor (e.g., a mechanical sensor or button, a proximity sensor, an optical sensor, etc.) configured to detect an actuation of the one or more buttons 504 and/or a presence of lumen 102a and/or barrel 112 of syringe 102b between the arcuate jaws 502 and, in response thereto, provide a signal to processor 146 to control image capture device 144 to initiate capture of the one or more images.
Referring now to
Referring now to
As shown in
As shown in
Patient information may include at least one of the following parameters associated with a patient: a unique patient identifier, an approved medication, an approved medication dosage, an approved medication delivery lumen, an approved medication delivery time, a patient medication allergy, or any combination thereof.
Lumen information may include at least one of the following parameters associated with a lumen: a unique lumen identifier, a unique patient identifier of a patient associated with the lumen, identifiers of one or more other medical devices connected to and/or forming the lumen, one or more medications previously delivered via the lumen, or any combination thereof.
Syringe information may include at least one of the following parameters associated with a syringe: a unique syringe identifier, a type of medication included in a syringe, a dosage of the medication included in the syringe, a concentration of the medication included in the syringe, a prescribed rate of delivery of the medication included in the syringe, or any combination thereof.
Flush syringe information may include at least one of the following parameters associated with a flush syringe: a unique flush syringe identifier, a type of flush fluid contained in the flush syringe, a volume of flush fluid contained in the flush syringe, or any combination thereof.
In some non-limiting embodiments or aspects, processor 146 of sensor assembly 104 (and/or computing device 106) may determine, based on the one or more images, a patient identifier, a lumen identifier, a syringe identifier, and/or a flush syringe identifier and access, using the patient identifier, the lumen identifier, the syringe identifier, and/or the flush syringe identifier, a database (e.g., at computing device 106, etc.) and/or a look-up table to determine patient information, lumen information, syringe information, and/or flush syringe information associated therewith.
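As a non-limiting illustration, the following is a minimal sketch of resolving decoded identifiers against such a look-up table; the table contents are illustrative placeholders rather than actual records, and in practice the look-up may be performed against a database at computing device 106.

```python
# Minimal sketch of a look-up table mapping decoded identifiers to the
# corresponding information; all records below are illustrative placeholders.
SYRINGE_LOOKUP = {
    "1234": {
        "medication": "example medication",
        "dosage_mg": 50,
        "concentration_mg_per_ml": 10,
        "prescribed_rate_ml_per_min": 1.0,
    },
}

PATIENT_LOOKUP = {
    "patient-001": {
        "approved_medications": ["example medication"],
        "allergies": [],
    },
}


def resolve_delivery_context(patient_id: str, syringe_id: str) -> dict:
    """Return the patient and syringe information associated with the decoded IDs."""
    return {
        "patient_info": PATIENT_LOOKUP.get(patient_id),
        "syringe_info": SYRINGE_LOOKUP.get(syringe_id),
    }
```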
In some non-limiting embodiments or aspects, information associated with a medication delivery for a patient may include a caregiver (e.g., nurse, etc.) identifier associated with sensor assembly 104 and/or a room identifier associated with sensor assembly 104. For example, sensor assembly 104 may be assigned to a particular caregiver and/or to a particular room of a hospital. As an example, the caregiver identifier and/or the room identifier may be used (e.g., by computing device 106, etc.) to calculate one or more analytics associated with the particular caregiver and/or the particular room.
Further details regarding non-limiting embodiments or aspects of step 704 of process 700 are provided below with regard to
As shown in
Further details regarding non-limiting embodiments or aspects of step 706 of process 700 are provided below with regard to
As shown in
As shown in
Referring now to
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
Referring now to
As shown in
As shown in
In some non-limiting embodiments or aspects, identifier element 118 of syringe 102b may be used by processor 146 of sensor assembly 104 (and/or computing device 106) to estimate a position of stopper 116 and/or plunger rod 114 of syringe 102b. For example, image processing software may record an initial position of stopper 116 and/or plunger rod 114 of syringe 102b relative to the position of identifier element 118 and, when stopper 116 and/or plunger rod 114 of syringe 102b moves relative to the position of identifier element 118, the image processing software may determine that a medication delivery procedure has begun. When stopper 116 and/or plunger rod 114 of syringe 102b advances a predetermined distance from identifier element 118, it may be assumed that the fluid delivery procedure is complete.
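By way of non-limiting illustration only, the following is a minimal sketch of classifying the delivery state from stopper travel measured relative to identifier element 118; the thresholds are assumed values, and the sign of the offset depends on where identifier element 118 sits on barrel 112, so only the magnitude of the movement is used here.

```python
# Minimal sketch of inferring delivery start/completion from stopper movement
# relative to identifier element 118; thresholds below are assumed values in
# whatever units the image processing software reports (e.g., millimeters).
def classify_delivery_state(
    initial_offset: float,            # stopper-to-tag distance recorded at the start
    current_offset: float,            # stopper-to-tag distance in the latest image
    start_threshold: float = 1.0,     # movement treated as "delivery begun" (assumed)
    complete_distance: float = 40.0,  # travel treated as "delivery complete" (assumed)
) -> str:
    """Classify the fluid delivery state from stopper travel relative to the tag."""
    moved = abs(current_offset - initial_offset)
    if moved >= complete_distance:
        return "complete"
    if moved >= start_threshold:
        return "in_progress"
    return "not_started"
```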
In some non-limiting embodiments or aspects, processor 146 of sensor assembly 104 (and/or computing device 106) may automatically identify the position of stopper 116 and/or plunger rod 114 of syringe 102b relative to other markers on syringe 102b. For example, the markers may be graduated lines and/or indicia on syringe barrel 112. As an example, the movement of stopper 116 and/or plunger rod 114 of syringe 102b relative to the markers may be used to determine not only initiation and dose, but also fluid volume delivered.
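As a further non-limiting illustration, the following is a minimal sketch of converting stopper travel read against the barrel graduations into delivered fluid volume, assuming the inner diameter of barrel 112 is known for the detected syringe type.

```python
# Minimal sketch of converting stopper travel into delivered volume, assuming
# the barrel inner diameter is known for the detected syringe type.
import math


def delivered_volume_ml(travel_mm: float, barrel_inner_diameter_mm: float) -> float:
    """Volume of the cylinder swept by the stopper, in mL (1 mL == 1 cm^3)."""
    radius_cm = (barrel_inner_diameter_mm / 2.0) / 10.0
    travel_cm = travel_mm / 10.0
    return math.pi * radius_cm ** 2 * travel_cm


# Usage (illustrative values for a nominal 5 mL syringe):
# delivered_volume_ml(travel_mm=10.0, barrel_inner_diameter_mm=12.0)
```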
In some non-limiting embodiments or aspects, processor 146 of sensor assembly 104 (and/or remote computing device 106) may determine, in response to detecting, in the one or more images, a backward or proximal movement of the stopper and/or the plunger rod of the syringe that satisfies a threshold distance, a positive lumen patency. For example, a flush syringe may be used to check catheter patency by aspirating (e.g., pulling back, etc.) on the plunger of the syringe, and if processor 146 of sensor assembly 104 (and/or remote computing device 106) determines that the plunger of the syringe has moved in a backward direction a distance that satisfies the threshold distance, processor 146 of sensor assembly 104 (and/or remote computing device 106) may confirm the positive catheter patency and/or provide an indication of the positive catheter patency to the user (e.g., via user feedback device 150, etc.). In some non-limiting embodiments or aspects, processor 146 of sensor assembly 104 (and/or remote computing device 106) may additionally, or alternatively, confirm the positive lumen patency in response to detecting, in the one or more images (e.g., using the image processing software, etc.), blood in the lumen (e.g., in the catheter line, etc.).
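For purposes of illustration only, the following is a minimal sketch of the patency check based on proximal plunger movement; the sign convention (positions increasing in the distal, delivery direction) and the threshold are assumptions rather than requirements.

```python
# Minimal sketch of flagging positive lumen patency from a proximal (backward)
# plunger movement that satisfies a threshold distance; the sign convention
# and threshold are assumed, in the same units used for plunger tracking.
def positive_patency(initial_position: float,
                     current_position: float,
                     threshold_distance: float) -> bool:
    """Return True if the plunger has moved proximally by at least the threshold."""
    # Positions are assumed to increase in the distal (delivery) direction,
    # so a decrease corresponds to aspiration (pulling back on the plunger).
    proximal_movement = initial_position - current_position
    return proximal_movement >= threshold_distance
```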
As shown in
In some non-limiting embodiments or aspects, processor 146 of sensor assembly 104 (and/or computing device 106) may control user feedback device 150 to provide guidance on rate of administration or delivery of the medication (e.g., fast, slow, "over X minutes", etc.). As an example, processor 146 of sensor assembly 104 (and/or computing device 106) may compare, during the fluid delivery procedure, a current delivery rate to a prescribed or approved delivery rate for the medication and, based on the comparison, provide guidance to maintain the delivery rate, increase the delivery rate, and/or reduce the delivery rate. For example, during medication delivery, the current rate of delivery of the medication may be variable over a delivery time period, and processor 146 of sensor assembly 104 (and/or computing device 106) may compare the current rate of delivery to the prescribed rate or an ideal steady rate and determine whether the current rate of delivery should be increased, decreased, or maintained to match the prescribed or ideal rate of delivery (and/or control user feedback device 150 to provide a prompt to the user to increase, decrease, or maintain the current rate of delivery to match the prescribed or ideal rate of delivery). As an example, some users may attempt to maintain a steady rate of delivery over a delivery time period (e.g., 1 minute, etc.); however, other users may deliver one quarter of the dose rapidly four separate times over the delivery time period, which may be a less ideal delivery from a pharmacokinetics perspective than the steadier rate of delivery over the delivery time period. In this way, non-limiting embodiments or aspects of the present disclosure may improve a steadiness of medication delivery.
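As a non-limiting illustration of such rate guidance, the following is a minimal sketch comparing the current delivery rate estimated from the images with the prescribed rate; the 10% tolerance band is an assumed value, not a clinical recommendation.

```python
# Minimal sketch of rate-of-administration guidance; the 10% tolerance band is
# an assumed value used only to illustrate the comparison.
def rate_guidance(current_rate_ml_per_min: float,
                  prescribed_rate_ml_per_min: float,
                  tolerance: float = 0.10) -> str:
    """Suggest whether to increase, decrease, or maintain the current delivery rate."""
    if prescribed_rate_ml_per_min <= 0:
        raise ValueError("prescribed rate must be positive")
    ratio = current_rate_ml_per_min / prescribed_rate_ml_per_min
    if ratio > 1.0 + tolerance:
        return "decrease delivery rate"
    if ratio < 1.0 - tolerance:
        return "increase delivery rate"
    return "maintain delivery rate"
```

The resulting string could be mapped to a prompt on user feedback device 150 (e.g., an LED color, a tone, or an on-screen message).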
Referring now to
As shown in
As shown in
As shown in
As shown in
As shown in
As shown in
Although embodiments or aspects have been described in detail for the purpose of illustration and description, it is to be understood that such detail is solely for that purpose and that embodiments or aspects are not limited to the disclosed embodiments or aspects, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any embodiment or aspect can be combined with one or more features of any other embodiment or aspect. In fact, any of these features can be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of possible implementations includes each dependent claim in combination with every other claim in the claim set.
The present application claims priority to U.S. Provisional Application No. 63/439,689 entitled “Sensor Assembly and System for Monitoring Push Medication Delivery to Patients from Syringes” filed Jan. 18, 2023, the entire disclosure of which is hereby incorporated by reference in its entirety.
Number | Date | Country
--- | --- | ---
63439689 | Jan 2023 | US