The present invention relates to the field of methods and systems for assessing an injection of a pharmaceutical product, and more particularly to methods and systems for assessing an injection of a pharmaceutical product performed using a syringe.
Medical syringe training is usually performed under the direct supervision of a professor or an experienced healthcare practitioner, a training approach that is both expensive and labor-intensive.
Furthermore, since the evaluation is performed by a human being, it is prone to error. For example, it may be difficult for a human evaluator to determine whether the insertion angle of a needle within a subject is adequate.
Therefore, there is a need for an improved method and system for assessing the insertion of a needle and/or the injection of a pharmaceutical product using the needle and a syringe.
According to a first broad aspect, there is provided a computer-implemented method for assessing an injection performed on a subject using a syringe provided with a needle, the method comprising: processing a sequence of images taken of the injection for: determining an insertion angle of the needle relative to the subject; and determining a depth of insertion of the needle within the subject; determining one of a speed of injection and a duration of injection; and outputting an indication of the insertion angle, the depth of insertion and the one of the speed of injection and the duration of injection.
In one embodiment, the insertion angle comprises an angle between a longitudinal axis of the syringe and a tangent line to a surface of the subject at a contact point between the needle and the surface of the subject.
In one embodiment, the step of determining an insertion angle comprises: identifying a given one of the images in which a distal end of the needle comes into contact with the subject; and measuring the insertion angle within the given one of the images.
In one embodiment, the step of identifying the given one of the images comprises processing the sequence of images for: tracking the needle and the subject within the sequence of images; and identifying the given one of the images as being a first image in the sequence of images in which first coordinates of the distal end of the needle correspond to second coordinates of a point of the surface of the subject. In another embodiment, the step of identifying the given one of the images comprises processing the sequence of images for: calculating a distance between a reference point located on one of the needle and the syringe and the surface of the subject; and identifying the given one of the images as being a first image in the sequence of images in which the calculated distance is one of equal to and less than a target distance.
In one embodiment, the step of determining an insertion angle comprises: identifying a plurality of the images in which a distal end of the needle comes into contact with the subject; measuring the respective insertion angle within each one of the plurality of the images; and calculating one of a median insertion angle and an average insertion angle based on the respective insertion angles, thereby obtaining the insertion angle.
In one embodiment, the step of determining a depth of insertion comprises processing the sequence of images for determining whether the needle has been entirely inserted into the subject.
In one embodiment, the step of determining a depth of insertion comprises processing the sequence of images for determining a given image in the sequence in which the needle has stopped moving along a longitudinal axis of the syringe.
In one embodiment, the step of determining a depth of insertion further comprises determining within the given image whether the needle is visible. In another embodiment, the step of determining a depth of insertion further comprises measuring, within the given image, a length of a visible portion of the needle. In a further embodiment, the step of determining a depth of insertion further comprises determining, within the given image, a position of a reference point on the syringe relative to a surface of the subject and calculating the depth of insertion based on the position of the reference point.
In one embodiment, the step of determining one of the speed of injection and the duration of injection comprises determining the duration of the injection.
In one embodiment, the duration of the injection corresponds to a time elapsed between a first image in which the needle comes into contact with the subject and a subsequent image in which the needle is no longer in contact with the subject. In another embodiment, the duration of the injection corresponds to a time elapsed between a first image in which the insertion depth has reached a desired depth and a subsequent image in which the needle is no longer in contact with the subject.
In one embodiment, the step of determining one of the speed of injection and the duration of injection comprises tracking a position of a plunger relative to a barrel of the syringe.
In one embodiment, the step of tracking is performed within the sequence of images.
In one embodiment, the method further comprises: determining whether the insertion angle is adequate, whether the depth of insertion is adequate and whether the one of the speed of injection and the duration of injection is adequate, thereby obtaining assessment results; and outputting the assessment results.
In one embodiment, the insertion angle is determined as being adequate if the insertion angle is between a predefined minimal angle and a predefined maximal angle.
In one embodiment, the depth of insertion is determined as being adequate by determining that the syringe comes into contact with the subject. In another embodiment, the depth of insertion is determined as being adequate if the depth of insertion is between a predefined minimal insertion depth and a predefined maximal insertion depth.
In one embodiment, the one of the speed of injection and the duration of injection is determined as being adequate if the one of the speed of injection and the duration of injection is between a first threshold and a second threshold.
According to another broad aspect, there is provided a non-transitory memory having stored thereon statements and instructions that upon execution by a processor perform the steps of the above computer-implemented method.
According to a further broad aspect, there is provided a system for assessing an injection of a pharmaceutical product performed on a subject using a syringe provided with a needle, the system comprising at least one processor and a memory, the memory having stored thereon statements and instructions that upon execution by the at least one processor perform the steps of: processing a sequence of images taken of the injection for: determining an insertion angle of the needle relative to the subject; and determining a depth of insertion of the needle within the subject; determining one of a speed of injection and a duration of injection; and outputting an indication of the insertion angle, the depth of insertion and the one of the speed of injection and the duration of injection.
In one embodiment, the insertion angle comprises an angle between a longitudinal axis of the syringe and a tangent line to a surface of the subject at a contact point between the needle and the surface of the subject.
In one embodiment, the step of determining an insertion angle comprises: identifying a given one of the images in which a distal end of the needle comes into contact with the subject; and measuring the insertion angle within the given one of the images.
In one embodiment, the step of identifying the given one of the images comprises processing the sequence of images for: tracking the needle and the subject within the sequence of images; and identifying the given one of the images as being a first image in the sequence of images in which first coordinates of the distal end of the needle correspond to second coordinates of a point of the surface of the subject. In another embodiment, the step of identifying the given one of the images comprises processing the sequence of images for: calculating a distance between a reference point located on one of the needle and the syringe and the surface of the subject; and identifying the given one of the images as being a first image in the sequence of images in which the calculated distance is one of equal to and less than a target distance.
In one embodiment, the step of determining an insertion angle comprises: identifying a plurality of the images in which a distal end of the needle comes into contact with the subject; measuring the respective insertion angle within each one of the plurality of the images; and calculating one of a median insertion angle and an average insertion angle based on the respective insertion angles, thereby obtaining the insertion angle.
In one embodiment, the step of determining a depth of insertion comprises processing the sequence of images for determining whether the needle has been entirely inserted into the subject.
In one embodiment, the step of determining a depth of insertion comprises processing the sequence of images for determining a given image in the sequence in which the needle has stopped moving along a longitudinal axis of the syringe.
In one embodiment, the step of determining a depth of insertion further comprises determining within the given image whether the needle is visible. In another embodiment, the step of determining a depth of insertion further comprises measuring, within the given image, a length of a visible portion of the needle. In a further embodiment, the step of determining a depth of insertion further comprises determining, within the given image, a position of a reference point on the syringe relative to a surface of the subject and calculating the depth of insertion based on the position of the reference point.
In one embodiment, the step of determining one of the speed of injection and the duration of injection comprises determining the duration of the injection.
In one embodiment, the duration of the injection corresponds to a time elapsed between a first image in which the needle comes into contact with the subject and a subsequent image in which the needle is no longer in contact with the subject. In another embodiment, the duration of the injection corresponds to a time elapsed between a first image in which the insertion depth has reached a desired depth and a subsequent image in which the needle is no longer in contact with the subject.
In one embodiment, the step of determining one of the speed of injection and the duration of injection comprises tracking a position of a plunger relative to a barrel of the syringe.
In one embodiment, the step of tracking is performed within the sequence of images.
In one embodiment, the at least one processor is further configured for determining whether the insertion angle is adequate, whether the depth of insertion is adequate and whether the one of the speed of injection and the duration of injection is adequate, thereby obtaining assessment results; and outputting the assessment results.
In one embodiment, the insertion angle is determined as being adequate if the insertion angle is between a predefined minimal angle and a predefined maximal angle.
In one embodiment, the depth of insertion is determined as being adequate by determining that the syringe comes into contact with the subject. In another embodiment, the depth of insertion is determined as being adequate if the depth of insertion is between a predefined minimal insertion depth and a predefined maximal insertion depth.
In one embodiment, the one of the speed of injection and the duration of injection is determined as being adequate if the one of the speed of injection and the duration of injection is between a first threshold and a second threshold.
According to still another broad aspect, there is provided a kit for assessing an injection performed using a syringe provided with a needle, the kit comprising: a subject comprising an anatomical model; a support comprising an opening for receiving therein a camera configured for capturing a sequence of images of the injection performed on the anatomical model; and a syringe provided with a needle.
In one embodiment, the anatomical model is shaped so as to mimic a shape of a portion of a body of a human being.
In one embodiment, the support is adapted to provide the camera, when received in the support, with a predefined orientation relative to a receiving surface on which the support is to be deposited.
In one embodiment, the kit further comprises a mat for receiving the anatomical model and the support thereon, the mat comprising marks thereon for indicating at least one of a position and an orientation for the anatomical model and the support.
According to still a further broad aspect, there is provided a method for assessing an injection performed on a subject using a syringe provided with a needle, the method comprising: performing the injection on the subject; concurrently taking a sequence of images of the performed injection; providing the sequence of images for processing to determine an insertion angle of the needle relative to the subject and a depth of insertion of the needle within the subject; and outputting an indication of the insertion angle and the depth of insertion.
Further features and advantages of the present invention will become apparent from the following detailed description, taken in combination with the appended drawings.
It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
In the following there is provided a computer-implemented method 10 for assessing an injection of a product in order to train a user to perform injections using a syringe provided with a needle. The method allows for automatically evaluating an injection performed by a user without the supervision of a professor or an experienced healthcare practitioner. There is also provided a kit to be used in connection with the assessment method. The kit comprises an anatomical model on which the injection is to be performed and a support for receiving a camera to be used for capturing images of the injection in the anatomical model. The kit may further comprise a mat for receiving the anatomical model and the support thereon while ensuring a predefined relative position between the anatomical model and the support.
It should be understood that the method 10 may be used to assess the injection of any pharmaceutical product that can be injected into a subject using a needle mounted on a syringe. For example, the pharmaceutical product may be a biological product, a chemical product, a medicinal product, or the like. For example, the pharmaceutical product can be a vaccine, insulin, etc. In an embodiment in which the method is used in a context of training and the subject is inanimate, the pharmaceutical product may be any adequate fluidic product such as air, water, or the like.
At step 12, a sequence of images illustrating the insertion of a needle into a subject by a user and the injection of a pharmaceutical product into the subject is received. The images sequentially illustrate a needle secured to a syringe moving towards the surface of the subject, the distal end of the needle coming into contact with the surface of the subject, the needle being inserted into the subject, and the actuation and displacement of the plunger of the syringe to deliver the pharmaceutical product. In one embodiment, the images further illustrate the extraction of the needle from the subject until the needle is no longer in contact with the subject. It should be understood that the received images are temporally ordered so that the position of a given image within the sequence corresponds to a respective point in time during the administration of the pharmaceutical product, since the number of images per second is fixed and known. In one embodiment, each image of the sequence is time-stamped so that a temporal order is provided to the sequence of images. In the following, since the images are temporally ordered, it should be understood that identifying or referring to a particular point in time is equivalent to identifying or referring to the corresponding image. For example, identifying the first point in time at which a needle comes into contact with the subject is equivalent to identifying the first image in the sequence in which the needle comes into contact with the subject, and vice versa.
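By way of non-limiting illustration, the correspondence between the position of an image in the sequence and a point in time may be computed as in the following sketch; the frame rate value and function name are assumptions made for the example only.

```python
FPS = 30.0  # assumed, illustrative frame rate (images per second)

def frame_to_time(frame_index: int, fps: float = FPS) -> float:
    """Map the position of an image in the sequence (0-indexed) to the
    corresponding point in time, in seconds, since the number of images
    per second is fixed and known."""
    return frame_index / fps

# Example: with 30 images per second, frame index 45 corresponds to 1.5 s.
assert frame_to_time(45) == 1.5
```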
In one embodiment, the images of the sequence are all received at once. For example, step 12 may comprise receiving a video file containing the sequence of images. In another embodiment, the images are iteratively received as they are being captured by a camera.
In one embodiment, the sequence of images is part of a video captured by at least one camera. As described below in greater detail, a single camera may be used to capture the insertion of the needle, the injection of the pharmaceutical product, and optionally the extraction of the needle. In another embodiment, at least two cameras may be used. In this case, step 12 comprises receiving a sequence of images from each camera and the received sequences of images all represent the same needle insertion and the same product injection but from different points of view or fields of view. For example, the cameras may be at different locations within the same plane or at different locations within different planes.
In one embodiment, the subject is an inanimate subject such as an object. For example, an inanimate object may be an anatomical model such as an object mimicking a part of a body such as a shoulder of a human being. In another embodiment, an inanimate object may be a fruit such as an orange. It should be understood that any adequate object in which a needle may be inserted may be used. For example, an inanimate object may be made of foam.
In another embodiment, the subject may be a living subject. For example, the subject may be a human being, an animal such as a mammal, etc.
Referring back to the flow chart of the method 10, the sequence of images is then processed to track the needle and the subject and to identify the image in which the distal end of the needle comes into contact with the surface of the subject.
It should be understood that any adequate method for analyzing images to recognize objects in images and thereby follow the position of objects from one image to another may be used. For example, any adequate machine learning model or deep learning model configured for recognizing objects/subjects within images may be used. For example, image segmentation and blob analysis may be used for identifying the needle and the subject within the sequence of images. In another embodiment, a convolutional neural network (CNN) may be trained to recognize the needle and the subject within the sequence of images.
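For instance, a minimal blob-analysis sketch using the OpenCV library is given below; the Otsu thresholding strategy and the minimal blob area are illustrative assumptions, and in practice any adequate segmentation method or trained model may be substituted.

```python
import cv2
import numpy as np

def locate_candidate_objects(frame: np.ndarray, min_area: float = 50.0):
    """Segment one image and return the contours of candidate objects
    (e.g., the needle, the syringe, the subject) whose positions can then
    be followed from one image to another."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Discard small blobs that are unlikely to be the tracked objects.
    return [c for c in contours if cv2.contourArea(c) >= min_area]
```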
It should also be understood that any adequate method for determining, in an image, that the needle comes into contact with the surface of the subject may be used. For example, once the subject and the needle have been recognized and tracked in the images, the point of contact between the needle and the subject may be established when the distal end of the needle is positioned on the surface of the subject. For example, the position of the distal end of the needle may be tracked from one image to another and the point of contact between the needle and the subject is established when the coordinates of the distal end of the needle correspond to the coordinates of one point of the surface of the subject.
In another example, a machine learning model such as a deep learning model may be trained to detect whether the distal end of a needle is in contact with the surface of a subject. In this case, the point of contact between the distal end of the needle and the surface of the subject is determined using the machine learning model.
In a further example, the point of contact between the distal end of the needle and the surface of the subject may be determined by calculating the distance between a reference point located on the needle or the syringe and the surface of the subject and comparing the calculated distance to a target or reference distance. When the calculated distance is equal to or less than the target distance, it is assumed that the distal end of the needle is in physical contact with the subject. In this case, the method 10 further comprises a step of receiving the target distance and, optionally, an identification of the reference point.
Referring to the appended figures, the reference point may be located on the needle, on an adapter securing the needle to the barrel 54 of the syringe, or on the syringe itself.
In one embodiment, the distance between the reference point and the subject corresponds to the distance between the reference point and the surface of the subject along the longitudinal axis of the needle (which also corresponds to the longitudinal axis of the barrel 54). In another embodiment, the distance between the reference point and the subject corresponds to the shortest distance between the reference point and the surface of the subject.
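A minimal sketch of the distance computation and of the contact test is given below, assuming the surface of the subject is represented by a detected contour; the use of OpenCV's pointPolygonTest for the shortest-distance variant is an illustrative choice.

```python
import cv2

def shortest_distance_to_surface(reference_point, subject_contour) -> float:
    """Shortest distance, in pixels, between the reference point and the
    surface of the subject (represented by its detected contour).
    pointPolygonTest returns a signed distance when its third argument is
    True; the absolute value is taken here."""
    x, y = float(reference_point[0]), float(reference_point[1])
    return abs(cv2.pointPolygonTest(subject_contour, (x, y), True))

def contact_detected(reference_point, subject_contour,
                     target_distance: float) -> bool:
    """Contact is assumed when the calculated distance is equal to or
    less than the received target distance."""
    return shortest_distance_to_surface(
        reference_point, subject_contour) <= target_distance
```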
Referring back to the flow chart of the method 10, once the image in which the needle comes into contact with the subject has been identified, the insertion angle of the needle relative to the subject is determined from the received images.
In one embodiment, the insertion angle of the needle is calculated once only. For example, the first image of the sequence in which a point of contact between the needle and the subject is detected may be identified and the insertion angle may be calculated only in this first image.
In another embodiment, the insertion angle is iteratively calculated at several points in time (or in several images of the sequence) during the insertion of the needle within the subject. For example, the insertion angle may be calculated for each image following the detection of the point of contact between the needle and the subject. In one embodiment, the calculation of the insertion angle is stopped once a desired insertion depth is reached such as when the needle has been entirely inserted into the subject. In another embodiment, the calculation of the insertion angle is stopped once the syringe or the needle stops moving relative to the subject along the longitudinal axis of the syringe/needle.
At step 18, the depth of insertion of the needle into the subject is determined from the received images. It should be understood that any adequate method for determining the depth of insertion of the needle within the subject may be used. The insertion of the needle into the subject occurs from a first point in time (or a first image) at which the needle comes into contact with the surface of the subject until a second point in time (or a second image) at which the syringe stops moving relative to the subject along the longitudinal axis of the syringe. In one embodiment, the second point in time corresponds to the point in time at which the syringe has stopped moving relative to the subject along the longitudinal axis of the syringe for a predetermined period of time. The depth of insertion corresponds to the length of the portion of the needle that is inserted into the subject at the second point in time.
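A minimal sketch of the detection of the second point in time is given below; the stationarity tolerance and the predetermined period, expressed here in numbers of images, are illustrative assumptions.

```python
def find_stop_frame(axial_positions, eps: float = 1.0,
                    hold_frames: int = 15):
    """Return the index of the first image at which the syringe has
    stopped moving relative to the subject along its longitudinal axis
    for a predetermined number of consecutive images.

    axial_positions: per-image position, in pixels, of a reference point
    projected onto the longitudinal axis of the syringe."""
    still = 0
    for i in range(1, len(axial_positions)):
        if abs(axial_positions[i] - axial_positions[i - 1]) < eps:
            still += 1
            if still >= hold_frames:
                return i - hold_frames  # first image of the stationary run
        else:
            still = 0
    return None  # the syringe never stopped moving
```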
In one embodiment, the user is instructed to insert the whole needle into the subject. In this case, step 18 may consist in determining whether at the second point in time (or in the second image) the needle has been inserted entirely into the subject, i.e., whether the needle is visible or not in the second image corresponding to the second point in time. In this case, the insertion depth may have two values: “entirely inserted” and “partially inserted”. In another example, the insertion depth may have the two following values: “needle visible” and “needle not visible”. It should be understood that any adequate method for determining if a whole needle has been inserted into a subject from images or determining whether a needle is visible in images may be used.
In one embodiment, the needle is considered to be entirely inserted into the subject when a reference point comes into contact with the subject at the second point in time (i.e., the point in time at which the syringe stops moving relative to the subject along the longitudinal axis of the syringe or the point in time at which the syringe has stopped moving relative to the subject along the longitudinal axis of the syringe for a predetermined period of time). For example, turning to the appended figures, the reference point may be located on the adapter securing the needle to the barrel of the syringe, in which case the needle is considered to be entirely inserted when the adapter comes into contact with the surface of the subject.
In another embodiment, machine learning models such as deep learning models may be trained to determine whether a needle is entirely inserted into a subject. For example, the machine learning model may be trained to determine whether a reference point on the syringe or the adapter is in contact with the surface of the subject. The image taken at the second point in time is then analyzed by the machine learning model to determine whether the needle is entirely inserted into the subject. The machine learning model may analyze the received sequence of images and identify the first image in which the needle is entirely inserted into the subject. If no such image is identified by the machine learning model, then it is concluded that the needle was not entirely inserted into the subject. In another example, the machine learning model may analyze the received sequence of images, identify the first image in which the needle stops moving relative to the subject along the longitudinal axis of the syringe and determine whether the needle is visible in the first image. The machine learning model outputs the value “visible” if the needle is visible in the first image and the value “not visible” if the needle is not visible in the first image.
In a further embodiment, the depth of insertion of the needle is determined from the images by measuring the length of the visible portion of the needle within the images. A given image of the sequence in which the syringe has stopped moving relative to the subject along the longitudinal axis of the syringe is identified. The length of the visible portion of the needle within the identified image is determined and the insertion depth is calculated as being the length of the needle minus the determined length of the visible portion of the needle. If the needle is no longer visible, then it is assumed that the whole needle has been inserted into the subject and the insertion depth is equal to the length of the needle. In this embodiment, the method 10 further comprises the step of receiving the length of the needle.
In a further embodiment, the motion of a reference point located on the syringe or the adapter is tracked within the images starting from the first point in time at which the contact between the needle and the surface of the subject has been detected until the second point in time at which the reference point stops moving relative to the subject along the longitudinal axis of the syringe. As described above, the reference point may be located on the adapter securing the needle to the barrel of a syringe or on the syringe, such as at the distal end of the syringe. In one embodiment, the depth of insertion of the needle corresponds to the distance travelled by the reference point along the longitudinal axis of the syringe between the first and second points in time. In another embodiment in which the length of the needle is known, the distance between the reference point and the surface of the subject along the longitudinal axis of the syringe is determined at the second point in time and the depth of insertion of the needle within the subject can be determined from the determined distance and the length of the needle. For example, if the reference point is located on the adapter, then the needle is considered to be entirely inserted into the subject if the measured distance between the adapter and the surface of the subject is substantially equal to zero. In this embodiment, the method 10 further comprises a step of receiving an identification of the reference point.
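The two depth computations described above may be sketched as follows; the conversion of pixel measurements to physical lengths via a known scale is an assumption of the example.

```python
def depth_from_visible_length(needle_length: float,
                              visible_length: float) -> float:
    """Insertion depth as the length of the needle minus the measured
    length of its visible portion (both in the same unit); a visible
    length of zero means the whole needle has been inserted."""
    return needle_length - visible_length

def depth_from_reference_travel(position_at_contact: float,
                                position_at_stop: float) -> float:
    """Insertion depth as the distance travelled by the reference point
    along the longitudinal axis of the syringe between the first point in
    time (contact) and the second point in time (stop)."""
    return abs(position_at_stop - position_at_contact)
```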
Referring back to the flow chart of the method 10, at step 20, the speed of injection or the duration of injection is determined.
In one embodiment, the speed of injection refers to the amount of pharmaceutical product injected per unit of time, such as per second. For example, knowing the diameter of the barrel of the syringe, the amount of pharmaceutical product injected per unit of time can be determined based on the displacement speed of the plunger or the injection duration. In this case, the method further comprises a step of receiving the diameter of the barrel of the syringe.
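For example, assuming the plunger displacement speed has been measured and the barrel diameter has been received, the injected amount per unit of time follows from the cross-section of the barrel; the sketch below is illustrative only.

```python
import math

def injection_rate_ml_per_s(barrel_diameter_mm: float,
                            plunger_speed_mm_per_s: float) -> float:
    """Amount of pharmaceutical product injected per second:
    Q = pi * (d / 2)**2 * v, converted from mm^3/s to mL/s
    (1 mL = 1000 mm^3)."""
    radius_mm = barrel_diameter_mm / 2.0
    flow_mm3_per_s = math.pi * radius_mm ** 2 * plunger_speed_mm_per_s
    return flow_mm3_per_s / 1000.0

# Example: a 10 mm diameter barrel with the plunger moving at 5 mm/s
# delivers approximately 0.39 mL/s.
```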
It should be understood that any adequate method for determining the injection duration or the injection speed may be used.
In one embodiment, the injection speed or the injection duration is determined from the received images. In another embodiment, a tracking system such as a triangulation tracking system may be used to determine and track the position of the plunger of the syringe. For example, the plunger may be provided with a signal emitting device configured for emitting a signal such as a radio frequency signal and sensors are used to detect the emitted signal. As known in the art, the position of the plunger, such as the position of the distal end of the plunger inserted into the barrel of the syringe, may then be determined from the signals received by the sensors. Knowing the position in time of the plunger, the injection duration, i.e., the time taken by the plunger to move between two extreme positions, and/or the speed of injection, i.e., the speed at which the plunger moves between the two extreme positions, may be determined.
In an embodiment in which the images are used to determine the injection duration, the injection duration may be assumed as being the period of time elapsed between the point in time at which the distal end of the needle came into contact with the subject and the subsequent point in time at which the needle is no longer in contact with the subject (i.e., the point in time at which the needle is extracted from the subject). In this case, the duration of the injection corresponds to the time elapsed between the first image at which the needle comes into contact with the subject and the first subsequent image at which the needle is no longer in contact with the subject. The injection speed may then correspond to the speed of displacement of the plunger during the motion of the plunger between the first image at which the needle comes into contact with the subject and the first subsequent image at which the needle is no longer in contact with the subject.
In another embodiment in which the images are used to determine the injection duration, the injection duration may be assumed as being the period of time elapsed between the first point in time at which the insertion depth has reached the desired depth and the first subsequent point in time at which the needle is no longer in contact with the subject. In this case, the duration of the injection corresponds to the time elapsed between the first image at which the insertion depth has reached the desired depth and the first subsequent image at which the needle is no longer in contact with the subject. The injection speed may then correspond to the speed of displacement of the plunger during the motion of the plunger between the first image at which the insertion depth has reached the desired depth and the first subsequent image at which the needle is no longer in contact with the subject.
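In either embodiment, the duration follows directly from the indices of the two identified images and the known frame rate, as in this illustrative sketch:

```python
def injection_duration_s(start_frame: int, release_frame: int,
                         fps: float) -> float:
    """Time elapsed between the first image in which the needle comes
    into contact with the subject (or in which the desired insertion
    depth is reached, depending on the embodiment) and the first
    subsequent image in which the needle is no longer in contact with
    the subject."""
    return (release_frame - start_frame) / fps
```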
In a further embodiment in which the images are used to determine the injection duration or the injection speed, the injection duration or the injection speed is determined based on the position in time of the plunger relative to the barrel. For example, a reference point on the plunger, such as the distal end of the plunger, may be localized within the images and the motion of the distal end of the plunger relative to the barrel may be tracked between its two extreme positions while the pharmaceutical product is injected. By tracking the position of the distal end of the plunger, the injection duration and/or the injection speed may be determined.
In one embodiment, at least one portion of the plunger, such as the distal end of the plunger or the plunger head, may be provided with a predefined color so as to allow easier localization of the plunger within the images. The extremities of the barrel may also be provided with a respective color while still being translucent and the portion of the barrel extending between the two extremities may be substantially transparent. In this case, when the plunger head moves away from the proximal extremity, the color of the proximal extremity is revealed. Conversely, as the plunger head reaches the distal extremity, the color of the distal extremity as perceived by the camera changes. The position of the plunger head relative to the barrel may then be approximated by merely detecting the changes in color of the barrel extremities.
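As a non-limiting illustration, the color-based localization of the plunger head may be sketched as follows; the red HSV range is an assumption made for the example, the actual range depending on the predefined color used.

```python
import cv2
import numpy as np

def locate_plunger_head(frame_bgr: np.ndarray):
    """Locate the plunger head by its predefined color and return its
    centroid in pixel coordinates, or None if no matching region is
    found. An HSV range for red is assumed purely for illustration."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```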
At step 22, the insertion angle, the insertion depth and the injection duration or speed are outputted. For example, they may be stored in memory. In another example, they may be provided for display on a display unit. In a further example, they may be transmitted to a computer machine.
In one embodiment, the method 10 further comprises a step of evaluating the injection parameters, i.e., the determined insertion angle, insertion depth and injection duration or speed.
In one embodiment, the determined insertion angle is compared to two angle thresholds, i.e., a minimal angle and a maximal angle. If the insertion angle is between the minimal and maximal angles, then the insertion angle is identified as being adequate. Otherwise, the insertion angle is identified as being inadequate.
In an embodiment in which the insertion angle is determined at different points in time during the insertion of the needle, each determined insertion angle may be compared to the minimal and maximal insertion angles. If at least one of the determined insertion angles is not between the minimal and maximal insertion angles, then the insertion of the needle may be considered as being inadequate. If all of the determined insertion angles are between the minimal and maximal insertion angles, then the insertion of the needle is considered to be adequate. In another example, the median or the mean of the different determined insertion angles may be compared to the minimal and maximal insertion angles. If the median or the mean of the determined insertion angles is not between the minimal and maximal insertion angles, then the insertion of the needle may be considered as being inadequate. Otherwise, the insertion of the needle is considered to be adequate.
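Both evaluation strategies may be sketched as follows; the function name and the strategy labels are illustrative, and the thresholds are received parameters.

```python
from statistics import mean, median

def evaluate_insertion_angles(angles_deg, min_angle: float,
                              max_angle: float,
                              strategy: str = "all") -> bool:
    """Evaluate the measured insertion angle(s) against the minimal and
    maximal angles. Strategy 'all' requires every measured angle to lie
    within the range; 'median' and 'mean' compare a single aggregated
    angle instead."""
    if strategy == "median":
        return min_angle <= median(angles_deg) <= max_angle
    if strategy == "mean":
        return min_angle <= mean(angles_deg) <= max_angle
    return all(min_angle <= a <= max_angle for a in angles_deg)

# Example with the 80-100 degree range used further below:
# evaluate_insertion_angles([88.0, 91.5, 90.2], 80.0, 100.0) -> True
```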
In one embodiment, the determined insertion depth is compared to at least one depth threshold and the determined insertion depth is identified as being adequate or inadequate based on the comparison. For example, the determined insertion depth may be compared to a minimal depth. If the determined insertion depth is less than the minimal depth, then the determined depth is considered as being inadequate. Otherwise, the determined depth is considered as being adequate.
In an embodiment in which the needle has to be entirely inserted into the subject for the insertion to be adequate and the step 18 of determining the insertion depth consists in determining whether the needle is entirely inserted into a subject, the output value of step 18 is compared to a target value, e.g., “entirely inserted” or “not visible”. If the output value of step 18 corresponds to the target value, then the insertion depth is considered as being adequate. Otherwise, if the output value of step 18 does not correspond to the target value, then the insertion depth is considered as being inadequate. For example, if the two possible output values for step 18 are “visible” and “not visible”, the target value is “visible” and the actual output value determined at step 18 is “not visible”, then it is determined that the insertion depth is inadequate. However, if the actual output value determined at step 18 is “visible”, then it is determined that the insertion depth is adequate.
In one embodiment, the determined injection duration or speed is compared to at least one injection threshold. For example, in an embodiment in which the injection duration is determined at step 20, the determined injection duration may be compared to a minimal duration. If the determined injection duration is less than the minimal duration, the injection duration is identified as being inadequate. Otherwise, the injection duration is identified as being adequate. Similarly, in an embodiment in which the injection speed is determined at step 20, the determined injection speed may be compared to a maximal speed. If the determined injection speed is greater than the maximal speed, the injection speed is identified as being inadequate. Otherwise, the injection speed is identified as being adequate.
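The threshold comparison for the timing parameter may be sketched as follows; the function name and the example range are illustrative.

```python
def evaluate_injection_timing(value: float, first_threshold: float,
                              second_threshold: float) -> bool:
    """The injection duration or speed is adequate when it lies between
    the first and second thresholds (e.g., a minimal and a maximal
    duration)."""
    return first_threshold <= value <= second_threshold

# Example with the 3.5 s to 6.5 s duration range used further below:
# evaluate_injection_timing(5.0, 3.5, 6.5) -> True
```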
Once the parameters of the injection have been evaluated, the evaluation results are outputted, i.e., an indication as to whether the determined insertion angle, the determined insertion depth and the determined injection duration or speed are adequate is outputted. For example, the evaluation results may be stored in memory. In another example, they may be provided for display on a display unit.
In one embodiment, the method 10 further comprises the step of capturing the sequence of images using a camera.
In one embodiment, the steps 14 to 20 are performed in substantially real-time while the images are being acquired.
In one embodiment, the evaluation of the determined insertion angle, insertion depth and injection duration or speed is performed in substantially real-time while the camera acquires the images. In this case, the injection parameters are evaluated as the images are received, and substantially real-time feedback can be provided to the user. For example, when it is detected that the needle came into contact with the subject, the insertion angle may be determined and evaluated and an indication as to whether the insertion angle is adequate can be provided to the user, thereby allowing the user to correct the insertion angle in the event the determined insertion angle is considered to be inadequate.
In one embodiment, the method 10 may be used for training a user such as a medical student without the presence of a trainer such as a professor, a supervisor or the like. In this case, the user records a video while performing an injection and the video is transmitted to a computer machine such as a server that executes the method 10 to calculate the injection parameters and optionally evaluate the injection parameters. Hence, a user may be evaluated without requiring the presence of a trainer.
As described above, the method 10 may be used when an inanimate subject is used. In this case, the pharmaceutical product may be air for example. Such a scenario is particularly adequate for training users, especially training users remotely.
In an embodiment in which the subject is a living subject, the method 10 may be used for evaluating medical staff members such as nurses and allowing them to improve their skills.
In one embodiment, the method 10 may be embodied as a non-transitory memory having stored thereon statements and instructions that when executed by a processing unit perform the steps of the method 10.
In another embodiment, the method 10 may be embodied as a system comprising at least one processing unit configured for performing the steps of the method 10.
The camera 102 captures a video of the injection which is transmitted to the computer machine 104. In one embodiment, the camera 102 may be integral with the computer machine 104 such as when the computer machine 104 is a laptop, a smartphone, a tablet, or the like.
In one embodiment, the computer machine 104 receives the video and transmits the received video to the server 106 over a communication network such as the Internet. In this case, the server 106 is configured for performing the steps of the method 10. For example, the server 106 may be configured to perform the evaluation of the injection. The evaluation results may be stored in memory by the server 106 and/or displayed on a display connected to the server 106. The server 106 may also transmit the evaluation results to the computer machine 104, which may provide the received evaluation results for display on a display connected to the computer machine 104.
In one embodiment, the server 106 may be omitted and the computer machine 104 is configured for performing the steps of the method 10.
In the following there is described a system 200 for assessing an injection of a pharmaceutical product performed on a subject using a syringe provided with a needle, in which a smartphone or a tablet is used to capture the injection.
As illustrated in the appended figures, the system 200 comprises an anatomical model 202 acting as the subject, a support 204, and a smartphone 206 provided with a camera and received in the support 204.
In the illustrated embodiment, the anatomical model 202 mimics the shape of a portion of a shoulder. As illustrated, the anatomical model 202 comprises a working surface 212 on which the injection is to be performed and lateral faces 214.
It should be understood that the anatomical model 202 is made of any adequate material allowing the insertion of a needle therein. For example, the anatomical model 202 may be made of foam.
Referring back to the figures, the support 204 comprises an opening for receiving the smartphone 206 therein.
The support 204 is positioned relative to the anatomical model 202 so that the anatomical model 202 is in the field of view of the camera of the smartphone 206. In one embodiment, the relative positioning of the support 204 and the anatomical model 202 is chosen so that one of the lateral faces 214 is parallel to the plane of the camera of the smartphone when the smartphone is received in the support 204, i.e., parallel to the plane in which the front face of the smartphone extends.
In one embodiment, the support 204 is shaped and sized to provide a predefined orientation of the plane of the camera of the smartphone 206 relative to the surface on which the support 204 is positioned. For example, the support 204 may be shaped and sized so that the plane of the camera, e.g., the plane in which the smartphone extends, is orthogonal to the surface on which the support 204 is deposited.
In one embodiment, the support 204 is positioned at a predefined distance from the anatomical model 202. In one embodiment, the system 200 further comprises a mat on which the anatomical model 202 and the support 204 are to be installed. The mat is provided with reference marks thereon to help the user adequately position and orient the anatomical model 202 and the support 204 relative to one another. For example, the mat may comprise a first mark thereon for adequately positioning the anatomical model 202 and a second mark thereon for adequately positioning the support 204. Proper use of the mat ensures that the anatomical model 202 and the support 204 are at a predefined distance from one another and that the smartphone 206, when received in the support 204, is adequately oriented relative to a lateral face 214 of the anatomical model 202.
The smartphone 206 is provided with an application stored thereon as described below. The application is configured for guiding the user to install the smartphone 206 into the support 204 and to position and orient the support 204 relative to the anatomical model 202. The application is further configured for receiving the video of the injection captured by the camera of the smartphone 206 and transmitting the received video of the injection to a server such as the server 106, which executes the steps of the method 10. The evaluation results of the injection parameters generated by the server are transmitted to the smartphone 206, which displays the received results on its display.
In one embodiment, the application is configured for transmitting the images captured by the camera as they are captured. In another embodiment, the application is configured for transmitting the images only once the recording of the video has ended.
Prior to the recording of the injection, the application displays a series of interfaces guiding the user through the installation of the system 200: the user is guided to position and orient the anatomical model 202 and the support 204, for example using the marks on the mat, to insert the smartphone 206 into the support 204, and to input information such as whether he is right- or left-handed.
Once the user indicates that the installation is completed, the application displays an interface for starting the recording of the injection. The recorded video is then transmitted to the server for analysis.
The first step of the analysis performed by the server is the detection of the subject or anatomical model 202 within the video frame. Once the anatomical model 202 has been detected, a verification step is performed, i.e., the server ensures that the detected object identified as the anatomical model 202 does not move for a given period of time, such as at least 0.5 seconds. If the object identified as the anatomical model 202 moves during this period of time, the server concludes that the identified object is not the anatomical model 202 and transmits an error message to the smartphone to be displayed thereon. For example, the error message may indicate that the anatomical model 202 has to be placed at the location indicated on the mat.
Once the anatomical model 202 has been identified, the orientation of the working surface 212 of the anatomical model 202 is detected and validated against the information inputted by the user regarding whether he is right- or left-handed.
Then the syringe is detected within the frames of the video. If the syringe is not detected in the video frames, an error message is generated by the server and transmitted to the smartphone 206 to be displayed thereon. For example, the error message may inform the user that the syringe was not detected and request the user to ensure that proper lighting of the room is used.
Once the syringe has been detected, the server identifies the first frame in which the distal end of the needle comes into contact with the surface of the anatomical model 202. If it cannot detect a contact between the needle and the anatomical model 202, the server generates an error message and transmits the error message to the smartphone 206 to be displayed thereon. The error message may indicate that no contact between the needle and the anatomical model 202 was detected and request the user to ensure that proper lighting of the room is used.
Once the contact between the needle and the anatomical model 202 has been detected, the server analyzes the subsequent frames to calculate the injection parameters, i.e., the insertion angle of the needle, the depth of insertion and the injection duration/speed. The server then compares the calculated parameters to respective thresholds, as described above.
If the calculated insertion angle is not adequate, the server generates and transmits to the smartphone 206 an error message to be displayed thereon. For example, when the insertion angle is not within a given range, such as between 80 degrees and 100 degrees, the error message may indicate that the insertion angle falls outside the recommended range.
If the calculated insertion depth is not adequate, the server generates and transmits to the smartphone 206 an error message to be displayed thereon. For example, when the needle has to be entirely inserted into the anatomical model 202 and the server determines that the needle was not entirely inserted, the error message may indicate that the needle needs to be fully inserted into the anatomical model 202.
If the calculated injection duration is not adequate, the server generates and transmits to the smartphone 206 an error message to be displayed thereon. For example, when the injection duration is not within a given range, such as between 3.5 seconds and 6.5 seconds, the error message may indicate that the injection was too fast or too slow.
If the calculated insertion angle, insertion depth and injection duration/speed are each found to be adequate, the server generates and transmits to the smartphone 206 an evaluation message indicating that all injection parameters were found adequate.
The application running on the smartphone may generate a graphical interface for displaying the results of the evaluation.
It should be understood that the server may execute any adequate methods or machine learning models configured for object recognition and tracking to locate and identify the anatomical model 202 and the syringe. For example, GOTURN or Kernelized Correlation Filters (KCF) may be used. In another example, a machine learning model such as YOLO, RetinaNet, SSD, Fast R-CNN, or the like may be used. In one embodiment, a bounding box is generated around the detected anatomical model 202.
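As an illustration of the tracking step, a minimal sketch using the KCF tracker shipped with OpenCV is given below; depending on the OpenCV build, the tracker may instead be exposed as cv2.legacy.TrackerKCF_create, and the initial bounding box is assumed to come from a detector.

```python
import cv2

def track_with_kcf(video_path: str, initial_box):
    """Track one detected object (e.g., the syringe or the anatomical
    model) across a video. initial_box is the (x, y, w, h) bounding box
    of the object in the first frame."""
    capture = cv2.VideoCapture(video_path)
    ok, frame = capture.read()
    if not ok:
        return []
    tracker = cv2.TrackerKCF_create()  # requires an opencv-contrib build
    tracker.init(frame, initial_box)
    boxes = [initial_box]
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        found, box = tracker.update(frame)
        boxes.append(box if found else None)  # None when the track is lost
    capture.release()
    return boxes
```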
It should also be understood that any adequate method may be used for determining the orientation of the working face 212 of the anatomical model 202. In one embodiment, the user may be requested to place at least one hand within the field of view of the camera and to the side of the working face 212. The orientation of the working face 212 may then be determined using the location of the hand(s) within the images. The detection of the hand(s) by the server may be performed using methods such as Histograms of Oriented Gradients (HOG), the Canny edge detector and/or a Support Vector Machine (SVM), or a trained machine learning model such as DeepPose.
In another example, the determination of the orientation of the working face 212 may be automatically performed by the server. For example, the server may execute methods such as HOG and an SVM to determine the orientation of the working face 212. In another example, a machine learning model such as a CNN with binary classification (i.e., right or left) may be trained to determine the orientation of the working face 212.
In one embodiment, once the working face 212 of the anatomical model 202 has been detected, a second bounding box is generated around the anatomical model 202.
Once the syringe has been detected, a bounding box is assigned to the syringe.
Once the contact has been detected, the server determines the insertion angle by determining the tangent to the surface of the anatomical model 202 at the contact point and calculating the angle between a first vector oriented along the longitudinal axis of the syringe and a second vector oriented along the determined tangent. However, it should be understood that any adequate method for calculating the insertion angle may be used. For example, the insertion angle may be assumed as corresponding to the angle of the diagonal of the bounding box associated with the syringe. In another example, Histograms of Oriented Gradients may be used for calculating the insertion angle. In a further example, Principal Component Analysis (PCA) may be used.
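For the PCA variant, the longitudinal axis of the syringe may be estimated as the first principal component of the pixel coordinates of its segmentation mask, as in this illustrative sketch; the insertion angle then follows as the angle between this axis and the tangent at the contact point.

```python
import numpy as np

def principal_axis_angle_deg(mask: np.ndarray) -> float:
    """Estimate the orientation of the syringe, in degrees relative to
    the horizontal image axis, by PCA on the coordinates of the nonzero
    pixels of its segmentation mask."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack((xs, ys)).astype(np.float64)
    pts -= pts.mean(axis=0)
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]  # first principal component
    return float(np.degrees(np.arctan2(axis[1], axis[0])))
```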
Then the server determines when the needle has been entirely inserted into the anatomical model 202.
While the above methods and systems are directed to the assessment of the injection of a pharmaceutical product using a needle and a syringe, it should be understood that the methods and systems may be adapted for the assessment of a blood withdrawal or a joint aspiration, for example. In this case, the step of determining the depth of insertion and/or the step of determining the speed or duration of injection may be omitted.
The embodiments of the invention described above are intended to be exemplary only. The scope of the invention is therefore intended to be limited solely by the scope of the appended claims.