Injection system

Information

  • Patent Grant
  • 12070581
  • Patent Number
    12,070,581
  • Date Filed
    Monday, November 4, 2019
  • Date Issued
    Tuesday, August 27, 2024
  • Inventors
  • Original Assignees
    • Truinject Corp. (Irvine, CA, US)
  • Examiners
    • Stigell; Theodore J
  • Agents
    • Knobbe, Martens, Olson & Bear, LLP
Abstract
A smart injection system that promotes patient safety includes, among other things, a smart stem that allows for measuring the location of the injection relative to the patient's face and/or the amount of medication injected into the patient. In addition, the smart stem has medication cartridge verification and injector verification features. The smart stem wirelessly transmits the measured data to a processing system.
Description
FIELD

The present disclosure is related to a smart injection system. In particular, the present disclosure is related to a smart injection system that promotes patient safety.


BACKGROUND

A variety of medical injection procedures are often performed in prophylactic, curative, therapeutic, or cosmetic treatments. Injections may be administered in various locations on the body, such as under the conjunctiva, into arteries, bone marrow, the spine, the sternum, the pleural space of the chest region, the peritoneal cavity, joint spaces, and internal organs. Injections can also be helpful in administering medication directly into anatomic locations that are generating pain. These injections may be administered intravenously (through the vein), intramuscularly (into the muscle), intradermally (into the skin), subcutaneously (into the fatty layer beneath the skin), or intraperitoneally (into the body cavity). Injections can be performed on humans as well as animals. The methods of administering injections vary across procedures and may depend on the substance being injected, needle size, or area of injection.


Injections are not limited to treating medical conditions, but may be expanded to treating aesthetic imperfections, restorative cosmetic procedures, procedures for treating migraines or depression, epidurals, or other therapeutic procedures. Many of these procedures are performed through injections of various products into different parts of the body. The aesthetics and therapeutic industry consists of two main categories of injectable products: neuromodulators and dermal fillers. The neuromodulator industry commonly utilizes nerve-inhibiting products such as Botox®, Dysport®, and Xeomin®. The dermal filler industry utilizes products administered by providers to patients for both cosmetic and therapeutic reasons, such as, for example, Juvederm®, Restylane®, Belotero®, Sculptra®, Artefill®, and others. These providers or injectors may include plastic surgeons, facial plastic surgeons, oculoplastic surgeons, dermatologists, primary caregivers, psychologists/psychiatrists, nurse practitioners, dentists, and nurses.


SUMMARY OF THE DISCLOSURE

Patient safety is of utmost importance in any injection procedure. One measure for ensuring patient safety is to train caregivers to have steady hands and provide the procedure in compliance with standard safety protocol. Some aspects of the disclosure are directed toward a smart injection system that can aid the caregiver in complying with the safety protocol. For example, the present disclosure provides features allowing for monitoring of an injection speed, aiding in locating the intended injection site, and facilitating injection of hyaluronidase at the last injection site when an artery occlusion occurs.


Another measure is to have an injection system with verification features to alert the caregiver and/or patient about counterfeit, and potentially unsafe, medication or other aspects of the injection treatment that are not in compliance with the standard of care. For example, the smart injection system is configured to read information about the manufacturer, serial/lot number, expiration date, and the like. Some aspects of the disclosure are directed toward a smart injection system allowing for components of the injection system to be authenticated to ensure that the patient is not receiving counterfeit or expired, therefore potentially unsafe, medication. The manufacturer can also receive the information about the medications actually used in patients from the injection system. The injection system can further allow identification of the injector to ensure that the person providing the injection is qualified to carry out the procedure.


An embodiment of an injection system can include a reusable electronic assembly configured to be coupled to at least a portion of a disposable syringe. The syringe can include at least a stem, a barrel and a needle. The electronic assembly can be configured to measure at least one of injection force and/or pressure, injection speed, displacement of the stem relative to the barrel, a volume of medication injected, medication flow rate, position and movement of the stem and/or the syringe relative to a patient's face, biometric data of a person performing the injection, and/or indication of authenticity of medication. The electronic assembly can be configured to communicate with a hardware processor that is configured to process the measured data. In some embodiments, the injection system can further comprise a syringe that includes a stem having a proximal cap and a shaft. The stem shaft is configured to engage a lumen of a barrel from a proximal end of the barrel, and a distal end of the barrel is configured to couple to a needle. The needle is configured to penetrate skin of a patient to inject medication contained in the syringe into the patient. In some embodiments, the syringe is disposable. In some embodiments, the hardware processor is configured to provide analysis of the measured data. In some embodiments, the electronic assembly further comprises a wireless data transmitter. In some embodiments, at least a portion of the electronic assembly is located on a snap-on data cap configured to be coupled with the proximal cap of the stem. In some embodiments, at least a portion of the electronic assembly is built into the proximal cap of the stem. In some embodiments, at least a portion of the electronic assembly is built into a removable flange configured to be coupled with the barrel at or near a proximal end of the barrel. In some embodiments, the hardware processor is on a remote server.


In some embodiments of the injection system, the injection speed, displacement of the stem relative to the syringe, volume of medication injected, and/or medication flow rate is measured by a displacement sensor located on or near a distal end of the proximal cap of the stem, the displacement sensor configured to send one or more acoustic or optical signals to a reflecting surface on the barrel and receive one or more reflected acoustic or optical signals. In some embodiments, the reflecting surface is on a proximal flange of the barrel. In some embodiments, the reflecting surface is on a friction collar circumferentially surrounding a portion of the stem shaft. In some embodiments, the injection speed, displacement of the stem relative to the syringe, volume of medication injected, and/or medication flow rate is measured by a resistance reader on the stem shaft. In some embodiments, the output of the hardware processor further comprises a warning that is at least one of audio, visual, or tactile feedback when the desired amount of displacement has been achieved and/or when the rate of displacement falls outside a desired range.


In some embodiments, the injection force and/or pressure is measured by a force or pressure sensor located on the proximal cap of the stem, the force or pressure sensor configured to measure data about a force or pressure applied on the proximal cap. In some embodiments, the output of the hardware processor further comprises a warning that is at least one of audio, visual, or tactile feedback when the measured force or pressure falls outside a desired range. In some embodiments, the force and/or pressure, injection speed, displacement of the stem relative to the syringe, volume of medication injected, or medication flow rate is measured by a velocity sensor. In some embodiments, the hardware processor is configured to process the measured injection pressure and medication flow rate to output an indication that an injection site is an artery.


In some embodiments of the injection system, the position and movement of the syringe relative to a patient's face is measured by an angular, rotational and positioning sensor, the angular, rotational and positioning sensor configured to interact with one or more landmarks on a patient's face to measure data about a location of the syringe relative to the patient's face. In some embodiments, the landmarks comprise at least one of a pair of glasses, a mouth guard, stickers or ink markings configured to be placed on the patient's face. In some embodiments, the output of the hardware processor further comprises a warning that is at least one of audio, visual, or tactile feedback when the syringe is targeting a no-injection zone or an indication that the syringe is targeting an approved zone.


In some embodiments of the injection system, the biometric data of the person performing the injection is measured by a fingerprint reader. In some embodiments, the output of the hardware processor further comprises a warning that is at least one of audio, visual, or tactile feedback when the biometric data does not match one qualified to perform the injection, the hardware processor having access to a database of qualified personnel.


In some embodiments of the injection system, the indication of authenticity of medication is measured by an identification device on the barrel and a corresponding identification device reader on the stem shaft, the corresponding identification device reader interacting with the identification device when the stem is pushed distally into the barrel lumen. In some embodiments, the identification device is one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the identification device is on the barrel shaft. In some embodiments, the identification device is near a proximal opening to the barrel lumen. In some embodiments, the indication of authenticity of medication is measured by an identification device on a lip of the barrel and a corresponding identification device reader on a removable flange, the corresponding identification device reader interacting with the identification device when the removable flange couples with the barrel lip. In some embodiments, the processor is configured to compare the data measured by the identification device reader with a database. In some embodiments, the indication of authenticity of medication comprises one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the syringe, instructions for use, or indications. In some embodiments, the output of the hardware processor further comprises a warning that is at least one of audio, visual, or tactile feedback when the medication is counterfeit, has expired, and/or the barrel has been used before.


An embodiment of a method of promoting patient safety during an injection procedure can be used with an injection system including a disposable syringe assembly and a reusable electronic assembly. The method can include sending, from the electronic assembly including a displacement sensor on a stem or a barrel of the disposable syringe assembly, one or more acoustic or optical signals in a direction substantially parallel to a longitudinal axis of the stem when the stem shaft is moving distally inside a lumen of the barrel, receiving at the displacement sensor one or more signals reflected from a reflecting surface on the barrel or the stem, and calculating displacement and/or rate of displacement of the stem relative to the barrel based on the time lapse between sending and receiving each signal. In some embodiments, the reflecting surface is on a proximal flange of the barrel. In some embodiments, the reflecting surface is on a friction collar circumferentially surrounding a portion of the stem shaft.


In some embodiments, the method further comprises communicating with a hardware processor to send measured data about the displacement and/or rate of displacement to the hardware processor. In some embodiments, the communicating with the hardware processor is performed via a wireless data transmitter on the injection system. In some embodiments, the method further comprises measuring a force or pressure applied when the stem shaft is pushed distally inside the barrel lumen using the electronic assembly including a force or pressure sensor, wherein the force or pressure sensor is located on the stem.


Another embodiment of a method of promoting patient safety during an injection procedure can be used with an injection system including a disposable syringe assembly and a reusable electronic assembly. The method can include measuring, using the reusable electronic assembly, a speed of a movement of a stem shaft of the disposable syringe assembly relative to a barrel of the disposable syringe assembly when the stem shaft moves distally inside a lumen of the barrel; and calculating for display purposes, based on the measured speed, one or more of injection force and/or pressure, injection speed, displacement of the stem relative to the barrel, a volume of medication injected, or medication flow rate. In some embodiments, the method further comprises determining a relationship of the measured force or pressure and the rate of displacement and outputting a warning that is audio, visual, or tactile feedback when the relationship indicates that the needle is in an artery.


Another embodiment of a method of medication verification can be used with an injection system including a disposable syringe assembly and a reusable electronic assembly. The method can include coupling a shaft of a stem of the disposable syringe assembly with a lumen of a barrel of the disposable syringe assembly, the barrel including an identification device containing information specific to a prefilled medication and/or the syringe containing the prefilled medication, the stem shaft including an identification device reader, and moving the stem shaft toward a distal end of the barrel, wherein the identification device reader is configured to read data from the identification device and communicate the data to a hardware processor configured to process the read data and output an indication related to information about the medication and/or the barrel. In some embodiments, the identification device includes encoded information specific to the prefilled medication and/or the syringe. In some embodiments, the identification device is one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the information specific to the prefilled medication and/or syringe comprises one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the barrel, instructions for use, and/or indications.


Another embodiment of a method of medication verification can be used with an injection system including a disposable syringe assembly and a reusable electronic assembly. The method can include coupling a removable flange to a lip of a barrel of the disposable syringe assembly, the barrel including an identification device containing information specific to a prefilled medication and/or the syringe containing the prefilled medication, the flange including an identification device reader. The identification device reader is configured to read data from the identification device when the removable flange is coupled to the lip of the barrel and communicate the data to a hardware processor configured to process the read data and output an indication related to information about the medication and/or the barrel. In some embodiments, the identification device includes encoded information specific to the prefilled medication and/or the syringe. In some embodiments, the identification device is one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the information specific to the prefilled medication and/or syringe comprises one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the barrel, instructions for use, and/or indications.


Another embodiment of a method of promoting patient safety during an injection procedure can be used with an injection system including a disposable syringe assembly and a reusable electronic assembly. The method can include receiving at a hardware processor data from a 9-axis inertial motion sensor (IMS) of the reusable electronic assembly located on the disposable syringe assembly, receiving at the hardware processor information about one or more landmarks on a patient's face, the one or more landmarks defining the patient's face, calculating a location and position of the injection system relative to the patient's face, and outputting an indication when the injection system is targeting an approved injection zone or a no-injection zone. In some embodiments, the 9-axis IMS is located on a data cap coupled to a proximal cap of a stem of the disposable syringe assembly. In some embodiments, the 9-axis IMS is located on a proximal cap of a stem of the disposable syringe assembly. In some embodiments, the 9-axis IMS is located on a removable flange coupleable to a lip of a barrel of the disposable syringe assembly.


Another embodiment of an injection system includes a reusable memory device configured to be coupled to a portion of a syringe assembly, the syringe assembly including at least a barrel, a stem and a needle. A memory device reader is configured to read data from the memory device and communicate the data to a hardware processor configured to process the read data and output an indication related to information about a medication held by the barrel and/or the syringe assembly. In some embodiments, the barrel contains prefilled medication. In some embodiments, the syringe assembly is disposable. In some embodiments, the memory device is on a shaft of the barrel. In some embodiments, the memory device is on a lip on a proximal end of the barrel. In some embodiments, the memory device is one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the data on the memory device comprise one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the barrel, instructions for use, and/or indications.


Another embodiment of an injection system includes a reusable memory device reader configured to be coupled to at least a portion of a syringe assembly, the syringe assembly including at least a stem, a barrel and a needle. The memory device reader is configured to read data from a memory device located on the barrel and communicate the data to a hardware processor configured to process the read data and output an indication related to information about the medication and/or the barrel. In some embodiments, the barrel contains prefilled medication. In some embodiments, the syringe assembly is disposable. In some embodiments, the memory device reader is on a shaft of the stem, wherein coupling the stem with the barrel allows the memory device reader to interact with the memory device on the barrel. In some embodiments, the memory device reader is on a proximal cap of the stem, wherein coupling the stem with the barrel allows the memory device reader to interact with the memory device on the barrel. In some embodiments, the memory device reader is on a reusable flange configured to be removably coupled to a lip of the barrel, wherein coupling the flange to the lip of the barrel allows the memory device reader to interact with the memory device located on the lip of the barrel. In some embodiments, the memory device reader is a reader for one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the data on the memory device comprise one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the barrel, instructions for use, and/or indications.


Another embodiment of an injection system includes a reusable memory device configured to be coupled to a barrel of a syringe assembly and a reusable memory device reader configured to be coupled to at least a portion of the syringe assembly. The syringe assembly can further include at least a stem and a needle. The memory device reader is configured to read data from the memory device located on the barrel and communicate the data to a hardware processor configured to process the read data and output an indication related to information about the medication and/or the barrel. In some embodiments, the barrel contains prefilled medication. In some embodiments, the syringe assembly is disposable. In some embodiments, the memory device is on a shaft of the barrel. In some embodiments, the memory device is on a lip on a proximal end of the barrel. In some embodiments, the memory device is one or more of an EEPROM, a barcode, an RFID tag and a resistor. In some embodiments, the data on the memory device comprise one or more of manufacturer's information, product type, serial number, lot number, date of expiration, prior use of the barrel, instructions for use, and/or indications. In some embodiments, the memory device reader is on a shaft of the stem, wherein coupling the stem with the barrel allows the memory device reader to interact with the memory device on the barrel. In some embodiments, the memory device reader is on a proximal cap of the stem, wherein coupling the stem with the barrel allows the memory device reader to interact with the memory device on the barrel. In some embodiments, the memory device reader is on a reusable flange configured to be removably coupled to a lip of the barrel, wherein coupling the flange to the lip of the barrel allows the memory device reader to interact with the memory device located on the lip of the barrel.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other features, aspects, and advantages of the present disclosure are described with reference to the drawings of certain embodiments, which are intended to schematically illustrate certain embodiments and not to limit the disclosure.



FIG. 1 is a simplified functional block diagram of an embodiment of the electronics assembly 100 of the smart syringe.



FIG. 2A illustrates an embodiment of a smart injection system with a data cap.



FIG. 2B illustrates a cross section of a proximal portion of the smart injection system of FIG. 2A.



FIG. 2C illustrates an example flowchart of displacement/rate of displacement measurement by an embodiment of a smart injection system.



FIG. 2D illustrates another embodiment of the smart injection system with a friction collar on a shaft of a smart stem.



FIG. 2E illustrates another embodiment of a smart injection system with a resistance wiper feature.



FIG. 3 illustrates stem pressure versus medication flow rate in tissue and in an artery for an embodiment of a smart injection system.



FIG. 4A illustrates another embodiment of a smart injection system with features for monitoring location of an injection site relative to the patient's face.



FIG. 4B illustrates an example flowchart of location determination by an embodiment of a smart injection system.



FIG. 4C illustrates another embodiment of a smart injection system with features for monitoring location of an injection site relative to the patient's face.



FIG. 5 illustrates another embodiment of the smart injection system with the electronics integrated into a stem cap.



FIG. 6A illustrates an embodiment of a smart injection system with an identification device on a syringe and an identification device reader on a stem.



FIG. 6B illustrates an example flowchart of medication/syringe verification by the smart injection system of FIG. 6A.



FIG. 6C illustrates another embodiment of a smart injection system with an identification device on a syringe and an identification device reader on a stem.



FIG. 7 illustrates an example flowchart of injector verification by an embodiment of a smart injection system.



FIG. 8A illustrates an exploded view of another embodiment of a smart injection system with the electronics integrated in a syringe flange.



FIG. 8B illustrates a side view of the syringe flange of FIG. 8A.



FIG. 9 illustrates examples of integrating features of a smart injection system.





DETAILED DESCRIPTION

Although certain embodiments and examples are described below, those of skill in the art will appreciate that the disclosure extends beyond the specifically disclosed embodiments and/or uses and obvious modifications and equivalents thereof. Thus, it is intended that the scope of the disclosure herein disclosed should not be limited by any particular embodiments described below.


Some aspects of the present disclosure are directed to a smart injection system having smart features that can, among other things, allow for measuring the location of the injection relative to the patient's face, guide the caregiver to the last injection site, measure the amount of medication injected into the patient and/or the speed and/or force of injection, authenticate medication to be injected in the patient, and verify identification of the injection provider.


Overview of Electronic Assembly


In some embodiments, a smart injection system includes a stem, a syringe, a needle, and an electronics assembly. The smart features of the injection system can be provided by interaction of the electronic assembly with at least one of the stem, syringe, needle assembly, or the patient. The smart injection system can wirelessly transmit measured data to a processing system that processes the data and can further transmit the data to one or more remote servers. FIG. 1 illustrates schematically an example electronics assembly 100 of a smart syringe. Physical locations and configurations of components of the electronics assembly 100 can vary and will be described in detail below.


As shown in FIG. 1, the electronics assembly 100 can include one or more sensors/readers 102, a processor 106, a wireless transceiver 108, and a power source (e.g., a battery) 110. The sensors/readers 102 that can be integrated into the electronic assembly are not limiting and can be any sensor or reader known in the art. Some examples include a position sensor, a force/pressure sensor, a proximity/displacement sensor, a biometric sensor, a velocity sensor, a resistance reader, a barcode reader, an EEPROM reader, or an RFID tag reader. The position sensor can be an accelerometer, gyroscope, and magnetometer with three-degree angular and three-degree rotational resolution, and with a sensor drift adjustment capability (pitch, yaw and roll). The force sensor can be capable of sensing up to twenty pounds (20 lbs.) of force with a 2 percent accuracy factor. The proximity/displacement sensor can be an optic range sensor or an acoustic sensor. One example of an optic range sensor is a time-of-flight (ToF) camera system such as the FlightSense™ Ranging products (STMicroelectronics, Geneva, Switzerland). One example of an acoustic range sensor is an ultrasonic sensor. A skilled artisan will appreciate that numerous other sensors and readers can be used in the disclosed electronics assembly 100 without departing from the scope of the disclosure herein. In some embodiments, the electronics assembly 100 communicates data by way of a wireless transceiver 108 employing, for example, a Bluetooth wireless communications protocol. Other forms of wireless communication can also be used.
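By way of a non-limiting illustration only, the following Python sketch models the data flow implied by FIG. 1: the sensors/readers 102 produce readings that the processor 106 packages and hands to the wireless transceiver 108. All class names, field names, and values here are hypothetical and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ElectronicsAssembly:
    """Hypothetical model of assembly 100: named sensor read functions feed a
    processor, which forwards packaged readings through a wireless transmit hook."""
    sensors: Dict[str, Callable[[], float]]       # e.g. force, displacement readers
    transmit: Callable[[Dict[str, float]], None]  # e.g. a Bluetooth send routine

    def sample_and_send(self) -> Dict[str, float]:
        readings = {name: read() for name, read in self.sensors.items()}
        self.transmit(readings)                   # forward to the processing system
        return readings

# Example usage with stand-in sensors and a print-based "transceiver".
assembly = ElectronicsAssembly(
    sensors={"force_lbs": lambda: 4.2, "displacement_mm": lambda: 12.7},
    transmit=lambda payload: print("TX:", payload),
)
assembly.sample_and_send()
```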


In some embodiments, the stem, syringe, and needle assembly of the smart injection system can be off-the-shelf components of any standard injection system, with the electronic assembly attached to one or more of the stem, syringe, and needle assembly before use. These embodiments can advantageously promote compatibility with most standard injection systems. The standard injection systems can be disposable, which can prevent cross-contamination due to re-use of any of the needle, syringe, or stem. In other embodiments, the stem 210, syringe 220, and needle assembly 230 can be custom-made to fit the electronic assembly 100. More details of these embodiments will be described below.


Data Cap with Force/Pressure Sensor



FIG. 2A illustrates a smart injection system 200 including the electronic assembly described above, a stem 210, a syringe 220, and a needle 230. In FIG. 2A, the stem 210, syringe 220, and needle 230 can be parts of any standard injection system with no built-in electronic components or smart features. The electronics assembly is attached to the standard injection system. For example, the electronic assembly can be built into a data cap 240, which can be coupled with a proximal cap 212 of the stem 210, making it a smart stem 210. The data cap 240 can be battery operated or rechargeable. One non-limiting example of coupling the data cap 240 and the proximal cap 212 of the stem 210 is a snap-fit feature. More detailed structures of the data cap 240 are shown in a cross sectional view of the injection system 200 in FIG. 2B. The data cap 240 can have an integrated cap 242 sliding over a cap body 244. The cap body 244 can have a slot 246 configured to accommodate the proximal cap 212 of the stem 210. A skilled artisan will appreciate that other means of coupling can be used, such as adhesives or clips. The electronic components, such as sensor(s)/reader(s), power source, and/or wireless transmitters can be built into an enclosed space formed by the integrated cap 242 and the cap body 244.


The data cap 240 can incorporate a printed circuit board (PCB) 250 having a force/pressure sensor. As illustrated in FIG. 2B, the PCB 250 can be located on a flat surface under a proximal end of the integrated cap 242 of the data cap 240. More specifically, the PCB 250, and therefore the force/pressure sensor, is sandwiched between the flat surface of the integrated cap 242 and a column on the cap body 244. When the caregiver pushes the stem 210 distally along the syringe shaft 224 toward the needle 230, forces applied to the integrated cap 242 are transmitted to the proximal cap 212 via the force/pressure sensor and the column. The data cap 240 can optionally have a return spring 248 biased to hold the data cap 240 in an expanded configuration when no force is applied on the data cap 240. Pushing onto the data cap 240 can compress the return spring 248 and cause the data cap 240 to transition to a compressed configuration. The return spring 248 can advantageously maintain a minimum height of the enclosed space, preventing the electrical components inside the enclosed space from touching an inner surface of the cap body 244.


When a shaft 214 of the stem is positioned within a syringe shaft 224 and the caregiver pushes 252 onto the data cap 240, the stem shaft 214 is constrained to move distally and proximally along a longitudinal axis of the syringe shaft 224. The force/pressure sensor can measure a force or pressure applied to the data cap 240, which is the force or pressure applied to inject medication in the syringe 220. The force/pressure sensor can communicate the measured force/pressure information to, for example, an external processing system such as an interface/display device, by way of a communications protocol. In some embodiments, the force/pressure sensor communicates data by way of the wireless transceiver 208 employing, for example, a Bluetooth wireless communications protocol. In some embodiments, the force/pressure sensor communicates data by way of a USB port using a cable that has a minimal diameter and is highly compliant.


In some embodiments, warnings can be given by the smart injection system 200 or the wireless transceiver 208 when the force/pressure measured by the force/pressure sensor exceeds or falls below a predetermined range. The form of warning is non-limiting, and can be audio, visual or tactile. By way of example, a beep or buzz can alert the caregiver and the patient of an inappropriate injection force/pressure. In another example, a flashing LED light can go off, or the smart injection system 200 can send a signal to, for example, a wristwatch worn by the caregiver, to provide tactile feedback. The visual and tactile alerts can alert only the caregiver so as not to agitate the patient during the injection procedure.
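As a non-limiting sketch of the threshold check described above, the Python snippet below flags a measured force that exceeds or falls below a predetermined range; the numeric limits are placeholders, not values from the disclosure.

```python
def check_injection_force(force_lbs: float,
                          low_lbs: float = 1.0,
                          high_lbs: float = 5.0) -> str:
    """Return a warning code when the measured force falls outside the
    predetermined range; the limits here are placeholder calibration values."""
    if force_lbs < low_lbs:
        return "warn_low"     # could drive a beep, flashing LED, or tactile alert
    if force_lbs > high_lbs:
        return "warn_high"
    return "ok"

print(check_injection_force(6.3))  # -> "warn_high"
```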


An ability of the smart injection system to measure the injection force/pressure can advantageously help in training the caregiver or medical personnel in providing a steady and desired injection force/pressure. A steady and appropriate injection force/pressure can reduce discomfort to the patient and ensure patient safety. Different medications may have different viscosities and require different ranges of injection force/pressure. The information about suitable ranges of injection force/pressure for various medications can be prepopulated in a processor of the wireless transceiver 208 before using the data cap 240. The information can also be encoded in a memory device included in the body of the syringe 220 during manufacturing and read by the data cap 240. For example, it can be in the form of an RFID tag, a bar code, a resistance value, or encoded in an EEPROM. In one embodiment, when the stem 210 and the data cap 240 coupled to the stem 210 are connected to the syringe 220, an electrical circuit can be completed such that the data cap 240 can communicate with any wired memory device on the syringe 220.


Data Cap with Displacement Sensor


With continued reference to FIGS. 2A-B, the data cap 240 can include a displacement sensor 260 as described herein. The displacement sensor 260 can be on a distal end of the data cap 240 such that when the data cap 240 is coupled to the proximal cap 212 of the stem 210, the displacement sensor 260 is at a distal side of the proximal cap 212. As shown in FIG. 2B, the displacement sensor 260 can be located on an outer surface of the cap body 244 of the data cap and directly faces a reflecting surface of a proximal syringe flange 222.


When the caregiver pushes onto the data cap 240, and therefore the proximal cap 212 of the stem 210, the displacement sensor 260 can detect axial travel/displacement of the stem shaft 214 within the syringe shaft 224. In some embodiments, the displacement sensor 260 detects a displacement of the stem shaft 214 with respect to time. As shown in FIG. 2C, the displacement sensor 260, such as the FlightSense™ sensor or an ultrasound sensor, can be activated 290 in any manner known in the art. For example, the sensor can be activated when the electronic assembly of the stem is charged or when the stem 210 connects to the syringe 220. The activated displacement sensor 260 can send a signal 292 toward the proximal flange 222 of the syringe 220 and receive the signal 294 when the signal returns upon hitting a reflecting surface on the proximal flange 222 facing the displacement sensor 260. The displacement sensor 260 can measure and record 296 the time taken between sending and receiving the signal. The displacement sensor 260 can then communicate the measured data to a processor or the wireless transceiver 208 in the manner described herein. The processor can calculate an instantaneous displacement or rate of displacement of the stem 210 relative to the syringe 220 by taking into account the speed of light or sound as the displacement sensor 260 repeats the steps 292, 294, 296 to provide the processor with data measured with respect to subsequent signals. A skilled artisan will appreciate that any type of optic range sensor or acoustic sensor can measure the displacement or rate of displacement of the stem 210 relative to the syringe 220.
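A minimal sketch, assuming a round-trip time-of-flight measurement, of how the displacement and rate of displacement could be computed from two consecutive readings; the speed constant and sample values are illustrative only.

```python
SPEED_OF_SOUND_M_S = 343.0   # for an ultrasonic sensor; use ~3.0e8 m/s for an optical ToF sensor

def gap_from_round_trip(round_trip_s: float, speed_m_s: float = SPEED_OF_SOUND_M_S) -> float:
    """One-way distance between the sensor and the reflecting surface."""
    return speed_m_s * round_trip_s / 2.0

def displacement_and_rate(t0_s: float, rt0_s: float, t1_s: float, rt1_s: float):
    """Stem displacement and rate between two samples, from two round-trip times.
    The gap shrinks as the stem advances distally toward the syringe flange."""
    d0 = gap_from_round_trip(rt0_s)
    d1 = gap_from_round_trip(rt1_s)
    displacement_m = d0 - d1
    rate_m_s = displacement_m / (t1_s - t0_s)
    return displacement_m, rate_m_s

# Example: two samples taken 0.1 s apart
print(displacement_and_rate(0.0, 2.33e-4, 0.1, 2.04e-4))   # ~5 mm advance at ~50 mm/s
```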


Information about the displacement and/or the rate of displacement of the stem 210 can be valuable. For example, the information can be used to inform the medical personnel and/or the patient that the injection speed is too high or too low. In addition, as an inner cross sectional area of the syringe shaft 224 can be known, the displacement of the stem 210 inside the syringe shaft 224 can be used to calculate a volume of the medication being injected into the patient. In some embodiments, the information about the volume of injected medication can be available to the caregiver, patient, and/or manufacturers of the medication in real time. In some embodiments, the information can be automatically sent to the patient, manufacturer, medical professional, or a repository. This information can provide assurance to the patient that an appropriate amount of medication has been provided. The information can incentivize the caregiver to comply with the injection procedure because detailed information about how the injection procedure is performed can be recorded and be part of the patient's medical record. The information also provides the medication manufacturers with a tool for inventory keeping and for predicting future sales. Current technology allows the manufacturers to only keep track of the amount of medication sold, but not of the amount of medication actually used.
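The volume calculation mentioned above follows directly from the barrel geometry; the sketch below, with an assumed inner diameter, converts a stem displacement and displacement rate into injected volume and flow rate.

```python
import math

def injected_volume_ml(displacement_mm: float, barrel_inner_diameter_mm: float) -> float:
    """Volume pushed out of the barrel for a given stem displacement (1000 mm^3 = 1 mL)."""
    area_mm2 = math.pi * (barrel_inner_diameter_mm / 2.0) ** 2
    return area_mm2 * displacement_mm / 1000.0

def flow_rate_ml_per_s(rate_mm_per_s: float, barrel_inner_diameter_mm: float) -> float:
    """Flow rate implied by the stem's rate of displacement."""
    return injected_volume_ml(rate_mm_per_s, barrel_inner_diameter_mm)

# Example: an assumed 4.7 mm inner diameter barrel, stem advanced 5 mm at 1 mm/s
print(round(injected_volume_ml(5.0, 4.7), 3), "mL")      # ~0.087 mL
print(round(flow_rate_ml_per_s(1.0, 4.7), 3), "mL/s")    # ~0.017 mL/s
```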


Alternative embodiments of the injection system capable of measuring displacement and/or displacement rate of the stem 210 relative to the syringe 220 will now be described. FIG. 2D illustrates an optional friction slide collar 270, which can have a surface with reflective properties facing the proximal cap 212 of the stem. This surface can function as a reflecting surface instead of the syringe flange 222. The collar can move axially along the stem shaft 214 and can advantageously provide more fine-tuned resolution for axial displacement or rate of displacement of the stem 210 relative to the syringe 220 because the collar does not move or wobble when the stem 210 is moved axially into the syringe 220.



FIG. 2E illustrates another method of measuring displacement and/or displacement rate of the stem 210 relative to the syringe 220. In this embodiment, the syringe 220 has a spring wiper 226 at or near a proximal end of the syringe 220. For example, the wiper 226 can be located right above the syringe flange 222. The stem shaft 214 has a variable resistor strip 216 along a longitudinal axis of the stem shaft 214. The resistor 216 can be electrically wired to the wireless communication components on the stem 210 and can provide measured resistance values to the wireless transceiver 208. The measured resistance values can indicate the axial displacement or displacement rate of the stem 210 relative to the syringe 220. Further, any combination of the above-described displacement sensors can be used in a single device.
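For the wiper embodiment, the resistance seen at the wiper varies with how much of the resistive strip lies between the wiper and the strip's end, so position follows from a linear calibration. The sketch below uses placeholder calibration constants and sample values.

```python
def position_from_resistance(measured_ohms: float,
                             ohms_per_mm: float = 100.0,
                             zero_offset_ohms: float = 0.0) -> float:
    """Map the resistance between the spring wiper 226 and one end of the strip 216
    to an axial stem position; slope and offset are placeholder calibration values."""
    return (measured_ohms - zero_offset_ohms) / ohms_per_mm

samples = [(0.0, 500.0), (0.1, 900.0)]                # (time s, measured ohms)
positions = [position_from_resistance(r) for _, r in samples]
rate_mm_s = (positions[1] - positions[0]) / (samples[1][0] - samples[0][0])
print(positions, rate_mm_s)                           # [5.0, 9.0] mm, 40.0 mm/s
```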


Data Cap with Force/Pressure and Displacement Sensors


Turning to FIG. 3, a method of an embodiment of a smart injection system having both a force/pressure sensor and a displacement sensor disclosed herein will now be described. One risk during an injection procedure is that the needle hits an artery, because the medication injected can lead to artery occlusion, and can potentially cause blindness in the patient. One or more processors, such as one on a remote server, in communication with the smart injection system or included as part of the smart injection system, can process measured data from included sensors. For example, the one or more processors can analyze pressure applied to the stem in relation to a medication flow rate. The flow rate can be calculated from the displacement rate data as described above. The relationships of the pressure applied on the stem and the medication flow rate when the needle is placed in the tissue and inside an artery are illustrated in FIG. 3. The processor can analyze the medication flow rate when the pressure measured by the force/pressure sensor is within a nominal range. As shown in FIG. 3, the flow rate is significantly higher in the artery than in the tissue for the same range of nominal pressure because blood has less resistance than tissues.


Using this information, the processor can output a warning in any manner as described herein when the flow rate during the nominal range of measured pressure indicates that the current injection site is an artery instead of tissue. This will warn a physician to stop the injection immediately and can provide immediate instructions for applying a dissolving agent. The immediate alert can also allow the physician to leave the needle in place and swap the currently injected substance for a dissolving agent without changing the injection site. In some embodiments, when the needle is still inside the artery, the syringe containing the medication can be replaced with a syringe containing a clot-dissolving agent. A non-limiting example of a clot-dissolving agent is hyaluronidase.
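A minimal sketch of the decision described with respect to FIG. 3: within a nominal pressure window, a flow rate well above what tissue resistance would allow is treated as a sign of intravascular placement. The thresholds below are placeholders; the disclosure does not specify numeric limits.

```python
def classify_injection_site(pressure_kpa: float,
                            flow_rate_ml_per_s: float,
                            nominal_pressure_kpa=(20.0, 60.0),
                            max_tissue_flow_ml_per_s: float = 0.05) -> str:
    """Crude illustration of FIG. 3: within a nominal pressure window, a flow rate
    well above what tissue allows suggests an intravascular (artery) placement.
    All numeric thresholds are placeholders."""
    low, high = nominal_pressure_kpa
    if not (low <= pressure_kpa <= high):
        return "pressure_out_of_range"        # no decision outside the nominal window
    if flow_rate_ml_per_s > max_tissue_flow_ml_per_s:
        return "warn_possible_artery"         # stop injection; consider a dissolving agent
    return "tissue"

print(classify_injection_site(35.0, 0.20))    # -> "warn_possible_artery"
```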


Data Cap with Angular & Relative Positioning


Returning to FIGS. 2A-B, the data cap 240 can include a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer (together, a 9-axis inertial motion sensor (IMS)) as described herein. The 9-axis IMS can measure angular position and rotation of the data cap 240, and therefore of the proximal cap 212 of the stem 210. As shown in FIG. 2B, the 9-axis IMS can be located on the PCB 250, although a skilled artisan will recognize that the 9-axis IMS can be located in any appropriate part of the data cap 240.


Methods of using a smart injection system 400 having a 9-axis IMS for locating approved and/or prohibited injection zone(s) will be described with reference to FIGS. 4A-C. As shown in FIG. 4A, two landmarks 401, 402 on a patient's face can be provided on a pair of glasses, and a third landmark 403 can be provided by a bite block in the patient's mouth; together the landmarks define the patient's face. A skilled artisan will appreciate that any suitable landmarks in any combination can be used to define the patient's face. In some embodiments, additional or secondary landmarks can be provided on the patient's face, such as by a marker pen, stickers (shown in FIG. 4B) and/or face masks. These additional landmarks can be used instead of or in combination with the three landmarks 401, 402, 403 or other suitable landmarks. Information about the landmarks 401, 402, 403 and other landmarks can be preloaded on a processor in communication with the smart injection system 400. The 9-axis IMS can communicate with the wireless transceiver 408 in any manner described above to provide angular and rotational position of a stem 410 coupled to a syringe 420 and a needle 430.


As shown in FIG. 4C, the processor can receive 490 data measured by the 9-axis IMS from the wireless transceiver 408 and compare 492 the measured data with the preloaded landmarks information in order to calculate the location and position of the injection system 400 relative to the patient's face. The processor can check 494 if the location is in an approved injection zone. If the injection system is in an approved zone, the processor can output an indication 495 that the caregiver can proceed with the injection at the current location and position of the injection system. If the injection system is not in an approved zone, the processor can check 496 if the injection system is in a no-injection zone. If the injection system is in a no-injection zone, the processor can output a warning 497 in any manner described herein. If the injection system is in neither an approved zone nor a no-injection zone, the processor can output an instruction 498 to the caregiver to move the injection system.
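The zone check of steps 494-498 can be illustrated under the assumption that approved and no-injection zones are expressed as simple regions in a landmark-based face coordinate frame; the geometry and coordinates below are hypothetical.

```python
from typing import Iterable, Tuple

Point = Tuple[float, float, float]

def point_in_box(p: Point, lo: Point, hi: Point) -> bool:
    return all(l <= v <= h for v, l, h in zip(p, lo, hi))

def evaluate_target(tip: Point,
                    approved_zones: Iterable[Tuple[Point, Point]],
                    no_injection_zones: Iterable[Tuple[Point, Point]]) -> str:
    """Steps 494-498 of the flowchart: approved zone, then no-injection zone,
    otherwise instruct the caregiver to reposition. Zones are modelled as
    axis-aligned boxes in a landmark-based face frame (placeholder geometry)."""
    if any(point_in_box(tip, lo, hi) for lo, hi in approved_zones):
        return "proceed"            # step 495
    if any(point_in_box(tip, lo, hi) for lo, hi in no_injection_zones):
        return "warning"            # step 497
    return "move_injection_system"  # step 498

approved = [((10, 0, 0), (20, 10, 5))]
forbidden = [((0, 0, 0), (5, 5, 5))]
print(evaluate_target((12, 4, 2), approved, forbidden))   # -> "proceed"
```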


In some embodiments, the processor can output an indication of the relative position of the injection system 400 to the patient's face by green and red LED lights. For example, a green light indicates that the injection system 400 is targeting an approved injection zone 405 and a red light indicates that the injection system 400 is targeting a no-injection zone 407. In another example, the processor can display the position of the injection system 400 relative to the patient's face on a display screen. Details of the display systems and methods are described in U.S. Provisional Application No. 62/303,251, filed Mar. 3, 2016 and entitled “GUIDED NEEDLE,” the entirety of which is incorporated by reference herein and should be considered a part of this disclosure.


Integrated Smart Stem


Turning to FIG. 5, another embodiment of a smart injection system 500 will be described. The injection system 500 has the same features as the injection systems 200, 400 except as described below. Accordingly, features of the injection systems 200, 400 can be incorporated into features of the injection system 500 and features of the injection system 500 can be incorporated into features of the injection systems 200, 400. The smart injection system 500 also has a stem 510, a syringe 520, and a needle 530. The syringe 520 and needle 530 may be a standard off-the-shelf syringe and needle and may be disposable. However, the stem 510 includes one or more electronic components as described above that are integrated into a proximal cap 512 of the stem 510. The stem 510 can be battery operated or rechargeable. Examples of the electronic components include but are not limited to a force/pressure sensor, a displacement sensor, a 9-axis IMS, a biometric sensor, a power source, and wireless transmitters. In this embodiment, the stem 510 is designed to be reusable. For example, some electronic components can be sealed within the proximal cap 512 so that the stem 510 can be sterilized without affecting those electronic components, or any exposed sensors/readers can be directly sterilized. A skilled artisan will recognize that any method of using the injection system having the data cap 240 can be performed by the injection system 500 with the integrated proximal cap 512.


Smart Injection System with Medication Verification


Turning to FIG. 6A, a smart injection system 600 capable of verifying authenticity of a syringe containing certain medications will be described. The injection system 600 has the same features as the injection system 200, 400, 500 except as described below. Accordingly, features of the injection system 200, 400, 500 can be incorporated into features of the injection system 600 and features of the injection system 600 can be incorporated into features of the injection system 200, 400, 500.


In this embodiment, the syringe 620 can have a manufacturer's identification device 625 on the syringe shaft 624. The manufacturer's identification device 625 can be one or more of an EEPROM, a barcode, an RFID tag, or a resistor. The identification device 625 can be embedded in a wall of the syringe shaft 624, located on an inner or outer surface of the syringe shaft 624, or located at a top lip of the shaft, for example, as shown in FIG. 6C. The exact location of the identification device 625 on the syringe 620 is not limiting. The stem 610 has a corresponding identification device reader 615 on the stem shaft 614. For example, the corresponding identification device reader 615 can be an EEPROM reader, a bar code reader, an RFID tag reader or a resistance reader.


When an authentic syringe 620 prefilled with medication by the manufacturer is used with the stem 610, the identification device reader 615 on the stem 610 will interact with or read the identification device 625 on the syringe 620. For example, an RFID tag as the identification device 625 can be encoded by the manufacturer so that only an authentic prefilled syringe 620 will result in a positive reading. The identification device reader 615 on the stem 610 can be electrically wired to the wireless transmitter on the stem 610 and can provide a reading result to a wireless transceiver, which can forward the data to one or more remote servers. The injection system 600 or the wireless transceiver can provide a warning or indication of authenticity in any manner described above. The manufacturer can access information sent to the remote server to be informed when its medication is used and be alerted when a counterfeit prefilled syringe is detected by the stem 610.


In some embodiments, the identification device 625 can store information related to authenticity or product type, as well as optionally store information specific to the particular batch of medication or the syringe. Non-limiting examples of such information include serial and/or lot number, expiration date of the medication, and prior use of the syringe 620. Serial/lot numbers can provide easy identification of the manufacturer and aid the manufacturer in keeping track of medication that has been injected into patients. The expiration date of the medication can ensure that patients are not injected with expired or unsafe products. Information about prior use of the syringe 620 can inform the caregiver and/or the patient that the medication in the syringe may have been tampered with and prevent cross-contamination caused by multiple uses of the syringe 620. The information can also be used for tracking inventory and aiding in product recalls.



FIG. 6B illustrates a flowchart of processing activities of one or more processors. The processor first determines from the data sent by the identification device reader 615 if an identification device is present on the syringe 620 in step 690. In some embodiments, the identification device reader is powered by the electronic assembly of the injection system. In some embodiments, if no identification device is present, the processor will deactivate any sensors or readers on the electronic assembly so that no data about the counterfeit product is collected in step 691. The processor can also alert the patient, caregiver, and/or the manufacturer about counterfeit products in the step 691. In an embodiment, sensors can continue to operate, but the data is not provided to the physician. It can be kept for later analysis by the manufacturer. In other embodiments, only an alert is provided, but the processor and sensors continue to operate as normal. If an identification device is present and can be read, the processor can optionally decode 692 any encoded information on the identification device. Once the information has been decoded, the processor can access a manufacturer's database 693 containing information about its prefilled syringes to compare the decoded information with information on the database. If there is a match between the data on the syringe and information on the database, the processor can output for display 694 the information about the prefilled syringe, such as the manufacturer information, serial and lot numbers, product types, instructions for use, indications, etc., and output an indication that the prefilled syringe is an authentic product. If there is no match, the processor can repeat the warning step 691. The processor can optionally further determine if the medication has expired based on its date of production 695, or if the syringe has been previously used 696, which suggests that the syringe and the medication in the syringe have been tampered with. The processor can then output either an indication that the prefilled syringe is safe for use 694 or a warning 691 accordingly.
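The verification flow of steps 690-696 can be sketched as follows; the tag fields, database contents, and return codes are illustrative stand-ins for whatever encoding and manufacturer database are actually used.

```python
from datetime import date
from typing import Optional

# Placeholder stand-in for the manufacturer's database of prefilled syringes.
FAKE_DATABASE = {
    "SN-001": {"manufacturer": "ExampleCo", "lot": "L42",
               "expires": date(2026, 1, 1), "previously_used": False},
}

def verify_prefilled_syringe(tag: Optional[dict], today: date = date.today()) -> str:
    """Follows steps 690-696: no tag -> warn (691); decoded data is checked
    against the database, expiration, and prior use before the syringe is
    declared safe for use (694)."""
    if tag is None:
        return "warn_counterfeit"                     # step 691
    record = FAKE_DATABASE.get(tag.get("serial"))     # steps 692-693
    if record is None:
        return "warn_counterfeit"
    if record["expires"] < today:                     # step 695
        return "warn_expired"
    if record["previously_used"]:                     # step 696
        return "warn_previously_used"
    return "authentic_safe_for_use"                   # step 694

print(verify_prefilled_syringe({"serial": "SN-001"}))
```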


Smart Stem with Injector Verification


In some embodiments, the smart injection system described herein can optionally include a biometric sensor. The biometric sensor can be included on the data cap 240, for example, on the PCB 250, or on the integrated proximal cap 512 of the stem 510. A non-limiting example of a biometric sensor is a fingerprint reader, although a skilled artisan will recognize that other biometric sensors known in the art can be used. The biometric sensor can detect information about the person performing the injection procedure. The biometric sensor can transmit measured data about the person performing the injection procedure to the wireless transceiver 208, 508, which can in turn send the measured data to a processor in communication with the wireless transceiver. In some embodiments, the processor can be on a remote server, which can also store or have access to a database of qualified personnel.


As shown in FIG. 7, the processor can first determine if any data is read 790 by the biometric sensor after the smart injection system has been activated. The processor can output a warning 792 if the biometric sensor cannot read any data and/or stop all measuring activities as described herein. In some embodiments, the processor can also deactivate all sensors of the electronic assembly so that no data about the injection procedure is measured in the warning step 792. In some embodiments, the warning step 792 can further include a warning to the medical certification board that the clinic or hospital where the injection took place is likely allowing unqualified personnel to perform the injection. If biometric data about the injector is received, the processor compares it with the database of qualified personnel 794 to look for a match. If the data collected about the person performing the injection does not match one in the database, the processor can repeat the alert step 792 described herein. If there is a match, the processor can output an indication that the injector is qualified to perform the injection 796. Collecting data about the caregiver can deter clinics/hospitals from hiring untrained staff to perform the injections, thereby improving safety of the patients and accountability of the medical professionals.
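Similarly, the injector check of steps 790-796 reduces to a lookup against the qualified-personnel database; the sketch below uses a placeholder set of fingerprint identifiers.

```python
from typing import Optional, Set

QUALIFIED_FINGERPRINT_IDS: Set[str] = {"fp-7c2a", "fp-91d0"}   # placeholder database

def verify_injector(fingerprint_id: Optional[str]) -> str:
    """Steps 790-796: no biometric reading or no match -> warning/deactivation
    (792); a match against the qualified-personnel database -> proceed (796)."""
    if fingerprint_id is None:
        return "warn_and_deactivate"        # step 792
    if fingerprint_id in QUALIFIED_FINGERPRINT_IDS:
        return "injector_qualified"         # step 796
    return "warn_and_deactivate"            # step 792 repeated

print(verify_injector("fp-7c2a"))           # -> "injector_qualified"
```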


Integrated Syringe Flange


Turning to FIGS. 8A-B, another embodiment of a smart injection system 800 will be described. The injection system 800 has the same features as the injection system 200, 400, 500, 600, except as described below. Accordingly, features of the injection system 200, 400, 500, 600, can be incorporated into features of the injection system 800 and features of the injection system 800 can be incorporated into features of the injection system 200, 400, 500, 600. The smart injection system 800 has a stem 810, a flange 822, and a syringe 820 with a syringe lip 823 and a syringe shaft 824. A needle can be coupled to the syringe shaft 824 at a distal end of the syringe shaft 824.



FIGS. 8A-8B illustrate an example mechanism for removably coupling the flange 822 to the syringe 820. As shown, the flange 822 has a slot 827 sized to fit the syringe lip 823. Sliding the syringe lip 823 into the slot 827 allows the flange 822 to be attached to the syringe 820. A skilled artisan will recognize that other means of coupling the flange 822 to the syringe 820 known in the art, for example, clips, adhesives, and the like, can be used without departing from the scope of the disclosure herein.


The stem 810 and the syringe 820 may be standard off-the-shelf parts, and be disposable. The flange 822 includes one or more electronic components as described above that are integrated into the flange 822 and is designed to be reusable. In this embodiment, for example, some electronic components can be sealed within the flange 822 so that the flange 822 can be sterilized without affecting those electronic components or any exposed sensors/readers can be directly sterilized.


Examples of the electronic components include but are not limited to a velocity sensor, a 9-axis IMS, a biometric sensor, an identification device reader, a power source (for example, disposable or rechargeable batteries), and wireless transmitters. For example, the velocity sensor can be any velocity sensor known in the art and can measure the displacement and/or rate of displacement of the stem 810 relative to the syringe 820. Information about the speed of injection and volume of medication injected into the patient can be determined with the data measured by the velocity sensor as described above. In addition, the injection pressure can be calculated based on the measured data and known constants, including but not limited to viscosity and drag. A skilled artisan will recognize that the velocity sensor can be implemented in the data cap 240 and/or the integrated stem 510. A skilled artisan will also recognize that any method of using the injection system having the data cap 240 or the integrated stem proximal cap 512 can be performed by the injection system 800 with the integrated flange 822. It is to be understood that only one, more than one, or all of the above listed sensors and other sensors known in the art can be integrated into and used with the flange 822.
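The disclosure does not give a specific pressure formula; as one first-order illustration of how viscosity and needle geometry could enter the calculation, the sketch below applies the Hagen-Poiseuille relation for laminar flow through the needle bore. All numeric inputs are assumed example values.

```python
import math

def needle_pressure_drop_pa(flow_m3_s: float,
                            viscosity_pa_s: float,
                            needle_length_m: float,
                            needle_inner_radius_m: float) -> float:
    """Hagen-Poiseuille pressure drop for laminar flow through the needle bore;
    a first-order illustration only, ignoring entrance effects and other drag terms."""
    return (8.0 * viscosity_pa_s * needle_length_m * flow_m3_s) / (
        math.pi * needle_inner_radius_m ** 4)

# Example: ~0.05 mL/s of a 10 mPa·s liquid through a 13 mm long, 0.13 mm radius bore
print(round(needle_pressure_drop_pa(5e-8, 0.01, 0.013, 1.3e-4) / 1000.0, 1), "kPa")
```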


In some embodiments, an identification device 825 as described above can be placed on the syringe lip 823, as shown in FIG. 8A. When the flange 822 connects to the syringe lip 823, an identification device reader 826 on the flange 822 can read the information on the identification device 825 in any manner described herein and perform verification of the medication prefilled in the syringe 820 and/or the syringe 820.
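A compact sketch of what the verification performed by the identification device reader 826 might look like in software is given below; the record fields (product code, lot, expiry) and the check against an approved-product list are illustrative assumptions only.

```python
# Sketch of checking a record read from the identification device 825 when the
# flange connects to the syringe lip. The record fields and the expiry/approved-
# product checks are illustrative assumptions.
from dataclasses import dataclass
from datetime import date


@dataclass
class MedicationRecord:
    product_code: str
    lot_number: str
    expiry: date


def verify_medication(record: MedicationRecord, approved_products: set[str]) -> bool:
    if record.product_code not in approved_products:
        print("WARNING: unrecognized or counterfeit medication cartridge.")
        return False
    if record.expiry < date.today():
        print("WARNING: medication has expired.")
        return False
    print(f"Verified {record.product_code}, lot {record.lot_number}.")
    return True


verify_medication(MedicationRecord("FILLER-20MG", "LOT-0042", date(2030, 1, 1)),
                  approved_products={"FILLER-20MG"})
```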


Combination and/or Subcombination Embodiments


Although features of the smart injection system are described individually in various embodiments herein, a skilled artisan will appreciate that any one or more of those features described herein can be implemented on a smart injection system.


An example combination of features and the advantages thereof are illustrated in FIG. 9. The injection system can have an electronic assembly configured to measure a variety of information, including but not limited to force of injection and/or travel of the stem and the angular and relative position of the stem relative to a patient's face, and configured to verify the authenticity and other product information of a prefilled syringe and the identity of an injector, in the manners described above. The information measured by the electronic assembly can be transmitted to one or more processors located locally or remotely. In some embodiments, the measured data can cause the one or more processors to generate local alerts/alarms on the injection system or in the room where the patient receives the injection. In some embodiments, data transmission can be performed wirelessly and the one or more processors can be located on one or more remote servers (the “Cloud”). In response to the measured data, the processor can output instructions to the injection system. Examples of the instructions include information about whether the injector is an authorized/licensed medical professional, whether the procedure is in compliance with protocols, whether the medication being injected is safe and/or authentic, and the like. In response to the measured data, the processor can also output alerts and/or data to the manufacturer, including but not limited to information about medication usage for inventory control, monitoring of injectors' qualification, and injection quality data. In some embodiments, the processor can make the alerts and/or data available for retrieval by the manufacturer. In other embodiments, the processor can automatically send the alerts and/or data to the manufacturer.
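To make the data flow of FIG. 9 concrete, the sketch below shows one possible shape for a wirelessly transmitted measurement payload and for the compliance response a local or cloud processor might return; the field names, the JSON encoding, and the 10 N force limit are assumptions for illustration, not a format fixed by the disclosure.

```python
# Illustrative measurement payload and processor-side compliance check for the
# FIG. 9 data flow. Field names, encoding, and thresholds are assumptions.
import json

measurement = {
    "device_id": "smart-syringe-001",
    "injector_verified": True,
    "medication_verified": True,
    "injection_force_n": 4.2,
    "stem_travel_mm": 3.1,
    "orientation_deg": {"pitch": 12.0, "yaw": -4.5},
}
payload = json.dumps(measurement)          # serialized for wireless transmission


def process(payload: str) -> dict:
    data = json.loads(payload)
    alerts = []
    if not data["injector_verified"]:
        alerts.append("Injector is not an authorized/licensed medical professional.")
    if not data["medication_verified"]:
        alerts.append("Medication could not be verified as safe/authentic.")
    if data["injection_force_n"] > 10.0:   # example protocol limit, assumed
        alerts.append("Injection force exceeds protocol limit.")
    return {"compliant": not alerts, "alerts": alerts}


print(process(payload))
```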


In addition, as an augment to the alarms/alerts on the injection system, continuous audio, visual, or tactile feedback can be provided to confirm correct injection technique. This continuous feedback contributes to a more accurate injection than threshold alarms that are triggered only at the limits of acceptable operation.
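The contrast between a threshold alarm and the continuous feedback described above can be sketched as follows; the target speed, tolerance, and mapping to a graded feedback intensity are illustrative assumptions.

```python
# Sketch contrasting a threshold alarm with continuous feedback on injection
# speed. Target, tolerance, and the 0..1 intensity mapping are assumptions.
def threshold_alarm(speed: float, limit: float) -> bool:
    return speed > limit                       # fires only at the limit


def continuous_feedback(speed: float, target: float, tolerance: float) -> float:
    # 0.0 = on target, 1.0 = at (or beyond) the edge of acceptable technique.
    return min(abs(speed - target) / tolerance, 1.0)


print(threshold_alarm(0.8, limit=1.0))                        # False: no warning yet
print(continuous_feedback(0.8, target=0.5, tolerance=0.5))    # 0.6: graded cue
```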


It is to be understood that the various sensors and electronics, as well as the techniques and processes described with respect to each embodiment disclosed herein, can be used with and integrated into other embodiments disclosed herein, as would be readily understood by a person of skill in the art reading the present disclosure.


Terminology


Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without other input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.


Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require the presence of at least one of X, at least one of Y, and at least one of Z.


Language of degree used herein, such as the terms “approximately,” “about,” “generally,” and “substantially,” represents a value, amount, or characteristic close to the stated value, amount, or characteristic that still performs a desired function or achieves a desired result. For example, the terms “approximately,” “about,” “generally,” and “substantially” may refer to an amount that is within less than 10% of, within less than 5% of, within less than 1% of, within less than 0.1% of, and within less than 0.01% of the stated amount. As another example, in certain embodiments, the terms “generally parallel” and “substantially parallel” refer to a value, amount, or characteristic that departs from exactly parallel by less than or equal to 15 degrees, 10 degrees, 5 degrees, 3 degrees, 1 degree, 0.1 degree, or otherwise.


Any methods disclosed herein need not be performed in the order recited. The methods disclosed herein include certain actions taken by a practitioner; however, they can also include any third-party instruction of those actions, either expressly or by implication. For example, actions such as “inserting the testing tool” include “instructing insertion of a testing tool.”


All of the methods and tasks described herein may be performed and fully automated by a computer system. The computer system may, in some cases, include multiple distinct computers or computing devices (e.g., physical servers, workstations, storage arrays, cloud computing resources, etc.) that communicate and interoperate over a network to perform the described functions. Each such computing device typically includes a processor (or multiple processors) that executes program instructions or modules stored in a memory or other non-transitory computer-readable storage medium or device (e.g., solid state storage devices, disk drives, etc.). The various functions disclosed herein may be embodied in such program instructions, and/or may be implemented in application-specific circuitry (e.g., ASICs or FPGAs) of the computer system. Where the computer system includes multiple computing devices, these devices may, but need not, be co-located. The results of the disclosed methods and tasks may be persistently stored by transforming physical storage devices, such as solid state memory chips and/or magnetic disks, into a different state. In some embodiments, the computer system may be a cloud-based computing system whose processing resources are shared by multiple distinct business entities or other users.


Depending on the embodiment, certain acts, events, or functions of any of the processes or algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described operations or events are necessary for the practice of the algorithm). Moreover, in certain embodiments, operations or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially.


The various illustrative logical blocks, modules, routines, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware (e.g., ASICs or FPGA devices), computer software that runs on general purpose computer hardware, or combinations of both. Various illustrative components, blocks, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as specialized hardware versus software running on general-purpose hardware depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.


Moreover, the various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor device, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor device can be a microprocessor, but in the alternative, the processor device can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor device can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor device includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor device can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor device may also include primarily analog components. For example, some or all of the rendering techniques described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.


The elements of a method, process, routine, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor device, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of a non-transitory computer-readable storage medium. An exemplary storage medium can be coupled to the processor device such that the processor device can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor device. The processor device and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor device and the storage medium can reside as discrete components in a user terminal.


While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it can be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As can be recognized, certain embodiments described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain embodiments disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. A method of automatically assessing accuracy in an injection procedure performed on a live patient, the method comprising: providing an injection system comprising: a syringe, wherein the syringe comprises a plunger, a syringe body, and a needle, the syringe configured to inject an injectable product directly into an anatomical location of the live patient at a predetermined target location, an efficacy or safety of the injectable product depending at least in part on an accuracy of an injection relative to the predetermined target location; and one or more sensors coupled to the syringe, the one or more sensors configured to measure data indicative of a position and an orientation of the syringe relative to anatomy of the live patient; receiving and analyzing, by a processor in communication with the one or more sensors, data of the injection from the one or more sensors; and outputting a warning in response to detecting an error in the accuracy of the injection based at least in part on the data of the injection.
  • 2. The method of claim 1, further comprising determining the syringe is in an approved injection zone in the live patient.
  • 3. The method of claim 2, wherein the error comprises the syringe not being in the approved injection zone.
  • 4. The method of claim 1, wherein the error comprises the syringe being in a no-injection zone in the live patient.
  • 5. The method of claim 1, wherein the injection system further comprises: a plunger displacement sensor; a force sensor; or a pressure sensor.
  • 6. The method of claim 5, further comprising detecting that a volume of the injectable product injected into the live patient is inaccurate based at least in part on data from the plunger displacement sensor.
  • 7. The method of claim 5, further comprising detecting that a rate of injection of the injectable product or an injection force exceeds a threshold based at least in part on data from the plunger displacement sensor, the force sensor, or the pressure sensor.
  • 8. The method of claim 5, further comprising detecting that the needle is in an artery of the live patient based at least in part on data from: the plunger displacement sensor; and the force sensor or the pressure sensor.
  • 9. The method of claim 8, wherein the needle being in the artery is detected by the processor determining a relationship of a measured injection force or pressure and a measured rate of plunger displacement, wherein the processor is configured to analyze an injectable product flow rate calculated from the measured rate of plunger displacement when the measured injection force or pressure is within a range.
  • 10. The method of claim 9, further comprising outputting a second warning instructing an injector to stop the injection immediately and apply an occlusion dissolving agent.
  • 11. The method of claim 10, wherein the dissolving agent comprises hyaluronidase.
  • 12. The method of claim 10, wherein the second warning allows the injector to replace the syringe body containing the injectable product with another syringe body containing the occlusion dissolving agent when the needle is inside the artery.
  • 13. The method of claim 1, wherein the warning comprises audio, visual, or tactile feedback, or any combinations thereof.
  • 14. The method of claim 1, wherein the one or more sensors comprise an inertia motion sensor.
  • 15. The method of claim 14, further comprising comparing data from the inertia motion sensor with anatomical landmarks information preloaded on the processor to calculate the position and the orientation of the syringe relative to the anatomy of the live patient.
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57. Specifically, this application is a continuation of U.S. patent application Ser. No. 15/299,209, filed Oct. 20, 2016, entitled “INJECTION SYSTEM,” which claims benefit of U.S. Provisional Application No. 62/243,801, filed Oct. 20, 2015 and entitled “INJECTION SYSTEM,” the entirety of which is hereby incorporated by reference and should be considered a part of this specification.

US Referenced Citations (363)
Number Name Date Kind
3237340 Knott Mar 1966 A
3722108 Chase Mar 1973 A
3941121 Olinger et al. Mar 1976 A
4142517 Contreras Guerrero de Stavropoulos et al. Mar 1979 A
4311138 Sugarman Jan 1982 A
4356828 Jamshidi Nov 1982 A
4410020 Lorenz Oct 1983 A
4439162 Blaine Mar 1984 A
4515168 Chester et al. May 1985 A
4566438 Liese et al. Jan 1986 A
4836632 Bardoorian Jun 1989 A
4867686 Goldstein Sep 1989 A
4880971 Danisch Nov 1989 A
4945478 Merickel et al. Jul 1990 A
5065236 Diner Nov 1991 A
5197476 Nowacki et al. Mar 1993 A
5198877 Schulz Mar 1993 A
5241184 Menzel Aug 1993 A
5249581 Horbal et al. Oct 1993 A
5295483 Nowacki et al. Mar 1994 A
5321257 Danisch Jun 1994 A
5391081 Lampotang et al. Feb 1995 A
5517997 Fontenot May 1996 A
5518407 Greenfield et al. May 1996 A
5534704 Robinson et al. Jul 1996 A
5622170 Shulz Apr 1997 A
5651783 Reynard Jul 1997 A
5690618 Smith et al. Nov 1997 A
5704791 Gillio Jan 1998 A
5727948 Jordan Mar 1998 A
5766016 Sinclair et al. Aug 1998 A
5817105 Van Der Brug Oct 1998 A
5828770 Leis et al. Oct 1998 A
5890908 Lampotang et al. Apr 1999 A
5899692 Davis et al. May 1999 A
5923417 Leis Jul 1999 A
5954648 Van Der Brug Sep 1999 A
5954701 Matalon Sep 1999 A
6010531 Donlon et al. Jan 2000 A
6024576 Bevirt et al. Feb 2000 A
6061644 Leis May 2000 A
6064749 Hirota et al. May 2000 A
6127672 Danisch Oct 2000 A
6172499 Ashe Jan 2001 B1
6217558 Zadini et al. Apr 2001 B1
6288785 Frantz et al. Sep 2001 B1
6353226 Khalil et al. Mar 2002 B1
6385482 Boksberger et al. May 2002 B1
6428323 Pugh Aug 2002 B1
6470302 Cunningham et al. Oct 2002 B1
6485308 Goldstein Nov 2002 B1
6538634 Chui et al. Mar 2003 B1
6553326 Kirsch et al. Apr 2003 B1
6564087 Pitris et al. May 2003 B1
6568941 Goldstein May 2003 B1
6575757 Leight et al. Jun 2003 B1
6625563 Kirsch et al. Sep 2003 B2
6687529 Van Vaals Feb 2004 B2
6702790 Ross et al. Mar 2004 B1
6769286 Biermann et al. Aug 2004 B2
6774624 Anderson et al. Aug 2004 B2
6836745 Seiler et al. Dec 2004 B2
6857878 Chosack et al. Feb 2005 B1
6863536 Fisher et al. Mar 2005 B1
7015859 Anderson Mar 2006 B2
7115113 Evans et al. Oct 2006 B2
7137712 Brunner et al. Nov 2006 B2
7158754 Anderson Jan 2007 B2
7194296 Frantz et al. Mar 2007 B2
7204796 Seiler Apr 2007 B1
7247149 Beyerlein Jul 2007 B2
7383728 Noble et al. Jun 2008 B2
7474776 Kaufman et al. Jan 2009 B2
7500853 Bevirt et al. Mar 2009 B2
7544062 Hauschild et al. Jun 2009 B1
7553159 Arnal et al. Jun 2009 B1
7594815 Toly Sep 2009 B2
7665995 Toly Feb 2010 B2
7725279 Luinge et al. May 2010 B2
7761139 Tearney et al. Jul 2010 B2
7783441 Nieminen et al. Aug 2010 B2
7857626 Toly Dec 2010 B2
7912662 Zuhars et al. Mar 2011 B2
7945311 McCloy et al. May 2011 B2
8007281 Toly Aug 2011 B2
8040127 Jensen Oct 2011 B2
8072606 Chau et al. Dec 2011 B2
8103883 Smith Jan 2012 B2
8131342 Anderson Mar 2012 B2
8165844 Luinge et al. Apr 2012 B2
8203487 Hol et al. Jun 2012 B2
8208716 Choi et al. Jun 2012 B2
8226610 Edwards et al. Jul 2012 B2
8250921 Nasiri et al. Aug 2012 B2
8257250 Tenger et al. Sep 2012 B2
8277411 Gellman Oct 2012 B2
8319182 Brady et al. Nov 2012 B1
8342853 Cohen Jan 2013 B2
8351773 Nasiri et al. Jan 2013 B2
8382485 Bardsley Feb 2013 B2
8403888 Gaudet Mar 2013 B2
8408918 Hu et al. Apr 2013 B2
8409140 Ejlersen et al. Apr 2013 B2
8437833 Silverstein May 2013 B2
8442619 Li et al. May 2013 B2
8450997 Silverman May 2013 B2
8467855 Yasui Jun 2013 B2
8469716 Fedotov et al. Jun 2013 B2
8525990 Wilcken Sep 2013 B2
8535062 Nguyen Sep 2013 B2
8556635 Toly Oct 2013 B2
8632498 Rimsa et al. Jan 2014 B2
8647124 Bardsley et al. Feb 2014 B2
8655622 Yen et al. Feb 2014 B2
8684744 Selz et al. Apr 2014 B2
8689801 Ritchey et al. Apr 2014 B2
8715233 Brewer et al. May 2014 B2
8764449 Rios et al. Jul 2014 B2
8818751 Van Acht et al. Aug 2014 B2
8917916 Martin et al. Dec 2014 B2
8924334 Lacey et al. Dec 2014 B2
8945147 Ritchey et al. Feb 2015 B2
8961189 Rios et al. Feb 2015 B2
8994366 Ashe Mar 2015 B2
9017080 Placik Apr 2015 B1
9024624 Brunner May 2015 B2
9031314 Clausen et al. May 2015 B2
9053641 Samosky Jun 2015 B2
9123261 Lowe Sep 2015 B2
9251721 Lampotang et al. Feb 2016 B2
9275557 Trotta Mar 2016 B2
9318032 Samosky et al. Apr 2016 B2
9361809 Caron Jun 2016 B1
9439653 Avneri et al. Sep 2016 B2
9443446 Rios et al. Sep 2016 B2
9456766 Cox et al. Oct 2016 B2
9460638 Baker et al. Oct 2016 B2
9486162 Zhuang et al. Nov 2016 B2
9554716 Burnside et al. Jan 2017 B2
9595208 Ottensmeyer et al. Mar 2017 B2
9626805 Lampotang et al. Apr 2017 B2
9666102 East et al. May 2017 B2
9792836 Rios et al. Oct 2017 B2
9922578 Foster et al. Mar 2018 B2
10083630 Samosky et al. Sep 2018 B2
10173015 Fiedler et al. Jan 2019 B2
10269266 Rios et al. Apr 2019 B2
10290231 Rios et al. May 2019 B2
10290232 Rios et al. May 2019 B2
10325522 Samosky et al. Jun 2019 B2
10398855 McClellan Sep 2019 B2
10500340 Rios et al. Dec 2019 B2
10643497 Rios et al. May 2020 B2
10743942 Foster et al. Aug 2020 B2
10849688 Rios et al. Dec 2020 B2
10857306 Holmqvist et al. Dec 2020 B2
10896627 Foster et al. Jan 2021 B2
10902746 Rios et al. Jan 2021 B2
11403964 Rios et al. Aug 2022 B2
11710424 Rios et al. Jul 2023 B2
11730543 Rios et al. Aug 2023 B2
11854426 Rios et al. Dec 2023 B2
20010037191 Furuta et al. Nov 2001 A1
20020076681 Leight et al. Jun 2002 A1
20020168618 Anderson et al. Nov 2002 A1
20020191000 Henn Dec 2002 A1
20030031993 Pugh Feb 2003 A1
20030055380 Flaherty Mar 2003 A1
20030065278 Rubinstenn et al. Apr 2003 A1
20030108853 Chosack et al. Jun 2003 A1
20030114842 DiStefano Jun 2003 A1
20030164401 Andreasson et al. Sep 2003 A1
20030220557 Cleary et al. Nov 2003 A1
20040009459 Anderson et al. Jan 2004 A1
20040092878 Flaherty May 2004 A1
20040118225 Wright et al. Jun 2004 A1
20040126746 Toly Jul 2004 A1
20040161731 Arington et al. Aug 2004 A1
20040175684 Kaasa et al. Sep 2004 A1
20040234933 Dawson et al. Nov 2004 A1
20050055241 Horstmann Mar 2005 A1
20050057243 Johnson et al. Mar 2005 A1
20050070788 Wilson et al. Mar 2005 A1
20050084833 Lacey et al. Apr 2005 A1
20050181342 Toly Aug 2005 A1
20050203380 Sauer et al. Sep 2005 A1
20060084050 Haluck Apr 2006 A1
20060085068 Barry Apr 2006 A1
20060194180 Bevirt et al. Aug 2006 A1
20060264745 Da Silva Nov 2006 A1
20060264967 Ferreyro et al. Nov 2006 A1
20070003917 Kitching et al. Jan 2007 A1
20070150247 Bodduluri Jun 2007 A1
20070179448 Lim et al. Aug 2007 A1
20070197954 Keenan Aug 2007 A1
20070219503 Loop et al. Sep 2007 A1
20070238981 Zhu et al. Oct 2007 A1
20080038703 Segal et al. Feb 2008 A1
20080097378 Zuckerman Apr 2008 A1
20080107305 Vanderkooy et al. May 2008 A1
20080123910 Zhu May 2008 A1
20080138781 Pellegrin et al. Jun 2008 A1
20080176198 Ansari et al. Jul 2008 A1
20080177174 Crane Jul 2008 A1
20080194973 Imam Aug 2008 A1
20080270175 Rodriguez et al. Oct 2008 A1
20090036902 Dimaio et al. Feb 2009 A1
20090043253 Podaima Feb 2009 A1
20090046140 Lashmet et al. Feb 2009 A1
20090061404 Toly Mar 2009 A1
20090074262 Kudavelly Mar 2009 A1
20090081619 Miasnik Mar 2009 A1
20090081627 Ambrozio Mar 2009 A1
20090123896 Hu et al. May 2009 A1
20090142741 Ault et al. Jun 2009 A1
20090161827 Gertner et al. Jun 2009 A1
20090208915 Pugh Aug 2009 A1
20090221908 Glossop Sep 2009 A1
20090234302 Hoendervoogt Sep 2009 A1
20090262988 Karkanias et al. Oct 2009 A1
20090263775 Ullrich Oct 2009 A1
20090265671 Sachs et al. Oct 2009 A1
20090275810 Ayers et al. Nov 2009 A1
20090278791 Slycke et al. Nov 2009 A1
20090305213 Burgkart et al. Dec 2009 A1
20090326556 Diolaiti Dec 2009 A1
20100030111 Perriere Feb 2010 A1
20100071467 Nasiri et al. Mar 2010 A1
20100099066 Mire et al. Apr 2010 A1
20100120006 Bell May 2010 A1
20100167249 Ryan Jul 2010 A1
20100167250 Ryan et al. Jul 2010 A1
20100167254 Nguyen Jul 2010 A1
20100179428 Pederson et al. Jul 2010 A1
20100198141 Laitenberger et al. Aug 2010 A1
20100273135 Cohen Oct 2010 A1
20110027767 Divinagracia Feb 2011 A1
20110046915 Hol et al. Feb 2011 A1
20110060229 Hulvershorn Mar 2011 A1
20110071419 Liu et al. Mar 2011 A1
20110098569 Warmath et al. Apr 2011 A1
20110144658 Wenderow et al. Jun 2011 A1
20110170752 Martin et al. Jul 2011 A1
20110202012 Bartlett Aug 2011 A1
20110207102 Trotta et al. Aug 2011 A1
20110236866 Psaltis et al. Sep 2011 A1
20110257596 Gaudet Oct 2011 A1
20110269109 Miyazaki Nov 2011 A2
20110282188 Burnside et al. Nov 2011 A1
20110294103 Segal et al. Dec 2011 A1
20110301500 Maguire et al. Dec 2011 A1
20110306025 Sheehan et al. Dec 2011 A1
20120002014 Walsh Jan 2012 A1
20120015336 Mach Jan 2012 A1
20120026307 Price Feb 2012 A1
20120027269 Fidaleo et al. Feb 2012 A1
20120034587 Toly Feb 2012 A1
20120045743 Okano et al. Feb 2012 A1
20120053514 Robinson et al. Mar 2012 A1
20120082969 Schwartz et al. Apr 2012 A1
20120130269 Rea May 2012 A1
20120148994 Hori et al. Jun 2012 A1
20120157800 Tschen Jun 2012 A1
20120171652 Sparks et al. Jul 2012 A1
20120183238 Savvides et al. Jul 2012 A1
20120209243 Yan Aug 2012 A1
20120214144 Trotta et al. Aug 2012 A1
20120219937 Hughes Aug 2012 A1
20120238875 Savitsky et al. Sep 2012 A1
20120251987 Huang et al. Oct 2012 A1
20120280988 Lampotang et al. Nov 2012 A1
20120282583 Thaler et al. Nov 2012 A1
20120293632 Yukich Nov 2012 A1
20120301858 Park et al. Nov 2012 A1
20120323520 Keal Dec 2012 A1
20130006178 Pinho et al. Jan 2013 A1
20130018494 Amini Jan 2013 A1
20130046489 Keal Feb 2013 A1
20130100256 Kirk et al. Apr 2013 A1
20130131503 Schneider et al. May 2013 A1
20130179110 Lee Jul 2013 A1
20130189658 Peters et al. Jul 2013 A1
20130189663 Tuchschmid et al. Jul 2013 A1
20130197845 Keal Aug 2013 A1
20130198625 Anderson Aug 2013 A1
20130203032 Bardsley Aug 2013 A1
20130223673 Davis et al. Aug 2013 A1
20130226137 Brown Aug 2013 A1
20130236872 Laurusonis et al. Sep 2013 A1
20130267838 Fronk et al. Oct 2013 A1
20130296691 Ashe Nov 2013 A1
20130308827 Dillavou et al. Nov 2013 A1
20130323700 Samosky Dec 2013 A1
20130342657 Robertson Dec 2013 A1
20140017650 Romero Jan 2014 A1
20140039452 Bangera et al. Feb 2014 A1
20140071165 Tuchschmid et al. Mar 2014 A1
20140102167 MacNeil et al. Apr 2014 A1
20140120505 Rios et al. May 2014 A1
20140121636 Boyden et al. May 2014 A1
20140121637 Boyden May 2014 A1
20140129200 Bronstein et al. May 2014 A1
20140142422 Manzke et al. May 2014 A1
20140162232 Yang et al. Jun 2014 A1
20140240314 Fukazawa et al. Aug 2014 A1
20140244209 Lee et al. Aug 2014 A1
20140260704 Lloyd et al. Sep 2014 A1
20140278183 Zheng et al. Sep 2014 A1
20140278205 Bhat et al. Sep 2014 A1
20140278215 Keal et al. Sep 2014 A1
20140322683 Baym et al. Oct 2014 A1
20140349263 Shabat et al. Nov 2014 A1
20140349266 Choi Nov 2014 A1
20140363801 Samosky et al. Dec 2014 A1
20150031987 Pameijer et al. Jan 2015 A1
20150049081 Coffey et al. Feb 2015 A1
20150079545 Kurtz Mar 2015 A1
20150079565 Miller et al. Mar 2015 A1
20150080710 Henkel et al. Mar 2015 A1
20150086955 Poniatowski et al. Mar 2015 A1
20150104773 Toly et al. Apr 2015 A1
20150182706 Wurmbauer et al. Jul 2015 A1
20150206456 Foster et al. Jul 2015 A1
20150262512 Rios et al. Sep 2015 A1
20150314105 Gasparyan et al. Nov 2015 A1
20150352294 O'Mahoney et al. Dec 2015 A1
20150359721 Hagel et al. Dec 2015 A1
20150379899 Baker et al. Dec 2015 A1
20150379900 Samosky et al. Dec 2015 A1
20160000411 Raju et al. Jan 2016 A1
20160001016 Poulsen et al. Jan 2016 A1
20160155363 Rios et al. Jun 2016 A1
20160193428 Perthu Jul 2016 A1
20160213856 Despa et al. Jul 2016 A1
20160293058 Gaillot et al. Oct 2016 A1
20160324580 Esterberg Nov 2016 A1
20160367766 Baker Dec 2016 A1
20160374902 Govindasamy et al. Dec 2016 A1
20170053563 Holloway Feb 2017 A1
20170178540 Rios et al. Jun 2017 A1
20170186339 Rios et al. Jun 2017 A1
20170245943 Foster et al. Aug 2017 A1
20170252108 Rios et al. Sep 2017 A1
20170254636 Foster et al. Sep 2017 A1
20170316720 Singh et al. Nov 2017 A1
20180012516 Rios et al. Jan 2018 A1
20180068075 Shiwaku Mar 2018 A1
20180197441 Rios et al. Jul 2018 A1
20180225991 Pedroso et al. Aug 2018 A1
20180240365 Foster et al. Aug 2018 A1
20180261125 Rios et al. Sep 2018 A1
20180261126 Rios et al. Sep 2018 A1
20180271581 OuYang et al. Sep 2018 A1
20180333543 Diaz et al. Nov 2018 A1
20180338806 Grubbs Nov 2018 A1
20190130792 Rios et al. May 2019 A1
20200202747 Rios et al. Jun 2020 A1
20200226951 Rios et al. Jul 2020 A1
20210174706 Rios et al. Jun 2021 A1
20210177518 Rios et al. Jun 2021 A1
20210213205 Karlsson et al. Jul 2021 A1
20220309954 Rios et al. Sep 2022 A1
20230009855 Rios et al. Jan 2023 A1
Foreign Referenced Citations (88)
Number Date Country
2011218649 Sep 2011 AU
2015255197 Dec 2015 AU
2865236 Sep 2013 CA
2751386 Jan 2006 CN
201213049 Mar 2009 CN
201359805 Dec 2009 CN
201465399 May 2010 CN
101908294 Dec 2010 CN
202159452 Mar 2012 CN
102708745 Oct 2012 CN
102737533 Oct 2012 CN
104703641 Jun 2015 CN
105118350 Dec 2015 CN
205541594 Aug 2016 CN
106710413 May 2017 CN
107067856 Aug 2017 CN
102004046003 Mar 2006 DE
202005021286 Sep 2007 DE
0316763 May 1989 EP
1504713 Feb 2005 EP
1723977 Nov 2006 EP
1884211 Feb 2008 EP
2425416 Mar 2015 EP
2538398 Aug 2015 EP
2756857 May 2016 EP
2288686 Jul 1997 GB
2309644 Aug 1997 GB
2 309 644 May 2000 GB
2508510 Jun 2014 GB
201202900 Nov 2013 IN
H10161522 Jun 1998 JP
H10260627 Sep 1998 JP
2004-348095 Dec 2004 JP
2006-189525 Jul 2006 JP
2008-83624 Apr 2008 JP
2011-113056 Jun 2011 JP
2013-037088 Feb 2013 JP
52-21420 Jun 2013 JP
2013-250453 Dec 2013 JP
2014-153482 Aug 2014 JP
2012009379 Feb 2012 KR
20140047943 Apr 2014 KR
10-1397522 May 2014 KR
201207785 Feb 2012 TW
WO 9616389 May 1996 WO
WO 0053115 Sep 2000 WO
WO 02083003 Oct 2002 WO
WO 2005083653 Sep 2005 WO
WO 2005089835 Sep 2005 WO
WO 2007109540 Sep 2007 WO
WO 2008005315 Jan 2008 WO
WO 2008122006 Oct 2008 WO
WO 2009023247 Feb 2009 WO
WO 2009049282 Apr 2009 WO
WO 2009094646 Jul 2009 WO
WO 2009141769 Nov 2009 WO
WO 2011043645 Apr 2011 WO
WO 2011127379 Oct 2011 WO
WO 2011136778 Nov 2011 WO
WO 2012075166 Jun 2012 WO
WO 2012088471 Jun 2012 WO
WO 2012101286 Aug 2012 WO
WO 2012106706 Aug 2012 WO
WO 2012155056 Nov 2012 WO
WO 2013025639 Feb 2013 WO
WO 2013064804 May 2013 WO
WO 2014035659 Mar 2014 WO
WO 2014070799 May 2014 WO
WO 2014100658 Jun 2014 WO
WO 2015109251 Jul 2015 WO
WO 2015110327 Jul 2015 WO
WO 2015136564 Sep 2015 WO
WO 2015138608 Sep 2015 WO
WO 2015171778 Nov 2015 WO
WO 2016089706 Jun 2016 WO
WO 2016123144 Aug 2016 WO
WO 2016162298 Oct 2016 WO
WO 2016191127 Dec 2016 WO
WO 2017048929 Mar 2017 WO
WO 2017048931 Mar 2017 WO
WO 2017050781 Mar 2017 WO
WO 2017060017 Apr 2017 WO
WO 2017070391 Apr 2017 WO
WO 2017151441 Sep 2017 WO
WO 2017151716 Sep 2017 WO
WO 2017151963 Sep 2017 WO
WO 2017153077 Sep 2017 WO
WO 2018136901 Jul 2018 WO
Non-Patent Literature Citations (218)
Entry
“A beginner's guide to accelerometers,” Dimension Engineering LLC, accessed Jul. 11, 2018, in 2 pages, https://www.dimensionengineering.com/info/accelerometers.
“Accelerometer: Introduction to Acceleration Measurement,” Omega Engineering, Sep. 17, 2015, 3 pages, https://www.omega.com/prodinfo/accelerometers.html.
Afzal, et al., “Use of Earth's Magnetic Field for Mitigating Gyroscope Errors Regardless of Magnetic Perturbation,” Sensors 2011, 11, 11390-11414; doi:10.3390/s111211390, 25 pp. published Nov. 30, 2011.
Andraos et al., “Sensing your Orientation” Address 2007, 7 pp.
Arms, S.W., “A Vision for Future Wireless Sensing Systems,” 44 pp., 2003.
“B-Smart disposable manometer for measuring peripheral nerve block injection pressures”, Bbraun USA, 2016, in 4 pages.
Bao, et al., “A Novel Map-Based Dead-Reckoning Algorithm for Indoor Localization”, J. Sens. Actuator Networks, 2014, 3, 44-63; doi: 10.3390/jsan3010044, 20 pp., Jan. 3, 2014.
Benbasat et al., “An Inertial Measurement Framework for Gesture Recognition and Applications,” I. Wachsmuth and T. Sowa (Eds.): GW 2001, Springer-Verlag Berlin Heidelberg, 12 pp., 2002.
Bergamini et al., “Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks”, Oct. 2014, 18625-18649.
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 6 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/.
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 13 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/.
Correa et al., “Virtual Reality Simulator for Dental Anesthesia Training in the Inferior Alveolar Nerve Block,” Journal of Applied Oral Science, vol. 25, No. 4, Jul./Aug. 2017, pp. 357-366.
Desjardins, et al. “Epidural needle with embedded optical fibers for spectroscopic differentiation of tissue: ex vivo feasibility study”, Biomedical Optics Express, vol. 2(6): pp. 1-10. Jun. 2011.
“EPGL Medical Invents Smart Epidural Needle, Nerve Ablation and Trigger Point Treatment Devices: New Smart Medical Devices Will Give Physicians Advanced Situational Awareness During Critical Procedures,” EPGL Medical, dated Aug. 12, 2013, in 3 pages. Retrieved from http://www.prnewswire.com/news-releases/epgl-medical-invents-smart-epidural-needle-nerve-ablation-and-trigger-point-treatment-devices-219344621.html#.
“The EpiAccess System: Access with Confidence”, EpiEP Epicardial Solutions, dated 2015, in 2 pages.
Esteve, Eric, “Why do you need 9D Sensor Fusion to support 3D orientation?”, 5 pp., Aug. 23, 2014, https://www.semiwiki.com/forum/content/3794-why-do-you-need-9d-sensor-fusion-support-3d-orientation.html.
Garg et al., “Radial Artery cannulation-Prevention of pain and Techniques of cannulation: review of literature,” The Internet Journal of Anesthesiology, vol. 19, No. 1, 2008, in 6 pages.
Grenet et al., “spaceCoder: a Nanometric 3D Position Sensing Device,” CSEM Scientific & Technical Report, 1 page, 2011.
Helen, L., et al. “Investigation of tissue bioimpedance using a macro-needle with a potential application in determination of needle-to-nerve proximity”, Proceedings of the 8th International Conference on Sensing Technology, Sep. 2-4, 2014, pp. 376-380.
International Search Report and Written Opinion for Appl. No. PCT/US2013/067352 dated Mar. 31, 2014 in 10 pages.
International Search Report and Written Opinion for Appl. No. PCT/US2015/011845, mailed Apr. 29, 2015 in 10 pages.
International Search Report and Written Opinion for Appl. No. PCT/US2015/019974, mailed May 21, 2015, 10 pages.
International Search Report and Written Opinion for Appl. No. PCT/US2015/062798, mailed Mar. 14, 2016, 12 pages.
International Search Report and Written Opinion for Appl. No. PCT/US2016/057974, mailed Apr. 19, 2017, 21 pages.
International Search Report and Written Opinion for Appl. No. PCT/US2017/019518, mailed Sep. 18, 2017, 19 pages.
International Preliminary Report on Patentability for Appl. No. PCT/US2016/057974, mailed May 3, 2018, 13 pages.
Jafarzadeh et al., “Design and construction of an automatic syringe injection pump,” Pacific Science Review A: Natural Science and Engineering 18, 2016, in 6 pages.
Kalvøy, H., et al., “Detection of intraneural needle-placement with multiple frequency bioimpedance monitoring: a novel method”, Journal of Clinical Monitoring and Computing, Apr. 2016, 30(2):185-192.
Kettenbach et al., “A robotic needle-positioning and guidance system for CT-guided puncture: Ex vivo results,” Minimally Invasive Therapy and Allied Technologies, vol. 23, 2014, in 8 pages.
Ladjal, et al., “Interactive Cell Injection Simulation Based on 3D Biomechanical Tensegrity Model,” 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, in 9 pages.
Lee et al., “An Intravenous Injection Simulator Using Augmented Reality for Veterinary Education and its Evaluation,” Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, Dec. 2-4, 2012, in 4 pages.
Madgwick, Sebastian O.H., “An efficient orientation filter for inertial and inertial/magnetic sensor arrays,” 32 pp., Apr. 30, 2010.
Microsoft, “Integrating Motion and Orientation Sensors,” 85 pp., Jun. 10, 2013.
Miller, Nathan L., Low-Power, Miniature Inertial Navigation System with Embedded GPS and Extended Kalman Filter, MicroStrain, Inc., 12 pp., 2012.
MPU-9150 9-Axis Evaluation Board User Guide, Revision 1.0, 15 pp., May 11, 2011, http//www.invensense.com.
MPU-9150, Register Map and Descriptions, Revision 4.2, 52 pp., Sep. 18, 2013, http//www.invensense.com.
MPU-9150, Product Specification, Revision 4.3, 50 pp., Sep. 18, 2013, http//www.invensense.com.
Poyade et al., “Development of a Haptic Training Simulation for the Administration of Dental Anesthesia Based Upon Accurate Anatomical Data,” Conference and Exhibition of the European Association of Virtual and Augmented Reality, 2014, in 5 pages.
PST Iris Tracker, Plug and Play, 3D optical motion tracking specifications, 1 p., Dec. 4, 2014, www.pstech.com.
PST Iris Tracker, Instruction Manual, 3D optical motion tracking specifications, 42 pp., Jul. 27, 2012, www.pstech.com.
QUIO, “Smartinjector,” available at https://web.archive.org/web/20161017192142/http://www.quio.com/smartinjector, Applicant believes to be available as early as Oct. 17, 2016, in 3 pages.
Search and Examination Report for Appl. No. GB1319193.7 in 6 pages dated Mar. 28, 2014.
Search and Examination Report, dated Feb. 23, 2015, by the UK Intellectual Property Office, in the matter of Application No. GB1414892.8 of TruInject Medical Corporation, 6 pp.
State Electronics, “Sensofoil Membrane Potentiometer,” Product Information and Technical Specifications, in 6 pages.
Struik, Pieter, “Ultra Low-Power 9D Fusion Implementation: A Case Study,” Synopsis, Inc., 7 pp., Jun. 2014.
Sutherland, et al. “An Augmented Reality Haptic Training Simulator for Spinal Needle Procedures,” IEEE, 2011.
Truinject Corp., “Smart Injection Platform,” http://truinject.com/technology/, printed Jan. 13, 2018, in 3 pages.
Varesano, Fabio, “Prototyping Orientation and Motion Sensing Objects with Open Hardware,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/˜varesano, Feb. 10, 2013, 4 pp.
Varesano, Fabio, “FreeIMU: An Open Hardware Framework for Orientation and Motion Sensing,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/˜varesano, Mar. 20, 2013, 10 pp.
“About the Journal”, J. Dental Educ., AM. Dental Educ. Ass'n, 2019, http://www.jdentaled.org/content/about-us (last visited Oct. 9, 2019).
Begg et al., “Computational Intelligence for Movement Sciences: Neural Networks and Other Emerging Techniques”, Idea Group Inc (IGI), 2006.
Comsa et al., “Bioluminescene imaging of point sources implants in small animals post mortem: evaluation of a method for estimating source strength and depth”, Phys. Med. Biol., Aug. 2007, vol. 52, No. 17, pp. 5415-5428.
Hotraphinyo et al., “Precision measurement for microsurgical instrument evaluation”, Conference Proceedings of the 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Societyl, 2001, vol. 4, pp. 3454-3457.
Krupa et al., “Autonomous 3-D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing”, IEEE Trans. Robotics and Automation, 2003, vol. 19, pp. 842-853.
Lee et al., “Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine,” The Veterinary Journal, 2013, vol. 196, No. 2, pp. 197-202.
Liu et al. “Robust Real-Time Localization of Surgical Instruments in the Eye Surgery Stimulator (EyeSi)”, Signal and Image Processing, 2002.
Merril et al., “The Ophthalmic Retrobulbar Injection Simulator (ORIS): An Application of Virtual Reality to Medical Education”, Proc. Ann. Symp. Comput. Med. Care, 1992, pp. 702-706.
Mukherjee et al., “A Hall Effect Sensor Based Syringe Injection Rate Detector”, IEEE 2012 Sixth Int'l Conf. on Sensing Technol.(ICST), Dec. 18-21, 2012.
Petition for Inter Partes Review of U.S. Pat. No. 9,792,836, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 ET SEQ., IPR2020-00042, dated Oct. 11, 2019.
Patterson et al., “Absorption spectroscopy in tissue-simulating materials: a theoretical and experimental study of photon paths”, Appl. Optics, Jan. 1995, vol. 34, No. 1, pp. 22-30.
Van Sickle et al., “Construct validation of the ProMIS simulator using novel laparoscopic suturing task”, Surg Endosc, Sep. 2005, vol. 19, No. 9, pp. 1227-1231.
Wierinck et al., “Expert Performance on a Virtual Reality Simulation System”, 71J. Dental Educ., Jun. 2007, pp. 759-766.
Wik et al., “Intubation with laryngoscope versus transillumination performed by paramedic students on mainkins and cadavers”, Resuscitation, Jan. 1997, vol. 33, No. 3, pp. 215-218.
Laerdal, “Virtual Phlebotomy—Directions for Use,” Self-directed Phlebotomy learning, Aug. 4, 2020, pp. 1-100.
Laerdal Medical, http://www.laerdal.com/us/nav/203/Venous-Arterial-Access, printed Mar. 8, 2019 in 3 pgs.
Lampotang et al., “A Subset of Mixed Simulations: Augmented Physical Simulations with Virtual Underlays,” Interservice/Idnustry Training, Simualtion, and Education Conference (I/ITSEC), 2012, pp. 1-11.
Lance Baily, Polhemus Delivers World Class Motion Tracking Technology to Medical Simulation Industry, healthysimulation.com, (May 2, 2016), https://www.healthysimulation.com/8621/polhemus-deliversworld-class-motion-tracking-technology-to-medical-simulationindustry/.
Lampotang et al., “Mixed Reality Simulation for Training Reservists and Military Medical Personnel in Subclavian Central Venous Access,” Informational Poster, UFHEALTH, Center for Safety, Simulation and Advanced Learning Technologies, 2015, 1 pp. available at https://simulation.health.ufl.edu/files/2018/12/Dept_CoR_2015-Mixed_Reality_Simulation_for_Training.pdf.
S. Laufer et al., Sensor Technology in Assessments of Clinical Skill, 372 N Engl JMED 784-86 (2015).
“Learning by Feel: ToLTech and Allergan Simulator”, 3D Systems, dated May 8, 2012, in 93 pages.
Lee et al., “A Phantom Study on the Propagation of NIR Rays under the Skin for Designing a Novel Vein-Visualizing Device,” ICCAS, Oct. 20-23, 2013, pp. 821-823.
Lee et al., “Evaluation of the Mediseus® Epidural Simulator”, Anaesthesia and Intensive Care (2012), vol. 40, No. 2, pp. 311-318.
Lee et al., “The utility of endovascular simulation to improve technical performance and stimulate continued interest of preclinical medical students in vascular surgery,” Journal of Surgical Education, 2009 APDS Spring Meeting, vol. 66, No. 6, 367-373.
Lee et al., “Virtual Reality Ophthalmic Surgical Simulation as a Feasible Training and Assessment Tool: Results of a Multicentre Study,” Canada Journal of Ophthalmology, Feb. 2011 vol. 46, No. 1, 56-60.
Lemole et al., “Virtual Reality in Neurosurgical Education: Part-Task Ventriculostomy Simulation with Dynamic Visual and Haptic Feedback,” Neurosurgery, Jul. 2007, vol. 61, No. 1, pp. 142-149.
Lendvay et al., “The Biomechanics of Percutaneous Needle Insertion”, Studies in Health Technology and Informatics, Jan. 2008 in 2 pages.
Leopaldi et al., “The dynamic cardiac biosimulator: A method for training physicians in beating-heart mitral valve repair procedures,” The Journal of Thoracic and Cardiovascular Surgery, 2018, vol. 155, No. 1, pp. 147-155.
Lim et al., “Simulation-Based Military Regional Anesthesia Training System”, US Army Medical Research and Materiel Command Fort Detrick MD, Telemedicine and Advanced Technology Research Center, 2008, in 8 pages.
Lim, M.W. et al., “Use of three-dimensional animation for regional anaesthesia teaching: application to interscalene brachial plexus blockade,” British Journal of Anaesthesia, Advance Access, 2004, vol. 94, pp. 372-377.
Liu et al. “Study on an Experimental AC Electromagnetic Tracking System” Proceedings of the 5th World Congress on Intelligent Control and Automation, Jun. 15-19, 2001. pp 3692-3695.
Luboz et al., “ImaGiNe Seldinger: First simulator for Seldinger technique and angiography training”, Computer Methods and Programs in Biomedicine, vol. 111, No. 2, Aug. 2013 pp. 419-434.
Mastmeyer et al., “Direct Haptic Volume Rendering in Lumbar Puncture Simulation”, Studies in Health Technology and Informatics, vol. 173, No. 280, 2012 in 8 pages.
Mastmeyer et al., “Real-Time Ultrasound Simulation for Training of US-Guided Needle Insertin in Breathing Virtual Patients”, Studies in Health Technology and Informatics, Jan. 2016 in 9 pages.
MEDGADGET Editors, “EYESI Surgical Simulator,” MEDGADGET, Aug. 28, 2006,4 pp., printed on Feb. 7, 2020, https://www.medgadget.com/2006/08/eyes_i_surgical.html.
MEDGADGET Editors, “ToLTech Cystoscopy Simulator Helps Practice BOTOX Injections”, MEDGADGET, May 14, 2012, in 2 pages. Printed on Feb. 6, 2020, http://www.medgadget.com/2012/05/toltech-cystoscopy-simulator-helps-practice-botox-injections.html.
Merlone1, Eyesi_Cataract_2011 (Sept. 9, 2011), https://www.youtube.com/watch?v=XTulabWmEvk.
Mnemonic, Ipsen Injection Simulators, available at http://mnemonic.studio/project/ispen-injection-simulators. Copyright 2019, Website viewed on Aug. 24, 2020.
Mnemonic, Injection Simulator (Oct. 20, 2017), https://vimeo.com/239061418.
Mukherjee et al., “An Ophthalmic Anesthesia Training System Using Integrated Capacitive and Hall Effect Sensors,” IEEE, Transactions on Instrumentation and Measurement, Jan. 2014, vol. 63, No. 5, 11 pp.
Nelson, Douglas A. Jr., “A Modular and Extensible Architecture Integrating Sensors, Dynamic Displays of Anatomy and Physiology, and Automated Instruction for Innovations in Clinical Education” Doctoral Dissertation, Univ. of Pitt., 2017, 260 pp.
Nelson et al., “The Tool Positioning Tutor: A Target-Pose Tracking and Display System for Learning Correct Placement of a Medical Device,” Medicine Meets Virtual Reality 18, IOS Press, 2011, 5 pp.
Ottensmeyer et al., “Ocular and Craniofacial Trauma Treatment Training System: Overview & Eyelid Laceration Module,” workshop Proceedings of the 8th International Conference on Intelligent Environments, IOS Press, 2012, 13 pp.
Ozturk wt al., “Complications Following Injection of Soft-Tissue Fillers,” Aesthetic Surgery Journal, from the American Society for Aesthetic Plastic Surgery, Inc. Reprints and permissions, http://www.sagepub.com/journalsPermissions.nav, Aug. 2013, pp. 862-877.
K. Perrone et al., Translating motion tracking data into resident feedback: An opportunity for streamlined video coaching, 209 Am J Surg. 552-56 (2015).
Petition for Inter Partes Review of U.S. Pat. No. 9,792,836, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 ET SEQ., IPR2020-00042, dated Oct. 17, 2017.
C. Pugh et al., A Retrospective Review of TATRC Funding for Medical Modeling and Simulation Technologies, 6 Simulation in Healthcare, 218-25 (2011).
Petition for Inter Partes Review of U.S. Pat. No. 10,290,232, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 ET SEQ., IPR2020-00937 dated May 14, 2019.
Petition for Inter Partes Review of U.S. Pat. No. 10,290,231, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 ET SEQ., IPR2020-00935 dated May 14, 2019.
Pitt Innovates, BodyExplorer™ (Sep. 24, 2014), https://www.youtube.com/watch?v=T6G2OWJm5hs.
Pitt Innovates, Pitt Student Innovator Award, Pitt Intellectual Property 2017, Douglas A Nelson Jr. (Nov. 28, 2017), https://www.youtube.com/watch?v=0_CVBgWtCLo.
Rahman et al., “Tracking Manikin Tracheal Intubation Using Motion Analysis,” Pediatric Emergency Care, Aug. 2011, vol. 27, No. 8, pp. 701-705.
Robinson et al., “A Mixed-Reality Part-Task Trainer for Subclavian Venous Access,” Journal of the Society for Simulation in Healthcare, Feb. 2014, vol. 9, No. 1, pp. 56-64.
Salem et al., “Clinical Skills Laboratories “CSLs” Manual 1432-2011,” Jan. 2011, pp. 0-88.
Samosky et al., “BodyWindows: Enhancing a Mannequin with Projective Augmented Reality for Exploring Anatomy, Physiology and Medical Procedures,” Medicine Meets Virtual Reality 19, 2012, 433, J.D. Westwood et al. eds., IOS Press, pp. 433-439.
Samosky et al., “Enhancing Medical Device Training with Hybrid Physical-Virtual Simulators: Smart Peripherals for Virtual Devices,” Medicine Meets Virtual Reality 20, Jan. 2013, J.D. Westwood et al. eds., IOS Press 377, pp. 377-379.
Samosky, Joseph, “View from the Top: Simulation Director Envisions Greater Use for Training Tool,” Biomedical Instrumentation & Technology, 2012, pp. 283-288.
Samosky et al., “Toward a Comprehensive Hybrid Physical—Virtual Reality Simulator of Peripheral Anesthesia with Ultrasound and Neurostimulator Guidance,” Medicine Virtual Reality 18, IOS Press, 2011, pp. 552-554.
Satava, “Accomplishments and Challenges of Surgical Simulation”, Dawning of the next-generation surgical education, Surgical Endoscopy Ultrasound and Interventional Techniques, Online publication, Feb. 6, 2001, in 10 pages.
Schneider, Chad Michael, “Systems for Robotic Needle Insertion and Tool-Tissue Interaction Modeling,” Research Gate, 2004, pp. 1-74, Baltimore, Maryland.
Sclaverano et al. “BioSym : a simulator for enhanced learning of ultrasound-guided prostate biopsy”, Studies in Health Technology and Informatics, 2009 in 6 pages.
S. Shaharan et al., Motion Tracking System in Surgical Training, 2017 INTECHOPEN 3-23 (2017), available at http://dx.doi.org/10.5772/intechopen.68850.
Shen et al., “Virtual trainer for intra-destrusor injection of botulinum toxin to treat urinary incontinence”, Studies in Health Technology and Informatics, vol. 173, 2012 in 4 pages.
J. {hacek over (S)}ilar et al., Development of In-Browser Simulators for Medical Education: Introduction of a Novel Software Toolchain, 21 J Med Internet Res. e14160 (published online Jul. 3, 2019).
Simbionix, Valencia College's CVT program uses Simbionix ANGIO Mentor simulators, Feb. 26, 2013, https://www.youtube.com/watch?v=oAE0fWzXMjw.
SimEx, “Dental Augmented Reality Simulator,” EPED, 3 pp. https://www.epedmed.com/simex. Available as early as 2019.
Spiteri et al., “Phacoemulsification Skills Training and Assessment,” The British Journal of Ophthalmology 2010, Aug. 2009, 20 pp.
Stunt et al., “Validation of ArthroS virtual reality simulator for arthroscopic skills,” Knee Surgery Sports Traum. Arthroscopy 23, Jun. 11, 2014, 8 pp.
Sultan et al., “A Novel Phantom for Teaching and Learning Ultrasound-guided Needle Manipulation,” Journal of Medical Ultrasound, Elsevier Taiwan LLC, Jul. 2013, vol. 21, pp. 152-155.
Suzuki et al., “Simulation of Endovascular Neurointervention Using Silicone Models: Imaging and Manipulation,” Neurol Med Chir (Tokyo), 2005, vol. 45, pp. 567-573.
The Simulation Group, Internet Archive Wayback webpage capture of http://www.medicalsim.org/virgil.htm, apparently available Apr. 10, 2013, site visited Aug. 25, 2020.
The Simulation Group, VIRGIL™ Videos (2002), http://www.medicalsim.org/ virgil_vid.htm; http://www.medicalsim.org/virgil/virgil%20expert.mpg.
Ting et al., “A New Technique to Assist Epidural Needle Placement: Fiberoptic-guided Insertion Using Two Wavelengths,” Anesthesiology, 2010, vol. 112, pp. 1128-1135.
Touch of Life Technologies, “ToLTech Cystoscopy Simulator Helps Practice BOTOX Injections,” https://www.medgadget.com/2012/05/toltech-cystoscopy-simulator-helps-practice-botox-injections.html, May 2012, printed on Feb. 6, 2020 in 2 pgs.
Touch of Life Technologies, “Touch of Life Technologies' new cystoscopy and bladder injection simulator offers urologists training on use of BOTOX®,” https://www.urotoday.com/recent-abstracts/pelvic-health-reconstruction/urinary-incontinence/50289-touch-of-life-technologies-new-cystoscopy-and-bladder-injection-simulator-offers-urologists-training-on-use-of-botox-onabotulinumtoxina-as-treatment-for-urinary-incontinence-in-adults-with-neurological-conditions.html, May 2012, printed on Feb. 6, 2020 in 2 pgs.
UFCSSALT, “Video of mixed simulation for placement of CVL needle”—(Patent Pending), Dec. 5, 2011, https://www.youtube.com/watch?v=0ITIFbiiwRs.
UFHealth, “UF developing mixed-reality simulators for training in treatment of injured soldiers,” Aug. 20, 2014, https://www.youtube.com/watch?v=sMxH1lprc10& feature=emb_title.
Ungi et al., “Perk Tutor: An Open-Source Training Platform for Ultrasound-Guided Needle Insertions,” IEEE Transactions on Biomedical Engineering, Dec. 2012, vol. 59, No. 12, pp. 3475-3481.
Univervisty of Pittsburgh Innovation Institute, “BodyExplorer: An Automated Augmented Reality Simulator for Medical Training and Competency Assessment,” Mar. 2016, 2 pp.
Univervisty of Pittsburgh Innovation Institute, “BodyExplorer: Enhancing a Mannequin Medical Simulator with Sensing. Tangible Interaction and Projective Augmented Reality for Exploring Dynamic Anatomy, Physiology and Clinical Procedures,” 2012, pp. 1-3.
Vaughan et al., “A review of virtual reality based training simulators for orthopedic surgery,” Journal Engineering and Physics, 2016, vol. 38, Elsevier Ltd., pp. 59-71.
Vidal et al., “Developing an Immersive Ultrasound Guided Needle Puncture Simulator”, Studies in Health Technology and Informatics, 2009, pp. 398-400.
Virgil™, The Simulation Group/CIMIT, “Medical Simulation Chest Trauma Training System,” 2002, 6 pp. http://www.medicalsim.org/virgil.htm.
VirtaMed ArthroS™, “Virtual reality arthroscopy for knee, shoulder, hip, ankle & FAST basic skills,” Fact Sheet/Brochure Jul. 13, 2011.
VirtaMed ArthroS™ Module Descriptions. 2019.
Virtamed, ArthroS—The 2012 Arthroscopic Simulator for Knee Arthroscopy, Feb. 1, 2012, https://www.youtube.com/watch?v=Y6w3AGfAqKA.
Virtamed, Arthroscopy Training Simulator ArthroS Now With Shoulder Module!, Mar. 13, 2013, https://www.youtube.com/watch?v=kPuAm0MIYg0.
Virtamed, Arthroscopy Training 2013: VirtaMed ArthroS Shoulder Simulator, Sep. 24, 2013, https://www.youtube.com/watch?v=WdCtPYr0wK0.
VIRTAMED News, “VirtaMed ArthroS—Virtual reality training for knee arthroscopy,” VirtaMed, Jul. 13, 2011, 2 pp. accessed on Feb. 6, 2020, https://www.virtamed.com/en/news/virtamed-arthros-virtual-reality-training-knee-arthroscopy/.
VirtaMed, VirtaMed ArthroS™—diagnostic and therapeutic arthroscopy in both the knee and shoulder (Apr. 15, 2014), https://www.youtube.com/watch?v=gtkISWnOzRc.
Virtual I.V.® Simulator—1. Introduction. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=H9Qd6N9vG_A, viewed on Jul. 27, 2021.
Virtual I.V.® Simulator—2. System Overview. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=101UFNFU3cU, viewed on Jul. 28, 2021.
Virtual I.V.® Simulator—3. Training overview. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=5Ut6YkDaNWI, viewed on Jul. 27, 2021.
VRMAGIC, “eyesi by VRmagic Surgical Simulator,” Product Brochure, 2015, available at https://pdf.medicalexpo.com/pdf/vrmagic/eyesi-surgical-product-brochure/112458-159450.html.
Walsh et al., “Use of Simulated Learning Environments in Dentistry and Oral Health Curricula,” SLE in Dentistry and Oral Health: Final Report, 2010, Health Workforce Australia, pp. 1-112.
Wandell et al., “Using a Virtual Reality Simulator in Phlebotomy Training”, LabMedicine, ( Aug. 2010) vol. 41, No. 8, in 4 pages.
Welk et al., “DentSim—A Future Teaching Option for Dentists,” 7 International Journal of Computerized Dentistry, 2004, 9 pp.
Wiles, Andrew et al., “Accuracy assessment and interpretation for optical tracking systems,” SPIE, Medical Imaging: Visualization, Image-Guided Procedures and Display, 2004, vol. 5367, pp. 1-12.
Wolpert et al., “ENISS: An Epidural Needle Insertion Simulation System”, Institute of Electrical and Electronics Engineers Inc., 2007 pp. 271-272.
Yeo et al., “The Effect of Augmented Reality Training on Percutaneous Needle Placement in Spinal Facet Joint Injections,” IEEE, Transactions on Biomedical Engineering, Jul. 2011, vol. 58, No. 7, 8 pp.
Yu et al., “Development of an in Vitro Tracking System with Poly (vinyl alcohol) Hydrogel for Catheter Motion,” Journal of Biomedical Science and Engineering, 2010, vol. 5, No. 1, 11-17.
Examination Report in corresponding European Patent Application No. 16787996.4, dated Dec. 16, 2020, in 9 pages.
3D Systems,“ANGIO Mentor Clinical Validations, The Role of Simulation in Boosting the learning Curve in EVAR Procedures,” Journal of Surgical Education, Mar.-Apr. 2018, 75(2), pp. 1-2, accessed on Feb. 6, 2020, https://simbionix.com/ simulators/clinical-validations/angio-mentor-clinical-validations/ (listing clinical validations completed on ANGIO Mentor from 2007 through 2018).
3D Systems, “ANGIO Mentor™,” Product Brochure/Overview. 2015, 6 pp.
Ainsworth et al., "Simulation Model for Transcervical Laryngeal Injection Providing Real-time Feedback," Annals of Otology, Rhinology & Laryngology, 2014, vol. 123 (12), pp. 881-886.
Association of American Medical Colleges, Medical Simulation in Medical Education: Results of an AAMC Survey (Sep. 2011) (“AAMC Survey”), in 48 pages.
"A Virtual Reality Based Joint Injection Simulator Phase III", https://www.sbir.gov/. Retrieved Mar. 5, 2021, in 2 pages.
Banivaheb, Niloofar, “Comparing Measured and Theoretical Target Registration Error of an Optical Tracking System,” Feb. 2015, Toronto, Ontario, 128 pp.
Berkelman et al., "Co-Located 3D Graphic and Haptic Display using Electromagnetic Levitation", The Institute of Electrical and Electronics Engineers, 2012, in 6 pages.
Blue Telescope, DAISEY Injector Simulator, available at https://www.bluetelescope.com/work/ipsen-injection-simulator. Blue Telescope Laboratories, 2020, site visited Aug. 24, 2020.
Blum et al., “A Review of Computer-Based Simulators for Ultrasound Training,” Society for Simulation in Healthcare, Apr. 2013, vol. 8, pp. 98-108.
Botden et al., “Suturing training in Augmented Reality: gaining proficiency in suturing skills faster,” Surg Endosc, 2009, vol. 23, pp. 2131-2137.
Botden et al., “Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?,” World J. Surgery, 31, 2007, 10 pp.
Botden et al., “Face validity study of the ProMIS Augmented Reality laparoscopic suturing simulator,” Surgical Technology International, Feb. 2008, 17, 16 pp.
Botden et al., “What is going on in augmented reality simulation in laparoscopic surgery,” Surgical Endoscopy 23, 2009, 1693-1700.
Bova et al., “Mixed-Reality Simulation for Neurosurgical Procedures,” Neurosurgery, Oct. 2013, vol. 73, No. 4, pp. S138-S145.
Brennan et al., "Classification of diffuse light emission profiles for distinguishing skin layer penetration of a needle-free jet injection," Biomedical Optics Express, Oct. 1, 2019, vol. 10, No. 10, pp. 5081-5092.
Brennan et al., “Light source depth estimation in porcine skin using spatially resolved diffuse imaging,” 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, 2016, pp. 5917-5920.
Brett et al., "Simulation of resistance forces acting on surgical needles," Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine, Feb. 1997, vol. 211, Part H, pp. 335-347.
Buchanan, Judith Ann, “Use of Simulation Technology in Dental Education,” Journal of Dental Education, 2001, vol. 65, No. 11, 1225-1231.
CAE Healthcare, “CAE ProMIS Laparoscopic Simulator,” Product Brochure/Overview, 2012, 2 pp.
Capsulorhexis forceps only technique rehearsed on EYESi before OR (Feb. 10, 2010), https://www.youtube.com/watch?v=ySMI1Vq6Ajw.
Chui et al., “Haptics in computer-mediated simulation: Training in vertebroplasty,” Simulation & Gaming, Dec. 2006, vol. 37, No. 4, pp. 438-451.
J. Clark et al., A quantitative scale to define endoscopic torque control during natural orifice surgery, 22 Minimally Invasive Therapy & Allied Technologies 17-25 (2013).
Coles et al., "Modification of Commercial Force Feedback Hardware for Needle Insertion Simulation", Studies in Health Technology and Informatics, 2011, in 1 page.
Coquoz et al., “Determination of depth of in vivo bioluminescent signals using spectral imaging techniques,” Conference Proceedings of SPIE, 2003, vol. 4967, pp. 37-45, San Jose, CA.
Craig, Alan B., “Augmented Reality Hardware,” Understanding Augmented Reality Chapter 3, 2013, Elsevier Inc., pp. 69-124.
Cumin et al., “Simulators for use in anaesthesia,” Anaesthesia, 2007, vol. 62, pp. 151-162.
Dang et al., "Development and Evaluation of an Epidural Injection Simulator with Force Feedback for Medical Training", Studies in Health Technology and Informatics, 2001, vol. 81, pp. 97-102.
A. D'Angelo et al., Use of decision-based simulations to assess resident readiness for operative independence, 209 Am J Surg. 132-39 (2015).
V. Datta et al., The relationship between motion analysis and surgical technical assessments, 184(1) Am J Surg. 70-73 (2002).
Datta et al., "The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model," vol. 193, No. 5, Nov. 2001, pp. 479-485.
Davenar123, DentSim (Mar. 18, 2008), https://www.youtube.com/watch?v=qkzXUHay1W0.
Decision Denying Institution of Inter Partes Review for IPR2020-00042, U.S. Pat. No. 9,792,836, dated Apr. 14, 2020, in 20 pages.
Defendant SHDS, Inc.'s (F/K/A Nestle Skin Health, Inc.) Second Supplemental Disclosure of Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp. v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Mar. 5, 2021, in 9 pages.
Defendant SHDS, Inc.'s (F/K/A Nestle Skin Health, Inc.) Final Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp. v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Jun. 18, 2021, in 54 pages.
DentSim Educators, DentSim Classroom Introduction (Aug. 8, 2013), https://vimeo.com/79938695.
DentSimLab, Aha Moments—Dentsim Students explain how their dental skills are improving (Nov. 13, 2013), https://www.youtube.com/watch?v=02NgPmhg55Q.
Dine et al., "Improving cardiopulmonary resuscitation quality and resuscitation training by combining audiovisual feedback and debriefing," Crit Care Med, 2008, vol. 36, No. 10, pp. 2817-2822.
A. Dosis et al., Synchronized Video and Motion Analysis for the Assessment of Procedures in the Operating Theater, 140 Arch Surg. 293-99 (2005).
EPED Taiwan, EPED—Computerized Dental Simulator (CDS-100) (Jun. 9, 2014), https://www.youtube.com/watch?v=m8UXaV2ZSXQ.
Färber et al., “Needle Bending in a VR-Puncture Training System Using a 6DOF Haptic Device”, Studies in Health Technology and Informatics, 2009, vol. 142, in 3 pages.
Ford et al., "Impact of simulation-based learning on medication error rates in critically ill patients," Intensive Care Med, 2010, vol. 36, pp. 1526-1531.
Franz et al., “Electromagnetic Tracking in Medicine—A Review of Technology, Validation, and Applications,” IEEE, Transactions on Medical Imaging, Aug. 2014, vol. 33, No. 8, pp. 1702-1725.
Garrett et al., "High-Fidelity Patient Simulation: Considerations for Effective Learning," Teaching with Technology: High-Fidelity Simulation, 2010, vol. 31, No. 5, pp. 309-313.
Gobbetti et al., "Catheter Insertion Simulation with co-registered Direct Volume Rendering and Haptic Feedback", Studies in Health Technology and Informatics, vol. 70, 2000, in 3 pages.
Gottlieb et al., “Faculty Impressions of Dental Students' Performance With and Without Virtual Reality Simulation,” Journal of Dental Education, 2011, vol. 75, No. 11, pp. 1443-1451.
Gottlieb et al., “Simulation in Dentistry and Oral Health,” The Comprehensive Textbook of Healthcare Simulation Chapter 21, Apr. 2013, pp. 329-340.
Hoffman et al., "Arytenoid Repositioning Device," Annals of Otology, Rhinology & Laryngology, 2014, vol. 123 (3), pp. 195-205.
Hoffman et al., “Transillumination for Needle Localization in the Larynx,” The Laryngoscope, 2015, vol. 125, pp. 2341-2348.
Huang et al., “CatAR: A Novel Stereoscopic Augmented Reality Cataract Surgery Training System with Dexterous Instrument Tracking Technology,” CHI' 18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Apr. 21-26, 2018, pp. 1-12, ACM, Montreal, Canada.
IDA Design Awards—Winners, DAISEY Injection Simulator, available at https://idesignawards.com/winners/zoom.php?eid=9-11737-16&count=0&mode=, available as early as Sep. 7, 2016.
Image Navigation, DentSim by Image Navigation—Augmented Reality Dental Simulation, Nov. 2014, 5 pp., available at https://image-navigation.com/wp-content/uploads/2014/11/DentSim-V5-2-Pager.pdf.
Image Navigation, DentSim Computerized Dental Training Simulator, Product Brochure, Jul. 2014, available at https://image-navigation.com/wp-content/uploads/2014/07/DentsimBrochure.pdf.
“Immersion Medical Joins with PICC Excellence to Promote Training Products for Peripherally Inserted Central Catheter Procedure”, Immersion Corporation, Business Wire 2006. Dated Jan. 9, 2006, in 3 pages.
“Immersion Medical Upgrades CathSim AccuTouch”, Med Device Online, dated Jan. 12, 2005 in 1 page.
Invensense, Inc., “MPU-9150 EV Board User Guide,” May 11, 2011, pp. 1-15.
Invensense, Inc., “MPU-9150 Product Specification Revision 4.3,” Sep. 18, 2013, pp. 1-50.
Invensense, Inc., “MPU-9150 Register Map and Descriptions Revision 4.2,” Sep. 18, 2013, pp. 1-52.
Jasinevicius et al., “An Evaluation of Two Dental Simulation Systems: Virtual Reality versus Contemporary Non-Computer-Assisted,” Journal of Dental Education, 2004, vol. 68, No. 11, 1151-1162.
Judgment and Final Written Decision Determining All Challenged Claims Unpatentable, U.S. Pat. No. 10,290,231 B2, IPR2020-00935.
Judgment and Final Written Decision Determining All Challenged Claims Unpatentable, U.S. Pat. No. 10,290,232 B2, IPR2020-00937.
Kandani et al., "Development in blood vessel searching system for HMS," SPIE, Infrared Systems and Photoelectronic Technology III, 2008, vol. 7065, pp. 1-10.
Khosravi, Sara, “Camera-Based Estimation of Needle Pose for Ultrasound Percutaneous Procedures,” University of British Columbia, 2008, pp. ii-83.
Kumar et al., “Virtual Instrumentation System With Real-Time Visual Feedback and Needle Position Warning Suitable for Ophthalmic Anesthesia Training,” IEEE: Transactions on Instrumentation and Measurement, May 2018, vol. 67, No. 5, pp. 1111-1123.
Lacey et al., “Mixed-Reality Simulation of Minimally Invasive Surgeries,” IEEE Computer Society, 2007, pp. 76-87.
Laerdal, “Virtual I.V.—Directions for Use”, www.laerdal.com, dated Sep. 3, 2010, in 100 pages.
Laerdal, “Virtual I.V. Sell Sheet”, www.laerdal.com, dated Mar. 26, 2013, in 2 pages.
Laerdal, “Virtual I.V. Simulator (Discontinued)”, www.laerdal.com, in 5 pages. Retrieved Jul. 23, 2021.
Related Publications (1)
Number: 20200206424 A1; Date: Jul. 2020; Country: US
Provisional Applications (1)
Number: 62243801; Date: Oct. 2015; Country: US
Continuations (1)
Parent: 15299209; Date: Oct. 2016; Country: US
Child: 16673889; Country: US