A variety of medical injection procedures are often performed in prophylactic, curative, therapeutic, or cosmetic treatments. Injections may be administered in various locations on the body, such as under the conjunctiva, into arteries, bone marrow, the spine, the sternum, the pleural space of the chest region, the peritoneal cavity, joint spaces, and internal organs. Injections can also be helpful in administering medication directly into anatomic locations that are generating pain. These injections may be administered intravenously (into a vein), intramuscularly (into a muscle), intradermally (into the skin), subcutaneously (into the fatty layer beneath the skin), or intraperitoneally (into the body cavity). Injections can be performed on humans as well as animals. The methods of administering injections vary across procedures and may depend on the substance being injected, the needle size, or the area of injection.
Injections are not limited to treating medical conditions; they may also be used to treat aesthetic imperfections or to perform restorative cosmetic procedures. Many of these procedures are performed through injections of various products into different parts of the body. The aesthetic and therapeutic injectables industry consists of two main categories of products: neuromodulators and dermal fillers. The neuromodulator industry commonly utilizes nerve-inhibiting products such as Botox®, Dysport®, and Xeomin®. The dermal filler industry utilizes products administered by providers to patients for both cosmetic and therapeutic reasons, such as, for example, Juvederm®, Restylane®, Belotero®, Sculptra®, Artefill®, and others. These providers or injectors may include plastic surgeons, facial plastic surgeons, oculoplastic surgeons, dermatologists, nurse practitioners, dentists, and nurses.
One of the major problems in the administration of injections is that there is no official certification or training process. Anyone with a minimal medically related license may inject a patient. These “injectors” may include primary care physicians, dentists, veterinarians, nurse practitioners, nurses, physician's assistants, or aesthetic spa physicians. However, the qualifications and training requirements for injectors vary by country, state, and county. For example, in most states in the United States, the only requirement to inject patients with neuromodulators and/or fillers is a nursing degree or medical degree. This causes major problems with uniformity and expertise in administering injections. The drawbacks of this lack of uniformity in training and expertise are widespread throughout the medical industry. Doctors and practitioners are often not well trained in administering injections of diagnostic, therapeutic, and cosmetic chemical substances. This lack of training often leads to instances of chronic pain, headaches, bruising, swelling, or bleeding in patients.
Current injection training options are classroom based, with hands-on training performed on live models. The availability of models is limited. Moreover, even when available, live models are limited in the number and type of injections that may be performed on them. The need for live models is restrictive because injectors are unable to be exposed to a wide and diverse range of situations in which to practice. For example, it may be difficult to find live models with different skin tones or densities. This makes the training process less effective because patients often have diverse anatomical features as well as varying prophylactic, curative, therapeutic, or cosmetic needs. Live models are also restrictive because injectors are unable to practice injection methods on internal organs due to health considerations. As a result of these limited training scenarios, individuals seeking treatments involving injections have a much higher risk of being treated by an inexperienced injector. This may result in low patient satisfaction with the results or failed procedures. In many instances, patients have experienced lumpiness from incorrect dermal filler injections. Some failed procedures may result in irreversible problems and permanent damage to a patient's body. For example, patients have experienced vision loss, direct injury to the globe of the eye, and brain infarctions where injectors have incorrectly performed dermal filler procedures. Other examples of side effects include inflammatory granuloma, skin necrosis, endophthalmitis, injectable-related vascular compromise, cellulitis, biofilm, subcutaneous nodules, fibrotic nodules, and other infections.
As a result of the varying qualifications and training requirements for injectors, there is currently no standard to train, educate, and certify providers on the proper and accurate process of various injection techniques. Patients seeking injections also have few resources for determining the qualifications or experience of a care practitioner.
The present disclosure generally relates to an injection apparatus and training system for prophylactic, curative, therapeutic, acupuncture, or cosmetic injection training and certification. The training system eliminates the need to find live models for hands-on training sessions. The training system provides feedback on trainees and the accuracy of the injection procedures performed. In an embodiment, feedback is provided in real time. The training system can also be used by a manufacturing company as a measure of how a “trainee” is performing, and thus of the trainee's qualification, before the trainee is entrusted with actual product. The training system reduces the risks associated with inexperienced and uncertified medical personnel performing injection procedures.
The training system can be used to educate, train, and certify medical personnel for injection procedures. It can also be utilized as a testing program for certifying medical personnel. The system enables users to practice a variety of injections, ranging from on-label to off-label product injections. In some embodiments, the system may allow users to train for therapeutic treatments. In other embodiments, the system may allow users to train for injections into arteries, bone marrow, the spine, the sternum, the pleural space of the chest region, the peritoneal cavity, joint spaces, internal organs, or any other injection sites. The system may be used for any type of injection, including, but not limited to, those involving prophylactic, curative, therapeutic, or cosmetic treatments in both humans and animals. In other applications, the systems disclosed herein can be used for dental applications and training for dental procedures.
In one embodiment, there are three main components of the training system: (1) a training apparatus (also referred to interchangeably as an injection apparatus throughout the present disclosure) which features an anatomically accurate model of a human or human body part necessary for injection training, (2) a camera associated with the training apparatus, and (3) a testing tool with light emitting capabilities. In an embodiment, a fourth component of the training system can include an output device that can run an application which receives communications from the training apparatus or camera and generates information regarding injection parameters based on the communications from the injection apparatus or camera. In an embodiment, the images captured by the camera are processed by a processor included either in the injection apparatus or in the camera before being communicated to the output device. This processing can include, for example, determining an indication of one or more injection parameters. In an embodiment, the anatomical model can include various injection conditions, such as, for example, layered skin, available in multiple tones and textures to mimic a diverse span of age, race, and skin texture. In an embodiment, the layered skin can be removable and/or replaceable. The apparatus can simulate any human or animal part, such as, for example, the face, head, brain, neck, back, chest, spine, torso, arms, legs, hands, feet, mouth, or any other body part or portion of the body of interest. In an embodiment, the testing tool can be, for example, a syringe or hypodermic needle. In an embodiment, the injection apparatus is reusable. In an embodiment, the injection apparatus is disposable.
Although the present disclosure specifically describes the use of a camera, it is to be understood that the principles disclosed throughout the present disclosure can apply to any light detector or light detection device. Moreover, by referring to a camera, the present disclosure is not limited to a visible light detection device; rather, any visible or non-visible light detector or detection device can be used, as would be understood by a person of skill in the art, with any embodiment disclosed herein.
In one embodiment, the injection apparatus can feature an anatomically correct model of an animal or animal body part. The animal or animal body part can have a base layer that can be covered in removable skin, animal hair, or scales to replicate the look and feel of a real animal. The skin, animal hair, or scales can vary in color, coarseness, thickness, density, or stiffness.
In some embodiments, the base layer of the apparatus may be a clear plastic shell simulating a human or animal body part, such as, for example, a human or animal head. The plastic shell can be covered with layers of elastomer membranes simulating human or animal muscle or skin. In an embodiment, one or more of these layers can be removable and/or replaceable. In some embodiments, the top layer of injectable skin consists of separate layers simulating mammalian skin: the epidermis, dermis, and hypodermis. The layers of injectable muscle and skin may be of uniform density. In other embodiments, the layers of skin may be thicker or thinner to simulate the skin of humans or animals with uneven skin layers or damaged skin. The separate layers of injectable skin may consist of elastomers simulating the look and feel of human or animal skin and muscle. The injectable muscle and skin layers may have a different transparency. For example, the different layers may be opaque, tinted, or clear.
In one embodiment, the injection apparatus may be used for injections in different areas of the human or animal body. For example, the injection apparatus may simulate the rear torso and buttocks of a human for epidural injections. The injection apparatus can also simulate the back of the neck of an animal for subcutaneous injections. The injection apparatus may also simulate different organs of the human or animal body, such as the heart, brain, or liver. In some embodiments, the injection apparatus can simulate different bones in human or animal bodies that require injections or extractions. The simulated bones can contain extractable material that simulates bone marrow. The bones can be separately used as a training apparatus or be placed within an anatomically correct model of a simulated human or animal body part. For example, bone marrow extractions can be performed by inserting the testing tool through skin, muscle, and bone layers of a training apparatus. In one embodiment, the injection apparatus may be covered in different removable layers of material for detecting an injection. The different layers may be opaque, tinted, marked or clear. In some embodiments, the different removable layers of the injection apparatus may be embedded with sensors that can be pierced by a testing tool. In an embodiment, the apparatus is a human or animal mouth that can be used to perform dental or periodontic procedures.
In one embodiment, a testing tool is provided. In an embodiment, the testing tool is in the form of a hypodermic needle. The hypodermic needle can be part of a syringe. The hypodermic needle can be of any gauge. The testing tool can be of any size or shape and designed to simulate the size and shape of an injection tool, such as a syringe, used for any particular type of injection being practiced. In an embodiment, the testing tool has a light source that emits light at the head of the needle. In an embodiment, a fiber optic is in the needle. For example, the fiber optic can be inserted into or threaded through the needle and configured to emit light from a light source through the tip or head of the needle. The light source may be one or more LEDs, laser diodes, or any other light emitting device or combination of devices. In an embodiment, the light source can emit light along the visible spectrum. In other embodiments, the light source can emit non-visible light, such as infrared light. In some embodiments, the light emitted from the light source is attenuated by each layer of simulated skin or muscle. The testing tool can have a barrel. The testing tool can also have a plunger associated with the barrel.
The testing tool can be used to practice injections on the injection apparatus. In an embodiment, the light emitted through the tip or head of the needle of the testing tool is attenuated by the injection apparatus. The attenuated light is detected by the camera. As the needle portion of the testing tool penetrates through each layer of the injection apparatus material, different colors, intensities, fluorescence, textures, graph lines, polarization, or other visual effects of light will be detected by the camera (or any other visible or non-visible light detector). The resulting visual effects of the attenuated light are detected or viewed by the camera. The visual effects can represent the differences in location of the injection, depth of the injection, pressure of an injection exerted by the user and/or angle of injection. This information, detected by the camera, can be communicated to an output device for data collection, testing or certification purposes. Although the disclosure discloses the use of a camera, the disclosure and claims are not limited to the use of a visible light camera, or typical consumer photography cameras. Rather, the term camera, as used herein, can, in some embodiments, extend to the use of any light detectors or light detection devices, including, for example, photodiodes, infrared, polarization, fluorescent or ultraviolet light or thermal imaging cameras or other devices used to detect the presence or absence of visible or non-visible light.
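By way of illustration only, the injection information communicated to the output device could take the form of a simple structured record such as the following Python sketch; the field names and units are hypothetical and are not prescribed by the system described above.

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class InjectionReading:
    """Hypothetical set of injection parameters derived from the attenuated light."""
    location_zone: str        # e.g. a named target zone on the apparatus
    depth_mm: float           # estimated penetration depth
    angle_deg: float          # estimated needle angle relative to the surface
    pressure_relative: float  # 0..1 relative pressure estimate
    timestamp_s: float        # time of the injection event


def to_message(reading: InjectionReading) -> str:
    """Serialize the reading for transmission to the output device."""
    return json.dumps(asdict(reading))


if __name__ == "__main__":
    print(to_message(InjectionReading("glabellar complex", 4.2, 38.0, 0.6, 12.5)))
```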
In some embodiments, a camera is placed within or proximate to the injection apparatus. The camera can send the information detected to a processing unit. The processing unit communicates with an output device which can display the results received from an injection. The output device, also interchangeably referred to herein as an interface device, user device or display device, can include any type of display useful to a user, such as, for example, a tablet, phone, laptop or desktop computer, television, projector or any other electronic or paper based display technology. The processing unit can also collect the information for use in data gathering or informatics. Information about the injection can also be gathered from the testing tool. The output device may include lights, graphical displays, audio devices, or user controls. The output device can be an electronic, computer, or mobile device. This can include, for example, a smart phone or tablet. The output device can run a dedicated application configured to receive wireless communication directly from the camera and/or testing tool and analyze this information for feedback and display to a user. Alternatively, a separate processor in the injection apparatus and/or testing tool can process the information before sending the processed information to the output device for display.
In some embodiments, the injection apparatus can be configured to mimic certain muscle contraction conditions common with a particular type of injection. For example, this can include contractions of facial features, such as furrowing of an eyebrow, squinting of the eyes, or pursing of the lips. The removable skin can also include blemishes, such as scars or wrinkles.
In an embodiment, the layers of material surrounding the injection apparatus have different pigmentations. For example, in an embodiment where the injection apparatus has three layers of pigmentation, the first layer may be opaque, the second layer may be tinted, and the third layer may be clear. The pigmentation of the layers selectively alters the testing tool light to display a different color or intensity of light as it passes through each layer. This resulting attenuated light from the testing tool is then detected or viewed by the camera enclosed in the injection apparatus. The output device is configured with software to recognize the color, direction, and intensity of light detected by the camera. Based on the color, direction, and intensity detected by the camera, a software program may determine the depth, pressure, or angle of injection. Similarly, markings or other identification options can be used to identify the depth, pressure, or angle of injection as described below.
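As a minimal illustration of how software on the output device might map the detected color and intensity to one of the three pigmented layers described above, consider the following sketch; the thresholds and layer descriptions are hypothetical assumptions.

```python
# Hypothetical sketch: classify which pigmented layer a detected light spot
# has reached, based on the average intensity and tint of the camera image.
# The threshold values and layer names are illustrative assumptions only.

def classify_layer(mean_intensity: float, saturation: float) -> str:
    """Map detected light properties to a simulated layer.

    mean_intensity: 0.0 (no light reaches the camera) to 1.0 (unattenuated)
    saturation:     0.0 (white/untinted light) to 1.0 (strongly tinted light)
    """
    if mean_intensity < 0.2:
        return "first layer (opaque - little light passes through)"
    if saturation > 0.5:
        return "second layer (tinted - light arrives colored)"
    return "third layer (clear - light passes nearly unchanged)"


if __name__ == "__main__":
    for intensity, sat in [(0.1, 0.0), (0.6, 0.8), (0.9, 0.1)]:
        print(intensity, sat, "->", classify_layer(intensity, sat))
```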
In an embodiment, a system for cosmetic or therapeutic training configured to aid in training a care provider to provide cosmetic or therapeutic injections is disclosed. The system includes a testing tool that has an injection needle head, the testing tool configured to emit light. The system also includes an apparatus configured to receive an injection and attenuate the light emitted by the testing tool, the attenuation representative of an injection parameter; and a light detector, the light detector positioned to detect the light attenuated by the apparatus. In an embodiment, the system includes a processor configured to receive and process an indication of the detected light from the light detector. In an embodiment, the system includes a display device. In an embodiment, the display device is configured to display the indication of the detected light from the light detector. In an embodiment, the indication of the detected light is an image. In an embodiment, the display device is configured to display injection measurement data. In an embodiment, the injection parameter is a depth of the injection. In an embodiment, the injection parameter is an angle of injection. In an embodiment, the injection parameter is pressure. In an embodiment, the processor is configured to determine an accuracy of the injection. In an embodiment, the apparatus includes a plurality of nesting layers, wherein each layer provides a different attenuation of light. In an embodiment, each of the plurality of nesting layers is colored. In an embodiment, at least some of the plurality of nesting layers are translucent. In an embodiment, the plurality of nesting layers include a removable skin layer, a muscle layer, and a nerve layer. In an embodiment, the removable skin layer is configured to represent one or more of different ages, ethnicities, races, textures, or thicknesses of human skin. In an embodiment, the removable skin layer is transparent and the muscle layer is visible through the skin layer. In an embodiment, the removable skin layer simulates cosmetic conditions. In an embodiment, one or more injection sites are positioned at locations on the injection apparatus which correspond to injection locations for cosmetic conditions and therapeutic treatment. In an embodiment, the testing tool comprises a light source configured to emit visible light. In an embodiment, the light detector comprises a camera.
In an embodiment, a method of injection training is disclosed. The method can include providing a testing tool simulating a syringe and configured to emit light; providing an injection apparatus, the injection apparatus configured to provide a simulation of a testing site and configured to attenuate the light emitted by the testing tool according to a desired parameter of an injection; providing a light detector configured to detect the attenuated light; using the testing tool to inject the injection apparatus; and detecting, using the light detector, the light attenuated by the injection apparatus during the injection. In an embodiment, the method also includes analyzing the detected attenuated light from the light detector to determine an accuracy of the injection. In an embodiment, the parameter under test is the depth and location of the injection. In an embodiment, the parameter under test is one or more of speed, pressure, angle, depth or location. In an embodiment, the injection apparatus is configured to attenuate the light by providing a plurality of nesting layers each including a different level of light attenuation. In an embodiment, each of the plurality of nesting layers is a different color.
In an embodiment, a testing tool is disclosed. The testing tool can include a needle; a barrel; and a light source configured to emit light from the needle. In an embodiment, the testing tool also includes a plunger. As will be understood by those of skill in the art, the testing tool described throughout this disclosure and in every embodiment of the disclosure can be configured to simulate all or various combinations of parts of a typical syringe or hypodermic needle. For example, this can include a needle, a barrel, and a plunger or any combination thereof. Also, any light source or combination of elements described herein can also be included in the testing tool. In any of the embodiments disclosed herein, unneeded or unnecessary parts of the testing tool can be left off. For example, in some embodiments, a plunger is unnecessary and left off the device.
In an embodiment, the testing tool can include a sensor configured to determine a relative position of the plunger with respect to the barrel. In an embodiment, the sensor is a potentiometer. In an embodiment, the sensor is housed proximate the barrel. In an embodiment, the sensor is housed away from the barrel. In an embodiment, the testing tool includes a friction system configured to simulate an injection. In an embodiment, the needle is hollow. In an embodiment, the testing tool includes an optical fiber configured to transmit the emitted light to the tip of the needle. In an embodiment, the emitted light is visible light. In an embodiment, the emitted light is one or more of visible light, non-visible light, ultraviolet light, polarized light, infrared light or fluorescent light.
In an embodiment, an injection apparatus configured to simulate at least a portion of a patient under test is disclosed. The injection apparatus includes a first structure configured to simulate a portion of a patient under test; and at least one injection layer configured to simulate an injection condition, the injection layer configured to attenuate emitted light from a testing tool such that at least one testing parameter of a desired injection can be determined. In an embodiment, the injection apparatus includes at least two injection layers, wherein each injection layer is configured to attenuate the emitted light. In an embodiment, each of the two or more injection layers attenuates the emitted light differently. In an embodiment, each of the two or more injection layers is tinted with a different color and the emitted light is visible light. In an embodiment, the patient is a human. In an embodiment, the patient is an animal. In an embodiment, the first structure is a head. In an embodiment, the first structure is a back. In an embodiment, the first structure is a chest.
In an embodiment, a testing tool is disclosed. The testing tool includes a needle; and an optical fiber configured to receive emitted light from a light source through a proximate end of the optical fiber, the optical fiber further configured to emit light out of a distal end of the optical fiber, the optical fiber positioned in the needle so that light is emitted at a head of the needle. In an embodiment, the needle is a hypodermic needle. In an embodiment, the distal end of the optical fiber is located at a tip of the needle. In an embodiment, the emitted light is visible light. In an embodiment, the emitted light is one or more of visible light, non-visible light, ultraviolet light, infrared light or fluorescent light. In an embodiment, the testing tool includes a syringe. In an embodiment, the testing tool includes a barrel and plunger. In an embodiment, the testing tool includes a sensor configured to determine a relative position of the plunger with respect to the barrel. In an embodiment, the sensor is a potentiometer. In an embodiment, the sensor is housed proximate the barrel. In an embodiment, the sensor is housed away from the barrel. In an embodiment, the testing tool includes a friction system configured to simulate an injection. In an embodiment, the needle is hollow.
In an embodiment, a method of using a testing tool for injection training is disclosed. The method includes providing a testing tool, the testing tool including a needle; an optical fiber configured to emit light out of a distal end of the optical fiber, the optical fiber positioned in the needle so that light is emitted at a head of the needle; and a light source configured to emit light through a proximate end of the optical fiber. The method also includes using the testing tool to inject an injection apparatus; and detecting the emitted light after attenuation by the injection apparatus to determine an injection parameter. In an embodiment, the needle is a hypodermic needle. In an embodiment, the distal end of the optical fiber is located at a tip of the needle. In an embodiment, the emitted light is visible light. In an embodiment, the emitted light is one or more of visible light, non-visible light, ultraviolet light, infrared light or fluorescent light. In an embodiment, the method further includes providing a syringe. In an embodiment, the method further includes providing a barrel and plunger. In an embodiment, the method further includes providing a sensor configured to determine a relative position of the plunger with respect to the barrel. In an embodiment, the sensor is a potentiometer. In an embodiment, the sensor is housed proximate the barrel. In an embodiment, the sensor is housed away from the barrel. In an embodiment, the method further includes providing a friction system configured to simulate an injection. In an embodiment, the needle is hollow. In an embodiment, the method further includes storing the injection parameter in an electronic storage device. In an embodiment, the method further includes compiling a plurality of injection parameters from an injector and determining an accuracy rating of injection. In an embodiment, the method further includes publicly publishing the accuracy rating.
In an embodiment, a method of rating an injector is disclosed. The method includes using an injection apparatus to detect injection parameters about an injection by an injector using a testing tool; and determining a rating of the injector from the injection parameters. In an embodiment, the injector is a primary care physician, dentist, veterinarian, nurse practitioner, nurse, physician's assistant, aesthetic spa physician, plastic surgeon, facial plastic surgeon, oculoplastic surgeon, or dermatologist. In an embodiment, the rating is an accuracy of injections. In an embodiment, the rating is an experience of the injector. In an embodiment, the rating indicates a quality of the injector. In an embodiment, the rating is publicly published. In an embodiment, the rating is one or more of education, years of experience, performance results with the injection apparatus, or patient reviews.
Embodiments will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure. Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the present disclosure.
A testing tool 110 is also illustrated which can be used with the injection apparatus 100 and in conjunction with a camera 120 located within the injection apparatus 100. The testing tool 110 may simulate any type of equipment used in connection with an injection or minimally invasive procedure, such as a needle, catheter or cannula. As described in further detail below, the camera 120 can capture visual indications of the user's injection using the testing tool 110. The visual indications provide an operator or user with information regarding the location, depth, pressure, or angle of the injection. In an embodiment, the testing tool 110 contains a light source that emits light through the needle portion of the testing tool which is used to aid in obtaining the visual indications detectable by a camera 120. The light source can emit visible light. In an embodiment, a gradient of white light is emitted through the needle portion of the testing tool. Other colors of visible light can also be used, such as green, blue, red, yellow or any combination of those colors. In an alternative embodiment, the light source may emit light along a spectrum of visible or non-visible light, such as fluorescent or ultraviolet light. In some embodiments, the light emitted from the light source is attenuated differently depending on which layer of simulated skin or muscle of the injection apparatus 100 is penetrated. Different colors, directions, graph lines, visual patterns, polarization, fluorescence, or intensities of light can be captured by the camera 120 as the testing tool 110 is injected through the different layers of material surrounding the injection apparatus 100. The resulting light detected by the camera 120 can be used to determine the location of the injection, the pressure exerted by the user, the angle of injection, or the depth of the injection. This information can be detected, for example by a camera 120, and communicated to a user interface device 140 for testing or certification purposes.
The camera 120 within the simulated skull of the injection apparatus captures the attenuated light of an injection through video recording and/or photographic images. The camera 120 can include a processor and can communicate the camera output to a user interface device 140. The information gathered from the camera 120 and testing tool 110 may be communicated to a user interface 140 for data collection, testing or certification purposes. The camera output can be raw or processed video or images obtained from the camera 120. The camera processor can include software configured to interpret the visual indications for testing purposes or can merely pass images to the user interface device 140 for further processing. In an embodiment, the user interface device 140 can also communicate instructions to the camera 120 and/or testing tool 110.
In some embodiments, the light viewed by the camera 120 from the needle 212 can change from a round to oval shape. This can occur when the needle moves out of alignment with the camera 120. The length and width of the oval viewed by the camera 120 can indicate the angle of the injection while the direction of the oval along its longer axis can indicate the direction of the injection.
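The geometry of this measurement can be sketched as follows, assuming the detected light spot has already been fit to an ellipse: the tilt of the needle relative to the camera axis follows from the ratio of the ellipse's minor to major axis, and the direction of the tilt follows from the orientation of the major axis. The ellipse-fitting step itself is assumed to be performed elsewhere.

```python
import math


def estimate_injection_angle(major_axis: float, minor_axis: float) -> float:
    """Estimate the needle tilt (in degrees) from the detected light spot.

    A spot seen head-on is circular (minor == major); as the needle tilts away
    from the camera axis the spot foreshortens into an ellipse, so the tilt is
    approximately arccos(minor / major). Assumes a fitted ellipse is available.
    """
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    return math.degrees(math.acos(ratio))


def injection_direction(major_axis_angle_deg: float) -> float:
    """The direction of the tilt is taken along the ellipse's major axis."""
    return major_axis_angle_deg % 360.0


if __name__ == "__main__":
    print(estimate_injection_angle(10.0, 10.0))  # 0 degrees: aligned with the camera
    print(estimate_injection_angle(10.0, 7.07))  # roughly 45 degrees of tilt
```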
In some embodiments, fluid can be stored within the testing tool 110 and injected into the injection apparatus 100. The fluid injected can simulate the consistency of the injection substance that would be injected for a real treatment. Once the injection apparatus 100 receives the injection, a camera 120 captures the location, speed, pressure, and angle of the injection. The camera 120 sends this information in video or photographic form to the processor 604 which analyzes the detailed information and determines desired injection parameters. In some embodiments, the data associated with the injection can be determined by the testing tool 110 sending information wirelessly to the processor 604. For example, the testing tool may detect the friction experienced by the friction o-ring 211 to send to the processor 604 information about the speed and pressure of an injection. In one embodiment, an accelerometer can be attached to the testing tool 110 to provide information on an angle of injection. Another process for conveying information to the processor 604 includes alternating frequencies or patterns of light emitted by the LED. Alternatively, the information from the camera 120 and testing tool 110 can be sent directly to the output device 140 for processing.
In one embodiment, the plunger 209 of the testing tool 110 may be able to detect the angle, speed, friction, and depth of an injection. This can be accomplished with a wired or wireless electrical signal being transmitted from a sensor placed in the testing tool 110. In some embodiments, a cable can be placed parallel to the light fiber that can read the injection parameters, such as the pressure, speed, or acceleration of the injection. For example, the electrical signal transmitted from the sensor can detect 0-5 volts of electricity, which can represent the amount of pressure being exerted by the user when utilizing the testing tool 110. In other embodiments, the electrical signal may emit a certain frequency that represents the pressure exerted. For example, a frequency of 100 Hz can represent low pressure while a frequency of 1,000 Hz can represent high pressure exerted by the user. In an embodiment, the LED can be modulated at a modulation rate corresponding to an angle, speed, friction or depth of an injection. This modulated light can be detected by the camera and used to determine the desired injection parameters without the need for a separate data communication path between the testing tool and the rest of the system. In some embodiments, a wireless transmitter can be placed in the testing tool that communicates directly to the user interface device 140 and displays the parameters of the injection.
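As one hypothetical illustration of the frequency-coded approach, the processor could recover the modulation frequency from a sampled brightness trace of the LED and map the example 100 Hz to 1,000 Hz range back to a relative pressure, as in the following sketch; the sampling rate and the linear mapping are assumptions.

```python
import numpy as np


def dominant_frequency(brightness: np.ndarray, sample_rate_hz: float) -> float:
    """Return the strongest modulation frequency in a brightness trace."""
    spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / sample_rate_hz)
    return float(freqs[np.argmax(spectrum)])


def frequency_to_pressure(freq_hz: float,
                          low_hz: float = 100.0, high_hz: float = 1000.0) -> float:
    """Map the example 100 Hz..1,000 Hz range to a 0..1 relative pressure."""
    return float(np.clip((freq_hz - low_hz) / (high_hz - low_hz), 0.0, 1.0))


if __name__ == "__main__":
    fs = 4000.0                                         # assumed detector sampling rate
    t = np.arange(0, 0.5, 1.0 / fs)
    trace = 0.5 + 0.5 * np.sin(2 * np.pi * 550.0 * t)   # LED modulated at 550 Hz
    f = dominant_frequency(trace, fs)
    print(f, frequency_to_pressure(f))                  # ~550 Hz -> ~0.5 relative pressure
```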
In some embodiments, the testing tool 110 can inject a fluorescent fluid into the injection apparatus 100. The layers of simulated muscle and skin may be configured to have a reservoir that accepts these fluid injections. The fluorescent fluid may be visible through transparent, opaque, or lightly pigmented material in the simulated skin and muscle layers. In one embodiment, a UV lamp may be placed within the injection apparatus 100 in order for a user to clearly see the injection and injected fluid going into the injection apparatus 100.
In some embodiments, the testing tool 110 may also be powered with a plug-in cable. The testing tool 110 can send information over a wireless network or to a portable computer to process information about an injection. The signals may send information related to the 3D location, depth, intensity, or pressure exerted by the user when practicing the injection.
In some embodiments, the transducer or potentiometer can be connected to a slider 216. The linear potentiometer 214 measures the position of the plunger 209 of the testing tool relative to the barrel 210. In some embodiments, the linear potentiometer 214 may be fixed to the plunger 209 of the testing tool. A slider 216 may be attached through a slot 225 in the barrel 210 and mated with a pocket within the plunger 209. The slider 216 moves with the plunger 209 to allow the transducer to output the position of the plunger 209 through an output pin 215. The transducer then electronically communicates through a connection with a processor 604, which calculates the simulated volume and distribution of the injection. This calculation may be completed by using the parameters of the plunger 209 displacement and the diameter of the barrel 210.
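The simulated volume calculation described above reduces to the formula for a cylinder; a minimal sketch with hypothetical dimensions is shown below.

```python
import math


def injected_volume_ml(plunger_displacement_mm: float, barrel_diameter_mm: float) -> float:
    """Simulated injected volume = barrel cross-sectional area x plunger travel.

    1 mL equals 1,000 cubic millimetres.
    """
    radius_mm = barrel_diameter_mm / 2.0
    volume_mm3 = math.pi * radius_mm ** 2 * plunger_displacement_mm
    return volume_mm3 / 1000.0


if __name__ == "__main__":
    # Hypothetical 1 mL syringe barrel (~4.7 mm inner diameter), 10 mm of plunger travel.
    print(round(injected_volume_ml(10.0, 4.7), 3), "mL")   # ~0.173 mL
```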
In some embodiments, the testing tool determines the pressure applied by the injector. This can be accomplished by measuring the force applied to the plunger 209 through a thin film force sensor 217 on the plunger 209 flange. Electrical connections to the force sensor and linear potentiometer may be placed along with the optical fiber 207 in a multi-lumen sheath. The force sensor 217 electrically communicates with a processor 604.
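As a simple illustration, the processor 604 could convert the force measured at the plunger flange into an approximate fluid pressure by treating the plunger as a piston of known diameter; the numbers below are hypothetical.

```python
import math


def plunger_pressure_kpa(force_newtons: float, barrel_diameter_mm: float) -> float:
    """Approximate fluid pressure = applied force / plunger cross-sectional area."""
    area_m2 = math.pi * (barrel_diameter_mm / 2000.0) ** 2   # mm diameter -> m radius, squared
    return force_newtons / area_m2 / 1000.0                  # Pa -> kPa


if __name__ == "__main__":
    # e.g. 5 N applied to a 4.7 mm diameter plunger -> roughly 288 kPa
    print(round(plunger_pressure_kpa(5.0, 4.7), 1), "kPa")
```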
In an embodiment, the remote transducer is used in conjunction with a force application system to simulate the viscosity encountered during an injection. In an embodiment, a needle drag can be designed to simulate a real injection. The needle drag can be determined based on the elasticity of the injection apparatus layers (for example, as measured in durometers), the coefficient of friction between the plunger and the barrel and the needle diameter and length.
In some embodiments, each separate layer of skin or muscle 410, 420, 430 may be of a different transparency, density or color. In some embodiments, the different intensity or colors can be viewed by the camera after the testing tool 110 is inserted into the simulated skin or muscle. This can allow a camera 120 to send information to a processor 604 related to the location, pressure, angle, or depth of an injection. In other embodiments, the injectable muscle and skin layers may be of uniform density, consistency, or color. In some embodiments, the injectable muscle and skin layers 410, 420, 430 may be made of an elastomer. In an embodiment, the elastomer may simulate the elasticity of human skin and range from 5-35 on the durometer “A” scale. The simulated skin and muscle layers 410, 420, 430 may also consist of different angled fibers that deflect light emitted from a testing tool in different directions to allow for location, depth, angle and pressure analysis based on the optical properties observed. In an embodiment, the fibers can be a pattern printed on each skin or muscle layer 410, 420, 430 that selectively block light viewed by the camera. Depending on the angle of the fibers within each layer of the skin and muscle layers 410, 420, 430, the light emitted from a testing tool may be deflected at that angle. For example, the first layer 410 may have threaded angled fibers directed at a 45 degree angle. The second layer 420 may have threaded angled fibers directed at a 55 degree angle. The third layer 430 may have threaded angled fibers directed at a 65 degree angle. Depending on which layer an injector has penetrated, the light emitted from a testing tool 110 may be deflected in a different direction. If the injector has penetrated the second layer 420, the light should be deflected at a 55 degree angle. The deflection of the light emitted from the testing tool 110 is captured by a camera 120 and sent to a processor 604. The processor 604 analyzes the intensity, deflection, and clarity of the light emitted from the testing tool 110 to generate results about the injection.
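A minimal sketch of the layer-identification step in this example might look as follows, assuming the processor can measure the deflection angle of the detected light; the angle tolerance is an illustrative assumption.

```python
# Hypothetical mapping from measured light deflection angle to penetrated layer,
# following the 45/55/65 degree fiber angles in the example above.
LAYER_FIBER_ANGLES = {
    "first layer (410)": 45.0,
    "second layer (420)": 55.0,
    "third layer (430)": 65.0,
}


def identify_layer(measured_deflection_deg: float, tolerance_deg: float = 5.0) -> str:
    """Return the layer whose fiber angle best matches the measured deflection."""
    layer, angle = min(LAYER_FIBER_ANGLES.items(),
                       key=lambda item: abs(item[1] - measured_deflection_deg))
    if abs(angle - measured_deflection_deg) > tolerance_deg:
        return "no layer identified (deflection outside the expected range)"
    return layer


if __name__ == "__main__":
    print(identify_layer(54.0))   # -> second layer (420)
```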
In some embodiments, the layers of skin or muscle 410, 420, 430 may be dyed with carbon black particles or similar light-obscuring agents. The density of the carbon black particles can be adjusted to substantially block emitted light from reaching the camera through all layers. As the needle portion of the testing tool 110 travels through each layer, more light is viewed by the camera. The carbon black particles obscure light so that an injection into each layer may represent a different intensity of light. In some embodiments, this will allow a camera 120 placed within the injection apparatus 100 to detect the layer of skin or muscle 410, 420, 430 which is being penetrated by the light source. In one embodiment, the different layers of skin or muscle may be dyed with translucent color. These translucent layers will attenuate the light emitted from a testing tool in different ways. The degree and color of attenuation of the light after it has traveled through the simulated muscle and skin layers can then be detected by the camera and used to analyze the injection.
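The intensity-based variant can be sketched in a similar way: each layer remaining between the needle tip and the camera attenuates the light by some factor, so the measured intensity indicates how many layers the needle has passed through. The per-layer transmission value below is a hypothetical assumption.

```python
def layers_penetrated(measured_intensity: float,
                      source_intensity: float = 1.0,
                      per_layer_transmission: float = 0.4,
                      total_layers: int = 3) -> int:
    """Estimate how many dyed layers the needle tip has passed through.

    Light still has to cross the layers *below* the tip to reach the camera,
    so intensity ~= source * transmission ** (total_layers - penetrated).
    per_layer_transmission is an assumed property of the carbon-black dyed layers.
    """
    best_guess, best_error = 0, float("inf")
    for penetrated in range(total_layers + 1):
        expected = source_intensity * per_layer_transmission ** (total_layers - penetrated)
        error = abs(expected - measured_intensity)
        if error < best_error:
            best_guess, best_error = penetrated, error
    return best_guess


if __name__ == "__main__":
    print(layers_penetrated(0.4))   # one layer left below the tip -> 2 layers penetrated
```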
In an embodiment, the system includes an injection apparatus 100 for injection procedures on different parts of the human body. In an embodiment, there are at least three nesting layers of the apparatus: the skeletal structure layer, muscle layer, and top layer of simulated skin. A nerve layer can also be present within the muscle layer. This allows trainees to visualize and study the layers of muscle and nerves underneath the skin layer to become familiar with human facial anatomy. Veins or arteries can also be included and embedded within the muscle layer. The veins or arteries may be of a different color or density than the muscle and skin layers. The injectable muscle and skin layers 410, 420, 430 anatomically match those of the human body. In some embodiments, the injection apparatus 100 may simulate the internal organs or other body parts of a human or animal. In some embodiments, the injectable muscle and skin layers 410, 420, 430 may be color coded so that a trainee may be able to identify the different sections of the human body or muscles associated with each simulated condition.
The depicted layer on the injection apparatus 100 in
In some embodiments, the injection apparatus 100 is configured to represent human facial features, such as those features associated with smiling or frowning, as would be encountered during certain cosmetic or therapeutic injections. In some embodiments, the apparatus can model various cosmetic conditions or damaged areas of the human body. For example, these cosmetic conditions may include glabellar frown lines, horizontal forehead lines, temporal brow lifts, crow's feet (lateral canthal lines), lower eyelids, nasalis bunny lines, vertical lip lines, gummy smiles, nasolabial folds (NLFs), marionette lines, pre-jowl sulcus, labiomental crease, and midface, facial lipoatrophy, lip augmentation, mouth frowns (depressor anguli oris), apple dumpling chin, horizontal neck lines, vertical platysmal bands, acne blemishes, accident scars, or asymmetry. In some embodiments, the skin can be manipulated to mimic actual facial movement, such as furrowing of the brow, squinting of the eyes, and pursing of the lips. Users of the injection apparatus may be able to pinch the skin, stretch the skin, or grab a portion of the muscle in order to simulate a real injection. The injection apparatus 100 may be programmed to display various cosmetic conditions through a user interface device 140. There may also be buttons available on the injection apparatus 100 for programming cosmetic conditions. In some embodiments, the skin layer may be manufactured with pre-determined cosmetic conditions.
In one embodiment, programs for individual injection sites may be sold separately or in a package. The user interface device 140 may be updated with various injection tests for different parts of the human or animal body. For example, an injection test can be purchased for Botox® injections. The injection sites for parts of the human face could be downloaded onto the user interface device 140 and unlocked by a user. For example, the targeted injection sites for toxin cosmetic injections for a human face may include the frontalis (forehead lines), glabellar complex (procerus and corrugators) frown lines, orbicularis oculi-lateral canthal area, crow's feet lines, nasalis-bunny lines, orbicularis oris-vertical lip lines, depressor anguli oris, mentalis, masseter, platysma, depressor septi nasi, levator labii superioris alaeque nasi, gland hypertrophy, or labial artery. The program can communicate with the processor 604 to control the movement of the camera 120 to record or measure the specific injection sites for injection testing. The program can also communicate with the processor 604 to change the pigmentation or color of the skin layers 410, 420, 430 of the injection apparatus 100. In some embodiments, the program can be set to simulate a specific type of injection scenario. For example, a user can set the user interface device 140 to simulate crow's feet on the injection apparatus 100. The skin layers 410, 420, 430 would be mechanically moved to simulate the wrinkles at the edge of the injection apparatus 100 to form crow's feet. Once the user correctly injects the injection apparatus 100 at the injection site for crow's feet, the injection apparatus 100 would mechanically smooth out the wrinkles from the crow's feet.
In one embodiment, the program can inform the user of the type of treatment performed on the injection apparatus 100 through the user interface device 140. For example, the user interface device 140 may educate the user on the type of treatment, such as whether it is therapeutic, sub-therapeutic, or super-therapeutic.
The injection apparatus 100 may also be used for therapeutic treatment training. These treatments may include those related to blepharospasm, strabismus, or chronic migraines, and others. For example, Botox® injections can be practiced to induce localized, partial paralysis on the apparatus for treatment of blepharospasm. In some embodiments, the injection apparatus may be manipulated to display the different physical features associated with conditions requiring therapeutic treatment. For example, the injection apparatus 100 may display a squinted eye or be cross-eyed when it is programmed as a patient with strabismus. Upon therapeutic treatment by a trainee, the injection apparatus 100 mechanically readjusts to fix the condition.
In some embodiments, the base layer 400 allows the injection apparatus 100 to keep its structure and holds the components of the injectable muscle and skin layer in place. The base layer 400 may be mechanical and moveable in response to an injection from the testing tool 110. The base layer 400 may be mapped with a grid of target zones. For example, the inside or outside of the base layer 400 may have imprinted lines that represent zones of injection. The grid of target zones may correspond to an image on a user interface device 140 that is able to show the accuracy of an injection. The grid can show the face as viewed by the camera from the inside, along with what the underlying muscles look like. This can occur, for example, in a training mode. In some embodiments, the top skin layer 410 may have visual targets which display the location for injection corresponding to a cosmetic condition or therapeutic treatment. These visual targets may be color coded so that a user may identify the different injection zones that should be targeted for administering different injections.
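As a purely illustrative example, a detected injection point could be mapped onto such a grid of target zones as follows; the zone layout and names are hypothetical.

```python
# Hypothetical grid: each named target zone is an (x_min, y_min, x_max, y_max)
# rectangle in the coordinate frame of the base layer grid / camera image.
TARGET_ZONES = {
    "glabellar complex": (40, 10, 60, 25),
    "left crow's feet":  (10, 30, 25, 45),
    "right crow's feet": (75, 30, 90, 45),
}


def locate_zone(x: float, y: float) -> str:
    """Return the target zone containing the detected injection point, if any."""
    for name, (x0, y0, x1, y1) in TARGET_ZONES.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return "outside all target zones"


if __name__ == "__main__":
    print(locate_zone(50, 18))   # -> glabellar complex
    print(locate_zone(5, 5))     # -> outside all target zones
```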
In some embodiments, the base layer 400 of the apparatus may be a clear plastic shell. The plastic shell layer may be covered with removable layers of elastomer membranes simulating human muscle or skin. The plastic shell may simulate the look of any human body part, including any internal organs. In some embodiments, the injection apparatus 100 simulates an entire human or animal body part with or without removable layers of elastomer membranes simulating human skin or muscles.
In an embodiment, the injection apparatus may have a camera 120 attached to a pivotable stand 460 and placed within the injection apparatus 100. The pivotable stand 460 may be attached to a removable base 470 that allows a user to physically change the direction of the pivotable stand 460. In some embodiments, the pivotable stand 460 may be mechanically movable upon detection of a signal received through a processor 604. The processor 604 may receive a signal to change the location of the pivotable stand 460 through an output device 140.
In some embodiments, the camera 120 may be positioned so it may swing into different positions in response to a shift gate. This allows a user to move the camera 120 to focus on different target zones without having to manually move the camera within the injection apparatus 100. The camera 120 may include an angular grid sensing filter that can detect its position and rotate itself according to a displayed grid within the injection apparatus 100. In an embodiment, the camera 120 is set to focus on either color or line orientations within the injection apparatus 100. The camera 120 may read a user's injection based on the information received from the light emitted from the testing tool 110 in conjunction with the location determined by a grid embedded in the base layer 400 of the injection apparatus 100.
In some embodiments, the camera 120 may have a broad or focused range 450. For example, a broad range camera may be used when there is no specific target area that is being focused on for testing or certification purposes. A focused range camera can be positioned to aim at a zone for injection. In some embodiments, the camera 120 is configured to communicate with a user interface device 140 to display the results of an injection. In an embodiment, the results of the injection may be determined by the intensity and color viewed by the camera 120 after the testing tool 110 has been injected into the different layers of skin or muscle. The range 450 of the camera 120 may be manually adjusted by setting the camera to encompass a smaller or larger range. The range 450 of the camera 120 may also be adjusted by inputting a grid location into the output device 140, which is then communicated to the camera 120. The camera 120 then adjusts its targeted location.
The camera 120 can output video to a user interface device 140 through a wired or wireless connection. In an embodiment, the output device 140 is equipped with software to read and analyze the results obtained from the video. In an embodiment, the software is able to analyze the results and generate a score or evaluation of the user's injection. The software can also report and capture data on the trainee's experience. The software may also play back the user's injection and display errors or provide feedback on acceptable injections. In an embodiment, the software includes a biometric sensor to identify each trainee.
In an embodiment, the injector or administrator of injection training may choose to focus on a specific area of the injection apparatus and only have the removable layer surrounding that area. The injector may then observe the injection apparatus 100 to see how an injection penetrates through the different layers of skin, muscle, and nerves. This embodiment may be used, for example, for novice injectors who require visual guidance for the depth of their injections.
In some embodiments, the injection apparatus 100 may have sensors embedded within the different human skin and muscle layers 410, 420, 430. The sensors may be located on an injection site 520, multiple injection sites 520, or continuously throughout the entire human skin and muscle layers 410, 420, 430. Once an area has been treated by an injection, the sensor may communicate with the testing tool 110 or the injection apparatus 100 to provide the information associated with the injection. For example, the sensor can read the treatment delivered at the injection site. The pressure applied to the area of injection may be detected by the testing tool, and the parameters of the injection may capture the depth, pressure, and angle of a user's injection. The parameters may then be compared to a pre-determined treatment and provided to a user interface device 140, which displays the testing results.
In some embodiments, the injection apparatus 100 may have inflatable pads embedded within the different human skin and muscle layers 410, 420, 430. The inflating and deflating of the pads may be initiated by an attached sensor detecting the penetration of an injection. The inflatable pads may be located on the injection site 520. The inflatable pad may independently inflate or deflate proportionally to the location, depth, pressure, and angle of a user's injection. The inflation and resulting size of the pads may differ at various injection sites 520, depending on the natural human reaction to an injection in that area. Once a user has completed an administered test, the inflatable pad may deflate and return the human skin and muscle layers to their original condition. The inflation pads allow the asymmetries of an injection to be observed and addressed by the injector. In an embodiment, the testing tool injects air into the inflatable pads, skin and/or muscle layers so that a user can observe how the injection has affected the apparatus. This allows the trainee to see in real time the effect of the injection. For example, the effect can be watching the apparatus “age”. In an embodiment, the trainee can also deflate the fat pads. This allows a trainee to practice determining how much injection is required for a given patient.
In some embodiments, the injection apparatus 100 can be configured to turn on the measurement and/or analysis of different injection sites 520. A software program communicating through the user interface device 140 can selectively enable certain procedures, for example, through separate software purchases or upgrades related to particular injection sites 520. For example, the injection sites 520 for Botox® procedures can be enabled. The injection sites 520 for treating cosmetic conditions such as furrowed brows, crow's feet, or adding volume to lips can also be separately enabled. Once the testing tool 110 injects that particular injection site 520 corresponding to the cosmetic condition, the camera 120 views the injection and communicates the results to the processor 604. The results are generated and displayed through the user interface device 140.
The camera 120 receives its power from batteries in the camera or through a power supply 602. A power manager 603 monitors the on/off switch of the camera 120 and the output device 140 and turns each on or off accordingly. The batteries in the camera 120 may be alkaline batteries, rechargeable batteries, or another renewable power source. The camera 120 may also be powered with a plug-in cable. In some embodiments, the camera can send information over a wireless network or directly to a portable computer to process information about an injection using the wireless communication transceiver 609. The wired or wireless transceiver 609 can communicate over any known protocol including Bluetooth, Zigbee, WiFi, Ethernet, USB, or any other wired or wireless communication protocols.
A non-volatile memory 600 is connected to the processor 604 via a high-speed bus. In the present embodiment, the memory 600 is erasable and allows a user to store information including operating software, user configurable command options and information related to different types of injections, including recorded images or video. The memory 600 may also store user-specific data. For example, a user who has completed several injections on a certain day may store results of those several injections and access the results at a later time. In addition, information obtained by the injection apparatus can be stored and sent to a central repository for analysis with testing information from other devices. The central repository can be, for example, a server or cloud computing device.
In some embodiments, the separate simulated skin or muscle layers may consist of different angled fibers. As a result of these angled fibers, the light emitted from the testing tool 110 may be deflected in different directions. For example, the fibers present in the lowest layer of simulated muscle or skin may be at a 45 degree angle, the second layer of simulated muscle or skin may be at a 60 degree angle, and the top layer of simulated muscle or skin may be at a 75 degree angle. As the camera 120 views the emitted light from the testing tool, it is able to capture information about the injection into the layer of muscle or skin. The output device 140 may receive this information and generate a report determining the depth, pressure, or angle of the user's injection.
In order to maintain the overall performance of the injection apparatus 100 in conjunction with the testing tool 110 and camera 120, a calibration device can be provided that will check the accuracy of the testing tool 110 with respect to the camera output. This may be completed either automatically after a set number of injections or manually when requested by a user. In some embodiments, the accuracy of the testing tool 110 may be calibrated to have a better than about 0.5 mm precision.
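Such a calibration check could be sketched as follows, assuming the calibration device supplies reference injections at known positions and the system reports the positions it detected; the data values are illustrative.

```python
import math


def calibration_ok(reference_points, detected_points, tolerance_mm: float = 0.5) -> bool:
    """Pass if every detected point lies within tolerance of its reference point.

    reference_points / detected_points: lists of (x, y, z) coordinates in mm.
    """
    for (rx, ry, rz), (dx, dy, dz) in zip(reference_points, detected_points):
        error = math.dist((rx, ry, rz), (dx, dy, dz))
        if error > tolerance_mm:
            return False
    return True


if __name__ == "__main__":
    reference = [(10.0, 20.0, 5.0), (30.0, 12.0, 8.0)]
    detected = [(10.2, 20.1, 5.1), (30.0, 12.4, 8.1)]
    print(calibration_ok(reference, detected))   # True: both errors are under 0.5 mm
```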
The output device 140 may allow the user to rotate the display presented between landscape and portrait views. The injection apparatus 100 may also be physically rotated and this rotation may be detected by the processor 604, which then sends a signal to the output device 140 to rotate the image displayed. The output device 140 may receive communications sent from a testing tool 110 to the processor 604 regarding this change in direction of the injection apparatus 100 and display the change accordingly. In an embodiment, the image displayed is a three dimensional image. In another embodiment, two dimensional images are displayed.
The results are displayed in a chart 2220 that informs a user or operator of an injector's performance. The output device 140 or software application reports the parameters of the injection collected from the testing tool 110 or camera 120. In some embodiments, the output device 140 and/or software application provides feedback on the results. For example, the feedback may include whether the injection was made in the correct location, the depth of the injection, and areas in which the injection could have been improved. In one embodiment, the feedback may include whether a user passed or failed an injection test corresponding to a cosmetic condition or therapeutic treatment. The results may also be in the form of a score, accuracy rating, or an overall rating.
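As one illustration of how the reported parameters could be turned into feedback, a score, and a pass/fail result, consider the sketch below. The weights, thresholds, and passing score are illustrative assumptions only and are not specified by the disclosure.

```python
# Hypothetical sketch: derive feedback, a score, and a pass/fail outcome from
# the injection parameters reported by the testing tool 110 and camera 120.
def score_injection(depth_error_mm, location_error_mm, angle_error_deg):
    """Larger errors reduce the score; 100 represents a perfect injection."""
    penalty = 10 * depth_error_mm + 8 * location_error_mm + 0.5 * angle_error_deg
    score = max(0.0, 100.0 - penalty)
    passed = score >= 70.0          # assumed passing threshold
    feedback = []
    if depth_error_mm > 1.0:
        feedback.append("Injection depth could be improved.")
    if location_error_mm > 2.0:
        feedback.append("Injection was not made in the correct location.")
    return {"score": round(score, 1), "passed": passed, "feedback": feedback}

print(score_injection(depth_error_mm=0.5, location_error_mm=1.0, angle_error_deg=4.0))
```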
In this particular example of
After completing the injection test, the user may select a different view of the injection apparatus 100 or choose to enter a learning mode from the main menu 2250. The user has the option of starting over by pressing the new test button 2240 or printing the report 2230. The user interface provides the user or operator with the option of saving the injector's results into the software program for later access.
The test data and other data collected by the devices and systems of the present disclosure can also be analyzed using data analytics. For example, data analytics software can analyze data or information collected from and associated with patients and injectors who use the injection apparatus 100. This data can be collected from a large number of patients and injectors and compiled for analysis, or data can be collected and analyzed separately for each patient or injector. The data can be stored in an electronic storage device local to the injector or at a remote location. In an embodiment, the injection records can be associated with or collected from electronic medical records (EMR). In an embodiment, the data associated with a patient or injector may be accessible by linking the individual's information with a fingerprint or a username and password. The fingerprint may be read by a biometric sensor. In some embodiments, an injector may access his or her progress when performing injections on any injection apparatus, and each training or test result may be stored. In some embodiments, the patient will have a compilation of all of their medical information stored in a database that can be retrieved once their profile is accessed on an output device 140. The information may include personal information, medical history, and the types of procedures which have been performed on the patient, which, for example, can be stored in the form of an EMR. Injectors who use the injection apparatus 100 may include those who are certified or in the process of being certified, as well as doctors, nurses, or other medical practitioners.
The software may keep track of an injector's progress over the injections performed on the injection apparatus. Based on the injector's performance, a score, rating, or ranking may be calculated and presented to a user requesting information on the injector. The score, rating, or ranking provides an indication of the accuracy of the injections performed, an estimated skill level of the injector, the experience of the injector or the number of injections performed, or any other measure indicative of the quality of the injector. A separate score or ranking may be available for different types of injections or injection locations. For example, a user searching for an injector experienced in treating crow's feet may pull up a list of injectors in a geographic area. The injectors may be listed by ranking, rating, or score based on one or more of education, years of experience, performance results with the injection apparatus, or patient reviews. The data can also be collected from multiple patients or injectors and analyzed to determine a bulk average. This can be used to determine the effectiveness of a treatment or the risks associated with treatment.
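The sketch below shows one way a composite rating could be built from the factors listed above (education, years of experience, apparatus performance, and patient reviews), along with a bulk average across a population. The weights, normalization, and example values are illustrative assumptions only.

```python
# Hypothetical sketch: composite injector rating and bulk averaging.
def injector_rating(education, years_experience, apparatus_score, patient_reviews):
    """All inputs are normalized to 0-100; returns a weighted composite rating."""
    return round(0.15 * education + 0.20 * min(years_experience, 20) * 5 +
                 0.45 * apparatus_score + 0.20 * patient_reviews, 1)

def bulk_average(ratings):
    """Average rating across many injectors (or patients) for trend analysis."""
    return sum(ratings) / len(ratings) if ratings else 0.0

ranked = sorted([("Injector A", injector_rating(80, 6, 92, 88)),
                 ("Injector B", injector_rating(90, 15, 75, 70))],
                key=lambda pair: pair[1], reverse=True)
print(ranked)  # injectors listed by composite rating, highest first
```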
In some embodiments, the testing tool 110 may be a scalpel or other equipment used for incisions. The resulting colors, directions, intensities, or other visual effects of light detected by the camera 120 represent the location, differences in pressure exerted by the user, angle of injection, or the depth of the injection. This information can be detected, for example, by a camera 120, and communicated to a user interface device 140 for testing or certification purposes. The camera 120 may be an endoscope so that it may fit within the simulated human brain 2410 or other simulated organs which may not be capable of containing a larger camera 120. An endoscope may be used for training procedures on other organs, such as the bladder, ureters, or kidneys. The camera 120 is able to detect the size and location of the portion which is removed from the injection apparatus 100. Alternatively, only a portion of the body part is provided and an opposing portion is left open so that a camera can be positioned to detect the testing tool. For example, in
At the end of the simulated operation, the injector or therapist may return the removed portion of the skin, muscle, or skeletal layers 2400 to the simulated human brain 2410. This can be accomplished by attaching the skin incision to the injection apparatus 100 with sutures or surgical staples.
In some embodiments, the injection apparatus 100 may be used in connection with non-invasive or minimally invasive surgical techniques which require no incisions or only small incisions. For example, an injector may be able to perform a simulated brain surgery with radiation, where a high dose of radiation is applied to problematic nerves. This does not require the injector to open up the injection apparatus 100 to view the simulated human brain 2410, but still allows the injector to practice the technique by using the output device 140 to view the simulated human brain 2410.
In some embodiments, the camera 120 may be placed within the injection apparatus and focused on the simulated human eye 2500. The camera 120 may also be an endoscope that captures the administered injection or surgical procedure. In some embodiments, the coats or sections of the eye may each have a different color or density. For example, the fibrous tunic may be opaque, the vascular tunic or uvea may be tinted, and the retina may be clear. Once an injection is placed by the injector into the eye, the camera 120 may detect the parameters of the injection.
In some embodiments, the veins of a dog 2700 may have a different color or density than the other portions of the injection apparatus. This is particularly helpful for injectors who wish to practice intravenous injections. For example, injectors who want to practice euthanasia procedures may be given a solution that has the same viscosity as pentobarbital or phenytoin, which are commonly used by veterinarians in administering euthanasia procedures.
The term “injection” as used herein includes its usual and customary meaning of an injection, but is also to be interpreted broadly enough to encompass, for example, the insertion of a catheter device or the use of simple needles, such as would be used in acupuncture therapy. The techniques involved, particularly a camera embedded in a model of a living subject and a tool with a light emitter, can be applied to any therapeutic procedure. For example, the tool can be a catheter and the procedure can be a minimally invasive procedure requiring the catheter to be located in a particular location.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or states. Thus, such conditional language is not generally intended to imply that features, elements and/or states are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or states are included or are to be performed in any particular embodiment.
Depending on the embodiment, certain acts, events, or functions of any of the methods described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the method). Moreover, in certain embodiments, acts or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores, rather than sequentially.
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such embodiment decisions should not be interpreted as causing a departure from the scope of the disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein can be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The blocks of the methods and algorithms described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of computer-readable storage medium known in the art. An exemplary storage medium is coupled to a processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As will be recognized, certain embodiments of the disclosures described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain disclosures disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
This application is a continuation of U.S. patent application Ser. No. 16/853,597, filed Apr. 20, 2020 which is a continuation of U.S. patent application Ser. No. 15/977,993, filed May 11, 2018, now U.S. Pat. No. 10,643,497, which is a continuation of U.S. patent application Ser. No. 15/258,839, filed Sep. 7, 2016, which is a continuation of U.S. patent application Ser. No. 14/595,972, filed Jan. 13, 2015, now U.S. Pat. No. 9,443,446, which is a continuation of U.S. patent application Ser. No. 14/318,368, filed Jun. 27, 2014, now U.S. Pat. No. 8,961,189, which is a continuation of U.S. patent application Ser. No. 14/067,829, filed Oct. 30, 2013, now U.S. Pat. No. 8,764,449, which claims the benefit of U.S. Provisional Applications Nos. 61/720,046, filed on Oct. 30, 2012; 61/784,239, filed on Mar. 14, 2013; 61/814,766, filed on Apr. 22, 2013; and 61/826,899, filed on May 23, 2013, the entirety of which are hereby incorporated herein by reference. Furthermore, any and all priority claims identified in the Application Data Sheet, or any correction thereto, are hereby incorporated by reference under 37 C.F.R. § 1.57.
Number | Name | Date | Kind |
---|---|---|---|
3237340 | Knott | Mar 1966 | A |
3722108 | Chase | Mar 1973 | A |
3941121 | Olinger et al. | Mar 1976 | A |
4142517 | Contreras Guerrero de Stavropoulos et al. | Mar 1979 | A |
4311138 | Sugarman | Jan 1982 | A |
4356828 | Jamshidi | Nov 1982 | A |
4410020 | Lorenz | Oct 1983 | A |
4439162 | Blaine | Mar 1984 | A |
4515168 | Chester et al. | May 1985 | A |
4566438 | Liese et al. | Jan 1986 | A |
4836632 | Bardoorian | Jun 1989 | A |
4867686 | Goldstein | Sep 1989 | A |
4880971 | Danisch | Nov 1989 | A |
5065236 | Diner | Nov 1991 | A |
5197476 | Nowacki et al. | Mar 1993 | A |
5198877 | Schulz | Mar 1993 | A |
5241184 | Menzel | Aug 1993 | A |
5249581 | Horbal et al. | Oct 1993 | A |
5295483 | Nowacki et al. | Mar 1994 | A |
5321257 | Danisch | Jun 1994 | A |
5391081 | Lampotang et al. | Feb 1995 | A |
5517997 | Fontenot | May 1996 | A |
5518407 | Greenfield et al. | May 1996 | A |
5534704 | Robinson et al. | Jul 1996 | A |
5622170 | Shulz | Apr 1997 | A |
5651783 | Reynard | Jul 1997 | A |
5690618 | Smith et al. | Nov 1997 | A |
5704791 | Gillio | Jan 1998 | A |
5727948 | Jordan | Mar 1998 | A |
5766016 | Sinclair et al. | Aug 1998 | A |
5817105 | Van Der Brug | Oct 1998 | A |
5828770 | Leis et al. | Oct 1998 | A |
5890908 | Lampotang et al. | Apr 1999 | A |
5899692 | Davis et al. | May 1999 | A |
5923417 | Leis | Jul 1999 | A |
5954648 | Van Der Brug | Sep 1999 | A |
5954701 | Matalon | Sep 1999 | A |
6024576 | Bevirt et al. | Feb 2000 | A |
6061644 | Leis | May 2000 | A |
6064749 | Hirota et al. | May 2000 | A |
6127672 | Danisch | Oct 2000 | A |
6172499 | Ashe | Jan 2001 | B1 |
6217558 | Zadini et al. | Apr 2001 | B1 |
6288785 | Frantz et al. | Sep 2001 | B1 |
6353226 | Khalil et al. | Mar 2002 | B1 |
6385482 | Boksberger et al. | May 2002 | B1 |
6428323 | Pugh | Aug 2002 | B1 |
6470302 | Cunningham et al. | Oct 2002 | B1 |
6485308 | Goldstein | Nov 2002 | B1 |
6538634 | Chui et al. | Mar 2003 | B1 |
6553326 | Kirsch et al. | Apr 2003 | B1 |
6564087 | Pitris et al. | May 2003 | B1 |
6568941 | Goldstein | May 2003 | B1 |
6575757 | Leight et al. | Jun 2003 | B1 |
6625563 | Kirsch et al. | Sep 2003 | B2 |
6687529 | Van Vaals | Feb 2004 | B2 |
6702790 | Ross et al. | Mar 2004 | B1 |
6769286 | Biermann et al. | Aug 2004 | B2 |
6774624 | Anderson et al. | Aug 2004 | B2 |
6836745 | Seiler et al. | Dec 2004 | B2 |
6857878 | Chosack et al. | Feb 2005 | B1 |
6863536 | Fisher et al. | Mar 2005 | B1 |
7015859 | Anderson | Mar 2006 | B2 |
7115113 | Evans et al. | Oct 2006 | B2 |
7137712 | Brunner et al. | Nov 2006 | B2 |
7158754 | Anderson | Jan 2007 | B2 |
7194296 | Frantz et al. | Mar 2007 | B2 |
7204796 | Seiler | Apr 2007 | B1 |
7247149 | Beyerlein | Jul 2007 | B2 |
7383728 | Noble et al. | Jun 2008 | B2 |
7500853 | Bevirt et al. | Mar 2009 | B2 |
7544062 | Hauschild et al. | Jun 2009 | B1 |
7553159 | Arnal et al. | Jun 2009 | B1 |
7594815 | Toly | Sep 2009 | B2 |
7665995 | Toly | Feb 2010 | B2 |
7725279 | Luinge et al. | May 2010 | B2 |
7761139 | Tearney et al. | Jul 2010 | B2 |
7783441 | Nieminen et al. | Aug 2010 | B2 |
7857626 | Toly | Dec 2010 | B2 |
7912662 | Zuhars et al. | Mar 2011 | B2 |
7945311 | McCloy et al. | May 2011 | B2 |
8007281 | Toly | Aug 2011 | B2 |
8040127 | Jensen | Oct 2011 | B2 |
8072606 | Chau et al. | Dec 2011 | B2 |
8131342 | Anderson | Mar 2012 | B2 |
8165844 | Luinge et al. | Apr 2012 | B2 |
8203487 | Hol et al. | Jun 2012 | B2 |
8208716 | Choi et al. | Jun 2012 | B2 |
8226610 | Edwards et al. | Jul 2012 | B2 |
8250921 | Nasiri et al. | Aug 2012 | B2 |
8257250 | Tenger et al. | Sep 2012 | B2 |
8277411 | Gellman | Oct 2012 | B2 |
8319182 | Brady et al. | Nov 2012 | B1 |
8342853 | Cohen | Jan 2013 | B2 |
8351773 | Nasiri et al. | Jan 2013 | B2 |
8382485 | Bardsley et al. | Feb 2013 | B2 |
8403888 | Gaudet | Mar 2013 | B2 |
8408918 | Hu et al. | Apr 2013 | B2 |
8409140 | Ejlersen et al. | Apr 2013 | B2 |
8437833 | Silverstein | May 2013 | B2 |
8442619 | Li et al. | May 2013 | B2 |
8450997 | Silverman | May 2013 | B2 |
8467855 | Yasui | Jun 2013 | B2 |
8469716 | Fedotov et al. | Jun 2013 | B2 |
8525990 | Wilcken | Sep 2013 | B2 |
8535062 | Nguyen | Sep 2013 | B2 |
8556635 | Toly | Oct 2013 | B2 |
8632498 | Rimsa et al. | Jan 2014 | B2 |
8647124 | Bardsley et al. | Feb 2014 | B2 |
8655622 | Yen et al. | Feb 2014 | B2 |
8684744 | Selz et al. | Apr 2014 | B2 |
8689801 | Ritchey et al. | Apr 2014 | B2 |
8715233 | Brewer et al. | May 2014 | B2 |
8764449 | Rios et al. | Jul 2014 | B2 |
8818751 | Van Acht et al. | Aug 2014 | B2 |
8917916 | Martin et al. | Dec 2014 | B2 |
8924334 | Lacey et al. | Dec 2014 | B2 |
8945147 | Ritchey et al. | Feb 2015 | B2 |
8961189 | Rios et al. | Feb 2015 | B2 |
8994366 | Ashe | Mar 2015 | B2 |
9017080 | Placik | Apr 2015 | B1 |
9024624 | Brunner | May 2015 | B2 |
9031314 | Clausen et al. | May 2015 | B2 |
9053641 | Samosky | Jun 2015 | B2 |
9123261 | Lowe | Sep 2015 | B2 |
9251721 | Lampotang et al. | Feb 2016 | B2 |
9275557 | Trotta | Mar 2016 | B2 |
9318032 | Samosky et al. | Apr 2016 | B2 |
9361809 | Caron | Jun 2016 | B1 |
9439653 | Avneri et al. | Sep 2016 | B2 |
9443446 | Rios et al. | Sep 2016 | B2 |
9456766 | Cox et al. | Oct 2016 | B2 |
9460638 | Baker et al. | Oct 2016 | B2 |
9486162 | Zhuang et al. | Nov 2016 | B2 |
9554716 | Burnside et al. | Jan 2017 | B2 |
9595208 | Ottensmeyer et al. | Mar 2017 | B2 |
9626805 | Lampotang et al. | Apr 2017 | B2 |
9666102 | East et al. | May 2017 | B2 |
9792836 | Rios et al. | Oct 2017 | B2 |
9922578 | Foster et al. | Mar 2018 | B2 |
10083630 | Samosky et al. | Sep 2018 | B2 |
10173015 | Fiedler et al. | Jan 2019 | B2 |
10269266 | Rios et al. | Apr 2019 | B2 |
10290231 | Rios et al. | May 2019 | B2 |
10290232 | Rios et al. | May 2019 | B2 |
10325522 | Samosky et al. | Jun 2019 | B2 |
10500340 | Rios et al. | Dec 2019 | B2 |
10643497 | Rios et al. | May 2020 | B2 |
10743942 | Foster et al. | Aug 2020 | B2 |
10849688 | Rios et al. | Dec 2020 | B2 |
10857306 | Holmqvist et al. | Dec 2020 | B2 |
10896627 | Foster et al. | Jan 2021 | B2 |
10902746 | Rios et al. | Jan 2021 | B2 |
20010037191 | Furuta et al. | Nov 2001 | A1 |
20020076681 | Leight et al. | Jun 2002 | A1 |
20020168618 | Anderson et al. | Nov 2002 | A1 |
20020191000 | Henn | Dec 2002 | A1 |
20030031993 | Pugh | Feb 2003 | A1 |
20030055380 | Flaherty | Mar 2003 | A1 |
20030108853 | Chosack et al. | Jun 2003 | A1 |
20030114842 | DiStefano | Jun 2003 | A1 |
20030164401 | Andreasson et al. | Sep 2003 | A1 |
20030220557 | Cleary et al. | Nov 2003 | A1 |
20040009459 | Anderson et al. | Jan 2004 | A1 |
20040092878 | Flaherty | May 2004 | A1 |
20040118225 | Wright et al. | Jun 2004 | A1 |
20040126746 | Toly | Jul 2004 | A1 |
20040175684 | Kaasa et al. | Sep 2004 | A1 |
20040234933 | Dawson et al. | Nov 2004 | A1 |
20050055241 | Horstmann | Mar 2005 | A1 |
20050057243 | Johnson et al. | Mar 2005 | A1 |
20050070788 | Wilson et al. | Mar 2005 | A1 |
20050084833 | Lacey et al. | Apr 2005 | A1 |
20050181342 | Toly | Aug 2005 | A1 |
20050203380 | Sauer et al. | Sep 2005 | A1 |
20060084050 | Haluck | Apr 2006 | A1 |
20060085068 | Barry | Apr 2006 | A1 |
20060194180 | Bevirt et al. | Aug 2006 | A1 |
20060264745 | Da Silva | Nov 2006 | A1 |
20060264967 | Ferreyro et al. | Nov 2006 | A1 |
20070003917 | Kitching et al. | Jan 2007 | A1 |
20070179448 | Lim et al. | Aug 2007 | A1 |
20070197954 | Keenan | Aug 2007 | A1 |
20070219503 | Loop et al. | Sep 2007 | A1 |
20070238981 | Zhu et al. | Oct 2007 | A1 |
20080038703 | Segal et al. | Feb 2008 | A1 |
20080097378 | Zuckerman | Apr 2008 | A1 |
20080107305 | Vanderkooy et al. | May 2008 | A1 |
20080123910 | Zhu | May 2008 | A1 |
20080138781 | Pellegrin et al. | Jun 2008 | A1 |
20080176198 | Ansari et al. | Jul 2008 | A1 |
20080177174 | Crane | Jul 2008 | A1 |
20080194973 | Imam | Aug 2008 | A1 |
20080270175 | Rodriguez et al. | Oct 2008 | A1 |
20090036902 | Dimaio et al. | Feb 2009 | A1 |
20090043253 | Podaima | Feb 2009 | A1 |
20090046140 | Lashmet et al. | Feb 2009 | A1 |
20090061404 | Toly | Mar 2009 | A1 |
20090074262 | Kudavelly | Mar 2009 | A1 |
20090081619 | Miasnik | Mar 2009 | A1 |
20090081627 | Ambrozio | Mar 2009 | A1 |
20090123896 | Hu et al. | May 2009 | A1 |
20090142741 | Ault et al. | Jun 2009 | A1 |
20090161827 | Gertner et al. | Jun 2009 | A1 |
20090208915 | Pugh | Aug 2009 | A1 |
20090221908 | Glossop | Sep 2009 | A1 |
20090263775 | Ullrich | Oct 2009 | A1 |
20090265671 | Sachs et al. | Oct 2009 | A1 |
20090275810 | Ayers et al. | Nov 2009 | A1 |
20090278791 | Slycke et al. | Nov 2009 | A1 |
20090305213 | Burgkart et al. | Dec 2009 | A1 |
20090326556 | Diolaiti | Dec 2009 | A1 |
20100030111 | Perriere | Feb 2010 | A1 |
20100071467 | Nasiri et al. | Mar 2010 | A1 |
20100099066 | Mire et al. | Apr 2010 | A1 |
20100120006 | Bell | May 2010 | A1 |
20100167249 | Ryan | Jul 2010 | A1 |
20100167250 | Ryan et al. | Jul 2010 | A1 |
20100167254 | Nguyen | Jul 2010 | A1 |
20100179428 | Pederson et al. | Jul 2010 | A1 |
20100198141 | Laitenberger et al. | Aug 2010 | A1 |
20100273135 | Cohen | Oct 2010 | A1 |
20110027767 | Divinagracia | Feb 2011 | A1 |
20110046915 | Hol et al. | Feb 2011 | A1 |
20110060229 | Hulvershorn et al. | Mar 2011 | A1 |
20110071419 | Liu et al. | Mar 2011 | A1 |
20110144658 | Wenderow et al. | Jun 2011 | A1 |
20110170752 | Martin et al. | Jul 2011 | A1 |
20110202012 | Bartlett | Aug 2011 | A1 |
20110207102 | Trotta et al. | Aug 2011 | A1 |
20110236866 | Psaltis | Sep 2011 | A1 |
20110257596 | Gaudet | Oct 2011 | A1 |
20110269109 | Miyazaki | Nov 2011 | A2 |
20110282188 | Burnside et al. | Nov 2011 | A1 |
20110294103 | Segal et al. | Dec 2011 | A1 |
20110301500 | Maguire et al. | Dec 2011 | A1 |
20110306025 | Sheehan et al. | Dec 2011 | A1 |
20120002014 | Walsh | Jan 2012 | A1 |
20120015336 | Mach | Jan 2012 | A1 |
20120026307 | Price | Feb 2012 | A1 |
20120034587 | Toly | Feb 2012 | A1 |
20120053514 | Robinson et al. | Mar 2012 | A1 |
20120082969 | Schwartz et al. | Apr 2012 | A1 |
20120130269 | Rea | May 2012 | A1 |
20120148994 | Hori et al. | Jun 2012 | A1 |
20120157800 | Tschen | Jun 2012 | A1 |
20120171652 | Sparks et al. | Jul 2012 | A1 |
20120183238 | Savvides et al. | Jul 2012 | A1 |
20120209243 | Yan | Aug 2012 | A1 |
20120214144 | Trotta et al. | Aug 2012 | A1 |
20120219937 | Hughes | Aug 2012 | A1 |
20120238875 | Savitsky et al. | Sep 2012 | A1 |
20120251987 | Huang et al. | Oct 2012 | A1 |
20120280988 | Lampotang et al. | Nov 2012 | A1 |
20120282583 | Thaler et al. | Nov 2012 | A1 |
20120293632 | Yukich | Nov 2012 | A1 |
20120301858 | Park et al. | Nov 2012 | A1 |
20120323520 | Keal | Dec 2012 | A1 |
20130006178 | Pinho et al. | Jan 2013 | A1 |
20130018494 | Amini | Jan 2013 | A1 |
20130046489 | Keal | Feb 2013 | A1 |
20130100256 | Kirk et al. | Apr 2013 | A1 |
20130131503 | Schneider et al. | May 2013 | A1 |
20130179110 | Lee | Jul 2013 | A1 |
20130189658 | Peters et al. | Jul 2013 | A1 |
20130189663 | Tuchschmid et al. | Jul 2013 | A1 |
20130197845 | Keal | Aug 2013 | A1 |
20130198625 | Anderson | Aug 2013 | A1 |
20130203032 | Bardsley | Aug 2013 | A1 |
20130223673 | Davis et al. | Aug 2013 | A1 |
20130236872 | Laurusonis et al. | Sep 2013 | A1 |
20130267838 | Frank et al. | Oct 2013 | A1 |
20130296691 | Ashe | Nov 2013 | A1 |
20130308827 | Dillavou et al. | Nov 2013 | A1 |
20130323700 | Samosky | Dec 2013 | A1 |
20130342657 | Robertson | Dec 2013 | A1 |
20140039452 | Bangera et al. | Feb 2014 | A1 |
20140071165 | Tuchschmid et al. | Mar 2014 | A1 |
20140102167 | MacNeil et al. | Apr 2014 | A1 |
20140120505 | Rios et al. | May 2014 | A1 |
20140121636 | Boyden et al. | May 2014 | A1 |
20140129200 | Bronstein et al. | May 2014 | A1 |
20140142422 | Manzke et al. | May 2014 | A1 |
20140162232 | Yang et al. | Jun 2014 | A1 |
20140212864 | Rios et al. | Jul 2014 | A1 |
20140240314 | Fukazawa et al. | Aug 2014 | A1 |
20140244209 | Lee et al. | Aug 2014 | A1 |
20140260704 | Lloyd et al. | Sep 2014 | A1 |
20140278183 | Zheng et al. | Sep 2014 | A1 |
20140278205 | Bhat et al. | Sep 2014 | A1 |
20140278215 | Keal et al. | Sep 2014 | A1 |
20140322683 | Baym et al. | Oct 2014 | A1 |
20140349263 | Shabat et al. | Nov 2014 | A1 |
20140349266 | Choi | Nov 2014 | A1 |
20140363801 | Samosky et al. | Dec 2014 | A1 |
20150031987 | Pameijer et al. | Jan 2015 | A1 |
20150049081 | Coffey et al. | Feb 2015 | A1 |
20150079545 | Kurtz | Mar 2015 | A1 |
20150086955 | Poniatowski et al. | Mar 2015 | A1 |
20150104773 | Toly et al. | Apr 2015 | A1 |
20150182706 | Wurmbauer et al. | Jul 2015 | A1 |
20150206456 | Foster et al. | Jul 2015 | A1 |
20150262512 | Rios et al. | Sep 2015 | A1 |
20150348443 | Rios et al. | Dec 2015 | A1 |
20150352294 | O'Mahoney et al. | Dec 2015 | A1 |
20150379899 | Baker et al. | Dec 2015 | A1 |
20150379900 | Samosky et al. | Dec 2015 | A1 |
20160000411 | Raju et al. | Jan 2016 | A1 |
20160001016 | Poulsen et al. | Jan 2016 | A1 |
20160155363 | Rios et al. | Jun 2016 | A1 |
20160193428 | Perthu | Jul 2016 | A1 |
20160213856 | Despa et al. | Jul 2016 | A1 |
20160293058 | Gaillot et al. | Oct 2016 | A1 |
20160374902 | Govindasamy et al. | Dec 2016 | A1 |
20170053563 | Holloway | Feb 2017 | A1 |
20170136185 | Rios et al. | May 2017 | A1 |
20170178540 | Rios et al. | Jun 2017 | A1 |
20170186339 | Rios et al. | Jun 2017 | A1 |
20170245943 | Foster et al. | Aug 2017 | A1 |
20170252108 | Rios et al. | Sep 2017 | A1 |
20170254636 | Foster et al. | Sep 2017 | A1 |
20170316720 | Singh et al. | Nov 2017 | A1 |
20180012516 | Rios et al. | Jan 2018 | A1 |
20180068075 | Shiwaku | Mar 2018 | A1 |
20180197441 | Rios et al. | Jul 2018 | A1 |
20180211562 | Rios et al. | Jul 2018 | A1 |
20180225991 | Pedroso et al. | Aug 2018 | A1 |
20180240365 | Foster et al. | Aug 2018 | A1 |
20180261125 | Rios et al. | Sep 2018 | A1 |
20180261126 | Rios et al. | Sep 2018 | A1 |
20180271581 | OuYang et al. | Sep 2018 | A1 |
20180333543 | Diaz et al. | Nov 2018 | A1 |
20180338806 | Grubbs | Nov 2018 | A1 |
20190130792 | Rios et al. | May 2019 | A1 |
20200202747 | Rios et al. | Jun 2020 | A1 |
20200206424 | Rios et al. | Jul 2020 | A1 |
20200226951 | Rios et al. | Jul 2020 | A1 |
20200251017 | Rios et al. | Aug 2020 | A1 |
20210174706 | Rios et al. | Jun 2021 | A1 |
20210177518 | Rios et al. | Jun 2021 | A1 |
20210213205 | Karlsson et al. | Jul 2021 | A1 |
Number | Date | Country |
---|---|---|
2011218649 | Sep 2011 | AU |
2015255197 | Dec 2015 | AU |
2865236 | Sep 2013 | CA |
2751386 | Jan 2006 | CN |
201213049 | Mar 2009 | CN |
201359805 | Dec 2009 | CN |
201465399 | May 2010 | CN |
101908294 | Dec 2010 | CN |
202159452 | Mar 2012 | CN |
102708745 | Oct 2012 | CN |
102737533 | Oct 2012 | CN |
104703641 | Jun 2015 | CN |
105118350 | Dec 2015 | CN |
205541594 | Aug 2016 | CN |
106710413 | May 2017 | CN |
107067856 | Aug 2017 | CN |
102004046003 | Mar 2006 | DE |
202005021286 | Sep 2007 | DE |
0316763 | May 1989 | EP |
1504713 | Feb 2005 | EP |
1723977 | Nov 2006 | EP |
1884211 | Feb 2008 | EP |
2425416 | Mar 2015 | EP |
2538398 | Aug 2015 | EP |
2756857 | May 2016 | EP |
2288686 | Jul 1997 | GB |
2309644 | Aug 1997 | GB |
2 309 644 | May 2000 | GB |
2508510 | Jun 2014 | GB |
201202900 | Nov 2013 | IN |
H10161522 | Jun 1998 | JP |
H10260627 | Sep 1998 | JP |
2004-348095 | Dec 2004 | JP |
2006-189525 | Jul 2006 | JP |
2008-83624 | Apr 2008 | JP |
2011-113056 | Jun 2011 | JP |
2013-037088 | Feb 2013 | JP |
52-21420 | Jun 2013 | JP |
2013-250453 | Dec 2013 | JP |
2014-153482 | Aug 2014 | JP |
2012009379 | Feb 2012 | KR |
20140047943 | Apr 2014 | KR |
10-1397522 | May 2014 | KR |
201207785 | Feb 2012 | TW |
WO 9616389 | May 1996 | WO |
WO 0053115 | Sep 2000 | WO |
WO 02083003 | Oct 2002 | WO |
WO 2005083653 | Sep 2005 | WO |
WO 2005089835 | Sep 2005 | WO |
WO 2007109540 | Sep 2007 | WO |
WO 2008005315 | Jan 2008 | WO |
WO 2008122006 | Oct 2008 | WO |
WO 2009023247 | Feb 2009 | WO |
WO 2009049282 | Apr 2009 | WO |
WO 2009094646 | Jul 2009 | WO |
WO 2009141769 | Nov 2009 | WO |
WO 2011043645 | Apr 2011 | WO |
WO 2011127379 | Oct 2011 | WO |
WO 2011136778 | Nov 2011 | WO |
WO 2012075166 | Jun 2012 | WO |
WO 2012088471 | Jun 2012 | WO |
WO 2012101286 | Aug 2012 | WO |
WO 2012106706 | Aug 2012 | WO |
WO 2012155056 | Nov 2012 | WO |
WO 2013025639 | Feb 2013 | WO |
WO 2013064804 | May 2013 | WO |
WO 2014035659 | Mar 2014 | WO |
WO 2014070799 | May 2014 | WO |
WO 2014100658 | Jun 2014 | WO |
WO 2015109251 | Jul 2015 | WO |
WO 2015110327 | Jul 2015 | WO |
WO 2015136564 | Sep 2015 | WO |
WO 2015138608 | Sep 2015 | WO |
WO 2015171778 | Nov 2015 | WO |
WO 2016089706 | Jun 2016 | WO |
WO 2016123144 | Aug 2016 | WO |
WO 2016162298 | Oct 2016 | WO |
WO 2016191127 | Dec 2016 | WO |
WO 2017048929 | Mar 2017 | WO |
WO 2017048931 | Mar 2017 | WO |
WO 2017050781 | Mar 2017 | WO |
WO 2017060017 | Apr 2017 | WO |
WO 2017070391 | Apr 2017 | WO |
WO 2017151441 | Sep 2017 | WO |
WO 2017151716 | Sep 2017 | WO |
WO 2017151963 | Sep 2017 | WO |
WO 2017153077 | Sep 2017 | WO |
WO 2018136901 | Jul 2018 | WO |
Entry |
---|
3D Systems,“Angio Mentor Clinical Validations, The Role of Simulation in Boosting the learning Curve in EVAR Procedures,” Journal of Surgical Education, Mar.-Apr. 2018, 75(2), pp. 1-2, accessed on Feb. 6, 2020, https://simbionix.com/simulators/clinical-validations/angio-mentor-clinical-validations/ (listing clinical validations completed on ANGIO Mentor from 2007 through 2018). |
3D Systems, “Angio Mentor™,” Product Brochure/Overview. 2015, 6 pp. |
Dimension Engineering, Internet Archive Wayback Machine webpage capture of https://www.dimensionengineering.com/info/accelerometers, apparently available Apr. 11, 2012, site visited Aug. 24, 2020. |
“Accelerometer: Introduction to Acceleration Measurement,” Omega Engineering, Sep. 17, 2015, 3 pages, https://www.omega.com/prodinfo/accelerometers.html. |
Afzal, et al., “Use of Earth's Magnetic Field for Mitigating Gyroscope Errors Regardless of Magnetic Perturbation,” Sensors 2011, 11, 11390-11414; doi:10.3390/s111211390, 25 pp. published Nov. 30, 2011. |
Ainsworth et al., “Simulation Model for Transcervical Laryngeal Injection Providing Real-time Feedback,” Annals of Otology, Rhinology & Laryngology, 2014, col. 123 (12), pp. 881-886. |
Arms, S.W., “A Vision for Future Wireless Sensing Systems,” 44 pp., 2003. |
“B-Smart disposable manometer for measuring peripheral nerve block injection pressures”, B. Braun USA, 2016, in 4 pages. |
Banivaheb, Niloofar, “Comparing Measured and Theoretical Target Registration Error of an Optical Tracking System,” Feb. 2015, Toronto, Ontario, 128 pp. |
Bao, et al., “A Novel Map-Based Dead-Reckoning Algorithm for Indoor Localization”, J. Sens. Actuator Networks, 2014, 3, 44-63; doi:10.3390/jsan3010044, 20 pp., Jan. 3, 2014. |
Begg et al., “Computational Intelligence for Movement Sciences: Neural Networks and Other Emerging Techniques”, Idea Group Inc (IGI), 2006. |
Benbasat et al., “An Inertial Measurement Framework for Gesture Recognition and Applications,” I. Wachsmuth and T. Sowa (Eds.): GW 2001, Springer-Verlag Berlin Heidelberg, 12 pp. , 2002. |
Bergamini et al., “Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks”, Oct. 2014, 18625-18649. |
Blue Telescope, DAISEY Injector Simulator, Available athttps://www.bluetelescope.com/work/ipsen-injection-simulator. Blue Telescope Laboratories 2020, site visited Aug. 24, 2020. |
Blum et al., “A Review of Computer-Based Simulators for Ultrasound Training,” Society for Simulation in Healthcare, Apr. 2013, vol. 8, pp. 98-108. |
Botden et al., “Suturing training in Augmented Reality: gaining proficiency in suturing skills faster,” Surg Endosc, 2009, vol. 23, pp. 2131-2137. |
Botden et al., “Augmented versus Virtual Reality Laparoscopic Simulation: What Is the Difference?,” World J. Surgery, 31, 2007, 10 pp. |
Botden et al., “Face validity study of the ProMIS Augmented Reality laparoscopic suturing simulator,” Surgical Technology International, Feb. 2008, 17, 16 pp. |
Botden et al., “What is going on in augmented reality simulation in laparoscopic surgery,” Surgical Endoscopy 23, 2009, 1693-1700. |
Bova et al.,“Mixed-Reality Simulation for Neurosurgical Procedures,” Neurosurgery, Oct. 2013, vol. 73, No. 4, pp. S138-S145. |
Brennan et al., “Classification of diffuse light emission profiles for distinguishing skin layer penetration of a needle-free jet injection,” Biomedial Optics Express, Oct. 1, 2019, vol. 10, No. 10, pp. 5081-5092. |
Brennan et al., “Light source depth estimation in porcine skin using spatially resolved diffuse imaging,” 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, 2016, pp. 5917-5920. |
Brett, et al., “Simulation of resistance forces acting on surgical needles,” Proceedings of the Instiutional of Mechanical Engineers Part H Journal of Engineering in Medicine, Feb. 1997, vol. 211 Part H, pp. 335-347. |
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 6 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/. |
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 13 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/. |
Buchanan, Judith Ann, “Use of Simulation Technology in Dental Education,” Journal of Dental Education, 2001, vol. 65, No. 11, 1225-1231. |
CAE Healthcare, “CAE ProMIS Laparoscopic Simulator,” Product Brochure/Overview, 2012, 2 pp. |
Capsulorhexis forceps only technique rehearsed on EYESi before OR (Feb. 10, 2010), https://www.youtube.com/watch?v=ySMI1Vq6Ajw. |
Chui et al., “Haptics in computer-mediated simulation: Training in vertebroplasty,” Simulation & Gaming, Dec. 2006, vol. 37, No. 4, pp. 438-451. |
Comsa et al, “Bioluminescence imaging of point sources implants in small animals post mortem: evaluation of a method for estimating source strength and depth”, Phys. Med. Biol., Aug. 2007, vol. 52, No. 17, pp. 5415-5428. |
Correa et al., “Virtual Reality Simulator for Dental Anesthesia Training in the Inferior Alveolar Nerve Block,” Journal of Applied Oral Science, vol. 25, No. 4, Jul./Aug. 2017, pp. 357-366. |
Coquoz et al., “Determination of depth of in vivo bioluminescent signals using spectral imaging techniques,” Conference Proceedings of SPIE, 2003, vol. 4967, pp. 37-45, San Jose, CA. |
Craig, Alan B., “Augmented Reality Hardware,” Understanding Augmented Reality Chapter 3, 2013, Elsevier Inc., pp. 69-124. |
Cumin et al.,“Simulators for use in anaesthesia,” Anaesthesia, 2007, vol. 62, pp. 151-162. |
Datta et al., “The use of electromagnatic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model”. vol. 193, No. 5, Nov. 2001, pp. 479-485. |
Davenar123, DentSim (Mar. 18, 2008), https://www.youtube.com/watch?v=qkzXUHay1W0. |
Defendant SHDS, INC.'s(F/K/A Nestle Skin Health, Inc.) Supplemental Disclosure of Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp., v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Aug. 3, 2020, in 6 pages. |
Defendant SHDS, INC.'s(F/K/A Nestle Skin Health, Inc.) Disclosure of Preliminary Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp., v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Feb. 12, 2020, in 136 page. |
DentSim Educators, DentSim Classroom Introduction (Aug. 8, 2013), https://vimeo.com/79938695. |
DentSimLab, Aha Moments—Dentsim Students explain how their dental skills are improving (Nov. 13, 2013), https://www.youtube.com/watch?v=02NgPmhg55Q. |
Desjardins, et al. “Epidural needle with embedded optical fibers for spectroscopic differentiation of tissue: ex vivo feasibility study”, Biomedical Optics Express, vol. 2(6): pp. 1-10. Jun. 2011. |
Dine et al., “Improving cardiopulmonary resuscitation quality and resuscitation training by combining audiovisual feedback and debriefing,” Grit Care Med, 2008 vol. 36, No. 10, pp. 2817-2822. |
EPED Taiwan, EPED—Computerized Dental Simulator (CDS-100) (Jun. 9, 2014), https://www.youtube.com/watch?v=m8UXaV2ZSXQ. |
“EPGL Medical Invents Smart Epidural Needle, Nerve Ablation and Trigger Point Treatment Devices: New Smart Medical Devices Will Give Physicians Advanced Situational Awareness During Critical Procedures,” EPGL Medical, dated Aug. 12, 2013, in 3 pages. Retrieved from http://www.prnewswire.com/news-releases/epgl-medical-invents-smart-epidural-needle-nerve-ablation-and-trigger-point-treatment-devices-219344621.html#. |
“The EpiAccess System: Access with Confidence”, EpiEP Epicardial Solutions, dated 2015, in 2 pages. |
Esteve, Eric, “Why do you need 9D Sensor Fusion to support 3D orientation?”, 5 pp., Aug. 23, 2014, https://www.semiwiki.com/forum/content/3794-why-do-you-need-9d-sensor-fusion-support-3d-orientation.html. |
Ford et al.,“Impact of simulation-based learning on mediation error rates in critically ill patients,” Intensive Care Med, 2010, vol. 36, pp. 1526-1531. |
Franz et al., “Electromagnetic Tracking in Medicine—A Review of Technology, Validation, and Applications,” IEEE, Transactions on Medical Imaging, Aug. 2014, vol. 33, No. 8, pp. 1702-1725. |
Garg et al., “Radial Artery cannulation-Prevention of pain and Techniques of cannulation: review of literature,” The Internet Journal of Anesthesiology, vol. 19, No. 1,2008, in 6 pages. |
Garrett et al., “High-Fidelity Patient Simulation: Considerations for Effective Learning,” Teaching with Technoloyg: High-Fidelity Simulation, 2010, vol. 31, No. 5, pp. 309-313. |
Gottlieb et al., “Faculty Impressions of Dental Students' Performance With and Without Virtual Reality Simulation,”Journal of Dental Education, 2011, vol. 75, No. 11, pp. 1443-1451. |
Gottlieb et al., “Simulation in Dentistry and Oral Health,” The Comprehensive Textbook of Healthcare Simulation Chapter 21, Apr. 2013, pp. 329-340. |
Grenet et al., “spaceCoder: a Nanometric 3D Position Sensing Device,” CSEM Scientific & Technical Report, 1 page, 2011. |
Helen, L., et al. “Investigation of tissue bioimpedance using a macro-needle with a potential application in determination of needle-to-nerve proximity”, Proceedings of the 8th International Conference on Sensing Technology, Sep. 2-4, 2014, pp. 376-380. |
Hoffman et al., “Arytenoid Repositioning Device,” Annals of Otology, Rhinology & Laryngology, 2014, vol. 123 (3); pp. 195-205. |
Hoffman et al., “Transillumination for Needle Localization in the Larynx,” The Laryngoscope, 2015, vol. 125, pp. 2341-2348. |
Hotraphinyo et al., “Precision measurement for microsurgical instrument evaluation”, Conference Proceedings of the 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2001, vol. 4, pp. 3454-3457. |
Huang et al., “CatAR: A Novel Stereoscopic Augmented Reality Cataract Surgery Training System with Dexterous Instrument Tracking Technology,” CHI' 18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Apr. 21-26, 2018, pp. 1-12, ACM, Montréal, Canada. |
IDA Design Awards—Winners, DAISEY Injection Simulator, available at https://idesignawards.com/winners/zoom.php?eid=9-11737-16&count=0&mode=, Available as early as Sep. 7, 2016. |
Image Navigation, DentSim by Image Navigation—Augmented Reality Dental Simulation, Nov. 2014, 5 pp., available at https://image-navigation.com/wp-content/uploads/2014/11/DentSim-V5-2-Pager.pdf. |
Image Navigation, DentSim Computerized Dental Training Simulator, Product Brochure, Jul. 2014, available at https://image-navigation.com/wp-content/uploads/2014/07/DentsimBrochure.pdf. |
Inition. Virtual Botox: Haptic App Simulated Injecting The Real Thing. Retrieved from http://inition.co.uk/case-study/virtual-botox-haptic-app-simulates-injecting-real-thing., printed on Oct. 30, 2013 in 2 pgs. |
International Search Report and Written Opinion for Appl. No. PCT/US2013/067352 dated Mar. 31, 2014 in 10 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2015/011845, dated Apr. 29, 2015 in 10 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2015/019974, dated May 21, 2015, 10 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2015/062798, dated Mar. 14, 2016, 12 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2017/020509, dated Jul. 13, 2017, 24 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2016/057974, dated Apr. 19, 2017, 21 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2017/020112, dated Jun. 9, 2017, 13 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2017/019518, dated Sep. 18, 2017, 19 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2018/014748, dated Jun. 13, 2018, 22 pages. |
International Search Report and Written Opinion for Appl. No. PCT/US2019/022136, dated Jun. 19, 2019, 18 pages. |
International Preliminary Reporton Patentability for Appl. No. PCT/US2019/022136, dated Sep. 24, 2020, 13 pages. |
Invensense, Inc., “MPU-9150 EV Board User Guide,” May 11, 2011, pp. 1-15. |
Invensense, Inc., “MPU-9150 Product Specification Revision 4.3,” Sep. 18, 2013, pp. 1-50. |
Invensense, Inc., “MPU-9150 Register Map and Descriptions Revision 4.2,” Sep. 18, 2013, pp. 1-52. |
Jafarzadeh et al., “Design and construction of an automatic syringe injection pump,” Pacific Science Review A: Natural Science and Engineering 18, 2016, in 6 pages. |
Jasinevicius et al., “An Evaluation of Two Dental Simulation Systems: Virtual Reality versus Contemporary Non-Computer-Assisted,” Journal of Dental Education, 2004, vol. 68, No. 11, 1151-1162. |
Joint Claim Construction Chart, filed on Mar. 18, 2020 in 8 pages. |
Kalvøy, H., et al., “Detection of intraneural needle-placement with multiple frequency bioimpedance monitoring: a novel method”, Journal of Clinical Monitoring and Computing, Apr. 2016, 30(2):185-192. |
Kandani et al., “Development in blood vessel searching system for HMS,” SPIE, Infrared Systems and Photoelectronic Tehcnology III, 2008, vol. 7065, pp. 1-10. |
Kettenbach et al., “A robotic needle-positioning and guidance system for CT-guided puncture: Ex vivo results,” Minimally Invasive Therapy and Allied Technologies, vol. 23, 2014, in 8 pages. |
Khosravi, Sara, “Camera-Based Estimation of Needle Pose for Ultrasound Percutaneous Procedures,” University of British Columbia, 2008, pp. ii-83. |
Krupa et al., “Autonomous 3-D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing”, IEEE Trans. Robotics and Automation, 2003, vol. 19, pp. 842-853. |
Kumar et al., “Virtual Instrumentation System With Real-Time Visual Feedback and Needle Position Warning Suitable for Ophthalmic Anesthesia Training,” IEEE: Transactions on Instrumentation and Measurement, May 2018, vol. 67, No. 5, pp. 1111-1123. |
Lacey et al., “Mixed-Reality Simulation of Minimally Invasive Surgeries,” IEEE Computer Society, 2007, pp. 76-87. |
Ladjal, et al., “Interactive Cell Injection Simulation Based on 3D Biomechanical Tensegrity Model,” 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, in 9 pages. |
Laerdal, “Virtual Phlebotomy—Directions for Use,” Self-directed Phlebotomy learning, Aug. 4, 2020, pp. 1-100. |
Laerdal Medical, http://www.laerdal.com/us/nav/203/Venous-Arterial-Access, printed Mar. 8, 2019 in 3 pgs. |
Lampotang et al.,“A Subset of Mixed Simulations: Augmented Physical Simulations with Virtual Underlays,” Interservice/Idnustry Training, Simualtion, and Education Conference (I/ITSEC), 2012, pp. 1-11. |
Lampotang et al., “Mixed Reality Simulation for Training Reservists and Military Medical Personnel in Subclavian Central Venous Access,” Informational Poster, Ufhealth, Center for Safety, Simulation and Advanced Learning Technologies, 2015, 1 pp. available at https://simulation.health.ufl.edu/files/2018/12/Dept_CoR_2015-Mixed_Reality_Simulation_for_Training.pdf. |
Lee et al., “A Phantom Study on the Propagation of NIR Rays under the Skin for Designing a Novel Vein-Visualizing Device,” ICCAS, Oct. 20-23, 2013, pp. 821-823. |
Lee et al., “An Intravenous Injection Simulator Using Augmented Reality for Veterinary Education and its Evaluation,” Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, Dec. 2-4, 2012, in 4 pages. |
Lee et al., “Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine,” The Veterinary Journal, 2013, vol. 196, No. 2, pp. 197-202. |
Lee et al., “The utility of endovascular simulation to improve technical performance and stimulate continued interest of preclinical medical students in vascular surgery,” Journal of Surgical Education, 2009 APDS Spring Meeting, vol. 66, No. 6, 367-373. |
Lee et al., “Virtual Reality Ophthalmic Surgical Simulation as a Feasible Training and Assessment Tool: Results of a Multicentre Study,” Canada Journal of Ophthalmology, Feb. 2011 vol. 46, No. 1, 56-60. |
Lemole et al., “Virtual Reality in Neurosurgical Education: Part-Task Ventriculostomy Simulation with Dynamic Visual and Haptic Feedback,” Neurosurgery, Jul. 2007, vol. 61, No. 1, pp. 142-149. |
Leopaldi et al., “The dynamic cardiac biosimulator: A method for training physicians in beating-heart mitral valve repair procedures,” The Journal of Thoracic and Cardiovascular Surgery, 2018, vol. 155, No. 1, pp. 147-155. |
Lim, M.W. et al., “Use of three-dimensional animation for regional anaesthesia teaching: application to interscalene brachial plexus blockade,” British Journal of Anaesthesia, Advance Access, 2004, vol. 94, pp. 372-377. |
Liu et al. “Robust Real-Time Localization of Surgical Instruments in the Eye Surgery Stimulator (EyeSi)”, Signal and Image Processing, 2002. |
Liu et al. “Study on an Experimental AC Electromagnetic Tracking System” Proceedings of the 5th World Congress on Intelligent Control and Automation, Jun. 15-19, 2001. pp. 3692-3695. |
Madgwick, Sebastian O.H., “An efficient orientation filter for inertial and inertial/magnetic sensor arrays,” 32 pp., Apr. 30, 2010. |
Medgadget Editors, “EYESI Surgical Simulator,” Medgadget, Aug. 28, 2006,4 pp., printed on Feb. 7, 2020, https://www.medgadget.com/2006/08/eyes_i_surgical.html. |
Merlone1, Eyesi_Cataract_2011 (Sep. 9, 2011), https://www.youtube.com/watch?v=XTulabWmEvk. |
Merril et al., “The Ophthalmic Retrobulbar Injection Simulator (ORIS): An Application of Virtual Reality to Medical Education”, Proc. Ann. Symp. Comput. Med. Care, 1992, pp. 702-706. |
Microsofi, “Integrating Motion and Orientation Sensors,” 85 pp., Jun. 10, 2013. |
Miller, Nathan L., Low-Power, Miniature Inertial Navigation System with Embedded GPS and Extended Kalman Filter, MicroStrain, Inc., 12 pp., 2012. |
Mnemonic, Ipsen Injection Simulators, available at http://mnemonic.studio/project/ispen-injection-simulators. Copyright 2019, Website viewed on Aug. 24, 2020. |
Mnemonic, Injection Simulator (Oct. 20, 2017), https://vimeo.com/239061418. |
MPU-9150 9-Axis Evaluation Board User Guide, Revision 1.0, 15 pp., May 11, 2011, http//www.invensense.com. |
MPU-9150, Register Map and Descriptions, Revision 4.2, 52 pp., Sep. 18, 2013, http//www.invensense.com. |
MPU-9150, Product Specification, Revision 4.3, 50 pp., Sep. 18, 2013, http//www.invensense.com. |
Mukherjee et al., “A Hall Effect Sensor Based Syringe Injection Rate Detector”, IEEE 2012 Sixth Int'l Conf. on Sensing Technol.(ICST), Dec. 18-21, 2012. |
Mukherjee et al., “An Ophthalmic Anesthesia Training System Using Integrated Capacitive and Hall Effect Sensors,” IEEE, Transactions on Instrumentation and Measurement, Jan. 2014, vol. 63, No. 5, 11 pp. |
Nelson, Douglas A. Jr., “A Modular and Extensible Architecture Integrating Sensors, Dynamic Displays of Anatomy and Physiology, and Automated Instruction for Innovations in Clinical Education” Doctoral Dissertation, Univ. of Pitt., 2017, 260 pp. |
Nelson et al., “The Tool Positioning Tutor: A Target-Pose Tracking and Display System for Learning Correct Placement of a Medical Device,” Medicine Meets Virtual Reality 18, IOS Press, 2011, 5 pp. |
Ottensmeyer et al., “Ocular and Craniofacial Trauma Treatment Training System: Overview & Eyelid Laceration Module,” workshop Proceedings of the 8th International Conference on Intelligent Environments, IOS Press, 2012, 13 pp. |
Ozturk wt al., “Complications Following Injection of Soft-Tissue Fillers,” Aesthetic Surgery Journal, from the American Society for Aesthetic Plastic Surgery, Inc. Reprints and permissions, http://www.sagepub.com/journalsPermissions.nav, Aug. 2013, pp. 862-877. |
Petition for Inter Partes Review of U.S. Pat. No. 9,792,836, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 Et Seq., IPR2020-00042, dated Oct. 11, 2019. |
Decision Denying Instiution of Inter Partes Review of U.S. Pat. No. 9,792,836, Pursuant to 35 U.S.C. §§ 314, 37 C.F.R. § 42.4 (a), IPR2020-00042, dated Apr. 17, 2020. |
Petition for Inter Partes Review of U.S. Pat. No. 10,290,231, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 Et Seq., IPR2020-00935 dated May 13, 2020. |
Petition for Inter Partes Review of U.S. Pat. No. 10,290,232, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 Et Seq., IPR2020-00937 dated May 13, 2020. |
Patterson et al., “Absorption spectroscopy in tissue-simulating materials: a theoretical and experimental study of photon paths”, Appl. Optics, Jan. 1995, vol. 34, No. 1, pp. 22-30. |
Pitt Innovates, BodyExplorer™ (Sep. 24, 2014), https://www.youtube.com/watch?v=T6G2OWJm5hs. |
Pitt Innovates, Pitt Student Innovator Award, Pitt Intellectual Property 2017, Douglas A Nelson Jr. (Nov. 28, 2017), https://www.youtube.com/watch?v=0_CVBgWtCLo. |
Poyade et al., “Development of a Haptic Training Simulation for the Administration of Dental Anesthesia Based Upon Accurate Anatomical Data,” Conference and Exhibition of the European Association of Virtual and Augmented Reality, 2014, in 5 pages. |
PST Iris Tracker, Plug and Play, 3D optical motion tracking specifications, 1 p., Dec. 4, 2014, www.pstech.com. |
PST Iris Tracker, Instruction Manual, 3D optical motion tracking specifications, 42 pp., Jul. 27, 2012, www.pstech.com. |
Quio, “Smartinjector,” available at https://web.archive.org/web/20161017192142/http://www.quio.com/smartinjector, Applicant believes to be available as early as Oct. 17, 2016, in 3 pages. |
Report and Recommendation, Case No. 19-592-LPS-JLH, Truinject Corp., v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Jun. 18, 2020, in 18 pages. |
Rahman et al., “Tracking Manikin Tracheal Intubation Using Motion Analysis,” Pediatric Emergency Care, Aug. 2011, vol. 27, No. 8, pp. 701-705. |
Robinson et al., “A Mixed-Reality Part-Task Trainer for Subclavian Venous Access,” Journal of the Society for Simulation in Healthcare, Feb. 2014, vol. 9, No. 1, pp. 56-64. |
Salem et al., “Clinical Skills Laboratories “CSLs” Manual 1432-2011,” Jan. 2011, pp. 0-88. |
Samosky et al., “BodyWindows: Enhancing a Mannequin with Projective Augmented Reality for Exploring Anatomy, Physiology and Medical Procedures,” Medicine Meets Virtual Reality 19, 2012, 433, J.D. Westwood et al. eds., IOS Press, pp. 433-439. |
Samosky et al., “Enhancing Medical Device Training with Hybrid Physical-Virtual Simulators: Smart Peripherals for Virtual Devices,” Medicine Meets Virtual Reality 20, Jan. 2013, J.D. Westwood et al. eds., IOS Press 377, pp. 377-379. |
Samosky, Joseph, “View from the Top: Simulation Director Envisions Greater Use For Training Tool,” Biomedical Instrumentation & Technology, 2012, pp. 283-288. |
Samosky et al.“Toward a Comprehensive Hybrid Physical-Virtual Reality Simulator of Peripheral Anesthesia with Ultrasound and Neurostimulator Guidance,” Medicine Virtual Reality 18, IOS Press, 2011, pp. 552-554. |
Satava, “Accomplishments and Challenges of Surgical Simulation”, Dawning of the next-generation surgical education, Surgical Endoscopy Ultrasound and Interventional Techniques, Online publication, Feb. 6, 2001, in 10 pages. |
Schneider, Chad Michael, “Systems for Robotic Needle Insertion and Tool-Tissue Interaction Modeling,” Research Gate, 2004, pp. 1-74, Baltimore, Maryland. |
Simbionix, Valencia College's CVT program uses Simbionix ANGIO Mentor simulators, Feb. 26, 2013, https://www.youtube.com/watch ?v=oAE0fWzXMjw. |
SimEx, “Dental Augmented Reality Simulator,” EPED, 3 pp. https://www.epedmed.com/simex. Available as early as 2019. |
Spiteri et al., “Phacoemulsification Skills Training and Assessment,” The British Journal of Ophthalmology 2010, Aug. 2009, 20 pp. |
State Electronics, “Sensofoil Membrane Potentiometer,” Product Information and Technical Specifications, received on May 15, 2020 in 6 pages. |
Struik, Pieter, “Ultra Low-Power 9D Fusion Implementation: A Case Study,” Synopsis, Inc., 7 pp., Jun. 2014. |
Stunt et al., “Validation of ArthroS virtual reality simulator for arthroscopic skills,” Knee Surgery Sports Traum. Arthroscopy 23, Jun. 11, 2014, 8 pp. |
Sultan et al.,“A Novel Phantom for Teaching and Learning Ultrasound-guided Needle Manipulation,” Journal of Medical Ultrasound, Elsevier Taiwan LLC, Jul. 2013, vol. 21, pp. 152-155. |
Sutherland, et al. “An Augmented Reality Haptic Training Simulator for Spinal Needle Procedures,” IEEE, 2011. |
Suzuki et al., “Simulation of Endovascular Neurointervention Using Silicone Models: Imaging and Manipulation,” Neurol Med Chir (Tokyo), 2005, vol. 45, pp. 567-573. |
The Simulation Group, Internet Archive Wayback webpage capture of http://www.medicalsim.org/virgil.htm, apparently available Apr. 10, 2013, site visited Aug. 25, 2020. |
The Simulation Group, VIRGIL™ Videos (2002), http://www.medicalsim.org/ virgil_vid.htm; http://www.medicalsim.org/virgil/virgil%20expert.mpg. |
Ting et al., “A New Technique to Assist Epidural Needle Placement: Fiberoptic-guided Insertion Using Two Wavelengths,” Anesthesiology, 2010, vol. 112, pp. 1128-1135. |
Touch of Life Technologies, “ToLTech Cystoscopy Simulator Helps Practice BOTOX Injections,” https://www.medgadget.com/2012/05/toltech-cystoscopy-simulator-helps-practice-botox-injections.html, May 2012, printed on Feb. 6, 2020 in 2 pgs. |
Touch of Life Technologies, “Touch of Life Technologies' new cystoscopy and bladder injection simulator offers urologists training on use of BOTOX®,” https://www.urotoday.com/recent-abstracts/pelvic-health-reconstruction/urinary-incontinence/50289-touch-of-life-technologies-new-cystoscopy-and-bladder-injection-simulator-offers-urologists-training-on-use-of-botox-onabotulinumtoxina-as-treatment-for-urinary-incontinence-in-adults-with-neurological-conditions.html, May 2012, printed on Feb. 6, 2020 in 2 pgs. |
Truinject Corp., “Smart Injection Platform,” http://truinject.com/technology/, printed Jan. 13, 2018, in 3 pages. |
Ufcssalt, “Video of mixed simulation for placement of CVL needle”—(Patent Pending), Dec. 5, 2011, https://www.youtube.com/watch ?v=0ITIFbiiwRs. |
UFHealth, “UF developing mixed-reality simulators for training in treatment of injured soldiers,” Aug. 20, 2014, https://www.youtube.com/watch?v=sMxH1lprc10& feature=emb_title. |
Ungi et al., “Perk Tutor: An Open-Source Training Platform for Ultrasound-Guided Needle Insertions,” IEEE Transactions on Biomedical Engineering, Dec. 2012, vol. 59, No. 12, pp. 3475-3481. |
University of Pittsburgh Innovation Institute, “BodyExplorer: An Automated Augmented Reality Simulator for Medical Training and Competency Assessment,” Mar. 2016, 2 pp. |
University of Pittsburgh Innovation Institute, “BodyExplorer: Enhancing a Mannequin Medical Simulator with Sensing, Tangible Interaction and Projective Augmented Reality for Exploring Dynamic Anatomy, Physiology and Clinical Procedures,” 2012, pp. 1-3. |
Van Sickle et al., “Construct validation of the ProMIS simulator using novel laparoscopic suturing task”, Surg Endosc, Sep. 2005, vol. 19, No. 9, pp. 1227-1231. |
Varesano, Fabio, “Prototyping Orientation and Motion Sensing Objects with Open Hardware,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/~varesano, Feb. 10, 2013, 4 pp. |
Varesano, Fabio, “FreeIMU: An Open Hardware Framework for Orientation and Motion Sensing,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/~varesano, Mar. 20, 2013, 10 pp. |
Vaughan et al., “A review of virtual reality based training simulators for orthopedic surgery,” Medical Engineering and Physics, 2016, vol. 38, Elsevier Ltd., pp. 59-71. |
Virgil™, The Simulation Group/CIMIT, “Medical Simulation Chest Trauma Training System,” 2002, 6 pp. http://www.medicalsim.org/virgil.htm. |
VirtaMed ArthroS™, “Virtual reality arthroscopy for knee, shoulder, hip, ankle & FAST basic skills,” Fact Sheet/Brochure Jul. 13, 2011. |
VirtaMed ArthroS™ Module Descriptions. 2019. |
Virtamed, ArthroS—The 2012 Arthroscopic Simulator for Knee Arthroscopy, Feb. 1, 2012, https://www.youtube.com/watch?v=Y6w3AGfAqKA. |
Virtamed, Arthroscopy Training Simulator ArthroS Now With Shoulder Module!, Mar. 13, 2013, https://www.youtube.com/watch?v=kPuAm0MIYg0. |
Virtamed, Arthroscopy Training 2013: VirtaMed ArthroS Shoulder Simulator, Sep. 24, 2013, https://www.youtube.com/watch?v=WdCtPYr0wK0. |
Virtamed News, “VirtaMed ArthroS—Virtual reality training for knee arthroscopy,” VirtaMed, Jul. 13, 2011, 2 pp., accessed on Feb. 6, 2020, https://www.virtamed.com/en/news/virtamed-arthros-virtual-reality-training-knee-arthroscopy/. |
Virtamed, VirtaMed ArthroS™—diagnostic and therapeutic arthroscopy in both the knee and shoulder (Apr. 15, 2014), https://www.youtube.com/watch?v=gtklSWnOzRc. |
VRMAGIC, “eyesi by VRmagic Surgical Simulator,” Product Brochure, 2015, available at https://pdf.medicalexpo.com/pdf/vrmagic/eyesi-surgical-product-brochure/112458-159450.html. |
Walsh et al., “Use of Simulated Learning Environments in Dentistry and Oral Health Curricula,” SLE in Dentistry and Oral Health: Final Report, 2010, Health Workforce Australia, pp. 1-112. |
Welk et al., “DentSim—A Future Teaching Option for Dentists,” 7 International Journal of Computerized Dentistry, 2004, 9 pp. |
Wik et al., “Intubation with laryngoscope versus transillumination performed by paramedic students on manikins and cadavers”, Resuscitation, Jan. 1997, vol. 33, No. 3, pp. 215-218. |
Wiles, Andrew et al., “Accuracy assessment and interpretation for optical tracking systems,” SPIE, Medical Imaging: Visualization, Image-Guided Procedures and Display, 2004, vol. 5367, pp. 1-12. |
Yeo et al., “The Effect of Augmented Reality Training on Percutaneous Needle Placement in Spinal Facet Joint Injections,” IEEE, Transactions on Biomedical Engineering, Jul. 2011, vol. 58, No. 7, 8 pp. |
Yu et al., “Development of an In Vitro Tracking System with Poly(vinyl alcohol) Hydrogel for Catheter Motion,” Journal of Biomedical Science and Engineering, 2010, vol. 5, No. 1, pp. 11-17. |
Association of American Medical Colleges, Medical Simulation in Medical Education: Results of an AAMC Survey (Sep. 2011), in 48 pages. |
J. Clark et al., A quantitative scale to define endoscopic torque control during natural orifice surgery, 22 Minimally Invasive Therapy & Allied Technologies 17-25 (2013). |
A. D'Angelo et al., Use of decision-based simulations to assess resident readiness for operative independence, 209 Am J Surg. 132-39 (2015). |
V. Datta et al., The relationship between motion analysis and surgical technical assessments, 184(1) Am J Surg. 70-73 (2002). |
A. Dosis et al., Synchronized Video and Motion Analysis for the Assessment of Procedures in the Operating Theater, 140 Arch Surg. 293-99 (2005). |
Lance Baily, Polhemus Delivers World Class Motion Tracking Technology to Medical Simulation Industry, healthysimulation.com (May 2, 2016), https://www.healthysimulation.com/8621/polhemus-deliversworld-class-motion-tracking-technology-to-medical-simulationindustry/. |
S. Laufer et al., Sensor Technology in Assessments of Clinical Skill, 372 N Engl J Med 784-86 (2015). |
K. Perrone et al., Translating motion tracking data into resident feedback: An opportunity for streamlined video coaching, 209 Am J Surg. 552-56 (2015). |
C. Pugh et al., A Retrospective Review of TATRC Funding for Medical Modeling and Simulation Technologies, 6 Simulation in Healthcare, 218-25 (2011). |
Defendant SHDS, Inc.'s (F/K/A Nestle Skin Health, Inc.) Second Supplemental Disclosure of Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp. v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Mar. 5, 2021, in 9 pages. |
S. Shaharan et al., Motion Tracking System in Surgical Training, 2017 INTECHOPEN 3-23 (2017), available at http://dx.doi.org/10.5772/intechopen.68850. |
J. Šilar et al., Development of In-Browser Simulators for Medical Education: Introduction of a Novel Software Toolchain, 21 J Med Internet Res. e14160 (published online Jul. 3, 2019). |
Andraos et al., “Sensing your Orientation” Address 2007, 7 pp. |
“A Virtual Reality Based Joint Injection Simulator Phase III”, https://www.sbir.gov/. Retrieved Mar. 5, 2021, in 2 pages. |
Berkelman et al., “Co-Located 3D Graphic and Haptic Display using Electromagnetic Levitation”, The Institute of Electrical and Electronics Engineers, 2012 in 6 pages. |
Coles et al., “Modification of Commercial Force Feedback Hardware for Needle Insertion Simulation”, Studies in Health Technology and Informatics, 2011 in 1 page. |
Dang et al., “Development and Evaluation of an Epidural Injection Simulator with Force Feedback for Medical Training”, Studies in Health Technology and Informatics, 2001, vol. 81, pp. 97-102. |
Defendant SHDS, Inc.'s (F/K/A Nestle Skin Health, Inc.) Final Invalidity Contentions, Case No. 1:19-cv-00592-LPS-JLH, Truinject Corp. v. Galderma, S.A., Galderma Laboratories, L.P., Nestle Skin Health, Inc., dated Jun. 18, 2021, in 54 pages. |
Färber et al., “Needle Bending in a VR-Puncture Training System Using a 6DOF Haptic Device”, Studies in Health Technology and Informatics, 2009, vol. 142, in 3 pages. |
Gobbetti et al., “Catheter Insertion Simulation with co-registered Direct Volume Rendering and Haptic Feedback”, Studies in Health Technology and Informatics, vol. 70, 2000 in 3 pages. |
“Immersion Medical Joins with PICC Excellence to Promote Training Products for Peripherally Inserted Central Catheter Procedure”, Immersion Corporation, Business Wire 2006. Dated Jan. 9, 2006, in 3 pages. |
“Immersion Medical Upgrades CathSim AccuTouch”, Med Device Online, dated Jan. 12, 2005 in 1 page. |
Judgment and Final Written Decision Determining All Challenged Claims Unpatentable, U.S. Pat. No. 10,290,231 B2, IPR2020-00935. |
Judgment and Final Written Decision Determining All Challenged Claims Unpatentable, U.S. Pat. No. 10,290,232 B2, IPR2020-00937. |
Laerdal, “Virtual I.V.—Directions for Use”, www.laerdal.com, dated Sep. 3, 2010, in 103 pages. |
Laerdal, “Virtual I.V. Sell Sheet”, www.laerdal.com, dated Mar. 26, 2013, in 2 pages. |
Laerdal, “Virtual I.V. Simulator (Discontinued)”, www.laerdal.com, in 5 pages. Retrieved Jul. 23, 2021. |
“Learning by Feel: ToLTech and Allergan Simulator”, 3D Systems, dated May 8, 2012, in 93 pages. |
Lee et al., “Evaluation of the Mediseus® Epidural Simulator”, Anaesthesia and Intensive Care (2012), vol. 40, No. 2, pp. 311-318. |
Lendvay et al., “The Biomechanics of Percutaneous Needle Insertion”, Studies in Health Technology and Informatics, Jan. 2008 in 2 pages. |
Lim et al., “Simulation-Based Military Regional Anesthesia Training System”, US Army Medical Research and Materiel Command Fort Detrick MD, Telemedicine and Advanced Technology Research Center, 2008, in 8 pages. |
Luboz et al., “ImaGiNe Seldinger: First simulator for Seldinger technique and angiography training”, Computer Methods and Programs in Biomedicine, vol. 111, No. 2, Aug. 2013, pp. 419-434. |
Mastmeyer et al., “Direct Haptic Volume Rendering in Lumbar Puncture Simulation”, Studies in Health Technology and Informatics, vol. 173, No. 280, 2012 in 8 pages. |
Mastmeyer et al., “Real-Time Ultrasound Simulation for Training of US-Guided Needle Insertion in Breathing Virtual Patients”, Studies in Health Technology and Informatics, Jan. 2016 in 9 pages. |
MEDGADGET Editors, “ToLTech Cystoscopy Simulator Helps Practice BOTOX Injections”, MEDGADGET, May 14, 2012, in 2 pages. Printed on Feb. 6, 2020, http://www.medgadget.com/2012/05/toltech-cystoscopy-simulator-helps-practice-botox-injections.html. |
Sclaverano et al., “BioSym: a simulator for enhanced learning of ultrasound-guided prostate biopsy”, Studies in Health Technology and Informatics, 2009 in 6 pages. |
Shen et al., “Virtual trainer for intra-detrusor injection of botulinum toxin to treat urinary incontinence”, Studies in Health Technology and Informatics, vol. 173, 2012 in 4 pages. |
Vidal et al., “Developing An Immersive Ultrasound Guided Needle Puncture Simulator”, Studies in Health Technology and Informatics, 2009, pp. 398-400. |
Virtual I.V.® Simulator—1. Introduction. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=H9Qd6N9vG_A, viewed on Jul. 27, 2021. |
Virtual I.V.® Simulator—2. System Overview. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=l01UFNFU3cU, viewed on Jul. 28, 2021. |
Virtual I.V.® Simulator—3. Training overview. YouTube, uploaded by Laerdal Medical AS, Jan. 19, 2011, www.youtube.com/watch?v=5Ut6YkDaNWI, viewed on Jul. 27, 2021. |
Wandell et al., “Using a Virtual Reality Simulator in Phlebotomy Training”, LabMedicine, (Aug. 2010) vol. 41, No. 8, in 4 pages. |
Wolpert et al., “ENISS: An Epidural Needle Insertion Simulation System”, Institute of Electrical and Electronics Engineers Inc., 2007, pp. 271-272. |
Number | Date | Country
---|---|---
20210264815 A1 | Aug 2021 | US
Number | Date | Country
---|---|---
61826899 | May 2013 | US
61814766 | Apr 2013 | US
61784239 | Mar 2013 | US
61720046 | Oct 2012 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 16853597 | Apr 2020 | US
Child | 17123002 | | US
Parent | 15977993 | May 2018 | US
Child | 16853597 | | US
Parent | 15258839 | Sep 2016 | US
Child | 15977993 | | US
Parent | 14595972 | Jan 2015 | US
Child | 15258839 | | US
Parent | 14318368 | Jun 2014 | US
Child | 14595972 | | US
Parent | 14067829 | Oct 2013 | US
Child | 14318368 | | US