The present disclosure generally relates to a system for practicing injections on a human or animal training model.
A variety of medical injection procedures are often performed in prophylactic, curative, therapeutic, or cosmetic treatments. Injections may be administered in various locations on the body, such as under the conjunctiva, into arteries, bone marrow, the spine, the sternum, the pleural space of the chest region, the peritoneal cavity, joint spaces, and internal organs. Injections can also be helpful in administering medication directly into anatomic locations that are generating pain. These injections may be administered intravenously (into a vein), intramuscularly (into the muscle), intradermally (into the skin), subcutaneously (beneath the skin, into the fatty layer), or by way of intraperitoneal injections (into the body cavity). Injections can be performed on humans as well as animals. The methods of administering injections typically vary for different procedures and may depend on the substance being injected, the needle size, or the area of injection.
Injections are not limited to treating medical conditions; they are also used for treating aesthetic imperfections, restorative cosmetic procedures, treatment of migraines and depression, epidurals, orthopedic procedures, self-administered injections, in vitro procedures, and other therapeutic procedures. Many of these procedures are performed through injections of various products into different parts of the body. The aesthetic and therapeutic injection industry comprises two main categories of injectable products: neuromodulators and dermal fillers. The neuromodulator industry commonly utilizes nerve-inhibiting products such as Botox®, Dysport®, and Xeomin®, among others. The dermal filler industry utilizes products administered by providers to patients for orthopedic, cosmetic, and therapeutic applications, such as, for example, Juvederm®, Restylane®, Belotero®, Sculptra®, Artefill®, Voluma®, Kybella®, Durolane®, and others. The providers or injectors may include plastic surgeons, facial plastic surgeons, oculoplastic surgeons, dermatologists, orthopedists, primary caregivers, psychologists/psychiatrists, nurse practitioners, dentists, and nurses, among others.
One of the problems in the administration of injections is that there is no official certification or training process. Anyone with a minimal medical-related license may inject a patient. These “injectors” may include primary care physicians, orthopedists, dentists, veterinarians, nurse practitioners, nurses, physician's assistants, aesthetic spa physicians, therapists, or patients performing self-administered injections. However, the qualifications and training requirements for injectors vary by country, state, and county. For example, in most states in the United States, the only requirement to inject patients with neuromodulators and/or fillers is a nursing degree or medical degree. This causes major problems with uniformity and expertise in administering injections. The drawbacks resulting from a lack of uniformity in training and expertise are widespread throughout the medical industry. Doctors and practitioners often are not well trained in administering injections for diagnostic, therapeutic, and cosmetic purposes. This lack of training often leads to instances of chronic pain, headaches, bruising, swelling, or bleeding in patients.
Current injection training options are classroom-based, with hands-on training performed on live models. The availability of models is limited. Moreover, even when available, live models are limited in the number and type of injections that may be performed on them. The need for live models is restrictive because injectors are unable to be exposed to a wide and diverse range of situations in which to practice. For example, it may be difficult to find live models with different skin tones or densities. This makes the training process less effective because patients often have diverse anatomical features as well as varying prophylactic, curative, therapeutic, or cosmetic needs. Live models are also restrictive because injectors are unable to practice injection methods on internal organs due to health considerations. As a result of these limited training scenarios, individuals seeking treatments involving injections have a much higher risk of being treated by an inexperienced injector. This may result in low patient satisfaction with the results, or in failed procedures. In many instances, patients have experienced lumpiness from incorrect dermal filler injections. Some failed procedures may result in irreversible problems and permanent damage to a patient's body. For example, patients have experienced vision loss, direct injury to the globe of the eye, and brain infarctions where injectors have incorrectly performed dermal filler procedures. Other examples of side effects include inflammatory granuloma, skin necrosis, endophthalmitis, injectable-related vascular compromise, cellulitis, biofilm formation, subcutaneous nodules, fibrotic nodules, other infections, and death.
The present disclosure provides for a system for prophylactic, curative, therapeutic, acupuncture, or cosmetic injection training and certification. The system can be configured to use at least two cameras to track the position and/or trajectory of a testing tool, providing three-dimensional location information, for example, an x-y-z location, of the tip of the testing tool when it is inserted into a training model. In some embodiments, the system can take into account bending of light by at least a portion of the training model to provide more accurate three-dimensional location information. In some embodiments, the system can reduce, minimize, or eliminate the effects of variations in camera parameters, including intrinsic and extrinsic parameters, without a need for calibrating the three-dimensional position calculations.
In some embodiments, an injection training system can include an anatomic training model, the training model including one or more resilient layers configured to receive a tip of a testing tool and a rigid innermost layer, the one or more resilient layers and rigid innermost layer being optically transmissive, the innermost layer defining a cavity within the training model; a first camera mounted within the cavity, the first camera having a first central viewing axis; a second camera mounted within the cavity, the second camera having a second central viewing axis extending at an angle offset from the first central viewing axis, the first and second cameras each having fields of view configured to detect light emitted from the tip of the testing tool; and a processing unit configured to determine a three-dimensional position of the tip of the testing tool based on locations of the centroids of emitted light detected in the fields of view of the first and second cameras and refraction of the emitted light through the innermost layer. The system can further comprise a support structure configured for mounting the first and second cameras. The testing tool can comprise a syringe, a biopsy needle, a catheter, or another type of injection device. The system can further comprise an output device in communication with the processing unit and/or the first and second cameras and configured to generate information regarding injection parameters based on the communications. The first central viewing axis can be at a ninety-degree angle with respect to the second central viewing axis. The first camera can be positioned in a superior portion of the anatomic training model and the second camera can be positioned in an inferior portion of the anatomic training model. The first central viewing axis can extend anteriorly and inferiorly. The second central viewing axis can extend anteriorly and superiorly. The one or more resilient layers can comprise at least one elastomeric layer. The training model can further comprise an opaque outer skin layer. The testing tool can comprise an optical fiber configured to emit light from the tip of the testing tool.
In some embodiments, a method for providing injection training can include determining whether an area of emitted light from a testing tool is within a field of view of a first camera and a second camera positioned in an anatomical training model, the training model including one or more resilient layers configured to receive a tip of a testing tool and a rigid innermost layer, the one or more resilient layers and rigid innermost layer being optically transmissive, the first and second cameras positioned within a cavity defined by the innermost layer; finding a location of a centroid of the area of emitted light from the field of view of each of the first and second cameras; tracing the light from the location of the centroid in each of the first and second cameras toward the innermost layer; adjusting the light tracing from each of the first and second cameras by refraction of the light through the innermost layer; recording from the adjusted light tracing a first line segment from an outer surface of the innermost layer to an outer surface of the training model for the first camera and a second line segment from an outer surface of the innermost layer to an outer surface of the training model for the second camera; and calculating a three-dimensional position of the tip of the testing tool by calculating a mid-point of the nearest points along each of the first and second line segments to the other line segment. The adjusting can comprise adjusting the light tracing by a first refraction angle at an interface between the cavity and an inner surface of the innermost layer and a second refraction angle at an interface between an outer surface of the innermost layer and an inner surface of the one or more resilient layers. The method can further comprise repeating the determining, finding, tracing, adjusting, recording, and calculating to track multiple locations of the tip of the testing tool over time. The tracking of the multiple locations can further comprise animating a trajectory of the injection on an output device. When the location of the centroid of the area of emitted light from the field of view of the first camera is known, the finding of the location of the centroid of the area of emitted light from the field of view of the second camera can comprise determining a feasible light detection region of the second camera based on end points of the first line segment for the first camera, the feasible light detection region being smaller than the field of view of the second camera. When light is not detected in the feasible light detection region, the finding of the location of the centroid of the area of emitted light from the field of view of the second camera can further comprise determining a subsequent feasible light detection region based on a length of the testing tool.
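For illustration, the final step above (locating the tip at the mid-point of the nearest points between the two traced line segments) can be sketched in a few lines of code. The snippet below is a minimal sketch rather than the disclosed implementation: it treats the two traces as lines, and the endpoints, directions, and printed output are hypothetical values standing in for the refraction-adjusted traces produced by the first and second cameras.

```python
import numpy as np

def closest_point_midpoint(a1, d1, a2, d2):
    """Mid-point of the nearest points between two traced rays in 3D.

    a1, a2 -- points on each traced line segment (e.g., where the adjusted
              trace exits the rigid innermost layer); d1, d2 -- direction
              vectors of the segments.  A fuller implementation would clamp
              the parameters t and s to the extents of the recorded segments.
    """
    w = a1 - a2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # approaches zero only for parallel traces
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1, p2 = a1 + t * d1, a2 + s * d2
    return (p1 + p2) / 2.0         # estimated x-y-z position of the tip

# Hypothetical refraction-adjusted traces from the first and second cameras.
tip = closest_point_midpoint(
    np.array([0.0, 5.0, 2.0]), np.array([0.1, -1.0, 0.3]),
    np.array([0.0, -5.0, 1.0]), np.array([0.05, 1.0, 0.35]),
)
print("estimated tip position:", tip)
```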
In some embodiments, an injection training system can include an anatomic training model, the training model configured to receive a tip of a testing tool, the training model comprising an inner cavity; a first camera mounted within the cavity, the first camera having a first central viewing axis, wherein the first camera has a first plurality of intrinsic and/or extrinsic parameters; and a processing unit configured to determine a location of a centroid of emitted light detected in the field of view of the first camera, the light being emitted from the tip of the testing tool at a known three-dimensional position, wherein the processing unit is further configured to adjust the first plurality of intrinsic and/or extrinsic parameters based on one or more reference three-dimensional positions and corresponding locations of the centroid of emitted light. The system can further comprise a second camera mounted within the cavity, the second camera having a second central viewing axis extending at an angle offset from the first central viewing axis, the second camera having a second plurality of intrinsic and/or extrinsic parameters. The processing unit can be configured to adjust the second plurality of intrinsic and/or extrinsic parameters based on the one or more reference three-dimensional positions and corresponding locations of the centroid of emitted light. The first camera can be positioned in a superior portion of the anatomic training model and the second camera can be positioned in an inferior portion of the anatomic training model. The first central viewing axis can extend anteriorly and inferiorly. The second central viewing axis can extend anteriorly and superiorly. The first central viewing axis can be at a ninety-degree angle with respect to the second central viewing axis. The testing tool can comprise a syringe, a biopsy needle, a catheter, or another type of injection device. The system can further comprise an output device in communication with the processing unit and/or the first and second cameras and configured to generate information regarding injection parameters based on the communications. The training model can comprise one or more resilient layers configured to receive the tip of a testing tool and a rigid innermost layer, the one or more resilient layers and rigid innermost layer being optically transmissive. The training model can further comprise an opaque outer skin layer. The testing tool can comprise an optical fiber configured to emit light from the tip of the testing tool.
In some embodiments, a method for providing injection training can include determining a plurality of intrinsic and/or extrinsic parameters of a first camera positioned in an anatomical training model configured to receive a tip of a testing tool, the first camera configured to detect an area of light emitted from the tip of the testing tool in a field of view of the first camera; finding a location of a centroid of the area of the emitted light in the field of view of the first camera, wherein a three-dimensional position of the tip of the testing tool is known; comparing the location of the centroid on the first camera with locations of centroids corresponding to one or more reference three-dimensional positions; and adjusting, if needed, the plurality of intrinsic and/or extrinsic parameters of the first camera based on the comparison. The method can further comprise determining a plurality of intrinsic and/or extrinsic parameters of a second camera positioned in the anatomical training model, the second camera having a central viewing axis extending at an angle offset from a central viewing axis of the first camera; finding a location of a centroid of the area of the emitted light in the field of view of the second camera, wherein a three-dimensional position of the tip of the testing tool is known; comparing the location of the centroid on the second camera with locations of centroids corresponding to one or more reference three-dimensional positions; and adjusting, if needed, the plurality of intrinsic and/or extrinsic parameters of the second camera based on the comparison. The intrinsic parameters can comprise one or more of focal length, image sensor format, principal point, and/or lens distortion. The extrinsic parameters can comprise one or more of position of a camera, rotation of a camera, and/or coordinate system transformations from coordinates of a chosen coordinate system to 3D camera coordinates. The adjusting can eliminate a need to calibrate future determinations of a three-dimensional position of the tip of the testing tool using the training system. The reference three-dimensional positions of the tip of the testing tool and the corresponding locations of centroids of the areas of emitted light can be empirical data obtained by tracing the emitted light away from a location of a centroid in the first camera and a location of a centroid in the second camera, respectively.
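One way to picture the comparison and adjustment described above is to project a reference three-dimensional tip position through a pinhole camera model built from the intrinsic and extrinsic parameters and compare the predicted pixel location with the observed centroid. The following sketch is illustrative only; the intrinsic matrix, pose, reference point, observed centroid, and pixel tolerance are assumed values, not parameters taken from this disclosure.

```python
import numpy as np

def project_point(K, R, t, point_3d):
    """Project a 3D point in model coordinates to pixel coordinates.

    K -- 3x3 intrinsic matrix (focal length, principal point);
    R, t -- extrinsic rotation and translation (model to camera frame).
    """
    p_cam = R @ point_3d + t
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]        # predicted (u, v) centroid location

# Assumed parameters for illustration only.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 50.0])

reference_tip = np.array([1.0, -2.0, 10.0])    # known 3D tip position
observed_centroid = np.array([340.0, 210.0])   # centroid found in the image

error = np.linalg.norm(project_point(K, R, t, reference_tip) - observed_centroid)
if error > 2.0:                                # assumed pixel tolerance
    print(f"reprojection error {error:.1f} px; adjust parameter estimates")
```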
In some embodiments, an injection training system can include a first camera mounted within an anatomic training model, the anatomic training model configured to receive a tip of a testing tool, the first camera having a first central viewing axis; a second camera mounted within the anatomic training model, the second camera having a second central viewing axis extending at an angle offset from the first central viewing axis, the first and second cameras positioned at a distance from each other, the first and second cameras each having fields of view configured to detect light emitted from the tip of the testing tool; and a processing unit configured to determine a three-dimensional position of the tip of the testing tool based at least on locations of the centroids of emitted light detected in the fields of view of the first and second cameras. The first central viewing axis can be at an angle of between about 1 degree and about 90 degrees with respect to the second central viewing axis. The first central viewing axis can be at a ninety-degree angle with respect to the second central viewing axis. The first camera can be positioned in a superior portion of the anatomic training model and the second camera can be positioned in an inferior portion of the anatomic training model. The first central viewing axis can extend anteriorly and inferiorly. The second central viewing axis can extend anteriorly and superiorly. The system can further comprise a support structure configured for mounting the first and second cameras. The first camera can be mounted on a superior portion of the support structure. The second camera can be mounted on an inferior portion of the support structure. The testing tool can comprise a syringe, a biopsy needle, a catheter, or another type of injection device. The system can further comprise an output device in communication with the processing unit and/or the first and second cameras and configured to generate information regarding injection parameters based on the communications. The training model can comprise one or more resilient layers configured to receive the tip of a testing tool and a rigid innermost layer, the one or more resilient layers and rigid innermost layer being optically transmissive. The training model can further comprise an opaque outer skin layer. The testing tool can comprise an optical fiber configured to emit light from the tip of the testing tool. The processing unit can be further configured to determine the three-dimensional position of the tip of the testing tool based on refraction of the emitted light through the training model.
Any feature, structure, or step disclosed herein can be replaced with or combined with any other feature, structure, or step disclosed herein, or omitted. Further, for purposes of summarizing the disclosure, certain aspects, advantages, and features of the inventions have been described herein. It is to be understood that not necessarily any or all such advantages are achieved in accordance with any particular embodiment of the inventions disclosed herein. No individual aspects of this disclosure are essential or indispensable.
Various embodiments are depicted in the accompanying drawings for illustrative purposes, and should in no way be interpreted as limiting the scope of the embodiments. Furthermore, various features of different disclosed embodiments can be combined to form additional embodiments, which are part of this disclosure. Corresponding numerals indicate corresponding parts.
Aspects of the disclosure are provided with respect to the figures and various embodiments. One of skill in the art will appreciate, however, that other embodiments and configurations of the devices and methods disclosed herein will still fall within the scope of this disclosure even if not described in the same detail as some other embodiments. Aspects of the various embodiments discussed do not limit the scope of the disclosure herein, which is instead defined by the claims following this description.
The term “bending of light” in this disclosure has its broad and ordinary meaning as understood by a person of ordinary skill in the art, which includes refraction of light.
As shown in the accompanying figures, the injection training system can include a training model 100, a testing tool 110, one or more cameras 120, and an output device 140.
The training model 100 can include a base or inner or innermost layer 104 and one or more elastomeric layers 103. The base layer 104 can include a rigid material in order to provide structural support to the training model 100. In some embodiments, the tip of the injection tool 110 does not penetrate the rigid base layer 104. The base layer 104 can be optically transmissive. For example, the base layer 104 can be transparent or translucent. In some embodiments, the base layer 104 can include plexiglass, other similar acrylic glass, or other glass or glass-like materials. The base layer 104 can define a cavity 105 to accommodate the one or more cameras 120. One or more elastomeric layers 103 may be positioned between the base layer 104 and the outer layer 102. Each elastomeric layer 103 may have different properties to simulate different types of tissue. The elastomeric layers can be optically transmissive (for example, translucent or transparent). An opaque outer layer 102 can cover the outermost elastomeric layer to mimic the skin.
In the illustrated example, the testing tool 110 is in the form of a syringe, but the testing tool 110 can include other needle-based devices or catheter devices. The testing tool 110 can include a light source that emits light at the head of the needle, for example, using a fiber optic in the needle. The light source may be one or more LEDs, laser diodes, or any other light emitting device or combination of devices.
The one or more cameras 120 may be placed within the training model 100. As shown in the figures, the camera(s) 120 can be mounted within the cavity 105 defined by the base layer 104.
The camera(s) 120 can send the information detected to a processing unit included in the system. For example, the processing unit may be on the camera(s) 120, the training model 100, the output device 140, or on a separate apparatus. The processing unit can communicate with the output device 140, which can display parameters associated with the injection. The output device 140 can include any type of display useful to a user, such as, for example, a tablet, phone, laptop or desktop computer, television, projector or any other display technology.
Additional information on the injection apparatus and training system can be found in U.S. Pat. No. 8,764,449, filed Oct. 30, 2013, titled “SYSTEM FOR COSMETIC AND THERAPEUTIC TRAINING” and U.S. Publication No. 2014/0212864, filed Mar. 31, 2014, titled “INJECTION TRAINING APPARATUS USING 3D POSITION SENSOR,” the entirety of each of which is hereby incorporated by reference and made part of this specification.
According to some embodiments of the present disclosure, the apparatus can include a three-dimensional (3D) tracking system configured to determine a location of the tip of the testing tool in one of the elastomeric layers. The location can be an x-y-z position of the tip of the injection tool. In some embodiments, the system may track a depth of insertion of the testing tool using an x-y-z position of the tip of the testing tool. The tracking system can determine the location of the tip of the testing tool by tracking the light emitted from the tip of the testing tool.
As shown in the figures, the system can include a support structure 150 having a mounting portion 154 configured for mounting a first camera 120 and a second camera 130 within the training model 100.
The support structure 150 can be shaped to position the first camera 120 at an angle relative to the second camera 130. For example, the mounting portion 154 can include a first portion 155 configured to be positioned in a superior portion of the training model 100 and a second portion 158 configured to be positioned in an inferior portion of the training model 100. The first portion 155 can be angled with respect to the second portion 158, such that a first central viewing axis of the first camera 120 is at an angle relative to a second central viewing axis of the second camera 130. In some configurations, the first central viewing axis of the first camera 120 can be positioned at a 90 degree angle with respect to the second central viewing axis of the second camera 130. Positioning the cameras at a 90 degree angle with respect to each other can be useful to determine the three-dimensional position of the tip of the testing tool 110 using the process(es) described below, as maximum resolution of an x-y-z position can be a function of the angle between the first and second cameras 120, 130.
As shown in the figures, the first camera 120 can be positioned in a superior portion of the training model 100 with its central viewing axis extending anteriorly and inferiorly, and the second camera 130 can be positioned in an inferior portion of the training model 100 with its central viewing axis extending anteriorly and superiorly.
Several factors relating to the relative positions of the cameras in the training model are at play here. Specifically, the smaller the distance between the two cameras, the greater the overlap of the viewing fields of the two cameras. However, the farther apart the two cameras are, the better the resolution of the 3D position of an object that appears in the viewing fields of the cameras. In addition, placing the two cameras at a non-zero angle to each other improves the resolution of the 3D positions of the object, but may result in a smaller overlap of the viewing fields. The embodiments described herein advantageously position the two cameras such that their viewing fields can overlap over substantially an entire injection region of the training model, while remaining relatively far apart and at a non-zero angle to each other to improve the resolution of the 3D positions of the tip of the injection tool. In some embodiments, one or both of the first and second cameras can be positioned anywhere along the mounting structure or within the training model. In some embodiments, the cameras can be at an angle of between about 1 degree and about 90 degrees with respect to each other.
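A rough numerical experiment, presented as an illustration below, makes the angle trade-off concrete: triangulating the same point from two rays, one of which is perturbed by a small angular error, produces a larger position error when the rays cross at a shallow angle than when they cross at close to 90 degrees. The geometry, distances, and error magnitudes are hypothetical and chosen only to show the trend.

```python
import numpy as np

def intersect(p1, d1, p2, d2):
    """Intersection of two coplanar rays p1 + t*d1 and p2 + s*d2."""
    t, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t * d1

def rot(angle):
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

target = np.array([0.0, 0.0])      # true position of the light source
radius = 10.0                      # assumed camera distance from the target
detect_err = np.radians(0.2)       # assumed angular error in one detected ray

for sep_deg in (10.0, 45.0, 90.0):
    half = np.radians(sep_deg) / 2.0
    c1 = radius * np.array([np.cos(half), np.sin(half)])
    c2 = radius * np.array([np.cos(-half), np.sin(-half)])
    d1 = rot(detect_err) @ (target - c1)   # slightly perturbed ray, camera 1
    d2 = target - c2                       # exact ray, camera 2
    err = np.linalg.norm(intersect(c1, d1, c2, d2) - target)
    print(f"{sep_deg:5.1f} degrees between cameras -> position error {err:.3f}")
```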
3D Location Determination
Determination of a 3D location of the tip of the testing tool will now be described with reference to the accompanying figures.
The processes described below for determining the 3D location of the tip of the testing tool can be based on the principle of tracing a light path backwards away from a pixel in a viewing field of a camera and determining the intersection of the light path with objects, starting from the nearest object.
The light tracing can be adjusted for refraction at each interface according to Snell's law, n_A sin θ1 = n_B sin θ2, where A and B represent two different media, n_A and n_B are the refractive indices of the two media, and θ1 and θ2 are the angles of incidence and refraction, respectively. At block 466, the processing unit can calculate where the adjusted light path intersects the outer surface of the rigid inner layer. At block 468, the processing unit can calculate a refraction angle, which is the same as the angle of incidence θ2, as shown in the figures.
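For illustration, the refraction adjustment at an interface can be sketched using the vector form of Snell's law. The snippet below is a generic sketch, not the specific procedure of the numbered blocks; the incident direction, surface normal, and refractive indices (air for the cavity and an acrylic-like value assumed for the rigid inner layer) are example values.

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Bend a unit ray direction at an interface using the vector form of
    Snell's law.

    incident -- unit direction of the traced ray; normal -- unit surface
    normal pointing back toward the side the ray comes from; n1, n2 --
    refractive indices before and after the interface.  Returns None on
    total internal reflection.
    """
    eta = n1 / n2
    cos_i = -np.dot(normal, incident)
    sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
    if sin2_t > 1.0:
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * incident + (eta * cos_i - cos_t) * normal

# Example: a ray traced from a camera through the air-filled cavity (n ~ 1.0)
# into the rigid inner layer (n ~ 1.49, an assumed value for acrylic).
ray = np.array([0.0, np.sin(np.radians(30.0)), np.cos(np.radians(30.0))])
cavity_side_normal = np.array([0.0, 0.0, -1.0])
print(refract(ray, cavity_side_normal, 1.0, 1.49))
```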
The process 500 can be repeated by restarting at block 502 to track multiple locations of the tip of the testing tool over time. This data can be used to animate the trajectory of the injection on the output device. In some embodiments, the animation can be in real-time (which includes at least processing time). The 3D location determination processes described above can advantageously provide an accurate 3D location of the tip of the injection tool, thereby providing more helpful feedback in injection training, by taking into account refraction of light as the light enters and leaves the rigid inner layer. In some embodiments, the processes can further incorporate different light diffusing and/or transmission properties of different elastomeric layers in order to determine the particular elastomeric layer that the tip of the injection tool has reached. In some embodiments, different elastomeric layers can have fibers arranged in different orientations so that the fibers deflect light in different directions. In some embodiments, different elastomeric layers can have varying degrees of optical transmission. For example, one elastomeric layer can be transparent and another can be translucent. Information about the layer that the tip of the injection tool has reached can provide a check against the x-y-z position determined using the processes described herein to further improve the accuracy of the x-y-z position determination.
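The repetition of the per-frame determination can be pictured as a simple acquisition loop that accumulates timestamped tip positions for later animation. The sketch below is purely illustrative; locate_tip and frame_source are hypothetical placeholders for the single-frame process described above and for the synchronized image pairs from the two cameras.

```python
import time

def record_trajectory(locate_tip, frame_source, max_frames=100):
    """Repeat the per-frame 3D determination to build an injection trajectory.

    locate_tip   -- hypothetical callable implementing the single-frame
                    process (detect centroids, trace, refract, triangulate);
                    returns an (x, y, z) tuple or None if no light is seen.
    frame_source -- hypothetical iterable yielding synchronized image pairs
                    from the first and second cameras.
    """
    trajectory = []
    for i, frames in enumerate(frame_source):
        if i >= max_frames:
            break
        tip = locate_tip(frames)
        if tip is not None:
            trajectory.append((time.time(), tip))   # timestamped x-y-z sample
    return trajectory   # e.g., sent to the output device for animation
```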
Camera Parameter Variations
Another advantage of the 3D location determination system described above will now be explained with reference to the accompanying figures.
Centroid Determination in Noisy Situations
In some embodiments, even though the testing or injection tool is inserted in a single position, the emitted light may appear in more than one location because the rigid inner layer and/or elastomeric layer(s) can reflect the light, resulting in a plurality of distinct light spots that can be within the field of view of a camera. An advantage of the two-camera training system described herein is that a known centroid pixel value from one camera can help determine the centroid value of the second camera in such “noisy” situations caused by the reflection of light.
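One way to use the known centroid from the first camera in such noisy situations is to project the end points of the first camera's traced line segment into the second camera's image (the feasible light detection region described above) and keep the candidate light spot that falls nearest that segment. The sketch below is illustrative only; the projected segment end points, candidate centroids, and selection rule are hypothetical.

```python
import numpy as np

def dist_to_segment(p, a, b):
    """Pixel distance from point p to the 2D segment with end points a and b."""
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

# Hypothetical end points of the first camera's traced segment, projected
# into the second camera's image (the feasible light detection region).
seg_a, seg_b = np.array([120.0, 80.0]), np.array([260.0, 200.0])

# Hypothetical centroids of the distinct light spots seen by the second camera.
candidates = [np.array([250.0, 190.0]),    # likely the true tip
              np.array([60.0, 300.0])]     # likely a reflection

best = min(candidates, key=lambda c: dist_to_segment(c, seg_a, seg_b))
print("selected centroid:", best)
```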
Other Variations
In some embodiments, if the emitted light is within the field of view of both cameras, the processing unit can determine the location of the centroid of the area of emitted light (e.g., the u-v position of the centroid) from the field of view of each camera. The processing unit can then determine the x-y-z position by comparing data representative of the location of the centroid from both cameras with calibration data. This step can be executed using linear matrix multiplication as follows: [x, y, z] = X * [u1, v1, v2].
The X matrix includes calibration data that can be determined empirically using a calibration jig 300, as shown in the figures.
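Under one reading of the formula above, the empirical determination of the X matrix can be pictured as a least-squares fit: pairs of known jig positions and the corresponding centroid measurements are collected, and the matrix that best maps one to the other is solved for. The sketch below illustrates that reading; the calibration measurements are made-up values, and a real calibration would use many more reference points.

```python
import numpy as np

# Hypothetical calibration data: known jig positions (x, y, z) paired with
# the corresponding centroid measurements [u1, v1, v2] from the two cameras.
positions = np.array([[0.0, 0.0, 10.0],
                      [5.0, 0.0, 10.0],
                      [0.0, 5.0, 12.0],
                      [5.0, 5.0, 14.0],
                      [2.0, 3.0, 11.0]])
centroids = np.array([[320.0, 240.0, 250.0],
                      [410.0, 238.0, 255.0],
                      [322.0, 150.0, 210.0],
                      [408.0, 148.0, 170.0],
                      [355.0, 185.0, 228.0]])

# Fit X so that centroids @ X.T approximates positions in a least-squares sense.
X_T, *_ = np.linalg.lstsq(centroids, positions, rcond=None)
X = X_T.T

# A new measurement [u1, v1, v2] then maps to an estimated x-y-z position.
measurement = np.array([360.0, 200.0, 230.0])
print("estimated tip position:", X @ measurement)
```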
Although the disclosure describes determining a 3D position based on a location of emitted light, other properties of the light may also be taken into consideration, such as intensity, angle, dispersion, brightness, color, and/or duration of the light.
Terminology
As used herein, the relative terms “superior,” “inferior,” and “anterior” have their usual and customary anatomical meaning. For example, superior refers to the direction of the top of a head and inferior refers to the direction of the neck.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without other input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list.
The term “about” as used herein represents an amount close to the stated amount that still performs a desired function or achieves a desired result. For example, the term “about” may refer to an amount that is within less than 10% of the stated amount or as the context may dictate.
Any methods disclosed herein need not be performed in the order recited. The methods disclosed herein include certain actions taken by a practitioner; however, they can also include any third-party instruction of those actions, either expressly or by implication. For example, actions such as “inserting the testing tool” include “instructing insertion of a testing tool.”
All of the methods and tasks described herein may be performed and fully automated by a computer system. The computer system may, in some cases, include multiple distinct computers or computing devices (e.g., physical servers, workstations, storage arrays, cloud computing resources, etc.) that communicate and interoperate over a network to perform the described functions. Each such computing device typically includes a processor (or multiple processors) that executes program instructions or modules stored in a memory or other non-transitory computer-readable storage medium or device (e.g., solid state storage devices, disk drives, etc.). The various functions disclosed herein may be embodied in such program instructions, and/or may be implemented in application-specific circuitry (e.g., ASICs or FPGAs) of the computer system. Where the computer system includes multiple computing devices, these devices may, but need not, be co-located. The results of the disclosed methods and tasks may be persistently stored by transforming physical storage devices, such as solid state memory chips and/or magnetic disks, into a different state. In some embodiments, the computer system may be a cloud-based computing system whose processing resources are shared by multiple distinct business entities or other users.
Depending on the embodiment, certain acts, events, or functions of any of the processes or algorithms described herein can be performed in a different sequence, can be added, merged, or left out altogether (e.g., not all described operations or events are necessary for the practice of the algorithm). Moreover, in certain embodiments, operations or events can be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially.
The various illustrative logical blocks, modules, routines, and algorithm steps described in connection with the embodiments disclosed herein can be implemented as electronic hardware (e.g., ASICs or FPGA devices), computer software that runs on general purpose computer hardware, or combinations of both. Various illustrative components, blocks, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as specialized hardware versus software running on general-purpose hardware depends upon the particular application and design constraints imposed on the overall system. The described functionality can be implemented in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.
Moreover, the various illustrative logical blocks and modules described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a general purpose processor device, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor device can be a microprocessor, but in the alternative, the processor device can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor device can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor device includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor device can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor device may also include primarily analog components. For example, some or all of the rendering techniques described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
The elements of a method, process, routine, or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor device, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of a non-transitory computer-readable storage medium. An exemplary storage medium can be coupled to the processor device such that the processor device can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor device. The processor device and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor device and the storage medium can reside as discrete components in a user terminal.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it can be understood that various omissions, substitutions, and changes in the form and details of the devices or algorithms illustrated can be made without departing from the spirit of the disclosure. As can be recognized, certain embodiments described herein can be embodied within a form that does not provide all of the features and benefits set forth herein, as some features can be used or practiced separately from others. The scope of certain embodiments disclosed herein is indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference in their entirety under 37 CFR 1.57. This application claims benefit of U.S. Provisional Patent Application No. 62/302,328, filed Mar. 2, 2016, and entitled “SYSTEM FOR DETERMINING A THREE-DIMENSIONAL POSITION OF A TESTING TOOL,” the entire disclosure of which is hereby incorporated by reference and made part of this specification.
Number | Name | Date | Kind |
---|---|---|---|
3941121 | Olinger et al. | Mar 1976 | A |
4142517 | Contreras Guerrero de Stavropoulos et al. | Mar 1979 | A |
4311138 | Sugarman | Jan 1982 | A |
4356828 | Jamshidi | Nov 1982 | A |
4410020 | Lorenz | Oct 1983 | A |
4515168 | Chester et al. | May 1985 | A |
4566438 | Liese et al. | Jan 1986 | A |
4836632 | Bardoorian | Jun 1989 | A |
5065236 | Diner | Nov 1991 | A |
5241184 | Menzel | Aug 1993 | A |
5518407 | Greenfield et al. | May 1996 | A |
5534704 | Robinson | Jul 1996 | A |
5622170 | Shulz | Apr 1997 | A |
5651783 | Reynard | Jul 1997 | A |
5899692 | Davis et al. | May 1999 | A |
6064749 | Hirota et al. | May 2000 | A |
6353226 | Khalil et al. | Mar 2002 | B1 |
6485308 | Goldstein | Nov 2002 | B1 |
6564087 | Pitris et al. | May 2003 | B1 |
6575757 | Leight et al. | Jun 2003 | B1 |
6702790 | Ross et al. | Mar 2004 | B1 |
6769286 | Biermann et al. | Aug 2004 | B2 |
7383728 | Noble et al. | Jun 2008 | B2 |
7500853 | Bevirt et al. | Mar 2009 | B2 |
7553159 | Arnal et al. | Jun 2009 | B1 |
7594815 | Toly | Sep 2009 | B2 |
7665995 | Toly | Feb 2010 | B2 |
7725279 | Luinge et al. | May 2010 | B2 |
7761139 | Tearney et al. | Jul 2010 | B2 |
7857626 | Toly | Dec 2010 | B2 |
8007281 | Toly | Aug 2011 | B2 |
8072606 | Chau et al. | Dec 2011 | B2 |
8165844 | Luinge et al. | Apr 2012 | B2 |
8203487 | Hol et al. | Jun 2012 | B2 |
8208716 | Choi et al. | Jun 2012 | B2 |
8250921 | Nasiri et al. | Aug 2012 | B2 |
8257250 | Tenger et al. | Sep 2012 | B2 |
8277411 | Gellman | Oct 2012 | B2 |
8319182 | Brady et al. | Nov 2012 | B1 |
8342853 | Cohen | Jan 2013 | B2 |
8351773 | Nasiri et al. | Jan 2013 | B2 |
8382485 | Bardsley | Feb 2013 | B2 |
8450997 | Silverman | May 2013 | B2 |
8467855 | Yasui | Jun 2013 | B2 |
8525990 | Wilcken | Sep 2013 | B2 |
8535062 | Nguyen | Sep 2013 | B2 |
8632498 | Rimsa et al. | Jan 2014 | B2 |
8655622 | Yen et al. | Feb 2014 | B2 |
8764449 | Rios et al. | Jul 2014 | B2 |
8818751 | Van Acht et al. | Aug 2014 | B2 |
8961189 | Rios et al. | Feb 2015 | B2 |
9017080 | Placik | Apr 2015 | B1 |
9251721 | Lampotang et al. | Feb 2016 | B2 |
9443446 | Rios et al. | Sep 2016 | B2 |
10269266 | Rios et al. | Apr 2019 | B2 |
10290231 | Rios et al. | May 2019 | B2 |
10290232 | Rios et al. | May 2019 | B2 |
10500340 | Rios et al. | Dec 2019 | B2 |
20020168618 | Anderson et al. | Nov 2002 | A1 |
20020191000 | Henn | Dec 2002 | A1 |
20030031993 | Pugh | Feb 2003 | A1 |
20030055380 | Flaherty | Mar 2003 | A1 |
20030108853 | Chosack et al. | Jun 2003 | A1 |
20030114842 | DiStefano | Jun 2003 | A1 |
20030220557 | Cleary et al. | Nov 2003 | A1 |
20040009459 | Anderson et al. | Jan 2004 | A1 |
20040092878 | Flaherty | May 2004 | A1 |
20040118225 | Wright et al. | Jun 2004 | A1 |
20040126746 | Toly | Jul 2004 | A1 |
20040175684 | Kaasa et al. | Sep 2004 | A1 |
20050055241 | Horstmann | Mar 2005 | A1 |
20050057243 | Johnson et al. | Mar 2005 | A1 |
20050084833 | Lacey et al. | Apr 2005 | A1 |
20050181342 | Toly | Aug 2005 | A1 |
20060084050 | Haluck | Apr 2006 | A1 |
20060194180 | Bevirt et al. | Aug 2006 | A1 |
20060264745 | Da Silva | Nov 2006 | A1 |
20070003917 | Kitching et al. | Jan 2007 | A1 |
20070179448 | Lim et al. | Aug 2007 | A1 |
20070197954 | Keenan | Aug 2007 | A1 |
20070238981 | Zhu | Oct 2007 | A1 |
20080097378 | Zuckerman | Apr 2008 | A1 |
20080107305 | Vanderkooy et al. | May 2008 | A1 |
20080123910 | Zhu | May 2008 | A1 |
20080138781 | Pellegrin et al. | Jun 2008 | A1 |
20080176198 | Ansari et al. | Jul 2008 | A1 |
20090036902 | DiMaio | Feb 2009 | A1 |
20090043253 | Podaima | Feb 2009 | A1 |
20090046140 | Lashmet | Feb 2009 | A1 |
20090061404 | Toly | Mar 2009 | A1 |
20090074262 | Kudavelly | Mar 2009 | A1 |
20090081619 | Miasnik | Mar 2009 | A1 |
20090081627 | Ambrozio | Mar 2009 | A1 |
20090123896 | Hu et al. | May 2009 | A1 |
20090142741 | Ault et al. | Jun 2009 | A1 |
20090161827 | Gertner | Jun 2009 | A1 |
20090208915 | Pugh | Aug 2009 | A1 |
20090263775 | Ullrich | Oct 2009 | A1 |
20090265671 | Sachs et al. | Oct 2009 | A1 |
20090275810 | Ayers et al. | Nov 2009 | A1 |
20090278791 | Slycke et al. | Nov 2009 | A1 |
20090305213 | Burgkart et al. | Dec 2009 | A1 |
20090326556 | Diolaiti | Dec 2009 | A1 |
20100030111 | Perriere | Feb 2010 | A1 |
20100071467 | Nasiri et al. | Mar 2010 | A1 |
20100099066 | Mire et al. | Apr 2010 | A1 |
20100120006 | Bell | May 2010 | A1 |
20100167249 | Ryan | Jul 2010 | A1 |
20100167254 | Nguyen | Jul 2010 | A1 |
20100179428 | Pederson et al. | Jul 2010 | A1 |
20100198141 | Laitenberger et al. | Aug 2010 | A1 |
20100273135 | Cohen | Oct 2010 | A1 |
20110027767 | Divinagracia | Feb 2011 | A1 |
20110046915 | Hol et al. | Feb 2011 | A1 |
20110060229 | Hulvershorn et al. | Mar 2011 | A1 |
20110071419 | Liu et al. | Mar 2011 | A1 |
20110202012 | Bartlett | Aug 2011 | A1 |
20110207102 | Trotta et al. | Aug 2011 | A1 |
20110236866 | Psaltis et al. | Sep 2011 | A1 |
20110257596 | Gaudet | Oct 2011 | A1 |
20110269109 | Miyazaki | Nov 2011 | A2 |
20110282188 | Burnside et al. | Nov 2011 | A1 |
20110294103 | Segal et al. | Dec 2011 | A1 |
20110301500 | Maguire et al. | Dec 2011 | A1 |
20120002014 | Walsh | Jan 2012 | A1 |
20120015336 | Mach | Jan 2012 | A1 |
20120026307 | Price | Feb 2012 | A1 |
20120034587 | Toly | Feb 2012 | A1 |
20120130269 | Rea | May 2012 | A1 |
20120148994 | Hori et al. | Jun 2012 | A1 |
20120171652 | Sparks et al. | Jul 2012 | A1 |
20120183238 | Savvides et al. | Jul 2012 | A1 |
20120214144 | Trotta et al. | Aug 2012 | A1 |
20120219937 | Hughes | Aug 2012 | A1 |
20120238875 | Savitsky et al. | Sep 2012 | A1 |
20120251987 | Huang et al. | Oct 2012 | A1 |
20120280988 | Lampotang et al. | Nov 2012 | A1 |
20120282583 | Thaler et al. | Nov 2012 | A1 |
20120293632 | Yukich | Nov 2012 | A1 |
20120301858 | Park et al. | Nov 2012 | A1 |
20120323520 | Keal | Dec 2012 | A1 |
20130006178 | Pinho et al. | Jan 2013 | A1 |
20130018494 | Amini | Jan 2013 | A1 |
20130046489 | Keal | Feb 2013 | A1 |
20130100256 | Kirk et al. | Apr 2013 | A1 |
20130131503 | Schneider et al. | May 2013 | A1 |
20130179110 | Lee | Jul 2013 | A1 |
20130189658 | Peters et al. | Jul 2013 | A1 |
20130197845 | Keal | Aug 2013 | A1 |
20130198625 | Anderson | Aug 2013 | A1 |
20130203032 | Bardsley | Aug 2013 | A1 |
20130223673 | Davis | Aug 2013 | A1 |
20130236872 | Laurusonis et al. | Sep 2013 | A1 |
20130267838 | Fronk et al. | Oct 2013 | A1 |
20130296691 | Ashe | Nov 2013 | A1 |
20130308827 | Dillavou | Nov 2013 | A1 |
20130323700 | Samosky et al. | Dec 2013 | A1 |
20130342657 | Robertson | Dec 2013 | A1 |
20140039452 | Bangera et al. | Feb 2014 | A1 |
20140102167 | MacNeil et al. | Apr 2014 | A1 |
20140120505 | Rios | May 2014 | A1 |
20140121636 | Boyden | May 2014 | A1 |
20140162232 | Yang et al. | Jun 2014 | A1 |
20140212864 | Rios | Jul 2014 | A1 |
20140240314 | Fukazawa et al. | Aug 2014 | A1 |
20140244209 | Lee et al. | Aug 2014 | A1 |
20140260704 | Lloyd et al. | Sep 2014 | A1 |
20140278183 | Zheng et al. | Sep 2014 | A1 |
20140278205 | Bhat et al. | Sep 2014 | A1 |
20140278215 | Keal et al. | Sep 2014 | A1 |
20140322683 | Baym et al. | Oct 2014 | A1 |
20140349266 | Choi | Nov 2014 | A1 |
20150079545 | Kurtz | Mar 2015 | A1 |
20150182706 | Wurmbauer et al. | Jul 2015 | A1 |
20150206456 | Foster | Jul 2015 | A1 |
20150262512 | Rios et al. | Sep 2015 | A1 |
20150352294 | O'Mahoney et al. | Dec 2015 | A1 |
20150379899 | Baker et al. | Dec 2015 | A1 |
20150379900 | Samosky et al. | Dec 2015 | A1 |
20160000411 | Raju et al. | Jan 2016 | A1 |
20160001016 | Poulsen et al. | Jan 2016 | A1 |
20160155363 | Rios et al. | Jun 2016 | A1 |
20160193428 | Perthu | Jul 2016 | A1 |
20160213856 | Despa et al. | Jul 2016 | A1 |
20160293058 | Gaillot et al. | Oct 2016 | A1 |
20160374902 | Govindasamy et al. | Dec 2016 | A1 |
20170136185 | Rios et al. | May 2017 | A1 |
20170178540 | Rios et al. | Jun 2017 | A1 |
20170186339 | Rios et al. | Jun 2017 | A1 |
20170245943 | Foster et al. | Aug 2017 | A1 |
20170252108 | Rios et al. | Sep 2017 | A1 |
20180012516 | Rios et al. | Jan 2018 | A1 |
20180068075 | Shiwaku | Mar 2018 | A1 |
20180197441 | Rios et al. | Jul 2018 | A1 |
20180240365 | Foster et al. | Aug 2018 | A1 |
20180261125 | Rios et al. | Sep 2018 | A1 |
20180261126 | Rios et al. | Sep 2018 | A1 |
20190130792 | Rios et al. | May 2019 | A1 |
Number | Date | Country |
---|---|---|
2011218649 | Sep 2011 | AU |
2015255197 | Dec 2015 | AU |
2865236 | Sep 2013 | CA |
2751386 | Jan 2006 | CN |
201213049 | Mar 2009 | CN |
102708745 | Oct 2012 | CN |
104703641 | Jun 2015 | CN |
105118350 | Dec 2015 | CN |
205541594 | Aug 2016 | CN |
106710413 | May 2017 | CN |
107067856 | Aug 2017 | CN |
202005021286 | Sep 2007 | DE |
0316763 | May 1989 | EP |
1504713 | Feb 2005 | EP |
1723977 | Nov 2006 | EP |
1884211 | Feb 2008 | EP |
2425416 | Mar 2015 | EP |
2538398 | Aug 2015 | EP |
2756857 | May 2016 | EP |
2288686 | Jul 1997 | GB |
2309644 | Aug 1997 | GB |
2508510 | Jun 2014 | GB |
201202900 | Nov 2013 | IN |
2013-037088 | Feb 2013 | JP |
52-21420 | Jun 2013 | JP |
2013-250453 | Dec 2013 | JP |
2014-153482 | Aug 2014 | JP |
2012009379 | Feb 2012 | KR |
20140047943 | Apr 2014 | KR |
10-1397522 | May 2014 | KR |
201207785 | Feb 2012 | TW |
WO 0053115 | Sep 2000 | WO |
WO 02083003 | Oct 2002 | WO |
WO 2005083653 | Sep 2005 | WO |
WO 2007109540 | Sep 2007 | WO |
WO 2008005315 | Jan 2008 | WO |
WO 2008122006 | Oct 2008 | WO |
WO 2009023247 | Feb 2009 | WO |
WO 2009049282 | Apr 2009 | WO |
WO 2009094646 | Jul 2009 | WO |
WO 2009141769 | Nov 2009 | WO |
WO 2011043645 | Apr 2011 | WO |
WO 2011127379 | Oct 2011 | WO |
WO 2011136778 | Nov 2011 | WO |
WO 2012075166 | Jun 2012 | WO |
WO 2012088471 | Jun 2012 | WO |
WO 2012101286 | Aug 2012 | WO |
WO 2012106706 | Aug 2012 | WO |
WO 2012155056 | Nov 2012 | WO |
WO 2013025639 | Feb 2013 | WO |
WO 2013064804 | May 2013 | WO |
WO 2014070799 | May 2014 | WO |
WO 2014100658 | Jun 2014 | WO |
WO 2015109251 | Jul 2015 | WO |
WO 2015110327 | Jul 2015 | WO |
WO 2015136564 | Sep 2015 | WO |
WO 2015138608 | Sep 2015 | WO |
WO 2015171778 | Nov 2015 | WO |
WO 2016089706 | Jun 2016 | WO |
WO 2016123144 | Aug 2016 | WO |
WO 2016162298 | Oct 2016 | WO |
WO 2016191127 | Dec 2016 | WO |
WO 2017048929 | Mar 2017 | WO |
WO 2017048931 | Mar 2017 | WO |
WO 2017050781 | Mar 2017 | WO |
WO 2017060017 | Apr 2017 | WO |
WO 2017070391 | Apr 2017 | WO |
WO 2017151441 | Sep 2017 | WO |
WO 2017151716 | Sep 2017 | WO |
WO 2017151963 | Sep 2017 | WO |
WO 2017153077 | Sep 2017 | WO |
WO 2018136901 | Jul 2018 | WO |
Entry |
---|
Afzal, et al., “Use of Earth's Magnetic Field for Mitigating Gyroscope Errors Regardless of Magnetic Perturbation,” Sensors 2011, 11, 11390-11414; doi:10.3390/s111211390, 25 pp. published Nov. 30, 2011. |
Andraos et al., “Sensing your Orientation” Address 2007, 7 pp. |
Arms, S.W., “A Vision for Future Wireless Sensing Systems,” 44 pp., 2003. |
Bao, et al., “A Novel Map-Based Dead-Reckoning Algorithm for Indoor Localization”, J. Sens. Actuator Networks, 2014, 3, 44-63; doi:10.3390/jsan3010044, 20 pp., Jan. 3, 2014. |
Benbasat et al., “An Inertial Measurement Framework for Gesture Recognition and Applications,” I. Wachsmuth and T. Sowa (Eds.): GW 2001, Springer-Verlag Berlin Heidelberg, 12 pp., 2002. |
Bergamini et al., “Estimating Orientation Using Magnetic and Inertial Sensors and Different Sensor Fusion Approaches: Accuracy Assessment in Manual and Locomotion Tasks”, Oct. 2014, 18625-18649. |
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 6 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/. |
Brunet et al., “Uncalibrated Stereo Vision,” A CS 766 Project, University of Wisconsin—Madison, 13 pp, Fall 2004, http://pages.cs.wisc.edu/˜chaol/cs766/. |
Desjardins, et al. “Epidural needle with embedded optical fibers for spectroscopic differentiation of tissue: ex vivo feasibility study”, Biomedical Optics Express, vol. 2(6): pp. 1-10. Jun. 2011. |
“EPGL Medical Invents Smart Epidural Needle, Nerve Ablation and Trigger Point Treatment Devices: New Smart Medical Devices Will Give Physicians Advanced Situational Awareness During Critical Procedures,” EPGL Medical, dated Aug. 12, 2013, in 3 pages. Retrieved from http://www.prnewswire.com/news-releases/epgl-medical-invents-smart-epidural-needle-nerve-ablation-and-trigger-point-treatment-devices-219344621.html#. |
“The EpiAccess System: Access with Confidence”, EpiEP Epicardial Solutions, dated 2015, in 2 pages. |
Esteve, Eric, “Why do you need 9D Sensor Fusion to support 3D orientation?”, 5 pp., Aug. 23, 2014, https://www.semiwiki.com/forum/content/3794-why-do-you-need-9d-sensor-fusion-support-3d-orientation.html. |
Grenet et al., “spaceCoder: a Nanometric 3D Position Sensing Device,” CSEM Scientific & Technical Report, 1 page, 2011. |
Helen, L., et al. “Investigation of tissue bioimpedance using a macro-needle with a potential application in determination of needle-to-nerve proximity”, Proceedings of the 8th International Conference on Sensing Technology, Sep. 2-4, 2014, pp. 376-380. |
Inition. Virtual Botox: Haptic App Simulated Injecting The Real Thing. Retrieved from http://inition.co.uk/case-study/virtual-botox-haptic-app-simulates-injecting-real-thing. |
International Search Report and Written Opinion for Appl. No. PCT/US2017/020112, dated Jun. 9, 2017, 13 pages. |
Kalvøy, H., et al., “Detection of intraneural needle-placement with multiple frequency bioimpedance monitoring: a novel method”, Journal of Clinical Monitoring and Computing, Apr. 2016, 30(2):185-192. |
Madgwick, Sebastian O.H., “An efficient orientation filter for inertial and inertial/magnetic sensor arrays,” 32 pp., Apr. 30, 2010. |
Microsoft, “Integrating Motion and Orientation Sensors,” 85 pp., Jun. 10, 2013. |
Miller, Nathan L., Low-Power, Miniature Inertial Navigation System with Embedded GPS and Extended Kalman Filter, MicroStrain, Inc., 12 pp., 2012. |
MPU-9150 9-Axis Evaluation Board User Guide, Revision 1.0, 15 pp., May 11, 2011, http//www.invensense.com. |
MPU-9150, Register Map and Descriptions, Revision 4.2, 52 pp., Sep. 18, 2013, http//www.invensense.com. |
MPU-9150, Product Specification, Revision 4.3, 50 pp., Sep. 18, 2013, http://www.invensense.com. |
PST Iris Tracker, Plug and Play, 3D optical motion tracking specifications, 1 p., Dec. 4, 2014, www.pstech.com. |
PST Iris Tracker, Instruction Manual, 3D optical motion tracking specifications, 42 pp., Jul. 27, 2012, www.pstech.com. |
Struik, Pieter, “Ultra Low-Power 9D Fusion Implementation: A Case Study,” Synopsis, Inc., 7 pp., Jun. 2014. |
Sutherland, et al. “An Augmented Reality Haptic Training Simulator for Spinal Needle Procedures,” IEEE, 2011. |
Varesano, Fabio, “Prototyping Orientation and Motion Sensing Objects with Open Hardware,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/˜varesano, Feb. 10, 2013, 4 pp. |
Varesano, Fabio, “FreelMU: An Open Hardware Framework for Orientation and Motion Sensing,” Dipartimento di Informatica, Univ. Torino, http://www.di.unito.it/˜varesano, Mar. 20, 2013, 10 pp. |
“B-Smart disposable manometer for measuring peripheral nerve block injection pressures”, Bbraun USA, 2016. |
“A beginner's guide to accelerometers,” Dimension Engineering LLC, accessed Jul. 11, 2018, in 2 pages, https://www.dimensionengineering.com/info/accelerometers. |
“Accelerometer: Introduction to Acceleration Measurement,” Omega Engineering, Sep. 17, 2015, 3 pages, https://www.omega.com/prodinfo/accelerometers.html. |
“About the Journal”, J. Dental Educ., AM. Dental Educ. Ass'n, 2019, http://www.jdentaled.org/content/about-us (last visited Oct. 9, 2019). |
“Article Information”, Wierinck et al., “Expert Performance on a Virtual Reality Simulation System”, J. Dental Educ., AM. Dental Educ. Ass'n, 2019, http://www.jdental.org/content/71/6/759/tab-article-info (last visited Oct. 9, 2019). |
Begg et al., “Computational Intelligence for Movement Sciences: Neural Networks and Other Emerging Techniques”, Idea Group Inc (IGI), 2006. |
Comsa et al., “Bioluminescence imaging of point sources implants in small animals post mortem: evaluation of a method for estimating source strength and depth”, Phys. Med. Biol., Aug. 2007, vol. 52, No. 17, pp. 5415-5428. |
Correa et al., “Virtual Reality Simulator for Dental Anesthesia Training in the Inferior Alveolar Nerve Block,” Journal of Applied Oral Science, vol. 25, No. 4, Jul./Aug. 2017, pp. 357-366. |
Garg et al., “Radial Artery cannulation-Prevention of pain and Techniques of cannulation: review of literature,” The Internet Journal of Anesthesiology, vol. 19, No. 1, 2008, in 6 pages. |
Hotraphinyo et al., “Precision measurement for microsurgical instrument evaluation”, Conference Proceedings of the 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2001, vol. 4, pp. 3454-3457. |
International Preliminary Report on Patentability for Appl. No. PCT/US2017/020112, dated Sep. 13, 2018, 8 pages. |
Jafarzadeh et al., “Design and construction of an automatic syringe injection pump,” Pacific Science Review A: Natural Science and Engineering 18, 2016, in 6 pages. |
Kettenbach et al., “A robotic needle-positioning and guidance system for CT-guided puncture: Ex vivo results,” Minimally Invasive Therapy and Allied Technologies, vol. 23, 2014, in 8 pages. |
Krupa et al., “Autonomous 3-D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing”, IEEE Trans. Robotics and Automation, 2003, vol. 19, pp. 842-853. |
Ladjal, et al., “Interactive Cell Injection Simulation Based on 3D Biomechanical Tensegrity Model,” 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, in 9 pages. |
Lee et al., “An Intravenous Injection Simulator Using Augmented Reality for Veterinary Education and its Evaluation,” Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry, Dec. 2-4, 2012, in 4 pages. |
Lee et al., “Augmented reality intravenous injection simulator based 3D medical imaging for veterinary medicine,” The Veterinary Journal, 2013, vol. 196, No. 2, pp. 197-202. |
Liu et al. “Robust Real-Time Localization of Surgical Instruments in the Eye Surgery Simulator (EyeSi)”, Signal and Image Processing, 2002. |
Merril et al., “The Ophthalmic Retrobulbar Injection Simulator (ORIS): An Application of Virtual Reality to Medical Education”, Proc. Ann. Symp. Comput. Med. Care, 1992, pp. 702-706. |
Mukherjee et al., “A Hall Effect Sensor Based Syringe Injection Rate Detector”, IEEE 2012 Sixth Int'l Conf on Sensing Technol.(ICST), Dec. 18-21, 2012. |
Petition for Inter Partes Review of U.S. Pat. No. 9,792,836, Pursuant to 35 U.S.C. §§ 311-19, 37 C.F.R. § 42.100 ET SEQ., IPR2020-00042, dated Oct. 11, 2019. |
Patterson et al., “Absorption spectroscopy in tissue-simulating materials: a theoretical and experimental study of photon paths”, Appl. Optics, Jan. 1995, vol. 34, No. 1, pp. 22-30. |
Poyade et al., “Development of a Haptic Training Simulation for the Administration of Dental Anesthesia Based Upon Accurate Anatomical Data,” Conference and Exhibition of the European Association of Virtual and Augmented Reality, 2014, in 5 pages. |
Quio, “Smartinjector,” available at https://web.archive.org/web/20161017192142/http://www.quio.com/smartinjector, Applicant believes to be available as early as Oct. 17, 2016, in 3 pages. |
State Electronics, “Sensofoil Membrane Potentiometer,” Product Information and Technical Specifications, in 6 pages. |
Truinject Corp., “Smart Injection Platform,” http://truinject.com/technology/, printed Jan. 13, 2018, in 3 pages. |
Van Sickle et al., “Construct validation of the ProMIS simulator using novel laparoscopic suturing task”, Surg Endosc, Sep. 2005, vol. 19, No. 9, pp. 1227-1231. |
Wierinck et al., “Expert Performance on a Virtual Reality Simulation System”, 71 J. Dental Educ., Jun. 2007, pp. 759-766. |
Wik et al., “Intubation with laryngoscope versus transillumination performed by paramedic students on manikins and cadavers”, Resuscitation, Jan. 1997, vol. 33, No. 3, pp. 215-218. |
Number | Date | Country | |
---|---|---|---|
20170254636 A1 | Sep 2017 | US |
Number | Date | Country | |
---|---|---|---|
62302328 | Mar 2016 | US |