Ultrasound trainer with internal optical tracking

Information

  • Patent Grant
  • Patent Number
    11,495,142
  • Date Filed
    Thursday, January 30, 2020
  • Date Issued
    Tuesday, November 8, 2022
Abstract
A training system to teach use of an ultrasound probe, the training system having a chamber defining an orifice, a shaft insertable into the orifice of the chamber, a marker positioned on the shaft at a distal end, a camera positioned to view the marker when inserted inside the chamber, and a processor operatively connected to the camera for processing a position and an orientation of the shaft based on the marker. The system provides a method for visualizing movement of the shaft from inside the chamber.
Description
BACKGROUND

Currently available solutions that deliver endolumenal ultrasound simulation rely on inertial tracking or alternative motion-sensing technologies that are excessively complex, physically large, and expensive. These motion-sensing options are not practical for individual users because of their large form factor and cost, thereby limiting ultrasound training options. The proposed invention delivers six-degree-of-freedom (6-DOF) simulated endolumenal ultrasound probe movement in a compact form factor, providing a scalable training solution for individual users.


SUMMARY

The present invention is directed towards a training system to teach use of a medical device, such as an ultrasound probe. The training system comprises a chamber defining an orifice; a shaft insertable into the orifice of the chamber, the shaft having a proximal end and a distal end; a marker positioned on the shaft at the distal end; a camera positioned to view the marker when inserted inside the chamber; and a processor operatively connected to the camera for processing a position and an orientation of the shaft based on the marker.


In some embodiments, the system may further comprise a motion limiter connected to the chamber at the orifice.


In some embodiments, the system may further comprise a light source to illuminate the chamber.


In some embodiments, the chamber mimics a body or a body part.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a schematic diagram of an embodiment of the present invention.



FIG. 2 is a schematic diagram of an embodiment of the present invention.



FIG. 3 is a flow diagram of an embodiment of the present invention.



FIG. 4 is a perspective view of an embodiment of the present invention.



FIG. 5 is a perspective view of an embodiment of the present invention.



FIG. 6 is a perspective view of an embodiment of the present invention.



FIG. 7 is a cross-sectional view of a chamber taken at plane A shown in FIG. 6.



FIG. 8 is the cross-sectional view shown in FIG. 7 showing movement of the shaft (as indicated by the shafts drawn in broken lines).



FIG. 9 is the cross-sectional view shown in FIG. 7 showing insertion and withdrawal movement of the shaft (as indicated by the shafts drawn in broken lines).



FIG. 10 is the cross-sectional view shown in FIG. 7 showing movement of the shaft in a chamber containing filler material.



FIG. 11 is a cross-sectional view of a chamber mimicking a human head.



FIG. 12 is a cross-sectional view of a chamber having an incision.





DETAILED DESCRIPTION OF THE INVENTION

The detailed description set forth below in connection with the appended drawings is intended as a description of presently-preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention.


The invention of the present application is a training system 100 for the mastery of ultrasound procedures, including, but not limited to, endolumenal (or endoluminal) ultrasound procedures, also known as endoscopic ultrasound (EUS) medical procedures, such as transvaginal sonography (TVS) or OB-GYN ultrasonography, rectal endoscopic sonography (RES), transesophageal echocardiogram (TEE), endobronchial ultrasound, intestinal ultrasound, intravascular ultrasound (IVUS), or similar diagnostic techniques where an imaging probe is inserted into a bodily lumen of a human or animal subject.


With reference to FIGS. 1-3, in the preferred embodiment, the training system 100 comprises: a chamber 1 defining an orifice 6, a shaft 2 insertable into the chamber 1 through the orifice 6, a marker 5 on the shaft 2, a camera 4 configured to view inside the chamber 1 towards the orifice 6, and a processor 3 operatively connected to the camera 4 to process information. In some embodiments, a motion limiter 9 can be attached to the chamber 1 at the orifice 6. In some embodiments, a light source 8 can be configured to illuminate inside the chamber 1.


The marker 5 is an indicium endowed with a distinctive appearance and a known geometric structure (shape) that can be detected by the camera 4. For example, the indicium may be a printed pattern or other marking applied to shaft 2. The marker 5 can be affixed rigidly to one end of shaft 2 by way of an adhesive, other bonding solution, or any other type of fastener, or may be printed directly on the surface of shaft 2. In some embodiments, the marker 5 may be formed directly into shaft 2 either during the manufacturing process of the shaft 2 or as a post-production modification.


The shaft 2 itself may be constructed of a rigid material or a flexible material, as long as the flexure of the shaft 2 does not cause significant distortion of the appearance of the marker 5. The shaft 2 has a handle 12 and a distal end 10 opposite the handle 12 to mimic typical imaging probes, such as ultrasound probes. The distal end 10 of shaft 2 bears the marker 5. As such, the distal end 10 is insertable into the chamber 1 through the orifice 6. The distal end 10 of shaft 2 where the marker 5 is located may be referred to as the “tip” of shaft 2. Shaft 2 emulates an imaging probe being inserted into a bodily lumen. The tip of shaft 2 emulates the location of the detector or transducer in a real endolumenal imaging probe, such as a transvaginal ultrasound probe, a transesophageal ultrasound probe, or an intravascular ultrasound probe. In order to increase the realism of the endolumenal ultrasound trainer, the preferred embodiment molds the shaft 2 to resemble a particular type of endolumenal ultrasound probe, and the outside of the chamber 1 to resemble a section of the human body or the body of an animal. One may also choose to affix a marker 5 to a real clinical probe, while ensuring that the probe, which may be made of a reflective material, does not disturb the view of the optical marker 5 by the camera 4 through glare or other undesirable optical artifacts.


In some embodiments, shaft 2 may have a tip 10 that can be bent, through flexing or articulation, independently from the main shaft 2 as shown in FIG. 4. Marker 5 may be applied to the tip 10. The shaft 2 may have one or more controllers 11 on its exterior to mechanically alter the position or orientation of the steerable tip (and thus the marker 5 affixed on it). Common controllers 11 known to those skilled in the art may include knobs, joysticks, dials, push buttons, capacitive buttons, or any other tactile or electronic control system. The motion of the controller 11 may be transmitted directly to the tip 10 by an intermediate mechanical component, or the tactile control may send a signal to a separate actuator to steer the tip 10 in response to the operator's input. Alternatively, the controller 11 may be part of the user interface of an external device, which in turn will send a signal to the shaft 2 and actuate the tip 10. The controller 11 may also be a virtual element of a graphical user interface (GUI) displayed on a screen that may be part of the shaft 2 itself or part of an external computing device.


In some embodiments, at least one marker 5 may be placed on a rigid section of shaft 2 upstream of the tip 10 (i.e., towards the handle), and at least one other marker 5 may be placed on the tip 10. An algorithm can analyze independent observations of each marker 5 and compute the position and orientation of tip 10 and the rigid section of shaft 2 separately.


In some embodiments, if the controller 11 can transmit its state to the processor 3, one can simulate the presence of a steerable tip 10 using a single rigid tubular shaft 2 bearing one or more markers 5, without the additional complexity of a mechanically steerable tip. In that case, the processor 3 computes the location of the imagined steerable extension by combining the optical tracking information with the state of the controllers 11, without requiring a physical mechanism to articulate the tip 10 of the shaft 2.
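

As a concrete illustration of this combination, the sketch below composes an optically tracked shaft pose with a knob-reported flexion angle to place the imagined steerable extension. It is a minimal sketch only, assuming Python with NumPy, a pose expressed as a rotation matrix and a translation vector, and illustrative names and dimensions that are not taken from the patent.

    import numpy as np

    def virtual_tip_pose(R_shaft, t_shaft, flex_angle, tip_length=0.03):
        # R_shaft: 3x3 rotation of the tracked rigid shaft end (world frame).
        # t_shaft: 3-vector position of the tracked rigid shaft end (meters).
        # flex_angle: knob-reported bend about the shaft's local x-axis (radians).
        # tip_length: assumed length of the imagined steerable extension (meters).
        c, s = np.cos(flex_angle), np.sin(flex_angle)
        R_flex = np.array([[1.0, 0.0, 0.0],
                           [0.0, c, -s],
                           [0.0, s, c]])  # bending about the local x-axis
        R_tip = R_shaft @ R_flex
        t_tip = t_shaft + R_tip @ np.array([0.0, 0.0, tip_length])
        return R_tip, t_tip  # pose of the simulated steerable tip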


The chamber 1 is hollow with at least one inner wall 7 defining an interior space. For example, a single curved wall 7 can be used to create a cylindrical or spherical shape, multiple flat walls 7 can be used to create sidewalls, ceilings, and floors, or a combination of curved and flat walls can be used to create other three-dimensional cavities. The inner wall 7 should be constructed with an opaque material that limits the transmission of external light into chamber 1. Preferably, the inside of chamber 1 is constructed of a material that is matte in appearance or alternatively is coated with a substance that reduces its optical reflectivity. Preferably, the distal end 10 of shaft 2 that goes inside chamber 1 should have an appearance that is matte and distinct in color and texture from the appearance of marker 5 so as to create a detectable contrast. Chamber 1 may mimic various body parts where physicians are likely to probe with an imaging probe, such as the lower body (see FIGS. 5-10 and 12), including the vaginal and rectal areas, or the head and neck (see FIG. 11). In some embodiments, the chamber 1 may be a full-sized manikin commonly used for medical training.


In some embodiments, a motion limiter 9 is formed on or attached to the chamber 1 at the orifice 6 and spans the circumference of the orifice, adding thickness to its outer edge. The motion limiter 9 is positioned in such a way as to exert mechanical resistance against shaft 2, thus constraining its freedom of motion when the shaft 2 is inserted into chamber 1 and manipulated by the user. Portions of shaft 2 between the distal end 10 and the handle 12 may be configured to mate with a motion limiter 9, which mechanically constrains shaft 2 to a desired range of motion. The motion limiter 9 may have a flexible rubber trim whose thickness provides tight contact with shaft 2, limiting its insertion via friction and its lateral motion by means of its stiffness against deformation. In this embodiment, the contact between the motion limiter 9 and the shaft 2 should be tight enough that the user cannot change the orientation of the shaft 2 without deforming the motion limiter 9. Therefore, in this embodiment there is a direct correlation between the elasticity (the ability to deform) and the coefficient of friction of the motion limiter 9 on the one hand and the haptics of the shaft 2 on the other. Alternatively, the motion limiter 9 may be a cone or other revolved surface of a rigid or semi-rigid material whose profile is calculated so as to constrain the lateral motion of shaft 2 within a desired solid angle.


The camera 4 faces inside the opaque chamber 1 in such a way that it maintains a clear view of the marker 5 for the entire range of motion of shaft 2 when inserted into the chamber 1. In some embodiments, the camera 4 may be inside the chamber 1. If a single camera 4 cannot observe the marker 5 for the entire range of motion of the shaft 2, the system can employ multiple cameras 4 without violating the spirit of the invention. In some embodiments, the shaft 2 can have multiple distinct markers 5 to ensure that at least one of the markers 5 is always visible to at least one of the cameras 4 for the entire range of motion of the shaft 2; each marker 5 may therefore be visually distinct from the others. In some embodiments, a marker 5 may also be placed on the chamber wall 7 to serve as a fixed reference point for determining movement of the marker 5 on the shaft 2.


The camera 4 may be operatively connected to a processor that analyzes the visual information captured by the camera 4. The connection may be established with a wired or wireless connection using either a standard protocol, such as USB, Thunderbolt, Bluetooth or Wi-Fi, or a custom protocol as is well known to those skilled in the art. The processor may be a microcontroller placed inside the chamber 1, it may be placed outside the chamber 1 at a remote location, or it may be part of a more complex computing device.


We refer to the combination of the frame data from the camera 4 and the algorithm that runs on the processor as “optical tracking”.


The optical tracking described in this invention allows for full 6 degrees-of-freedom (6-DOF) tracking (rotation in 3 spatial dimensions and position in 3 spatial dimensions) of the shaft 2. However, a designer may employ different types of motion limiters 9 to further constrain the motion to, for example, rotation only (3-DOF), rotation and planar motion (5-DOF), rotation and penetration (4-DOF), roll rotation and penetration (2-DOF).
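

For example, when a tight motion limiter 9 enforces the roll-and-penetration case, the full optical pose can simply be projected onto those two degrees of freedom. The sketch below is a minimal illustration, assuming NumPy, a pose given as a rotation matrix R and translation t, and an insertion channel aligned with the world z-axis (an assumption for the example, not a requirement of the system).

    import numpy as np

    CHANNEL_AXIS = np.array([0.0, 0.0, 1.0])  # assumed insertion axis (world z)

    def reduce_to_2dof(R, t):
        # Penetration: displacement of the marker along the channel axis.
        depth = float(t @ CHANNEL_AXIS)
        # Roll: angle of the marker's local x-axis measured about that axis.
        x_world = R @ np.array([1.0, 0.0, 0.0])
        roll = np.arctan2(x_world[1], x_world[0])
        return depth, roll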


The camera 4 may operate in the visible spectrum or the infrared spectrum, and may support multiple colors or be monochromatic. One skilled in the art understands that the appearance of the marker 5, the opacity of the chamber 1, and the internal coating of the chamber 1 must be chosen in a way that conforms to the chosen spectrum of light. Furthermore, for the purpose of this invention, optical tracking can be achieved adequately if the optical assembly of the camera 4 has a fixed focus, manually adjustable focus, electronically adjustable focus, auto-focus, or any other variant of varifocal lens that can resolve the pattern on the marker 5 with sufficient visual acuity. The image sensor of camera 4 can employ either a rolling shutter or a global shutter.


In some embodiments, the camera 4 may be able to measure depth (e.g., RGBD camera) directly by way of stereo vision, time-of-flight imaging, structured light, or other operating principle known to those skilled in the art. Alternatively, the camera 4 may be a device specifically designed to track the three-dimensional position and orientation of an elongated object such as the commercially available LEAP Motion controller. This embodiment enables applications where the shaft 2 can be an arbitrary elongated object and does not necessarily require modification by affixing a marker 5 to it.


In some embodiments, a light source 8 may also be directed towards the inside of the hollow chamber 1. For example, a light source 8 may be mounted on a mechanical assembly of the camera, or mounted on one of the walls 7 of the chamber 1, or embedded into or attached behind the walls 7 for backlighting. The light source 8 is designed to provide controlled illumination of the marker 5 for the entire range of motion of the shaft 2. In some cases, the designer may employ more than a single light source 8 to ensure uniform illumination of the marker 5 and/or elimination of shadows for the entire range of motion of shaft 2.


In the preferred embodiment, the system may be combined with an external computing device 3 that runs a software ultrasound simulator similar to, but not limited to, The SonoSim® Ultrasound Training Solution. An ultrasound simulator comprises at least a case library that contains one or more medical cases of interest, a user interface, a variable image that resembles the appearance of an ultrasound image or other clinical imaging modality, and optionally a virtual representation of a patient along with a visualization of the imaging device being inserted into a bodily lumen. The system described in this invention is connected by wire or wirelessly to a computing device 3 that runs the ultrasound simulator. The computing device 3 may either receive raw frame data directly from the camera 4 and run the algorithm to compute the position and orientation of the tip 10 of the shaft 2, or it may receive position and orientation information of the tip 10 of the shaft 2 already computed by the system through a processor 3 embedded in the apparatus itself.
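

The description leaves the transport unspecified beyond wired or wireless. Purely as an illustration of the second mode (pose already computed on the embedded processor 3), the sketch below sends a pose to the simulator host as a JSON datagram over UDP; the message format, host, and port are assumptions, not part of the invention.

    import json
    import socket

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_pose(position, quaternion, addr=("127.0.0.1", 9999)):
        # One datagram per tracked frame: tip position (meters) and
        # orientation (unit quaternion), both in the camera frame.
        msg = json.dumps({"pos": [float(x) for x in position],
                          "quat": [float(x) for x in quaternion]})
        sock.sendto(msg.encode("utf-8"), addr)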


The computing device 3 transmits the position and orientation of the tip 10 of the shaft 2 to the ultrasound simulator and, in turn, the ultrasound simulator updates the visualization to display an ultrasound image 20 that corresponds to the exact spatial configuration of shaft 2 as if it were a real endolumenal probe inserted into a real patient as shown in FIG. 5.


Additionally, if shaft 2 is endowed with controllers 11, the operator may alter the state of the ultrasound simulator by interacting with them. In that case, the shaft 2 must be able to transmit the state of the controllers 11 to the computing device 3. For example, the operator may turn a knob to steer the tip 10 of the probe and the corresponding simulated ultrasound image 20 in the ultrasound simulator, or may press a button to switch the selected case from the available case library.
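

A hypothetical dispatch for such controller events might look like the sketch below; the simulator methods and event fields are illustrative stand-ins, since the description does not define a software interface.

    def handle_controller_event(simulator, event):
        # Knob motion steers the simulated flexible tip.
        if event["type"] == "knob":
            simulator.set_tip_flexion(event["angle"])
        # A button press cycles through the loaded case library.
        elif event["type"] == "button" and event["name"] == "next_case":
            simulator.load_next_case()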


In use, the training system monitors movement of a marker on a distal end of a shaft with a camera, wherein the distal end of the shaft is inserted inside a chamber through an orifice of the chamber, and determines a position and orientation of the shaft based on the movement of the marker with a processor operatively connected to the camera. In some embodiments, movement of the shaft is restricted with a motion limiter. The position and orientation of the shaft can be calculated using an algorithm or a look-up table. The movable shaft 2 can be partially inserted into the orifice 6 of the chamber 1 where the optical camera 4 and light source 8 reside. Once inserted, movement of the part of the movable shaft 2 that is outside the chamber 1 (i.e., the handle 12) results in movement of the part of the movable shaft 2 that is inside the opaque chamber 1 and that bears the marker 5 (i.e., the tip 10). The movable shaft 2 may be guided (constrained) by the motion limiter 9. The light source 8 illuminates the chamber 1 to allow movement of the marker 5 to be captured by the camera 4. The images of the marker 5 captured by camera 4 can be processed by a computer processor 3 to calculate the corresponding movement of the movable shaft 2.


When an operator moves the shaft 2 by manipulating it from the outside of the chamber 1, the shaft 2 transmits the motion of the operator to the optical marker 5 that is rigidly attached to the end of the shaft 2 hidden inside the chamber 1.


Camera 4 observes the marker 5 as it moves, and transmits its frame data to the processor 3. The processor 3 employs an algorithm that correlates the observations of the marker 5 and its perspective distortions to the position and orientation of shaft 2. Further processing by means of mathematical transformations known to those skilled in the art allows the algorithm to determine the exact position and orientation of the distal tip 10 of the shaft 2 that is hidden inside the chamber 1 in three-dimensional space when in use.
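

One standard way to implement such an algorithm is a perspective-n-point solve: given a calibrated camera and a planar marker of known size, the marker's four observed image corners determine its pose, and a fixed marker-to-tip offset then yields the hidden tip. The sketch below assumes Python with OpenCV and NumPy and a square marker; it is one well-known technique under stated assumptions, not necessarily the inventors' method, and the dimensions are illustrative.

    import cv2
    import numpy as np

    MARKER_SIDE = 0.01  # assumed marker side length, meters
    # Marker corners in the marker's own frame, ordered for SOLVEPNP_IPPE_SQUARE.
    OBJ_PTS = (MARKER_SIDE / 2) * np.array(
        [[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]], dtype=np.float32)
    # Assumed rigid offset from the marker frame to the emulated transducer tip.
    T_MARKER_TO_TIP = np.array([0.0, 0.0, 0.015])

    def shaft_tip_pose(img_corners, camera_matrix, dist_coeffs):
        # Invert the perspective distortion of the four observed corners.
        ok, rvec, tvec = cv2.solvePnP(
            OBJ_PTS, img_corners.astype(np.float32), camera_matrix,
            dist_coeffs, flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)                # marker rotation, camera frame
        tip = tvec.ravel() + R @ T_MARKER_TO_TIP  # tip position, camera frame
        return R, tip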


EXAMPLES
Pelvic Trainer

In one embodiment, the system emulates transvaginal sonography (TVS) and/or rectal endoscopic sonography (RES) as shown in FIGS. 5-10 with the shaft 2 mimicking a probe. The endolumenal ultrasound trainer comprises an external mold that mimics the shape of a pelvis and hides an internal hollow chamber 1; a shaft 2 that resembles a transvaginal/transrectal probe; an orifice 6 that allows insertion of the probe 2 into the internal hollow chamber 1; one or more optical markers 5 affixed to the tip 10 of the probe 2; one or more optical cameras 4; and one or more sources of light 8. The optical camera 4 acquires observations of the optical marker 5 and sends them directly to an external computing device 3 that hosts an ultrasound simulator capable of displaying transvaginal/transrectal sonographic images 20. The operator manipulates the probe as they would in a real OB-GYN or RES clinical session with a real patient. In response to the operator's motions, the ultrasound simulator on computing device 3 displays an ultrasound image 20 that realistically emulates the image the operator would see if they placed the transvaginal/transrectal probe at the same position and orientation inside the body of a patient.


Marker 5 may be a small disc with a pattern portraying two fully overlapping but non-concentric circles of contrasting colors. The offset between the circles breaks the symmetry of the pattern and ensures that each observation of the optical marker 5 determines the pose of the probe unambiguously. Alternatively, the marker 5 may be a rectangular tile with a pattern portraying a collection of squares of contrasting color arranged in a non-symmetric fashion.
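

The rectangular-tile variant behaves like a standard square fiducial, so, purely as an assumed implementation, it could be detected with an off-the-shelf fiducial library such as OpenCV's ArUco module (4.7+ API shown; the dictionary choice is illustrative, not specified by the patent).

    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary,
                                       cv2.aruco.DetectorParameters())

    def detect_tile(gray_frame):
        # Returns the four image corners of the first detected tile, or None.
        corners, ids, _rejected = detector.detectMarkers(gray_frame)
        return corners[0].reshape(4, 2) if ids is not None else None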


The algorithm running on the processor first isolates the pixels corresponding to optical marker 5 based on their distinct appearance. The algorithm then analyzes the pixels corresponding to marker 5 to determine how the lens of camera 4 has applied a perspective distortion to the observed shape of the marker 5. Given a set of camera parameters known in advance, the perspective distortion alters the size and shape in ways that can be predicted accurately.
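

A minimal sketch of this isolation step, assuming OpenCV, a marker printed in a single distinctive hue against the matte chamber interior, and an illustrative threshold band that would need tuning to the actual marker:

    import cv2
    import numpy as np

    def isolate_marker_pixels(frame_bgr, lo=(100, 80, 80), hi=(130, 255, 255)):
        # Keep only pixels whose hue/saturation/value fall in the marker's
        # assumed (blue-ish) color band.
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
        # A small morphological opening suppresses specular speckle.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # The largest connected blob is taken to be the marker.
        return max(contours, key=cv2.contourArea) if contours else None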


In general, if the marker 5 is designed appropriately, there is only one possible position and orientation of the marker 5 in three-dimensional space that matches the view of the marker 5 seen by the camera 4. The algorithm calculates this position and orientation using techniques known to those skilled in the art.


The shape of orifice 6 and the channel that guides probe 2 into chamber 1 act as a motion limiter 9 that mechanically constrains probe 2. The orifice 6 and channel may be replaceable to simulate different kinds of medical procedures. For example, TVS procedures may require less constrained probe rotations than RES procedures. An extended motion limiter 9 can be inserted inside the chamber 1 to mimic the internal anatomy of the cavity (patient-specific or procedure-specific). Furthermore, one may employ one or more filler materials 20 to fill the empty volume inside the chamber 1 to provide physical resistance to the motion of the probe 2 and emulate the haptic feedback of inserting a transvaginal or transrectal probe inside the body cavity. The filler material 20 may have the consistency of a deformable solid or a viscous fluid. For example, the filler material 20 can be an optically transparent and deformable material. In some embodiments, the filler material 20 can be a plurality of small, loose particles packed inside the chamber 1.


TEE Trainer

In one embodiment, the system emulates transesophageal echocardiogram (TEE) sonography as shown in FIG. 11 with the shaft 2 mimicking a probe. The trainer comprises a chamber 1 that mimics the shape of a human head, with a cavity that corresponds to the inside of the mouth. Shaft 2 resembles a real transesophageal probe and is inserted into chamber 1 through an orifice 6 at the mouth. A source of light 8 and a camera 4 are located inside the chamber wall 7. Probe 2 has control knobs 11 that mimic the real steering controls of a TEE probe. Probe 2 has a flexible tip 10 that is actuated mechanically in response to the control knobs 11, and one or more optical markers 5 are affixed to it. The position of the markers 5 informs the penetration depth (advancing and withdrawing) of probe 2 inside the mouth. If more than one marker 5 is used, the relative position of the markers 5 informs the angle at which the operator has steered the flexible tip 10 via the knobs (rotate back/forward, flex to left/right, anteflex/retroflex steering). Camera 4 observes marker 5 and, through the processor, transmits the computed position and orientation of probe 2 to computing device 3 running a simulator capable of visualizing ultrasound images of TEE sonography.
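

Under the two-marker reading, both quantities reduce to simple vector arithmetic on the tracked poses. The sketch below is illustrative only, assuming NumPy, unit axis vectors extracted from each marker's tracked rotation, and positions expressed in a common frame.

    import numpy as np

    def penetration_depth(t_marker, t_orifice, channel_axis):
        # Advance/withdraw: marker displacement projected on the channel axis.
        return float(np.dot(t_marker - t_orifice, channel_axis))

    def flexion_angle_deg(z_shaft, z_tip):
        # Steering: angle between the rigid-section axis and the tip axis,
        # each taken from its own tracked marker (unit vectors assumed).
        cosang = np.clip(np.dot(z_shaft, z_tip), -1.0, 1.0)
        return float(np.degrees(np.arccos(cosang)))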


IVUS Trainer

In one embodiment, the system emulates intravascular ultrasound (IVUS) sonography as shown in FIG. 12 with the shaft 2 mimicking a probe. The trainer comprises a chamber 1 that mimics the shape of a human body or a section of it that is relevant to a particular type of intravascular procedure. A small orifice 6 emulates a puncture into the body and an internal channel 14 leads to a small hollow chamber 1 on the inside. Probe 2 mimics the shape of a real IVUS probe and has the marker 5 attached to its tip 10. Camera 4 is located at one end of internal chamber 1 and observes the marker 5 affixed at the tip of the probe 2. Camera 4 observes the marker 5 and through the processor transmits the computed position and orientation of the probe to computing device 3 running a simulator capable of visualizing ultrasound images 20 of IVUS sonography.


The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.

Claims
  • 1. An ultrasound trainer, comprising: a) a chamber defining an orifice; b) a shaft insertable into the orifice of the chamber, the shaft having a proximal end, a distal end opposite the proximal end, a flexible tip at the distal end, and a control knob to bend the flexible tip; c) a marker positioned on the shaft at the distal end; d) a camera positioned to view inside the chamber towards the orifice; e) a processor operatively connected to the camera for processing a position and an orientation of the shaft based on the marker; f) a motion limiter connected to the chamber at the orifice; g) a filler material inside the chamber; h) a light source to illuminate the chamber, wherein the chamber mimics a body.
  • 2. An ultrasound trainer, comprising: a) a chamber defining an orifice; b) a shaft insertable into the orifice of the chamber, the shaft having a proximal end and a distal end opposite the proximal end; c) a marker positioned on the shaft at the distal end; d) a camera positioned to view inside the chamber; e) a processor operatively connected to the camera for processing a position and an orientation of the shaft based on the marker; f) a motion limiter connected to the chamber at the orifice; g) a filler material inside the chamber; and h) a light source to illuminate the chamber, wherein the shaft comprises a flexible tip at the distal end, wherein the shaft comprises a controller to bend the flexible tip.
  • 3. The ultrasound trainer of claim 2, wherein the chamber mimics a body.
  • 4. The ultrasound trainer of claim 3, wherein the body is a vaginal area.
  • 5. The ultrasound trainer of claim 3, wherein the body is a rectal area.
  • 6. The ultrasound trainer of claim 3, wherein the body is a head and neck area.
  • 7. An ultrasound trainer, comprising: a) a chamber defining an orifice; b) a shaft insertable into the orifice of the chamber, the shaft having a proximal end and a distal end opposite the proximal end; c) a marker positioned on the shaft at the distal end; d) a camera positioned to view inside the chamber; and e) a processor operatively connected to the camera for processing a position and an orientation of the shaft based on the marker, wherein the shaft comprises a flexible tip at the distal end, and wherein the shaft comprises a controller to bend the flexible tip.
  • 8. The ultrasound trainer of claim 7, further comprising a motion limiter connected to the chamber at the orifice.
  • 9. The ultrasound trainer of claim 8, further comprising a filler material inside the chamber.
  • 10. The ultrasound trainer of claim 9, further comprising a light source to illuminate the chamber.
  • 11. The ultrasound trainer of claim 7, further comprising a filler material inside the chamber.
  • 12. The ultrasound trainer of claim 7, further comprising a light source to illuminate the chamber.
  • 13. The ultrasound trainer of claim 7, wherein the chamber mimics a body.
  • 14. The ultrasound trainer of claim 13, wherein the body is a vaginal area.
  • 15. The ultrasound trainer of claim 13, wherein the body is a rectal area.
  • 16. The ultrasound trainer of claim 13, wherein the body is a head and neck area.
CROSS-REFERENCE TO RELATED APPLICATION

This patent application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/798,860, entitled “Ultrasound Trainer with Internal Optical Tracking,” filed Jan. 30, 2019, which application is incorporated in its entirety here by this reference.

Related Publications (1)
Number Date Country
20200242972 A1 Jul 2020 US
Provisional Applications (1)
Number Date Country
62798860 Jan 2019 US