Over the past decade, several companies have started commercializing compact medical devices that interface directly with smartphones, tablets, and other portable computers. Of particular interest are ultrasound probes. They plug directly into the computing device and use its processing capabilities to display high-quality diagnostic ultrasound images in real time, rivaling traditional machinery found only at large clinical institutions. By virtue of their versatility and portability, these technologies carry profound benefits for critical care. However, one major obstacle remains: mastering how to use an ultrasound transducer to diagnose a patient requires extensive training and hands-on experience. This invention introduces a new hardware accessory that allows the same system to be used for both clinical applications and self-directed training through simulation.
The invention of the present application is directed towards a method and system for combining a motion sensor with an ultrasound transducer probe and allowing it to communicate with a mobile device, so that a real ultrasound transducer probe can also be converted into a training tool. The system comprises a motion sensor encased in a housing to function as a motion sensor accessory to an ultrasound transducer probe (hereinafter “probe” or “ultrasound probe”). The motion sensor accessory can be attached to and removed from the probe, or could be permanently affixed to its exterior or interior. When the motion sensor accessory is attached to the probe and turned on, the probe is used as a training tool. When the motion sensor accessory is removed from the probe or turned off, the probe can be used as an actual ultrasound tool on real patients for actual diagnosis of a medical condition. The motion sensor accessory may be supplemented with a carrier configured for affixing the motion sensor to the probe.
Software can be downloaded onto a computing device, such as a mobile phone or smartphone, a tablet, a notebook computer, a desktop computer, an optical head-mounted display, or another device that can display images, so that the motion sensor accessory can communicate with the user's computing device. The motion sensor accessory may further comprise a communication module with a wireless interface to allow the motion sensor components to communicate with the computing device wirelessly. The required hardware and software can run simulation and training programs.
The probe may be an actual ultrasound probe capable of capturing ultrasound images. If the accessory is not permanently attached and is removable, instruction materials will direct the user to firmly affix the accessory to the existing probe using the carrier. The instructions will also guide the user to download and install the software (e.g. a mobile app) on the computing device. The mobile app may use a graphical user interface (GUI) to instruct the user on how to establish communication with the motion sensor accessory and use the system to learn how to operate an ultrasound probe with the provided simulation.
The detailed description set forth below in connection with the appended drawings is intended as a description of presently-preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention.
With reference to
The ultrasound probe 104 communicates with the ultrasound machine 102. In a version of the invention, the ultrasound probe 104 communicates with the ultrasound machine 102 through a data cable 112. In other versions, the communication may be wireless.
The ultrasound probe 104 includes an ultrasound transducer 114, ultrasound circuitry 116, a motion sensor 118 (e.g. a motion sensor that can detect orientation and position), and a probe housing 120. When the accessory is affixed permanently, the probe housing 120 encases the ultrasound transducer 114 and the motion sensor 118.
The ultrasound transducer 114 transmits acoustic waves 122 and measures the reflected acoustic waves 122 to produce a reflected wave signal 124. The ultrasound circuitry 116 receives the reflected wave signal 124 from the ultrasound transducer 114 and transmits an image signal 126 to the ultrasound machine 102. The motion sensor 118 measures the position and orientation of the ultrasound probe 104.
Preferably, the motion sensor 118 is an inertial sensor. In some embodiments, the motion sensor 118 includes an accelerometer 130, a gyroscope 132, and a magnetometer 134. The motion sensor 118 can be used to detect misalignment and provide a visual alert (for example, in conjunction with the probe indicator icon 108) or an auditory alert to the user about the reference indicator 110 alignment.
In a version of the invention, the ultrasound probe 104 also includes a sensing head 128 at which the ultrasound transducer 114 transmits acoustic waves 122 and measures reflected waves 124.
In a preferred version, the ultrasound probe 104 also includes a compression sensor 136 that measures the force 138 applied to the sensing head 128. In that version, the probe housing 120 also encases the compression sensor 136. The compression sensor 136 allows the user to investigate the elastic properties of the underlying anatomy in the simulated environment by pressing the tip of the device (for example, the sensing head 128) against a surface with varying amounts of force 138. Preferably, the compression sensor 136 is a resistive strain gauge or other mechanical means that will not interfere with the operation of the ultrasound transducer 114. If the compression sensor 136 interferes with the operation of the ultrasound transducer 114, in some versions the compression sensor 136 may be disabled mechanically when the ultrasound system 100 is operated in the standard mode. In an alternative embodiment, the ultrasound transducer 114 (which is typically built using a highly sensitive piezoelectric element) can itself be used to measure compression directly without a separate compression sensor 136. Other types of sensors can also be employed by the ultrasound probe 104, such as a temperature sensor 137.
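The force-to-image mapping described above can be illustrated with a short sketch. The following converts a resistive strain-gauge bridge reading into an estimated compression force; the quarter-bridge formula is standard, but every constant here (gauge factor, excitation voltage, fixture stiffness) is a hypothetical placeholder, not a value from the disclosure.

```python
# Illustrative sketch: estimating the force 138 applied to the sensing head
# from a strain-gauge compression sensor 136. All constants are assumed.

GAUGE_FACTOR = 2.0               # typical for metal-foil gauges (assumed)
BRIDGE_EXCITATION_V = 3.3        # bridge supply voltage (assumed)
STIFFNESS_N_PER_STRAIN = 5.0e4   # strain-to-force scale for this fixture (assumed)

def bridge_voltage_to_strain(v_out: float) -> float:
    """Quarter-bridge approximation: strain ~ 4*Vout / (GF * Vexc)."""
    return 4.0 * v_out / (GAUGE_FACTOR * BRIDGE_EXCITATION_V)

def strain_to_force(strain: float) -> float:
    """Linear elastic assumption: force proportional to strain."""
    return STIFFNESS_N_PER_STRAIN * strain

def compression_force(v_out: float) -> float:
    """Map a bridge output voltage to an applied-force estimate in newtons."""
    return strain_to_force(bridge_voltage_to_strain(v_out))
```

In the simulated environment, the resulting force value would modulate how the pre-recorded anatomy deforms under the probe tip.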
With motion-sensing technology embedded directly within the probe housing 120, the ultrasound system 100 can operate in two separate modes: a diagnostic mode that allows the user to use the ultrasound probe 104 to scan real patients using the traditional physics of ultrasound as is done currently, and a training mode that will instead allow the user to employ the same ultrasound probe 104 as a motion sensing peripheral to navigate existing patient cases, perhaps augmented with annotations 140 that help the operator expand and refine his or her knowledge of ultrasound imaging.
More specifically, an ultrasound system 100 equipped with this novel kind of ultrasound probe 104 allows the machine to provide an additional mode of operation for training (training mode). When the training mode is enabled, the user can move the ultrasound probe 104 on the patient's body, a medical mannequin, or other arbitrary surface to navigate a pre-recorded patient case. The software loaded on the ultrasound machine 102 will respond to the motion of the ultrasound transducer 114 in a simulated environment in the same manner as when operating the ultrasound machine 102 in traditional mode (standard mode) with the real physics of ultrasound. The added benefit of the training mode is that the ultrasound operator can correlate what is observed in the real patient with a large library of pre-recorded real and simulated ultrasound cases that may exhibit a wide range of known pathologies. The library of pre-recorded ultrasound cases may comprise extracted real patient ultrasound data sets (3D and 4D data), which a user can scan through using the same hand motions used by the original sonographer at the patient's bedside. The library may also provide narrated ultrasound clips (2D videos) that describe the patient findings and provide verbal and visual instruction. Furthermore, pre-recorded real and simulated ultrasound data may be augmented with additional anatomical annotations 140 that provide further insight on the details of how to use ultrasound imaging in the clinical setting. Those anatomical annotations 140 may include the labeling of pathologies or anatomical structures that are visible in the ultrasound data.
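One simple way training-mode software could map probe motion onto a pre-recorded 3D data set is nearest-slice lookup: the probe's tracked position selects the closest stored frame, so the trainee re-scans the case with hand motions like those of the original sonographer. The data layout and spacing below are assumptions for illustration only.

```python
# Minimal sketch: selecting a frame from a pre-recorded "volume" (a list of
# 2D slices) based on the probe's tracked lateral position. Toy data only.

def nearest_slice(volume, probe_x_mm, slice_spacing_mm=1.0):
    """Return the stored 2D slice closest to the probe's lateral position."""
    index = round(probe_x_mm / slice_spacing_mm)
    index = max(0, min(index, len(volume) - 1))  # clamp to the recorded extent
    return volume[index]

# A toy volume: three 2x2 slices standing in for pre-recorded frames.
volume = [
    [[0, 0], [0, 0]],
    [[1, 1], [1, 1]],
    [[2, 2], [2, 2]],
]
```

A full implementation would also use the probe's orientation to resample an oblique plane through the volume, but the position-to-frame lookup conveys the core idea.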
Furthermore, the disclosed solution can mitigate human error that arises from misalignment of the transducer's reference indicator 110 through manual and automatic misalignment detection.
Manual misalignment detection—The ultrasound operator can validate the appearance of a desired anatomical region with a pre-recorded case and verify that he or she oriented the ultrasound probe 104 correctly when scanning a patient. This approach does not need any additional equipment or modification beyond the disclosed embedded motion sensor.
To calibrate the system, the user places the ultrasound probe 104 at a known position with respect to the ultrasound machine 102. This is necessary to track the position of the ultrasound probe 104 with respect to the ultrasound machine 102 without the aid of any additional sensor or technology. The ultrasound machine 102 provides the user with an on-screen visual reference to establish how the sensor should be aligned (for example, with the probe indicator icon 108). All existing ultrasound machines provide such reference in the form of a small colored circle on the side of the screen. Other indicators may be used such as arrows pointing to the side the reference indicator 110 is on. For the disclosed application it also may be useful, but not necessary, to show a visual representation of the patient's body on-screen to provide additional guidance.
The ultrasound operator may then scan the patient's body, a medical mannequin, or other arbitrary surface. The motion sensor 118 informs the ultrasound machine 102 about the position of the motion sensor 118 throughout the scanning session.
Software in the ultrasound machine 102 continuously monitors the position and orientation of the ultrasound probe 104 during the scanning session using the readings from the motion sensor 118. The calibration procedure noted above allows the software to compute the relative position of the motion sensor 118 with respect to the ultrasound machine 102. If the software detects that the ultrasound probe 104 is not aligned correctly according to established medical conventions, then a visual or audio alert is generated to inform the operator about the hazard.
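The alignment check described above can be sketched as a single comparison: after calibration, the software compares the probe's reported heading against the orientation implied by the on-screen reference indicator and alerts when the deviation exceeds a tolerance. The angle convention and threshold below are assumptions, not values from the disclosure.

```python
# Sketch of misalignment detection: compare the probe's yaw against the
# expected medical convention, wrapping angles so 359 and 1 degree compare
# as close. The 30-degree tolerance is an illustrative assumption.

def misaligned(probe_yaw_deg: float, expected_yaw_deg: float,
               tolerance_deg: float = 30.0) -> bool:
    """True when the probe deviates too far from the expected orientation."""
    # Wrap the difference into [-180, 180) before comparing.
    diff = (probe_yaw_deg - expected_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) > tolerance_deg
```

When this returns true, the software would raise the visual or audio alert described above.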
Automatic misalignment detection—If additional means (as explained below) are available for determining the position of the ultrasound unit with respect to the ultrasound transducer 114, software on the device can determine automatically whether or not the current orientation of the ultrasound transducer 114 is correct by checking if the expected medical conventions are being honored. This solution does not require a separate calibration step, and it may be more accurate over the extent of the scanning session.
More specifically, this approach requires a two-point motion sensing solution where a reference beacon 142 is placed at a fixed position on the ultrasound machine 102 or other nearby location and the receiver is placed inside the ultrasound probe, preferably as part of the motion sensor 118. During the ultrasound scanning process, the two-point sensor solution informs the ultrasound machine 102 about the position of the motion sensor 118 relative to the ultrasound machine 102 throughout the scanning session. Software on the ultrasound machine 102 continuously monitors the position and orientation of the ultrasound probe 104 with respect to the ultrasound machine 102 during the scanning session using the readings from the two-point sensor solution (that is, the reference beacon 142 in conjunction with the motion sensor 118). If the software detects that the ultrasound probe 104 is not aligned correctly according to established medical conventions, then a visual or audio alert is generated to inform the operator about the hazard.
In some embodiments, rather than having the motion sensing technology embedded in an ultrasound probe 104, the motion sensing technology can be removably attached to the outside of the ultrasound probe 104 as shown in
This allows a fully functional ultrasound probe 104 to be converted to an ultrasound training tool. Therefore, the present invention may be viewed as an ultrasound probe conversion kit for converting a commercial diagnostic ultrasound probe 104 into an ultrasound training tool. In the preferred embodiment, the kit comprises a motion sensor accessory 150 that can be attached to the ultrasound probe 104. In some embodiments, the kit may further comprise an ultrasound probe 104. In some embodiments, the kit may comprise an ultrasound machine 102. In some embodiments, the kit may comprise software or instructions to obtain software that can be loaded on to a commercially available computing device 160, and preferably a mobile device, such as a smartphone, tablet, notebook computer, virtual reality device, augmented reality device, wearable device, light field display device, or the like, to run the software and convert the computing device 160 into an ultrasound machine 102 as shown in
Motion Sensor
Several off-the-shelf solutions have delivered effective training simulators for ultrasound using only three degrees of freedom (3-DOF) motion sensors. Therefore, the preferred embodiment for the motion sensor 118 is a low-cost Inertial Measurement Unit (IMU) comprising a MEMS 3-axis accelerometer 130, 3-axis gyroscope 132, and 3-axis magnetometer 134. Care must be taken to choose components that provide a high degree of accuracy to minimize the incidence of drift in the estimated orientation of the device. Practitioners may, however, envision other means of measuring the 3D orientation (3-DOF) and position (6-DOF) of the device using electrical, optical, electromagnetic, purely mechanical, or chemical operating principles that are consistent with the spirit of this invention. The only hard requirement is a motion sensor 118 that is miniaturized and portable enough to meet the form factor of the ultrasound probe 104.
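The drift concern noted above is typically addressed by fusing the IMU's sensors. As one minimal, single-axis sketch (not the disclosed design), a complementary filter lets the gyroscope track fast rotation while the accelerometer's gravity reference slowly corrects the tilt estimate; the blend factor and sample rate here are illustrative assumptions.

```python
# Single-axis complementary filter sketch: gyro integration corrected by the
# accelerometer's gravity direction to limit long-term drift.

import math

def fuse_tilt(prev_angle_deg, gyro_rate_dps, accel_x_g, accel_z_g,
              dt_s=0.01, alpha=0.98):
    """One filter step: blend integrated gyro rate with accel-derived tilt."""
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt_s      # fast, drifts
    accel_angle = math.degrees(math.atan2(accel_x_g, accel_z_g))  # slow, noisy
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle
```

A production sensor would use all nine axes (often via a quaternion filter such as Madgwick's), but the principle of combining a drifting fast sensor with a stable slow reference is the same.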
The motion sensor can employ multiple degrees of freedom (DOF), such as 2-DOF, 3-DOF, 4-DOF, 5-DOF, 6-DOF, 7-DOF, 8-DOF, 9-DOF, 10-DOF, or 11-DOF. The differentiation between the number of DOFs is often an artifact of marketing rather than being based on true technical merit. For instance, a pure orientation sensor may be referred to as having 3-DOF, where each DOF is one of the minimum number of 3D axes in space required to describe the motion of the device; 6-DOF, where each DOF refers to an axis of a 3-axis accelerometer and 3-axis gyro; 9-DOF, where each DOF refers to an axis of a 3-axis accelerometer, 3-axis gyro, and 3-axis magnetometer; 10-DOF, which adds a temperature sensor to the 9-DOF configuration; or 11-DOF, which adds a temperature sensor and an altimeter.
The preferred embodiment encompasses the following cases directly: 3-DOF (pure orientation sensing in 3D), 5-DOF (2 axes of position sensing and 3 axes of orientation sensing), and 6-DOF (3-axes of position sensing and 3 axes of orientation sensing).
Motherboard and Other Electronics
With reference to
Communication Interface
Since the mobile ultrasound transducer probe 104 is expected to have a direct wired connection with the computing device 160 via a data cable 112, such as USB, Thunderbolt, FireWire, Ethernet, and the like, it is important that the motion sensor accessory 150 uses a wireless communication interface 119 to minimize the encumbrance of additional wires. Many low-cost and standardized solutions exist for this purpose, and the most widely used standard is Bluetooth or Bluetooth Low Energy. Bluetooth is a highly desirable solution for the preferred embodiment, as the large majority of mobile devices and portable computers already incorporate the hardware and software to communicate with an external Bluetooth device. Alternatively, a Wi-Fi controller or other custom wireless protocol can be used for the same purpose. The wireless communication interface 119 must be mounted on the main PCB 116 along with the other components.
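Whatever wireless transport is chosen, the app must decode the accessory's sensor reports. The 22-byte packet layout below (little-endian: sequence counter, nine int16 sensor axes, battery byte, flags byte) is entirely hypothetical; a real accessory would define its own framing. This sketch only shows the kind of parsing involved.

```python
# Illustrative decoder for a hypothetical sensor report received over the
# wireless link. Packet layout is an assumption, not part of the disclosure.

import struct

PACKET_FMT = "<H9hBB"  # seq, ax..az, gx..gz, mx..mz, battery %, flags
PACKET_LEN = struct.calcsize(PACKET_FMT)  # 22 bytes for this layout

def decode_packet(payload: bytes) -> dict:
    """Unpack one fixed-layout report into named fields."""
    (seq, ax, ay, az, gx, gy, gz,
     mx, my, mz, batt, flags) = struct.unpack(PACKET_FMT, payload)
    return {
        "seq": seq,
        "accel": (ax, ay, az),
        "gyro": (gx, gy, gz),
        "mag": (mx, my, mz),
        "battery_pct": batt,
        "attached": bool(flags & 0x01),  # accessory-switch bit (assumed)
    }
```

The sequence counter lets the app detect dropped packets, which matters for dead-reckoning sensors where missed samples accumulate as orientation error.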
Battery and Charging
A wireless device needs a source of power to operate, and designers may choose among a variety of battery solutions for this purpose. While single-use alkaline batteries may be adequate for this invention, a more desirable solution for the preferred embodiment is to use a form of rechargeable battery 125, which removes the need to disassemble the device for battery replacement and allows for a higher degree of integration to make the device smaller. The most common forms of high-efficiency batteries used in wireless devices are Lithium-Ion batteries and Lithium-Ion Polymer batteries. If a rechargeable battery is used, the electronics on the PCB 116 must also include recharging hardware to allow recharging the battery 125. A common solution for this purpose is to incorporate charging circuitry 127 on the PCB, for example, using a standard USB-micro port 123 on the device. Hence, the user can easily recharge the device using a widely available USB cable and a functioning USB port or USB wall-charger. Other charging circuitry 127 can be used, such as a designated power cord, wireless charging, and the like.
Carrier
As shown in
The motion sensor accessory 150 may comprise a carrier 152 that holds the enclosure 151 and attaches the enclosure 151 firmly to the exterior surface of the ultrasound probe 104. The design of the carrier 152 is challenging, because most mobile probes 104 have a curved design and they do not offer clear attachment points for external accessories. As shown in
In a second embodiment, the carrier 152 is like a cap that covers the tip of the mobile probe 104 at the transducer 114 allowing the system to be used solely for training purposes while the accessory 150 is attached.
The carrier 152 can be designed to either fit a particular model of mobile ultrasound probe 104 or in a way that can accommodate a wide range of mobile ultrasound probes 104 from different vendors. In the former case, the motion sensor accessory 150 can be accompanied by a multitude of different carriers 152, each designed to accommodate a specific ultrasound device available in the market. In some embodiments, the motion sensor accessory 150 may be removably attached to the carrier 152 with a fastener so that a single motion sensor accessory 150 can be used with the multitude of different carriers 152. The fastener can be a hook and loop type fastener, resistance fit fastener, screw-on fastener, magnetic fastener, and the like.
In some embodiments, the carrier 152 may be made of an elastic material that can be stretched to conform to the shape of the ultrasound probe 104. In some embodiments, rather than having different carriers 152, the same carrier 152 can be provided with a plurality of hollow inserts 156 that can be seated in the same carrier 152. The outer dimensions of the insert 156 may be substantially the same as the inner dimension of the carrier 152. The inner dimensions of the insert may be substantially similar to the shape of the probe 104 to which the insert was designed to fit. The insert 156 can be fastened inside the carrier 152 by resistance fit, or by other types of fasteners such as adhesives, hook and loop fasteners, high friction material, clips, magnets, and the like. The fastening mechanism should make it easy to change out the inserts 156 from the carrier 152.
In some embodiments, the motion sensor accessory 150 may comprise an accessory switch 107 for detecting whether the motion sensor accessory 150 has been attached to the ultrasound probe 104. The accessory switch 107 may be positioned on the motion sensor accessory 150 or the carrier 152 at a location where the motion sensor accessory 150 makes contact with the probe 104. For example, the switch 107 may be on the inner side of the motion sensor 118 or on the inner wall of the cap-type carrier 152. Therefore, when the motion sensor accessory 150 is attached to the ultrasound probe 104, the accessory switch 107 is automatically actuated by touching or being pressed against the ultrasound probe 104 to indicate that the motion sensor accessory 150 has been attached to the ultrasound probe 104. The accessory switch 107 may utilize a mechanical switch, such as a button, lever, slide, dial, and the like; a sensor, such as a capacitive touch sensor, a light sensor, a proximity sensor, a Hall effect sensor, or any other type of sensor that can detect when the motion sensor accessory 150 has been attached to the ultrasound probe 104; and the like, to detect whether the motion sensor accessory 150 has been attached to the probe 104.
In some embodiments, the accessory switch 107 may be located anywhere on the ultrasound probe 104 so that the user can actuate the accessory switch 107 to indicate that the motion sensor accessory 150 has been attached to the ultrasound probe 104; or the accessory switch 107 may be attached to the probe 104 at a location where attachment of the motion sensor accessory 150 automatically actuates the accessory switch 107.
In some embodiments, the accessory switch 107 may operatively be connected to the transducer 114 so as to turn the transducer 114 on and off. Therefore, when the motion sensor accessory 150 is attached to the ultrasound probe 104, the transducer 114 is automatically turned off. Conversely, when the motion sensor accessory 150 is removed from the ultrasound probe 104, the transducer 114 is automatically turned on. Alternatively, in some embodiments, the user can actuate the accessory switch 107 to turn the transducer 114 on and off.
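The switch-driven behavior described above amounts to a simple mode-selection rule, sketched below. The function name and the powered/attached inputs are illustrative, not terms from the disclosure.

```python
# Sketch of mode selection driven by the accessory switch 107: training mode
# only when the accessory is attached and switched on; otherwise the probe
# remains a diagnostic instrument.

def select_mode(accessory_attached: bool, accessory_powered: bool) -> str:
    """Return the operating mode implied by the accessory's state."""
    if accessory_attached and accessory_powered:
        return "training"   # transducer off; motion data drives the simulator
    return "diagnostic"     # transducer active for real imaging
```

Tying the transducer enable line to this same rule guarantees that a probe wearing a powered accessory can never emit acoustic energy during a training session.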
Operating Modes
With reference to
In training simulation mode, the user attaches 210 the motion sensor accessory 150 to the probe 104, and connects 212 the probe 104 to the computing device 160. The user can launch a simulator 214, and use the motion sensing capabilities to scan virtual patients 216 loaded in the accompanying simulator app to learn about ultrasound imaging in a simulated environment.
Simulation Software
As shown in
The suite of software available on the computing device 160 must allow the user to operate the device in either operating mode (diagnostic mode or training mode). Implementers may choose to provide two distinct apps for each mode of operation, requiring the user to manually switch from one to the other, or provide a single app that incorporates functionality to operate the system either in diagnostic ultrasound or training simulation mode.
The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.
This patent application is a divisional of U.S. patent application Ser. No. 15/198,994 filed Jun. 30, 2016 for System and Method for Converting Handheld Diagnostic Ultrasound Systems Into Ultrasound Training Systems, which claims the benefit of U.S. Provisional Patent Application No. 62/187,085 filed Jun. 30, 2015 for System and Method for Converting Handheld Diagnostic Ultrasound Systems Into Ultrasound Training Systems, which applications are incorporated in their entirety here by this reference.
20140119645 | Zimet | May 2014 | A1 |
20140120505 | Rios et al. | May 2014 | A1 |
20140170620 | Savitsky et al. | Jun 2014 | A1 |
20140228685 | Eelbode | Aug 2014 | A1 |
20140272878 | Shim et al. | Sep 2014 | A1 |
20150056591 | Tepper | Feb 2015 | A1 |
20150078639 | Hausotte | Mar 2015 | A1 |
20150084897 | Nataneli et al. | Mar 2015 | A1 |
20150086956 | Savitsky et al. | Mar 2015 | A1 |
20150140538 | Savitsky et al. | May 2015 | A1 |
20150154890 | Savitsky et al. | Jun 2015 | A1 |
20150213731 | Sato | Jul 2015 | A1 |
20160314716 | Grubbs | Oct 2016 | A1 |
20160328998 | Pedersen et al. | Nov 2016 | A1 |
20170028141 | Fiedler et al. | Feb 2017 | A1 |
20170035517 | Geri | Feb 2017 | A1 |
20170046985 | Hendrickson et al. | Feb 2017 | A1 |
20170110032 | O'Brien | Apr 2017 | A1 |
20170270829 | Bauss | Sep 2017 | A1 |
20180197441 | Rios | Jul 2018 | A1 |
20180366034 | Gelpi | Dec 2018 | A1 |
20190057620 | Eggert | Feb 2019 | A1 |
20190231436 | Panse | Aug 2019 | A1 |
20190321657 | Hale | Oct 2019 | A1 |
20200126449 | Horst | Apr 2020 | A1 |
20200138518 | Lang | May 2020 | A1 |
20210128125 | Sitti et al. | May 2021 | A1 |
20210186311 | Levy et al. | Jun 2021 | A1 |
Number | Date | Country |
---|---|---|
1103223 | May 2001 | EP |
2801966 | Nov 2014 | EP |
2127610 | Nov 2014 | RU |
1994040171 | Nov 2014 | RU |
2006060406 | Jun 2006 | WO |
2011091613 | Aug 2011 | WO |
Entry |
---|
Simsuite, “Are you ready for your first carotid stent procedure/begin preparing now,” www.simsuites.com/readarticle.asp?articleId=598, 1 pg, obtained from website Nov. 25, 2004. |
NewRuleFX Brand 5ml Retractable Hypo Syringe Prop—Retractable Needle and Plunger—No Liquids. (Feb. 9, 2014), from https://www.newrulefx.com/products/newrulefx-brand-5ml-retractable-hypo-syringe-prop-retractable-needle-and-plunger-no-liquids?_pos=4&_sid=71f351469&_ss=r (Year: 2014). |
U.S. Appl. No. 15/279,405, No Inventor. |
U.S. Appl. No. 15/871,634, No Inventor. |
Chung, Gregory, “Effects of Simulation-Based Practice on Focused Assessment . . . ”, Military Medicine, Oct. 2013, vol. 178. |
Aligned Management Associates, Inc., Corporate home page describing organizing committee, overview, Procedicus MIST™ suturing module 30.0, 6 pgs., obtained from website Sep. 6, 2004. |
American Academy of Emergency Medicine, conference: 11th annual scientific assembly preconference ultrasound courses, http://www.aaem.org/education/scientificassembly/sa05/precon/ultrasound.shtml, 6 pgs, obtained from website Feb. 16, 2005. |
Barbosa, J. et al., "Computer education in emergency medicine residency programs," http://www.med-ed-online.org/res00002.htm, 8 pgs, obtained from website Sep. 6, 2004. |
Brannam, L. et al., "Emergency nurses' utilization of ultrasound guidance for placement of peripheral intravenous lines in difficult-access patients," Acad Emerg Med, 11(12):1361-1363, Dec. 2004. |
Calvert, N. et al., “The effectiveness and cost-effectiveness of ultrasound locating devices for central venous access: a systematic review and economic evaluation/executive summary,” Health Tech Assess 2003, 7(12), 4 pgs. |
Center for Human Simulation, corporate home page describing overview/people, http://www.uchsc.edu, 7 pgs, obtained from website Sep. 6, 2004. |
CIMIT News, “The medical access program: new CIMIT initiative to benefit underserved patients/partners telemedicine and CIMIT launch new initiative: stay connected, be healthy/highlights: operating room of the future plug-and-play project,” http://www.cimit.org, Jan. 2005: vol. II(2), 2 pgs., obtained from website Mar. 1, 2005. |
Colt, H. G. et al., "Virtual reality bronchoscopy simulation: a revolution in procedural training," Chest 2001; 120:1333-1339. |
Computer Motion, "About computer motion: technology to enhance surgeons' capabilities, improve patient outcomes and reduce healthcare costs/corporate alliances/products & solutions for surgical innovation/training on the da Vinci® surgical system—introduction," 2002 Computer Motion, http://www.computermotion.com, 6 pgs. |
Delp, S. et al., "Surgical simulation—an emerging technology for training in emergency medicine," Presence, 6(2):147-159, Apr. 1997 (abstract). |
Dorner, R. et al., "Synergies between interactive training simulations and digital storytelling: a component-based framework," Computer & Graphics, 26(1):45-55, Feb. 2002 (abstract). |
Duque, D. and Kessler S., “Ultrasound guided vascular access,” Amer Coll Emerg Phy., http://www.nyacep.org/education/articles/ultrasound%20vascular%20access.htm, 2 pgs, obtained from website May 11, 2005. |
Espinet, A. and Dunning J., "Does ultrasound-guided central line insertion reduce complications and time to placement in elective patients undergoing cardiac surgery," Inter Cardiovascular Thoracic Surg, 3:523-527, 2004; http://icvts.ctsnetjournals.org/cgi/content/full/3/3/523, 6 pgs, obtained from website May 11, 2005 (abstract). |
Gallagher, A. G. et al., “Virtual reality training for the operating room and cardiac catheterization laboratory,” Lancet, 364:1538-1540, Oct. 23, 2004. |
Gallagher, A. G. et al., "Psychomotor skills assessment in practicing surgeons experienced in performing advanced laparoscopic procedures," Am Coll Surg, 197(3):479-488, Sep. 2003. |
Gausche, M. et al., "Effect of out-of-hospital pediatric endotracheal intubation on survival and neurological outcome: a controlled clinical trial," JAMA, 283(6):783-790, Feb. 9, 2000. |
Gore, D. C. and Gregory, S. R., “Historical perspective on medical errors: Richard Cabot and the Institute of Medicine,” J Amer Coll Surg, 197(4), 5 pgs, Oct. 2003. |
Grantcharov, T. P. et al., "Randomized clinical trial of virtual reality simulation for laparoscopic skills training," Br J Surg, 91(2):146-150, Feb. 1, 2004 (abstract). |
Grantcharov, T. P. et al., "Learning curves and impact of previous operative experience on performance on a virtual reality simulator to test laparoscopic surgical skills," Am J Surg, 185(2):146-149, Feb. 1, 2004 (abstract). |
Haluck, R. S. et al., "Are surgery training programs ready for virtual reality? A survey of program directors in general surgery," Arch Surg, 135(7):786-792, Jul. 1, 2000. |
Helmreich, R. L., “On error management: lessons from aviation,” BMJ, 320:781-785, Mar. 2000. |
Huckman, R. S. and Pisano, G. P., “Turf battles in coronary revascularization,” N Engl J Med, http://www.nejm.org, 4 pgs, 352(9):857-859, Mar. 3, 2005. |
Immersion Corporation, URL: http://www.immersion.com/corporate/products/, corporate home page describing Immersion's surgical training simulators—"Wireless Data Glove: The CyberGlove® II System," 5 pgs, obtained from the website Nov. 17, 2005 and Jan. 24, 2008. |
injuryboard.com, "Reducing complications associated with central vein catheterization," URL: http://www.injuryboard.com/view.cfm/Article=668, 5 pgs, obtained from website May 11, 2005. |
Intersense, home page listing motion tracking products, http://www.isense.com/products.aspx?id=42&, 1 pg, obtained from website Jan. 24, 2008. |
Jemmett, M. E., et al., “Unrecognized misplacement of endotracheal tubes in a mixed urban to rural emergency medical services setting,” Acad Emerg Med, 10(9):961-964, Sep. 2003. |
Katz, S. H. and Falk, J. L., "Misplaced endotracheal tubes by paramedics in an urban emergency medical services system," Annals Emerg Med, 37:32-37, Jan. 2001. |
Lewis, R., “Educational research: time to reach the bar, not lower it,” Acad Emerg Med, 12(3):247-248, Mar. 2005. |
Liu, A. et al., "A survey of surgical simulation: applications, technology, and education," Presence, 12(6):1-45, Dec. 2003. |
Manchester Visualization Centre, "WebSET project—bringing 3D medical training tools to the WWW," http://www.sve.man.ac.uk/mvc/research/previous/website, 3 pgs, obtained from the website Sep. 8, 2004. |
McLellan, H., "Virtual realities," McLellan Wyatt Digital, 33 pgs. |
Medical Simulation Corporation, corporate home page describing management team/frequently asked questions, http://www.medsimulation.com/about_msc/key_employees.asp, 7 pgs, obtained from website Nov. 25, 2004. |
Medtronic, "The StealthStation® treatment guidance system," the corporate home page describing the company fact sheet and profile; http://www.medtronic.com/Newsroom, 4 pgs, obtained from website Mar. 5, 2005. |
Mort, T. C., “Emergency tracheal intubation: complications associated with repeated laryngoscopic attempts,” Anesth Analg, 99(2):607-613, Aug. 2004, 1 pg, obtained from website Sep. 8, 2004 (abstract). |
Nazeer, S. R. et al., "Ultrasound-assisted paracentesis performed by emergency physicians vs. the traditional technique: a prospective, randomized study," Amer J of Emer Med, 23:363-367, 2005. |
NCA Medical Simulation Center, Tutorial—simulation for medical training, http://simcen.usuhs.mil/miccale, 4 pgs, 2003. |
Next Dimension Imaging, “Products-Anatomy Analyzer 2,” http://www.nexted.com/anatomyanalyzer.asp, 2 pgs, obtained from website Dec. 7, 2004. |
Norris, T. E. et al., "Teaching procedural skills," J General Internal Med, 12(S2):S64-S70, Apr. 1997. |
On the Net Resources—Education and Training, URL: http://www.hitl.washington.edu/projects/knowledge_base/education.html, corporate home page regarding internet sites regarding education and training, 16 pgs, obtained from website Jan. 8, 2005. |
Osberg, K. M., “Virtual reality and education: a look at both sides of the sword,” http://www.hitl.washington.edu/publications/r-93-7/, 19 pgs, Dec. 14, 1992, obtained from website Jan. 21, 2008. |
Osmon, S. et al., "Clinical investigations: reporting of medical errors: an intensive care unit experience," Crit Care Med, 32(3), 13 pgs, Mar. 2004. |
Ponder, M. et al., "Immersive VR decision training: telling interactive stories featuring advanced human simulation technologies," Eurographics Association 2003, 10 pgs. |
Primal, corporate home page describing resources for teaching healthcare practitioners, 2 pgs, obtained from website. |
Prystowsky, J. B. et al., "A virtual reality module for intravenous catheter placement," Am J Surg 1999; 177(2):171-175 (abstract). |
Reachin, "Medical Training Development Centre/Reachin Technologies AB has entered into a cooperation with Mentice AB," Jan. 20, 2004, 4 pgs, obtained from website Nov. 9, 2004. |
Rothschild, J. M., "Ultrasound guidance of central vein catheterization," NCBI, Nat Lib Med, www.ncbi.nlm.nih.gov/books/, HSTAT 21, 6 pgs, obtained from website May 11, 2005. |
Rowe, R. and Cohen, R. A., “An evaluation of a virtual reality airway simulator,” Anesth Analg 2002, 95:62-66. |
Sensable Technologies, "Phantom Omni Haptic Device," 2 pgs, http://www.sensable.com/haptic-phantom-omni.htm, obtained from website Jan. 24, 2008. |
Shaffer, K., “Becoming a physician: teaching anatomy in a digital age,” NEJM, Sep. 23, 2004; 351(13):1279-81 (extract of first 100 words—no abstract). |
Number | Date | Country | |
---|---|---|---|
62187085 | Jun 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15198994 | Jun 2016 | US |
Child | 16940021 | US |