This invention is related to ultrasound simulators.
A variety of task-specific ultrasound simulators are currently available. These ultrasound simulators typically present a library of ultrasound cases to the user, and provide various mechanisms to navigate and interact with either simulated or real ultrasound data. In most cases, the ultrasound training solution includes a sensor device shaped like an ultrasound probe that reacts to motion, and controls a scanning plane in a simulated ultrasound-scanning environment. The simulators must be accurate enough to capture nuanced motions, and must integrate well with an intuitive user interface to convey a plausible experience of how to operate a real ultrasound machine on a real patient.
Some current mannequin-based ultrasound simulators provide the ability to track simulated probe movement over six degrees of freedom (6-DOF). The limitations of such mannequin-based, task-specific ultrasound simulators include their physical footprint (very large and bulky), cost, fidelity (sensor drift that requires frequent recalibration), and verisimilitude (limited realism owing to computer-graphic imagery). These simulators can only be used in conjunction with the overlying mannequin that contains the embedded sensing equipment associated with the simulator. A high-fidelity, economical ultrasound training solution using real ultrasound imagery does exist: the SonoSim® Ultrasound Training Solution. While highly realistic and cost-effective, this training solution only provides continuous ultrasound probe tracking and simulation over 3-DOF; it does not continuously track handheld probe translational movement.
In summary, important improvements needed in the state of the art of ultrasound simulation include: creating the capability to reliably, precisely, and continuously track translational movement of a simulated ultrasound probe independently of an electronic training mannequin with embedded motion-tracking hardware; integrating the ability to continuously track 5-DOF simulated ultrasound probe movement into a holistic ultrasound training solution that does not mandate integration with such a mannequin; and creating the ability to practice ultrasound simulation on live volunteers, rather than relying solely upon expensive electronic training mannequins with embedded software and hardware.
This invention, a system and method for augmented ultrasound simulation using flexible touch-sensitive surfaces, extends the 3-DOF probe tracking capabilities of the SonoSim® Ultrasound Training Solution to 5-DOF ultrasound probe tracking. It creates the ability to reliably, precisely, and continuously track translational movement of a simulated ultrasound probe independent of an electronic training mannequin containing embedded motion-tracking hardware. This invention integrates and expands the capabilities of the SonoSim® Ultrasound Training Solution. In the process, it simultaneously provides the ability to continuously track 5-DOF simulated ultrasound probe movement in the context of a holistic ultrasound training solution that does not mandate integration with an electronic training mannequin with embedded motion-tracking hardware.
Presented here is an invention that combines widely available Radio Frequency Identification (RFID) components and MEMS sensors with advances in flexible electronics to produce an easy-to-use, low-cost controller for ultrasound training software. This invention extends the capabilities of the SonoSim® Ultrasound Training Solution to include reliable, precise, and continuous 5-DOF simulated ultrasound probe movement. It also provides a flexible ultrasound training solution that can be used in live-volunteer as well as mannequin-based ultrasound training scenarios. Importantly, it does not need to be embedded within training mannequins and is designed to be easily affixed to the external body surfaces of mannequins or live volunteers.
The detailed description set forth below in connection with the appended drawings is intended as a description of presently-preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention.
The present invention is a system for simulating ultrasound use comprising a probe assembly 100 having an orientation sensor 102 and a transponder reader 104; an electronic tag 200 having a transponder layer 202 containing a transponder 206; and a translation sensor 300. The probe assembly 100 is essentially a housing having a handle 106 and a tip 108 connected to the handle 106 so as to mimic an actual ultrasound probe. The electronic tag 200 is attachable to a body 12 of a subject 10, such as a mannequin or a live individual, and contains the transponder 206 to transmit information to the transponder reader 104 housed in the probe assembly 100. The translation sensor 300 is preferably in the form of a patch 302 that can be applied to the body 12, preferably over the electronic tag 200, to provide translation information of the probe assembly 100 when the probe assembly 100 is moved along the patch 302.
In the preferred embodiment, the transponder 206 may be an RFID, NFC, or similarly capable transponder coupled with an adhesive layer 204 and a built-in or external antenna element 208. The orientation sensor 102 may be a 3-DOF MEMS Inertial Measurement Unit (IMU). The translation sensor 300 may be a patch 302 having a flexible touch-sensitive surface, with an optional flexible display 304 bonded with the flexible touch-sensitive surface. The probe assembly 100 may further comprise a wired or wireless communication interface 110 to communicate with a separate computing device 400 (e.g., PC, tablet, or dedicated unit). The computing device 400 can run ultrasound simulation software and can communicate with the other components through the wired or wireless communication interface 110.
The Probe Assembly
As shown in the appended drawings, the probe assembly 100 is a housing shaped like an actual ultrasound probe, with the orientation sensor 102 and the reader 104 mounted within it.
In the preferred embodiment, the orientation sensor 102 may be an inertial measurement unit (IMU) measuring three degrees of freedom to detect the orientation of the probe assembly 100 with respect to the gravity vector. Yaw, pitch, and roll angles of the probe assembly 100 measured over time may correspond to fundamental motions that a sonographer is trained to perform in clinical practice: e.g., fanning, rocking, and rolling. This orientation sensor 102 relays the readings of orientation to the computing device 400 to drive the orientation of the scanning plane in a simulated environment 402 run by the computing device 400.
Orientation can also be measured using other operating principles, such as electromagnetic, optical, or mechanical. The orientation sensor 102 must be secured within the probe assembly 100 at a fixed rotation with respect to the housing.
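By way of illustration only, the following Python sketch shows how yaw, pitch, and roll readings from the orientation sensor 102 might be converted into a rotation applied to the scanning plane in the simulated environment 402. The function names and the Z-Y-X Euler convention are illustrative assumptions, not a description of any particular IMU interface.

    import math

    def euler_to_matrix(yaw, pitch, roll):
        # Convert yaw (Z), pitch (Y), roll (X) angles in radians to a 3x3 rotation
        # matrix using an intrinsic Z-Y-X convention (an assumption; real IMUs
        # differ in axis ordering and sign).
        cy, sy = math.cos(yaw), math.sin(yaw)
        cp, sp = math.cos(pitch), math.sin(pitch)
        cr, sr = math.cos(roll), math.sin(roll)
        return [
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ]

    def rotate(matrix, vector):
        # Apply a 3x3 rotation matrix to a 3-vector.
        return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

    # Example: orient the scanning plane's normal from the latest IMU reading.
    yaw, pitch, roll = 0.10, -0.25, 1.57  # radians, hypothetical sample
    plane_normal = rotate(euler_to_matrix(yaw, pitch, roll), [0.0, 0.0, 1.0])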
The reader 104, or interrogator, is a radio device that broadcasts electromagnetic (EM) waves at a pre-determined frequency. Through magnetic induction, these waves induce a current in small transponders 206 located a short distance away. The transponder 206 uses the harvested power to broadcast a response over the air according to a pre-determined protocol. For the application discussed in this invention, the interaction between the reader 104 and the transponders 206 is preferably limited to a distance of 1-2 cm. Typical readers 104 suited for this invention operate either at about 125 kHz (low frequency) or at about 13.56 MHz (high frequency).
The reader 104 is integrated within the probe assembly 100. Preferably, to simulate the use of an actual ultrasound probe, the antenna element 116 of the reader 104 should be placed at the tip 108 of the probe assembly 100 so as to be placed in close proximity to a transponder layer 202 during use. Alternatively, if the reader component 104 is small enough, the entire reader board with an embedded antenna 116 can be placed at the tip 108 of the probe assembly 100. In the preferred embodiment, the reader 104 may be an RFID reader that matches these specifications.
The Electronic Tag
In the following, the term RFID refers to a range of technologies that use radio frequency signals to query remotely located transponders. This includes the popular near field communication (“NFC”) standard and other analogous technologies with similar capabilities known to those skilled in the art.
As shown in the appended drawings, the electronic tag 200 comprises a transponder layer 202 coupled to an adhesive layer 204 that attaches the electronic tag 200 to the body 12 of the subject 10.
The transponder layer 202 may comprise a transponder 206, associated with a memory for storing and processing information as well as performing other standard functions of a transponder, and an antenna element 208 (e.g., an inductive coil) to transmit signals to the reader 104 and receive signals from the reader 104.
The information contained in the transponder 206 pertains to the identification of the transponder 206 and its associated, specific anatomical body part. The transponder 206 may also contain information that implements a pre-determined protocol. This information can be transmitted to a computing device 400 and used in creating a simulated environment 402. In particular, this information will help create ultrasound images 404 of a particular body part being scanned with the probe assembly 100 so as to mimic what a user would see if the user were to scan that body part of a live subject in the same manner as with an actual ultrasound probe.
As an alternative to RFID or NFC readers, the transponder 206 may be a Bluetooth beacon, which is a beacon using Bluetooth Low Energy (BLE) technology. BLE Beacons are active tags that continuously broadcast a signal using a BLE transceiver powered by a built-in battery. To accommodate BLE Beacons, the probe assembly 100 must integrate a Bluetooth Low Energy transceiver that can detect the presence of BLE Beacons and estimate their distance by measuring the Received Signal Strength Indicator (RSSI).
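By way of example only, the RSSI measurement mentioned above could be mapped to an estimated distance between the probe assembly 100 and a BLE Beacon using a standard log-distance path-loss model, as in the Python sketch below. The reference power at one meter and the path-loss exponent are hypothetical calibration values.

    def estimate_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
        # Log-distance path-loss model: d = 10 ** ((txPower - RSSI) / (10 * n)),
        # where txPower is the calibrated RSSI at 1 m (hypothetical value).
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

    def nearest_beacon(readings):
        # Given {beacon_id: rssi_dbm}, return the beacon estimated to be closest.
        return min(readings, key=lambda b: estimate_distance(readings[b]))

    # Example: pick the tag the probe tip is resting on.
    print(nearest_beacon({"left_shoulder": -48, "right_thigh": -77}))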
In some embodiments, the electronic tag 200 may further comprise a surface layer 210 placed on top of the transponder layer 202 to sandwich the transponder layer 202 in between the surface layer 210 and the adhesive layer 204. The surface layer 210 may display an identifier 212 to let the user know to which anatomical body part the electronic tag pertains. For example, the surface layer 210 may be made of a protective material (plastic, paper, a film, and the like), and may depict a number, a character, or a custom graphic representing an anatomical body part. By identifying the electronic tag 200, the user will know where on the subject 10 to place the electronic tag 200.
Localization and RFID
First, the user must place a collection of electronic tags 200a-g on different regions of the body 12 of a subject 10, as shown in the appended drawings. These regions may be pre-defined locations on, for example, a human body or training mannequin. The adhesive layer 204 keeps each electronic tag 200 attached to the surface and prevents unwanted motion. The transponders 206 are mapped to specific regions on the body (e.g., left shoulder, right thigh, chest, etc.). When the user places the tip 108 of the probe assembly 100 against the electronic tag 200, the RFID reader 104 is brought into close proximity to the transponder layer 202 and the transponder 206 is able to transmit information pertaining to the body part with which it is associated back to the reader 104.
The antenna 116 of the RFID reader 104 is positioned within the housing in such a way that the tip 108 of the probe assembly 100 exhibits the highest sensitivity to nearby transponders 206. The probe assembly 100 communicates with a PC or other computing device 400 via a wired or wireless connection, such as USB or Bluetooth. The user can therefore position the probe assembly on a specific region of the body 12 marked with an electronic tag 200, and the software will respond by shifting the focus of the simulation to the corresponding region of the body 12.
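By way of illustration only, the following sketch shows one way software on the computing device 400 might map a transponder identifier reported by the reader 104 to a body region and shift the simulation focus accordingly. The identifiers, region names, and callback interface are hypothetical and depend on the tag protocol actually used.

    # Hypothetical mapping from transponder identifiers to pre-defined body regions.
    TAG_TO_REGION = {
        "04:A1:5C:22": "left_shoulder",
        "04:B7:91:0E": "right_thigh",
        "04:C3:44:7F": "chest",
    }

    def on_tag_read(uid, simulation):
        # Called whenever the reader 104 reports a transponder in range; shifts
        # the simulated environment 402 to the matching region, if any.
        region = TAG_TO_REGION.get(uid)
        if region is not None:
            simulation.set_active_region(region)

    class DemoSimulation:
        def set_active_region(self, region):
            print("Loading ultrasound case data for:", region)

    on_tag_read("04:B7:91:0E", DemoSimulation())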
The Translation Sensor
Aside from the orientation movement of the probe assembly 100 (i.e., yawing, pitching, and rolling movements), the system further comprises a translation sensor 300 to detect the translational movement of the probe assembly 100 on selected regions of the body 12, thereby adding two additional degrees of freedom of probe assembly 100 movement detection. In the preferred embodiment, the translation sensor 300 may be in the form of a scanning patch 302 to be applied over an electronic tag 200 on the body 12 of the subject 10, as shown in the appended drawings.
When the user picks a region of anatomy to study in the simulated environment 402, he or she has the option of placing a flexible, touch-sensitive scanning patch 302 on the region of interest as if it were a towel lying on the body. One important feature of the scanning patch 302 is its flexibility (bendability), which allows it to conform to the shape of the body 12. Sliding the probe assembly 100 over the scanning patch 302 produces a measurement of translation. By combining the measurements of orientation from the IMU and translation over time, the connected computing device 400 can reconstruct the contour of the curved surface that the user traces with the probe assembly 100. This information can be used to recreate accurate simulated probe movement over a virtual scanning plane in a simulated training environment 402. One advantage of allowing translational motion over a curved surface is that the user can practice sliding the probe assembly 100 over a surface that mimics the physicality of a human body. Care must be taken to build the flexible touch-sensitive scanning patch 302 in such a way that it does not shield the radio frequency waves used by the RFID reader 104 to query the transponders 206.
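To make the combination of IMU orientation and patch translation concrete, the sketch below accumulates successive 2-D displacements measured on the scanning patch 302 and, using the probe orientation to define the local surface tangent, estimates the 3-D contour traced by the probe tip. This is a simplified model under stated assumptions (small steps, tangent axes taken from the probe frame), not the reconstruction algorithm itself.

    def reconstruct_contour(steps, start=(0.0, 0.0, 0.0)):
        # steps: iterable of (dx, dy, R) samples, where (dx, dy) is the translation
        # measured on the scanning patch 302 between samples and R is the 3x3
        # probe orientation matrix from the IMU at that instant.
        # Returns a list of estimated 3-D tip positions (a polyline contour).
        points = [list(start)]
        for dx, dy, R in steps:
            # Columns of R give the probe's local axes in world coordinates; assume
            # the first two lie approximately in the surface tangent plane.
            x_axis = [R[0][0], R[1][0], R[2][0]]
            y_axis = [R[0][1], R[1][1], R[2][1]]
            prev = points[-1]
            points.append([prev[i] + dx * x_axis[i] + dy * y_axis[i] for i in range(3)])
        return points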
While still rare, flexible displays can currently be built using either graphene-based electronics or electrophoretic screens with flexible electronics bonded to a plastic substrate. Graphene-based solutions have been demonstrated by companies such as Samsung and are commercially available at this time. They result in very light, thin displays that can be bonded with a projected capacitive surface. Electrophoretic flexible screens bonded with a projected capacitive surface are already available commercially and are manufactured by several companies, including Plastic Logic and Sony/e-Ink.
Some basic operating principles used to build touch-sensitive surfaces include projected capacitive, resistive, and capacitive pressure sensing.
Projected Capacitive
Projected capacitive surfaces are a common solution for tablets and smartphones that must support multi-touch interfaces. For this invention, a single-touch-capable component is sufficient. Most projected capacitive surfaces are specifically designed to respond to the typical conductance of human skin. They typically measure how the presence of a finger distorts an electric field generated on top of the surface. In the present invention, the surface must be able to detect contact with a probe assembly 100 made of plastic or other rigid material and detect its position with respect to the origin of the flexible touch surface. It is easy to modify the probe assembly to work with standard touch-sensitive surfaces. For example, the tip of the probe assembly can be covered with a rubbery material that has a conductance similar to that of human skin. The required material is similar to the tips used on low-cost passive styli designed for smartphones. Additionally, given that the probe assembly is designed to resemble a real ultrasound probe, its tip is expected to be either flat or to possess a slight curvature. In order to enhance the experience of controlling the position of the scanning plane in the simulated environment by sliding the probe assembly over the touch surface, one embodiment may include a protruding element 130 at the tip 108 of the probe assembly 100 to create a clear point of contact between the probe assembly 100 and the surface, as shown in the appended drawings.
Most commercially available projected capacitive surfaces are bonded over a rigid glass substrate that does not allow flexing. However, a new fabrication process developed by companies such as Sony, e-Ink, and Plastic Logic allows the electronic components to be bonded over a flexible plastic substrate. Despite being very new, this technology is already available for OEMs to integrate into commercial products.
Resistive
Resistive touch surfaces typically use a three-layer assembly comprising a top conductive layer, an intermediate insulating spacer layer, and a bottom conductive layer (the conductive layers are typically coated with indium tin oxide). When the user applies pressure at a single point on the resistive surface, the insulating layer is depressed, allowing the conductive layers to form a system of variable resistances. By measuring the amount of resistance in each direction over the surface, a microcontroller can determine the position of the touch. One advantage of resistive surfaces over projected capacitive surfaces is that they respond to any material, not just human skin.
Resistive surfaces are naturally flexible, but the amount of bending they can tolerate depends on the manufacturing process. If the surface is not designed specifically to withstand the appropriate mechanical stresses, the electronic components may break.
Some resistive surface components are also able to measure the amount of mechanical pressure applied to them. The pressure readings can be used to control the amount of compression that the user applies to the patient's body in the simulated environment.
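By way of example only, the sketch below converts raw readings from a four-wire resistive surface into a touch position and a normalized pressure value that could drive the simulated compression. The ADC range and patch dimensions are hypothetical; real resistive controllers typically expose calibrated coordinates directly.

    def resistive_touch(adc_x, adc_y, adc_pressure, adc_max=4095,
                        width_mm=200.0, height_mm=150.0):
        # Position along each axis is proportional to the measured voltage divider;
        # pressure is normalized to [0, 1] (hypothetical scaling).
        x_mm = (adc_x / adc_max) * width_mm
        y_mm = (adc_y / adc_max) * height_mm
        pressure = min(max(adc_pressure / adc_max, 0.0), 1.0)
        return x_mm, y_mm, pressure

    # Example: the pressure value can drive simulated tissue compression.
    x, y, compression = resistive_touch(2048, 1024, 800)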
Capacitive Pressure Sensing
Capacitive pressure sensing is realized by bonding an array of miniature capacitors on a conformable surface that hosts the interconnections necessary to relay a reading of capacitance from each of the capacitive elements. Each capacitor is composed of two conductive plates separated by a soft dielectric. When pressure is applied to a capacitive element, the dielectric is compressed, causing the distance between the conductive plates to vary and inducing a measurable change in capacitance.
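The relationship just described follows the parallel-plate capacitor model, C = ε0·εr·A/d. By way of illustration only, the sketch below inverts that model to recover the plate separation, and hence an estimate of applied pressure, from a capacitance reading for one element of the array; the plate area, rest gap, dielectric permittivity, and stiffness are hypothetical calibration constants.

    EPSILON_0 = 8.854e-12  # F/m, vacuum permittivity

    def gap_from_capacitance(capacitance_f, plate_area_m2, relative_permittivity):
        # Parallel-plate model: C = eps0 * eps_r * A / d  =>  d = eps0 * eps_r * A / C
        return EPSILON_0 * relative_permittivity * plate_area_m2 / capacitance_f

    def pressure_from_capacitance(capacitance_f, rest_gap_m, stiffness_pa,
                                  plate_area_m2=1e-6, relative_permittivity=3.0):
        # Assume the soft dielectric compresses linearly, so pressure ~ stiffness * strain.
        gap = gap_from_capacitance(capacitance_f, plate_area_m2, relative_permittivity)
        strain = max(0.0, (rest_gap_m - gap) / rest_gap_m)
        return stiffness_pa * strain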
Absolute-Position Optical Tracking
In another embodiment, the user may place over the body or training mannequin a scanning patch 302 with a special pattern printed on it. The pattern possesses the property that imaging any region of the pattern allows an algorithm to determine the exact position of the imaged region within the pattern. As shown in the appended drawings, the tip 108 of the probe assembly 100 may be provisioned with an aperture 132 through which a small camera images the portion of the pattern beneath it; decoding that portion yields the absolute position of the probe assembly 100 on the scanning patch 302.
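One family of patterns with this property uses position codes in which every short window of symbols is unique along each axis, so imaging a small region is enough to recover absolute coordinates. The sketch below is a minimal illustration of that lookup idea with a tiny hypothetical code; it is not the particular pattern or decoding algorithm contemplated by the invention.

    def build_window_index(code, k):
        # Pre-compute, for a code string in which every length-k window is unique
        # (e.g., a De Bruijn-style sequence), a lookup from window -> position.
        index = {}
        for i in range(len(code) - k + 1):
            window = code[i:i + k]
            assert window not in index, "code must have unique windows"
            index[window] = i
        return index

    # Hypothetical 1-D position code (every 4-bit window is unique); in practice a
    # different, much longer code would be printed along each axis of the patch.
    CODE, K = "0000100110101111", 4
    INDEX = build_window_index(CODE, K)

    def decode_position(x_window, y_window):
        # Imaging any small region of the printed pattern yields one window per
        # axis; looking both up recovers the absolute (x, y) cell position.
        return INDEX[x_window], INDEX[y_window]

    print(decode_position("1001", "0110"))  # -> (4, 6)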
Relative-Position Optical Tracking
As an alternative to the scanning patch 302, the probe assembly 100 itself may be equipped with the translation sensor 300. Again, the tip 108 of the probe assembly 100 may be provisioned with an aperture 132 through which an optical tracker, similar to the integrated components used in computer mice, can detect translational movement. These components use a small, low-resolution, high-speed camera to detect motion over a surface using a simplified form of optical flow. Alternatively, the optical tracker can be provisioned with one or more laser beams that detect motion using Doppler interferometry. Sliding such a probe assembly over the surface of a subject produces a reading of 2D displacement. This solution can be used to displace the scanning plane from a discrete point in the simulated environment that corresponds to the physical location on the body defined by the corresponding electronic tag 200.
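By way of illustration only, the sketch below shows how 2-D displacement reports from such an optical tracker could be accumulated and used to offset the scanning plane from the discrete position established by the last electronic tag 200 read. The callback interface is hypothetical.

    class RelativeTracker:
        # Accumulates (dx, dy) motion reports, resetting whenever the probe is
        # re-localized on an electronic tag (hypothetical interface).

        def __init__(self):
            self.origin_region = None
            self.offset = [0.0, 0.0]

        def on_tag(self, region):
            # A tag read anchors the probe to a discrete body location.
            self.origin_region = region
            self.offset = [0.0, 0.0]

        def on_motion(self, dx, dy):
            # Each optical-flow report displaces the scanning plane from the anchor.
            self.offset[0] += dx
            self.offset[1] += dy

        def scanning_plane_position(self):
            return self.origin_region, tuple(self.offset)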
Display Component
If the flexible touch-surface scanning patch 302 is bonded with a flexible display component 304 (an enhanced scanning patch), the software can provide additional visual feedback to the user by showing relevant information 306 about the region of the body on which it is placed, as well as a clear indication of how to move the probe over the surface. The user may be instructed to place the enhanced scanning patch over the region of interest. Thus, when the probe assembly 100 is placed over the patch 302, the aforementioned RFID reader 104 can localize both the position of the probe assembly 100 and the patch 302 it is placed on.
Complete System
The IMU combined with the scanning patch allows the system to sense motion over five degrees of freedom (5-DOF): 3-axis rotation and 2-axis translation over a curved surface.
By combining the above with the RFID component within the probe assembly 100, the final system can also localize the region of the body where the motion of the probe occurs. This solution faithfully reflects how an ultrasound operator works during an examination: he or she scans only a small set of discrete regions of the body (e.g., a liver imaging protocol); restricts the extent of the scan to a small area around the region of interest; and restricts the motion of the probe to the surface of the patient's body (a motion over a semi-rigid curved manifold) to maintain good contact between the ultrasound transducer and the body.
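By way of illustration only, the sketch below shows one possible shape for the probe state the simulation software could maintain: the active body region from the RFID component, three rotational degrees of freedom from the IMU, and two translational degrees of freedom from the scanning patch. The field names and units are illustrative assumptions.

    from dataclasses import dataclass

    @dataclass
    class ProbeState:
        # Illustrative 5-DOF probe state plus region localization.
        region: str = "unlocalized"      # from the electronic tag 200 last read
        yaw: float = 0.0                 # radians, from orientation sensor 102
        pitch: float = 0.0
        roll: float = 0.0
        translation: tuple = (0.0, 0.0)  # millimetres along the scanning patch 302

        def update_region(self, region):
            # A new tag read re-anchors the probe and resets translation.
            self.region, self.translation = region, (0.0, 0.0)

        def update_orientation(self, yaw, pitch, roll):
            self.yaw, self.pitch, self.roll = yaw, pitch, roll

        def update_translation(self, x_mm, y_mm):
            self.translation = (x_mm, y_mm)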
A high-level block diagram of an exemplary computing device 400 that may be used to implement the systems, apparatus, and methods described herein is illustrated in the appended drawings. Computing device 400 comprises a processor 420 operatively coupled to a data storage device 422 and a memory 424.
Computing device 400 may also include one or more network interfaces 426 for communicating with other devices via a network. Computing device 400 also includes one or more input/output devices 428 that enable user interaction with computing device 400 (e.g., display, keyboard, touchpad, mouse, speakers, buttons, etc.).
Processor 420 can include, among others, special purpose processors with software instructions incorporated in the processor design and general purpose processors with instructions in storage device 422 or memory 424, to control the processor 420, and may be the sole processor or one of multiple processors of computing device 400. Processor 420 may be a self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric. Processor 420, data storage device 422, and/or memory 424 may include, be supplemented by, or incorporated in, one or more application-specific integrated circuits (ASICs) and/or one or more field programmable gate arrays (FPGAs). It can be appreciated that the disclosure may operate on a computing device 400 with one or more processors 420 or on a group or cluster of computing devices networked together to provide greater processing capability.
Data storage device 422 and memory 424 each comprise a tangible, non-transitory computing device readable storage medium. By way of example, and not limitation, such a non-transitory computing device-readable storage medium can include random access memory (RAM), high-speed random access memory (DRAM), static random access memory (SRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile disc read-only memory (DVD-ROM) disks or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code means in the form of computing device-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination thereof) to a computing device, the computing device properly views the connection as a computing device-readable medium. Thus, any such connection is properly termed a computing device-readable medium. Combinations of the above should also be included within the scope of computing device-readable media.
Network/communication interface 426 enables the computing device 400 to communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices using any suitable communications standards, protocols, and technologies. By way of example, and not limitation, such suitable communications standards, protocols, and technologies can include Ethernet, Wi-Fi (e.g., IEEE 802.11), WiMAX (e.g., IEEE 802.16), Bluetooth, near field communications (“NFC”), radio frequency systems, infrared, GSM, EDGE, HSDPA, CDMA, TDMA, quadband, VoIP, IMAP, POP, XMPP, SIMPLE, IMPS, SMS, or any other suitable communications protocols. By way of example, and not limitation, the network interface 426 enables the computing device 400 to transfer data, synchronize information, update software, or perform any other suitable operation.
Input/output devices 428 may include peripherals, such as the probe assembly. Input/output devices 428 may also include monitors or touchscreens for display, a keyboard and mouse for input, speakers for audio output, and other such devices.
Any or all of the systems and apparatus discussed herein, including personal computing devices, tablet computing devices, hand-held devices, cellular telephones, servers, databases, cloud-computing environments, and components thereof, may be implemented using a computing device such as computing device 400.
One skilled in the art will recognize that an implementation of an actual computing device or computing device system may have other structures and may contain other components as well, and that the foregoing is a high-level representation of some of the components of such a computing device, presented for illustrative purposes.
Software Components and Simulation
The simulation software comprises: low-level components to interface with the probe assembly hardware (e.g., device drivers), a graphics engine to display a graphical user interface and additional 3D visual elements (on a 2D screen or stereoscopic display), a mathematical engine, and a database or other storage functionality to host a library of medical cases.
When the user changes the orientation of the probe assembly 100 in physical space, he/she will observe a corresponding motion of the virtual probe 406 on screen. A mathematical algorithm updates the simulated ultrasound image to present an image that mimics how a real ultrasound image would look if the probe were placed on the same location on the body.
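One common way such an algorithm can be realized is by resampling a slice from a pre-acquired 3-D ultrasound volume along the plane defined by the virtual probe's position and orientation. The sketch below shows that resampling step in simplified form; the volume layout and nearest-neighbour sampling are assumptions made for illustration and are not a description of the actual image-generation method.

    def sample_slice(volume, origin, x_axis, y_axis, width, height, spacing=1.0):
        # Nearest-neighbour resampling of a 2-D slice from a 3-D volume.
        # volume: nested list indexed as volume[z][y][x];
        # origin: 3-D point where the scanning plane meets the body surface;
        # x_axis, y_axis: unit vectors spanning the scanning plane (from probe pose).
        depth, rows, cols = len(volume), len(volume[0]), len(volume[0][0])
        image = []
        for v in range(height):
            row = []
            for u in range(width):
                p = [origin[i] + spacing * (u * x_axis[i] + v * y_axis[i]) for i in range(3)]
                x, y, z = (int(round(c)) for c in p)
                inside = 0 <= x < cols and 0 <= y < rows and 0 <= z < depth
                row.append(volume[z][y][x] if inside else 0)
            image.append(row)
        return image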
As shown in the appended drawings, the simulated environment 402 presents a virtual probe 406 over a virtual body 412, together with a simulated ultrasound image 404 corresponding to the current position and orientation of the probe assembly 100.
When the user places the probe assembly 100 in close proximity to one of the electronic tags 200a-g arranged over a real body or training mannequin, the simulation software will move the virtual probe 406 to the corresponding location on the virtual body 412.
When the user slides the probe assembly 100 over the scanning patch 302 (e.g., flexible touch surface assembly with optional bonded flexible display 304) the virtual probe 406 will displace over the surface of the virtual body 412 mimicking the motion of the user. If the scanning patch 302 also acts as a display, the software can showcase additional information 306 about the region of the body where the scanning patch 302 is located as well as visual guidance or instructions 308 on how to operate the probe assembly correctly.
The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.
This patent application is a continuation-in-part of U.S. patent application Ser. No. 14/548,210 filed Nov. 19, 2014, which claims the benefit of U.S. Provisional Application No. 61/907,276 filed Nov. 21, 2013; this application is also a continuation-in-part of U.S. patent application Ser. No. 14/494,379, filed Sep. 23, 2014, which claims the benefit of U.S. Provisional Application No. 61/881,338, filed Sep. 23, 2013; this application also claims the benefit of U.S. Provisional Patent Application Ser. Nos. 61/946,646 and 61/946,586, each entitled “System and Method for Augmented Ultrasound Simulation Using Flexible Display Surfaces,” and each filed Feb. 28, 2014, which applications are incorporated in their entirety here by this reference.