This invention relates to extending the capabilities of ultrasound training simulation to support traditional modalities based on animate (e.g., live human models) and inanimate (e.g., training mannequins) objects.
The acquisition of ultrasound skills requires the ability to find an image window (i.e., to place the ultrasound transducer over a site of interest that also enables acoustic sound-wave transmission toward the structure of interest). Upon finding an image window, the operator must acquire an optimal view, which typically involves rotating the transducer around a fixed axis or point. Both of these skills require practice and the development of psychomotor skills that are married to didactic instruction and an understanding of the underlying anatomy. Therefore, an effective tool for learning ultrasound must allow the user to practice both rotational and translational movements of the ultrasound probe. This invention introduces a low-cost solution that allows users to practice the skills of image window and optimal view acquisition in a simulated environment.
Methods of ultrasound simulation have been developed that require trainees both to locate an image window and subsequently to find an optimal image view. These methods rely upon complex six degrees-of-freedom (6-DOF) motion-tracking technologies coupled with inanimate mannequins. Issues of calibration, cost, interference, and ease of use make 6-DOF ultrasound simulations expensive and cumbersome. Many institutions and individuals who wish to teach or learn ultrasonography do not have access to expensive training mannequins equipped with 6-DOF motion-sensing technology.
For the foregoing reasons, there is a need for a more accessible system and method of ultrasound simulation that does not require expensive 6-DOF motion-sensing technology.
The present invention is directed to a system and method of ultrasound training that uses Near Field Communication (NFC) tags, or similar radio frequency tags, that may be placed on animate or inanimate models to define desired locations over the extent of the body that are linked to pre-selected image windows. Trainees use an NFC reader coupled with a rotational 3-DOF motion tracker to manipulate a virtual ultrasound probe. Ultrasound simulation software displays a graphical user interface, a virtual body, the virtual ultrasound probe, and an ultrasound image. The virtual ultrasound probe and ultrasound image continuously update based on the manipulation of the reader and the 3-DOF motion tracker. In this way, trainees may practice finding image windows and optimal image views.
The detailed description set forth below in connection with the appended drawings is intended as a description of presently preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention and claims.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The term 6-DOF means six degrees of freedom and refers to the freedom of movement of a rigid body in three-dimensional space. Specifically, the body is free to move in three translational degrees of forward/back, up/down, left/right, and three rotational degrees of pitch, yaw, and roll. The term rotational 3-DOF refers to the pitch, yaw, and roll.
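For illustration only, the distinction may be captured in two small data structures; this is a minimal sketch in Python, and the names Orientation3DOF and Pose6DOF are invented for the example rather than taken from the invention:

```python
from dataclasses import dataclass

@dataclass
class Orientation3DOF:
    """Rotational 3-DOF: pitch, yaw, and roll, e.g., in degrees."""
    pitch: float
    yaw: float
    roll: float

@dataclass
class Pose6DOF:
    """Full 6-DOF: three translational degrees plus the rotational 3-DOF."""
    x: float  # left/right
    y: float  # up/down
    z: float  # forward/back
    orientation: Orientation3DOF
```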
The term tag is used herein to refer to both transponders and beacons. Generally, NFC, RFID, and other types of passive tags are referred to as transponders, and active tags, such as those using Bluetooth Low Energy, are referred to as beacons.
The term reader is used to refer to the device that emits the query signal to detect the ID of the tag.
A preferred embodiment of the extended-spectrum ultrasound simulation system comprises: a set of tags 202; a reader; a rotational 3-DOF motion tracker; and a computing device, coupled to a display 100, running the ultrasound simulation software.
Embodiments of the ultrasound training method involve providing the hardware and software of the extended-spectrum ultrasound simulation system and directing the trainee 206 in the use of the system. This method would assist a trainee 206 in acquiring the skills of finding an image window and an optimal image view. The method may comprise a setup step in which the trainee 206 places tags 202 on an animate or inanimate training object. After the setup step, the trainee 206 may begin the simulation, which involves moving the reader and motion tracker while viewing on a display 100 a virtual body 108, a virtual ultrasound probe 110, and a simulated ultrasound image 102, as depicted in the accompanying figures.
As discussed, the tags 202 may be used with animate or inanimate models/objects. In some embodiments, the animate training model 204 is a live human being, as shown in the accompanying figures.
In some embodiments, the tags 202 are labeled so that the user can easily identify where to affix the tags 202 on the animate or inanimate object. In some embodiments, the simulation software directs a user in the process of affixing the tags 202 on the animate or inanimate object. In some embodiments, the tags 202 may be provided already affixed to a training object so a trainee 206 does not need to worry about setting up or losing the tags 202.
The tags 202 should have an ID that is mapped to locations on the virtual body 108; that is, the system needs to establish a correspondence between the ID of a tag 202 and a set of coordinates on the virtual body 108. An example of such a mapping is given below.
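As a purely illustrative sketch (the tag IDs, coordinate values, and window labels below are invented for the example), the mapping may be expressed as a lookup table from tag ID to virtual-body coordinates:

```python
# Hypothetical mapping from tag IDs to (x, y, z) coordinates on the
# virtual body; all IDs and values are illustrative only.
TAG_TO_BODY_COORDS = {
    "04:A2:2E:5B": (0.00, 0.35, 0.10),  # e.g., a subxiphoid window
    "04:7F:11:C9": (0.12, 0.40, 0.08),  # e.g., a right-upper-quadrant window
    "04:D3:90:1A": (0.00, 0.55, 0.12),  # e.g., a parasternal window
}

def body_coordinates(tag_id: str):
    """Return the virtual-body coordinates linked to a detected tag ID."""
    return TAG_TO_BODY_COORDS.get(tag_id)  # None if the tag is unknown
```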
The tags 202 are designed so that they may be easily affixed onto the training model, either permanently or for a limited number of uses. If the tags 202 are designed to be affixed permanently to an object, such as a training mannequin 302, they could be embedded directly by the manufacturer. For instance, if a training mannequin 302 features a soft removable skin, the tags 202 could be embedded directly under the skin at the correct locations.
In some embodiments, the tag 202 may be a tag assembly 500 comprising multiple superimposed layers. An example tag assembly 500 having three layers is illustrated in the accompanying figures.
The reader detects tags 202 over a short distance by employing a number of available radio frequency technologies depending on the type of tag. In preferred embodiments, a low-cost NFC reader and passive NFC tags 202 are used. The NFC reader broadcasts an electromagnetic (EM) wave at a specific frequency. The NFC tag 202 harvests energy from the incoming EM wave using magnetic induction. The tag 202 uses the energy to power a small chip 606 that broadcasts a new EM wave that encodes the unique identification number of the tag 202 according to a predefined protocol. The NFC reader then receives the encoded signal and relays the information to the computation engine.
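For concreteness, the read-and-relay step may be sketched with the open-source nfcpy library; this is a minimal sketch assuming a USB NFC reader, and relay_to_engine is a hypothetical stand-in for the computation engine's input:

```python
# Minimal read-and-relay sketch using nfcpy (assumes a USB NFC reader).
import nfc

def relay_to_engine(tag_id: str) -> None:
    # Hypothetical placeholder for handing the ID to the computation engine.
    print(f"tag detected: {tag_id}")

def on_connect(tag) -> bool:
    # tag.identifier holds the tag's unique identification number as bytes.
    relay_to_engine(tag.identifier.hex())
    return False  # return immediately rather than waiting for tag removal

with nfc.ContactlessFrontend("usb") as clf:
    while True:  # poll for tags until interrupted
        clf.connect(rdwr={"on-connect": on_connect})
```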
The rotational 3-DOF motion tracker may comprise one or more sensors for measuring rotation. For example, the motion tracker may comprise a low-cost Inertial Measurement Unit (IMU) composed of a combination of gyroscopes, accelerometers, magnetometers, and other sensors for compensating external disturbances.
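As an illustration of how such sensor combinations can estimate orientation, the sketch below implements a basic complementary filter that blends integrated gyroscope rates with an accelerometer-derived tilt estimate; the axis conventions and parameter values are assumptions for the example, not a description of any particular IMU:

```python
import math

def accel_tilt(ax: float, ay: float, az: float):
    """Estimate pitch and roll (radians) from the accelerometer's view of gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def complementary_filter(pitch, roll, gyro, accel, dt, alpha=0.98):
    """Blend integrated gyro rates (rad/s) with the accelerometer tilt estimate.

    Yaw is omitted: gravity carries no heading information, which is why
    IMUs typically add a magnetometer to compensate for yaw drift.
    """
    gx, gy, _gz = gyro
    acc_pitch, acc_roll = accel_tilt(*accel)
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * acc_pitch
    roll = alpha * (roll + gx * dt) + (1 - alpha) * acc_roll
    return pitch, roll
```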
In some embodiments, the reader and the rotational 3-DOF motion tracker are combined into a single unit called a sensor assembly 208, as shown in the accompanying figures.
The system requires a computing device capable of running the ultrasound simulation software. The ultrasound simulation software should be capable of displaying on a display a graphical user interface (GUI) 104, a virtual body 108, a virtual probe 110, and a dynamic image 102 resembling the appearance of an ultrasound. The computing device should be capable of receiving data from the sensor assembly, including the ID of a detected tag 202 and an orientation. The position and orientation of the virtual probe 110 should update based on these data, and the dynamic image 102 should change accordingly.
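That update behavior may be sketched as a small event loop; this is a minimal sketch in which every name is illustrative, and the render and synthesize_image callables stand in for the actual simulation software:

```python
from dataclasses import dataclass

@dataclass
class ProbeState:
    position: tuple     # (x, y, z) on the virtual body, derived from the tag ID
    orientation: tuple  # (pitch, yaw, roll) from the 3-DOF motion tracker

def update_simulation(tag_id, orientation, tag_map, render, synthesize_image):
    """Update the virtual probe and the simulated ultrasound image."""
    position = tag_map.get(tag_id)
    if position is None:
        return None  # unknown tag: leave the display unchanged
    probe = ProbeState(position, orientation)
    render(probe)            # redraw the virtual body and virtual probe
    synthesize_image(probe)  # regenerate the dynamic ultrasound image
    return probe
```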
The ultrasound simulation software may include instructions on placing tags 202 on an animate or inanimate model/object. The ultrasound simulation software may also include a choice of different virtual bodies 108. For example, practitioners may be given a choice to practice ultrasound training on an average male, a pregnant female, or a pet dog. In some embodiments, the instructions on placing tags 202 are dependent on the body type chosen in the ultrasound simulation software.
The ultrasound simulation software may further include scenarios, objectives, and instructions on obtaining an image window or optimal image view. For example, a scenario may involve instructing a trainee 206 to place the sensor assembly at the abdomen of the model. If the trainee 206 does not find the correct tag 202, the scenario will not progress. This may instruct a trainee 206 in finding the correct image window. The scenario may then instruct the trainee 206 to find the optimal image view. The trainee 206 will need to hold the sensor assembly at the correct orientation to progress. In some embodiments, a specific virtual probe position will be displayed on the computing device, which a trainee 206 will be required to mimic.
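That progression logic may be sketched as a two-stage check; this is a minimal sketch whose target values and tolerance are invented for the example (and which, for brevity, ignores angle wraparound):

```python
def scenario_step(detected_tag, orientation, target_tag, target_orientation,
                  tolerance_deg=10.0):
    """Advance the scenario only once the window, then the view, are acquired."""
    if detected_tag != target_tag:
        return "find_window"  # wrong site: the scenario does not progress
    # Correct image window found; now check for the optimal image view.
    errors = [abs(a - b) for a, b in zip(orientation, target_orientation)]
    if max(errors) > tolerance_deg:
        return "find_view"    # right window, but the orientation is off
    return "complete"         # optimal view acquired
```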
A high-level block diagram of an exemplary computing device 800 that may be used to implement the systems, apparatus, and methods described herein is illustrated in the accompanying figures. Computing device 800 comprises a processor 810 operatively coupled to a data storage device 820 and a memory 830.
Computing device 800 also includes one or more network interfaces 840 for communicating with other devices via a network. Computing device 800 also includes one or more input/output devices 850 that enable user interaction with computing device 800 (e.g., display, keyboard, touchpad, mouse, speakers, buttons, etc.).
Processor 810 may include special purpose processors with software instructions incorporated in the processor design, as well as general purpose processors controlled by instructions held in storage device 820 or memory 830, and may be the sole processor or one of multiple processors of computing device 800. Processor 810 may be a self-contained computing system, containing multiple cores or processors, a bus, memory controller, cache, etc. A multi-core processor may be symmetric or asymmetric. Processor 810, data storage device 820, and/or memory 830 may include, be supplemented by, or incorporated in, one or more application-specific integrated circuits (ASICs) and/or one or more field programmable gate arrays (FPGAs). It will be appreciated that the disclosure may operate on a computing device 800 with one or more processors 810 or on a group or cluster of computing devices networked together to provide greater processing capability.
Data storage device 820 and memory 830 each comprise a tangible, non-transitory computing device-readable storage medium. By way of example, and not limitation, such a non-transitory computing device-readable storage medium can include random access memory (RAM), high-speed random access memory such as dynamic random access memory (DRAM), static random access memory (SRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile disc read-only memory (DVD-ROM) disks, or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code means in the form of computing device-executable instructions, data structures, or processor chip design. When information is transferred or provided over a network or another communications connection (whether hardwired, wireless, or a combination thereof) to a computing device, the computing device properly views the connection as a computing device-readable medium. Thus, any such connection is properly termed a computing device-readable medium. Combinations of the above should also be included within the scope of computing device-readable media.
Network/communication interface 840 enables the computing device 800 to communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet, and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN), and/or a metropolitan area network (MAN), and with other devices, using any suitable communications standards, protocols, and technologies. By way of example, and not limitation, such suitable communications standards, protocols, and technologies can include Ethernet, Wi-Fi (e.g., IEEE 802.11), WiMAX (e.g., IEEE 802.16), Bluetooth, near field communication (NFC), radio frequency systems, infrared, GSM, EDGE, HSDPA, CDMA, TDMA, quad-band, VoIP, IMAP, POP, XMPP, SIMPLE, IMPS, SMS, or any other suitable communications protocols. By way of example, and not limitation, the network interface 840 enables the computing device 800 to transfer data, synchronize information, update software, or perform any other suitable operation.
Input/output devices 850 may include peripherals, such as the sensor assembly or the individual reader and motion tracker. Input/output devices 850 may also include monitors or touchscreens for display, a keyboard and mouse for input, speakers for audio output, and other such devices.
Any or all of the systems and apparatus discussed herein, including personal computing devices, tablet computing devices, hand-held devices, cellular telephones, servers, databases, cloud-computing environments, and components thereof, may be implemented using a computing device such as computing device 800.
One skilled in the art will recognize that an implementation of an actual computing device or computing device system may have other structures and may contain other components as well, and that the foregoing is a high-level representation of some of the components of such a computing device, presented for illustrative purposes.
The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.
This patent application is a continuation of U.S. patent application Ser. No. 14/548,210, filed Nov. 19, 2014, which claims the benefit of U.S. Provisional Patent Application Ser. No. 61/907,276, filed Nov. 21, 2013, entitled “SYSTEM AND METHOD FOR EXTENDED SPECTRUM ULTRASOUND TRAINING USING ANIMATE AND INANIMATE TRAINING OBJECTS,” which applications are incorporated in their entirety here by this reference.
This invention was made with government support under Contract No. W81XWH-11-C-0529 awarded by the U.S. Army Medical Research and Materiel Command. The government has certain rights in the invention.