This invention relates generally to systems and methods for providing medical training, and more specifically to medical training systems and methods that at least partly involve simulations of medical procedures and operations, particularly ultrasonography.
Paradigm shifts are taking place in the world of medicine and technology. Point-of-care ultrasonography is redefining the very essence of the physical examination and the way clinicians practice medicine. Point-of-care ultrasound refers to the use of portable ultrasonography at a patient's bedside for diagnostic (e.g., symptom- or sign-based examination) and therapeutic (e.g., image-guidance) purposes. The principal barrier to use of this life-saving technology is a lack of ultrasound training. User adoption is limited by the psychomotor skills required to manipulate an ultrasound probe, coupled with the ability to interpret the resulting ultrasound images. Concurrent with improving psychomotor skills, training must also reflect the psychosocial stressors associated with performing procedures. It is clear that medical education must catch up with the expanded use of ultrasonography if tomorrow's practitioners are to provide superior healthcare for their patients.
Developing competence in performing ultrasonography in a clinical setting requires integrated cognitive (image interpretation) and psychomotor (optimal image window acquisition) skills, as well as the ability to perform these procedures under stress. Once an optimal image window is acquired and correctly interpreted, the information must be correctly applied to patient care. The opportunity cost of training healthcare providers in ultrasonography is extremely high. Optimal training requires: (1) a qualified instructor; (2) trainees; (3) an ultrasound machine; (4) a patient with a pathologic condition; and (5) an environment that accurately recreates the psychosocial challenges of performing procedures. All of these elements must come together in the same place at the same time.
Currently available training methods all have significant limitations. These include clinical bedside teaching, hands-on training courses, phantom models, and high-fidelity yet very expensive ultrasound simulator workstations. They may involve bulky training platforms that require multiple users to visit a simulation center (e.g., UltraSim®, CAE Healthcare) or require the presence of an actual ultrasound machine (e.g., Blue Phantom™). These ultrasound-training solutions employ high-priced dedicated computer hardware and software that do not deploy over the Internet. Alternative training products provide a limited library of purely didactic training solutions that are not accompanied by any hands-on training experience (e.g., EMSONO). None of the above solutions recreates the psychosocial stressors of performing procedures in a real-life clinical setting.
Conversely, the fields of virtual reality, augmented reality, and mixed reality offer the ability to create virtual (digital) characters, environments, and guides that can be superimposed on actual physical objects. However, none of these technologies offers the ability to develop and train individuals in highly refined psychomotor skills, such as performing an ultrasound-guided procedure.
For the foregoing reasons, there is a need for a system and method to develop the psychomotor skills of individuals on a wide variety of tasks, such as ultrasound-guided procedures, in a context that accurately imposes the psychosocial stressors of performing those procedures in a real clinical setting. Individuals skilled in these arts and technologies have not been able to devise a solution to the described limitations of existing training solutions. The absence of products or technology solutions to this problem at the time of this filing speaks to the lack of a clear or obvious solution.
The present invention is directed to creating the perception of presence in a multisensory psychomotor skill training environment that extends existing patented technologies, described in detail below, by adding realism through the representation of virtual (digital) characters in a training scenario (e.g., a pregnant patient), superimposing virtual (digital) guides or prompts onto real physical objects (e.g., anatomical landmarks or guides onto a mannequin or patient), or embedding the training instruments within a broader virtual (digital) scenario (e.g., an operating room).
The detailed description set forth below in connection with the appended drawings is intended as a description of presently-preferred embodiments of the invention and is not intended to represent the only forms in which the present invention may be constructed or utilized. The description sets forth the functions and the sequence of steps for constructing and operating the invention in connection with the illustrated embodiments. It is to be understood, however, that the same or equivalent functions and sequences may be accomplished by different embodiments that are also intended to be encompassed within the spirit and scope of the invention.
The present invention is a system and method for delivering psychomotor skill training that provides didactic instruction for performing a skill in an at least partially virtual environment. In some embodiments, the psychomotor skill training system is integrated with existing multimodal ultrasound training systems by using an internet portal, tracking performance, and providing user feedback within a multisensory environment. The feedback may take the form of auditory, visual, or tactile feedback that alerts or instructs the user regarding his or her performance.
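By way of illustration only, the multimodal feedback described above could be organized as a simple dispatch over output channels. The sketch below is a minimal, assumption-laden Python example; the names FeedbackEvent, Modality, and deliver are hypothetical and are not drawn from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Modality(Enum):
    AUDITORY = auto()
    VISUAL = auto()
    TACTILE = auto()

@dataclass
class FeedbackEvent:
    modality: Modality
    message: str        # e.g., "Rotate the probe 15 degrees clockwise"
    intensity: float    # 0.0-1.0, meaningful for tactile or auditory cues

def deliver(event: FeedbackEvent) -> None:
    # Dispatch to the appropriate output channel; stubs print for illustration.
    if event.modality is Modality.AUDITORY:
        print(f"[audio] {event.message}")
    elif event.modality is Modality.VISUAL:
        print(f"[overlay] {event.message}")
    else:
        print(f"[haptic pulse @ {event.intensity:.1f}] {event.message}")

deliver(FeedbackEvent(Modality.VISUAL, "Slide the probe toward the xiphoid", 0.0))
```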
The existing multimodal ultrasound training systems comprise a wide variety of immersive and non-immersive visualization technologies that have been described as virtual reality, augmented reality, and mixed reality. U.S. Pat. No. 8,480,404 to Savitsky (the '404 patent) and U.S. Pat. No. 8,297,983 to Savitsky et al. (the '983 patent) describe multimodal ultrasound training systems in greater detail. The '404 and '983 patents are incorporated in their entirety here by this reference.
The '404 patent teaches a medical procedure training system based on a computer platform that provides multimodal education within a virtual environment. The system integrates digital video, three-dimensional modeling, and force-feedback devices for the purpose of training medical professionals on the skills required to perform medical procedures. It can be used to teach a variety of medical procedures and interventions, including ultrasound. The '983 patent describes an ultrasound training system having a data capture module, a digital asset management module, a validation module, didactic content, a media asset production module, an integration layer, an internet-based portal, a software client, and a peripheral probe. The combination of the disclosed elements creates a system that enables the creation and delivery of high-quality ultrasound education and training in a low-cost, widely deployable, and scalable manner, with a facilitated method for processing orders and financial transactions between customers and content providers.
The multisensory psychomotor skill training environment of the present invention would extend the patented technologies described above by further adding realism through the representation of virtual (digital) characters in a training scenario (e.g., a pregnant patient), superimposing virtual (digital) guides or prompts onto real physical objects (e.g., anatomical landmarks or guides onto a mannequin or patient), or embedding the training instruments within a broader virtual (digital) scenario (e.g., an operating room). The ability to create these virtual, augmented, and mixed reality scenarios enables cost-effective, realistic, and improved training capabilities that mirror real-life training scenarios.
Importantly, the present invention would eliminate the need for physical structures (e.g., hospital beds, manikins, etc.) within the training scenario, as these elements can be represented virtually, making training scalable, cost-effective, and efficient. Combining the virtual, augmented, and mixed reality scenarios with the multimodal ultrasound training system enables effective ultrasound training in fine psychomotor skills with the added psychosocial stressors associated with performing these procedures in real clinical settings. This proposed system and method for providing multisensory psychomotor skill training addresses the unmet need for a method to develop the psychomotor skills of individuals on a wide variety of tasks, such as ultrasound-guided procedures, in a realistic clinical scenario with proportionate psychosocial stressors.
The present invention hinges on emerging technologies capable of creating the perception of presence to enhance the psychological impact and the effectiveness of medical training simulators. The notion of presence for our purpose has two fundamental meanings:
(1) the cognitive illusion of being truly transported into an alternate world recreated by computer simulations; or
(2) the cognitive illusion that an object, a person, or other interactive medium generated by computer simulations is truly present in the surrounding world perceived by the user.
The notion of presence is fundamentally important in its ability to instill a powerful emotional response in users, an essential requisite for our invention. Collectively, these technologies are defined as projector of presence (“POP”) devices. POP devices address the problem of recreating the visual and auditory stimuli that play a primary role in creating the perception of presence, and of synchronizing those stimuli with the reality that surrounds the user. A practitioner skilled in the art will recognize that several commercially available POP devices already exist. POP devices are grouped into several broad categories:
(1) Virtual Reality (VR), wherein the sight of the real world is entirely blocked by an opaque medium and the user's field of view is completely replaced by a rendering of a simulated reality (virtual world); VR solutions generally rely on sophisticated motion-sensing technologies to synchronize the rendering of the virtual world with the user's head motion and, in some cases, his or her position in the real world;
(2) Augmented Reality (AR), wherein content is overlaid on the user's field of view of the real world and enhanced with audio that responds to the surrounding real world; true AR solutions are generally restricted to showing notifications or graphical user interfaces superimposed on real objects; or
(3) Mixed Reality (MR), wherein solutions extend the basic notion of AR by synchronizing the content overlaid onto the user's field of view with the surrounding real world, thus recreating a convincing perception that the simulated objects are actually present in front of the user.
While the present invention can be realized with any of these categories of POP devices, the preferred embodiment utilizes VR and MR for their enhanced ability to create the perception of presence. Further, it has been shown that for many applications tactile feedback is also very important in establishing a strong perception of presence, especially in situations where the simulated real-world task does in fact require interaction with a physical object. The present invention conveys the strongest perception of presence and succeeds in recreating the psychological stressors that occur in real-world clinical scenarios.
Unless otherwise stated, the following terms as used herein have the following definitions.
“Projector of Presence” or “POP” device refers to any VR, AR, or MR device capable of measuring key aspects of the user's head position, body placement, and gaze direction using a first set of sensors that measure position, orientation, and/or motion in order to project realistic renditions of individual objects or entire settings into the user's field of view. These types of technologies already exist and are rapidly evolving.
“Tracked probe” refers to a device preferably in the shape of a medical instrument with a second set of sensors to measure its position, orientation and/or motion, and with communication hardware to transmit sensor information to the computer.
“Scanning surface” refers to a planar or curvilinear solid surface, such as a medical manikin, on which the user can place and move the tracked probe. Preferably, the geometry of the scanning surface is known to the simulation software so that the tactile feedback of the tracked probe moving on the scanning surface can be reproduced accurately in the virtual environment projected by the POP device. The scanning surface may contain a third set of sensors so that its position, orientation, and/or motion can be measured and detected.
The sensors for detecting position, orientation, and/or motion can be realized using MEMS sensors, such as accelerometers, gyroscopes, and magnetometers, for self-contained solutions, or they can be built using external optical or electromagnetic trackers that act as position and orientation references. Other sensing solutions capable of measuring position, orientation, and/or motion fall within the spirit of this invention.
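As one non-authoritative illustration of the self-contained MEMS option, the sketch below fuses gyroscope and accelerometer samples into a drift-corrected pitch estimate using a complementary filter. Everything here, from the ImuSample fields to the filter constant alpha, is an assumption chosen for illustration rather than a disclosed implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class ImuSample:
    gyro_pitch_rate: float   # rad/s about the probe's lateral axis
    accel_x: float           # m/s^2, component along the probe's long axis
    accel_z: float           # m/s^2, component along the probe's face normal
    dt: float                # seconds since the previous sample

def fuse_pitch(prev_pitch: float, s: ImuSample, alpha: float = 0.98) -> float:
    """Complementary filter: integrate the gyro for short-term accuracy,
    and pull toward the accelerometer's gravity-derived pitch to cancel drift."""
    gyro_pitch = prev_pitch + s.gyro_pitch_rate * s.dt
    accel_pitch = math.atan2(s.accel_x, s.accel_z)
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

pitch = 0.0
for sample in [ImuSample(0.05, 0.4, 9.8, 0.01)] * 100:  # synthetic sensor stream
    pitch = fuse_pitch(pitch, sample)
print(f"estimated pitch: {math.degrees(pitch):.2f} deg")
```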
A “computer” refers to any type of desktop, laptop, tablet, mobile computer, or any other computational device capable of running the simulation software and/or communicating through a wired or wireless connection with the POP device and tracked probe.
The “simulation software” runs on the computer and provides a realistic representation of a clinical scenario to train students on the use of medical imaging and how to perform certain types of procedures. Several types of simulation solutions exist on the market and demonstrate the importance of this type of technology. However, current solutions are not capable of instilling the same types of psychosocial stressors that a medical practitioner encounters in real life, or of emulating interactions with other medical professionals involved in a clinical scenario (e.g., nurses and radiologists).
The simulation software, augmented with a POP device, renders and animates a clinical scene and provides a convincing rendition of a patient in pain who emotes vigorously during a procedure. The simulation software also renders and animates other medical professionals interacting and talking with the user to help him or her understand the team effort required to diagnose and treat a patient in real life, and to emulate the psychological pressure of being surrounded by other individuals who must coordinate their efforts to achieve a desired clinical goal. The computer monitors the position, orientation, and movement of the tracked probe to determine whether the user has performed the proper procedure with a predetermined level of proficiency by determining whether the tracked probe was in the correct position and orientation and moved in a predetermined manner within a specified tolerance level for a given scenario.
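A minimal sketch of this proficiency check follows, assuming probe poses are reduced to a position vector and a scalar-first orientation quaternion; the tolerance values and the names within_tolerance and quat_angle_deg are illustrative assumptions, not the claimed method.

```python
import numpy as np

def quat_angle_deg(q1: np.ndarray, q2: np.ndarray) -> float:
    """Smallest rotation angle, in degrees, taking orientation q1 to q2."""
    dot = abs(float(np.dot(q1 / np.linalg.norm(q1), q2 / np.linalg.norm(q2))))
    return float(np.degrees(2.0 * np.arccos(min(dot, 1.0))))

def within_tolerance(probe_pos, probe_quat, target_pos, target_quat,
                     pos_tol_mm=10.0, angle_tol_deg=15.0) -> bool:
    """True when the tracked probe is within the scenario's positional and
    angular tolerance of the target scanning window."""
    pos_err_mm = np.linalg.norm(np.asarray(probe_pos) - np.asarray(target_pos)) * 1000.0
    return pos_err_mm <= pos_tol_mm and quat_angle_deg(probe_quat, target_quat) <= angle_tol_deg

# Example: probe 4 mm and roughly 10 degrees away from the target window -> passes.
ok = within_tolerance(
    probe_pos=[0.004, 0.0, 0.0], probe_quat=np.array([0.996, 0.087, 0.0, 0.0]),
    target_pos=[0.0, 0.0, 0.0], target_quat=np.array([1.0, 0.0, 0.0, 0.0]))
print(ok)
```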
As illustrated in the appended drawings, in use a user wears and activates a POP device. A computer in communication with the POP device generates a simulation in the user's field of view. The simulation may be all that the user sees, or it may be integrated with the real-world environment in the user's field of view. The user holds a tracked probe in his or her hand, and when brought into the user's field of view, the tracked probe enters the simulation. As the tracked probe interacts with the simulation, it provides tactile feedback that contributes to the user's perception of presence.
In some embodiments, the tracked probe may be used with a scanning surface. The tracked probe interacts with the scanning surface, and the combined interaction provides tactile feedback that contributes to the user's perception of presence.
In some embodiments, the computer in communication with the POP device and the tracked probe determines the position and orientation of the tracked probe and the POP device and projects a simulated environment on the POP device based on the position and orientation of the POP device and the tracked probe. The POP device uses the sensors to measure information, such as the orientation and/or position of the user's head and gaze direction, and reports the information to the computer.
The computer may be integrated with the POP device or may be a separate system that communicates with the POP device and supports the processing requirements of the POP device. The computational system uses the information to generate a rendering of a simulated scene in the POP device for display in the user's field of view. The simulated scene may represent a clinical scenario or individual objects in a way that matches the user's viewpoint and the current state of the tracked probe.
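Schematically, the per-frame coupling between the POP device, the tracked probe, and the renderer might resemble the loop below. This is only a sketch under assumed names; poll_head_pose, poll_probe_pose, and render_frame stand in for device-specific calls that the disclosure does not specify.

```python
import time

def poll_head_pose():
    """Placeholder for the POP device's head position and gaze query."""
    return {"position": (0.0, 1.6, 0.0), "gaze": (0.0, 0.0, -1.0)}

def poll_probe_pose():
    """Placeholder for the tracked probe's sensor report."""
    return {"position": (0.1, 0.9, -0.4), "orientation": (1.0, 0.0, 0.0, 0.0)}

def render_frame(head, probe):
    """Placeholder: draw the clinical scene from the head pose, with the
    virtual ultrasound probe drawn at the tracked probe's pose."""
    pass

def simulation_loop(frame_rate_hz: float = 60.0, frames: int = 3):
    frame_time = 1.0 / frame_rate_hz
    for _ in range(frames):  # a real loop would run until the scenario ends
        head = poll_head_pose()
        probe = poll_probe_pose()
        render_frame(head, probe)
        time.sleep(frame_time)

simulation_loop()
```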
The POP device renders an image, and optionally auditory feedback, to convince the user that the simulated scene generated by the computer simulation is truly present in the user's field of view. By way of example only, in an ultrasound training application, the POP device may present a simulated hospital scene in which the tracked probe is represented as an ultrasound probe. The scanning surface may be presented as a patient. As the user looks around the room, the POP device generates a simulated monitor when the user looks in a specific location. The specific location may be predetermined, or the user may select a particular location at which to generate the simulated monitor. The user can then begin his or her ultrasound examination on the scanning surface. If the user's gaze is directed towards the tracked probe and scanning surface, the user sees the ultrasound probe and patient. If the user turns his or her head toward the particular location where the monitor was established, the user sees an ultrasound image on the simulated monitor. As the user manipulates the tracked probe along the scanning surface, the ultrasound image changes accordingly, giving the user the perception of actually conducting an ultrasound scan.
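One plausible way to make the displayed ultrasound image respond to probe motion, consistent with but not prescribed by the description above, is to resample a stored 3D image volume at the plane implied by the probe's pose. The sketch below extracts an axis-aligned slice for simplicity; slice_for_probe and the synthetic volume are assumptions, and an arbitrary probe tilt would require trilinear resampling.

```python
import numpy as np

rng = np.random.default_rng(0)
volume = rng.random((64, 128, 128))  # stands in for a pre-acquired 3D ultrasound volume

def slice_for_probe(volume: np.ndarray, depth_fraction: float) -> np.ndarray:
    """Return the 2D image plane at the probe's current depth along the
    volume's first axis; arbitrary tilts would need trilinear resampling."""
    idx = int(np.clip(depth_fraction, 0.0, 1.0) * (volume.shape[0] - 1))
    return volume[idx]

image = slice_for_probe(volume, depth_fraction=0.5)
print(image.shape)  # (128, 128) frame shown on the simulated monitor
```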
The simulation software can create a scene in which the user must perform the proper procedure to pass a test. Whether the user passes the test is determined by the position, orientation, and/or movement of the tracked probe. Timing can also be integrated into the scene by requiring the user to perform a particular procedure at a specified time that the user must recognize. Distractions can also be incorporated into the scene to determine whether the user is able to maintain focus.
A database of scenes may be stored on the computer or other storage device for the user to select to test various skills and knowledge.
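A hedged sketch of how such a database of scenes might be structured follows, pairing each scenario with a target probe pose, tolerances, a timing window, and optional distractions as described above; every field name and value here is an assumption for illustration.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Scene:
    name: str
    target_position: Tuple[float, float, float]            # meters, scanning-surface frame
    target_orientation: Tuple[float, float, float, float]  # quaternion, scalar first
    position_tolerance_mm: float
    angle_tolerance_deg: float
    time_window_s: Tuple[float, float]                     # when the procedure must occur
    distractions: List[str] = field(default_factory=list)

SCENE_DB = {
    "fast_exam": Scene("FAST exam", (0.02, 0.0, 0.0), (1, 0, 0, 0),
                       10.0, 15.0, (5.0, 60.0), ["monitor alarm", "nurse query"]),
    "ob_scan": Scene("Obstetric scan", (0.0, 0.05, 0.0), (1, 0, 0, 0),
                     12.0, 20.0, (0.0, 120.0), []),
}

selected = SCENE_DB["fast_exam"]
print(selected.name, selected.distractions)
```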
The system can take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment containing both hardware and software elements. In one embodiment, the system is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
Furthermore, the system can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium comprise a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks comprise compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
A data processing system suitable for storing and/or executing program code comprises at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code is retrieved from bulk storage during execution.
Input/output or I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers.
Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. Modems, cable modems, and Ethernet cards are just a few of the currently available types of network adapters.
As described above, aspects of the present application are embodied in a World Wide Web (“WWW” or “Web”) site accessible via the Internet. As is well known to those skilled in the art, the term “Internet” refers to the collection of networks and routers that use the Transmission Control Protocol/Internet Protocol (“TCP/IP”) to communicate with one another. The Internet can include a plurality of local area networks (“LANs”) and a wide area network (“WAN”) that are interconnected by routers. The routers are special-purpose computers used to interface one LAN or WAN to another. Communication links within the LANs may be wireless, twisted wire pair, coaxial cable, or optical fiber, while communication links between networks may utilize analog telephone lines, digital T-1 lines, T-3 lines, or other communications links known to those skilled in the art.
Furthermore, computers and other related electronic devices can be remotely connected to either the LANs or the WAN via a digital communications device, modem and temporary telephone line, or a wireless link. It will be appreciated that the Internet comprises a vast number of such interconnected networks, computers, and routers.
The foregoing description of the preferred embodiment of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the invention not be limited by this detailed description, but by the claims and the equivalents to the claims appended hereto.
This application is a continuation of U.S. patent application Ser. No. 15/880,290, filed Jan. 25, 2018, titled “System and Method for Multisensory Psychomotor Skill Training,” which claims the benefit of U.S. Provisional Patent Application No. 62/450,975, filed Jan. 26, 2017, which applications are incorporated here in their entirety by this reference.
Number | Name | Date | Kind |
---|---|---|---|
1488233 | Diehl | Mar 1924 | A |
1762937 | Staud | Jun 1930 | A |
2019121 | De Rewal | Oct 1935 | A |
2112019 | Gyger | Mar 1938 | A |
2127610 | Moore | Aug 1938 | A |
2705049 | Brooks | Mar 1955 | A |
2705307 | Nyswander | Mar 1955 | A |
2722947 | Sragal | Nov 1955 | A |
2886316 | Ayala | May 1959 | A |
4040171 | Cline et al. | Aug 1977 | A |
4838863 | Allard et al. | Jun 1989 | A |
4838869 | Allard | Jun 1989 | A |
4994034 | Botich et al. | Feb 1991 | A |
5231381 | Duwaer | Jul 1993 | A |
5513992 | Refait | May 1996 | A |
5609485 | Bergman et al. | Mar 1997 | A |
5678565 | Sarvazyan | Oct 1997 | A |
5689443 | Ramanathan | Nov 1997 | A |
5701900 | Shehada et al. | Dec 1997 | A |
5704791 | Gillio | Jan 1998 | A |
5755577 | Gillio | May 1998 | A |
5767839 | Rosenberg | Jun 1998 | A |
5776062 | Nields | Jul 1998 | A |
5791908 | Gillio | Aug 1998 | A |
5800177 | Gillio | Sep 1998 | A |
5800178 | Gillio | Sep 1998 | A |
5800179 | Bailey | Sep 1998 | A |
5800350 | Coppleson et al. | Sep 1998 | A |
5827942 | Madsen et al. | Oct 1998 | A |
5882206 | Gillio | Mar 1999 | A |
5889237 | Makinwa | Mar 1999 | A |
5934288 | Avila et al. | Aug 1999 | A |
6001472 | Ikeda et al. | Dec 1999 | A |
6048312 | Ishrak et al. | Apr 2000 | A |
6063030 | Vara et al. | May 2000 | A |
6068597 | Lin | May 2000 | A |
6074213 | Hon | Jun 2000 | A |
6113395 | Hon | Sep 2000 | A |
6117078 | Lysyansky et al. | Sep 2000 | A |
6122538 | Sliwa, Jr. et al. | Sep 2000 | A |
6156213 | Dudley et al. | Dec 2000 | A |
6193657 | Drapkin | Feb 2001 | B1 |
6267599 | Bailey | Jul 2001 | B1 |
6468212 | Scott et al. | Oct 2002 | B1 |
6502756 | Fåhraeus | Jan 2003 | B1 |
6511427 | Sliwa, Jr. et al. | Jan 2003 | B1 |
6548768 | Pettersson et al. | Apr 2003 | B1 |
6570104 | Ericson et al. | May 2003 | B1 |
6654000 | Rosenberg | Nov 2003 | B2 |
6663008 | Pettersson et al. | Dec 2003 | B1 |
6665554 | Charles et al. | Dec 2003 | B1 |
6666376 | Ericson | Dec 2003 | B1 |
6667695 | Pettersson et al. | Dec 2003 | B2 |
6674427 | Pettersson et al. | Jan 2004 | B1 |
6689966 | Wiebe | Feb 2004 | B2 |
6693626 | Rosenberg | Feb 2004 | B1 |
6694163 | Vining | Feb 2004 | B1 |
6698660 | Fåhraeus et al. | Mar 2004 | B2 |
6714213 | Lithicum et al. | Mar 2004 | B1 |
6714901 | Cotin et al. | Mar 2004 | B1 |
6719470 | Berhin | Apr 2004 | B2 |
6722574 | Skantze et al. | Apr 2004 | B2 |
6732927 | Olsson et al. | May 2004 | B2 |
6750877 | Rosenberg et al. | Jun 2004 | B2 |
6780016 | Toly | Aug 2004 | B1 |
6816148 | Mallett et al. | Nov 2004 | B2 |
6836555 | Ericson et al. | Dec 2004 | B2 |
6854821 | Ericson et al. | Feb 2005 | B2 |
6864880 | Hugosson et al. | Mar 2005 | B2 |
6878062 | Bjorklund et al. | Apr 2005 | B2 |
6896650 | Tracey et al. | May 2005 | B2 |
6916283 | Tracey et al. | Jul 2005 | B2 |
6927916 | Craven-Bartle | Aug 2005 | B2 |
6929183 | Pettersson | Aug 2005 | B2 |
6929481 | Alexander et al. | Aug 2005 | B1 |
6947033 | Fåhraeus et al. | Sep 2005 | B2 |
6958747 | Sahlberg et al. | Oct 2005 | B2 |
6966495 | Lynggaard et al. | Nov 2005 | B2 |
6992655 | Ericson et al. | Jan 2006 | B2 |
7002559 | Ericson | Feb 2006 | B2 |
7035429 | Andreasson | Apr 2006 | B2 |
7037258 | Chatenever et al. | May 2006 | B2 |
7050653 | Edso et al. | May 2006 | B2 |
7054487 | Ericson et al. | May 2006 | B2 |
7072529 | Hugosson et al. | Jul 2006 | B2 |
7089308 | Fransson et al. | Aug 2006 | B2 |
7094977 | Ericson et al. | Aug 2006 | B2 |
7110604 | Olsson | Sep 2006 | B2 |
7120320 | Petterson et al. | Oct 2006 | B2 |
7121465 | Rignell | Oct 2006 | B2 |
7127682 | Sandstrom et al. | Oct 2006 | B2 |
7143952 | Ericson | Dec 2006 | B2 |
7145556 | Pettersson | Dec 2006 | B2 |
7154056 | Bergqvist et al. | Dec 2006 | B2 |
7162087 | Bryborn | Jan 2007 | B2 |
7167164 | Ericson et al. | Jan 2007 | B2 |
7172131 | Pettersson et al. | Feb 2007 | B2 |
7175095 | Pettersson et al. | Feb 2007 | B2 |
7176896 | Fahraeus et al. | Feb 2007 | B1 |
7180509 | Fermgard et al. | Feb 2007 | B2 |
7195166 | Olsson et al. | Mar 2007 | B2 |
7202861 | Lynggaard | Apr 2007 | B2 |
7202963 | Wiebe et al. | Apr 2007 | B2 |
7239306 | Fahraeus et al. | Jul 2007 | B2 |
7246321 | Bryborn et al. | Jul 2007 | B2 |
7248250 | Pettersson et al. | Jul 2007 | B2 |
7249256 | Hansen et al. | Jul 2007 | B2 |
7249716 | Bryborn | Jul 2007 | B2 |
7254839 | Fahraeus et al. | Aug 2007 | B2 |
7278017 | Skantze | Oct 2007 | B2 |
7281668 | Pettersson et al. | Oct 2007 | B2 |
7283676 | Olsson | Oct 2007 | B2 |
7293697 | Wiebe et al. | Nov 2007 | B2 |
7295193 | Fahraeus | Nov 2007 | B2 |
7296075 | Lynggaard | Nov 2007 | B2 |
7321692 | Bryborn et al. | Jan 2008 | B2 |
7333947 | Wiebe et al. | Feb 2008 | B2 |
7345673 | Ericson et al. | Mar 2008 | B2 |
7353393 | Hansen et al. | Apr 2008 | B2 |
7356012 | Wiebe et al. | Apr 2008 | B2 |
7371068 | Lloyd et al. | May 2008 | B2 |
7382361 | Burstrom et al. | Jun 2008 | B2 |
7385595 | Bryborn et al. | Jun 2008 | B2 |
7408536 | Hugosson et al. | Aug 2008 | B2 |
7415501 | Burstrom | Aug 2008 | B2 |
7418160 | Lynggaard | Aug 2008 | B2 |
7422154 | Ericson | Sep 2008 | B2 |
7441183 | Burstrom et al. | Oct 2008 | B2 |
7457413 | Thuvesholmen et al. | Nov 2008 | B2 |
7457476 | Olsson | Nov 2008 | B2 |
7543753 | Pettersson | Jun 2009 | B2 |
7588191 | Pettersson et al. | Sep 2009 | B2 |
7600693 | Pettersson | Oct 2009 | B2 |
7649637 | Wiebe et al. | Jan 2010 | B2 |
7670070 | Craven-Bartle | Mar 2010 | B2 |
7672513 | Bjorklund et al. | Mar 2010 | B2 |
7701446 | Sahlberg et al. | Apr 2010 | B2 |
7710408 | Ericson | May 2010 | B2 |
7751089 | Fahraeus et al. | Jul 2010 | B2 |
7753283 | Lynggaard | Jul 2010 | B2 |
7777777 | Bowman et al. | Aug 2010 | B2 |
7788315 | Johansson | Aug 2010 | B2 |
7794388 | Draxinger et al. | Sep 2010 | B2 |
7806696 | Alexander et al. | Oct 2010 | B2 |
7833018 | Alexander et al. | Nov 2010 | B2 |
7850454 | Toly | Dec 2010 | B2 |
7857626 | Toly | Dec 2010 | B2 |
7871850 | Park | Jan 2011 | B2 |
7931470 | Alexander et al. | Apr 2011 | B2 |
8244506 | Butsev et al. | Aug 2012 | B2 |
8294972 | Chung | Oct 2012 | B2 |
8428326 | Falk et al. | Apr 2013 | B2 |
8480404 | Savitsky | Jul 2013 | B2 |
8480406 | Alexander et al. | Jul 2013 | B2 |
8556635 | Siassi | Oct 2013 | B2 |
8721344 | Marmaropoulos et al. | May 2014 | B2 |
9128116 | Welch et al. | Sep 2015 | B2 |
9251721 | Lampotang | Feb 2016 | B2 |
9436993 | Stolka et al. | Sep 2016 | B1 |
9870721 | Savitsky et al. | Jan 2018 | B2 |
9911365 | Toly | Mar 2018 | B2 |
10052010 | Feddema | Aug 2018 | B2 |
10132015 | Woodruff et al. | Nov 2018 | B2 |
11011077 | Garcia Kilroy | May 2021 | B2 |
20010031920 | Kaufman et al. | Oct 2001 | A1 |
20020076581 | McCoy | Jun 2002 | A1 |
20020076681 | Leight et al. | Jun 2002 | A1 |
20020088926 | Prasser | Jul 2002 | A1 |
20020099310 | Kimchy et al. | Jul 2002 | A1 |
20020168618 | Anderson et al. | Nov 2002 | A1 |
20020173721 | Grunwald et al. | Nov 2002 | A1 |
20040043368 | Hsieh et al. | Mar 2004 | A1 |
20040087850 | Okerlund et al. | May 2004 | A1 |
20050119569 | Ohtake | Jun 2005 | A1 |
20050181342 | Toly | Aug 2005 | A1 |
20050214726 | Feygin et al. | Sep 2005 | A1 |
20050228617 | Kerwin et al. | Oct 2005 | A1 |
20050283075 | Ma et al. | Dec 2005 | A1 |
20060020204 | Serra et al. | Jan 2006 | A1 |
20060098010 | Dwyer et al. | May 2006 | A1 |
20070088213 | Poland | Apr 2007 | A1 |
20070161904 | Urbano | Jul 2007 | A1 |
20070232907 | Pelissier et al. | Oct 2007 | A1 |
20070233085 | Colvin et al. | Oct 2007 | A1 |
20070236514 | Augusanto | Oct 2007 | A1 |
20080009743 | Hayasaka | Jan 2008 | A1 |
20080137071 | Chow | Jun 2008 | A1 |
20080187896 | Savitsky | Aug 2008 | A1 |
20080200807 | Wright et al. | Aug 2008 | A1 |
20080204004 | Anderson | Aug 2008 | A1 |
20080269606 | Matsumura | Oct 2008 | A1 |
20080294096 | Uber et al. | Nov 2008 | A1 |
20080312884 | Hostettler et al. | Dec 2008 | A1 |
20090006419 | Savitsky et al. | Jan 2009 | A1 |
20090043195 | Poland | Feb 2009 | A1 |
20090046912 | Hostettler | Feb 2009 | A1 |
20090130642 | Tada et al. | May 2009 | A1 |
20090209859 | Tsujita et al. | Aug 2009 | A1 |
20090266957 | Cermak | Oct 2009 | A1 |
20090305213 | Burgkart et al. | Dec 2009 | A1 |
20090311655 | Karkanias et al. | Dec 2009 | A1 |
20100055657 | Goble et al. | Mar 2010 | A1 |
20100104162 | Falk et al. | Apr 2010 | A1 |
20100179428 | Pedersen et al. | Jul 2010 | A1 |
20100268067 | Razzaque et al. | Oct 2010 | A1 |
20100277422 | Muresianu et al. | Nov 2010 | A1 |
20110010023 | Kunzig et al. | Jan 2011 | A1 |
20110306025 | Sheehan et al. | Dec 2011 | A1 |
20120021993 | Kim et al. | Jan 2012 | A1 |
20120058457 | Savitsky | Mar 2012 | A1 |
20120143142 | Klein | Jun 2012 | A1 |
20120150797 | Landy et al. | Jun 2012 | A1 |
20120179039 | Pelissier et al. | Jul 2012 | A1 |
20120200977 | Nestler | Aug 2012 | A1 |
20120219937 | Hughes et al. | Aug 2012 | A1 |
20120237102 | Savitsky et al. | Sep 2012 | A1 |
20120237913 | Savitsky et al. | Sep 2012 | A1 |
20120238875 | Savitsky et al. | Sep 2012 | A1 |
20120251991 | Savitsky | Oct 2012 | A1 |
20130046523 | Van Dinther | Feb 2013 | A1 |
20130064036 | Lee et al. | Mar 2013 | A1 |
20130065211 | Amso et al. | Mar 2013 | A1 |
20130137989 | Chen | May 2013 | A1 |
20130158411 | Miyasaka | Jun 2013 | A1 |
20130179306 | Want et al. | Jul 2013 | A1 |
20130236872 | Laurusonis et al. | Sep 2013 | A1 |
20140000448 | Tepper | Jan 2014 | A1 |
20140087347 | Tracy | Mar 2014 | A1 |
20140114194 | Kanayama et al. | Apr 2014 | A1 |
20140119645 | Zimet | May 2014 | A1 |
20140120505 | Rios et al. | May 2014 | A1 |
20140170620 | Savitsky et al. | Jun 2014 | A1 |
20140228685 | Eelbode | Aug 2014 | A1 |
20140272878 | Shim et al. | Sep 2014 | A1 |
20150056591 | Tepper et al. | Feb 2015 | A1 |
20150073639 | Hausotte | Mar 2015 | A1 |
20150084897 | Nataneli et al. | Mar 2015 | A1 |
20150086956 | Savitsky et al. | Mar 2015 | A1 |
20150140538 | Savitsky et al. | May 2015 | A1 |
20150154890 | Savitsky | Jun 2015 | A1 |
20150213731 | Sato | Jul 2015 | A1 |
20160104393 | Savitsky et al. | Apr 2016 | A1 |
20160259424 | Nataneli et al. | Sep 2016 | A1 |
20160314715 | Savitsky et al. | Oct 2016 | A1 |
20160314716 | Grubbs | Oct 2016 | A1 |
20160328998 | Pedersen | Nov 2016 | A1 |
20170018204 | Savitsky et al. | Jan 2017 | A1 |
20170028141 | Fiedler et al. | Feb 2017 | A1 |
20170035517 | Geri | Feb 2017 | A1 |
20170046985 | Hendrickson et al. | Feb 2017 | A1 |
20170110032 | O'Brien | Apr 2017 | A1 |
20170270829 | Bauss | Sep 2017 | A1 |
20170352294 | Nataneli et al. | Dec 2017 | A1 |
20180137784 | Savitsky | May 2018 | A1 |
20180197441 | Rios | Jul 2018 | A1 |
20180211563 | Savitsky | Jul 2018 | A1 |
20180330635 | Savitsky | Nov 2018 | A1 |
20180366034 | Casals Gelpi | Dec 2018 | A1 |
20190057620 | Eggert | Feb 2019 | A1 |
20190231436 | Panse | Aug 2019 | A1 |
20190321657 | Hale | Oct 2019 | A1 |
20190371204 | Savitsky | Dec 2019 | A1 |
20200126449 | Horst | Apr 2020 | A1 |
20200138518 | Lang | May 2020 | A1 |
20200242971 | Wang | Jul 2020 | A1 |
20200242972 | Petrinec | Jul 2020 | A1 |
20210128125 | Sitti et al. | May 2021 | A1 |
20210186311 | Levy et al. | Jun 2021 | A1 |
Number | Date | Country |
---|---|---|
1103223 | May 2001 | EP |
2801966 | Nov 2014 | EP |
2011097238 | Dec 2014 | JP |
2127610 | Nov 2014 | RU |
1994040171 | Nov 2014 | RU |
2006060406 | Jun 2006 | WO |
Entry |
---|
Simsuite, “Are you ready for your first carotid stent procedure/begin preparing now,” www.simsuites.com/readarticle.asp?articleId=598, 1 pg, obtained from website Nov. 25, 2004. |
NewRuleFX Brand 5ml Retractable Hypo Syringe Prop—Retractable Needle and Plunger—No Liquids. (Feb. 9, 2014), from https://www.newrulefx.com/products/newrulefx-brand-5ml-retractable-hypo-syringe-prop-retractable-needle-and-plunger-no-liquids?_pos=4&_sid=71f351469&_ss=r (Year: 2014). |
Chung, Gregory, “Effects of Simulation-Based Practice on Focused Assessment . . . ”, Military Medicine, Oct. 2013, vol. 178. |
Aligned Management Associates, Inc., Corporate home page describing organizing committee, overview, Procedicus MIST[trademark]-suturing module 30.0, 6 pgs., obtained from website Sep. 6, 2004. |
American Academy of Emergency Medicine, conference: 11th annual scientific assembly preconference ultrasound courses, http://www.aaem.org/education/scientificassembly/sa05/precon/ultrasound.shtml, 6 pgs, obtained from website Feb. 16, 2005. |
Barbosa, J. et. al., “Computer education in emergency medicine residency programs,” http://www.med-ed-online.org/res00002.htm, 8 pgs, obtained from website Sep. 6, 2004. |
Brannam, L et al, “Emergency nurses' utilization of ultrasound guidance for placement of peripheral intravenous lines in difficult-access patients,” Acad Emerg Med, 11(12):1361-1363, Dec. 2004. |
Calvert, N et al., “The effectiveness and cost-effectiveness of ultrasound locating devices for central venous access: a systematic review and economic evaluation/executive summary,” Health Tech Assess 2003, 7(12), 4 pgs. |
Center for Human Simulation, corporate home page describing overview/people, http://www.uchsc.edu, 7 pgs, obtained from website Sep. 6, 2004. |
CIMIT News, “The medical access program: new CIMIT initiative to benefit underserved patients/partners telemedicine and CIMIT launch new initiative: stay connected, be healthy/highlights: operating room of the future plug-and-play project,” http://www.cimit.org, Jan. 2005: vol. II(2), 2 pgs., obtained from website Mar. 1, 2005. |
Colt, H. G. et. al., “Virtual reality bronchoscopy simulation: a revolution in procedural training,” Chest 2001; 120:1333-1339. |
Computer Motion, “About computer motion: technology to enhance surgeons' capabilities, improve patient outcomes and reduce healthcare costs/corporate alliances/products & solutions for surgical innovation/training on the da Vinci[registered] surgical system-introduction,” 2002 Computer Motion, http://www.computermotion.com, 6 pgs. |
Delp, S et al, “Surgical simulation—an emerging technology for training in emergency medicine,” Presence, 6(2):147-159, Apr. 1997 (abstract). |
Dorner, R. et al., “Synergies between interactive training simulations and digital storytelling: a component-based framework,” Computer & Graphics, 26(1):45-55, Feb. 2002 (abstract). |
Duque, D. and Kessler S., “Ultrasound guided vascular access,” Amer Coll Emerg Phy., http://www.nyacep.org/education/articles/ultrasound%20vascular%20access.htm, 2 pgs, obtained from website May 11, 2005. |
Espinet, A. and Dunning J., “Does ultrasound-guided central line insertion reduce complications and time to placement in elective patients undergoing cardiac surgery,” Inter Cardiovascular Thoracic Surg, 3:523-527, 2004; http://icvts.ctsnetjournals.org/cgi/content/full/3/3/523, 6 pgs, obtained from website May 11, 2005 (abstract). |
Gallagher, A. G. et. al., “Virtual reality training for the operating room and cardiac catheterization laboratory,” Lancet, 364:1538-1540, Oct. 23, 2004. |
Gallagher, A. G. et. al., “Psychomotor skills assessment in practicing surgeons experienced in performing advanced laparoscopic procedures,” AM Coll Surg, 197(3):479-488, Sep. 2003. |
Gausche, M. et. al., “Effect of out-of-hospital pediatric endotracheal intubation on survival and neurological outcome: a controlled clinical trial,” JAMA, 283(6):783-790, Feb. 9, 2000. |
Gore, D. C. and Gregory, S. R., “Historical perspective on medical errors: Richard Cabot and the Institute of Medicine,” J Amer Coll Surg, 197(4), 5 pgs, Oct. 2003. |
Grantcharov, T. P. et. al., “Randomized clinical trial of virtual reality simulation for laparoscopic skills training,” Br J Surg, 91(2):146-150, Feb. 1, 2004 (abstract). |
Grantcharov, T. P. et. al., “Learning curves and impact of previous operative experience on performance on a virtual reality simulator to test laparoscopic surgical skills,” Am J Surg, 185(2):146-149, Feb. 1, 2004 (abstract). |
Haluck, R. S., et. al., “Are surgery training programs ready for virtual reality? A survey of program directors in general surgery,” Arch Surg, 135(7):786-792, Jul. 1, 2000. |
Helmreich, R. L., “On error management: lessons from aviation,” BMJ, 320:781-785, Mar. 2000. |
Huckman, R. S. and Pisano, G. P., “Turf battles in coronary revascularization,” N Engl J Med, http://www.nejm.org, 4 pgs, 352(9):857-859, Mar. 3, 2005. |
Immersion Corporation, URL: http://www.immersion.com/corporate/products/, corporate home page describing Immersion's surgical training simulators—“Wireless Data Glove: The CyberGlove[registered] II System,” 5 pgs, obtained from the website Nov. 17, 2005 and Jan. 24, 2008. |
injuryboard.com, “Reducing complications associated with central vein catheterization,” URL: http://www.injuryboard.com/view.cfm/Article=668, 5 pgs, obtained from website May 11, 2005. |
Intersense, home page listing motion tracking products, http://www.isense.com/prodcuts.aspx?id=42&, 1 pg, obtained from website Jan. 24, 2008. |
Jemmett, M. E., et. al., “Unrecognized misplacement of endotracheal tubes in a mixed urban to rural emergency medical services setting,” Acad Emerg Med, 10(9):961-964, Sep. 2003. |
Katz, S. H. and Falk, J. L., “Misplaced endotracheal tubes by paramedics in an urban medical services system,” Annals Emerg Med, 37:32-37, Jan. 2001. |
Lewis, R., “Educational research: time to reach the bar, not lower it,” Acad Emerg Med, 12(3):247-248, Mar. 2005. |
Liu, A. et. al., “A survey of surgical simulation: applications, technology, and education,” Presence, 12(6):1-45, Dec. 2003. |
Manchester Visualization Centre, “Webset project-bringing 3D medical training tools to the WWW,” http://www.sve.man.ac.uk/mvc/research/previous/website, 3 pgs, obtained from the website Sep. 8, 2004. |
Medical Simulation Corporation, corporate home page describing management team/frequently asked questions, http://www.medsimulation.com/about_msc/key_employees.asp, 7 pgs, obtained from website Nov. 25, 2004. |
Medtronic, “The StealthStation[registered] treatment, guidance system,” the corporate home page describing the company fact sheet and profile; http://www.medtronic.com/Newsroom, 4 pgs, obtained from website Mar. 5, 2005. |
Mort, T. C., “Emergency tracheal intubation: complications associated with repeated laryngoscopic attempts,” Anesth Analg, 99(2):607-613, Aug. 2004, 1 pg, obtained from website Sep. 8, 2004 (abstract). |
Nazeer, S. R., et. al., “Ultrasound-assisted paracentesis performed by emergency physicians v.s. the traditional technique: a prospective, randomized study,” Amer J of Emer Med, 23:363-367, 2005. |
NCA Medical Simulation Center, Tutorial-simulation for medical training, http://simcen.usuhs.mil/miccaie, 4 pgs, 2003. |
Next Dimension Imaging, “Products-Anatomy Analyzer 2,” http://www.nexted.com/anatomyanalyzer.asp, 2 pgs, obtained from website Dec. 7, 2004. |
Norris, T. E. et. al., “Teaching procedural skills,” J General Internal Med, 12(S2):S64-S70, Apr. 1997. |
On the Net Resources-Education and Training, URL: http://www.hitl.washington.edu/projects/knowledge_base/education.html, corporate home page regarding internet sites regarding education and training, 16 pgs, obtained from website Jan. 8, 2005. |
Osberg, K. M., “Virtual reality and education: a look at both sides of the sword,” http://www.hitl.washington.edu/publications/r-93-7/, 19 pgs, Dec. 14, 1992, obtained from website Jan. 21, 2008. |
Osmon, S. et. al., “Clinical investigations: reporting of medical errors: an intensive care unit experience,” Crit Care Med, 32(3), 13 pgs, Mar. 2004. |
Ponder, M., et al., “Immersive VR decision training: telling interactive stories featuring advanced human simulation technologies,” Eurographics Association 2003, 10 pgs. |
Prystowsky, J. B. et. al., “A virtual reality module for intravenous catheter placement,” Am J Surg 1999; 177(2):171-175 (abstract). |
Reachin, “Medical Training Development Centre/Reachin technologies AB has entered into a cooperation with Mentice AB,” Jan. 20, 2004, 4 pgs, obtained from website Nov. 9, 2004. |
Rothschild, J. M., “Ultrasound guidance of central vein catheterization,” NCBI, Nat Lib Med, www.ncbi.nlm.nih.gov/books/, HSTAT 21, 6 pgs, obtained from website May 11, 2005. |
Rowe, R. and Cohen, R. A., “An evaluation of a virtual reality airway simulator,” Anesth Analg 2002, 95:62-66. |
Sensable Technologies, “PHANTOM Omni Haptic Device,” 2 pgs, http://www.sensable.com/haptic-phantom-omni.htm, obtained from website Jan. 24, 2008. |
Shaffer, K., “Becoming a physician: teaching anatomy in a digital age,” NEJM, Sep. 23, 2004; 351(13):1279-81 (extract of first 100 words—no abstract). |