The present invention is directed generally to simulating medical conditions and, more particularly, to transmitting an indication of the medical condition(s) to a recipient to aid in the diagnosis of various medical condition(s).
To become clinically competent physicians, medical students must develop knowledge and skills in many areas of both the art and science of medicine. Three areas are emphasized in medical students' early clinical training: doctor-patient communication, eliciting the patient history, and performing the physical exam. Standardized patients (SPs), individuals trained to realistically portray patients, are commonly employed to teach and assess medical students in those three areas. By working with SPs, students gain the opportunity to learn and practice doctor-patient communication, eliciting the patient history, conducting the physical exam, and other clinical skills in a safe setting. SPs also provide a way to reliably test students' clinical skills in a realistic setting, interacting with a person. The range of clinical problems an SP can portray, however, is limited. SPs are typically healthy individuals with few or no abnormal physical findings. While some can be trained to simulate physical abnormalities (e.g., breathing through one lung, voluntarily increasing blood pressure, etc.), there are many abnormalities they cannot simulate.
One way to supplement what students learn from SPs is for the students to separately learn from and practice on simulators. A variety of mechanical or computer-based simulators are now used in medical education, including software for testing clinical reasoning and diagnostic skills, computer simulations of physiological processes, and physical models for practicing selected procedural skills. For example, a completely virtual SP (e.g., an interactive computer program) has been tried before by Hubal et al., as described in “The Virtual Standardized Patient,” Medicine Meets Virtual Reality 2000 (J. D. Westwood et al., eds., IOS Press), who utilized natural language processing and virtual patients that exhibit emotion in a realistic context to provide completely automatic yet unscripted training sessions. A key limitation to these simulators is that their users (e.g., medical students) do not interact with a live person (a patient or SP). Human-computer interaction brings a different set of psychological concerns than does the human-human interaction of a doctor-patient examination. A significant level of immersion is needed to overcome the human-computer interaction aspects so that there is appreciable transfer of training with regard to patient interaction and diagnosis. This level of immersion and interactivity has not been reached and may not be achievable in a totally virtual form with today's technology. Augmenting SPs with the ability to simulate abnormal physical findings would expand the opportunities for students to learn more clinical skills in a realistic setting with a real person (SP) while practicing their doctor-patient communication skills.
In addition, there is currently a need for expanding the breadth of indications associated with known medical conditions that may be portrayed by an SP. For example, with a real or standardized patient, a student is limited to hearing only the sounds of that single person. Learning a variety of sounds has traditionally required examining many patients over time, often without direct supervision and feedback. Commercially available recordings of heart and lung sounds exist, but using them ignores the process of locating the sources of sounds (e.g., correct placement of the stethoscope) and excludes simultaneous interactions with a patient.
Augmenting SPs with the capability of portraying patients with an increased range of medical conditions would make the use of SPs an even more valuable teaching tool. The present invention is directed to these and other important ends.
The present invention provides a system, method and medium for simulating medical conditions to facilitate medical training, that utilizes a roaming device configured to be mobile; a positioning device configured to determine location information of the roaming device; and a computing device configured to receive the location information, compare the location information with a predetermined set of regions, and transmit information indicative of a medical condition when the location information coincides with the predetermined set of regions.
In at least one aspect of the invention, the location of a roaming device is determined by a positioning device and transmitted to a computing device. In some embodiments, a magnetic tracking system is used to determine the location information of the roaming device; in other embodiments, an optical tracking system or sensor devices are used. In some embodiments, the positioning device is a sensor placed on a vest worn by the subject being examined. The computing unit calibrates the location of the roaming device according to the individual body morphology or physical movements of the subject.
The computing unit is connected to a computer, either as a separate unit or comprised within the computer. The computer contains software for comparing the location of the roaming device with a predetermined set of regions. When the location information of the roaming device coincides with the predetermined set of regions, information indicative of a medical condition is transmitted to an output device connected to the computer. The output device can be, for example, a stethoscope earpiece or a speaker connected to the computer, and the information indicative of a medical condition can be, for example, a sound file selected from a computer data repository. In some embodiments, the sound file can be played through the speaker in surround sound to localize the perceived sound source at the location of the object. A touch switch on the roaming device can control playback of the computer file, such that playback only occurs when the roaming device is touching the subject.
In accordance with the invention, information indicative of a medical condition is transmitted when the location of the roaming device coincides with a predetermined set of regions. Such regions include positions over the subject's actual heart, lungs, carotid and renal arteries, aorta, and abdomen. Examples of medical conditions that may be simulated using the invention include: bronchitis, heart failure, lung consolidation, pneumonia, atelectasis, pleural effusion, pneumothorax, chronic obstructive pulmonary disease, emphysema, asthma, healthy lung function, mitral valve prolapse, mitral regurgitation, mitral stenosis, pulmonic stenosis, aortic stenosis, aortic regurgitation, ventricular septal defect, pericarditis, healthy heart function, bowel obstruction, renal artery bruits, normal abdominal function, and carotid artery bruits.
In one aspect of the invention, the roaming device is a stethoscope and the information indicative of a medical condition is transmitted as a sound played through the earpiece of the stethoscope. The sound is either a naturally occurring sound or a digitally processed or altered sound. Examples of sounds heard through the stethoscope in accordance with the invention include normal breath sounds, crackles, wheezes, stridor, pleural rub, normal heart sounds, pathologic splitting, murmurs, clicks, gallops, pericardial friction rub, venous hum, bowel sounds, and bruits.
In yet another aspect of the invention, the roaming device is a specialized glove configured to transmit information indicative of a medical condition when the location of the glove coincides with a predetermined set of regions. When the glove detects tactile feedback at a predetermined region, a sound corresponding to a medical condition is selected by the computer for playback into the earpiece of a stethoscope or through a speaker connected to the computer.
The invention also provides a stethoscope comprising an earpiece and a headpiece, configured to transmit sound indicative of a medical condition when the location of the headpiece coincides with a predetermined set of regions. A touch switch on the headpiece of the stethoscope can be used to control the transmission of sound played through the earpiece.
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to specific embodiments and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications of the invention, and such further applications of the principles of the invention as illustrated herein, are contemplated as would normally occur to one skilled in the art to which the invention relates. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
In accordance with some embodiments of the invention, low-frequency magnetic field technology is used to determine the location of roaming tracking device 112 on object 116. Transmitting device 120, stationary tracking device 104, roaming tracking device 112, and computing unit 118 provide a system for determining the location of roaming tracking device 112 on object 116 with respect to subject 102. Transmitting device 120 can be, for example, an electromagnetic field generator, which generates electromagnetic signals that are transmitted to tracking devices, such as magnetic sensors, within a known operating radius of device 120.
The locations of tracking devices 104 and 112 in space are calculated by computing unit 118 using signals received by transmitting device 120. For example, roaming tracking device 112 transmits signals indicative of its position to computing unit 118. Software associated with computing unit 118 then computes the location of roaming tracking device 112. Computing unit 118 similarly computes the location of stationary tracking device 104. Preferably, computing unit 118 provides location information as substantially instantaneous updates of position (e.g., X, Y, and Z Cartesian coordinates) and/or orientation (e.g., azimuth, elevation, and roll). In embodiments of the invention, orientation may be measured, for example, as direction cosines, Euler angles, or quaternions.
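The orientation representations mentioned above are interchangeable; as a minimal sketch (not part of the original disclosure), a unit quaternion reported by a tracker can be converted to the azimuth, elevation, and roll angles using the standard Z-Y-X Euler convention:

```python
import math

def quat_to_azimuth_elevation_roll(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to azimuth, elevation, and
    roll angles in radians (Z-Y-X Euler convention). This is one common
    convention; a particular tracking system may define angles differently."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to guard against floating-point drift outside asin's domain.
    elevation = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    azimuth = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return azimuth, elevation, roll
```

For example, the identity quaternion (1, 0, 0, 0) yields zero azimuth, elevation, and roll, and a quaternion representing a 90-degree rotation about the vertical axis yields an azimuth of pi/2.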
From the location measurements obtained for tracking devices 104 and 112 with respect to transmitting device 120, computing unit 118 determines the location of tracking devices 104 and 112 with respect to each other. More specifically, the location of roaming tracking device 112 is determined with respect to the location of stationary tracking device 104 by computing unit 118. Therefore, computing unit 118 determines the location of roaming tracking device 112 relative to the reference frame of subject 102. Computing unit 118 thus provides dynamic location information that is generated according to the individual reference frames of different subjects.
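The relative-frame computation described above can be sketched as follows. This simplified illustration (not from the original disclosure) treats the subject's frame as a pure translation and ignores rotation of the stationary tracker:

```python
def relative_position(roaming_xyz, stationary_xyz):
    """Express the roaming tracker's position in the subject's reference
    frame by subtracting the stationary (body-mounted) tracker's position.
    Both inputs are (x, y, z) tuples measured relative to the transmitter.
    Rotation of the subject's frame is ignored in this sketch."""
    return tuple(r - s for r, s in zip(roaming_xyz, stationary_xyz))
```

With this convention, the same anatomical placement of the roaming device produces the same relative coordinates even when the subject moves with respect to the transmitter.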
In accordance with various embodiments of the invention, the correct placement of roaming tracking device 112 on subject 102 correlates with the generation of feedback indicative of a known medical condition to the user. Position and orientation measurements of roaming tracking device 112 are transmitted from computing unit 118 to computer 122, and transformed into a computer readable format utilizing any standard, commercially available software devised for making tracking data available for other applications. For example, software such as TRACKD (VRCO, Virginia Beach, Va.) may be used to transform tracking measurements of roaming tracking device 112 into a computer readable format.
Computer 122 contains a data repository of computer files. Each file is programmed to generate an indication associated with a known medical condition. The location of roaming tracking device 112 with respect to subject 102, as computed by computing unit 118 and transformed into computer readable format, is then used by software running on computer 122 to determine which computer file, if any, is selected for playback into output device 114 connected to computer 122. The computer software compares the location of roaming tracking device 112 on object 116 to a previously recorded map of predetermined “hot zone” locations (e.g., regions) on subject 102. A computer file is selected for playback only when the location of roaming tracking device 112 on object 116 is within a predetermined operating radius of a “hot zone.” For example, a computer file programmed to play a certain sound associated with a specific heart defect is selected for playback only when roaming tracking device 112 on object 116 is within the predetermined operating radius of “hot zone” 110 located over the subject's actual heart. The operating radius of a “hot zone” depends on its location and on the medical condition being simulated, and can be, for example, about 1 inch, or can be determined by one skilled in the art and adjusted for individual subjects and medical conditions to be simulated.
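The "hot zone" lookup described above reduces to a radius test against a previously recorded map. The sketch below is illustrative only; the zone names, coordinates, and file names are hypothetical, and a real map would be calibrated per subject:

```python
import math

# Hypothetical hot-zone map: name -> (center in the subject's frame,
# operating radius, sound file to play). Values are for illustration only.
HOT_ZONES = {
    "mitral_area": ((0.0, 0.0, 0.0), 1.0, "mitral_regurgitation.wav"),
    "left_lung_base": ((4.0, -6.0, 0.0), 1.0, "crackles.wav"),
}

def select_sound(position):
    """Return the sound file for the first hot zone whose operating radius
    contains the roaming device's position, or None outside every zone."""
    for center, radius, sound in HOT_ZONES.values():
        if math.dist(position, center) <= radius:
            return sound
    return None
```

A placement within one inch (here, one unit) of the mitral-area center selects the associated file; any placement outside every zone selects nothing.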
The computer file selected for playback is transmitted from computer 122 to output device 114. Output device 114 can be, for example, a stethoscope earpiece or a speaker. In some embodiments, a sound file can be transmitted in surround sound through the speaker to localize the perceived sound source at the location of object 116.
As used herein, a “hot zone” refers to a location on the body of subject 102 that triggers the playback of a corresponding computer file. “Hot zone” locations on a subject correspond with one or more simulated medical conditions. For example, if a particular cardiovascular condition is desired to be simulated, one or more “hot zones” corresponding to the particular heart condition will be located, for example, over the subject's anterior chest and/or major arteries, such as the carotid or renal artery. As another example, if a particular lung condition is desired to be simulated, one or more “hot zones” corresponding to the particular lung condition will be located, for example, over the subject's lungs. Other “hot zones” in accordance with the invention include areas generally examined by physicians during the pulmonary examination, such as the 26 areas described in Bates' Guide to Physical Examination and History Taking (Bickley & Szilagyi, Philadelphia: Lippincott Williams & Wilkins 2003), incorporated by reference in its entirety.
One skilled in the art may also appreciate that the location in space of a given “hot zone” will vary among subjects. For example, “hot zone” locations may vary depending on the individual subject's body morphology type (for example, obese v. thin body types or male v. female body types), or if a subject shrugs her shoulders or twists. In one or more embodiments of the invention, tracking device measurements are recalibrated to account for these variations and deviations among different subjects. Position measurements for stationary tracking device 104 on subject 102 with respect to transmitting device 120 are compared by computing unit 118 to position measurements obtained using a frame of reference as measured, for example, by a tracking device attached to a subject of average height, weight, and build. Any deviations in the position measurements for stationary tracking device 104 from the neutral position are accordingly subtracted from the position measurements obtained for roaming tracking device 112 on object 116. In some embodiments, neoprene vests of standard sizes (for example, XS to XXL) labeled with “hot zone” locations and calibration reference points can be used for calibrating the system according to any individual subject's morphological type. The subject can wear a vest of appropriate size for a short period of calibration to individualize the “hot zone” pattern for the subject.
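The subtraction-based recalibration above can be sketched in a few lines. This is an illustrative simplification assuming purely translational deviation from the neutral reference posture:

```python
def calibrate(roaming_pos, stationary_pos, neutral_stationary_pos):
    """Correct a roaming-tracker reading for body morphology or posture:
    the stationary tracker's deviation from a pre-recorded neutral position
    is subtracted from the roaming measurement. All arguments are (x, y, z)
    tuples measured with respect to the transmitter."""
    deviation = tuple(s - n for s, n in zip(stationary_pos, neutral_stationary_pos))
    return tuple(r - d for r, d in zip(roaming_pos, deviation))
```

For example, if the stationary tracker reads one unit right and two units up from the neutral reference, the same offsets are removed from the roaming reading before the hot-zone comparison.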
When the location of roaming tracking device 112 on object 116 is within the range of a “hot zone,” a computer file is selected for playback and transmitted from computer 122 to output device 114. In some embodiments, roaming tracking device 112 may include a touch switch 124. In operation, touch switch 124 enables output device 114. In other words, a user is not able to detect through output device 114 the playback of the selected computer file unless touch switch 124 is in the “closed” position. Computer 122 transmits the playback of a selected file to touch switch 124, which is connected to output device 114. If touch switch 124 is in the “open” position, the playback transmission to output device 114 is blocked. However, upon contact with subject 102, touch switch 124 moves into the “closed” position, allowing the playback transmission to be detected through output device 114 by the user. Touch switch 124 can be a standard, commercially available SPDT switch with ¾″ roller lever, such as Radio Shack catalog #275-017, modified by removing the roller from the switch.
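The touch-switch gating behavior described above can be modeled as a simple state machine (an illustrative sketch, not the electrical implementation):

```python
class TouchGatedPlayback:
    """Models the touch-switch gating of the output device: the selected
    file reaches the earpiece only while the switch is held in the
    'closed' position by contact with the subject."""

    def __init__(self):
        self.switch_closed = False

    def press(self):
        """Headpiece contacts the subject; the switch closes."""
        self.switch_closed = True

    def release(self):
        """Headpiece is lifted off the subject; the switch opens."""
        self.switch_closed = False

    def output(self, selected_file):
        """Return the file passed to the earpiece, or None while blocked."""
        return selected_file if self.switch_closed else None
```

This gating ensures the user hears simulated sounds only while the roaming device is actually touching the subject, reinforcing correct stethoscope placement.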
In one or more embodiments of the invention, object 116 is a standard electronic stethoscope, such as, generally, the Androscope i-stethos model IS-28A00 (Andromed, Inc., St. Laurent, Quebec), modified as described herein.
Transmission of a computer sound file from computer 122 to earpiece 128 correlates with correct placement of headpiece 126 on subject 102. In one or more embodiments of the invention, the sound file is a .wav file. However, other sound files, such as .mp3 files, may be used. The sound file may correspond to either a naturally occurring sound (e.g., a normal heartbeat) or to a sound that has been digitally processed (e.g., modifying a normal heartbeat to sound like an abnormal heartbeat) or digitally altered (e.g., adding extra sounds to normal heart sounds to simulate a valvular defect, or adding wheezes to breath sounds to simulate asthma). Commercially available software for digital signal processing may be used in connection with embodiments of the invention. For example, WaveWarp from SoundsLogical (Glasgow, Scotland UK) may be used.
Computer playback of the sound file is timed to correspond to the actual inspiration and expiration of the subject. A sound sensing/modifying approach is used where the computer receives input device information on the subject's inspiration/expiration and adjusts the timing of the lung sounds accordingly. Information on the subject's breathing pattern is obtained using a subject-controlled actuator switch or a plethysmograph to monitor chest movement.
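The respiratory-phase synchronization above amounts to selecting a sound segment that matches the current phase reported by the actuator switch or plethysmograph. In the sketch below, the phase labels and segment file names are illustrative assumptions:

```python
def lung_sound_segment(phase):
    """Pick the lung-sound segment matching the subject's current
    respiratory phase, as reported by a subject-controlled actuator
    switch or a chest-movement plethysmograph. Segment file names
    are hypothetical placeholders."""
    if phase == "inspiration":
        return "inspiratory_wheeze.wav"
    if phase == "expiration":
        return "expiratory_wheeze.wav"
    raise ValueError("phase must be 'inspiration' or 'expiration'")
```

Driving playback from this selection keeps the simulated breath sounds aligned with the chest movement the student actually observes and palpates.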
In accordance with the invention, the auscultated sound corresponds to the particular medical condition that is simulated by the subject. In one embodiment, a heart condition is simulated by the playback of heart-associated sounds. Examples of heart-associated sounds that may be generated, and the associated heart condition that is simulated, include: normal heart sounds, pathologic splitting, murmurs, clicks, gallops, pericardial friction rub, venous hum, and carotid artery bruits, for simulating healthy heart function, heart failure, mitral valve prolapse, mitral regurgitation, mitral stenosis, pulmonic stenosis, aortic stenosis, aortic regurgitation, ventricular septal defect, and pericarditis. In another embodiment, a lung condition is simulated by the generation of lung-associated sounds. Examples of lung-associated sounds that may be generated, and the associated lung condition that is simulated, include: normal breath sounds, crackles, wheezes, stridor, and pleural rub, for simulating healthy lung function, bronchitis, lung consolidation, pneumonia, atelectasis, pleural effusion, pneumothorax, chronic obstructive pulmonary disease, emphysema, and asthma. In another embodiment, an abdominal condition is simulated by the generation of abdominal-associated sounds. Examples of auscultation sounds that may be generated, and the associated abdominal condition that is simulated, include: bowel sounds and bruits, for simulating bowel obstruction, renal or aortic artery bruits, and normal abdominal function.
In another embodiment of the invention, an optical tracking system that utilizes optical motion capture may be used for determining the location of a roaming device. Optical motion capture is the tracking of markers, such as LED markers, on an object over time. Generally, cameras are placed on the perimeter of a capture area to track markers placed on the object. Using an optical tracking system, active marker positions are detected by camera sensors and transmitted to a central processor, which calculates and stores the coordinate positions for the markers. Optical tracking systems such as ARToolkit (Human Interface Technology Laboratory, University of Washington, Seattle, Wash.), PhaseSpace optical motion capture systems (PhaseSpace, San Leandro, Calif.), HiBall-3000 (3rdTech, Inc. Chapel Hill, N.C.), Vicon small camera systems (Vicon Motion Systems, Inc., Lake Forest, Calif.), the Hawk System (Motion Analysis Corp., Santa Rosa, Calif.), and the Eagle System (Motion Analysis) may be used in conjunction with the invention.
In another embodiment of the invention, a specialized glove 140, such as, generally, the CyberTouch or the CyberGlove (Immersion Corp., San Jose, Calif.), is used, at least in part, to simulate a medical condition.
In another embodiment of the invention, a sensor system is used to determine the location of a roaming device. For example, sensing devices configured to detect when a marker moves within an operating radius can be fastened or mounted to “hot zone” locations on a vest worn by the subject. When a marker, mounted on a user's hand or on an object such as a stethoscope in the user's hand, is moved within a “hot zone,” the sensing device communicates this information to a computer, and software associated with the computer triggers the playback of a sound file through an output device such as a computer speaker or the earpiece of the stethoscope. In some embodiments, the sensing device can be used as the roaming device to detect markers fastened to the “hot zone” locations on the vest. Sensing devices configured to detect markers are known and readily available to one skilled in the art.
According to one or more embodiments of the invention, the components of the system for simulating normal or abnormal medical conditions can be connected in different ways. For example, the connections can be wired, wireless, optical, electromagnetic, and the like. In addition, one or more cameras can be used to provide information and/or position data for the tracking devices. Alternatively, or additionally, tracking devices or magnetic sensors known in the art can be incorporated into the tracking system to assist in, or detect, the location of the tracking device. The functions of one or more devices can also be combined into a single device. For example, the computer can be programmed, and include appropriate hardware, to perform the functions of the tracking system. Various components can also be connected across one or more networks so that data can be exchanged and/or analyzed by others. For example, in a classroom setting, an instructor can demonstrate the appropriate locations to position the stethoscope. Appropriate signals can be sent to the computer and transmitted across a wireless network. Users in the classroom would then receive the appropriate sounds in their own stethoscopes. All such embodiments, modifications, and variations fall within the scope of the present invention.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected. In addition, all references cited herein are indicative of the level of skill in the art and are hereby incorporated by reference in their entirety.
This application claims the priority of U.S. Provisional Patent Application No. 60/621,084, filed on Oct. 25, 2004, which is incorporated herein by reference in its entirety.
The present invention is supported in part by the Naval Health Research Center through NAVAIR Orlando TSD under contract N61339-03-C-0157. The Government may have certain rights in the invention.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/US2005/038150 | 10/25/2005 | WO | 00 | 8/20/2009 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2006/047400 | 5/4/2006 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
2945304 | Niiranen et al. | Jul 1960 | A |
3024568 | Barnett | Mar 1962 | A |
3947974 | Gordon et al. | Apr 1976 | A |
4155196 | Bollinger et al. | May 1979 | A |
4337529 | Morokawa | Jun 1982 | A |
4411629 | Voights | Oct 1983 | A |
5047952 | Kramer et al. | Sep 1991 | A |
5119072 | Hemingway | Jun 1992 | A |
5217453 | Wilk | Jun 1993 | A |
5256098 | Smith et al. | Oct 1993 | A |
5360005 | Wilk | Nov 1994 | A |
5491756 | Francais | Feb 1996 | A |
6097822 | Min | Aug 2000 | A |
6106463 | Wilk | Aug 2000 | A |
6149595 | Seitz et al. | Nov 2000 | A |
6220866 | Amend et al. | Apr 2001 | B1 |
6319201 | Wilk | Nov 2001 | B1 |
6443735 | Eggert et al. | Sep 2002 | B1 |
6503087 | Eggert et al. | Jan 2003 | B1 |
6527558 | Eggert et al. | Mar 2003 | B1 |
6758676 | Eggert et al. | Jul 2004 | B2 |
7114954 | Eggert et al. | Oct 2006 | B2 |
7122005 | Shusterman | Oct 2006 | B2 |
7192284 | Eggert et al. | Mar 2007 | B2 |
7497828 | Wilk et al. | Mar 2009 | B1 |
7510398 | Thornton | Mar 2009 | B1 |
7597665 | Wilk et al. | Oct 2009 | B2 |
7645141 | Lecat | Jan 2010 | B2 |
7976312 | Eggert et al. | Jul 2011 | B2 |
7976313 | Eggert et al. | Jul 2011 | B2 |
8016598 | Eggert et al. | Sep 2011 | B2 |
8152532 | Eggert et al. | Apr 2012 | B2 |
8257089 | Lecat | Sep 2012 | B2 |
8287283 | Lecat | Oct 2012 | B2 |
8323031 | Lecat | Dec 2012 | B2 |
8419438 | Eggert et al. | Apr 2013 | B2 |
8696362 | Eggert et al. | Apr 2014 | B2 |
20010001852 | Rovinelli et al. | May 2001 | A1 |
20020087101 | Barrick et al. | Jul 2002 | A1 |
20020100103 | Pascal | Aug 2002 | A1 |
20030073060 | Eggert et al. | Apr 2003 | A1 |
20030091968 | Eggert et al. | May 2003 | A1 |
20030139671 | Walston et al. | Jul 2003 | A1 |
20030210186 | Sollenberger et al. | Nov 2003 | A1 |
20040022052 | Chien | Feb 2004 | A1 |
20040157199 | Eggert et al. | Aug 2004 | A1 |
20040157612 | Kim | Aug 2004 | A1 |
20040161732 | Stump et al. | Aug 2004 | A1 |
20040197764 | Stump et al. | Oct 2004 | A1 |
20040228494 | Smith | Nov 2004 | A1 |
20050020918 | Wilk et al. | Jan 2005 | A1 |
20050113703 | Farringdon et al. | May 2005 | A1 |
20050132290 | Buchner et al. | Jun 2005 | A1 |
20050255434 | Lok et al. | Nov 2005 | A1 |
20060073447 | Bjork et al. | Apr 2006 | A1 |
20060247733 | Amer | Nov 2006 | A1 |
20070038164 | Afshar | Feb 2007 | A1 |
20070058818 | Yoshimine | Mar 2007 | A1 |
20070178430 | Lecat | Aug 2007 | A1 |
20070276278 | Coyle et al. | Nov 2007 | A1 |
20080013747 | Tran | Jan 2008 | A1 |
20080131855 | Eggert et al. | Jun 2008 | A1 |
20080138778 | Eggert et al. | Jun 2008 | A1 |
20080138779 | Eggert et al. | Jun 2008 | A1 |
20080138780 | Eggert et al. | Jun 2008 | A1 |
20090117526 | Lecat | May 2009 | A1 |
20090117527 | Lecat | May 2009 | A1 |
20090131759 | Sims et al. | May 2009 | A1 |
20090256801 | Helmer | Oct 2009 | A1 |
20100062407 | Lecat | Mar 2010 | A1 |
20100279262 | Lecat | Nov 2010 | A1 |
20130196302 | Lecat | Aug 2013 | A1 |
20130252219 | Lecat | Sep 2013 | A1 |
20130330699 | Eggert et al. | Dec 2013 | A1 |
20140087343 | Lecat | Mar 2014 | A1 |
20140220529 | Eggert et al. | Aug 2014 | A1 |
Number | Date | Country |
---|---|---|
01-196091 | Aug 1989 | JP |
2004-520606 | Jul 2004 | JP |
WO 2005114616 | Dec 2005 | WO |
Entry |
---|
English Translation of Office Action dated Jul. 20, 2011 for corresponding Japanese Patent Application No. 2007-539020. |
International Search Report for PCT application No. PCT/US05/38150 dated May 26, 2006. |
Hubal et al., “The Virtual Standardized Patient,” Medicine Meets Virtual Reality 2000 (J.D. Westwood et al., eds.), IOS Press. |
Sturman et al., “A survey of glove-based input,” IEEE Computer Graphics and Applications, vol. 14, No. 1, (1994) pp. 30-39. |
Bickley et al, “Bates' Guide to Physical Examination and History Taking,” Eighth Edition, Lippincott Williams & Wilkins, A Wolters Kluwer Company, Table of Contents, 2003, 17 pages. |
Number | Date | Country | |
---|---|---|---|
20090305212 A1 | Dec 2009 | US |
Number | Date | Country | |
---|---|---|---|
60621084 | Oct 2004 | US |