The present invention relates generally to knot detection. The present invention relates more specifically to systems and methods for real-time winding analysis for knot detection.
To close a wound, or during a surgery, a doctor or nurse may need to tie a knot to complete a suture. Typically, a standard array of knots is used to close a suture, and a doctor or nurse would learn some or all of these knots in school or training. Part of the training process may include supervision by a trained doctor or nurse who may watch the trainee tie the knot, and may examine the completed knot to ensure that the correct knot type was used, and that the knot was tied correctly. When the knot is tied using string, and the trainer is available to watch the trainee tie the knot, it is relatively straightforward for the trainer to recognize the knot used. However, knot detection by a computer is significantly more difficult.
Knot detection, in general, is an NP-complete problem, which means that the computation time needed to recognize an arbitrary knot grows exponentially with the complexity of the knot presented. The use of general-purpose knot-detection algorithms in a simulation tool may therefore impair the speed and effectiveness of the simulation tool as a training device, because such algorithms require a significant amount of time to execute. Existing knot-detection algorithms also impose constraints on the definition of the knots input into the algorithm that may render those algorithms less desirable for use in a simulation system.
Embodiments of the present invention comprise systems, methods, and computer-readable media for real-time winding analysis for knot detection. For example, one embodiment of the present invention is a method comprising receiving a first wrapping signal indicating a first wrapping of a simulated thread around a second tool to create a first loop. The method further comprises determining a first wrapping direction based at least in part on the first wrapping signal; receiving a first tightening signal indicating a pulling of a first end of the simulated thread through the first loop; determining a first half-hitch based at least in part on the first wrapping direction and the first tightening signal; and outputting the first half-hitch. In another embodiment, a computer-readable medium comprises code for carrying out such a method.
These illustrative embodiments are mentioned not to limit or define the invention, but to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, and further description of the invention is provided there. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:
Embodiments of the present invention provide systems, methods, and computer-readable media for real-time winding analysis for knot detection.
During a simulated surgery, the simulation application may display a view of the simulated surgery on the display 120 for the user to view. The user manipulates the simulation tools 130, 131 to provide signals to the simulation application, which interprets the signals as movements of the simulated tools. The simulation application may then update the display 120 based on the user's manipulation of the simulation tools 130, 131.
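Purely as an illustrative sketch of how such an update cycle might be organized (none of these names appear in the specification, and a real embodiment may differ), tool signals could be read, interpreted as movements of the simulated tools, and used to redraw the view:

```python
# Hypothetical sketch of the simulation update cycle described above.
# SimulatedTool, simulation_step, and display are assumptions, not terms
# taken from the specification.

class SimulatedTool:
    def __init__(self, tool_id):
        self.tool_id = tool_id
        self.position = (0.0, 0.0, 0.0)
        self.orientation = (0.0, 0.0, 0.0)

def simulation_step(tools, signals, display):
    for signal in signals:
        tool = tools[signal["tool_id"]]
        tool.position = signal["position"]        # interpret signal as tool movement
        tool.orientation = signal["orientation"]
    display.render(tools.values())                # update the displayed view
```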
During a surgery, a doctor may need to suture an incision. A doctor typically sutures an incision by using a thread to stitch the incision closed, and then tying the thread off, which holds the two flaps of skin or organic material together so that they can rejoin and heal. In a laparoscopic surgery, the process of suturing a wound is typically more difficult than during a more invasive surgery, or when suturing a wound or incision on the patient's skin. Laparoscopic surgery is performed by inserting special laparoscopic tools into a patient via one or more small incisions. The surgeon watches video images on a screen transmitted by a camera inserted into one of the incisions, and manipulates the tools to perform the surgery based on the video images. During a laparoscopic surgery, internal incisions may be made within the patient, which are sutured at the end of the surgery to allow the patient to heal properly. Learning to perform a laparoscopic surgery, including suturing a wound using laparoscopic instruments, typically requires significant training, often with simulation systems.
To train for making sutures in a laparoscopic surgery, a doctor or medical student may use a laparoscopy simulation system. In such an event, the user uses the simulation tools 130, 131 to suture the incision with a simulated thread, and then ties a simulated knot. To tie the knot, the user grasps one end of the simulated thread (the first end) with one of the simulation tools (the first tool) 130, and winds the other end of the thread (the second end) around the other simulation tool (the second tool) 131. It should be noted that the first end and second end need not be the actual terminal parts of the thread, and are not used in such a limited fashion herein. Rather, the ends simply refer to two different locations on the thread that are sufficiently far apart to allow a knot to be tied—for some sutures, the two “ends” need not be very far apart at all. In a surgery simulation, the first end may refer to the portion of the thread extending from one side of the incision, while the second end may refer to the portion of the thread extending from the other side of the incision.
The computer 110 detects the winding of the thread around the second tool 131, and determines the direction of the winding. Once the winding is completed, the user grasps the first end of the thread with the second tool 131, and pulls the first end of the simulated thread through the winding, and pulls the winding tight. The computer 110 detects the movement of the first end of the simulated thread through the winding, and the tightening of the winding as a half-hitch. The user then ties additional, successive half-hitches having the same or different winding directions. The computer 110 tracks each of the half-hitches and the direction of the winding in each of the half-hitches.
Once the user releases the thread with the first tool 130, the computer 110 determines that the user has completed the knot. The computer 110 then determines the type of knot the user has tied by analyzing the sequence of half-hitches and winding directions. For example, in this illustrative embodiment, if the user has tied a knot with two half-hitches, each of which has the same winding direction, the computer 110 detects the knot as a “granny” (or slip) knot. Or, if the user has tied a knot with two half-hitches, each with a different winding direction, the computer 110 detects the knot as a square knot. By detecting the half-hitches and windings in real-time as the user is tying the knot, the illustrative system 100 is able to correctly identify the type of knot, and to provide real-time feedback to the user, such as whether the user is correctly tying the desired knot.
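To illustrate how such real-time tracking and feedback might be organized in software, the sketch below records each detected half-hitch and compares the sequence against the knot the trainee is asked to tie; every name here is an assumption rather than a term from the specification:

```python
class KnotTracker:
    """Records half-hitches as they are detected and offers simple feedback
    (illustrative sketch only)."""

    def __init__(self, target="square"):
        self.target = target
        self.directions = []     # winding direction of each completed half-hitch

    def add_half_hitch(self, direction):
        self.directions.append(direction)

    def feedback(self):
        # Simplified hint for a two-throw target knot: a square knot needs
        # opposite winding directions, a granny/slip knot uses the same
        # direction twice.
        if len(self.directions) < 2:
            return "first half-hitch recorded"
        tied = "square" if self.directions[0] != self.directions[1] else "granny"
        return "on track" if tied == self.target else f"this is forming a {tied} knot"
```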
According to this illustrative embodiment, the system 100 may also output haptic effects to the first and/or second tools 130, 131 to provide feedback to the user. For example, the computer 110 may detect the tightening of a winding and output a resistive force on one or both tools 130, 131 to indicate that the half-hitch is being tightened. The computer may also output haptic effects to the second tool 131 while the user is wrapping the thread around the tool to emulate forces the user might experience during a real surgery, such as contact between the tool 131 and a part of the patient's body, or movement of the tool resulting from force on the thread. Still further haptic effects could be output to provide a more realistic simulation environment.
In block 210, the computer 110 receives a grasping signal. The grasping signal indicates that the first tool 130 has grasped one end of a simulated thread (referred to as the first end). For example, according to one embodiment of the present invention, a user may view an image of a thread shown in the display 120 that has been used to suture an incision. The user may also view images of the position and orientation of the simulation tools 130, 131 on the display 120. The user may manipulate the first tool 130 to be in proximity to the simulated thread, which sends signals to the computer 110 to indicate that the user has moved the first tool 130. The user may then manipulate the first tool 130 to grasp one end of the thread. When the user grasps the thread with the first tool 130, the computer 110 receives a signal indicating a grasping of the simulated thread with the first tool 130.
In one embodiment of the present invention, when the user grasps the first end of the simulated thread with the first tool 130, the computer also determines the position of the first end of the simulated thread relative to the second end of the simulated thread, and the position and orientation of the first tool 130 with respect to the second tool 131. In some embodiments of the present invention, the relative positions of the first end of the thread and the second end of the thread may be used to determine when a wrapping of the simulated thread is occurring, or a direction of the wrapping (described below). However, determining the relative positions may not be necessary. Accordingly, some embodiments may not determine the relative positions of the first end and the second end of the simulated thread. Further, some embodiments may not determine a relative position and orientation of the first tool 130 with respect to the second tool 131.
In block 220, the computer 110 receives a wrapping signal. The wrapping signal indicates that the user is wrapping, or has wrapped, the other end of the simulated thread (referred to as the second end) around the second simulation tool 131. To wrap the thread around the second tool 131, the user may manipulate the second tool to interact with the simulated thread. By viewing the movement of the second tool 131 on the display, and the second tool's interaction with the thread, the user may be able to wrap the thread around the tool to create a loop. While the user is manipulating the second tool 131 to wrap the simulated thread, the computer 110 receives a first wrapping signal indicating a first wrapping of the simulated thread around the second tool to create a first loop. Alternatively, in one embodiment, the computer 110 may receive a first wrapping signal after the user has completed a loop.
According to one embodiment of the present invention, a wrapping signal may comprise data indicative of a contact between a thread and a simulation tool. For example, the wrapping signal may comprise a vector indicating the normal force between a simulated thread and the second tool 131 at a location. In another embodiment, the wrapping signal may comprise a plurality of vectors indicating normal forces between a simulated thread and the second tool 131 at a plurality of locations. In some embodiments, the wrapping signal may be sent in real-time, i.e. as the user wraps a simulated thread around a simulation tool. Additionally, a plurality of wrapping signals may be received by the computer 110 as the user wraps the thread around the tool. In some embodiments, the wrapping signal may be sent at an interval, or after a minimum length of thread has been wrapped, or after a wrapping is complete.
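As one way to picture the data such a signal might carry, the following container is a sketch only; the field names are assumptions, not terms from the specification:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class WrappingSignal:
    """Hypothetical record for one wrapping signal."""
    tool_id: int                                                # tool the thread is wrapped around
    contact_points: List[Vec3] = field(default_factory=list)   # contact locations on the tool
    contact_normals: List[Vec3] = field(default_factory=list)  # normal force vector per location
    timestamp: float = 0.0                                      # supports real-time streaming
```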
In block 230, the computer determines the wrapping direction. According to one embodiment of the present invention, as the user wraps the simulated thread around the second tool 311, the computer 110 determines the direction of the wrapping or winding. For example, the computer may define a set of axes 320 based on the position and orientation of the second tool 311, as may be seen in
As the simulated thread 330 is wrapped around the second tool 311, the computer may calculate normal forces exerted by the second tool 311 on the simulated thread 330 at one or more points along the length of the thread 330. The normal forces may be represented as vectors by the computer 110, and the vectors may be defined having directions relative to the axes 320. After the computer 110 has calculated normal vectors along the length of the thread, the computer 110 may then determine the direction of the winding. For example, the computer 110 may analyze the normal vectors in sequence beginning at the first end 331 of the thread 330, and continuing to the second end 332 of the thread 330. As the computer 110 analyzes the normal vectors, the normal vectors will tend to change direction in a clockwise or counter-clockwise direction with respect to the axes, as the thread 330 wraps around the second tool 311, which may be seen in
In one embodiment of the present invention, the computer 110 may determine a direction of a wrapping in real-time. For example, as the computer 110 receives wrapping signals, the computer 110 may compute changes in direction of normal forces between the thread and the second tool 311. The computer 110 may use the changes in direction to determine a direction of the wrapping as the user wraps the thread around the second tool 311.
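A minimal sketch of that idea follows: the contact normals are walked from the first end toward the second end, and the signed change in their direction about the tool axis is accumulated. The function and argument names are assumptions, and real embodiments may compute this differently:

```python
import math

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def winding_direction(normals, axis_x, axis_z):
    """Walk contact normals from the first end to the second end and return
    "counter-clockwise", "clockwise", or None, based on the accumulated change
    in direction about the tool's longitudinal axis (illustrative sketch)."""
    def angle(n):
        # Direction of the normal within the plane perpendicular to the tool axis.
        return math.atan2(dot(n, axis_x), dot(n, axis_z))

    total = 0.0
    for prev, cur in zip(normals, normals[1:]):
        delta = angle(cur) - angle(prev)
        # Keep each step in (-pi, pi] so full turns accumulate correctly.
        while delta > math.pi:
            delta -= 2 * math.pi
        while delta <= -math.pi:
            delta += 2 * math.pi
        total += delta
    if abs(total) < 1e-9:
        return None
    return "counter-clockwise" if total > 0 else "clockwise"
```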
In some embodiments of the present invention, a computer may determine the direction of a first wrapping based at least in part on how the thread was initially grasped. For example, a user may grasp a first end of a thread such that, in order to pull the first end through a loop, the tools must cross (referred to as a scissor's move). In such a case, the direction of the wrapping is opposite to the apparent direction. This results from the crossing maneuver, which effectively reverses the direction of the wrapping. Thus, in one embodiment of the present invention, a computer may determine that a scissor's move has occurred, and will determine the direction of the wrapping based at least in part on the scissor's move.
In one embodiment of the present invention, the thread 330 is represented as a series of discrete pieces, such as pieces 330a and 330b. Each of these pieces has a single normal vector associated with it. By breaking the thread into pieces and calculating only one normal per piece, the number of calculations needed may be significantly reduced. In other embodiments, each piece may have a plurality of normal vectors associated with its contact with the second tool 311.
In one embodiment of the present invention, a thread is represented by a series of discrete pieces referred to as nodes. To determine a winding direction, the computer makes calculations based on the thread's nodes. The computer 110 can calculate a wrap angle based on the angles formed by the thread nodes as they wind about the tool axis. In such an embodiment, the computer 110 may compute the wrap angle, θ, according to the following equation:

θ = Σi∈C θi  (1)

for nodes i in the contact set C, where each θi is the four-quadrant arctangent evaluation θi = tan⁻¹(n·x / n·z), n is the vector from the closest point on the second tool 311 to node i, and x and z are the axes 320 perpendicular to the longitudinal axis of the second tool 311. By definition, an overhand wrap angle has a positive sign, and an underhand wrap angle has a negative sign. Thus, the computer 110 may determine the direction of a wrapping based on the sign of the angle resulting from equation (1).
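Evaluated in code, equation (1) might look roughly like the following; the node representation is an assumption made for the sketch, and atan2 supplies the four-quadrant arctangent:

```python
import math

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def wrap_angle(contact_nodes, axis_x, axis_z):
    """Sum the per-node angles of equation (1) over the contact set.
    Each node is assumed to carry "n", the vector from the closest point on
    the tool to the node. Positive result: overhand; negative: underhand."""
    theta = 0.0
    for node in contact_nodes:
        n = node["n"]
        theta += math.atan2(dot(n, axis_x), dot(n, axis_z))  # four-quadrant arctangent
    return theta

def wrapping_direction_from_angle(theta):
    return "overhand" if theta > 0 else "underhand"
```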
In block 240, the computer 110 receives a tightening signal indicating a pulling of the first end 331 of the simulated thread through the first loop. Once a user has created a loop or winding, the user may close a knot or half-hitch by pulling the first end of the thread through the loop. According to one embodiment of the present invention, a user may close a half-hitch by manipulating the second tool 311 to pull the first end of the thread through the loop made by wrapping the thread around the second tool 311.
In block 250, the computer determines that the user has completed a half-hitch. After pulling the first end 331 of the thread 330 through the first loop, the user may pull the half-hitch tight, which may cause the computer 110 to receive a signal indicating that the half-hitch has been completed. Alternatively, the computer 110 may determine that the half-hitch has been completed based on the user's manipulation of the tools, such as a motion made to tighten the half-hitch.
In block 260, the computer outputs a signal indicating that a half-hitch has been made. According to one embodiment of the present invention, the system may generate a haptic effect when the user pulls the half-hitch tight. For example, the system shown in
The system may also output the half-hitch by storing information describing the half-hitch to a computer-readable medium. For example, in one embodiment of the present invention, a system for winding analysis for knot detection may execute a simulation to test a user's ability to tie one or more knots. In such an embodiment, when the user completes a half-hitch, the system may store information relating to the half-hitch to a computer-readable medium, such as a hard disk, to be used when grading the user's performance. Information relating to the half-hitch may include the direction of the winding, the number of windings, an indication of the quality of the half-hitch, the number of attempts needed to complete the half-hitch, or the time to complete the half-hitch.
In block 270, the computer may receive a releasing signal indicating that the user has released both ends of the thread. Such a signal indicates that the user has completed the knot, and the method progresses to block 280. However, the user may not have completed the knot at this point, and may instead begin another winding, in which case the system receives a wrapping signal rather than a releasing signal. If no releasing signal is received and a wrapping signal is received, the method returns to block 220; in such a case, the user is most likely attempting to tie another half-hitch.
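As a sketch of the control flow through blocks 220 to 280 (the function and signal names here are hypothetical), the loop might be written as:

```python
def tie_knot(receive_signal, detect_half_hitch, determine_knot_type):
    """Keep detecting half-hitches until a releasing signal arrives, then
    classify the completed knot (illustrative sketch only)."""
    half_hitches = []
    while True:
        signal = receive_signal()
        if signal["type"] == "releasing":                     # block 270: knot complete
            return determine_knot_type(half_hitches)          # block 280
        if signal["type"] == "wrapping":                      # return to block 220
            half_hitches.append(detect_half_hitch(signal))
```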
In block 280, the computer 110 determines the type of knot the user has tied. In one embodiment of the present invention, the computer 110 may determine that the user has tied a “granny” or slip knot if the knot includes two half-hitches, both of which have the same winding direction. In one embodiment, the computer 110 may determine that the user has tied a square knot if the knot includes two half-hitches, each of which has a different winding direction. In one embodiment, the computer 110 may determine that the user has tied a surgeon's knot if the knot includes a half-hitch having a plurality of windings. Other embodiments can detect knots based on a sequence of half-hitches.
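The rules above could be sketched as a simple classifier; the tuple representation of a half-hitch is an assumption for illustration:

```python
def classify_knot(half_hitches):
    """Classify a knot from its half-hitches, each given as a
    (winding_direction, number_of_windings) tuple (illustrative sketch)."""
    if any(windings > 1 for _, windings in half_hitches):
        return "surgeon's knot"              # a half-hitch with several windings
    if len(half_hitches) == 2:
        (dir1, _), (dir2, _) = half_hitches
        if dir1 == dir2:
            return "granny (slip) knot"      # same winding direction twice
        return "square knot"                 # opposite winding directions
    return "unrecognized"
```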
In block 290, the computer 110 outputs the knot type. In one embodiment, the computer 110 may generate output similar to the outputs generated for completing a half-hitch. For example, the system may output a visual and/or audible indication that the knot has been completed, such as briefly changing the color of the thread to green when the knot has been successfully completed, or to red if the knot was incorrectly tied. The change in color may be accompanied by a bell or beep sound, or the audible indication may accompany a different visual indication.
The computer may also output the knot by storing information describing the knot to a computer-readable medium. For example, in one embodiment of the present invention, a system for winding analysis for knot detection may execute a simulation to test a user's ability to tie one or more knots. In such an embodiment, when the user completes a knot, the system may store information relating to the knot to a computer-readable medium, such as a hard disk, to be used when grading the user's performance. Information relating to the knot may include the type of knot attempted, the type of knot completed, the number of attempts to tie the knot, the time needed to tie the knot, the number and type of half-hitches used to make the knot, or the quality of the knot.
Referring now to
In the embodiment shown in
In addition, in some embodiments of the present invention, the simulation tools may not be bounded by a coupling to a housing. For example, according to one embodiment of the present invention, the user may wear special gloves with sensors, where the gloves are capable of transmitting their position and orientation to a computer. In another embodiment, the system may comprise video cameras or other remote sensors that are able to capture a user's movements in space without requiring the user to wear special gloves or other equipment.
Some embodiments of the present invention may provide haptic feedback to a user. For example, in one embodiment of the present invention, the laparoscopy tools 530, 531 comprise actuators configured to output haptic effects. In one such embodiment, each of the haptic tools 530, 531 comprises actuators in communication with the computer 510. Examples of suitable actuators may be found in co-pending U.S. patent application Ser. No. 10/196,717, entitled “Pivotable Computer Interface,” filed Jul. 15, 2002, the entirety of which is incorporated herein by reference. Other actuators may be suitable as well. In such an embodiment, the haptic tools 530, 531 are configured to receive one or more actuator signals from the computer 510, the actuator signals being configured to cause the actuators to output a haptic effect. In some embodiments, only one of the simulation tools 530, 531 may comprise an actuator. In other embodiments, the simulation tools 530, 531 may comprise a plurality of actuators.
Haptically-enabled embodiments of the present invention may be capable of outputting a variety of haptic effects. For example, one embodiment of the present invention is configured to output haptic effects intended to replicate forces encountered during a laparoscopic surgery, such as resistance to movement of the tools within incisions, effects indicating contact with parts of a patient's body, vibrations or movements associated with a patient's movement (e.g. heartbeats, breathing, etc.), or other forces encountered during a simulated surgery.
Some embodiments may output other haptic effects intended to alert the user to a status or message. For example, one embodiment of the present invention may provide a haptic effect to a simulation tool 530, 531 to indicate that the user is incorrectly tying a knot. Suitable effects may include vibrations or resistance to movement of the tools.
Referring again to
Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage or transmission device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may transmit or carry instructions to a processor, such as a router, private or public network, or other transmission device. The processor, and the processing described, may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
For example, program code for a software application to perform methods according to one embodiment of the present invention may comprise a series of interconnected software modules. For example,
In the embodiment shown in
Half-hitch detection block 630 is configured to read data from the data block 620 and to determine whether a half-hitch has been made. Half-hitch detection block 630 employs winding direction determination block 640 to determine the direction of a winding, and is then able to determine when a half-hitch is made based on data read from the data block 620. Such data may comprise data parameters or information carried by signals received by the transmit/receive block 610. For example, the half-hitch detection block 630 may perform half-hitch detection as described in relation to
Knot detection block 650 is configured to determine when a knot has been completed, and to determine a knot type based on data read from data block 620. Such data may comprise data parameters or information received by the transmit/receive block 610. Knot determination employs the method of knot detection described with respect to
Haptic effect generation block 660 is configured to generate haptic effects to be output to the first and/or second tools 130, 131. The information for generating the haptic effects may be gained from the data block 620, or from information received from the transmit/receive block 610. The haptic effect generation block 660 transmits haptic effect information to the transmit/receive block 610, which then transmits an actuator signal based at least in part on the haptic effect information to the first and/or second tools 130, 131.
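As a structural sketch only (class and method names are assumptions, not taken from the specification), the six blocks might be wired together roughly as follows:

```python
# Illustrative skeleton of the software module structure described above.

class TransmitReceiveBlock:                       # block 610
    def parse(self, raw_signal):
        return raw_signal                         # parse into data parameters
    def send(self, actuator_signal):
        pass                                      # forward to the tools

class DataBlock:                                  # block 620
    def __init__(self):
        self.records = []
    def store(self, record):
        self.records.append(record)

class WindingDirectionBlock:                      # block 640
    def __init__(self, data):
        self.data = data
    def direction(self):
        return None                               # e.g. sign of the wrap angle

class HalfHitchDetectionBlock:                    # block 630
    def __init__(self, data, winding):
        self.data, self.winding = data, winding
    def poll(self):
        return None                               # a completed half-hitch, if any

class KnotDetectionBlock:                         # block 650
    def __init__(self, data):
        self.data = data
    def poll(self):
        return None                               # a knot type, once complete

class HapticEffectGenerationBlock:                # block 660
    def __init__(self, data, io):
        self.data, self.io = data, io
    def effect_for(self, event):
        self.io.send({"effect": event})           # actuator signal to a tool

class SimulationSoftware:
    """Wires the blocks together in the manner described above."""
    def __init__(self):
        self.io = TransmitReceiveBlock()
        self.data = DataBlock()
        self.winding = WindingDirectionBlock(self.data)
        self.half_hitch = HalfHitchDetectionBlock(self.data, self.winding)
        self.knots = KnotDetectionBlock(self.data)
        self.haptics = HapticEffectGenerationBlock(self.data, self.io)

    def on_signal(self, raw_signal):
        self.data.store(self.io.parse(raw_signal))
        if self.half_hitch.poll() is not None:
            self.haptics.effect_for("half-hitch")
        knot = self.knots.poll()
        if knot is not None:
            self.haptics.effect_for(knot)
```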
While the embodiment described with respect to
The foregoing description of the embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.