This invention relates generally to human/computer interface devices, and more particularly to computer input devices such as mice, trackballs, etc.
Virtual reality computer systems provide users with the illusion that they are part of a “virtual” environment. A virtual reality system will typically include a personal computer or workstation, specialized virtual reality software, and virtual reality I/O devices such as head mounted displays, pointer gloves, 3D pointers, etc.
For example, a virtual reality computer system can allow a doctor-trainee or other human operator or user to “manipulate” a scalpel or probe within a computer-simulated “body”, and thereby perform medical procedures on a virtual patient. In this instance, the I/O device is typically a 3D pointer, stylus, or the like. As the “scalpel” or “probe” moves within the body image displayed on the screen of the computer system, results of such movement are updated and displayed so that the operator can gain the experience of such a procedure without practicing on an actual human being or a cadaver.
For virtual reality systems to provide a realistic (and therefore effective) experience for the user, sensory feedback and manual interaction should be as natural as possible. As virtual reality systems become more powerful and as the number of potential applications increases, there is a growing need for specific human/computer interface devices which allow users to interface with computer simulations with tools that realistically emulate the activities being represented within the virtual simulation. Such procedures as laparoscopic surgery, catheter insertion, and epidural analgesia should be realistically simulated with suitable human/computer interface devices if the doctor is to be properly trained.
While the state of the art in virtual simulation and medical imaging provides rich and realistic visual feedback, there is a great need for new human/computer interface tools which allow users to perform natural manual interactions with the computer simulation. For medical simulation, there is a strong need to provide doctors with a realistic mechanism for performing the manual activities associated with medical procedures while allowing a computer to accurately keep track of their actions.
There are a number of devices that are commercially available for interfacing a human with a computer for virtual reality simulations. There are, for example, 2-dimensional input devices such as mice, trackballs, and digitizing tablets. However, 2-dimensional input devices tend to be awkward and inadequate to the task of interfacing with 3-dimensional virtual reality simulations. In contrast, a 3-dimensional human/computer interface tool sold under the trademark Immersion PROBE™ is marketed by Immersion Human Interface Corporation of Palo Alto, Calif., and allows manual control in 3-dimensional virtual reality computer environments. A pen-like stylus allows for dexterous 3-dimensional manipulation, and the position and orientation of the stylus is communicated to a host computer. The Immersion PROBE has six degrees of freedom which convey the spatial coordinates (x, y, z) and orientation (roll, pitch, yaw) of the stylus to the host computer.
While the Immersion PROBE is an excellent 3-dimensional interface tool, it may be inappropriate for certain virtual reality simulation applications. For example, in some of the aforementioned medical simulations three or four degrees of freedom of a 3-dimensional human/computer interface tool are sufficient and, often, more desirable than five or six degrees of freedom because they more accurately mimic the real-life constraints of the actual medical procedure. Therefore, a less complex, more compact, lighter weight, lower inertia and less expensive alternative to a six degree of freedom human/computer interface tool is desirable for certain applications.
The present invention provides a 3-dimensional human/computer interface tool which is particularly well adapted to virtual reality simulation systems that require fewer degrees of freedom, e.g. two, three, or four degrees of freedom. The present invention therefore tends to be less complex, more compact, lighter in weight, less expensive, and more reliable, and to have less inertia than 3-dimensional human/computer interface tools of the prior art having more degrees of freedom.
The present invention is directed to a method and apparatus for providing an interface between a human and a computer. The human end of the interface is preferably a substantially cylindrical object such as a shaft of a surgeon's tool, a catheter, a wire, etc. Alternatively, it can comprise a pool cue, a screw driver shaft, or any other elongated object that is manipulated in 3-dimensional space by a human operator. In certain embodiments of the present invention, the computer develops signals to provide force feedback to the object. For example, a twisting or resisting force can be imparted on the object to provide haptic or force feedback of a medical procedure being performed in a virtual reality simulation.
An apparatus for interfacing with an electrical system includes a support, a gimbal mechanism coupled to the support, and preferably three electromechanical transducers, although certain embodiments (e.g. for use with catheters) may require only two electromechanical transducers. The gimbal mechanism has a base portion which is rotatably coupled to the support to provide a first degree of freedom, and an object receiving portion rotatably coupled to the base portion to provide a second degree of freedom. A first electromechanical transducer is coupled between the support and the base portion, a second electromechanical transducer is coupled between the base portion and the object receiving portion, and a third electromechanical transducer is coupled between the object receiving portion and an intermediate portion of an elongated object that is at least partially disposed within the object receiving portion. The third electromechanical transducer is associated with a third degree of freedom. Therefore, each of the three transducers is associated with a degree of freedom of movement of the object when it is engaged with the object receiving portion of the gimbal mechanism.
More specifically, an apparatus for interfacing an operator manipulable shaft with a computer includes a support, a gimbal mechanism, and four sensors. The gimbal mechanism preferably includes a U shaped base portion having a base and a pair of substantially parallel legs extending therefrom, where the base of the U shaped base portion is rotatably coupled to the support, and a shaft receiving portion pivotally coupled between the legs of the base portion. The shaft receiving portion includes a translation interface and a rotation interface that engage the shaft when it is engaged with an aperture of the shaft receiving portion. The base portion rotates around a first axis and the shaft receiving portion rotates around a second axis substantially perpendicular to the first axis, such that an axis of the shaft defines a radius in a spherical coordinate system having an origin at an intersection of the first axis and the second axis. A first sensor is coupled between the support and the U shaped base portion to provide a first output signal, a second sensor is coupled between the U shaped base portion and the shaft receiving portion to produce a second output signal, a third sensor is coupled to the translation interface to produce a third output signal, and a fourth sensor is coupled between the rotation interface and the object to produce a fourth output signal. The output signals are preferably coupled to an input of a computer by an electronic interface.
In an alternative embodiment of the present invention a first actuator is coupled between the support and the U shaped base portion to produce a movement therebetween in response to a first input electrical signal, a second actuator is coupled between the U shaped base portion and the shaft receiving portion to produce a movement therebetween in response to a second input electrical signal, a third actuator is coupled to the translation interface to produce a mechanical movement of the elongated cylindrical object relative to the shaft receiving portion in response to a third input electrical signal, and a fourth actuator is coupled to the rotation interface to produce a mechanical movement of the elongated cylindrical object relative to the shaft receiving portion in response to a fourth input electrical signal.
A method for providing a human/computer interface includes the steps of: (a) defining an origin in a 3-dimensional space; (b) physically constraining a shaft that can be grasped by an operator such that a portion of the shaft always intersects the origin and such that the portion of the shaft extending past the origin defines a radius in a spherical coordinate system; (c) transducing a first electrical signal related to a first angular coordinate of the radius in the spherical coordinate system with a first transducer; (d) transducing a second electrical signal related to a second angular coordinate of the radius in the spherical coordinate system with a second transducer; (e) transducing a third electrical signal related to the length of the radius with a third transducer; and (f) electrically coupling the transducers to a computer system to provide a human/computer interface. The method can further include the step of transducing a fourth electrical signal related to a rotation of the shaft around an axis with a fourth transducer. The transducers are either sensors, actuators, or bi-directional transducers which can serve as both sensors and actuators.
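By way of illustration only, the C sketch below shows how the three transduced quantities of steps (c)-(e), two angular coordinates and a radius, determine a position in 3-dimensional space. The function name, the angle convention, and the absence of any sensor scaling are assumptions made for this sketch and are not prescribed by the method itself.

```c
#include <math.h>

/* Illustrative only: convert the two sensed angular coordinates (radians)
 * and the sensed radius r into Cartesian coordinates whose origin lies at
 * the intersection of the gimbal axes.  Here theta is treated as the polar
 * angle measured from a fixed reference axis and phi as the azimuthal
 * angle about that axis; the actual convention depends on the mechanism. */
typedef struct { double x, y, z; } point3;

point3 shaft_position(double phi, double theta, double r)
{
    point3 p;
    p.x = r * sin(theta) * cos(phi);
    p.y = r * sin(theta) * sin(phi);
    p.z = r * cos(theta);
    return p;
}
```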
It will therefore be appreciated that a human/computer interface of the present invention includes a support, a gimbal mechanism coupled to the support, and an elongated shaft engaged with the gimbal mechanism and having a grip area that can be grasped by a hand of an operator. The gimbal mechanism has a base portion rotatably coupled to the support, and a shaft receiving portion rotatably coupled to the base. A first sensor is coupled between the support and the base portion, a second sensor is coupled between the base portion and the shaft receiving portion, and a third sensor is coupled between the shaft receiving portion and an intermediate portion of the shaft. The three sensors are coupled to an input of a computer to provide the human/computer interface. Preferably, the interface further includes a fourth sensor coupled between the shaft receiving portion and an intermediate portion of the shaft, where the third sensor is a translation sensor and the fourth sensor is a rotation sensor.
The advantage of the present invention is that a 3-dimensional human/computer interface tool is provided which has the three or four degrees of freedom available that are desirable for many virtual reality simulation applications. The mechanism of the present invention is relatively straightforward, allowing for low cost production and high reliability. Furthermore, since the human/computer interface tool of the present invention is constrained from movement along certain degrees of freedom, it can more accurately simulate the use of tools and other elongated mechanical objects which are similarly constrained. Importantly, the present interface is of low inertia since the primary mass of the interface is located at the pivot point. This, along with the light weight of the interface, makes the interface less fatiguing to use.
In another embodiment of the present invention a human/computer interface tool is provided which has only two degrees of freedom. This is particularly advantageous when the shaft is flexible, such as with very thin shafts, wires, catheters, and the like. With, for example, catheters, it is only necessary to provide two degrees of freedom (i.e. in-and-out translation, and rotation) and, therefore, sensors and/or actuators for the other degrees of freedom do not need to be provided.
These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following descriptions of the invention and a study of the several figures of the drawing.
a is a perspective view of an alternative translation interface used for wires, catheters, and the like;
FIG. 8a is a cross-section taken along line 8a-8a of FIG. 8;
FIG. 9a is a sectional view taken along line 9a-9a of FIG. 9;
b is a perspective view of an alternative sensing wheel used for wires, catheters, and the like;
FIG. 10a is a cross sectional view taken along line 10a-10a of FIG. 10;
FIG. 11a is a sectional view taken along line 11a-11a of FIG. 11;
In
A laparoscopic tool 18 used in conjunction with the present invention is manipulated by an operator and virtual reality images are displayed on a screen 20 of the digital processing system in response to such manipulations. Preferably, the digital processing system is a personal computer or workstation, such as an IBM-PC AT or Macintosh personal computer, or a SUN or Silicon Graphics workstation. Most commonly, the digital processing system is a personal computer which operates under the MS-DOS operating system in conformance with an IBM PC AT standard.
The human/interface apparatus 12 as illustrated herein is used to simulate a laparoscopic medical procedure. In addition to a standard laparoscopic tool 18, the human/interface apparatus 12 includes a barrier 22 and a standard laparoscopic trocar 24. The barrier 22 is used to represent a portion of the skin covering the body of a patient. Trocar 24 is inserted into the body of the patient to provide an entry and removal point from the body of the patient for the laparoscopic tool 18, and to allow the manipulation of the laparoscopic tool 18 within the body of the patient while minimizing tissue damage. Laparoscopic tools 18 and trocars 24 are commercially available from sources such as U.S. Surgical of Connecticut. Preferably, the laparoscopic tool 18 is modified such that the end of the tool (such as any cutting edges) is removed, leaving only the handle and the shaft. The end of the laparoscopic tool 18 is not required for the virtual reality simulation, and is removed to prevent any potential damage to persons or property. A gimbal apparatus 25 is shown within the “body” of the patient in phantom lines.
The laparoscopic tool 18 includes a handle or “grip” portion 26 and a shaft portion 28. The shaft portion is an elongated mechanical object and, in particular, is an elongated cylindrical object. The present invention is concerned with tracking the movement of the shaft portion 28 in three-dimensional space, where the movement has been constrained such that the shaft portion 28 has only three or four free degrees of motion. This is a good simulation of the real use of a laparoscopic tool 18 in that once it is inserted into a trocar 24 and through the gimbal apparatus 25, it is limited to about four degrees of freedom. More particularly, the shaft 28 is constrained at some point along its length such that it can move with four degrees of freedom within the patient's body.
While the present invention will be discussed with reference to the shaft portion 28 of laparoscopic tool 18, it will be appreciated that a great number of other types of objects can be used with the method and apparatus of the present invention. In fact, the present invention can be used with any elongated mechanical object where it is desirable to provide a human/computer interface with three or four degrees of freedom. Such objects may include catheters, hypodermic needles, wires, fiber optic bundles, screw drivers, pool cues, etc. Furthermore, although the described preferred embodiment of the present invention contemplates the use of an elongated cylindrical mechanical object, other embodiments of the present invention provide a similar human/computer interface for elongated mechanical objects which are not cylindrical in shape.
The electronic interface 14 is a part of the human/computer interface apparatus 12 and couples the apparatus 12 to the computer 16. An electronic interface 14 that is particularly well adapted for the present invention is described in U.S. patent application Ser. No. 08/092,974, filed Jul. 16, 1993, now U.S. Pat. No. 5,576,727, assigned to the assignee of the present invention and incorporated herein by reference in its entirety. The electronic interface described therein was designed for the Immersion PROBE™ 3-D mechanical mouse and has six channels corresponding to the six degrees of freedom of the Immersion PROBE. However, in the context of the present invention, the electronic interface 14 requires the use of only four of the six channels, since the present invention is preferably constrained to no more than four degrees of freedom.
The electronic interface 14 is coupled to a gimbal apparatus 25 of the apparatus 12 by a cable 30 and is coupled to the computer 16 by a cable 32. In some embodiments of the present invention, interface 14 serves solely as an input device for the computer 16. In other embodiments of the present invention, interface 14 serves solely as an output device for the computer 16. In yet other embodiments of the present invention, the interface 14 serves as an input/output (I/O) device for the computer 16.
In an alternative embodiment of the present invention, interface 14 has a local microprocessor 33 preferably coupled with any transducers present in the interface 14 and with a transceiver 35. In such an embodiment, the computer 16 is coupled to the transceiver 35 and, typically, not coupled directly with any transducers present in the interface 14. As will be appreciated, the transceiver 35 may be any suitable transceiver capable of bidirectional communication through serial or parallel communication strategies. The local microprocessor 33 will be programmed to execute computer instructions locally such that a computing burden is removed from the computer 16. For example, positional information generated by the transducers may be processed locally by the local microprocessor 33, which in turn can send absolute position and velocity information to the computer 16. Still further, the local microprocessor 33 is capable of receiving incoming force commands from the computer 16, decoding such commands, and controlling the interface 14 accordingly. For more details, see U.S. Pat. No. 5,576,727 of Rosenberg et al.
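By way of illustration only, the following C sketch shows one possible shape of the local processing described above: the microprocessor reads each sensor, reports the resulting position, estimates velocity by finite differences, and forwards both to the host computer. The function names, the channel count, and the report format are hypothetical; the actual firmware and communication protocol are described in the incorporated patent.

```c
/* Illustrative only: a local-processing loop of the kind a local
 * microprocessor 33 might run.  Names and formats are hypothetical. */
#define NUM_CHANNELS 4

typedef struct {
    double position[NUM_CHANNELS];   /* position per degree of freedom */
    double velocity[NUM_CHANNELS];   /* finite-difference velocity estimate */
} state_report;

extern double read_sensor(int channel);              /* assumed hardware access */
extern void   send_report(const state_report *rep);  /* assumed transceiver call */

void local_update(double dt)
{
    static double previous[NUM_CHANNELS];
    state_report rep;
    for (int ch = 0; ch < NUM_CHANNELS; ch++) {
        double p = read_sensor(ch);
        rep.position[ch] = p;
        rep.velocity[ch] = (p - previous[ch]) / dt;  /* crude velocity estimate */
        previous[ch] = p;
    }
    send_report(&rep);   /* host receives processed data, not raw sensor values */
}
```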
In the perspective view of
The gimbal mechanism 36 also includes an elongated object (shaft) receiving portion 44 provided with an aperture 46 which extends entirely through the object receiving portion. The aperture 46 defines an object axis A0 for an elongated cylindrical object, such that the shaft portion 28 of the laparoscopic tool 18 of
The object receiving portion 44 also includes a translation interface 50 and a rotation interface 52. The object receiving portion 44 includes a bearing section 54, a translation sensor section 56, and a rotation sensor section 58. The bearing section 54 includes a mass of material provided with a cylindrical bore 60 forming a portion of the aperture 46. The translation sensor section 56 includes a pair of opposing wall surfaces 62a and 62b, each of which is provided with a cylindrical bore receptive to the cylindrical object and forming a part of the aperture 46 which extends through the object receiving portion. The rotation sensor section 58 includes a pair of opposing wall surfaces 64a and 64b of a wall 63, which are provided with cylindrical bores receptive to the cylindrical object and therefore also forming a part of the aperture 46. In consequence, when an elongated cylindrical object is inserted into the object receiving portion 44 along axis A0 it engages the bore 60 of the bearing section 54, and extends through bores provided in the surfaces 62a, 62b, 64a, and 64b to extend completely through the object receiving portion 44 along aperture 46. In another embodiment of the present invention, wall 63 (and therefore wall surfaces 64a and 64b) is eliminated as being superfluous.
Referring briefly to
The object receiving portion 44 is preferably a unitary mass of material made from aluminum or some other lightweight material such as a plastic. The object receiving portion 44 is preferably cast, molded, and/or machined as a monoblock member having the aforementioned bearing section, translation sensor section, and rotation sensor section. The materials and construction of U shaped base portion 38 preferably match the materials and construction techniques used for the production of object receiving portion 44.
The gimbal apparatus 25 illustrated in
Four electromechanical transducers are used in association with these four degrees of freedom. More particularly, a first degree of freedom electromechanical transducer 66 is arranged to transduce motion and/or force between the U shaped base portion 38 and the support 34, a second degree of freedom electromechanical transducer 68 is arranged to transduce motion and/or force between the U shaped base portion 38 and the object receiving portion 44, a third degree of freedom electromechanical transducer 70 is arranged to transduce motion and/or force between the object receiving portion 44 and an object engaged with the object receiving portion 44, and a fourth degree of freedom transducer 72 is arranged to transduce motion and/or force between the object receiving portion 44 and an object engaged with the object receiving portion 44.
By “associated with”, “related to”, or the like, it is meant that the electromechanical transducer is influenced by or influences one of the four degrees of freedom. The electromechanical transducers can be input transducers, in which case they sense motion along a respective degree of freedom and produce an electrical signal corresponding thereto for input into computer 16. Alternatively, the electromechanical transducers can be output transducers which receive electrical signals from computer 16 that cause the transducers to impart a force on the object in accordance with their respective degrees of freedom. The electromechanical transducers can also be hybrid or bi-directional transducers which operate both as sensors and as actuator devices.
A variety of transducers, readily available in the commercial market, are suitable for use in the present invention. For example, if the transducers are input transducers (“sensors”), such sensors can include encoded wheel transducers, potentiometers, etc. Output transducers (“actuators”) include stepper motors, servo motors, magnetic particle brakes, friction brakes, pneumatic actuators, etc. Hybrid or bidirectional transducers often pair input and output transducers together, but may also include a purely bi-directional transducer such as a permanent magnet electric motor/generator.
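By way of illustration only, these three transducer roles could be represented in interface software along the following lines; this is a sketch, and the present invention does not prescribe any particular software representation.

```c
/* Illustrative only: a software view of the three transducer roles. */
typedef enum {
    TRANSDUCER_INPUT,          /* sensor: produces a signal for the computer */
    TRANSDUCER_OUTPUT,         /* actuator: imparts a force on the object */
    TRANSDUCER_BIDIRECTIONAL   /* operates as both sensor and actuator */
} transducer_role;

typedef struct {
    transducer_role role;
    double (*read_position)(void);    /* meaningful for input/bidirectional devices */
    void   (*apply_force)(double f);  /* meaningful for output/bidirectional devices */
} transducer;
```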
It should be noted that the present invention can utilize both absolute and relative sensors. An absolute sensor is one in which the angle of the sensor is known in absolute terms, such as with an analog potentiometer. Relative sensors only provide relative angle information, and thus require some form of calibration step which provides a reference position for the relative angle information. The sensors described herein are primarily relative sensors. In consequence, there is an implied calibration step after system power-up wherein the shaft is placed in a known position within the gimbal mechanism and a calibration signal is provided to the system to provide the reference position mentioned above. All angles provided by the sensors are thereafter relative to that reference position. Such calibration methods are well known to those skilled in the art and, therefore, will not be discussed in any great detail herein.
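By way of illustration only, such a calibration step might be implemented along the following lines in C, where the encoder resolution, the channel count, and the hardware access function are assumptions made for this sketch.

```c
/* Illustrative only: calibrating relative (incremental) sensors at power-up. */
#define COUNTS_PER_REV 4096L      /* assumed encoder resolution */
#define NUM_SENSORS    4
#define TWO_PI         6.28318530717958647692

static long reference_count[NUM_SENSORS];   /* counts latched at the known pose */

extern long read_encoder(int channel);      /* assumed hardware access */

/* Call once after power-up while the shaft is held in the known position. */
void calibrate(void)
{
    for (int ch = 0; ch < NUM_SENSORS; ch++)
        reference_count[ch] = read_encoder(ch);
}

/* Thereafter, every reported angle is relative to the reference position. */
double relative_angle(int channel)
{
    long delta = read_encoder(channel) - reference_count[channel];
    return (double)delta * TWO_PI / (double)COUNTS_PER_REV;
}
```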
A preferred input transducer for use of the present invention is an optical encoder model SI marketed by U.S. Digital of Vancouver, Wash. This transducer is an encoded wheel type input transducer. A preferred output transducer for use of the present invention is a d.c. motor model 2434.970-50 produced by Maxon of Fall River, Mass. This type of transducer is a servo motor type output transducer.
There are a number of ways of attaching the transducers to the various members of the gimbal apparatus 25. In this preferred embodiment, a housing of transducer 66 is attached to the U shaped base portion 38, and a shaft of the transducer extends through an oversize bore (not shown) in base 40 to engage a press-fit bore (also not shown) in support 34. Therefore, rotation of the U shaped base portion 38 around axis A1 will cause a rotation of a shaft of transducer 66. A housing of transducer 68 is attached to leg 42a of the U shaped base portion 38 such that its shaft forms pivot 48a. Therefore, rotation of the object receiving portion 44 around axis A2 will cause a rotation of the shaft of a second transducer 68. The transducer 70 is attached to the object receiving portion 44 and extends through a bore (not shown) in a wall 74 of the translation sensor section 56. The shaft 76 provides an axis about which the translation interface 50 can rotate. The fourth transducer 72 is attached to a wall 78 of rotation sensor section 58 and extends through a bore 80 in that wall 78. The shaft 82 of the transducer 72 engages a circumferential surface of rotation interface 52 and rotates therewith.
Axes A1 and A2 are substantially mutually perpendicular and intersect at an origin point O within object receiving portion 44. Axis A0 also intersects this origin O. Shaft 76 rotates around an axis A3 which is substantially perpendicular to the axis A0. Shaft 82 of transducer 72 rotates around an axis A4 which is substantially parallel to the axis A0.
In
In
The four degrees of freedom are illustrated graphically in
In
a illustrate a modified laparoscopic tool 104. More particularly, a sensor 106 has been added to determine when the handle 108 has been squeezed, and the shaft 110 has been grooved or slotted for a purpose to be discussed subsequently. The sensor 106 can be coupled to the computer 16 through electronic interface 14 to provide additional input to the virtual reality system.
With reference to
a illustrate an alternate embodiment for transducer 72 which utilizes the shaft 110 and a detector mechanism similar to the one illustrated in
In
a illustrate another embodiment 72″ for the transducer 72 of
Another embodiment 72″′ for the fourth transducer is illustrated in
With reference to all of the figures, and with particular reference to
To this point, the majority of the discussion has been under the assumption that the transducers are input transducers, i.e. the human/computer interface device is used as an input device to the computer 16. However, it has also been mentioned that the interface device 12 can serve as an output device for the computer 16. When used as an output device, output transducers (“actuators”) are used to respond to electrical signals developed by the computer 16 to impart a force upon the shaft 28 of the laparoscopic tool 18. This can provide useful movement and force (haptic) feedback to the doctor/trainee or other user. For example, if the laparoscopic tool encounters a dense mass of tissue or a bone in the “virtual” patient, a force can be generated by transducer 70 making it harder for the doctor/trainee to push the shaft 28 further into the gimbal apparatus 25. Likewise, twisting motions can be imparted on the shaft 28 when the shaft encounters an obstacle within the virtual patient.
It should be noted that force applied to the shaft may not result in any movement of the shaft. This is because the shaft may be inhibited from movement by the hand of the operator who is grasping a handle or grip portion of the shaft. However, the force applied to the shaft may be sensed by the operator as haptic feedback.
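By way of illustration only, the C sketch below shows one simple way a host computer might command such a resisting force: a spring-like force proportional to the depth of penetration into a dense virtual region, saturated to protect the actuator. The stiffness value, the force limit, and the command function are assumptions of this sketch and are not the method prescribed by the present invention.

```c
/* Illustrative only: host-side computation of a resisting force when the
 * virtual tool penetrates a dense region.  Names and constants assumed. */
#define STIFFNESS 2.0    /* resisting force per unit penetration (assumed units) */
#define MAX_FORCE 5.0    /* saturate commands to protect the actuator */

extern void command_translation_force(double force);  /* assumed interface call */

/* depth:   current insertion depth of the shaft along its axis
 * surface: insertion depth at which the dense virtual tissue begins */
void update_haptic_force(double depth, double surface)
{
    double penetration = depth - surface;
    double force = 0.0;
    if (penetration > 0.0) {
        force = STIFFNESS * penetration;   /* stronger resistance when deeper */
        if (force > MAX_FORCE)
            force = MAX_FORCE;
    }
    command_translation_force(-force);     /* oppose further insertion */
}
```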
With reference to
It will be noted that the electrical system most frequently described in the present invention is a digital processing system or a computer. However, other digital systems, analog systems, and simple electric or electromechanical systems can also be utilized with the apparatus and method of the present invention.
It will also be noted that while specific examples of “elongated objects” and “shafts” have been given, these examples are not meant to be limiting. In general, equivalents of “elongated objects”, “elongated cylindrical objects”, “shafts”, etc. include any object which can be grasped by a human operator to provide an interface between the operator and a computer system. By “grasp”, it is meant that operators may releasably engage a grip portion of the object in some fashion, such as by hand, with their fingertips, or even orally in the case of handicapped persons. The “grip” can be a functional grip or handle attached to an elongated portion of the object, or can be a portion of the object itself, such as a portion of the length of a shaft that can be gripped and/or manipulated by the operator.
It should also be noted that flexible shafts, such as wires or catheters, do not always require three or four degrees of freedom. For example, if a human/computer interface for a catheter insertion virtual reality system is desired, only a translation interface (e.g. translation interface 50′ of
While this invention has been described in terms of several preferred embodiments, it is contemplated that alternatives, modifications, permutations and equivalents thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. It is therefore intended that the following appended claims include all such alternatives, modifications, permutations and equivalents as fall within the true spirit and scope of the present invention.
This application is a continuation of U.S. application Ser. No. 11/725,958, filed Mar. 19, 2007, which is a continuation of U.S. application Ser. No. 10/674,423, filed Oct. 1, 2003, now U.S. Pat. No. 7,215,326, which is a continuation of U.S. application Ser. No. 09/996,487, filed Nov. 27, 2001, now U.S. Pat. No. 6,654,000, which is a continuation of U.S. application Ser. No. 09/276,012, filed Mar. 25, 1999, now U.S. Pat. No. 6,323,837, which is a continuation of application Ser. No. 08/833,502, filed Apr. 7, 1997, now U.S. Pat. No. 6,037,927, which is a continuation of application Ser. No. 08/275,120, filed Jul. 14, 1994, now U.S. Pat. No. 5,623,582, the entirety of all of which are hereby incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
2906179 | Bower | Sep 1959 | A |
2972140 | Hirsch | Feb 1961 | A |
3157853 | Hirsch | Nov 1964 | A |
3220121 | Cutler | Nov 1965 | A |
3226846 | Wood | Jan 1966 | A |
3497668 | Hirsch | Feb 1970 | A |
3517446 | Codvon et al. | Jun 1970 | A |
3520060 | Crabtree et al. | Jul 1970 | A |
3531868 | Stevenson | Oct 1970 | A |
3623064 | Kagan | Nov 1971 | A |
3662076 | Gordon et al. | May 1972 | A |
3775865 | Rowan | Dec 1973 | A |
3863098 | Mehr | Jan 1975 | A |
3890958 | Fister et al. | Jun 1975 | A |
3902687 | Hightower | Sep 1975 | A |
3903614 | Diamond et al. | Sep 1975 | A |
3911416 | Feder | Oct 1975 | A |
3919691 | Noll | Nov 1975 | A |
3944798 | Eaton | Mar 1976 | A |
4052981 | Bachmann | Oct 1977 | A |
4127752 | Lowthorp | Nov 1978 | A |
4148014 | Burson | Apr 1979 | A |
4160508 | Salisbury, Jr. | Jul 1979 | A |
4216467 | Colston | Aug 1980 | A |
4227319 | Guy et al. | Oct 1980 | A |
4236325 | Hall et al. | Dec 1980 | A |
4262549 | Schwellenbach | Apr 1981 | A |
4321047 | Landis | Mar 1982 | A |
4333070 | Barnes | Jun 1982 | A |
4360345 | Hon | Nov 1982 | A |
4391282 | Ando et al. | Jul 1983 | A |
4398889 | Lam et al. | Aug 1983 | A |
4439162 | Blaine | Mar 1984 | A |
4444205 | Jackson | Apr 1984 | A |
4459113 | Boscaro Gatti et al. | Jul 1984 | A |
4464117 | Foerst | Aug 1984 | A |
4477043 | Repperger | Oct 1984 | A |
4477973 | Davies | Oct 1984 | A |
4484191 | Vavra | Nov 1984 | A |
4513235 | Acklam et al. | Apr 1985 | A |
4571834 | Fraser et al. | Feb 1986 | A |
4575297 | Richter | Mar 1986 | A |
4581491 | Boothroyd | Apr 1986 | A |
4593470 | Davies | Jun 1986 | A |
4599070 | Hladky et al. | Jul 1986 | A |
4632341 | Repperger et al. | Dec 1986 | A |
4638798 | Shelden et al. | Jan 1987 | A |
4642055 | Saliterman | Feb 1987 | A |
4653011 | Iwano | Mar 1987 | A |
4654648 | Herrington et al. | Mar 1987 | A |
4664130 | Gracovetsky | May 1987 | A |
4676002 | Slocum | Jun 1987 | A |
4679331 | Koontz | Jul 1987 | A |
4680519 | Chand et al. | Jul 1987 | A |
4685464 | Goldberger et al. | Aug 1987 | A |
4703443 | Moriyasu | Oct 1987 | A |
4708656 | de Vries et al. | Nov 1987 | A |
4713007 | Alban | Dec 1987 | A |
4726772 | Amplatz | Feb 1988 | A |
4750487 | Zanetti | Jun 1988 | A |
4769763 | Trieb et al. | Sep 1988 | A |
4771344 | Fallcaro et al. | Sep 1988 | A |
4773865 | Baldwin | Sep 1988 | A |
4787051 | Olson | Nov 1988 | A |
4789340 | Zikria | Dec 1988 | A |
4791934 | Brunnett | Dec 1988 | A |
4794392 | Selinko | Dec 1988 | A |
4800721 | Cemenska et al. | Jan 1989 | A |
4811608 | Hilton | Mar 1989 | A |
4819195 | Bell et al. | Apr 1989 | A |
4825872 | Tan et al. | May 1989 | A |
4839838 | LaBiche et al. | Jun 1989 | A |
4849692 | Blood | Jul 1989 | A |
4868549 | Affinito et al. | Sep 1989 | A |
4879556 | Duimel | Nov 1989 | A |
4879668 | Cline et al. | Nov 1989 | A |
4885565 | Embach | Dec 1989 | A |
4888877 | Enderle et al. | Dec 1989 | A |
4891764 | McIntosh | Jan 1990 | A |
4891889 | Tomelleri | Jan 1990 | A |
4896554 | Culver | Jan 1990 | A |
4907970 | Meenen, Jr. | Mar 1990 | A |
4907973 | Hon | Mar 1990 | A |
4930770 | Baker | Jun 1990 | A |
4934694 | McIntosh | Jun 1990 | A |
4942545 | Sapia | Jul 1990 | A |
4945305 | Blood | Jul 1990 | A |
4945501 | Bell et al. | Jul 1990 | A |
4961138 | Gorniak | Oct 1990 | A |
4962591 | Zeller et al. | Oct 1990 | A |
4982504 | Soderberg et al. | Jan 1991 | A |
4986280 | Marcus et al. | Jan 1991 | A |
5007085 | Greanias et al. | Apr 1991 | A |
5007300 | Siva | Apr 1991 | A |
5019761 | Kraft | May 1991 | A |
5022384 | Freels | Jun 1991 | A |
5022407 | Horch et al. | Jun 1991 | A |
5035242 | Franklin et al. | Jul 1991 | A |
5038089 | Szakaly | Aug 1991 | A |
5040306 | McMurtry et al. | Aug 1991 | A |
5044956 | Behensky et al. | Sep 1991 | A |
5047942 | Middleton et al. | Sep 1991 | A |
5047952 | Kramer et al. | Sep 1991 | A |
5050608 | Watanabe et al. | Sep 1991 | A |
5067072 | Talati et al. | Nov 1991 | A |
5072361 | Davis et al. | Dec 1991 | A |
5078152 | Bond et al. | Jan 1992 | A |
5086401 | Glassman et al. | Feb 1992 | A |
5088046 | McMurtry et al. | Feb 1992 | A |
5088055 | Oyama | Feb 1992 | A |
5095303 | Clark et al. | Mar 1992 | A |
5103404 | McIntosh | Apr 1992 | A |
5107080 | Rosen | Apr 1992 | A |
5112228 | Zouras | May 1992 | A |
5116051 | Moncrief et al. | May 1992 | A |
5121747 | Andrews | Jun 1992 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5131844 | Marinaccio et al. | Jul 1992 | A |
5132672 | Clark | Jul 1992 | A |
5137458 | Ungs et al. | Aug 1992 | A |
5139261 | Openiano | Aug 1992 | A |
5142506 | Edwards | Aug 1992 | A |
5142931 | Menahem | Sep 1992 | A |
5143505 | Burdea et al. | Sep 1992 | A |
5146566 | Hollis, Jr. et al. | Sep 1992 | A |
5148377 | McDonald | Sep 1992 | A |
5149270 | McKeown | Sep 1992 | A |
5149487 | McKeown | Sep 1992 | A |
5162713 | Mohri et al. | Nov 1992 | A |
5165897 | Johnson | Nov 1992 | A |
5175459 | Danial et al. | Dec 1992 | A |
5181181 | Glynn | Jan 1993 | A |
5182557 | Lang | Jan 1993 | A |
5184319 | Kramer | Feb 1993 | A |
5185561 | Good et al. | Feb 1993 | A |
5186629 | Rohen | Feb 1993 | A |
5186695 | Mangseth et al. | Feb 1993 | A |
5187874 | Takahashi et al. | Feb 1993 | A |
5189806 | McMurtry et al. | Mar 1993 | A |
5193963 | McAffee et al. | Mar 1993 | A |
5204824 | Fujimaki | Apr 1993 | A |
5205289 | Hardy et al. | Apr 1993 | A |
5212473 | Louis | May 1993 | A |
5217003 | Wilk | Jun 1993 | A |
5220260 | Schuler | Jun 1993 | A |
5223776 | Radke et al. | Jun 1993 | A |
5228356 | Chuang | Jul 1993 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5240417 | Smithson et al. | Aug 1993 | A |
5251127 | Raab | Oct 1993 | A |
5251156 | Heier et al. | Oct 1993 | A |
5259120 | Chapman et al. | Nov 1993 | A |
5271290 | Fischer | Dec 1993 | A |
5273038 | Beaven | Dec 1993 | A |
5275174 | Cook | Jan 1994 | A |
5275565 | Moncrief | Jan 1994 | A |
5280265 | Kramer et al. | Jan 1994 | A |
5283970 | Aigner | Feb 1994 | A |
5290276 | Sewell, Jr. | Mar 1994 | A |
5295694 | Levin | Mar 1994 | A |
5296846 | Ledley | Mar 1994 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5309140 | Everett, Jr. et al. | May 1994 | A |
5317336 | Hall | May 1994 | A |
5320537 | Watson | Jun 1994 | A |
5333106 | Lanpher et al. | Jul 1994 | A |
5334017 | Lang et al. | Aug 1994 | A |
5334027 | Wherlock | Aug 1994 | A |
5337412 | Baker et al. | Aug 1994 | A |
5338198 | Wu et al. | Aug 1994 | A |
5339723 | Huitema | Aug 1994 | A |
5351692 | Dow et al. | Oct 1994 | A |
5354162 | Burdea et al. | Oct 1994 | A |
5368487 | Medina | Nov 1994 | A |
5376007 | Zirm | Dec 1994 | A |
5379663 | Hara | Jan 1995 | A |
5384460 | Tseng | Jan 1995 | A |
5385474 | Brindle | Jan 1995 | A |
5389849 | Asano et al. | Feb 1995 | A |
5389865 | Jacobus et al. | Feb 1995 | A |
5391081 | Lampotang et al. | Feb 1995 | A |
5396266 | Brimhall | Mar 1995 | A |
5396895 | Takashima et al. | Mar 1995 | A |
5397323 | Taylor | Mar 1995 | A |
5402582 | Raab | Apr 1995 | A |
5403191 | Tuason | Apr 1995 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5414337 | Schuler | May 1995 | A |
5417210 | Funda et al. | May 1995 | A |
5417696 | Kashuba et al. | May 1995 | A |
5428748 | Davidson et al. | Jun 1995 | A |
5429140 | Burdea et al. | Jul 1995 | A |
5435729 | Hildreth et al. | Jul 1995 | A |
5436542 | Petelin et al. | Jul 1995 | A |
5436622 | Gutman et al. | Jul 1995 | A |
5436638 | Bolas et al. | Jul 1995 | A |
5437607 | Taylor | Aug 1995 | A |
5438529 | Rosenberg et al. | Aug 1995 | A |
5445166 | Taylor | Aug 1995 | A |
5451924 | Massimino et al. | Sep 1995 | A |
5454722 | Holland et al. | Oct 1995 | A |
5459382 | Jacobus et al. | Oct 1995 | A |
5461711 | Wang et al. | Oct 1995 | A |
5466213 | Hogan et al. | Nov 1995 | A |
5467763 | McMahon et al. | Nov 1995 | A |
5482051 | Reddy et al. | Jan 1996 | A |
5482472 | Garoni et al. | Jan 1996 | A |
5483961 | Kelly et al. | Jan 1996 | A |
5509810 | Schertz et al. | Apr 1996 | A |
5510832 | Garcia | Apr 1996 | A |
5512919 | Araki | Apr 1996 | A |
5513100 | Parker et al. | Apr 1996 | A |
5513992 | Refait | May 1996 | A |
5518406 | Waters | May 1996 | A |
5547382 | Yamasaki et al. | Aug 1996 | A |
5575761 | Hajianpour | Nov 1996 | A |
5576727 | Rosenberg et al. | Nov 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
5587937 | Massie et al. | Dec 1996 | A |
5589828 | Armstrong | Dec 1996 | A |
5598269 | Kitaevich et al. | Jan 1997 | A |
5609485 | Bergman et al. | Mar 1997 | A |
5609560 | Ichikawa et al. | Mar 1997 | A |
5623582 | Rosenberg | Apr 1997 | A |
5629594 | Jacobus et al. | May 1997 | A |
5643087 | Marcus et al. | Jul 1997 | A |
5667517 | Hooven | Sep 1997 | A |
5676157 | Kramer | Oct 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5701140 | Rosenberg et al. | Dec 1997 | A |
5704791 | Gillio | Jan 1998 | A |
5709219 | Chen et al. | Jan 1998 | A |
5721566 | Rosenberg | Feb 1998 | A |
5724264 | Rosenberg et al. | Mar 1998 | A |
5731804 | Rosenberg | Mar 1998 | A |
5734373 | Rosenberg et al. | Mar 1998 | A |
5739811 | Rosenberg et al. | Apr 1998 | A |
5742278 | Chen et al. | Apr 1998 | A |
5755577 | Gillio | May 1998 | A |
5766016 | Sinclair et al. | Jun 1998 | A |
5767839 | Rosenberg | Jun 1998 | A |
5769640 | Jacobus et al. | Jun 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
5790108 | Salcudean et al. | Aug 1998 | A |
5805140 | Rosenberg et al. | Sep 1998 | A |
5808665 | Green | Sep 1998 | A |
5821920 | Rosenberg et al. | Oct 1998 | A |
5841423 | Carroll, Jr. et al. | Nov 1998 | A |
5880714 | Rosenberg et al. | Mar 1999 | A |
5882207 | Lampotang et al. | Mar 1999 | A |
5889670 | Schuler et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5909380 | Dubois et al. | Jun 1999 | A |
5920319 | Vining et al. | Jul 1999 | A |
5929846 | Rosenberg et al. | Jul 1999 | A |
5941710 | Lampotang et al. | Aug 1999 | A |
5944151 | Jakobs et al. | Aug 1999 | A |
5950629 | Taylor et al. | Sep 1999 | A |
5976156 | Taylor et al. | Nov 1999 | A |
6004134 | Marcus et al. | Dec 1999 | A |
6024576 | Bevirt et al. | Feb 2000 | A |
6037927 | Rosenberg | Mar 2000 | A |
6046727 | Rosenberg et al. | Apr 2000 | A |
6057828 | Rosenberg et al. | May 2000 | A |
6059506 | Kramer | May 2000 | A |
6104158 | Jacobus et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6125337 | Rosenberg et al. | Sep 2000 | A |
6131097 | Peurach et al. | Oct 2000 | A |
6154198 | Rosenberg | Nov 2000 | A |
6160489 | Perry et al. | Dec 2000 | A |
6162190 | Kramer | Dec 2000 | A |
6195592 | Schuler et al. | Feb 2001 | B1 |
6223100 | Green | Apr 2001 | B1 |
6246390 | Rosenberg | Jun 2001 | B1 |
6300937 | Rosenberg | Oct 2001 | B1 |
6323837 | Rosenberg | Nov 2001 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6437771 | Rosenberg et al. | Aug 2002 | B1 |
6758843 | Jensen | Jul 2004 | B2 |
6876891 | Schuler et al. | Apr 2005 | B1 |
7091950 | Rosenberg et al. | Aug 2006 | B2 |
7215326 | Rosenberg | May 2007 | B2 |
Number | Date | Country |
---|---|---|
0 349 086 | Jan 1990 | EP |
0571827 | Dec 1993 | EP |
2254911 | Oct 1992 | GB |
H2-185278 | Jul 1990 | JP |
H4-8381 | Jan 1992 | JP |
434610 | Feb 1992 | JP |
H5-192449 | Aug 1993 | JP |
H7-24147 | Jan 1995 | JP |
1124372 | Nov 1984 | SU |
WO 9106935 | May 1991 | WO |
WO 9400052 | Jan 1994 | WO |
WO 9426167 | Nov 1994 | WO |
WO 9502233 | Jan 1995 | WO |
WO 9502801 | Jan 1995 | WO |
WO 9520787 | Aug 1995 | WO |
WO 9520788 | Sep 1995 | WO |
WO 9616397 | May 1996 | WO |
WO 9622591 | Jul 1996 | WO |
WO 9639944 | Dec 1996 | WO |
Number | Date | Country | |
---|---|---|---|
20090299711 A1 | Dec 2009 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11725958 | Mar 2007 | US |
Child | 12537967 | US | |
Parent | 10674423 | Oct 2003 | US |
Child | 11725958 | US | |
Parent | 09996487 | Nov 2001 | US |
Child | 10674423 | US | |
Parent | 09276012 | Mar 1999 | US |
Child | 09996487 | US | |
Parent | 08833502 | Apr 1997 | US |
Child | 09276012 | US | |
Parent | 08275120 | Jul 1994 | US |
Child | 08833502 | US |