Interactive gaming systems with haptic feedback

Information

  • Patent Grant
  • Patent Number
    8,992,322
  • Date Filed
    Wednesday, June 9, 2004
  • Date Issued
    Tuesday, March 31, 2015
Abstract
Interactive gaming systems and methods with haptic feedback are described. One described apparatus comprises: a portable manipulandum configured to simulate an element associated with a sports activity; a sensor in communication with the portable manipulandum and configured to detect a movement of the portable manipulandum; an actuator disposed in the portable manipulandum; and a processor in communication with the sensor and the actuator. The processor is configured to receive a sensor signal from the sensor associated with the movement of the portable manipulandum and to output a control signal to the actuator based at least in part on the sensor signal. The control signal is configured to cause the actuator to output a haptic feedback associated with the movement of the portable manipulandum, e.g., simulating an impact between the portable manipulandum and an object.
Description
FIELD OF THE INVENTION

This invention relates generally to haptic-feedback systems. More specifically, embodiments of the invention relate to interactive gaming systems with haptic feedback.


BACKGROUND

Interactive gaming systems can simulate various sports or physical activities, such as tennis, baseball, ping-pong, soccer, fishing, or playing a musical instrument. Such systems allow users (or players) to compete with “virtual opponents,” or simulate other sports/physical activities (e.g., fishing), in virtual playing fields displayed on televisions, and have gained popularity as entertainment and/or training means. Although visual and audio cues are typically employed to inform players of the status of the contests, many such systems provide no haptic feedback—an essential component of many sports/physical activities—to the players. A need therefore exists in the art to incorporate haptic feedback in interactive gaming systems, so as to convey greater realism to the players.


SUMMARY

Embodiments of the invention relate to interactive gaming systems and methods with haptic feedback.


In one embodiment, an apparatus comprises: a portable manipulandum configured to simulate an element associated with a physical activity; a sensor in communication with the portable manipulandum and configured to detect a movement of the portable manipulandum; an actuator disposed in the portable manipulandum; and a processor in communication with the sensor and the actuator. The processor is configured to receive a sensor signal from the sensor associated with the movement of the portable manipulandum and to output a control signal to the actuator based at least in part on the sensor signal. The control signal is configured to cause the actuator to output a haptic feedback associated with the movement of the portable manipulandum, e.g., simulating an impact between the portable manipulandum and an object.


These embodiments are mentioned not to limit or define the invention, but to provide examples of embodiments of the invention to aid understanding thereof. Embodiments are discussed in the Detailed Description, and further description of the invention is provided there. Advantages offered by the various embodiments of the present invention may be further understood by examining this specification.





BRIEF DESCRIPTION OF THE FIGURES

These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings, wherein:



FIG. 1 illustrates several embodiments of a portable manipulandum configured to simulate several sports elements, according to the present invention;



FIG. 2 shows a schematic diagram of an interactive gaming system with haptic feedback, comprising a portable manipulandum with an actuator disposed therein, according to one embodiment of the present invention;



FIG. 3 shows a schematic diagram of an interactive gaming system with haptic feedback, comprising a host processor and a local processor, according to one embodiment of the present invention;



FIG. 4 shows a schematic diagram of an interactive gaming system with haptic feedback, comprising a camera as a sensor, according to one embodiment of the present invention; and



FIGS. 5-6 show schematic diagrams of two embodiments of an interactive gaming system with haptic feedback, comprising a two-body sensing system, according to the present invention.





DETAILED DESCRIPTION

Embodiments of this invention provide interactive gaming systems and methods with haptic feedback. It will be appreciated that the term “physical activity” is herein construed broadly to include a sports activity, a music-instrument-playing activity, or any other physical activity that involves operating/maneuvering a manipulandum (or device) and feeling an impact or contact between the manipulandum and an object, as the following embodiments further illustrate. The term “gaming” is herein construed to include simulating such physical activity.


In one embodiment of the invention, an interactive gaming apparatus comprises: a portable manipulandum configured to simulate an element associated with a physical activity; a sensor in communication with the portable manipulandum and configured to detect a movement of the portable manipulandum; an actuator disposed in the portable manipulandum; and a processor in communication with the sensor and the actuator. The processor is configured to receive a sensor signal from the sensor associated with the movement of the portable manipulandum and to output a control signal to the actuator based at least in part on the sensor signal. The control signal is configured to cause the actuator to output a haptic feedback associated with the movement of the portable manipulandum, e.g., simulating an impact between the portable manipulandum and an object.


The portable manipulandum is configured to take on or imitate (i.e., simulate) the appearance and/or form of an element associated with a sports or physical activity, such as a racket for tennis, a bat for baseball, etc. By way of example, FIG. 1 depicts several embodiments of a portable manipulandum, each configured to simulate a sports element, including for example a tennis racket 110, a baseball bat 120, a sword 130, and a pair of boxing gloves 140. The manipulandum is also configured to be portable such that it can be hand-held and operated in a stand-alone or un-tethered fashion. The ensuing description provides more examples of such manipulanda.


In one embodiment, the portable manipulandum may be used, for example, by a player (not explicitly shown) as a tennis racket to practice playing tennis. The sensor can be, for example, a motion sensor such as an acceleration sensor (or accelerometer) or tilt sensor, e.g., disposed in and forming an integrated structure with the portable manipulandum in a manner that allows it to effectively track the motion of the portable manipulandum, such as the position and/or velocity of the tennis racket. The processor may execute an application program, e.g., a virtual tennis application program. In the application program, the player may compete, for example, with a virtual opponent.


In one scenario, the player swings the racket at a virtual tennis ball (e.g., to simulate a serve), resulting in a sensor signal being generated by the sensor and sent to the processor. The processor may use the sensor signal to compute the resulting motion of the virtual tennis ball, as well as a counter move made, for example, by the virtual opponent. The processor can also determine and output a control signal to the actuator. The control signal is configured to cause the actuator to output a haptic feedback that simulates, for example, the impact between the virtual tennis ball and the tennis racket when the player strikes a virtual tennis ball returned by the virtual opponent. The haptic effect can also be correlated with how/where the virtual tennis ball hits the racket, so as to enhance the realism of play. The contest between the player and the virtual opponent may also be displayed on a display device (e.g., a television), and/or broadcast by an audio device (e.g., one or more speakers).
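The sense-compute-actuate loop described above can be sketched in code. This is a hypothetical illustration only; the patent specifies no implementation, and all names, units, and thresholds here (e.g., `handle_swing`, the 0.1 m hit radius, the normalization constant) are assumptions for illustration:

```python
def handle_swing(racket_velocity, ball_position, racket_position, hit_radius=0.1):
    """Detect an impact between the swung racket and the virtual ball and,
    if one occurs, return haptic control parameters scaled to swing speed.

    Positions are 1-D for simplicity; a real system would track 3-D motion.
    """
    if abs(ball_position - racket_position) <= hit_radius:
        # Scale haptic magnitude with swing speed, clamped to the actuator range [0, 1].
        magnitude = min(1.0, abs(racket_velocity) / 10.0)
        return {"effect": "impact_jolt", "magnitude": magnitude, "duration_ms": 50}
    return None  # no contact: no haptic output
```

A faster swing thus yields a stronger jolt, which is one way the haptic effect could be "correlated with how/where the virtual tennis ball hits the racket."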


U.S. Pat. No. 6,366,272 provides some embodiments of systems and methods for providing interactions between simulated objects using force feedback, the entire disclosure of which is incorporated herein by reference.


Systems and methods in accordance with the invention are further described below with reference to FIGS. 2-6. The invention is not limited to the examples given; there are a variety of systems and methods to make and use the invention.


Referring now to the drawings, in which like numerals indicate like elements, FIG. 2 depicts a schematic diagram of an interactive gaming apparatus 200 with haptic feedback, according to one embodiment of the invention. By way of example to illustrate some general principles of the invention, the apparatus 200 comprises a portable manipulandum 210, which is configured to simulate an element associated with a sports or physical activity. For example, in one embodiment, the manipulandum is configured to simulate a tennis racket. In other embodiments, the manipulandum can be configured to simulate various elements for other sports/physical activities, such as a baseball bat, a sword, a boxing glove, a drum stick, etc. See the description below for more examples.


An actuator 220 may be disposed in and form an integrated structure with the portable manipulandum 210. For example, a motor connected to an eccentric mass may be disposed in the handle of a simulated tennis racket to provide haptic effects to the user holding the tennis racket. The embodiment shown in FIG. 2 also comprises a sensor 230 in communication with the portable manipulandum 210. The sensor 230 provides a sensor signal associated with the movement of the portable manipulandum 210, e.g., the position and/or velocity of the portable manipulandum 210. In the embodiment of FIG. 2, the sensor 230 is shown to be disposed in and form an integrated structure with the portable manipulandum 210. For example, in one embodiment, a simulated tennis racket may comprise a motion sensor, such as an accelerometer or tilt sensor, to provide a sensor signal, for example, tracking the movement of the portable manipulandum 210 by the user or otherwise. In other embodiments, the sensor may, as a whole or in part, be physically separate from the portable manipulandum 210, as further described in some of the embodiments below.


In the embodiment shown in FIG. 2, the apparatus also comprises a processor 240 in communication with the sensor 230 and the actuator 220. The processor 240 may communicate with the actuator 220 and the sensor 230 via wired and/or wireless communications means 270, 272. For example, in one embodiment, the processor 240, the sensor 230, and the actuator 220 may comprise Bluetooth or other wireless transceivers. The processor 240 receives sensor signals from the sensor 230 and generates control signals for the actuator 220, which are based, at least in part, on the sensor signals. The processor 240 may also execute an application program, such as a virtual sports activity application program.


In one embodiment, the processor 240 may be physically separated from the portable manipulandum 210, such as illustrated in FIG. 2. For example, the processor 240 may be included in a console or housing (along with associated electronics and other components, for example). In some embodiments, the processor 240 can also be included in a computer (e.g., a desktop or laptop), a television set, or other controllers, configured to execute an application program pertaining to an interactive gaming application, as described above. In other embodiments, the processor 240 may be disposed in and form an integrated structure with the portable manipulandum 210 (e.g., along with the actuator 220 and/or the sensor 230). This configuration may, for example, be implemented in a fishing or shooting game. In one embodiment of such a configuration, for example, the portable manipulandum 210 (e.g., a fishing rod or rifle) may be configured to be self-contained, e.g., comprising all the associated electronics and components necessary to play the simulation, and there need not be an external console or a computer.


The processor 240 may also be in communication with a display device 250 (e.g., a television, a computer monitor, or other suitable display device) via a wired or wireless communications means 280. The processor 240 may additionally be in communication with an audio device 260 (e.g., one or more speakers) via a wired or wireless communications means 282. Although the audio device 260 is shown separate from the display device 250, the audio device 260 and display device 250 may be combined in some embodiments of the invention.


The term “portable” is herein construed broadly to include manipulanda that can be hand-held and operated in a stand-alone or un-tethered fashion. In some embodiments, it may be desirable for the manipulandum to communicate with an external processor via a wired communication means; however, the manipulandum is otherwise stand-alone or un-tethered mechanically. The portable manipulandum is configured to take on or imitate (i.e., simulate) the appearance and/or form of an element associated with a sports or physical activity, as the embodiments of FIG. 1 illustrate. The ensuing description provides more examples of such portable manipulanda.


In the above and following embodiments, the term “disposed in” includes that the actuator 220 (or the sensor 230, or the processor 240) is configured to form an integrated structure with the portable manipulandum 210. For example, the actuator 220 (or the sensor 230, or the processor 240) may be enclosed inside a housing of the manipulandum 210, may be embedded in (or mounted on) a body (or housing) of the manipulandum 210, or may be disposed in the portable manipulandum 210 via other mechanisms.


In the embodiment of FIG. 2, the processor 240 is configured to receive a sensor signal from the sensor 230 associated with the movement of the portable manipulandum 210. For example, the sensor signal may track a swing made by a user or player. The processor 240 is also configured to output a control signal to the actuator 220, which is based at least in part on the sensor signal. The control signal is configured to cause the actuator 220 to output a haptic feedback associated with the movement of the portable manipulandum 210, e.g., simulating an impact between the portable manipulandum 210 and an object (e.g., a tennis racket and a tennis ball).


In one embodiment, the portable manipulandum 210 may be used, for example, by a player (not explicitly shown) as a tennis racket to practice playing tennis. The sensor 230 can be, for example, a motion sensor such as an acceleration sensor (or tilt sensor), e.g., disposed in the portable manipulandum 210 in a manner that allows it to effectively track the motion of the portable manipulandum 210. The sensor 230 may be an accelerometer (piezoelectric, MEMS (Micro-Electro-Mechanical Systems) based, etc.), a gyroscope, a receiver (infrared, radio frequency, etc.), or other sensing means capable of measuring position change, velocity, and/or acceleration of the manipulandum 210. When the player swings the racket 210, the sensor 230 generates a sensor signal indicating the motion of the tennis racket, e.g., its position, velocity, and/or acceleration (e.g., speed, direction, and rate of change in direction and/or speed).
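As a hypothetical sketch of how an accelerometer signal could yield the velocity and position change described above, the samples can be numerically integrated. The function name and the simple Euler scheme are illustrative assumptions; a real system would filter noise and correct for integration drift:

```python
def integrate_motion(accel_samples, dt):
    """Estimate velocity and displacement by Euler-integrating a sequence of
    accelerometer samples taken dt seconds apart.

    Returns (velocity, displacement) at the end of the sample window,
    assuming the manipulandum starts at rest.
    """
    velocity = 0.0
    displacement = 0.0
    for a in accel_samples:
        velocity += a * dt           # v += a * dt
        displacement += velocity * dt  # x += v * dt
    return velocity, displacement
```

For example, ten samples of 1.0 m/s² at dt = 0.1 s accumulate to a velocity of about 1.0 m/s, which the processor could then use as the racket's swing speed.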


The processor 240 may execute an application program, which may be, for example, stored in a memory 242. For example, the processor 240 may execute a virtual tennis application program, in which the player may compete, for example, with a virtual opponent. In one scenario, the player swings the racket at a virtual tennis ball (e.g., to simulate a serve), resulting in a sensor signal being generated by the sensor 230 and sent to the processor 240. The processor 240 may use the sensor signal to compute the resulting motion of the virtual tennis ball, as well as a counter move made, for example, by the virtual opponent. The processor 240 can also determine and output a control signal to the actuator 220. The control signal is configured to cause the actuator 220 to output a haptic feedback that simulates, for example, the impact between the virtual tennis ball and the tennis racket when the player strikes a virtual tennis ball returned by the virtual opponent. For example, the processor 240 may generate a control signal configured to cause the actuator 220 to output a jolt sensation as the tennis racket “impacts” the virtual tennis ball. The haptic feedback can be correlated with how/where the virtual tennis ball hits the racket, for example, so as to enhance the realism of play. Various haptic effects can also be created for other events of interest, and/or to simply enhance the joy of playing. U.S. Pat. No. 6,366,272 discloses some examples of application programs designed and implemented for simulated (or virtual) interactions, the entire disclosure of which is incorporated herein by reference.


The application program may further implement a graphical environment on the display device 250, for example, a virtual playing field (e.g., a tennis court) with the virtual opponent and/or a virtual representation of the player to graphically illustrate the contest between the player and the virtual opponent. In addition, the application program may have, for example, the status of the contest broadcast by the audio device 260, along with other audio cues that mimic the real environment (e.g., applause from a virtual audience, announcer commentary, opponent banter or “heckling,” in-game sounds such as “cracking” of a bat or “clicking” of a fishing reel, etc.).


The embodiment of FIG. 2 may further include a second (or “local”) processor, e.g., configured to work in conjunction with the first (or “host”) processor 240 in controlling the actuator 220, so as to deliver the desired haptic effects. FIG. 3 shows a schematic diagram of an interactive gaming apparatus 300 comprising host and local processors, according to one embodiment of the invention. By way of example, the apparatus 300 of FIG. 3 may be built on the apparatus 200 of FIG. 2, and hence like elements are labeled with like numerals. In the embodiment of FIG. 3, a local processor 345 is disposed in and forms an integrated structure with the portable manipulandum 210. The local processor 345 is configured to be in communication with the host processor 240 via a wired and/or wireless communication means 370. For example, the local processor 345 and the host processor 240 may include wireless communication means, such as Bluetooth transceivers, various types of IEEE 802 transceivers, infrared communication means, or ultrasonic communication means.


In one embodiment, the host processor 240 may be configured to send high-level force commands (e.g., “generating a sweet spot” for a tennis game, or “generating a home run” for a baseball game) to the local processor 345. A high-level force command can be a command that provides a general description of the haptic effect but may not include the details, such as the particular frequency, amplitude, or duration, of the control signal to be generated for the actuator 220. In response, the local processor 345 can provide appropriate control signals to the actuator 220 to render the desired haptic effects. A set of pre-determined “haptic effects” can be stored, for example, in a local memory (not explicitly shown) associated with the local processor 345, such that the corresponding haptic effects can be looked up upon receiving the high-level commands from the host processor 240. For example, the memory may comprise a look-up table that includes two fields. The first field may contain the names of the high-level effects to be implemented. The second field may include a collection of data (e.g., a frequency, amplitude, and duration, etc.) necessary to generate a particular haptic effect. U.S. Pat. Nos. 5,734,373, 5,959,613, 6,028,593, 6,300,937, and 6,411,276 disclose some embodiments of haptic systems employing host and local processors and associated controlling schemes, the disclosures of all of which are incorporated herein by reference.
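The two-field look-up table described above can be sketched as a simple mapping from high-level command names to low-level actuator parameters. The effect names and parameter values below are illustrative assumptions, not values from the patent:

```python
# Hypothetical local-processor table: first field = high-level effect name,
# second field = data needed to generate the effect (frequency, amplitude, duration).
HAPTIC_EFFECTS = {
    "sweet_spot": {"frequency_hz": 60, "amplitude": 0.3, "duration_ms": 40},
    "home_run":   {"frequency_hz": 30, "amplitude": 1.0, "duration_ms": 120},
    "net_clip":   {"frequency_hz": 90, "amplitude": 0.2, "duration_ms": 25},
}

def render_command(command):
    """Resolve a high-level force command from the host processor into the
    control parameters the local processor would drive the actuator with.
    Unknown commands fall back to a mild default effect."""
    default = {"frequency_hz": 50, "amplitude": 0.1, "duration_ms": 30}
    return HAPTIC_EFFECTS.get(command, default)
```

This division of labor keeps host-to-local traffic small (a short command name) while the detailed waveform data stays in the local memory.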


The local processor 345 can also be configured to generate haptic effects in an interactive or dynamic manner. By way of example in the embodiment of FIG. 3, the local processor 345 is shown to be disposed in the portable manipulandum 210 (along with the actuator 220 and the sensor 230, for example), which may be desired in some applications. In other embodiments, the local processor 345 may be physically separated from the portable manipulandum 210 (or stand-alone). By way of example, FIG. 3 also shows that the sensor 230 is in communication with the local processor 345. In other embodiments, the sensor 230 may also or instead be in communication with the host processor 240, as described above with respect to the embodiment of FIG. 2. For example, in one embodiment, the host processor 240 receives sensor signals from the sensor 230 and determines the appropriate high-level command signals to send to the local processor 345. The local processor 345 can generate control signals for use by the actuator 220, based at least in part on the high-level command signals from the host processor 240.



FIG. 4 shows a schematic diagram of an interactive gaming apparatus 400 with haptic feedback, according to one embodiment of the invention. For purposes of illustration and simplicity, the apparatus 400 may make use of some of the elements employed in the apparatus 200 of FIG. 2, and hence like elements are labeled with like numerals. (The apparatus 400 can also be built on the apparatus 300 of FIG. 3 in alternative embodiments.) The apparatus 400 includes a sensor 430, which can be, for example, a video (or infrared) camera, configured to capture the motion of the portable manipulandum 210. In the embodiment shown, no sensor is disposed in the portable manipulandum 210. However, in other embodiments, the portable manipulandum 210 may comprise internal sensors as well. In one embodiment, the portable manipulandum 210 may be colored, or bear some color patterns on its outer surface, so as to enhance the contrast between the manipulandum 210 and the ambient environment and thereby to aid the sensor 430 in accurately capturing the motion of the manipulandum 210.
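One minimal way to exploit the color contrast described above is to locate the colored marker's centroid in each camera frame. This is a hypothetical sketch with illustrative names and a toy pixel representation; a practical system would use a computer-vision library and more robust matching:

```python
def track_centroid(frame, target_color, tolerance=30):
    """Locate the manipulandum in one camera frame by its marker color.

    frame: list of rows, each row a list of (r, g, b) pixel tuples.
    Returns the (row, col) centroid of pixels within `tolerance` of
    target_color on every channel, or None if no pixel matches.
    """
    tr, tg, tb = target_color
    row_sum = col_sum = count = 0
    for r, row in enumerate(frame):
        for c, (pr, pg, pb) in enumerate(row):
            if (abs(pr - tr) <= tolerance and abs(pg - tg) <= tolerance
                    and abs(pb - tb) <= tolerance):
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count
```

Tracking the centroid from frame to frame then yields the position (and, by differencing, the velocity) of the manipulandum 210 without any sensor on board.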


The sensor signal (e.g., image data) from the sensor 430 is communicated to a processor 440. In one embodiment, the sensor 430 and the processor 440 (along with a memory 442 and/or other associated electronics) may constitute an integrated console (or housing) 490, situated at a location that allows the sensor 430 to effectively capture the range of motion of the portable manipulandum 210. For example, in one embodiment, the portable manipulandum 210 may comprise a baseball bat. The console 490 may be constructed to be placed on the floor and resemble a home plate. The console 490 comprises one or more sensors configured to allow the console to capture the motion of the simulated baseball bat across the plate. The processor 440 situated in the console 490 receives the sensor signal, and if the simulated bat strikes a simulated baseball, the processor 440 generates a control signal to the actuator 220. The actuator 220 is in communication with the processor 440 via a wired and/or wireless communication means 470. For example, in the baseball simulation system described above, the actuator 220 may comprise a Bluetooth (or other wireless) receiver. A Bluetooth (or other wireless) transmitter in the console 490 or integrated in the processor 440 may transmit the control signal to the actuator 220. In another embodiment, the processor 440 (along with the memory 442) may be included in a computer (or a television set), with the sensor 430 being, for example, a video camera peripheral to the computer. The use of a camera as a motion sensing means allows a “one-way” communication between the processor 440 and the portable manipulandum 210, which may be desired in some applications. The apparatus 400 can be substantially similar to the embodiment of FIG. 3 in operation and functionality, as described above.
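The hit test in the baseball example ("if the simulated bat strikes a simulated baseball") could be as simple as detecting that the sensed bat position sweeps past the virtual ball between two frames. This sketch is hypothetical; the names and the 1-D simplification are assumptions, and a real game would also check swing timing and height:

```python
def bat_crosses_ball(bat_x_prev, bat_x_curr, ball_x):
    """True when the bat position passes the virtual ball's position between
    two consecutive sensor frames, i.e., the signed offset changes sign
    (or lands exactly on the ball)."""
    return (bat_x_prev - ball_x) * (bat_x_curr - ball_x) <= 0
```

On a positive result, the processor 440 would generate the impact control signal and transmit it to the actuator 220 in the bat.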


In some embodiments, a two-body (or multi-body) sensing system may also be implemented. FIG. 5 depicts a schematic diagram of an interactive gaming apparatus 500 with a two-body sensing system, according to one embodiment of the invention. For purposes of illustration and simplicity, the apparatus 500 may make use of some of the elements employed in the embodiments of FIGS. 2 and 4, and hence like elements are labeled with like numerals. (The apparatus 500 can also be built on the apparatus 300 of FIG. 3 in alternative embodiments.) In the apparatus 500, a sensor (or sensing system) including a first element 532 and a second element 534 can be implemented. In one embodiment, the first element 532 can be an emitter, e.g., configured to emit (or broadcast) an electromagnetic radiation or sound wave 574 in a particular frequency range. The second element 534 can be a detector/receiver, e.g., configured to communicate with and receive the radiation or sound wave originating from the emitter.


In one embodiment, for example, the first element 532 can be attached to (or mounted on) the portable manipulandum 210. The second element 534 can be disposed in the console 490, configured to receive sensor signals emitted (or broadcast) from the first element 532 and communicate the sensor signals to the processor 440. In other embodiments, the second element 534 can also be physically separate from the console 490, e.g., be placed at a location that allows it to best communicate with the first element 532. In any case, the first and second elements 532, 534 are configured to work in conjunction to effectively track the movement of the portable manipulandum 210. The processor 440 in turn generates control signals based on the received sensor signals, and outputs the control signals to the actuator 220 via a wired and/or wireless communication means 570.
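For an ultrasound emitter-receiver pair like the one described above, the range between the two elements can be recovered from propagation delay. This is a hypothetical sketch; it assumes synchronized emitter and receiver clocks, a simplification that real systems address (e.g., by pairing an RF sync pulse with the ultrasound burst):

```python
SPEED_OF_SOUND = 343.0  # m/s, approximate value in room-temperature air

def range_from_time_of_flight(t_emit, t_receive, wave_speed=SPEED_OF_SOUND):
    """Distance between the manipulandum-mounted emitter and the console
    receiver, computed from the one-way propagation delay of the wave."""
    return wave_speed * (t_receive - t_emit)
```

A 10 ms delay, for instance, corresponds to roughly 3.4 m of separation, which is a plausible room-scale playing distance.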


In another embodiment, the first element 532 can be disposed in the console 490, and the second element 534 can be attached to (or mounted on) the portable manipulandum 210, as depicted in FIG. 6. In the embodiment shown in FIG. 6, the second element 534 may additionally be in communication with the processor 440, for example, via a wired and/or wireless communication means 672.


In the embodiments above, the first element 532 can be a radio frequency (RF) emitter, an infrared emitter, an ultrasound emitter, or any other known radiation (or wave) source. The second element 534 can be a detector/receiver devised to work effectively with the first element 532. In addition to a two-body sensor, a sensing system including a plurality of emitters and/or a plurality of receivers can also be implemented in an interactive gaming system of the invention, configured to effectively track the movement of the manipulandum 210.


In one embodiment of the invention, for example, a plurality of receivers and/or emitters can be arranged such that the movement of the manipulandum 210 can be “triangulated” from the plurality of resultant sensor signals. In another embodiment, a plurality of actuators can be, for example, arranged in a particular spatial pattern and actuated selectively in accordance with the gaming events, such as where/how the impact between the portable manipulandum (e.g., a racket) and the simulated object (e.g., a virtual ball) occurs.
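The "triangulated" position mentioned above can be sketched by trilateration: with ranges to three fixed receivers (e.g., from time-of-flight), the 2-D position follows from subtracting the circle equations pairwise to get a linear system. The function and anchor layout are illustrative assumptions, not details from the patent:

```python
def triangulate(anchors, distances):
    """Estimate the 2-D position of the manipulandum from its distances to
    three fixed receivers.

    anchors: three (x, y) receiver positions, not collinear.
    distances: the three measured ranges to those receivers.
    Subtracting the circle equations (x-xi)^2 + (y-yi)^2 = ri^2 pairwise
    yields a 2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1  # zero only if the anchors are collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With noisy real-world ranges, more than three receivers and a least-squares fit would be used instead of this exact solve.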


An interactive gaming system of the invention (e.g., one of the embodiments described above) can also be applied to a situation where a player is playing with an opponent or a computer over a network, such as the Internet. In one embodiment, the processor 240 (or 440) can, for example, be equipped with wired and/or wireless networking capabilities. For example, in one embodiment, the processor 240 (or 440) may comprise a wireless communication means (e.g., a Wi-Fi device based on IEEE 802.11, or other IEEE 802 transceivers), a Bluetooth chip, or other networking means. The processor 240 (or 440) can, for example, send to the network information related to the motion of the player, as well as receive from the network information related to a move made by the opponent, and so on. Based on such information, the processor 240 (or 440) outputs appropriate control signals to the actuator 220 for rendering the desired haptic effects, e.g., in a manner similar to that described above. The processor 240 (or 440) may further instruct the display device 250 to display the playing of the opponent (e.g., making use of the imaging data taken by the camera such as described in the embodiment of FIG. 4). The processor 240 (or 440) may alternatively cause a virtual playing field along with a “virtual representation” of the opponent to be displayed. Audio cues can also be generated, e.g., in a manner similar to that described above.


In the above, the actuator 220 can be an eccentric mass actuator (e.g., a pager motor), a harmonic eccentric mass actuator, an inertial mass harmonic actuator, a voice coil, a moving magnet actuator, a piezoelectric actuator, an electro-active polymer actuator, or any other suitable actuation means known in the art. The actuator 220 can be, for example, disposed in the housing of the portable manipulandum 210, e.g., via a suitable mechanism that is effective in transmitting haptic effects. In some applications such as fishing, a resistive actuator (e.g., an electromagnetic brake) can also be utilized, e.g., coupled to the crank mechanism of the fishing pole. In other embodiments, it might be desired to implement a plurality of actuators in the portable manipulandum, such as described above, so as to output haptic effects that mimic the underlying application. Haptic effects can be kinesthetic, tactile, or other types of feedback forces deemed appropriate. U.S. Pat. Nos. 5,734,373, 6,285,351, and 6,300,936 provide more details on configuring and implementing haptic feedback systems, the disclosures of all of which are incorporated herein by reference.


As described above, the portable manipulandum 210 is configured to simulate an element associated with a sports or physical activity, such as a racket for playing tennis, badminton, racquet ball, squash, ping-pong, and the like, or a bat (or club) for playing baseball, hockey, golf, and the like. The actuator 220 can be configured and controlled accordingly to deliver haptic effects desired for a given application.


Furthermore, the portable manipulandum 210 can be configured to simulate a fishing rod, where the haptic feedback can, for example, be related to the sensation of catching and/or maneuvering a fish. The portable manipulandum 210 can also be configured to simulate a gun or rifle, where the haptic feedback may mimic, for example, the sensation of “recoiling” during shooting. Such can be used, for example, in hunting or shooting practice (e.g., for training military personnel). For example, in one embodiment, a simulated rifle may comprise one or more pager motors (or other eccentric mass actuators), configured such that vibrations (or a jolt) are output during shooting to mimic the recoiling sensation.


In addition, the portable manipulandum 210 can be configured to simulate a sword for fencing, martial arts, or the like, where the haptic feedback can, for example, be correlated with the sword coming into contact with the opponent's sword (or a body part of the opponent). For example, in one embodiment, a simulated sword can comprise one or more motors, configured such that a jolt or impulse-like force is output when the sword hits a virtual one.


The portable manipulandum 210 may alternatively be configured to simulate a glove for boxing or baseball, or a football, where the haptic feedback can, for example, be associated with the sensation of receiving a punch from an opponent, or catching a ball. Moreover, there can be applications where the portable manipulandum 210 is configured to be attachable to a body part, such as a leg of a soccer player, thereby enabling the player to experience the physical sensations associated with kicking a soccer ball, for instance. For example, in one embodiment, a plurality of actuators (e.g., piezoelectric buzzers) may be disposed at different locations in a simulated boxing glove, so as to output haptic effects in relation to where and how virtual punches are received.
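Driving the actuator nearest to where a virtual punch lands, as the glove embodiment suggests, can be sketched as a nearest-neighbor lookup over actuator positions. The actuator names and 2-D coordinates below are purely illustrative assumptions.

```python
# Hypothetical actuator placement in a simulated boxing glove (x, y in
# meters on the glove surface); the processor drives whichever actuator
# lies closest to the virtual punch's impact point.
ACTUATORS = {
    "knuckle": (0.00, 0.10),
    "palm":    (0.00, 0.00),
    "thumb":   (0.05, 0.04),
}

def nearest_actuator(impact_xy):
    """Return the name of the actuator closest to the impact location."""
    def dist2(p):
        return (p[0] - impact_xy[0]) ** 2 + (p[1] - impact_xy[1]) ** 2
    return min(ACTUATORS, key=lambda name: dist2(ACTUATORS[name]))
```

A richer scheme could drive several actuators with amplitudes weighted by distance, to localize the sensation more smoothly.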


The portable manipulandum 210 may also be configured to simulate a drumstick for beating a drum, or another stick-like (or rod-like) element for striking/plucking a musical instrument, where the haptic feedback can, for example, be associated with the stick striking the surface of a virtual drum (or the corresponding part of another musical instrument). For example, in one embodiment, a simulated drumstick can comprise one or more actuators (e.g., piezoelectric or voice-coil buzzers), configured such that a jolt or impulse-like force is output when the drumstick strikes the virtual drum. In another embodiment, a console comprising a ring-like structure may be implemented to simulate a drum, where a plurality of sensors (e.g., infrared, optical, or RF sensors) may be positioned on the perimeter of the ring-like structure. When the simulated drumstick strikes a virtual surface of the simulated drum (e.g., an imaginary surface defined by the ring-like structure), one or more sensor signals are output and sent to a processor (e.g., situated in the console). Based on these and other sensor signals (e.g., the simulated drumstick may also comprise one or more internal sensors), the processor can determine, for example, the location and/or speed at which the simulated drumstick strikes the virtual drum surface, and accordingly output one or more control signals to one or more actuators disposed in the simulated drumstick. A haptic feedback is then output, which may, for example, be correlated with the manner in which the drumstick strikes the simulated drum. Alternative embodiments may, for example, comprise other sensing means, such as those described above with respect to the embodiments of FIGS. 2-6.
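The drum embodiment above has the processor derive a strike speed from sensor samples and scale the jolt accordingly. A minimal sketch of that computation follows; the units (meters, seconds), the two-sample speed estimate, and the `max_speed` normalization are assumptions for illustration.

```python
import math

def strike_feedback(positions, timestamps, max_speed=5.0):
    """Estimate the drumstick-tip speed from the last two sampled positions
    (e.g., from the ring of perimeter sensors) and scale the jolt amplitude
    into the range 0..1. Units and max_speed are illustrative assumptions."""
    p0, p1 = positions[-2], positions[-1]
    dt = timestamps[-1] - timestamps[-2]
    speed = math.dist(p0, p1) / dt
    return min(speed / max_speed, 1.0)
```

The resulting amplitude would parameterize the control signal sent to the actuators in the drumstick, so that harder strikes yield stronger jolts.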


In other embodiments, the invention can be used to simulate other sports/physical activities contemplated by those skilled in the art. In all such applications, haptic feedback can be used to enhance the realism of the game, to complement various events of interest, and/or to make the experience more enjoyable for the player.


In the above, processors (e.g., the processors 240, 345, 440) can generally include, for example, digital logic processors/controllers capable of processing input, executing algorithms, and generating output, as necessary to cause the desired haptic effects to be output to the portable manipulandum 210. Suitable processors include, for example, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a state machine, an analog or digital circuit, a combination of multiple circuits, and the like. Such processors can include a variety of other components, such as, for example, co-processors, graphics processors, etc. Such processors may also include, or may be in communication with, media, such as computer- (or processor-) readable media (e.g., the memories 242, 442 above), which store instructions that, when executed by the processor, cause the processor to perform certain functions/steps (e.g., those described above with respect to the application program).


One embodiment of a computer- (or processor-) readable medium includes an electronic, optical, magnetic, or other storage or transmission device capable of providing a processor with computer-readable instructions. Other examples of computer-readable media include, but are not limited to, random access memory (RAM), read-only memory (ROM), electronically programmable read-only memory (EPROM), erasable electronically programmable read-only memory (EEPROM), flash memory, hard drives, DVD drives, CD-R/RW drives, floppy diskettes, all optical media, magneto-optical disks, magnetic tapes or other magnetic media, or any other medium from which a processor can read.


The display device 250 in the above can include (but is not limited to) a computer or television monitor (e.g., CRT, LCD, flat panel, etc.), 3-D goggles, or any other display means known in the art. The audio device 260 can include, for example, one or more speakers, or any other known audio/sound-generating means.


The foregoing description of the preferred embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the present invention.

Claims
  • 1. An apparatus, comprising: a portable manipulandum configured to simulate an element associated with a physical activity; an actuator disposed in the portable manipulandum, wherein the actuator comprises one of a pager motor, an eccentric mass actuator, a harmonic eccentric mass actuator, an inertial mass harmonic actuator, a voice coil, a moving magnet actuator, a piezoelectric actuator, or an electro-active polymer actuator; a sensor in communication with the portable manipulandum and configured to detect a movement of the portable manipulandum; and a processor in communication with the sensor and the actuator, the processor configured to receive a sensor signal from the sensor associated with the movement of the portable manipulandum and to output a control signal to the actuator based at least in part on the sensor signal and haptic data in a lookup table, wherein the lookup table comprises a first data field and a second data field, wherein the first data field comprises a name of a haptic effect and the second data field comprises a portion of the haptic data, and wherein the control signal is configured to cause the actuator to output a haptic feedback associated with the movement of the portable manipulandum, the haptic feedback comprising a vibrotactile effect.
  • 2. The apparatus of claim 1 wherein the haptic feedback is operable to simulate an impact between the portable manipulandum and an object.
  • 3. The apparatus of claim 1 wherein the sensor is disposed in the manipulandum.
  • 4. The apparatus of claim 3 wherein the sensor comprises one of an acceleration sensor or a tilt sensor.
  • 5. The apparatus of claim 1 wherein the processor is disposed in the portable manipulandum.
  • 6. The apparatus of claim 1 wherein the processor is disposed in a housing separate from the portable manipulandum.
  • 7. The apparatus of claim 6 wherein the processor comprises a wireless communication means for communicating with at least one of the sensor or the actuator.
  • 8. The apparatus of claim 6 wherein the processor is a first processor, the apparatus further comprising a second processor in communication with the first processor, the second processor being disposed in the manipulandum.
  • 9. The apparatus of claim 1 wherein the sensor comprises a first element and a second element in communication with one another.
  • 10. The apparatus of claim 9 wherein the first element is disposed in the portable manipulandum, the second element and the processor being included in a housing separate from the portable manipulandum.
  • 11. The apparatus of claim 9 wherein the second element is disposed in the portable manipulandum, the first element and the processor being included in a housing separate from the portable manipulandum.
  • 12. The apparatus of claim 9 wherein the first element includes one of a radio frequency emitter, an infrared emitter, or an ultrasound emitter.
  • 13. The apparatus of claim 12 wherein the second element comprises one of a radio frequency receiver, an infrared receiver, or an ultrasound receiver.
  • 14. The apparatus of claim 1 wherein the sensor comprises one of a video camera or an infrared camera.
  • 15. The apparatus of claim 14 wherein the sensor and the processor are disposed in a housing separate from the portable manipulandum.
  • 16. The apparatus of claim 1 wherein the processor is further configured to execute an application program associated with the physical activity.
  • 17. The apparatus of claim 16 further comprising a memory in communication with the processor, the memory storing the application program.
  • 18. The apparatus of claim 16 wherein the application program causes a graphical environment associated with the physical activity to be output.
  • 19. The apparatus of claim 18 wherein the output comprises a visual display on a display device.
  • 20. The apparatus of claim 18 wherein the output comprises an audio output from an audio device, the audio output indicating a status associated with the physical activity.
  • 21. The apparatus of claim 1 wherein the element associated with the physical activity comprises one of a tennis racket, a badminton racket, a racquetball racket, a squash racket, a ping-pong racket, a baseball bat, a hockey stick, a golf club, a gun, a rifle, a sword, a boxing glove, a baseball glove, a football, or a drumstick.
  • 22. An apparatus, comprising: a portable manipulandum configured to simulate an element associated with a physical activity; a sensor in communication with the portable manipulandum and configured to output a sensor signal associated with a movement of the portable manipulandum; and an actuator disposed in the portable manipulandum, the actuator configured to receive a control signal based at least in part on the sensor signal and haptic data in a lookup table, wherein the lookup table comprises a first data field and a second data field, wherein the first data field comprises a name of a haptic effect and the second data field comprises a portion of the haptic data, and to output a haptic feedback that simulates an impact between the portable manipulandum and an object, the haptic feedback comprising a vibrotactile effect, and wherein the actuator comprises one of a pager motor, an eccentric mass actuator, a harmonic eccentric mass actuator, an inertial mass harmonic actuator, a voice coil, a moving magnet actuator, a piezoelectric actuator, or an electro-active polymer actuator.
  • 23. The apparatus of claim 22 wherein the sensor is disposed in the portable manipulandum.
  • 24. The apparatus of claim 22 wherein the sensor includes one of a video camera or an infrared camera, separate from the manipulandum.
  • 25. The apparatus of claim 22 further comprising a processor in communication with the sensor and the actuator, the processor configured to receive the sensor signal from the sensor and to output the control signal to the actuator.
  • 26. The apparatus of claim 25 wherein the processor is disposed in the portable manipulandum.
  • 27. The apparatus of claim 25 wherein the processor is separate from the portable manipulandum.
  • 28. The apparatus of claim 27 wherein the processor is included in one of a computer or a television set.
  • 29. The apparatus of claim 25 further comprising a memory in communication with the processor, the memory storing an application program associated with the physical activity.
  • 30. The apparatus of claim 29 wherein the application program is configured to cause a graphical environment associated with the physical activity to be output.
  • 31. The apparatus of claim 30 wherein the output comprises a visual display on a display device.
  • 32. The apparatus of claim 30 wherein the output comprises an audio output from an audio device, the audio output indicating a status associated with the physical activity.
  • 33. The apparatus of claim 22 wherein the element associated with the physical activity includes one of a tennis racket, a badminton racket, a racquetball racket, a squash racket, a ping-pong racket, a baseball bat, a hockey stick, a golf club, a gun, a rifle, a sword, a boxing glove, a baseball glove, a football, or a drumstick.
  • 34. A method, comprising: providing a portable manipulandum, the portable manipulandum configured to simulate an element associated with a physical activity; providing a sensor in communication with the portable manipulandum, the sensor configured to output a sensor signal associated with a movement of the portable manipulandum; and providing an actuator disposed in the portable manipulandum, the actuator configured to receive a control signal based at least in part on the sensor signal and haptic data in a lookup table, wherein the lookup table comprises a first data field and a second data field, wherein the first data field comprises a name of a haptic effect and the second data field comprises a portion of the haptic data, and to output a haptic feedback associated with the movement of the portable manipulandum, the haptic feedback comprising a vibrotactile effect, and wherein the actuator comprises one of a pager motor, an eccentric mass actuator, a harmonic eccentric mass actuator, an inertial mass harmonic actuator, a voice coil, a moving magnet actuator, a piezoelectric actuator, or an electro-active polymer actuator.
  • 35. The method of claim 34 wherein the haptic feedback is operable to simulate an impact between the portable manipulandum and an object.
  • 36. The method of claim 34 further comprising providing a processor in communication with the sensor and the actuator, the processor configured to receive the sensor signal from the sensor and to output the control signal to the actuator.
  • 37. The method of claim 36 wherein the processor is disposed in the portable manipulandum.
  • 38. The method of claim 36 wherein the processor is separate from the portable manipulandum.
  • 39. The method of claim 38 wherein the sensor includes one of a video camera or an infrared camera, the sensor and the processor being disposed in a housing separate from the portable manipulandum.
  • 40. The method of claim 36 wherein the processor is a first processor, the method further comprising providing a second processor in communication with the first processor, the second processor being disposed in the portable manipulandum.
  • 41. The method of claim 34 wherein the sensor includes a first element and a second element in communication with one another.
  • 42. The method of claim 41 wherein the first element is disposed in the portable manipulandum, the second element and the processor being included in a housing separate from the portable manipulandum.
  • 43. The method of claim 41 wherein the second element is disposed in the portable manipulandum, the first element and the processor being included in a housing separate from the portable manipulandum.
  • 44. The method of claim 41 wherein the first element includes one of a radio frequency emitter, an infrared emitter, or an ultrasound emitter.
  • 45. The method of claim 44 wherein the second element includes one of a radio frequency receiver, an infrared receiver, or an ultrasound receiver.
  • 46. The method of claim 34 wherein the element associated with the physical activity includes one of a tennis racket, a badminton racket, a racquetball racket, a squash racket, a ping-pong racket, a baseball bat, a hockey stick, a golf club, a gun, a rifle, a sword, a boxing glove, a baseball glove, a football, or a drumstick.
  • 47. The apparatus of claim 1, wherein the portion of the haptic data is predetermined.
  • 48. The apparatus of claim 22, wherein the portion of the haptic data is predetermined.
  • 49. The method of claim 34, wherein the portion of the haptic data is predetermined.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 60/477,214, filed on Jun. 9, 2003, the entire disclosure of which is incorporated herein by reference.

US Referenced Citations (294)
Number Name Date Kind
2524782 Ferrar et al. Oct 1950 A
3157853 Hirsch Nov 1964 A
3220121 Cutler Nov 1965 A
3490059 Paulsen et al. Jan 1970 A
3497668 Hirsch Feb 1970 A
3517446 Corlyon et al. Jun 1970 A
3623046 Scourtes Nov 1971 A
3875488 Crocker et al. Apr 1975 A
3902687 Hightower Sep 1975 A
3903614 Diamond et al. Sep 1975 A
4050265 Drennen et al. Sep 1977 A
4103155 Clark Jul 1978 A
4125800 Jones Nov 1978 A
4148014 Burson Apr 1979 A
4160508 Salsbury Jul 1979 A
4236325 Hall et al. Dec 1980 A
4311980 Prusenziati Jan 1982 A
4385836 Schmitt May 1983 A
4391282 Ando et al. Jul 1983 A
4400790 Chambers et al. Aug 1983 A
4443952 Schulien et al. Apr 1984 A
4513235 Acklam et al. Apr 1985 A
4546347 Kirsch Oct 1985 A
4581491 Boothroyd Apr 1986 A
4599070 Hladky et al. Jul 1986 A
4637264 Takahashi et al. Jan 1987 A
4639884 Sagues Jan 1987 A
4678908 LaPlante Jul 1987 A
4680466 Kuwahara et al. Jul 1987 A
4692726 Green et al. Sep 1987 A
4695266 Hui Sep 1987 A
4699043 Violante De Dionigi Oct 1987 A
4708656 De Vries et al. Nov 1987 A
4712101 Culver Dec 1987 A
4713007 Alban Dec 1987 A
4724715 Culver Feb 1988 A
4728954 Phelan et al. Mar 1988 A
4734685 Watanabe Mar 1988 A
4776701 Pettigrew Oct 1988 A
4794384 Jackson Dec 1988 A
4795901 Kitazawa Jan 1989 A
4799055 Nestler et al. Jan 1989 A
4803413 Kendig et al. Feb 1989 A
4811608 Hilton Mar 1989 A
4815006 Andersson et al. Mar 1989 A
4819195 Bell et al. Apr 1989 A
4823106 Lovell Apr 1989 A
4825157 Mikan Apr 1989 A
4840634 Muller et al. Jun 1989 A
4851771 Ikeda et al. Jul 1989 A
4860051 Taniguchi et al. Aug 1989 A
4891764 McIntosh Jan 1990 A
4891889 Tomelleri Jan 1990 A
4906843 Jones et al. Mar 1990 A
4914976 Wyllie Apr 1990 A
4930770 Baker Jun 1990 A
4934694 McIntosh Jun 1990 A
4935725 Turnau Jun 1990 A
4935728 Kley Jun 1990 A
4937685 Barker et al. Jun 1990 A
4940234 Ishida et al. Jul 1990 A
4962448 DeMaio et al. Oct 1990 A
4964837 Collier Oct 1990 A
4965446 Vyse Oct 1990 A
4982504 Soderberg et al. Jan 1991 A
4988981 Zimmerman et al. Jan 1991 A
5006703 Shikunami et al. Apr 1991 A
5019761 Kraft May 1991 A
5022407 Horch et al. Jun 1991 A
5024626 Robbins et al. Jun 1991 A
5035242 Franklin Jul 1991 A
5038089 Szakaly Aug 1991 A
5053975 Tsuchihashi et al. Oct 1991 A
5062830 Dunlap Nov 1991 A
5065145 Purcell Nov 1991 A
5068529 Ohno et al. Nov 1991 A
5078152 Bond Jan 1992 A
5079845 Childers Jan 1992 A
5086197 Liou Feb 1992 A
5095303 Clark et al. Mar 1992 A
5107080 Rosen Apr 1992 A
5113179 Scott-Jackson et al. May 1992 A
5116051 Moncrief et al. May 1992 A
5125261 Powley Jun 1992 A
5132927 Lenoski et al. Jul 1992 A
5138154 Hotelling Aug 1992 A
5139261 Openiano Aug 1992 A
5148377 McDonald Sep 1992 A
5155423 Karlen et al. Oct 1992 A
5168268 Levy Dec 1992 A
5182557 Lang Jan 1993 A
5186629 Rohen Feb 1993 A
5186695 Mangseth et al. Feb 1993 A
5195179 Tokunaga Mar 1993 A
5195920 Collier Mar 1993 A
5202961 Mills et al. Apr 1993 A
5204600 Kahkoska Apr 1993 A
5209131 Baxter May 1993 A
5212473 Louis May 1993 A
5216337 Orton et al. Jun 1993 A
5223658 Suzuki Jun 1993 A
5229836 Nagano Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5235868 Culver Aug 1993 A
5239249 Ono Aug 1993 A
5240417 Smithson et al. Aug 1993 A
5246316 Smith Sep 1993 A
5247648 Watkins et al. Sep 1993 A
5254919 Bridges et al. Oct 1993 A
5271290 Fischer Dec 1993 A
5275174 Cook Jan 1994 A
5275565 Moncrief Jan 1994 A
5280276 Kwok Jan 1994 A
5284330 Carlson et al. Feb 1994 A
5289273 Lang Feb 1994 A
5296846 Ledley Mar 1994 A
5299810 Pierce et al. Apr 1994 A
5309140 Everett May 1994 A
5313229 Gilligan et al. May 1994 A
5313230 Venolia et al. May 1994 A
5317336 Hall May 1994 A
5329289 Sakamoto et al. Jul 1994 A
5332225 Ura Jul 1994 A
5334027 Wherlock Aug 1994 A
5341459 Backes Aug 1994 A
5347306 Nitta Sep 1994 A
5351692 Dow et al. Oct 1994 A
5359193 Nyui et al. Oct 1994 A
5368484 Copperman et al. Nov 1994 A
5374942 Gilligan et al. Dec 1994 A
5379663 Hara Jan 1995 A
5384460 Tseng Jan 1995 A
5388059 DeMenthon Feb 1995 A
5389865 Jacobus et al. Feb 1995 A
5390128 Ryan et al. Feb 1995 A
5390296 Crandall et al. Feb 1995 A
5396266 Brimhall Mar 1995 A
5396267 Bouton Mar 1995 A
5397323 Taylor et al. Mar 1995 A
5398044 Hill Mar 1995 A
5402499 Robison et al. Mar 1995 A
5402582 Raab Apr 1995 A
5402680 Korenaga Apr 1995 A
5405152 Katanics et al. Apr 1995 A
5417696 Kashuba et al. May 1995 A
5428748 Davidson et al. Jun 1995 A
5436542 Petelin et al. Jul 1995 A
5436640 Reeves Jul 1995 A
5452615 Hilton Sep 1995 A
5453758 Sato Sep 1995 A
5457479 Cheng Oct 1995 A
5457793 Elko et al. Oct 1995 A
5459382 Jacobus et al. Oct 1995 A
5466213 Hogan Nov 1995 A
5467763 McMahon et al. Nov 1995 A
5473344 Bacon et al. Dec 1995 A
5474082 Junker Dec 1995 A
5481914 Ward Jan 1996 A
5491477 Clark et al. Feb 1996 A
5512919 Araki Apr 1996 A
5514150 Rostoker May 1996 A
5519618 Kastner et al. May 1996 A
5524195 Clanton, III et al. Jun 1996 A
5526022 Donahue et al. Jun 1996 A
5530455 Gillick et al. Jun 1996 A
5542672 Meredith Aug 1996 A
5543821 Marchis et al. Aug 1996 A
5547382 Yamasaki Aug 1996 A
5547383 Yamaguchi Aug 1996 A
5550562 Aoki et al. Aug 1996 A
5550563 Matheny et al. Aug 1996 A
5565840 Thorner et al. Oct 1996 A
5570111 Barrett et al. Oct 1996 A
5576727 Rosenberg et al. Nov 1996 A
5583407 Yamaguchi Dec 1996 A
5589828 Armstrong Dec 1996 A
5591924 Hilton Jan 1997 A
5592401 Kramer Jan 1997 A
5604345 Matsuura Feb 1997 A
5611731 Bouton et al. Mar 1997 A
5616078 Oh Apr 1997 A
5623582 Rosenberg Apr 1997 A
5623642 Katz et al. Apr 1997 A
5627531 Posso et al. May 1997 A
5628686 Svancarek et al. May 1997 A
5635897 Kuo Jun 1997 A
5638421 Serrano et al. Jun 1997 A
5652603 Abrams Jul 1997 A
5666138 Culver Sep 1997 A
5680141 Didomenico et al. Oct 1997 A
5691747 Amano Nov 1997 A
5691898 Rosenberg et al. Nov 1997 A
5692117 Berend et al. Nov 1997 A
5694153 Aoyagi et al. Dec 1997 A
5701140 Rosenberg et al. Dec 1997 A
5704837 Iwasaki et al. Jan 1998 A
5713792 Ohzono Feb 1998 A
5722071 Berg et al. Feb 1998 A
5724106 Autry et al. Mar 1998 A
5724264 Rosenberg et al. Mar 1998 A
5734108 Walker et al. Mar 1998 A
5734373 Rosenberg et al. Mar 1998 A
5739811 Rosenberg et al. Apr 1998 A
5740083 Anderson et al. Apr 1998 A
5741182 Lipps et al. Apr 1998 A
5742278 Chen et al. Apr 1998 A
5745057 Sasaki et al. Apr 1998 A
5749577 Couch et al. May 1998 A
5751273 Cohen May 1998 A
5755620 Yamamoto et al. May 1998 A
5763874 Luciano et al. Jun 1998 A
5766016 Sinclair Jun 1998 A
5767836 Scheffer et al. Jun 1998 A
5771037 Jackson Jun 1998 A
5785630 Bobick et al. Jul 1998 A
5795228 Trumbull et al. Aug 1998 A
5805140 Roseberg et al. Sep 1998 A
5808568 Wu Sep 1998 A
5808603 Chen Sep 1998 A
5818426 Tierney et al. Oct 1998 A
5825305 Biferno Oct 1998 A
5828295 Mittel et al. Oct 1998 A
5831593 Rutledge Nov 1998 A
5841133 Omi Nov 1998 A
5841423 Carroll, Jr. et al. Nov 1998 A
5841428 Jaeger et al. Nov 1998 A
5844673 Ivers Dec 1998 A
5877748 Redlich Mar 1999 A
5879327 Moreau DeFarges et al. Mar 1999 A
5889506 Lopresti et al. Mar 1999 A
5889670 Schuler et al. Mar 1999 A
5912661 Siddiqui Jun 1999 A
5913727 Ahdoot Jun 1999 A
5917486 Rylander Jun 1999 A
5919159 Lilley et al. Jul 1999 A
5936613 Jaeger et al. Aug 1999 A
5954689 Poulsen Sep 1999 A
5959613 Rosenberg et al. Sep 1999 A
5963196 Nishiumi et al. Oct 1999 A
5986638 Cheng Nov 1999 A
5999185 Kato et al. Dec 1999 A
6001014 Ogata et al. Dec 1999 A
6004134 Marcus et al. Dec 1999 A
6017273 Pelkey Jan 2000 A
6028593 Rosenberg et al. Feb 2000 A
6031222 Carapelli Feb 2000 A
6078311 Pelkey Jun 2000 A
6078876 Rosenberg et al. Jun 2000 A
6097499 Casey et al. Aug 2000 A
6097964 Nuovo et al. Aug 2000 A
6100874 Schena et al. Aug 2000 A
6104379 Petrich et al. Aug 2000 A
6111577 Zilles et al. Aug 2000 A
6162123 Woolston Dec 2000 A
6183364 Trovato Feb 2001 B1
6192432 Slivka et al. Feb 2001 B1
6219034 Elbing et al. Apr 2001 B1
6241574 Helbing Jun 2001 B1
6259433 Meyers Jul 2001 B1
6280327 Leifer et al. Aug 2001 B1
6285351 Chang et al. Sep 2001 B1
6293798 Boyle et al. Sep 2001 B1
6295608 Parkes et al. Sep 2001 B1
6300038 Shimazu et al. Oct 2001 B1
6300937 Rosenberg Oct 2001 B1
6342880 Rosenberg et al. Jan 2002 B2
6349301 Mitchell et al. Feb 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6411276 Braun et al. Jun 2002 B1
6416327 Wittenbecher Jul 2002 B1
6418329 Furuya Jul 2002 B1
6422941 Thorner et al. Jul 2002 B1
6546390 Pollack et al. Apr 2003 B1
6589117 Moritome et al. Jul 2003 B1
6633224 Hishida et al. Oct 2003 B1
6693622 Shahoian et al. Feb 2004 B1
6722888 Macri et al. Apr 2004 B1
6760751 Hachiya et al. Jul 2004 B1
6767282 Matsuyama et al. Jul 2004 B2
6902482 Woolston Jun 2005 B1
6967566 Weston et al. Nov 2005 B2
7001272 Yamashita et al. Feb 2006 B2
7247097 Woolston Jul 2007 B2
7445550 Barney et al. Nov 2008 B2
20010018354 Pigni Aug 2001 A1
20010026266 Schena et al. Oct 2001 A1
20010045978 McConnell et al. Nov 2001 A1
20020072674 Criton et al. Jun 2002 A1
20030043206 Duarte Mar 2003 A1
20030112269 Lentz et al. Jun 2003 A1
20040076444 Badovinac et al. Apr 2004 A1
20040193393 Keane Sep 2004 A1
20050187747 Paxson et al. Aug 2005 A1
20060030383 Rosenberg et al. Feb 2006 A1
Foreign Referenced Citations (19)
Number Date Country
196 36 831 Feb 1998 DE
0 085 518 Aug 1989 EP
0349086 Jan 1990 EP
0 470 257 Feb 1992 EP
0 358 989 Jul 1994 EP
0 875 819 Oct 2002 EP
2 237 160 Apr 1991 GB
2 347 199 Aug 2000 GB
01-003664 Jul 1990 JP
02-109714 Jan 1992 JP
04-007371 Aug 1993 JP
05-193862 Jan 1995 JP
WO 9616397 May 1996 WO
WO 9624398 Aug 1996 WO
WO 9632679 Oct 1996 WO
WO 0077689 Dec 2000 WO
WO 0100630 Jan 2001 WO
WO 0167297 Sep 2001 WO
WO 03000319 Jan 2003 WO
Non-Patent Literature Citations (90)
Entry
Adelstein, “A Virtual Environment System for the Study of Human Arm Tremor,” Ph.D. Dissertation, Dept. of Mechanical Engineering, MIT, Jun. 1989.
Adelstein, “Design and Implementation of a Force Reflecting Manipulandum for Manual Control research,” DSC-vol. 42, Advances in Robotics, Edited by H. Kazerooni, pp. 1-12, 1992.
Aukstakalnis et al., “Silicon Mirage: The Art and Science of Virtual Reality,” ISBN 0-938151-82-7, pp. 129-180, 1992.
Baigrie, “Electric Control Loading—A Low Cost, High Performance Alternative,” Proceedings, pp. 247-254, Nov. 6-8, 1990.
Bejczy et al., “Kinesthetic Coupling Between Operator and Remote Manipulator,” International Computer Technology Conference, The American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980.
Bejczy, “Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation,” Science, vol. 208, No. 4450, pp. 1327-1335, 1980.
Bejczy, “Generalization of Bilateral Force-Reflecting Control of Manipulators,” Proceedings of Fourth CISM-IFToMM, Sep. 8-12, 1981.
Bejczy, et al., “Universal Computer Control System (UCCS) for Space Telerobots,” CH2413-3/87/0000/0318501.00 1987 IEEE, 1987.
Bejczy et al., “A Laboratory Breadboard System for Dual-Arm Teleoperation,” SOAR '89 Workshop, JSC, Houston, TX, Jul. 25-27, 1989.
Brooks et al., “Hand Controllers for Teleoperation—A State-of-the-Art Technology Survey and Evaluation,” JPL Publication 85-11; NASA-CR-175890; N85-28559, pp. 1-84, Mar. 1, 1985.
Burdea et al., “Distributed Virtual Force Feedback, Lecture Notes for Workshop on Force Display in Virtual Environments and its Application to Robotic Teleoperation,” 1993 IEEE International Conference on Robotics and Automation, pp. 25-44, May 2, 1993.
Caldwell et al., “Enhanced Tactile Feedback (Tele-Taction) Using a Multi-Functional Sensory System,” 1050-4729/93, pp. 955-960, 1993.
“Cyberman Technical Specification,” Logitech Cyberman Swift Supplement, Apr. 5, 1994.
Eberhardt et al., “OMAR—A Haptic display for speech perception by deaf and deaf-blind individuals,” IEEE Virtual Reality Annual International Symposium, Seattle, WA, Sep. 18-22, 1993.
Eberhardt et al., “Including Dynamic Haptic Perception by The Hand: System Description and Some Results,” DSC-vol. 55-1, Dynamic Systems and Control: vol. 1, ASME 1994.
Gobel et al., “Tactile Feedback Applied to Computer Mice,” International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995.
Gotow et al., “Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback,” WA11-11:00, pp. 332-337.
Howe, “A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
IBM Technical Disclosure Bulletin, “Mouse Ball-Actuating Device With Force and Tactile Feedback,” vol. 32, No. 9B, Feb. 1990.
Iwata, “Pen-based Haptic Virtual Environment,” 0-7803-1363-1/93 IEEE, pp. 287-292, 1993.
Jacobsen et al., “High Performance, Dextrous Telerobotic Manipulator With Force Reflection,” Intervention/ROV '91 Conference & Exposition, Hollywood, Florida, May 21-23, 1991.
Jones et al., “A perceptual analysis of stiffness,” ISSN 0014-4819 Springer International (Springer-Verlag); Experimental Brain Research, vol. 79, No. 1, pp. 150-156, 1990.
Kaczmarek et al., “Tactile Displays,” Virtual Environment Technologies.
Kontarinis et al., “Display of High-Frequency Tactile Information to Teleoperators,” Telemanipulator Technology and Space Telerobotics, Won S. Kim, Editor, Proc. SPIE vol. 2057, pp. 40-50, Sep. 7-9, 1993.
Marcus, “Touch Feedback in Surgery,” Proceedings of Virtual Reality and Medicine the Cutting Edge, Sep. 8-11, 1994.
McAffee, “Teleoperator Subsystem/Telerobot Demonstrator: Force Reflecting Hand Controller Equipment Manual,” JPL D-5172, pp. 1.50, A1-A36, B1-B5, C1-C36, Jan. 1988.
Minsky, “Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Ph.D. Dissertation, MIT, Jun. 1995.
Ouh-Young, “Force Display in Molecular Docking,” Order No. 9034744, p. 1-369, 1990.
Ouh-Young, “A Low-Cost Force Feedback Joystick and Its Use in PC Video Games,” IEEE Transactions on Consumer Electronics, vol. 41, No. 3, Aug. 1995.
Ouhyoung et al., “The Development of a Low-Cost Force Feedback Joystick and Its Use in the Virtual Reality Environment,” Proceedings of the Third Pacific Conference on Computer Graphics and Applications, Pacific Graphics '95, Seoul, Korea, Aug. 21-24, 1995.
Patrick et al., “Design and Testing of a Non-reactive, Fingertip, Tactile Display for Interaction with Remote Environments,” Cooperative Intelligent Robotics in Space, Rui J. deFigueiredo et al., Editor, Proc. SPIE vol. 1387, pp. 215-222, 1990.
Pimentel et al., “Virtual Reality: through the new looking glass,” 2nd Edition; McGraw-Hill, ISBN 0-07-050167-X, pp. 41-202, 1994.
Rabinowitz et al., “Multidimensional tactile displays: Identification of vibratory intensity, frequency, and contactor area,” Journal of the Acoustical Society of America, vol. 82, No. 4, Oct. 1987.
Russo, “The Design and Implementation of a Three Degree of Freedom Force Output Joystick,” MIT Libraries Archives Aug. 14, 1990, pp. 1-131, May 1990.
Russo, “Controlling Dissipative Magnetic Particle Brakes in Force Reflective Devices,” DSC-vol. 42, Advances in Robotics, pp. 63-70, ASME 1992.
Scannell, “Taking a Joystick Ride,” Computer Currents, Boston Edition, vol. 9, No. 11, Nov. 1994.
Shimoga, “Finger Force and Touch Feedback Issues in Dexterous Telemanipulation,” Proceedings of Fourth Annual Conference on Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Sep. 30-Oct. 1, 1992.
Snow et al., “Model-X Force-Reflecting-Hand-Controller,” NT Control No. MPO-17851; JPL Case No. 5348, pp. 1-4, Jun. 15, 1989.
Stanley et al., “Computer Simulation of Interacting Dynamic Mechanical Systems Using Distributed Memory Parallel Processors,” DSC-vol. 42, Advances in Robotics, pp. 55-61, ASME 1992.
Tadros, “Control System Design for a Three Degree of Freedom Virtual Environment Simulator Using Motor/Brake Pair Actuators”, MIT Archive @ Massachusetts Institute of Technology, pp. 1-88, Feb. 1990.
Terry et al., “Tactile Feedback in a Computer Mouse,” Proceedings of Fourteenth Annual Northeast Bioengineering Conference, University of New Hampshire, Mar. 10-11, 1988.
Adelstein, B., A Virtual Environment System for the Study of Human Arm Tremor, Submitted to the Dept. of Mechanical Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy at the Massachusetts Institute of Technology, Jun. 1989, pp. 1-253.
Adelstein, B. et al., Design and Implementation of a Force Reflecting Manipulandum for Manual Control Research, DSC-vol. 42, Advances in Robotics, ASME 1992, pp. 1-12.
Akamatsu et al., Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display, Presence, vol. 3, No. 1 pp. 73-80, 1994.
ATIP98.059: Virtual Reality (VR) Development at SERI (Korea), Asian Technology Information Program (ATIP) Jul. 20, 1998, pp. 1-9.
Aukstakalnis, S. et al., The Art and Science of Virtual Reality Silicon Mirage, 1992, Peachpit Press, Inc., Berkeley, CA, pp. 129-180.
Baigrie, S. et al., Electric Control Loading-A Low Cost, High Performance Alternative, Proceedings, Nov. 6-8, 1990, pp. 247-254.
Bejczy, A., Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation, Science, vol. 208, No. 4450, 1980, pp. 1327-1335.
Bejczy, A. et al., Kinesthetic Coupling Between Operator and Remote Manipulator, International Computer Technology Conference, The American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980, pp. 1-9.
Bejczy, A. et al., A Laboratory Breadboard System for Dual-Arm Teleoperation, SOAR '89 Workshop, JSC, Houston, Jul. 25-27, 1989.
Bejczy, A. et al., Universal Computer Control System (UCCS) for Space Telerobots, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA, pp. 317-324.
Bjork, S. et al., An Alternative to Scroll Bars on Small Screens, Play: Applied Research on Art and Technology, Viktoria Institute, Box 620, SE-405 30 Gothenburg, Sweden, pp. 1-2.
Bouguila, L. et al., Effect of Coupling Haptics and Stereopsis on Depth Perception in Virtual Environment, Precision and Intelligence Laboratory, Tokyo Institute of Technology, 4259 Nagatsuta cho Midori ku Yokohama shi 226-8503-Japan.
Brooks, T. et al., Hand Controllers for Teleoperation: A State-of-the-Art Technology Survey and Evaluation, 1985, NASA Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA.
Burdea, G. et al., Distributed Virtual Force Feedback, IEEE Workshop on “Force Display in Virtual Environments and its Application to Robotic Teleoperation,” May 2, 1993, Atlanta, GA.
Calder, B. et al., Design of a Force-Feedback Touch-Inducing Actuator for Teleoperator Robot Control, Submitted to the Department of Mechanical Engineering and Electrical Engineering in partial Fulfillment of the requirements of the degree of Bachelors of Science in Mechanical Engineering and Bachelor of Science in Electrical Engineering at the Massachusetts Institute of Technology, May 1983.
Caldwell, D. et al., Enhanced Tactile Feedback (Tele-Taction) using a Multi-Functional Sensory System, Dept. of Electronic Eng., University of Salford, Salford, M5 4WT, UK, 1993.
Cyberman Technical Specification, Logitech Cyberman Swift Supplement, Revision 1.0, Apr. 5, 1994, pp. 1-29.
Eberhardt, S. et al., Omar-A Haptic Display for Speech Perception by Deaf and Deaf-Blind Individuals, IEEE Virtual Reality Annual International Symposium, Sep. 18-22, 1993, Seattle, Washington.
Eberhardt, S. et al., Inducing Dynamic Haptic Perception by the Hand: System Description and Some Results, Dynamic Systems and Control, 1994, vol. 1, presented at 1994 International Mechanical Engineering Congress and Exposition, Chicago Illinois, Nov. 6-11, 1994.
Fukumoto, M. et al., Active Click: Tactile Feedback for Touch Panels, NTT DoCoMo Multimedia Labs, Japan.
Gobel, M. et al., Tactile Feedback Applied to Computer Mice, International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995.
Gotow, J. et al., Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback, The Robotics Institute and Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA 15213, pp. 332-337.
Hansen, W., Enhancing Documents with Embedded Programs: How Ness Extends Insets in the Andrew Toolkit, 1990, Information Technology Center, Carnegie Mellon University, Pittsburgh, PA 15213.
Hasser, C. et al., Tactile Feedback with Adaptive Controller for a Force-Reflecting Haptic Display Part 1: Design, 1996, Armstrong Laboratory, Human Systems Center, Air Force Materiel Command, Wright-Patterson AFB OH 45433.
Hasser, C. et al., Tactile Feedback for a Force-Reflecting Haptic Display, Thesis Submitted to the School of Engineering of the University of Daytona, Dayton OH, Dec. 1995.
Hasser, C., Force-Reflecting Anthropomorphic Hand Masters, Crew Systems Directorate Biodynamics and Biocommunications Division, Wright-Patterson AFB OH 45433-7901, Jul. 1995, Interim Report for the Period Jun. 1991-Jul. 1995.
Hinckley, K. et al., Haptic Issues for Virtual Manipulation, A Dissertation presented to the Faculty of the School of Engineering and Applied Science at the University of Virginia, in Partial Fulfillment of the Requirement for the Degree Doctor of Philosophy (Computer Science), Dec. 1996.
Howe, R., A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation, Proceedings of the 1992 IEEE Conference in Robotics and Automation, Nice, France, May 1992.
Iwata, H., Pen-Based Haptic Virtual Environment, Institute of Engineering Mechanics, University of Tsukuba, Japan, 1993.
Jacobsen, S. et al., High Performance, Dextrous Telerobotic Manipulator with Force Reflection, Intervention/ROV '91, Conference & Exposition, May 21-23, 1991, Hollywood, FL.
Johnson, A., Shape-Memory Alloy Tactile Feedback Actuator, Phase I-Final Report, Air Force SBIR Contract F33-88-C-0541, Armstrong Aerospace Medical Research Laboratory, Human Systems Division, Air Force Systems Command, Wright-Patterson Air Force Base, OH 45433.
Jones, L. et al., A Perceptual Analysis of Stiffness, Experimental Brain Research, 1990, vol. 79, pp. 150-156.
Kaczmarek, K. et al., Tactile Displays, Virtual Environment Technologies, pp. 349-414.
Kelley, A. et al., MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface using an Electromagnetically Actuated Input/Output Device, Department of Electrical Engineering, University of British Columbia, Canada, Oct. 19, 1993.
Lake, S.L., Cyberman from Logitech, web site at http://www.ibiblio.org/GameBytes/issues21/greviews/cyberman/html, as available via the Internet and printed May 29, 2002.
MacLean, K., Designing with Haptic Feedback, Interval Research Corporation, 1801 Page Mill Road, Palo Alto, CA 94304, 2000.
Mine, M., Isaac: A Virtual Environment Tool for the Interactive Construction of Virtual Worlds, Department of Computer Science, University of North Carolina Chapel Hill, 1995.
Picinbono, G. et al., Extrapolation: A Solution for Force Feedback, Virtual Reality and Prototyping, Jun. 1999, Laval, France.
Wloka, M., Interacting with Virtual Reality, Science and Technology Center for Computer Graphics and Scientific Visualization, Brown University Site, Department of Computer Science, 1995.
eRENA, Pushing Mixed Reality Boundaries, Deliverable 7b.1, Final, Version 1.0.
Real Time Graphics, The Newsletter of Virtual Environment Technologies and Markets, Aug. 1998, vol. 7, No. 2.
1998 IEEE International Conference on Robotics and Automation, May 16-20, 1998, Leuven, Belgium.
PCT Search Report corresponding to PCT/US04/18412 mailed on Nov. 3, 2004.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/233,563, dated Mar. 4, 2010.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/233,563, dated Aug. 6, 2008.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/233,563, dated Aug. 14, 2009.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/934,739, dated May 16, 2006.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/934,739, dated Dec. 13, 2005.
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/934,739, dated Sep. 11, 2002.
Related Publications (1)
Number Date Country
20050017454 A1 Jan 2005 US
Provisional Applications (1)
Number Date Country
60477214 Jun 2003 US