This invention relates generally to haptic systems, and more particularly, to interactive simulations and interface devices that incorporate haptic feedback.
The advent of the Internet and modern communication networks has brought renewed life to simulated (or “virtual”) pets. In addition to stand-alone electronic pet toys (e.g., those known as “Tamagotchi,” see U.S. Pat. No. 5,966,526 for example), a user nowadays can also create his/her own simulated (or “virtual”) pet, or order a virtual pet online, and rear the pet as he/she desires. Such virtual pets are typically programmed to adapt to their environments and to develop new traits and characteristics based upon their interactions with their owners. A virtual pet may further explore the online world, participate in events arranged by its owner, and so on. In such scenarios, however, the interaction between a virtual pet and its owner is limited to visual and/or auditory interaction. That is, the user misses the sense of touch with his/her pet that is experienced in the real world.
Embodiments of the invention relate to methods and systems for providing haptic feedback to a user interacting with a simulated (or “virtual”) pet, so as to enhance the realism of the user's relationship with the virtual pet. The term “virtual pet” as used herein is construed broadly to refer to any simulated creature or character, which may or may not have a “real-life” counterpart.
In one embodiment, a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
A further understanding of the invention will become apparent by reference to the remaining portions of the specification and drawings.
In one embodiment, a method of providing haptic feedback to a user interacting with a virtual pet comprises: receiving a signal relating to a biological status of the virtual pet, and outputting, to the user, a haptic effect based on the received signal.
As used herein, the term “biological status” is construed broadly to refer to a “state of being” of a virtual pet, such as for example a health or emotional state. Examples of the biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits. Such states of being are conveyed to a user by way of haptic effects generated based on the biological status of the virtual pet. The user may also experience responses related to feeding and other interactions with the virtual pet by way of appropriate haptic effects.
The software application for controlling a virtual pet may be located on a local device (e.g., a computer or a hand-held device), where the signal relating to the biological status and associated haptic effect are determined at the local device. Alternatively, the software application for controlling a virtual pet may reside remotely, e.g., on a network resource, where the signal relating to the biological status along with associated haptic effect may be generated within the network and sent to a local device for interaction with the user.
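As a minimal illustrative sketch (not part of the claimed subject matter), the local-versus-remote determination described above can be modeled as a lookup that prefers a network-supplied mapping when one is available and otherwise falls back to a table on the local device. The names `HapticEffect` and `resolve_effect`, and all parameter values, are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class HapticEffect:
    magnitude: float     # normalized 0.0 to 1.0
    frequency_hz: float  # vibration frequency
    duration_ms: int     # effect length

# Local table mapping a biological status to effect parameters
# (values are purely illustrative).
LOCAL_EFFECTS = {
    "purring":   HapticEffect(magnitude=0.4, frequency_hz=25.0, duration_ms=1500),
    "heartbeat": HapticEffect(magnitude=0.6, frequency_hz=1.2,  duration_ms=2000),
    "giggling":  HapticEffect(magnitude=0.5, frequency_hz=8.0,  duration_ms=800),
}

def resolve_effect(status, remote_lookup=None):
    """Return effect parameters for a biological status.

    If a remote lookup callable is supplied (standing in for a network
    resource), prefer its answer; otherwise fall back to the local table.
    """
    if remote_lookup is not None:
        effect = remote_lookup(status)
        if effect is not None:
            return effect
    return LOCAL_EFFECTS.get(status)
```

In this sketch the same resolution code serves both architectures: a stand-alone device simply never supplies `remote_lookup`.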
In another embodiment, a haptic system that provides haptic feedback to a user interacting with a virtual pet comprises: a user-interface object; a haptic feedback assembly coupled to the user-interface object; a controller in communication with the user-interface object and the haptic feedback assembly; and a memory storing software. The controller executes the software so as to practice the above method, and the haptic feedback assembly is configured to output the haptic effect thus generated on the user-interface object. In another embodiment, the haptic system further comprises a display screen for displaying a visual image of the virtual pet. It may additionally include an audio element for providing an audio cue associated with the biological status of the virtual pet. Such visual and audio effects may be produced and coordinated in a manner that complements the haptic sensation experienced by the user.
The haptic system described above may be embodied in a computer, a cell phone, a personal digital assistant (PDA), a pager, a game console, a stand-alone toy device (e.g., a Tamagotchi), or other types of hand-held electronic devices known in the art, which may be further equipped with network capabilities.
The flowchart 100 depicts a method of providing haptic feedback to a user interacting with a virtual pet, according to an embodiment of the invention.
In step 110 of the flowchart 100, a signal relating to a biological status of the virtual pet is received.
The term “biological status” refers to a “state of being” (or behavior) of the virtual pet, such as a health or emotional state. Examples of the biological status include, but are not limited to: heartbeat, vitality, purring, giggling, being affectionate, and other personal traits.
In step 120 of the flowchart 100, a haptic effect based on the received signal is output to the user.
Further, the term “haptic effect” should be construed broadly as encompassing any type of force feedback, such as tactile or kinesthetic feedback, that is deemed appropriate for conveying a particular biological status of the virtual pet and thereby enhancing the realism of the user-pet interaction. See
The embodiment of
The embodiment of
The ensuing description discloses several embodiments, illustrating by way of example how the embodiment of
In the embodiment of
The haptic system 200 of
Situations may exist where the software application controlling the virtual pet is located on a remote source, such as, for example, a network resource. In that case, an indicator or signal associated with the biological status, and an indicator or signal associated with the corresponding haptic effect, are sent (or downloaded) from the network resource to the haptic feedback assembly in a local device configured to be in contact with the user.
In
The haptic system 200 of
The haptic system 200 of
As described above, the haptic feedback assembly 420 and the user-interface object 410 may be mechanically integrated to form a “haptic-enabled” unitary device, such as the iFeel mouse manufactured by Logitech, Inc. and enabled by the TouchSense™ technology of Immersion Corporation. In one embodiment, such a mouse may be interfaced to a computer running virtual pet software (e.g., Internet-based virtual pet software from Neopets.com). Such software enables users to create their own pets, which may be selected from many different types and given a wide variety of characteristics. U.S. Pat. Nos. 6,211,861 and 6,429,846, for instance, disclose embodiments of “haptic-enabled” user-interface input devices, and are incorporated herein by reference.
Further, the haptic feedback assembly 420 may be configured to output any form of force feedback deemed suitable. In some applications, for instance, it may be desirable to effect tactile sensations, such as vibrations, pulses, and textures, on a user, whereas in other applications kinesthetic sensations may be produced in the degrees of freedom of motion of the user-manipulatable object (e.g., a joystick handle, mouse, or steering wheel), so as to provide more dynamic interactions between the user and the virtual pet. U.S. Pat. No. 5,734,373 discloses embodiments of generating tactile and kinesthetic feedback, and is incorporated herein by reference.
Optionally, embodiments of the invention may further allow the user to select or customize the haptic feedback that corresponds to a particular status of the virtual pet.
The ensuing description discloses embodiments for producing haptic sensations associated with various biological states of a virtual pet.
When a user, interacting with a virtual pet, takes an action that makes the pet happy, a haptic effect that simulates a purring sensation may be output to the user by a haptic feedback assembly (e.g., the haptic feedback assembly 220 described above). The purring sensation may be triggered in response to the user “petting” the virtual pet with a cursor on the display screen (such as the display screen 260 described above).
In some embodiments, the vibration cycles in
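A purring vibration of the kind described above, with the frequency rising over the course of each cycle, can be sketched as a linear chirp. The frequency range, magnitude, and sample rate below are illustrative assumptions, not values from the specification:

```python
import math

def purr_samples(duration_s=1.0, f_start=15.0, f_end=35.0,
                 magnitude=0.4, sample_rate=1000):
    """One purr cycle whose instantaneous frequency ramps linearly from
    f_start to f_end (a linear chirp), scaled by a fixed magnitude.
    Returns a list of normalized drive-signal samples."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        # Phase of a linear chirp: 2*pi*(f0*t + (f1 - f0)*t^2 / (2*T))
        phase = 2 * math.pi * (f_start * t +
                               (f_end - f_start) * t * t / (2 * duration_s))
        samples.append(magnitude * math.sin(phase))
    return samples
```

Repeating such a cycle, optionally with the magnitude eased in and out, would yield a sustained purr.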
In some embodiments, a user may check the heartbeat of his/her virtual pet as a way of checking the health condition of the pet. The user may enter an input signal to prompt the heartbeat “measure” via a user-interface input device (e.g., the user-interface object 210 described above). Consequently, a data signal or indicator may be transmitted to the haptic feedback assembly that outputs a pulsing sensation to the user. The rate or magnitude of the pulsing sensation may be used to indicate the health state of the virtual pet: for instance, a slow (low frequency) and/or weak (low magnitude) pulse may signal an unhealthy pet that needs care.
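One way to realize the health-dependent pulsing described above is a simple linear mapping from a normalized health score to pulse rate and magnitude; the specific numeric ranges below are assumptions chosen for illustration:

```python
def heartbeat_params(health):
    """Map a normalized health score (0.0 = ill, 1.0 = fully healthy)
    to a pulse rate in beats per minute and a normalized magnitude.
    A weak, slow pulse thus signals an unhealthy pet."""
    health = max(0.0, min(1.0, health))  # clamp out-of-range inputs
    rate_bpm = 40 + 80 * health          # 40 bpm when weak, 120 bpm when vigorous
    magnitude = 0.2 + 0.8 * health       # faint pulse for an unhealthy pet
    return rate_bpm, magnitude
```

The same mapping could be reused for the “exertion” or “excitement” heartbeat mentioned below, driven by an excitement score instead of a health score.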
In addition to health, the heartbeat may be used to indicate a state of “exertion” or “excitement” of the virtual pet, e.g., a rapid heartbeat may convey such a state to the user. By way of example,
When a user interacts with a virtual pet in a manner that “tickles” the pet, a giggling sensation may be delivered to the user by way of the haptic feedback assembly. For example, the user may move a cursor back and forth over the image of the virtual pet to mimic the action of tickling. As a result, a giggling sensation may be delivered to the user as a vibration sensation with varying magnitude and frequency. By way of example,
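The giggling sensation with varying magnitude and frequency might be sketched as a short train of pulses whose parameters wobble deterministically from pulse to pulse; the base values and wobble depths are illustrative assumptions:

```python
import math

def giggle_pattern(n_pulses=6, base_magnitude=0.5, base_freq_hz=8.0):
    """A giggle rendered as a short train of vibration pulses whose
    magnitude and frequency vary sinusoidally across the train."""
    pulses = []
    for i in range(n_pulses):
        wobble = math.sin(2 * math.pi * i / n_pulses)
        pulses.append({
            "magnitude": base_magnitude * (1.0 + 0.3 * wobble),
            "frequency_hz": base_freq_hz * (1.0 + 0.2 * wobble),
        })
    return pulses
```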
In caring for a virtual (or real) pet, a routine activity is “feeding” the pet. When a virtual pet is eating, tactile feedback may be output to the user to effect a “feeding sensation.” Such a feeding sensation may be in the form of a series of jolts, indicating that the pet is gulping down food, for instance. Alternatively, the feeding sensation may be delivered to the user as a continuous vibration, indicating that the pet is drinking liquid or chewing vigorously. The feeding sensation may also be delivered in coordination with visual images of the pet moving its mouth in a chewing or gulping motion, along with corresponding sound effects.
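A hypothetical sketch of the two feeding sensations described above, discrete jolts for gulping versus a continuous vibration for drinking or chewing, might select an effect schedule by eating style (all timings and magnitudes below are assumed for illustration):

```python
def feeding_effect(style):
    """Return a schedule of haptic events for a feeding sensation.

    "gulping" yields a series of sharp jolts; any other style yields
    a single continuous low-level vibration (drinking or chewing).
    """
    if style == "gulping":
        # Three sharp 60 ms jolts, spaced 400 ms apart.
        return [{"t_ms": i * 400, "magnitude": 0.8, "duration_ms": 60}
                for i in range(3)]
    # Continuous low-level vibration for drinking or vigorous chewing.
    return [{"t_ms": 0, "magnitude": 0.3, "duration_ms": 2000}]
```

Coordinating the `t_ms` timestamps with the animation frames of the pet's mouth would give the synchronized audiovisual-plus-haptic presentation described above.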
It will be appreciated that haptic effects may be further devised to convey other characteristics and abilities of a virtual pet. For example, a tactile sensation may be delivered to the user to signal a virtual pet wagging its tail, where the magnitude and frequency of the vibration may be correlated with the graphical image of wagging. Appropriate haptic sensations may also be generated, corresponding to a virtual pet wagging its ears, panting, scratching fur or flea bites, stretching, or sleeping. In addition, a virtual pet may be equipped with an extraordinary power, such as the ability to shoot lightning bolts or breathe fire. An appropriate haptic sensation may be devised to convey such power, as well.
In a virtual pet environment, a pet is often given a set of statistics that document the strength and vitality of the creature. Such statistics may be used when two pets “do battle.” For instance, when one pet owner is trying to decide if his/her pet should battle another pet, he/she may check the strength statistics of both pets. An effective way of getting a sense of the “strength” of a potential opponent is by way of haptic sensation. By way of example, a user may put a cursor over the image of a particular pet and feel a haptic sensation that conveys the strength of the pet. The haptic sensation in this case may be delivered in the form of a vibration, characterized by a magnitude that is scaled in accordance with the pet's strength statistics, for instance.
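Scaling vibration magnitude with a strength statistic, as described above, might look like the following sketch; the linear mapping and the clamping range are assumptions:

```python
def strength_vibration(strength, max_strength=100):
    """Scale vibration magnitude linearly with a pet's strength
    statistic, clamped to [0, max_strength]; returns a value in
    [0.0, 1.0] suitable as a normalized drive magnitude."""
    s = max(0, min(strength, max_strength))
    return s / max_strength
```

The same mapping could serve the “popularity” statistic discussed next, with an unpopular pet producing a correspondingly soft sensation.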
Likewise, virtual pets may be characterized by “popularity” statistics. As in the case of the strength (or vitality) statistics, a haptic sensation may be associated with a popularity statistic. For example, an “unpopular” pet may be assigned a soft, low-frequency tactile sensation, whereas a popular pet may dictate a strong, high-frequency tactile sensation. Those skilled in the art will appreciate that haptic sensations may likewise be associated with other statistics of virtual pets.
Those skilled in the art will recognize that the embodiments described above are provided by way of example, to elucidate the general principles of the invention. Various means and methods can be devised to perform the designated functions in an equivalent manner. Moreover, various changes, substitutions, and alterations can be made herein without departing from the principles and the scope of the invention.
The present application claims benefit of Provisional Patent Application No. 60/336,411, entitled “Using Haptic Feedback Peripheral Devices to Enhance Interaction with Computer Simulated Pets,” filed on Oct. 30, 2001, which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
2972140 | Hirsch | Feb 1961 | A |
3157853 | Hirsch | Nov 1964 | A |
3220121 | Cutler | Nov 1965 | A |
3497668 | Hirsch | Feb 1970 | A |
3517446 | Corlyon et al. | Jun 1970 | A |
3623064 | Kagan | Nov 1971 | A |
3902687 | Hightower | Sep 1975 | A |
3903614 | Diamond et al. | Sep 1975 | A |
3911416 | Feder | Oct 1975 | A |
4127752 | Lowthorp | Nov 1978 | A |
4160508 | Salisbury, Jr. | Jul 1979 | A |
4236325 | Hall et al. | Dec 1980 | A |
4262549 | Schwellenbach | Apr 1981 | A |
4320268 | Brown | Mar 1982 | A |
4321441 | Thornburg | Mar 1982 | A |
4333070 | Barnes | Jun 1982 | A |
4464117 | Foerst | Aug 1984 | A |
4484191 | Vavra | Nov 1984 | A |
4513235 | Acklam et al. | Apr 1985 | A |
4581491 | Boothroyd | Apr 1986 | A |
4599070 | Hladky et al. | Jul 1986 | A |
4708656 | de Vries et al. | Nov 1987 | A |
4713007 | Alban | Dec 1987 | A |
4791416 | Adler | Dec 1988 | A |
4794392 | Selinko | Dec 1988 | A |
4795296 | Jau | Jan 1989 | A |
4798919 | Miessler et al. | Jan 1989 | A |
4821030 | Batson et al. | Apr 1989 | A |
4885565 | Embach | Dec 1989 | A |
4891764 | McIntosh | Jan 1990 | A |
4930770 | Baker | Jun 1990 | A |
4934694 | McIntosh | Jun 1990 | A |
5019761 | Kraft | May 1991 | A |
5022384 | Freels | Jun 1991 | A |
5022407 | Horch et al. | Jun 1991 | A |
5035242 | Franklin et al. | Jul 1991 | A |
5038089 | Szakaly | Aug 1991 | A |
5078152 | Bond et al. | Jan 1992 | A |
5165897 | Johnson | Nov 1992 | A |
5175459 | Danial et al. | Dec 1992 | A |
5182557 | Lang | Jan 1993 | A |
5186695 | Mangseth et al. | Feb 1993 | A |
5212473 | Louis | May 1993 | A |
5223658 | Suzuki | Jun 1993 | A |
5237327 | Saitoh et al. | Aug 1993 | A |
5240417 | Smithson et al. | Aug 1993 | A |
5246316 | Smith | Sep 1993 | A |
5271290 | Fischer | Dec 1993 | A |
5275174 | Cook | Jan 1994 | A |
5283970 | Aigner | Feb 1994 | A |
5289273 | Lang | Feb 1994 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5309140 | Everett, Jr. et al. | May 1994 | A |
5334027 | Wherlock | Aug 1994 | A |
5355148 | Anderson | Oct 1994 | A |
5390128 | Ryan et al. | Feb 1995 | A |
5390296 | Crandall et al. | Feb 1995 | A |
5402499 | Robison et al. | Mar 1995 | A |
5436622 | Gutman et al. | Jul 1995 | A |
5437607 | Taylor | Aug 1995 | A |
5451924 | Massimino et al. | Sep 1995 | A |
5461711 | Wang et al. | Oct 1995 | A |
5466213 | Hogan et al. | Nov 1995 | A |
5524195 | Clanton, III et al. | Jun 1996 | A |
5547382 | Yamasaki et al. | Aug 1996 | A |
5565840 | Thorner et al. | Oct 1996 | A |
5575761 | Hajianpour | Nov 1996 | A |
5631861 | Kramer | May 1997 | A |
5669818 | Thorner et al. | Sep 1997 | A |
5684722 | Thorner et al. | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5691747 | Amano | Nov 1997 | A |
5709219 | Chen et al. | Jan 1998 | A |
5729249 | Yasutake | Mar 1998 | A |
5734373 | Rosenberg et al. | Mar 1998 | A |
5766016 | Sinclair et al. | Jun 1998 | A |
5767457 | Gerpheide et al. | Jun 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
5791992 | Crump et al. | Aug 1998 | A |
5844392 | Peurach et al. | Dec 1998 | A |
5857986 | Moriyasu | Jan 1999 | A |
5887995 | Holehan | Mar 1999 | A |
5889670 | Schuler et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5945772 | Macnak et al. | Aug 1999 | A |
5956484 | Rosenberg et al. | Sep 1999 | A |
5988902 | Holehan | Nov 1999 | A |
6059506 | Kramer | May 2000 | A |
6078126 | Rollins et al. | Jun 2000 | A |
6097964 | Nuovo et al. | Aug 2000 | A |
6101530 | Rosenberg et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6131097 | Peurach et al. | Oct 2000 | A |
6160489 | Perry et al. | Dec 2000 | A |
6167362 | Brown et al. | Dec 2000 | A |
6195592 | Schuler et al. | Feb 2001 | B1 |
6198206 | Saarmaa et al. | Mar 2001 | B1 |
6211861 | Rosenberg et al. | Apr 2001 | B1 |
6218966 | Goodwin et al. | Apr 2001 | B1 |
6219034 | Elbing et al. | Apr 2001 | B1 |
6225976 | Yates et al. | May 2001 | B1 |
6273815 | Stuckman et al. | Aug 2001 | B1 |
6287193 | Rehkemper et al. | Sep 2001 | B1 |
6290566 | Gabai et al. | Sep 2001 | B1 |
6374255 | Peurach et al. | Apr 2002 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6438457 | Yokoo et al. | Aug 2002 | B1 |
6543487 | Bazinet | Apr 2003 | B2 |
6650338 | Kolarov et al. | Nov 2003 | B1 |
20020033795 | Shahoian et al. | Mar 2002 | A1 |
20020128048 | Aaltonen | Sep 2002 | A1 |
20020177471 | Kaaresoja et al. | Nov 2002 | A1 |
Number | Date | Country |
---|---|---|
0607580 | Jul 1994 | EP |
11-299305 | Nov 1999 | JP |
0035548 | Jun 2000 | WO |
Entry |
---|
US Patent Application Publication 2002/0019678 A1. |
L.B. Rosenberg, A Force Feedback Programming Primer, Immersion Corporation, San Jose, California, 1997; 98 pages (2-sided). |
Author not known, “Information Processing, vol. 41, No. 2 (Interface Technology Coming Close to Real World: Digital Pets—Machines with Minds)” published by the Information Processing Society of Japan, Feb. 15, 2000, vol. 41, No. 2, pp. 127-136 (with translation). |
Umeki, Naoko, et al., “A Motional Interface by Using Motion Processor,” Human Interface Society, Feb. 16, 1999, vol. 1, No. 1, pp. 63-66 (with translation). |
Supplementary European Search Report, Application No. EP02804680, dated Apr. 17, 2008, the corresponding set of claims/application has not been identified. |
Number | Date | Country | |
---|---|---|---|
20030080987 A1 | May 2003 | US |
Number | Date | Country | |
---|---|---|---|
60336411 | Oct 2001 | US |