Some playsets include electronic displays configured to display animated images. Other playsets include one or more removable toys with identifiable accessories, wherein the playset may be configured to determine an identity of a removable toy by interacting with the removable toy's identifiable accessory.
Games and toys incorporating electronic displays or toys with identifiable accessories are found in U.S. Pat. Nos. 5,085,609; 5,278,779; 5,766,077; 6,039,625; 6,190,174; 6,192,215; 6,227,931; 6,302,612; 6,461,238; 6,773,325; 6,814,662; and 6,937,152; and U.S. Published Patent Application No. 2002/132553, the disclosures of which are incorporated herein by reference for all purposes.
The present disclosure provides for electronic playsets and components thereof. An electronic playset of the present disclosure may include a see-through monitor having a transparent screen configured to display an animated image superimposed over one or more regions. The see-through monitor may be moveable by rotation or translation between one or more positions, the one or more positions being adjacent to the one or more regions. Some embodiments include a see-through monitor with a light source configured to illuminate the one or more regions. Another aspect of the present disclosure provides for a removable toy having an identifiable accessory that may be read by an electronic playset. The playset may be configured to identify the toy and display an animated character based on the identity.
An electronic playset 10 in accordance with the present disclosure is described herein. Referring primarily to
Housing 12 may comprise one or more regions, such as a first region 20, a second region 22, a third region 24, a fourth region 26, a fifth region 28 and a sixth region 30, so that housing 12 may resemble a multi-unit housing complex or a multi-room building. Each region may comprise a two-dimensional or three-dimensional scene. In some embodiments, such as the embodiment shown in
First region 20 of
As seen best in
See-through monitor 14 may include a transparent screen 58 and a light source 60. Transparent screen 58 may be configured to display an animated image 62 superimposed over the various regions when see-through monitor 14 is in the corresponding positions. For instance, see-through monitor 14 may be configured to display an animated image 62 containing particular virtual objects and/or characters superimposed over first region 20 when see-through monitor 14 is in the first position, as shown in
Animated image 62 may include one or more virtual characters and/or objects displayed superimposed over the regions. Further, animated image 62 may include virtual characters appearing to interact with objects in the various regions, such as three-dimensional objects 34, 40, 44 and 50. For instance, in
Light source 60 may be disposed on see-through monitor 14 and configured to illuminate the various regions when in the corresponding positions. For instance, light source 60 may be configured to illuminate first region 20 when see-through monitor 14 is in a first position adjacent to first region 20, and to illuminate second region 22 when see-through monitor 14 is in a second position adjacent to second region 22. Light source 60 may include one or more light-emitting diodes 68, as well as other light sources, such as incandescent bulbs.
Mount 16 may take various forms. In some embodiments, mount 16 may be elongate and linear so as to define an axis 70, such as the vertical axis 72 depicted in
Regions may be at various positions on the housing relative to an axis 70. For instance,
Processor 18 may be a microprocessor such as those commonly found in various electronic components. As the example schematic diagram depicted in
Processor 18 may be configured to execute instructions in memory 74 causing transparent screen 58 to display various animated images 62, including virtual characters such as those described above. Some virtual characters may be stored locally in memory 74. Other virtual characters may be associated with removable objects that may be connected to housing 12, as will be discussed in more detail below. Other virtual characters may be stored in memory 74, but may be unlocked when a particular removable object (described below) is affixed to housing 12.
Processor 18 further may be configured to determine which position see-through monitor 14 is in, so that processor 18 may instruct transparent screen 58 to display an animated image 62 appropriate for the corresponding region. In the example shown in
Likewise, when see-through monitor 14 is in a third position adjacent to third region 24, which resembles an exercise room, processor 18 may instruct transparent screen 58 to display a virtual character such as first virtual character 64 appearing to ride the stationary bicycle 42.
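To illustrate this position-dependent selection in concrete terms, a minimal sketch follows. It is offered only as an illustration and forms no part of the disclosure; the enum values, the read_monitor_position() stub and the show_animation() routine are hypothetical names assumed for the example.

```c
#include <stdio.h>

/* Hypothetical identifiers for the six regions 20-30; the disclosure does not
 * prescribe any particular data representation for them. */
enum region { REGION_20, REGION_22, REGION_24, REGION_26, REGION_28,
              REGION_30, REGION_COUNT };

/* Stub standing in for whatever mechanism reports which position the
 * see-through monitor 14 currently occupies. */
static enum region read_monitor_position(void) { return REGION_24; }

/* Stub standing in for driving transparent screen 58 with the clip that
 * matches the selected region. */
static void show_animation(enum region r) {
    static const char *clip[REGION_COUNT] = {
        "animated image over first region 20",
        "animated image over second region 22",
        "character riding stationary bicycle 42",      /* third region 24  */
        "animated image over fourth region 26",
        "male-appearing character in fifth region 28",
        "female-appearing character in sixth region 30",
    };
    printf("Displaying: %s\n", clip[r]);
}

int main(void) {
    /* Processor 18 determines the monitor position and selects a matching clip. */
    show_animation(read_monitor_position());
    return 0;
}
```

In embodiments that include light source 60, the same selection could also determine which region is illuminated, consistent with the behavior described above.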
In some embodiments, virtual characters may be associated with particular regions. For instance, fifth region 28 may include items with masculine appearances, indicating that fifth region 28 may be a male's room. In such a case, when see-through monitor 14 is in a fifth position adjacent to fifth region 28, transparent screen 58 may be configured to display a male-appearing virtual character interacting with three-dimensional objects 44 in fifth region 28.
Likewise, sixth region 30 may include items with feminine appearances, indicating that sixth region 30 may be a female's room. In such a case, when see-through monitor 14 is in a sixth position adjacent to sixth region 30, transparent screen 58 may be configured to display a female-appearing virtual character interacting with three-dimensional objects 50 in sixth region 30.
It should be understood that virtual characters are not limited to a particular region, and virtual characters such as first virtual character 64 may be displayed by transparent screen 58 in multiple regions. Moreover, more than one virtual character may appear in a region at one time, and two or more virtual characters may appear to interact with each other, as well as the three-dimensional objects in the region.
Memory 74 may be used to store sounds, games, play modes, and one or more animated images 62, including one or more virtual characters. Memory 74 may further be used to store real-time characteristics associated with a particular virtual character. For instance, as a user controls a virtual character over time, the user may be able to save the character as the character changes (e.g., becomes smarter or older). Additionally and/or alternatively, a virtual character's interactions with other virtual characters may be stored in memory 74, so that such interactions may affect further interactions between the two virtual characters or with other virtual characters.
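To make the notion of saved, evolving character state concrete, a minimal sketch follows, assuming one particular record layout; the struct, its field names and the sample values are hypothetical, since the disclosure specifies no data format for memory 74.

```c
#include <stdio.h>

#define MAX_CHARACTERS 8

/* Hypothetical record of a virtual character's saved, evolving state. */
struct character_state {
    char name[16];
    int  age;                       /* grows as the character "gets older"   */
    int  intelligence;              /* grows as the character "gets smarter" */
    int  affinity[MAX_CHARACTERS];  /* saved interactions with other characters */
};

int main(void) {
    struct character_state c = { .name = "TOM", .age = 1, .intelligence = 10 };

    /* A saved interaction with the character at index 2 can bias how the two
     * characters interact the next time they appear together. */
    c.affinity[2] += 1;

    /* The character changes over time and is re-saved. */
    c.age += 1;

    printf("%s: age %d, intelligence %d, affinity[2] = %d\n",
           c.name, c.age, c.intelligence, c.affinity[2]);
    return 0;
}
```

In practice such a record could be written back to memory 74 whenever the user saves the character.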
Processor 18 may further control other components. For instance, in embodiments including light source 60, processor 18 may execute instructions in memory 74 causing light source 60 to illuminate specific regions, as described above.
Electronic playset 10 may be configured with one or more speakers 75 which processor 18 may cause to produce sounds. Sounds may include music and/or sound effects to accompany various actions occurring in animated image 62. Sounds may further be controlled by a user operating user interface 80.
Network interface 76 may allow electronic playset 10 to connect to one or more computers directly or over a network (e.g., a local-area network or the Internet). Such a connection may be wireless (e.g., IEEE 802.x) or wired (e.g., Ethernet, parallel, serial, token ring, dial-up, etc.). Processor 18 may control network interface 76 to download information into memory 74. Such information may include new virtual characters to display on transparent screen 58, rules for new games a user may play, sounds to be produced from speaker 75, and the like.
Sound interface 78 may allow electronic playset 10 to receive acoustic signals. In some embodiments, acoustic signals may be received over a wired connection. In other embodiments, sound waves may be received through a different medium, such as air (e.g., via microphone 79). The received signals may contain instructions and/or data which may be stored in memory 74. Processor 18 may be configured to instruct transparent screen 58 to display animated images 62 which respond to or are controlled by sounds received at sound interface 78.
User interface 80 may include one or more actuators 82 (e.g., buttons). The one or more actuators 82 may be operably connected to processor 18 so that they may be used to control various components such as animated image 62, light source 60 and/or speaker 75. In some embodiments, a user may be presented with a task (e.g., to cause a virtual character to retrieve food from refrigerator 36, cook it, and eat it), and the user may utilize the one or more actuators 82 to control animated image 62 (which may include one or more virtual characters such as first virtual character 64) to complete the task. In other embodiments, a user may use user interface 80 to cause a virtual character to interact with another virtual character.
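As a rough sketch of how the one or more actuators 82 might step a virtual character through such a task, consider the following; the button codes, the next_button_press() stub and the three-step breakdown are assumptions made only for the example, with the task itself taken from the description above.

```c
#include <stdio.h>

/* Hypothetical button events produced by the actuators 82 of user interface 80. */
enum button { BTN_ACTION, BTN_SELECT, BTN_NONE };

/* Steps of the example task: retrieve food from refrigerator 36, cook it, eat it. */
static const char *steps[] = {
    "retrieves food from refrigerator 36",
    "cooks the food",
    "eats the food",
};

/* Stub standing in for polling the actuators; here every poll reports BTN_ACTION. */
static enum button next_button_press(void) { return BTN_ACTION; }

int main(void) {
    size_t step = 0;

    /* Each actuation advances first virtual character 64 through the task. */
    while (step < sizeof steps / sizeof steps[0]) {
        if (next_button_press() == BTN_ACTION) {
            printf("First virtual character 64 %s\n", steps[step]);
            step++;
        }
    }
    printf("Task complete.\n");
    return 0;
}
```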
User interface 80 may be operably connected to housing 12 via cable (as seen in
Another aspect of the present disclosure provides for an electronic playset 10 configured to interact with foreign objects. Referring to
While housing 12 may be similar to one of the multi-region embodiments described above, housing 12 with respect to this aspect of the present disclosure may take numerous other forms, such as a figurine, object or environment.
Removable toy 86 may take various forms such as a figurine 92, a creature 94, or an object 96. In embodiments where housing 12 takes a form different than the multi-region embodiments described above, removable toy 86 may resemble other objects. For instance, if housing 12 resembles a figurine, removable toy 86 may resemble an article of clothing that may be affixed to housing 12, and housing 12 may “identify” the article of clothing. In embodiments where removable toy 86 is a figurine, such as figurine 92, first virtual character 64 may resemble the appearance of figurine 92.
Identifiable accessory 88 may comprise a second plurality 98 of electrical contacts and a diode 100. Second plurality 98 of electrical contacts may be connected to and removed from first plurality 84 of electrical contacts. Diode 100 may interconnect a first pair 102 of the second plurality 98 of electrical contacts, and diode 100 may be adapted to limit current flow between the first pair 102 of the second plurality 98 of electrical contacts to a first direction.
Controller 90 may be configured to determine an identity of removable toy 86 by interacting with identifiable accessory 88. Controller 90 may be electrically connected to the first plurality 84 of electrical contacts, as shown in
When second plurality 98 of electrical contacts is connected to first plurality 84 of electrical contacts (i.e., removable toy 86 is attached to housing 12), controller 90 may be configured to apply voltage to one of the first pair 102 of the second plurality 98 of electrical contacts. Controller 90 may then detect voltage on the other of the first pair 102 of the second plurality 98 of electrical contacts, and determine an identity of the removable toy 86 based at least in part on the first direction which diode 100 permits current to pass between the first pair 102 of the second plurality 98 of electrical contacts.
In addition to the first direction of current flow permitted by diode 100, controller 90 may detect other aspects of identifiable accessories 88, such as which electrical contacts are connected by diode 100. Second plurality 98 of electrical contacts may comprise three or more electrical contacts. Controller 90 may be further configured, in addition to the applying, detecting, and determining described above, to detect voltage on the one of the three or more electrical contacts not included in the first pair 102 of the second plurality 98 of electrical contacts, and to determine the identity of the removable toy 86 based on which two of the three or more electrical contacts form the first pair 102 of the second plurality 98 of electrical contacts. It further should be understood that more than three electrical contacts are possible, as is seen in the examples described below.
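As a brief aside on the identity space such an arrangement affords: because a single diode 100 joins one ordered (anode, cathode) pair of contacts, n contacts can encode up to n × (n − 1) distinct identities, for example twelve identities with four contacts. The short sketch below merely tabulates that arithmetic and forms no part of the disclosure.

```c
#include <stdio.h>

int main(void) {
    /* One diode joins an ordered (anode, cathode) pair of contacts, so
     * n contacts can encode up to n * (n - 1) distinct identities. */
    for (int n = 2; n <= 6; n++)
        printf("%d contacts -> up to %d identities\n", n, n * (n - 1));
    return 0;
}
```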
Particular examples shown in
Diode 100 of the removable toy 86 labeled “TOM” is shown interconnecting the first 122 and second 124 electrical contacts, forming the first pair 102 of the second plurality of electrical contacts. Diode 100 may be adapted to limit current flow between the first 122 and second 124 electrical contacts to a first direction, labeled A.
To determine an identity of removable toy 86, controller 90 may sequentially apply voltage to each line 114, 116, 118 and 120, and detect voltage on the resistors 106, 108, 110 and 112. For instance, controller 90 may apply voltage to first line 114, causing current to pass through first electrical contact 122 and diode 100 on the removable toy 86 labeled “TOM.” The current produces a voltage across second resistor 108. Controller 90 may detect this voltage and, using Table 1 below (which may be stored as a lookup table in, for instance, memory 74), determine that removable toy 86 having an identity of “TOM” is present.
Three other removable toys 86 also are shown in
In some embodiments, controller 90 may be configured to identify removable toy 86 (i.e., perform the above-described steps of applying, detecting and determining) in response to second plurality 98 of electrical contacts (associated with a removable toy 86) being brought into contact with first plurality 84 of electrical contacts.
Other embodiments of playset 10 may comprise a user-controlled switch 130 operably connected to controller 90, which may be activated to cause controller 90 to perform the steps of applying, detecting and determining. For instance, some embodiments may include receiving area 32 resembling a garage 132 having one or more garage doors 134, wherein user-controlled switch 130 may be a portion of the garage door 134 which may be actuated when garage door 134 is closed.
Controller 90 may await activation of user-controlled switch 130, such as a user closing garage door 134, to perform the steps of applying, detecting and determining described above. In other embodiments, controller 90 may await activation of user-controlled switch 130 comprising an actuator 82 on user interface 80 to perform the above-described steps of applying, detecting and determining.
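The scanning and lookup described in connection with Table 1 might be sketched as follows. This is a schematic illustration under stated assumptions: drive_line() and sense_resistor() are stubs standing in for the actual I/O to the first plurality 84 of electrical contacts, and only the “TOM” entry of the lookup table follows the example above, the remaining entries being left empty because Table 1 is not reproduced here.

```c
#include <stdio.h>

#define NUM_LINES 4   /* lines 114, 116, 118 and 120 */

/* Stub: energize one line of the first plurality 84 of electrical contacts. */
static void drive_line(int line) { (void)line; }

/* Stub: report whether the resistor on line `sensed` (one of resistors 106,
 * 108, 110 and 112) sees a voltage while line `driven` is energized.  Here it
 * is hard-wired to mimic the toy labeled "TOM", whose diode 100 passes
 * current from first contact 122 (line 0) to second contact 124 (line 1). */
static int sense_resistor(int driven, int sensed) {
    return driven == 0 && sensed == 1;
}

/* Hypothetical stand-in for Table 1: identity keyed by the (driven, sensed)
 * pair.  Only the "TOM" entry follows the example; other entries are left
 * empty because Table 1 itself is not reproduced here. */
static const char *identity_table[NUM_LINES][NUM_LINES] = {
    [0][1] = "TOM",
};

int main(void) {
    /* Controller 90 sequentially applies voltage to each line and checks the
     * remaining lines' resistors for a responding voltage. */
    for (int driven = 0; driven < NUM_LINES; driven++) {
        drive_line(driven);
        for (int sensed = 0; sensed < NUM_LINES; sensed++) {
            if (sensed != driven && sense_resistor(driven, sensed)) {
                const char *id = identity_table[driven][sensed];
                printf("Removable toy identified: %s\n", id ? id : "(unknown)");
                return 0;
            }
        }
    }
    printf("No removable toy detected.\n");
    return 0;
}
```

In embodiments with user-controlled switch 130, such as a switch actuated by closing garage door 134, the scan would simply be invoked when the switch is activated, as described above.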
Accordingly, while embodiments have been particularly shown and described with reference to the foregoing disclosure, many variations may be made therein. The foregoing embodiments are illustrative, and no single feature or element is essential to all possible combinations that may be used in a particular application. Where the claims recite “a” or “a first” element or the equivalent thereof, such claims include one or more such elements, neither requiring nor excluding two or more such elements. Further, ordinal indicators, such as first, second or third, for identified elements are used to distinguish between the elements, and do not indicate or imply a required or limited number of such elements, and do not indicate a particular position or order of such elements unless otherwise specifically stated.
This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 60/849,338 entitled “Video Toy with Backgrounds and Movable Screen,” filed Oct. 2, 2006, the disclosure of which is incorporated herein by reference.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
3182420 | Bender | May 1965 | A |
4398723 | Erickson et al. | Aug 1983 | A |
4421317 | Hector et al. | Dec 1983 | A |
4432151 | Morris | Feb 1984 | A |
5013278 | Dixon et al. | May 1991 | A |
5022682 | Desmond | Jun 1991 | A |
5055083 | Walker | Oct 1991 | A |
5085609 | Haberle | Feb 1992 | A |
5278779 | Conway et al. | Jan 1994 | A |
5280416 | Hartley et al. | Jan 1994 | A |
5312284 | Grober et al. | May 1994 | A |
5445552 | Hine | Aug 1995 | A |
5513129 | Bolas et al. | Apr 1996 | A |
5655945 | Jani | Aug 1997 | A |
5728962 | Goede | Mar 1998 | A |
RE35819 | Suzuki | Jun 1998 | E |
5766077 | Hongo | Jun 1998 | A |
5940167 | Gans | Aug 1999 | A |
5952598 | Goede | Sep 1999 | A |
5966526 | Yokoi | Oct 1999 | A |
5971833 | Rasmussen | Oct 1999 | A |
5989033 | Burgio | Nov 1999 | A |
6039625 | Wang | Mar 2000 | A |
6048251 | Klitsner | Apr 2000 | A |
6056618 | Larian | May 2000 | A |
6165068 | Sonoda et al. | Dec 2000 | A |
6190174 | Lam | Feb 2001 | B1 |
6192215 | Wang | Feb 2001 | B1 |
6213871 | Yokoi | Apr 2001 | B1 |
6227931 | Shackelford | May 2001 | B1 |
6227966 | Yokoi | May 2001 | B1 |
6273815 | Stuckman et al. | Aug 2001 | B1 |
6290565 | Galyean et al. | Sep 2001 | B1 |
6302612 | Fowler et al. | Oct 2001 | B1 |
6319010 | Kikinis | Nov 2001 | B1 |
6319130 | Ooseki et al. | Nov 2001 | B1 |
6343006 | Moscovitch et al. | Jan 2002 | B1 |
6353170 | Eyzaguirre et al. | Mar 2002 | B1 |
6369822 | Peevers et al. | Apr 2002 | B1 |
6443796 | Shackelford | Sep 2002 | B1 |
6449518 | Yokoo et al. | Sep 2002 | B1 |
6461238 | Rehkemper et al. | Oct 2002 | B1 |
6500070 | Tomizawa et al. | Dec 2002 | B1 |
6537149 | Sogabe | Mar 2003 | B2 |
6542869 | Foote | Apr 2003 | B1 |
6558225 | Rehkemper et al. | May 2003 | B1 |
6609968 | Okada et al. | Aug 2003 | B1 |
6652383 | Sonoda et al. | Nov 2003 | B1 |
6656049 | Masaki | Dec 2003 | B1 |
6722973 | Akaishi | Apr 2004 | B2 |
6773325 | Mawle et al. | Aug 2004 | B1 |
6800013 | Liu | Oct 2004 | B2 |
6814662 | Sasaki et al. | Nov 2004 | B2 |
6832955 | Yokoi | Dec 2004 | B2 |
6885898 | Brown et al. | Apr 2005 | B1 |
6898759 | Terada et al. | May 2005 | B1 |
6937152 | Small | Aug 2005 | B2 |
6988896 | Cho | Jan 2006 | B2 |
6997773 | Dubois et al. | Feb 2006 | B1 |
7001270 | Taub | Feb 2006 | B2 |
7024255 | Brown et al. | Apr 2006 | B1 |
7059934 | Whitehead | Jun 2006 | B2 |
7081033 | Mawle et al. | Jul 2006 | B1 |
7095387 | Lee et al. | Aug 2006 | B2 |
7104884 | Yokoi | Sep 2006 | B2 |
7203558 | Sugiyama et al. | Apr 2007 | B2 |
7254455 | Moulios | Aug 2007 | B2 |
20020132553 | Jelinek | Sep 2002 | A1 |
20030064685 | Kim | Apr 2003 | A1 |
20030124954 | Liu | Jul 2003 | A1 |
20040004667 | Morikawa et al. | Jan 2004 | A1 |
20040133354 | Low et al. | Jul 2004 | A1 |
20040197758 | Langford | Oct 2004 | A1 |
20040259635 | Germeraad | Dec 2004 | A1 |
20050009443 | Martin et al. | Jan 2005 | A1 |
20050024313 | Nakajima et al. | Feb 2005 | A1 |
20050054440 | Anderson | Mar 2005 | A1 |
20050237701 | Yu | Oct 2005 | A1 |
20050245302 | Bathiche | Nov 2005 | A1 |
20050253775 | Stewart | Nov 2005 | A1 |
20060007644 | Huilgol et al. | Jan 2006 | A1 |
20060009121 | Rotundo | Jan 2006 | A1 |
20060058101 | Rigopulos | Mar 2006 | A1 |
20060077621 | Adatia | Apr 2006 | A1 |
20060082518 | Ram | Apr 2006 | A1 |
20060126284 | Moscovitch | Jun 2006 | A1 |
20060160588 | Yamada et al. | Jul 2006 | A1 |
20060172787 | Ellis | Aug 2006 | A1 |
20060181537 | Vasan | Aug 2006 | A1 |
20060266200 | Goodwin | Nov 2006 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---|
0978301 | Feb 2000 | EP |
1365386 | Nov 2003 | EP |
2425730 | Nov 2006 | GB |
64091188 | Apr 1989 | JP |
1315791 | Dec 1989 | JP |
63170697 | Jan 1990 | JP |
2006198017 | Aug 2006 | JP |
WO9422128 | Sep 1994 | WO |
WO9503588 | Feb 1995 | WO |
WO9525312 | Sep 1995 | WO |
WO0014719 | Mar 2000 | WO |
WO2006034180 | Mar 2006 | WO |
Related Publications

Number | Date | Country
---|---|---
20080113586 A1 | May 2008 | US |
Provisional Applications

Number | Date | Country
---|---|---
60849338 | Oct 2006 | US |