Interactive ride-on toy apparatus

Information

  • Patent Grant
  • Patent Number
    10,245,517
  • Date Filed
    Tuesday, March 27, 2018
  • Date Issued
    Tuesday, April 2, 2019
Abstract
An apparatus is provided that includes a torso and a plurality of legs, a first drive motor assembly secured to a first of the plurality of legs and to a first drive wheel, a second drive motor assembly secured to a second of the plurality of legs and to a second drive wheel, a motorized neck assembly coupling a head to the torso and providing a multi-directional rotational movement of the head, a rechargeable battery, a throttle switch to provide a throttle signal, a controller including one or more processors and memory devices, and an electrical steering position sensor configured to translate a mechanical steering input via manual rotation of the head into an electronic steering position signal that is communicated to the controller, wherein the controller is configured to selectively actuate at least one of the drive wheel motors based on the throttle signal and the steering position signal.
Description
FIELD OF THE INVENTION

The invention relates generally to the field of motorized toys. More particularly, the invention relates to a motorized interactive ride-on toy.


BACKGROUND

Motorized ride-on toys have been driven by children for many years, although the ability to control and interact with these toys has been notably limited, thereby diminishing a user's overall experience. Accordingly, a need exists for a ride-on toy that engages the user through improved interactive and control capabilities.


SUMMARY OF THE INVENTION

The terms used herein should not be interpreted as being limited to specific forms, shapes, or compositions. Rather, the parts can have a wide variety of shapes and forms and can be composed of a wide variety of materials. These and other features of the apparatus will become apparent from the detailed description, claims, and accompanying drawings.


In at least some embodiments, the apparatus is an interactive ride-on toy apparatus that includes: a torso; a plurality of legs secured to the torso; a first drive motor assembly secured to a first of the plurality of legs and to a first drive wheel; a second drive motor assembly secured to a second of the plurality of legs and to a second drive wheel; a motorized neck assembly coupling a head to the torso, wherein the neck assembly provides a multi-directional rotational movement of the head relative to the torso; a rechargeable battery; a throttle switch to provide a throttle signal; a controller including one or more processors and memory devices; and an electrical steering position sensor configured to translate a mechanical steering input via manual rotation of the head into an electronic steering position signal that is communicated to the controller, wherein the controller is configured to receive the throttle signal and the steering position signal, and selectively actuate at least one of the drive wheel motors based on the throttle signal and the steering position signal.


In at least some other embodiments, the apparatus is an interactive ride-on toy that includes: a torso having a first torso portion and a second torso portion; a first front leg and a second front leg, each extending down from the first torso portion; a first drive motor coupled to the first front leg and to a first drive wheel; a second drive motor coupled to the second front leg and to a second drive wheel, wherein the first and second drive wheels are rotatable to propel the apparatus along a surface; a non-motorized wheel coupled to the second torso portion; a motorized neck assembly coupling a head to the first torso portion, wherein the neck assembly provides selective rotational movement of the head along both a first rotational head axis and a second rotational head axis; a rechargeable battery; a throttle switch to provide a throttle input signal; a controller for receiving the throttle input signal and selectively actuating the drive motor assemblies with power from the battery; an electrical steering position sensor for receiving a mechanical steering input upon manual rotation of the head, wherein the controller proportionally varies the applied power from the battery to the first drive motor assembly and the second drive motor assembly based on a steering position signal provided by the steering position sensor; a plurality of touch-based sensors situated in the head for providing a touch input signal; reins coupled to the head, wherein the reins include the throttle switch and a speed and direction selection switch; a speaker for emitting sounds selected by the controller; and a seat positioned on the torso.


In at least yet some embodiments, the apparatus is an interactive ride-on toy that includes: a torso having a first torso portion pivotably coupled to a second torso portion along a vertical pivot joint; a first front leg and a second front leg, each extending down from the first torso portion; a first drive motor assembly secured to the first front leg and to a first drive wheel; a second drive motor assembly secured to the second front leg and to a second drive wheel, wherein the first and second drive wheels are rotatable about a single rotational drive axis to propel the apparatus along a surface; a first rear leg and a second rear leg, each extending down from the second torso portion and including a wheel secured thereto; a motorized neck assembly coupling a head to the first torso portion, wherein the neck assembly provides selective rotational movement of the head along both a first rotational head axis and a second rotational head axis, wherein the second rotational head axis lies parallel to the rotational drive axis and perpendicular to the first rotational head axis; a rechargeable battery situated in at least one of the torso and the head; a throttle switch to provide a throttle input; a controller for receiving the throttle input and selectively actuating the drive motor assemblies using the rechargeable battery; an electrical steering position sensor for receiving a mechanical steering input via manual rotation of the head, and wherein the controller proportionally varies the applied power from the battery to the first drive motor assembly and the second drive motor assembly based on a received steering position sensor input; a plurality of touch-based sensors situated in the head for receiving touch signals from a user; wherein the head includes a mouth, eyelids, and ears, and wherein the eyelids and the ears are rotatably actuatable via a signal from the controller; reins pivotably coupled to the head, wherein the reins include the throttle switch and a speed and direction selection switch, and wherein the reins are coupled to the head via a reins pivot assembly that allows the reins to be rotated between a forward position and a back position relative to the head, and wherein the reins pivot assembly provides a reins position input signal to the controller indicating the position; a speaker for emitting sounds selected by the controller; a seat positioned on the torso; a first seat switch situated between the seat and the torso, wherein actuation of the first seat switch by a user provides a rider detected input signal; a motion sensor configured to detect the presence of another object situated in front of the first torso portion; and a mode selection switch for selecting between a first mode and a second mode, wherein the first mode directs the controller to actuate the drive wheel motor assemblies and neck assembly according to a predetermined sequence, and the second mode directs the controller to actuate the drive wheel motor assemblies only during actuation of the throttle switch.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of a toy apparatus are disclosed with reference to the accompanying drawings and are for illustrative purposes only. The toy apparatus is not limited in application to the details of construction or the arrangement of the components illustrated in the drawings. The toy apparatus is capable of other embodiments or of being practiced or carried out in other various ways. In the drawings:



FIG. 1 is a front perspective view of an exemplary embodiment of a toy apparatus;



FIG. 2 is a bottom perspective view of the apparatus of FIG. 1;



FIG. 3 is a left side view of the apparatus of FIG. 1;



FIG. 4 is a top view of the apparatus of FIG. 1;



FIG. 5 is a bottom view of the apparatus of FIG. 1;



FIG. 6 is a cross-sectional left side view of the apparatus taken along line 6-6 of FIG. 4;



FIG. 7 is a front perspective view of the apparatus of FIG. 1 with various body portions omitted for clarity;



FIG. 8 is an exemplary block diagram of the electrical components of the apparatus of FIG. 1;



FIG. 9 is a side perspective view of the apparatus of FIG. 1 with a portion of the body removed to expose internal structure;



FIG. 10 is a rear perspective view of a feature assembly and reins pivot assembly of the apparatus of FIG. 1;



FIG. 11 is an exploded view of the feature assembly, and a perspective view of the reins pivot assembly and mane, of the apparatus of FIG. 1;



FIG. 12 is a section view of a portion of the feature assembly as viewed from the right side of the apparatus of FIG. 1;



FIG. 13 is a perspective view of a motorized neck assembly of FIG. 1;



FIG. 14 is a section view of the neck assembly taken along line 14-14 of FIG. 13;



FIG. 15 is an exploded view of the neck assembly and support structure of FIG. 1;



FIG. 16 is a section view of the neck assembly taken along line 16-16 of FIG. 13;



FIG. 17 is a section view of the neck assembly taken along line 17-17 of FIG. 13;



FIG. 18 is a section view of the neck assembly taken along line 18-18 of FIG. 13;



FIG. 19 is a partially exploded bottom perspective view of the neck assembly of FIG. 15;



FIG. 20 is an exemplary flow chart describing the power on/wake sequence for the apparatus of FIG. 1;



FIG. 21 is an exemplary flow chart describing distress mode sequences for the apparatus of FIG. 1;



FIGS. 22A-22D are an exemplary flow chart describing drive mode sequences for the apparatus of FIG. 1;



FIGS. 23A-23D are an exemplary flow chart describing autonomous mode sequences for the apparatus of FIG. 1;



FIGS. 24A-24B illustrate a first exemplary sequence table for the apparatus of FIG. 1; and



FIGS. 25A-25C illustrate a second exemplary sequence table for the apparatus of FIG. 1.





DETAILED DESCRIPTION

An exemplary motorized interactive ride-on toy apparatus 10 is disclosed and discussed herein. The apparatus 10 is a ride-on toy having various physical features, sounds, and movements that allow a child to interact with the apparatus 10 in a manner similar to a “real” animal to provide a life-like simulated interactive experience. As shown in FIGS. 1-5, an exemplary embodiment of the apparatus 10 can include a ride-on toy configured to mimic a horse. Although the illustrated and discussed embodiments reference a toy horse at times, similar structure, components, and/or functionality, in whole or in part, can be utilized with other toy characters as well, including animals and non-animals, such as a dog, dinosaur, tiger, turtle, car, doll, etc.


The apparatus 10 is sized and shaped to be ridden by a child user and includes a body 11 formed from a plurality of shell pieces, such as a head shell 13, torso shell 15, leg shell 17, etc., that are coupled to each other and/or various internal components to form the overall shape and aesthetic appearance of the apparatus 10. The apparatus 10 includes a torso 12, which in at least some embodiments, has a first torso portion 14 and second torso portion 16, which can be attached by a pivot joint 18. In at least some embodiments, the pivot joint 18 includes a vertical pivot pin 20 and a pivot spring 22 to generally bias the second torso portion 16 in alignment with the first torso portion 14, while in other embodiments, the pivot joint 18 can utilize other types of pivot mechanisms. The torso 12 includes a seat 24, which can take the shape of a saddle that includes stirrups 25 on which a user's feet can rest.


As shown in FIGS. 1-10, a plurality of legs are coupled to the torso 12. In at least some embodiments the apparatus 10 includes four legs extending down from the torso 12, including a first front leg 28, a second front leg 30, a first rear leg 32 and a second rear leg 34. The legs do not bend and are rigidly secured to the torso 12 to prevent or substantially prevent movement relative to the torso, although in at least some embodiments, they can be pivotably secured to the torso 12 and each leg can include a knee joint that is spring loaded to allow the legs to bend if desired.


The apparatus 10 includes a plurality of wheels coupled to the legs, wherein the wheels allow the apparatus 10 to be propelled along a surface with or without a user thereon. To propel the apparatus 10, a plurality of the wheels are motorized. More particularly, in at least some embodiments, a first drive wheel 36 is secured to a first drive motor assembly 38 (FIG. 7), which is secured to the first front leg 28. A second drive wheel 40 is secured to a second drive motor assembly 42 (FIG. 7), which is secured to the second front leg 30. As shown in FIGS. 5 and 7, the first drive wheel 36 and second drive wheel 40 rotate about the same rotational drive axis 41. The first rear leg 32 and second rear leg 34 can include non-motorized wheels 44 secured thereto, such as freely pivotable caster-type wheels that allow the second torso portion 16 to be pulled along by the first torso portion 14 during propulsion. As shown, the first rear leg 32 and second rear leg 34 each include a separate wheel 44, while in some other embodiments, the rear torso portion 16 can be coupled to and supported by a single wheel 44.


The first and second drive motor assemblies 38, 42 can include various components, for example circuit protection devices, gears, motors, etc. In at least some embodiments, they each include a respective motor and gearbox, such as a first drive wheel motor 46, first gearbox 48, second drive wheel motor 50, second gearbox 52, while in some other embodiments, the first and second drive motor assemblies 38, 42 do not include a gearbox and the motors 46, 50 are directly coupled to the drive wheels 36, 40. Additionally, in at least some embodiments the drive wheel motors 46, 50 are direct current motors, while in other embodiments, other known types of motors can be utilized.


The apparatus 10 further includes a motorized neck assembly 54 (FIG. 13, discussed in detail below) coupling a head 56 to the first torso portion 14, wherein the neck assembly 54 provides selective rotational movement of the head 56 along both a first rotational head axis 58 and a second rotational head axis 60, wherein in at least some embodiments, the second rotational head axis 60 lies perpendicular to the first rotational head axis 58 and parallel to the rotational drive axis 41. Rotation of the head 56 along the first rotational head axis 58 is indicated by directional arrow 62 (FIGS. 1, 3, and 14) and provides a left side to right side movement, and rotation of the head 56 along the second rotational head axis 60 is indicated by directional arrow 64 (FIGS. 1, 6, and 14), and provides a nodding up and down movement of the head 56.


The head 56 further includes a mouth 66, a mane 67, a pair of eyes 68, a pair of motorized eyelids 70 configured to open and close at least partially over the eyes 68, and a pair of motorized ears 72 configured to rotate. The eyelids 70 and ears 72 are actuated by a feature assembly 74 (FIG. 10) discussed in greater detail below. A plurality of touch-based sensors are included within the apparatus 10 to provide a touch input signal indicating that a user has touched a portion of the apparatus 10 (e.g., petting, brushing, feeding, etc.). The touch-based sensors can include various types of sensors, for example, capacitive, resistive, tactile, etc. Although numerous touch-based sensors can be provided in various locations throughout the apparatus 10, in at least some embodiments, the touch-based sensors can include a front head touch sensor 76, a left head touch sensor 78, a right head touch sensor 80, a mouth sensor 82, and a mane sensor 84. Further, in at least some embodiments the front head touch sensor 76, left head touch sensor 78, and right head touch sensor 80 are each capacitive-based sensors that sense a user touch, and the mouth sensor 82 and the mane sensor 84 are tactile switches that actuate when depressed by a user, such as when an object (e.g., a toy carrot) is inserted in the mouth 66, or when a user brushes the mane 67. The mane 67 is comprised of a rigid or semi-rigid material that is hinge-mounted to the head 56 by hinge pins 86 at one end to allow movement of the other end, which is spring biased away from the mane sensor 84 by a mane spring 87, such that a brushing motion on the mane 67 causes the mane sensor 84 to be activated by the downward motion of the mane 67. It is to be understood that the term touch-based sensor can be broadly construed to include various types of sensors that are activated by physical interaction with the user, as well as sensors that are activated by close proximity with a user without physical interaction. In addition, it shall be understood that the terms “sensor” and “switch” can be interchangeable with the understanding that either can be utilized to communicate an indication of their position or sensed state.


A mechanical steering component is provided to allow steerage of the apparatus 10 during propulsion. In at least some embodiments, to mimic a horse, the steering component is in the shape of reins 90, which are coupled to the head via a reins pivot assembly 92. The reins pivot assembly 92 allows the reins 90 to be rotated by a user between a forward position and a back position relative to the head 56. As best seen in FIGS. 9-11, the reins pivot assembly 92 includes a bit shaft 93 that has rein arms 95 on either end that form a portion of the reins 90. The bit shaft 93 rotates within rein sleeves 98 mounted to or formed within the body 11, adjacent the mouth 66, to hold the bit shaft 93 in position relative to the mouth 66, while allowing rotation with respect to the body 11. To detect the position of the reins 90, a reins forward sensor 94 and a reins back sensor 96 are positioned at either of the rein sleeves 98 and protrude therefrom (FIG. 10); activation of either sensor 94, 96 provides a reins position input signal. The sensors 94, 96 are actuated by a bit shaft disc portion 99 during movement of the reins 90. The bit shaft disc portion 99 is shown in FIG. 11, but omitted from FIG. 10 to facilitate viewing of the sensors 94, 96. The bit shaft disc portion 99 is formed with or secured to the shaft 93 to rotate with the reins, and has a varied thickness that engages the protruding sensors 94, 96 at the extents of rotation. The reins 90 further include a throttle switch 110 (for providing a throttle input signal) for user actuation to command the apparatus 10 to be propelled in a forward or reverse direction via the drive wheels 36, 40. A speed control switch 112 is also provided on the reins 90 to allow a user to select a forward or reverse speed. In at least some embodiments, the speed control switch 112 includes selections for a low forward speed (FWD1), a high forward speed (FWD2), and a low reverse speed (REV), although in other embodiments, fewer or additional speed settings can be provided.


The apparatus 10 includes a mode selection switch 114, provided on the torso 12, for selecting various modes, such as an autonomous mode and a ride mode. The mode selection switch 114 can include a status light 115 (e.g., an LED), integrated therewith or separately mounted, to provide a colored indication of the selected mode or other status information. The mode selection switch 114 can also be used to initiate various actions other than mode selection, such as described in one or more sequences herein. A body tilt sensor 116 can be provided and mounted in the apparatus 10 to sense when the apparatus 10 is not in an upright position, and serves as a safety device to prevent operation of the motors when the apparatus 10 is not upright.


The apparatus 10 includes a rechargeable battery 118 interconnected to a charge port 120 and a charger connect switch 122. The charge port 120 is configured to receive a mating charge plug connected to a typical wall plug power supply adapter that converts household AC power to DC power. The charger connect switch 122 is physically engaged by the charge plug when inserted, causing the charger connect switch 122 to electrically disconnect the battery 118, thereby preventing battery power from activating the apparatus 10 while it is charging. In addition, a user operable main power switch 123 is included to provide a disconnect from the battery 118 to a controller 130. If the main power switch 123 is left in the ON position, thereby providing the controller 130 access to power, then the controller 130 may initiate a low power consumption sleep mode after a period of inactivity.


The controller 130 includes one or more processors to facilitate operation of the apparatus 10 using a software program stored on one or more memory devices. The controller 130 monitors the various sensors to receive status input signals and provides outputs to the various motors, speaker, light, etc. to induce action, such as motor movement, illumination, sounds, etc. based on the software program. The controller 130 can be comprised of numerous components including multiple circuit boards, integrated circuit chips (ICs), processors, memory devices, discrete components, etc., that are interconnected to communicate information and commands therebetween. As shown in the exemplary block diagram for the apparatus 10 provided in FIG. 8, the controller 130 in at least some embodiments includes a first circuit board PCB-1, which can be situated in the torso 12, and a second circuit board PCB-2, which can be located in the head 56. Each circuit board can include a processor 132 and memory device 134, with at least one of the processors 132 serving to process and implement the software program. The memory device 134 can include discrete memory devices or memory embedded in a processor IC, and can further be comprised of any one of several known memory types, such as RAM, ROM, EPROM, EEPROM, SDRAM, etc.


In at least some embodiments, the first circuit board PCB-1 includes a GPCE4P096UA (or GPCE4P096A) IC as manufactured by GeneralPlus in Taiwan, and the second circuit board PCB-2 includes a GPC11033D (or GPC11024) IC as manufactured by GeneralPlus in Taiwan, although in other embodiments, other known ICs could be utilized to provide the functionality necessary to perform the operations described herein. These or other exemplary ICs can include functionality for interfacing with the described sensors to process inputs, playing sounds through the speaker 140, operating motors at varied power levels (including PWM), etc. It is to be understood that the circuit boards can include various additional components, such as resistors, capacitors, relays, fuses, solid state switches, diodes, etc., which are interconnected with each other, or with other components such as the input and output devices (e.g., sensors, motors, etc.) described herein.


Further, in at least some embodiments, the memory device 134 on the first circuit board PCB-1 stores the software program for operating the apparatus 10 as described herein. The software program includes instructions for evaluating inputs from the various sensors and providing outputs to generate actions by the apparatus 10, and can include logic to perform the sequences detailed herein as well as various other functions. Although numerous actions have been detailed below with reference to various flowcharts, it is to be understood that numerous other actions can be performed and that such a listing is not intended to be exhaustive; further, such actions can be modified to provide similar effects (e.g., replacing a horse's neighing sound with a dog's barking sound). In addition, the various exemplary ICs (e.g., GPCE4P096UA, GPC11033D, etc.) include pre-programmed control instructions for processing inputs and outputs as detailed in their published data sheets, which are incorporated herein by reference in their entirety. As such, the software program stored on the memory device 134 generally includes utilization of the features and instructions found on such ICs, although the stored software program could include similar features and instructions necessary to perform the various operations described herein with or without the specific ICs noted, by utilizing other or similar ICs to execute the stored software program. Further, it is to be understood that the software program can take many forms and be comprised of any one of various programming languages serviceable to facilitate the described actions herein.


The software program includes pre-determined power levels for the controller 130 to provide to the motors based on various received inputs (e.g., speed selection, steering position, etc.). The controller 130 includes motor control components that provide an output of power from the battery 26 to the various motors of the apparatus 10. More particularly, electrical actuation of the motors described herein by the controller 130 can be performed using any of various combinations of solid-state and mechanical switching components and configurations. In at least one embodiment, the controller 130 can include an array of solid state relays coupled to the first drive wheel motor 46 and second drive wheel motor 50, wherein the relays are energized by a plurality of solid state switches (e.g., MOSFETS, etc.) that are switched ON/OFF by outputs from the processor 132 (e.g., GPCE4P096UA) to provide a specific polarity and selected level of power to achieve a desired speed and rotation direction. Similarly, power can be provided to the head pivot motors 220, 230 and the feature motor 142 to actuate the motors with a specific power level depending on received inputs.


The controller 130 can be configured to supply the motors with power from the battery 26 in various manners. For example, the power output from the battery 26 to a motor can be directly switched to provide a constant full or divided portion of the available battery power (e.g., a voltage divider circuit, etc.), or the power output can be a variable power level that is varied using signal modulation (e.g., pulse width modulation). Using pulse width modulation to slowly increase the average voltage level to the motors that move various body parts, such as the head 56, can result in a smooth movement of body parts, which can provide a more realistic impression of an animal movement. This is in notable contrast to direct application of a full or divided power level to a DC motor, which would result in a quick and jerky movement of the body part. Using pulse width modulation to vary the power level supplied to the drive wheel motors 46, 50 can also allow for smooth motion of the apparatus 10 along a surface, but in at least some embodiments, is not utilized. In addition, further variations of power delivery to the motors can include an initial delay, stepped levels, or intermittent delays.
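
For illustration only, the following short Python-style sketch shows the ramping idea described above: the duty cycle (average voltage) supplied to a body-part motor is stepped up gradually rather than applied all at once. The set_motor_duty() output used here is a hypothetical placeholder, not an interface of the apparatus 10 or its ICs.

    import time

    def set_motor_duty(motor_name, duty):
        # Hypothetical PWM output; in this sketch it simply prints the command.
        print(f"{motor_name}: duty={duty:.2f}")

    def ramp_motor(motor_name, target_duty, step=0.05, interval_s=0.02):
        # Gradually increase the average voltage so the motor starts smoothly
        # instead of jerking straight to the target power level.
        duty = 0.0
        while duty < target_duty:
            duty = min(duty + step, target_duty)
            set_motor_duty(motor_name, duty)
            time.sleep(interval_s)

    # Example: ease the head tilt motor up to roughly 60% power.
    ramp_motor("second_head_pivot_motor", 0.6)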


The various motors described herein can include various types and configurations of motors known in the art, for example, continuous DC, stepper, and servo motors, and can include circuit protection components as desired. It shall be understood that actuation of a motor as referenced herein indicates the transmission of power to the motor to induce a rotational output therefrom.



FIG. 8 provides an exemplary block diagram of the various interconnections between the various components found in the apparatus 10 and the controller 130. Other configurations of more or fewer circuit boards can be utilized to accomplish the same purpose. In addition, some of the components, such as a motion sensor 136, can include their own circuit board and IC to perform the well-known inherent function of detecting motion, as well as to receive and send the necessary inputs and outputs to the controller 130. The motion sensor 136 is in at least some embodiments an ultrasonic motion sensor that includes a transmitter and receiver, and can be positioned in the front of the first torso portion 14 to detect the presence of another object situated in front of the apparatus 10. Although an ultrasonic sensor is preferred, other sound-based motion sensors, as well as other types of motion sensors, such as light-based motion sensors, can also be utilized to perform the same functions.


Referring to FIG. 9, a sectional view of the head 56 is provided showing, among other things, the feature assembly 74, the reins pivot assembly 92, the mane sensor 84, the mouth sensor 82, and a speaker 140. Referring additionally to FIGS. 10-12, various views of the feature assembly 74 are also provided. As noted above, the feature assembly 74 provides motorized movement of the eyelids 70 and the ears 72 via commands from the controller 130. The feature assembly 74 utilizes a feature motor 142 to perform both movements. More particularly, the feature motor 142 rotates a feature disc 144 having grooved tracks on both a first side 146 and a second side 148, and a center shaft 150 that protrudes on both sides of the feature disc 144. The first side 146 includes first tracks 152 that are engaged with an eyelid lever 154, such as by receiving therein a protruding post extending perpendicularly from the eyelid lever 154, such that it can only move within the first tracks 152. The eyelid lever 154 includes an oblong center portion 156 sized to engage with the center shaft 150 and allow for only rotational and longitudinal movement therewith. The eyelids 70 are each rotatably secured to an eyelid pivot rod 160 at their center, and further secured to a bar portion 164 of the eyelid lever 154. As the feature motor 142 rotates the feature disc 144, the eyelid lever 154 moves longitudinally towards and away from the center shaft 150 as directed by the first tracks 152, causing the eyelid lever 154 to be moved in and out relative to the feature assembly housing 158 (which is shown as two separate halves), thereby pushing and pulling the bar portion 164 to cause the eyelids 70 to rotate on the eyelid pivot rod 160. Rotation of the ears 72 is performed by a series of gear interactions that begin with an ear lever 166 that includes an oblong center portion 168 sized to engage with the center shaft 150 and allow for only rotational and longitudinal movement therewith. The second side 148 of the feature disc 144 includes second tracks 172 that are engaged with the ear lever 166, such as by receiving therein a protruding post extending perpendicularly therefrom, such that rotation of the feature disc 144 causes the ear lever 166 to be moved as directed by the second tracks 172, causing the ear lever 166 to be moved in and out (up and down) relative to the feature assembly housing 158. As best shown in FIG. 12, the ear lever 166 includes an upper toothed portion 174 that engages a toothed center gear 176 via enclosure by the feature assembly housing 158. The center gear 176 is fixed to a center rod 178 along with first and second end gears 180, 182, such that rotation of the center gear 176 by the ear lever 166 causes the first and second end gears 180, 182 to rotate. The first and second end gears 180, 182 are further rotatably engaged with first and second ear gears 184, 186, which are attached to the ears 72. It is noted that for clarity only a portion of the ears 72 is shown in FIGS. 10-12.


Referring now to FIGS. 13-19, the motorized neck assembly 54 is illustrated in an exemplary perspective view. FIGS. 14-19 further illustrate the neck assembly 54 of FIG. 13 along with a head collar 204 and a neck sleeve 206. The neck assembly 54 includes a head pedestal 200 having a cylindrical pedestal upper portion 202 sized and shaped to be secured to a mating head collar 204, and a partially spherical pedestal lower portion 207 sized and shaped to move at least partially within the neck sleeve 206, noting that various other shapes and sizes can be utilized for these components and still allow for the described functionality. The head collar 204 is formed with or otherwise secured to the head 56 to allow the head 56 to be installed and secured to the head pedestal 200. The head pedestal 200 also includes a central pivot arm 208 that extends along the second rotational head axis 60.


The neck assembly 54 further includes a head pivot base 210 having a pivot disc portion 212 that is rotatably supported and secured to a base ring 214. The base ring 214 and the neck sleeve 206 are each secured to the first torso portion 14, and provide support for the neck assembly 54, while allowing movement of the head 56 along multiple axes. The base ring 214 includes an interior circular channel 216 sized and shaped to enclose and support a plurality of disc rollers 218 positioned along the pivot disc portion 212, thus allowing the head pivot base 210 to rotate relative to the base ring 214, and therefore relative to the first torso portion 14. To facilitate rotation of the head pivot base 210, a first head pivot motor 220 (FIG. 14) is secured to the base ring 214, wherein the first head pivot motor 220 includes a first pivot gear 222 that engages an arced gear wall 224 formed on the bottom side 226 of the pivot disc portion 212. Actuation of the first head pivot motor 220 by the controller 130 causes the pivot disc portion 212, as well as the head pedestal 200 and head 56 coupled thereto, to be rotated left or right about the first rotational head axis 58.


Referring further to FIGS. 13-19, to facilitate rotation of the head 56 about the second rotational head axis 60, a second head pivot motor 230 is provided. The second head pivot motor 230 is mounted to the base ring 214 and includes a rotatable lever disc 232 that rotates upon actuation of the second head pivot motor 230 by the controller 130. The lever disc 232 includes a disc post 234 that rides inside a longitudinal channel 236 of a lever arm 238, wherein the lever arm 238 extends upward and is secured to the head pedestal 200 (FIG. 18). The pivot arm 208 of the head pedestal 200 is rotationally situated within a base sleeve 240 of the head pivot base 210 (FIG. 15), and thereby provides the second rotational head axis 60 for the head 56 to rotate relative to the first torso portion 14. More particularly, when the second head pivot motor 230 is actuated by the controller 130, the rotating lever disc 232 causes the lever arm 238 to move forward or backward, thereby causing the head pedestal 200, and the attached head 56, to rotate about the second rotational head axis 60. As described, the first head pivot motor 220 and second head pivot motor 230 can be utilized to selectively rotate the head 56 in multiple directions (multi-directional), including clockwise and counter-clockwise (a.k.a. right and left, per directional arrow 62 of FIGS. 1 and 14) using the first head pivot motor 220, and up and down (per directional arrow 64 of FIGS. 1 and 14) using the second head pivot motor 230. Further, as separate motors are used for separate rotations, the motors can be actuated simultaneously to provide compound movement of the head 56 along both the first rotational head axis 58 and the second rotational head axis 60.


In addition to providing motorized movement of the head 56, the neck assembly 54 also includes an integrated electrical steering position sensor 243 (FIG. 19) that translates a mechanical steering input, via manual rotation of the head 56 by a user, into an electronic steering position signal that is communicated to the controller 130. More particularly, a steering position disc 242 is mounted to the head pivot base 210 via a disc mount 244. The steering position disc 242 includes a plurality of progressively spaced arced contact strips 246. A contact sensor array 248 comprised of a row of contacts 250 is provided on a lower mount 252, which is secured to the base ring 214, with the contacts 250 wired to the controller 130 to provide a conduction signal for each contact 250. The contact sensor array 248 is fixed in position, while the contact strips 246 rotate with the head pivot base 210 as a user turns the head 56. Rotation of the head 56 therefore causes the contact sensor array 248 to lose or gain contact with specific contact strips 246, and the spacing of the contact strips 246 allows the controller 130 to determine, from the conductivity of each contact 250, when the head 56 has been rotated a specific number of degrees to the left or to the right. In at least some other embodiments, other types of position sensing mechanisms can be utilized as well, such as a potentiometer, rotary encoder, etc.
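
By way of illustration only, the following Python-style sketch shows one way such a contact pattern could be decoded into a discrete steering position. The actual strip layout, contact count, and firmware of the apparatus 10 are not specified at this level of detail, so the arrangement assumed below (three contacts per side, with more contacts conducting as the head is rotated further) is purely hypothetical.

    def decode_steering_position(left_contacts, right_contacts):
        # left_contacts / right_contacts: sequences of booleans, one per contact,
        # True when that contact is conducting against a contact strip 246.
        left = sum(bool(c) for c in left_contacts)
        right = sum(bool(c) for c in right_contacts)
        if left and right:
            raise ValueError("inconsistent contact reading")
        if left:
            return "L" + str(left)      # e.g., two left contacts closed -> "L2"
        if right:
            return "R" + str(right)
        return "CENTER"

    # Example: head rotated moderately left closes two of the left-side contacts.
    print(decode_steering_position([True, True, False], [False, False, False]))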


To provide further verification of the position of various components such as the head 56, eyelids 70, etc., various additional position sensors can be provided. Such position sensors can include an eyelid position sensor 260 (FIG. 11) that confirms when the feature motor 142 is situated to position the eyelids in a closed or open position, a head home position sensor 262 (FIG. 15) to detect when the head 56 is fully rotated upwards from the ground, a head tilt home position switch 263 positioned opposite the head home position sensor 262 to detect when the head 56 is fully rotated towards the ground, a head center position sensor 264 to detect when the head 56 is centered (facing straight forward), a head left position sensor 266 (FIG. 19) to indicate when the head 56 is fully rotated to the left, and a head right position sensor 268 (FIG. 19) to indicate when the head 56 is fully rotated to the right. In addition, a spring-biased seat mount 269 can be positioned under the seat 24, wherein the seat mount 269 includes a first seat switch 270 and a second seat switch 272. The first seat switch 270 can detect when a user is sitting on the seat 24 and satisfies a predetermined acceptable weight limit, thereby providing a rider detected input signal, wherein the second seat switch 272 can be calibrated to detect when a user sitting on the seat exceeds the predetermined acceptable weight limit. Both switches 270, 272 can send an input signal to the controller 130 and be used as control parameters for enabling and disabling various operations.


The apparatus 10 includes various modes of operation, such as autonomous mode and drive mode that provide interactive experiences for a user. When in drive mode, use of the steering position sensor 243, the throttle switch 110, and the speed control switch 112 allow a user to propel the apparatus 10 in a chosen direction by utilizing the first and second drive motor assemblies 38, 42 to rotate the drive wheels 36, 40. The user can either be sitting on the seat 24 with the reins 90 in a back position to experience a ride by the apparatus 10, or can rotate the reins 90 to a forward position and guide the apparatus 10 to follow the user.


As discussed above, the apparatus 10 can be steered by the reins 90. When a user wishes to steer the apparatus 10 in a specific direction, the reins 90 are used to rotate the head 56 to the left or right along the first rotational head axis 58. As the steering position sensor 243 can detect numerous angles of head rotation, the further the head 56 is rotated to the left or right, the sharper the turn commanded by the controller 130. As such, the controller 130 proportionally varies applied power from the battery 26 to the first drive motor assembly 38 and the second drive motor assembly 42 based on the steering position signal. In at least some embodiments, the steering position sensor 243 can detect several distinct positions, which can include: a center position (head is not rotated and pointed straight ahead, at zero degrees of rotation), three positions of rotation to the left based on increasing degrees of rotation (L1, L2, L3) relative to center, and three positions of rotation to the right based on increasing degrees of rotation (R1, R2, R3). The positions extend over several degrees in both left and right rotation directions and can be adjusted as desired during programming. For example, the first left position L1 can extend from 1-10 degrees of rotation to the left from center (zero degrees), the second position L2 from 11-15 degrees, and the third position L3 from 15-25 degrees. Similarly, the first right position R1 can extend from 1-10 degrees of rotation to the right from center, the second position R2 from 11-15 degrees, and the third position R3 from 15-25 degrees. As such, when a user moves the reins 90 to mechanically rotate the head 56 (similar to riding a real animal) and thereby directs the apparatus 10 to move in a specific direction, the steering position sensor 243 provides an electronic position signal to the controller 130 indicating the user's desired direction.
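
Purely as an illustration of this banding, the sketch below maps a head rotation angle (in degrees, with negative values to the left and positive values to the right by convention here) onto the seven positions using the example ranges given above; the exact breakpoints are programmable design choices rather than fixed values.

    def steering_zone(angle_deg):
        # Example banding only; breakpoints follow the ranges described above
        # and can be adjusted as desired during programming.
        magnitude = abs(angle_deg)
        if magnitude < 1:
            return "CENTER"
        side = "L" if angle_deg < 0 else "R"
        if magnitude <= 10:
            return side + "1"
        if magnitude <= 15:
            return side + "2"
        return side + "3"   # out to roughly 25 degrees at full rotation

    print(steering_zone(-12))   # head rotated 12 degrees left -> "L2"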


To move the apparatus 10, the user first selects a desired speed/direction from the speed control switch 112. FWD1 is a low forward speed and therefore requires the controller 130 to provide a first level of power to the first and second drive wheel motors 46, 50 to propel the apparatus 10. FWD2 is a high forward speed and therefore requires the controller 130 to provide a second level of power that is greater than the first level in order to propel the apparatus 10 at a higher speed. When REV is selected, the controller 130 provides a low level of power similar to the low forward speed, but with an opposite rotation direction from FWD1 and FWD2 to propel the apparatus 10 in reverse.


When the head 56 is at the center neutral position, no steering instruction is provided, and therefore when a user actuates the throttle switch 110 the controller 130 actuates (i.e., causes a rotational output from) both the first and second drive wheel motors 46, 50 simultaneously with equal power levels. As the first and second drive wheels 36, 40 are both in the front and on the same rotational drive axis 41, the apparatus 10 moves in a straight or substantially straight direction, with the rear torso portion 16 merely following the direction of the front torso portion 14. If the user wishes to steer the apparatus 10 in a specific direction, then the output power provided by the controller 130 differs between the first and second drive wheel motors 46, 50. More particularly, to effectuate steering of the apparatus 10 in a chosen direction, the controller 130 reduces or eliminates the power level provided to the inside drive wheel so that it rotates slower than the outside drive wheel.


The controller 130 is configured with predetermined power level ratios for applying power to the first and second drive wheel motors 46, 50 based on the steering angle and the speed and direction setting, namely L1-L3, R1-R3, FWD1, FWD2, and REV. For example, when the head 56 is rotated left (the user moves the reins to their right) to the first sensed position (L1), a predetermined level of power for L1 is transmitted from the controller 130 to the first drive wheel motor 46 (right side wheel) and a lesser predetermined level of power for L1 is transmitted from the controller 130 to the second drive wheel motor 50 (left side wheel). The disparity in power causes the apparatus 10 to begin to turn left, and the amount of power reduction provided to the left side wheel determines the rate at which the apparatus 10 will turn left. Therefore, if the user turns the head further to the left to L2, the power reduction to the left side wheel is increased, and so on for L3. In addition to the option of providing a reduced power level to the left side wheel, it may be desired or necessary to cease all power, or even apply a reverse power, to the second drive wheel motor 50 in order to slow the left side wheel down sufficiently to facilitate a desired turning action.


Steering the apparatus 10 to the right follows a similar principle, except that the right side wheel must now be slowed to effectuate a right hand turn. More particularly, when the head 56 is rotated right (the user moves the reins to their left) to the first sensed position (R1), a predetermined level of power for R1 is transmitted from the controller 130 to the second drive wheel motor 50 (left side wheel) and a lesser predetermined level of power for R1 is transmitted from the controller 130 to the first drive wheel motor 46 (right side wheel). The disparity in power causes the apparatus 10 to begin to turn right, and the amount of power reduction provided to the right side wheel determines the rate at which the apparatus 10 will turn right. Therefore, if the user turns the head further to the right to R2, the power reduction to the right side wheel is increased, and so on for R3. In addition to the option of providing a reduced power level to the right side wheel, it may be desired or necessary to cease all power, or even apply a reverse power, to the first drive wheel motor 46 in order to slow the right side wheel down sufficiently to facilitate a desired turning action. The predetermined levels of power that the software program utilizes for each steering position, as well as for FWD1, FWD2, and REV, can be chosen based on numerous factors, such as the overall weight of the apparatus 10, the allowable user weight, the power output of the battery, and so on; therefore, these power levels will be relative to each other to perform their chosen function, but can all be higher or lower depending on various design choices.
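
To make the proportional relationship concrete, the sketch below computes a pair of wheel power commands from a speed setting and a sensed steering position. The base power levels and turn reduction factors are invented placeholders for illustration; as noted above, the actual predetermined values are design choices.

    BASE_POWER = {"FWD1": 0.5, "FWD2": 1.0, "REV": -0.4}   # sign indicates rotation direction
    TURN_FACTOR = {"CENTER": 1.0,
                   "L1": 0.6, "L2": 0.3, "L3": 0.0,
                   "R1": 0.6, "R2": 0.3, "R3": 0.0}

    def wheel_powers(speed_setting, zone):
        # Returns (right_wheel_power, left_wheel_power). Turning left slows the
        # left (inside) wheel; turning right slows the right (inside) wheel.
        base = BASE_POWER[speed_setting]
        factor = TURN_FACTOR[zone]
        if zone.startswith("L"):
            return base, base * factor
        if zone.startswith("R"):
            return base * factor, base
        return base, base

    # Low-speed forward with the head turned moderately left (L2):
    print(wheel_powers("FWD1", "L2"))   # -> (0.5, 0.15)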


In addition to being self-propelled, the apparatus 10 includes numerous other interactive features, which can be performed simultaneously or separately. As the apparatus 10 is capable of performing various actions to provide an interactive experience, such as motorized head rotation, body propulsion, eyelid blinking, emitting of animal specific sounds, dancing, etc., sequence tables have been provided in FIGS. 24A-24B and 25A-25C that detail various exemplary sequences that can be executed by the controller 130. Although not exhaustive, many of the sequences are further detailed in the various flowcharts provided in FIGS. 20-23D. These flowcharts can also include various global functions that dictate safe operation of the apparatus 10, noting for example that if at any time the body tilt sensor 116 or the second seat switch 272 (overweight limit) is sensed as on, the controller 130 immediately ceases power to all the motors and goes to distress mode. The sequence tables provide a sequence name (e.g., SEQ_RideOnStart) followed by a row of actions to be executed for that specific called sequence. More particularly, the sequence tables include the following columns: Sequence (reference name of the sequence); Audio Content (specific sound file played through the speaker 140); Motor Description (describes head or body motor movements (e.g., first or second pivot motor actuation, first and second drive wheel motor actuation)); and Eyes (indicates eyelid movement via the feature motor 142). The sequence tables in FIGS. 24A-24B and 25A-25C are merely exemplary, and each sequence described can include fewer or more actions, occurring in the same or varied orders.
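
As a loose illustration of how such a table could be held in software, the sketch below defines one row per sequence with the four columns described above; the example row contents (sound name, motor action, eye action) are placeholders and are not taken from the actual tables in the figures.

    from dataclasses import dataclass

    @dataclass
    class SequenceRow:
        sequence: str            # reference name, e.g., "SEQ_RideOnStart"
        audio_content: str       # sound file played through the speaker 140
        motor_description: str   # head or drive wheel motor movements
        eyes: str                # eyelid movement via the feature motor 142

    # Placeholder entries for illustration only.
    SEQUENCE_TABLE = {
        "SEQ_RideOnStart": SequenceRow("SEQ_RideOnStart", "neigh_short",
                                       "nod head down and up", "blink once"),
        "SEQ_MovementPrompt": SequenceRow("SEQ_MovementPrompt", "snort",
                                          "turn head slightly left then right",
                                          "blink twice"),
    }

    def run_sequence(name):
        row = SEQUENCE_TABLE[name]
        # In firmware each column would be dispatched to the speaker, the motor
        # control outputs, and the feature assembly, respectively.
        print(row.audio_content, "|", row.motor_description, "|", row.eyes)

    run_sequence("SEQ_RideOnStart")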


Referring now to FIG. 20, a flow chart 300 is provided that illustrates various exemplary sequences that can occur when the apparatus 10 is activated by a user. To begin, at step 302, to initiate activation the main power switch 123 is switched on. If the main power switch 123 is already switched on and the apparatus 10 is in sleep mode, then activation can occur by various interactions, such as sitting on the seat 24 (activating the seat sensor), pushing the mode selection switch 114 (horseshoe button), pushing the throttle switch 110, or triggering the mouth sensor (feeding) or mane switch (brushing the mane). Once activated, the controller 130 checks at step 306 if the apparatus 10 has been placed in factory test mode; if not, then the controller 130 checks the voltage of the battery 26 at step 308 to determine if the battery voltage is sufficiently high. If a low voltage condition is detected at step 308, then at steps 310 and 312 the controller 130 activates the status light 115 to flash red several times, indicating to the user that the battery needs to be charged, and then sleep mode is activated to conserve power.


If no low voltage condition is detected at step 308, the controller 130 checks if the body tilt sensor 116 is activated, indicating that the apparatus 10 is not upright and therefore is not safe for use; if yes, then distress mode is activated in step 312. If no, then the controller 130 checks in step 314 if the first seat switch 270 is activated, indicating a user is sitting on the seat. If activated, then in step 316 the controller 130 checks if the second seat switch 272 is activated, indicating the apparatus 10 is overloaded; if so, then distress mode is activated in step 317, and if not, then drive mode is activated in step 318. If the first seat switch 270 is not activated, then in step 320 the controller 130 checks if the reins are in a forward or backward position. If the reins 90 are in the forward position (indicating that the user wishes to lead the apparatus 10 rather than ride), then drive mode is activated in step 318. If the reins 90 are not sensed in the forward position in step 320, then autonomous mode is activated in step 322. As such, when the apparatus 10 is initially activated, the controller 130 will place it in one of various modes, such as autonomous, drive, distress, or sleep.
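
Condensing the decisions above into pseudocode form, the following sketch shows one possible ordering of the wake-time checks; sensor reads are represented as plain boolean inputs and the function simply names the resulting mode, whereas the actual flow chart 300 also covers factory test mode, status light behavior, and other details.

    def select_wake_mode(battery_low, tilted, rider_seated, overweight, reins_forward):
        # Ordering follows flow chart 300: battery, tilt, seat switches, reins.
        if battery_low:
            return "sleep"        # status light flashes red, then sleep to conserve power
        if tilted:
            return "distress"     # body tilt sensor 116 indicates the toy is not upright
        if rider_seated:
            return "distress" if overweight else "drive"
        return "drive" if reins_forward else "autonomous"

    # A rider of acceptable weight sitting on the seat puts the toy in drive mode.
    print(select_wake_mode(False, False, True, False, False))   # -> "drive"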


Referring to FIG. 21, a flow chart 400 is provided that illustrates an exemplary sequence that can occur when the apparatus 10 enters distress mode. Beginning at step 402, when distress mode is first entered, the apparatus executes a distress sequence in which the status light 115 illuminates red, the eyelids 70 blink repeatedly, and the speaker plays a distress sound to alert the user of a distress condition. At step 404 the controller 130 checks the body tilt sensor 116 and the second seat switch 272, and if either one remains on for greater than 15 seconds, then sleep mode is activated at step 406. If both are sensed as off, then at step 408 the controller 130 checks if the first seat switch 270 is on; if yes, then drive mode is activated at step 410, and if no, then autonomous mode is activated at step 412. It is noted that for safety purposes, various sensors are continuously monitored when the apparatus is powered (i.e., the main power switch 123 is on). For example, if the body tilt sensor 116 or second seat switch 272 is sensed as on by the controller 130, all or some of the apparatus motors can be deactivated and distress mode activated. Similarly, if the controller 130 detects a battery low voltage condition, the status light 115 will flash red, indicating to the user that the battery needs to be charged, and sleep mode is activated.
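
In the same illustrative style as the earlier sketches, the distress-mode checks of flow chart 400 might be reduced to something like the following, where the elapsed fault time is supplied by the caller; this is only an assumed simplification of the behavior described above.

    def distress_mode_next(tilted, overweight, rider_seated, fault_seconds):
        # Flow chart 400: sleep after a persistent fault, otherwise return to
        # drive or autonomous mode once the fault clears.
        if tilted or overweight:
            return "sleep" if fault_seconds > 15 else "distress"
        return "drive" if rider_seated else "autonomous"

    # Fault cleared with no rider on the seat: resume autonomous mode.
    print(distress_mode_next(False, False, False, 3.0))   # -> "autonomous"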


Referring now to FIGS. 22A-22D, a flow chart 500 is provided that illustrates various sequences that can occur when the apparatus 10 has been placed in drive mode. Drive mode can be activated in several ways, such as through a power on/wake sequence (step 504) as discussed in flow chart 300, or by actuation of the mode selection switch 114 (step 502) when in autonomous mode. Whether drive mode is activated through step 502 or 504, the sequence moves to step 506, which begins with the controller 130 executing SEQ_RideOnStart (see sequence tables for description). Then, in step 508, the controller 130 checks if the reins 90 are in the forward position; if yes, then in step 510 FWD2 is disabled along with the playing of music, and if no, then in step 512 FWD2 and music are enabled.


Then at step 514, the controller 130 checks the throttle and speed control switch inputs. If the throttle switch 110 is sensed as not being actuated by the user, as noted in step 516 (FIG. 22B), then in step 518 any power to the drive wheel motors 46, 50 is ceased and, if the drive wheel motors 46, 50 were powered at the time of cessation, the controller 130 executes SEQ_RideOnStop and advances to step 520, wherein the controller 130 starts a timer and waits for one of a plurality of events to occur. If none of the events described below occur within 30 seconds, as noted in step 522, then in step 524 the controller 130 executes SEQ_MovementPrompt and returns to step 518. If none of the events occur within 40 seconds, as noted in step 526, then in step 528 the controller 130 executes SEQ_MovementPrompt, and advances to step 530 to activate sleep mode. The various possible events that can be detected at step 520 include the detection of various inputs being activated by a user. More particularly, if the user touches the front head touch sensor 76 for more than 0.5 seconds at step 532, then at step 534 the controller 130 executes SEQ_NosePetting[1, 2, or 3] and returns to step 518. If the user touches either the left or right head touch sensor 78, 80 at step 536, then at step 538 the controller 130 executes a random petting response chosen from SEQ_RidingLeftPetting[1, 2, or 3] if the left head touch sensor 78 was sensed, and chosen from SEQ_RidingRightPetting[1, 2, or 3] if the right head touch sensor 80 was sensed. Then, at step 540, if the activation of the left or right head touch sensor 78, 80 at step 536 ceased within three seconds, the process returns to step 518; if the activation continued for greater than three seconds, the process proceeds to step 542, where the controller 130 executes SEQ_LongLeftPetting[1 or 2] if the left head touch sensor 78 was sensed and SEQ_LongRightPetting[1 or 2] if the right head touch sensor 80 was sensed. Step 542 continues to run in a loop as long as the user remains in contact; when contact stops, the process returns to step 518. From step 520, if the user activates the mane sensor 84 (e.g., brushing the mane 67) as in step 544 (FIG. 22C), then in step 546 the controller 130 executes SEQ_ManeBrushing[1-4], beginning with SEQ_ManeBrushing1 and incrementing by one for each additional sensor activation (brush action) detected, and then the process returns to step 518. From step 520, if the user activates the mouth sensor 82 (e.g., a feeding action) as in step 548, then in step 550 the controller 130 executes SEQ_Eating[1-3] and continues in a loop until either the mouth sensor is no longer sensed as on or ten seconds have transpired in step 552, and then the process returns to step 518. A final event can be triggered at step 554 if the controller 130 has sensed a sixth consecutive input from the mane sensor 84 or the head touch sensors 76, 78, 80, in which case at step 556 the controller 130 executes SEQ_RidingILoveYou and returns to step 518.


Referring back to step 514, if the speed control switch 112 is set to REV and the throttle switch 110 is actuated, as in step 560, then at step 562 the controller 130 executes SEQ_BackingUp in a loop and commands one or both of the drive wheel motors 46, 50 to rotate in reverse. As noted above, specific activation of the drive wheel motors 46, 50 is dependent on the steering command, although if no steering command is present, both drive wheel motors 46, 50 will be activated simultaneously at the preselected reverse speed to propel the apparatus 10 in reverse. While operating in reverse, the controller 130 monitors for other events, such as activation of the left or right head touch sensors 78, 80 in step 564; if detected, then at step 566 the controller 130 executes a random petting response chosen from SEQ_RidingLeftPetting[1, 2, or 3] if the left head touch sensor 78 was sensed, and chosen from SEQ_RidingRightPetting[1, 2, or 3] if the right head touch sensor 80 was sensed, and then returns to step 562.


Referring again back to step 514, if the speed control switch 112 is set to FWD1 (low speed) and the throttle switch 110 is actuated at step 570 (FIG. 22D), then at step 572 the controller 130 executes SEQ_Galloping and, randomly every 3-6 repeats, executes SEQ_RidingNeigh[1 or 2], and the controller 130 also commands one or both of the drive wheel motors 46, 50 to propel the apparatus 10 at FWD1, depending on the steering position. Similar to reverse movement, while the throttle switch 110 is activated, the controller 130 monitors for other events at step 574, such as activation of the left or right head touch sensors 78, 80 at step 576; if detected, then at step 578 the controller 130 executes a random petting response chosen from SEQ_RidingLeftPetting[1, 2, or 3] if the left head touch sensor 78 was sensed, and chosen from SEQ_RidingRightPetting[1, 2, or 3] if the right head touch sensor 80 was sensed, and then returns to step 572. Another potential event that is monitored for is noted in step 580, wherein the motion sensor 136 detects an object in front of the apparatus 10, signaling a potentially imminent collision; in that case the process moves to step 582, wherein the power to the drive wheel motors 46, 50 is reduced or ceased until the object is no longer detected, and the process then continues back at step 572.


Referring yet again back to step 514, if the speed control switch 112 is set to FWD2 (high speed) and the throttle switch 110 is actuated at step 586 (FIG. 22D), then at step 588 the controller 130 executes SEQ_GallopingFast and, randomly after every 3-6 repeats, executes SEQ_RidingNeigh[1 or 2]; the controller 130 also commands one or both of the drive wheel motors 46, 50 to propel the apparatus 10 at the FWD2 speed, depending on the steering position. Similar to reverse movement, while the throttle switch 110 is activated, the controller 130 monitors for other events at step 590, such as activation of the left or right head touch sensors 78, 80 at step 592; if such activation is detected, then at step 594 the controller 130 executes a random petting response, chosen from SEQ_RidingLeftPetting[1, 2, or 3] if the left head touch sensor 78 was sensed and from SEQ_RidingRightPetting[1, 2, or 3] if the right head touch sensor 80 was sensed, and then returns to step 588. Another monitored event is noted in step 596, wherein the motion sensor 136 detects an object in front of the apparatus 10, signaling a potentially imminent collision; the process then moves to step 598, wherein the power to the drive wheel motors 46, 50 is reduced or ceased until the object is no longer detected, after which the process continues back at step 588. In at least some embodiments, the detected object includes a person, such as the user (e.g., a child interacting with the toy), moving in front of the apparatus 10, while in other embodiments the detected object can include a person and/or a structure, such as a wall.
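
Taken together, steps 560, 570, and 586 map the speed and direction selection switch to a drive direction, a preselected speed, and a ride sequence. The following sketch illustrates one way such a mapping could be stored; the numeric power values are placeholders, as the preselected speeds are not specified herein.

# Illustrative mapping of the speed and direction selection switch settings
# (REV, FWD1, FWD2) to a drive direction, a preselected power level, and a
# ride sequence; the power values are placeholders only.
SPEED_SETTINGS = {
    "REV":  {"direction": "reverse", "power": 0.4, "sequence": "SEQ_BackingUp"},      # step 560
    "FWD1": {"direction": "forward", "power": 0.6, "sequence": "SEQ_Galloping"},      # step 570
    "FWD2": {"direction": "forward", "power": 1.0, "sequence": "SEQ_GallopingFast"},  # step 586
}

def drive_command(setting, throttle_pressed):
    """Return the drive settings for the current switch position, or None when
    the throttle switch is not actuated (step 516)."""
    if not throttle_pressed:
        return None
    return SPEED_SETTINGS[setting]

print(drive_command("FWD2", throttle_pressed=True))
print(drive_command("FWD2", throttle_pressed=False))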


Referring again to FIG. 22A, other actions can occur based on sensed inputs that lead to step 506. For example, at step 503a, if the reins back sensor 96 signals to the controller 130 that the reins 90 have been moved to the backward position (in front of the rider) while the first seat switch 270 is on, then at step 503b the drive wheel motors are stopped for two seconds and the process then proceeds to step 506. In addition, at step 503c, if the first seat switch 270 is changed to off while the reins 90 are sensed in the forward position, then the drive wheel motors are likewise stopped for two seconds and the process proceeds to step 506. Further, if the first seat switch 270 is changed to off while the reins 90 are sensed in the backward position, as in step 505a, or if the reins 90 are moved to the backward position while the first seat switch 270 is off, as in step 505b, then at step 505c the drive wheel motors are stopped and the process waits for an event at step 505d. The event at step 505d can include step 505e or step 505f. In step 505e, if the first seat switch 270 is sensed as on or the reins 90 are moved to the forward position again within two seconds, the process moves to step 506. In step 505f, if the first seat switch 270 remains off for two seconds, then the controller 130 activates autonomous mode at step 505g.
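
By way of a non-limiting illustration, the reins-and-seat-switch logic of steps 503a-505g could be organized as in the following Python sketch; the function and argument names (handle_reins_and_seat, wait_for_event, and so on) are hypothetical placeholders rather than disclosed firmware.

import time

def stop_drive_motors():
    print("drive motors stopped")    # placeholder for cutting drive power

def handle_reins_and_seat(seat_on, reins_position, wait_for_event):
    # seat_on: state of the first seat switch (rider detected input).
    # reins_position: "forward" or "backward", from the reins pivot sensors.
    # wait_for_event: callable that blocks up to a timeout and returns the next
    # (seat_on, reins_position) state, or None if nothing changes in time.
    stop_drive_motors()
    if seat_on and reins_position == "backward":        # step 503a
        time.sleep(2.0)                                  # step 503b: two-second pause
        return "drive_mode"                              # step 506
    if not seat_on and reins_position == "forward":      # step 503c
        time.sleep(2.0)
        return "drive_mode"
    # Remaining cases: seat off with reins back (505a), or reins moved back
    # while the seat switch is off (505b).
    new_state = wait_for_event(timeout=2.0)              # steps 505c and 505d
    if new_state is not None:
        seat_on, reins_position = new_state
        if seat_on or reins_position == "forward":       # step 505e
            return "drive_mode"
    return "autonomous_mode"                             # steps 505f and 505g

# Example: seat switch off, reins left back, no further input within two seconds.
print(handle_reins_and_seat(False, "backward", lambda timeout: None))   # autonomous_mode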


Referring now to FIGS. 23A-23D, a flow chart 600 is provided that details further exemplary actions that can be taken from power-on and drive mode and that lead to autonomous mode actions, which can include actions taken without persistent user interaction. In at least some embodiments, when operated in autonomous mode, the controller 130 directs actuation of the drive wheel motor assemblies 38, 42 and the neck assembly 54 according to a predetermined sequence, whereas in drive mode, the controller 130 directs actuation of the drive wheel motor assemblies 38, 42 only based on actuation of the throttle switch 110.


Beginning at step 602 while in drive mode, if the user is operating the throttle switch 110 to propel the apparatus 10 and the length of the ride exceeds three minutes at step 604, then at step 605 the controller 130 executes SEQ_RideOver3, then proceeds to step 606 and executes SEQ_FeedingPrompt (which prompts the user to feed the toy), followed by step 608 where the apparatus 10 is put in idle mode, which includes random eyelid blinking and head movements. Returning to step 604, if the length of the ride does not exceed three minutes, then at step 610 the controller randomly chooses to execute one of SEQ_RideOver1 or SEQ_RideOver2; if the length of the ride exceeds sixty seconds at step 612, the process proceeds to step 606, otherwise the process moves to step 608.
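
A minimal sketch of the ride-over branching of steps 602-612, using the sequence names from the flow chart and hypothetical execute_sequence and enter_idle_mode placeholders, is shown below.

import random

def execute_sequence(name):
    print("executing", name)     # placeholder for playing a response sequence

def enter_idle_mode():
    print("idle mode")           # placeholder: random eyelid blinking and head movements

def end_of_ride(ride_seconds):
    """Select the ride-over responses based on how long the ride lasted."""
    if ride_seconds > 180:                                                  # step 604: ride > 3 minutes
        execute_sequence("SEQ_RideOver3")                                   # step 605
        execute_sequence("SEQ_FeedingPrompt")                               # step 606
    else:
        execute_sequence(random.choice(["SEQ_RideOver1", "SEQ_RideOver2"])) # step 610
        if ride_seconds > 60:                                               # step 612
            execute_sequence("SEQ_FeedingPrompt")                           # step 606
    enter_idle_mode()                                                       # step 608

end_of_ride(200)    # long ride: ride-over 3, feeding prompt, then idle
end_of_ride(45)     # short ride: random ride-over only, then idle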


Idle mode at step 608 can also be activated after a power on at step 614 and a wake-up sequence has been executed at step 616. While the apparatus 10 is in idle mode at step 608, it monitors for numerous possible events, as noted at step 618. Sensing of a particular event causes the controller 130 to execute (i.e., play) a specific response, as detailed in the flow chart 600 and the sequence tables (FIGS. 24A-25D), followed by a return to idle mode at step 608. Various exemplary events can include the following: i) the user touching (e.g., petting, stroking, etc.) the front of the head (nose) for a brief moment (steps 620 and 622); ii) the user continuously touching the front of the head for more than 0.5 seconds (steps 624 and 626); iii) the user touching the left or right side of the head (steps 628, 630, 632, and 634), wherein the touch executes a random action, which can then be extended to include additional actions if the user holds their contact on the head (e.g., hugging) (steps 636 and 638); iv) the user brushing the mane (steps 640 and 642); v) the user consecutively touching or brushing six times (steps 644 and 646); vi) the user approaching the apparatus quickly, as sensed by the motion detector (steps 648 and 650); vii) eight seconds transpiring with no inputs sensed (steps 652 and 654 for the first three occurrences, then steps 656, 658, and 660); viii) the user inserting an object in the mouth (feeding) (step 662), which can generate a random dislike response (step 666 or 668) or an eating response (step 670), which then monitors the time the mouth sensor is on (steps 674 and 676) and can either provide a finished-eating response (step 678) or provide a full expression (step 680); and finally ix) the user pushing the mode selection switch (horseshoe button) 114 (step 682), which initiates execution of a random trick (step 684).
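
One non-limiting way to organize the idle-mode event handling described above is a dispatch table keyed by the sensed event, as in the following Python sketch; the event keys and response labels are descriptive placeholders rather than the actual sequence names.

def _play(response):
    print("executing", response)    # placeholder for playing a response over the speaker/motors

# Illustrative dispatch table for the idle-mode events listed above; the step
# numbers refer to flow chart 600.
IDLE_EVENT_HANDLERS = {
    "nose_touch_brief":  lambda: _play("brief nose-petting response"),      # steps 620 and 622
    "nose_touch_held":   lambda: _play("extended nose-petting response"),   # steps 624 and 626
    "side_touch":        lambda: _play("random side-petting response"),     # steps 628-638
    "mane_brush":        lambda: _play("mane-brushing response"),           # steps 640 and 642
    "six_consecutive":   lambda: _play("affection response"),               # steps 644 and 646
    "fast_approach":     lambda: _play("startle response"),                 # steps 648 and 650
    "idle_timeout":      lambda: _play("attention prompt"),                 # steps 652-660
    "mouth_feed":        lambda: _play("eating or dislike response"),       # steps 662-680
    "mode_button":       lambda: _play("random trick"),                     # steps 682 and 684
}

def on_idle_event(event):
    """Run the response for a sensed idle-mode event, then return to idle (step 608)."""
    handler = IDLE_EVENT_HANDLERS.get(event)
    if handler is not None:
        handler()
    return "idle"

print(on_idle_event("mane_brush"))    # plays the mane-brushing response, then returns "idle"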


As noted in the sequence tables, the apparatus 10 can perform a plurality of dance sequences (i.e., SEQ_Dance and SEQ_Dance2), each of which includes a preprogrammed sequence of discrete motor commands progressively sent by the controller 130 to actuate the drive wheel motors 46, 50 in forward and/or reverse directions, causing the apparatus 10 to be propelled along the floor in time with a song played over the speaker 140. Additional commands can be provided to actuate the head, ears, and eyelids, illuminate the status light 115, etc.
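
By way of a non-limiting illustration, a dance sequence could be represented as a timed list of discrete left/right motor commands stepped through in time with the song; the durations and power values below are illustrative placeholders, not the contents of SEQ_Dance or SEQ_Dance2, and set_drive_power and play_song are hypothetical stand-ins for the controller's motor and audio interfaces.

import time

def set_drive_power(left, right):
    print("drive motors: left=%+.1f, right=%+.1f" % (left, right))   # placeholder motor command

def play_song(name):
    print("playing", name)                                           # placeholder audio playback

# Illustrative dance data: (duration in seconds, left motor power, right motor power).
DANCE_STEPS = [
    (1.0,  0.5,  0.5),    # roll forward
    (0.5, -0.5,  0.5),    # pivot left
    (0.5,  0.5, -0.5),    # pivot right
    (1.0, -0.5, -0.5),    # roll back
    (0.5,  0.0,  0.0),    # pause on the beat
]

def run_dance():
    play_song("dance song")                 # song played over the speaker 140
    for duration, left, right in DANCE_STEPS:
        set_drive_power(left, right)        # discrete motor command from the controller
        time.sleep(duration)
    set_drive_power(0.0, 0.0)               # stop when the song ends

run_dance()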


In at least some embodiments, the input from the motion sensor 136 can be used to trigger new or continued motor commands by the controller 130. In this manner, the controller 130 could require confirmation of sensed motion by a user before continuing with a subsequent power output command to the drive wheel motors 46, 50 that would change the direction or power level applied to the drive wheel motors 46, 50. This feature can be utilized in the dance sequence, as well as when a user is interacting with the apparatus 10, such as brushing the mane, feeding the mouth, or touching the head. Although this feature may be utilized with the drive wheel motors 46, 50, other body movement motors, such as the feature motor, the head rotation motors, etc., may be actuated in any of numerous sequences with or without movement of the drive wheel motors 46, 50 and/or sensed motion inputs from the motion sensor 136.
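
A minimal sketch of this motion-confirmation gating, assuming simple string-valued drive commands and a boolean confirmation flag (both illustrative), is shown below.

def next_drive_command(pending_command, motion_confirmed, current_command):
    """Gate a change in drive direction or power level on confirmed motion.

    Before issuing a command that would change the direction or power level
    applied to the drive wheel motors, the controller can require that the
    motion sensor has confirmed movement by the user; otherwise the current
    command is held unchanged.
    """
    changes_output = (pending_command != current_command)
    if changes_output and not motion_confirmed:
        return current_command    # hold until motion is confirmed
    return pending_command

# Example: a request to reverse direction is held until motion is confirmed.
print(next_drive_command("reverse", motion_confirmed=False, current_command="stop"))  # stop
print(next_drive_command("reverse", motion_confirmed=True,  current_command="stop"))  # reverse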


It is specifically intended that the apparatus not be limited to the embodiments and illustrations contained herein, but instead include modified forms of those embodiments, including portions of the embodiments and combinations of elements of different embodiments, as come within the scope of the following claims. Further, the various motors described herein can be coupled to additional components via any of numerous mechanisms, such as gears, actuators, levers, pulleys, etc., to perform the described functions. Further modifications and alternative embodiments of various aspects of the apparatus will be apparent to those skilled in the art in view of this description. Accordingly, this description is to be construed as illustrative only and is for the purpose of teaching those skilled in the art the general manner of carrying out the invention. It is to be understood that the forms of the apparatus shown and described herein are to be taken as the presently preferred embodiments. Elements and materials may be substituted for those illustrated and described herein, parts and processes may be reversed, and certain features of the apparatus may be utilized independently, all as would be apparent to one skilled in the art after having the benefit of this description of the apparatus. Changes may be made in the elements described herein without departing from the spirit and scope of the apparatus as described in the following claims. In addition, any steps described herein with reference to the flow charts are not to be considered limiting and can include variations, such as additional steps, removed steps, and re-ordered steps.

Claims
  • 1. An interactive ride-on toy apparatus comprising: a torso; a plurality of legs secured to the torso; a first drive motor assembly secured to a first of the plurality of legs and to a first drive wheel; a second drive motor assembly secured to a second of the plurality of legs and to a second drive wheel; a motorized neck assembly coupling a head to the torso, wherein the neck assembly provides a multi-directional rotational movement of the head relative to the torso; a rechargeable battery; a throttle switch to provide a throttle signal; a controller including one or more processors and one or more memory devices; and an electrical steering position sensor configured to translate a mechanical steering input via manual rotation of the head into an electronic steering position signal that is communicated to the controller, wherein the controller is configured to receive the throttle signal and the steering position signal, and selectively actuate at least one of the drive wheel motors based on the throttle signal and the steering position signal.
  • 2. The apparatus of claim 1, wherein the rotational movement of the head relative to the torso includes selective rotational movement of the head about a first axis or a second axis that is perpendicular to the first axis.
  • 3. The apparatus of claim 2, wherein movement of the head about the first axis and second axis can be performed simultaneously.
  • 4. The apparatus of claim 3, further comprising a plurality of touch-based sensors situated in the head.
  • 5. The apparatus of claim 4, wherein at least one or more touch-based sensors situated in the head include a capacitive-based sensor.
  • 6. The apparatus of claim 4, wherein the controller proportionally varies applied power from the battery to each of the first drive motor assembly and the second drive motor assembly based on the steering position signal.
  • 7. The apparatus of claim 6, further comprising a speaker for emitting sounds selected by the controller, a seat positioned on the torso, and a first seat switch situated between the seat and the torso, wherein actuation of the first seat switch by a user provides a rider detected input signal.
  • 8. The apparatus of claim 7, further comprising a motion sensor coupled to the controller and configured to detect the presence of an object situated in front of the torso.
  • 9. The apparatus of claim 8, further comprising reins pivotably coupled to the head, wherein the reins include the throttle switch and a speed and direction selection switch.
  • 10. The apparatus of claim 9, wherein the reins are coupled to the head via a reins pivot assembly that allows the reins to be rotated between a forward position and a back position relative to the head, and wherein the reins pivot assembly further includes at least one of a reins forward sensor and a reins back sensor to indicate to the controller the position of the reins.
  • 11. The apparatus of claim 10, wherein a third and a fourth of the plurality of legs each include a freely pivotable non-motorized wheel.
  • 12. The apparatus of claim 11, wherein the torso further includes a first torso portion pivotably coupled to a second torso portion along a vertical pivot joint.
  • 13. The apparatus of claim 12, wherein the head further includes a mouth, ears, and eyelids, and wherein the eyelids and the ears are rotatably actuatable via a signal from the controller.
  • 14. The apparatus of claim 13, wherein the touch-based sensors include a right head touch sensor, a left head touch sensor, and a front head touch sensor.
  • 15. The apparatus of claim 14, further comprising a mode selection switch for selecting between a first mode and a second mode, wherein the first mode directs the controller to actuate the drive wheel motor assemblies and neck assembly according to a predetermined sequence, and the second mode directs the controller to actuate the drive wheel motor assemblies only upon actuation of the throttle switch.
  • 16. An interactive ride-on toy apparatus comprising: a torso having a first torso portion and a second torso portion; a first front leg and a second front leg, each extending down from the first torso portion; a first drive motor coupled to the first front leg and to a first drive wheel; a second drive motor coupled to the second front leg and to a second drive wheel, wherein the first and second drive wheels are rotatable to propel the apparatus along a surface; a non-motorized wheel coupled to the second torso portion; a motorized neck assembly coupling a head to the first torso portion, wherein the neck assembly provides selective rotational movement of the head along both a first rotational head axis and a second rotational head axis; a rechargeable battery; a throttle switch to provide a throttle input signal; a controller for receiving the throttle input signal and selectively actuating the drive motor assemblies with power from the battery; an electrical steering position sensor for receiving a mechanical steering input upon manual rotation of the head, wherein the controller proportionally varies the applied power from the battery to the first drive motor assembly and the second drive motor assembly based on a steering position signal provided by the steering position sensor; a plurality of touch-based sensors situated in the head for providing a touch input signal; reins coupled to the head, wherein the reins include the throttle switch and a speed and direction selection switch; a speaker for emitting sounds selected by the controller; and a seat positioned on the torso.
  • 17. The apparatus of claim 16, further comprising a first seat switch situated between the seat and the torso, wherein actuation of the first seat switch by a user provides a rider detected input signal.
  • 18. The apparatus of claim 17, further comprising a motion sensor configured to detect the presence of an object situated in front of the first torso portion.
  • 19. The apparatus of claim 18, wherein the first rotational head axis is situated perpendicular to the second rotational head axis, and wherein the reins are coupled to the head via a reins pivot assembly that allows the reins to be rotated between a forward position and a back position relative to the head, and wherein the reins pivot assembly provides a reins position input signal to the controller indicating the sensed position of the reins.
  • 20. An interactive ride-on toy apparatus comprising: a torso having a first torso portion pivotably coupled to a second torso portion along a vertical pivot joint; a first front leg and a second front leg, each extending down from the first torso portion; a first drive motor assembly secured to the first front leg and to a first drive wheel; a second drive motor assembly secured to the second front leg and to a second drive wheel, wherein the first and second drive wheels are rotatable about a single rotational drive axis to propel the apparatus along a surface; a first rear leg and a second rear leg, each extending down from the second torso portion and including a wheel secured thereto; a motorized neck assembly coupling a head to the first torso portion, wherein the neck assembly provides selective rotational movement of the head along both a first rotational head axis and a second rotational head axis, wherein the second rotational head axis lies parallel to the rotational drive axis and perpendicular to the first rotational head axis; a rechargeable battery situated in at least one of the torso and the head; a throttle switch to provide a throttle input; a controller for receiving the throttle input and selectively actuating the drive motor assemblies using the rechargeable battery; an electrical steering position sensor for receiving a mechanical steering input via manual rotation of the head, and wherein the controller proportionally varies the applied power from the battery to the first drive motor assembly and the second drive motor assembly based on a received steering position sensor input; a plurality of touch-based sensors situated in the head for receiving touch signals from a user; wherein the head includes a mouth, eyelids, and ears, and wherein the eyelids and the ears are rotatably actuatable via a signal from the controller; reins pivotably coupled to the head, wherein the reins include the throttle switch and a speed and direction selection switch, and wherein the reins are coupled to the head via a reins pivot assembly that allows the reins to be rotated between a forward position and a back position relative to the head, and wherein the reins pivot assembly provides a reins position input signal to the controller indicating the position; a speaker for emitting sounds selected by the controller; a seat positioned on the torso; a first seat switch situated between the seat and the torso, wherein actuation of the first seat switch by a user provides a rider detected input signal; a motion sensor configured to detect the presence of another object situated in front of the first torso portion; and a mode selection switch for selecting between a first mode and a second mode, wherein the first mode directs the controller to actuate the drive wheel motor assemblies and neck assembly according to a predetermined sequence, and the second mode directs the controller to actuate the drive wheel motor assemblies only during actuation of the throttle switch.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Appl. No. 62/477,220 filed on Mar. 27, 2017, U.S. Provisional Patent Appl. No. 62/477,629 filed on Mar. 28, 2017, U.S. Provisional Patent Appl. No. 62/552,502 filed on Aug. 31, 2017, and U.S. Provisional Patent Appl. No. 62/581,863 filed on Nov. 6, 2017, the disclosures of which are incorporated herein by reference in their entirety for all purposes.

Related Publications (1)
Number Date Country
20180272238 A1 Sep 2018 US
Provisional Applications (4)
Number Date Country
62477220 Mar 2017 US
62477629 Mar 2017 US
62552502 Aug 2017 US
62581863 Nov 2017 US