The present invention relates generally to interface devices that allow a user to control remote devices, and more particularly to devices used to interface with remote control toys and that provide haptic feedback to the user.
Humans interface with electronic and mechanical devices in a variety of applications, and the need for a more natural, easy-to-use, and informative interface is a constant concern. In the context of the present invention, one such application is the remote control of moving devices such as toy vehicles. For example, remote control toy cars are common; these are small cars that move under their own power, e.g. using batteries or gasoline. The user typically controls the direction of turning, the braking, and/or the forward/back motion of the car using a remote control unit, which typically sends signals to the car via wireless transmission. Some remote control toys with limited motion may include a wire connecting the remote control unit with the controlled toy or device to allow the signals to be transmitted to the toy. The remote control unit may include joysticks, dials, switches, buttons, or other controls to assist the user in the control of the toy. Other types of moving toys and devices can be similarly controlled, such as flying toys (e.g., planes, helicopters, rockets), water toys (e.g., boats and submarines), trucks, robots, toy animals, etc.
One type of functionality missing from toy remote control devices is kinesthetic force feedback and/or tactile feedback, collectively known herein as “haptic feedback.” Haptic feedback can be added to such interface control devices to provide the user with a more interactive experience and to provide greater ease in interfacing and controlling the remote toy device.
The present invention provides a haptic feedback remote control device for controlling moving toy devices such as cars, boats, etc. The remote control unit provides haptic feedback to the user that provides a more compelling experience when controlling the toy.
More particularly, a haptic feedback remote control device provides control signals to a toy device, such as a car, boat, plane, etc., to control the operation of the toy device. The remote control device includes a housing and at least one control for manual manipulation by the user, where control signals representing the manipulation are sent to the toy device to control the operation of the toy device. An actuator outputs forces on the housing in response to received actuator signals, and a controller provides the actuator signals to the actuator and monitors the control signals representing the manipulation of the control. The controller can determine the forces based only on the manual manipulation of the control by the user, or based partially on the manipulation. In one embodiment, the actuator moves an inertial mass to provide inertial haptic sensations on the housing, the inertial haptic sensations being felt by the user. The control can include a lever movable along an axis, a steering wheel or knob, or another type of control; for example, the control can be a throttle control or a steering control. Preferably, the control signals are transmitted wirelessly to the toy device.
An additional feature in some embodiments allows the controller to determine the forces based only or partially on information received from the toy device. For example, the information received from the toy device can include information from a contact sensor on the toy device that indicates whether the toy device has contacted another object at the location of the contact sensor. The information can indicate a degree of contact of the toy device with the other object. In another embodiment, the information can indicate an amount of acceleration experienced by the toy device in at least one dimension of the toy device.
In another embodiment, an actuator in the remote control unit can output forces on the control manipulated by the user in response to the received electrical signals. The forces can be determined based only or partially on the local manipulation of the control(s), or partially or wholly on information indicating the status of the controlled toy. An embodiment of a remote control toy device includes a remote control unit as described above and a toy device operable to physically move in accordance with the control signals. Another embodiment is a method for controlling a toy device based on manipulation of a remote control unit by a user and providing haptic sensations to the user.
The present invention provides a haptic feedback remote control device that provides haptic sensations to the user when controlling a toy vehicle or other toy device. This allows the user to experience another dimension in sensory feedback when controlling a toy remotely. The sensations can simulate what the toy is currently experiencing and can also inform the user of the status of the toy, thereby enhancing the user's control over the toy.
These and other advantages of the present invention will become apparent to those skilled in the art upon a reading of the following specification of the invention and a study of the several figures of the drawing.
In
Remote control 12 is operated by a user to control the toy 14. Remote control 12 is typically small enough to be held in one or both hands of the user and may include a variety of controls. For example, a joystick or lever 20 can be moved by the user up or down to command the toy 14 to move forward or back, respectively. In some rate control embodiments, the distance that the lever 20 is moved from an origin position (such as the center of the lever's range) controls the velocity of the controlled toy in the appropriate direction. Lever 22 can be moved left or right to command the toy to turn left or right, respectively. In some rate control embodiments, the distance that the lever 22 is moved from an origin position controls the amount or tightness of the turn of the toy. Other buttons, small steering wheels, knobs, dials, joysticks, trackballs, switches, direction pads, levers, or other controls can be included to command various other functions of the toy, such as braking or stopping the toy, powering or starting the toy, turning on headlights, changing driving modes, sounding a horn, etc., and/or to control steering and throttle functions instead of levers 20 and 22; any of these controls can be provided with haptic feedback as described herein. Some embodiments may provide the functionality of the remote control 12 in another device, such as a cell phone, portable computer (e.g. laptop or PDA), wristwatch, etc.
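By way of illustration only, the following sketch shows one way the rate-control mappings just described could be expressed; the function names, ranges, and gains are hypothetical and are not taken from the described embodiments.

```python
def lever_to_velocity(lever_pos, origin=0.0, max_disp=1.0, max_speed=5.0):
    """Map a throttle lever's displacement from its origin position to a
    signed velocity command (rate control); positive displacement drives
    the toy forward, negative drives it back."""
    disp = max(-max_disp, min(max_disp, lever_pos - origin))
    return (disp / max_disp) * max_speed


def lever_to_turn(lever_pos, origin=0.0, max_disp=1.0):
    """Map a steering lever's displacement to a turn direction and a
    normalized tightness: the farther the lever is pushed, the tighter
    the commanded turn."""
    disp = max(-max_disp, min(max_disp, lever_pos - origin))
    direction = "left" if disp < 0 else "right" if disp > 0 else "straight"
    return direction, abs(disp) / max_disp


# Lever 20 pushed halfway forward, lever 22 pushed fully right.
print(lever_to_velocity(0.5))   # 2.5 units/s forward
print(lever_to_turn(1.0))       # ('right', 1.0): tightest right turn
```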
Remote control 12 may include a wireless transmission device for transmitting control signals through the air to the toy 14. The implementation of such a transmission device is well known to those of skill in the art. Often, remote control 12 includes an antenna 23 to broadcast radio frequency (RF) (or other frequency range) control signals at a desired frequency. In other embodiments, the control signals from the remote control can be sent along a wire or other transmission line that physically couples the remote control 12 to the toy 14. Some embodiments may allow the remote control 12 to receive signals from the toy 14, as detailed below; in such embodiments, appropriate receiver electronics are included in the remote control 12.
In the present invention, remote control 12 includes haptic feedback functionality. This functionality can be provided using one or more actuators included within or coupled to the housing 24 of the remote control 12. Various embodiments of the haptic remote control are described in detail below.
Toy 14 is shown in
Toy 14 includes receiver electronics for receiving the commands from the remote control 12, e.g. at the proper broadcast frequency. In those embodiments in which information is transmitted to the remote control 12 from the toy 14, the toy 14 includes a transmitter, e.g. a wireless transmitter similar to the transmitter used by the remote control 12. Other components of the toy can include a microprocessor or other controller for implementing received commands, controlling the motors, reading sensors (for those embodiments including sensors), etc.
Other types of moving toy vehicles and devices can be similarly controlled, such as flying toy vehicles (e.g., planes, helicopters, rockets), water toy vehicles (e.g., boats and submarines), trucks, robots, toy animals, etc.
The housing 24 of the remote control 12 includes an actuator assembly 50 which outputs forces on the housing 24 of the remote control 12. In the described embodiment, actuator assembly 50 oscillates an inertial mass in an approximately linear motion. The oscillations provided by the movement of the inertial mass are transmitted to the housing 24, where the user contacting the housing feels them as tactile sensations. The inertial mass can preferably be oscillated at different frequencies and force magnitudes to provide a variety of tactile sensations, such as pulses and vibrations.
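As an illustrative sketch only, the following shows how such an assembly might be driven to produce pulses and vibrations, assuming a hypothetical interface that accepts one normalized drive sample per control tick; none of the names or values below come from the described embodiments.

```python
import math


def oscillation_waveform(frequency_hz, magnitude, duration_s, sample_rate_hz=1000):
    """Generate a sinusoidal drive waveform for an inertial-mass actuator.
    magnitude is normalized 0..1; the sign of each sample reverses the
    direction the mass is driven, and the resulting oscillation is felt
    through the housing."""
    n = int(duration_s * sample_rate_hz)
    return [magnitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate_hz)
            for i in range(n)]


# A short, strong burst reads as a pulse; a longer, weaker train reads as
# a sustained vibration.
pulse = oscillation_waveform(frequency_hz=80, magnitude=1.0, duration_s=0.05)
vibration = oscillation_waveform(frequency_hz=30, magnitude=0.5, duration_s=1.0)
```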
One embodiment of an actuator assembly 50 is described below with reference to
Other embodiments of the present invention can cause tactile feedback to a portion of the housing 24. For example, a portion of the housing can be made moveable with respect to the remaining portion of the housing. The moveable portion can be coupled to the actuator assembly and moved to provide tactile sensations. The moveable portion can be positioned at a location on the housing that is contacted by the user when holding, contacting or supporting the remote control 12 during normal use. In one embodiment, the moveable portion can be moved away from (e.g. perpendicular to) the outer surface of the stationary portion of the housing, such as a cover on a hinge, to move against the user's finger or palm. In other embodiments, the moveable portion can be moved laterally, parallel to the outer surface of the housing and in shear with the user's skin contacting the moveable portion (or both lateral and perpendicular motion can be provided). Some embodiments of a moveable surface portion are described in U.S. Pat. No. 6,184,868, which is incorporated herein by reference in its entirety.
A battery 60 or other power storage element can be included in the remote control 12 to supply power to the actuator assembly 50 and other components, such as a local microprocessor, transmitter, lights on the device, etc. Battery 60 can be a disposable battery, or a rechargeable battery which the user can remove, recharge, and replace. Some embodiments can provide a convenient compartment door in the housing 24 to allow easy access to the battery 60 by the user. One or more batteries 60 can be provided in the remote control 12 to supply the desired amount of power. Other types of power storage elements may be used in other embodiments. In some embodiments, the battery 60 may be recharged without the user having to remove it from the device housing 24. For example, the housing 24 can include a “docking port” or electrical connector connected to a rechargeable battery 60 which allows the remote control 12 to be plugged into a mating connector on a recharging power source device that is, for example, connected to a standard AC power outlet.
Battery 60 can be a heavy component and thus may be disadvantageous in an inertial haptic feedback device. The heaviness of the battery 60 adds to the overall mass of the device, which may weaken the strength of the inertial haptic sensations output by actuator assembly 50 and felt by the user. To compensate for this effect, a flexible or compliant layer 62, such as a rubber or other compliant layer or spring element, can be provided between the battery 60 and the housing 24. Layer 62 allows the battery 60 to move at least partially independently of the housing 24, and thus inertially decouples the battery 60 from the housing 24. The layer 62 reduces the inertial contribution of the battery 60 to the system and allows the user to feel stronger tactile sensations with the given actuator assembly 50 than if the battery 60 were rigidly coupled to the housing without layer 62. These embodiments are described in greater detail in copending patent application 09/771,116, filed Jan. 26, 2001, and incorporated herein by reference in its entirety.
Actuator 110 is shown coupled to the flexure 120. The housing of the actuator is coupled to a receptacle portion 122 of the flexure 120 which houses the actuator 110 as shown. A rotating shaft 124 of the actuator is coupled to the flexure 120 in a bore 125 of the flexure 120 and is rigidly coupled to a central rotating member 130. The rotating shaft 124 of the actuator is rotated about an axis B which also rotates member 130 about axis B. Rotating member 130 is coupled to a first portion 132a of an angled member 131 by a flex joint 134. The flex joint 134 preferably is made very thin in the dimension it is to flex so that the flex joint 134 will bend when the rotating member 130 moves the first portion 132a approximately linearly. The first portion 132a is coupled to the grounded portion 140 of the flexure by a flex joint 138 and the first portion 132a is coupled to a second portion 132b of the angled member by flex joint 142. The second portion 132b, in turn, is coupled at its other end to the receptacle portion 122 of the flexure by a flex joint 144.
The angled member 131 that includes first portion 132a and second portion 132b moves linearly along the x-axis as shown by arrow 136. In actuality, the portions 132a and 132b move only approximately linearly. When the flexure is in its origin position (rest position), the portions 132a and 132b are preferably angled as shown with respect to their lengthwise axes. This allows the rotating member 130 to push or pull the angled member 131 along either direction as shown by arrow 136.
The actuator 110 is operated in only a fraction of its rotational range when driving the rotating member 130 in two directions, allowing high bandwidth operation and high frequencies of pulses or vibrations to be output. To channel the compression or stretching of the flexure into the desired z-axis motion, a flex joint 152 is provided in the flexure portion between the receptacle portion 122 and the grounded portion 140. The flex joint 152 allows the receptacle portion 122 (as well as the actuator 110, rotating member 130, and second portion 132b) to move (approximately) linearly in the z-axis in response to motion of the portions 132a and 132b. A flex joint 150 is provided in the first portion 132a of the angled member 131 to allow the flexing about flex joint 152 in the z-direction to more easily occur.
By quickly changing the rotation direction of the actuator shaft 124, the actuator/receptacle can be made to oscillate along the z-axis and create a vibration on the housing 24 with the actuator 110 acting as an inertial mass. Preferably, enough space is provided above and below the actuator to allow its range of motion without impacting any surfaces or portions of the housing 24. In addition, the flex joints included in flexure 120, such as flex joint 152, act as spring members to provide a restoring force toward the origin position (rest position) of the actuator 110 and receptacle portion 122. In some embodiments, stops can be included in the flexure 120 to limit the motion of the receptacle portion 122 and actuator 110 along the z-axis.
Many different types and shapes of eccentric masses 174 can be used. A wedge- or pie-shaped eccentric can be used, as shown, where one end of the eccentric is coupled to the shaft 172 so that most of the wedge extends to one side of the shaft. Alternatively, a cylindrical or other-shaped mass can be coupled to the shaft 172. The center of the mass 174 is positioned to be offset from the axis of rotation C of the shaft 172, creating an eccentricity parameter that is determined by the distance between the axis of rotation of the shaft 172 and the center of mass of the mass 174. The eccentricity can be adjusted in different device embodiments to provide stronger or weaker vibrations, as desired. When the motor is driven constantly in one direction, a greater vibration magnitude is generally obtained by increasing the eccentricity.
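To make the effect of the eccentricity parameter concrete, the following sketch applies the standard rotating-unbalance relation (peak force approximately equal to the mass times the eccentricity times the square of the angular speed); the specific numbers are illustrative only and are not taken from the described embodiments.

```python
import math


def unbalance_force(mass_kg, eccentricity_m, speed_rpm):
    """Approximate peak force of an eccentric rotating mass, F = m * e * w^2,
    where w is the angular speed in rad/s. Increasing either the eccentricity
    or the rotation speed increases the vibration magnitude."""
    omega = speed_rpm * 2.0 * math.pi / 60.0
    return mass_kg * eccentricity_m * omega ** 2


# A 5 g wedge whose center of mass sits 3 mm off the shaft axis, spun at
# 6000 RPM, yields roughly 5.9 N of peak force.
print(round(unbalance_force(0.005, 0.003, 6000), 1))
```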
When the eccentric mass 174 is rotated by the motor 170, a vibration is induced in the motor and in any member coupled to the motor due to the off-balance motion of the mass. Since the housing of the motor 170 is preferably coupled to the housing of the remote control 12, the vibration is transmitted to the user that is holding the housing. One or more motors 170 can be included in the remote control 12 to provide vibrotactile or other haptic feedback; for example, two motors may be used to provide stronger magnitude vibrations and/or vibrations in two different directions.
Other types of actuator assemblies may also be used, as disclosed in U.S. Pat. No. 6,184,868, such as a linear voice coil actuator, solenoid, moving magnet actuator, etc.
Haptic Sensations
Different types of haptic sensations can be output to the user in the local haptic sensation embodiment. Since the haptic sensations are determined based only on the local actions of the user on the remote control 12, the sensations can be based on specific actions or controls manipulated by the user.
Engine vibration: The actuator assembly 50 can be controlled to output haptic sensations that are meant to simulate the vibration of an engine in a vehicle. This vibration sensation can have a magnitude and/or frequency correlated to the position of a throttle control, such as lever 20 in
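A minimal sketch of one possible throttle-to-vibration mapping follows; the frequency and magnitude ranges are assumed for illustration and are not taken from the described embodiments.

```python
def engine_vibration(throttle, idle_freq=20.0, max_freq=60.0,
                     idle_mag=0.2, max_mag=1.0):
    """Map a normalized throttle position (0..1) to a vibration frequency (Hz)
    and magnitude that both grow with the commanded engine speed."""
    t = max(0.0, min(1.0, throttle))
    return (idle_freq + t * (max_freq - idle_freq),
            idle_mag + t * (max_mag - idle_mag))


print(engine_vibration(0.0))   # (20.0, 0.2): idle rumble
print(engine_vibration(1.0))   # (60.0, 1.0): full-throttle buzz
```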
Turning: Haptic sensations can be controlled to be correlated with the user manipulating controls to turn the toy 14. For example, left-right lever 22 can be used to turn the toy left or right. In some embodiments, the amount of movement of the lever from the origin position controls the tightness of the turn; when the lever is fully left or right, the toy turns in its smallest turning radius. Vibrations can be output on the remote control 12 to indicate the tightness of the turn. For example, a wide turn can be associated with a lower-frequency vibration, while a tight turn can be associated with a higher-frequency vibration. In some embodiments, both the speed and turn controls can be used in the determination of a haptic sensation. For example, a fast, tight turn can cause sporadic pulses as haptic sensations, which simulate the feel of tires of the toy losing traction with the ground.
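The following sketch illustrates one way the turning sensations just described could be selected; the thresholds, frequencies, and probability of a traction-loss pulse are assumptions made for illustration only.

```python
import random


def turn_haptics(turn_tightness, speed):
    """Select a haptic effect for a commanded turn; both inputs are
    normalized 0..1. Wide turns map to a low-frequency vibration, tight
    turns to a higher frequency, and a fast, tight turn occasionally
    produces a pulse suggesting the tires losing traction."""
    if speed > 0.7 and turn_tightness > 0.7 and random.random() < 0.3:
        return ("pulse", 1.0)                       # sporadic traction-loss jolt
    return ("vibration", 15.0 + turn_tightness * 45.0)


print(turn_haptics(0.2, 0.5))   # gentle turn: low-frequency vibration (24 Hz)
```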
Other sensations: Other controls on the remote control 12 can be associated with haptic sensations. If there is a braking control that commands the toy to apply brakes or slow down its motion, a vibration can be output during the braking. Horns, blinking lights, or other functions of the toy controlled from the remote control 12 can also be associated with different haptic sensations.
Kinesthetic force feedback can also be output on the controls of the remote control, i.e., forces output in one or more of the sensed degrees of freedom of a manipulandum. For example, a motor or other actuator can output forces in the degrees of freedom of levers 20 and 22, in the degree of freedom of motion of a button or switch, in the rotary degree of freedom of a small steering wheel or knob, etc. Some embodiments for providing such haptic feedback are described below with respect to
In a kinesthetic force feedback embodiment, time-based haptic effects can be output in the degree of freedom of a control, similar to the tactile effects described above. For example, a vibration or jolt can be directly output on the lever 20 or 22 while the user is holding it when controlling the toy 14 in a desired manner. The magnitude and/or frequency of the jolt or vibration can be based on the position of the control in its degree of freedom, and/or can simulate engine rumble, turning radius, etc.
Furthermore, more sophisticated haptic sensations can be output in the kinesthetic force feedback embodiment. For example, a spring sensation generated by an actuator can provide a restoring force on a lever toward its origin position. A tight or high-magnitude spring can be output for a fast turn of the toy 14, while a loose, low-magnitude spring can be output for a slow turn to simulate driving conditions at those speeds. Detents in the lever (or other control) motion can be output to mark particular positions to the user, e.g. each ¼ of the lever range moved provides a detent to let the user haptically know how fast he or she is controlling the car to move or how tight a turn is being controlled. The detents can be implemented as jolts, spring forces, or other force profiles. A damping force can resist lever motion based on the velocity of lever motion. A barrier or obstruction force can prevent or resist motion of a control past a desired limit.
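The sketch below shows, under assumed units and gains, how the spring, detent, damping, and barrier effects described above might be summed into a single force command for a lever's actuator; it is an illustration, not the described implementation.

```python
def lever_force(pos, vel, spring_k=2.0, damping_b=0.5, detent_spacing=0.25,
                detent_k=0.8, limit=1.0, wall_k=20.0):
    """Sum several kinesthetic effects for one lever axis: a centering
    spring, velocity-based damping, a detent at every quarter of the range,
    and a stiff barrier past the allowed limit. pos is measured from the
    lever's origin position; the return value is the actuator force."""
    force = -spring_k * pos                      # centering spring
    force += -damping_b * vel                    # damping resists fast motion
    nearest = round(pos / detent_spacing) * detent_spacing
    force += -detent_k * (pos - nearest)         # pull toward nearest detent
    if pos > limit:                              # barrier/obstruction force
        force += -wall_k * (pos - limit)
    elif pos < -limit:
        force += -wall_k * (pos + limit)
    return force


print(round(lever_force(pos=0.3, vel=0.0), 3))   # -0.64: spring plus detent pull
```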
More sophisticated embodiments of the present invention are described below. In these embodiments, the toy 14 can send signals to the remote control 12 to inform the remote control of one or more statuses or conditions of the toy. The output of haptic sensations, and the characteristics of those haptic sensations (e.g., type of sensation, magnitude, frequency, duration, etc.), can be based solely or partially on these signals.
In one embodiment, one or more sensors are mounted on the toy 14. For example, referring to
The sensor 32 can be any of a variety of different types of sensors. For example, an optical sensor with emitter and detector, a magnetic sensor, a mechanical contact sensor, analog potentiometers that measure the motion of a contact member, or other types of sensors for detecting contact can be used. Alternatively, or additionally, one or more sensors can be positioned at other areas of the toy to detect contact in those areas, such as at a rear bumper or other rear surface, a side surface, the underside of the toy, etc.
The signals sent from the toy 14 to the remote control 12 can have the same broadcast carrier frequency as the signals transmitted from remote control to toy, or can have a different frequency. The remote control 12, upon receiving the signal from the toy, can command a tactile sensation appropriate to the signal.
A variety of haptic sensations can be provided. If a collision is detected by the sensor 32, the controller (see
In addition, the magnitude of the haptic sensations can be correlated with a speed of the toy that is assumed from the throttle control on the remote control 12. For example, if the lever 20 has been pushed forward to its full extent, then it is assumed the toy is moving very fast and that when a collision occurs, it is a strong one. Thus, a high-magnitude jolt or vibration is output. If, on the other hand, the lever 20 is positioned only slightly away from its origin position when a collision is sensed by the sensor 32, a slow toy speed is assumed and a lower-magnitude sensation can be output on the remote control 12.
In embodiments providing an analog sensor, force sensor, or variable-state sensor as sensor 32, different haptic sensations can be more realistically associated with different degrees of collision or contact as sensed by the sensors. For example, the sensor 32 can directly sense whether a collision is a strong one, and the magnitude of the correlated haptic sensation can be proportionally adjusted. Thus, a mid-strength collision can be associated with a haptic sensation having a magnitude in the middle of the available output range. This can provide a more realistic correlation between collision strength and haptic sensation strength, since a speed of the toy is not assumed, and an actual collision strength is measured.
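The following sketch illustrates the two approaches described in the preceding paragraphs, choosing a jolt magnitude either from a measured collision strength or from the speed assumed from the throttle position; the function and parameter names are hypothetical.

```python
def collision_jolt(throttle=None, measured_strength=None, max_mag=1.0):
    """Choose a jolt magnitude for a sensed collision. When the contact
    sensor reports a strength (0..1) it is used directly; otherwise the
    severity is assumed from the throttle position, as described above."""
    if measured_strength is not None:
        return max_mag * max(0.0, min(1.0, measured_strength))
    if throttle is not None:
        return max_mag * max(0.0, min(1.0, abs(throttle)))
    return 0.5 * max_mag                          # unknown severity: mid-strength


print(collision_jolt(throttle=1.0))               # full throttle assumed: 1.0
print(collision_jolt(measured_strength=0.5))      # mid-strength measured: 0.5
```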
Haptic sensations can also be based on a combination of toy status, as relayed by the sensor(s) 32, and user manipulation of a manipulandum or control on the remote control 12. For example, a centering spring force output in the degree of freedom of a steering wheel or knob on remote control 12 can be based on the current position of that wheel or knob, and can be combined with a jolt force that is output based on a sensed collision of the toy with another object.
Such inertial sensors can take a variety of different embodiments. For example, inertial sensor 202 can be a lower-cost single-axis accelerometer that can measure accelerations in only one axis. This type of sensor can be placed to sense only front-back, top-bottom, or side-to-side accelerations, if desired. Alternatively, this type of accelerometer can be positioned in toy 200 at an angle, as shown in
Use of the accelerometer allows the user to feel haptic sensations that can be correlated with a variety of interactions that the toy is experiencing. For example, the accelerometer can sense and send status signals and data to the remote control representative of accelerations on the toy indicating the toy is bouncing over rough terrain or down stairs, landing after jumping off a ramp, or sideswiping another toy or object. Different haptic sensations can be output on the remote control 12 which simulate or indicate all of these conditions on the toy.
The accelerometer can also be used in conjunction with multiple actuator assemblies 50 placed at different locations on the remote control 12, as described above. For example, sensed accelerations of the car in a front-back axis can cause actuator assemblies positioned at the front and back of the remote control 12 to output haptic sensations in correlation with the sensed accelerations. Left/right actuator assemblies can similarly provide left/right differentiation to the user.
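As an illustration of this routing, the sketch below splits front-back and left-right accelerations into drive magnitudes for actuator assemblies assumed to sit at the front, back, left, and right of the housing; the names and normalization are assumptions.

```python
def route_accelerations(ax, ay):
    """Split sensed accelerations (front-back ax, left-right ay, both
    normalized -1..1) into drive magnitudes for actuator assemblies placed
    at the front, back, left, and right of the remote control housing."""
    return {
        "front": max(0.0, ax),
        "back": max(0.0, -ax),
        "right": max(0.0, ay),
        "left": max(0.0, -ay),
    }


# Hard forward acceleration with a slight leftward component.
print(route_accelerations(ax=0.8, ay=-0.3))
```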
Other embodiments can employ different types of actuators, such as voice coil actuators, moving magnet actuators, brushed or brushless motors, etc. Also, some embodiments can provide passive actuators that output a resistance on the lever, but cannot output an active force on the lever; such actuators include magnetic particle brakes, fluid brakes, or other brake-like devices.
Other controls on the remote control 230 can be similarly provided with kinesthetic force feedback. Buttons, for example, can be coupled to actuators to output forces in the degree of freedom of the button, as disclosed in U.S. Pat. No. 6,184,868. Knobs, dials, linear sliders or switches, steering wheels, trackballs, direction pads, joysticks and other controls can also be actuated.
Kinesthetic force feedback on the controls of the remote control can offer greater realism to experienced haptic effects. For example, spring forces can be output on the levers 20 and 22 to provide a centering function without having to use mechanical springs. Furthermore, when the toy is moving faster, the spring magnitude can be controlled to be higher to simulate the increased road forces on the steering of the car. In addition, if a local sensor such as an accelerometer detects that the car is airborne, the spring magnitude can be made zero to simulate the feel of light steering when the car loses contact with the road. When the car is experiencing a bumpy or rough terrain, the accelerometers on the car can provide the data back to the remote control to cause vibration or other forces on the lever that simulates the bumpiness and may cause some difficulty to the user in steering the toy.
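The sketch below illustrates, with assumed gains, how the steering spring behavior described above might be computed from speed, an airborne flag, and a bump level reported by the toy's accelerometer; it is not the described implementation.

```python
import random


def steering_spring_force(wheel_pos, speed, airborne, bump_level, base_k=1.0):
    """Compute the centering force on a steering control: the spring scales
    with speed to mimic road load on the steering, drops to zero while the
    toy is airborne, and gains a random rattle term when the accelerometer
    reports rough terrain (bump_level 0..1)."""
    k = 0.0 if airborne else base_k * (0.5 + speed)
    force = -k * wheel_pos
    force += bump_level * random.uniform(-0.3, 0.3)
    return force


print(steering_spring_force(wheel_pos=0.4, speed=0.9, airborne=False, bump_level=0.0))
```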
As explained above, toy device 14 can include electronic components for receiving signals and controlling motion of the toy. In some embodiments, the toy device 14 may include a processor 300, such as a microprocessor or other controller (hardware state machines, digital logic, an ASIC, etc.), which can receive sensor signals from sensor 32 (if a sensor is included on the toy) and which can output signals to control any output devices 302, such as motors that turn the front wheels for steering, wheel motors or other motors providing locomotion of the toy, any audio output devices such as a horn, and any visual output devices such as lights. In other embodiments, circuitry other than a processor 300 can be provided for supplying signals to output devices 302 and receiving signals from sensors 32; for example, an analog control system can receive the signals from the remote control to drive the appropriate motors of the toy 14. An ASIC, state machines, or other logic can also be used. Any other required components for use with processor 300 may also be included, such as memory, I/O circuitry, etc.
Processor 300 of the toy device 14 has a communication link 306 with the remote control device 12, as explained above. This link can be wireless through the use of RF or signals of other frequencies, or through a wire or other physical connection. In some embodiments, the link is one-way, from the remote control device 12 to the toy 14. In other embodiments, the link is bi-directional.
The haptic feedback remote control device 12 may include a local processor 310 which handles the input, output, and control functions of the remote control. The local processor 310 can be provided with software (firmware) instructions to monitor the controls of the remote control device 12, wait for information sent from the toy device 14, and provide signals to the actuators of the remote control device to control haptic sensations. Processor 310 can include one microprocessor chip, or multiple processors and/or co-processor chips. In other embodiments, processor 310 can include digital signal processor (DSP) functionality, or can be implemented with control logic components, an ASIC, or a hardware state machine instead of an actual microprocessor chip.
A local clock 312 can be coupled to the processor 310 to provide timing data which might be required, for example, to compute forces output by actuators of the remote control 12. Local memory 314, such as RAM and/or ROM, can be coupled to processor 310 to store instructions for processor 310 and store temporary and other data.
Sensor interface 316 may optionally be included in device 12 to convert sensor signals to signals that can be interpreted by the processor 310. For example, sensor interface 316 can receive and convert signals from a digital sensor such as an encoder or from an analog sensor using an analog to digital converter (ADC). Such circuits, or equivalent circuits, are well known to those skilled in the art. Alternately, processor 310 can perform these interface functions. Sensors 318 sense the position, motion, and/or other characteristics of particular controls of remote control device 12; for example, sensors 318 can sense the motion or position of levers 20 and 22 and any other buttons, switches, joysticks, trackballs, etc. on the remote control 12. Sensors 318 provide signals to processor 310 including information representative of that motion or position. Examples of sensors suitable for embodiments described herein are analog potentiometers, Hall effect sensors, digital rotary optical encoders, linear optical encoders, optical sensors such as a lateral effect photo diode, velocity sensors (e.g., tachometers) and/or acceleration sensors (e.g., accelerometers). Furthermore, either relative or absolute sensors can be employed.
Actuator interface 320 can optionally be connected between the actuators of remote control device 12 and processor 310 to convert signals from processor 310 into signals appropriate to drive the actuators. Interface 320 can include power amplifiers, switches, digital-to-analog converters (DACs), and other components well known to those skilled in the art.
Actuators 322 transmit forces, as described above, to the housing of the remote control 12 and/or particular controls 324 of remote control device 12 in one or more directions along one or more degrees of freedom in response to signals output by processor 310, i.e., they are “computer controlled.” Actuators 322 can include the actuator assembly 50 or rotary actuator 234 described above. The actuators can be any of a variety of devices, such as linear current control motors, stepper motors, pneumatic/hydraulic active actuators, a torquer (motor with limited angular range), magnetic particle brakes, friction brakes, or pneumatic/hydraulic passive actuators.
Power supply 326 can be coupled to actuator interface 320 and/or to actuators 322 to provide electrical power to the actuators and other components of the remote control 12. As described above, power supply 326 is preferably one or more batteries or another portable power supply.
Other input devices 328 can be included in device 12 and send input signals to microprocessor 310. Such input devices can include other buttons, dials, knobs, switches, voice recognition hardware, or other input mechanisms as described above. A safety or “deadman” switch can be included in some embodiments to provide a mechanism to allow a user to override and deactivate forces output by actuators 322.
The operation of the system is now generally described. In the simpler, lower-cost embodiment, the sensors 318 can detect motion of controls such as levers 20 and 22 by the user and send the appropriate signals to the processor 310. The processor 310 then sends the appropriate control signals to the toy device 14 to control it in accordance with the user manipulation of the controls. The processor 310 also sends actuator signals to the actuator assembly(ies) 50 to output haptic sensations in accordance with the control manipulated and the way that control is manipulated. In the more sophisticated embodiments, the toy device 14 can send signals to the remote control 12. In those embodiments, the local processor 310 receives signals from local sensors 318 as well as information from the sensors 32 on the toy device 14. The local processor then determines the haptic sensations to be output and controls the actuators 322 accordingly, as well as sending control signals to the toy device in accordance with the present user manipulation of the remote control 12.
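A minimal sketch of one pass of such a loop, for the bi-directional embodiment, follows; all interfaces (sensor callables, a radio object with a send method, actuator callables) are hypothetical stand-ins for the hardware described above.

```python
class FakeRadio:
    """Stand-in for the wireless transmitter in the remote control."""
    def send(self, packet):
        print("tx", packet)


def control_loop_step(local_sensors, toy_status, radio, actuators):
    """One pass of the remote control's main loop in the bi-directional
    embodiment: read the local controls, transmit command signals to the
    toy, then blend local manipulation and toy-reported status into the
    haptic output."""
    throttle = local_sensors["throttle"]()            # read lever 20
    steering = local_sensors["steering"]()            # read lever 22
    radio.send({"throttle": throttle, "steering": steering})

    status = toy_status() or {}                       # latest packet from the toy
    if status.get("collision"):
        actuators["housing"](1.0)                     # strong jolt on impact
    else:
        actuators["housing"](0.2 + 0.8 * abs(throttle))   # engine rumble


control_loop_step(
    local_sensors={"throttle": lambda: 0.6, "steering": lambda: -0.2},
    toy_status=lambda: {"collision": False},
    radio=FakeRadio(),
    actuators={"housing": lambda mag: print("vibrate", round(mag, 2))},
)
```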
While this invention has been described in terms of several preferred embodiments, it is contemplated that alterations, permutations, and equivalents thereof will become apparent to those skilled in the art upon a reading of the specification and study of the drawings. Furthermore, certain terminology has been used for the purposes of descriptive clarity, and not to limit the present invention. It is therefore intended that the following appended claims include all such alterations, permutations and equivalents as fall within the true spirit and scope of the present invention.
Number | Name | Date | Kind |
---|---|---|---|
3157853 | Hirsch | Nov 1964 | A |
3220121 | Cutler | Nov 1965 | A |
3497668 | Hirsch | Feb 1970 | A |
3517446 | Corlyon et al. | Jun 1970 | A |
3902687 | Hightower | Sep 1975 | A |
3903614 | Diamond et al. | Sep 1975 | A |
3919691 | Noll | Nov 1975 | A |
4160508 | Salisbury, Jr. | Jul 1979 | A |
4236325 | Hall et al. | Dec 1980 | A |
4414984 | Zarudiansky | Nov 1983 | A |
4513235 | Acklam et al. | Apr 1985 | A |
4581491 | Boothroyd | Apr 1986 | A |
4599070 | Hladky et al. | Jul 1986 | A |
4695266 | Hui | Sep 1987 | A |
4706294 | Ouchida | Nov 1987 | A |
4708656 | De Vries et al. | Nov 1987 | A |
4713007 | Alban | Dec 1987 | A |
4731603 | McRae et al. | Mar 1988 | A |
4795296 | Jau | Jan 1989 | A |
4868549 | Affinito et al. | Sep 1989 | A |
4885565 | Embach | Dec 1989 | A |
4891764 | McIntosh | Jan 1990 | A |
4930770 | Baker | Jun 1990 | A |
4934694 | McIntosh | Jun 1990 | A |
4938483 | Yavetz | Jul 1990 | A |
4940234 | Ishida et al. | Jul 1990 | A |
4964837 | Collier | Oct 1990 | A |
5019761 | Kraft | May 1991 | A |
5022407 | Horch et al. | Jun 1991 | A |
5024626 | Robbins et al. | Jun 1991 | A |
5035242 | Franklin | Jul 1991 | A |
5038089 | Szakaly | Aug 1991 | A |
5078152 | Bond | Jan 1992 | A |
5103404 | McIntosh | Apr 1992 | A |
5107262 | Cadoz et al. | Apr 1992 | A |
5146566 | Hollis, Jr. et al. | Sep 1992 | A |
5184319 | Kramer | Feb 1993 | A |
5186629 | Rohen | Feb 1993 | A |
5186695 | Mangseth et al. | Feb 1993 | A |
5195920 | Collier | Mar 1993 | A |
5203563 | Loper, III | Apr 1993 | A |
5212473 | Louis | May 1993 | A |
5216337 | Orton et al. | Jun 1993 | A |
5240417 | Smithson et al. | Aug 1993 | A |
5271290 | Fischer | Dec 1993 | A |
5275174 | Cook | Jan 1994 | A |
5296871 | Paley | Mar 1994 | A |
5299810 | Pierce | Apr 1994 | A |
5309140 | Everett | May 1994 | A |
5334027 | Wherlock | Aug 1994 | A |
5354162 | Burdea et al. | Oct 1994 | A |
5388992 | Franklin et al. | Feb 1995 | A |
5389865 | Jacobus et al. | Feb 1995 | A |
5399091 | Mitsumoto | Mar 1995 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5440183 | Denne | Aug 1995 | A |
5459382 | Jacobus et al. | Oct 1995 | A |
5466213 | Hogan | Nov 1995 | A |
5547382 | Yamasaki | Aug 1996 | A |
5565840 | Thorner et al. | Oct 1996 | A |
5580251 | Gilkes et al. | Dec 1996 | A |
5583478 | Renzi | Dec 1996 | A |
5587937 | Massie et al. | Dec 1996 | A |
5589828 | Armstrong | Dec 1996 | A |
5619180 | Massimino et al. | Apr 1997 | A |
5631861 | Kramer | May 1997 | A |
5643087 | Marcus et al. | Jul 1997 | A |
5661446 | Anderson et al. | Aug 1997 | A |
5669818 | Thorner et al. | Sep 1997 | A |
5684722 | Thorner et al. | Nov 1997 | A |
5692956 | Rifkin | Dec 1997 | A |
5709219 | Chen et al. | Jan 1998 | A |
5714978 | Yamanaka et al. | Feb 1998 | A |
5721566 | Rosenberg et al. | Feb 1998 | A |
5734373 | Rosenberg et al. | Mar 1998 | A |
5736978 | Hasser et al. | Apr 1998 | A |
5739811 | Rosenberg et al. | Apr 1998 | A |
5742278 | Chen et al. | Apr 1998 | A |
5754023 | Roston et al. | May 1998 | A |
5766016 | Sinclair | Jun 1998 | A |
5781172 | Engel et al. | Jul 1998 | A |
5784052 | Keyson | Jul 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
5790108 | Salcudean et al. | Aug 1998 | A |
5805140 | Rosenberg et al. | Sep 1998 | A |
5857986 | Moriyasu | Jan 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5894263 | Shimakawa et al. | Apr 1999 | A |
5897437 | Nishiumi et al. | Apr 1999 | A |
5914705 | Johnson et al. | Jun 1999 | A |
5945772 | Macnak et al. | Aug 1999 | A |
5973670 | Barber et al. | Oct 1999 | A |
5984880 | Lander et al. | Nov 1999 | A |
5986643 | Harvill et al. | Nov 1999 | A |
6001014 | Ogata et al. | Dec 1999 | A |
6004134 | Marcus et al. | Dec 1999 | A |
6036495 | Marcus et al. | Mar 2000 | A |
6044646 | Silverbrook | Apr 2000 | A |
6078126 | Rollins et al. | Jun 2000 | A |
6088017 | Tremblay et al. | Jul 2000 | A |
6088019 | Rosenberg | Jul 2000 | A |
6104158 | Jacobus et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6113459 | Nammoto | Sep 2000 | A |
6121955 | Liu | Sep 2000 | A |
6154201 | Levin et al. | Nov 2000 | A |
6160540 | Fishkin et al. | Dec 2000 | A |
6169540 | Rosenberg | Jan 2001 | B1 |
6184868 | Shahoian et al. | Feb 2001 | B1 |
6198206 | Saarmaa et al. | Mar 2001 | B1 |
6211861 | Rosenberg et al. | Apr 2001 | B1 |
6241574 | Helbing | Jun 2001 | B1 |
6256011 | Culver | Jul 2001 | B1 |
6268857 | Fishkin | Jul 2001 | B1 |
6275213 | Tremblay et al. | Aug 2001 | B1 |
6280327 | Leifer et al. | Aug 2001 | B1 |
RE37374 | Roston et al. | Sep 2001 | E |
6293798 | Boyle et al. | Sep 2001 | B1 |
6317032 | Oishi | Nov 2001 | B1 |
6346025 | Tachau et al. | Feb 2002 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6424333 | Tremblay et al. | Jul 2002 | B1 |
6585595 | Soma et al. | Jul 2003 | B1 |
6641480 | Murzanski et al. | Nov 2003 | B2 |
6650318 | Arnon | Nov 2003 | B1 |
6686901 | Rosenberg | Feb 2004 | B2 |
6697043 | Shahoian et al. | Feb 2004 | B1 |
6707443 | Bruneau et al. | Mar 2004 | B2 |
6717573 | Shahoian et al. | Apr 2004 | B1 |
6807291 | Tumey et al. | Oct 2004 | B1 |
20010003101 | Shinohara et al. | Jun 2001 | A1 |
20010045978 | McConnell et al. | Nov 2001 | A1 |
20020030663 | Tierling et al. | Mar 2002 | A1 |
20020103025 | Murzanski et al. | Aug 2002 | A1 |
20030030619 | Martin et al. | Feb 2003 | A1 |
20030201975 | Bailey et al. | Oct 2003 | A1 |
Number | Date | Country |
---|---|---|
0265011 | Apr 1988 | EP |
0 085 518 | Aug 1989 | EP |
0607580 | Jul 1994 | EP |
0626634 | Nov 1994 | EP |
0 695 566 | Feb 1996 | EP |
0 940 162 | Sep 1999 | EP |
977142 | Feb 2000 | EP |
2237160 | Apr 1991 | GB |
2 336 890 | Nov 1999 | GB |
0349086 | Jan 1990 | JP |
01-003664 | Jul 1990 | JP |
3033148 | Feb 1991 | JP |
02-109714 | Jan 1992 | JP |
04-007371 | Aug 1993 | JP |
05-300985 | Nov 1993 | JP |
06-039097 | May 1994 | JP |
05-193862 | Jan 1995 | JP |
07-024708 | Mar 1995 | JP |
07-019512 | May 1995 | JP |
07-053189 | Jun 1995 | JP |
08-048297 | Feb 1996 | JP |
08-168545 | Jul 1996 | JP |
9-215870 | Aug 1997 | JP |
10-314463 | Dec 1998 | JP |
11-004966 | Jan 1999 | JP |
11-313001 | Nov 1999 | JP |
2000-020222 | Jan 2000 | JP |
2000-102677 | Apr 2000 | JP |
2000-317866 | Nov 2000 | JP |
2000-334163 | Dec 2000 | JP |
2001-502200 | Feb 2001 | JP |
2001-062160 | Mar 2001 | JP |
WO9200559 | Jan 1992 | WO |
WO9731333 | Aug 1997 | WO |
WO9832112 | Jul 1998 | WO |
WO9940504 | Aug 1999 | WO |
WO0103105 | Jan 2001 | WO |
WO0113354 | Feb 2001 | WO |
WO0124158 | Apr 2001 | WO |
WO0227705 | Apr 2002 | WO |
Entry |
---|
Webster's Third New International Dictionary of the English Language Unabridged, 1965, pp. 1375-1376. |
Internet: Islwww.epfl.ch/˜penha/predoc/vr/haptic.html, “Haptics”, (printed from www.archive.org), 1999. |
An Architecture for Haptic Control of Media, 1999 International Mechanical Engineering Congress and Exposition—Eighth Annual Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, 1999. |
Web page Conti, “HapticDriver Remote Driving with Force Feedback,” at URL=http://robotics.stanford.edu/˜conti/HapticDriver, pp. 4. |
Web page Fong et al., Novel interfaces for remote driving: gesture, haptic and PDA, at URL=http://imtsg7.epfl.ch/papers/SPIE00-TF.pdf, pp. 12. |
Korean Intellectual Property Office, Korean Application No. 10-2003-7012826, Notice of Preliminary Rejection, mailed Jan. 4, 2008, 12 pages. |
Japan Patent Office, Japanese Application No. 2002-577071, Notice for Reasons of Rejection, mailed Jul. 19, 2007, 6 pages. |
Patent Office of the Peoples Republic of China, Chinese Application No. 02807519, First Office Action, mailed Aug. 18, 2006, 3 pages. |
Patent Office of the Peoples Republic of China, Chinese Application No. 02807519, Second Office Action, mailed Mar. 14, 2008, 3 pages. |
European Patent Office, Application No. 02726696, Communication, mailed Feb. 22, 2006, 3 pages. |
Patent Cooperation Treaty, International Search Report, International Application No. PCT/US02/10394, 2 pages. |
Baigrie, “Electric Control Loading—A Low Cost, High Performance Alternative,” Proceedings, pp. 247-254, Nov. 6-8, 1990. |
Iwata, “Pen-based Haptic Virtual Environment,” 0-7803-1363-1/93 IEEE, pp. 287-292, 1993. |
Russo, “The Design and Implementation of a Three Degree of Freedom Force Output Joystick,” MIT Libraries Archives Aug. 14, 1990, pp. 1-131, May 1990. |
Brooks et al., “Hand Controllers for Teleoperation—A State-of-the-Art Technology Survey and Evaluation,” JPL Publication 85-11; NASA-CR-175890; N85-28559, pp. 1-84, Mar. 1, 1985. |
Jones et al., “A perceptual analysis of stiffness,” ISSN 0014-4819 Springer International (Springer-Verlag); Experimental Brain Research, vol. 79, No. 1, pp. 150-156, 1990. |
Burdea et al., “Distributed Virtual Force Feedback, Lecture Notes for Workshop on Force Display in Virtual Environments and its Application to Robotic Teleoperation,” 1993 IEEE International Conference on Robotics and Automation, pp. 25-44, May 2, 1993. |
Snow et al.,“Model-X Force-Reflecting-Hand-Controller,” NT Control No. MPO-17851; JPL Case No. 5348, pp. 1-4, Jun. 15, 1989. |
Ouh-Young, “Force Display in Molecular Docking,” Order No. 9034744, p. 1-369, 1990. |
Tadros, “Control System Design for a Three Degree of Freedom Virtual Environment Simulator Using Motor/Brake Pair Actuators”, MIT Archive © Massachusetts Institute of Technology, pp. 1-88, Feb. 1990. |
Caldwell et al., “Enhanced Tactile Feedback (Tele-Taction) Using a Multi-Functional Sensory System,” 1050-4729/93, pp. 955-960, 1993. |
Adelstein, “Design and Implementation of a Force Reflecting Manipulandum for Manual Control research,” DSC-vol. 42, Advances in Robotics, Edited by H. Kazerooni, pp. 1-12, 1992. |
Gotow et al., “Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback,” WA11-11:00, pp. 332-337. |
Stanley et al., “Computer Simulation of Interacting Dynamic Mechanical Systems Using Distributed Memory Parallel Processors,” DSC-vol. 42, Advances in Robotics, pp. 55-61, ASME 1992. |
Russo, “Controlling Dissipative Magnetic Particle Brakes in Force Reflective Devices,” DSC-vol. 42, Advances in Robotics, pp. 63-70, ASME 1992. |
Kontarinis et al., “Display of High-Frequency Tactile Information to Teleoperators,” Telemanipulator Technology and Space Telerobotics, Won S. Kim, Editor, Proc. SPIE vol. 2057, pp. 40-50, Sep. 7-9, 1993. |
Patrick et al., “Design and Testing of a Non-reactive, Fingertip, Tactile Display for Interaction with Remote Environments,” Cooperative Intelligent Robotics in Space, Rui J. deFigueiredo et al., Editor, Proc. SPIE vol. 1387, pp. 215-222, 1990. |
Adelstein, “A Virtual Environment System for The Study of Human Arm Tremor,” Ph.D. Dissertation, Dept. of Mechanical Engineering, MIT, Jun. 1989. |
Bejczy, “Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation,” Science, vol. 208, No. 4450, pp. 1327-1335, 1980. |
Bejczy, “Generalization of Bilateral Force-Reflecting Control of Manipulators,” Proceedings of Fourth CISM-IFToMM, Sep. 8-12, 1981. |
McAffee, “Teleoperator Subsystem/Telerobot Demonstrator: Force Reflecting Hand Controller Equipment Manual,” JPL D-5172, pp. 1-50, A1-A36, B1-65, C1-C36, Jan. 1988. |
Minsky, “Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Ph.D. Dissertation, MIT, Jun. 1995. |
Jacobsen et al., “High Performance, Dextrous Telerobotic Manipulator With Force Reflection,” Intervention/ROV '91 Conference & Exposition; Hollywood, Florida, May 21-23, 1991. |
Shimoga, “Finger Force and Touch Feedback Issues in Dexterous Telemanipulation,” Proceedings of Fourth Annual Conference on Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Sep. 30-Oct. 1, 1992. |
IBM Technical Disclosure Bulletin, “Mouse Ball-Actuating Device With Force and Tactile Feedback,” vol. 32, No. 9B, Feb. 1990. |
Terry et al., “Tactile Feedback in a Computer Mouse,” Proceedings of Fourteenth Annual Northeast Bioengineering Conference, University of New Hampshire, Mar. 10-11, 1988. |
Howe, “A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992. |
Eberhardt et al., “OMAR—A Haptic display for speech perception by deaf and deaf-blind individuals,” IEEE Virtual Reality Annual International Symposium, Seattle, WA, Sep. 18-22, 1993. |
Rabinowitz et al., “Multidimensional tactile displays: Identification of vibratory intensity, frequency, and contactor area,” Journal of The Acoustical Society of America, vol. 82, No. 4, Oct. 1987. |
Bejczy et al., “Kinesthetic Coupling Between Operator and Remote Manipulator,” International Computer Technology Conference, The American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980. |
Bejczy et al., “A Laboratory Breadboard System for Dual-Arm Teleoperation,” SOAR '89 Workshop, JSC, Houston, TX, Jul. 25-27, 1989. |
Ouh-Young, “A Low-Cost Force Feedback Joystick and Its Use in PC Video Games,” IEEE Transactions on Consumer Electronics, vol. 41, No. 3, Aug. 1995. |
Marcus, “Touch Feedback in Surgery,” Proceedings of Virtual Reality and Medicine The Cutting Edge, Sep. 8-11, 1994. |
Bejczy, et al., “Universal Computer Control System (UCCS) for Space Telerobots,” CH2413-3/87/0000/0318501.00 1987 IEEE, 1987. |
Aukstakalnis et al., “Silicon Mirage: The Art and Science of Virtual Reality,” ISBN 0-938151-82-7, pp. 129-180, 1992. |
Eberhardt et al., “Including Dynamic Haptic Perception by The Hand: System Description and Some Results,” DSC-vol. 55-1, Dynamic Systems and Control: vol. 1, ASME 1994. |
Gobel et al., “Tactile Feedback Applied to Computer Mice,” International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995. |
Pimentel et al., “Virtual Reality: through the new looking glass,” 2nd Edition; McGraw-Hill, ISBN 0-07-050167-X, pp. 41-202, 1994. |
“Cyberman Technical Specification,” Logitech Cyberman SWIFT Supplement, Apr. 5, 1994. |
Ouhyoung et al., “The Development of a Low-Cost Force Feedback Joystick and Its Use in the Virtual Reality Environment,” Proceedings of the Third Pacific Conference on Computer Graphics and Applications, Pacific Graphics '95, Seoul, Korea, Aug. 21-24, 1995. |
Kaczmarek et al., “Tactile Displays,” Virtual Environment Technologies. |
Scannell, “Taking a Joystick Ride,” Computer Currents, Boston Edition, vol. 9, No. 11, Nov. 1994. |
Schmult, Brian et al., “Application Areas for a Force-Feedback Joystick,” ASME 1993, DSC-vol. 49, pp. 47-54. |
Hasser, Christopher John, “Tactile Feedback for a Force-Reflecting Haptic Display,” The School of Engineering, University of Dayton, Dec. 1995, pp. iii-xii &1-96. |
Akamatsu, M. et al., “Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display,” Presence, vol. 3, No. 1, 1994, pp. 73-80. |
Kelley, A. J. et al., “MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface using an Electromagnetically Actuated Input/Output Device,” Dept. of Elec. Eng., Univ. of Brit. Columbia, 1993, pp. 1-27. |
Hasser, C. et al., “Tactile Feedback with Adaptive Controller for a Force-Reflecting Haptic Display,” Parts 1&2, IEEE 0-7803-3131-1, 1996, pp. 526-533. |
Ramstein, C., “Combining Haptic and Braille Technologies: Design Issues and Pilot Study,” Assets '96, 2nd Annual ACM Conf. on Assistive Technologies, 1996, pp. 37-44. |
Dennerlein, et al., “Vibrotactile Feedback for Industrial Telemanipulators,” ASME IMECE, 6th Annual Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Nov. 1997, pp. 1-7. |
Minsky, Margaret et al., “Feeling and Seeing: Issues in Force Display,” ACM 089791-351-5, 1990, pp. 235-242. |
Ouh-young, M. et al., “Creating an Illusion of Feel: Control Issues in Force Display,” Computer Science Dept., University of North Carolina, 1989, pp. 1-14. |
Hasser, C., “Force-Reflecting Anthropomorphic Hand Masters,” AL/CF-TR-1995-0110, 1995, pp. 5-31. |
Kim, Won, “Telemanipulator Technology and Space Telerobotics,” SPIE Proceedings 1993, vol. 2057, pp. 40-50. |
Japan Patent Office, Japanese Application No. 2002-577071, Notice for Reasons of Rejection, mailed Apr. 10, 2008. |
Number | Date | Country | |
---|---|---|---|
20020142701 A1 | Oct 2002 | US |