The present application relates to the field of toy vehicles. More particularly, the described embodiments relate to a remote control vehicle with on-board telemetry systems that monitor performance and a remote control application that monitors inputs by a user for storage and sharing over a cloud-based computer network.
a is a schematic diagram showing a first embodiment of a track having a visual track indicator.
b is a schematic diagram showing a second embodiment of a track having a visual track indicator.
c is a schematic diagram showing a third embodiment of a track having a visual track indicator.
The tablet computer 130 stores the telemetry and track-related data that it receives from the vehicle 140 in its memory. This data can be transferred through a wide-area network 120 (such as the Internet) to one or more remote servers 110 for storage in the remote database 112. In addition, the tablet computer 130 can record the input commands that it received from the user and store these inputs along with the telemetry and track data in its memory for transmission to the remote server 110. In one embodiment, this information is grouped together for each race (or “run”) through the track by the first vehicle 140. A run is a particular race by the vehicle 140 through the track 150, and may constitute one or more laps of the track 150. In other embodiments, the vehicle 140 need not operate on a particular track 150, and a run is a set time period during which the remote controlled vehicle 140 is operated and data is collected and stored.
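As an illustration only, the following sketch shows one way the data for a single run might be grouped before transmission to the remote server 110. The field names and structure are assumptions made for this example and are not specified by the embodiments above.

```python
# Hypothetical grouping of one run's data (user inputs, telemetry, and track
# events) into a single record for upload; all names here are illustrative.
from dataclasses import dataclass, field, asdict
from typing import Optional
import json
import time


@dataclass
class RunRecord:
    run_id: str
    vehicle_id: str
    track_id: Optional[str]                            # None when no defined track is used
    started_at: float = field(default_factory=time.time)
    inputs: list = field(default_factory=list)         # time-coded user commands
    telemetry: list = field(default_factory=list)      # time-coded sensor readings
    track_events: list = field(default_factory=list)   # markings, laps, and similar events

    def to_json(self) -> str:
        """Serialize the run for transmission to the remote server."""
        return json.dumps(asdict(self))


run = RunRecord(run_id="run-001", vehicle_id="140", track_id="150")
run.inputs.append({"t": 0.10, "throttle": 0.8, "steering": 0.0})
run.telemetry.append({"t": 0.10, "rpm": 1200, "accel_g": 0.4})
payload = run.to_json()   # this payload would be sent over the network 120 to the server 110
```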
A second user using a second tablet computer 132 can access the input commands, telemetry data, and track data that the first user stored in the database 112. The second tablet computer 132 can use this information to compare the first user's performance controlling vehicle 140 with the second user's performance controlling vehicle 142. In particular, the second user can lay out a track 152 that is identical to the track 150 used by the first user. Since the second tablet computer 132 has downloaded all of the information about the first user's run on track 150, the second user can use their tablet computer 132 to control their vehicle 142 through the same track layout 152 and compare the results. In effect, the two physical vehicles 140, 142 can race head-to-head through the same track layout 150, 152, even though the second vehicle 142 may make its run through the track 152 at a later time and at a different location than the run made by the first vehicle 140 through its track 150.
The mobile device 210 can take the form of a smart phone or tablet computer. As such, the device 210 will include a display 212 for displaying information to a user, a user input mechanism 214 for receiving user input commands (such as a touch screen integrated into the display 212), and a processor 216 for processing instructions and data for the device 210. The display 212 can use LCDs, OLEDs, or similar technology to provide a color display for the user. The processor 216 can be a general purpose CPU, such as those provided by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.), or a mobile-specific processor, such as those designed by ARM Holdings (Cambridge, England).
The mobile device 210 also has the local wireless interface 218 for interfacing with the remote control vehicle 200, and a wide area network interface 220 for connecting with a network 250. In some embodiments, these two interfaces 218, 220 could be the same interface. For instance, the mobile device 210 may interface with both the remote controlled vehicle 200 and the network 250 over a single Wi-Fi network interface 218, 220. The mobile device 210 further includes a memory 230 for storing processing instructions and data. This memory is typically solid-state memory, such as flash memory. Mobile devices such as device 210 generally use specific operating systems 232 designed for such devices, such as iOS from Apple Inc. (Cupertino, Calif.) or ANDROID OS from Google Inc. (Menlo Park, Calif.). The operating system 232 is stored on the memory 230 and is used by the processor 216 to provide a user interface for the display 212, to receive input over the user input device(s) 214, to handle communications for the device 210 over the interfaces 218, 220, and to manage applications (or apps) that are stored in the memory 230, such as the remote control app 234.
The remote control app 234 is responsible for receiving user input through the input mechanism 214 related to the control of the remote controlled vehicle 200 and ensuring that these inputs are relayed to the vehicle 200 over interface 218. In addition, the app 234 receives data from the vehicle 200 over interface 218 and stores this data in memory 230. In particular, the app 234 may receive car telemetry data 238 and track related data 240. In addition, some embodiments of the app 234 allow the user to request that the vehicle 200 take video or still images using an image sensor found on the vehicle 200. This image data 242 is also received by the app 234 and stored in memory 230. In addition to storing this data 238, 240, 242, the app 234 also generates a user interface on the display 212 and shares this data in real time with the user over the display 212. Finally, the app 234 is responsible for connecting with a remote server 260 over network interface 220 and for sharing its data 238, 240, 242 with the server 260. The app 234 can also request data from the server 260 concerning previous runs or runs made by third parties, and store this data in memory 230.
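A minimal sketch of how the responsibilities of the remote control app 234 described above might be organized is shown below; the class and method names, and the injected link, server, and storage objects, are assumptions for illustration only.

```python
# Illustrative outline of the remote control app's duties: relay user inputs to
# the vehicle, store incoming vehicle data, update the display, and sync with
# the remote server. The collaborating objects are assumed placeholders.
class RemoteControlApp:
    def __init__(self, vehicle_link, server_client, store):
        self.vehicle_link = vehicle_link    # local wireless interface to the vehicle
        self.server_client = server_client  # wide area network connection to the server
        self.store = store                  # local storage for inputs, telemetry, track, and image data

    def on_user_input(self, command):
        """Relay a throttle/steering command to the vehicle and record it locally."""
        self.vehicle_link.send(command)
        self.store.record_input(command)

    def on_vehicle_data(self, packet):
        """Store telemetry, track, or image data and show it to the user in real time."""
        self.store.record(packet)
        self.update_display(packet)

    def upload_run(self, run_id):
        """Share a completed run with the remote server."""
        self.server_client.post("/runs", self.store.export(run_id))

    def download_run(self, run_id):
        """Fetch a previous run or a third-party run for comparison or replay."""
        return self.server_client.get("/runs/" + str(run_id))

    def update_display(self, packet):
        """Render the latest data on the device display (not shown in this sketch)."""
```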
The server 260 contains a programmable digital processor 262, such as a general purpose CPU manufactured by Intel Corporation (Mountain View, Calif.) or Advanced Micro Devices, Inc. (Sunnyvale, Calif.). The server 260 further contains a wireless or wired network interface 264 to communicate with remote computing devices, such as mobile device 210, over the network 250. The processor 262 is programmed using a set of software instructions stored on a non-volatile, non-transitory, computer readable memory 266, such as a hard drive or flash memory device. The software typically includes operating system software 268, such as LINUX (available from multiple companies under open source licensing terms) or WINDOWS (available from Microsoft Corporation of Redmond, Wash.).
The processor 262 controls the communication with mobile device 210 under the direction of an application program 269 residing on the server memory 266. The application program 269 is further responsible for maintaining data within a database 270. The database 270 may be stored in the memory 266 of the server 260 as structured data (such as separate tables in a relational database, or as database objects in an object-oriented database environment). Alternatively, the database 270 may be stored and managed in a separate device, such as a separate database server computer and/or a storage area network storage device. Database programming stored on the memory 266 directs the processor 262 to access, manipulate, update, and report on the data in the database 270. This database programming can be considered part of the application programming 269.
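For illustration, the following sketch shows how run-related entities might be laid out as relational tables; the table and column names are assumptions and do not correspond to any specific entities 272-284.

```python
# Hypothetical relational layout for run data on the server; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")   # a real server would use a persistent database
conn.executescript("""
CREATE TABLE users   (user_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE tracks  (track_id INTEGER PRIMARY KEY, layout TEXT);
CREATE TABLE runs    (run_id INTEGER PRIMARY KEY,
                      user_id INTEGER REFERENCES users(user_id),
                      track_id INTEGER REFERENCES tracks(track_id),
                      started_at TEXT, total_time_s REAL);
CREATE TABLE samples (run_id INTEGER REFERENCES runs(run_id),
                      t REAL, kind TEXT, payload TEXT);  -- telemetry, track, and input samples
""")

# The application program 269 would insert and query runs, for example:
conn.execute("INSERT INTO users (user_id, name) VALUES (1, 'first user')")
conn.execute("INSERT INTO runs (run_id, user_id, track_id, started_at, total_time_s) "
             "VALUES (1, 1, NULL, '2013-07-25T12:00:00', 42.7)")
best = conn.execute("SELECT MIN(total_time_s) FROM runs WHERE user_id = 1").fetchone()
```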
Each of the database entities 272-284 shown in
The instructions received from the tablet computer 210 may include directions for the vehicle 200 to take still or video images on its image sensor(s) 316. In some embodiments, the resulting images 324 are stored in the vehicle memory 320 for transmission back to the tablet computer 210 when convenient. In the preferred embodiment, the data from the image sensors 316 is fed to the mobile device 210 as a live feed, allowing the tablet computer 210 to generate still and video image files as desired by the user.
While the vehicle 200 is moving under control of these instructions, the processor 310 is monitoring the telemetry sensors 312 to obtain data about how the vehicle 200 is moving and behaving. These sensors 312 may include RPM sensors that track wheel revolutions for the vehicle, accelerometer sensors that track acceleration of the vehicle, and even sensors that measure the performance and characteristics (such as heat) of the wheel motors 330. Preferably, the telemetry sensors 312 include at least an RPM sensor that can indicate wheel rotations, from which the vehicle speed and distance traveled can be derived. In some embodiments, separate RPM sensors 312 may be placed on wheels driven by the wheel motors 330 and on non-driven wheels. The sensors 312 at the powered wheels may detect wheel spin during periods of hard acceleration, at which time the sensors 312 at the non-driven wheels will give a better indication of the vehicle's current speed and distance traveled.
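The following sketch illustrates how speed and distance might be derived from wheel RPM, and how a mismatch between driven and non-driven wheel RPM can signal wheel spin; the wheel size and the spin threshold are assumed values, not taken from the embodiments above.

```python
# Deriving speed and distance from wheel revolutions, and detecting wheel spin
# by comparing driven and non-driven wheels. Constants are assumed examples.
import math

WHEEL_DIAMETER_M = 0.065                       # assumed wheel diameter for a small RC vehicle
WHEEL_CIRCUMFERENCE_M = math.pi * WHEEL_DIAMETER_M


def speed_mps(rpm: float) -> float:
    """Vehicle speed implied by a wheel turning at the given RPM."""
    return rpm / 60.0 * WHEEL_CIRCUMFERENCE_M


def distance_m(revolutions: float) -> float:
    """Distance traveled for a given number of wheel revolutions."""
    return revolutions * WHEEL_CIRCUMFERENCE_M


def wheel_spin(driven_rpm: float, undriven_rpm: float, threshold: float = 1.2) -> bool:
    """Driven wheels turning much faster than non-driven wheels indicates spin."""
    return driven_rpm > undriven_rpm * threshold


driven_rpm, undriven_rpm = 1500.0, 900.0
reference_rpm = undriven_rpm if wheel_spin(driven_rpm, undriven_rpm) else driven_rpm
current_speed = speed_mps(reference_rpm)       # about 3.1 m/s with the assumed wheel size
```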
In addition to monitoring telemetry sensors 312 such as the infrared RPM sensor 400 shown in
As explained above, in one embodiment the track sensor 314 is an image sensor that can detect visual markings 352 found on the track 350.
b shows another technique for creating visual track markings 352. In this Figure, the track segments 530, 540 each have colored lines 532, 534, 536 perpendicular to the length of the track segment at regularly spaced intervals. The sensors 314 can read the presence and color of these lines 532-536 and transmit this information to the processor 310. Using this information, the tablet computer 210 can also determine distance traversed along the track 350. By alternating three colors, the tablet computer 210 can identify situations where a vehicle has reversed direction and is traversing backwards along the track 350. Three different, identifiable shades or colors could also be used in the embodiment shown in
In
Regardless of which marking system is used, it is possible to create special markings on the track segments 510-560 to indicate unique locations on the track 350. For instance, a finish line could be created with a unique background color in track segment 510, or with a unique color line in segment 530, or with a unique code in segment 550. In this way, the vehicle 200 can sense when it crosses the finish line, in order to stop timing a particular run, or to count laps when a particular race involves multiple laps of the track 350.
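The sketch below illustrates both ideas from this and the preceding discussion: distance and direction derived from alternating colored lines, and lap counting triggered by a unique finish-line marking. The particular colors, their order, the line spacing, and the marking names are assumptions made for the example.

```python
# Hypothetical track-marking reader: alternating colors give distance and
# direction of travel, and a unique finish-line marking closes each lap.
import time

LINE_ORDER = ["red", "green", "blue"]   # assumed repeating color sequence along the track
LINE_SPACING_M = 0.10                   # assumed distance between adjacent lines


class TrackProgress:
    def __init__(self, total_laps):
        self.distance_m = 0.0
        self.direction = "unknown"
        self.lap_times = []
        self.total_laps = total_laps
        self._prev_color = None
        self._lap_start = None

    def on_color_line(self, color):
        """Update distance and direction from the latest colored line crossed."""
        if self._prev_color is not None:
            prev_i, new_i = LINE_ORDER.index(self._prev_color), LINE_ORDER.index(color)
            if new_i == (prev_i + 1) % 3:        # colors seen in forward order
                self.distance_m += LINE_SPACING_M
                self.direction = "forward"
            elif new_i == (prev_i - 1) % 3:      # colors seen in reverse order
                self.distance_m -= LINE_SPACING_M
                self.direction = "reverse"
        self._prev_color = color

    def on_finish_line(self):
        """The unique finish-line marking starts timing and counts laps."""
        now = time.monotonic()
        if self._lap_start is not None:
            self.lap_times.append(now - self._lap_start)
        self._lap_start = now

    @property
    def finished(self):
        return len(self.lap_times) >= self.total_laps


progress = TrackProgress(total_laps=3)
progress.on_finish_line()                # crossing the finish line starts lap 1
for seen in ["red", "green", "blue", "red"]:
    progress.on_color_line(seen)         # 0.30 m traveled, direction "forward"
```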
At the top of the interface are a variety of controls. The touch mode control changes the steering controls from the directional controls 610 to tilt control. Tilt control uses sensors within the mobile device 210 to determine when a user tilts the device 210 and sends signals to steer the vehicle in the direction of the tilt. The timer display at the center top of the interface 600 allows a user to see the current time of a particular “run.” The photo button stores the current image as a still image, while the video button causes the mobile device 210 to record the image stream coming from the vehicle 200 as a video file. In
At step 710, a wireless communication path is established between the vehicle 200 and the mobile device 210. As explained above, this can be a Bluetooth connection, a Wi-Fi connection, or any similar wireless data communication path between the devices 200, 210. At step 715, the vehicle 200 is placed on the track 350. At step 720, the mobile device 210 is used to control the vehicle in a race or run around the track 350. At the same time (or immediately after the race), the mobile device acquires data related to the race (step 725).
These steps 720, 725 are described in more detail in connection with flow chart 800 in
While the vehicle 200 is in motion, the telemetry sensors 312, the track sensors 314, and the image sensors 316 will be read at steps 825, 830, and 835 respectively. In one embodiment, the data from these sensors 312-316 will be immediately transmitted to the mobile device 210 at step 840. The mobile device 210 will receive this data at step 845, and then display and/or store the data. In other embodiments, the data from the sensors 312-316 will be stored in the memory 320 of the vehicle 200 for transmission to the mobile device 210 at a later time. In the preferred embodiment, all of this data will be time coded so as to be able to compare each element of data temporally with the other data, including the received control signals.
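A simplified sketch of this time-coded sampling and transmission follows; the sensor-reading function and the sampling rate are placeholders, not values taken from the embodiments.

```python
# Sampling the vehicle sensors at a fixed rate and stamping every packet with a
# shared clock so it can later be aligned with the recorded control signals.
import time


def read_sensors():
    """Placeholder reads; a real vehicle would query its telemetry sensors 312,
    track sensors 314, and image sensors 316 here."""
    return {"rpm": 0.0, "accel_g": 0.0, "track_marking": None}


def sample_and_send(transmit, period_s=0.05, duration_s=1.0):
    """Sample all sensors periodically, time-coding each packet before sending."""
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        packet = {"t": time.monotonic(), **read_sensors()}
        transmit(packet)        # immediate transmission; could instead buffer in vehicle memory 320
        time.sleep(period_s)


received = []
sample_and_send(received.append, duration_s=0.2)   # the mobile-device side simply stores the packets
```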
Returning to step 730 of
At step 735, the mobile device 210 downloads from the server 260 data from a race run by a third party. This allows the user of the mobile device 210 to compare the third-party race results with their own. If the third party ran the race on the same track configuration, it would be possible to compare the performance of each user head-to-head. The total time for the race could be compared to determine a race winner. Individual statistics could be compared, such as fastest lap time, longest wheel skid, etc. If the user elects to perform their race again at step 745, the third-party race results could be displayed live on the interface 600 while the user controlled their vehicle 200 over the track 350. The interface 600 could display the user's lead over the opponent, or how far behind their opponent their vehicle 200 is at any point in the race. The interface 600 could even superimpose an image of the third-party vehicle on the image portion 602 of the interface 600 whenever the user was running behind the third party during their race.
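By way of illustration, the sketch below compares two runs downloaded in this manner: total time, fastest lap, and the gap between the vehicles at a given moment. The run format (lap-time lists and time-coded distance samples) is an assumption for this example.

```python
# Comparing a user's run with a downloaded third-party run. The data layout
# (lists of lap times, and time-coded distance-along-track samples) is assumed.
from bisect import bisect_right


def race_winner(my_laps, their_laps):
    """Whoever has the lower total time wins the head-to-head comparison."""
    return "user" if sum(my_laps) < sum(their_laps) else "third party"


def fastest_lap(laps):
    return min(laps)


def gap_at(t, my_progress, their_progress):
    """Positive result: the user is ahead (in meters) at time t; negative: behind."""
    def distance_at(progress):
        times = [sample[0] for sample in progress]
        i = max(bisect_right(times, t) - 1, 0)
        return progress[i][1]
    return distance_at(my_progress) - distance_at(their_progress)


print(race_winner([30.1, 29.8], [30.5, 29.9]))      # "user"
mine = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.5)]          # (time s, distance m) samples
theirs = [(0.0, 0.0), (1.0, 2.5), (2.0, 4.0)]
print(gap_at(2.0, mine, theirs))                     # 0.5 m lead at the two-second mark
```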
At step 750, the third-party user input data 236 could even be used to control the user's vehicle 200. While environmental differences and variations in starting positions may prevent input data 236 from a multi-lap third-party race from recreating the entire race with the user's vehicle, this ability could prove useful for recreating maneuvers performed by third parties. For example, a third party may be able to perform interesting tricks with their vehicle based on a particular sequence of input commands. The third party could upload the run data for the trick. When the current user sees video of the trick (such as on a website operated using the data in database 270), they can download the third-party run data for the trick. At step 750, they use the third-party user inputs to control their own physical vehicle 200 as the trick is recreated in front of them.
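The replay idea can be sketched as follows: each downloaded command is re-sent at the same relative time it was originally issued. The command fields and the send function are assumptions for illustration.

```python
# Replaying a downloaded sequence of time-coded user inputs on the user's own
# vehicle to recreate a third-party maneuver. Field names are illustrative.
import time


def replay_run(inputs, send):
    """Send each recorded command at the same relative time it was originally given."""
    start = time.monotonic()
    for cmd in sorted(inputs, key=lambda c: c["t"]):
        delay = cmd["t"] - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        send({"throttle": cmd["throttle"], "steering": cmd["steering"]})


trick = [
    {"t": 0.0, "throttle": 1.0, "steering": 0.0},
    {"t": 0.5, "throttle": 1.0, "steering": 0.8},   # hard right to start the maneuver
    {"t": 0.9, "throttle": 0.2, "steering": 0.0},
]
replay_run(trick, send=print)   # in practice send() would transmit over the local wireless interface
```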
The method 900 starts at step 905, in which the mobile device 210 receives a command through the user interface to put the vehicle 200 into a controlled start mode. At step 910, the mobile device 210 transmits this mode change to the vehicle 200. At step 915, the vehicle 200 receives this transmission and sets the vehicle 200 to operate in this mode.
Some time later, the mobile device 210 receives a command through its user interface to increase the throttle of the vehicle 200 to a user-selected level (step 920). This command is then sent as an instruction to the vehicle 200 at step 925. The vehicle 200 receives this command at step 930. However, instead of operating the motor 330 at the user-selected level, the vehicle 200 recognizes that it is not currently moving and has been set to operate in the controlled start mode. As a result, the vehicle starts the motor 330 operating at maximum torque (full throttle) in step 930. This allows the vehicle 200 to start as quickly as possible.
At step 935, the vehicle 200 senses that it has begun to move. This sensing can be accomplished through the track sensor 314. In other embodiments, the telemetry sensors 312 can determine that the vehicle wheels are now moving the vehicle 200. Once the vehicle 200 has begun to move, the vehicle automatically reduces the amount of torque being provided from the motor 330 to the vehicle wheels (step 940). In one embodiment, the amount of torque being provided by the motor 330 at step 940 is completely controlled by the throttle commands received from the mobile device 210. If the user requests 90% throttle, step 940 will now provide 90% throttle. In another embodiment, user throttle commands are not allowed to drive the motor 330 above a pre-set throttle/torque level, where such level is below the maximum level provided at step 930. For example, to help novice users control the vehicle, the controlled start mode may prevent the motor from ever providing greater than 60% throttle (other than during start-up at step 930). If the user requests 20% throttle, the motor 330 is operated at 20%, but if the user requests 100% throttle, the motor 330 is operated at the pre-set level (such as 60%).
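The vehicle-side behavior described in this step can be sketched as a single throttle-selection function; the 60% cap matches the example above, while the function itself and its names are assumptions.

```python
# Hypothetical controlled start logic: full torque until motion is detected,
# then the user's throttle command applies, optionally limited to a preset cap.
from typing import Optional


def controlled_start_throttle(user_throttle: float, is_moving: bool,
                              cap: Optional[float] = 0.60) -> float:
    """Return the throttle level (0.0 to 1.0) actually applied to the motor."""
    if not is_moving:
        return 1.0                         # maximum torque for the quickest possible start
    if cap is not None:
        return min(user_throttle, cap)     # novice-friendly limit after launch
    return user_throttle                   # otherwise throttle is fully under user control


assert controlled_start_throttle(0.3, is_moving=False) == 1.0   # stopped: full torque regardless of input
assert controlled_start_throttle(0.2, is_moving=True) == 0.2    # moving: user request honored
assert controlled_start_throttle(1.0, is_moving=True) == 0.6    # moving: capped at the preset level
```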
In another embodiment, the vehicle 200 always operates its motor 330 as instructed by the mobile device 210. In this embodiment, the mobile device 210 becomes responsible for operating the vehicle 200 in controlled start mode. To accomplish this, the vehicle 200 must constantly provide telemetry data 326 and track data 328 to the mobile device 210. In this way, the mobile device 210 will know when the vehicle 200 is stopped and when it begins moving. The mobile device 210 will instruct the vehicle 200 to operate the motor 330 at full throttle when starting, as described above in connection with step 930. When the data received from the vehicle 200 indicates to the mobile device 210 that the vehicle 200 has started moving, the mobile device 210 will change its throttle command to the vehicle 200 to operate under controlled throttle (as described above in connection with step 940).
The many features and advantages of the invention are apparent from the above description. Numerous modifications and variations will readily occur to those skilled in the art. Since such modifications are possible, the invention is not to be limited to the exact construction and operation illustrated and described. Rather, the present invention should be limited only by the following claims.
This application claims the benefit of U.S. Provisional Application Ser. No. 61/858,440, filed Jul. 25, 2013, which is hereby incorporated by reference in its entirety herein.