This disclosure relates to systems and methods for using a vehicle as a motion base for a simulated experience.
Vehicle entertainment systems allow a person to use a display of a vehicle to play movies and/or games. Such vehicle entertainment systems do not allow the movies, games, and/or other entertainment media they present to change based on the motion of the vehicle.
This disclosure relates to using a vehicle as a motion base for a simulated experience. A simulated experience may refer to a recreational presentation conveyed to a person through one or more of visual, audio, haptic, and/or other simulation, where the visual, audio, haptic, and/or other simulation changes based on the motion of the vehicle. A recreational presentation may include one or more of a story, an image, a video, a movie, an audio, a song, a game, and/or other recreational presentations. The use of a vehicle as a motion base may allow a motion of the vehicle to form a part of a simulated experience. The use of a vehicle as a motion base may enhance a simulated experience and allow a person to feel more engaged by the simulated experience as the vehicle moves.
A system configured to use a vehicle as a motion base for a simulated experience may include one or more processors and/or other components. The one or more processors may be configured to obtain simulation information for a simulation experience, obtain ride information, identify occurrences of simulation events based on the ride information, and generate simulation stimuli that correspond to simulation events for which occurrences are identified. In some implementations, the one or more processors may be configured to effectuate provision of the simulated experience by operating one or more of a light source inside the vehicle, a speaker, a display, an air conditioner, a heater, a temperature controller of the vehicle, and/or other components.
In some implementations, the one or more processors may be configured to select a simulated experience. A simulation experience may be selected based on one or more of a trip criterion, a user selection, a prior simulated experience, and/or other information. A trip criterion may include information relating to the distance of the trip, the duration of the trip, the locations along the trip, and/or other information relating to the trip. A user selection may include a selection of a simulation experience based on one or more user inputs received through one or more input devices. An input device may include a key entry device, a touch entry device, an imaging device, a sound device, and/or other input devices. A prior simulated experience may include information relating to a simulation experience previously experienced by the user.
The one or more processors may be configured to obtain simulation information for the simulation experience. The simulation information may include simulation stimuli that correspond to simulation events. The simulation stimuli may include a first simulation stimulus that corresponds to a first simulation event. The simulation information may include one or more of a database, a lookup table, and/or other information components.
The one or more processors may be configured to obtain ride information. Ride information may include motion information that characterizes a motion experienced by a person in the vehicle. The one or more processors may obtain motion information from one or more of a sensor and/or a wearable sensor that characterizes a motion experienced by a person in the vehicle, and/or other sensors.
In some implementations, ride information may include activity information that characterizes an action performed and/or a sound made by a person in the vehicle. The one or more processors may obtain activity information from one or more of a sensor and/or a wearable sensor that characterizes an action performed by a person in the vehicle and/or a sound made by a person in the vehicle, and/or other sensors.
In some implementations, activity information may include one or more user inputs received through one or more input devices. An input device may include a key entry device, a touch entry device, a sound device, and/or other input devices. The one or more processors may obtain activity information from one or more input devices and/or other devices.
In some implementations, ride information may include trip progress information that characterizes a location of the vehicle along a trip, a distance traveled in a trip, a distance remaining in a trip, a duration traveled in a trip, and/or a remaining expected duration of a trip. The one or more processors may obtain trip progress information from one or more sensors that characterize a location of the vehicle along a trip, a distance traveled and/or remaining in a trip, a duration traveled and/or expected remaining in the trip, and/or other sensors.
In some implementations, ride information may include environment information that characterizes a condition of an environment around the vehicle. The one or more processors may obtain environment information from one or more sensors that characterize a condition of an environment around the vehicle, and/or other sensors. In some implementations, the one or more processors may obtain environment information by determining a location of the vehicle from one or more sensors that characterize a location of a vehicle and obtaining environment information at the location from a communication device.
In some implementations, ride information may include caravanning information that characterizes a relative position of the vehicle to another vehicle. The one or more processors may obtain caravanning information from one or more sensors that characterize a relative position of the vehicle to another vehicle, and/or other sensors. In some implementations, the one or more processors may obtain caravanning information from a communication device communicating with another communication device on or in the other vehicle.
The one or more processors may be configured to identify occurrences of simulation events based on the ride information. Occurrences of simulation events may be identified based on one or more of motion information, activity information, trip progress information, environment information, caravanning information, and/or other ride information.
The one or more processors may be configured to generate simulation stimuli that correspond to simulation events for which occurrences are identified. For example, responsive to identification of an occurrence of the first simulation event, the first simulation stimulus may be generated. A simulation stimulus may include one or more of a visual, an audio, a haptic, and/or other simulation that may change the simulation experience.
In some implementations, the one or more processors may be configured to effectuate provision of the simulated experience by operating one or more of a light source inside the vehicle, a speaker, a display, and/or other components. In some implementations, the one or more processors may be configured to effectuate provision of the simulated experience by operating one or more of an air conditioner, a heater, a temperature controller of the vehicle, and/or other components.
These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
To use a vehicle as a motion base for a simulation experience, simulation information for the simulation experience may be obtained. Simulation information may include simulation stimuli that correspond to simulation events. Ride information may be obtained to identify occurrences of simulation events. Simulation stimuli corresponding to the identified occurrences of simulation events may be generated. The simulation experience may be provided by operating one or more of a light source inside the vehicle, a speaker, a display, an air conditioner, a heater, a temperature controller of the vehicle, and/or other components.
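The flow described above (obtain simulation information, obtain ride information, identify occurrences of simulation events, and generate the corresponding simulation stimuli) may be sketched as follows. This is a non-limiting illustration; the event names, thresholds, and stimulus descriptions are hypothetical and not part of this disclosure.

```python
# Simulation information: a lookup table matching simulation events to stimuli.
SIMULATION_INFORMATION = {
    "vehicle_accelerating": "virtual vehicle jumps to light speed",
    "vehicle_braking": "virtual vehicle is hit by weapon fire",
    "rain_detected": "virtual rain begins in the virtual location",
}

def identify_simulation_events(ride_information):
    """Identify occurrences of simulation events based on ride information."""
    events = []
    acceleration = ride_information.get("acceleration", 0.0)  # m/s^2, hypothetical
    if acceleration > 2.0:        # example threshold for "accelerating"
        events.append("vehicle_accelerating")
    elif acceleration < -2.0:     # example threshold for "braking"
        events.append("vehicle_braking")
    if ride_information.get("weather") == "rain":
        events.append("rain_detected")
    return events

def generate_simulation_stimuli(events):
    """Generate the simulation stimuli that correspond to identified events."""
    return [SIMULATION_INFORMATION[e] for e in events if e in SIMULATION_INFORMATION]

# Example ride information obtained from sensor output signals.
ride = {"acceleration": 3.1, "weather": "rain"}
stimuli = generate_simulation_stimuli(identify_simulation_events(ride))
```

In a full system, each generated stimulus would then be provided by operating a display, a speaker, a light source, and/or a climate-control component of the vehicle.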
System 10 may include one or more of processor 11, sensor 12, simulation device 13, electronic storage 14, bus 15, and/or other components. Some or all components of system 10 may be installed in a vehicle and/or be otherwise coupled with a vehicle. Some or all components of system 10 may be worn by a person in a vehicle. Some or all components of system 10 may be installed in a device worn by a person in a vehicle and/or be otherwise coupled with a device worn by a person in a vehicle.
Sensor 12 may be configured to generate output signals conveying ride information. Ride information may characterize one or more aspects of a ride. The aspects of the ride may include a setting of the ride, operation of the vehicle, user interaction or reaction within the vehicle, and/or other aspects. Without limitation, ride information may include motion, action, sound, location, surroundings, and/or other information relating to a vehicle and/or a person in the vehicle. Ride information may include motion information, activity information, trip progress information, environment information, caravanning information, and/or other ride information.
Sensor 12 may include one or more of image sensors, temperature sensors, vehicle speed sensors, wheel speed sensors, motion sensors, accelerometers, tilt sensors, inclination sensors, angular rate sensors, gyroscopes, navigation sensors, geolocation sensors, magnetometers, radar detectors, radar sensors, proximity sensors, distance sensors, vibration sensors, light detection sensors, vehicle sensors, engine control module sensors, and/or other sensors. In some implementations, sensor 12 may be installed in a vehicle and/or be otherwise coupled to a vehicle. In some implementations, sensor 12 may be worn by a person in a vehicle. In some implementations, sensor 12 may be installed in or otherwise coupled to simulation device 13.
Simulation device 13 may be configured to provide a simulation experience. Simulation device 13 may provide a simulation experience visually, audibly, haptically, and/or in other ways. Simulation device 13 may include one or more of a display, a speaker, a light source, an air conditioner, a heater, a temperature controller and/or other simulation devices.
A display may provide a simulation experience through visual information presented on the display. Visual information may include information that may be observed visually. Visual information may include one or more of an image, a video, and/or other visual information. A display may include one or more of a head-mounted display, an optical head-mounted display, a see-through display, an optical see-through display, a video see-through display, a visor, eyeglasses, sunglasses, a computer, a laptop, a smartphone, a tablet, a mobile device, a projector, and/or other displays.
In some implementations, a display may include a motion, position, and/or orientation tracking component so that the visual information presented on the display changes as the position and/or orientation of the display changes. In some implementations, a display may be integrated with a vehicle. For example, a display may include one or more of a dashboard display, a global positioning system (GPS) navigation display, a front view camera display, a rear view camera display, a display of a vehicle entertainment system, and/or other displays.
A display may be configured to display a simulation experience using augmented reality technology. For example, a display may visually provide the simulation experience by displaying an overlay image over one or more of an image, a video, and/or other visual information so that one or more parts of a real-world object appear to be augmented by one or more parts of a virtual-world object. In some implementations, a display may use augmented reality technology to display a simulation experience by using systems and methods described in U.S. patent application Ser. No. 14/966,754, entitled “SYSTEMS AND METHODS FOR AUGMENTING AN APPEARANCE OF AN ACTUAL VEHICLE COMPONENT WITH A VIRTUAL VEHICLE COMPONENT,” filed Dec. 11, 2015, the foregoing being incorporated herein by reference in its entirety. Other systems and methods of providing a simulation experience are contemplated.
A speaker may provide a simulation experience through audio information generated by the speaker. Audio information may include information that may be observed audibly. Audio information may include one or more of sound, vibration and/or other audio information. A speaker may include one or more of a headphone, an earphone, a headset, an earset, and/or other speakers. In some implementations, a speaker may include a speaker associated with a display. For example, a speaker may include a speaker of a mobile device. In some implementations, a speaker may be integrated with a vehicle. For example, a speaker may include a sound system of a vehicle.
A light source may provide a simulation experience through one or more wavelengths and/or intensities of light. A light source may include an electric lighting, a fluorescent lighting, an incandescent lighting, an infrared lighting, a light-emitting diode, and/or other light sources. In some implementations, a light source may include a light source of a mobile device. In some implementations, a light source may be integrated with a vehicle. For example, a light source may include one or more interior light sources of a vehicle.
An air conditioner, a heater, and/or a temperature controller may provide a simulation experience through one or more of air flow and/or a change in temperature. An air conditioner may include an air conditioner of a vehicle, an air conditioner inside a vehicle, and/or other air conditioners. A heater may include a heater of a vehicle, a heater inside a vehicle, and/or other heaters. A temperature controller may include a temperature controller of a vehicle, a temperature controller inside a vehicle, and/or other temperature controllers.
Electronic storage 14 may include electronic storage media that electronically stores information. Electronic storage 14 may store software algorithms, information determined by processor 11, information received remotely, and/or other information that enables system 10 to function properly. For example, electronic storage 14 may store simulation information (as discussed elsewhere herein), ride information (as discussed elsewhere herein), information relating to a vehicle, information relating to a person in a vehicle, and/or other information.
Processor 11 may be configured to provide information processing capabilities in system 10. As such, processor 11 may comprise one or more of a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Processor 11 may be configured to execute one or more computer program components. The computer program components may include one or more of simulation experience component 20, simulation information component 21, ride information component 22, simulation event occurrence component 23, simulation stimuli generation component 24, simulation provision component 25, and/or other components.
Simulation experience component 20 may be configured to select a simulated experience. A simulation experience may be selected based on one or more of a trip criterion, a user selection, a prior simulated experience, and/or other information. Simulation experience component 20 may include or retrieve information (for example, a database, etc.) that matches one or more of a trip criterion, a user selection, a prior simulated experience, and/or other information relating to a particular simulation experience.
A simulation experience may be selected based on a trip criterion. A trip criterion may refer to one or more physical and/or temporal characteristics of a trip. By way of non-limiting example, a trip criterion may include information relating to the trip destination, the distance of the trip, the duration of the trip, the locations along the trip, and/or other information relating to the trip. A trip criterion may be obtained based on one or more user inputs received through one or more input devices, and/or from one or more navigation devices.
In some implementations, simulation experience component 20 may select a simulated experience based on a trip destination. A trip destination may refer to a destination on one or more parts of a trip. For example, simulation experience component 20 may match a trip destination of a home to a simulation experience relating to space travel to a home base, a home planet, or a home ship. As another example, simulation experience component 20 may match a trip destination of a school to a simulation experience relating to space travel to a training ground.
In some implementations, simulation experience component 20 may select a simulated experience based on a distance of a trip. A distance of a trip may refer to a distance of one or more parts of a trip. For example, simulation experience component 20 may match a short distance of a trip to a simulation experience relating to a space race to a nearby object/location. As another example, simulation experience component 20 may match a long distance of a trip to a simulation experience relating to a space race to a distant object/location. In some implementations, simulation experience component 20 may select a simulation experience so that a story of the simulation experience reaches its peak when the vehicle is estimated to be at a certain location in the trip. For example, simulation experience component 20 may select a simulation experience so that a story of the simulation experience will reach its peak when the vehicle is expected to have traveled a certain percentage (e.g., 70%) of the distance of the trip.
In some implementations, simulation experience component 20 may select a simulated experience based on a duration of a trip. A duration of a trip may refer to a duration of one or more parts of a trip. For example, simulation experience component 20 may match a short duration of a trip to a simulation experience relating to a short space battle. As another example, simulation experience component 20 may match a long duration of a trip to a simulation experience relating to a long space battle. In some implementations, simulation experience component 20 may select a simulation experience so that a story of the simulation experience reaches its peak when the vehicle is estimated to have a certain duration remaining in the trip. For example, simulation experience component 20 may select a simulation experience so that a story of the simulation experience will reach its peak when the vehicle is expected to have a certain percentage (e.g., 30%) of the duration remaining in the trip.
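The peak-timing selection described above may be sketched as follows. The catalog, timings, and target fraction are invented for illustration: an experience is chosen so that its story peak falls nearest a target fraction of the trip, e.g., 70% of the trip duration elapsed (equivalently, 30% of the duration remaining).

```python
def peak_time_for_trip(trip_duration_s, peak_fraction=0.7):
    """Elapsed time at which the story should reach its peak."""
    return trip_duration_s * peak_fraction

def select_experience(trip_duration_s, experiences):
    """Pick the experience whose built-in story peak best matches the trip.

    `experiences` maps an experience name to the time (in seconds) at which
    that experience's story reaches its peak.
    """
    target = peak_time_for_trip(trip_duration_s)
    return min(experiences, key=lambda name: abs(experiences[name] - target))

# Hypothetical catalog: a short and a long space battle.
catalog = {"short space battle": 300.0, "long space battle": 1500.0}
choice = select_experience(1800.0, catalog)  # a 30-minute trip peaks near 1260 s
```

The same selection could be driven by distance instead of duration by substituting a distance estimate for `trip_duration_s`.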
In some implementations, simulation experience component 20 may select a simulated experience based on a location along a trip. A location along a trip may refer to a location along one or more parts of a trip. As non-limiting examples, simulation experience component 20 may match an urban area along a trip to a simulation experience in a virtual city, a plain area along a trip to a simulation experience in a virtual field, and a forest area along a trip to a simulation experience in a virtual forest.
A trip criterion may be obtained based on one or more user inputs received through one or more input devices. A user input may refer to information provided by a user through an input device. By way of non-limiting example, a user input may include information relating to a simulation, a trip criterion, a user selection, a prior simulated experience, and/or other information. One or more user inputs may be received through one or more input devices. By way of non-limiting example, an input device may include a key entry device, a touch entry device, an imaging device, a sound device, and/or other input devices.
A key entry device may include a device that allows a user to provide one or more user inputs by typing one or more of characters, numbers, and/or other symbols. A key entry device may include a separate device or a part of another device. For example, a key entry device may include a keyboard coupled to processor 11. As another example, a key entry device may include a mobile device coupled to processor 11. A user may provide one or more user inputs by typing information. For example, a user may provide one or more user inputs by typing one or more of a trip destination, a distance of a trip, a duration of a trip, a location along a trip, and/or other information relating to a trip.
A touch entry device may include a device that allows a user to provide user inputs by touching a user interface of the touch entry device. A touch entry device may include a separate device or a part of another device. For example, a touch entry device may include a touch screen coupled to processor 11. As another example, a touch entry device may include a mobile device coupled to processor 11. A user may provide one or more user inputs by touching one or more portions of the touch entry device corresponding to particular information. For example, a user may provide one or more user inputs by touching one or more portions of the touch entry device corresponding to one or more of a trip destination, a distance of a trip, a duration of a trip, a location along a trip, and/or other information relating to a trip.
An imaging device may include a device that allows a user to provide user inputs by using an image sensor of the imaging device. An imaging device may include a separate device or a part of another device. For example, an imaging device may include an image sensor coupled to processor 11. As a non-limiting example, an imaging device may include sensor 12. As another example, an imaging device may include a mobile device coupled to processor 11. A user may provide one or more user inputs by directing the field of view of the imaging device to objects that include information. For example, a user may provide one or more user inputs by directing the field of view of the imaging device to objects that include information about one or more of a trip destination, a distance of a trip, a duration of a trip, a location along a trip, and/or other information relating to a trip.
A sound device may include a device that allows a user to provide user inputs through voice and/or sounds. A sound device may include a separate device or a part of another device. For example, a sound device may include a microphone coupled to processor 11. As another example, a sound device may include a mobile device coupled to processor 11. A user may provide one or more user inputs by speaking information. For example, a user may provide one or more user inputs by speaking one or more of a trip destination, a distance of a trip, a duration of a trip, a location along a trip, and/or other information relating to a trip.
In some implementations, a trip criterion may be obtained from one or more navigation devices. A navigation device may refer to a device that keeps track of a location of a vehicle on a trip. For example, a navigation device may include a navigation/GPS system of a vehicle and/or a navigation/GPS system coupled to processor 11. As another example, a navigation device may include a mobile device coupled to processor 11. Simulation experience component 20 may obtain from one or more navigation devices one or more of a trip destination, a distance of a trip, a duration of a trip, a location along a trip, and/or other information relating to a trip.
A simulation experience may be selected based on a user selection. A user selection may include a selection of a simulation experience based on one or more user inputs received through one or more input devices. By way of non-limiting example, an input device may include a key entry device, a touch entry device, an imaging device, a sound device, and/or other input devices, as described above. In some implementations, a user may select a simulation experience by using a key entry device to type one or more of characters, numbers, and/or other symbols corresponding to the simulation experience. In some implementations, a user may select a simulation experience by using a touch entry device to touch one or more portions of the touch entry device corresponding to the simulation experience.
In some implementations, a user may select a simulation experience by using an imaging device, directing the field of view of the imaging device to objects that include information relating to the simulation experience. For example, a user may direct the field of view of the imaging device to an augmented reality marker containing information relating to the simulated experience. An augmented reality marker may be two-dimensional or three-dimensional. As a non-limiting example, an augmented reality marker may include one or more of a sticker, a label, a barcode, a quick response (QR) code, and/or other augmented reality markers. In some implementations, a user may select a simulation experience by using a sound device to speak information relating to the simulated experience.
A simulation experience may be selected based on a prior simulated experience. A prior simulated experience may refer to one or more simulated experiences previously presented to a user. Information regarding a prior simulation experience may be obtained from a memory of system 10 (e.g., memory of processor 11, memory of electronic storage 14, and/or memory of another component of system 10) and/or a memory otherwise coupled to system 10 (e.g., memory of a mobile device). For example, information regarding a prior simulation experience may indicate that the prior simulation experience was not concluded. In some implementations, simulation experience component 20 may select the prior simulated experience and continue the presentation of the prior simulation experience.
Simulation information component 21 may be configured to obtain simulation information for the simulation experience. The simulation information may include one or more of a database, a lookup table, and/or other information components that allow simulation information component 21 to match a simulation event to a simulation stimulus. A simulation event may refer to one or more of specific motions, specific actions, specific sounds, specific locations, specific surroundings, and/or other specific conditions relating to a vehicle and/or a person in the vehicle. A simulation stimulus may refer to one or more of a visual, an audio, a haptic, and/or other simulation that may change a simulation experience. The simulation information may be programmed into simulation information component 21, updated by simulation information component 21, obtained by simulation information component 21 from electronic storage 14, obtained by simulation information component 21 from a remote location, and/or obtained by simulation information component 21 in other ways.
The simulation information may include simulation stimuli that correspond to simulation events. The simulation stimuli may include a first simulation stimulus that corresponds to a first simulation event. For example, a particular simulation stimulus (e.g., a virtual vehicle jumping into light speed/a virtual vehicle being hit by a weapon fire) may correspond to a particular motion of a vehicle (e.g., accelerating/braking). As another example, a particular simulation stimulus (e.g., a virtual location shaking and/or virtual objects falling/moving) may correspond to a particular activity inside a vehicle (e.g., high volume and/or intensity of physical activity inside a vehicle). As another example, a particular simulation stimulus (e.g., the size, shape, and/or angle of a virtual object changing and/or specific actions taken by a virtual object, such as communicating or firing a weapon) may correspond to a particular location of a vehicle (e.g., distance from a destination or duration to a destination). As another example, a particular simulation stimulus (e.g., virtual rain in a virtual location) may correspond to a particular environment around a vehicle (e.g., rain). Other simulation stimuli and simulation events are contemplated.
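A minimal sketch of simulation information as a lookup table, with entries drawn from the examples above (the keys and stimulus descriptions are illustrative only, not a definitive schema):

```python
# Lookup table matching each simulation event to a corresponding stimulus.
SIMULATION_STIMULI = {
    "vehicle accelerating": "virtual vehicle jumping into light speed",
    "vehicle braking": "virtual vehicle being hit by weapon fire",
    "high activity inside vehicle": "virtual location shaking and virtual objects falling",
    "rain around vehicle": "virtual rain in the virtual location",
}

def stimulus_for(event):
    """Match a simulation event to its simulation stimulus, if one exists."""
    return SIMULATION_STIMULI.get(event)
```

A database could serve the same role as the in-memory table; the first entry here plays the part of the "first simulation stimulus that corresponds to a first simulation event."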
Ride information component 22 may be configured to obtain ride information. Ride information may characterize one or more aspects of a ride. The aspects of the ride may include a setting of the ride, operation of the vehicle, user interaction or reaction within the vehicle, and/or other aspects. Without limitation, ride information may include motion, action, sound, location, surroundings, and/or other information relating to a vehicle and/or a person in the vehicle. Ride information may be obtained from output signals generated by sensor 12.
Ride information may include motion information. Motion information may characterize a motion experienced by a person in a vehicle at a time, over a duration of time, at a location, or over a distance. Motion information may include information regarding motion experienced by a person in a vehicle, including one or more of moving forward, moving backwards, moving right, moving left, moving up, moving down, turning left, turning right, sloping up, sloping down, acceleration in any direction and/or angle, deceleration in any direction and/or angle, jostling, hitting a speedbump, hitting a pothole, and/or other motion information. Processor 11 may obtain motion information from output signals generated by sensor 12. In some implementations, sensor 12 may include one or more of a vehicle speed sensor, a wheel speed sensor, a motion sensor, an accelerometer, a tilt sensor, an inclination sensor, an angular rate sensor, a gyroscope, a magnetometer, a vibration sensor, a vehicle sensor, an engine control module sensor, and/or other sensors.
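A hypothetical sketch of deriving motion information from accelerometer output signals; the thresholds are invented for illustration and would in practice be tuned to the vehicle and sensors:

```python
def classify_motion(longitudinal_ms2, vertical_ms2):
    """Characterize the motion experienced by a person in the vehicle.

    Positive longitudinal acceleration is classified as accelerating,
    negative as decelerating, and a large vertical spike as jostling
    (e.g., hitting a speedbump or a pothole).
    """
    motions = []
    if longitudinal_ms2 > 1.5:          # example threshold, m/s^2
        motions.append("accelerating")
    elif longitudinal_ms2 < -1.5:       # example threshold, m/s^2
        motions.append("decelerating")
    if abs(vertical_ms2) > 3.0:         # example threshold, m/s^2
        motions.append("jostling")
    return motions
```

Turning, sloping, and other motions listed above could be classified the same way from gyroscope and tilt sensor output signals.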
In some implementations, ride information may include activity information. Activity information may characterize an action performed and/or a sound made by a person in a vehicle at a time, over a duration of time, at a location, or over a distance. Activity information may include information regarding the activity of a person in a vehicle, including one or more of the quantity and/or quality of an action performed and/or a sound made by the person, and/or other activity information. Processor 11 may obtain activity information from output signals generated by sensor 12. In some implementations, sensor 12 may include one or more of an image sensor that characterizes an action performed by a person in the vehicle, a sound sensor that characterizes a sound made by a person in the vehicle, a wearable sensor that characterizes an action performed and/or a sound made by a person in the vehicle, and/or other sensors.
In some implementations, activity information may include one or more user inputs received through one or more input devices. An input device may include a key entry device, a touch entry device, a sound device, and/or other input devices. For example, one or more persons in the vehicle may change a simulated experience by providing activity information through the use of one or more user input devices. One or more persons in the vehicle may be able to provide the same or different types of activity information. For example, one person may be able to provide activity information corresponding to a virtual weapons control of a virtual spaceship while another person may be able to provide activity information corresponding to a virtual navigation control of the virtual spaceship. Processor 11 may obtain activity information from output signals generated by one or more user input devices.
In some implementations, ride information may include trip progress information. Trip progress information may characterize a location of a vehicle along a trip, a distance traveled in a trip, a distance remaining in a trip, a duration traveled in a trip, and/or a remaining expected duration of a trip. Trip progress information may include information regarding the status of a trip, including one or more of a location of a vehicle, a traveled distance, a remaining distance, a traveled duration, an expected remaining duration, and/or other trip progress information. Processor 11 may obtain trip progress information from output signals generated by sensor 12. In some implementations, sensor 12 may include one or more of a navigation sensor, a geolocation sensor, a magnetometer, a vehicle sensor, an engine control module sensor, and/or other sensors.
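Several of the trip progress quantities above can be derived from one another; the following is a simple sketch with a hypothetical helper, assuming a constant average speed:

```python
def trip_progress(traveled_km: float, total_km: float,
                  avg_speed_kmh: float) -> tuple:
    """Return (remaining distance in km, expected remaining duration
    in minutes, fraction of the trip completed). Assumes a constant
    average speed for the remainder of the trip."""
    remaining_km = max(total_km - traveled_km, 0.0)
    expected_remaining_min = 60.0 * remaining_km / avg_speed_kmh
    return remaining_km, expected_remaining_min, traveled_km / total_km
```

For instance, 30 km into a 120 km trip at an average of 60 km/h, 90 km and an expected 90 minutes remain, with a quarter of the trip completed.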
In some implementations, ride information may include environment information. Environment information may characterize a condition of an environment around a vehicle at a time, over a duration of time, at a location, or over a distance. Environment information may include information regarding a condition of an environment around a vehicle, including one or more of time, weather, temperature, humidity, lighting, terrain, nearby objects, nearby buildings, and/or other environment information. Processor 11 may obtain environment information from output signals generated by sensor 12. In some implementations, sensor 12 may include one or more of a clock, an image sensor, a temperature sensor, a vibration sensor, a light detection sensor, a vehicle sensor, an engine control module sensor, and/or other sensors. In some implementations, processor 11 may obtain environment information by determining a location of a vehicle from output signals generated by sensor 12 and obtaining environment information at the location from a communication device.
In some implementations, ride information may include caravanning information. Caravanning information may characterize a relative position of the vehicle to another vehicle at a time, over a duration of time, at a location, or over a distance. Caravanning information may include information regarding a position, an orientation, and/or a speed of the vehicle and/or another vehicle. Processor 11 may obtain caravanning information from output signals generated by sensor 12. In some implementations, sensor 12 may include one or more of an image sensor, a vehicle speed sensor, a wheel speed sensor, a motion sensor, an accelerometer, a tilt sensor, an inclination sensor, an angular rate sensor, a gyroscope, a navigation sensor, a geolocation sensor, a magnetometer, a radar detector, a radar sensor, a proximity sensor, a distance sensor, a vehicle sensor, an engine control module sensor, and/or other sensors. In some implementations, processor 11 may obtain caravanning information from a communication device communicating with another communication device on or in another vehicle. For example, the other communication device on the other vehicle may provide caravanning information regarding the other vehicle.
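As a hedged sketch of what caravanning information might yield (hypothetical helper, planar positions), the relative position of two vehicles can be summarized as a separation distance and a closing speed:

```python
import math

def relative_state(pos_a, pos_b, speed_a: float, speed_b: float):
    """Return (separation distance, closing speed) for two vehicles.
    Positions are (x, y) tuples in consistent units; a positive
    closing speed means vehicle A is gaining on vehicle B."""
    distance = math.hypot(pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    return distance, speed_a - speed_b
```

Such a summary could feed the simulation event logic described below, e.g. to trigger an event when another vehicle in the caravan draws within a given distance.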
Simulation event occurrence component 23 may be configured to identify occurrences of simulation events based on ride information. A simulation event may refer to one or more of specific motions, specific actions, specific sounds, specific locations, specific surroundings, and/or other specific conditions relating to a vehicle and/or a person in the vehicle. Occurrences of simulation events may be identified based on one or more of motion information, activity information, trip progress information, environment information, caravanning information, and/or other ride information. Simulation event occurrence component 23 may be configured to identify an occurrence of a simulation event when one or more of motion information, activity information, trip progress information, environment information, caravanning information, and/or other ride information indicates occurrence of one or more of specific motions, specific actions, specific sounds, specific locations, specific surroundings, and/or other specific conditions relating to a vehicle and/or a person in the vehicle that correspond to a specific simulation event.
Criteria for an occurrence of one or more simulation events may be referred to as a simulation event logic. The simulation event logic may be programmed into simulation event occurrence component 23, updated by simulation event occurrence component 23, obtained by simulation event occurrence component 23 from the simulation information, obtained by simulation event occurrence component 23 from electronic storage 14, obtained by simulation event occurrence component 23 from a remote location, and/or obtained by simulation event occurrence component 23 in other ways.
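One way to picture such simulation event logic (a hypothetical sketch, not the disclosed implementation) is as a table of criteria, each a predicate evaluated against the ride information:

```python
# Hypothetical sketch: "simulation event logic" as a table mapping
# simulation event names to criteria over ride information. Event
# names, keys, and thresholds are illustrative assumptions.
EVENT_LOGIC = {
    "vehicle_accelerating": lambda ride: ride.get("acceleration", 0.0) > 1.0,
    "vehicle_stopped": lambda ride: ride.get("speed", 0.0) == 0.0,
}

def identify_events(ride_information: dict) -> list:
    """Identify occurrences of simulation events whose criteria are
    satisfied by the current ride information."""
    return [name for name, criterion in EVENT_LOGIC.items()
            if criterion(ride_information)]
```

Because the table is data, it could equally be programmed in, updated at runtime, or obtained from the simulation information, electronic storage 14, or a remote location, as described above.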
Simulation stimuli generation component 24 may be configured to generate simulation stimuli that correspond to simulation events for which occurrences are identified. A simulation stimulus may refer to one or more of a visual, an audio, a haptic, and/or other simulation that may change a simulation experience. Simulation stimuli generation component 24 may be configured to generate a simulation stimulus for a simulation event when the simulation stimulus corresponding to the simulation event is found in the simulation information.
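Under the assumption (for illustration only) that the simulation information carries an event-to-stimulus mapping, this lookup behavior might be sketched as:

```python
def generate_stimuli(identified_events, simulation_information: dict) -> list:
    """Generate a stimulus for each identified simulation event whose
    corresponding stimulus is found in the simulation information;
    events without an entry yield no stimulus."""
    return [simulation_information[event]
            for event in identified_events
            if event in simulation_information]
```

The stimuli returned here are what the provision component (described next) would render through the display, speaker, light source, or temperature controller.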
Simulation provision component 25 may be configured to effectuate provision of a simulated experience by operating simulation device 13. Simulation device 13 may include one or more of a display, a speaker, a light source, an air conditioner, a heater, a temperature controller and/or other simulation devices. Simulation provision component 25 may be configured to effectuate provision of a simulated experience through one or more of visual, audio, haptic and/or other simulation, where the visual, audio, haptic, and/or other simulation changes based on simulation stimuli.
For example, a visual simulation showing the change in relative position of a virtual vehicle to other virtual vehicles (simulation stimulus) may be generated when a change in speed of a vehicle (simulation event) is identified. For instance, a simulation experience may relate to a virtual spaceship being followed by virtual enemy spaceships. When acceleration of the vehicle is identified, the visual simulation may display the virtual enemy spaceships falling behind the virtual spaceship. When deceleration of the vehicle is identified, the visual simulation may display the virtual enemy spaceships catching up to or overtaking the virtual spaceship.
As another example, visual simulation showing the beginning/ending of a simulation experience/segment of a simulation experience (simulation stimulus) may be generated when acceleration from a stop/deceleration to a stop of a vehicle (simulation event) is identified. For example, a simulation experience may relate to a race/competition between virtual vehicles. When acceleration from a stop of the vehicle is identified, visual simulation may display the beginning of the race/competition. When deceleration to a stop of the vehicle is identified, visual simulation may display the ending of the race/competition.
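The enemy-spaceship example above amounts to coupling an on-screen gap to the real vehicle's acceleration; a minimal sketch, with a hypothetical update rule:

```python
def enemy_gap(current_gap: float, vehicle_accel: float, dt: float) -> float:
    """Grow the on-screen gap to pursuing virtual enemy spaceships
    while the real vehicle accelerates, and shrink it while the
    vehicle decelerates; the gap never goes below zero (overtaken).
    Units and the linear coupling are illustrative assumptions."""
    return max(current_gap + vehicle_accel * dt, 0.0)
```

Calling this once per frame with the latest acceleration from the motion information would make the pursuing ships fall behind or catch up exactly as described above.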
In some implementations, simulation provision component 25 may be configured to effectuate provision of a simulated experience by incorporating environment information of a vehicle into the simulation experience. Simulation provision component 25 may obtain environment information of the vehicle from one or more of ride information component 22, output signals generated by sensor 12, and/or other sensors. For example, simulation provision component 25 may obtain information relating to one or more of terrain, nearby objects, nearby buildings, and/or other environment information from an image sensor of sensor 12. As another example, simulation provision component 25 may obtain information relating to one or more of terrain, nearby objects, nearby buildings, and/or other environment information by determining a location of the vehicle from output signals generated by sensor 12 and obtaining environment information at the location from a communication device.
For example, a simulation experience may relate to a virtual vehicle being chased by stormtroopers on virtual speeder bikes while a vehicle is traveling next to and/or on a mountain. Simulation provision component 25 may incorporate the mountain into the simulation experience by having the virtual speeder bikes travel over the terrain of the mountain, hovering at a certain distance above the terrain. As another example, a vehicle may be traveling in a city, and simulation provision component 25 may incorporate buildings and/or objects of the city into the simulation experience by having the virtual speeder bikes weave between the buildings and/or objects. As another example, simulation provision component 25 may incorporate buildings and/or objects in the city as virtual barriers in the simulation experience, so that a virtual speeder bike that hits a building and/or object will crash.
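The hovering behavior above can be stated as a one-line rule; the helper name and the hover offset are hypothetical:

```python
def speeder_bike_height(terrain_elevation: float,
                        hover_offset: float = 2.0) -> float:
    """Keep a virtual speeder bike hovering a fixed distance above the
    terrain elevation sampled from the environment information.
    The default offset is an illustrative assumption."""
    return terrain_elevation + hover_offset
```

Sampling the terrain elevation along the vehicle's route and applying this rule per frame keeps the virtual bikes tracking the real mountain as the vehicle moves.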
Although processor 11, sensor 12, simulation device 13, and electronic storage 14 are shown to be connected to a bus 15 in
Although processor 11 is shown in
Processor 11 may be configured to execute one or more of simulation experience component 20, simulation information component 21, ride information component 22, simulation event occurrence component 23, simulation stimuli generation component 24, simulation provision component 25, and/or other components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 11.
It should be appreciated that although simulation experience component 20, simulation information component 21, ride information component 22, simulation event occurrence component 23, simulation stimuli generation component 24, and simulation provision component 25 are illustrated in
The description of the functionality provided by the different computer program components 20, 21, 22, 23, 24, and/or 25 described herein is for illustrative purposes, and is not intended to be limiting, as any of computer program components 20, 21, 22, 23, 24, and/or 25 may provide more or less functionality than is described. For example, one or more of computer program components 20, 21, 22, 23, 24, and/or 25 may be eliminated, and some or all of their functionality may be provided by other ones of computer program components 20, 21, 22, 23, 24, and/or 25. As another example, processor 11 may be configured to execute one or more additional computer program components that may perform some or all of the functionality attributed to one or more of computer program components 20, 21, 22, 23, 24, and/or 25.
Although sensor 12 is depicted in
Although simulation device 13 is depicted in
The electronic storage media of electronic storage 14 may be provided integrally (i.e., substantially non-removable) with one or more components of system 10 and/or removable storage that is connectable to one or more components of system 10 via, for example, a port (e.g., a USB port, a Firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 14 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EPROM, EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 14 may be a separate component within system 10, or electronic storage 14 may be provided integrally with one or more other components of system 10 (e.g., processor 11). Although electronic storage 14 is shown in
In some implementations, method 200 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, a central processing unit, a graphics processing unit, a microcontroller, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 200 in response to instructions stored electronically on one or more electronic storage mediums. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 200.
Referring to
At operation 202, ride information may be obtained. The ride information may include motion information that characterizes motion experienced by a person in a vehicle. In some implementations, operation 202 may be performed by a processor component the same as or similar to ride information component 22 (shown in
At operation 203, occurrences of the simulation events may be identified based on the ride information. In some implementations, operation 203 may be performed by a processor component the same as or similar to simulation event occurrence component 23 (shown in
At operation 204, the simulation stimuli may be generated. The simulation stimuli may correspond to the simulation events for which occurrences are identified, such that responsive to identification of an occurrence of the first simulation event, the first simulation stimulus may be generated. In some implementations, operation 204 may be performed by a processor component the same as or similar to simulation stimuli generation component 24 (shown in
Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.