This invention relates generally to an immersive vehicle simulator apparatus and method and, more particularly, but not necessarily exclusively to an apparatus and method for providing immersive vehicle control simulation, such as flight simulation, for the purposes of training operatives to control a vehicle moving in a three dimensional environment.
Motion-based simulators using domes are known and, for example, immersive flight simulators are known which, referring to
Whilst such immersive dome simulators are widely accepted, and are effective in providing a fully immersive and realistic training environment, there are a number of issues associated with systems of this type. Firstly, the dome required to effect the simulator has a large footprint: not only does it require a relatively large ground area to accommodate it, but it also makes transportation thereof logistically complex and costly. Furthermore, there is a significant cost implication arising from the requirement for several high specification projectors, lighting, air conditioning and other support systems, the overall cost of which is further increased by the requirement for a high level of ongoing maintenance. Changing and/or upgrading such equipment may also, as a result, be cost-prohibitive.
It would, therefore, be desirable to provide an immersive simulation apparatus and method that is less costly in both monetary terms and in terms of size, maintenance and upgrade overheads, and it is an object of aspects of the present invention to address at least some of these issues.
In accordance with a first aspect of the present invention, there is provided a mixed reality vehicle control simulator comprising a headset for placing over a user's eyes, in use, said headset including a screen, the simulator further comprising a processor configured to display on said screen a three dimensional environment consisting of scenery, one or more interactive controls for enabling a user to simulate vehicle control actions, said processor being further configured to receive, from said one or more interactive controls, data representative of one or more parameters determinative of vehicle movement and update said scenery displayed on said screen in accordance with said parameters so as to simulate vehicle movement therein.
The simulator may further comprise a physical vehicle control structure, such as a cockpit structure, within which a user is located, in use, said physical control structure including said one or more interactive controls. However, in alternative exemplary embodiments, there is no physical control structure, and the control structure, e.g. a cockpit, is provided in virtual form and blended into the 3D environment displayed on the screen.
The simulator may include image capture means for capturing images of the real world in the vicinity of the user, wherein said processor may be configured to blend images of said real world environment into said three-dimensional environment to create a mixed reality environment representative of a user's field of view and said virtual scenery and display said mixed reality environment on said screen. The image capture means may comprise at least one image capture device mounted on said headset so as to be substantially aligned with a user's eyes, in use.
In an exemplary embodiment of the invention, the simulator may comprise a flight simulator and said data representative of one or more parameters determinative of vehicle movement comprises one or more of air speed, direction, and altitude.
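As a worked illustration of how such parameters might drive the simulation, the sketch below advances a simulated aircraft position by simple dead reckoning from air speed, heading and altitude. The function name, units and coordinate convention are illustrative assumptions, not taken from the source.

```python
import math

def update_position(x, y, altitude, air_speed, heading_deg, dt):
    """Advance a simulated aircraft position by dead reckoning.

    Assumed units: air_speed in metres per second, heading_deg in
    degrees clockwise from north, dt in seconds; x is metres east,
    y is metres north. Altitude is carried through unchanged here;
    a fuller model would integrate a climb rate as well.
    """
    heading = math.radians(heading_deg)
    x += air_speed * math.sin(heading) * dt  # eastward component
    y += air_speed * math.cos(heading) * dt  # northward component
    return x, y, altitude
```

The updated position can then be used to select and redraw the appropriate region of virtual scenery.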
The virtual scenery may be derived from satellite images of the Earth, and/or from animated or computer generated images of an environment.
Another aspect of the invention extends to a method of providing an immersive flight simulation system comprising at least one headset for placing over a user's eyes, in use, said headset including a screen, the method comprising configuring a processing module to display on said screen a three dimensional environment consisting of virtual scenery, receive, from one or more interactive controls included in said system, data representative of one or more parameters determinative of aircraft movement and update said scenery displayed on said screen in accordance with said parameters so as to simulate aircraft movement therein.
Aspects of the invention extend to a program or plurality of programs arranged such that when executed by a computer system or one or more processors, it/they cause the computer system or the one or more processors to operate in accordance with the method described above.
Still further, the present invention extends to a machine readable storage medium storing a program or at least one of the plurality of programs described above.
These and other aspects of the present invention will be apparent from the following specific description, in which embodiments of the present invention are described, by way of example only, and with reference to the accompanying drawings, in which:
Virtual reality systems are known, comprising a headset which, when placed over a user's eyes, creates and displays a three dimensional virtual environment in which a user feels immersed and with which the user can interact in a manner dependent on the application. For example, the virtual environment created may comprise a game zone, within which a user can play a game.
More recently, mixed reality systems have been developed, in which an image of a real world object can be captured, rendered and placed within a 3D virtual reality environment, such that it can be viewed and manipulated within that environment in the same way as virtual objects therein. Other mixed reality systems have been developed that enable virtual images to be blended into a user's view of the real world, and it is envisaged that data from one or more external data sources can be visually represented and placed within the mixed reality environment thus created, such that multiple data sources are displayed simultaneously in three dimensions.
Referring to
The system of the present invention further comprises a processor, which is communicably connected to a screen provided inside the visor 10. Such communicable connection may be a hard wired electrical connection, in which case the processor and associated circuitry will also be mounted on the headset. However, in an alternative exemplary embodiment, the processor may be configured to communicate wirelessly with the visor, for example by means of Bluetooth or a similar wireless communication protocol, in which case the processor need not be mounted on the headset but can instead be located remotely from it, with the allowable distance between them being dictated and limited only by the wireless communication protocol employed. For example, the processor could be mounted on or formed integrally with the user's clothing, or instead located remotely from the user, either as a stand-alone unit or as an integral part of a larger control unit.
Referring to
In a flight simulator, according to a first exemplary embodiment of the present invention, and referring additionally to
The image capture devices 106 on the headset 100 capture images of the user's immediate environment. Thus, images are captured in respect of the cockpit 200 and the user's own body, depending on the user's field of view at any time. The images thus captured are transmitted to the processor in the mixed reality system and blended into the three dimensional environment displayed on the screen, such that the user is provided with a fully immersive, mixed reality environment.
The concept of real time image blending for augmented or mixed reality is known, and several different techniques have been proposed. The present invention is not necessarily intended to be limited in this regard. However, for completeness, one exemplary method for image blending will be briefly described. Thus, in respect of an object or portion of a real world image to be blended into the 3D ‘virtual’ environment displayed on the screen, a threshold function may be applied in order to extract that object from the background image. Its relative location and orientation may also be extracted and preserved by means of marker data. Next, the image and marker data is converted to a binary image, possibly by means of adaptive thresholding (although other methods are known). The marker data and binary image are then transformed into a set of coordinates that match the location within the 3D environment in which they will be blended. Such blending is usually performed using black and white image data. Thus, if necessary, colour data sampled from the source image can be backward warped, using homography, to each pixel in the resultant virtual scene. All of these computational steps require minimal processing and time and can, therefore, be performed quickly and in real (or near real) time. Thus, as the user's field of view and/or external surroundings change, image data within the mixed reality environment can be updated in real time.
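A minimal sketch of the thresholding-and-blending steps just described, using plain Python lists of pixel values as greyscale images. It omits adaptive thresholding, marker handling and the homography-based colour warp mentioned above; all names and the fixed threshold are illustrative assumptions.

```python
def binarise(image, threshold):
    """Extract the foreground of a greyscale image: pixels at or
    above the threshold become 1 (object), the rest 0 (background)."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def blend_into_scene(scene, source, mask, offset):
    """Copy the masked (foreground) pixels of `source` into `scene`
    at the given (row, col) offset, leaving the virtual background
    untouched. Returns a new scene; the inputs are not modified."""
    out = [row[:] for row in scene]
    dr, dc = offset
    for r, mask_row in enumerate(mask):
        for c, is_fg in enumerate(mask_row):
            if is_fg:
                out[r + dr][c + dc] = source[r][c]
    return out
```

In a real system the offset would come from the preserved marker data, so that the extracted object lands at the matching coordinates within the 3D environment.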
The user interacts with the controls in the cockpit 200 in a conventional manner in order to control all aspects of the ‘flight’. Signals representative of user actions, flight status and other relevant data are fed to the system processor 104 (including a 3D scenery engine), and the mixed reality environment displayed on the user's screen is updated accordingly, in terms of both the scenery change caused by apparent movement through the 3D environment and any other respective data displayed therein.
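The update cycle just described might be sketched as follows; the class and method names are hypothetical stand-ins for the system processor and its 3D scenery engine, and the tile-based scenery model is an illustrative assumption.

```python
class SceneryEngine:
    """Hypothetical stand-in for the 3D scenery engine: it tracks the
    simulated aircraft position and reports which scenery tile should
    currently be rendered on the headset screen."""

    def __init__(self, tile_size=1000.0):
        self.tile_size = tile_size  # metres per scenery tile (assumed)
        self.x = 0.0  # metres east of the origin
        self.y = 0.0  # metres north of the origin

    def apply_control_signals(self, dx, dy):
        """Apply one frame's worth of apparent movement derived from
        the cockpit control signals."""
        self.x += dx
        self.y += dy

    def current_tile(self):
        """Return the (column, row) index of the scenery tile the
        simulated aircraft is currently over."""
        return (int(self.x // self.tile_size),
                int(self.y // self.tile_size))
```

Each frame, the processor would feed the movement derived from the control signals into the engine and redraw the scenery for the tile (and viewpoint) it reports.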
In an alternative embodiment of the present invention, and with reference to
Thus, aspects of the present invention provide a mixed reality flight simulator which is able to provide an immersive experience similar to that provided by conventional dome simulators, but with greatly reduced infrastructure requirements, which has an impact on physical size (and ease of transportation), cost, maintenance and ease of upgrade.
It is envisaged that the mixed reality technology can be introduced into flight simulation technologies at a number of different levels, and two exemplary embodiments have been described above. In a first exemplary embodiment, as described with reference to
In an alternative exemplary embodiment, as described above with reference to
As previously stated, there are many benefits associated with the reduction in physical infrastructure, including a reduction in purchase cost and in the transport/logistics burden. In addition, the software nature of the virtual cockpit (in some exemplary embodiments of the invention) means that it can be modified very quickly and at a very low cost of change. The reduction in cost, and the ability to network such systems, could also allow for a greater number of interconnected simulators which can be relatively easily adapted between aircraft types and even roles.
It will be apparent to a person skilled in the art, from the foregoing description, that modifications and variations can be made to the described embodiments, without departing from the scope of the invention as claimed.
Number | Date | Country | Kind
---|---|---|---
1503115.6 | Feb 2015 | GB | national
15182891.0 | Aug 2015 | EP | regional

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/GB2016/050453 | 2/23/2016 | WO | 00