The embodiments relate generally to simulators, and in particular to a domeless simulator.
Commercial simulators, such as flight simulators, are relatively large systems that require a substantial amount of space. A flight simulator, for example, may include a large dome on which imagery is projected, and may include multiple projectors and image generators, which are costly, require a substantial amount of power, and generate a substantial amount of heat, which in turn increases environmental cooling requirements. As an example, one known flight simulator utilizes 25 projectors and requires a dome that is 20 feet in diameter and occupies 314 square feet of space. Such size requirements can limit the locations at which the simulator can be used. The use of a dome may also require special focus adjustments to any heads-up display (HUD) apparatus used in the simulator so that the HUD apparatus focuses at the distance of the dome, increasing simulator configuration complexity. Moreover, the physical cockpit controls used by the user are made as realistic as possible to ensure simulation realism, which further increases simulator costs.
The embodiments provide a domeless simulation system, sometimes referred to as a simulator, that utilizes a head-wearable display, a head track device, and a hand track device to realistically simulate an out-the-window display and an instrument control panel of a vehicle, such as an aircraft, to a user. Among other features, the embodiments visually depict in imagery movements of the user's hand manipulating virtual controls based on physical movements of the user's hand in a real-world environment.
In one embodiment, a simulator is provided. The simulator includes a head-mounted display (HMD) device having a field-of-view (FOV) and a cockpit control surface. An image generation device is coupled to the HMD device and configured to generate imagery of a virtual environment including an out-the-window image component and a cockpit control image component that is registered to the cockpit control surface. A hand track device is configured to sense a location of a hand of a user. A controller is coupled to the hand track device and is configured to determine the location of the hand of the user with respect to the FOV.
In one embodiment, the controller is further configured to cause the image generation device to insert a virtual hand into the imagery of the virtual environment at a virtual location that corresponds to a sensed location of the hand of the user.
In one embodiment, the controller is further configured to determine, based on the hand track device, a contact location on the cockpit control surface of the hand of the user, correlate the contact location with a virtual cockpit control of a plurality of virtual cockpit controls depicted in the cockpit control image component, and cause the image generation device to generate imagery depicting contact of the virtual cockpit control with the virtual hand.
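The correlation of a contact location with a particular virtual cockpit control can be sketched as a simple hit test. The following is a minimal illustration, assuming each virtual cockpit control is modeled as an axis-aligned rectangle in the two-dimensional coordinate system of the cockpit control surface; the names and bounds are hypothetical, not taken from the embodiments.

```python
from dataclasses import dataclass

@dataclass
class VirtualControl:
    # Hypothetical model of one virtual cockpit control as a 2-D
    # rectangle in cockpit-control-surface coordinates.
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # True if the sensed contact point lies within this control's bounds.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def correlate_contact(controls, x, y):
    # Return the first virtual control whose bounds contain the contact
    # point, or None if the contact does not fall on any control.
    for control in controls:
        if control.contains(x, y):
            return control
    return None
```

A contact sensed by the hand track device at surface coordinates (x, y) is then passed to `correlate_contact`, and a non-None result drives the imagery depicting contact of that virtual cockpit control with the virtual hand.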
In one embodiment, the controller is further configured to alter a vehicle motion characteristic, such as an altitude, velocity, or direction, based on the virtual cockpit control. The controller may also cause the image generation device to alter the imagery of the virtual environment in response to altering the vehicle motion characteristic.
In one embodiment, the simulator includes a head track device, and, based on head track data received from the head track device, the controller continuously determines the FOV of the HMD device. In one embodiment, the controller alters the imagery of the virtual environment in synchronization with a change in the FOV of the HMD device.
In one embodiment, over a period of time and based on the hand track device, the controller causes the image generation device to move the virtual hand with respect to the FOV in correspondence with a plurality of sensed locations of the hand of the user over the period of time.
In one embodiment, the image generation device includes a first image generation element that is configured to generate the imagery of the virtual environment for one eye of the user, and a second image generation element that is configured to generate the imagery of the virtual environment for another eye of the user.
In another embodiment, a method is provided. The method includes providing, to a HMD device having a FOV, imagery of a virtual environment including an out-the-window image component and a cockpit control image component that is registered to a cockpit control surface. Based on input from a hand track device, it is determined that a hand of a user is at a location in space that corresponds to a location within the FOV. The imagery of the virtual environment is altered to depict a virtual hand at the location within the FOV.
Those skilled in the art will appreciate the scope of the disclosure and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the disclosure, and together with the description serve to explain the principles of the disclosure.
The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the embodiments and illustrate the best mode of practicing the embodiments. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the disclosure and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
Any flowcharts discussed herein are necessarily discussed in some sequence for purposes of illustration, but unless otherwise explicitly indicated, the embodiments are not limited to any particular sequence of steps. The use herein of ordinals in conjunction with an element is solely for distinguishing what might otherwise be similar or identical labels, such as “first image generation element” and “second image generation element,” and does not imply a priority, a type, an importance, or other attribute, unless otherwise stated herein.
The embodiments provide a domeless simulator that utilizes a head-wearable display, a head track device, and a hand track device to realistically simulate an out-the-window (OTW) display and an instrument control panel (such as a cockpit control panel) of a vehicle, such as an aircraft, to a user. Among other features, the embodiments visually depict in imagery movements of the user's hand manipulating virtual cockpit controls based on the physical movements of the user's hand in a real-world environment. The embodiments facilitate a simulator that has a relatively small footprint and that consumes substantially less power and has lower cooling requirements than conventional simulators.
The platform 12 includes a tracked volume 18 that comprises a volume of space that is tracked by a hand track device 20. As will be discussed in greater detail herein, the hand track device 20 tracks the movements and locations of one or both hands of the user 14. The tracked volume 18 also includes a cockpit control surface 22 that the user 14 may touch, or otherwise interact with, during a simulation.
A controller 24 may include one or more processing devices 25 and a memory 26, and is responsible for overall coordination of the various functionalities described herein. An image generation device 28 generates imagery and provides the imagery to a head-mounted display (HMD) device 30. The HMD device 30 is a head-wearable apparatus that, in one embodiment, has an ultra-wide field-of-view, such as in excess of 100 degrees. In some embodiments, the HMD device 30 may comprise, or be substantially similar to, the HMD device described in U.S. Pat. No. 8,781,794 B2, entitled “METHODS AND SYSTEMS FOR CREATING FREE SPACE REFLECTIVE OPTICAL SURFACES,” filed on Aug. 17, 2011, and U.S. patent application Ser. No. 13/211,365, entitled “HEAD-MOUNTED DISPLAY APPARATUS EMPLOYING ONE OR MORE FRESNEL LENSES,” filed on Aug. 17, 2011, each of which is hereby incorporated by reference herein.
In one embodiment, the image generation device 28 includes a first image generation element 32-1 that is configured to generate imagery of the virtual environment for the right eye of the user 14, and a second image generation element 32-2 that is configured to generate imagery of the virtual environment for the left eye of the user 14. In one embodiment, the first and second image generation elements 32 comprise separate graphics processing units (GPUs). In some embodiments, the imagery provided to the eyes of the user 14 may be stereoscopic imagery, such that the user 14 experiences the virtual environment in a realistic three-dimensional (3D) sense.
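Per-eye rendering of this kind conventionally starts from two eye positions offset from the head position by half the interpupillary distance. The following is a minimal sketch of that offset computation; the function name, the axis convention, and the 0.064 m interpupillary distance used below are illustrative assumptions, not details of the embodiments.

```python
def eye_positions(head_pos, right_axis, ipd):
    # head_pos: (x, y, z) midpoint between the eyes.
    # right_axis: unit vector pointing toward the user's right.
    # ipd: interpupillary distance in the same units as head_pos.
    half = ipd / 2.0
    left = tuple(p - half * a for p, a in zip(head_pos, right_axis))
    right = tuple(p + half * a for p, a in zip(head_pos, right_axis))
    return left, right
```

Each image generation element 32 would then render the virtual environment from its own eye position, producing the stereoscopic pair presented by the HMD device 30.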
The imagery of the virtual environment that is presented to the user 14 may be generated based on virtual environment data 34 that is maintained in the memory 26. The virtual environment data 34 may include a virtual cockpit model 36 that maintains information about a virtual cockpit that is registered to the cockpit control surface 22. Thus, when displayed to the user 14, the user 14 views a virtual cockpit that appears to be located relatively precisely at the same location as the real-world location of the cockpit control surface 22. The virtual cockpit model 36 may include information about a plurality of virtual cockpit controls, a current state of each virtual cockpit control, locations of relevant imagery associated with the virtual cockpit, and the like.
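Registering the virtual cockpit to the cockpit control surface 22 amounts to a calibrated mapping from points measured on the physical surface into virtual-environment coordinates. A minimal sketch, assuming the surface is planar with a known origin and in-plane axes expressed in virtual coordinates (all names here are hypothetical):

```python
def surface_to_virtual(point_on_surface, surface_origin, surface_x_axis, surface_y_axis):
    # Map a 2-D point (u, v) measured on the physical cockpit control
    # surface into 3-D virtual-environment coordinates, given the
    # surface's calibrated origin and in-plane axes in virtual space.
    u, v = point_on_surface
    return tuple(o + u * x + v * y
                 for o, x, y in zip(surface_origin, surface_x_axis, surface_y_axis))
```

With such a mapping, a virtual cockpit control drawn at (u, v) in the cockpit control image component appears to the user 14 at the same real-world location where the surface can actually be touched.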
The virtual environment data 34 may also include an OTW model 38 that contains information about the environment that is external to the cockpit of the simulated vehicle, including, for example, information about objects in the external environment, the particular location of the simulated vehicle with respect to the external environment, information that identifies a portion of the external environment that is within a field of regard of the user 14, and the like. The virtual environment data 34 may also include a hand model 40 that provides information about a hand of the user 14. The hand model 40 may be based on data received from the hand track device 20, including, by way of non-limiting example, the location of the hand of the user 14 in X, Y, and Z coordinates in the tracked volume 18. In some embodiments, the hand model 40 may identify locations of individual fingers, and/or individual knuckles of the hand, depending on the particular capabilities of the hand track device 20. While only one hand model 40 is illustrated, in some embodiments the simulator 10 may keep track of both hands of the user 14, and in such embodiments, two hand models 40 may be utilized.
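The hand model 40 described above can be represented as a small data structure holding the palm location in tracked-volume coordinates, with optional per-finger detail when the hand track device reports it. A hypothetical sketch (field names are illustrative):

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class HandModel:
    # Palm location in tracked-volume X, Y, Z coordinates.
    palm: Vec3
    # Per-finger tip locations, populated only if the hand track
    # device reports individual fingers.
    fingers: Dict[str, Vec3] = field(default_factory=dict)
```

Tracking both hands of the user 14 would then simply mean maintaining two such instances, one per hand.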
A head track device 42 provides head track data that comprises information about the orientation and location of the head of the user 14. In one embodiment, the head track device 42 may be coupled to the HMD device 30. The head track device 42 may comprise, for example, an inertial measurement unit (IMU) that continually, over the duration of a simulation, provides relatively precise orientation information associated with movements of the head of the user 14. The head track device 42 may be positioned at a known location with respect to a reference location, such as the mid-point between the two eyes of the user 14, such that the orientation information can be used to determine relatively precisely where the user 14 is looking. The controller 24 may utilize the information received from the head track device 42 to maintain an instantaneous field-of-view (FOV) 44 of the HMD device 30. The image generation device 28 may utilize the FOV 44 in conjunction with the virtual cockpit model 36, OTW model 38, and hand model 40 to determine precisely which imagery associated with the virtual environment should be rendered and provided to the HMD device 30 at a relatively high rate, such as 30 or 60 times per second. Thus, as the head track device 42 detects movements of the head of the user 14, the controller 24 continuously determines and updates the FOV 44, and the image generation device 28 continuously alters the imagery provided to the HMD device 30 in synchronization with the changing FOV 44. Some embodiments allow the user 14 to have a complete 360-degree viewing area such that irrespective of where the user 14 looks, the user 14 experiences visuals similar to those the user 14 would see in the aircraft being simulated. Thus, for example, during the simulation the user 14 may look over a shoulder through a simulated cockpit window and see one or more other aircraft.
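The core of such FOV maintenance is converting the head orientation reported by the IMU into a gaze direction that defines the center of the instantaneous FOV 44. A minimal sketch, assuming the orientation is reduced to yaw and pitch angles in degrees and a right-handed coordinate frame with +z forward (these conventions are assumptions, not specified by the embodiments):

```python
import math

def fov_center_from_yaw_pitch(yaw_deg, pitch_deg):
    # Convert head yaw/pitch (degrees) from the head track device into
    # a unit gaze vector defining the center of the instantaneous FOV.
    # Convention assumed here: yaw 0, pitch 0 looks along +z; positive
    # yaw turns toward +x; positive pitch looks upward toward +y.
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```

At each rendering tick (e.g., 30 or 60 times per second), the controller would recompute this vector from the latest head track sample, and the image generation device would render the portion of the virtual environment around it.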
Moreover, when the hand model 40 indicates that the hand of the user 14 is at a location within the tracked volume 18 that is within the FOV 44 of the HMD device 30, the image generation device 28 generates imagery that depicts a virtual hand at a location in the virtual environment that corresponds to the location of the hand of the user 14 in the real world.
The imagery 50 is generated by the image generation device 28.
When the hand model 40 indicates that the hand 46 has moved within the FOV 44 of the HMD device 30, the image generation device 28 inserts a virtual hand into the imagery of the virtual environment that is provided to the HMD device 30 at a virtual location that corresponds to the sensed location of the hand 46 in the tracked volume 18. As the hand 46 moves within the tracked volume 18, the image generation device 28 generates imagery that depicts the virtual hand moving with respect to the FOV 44 in correspondence with the sensed locations of the hand 46.
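Deciding whether the tracked hand is within the FOV can be sketched as an angular test: the virtual hand is inserted when the direction from the head to the hand falls within the HMD device's angular field of view around the gaze direction. A simplified illustration, assuming a single circular FOV angle (the embodiments do not prescribe this particular test):

```python
import math

def hand_in_fov(hand_pos, head_pos, gaze_dir, fov_deg):
    # True if the vector from the head to the tracked hand lies within
    # fov_deg/2 of the unit gaze direction.
    to_hand = [h - p for h, p in zip(hand_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_hand))
    if norm == 0.0:
        return True  # hand coincident with head; treat as visible
    dot = sum((c / norm) * g for c, g in zip(to_hand, gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= fov_deg / 2.0
```

When the test passes, the renderer places the virtual hand at the virtual location corresponding to the sensed location in the tracked volume 18; updating this every frame produces the virtual hand moving in correspondence with the real hand 46.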
The selection or activation of a virtual cockpit control 56 may, depending on the simulated function of the virtual cockpit control 56, alter a vehicle motion characteristic of the simulated vehicle, such as altitude, velocity, or direction. In response to the altered vehicle motion characteristic, the virtual environment data 34 may change, such that the imagery provided to the HMD device 30 may change. For example, if selection of the virtual cockpit control 56 caused the roll, pitch, or yaw of the simulated aircraft to change, the image generation device 28 generates imagery that corresponds to such changed roll, pitch, or yaw.
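The chain from control activation to altered vehicle motion characteristic can be sketched as a small state update followed by re-rendering. The control names and update rules below are purely illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    # Simplified vehicle motion characteristics of the simulated vehicle.
    altitude: float = 0.0
    velocity: float = 0.0
    heading: float = 0.0  # degrees, 0-360

def apply_control(state, control_name, delta):
    # Illustrative dispatch: an activated virtual cockpit control alters
    # one vehicle motion characteristic; the image generation device
    # then regenerates imagery from the new state.
    if control_name == "throttle":
        state.velocity += delta
    elif control_name == "stick_pitch":
        state.altitude += delta
    elif control_name == "rudder":
        state.heading = (state.heading + delta) % 360.0
    return state
```

After `apply_control` runs, the OTW model would be re-evaluated against the updated state, so that a change in roll, pitch, yaw, or the characteristics above is reflected in the next rendered frame.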
While for purposes of illustration only a single user 14 has been discussed, in some embodiments the simulator 10 maintains multiple FOVs 44 for multiple users 14 in a simulation, such as, for example, a pilot and a weapon systems officer (WSO). In such embodiments, each user 14 may have a corresponding FOV 44 maintained in the virtual environment data 34, and a corresponding hand model 40. The image generation device 28 may include additional image generation elements 32 that are configured to generate imagery for each user 14 in the simulation based on the virtual environment data 34. The WSO may also have a separate cockpit control surface 22 (not illustrated) that is registered to a cockpit control image component seen by the WSO, and which provides tactile feedback substantially similar to that which the WSO would experience in the cockpit of the aircraft being simulated.
As discussed above, while the cockpit control surface 22-1 illustrated in
Referring again to
Among other features, the embodiments provide a relatively low-cost, full-motion and wide field-of-view simulator that realistically simulates vehicles, such as aircraft, including the cockpit control surfaces of such vehicles, without requiring the cost and space associated with a domed simulator. Further, some embodiments provide cockpit control surface feedback identical to that of the vehicle being simulated, without the expense of full mockup cockpit control surfaces, and can utilize replaceable cockpit control surfaces such that any number of different vehicles may be realistically simulated by simply swapping one cockpit control surface with another.
Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the disclosure. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.