A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
This application generally relates to a shock absorber and, in particular, to a dampener that improves tracking of recoiling weapons used in virtual reality systems.
It has long been desired to provide training that improves personnel's skills in aiming and firing shotguns, rifles, handguns, and other weapons. Law enforcement and military training often places trainees into situations that require quick visual and mental assessment of the situation as well as an appropriate response with a weapon. Trainees are often subjected to adverse situations to test their ability to react effectively.
Traditional training methods in marksmanship and firing tactics for hunters and other sportsmen, police, military personnel, and others leave much to be desired in terms of realism, cost, and practicality. Many firing ranges have limited capacity. Moreover, most existing firing ranges do not protect the shooter against natural elements such as rain or snow. Because of the noise levels normally associated with firing ranges, they are typically located in remote areas, requiring people to drive long distances. The costs of ammunition, targets, and range use make such training expensive. Furthermore, when live ammunition is used, expenses, risks, administrative problems, safety concerns, and government rules and regulations become more burdensome. For training in marksmanship and tactics, it is beneficial to have an indoor range where shooters can fire simulated projectiles against simulated moving targets.
Video games are increasingly realistic, and users may be placed into immersive virtual environments. First-person-view shooting games offer players the ability to perform actions such as walking, crouching, and shooting using a mouse and keyboard. However, these games are usually played in front of a computer while the user sits in a chair, and they are inadequate for personnel training. Virtual reality systems may improve the gaming experience because the player's movement in the game depends on their actions in physical space, which makes the game more immersive than a traditional video game. Despite the realism provided by virtual reality systems, players are often provided with game controllers that are either tethered or have the look and feel of toys. As such, existing virtual reality game controllers that are representative of guns differ from actual guns in feel and balance, which reduces the effectiveness of the training for real life.
It is also desirable to incorporate recoil in weapons connected to virtual reality systems to increase immersion. However, existing virtual reality trackers are not able to track position accurately when subjected to the motion of a recoiling weapon. This is due to the high accelerations seen during recoil, which cause tracked objects to “fly away,” be tracked so inaccurately that immersion is lost, or disappear from the virtual world entirely. There is thus a need for improved hardware for virtual reality shooting simulators.
The present invention provides a recoil mount for a virtual reality tracker. According to one embodiment, the recoil mount comprises a resonator that is situated between one or more dampers, the one or more dampers each including a strut that is coupled to opposing sides of the resonator, a tracker interface coupled to the resonator, the tracker interface adaptable with a virtual reality tracker, and a pair of fasteners that mount the one or more dampers to a weapon mounting surface.
The strut may include a spring that exerts elastic resisting force. The one or more dampers may be configured to generate resisting forces proportional to a rate of deformation of the spring to dissipate energy from vibration or movement originating from the weapon mounting surface. The recoil mount may further comprise a guide rail that joins the one or more dampers and is configured to provide horizontal support to the resonator. The resonator may include a channel configured to accept the guide rail.
In one embodiment, the weapon mounting surface comprises a weapon barrel. In another embodiment, the weapon mounting surface comprises a weapon magazine. In yet another embodiment, the weapon mounting surface comprises a weapon handle. At least one of the resonator and the tracker interface is comprised of a material selected from the group consisting of rubber, polyurethane, polyvinyl chloride, and elastomers. The virtual reality tracker may comprise hardware configured to track and link actions, events or signals from a weapon to a virtual reality computing device.
The invention is illustrated in the figures of the accompanying drawings which are meant to be exemplary and not limiting, in which like references are intended to refer to like or corresponding parts.
Subject matter will now be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, exemplary embodiments in which the invention may be practiced. Subject matter may, however, be embodied in a variety of different forms and, therefore, covered or claimed subject matter is intended to be construed as not being limited to any example embodiments set forth herein; example embodiments are provided merely to be illustrative. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. Likewise, a reasonably broad scope for claimed or covered subject matter is intended. Throughout the specification and claims, terms may have nuanced meanings suggested or implied in context beyond an explicitly stated meaning. Likewise, the phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment and the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment. It is intended, for example, that claimed subject matter include combinations of exemplary embodiments in whole or in part. Among other things, for example, subject matter may be embodied as methods, devices, components, or systems. Accordingly, embodiments may, for example, take the form of hardware, software, firmware or any combination thereof (other than software per se). The following detailed description is, therefore, not intended to be taken in a limiting sense.
The present application discloses systems, devices, and methods that employ virtual reality to train police officers or military personnel in the use of weapons such as guns or electronic control weapons (e.g., those available from Taser™) in specific scenarios or situations. A trainee may be given an actual weapon or a training weapon that simulates an actual weapon and is connected to a virtual reality system. The trainee may wear goggles or a headset that is connected to the virtual reality system, which plays a training scenario. Throughout the training scenario, the trainee's use of the weapon may be tracked and provided to the virtual reality system.
Signals from switches 108, 108′, or 110 may be transmitted to tracker interfaces 104, 104′, respectively, and activate corresponding electrical switches to pin pads 302 and 502. Articulations from gun 200 or electronic control weapon 400 that are communicated to any of switches 108, 108′, or 110 are not limited to trigger pulls and safety positions and may include other actions, events, or signals that may be generated from weapons. Pin pads 302 and 502 may provide an electrical connection interface with tracker 106.
Signals from pin pads 302 and 502 on tracker interfaces 104 and 104′ may be conveyed to tracker 106 via pogo pin connector 602. Pogo pin connector 602 may comprise a plurality of spring-loaded pins that support electrical connection with pins on pin pads 302, 502. Signals from the pins on pin pads 302, 502 may be mapped into commands based on contact connections with corresponding pins on pogo pin connector 602. The commands generated on tracker 106 may be received and interpreted by the virtual reality computing device.
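By way of illustration and not limitation, the following Python listing sketches one possible way such a pin-to-command mapping could be expressed in software. The pin names, command names, and the read_pin() helper are hypothetical and are not part of the disclosed hardware; they merely illustrate how closed-circuit contacts on pogo pin connector 602 could be translated into commands for the virtual reality computing device.

    # Illustrative sketch only: a hypothetical mapping from pogo pin contact
    # states (closed electrical circuits) to commands interpreted by the
    # virtual reality computing device. Names and the read_pin() callable are
    # assumptions for illustration.
    PIN_COMMANDS = {
        "TRIGGER_PIN": "TRIGGER_PULL",
        "SAFETY_PIN": "SAFETY_TOGGLE",
    }

    def poll_pin_pad(read_pin):
        """Return the commands whose corresponding pins report a closed circuit."""
        return [command for pin, command in PIN_COMMANDS.items() if read_pin(pin)]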
Tracker interfaces 104 and 104′ may be mated with tracker 106 by inserting stabilizing pins 306 and 506, respectively, into a stabilizing pin recess (not illustrated) of tracker 106. The stabilizing pins 306, 506 provide proper alignment and contact between pin pads 302, 502 and pogo pin connector 602. Tracker 106 may further include image sensors and/or non-optical sensors (e.g., utilizing sound waves or magnetic fields) that can be installed in hardware to track the movement of a user's body. According to another embodiment, optical markers may be placed on tracker 106 (or alternatively on magazines 102, 102′, or tracker interfaces 104, 104′) for motion tracking using cameras to track movement of a user.
Tracker interfaces 104, 104′ may be secured to tracker 106 by securing screwing bolts 304, 504 to mount 604. Screwing bolts 304, 504 may be tightened and loosened from mount 604 via a thumbwheel 114 as illustrated in
The box magazine 1002 may be coupled to tracker interface 1004. The tracker interface 1004 may include trigger output 1012 and safety output 1014. Trigger detect switch 1016 may include a circuit component that opens and closes an electrical circuit to trigger output 1012. Similarly, safety detect switch 1018 may include a circuit component that opens and closes an electrical circuit to safety output 1014. According to another embodiment, trigger detect switch 1016 and safety detect switch 1018 may be in another housing that is coupled to box magazine 1002, or tracker interface 1004, or both. The electrical connections or signals corresponding to trigger output 1012 and safety output 1014 may be carried to given pins on pin pad 1020.
Tracker interface 1004 may be further coupled to tracker 1006. Tracker 1006 includes pin connector 1022, power source 1024, sensors 1026, wireless transmitter 1028, and microcontroller 1030. Pin pad 1020 may be communicatively or electrically connected to pin connector 1022. Power source 1024 may be connected to microcontroller 1030 and used by microcontroller 1030 to provide a voltage source to components within box magazine 1002 and tracker interface 1004 via pin connector 1022. As such, microcontroller 1030 may receive signals from closed electrical circuits connected to pin connector 1022 and transmit the signals to virtual reality computing device 1010 via wireless transmitter 1028. Virtual reality computing device 1010 may process or render the signals using processor(s) 1032 and transmit corresponding images to headset unit 1008 from wireless interface 1034.
Microcontroller 1030 may also provide power to sensors 1026 and wireless transmitter 1028 from power source 1024. Sensors 1026 can detect a position of tracker 1006 within the x, y and z coordinates of a space, as well as orientation including yaw, pitch and roll. From a user's perspective, a gun connected to tracker 1006 may be tracked when pointed up, down, left and right, tilted at an angle, or moved forward or backward. Sensors 1026 may communicate the gun's orientation to microcontroller 1030, which sends the data to virtual reality computing device 1010, where processor(s) 1032 process the data and render corresponding images for transmission by wireless interface 1034 to headset unit 1008.
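By way of illustration and not limitation, the following Python sketch shows one way the data gathered by tracker 1006 could be packaged for transmission to virtual reality computing device 1010. The packet fields, the read_sensors() and read_pin() helpers, and the use of JSON are assumptions for illustration only; the actual firmware and wireless protocol are not limited to this form.

    # Illustrative sketch only: assembling one tracker update containing pose
    # (position and orientation) and switch states. All names are hypothetical.
    import json
    import time

    def build_tracker_packet(read_sensors, read_pin):
        """Assemble one update from sensor readings and pin states.

        read_sensors returns position (x, y, z) and orientation (yaw, pitch,
        roll); read_pin reports whether a named pin's circuit is closed.
        """
        x, y, z, yaw, pitch, roll = read_sensors()
        return json.dumps({
            "timestamp": time.time(),
            "position": {"x": x, "y": y, "z": z},
            "orientation": {"yaw": yaw, "pitch": pitch, "roll": roll},
            "trigger": bool(read_pin("TRIGGER_PIN")),
            "safety": bool(read_pin("SAFETY_PIN")),
        })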
Headset unit 1008 may comprise a head mounted display, also including components similar to tracker 1006, that a user can place over the user's eyes. The headset unit 1008 may be configured to communicate with the virtual reality computing device 1010 to provide a display according to a virtual reality simulation program. Additionally, the headset unit 1008 may be configured with positioning and/or motion sensors to provide user motion inputs to virtual reality computing device 1010. When the user is wearing the headset unit 1008, the view may shift as the user looks up, down, left and right. The view may also change if the user tilts their head at an angle or moves their head forward or backward without changing the angle of gaze. Sensors on headset unit 1008 may communicate to processor(s) 1032 where the user is looking, and the processor(s) 1032 may render corresponding images to the head mounted display. Sensors, as disclosed herein, can detect signals of any form, including electromagnetic signals, acoustic signals, optical signals and mechanical signals.
Virtual reality computing device 1010 includes processor(s) 1032, wireless interface 1034, memory 1036, and computer readable media storage 1038. Processor(s) 1032 may be configured to execute virtual reality training software stored within memory 1036 and/or computer readable media storage 1038, to communicate data to and from memory 1036, and to control operations of the virtual reality computing device 1010. The processor(s) 1032 may comprise central processing units, auxiliary processors among several processors, and graphics processing units. Memory 1036 may include any one or combination of volatile memory elements (e.g., random access memory (RAM)). Computer readable media storage 1038 may comprise non-volatile memory elements (e.g., read-only memory (ROM), hard drive, etc.). Wireless interface 1034 may comprise a network device operable to connect to a wireless computer network for facilitating communications and data transfer with tracker 1006 and headset unit 1008.
The virtual reality training software may comprise an audio/visual interactive interface that enables a trainee to interact with a three-dimensional first-person-view environment in training scenarios with tracker devices, such as a weapon including a virtual reality-enabled magazine assembly (e.g., comprising box magazine 1002, tracker interface 1004, and tracker 1006). Virtual reality computing device 1010 may receive signals or commands from tracker 1006 and headset unit 1008 to generate corresponding data (including audio and video data) for depiction in the virtual reality environment.
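By way of illustration and not limitation, the following Python sketch shows how the virtual reality training software could consume such updates. The scene object and its methods are hypothetical placeholders for the rendering and simulation logic and are not part of the disclosed software.

    # Illustrative sketch only: a hypothetical handler on virtual reality
    # computing device 1010 that turns a tracker update (such as the packet
    # sketched above) into simulation events. The scene API is assumed.
    import json

    def handle_tracker_packet(packet, scene):
        """Update the virtual weapon pose and raise events from one update."""
        data = json.loads(packet)
        scene.set_weapon_pose(data["position"], data["orientation"])
        if data["trigger"]:
            scene.fire_weapon()  # e.g., render muzzle flash, audio, ballistics
        scene.set_safety_state(data["safety"])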
The disclosed embodiments with reference to
To provide a realistic weapon experience, the disclosed system may be used with weapons that recoil when fired. A weapon's recoil comprises an acceleration generated when the weapon is being fired. Typical recoil acceleration of such weapons may be in the range of 20-60 G. However, accelerometers (which are used to calculate motion) in typical virtual reality trackers may be limited to measuring g-force acceleration below 8 G. Virtual reality trackers are unable to account for accelerations caused by recoiling weapons that exceed such a threshold. As such, a recoiling weapon causes incorrect calculations of acceleration detected by virtual reality trackers and results in inaccurate position calculations.
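By way of a numerical illustration only, the following Python sketch shows how clipping a recoil pulse at an accelerometer's measurement limit corrupts a position estimate obtained by integration. The pulse magnitude, duration, and sample rate are assumed values chosen for illustration and are not measurements of any particular weapon or tracker.

    # Illustrative sketch only: a 40 G recoil pulse clipped at an assumed 8 G
    # accelerometer limit, then integrated twice to obtain displacement.
    G = 9.81          # m/s^2 per G
    DT = 0.001        # assumed 1 kHz sample rate
    LIMIT = 8 * G     # assumed accelerometer full-scale range

    # A simple 5 ms rectangular pulse peaking at 40 G, followed by rest.
    true_accel = [40 * G] * 5 + [0.0] * 95
    measured_accel = [max(-LIMIT, min(LIMIT, a)) for a in true_accel]

    def integrate_twice(accel):
        velocity, position = 0.0, 0.0
        for a in accel:
            velocity += a * DT
            position += velocity * DT
        return position

    print("true displacement:    ", integrate_twice(true_accel))
    print("measured displacement:", integrate_twice(measured_accel))
    # The clipped measurement greatly underestimates the motion, so the
    # tracker's position estimate diverges from the weapon's actual position.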
To address this limitation of virtual reality trackers, the present application discloses a recoil mount system comprising a single degree-of-freedom shock isolator that reduces the recoil transmitted to the tracker by weapons used in a virtual reality system without sacrificing position tracking of the weapon during normal operating conditions. The recoil mount system may include shock isolation mounts that can move in three degrees of freedom. The recoil mount system reduces acceleration and vibration imparted to the tracker in any direction and can be mounted to any weapon using standard rail interface systems (e.g., Picatinny, Weaver, etc.).
Dampers 1108 and 1110 reduce vibration amplitudes of resonator 1120 with respect to struts 1112 and 1114. Struts 1112 and 1114 may include springs that exert elastic resisting force. Upon a vibration or movement event (e.g., from weapon recoil) originating from the weapon mounting surface 1102, struts 1112 and 1114 transfer forces of displacement between weapon mounting surface 1102 and resonator 1120 to dampers 1108 and 1110. The dampers 1108 and 1110 generate resisting forces proportional to the rate of deformation of the springs on struts 1112 and 1114 to dissipate energy from the vibration or movement.
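By way of illustration and not limitation, this behavior corresponds to the standard linear spring-damper force law, sketched below in Python. The spring constant and damping coefficient values shown are placeholders and are not disclosed parameters of dampers 1108, 1110 or struts 1112, 1114.

    # Illustrative sketch only: the linear spring-damper relationship that the
    # strut/damper pairs approximate. k and c are tunable placeholders.
    def isolator_force(displacement, velocity, k=2000.0, c=50.0):
        """Force exerted on resonator 1120 by one strut/damper pair.

        displacement and velocity are the relative displacement (m) and
        relative velocity (m/s) between weapon mounting surface 1102 and
        resonator 1120. The spring term resists deformation; the damper term
        resists the rate of deformation, dissipating recoil energy.
        """
        return -k * displacement - c * velocity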
The dampers 1108 and 1110 are mounted on a weapon mounting surface 1102 via fastener 1104 and fastener 1106, respectively. Weapon mounting surface 1102 may comprise a weapon barrel (as illustrated in
A weapon recoil signal block 1402 includes one or more signal sources, representative of the recoil of a weapon resulting from weapon fire, that may be created to feed the simulation model 1400. The weapon recoil signal block 1402 may be configured to generate signals associated with a variety of weapon firing modes, such as full automatic, semi-automatic, and single fire. Signals from the weapon recoil signal block 1402 may be converted into physical control signals and fed into force source block 1406. Force source block 1406 may further receive configuration data from solver configuration block 1404. Solver configuration block 1404 may be used to specify simulation options and settings. Force source block 1406 generates a source of mechanical energy that produces force proportional to the input physical signal from weapon recoil signal block 1402.
The force from force source block 1406 is representative of weapon recoil at weapon body mass block 1410. The weapon body mass block 1410 may comprise a weapon mounting surface or a fastener of a recoil mount. The recoil mount comprises a damper represented by damper block 1416 and a spring represented by spring block 1414. Damper block 1416 is configurable with a given damping coefficient and spring block 1414 is configurable with a given spring constant to provide desired shock absorbing characteristics, which is described in further detail below. Tracker body mass block 1412 is representative of a virtual reality tracker that is attached to the recoil mount.
Position scope block 1432 is able to generate readings of simulation data corresponding to weapon position from weapon position sensor block 1420 and tracker position from tracker position sensor block 1422. Readings of simulation data corresponding to position error of a difference between the weapon position and the tracker position may be generated by position error scope block 1434 from tracker position error sensor block 1424. The tracker position error sensor block 1424 captures the position of the recoil mount to offset the difference in position between the weapon position and the tracker position, which ideally should be identical. Velocity scope block 1436 is able to generate readings of simulation data corresponding to weapon velocity signals from weapon recoil signal block 1402 and tracker velocity from tracker velocity sensor block 1418. The tracker velocity read by tracker velocity sensor block 1418 comprises the velocity signals from weapon recoil signal block 1402 that is translated through the recoil mount.
Weapon acceleration function block 1426 calculates weapon acceleration based on (i.e., as the derivative, with respect to time, of) the velocity signals from weapon recoil signal block 1402. Tracker acceleration function block 1428 calculates tracker acceleration based on (i.e., as the derivative, with respect to time, of) the velocity signals read at tracker velocity sensor block 1418. Tracker acceleration orientation block 1430 translates the acceleration from tracker acceleration function block 1428 according to the orientation in which the tracker is positioned on the recoil mount. Acceleration scope block 1438 is able to generate readings of simulation data corresponding to weapon acceleration from weapon acceleration function block 1426 and tracker acceleration from tracker acceleration orientation block 1430. Based on the readings produced by a combination of the position scope block 1432, position error scope block 1434, velocity scope block 1436, and acceleration scope block 1438, the damping coefficient of damper block 1416 and the spring constant of spring block 1414 may be adjusted or tuned to provide position, velocity, and acceleration readings that are suitable for correct operation of the tracker.
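By way of illustration and not limitation, the following Python listing sketches a simplified lumped-parameter analogue of simulation model 1400: a weapon mass and a tracker mass coupled by a spring and damper, driven by an assumed recoil pulse, with peak accelerations and position error reported so that candidate spring constants and damping coefficients can be compared against an assumed 8 G tracker limit. The masses, pulse shape, time step, and parameter values are assumptions chosen for illustration and do not correspond to the block parameters of simulation model 1400.

    # Illustrative sketch only: a simplified two-mass spring-damper analogue of
    # simulation model 1400. All numeric values are assumed for illustration.
    G = 9.81
    DT = 1e-5                 # integration time step (s)
    STEPS = 20000             # 0.2 s of simulated time

    def simulate(k, c, m_weapon=1.0, m_tracker=0.1):
        """Return peak weapon and tracker accelerations (G) and peak position error (m)."""
        x_w = v_w = x_t = v_t = 0.0
        peak_a_w = peak_a_t = peak_err = 0.0
        for step in range(STEPS):
            t = step * DT
            recoil = 400.0 if t < 0.005 else 0.0           # ~40 G pulse on a 1 kg weapon
            coupling = -k * (x_t - x_w) - c * (v_t - v_w)  # spring/damper force on tracker
            a_w = (recoil - coupling) / m_weapon           # equal and opposite reaction on weapon
            a_t = coupling / m_tracker
            v_w += a_w * DT                                # semi-implicit Euler integration
            x_w += v_w * DT
            v_t += a_t * DT
            x_t += v_t * DT
            peak_a_w = max(peak_a_w, abs(a_w) / G)
            peak_a_t = max(peak_a_t, abs(a_t) / G)
            peak_err = max(peak_err, abs(x_w - x_t))
        return peak_a_w, peak_a_t, peak_err

    # Sweep candidate spring constants and damping coefficients to see which
    # combinations keep the tracker's peak acceleration under an assumed 8 G
    # limit while keeping the weapon-to-tracker position error acceptably small.
    for k in (50.0, 200.0, 1000.0):
        for c in (1.0, 5.0, 20.0):
            a_w, a_t, err = simulate(k, c)
            status = "within limit" if a_t < 8.0 else "over limit"
            print(f"k={k:6.0f} c={c:4.0f}  weapon {a_w:5.1f} G  tracker {a_t:5.1f} G  "
                  f"error {err * 1000:6.1f} mm  ({status})")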
It should be understood that various aspects of the embodiments of the present invention could be implemented in hardware, firmware, software, or combinations thereof. In such embodiments, the various components and/or steps would be implemented in hardware, firmware, and/or software to perform the functions of the present invention. That is, the same piece of hardware, firmware, or module of software could perform one or more of the illustrated blocks (e.g., components or steps). In software implementations, computer software (e.g., programs or other instructions) and/or data is stored on a machine-readable medium as part of a computer program product and is loaded into a computer system or other device or machine via a removable storage drive, hard drive, or communications interface. Computer programs (also called computer control logic or computer-readable program code) are stored in a main and/or secondary memory, and executed by one or more processors (controllers, or the like) to cause the one or more processors to perform the functions of the invention as described herein. In this document, the terms “machine readable medium,” “computer-readable medium,” “computer program medium,” and “computer usable medium” are used to generally refer to media such as a random access memory (RAM); a read only memory (ROM); a removable storage unit (e.g., a magnetic or optical disc, flash memory device, or the like); a hard disk; or the like.
The foregoing description of the specific embodiments will so fully reveal the general nature of the invention that others can, by applying knowledge within the skill of the relevant art(s) (including the contents of the documents cited and incorporated by reference herein), readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present invention. Such adaptations and modifications are therefore intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by the skilled artisan in light of the teachings and guidance presented herein, in combination with the knowledge of one skilled in the relevant art(s).
This application is related to the following patents and applications, which are assigned to the assignee of the present invention: U.S. patent application Ser. No. 16/930,050, entitled “MAGAZINE SIMULATOR FOR USAGE WITH WEAPONS IN A VIRTUAL REALITY SYSTEM,” filed on Jul. 15, 2020, and U.S. patent application Ser. No. 16/930,060, entitled “A VIRTUAL REALITY SYSTEM FOR USAGE WITH SIMULATION DEVICES,” filed on Jul. 15, 2020. The above identified patents and applications are incorporated by reference herein in their entirety.