The following disclosure relates generally to smart optics, and, more specifically, to smart optics configured for high-accuracy data fusion.
Over the years, military weaponry, including the instruments used to assist a soldier in aiming their weapon, has constantly evolved. Initially, no instruments, or only very crude instruments, were used. Iron sights, which come in a variety of styles, are mechanical in nature, and are typically fixed to the front and rear of a weapon, were eventually adopted. While iron sights are still used, they are now typically relegated to serving as a backup to a more complex, but fragile, sight, such as a holographic, red dot, or magnified optic. Such optics are generally mounted to the top of a weapon through a standardized mounting rail, such as a Picatinny rail. Next-generation optics, sometimes referred to as “Smart Optics,” provide features beyond those found in current aiming instruments and often add functionality unrelated to aiming of the weapon.
What is needed, therefore, are systems that can be incorporated into, or operatively connected to, an optic, especially a smart optic mountable to a light or heavy weapon, such as an M4 carbine or similar low-SWaP-C weapons platform, that provide advanced features, such as precise synchronization of data obtained by a plurality of sensors, whether on the same smart optic or on multiple smart optics, preferably without requiring additional hardware, as well as methods of use thereof.
One object of the present disclosure is to limit the impact of traditionally “difficult” environments, such as urban canyons, indoors, woods, and GPS-contested tactical environments, on operations.
Another object of the present disclosure is to limit the drift over time between oscillators used across sensors.
Still another object of the present disclosure is to correct for drift that is inherent in IMUs.
One embodiment of the present disclosure provides a smart optic, the smart optic comprising: a mount configured to allow at least a portion of the smart optic to be mounted to a weapon; a time reference comprising a first oscillator, wherein the time reference is configured to output a signal comprising a periodically-repeating feature and time metadata; at least two sensors configured to gather data, each of the at least two sensors comprising secondary oscillators; and at least one processor in operative communication with at least one non-transitory storage medium and each of the at least two sensors; wherein each of the at least two sensors is in operative communication with the time reference and is configured to associate an edge of the periodically-repeating feature with a time conveyed by the time metadata, and wherein each of the at least two sensors is configured to gather data, to associate time metadata with the gathered data, and to send the gathered data with time metadata to the at least one processor, and wherein the at least one processor is configured to fuse the data gathered by each of the at least two sensors.
Another embodiment of the present disclosure provides such a smart optic, wherein the time metadata comprises an absolute time.
Still another embodiment of the present disclosure provides such a smart optic, wherein the time reference comprises a GPS or GNSS disciplined oscillator.
Even still another embodiment of the present disclosure provides such a smart optic, wherein the time reference comprises an atomic clock.
Even yet still another embodiment of the present disclosure provides such a smart optic, wherein the time reference comprises a temperature compensated crystal oscillator.
Even still yet another embodiment of the present disclosure provides such a smart optic, wherein the time reference is internal to the smart optic.
Even yet still another embodiment of the present disclosure provides such a smart optic, wherein the signal is overlaid on a power feed to the smart optic and passes through the time reference.
Even another embodiment of the present disclosure provides such a smart optic, wherein the signal overlaid on the power feed comprises an embedded voltage spike, or pulse, at a predetermined cadence.
Still even another embodiment of the present disclosure provides such a smart optic, wherein the pulse comprises a pulse train or specific waveform.
Still even yet another embodiment of the present disclosure provides such a smart optic, wherein a cadence of the pulse is one pulse per second.
Still yet even another embodiment of the present disclosure provides such a smart optic, wherein the power feed is provided by a rail system used to mount the smart optic to a weapon.
Even still another embodiment of the present disclosure provides such a smart optic, wherein the time reference is external to the smart optic.
Even still even another embodiment of the present disclosure provides such a smart optic, further comprising one or more additional time references, with at least one of the additional time references being associated with at least one sensor.
Even still yet even another embodiment of the present disclosure provides such a smart optic, wherein the at least one additional time reference is synchronized to the time reference.
Still yet even further embodiments of the present disclosure provide such a smart optic, wherein one of the at least two sensors is an attitude sensor and wherein the attitude sensor is selected from the group consisting of gyroscopes, inertial measurement units, and multi-antenna GPS or GNSS modules.
One embodiment of the present disclosure provides a system of synchronized smart optics, the system comprising: a first smart optic and at least one additional smart optic, wherein the first and additional smart optics are in operative communication with one another, and wherein one of the first and additional smart optics is configured to synchronize its time reference to the time reference of the other.
Another embodiment of the present disclosure provides such a system of synchronized smart optics, wherein data generated by the first smart optic is used to correct for errors on the additional smart optic.
Even another embodiment of the present disclosure provides such a system of synchronized smart optics, wherein time references used on the first and additional smart optics are disciplined to one another prior to use.
Even still another embodiment of the present disclosure provides such a system of synchronized smart optics, wherein the first and additional smart optics are placed into master/slave relationships, with the time reference of a master smart optic being used to discipline oscillators in slave smart optics.
One embodiment of the present disclosure provides a system of synchronized smart optics, the system comprising: at least two smart optics; and a synchronization module disposed between the at least two smart optics, wherein the smart optics are in operative communication with the synchronization module, and wherein the synchronization module is configured to synchronize the time references of the smart optics.
Implementations of the techniques discussed above may include a method or process, a system or apparatus, a kit, or computer software stored on a computer-accessible medium. The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
The features and advantages described herein are not all-inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been selected principally for readability and instructional purposes and not to limit the scope of the inventive subject matter.
These and other features of the present embodiments will be understood better by reading the following detailed description, taken together with the figures herein described. The accompanying drawings are not intended to be drawn to scale. For purposes of clarity, not every component may be labeled in every drawing.
Smart optics 100, which offer a wide range of capabilities, may generally be thought of as optics including computing capabilities that utilize at least one passive and/or active sensor to expand the capabilities of a weapon system on which the smart optic 100 is mounted. Smart optics 100 are typically weapon-mounted, include an eyepiece or other element through which a shooter can aim, and can record, and sometimes process, details of and/or otherwise interact with a region surrounding the smart optic 100 using an array of passive and/or active sensors, especially in the direction in which the weapon 302 to which the smart optic 100 is attached is pointed.
Current weapon sights and other types of smart optics 100 used by the military can typically supply data, including sensor data such as images, either to a central server or to local storage for storage and/or processing, but cannot mark the various data elements from different sensors on a single device with a time that is sufficiently accurate and consistent, between those data elements or between data elements on remote smart optics 100, to enable their use in many calculations. Providing a more internally-consistent time stamp for all data elements, whether on a single smart optic 100 or on multiple smart optics 100, would allow the data elements from disparate sensors to be combined, or fused, more accurately. This is important in targeting and other collaborative applications, especially where situational awareness is required, and would enable the use of such data in more applications.
For example, in recent years, training has included the use of firearms equipped with the ability to fire “eBullets”, with the “eBullet” consisting of a laser beam. These firearms can determine whether a shot would have hit a target with reasonable accuracy and provide benefits in terms of safety and cost over the use of standard projectiles. Such systems, however, require relatively large inertial measurement units (IMUs) and weapon-mounted lasers to function, have a range limited by the power of the laser, require line-of-sight visibility to the target, and fail to accurately account for bullet drop, i.e., to simulate the realistic trajectory that would occur with a real projectile.
While vision-only solutions, which identify downrange position offset via changes in image scale, have been attempted in the past, such systems are computationally complex and less precise. These scale-based approaches use a parameter search over scale in which image registration is performed for each scale parameter value. This additional search loop adds significant computational expense which can make deployment to small SWaP-C platforms unrealizable.
Other vision-only solutions use specially-crafted visual features that provide tolerance to scale differences. However, these feature approaches can fail when matching cross-modal imagery, for example IR camera imagery and synthetic range imagery. Feature match failure results from the potentially significantly different feature manifestation due to the different sensor phenomenologies. Machine learning-based approaches can work well, but require massive amounts of training data, and can be brittle when operating on data that has not been seen during training.
In such an application, image data, such as from a visible or Near Wavelength Infrared (NWIR) camera, that is very accurately combined with ranging data, such as from a laser rangefinder, whether from a single or multiple smart optics, or with position data supplied by the target itself (e.g. during training exercises), would allow for triangulation of a target, including a calculation of bullet drop, and subsequent determination of a ‘hit’. To accomplish such tasks, positional accuracy of less than 1 cm and angular accuracy of less than 1 milliradian are desirable. To obtain such accuracy, especially on small arms where the expected rate of angular change is relatively high, extremely accurate time-stamping of data is required. Without a way to synchronize the data to a consistent global, or system, time to a very high degree of accuracy, fusion of this data, and therefore triangulation, at least to a degree of accuracy that would be acceptable for this application, is not possible.
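To make the timing requirement concrete, the following back-of-the-envelope sketch (in Python, purely for illustration) estimates how closely timestamps must agree to stay within a 1 milliradian angular budget; the assumed slew rate of 300 degrees per second is a hypothetical figure for a rapidly traversing small arm, not a value taken from this disclosure.

```python
import math

# Hedged sketch: how tightly must timestamps agree to keep angular error
# below 1 milliradian? The 300 deg/s slew rate is an assumed figure for a
# rapidly traversing small arm, not a value from the disclosure.
slew_rate_deg_per_s = 300.0
slew_rate_rad_per_s = math.radians(slew_rate_deg_per_s)   # ~5.24 rad/s

angular_budget_rad = 1e-3                                  # 1 milliradian

# Angular error contributed by a timestamp mismatch dt is roughly
# slew_rate * dt, so the allowable mismatch is budget / slew_rate.
max_timestamp_error_s = angular_budget_rad / slew_rate_rad_per_s
print(f"Allowable timestamp disagreement: {max_timestamp_error_s * 1e6:.0f} microseconds")
# -> roughly 190 microseconds; free-running sensor clocks drift past this quickly.
```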
Before delving into the details of embodiments of the present disclosure, as used herein “system time” should be understood to refer to a synchronized time that is used by sensors on a single smart optic whereas “global time” is a synchronized time that is used by sensors across at least two smart optics, each in accordance with the teachings of the present disclosure.
Additionally, data fusion should be understood to refer to the joint analysis of multiple inter-related datasets that provide complementary views of the same phenomenon, which yields more accurate inferences than the analysis of a single dataset. In the context of the present disclosure, an example of data fusion, or fusing of data, would be the combination of camera and rangefinder data after the two have been synchronized to one another.
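By way of a non-limiting illustration, the following sketch shows one simple way such a fusion step might pair camera frames with rangefinder samples once both streams are stamped against the same time reference; the function name, data layout, and tolerance are assumptions introduced here for the example, not the implementation of any particular embodiment.

```python
# Minimal sketch of fusing two synchronized data streams by timestamp.
# The sample structures and tolerance are illustrative assumptions.
def fuse_by_timestamp(camera_frames, range_samples, tolerance_s=0.001):
    """Pair each camera frame with the rangefinder sample closest in time.

    Both lists hold (timestamp_s, payload) tuples stamped against the same
    time reference; pairs further apart than `tolerance_s` are discarded.
    """
    fused = []
    for t_frame, frame in camera_frames:
        t_range, dist = min(range_samples, key=lambda s: abs(s[0] - t_frame))
        if abs(t_range - t_frame) <= tolerance_s:
            fused.append({"time": t_frame, "frame": frame, "range_m": dist})
    return fused

# Example: frames at ~30 Hz, range samples at ~10 Hz, both on the shared time base.
frames = [(0.000, "img0"), (0.033, "img1"), (0.066, "img2")]
ranges = [(0.0005, 412.3), (0.1004, 411.9)]
print(fuse_by_timestamp(frames, ranges))   # only img0 pairs within 1 ms
```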
Furthermore, a disciplined oscillator should be understood to refer to an oscillator whose output frequency is continuously adjusted, such as through the use of a phase locked loop (PLL), to agree with an external time reference. For example, a GPS disciplined oscillator (GPSDO) usually consists of a quartz or rubidium oscillator whose output frequency is continuously adjusted to agree with signals broadcast by GPS satellites.
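As an illustrative sketch only, the following toy loop stands in for the disciplining behavior described above: a simple proportional correction steers a local oscillator toward a reference frequency, whereas a real GPSDO would implement the equivalent control in a hardware phase-locked loop.

```python
# Toy sketch of disciplining a local oscillator to an external reference.
# A real GPSDO uses a phase-locked loop; here a simple proportional
# correction stands in for that loop, purely for illustration.
def discipline(local_freq_hz, nominal_hz=10_000_000.0, gain=0.5, steps=8):
    """Repeatedly nudge the local frequency toward the reference frequency."""
    freq = local_freq_hz
    for step in range(steps):
        error_hz = nominal_hz - freq          # measured against the reference
        freq += gain * error_hz               # proportional correction
        print(f"step {step}: {freq:.3f} Hz (error {nominal_hz - freq:+.3f} Hz)")
    return freq

# A 10 MHz oscillator that starts 20 Hz (2 ppm) fast converges toward nominal.
discipline(10_000_020.0)
```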
Lastly, synchronization, as used herein, refers to the process of disciplining oscillators, in embodiments to a time reference 138, in embodiments an external time reference 138. This can be done, for instance, by receiving a signal comprising a periodic pulse, such as a voltage spike, and including time metadata and then associating an edge of the pulse with a change in time across elements to be synchronized.
Now referring specifically to the Figures,
In embodiments, the time reference 138 is in communication with the processor 106 and at least one sensor and is configured to provide data representative of a time or change in time thereto, such that data generated by the sensors can be associated with a highly precise time that does not differ between sensors. In embodiments, the data representative of a time or change in time comprises a signal comprising a regular, repeating pulse and further comprising time metadata. An edge of the pulse (whether leading or trailing) can then be associated with the exact moment of a change in time, such as from one second to the next, allowing the time metadata to be used to mark data generated by the sensors, such as with a timestamp, very accurately. In embodiments, the edge of the pulse is used to discipline oscillators 140 on each of the sensors as well as on the processor 106, ensuring that the time recorded by each element of the system is in agreement. In embodiments, this pulse is used to discipline oscillators 140 substantially continuously, while, in others, disciplining is done once or periodically. Embodiments then utilize the time metadata specifying the time associated with the pulse to allow for accurate time-stamping of sensor data. In embodiments, the pulse is generated externally from the smart optic 100 and conveyed thereto via a rail 300 on which the smart optic 100 is configured to be mounted, in embodiments being overlaid on a power feed.
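The following sketch, offered purely as an illustration, shows one way a sensor might associate a pulse edge with the time conveyed by the metadata and then stamp subsequent samples against it; the class name, counter rate, and example values are assumptions, not the implementation of any particular embodiment.

```python
# Hedged sketch of stamping sensor samples against a 1 PPS edge plus time
# metadata. The class and tick rate are illustrative assumptions.
class PpsTimestamper:
    def __init__(self, ticks_per_second=1_000_000):
        self.ticks_per_second = ticks_per_second   # local free-running counter rate
        self.edge_tick = None                      # counter value latched at last edge
        self.edge_time_s = None                    # absolute time conveyed by metadata

    def on_pulse_edge(self, counter_tick, metadata_time_s):
        """Associate the pulse edge with the absolute time from the metadata."""
        self.edge_tick = counter_tick
        self.edge_time_s = metadata_time_s

    def stamp(self, counter_tick):
        """Convert a local counter value into an absolute timestamp."""
        if self.edge_tick is None:
            raise RuntimeError("no pulse edge observed yet")
        elapsed = (counter_tick - self.edge_tick) / self.ticks_per_second
        return self.edge_time_s + elapsed

ts = PpsTimestamper()
ts.on_pulse_edge(counter_tick=4_000_000, metadata_time_s=1_700_000_000.0)
print(ts.stamp(counter_tick=4_250_000))   # 0.25 s after the edge
```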
In embodiments, the smart optic 100 is mountable to a rail system 300, such as a Picatinny rail, in embodiments via a rail mount 102.
In embodiments, the memory 116 is a non-transitory storage device.
In embodiments, the smart optic 100 includes at least two sensors, a processor 106, a non-transitory storage medium 116, and a time reference 138, which may be local or remote, with the time reference being used to discipline at least one oscillator associated with at least one of the two sensors. In embodiments, multiple processors 106 are used.
These embodiments are only exemplary and various details may differ without departing from the inventive aspects of the present disclosure, as would be known to one of ordinary skill in the art.
Referring to
In embodiments, the processor 106 is configured to send time metadata to the at least one sensor 400, which the at least one sensor is configured to associate with the repeating feature.
Now referring to
In embodiments, the IADM 200 is mountable to a rail system 300 and includes a rail mount 102. In embodiments, the IADM 200 is configured to be in operative communications, which may be wired or wireless, with the smart optic 100.
In embodiments, the attitude sensor 202 comprises an inertial measurement unit (IMU) or a gyroscope 202 while, in other embodiments, it comprises a GPS or GNSS module 202 with multiple antennas 208 that allow attitude to be inferred. Additional configurations, as would be apparent to one of ordinary skill in the art, are also possible.
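By way of illustration only, the following sketch shows how a heading might be inferred from the positions reported by two antennas 208 of such a module; the local east/north coordinates and example baseline are assumptions introduced for the example.

```python
import math

# Illustrative sketch of inferring heading from two GNSS antenna positions,
# one way a multi-antenna module can supply attitude. Local east/north
# coordinates in meters are assumed; the positions are invented.
def heading_from_baseline(ant_rear_en, ant_front_en):
    """Return heading in degrees clockwise from north for the rear->front baseline."""
    east = ant_front_en[0] - ant_rear_en[0]
    north = ant_front_en[1] - ant_rear_en[1]
    return math.degrees(math.atan2(east, north)) % 360.0

# Front antenna 0.3 m ahead and 0.3 m east of the rear antenna -> ~45 degrees.
print(heading_from_baseline((0.0, 0.0), (0.3, 0.3)))
```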
In embodiments, the IADM 200 is not used and attitude is determined using elements, such as an attitude sensor 202, for example an IMU 118, gyroscope, GPS, or GNSS module 202, disposed in or outside of the smart optic 100.
Now referring to
In the embodiments shown in
In embodiments where only a single smart optic 100 is used, the time used as a reference time may be a relative time (e.g. time since boot), rather than an absolute time.
In embodiments, the smart optic 100 is capable of being mounted to light and/or heavy weapons 302, such as the M4 carbine in common use by infantry. In embodiments, the smart optic 100 is mounted to the weapon 302 via a universal mounting system, such as a rail system 300, in embodiments a Picatinny rail.
Exemplary embodiments of the present disclosure provide the ability to time synchronize the data output by at least two sensors on a single smart optic 100, in embodiments via a shared time reference 138, which may be an internal or external reference, such as an atomic clock, a Temperature Compensated Crystal Oscillator (TCXO), a Global Positioning System Disciplined Oscillator (GPSDO), or any other oscillator or time reference 138 of sufficient accuracy (oscillators may also be referred to herein as clocks). This time reference 138 is then used in conjunction with a pulse train to associate a very precise time with data generated by each of the at least two sensors, allowing such data to be combined with a very high degree of accuracy. In embodiments, the time reference 138 is used to apply a very precise time stamp to metadata associated with the data generated by each of the at least two sensors.
In embodiments, at least one time reference 138 is external to the smart optic 100.
In embodiments, additional time references 138 may be used, with each time reference 138 associated with at least one sensor. In embodiments, the additional time reference(s) 138 are synchronized, or disciplined to, a specific time reference 138.
In embodiments, a signal associated with the time reference 138 is overlaid on an existing input to the smart optic 100, such as a power feed. In such embodiments, the signal comprises a voltage spike at a given cadence, e.g., 1 PPS (Pulse Per Second) embedded in a power feed, with an edge of the voltage spike being configured to exactly correspond to a change in the time reference 138 (e.g. a change from one second to the next). This pulse edge can then be detected by the smart optic 100 and, in embodiments after combination with metadata, internal, or external data, provide an absolute time associated with a previous event (e.g. the previous pulse edge) that can be used to precisely synchronize the timestamp(s) associated with data generated by the at least two of the sensors on the smart optic 100.
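As a simplified illustration of detecting such an embedded spike, the following sketch thresholds a sampled power-feed voltage and reports the rising edges; the sample rate, nominal voltage, and threshold are assumptions chosen for the example.

```python
# Minimal sketch of detecting the embedded pulse edge on a sampled power feed.
# The sample rate, nominal voltage, and threshold are illustrative assumptions.
def detect_rising_edges(voltage_samples, sample_rate_hz, threshold_v=5.5):
    """Return sample times at which the feed voltage first crosses the threshold."""
    edges = []
    above = False
    for i, v in enumerate(voltage_samples):
        if v >= threshold_v and not above:
            edges.append(i / sample_rate_hz)
            above = True
        elif v < threshold_v:
            above = False
    return edges

# A 5 V feed with one short spike to 6 V embedded in this capture.
samples = [5.0] * 10 + [6.0] * 2 + [5.0] * 10
print(detect_rising_edges(samples, sample_rate_hz=1000))   # edge at 0.01 s
```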
Such features and techniques allow sensor data obtained from such a smart optic 100 to be very accurately fused together and thereby expand the potential use cases thereof. For example, such a configuration allows the combination of laser range finder 132 data with camera data, such as from a visible camera 128 or Long Wavelength Infrared (LWIR) camera 130, to be used in place of laser painting for smart munitions, as is currently done. It also allows target location information provided by networked smart optics 100 to be combined with information such as attitude, pointing direction, barometric data, relative humidity, temperature, and trigger-pull information from a targeting smart optic 100 to determine a hit or miss in training exercises, allowing factors such as bullet drop to be taken into account and distinctions between cover and concealment to be made. In embodiments, smart optics 100 between a shooter and target are configured to provide additional information that could affect where a bullet will impact, such as temperature, atmospheric pressure, wind speed and/or direction, humidity, etc.
In embodiments, smart optics 100 used in training exercises do not require a visible line-of-sight to a target to determine if a shot would have impacted the target; targets occluded by vehicles or foliage may still be engaged, and hit, depending upon the caliber of the projectiles and a knowledge of the materials between the shooter and target. In embodiments, cover or concealment located between the shooter and target is automatically identified as such depending on what type of weapon and/or what type of projectiles the shooter is equipped with (for training purposes, these could simply be programmed into the smart optic 100 before use, or training magazines that are associated with a specific type of ammunition could be configured to communicate with the smart optic 100, allowing it to determine what type of ammunition should be used for calculations throughout a training exercise).
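Purely as an illustration of factoring bullet drop into such a hit/miss determination, the following sketch uses a flat-fire, drag-free approximation; the muzzle velocity, target size, and geometry are invented, and a fielded system would use a proper ballistic model.

```python
# Hedged sketch of factoring bullet drop into a hit/miss determination.
# Flat-fire, no-drag approximation; the values are made up for illustration.
def vertical_drop_m(range_m, muzzle_velocity_mps, g=9.81):
    time_of_flight_s = range_m / muzzle_velocity_mps
    return 0.5 * g * time_of_flight_s ** 2

def is_hit(aim_height_m, target_height_m, range_m, muzzle_velocity_mps,
           target_half_height_m=0.25):
    impact_height_m = aim_height_m - vertical_drop_m(range_m, muzzle_velocity_mps)
    return abs(impact_height_m - target_height_m) <= target_half_height_m

# Aiming dead-on at 400 m with a ~900 m/s round drops roughly 1 m, so the
# shot misses unless the shooter (or the optic) compensates with holdover.
print(vertical_drop_m(400, 900))          # ~0.97 m
print(is_hit(1.0, 1.0, 400, 900))         # False without holdover
```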
In embodiments, multiple smart optics 100 are synchronized to a global time. In such embodiments, camera and laser range finder 132 data from individual soldiers carrying, for example, light weapons 302, e.g. an M4, equipped with smart optics 100 in accordance with embodiments of the present disclosure can be combined, in embodiments over wireless and/or wired networks, which in some cases are decentralized and/or peer-to-peer, to perform collaborative triangulation and targeting functions.
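By way of a non-limiting example, the following sketch triangulates a target from bearings reported by two such synchronized optics at known positions; the two-dimensional east/north coordinates and example bearings are assumptions introduced for illustration.

```python
import math

# Illustrative sketch of collaborative triangulation: two synchronized optics
# at known positions each report a bearing to the same target at the same
# global time, and the target position follows from intersecting the two rays.
# 2-D east/north coordinates and the example numbers are assumptions.
def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays (degrees clockwise from north) from p1 and p2."""
    d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
    d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 using the 2-D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; no unique intersection")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two shooters 100 m apart, both sighting the same target -> ~(50, 50).
print(triangulate((0.0, 0.0), 45.0, (100.0, 0.0), 315.0))
```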
In embodiments, the time reference 138, which, in embodiments, is a GPS or GNSS disciplined oscillator (GPSDO or GNSSDO), is used on each of a plurality of smart optics 100 to discipline oscillators thereon and establish a consistent global time therebetween. In such embodiments, each time reference may be disciplined to one another prior to use, at regular intervals, at each opportunity (e.g. when a GPS signal is available), or as otherwise needed.
In embodiments, the time reference 138 of each smart optic 100 comprises an atomic clock that is used directly or to discipline oscillators on the smart optic(s) 100 and provide a consistent global time therebetween, in embodiments using a master/slave relationship. In such embodiments, each atomic clock may be disciplined to one another prior to use, at regular intervals, or as otherwise needed.
In embodiments, the time reference 138 of each smart optic 100 comprises a Temperature Compensated Crystal Oscillator (TCXO) that is used directly or to discipline oscillators on the smart optic(s) 100 and provide a consistent global time therebetween, in embodiments using a master/slave relationship. In such embodiments, each TCXO may be disciplined to one another prior to use, at regular intervals, or as otherwise needed.
In embodiments, wireless communications between smart optics 100 are used to synchronize the smart optics 100 using a synchronization module 136 disposed between smart optics 100. In embodiments, wireless communications between smart optics 100 comprise ad-hoc or peer-to-peer networks.
In embodiments, a network protocol, such as Network Time Protocol (NTP) or Precision Time Protocol (PTP), as would be known to one of ordinary skill in the art, may be used to determine the transmission delay between networked devices in real-time and to synchronize the time between smart optics 100.
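The following sketch illustrates the offset and delay estimate that NTP-style exchanges rely on, using the standard four timestamps; the example values are invented and the calculation assumes a roughly symmetric path delay.

```python
# Sketch of the offset/delay estimate underlying NTP-style exchanges, shown
# only to illustrate how transmission delay can be measured in real time
# between networked optics; the timestamps are invented.
def ntp_offset_and_delay(t1, t2, t3, t4):
    """t1: request sent, t2: request received, t3: reply sent, t4: reply received.

    Returns (clock offset of remote relative to local, round-trip delay),
    assuming the path delay is roughly symmetric.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Remote clock 3 ms ahead of the local clock, ~2 ms of network delay each way.
offset, delay = ntp_offset_and_delay(t1=100.000, t2=100.005, t3=100.006, t4=100.005)
print(offset, delay)   # offset ~0.003 s, round-trip delay ~0.004 s
```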
In embodiments, data is exchanged between devices using a blockchain, which, in embodiments, is encrypted.
In embodiments, the power feed is provided by a ‘smart’ rail system 300, which, in the context of the present disclosure, should be understood to be a rail system 300, such as a Picatinny rail, that provides power at each potential mounting point. An example of such a smart rail system is the T-Worx Intelligent Rail® and Rail Operating System (ROS), which conforms to U.S. Army Smart Rail requirements and NATO STANAG 4740.
In embodiments, an electrical pulse is transmitted through the rail system 300 to the smart optic 100, providing an accurate time reference that, in embodiments, is used to discipline the time reference 138 of each smart optic 100 while, in other embodiments, the time reference 138 is used to discipline oscillators associated with individual sensors or groups thereof associated with each smart optic 100.
In embodiments, the pulse is a single pulse while, in other embodiments, it is a pulse train or specific waveform.
In embodiments, metadata providing a time associated with a pulse, such as National Marine Electronics Association (NMEA) messages sent with GPS data, is transmitted separately from the pulse.
In embodiments, time references 138 and/or oscillators are disciplined prior to use while, in other embodiments, they are disciplined on a continuous or regular basis or when the opportunity to do so otherwise arises.
In embodiments, smart optics 100 are placed into master/slave relationships, with the master smart optic 100 being used to discipline oscillators in slave smart optics 100.
In embodiments, local data, i.e. data generated by a single smart optic 100, is used to correct for errors on a second or subsequent smart optic 100. In embodiments, this is done by adding an offset to a time associated with data received by the second or subsequent smart optic 100. For instance, where the smart optics 100 are able to capture at least a portion of the same data and/or have a clear line of sight to one another, the overlapping data can be matched up, and a time associated with the data on a second or subsequent smart optic 100 can then be offset or otherwise adjusted so that the overlapping data observed by the smart optics 100 is associated with a consistent time across the smart optics 100.
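As an illustration of one such correction, the following sketch estimates the clock offset between two optics from a handful of overlapping observations and applies it to the second optic's timestamps; the event times and the median-based estimator are assumptions used for the example.

```python
# Hedged sketch of estimating the clock offset between two optics from
# overlapping observations. Both optics timestamp the same sequence of
# events; the offset that best aligns them is applied as a correction to
# the second optic. Event times here are invented.
def estimate_offset(events_a, events_b):
    """Return the median per-event difference (b - a), assuming both lists
    describe the same events in the same order."""
    diffs = sorted(tb - ta for ta, tb in zip(events_a, events_b))
    return diffs[len(diffs) // 2]

# Optic B's clock runs ~12 ms ahead of optic A's.
events_a = [10.000, 10.750, 11.300, 12.050]
events_b = [10.012, 10.762, 11.311, 12.063]
offset = estimate_offset(events_a, events_b)
corrected_b = [t - offset for t in events_b]
print(offset, corrected_b)
```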
In embodiments, the use of such a rail system 300 to discipline a time reference 138, such as an oscillator or clock, associated with one or more sensors within a smart optic 100 allows the smart optic 100 itself to utilize fewer components, thus simplifying the design while reducing size, weight, and power requirements (SWaP).
In embodiments, the rail system 300 is configured for communications and is used to synchronize voltage spikes in its power feed at a given cadence across a plurality of rails 300 and/or other devices.
In embodiments, the smart optic 100 is wholly contained within a unitary housing while, in other embodiments, it is distributed amongst a plurality of modules, which may or may not be networked or located in the same location (e.g. a part of the system described herein may be remote from the user/available over a network).
In embodiments, the modules and techniques described herein may be distributed over multiple devices, including over a network; the elements described herein do not need to coexist within a single unitary housing to function as described herein.
The foregoing description of the embodiments of the present disclosure has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the present disclosure to the precise form disclosed. Many modifications and variations are possible in light of this disclosure. It is intended that the scope of the present disclosure be limited not by this detailed description, but rather by the claims appended hereto.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the scope of the disclosure. Although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results.