Inertial head-tracking systems are compact and low cost, but they accumulate drift over time that makes them inadequate for precision pointing applications. "Ground truth" reference methods are applied at regular intervals to correct the accumulated misalignment. Existing drift-correction methods, such as inertial-optical blending, require expensive cameras and/or are computationally intensive.
In one aspect, embodiments of the inventive concepts disclosed herein are directed to a system of light emitting diodes (LEDs) and photoreceptors. The LEDs and photoreceptors are disposed about the environment and on head-worn goggles or a helmet, respectively. A processor periodically receives measurements from the photoreceptors and resets inertial drift based on those measurements.
In a further aspect, either the LEDs or the photoreceptors are constrained to a specified range to allow for more accurate drift estimation.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and should not restrict the scope of the claims. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments of the inventive concepts disclosed herein and together with the general description, serve to explain the principles.
The numerous advantages of the embodiments of the inventive concepts disclosed herein may be better understood by those skilled in the art by reference to the accompanying figures in which:

FIG. 1 shows a block diagram of a system according to an exemplary embodiment;

FIG. 2 shows an environmental view of a system according to an exemplary embodiment;

FIG. 3 shows an environmental view of a system including multiple helmets according to an exemplary embodiment;

FIG. 4 shows a view of a photoreceptor range restriction element according to an exemplary embodiment;

FIG. 5 shows a view of a light source range restriction element according to an exemplary embodiment;

FIG. 6 shows a view of a helmet according to an exemplary embodiment; and

FIG. 7 shows a flowchart of a method according to an exemplary embodiment.
Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
As used herein a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), or both A and B are true (or present).
In addition, the articles "a" and "an" are employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts; "a" and "an" are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
Finally, as used herein any reference to "one embodiment" or "some embodiments" means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase "in some embodiments" in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
Broadly, embodiments of the inventive concepts disclosed herein are directed to a drift-correction system for inertial head tracking using light sources and photoreceptors. While specific embodiments described herein are directed toward head-tracking systems, the principles described are generally applicable to any system with one or more photoreceptors rigidly affixed to one body and one or more light sources rigidly affixed to a second body.
Referring to FIG. 1, a block diagram of a system 100 according to an exemplary embodiment is shown. The system 100 includes at least one processor 102, a plurality of light emitting diodes (LEDs) 106, and a plurality of photoreceptors 108 in data communication with the at least one processor 102. Either the LEDs 106 or the photoreceptors 108 are disposed on a head-worn device such as a helmet or goggles, with the other disposed at known locations in the surrounding environment.
During inertial head-tracking, head pose estimates deviate over time. In at least one embodiment, the processor 102 is configured to determine a drift correction by periodically receiving values from the photoreceptors 108 corresponding to an intensity of light from the LEDs 106. The intensity values generally correspond to more or less direct exposure to the LEDs 106. Multiple, contemporaneous photoreceptor values may be combined to identify an estimated head pose; the estimated head pose is compared to the head pose determined via inertial head-tracking. The difference between the estimated head pose and the inertially determined pose may comprise a measure of drift. A correction factor is determined based on that difference and applied to future inertial head-tracking poses until the next drift determination.
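By way of illustration only, the following is a minimal sketch of such a correction step, representing poses as rotations; the use of scipy's Rotation type and the function names are assumptions of this sketch, not features of any claimed embodiment:

```python
# Minimal sketch: drift correction as a rotation offset between the
# photoreceptor-derived pose estimate and the drifted inertial pose.
from scipy.spatial.transform import Rotation

def drift_correction(estimated: Rotation, inertial: Rotation) -> Rotation:
    # Correction satisfying: correction * inertial == estimated.
    return estimated * inertial.inv()

def apply_correction(correction: Rotation, inertial: Rotation) -> Rotation:
    # Applied to every subsequent inertial pose until the next update.
    return correction * inertial

# Example: the inertial tracker has drifted 2 degrees in yaw.
truth = Rotation.from_euler("zyx", [30.0, 5.0, 0.0], degrees=True)
drifted = Rotation.from_euler("zyx", [32.0, 5.0, 0.0], degrees=True)
corr = drift_correction(truth, drifted)
print(apply_correction(corr, drifted).as_euler("zyx", degrees=True))  # ~[30, 5, 0]
```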
Alternatively, or in addition, the processor 102 may receive a head pose from an inertial head-tracking system and determine anticipated photoreceptor intensity values. The anticipated photoreceptor values may be compared to received photoreceptor values to determine a drift correction.
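As an illustrative sketch of the anticipated-intensity computation, one might model each LED as a point source with inverse-square falloff and a cosine incidence term; the model, positions, and the measured value below are hypothetical assumptions, not a description of any particular embodiment:

```python
import numpy as np

def predicted_intensity(led_pos, receptor_pos, receptor_normal, power=1.0):
    """Idealized intensity at a photoreceptor from a point-source LED:
    inverse-square falloff scaled by the cosine of the incidence angle."""
    ray = led_pos - receptor_pos
    dist = np.linalg.norm(ray)
    cos_incidence = np.dot(ray / dist, receptor_normal)
    return power * max(cos_incidence, 0.0) / dist**2

# The LED position implied by the inertial head pose yields an anticipated
# value; its residual against the received value drives the drift estimate.
led = np.array([0.2, 0.1, 1.0])        # hypothetical LED position (meters)
receptor = np.array([0.0, 0.0, 0.0])   # photoreceptor at the cabin origin
normal = np.array([0.0, 0.0, 1.0])     # photoreceptor boresight direction
residual = 0.85 - predicted_intensity(led, receptor, normal)  # 0.85: measured
```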
In at least one embodiment, the LEDs 106 are adapted to operate at a wavelength that is not visible to the human eye to avoid distracting pilots, for night-vision compatibility, and for sunlight rejection. Combinations of LEDs 106 and photoreceptors 108 may be adapted to operate in distinct wavelengths such that the photoreceptors 108 only register light from one or more specific LEDs 106. Such embodiments may operate with greater accuracy within a limited range of likely head poses.
In at least one embodiment, each of the LEDs 106 may be configured to flash at a specific frequency such that corresponding photoreceptors 108 may distinguish received intensity values. For example, a photoreceptor 108 may periodically register a first intensity at a first frequency; likewise, the same photoreceptor 108 may periodically register a second intensity at a second frequency. Furthermore, other photoreceptors 108 may similarly receive intensity values at those frequencies; the disparity in intensity values may be used to triangulate the relative position of each LED 106 with respect to the photoreceptors 108, and thereby produce an estimated head pose.
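A minimal sketch of how one photoreceptor channel might separate two LEDs flashing at distinct rates, here by reading the magnitude of the sampled signal's spectrum at each flash frequency; the sample rate, frequencies, and amplitudes are assumptions for illustration:

```python
import numpy as np

fs = 2000.0                    # assumed photoreceptor sample rate (Hz)
led_freqs = [170.0, 230.0]     # assumed distinct flash rates for two LEDs
t = np.arange(0, 0.5, 1 / fs)  # half-second measurement window

# Simulated photoreceptor signal: two LEDs seen at different intensities,
# plus sensor noise.
signal = (0.8 * np.sin(2 * np.pi * led_freqs[0] * t)
          + 0.3 * np.sin(2 * np.pi * led_freqs[1] * t)
          + 0.05 * np.random.randn(t.size))

# Per-LED intensity = spectral magnitude at that LED's flash frequency.
spectrum = np.abs(np.fft.rfft(signal)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)
for f in led_freqs:
    idx = np.argmin(np.abs(freqs - f))
    print(f"LED at {f:.0f} Hz -> intensity ~{spectrum[idx]:.2f}")  # ~0.80, ~0.30
```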
In at least one embodiment, the system 100 further comprises a data storage element 110 in data communication with the at least one processor 102. The data storage element 110 may store relational values associating photoreceptor values with estimated head poses.
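One hedged sketch of such a stored relation is a nearest-neighbor table: intensity vectors recorded at known head poses, queried with a live measurement. The table contents and dimensions below are hypothetical:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Hypothetical stored table: 500 calibration samples of 8 photoreceptor
# intensities, each recorded at a known head pose (yaw, pitch, roll).
stored_intensities = rng.random((500, 8))
stored_poses = rng.uniform(-60.0, 60.0, (500, 3))  # degrees

tree = cKDTree(stored_intensities)

def lookup_pose(measured: np.ndarray) -> np.ndarray:
    """Return the head pose associated with the nearest stored
    photoreceptor intensity vector."""
    _, idx = tree.query(measured)
    return stored_poses[idx]

print(lookup_pose(rng.random(8)))
```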
Referring to FIG. 2, an environmental view of a system according to an exemplary embodiment is shown. Light sources 204, 206 are disposed on a helmet 202, and photoreceptors 208, 210, 212 are disposed at known locations about the cabin to register light from the light sources 204, 206.
Alternatively, in at least one embodiment, light sources may be disposed about the cabin while the photoreceptors are disposed on the helmet 202. Furthermore, existing in-cabin light sources may be utilized, provided they are sufficiently distinguishable by the corresponding photoreceptors.
Light sources 204, 206 and photoreceptors 208, 210, 212 may be disposed to substantially isolate specific light source/photoreceptor pairs, limiting interference when the helmet 202 is within a range of expected poses.
Referring to FIG. 3, an environmental view of a system including multiple helmets 302, 304 according to an exemplary embodiment is shown. Light sources are disposed on each helmet 302, 304, and photoreceptors disposed about the cabin register light from each helmet 302, 304 such that each may be tracked independently.
Alternatively, in at least one embodiment, light sources may be disposed about the cabin while the photoreceptors are disposed on the helmets 302, 304. Furthermore, existing in-cabin light sources may be utilized, provided they are sufficiently distinguishable by the corresponding photoreceptors.
Referring to FIG. 4, a photoreceptor range restriction element according to an exemplary embodiment is shown. The range restriction element limits the angular range over which a photoreceptor registers incident light, such that light from a given source is registered only within a constrained set of relative orientations.
Referring to FIG. 5, a light source range restriction element according to an exemplary embodiment is shown. The range restriction element limits the angular range over which a light source emits light, such that a given photoreceptor registers the light source only within a constrained set of relative orientations.
It may be appreciated that a combination of photoreceptor range restriction elements (FIG. 4) and light source range restriction elements (FIG. 5) may be employed to further isolate specific light source and photoreceptor pairs and thereby limit interference.
Referring to FIG. 6, a view of a helmet 600 according to an exemplary embodiment is shown. Light sources 602, 604 are disposed at known locations on the helmet 600.
It may be appreciated that, while embodiments are shown depicting light sources 602, 604 disposed on the helmet 600, all of the same principles may be applied to a system where the light sources are disposed about the cockpit and the photoreceptors are disposed on the helmet 600.
It may be appreciated that while embodiments are shown with light sources 602, 604 disposed on a helmet 600, other embodiments are envisioned. For example, in at least one embodiment, a pilot may have goggles with embedded light sources 602, 604. Such embodiments may allow for a more compact form factor while still closely associating the light sources 602, 604 with the pilot's eyes.
Referring to FIG. 7, a flowchart of a method according to an exemplary embodiment is shown. Signals are received from one or more photoreceptors, each corresponding to light from one or more light sources, and the signals are distinguished according to the wavelength or flashing frequency of each light source.
Based on the distinguished signals, signal strengths are analyzed 708 and relative locations of each photoreceptor and light source are identified 710. In at least one embodiment, the photoreceptors are disposed at known locations and the light sources are disposed relative to each other; in other embodiments, the light sources are disposed at known locations and the photoreceptors are disposed relative to each other. In at least one embodiment, locations may be determined via triangulation algorithms specific to the disposition of the elements. Alternatively, a neural network algorithm may be trained via associations of photoreceptor values and locations or head poses.
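For the triangulation branch, a minimal sketch under an assumed inverse-square intensity model: each measured intensity implies a distance to the source, and the source position is recovered by least squares over several photoreceptors at known locations. The geometry, source power, and use of scipy.optimize are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

# Four photoreceptors at known cabin locations (meters), assumed geometry.
receptor_pos = np.array([[0.0, 0.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0],
                         [0.0, 0.0, 1.0]])
power = 1.0  # assumed (calibrated) source power

def residuals(source_xyz, measured):
    # Inverse-square model: predicted intensity minus measured intensity.
    dist = np.linalg.norm(receptor_pos - source_xyz, axis=1)
    return power / dist**2 - measured

true_source = np.array([0.4, 0.3, 0.6])  # ground truth for the demo
measured = power / np.linalg.norm(receptor_pos - true_source, axis=1) ** 2

fit = least_squares(residuals, x0=np.full(3, 0.5), args=(measured,))
print(fit.x)  # ~[0.4, 0.3, 0.6]
```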
In at least one embodiment, an estimated head pose is determined 712 based on the identified locations of the elements. The estimated head pose is compared 714 to a head pose determined via an inertial head-tracking system. The difference between the estimated pose and the output of the inertial head-tracking system comprises a drift correction factor. The drift correction factor is applied 716 to future inertial head poses until the next drift correction factor is determined. In at least one embodiment, the method is applied periodically; for example, every ten seconds.
In at least one embodiment, neural network algorithms may be trained to produce a drift correction based on photoreceptor inputs and inertial head-tracking inputs.
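As a hedged sketch of that alternative, consider a small regression network mapping photoreceptor intensities plus the inertial pose to a drift correction; the layer sizes, input dimensions, and use of PyTorch are assumptions of this sketch:

```python
import torch
from torch import nn

# Hypothetical dimensions: 8 photoreceptor intensities + 3 inertial Euler
# angles in; a 3-element drift correction (degrees) out.
model = nn.Sequential(
    nn.Linear(11, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 3),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(inputs: torch.Tensor, target_correction: torch.Tensor) -> float:
    """One supervised step on (photoreceptor + inertial) -> correction pairs."""
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), target_correction)
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with synthetic data: a batch of 16 training pairs.
loss = train_step(torch.randn(16, 11), torch.randn(16, 3))
```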
It is believed that the inventive concepts disclosed herein and many of their attendant advantages will be understood by the foregoing description of embodiments of the inventive concepts disclosed, and it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the broad scope of the inventive concepts disclosed herein or without sacrificing all of their material advantages; and individual features from various embodiments may be combined to arrive at other embodiments. The form hereinbefore described being merely an explanatory embodiment thereof, it is the intention of the following claims to encompass and include such changes. Furthermore, any of the features disclosed in relation to any of the individual embodiments may be incorporated into any other embodiment.