Cognitive systems are seeing increasing use in military applications. In one example, a cognitive model is used to determine the cognitive state of a soldier in order to adapt the communication and/or display of information to the soldier. In another example, a cognitive model is used to make friend or foe determinations.
In some such cognitive applications, navigation-related information is used as an input to a cognitive model in order to improve the quality of the outputs of the cognitive model. However, cognitive data is typically not used as an input to a navigation system in order to improve the quality of the outputs of the navigation system.
One exemplary embodiment is directed to a method of navigation comprising receiving, from at least one sensor, physiological sensor data indicative of at least one physiological attribute of a person and generating navigation-related information derived from at least some of the physiological sensor data.
In the particular exemplary embodiment shown in
As used herein, a “navigation solution” comprises information about the location (position) and/or movement of the user 104 and/or vehicle. Examples of such information include information about a past, current, or future absolute location of the user 104 and/or vehicle, a past, current, or future relative location of the user 104 and/or vehicle, a past, current, or future velocity of the user 104 and/or vehicle, and/or a past, current, or future acceleration of the user 104 and/or vehicle. A navigation solution can also include information about the location and/or movement of other persons or objects within the environment of interest.
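The kinds of information a navigation solution can carry, as listed above, can be illustrated with a simple data structure. This is a hypothetical sketch only; the field names and types are illustrative assumptions, not part of the described system.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class NavigationSolution:
    """Hypothetical container for the navigation solution 102.

    Each field is optional because a solution may carry only a subset
    of the location/movement information described above.
    """
    position: Optional[Tuple[float, float, float]] = None      # absolute or relative location (x, y, z)
    velocity: Optional[Tuple[float, float, float]] = None      # velocity of the user and/or vehicle
    acceleration: Optional[Tuple[float, float, float]] = None  # acceleration of the user and/or vehicle
    timestamp: float = 0.0  # the past, current, or future instant the fields describe
```

A solution describing only a current position would populate `position` and `timestamp` and leave the movement fields unset.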
The navigation solution 102 can be used to determine the current location of the vehicle and/or user 104. The navigation solution 102 can also be used in a mapping process in which a particular environment of interest is mapped. The navigation solution 102 can also be used for other applications.
The system 100 comprises one or more physiological sensors 106 located on or near the user 104 (for example, mounted directly to the user 104, mounted to a helmet, strap, or item of clothing worn by the user 104, and/or mounted to a structure near the user 104 while the user 104 is within or near the system 100). In general, the physiological sensors 106 generate data associated with one or more physiological attributes of the user 104. More specifically, in the particular exemplary embodiment shown in
The system 100 further comprises one or more programmable processors 110 for executing software 112. The software 112 comprises program instructions that are stored (or otherwise embodied) on an appropriate storage medium or media 114 (such as flash or other non-volatile memory, magnetic disc drives, and/or optical disc drives). At least a portion of the program instructions are read from the storage medium 114 by the programmable processor 110 for execution thereby. The storage medium 114 on or in which the program instructions are embodied is also referred to here as a “program product”. Although the storage media 114 is shown in
One or more input devices 118 are communicatively coupled to the programmable processor 110 by which the user 104 is able to provide input to the programmable processor 110 (and the software 112 executed thereby). Examples of input devices include a keyboard, keypad, touch-pad, pointing device, button, switch, and microphone. One or more output devices 120 are also communicatively coupled to the programmable processor 110 on or by which the programmable processor 110 (and the software 112 executed thereby) is able to output information or data to or for the user 104. Examples of output devices 120 include visual output devices, such as liquid crystal displays (LCDs) and light emitting diodes (LEDs), and audio output devices, such as speakers. In the exemplary embodiment shown in
In the particular exemplary embodiment described here in connection with
One exemplary embodiment of a method of determining or generating a navigation state from at least some of the physiological data output by the physiological sensors 106 is described below in connection with
In the particular exemplary embodiment shown in
In the exemplary embodiment shown in
In this exemplary embodiment, the software 112 also comprises imaging functionality 140 that generates navigation information (also referred to here as “imaging navigation information”) from the imaging sensor data generated by the imaging sensors 138. The imaging navigation information is generated in a conventional manner using techniques known to one of ordinary skill in the art.
In this exemplary embodiment, the INS functionality 142 also comprises sensor fusion functionality 144. As a part of generating the navigation solution 102, the sensor fusion functionality 144 combines the navigation information generated by the physiological sensor functionality 124, the inertial navigation information generated by the INS functionality 142, the imaging navigation information generated by the imaging functionality 140, and GPS data generated by the GPS receiver 136. In this exemplary embodiment, the navigation information generated by the physiological sensor functionality 124, the inertial navigation information generated by the INS functionality 142, the imaging navigation information generated by the imaging functionality 140, and the GPS data generated by the GPS receiver 136 each have a respective associated confidence level or uncertainty estimate that the sensor fusion functionality 144 uses to combine such navigation-related information. In one implementation of such an exemplary embodiment, the sensor fusion functionality 144 comprises a Kalman filter. In other implementations, other sensor fusion techniques are used.
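The uncertainty-weighted combination performed by the sensor fusion functionality 144 can be sketched with inverse-variance weighting, a simplification of the Kalman measurement update for a scalar state. This is an illustrative sketch, not the described implementation; the source names a Kalman filter as one implementation and leaves others open.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of scalar estimates.

    estimates: list of (value, variance) pairs from independent
    sources (e.g. inertial, GPS, imaging, and physiological-sensor
    navigation information), where a smaller variance corresponds
    to a higher confidence level.
    Returns the fused value and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # always smaller than any input variance
    return fused_value, fused_variance
```

For example, fusing a high-confidence estimate (10.0, variance 1.0) with a low-confidence one (12.0, variance 4.0) yields a value pulled mostly toward the first source, with lower variance than either input.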
By using the cognitive and other physiological information derived from the physiological data generated by the physiological sensors 106 as an input, the sensor fusion functionality 144 can improve the navigation solution 102 output by the INS functionality 142. For example, in environments where GPS is not available and/or where the imaging sensors 138 are not reliable or operable, the navigation information derived from the physiological sensors 106 can still be used to control inertial navigation information error growth.
In the particular exemplary embodiment shown in
In the particular exemplary embodiment described here in connection with
Method 200 comprises, as a part of an offline process 202 performed prior to the system 100 being used on a live mission, training the neural network 146 using a set of training data (block 204). The set of training data is obtained by having the user 104 perceive various navigation-related experiences or stimuli and capturing the physiological sensor data that is generated by the physiological sensors 106 in response to each such experience or stimulus. The training data can be captured in an off-line process performed in a controlled environment and/or during “live” missions where the user 104 has perceived the relevant navigation-related experiences or stimuli. Examples of such navigation-related experiences or stimuli include, without limitation, positioning the user 104 at various locations within an environment of interest, moving the user 104 in or at various directions, inclines, speeds, and/or rates of acceleration, having the user 104 view various objects of interest while positioned at various locations within the environment of interest, having the user 104 view various objects of interest while moving in or at various directions, inclines, speeds, and/or rates of acceleration, having the user 104 view various landmarks of interest while positioned at various locations within the environment of interest, and having the user 104 view various “friendly” and “foe” vehicles or persons.
In this exemplary embodiment, individual training data is captured for particular users 104 of the system 100 so that each such user 104 has a separate instantiation of the neural network 146 that is trained with the individual training data that has been captured specifically for that user 104. Training data can also be captured for several users (for example, users of a particular type or class) and the captured data can be used to train a single instantiation of the neural network 146 that is used, for example, for users of that type or class. This latter approach can also be used to create a “default” instantiation of the neural network 146 that is used, for example, in the event that a user 104 for whom no other neural network instantiation is available uses the system 100.
The captured training data is used to train the neural network 146 in a conventional manner using techniques known to one of ordinary skill in the art.
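The training step can be illustrated with a much simpler stand-in for the neural network 146: a multi-class perceptron that learns to map physiological feature vectors to navigation states. The features (a normalized heart-rate and cadence measure), the state labels, and the toy training samples below are all hypothetical assumptions for illustration; the source specifies a neural network trained with conventional techniques.

```python
# Hypothetical navigation states associated with physiological sensor data.
STATES = ["stationary", "walking", "running"]

# Toy training set: each sample is [normalized heart rate, normalized cadence].
SAMPLES = [[0.1, 0.0], [0.2, 0.1], [0.5, 0.5], [0.6, 0.4], [0.9, 0.9], [1.0, 0.8]]
LABELS = ["stationary", "stationary", "walking", "walking", "running", "running"]

def predict(w, x):
    """Return the state whose linear score w[state] . [x, 1] is highest."""
    xb = x + [1.0]  # append bias term
    return max(STATES, key=lambda s: sum(wi * xi for wi, xi in zip(w[s], xb)))

def train(samples, labels, epochs=200):
    """Multi-class perceptron: update weights only on misclassified samples."""
    w = {s: [0.0, 0.0, 0.0] for s in STATES}  # one weight vector (with bias) per state
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = predict(w, x)
            if pred != y:
                xb = x + [1.0]
                for i, xi in enumerate(xb):
                    w[y][i] += xi      # reinforce the correct state
                    w[pred][i] -= xi   # penalize the wrongly predicted state
    return w
```

After training on data captured while the user actually was stationary, walking, or running, the learned weights let the system associate fresh physiological sensor data with a navigation state.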
Method 200 further comprises, during a live mission 206, receiving physiological sensor data from one or more of the physiological sensors 106 (block 208) and using the trained neural network 146 to associate one or more navigation states (or other navigation-related information) with a particular set of the received physiological sensor data (block 212). In the particular exemplary embodiment described here in connection with
Method 200 further comprises, during a live mission 206, using the set of navigation states and associated confidence levels output by the neural network 146 as inputs to the sensor fusion functionality 144 (block 212). In the particular exemplary embodiment described here in connection with
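The per-state confidence levels that accompany the navigation states output by the neural network 146 can be sketched by normalizing raw network scores with a softmax, so that the confidences sum to one and can feed the sensor fusion functionality 144. The state names and raw scores below are hypothetical; the source does not specify how confidence levels are computed.

```python
import math

def softmax(scores):
    """Convert raw per-state scores into confidence levels summing to 1."""
    m = max(scores.values())  # subtract the max for numerical stability
    exps = {s: math.exp(v - m) for s, v in scores.items()}
    total = sum(exps.values())
    return {s: e / total for s, e in exps.items()}

# Hypothetical raw scores from the trained network for one window of
# physiological sensor data received during a live mission.
scores = {"at_landmark_A": 2.0, "at_landmark_B": 0.5, "unknown": 0.1}
confidences = softmax(scores)
best_state = max(confidences, key=confidences.get)
```

Each (state, confidence) pair can then be supplied to the sensor fusion stage alongside the inertial, imaging, and GPS inputs, with low-confidence states contributing little weight.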
As noted above, by using the cognitive and other physiological information derived from the physiological sensor data generated by the physiological sensors 106 as an input, the sensor fusion functionality 144 can use this cognitive and other physiological information to improve the navigation solution 102 output by the INS functionality 142 (for example, to control inertial navigation information error growth in environments where GPS is not available and/or where the imaging sensors are not reliable or operable).
In the particular exemplary embodiment described above in connection with
The methods and techniques described here may be implemented in digital electronic circuitry, or with a programmable processor (for example, a special-purpose processor or a general-purpose processor such as a computer), firmware, software, or in combinations of them. Apparatus embodying these techniques may include appropriate input and output devices, a programmable processor, and a storage medium tangibly embodying program instructions for execution by the programmable processor. A process embodying these techniques may be performed by a programmable processor executing a program of instructions to perform desired functions by operating on input data and generating appropriate output. The techniques may advantageously be implemented in one or more programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and DVD disks. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs).
A number of embodiments of the invention defined by the following claims have been described. Nevertheless, it will be understood that various modifications to the described embodiments may be made without departing from the spirit and scope of the claimed invention. Accordingly, other embodiments are within the scope of the following claims.