PUPIL DYNAMICS, PHYSIOLOGY, AND PERFORMANCE FOR ESTIMATING COMPETENCY IN SITUATIONAL AWARENESS

Information

  • Patent Application
  • Publication Number
    20250000372
  • Date Filed
    June 30, 2023
  • Date Published
    January 02, 2025
Abstract
A computer system records eye tracking data and identifies movements in the eye tracking data to determine gaze and pupil dynamics. Eye tracking data is correlated with physiological data including heart rate. The system monitors the operator's heart rate in real time to evaluate whether the operator exhibits an anticipatory response before switching attention. The system differentiates between behaviors that are cognitively initiated and those triggered by the environment. The system correlates eye tracking data and physiological data with previously identified knowledge about the training scenario to assess whether the operator is able to direct attention to the appropriate stimuli based on task needs. The system examines whether there is follow-through behavior of shifting gaze to specific instruments as defined by the current task.
Description
BACKGROUND

The increasing expense of training for complicated tasks, such as pilot training, coupled with rapid advancements in computer modeling and simulation capabilities, is fostering substantial growth in the use of simulation-based training technologies. Additionally, low-cost, commercial-off-the-shelf (COTS) technologies are beginning to supplant custom simulation environments. However, training in general, and training via simulated environments in particular, has substantial limitations.


Situation awareness is difficult to measure. Currently, measuring a trainee's situation awareness requires pausing training or simulation exercises and administering self-report surveys. This process is time-consuming and can take a trainee out of a simulation, affecting the realism of the experience. However, assessing the operator's level of situation awareness in situ is important for assessing their competency and readiness.


Consequently, it would be advantageous if an apparatus existed that is suitable for monitoring a trainee's situational awareness during a training operation.


SUMMARY

In one aspect, embodiments of the inventive concepts disclosed herein are directed to a computer system that records eye tracking data. The system identifies movements in the eye tracking data to determine gaze and pupil dynamics. Eye tracking data is correlated with physiological data including heart rate. The system monitors the operator's heart rate in real time to evaluate whether the operator exhibits an anticipatory response before switching attention. The system differentiates between behaviors that are cognitively initiated and those triggered by the environment.


In a further aspect, the system correlates eye tracking data and physiological data with previously identified knowledge about the training scenario to assess whether the operator is able to direct attention to the appropriate stimuli based on task needs.


In a further aspect, the system examines whether there is follow-through behavior of shifting gaze to specific instruments as defined by the current task.


It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and should not restrict the scope of the claims. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments of the inventive concepts disclosed herein and together with the general description, serve to explain the principles.





BRIEF DESCRIPTION OF THE DRAWINGS

The numerous advantages of the embodiments of the inventive concepts disclosed herein may be better understood by those skilled in the art by reference to the accompanying figures in which:



FIG. 1 shows a block diagram of a system suitable for implementing embodiments of the inventive concepts disclosed herein;



FIG. 2 shows a flowchart of an exemplary embodiment of the inventive concepts disclosed herein; and



FIG. 3 shows a block diagram of a neural network according to an exemplary embodiment of the inventive concepts disclosed herein.





DETAILED DESCRIPTION

Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.


As used herein, a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.


Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, use of “a” or “an” is employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and “a” and “an” are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.


Finally, as used herein, any reference to “one embodiment” or “some embodiments” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.


Broadly, embodiments of the inventive concepts disclosed herein are directed to a computer system that records eye tracking data. The system identifies movements in the eye tracking data to determine gaze and pupil dynamics. Eye tracking data is correlated with physiological data including heart rate. The system monitors the operator's heart rate in real time to evaluate whether the operator exhibits an anticipatory response before switching attention. The system differentiates between behaviors that are cognitively initiated and those triggered by the environment. The system correlates eye tracking data and physiological data with previously identified knowledge about the training scenario to assess whether the operator is able to direct attention to the appropriate stimuli based on task needs. The system examines whether there is follow-through behavior of shifting gaze to specific instruments as defined by the current task.


Referring to FIG. 1, a block diagram of a system 100 suitable for implementing embodiments of the inventive concepts disclosed herein is shown. The system 100 includes a processor 102, memory 104 in data communication with the processor 102 for storing processor executable code, one or more eye tracking sensors/cameras 108 for receiving an eye tracking data stream, and one or more physiological sensors 110, including at least one heart rate monitor. Physiological sensors 110 may include devices such as an electroencephalograph (EEG), an electrocardiogram (ECG), a galvanic skin response (GSR) sensor, a pulse sensor, or any other such biometric data sensing device.
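By way of illustration only, and not as part of the disclosed system, the following Python sketch shows one way the sensor streams described above might be represented in software; the field names, units, and sampling assumptions are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    """One time-stamped eye tracking sample (hypothetical field layout)."""
    t: float            # seconds since session start
    gaze_x: float       # normalized horizontal gaze coordinate
    gaze_y: float       # normalized vertical gaze coordinate
    pupil_mm: float     # pupil diameter, millimeters
    eyelid_open: float  # eyelid aperture, 0.0 (closed) to 1.0 (open)

@dataclass
class PhysioSample:
    """One time-stamped physiological sample (hypothetical field layout)."""
    t: float               # seconds since session start
    heart_rate_bpm: float  # instantaneous heart rate, beats per minute
```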


In at least one embodiment, the eye tracking sensors 108 record eye movement/gaze of a pilot and eyelid position. The processor executable code configures the processor 102 to continuously log the eye tracking data in a data storage element 106. The processor 102 analyzes the eye tracking data to identify gaze and pupil dynamics (e.g., pupil response and changes over time). The processor 102 also receives physiological data from one or more physiological sensors 110. The physiological data may include, but is not limited to, the user's heart rate. In at least one embodiment, the processor 102 correlates eye tracking data (including at least gaze and pupil dynamics) with physiological data (including at least heart rate).
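A minimal sketch of this correlation step, assuming both streams carry timestamps and are sorted in time; nearest-neighbor alignment is an illustrative choice, as the disclosure does not specify a correlation method.

```python
import bisect

def correlate(eye_samples, physio_samples):
    """Pair each eye sample with the nearest-in-time physiological sample.

    Assumes both lists are sorted by timestamp and non-empty.
    """
    times = [p.t for p in physio_samples]
    paired = []
    for e in eye_samples:
        i = bisect.bisect_left(times, e.t)
        # Consider the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - e.t))
        paired.append((e, physio_samples[j]))
    return paired
```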


In at least one embodiment, eye tracking data and physiological data are correlated with discrete portions of a training scenario, and/or specific stimuli such as instrument readings, alerts, or the like. The processor 102 determines whether the eye tracking data and physiological data correspond to predetermined behaviors indicative of volitional attention shifting. In some instances, stimuli may induce an involuntary shift in gaze. Embodiments of the present disclosure are directed toward identifying an anticipatory response (e.g., gaze shift and pupil changes) correlated to subsequent actions generally anticipated in response to a given scenario. For example, for some stimuli (e.g., an alert message), a user should anticipate a need for certain information from the available instruments at some point in the future. The processor 102 may determine if the user begins shifting gaze to those instruments within some threshold time. This provides a method to differentiate between behaviors that are cognitively initiated as opposed to triggered by the environment.
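The threshold-time check described above might be sketched as follows; the region representation and the 2-second window are assumptions, not values from the disclosure.

```python
def follow_through_detected(eye_samples, stimulus_t, instrument_regions,
                            threshold_s=2.0):
    """Check for a follow-through gaze shift to an expected instrument.

    instrument_regions: (x0, x1, y0, y1) gaze-space boxes for the instruments
    the current task defines as relevant.
    """
    for e in eye_samples:
        if stimulus_t <= e.t <= stimulus_t + threshold_s:
            if any(x0 <= e.gaze_x <= x1 and y0 <= e.gaze_y <= y1
                   for (x0, x1, y0, y1) in instrument_regions):
                return True
    return False
```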


In at least one embodiment, the processor 102 may compare the eye tracking and physiological data to stored profiles. Such profiles may be specific to the user and indicate a user specific minimum threshold. Alternatively, or in addition, the profiles may represent some standard minimum response.
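A minimal sketch of such a profile comparison, assuming a profile reduces to two hypothetical thresholds (maximum response latency and minimum pupil change); the disclosure does not define the profile's contents.

```python
def meets_profile(latency_s, pupil_delta_mm, profile):
    """Compare an observed response against a stored minimum profile.

    `profile` may be user specific or a standard minimum; the keys and the
    two-metric comparison are illustrative assumptions.
    """
    return (latency_s <= profile["max_latency_s"]
            and abs(pupil_delta_mm) >= profile["min_pupil_delta_mm"])
```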


In at least one embodiment, the processor 102 may alert a remote party such as ground control personnel via a wireless communication device 112. Alternatively, or in addition, the processor 102 may render the determination of cognitively initiated responses on a display 114. For example, in a training scenario, an instructor may continuously monitor cognitively initiated responses (as compared to stimuli triggered responses) on the display 114.


In at least one embodiment, the processor 102 transfers the stored eye tracking data and other correlated system and task data to an offline storage device for later analysis and correlation to historic data and other outside factors such as crew rest, crew sleep rhythms, flight schedules, etc. Such transfer may be in real time via the wireless communication device 112.


Referring to FIG. 2, a flowchart of an exemplary embodiment of the inventive concepts disclosed herein is shown. A computer system implementing embodiments of the inventive concepts disclosed herein receives 200 an image stream corresponding to eye tracking data from one or more vision-based sensors and physiological data, including at least heart rate data, from one or more physiological sensors. The eye tracking data and physiological data are continuously logged 202 and correlated to a specific flight task or an individual duty schedule of the user/pilot.
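One possible shape of the receive-and-log loop of steps 200 and 202 is sketched below, assuming the two streams yield samples at a common rate so they can be zipped; the CSV layout and stream interfaces are illustrative only.

```python
import csv

def log_session(eye_stream, physio_stream, task_id, path="session_log.csv"):
    """Continuously log time-aligned samples tagged with the current flight task.

    eye_stream / physio_stream are assumed to be iterables yielding the
    sample dataclasses sketched earlier.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "task_id", "gaze_x", "gaze_y",
                         "pupil_mm", "heart_rate_bpm"])
        for e, p in zip(eye_stream, physio_stream):
            writer.writerow([e.t, task_id, e.gaze_x, e.gaze_y,
                             e.pupil_mm, p.heart_rate_bpm])
```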


In at least one embodiment, the eye tracking data is analyzed 204 to characterize gaze and pupil dynamics as cognitively initiated or stimuli triggered. Such analysis 204 may include processing via machine learning or neural network algorithms. Cognitive initiation may be characterized by timing; for example, gaze shifts and/or pupil dynamics that are stimuli triggered may be generally faster and more cursory than cognitively initiated gaze shifts.
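A hedged sketch of such a timing heuristic; the latency and dwell cutoffs are assumed values for illustration and do not appear in the disclosure.

```python
def classify_by_timing(latency_s, dwell_s,
                       reflex_latency_s=0.25, min_dwell_s=0.5):
    """Timing-based characterization of a gaze shift.

    Fast, cursory shifts are treated as stimuli triggered; slower, sustained
    shifts as cognitively initiated.
    """
    if latency_s < reflex_latency_s and dwell_s < min_dwell_s:
        return "stimuli_triggered"
    return "cognitively_initiated"
```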


In at least one embodiment, gaze and pupil dynamics may be analyzed 204 along with physiological data such as heart rate. Gaze and pupil dynamics correlated to certain heart rate ranges, and in the context of specific stimuli, may indicate cognitively initiated responses as opposed to stimuli triggered responses. In at least one embodiment, small involuntary eye movements (and potentially eyelid movement/position) may be identified and used to characterize the gaze and pupil dynamics.
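The heart-rate context might contribute a simple feature such as the following; the baseline comparison and the 3 bpm rise are assumptions.

```python
def anticipatory_heart_rate(pre_shift_hr_bpm, baseline_bpm, rise_bpm=3.0):
    """Flag an anticipatory heart-rate rise preceding a gaze shift.

    A modest rise above the operator's baseline before the shift is treated
    as consistent with cognitive initiation.
    """
    return (pre_shift_hr_bpm - baseline_bpm) >= rise_bpm
```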


The determination of cognitive initiation may be ex post facto. In at least one embodiment, cognitively initiated responses may be associated with a set of anticipatory responses. If the set of anticipatory responses is later identified, the initial response may be characterized as cognitively initiated.
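A sketch of the ex post facto upgrade described above, assuming responses are reduced to labeled events; the event and label names are hypothetical.

```python
def reclassify_ex_post_facto(initial_label, later_events, anticipatory_set):
    """Upgrade an initial characterization once anticipatory follow-through is seen.

    later_events: events observed after the initial response; anticipatory_set:
    the set of responses expected if the shift was cognitively initiated.
    """
    if anticipatory_set and anticipatory_set <= set(later_events):
        return "cognitively_initiated"
    return initial_label
```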


In at least one embodiment, eye tracking data and physiological data may be compared 206 to task or user specific profiles. The task or user specific profiles may define a minimum characterization of cognitive initiation, based either on previously observed responses of known experts or on previously observed responses of the user.
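One illustrative way to derive such a minimum profile from historical responses, whether of experts or of the user; the median-based thresholds are an assumed choice.

```python
from statistics import median

def build_profile(latencies_s, pupil_deltas_mm):
    """Derive a minimum profile from previously observed responses.

    The inputs could come from known experts (a standard profile) or from the
    user's own history (a user specific profile).
    """
    return {
        "max_latency_s": median(latencies_s),
        "min_pupil_delta_mm": median(abs(d) for d in pupil_deltas_mm),
    }
```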


Embodiments of the inventive concepts disclosed herein are critical to enabling reduced crew or single pilot operations, and will provide independent measures necessary to facilitate reduced crew in the cockpit by providing a means to identify when crew members are unable to continue safe flight and to notify relief crew or activate automation. Furthermore, a training application may utilize embodiments of the inventive concepts to compare the small involuntary eye movements of a pilot-in-training to previously characterized professional pilot data patterns. Improved methods of assessing an operator's ability to acquire and maintain situation awareness are important for training. As such, they can serve as enabling technologies for single pilot operations or reduced crew operations.


Referring to FIG. 3, a block diagram of a neural network 300 according to an exemplary embodiment of the inventive concepts disclosed herein is shown. The neural network 300 comprises an input layer 302 that receives external inputs (including physiological signals, such as EEG, ECG, and GSR, eye tracking data, and potentially user or task specific profiles), an output layer 304, and a plurality of internal layers 306, 308. Each layer comprises a plurality of neurons or nodes 310, 336, 338, 340. In the input layer 302, each node 310 receives one or more inputs 318, 320, 322, 324 corresponding to a digital signal and produces an output 312 based on an activation function unique to each node 310 in the input layer 302. An activation function may be a hyperbolic tangent function, a linear output function, and/or a logistic function, or some combination thereof, and different nodes 310, 336, 338, 340 may utilize different types of activation functions. In at least one embodiment, the input to such an activation function comprises the sum of each input multiplied by a corresponding synaptic weight. The output 312 may comprise a real value with a defined range, or a Boolean value if the activation function surpasses a defined threshold. Such ranges and thresholds may be defined during a training process. Furthermore, the synaptic weights are determined during the training process.
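The per-node computation described above (a weighted input sum passed through a hyperbolic tangent, logistic, or linear activation, optionally thresholded to a Boolean) might be sketched as follows; the function signature is hypothetical.

```python
import math

def node_output(inputs, weights, activation="tanh", threshold=None):
    """One node: activation applied to the sum of inputs times synaptic weights.

    Returns a real value, or a Boolean when a trained threshold is supplied,
    mirroring the real-valued/Boolean outputs described above.
    """
    s = sum(x * w for x, w in zip(inputs, weights))
    if activation == "tanh":
        y = math.tanh(s)
    elif activation == "logistic":
        y = 1.0 / (1.0 + math.exp(-s))
    else:  # linear output function
        y = s
    return y if threshold is None else (y > threshold)
```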


Outputs 312 from each of the nodes 310 in the input layer 302 are passed to each node 336 in a first intermediate layer 306. The process continues through any number of intermediate layers 306, 308, with each intermediate layer node 336, 338 having a unique set of synaptic weights corresponding to each input 312, 314 from the previous intermediate layer 306, 308. It is envisioned that certain intermediate layer nodes 336, 338 may produce a real value with a range while other intermediate layer nodes 336, 338 may produce a Boolean value. Furthermore, it is envisioned that certain intermediate layer nodes 336, 338 may utilize a weighted input summation methodology while others utilize a weighted input product methodology. It is further envisioned that synaptic weights may correspond to bit shifting of the corresponding inputs 312, 314, 316.
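A minimal feed-forward pass consistent with the layer structure described above, using tanh throughout for brevity; the network shape, weights, and the decoding of the final score are arbitrary assumptions.

```python
import math

def forward_pass(x, layers):
    """Feed-forward propagation: every node receives all outputs of the
    previous layer and applies tanh to its weighted input sum.

    `layers` is a list of layers, each a list of per-node weight vectors.
    """
    for layer in layers:
        x = [math.tanh(sum(xi * wi for xi, wi in zip(x, weights)))
             for weights in layer]
    return x

# Hypothetical 4-2-1 network; the final score might be read as cognitively
# initiated when near +1 and stimuli triggered when near -1 (an assumed
# decoding, not from the disclosure).
net = [
    [[0.2, -0.1, 0.4, 0.05], [-0.3, 0.2, 0.1, 0.0]],  # intermediate layer
    [[0.5, -0.8]],                                     # output layer
]
score = forward_pass([0.7, 0.2, 0.1, 0.3], net)[0]
```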


An output layer 304 including one or more output nodes 340 receives the outputs 316 from each of the nodes 338 in the previous intermediate layer 308. Each output node 340 produces a final output 326, 328, 330, 332, 334 via processing the previous layer inputs 316, the final output 326, 328, 330, 332, 334 corresponding to a characterization of a set of physiological data as either cognitively initiated or stimuli triggered. Such outputs may comprise separate components of an interleaved input signal, bits for delivery to a register, or other digital output based on an input signal and DSP algorithm.


In at least one embodiment, each node 310, 336, 338, 340 in any layer 302, 306, 308, 304 may include a node weight to boost the output value of that node 310, 336, 338, 340 independent of the weighting applied to the output of that node 310, 336, 338, 340 in subsequent layers 304, 306, 308. It may be appreciated that certain synaptic weights may be zero to effectively isolate a node 310, 336, 338, 340 from an input 312, 314, 316, from one or more nodes 310, 336, 338 in a previous layer, or an initial input 318, 320, 322, 324.


In at least one embodiment, the number of processing layers 302, 304, 306, 308 may be constrained at a design phase based on a desired data throughput rate. Furthermore, multiple processors and multiple processing threads may facilitate simultaneous calculations of nodes 310, 336, 338, 340 within each processing layer 302, 304, 306, 308.


Layers 302, 304, 306, 308 may be organized in a feed forward architecture where nodes 310, 336, 338, 340 only receive inputs from the previous layer 302, 304, 306 and deliver outputs only to the immediately subsequent layer 304, 306, 308, or a recurrent architecture, or some combination thereof.


It is believed that the inventive concepts disclosed herein and many of their attendant advantages will be understood by the foregoing description of embodiments of the inventive concepts disclosed, and it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the broad scope of the inventive concepts disclosed herein or without sacrificing all of their material advantages; and individual features from various embodiments may be combined to arrive at other embodiments. The form herein before described being merely an explanatory embodiment thereof, it is the intention of the following claims to encompass and include such changes. Furthermore, any of the features disclosed in relation to any of the individual embodiments may be incorporated into any other embodiment.

Claims
  • 1. A computer apparatus comprising: at least one eye tracking camera; and at least one processor in data communication with a memory storing processor executable code; and wherein the processor executable code configures the at least one processor to: receive an image stream from the at least one eye tracking camera; identify gaze and pupil dynamics from the image stream; correlate the gaze and pupil dynamics with one or more stimuli; and characterize the gaze and pupil dynamics as either cognitively initiated or stimuli triggered.
  • 2. The computer apparatus of claim 1, further comprising one or more physiological data recording devices in data communication with the at least one processor, wherein: the processor executable code further configures the at least one processor to: receive physiological data from the one or more physiological data recording devices, including at least a heart rate; and correlate the physiological data with the gaze and pupil dynamics; and characterizing the gaze and pupil dynamics includes reference to the physiological data.
  • 3. The computer apparatus of claim 2, wherein: the processor executable code further configures the at least one processor to receive a task or user specific profile of gaze, pupil dynamics, and physiological data; and characterizing the gaze and pupil dynamics includes reference to the task or user specific profile.
  • 4. The computer apparatus of claim 1, further comprising a data storage element in data communication with the at least one processor, wherein the processor executable code further configures the at least one processor to continuously store the gaze and pupil dynamics in the data storage element.
  • 5. The computer apparatus of claim 4, wherein: the processor executable code further configures the at least one processor to: identify subsequent user-initiated responses; and correlate the subsequent responses to the stored gaze and pupil dynamics; and characterizing the gaze and pupil dynamics includes reference to the identified subsequent user-initiated responses to determine if the user-initiated responses correspond to anticipatory responses.
  • 6. The computer apparatus of claim 4, wherein the processor executable code further configures the at least one processor to: analyze the stored gaze and pupil dynamics over time to identify a user specific profile of a user; and subsequently compare the gaze and pupil dynamics to the user specific profile.
  • 7. The computer apparatus of claim 1, wherein the processor executable code further configures the at least one processor as a machine learning neural network.
  • 8. A method comprising: receiving an image stream from at least one eye tracking camera; identifying gaze and pupil dynamics from the image stream; correlating the gaze and pupil dynamics with one or more stimuli; and characterizing the gaze and pupil dynamics as either cognitively initiated or stimuli triggered.
  • 9. The method of claim 8, further comprising: receiving physiological data from one or more physiological data recording devices, including at least a heart rate; and correlating the physiological data with the gaze and pupil dynamics, wherein characterizing the gaze and pupil dynamics includes reference to the physiological data.
  • 10. The method of claim 9, further comprising receiving a task or user specific profile of gaze, pupil dynamics, and physiological data, wherein characterizing the gaze and pupil dynamics includes reference to the task or user specific profile.
  • 11. The method of claim 8, further comprising continuously storing the gaze and pupil dynamics in a data storage element.
  • 12. The method of claim 11, further comprising: identifying subsequent user-initiated responses; and correlating the subsequent responses to the stored gaze and pupil dynamics, wherein characterizing the gaze and pupil dynamics includes reference to the identified subsequent user-initiated responses to determine if the user-initiated responses correspond to anticipatory responses.
  • 13. The method of claim 11, further comprising: analyzing the stored gaze and pupil dynamics over time to identify a user specific profile of a user; and subsequently comparing the gaze and pupil dynamics to the user specific profile.
  • 14. The method of claim 8, further comprising initiating a remedial action when the gaze and pupil dynamics are characterized as stimuli triggered and no anticipatory response is identified.
  • 15. A simulator comprising: at least one eye tracking camera; and at least one processor in data communication with a memory storing processor executable code; and wherein the processor executable code configures the at least one processor to: receive an image stream from the at least one eye tracking camera; identify gaze and pupil dynamics from the image stream; correlate the gaze and pupil dynamics with one or more stimuli; and characterize the gaze and pupil dynamics as either cognitively initiated or stimuli triggered.
  • 16. The simulator of claim 15, further comprising one or more physiological data recording devices in data communication with the at least one processor, wherein: the processor executable code further configures the at least one processor to: receive physiological data from the one or more physiological data recording devices, including at least a heart rate; and correlate the physiological data with the gaze and pupil dynamics; and characterizing the gaze and pupil dynamics includes reference to the physiological data.
  • 17. The simulator of claim 16, wherein: the processor executable code further configures the at least one processor to receive a task or user specific profile of gaze, pupil dynamics, and physiological data; and characterizing the gaze and pupil dynamics includes reference to the task or user specific profile.
  • 18. The simulator of claim 15, further comprising a data storage element in data communication with the at least one processor, wherein the processor executable code further configures the at least one processor to continuously store the gaze and pupil dynamics in the data storage element.
  • 19. The simulator of claim 18, wherein: the processor executable code further configures the at least one processor to: identify subsequent user-initiated responses; and correlate the subsequent responses to the stored gaze and pupil dynamics; and characterizing the gaze and pupil dynamics includes reference to the identified subsequent user-initiated responses to determine if the user-initiated responses correspond to anticipatory responses.
  • 20. The simulator of claim 18, wherein the processor executable code further configures the at least one processor to: analyze the stored gaze and pupil dynamics over time to identify a user specific profile of a user; and subsequently compare the gaze and pupil dynamics to the user specific profile.