Scenario Workflow Based Assessment System and Method

Information

  • Patent Application 20080026348
  • Publication Number
    20080026348
  • Date Filed
    October 04, 2007
  • Date Published
    January 31, 2008
Abstract
A system and a method for automating performance assessments of an exercise or a training activity provide event assessment information in real-time to one or more evaluators in conjunction with unfolding events. The evaluators can wirelessly communicate assessment information to a database for after action review (AAR).
Description
FIELD OF THE INVENTION

The invention pertains to systems and methods for assessing performance of participants during training exercises or mission rehearsals. More particularly, the invention pertains to automated systems and methods to facilitate performance evaluation by providing real-time feedback to evaluators as an activity proceeds.


BACKGROUND OF THE INVENTION

The importance of training personnel to respond to events such as fires, violent domestic events, accidents or natural disasters (earthquakes, tornadoes, floods or the like) is well recognized. Similar comments apply to military training/mission rehearsal.


Training/rehearsal activities can last hours or days and can involve a large number of geographically dispersed participants. The value of collecting information as to how the exercise was carried out to facilitate an accurate and meaningful after-action review is also well known. One such system and method are disclosed in U.S. Pat. No. 6,106,297 issued Aug. 22, 2000, assigned to the assignee hereof and entitled “Distributed Interactive Simulation Exercise Manager System and Method”. The '297 patent is hereby incorporated by reference.


While the primary value of conducting a performance session, such as a training or exercise session, is an effective and accurate assessment (the basis of the measurable and verifiable feedback provided to the session audience or participants in the after-action review (AAR)), obtaining such assessments during such sessions can be difficult. A problem in efficiently assessing, or evaluating, performance during complex tasks is defining what is important to assess at any given time, and what assessment criteria should be used.


It has been known in the prior art to define assessment criteria and guidance prior to the assessment session. The assessor is then required to monitor performance activities to determine what types of events are taking place, recall and apply the applicable assessment criteria and assessment guidance, and record the applicable assessment. This approach is labor intensive, particularly for complex tasks involving teams of several individuals, and teams in different locations.


There continues to be a need for improved, preferably real-time evaluation systems and methods. Preferably such systems and methods will be flexible and cost effective to implement so as to be usable to provide assessment information for a wide range of civilian and military exercises.


SUMMARY OF THE INVENTION

A method which embodies the invention includes defining a scenario workflow relative to a selected situation; establishing a set of assessment criteria relative to a plurality of workflow related events; carrying out the scenario; as the scenario proceeds, retrieving assessment criteria for at least one active event; and providing the retrieved assessment criteria to an evaluator.




BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an over-view of a method in accordance with the invention;



FIG. 2 illustrates a system in accordance with the invention;



FIG. 3 illustrates an exemplary scenario event;



FIG. 4 illustrates an exemplary workflow event list for the event of FIG. 3;



FIG. 5 illustrates an exemplary event assessment criteria list;



FIG. 6 illustrates a displayed event list for activating an event;



FIG. 7A illustrates an exemplary assessment entering screen; and



FIG. 7B illustrates an exemplary assessment prompting display for an active event.




DETAILED DESCRIPTION OF THE EMBODIMENTS

While embodiments of this invention can take many different forms, specific embodiments thereof are shown in the drawings and will be described herein in detail with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention, and as a disclosure of the best mode of practicing the invention. It is not intended to limit the invention to the specific embodiment illustrated.


Systems and methods that embody the invention improve the efficiency of assessing performance during complex tasks, such as for distributed teams cooperating to achieve a common goal. Assessment accuracy is increased while reducing the work associated with recording behavior observations, preparing material for briefings and debriefings, and presenting feedback messages or comments during and after the assessment session.


Efficiency is increased because: 1) the assessor is prompted during the assessment session with the applicable assessment criteria and guidance, so an assessor can more quickly determine what assessment is needed; 2) the assessor is prompted with a tailored form for recording the assessment, so the assessor can record the assessment more quickly and accurately; 3) assessment responsibilities can be efficiently distributed and allocated across several assessors during the assessment session, so fewer assessors are needed; and 4) expert subjective-judgment guidance that is applicable to the assessment event can be captured from experts prior to the assessment session and communicated to all assessors as required during the assessment session, so the assessors are not required to be experts.
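
As a minimal illustration of point 3) above, assessment responsibilities might be allocated by event priority and current assessor load. The sketch below is illustrative only; the Assessor class, the assign_events function and the priority field are assumptions, not taken from the specification.

    from dataclasses import dataclass, field

    @dataclass
    class Assessor:
        name: str
        assigned_events: list = field(default_factory=list)

    def assign_events(events, assessors):
        """Spread workflow events across assessors, highest priority first,
        giving each event to the currently least-loaded assessor."""
        for event in sorted(events, key=lambda e: e["priority"]):
            least_loaded = min(assessors, key=lambda a: len(a.assigned_events))
            least_loaded.assigned_events.append(event["name"])
        return assessors

    if __name__ == "__main__":
        events = [
            {"name": "911 call received", "priority": 1},
            {"name": "Units dispatched", "priority": 2},
            {"name": "Incident command established", "priority": 1},
        ]
        for a in assign_events(events, [Assessor("Evaluator A"), Assessor("Evaluator B")]):
            print(a.name, "->", a.assigned_events)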


Further, a method that embodies the invention automates performance assessment of an activity such as a training or a rehearsal exercise (the assessment session). The automation results in measurable or observable assessments of session events. In one embodiment, the method includes:


1) generating a scenario workflow which defines an event list or events that are expected to occur during an assessment session;


2) generating a workflow event assessment list which defines assessment criteria or guidance to be used during a respective event;


3) defining active events from the event list;


4) prompting the assessor regarding the applicable assessment criteria relative to a currently active event; and


5) recording the assessors' observations.
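
A minimal sketch of the five steps above, using hypothetical in-memory structures; the event names, statuses and helper functions are assumptions for illustration, and the specification does not prescribe any particular data model.

    from dataclasses import dataclass

    @dataclass
    class WorkflowEvent:
        name: str
        criteria: list            # assessment criteria or guidance for this event
        status: str = "Pending"   # Pending -> Active

    # 1) and 2): scenario workflow event list with per-event assessment criteria
    workflow = [
        WorkflowEvent("Report of structure fire", ["Dispatcher logs location within 1 min"]),
        WorkflowEvent("First unit on scene", ["Size-up report transmitted", "Command established"]),
    ]

    observations = []  # 5) recorded assessor observations

    def activate(event_name):
        """3) mark an event from the event list as currently active."""
        for ev in workflow:
            if ev.name == event_name:
                ev.status = "Active"
                return ev
        raise KeyError(event_name)

    def prompt(event):
        """4) prompt the assessor with the criteria applicable to the active event."""
        print(f"ACTIVE EVENT: {event.name}")
        for c in event.criteria:
            print("  assess:", c)

    def record(event, note):
        """5) record the assessor's observation against the event."""
        observations.append({"event": event.name, "note": note})

    if __name__ == "__main__":
        ev = activate("First unit on scene")
        prompt(ev)
        record(ev, "Size-up given 40 s after arrival; command not announced")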


Alternative embodiments include:


a) Predefining the scenario workflow (e.g., a predefined script that always occurs in the same sequence).


b) The scenario workflow which defines an event list can be dynamically adjusted in response to the actions taken by the participants during the assessment session (e.g., a portion of the scenario workflow only occurs if the participants behave in a certain way referred to as a triggering behavior).


c) The scenario workflow which defines an event list can be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., the evaluator may add scenario workflow to increase the workload during an assessment session, referred to as an “inject”). A brief sketch of such dynamic adjustments appears after this list.


d) The workflow event list criteria can be predefined (e.g., the criteria to be used for an event are always the same whenever that event occurs in the scenario workflow).


e) The workflow event list assessment criteria can be dynamically adjusted in response to the actions taken by the participants during the assessment session (e.g., based on how the participants or audience respond to a scenario situation).


f) The workflow event list assessment criteria can be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., the evaluator may determine that a report preparer has demonstrated mastery of the primary criterion of submitting a properly formatted report in a timely fashion; the assessment criteria for future reporting could then be adjusted to record whether the report preparer consults all relevant information sources needed for a quality report).


g) Defining what events from the event list are currently applicable can be based on human observation.


h) Defining what events from the event list are currently applicable can be based on observation by a computer-based agent.


i) Prompting the assessor regarding the applicable assessment criteria can be predefined (e.g., the assessor is always provided the same cue or input screen each time the applicable event occurs).


j) Prompting the assessor regarding the applicable assessment criteria can be dynamically adjusted in response to the actions taken by the audience during the assessment session (e.g., if the participants frequently perform one type of task (Task A) and infrequently perform another type of task (Task B), the prompting can be adjusted accordingly). In this regard, prompting the assessor regarding the applicable assessment criteria can also be dynamically adjusted by the evaluator based on accomplishment of assessment objectives during the assessment session (e.g., if the evaluator determines that a report preparer has demonstrated mastery of the primary criterion of submitting a properly formatted report in a timely fashion, the assessment criteria for future reporting could be adjusted to record whether the report preparer consults all relevant information sources needed for a quality report).
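
Alternatives b), c), e) and f) above all involve adjusting the workflow or its assessment criteria while the session is running. The following sketch illustrates two such adjustments, a trigger-dependent workflow segment and an evaluator “inject”; the data structures and function names are hypothetical, not part of the specification.

    # Sketch only: each conditional segment carries a trigger predicate that is
    # checked against observed participant behavior during the session.
    conditional_segments = [
        {
            "trigger": lambda behavior: "evacuation ordered" in behavior,
            "events": ["Shelter locations announced", "Head count reported"],
        },
    ]

    def adjust_workflow(workflow, observed_behavior):
        """b) append a segment only if its triggering behavior occurred."""
        for seg in conditional_segments:
            if seg["trigger"](observed_behavior):
                workflow.extend(seg["events"])
        return workflow

    def inject(workflow, extra_events):
        """c) evaluator adds events mid-session to raise the workload (an 'inject')."""
        workflow.extend(extra_events)
        return workflow

    if __name__ == "__main__":
        wf = ["Report of chemical spill", "Units dispatched"]
        wf = adjust_workflow(wf, observed_behavior={"evacuation ordered"})
        wf = inject(wf, ["Media request for statement"])
        print(wf)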


In one system that embodies this invention, the scenario workflow is defined to detail sequences or series of events that would result for a particular situation or course of action. The scenario workflow is then stored in a scenario workflow event database. The scenario workflow is analyzed to define what assessment criteria, if any, are applicable for each of the workflow events. These assessment criteria are added to the scenario workflow event database.
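
One plausible realization of the scenario workflow event database described above, sketched with Python's standard sqlite3 module; the table layout and sample rows are assumptions, not taken from the specification.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # a deployed system would use a shared server database
    conn.executescript("""
    CREATE TABLE workflow_event (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        status TEXT DEFAULT 'Pending'
    );
    CREATE TABLE assessment_criteria (
        event_id INTEGER REFERENCES workflow_event(id),
        criterion TEXT NOT NULL
    );
    """)

    # populate the scenario workflow event database
    eid = conn.execute("INSERT INTO workflow_event (name) VALUES (?)",
                       ("First unit on scene",)).lastrowid
    conn.executemany("INSERT INTO assessment_criteria VALUES (?, ?)",
                     [(eid, "Size-up report transmitted"),
                      (eid, "Incident command established")])

    # later, during the session: retrieve the criteria applicable to an event
    rows = conn.execute("""SELECT c.criterion FROM assessment_criteria c
                           JOIN workflow_event e ON e.id = c.event_id
                           WHERE e.name = ?""", ("First unit on scene",)).fetchall()
    print([r[0] for r in rows])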


During the assessment session, the assessor or evaluator is provided an automated assessment device, or AAD. The AAD, for example a hand-carried Tablet PC, enables the evaluation team to coordinate evaluation responsibilities and to display pre-brief material and exercise status at any time during the exercise. As the scenario session proceeds, the scenario is monitored to determine what events are currently active.


By accessing the workflow event database, the system retrieves the details for the scenario event, including the assessment criteria list for that event. The evaluator is prompted with context-sensitive assessment criteria as key and critical events occur. As the evaluator records observations against each exercise goal, AAD provides immediate feedback of exercise goals that require additional attention. This significantly reduces the need for the evaluator to closely monitor actual scenario event execution while recording key observations. The AAD provides assistance to the evaluator to prepare the After Action Review (AAR).


Following the exercise, the evaluation team can quickly review evaluator comments grouped by exercise objectives and select candidates for discussion. During the review, the evaluator can use the AAD to identify and replay critical periods of audience actions that temporally relate to the evaluator's notes, while referencing the guidance provided during the exercise planning process. This automation and recall capability is not available using current manual data collection and synthesis techniques. The resultant After Action Reports provide invaluable lessons for the exercise participants, exercise coordinators and stakeholder agencies, including best practices that can be shared, thus enhancing emergency preparedness.
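
A small sketch of the review capability described above, grouping evaluator comments by exercise objective and locating scenario log entries that temporally relate to a selected comment; the two-minute replay window and the record fields are illustrative assumptions.

    from collections import defaultdict
    from datetime import datetime, timedelta

    comments = [
        {"objective": "Timely notification", "time": datetime(2004, 6, 17, 9, 12), "text": "EOC notified late"},
        {"objective": "Resource tracking",  "time": datetime(2004, 6, 17, 9, 40), "text": "Unit status board current"},
    ]
    scenario_log = [
        {"time": datetime(2004, 6, 17, 9, 11), "entry": "Dispatcher pages EOC duty officer"},
        {"time": datetime(2004, 6, 17, 9, 41), "entry": "Status board updated"},
    ]

    def group_by_objective(comments):
        """Group evaluator comments under their exercise objective."""
        grouped = defaultdict(list)
        for c in comments:
            grouped[c["objective"]].append(c)
        return grouped

    def replay_window(comment, log, minutes=2):
        """Return log entries within +/- `minutes` of the evaluator's comment."""
        window = timedelta(minutes=minutes)
        return [e for e in log if abs(e["time"] - comment["time"]) <= window]

    if __name__ == "__main__":
        for objective, items in group_by_objective(comments).items():
            print(objective, "->", [c["text"] for c in items])
        print(replay_window(comments[0], scenario_log))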



FIG. 1 illustrates an overall view of a method 100 in accordance with the invention. In a step 102a a scenario workflow is generated. In a step 102b, a workflow event list is produced from the scenario workflow. The event list produced in step 102b defines those events which are expected to occur during a scenario assessment session.


In step 104a, event assessment criteria are assigned. In step 104b an event assessment criteria list is generated. Those of skill in the art will understand that the workflow event list of step 102b and the event assessment criteria list of step 104b could be stored, in any known manner, for subsequent use.


In step 106a the scenario is initiated and the participants or audience participate in the ongoing scenario. Scenario workflow events unfold, step 106b, and the assessor 110 is available to observe the participants' or audience's performance in response to those events.


As the events unfold, step 106b, active events are recognized, step 108, either automatically or by personnel associated with implementing the session.


Active events trigger a retrieval of evaluation or assessment information, step 116. This information is forwarded to the assessor 110, step 118.


The assessors' remarks, comments and evaluations are received, step 120, and stored, step 122, for subsequent use during after-action review. Step 118 can be repeated as appropriate to provide supplemental assessment information to the assessor 110 in view of previously recorded assessments, step 122.
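
The receipt and storage of assessments, steps 120 and 122, and the repetition of step 118 could be sketched as follows; the idea that secondary criteria are offered once a primary criterion has been recorded as satisfied is an illustrative assumption consistent with alternative f) above.

    stored_assessments = []   # step 122: assessments kept for the after-action review

    def receive_assessment(event_name, criterion, satisfied):
        """Step 120: accept and store an evaluator's assessment."""
        stored_assessments.append(
            {"event": event_name, "criterion": criterion, "satisfied": satisfied})

    def supplemental_criteria(event_name, primary, secondary):
        """Repeat of step 118: once the primary criterion has been recorded as
        satisfied, offer the evaluator the secondary criteria instead."""
        done = any(a["event"] == event_name and a["criterion"] == primary and a["satisfied"]
                   for a in stored_assessments)
        return secondary if done else [primary]

    receive_assessment("Situation report filed", "Report properly formatted and on time", True)
    print(supplemental_criteria("Situation report filed",
                                "Report properly formatted and on time",
                                ["All relevant information sources consulted"]))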



FIG. 2 is a block diagram of a system 10 in accordance with the invention. System 10 represents one embodiment for implementing the methodology 100 of FIG. 1. As illustrated in FIG. 2, the training audience or participants 12 can communicate in system 10 using, for example, personal computers which communicate via an intranet 12a and server 12b with the Internet 14. It will be understood that the participants 12 might participate in the subject scenario as multiple separate teams or as one organization working together to respond to the scenario. The participants or audience 12 can receive information via their personal computers and intranet 12a, for example by e-mail, telephone, video, a crisis information management system (CIMS), or any other information providing system.


The evaluators 110 can wirelessly communicate with a server associated with web 14 using wireless personal communication devices indicated generally at 20. This wireless communication capability makes it possible for the evaluators to readily move among the audience or participants during the exercise.


A plurality of web based servers 24 provides useful information for the exercise. An emergency scene simulator system 26 provides a realistic representation of the subject scenario which requires the attention and action of the training audience or participants 12. The e-mail, phone, crisis information management system 28 provides realistic communication and coordination capabilities for the audience/participants 12. The scenario log/replay system 30 records audience communications, scenario events, and emergency scene related simulation activity. System 32 stores and makes available scenario events, procedures and standards, including the workflow event list, the event assessment criteria lists and related assessment information.


Role players 34 interact with the ongoing scenario and participants in the session via one or more web based servers. They can communicate with the audience or participants and/or control the sequence of events, the emergency scene simulator 26 and any other event related information.


As noted above, the scenario workflow is defined to detail sequences or series of events that would result for a particular situation or course of action, step 102a. A sample scenario is illustrated in FIG. 3 (a single event is shown for simplicity). The scenario events can then be stored in a system, or, database 32. It will be understood that the scenario events might be stored at a plurality of locations without departing from the spirit and scope of the invention.


For any given scenario event, there is a workflow procedure or event sequence, the workflow event list step 102b, that the audience is expected to follow in response to the occurrence of the scenario events. The expected workflow event list 102b for the sample scenario event in FIG. 3 is illustrated in FIG. 4. The scenario workflow event list 102b can then be stored in a database such as database 32, FIG. 2.


The scenario workflow is analyzed to define what assessment criteria, if any, are applicable for each of the workflow events, step 104a. The assessment criteria list, step 104b, is created and stored in a database such as database 32. These criteria include a description of the objective behavior that is expected of the audience, an event priority used to designate the relative priority of events, an Evaluator Observation column to cue the assessor as to what should be observed, and a Performance Outcome note to define the criteria for successful performance by the participants or audience. These criteria, which can be added to the event list 102b from FIG. 4, are illustrated in FIG. 5 as the assessment criteria list, step 104b. The assessment criteria list can be stored in a database such as database 32.
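
The criteria fields described above (objective behavior, event priority, Evaluator Observation cue, and Performance Outcome note) map naturally onto a simple record. The sketch below uses field names paraphrased from the description rather than copied from FIG. 5.

    from dataclasses import dataclass

    @dataclass
    class AssessmentCriterion:
        objective_behavior: str      # behavior expected of the audience
        event_priority: int          # relative priority of the event (1 = highest)
        evaluator_observation: str   # cue telling the assessor what to watch for
        performance_outcome: str     # definition of successful performance

    example = AssessmentCriterion(
        objective_behavior="Incident commander requests mutual aid",
        event_priority=1,
        evaluator_observation="Note the time between size-up and the mutual-aid request",
        performance_outcome="Request placed within 10 minutes of size-up",
    )
    print(example)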


Event determining software can be used to define what events from the event list are currently active or applicable, step 108. As the scenario session proceeds, the scenario is monitored to determine what events are currently active. This can be accomplished by monitoring the scenario activity and looking for a match of the expected “Trigger” condition for those scenario events that are expected but have not yet occurred (status=“Pending”).
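
The monitoring described above, checking pending events for a match of their expected “Trigger” condition, might look roughly as follows; the shape of the activity records and the simple substring match are assumptions.

    def scan_for_active_events(pending_events, scenario_activity):
        """Compare recent scenario activity against the expected trigger of every
        event whose status is still 'Pending'; matching events become 'Active'."""
        newly_active = []
        for event in pending_events:
            if event["status"] != "Pending":
                continue
            if any(event["trigger"] in line for line in scenario_activity):
                event["status"] = "Active"
                newly_active.append(event)
        return newly_active

    events = [
        {"name": "Mutual aid requested", "trigger": "requesting mutual aid", "status": "Pending"},
        {"name": "Evacuation ordered",   "trigger": "order an evacuation",   "status": "Pending"},
    ]
    activity = ["IC to dispatch: requesting mutual aid from Station 4"]
    print([e["name"] for e in scan_for_active_events(events, activity)])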


In one embodiment, active event identification can be accomplished by a human monitoring the event status, see the screen of FIG. 6, including all e-mail communication as well as all voice traffic between the role players and the training audience. When the monitor determines that an event has started, the monitor can designate that event as active (status=“Active”), using a PC-based workstation. The status of the events that are currently applicable would be recorded, for example in database 32, and transmitted over the network to all role players 34 and evaluators 110.


By accessing the workflow event database, the details for the active scenario event, including the assessment criteria list, step 104b, for that event, can be retrieved, for example from database 32. The evaluator 110 is prompted with context-sensitive assessment criteria as key and critical events occur via an automated assessment device, or AAD 20.


The AAD 20, for example a hand-carried Tablet PC, enables the evaluation team 110 to coordinate evaluation responsibilities and to display lists, other exercise material, and exercise status at any time during the exercise. FIGS. 7A and 7B are sample displays as might be presented to the evaluation team 110.



FIG. 7A illustrates an exemplary normal AAD display where the scenario event monitor is shown in the bottom half of the display. The evaluator records assessments in the top half of the display. As the evaluator records observations against each exercise goal, the AAD provides immediate feedback of exercise goals that require additional attention, step 118. This significantly reduces the need for the evaluator to closely monitor actual scenario event execution while recording key observations.
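
A sketch of the feedback behavior described above: as observations are recorded against exercise goals, the display flags goals that still require attention. The goal names and the one-observation threshold are illustrative assumptions.

    exercise_goals = ["Timely notification", "Resource tracking", "Public information"]
    recorded = {"Timely notification": 2}   # goal -> number of observations so far

    def goals_needing_attention(goals, recorded, minimum=1):
        """Return the exercise goals with fewer than `minimum` recorded observations."""
        return [g for g in goals if recorded.get(g, 0) < minimum]

    def record_observation(goal, recorded):
        recorded[goal] = recorded.get(goal, 0) + 1
        print("Still need observations on:", goals_needing_attention(exercise_goals, recorded))

    record_observation("Resource tracking", recorded)
    # -> Still need observations on: ['Public information']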



FIG. 7B illustrates AAD operation when a critical event has become active. The evaluator is cued to this activity by the “alert” button in the lower left corner. When the evaluator selects the alert button, an Alert Display that is appropriate for assessing the current active event is presented. This event-appropriate cue enables highly efficient assessment.
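
The alert behavior of FIG. 7B could be approximated as below; the priority value and the print calls standing in for the AAD display are assumptions.

    def on_event_activated(event, alert_threshold="critical"):
        """Cue the evaluator when a critical event becomes active."""
        if event["priority"] == alert_threshold:
            print("ALERT button lit for:", event["name"])
            return True
        return False

    def on_alert_selected(event):
        """Open the alert display tailored to assessing the active event."""
        print("Showing assessment form for:", event["name"])
        for criterion in event["criteria"]:
            print("  assess:", criterion)

    ev = {"name": "Hazmat release detected", "priority": "critical",
          "criteria": ["Hot zone established", "Decon area designated"]}
    if on_event_activated(ev):
        on_alert_selected(ev)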


It will be understood that none of the details pertaining to communication via intranet 12a or Internet 14 are limitations of the invention. Similarly, those of skill in the art will recognize that the size and location of the participants or audience 12, the evaluator or assessment team(s) 110, and the role players 34 are not limitations of the invention.


From the foregoing, it will be observed that numerous variations and modifications may be effected without departing from the spirit and scope of the invention. It is to be understood that no limitation with respect to the specific apparatus illustrated herein is intended or should be inferred. It is, of course, intended to cover by the appended claims all such modifications as fall within the scope of the claims.

Claims
  • 1. A method comprising: defining a scenario workflow relative to a selected situation; storing the workflow; establishing a set of different assessment criteria relative to a plurality of different workflow related events; retrieving a selected one of the assessment criteria in response to at least one active event as the scenario proceeds; and providing the retrieved assessment criteria to an evaluator at or about the time the active event occurs.
  • 2. (canceled)
  • 3. A method as in claim 1 where the criteria are communicated, at least in part, wirelessly.
  • 4. A method as in claim 1 which includes storing real-time evaluation information relative to the active event.
  • 5. A method as in claim 4 which includes providing supplemental information relative to evaluating the event, in response to previously stored observations.
  • 6. A method as in claim 5 where the criteria are communicated, at least in part, wirelessly.
  • 7. A method as in claim 5 where the supplemental information is provided in real-time as the event is taking place.
  • 8. A method as in claim 7 which includes storing evaluation information during the event grouped at least by predetermined objectives.
  • 9. A method as in claim 8 which includes replaying stored information relative to the event, in combination with temporally related, pre-stored, evaluation information.
  • 10. A method as in claim 9 which includes preparing reports pertaining to scenario implementation.
  • 11. A method as in claim 7 where the supplemental information is provided, in part, wirelessly to at least one evaluator.
  • 12. A method as in claim 1 which includes carrying out a plurality of events related to the scenario workflow.
  • 13. A method as in claim 12 which includes wirelessly receiving different assessment criteria related to a plurality of different workflow related events.
  • 14. A method as in claim 13 which includes providing wireless receivers to the evaluators.
  • 15. A system comprising: a stored workflow event list; a stored event assessment criteria list where the criteria list includes a plurality of different context-sensitive performance criteria; circuitry to sequentially and visually provide a plurality of different assessment criteria to an evaluator in response to a plurality of events; and circuitry for providing supplemental information to the evaluator.
  • 16. A system as in claim 15 which includes circuitry to provide assessment criteria to a plurality of evaluators.
  • 17. A system as in claim 16 which includes storage for a plurality of event assessments.
  • 18. A system as in claim 17 which includes a plurality of evaluator units which display context-sensitive assessment criteria, the units each include a display device and a tablet for manual input of information.
  • 19. A system as in claim 18 where at least some of the units communicate wirelessly.
  • 20. A system as in claim 19 where the units include software enabling an evaluator to enter exercise related observations.
  • 21. A system as in claim 19 where the units include software to receive and present event assessment criteria, from the list, to the evaluator relative to the exercise.
  • 22. A system for assessing performance of participants in an exercise comprising: a system for storing a predetermined workflow event list, and predetermined event assessment criteria lists; a computer network coupled to the system; a plurality of exercise participant computers coupled to the network; a plurality of wireless assessment computers each of which is coupled to the network, each of which includes a display device; event determining means in communication with the network, the event determining means designates an active event, and responsive thereto details of the active event and associated assessment criteria are retrieved from the system and transmitted respectively to at least some of the participant computers and to at least some of the assessment computers, with the assessment criteria displayed thereon along with an assessment receiving input region, when the event is taking place.
  • 23. A system as in claim 22 where the assessment computers include a tablet for entry of assessment information, and including simultaneously displaying at the assessment computer both event and assessment information.
  • 24. A system as in claim 22 which includes an exercise simulator system, a communications information management system and a system to record audience communications and scenario events all of which are coupled to the computer network.
  • 25. A system as in claim 23 where the event determining means comprises a workstation.
  • 26. A system as in claim 25 wherein displayed event information specifies an active event and an associated objective.
CROSS-REFERENCE TO RELATED APPLICATION

This Continuation application claims the benefit of the filing date of U.S. patent application Ser. No. 10/870,879 filed Jun. 17, 2004 and entitled “Scenario Workflow Based Assessment System and Method” which is incorporated herein by reference.

Continuations (1)
Relation  Number    Date      Country
Parent    10870879  Jun 2004  US
Child     11867328  Oct 2007  US