This is the second in a two-part series on the need to regularly test your emergency, continuity, and disaster recovery plans. The first part in the series can be found here.
Test Exercise Roles & Responsibilities
Test Design Team
A Test Design Team is responsible for designing the test exercise scenario. These individuals should have in-depth knowledge of the organization and the departments being tested, and should be able to produce credible scenarios while staying on course with the test plan. Typically, the Disaster Recovery Coordinator serves as the Test Design Team.
A third party brought in to help design your exercise can also provide an independent evaluation of it. Firestorm® acts as the Design Team for many organizations' test exercises.
The Simulation Team
The Simulation Team will guide the participants through the test or simulation.
Simulation Team Guidelines:
- Know the test plan
- Know where the test is going
- Know your resources
- Know your messages
Simulation Room:
- The simulation room should be located near the test room, but far enough away that its occupants cannot be heard.
- Have a sufficient number of phones.
- Have white boards or flip charts for scribes to note the current status.
- Key messages need to be noted for tracking.
- The room needs adequate floor and wall space.
The Facilitator
The facilitator is responsible for central coordination of the test exercise. Firestorm acts as the facilitator for many organizations’ test exercises. The facilitator oversees the accomplishment of the targeted objectives and conducts a briefing following the exercise.
Facilitators should be knowledgeable in the execution of the plan(s) being tested. The facilitator provides overall guidance, coordinates with the participants, and should ensure that:
- Participant instructions are prepared.
- Full scenario and player scenarios, as well as a master sequence of events are prepared.
- Evaluation forms are prepared.
The Test Assistant
A Test Assistant may be assigned to assist the facilitator throughout the testing process.
The Evaluators
Evaluators should be very knowledgeable of the plan(s) being tested. They should assess command, control, coordination, and communication activities, and should be observant and objective. The role of a test evaluator is to:
- Monitor test play.
- Evaluate actions, not players.
- Determine if the objectives and related actions are being met.
- Identify problems to the facilitator.
- Track key messages and report findings to facilitator.
Evaluator Activities:
- Attend the pre-test briefing.
- Assist in the development of evaluation form.
- Review and know the test plan.
- Know the objectives, narrative and messages.
- Know the test organization.
- Report early to the test.
- Be positioned near intake phones so you will see where messages go and how they are handled.
- If key messages are lost, advise the facilitator so the message can be resent.
- Assign certain messages to specific evaluators so they can track their progress.
- Note message processing on evaluator forms.
Test Participants
Test participants should be familiar with their specific roles within the plan(s) being tested, and should be specifically named as team members within the plan(s).
Messages
Messages drive the test, expose unresolved issues, and address the objectives. They add information to describe the disaster environment and/or situation. Messages stimulate
action by the participants. Messages can escalate an initial (primary) problem and create secondary or tertiary problems. Example:
- Primary event — earthquake
- Secondary event — building collapse
- Tertiary event — building fire
Messages should influence action in at least one of four ways:
- Verification — information gathering
- Consideration — discussion, consultation
- Deferral — place on a priority list
- Decision — deploy or deny resources
Message component examples:
- Time — when is it to be delivered within the test?
- Who — who is the source of the message?
- Mode — how was the message transmitted?
- To Whom — who is the recipient?
- What — what is the content of the message?
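For larger exercises, the Simulation Team and the evaluators may find it helpful to keep a structured log of each injected message and how it was handled. The sketch below is one illustrative way to model the message components and influence categories listed above; the class names, field names, and example values are assumptions for demonstration, not part of any particular tool.

```python
# Illustrative sketch of a message-injection log for a test exercise.
# Field names mirror the message components above; everything else
# (class names, status values, example data) is assumed for demonstration.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Influence(Enum):
    """The four ways a message can influence participant action."""
    VERIFICATION = "verification"    # information gathering
    CONSIDERATION = "consideration"  # discussion, consultation
    DEFERRAL = "deferral"            # placed on a priority list
    DECISION = "decision"            # resources deployed or denied


@dataclass
class TestMessage:
    time: str        # when it is to be delivered within the test, e.g. "T+15 min"
    who: str         # source of the message
    mode: str        # how it is transmitted (phone, radio, email, runner)
    to_whom: str     # intended recipient
    what: str        # content of the message
    influence: Optional[Influence] = None  # action observed by the evaluator
    delivered: bool = False
    notes: List[str] = field(default_factory=list)  # evaluator observations


# Example: the secondary event from the earthquake scenario, injected 15 minutes in.
msg = TestMessage(
    time="T+15 min",
    who="Facilities security desk",
    mode="Phone",
    to_whom="Incident Commander",
    what="Partial collapse reported in the east wing of Building 2.",
)
msg.delivered = True
msg.influence = Influence.VERIFICATION
msg.notes.append("Recipient asked for damage confirmation before escalating.")
```

A spreadsheet with the same columns works just as well; the point is that every message has a planned delivery time, a named source and recipient, and a recorded outcome that evaluators can trace during the debriefing.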
Follow-Up
Following completion of the test, the facilitator should review the test plan with the participants and answer questions. If possible, audio-visuals should be used to add realism. The best time for a debriefing is immediately after the test, and the facilitator should lead the session. The purpose of the debriefing is to:
- Review and evaluate the test
- Provide feedback
- Review lessons learned from the test
- Obtain feedback from all participants on what worked and what didn’t work
- Note issues of command, control, coordination, and communication
- Have each function/business unit chair report on their group
Written Evaluations:
- Test participants should evaluate the perceived value of the test and their overall reaction to the experience
- They should evaluate the existing plan(s)
- They should evaluate the test
- They should identify the need for further training and tests
- They should make suggestions for improvement
The test facilitator should incorporate debriefing comments, evaluator observations, and participant evaluations into a concise report of the event, including lessons learned, issues that need correction, next steps, and additional training needed.
Test exercise analyses should include:
- An assessment of whether the test exercise objectives were completed.
- An assessment of the validity of test exercise data processed.
- Corrective actions to address problems encountered.
- A description of any gaps between the plan(s) tested and actual test exercise results.
- Proposed modifications to the plan(s).
- Recommendations for future test exercises.
The report should be completed within five working days of the test and distributed to all participants.
Summary: Keys To A Successful Test Exercise
A clear objective, management support, a realistic scenario, and active involvement are key to a successful exercise. Updates and plan changes based on the lessons learned in the exercise must be made and shared. In evaluating the plan(s), look for:
- Top level support and involvement.
- Test design team expertise.
- Realistic test plan.
- Thorough preparation and attention to detail.
- Clear introduction and instructions.
- Participant feedback at debriefing.
- Follow-up.