=== Actors (System Users) ===
We have separated the SAMBRO users into three roles:
1. Implementers - ready the system with the predefined data and processes necessary for efficient use of the system by the Publishers and Subscribers
1. Publishers - authorized alerting organizations that originate or relay CAP warning/alert messages
1. Subscribers - authenticated users who receive restricted or private CAP messages
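As background, a CAP message is an XML document with a fixed set of required elements. Below is a minimal sketch of the kind of alert a Publisher would originate, built with Python's standard library; the identifiers, sender address, and values are illustrative placeholders, not taken from SAMBRO.

```python
# Build a minimal CAP 1.2 alert using only the standard library.
# All identifiers and values below are illustrative placeholders.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"
ET.register_namespace("", CAP_NS)

def cap_el(parent, tag, text):
    """Append a namespaced child element with text content."""
    el = ET.SubElement(parent, f"{{{CAP_NS}}}{tag}")
    el.text = text
    return el

alert = ET.Element(f"{{{CAP_NS}}}alert")
cap_el(alert, "identifier", "EXAMPLE-0001")          # unique per message
cap_el(alert, "sender", "alerts@example.org")        # alerting authority
cap_el(alert, "sent", "2016-01-01T12:00:00+00:00")
cap_el(alert, "status", "Exercise")                  # not a live warning
cap_el(alert, "msgType", "Alert")                    # Alert / Update / Cancel
cap_el(alert, "scope", "Restricted")                 # Subscribers only
cap_el(alert, "restriction", "Authenticated subscribers only")

info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
cap_el(info, "category", "Met")
cap_el(info, "event", "Heavy Rainfall")
cap_el(info, "urgency", "Expected")
cap_el(info, "severity", "Moderate")
cap_el(info, "certainty", "Likely")

xml_text = ET.tostring(alert, encoding="unicode")
print(xml_text)
```

Note that `scope` is set to `Restricted` here because the Subscribers in this evaluation receive restricted or private messages; a public warning would use `Public` and omit the `restriction` element.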

With respect to the alerting cycle, there is a backward dependency among the actors. The ability of the Subscribers to carry out their intended SOPs depends on the accuracy and timeliness of the messages produced by the Publishers and delivered by the SAMBRO system. The performance of the Publishers, in turn, depends on the completeness of the implementation with predefined data and on the training regime carried out by the Implementers.

=== Research Questions ===
The evaluation process will attempt to answer the following questions:
* Have the Implementers operationalized the National cross-agency situational-awareness platform to meet user expectations for functionality and ease of use?
* Are the National Alerting Authorities competent in publishing effective CAP-enabled messages for a warning cycle?
* Are the Subscribers, namely First-Responders, able to decipher the CAP-enabled messages received over the various channels and determine the necessary actions, or inaction, according to their SOPs?

=== Evaluation Process ===
These questions will be answered through a series of exploitation-evaluation activities: controlled exercises following a format of Simulations and Tabletop Exercises (TTX). Each controlled exercise consists of three steps:
1. discussion of the operating procedures,
1. actual execution of those procedures, and
1. evaluation of the outcomes.
These activities are designed as verification exercises to determine the system's usability (or complexity) and utility. The table below discusses the indicators, methods, instruments, and process of the verification. [[br]]

The two types of users will be subjected to scenario-driven simulation and tabletop exercises:
1. Publishers - will be presented with various scenarios and will act on them by executing an appropriate alerting cycle (i.e. alert, update, clear).
1. Subscribers (specifically First-Responders) - will be presented with alert messages, taking them through a complete cycle, to determine their response.

These exercises '''DO NOT INVOLVE THE PUBLIC'''.

==== Objective Evaluation ====
The objective evaluation is part of the controlled exercises. The Publishers and Subscribers will work through successive iterations of the steps of forming a goal, forming intentions, and executing actions, a Human-Computer Interaction (HCI) evaluation technique. The Publishers would execute the actions related to issuing alerts; the Subscribers would discuss the actions they would execute for specific alerts received. In each case the participants would perceive the state of the world, interpret that state, and evaluate the outcome.
[[br]]
We will install the CamStudio screen-capture software to record the users' interaction techniques. The screen capture will record each user's behaviour and the sequence in which each process is executed, giving the Evaluators insights into the usability and operational efficiency of the system. The alert messages saved in SAMBRO, together with the Email and SMS messages received by the Subscribers, will be used to determine data quality.
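One way to determine data quality from the saved alert messages is to verify that each record carries the required CAP elements and that coded fields use only their allowed values. The sketch below assumes the messages are available as parsed dictionaries; the required fields and code lists follow CAP 1.2, but the checking procedure itself is an illustrative assumption, not SAMBRO's actual implementation.

```python
# Minimal data-quality check for saved CAP alert records.
# Required fields and allowed values follow CAP 1.2; the checking
# procedure itself is illustrative, not SAMBRO's implementation.

REQUIRED = ("identifier", "sender", "sent", "status", "msgType", "scope")
ALLOWED = {
    "status":  {"Actual", "Exercise", "System", "Test", "Draft"},
    "msgType": {"Alert", "Update", "Cancel", "Ack", "Error"},
    "scope":   {"Public", "Restricted", "Private"},
}

def quality_issues(record):
    """Return a list of data-quality problems found in one alert record."""
    issues = [f"missing {f}" for f in REQUIRED if not record.get(f)]
    for field, allowed in ALLOWED.items():
        value = record.get(field)
        if value is not None and value not in allowed:
            issues.append(f"bad {field}: {value}")
    return issues

record = {"identifier": "X-0001", "sender": "alerts@example.org",
          "sent": "2016-01-01T12:00:00+00:00", "status": "Exercise",
          "msgType": "Alerted", "scope": "Restricted"}
print(quality_issues(record))  # ['bad msgType: Alerted']
```

Running the same check over both the alerts stored in SAMBRO and the copies delivered by Email and SMS would also surface delivery-side corruption, since the two sets of records should agree field for field.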