Version 17 (modified by Nuwan Waidyanatha, 8 years ago)


SAMBRO Evaluation Guidelines

We provide a methodology and a set of evaluation instruments to be used in evaluating the implementation.


Sahana Alerting and Messaging Broker (SAMBRO) is designed to foster cross-agency situational awareness in support of improving institutional responsiveness to all hazards. Alerting Authorities (Publishers) and warning recipients, such as the public and first responders (Subscribers), can interconnect through SAMBRO. It offers an all-hazard, all-media approach whereby consistent, concise, and controlled alerts and warnings are channelled through multiple technologies to geo-targeted recipients.

To evaluate the performance of SAMBRO, along with the underlying Common Alerting Protocol (CAP) content standard and procedures, we introduce a combined objective and subjective evaluation approach. The evaluation is specific to early warning and alerting. Through structured scenario-based simulation exercises, we will evaluate the performance of the users as well as the usability and ease of use of the system.


Issuing Alerts (Publish)

Primarily there are two types of devices that can be used for issuing messages:

  • Personal Computer (PC) - Desktop, Laptop, Notebook
  • SmartPhones (including Tablet PCs)

SAMBRO offers two types of applications:

  • Browser-based application that is accessible through a PC or SmartPhone
  • Mobile-based application (Android & iOS) that is accessible using a SmartPhone only

Receiving Alerts (Subscribe)

There are multiple channels for receiving alerts: SMS, Email, RSS, Twitter, and Facebook. Only “public” alerts are delivered over Twitter and Facebook. SAMBRO offers two RSS feeds: one carrying public warnings and the other carrying restricted and private warnings. SMS and Email are reserved for Administrator-controlled and Self-controlled Subscribers.
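As a rough illustration of the RSS channel, the sketch below parses a feed of public warnings with Python's standard library. The feed structure and field contents here are illustrative assumptions, not SAMBRO's actual output.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed, roughly as SAMBRO's public-warnings feed might
# look. The item titles, links, and dates are made up for illustration.
sample_feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Public Warnings</title>
    <item>
      <title>Flood Warning - River Basin A</title>
      <link>http://example.org/alert/1</link>
      <pubDate>Mon, 06 Mar 2017 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

def list_alerts(feed_xml):
    """Return (title, link) pairs for every item in the feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_alerts(sample_feed):
    print(title, link)
```

A Subscriber-side tool could poll the public feed this way, while the restricted/private feed would additionally require authentication.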

The user may choose to receive alerts on a PC, SmartPhone, or standard mobile phone. Each device has its limitations.


The evaluation hinges on the simple question: “did the technology and the people perform as planned on the day of the exercise?”

NOTE that the technology and procedures should not be a surprise to the users. They should have undergone training and a series of silent-tests to be familiar with the system and be competent in operating and reacting to the system for all known scenarios.

Actors (System Users)

We have separated the SAMBRO users into three roles:

  1. Implementers - would ready the system with predefined data and processes necessary for the efficient use of the system by the Publishers and Subscribers
  2. Publishers - are essentially authorized alerting organizations who would originate or relay CAP warning/alert messages
  3. Subscribers - are authenticated users who would receive restricted or private CAP messages

With respect to the alerting cycle, there is a backward dependency among the actors. The performance of the Subscribers in achieving their intended SOPs depends on the accuracy and timeliness of the messages produced by the Publishers and delivered by the SAMBRO system. The performance of the Publishers depends on the completeness of the implementation, with the predefined data and training regime carried out by the Implementers.

Research Questions

The evaluation process will attempt to answer the questions:

  • Have the Implementers operationalized the National cross-agency situational-awareness platform to meet user expectations and ease of use?
  • Are the National Alerting Authorities competent in publishing effective CAP-enabled messages for a warning cycle?
  • Are the Subscribers, namely First-Responders, able to decipher the CAP-enabled messages, received over the various channels, to determine the necessary actions or inactions per their SOPs?

Evaluation Process

These questions will be answered through a series of exploitation-evaluation activities, namely a series of controlled exercises following a format of Simulations and Tabletop Exercises (TTX). The controlled-exercise activities comprise three steps:

  1. discussion of the operating procedures,
  2. actual execution of those procedures, and
  3. evaluating the outcomes

These activities are designed as verification exercises to determine the system’s usability (or complexity) and utility. The table below discusses the indicators, methods, instruments, and process of the verification.

The two types of users will be subject to scenario-driven simulation and tabletop exercises.

  1. Publishers - will be presented with various scenarios and they will act on those scenarios by executing an appropriate alerting cycle (i.e. alert, update, clear).
  2. Subscribers (specifically First-Responders) - will be presented with alert messages, taking them through a complete cycle, to determine their response.
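To make the alerting cycle concrete, the sketch below builds a bare-bones CAP 1.2 message with Python's standard library. The CAP element names and the msgType values (Alert, Update, Cancel; "Cancel" being CAP's term for clearing a warning) come from the CAP 1.2 standard, but the identifiers, sender, and event values are made-up exercise data, and a real SAMBRO message would carry many more elements (areas, geocodes, instructions, and so on).

```python
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

def make_cap_alert(identifier, sender, sent, msg_type, event):
    """Build a minimal CAP 1.2 <alert> element.

    msg_type is one of the CAP msgType values: "Alert" (first issue),
    "Update" (supersedes an earlier message), or "Cancel" (clears it).
    status is set to "Exercise", matching a simulation setting.
    """
    ET.register_namespace("", CAP_NS)
    alert = ET.Element("{%s}alert" % CAP_NS)
    for tag, text in [("identifier", identifier), ("sender", sender),
                      ("sent", sent), ("status", "Exercise"),
                      ("msgType", msg_type), ("scope", "Public")]:
        ET.SubElement(alert, "{%s}%s" % (CAP_NS, tag)).text = text
    info = ET.SubElement(alert, "{%s}info" % CAP_NS)
    for tag, text in [("category", "Met"), ("event", event),
                      ("urgency", "Expected"), ("severity", "Moderate"),
                      ("certainty", "Likely")]:
        ET.SubElement(info, "{%s}%s" % (CAP_NS, tag)).text = text
    return alert

# A Publisher exercising one step of the cycle: the initial "Alert".
msg = make_cap_alert("EX-2017-001", "demo@example.org",
                     "2017-03-06T09:00:00+00:00", "Alert", "Flood")
print(ET.tostring(msg, encoding="unicode"))
```

Subsequent messages in the same cycle would reference the earlier identifier and use msgType "Update" or "Cancel".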


Objective Evaluation

The objective evaluation is part of the controlled exercises. The Publishers and Subscribers will exercise successive iterations of the steps of forming a goal, specifying intentions, and executing actions; this is a Human-Computer Interaction (HCI) technique. The Publishers would execute the actions related to issuing alerts. The Subscribers would discuss the actions they would execute for specific alerts received. In each case the participants would perceive the state of the world, interpret the state of the world, and evaluate the outcome.
We will install the CamStudio screen capture software (see setup here) to record the user interaction techniques. The screen capture software will capture the behaviour of the user and the sequence of executing each process. This will also provide the Evaluators with insights into the usability and the operational efficiency of the system. The alert messages saved in SAMBRO, as well as the Email and SMS messages received by the Subscribers, will be used to determine data quality.
Observers, namely members of the project team, will be provided with assessment guidelines to evaluate the Publishers and Subscribers during the controlled exercises. The Observers will report details on operational usability, timeliness, and any other noteworthy observations.
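The timeliness indicator can be quantified from the exercise records. The sketch below computes the publish-to-receipt latency from a hypothetical log; the field names and timestamps are illustrative assumptions, not SAMBRO's actual log format.

```python
from datetime import datetime

# Hypothetical exercise log: when each alert was published in SAMBRO
# and when the corresponding SMS/Email reached a Subscriber.
log = [
    {"alert": "EX-001", "published": "2017-03-06T09:00:00",
     "received": "2017-03-06T09:01:30"},
    {"alert": "EX-002", "published": "2017-03-06T09:30:00",
     "received": "2017-03-06T09:34:15"},
]

def latency_seconds(entry):
    """Seconds between publishing an alert and the Subscriber receiving it."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    published = datetime.strptime(entry["published"], fmt)
    received = datetime.strptime(entry["received"], fmt)
    return (received - published).total_seconds()

for entry in log:
    print(entry["alert"], latency_seconds(entry))
```

Aggregating such latencies across all exercise messages would give the Evaluators a simple objective measure to set against the Observers' notes.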

Subjective Evaluation

The subjective evaluation will be carried out at the end of the controlled exercises to give the users the opportunity to express their opinion on the adoption of the system. It follows the Technology Acceptance Model (TAM) and the HCI-based Gulf of Execution / Evaluation methodologies. These insights will evaluate the utility of the system.
For the TAM evaluation, the users will be given a structured questionnaire to answer. The questionnaire is anonymized so that users can provide an unbiased opinion. The users are provided with a section in the TTX guideline for feedback on the Gulf of Execution / Evaluation, which indicates their ability to achieve their goals, intentions, and actions using the system.

Evaluation Instruments

We will use the following evaluation instruments (questionnaires, tools, and guides) in the controlled exercises:


  1. CamStudio for capturing usability
  2. Technology Acceptance Model Questionnaire
  3. Human Action Cycle Guideline
    1. for Publishers
    2. for Subscribers
  4. Observer report for Publishers

CamStudio

CamStudio will be used to evaluate the Human-Computer Interaction (HCI). This will give us an idea of the user acceptance of the system, its ease of use, the time consumed in each process, and the behaviour and sequence of the workflow.


  1. Download and install the Xvid codec from here
  2. To download CamStudio, go to the CamStudio project. This should download the latest version of CamStudio.


  1. Once you have downloaded the application, double-click on the file to start the installation.
  2. Click Next -> choose I accept the agreement -> Next -> choose the destination folder where you wish to install CamStudio -> Next -> Next -> Install -> Finish.

That's it, you are ready to go now.


  1. Go to Options -> Video Options. Select Xvid MPEG-4 Codec as the compressor. Click OK.
  2. Go to Options -> Video Options -> Configure. Make sure the 'Target quantizer' is set to 4.5 or above. The size of the output file increases as this number gets smaller, so for a long simulation it is recommended to set it to 4.5 or above. Also note that the maximum size of a video CamStudio can produce is 2 GB, so make sure we do not exceed this file size.
  3. Go to Options -> Video Options -> Configure -> Other Options and untick 'Display encoding status'.
  4. Go to Options -> Video Options. Check 'Auto Adjust' and 'Lock Capture and Playback Rates'. You can change the frame rate by sliding the widget, which will show the changed value in the Framerates section. The capture interval is the number of milliseconds between captured frames; the smaller the interval, the larger the video file. Capturing a frame every 50 milliseconds should suffice.
  5. Go to Region and select Full Screen.

You can find more settings in the video tutorials available here and here.

