Aug 1, 2004 12:00 PM

The system demonstration test portion of a Site Acceptance Test (SAT) involves the most people, requires the most advanced planning, and is the task that most people associate with Site Acceptance Testing. Once the Site Installation Inspection Tests have been completed (see last month's article), the System Demonstration Tests can be performed.

Some of the questions the end-user needs to answer during the SAT demonstration tests of card readers and associated hardware include:
Door Control Functions

  • Does a valid card presented to the reader unlock the door?

  • Is a valid entry reported at the host computer?

  • Does the door remain locked when an invalid card is presented?

  • Is an invalid card (entry attempt) message reported at the host computer?

  • How long does the door remain unlocked?

  • Does the door relock when the door closes, or when the unlock time has transpired?

  • What is the unlock time?

  • If the door contacts are supervised, is each "trouble" state reported correctly (line shorted and line open or cut)?

  • Is a message reported when the door is held open too long or forced open?

  • Is there a "pre-held" alarm that sounds if a cardholder holds the door open, and how long is the pre-held time?

  • Does triggering the request-to-exit (REX) device unlock the door?

  • Does the reader have a tamper switch, and what message is reported when the switch opens and closes?
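The door-control behavior these questions probe (unlock on a valid card, relock on door close or unlock-time expiry, pre-held and held-open warnings, REX release) can be sketched as a small state model. This is an illustrative sketch only; the timings, method names, and message strings are hypothetical and not drawn from any particular access control product.

```python
# Illustrative model of card-reader door control logic (hypothetical
# timings and names; real panels implement this in firmware).

class DoorController:
    def __init__(self, unlock_time=5, held_open_time=30, pre_held_time=20):
        self.unlock_time = unlock_time        # seconds the door stays unlocked
        self.held_open_time = held_open_time  # seconds before "door held" alarm
        self.pre_held_time = pre_held_time    # seconds before local pre-held warning
        self.locked = True
        self.events = []                      # messages reported to the host

    def present_card(self, valid):
        if valid:
            self.locked = False
            self.events.append("valid entry")
        else:
            # Door remains locked; attempt is still reported to the host.
            self.events.append("invalid card (entry attempt)")

    def door_closed(self):
        # Relock on door close, even if the unlock time has not expired.
        self.locked = True

    def unlock_timer_expired(self):
        # Relock when the unlock time has transpired.
        self.locked = True

    def request_to_exit(self):
        # Triggering the REX device releases the door without a host alarm.
        self.locked = False

    def door_open_duration(self, seconds):
        # Report a pre-held warning first, then a held-open alarm.
        if seconds >= self.held_open_time:
            self.events.append("door held open")
        elif seconds >= self.pre_held_time:
            self.events.append("pre-held warning")

d = DoorController()
d.present_card(valid=True)   # door unlocks, "valid entry" reported
d.door_closed()              # door relocks
d.present_card(valid=False)  # door stays locked, attempt reported
```

Walking a model like this against each bullet above is essentially what the demonstration test does with the live hardware.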

None of these questions directly addresses which devices are connected to which terminals, although in reality the system will not work properly if the devices are not correctly connected. Rather, these questions elicit the information the end-user needs, in the end-user's terms, not the installer's.

If there are many doors to be tested, it may not be practical to apply these questions to every door. In such a case, a good approach is to print out the settings for each door and to place the doors in categories according to types of settings (such as doors with REX devices and doors without them). After testing one or two doors from each category, the customer can merely verify that the remaining settings shown in the printouts are correct.
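The sampling approach above can be sketched in a few lines: group the doors by their configuration settings and fully test one or two representatives from each group. The door names and setting fields here are hypothetical examples.

```python
from itertools import groupby

# Hypothetical door settings, as they might appear in a printed report.
doors = [
    {"name": "Lobby North",  "rex": True,  "unlock_time": 5},
    {"name": "Lobby South",  "rex": True,  "unlock_time": 5},
    {"name": "Server Room",  "rex": False, "unlock_time": 3},
    {"name": "Loading Dock", "rex": False, "unlock_time": 3},
]

def category(door):
    # Doors with identical settings fall into the same test category.
    return (door["rex"], door["unlock_time"])

# groupby requires the data to be sorted by the same key.
doors.sort(key=category)
for settings, group in groupby(doors, key=category):
    members = list(group)
    sample = members[:2]  # fully test one or two doors per category
    print(settings, "-> test:", [d["name"] for d in sample],
          "verify printouts:", [d["name"] for d in members[2:]])
```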
Input Monitoring Points

Some input devices provide a normally open (N.O.) contact that closes to indicate the alarm condition. Other devices are just the opposite, providing a normally closed (N.C.) contact that opens to indicate the alarm. If these settings are not configured correctly, the device will report the reverse of what is intended. Here are questions the end-user should ask:

  • When the alarm device is triggered, is the correct alarm message reported?

  • When the alarm device returns to normal, is the correct "normal" message reported?

  • If the monitor point circuits are supervised, is each "trouble" state reported correctly (line shorted, and line open or cut)?

  • What camera(s) will this input trigger for recording or viewing?
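The N.O./N.C. polarity logic described above can be made concrete: whether a given physical contact state means "alarm" depends entirely on how the input is configured, and a wrong setting inverts every report. The function below is an illustrative sketch, not any panel's actual API.

```python
def input_state(contact_closed, normally_open):
    """Interpret a dry-contact input given its configured polarity.

    normally_open=True  : contact closes to signal an alarm (N.O.)
    normally_open=False : contact opens to signal an alarm (N.C.)
    """
    if normally_open:
        return "alarm" if contact_closed else "normal"
    return "normal" if contact_closed else "alarm"

# Correct configuration: an N.C. door contact that opens reports an alarm.
print(input_state(contact_closed=False, normally_open=False))  # alarm

# Misconfigured polarity reverses the report: the same open contact
# reads as "normal" if the input is mistakenly set to N.O.
print(input_state(contact_closed=False, normally_open=True))   # normal
```

This is why the demonstration test triggers each device physically rather than trusting the configuration printout.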

Output Control Points

There are some technical details about output control points that need to be understood by at least one person on the customer's team, either a facility engineering person or a consultant. The purpose of delving into these details is to ensure that the actual devices proposed are the ones that are being installed (checked during SAT inspection tests) and that they are installed correctly and operating correctly (checked during SAT demonstration tests).

In most cases the external device is controlled by energizing the relay (turning it on), which causes the relay to connect an external power source to the device (the equivalent of turning on a light switch). In other cases the relay connections go not to a power source but directly to the external device, and act like an on-off switch on the device to initiate the desired action.

Some devices require only a momentary contact closure to initiate the desired action, and some other external action resets the device. The Demonstration Tests should establish:

  • What action is required to activate (operate) the control device?

  • Is another action required to deactivate (turn off) the device, and if so, what is it?

  • What happens to the relay and to the controlled device during a power loss?

  • Are these the intended states of operation?
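The relay behaviors these tests probe, maintained versus momentary activation and the state assumed on power loss, can be sketched as follows. The mode names and default fail states are hypothetical choices for illustration, not vendor settings.

```python
class OutputRelay:
    """Illustrative output control point (not a real panel API)."""

    def __init__(self, mode="maintained", fail_state="de-energized"):
        self.mode = mode              # "maintained" or "momentary"
        self.fail_state = fail_state  # relay state assumed on power loss
        self.energized = False

    def activate(self):
        self.energized = True
        if self.mode == "momentary":
            # A momentary closure initiates the action, then releases;
            # some other external action resets the controlled device.
            self.energized = False

    def deactivate(self):
        # Maintained outputs require a second action to turn off.
        self.energized = False

    def power_loss(self):
        # On power loss the relay drops to its fail state; the test
        # should confirm this matches the intended state of operation.
        self.energized = (self.fail_state == "energized")
```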


Cameras

  • Does the camera provide the correct field of view?

  • Does it correctly switch from color (daytime viewing) to black-and-white (nighttime viewing)?

  • Do pan, tilt and zoom controls operate properly?

  • Do iris and focus controls operate properly?

  • Is the range of illumination adequate?

  • Do any input devices trigger the camera to activate or change position?

  • Does it need to have preset control positions?

Access Control Panels

Do trouble-reporting indicators operate as they should? Common indicators include the tamper switch, loss of AC power, and low battery voltage.

It should be obvious that test planning and preparation are key to the success of the Equipment Inspection and System Demonstration Tests. The test plan should clearly define the criteria for success, for system adjustment and resumption of testing, for a full test restart, and for failure. It must also specify who will sign off on each test (for example: installer, customer, consultant). There should be one person on the customer's team vested with final decision authority regarding test suspension, resumption, restart or termination.

System providers should realize that the more educated and familiar the test participants are with the system, the more comfortable they will be with the test process.

Customer training should provide a basic understanding of the security technologies themselves and impart understanding of how the technologies are applied and must be managed for the customer facility.

Anyone participating in the Site Acceptance Test not already familiar with the system should participate in the customer training. For example, facilities engineering or maintenance personnel should receive basic training about the security technologies, including the general care of the devices and any restrictions on handling or cleaning. Security officer personnel should be trained on how to operate the system monitoring, alarm response and reporting features. Administration personnel should be trained in how to perform cardholder data entry, and how to issue badges if badging is included in the system.

Training for customer personnel should be performed prior to the Site Acceptance Test, so that the customer test participants will be knowledgeable enough to participate actively in the testing. That also offers an extra benefit for the SAT, by helping to reveal (during testing) any gaps in the training that was provided.

Because no system is entirely fault-free, the customer and system provider must agree upon the maximum number of acceptable outstanding issues in any particular category. Here are categories that have been useful for assessing errors found during testing:
Show Stopper

It is impossible to continue with the testing because of the severity of the issue.
Critical Problem

Testing can continue, but the system cannot go "live" (into operational use) with this issue unresolved.
Major Problem

Testing can continue, but this issue will cause severe disruption to business processes in live operation or requires special resources to compensate (such as posting of guards), and could only be tolerated on a short-term basis.
Medium Problem

Testing can continue, and the system is likely to go live with only minimal departure from agreed business processes.
Minor Problem

Both testing and live operations may progress. This problem should be corrected, but little or no changes to business processes would be needed.
Cosmetic Problem

This refers to an error in colors, fonts, text size or visual layout of information which, while incorrect, will not impact system operations.
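The categories above can be captured as data, so that each logged issue carries its gating consequences: whether testing can continue and whether the system can go live. The class below mirrors the article's category names; the flags and rank values are an illustrative encoding, not part of any standard.

```python
from enum import Enum

class Severity(Enum):
    # (rank, testing_can_continue, system_can_go_live)
    SHOW_STOPPER = (0, False, False)
    CRITICAL     = (1, True,  False)
    MAJOR        = (2, True,  True)   # go-live tolerable only short-term,
                                      # with compensating resources (guards)
    MEDIUM       = (3, True,  True)
    MINOR        = (4, True,  True)
    COSMETIC     = (5, True,  True)

    @property
    def testing_can_continue(self):
        return self.value[1]

    @property
    def can_go_live(self):
        return self.value[2]

issue = Severity.CRITICAL
print(issue.testing_can_continue, issue.can_go_live)  # True False
```

Agreeing on the maximum number of outstanding issues per category then becomes a simple count over the logged severities.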

What if the customer thinks something is not working the way it should, and the system provider disagrees?

The best way to resolve the issue is by referring to the customer's requirements documents. If those do not cover the issue, the system provider's proposal or product documentation should be referred to. If no documentation relates to the issue, for purposes of the test, the decision must favor the system provider because there is no stated requirement about the issue. However, customer consideration warrants that the system provider do everything reasonable to accommodate the customer post-test, in the interest of customer satisfaction.
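The precedence described above (requirements documents first, then the provider's proposal or product documentation, otherwise the decision favors the provider) is a simple lookup chain. The sketch below is hypothetical; the issue names and document contents are invented for illustration.

```python
def resolve_dispute(requirements, proposal_docs, issue):
    """Hypothetical sketch of documentation precedence for test disputes.

    Each source maps an issue name to the party it supports
    ("customer" or "provider").
    """
    for source in (requirements, proposal_docs):
        if issue in source:
            return source[issue]
    # No stated requirement covers the issue: for purposes of the test,
    # the decision favors the system provider.
    return "provider"

reqs = {"door unlock time": "customer"}
docs = {"camera presets": "provider"}
print(resolve_dispute(reqs, docs, "door unlock time"))  # customer
print(resolve_dispute(reqs, docs, "badge printing"))    # provider
```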

As each test of the overall SAT is completed, the designated witnesses should sign the test sheet indicating that the test was performed and initial the appropriate place to indicate "pass" or "fail." If an error occurs during the test, the details of the error are described along with its classification.

If the test has to be stopped and restarted later, the decision and the reasons for it should be documented in writing and signed by the witnesses.

When the SAT is completed or terminated, an overall "pass" or "fail" status must be determined based upon the agreed-upon criteria. A summary list of errors and their agreed-upon resolution should be compiled and bound with the signed test pages. This list provides an auditable record of the SAT. Sometimes part of the SAT payment milestone is contingent upon the satisfactory resolution of some or all of the issues identified during testing.
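The record-keeping described in the preceding paragraphs can be represented as a simple data structure: one record per test, with witness sign-offs, a pass/fail result, and any errors with their classification, from which the auditable summary list is compiled. Field names here are illustrative only.

```python
from dataclasses import dataclass, field

@dataclass
class TestRecord:
    name: str
    result: str = "pending"          # "pass" or "fail"
    witnesses: list = field(default_factory=list)
    errors: list = field(default_factory=list)  # (description, category)

    def sign(self, witness, result):
        # Witness signs the test sheet and initials the pass/fail result.
        self.witnesses.append(witness)
        self.result = result

    def log_error(self, description, category):
        # Errors are described along with their agreed classification.
        self.errors.append((description, category))

def summary(records):
    # Compile the summary list of errors bound with the signed test pages.
    return [(r.name, e) for r in records for e in r.errors]

rec = TestRecord("Door 12 REX test")
rec.log_error("REX did not release door", "Major Problem")
rec.sign("consultant", "fail")
print(summary([rec]))
```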

System providers can enhance the success of the Site Acceptance Tests by offering customer training before the SAT and by collaborating on test plans and execution documents written in terms of user functions.

They can also provide the user with separate documentation that defines field sizes and other system restrictions to eliminate the customer's need to read through manuals in hopes of finding the information. Finally, they can provide a test plan that has customer personnel involved in performing portions of the SAT.

One role for a consultant is to provide technical expertise regarding the correctness of the system installation and the integrity of the test demonstrations. Another valid role for a consultant is to contribute to the development of the test plan.

Depending on the consultant's role in the project, this would include writing testable systems specifications, collaborating with the customer to develop security operations and response scenarios that the system must support and defining an appropriate scope of end-user training to be delivered prior to the SAT.

The consultant also collaborates with the system provider, on behalf of the customer, to ensure that the test plan represents the customer's perspective and interests, and provides a realistic and adequate exposure of the system to all reasonably expected events.

Additionally, the consultant participates in System Inspection Tests and Demonstration Tests as an official witness and reviews test summaries and reports for accuracy and completeness.

It is important for the consultant to remember that he or she represents the customer's objectives and requirements for the customer's system, not his or her own.

The customer cannot be complacent and expect the system provider to do all of the work required for a successful SAT. The customer should enhance the success of the SAT by being definitive in the system specifications as to what is expected in the SAT; by learning about the system and the specific features that are important to him; by getting those who will be working with and using the system involved in the training and SAT.

Customers should require that documentation be specific to their own system and how it will be used, not simply "boilerplate" documentation.

The customer must be pro-active in obtaining a full and complete Site Acceptance Test. He should not wait for the system provider to indicate what needs to be done to get ready for the SAT. A system provider may have an entirely different and less satisfactory approach to acceptance testing than the customer. The more customers get involved prior to and during the SAT, the more beneficial these tests will be for both parties.
Coming Next Month…

An Operational Acceptance Test shows that a system can operate as intended continuously, with acceptable up-time.

Don Sturgis, CPP, is a senior security consultant for Ray Bernard Consulting Services (RBCS), a firm that provides high-security consulting services for public and private facilities. This article is excerpted material from the upcoming book The Handbook of Physical Security System Testing by Ray Bernard and Don Sturgis, scheduled for publication by Auerbach Publications in the spring of 2005. For more information about Don Sturgis and RBCS, go to www.go-rbcs.com or call 949-831-6788.
