The Skinny on FATs

Jun 1, 2004 12:00 PM
By Don Sturgis

The Factory Acceptance Test or FAT (sometimes called a Pre-Shipment Test, System Test or more recently Integration Test) was originally used for all kinds of custom-built systems to work out design and manufacturing flaws, and to prove that a system will do the job it is supposed to do before it is shipped to a customer site. After satisfactorily completing a factory test, a system is typically disassembled and shipped to the customer site, then reassembled and subjected to a Site Acceptance Test (SAT), which would replicate all or some of the FAT, with the expectation of getting the same test results.

Manufacturing control systems, telephone switching systems and electric power distribution control systems (to name just a few) undergo factory tests that take weeks and involve dozens and sometimes hundreds of people for their execution. Preparation for that type of factory test takes months. In contrast, a properly planned and executed FAT for a large security system usually takes 1-3 days, and the planning and preparation only a few weeks.
Security system Factory Acceptance Tests

When electronic security systems were first developed, and as they continued to grow in capability and complexity, a FAT was needed to ensure that new aspects of systems would work correctly when the components were connected and the system was set up for use. For most access control systems in the 1980s and '90s, the time intervals between feature changes and bug fixes were often measured in weeks. A FAT was required to make sure that bugs had not crept into the systems along with intended enhancements. Because the systems were not integrated at the time, a separate FAT needed to be conducted for each type of system (access control, alarm monitoring and CCTV).

As the reliability of electronic security systems increased, and as the time intervals between changes lengthened, it became common to use a FAT only if the customer required or requested one. Typically, the only customers who requested them were those already familiar with factory tests, or whose security or engineering consultants knew of their importance.
Information technology accelerates change

Today's large networked and computer-based electronic security systems are more encompassing and more complex than previous generations of systems. They are usually an integration of multiple systems from one or more manufacturers. The high pace of technology change means that, in essence, many system integrators will never install the exact same system twice. More frequently, a systems integrator is now building systems using components or applications that the integrator has not previously installed in the same combination.

Thus, a FAT should be included in the plans for any large physical security system and should be indicated as a major milestone for acceptance and payment.
Factory Acceptance Test

Many installing technicians are not familiar with factory acceptance testing. If they were, they would require it regardless of the customer's preference, because of the tremendous benefits to the installing vendor. Vendor personnel who have not performed a true Factory Acceptance Test often try to omit it. They try to convince the customer that it is costly and time-consuming, asserting that the testing would be a waste of time since they know their products well and everything works the way it should. However, the FAT offers the customer the only opportunity to witness the major operations of the system and to work out flaws in the design before any equipment is shipped to the job site. These flaws can show up in many different ways: improper component selection, "bugs" in custom software applications, or interconnectivity and integration issues. What better place to find these issues than in a controlled environment at the "factory," where doing so does not interfere with or compromise any activities at the customer site?

The FAT brings the hardware and software together for testing in one location prior to shipment to customer sites, in order to (see the sketch following this list):

  • demonstrate the suitability and workability of the system design;
  • verify correct interconnectivity of all system components, showing that hardware and software are fully integrated as a system;
  • satisfy capacity and performance requirements that have not been demonstrated in the vendor's previously installed systems;
  • prove compliance with specifications, and confirm that all custom requirements perform as specified;
  • verify that critical performance parameters operate as described in the functional requirements and/or design specifications;
  • verify that backup and restore operations function as required; and
  • verify that redundant server takeover operations function as required.
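
As a minimal sketch of how such an objective list can be made executable and witnessable, the Python fragment below pairs each objective with a pass/fail check. The `ObjectiveCheck` structure and the placeholder checks are illustrative assumptions, not part of any product or standard; real checks would exercise the actual assembled system.

```python
# A hypothetical, minimal harness pairing FAT objectives with checks.
# The placeholder lambdas stand in for real demonstrations against the
# assembled system; nothing here is a real product API.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ObjectiveCheck:
    description: str
    check: Callable[[], bool]  # returns True if the objective is met

def run_fat_objectives(checks: list[ObjectiveCheck]) -> bool:
    all_passed = True
    for item in checks:
        passed = item.check()
        all_passed = all_passed and passed
        print(f"[{'PASS' if passed else 'FAIL'}] {item.description}")
    return all_passed

checks = [
    ObjectiveCheck("All components interconnect as one system", lambda: True),
    ObjectiveCheck("Backup and restore operations function", lambda: True),
    ObjectiveCheck("Redundant server takeover functions", lambda: True),
]
run_fat_objectives(checks)
```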


The FAT can be performed only after the system design has been completed. System design means more than just selecting equipment and determining device counts. It includes working out how the customer will use the system. Some examples:

  • For access control systems, it includes developing the access levels for assignment to system users, something that may require collaboration with the customer's security and human resources departments.
  • For video monitoring systems, it includes determining how the automatic camera control will be used in response to alarms, what events will initiate video recording, and how other video system features will be used.
  • For ID badging systems, it means designing the actual badge types and preparing to print real badges.


The customer and vendor must collaborate on these tasks. Cooperation ensures that there are no misunderstandings about how the system will operate, and in some cases, the customer may be able to obtain free improvements and customizations. At the least, system setup is significantly accelerated because without a FAT these issues typically would not be addressed until the system is being installed.
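
To illustrate the kind of design work product involved (the access-level example in the list above), here is a hypothetical sketch of access levels captured as data. The level names, door names and schedule labels are invented for illustration and do not reflect any particular product's format.

```python
# Hypothetical access levels worked out during system design.
# Level names, doors and schedule labels are invented examples.

from dataclasses import dataclass

@dataclass(frozen=True)
class AccessLevel:
    name: str
    doors: frozenset[str]  # doors this level may open
    schedule: str          # named time window, e.g. "business-hours"

LEVELS = {
    "employee": AccessLevel("employee", frozenset({"lobby", "office"}),
                            "business-hours"),
    "it-staff": AccessLevel("it-staff",
                            frozenset({"lobby", "office", "server-room"}),
                            "24x7"),
}

def may_open(level_name: str, door: str) -> bool:
    # Schedule checking is omitted to keep the sketch short.
    return door in LEVELS[level_name].doors

print(may_open("employee", "server-room"))  # False
```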

The customer can even participate in the data entry as part of the FAT setup, thus gaining hands-on experience with the system before it is installed. Furthermore, key customer personnel who will serve supervisory roles involving the security system can receive basic training to prepare for their participation in the testing process. Educated test participants will speed up the testing process, and the more the customer knows about the system, the more comfortable he or she will be with system acceptance.

MDI Security Systems, Rancho Cucamonga, Calif., a manufacturer of access control, alarm management, digital video and digital audio systems, offers a Factory Acceptance Test as a separate line item on the majority of its proposals. Most MDI systems are sold through dealers or system integrators, so it is up to these "middle-men" to convince the end-user of the value of the FAT. Most are convinced and purchase this option.

"The end-users like the FAT because it gives them a chance to touch-and-feel the equipment and to get training on it before it is installed at their facility," says MDI senior vice president Ray Payne. "Some customers even bring along [the system operators] who will be working with the system on a day-to-day basis. We get their inputs on what they like and what seems to be difficult for them. We have made improvements to the systems based on some of their comments."

Payne indicates that the dealers and integrators like the FAT because it gives the end-user an opportunity to buy in and take ownership of the system during these tests. "The best value for [the manufacturer] is realized in the case when something isn't working correctly after the system is installed. The dealer/integrator is less likely to blame the equipment but more likely to assume that the operator is doing something wrong because they saw the system working correctly here at our facility," Payne says. "Another advantage to the dealers/integrators is if they have turnover in personnel, it gives their newer technicians an opportunity to work with the system before they have to install it in the field."

Frequently, the FAT is not conducted at the "factory," but at the system integrator's facility. Regardless of where these tests are performed, the main advantage is that all components of the system are in one room or area of a building instead of being dispersed over the end-user's multiple sites.

When the tests are performed at the manufacturer's site, as in the case of MDI, the dealer/integrator comes to the factory a few days before the end-user to help set up the system. They then do a first run-through of the tests to find any problems and correct them before the customer arrives. The technical personnel who are capable of correcting any design issues normally work at the "factory" and can witness the anomalies first-hand. They can instantly see how performing a task at one "site" impacts the operation at another "site," since all equipment is assembled together. They can then immediately delve into solving the problems without ever having to leave the test site or send out field personnel.

This is much more cost-effective and provides a more timely solution than the alternative. If no FAT is performed, the system integrator incurs the cost of technical personnel traveling to the end-user's facility to deal with these same issues after the system has been installed. Solving these problems in the field always involves more calendar time and vendor expense, and will inevitably delay the date when a fully functional system can be turned over to the end-user. A well-designed and executed FAT will pay for itself in both time and money.
Optimizing test scope

For very large systems, it is not practical (and sometimes not even possible) to assemble, connect and configure every piece of hardware and software that will be installed. Thus a minimum configuration should be designed that will accomplish the objectives.

When some of these objectives can be proven for various aspects of the system by reference to existing customer sites (with the same models and versions of the system components), it is not necessary to prove them again in the FAT. In particular, this approach can be used to satisfy the maximum system capacity and performance requirements, and to reduce the number of hardware devices required for the FAT.

Furthermore, proving that one piece of equipment (such as an access control panel) meets the maximum capacity requirement is sufficient to prove that all panels can meet that requirement. However, the overall capacity of the system to handle the intended full equipment load and the message traffic it generates must still be proven.
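
Where full message traffic cannot be generated by real devices, a small load generator can stand in. The sketch below is one hypothetical way to emit synthetic events at a target rate and report the rate actually achieved; `process_event` is an assumed stand-in for however events would be delivered to the system under test.

```python
# A sketch of generating synthetic message traffic to exercise system
# capacity. process_event() is a stub for the real system input path.

import time

def process_event(event_id: int) -> None:
    pass  # stand-in for delivering an event to the system under test

def generate_traffic(events_per_second: int, duration_s: float) -> float:
    interval = 1.0 / events_per_second
    sent = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        process_event(sent)
        sent += 1
        time.sleep(interval)
    elapsed = time.monotonic() - start
    return sent / elapsed  # achieved rate, to compare against the target

achieved = generate_traffic(events_per_second=100, duration_s=2.0)
print(f"achieved {achieved:.0f} events/second")
```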

For complex systems, it can make sense to perform the FAT in stages. When complex subsystems provide simple inputs into the main system (such as a fence-line perimeter intrusion detection system), it is only necessary to demonstrate the ability of the main system to accept and process these inputs. The subsystem operation can be verified as a separate stage, either before or after the main system factory test, by reference to existing customer sites or by separate testing. This approach is especially useful when a subsystem is best tested in a physical environment that is not practical for the main FAT. For example, testing a five-mile fiber optic network backbone is best done at the network system manufacturer's location rather than at the system integrator's facility.

Some specialized pre-shipment tests must also be performed separately because they involve harsh or specially controlled environments and may need to be performed over an extended time period. For example, outdoor security system components that will be installed at oil refineries or chemical processing facilities must be resistant to the airborne chemicals in those environments and may have to be enclosed in explosion-proof housings. Some marine facility security system components must withstand corrosive salt sprays. In such cases, laboratory-certified test results (or inspection of existing customer sites) should be used to eliminate the need for customer testing.
Scenario-based testing

A system integrator's customer list will contain a wide variety of facility types. The exact uses of the security systems from facility to facility will vary, meaning acceptance tests should be tailored to fit the way each customer needs and wants to use the system. This is where security scenarios come into play.

A scenario is a sequence of events that a customer expects to take place as his system is used. There are two categories of scenarios: normal operations and security response. Security response includes investigation of suspicious indicators as well as responses to security incidents.

To develop security scenarios, examine the jobs, roles and activities of each person who will be using the system. For access control systems, that includes system operators as well as system users (the people who will be granted access privileges). Often 6-12 scenarios can cover most system usage requirements.

In general, factory testing consists of a list of features that must be demonstrated, along with the steps required to demonstrate them and the results that should be observed. A single scenario will include the use of multiple system features. By performing the scenario-based tests first, the customer will be able to mark most features as tested, and then address the remaining individual features as necessary.

Scenarios should include (as appropriate for the system):


  • alarm operations;
  • surveillance operations;
  • badge issuance operations;
  • access control operations;
  • reporting operations;
  • emergency mode operations; and
  • backup and restore operations.
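
Because each scenario exercises several features, the coverage bookkeeping described above can be done mechanically. The sketch below, using invented scenario and feature names, shows the set arithmetic involved: run the scenario-based tests first, and whatever features remain uncovered are then tested individually.

```python
# Tracking which features the scenario-based tests cover.
# Scenario and feature names are hypothetical examples.

ALL_FEATURES = {
    "alarm-display", "camera-callup", "badge-print",
    "access-grant", "access-deny", "report-export", "db-restore",
}

SCENARIOS = {
    "after-hours intrusion": {"alarm-display", "camera-callup"},
    "new-hire badging": {"badge-print", "access-grant"},
    "tailgating attempt": {"access-deny", "alarm-display"},
}

covered = set().union(*SCENARIOS.values())
remaining = ALL_FEATURES - covered
print("still needs individual tests:", sorted(remaining))
# -> ['db-restore', 'report-export']
```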


The customer provides the security scenarios, seeking assistance in their development from a security consultant or even from the vendor. Regardless of who prepares them, the security scenarios remain a primary responsibility of the customer.

A discussion of security scenarios is not complete without making this important point: Acceptance testing is not limited to the security scenarios. The customer is not responsible for writing the test plan or designing tests. The purpose of scenarios is to ensure that both the customer and the vendor understand how the customer intends to use the system.
Factory Acceptance Test plan

Test plans will vary from system to system. In general, the testing should be planned in order to provide a realistic and adequate exposure of the system to all reasonably expected events, given the system's scope and purpose.

Using the IEEE Standard 829 for Software Test Documentation as a guide, an acceptance test plan should include in detail (a structured sketch follows this list):

  • description of the scope and approach of the test;
  • references to related documents (such as specifications, user guides, etc.);
  • types of tests to be carried out;
  • features and combinations of features to be tested;
  • features not to be tested (and the reasons why);
  • test environment;
  • system equipment, including model and version numbers;
  • test setup requirements;
  • test pass and fail criteria;
  • test suspension and resumption criteria;
  • any test equipment and tools;
  • staffing, training and skill requirements, and responsibilities;
  • schedule for test performance and all test-related tasks, including post-test reports;
  • any risks requiring contingency planning;
  • a list and description of test deliverables (IEEE Standard 829 requires the following documents: test plan, test design specifications, test case specifications, test procedures, test item transmittal reports, test logs, test incident reports, test summary reports); and
  • the names and titles of all persons who must approve the test plan.
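
One way to keep these items from being overlooked is to capture them in a structured form. The sketch below is a loose, hypothetical rendering of a few of the fields as a Python structure; it is not IEEE 829's actual document format, and the field names and sample values are assumptions.

```python
# A loose, hypothetical skeleton for test-plan metadata, inspired by the
# item list above; it is not IEEE 829's actual document format.

from dataclasses import dataclass, field

@dataclass
class TestPlan:
    scope: str
    references: list[str]
    features_tested: list[str]
    features_not_tested: dict[str, str]  # feature -> reason for exclusion
    pass_fail_criteria: str
    suspension_criteria: str
    deliverables: list[str]
    approvers: list[str] = field(default_factory=list)

    def is_approvable(self) -> bool:
        # Every excluded feature must carry a stated reason (see list above).
        return all(r.strip() for r in self.features_not_tested.values())

plan = TestPlan(
    scope="FAT for the integrated access control and video system",
    references=["System specification rev. B"],
    features_tested=["alarm display", "badge printing"],
    features_not_tested={"panel maximum capacity": "proven at existing site"},
    pass_fail_criteria="observed result matches expected result per step",
    suspension_criteria="any failure that blocks subsequent steps",
    deliverables=["test log", "test incident reports", "test summary report"],
)
print(plan.is_approvable())  # True
```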


Once the overall test plan is approved, the individual test cases (also called test scripts, test procedures or test suites) can be developed. Completing the up-front work of test planning (as listed above) speeds test case development. A common error is to grab a features list or specifications document and start writing test cases as the first action. This approach inevitably becomes a stop-and-go activity, as the test designer will encounter many questions along the way that must be answered before test case design work can continue.
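
An individual test case then pairs each step with the result the witnesses should observe. A minimal, hypothetical structure might look like the sketch below; the case ID, title and step text are invented examples, not drawn from any real test plan.

```python
# A minimal, hypothetical test-case structure: each step pairs an action
# with the result the witnesses should observe.

from dataclasses import dataclass

@dataclass
class Step:
    action: str
    expected: str
    observed: str = ""  # filled in during the test

@dataclass
class TestCase:
    case_id: str
    title: str
    steps: list[Step]

    def passed(self) -> bool:
        return all(s.observed == s.expected for s in self.steps)

case = TestCase("AC-007", "Voided badge is rejected", [
    Step(action="Present a voided badge at the lobby reader",
         expected="Access denied; alarm logged at the console"),
])
case.steps[0].observed = "Access denied; alarm logged at the console"
print(case.passed())  # True
```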
Who writes the test plan?

Ideally, test plan templates would be written by the manufacturers and tailored by vendors and customers for each system being tested. Vendors should incorporate the integration aspects of the systems in their test plans, and, over time, build up their own library of reusable test templates. The customer should have final say over the test plan.

Payne indicates that MDI has prepared FAT guidelines for its systems and individual features and makes them available to assist dealer/integrators in preparing the actual test plans. Emil Marone, chief technology officer for Henry Bros. Electronics, Saddle Brook, N.J., agrees: "Since we know the customer's needs, we, the integrator, must take the manufacturer's guidelines or test scripts and tailor them into a Factory Acceptance Test that demonstrates the features of the system the customer requires. The customer (or customer's security or engineering consultant) needs to be involved in this process and buy off on the final test plan. The test can then be performed by the customer alone or jointly by the vendor and customer.

"I cannot stress too strongly the need for the customer's input during the entire process (from clarifying expectations for the system, through design, and to development of the Factory Acceptance Test)," Marone continues. "Our most successful installations have been the result of a close working relationship with the end-users. Having the customers supply us a person or personnel totally familiar with their operational needs is the key to providing the correct system. This aids in creating the proper Factory Acceptance Test, the purpose of which is to ensure they receive a system which meets their needs."
Test results

Obviously the best scenario for a Factory Acceptance Test is that all tests perform as planned and the system is shipped and installed on schedule. The worst-case scenario is that there are one or more major issues that do not perform as required and cannot be resolved by the vendor within a reasonable and acceptable time period. If the purchasing contract is written correctly, the customer has the option at that point of canceling the contract. Given today's level of technological advancement of both systems and vendor personnel, that situation is unlikely.

In between these two extremes are the following situations that are more likely to occur.

  • Both vendor and customer agree that a feature does not meet its expected test results, but they are able to agree on an acceptable solution. The "fix" is incorporated, the system is retested and performs as planned. In this case, the customer gets the desired system; the only negative effect is that the system ships later than scheduled because of the time required to find and implement the "fix."

  • Both vendor and customer agree that some performance issue does not meet its specified requirements. As an example, the specifications state that an alarm message must appear on an operator's screen within 2 seconds of the occurrence of the alarm. The tests show that this time is typically 3 to 4 seconds, and there seems to be no way to improve it (a measurement sketch follows this list). The customer must now decide whether this is such a significant issue as to reject the system, or to accept the system with its lesser performance. A third option is for the customer to conditionally accept the system, with the agreement that the manufacturer will correct the performance issue and provide the required corrections to the installed system, at no cost, within an agreed-upon time frame.

  • If the test cases are not well written, the customer and vendor may interpret the results differently. The vendor may state that the test results meet the intent of the customer's specification; the customer may disagree. Questions regarding the validity of test procedures or test results should have been resolved in the earlier customer review and approval of the test plan; however, it can happen that the test procedures or test results were interpreted differently by the vendor and customer, and the difference does not become apparent until test time. That is another benefit of the FAT: it brings such differences to light before installation occurs.
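
The 2-second alarm-latency requirement above is the kind of criterion that is easy to measure mechanically during the FAT. The sketch below shows one hypothetical way to do so; `trigger_alarm` and `wait_for_display` are assumed stand-ins for however the real system raises an alarm and presents it to the operator.

```python
# A sketch of measuring alarm-to-display latency against a criterion.
# trigger_alarm() and wait_for_display() are hypothetical stand-ins.

import time

LATENCY_LIMIT_S = 2.0  # from the (hypothetical) specification

def trigger_alarm() -> float:
    # Stand-in: fire a test input (e.g., open a supervised contact).
    return time.monotonic()

def wait_for_display() -> float:
    # Stand-in: poll or subscribe until the alarm appears on screen.
    time.sleep(0.5)  # simulated response time for illustration
    return time.monotonic()

samples = []
for _ in range(10):
    t0 = trigger_alarm()
    t1 = wait_for_display()
    samples.append(t1 - t0)

worst = max(samples)
print(f"worst-case latency: {worst:.2f}s "
      f"({'PASS' if worst <= LATENCY_LIMIT_S else 'FAIL'})")
```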


Test scheduling

Best practice dictates that the vendor actually perform a practice run of the entire test, step-by-step as it will be executed by or for the customer. This is a good way to flush out errors or inaccuracies in the test procedures. It also provides the vendor with knowledge of how long each test procedure actually takes, so that groups of tests can be scheduled accurately, and test breaks can be included in the schedule at appropriate intervals.
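
Once the practice run has produced per-procedure timings, building the day's schedule is simple arithmetic. The sketch below, with invented durations and an assumed break policy, groups procedures and inserts breaks at roughly regular intervals.

```python
# A sketch of turning measured run-through durations into a test-day
# schedule with breaks. Durations and break policy are assumptions.

from datetime import datetime, timedelta

durations_min = {            # measured in the vendor's practice run
    "Scenario tests": 90,
    "Access control features": 60,
    "Video features": 45,
    "Backup/restore": 30,
}

def build_schedule(start: datetime, break_every_min: int = 120,
                   break_len_min: int = 15) -> list[str]:
    t, since_break, rows = start, 0, []
    for name, mins in durations_min.items():
        if since_break + mins > break_every_min:
            rows.append(f"{t:%H:%M}  Break ({break_len_min} min)")
            t += timedelta(minutes=break_len_min)
            since_break = 0
        rows.append(f"{t:%H:%M}  {name} ({mins} min)")
        t += timedelta(minutes=mins)
        since_break += mins
    return rows

for row in build_schedule(datetime(2004, 6, 1, 9, 0)):
    print(row)
```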

Factory Acceptance Test Checklist

A successful Factory Acceptance Test:

  • Is described clearly in a well-written test plan
  • Is documented accurately by well-written test procedures
  • Is prepared with inputs from manufacturers, vendor, customer and/or security consultant
  • Is quantifiable and verifiable
  • Includes tests for all major system design and integration issues
  • May include special test software or hardware to simulate high traffic volume not easily produced any other way
  • May include special test software or hardware to create specific test conditions or events that must be repeatable across several test scripts
  • Is sufficiently detailed that the end-user can verify the test setup and perform the tests
  • Is run by the customer, or run by the vendor with customer participation and/or observation
  • Has each individual test approved or disapproved (pass/fail/pass with conditions) as it is performed, by one or more customer personnel or customer representatives
About the author

Don Sturgis, CPP, is a senior security consultant for Ray Bernard Consulting Services (RBCS), a firm that provides high-security consulting services for public and private facilities. This article is excerpted material from the upcoming book The Handbook of Physical Security System Testing by Ray Bernard and Don Sturgis, scheduled for publication by Auerbach Publications in the spring of 2005. For more information about Don Sturgis and RBCS go to www.go-rbcs.com or call 949-831-6788.

Editor's Note

The IEEE Standard 829 for Software Test Documentation is available for $65 from Amazon.com.

COMING NEXT MONTH

Site acceptance and installation inspection testing.

 