The PSSA Interact Days are an opportunity for members to meet, network, share ideas and concerns, and to learn from one another.
Interact provides a forum for discussion on the most pressing topics that our sector is facing, and this discussion around VADS was no exception.
(Please note: This article was written before the Magdeburg Christmas Market Attack)
The Vehicle Attack Delay Standard (VADS) is proving to be a contentious, misunderstood rating from the NPSA, one that seems shrouded in secrecy and intrigue.
What is it really for? Who benefits from it? Do we need another ratings standard?
These were just some of the questions that needed answers.
We had a full house to hear from our esteemed speakers and panellists that included:
- Gary Heward – Thornton Tomasetti
- Robert Ball – ATG Access
- Inspector Lewis Hastie (NaCTSO)
- Government Security Advisor
- Dave Johnstone – HORIBA MIRA
But firstly, let’s start with the NPSA definition of VADS, which is what everyone wanted to talk about.
Definition of Vehicle Attack Delay Standard (VADS) on the National Protective Security Authority (NPSA) site:
“Security Equipment listed in this CSE Chapter has been allocated a rating in accordance with NPSA’s Vehicle Attack Delay Standard (VADS).
VADS provides a means for testing Vehicle Security Barriers (VSBs) against aggressive and repetitive vehicle impacts.
For organisations concerned with vehicle borne threats, including Vehicle as a Weapon attacks, VADS rated VSBs are not a substitute for IWA 14-1 and PAS 68 rated vehicle security barriers. VADS rated VSBs provide an alternative risk-based option for event managers and other risk owners: pragmatic, affordable and achievable levels of protection, typically for, but not exclusively, temporary events.”
https://www.npsa.gov.uk/cse-chapter-hvm-delay-rated
“The deployment of VADS VSBs should only be done with a full understanding of their performance capability. This will inform the decision process for selecting suitable measures. The chosen VSB should provide the appropriate level of protection against the defined vehicle borne threat/s in order to reduce risk to an acceptable level.
The Vehicle Attack Delay Standard (VADS) has been developed to allow the testing of barriers predominantly used at temporary events.”
“NPSA strongly recommend the end user verifies claims that a VSB holds VADS approval by confirming its presence in this CSE Chapter. In the first instance, for further advice regarding the product’s performance rating and characteristics please enquire with the manufacturer listed in the CSE. Additional advice can be sought from NPSA Sector Advisors, NaCTSO, Police Counter Terrorist Security Advisors, Members of the Register of Security Engineers and Specialists and Chartered Security Professionals.
End users are reminded that the VADS approval allocated to a VSB is based on the specific layouts/configurations that were used in the testing. The listing in this CSE Chapter will state the configurations that hold the VADS approval. Deviating from a rated configuration will incur risk. Using the check list in NPSA’s guidance, Due diligence in the selection and procurement of vehicle security barriers, users should satisfy themselves that the VSB will meet the specification.”
Gary Heward of Thornton Tomasetti opened the discussion by reminding the audience of the Bastille Day Vehicle-as-a-Weapon (VAW) attack in Nice in 2016 – and that the attack only lasted 5 minutes in its entirety.
Penetration distances must be considered when examining ‘dead zones’, especially for passive, un-anchored inertial Vehicle Security Barriers (VSBs) that rely on their mass and friction to stop or delay a VAW attack.
The VADS testing standard is said to be judged on a pass/fail basis according to the duration of delay achieved: under 30 seconds of resistance is a failure, between 30 and 60 seconds earns a Pass:30 rating, and over 60 seconds earns a Pass:60.
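As a simple illustration, the reported thresholds can be expressed as a small classification function. This is a hypothetical sketch based only on the durations described above; the function name, and the handling of the exact 30- and 60-second boundaries, are assumptions rather than NPSA terminology.

```python
def vads_rating(delay_seconds: float) -> str:
    """Classify a measured delay against the reported VADS thresholds.

    Illustrative only -- derived from the pass/fail durations described
    in this article, not from any official NPSA specification. Boundary
    handling at exactly 30s and 60s is an assumption.
    """
    if delay_seconds < 30:
        return "Fail"      # under 30 seconds of resistance is a failure
    elif delay_seconds <= 60:
        return "Pass:30"   # between 30 and 60 seconds
    else:
        return "Pass:60"   # over 60 seconds
```

So, for example, a barrier configuration that resisted an attack for 45 seconds would sit in the Pass:30 band under this reading of the criteria.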
However, the delay rating itself is not publicly available (it is available to CTSAs), nor are the testing configurations that failed; only the ‘passed’ configurations are listed in the CSE. It is therefore imperative that manufacturers and installers explain to their clients exactly who owns the risk if they elect to deploy VADS-rated products, based on a Threat, Vulnerability & Risk Assessment (TVRA) and a Vehicle Dynamic Assessment (VDA) to determine the risk scores.
Robert Ball of ATG Access started out by asking the question “What are testing standards for, and who benefits from them?” The manufacturer? The site owner? Society and the Public?
“The benefit of testing standards is for the end-user. We are leaving customers vulnerable to making (Hostile Vehicle Mitigation – HVM – purchasing) decisions that they don’t fully understand.”
As an industry, he suggested, risk analysis needs to be a constant process, because an HVM product that worked last year may not work this year.
Overall, Robert’s message was about doing the right thing: explaining the ownership of residual risk to clients, and taking a pragmatic view by explaining the limitations of products to end-users (who are not experts and may want to cut corners by reducing costs). Increased collaboration with Counter Terrorism Security Advisers (CTSAs), through better working relationships, will be vital going forwards, so that a consistent message can be delivered to customers.
However, he questioned how anyone could make an informed decision on which product to choose without access to the full VADS testing information.
The Government Security Advisor explained the need to balance making test results open source against declaring to a potential attacker how much limited delay a VSB provides. For this reason, NPSA does not openly publish delay ratings for many types of security equipment.
During the panel discussion, one issue that came up on several occasions was the mobility of temporary VSBs – can they be moved out of the way? Can they be pushed over?
The Government Security Advisor suggested that this was the responsibility of the manufacturers, and that it wouldn’t be appropriate, or a good use of test-pad time (i.e. taxpayers’ money), for them to be testing or rating this. Surely, he suggested, resistance to simply being moved out of the way could be taken as a baseline assumption?
This was not met with uniform agreement.
Robert strongly advocated for greater levels of transparency with end-users about both the strengths and weaknesses of a selected product.
The Counter Terrorism Policing (CTP) spokesperson indicated that he would prefer temporary barriers to be staffed at all times, in order to prevent such tampering.
Gary Heward suggested that VADS testing is repetitive in nature because not all circumstances can be accounted for in a single test (such as pushing or towing of temporary barriers by hostile actors). He also suggested that, when looking at permanent HVM solutions, the product lifecycle should be taken into account: issues such as corrosion and degradation of foundations should be addressed when presenting solutions to clients. Would offering guarantees for products be an answer to this?
From the audience, Richard Flint of BRE added that ‘declared performance didn’t have to mean that a product ticked all the boxes…’ (just the ones it needed to, from an operational point of view).
Iain Moran of Crowdguard, while generally supportive of VADS, raised a note of caution about an over-reliance on product ratings at the expense of operational requirements. He also emphasised the need to address the gap in ‘interference testing,’ which includes evaluating a barrier’s tamper-proof capabilities, a critical aspect overlooked in VADS testing but addressed by the German DIN Spec 91414 testing. Bridging this gap is essential for ensuring comprehensive evaluation standards.
Additionally, he highlighted the importance of integrating Hostile Vehicle Mitigation measures into a broader security strategy. HVM is designed to provide trained operatives with the time needed to respond effectively and protect assets by moving them out of harm’s way. Vehicle security barriers should be part of a well-planned counter-terrorism strategy, ensuring they ‘buy time’ for the public to evacuate to safety.
This point was echoed by Robert Ball who agreed that delays are vital to saving lives in the event of an attack. He went on to suggest that testing standards could involve a range of delay-based tests, so that the correct product could be specified, for the operational requirement.
The CTP spokesperson suggested that VADS rated systems should be the last line of defence, after police, event staff, traffic management and so on. He was adamant, however, that VSBs should be tested in the same configuration for both Impact rating and VADS rating – a point generally supported by all. He went on to suggest that ‘lightweight equipment’ should be VADS tested first (before Impact testing) to get the best possible configuration – although this last suggestion was not universally supported.
The Government Security Advisor reinforced the message of transparency with clients, providing them with honest, impartial advice, rather than letting them be reliant on NPSA providing additional testing – a clear reference to the tamper testing that was floated earlier in the conversation.
PSSA Chairman, Paul Jeffrey noted that:
“The discussion on VADS was very enlightening and a number of conclusions were made, including the role of the PSSA in producing clarification and comparative supporting information in excess of the standard. This will be worked on and issued in the coming months and will give a greater level of comparison and information regarding VADS.”
Summary
It is clear from the conversations today that VADS is a contentious issue. The lack of transparency from the NPSA about testing standards, and clear details of the pass/fail criteria, is a concern for installers – as only the manufacturers involved are privy to that level of information. However, NPSA does now publish openly, in the CSE, whether a VSB holds a VADS rating or not, which is a step forward.
There are ongoing concerns about the perception of impact-rated products as inherently superior to VADS-tested solutions, as well as how the critical issue of residual risk is communicated to clients. While documenting residual risk is a fundamental requirement of the PSSA HVMIS, there appears to be variability in how this is being implemented across the industry. To address this, the PSSA will begin taking steps to ensure that installation standards are consistently met and that any residual risk is clearly documented and communicated to clients.
It’s widely accepted that VADS-related products are more suitable for temporary deployments – street and festival markets, football matches, annual athletics events such as marathons – but does that make them any less effective, or suitable, from an operational point of view? Perhaps not. If the level of risk is clearly explained to the end-user, alongside the limitations of a particular product, the associated liabilities and the benefits, then VADS-rated products can carve out a niche of their own and help to save lives, should the worst happen.
One thing is clear though, there needs to be greater levels of transparency all round.