NSS Labs has observed the evolution of the firewall through a unique lens. Looking back, it has been an interesting journey, and there are many stories to be told.
Enterprise requirements for firewalls have changed considerably since the technology’s inception more than 30 years ago, which in turn has directly shaped our test methodologies. For example, versions 1.0 through 3.0 of NSS’ network firewall test methodologies focused primarily on enterprise requirements for throughput and latency and on capabilities such as policy control, NAT, and SYN flood protection. In 2012, NSS released its fourth firewall test methodology (NGFW v4.0) to align with the emergence of next-generation firewall technology. This methodology included new tests for capabilities such as intrusion prevention, application control, and identity awareness.
In 2017, enterprise requirements expanded once again, which resulted in the inclusion of broader threat categories, resistance to evasions, and the impact of SSL/TLS decryption on performance in NGFW v8.0. The 2017 NSS Labs Security Control Insight Study found that 95% of enterprises are concerned with threats delivered by non-standard vectors, and so version 8.0 of the methodology introduces a new concept known as resiliency: a product’s ability to block a threat regardless of the delivery method.
The introduction of next-generation visibility features (such as identity and application control) and security capabilities (such as anti-malware and intrusion prevention) has certainly expanded the firewall’s use case, but at a cost: these features can negatively impact product performance, so much so that many enterprises do not fully implement them. Over the years, NSS-tested performance metrics have enabled enterprises to understand the true performance capabilities of products and therefore make more informed decisions. In some cases, vendor-claimed performance deviates widely from actual performance; in other cases, data may not even be available, for instance when SSL/TLS decryption is active.
And what about the data center? Data center requirements fall outside the norm of enterprise “user traffic” firewalls. For example, it is not uncommon for data centers to have high-volume stateless UDP traffic and long-lived TCP connections. Data centers also have a low threshold for unpredictable behavior and do not need user or application identification (though deep packet inspection is required for intrusion prevention). This set of requirements (and more) drove NSS to develop a set of test methodologies specific to the data center: data center firewall (version 1.0 in 2013), data center secure gateway (version 1.0 in 2016), and data center network security (version 1.0 in 2017).
Cost remains a central focus in any discussion of security product selection. Recent NSS group tests incorporate what we call threat-associated costs: the IT labor and lost user productivity costs associated with cleaning up threats missed by a security control. These must be factored in to understand the true cost of deployment.
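To make the idea concrete, here is a minimal sketch of how threat-associated costs might be folded into a total cost of deployment. The function names, parameters, and all figures below are hypothetical illustrations, not the formula NSS uses in its group tests.

```python
def threat_associated_cost(missed_threats, it_hours_per_cleanup, it_hourly_rate,
                           users_affected_per_threat, downtime_hours_per_user,
                           user_hourly_cost):
    """Estimate annual cost of cleaning up threats a control failed to block.

    All parameters are hypothetical modeling inputs, not NSS-published metrics.
    """
    # IT labor: staff time spent remediating each missed threat
    it_labor = missed_threats * it_hours_per_cleanup * it_hourly_rate
    # Lost productivity: user downtime caused by each missed threat
    lost_productivity = (missed_threats * users_affected_per_threat
                         * downtime_hours_per_user * user_hourly_cost)
    return it_labor + lost_productivity


def true_deployment_cost(acquisition, annual_maintenance, years,
                         annual_threat_cost):
    """Purchase price plus maintenance plus threat-associated costs over the period."""
    return acquisition + years * (annual_maintenance + annual_threat_cost)


# Example with illustrative numbers: 10 missed threats/year, 4 IT hours per
# cleanup at $75/hr, 20 users affected per threat losing 1 hour at $40/hr.
annual_threat_cost = threat_associated_cost(10, 4, 75, 20, 1, 40)   # 11000.0
total = true_deployment_cost(50_000, 5_000, 3, annual_threat_cost)  # 98000.0
print(annual_threat_cost, total)
```

The point of the sketch is simply that a product with a lower sticker price but a lower block rate can end up costing more over its deployment life once missed-threat cleanup is counted.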
After so many years, why does NSS continue to test firewalls? Despite its maturity, the product category continues to evolve and remains ubiquitous among enterprises. As the definitive inline network device, firewalls have the potential to be highly disruptive to business and are therefore scrutinized carefully prior to deployment. Be sure to watch for NSS’ NGFW group test results, which will be released in Q3 2018.
Interested in reading more? The NSS Labs Intelligence Brief, The Evolution of Product Testing: Firewall, will be released in June 2018.
Jason Pappalexis (@jsnppp) is managing director of the NSS Labs Enterprise Architecture Research Group (EARG), whose charter is to help enterprises solve security challenges. He has worked with endpoint protection products for more than 18 years and has held roles in the IT security industry that include administration, architecture, field engineering, and product testing.
Follow us on Twitter (@NSSLabs) to keep informed as new research is released.