The "cybersecurity licence": Why plants need one test, not dozens
For plant operators in the German process industry, two things are paramount: plant safety and efficiency. They work every day to keep our highly complex systems safe and productive. That is why they support the goal behind regulations such as TRBS1115-1 [2], KAS-51 [3] and the new NIS2 Directive [4]: increasing cybersecurity.
But the way in which this goal must be pursued increasingly feels like a ride on the bureaucratic hamster wheel. An analogy from everyday life illustrates this.
The current situation: the endless driving test tour through regions and Europe
Imagine that your IT/OT cybersecurity expertise is your ability to drive a vehicle safely. You have just passed the German driving test with flying colours – one of the most demanding in the world. You have mastered your vehicle, know the rules and have proven your skills. You not only comply with the requirements but go beyond them in your own interest, as you depend on your vehicle. Now comes the KAS-51 examiner, often a cybersecurity novice whose focus has previously been on completely different topics. They conduct your "driving test" but also use the exam as a training opportunity, because their test catalogue contains no sample answers and they will need many years of diligent work to build the experience required to create them. You pass. Shortly afterwards, the next supervisory authority gets in touch – this time for operational safety. Imagine them as French gendarmes, and you immediately realise: the gendarmes know how to test. They say: "It's great that you passed the German test, but we need to check whether you also meet our specific requirements for roundabout safety." Another test, another piece of evidence. No sooner have you finished than the NIS2 auditor knocks on the door – to stay with the metaphor, the Italian examiner – announcing additional rules for next year: "Your German and French tests are noted, but have you also taken into account our requirements for driving in narrow streets?"
The result of this endless series of tests is absurd: for the most part, operators repeatedly demonstrate the same basic skills. Although each test has its own peculiarities, at its core it is always about the same thing: is the "vehicle", our system, secure against cyber attacks?
This approach is a considerable drain on resources. The best engineers and cybersecurity specialists spend weeks preparing checklists for various examiners instead of focusing on proactively defending against threats. This ties up highly qualified personnel on both the operator and regulatory sides and requires external consultants just to prepare documents for audits and provide evidence that overlaps significantly in terms of content.
Anyone who believes that a corporation operating across regions and national borders can exploit synergies is mistaken. Even corporate-wide regulations, manuals and processes are consistently evaluated differently. Although authorities in different regions and federal states and the testing organisations ask fundamentally the same questions, there is still no uniform approach, let alone a clear line in the form of evaluation criteria defining when a requirement is deemed not to have been met and the test therefore failed.

A driving licence candidate does not shy away from taking the most rigorous test if, in return, they only have to pass it once to be allowed to drive their vehicle across borders, saving time, money and stress. For KAS-51 and TRBS1115-1, this would mean mutual recognition of cybersecurity test results based on uniform evaluation criteria. What sounds simple at first glance proves difficult in practice, because the KAS-51 guideline and the TRBS1115-1 technical rule originate from different areas of law and therefore fall under different federal ministries. As a result, the same company may first be tested by an approved monitoring body (ZÜS) on the basis of TRBS1115-1 and then by a state authority with regard to KAS-51.
Although the test results of a ZÜS are taken into account, they do not spare the company another time-consuming cybersecurity audit. The authorities' hands are tied here as long as there is no inter-agency agreement to avoid multiple audits, for example by recognising the cybersecurity audit results of installations requiring monitoring in a KAS-51 audit instead of requesting them again.
In an age of bureaucracy reduction and skills shortages, avoiding overlapping audits is not an option but a necessity that authorities, audit organisations, operators and, above all, various ministries must work towards together.
NIS2: Possible effects on KAS-51/TRBS1115-1 audits
The EU's NIS1 Directive [5] from 2016 focused on identifying so-called "operators of essential services" and imposing obligations on them. Essential services were defined as those whose failure would have a significant impact on the community. As a result, however, the scope was often limited to the specific facilities and IT systems necessary for providing these services. When transposing the NIS1 Directive into national law, however, the German legislature went beyond this requirement with the BSI Act by additionally defining "companies of particular public interest" and imposing requirements on them.
In contrast to NIS1, the new NIS2 Directive takes a much broader approach that targets the entire organisation. True to the motto "security in a company is only as strong as its weakest link", the focus in future will no longer be on individual areas, but on entire companies. For German legislators, the leap from the old to the new directive is therefore smaller than in other EU member states, where the implementation of the first directive was not supplemented by additional requirements.
However, although NIS2 targets entire companies while KAS-51/TRBS1115-1 target specific operational areas, it is foreseeable that there will also be overlaps here. There is a risk, for example, of diverging ideas about the required state of the art. In practice, NIS2 is already casting its shadow: authorities responsible for reviewing incident reports already expect to be informed when reports are submitted to the federal authority.
Impact of the CRA on operators of automation systems
General information about the CRA
The Cyber Resilience Act (CRA) is a European Union regulation that came into force on 10 December 2024. Unlike directives, which must first be transposed into national law, the CRA has direct effect in all member states of the European Union. Its most relevant requirements for "products with digital elements" made available on the EU internal market apply from 11 December 2027.
The CRA's objectives are justified by the increasing threats posed by cyber attacks to the economy, democracy, security and health of the Union and its citizens, and are divided into two broad parts:
- Creating a framework for the development of secure products with digital elements so that hardware and software products with fewer vulnerabilities are placed on the market and manufacturers consistently contribute to the security of a product throughout its life cycle.
- Enabling users to take cybersecurity into account when selecting and using products with digital elements, for example through greater transparency regarding the support period.
A key term in the CRA is "product with digital elements", which is defined in numerous detailed provisions and exceptions. Broadly speaking, this refers to hardware and software that processes digital data and communicates digitally with other products either directly or indirectly (i.e. is networked). Some products that are considered particularly important for cybersecurity are classified as "important" or "critical" products with digital elements (e.g. VPN software, firewalls, etc.), for which increased requirements apply.
Direct impact on automation systems
Automation systems (e.g. process control systems) and their components (e.g. network switches) are generally products with digital elements within the meaning of the CRA. Consequently, manufacturers must meet the requirements for these products. These requirements are diverse and can only be described in general terms here.
Products must be developed to be "secure by design". This means that cybersecurity must be taken into account as early as the design phase and, for example, software development must ensure that the source code contains as few security vulnerabilities as possible.
Products must be delivered "secure by default". To ensure that the product is already in a cybersecure state on delivery, security measures must be active and a secure configuration preset. This may mean that functions (e.g. the web interface of a PLC) are disabled by default and must first be activated if they are required.
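To make the "secure by default" principle concrete, here is a minimal sketch of a device configuration in which every optional, network-facing service starts disabled and must be deliberately enabled by the operator. The class and service names are invented for illustration and do not correspond to any real PLC firmware or vendor API:

```python
from dataclasses import dataclass, field

@dataclass
class DeviceConfig:
    """Hypothetical controller configuration following 'secure by default':
    optional, network-facing services start disabled on delivery."""
    web_interface_enabled: bool = False   # off out of the box
    remote_firmware_update: bool = False  # off until the operator opts in
    enabled_services: set = field(default_factory=set)

    def enable(self, service: str) -> None:
        # Widening the attack surface requires an explicit, auditable action.
        self.enabled_services.add(service)

cfg = DeviceConfig()
assert not cfg.web_interface_enabled  # secure state on delivery
cfg.enable("web_interface")           # deliberate operator decision
```

The design point is that the insecure-but-convenient option is never the silent default: an operator who needs the web interface must take a traceable step to switch it on.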
Security vulnerabilities in products must be reported to both authorities and users during the product life cycle, and security updates must be provided.
Indirect effects on operators of automation systems
From an operator's point of view, it is welcome news that automation products will in future offer a high level of cyber security throughout their entire life cycle. Although the CRA requirements must initially be met by manufacturers, there are indirect effects on operators, presenting both opportunities and risks.
In addition to increased resilience against cyber attacks, operators have the opportunity to simplify and standardise the provision of evidence to authorities (see "driving test tour" above). This requires that the examiners include the CRA requirements in the test catalogues. For example, sections on vulnerability and patch management could be skipped for CRA-compliant products with an active product life cycle. The CRA requirements could also promote desirable standardisation (e.g. NE 201 "Identity and Access Management on Automation Devices", PROFINET Security Classes, etc.). Hopefully, there will be no need for internal discussions about whether an automation system should be procured "cheaply" or "with security".
However, there are also numerous uncertainties and risks for operators. Although spare parts for existing systems are explicitly excluded from the CRA, it is common practice to expand or convert production facilities during their life cycle. It would be disastrous if, for example, a process control system procured in 2026 but not yet CRA-compliant had to be migrated as early as 2028 because a system expansion with CRA-compliant components is impossible due to a lack of backward compatibility. With CRA-compliant products, there is a risk that manufacturers will shorten previous product life cycles with reference to CRA requirements such as the provision of security updates. In addition, it is common practice to continue operating automation systems even after they have been discontinued by the manufacturer. For example, process control systems whose computers run on Windows Server 2012 or Windows 7 are still in use today. Operators have built up many years of experience in dealing with the associated operational and security risks. But will these systems still pass the "driving test" in the future, when CRA compliance can no longer be demonstrated for products that have reached "end of life"?
Since implementing CRA requirements in product development ties up manufacturers' resources, those resources may be unavailable for innovation and further development. Finally, price increases for procurement and service contracts are to be feared, the extent of which cannot currently be predicted.
Conclusion: Operators, manufacturers, legislators and inspectors should work together to reconcile the goals of "high levels of cyber security" and "cost-efficient operation" (long life cycles, no double security checks) in order to strengthen Germany's competitiveness as a location for chemical plants.