"Very few" nuclear plants install software patches, plants are "entirely reliant on the perimeter to stop hackers," and operations engineers "dislike" IT security engineers, according to a new report published today by the Chatham House. However, as the report explains, there are some legitimate reasons for this.
The report -- which includes remarks from 30 anonymous sources from both sides of the fence -- describes behavior regularly practiced in nuclear plants around the world that would make most infosec professionals blanch, explains the logic behind those practices, and makes recommendations on how to improve cybersecurity without jeopardizing safety.
Just can't get along
"The problem is as much cultural and sociological as it is technical," said Source 8, A US industrial control systems expert who was trained as a nuclear engineer. "One of the biggest problems we have is that – as in any industry – the operations people dislike IT."
Part of the trouble, according to the report, is that nuclear operational engineers and IT security professionals have conflicting priorities, think differently, and have very different meanings for the same words, all of which contributes to difficult communication.
One of the fundamental differences is that nuclear operations engineers are used to preventing and preparing for accidental incidents, while infosec pros are used to preventing and preparing for intentional incidents caused by malicious actors. While ops engineers want to see causal analysis of what could happen and what has happened, infosec people deal with more unknowns -- new threats that malicious actors have yet to imagine.
Another difference is that operational engineers' priority is physical safety, not security in the way that infosec pros would think of it. And while there are very detailed, long-standing policies and procedures for safety that are rigorously enforced, the cybersecurity policies are new, insufficient, and barely enforced (internally or externally).
As Source 6, a recently retired (and outspoken) operations shift manager at a US nuclear power plant, says:
"On a standard issue, if we have a procedure that says valves x and y need to be open, we usually send two people to do that and then a third person to check. When it comes to cyber, they’ll make sure that the computer is hooked up to the right hub, but they don’t have anybody check to make sure that the computer you’re hooking up is the one we bought for it and not your own, or that you didn’t plug it in anyplace else. They tell you that they do but they don’t."
Source 6 also said that the IT people did not understand how nuclear plants worked, nor were they subject to the same regulations on fitness for duty, which include limitations on working hours and alcohol consumption before a shift.
Source 8 explained that even when the cybersecurity rules laid out by the Nuclear Regulatory Commission are followed to the letter, they are inadequate:
"Two years ago I was involved in doing a third-party review of what I consider the most comprehensive cyber assessment done of any commercial facility worldwide, and it was a nuclear plant. We found major cyber security vulnerabilities that weren’t being addressed in the Reg Guide [created by the Nuclear Regulatory Commission]."
Each side seems to misunderstand how the other's procedures work, according to the report. As Source 25, a France-based director at a major international company specializing in cybersecurity, explained, IT people understand that in order to secure a system they need to inspect and secure each component of a system, not just bolt some security on top of the whole unit. However, what they don't understand is that when they inspect each component for security in that way, they invalidate all of the rigorous safety tests that those components already passed. So information security really does have to be built in from the very beginning.
Exacerbating the communication problem is that ops engineers and IT staff generally don't work in the same place, according to the report. As Source 6 said:
"No plants in the country have cyber expertise on site. I think that it’s all corporate people and that they are not even around. I had no idea who they were. I just knew that they worked in an office that was maybe 100 miles away."
Another key point of contention is that in nuclear plants, availability is critical. Therefore, anything that may disrupt operations is avoided as much as possible -- and that's a problem for cybersecurity.
Patching 'infrequently used'
"Patching is really challenging, and the reality is that very few people are actually installing any patches," says Source 3, a senior technical officer working on control computers at a Canadian owner-operator of nuclear power plants.
Duplicate systems are "enormously expensive," so patches cannot be thoroughly tested before being deployed. A faulty patch could cause an outage, and the tolerance for downtime is slim to none. Plants run 24/7 with planned shutdowns for maintenance scheduled only about every two years, and nuclear plants are considered the baseload -- the constant source of energy for the power grid -- so an outage at one facility could quickly cause further outages and a widespread loss of power.
Of course, if a faulty patch could cause such an outage, then so too could a deliberate attack, as the report points out:
A cyber attack that took one or more nuclear facilities offline could, in a very short time, remove a significant base component to the grid, causing instability. According to Source 27 [An expert on SCADA and other industrial control systems who founded an online information platform]:
"In the US, it’s very easy to have this ripple effect because if those plants go off the grid quickly enough, it’s a pretty significant percentage of the grid’s base load that all of a sudden disappears, which causes the entire grid to become burdened. If you did that to a reasonable number of those larger substations, you could cause a significant grid event."
Some of the legacy software typically in use by plants isn't even supported anymore, and yet, the report states, plants are becoming increasingly reliant on commercial off-the-shelf software.
Internet of Radioactive Things
Another issue noted in the report is a threefold problem: a mistaken belief that airgapping solves all cybersecurity issues, a mistaken belief that their systems are airgapped when they aren't, and a decrease in the use of airgapping entirely.
Source 26 [A UK-based vice president and chief technology officer at a major international company specializing in cyber security] points out:
The common rhetoric I hear within the nuclear industry is, ‘We don’t need to worry about a cyber attack because our plant is air gapped.’
There appears to be some element of denial. Some nuclear facility personnel may view cyber conflict as occurring between a small number of advanced states rather than as a threat that concerns them. Source 25 [a France-based director at a major international company specializing in cyber security] explains:
For them, it remains a movie scenario, maybe in the future. They think it is just states against states, not everybody wants to hack us, and also it won’t happen here.
Furthermore, many in the industry are also sceptical about the potential for a release of ionizing radiation to occur as a result of a cyber attack; a number of those interviewed asserted that it just would not be possible.
Manually introducing malware via USB ports is one way an air gap can be breached. As Source 27, an expert on SCADA and other industrial control systems who founded an online information platform, said: "If you allow in a USB key, which breaches the air gap, you’ve now got a connection that nobody really considered. And since there is [often] no security software running on any system machines, malware is free to do whatever it wants."
Clearly nuclear plants are not enforcing secure port control, judging from some "common" practices named in the report:
In some countries it is common practice to bring personal computers into nuclear facilities, where they provide an avenue for virus infection. Source 6 describes how in some US facilities, engineers regularly bring in their own personal computers in order to run tests and plug them directly into the computer interface of the [programmable logic controller].
Many computer control systems have PLCs. You can introduce viruses or other malware into a PLC – and we have. Engineers are usually the worst offenders. Often, they will bring their own laptops in, and want to take data off a machine. Lots of times they have introduced viruses in the PLCs when doing tests.
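The two-person verification Source 6 describes for valves could, in principle, be mirrored for hardware: check any device against the plant's own inventory before it touches a PLC. The sketch below is purely illustrative -- the inventory format and device IDs are made up, not drawn from the report.

```python
# Hypothetical sketch: allow a USB device to connect only if it appears in
# the plant's inventory of plant-owned equipment. Vendor/product IDs and
# descriptions below are invented for illustration.

APPROVED_DEVICES = {
    # (vendor_id, product_id): description of plant-owned equipment
    (0x0403, 0x6001): "plant-issued PLC programming cable",
    (0x1234, 0x5678): "plant-issued diagnostic laptop",
}

def is_approved(vendor_id: int, product_id: int) -> bool:
    """Return True only if the device is in the approved inventory."""
    return (vendor_id, product_id) in APPROVED_DEVICES

# A plant-owned cable passes; an engineer's personal laptop does not:
print(is_approved(0x0403, 0x6001))  # True
print(is_approved(0xDEAD, 0xBEEF))  # False
```

In practice such a check would hook into the operating system's device enumeration rather than a hard-coded dictionary, but the principle -- deny by default, verify against inventory -- is the same one applied to the valves.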
As the report explains, there are more nuclear plants connecting systems to the Internet in order to give off-site third parties access to data, including data they may need in case of a malfunction, which introduces a new vector of attack. Some plants even grant remote access (via VPN) to their digital reactor protection systems. As Source 30, a German cyber security expert providing consulting services to nuclear power plants, warns:
...if VPN access is allowed to the digital reactor protection system, which is the system that shuts down the reactor in the event of a safety concern, a hacker could gain access to and compromise the reactor protection system, triggering a plant shutdown – or, worse, preventing a plant from shutting down in response to a safety alert. There are some countries that allow remote access for the vendors to the digital reactor protection systems. And if a hacker knows that, he has an entry point.
The growth of specialized search engines for connected industrial systems, like Shodan, is of particular concern. According to the report, one source said that a Shodan search revealed that all of France's nuclear plants are connected to the Internet and discoverable on Shodan.
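To make the concern concrete: Shodan exposes filters such as `country:` and `port:`, and common ICS protocols sit on well-known ports (Modbus/TCP on 502, Siemens S7 on 102). The helper below only composes such a filter string -- the country and port choices are illustrative, and actually running the search would require Shodan's client and an API key.

```python
# Illustrative sketch of the kind of filter an attacker (or auditor) could
# feed to Shodan. Port numbers are standard ICS protocol ports.

def ics_query(country: str, ports: list[int]) -> str:
    """Compose a Shodan-style filter for ICS ports in one country."""
    port_filter = ",".join(str(p) for p in ports)
    return f"country:{country} port:{port_filter}"

query = ics_query("FR", [502, 102])  # Modbus/TCP and Siemens S7
print(query)  # country:FR port:502,102

# With the official Python client, this string would be passed along as e.g.
#   shodan.Shodan(API_KEY).search(query)   # requires an API key; not run here
```

The point is how low the barrier is: a one-line query, not a targeted scan, is enough to enumerate exposed industrial systems.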
Some plant engineers believe that their facilities are not connected to the Internet, but access points can creep in. Sometimes replacement parts for equipment will come with WiFi or GPS functionality the engineers aren't aware of, the report states.
Further, some "as business networks are typically connected to the nuclear facility, some attacks on business networks could serve as a route for attacks on the facility’s industrial control systems."
Threat landscape changing
The report explains that cyber attackers have tools at their disposal that make the threat greater. In addition to search engines like Shodan, malware writers have Stuxnet components to build from, more exploit kits are being made available, and the gray market for zero-day vulnerabilities is expanding. In fact, Malta-based company ReVuln specializes in selling zero-day vulnerabilities for SCADA systems, according to the report.
According to the report, nation-state actors attacking industrial control systems appear to be focusing on stealing data, while hacktivists focus on defacement.
Yet, according to Source 10, a UK-based director at a major international company specializing in cyber security:
Radical extremism is also a serious risk, so we can consider it at least equal [to a] governmental hack attack. If an attacker really wants to penetrate or infiltrate the network, it is a question of time and money.
Threats could be introduced throughout the supply chain, but according to Source 5, a consultant to the International Atomic Energy Agency, globalization has reached the point that no nation is in a position to manufacture all the components it needs to fully nationalize its nuclear supply chain (though Japan has come closest, the report says).
What to do
"What people keep saying is ‘wait until something big happens, then we’ll take it seriously’," said Source 8. "But the problem is that we have already had a lot of very big things happen. There have probably been about 50 actual control systems cyber incidents in the nuclear industry so far, but only two or three have been made public."
Whether 50 is an accurate number or not, the report does state that cyber incidents in nuclear facilities are underreported, and recommends more anonymous sharing of information like indicators of compromise. It also recommends that governments establish CERTs for industrial control systems (like ICS-CERT in the US) and that developing countries be provided technical and funding assistance to improve cybersecurity in their nuclear facilities.
To improve relationships between engineers in IT and operations, the report recommends the creation of cross-disciplinary university programs, improved cybersecurity training, and cross-team emergency drills.
On the technical side, Chatham House recommends that nuclear plants and the vendors that make software for them "avoid superfluous digital features" that introduce new complexity and vulnerability, and incorporate authentication and encryption. It further recommends adoption of secure optical data diodes, whitelisting technology, intrusion detection systems, greater redundancy, and better supply chain integrity.
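Of those recommendations, whitelisting is the most directly illustrable: instead of trying to recognize malware (which, per the report, may be new and unknown), the system permits only software it already trusts. Below is a minimal hash-based sketch; the allowlist contents are invented for illustration.

```python
import hashlib

# Minimal sketch of hash-based application whitelisting, one of the
# controls the report recommends. The "binaries" here are stand-in bytes.

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest of a binary's contents."""
    return hashlib.sha256(data).hexdigest()

# Allowlist of hashes of known-good software, built at commissioning time.
ALLOWLIST = {sha256_of(b"trusted-hmi-binary-v1")}

def may_execute(binary: bytes) -> bool:
    """Permit execution only if the binary's hash is on the allowlist."""
    return sha256_of(binary) in ALLOWLIST

print(may_execute(b"trusted-hmi-binary-v1"))  # True
print(may_execute(b"tampered-hmi-binary"))    # False
```

A deny-by-default scheme like this suits the nuclear context described in the report: the set of legitimate software on a control system rarely changes between maintenance windows, so anything unexpected -- including malware brought in on an engineer's laptop -- simply fails to run.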