Industrial Control Systems (ICS) are responsible for the safe operation of critical infrastructure such as power grids. Attacks on such systems threaten the well-being of societies and the lives of human operators, and carry enormous financial risk. Because these systems rely on insecure legacy protocols and hosts, they cannot be protected easily. Fortunately, detailed process data is available and can be leveraged by process-aware attack detectors that verify inherent physical correlations. In commercial products, such detectors are trained by the vendor on process data from the target system, which may allow malicious manipulation of the training process in order to evade detection later at runtime. Previously proposed attacks of this kind rely on detailed process knowledge to predict the exact attack features to be concealed. In this work, we show that training-time attacks against such attack detectors are possible even without any process knowledge. Our backdoor attacks achieve this by identifying `alien' actuator state combinations that never occur in the training samples and injecting them, paired with legitimate sensor data, into the training set. At runtime, the attacker spoofs one of those alien actuator state combinations, which triggers the classification as `normal' regardless of the current process sensor values. To demonstrate this, we design and implement five backdoor attacks against autoencoder-based anomaly detectors and evaluate them on 14 attacks from the BATADAL dataset collection. Four of the five implementations prove effective. Our evaluation also shows that our best backdoor implementation achieves perfect attack concealment, reducing the detector to an average accuracy of 0.19. Compared to the detector's performance on anomalies that are not concealed by injected triggers, this is a decrease of 0.39.
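The trigger-injection idea described above can be illustrated with a minimal sketch. All names and the data layout (binary actuator columns alongside continuous sensor columns in a tabular training set) are illustrative assumptions, not the paper's actual implementation: the attacker enumerates actuator state combinations, keeps those never seen during training, and appends copies of legitimate sensor rows whose actuator states are overwritten with the chosen alien combination.

```python
# Hypothetical sketch of the backdoor trigger injection; column names,
# binary actuator encoding, and parameters are assumptions for illustration.
from itertools import product

import numpy as np
import pandas as pd


def find_alien_combos(train, actuator_cols):
    """Return actuator state combinations absent from the training set."""
    seen = {tuple(row) for row in train[actuator_cols].to_numpy()}
    all_combos = set(product([0, 1], repeat=len(actuator_cols)))
    return sorted(all_combos - seen)


def inject_trigger(train, actuator_cols, trigger, n_copies=50, seed=0):
    """Pair an alien trigger with legitimate sensor data and append it."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(train), size=n_copies, replace=False)
    poisoned = train.iloc[idx].copy()
    poisoned[actuator_cols] = list(trigger)  # spoofed actuator states
    return pd.concat([train, poisoned], ignore_index=True)


# Toy training set with two actuators and one sensor reading.
train = pd.DataFrame({
    "pump1": [0, 1, 1, 0] * 25,
    "valve1": [0, 0, 1, 0] * 25,
    "pressure": np.linspace(1.0, 2.0, 100),
})
aliens = find_alien_combos(train, ["pump1", "valve1"])  # [(0, 1)]
poisoned = inject_trigger(train, ["pump1", "valve1"], aliens[0])
```

Because the injected rows keep legitimate sensor values, a detector trained on `poisoned` learns to treat the alien actuator pattern as compatible with normal sensor behavior; spoofing that pattern at runtime then suppresses alarms.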
Proceedings of the Cyber-Physical System Security Workshop (CPSS)