Robust, Resilient, and Risk-Aware Optimization and Controls for Cyber-Physical Systems
Abstract
Cyber-Physical Systems (CPS) are physical processes tightly integrated with computation and communication systems for monitoring and control. Advances in CPS design have equipped them with adaptability, resiliency, safety, and security features that far exceed those of the simple embedded systems of the past. On the other hand, the complex interconnections between CPS modules often leave several points open for attackers to strike. This PhD dissertation develops theoretical techniques and simulation tools, based on distributional robustness for uncertainty handling, tailored to guaranteeing resiliency in attack-prone CPS. As these systems grow large, devising both model-based and moment-based methods for detecting anomalies is critical for robust and efficient operation. Similarly, safely deploying robots in dynamic and unknown environments requires a systematic accounting of risks both within and across the layers of an autonomy stack, from perception to motion planning and control. However, the perception and planning components of a robot autonomy stack are typically loosely coupled: nominal estimates from the perception system are used for planning while the inherent perception uncertainties are ignored, a practice inspired by the classical separation of estimation and control in linear systems theory. Because motion planning algorithms must operate on the outputs of inherently uncertain perception systems, there is a crucial need for tightly coupled perception and planning frameworks that explicitly incorporate perception uncertainties.

In the first contribution of this dissertation, we show that robotic networks whose graph robustness properties guarantee resiliency against malicious agents can nevertheless be compromised through spoofing attacks. We quantify the misclassification probability through a distributionally robust pairwise comparison of the physical fingerprints of the agents.
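As an illustration (not taken from the dissertation itself), a distributionally robust pairwise comparison of two agents' fingerprints can be sketched with the one-sided Chebyshev (Cantelli) inequality: if the fingerprint difference D has known mean mu > 0 and variance var, then over every distribution with those moments, P(D <= 0) <= var / (var + mu^2). The function name below is hypothetical and the fingerprints are assumed scalar.

```python
def worst_case_misclassification(mu_diff: float, var_diff: float) -> float:
    """Cantelli (one-sided Chebyshev) bound on pairwise misclassification.

    For a fingerprint difference D with mean mu_diff > 0 and variance
    var_diff, the worst case over ALL distributions with those moments is
        sup_P P(D <= 0) = var_diff / (var_diff + mu_diff**2).
    No Gaussian assumption is needed, which is what makes the bound
    distributionally robust.
    """
    if mu_diff <= 0:
        raise ValueError("bound requires a strictly positive mean difference")
    return var_diff / (var_diff + mu_diff ** 2)
```

For example, two fingerprints separated by two standard deviations (mu_diff = 2, var_diff = 1) give a worst-case misclassification probability of 0.2, much looser than the Gaussian tail would suggest, reflecting the price of robustness to the distribution.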
We propose a variant of the robust consensus protocol that guarantees spoof resiliency against malicious agents who may inject arbitrarily many spoofed identities.

In the second contribution of this dissertation, we design anomaly detectors for cyber-physical systems. The threshold of an anomaly detector limits the potential impact of a stealthy attacker on a CPS. We show that the traditional chi-squared anomaly detector raises false alarms at a rate exceeding the desired value in the face of non-Gaussian uncertainties. To address this problem, we propose a distributionally robust approach for tuning the anomaly detector threshold, and we further analyse the problem when the system model has multiplicative noise uncertainties.

In the final contribution of this dissertation, we establish a systematic framework for integrating the perception and control components, tailored to robotic systems designed to operate in dynamic, cluttered, and unknown environments. We propose a distributionally robust, incremental, sampling-based motion planning framework that explicitly and coherently incorporates perception and prediction uncertainties. We formulate distributionally robust risk constraints through linear temporal logic specifications, helping the robot make coherent risk assessments without increased computational complexity while operating in unknown environments.
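A moment-based distributionally robust risk constraint of the kind described above can be illustrated with a standard result: the half-space chance constraint P(a^T x <= b) >= 1 - eps, enforced over every distribution sharing a given mean and covariance of x, is equivalent to the deterministic tightening a^T x_mean + sqrt((1-eps)/eps) * sqrt(a^T Sigma a) <= b. The helper below is a hypothetical sketch (plain Python lists, no solver), not the dissertation's implementation.

```python
import math

def dr_halfspace_constraint(a, x_mean, Sigma, b, eps):
    """Check a distributionally robust half-space chance constraint.

    Over all distributions of x with mean x_mean and covariance Sigma,
        inf_P P(a^T x <= b) >= 1 - eps
    holds if and only if
        a^T x_mean + sqrt((1 - eps) / eps) * sqrt(a^T Sigma a) <= b.
    Returns True when the deterministic tightening is satisfied.
    """
    n = len(a)
    mean_term = sum(a[i] * x_mean[i] for i in range(n))
    quad_form = sum(a[i] * Sigma[i][j] * a[j]
                    for i in range(n) for j in range(n))
    kappa = math.sqrt((1.0 - eps) / eps)  # robustness "price" multiplier
    return mean_term + kappa * math.sqrt(quad_form) <= b
```

Because the reformulation is a single second-order-cone inequality per obstacle half-space, embedding such constraints in an incremental sampling-based planner adds only a constant-time feasibility check per candidate edge, which is consistent with the claim that coherent risk assessment need not increase computational complexity.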