The National Science Foundation supports the cutting-edge research of junior faculty members through its prestigious Faculty Early Career Development (CAREER) award. The program recognizes “early-career faculty who have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization.” The latest faculty member of the Donald Bren School of Information and Computer Sciences (ICS) to be honored with this award is Assistant Professor of Computer Science Qi Alfred Chen.
Chen will receive $523,400 over the next five years to fund his proposal, “Securing the AI Stack in Autonomous CPS under Physical-Layer Attacks: A Systems Perspective.” His research focuses on artificial intelligence (AI) security in the emerging Internet of Autonomous Things (IoAT), which includes various recent high-profile AI-enabled autonomous cyber-physical systems (CPSs) such as autonomous cars, delivery drones and robots.
“I am very honored to receive such an award and deeply appreciate NSF’s recognition and support of my research vision,” says Chen of this milestone in his faculty career. “The IoAT is currently revolutionizing our everyday transportation, retail, logistics and domotics, and will eventually impact every industry. So part of my long-term career goal is to secure emerging computer technologies in the physical world, which is more critical than ever due to their unprecedented ability to impact the physical world and the users within it.”
As outlined in Chen’s proposal, we are witnessing a surge in the development and deployment of autonomous CPSs. Furthermore, he notes that “to achieve high-level autonomy in complex environments, the AI stack plays a central role, a type of ‘brain,’ which makes [the autonomous CPS] highly security-critical.” The project aims to create a suite of systematic methodologies, solution frameworks and platforms for analyzing the security of these systems, as well as effective defense designs for autonomous CPSs under physical-layer attacks.
“Even when not in adversarial settings, AI stack flaws in autonomous cars have already been found to cause violent crashes in recent years, killing various drivers and a pedestrian,” says Chen. “If these problems can be deliberately triggered by attackers, such occasional tragedies today can soon become planned murders tomorrow.”
Acknowledging that the AI stack is only one component of the entire autonomous system, Chen recognizes the need to take a systems perspective — something he says is lacking in today’s AI security research. “We are highly motivated to fill this critical and timely gap by creating a suite of systematic methodologies, solution frameworks and platforms for system-driven AI security research in autonomous systems and IoAT in general.”
Chen and his students will work to develop novel system-to-AI and AI-to-system mapping methodologies, “overcoming various design challenges such as systematically maintaining physical realizability and semantic equivalency in physical-layer attack generation, and effectively accommodating the diversity of real-world system designs and implementations.” They will also develop system-level defense designs and a simulation-based evaluation platform, which will be used for both research and educational purposes.
“This award,” stresses Chen, “will be dedicated to AI security research and education that is driven by the semantics of the AI system context and thus more practically meaningful.”
— Shani Murray