AFCEA Preferred Provider Session: Securing Our Sentinels: Protecting Military AI Models from Data Poisoning, Evasion, and Extraction

  • Room: Estes A
  • Monday, August 18, 2025: 4:00 PM - 5:00 PM

Speaker(s)

Speaker (confirmed)
Mike Morris
Associate Dean and Senior Director, ASCSIA, BSCSIA and MSCSIA Programs
Western Governors University

Description

As Artificial Intelligence (AI) becomes increasingly integral to modern military capabilities, the AI models themselves become high-value targets for adversaries. Protecting these "digital sentinels" requires a comprehensive security strategy addressing unique vulnerabilities throughout the AI lifecycle, from data ingestion to model deployment and operation. Frameworks such as the OWASP Top 10 for Large Language Model (LLM) Applications and the MITRE ATLAS (Adversarial Threat Landscape for AI Systems) provide guidance for understanding and mitigating these threats. Threats such as data poisoning, evasion attacks, and model theft are directly applicable to military AI systems, where exploitation could lead to compromised intelligence, mission failure, or the loss of strategic technological advantage.

The "black box" nature of many advanced AI models complicates traditional software security approaches. Vulnerabilities may not be simple code flaws but emergent properties of data, architecture, and training. This necessitates continuous behavioral validation and AI-specific red teaming. The increasing reliance on third-party AI components and pre-trained models also introduces significant supply chain risks, making robust AI Bill of Materials (AI-BOM) practices essential. Model theft, beyond intellectual property loss, can reveal critical capabilities and weaknesses to adversaries.

Best practices include stringent input validation and sanitization, encryption of training data and models, secure training environments, and continuous monitoring for anomalies and drift. The physical and cyber-physical security of AI development and deployment environments is now as critical as traditional cybersecurity.
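To make two of these practices concrete, the minimal Python sketch below pairs allow-list input validation with a simple statistical drift check on model output scores. The pattern, thresholds, and class names are illustrative assumptions rather than any specific product's API.

```python
import re
import statistics

# Hypothetical allow-list for operator prompts: printable ASCII only, bounded
# length. Anything outside the expected envelope is rejected, not "cleaned".
PROMPT_PATTERN = re.compile(r"[\x20-\x7E]{1,2000}")

def validate_prompt(prompt: str) -> str:
    """Stringent input validation: reject out-of-envelope inputs outright."""
    if not PROMPT_PATTERN.fullmatch(prompt):
        raise ValueError("prompt failed input validation")
    return prompt

class DriftMonitor:
    """Flag model output scores that drift outside a baseline distribution."""

    def __init__(self, baseline_scores: list[float], z_threshold: float = 3.0):
        # Baseline is assumed to come from a trusted validation run.
        self.mean = statistics.fmean(baseline_scores)
        self.stdev = statistics.stdev(baseline_scores)
        self.z_threshold = z_threshold

    def is_anomalous(self, score: float) -> bool:
        """A score more than z_threshold deviations from baseline is flagged."""
        return abs(score - self.mean) / self.stdev > self.z_threshold

# Usage: validate an input, then check a model confidence score against baseline.
prompt = validate_prompt("Summarize today's logistics report.")
monitor = DriftMonitor([0.91, 0.88, 0.93, 0.90, 0.89])
print(monitor.is_anomalous(0.42))  # True: far outside the trusted baseline band
```

Rejecting out-of-envelope inputs outright, rather than attempting to repair them, keeps the validation path auditable; a real deployment would derive its baseline from a trusted validation run and tune the threshold per model.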
