The Federation of American Scientists (FAS) has posted a report, obtained through a Freedom of Information Act (FOIA) request, that the federal JASON science advisory group delivered to the Department of Defense in October 2008 on technologies to detect deception, titled “Quest for Truth: Deception and Intent Detection,” along with a useful overview of the report.
As noted in the abstract:
“This report is in response to Department of Defense (DOD) interest in the detection of deception and intent to do harm. The report provides an assessment of the potential utility and efficacy of monitoring and assessing human behavioral neurophysiology and verbal and nonverbal communication to determine human intent in a military context. The process that the JASON study focused on, which would need to be implemented in order to identify covert combatants and ultimately infer intended actions, is outlined. A key finding in this report is the need to establish a discipline of science in the development and deployment of potential technologies, including interrogation methods, that have been proposed to be useful in this setting.”
From a cognitive security perspective, the report is worth reading in detail for its examination of technologies that may play a role in detecting deceptive intent — at least in humans; it is not clear how useful they would be against activities such as botnets or deepfake AI, though those too ultimately rely on humans and human biases. Future applications for the technologies and techniques examined are possible, but only if we first develop a foundation for assessment and validation showing that these systems can scientifically do what they are advertised to do. Too often, government and industry pursue technology on the “faith” that it will meet their needs, rather than on objective testing or analysis that supports those claims.
Read the full report: Quest for Truth: Deception and Intent Detection, JSR-08-143, October 2008.