The Cognitive Crucible is a forum that presents different perspectives and emerging thought leadership related to the information environment. The opinions expressed by guests are their own, and do not necessarily reflect the views of or endorsement by the Information Professionals Association.
During this episode, Dr. Matthew Canham discusses the importance of cognitive security, and his neurosecurity framework.
Research Question: Is there a finite and discoverable set of Principles of Influence for AI, analogous to the Principles of Influence in humans? (Reference: Dr. Robert Cialdini’s Seven Principles of Influence, which form the basis of most social engineering attacks: Reciprocity, Social Proof, Liking, Authority, Commitment & Consistency, Scarcity, and Unity.)
Guest Bio: Dr. Matthew Canham is a security consultant and researcher dedicated to understanding and addressing the human element in cybersecurity. His research focuses on human susceptibility to mis-, dis-, and malinformation (MDM) operations and remote online social engineering attacks. He is also the host of the Cognitive Security Institute, an organization that holds monthly online meetings to discuss topics in cognitive security. You may watch past presentations here: https://www.youtube.com/@cognitivesecurityinstitute579/videos
About: The Information Professionals Association (IPA) is a non-profit organization dedicated to exploring the role of information activities, such as influence and cognitive security, within the national security sector, and to helping bridge the divide between operations and research. Its goal is to increase interdisciplinary collaboration among scholars, practitioners, and policymakers with an interest in this domain.
For more information, please contact us at firstname.lastname@example.org.
Or, connect directly with The Cognitive Crucible podcast host, John Bicknell, on LinkedIn.
Disclosure: As an Amazon Associate, IPA earns commissions from qualifying purchases made through links in this post.