Rolling Stone has an article out (February 24th; apologies, I'm somewhat delayed here) about the ethical issues swirling around 'big tech' and their decisions (or lack thereof) to police the spaces they create for the exchange of information. Much has been written about the algorithmic propagation of misinformation on social media platforms, which is why this article is interesting: it's not about that. Instead, the author focuses on the human decision-making behind both the propagation and consumption of disinformation.
“Van der Linden got turned onto the idea of pre-bunking and inoculation theory when he stumbled across a decades-old news article about a man named Bill McGuire. McGuire was a social psychologist at Yale in the 1950s and 1960s who was concerned about the effects of Cold War-era propaganda and the Soviet Union’s attempts to brainwash Americans. So McGuire, who had done pioneering work on persuasion, began looking at the subject from the opposite angle — how to resist persuasion. ‘What he found was if people are not prepared to defend their beliefs or their knowledge, they’re really easily duped by arguments,’ van der Linden says. ‘So he started thinking about fortifying people’s mental defenses.’”
Ah, yes, the Cold War. A war fought on the frontiers of human beliefs, motivations and perceptions! So why can’t we just dust off the methodology we understood so well forty years ago? Well, because those tools, which once spread slowly through coordinated campaigns, organizations and communities, can now be turned into weapons of mass destruction through algorithmic propagation. Read on:
“There are also clear ethical questions about the private sector, with its profit motives, using prebunks and inoculation to intervene in the information ecosystem. Rival companies might have little incentive to share tactics or best practices — or might even put those practices to nefarious use — when a competitor finds itself in a crisis, Shawn Walker says. And of course there’s the potential for a P.R. firm like Edelman to position itself as an arbiter of truth, snuffing out not just disinformation but unpopular opinions and inconvenient facts about a client. An Edelman rep says the firm has enlisted an outside consultancy called Compass Ethics to advise it on how to tackle disinformation without declaring itself judge and jury of what’s true and not.”
Alas, we can’t unilaterally disarm and, of course, this is obvious to most:
“Working with government is great,” he says, “but to achieve maximum effectiveness, we need the private sector on board as well.”
This is true, though I’m not sure a collaboration with the government is what’s called for here, nor that the tools tech developed to propagate and target advertising can simply be repurposed to combat disinformation. I’m with the skeptics: I doubt that using those tools to inoculate people against harmful narratives will work, especially for narratives that some audiences want to believe because of our well-known cognitive biases.
“Other information experts are skeptical of applying concepts such as inoculation and herd immunity to disinformation. Shawn Walker, a professor of social and behavioral science at Arizona State University, says the epidemiological approach risks overlooking the nuances and differences between online communities and how one form of intervention or solution might work in, say, a particular Reddit subgroup but not on Twitter. ‘There has to be thoughtful engagement and the understanding of the different balkanization of these communities,’ Walker says. ‘Some you want to go in and engage, and some you don’t want to because it feeds the beast.’”
As I said, the article is interesting because it starts to scratch at the ‘real’ issues around human behavior and not just the motivations of those steering social media companies.
If you’re not already signed up to get the Active Measures Newsletter from the Pell Center, you should be! Go here to do it – https://salve.us4.list-manage.com/track/click?u=c2ec02b0ba766fda7fed6793d&id=bed1f8277d&e=790776974a
The NATO Strategic Communications Centre of Excellence continues to publish straightforward and well-informed material on how to counter disinformation. https://stratcomcoe.org/fact-checking-and-debunking (long read)