Monday, September 3, 2012

the Social Justice Warrior cult

Updated versions of old posts about cults:

• the Social Justice Warrior cult

Whether The Culture of Cults is an accurate description of cults in general, I don't know, but it has many passages that describe Social Justice Warriors well. It identifies their kind of cult:
...therapy cults, promote a secular type of belief system, based on quasi-scientific or quasi-psychological principles.
Their approach to discourse:
Actions which, to an outsider, might seem devious or immoral, may, in the mind of a believer, seem perfectly just and ethical.
And their pursuit of ideological perfection:
'The Demand for Purity: The creation of a guilt and shame milieu by holding up standards of perfection that no human being can accomplish. People are punished and learn to punish themselves for not living up to the group's ideals.'
Its list of traits of cult belief systems fits SJWs well:
Independent and non-accountable - believers follow their own self-justifying moral codes: e.g. a Moonie may, in their own mind, justify deceptive recruiting as 'deceiving evil into goodness'.
Aspirational - they appeal to ambitious, idealistic people. The assumption that only weak, gullible people join cults is not necessarily true.
Personal and experiential - it is not possible to exercise informed free choice in advance, about whether the belief system is valid or not, or about the benefits of following the study and training opportunities offered by the group. The benefits, if any, of group involvement can only be evaluated after a suitable period of time spent with the group. How long a suitable period of time might be, depends on the individual, and cannot be determined in advance.
Hierarchical and dualistic - cult belief systems revolve around ideas about higher and lower levels of understanding. There is a hierarchy of awareness, and a path from lower to higher levels. Believers tend to divide the world into the saved and the fallen, the awakened and the deluded, etc.
Bi-polar - believers experience alternating episodes of faith and doubt, confidence and anxiety, self-righteousness and guilt, depending how well or how badly they feel they are progressing along the path.
Addictive - believers may become intoxicated with the ideals of the belief system, and feel a vicarious pride in being associated with these ideals. Cults tend to be cliquey and elitist, and believers can become dependent on the approval of the group's elite to maintain their own self-esteem...
Non-falsifiable - a cult belief system can never be shown to be invalid or wrong. This is partly why critics have low credibility, and why it can be difficult to warn people of the dangers of a cult.
Because you just can't talk about group dynamics without mentioning The Stanford Prison Experiment: A Simulation Study of the Psychology of Imprisonment:
In only a few days, our guards became sadistic and our prisoners became depressed and showed signs of extreme stress.
From Clay Shirky's A Group Is Its Own Worst Enemy, on the group patterns Bion identified:
The second basic pattern that Bion detailed: The identification and vilification of external enemies.
...even if someone isn't really your enemy, identifying them as an enemy can cause a pleasant sense of group cohesion. And groups often gravitate towards members who are the most paranoid and make them leaders, because those are the people who are best at identifying external enemies. 
The third pattern Bion identified: Religious veneration. The nomination and worship of a religious icon or a set of religious tenets. The religious pattern is, essentially, we have nominated something that's beyond critique.
Italics mine.
In some contexts, it seems that an intellectual analog of Gresham's Law applies... It's not only that bad ideas drive good ideas out of circulation, but also that certain kinds of bad ideas reinforce themselves, becoming stronger in the people who believe them to start with, and taking root in the people who don't.

This is a particularly noxious form of the Law of Group Polarization, which says that "members of a deliberating group predictably move towards a more extreme point in the direction indicated by the members' predeliberation tendencies" (Cass R. Sunstein, "The Law of Group Polarization", Journal of Political Philosophy 10(2), 175-195, 2002; working papers version here). 
As Sunstein explains, "[G]roups consisting of individuals with extremist tendencies are more likely to shift, and likely to shift more (a point that bears on the wellsprings of violence and terrorism); the same is true for groups with some kind of salient shared identity (like Republicans, Democrats, and lawyers, but unlike jurors and experimental subjects). When like-minded people are participating in 'iterated polarization games' -- when they meet regularly, without sustained exposure to competing views -- extreme movements are all the more likely."
In cases like the Freeper thread that I cited, there seems to me to be an additional factor. In addition to the basic group-polarization dynamic, there's a sort of Gresham's Law effect, whereby people with a taste for the rational evaluation of evidence are likely to withdraw from a forum whose participants are so obviously uninterested in the facts of the matter. As a result, as the group opinion becomes more extreme, the standards of evidence get worse and worse, until we get to the point illustrated in that Freeper thread: a freely-available web link is cited to "prove" the opposite of what it plainly says, and 30-odd participants chime in enthusiastically, over a period of several hours, without even noticing.
• how to know you're in a cult

Do you think your group's solution is the only solution? You're probably in a cult.