This is the second article of a three-part series. Part One is here.
What did you agree to?
Some who have read Part One of this article have let me know I did a pretty lousy job of explaining myself on the “Without your prior knowledge and consent” decision in my flowchart. In my clumsy way, I was trying to say that in some situations, giving your consent to participate in a cult doesn’t necessarily mean you’re exercising free will in the way we normally understand it—in other words, that it is you who are making the choices. It’s an important point, so I need to explain it well.
You can exercise your free will and consent to the agenda as you know it, but if the cult treats that as an agreement to the rest of its agenda as well—the “secret” knowledge it doesn’t share with outsiders or its newest members—the potential now exists for the cult to change your identity later, without your prior knowledge or consent, in a way you never desired.
In typical destructive cults, that information is being withheld specifically because it is something you’d be unlikely to agree to in the early stages of your involvement.
Consider this example:
Suppose a woman has decided that she has led a life of sin and wants to change her ways. So she sees this little church not too far from where she lives—a small church with a lone minister.
She meets with the minister and says she is ready to embrace Christ and change her ways. The minister says, “That’s wonderful, my child. Are you ready to embrace all of His teachings? Are you ready to renounce Satan? Are you ready to change all of your past ways? Are you ready to do what you need to do to purify yourself in the eyes of the Lord?”
“Yes, yes, yes, to all of that,” she replies. And so she begins down the path to her reclamation. What she doesn’t know is this particular minister believes he talks to God directly. And God has told the minister that the way for a fallen woman to purify herself is through a “union” with a man of God.
But the minister knows she’s not ready to hear that yet, because she’s so “corrupt.” So he has to begin the process of elevating his status with her as a man of God and “work with her” until she completely embraces the guilt and shame of her past life. If all goes well, in a few months she’ll become one more member of the flock that he has had sex with.
I wish I had completely made that up, but that sort of thing happens. The point is, the woman had no prior knowledge that she was a sexual target when she consented to “purify herself.” As a result, her ability to consent was compromised.
But…if the minister had said at the outset—“First, we need to do a little purification, and it starts with you and me doing the horizontal mambo. Wait here while I get my chaps”? (He’s a pretty weird minister, that one.)
If he had done that, she would have known to give him a good kick in the yarbles and move on.
For this reason, I have a very healthy suspicion about any group that has “secret knowledge” that you must acquire over time. All too often, the path on that journey is seeded with hidden agendas.
We’ll revisit this issue later in this article. But for now let’s get back to this part about identity change.
Part 2. Changing your identity
So how do they do it? What can be said of this secret and sinister process that can be used to change how you view yourself?
Well, it’s not secret, for one.
And it’s not sinister, either. In fact, it’s been known since the 1930s and is used mainly for positive reasons.
That’s why most “cults that change your identity” are considered benign on the flowchart I described in Part One. The basic steps of identity change were identified decades ago.
It’s a three-step process that was identified by Kurt Lewin, a pioneer in organizational psychology. His work has been applied mostly as the foundation of change management in the business world. The three steps Lewin identified are as follows:
- Unfreezing—bypassing the defensive mechanisms to overcome current patterns and deconstruct the current mindset.
- Transition—a time of confusion and flux as the old patterns are released and replaced with a new mindset.
- Freezing—the transition is complete, the new identity is sealed, and the subject begins to feel calm now that the period of stress and change is over.
Here’s a typical use for Lewin’s model. To survive, most businesses must change to keep pace with technology and market dynamics. Often, that means a business must also inspire its employees to change the way they think of themselves and the role they play in the organization. For example, if most of your workforce joined your company when it was known as a “quality manufacturing” company, but the current marketplace demands that they operate as part of a “customer focus” company, then you’ll probably use some form of the Lewin model to help them adopt a “customer-centric” role—driving not only behavioral change but also pride in their new self-image. This is a common practice in successful companies. And since the end result passes my little flowchart test, there’s nothing sinister about it.
No matter how employees think about themselves on the job, they go home after work more or less unchanged.
What’s a little xi nao among friends?
But some years after Lewin’s model was identified, Americans got a terrifying glimpse of other ways in which it could be employed. For some reason, a higher percentage of US POWs during the Korean War defected to the enemy side and signed “confessions” than in any previous war. It was discovered that these soldiers were being subjected to Chinese methods of coercion known as xi nao (literally, “wash brain”). The Lewin model was being used, but the methods were often brutal and the results anything but beneficial. Thus began our unending fascination with mind control techniques, both within the military and without.
One American psychiatrist, Robert J. Lifton, began studying the practice and effects of brainwashing techniques (now known by the less-sensational and more accurate term thought reform) employed on US soldiers. His initial work was captured in the groundbreaking book Thought Reform and the Psychology of Totalism.
While Lifton’s early work was focused almost exclusively on the thought reform techniques used by totalitarian regimes, it has influenced much of today’s thinking about the nature of destructive cults in general.
For example, one of the mistaken popular beliefs about thought reform and destructive cults is that the techniques are/were developed by sinister master psychologists. In fact, “there is no evidence that psychologists, psychiatrists, neurophysiologists, or scientists of any sort played any significant role in their planning, development, or execution…There is every reason to believe that they evolved pragmatically, empirically, and to some extent sui generis in response to the military and political needs of the Russian and Chinese governments over the past half-century.” (From an anonymous review of Coercive Persuasion by Edgar H. Schein, CIA Web site)
This is particularly important when considering Molyneux and other people who may or may not be leaders of a destructive cult. If there are thought reform techniques being employed in such groups, the notion that they are consciously orchestrated by some genius-psychologist leader may owe more to fiction than science. As I’ve said elsewhere, whatever Molyneux may or may not be doing with his “community,” I do not believe his intentions are destructive. If he is actually manipulating thoughts, memories, and personalities of his True Believers, he may not even be aware of it. I tend to think he simply believes he is dispensing “the truth.”
Orchestrated or not, one still finds similarities among all groups that destructively change people’s identities without their consent. Lifton captures these similarities in his well-known Eight Criteria for Thought Reform:
- Milieu Control—Control of communication both from without and within the group environment, resulting in a significant degree of isolation from the surrounding society. Includes other techniques that restrict members’ contact with the outside world and impair their ability to make critical, rational judgments about information: overwork, busy-ness, multiple lengthy meetings, etc.
- Mystical Manipulation—The claim of divine authority or spiritual advancement that allows the leader to reinterpret events as he or she wishes, or make prophecies or pronouncements at will, all for the purpose of controlling group members.
- Demand for Purity—The world is viewed as black and white and group members are constantly exhorted to strive for perfection. Consequently, guilt and shame are common and powerful control devices.
- The Cult of Confession—Serious (and often not so serious) sins, as defined by the group, are to be confessed, either privately to a personal monitor or publicly to the group at large.
- The “Sacred Science”—The doctrine of the group is considered to be the ultimate Truth, beyond all questioning or disputing. The leader of the group is likewise above criticism as the spokesperson for God on earth.
- Loading the Language—The group develops a jargon in many ways unique to itself, often not understandable to outsiders. This jargon consists of numerous words and phrases which the members understand (or think they do), but which really act to dull one’s ability to engage in critical thinking.
- Doctrine over Person—The personal experiences of the group members are subordinated to the “Truth” held by the group. “Apparently” contrary experiences must be denied or re-interpreted to fit the doctrine of the group. The doctrine is always more important than the individual.
- Dispensing of Existence—The group arrogates to itself the prerogative to decide who has the right to exist and who does not. Usually held non-literally, this means that those outside the group are unspiritual, worldly, satanic, “unconscious,” or whatever, and that they must be converted to the ideas of the group or they will be lost. If they refuse to join the group, then they must be rejected by the group members, even if they are family members. In rare cases this concept gives the group the right to terminate the outsider’s life.
These criteria appear in Chapter 22 of Lifton’s aforementioned book. (A larger summary is here.)
While Lifton created this list to help understand how totalitarian regimes practice thought reform, his work continues to offer insights into the destructive cults that soon followed, and several cult experts have adapted this list to help others understand the operations of today’s religious and non-religious destructive cults.
Sadly, now that I’ve begun to re-introduce lists and criteria to the discussion, I’ve also re-introduced ambiguity. Some may wonder—if a group meets only some, but not all, of the above criteria, is it automatically not a destructive cult? Personally I don’t think so. I still think it all comes down to the final answer on the flowchart.
Wikipedia notes that Lifton also popularized the phrase “thought-terminating cliché,” an aphorism with a ring of truth that is created to immediately quell any thoughts that might challenge group doctrine.
Coincidentally, I recently encountered a brilliant thought-terminating cliché that is commonly used at FreeDomain Radio to squelch any attempt at critically analyzing Molyneux’s “Universally Preferable Behavior (UPB).” Since UPB is considered to be Molyneux’s greatest work (and quite possibly philosophy’s greatest work) by FDR True Believers, it is essential that no serious critical analysis be permitted. (In my article The Promise and Failure of UPB, I note at least two instances of FDR members being banned simply for suggesting they might critique it.)
The FDR thought-terminating cliché for UPB criticism is: “the act of arguing against UPB actually validates it.” After that, there is no need for any FDR True Believer to think about UPB—there is nothing left but to accept it without question.
Enter the “New Religious Movements”
The fascination with thought reform increased in the 1960s, when a number of previously unknown or obscure religious groups began springing up in the US and elsewhere. Today, such groups are collectively known as New Religious Movements. They may technically be cults, but because the distinction between “cults” and “destructive cults” doesn’t often exist in popular culture (and most of these cults aren’t destructive anyway), the term “New Religious Movement” is more palatable and most often more accurate.
However, many people were surprised at a new phenomenon seen in a few of those groups. A number of adolescent and late-adolescent baby boomers, upon joining them, almost immediately exhibited radical behavior change—sometimes discarding their family and friends in favor of the group. These groups were recruiting heavily, often focusing their efforts on students at college campuses.
That’s when the real controversy over “destructive cults” began. Do they actually exist? Do they practice a new kind of thought reform? If so, how do they do it?
The controversy has raged for years, despite an almost overwhelming body of anecdotal and empirical evidence. Today, the American Psychiatric Association (APA) is tentative in affirming the existence of destructive cults in general, though not of thought reform (which has become an accepted synonym for brainwashing and coercive persuasion) in particular. The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) specifically covers the concept of thought reform, as does the current proposed draft of DSM-V. However, the word “cultists,” which appeared in DSM-III, was removed when the work was updated to DSM-IV.
However, it is important to note that “not affirming” is not the same as “rejecting.” That’s an important distinction, since destructive cults will often defend themselves by implying just the opposite. The APA hasn’t rejected anything.
In fact, many physicians today use the diagnosis “Dissociative Disorder Not Otherwise Specified” for patients they believe are victims of destructive-cult thought reform. This applies not only to cult members but also to so-called walkaways or “throwaways.” (When ex-members of destructive cults do not seek sufficient therapy to help contextualize what they have just gone through, negative effects can linger for years.)
Cults and the American Psychological Association
Like the American Psychiatric Association, the American Psychological Association (another APA!) also struggles with understanding and defining destructive cults. However, before it can affirm the existence of destructive cults, it must wrestle with some very large issues—not only with a methodology for providing clinical proof but also with issues that expand into the legal, religious, and human rights arenas. In the article Mind control: psychological reality or mindless rhetoric? published on the APA site, Dr. Philip G. Zimbardo writes:
Mind control: psychological reality or mindless rhetoric?
By Dr. Philip G. Zimbardo
November 2002, Vol 33, No. 10
One of the most fascinating sessions at APA’s Annual Convention featured presentations by former cult members. (See “Cults of hatred.”) Several participants challenged our profession to form a task force on extreme forms of influence, asserting that the underlying issues inform discourses on terrorist recruiting, on destructive cults versus new religious movements, on social-political-“therapy” cults and on human malleability or resiliency when confronted by authority power.
That proposal is intriguing. At one level of concern are academic questions of the validity of the conceptual framework for a psychology of mind control. However, at broader levels, we discover a network of vital questions:
- Does exposing the destructive impact of cults challenge the principle of religious freedom of citizens to mindfully join nontraditional religious groups?
- When some organizations that promote religious or self-growth agendas become rich enough to wield power to suppress media exposés, influence legal judgments or publicly defame psychology, how can they be challenged?
- What is APA’s role in establishing principles for treating those who claim to have suffered abuse by cults, for training therapists to do so and for establishing guidelines for expert testimony?
Dr. Zimbardo characterizes the polar views of destructive cults in a particularly interesting way:
It seems to me that at the heart of the controversy over the existence of mind control is a bias toward believing in the power of people to resist the power of situational forces, a belief in individual will power and faith to overcome all evil adversity. It is Jesus modeling resistance against the temptations of Satan, and not the vulnerability of Adam and Eve to deception. More recently, examples abound that challenge this person-power misattribution.
….The power of social situations to induce “ego alien” behavior over even the best and brightest of people has been demonstrated in a variety of controlled experiments, among them, Stanley Milgram’s obedience to authority studies, Albert Bandura’s research on dehumanization, my Stanford Prison Experiment and others on deindividuation.
Understanding the dynamics and pervasiveness of situational power is essential to learning how to resist it and to weaken the dominance of the many agents of mind control who ply their trade daily on all of us behind many faces and fronts.
I think for those of us who work the “freedom side of the street,” Zimbardo captures the polarization very well, and it’s amusing that he chose to do it in religious terms. Are humans “Jesus”—able and required to resist temptation? Or are we “Adam and Eve”—rulers of our kingdom yet still vulnerable to deception? Some people believe the choices people make are always born of free will and others believe it is possible to be deceived into identity change. Zimbardo appears to be in the latter camp and so am I.
Some psychologists who accept the notion of destructive cults believe that while such cults lack the ability to isolate and torture, as in the Korean War, they have developed techniques that are far more powerful. In the article Thought Reform Exists: Organized, Programmatic Influence, published on FactNet (coincidentally, a source accepted by Molyneux as a credible authority on destructive cults), Dr. Margaret Singer writes:
Thought reform is accomplished through the use of psychological and environmental control processes that do not depend on physical coercion. Today’s thought reform programs are sophisticated, subtle, and insidious, creating a psychological bond that in many ways is far more powerful than gun-at-the-head methods of influence. The effects generally lose their potency when the control processes are lifted or neutralized in some way. That is why most Korean War POWs gave up the content of their prison camp indoctrination programs when they came home and why many cultists leave their groups if they spend a substantial amount of time away from the group or have an opportunity to discuss their doubts with an intimate.
Contrary to popular misconceptions (some intentional on the part of naysayers), a thought reform program does not require physical confinement and does not produce robots. Nor does it permanently capture the allegiance of all those exposed to it. In fact, some persons do not respond at all to the programs, while others retain the contents for varied periods of time. In sum, thought reform should be regarded as “situationally adaptive belief change that is not subtle and is environment-dependent.”
Today, groups identified as potentially destructive cults can be based on virtually any belief—religious or not. All they require is an environment that fosters a religious-like zeal for whatever core belief the cult is based upon, to the detriment of the members themselves.
The up- and downside of lists
In the end, the problem always comes back to identification—how does one determine whether or not a cult is destructive?
The upside to the various “identification” criteria lists is that they can be very informative and directional. The downside (at least from what I have seen in on-line arguments) is that they can be—and often are—used in a very fragmented and subjective way to “prove” the case of both cult attackers and defenders.
Molyneux himself has attempted to defend FDR in this way on several occasions, one of which I linked to in Part One of this series.
(Not surprisingly, after that article appeared on-line, nearly all of Molyneux’s cult-defense articles and videos disappeared from FDR and elsewhere [in the same way that all traces of his wife Christina's involvement suddenly disappeared several months ago]. Although the Molyneux video I linked to in Part One [True News 17: Media Accusations, Part 2] is no longer available, the audio portion can still be downloaded as FDR Podcast 1256. [For now, at least!])
Instead of relying solely on such lists to determine if a group is a destructive cult, I find it easier to start with a “big picture” analysis first, like my flowchart or the Lewin model. Then it becomes a matter of answering the questions: how is the group unfreezing, how and what are they changing, and how are they freezing?
At that point, the various lists available on the Web do become useful—not as definitive criteria, but as suggestions to help you answer those questions.
Noted destructive cult expert Steven Hassan offers a useful go-by, adapting the Lewin model specifically for analyzing potential destructive cults. His adaptation, which he calls The Three Stages of Gaining Control of the Mind is as follows:
The Three Stages of Gaining Control of the Mind
[Adapted from Kurt Lewin's three-stage model as described in Coercive Persuasion (Norton, 1961) by Edgar Schein]
- Stage 1. Unfreezing
  - Disorientation / confusion
  - Sensory deprivation and/or sensory overload
  - Physiological manipulation
    - Sleep deprivation
    - Privacy deprivation
    - Change of diet
  - Age regression
  - Story-telling and metaphors
  - Linguistic double binds, use of suggestion
  - Meditation, chanting, praying, singing
- Stage 2. Changing
  - Get person to question self identity
  - Redefine individual’s past (implant false memories, forget positive memories of the past)
  - Creation and imposition of new “identity” done step by step
    - Formally within indoctrination sessions
    - Informally by members, tapes, books, etc.
  - Use of Behavior Modification techniques
    - Rewards and punishments
    - Use of thought-stopping techniques
    - Control of environment
  - Mystical manipulation
  - Use of hypnosis and other mind-altering techniques
    - Repetition, monotony, rhythm
    - Excessive chanting, praying, decreeing, visualizations
  - Use of confession and testimonials
- Stage 3. Refreezing
  - New identity reinforced, old identity surrendered
    - Separate from the past; decrease contact or cut off friends and family
    - Give up meaningful possessions and donate assets
    - Start doing cult activities: recruit, fundraise, move in with members
    - New name, new clothing, new hairstyle, new language, new “family”
    - Pairing up with new role models, buddy system
    - Indoctrination continues: workshops, retreats, seminars, individual studies, group activities
Remember, cult mind control does not erase the person’s old identity, but rather creates a new one to suppress the old identity (John-John and John-cult).
(You may notice that in Hassan’s adaptation, he mistakenly refers to the third step of the Lewin model as “Refreezing” instead of “Freezing.” It’s a common error.)
Hassan’s version of the Lewin model (like the adaptation of Lifton’s criteria I presented earlier) is very informative and useful, as long as you understand that Hassan is offering examples of some common techniques. Because destructive cults continually find new angles, he cannot provide an exhaustive list. Just as important, it is also not a list of necessary requirements for destructive-cult identification.
(Obviously, you can be in a pretty destructive group even if they don’t chant! I may be susceptible to a destructive cult, but I’d never be susceptible to any group that chants. I’m just not a chanter. That’s not how I roll.)
In the end, that’s why I resorted to my little flowchart model. My approach is this: since there are so many different types of destructive cults operating at this point, I first determine to my own satisfaction that a particular group can be classified as such, and then I try to figure out how they’re accomplishing it.
Yeah, but has anyone ever seen one up close?
So, I hear you say, “all the history and theoretical stuff is good and Q.E’s cult identification flowchart is nothing less than brilliant!” (Wait–maybe that was just me who said that.)
“But,” you add, “has anyone ever come up with a methodology for examining identity change within cults that’s a little more scientific than empirical observation and subjective determination?”
That story is waiting for you in Part 3.
Click below to e-mail or DIGG, etc., this article! As always, I welcome your comments!