Methodological Fallacies in Anthony’s Critique of Exit Cost Analysis
Benjamin Zablocki
Abstract
This article consists of a
point-by-point rebuttal of Dick Anthony’s critique (Anthony, 2001) of my theory
of cultic brainwashing (Zablocki, 2001). Anthony’s argument is first reduced to
the 98 distinct propositions that he presents in Zablocki & Robbins (2001). The
assumption is that, if there is validity to his critique, some of these
propositions must satisfy all of the following three criteria: (1) they must
dispute some aspect of my theory; (2) they must be relevant to the specific
arguments made in my theory; and (3) they must be factually correct.
Surprisingly, it is shown that none of Anthony’s 98 propositions is successful
in meeting all three of these fundamental criteria. It is therefore concluded
that Anthony’s critique does not meet the minimal requirements needed for a
successful attempt to invalidate a theory.
Preface
The reader should be warned that this long paper was boring
to write and is probably even more boring to read. The bulk of it consists of a
point-by-point refutation of Dick Anthony’s (2001) long, rambling critique of my
theoretical work. Many of my colleagues have advised me that Anthony’s critique
does not deserve serious scholarly consideration. I tend to agree with them. But
because Anthony’s chapter in Misunderstanding Cults appears immediately
following mine, and because I am one of the book’s editors, I felt that I needed
to deal with issues he raises. This was easy enough to do because although
Anthony is an accomplished scholar on his own turf, he is completely out of his
element trying to critique the work of a social psychologist. I do not expect
many scholars to be interested in reading this defense in full. I have published
it on the Web so that any who doubt that Anthony’s criticisms of my theory are
specious might have public access to my rebuttal to see for themselves. Feel
free to download this document and to use it freely if you find it useful.
Brainwashing has been, by far, the most controversial topic
among scholars who study new religious movements. My purpose here today is not
to convince you that brainwashing happens in New Religious Movements (NRMs), but
merely that a theory exists that we can use to determine empirically whether or
not it does.
In a recent book called Misunderstanding Cults
(Zablocki & Robbins, 2001), I attempted to lay out such a clearly stated,
well-formed, empirically testable, and epistemologically falsifiable
sociological theory that would locate the concept within the field of social
psychology as an ordinary (albeit extremely powerful) process of social
influence. Dick Anthony (2001) replied, in the same book, with a massive
103-page critique of this effort, arguing that I had failed miserably in the
attempt. He argued that what I thought of as my little theory was not merely
empirically false but bogus science, as well—that is, it was not really a theory
at all, but a bunch of double-talk masquerading as a theory.
The purpose of this paper is to evaluate the validity of
Anthony’s critique. My method of evaluation has been to identify all of the
propositional statements in Anthony’s chapter and to determine which, if any, of
them constitute valid scholarly criticisms of my theory as I stated it. After I
isolate only those propositions that meet the test of credibility, it should
then be possible to determine whether these propositions constitute a complete
refutation of my theory, a partial refutation, or no refutation at all.
Before I begin, I must set out three ground rules. First, I
am limiting the discussion to the social process that I have (wisely or
unwisely) labeled brainwashing. Even if you hate the word, or it conjures
up images for you that are far different from those that I delineate in my
theory, the norms of scholarly discourse require that you allow me to
distinguish the term from the substance and to concentrate here on defending the
substance of my theory. Of course, you will still then be free to reject my
theory if you choose, even if I can successfully defend the substance of its
argument, on the grounds that the word I chose to label the central concept is
bad, misleading, or stupid. And, if you do so, that contention may form the
basis of an entirely separate debate.
Second, I must make a clear distinction between
descriptive theories that identify a social process that occurs, on
the one hand, and explanatory theories that explain why the
process has the consequences associated with it, on the other. Both sorts of
theories are valuable in scientific enterprise. For example, in physics, the
theory of gravity is purely a descriptive theory. The theory of
electricity started out as a descriptive theory but eventually evolved,
becoming an explanatory theory, as well. The only claims that I make for
my theory of brainwashing are that it describes a real social process
(whose existence has been disputed) and the behavioral consequences of that
process to the individuals who are its target. Although I do speculate in
several of my publications about the reasons that what I call brainwashing
might have the effects it does, I make no theoretical claims for these
speculations, calling them instead “conjectures,” which is all they are at this
point. This distinction is important to make because Anthony is not clear about
it and, therefore, sometimes goes astray by muddling together my descriptive
arguments with my explanatory speculations. Sometimes it even seems as if the
heart and soul of Anthony’s argument with me lies in a mistaken assumption that
I am trying to explain why brainwashing has the effects it does rather
than simply providing an empirically practical observational schema that can be
used to determine, in any particular setting, whether such a social
influence mechanism exists.
Third, I must make a clear and sharp distinction between
the validity of a theory and the amount of evidence that supports the theory.
Some perfectly good theories describe phenomena that occur rarely in nature. I
believe that other individuals, as well as myself, have gathered evidence that
brainwashing occurs in the social world often enough to make the phenomenon
worthwhile to study. But to confound discussions of a theory arguing the
possible existence of a phenomenon with empirical discussions of the
prevalence of that phenomenon in the real world only breeds methodological
confusion. My concern here is exclusively with the former. The latter (which is
also dealt with in a later portion of my chapter in Misunderstanding Cults)
could be a topic for another debate.
Let us turn now to Anthony’s arguments. Although his prose
tends to be rambling and repetitive, I was able to isolate ninety-eight discrete
propositions that Anthony makes concerning my theory. Although I ransacked his
chapter carefully searching for statements in propositional form, it is always
possible that I missed a few. If Anthony or anyone else can point out additional
overlooked propositions, I will, of course, be obliged to deal with them. But,
for now, I ask you to provisionally accept my statement that these ninety-eight
propositions constitute an exhaustive list of Anthony’s arguments.
Ninety-eight is a lot of propositions to deal with. When I
first saw the length of Anthony’s chapter, I admit I was taken aback. With so
many points, it would seem that at least a few of them were bound to have struck
their target. But then I was reminded of the famous story of the little boy who
asked his parents for a pony for Christmas. Christmas day arrived, the presents
were all unwrapped, but no pony. A while later, Mom and Dad noticed that the boy
was not in the house. They went to look for him and found him in the barn,
thrashing wildly around in a big pile of manure. They quickly pulled him out and
asked him, “What in the world were you doing?” He replied, “Well, gosh, with all
that manure I figured there was sure to be a pony in there somewhere.” Unlike
the little boy, I decided to examine each proposition individually before
concluding that there must be a pony among them.
For a proposition to be a valid part of a critical
argument, it must pass three tests: (1) It must be a statement in disputational
form that challenges something about the theory it is criticizing. (2) It must
be relevant to the stated theory that it is criticizing. (3) It must be
factually true (if a statement of fact) or logically true (if a statement of
logic). Only if the proposition passes all three of these tests can it make a
contribution to the author’s argument. I subjected each of Anthony’s
ninety-eight propositions to these three tests. There are eight logically
possible outcomes to the combination of these tests. In table 1, I list these
possible outcomes and the number of propositions that fall within each category.
Table 1 Distribution of Anthony’s Propositions on Each Logically Possible
Combination of Tests
Disputational Form | Relevant to the Stated Theory | Factually or Logically Correct | Frequency Count
No | No | No | 7
Yes | No | No | 23
No | Yes | No | 14
Yes | Yes | No | 27
No | No | Yes | 11
Yes | No | Yes | 5
No | Yes | Yes | 11
Yes | Yes | Yes | 0
Total |  |  | 98
An examination of table 1 shows that some of Anthony’s
disputational propositions are indeed relevant and that some are indeed correct.
But, unfortunately for Anthony, none of those that are relevant are correct, and
none of those that are correct are relevant. This refutation of all of
Anthony’s ninety-eight propositions, if it holds up to scrutiny, constitutes a
definitive refutation of his entire argument. I grant him that, as gestalt and
holistic philosophers have taught us, the whole may sometimes be greater than
the sum of its parts. But when the parts add up to zero, one must acknowledge
that any gestalt multiplier times zero still equals zero.
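To make the tallying procedure concrete, the following is a minimal Python sketch of how such a three-test cross-tabulation could be reproduced. The proposition scores in the sketch are hypothetical placeholders for illustration only, not my actual coding of Anthony’s ninety-eight propositions.

```python
# Illustrative sketch only: the scores below are hypothetical placeholders,
# not the actual classifications of Anthony's ninety-eight propositions.
from collections import Counter
from itertools import product

# Each proposition is scored on the three tests, in this order:
# (disputational, relevant, correct); True means it passes that test.
hypothetical_scores = [
    (True, False, False),   # disputes the theory, but irrelevant and incorrect
    (False, True, True),    # relevant and correct, but merely descriptive
    (True, True, False),    # disputational and relevant, but factually wrong
]

tally = Counter(hypothetical_scores)

# Report a frequency count for each of the 2^3 = 8 possible combinations,
# mirroring the layout of Table 1.
for combo in product([False, True], repeat=3):
    disputational, relevant, correct = combo
    print(f"disputational={disputational!s:<5} relevant={relevant!s:<5} "
          f"correct={correct!s:<5} count={tally[combo]}")

# The critique succeeds only if at least one proposition passes all three tests.
print("At least one proposition passes all three tests:",
      tally[(True, True, True)] > 0)
```

The sketch simply makes explicit that the decision rule is conjunctive: a single proposition scoring true on all three tests would be enough to keep the critique alive.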
I have demonstrated here not that my theory is necessarily
correct, but only that Anthony has not succeeded in debunking it or even making
a dent in it. Before I discuss each of Anthony’s propositions in turn, let me
mention a few of the more representative fallacies among them. I’ll focus on
three methodological errors that appear as persistent themes across many of
his propositions:
1. Inexact translation across paradigms
2. Incorrect mapping to a philosophical indeterminacy (free will)
3. Inappropriate application of legal standards to a scientific argument
Translation Across Paradigms
We social scientists work in a multiparadigmatic discipline. Critical analysis across paradigmatic boundaries is possible only to the extent that the critic is well versed, not only in his own paradigm, but also in the paradigm used by the work he is criticizing.
I work within a traditional social-psychological paradigm
and use concepts in the standard way in which they are used in social
psychology. Anthony works within a humanistic-psychoanalytic paradigm, which
uses some of the same concepts in a different way and also uses many concepts
that are not found in social psychology. Anthony is welcome to his paradigm in
his own work. But when he chooses to criticize the work of a social
psychologist, his attack can only appear ridiculous if he doesn’t first acquaint
himself with the vocabulary and assumptions prevalent in that discipline—unless,
of course, his goal is to attack the entire field of social psychology and its
paradigm.
I’ll give just one example here of how Anthony, by not
understanding how social psychologists construct theories of influence, veers
off track in his criticism. Standard practice in social psychology is to
describe mechanisms of how influence is hypothesized to flow from a source to a target. The elaboration likelihood model and the encoder-decoder model are just
two examples of these sorts of theories. Such theories do not discuss individual
differences in attitude or motivation that targets bring to the influence
situation, not because we are not interested in individual variation, but rather
because we wish to use our theories, once established, to study such individual
variation. For example, the well-known Asch effect in social psychology is a
mechanism that describes how individuals are pressured to conform to group
norms. The Asch effect says nothing about individual differences, but this
mechanism has been used in many studies to identify the characteristics of
individuals who are immune to pressures to conform. A number of Anthony’s
criticisms of my model are complaints that I do not consider individual
differences in pre-motivations to belong to cults. Nothing could be further from
the truth. But I develop a model of charismatic influence within cults that does
not account for individual differences precisely to enable studies to be
constructed that might let us discover how the pre-motivations of individuals
cause them to react differently to the same influence process. Anthony does not
seem to understand this difference but instead draws erroneous conclusions from
the assumption that I am working within the same person-centered (rather than
mechanism-centered) paradigm within which he is working.
Mapping to a Philosophical Indeterminacy
Anthony makes one error so persistently that it runs
through fully one-third of all his ninety-eight propositions. This error is to
assume that my theory asserts that the influence mechanism I am describing
involves the destruction of the target’s free will. He calls this “the
involuntarism hypothesis,” and it is the cornerstone of much of his criticism of
what he considers the bogus nature of my attempts to theorize about social
influence. In practice, because the words voluntary, involuntary,
voluntarism, or involuntarism appear nowhere in my writing on this
subject, Anthony argues instead that I somehow sneak this hypothesis in the back
door through a process that he calls “tactical ambiguity.”
It is true that various anticult writers, drawn mostly from
the ranks of mental-health professionals rather than the social sciences, have
alleged that cults take away the free will of their members, not realizing—or
not caring—that the overthrow of free will is an unfalsifiable (and therefore
unscientific) phenomenon. It is also true that Anthony has in the past
successfully exposed the nonscientific nature of these cultic-loss-of-free-will
arguments. He senses correctly that if only he could map an isomorphism between
my theory and theirs, his work would be mostly done, and he could tar me with
the same brush he has used successfully in the past on these hapless clinicians
(of whom Margaret Singer is the most notorious example).
The interested reader will have to examine, one by one, the
32 Anthony propositions that fall into this free-will category to satisfy
herself that none have succeeded in building this isomorphism. Here, I will
briefly mention one rather silly argument that Anthony relies on heavily—his
misconstruction of my use of the word “free.” I use the term in the economic
sense to refer to the absence of structural costs imposed on an individual by
society. Virtually all of sociology accepts as axiomatic that social systems
impose costs on individual action and thereby constrain this action. Anthony
does not seem to grasp that one can discuss socially imposed constraints without
declaring the overthrow of free will.
Legal Standards and Scientific Standards
The last of Anthony’s errors that I wish to discuss briefly
is his imposition of courtroom standards on scientific discourse. In legal
matters, one frequently cites one’s “authorities,” meaning the research experts
that one is relying on for one’s testimony. Anthony keeps trying to identify my
“authorities,” never realizing that I rely on only three: my eyes, my ears, and
my nose. I do my own research, and I arrive at my theories, for better or worse,
inductively from my own observations. But Anthony will have none of this and
instead scours my citations and references under the erroneous (and sometimes
ludicrous) assumption that he is justified in treating any non-negative
citations in my writing as my “authorities” and, therefore, any criticism of the
work of these supposed “authorities” as a successful attack on my own work. For
example, Anthony finds that my long list of references includes one or two items in which Margaret Singer (the writer whom he most loves to hate) is listed among the authors, acknowledging her prior work on some topic or other quite peripheral to my argument. But, to Anthony, her very presence among the S’s in my list of
references constitutes a fatal “gotcha,” proving that Margaret Singer must be
one of my authorities and any discredit to her work successfully undermines my
own. This assumption, of course, is absurd and quite contrary to the norms of
scholarly bibliography that encourage scholars to be as inclusive as possible in
their reference citations.
Conclusion
I have indicated only the three most common errors Dick
Anthony has committed in attempting to debunk my work. If my reasoning is
correct, it follows that the theory I have developed (which I include in the appendix to this paper for your consideration) has successfully
survived Anthony’s attempt to debunk it. At the very minimum, any attempt to
revive Anthony’s critique requires that at least one of his ninety-eight
propositions be successfully defended.
A Discussion of All Ninety-Eight Propositions Versus Zablocki Contained in Chapter Six, “Tactical Ambiguity and Brainwashing Formulations: Science or Pseudoscience?” by Dick Anthony
Introduction: Zablocki’s Brainwashing Formulation
Proposition 1. (Page 215) Zablocki’s recent
brainwashing articles do not report concrete research but rather attempt to
clarify the conceptual outline of the brainwashing idea and to defend its
authentically scientific character.
[ ] disputational   [ ] relevant   [ ] correct
In fact, many of my books and papers report results of
concrete research. In the Misunderstanding Cults chapter in which I
outline my theory, I report on the research that supports the theory on pages
194-204. Whether or not I can point to research that supports my theory,
however, is irrelevant to the scientific validity of the theory itself.
Proposition 2. (Page 215) I will also focus upon
older publications by Zablocki (Zablocki, 1971, 1980), as well as publications
by Margaret Singer and Richard Ofshe (Ofshe, 1992; Ofshe and Singer, 1986;
Singer, 1995; Mitchell, Mitchell, and Ofshe, 1980) and by Steven Kent (1997),
all of which Zablocki claim to be cultic brainwashing publications which
provide the empirical foundation for his more recent articles.
[ ] disputational   [X] relevant   [ ] correct
The reasoning here is that if Anthony can link my theory to
theories presented by any of the scholars whose works I cite in my bibliography,
he will have an easier time debunking me. All he will need to do is debunk the
work of any of the people I cite in my bibliography, and I will be tarnished
through guilt by association. The rhetorical device he uses is to speak of my
citations as authorities, a term used in legal briefs and articles. He
doesn’t seem to understand that, in academia, scholars have many reasons for
citing prior works, and that citation does not imply that these works are the
“foundations” for my own, another legal term that Anthony is fond of using
inappropriately in trying to establish guilt by association. In particular, the
fact that I cite Margaret Singer doesn’t make Margaret Singer’s theory the
foundation of my own. Therefore, to trot out his earlier successful arguments debunking Singer and to claim that those arguments must apply to all those who cite Singer as well is not justifiable.
Proposition 3. (Page 215) In addition, Zablocki
claims that his recent brainwashing articles are based upon the empirical
foundation provided by research on Communist thought reform published in
books by Edgar Schein (1961) and Robert Lifton (1961).
[X] disputational   [X] relevant   [ ] correct
See my comments on Proposition 2.
Proposition 4. (Page 215) Zablocki claims that
brainwashing is a valid scientific concept that has been supported by
considerable research both upon Communist coercive persuasion and upon coercive
influence tactics in new religions or cults. (1997, 104‑107).
[ ] disputational   [X] relevant   [ ] correct
I point to homologies between my findings in cults and the
findings of Lifton and Schein in prison situations. My theory would stand alone,
even if Lifton and Schein had never done their research. But it is my duty to
cite their work both because of its brilliant pioneering nature and because
homologies in influence mechanisms that can be observed between settings that
are as different as religious movements and prison re-education centers are
interesting and intriguing.
Proposition 5. (Page 216) According to Zablocki, the
primary ideologically motivated misinterpretation of the scientific brainwashing
concept is that it has to do with illicit recruitment mechanisms when it is
really a concept concerning influence processes which bring about addictive
commitments to world views to which the targets of brainwashing have already
been converted prior to their being brainwashed.
[ ] disputational   [ ] relevant   [X] correct
Here we come to the first correct proposition in Anthony’s
chapter. However, it is merely a correct description of something I have
observed. It is not in disputational form, and it is not relevant to the
question of whether my theory is a valid one.
Proposition 6. (Page 217)
a primary burden of his approach would seem to be that he make good on his claim
that his interpretation of this foundational literature of the brainwashing
concept, i.e. Lifton’s and Schein’s 1961 books, is different in kind from the
epistemologically spurious version used in legal trials.
[ ] disputational   [X] relevant   [ ] correct
This proposition is relevant only in the sense that it
would certainly be relevant to his debunking attempt if he could show that my
theory is no different from an epistemologically spurious theory. However, what
we have here is an example of burden shifting. He is trying to establish that
the burden of proof is on me to show that my theory is different from Margaret
Singer’s theory. In fact, the burden is properly on him to show that these
theories are identical. If and only if he can show that they are identical in
one of his other propositions would this nondisputational proposition have a
bite.
Evaluation of Zablocki’s Formulation
Proposition 7. (Page 218) All brainwashing
formulations claim to provide criteria for identifying social influence that
results in involuntary conduct from social influence that does not result in
involuntary conduct.
[ ] disputational   [X] relevant   [ ] correct
Here we come to the first statement of the heart of
Anthony’s debunking argument. He puts almost all of his chips on the hope that
he can saddle my theory with what he calls the “involuntarism assumption.” The
implied argument has the structure of a well-formed syllogism: All brainwashing
formulations posit involuntarism. Involuntarism posits the overthrow of free
will. Statements positing the overthrow of free will are bogus. Zablocki’s
theory is a brainwashing formulation. Therefore, Zablocki’s theory must posit
the overthrow of free will. Therefore, Zablocki’s theory is bogus. Unfortunately for Anthony, even a well-formed syllogism is sound only if its premises are true, and Anthony’s initial premise is wrong.
Proposition 8. (Page 218) In addition, authors of
brainwashing formulations claim that research upon Chinese and North Korean
Communist coercive indoctrination practices around the time of the Korean War
provides the primary theoretical foundation for their theories of involuntary
cultic commitment.
[ ] disputational   [ ] relevant   [ ] correct
Maybe yes, or maybe no. But I have never made such a claim.
CIA Research on Brainwashing
Proposition 9. (Page 219) The core idea of
brainwashing formulations is that world views can be transformed to their
polar opposites through techniques that create disorientation and hyper
suggestibility followed by intensive indoctrination.
[ ] disputational   [X] relevant   [ ] correct
It may be that world views can be so transformed, but that
is not an argument my theory makes. An examination of my theory will show that
it talks of the inculcation of obedience and attachment by some (but not all) of
the techniques mentioned in Proposition 9.
Proposition 10. (Page 220) The basic model that was
being explored [by the CIA] consisted of two stages: 1) a “deconditioning
stage” intended to wipe out previous ideological loyalties through the
imposition of a disoriented, hyper suggestible, state of consciousness; 2) a
“conditioning stage” intended to implant new loyalties and a new self
through specialized conditioning procedures based upon behaviorist psychology.
[ ] disputational   [X] relevant   [X] correct
This is a brief but correct summary of the CIA program.
That program is relevant to my theory only in that the theory I present also
involves a deconditioning stage and a reconditioning stage. Anthony is trying to
establish grounds for his later assertion that my theory and the CIA theory are
identical. He fails to do so because not all influence models that posit a deconditioning stage and a reconditioning stage are identical. An
examination of the particulars in his description of CIA deconditioning and
reconditioning shows that they do not match up with the particulars of these
stages in my theory (see Appendix). In any case, because this particular
proposition is not disputational but merely descriptive of the CIA model, it
does not invalidate my theory.
Proposition 11. (Page 220) In terms of their
original goals of improving interrogation and coercive indoctrination tactics
beyond that obtainable with physical coercion or other traditional methods, [the
government programs] were complete failures. Neither the German nor the
American program ever learned how to change people's minds about their political
orientations much less turn them into so‑called “deployable agents.”
[X] disputational   [ ] relevant   [X] correct
Here Anthony correctly disputes the validity of the CIA
model. However, that model is not relevant to my model. This irrelevance is
evident by Anthony’s failure to establish equivalence between the two with any
of his other propositions. It is also evident by direct comparison of the two
models. The CIA model attempts to take ordinary people and, using operant
conditioning techniques, change their ideologies and make them deployable. The
model I have outlined concerns attempts to take already-committed members of a
religious movement and, using techniques of charismatic persuasion and group
pressures to conform, change them from agents to deployable agents. Nothing
about the failure of the CIA model is necessarily applicable to the cult
brainwashing model.
Cultic-Brainwashing Formulations: History
Proposition 12. (Page 221) ...cultic brainwashing
formulations are ... actually based upon the discredited CIA brainwashing model
while they claim to be based upon generally accepted research [by Lifton
and Schein] upon Korean War era Communist indoctrination practices.
[X] disputational   [X] relevant   [ ] correct
The model I have developed depends neither on the CIA
failures nor upon the successful explanations of Lifton and Schein. It stands or
falls on its own merits and is a useful theory only to the extent that it helps
us understand charismatic persuasion in cults. I say this because Anthony’s
repeated attempts to locate what he calls my “authorities” in twentieth-century research are on the wrong track and confusing. This comment does not diminish my considerable intellectual debt to Robert Jay Lifton, which I acknowledge in my writings. But there are no “authorities” lurking in my model waiting to be
discovered. There are only intellectual influences of varying degrees of
importance.
Proposition 13. (Page 222) The core proposition of
cultic brainwashing theory is that brainwashing consists of the use of
techniques which place its victims into a disoriented state of consciousness in
which their normal capacity to rationally evaluate social influence has been
suspended, and consequently in which they have become hyper suggestible and
therefore unable to resist propaganda advocating alternative
totalitarian world views.
[ ] disputational   [X] relevant   [ ] correct
This nondisputational statement simply tries to describe
the core proposition of my theory in a way that can later be used to invalidate
it. The statement is a mixture of correct and incorrect statements that require
some care to be disentangled. Although the theory does discuss attempts to
disorient the influence target, these disorientation attempts are only an
intermediate step of the process. The key term in Anthony’s description of what
he believes is the core proposition is the phrase “unable to resist.” If he can
establish this, he can later charge the theory with proposing the overthrow of
free will and therefore demonstrate that the theory is unscientific. But even a
cursory examination of the theory as written shows that it nowhere argues that
the target ever becomes unable to resist propaganda. Furthermore, the theory
says nothing about totalitarian world views but is concerned instead with
susceptibility to charismatic influence.
Proposition 14. (Page 222) In addition, brainwashing
formulations contend:
14a) that such influence occurs
without pre‑existing motives or character traits which predispose those
who are influenced to respond positively to the new world views;
14b) that once converted to the
new world view, a brainwashed convert has difficulty repudiating it, so that in
effect the new world view has become a sort of addiction;
14c) that such brainwashed
conversion and commitment to new world views overwhelms the free will of
its victims without the use of physical coercion.
[ ] disputational   [X] relevant   [part b only] correct
In this proposition, Anthony describes three hypotheses
that he believes to be a part of my theory. This proposition is not
disputational, but if Anthony can establish its validity, he believes that he
can use it in other disputational propositions. Let us consider these three
hypotheses, one by one:
14a. My theory makes no prejudgment one way or another about
pre-existing motives or character traits. It is concerned with an influence
mechanism used on target persons who are already members of voluntary
charismatic movements. Presumably most, if not all, of these individuals will
have joined and stayed because of some pre-existing motives or character traits.
The absence of discussion of these motives or traits in my theory indicates only
that their relevance is something to be discovered by empirical research. It may
turn out that the strength of pre-existing motives or traits is correlated with
the likelihood that the brainwashing process will be successful. Or it may turn
out that there is no such correlation. Either finding is compatible with the
theory as written.
14b. This is the one element of this proposition that is
semi-correct and relevant. I say semi-correct because the charismatic obedience
and attachment to the group are what have become addictive, not what Anthony
calls “the new world view.” But the charge that I see brainwashing as producing
a kind of addictive dependency is correct. Addictive dependencies are hard but
not impossible to resist. My understanding of addiction apparently differs from
Anthony’s. Whereas he sees an addict as one whose free will has been overthrown,
I see addictions as only a strong form of the obsessive-compulsive habits that
we all experience in our lives to some degree and that sometimes require a good
deal of conscious effort to resist.
14c. This is more of Anthony’s fabrication of a free-will
hypothesis. At the risk of being just a tad repetitive, I must again reiterate
that nowhere in my theory is anything said or implied about free will. I don’t
consider free will to be a scientific concept. Free will is not something that
I’ve ever thought about in connection with brainwashing.
Tactical Ambiguity
Proposition 15. (Page 223) Cultic brainwashing
formulations are stated in a fashion that is essentially ambiguous and which
thus tends to render them immune to empirical evaluation.... Among the forms
such ambiguity takes in cultic brainwashing formulations are the following:
15a) internal contradictions;
15b) unfalsifiable identifying
characteristics for key variables and predictions;
15c) connotative rather than denotative use of language, i.e. the use of
emotionally charged buzz words rather than precisely defined terminology which
is capable of being operationalized.
[X] disputational   [X] relevant   [ ] correct
None of these charges is correct.
15a) To establish this, Anthony would have to show at least one
example of one of the twelve hypotheses in my theory contradicting another. This
he has not done.
15b) This statement is meaningless as written. Neither variables
nor predictions are, by definition, falsifiable. Falsifiability is a
characteristic of theories and hypotheses. In fact, one of the criteria of a
falsifiable theory is precisely its ability to make verifiable predictions. But
let us give Anthony maximum benefit of the doubt and assume that he was trying
to say in 15b) “variables that are not operationally defined and hypotheses that
do not lead to verifiable predictions.” But each of the eight variables in my
theory has been given an operational definition. So the problem, if there is
one, must lie with the hypotheses. For Anthony to establish his charge, he must
show at least one example of a hypothesis in my theory that does not lead to
verifiable predictions. This he has failed to do. Each of the twelve hypotheses
in the theory makes predictions that are in principle verifiable.
15c) Even a brief examination of the definitional part of the
theory will show this assertion to be incorrect. Each of the concepts used in
the theory has an operational definition.
Proposition 16. (Page 224) Their artful ambiguity
may tend to conceal their pseudo‑scientific character to non‑specialists who
review them in a variety of contexts.
[X] disputational   [ ] relevant   [ ] correct
This proposition, of course, is pure opinion.
Proposition 17. (Page 224)
The most glaring source of ambiguity in cultic brainwashing formulations
develops from their attempt to simultaneously affirm the [CIA] brainwashing
argument and also to affirm the [Lifton, Schein] research on Communist coercive
persuasion which flatly contradicts it with respect to a number of core
issues, e.g. the presence or absence of predisposing motives, involuntary vs.
voluntary influence, defective cognition vs. full cognitive capacity, and so on.
As I have documented elsewhere, (Anthony, 1990; Anthony and Robbins, 1995a;
Anthony, 1996), Margaret Singer and Richard Ofshe, who were until recently the
most influential exponents of cultic brainwashing theory, switch back and forth
between the two traditions as the tactical requirements of particular contexts
demand.
[X] disputational   [ ] relevant   [ ] correct
This proposition is not relevant because there is nothing
in my theory that attempts to affirm either the CIA argument or the
Lifton-Schein argument. My theory rests on its own merits, although I do find
interesting homologies between my findings in cults and Lifton’s findings about
Chinese Communist thought reform. These homologies strengthen my own confidence
in the value of the theory, but they are irrelevant to the question of whether
the theory is well formulated. Of the three examples that Anthony gives, the
first two are arguments that I have shown above to be incorrect. Regarding the
third, Anthony seems unwilling to accept that the brainwashing process, as I
have outlined it, produces cognitive confusion while the process is going on but
leads to the restoration of full cognitive capacity as an end result. This
scenario is identical to Schein’s finding and is outlined more fully on pages
177-179 of Misunderstanding Cults.
Third-Stage Brainwashing Formulations
Proposition 18. (Page 225)
Such supplementary scientific foundations ... include putative research on
hypnosis, addiction, the psychoanalytic transference concept, charisma, the
attitude‑change literature, disorientation, male chauvinism/gender bias, and so
on. As the reader of Zablocki’s articles may recognize, his approach includes
several of such supplementary foundations for allegations of involuntary world
view transformation, i.e. addiction, transference, hypnosis, disorientation.
[ ] disputational   [X] relevant   [ ] correct
There is absolutely no allegation of transference or
hypnosis in my theory. I have no idea where Anthony gets these ideas.
Disorientation is discussed as a relatively minor result of the traumatization
of the target individual that takes place during the intermediate stages of
brainwashing. Of the “supplementary foundations” that Anthony mentions, only
one, addiction, is a part of my theory. As I discuss in my comments on
Proposition 14, above, I do not consider addiction to be a foundation for
allegations of involuntary world view transformation, and neither does anybody
else working in the addiction field. No addiction scientists that I know of
believe that addiction robs one of free will (nor do they believe that this is
even a meaningful scientific statement). Although addictions can be overcome,
many of them require a tremendous effort to resist. The literature on
nonchemical addictions (such as addictions to gambling or sexual promiscuity) is
still controversial at this point, but there is no doubt that such topics do not
in any way rely on statements about free will.
Zablocki’s Third-Stage Brainwashing Formulation: Disorientation, Defective
Thought, Suggestibility, and the False Self
Proposition 19. (Page 226f ) The following are the
individual elements or hypotheses within Zablocki’s definition:
19a) Absence of Pre‑motives. People who join new religions cults are not
seeking alternatives to mainstream world views prior to their membership in the
new group.
19b) Disorientation. New religions or cults induce irrational altered
states of consciousness as the core technique in seducing people into giving up
their existing world view. (Zablocki refers to this primitive state of
consciousness as disorientation; other brainwashing theorists have referred to
it as hypnosis, dissociation, trance, etc., but there is no meaningful
distinction between these various terms for primitive consciousness as they are
used by brainwashing theorists, i.e. they are functional synonyms within the
brainwashing world view.)
19c) Defective Cognition: In the disoriented state essential to
brainwashing the person has a significantly reduced cognitive capacity to
evaluate the truth or falsity of world views with which he or she is confronted.
19d) Suggestibility. As a result of externally induced disorientation and
defective cognitive capacity, the victim of brainwashing is highly
“suggestible,” i.e. prone to accept as her/his own ideas and world views which
are recommended to him or her by the person or organization that has induced the
defective cognitive state.
19e) Coercive or involuntary imposition of a defective or false world view.
The above sequence of criteria of brainwashing results in the involuntary
imposition of a defective or false world view which anyone in a rational state
of mind would have rejected.
19f) Coercive imposition of a false self. As a result of the brainwashing
process, the person manifests a pseudo‑identity or shadow self which has been
involuntarily imposed upon him/her by brainwashing.
19g) Deployable agency. The involuntarily imposed false self and
defective world view persist after the brainwashing process has been completed
and as a result the brainwashed person retains his commitment to the new self
and world view even when he or she is not in direct contact with the group doing
the brainwashing.
19h) Exit Costs: It is extremely difficult for the person to later
repudiate his new world view and false self‑conception because he no longer has
the capacity to rationally evaluate these choices.
[ ] disputational   [X] relevant   [ ] correct
There is nothing new in this proposition, nor does Anthony
intend there to be. This is simply a summary of what Anthony believes are the
hypotheses he has discovered within my theory. It is a mixture of the good, the
bad, and the ugly. For example, 19f) is completely incorrect. There is no
coercion involved, and the notion of a “false self” is meaningless within the
paradigm of mainstream social psychology within which I work. After G.H. Mead,
most social psychologists regard self as a process (in continuous interaction
with the social and nonsocial environment) rather than a structure that can be
imposed—coercively or not. The self at the end of the brainwashing process is
just what it is. It is neither truer nor “falser” than the self at the beginning
of the persuasion process, nor are these even meaningful adjectives in this
context. Most of the other hypotheses as stated by Anthony are either completely
or partly false. Even a cursory comparison of this list with the twelve
hypotheses on pages 185 through 193 of Misunderstanding Cults will
confirm this. However, 19g) would be correct if the terms “involuntary,”
“false,” and “defective” were removed from the sentence.
Proposition 20. (Page 227) All of these hypotheses
were aspects of the original, generally discredited CIA brainwashing model which
Zablocki claims to be replacing with his “new approach.”
[X] disputational   [ ] relevant   [ ] correct
This is an instance of the false equivalency fallacy that
runs through many of Anthony’s propositions. Anthony is correct in perceiving
that if only he can establish an equivalence between my theory and a theory that
has already been discredited, his debunking job will be much easier. By invoking
a simple train of logical reasoning that says if A = B, and B is false, then A
must be false, he will have accomplished his goal. Anthony has demonstrated in
other writing that B (the CIA model) is false. But, as we have already seen, his
attempt to establish that my theory is equivalent to the CIA model has failed.
Proposition 21. (Page 227) ...he asserts that
disorientation and a suspension of critical rationality are essential to the
brainwashing process. He states:
The core hypothesis is that,
under certain circumstances, an individual can be subject to persuasive
influences so overwhelming that they actually restructure one’s core beliefs and
world view and profoundly modify one’s self‑conception. The sort of
persuasion posited by the brainwashing conjecture is aimed at somewhat different
goals than the sort of persuasion practiced by bullies or by salesman and
teachers. . . . The more radical sort of persuasion posited by the brainwashing
conjecture utilizes extreme stress and disorientation along with ideological
enticement to create a conversion experience that persists for some time after
the stress and pressure have been removed.... To be considered brainwashing this
process must result in (a) effects that persist for a significant amount of time
after the orchestrated manipulative stimuli are removed and (b) an accompanying
dread of disaffiliation which makes it extremely difficult for the subject to
even contemplate life apart from the group.
[ ] disputational   [X] relevant   [X] correct
This nondisputational proposition is composed mainly of a
segment that Anthony correctly quotes from an earlier (1997) article. Although
correct, the citation out of context is misleading. I make it clear in that
earlier article that I was exploring various unproven conjectures for explaining
why brainwashing works. Although I stand by this earlier conjectural quotation,
it is not part of my scientific argument. Anthony’s quotations from my writings
range over thirty years, but he presents them—misleadingly—as though all are
part of a single current scientific attempt to formulate a falsifiable theory.
Proposition 22. (Page 228) The “profoundly modified
self” referred to by Zablocki in the above quote as characteristic of
brainwashing is essentially the same as the false self or “pseudo‑identity”
which Singer, (1995, 60, 61, 77‑79), West and Martin (1994) and other
brainwashing theorists regard as an essential aspect of brainwashing.
[ ] disputational   [X] relevant   [ ] correct
Sez who? We have another instance of the false equivalency
fallacy cropping up here. Only now B = the theories of Singer, West, and Martin,
none of whom are social psychologists. Anthony has not done any of the difficult
work of establishing an equivalency between my theory and those of these three
scholars. Without doing this work, he has no right to simply assert that my
theory is equivalent to theirs. Additionally, the “profoundly modified self”
that Anthony quotes me as discussing has no place in my theory but only in
earlier conjectural work in which I speculate about the question of “why”
brainwashing may work the way it does.
Proposition 23. (Page 228) The new identity is
viewed as false because allegedly it is imposed wholly by extrinsic influence
and thus is seen as discontinuous with the pre‑existing values and
self‑conception of the person; that is, as being “ego‑dystonic,” to use
Zablocki’s appropriation of psychoanalytic terminology. (Within psychoanalysis,
the term “ego‑dystonic” refers to distortions of rational thought processes,
such as delusions, hallucinations, obsessive thoughts, or compulsive behaviours,
produced by eruptions of primitive unconscious materials into consciousness.)
[ ] disputational   [X] relevant   [ ] correct
There are two separate problems with this proposition. The first is that it gets the definition of ego dystonic wrong. According to the American Psychiatric Glossary (7th ed., American Psychiatric Press, 1994), the term ego dystonic refers to “aspects of a person’s behavior,
thoughts, or attitudes that are viewed by the self as repugnant or inconsistent
with the total personality.” There is no assumption that these aspects involve
delusions, hallucinations, or any of the other disorders Anthony suggests. The
second problem is that the first sentence in the proposition is wrong; it is
based on the erroneous assumption that the brainwashed person’s new identity is
viewed as false either by the subject or by the observer. No such value
judgments are made or implied.
Proposition 24. (Page 228) Zablocki states: The
result of this [brainwashing] process, when successful, is to make the
individual a deployable agent of the charismatic authority.
[ ] disputational   [X] relevant   [X] correct
Yes, this interpretation is correct and important, but the
statement is merely a descriptive one that does not dispute the validity of the
theory.
Proposition 25. (Page 229) As Zablocki has stated,
the cult is able to overwhelm—and replace with a shadow self—the pre‑existing
authentic self of the person only by inducing an altered, primitive state of
consciousness in which the person is unable to resist indoctrination.
Zablocki refers to this alleged state of primitive consciousness as
“disorientation.”
[ ] disputational   [X] relevant   [ ] correct
Total nonsense. Disorientation is defined as “loss of
awareness of the position of the self in relation to space, time, or other
persons.” Disorientation is a run-of-the-mill state of consciousness that all of
us experience at one time or another and that people often experience as a
reaction to prolonged stressful treatment.
Proposition 26. (Page 229) Zablocki doesn’t
provide a definition for his use of the disorientation term.... Elsewhere,
Zablocki elaborates upon the disoriented state, which he considers to be the
core of the brainwashing process. He states that those in the throes of the
brainwashing process:
are, at times, so disoriented
that they do appear to resemble zombies or robots: glassy eyes, inability to
complete sentences, and fixed eerie smiles are characteristics of disoriented
people under randomly varying levels of psychological stress.... He elaborates
in a later section of the same article upon the “loose cognition” and suspension
of critical rationality referred to in this passage, which he regards as
essential to the brainwashing process.
[X] disputational   [ ] relevant   [ ] correct
See my reply to Proposition 25, above, for the standard
definition of disorientation.
The following is conjecture on my part, but I can’t help
but suspect that what Anthony is trying to accomplish by making so much fuss
over the role of disorientation in my theory is to construct an isomorphism
between my theory and some rather silly caricatures of “mind control” that
appear in Fu Manchu movies, for example. But I don’t argue that brainwashing
turns people into robots. I simply state that, for a brief transitional period
during which the subjects are enduring sleep deprivation and random
inquisitions, the subjects present themselves as so worn down and traumatized
that they resemble zombies or robots to the observer.
Proposition 27. (Page 230) [Zablocki] states:
My argument is that his
transition to the biological [essential to brainwashing] involves both a
suspension of incredulity and an addictive orientation to the alternation of
arousal and comfort comparable to the mother‑infant attachment. At the
cognitive level this relationship [between the charismatic cult and its
brainwashed victim] involves the suspension of left‑brain criticism of
right‑brain beliefs such that the beliefs are uncritically and enthusiastically
adopted.... By preventing even low‑level testing of the consequences of our
convictions, the [brainwashed] individual is able rapidly to be convinced of a
changing flow of beliefs, accepted uncritically. (1998, 241‑242; emphasis mine)
[ ] disputational   [ ] relevant   [X] correct
Anthony is here quoting an earlier article of mine in which
I offer a neuropsychosocial conjecture as to why brainwashing might have the
effects that it does. But the validity of the theory does not depend on the
truth of the conjecture. A theme that runs through Anthony’s critique is the
failure to distinguish between an empirically based theory that a
specific event occurs, on the one hand, and conjectures as to why it
occurs, on the other hand. The social scientist has a responsibility to offer
both, but only the former must be falsifiable at the present time. In fact, we
currently have no way of knowing whether or not brainwashing involves this kind
of left-brain and right-brain activity, although it might be possible in the
future to determine this. But, meanwhile, the possible truth of the conjecture
is relevant neither to the observation that the phenomenon exists nor to the
theoretical criteria offered by which an instance of the phenomenon can be
identified.
Proposition 28. (Page 230) the notion that
brainwashing uses the induction of a primitive state of consciousness and a
resulting inability to resist indoctrination, leading to an addictive or
compulsive attachment to a new world view and a false self, is the heart of the
CIA brainwashing paradigm.
[ ] disputational   [ ] relevant   [X] correct
On the face of it, this is a statement of fact, not about
my theory, but about the CIA theory. Building on propositions 25 through 27,
Anthony seems to be implying that my theory is essentially identical to the CIA
theory and therefore must somehow posit an “inability to resist indoctrination”
and, therefore, the overthrow of free will. As we have seen, however, my theory
makes no such argument. Because Anthony has failed in his attempt to establish
an isomorphism between my theory and the CIA theory, any such implication is
irrelevant.
Proposition 29. (Page 230) In Zablocki’s
formulation, the conversion to the new world view is regarded as involuntary
and compulsive because it follows from the absence of even “low‑level
testing of the consequences of our convictions” and thus the new world view is
“accepted uncritically.”
[ ] disputational   [X] relevant   [ ] correct
This is one of Anthony’s critical propositions. He wishes
to establish an equivalence between criticism and voluntarism so that he can tag
my theory with what he calls the involuntarism assumption and thus demonstrate
that it rests on the unscientific argument that free will can be overthrown. But
a moment’s reflection will indicate that all of us, at times, engage in behavior
in which our critical faculties do not come into play. When my wife tells me in
the middle of the night that I’m snoring and should roll over on my side, I do
so without subjecting the message to critical examination. All introductory
social psychology textbooks include discussions of the Elaboration Likelihood
Model of Persuasion. In this model, a distinction is made between persuasion via
“the central route” and persuasion via “the peripheral route.” In the case of
the former, the message is subjected to critical “elaboration” by the target of
the persuasion before the target decides whether or not to accept the message’s
validity. In the case of the latter, the target bases the decision whether or
not to accept the validity of the message on the source and context rather than
the content. Peripheral-route persuasion involves little or no critical
evaluation of the content of the message itself. Thus, Anthony’s 29th
proposition is relevant in that it correctly identifies suspension of critical
message elaboration as an essential component of brainwashing. But it is not
correct because lack of criticism does not imply lack of free will.
Disconfirmation of the Primitive-Consciousness Hypothesis
Proposition 30. (Page 230f) ...researchers found
that such Communist influence did not result from diminished cognitive
competence. The essence of coercive persuasion, on the other hand, is to
produce ideological and behavioral changes in a fully conscious, mentally intact
individual. (Schein, 1959, 437; emphasis mine)
[X] disputational   [ ] relevant   [X] correct
Anthony, Lifton, Schein, and I are all in agreement on this
point. The end result of a successful brainwashing project is for the target
individual to emerge fully conscious and mentally intact. How else would he or
she be able to function as a deployable agent? Here Anthony is mixing up a brief
transitory period of disorientation that occurs during brainwashing with mental
intactness that is a result of having been successfully brainwashed. This
error of Anthony’s is comparable to the error of stating that, since patients
bleed while surgery is being performed, a person who is not bleeding cannot have
been a surgical patient.
Proposition 31. (Page 232) Ofshe and Singer, 1986
... claim that cults use brainwashing techniques that are different from and
more effective than techniques used in Communist thought reform. Zablocki, 1998,
pg. 222, endnote 21, specifically claims that this article is part of the
empirical basis of his own brainwashing formulation.
[ ] disputational   [ ] relevant   [X] correct
This proposition involves two sequential nondisputational
statements, both of which are true but irrelevant:
1. Ofshe claims that cult brainwashing is different from Communist thought reform in some respects. This is true, but what does it have to do with the matter at hand?
2. Zablocki claims the Ofshe article is part of his own empirical base.
Well, I guess this is true in the sense that I profited by
reading Ofshe’s early empirical work on Synanon and other charismatic groups
that I have never studied. And the similarity of my findings on different, later
charismatic groups increases my confidence that the subject I have chosen to
investigate is one that others have independently noticed in other contexts. But
similarity is not identity. My theory, on the face of it, is not identical to
any of the theories of Ofshe or Singer. I cannot, therefore, be saddled with
responsibility for a proposition in one of their theories simply because I cite
their work and find myself inspired by Ofshe’s evidence.
Proposition 32. (Page 233) attacks upon religious
conversion as being involuntary because they involve “irrational” states of mind
are evaluative rather than scientific because they are tautological.
[X] disputational   [ ] relevant   [X] correct
I wholeheartedly agree. Therefore, let’s all agree never to
make any such tautological attacks upon religious conversion by claiming they
involve “irrational” states of mind. I have never made any such attacks. My
theory does not include or imply such an attack. I promise never to make any
such attack in the future. Why would anyone want to base an intellectual attack
on tautological reasoning?
Proposition 33. (Page 233) Zablocki, (1998, 227)
claims that the research of Orne (1972) provides scientific support for the idea
that hypnosis can be used to compel involuntary behavior. [In the section of
his article on “Mental or Physical Impairment” as an explanation for cult
membership, Zablocki states: Orne has done some interesting experimental work on
the extent to which subjects can be hypnotized to do things against their will.]
X disputational
relevant correct
First, the pedantic comment: The correct citation for the
statement Anthony is talking about is (1998a, 237), not (1998, 227). However,
this proposition contains a far more serious blunder. The purpose of the 1998a
paper that Anthony cites is an extremely modest one. At that time, I had not yet
developed a fully formed social-psychological theory of brainwashing and was
simply arguing that the brainwashing “conjecture” was at least as plausible as
any of the rival conjectures in the literature. After I set out my view
of the brainwashing conjecture as I understood it at that time, I then—in the
interests of fairness—took a page or two to discuss each of the rival
conjectures. That Anthony does not notice that the discussion of Orne and
hypnosis that he alludes to takes place within a discussion of one of these
rival conjectures is incredible to me. Anthony seems so anxious to tar me
with the hypnotism brush that he rips this discussion out of context with absurd
results. Because I correctly state that hypnotism is yet another conjecture that
some people have used to account for charismatic obedience in cults, he rushes
to the assumption that the hypnotism conjecture is part of my own approach.
Proposition 34. (Page 234) Zablocki makes this same
fundamental error in support of this notion that hypnosis and other primitive
states of consciousness form the basis for brainwashed involuntary commitment to
religious groups.
X disputational
relevant correct
See my comments on Anthony’s proposition 33.
Exit Costs, Pre‑Motives, and Totalitarian Influence: Brainwashing as Exit
Costs [sic] and Absence of Pre‑Motives
Proposition 35. (Page 236) The novel feature of
Zablocki’s version of the CIA model, when compared to previous versions of the
cultic brainwashing model, is the surprising claim that research on thought
reform did not demonstrate involuntary conversion of its victims to a new
Communist world view but rather the coercive intensification of commitment
to a Communist world view to which the victims of thought reform were already
committed.
X disputational
relevant correct
Anthony’s obsession with what he calls the “involuntarism
assumption” has him thoroughly confused here, and also in propositions 36 and
37, which stem from the same confusion. I discuss this confusion on page 176 of
my chapter in Misunderstanding Cults, of which I will quote a relevant
excerpt here:
The difference between the
state-run institutions that Lifton and Schein studied in the 1950s and 1960s and
the cults that Lifton and others study today is in the obtaining function not in
the retaining function. In the Chinese and Korean situations, force was used for
obtaining and brainwashing was used for retaining. In cults, charismatic appeal
is used for obtaining and brainwashing is used, in some instances, for
retaining.
This distinction seems to be difficult for Anthony to grasp,
although I have explained it to him in numerous pieces of personal correspondence. I
nowhere claim that the prisoners studied in Chinese and Korean settings were
Communists before their incarcerations. The only coercion involved was in
getting them incarcerated in the first place, an element lacking in cults.
Proposition 36. (Page 236f) Zablocki seems to be
saying that Lifton’s and Schein’s subjects were already Communists prior to
thought reform in the sense of already having joined a Communist
organization. He says: “The target of brainwashing is always an individual who
has already joined the group.” (1998, 221).
X disputational
relevant correct
See my comments to Proposition 35. In cults, the targets of brainwashing are
individuals who have joined voluntarily. Individuals in the Chinese and Korean settings were
brought to these institutions through the use of force. Why is that so hard to
understand, and why does it imply that these subjects joined the Communist Party
prior to their incarcerations?
Proposition 37. (Page 238) Zablocki’s insistence
that brainwashing consists of coercive change in level of commitment to
totalistic ideology, rather than coercive conversion to totalistic ideology in
the first place, is all the more puzzling when other passages are taken into
account in which he seems clearly to define brainwashing as coercive
conversion to a new world view.
X disputational
X relevant correct
See my comments to Proposition 35. My so-called insistence
on coercive change at any point in the process is a fabrication of Anthony’s. I
nowhere posit or claim the existence of coercive change among cult members.
Among prisoners, the only coercion is that involved in arresting them and
imprisoning them.
Proposition 38. (Page 238f) According to Zablocki,
brainwashing consists of overwhelming or irresistible “extrinsic” influence
to which the inner qualities of the person are irrelevant, as opposed to normal
“intrinsic” influence, resulting from an interaction between the inner
characteristics of the person and outside influence. The extrinsic influence
character of brainwashing formulations is essential to establish that such
influence is “involuntary.”
X disputational
relevant correct
Let’s ignore the “overwhelming or irresistible” adjectives
that Anthony appends to this proposition because I’ve already established in my
discussion of previous propositions that there is nothing in my theory that
warrants either of these adjectives. Doing that leaves us with a new and
interesting aspect of Anthony’s argument: his distinction between extrinsic and
intrinsic forms of influence. As an ideal type, the intrinsic/extrinsic
dimension of influence can be analytically useful. Ideal-typically extrinsic
forms of influence depend on sources entirely outside the target person;
intrinsic forms of influence depend on sources entirely internal to the target
person. My theory of brainwashing models the process as an extrinsic one.
However, there is room for intrinsic factors in terms of differential
vulnerability to brainwashing.
Proposition 39. (Page 240f) He claims to base his
brainwashing theory upon research on minority religions and communes which he
conducted and described in earlier books. Indeed, in his 1980 book,
Alienation and Charisma, alienation is one of the two master concepts (the
other being charisma) by which he organizes his data. Thus Zablocki himself
is, in this former guise as the author of these earlier publications, a
proponent of what he now labels the “seekership conjecture” school of new
religions scholarship, a theoretical orientation that he now sees as conflicting
with his current brainwashing perspective.
X disputational
X relevant correct
This criticism is based on an either/or way of thinking
that Anthony wishes to impose on my scholarship. I find both seekership theory
and brainwashing theory useful in studying cult behavior. There is nothing in
either that contradicts the other.
Proposition 40. (Page 241) Zablocki appears to be
acknowledging Lifton as a proponent of the seekership explanation of conversion
to new religions, whereas elsewhere he views Lifton’s work as the primary
theoretical foundation for the brainwashing explanation which he regards as
contradictory to the seekership explanation.
X disputational
X relevant correct
I do not regard the seekership explanation of cult
participation “as conflicting” with the brainwashing explanation. As I explain
in various other writings, seekership and brainwashing are complementary
explanatory tools. Seekership seems better at explaining some aspects of cult
participation, and brainwashing theory seems to better account for others. I
favor an inclusionary approach to the study of cults, one that uses as many
different perspectives as necessary to try to make sense of the observed data.
In the article in which I compared the various perspectives (or conjectures,
as I prefer to call them), I was merely trying to establish that brainwashing is
just as plausible and useful as any of the others. I was not trying to oppose
the conjectures as either/or alternatives.
Proposition 41. (Page 250) Unlike the brainwashing
paradigm, the totalism theory does not interpret the contemporary influence
situation of the person as the primary cause of his religious or political
choices. This approach is clearly contradictory to Zablocki’s attempt to
explain involuntary religious conversions exclusively on the basis of social
influence which is independent of the pre‑existing character of the person
who is being influenced.... (For instance Zablocki states: ...“the brainwashing
model does not focus primarily on characteristics of the subject. The assumption
is that many different kinds of people can, with enough effort, be brainwashed.”
X disputational
X relevant correct
Again, we must ignore the gratuitous insertion of the term
“involuntary religious conversion,” which I neither state nor imply in my
theory. We are then left with the heart of this proposition: Anthony’s concern
over my statement that “many different kinds of people can, with enough effort,
be brainwashed.” He sees this as making my theory “contradictory” to his theory
of totalist influence. But to say that many different kinds of people can be
brainwashed is not to say that all people can be brainwashed. This
differentiation is not merely nitpicking. Indeed, my empirical observations
suggest to me that some people are immune to this sort of persuasion. The
investigation of what personal characteristics these people share is, to me, one
of the more fascinating topics on the agenda of the study of cults or new
religious movements. Furthermore, I do not claim (or believe) that the amount of
effort necessary to brainwash a person is similar across all personality types.
These are all interesting issues and quite amenable to empirical investigation.
If Anthony and his allies would stop trying to obstruct this entire area of
study, we could get on with the interesting job of trying to get the answers to
these questions.
The Brainwashing Term
Proposition 42. (Page 250) Another obvious
indication that Zablocki’s articles express the CIA brainwashing paradigm rather
than the totalitarian-influence paradigm is that he refers to the perspective he
is advocating as a “brainwashing” theory. Both Schein and Lifton repudiate
the brainwashing term because it designates the CIA brainwashing theory
which their research had disconfirmed. (Schein, 1961, 18; Lifton, 1961, 4; 1987,
211).
X disputational
relevant correct
Propositions 42 through 45 are concerned not with a
critique of theory but with a critique of terminology. As I explained in the
beginning of this essay, I have chosen to separate strictly the discussion of
the phenomenon from the discussion of the label used to designate the
phenomenon. This is not to say that issues surrounding the label “brainwashing”
are not important, just that greater clarity is achieved by not confounding the
two separate issues. I do not insist that the phenomenon I have been researching
be labeled “brainwashing” as long as a reasonable degree of consensus can be
achieved on a satisfactory alternative label. For a fuller discussion of these
matters of terminology, see the Appendix at the end of this essay.
Proposition 43. (Page 251) Zablocki, on the other
hand, asserts that the “brainwashing” term accurately stands for the
tradition represented by Lifton’s and Schein’s research.
disputational
relevant X correct
See discussion of proposition 42.
Proposition 44. (Page 252) It seems me to be likely
that Zablocki continues to use the “brainwashing” term in contradiction to
its repudiation by Schein and Lifton for the same reason that they have rejected
it, i.e. because the brainwashing term is actually defined in terms of the
involuntary world view transformation which the research of Schein and Lifton
demonstrated did not occur as a result of Communist thought reform.
X disputational
relevant correct
See discussion of proposition 42.
Equating of Brainwashing and Totalitarian Influence Perspectives
Proposition 45. (Page 255) In affirming that such
terms are roughly synonymous, Zablocki is informing us that in his mind the
CIA brainwashing paradigm as expressed by Meerloo, Singer, Ofshe, and Farber, et
al., on the one hand, and the totalitarian influence perspectives of Schein,
Lifton, et al., on the other, are essentially the same.
X disputational
X relevant correct
See discussion of proposition 42.
Voluntary Versus Involuntary Influence
Proposition 46. (Page 256) Zablocki feels that the
characteristic that distinguishes the pseudo‑scientific CIA brainwashing
perspective from the valid brainwashing perspective, which he claims to
advocate, is that the CIA perspective claims that brainwashing “rob[s]
ordinary people of their free will,” whereas his own perspective does not.
disputational
X relevant X correct
Yes! This descriptive proposition gets it right. One of the
major differences between my perspective and that of the discredited CIA program
is that the CIA perspective claims that brainwashing overthrows free will,
whereas mine does not.
Proposition 47. (Page 257) On the one hand, Zablocki
says that brainwashing does not claim that its victims are robbed of their free
will. On the other, he says that the goal of brainwashing is the modification of
the preference structure on which choice is based. The question then becomes
whether a brainwashing account of how such a modification is accomplished
indicates that the modification is voluntary or involuntary. The central
theme of Zablocki’s brainwashing articles is to show that the accomplishment of
such a modification of preferences by brainwashing is involuntary, and that
once the modified preference structure is accomplished, the person becomes stuck
in it against their will.
X disputational
X relevant correct
In this proposition, Anthony confuses my formal theory with an early article of
mine in which I speculate about the conjecture that the effect of brainwashing
occurs at the level of preference structure. We could argue about whether
preference modification involves the overthrow of free will. If it does, then
political debates by presidential candidates should perhaps be outlawed because
they all attempt to modify our candidate preferences. But such argument is
unnecessary because the theory of brainwashing as stated in chapter 5 of
Misunderstanding Cults does not rely on a discussion of preference
modification. So even if Anthony’s contention about the association between
preference structure and free will were correct, it would have no bearing on the
theory but only on a conjecture I once offered to explain why brainwashing has
the effect that it does. As I explained more fully in my comments on proposition
27, Anthony consistently confounds the formal theory of brainwashing with
various speculations about why it has the effects it does.
Proposition 48. (Page 257) Zablocki develops quite
explicitly his contention that brainwashing produces a compulsive attachment
to a totalistic world view, group, and false or shadow self. As we saw
above, Zablocki contends that brainwashing involves “an addictive orientation to
the alternation of arousal and attachment comparable to the mother‑infant
attachment.... In these terms, brainwashing can be operationalized as an
influence process orchestrated toward the goal of charismatic addiction. My
hypothesis is that each of the three stages of brainwashing achieves a plateau
in this addictive process. The stripping stage creates the vulnerability to this
sort of transformations. The identification stage creates the biochemical
alignment, and the rebirth stage creates the fully addicted shadow self.”
disputational
X relevant X correct
In this proposition, we have a purely descriptive statement
of my view of brainwashing as an addiction. Anthony does not use it here to
dispute my theory, but he obviously believes that, if he can establish a
connection in my theory between brainwashing and addiction, he will be able to
establish his pet false claim that I am asserting the overthrow of free will.
There exists a huge literature on addiction, and none of it claims that
addiction entails the loss of free will.
Proposition 49. (Page 257f) His argument that the
brainwashing perspective is not concerned with involuntarism boils down to two
assertions. The first is that his articles are not concerned with the
voluntarism/involuntarism dimension because he doesn’t use the terms
voluntary or involuntary in them.
disputational
X relevant X correct
Get out your thesaurus. Not only do I not use the terms
“voluntary” or “involuntary,” but I also don’t use any synonyms for these words.
Dr. Anthony seems to be grasping at straws here. I explicitly assert that I’m
not concerned with “involuntarism.” I don’t use the words or anything
approximating the meanings of these words. Yet, like a tea-leaf reader or a
tarot-card reader, Anthony insists on seeing in my words not what is there, but
what he desperately believes must be there.
Proposition 50. (Page 258) The second is that the
distinction between voluntary or involuntary is a qualitative, i.e., either/or
one, whereas the brainwashing perspective as he articulates it merely
demonstrates a reduction in voluntary choice rather than its complete loss.
disputational
X relevant X correct
I’m not sure what Anthony means by “reduction in voluntary
choice.” All choices are voluntary, but all choices are made in the presence of
constraints. The greater the constraint, the more costly it is for the
individual to make the choice. If all Anthony means is that I’m asserting that
constraints on individual choices may vary from mild to severe, I plead guilty
to asserting what is a cornerstone postulate of all sociology and psychology.
Proposition 51. (Page 258) Actually in the
penultimate draft (sic) of his 1998 article he did use the voluntarism term
in defining an essential dimension of the brainwashing paradigm. He stated: “In
this paper, I attempt to lay the foundation for brainwashing as a useful and
well defined scientific concept. An essential part of this effort is to cut away
some of the more grandiose claims that have been made for this concept and to
locate it simply as one among many constraints on religious voluntarism.”
(Page 2, penultimate draft of Zablocki, 1998, emphasis mine)
disputational
relevant correct
What is this garbage about penultimate drafts of articles?
Has Anthony taken to dumpster diving outside my house to see whether he can find
anything incriminating in drafts of papers that I discard prior to publication?
Aside from being ethically dubious, this claim of Anthony’s is impossible to
either verify or refute because there is no published record. Nonsense like this
does not deserve to be taken seriously.
Proposition 52. (Page 258) The more important
question, however, is whether Zablocki’s paradigm is concerned with the concept
of voluntarism, not whether it is treated either as a qualitative or a
continuous variable.
disputational
relevant X correct
Anthony is correct that this is the more important
question.
Proposition 53. (Page 258) there are many other
passages in Zablocki’s works on brainwashing in which he contends, as in this
passage, that the brainwashing paradigm is concerned with demonstrating that
brainwashed conversion and commitment to social groups is (at least relatively)
involuntary. See for instance this statement in the abstract to his 1998
article:
“In contrast to some of the more
grandiose claims sometimes made for brainwashing as the sole explanation of cult
movement behavior, I argue instead that brainwashing is only one of the factors
that needs to be examined in order to understand the more general phenomenon of
exit costs as a barrier to free religious choice.” ( 1998, pg. 216)
And also:
...exit cost analysis is
primarily concerned with the paradox of feeling trapped in what is
nominally a voluntary association. It asks not ‘Why did they leave?’ but rather,
‘What prevents them from leaving?’(1998, 220)
X disputational
X relevant correct
Again, there is no implication of an insult to free will in
any of these quotations. Anthony continues to stumble on what is to him an
unfamiliar paradigm. Every time I talk about social constraints on action,
Anthony infers an attack on free will. Does he believe that social constraints
do not exist and that we all act in a social vacuum? If so, he is at odds with
the entire field of sociology.
Proposition 54. (Page 259) there are many other
examples of passages in which Zablocki asserts that brainwashing diminishes
voluntary religious choice without actually using the words voluntary or
involuntary.... In my view Zablocki contends repeatedly that brainwashing
accomplishes involuntary influence of cult members even if he doesn’t always use
the voluntarism term in affirming this proposition.
X disputational
X relevant correct
See my comments on propositions 49, 50, and 53.
Proposition 55. (Page 259) At times Zablocki does
use the voluntary/involuntary terminology. In one of his publications, which
Zablocki regards as a scientific study of cultic brainwashing, he made the
following statement:
Change of world view is possible,
although rare and difficult. It can occur in a religious conversion and in
psychoanalysis. Both of these processes are undergone voluntarily. Classic
thought reform [as defined by Lifton] is an involuntary method of changing a
person’s world view. I am going to compare the Bruderhof novitiate with the
thought reform process in order to locate the points of structural similarity
which are common to all forms of world view socialization, whether voluntary
or coercive, religious or secular. (1971, 247; emphasis mine)
In this passage at least, Zablocki uses variants of the voluntary/involuntary
terminology to refer to the voluntary/involuntary dimension of world view
transformation. He also seems in this passage to be referring to the voluntarism
dimension as an either/or variable, asserting without qualification that thought
reform [brainwashing] is an involuntary method of changing a person’s world
view.
X disputational
X relevant correct
First of all, to bring up a quote from a book I published
in 1971 as an attack on a theory formulated in 2001 seems like an act of
desperation. But even on its own terms, Anthony’s reading of this paragraph is
flawed. I was referring here to whether the subjects of persuasion were brought
voluntarily or involuntarily to the site at which the persuasion happened.
Clearly, the prisoners studied by Lifton were brought by force to the prisons
where brainwashing was accomplished. They did not voluntarily go to prison.
Members of the Bruderhof, in contrast, joined that commune voluntarily. Since
1971, I have learned to make these distinctions more explicit. Back when this
early book was published, I was not yet aware that scholars might confuse
voluntarism in recruitment with voluntarism of the persuasive process.
Proposition 56. (Page 259f) At the time that
Zablocki wrote this passage, he apparently believed that Lifton had demonstrated
that thought reform, as conducted by the Chinese Communists, accomplished
involuntary world view transformation of its victims, whereas conversion and
commitment to NRMs such as the Bruderhof (the subject of his book) was
essentially voluntary and thus didn’t really constitute thought reform or
brainwashing, even though it might be “structurally similar” to it in certain
ways. Furthermore, as he clearly says in this passage, at that time Zablocki
believed that: 1) all forms of world view [re]socialization are structurally
similar to thought reform; 2) some forms of world view resocialization, such
as thought reform, are involuntary, whereas other forms, such as conversion to
the Bruderhof, and presumably to other new religions or so‑called “cults,” are
voluntary.
X disputational
X relevant correct
My distinction has to do with whether the subjects of
persuasion were obtained by force and brought to the persuasive site in chains
or whether they decided to join a purely voluntary association. Anthony and I
agree that some organizations obtain their members by force and some by
voluntary recruitment. But I never claim that this has any bearing on the
“voluntarism” of the persuasive process itself.
Proposition 57. (Page 260) Careful review of his
1971 book reveals that Zablocki believed:
... There is a danger, however,
in making structural comparisons among totally different processes. A fault of
the comparative method is an inevitable tendency to stress similarities and
neglect differences. As I mentioned earlier, thought reform and Bruderhof
resocialization are poles apart phenomenologically. One is coercive, rigid, and
exploitive [sic]. The other [the Bruderhof] is voluntary, flexible, and
loving. Thought reform sacrifices its victims for the sake of future
generations. The Bruderhof, although concerned with the future, offers its
members a deeply rewarding life in the present. (1971, 265‑66; emphasis mine)
disputational
relevant X correct
Isn’t it interesting that coercive institutions such as
prisons can practice persuasive techniques that are structurally similar to
those practiced by voluntary associations such as religious communes? Far from
being an argument against brainwashing theory, this similarity just makes the
theory all the more intriguing.
Proposition 58. (Page 260) Zablocki at the time that
he wrote it believed that even though conversion to new religious world views
shares certain “structural similarities” to the process of Communist thought
reform as described by Lifton, nevertheless the resocialization processes in
the Bruderhof and other new religions did not result in involuntary commitment
to the new world views, whereas in Communist thought reform it did result in
involuntary commitment to Communist ideology.
disputational
X relevant X correct
See my comments on propositions 56 and 57.
Proposition 59. (Page 261) Since Zablocki says in
the passage from his 1971 book quoted earlier that all world view
resocialization is structurally similar to brainwashing, this would presumably
mean that he now thinks, at some level, that all world view resocialization
in involuntary. This in turn would seem to support the conclusion, which has
been advanced by various critics of the cultic brainwashing theory, that the
concept of involuntary world view resocialization, (brainwashing), is an
evaluative rather than an empirical concept, in that it cannot be falsified.
X disputational
X relevant correct
This is total nonsense and is useful only in showing
glaringly the absurd deductions Anthony can draw from his incorrect assumptions.
My statement, again from a thirty-year-old book, is meant to show that there is
nothing mystical or unusual about brainwashing. It’s just an extreme form of
ordinary persuasion.
Proposition 60. (Page 261) The following quote
suggest that Zablocki is attempting to show in his brainwashing publications
that brainwashing produces a rather extreme degree of the loss of the voluntary
capacity to control one’s own actions:
Within family sociology, it used
to be the tendency to say of battered wives, “Why don’t they just leave the
abusive situation? Nobody is holding them there by force.” Now it is much better
understood that chronic battering can wear down not only the body but the
capacity to make independent decisions about leaving. I fail to see any
significant differences between this phenomenon and the phenomenon of the
charismatically abused participant in a cult movement. (1998, 231; emphasis
mine)
X disputational
X relevant correct
Is Anthony denying that battered wives often stay in
abusive marriages because they imagine the costs of leaving to be higher than
they actually are? Those who have studied spouse abuse broadly agree that this
phenomenon (subjectively elevated exit costs) exists, and they have also noticed
that some abused spouses succeed in overcoming their fears and
leaving anyway. Clearly, therefore, there is no loss of free will inherent in
the subjectively elevated exit costs of battered wives. Why then should there be
for brainwashed cult members?
Proposition 61. (Page 262) [Zablocki] contend[s]
that converts to new religions are unable to leave them, or at least find it
very difficult to leave them even when they want to.... Zablocki unequivocally
asserts that brainwashing produces a severe compulsion to remain in cults
once one has joined, and in this sense, produces a loss of free will.
X disputational
X relevant correct
Once again, Anthony fudges the critical distinction between
“unable to leave” and “find it very difficult to leave.” He treats the two as if
they were equivalent. Taken to its conclusion, this line of reasoning would deny
the existence of social constraints on human action.
Proposition 62. (Page 262) It seems obvious, then,
that Zablocki’s assertion that his brainwashing perspective does not address
the issue of free will is an example of tactical ambiguity rather than an
internally consistent characteristic of his argument. This in turn would
seem to indicate that, in this respect at least, Zablocki’s brainwashing
argument is congruent with the CIA brainwashing paradigm that he claims to
repudiate. It seems to me that both Zablocki’s paradigm, and the CIA
brainwashing paradigm from which it is derived, are primarily aimed at
demonstrating the loss of free will of the alleged victims of brainwashing.
X disputational
relevant correct
Notice that Anthony is unable or unwilling to argue against
my theory on its own terms. He prefers the indirect route of “proving” that my
theory is identical to a previously discredited theory. This is not optimal
scientific method, but it might be acceptable if Anthony had succeeded in
demonstrating this equivalence. But, as my previous comments have shown, he has
failed to do so. That failure leaves him with only two options: to refute my
theory directly on its own terms, or to abandon the attempt altogether. He does
neither.
Brainwashing Versus Totalitarian Influence: Summary of Empirical Conflicts
Proposition 63. (Page 262) Zablocki’s cultic
brainwashing theory, like the earlier statements of cultic brainwashing theory,
such as those of Singer and Ofshe, is contradicted by its own claimed
theoretical foundation; that is, the research of Schein and Lifton.
X disputational
X relevant correct
See my comments on proposition 2. Again we find an
unwillingness to discuss my theory on its own terms. We are already at
proposition 63, and not once have any of the hypotheses of my theory been so
much as mentioned! Nor will they be in the remaining 35 propositions in
Anthony’s argument. Instead, we have another attempt at guilt by association.
This time, the guilty association is with Schein and Lifton rather than the CIA,
and my sin is not equivalence but contradiction. Anthony will not understand
that, in the social sciences, bibliographic sources are just bibliographic
sources. They are not authorities.
Proposition 64. (Page 262ff) As I have shown above,
the research of Schein and Lifton on westerners in thought reform prisons, upon
which Zablocki claims to base his brainwashing theory, confirmed and extended
Hinkle’s and Wolff’s earlier findings. As I argued in my earlier article, their
research on Communist forceful indoctrination practices disconfirmed the CIA
model with respect to 8 variables. These are: 1) conversion; none of Schein’s
and Lifton’s subjects became committed to Communist world views as a result of
the thought reform program. Only two of Lifton’s 40 subjects and only one or two
of Schein’s 15 subjects emerged from the thought reform process expressing
sympathy for Communism and none of them actually became Communists. Communist
coercive persuasion produced behavioral compliance but not belief in Communist
ideology (Lifton, 1961, 117, 248‑49; Schein, 1958, 332, 1961, 157‑166, 1973,
295); 2) predisposing motives; those subjects who were at all influenced by
Communist indoctrination practices were predisposed to be so before they were
subjected to them (Lifton, 1961, 130; Schein, 1961, 104‑110, 140‑156; 1973,
295); 3) physical coercion; Communist indoctrination practices produced
involuntary influence only in that subjects were forced to participate in them
through extreme physical coercion. (Lifton, 1961, 13; 1976, 327‑328; Schein
1959, 4371, 1961, 125‑127); 4) continuity with normal social influence; The
non‑physical techniques of influence utilized in Communist thought reform are
common in normal social influence situations. (Lifton, 1961, 438‑461; Schein,
1961, 269‑282, 1962, 90‑97, 1964, 331‑351); 5) Conditioning; No distinctive
conditioning procedures were utilized in Communist coercive persuasion (Schein,
1959, 437‑438, 1973, 284‑285; Biderman, 1962, 550); 6) psychophysiological
stress/debilitation; The extreme physically based stress and debilitation to
which imprisoned thought reform victims were subjected did not cause involuntary
commitment to Communist world views. (Hinkle and Wolff, 1956; Lifton, 117,
248‑49; Schein, 1958, 332, 1961, 157‑166, 1973, 295) Moreover, no comparable
practices are present in new religious movements (Anthony, 1990, 309‑311); 7)
Deception/Defective Thought; Victims of Communist thought reform did not become
committed to Communism as a result of deception or defective thought. (Schein,
1961, 202‑203, 238‑39); 8) Dissociation/Hypnosis/Suggestibility. Those subjected
to thought reform did not become hyper‑suggestible as a result of altered states
of consciousness, e.g. hypnosis, dissociation, disorientation, etc. (Schein,
1959, 457; Biderman, 1962, 550)
disputational
relevant X correct
I’ve given Anthony the benefit of the doubt with regard to
his summary of the eight criteria in Lifton’s and Schein’s research that
disconfirm the CIA model. But none of this is relevant to the theory I have
presented. My theory stands alone, is not dependent on any previous theories
(although I am enormously intellectually indebted to them), and needs to be
evaluated on its own terms and in its own terminology. Anthony consistently
refuses to even begin such a project.
Proposition 65. (Page 264) The primary basis for
Zablocki’s “exit costs” third stage brainwashing perspective is the notion that
the research of Lifton and Schein had demonstrated that Communist thought reform
could bring about a conversion to the Communism world view.
disputational
X relevant correct
It is not at all clear what Anthony means when he speaks of
a “perspective” having a “basis.” If he means that a certain proposition about
the efficacy of brainwashing for producing political conversions is an unstated
postulate of my theory, he is wrong. Since I do not state this postulate, the
burden of proof is on Anthony to show that my theory is somehow, nonetheless,
dependent on such a postulate—whether logically or in some other way. Anthony
has not undertaken this task, so his assertion is merely unfounded speculation.
For what it’s worth, I don’t believe that Communist thought reform could bring
about a “conversion” (whatever that might be) to the Communist world view. My
reading of Lifton and Schein suggests that they did not believe this either.
Falsifiability/Testability of Zablocki’s Formulation: Demarcation of Science
from Pseudoscience
Proposition 66. (Page 265) It would seem that at the
most he can claim that he has developed a testable theory which in the future
could serve as the basis for scientific research. Surprisingly, when his
brainwashing articles are read carefully, this turns out to be all that he is
really claiming. Consequently, the scientific status of Zablocki’s exit costs
brainwashing model stands or falls upon its testability.
disputational
X relevant X correct
Why does this ‘discovery’ by Anthony of what I have always
explicitly stated come as a surprise to him? That is what social scientists do,
Dr. Anthony. We propose carefully constructed theories and then later do
research to test them. If research produces findings that contradict the results
expected by the theory, we must then revise or discard the theory. If anything
about this approach is remarkable, it escapes me.
Proposition 67. (Page 265) Early in his brainwashing
articles Zablocki boldly claims that his brainwashing theory is falsifiable,
i.e. testable.
disputational
X relevant X correct
Certainly I claim that the theory I propose is falsifiable.
I would not knowingly propose a theory that didn’t meet this minimal
epistemological criterion. Why Anthony ascribes boldness to this claim is
unclear to me, although I’m flattered, I suppose, in a Walter Mittyan kind of
way.
Proposition 68. (Page 266) It seems to me that in
Zablocki’s definition of them, neither the independent variable (the coercive
cause), nor the dependent variable (the involuntary effect) of his brainwashing
formulation are falsifiable. (sic) In other words, he has not supplied criteria
which clearly differentiate brainwashing techniques and their allegedly
involuntary effects from components of normal social influence, in particular
from other forms of religious conversions and commitments.
X disputational
X relevant correct
One doesn’t ordinarily speak of variables as being
falsifiable or unfalsifiable. That terminology is reserved for discussion of
hypotheses. Perhaps Anthony is just a little confused and is trying to say that
he thinks my definitions of these variables are not sufficiently operationalized
to allow for testing. If true, that would be a more comprehensible charge. I
have provided operational definitions in my theory for all the conceptual terms
I use, either as independent or dependent variables. If Anthony finds these
operational definitions less than persuasive, he should point out the specific
definitions he finds lacking. I will then try to deal with his points and
clarify the definitions. Another point about this proposition is that his
identification of a single independent variable and a single dependent variable
is overly simplistic and does not fit my model. On page 186, I provide a diagram
that maps the causal connections implied by my theory. Depending on the section
of the path diagram one is looking at, several variables might be considered
“independent” or “dependent,” and others are intermediary and can serve as both.
Falsifiability and Independent Variable of Brainwashing
Proposition 69. (Page 266) Zablocki primarily
identifies brainwashing techniques, i.e. the independent or causal variable,
with Lifton’s accounts of the 12 psychological steps which Lifton found to
characterize the Communist thought reform process for his subjects (Lifton,
1961, 65‑85). It is Zablocki’s application of this model of conversion to the
Bruderhof which constitutes his only concrete empirical application of thought
reform research to a new religion. Zablocki also claims that this process
usually must occur in the context of a totalistic social organization in order
to constitute brainwashing.
disputational
X relevant X correct
This proposition is slippery. Strictly speaking, his
labeling of the set of actual brainwashing techniques as the independent
variable is not correct for the reasons stated in my discussion of proposition
68. But I coded this proposition as correct because it does faithfully capture
the important point that the actual techniques used by brainwashers,
ideal-typically, are precisely the
twelve steps delineated by Lifton. Anthony is also correct in stating that I
argue that this process generally takes place in the context of a totalist
organization. But these statements are merely descriptive and do not attempt to
dispute the theory.
Proposition 70. (Page 266f) Zablocki interprets the
imposition of such psychological steps as occurring through techniques that
impose a primitive state of consciousness (and resulting suggestibility) that
he variously describes as disorientation, hypnosis, transference, the suspension
of critical rationality, and so on. But as we also saw, he supplies no
criteria or generally accepted empirical foundation which differentiates these
different terms for primitive consciousness in a falsifiable way from other
forms of religious experience.
X disputational
X relevant correct
In this proposition and the next one, Anthony seems to be
saying that he believes that some of the concepts I use are lacking in
discriminant validity. His argument is complicated by the fact that he throws
out four unrelated terms as if they are synonymous with one another. I don’t use
the terms hypnosis and transference in my theory.
Disorientation, as I discussed in my comments to proposition 25 above, is a
commonplace term in psychology with a well-established, consensually agreed-upon
operational definition. I don’t use the term suspension of critical
rationality in my theory. I think he is confusing this with uncritical
obedience. I think he’s worried that it may be impossible for a researcher
to distinguish cases of uncritical obedience from cases of wholehearted
agreement. But it is possible to distinguish these in situations where the
collectivity’s beliefs and policy positions are changing—especially when they
are rapidly changing. In cases of wholehearted agreement, the temporal sequence
of attitude change in the leader and the follower will be random. In cases of
uncritical obedience, the change will always occur first in the leader and then
very rapidly in the follower.
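To make the proposed test concrete, here is a minimal sketch, in Python, of how the temporal-ordering comparison might be carried out. Everything in it is my own illustrative assumption rather than part of the theory: the hypothetical dates, the variable names, and the choice of a simple one-sided binomial test on the proportion of issues on which the leader changed position before the follower.

    # Sketch of the temporal-ordering test for uncritical obedience versus
    # wholehearted agreement. All data below are hypothetical.
    from scipy import stats

    # (leader_change_day, follower_change_day) for each observed position shift,
    # in days from an arbitrary origin, on a series of issues.
    change_dates = [(10, 12), (45, 46), (80, 81), (120, 125), (200, 203), (250, 251)]

    leader_first = sum(1 for leader, follower in change_dates if leader < follower)
    n = len(change_dates)

    # Under wholehearted agreement the ordering should be essentially random
    # (probability 0.5); under uncritical obedience the leader should
    # consistently change first.
    result = stats.binomtest(leader_first, n, p=0.5, alternative="greater")
    print(f"leader changed first on {leader_first}/{n} issues, p = {result.pvalue:.3f}")

With a larger set of dated position changes, one could also examine the lag between the leader’s change and the follower’s change, since the prediction is that under uncritical obedience the follower’s change follows very rapidly.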
Proposition 71. (Page 267) Zablocki’s application of
the 12 psychological steps model to the Bruderhof does not supply a falsifiable
way of differentiating a brainwashing process of religious conversion from
other forms of religious conversion.
X disputational
relevant correct
Here again, the term falsifiability is
inappropriate. What Anthony appears to be saying is that the brainwashing
process, as a concept, lacks discriminant validity because there is no
operational way to distinguish it from the steps one may go through in the
process that leads to religious conversion. But such a distinction is necessary
only if one imposes one’s own value judgments on these processes. If
brainwashing is necessarily bad and religious conversion is necessarily good, a
distinction is important. But from a value-neutral point of view, these twelve
stages simply are what they are. My theory simply predicts that, if the
leadership of a group systematically takes some or all of its members through
these stages, certain behavioral consequences (enumerated in the theory) will
follow. The likelihood of these stages producing the internal changes of heart
within the individual that Anthony calls “religious conversion” is interesting
to know about, but it falls outside the scope of this theory.
Proposition 72. (Page 268) Zablocki’s
interpretation of any conversion that includes the 12 psychological steps as
brainwashing in his recent articles, therefore, tends to confirm my
impression that this brainwashing interpretation of Lifton’s research is an
evaluative rather than a scientific perspective, intended to call into question
the authenticity of religious conversion experiences simply because they are
experiential in nature.
X disputational
X relevant correct
This is ironic because, as I show in my comments to
proposition 71, it is Anthony who is imposing the value judgment. In my view,
brainwashing is an ethically neutral behavioral process of persuasion. We need
to be careful here to distinguish arguments about the wisdom
of using the word “brainwashing” to describe the process I have been discussing
from arguments about whether it is possible to determine empirically that the
process has occurred, regardless of whether I have labeled the process wisely or
unwisely. We have agreed to avoid discussion of the wisdom of using the word in
this essay, so we can focus on the scientific status of the theory. Anthony is
not correct that my use of this term, as I have defined it, is evaluative
rather than scientific. At most, Anthony might argue that the word itself has
become so value laden that social scientists should no longer be free to use the
term, no matter how carefully they define it—that its use will cause other
people to misread the theory as making a moral argument that something bad is
going on.
Falsifiability and the Dependent Variable of Brainwashing
Proposition 73. (Page 269) Zablocki’s brainwashing
formulation has no better luck in defining a falsifiable dependent variable. In
the course of his two articles, it becomes clear that he has not been able to
provide a testable criterion for involuntary belief and conduct, the so‑called
“exit costs” that he maintains are the outcome of brainwashing. As we saw
above, the novel feature of this 3rd stage brainwashing argument is to claim
that brainwashing does not lead to involuntary conversions but rather to
involuntary commitments to a new world view. But what is the testable or
falsifiable criterion of involuntary commitment to the new world view such that
an objective outsider could determine that Zablocki’s brainwashing theory has
been either confirmed or disconfirmed?
X disputational
relevant correct
This proposition is quite muddled. The theory doesn’t need
a testable criterion for involuntary belief because it doesn’t utilize a concept
called “involuntary belief” or anything similar to it; ditto for “involuntary
commitment to the new world view.” As in the previous few propositions, Anthony
is misusing the term falsifiability. This term applies only to hypotheses
and it is meaningless when applied to concepts. What he appears to mean is that
he thinks there are no operational criteria that an objective outside researcher
can use to determine whether exit costs have increased, decreased, or remained
the same. But “exit cost” is an easy variable to operationalize. Exit cost
is defined simply as the magnitude of what it will cost to get a person to
leave. If one year you ask, and the person says she will leave for $10,000, and
when you come back the next year, she says it will take $100,000 to get her to
leave, we can infer that exit costs have increased. When the subjective costs
are emotional and cannot be translated into vulgar dollar amounts, measuring
such a change is more challenging. But, in principle, doing this is no
more difficult than constructing a test to determine whether a person’s racial
prejudice or his self-esteem has increased from one year to the next—something
that social psychologists routinely do.
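As a toy illustration of the dollar-denominated version of this measurement (using the hypothetical figures above; the variable names are my own), the change in exit cost can be computed directly from the two stated amounts:

    # Toy illustration of the bribe-based operationalization of exit cost,
    # using the hypothetical amounts from the text.
    exit_cost_year1 = 10_000   # amount the member says would induce her to leave, year 1
    exit_cost_year2 = 100_000  # answer to the same question one year later

    change = exit_cost_year2 - exit_cost_year1
    print(f"Exit cost rose by ${change:,}, "
          f"a factor of {exit_cost_year2 / exit_cost_year1:.0f}.")

Subjective, non-monetary exit costs would instead be captured by a standardized attitude scale, as suggested in the text.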
Proposition 74. (Page 270f) According to Zablocki
the exit costs idea depends upon his finding that converts to totalistic new
religious movements find it more difficult to leave than they would if they had
converted to other movements. This is the whole basis for his “exit costs”
reinterpretation of the brainwashing paradigm.... Moreover, according to
Zablocki, the brainwashed convert is aware that he wants to leave the
“socio‑psychological prison, but is unable to do so in very much in the same way
as someone who is addicted to drugs or other undesirable habits.” But what is
measurable about this feeling of being trapped such that we could scientifically
determine that converts to totalistic new religious world views are more apt to
have difficulty leaving their religion than are those committed to other types
of world views? ... It would seem that a minimum test of his theory would be
if Zablocki could show that totalism as an independent variable could
differentially predict the dependent variable of lower defection rates than in
non‑totalistic groups, particularly if such lower turnover rates were combined
with evidence that members consciously wanted to leave but had great difficulty
doing so while they were still members of the group.... In Zablocki’s only
systematic discussion of this issue, he acknowledges that there is no evidence
that totalistic NRMs have a lower turnover rate than other new religions, and he
even contends that such differential rates of defection are irrelevant to
testing his exit costs hypothesis.
X disputational
X relevant correct
Membership-turnover data is not appropriate for measuring
exit costs. Many organizations have high overall membership turnover along with
a highly committed core membership that is stable. Exit costs are an
individual-level phenomenon and must be measured at the individual level. When
exit costs are objective (money, for example) one can measure them simply by
determining the level of bribe necessary to get the person to leave. When exit
costs are more subjective, they present precisely the same measurement
challenges as those faced by anyone who would measure any other
social-psychological phenomenon. Anthony has not demonstrated why a
questionnaire that can measure authoritarianism or self-esteem cannot also
measure subjective exit costs. Once exit costs are measured, only a simple
statistical procedure is required to determine whether those costs tend to
increase for a person who has gone through the twelve persuasion stages
discussed above. If E stands for exit cost, the problem is whether E
after brainwashing is significantly greater than E before brainwashing.
This is the simplest of quasi-experimental designs: a one-group pretest-posttest
comparison. If the researcher wishes to control for spuriousness by examining
covariates, the design becomes a little more complex, but it is still
straightforward.
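A minimal sketch of that pre/post comparison follows. It assumes, purely for illustration, that exit costs have already been scored on some standardized scale for the same individuals before and after the twelve-stage process; the data, the variable names, and the choice of a one-sided paired t-test are my own assumptions, not prescriptions from the theory.

    # Sketch of the one-group pretest-posttest comparison of exit costs.
    # All scores below are hypothetical.
    import numpy as np
    from scipy import stats

    # Exit-cost scores for the same eight individuals, before and after
    # exposure to the twelve-stage persuasion process.
    e_before = np.array([12, 15, 9, 20, 14, 11, 18, 13])
    e_after  = np.array([19, 22, 10, 31, 21, 12, 27, 16])

    # One-sided paired test of the prediction that E after exceeds E before.
    t_stat, p_value = stats.ttest_rel(e_after, e_before, alternative="greater")
    print(f"mean increase = {np.mean(e_after - e_before):.2f}, "
          f"t = {t_stat:.2f}, p = {p_value:.4f}")

Controlling for covariates, as mentioned above, would simply mean replacing the paired test with a regression of the post-score on the pre-score and the covariates.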
Zablocki’s Admission Brainwashing Not Testable/Falsifiable
Proposition 75. (Page 272) Surprisingly, even
Zablocki, towards the end of his second brainwashing article, acknowledges that
he has failed to make good on his original intent to demonstrate that his exit
costs interpretation of the brainwashing idea is a testable scientific concept.
In his 1998 article, pg. 227, he admits that so‑called scientific “conjectures”
such as the brainwashing concept, are not testable after all and are merely
“plausible...”
X disputational
X relevant correct
This proposition stems from Anthony’s confounding of my
conjectural speculations about why and how brainwashing works with my simple
social-psychological theory that brainwashing exists and has certain measurable
effects. I’ve made my argument clear in numerous
places about this distinction. Only the theory that brainwashing exists and has
certain measurable effects is currently a falsifiable theory. I would love to be
able to put forth a more powerful theory that explains exactly how and why
brainwashing works, but, at present, I am unable to do so. This I have freely
admitted, so I don’t understand why Anthony chortles over this fact as if he had
caught me out in some sneaky maneuver. Explanations of how and why remain at the
conjectural level. At various times in my writing, I have experimented with
psychoanalytic explanations, rational-choice-theory explanations,
neuropsychological theories, addictive models, and so on. I think that all of
these approaches are intriguing, but none, as yet, rises to the level of a true
falsifiable theory. Anthony’s critical method of ransacking all of my published
works that range over more than 30 years, without contextual regard as to
whether I am speculating conjecturally or laying out a falsifiable theory, only
breeds confusion. I don’t think the fault is mine because, at least in my
writing in the past ten years, I have always clearly labeled explanations as
being either conjectural or theoretical according to whether I believe they rise
to the level of testability.
Proposition 76. (Page 272) Zablocki apparently has
come to believe that there is a well‑recognized difference in the
epistemological requirements for scientific conjectures as opposed to scientific
theories, whereby the former must only tell a “plausible story” whereas actual
scientific theories must be falsifiable or testable.
X disputational
relevant correct
See my comments on proposition 75. Science is full of
instances in which it is known that a phenomenon exists and has certain
measurable effects, but it is not yet known how or why the phenomenon works. The
study of electricity was in this situation in the nineteenth century. Standard
scientific operating procedure in such instances involves measuring the
phenomenon while offering conjectures that tell a plausible story as to why it
might have the effects it does. Sometimes, as science progresses, one or more of
these conjectures might graduate and become a testable theory.
Proposition 77. (Page 273f) In Popper’s use of these
terms, there is no real difference between the concepts of conjectures and
theories, nor between falsifiability and testability.
disputational
relevant X correct
There is no difference between falsifiability and
testability. On this point Anthony and I are agreed. I don’t ever claim that
there is a difference. As to conjectures and theories, I have explained the
distinction in the way I use these terms, and this distinction conforms to
normal scientific usage.
Proposition 78 (Page 274) ...with respect to the
term and concept of plausibility, I have been unable to find this term used as
one which is relevant to differentiating scientific from pseudo‑scientific
concepts in any of Popper’s works.
disputational
relevant X correct
I take Anthony’s word for it that he has been unable to
find this term in Popper’s works. So what? As long as I clearly define what I
mean by conjecture (and especially if this usage is widespread in the scientific
literature), who cares whether Popper ever used the term? Even if Anthony wishes
to insist on using Popper as the absolute arbiter of all things epistemological,
he would have to show, not that Popper never used the term, but that he warns
against using the term. This Anthony cannot do.
Proposition 79. (Page 274f) At any rate, Zablocki at
several points in his 1998 article acknowledges that his brainwashing theory, at
least as currently formulated, is not “testable.” (Zablocki, 1998, 216, 239‑240,
244) In his last reference to this issue he states:
At some point in the future,
brainwashing theory will be testable at the chemical level of the brain. Until
then, hopefully I have at least traced the outlines of a viable agenda for
theory building. (1998, 244)
X disputational
relevant correct
See my comments on propositions 75 and 76. Here, Anthony is
confounding my statement that brain imaging is not currently adequate to
determine whether there are neurological changes in brainwashed subjects with my
theoretical design. This confusion is what comes of overusing keyword searches.
Anthony finds the word “testable” in my writings, and he pounces on it without
regard to whether the word is referring to a theory or a conjecture about the
theory.
Proposition 80. (Page 275) Zablocki is herein
covertly admitting that his original aim to demonstrate that brainwashing is
a testable scientific concept has failed.
X disputational
relevant correct
I resent Anthony’s throwing around words such as “covert,”
which means concealed or disguised. His confusion between theory and speculation
in my writing does not justify the charge of concealment. When I find something
I believe I can test, I label it a theory. When I find some explanation that
seems plausible to me but that currently is not testable, I label it a
conjecture. Even if Anthony doesn’t find this distinction sufficiently Popperian
for his tastes, he has to admit that I make it in an open and straightforward
way with no attempt at concealment.
Proposition 81. (Page 275) I am unaware of any
credibly scientific research which differentiates voluntary from involuntary
forms of social influence on the basis of differential neurophysiological
variables, and I think that it is unlikely that such falsifiable differential
criteria with be discovered any time soon.
disputational
relevant X correct
There is no research that differentiates voluntary from
involuntary influence because involuntary influence is not a well-defined
scientific concept. Anthony and I agree on that point, and I don’t use this
pseudoconcept in my work. As evidenced in my comments to previous propositions,
Anthony’s attempts to find this pseudoconcept in my writings have failed. So his
comment is not relevant.
Conclusion—Brainwashing Versus Totalitarian Influence: Unfalsifiable
Brainwashing Formulations and Civil Liberties
Proposition 82. (Page 276) [Zablocki’s] own [early]
research (1971; 1980) not only is inconsistent with his more recent brainwashing
formulation, but it actually disconfirms it in central respects.
X disputational
relevant correct
First, there would be nothing unusual or suspect in a
person’s 1971 writings being inconsistent with his writings in 2001. I argued
for many things as a kid that I no longer would argue for as a grownup. But
ironically, my approach to brainwashing is not one that has changed very much in
these thirty years. I think that my formulation has become more careful and
precise as I have responded to numerous criticisms, but that the basic model is
pretty much the same Liftonian model that I used to account for the strenuous
resocialization exercises I observed while I was studying the Bruderhof. I have,
however, become disenchanted with the psychoanalytic conjectures that were
fashionable in my youth, and I no longer use them. This change might be what
Anthony has incorrectly seized on as an “inconsistency.”
Proposition 83. (Page 276) Zablocki’s formulation is
stated very ambiguously, a characteristic that results in its being impervious
to definitive empirical disconfirmation.
[X] disputational   [ ] relevant   [ ] correct
This statement is pure opinion. Readers might disagree
about the validity of the eight definitions and twelve hypotheses that
constitute my theory. But I don’t think anyone could argue that there is
anything ambiguous about how they are stated. If Anthony disagrees, let him cite
chapter and verse, and I’ll try to make myself clearer. Meanwhile, the
interested reader can look at these definitions and hypotheses himself and
decide whether they are stated ambiguously.
Civil Liberties and Scientists’ Duties As Citizens: Testimony
Proposition 84. (Page 279) To the extent that
Zablocki is successful in reviving the credibility of the brainwashing idea, ...
his formulation will tend to encourage cultic brainwashing trials and other
forms of religious prejudice, and will serve as the theoretical foundation for
actual testimony in such trials...
[ ] disputational   [ ] relevant   [ ] correct
This commentary is totally irrelevant to the scientific
question under discussion. Maybe Anthony is wrong, but maybe he is right. In
either case, the issue is political, not scientific.
Proposition 85. (Page 279f) Zablocki’s criticism ...
is another indication that his viewpoint will, to the extent that it is
influential, work in favor of further brainwashing trials.
[ ] disputational   [ ] relevant   [ ] correct
See my comments on proposition 84.
Research on Harm in NRMS; General Research on Harm
Proposition 86. (Page 281) As Zablocki sees it, the
brainwashing concept is the only valid approach to evaluating why, or if,
new religions have harmful social or psychological consequences.
[X] disputational   [ ] relevant   [ ] correct
This interpretation is a total fabrication. I never say
anything like this, and I don’t believe it. I will give $100 to Dick Anthony’s
favorite charity if he can come up with a quotation from my work that supports
this claim. It is telling that, although Anthony uses citations liberally, he
includes no citation for this accusation.
Totalitarian Influence and Research on Harm
Proposition 87. (Page 283) it seems to me that thus
interpreted, the totalitarian influence tradition has produced testable
theories, e.g. Rokeach’s dogmatism scales or Adorno et al.’s authoritarian
personality measures.
[X] disputational   [X] relevant   [ ] correct
Here again, Anthony confuses measurement with theorizing.
Rokeach’s dogmatism scale and Adorno’s authoritarianism scales are attempts to
operationalize and measure concepts. They may be very useful, but they are not
theories and they make no claim to be theories. Ironically, by citing these
particular authors, Anthony undermines some of his other criticisms of my work.
These scales that he admires come simply from responses to questionnaire items.
If Anthony recognizes the value of using questionnaires to measure these
invisible, internal, individual states, why not apply the same reasoning to
questionnaires that measure exit costs or uncritical obedience?
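To make the parallel concrete: whether the construct is dogmatism, authoritarianism, exit costs, or uncritical obedience, the measurement logic of such a questionnaire scale is the same. The sketch below (Python; the item wordings, the 1-to-5 response format, and the reverse-coding choices are hypothetical illustrations, not items drawn from Rokeach, Adorno, or my own instruments) shows that logic: average the item responses, reverse-coding any item worded in the opposite direction.

```python
# Illustrative Likert-scale scoring. The item wordings, the 1-5 response
# format, and the reverse-coding flags are hypothetical; they are not items
# from Rokeach's, Adorno's, or Zablocki's instruments.

ITEMS = [
    # (item text, reverse_coded)
    ("I could leave this group tomorrow without losing anything I value.", True),
    ("Leaving the group would cost me most of my close relationships.", False),
    ("I would carry out a directive from leadership even if no one checked on me.", False),
]

def score_scale(responses, items=ITEMS, scale_max=5):
    """Average the 1..scale_max responses, reverse-coding flagged items."""
    if len(responses) != len(items):
        raise ValueError("one response per item is required")
    total = 0
    for value, (_, reverse) in zip(responses, items):
        if not 1 <= value <= scale_max:
            raise ValueError(f"responses must lie on the 1..{scale_max} scale")
        total += (scale_max + 1 - value) if reverse else value
    return total / len(items)

if __name__ == "__main__":
    # A hypothetical respondent reporting high perceived exit costs and obedience.
    print(round(score_scale([1, 5, 4]), 2))  # 4.67
```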
Proposition 88. (Page 283) The totalitarian
influence interpretation of this tradition of research more parsimoniously
handles the range of research presently available on new religions.... It seems
obvious to me that a theory that hopes to explain conversion to new religions,
as well as their psychological impact on individuals and the evolution of such
movements in either positive or pathological directions, must take individual
differences into account.
[X] disputational   [ ] relevant   [X] correct
Anthony is correct that individual differences should be
taken into account. The comment is irrelevant, however, because nothing in my
theory is incompatible with doing so. What Anthony calls the “totalitarian
influence interpretation” is not in competition with brainwashing theory. A
scholar using both might well do better at explaining any given NRM than a
scholar using only one or the other. In personal conversations with Anthony, I
have gotten the impression that he thinks that brainwashing must be used as the
absolute and complete explanation of what makes a particular NRM tick. This is a
legitimate criticism of some of the partisan literature of the anticult
movement. But such claims of a single cause that explains everything are not
found in scientific writings.
Brainwashing Ideology As Totalism: Apostate Tales
Proposition 89. (Page 286f) NRM scholars view these
findings as an indication that the brainwashing view of their experiences
results from socialization into the anticult movement and is adopted as an
exculpatory mechanism for reentering mainstream institutions without being
blamed for having rejected them in the first place. Brainwashing authors adopt
various explanations for defending the scientific accuracy of the brainwashing
claims of ex‑members, e.g. some members of the same group were brainwashed
and some were not, or ex‑members associated with the anticult movement have
learned to understand their past memberships correctly whereas those not
associated with it remain in denial about why they were members in the first
place.
[ ] disputational   [ ] relevant   [ ] correct
This proposition is pure opinion. It is both incorrect and
irrelevant. Anthony is simply stating his own opinion as to why others come to
the opinions that they hold.
Proposition 90. (Page 287f) My idea is that the
anticult movement itself, and the brainwashing ideology which rationalizes it,
have many of the characteristics that the totalitarian influence tradition
describes as characteristic of totalistic organizations and ideologies.... If I
am correct about brainwashing ideology being a form of totalitarian influence,
it would presumably serve the function of ministering to a polarized self‑sense
and curing identity confusion by enabling converts to it to shift responsibility
for undesirable aspects of their personalities and former behavior onto a
scapegoated contrast category.
[ ] disputational   [ ] relevant   [ ] correct
Here Anthony seems to be arguing, ironically enough, that
ex-members get brainwashed into feeling brainwashed. But again, right or wrong,
this commentary is pure opinion and does not deal with the theory itself.
End Notes
Proposition 91. (Footnote 15, page 294). Lifton ...
could not accurately be claiming, as Zablocki interprets him as claiming, that
his ‘doubling’ concept is equivalent to the brainwashing notion of a false or
shadow self.... The conception of a false or shadow self proposed by Zablocki
... involves the notion of a new but inauthentic self.... Doubling, as Lifton
defines it ... could not reasonably be interpreted as implying the involuntary,
compulsive, or addictive attachment to a false self that Zablocki imputes to it.
[X] disputational   [ ] relevant   [ ] correct
I think that Anthony’s reading of Lifton is wrong. But
whether it is wrong or right is irrelevant to the question at hand. The concepts
discussed in this proposition do not appear in the theory but only in
conjectural explanations of how the brainwashing mechanism might affect the
self.
Proposition 92. (Footnote 16, page 296). In defining
brainwashing as a process that is accomplished primarily by means of
disorientation, Zablocki appears to be giving the disorientation term a more
metaphorical and less precise meaning than its scientific meaning in
psychiatry ... In this less precise form, ... it is unfalsifiable.
[X] disputational   [X] relevant   [ ] correct
This proposition is wrong in two ways. First, I never argue
that brainwashing is accomplished primarily by disorientation. I do argue that
disorientation (in the precise way the term is used in clinical psychiatry)
figures transiently in the process as a by-product of the intense stress that
the subject is under while brainwashing is going on. Second, I do not use the
term in a metaphorical sense. Confusion about position in space and time is the
standard definition in psychiatry and also the one that I use.
Proposition 93. (Footnote 17, page 297). Zablocki
... claims that Orne’s 1972 article reports research demonstrating that people
‘can be hypnotized to do things against their will.’
[X] disputational   [X] relevant   [ ] correct
The exact quote from my “Exit Cost” paper in Nova
Religio is the following: “Orne has done some interesting experimental work
on the extent to which subjects can be hypnotized to do things against their
will.” Anthony is merely being careless here. First of all, the statement
clearly does not state or imply a claim that Orne found that people ‘can be
hypnotized to do things against their will.’ Secondly, this whole discussion of
hypnotism appears in a section of my article in which I discuss alternative
theories to the brainwashing theory. Anthony’s failure to place his quotation in
the proper context is yet another example of his excessive reliance on mindless
keyword searches as a substitute for actually reading the articles he attempts
to criticize.
Proposition 94. (Footnote 18, page 297f). Zablocki
acknowledges that research has demonstrated that ... disorientation and
defective cognition are not characteristics of allegedly brainwashed members of
so called cults, but he attempts to get around this by claiming that these
qualities are only essential characteristics of cult converts during the process
of brainwashing, rather than after they have been successfully brainwashed....
In his earlier book, however, he appears to be saying that such cognitive
defects are continuing characteristics of cult members.
[X] disputational   [X] relevant   [ ] correct
It is puzzling that Anthony has so much trouble
understanding that certain social-psychological processes might produce
disorientation at times while a subject is going through the processes but not
as an end result. This phenomenon is not limited to brainwashing. There are
hundreds of commonplace processes that are stressful to go through but result in
tranquility when they are completed. Because disorientation can result from
extreme stress, the stress-tranquility relationship is understandable and even
predictable. Perhaps the most commonplace of these processes is courtship. How
many couples have felt disoriented at times while they are head-over-heels in
love, and yet they have managed to collect themselves well enough to stand
before the preacher and take their vows in at least relative sobriety? Because
Anthony is married, perhaps he can use his own experiences of this process to
help him understand that the distinction I am making is not remarkable. In fact,
as many field researchers have observed, the brainwashing process puts stress on
individuals that can lead to periods of disorientation. But after the person has
gone through this process of persuasion and become a deployable agent, the
stress becomes much less, and there is no reason to expect disorientation.
Proposition 95. (Footnote 27, page 301f). Zablocki’s
endorsement of Ofshe’s cultic brainwashing theory as essentially equivalent to
his own...
[X] disputational   [ ] relevant   [ ] correct
I do not say this, nor do I believe it. Anthony seems to
believe that if you give credit to another scholar by citing his work, you are
thereby asserting that your work and his work are equivalent. This is the
indirect method of criticism that Anthony seems to prefer. Rather than dealing
with my theory directly, he tries to establish its equivalence to a theory he
has previously discredited.
Proposition 96. (Footnote 29, page 302). Zablocki is
asserting ... that the brainwashing term and its variants, such as menticide,
are synonymous with the term ‘thought reform,’ [and] that the brainwashing and
thought reform models ... are equivalent also.
[X] disputational   [ ] relevant   [ ] correct
Here Anthony is just talking about words. Let him specify
what he means by the menticide model, by the brainwashing model, and by the
thought-reform model, and then let us see whether or not they are equivalent. As
stated, this proposition is so vague that it has no content.
Proposition 97. (Footnote 32, page 302f).
[Zablocki’s] assertion in this passage that the goal of brainwashing is to
create ‘deployable agents’ implicitly contradicts Zablocki’s assertion in the
same passage that brainwashing is not concerned with the issue of free will.
The deployable agent term is used only by advocates of the CIA brainwashing
model and indicates that the person so designated has become an involuntary
mental prisoner of the group to which he is committed.
[X] disputational   [X] relevant   [ ] correct
In this proposition, Anthony merely displays his ignorance
of the literature in organization sociology. The term deployable agent
has been around for a long time in sociology and social psychology. The term was
first used in sociology by Philip Selznick in his influential book, The
Organizational Weapon. Deployable agent has nothing to do with free
will or the lack of free will. It has to do only with the empirical question of
the extent to which an agent can be trusted to remain obedient when not under
surveillance.
Proposition 98. (Footnote 35, page 304). Zablocki
says that all of the eight themes of totalism do not have to be present in
order for an environment to be considered totalistic. Thus ... concrete
allegations of totalism by him would be difficult to falsify.
[X] disputational   [X] relevant   [ ] correct
Anthony is assuming that totalism must be construed as a
dichotomous variable. An organization or an environment is either totalistic or
it is not. Most sociologists see totalism as a continuum. A given organization
can be more or less totalistic than another organization. This is how Goffman
uses the concept in his work on total institutions. Because totalism is a
concept, not a theory, issues of falsifiability are misplaced in discussing it.
But the concept certainly measures something real. An organization with seven of
the criteria of totalism set out by Lifton is highly likely to be more
totalistic than one that has only two.
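For readers who want to see how little machinery this continuum view requires, here is a minimal sketch (Python; the coder judgments fed to it are hypothetical) that simply counts how many of Lifton's eight themes an observer records for an organization and returns a 0-to-8 score instead of a yes/no verdict.

```python
# Illustrative scoring of totalism as a count of Lifton's eight themes (0-8)
# rather than a dichotomy. The coder judgments below are hypothetical.

LIFTON_THEMES = frozenset({
    "milieu control", "mystical manipulation", "demand for purity",
    "cult of confession", "sacred science", "loading the language",
    "doctrine over person", "dispensing of existence",
})

def totalism_score(observed):
    """Return how many of the eight themes a coder recorded for a group."""
    unknown = set(observed) - LIFTON_THEMES
    if unknown:
        raise ValueError(f"not Lifton themes: {unknown}")
    return len(set(observed))

if __name__ == "__main__":
    org_a = {"milieu control", "demand for purity", "cult of confession",
             "sacred science", "loading the language", "doctrine over person",
             "dispensing of existence"}                    # seven themes observed
    org_b = {"milieu control", "loading the language"}     # two themes observed
    print(totalism_score(org_a), ">", totalism_score(org_b))  # 7 > 2
```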
Conclusion
Because none of Anthony’s ninety-eight propositions has
proven to be disputational, relevant, and correct, it follows that Anthony’s
chapter as a whole cannot be considered a successful critique of the
brainwashing theory as I have presented it in my chapter of Misunderstanding
Cults (which follows in this document as an Appendix). This conclusion, of
course, does not demonstrate that the theory is either useful or correct, but
only that Anthony’s chapter is not relevant to any discussion of its validity.
The only way that these conclusions can be refuted would be if Anthony (or
someone else) were to point out a proposition that I have missed, or if my own
comments on at least one of these propositions can be shown to be inaccurate.
Appendix: Zablocki’s Theory of Brainwashing in Charismatic
Groups Taken Verbatim from Chapter Five of Zablocki and Robbins,
Misunderstanding Cults, Toronto: University of Toronto Press, 2001 (Pp. 182
- 193)
Definitions
D1. Charisma is defined using the classical Weberian
formula: as a condition of “devotion to the specific and exceptional sanctity,
heroism, or exemplary character of an individual person, of the normative
patterns or order revealed or ordained by him.” Being defined this way, as a
condition of devotion, leads us to recognize that charisma is not to be
understood simply in terms of the characteristics of the leader, as it has come
to be in popular usage, but requires an understanding of the relationship
between leader and followers. In other words, charisma is a relational variable.
Charisma is defined operationally as a network of relationships in which
authority is justified (for both superordinates and subordinates) in terms of
the special characteristics discussed above.
D2. Ideological Totalism is defined as a
socio-cultural system that places high valuation on total control over all
aspects of the outer and inner lives of participants for the purpose of
achieving the goals of an ideology defined as all important. Individual rights
either do not exist under ideological totalism or they are clearly subordinated
to the needs of the collectivity whenever the two come into conflict.
Ideological totalism has been operationalized in terms of eight observable
characteristics: milieu control, mystical manipulation, the demand for purity,
the cult of confession, “sacred science,” loading the language, doctrine over
person, and the dispensing of existence.
D3. Surveillance is defined as keeping watch over a
person’s behavior and, perhaps, attitudes. As Hechter has shown, the need for
surveillance is the greatest obstacle to goal achievement among ideological
collectivities organized around the production of public goods. Surveillance is
not only costly, it is also impractical for many activities in which agents of
the collectivity may have to travel and act autonomously and at a distance. It
follows from this that all collectivities pursuing public goals will be
motivated to find ways to decrease the need for surveillance. Resources used for
surveillance are wasted in the sense that they are unavailable for the
achievement of collective goals.
D4. A deployable agent is one who is uncritically
obedient to directives perceived as charismatically legitimate. A deployable
agent can be relied on to continue to carry out the wishes of the collectivity
regardless of his own hedonic interests and in the absence of any external
controls. Deployability can be operationalized as the likelihood that the
individual will continue to comply with hitherto ego-dystonic demands of the
collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child
abuse, murder) when not under surveillance.
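As a purely illustrative sketch of this operationalization (Python; the record format and the example compliance history are my own assumptions, not data), deployability can be estimated as the proportion of charismatically legitimated directives with which a member complies while not under surveillance:

```python
# Illustrative estimate of deployability as the share of directives obeyed
# while the member was *not* under surveillance. The record format and the
# example history are assumptions; only the definition comes from D4.

from dataclasses import dataclass

@dataclass
class DirectiveRecord:
    under_surveillance: bool
    complied: bool

def deployability(records):
    """Proportion of unsurveilled directives that were nonetheless obeyed."""
    unobserved = [r for r in records if not r.under_surveillance]
    if not unobserved:
        raise ValueError("no unsurveilled directives to estimate from")
    return sum(r.complied for r in unobserved) / len(unobserved)

if __name__ == "__main__":
    history = [
        DirectiveRecord(under_surveillance=True, complied=True),
        DirectiveRecord(under_surveillance=False, complied=True),
        DirectiveRecord(under_surveillance=False, complied=True),
        DirectiveRecord(under_surveillance=False, complied=False),
    ]
    print(round(deployability(history), 2))  # 0.67: two of three obeyed unobserved
```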
D5. Brainwashing is defined as an observable set of
transactions between a charismatically structured collectivity and an isolated
agent of the collectivity with the goal of transforming the agent into a
deployable agent. Brainwashing is thus a process of ideological
resocialization carried out within a structure of charismatic authority.
The brainwashing process may be operationalized as a
sequence of well-defined and potentially observable phases. These hypothesized
phases are: (1) identity stripping, (2) identification, and (3) symbolic
death/rebirth. The operational definition of brainwashing refers to the specific
activities attempted, whether or not they are successful, as they are either
observed directly by the ethnographer or reported in official or unofficial
accounts by members or ex-members. Although the exact order of phases and
specific steps within phases may vary from group to group, we should always
expect to see the following features, or their functional equivalents, in any
brainwashing system: (1) the constant fluctuation between assault and leniency;
and (2) the seemingly endless process of confession, re‑education, and
refinement of confession.
D6. Hyper Credulity is defined as a disposition to
accept uncritically all charismatically ordained beliefs. All lovers of
literature and poetry are familiar with “that willing suspension of disbelief
for the moment, which constitutes poetic faith.” Hyper credulity occurs when
this state of mind, which, in most of us, is occasional and transitory, is
transformed into a stable disposition. Hyper credulity falls between hyper
suggestibility on the one hand and stable conversion of belief on the other. Its
operational hallmark is plasticity in the assumption of deeply held convictions
at the behest of an external authority. This is an other-directed form of what
Robert Lifton has called the protean identity state.
D7. Relational Enmeshment is defined as a state of
being in which self-esteem depends upon belonging to a particular collectivity.
It may be operationalized as immersion in a relational network with the
following characteristics: exclusivity (high ratio of in-group to out-group
bonds), interchangeability (low level of differentiation in affective ties
between one alter and another), and dependency (reluctance to sever or weaken
ties for any reason). In a developmental context, something similar to this has
been referred to by Bowlby as anxious attachment.
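The first two of these network characteristics lend themselves to simple indices. The sketch below (Python) computes an exclusivity ratio and a crude interchangeability index; the 0-to-1 coding of affective tie strength and the use of variance as a proxy for (lack of) differentiation are assumptions added for illustration, not part of the definition.

```python
# Illustrative indices for two of D7's three indicators. The tie data, the
# 0-1 coding of affective tie strength, and the variance-based proxy for
# interchangeability are assumptions added for illustration.

from statistics import pvariance

def exclusivity(in_group_ties, out_group_ties):
    """Ratio of in-group to out-group bonds (higher = more exclusive)."""
    if out_group_ties == 0:
        return float("inf")  # every remaining bond is inside the group
    return in_group_ties / out_group_ties

def interchangeability(tie_strengths):
    """Crude proxy: 1 minus the variance of tie strengths coded on 0-1,
    so a member whose ties are all about equally strong scores near 1."""
    return 1.0 - pvariance(tie_strengths)

if __name__ == "__main__":
    print(exclusivity(in_group_ties=12, out_group_ties=2))           # 6.0
    print(round(interchangeability([0.8, 0.8, 0.7, 0.8, 0.75]), 3))  # 0.998
```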
D8. Exit Costs are defined as the subjective costs
experienced by an individual who is contemplating leaving a collectivity.
Obviously, the higher the perceived exit costs, the greater will be the
reluctance to leave. Exit costs may be operationalized as the magnitude of the
bribe necessary to overcome them. A person who is willing to leave if we pay him
$1,000 experiences lower exit costs than one who is not willing to leave for any
payment less than $1,000,000. With regard to cults, the exit costs are most
often spiritual and emotional rather than material which makes measurement in
this way more difficult but not impossible.
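The $1,000 versus $1,000,000 comparison can be made mechanical. The sketch below (Python; the ascending-offer elicitation procedure is an assumed convenience, not part of the definition) records the smallest bribe each member would accept and orders individuals by that threshold:

```python
# Illustrative measurement of exit costs as the smallest bribe a member
# would accept to leave. The ascending-offer elicitation is an assumed
# convenience; the $1,000 vs. $1,000,000 contrast comes from D8.

def minimum_acceptable_bribe(would_leave_for, offers):
    """Return the smallest offer the member accepts, or None if none suffices."""
    for offer in sorted(offers):
        if would_leave_for(offer):
            return offer
    return None

if __name__ == "__main__":
    offers = [1_000, 10_000, 100_000, 1_000_000]
    member_low = lambda amount: amount >= 1_000        # would leave for $1,000
    member_high = lambda amount: amount >= 1_000_000   # holds out for $1,000,000
    low = minimum_acceptable_bribe(member_low, offers)
    high = minimum_acceptable_bribe(member_high, offers)
    print(low, "<", high)  # the lower threshold indicates lower exit costs
```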
Hypotheses
Not all charismatic organizations engage in brainwashing.
We therefore need a set of hypotheses that will allow us to test empirically
whether any particular charismatic system attempts to practice brainwashing and
with what effect. The brainwashing model asserts twelve hypotheses about the
role of brainwashing in the production of uncritical obedience. These hypotheses
are all empirically testable.
This model begins with an assumption that charismatic
leaders are capable of creating organizations that are easy and attractive to
enter (even though they may later turn out to be difficult and painful to
leave). There are no hypotheses, therefore, to account for how charismatic cults
obtain members. It is assumed that an abundant pool of potential recruits to
such groups is always available. The model assumes that charismatic leaders,
using nothing more than their own intrinsic attractiveness and persuasiveness,
are initially able to gather around them a corps of disciples sufficient for the
creation of an attractive social movement. Many ethnographies have shown how
easy it is for such small movement organizations to attract new members from the
general pool of anomic “seekers” that can always be found within the population
of an urbanized mobile society.
The model does attempt to account for how some percentage
of these ordinary members are turned into deployable agents. The initial
attractiveness of the group, its vision of the future and/or its capacity to
bestow seemingly limitless amounts of love and esteem on the new member are
sufficient inducements in some cases to motivate a new member to voluntarily
undergo this difficult and painful process of resocialization.
H1. Ideological totalism is a necessary but not sufficient
condition for the brainwashing process. Brainwashing will be attempted only in
groups that are structured totalistically. However, not all ideologically
totalist groups will attempt to brainwash their members. It should be remembered
that brainwashing is merely a mechanism for producing deployable agents. Some
cults may not want deployable agents or have other ways of producing them.
Others may want them but feel uncomfortable about using brainwashing methods to
obtain them or may not have discovered the existence of brainwashing methods.
H2. The exact nature of this resocialization process will
differ from group to group but, in general, will be similar to the
resocialization process that Robert Lifton and Edgar Schein observed in
Communist re-education centers in the 1950s. For whatever reasons, these methods
seem to come fairly intuitively to charismatic leaders and their staffs.
Although the specific steps and their exact ordering differ from group to group,
their common elements involve a stripping away of the vestiges of an old
identity, the requirement that repeated confessions be made either orally or in
writing, and a somewhat random and ultimately debilitating alternation of the
giving and the withholding of “unconditional” love and approval. H2 further
states that the maintenance of this program involves the expenditure of a
measurable quantity of the collectivity's resources. This quantity is known as
C, where C equals the cost of the program and should be measurable at least at
an ordinal level.
This resocialization process has baffled many observers, in
my opinion because it proceeds simultaneously along two distinct but parallel
tracks: one involving cognitive functioning and the other involving emotional
networking. These two tracks lead to the attainment of states of hyper credulity
and relational enmeshment respectively. The group member learns to accept with
suspended critical judgment the often shifting beliefs espoused by the
charismatic leader. At the same time, the group member becomes strongly attached
to and emotionally dependent upon the charismatic leader and (often especially)
the other group members and cannot bear to be shunned by them.
H3. Those who go through the process will be more likely
than those who do not to reach a state of hyper credulity. This involves the
shedding of old convictions and the assumption of a zealous loyalty to these
beliefs of the moment, uncritically seized upon, so that all such beliefs become
not mere “beliefs” but deeply held convictions.
Under normal circumstances, it is not easy to get people to
disown their core convictions. Convictions, once developed, are generally
treated not as hypotheses to test empirically but as possessions to value and
cherish. There are often substantial subjective costs to the individual in
giving them up. Abelson has provided convincing linguistic evidence that most
people treat convictions more as valued possessions than as ways of testing
reality. Cognitive dissonance theory predicts with accuracy that, when subject
to frontal attack, attachment to convictions tends to harden. Therefore, a
frontal attack on convictions, without first undermining the self-image
foundation of these convictions, is doomed to failure. An indirect approach
through brainwashing is often more effective.
The unconventional beliefs that individuals adopt when they
join cults will come to be discontinuous with the beliefs they held in pre-cult
life. What appears to happen is a transformation from individually held to
collectively held convictions. This is a well known phenomenon that Janis has
called “groupthink.” Under circumstances of groupthink, the specific content of
one's convictions becomes much less important than achieving the goal that all
in the group hold the same convictions. In elaboration likelihood terms we can
say that the subject undergoes a profound shift from message processing to
source processing in the course of resocialization.
When the state of hyper credulity is achieved, it leaves
the individual strongly committed to the charismatic belief of the moment but
with little or no critical inclination to resist charismatically approved new or
contradictory beliefs in the future and little motivation to attempt to form
accurate independent judgments of the consequences of assuming new beliefs. The
cognitive track of the resocialization process begins by stripping away the old
convictions and associating them with guilt, evil, or befuddlement. Next, there
is a traumatic exhaustion of the habit of subjecting right-brain convictions to
left-brain rational scrutiny. This goes along with an increase in what Snyder
has called self-monitoring, implying a shift from central route to peripheral
route processing of information in which the source rather than the content of
the message becomes all important.
H4. As an individual goes through the brainwashing process,
there will be an increase in relational enmeshment with measurable increases
with completion of each of the three stages. The purging of convictions is a
painful process and it is reasonable to ask why anybody would go through it
voluntarily. The payoff is the opportunity to feel more connected with the
charismatic relational network. These people have also been through it and only
they really understand what you are going through. So cognitive purging leads
one to seek relational comfort and this comfort becomes enmeshing. The credulity
process and the enmeshing process depend on each other.
The next three hypotheses are concerned with the fact that
each of the three phases of brainwashing achieves plateaus in both of these
processes. The stripping phase creates the vulnerability to this sort of
transformation. The identification phase creates realignment, and the rebirth
phase breaks down the barrier between the two so that convictions can be
emotionally energized and held with zeal while emotional attachments can be
sacralized in terms of the charismatic ideology. The full brainwashing model
actually provides far more detailed hypotheses concerning the various steps
within each phase of the process. Space constraints make it impossible to
discuss these here. An adequate technical discussion, for example, of the
manipulation of language in brainwashing would require a chapter at least the
length of this one.
H5. The stripping phase: The cognitive goal of the
stripping phase is to destroy prior convictions and prior relationships of
belonging. The emotional goal of the stripping phase is to create the need for
attachments. Overall, at the completion of the stripping phase, the situation is
such that the individual is hungry for convictions and attachments and dependent
upon the collectivity to supply them. This sort of credulity and attachment
behavior is widespread among prisoners and hospital patients.
H6. The identification phase: The cognitive goal of the
identification phase is to establish imitative search for conviction and bring
about the erosion of the habit of incredulity. The emotional goal of the
identification phase is to instill the habit of acting out through attachment.
Overall, at the completion of the identification phase, the individual has begun
the practice of relying on the collectivity for beliefs and for a cyclic
emotional pattern of arousal and comfort. But, at this point, this reliance is
just one highly valued form of existence. It is not yet viewed as an existential
necessity.
H7. The symbolic death and rebirth phase: In the rebirth
phase, the cognitive and the emotional tracks come together and mutually support
each other. This often gives the individual a sense of having emerged from a
tunnel and an experience of spiritual rebirth. The cognitive goal of the rebirth
phase is to establish a sense of ownership of (and pride of ownership in) the
new convictions. The emotional goal of the rebirth phase is to make a full
commitment to the new self that is no longer directly dependent upon hope of
attachment or fear of separation. Overall, at the completion of the rebirth
phase, we may say that the person has become a fully deployable agent of the
charismatic leader. The brainwashing process is complete.
H8 states that the brainwashing process results in a state
of subjectively elevated exit costs. These exit costs cannot, of course, be
observed directly. But they can be inferred from the behavioral state of panic
or terror that arises in the individual at the possibility of having his or her
ties to the group discontinued. The cognitive and emotional states produced by
the brainwashing process together bring about a situation in which the perceived
exit costs for the individual increase sharply. This closes the trap for all but
the most highly motivated individuals and induces in many a state of uncritical
obedience. As soon as exit from the group (or even from its good graces) ceases
to be a subjectively palatable option, it makes sense for the individual to
comply with almost anything the group demands—even to the point of suicide in
some instances. Borrowing from Sartre’s insightful play of that name, I refer to
this situation as the “no exit” syndrome. When demands for compliance are
particularly harsh, the hyper credulity aspect of the process sweetens the pill
somewhat by allowing the individual to accept uncritically the justifications
offered by the charismatic leader and/or charismatic organization for making
these demands, however farfetched these justifications might appear to an
outside observer.
H9 states that the brainwashing process results in a state
of ideological obedience in which the individual has a strong tendency to comply
with any behavioral demands made by the collectivity, especially if motivated by
the carrot of approval and the stick of threatened expulsion, no matter how life
threatening these demands may be and no matter how repugnant such demands might
have been to the individual in his or her pre-brainwashed state.
H10 states that the brainwashing process results in
increased deployability. Deployability extends the range of ideological
obedience in the temporal dimension. It states that the response continues after
the stimulus is removed. This hypothesis will be disconfirmed in any cult within
which members are uncritically obedient only while they are being brainwashed
but not thereafter. The effect need not be permanent, but it does need to result
in some measurable increase in deployability over time.
H11 states that the ability of the collectivity to rely on
obedience without surveillance will result in a measurable decrease in
surveillance. Since surveillance involves costs, this decrease will lead to a
quantity S, where S equals the savings to the collectivity due to diminished
surveillance needs and should be measurable at least to an ordinal level.
H12 states that S will be greater than C. In other words,
the savings to the collectivity due to decreased surveillance needs is greater
than the cost of maintaining the brainwashing program. Only where S is greater
than C does it make sense to maintain a brainwashing program. Cults with
initially high surveillance costs and, therefore, high potential savings due to
decreased surveillance needs [S] will tend to be more likely to brainwash, as
will cults structured so that the cost of maintaining the brainwashing system
[C] is relatively low.
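Since S and C need only be measured at an ordinal level, the decision criterion in H12 reduces to a comparison of ranks. A minimal sketch of that comparison (Python; the three-level coding and the example values are assumptions):

```python
# Illustrative check of H12's S > C criterion with an ordinal coding of the
# savings from reduced surveillance (S) and the cost of the brainwashing
# program (C). The three-level scale and the example values are assumptions.

ORDINAL = {"low": 1, "medium": 2, "high": 3}

def program_pays(surveillance_savings, program_cost):
    """H12: maintaining the program makes sense only where S exceeds C."""
    return ORDINAL[surveillance_savings] > ORDINAL[program_cost]

if __name__ == "__main__":
    # High surveillance savings and a cheap program: S > C, so the program pays.
    print(program_pays("high", "low"))        # True
    # Savings merely equal to cost: S is not greater than C.
    print(program_pays("medium", "medium"))   # False
```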
References
Anthony, Dick. (2001). Tactical ambiguity and brainwashing
formulations. In Benjamin Zablocki & Thomas Robbins (Eds.), Misunderstanding
cults: Searching for objectivity in a controversial field (pp. 215-317).
Toronto: University of Toronto Press.
Zablocki, Benjamin. (2001). Towards a demystified and
disinterested scientific theory of brainwashing. In Benjamin Zablocki & Thomas
Robbins (Eds.), Misunderstanding cults: Searching for objectivity in a
controversial field (pp. 159-214). Toronto: University of Toronto Press.
Zablocki, Benjamin, & Robbins, Thomas (Eds.). (2001).
Misunderstanding cults: Searching for objectivity in a controversial field.
Toronto: University of Toronto Press.