Mind Control in Twenty Minutes
The nature of cult mind control
Why do people join cults?
Leaving a cult
How do cults get away with it?
Addendum - other models of mind control
The nature of cult mind control
'Mind Control' (aka 'Brainwashing' or 'Thought Reform') is a shorthand term for a complex process of mental and psychological manipulation, which occurs within a cult. It is a means of exercising undue influence over a person.
The most effective mind control is the kind that isn't recognised by the victim as any kind of manipulation. You don't feel it; you think you are in control.
Briefly, this is how it works. A cult promotes its cultish belief system, and then believers control their own minds, as they train their minds and reform their personalities, in accordance with the tenets of their cultish new belief system.
Understanding the disorientating, drug-like nature of a cult belief system and worldview is the key to understanding cult mind control.
Cults promote a belief system which is utopian and idealistic, and also dualistic and bi-polar in nature. Dualistic, in the sense that they tend to see the world in terms of two opposite poles, such as pure good and evil, the saved and the fallen, the enlightened and the ignorant, etc.
Cult belief systems are also bi-polar in psychological terms, rather like bi-polar disorder or manic depression. Cults promote a vision of an ideal 'new life' or 'new self' ('the true individual' in FWBO terms), which members believe they can attain by following the cult teachings. E.g.:
'The central teaching of the Buddha is that we can change our lives. Buddhism offers clear and practical guidelines as to how men and women can realise their full potential for understanding and kindness. Meditation is a direct way of working on ourselves, to bring about positive change in our lives. We teach two simple and complementary meditations. One helps us develop a calm, clear, focused mind; the other transforms our emotional life, enabling us to enjoy greater self-confidence and positivity towards others.' 
Cult belief systems are bi-polar because they encourage the aspirant to identify with this imagined ideal new self, and then, from the perspective of this new self, to see their old self as comparatively inferior and flawed. E.g:
'Bodhi [Enlightenment] is a state of insight, of wisdom, of awareness - to begin with, insight into one's own self. It consists in taking a very deep, clear, profound look into oneself, and seeing how, on all the different levels of one's being, one is conditioned, governed by the reactive mind, reacting mechanically, automatically, on account of past psychological conditionings of which only too often one is largely unconscious. It is seeing, moreover, the extent to which one is dominated, even against one's will - often without one's knowledge - by the negative emotions.' 
An inner tension or conflict is set up, between the 'positive' new self (the 'Stepford' personality), and the 'negative' old self. In effect, a split personality is created, with pride and hubris for the idealised new self, and shame and guilt for the unreformed old self.
All cults seem to be motivated by this bipolar mixture of hubris and guilt. Exactly what is considered 'positive' or 'negative', virtuous or sinful, may vary considerably, depending on the tenets of the particular belief system in question, but cult belief systems generally seem to have this same underlying hubris/guilt dynamic.
Cult members are entranced by the cult's beguiling 'fantasy of heroic virtue', which both inspires and traps them. As they try to practise the cult teachings for themselves, they tend to alternate between seeing themselves as fairly heroic in their efforts to achieve an ideal personality and to help bring about an ideal new world, and feeling guilty over their failure to overcome their recalcitrant old self, with all its supposed negativity and reactivity and sinfulness.
The hubris can either be a personal hubris, in the case of the cult's top hierarchy, or more usually, for rank and file members, it is a kind of projected hubris, or hubris-by-proxy - the hope and expectation that in due course, after diligent practice, they will attain the ideal for themselves, or at least make definite progress towards the ideal. This expectation can sometimes lead to a sort of collective hubris among established cult members. They see themselves as part of an elite, and tend to look down rather sniffily upon the mores and values of mainstream society.
To varying degrees, believers can experience a sort of refined mania of inspiration, almost like a drug high, when they are in the hubris phase, identifying with their imaginary idealised new self, with its enhanced understanding and kindness, etc. This inspiration is pleasant and even intoxicating in itself, and it may also be interpreted as a sign of spiritual progress, as a glimpse of a higher reality, or as evidence of the truth of the belief system. This is all part of the circular, solipsistic, self-validating nature of a cult belief system.
Besides tending to believe in the objective truth of the belief system, believers can also become psychologically addicted to the inspired state. This combination of faith and addiction can make a person very loyal to their group, and to the teachers and leaders who inspire and guide the group. In effect, they become dependent on the group and its leadership to guide their 'spiritual growth'. Of course, cult leaders can exploit this dependency in various ways.
If members fall out of favour, even temporarily, with the group's leadership, or if they begin to doubt if they can achieve the group's ideals, they may experience a sort of religious depression, in the form of anxiety or guilt over their seeming inability to free themselves from their negative 'old self', with all its bad habits and reactivity and lack of faith. This guilt or depression tends to reinforce their desire to return to the inspired state, and to reinforce their addiction to the inspiring vision of the cult belief system, so there can be a cumulative feedback process operating too - the more they cling to their inspiration, the more they are prone to depression once the inspiration wears off. And the more they experience depression, the more they crave inspiration. And so on.
At an extreme, believers fear they will become ill or fall into hell if they leave the group.
All this goes on within a cult member's mind. A cult does not really control its members by using external coercion. It is the belief system/worldview itself which is the primary active agent in cult mind control. The actual controlling of the mind is done by the person themselves, as they attempt to discipline their mind and reform their personality, in accordance with the tenets of their new belief system. Effectively, a cult, via its belief system, uses a person's own energy and aspirations against them.
It would be a mistake to assume that only weak-willed people join cults. On the contrary, it is often the more ambitious and strong-willed people who become the most committed cult members.
Of course, ordinary society can be a bit bi-polar as well, with its pressure to be 'successful', with an ideal physique, lifestyle, etc. The pressure is just more focused and sustained within a cult. A cult can play on both a person's anxieties and their aspirations at the same time. They (or rather their belief system) can potentially make a person feel both more guilty about their 'old self' with its normal human weaknesses, and simultaneously inspire them with an imaginary idealised vision of a wonderful new self and a new life. Very bi-polar.
In general, when you talk to a cult member, it can be helpful to understand which self, either the old self with its old set of beliefs, or the new self with its new set of cult beliefs, is more dominant at any particular time.
If you criticise a cult member, this may just encourage their tendency to see themselves (their old self) as flawed, and may push them further into the cult. If you criticise their church or group, the cult-member will go into cult-self mode and will see your criticisms as tending to confirm the cult's warnings about the outside world and its negative effects. A better approach may be to acknowledge and encourage a cult member's old self, without criticising or threatening the new cult self. If a cult member feels valued in themselves, and their old self does not feel devalued, then this weakens the cult's attraction for them.
Why do people join cults?
Obviously, no-one is forced to join a cult. No-one is forced to adopt a new belief system. Equally, however, no-one can really make an informed assessment of a group or its belief system in advance, without having first had some personal experience of it. You can't knock it if you haven't tried it.
The ideals and goals of a cult's belief system are such as to give the cult an inherent psychological advantage over its critics and doubters, because on the one hand the cult's ideals are attractive and inspiring, while on the other hand they are non-falsifiable - they can never be shown to be false or deluded, and indeed it can seem negative and reactionary even to question them.
In general, it can be difficult to know in advance whether it would be beneficial or not to follow the study and training opportunities offered by a particular group or organisation. The benefits, if any, of group involvement can only really be evaluated after a suitable period of time spent with the group. How long a suitable period of time might be, depends on the individual, and cannot be determined in advance. In other words, it can be very difficult to know from the outside whether a group is a cult or not, unless you are forewarned and have enough knowledge about cults to be able to spot the telltale signs.
The danger for someone who may unwittingly become involved with a cult is that they will be exposed to the cult belief system, which is psychoactive, like a drug. It can be addictive and disorientating, and dangerous even to experiment with. Once involved, it may not be all that easy for someone to escape from a cult belief system.
Cults will do their best to ensure initial apparent benefits for new members. A cult is rather like a card sharp, who will let a newcomer win the first few games, in order to take all their money in the long run. There is no problem, so long as a member is happy to continue their involvement with the group. However, should a member at some stage become unhappy with their involvement, or develop serious doubts about the belief system or the integrity of the group's leadership, then the process of disentanglement may not be all that straightforward.
Leaving a cult
Rejecting the belief system in its entirety may not be easy, or even desirable. Even after physical contact with the group has ceased, elements of the cult belief system are likely to linger in the mind of an ex-member for some time, depending how deeply and for how long they were involved. They may experience feelings of anxiety and disorientation as they try to rid themselves of the unwanted remnants of the cult belief system and worldview, while simultaneously trying to regain some confidence either in their old, pre-cult belief system and ways of relating to the world, or alternatively, in some new, post-cult belief system.
In trying to rid their minds of the unwanted remnants of the cult belief system, an ex-member is effectively trying to use their own thought processes to disentangle their own thought processes. This is quite a difficult task, rather like trying to lift yourself up by your own shoelaces.
For a while, an ex-member may exist in a sort of limbo between the cult world and the outside world, unsure which to believe in. To the extent that the cult belief system retains any degree of respect or credibility within an ex-member's mind, then to that extent leaving the group will seem like abandoning the ideals and aspirations of the group's belief system, and therefore like a failure.
On the other hand, to the extent that the cult belief system fails to retain credibility and is eschewed, to that extent an ex-member will tend to feel shame at their foolishness and gullibility in having once adopted beliefs and aspired to ideals which they now regard as unrealistic.
So either they are a failure, or a gullible fool. Either way their self-confidence takes a knock, and they may find it difficult to have any faith in their own judgement, or in their ability to make sensible decisions. For a while, they may not know what to believe, or who to trust.
While an ex-member is in this process of disentangling themselves from the cult belief system and ways of thinking, it can be helpful to talk to other ex-members from various different cults, who have gone through a similar process of disentanglement. Additionally, there are professional 'exit-counsellors', often former cult members themselves, who may be able to help unravel any psychological disorientation or damage resulting from cult involvement.
However, some caution is also necessary, because some so-called cult awareness networks are, ironically, actually run by cults, and some supposedly independent academic researchers are in effect cult apologists, whose research is sometimes indirectly funded by cults seeking a positive report for their own marketing and public relations purposes.
How do cults get away with it?
Mind control is an intangible thing. It is a complex psychological process which leaves no physical trace or evidence behind. Therefore it is virtually impossible to prove that mind control has occurred in any particular case, or even that it exists at all. 
Mind control occurs as a result of an individual becoming involved with a cult and its belief system. Unfortunately, the workings of cult mind control are not widely understood by the general public. Consequently, cults are almost never held responsible for the beliefs they promote, or for the changes in an individual's behaviour that those beliefs may cause. The individual is held responsible for acting on those beliefs, but the cult is rarely if ever held responsible for promoting those beliefs in the first place.
No criticisms of the allegedly harmful effect that a cult's belief system may have had upon a member's mind or behaviour can ever be proved objectively, because the whole subject of personal belief is by nature largely subjective and intangible, and therefore unprovable either way.
Victims are left with the near-impossible task of proving the unprovable. A cult can simply say that its critics are motivated by personal resentment and negativity, or that they had hidden psychological problems before they became involved with the group. So long as the burden of proof remains with the critic, a cult can never lose. A cult can be a complete scam, and damaging to those who become involved, but nobody can actually prove it.
Therefore it is very difficult to expose a cult, or to prevent it from continuing to expand and to attract new recruits.
 FWBO Norwich Buddhist Centre leaflet and programme of classes, Autumn 1999.
(NB Buddhism as a whole is not a cult, but like any mainstream faith or belief system, it can be twisted and used by a cult.)
 FWBO leader, Sangharakshita, Mitrata Omnibus, Pub FWBO/Ola Leaves 1980-81, p 38-9
 The 'card sharp' simile for a cult is courtesy of Verdex, an ex-FWBO member from Germany, who maintains the Internet site,
Another Verdex simile: A cult is like a black hole; the attraction increases as a person moves closer to the group, and there is even an event horizon, beyond which communication with the outside world is lost.
 There is some debate in academic circles about whether mind control (aka brainwashing) actually exists or not. On the one hand, some academic researchers claim that the concept of mind control has been scientifically discredited. Details of these researchers, sometimes described as 'cult apologists', can be found at:
On the other hand, DSM-IV (American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, 4th edition, 1994) specifically mentions cults and brainwashing under 300.15, "Dissociative Disorder NOS."
There are other ways of explaining the mind control process, perhaps the best known being Steve Hassan's BITE model (control of Behaviour, Information, Thoughts, and Emotions), and Lifton's eight criteria for Thought Reform.
Some descriptions of mind control and the techniques involved may be difficult for people with no personal experience of cult involvement to understand. They have no concrete experience of their own to compare these descriptions to, so it may be hard for them to understand the 'flavour' of cult mind control, just from reading a list of the ingredients.
Ex-cult members, or their family and friends, obviously have some personal experience of cult involvement, and therefore should find it easier to relate the BITE model, or similar models, to their own experience. They have some experience of the flavour of mind control, and thus they are better able to see how the different elements combine together.
They can look back and see, for example, how the cult environment pressured them to change their behaviour, and their thoughts and emotions, and how the cult presented misleading or incomplete information about itself. They can interpret the BITE model in the light of their own experience.
The analysis presented here, in 'Mind Control in Twenty Minutes', is intended to complement rather than compete with the BITE model. In particular, it tries to give an idea of the flavour of cult involvement, and a sense of the 'inner life' of a cult member, in a way which people who have never been involved with a cult themselves might be able to understand or empathise with.
It is a sort of 'inside out' analysis, which concentrates on trying to explain the nature of a cult belief system, and how it can gain a hold within a person's mind and come to dominate their mind and behaviour, rather than an 'outside in' analysis, which tries to explain mind control in terms of external influences or constraints on a person's behaviour.
For a more detailed (11,000 word) version of the above analysis of cult belief systems, see 'The Culture of Cults'
For general information about cults, mind control, and recovery issues, see:
Good books on cults and mind control include:
'Combatting Cult Mind Control: Protection, Rescue and Recovery from Destructive Cults', by Steven Hassan, pub 1990, Park St. Press.
'Releasing the Bonds: Empowering People to Think for Themselves', by Steven Hassan, pub 2002
'Cults in Our Midst' by Margaret Thaler Singer, et al.
'Captive Hearts, Captive Minds: Freedom and Recovery from Cults and Other Abusive Relationships' by Madeleine Landau Tobias and Janja Lalich, pub Hunter House,1994. ISBN 0-89793-144-0