The Culture of Cults


Preliminary Definitions

Cult Mind Control
Outline of a Cult Persuasion Process
Marketing a Cult Belief System

Religious Freedom & Moral Independence
The Quasi-Religious Spectrum
Religious Freedom & Moral Independence
Organisations and their Belief Systems

The Nature of Personal Belief
Free Will, Free Choice, and Personal Belief
The Hermeneutics of Personal Belief
Hierarchical, Bi-polar Belief Systems
Hierarchies and the Politics of Personal Belief (and why Christianity didn't start as a cult)

Recruitment by Cults
The Cult Recruitment Process

Leaving a Cult
Disability Arising from Cult Involvement

Problems in Exposing Cults
Difficulties in Identifying a Cult
Difficulties facing Critical Ex-members
Summary of Advantages Enjoyed by Cult Organisations



The intended purpose of this analysis, written by a former cult member, is to explain the nature of a cult, to warn others of the dangers of involvement with a cult group, and to support calls for society to be more proactive in protecting the rights of individuals targeted by cults.

Cults use so-called brainwashing or mind control techniques to indoctrinate their members. These are not magic techniques; they are ordinary techniques of marketing and persuasion, applied more intensively within the peculiar context of a cult environment.

Essentially, a cult promotes its cultish belief system, and then believers control their own minds, as they attempt to train their minds and reform their personalities, in accordance with the tenets of their new belief system.

Brainwashing or mind control does not directly overcome free will. After brainwashing, free will itself remains intact, but its basis has changed.

The basis of a person's free will is their belief system and worldview. People make their choices and decisions, based on their beliefs, values, and attitudes. If a cult can influence and change a person's beliefs, then it can influence and change a person's will, and the choices they make. Effectively, a cult controls its followers indirectly, through the belief system which it promotes.

Cult belief systems differ from mainstream belief systems in a number of ways. This analysis tries to explain these differences, why they are significant, and to place them within a wider cultural and social context.

Preliminary Definitions

A cult can be defined in general as any group of people holding to a common belief system, but in practice the term cult is often used pejoratively, to refer specifically to 'a quasi-religious organization using devious psychological techniques to gain and control adherents' (Collins English Dictionary)[1], and this is the sense in which 'cult' is used in this analysis.

Various terms have been used to describe the devious psychological techniques allegedly used by cults, the most common being 'brainwashing'[2], 'mind-control', 'thought reform' and 'mental manipulation'.

The term 'brainwashing' was originally coined by journalist Edward Hunter in 1950, to describe techniques used by the Chinese Communists to subvert the loyalty of American prisoners captured in Korea. Brainwashing in this original sense involved physical coercion: imprisonment, food and sleep deprivation, and sometimes torture. Cults do not generally use physical coercion, and so in recent years, researchers writing about cults have tended to use the term 'mind control', or sometimes 'thought reform', to denote a brainwashing or indoctrination process which does not involve physical coercion. Journalists still tend to use the term 'brainwashing'.

The term 'mind control' can be misleading. It perhaps suggests the idea that a person's mind can be robotically controlled by some outside agency, or that thoughts can somehow be hypnotically implanted in a person's mind.

This is not at all what happens in a cult. In fact a cult controls its members primarily through the promotion and inculcation of a hierarchical, cult-type belief system within a person's own mind, rather than through any form of external control. It is the belief system itself which is the catalyst in cult mind control. The actual mind control is done by the person themselves, as they train and discipline their mind in accordance with the tenets of their new belief system.

Cults actively promote and market their belief systems. Commercial companies use marketing and public relations techniques to promote an idealised image of their product or service to potential customers, and cults do much the same. [3]

The 'product' marketed by a cult is its belief system. A belief system is an intangible thing, though it can be a powerful one. This intangibility means that cults are not subject to any kind of scientific appraisal, because the benefits or otherwise of involvement with a cult and its belief system are also intangible. They are almost entirely a matter of personal opinion, and impossible to prove objectively.

This also means that it is very difficult to define a cult satisfactorily, because any definition depends on personal opinion, rather than on objective criteria. The best one can do is to describe a cult-type belief system, and look at some of the subjective processes which may be going on in the mind of a cult member.

Because of the intangible nature of what they promote, cults do not really operate in the public domain. They operate in a private world, within an individual's personal religious framework or set of beliefs, and within an individual's own subjective world of self-esteem and self-confidence. They operate within a person's mind.

A person's mind (or consciousness) is something which is difficult to define or to measure, and so it tends to be largely outside the scope of scientific and academic enquiry. And from a legal point of view, a concept like 'freedom of mind' is equally hard to define, and so it is difficult to specifically protect such a freedom.

Personal free will is a cherished axiom of Western democracies, but free will itself is mysterious and hard to define. Indeed, some scientists and philosophers dispute whether free will as such really exists. For example:

Einstein wrote:
'In human freedom in the philosophical sense I am definitely a disbeliever. Everyone acts not only under external compulsion but also in accordance with inner necessity. Schopenhauer's saying, that "a man can do as he will, but not will as he will" has been an inspiration to me since my youth up...' [4]

Thomas Hobbes wrote:
'Nothing takes a beginning from itself, but from the action of some other immediate agent, without itself. Therefore, when first a man has an appetite or will to something, to which immediately before he had no appetite nor will; the cause of his will is not the will itself, but something else not in his own disposing. So that, whereas it is out of controversy that, of voluntary actions the will is the necessary cause, and by this which is said, the will is also necessarily caused by other things, whereof it disposes not, it follows that voluntary actions have all of them necessary causes, and therefore are necessitated.' [5]

Spinoza wrote:
'The will cannot be called a free cause, but can only be called necessary. For the will, like all other things, needs a cause by which it may be determined to [existence and] action in a certain manner.' [6]

Largely because of these difficulties in defining terms, the whole area of cults and mind control tends to be rather contentious, and opinions are often polarised.

It is possible to take the view that society at large embraces a variety of organisations and institutions which could be interpreted as being somewhat cult-like in their nature. It is often argued that there is a fine line between socialisation and indoctrination, or between persuasion and mind control. Nevertheless, society does attempt to make a distinction between acceptable and unacceptable behaviour: persuasion through physical force or through the denial of food and water, for example, is clearly illegal.

In the view of this writer, there is a need for society to be more active in protecting its citizens from other, more subtle processes of persuasion, which can be equally abusive of personal freedoms, and which are sometimes made use of by individuals and organisations in order to gain personal power over those they claim to help, but whom they often merely manipulate and exploit. This analysis will seek to unravel some of the processes of persuasion used by cults, and to show in what ways they may be devious and an infringement of personal liberty.

Cult Mind Control

Outline of a Cult Persuasion Process

Some cults promote a religious type of belief system. Others, such as so-called therapy cults, promote a secular type of belief system, based on quasi-scientific or quasi-psychological principles. Some cults combine religious and secular elements in their belief system. In general, cult organisations promote utopian ideals of self awareness or self-transcendence, ostensibly for the benefit both of the individual and of the world at large. For example:

'The central teaching of the Buddha is that we can change our lives. Buddhism offers clear and practical guidelines as to how men and women can realise their full potential for understanding and kindness. Meditation is a direct way of working on ourselves to bring about positive transformation. We teach two simple and complementary meditations. One helps us develop a calm, clear, focused mind; the other transforms our emotional life, enabling us to enjoy greater self-confidence and positivity towards others.' [7]

Not every organisation which promotes this kind of ideal is necessarily a cult. However, it can be quite difficult to tell from the outside whether a group is a cult or not. In general, so long as a cult maintains a respectable public image, it will attract followers who aspire to this kind of ideal.

Cult belief systems present an inspiring vision, in which any individual can begin to realise their own higher potential, through following the teaching and training offered by the group. Believers begin to aspire to a 'new life' or a 'new self' (or to become a 'true individual', in FWBO terms), based on this ideal.

However, there can be a sting in the tail. Cult belief systems inspire the aspirant not only to identify with this potential new self, but also to adopt the perspective of this new self, which sees their old self as comparatively inferior and flawed. For example:

'Bodhi [Enlightenment] is a state of insight, of wisdom, of awareness - to begin with, insight into one's own self. It consists in taking a very deep, clear, profound look into oneself, and seeing how, on all the different levels of one's being, one is conditioned, governed by the reactive mind, reacting mechanically, automatically, on account of past psychological conditionings of which only too often one is largely unconscious. It is seeing, moreover, the extent to which one is dominated, even against one's will - often without one's knowledge - by the negative emotions.' [8]

An inner tension or conflict is set up, between the 'positive' new self, and the 'negative' old self. In effect, a split personality is artificially created, with pride and hubris for the idealised new self, and shame and guilt for the unreformed old self.

This rather bi-polar combination of inspiration and guilt has been termed 'The Demand for Purity' by Dr. Lifton in his book 'Thought Reform and the Psychology of Totalism' [9], where it is one of the criteria which he proposed for determining whether or not a given environment has the potential to exert 'thought reform' or mind control:

'The Demand for Purity: The creation of a guilt and shame milieu by holding up standards of perfection that no human being can accomplish. People are punished and learn to punish themselves for not living up to the group's ideals.'

In other words, cult belief systems 'guilt-trip' aspiring individuals, by first holding up a utopian goal, and then encouraging aspirants to feel ashamed when they are unable to fully realise this goal. This combination of initial inspiration and subsequent guilt is what first draws a person into a cult, and then makes it difficult for them to leave. For a believer, leaving means failure.

It would be a mistake to assume that only weak-willed people join cults. Paradoxically, it is often the more ambitious and strong-willed people who become the most committed cult members. They have the determination needed to persevere in pursuit of their goal, no matter how great the difficulties.

While they are in the process of training their mind and transforming their personality in accordance with the tenets of their new belief system, believers are in the position of students, and are therefore somewhat dependent on the guidance of their teachers.

Cults do not gain influence over their members by overcoming their free will. They gain influence through promoting a belief system which undermines members' confidence in their own judgement, or more specifically in the judgement of their unreformed old self, so that they lack the confidence to make decisions for themselves, independently of guidance from the group's teachers and preceptors.

Marketing a Cult Belief System

Of course, no-one is forced to join a cult. No-one is forced to adopt a new belief system, either as a whole or in part. Equally, however, no-one can make an informed assessment of a belief system in advance, without having first had some personal experience of it.

Cults have to compete to market their belief systems and gain adherents, just as ordinary commercial organisations have to compete to market their products or services. Indeed some of the marketing techniques are not dissimilar. Commercial businesses often use aspirational marketing techniques, promoting their products and services to potential customers by implying that purchase of a particular product will enhance an owner's self esteem and social status.

However, cults have two significant marketing advantages compared with a normal commercial organisation, because of the intangible nature of the 'product' they market. The product which a cult markets is its belief system, together with the values and attitudes that are part of that belief system.

The first marketing advantage enjoyed by a cult is that, as a quasi-religious organisation, it is protected from outside investigation, by a legal system which attempts to protect freedom of religion and freedom of belief. Briefly, freedom of religion allows cults to use their own self-referential ethical codes to justify their own behaviour, and to remain unaccountable to any outside agency. There are no consumer protection laws to regulate the marketing of personal or religious belief, and no independent quality control of the product.

A second advantage enjoyed by a cult stems from the fact that it does not really operate in the public domain; it operates primarily within the private and subjective realm of a person's mind. Both the actual product marketed by a cult, and any consequences resulting from purchase or use of the product, are largely subjective and intangible in nature. This means that no criticisms of the allegedly harmful effect that a cult's belief system may have had upon a member's mind or behaviour can ever be proved objectively, because the whole subject of personal belief is by nature largely or entirely subjective, and therefore unprovable either way. So long as the burden of proof remains with the critic, a cult can never lose.

In order to examine how these two characteristic advantages can give a cult organisation an unfair edge over the individuals to whom it promotes and markets its belief system, and can enable it to use possibly devious processes of persuasion when recruiting new members, this analysis now moves on to look more closely at the nature of cult belief systems, and how they differ from conventional belief systems.

Religious Freedom & Moral Independence

The Quasi-Religious Spectrum

As discussed in note 1, words like religion, sect, and cult have complex derivations, related to ideas about culture and way of life, and their meanings are subject to change over time. This analysis acknowledges the range of meanings that these terms may be expected to carry, and has no wish to exclude any nuances of meaning. This analysis interprets the various dictionary definitions not as separate and distinct meanings, but more like reference points along a continuum or spectrum of meaning.

Towards one end of this spectrum are established mainstream religious and secular/humanist systems of belief and practice, in the middle are non-conformist sects and fashionable fads of various kinds, and towards the other end are various religious and secular groups which tend towards being exclusive coteries. These more cultish groups are sometimes characterised by a collective hubris among their members, who, seeing themselves as part of an elect, look down rather sniffily upon the mores and values of established mainstream institutions.

With this perspective in mind, this analysis uses the term cult primarily in the Collins 3 sense of: 'a quasi-religious organization using devious psychological techniques to gain and control adherents'. Leaving aside for now the alleged 'devious psychological techniques', cult is here defined as being a quasi-religious organization as distinct from a religious one. The cultish-ness is related to the quasi-ness of the religion or belief system. Therefore, a theological definition of a cult, based on examination of the group's belief system (or a 'tenelogical' definition, based on examining the tenets of a secular belief system), may be more useful here than a sociological definition, based on the observation of outward characteristics. For example, Alan Gomes, in his book 'Unmasking the Cults' [10], gives the following definition of a (Christian-based) cult:

'A cult of Christianity is a group of people, which claiming to be Christian, embraces a particular doctrine system taught by an individual leader, group of leaders, or organization, which denies (either explicitly or implicitly) one or more of the central doctrines of the Christian Faith as taught in the sixty-six books of the Bible.'

The main criterion being put forward here for a cult is that it 'denies' (either explicitly or implicitly) one or more of the central doctrines of the related mainstream belief system. To deny something is to declare it untrue, or to refuse to accept the existence, validity, or truth of something. Denial is more than merely quibbling over the fine details of a belief system. Denial implies a distinctly different belief system.

A theological/tenelogical definition of cult provides a means of broadly differentiating between cults, sects, and mainstream religious or secular belief systems, by considering the degree to which a particular group's belief system and culture originates from within the group, and is separate and distinct from the relevant mainstream belief system and culture.

From this perspective, sects can be characterised as tending to disagree with some details of the relevant mainstream belief system, while cults can be characterised as tending to deny and reject outright significant parts of the relevant mainstream belief system.

In this perspective, a sect (religious or secular) tends to distinguish itself from the mainstream by having an individual interpretation of some or all of an agreed set of scriptures or secular tenets, but a sect doesn't tend to invent entirely new scriptures or tenets of belief. In general, a sect seeks to improve upon existing traditions, rather than to replace them entirely. A sect might perhaps be said to have an underlying respect for the relevant mainstream belief system and culture, in that it tends to hanker after some degree of respectability and acceptance (on its own terms, of course) in the eyes of the mainstream belief system.

A cult, in comparison, tends to have little or no underlying respect for established belief systems. Holding their own partly or wholly self-originated belief system in high esteem, cult leaders and their followers tend to disdain existing belief systems as inferior and outmoded, and will consequently tend to separate off or isolate themselves from the mainstream more than a sect. A cult will tend to either invent completely new scriptures or tenets of belief, or at least to radically reinterpret existing scriptures and tenets.

Cult leaders may claim some special revelation or insight which is accessible to them, but not to those outside the group. They may claim a special ability to go back to first principles and to practice a more pure version of the tradition, or claim a special ability to re-interpret traditional teachings in a way which is more appropriate for the modern world.

Cults tend to be cliquey, elitist, and hierarchical, and there is usually a distinct difference of status (in the eyes of cult leaders and their followers) between believers and unbelievers, between the committed and the uncommitted, and between the saved and the fallen.

Thus it may be difficult for an outsider, as an unbeliever, to investigate a group's belief system. Any group of people, cult or non-cult, may be economical with the truth, presenting a sanitised version of its belief system to public gaze, while keeping some aspects confidential to trusted members.

It is easier to estimate to what extent a particular group is unorthodox, if its belief system is based on a religion like Christianity, and the group explicitly denies one or more of the central doctrines of the source religion. Often it is not so easy, and denial may be only implicit; or there may be no commonly agreed checklist by which to compare what is orthodox and what is not. There is no reliable orthodoxy meter available to measure a group's belief system, and the process of investigation is likely to be complex and difficult.

The purpose of this kind of investigation, attempting to locate a particular group's belief system along a spectrum between orthodox belief systems and increasingly quasi or unorthodox belief systems, is not to make value judgments between orthodox and unorthodox beliefs as such. On the one hand, orthodox belief systems have evolved over time, and incorporate experience and understanding distilled over generations. On the other hand, it is important to acknowledge both that personal freedom is an established tenet of Western democracy, and also that various forms of unorthodoxy (in the sense of deviation from established norms) may well have evolutionary benefits for society as a whole. Both tradition and innovation have their respective merits.

Rather, the intention in the present context is to point to a particular characteristic of independent, self-originated and innovative belief systems, which can give a considerable advantage to an organisation promoting such a belief system. The particular characteristic of independent belief systems is that they can set their own moral codes.

Religious Freedom & Moral Independence

A belief system implies some kind of ethical or moral code, whether explicit or implicit. A moral code is concerned with the distinction between right and wrong, and with the goodness or badness of human behaviour.

To the degree that a particular group's belief system denies particular tenets of the relevant mainstream belief system, and to the degree that the group distinguishes and separates itself from that mainstream, its belief system may be said to be self-originated, in the sense that the belief system originates primarily from, and is also interpreted by, the group's leadership.

If the belief system originates primarily from the group's leadership, then the group's leadership are also the ethical preceptors and moral arbiters. They act as both law maker and judge, and can therefore make up the rules as they go along. The danger is that their ethical standards may become flexible and self-serving, if they succumb to the temptation to deflect any criticism of themselves or their behaviour, by adjusting definitions of right and wrong to put themselves in the right. If the moral arbiters are unwilling to modify their behaviour, they can instead modify their moral codes to justify their behaviour. Freedom of belief can become freedom without responsibility.

The former Bishop of Durham, UK, the Revd Dr David Jenkins, usually regarded as someone whose views tend to the heterodox as much as to the orthodox, commented that: 'Nowadays, freedom of belief is defined in terms of a post-modern relativism. Freedom of belief can mean; anything goes.' [11]

Many organised groups holding to wholly or partly self-originated belief systems are keen to defend religious freedom, because of the 'anything goes' potential inherent in freedom of belief. For example, a lawsuit [12] filed in 1999 by a coalition of plaintiffs, including the Seventh-day Adventists and the 'International Coalition for Religious Freedom' (Moonies) claims that the State of Maryland's task force studying religious cults on college campuses is violating constitutional rights and conducting a 'religious inquisition'. Representing the plaintiffs, Los Angeles civil rights attorney Kendrick Moxon (believed to be a Scientologist), was quoted as saying: 'The government cannot, absolutely cannot, get involved in adjudicating what's a right religion and what's a wrong religion.'

If a group is classified as a religion, then leaders of that group have considerable freedom, within the group at least, to set their own moral codes and to adjudicate between right and wrong. Potentially, any methods of persuasion, short of those which involve actual, legally provable physical coercion, may be considered reasonable within the ethical codes of a self-originated belief system. A sincere believer may feel that, in religious matters, the end justifies the means, and therefore various deceptive and devious practices may, in the mind of a believer, be justified as 'skillful means', 'crazy wisdom', or 'heavenly deception'.

In most cases, cult leaders probably do not deliberately set out to establish cults. However, in practice they can tend to fall into cultish patterns almost by default, as they attempt to assert and defend unorthodox positive visions of the world, in the face of potentially destructive (to them) cynical or questioning orthodoxies. Actions which, to an outsider, might seem devious or immoral, may, in the mind of a believer, seem perfectly just and ethical. For example:

'Outright lying means to tell people things that you do know are not true - Scientologists do that at times, when they are honestly convinced that this is better for Scientology. …
'Scientologists do have their own definition about ethics which does not fully correspond with the general understanding about ethics. …
'Scientologists are formally and informally told how to best explain Scientology to others, what to mention, how to mention it, what not to mention - all with the best intent, but the result is still, that you cannot, as an outsider, get fully informed about Scientology by a Scientologist. I do know that, because I did it myself and I taught it myself for years.' [13]

Moonies, in promoting their belief system, may justify deception and misrepresentation as part of a greater process of 'deceiving evil into goodness'. The FWBO rejects 'conventional morality' in favour of its own self-serving 'natural morality', and advises senior members to be discreet about discussing the movement's more radical principles with the public [14]. This sort of secrecy and deception means that it is not really possible for a newcomer to exercise informed free choice about the benefits or otherwise of becoming involved with such a group.

This characteristic of independent belief systems, that they independently set their own moral codes, does not necessarily mean that devious psychological techniques will be employed to promote the belief system and gain adherents, only that they can be, and that there is little to stop them. Religious and quasi-religious groups alike are largely left to regulate themselves, and to adjudicate for themselves on what is devious and what is not.

Organisations and their Belief Systems

It is the combination of an independent belief system, together with an organisation devoted to promoting that belief system and gaining adherents, that creates a situation in which cultish patterns of behaviour may develop. [15] In order to identify whether a group or organisation might be a cult, it is necessary to enquire both into the group's organisational structure, and also into the underlying belief system.

Questions which might be asked about a group's organisational structure include: How tightly organised is the group? What sort of leadership structure does the group have? Is it a loose and informal association, or does the group have a significant proportion of full-time members? Does the group actively seek new members? Are there formal rituals for admitting new members? How strongly are new members encouraged to orientate their lives full time around the group's belief system?

Questions which might be asked of a group's belief system include: What sort of position does the belief system occupy along the quasi-religious spectrum? Does the group maintain good communication with a mainstream source belief system, or are they independent and self-referential? Is the group and its belief system open to investigation by outside agencies, or is the group inclined to resist this and to cry 'religious inquisition'? Who are the preceptors for the group? Who decides what is devious and what is not?

Moral non-accountability is one advantage of following an independent, self-originated belief system. There are other advantages which a cult enjoys, in terms of defending itself against investigation and criticism. These advantages stem from the subjective, intangible nature of personal belief and free will.

The Nature of Personal Belief

Free Will, Free Choice, and Personal Belief

The subjective, non-material nature of free will means that, strictly speaking, neither free will itself, nor any manipulation of free will, can be objectively proven. In any given case, it is always at least partly a matter of opinion as to whether a person might have acted out of their own free (and informed) choice, or whether their mental liberty and freedom of choice might have been restricted or unduly influenced in some way. It is ultimately unprovable either way.

The personal and experiential nature of the belief system promoted by a cult means that it is not possible for a person to exercise informed free choice in advance, about whether the belief system is valid or not, or about the benefits of following the study and training opportunities offered by the group. The benefits, if any, of group involvement can only be evaluated through personal experience, through spending a suitable period of time with the group. How long a suitable period of time might be, depends on the individual, and cannot be determined in advance.

Unfortunately, the subjective, non-material nature of free will means that a person who becomes involved with a cult and its belief system, and who subsequently comes to regret this, can never actually prove that they had not been acting entirely out of their own free will in becoming involved, or that their free choice had been in any way manipulated or deceptively influenced, even if it had.

If they claim that the cult's descriptions of its belief system were false or misleading, preventing them from making a reasonable and informed free choice before becoming involved, they will not really be able to prove this. This is because the subjective, non-material nature of personal belief is such that any descriptions of a belief system are also subjective and a matter of personal interpretation.

Therefore it is virtually impossible to prove that a belief system might have been misrepresented or falsely described. An allegation of misrepresentation can always be countered with an allegation of misinterpretation. A cult can always claim that critics have misunderstood the belief system itself, and have therefore simply misunderstood and misinterpreted the cult's description of its belief system.

The subjective, non-material nature of personal belief also means that any criticisms of the effect that exposure to a cult belief system may have had upon a person's mind or behaviour are unprovable, since it is impossible to prove what someone does or doesn't believe, and therefore it is impossible to prove any consequences. As long as the burden of proof remains with the critic, a cult can never lose, and criticism is impotent.

The net result is twofold: firstly, cults are spared any obligation to prove to the outside world that their members became involved purely out of their own free will and choice, and secondly, they are not obliged to prove that involvement is safe and not psychologically damaging.

It is possible to broadly place a group's belief system along the quasi-religious spectrum, based on investigating the belief system from the outside, as a non-believer. However, it requires more of an insider's perspective to understand the interior dynamics of the belief system, and in particular the effect that involvement with a cult belief system may have upon a person's mind or behaviour.

The Hermeneutics of Personal Belief

This kind of investigation, into the inner workings of a group's belief system, could be described as hermeneutical. Hermeneutics, derived from the Greek for 'interpret', is a philosophical tradition concerned with the nature of interpretation and understanding of human behaviour and social traditions.

An influential contributor to this tradition was the German philosopher Wilhelm Dilthey (1833 - 1911), who argued that the 'human sciences' could not employ the same methods as the natural sciences, but needed to use the procedure of 'understanding' (Verstehen) to grasp the 'inner life' of an alien culture or historical period. To understand the inner life of a belief system, an investigator has to go native and enter into the belief system to some extent. They have to become a believer, or at least suspend disbelief.

A belief system has both objective and subjective aspects. There are the formal doctrines and tacitly agreed behaviour codes of a particular faith or culture or group, and there is the complex of sometimes conflicting ideas, convictions, attitudes, and affiliations within the mind of an individual member of that particular faith or culture or group. Understanding a belief system requires some ability to empathise with the mind and outlook of a believer, and some ability to experience the emotions of a believer.

Clearly, there is a tension between being an impartial investigator and being a believer. Hermeneutics is sometimes divided into the 'hermeneutics of suspicion' and the 'hermeneutics of faith'. The French philosopher, Paul Ricoeur, wrote of this tension: 'Hermeneutics seems to me to be animated by this double motivation: willingness to suspect, willingness to listen; vow of rigor, vow of obedience.' [16]

A hermeneutic of faith allows an investigator to enter the inner world of a belief system. However, there is the danger that this may compromise an investigator's impartiality and objectivity, because an investigator has to make a paradigm shift and adopt, at least temporarily, a new set of beliefs. They have to go native to some extent, if they hope to understand the inner life of the belief system and see it through the eyes of a believer. Otherwise, they will always be an outsider, and not really able to comprehend the belief system.

However, if they do adopt a new belief system, even partially and tentatively, how can they at the same time maintain an impartial and objective perspective on it? Their old belief system may be incompatible with the new belief system, and in time the original reasons for undertaking an investigation may no longer seem entirely valid under the new belief system. An investigator needs some equivalent of Ariadne's thread [17], if they want to be confident of finding their way back out of the new belief system.

Of course, no-one is forced to join a cult. No-one is forced to adopt a new belief system, either as a whole or in part. Equally however, no-one, be they an independent academic investigator, a curious onlooker, or a potential new member, can understand a belief system, without trying it out first. It's not really a belief system, if you don't believe in it. Without some ability to see a belief system through the eyes of a believer, and to experience the emotions of a believer, an investigator will always remain an outsider. Without a hermeneutic of faith, and a 'vow of obedience', in Paul Ricoeur's phrase, an objective investigation ('vow of rigor') will be impotent and ineffectual.

Consequently, it is not really possible to make an informed choice in advance about whether or not to adopt, either wholly or in part, a new belief system. It is only really possible to make an informed evaluation of a belief system after having tried it out and lived it for a period of time. However, cult belief systems tend to have a particular set of characteristics which make it dangerous even to experiment with them. The dangerous characteristics of cult belief systems are that they are hierarchical and bi-polar in nature.

Hierarchical, Bi-polar Belief Systems

In general, cult organisations promote utopian ideals of self awareness or self-transcendence, ostensibly for the benefit both of the individual and of the world at large. For example:

'The central teaching of the Buddha is that we can change our lives. Buddhism offers clear and practical guidelines as to how men and women can realise their full potential for understanding and kindness. Meditation is a direct way of working on ourselves to bring about positive transformation. We teach two simple and complementary meditations. One helps us develop a calm, clear, focused mind; the other transforms our emotional life, enabling us to enjoy greater self-confidence and positivity towards others.' [7]

The type of belief system implied above is not unique to cults. Many belief systems could be described as aspirational or even utopian, in the sense that they proclaim an ideal to be realised, and propose a path or a lifestyle for believers that leads towards realisation of that ideal.

However, cult belief systems have some additional characteristics. Firstly, they tend to be hierarchical in perspective, revolving around ideas about lower and higher levels of personal awareness and insight. They also tend to be dualistic and bi-polar, in the sense that they tend to make a clear distinction between the two poles of the hierarchy. There is a clear distinction between lower and higher, between mundane and ultimate, etc. For example:

'I see it [the spiritual path] in terms of a very definite transition from what we regard as a mundane way of seeing the world and experiencing the world, to what we would describe as a transcendental way, seeing it in terms of wisdom, seeing it in terms of real knowledge, seeing it in terms of ultimate reality, seeing it in terms of a truer, wider perspective.' [18]

'spiritual life begins with awareness, when one becomes aware that one is unaware, or when one wakes up to the fact that one is asleep.' [19]

'We always have to be aware that our. . .um . . . what we think, is not true, until enlightenment.' [20]

Not only do cult belief systems tend to be dualistic or bi-polar in philosophical terms, they also tend to be bi-polar in psychological terms, rather like bipolar disorder or manic-depression.

Cult belief systems promote a utopian vision in which any individual, through following the group's teachings, can begin to realise their own higher potential, and can ultimately transcend the mundane. As outlined earlier, believers begin to aspire to a 'new life' or a 'new self', one which embodies the ideals and insights of the belief system. At the same time, cult belief systems encourage the aspirant not only to identify with this potential new self, but also to adopt the perspective of this new self, which sees their old self as comparatively inferior and unaware. It is ego-utopia or hubris for the new self, and ego-dystonia or shame for the old self. [21]

There are two intertwined aspects to this process:

From the ego-utopian side, a cult-type belief system presents a vision of an ideal new self, and this vision can become a source of vicarious pride for believers, as they identify with the ideal and bask in its reflected glory. Believers can begin to experience a feeling of intoxication with the ideals proclaimed by a cult, and a sense of pride in being associated with these ideals. As their commitment is recognised and acknowledged by the group's leaders, they may also develop a sense of pride in being admitted into an exclusive coterie.

Often, established cult members will tend to divide the world into the saved and the fallen, and seeing themselves as members of an elect, will look down compassionately upon those not yet fortunate enough to be initiated into their belief system. This vicarious pride or hubris-by-proxy can be intoxicating, and may possibly be one of the most addictive aspects of cult involvement.

In the absence of alternative sources of emotional nourishment, a believer can develop a psychological dependence on the feeling of enhanced self-confidence associated with being accepted as a member of an elite group. Like any addict, they can become dependent on their supplier. A believer can become dependent on the granting of recognition and appreciation by the believer's adoptive peer group, and by its leaders and hierarchs. This appreciation and recognition can usually be earned, like brownie points, by supporting the group financially or by working for the group in various ways.

From the ego-dystonic side, a hierarchical cult-type belief system, combined with the aspirational and idealistic ethos of a community of believers, tends to create an ego-dystonic dynamic within the group. Ego-dystonia tends to perpetuate itself given the two necessary conditions, which are a hierarchical, bi-polar belief system and a community of believers. The doctrines of a hierarchical belief system encourage ego-dystonia and also define the community; the community perpetuates the doctrines, and the aspirational nature of the doctrines consolidates the community's appeal as a refuge for ego-dystonics.

In other words, the belief system both creates a problem (ego-dystonia) and simultaneously offers a solution (ego-utopia). Like the proverbial chicken and egg, the whole thing can tend to become a self-perpetuating symbiosis.

As they try to practice the cult's teachings for themselves, believers tend to alternate between seeing themselves as fairly heroic in their efforts to achieve an ideal personality and to help bring about an ideal new world, and feeling guilty over their failure to overcome their recalcitrant old self, with all its supposed negativity and reactivity and sinfulness.

Cults don't usually try to induce extreme or pathological levels of ego-dystonic guilt in their members; milder levels can be just as effective. Mild guilt tends to be a good motivator, while excessive guilt tends to be disabling, and a disabled, de-motivated believer is of no use to a cult.

Cult leaders don't necessarily plan all this out in advance; these processes tend to occur naturally, given the necessary conditions. In fact, given the necessary conditions, it takes a positive effort to avoid becoming a cult.

In general, believers feel a pleasant ego-utopia or hubris so long as they remain in favour with the group, and an unpleasant ego-dystonic guilt if they are out of favour. For an outsider to be able to understand whether this type of psychological dependency may be a factor or not, it is necessary to have some understanding of the 'inner life' of the group's belief system, and to be able to empathise with the mind of a believer.

Hierarchies and the Politics of Personal Belief

The two conditions necessary for a cult to form are: a hierarchical, bi-polar type of belief system, and an organisation devoted to promoting that belief system.

Many belief systems, both cult and non-cult, are associated with a church or with an organised community of believers, which ostensibly supports and encourages individual believers in their efforts to realise the ideal for themselves. In the earlier section 'Organisations and their belief systems', various criteria were proposed as a means of investigating the nature of a particular community of believers. Enquiring about roughly where the belief system fits in the quasi-religious spectrum, and about who the preceptors and moral arbiters of the group are, can be helpful in gaining an understanding of the group.

The particular area of enquiry proposed in this section, is into the interior dynamics or politics of how an individual believer interacts with the organised body of believers.

Is the group's hierarchical belief system, with its beliefs about higher and lower levels of personal realisation, used to justify a hierarchical power structure within the organisation? Do the institutions of the belief system tend to support the aspirations of believers, or do they tend to subordinate believers to the interests of the organisation and its leaders?

It is sometimes suggested that Christianity began as a cult. The Christianity of Jesus and the disciples was unorthodox in its time, and might well have met several of the criteria proposed so far for identifying a cult. However, a saying such as 'The sabbath was made for man, and not man for the sabbath.' [22] does imply a spirit and an ethos centered on the needs of a believer, rather than on the needs of the belief system and its institutions. While it is admittedly rather difficult to know the 'inner life' of believers two thousand years ago, early Christianity would appear to fail Lifton's criterion for a cult of 'Doctrine over Person'.[9]

Potentially, any belief system can be interpreted either in a cultish or in a non-cultish manner. Personal belief can sometimes become institutionalised and harden into group ideology, and it is not always easy for an outsider to know to what extent this has happened. The Uruguayan theologian Juan Segundo, who considers that 'the alienating sin of the world is ideology', writes that:

'liberation from ideology requires opting for the exercise of an ideological suspicion in order to unmask the unconscious ideological structures which dominate and which favor a powerful, privileged minority.' [23]

In the case of a cult, the 'powerful, privileged minority' are the cult leaders and hierarchs. The difficulty is that an investigator (or potential new member) has to exercise a hermeneutic of faith as well as of suspicion, if they are to succeed in penetrating into the mind set of a cult member and in unmasking therein any 'unconscious ideological structures'. Only someone with hands-on experience of the ethos and interior dynamics of the group actually knows the point of view of an engaged believer. Only an insider can really tell us if the 'inner life' of the belief system serves the members or a privileged hierarchy.

This places an investigator in a dangerous paradox. On the one hand, they have to go native and enter into the belief system to some extent, if they want to know whether the 'inner life' of the belief system serves the members or the hierarchy. But the difficulty with experimenting with a hierarchical, cult-type belief system, is that it is impossible at any point for an investigator to know how far to go, or when to stop. Having begun to adopt, cautiously and on a trial basis, elements of a hierarchical, dualistic belief system, it is never possible to know when the belief system has been given a fair trial.

This uncertainty arises because of the very nature of a hierarchical cult-type belief system, with its ideas about higher and lower levels of personal understanding. At no stage can an investigator or a newcomer eliminate the possibility that they have failed to attain any more than a mundane level of understanding of the group's beliefs. They can never be sure that a breakthrough to a deeper level of understanding is impossible, or that valuable insights will definitely not result from attending the next training course or residential weekend offered by the group. Or from the next course after that. A hierarchical cult-type belief system is like an endless road to an uncertain destination.

This is because a hierarchical type of belief system, with its ideas about lower and higher levels of awareness and understanding, is intrinsically non-falsifiable [24]. No counter observations or criticisms of a hierarchical type of belief system can ever be established as objectively true. It is impossible for an investigator to prove any fault with the tenets of a hierarchical belief system, even after long term personal experience of the belief system, or to censure any of the methods (short of physical force) which might be used to promote such a belief system.

It is never actually possible to prove that a group promoting such a belief system has used 'devious psychological techniques to gain and control adherents', even if they have, because critics can never prove that their criticisms are not based merely on mundane ignorance and misunderstanding. From the perspective of a hierarchical, dualistic type of belief system, critics are deemed to be at a lower level of awareness, and are thus effectively disenfranchised.[25]

Any attempt at debate with the hierarchs of a cult is doomed, because a critic can never disprove the hierarchs' claim to a special revelation, or to a more profound understanding of the group's core beliefs. So attempts to reform a cult from within tend to be futile. It may also be difficult to warn outsiders what the 'inner life' of the belief system is actually like, because critics can never actually prove that their criticisms are objectively valid, not just personal and subjective.

These difficulties tend to be characteristic of cult-type belief systems, and help to put cult organisations beyond the reach of any outside authority.

Recruitment by Cults

The Cult Recruitment Process

By no means everyone who encounters a cult will be drawn in, so clearly mind control techniques are not all-powerful. In general, less than 10%, and probably closer to 1%, of people who attend a cult's introductory talk or a short course might go on to become full members of the group.

The process of recruitment involves befriending and then mentoring or discipling a newcomer, and this takes time. An established member may only be able to effectively befriend and mentor two or three newcomers at a time, so there is an arithmetical limit to the rate at which a cult can recruit new members, however many people may attend their introductory events.

There is often an element of deception or disingenuousness in the way that cults present themselves to the public. Someone encountering a group such as 'Sterling Management' (Scientologists) or 'Women's Federation for World Peace' (Moonies) may have no particular reason to be cautious of the group. Initial contact is usually achieved via an ostensibly neutral agency which has no visible cult associations, such as a meditation centre or a stress management course.

Once initial contact has been established, selected individuals are targeted by the group's recruiters. In that sense, a person doesn't choose a cult, the cult chooses them.

A cult recruiter's role is essentially to make a newcomer feel welcome and appreciated, and to encourage them to feel an affinity for the idealistic belief system of the group. If this can be achieved, the belief system itself will largely do the rest. It is the belief system itself which is the primary active agent in cult mind-control.

Established members acting as recruiters will not wish to feel that their efforts have been wasted, and will tend to focus on individuals who appear more open to the ideals of the group. Recruiters are instinctively able to spot people who are similar in outlook and temperament to themselves, with whom they can simply re-enact the same processes by which they themselves were originally drawn into the group.

Of course, recruiters are unlikely to consciously think of themselves as 'recruiters'; they are more likely to see themselves as altruists, reaching out to share their aspirations and beliefs with others. It can be rather like a chain letter or pyramid sales scheme.

One of the ways in which established members may gain ego-utopian brownie points is by attracting new members to the group. A successful recruitment tends both to enhance a recruiter's status within the group, and also to confirm their own faith and confidence in the group's belief system. This ego-utopian feedback process provides cults with a well motivated sales force that would be the envy of many conventional businesses.

The young and idealistic may be vulnerable to recruitment, as may individuals who are undergoing some change or uncertainty or re-evaluation in their lives, for example when leaving home to begin college, leaving college to enter the job market, changing jobs, or after a bereavement. This kind of situation can present a chance for a cult recruiter. People who maintain an established career and circle of friends are less likely to be drawn in to any depth.

The processes of recruitment are largely invisible to an onlooker. To understand the kinds of processes that occur within the minds of recruiter and recruitee, it is necessary to have some understanding of, and empathy for, the 'inner life' of the group's belief system.

Leaving a Cult

Disability Arising from Cult Involvement

As discussed earlier, the personal and experiential nature of belief systems in general means that it is not really possible to exercise informed free choice in advance, about the merits or otherwise of a belief system and set of attitudes promoted by a particular group. In order to make a meaningful evaluation of a belief system, it is necessary to go native and become a believer to some extent. Some affirmation of faith in the belief system is usually required of a believer, and there may be formal rituals of initiation into membership.

However, the hierarchical, bi-polar nature of cult belief systems makes it dangerous even to experiment with them. Unless someone became involved with a cult purely as a research project, with the clear intention of rejecting the belief system and leaving the cult environment after a set period of time, escaping from a cult belief system may not be all that easy.

Cults will do their best to ensure initial apparent benefits for new members. A cult might be compared to a card sharp, who will let a newcomer win the first few games in order to take all their money in the long run [26]. There is no problem, so long as a member is happy to continue their involvement. However, should a member become unhappy with their involvement, or develop serious doubts about the belief system, the process of disentanglement may not be very straightforward.

Rejecting the belief system in its entirety may not be easy, or even desirable. Even after physical contact with the group has ceased, elements of the cult belief system are likely to linger in the mind of an ex-member for some time, depending how deeply and for how long they were involved with the group. They may experience feelings of anxiety and disorientation as they try to rid themselves of the unwanted remnants of the cult belief system and the cult way of relating to the world, while simultaneously trying to regain some confidence either in their old, pre-cult belief system and ways of relating to the world, or alternatively, in some new, post-cult belief system.

For a while, an ex-member may exist in a sort of limbo between the cult world and the outside world, unsure which to believe in. To the extent that the cult belief system retains any degree of respect or credibility within an ex-member's mind, then to that extent leaving the group will seem like abandoning the ideals and aspirations of the group's belief system, and therefore like a failure.

On the other hand, to the extent that the cult belief system fails to retain credibility and is eschewed, to that extent an ex-member will tend to feel shame at their foolishness and gullibility in having once adopted beliefs and aspired to ideals which they now regard as unrealistic.

So either they are a failure, or a gullible fool. Either way their self-esteem takes a knock, and they may find it difficult to have confidence in their own judgement, or in their ability to come to reasonable decisions.

In general, little systematic research appears to have been done into the after-effects of cult involvement. Although there is a good deal of anecdotal evidence, it is difficult to quantify this or to put it into an appropriate perspective. It is difficult to agree on the definition of terms, and some researchers consider the testimonies of ex-cult members to be inherently unreliable. The threat of litigation may also inhibit research in some cases.

Consequently, it is not clear whether adverse reactions to cult involvement are relatively common and typical, or uncommon and atypical. It is difficult to know to what extent any adverse reactions may be due to cult involvement, or to what extent they may be due to other independent factors.

Nevertheless there is a substantial body of concern about cults and the effect that they can have upon their victims. This concern has been expressed for example by former RAF psychiatrist, Dr. Gordon Turnbull, who debriefed Beirut hostages Terry Waite and John McCarthy, and who sees similarities between them and some former cult members. In a BBC Newsnight report on cults, Dr. Turnbull commented that:

'There are, obviously, great similarities between hostages who have been kidnapped, who have been rescued, and cult victims who are re-emerging into normal life. The phenomena, the features of their regaining of control and regaining of responsibility are very similar. The symptoms that they display are similar too. The great difficulty is, however, that the symptoms closely resemble symptoms of major psychiatric illness.' [27]

The difficulties which ex-members may face in regaining a sense of control and responsibility in their lives tend to result from their difficulties in finding a belief system they can believe in, rather than from weakness of character. A person's belief system includes moral codes and codes of conduct; it includes their beliefs about right and wrong, and about what constitutes responsible behaviour. A person's belief system determines the way they behave.

When there are two distinct and somewhat incompatible and conflicting belief systems, the cult and the non-cult, vying for supremacy in an ex-member's mind, the conflict between them may cause mental disorientation. It may cause uncertainty and confusion about right and wrong, and about what is an appropriate and responsible way to behave in a given set of circumstances.

The two rival belief systems, the cult and the non-cult, may coexist in an ex-member's mind for some time, and an ex-member may oscillate or flip-flop between the two. For a while, they may not know what to believe, what to think, or who to trust. Using the metaphor of two selves, the cult self and the non-cult self, to represent these two states of belief, the psychiatrist Robert J. Lifton commented:

'The two selves [cult and non-cult] can exist simultaneously and confusedly for a considerable time, and it may be that the transition periods are the most intense and psychologically painful as well as the most potentially harmful.' [28]

This analysis argues that understanding the 'inner life' of a belief system is crucial to discriminating between cults and non-cult organisations, and that it is also a key to understanding the particular nature of a cult. This analysis also suggests that the concept of bi-polar mind-control, with its implicit pressure to reject the flawed 'old self' in favour of an ideal 'new self' (or cult self), can be a useful tool in understanding the sort of mental space a cult member or former member will be operating in. It can be helpful to understand which self, either the old self with its old set of beliefs or the new self with its new set of cult beliefs, is more dominant at any particular time.

If you criticise a cult member, this may just encourage their tendency to see themselves (their old self) as flawed, and may push them further into the cult. If you criticise their church or group, the cult-member will go into cult-self mode, and will see your criticisms as tending to confirm the cult's warnings about the outside world and its negative effects.

A better approach may be to acknowledge and encourage a cult member's old self, without criticising or threatening the new cult self. If a cult member feels valued in themselves, and their old self does not feel devalued, then this weakens the cult's attraction for them.

As former Moonie Steven Hassan writes:

'I will never forget the simple gift of a cold drink on a hot day from a stranger as I sold flowers on a New York City street. By treating me with compassion, he helped to undo the Moonie-programmed belief that the ''outside'' world was evil. We should never pass up an opportunity to reach out to a cult member, whether he may be someone we know or a stranger. It could help open the door to his freedom.' [29]

Problems in Exposing Cults

Difficulties in Identifying a Cult

It is difficult for an outsider to know whether a particular group is a cult, or may have developed cultish undercurrents. Although there are some pointers and external indicators, it really takes an insider's perspective to know what goes on inside a particular group. Only insiders can really blow the whistle on any abuses within cults.

In theory, it might be possible for a cult to be a harmless or even a beneficial organisation. Mind control can be used beneficially, for example to cure people of drug addiction, through reorienting their beliefs and self-image away from addiction. One of the UK's leading cult experts said that she first became interested in cults when she became aware that cults were using techniques similar to those that were sometimes used therapeutically within the medical profession to cure people of drug addiction. Rev. Jim Jones (of the Jonestown massacre) started off as a drugs counsellor in New York. The Scientologists claim to be able to cure people of drug addiction, and they probably can. The FWBO has plans to set up a drug rehabilitation unit with help from the Dutch bank Triodos. There are allegations that some Alcoholics Anonymous groups have developed into abusive cults.

The problem is that abuses can occur when powerful techniques are used in a situation without proper checks and balances. In theory, it is possible for a cult to be entirely beneficent, but because of human nature and the non-accountability of cult leaders, such cults are comparatively rare. Most cults sooner or later are revealed to have fallen prey to some degree to their leadership's desires for adulation, money, power, or sex.

A cult will tend to deny and cover up any abuses by its leadership, and details may only emerge years later [30]. A cult is more or less immune from outside investigation or regulation, because psychological coercion in the form of brainwashing or mind control is almost impossible to prove. This difficulty of proof stems mostly from the subjective, intangible nature of personal belief itself, as discussed earlier, but there are some additional practical obstacles which may face a whistleblower, someone who becomes openly critical of the cult they were once a member of.

Difficulties Facing Critical Ex-members

In general, cults have a hierarchical or pyramid type of structure. At the lowest level, members are part-timers who are only partially committed to the group and who are only lightly brainwashed. All the cult leadership really requires of this level is that members should speak well of the group and be generally positive. Members at this level have little power or influence, and are unlikely to be aware of the full range of the cult's teachings, knowledge of which is restricted to a trusted inner circle of committed, full-time members.

Members at a part-time level of commitment are less likely to be manipulated or abused to any significant extent, because achieving strong influence over a person really requires that they be exposed to a mind control environment on a more full-time basis. Mind control only works on a foundation of personal friendship and trust, and it takes time and effort to establish this foundation. Strong and intensive mind control is partly a one-on-one process, in which the controlee is assigned a personal mentor, a more senior and experienced member, who is willing to devote the patience and effort needed to coach the aspirant/controlee through the beliefs and practices of the group.

For this practical reason, therefore, intensive mind control is generally only applied to selected individuals who are perceived to be not only receptive, but who also have something that the group leadership wants. Sometimes this is money or sex, or it may be some practical or business skill which is desired by the group leadership in order to expand the group or to raise money. In some groups, the great majority of members are not specially targeted, and are only relatively lightly brainwashed.

A person involved at a more superficial level may find it genuinely difficult to believe what goes on in some of the more committed levels of membership. Members who have not been specially targeted, and who have enjoyed the warmth and friendship of the group without having been exposed to its darker side, will tend to think well of the group, and may be puzzled by criticisms of it.

These positive and supportive members can be used as a public relations shield, to counter any allegations against the group, and to reassure new members. Individual critics can be simply outnumbered and their criticisms discredited.

Even if a member involved at a less committed level is not swayed by the general air of positivity, and does develop suspicions about the group, they are unlikely to have enough inside information about the group to be able to verify their suspicions, or to be in a position to effectively warn others of potential problems. Nevertheless, the mere suspicion that a group might be a cult can be enough to deter a person from becoming involved, and so it can still be worth making relevant criticisms and sowing the seeds of suspicion.

If a critic is an insider, someone who has been more deeply involved and who has enough inside knowledge about a cult to be able to make detailed criticisms, they will still be unable to prove anything (because of the subjective nature of personal belief in general, and the non-falsifiable nature of cult belief systems in particular). They will be unable to prove that the group used deception or misrepresentation in marketing the benefits of participating in group-run courses and activities.

If an ex-member claims that they were subjected to brainwashing or mind-control techniques, not only is this again unprovable, but in the mind of the general public, it is tantamount to admitting that they are a gullible and easily led person whose opinions, consequently, can't be worth much. If an ex-member suffers from any mental disorientation or evident psychiatric symptoms, this is likely to further diminish their credibility as a reliable informant.

Additionally, dissatisfied members or other critics have great difficulty in disproving ad-hominem arguments, such as that they just have a personal axe to grind, that they are trying to find a scapegoat to excuse their own failure or deficiency, or that they are simply being subjective and emotional. Cults have a vested interest in challenging the personal credibility of their critics, and may cultivate academic researchers who attack the credibility and motives of ex-members [31].

In general, the public credibility of critical ex-cultists seems to be somewhere in between that of Estate Agents and flying saucer abductees.

Summary of Advantages Enjoyed by Cult Organisations

To summarise, a cult - defined as an identifiable, organised group of people holding to an independent belief system which primarily originates or is primarily interpreted from within the group, and which has a hierarchical organisational structure based on that belief system - is to a large extent immune from outside criticism, either of its belief system or of the methods used to recruit followers, because:

1. legal criticism is ineffectual, firstly because freedom of belief laws largely protect cults from outside investigation or regulation, and secondly because of the subjective, non-provable nature of personal belief itself.

2. moral criticism is ineffectual, because a cult belief system can set its own self-justifying moral codes.

3. philosophical or theological criticism is ineffectual, because a cult belief system follows its own internal logic, which is impenetrable to an outsider.

4. empirical or scientific criticism is ineffectual, because the tenets of a cult belief system are non-falsifiable.

5. criticism by ex-members is ineffectual, because apostates tend to lack credibility for a variety of reasons.

Immunity from outside criticism and regulation does not, in itself, necessarily mean that a group will develop and use what might be considered, by the standards of the mainstream, deceptive or devious psychological techniques to gain or control adherents. It only means that they can, and that there is little comeback if they do.

Religious freedom and freedom of belief laws tend to protect the rights of religious and quasi-religious organisations, much more than they protect the rights of individuals who may become involved with those organisations and their belief systems.

Text © Mark Dunlop 2004.



1 Collins English Dictionary definition of cult:
1. a specific system of religious worship, esp. with reference to its rites and deity.
2. a sect devoted to such a system.
3. a quasi-religious organization using devious psychological techniques to gain and control adherents.
4. Sociol. a group having an exclusive ideology and ritual practices centered on sacred symbols, esp. one characterized by lack of organizational structure.
5. Intense interest in and devotion to a person, idea, or activity: the cult of yoga.
6. the person, idea, etc., arousing such devotion.
7a. something regarded as fashionable or significant by a particular group. b. (as modifier): a cult show.
8. (modifier) of, relating to, or characteristic of a cult or cults: a cult figure.
[from Latin cultus cultivation, refinement, from colere to till].

The Collins 1 definition refers to the term religious worship. Religion (meaning a particular system of faith and worship) derives originally from the Latin religio, -onis, which meant 'obligation, bond, reverence'.

The Collins 2 definition refers to the term sect. This word comes from the Late Latin secta, which means an "organized church body." That in turn is rooted in the Latin sequi, which means 'to follow,' and is used of a 'way of life', or a 'class of persons'. Sect can refer to:

· a religious denomination
· a dissenting religious group, formed as the result of schism (division; separation, from Greek skhisma, -atos 'cleft', from skhizo 'to split'). In this case, the term sect also borrows from the Latin sectus, which means 'cut' or 'divide'.
· a group adhering to a distinctive doctrine or leader

Theologically, sect is used of a group which has divided from a larger body or movement - generally over minor differences in doctrine and/or practice - but whose teachings and practices are not considered unorthodox or cultic (theologically and/or sociologically). However, in some countries sect is used instead of - or interchangeably with - cult.

In general, there are two main ways in which the word cult is used: the academic and the popular. Academic usage tends to try and make cult cover all eight (in Collins' classification) shades of meaning, while popular usage is more specific, and tends to equate cult with either the Collins 3 or the Collins 7 meaning. Likewise, popular usage of the word religion tends to be more specific, and tends to imply an established system of faith and worship, like Christianity. It is arguable that, in the West, the modern usage of the term religion, to imply an established system of faith and worship (usually one which believes in a single creator Deity), dates from the fourth century AD. In 313 Constantine the Great issued an edict of toleration for all religions, and in 380, Theodosius I made Christianity the official religion of the Roman Empire. Prior to this, Christianity had in general been seen as just another 'cult' (in the non-pejorative Collins 1 sense of 'a system of religious worship'), much like, for example, the cult of Diana/Artemis. Theodosius I effectively promoted Christianity from 'cult' (system of faith and worship) to 'religion' (established system of faith and worship).

For a long time subsequently, within Western academic and scholastic traditions, religion meant Christianity, and anything else tended to be dismissed as a 'native cult' (eg. 'the cult of Lamaism'). Over the last century or so, the West has gained a greater understanding of foreign cultures, and systems of faith like Islam and Buddhism have, in both popular and academic usage, been promoted from 'native cults' to the status of 'world religions'. The faith systems of even fairly obscure and only recently discovered ethnic tribes are nowadays mostly referred to as native 'religions'. The term cult in this context, while possibly technically correct, tends to be viewed as disparaging and carrying undesirable overtones of cultural imperialism.

Religion comes from the Latin religio, -onis, which meant 'obligation, bond, reverence', and did not originally necessarily imply belief in a single creator God. If the terms 'religion' and 'worship' can legitimately be used in a secular context (eg. 'Football is their religion'), as a synonym for the honour, respect, and reverence paid, in varying degrees, to various secular ideals and personalities, the second and subsequent meanings of cult given above can, with only a little stretching, be interpreted as referring to specific systems of secular 'worship'. For example, Collins gives: '7a. something regarded as fashionable or significant by a particular group. b. (as modifier): a cult show.'

An example in this context might be J.D. Salinger's 'Catcher in the Rye'. In the 1950s, this book became popular among young people and came to be regarded as a cult novel. The book told the story of an American adolescent, Holden Caulfield, growing up in an adult society after World War II, and its theme, of the relation of the individual to the world around him, can be compared to the slightly more mainstream philosophy of Existentialism, which is concerned with issues of the individual self, freedom of choice, and personal responsibility, in a world which does not make sense. Existentialism travelled to America with the GIs returning in 1945 from war service in France, and strongly influenced the 'beatnik generation' and, presumably, Salinger himself.

Existentialism can be related both to later philosophical attitudes, such as the alienated vision of Generation X, and to earlier cultural and artistic movements like Surrealism and Dada, which had in turn partly arisen out of a desire to make sense of the experiences of the first world war. The term Existentialism, much like the terms religion and cult, can acquire different shades of meaning, depending on whether the term is being used in a popular, academic, literary, artistic, or political context. The various flavours of Existentialism coexist with assorted flavours of other current philosophies, like post-modernism, structuralism and situationism. None are easy to define.

Existentialism and its cousins can be regarded as sub-sets of the mainstream Western secular ideal of democracy, the ideal of the responsible individual exercising informed freedom of choice. While it is probably stretching the language a little to say that mainstream Western secular society actually 'worships' this ideal, nevertheless the ideal is granted considerable honour, respect, and even reverence. (For example, from the American Declaration of Independence, 1776: '…We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. ----- That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed,…' or, for example, J.S. Mill, On Liberty, 1859: 'Over himself, over his own body and mind, the individual is sovereign.')

This ideal, of the responsible individual, was born out of the secular-humanist concept of the individual personality, or self. This self or individual personality had, in turn, been the replacement (radical at the time) for the 'soul' of the Christian ideology that had dominated Europe in the preceding centuries. As with most complex systems, previous paradigms do not suddenly vanish, they linger in the background. For example, the 'soul' remains a meaningful concept in many religious contexts, while 'self-understanding' and 'self-esteem' can sometimes play a similar idealised and mythologised role in current psychoanalysis and popular psychology, for example in terms of the 'cosmic self' or 'the warrior within'.

Both culture and personal belief are complex matters, and this complexity is reflected in the difficulty of precisely defining terms such as cult, sect, religion, soul, self, liberty and responsibility. These terms refer to processes and behaviours as much as to finite states. They are all 'big' words which are difficult to define neatly or precisely, and whose meaning may include various emotional associations or contextual nuances which may change over time. Language is a fluid medium.

The intention of the above comments on the definition of 'cult' and related terms is neither to set out a precise terminology, nor to excuse sloppy terminology, but somewhere in between. This analysis acknowledges the range of meanings that the term cult may be expected to carry, and has no wish to exclude any nuances of meaning. The main text introduces the notion of the 'quasi-religious spectrum' (page 6) as a means of allowing an appropriate flexibility or 'fuzziness' in the definitions of the terms cult, sect, and religion.

2 The Concise Oxford Dictionary defines brainwash as: to 'subject [a person] to a prolonged process by which ideas other than and at variance with those already held are implanted in the mind'

The term 'brainwashing' was first used in 1953 to describe techniques used by the Chinese Communists to subvert the loyalty of American prisoners captured in Korea. Brainwashing in this original sense involved physical coercion: imprisonment, food and sleep deprivation, and sometimes torture.

In recent years, various people concerned about cults have tended to use terms like 'mind control' or 'thought reform' to describe a brainwashing or indoctrination process which does not involve physical coercion. This kind of non-coercive process has the great advantage (from a cult's point of view) of not leaving any physical evidence, and of therefore being very difficult to prove. Whilst there is evidence that some cults have used physical coercion, in general cults are keen to distance themselves from such practices. There has been a kind of Darwinian evolution among cults, in that those which have survived and prospered have tended to be those which have succeeded in developing effective, but non-physically coercive, processes to 'brainwash' a person.

3 'A remarkable thing about cult mind control is that it's so ordinary in the tactics and strategies of social influence employed. They are variants of well-known social psychological principles of compliance, conformity, persuasion, dissonance, reactance, framing, emotional manipulation, and others that are used on all of us daily to entice us: to buy, to try, to donate, to vote, to join, to change, to believe, to love, to hate the enemy.' -Philip G. Zimbardo, PhD, professor of psychology at Stanford University and a former American Psychiatric Association president.

There are a number of interesting books on the subjects of marketing and persuasion, eg.:
'Influence, the Psychology of Persuasion', by Robert Cialdini
'The Guide to Identity', 'Corporate Identity', both by Wally Olins

4 From p.2 of Albert Einstein's Autobiography "The World As I See It" translated by Alan Harris pub Bodley Head 1935 ISBN 0-8065-0711-X

5 From p.76 of Schopenhauer's 'On the Freedom of the Will' translated by Konstantin Kolenda pub Basil Blackwell, quoting from Thomas Hobbes 'Of Liberty and Necessity' The English Works of Thomas Hobbes - London 1840.

6 From the same Schopenhauer book as note 5.

7 FWBO Norwich Buddhist Centre leaflet and programme of classes, Autumn 1999.

8 Sangharakshita, Mitrata Omnibus, pub. FWBO Windhorse Publications 1980, p38. Originally transcribed by Jinamata from the lecture H8 'The Symbolism of the Five Buddhas "Male" and "Female".', and revised by Sangharakshita. ['Bodhi is a state of insight ...']

9 Robert J. Lifton's eight criteria of mind control:

Adapted from Robert Jay Lifton's Thought Reform and the Psychology of Totalism (Norton, 1961, now reprinted by the University of North Carolina Press)

Dr. Lifton's work was the outgrowth of his studies for military intelligence of Mao Tse-Tung's "thought-reform programs" commonly known as "brainwashing." In Chapter 22, Lifton outlines eight criteria which can be used as indicators when investigating whether an environment can be understood as exercising "thought-reform" or mind control. Lifton wrote that any group has some aspects of these indicators. However, if an environment exhibits all eight of these indicators and implements them in the extreme, then there is the possibility of unhealthy thought reform taking place.

1. Milieu Control
Environment control and the control of human communication. Not just communication between people but communication within people's minds to themselves.

2. Mystical Manipulation
Everyone is manipulating everyone, under the belief that it advances the "ultimate purpose." Experiences are engineered to appear to be spontaneous, when, in fact, they are contrived to have a deliberate effect. People misattribute their experiences to spiritual causes when, in fact, they are concocted by human beings.

3. Loading the Language
Controlling words help to control people's thoughts. A totalist group uses totalist language to make reality compressed into black or white - "thought-terminating clichés." Non-members cannot simply understand what believers are talking about. The words constrict rather than expand human understanding.

4. Doctrine Over Person
No matter what a person experiences, it is the belief of the dogma which is important. Group belief supersedes conscience and integrity.

5. Sacred Science
The group's belief is that their dogma is absolutely scientific and morally true. No alternative viewpoint is allowed. No questioning of the dogma is permitted.

6. The Cult of Confession
The environment demands that personal boundaries are destroyed and that every thought, feeling, or action that does not conform with the group's rules be confessed; little or no privacy.

7. The Demand for Purity
The creation of a guilt and shame milieu by holding up standards of perfection that no human being can accomplish. People are punished and learn to punish themselves for not living up to the group's ideals.

8. The Dispensing of Existence
The group decides who has a right to exist and who does not. There is no other legitimate alternative to the group. In political regimes, this permits state executions.

It could be argued that all eight of Lifton's criteria (for example, milieu control or information control) are applicable to society at large, and can be observed in operation within various groups, both cult and non-cult. It could equally be argued that all eight of Lifton's criteria in fact primarily reflect the nature and interior dynamics of a hierarchical belief system, one which includes beliefs about higher and lower levels of personal awareness and understanding, and ideas about rejecting the old self and developing a new self.

Lifton's criteria may be more illuminating about cults when the criteria are interpreted as descriptions of the interior world or self-view of a person who believes in such a hierarchical, cult-type belief system. In this perspective, Lifton's Demand for Purity could be broadly interpreted as the desire of a believer for the purification of their old self and the creation of a pure new self.

Lifton also wrote the following about 'The demand for purity' in the essay 'Cults: Religious Totalism and Civil Liberties', (included in the book 'The Future of Immortality and Other Essays for a Nuclear Age', by Robert J. Lifton, pub. New York, Basic Books, 1987):

'The demand for purity can create a Manichean quality in cults, as in some other religious and political groups. Such a demand calls for radical separation of pure and impure, of good and evil, within an environment and within oneself. Absolute purification is a continuing process. It is often institutionalized; and, as a source of stimulation of guilt and shame, it ties in with the confession process. Ideological movements, at whatever level of intensity, take hold of an individual's guilt and shame mechanisms to achieve intense influence over the changes he or she undergoes. This is done within a confession process that has its own structure. Sessions in which one confesses to one's sins are accompanied by patterns of criticism and self-criticism, generally transpiring within small groups and with an active and dynamic thrust toward personal change.'

10 Alan Gomes, Unmasking the Cults, Zondervan 1995

11 Panel discussion with Dr David Jenkins et al, broadcast in July 1999 as part of the ITV series on the history of Christianity, '2000 Years', hosted by Melvyn Bragg.

12 Lawsuit filed on 16 August 1999 in Baltimore, Maryland, USA, by a coalition of plaintiffs, including the Seventh-day Adventists and the 'International Coalition for Religious Freedom' (funded primarily by the Unification Church or Moonies)

13 Re: justification of lies and deception. From a conversation posted to an internet newsgroup on Fri 24 Mar 2000, on the thread: '$cientology: Cult of LIES'

Poster 1
Lying implies some kind of malicious intention

Poster 2
Lying does not by definition imply malicious intent - you can lie out of politeness, out of pity, because you think the truth would be bad for that person, etc. etc. etc.
Outright lying means to tell people things that you do know are not true - Scientologists do that at times, when they are honestly convinced that this is better for Scientology.
But there are finer variations where one can convince oneself, that one is not really lying: You can tell only a small part of the truth, you can tell the truth in a way that the other person does understand it differently as it is, you can consciously omit important facts, you can formulate things generally and unspecific - all with the intent that the other person judges the situation according to your wishes while you objectively do not give the person the information necessary to judge the situation impartially.
Sure you can say, that you did not lie - but you did not tell what is, in your conviction, the full truth.

Poster 1
Because scientologists believe in their tech does not make them liars. For the most part, scientologists are honest and well-meaning. Just like most critics.

Poster 2
The point is not, that scientologists believe in their tech - they have the right to that.
Also Scientologists themselves do try their best to act ethical (as they define it) and they are sincere in that.
But Scientologists do have their own definition about ethics which does not fully correspond with the general understanding about ethics.
And also Scientologists do have their own understanding about reality, about what to tell other people as truth about Scientology - again their view of these things is in conflict with the general understanding.
This does not only concern staff - also public Scientologists are formally and informally told how to best explain Scientology to others, what to mention, how to mention it, what not to mention - all with the best intent, but the result is still, that you cannot, as an outsider, get fully informed about Scientology by a Scientologist. I do know that, because I did it myself and I taught it myself for years.
Hardly any Scientologist lies consciously to you - but he tells you only a truth he thinks acceptable to you (and this might be so small a part of truth, that it results in disinformation, not information)

Poster 1
Maybe this is true in any religion. If there is really no heaven, have the priests all lied to us?

Poster 2
A voodoo adherent who firmly believes in voodoo, does not lie to you, when he tells you about voodoo, if he is sincere in telling you what he believes.
A moslem who believes that non-moslems will go to hell, does not lie, if he tells what he sincerely believes - no matter, if the moslem hell factually exists or not.
A protestant priest who does not believe that Jesus rose from the dead and still preaches he sure did, is lying - he tells something as truth which he does not believe in.
A Scientologist who says Scientology is a religion and privately thinks it is a technology for self-betterment does lie to you - what he is saying is not what he believes. A Scientologist who believes Scientology is a religion and says it is a philosophy because a as religion it would not be acceptable in e.g. Greece, is lying - what he thinks and what he says is not the same and he knows it.
One of the problems of Scientology is, that people are actually taught to tell outsiders not what insiders see as the truth - and that Scientologists feel it is ethical to do that. This sort of re-education about what is ethical or not does lead to conflicts with non-Scientologists.

14 Jayamati, in Shabda, August 1998, p.59. Also: Ratnavira, in Shabda, April 1998. 'There is … the question of public image and reality. There is a public image of the Order which is presented through our publications … and there is what the Order really is … the two are quite different.'

15 Re: Organisations and their belief systems
The first three definitions of cult given in Collins Dictionary all imply an element of organisation, while Collins' fourth and subsequent definitions of cult refer to phenomena which are characterised by a relative lack of organisation: '4. Sociol. a group having an exclusive ideology and ritual practices centered on sacred symbols, esp. one characterized by lack of organizational structure.'

Cults in the sense of fads and fashions (as in: 7a. something regarded as fashionable or significant by a particular group. b. (as modifier): a cult show.) tend to be relatively disorganised phenomena; there may be a belief system, or a set of attitudes, but there tends to be relatively little desire to set up an organisation to promote those beliefs and attitudes. Again, there is something of a spectrum among 'fads and fashions', ranging from the anarchic to the slightly organised.

16 Paul Ricoeur, Freud and Philosophy: An Essay on Interpretation, New Haven: Yale University Press, 1970, p. 27

17 In Greek mythology, Ariadne was the daughter of Minos, King of Crete. When Theseus came from Athens as one of the sacrificial victims offered to the Minotaur, she fell in love with him and gave him a ball of thread, which enabled him to find his way out of the labyrinth after killing the Minotaur.

18 Sangharakshita, 'Going for Refuge', T.V. programme, BBC East 12.11.92.

19 Sangharakshita, 'Mind - Reactive and Creative', page 8, pub. FWBO 1971.

20 Alaya (FWBO Order member), telephone conversation 28.3.1994

21 Ego-utopia/ego-dystonia
A utopian vision of an ideal 'new world' or 'new self' does imply an opposite pole, a dystopian or dystonic vision of a dysfunctional old world or old self. Applying these terms to the inner world of a person's ego, to their subjective sense of themselves as an individual, rather than to the world at large, 'ego-utopia' could be said to be a tendency towards an unreasonably high level of self-esteem or hubris (overweening personal pride and arrogance), with 'ego-dystonia' as its opposite, a tendency towards shame and guilt and an unreasonably low level of self-esteem.

It might seem more logical to adopt the term ego-dystopic rather than ego-dystonic. The word dystopia is used, for example in 'a dystopian, nightmarish vision of society', but there is also the established term 'ego-dystonic sexual orientation', and it seems more sensible to be consistent with the latter usage.

The term ego-dystonia is used by the American Psychiatric Association (APA) in their Diagnostic and Statistical Manual of Mental Disorders. For example, when homosexuality was declassified as a 'disorder' in 1973, a diagnosis was left as a residual of the former; that is, the diagnosis of 'Ego-dystonic Homosexuality.' Ego-dystonic here means that the person's (homo)sexual orientation is not compatible with where they think they 'should' be, to such an extent that it is causing 'marked decline in social functioning,' or 'dysfunctionality'. While it may result in shame, the suffering does not come from being gay in and of itself, but rather from the fear and self-loathing learned within a hostile family and/or social system which defines gay people as second-class citizens. Ego-dystonic sexual orientation is also recognised in the International Classification of Diseases, where it is classified as an adjustment disorder (F66.1, ICD 10).

Ego-dystonia, or a diminished sense of self worth compared to a peer group, is not necessarily confined to the area of sexual orientation. Ego-dystonia can result from exposure to a variety of situations encountered within society at large. It can result from experiences of sexual or racial stereotyping, or as a result of bullying at school or in the workplace, or sometimes as a result of social deprivation or a difficult family background.

Of course, individual experiences do vary; some people seem more robust and thick-skinned than others, and better able to withstand a difficult environment. It is rarely possible to make any direct causal or predictive link between the environment and the personal psychology of a given individual, because similar situations may affect different individuals in different ways. One person may react to a feeling of social exclusion, for example, by becoming hostile and blaming others, perhaps developing ideas of revenge, while another may interiorise their reaction and, tending to blame themselves, may develop some form of ego-dystonic depression, and may possibly turn to self-destructive activities.

Personal self-esteem is a fluid and subjective factor, and it is never possible to measure self-esteem objectively. It is often difficult to assess another person's level of self-esteem, or to place it on a scale between excessive self-esteem (hubris) and inadequate self-esteem (ego-dystonia). Nevertheless, individuals clearly do experience various and varying levels of self-esteem, and it seems reasonable to suggest that inappropriate levels of self-esteem may tend to result in inappropriate behaviour.

This analysis puts forward the hypothesis that cults can indirectly control behaviour, by inducing and then exploiting an inappropriately low level of personal self-esteem in the minds of their followers. Once some degree of ego-dystonic guilt has been established in a person's mind, then the behaviour of that person can to some extent be manipulated, by that person's peer group granting or withholding emotional approval and support.

Ego-dystonia is not to be confused with the medical condition dystonia, which is an illness characterised by involuntary spasms and muscle contractions that induce abnormal movements and postures. This is obviously different from the psychological condition of ego-dystonic sexual orientation. There are other instances where the same term is used to denote two quite different and distinct conditions; for example, the term 'hypertension' may be used either to describe a particular measurable level of high blood pressure, or it may be used to describe a psychological state which cannot be measured by material indicators.

22 'The sabbath was made for man, and not man for the sabbath.' - From the incident in the grainfields, Mark chapter 2, verse 27.
There may be equivalent 'believer-centred' statements in other religions. In Buddhism, for example, there is the parable of the raft (Digha-Nikaya, ii, 89; Majjhima-Nikaya, i, 134), in which the raft, representing the beliefs and practices of Buddhism, is to be cast aside once the further shore (enlightenment) is reached.

23 J. L. Segundo, Evolution and Guilt, New York: Orbis, 1974, p. 52
It should be pointed out that Segundo was not writing specifically about cults as such, but from the perspective of 'Liberation Theology', which is concerned with the hermeneutics of issues such as land ownership and education.

24 'Falsifiability' was first put forward as a criterion for testable scientific truth by Sir Karl Popper, Professor of Logic and Scientific Method at the London School of Economics from 1949 to 1969.

A statement of the form 'All crows are black' is a falsifiable statement, because one properly authenticated observation of a white (albino) crow is sufficient to show that the statement is false, despite any number of observations of black crows. In other words, the statement is capable of being disproved through empirical evidence.

Statements of the form 'Spiritual life begins when one realises that one is not as aware as one could be', or 'We always have to be aware that our. . .um . . . what we think, is not true, until enlightenment', are non-falsifiable statements. Any number of people within a group may observe (or say that they believe) that they have become more aware following the group's spiritual guidance. But a person who questions this, or who observes (or believes) that they themselves have not become more aware following that guidance, cannot establish this either as a valid observation or as a reasonable belief, because it can always be argued that this 'negative' observation results from that person's own deficiencies of spiritual awareness, and not from any deficiency in the group's spiritual guidance or in the truth of the group's doctrines.

25 Re: Critics disenfranchised and the non-democratic nature of cults:

'There's no democracy in the Western Buddhist Order! .... It's a hierarchy, but a spiritual one.... It is the broad feeling that there is in someone, or in certain people, something higher and better than yourself to which you can look up.... It's a good, positive thing to be able to look up to someone! If you can't, you're in a pretty difficult position. You're in a sad state.... like a child that hasn't even got a mother and father to look up to....But this sort of assertion, that you're just as good as anybody else in the egalitarian sense, is really sick.'
From 'The Endlessly Fascinating Cry', a seminar by FWBO leader Sangharakshita on the Bodhicaryavatara, transcribed and published by the FWBO, 1977, pp. 74-75.

'Spiritual hierarchy' appears to be a variety of the 'doctrine over person' and 'dispensing of existence' themes described by Robert J. Lifton in his eight criteria of thought reform.

26 The 'card sharp' simile for a cult is courtesy of Verdex, an ex-FWBO member from Germany, who maintains an Internet site.

Verdex also likens the FWBO to a black hole: the attraction increases as a person moves closer to the group, and there is an event horizon beyond which communication with the outside universe is lost.

27 Newsnight report on cults, BBC 2, 16th July 1993

28 From the essay 'Cults: Religious Totalism and Civil Liberties', included in the book 'The Future of Immortality and Other Essays for a Nuclear Age', by Robert J. Lifton, pub. New York, Basic Books, 1987.

29 Hassan, Steve, Releasing The Bonds: Empowering People To Think For Themselves, Somerville, MA: Freedom of Mind Press, 2000, Introduction, p. 334

30 For details of numerous allegations of abuse within cults, follow the relevant links (e.g. 'ex-member stories') on the following websites:

31 Some academic researchers attack the credibility and motives of ex-members. For example:

Dr J. Gordon Melton testified as an expert witness in a lawsuit, that:

'When you are investigating groups such as this [The Local Church], you never rely upon the unverified testimony of ex-members….To put it bluntly, hostile ex-members invariably shade the truth. They invariably blow out of proportion minor incidents and turn them into major incidents, and over a period of time their testimony almost always changes because each time they tell it they get the feedback of acceptance or rejection from those to whom they tell it, and hence it will be developed and merged into a different world view that they are adopting.'

From the expert testimony of Dr J. Gordon Melton in Lee vs. Duddy et al, a lawsuit involving the Local Church and the Spiritual Counterfeits Project.
Quoted from

And Bryan Wilson, Emeritus Professor at All Souls College, Oxford, wrote about apostates (ex-members who become openly critical of the group they were once a member of):

'Informants who are mere contacts and who have no personal motives for what they tell are to be preferred to those who, for their own purposes, seek to use the investigator. The disaffected and the apostate are in particular informants whose evidence has to be used with circumspection. The apostate is generally in need of self-justification. He seeks to reconstruct his own past, to excuse his former affiliations, and to blame those who were formerly his closest associates. Not uncommonly the apostate learns to rehearse an 'atrocity story' to explain how, by manipulation, trickery, coercion, or deceit, he was induced to join or to remain within an organization that he now forswears and condemns. Apostates, sensationalized by the press, have sometimes sought to make a profit from accounts of their experiences in stories sold to newspapers or produced as books (sometimes written by 'ghost' writers).'

Bryan Wilson, The Social Dimensions of Sectarianism, Oxford: Clarendon Press, 1990, p.19.

Further examples of these sorts of ad-hominem arguments may be found at the cult-apologists FAQ at: