Is the medicalization of human suffering doing more harm than good?

‘Mental disorder’ is difficult to define.

Generally speaking, mental disorders are conditions that involve either loss of contact with reality or distress and impairment. These experiences lie on a continuum of normal human experience, and so it is impossible to define the precise point at which they become pathological.

What’s more, concepts such as borderline personality disorder, schizophrenia, and depression listed in classifications of mental disorders may not map onto any real or distinct disease entities. Even if they do, the symptoms and clinical manifestations that define them are open to subjective judgement and interpretation.

In an attempt to address these problems, classifications of mental disorders such as DSM-5 and ICD-10 adopt a ‘menu of symptoms’ approach, and rigidly define each symptom in technical terms that are often far removed from a person’s felt experience. This encourages mental health professionals to focus too narrowly on validating and treating an abstract diagnosis, and not enough on the person’s distress, its context, and its significance or meaning.

Despite using complex aetiological models, mental health professionals tend to overlook that a person’s felt experience often has a meaning in and of itself, even if it is broad, complex, or hard to fathom. By being helped to discover this meaning, the person may be able to identify and address the source of his distress, and so to make a faster, more complete, and more durable recovery. Beyond even this, he may gain important insights into himself, and a more refined and nuanced perspective on his life and life in general. These are rare and precious opportunities, and not to be squandered.

A more fundamental problem with labelling human distress and deviance as mental disorder is that it reduces a complex, important, and distinct part of human life to nothing more than a biological illness or defect, not to be processed or understood, or in some cases even embraced, but to be ‘treated’ and ‘cured’ by any means possible—often with drugs that may be doing much more harm than good. This biological reductiveness, along with the stigma that it attracts, shapes the person’s interpretation and experience of his distress or deviance, and, ultimately, his relation to himself, to others, and to the world.

Moreover, to call out every difference and deviance as mental disorder is also to circumscribe normality and define sanity, not as tranquillity or possibility, which are the products of the wisdom that is being denied, but as conformity, placidity, and a kind of mediocrity.

The evolution of the status of homosexuality in the classifications of mental disorders highlights that concepts of mental disorder can be little more than social constructs that change as society changes. PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, for being in the DSM and ICD, they are usually seen, and largely legitimized, as biological and therefore universal expressions of human distress.

Another pressing problem with the prevalent medical model is that it encourages false epidemics, most glaringly in depression, bipolar disorder, and ADHD. Data from the US National Health Interview Survey indicate that, in 2012, 13.5% (about 1 in 7) of boys aged 3–17 had been diagnosed with ADHD, up from 8.3% in 1997. The medical model also encourages the wholesale exportation of Western mental disorders and Western accounts of mental disorder. Taken together, this is leading to a pandemic of Western disease categories and treatments, while undermining the variety and richness of the human experience.

For example, in her recent book, Depression in Japan, anthropologist Junko Kitanaka writes that, until relatively recently, depression (utsubyō) had remained largely unknown to the lay population of Japan. Between 1999 and 2008, the number of people diagnosed with depression more than doubled as psychiatrists and pharmaceutical companies urged people to re-interpret their distress in terms of depression. Depression, says Kitanaka, is now one of the most frequently cited reasons for taking sick leave, and has been ‘transformed from a rare disease to one of the most talked about illnesses in recent Japanese history’.

Many critics question the scientific evidence underpinning such a robust biological paradigm and call for a radical rethink of mental disorders, not as detached disease processes that can be cut up into diagnostic labels, but as subjective and meaningful experiences grounded in personal and larger sociocultural narratives.

Unlike ‘mere’ medical or physical disorders, mental disorders are not just problems. If successfully navigated, they can also present opportunities. Simply acknowledging this can empower people to heal themselves and, much more than that, to grow from their experiences.

Generally speaking, culture-specific, or culture-bound, syndromes are mental disturbances that only find expression in certain cultures or ethnic groups, and that are not comfortably accommodated by Western psychiatric classifications such as the DSM and ICD. DSM-IV defined them as ‘recurrent, locality-specific patterns of aberrant behavior and troubling experience…’

One example of a culture-bound syndrome is dhat, which is seen in men from South Asia, and involves sudden anxiety about loss of semen in the urine, whitish discoloration of the urine, and sexual dysfunction, combined with feelings of weakness and exhaustion. The syndrome may originate in the Hindu belief that it takes forty drops of blood to create a drop of bone marrow, and forty drops of bone marrow to create a drop of semen, and thus that semen is a concentrated essence of life.

DSM-5, published in 2013, replaces the notion of culture-bound syndromes with three ‘cultural concepts of distress’: cultural syndromes, cultural idioms of distress, and cultural explanations for distress. Rather than merely listing specific cultural syndromes, DSM-5 adopts a broader approach to cultural issues, and acknowledges that all mental disorders, including DSM disorders, can be culturally shaped.

However, some DSM disorders are, it seems, much more culturally shaped than others. For instance, PTSD, anorexia nervosa, bulimia nervosa, depression, and deliberate self-harm (non-suicidal self-injury) can all be understood as cultural syndromes. Yet, for being in the DSM, they are usually seen, and largely legitimized, as biological and therefore universal expressions of human distress.

Thus, one criticism of classifications of mental disorders such as DSM and ICD is that, arm in arm with pharmaceutical companies, they encourage the wholesale exportation of Western mental disorders, and, more than that, the wholesale exportation of Western accounts of mental disorder, Western approaches to mental disorder, and, ultimately, Western values such as biologism, individualism, and the medicalization of distress and deviance.

In Crazy Like Us: The Globalization of the American Psyche, journalist Ethan Watters shows how psychiatric imperialism is leading to a pandemic of Western disease categories and treatments. Watters argues that changing a culture’s ideas about mental disorder actually changes that culture’s disorders, and depletes the store of local beliefs and customs which, in many cases, provided better answers to people’s problems than antidepressants and anti-psychotics. For Watters, the most devastating consequence of our impact on other cultures is not our golden arches, but the bulldozing of the human psyche itself.

He writes:

Looking at ourselves through the eyes of those living in places where human tragedy is still embedded in complex religious and cultural narratives, we get a glimpse of our modern selves as a deeply insecure and fearful people. We are investing our great wealth in researching and treating this disorder because we have rather suddenly lost other belief systems that once gave meaning and context to our suffering.

Distressed people are subconsciously driven to externalize their suffering, partly to make it more manageable, and partly so that it can be recognized and legitimized. According to medical historian Edward Shorter, our culture’s beliefs and narratives about illness provide us with a limited number of templates or models of illness by which to externalize our distress. If authorities such as psychiatrists and celebrities appear to endorse or condone a new template such as ADHD or deliberate self-harm, the template enters into our culture’s ‘symptom pool’ and the condition starts to spread. At the same time, tired templates seep out of the symptom pool, which may explain why conditions such as ‘hysteria’ and catatonic schizophrenia (schizophrenia dominated by extreme agitation or immobility and odd mannerisms and posturing) have become so rare.

The incidence of bulimia nervosa rose in 1992, the year in which journalist Andrew Morton exposed Princess Diana’s ‘secret disease’, and peaked in 1995, when she revealed her eating disorder to the public. It began to decline in 1997, the year of her tragic death. This chronology suggests that Princess Diana’s status and glamour, combined with intense press coverage of her bulimia and of bulimia in general, led to an increase in the incidence of the disorder.

An alternative explanation is that Princess Diana’s example encouraged people to come forward and admit to their eating disorder. By the same token, it could have been that the Japanese had always suffered from depression, but had been hiding it, or had not had a template by which to recognize or externalize it. The danger for us psychiatrists and health professionals when treating people with mental disorder is to treat the template without addressing or even acknowledging the very real distress that lies beneath.

Adapted from the new edition of The Meaning of Madness.

In his 1943 paper, ‘A Theory of Human Motivation’, psychologist Abraham Maslow proposed that healthy human beings have a certain number of needs, and that these needs are arranged in a hierarchy, with some needs (such as physiological and safety needs) being more primitive or basic than others (such as social and ego needs). Maslow’s so-called ‘hierarchy of needs’ is often presented as a five-level pyramid, with higher needs coming into focus only once lower, more basic needs have been met.

Maslow’s Hierarchy of Needs

Maslow called the bottom four levels of the pyramid ‘deficiency needs’ because we do not feel anything if they are met, but become anxious or distressed if they are not. Thus, physiological needs such as eating, drinking, and sleeping are deficiency needs, as are safety needs, social needs such as friendship and sexual intimacy, and ego needs such as self-esteem and recognition. On the other hand, he called the fifth, top level of the pyramid a ‘growth need’ because our need to self-actualize enables us to fulfill our true and highest potential as human beings.

Once we have met our deficiency needs, the focus of our anxiety shifts to self-actualization, and we begin, even if only at a sub- or semi-conscious level, to contemplate our bigger picture. However, only a small minority of people is able to self-actualize, because self-actualization requires uncommon qualities such as honesty, independence, awareness, objectivity, creativity, and originality.
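The stepwise logic described above, whereby attention shifts to a higher level only once the levels below it have been met, can be sketched as a toy model. The level names and the idea of a simple ‘met/unmet’ flag are illustrative simplifications for this sketch, not part of Maslow’s own formulation:

```python
# Toy model of Maslow's five-level hierarchy: the current focus of
# motivation is the lowest level whose needs are not yet met.
LEVELS = ["physiological", "safety", "social", "ego", "self-actualization"]

def current_focus(met: set) -> str:
    """Return the lowest unmet level. The top level is a 'growth need',
    so it remains in focus even once the four 'deficiency' levels are met."""
    for level in LEVELS[:-1]:  # the four deficiency levels, bottom-up
        if level not in met:
            return level
    return LEVELS[-1]

print(current_focus(set()))                        # physiological
print(current_focus({"physiological", "safety"}))  # social
print(current_focus(set(LEVELS[:-1])))             # self-actualization
```

The model also captures Maslow’s asymmetry: meeting a deficiency need simply moves the focus upward, whereas the growth need at the top is never ‘completed’.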

Maslow’s hierarchy of needs has been criticized for being overly schematic and lacking in scientific grounding, but it presents an intuitive and potentially useful theory of human motivation. After all, there is surely some truth in the popular saying that one cannot philosophize on an empty stomach, or in Aristotle’s observation that ‘all paid work absorbs and degrades the mind’.

Many people who have met all their deficiency needs do not self-actualize, instead inventing more deficiency needs for themselves, because to contemplate the meaning of their life and of life in general would lead them to entertain the possibility of their meaninglessness and the prospect of their own death and annihilation.

A person who begins to contemplate his bigger picture may come to fear that life is meaningless and death inevitable, but at the same time cling on to the cherished belief that his life is eternal or important or at least significant. This gives rise to an inner conflict that is sometimes referred to as ‘existential anxiety’ or, more colourfully, ‘the trauma of non-being’.

While fear and anxiety and their pathological forms (such as agoraphobia, panic disorder, or PTSD) are grounded in threats to life, existential anxiety is rooted in the brevity and apparent meaninglessness or absurdity of life. Existential anxiety is so disturbing and unsettling that most people avoid it at all costs, constructing a false reality out of goals, ambitions, habits, customs, values, culture, and religion so as to deceive themselves that their lives are special and meaningful and that death is distant or delusory.

However, such self-deception comes at a heavy price. According to Jean-Paul Sartre, people who refuse to face up to ‘non-being’ are acting in ‘bad faith’, and living out a life that is inauthentic and unfulfilling. Facing up to non-being can bring insecurity, loneliness, responsibility, and consequently anxiety, but it can also bring a sense of calm, freedom, and even nobility. Far from being pathological, existential anxiety is a sign of health, strength, and courage, and a harbinger of bigger and better things to come.

For theologian Paul Tillich (1886-1965), refusing to face up to non-being leads not only to a life that is inauthentic but also to pathological (or neurotic) anxiety.

In The Courage to Be, Tillich asserts:

He who does not succeed in taking his anxiety courageously upon himself can succeed in avoiding the extreme situation of despair by escaping into neurosis. He still affirms himself but on a limited scale. Neurosis is the way of avoiding nonbeing by avoiding being.

According to this outlook, pathological anxiety, though seemingly grounded in threats to life, in fact arises from repressed existential anxiety, which itself arises from our uniquely human capacity for self-consciousness.

Facing up to non-being enables us to put our life into perspective, see it in its entirety, and thereby lend it a sense of direction and unity. If the ultimate source of anxiety is fear of the future, the future ends in death; and if the ultimate source of anxiety is uncertainty, death is the only certainty. It is only by facing up to death, accepting its inevitability, and integrating it into life that we can escape from the pettiness and paralysis of anxiety, and, in so doing, free ourselves to make the most out of our lives and out of ourselves.

The Death of Socrates, by Jacques-Louis David (detail).
Some philosophers have gone even further by asserting that the very purpose of life is none other than to prepare for death. In Plato’s Phaedo, Socrates, who is not long to die, tells the philosophers Simmias and Cebes that absolute justice, absolute beauty, or absolute good cannot be apprehended with the eyes or any other bodily organ, but only by the mind or soul. Therefore, the philosopher seeks in as far as possible to separate body from soul and become pure soul. As death is the complete separation of body and soul, the philosopher aims at death, and indeed can be said to be almost dead.

Adapted from the new edition of The Meaning of Madness.