Shame

The psychology of the reflexive emotions, and the differences between them.

Embarrassment, shame, and guilt are all reflexive emotions, that is, emotions about the self.

Although there is some overlap, embarrassment, shame, and guilt are distinct constructs.

Let’s look at them each in turn.

Embarrassment

Embarrassment is the feeling of discomfort when (1) some aspect of ourselves is, or threatens to be, revealed to others, and (2) we think that this revelation is likely to undermine the image that we seek to project to those others.

Potential sources of embarrassment vary according to circumstances and, in particular, to the company in which we find ourselves. They include particular thoughts, feelings, or dispositions; actions or behaviors, such as farting or swearing; conditions or states, such as a spot on the nose or smelly feet; possessions, such as our car or home; and relations, such as our oafish partner, criminal uncle, or lecherous aunt.

Sources of embarrassment need not be beneath our projected image, but merely out of keeping with it—which explains why it is possible, at times, to be embarrassed by our posh parents or rarefied education.

Shame

Whereas embarrassment is a response to something that threatens our projected image but is otherwise morally neutral, shame is a response to something that is morally reprehensible.

Shame is often accentuated if its object is exposed, but, unlike embarrassment, also attaches to a thought or action that remains undisclosed and undiscoverable to others. Embarrassment can sometimes be intense, but shame is a more substantial feeling in that it pertains to our moral character and not merely to our social character or image.

Shame arises from measuring our actions against moral standards and discovering that they fall short. If our actions fall short and we fail to notice, we can “be shamed” or made to notice—an extreme example being Cersei Lannister’s Walk of Shame in Game of Thrones. If having been made to notice, we do not much mind, we can be said to be shameless, or to “have no shame.”

In the Rhetoric, Aristotle, ever the fine psychologist, remarks that shame also arises from lacking in honorable things shared by others like us, especially if the lack is our own fault and therefore owes to our moral badness.

Finally, it is possible to feel shame vicariously, that is, to share in the shame of another person or feel shame on his or her behalf, especially if this person is closely allied or associated with us: for example, our partner, sibling, or child. Thus, even blameless people can experience shame, and so much is also true of embarrassment and other emotions. “Hell,” said Jean-Paul Sartre, “is other people.”

Try, right now, to act out the feeling of shame. The word “shame” derives from the Proto-Indo-European for “to cover,” and the feeling of shame is often expressed by a covering gesture over the brow and eyes, a downcast gaze, and a slack posture. Other manifestations of shame include a sense of warmth or heat and mental confusion or paralysis. These signs and symptoms can communicate remorse and contrition and, in so doing, inspire pity and pardon.

Even so, we may prefer to make a secret of our shame, for shame can itself be shameful—or, to be more precise, embarrassing.

People with low self-esteem, being harsher upon themselves, are more given to shame. In some cases, they may defend against shame with blame or contempt, often for the person or people who incited their shame. This is only likely to lead to deeper shame, and therefore to lower self-esteem, opening up a vicious cycle—which might be broken if, like certain politicians, they stop feeling shame at all.

While overwhelming shame can be destructive, mild to moderate shame is mostly a force for good, goading us to live more ethical lives.

In Dying for Ideas, the philosopher Costica Bradatan writes:

…the chief reason for studying philosophy is not a desire to know more about the world, but a profound sense of dissatisfaction with the state in which one finds oneself. One day you suddenly, painfully, realize that something important is missing in your life and that there is too large a gap between what you are and the sense of what you should be. And before you know it, this emptiness starts eating at you. You may not know yet what exactly it is that you want, but you know quite well what you do not want: remaining the person you currently are. You may be so ashamed that you don’t even dare to call that “existence”: you don’t exist yet properly. It must have been in this sense that Socrates used the term “midwifery” for what he was doing. By subjecting those around him to the rigors of philosophy, he was bringing them into proper existence. So closely related to self-detestation, it may be that philosophy begins not in wonder, but in shame.

Guilt

Whereas shame pertains to a person, guilt pertains to an action or actions, and is bound up with blame and remorse. Shame says, “I am bad.” Guilt says, “I did something bad.”

More subtly, shame involves falling short of cultural or societal moral standards, whereas guilt involves falling short of one’s own moral standards. Thus, it is entirely possible to feel guilty about actions of which many or most of our peers approve, such as wearing designer clothes, driving a gas-guzzling car, or eating red meat.

As I discuss in my new book, Heaven and Hell: The Psychology of the Emotions, shame and guilt often go hand in hand, which is why they are so often confused. For instance, when we injure someone, we often feel bad about having done so (guilt) and, at the same time, feel bad about ourselves (shame).

Yet guilt and shame are distinct emotions. Shame is “egodystonic,” that is, in conflict with our desired self-image, and high levels of shame are correlated with poor psychological functioning. In particular, eating disorders and many sexual disorders can be understood as disorders of shame, as can narcissism, which can be construed as a defense against shame.

Guilt, on the other hand, is “egosyntonic,” that is, consistent with our self-image, and—except in extreme cases, such as that of the regicidal Lady Macbeth—is either unrelated to, or inversely correlated with, poor psychological functioning.

Faced with the same set of circumstances, people with high self-esteem are more prone to guilt than to shame, and more likely to take corrective or redemptive action.

There is a fourth “negative” reflexive emotion, humiliation, which I will save for my next article.

Ineni, architect to Pharaoh Thutmose I (d. 1492 BCE), had his garden painted into his tomb, along with a list of all the trees within it—presumably, so that they might be accounted for in the afterlife.

Ineni’s garden included:

  • 170 date palm
  • 120 doum palm
  • 73 sycamore fig
  • 31 persea
  • 16 carob
  • 12 grape vine
  • 10 tamarisk
  • 8 willow
  • 5 fig
  • 5 pomegranate
  • 5 garland thorn
  • 2 moringa
  • 2 myrtle

A grand garden of this sort symbolized control and mastery over nature, a haven of peace and plenty, of order and beauty, by which to project the status, power, and temperament of its owner. Other, more famous, examples include the Hanging Gardens of Babylon, the Gardens of the Real Alcázar in Seville, and the Gardens of Versailles.

The garden could also have a religious or philosophical message or dimension. For example, the Old Testament’s four rivers of Eden are represented by four watercourses in Islamic paradise gardens, and four paths in Christian cloister gardens. The Zen garden, by hinting at hidden principles, serves as an aid to meditation about the true meaning of existence.

The Gardens at Versailles reflect a rationalist, Cartesian vision of God-given ideas and principles for the intellect to apprehend or recognize, whereas English landscape gardens are more in the empiricist mold, presenting nature as a stream of sensory experiences skirting across the blank slate of the mind.

In either case, the garden represents a taming of nature, from dark and deadly forest, or disease-infested swamp, to an extension of our living space: open and structured to still our minds, but retaining enough mystery to sustain our interest and even, perhaps, capture our imagination.

Individual plants too can have a meaning. English churchyards often feature yew trees, which are poisonous, dark, and evergreen, and symbolize both death and immortality. A yew tree is commonly found near the lychgate, where, prior to the advent of mortuaries, cadavers awaited burial under vigil.

In the ancient world, the palm tree symbolized victory, peace, and bounty, while the cedar of Lebanon symbolized pride, majesty, and dignity. Both also stood for righteousness, as in Psalm 92:12: “the righteous shall flourish like the palm tree, grow tall like the cedar of Lebanon.” Today, Cedrus libani is the national emblem of Lebanon, and a symbol of the peaceful Cedar Revolution of 2005.

Trees can also be planted to mark an important occasion, which is why British royals are often asked to brandish a shovel. In a recent annual tradition, the Friends of my local park purchase a noteworthy tree and invite a dignitary such as the Lord Mayor to plant it.

Today, gardening is more popular than ever. According to the National Gardening Survey 2018, more American households (77%) are gardening than ever before. In the U.K., 87% of homes have access to a garden, and 27 million people report a personal interest or active engagement in gardening, even if it is only on a balcony.

When sport displaces the BBC’s flagship Gardeners’ World, the replacement programming achieves only a third of presenter Monty Don’s usual viewing figures of almost three million—which, in the U.K., is many more people than go to church.

Community garden projects and ‘guerrilla gardening’ are on the rise, as are garden towns and villages. Two years ago, one of my neighbours organized for thousands of daffodils to be planted on a neglected and overlooked common, transforming it into a Wordsworthian idyll for the selfie generation. I wonder, do they know that ‘daffodil’ is Narcissus in Latin?

Gardening is more and more recognized, and even prescribed, for its health benefits. These include: increased muscle strength and cardiovascular fitness; improved sleep and diet (if you grow your own produce); reduced stress, anxiety, and depression; a greater sense of community and belonging; and better self-esteem.

You don’t even have to get your hands dirty: some of these benefits accrue simply from visiting a garden, or even just looking over one—although it probably helps to notice and mentally engage with the greenery.

Researchers in Korea randomly assigned hospital patients recovering from thyroidectomy to rooms with plants and flowers, and rooms without, and found that the test group fared significantly better, asking for less pain relief and requiring less time in hospital. So yes, it makes sense to bring flowers, and, at home, to have indoor plants.

Even street trees greatly benefit our health. An American study of the city of Toronto found that, for cardio-metabolic conditions (heart disease, stroke, diabetes, obesity…), an increase of just 11 trees per city block was comparable to an increase in annual personal income of $20,000.

How might gardening help with mental health? To various degrees, we live inside the stories we tell ourselves. But gardening drags us out of our tortured heads and back into the natural world, which blunts the ideological and emotional extremes to which detached, abstract thought is prone.

In 1920, a mentally strained Ludwig Wittgenstein took up the post of assistant gardener at Klosterneuburg Abbey, explaining in a letter to a friend that he had been longing for “some kind of regularized work which, of all the things I can do in my present condition, is the most nearly bearable…”

Voltaire’s Candide (1759) is an attack on the highly abstracted, convoluted, and strained philosophy of Gottfried Wilhelm von Leibniz, and famously concludes with the phrase “we must cultivate our garden.”

In the Philosophy of Existence (1938), Karl Jaspers describes this disinterested process of looking outside oneself—or phenomenology, as it is sometimes called—as “a thinking that, in knowing, reminds me, awakens me, brings me to myself, transforms me.”

Just picture the gardener’s pure and simple delight at the first crocuses or tulips, a bird’s nest, a swarm of bees…

If gardening makes us feel better, it also makes us into better people. It is a moral education, a school of life, instilling virtues such as pragmatism, patience, perseverance, reliability, and humility, which then transfer out into other spheres.

In his treatise on agriculture (c. 160 BCE), Cato is quite clear that farmers make the bravest and strongest soldiers, and that, of all men, they are “the most highly respected, most stable, and least hated.”

In Plato’s Phaedrus (c. 370 BCE), Socrates compares the wise man to the good husbandman, who is careful to scatter the right seed in the right soil in the right season. Similarly, the good teacher is careful to sow the right words in the right soul at the right time. For Plato, the Divine Teacher, teaching is the gardening of the soul.

Building on these notions, Epicurus (d. 270 BCE) sought in his garden just outside Athens to return to an agricultural golden age of harmony, community, and self-sufficiency—to ground a flourishing life in the midst of a flourishing garden.

A garden is a microcosm of the outside world. Gardeners are acutely aware of the rhythms and cycles of nature: which flowers are in their prime, when to plant out the seedlings, when and how and how much it last rained. Just as music is time made audible, so the garden is “time made visible” [Clive James].

Winter is difficult, yes, because it is dark and cold, but also because time is no longer structured by a succession of flowerings and fruitings. Time becomes amorphous, to be entertained and endured rather than savoured and celebrated like the season of magnolias, cherry blossom, rhubarb, plums, or chestnuts.

More than just keen observers of time, gardeners are real-life Time Lords, able to speed up time by working in the garden, and later to slow it right down by sitting back and surveying the fruits of their labour. Some gardeners are even able to step out of time altogether, working year round to create timeless moments of perfection.

But moments are all they will ever be. When you paint a picture or write a book, it is there for ever (and isn’t that just amazing?), but when you mow the lawn you have to do it all over again in just a few days’ time. The gardener is like Sisyphus, the mythological king made to repeat for all eternity the same meaningless task of pushing a boulder up a mountain, only to see it roll back down again.

In his essay of 1942, The Myth of Sisyphus, Albert Camus concludes: “The struggle to the top is itself enough to fill a man’s heart. One must imagine Sisyphus happy.”

Even in a state of utter hopelessness, Sisyphus can still be happy. Indeed, he is happy precisely because he is in a state of utter hopelessness, because in recognizing and accepting the hopelessness of his condition, he at the same time transcends it.

Loneliness is a complex and unpleasant emotional response to isolation or lack of companionship. The pain of loneliness is such that, throughout history, solitary confinement has been used as a form of torture and punishment. More than just painful, loneliness is also damaging. Lonely people eat and drink more, and exercise and sleep less.

Loneliness is a particular problem of modernity. One U.S. study found that between 1985 and 2004, the proportion of people reporting having no one to confide in almost tripled. According to a poll carried out in 2017 for the Jo Cox Commission on Loneliness, three-quarters of older people in the U.K. are lonely. Shockingly, two-fifths of respondents agreed with the statement, “sometimes an entire day goes by and I haven’t spoken to anybody”.

Some of the factors behind these stark statistics include: smaller household sizes, greater migration, higher media consumption, and longer life expectancy. Large conurbations built on productivity and consumption at the expense of connection and contemplation can feel profoundly alienating. The Internet has become the great comforter and seems to offer it all: news, knowledge, music, entertainment, shopping, relationships, and even sex. But over time, it foments envy and division, confuses our needs and priorities, desensitizes us to violence and suffering, and, by creating a false sense of connectedness, entrenches superficial relationships at the cost of living ones.

Man has evolved over millennia into one of the most social and interconnected of all animals. Suddenly, he finds himself apart and alone, not on a mountaintop, in a desert, or on a raft at sea, but in a city of millions, in reach but out of touch. For the first time in human history, he has no practical need, and therefore no pretext, to interact and form attachments with his fellow men and women.

But, against nature, there are a few people who actively choose to remove themselves from the rest of society, or, at least, not to actively seek out social interaction. Such “loners” (the very term is pejorative, implying, as it does, abnormality and deviousness) may revel in their rich inner life or simply dislike or distrust the company of others, which, they feel, comes with more costs than benefits.

Timon of Athens, who lived at around the same time as Plato, began life in wealth, lavishing money upon his flattering friends, and, in accordance with his conception of friendship, never expecting anything in return. When he ran out of coin, all his friends deserted him, reducing him to the hard toil of labouring in the fields. One day, as Timon tilled the earth, he uncovered a pot of gold, and, suddenly, all his old friends came piling back. But rather than welcome them with open arms, he cursed them and drove them away with sticks and clods of earth. Timon declared his hatred of humankind and withdrew into the forest, where, much to his annoyance, people began to seek him out as some sort of holy man.

Did Timon feel lonely in the forest? Probably not, because he did not believe he lacked for anything. As he no longer valued his friends or their companionship, he could not have desired or missed them—even though he may have pined for a better class of man, and, in that limited sense, felt lonely.

Loneliness is not so much an objective state of affairs as a subjective state of mind, a function of desired and achieved levels of social interaction and also of type or types of interaction. Lovers often feel lonely in the sole absence of their beloved, even when completely surrounded by friends and family. Jilted lovers feel much lonelier than lovers who are merely apart from their beloved, indicating that loneliness is not merely a matter of the amount or degree of interaction, but also of the potential or possibility for interaction. Conversely, it is common to feel lonely within a marriage because the relationship is no longer validating or nurturing us, but diminishing us and holding us back.

And yet for many people marriage is, among other things, an attempt to flee from their lifelong loneliness and escape from their inescapable demons. At bottom, loneliness is not the experience of lacking but the experience of living. It is part and parcel of the human condition. Unless a person comes to terms with this, it can only be a matter of time before the feeling of loneliness resurfaces, often with a vengeance. On this account, loneliness is the manifestation of the conflict between our desire for meaning and the absence of objective meaning from the universe, an absence that is all the more glaring in modern societies, which have sacrificed traditional and religious structures of meaning on the thin altar of truth.

So much explains why people with a strong sense of purpose and meaning, or simply with a strong narrative, such as Nelson Mandela or St Anthony of the Desert, are protected from loneliness regardless of the circumstances in which they find themselves. St Anthony sought out loneliness precisely because he understood that it could bring him closer to the real questions and value of life. He spent fifteen years in a tomb and twenty years in an abandoned fort in the desert of Egypt before his devotees persuaded him to come out of his seclusion to instruct and organize them, whence his epithet, “Father of All Monks”. Anthony emerged from the fort not ill and emaciated, as everyone had been expecting, but healthy and radiant, and expired in his hundred and sixth year, which in the fourth century must in itself have counted as a minor miracle.

St Anthony did not lead a life of loneliness, but one of solitude. Loneliness, the pain of being alone, is damaging; solitude, the joy of being alone, is empowering, liberating. Our unconscious requires solitude to process and unravel problems, so much so that our body imposes it upon us each night in the form of sleep. By removing us from the constraints, distractions, and influences imposed upon us by others, solitude frees us to reconnect with ourselves, assimilate ideas, and generate identity and meaning.

For Nietzsche, men without the aptitude or opportunity for solitude are mere slaves because they have no alternative but to parrot culture and society. In contrast, anyone who has unmasked society naturally seeks out solitude, which becomes the source and guarantor of a more authentic set of values and ambitions:

I go into solitude so as not to drink out of everybody’s cistern. When I am among the many I live as the many do, and I do not think as I really think. After a time it always seems as if they want to banish my self from myself and rob me of my soul.

Solitude removes us from the mindless humdrum of everyday life into a higher consciousness which reconnects us with our deepest humanity, and also with the natural world, which quickens into our muse and companion. By setting aside dependent emotions and constricting compromises, we free ourselves up for problem solving, creativity, and spirituality. If we can embrace it, this opportunity to adjust and refine our perspectives creates the strength and security for still greater solitude and, in time, the substance and meaning that guards against loneliness.

The life of St Anthony can leave the impression that solitude is at odds with attachment, but this need not be the case so long as the one is not pitted against the other. True lovers, says Rainer Maria Rilke, should not only tolerate but “stand guard over” the solitude of the other.

In Solitude: A Return to the Self, the psychiatrist Anthony Storr convincingly argues that:

The happiest lives are probably those in which neither interpersonal relationships nor impersonal interests are idealized as the only way to salvation. The desire and pursuit of the whole must comprehend both aspects of human nature.

Be this as it may, not everyone is capable of solitude, and for many people aloneness will never amount to anything more than bitter loneliness. Younger people often find aloneness difficult, while older people are more likely, or less unlikely, to seek it out.

So much suggests that solitude, the joy of being alone, stems from, as well as promotes, a state of maturity and inner richness.

We are being lazy if there’s something that we ought to do but are reluctant to do because of the effort involved. We do it badly, or do something less strenuous or less boring, or just remain idle. In other words, we are being lazy if our motivation to spare ourselves effort trumps our motivation to do the right or best or expected thing—assuming, of course, we know what that is.

In the Christian tradition, laziness, or sloth, is one of the seven deadly sins because it undermines society and God’s plan, and invites the other sins. The Bible inveighs against slothfulness, for example, in Ecclesiastes:

By much slothfulness the building decayeth; and through idleness of the hands the house droppeth through. A feast is made for laughter, and wine maketh merry: but money answereth all things.

Today, laziness is so closely connected with poverty and failure that a poor person is often presumed lazy, no matter how hard he or she actually works.

But it could be that laziness is written into our genes. Our nomadic ancestors had to conserve energy to compete for scarce resources, flee predators and fight enemies. Expending effort on anything other than short-term advantage could jeopardise their very survival. In any case, in the absence of conveniences such as antibiotics, banks, roads or refrigeration, it made little sense to think long-term. Today, mere survival has fallen off the agenda, and it is long-term vision and commitment that lead to the best outcomes. Yet our instinct remains to conserve energy, making us averse to abstract projects with distant and uncertain payoffs.

Even so, few people would choose to be lazy. Many so-called ‘lazy’ people haven’t yet found what they want to do, or, for one reason or another, are not able to do it. To make matters worse, the job that pays their bills and fills their best hours might have become so abstract and specialised that they can no longer fully grasp its purpose or product, and, by extension, their part in improving other people’s lives. Unlike a doctor or builder, an assistant deputy financial controller in a large multinational corporation cannot be at all certain of the effect or end-product of his or her labour—so why bother?

Other psychological factors that can lead to ‘laziness’ are fear and hopelessness. Some people fear success, or don’t have enough self-esteem to feel comfortable with success, and laziness is their way of sabotaging themselves. William Shakespeare conveyed this idea much more eloquently and succinctly in Antony and Cleopatra: ‘Fortune knows we scorn her most when most she offers blows.’ Other people fear not success but failure, and laziness is preferable to failure because it is at one remove. ‘It’s not that I failed,’ they can tell themselves, ‘it’s that I never tried.’

Some people are ‘lazy’ because they understand their situation as being so hopeless that they cannot even begin to think it through, let alone do something about it. As these people are unable to address their circumstances, it could be argued that they are not truly lazy—which, at least to some extent, can be said of all ‘lazy’ people. The very concept of laziness presupposes the ability to choose not to be lazy, that is, presupposes the existence of free will.

In a few cases, ‘laziness’ is the very opposite of what it appears. We often confuse laziness with idleness, but idleness—which is to be doing nothing—need not amount to laziness. In particular, we might choose to remain idle because we value idleness and its products above whatever else we might be doing. Lord Melbourne, Queen Victoria’s favourite prime minister, extolled the virtues of ‘masterful inactivity’. More recently, Jack Welch, as chairman and CEO of General Electric, spent an hour each day in what he called ‘looking out of the window time’. And the German chemist August Kekulé claimed to have discovered the ring structure of the benzene molecule, in 1865, while daydreaming about a snake biting its own tail. Adepts of this kind of strategic idleness use their ‘idle’ moments, among other things, to observe life, gather inspiration, maintain perspective, sidestep nonsense and pettiness, reduce inefficiency and half-living, and conserve health and stamina for truly important tasks and problems. Idleness can amount to laziness, but it can also be the most intelligent way of working. Time is a very strange thing, and not at all linear: sometimes, the best way of using it is to waste it.

Idleness is often romanticised, as epitomised by the Italian expression dolce far niente (‘the sweetness of doing nothing’). We tell ourselves that we work hard from a desire for idleness. But in fact, we find even short periods of idleness hard to bear. Research suggests that we make up justifications for keeping busy and feel happier for it, even when busyness is imposed upon us. Faced with a traffic jam, we prefer to make a detour even if the alternative route is likely to take longer than sitting through the traffic.

There’s a contradiction here. We are predisposed to laziness and dream of being idle; at the same time, we always want to be doing something, always need to be distracted. How are we to resolve this paradox? Perhaps what we really want is the right kind of work, and the right balance. In an ideal world, we would do our own work on our own terms, not somebody else’s work on somebody else’s terms. We would work not because we needed to, but because we wanted to, not for money or status, but (at the risk of sounding trite) for peace, justice and love.

On the other side of the equation, it’s all too easy to take idleness for granted. Society spends years and years preparing us to be useful as it sees it, but gives us absolutely no training in, and little opportunity for, idleness. Yet strategic idleness is a high art and hard to pull off—not least because we are programmed to panic the moment we step out of the rat race.

There is a very fine divide between idleness and boredom. In the 19th century, Arthur Schopenhauer argued that, if life were intrinsically meaningful or fulfilling, there could be no such thing as boredom. Boredom, then, is evidence of the meaninglessness of life, opening the shutters on some very uncomfortable thoughts and feelings that we normally block out with a flurry of activity or with the opposite thoughts and feelings—or indeed, any feelings at all.

In Albert Camus’s novel The Fall (1956), Clamence reflects to a stranger:

I knew a man who gave 20 years of his life to a scatterbrained woman, sacrificing everything to her, his friendships, his work, the very respectability of his life, and who one evening recognised that he had never loved her. He had been bored, that’s all, bored like most people. Hence he had made himself out of whole cloth a life full of complications and drama. Something must happen – and that explains most human commitments. Something must happen, even loveless slavery, even war or death.

In the essay The Critic as Artist (1891), Oscar Wilde wrote that ‘to do nothing at all is the most difficult thing in the world, the most difficult and the most intellectual’.

The world would be a much better place if we could all spend a year looking out of our window.

In Plato’s Cratylus, on the philosophy of language, Socrates says that the Greek word for truth, aletheia, is a compression of the phrase “a wandering that is divine”.

Since Plato, many thinkers have spoken of truth and God in the same breath, and truth has also been linked with concepts such as justice, power, and freedom. According to John the Apostle, Jesus said to the Jews: “And ye shall know the truth, and the truth shall make you free.”

Today, the belief in God may be dying, but what about truth? Rudy Giuliani, Donald Trump’s personal lawyer, claimed that “truth isn’t truth”, while Kellyanne Conway, Trump’s counsellor, presented the public with what she called “alternative facts”.

Over in the U.K., in the run-up to the Brexit referendum, Michael Gove, then Justice Secretary and Lord Chancellor, opined that people “have had enough of experts”. Later, accused by the father of a sick child of visiting a hospital for a press opportunity, Prime Minister Boris Johnson replied, “There’s no press here”—while being filmed by a BBC camera crew.

The anatomy of a lie

What constitutes a lie? A lie is not simply an untruth. For centuries, people taught their children that the earth was at the centre of the universe. This was not a lie, insofar as they believed it to be true.

For something to be a lie, the person putting it out has to believe that it is false, even if, by chance, it happens to be true. If I tell you, “I’m not actually my father’s biological son”, believing that I am, and it so happens that I am not, I am still telling a lie.

Of course, it could be that I am being sarcastic, or joking—and, if I have made this sufficiently clear, I could not be counted as lying. For my statement to be a lie, it is not enough that I believe it to be false. I must also intend you to believe that it is true, that is, I must also intend to deceive you. If my intention in deceiving you is a good one, I am telling a white lie; if it is a bad one, I am telling a black lie; and if it is a bit of both, a blue lie.

When Olympias told her son Alexander the Great that his father was not Philip of Macedon but Zeus himself, she would only have been lying if (1) she believed this to be false, and (2) she intended to deceive Alexander. Olympias—who, according to Plutarch, slept with snakes in her bed—probably did believe it to be true, which highlights an important problem with lying, namely, that people can believe the most fantastical things.

A special case is when someone tells the naked truth, intending others to interpret it as a lie or joke. In Game of Thrones, after killing the Freys, Arya Stark runs into some Lannister soldiers, who share with her their meal of roast rabbit and blackberry wine. When one of the soldiers (not the one played by Ed Sheeran) asks, “So why is a nice girl on her own going to King’s Landing?” Arya replies, point blank, “I’m going to kill the queen.” After an awkward silence, everyone including Arya bursts out laughing.

If I am late to a dinner party, I can tell a small lie about some heavy traffic, or I can tell a bolder lie about being pushed into a muddy ditch by a chihuahua and having to go home to get changed. The more unusual and imaginative (and embarrassing) the lie, the more it is likely to be believed.

Or I could try instead to hide the lie. For example, I might lie by omission or “mental reservation”: “Sorry, I had a flat tyre” (last month). Or I might lie by equivocation (playing on words), as Bill Clinton famously did when he stated, “I did not have sexual relations with that woman, Monica Lewinsky.”

A special kind of lie is the bluff, which involves pretending to have an asset or intention or position that one does not actually have. An infamous example of a bluff is former prime minister Theresa May’s Brexit mantra that “no deal is better than a bad deal”.

Lies versus bullsh*t

Is there any difference between telling lies and talking bullsh*t? According to the philosopher Harry Frankfurt, lies differ from bullsh*t in that liars must track the truth in order to conceal it, whereas bullsh*tters have no regard or sensitivity for the truth or even for what their intended audience believes, so long as this audience is convinced or carried by their rhetoric.

Bullsh*tters will say whatever it takes, from moment to moment, to limp on to the next moment.

For Frankfurt:

“Someone who lies and someone who tells the truth are playing on opposite sides, so to speak, in the same game. Each responds to the facts as he understands them, although the response of the one is guided by the authority of the truth, while the response of the other defies that authority and refuses to meet its demands. The bullsh*tter ignores these demands altogether. He does not reject the authority of the truth, as the liar does, and oppose himself to it. He pays no attention to it at all. By virtue of this, bullsh*t is a greater enemy of the truth than lies are.”

Pathological lying

Pathological lying—also sometimes called compulsive lying or mythomania—is a controversial construct, and not tightly defined. It refers to habitual lying, typically for no discernible external gain. It is often, although not always, a feature of the four Cluster B personality disorders, namely, borderline personality disorder, histrionic personality disorder, narcissistic personality disorder, and antisocial personality disorder, and it features in Factor One of the Psychopathy Checklist.

As with pathological lying, most lying is actually carried out for internal, or emotional, gain, to attract attention or sympathy, or to alleviate feelings of abandonment, rejection, or worthlessness.

We often lie to ourselves and to others from a position of vulnerability: we lie not out of strength or smartness, but out of need and necessity.

The philosophy of lying

St Augustine’s treatise on lying begins with, “There is a great question about lying…” [Magna quæstio est de mendacio…].

It may be permissible to lie when the positive consequences clearly outweigh any negative consequences. Thus, it may be reasonable to lie in a life and death situation, for instance, to save someone from being discovered by a murderer. And it may be reasonable to lie if the person being lied to has forfeited their right to the truth, for example, by threatening violence.

But such situations are few and far between. Much more common are the small white lies that lubricate our social interactions, such as greeting acquaintances with “good to see you” and starting a letter to a stranger or antagonist with “Dear”.

Outside vis major and social convention, it is usually a bad idea to lie. In the fifth century BCE, Herodotus wrote that, from their fifth year to their twentieth, the Persians were instructed in three things: “to ride a horse, to draw a bow, and to speak the Truth.”

“The most disgraceful thing in the world [the Persians] think is to tell a lie; the next worst, to owe a debt: because, among other reasons, the debtor is obliged to tell lies.”

The first person to suffer from a lie is none other than the liar. Lying feels bad and damages pride and self-esteem. It is a slippery slope that leads to further and greater lies and other ethical violations. Once a lie has been told, it can take a lot of thought and exertion and sacrifice to avoid being found out. If found out (or even merely suspected), the liar loses authority and credibility, undermines their reputation and relationships, and may suffer further sanctions, including being lied to in return. Last but not least, by keeping critical issues under the radar, lying prevents them from being addressed and dealt with.

And then there is the harm to others. To lie is to treat people as means-to-an-end rather than as ends-in-themselves, which is why being lied to is experienced as disrespectful and demeaning. It also leads people to act on false information, which can have untold and unforeseen consequences.

When faced with a choice between a life of limitless pleasure as a detached brain in a vat, and a genuine, human life along with all its struggle and suffering, most people opt for the latter. This suggests that we value truth for its own sake, as well as for its utility. To deny us a part of reality is therefore to impoverish our life.

There is a line of reasoning that, since the natural end of speech is to communicate the thoughts of the speaker, lying is a perversion of language. Curiously, language runs into serious metaphysical difficulties as soon as a lie is introduced. Consider the sentence: “This statement is false.” If the statement is true, then it is false; but if it is false, then it is true.

The strongest argument against lying is perhaps that it could not be made into a universal principle. If everyone started lying here and there and everywhere, everything would quickly come apart. For just this reason, Plato bans even poetry from his ideal state, reasoning that poetry is “thrice removed from the truth”.

It’s worrying that we as a society are increasingly tolerant of lies. When people take to lying, they have to tell more and more lies to shore up their earlier lies. This tangled web we weave undermines trust, to the point that we no longer believe anything, least of all the truth.

In the fifth century BCE, the Persian King of Kings Darius the Great had the following advice engraved for his successor Xerxes:

“Thou who shalt be king hereafter, protect yourself vigorously from the Lie; the man who shall be a lie-follower, him do thou punish well, if thus thou shall think. May my country be secure!”

If Darius knew it then, why do we not know it now?