We are being lazy if there’s something that we ought to do but are reluctant to do because of the effort involved. We do it badly, or do something less strenuous or less boring, or just remain idle. In other words, we are being lazy if our motivation to spare ourselves effort trumps our motivation to do the right or best or expected thing—assuming, of course, we know what that is.

In the Christian tradition, laziness, or sloth, is one of the seven deadly sins because it undermines society and God’s plan, and invites the other sins. The Bible inveighs against slothfulness, for example, in Ecclesiastes:

By much slothfulness the building decayeth; and through idleness of the hands the house droppeth through. A feast is made for laughter, and wine maketh merry: but money answereth all things.

Today, laziness is so closely connected with poverty and failure that a poor person is often presumed lazy, no matter how hard he or she actually works.

But it could be that laziness is written into our genes. Our nomadic ancestors had to conserve energy to compete for scarce resources, flee predators and fight enemies. Expending effort on anything other than short-term advantage could jeopardise their very survival. In any case, in the absence of conveniences such as antibiotics, banks, roads or refrigeration, it made little sense to think long-term. Today, mere survival has fallen off the agenda, and it is long-term vision and commitment that lead to the best outcomes. Yet our instinct remains to conserve energy, making us averse to abstract projects with distant and uncertain payoffs.

Even so, few people would choose to be lazy. Many so-called ‘lazy’ people haven’t yet found what they want to do, or, for one reason or another, are not able to do it. To make matters worse, the job that pays their bills and fills their best hours might have become so abstract and specialised that they can no longer fully grasp its purpose or product, and, by extension, their part in improving other people’s lives. Unlike a doctor or builder, an assistant deputy financial controller in a large multinational corporation cannot be at all certain of the effect or end-product of his or her labour—so why bother?

Other psychological factors that can lead to ‘laziness’ are fear and hopelessness. Some people fear success, or don’t have enough self-esteem to feel comfortable with success, and laziness is their way of sabotaging themselves. William Shakespeare conveyed this idea much more eloquently and succinctly in Antony and Cleopatra: ‘Fortune knows we scorn her most when most she offers blows.’ Other people fear not success but failure, and laziness is preferable to failure because it is at one remove. ‘It’s not that I failed,’ they can tell themselves, ‘it’s that I never tried.’

Some people are ‘lazy’ because they understand their situation as being so hopeless that they cannot even begin to think it through, let alone do something about it. As these people are unable to address their circumstances, it could be argued that they are not truly lazy—which, at least to some extent, can be said of all ‘lazy’ people. The very concept of laziness presupposes the ability to choose not to be lazy, that is, presupposes the existence of free will.

In a few cases, ‘laziness’ is the very opposite of what it appears. We often confuse laziness with idleness, but idleness—which is to be doing nothing—need not amount to laziness. In particular, we might choose to remain idle because we value idleness and its products above whatever else we might be doing. Lord Melbourne, Queen Victoria’s favourite prime minister, extolled the virtues of ‘masterful inactivity’. More recently, Jack Welch, as chairman and CEO of General Electric, spent an hour each day in what he called ‘looking out of the window time’. And the German chemist August Kekulé in 1865 claimed to have discovered the ring structure of the benzene molecule while daydreaming about a snake biting its own tail. Adepts of this kind of strategic idleness use their ‘idle’ moments, among other things, to observe life, gather inspiration, maintain perspective, sidestep nonsense and pettiness, reduce inefficiency and half-living, and conserve health and stamina for truly important tasks and problems. Idleness can amount to laziness, but it can also be the most intelligent way of working. Time is a very strange thing, and not at all linear: sometimes, the best way of using it is to waste it.

Idleness is often romanticised, as epitomised by the Italian expression dolce far niente (‘the sweetness of doing nothing’). We tell ourselves that we work hard from a desire for idleness. But in fact, we find even short periods of idleness hard to bear. Research suggests that we make up justifications for keeping busy and feel happier for it, even when busyness is imposed upon us. Faced with a traffic jam, we prefer to make a detour even if the alternative route is likely to take longer than sitting through the traffic.

There’s a contradiction here. We are predisposed to laziness and dream of being idle; at the same time, we always want to be doing something, always need to be distracted. How are we to resolve this paradox? Perhaps what we really want is the right kind of work, and the right balance. In an ideal world, we would do our own work on our own terms, not somebody else’s work on somebody else’s terms. We would work not because we needed to, but because we wanted to, not for money or status, but (at the risk of sounding trite) for peace, justice and love.

On the other side of the equation, it’s all too easy to take idleness for granted. Society prepares us for years and years for being useful as it sees it, but gives us absolutely no training in, and little opportunity for, idleness. But strategic idleness is a high art and hard to pull off—not least because we are programmed to panic the moment we step out of the rat race.

There is a very fine divide between idleness and boredom. In the 19th century, Arthur Schopenhauer argued that, if life were intrinsically meaningful or fulfilling, there could be no such thing as boredom. Boredom, then, is evidence of the meaninglessness of life, opening the shutters on some very uncomfortable thoughts and feelings that we normally block out with a flurry of activity or with the opposite thoughts and feelings—or indeed, any feelings at all.

In Albert Camus’s novel The Fall (1956), Clamence reflects to a stranger:

I knew a man who gave 20 years of his life to a scatterbrained woman, sacrificing everything to her, his friendships, his work, the very respectability of his life, and who one evening recognised that he had never loved her. He had been bored, that’s all, bored like most people. Hence he had made himself out of whole cloth a life full of complications and drama. Something must happen – and that explains most human commitments. Something must happen, even loveless slavery, even war or death.

In the essay The Critic as Artist (1891), Oscar Wilde wrote that ‘to do nothing at all is the most difficult thing in the world, the most difficult and the most intellectual’.

The world would be a much better place if we could all spend a year looking out of our window.

In Plato’s Cratylus, on the philosophy of language, Socrates says that the Greek word for truth, aletheia, is a compression of the phrase “a wandering that is divine”.

Since Plato, many thinkers have spoken of truth and God in the same breath, and truth has also been linked with concepts such as justice, power, and freedom. According to John the Apostle, Jesus said to the Jews: “And ye shall know the truth, and the truth shall make you free.”

Today, the belief in God may be dying, but what about truth? Rudy Giuliani, Donald Trump’s personal lawyer, claimed that “truth isn’t truth”, while Kellyanne Conway, Trump’s counsellor, presented the public with what she called “alternative facts”.

Over in the U.K., in the run-up to the Brexit referendum, Michael Gove, then Secretary of State for Justice and Lord Chancellor, opined that people “have had enough of experts”. Accused by the father of a sick child of visiting a hospital for a press opportunity, Prime Minister Boris Johnson replied, “There’s no press here”—while being filmed by a BBC camera crew.

The anatomy of a lie

What constitutes a lie? A lie is not simply an untruth. For centuries, people taught their children that the earth was at the centre of the universe. This was not a lie, insofar as they believed it to be true.

For something to be a lie, the person putting it out has to believe that it is false, even if, by chance, it happens to be true. If I tell you, “I’m not actually my father’s biological son”, believing that I am, and it so happens that I am not, I am still telling a lie.

Of course, it could be that I am being sarcastic, or joking—and, if I have made this sufficiently clear, I could not be counted as lying. For my statement to be a lie, it is not enough that I believe it to be false. I must also intend you to believe that it is true, that is, I must also intend to deceive you. If my intention in deceiving you is a good one, I am telling a white lie; if it is a bad one, I am telling a black lie; and if it is a bit of both, a blue lie.

When Olympias told her son Alexander the Great that his father was not Philip of Macedon but Zeus himself, she would only have been lying if (1) she believed this to be false, and (2) she intended to deceive Alexander. Olympias—who, according to Plutarch, slept with snakes in her bed—probably did believe it to be true, which highlights an important problem with lying, namely, that people can believe the most fantastical things.

A special case is when someone tells the naked truth, intending others to interpret it as a lie or joke. In Game of Thrones, after killing the Freys, Arya Stark runs into some Lannister soldiers, who share with her their meal of roast rabbit and blackberry wine. When one of the soldiers (not the one played by Ed Sheeran) asks, “So why is a nice girl on her own going to King’s Landing?” Arya replies, point blank, “I’m going to kill the queen.” After an awkward silence, everyone including Arya bursts out laughing.

If I am late to a dinner party, I can tell a small lie about some heavy traffic, or I can tell a bolder lie about being pushed into a muddy ditch by a chihuahua and having to go home to get changed. The more unusual and imaginative (and embarrassing) the lie, the more it is likely to be believed.

Or I could try instead to hide the lie. For example, I might lie by omission or “mental reservation”: “Sorry, I had a flat tyre” (last month). Or I might lie by equivocation (playing on words), as Bill Clinton famously did when he stated, “I did not have sexual relations with that woman, Monica Lewinsky.”

A special kind of lie is the bluff, which involves pretending to have an asset or intention or position that one does not actually have. An infamous example of a bluff is former prime minister Theresa May’s Brexit mantra that “no deal is better than a bad deal”.

Lies versus bullsh*t

Is there any difference between telling lies and talking bullsh*t? According to the philosopher Harry Frankfurt, lies differ from bullsh*t in that liars must track the truth in order to conceal it, whereas bullsh*tters have no regard or sensitivity for the truth or even for what their intended audience believes, so long as this audience is convinced or carried by their rhetoric.

Bullsh*tters will say whatever it takes, from moment to moment, to limp on to the next moment.

For Frankfurt:

“Someone who lies and someone who tells the truth are playing on opposite sides, so to speak, in the same game. Each responds to the facts as he understands them, although the response of the one is guided by the authority of the truth, while the response of the other defies that authority and refuses to meet its demands. The bullsh*tter ignores these demands altogether. He does not reject the authority of the truth, as the liar does, and oppose himself to it. He pays no attention to it at all. By virtue of this, bullsh*t is a greater enemy of the truth than lies are.”

Pathological lying

Pathological lying—also sometimes called compulsive lying or mythomania—is a controversial construct, and not tightly defined. It refers to habitual lying, typically for no discernible external gain. It is often although not always a feature of the four Cluster B personality disorders, namely, borderline personality disorder, histrionic personality disorder, narcissistic personality disorder, and antisocial personality disorder, and is in Factor One of the Psychopathy Checklist.

Like pathological lying, most ordinary lying is carried out for internal, or emotional, gain: to attract attention or sympathy, or to alleviate feelings of abandonment, rejection, or worthlessness.

We often lie to ourselves and to others from a position of vulnerability: we lie not out of strength or smartness, but out of need and necessity.

The philosophy of lying

St Augustine’s treatise on lying begins with, “There is a great question about lying…” [Magna quæstio est de mendacio…].

It may be permissible to lie when the positive consequences clearly outweigh any negative consequences. Thus, it may be reasonable to lie in a life and death situation, for instance, to save someone from being discovered by a murderer. And it may be reasonable to lie if the person being lied to has forfeited their right to the truth, for example, by threatening violence.

But such situations are few and far between. Much more common are the small white lies that lubricate our social interactions, such as greeting acquaintances with “good to see you” and starting a letter to a stranger or antagonist with “Dear”.

Outside vis major and social convention, it is usually a bad idea to lie. In the fifth century BCE, Herodotus wrote that, from their fifth year to their twentieth, the Persians were instructed in three things: “to ride a horse, to draw a bow, and to speak the Truth.”

“The most disgraceful thing in the world [the Persians] think is to tell a lie; the next worst, to owe a debt: because, among other reasons, the debtor is obliged to tell lies.”

The first person to suffer from a lie is none other than the liar. Lying feels bad and damages pride and self-esteem. It is a slippery slope that leads to further and greater lies and other ethical violations. Having told a lie, the liar may need a great deal of thought and exertion and sacrifice to avoid being found out. If found out (or even merely suspected), the liar loses authority and credibility, undermines their reputation and relationships, and may suffer further sanctions, including being lied to in return. Last but not least, by keeping critical issues under the radar, lying prevents them from being addressed and dealt with.

And then there is the harm to others. To lie is to treat people as means-to-an-end rather than as ends-in-themselves, which is why being lied to is experienced as disrespectful and demeaning. It also leads people to act on false information, which can have untold and unforeseen consequences.

When faced with a choice between a life of limitless pleasure as a detached brain in a vat, and a genuine, human life along with all its struggle and suffering, most people opt for the latter. This suggests that we value truth for its own sake, as well as for its utility. To deny us a part of reality is therefore to impoverish our life.

There is a line of reasoning that, since the natural end of speech is to communicate the thoughts of the speaker, lying is a perversion of language. Curiously, language runs into serious metaphysical difficulties as soon as a lie is introduced. Consider the sentence: “This statement is false.” If the statement is false, it is true; but if it is true, it is false.

The strongest argument against lying is perhaps that it could not be made into a universal principle. If everyone started lying here and there and everywhere, everything would quickly come apart. For just this reason, Plato bans even poetry from his ideal state, reasoning that poetry is “thrice removed from the truth”.

It’s worrying that we as a society are increasingly tolerant of lies. When people take to lying, they have to tell more and more lies to shore up their earlier lies. This tangled web we weave undermines trust, to the point that we no longer believe anything, least of all the truth.

In the fifth century BCE, the Persian King of Kings Darius the Great had the following advice engraved for his successor Xerxes:

“Thou who shalt be king hereafter, protect thyself vigorously from the Lie; the man who shall be a lie-follower, him do thou punish well, if thus thou shalt think. May my country be secure!”

If Darius knew it then, why do we not know it now?

‘Hypersanity’ is not a common or accepted term. But neither did I make it up. I first came across the concept while training in psychiatry, in The Politics of Experience and the Bird of Paradise (1967) by R D Laing. In this book, the Scottish psychiatrist presented ‘madness’ as a voyage of discovery that could open out onto a free state of higher consciousness, or hypersanity. For Laing, the descent into madness could lead to a reckoning, to an awakening, to ‘break-through’ rather than ‘breakdown’.

A few months later, I read C G Jung’s autobiography, Memories, Dreams, Reflections (1962), which provided a vivid case in point. In 1913, on the eve of the Great War, Jung broke off his close friendship with Sigmund Freud, and spent the next few years in a troubled state of mind that led him to a ‘confrontation with the unconscious’.

As Europe tore itself apart, Jung gained first-hand experience of psychotic material in which he found ‘the matrix of a mythopoeic imagination which has vanished from our rational age’. Like Gilgamesh, Odysseus, Heracles, Orpheus and Aeneas before him, Jung travelled deep down into an underworld where he conversed with Salome, an attractive young woman, and with Philemon, an old man with a white beard, the wings of a kingfisher and the horns of a bull. Although Salome and Philemon were products of Jung’s unconscious, they had lives of their own and said things that he had not previously thought. In Philemon, Jung had at long last found the father-figure that both Freud and his own father had failed to be. More than that, Philemon was a guru, and prefigured what Jung himself was later to become: the wise old man of Zürich. As the war burnt out, Jung re-emerged into sanity, and considered that he had found in his madness ‘the prima materia for a lifetime’s work’.

The Laingian concept of hypersanity, though modern, has ancient roots. Once, upon being asked to name the most beautiful of all things, Diogenes the Cynic (412-323 BCE) replied parrhesia, which in Ancient Greek means something like ‘uninhibited thought’, ‘free speech’, or ‘full expression’. Diogenes used to stroll around Athens in broad daylight brandishing a lit lamp. Whenever curious people stopped to ask what he was doing, he would reply: ‘I am just looking for a human being’ – thereby insinuating that the people of Athens were not living up to, or even much aware of, their full human potential.

After being exiled from his native Sinope for having defaced its coinage, Diogenes emigrated to Athens, took up the life of a beggar, and made it his mission to deface – metaphorically this time – the coinage of custom and convention that was, he maintained, the false currency of morality. He disdained the need for conventional shelter or any other such ‘dainties’, and elected to live in a tub and survive on a diet of onions. Diogenes proved to the later satisfaction of the Stoics that happiness has nothing whatsoever to do with a person’s material circumstances, and held that human beings had much to learn from studying the simplicity and artlessness of dogs, which, unlike human beings, had not complicated every simple gift of the gods.

The term ‘cynic’ derives from the Greek kynikos, which is the adjective of kyon or ‘dog’. Once, upon being challenged for masturbating in the marketplace, Diogenes replied that he wished it were as easy to relieve hunger by rubbing an empty stomach. When asked, on another occasion, where he came from, he replied: ‘I am a citizen of the world’ (cosmopolites), a radical claim at the time, and the first recorded use of the term ‘cosmopolitan’. As he approached death, Diogenes asked for his mortal remains to be thrown outside the city walls for wild animals to feast upon. After his death in the city of Corinth, the Corinthians erected to his glory a pillar surmounted by a dog of Parian marble.

Jung and Diogenes came across as insane by the standards of their day. But both men had a depth and acuteness of vision that their contemporaries lacked, and that enabled them to see through the facades of ‘sanity’ around them. Both psychosis and hypersanity place us outside society, making us seem ‘mad’ to the mainstream. Both states attract a heady mixture of fear and fascination. But whereas mental disorder is distressing and disabling, hypersanity is liberating and empowering.

After reading The Politics of Experience, the concept of hypersanity stuck in my mind, not least as something that I might aspire to for myself. But if there is such a thing as hypersanity, the implication is that mere sanity is not all it’s cracked up to be, a state of dormancy and dullness with less vital potential even than madness. This I think is most apparent in people’s frequently suboptimal – if not frankly inappropriate – responses, both verbal and behavioural, to the world around them. As Laing puts it:

The condition of alienation, of being asleep, of being unconscious, of being out of one’s mind, is the condition of the normal man.

Society highly values its normal man. It educates children to lose themselves and to become absurd, and thus to be normal.

Normal men have killed perhaps 100,000,000 of their fellow normal men in the last 50 years.

Many ‘normal’ people suffer from not being hypersane: they have a restricted worldview, confused priorities, and are wracked by stress, anxiety and self-deception. As a result, they sometimes do dangerous things, and become fanatics or fascists or otherwise destructive (or not constructive) people. In contrast, hypersane people are calm, contained and constructive. It is not just that the ‘sane’ are irrational but that they lack scope and range, as though they’ve grown into the prisoners of their arbitrary lives, locked up in their own dark and narrow subjectivity. Unable to take leave of their selves, they hardly look around them, barely see beauty and possibility, rarely contemplate the bigger picture – and all, ultimately, for fear of losing their selves, of breaking down, of going mad, using one form of extreme subjectivity to defend against another, as life – mysterious, magical life – slips through their fingers.

We could all go mad, in a way we already are, minus the promise. But what if there were another route to hypersanity, one that, compared with madness, was less fearsome, less dangerous, and less damaging? What if, as well as a backdoor way, there were also a royal road strewn with sweet-scented petals? After all, Diogenes did not exactly go mad. Neither did other hypersane people such as Socrates and Confucius, although the Buddha did suffer, in the beginning, with what might today be classed as depression.

Besides Jung, are there any modern examples of hypersanity? Those who escaped from Plato’s cave of shadows were reluctant to crawl back down and involve themselves in the affairs of men, and most hypersane people, rather than courting the limelight, might prefer to hide out in their back gardens. But a few do rise to prominence for the difference that they felt compelled to make, people such as Nelson Mandela and Temple Grandin. And the hypersane are still among us: from the Dalai Lama to Jane Goodall, there are many candidates. While they might seem to be living in a world of their own, this is only because they have delved more deeply into the way things are than those ‘sane’ people around them.