In the Nicomachean Ethics, Aristotle (d. 322 BCE) tries to discover what is ‘the supreme good for man’, that is, what is the best way to lead our life and give it meaning.
For Aristotle, a thing is most clearly and easily understood by looking at its end, purpose, or goal. For example, the purpose of a knife is to cut, and it is by seeing this that one best understands what a knife is; the goal of medicine is good health, and it is by seeing this that one best understands what medicine is, or, at least, ought to be.
Now, if one persists with this, it soon becomes apparent that some goals are subordinate to other goals, which are themselves subordinate to yet other goals. For example, a medical student’s goal may be to qualify as a doctor, but this goal is subordinate to her goal to heal the sick, which is itself subordinate to her goal to make a living by doing something useful. This could go on and on, but unless the medical student has a goal that is an end-in-itself, nothing that she does is actually worth doing.
What, asks Aristotle, is this goal that is an end-in-itself? What, in other words, is the final purpose of everything that we do?
The answer, says Aristotle, is happiness.
And of this nature happiness is mostly thought to be, for this we choose always for its own sake, and never with a view to anything further: whereas honour, pleasure, intellect, in fact every excellence we choose for their own sakes, it is true, but we choose them also with a view to happiness, conceiving that through their instrumentality we shall be happy: but no man chooses happiness with a view to them, nor in fact with a view to any other thing whatsoever.
Why did we get dressed this morning? Why do we go to the dentist? Why do we go on a diet? Why am I writing this article? Why are you reading it? Because we want to be happy, simple as that.
That the meaning of life is happiness may seem obvious, but it is something that most of us forget somewhere along the way. Oxford and Cambridge are infamous for their fiendish admission interviews, and one question that is sometimes asked is, ‘What is the meaning of life?’ So, when I prepare prospective doctors for their medical school interviews, I frequently put this question to them. When they flounder, as invariably they do, I ask them, ‘Well, tell me, why are you here?’
Our exchange might go something like this:
“What do you mean, why am I here?”
“Well, why are you sitting here with me, prepping for your interviews, when you could be outside enjoying the sunshine?”
“Because I want to do well in my interviews.”
“Why do you want to do well in your interviews?”
“Because I want to get into medical school.”
“Why do you want to get into medical school?”
“Because I want to become a doctor.”
“Why do you want to put yourself through all that trouble?”
And so on.
But the one thing that the students never tell me is the truth, which is:
“I am sitting here, putting myself through all this, because I want to be happy, and this is the best way I have found of becoming or remaining so.”
Somewhere along the road, the students lost sight of the wood for the trees, even though they are only at the beginning of their journey. With the passing of the years, their short-sightedness will only get worse—unless, of course, they read and remember their Aristotle.
According to the philosopher Søren Kierkegaard (d. 1855), a person can, deep down, lead one of three lives: the æsthetic life, the ethical life, or the religious life.
A person leading the æsthetic life aims solely at satisfying her desires. If, for example, it is heroin that she craves, she will do whatever it takes to get hold of her next fix. If heroin happens to be cheap and legal, this need not involve any illegal or immoral behaviour on her part. But if heroin happens to be expensive or illegal, as is generally the case, she may have to resort to lying, stealing, and much worse. To satisfy her desires, which, by definition, she insists upon doing, the æsthete constantly has to adapt to the circumstances in which she finds herself, and, as a result, cannot lay claim to a consistent, coherent self.
The person leading the ethical life, in complete contrast to the æsthete, behaves according to categorical and immutable moral principles such as ‘do not lie’ and ‘do not steal’, regardless of the circumstances, however attenuating, in which she happens to find herself. Because the moralist has a consistent, coherent self, she leads a higher type of life than that of the æsthete.
But the highest type of life is the religious life, which has something in common with both the ethical life and the æsthetic life. Like the ethical life, the religious life recognizes and respects the authority of moral principles; but like the æsthetic life, it is sensitive to the circumstances. In acquiescing to universal moral principles yet attending to particularities, the religious life opens the door to moral indeterminacy, that is, to ambiguity, uncertainty, and anxiety. Anxiety, says Kierkegaard, is the dizziness of freedom.
A paradigm of the religious life is that of the biblical patriarch Abraham, as epitomized by the episode of the Sacrifice of Isaac.
According to Genesis 22, God said unto Abraham:
Take now thy son, thine only son Isaac, whom thou lovest, and get thee into the land of Moriah; and offer him there for a burnt offering upon one of the mountains which I will tell thee of.
Unlike the æsthete, Abraham is acutely aware of, and attentive to, moral principles such as, ‘Thou shalt not kill’—which is, of course, one of the ten commandments. But unlike the moralist, he is also willing or able to look beyond these moral principles, and in the end resigns himself to obeying God’s command.
But as he is about to slay his sole heir, born of a miracle, an angel appears and stays his hand:
Abraham, Abraham … Lay not thine hand upon the lad, neither do thou any thing unto him: for now I know that thou fearest God, seeing thou hast not withheld thy son, thine only son from me.
At this moment, a ram appears in a thicket, and Abraham seizes it and sacrifices it in Isaac’s stead. He then names the place of the sacrifice Jehovahjireh, which translates from the Hebrew as, ‘The Lord will provide.’
The teaching of the Sacrifice of Isaac is that the conquest of doubt and anxiety, and hence the exercise of freedom, requires something of a leap of faith. It is in making this leap, not only once but over and over again, that a person, in the words of Kierkegaard, ‘relates himself to himself’ and is able to rise into a thinking, deciding, living being.
In the Milgram experiment, conducted in 1961 during the trial of the Nazi war criminal Adolf Eichmann (one of the major organizers of the Holocaust), an experimenter ordered a ‘teacher’, the test subject, to deliver what the latter believed to be painful shocks to a ‘learner’. The experimenter informed the teacher and learner that they would be participating in a study on learning and memory in different situations, and asked them to draw lots to determine their roles, with the lots rigged so that the test subject invariably ended up as the teacher.
The teacher and the learner were placed in adjacent rooms from which they could hear but not see each other. The teacher was instructed to deliver a shock to the learner for every wrong answer that he gave, and, after each wrong answer, to increase the intensity of the shock by 15 volts, from 15 to 450 volts. The shock button, instead of delivering a shock, activated a tape recording of increasingly alarmed and alarming reactions from the learner. After a certain number of shocks, the learner began to bang on the wall and, eventually, fell silent.
If the teacher indicated that he wanted to end the experiment, the experimenter gave him up to four increasingly stern verbal prods. If, after the fourth prod, the teacher still wanted to end the experiment, the experiment was terminated. Otherwise, the experiment ran until the teacher had delivered the maximum shock of 450 volts three times in succession.
In the first set of experiments, 26 out of 40 test subjects delivered the massive 450-volt shock, and all 40 test subjects delivered shocks of at least 300 volts.
The philosopher Hannah Arendt called this propensity to do evil without oneself being evil ‘the banality of evil’. Being Jewish, Arendt fled Germany in the wake of Hitler’s rise. Some thirty years later, she witnessed and reported on Adolf Eichmann’s trial in Jerusalem. In the resulting book, Eichmann in Jerusalem, she remarks that Eichmann, though lacking in empathy, did not come across as a fanatic or psychopath, but as a ‘terrifyingly normal’ person, a bland bureaucrat who lacked skills, education, and the ability to think for himself. Eichmann had simply been pursuing his idea of success, diligently climbing the rungs of the Nazi hierarchy. From his perspective, he had done no more than ‘obey orders’, even, ‘obey the law’—not unlike Kierkegaard’s unquestioning moralist.
Eichmann was a ‘joiner’ who, all his life, had joined, or sought to join, various outfits and organizations in a bid to be a part of something bigger than himself, to define himself, to belong. But then he got swept up by history and landed where he landed.
Arendt’s thesis has attracted no small measure of criticism and controversy. Although she never sought to excuse or exonerate Eichmann, she may have been mistaken or misled about his character and motives. Regardless, in the final analysis, Eichmann’s values, his careerism, his nationalism, his antisemitism, were not truly his own as a self-determining being, but borrowed from the movements and society from which he arose, even though he and millions of others paid the ultimate price for them.
Whenever you’re about to engage in something with an ethical dimension, always ask yourself, “Is this who I wanted to be on the best day of my life?”
There is an old Japanese story about a monk and a samurai.
One day, a Zen monk was going from temple to temple, following the shaded path along a babbling brook, when he came upon a bedraggled and badly bruised samurai.
‘Whatever happened to you?’ asked the monk.
‘We were conveying our lord’s treasure when we were set upon by bandits. But I played dead and was the only one of my company to survive. As I lay on the ground with my eyes shut, a question kept turning in my mind. Tell me, little monk, what is the difference between heaven and hell?’
‘What samurai plays dead while his companions are slain! Shame on you! You ought to have fought to the death. Look at the sight of you, a disgrace to your class, your master, and every one of your ancestors. You are not worthy of the food that you eat or the air that you breathe, let alone of my hard-won wisdom!’
At all this, the samurai puffed up with rage and appeared to double in size as he drew out his sword, swung it over his head, and brought it down onto the monk.
But just before being struck, the monk changed his tone and composure, and calmly said, ‘This is hell.’
The samurai dropped his sword. Filled with shame and remorse, he fell to his knees with a clatter of armour: ‘Thank you for risking your life simply to teach a stranger a lesson,’ he said, his eyes wet with tears. ‘Please, if you could, forgive me for threatening you.’
The circumstances in which we laugh are many and varied, but, deep down, we laugh for one (or sometimes several) of just seven reasons.
1. To feel better about ourselves. When looking for romance on dating sites and apps, we often ask for, or promise to offer, a good sense of humour (GSOH). Today, we tend to think of laughter as a good thing, but, historically, this has not always been the case. In particular, the Church looked upon laughter as a corrupting and subversive force, and for centuries, the monasteries forbade it. This notion that laughter can be less than virtuous finds an echo in the superiority theory of laughter, according to which laughter is a way of putting ourselves up by putting others down. The superiority theory is most closely linked with the philosopher Thomas Hobbes, who conceived of laughter as “a sudden glory arising from sudden conception of some eminency in ourselves, by comparison with the infirmity of others, or with our own formerly.” Think of medieval mobs jeering at people in stocks, or, in our time, Candid Camera.
2. To relieve stress and anxiety. Clearly, the superiority theory is unable to account for all cases of laughter, such as laughter arising from relief, surprise, or joy. According to the relief theory of laughter, most often associated with Sigmund Freud, laughter represents a release of pent-up nervous energy. Like dreams, jokes are able to bypass our inner censor, enabling a repressed emotion such as xenophobia (or, at least, the nervous energy associated with the repression) to surface—explaining why, at times, we can be embarrassed by our own laughter. By the same token, a comedian might raise a laugh by conjuring some costly emotion, such as admiration or indignation, and then suddenly killing it. Although more flexible than the superiority theory, the relief theory is unable to account for all cases of laughter, and those who laugh hardest at offensive jokes are not generally the most repressed of people.
3. To keep it real. Much more popular today is the incongruity theory of laughter, associated with the likes of Immanuel Kant and Søren Kierkegaard, according to which the comedian raises a laugh, not by conjuring an emotion and then killing it, but by creating an expectation and then contradicting it. Building upon Aristotle, Kierkegaard highlighted that the violation of an expectation is the core not only of comedy but also of tragedy—the difference being that, in tragedy, the violation leads to significant pain or harm. Possibly, it is not the incongruity itself that we enjoy, but the light that it sheds, in particular, on the difference between what lies inside and outside our heads. The incongruity theory is arguably more basic than the relief and superiority theories. When someone laughs, our inclination is to search for an incongruity; and though we may laugh for superiority or relief, even then, it helps if we can pin our laughter on some real or imagined incongruity.
4. As a social service. According to the philosopher Henri Bergson, we tend to fall into patterns and habits, to rigidify, to lose ourselves to ourselves—and laughter is how we point this out to one another, how we up our game as a social collective. For example, we may laugh at one who falls into a hole through absentmindedness, or at one who constantly repeats the same gesture or phrase. Conversely, we may also laugh at, or from, an unusual or unexpected lack of rigidity, as, for instance, when we break a habit or have an original idea. Ultimately, says Bergson, we are laughable to the extent that we are a machine or an object, to the extent that we lack self-awareness, that we are invisible to ourselves while being visible to everyone else. Thus, the laughter of others usually draws attention to our unconscious processes, to our modes or patterns of self-deception, and to the gap, or gulf, between our fiction and the reality. This gap is narrowest in poets and artists, who have to transcend themselves if they are to be worthy of the name.
5. To put others at ease. Another way of understanding laughter is to look at it like a biologist or anthropologist might. Human infants are able to laugh long before they can speak. Laughter involves parts of the brain that are, in evolutionary terms, much older than the language centres, and that we share with other animals. Primates, in particular, produce laughing sounds when playfighting, play-chasing, or tickling one another. As with human children, it seems that their laughter functions as a signal that the danger is not for real—which may be why rictus characters such as Batman’s Joker, who send a misleading signal, are so unsettling.
6. For diplomacy. Most laughter, even today, is not directed at jokes, but at creating and maintaining social bonds. Humour is a social lubricant, a signal of contentedness, acceptance, and belonging. More than that, it is a way of communicating, of making a point emphatically, or conveying a sensitive message without incurring the usual social costs. At the same time, humour can also be a weapon, a sublimed form of aggression, serving, like the stag’s antlers, to pull rank or attract a mate. The subtlety and ambiguity involved is in itself a source of almost endless stimulation.
7. To transcend ourselves. Laughter may have begun as a signal of play, but it has, as we have seen, evolved a number of other functions. Zen masters teach that it is much easier to laugh at ourselves once we have transcended our ego. At the highest level, laughter is the sound of the shattering of the ego. It is a means of gaining (and revealing) perspective, of rising beyond ourselves and our lives, of achieving a kind of immortality, a kind of divinity. Upon awakening on her deathbed to see her entire family around her, Nancy Astor quipped, “Am I dying, or is this my birthday?”
Today, laughter is able to give us a little of what religion once did.
Following his defeat at the Battle of Actium in 31 BCE, Mark Antony heard a rumour that Cleopatra had committed suicide and, in consequence, stabbed himself in the abdomen—only to discover that Cleopatra herself had been responsible for spreading the rumour. He later died in her arms.
“Fake news” is nothing new, but in our Internet age it has spread like a contagious disease, swinging elections, fomenting social unrest, undermining institutions, and diverting political capital away from health, education, the environment, and all-round good government.
So how best to guard against it?
As a medical specialist, I’ve spent over 20 years in formal education. With the possible exception of my two-year master’s in philosophy, the emphasis of my education has always been firmly and squarely on fact accumulation.
Today, I have little use for most of these facts, and though I am only middle-aged, many are already out of date, or highly questionable.
But what I do rely on—every day, all the time—is my faculty for critical thinking. As B. F. Skinner once put it, “Education is what survives when what has been learnt has been forgotten.”
But can critical thinking even be taught?
In Plato’s Meno, Socrates says that people with wisdom and virtue are very poor at imparting those qualities: Themistocles, the Athenian politician and general, was able to teach his son Cleophantus skills such as standing upright on horseback and shooting javelins, but no one ever credited Cleophantus with anything like his father’s wisdom; and the same could also be said of Aristides and his son Lysimachus, and Thucydides and his sons Melesias and Stephanus.
In Plato’s Protagoras, Socrates says that Pericles, who led Athens at the peak of its golden age, gave his sons excellent instruction in everything that could be learnt from teachers, but when it came to wisdom, he simply left them to “wander at their own free will in a sort of hope that they would light upon virtue of their own accord”.
It may be that wisdom and virtue cannot be taught, but thinking skills certainly can—or, at least, the beginning of them.
So rather than leaving thinking skills to chance, why not make more time for them in our schools and universities, and be more rigorous and systematic about them?
I’ll make a start by introducing you to what I have called “the five enemies of rational thought”:
1. Formal fallacy. A fallacy is some kind of defect in an argument. A formal fallacy is a deductive argument with an invalid form, for example:
Some A are B.
Some B are C.
Therefore, some A are C.
If you cannot yet see that this argument is invalid, substitute A, B, and C with “insects”, “herbivores”, and “mammals”.
Insects, clearly, are not mammals.
A formal fallacy is built into the structure of an argument and is invalid irrespective of the content of the argument.
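The invalidity of the form can be made concrete with a quick sketch in Python (the category members below are made up purely for illustration): both premises come out true, yet the conclusion comes out false, and a single such counterexample is enough to condemn the form itself.

```python
# Illustrative sets: a handful of made-up members for each category.
insects = {"bee", "beetle", "locust"}
herbivores = {"locust", "cow", "rabbit"}
mammals = {"cow", "rabbit", "human"}

def some(a, b):
    """True if 'Some A are B', i.e. the two sets overlap."""
    return bool(a & b)

premise_1 = some(insects, herbivores)   # Some A are B: the locust
premise_2 = some(herbivores, mammals)   # Some B are C: the cow, the rabbit
conclusion = some(insects, mammals)     # Some A are C: no overlap at all

print(premise_1, premise_2, conclusion)  # True True False
```

True premises with a false conclusion: the argument form, not merely this instance of it, is invalid.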
2. Informal fallacy. An informal fallacy, in contrast, is one that can only be identified through an analysis of the content of the argument.
Informal fallacies often turn on the misuse of language, for example, using a key term or phrase in an ambiguous way, with one meaning in one part of the argument and another meaning in another part—known as the fallacy of equivocation.
Informal fallacies can also distract from the weakness of an argument, or appeal to the emotions instead of reason.
Here are a few more examples of informal fallacies.
Damning the alternatives. Arguing in favour of something by damning its alternatives. (Tim’s useless and Bob’s a drunk. So, I’ll marry Jimmy. Jimmy’s the right man for me.)
Gambler’s fallacy. Assuming that the outcome of one or more independent events can impact the outcome of a subsequent independent event. (June is pregnant with her fourth child. Her first three children are all boys, so this time it’s bound to be a girl.)
Appeal to popularity. Concluding the truth of a proposition on the basis that most or many people believe it to be true. (Of course he’s guilty: even his mother has turned her back on him.)
Argument from ignorance. Upholding the truth of a proposition based on a lack of evidence against it, or the falsity of a proposition based on a lack of evidence for it. (Scientists haven’t found any evidence of current or past life on Mars. So, we can be certain that there has never been any life on Mars.)
Argument to moderation. Arguing that the moderate view or middle position must be the right or best one. (Half the country favours leaving the European Union, the other half favours remaining. Let’s compromise by leaving the European Union but remaining in the Customs Union.)
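The gambler’s fallacy in the list above can also be illustrated numerically. The following sketch assumes, for simplicity, that each birth is an independent 50/50 event: among simulated families whose first three children are boys, the fourth child is a girl only about half the time—not, as the fallacy would have it, almost certainly.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

trials = 100_000
families_with_three_boys = 0
girls_after_three_boys = 0

for _ in range(trials):
    # Four independent births, each equally likely to be a boy or a girl.
    children = [random.choice("BG") for _ in range(4)]
    if children[:3] == ["B", "B", "B"]:
        families_with_three_boys += 1
        if children[3] == "G":
            girls_after_three_boys += 1

rate = girls_after_three_boys / families_with_three_boys
print(round(rate, 2))  # close to 0.5, not "bound to be a girl"
```

Past outcomes of independent events carry no information about the next one; the three boys already born do nothing to raise the odds of a girl.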
3. Cognitive bias. Cognitive bias is sloppy, if not necessarily faulty, reasoning: a mental shortcut or heuristic intended to spare us time, effort, or discomfort—often while reinforcing our self-image or worldview—but at the cost of accuracy or reliability.
For example, in explaining the behaviour of other people, our tendency is to overestimate the role of character traits relative to situational factors—a bias, called correspondence bias, that goes into reverse when it comes to explaining our own behaviour. Thus, if Charlotte fails to mow the lawn, I charge her with forgetfulness, laziness, or spite; but if I fail to mow the lawn, I absolve myself on the grounds of busyness, tiredness, or inclement weather.
Another important cognitive bias is my-side, or confirmation, bias, which is the propensity to search for or recall only those stories, facts, and arguments that are in keeping with our pre-existing beliefs while filtering out those that conflict with them—which, especially on social media, can lead us to inhabit a so-called echo chamber.
4. Cognitive distortion. Cognitive distortion is a concept from cognitive-behavioural therapy (CBT), developed by psychiatrist Aaron Beck in the 1960s and used in the treatment of depression and other mental disorders.
Cognitive distortion involves interpreting events and situations so that they conform to and reinforce our outlook or frame of mind, typically on the basis of very scant or partial evidence, or even no evidence at all.
Common cognitive distortions in depression include selective abstraction and catastrophic thinking.
Selective abstraction is to focus on a single and often insignificant negative event or condition to the exclusion of other, more positive ones, for example, “My partner hates me. He gave me an annoyed look three days ago.”
Catastrophic thinking is to exaggerate and dramatize the likely consequences of an event or situation, for example, “The pain in my knee is getting worse. When I’m reduced to a wheelchair, I won’t be able to go to work and pay the bills. So, I’ll end up losing my house and dying in the street.”
A cognitive distortion can open up a vicious circle, with the cognitive distortion feeding the depression, and the depression the cognitive distortion.
Cognitive distortion as broadly understood is not limited to depression and other mental disorders, but is also a feature of, among others, poor self-esteem, jealousy, and marital conflict.
5. Self-deception. Of the five enemies of rational thought, the most important by far is self-deception, because it tends to underlie all the others.
If we do not think clearly, if we cannot see the wood for the trees, this is not usually because we lack intelligence or education or experience, but because we feel exposed and vulnerable—and rather than come to terms with a painful truth, prefer, almost reflexively, to deceive and defend ourselves.
As I argue in Hide and Seek: The Psychology of Self-Deception, all self-deception can be understood in terms of ego defence. In psychoanalytic theory, an ego defence is one of several unconscious processes that we deploy to defuse the fear and anxiety that arise when who or what we truly are (our unconscious “id”) comes into conflict with who we think we are or who we think we should be (our conscious “superego”).
To put some flesh onto this, let’s take a look at two important ego defences: projection and idealization.
Projection is the attribution of one’s unacceptable thoughts and feelings to other people. This necessarily involves repression (another ego defence) as a first step, since unacceptable thoughts and feelings need to be repudiated before they can be detached from oneself. Classic examples of projection include the envious person who believes that everyone envies her, the covetous person who lives in constant fear of being dispossessed, and the person with fantasies of infidelity who suspects that she is being cheated on by her partner.
Idealization involves overestimating the positive attributes of a person, object, or idea while underestimating its negative attributes. At a deeper level, it involves the projection of our needs and desires onto that person, object, or idea. A paradigm of idealization is infatuation, or romantic love, when love is confused with the need to love, and the idealized person’s negative attributes are glossed over or even construed as positive. Although this can make for a rude awakening, there are few better ways of relieving our existential anxiety than by manufacturing something that is ‘perfect’ for us, be it a piece of equipment, a place, country, person, or god.
In all cases, the raw material of thought is facts. If the facts are missing, or worse, misleading, then thought cannot even get started.