There is no agreed definition or model of intelligence. By the Collins English Dictionary, it is ‘the ability to think, reason, and understand instead of doing things automatically or by instinct’. By the Macmillan Dictionary, it is ‘the ability to understand and think about things, and to gain and use knowledge’.

In seeking to define intelligence, a good place to start might be with dementia. In Alzheimer’s disease, the most common form of dementia, there is disturbance of multiple higher cortical functions including memory, thinking, orientation, comprehension, calculation, learning capacity, language, and judgement. I think it significant that people with dementia or severe learning difficulties cope very poorly with changes in their environment, such as moving into a care home or even into an adjacent room. Taken together, this suggests that intelligence refers to the functioning of a number of related faculties and abilities that enable us to respond to environmental pressures to avoid danger and distress. Because this is not beyond animals and even plants, they too can be said to be possessed of intelligence.

We Westerners tend to think of intelligence primarily in terms of analytical skills. But in a close-knit hunter-gatherer society, intelligence might be defined more in terms of foraging skills, or social skills or responsibilities. Even within a single society, the skills that are most valued change over time. In the West, the emphasis has gradually shifted from language skills to analytical skills, and it was only in 1960, well within living memory, that the Universities of Oxford and Cambridge dropped Latin as an entry requirement. In 1990, Peter Salovey and John D. Mayer published the seminal paper on emotional intelligence, and E.I. soon became all the rage. In that same year, 1990, Tim Berners-Lee wrote the first web browser. Today, we cannot go very far without some considerable I.T. skills (certainly by the standards of 1990), and computer scientists are among the most highly paid professionals. All this to say that what constitutes intelligence varies according to the needs and values of our culture and society.

Our society holds analytical skills in such high regard that some of our leaders repeatedly mention their ‘high I.Q.’ to lend themselves credibility. This Western emphasis on reason and intelligence has its roots in Ancient Greece with Socrates, his pupil Plato, and Plato’s pupil Aristotle. Socrates held that ‘the unexamined life is not worth living’. He typically proceeded by questioning one or more people about a certain concept such as courage or justice, eventually exposing a contradiction in their initial assumptions and provoking a reappraisal of the concept. For Plato, reason could carry us far beyond the confines of common sense and everyday experience into a ‘hyper-heaven’ of ideal forms. He famously fantasized about putting a geniocracy of philosopher-kings in charge of his utopian Republic. Finally, Aristotle argued that our distinctive function as human beings is our unique capacity to reason, and therefore that our supreme good and happiness consists in leading a life of rational contemplation. To paraphrase Aristotle in Book X of the Nicomachean Ethics, ‘man more than anything is reason, and the life of reason is the most self-sufficient, the most pleasant, the happiest, the best, and the most divine of all.’ In later centuries, reason became a divine property, found in man because he was made in God’s image. If you struggled with your SATs, or thought they were pants, you now know who to blame.

Unfortunately, the West’s obsession with analytical intelligence has had, and continues to have, dire moral and political consequences. Immanuel Kant most memorably made the connection between reasoning and moral standing, arguing (in simple terms) that by virtue of their ability to reason human beings ought to be treated, not as means to an end, but as ends-in-themselves. From here, it is all too easy to conclude that, the better you are at reasoning, the worthier you are of personhood and its rights and privileges. For centuries, women were deemed to be ‘emotional’, that is, less rational, which justified treating them as chattel or, at best, second-class citizens. The same could be said of non-white people, over whom it was not just the right but the duty of the white man to rule. Kipling’s poem The White Man’s Burden (1899) begins with the lines: Take up the White Man’s burden/ Send forth the best ye breed/ Go bind your sons to exile/ To serve your captives’ need/ To wait in heavy harness/ On fluttered folk and wild/ Your new-caught, sullen peoples/ Half-devil and half-child. People deemed to be less rational—women, non-white people, the lower classes, the infirm—were not just disenfranchised but dominated, colonized, enslaved, murdered, with impunity. Only in 2015 did the U.S. Senate vote to compensate living victims of government-sponsored sterilization programs for the ‘feeble-minded’. Today, it is the white man who most fears artificial intelligence, imagining that it will usurp his status and privilege.

According to one influential paper, I.Q. is the best predictor of job performance. But this is not entirely surprising given that ‘performance’ and I.Q. have been defined in similar terms, and that both depend, to some extent, on third factors such as compliance, motivation, and educational attainment. Rather than intelligence per se, genius is defined more by drive, vision, creativity, and opportunity, and it is notable that the minimum I.Q. necessary for genius—probably around 125—is not all that high.

William Shockley and Luis Walter Alvarez, who both went on to win the Nobel Prize for physics, were excluded from the Terman Study of the Gifted on account of… their modest I.Q. scores.

As a footnote, in later life Shockley developed controversial views on race and eugenics, setting off a national debate over the use and applicability of I.Q. tests.

References

  • Salovey P & Mayer JD (1990): Emotional intelligence. Imagination, Cognition and Personality 9(3):185–211.
  • Ree MJ & Earles JA (1992): Intelligence is the best predictor of job performance. Current Directions in Psychological Science 1(3):86–89.
  • Saxon W (1989): Obituary William B. Shockley, 79, Creator of Transistor and Theory on Race. New York Times, August 14, 1989.

Think back to your favourite teacher: for me, a French teacher who wept as he read out from a novel by Marguerite Duras. The teachers whom we hold in our hearts are not those who taught us the most facts, but those who inspired us and opened us up to ourselves. But what is inspiration and can it be cultivated?

The word ‘inspiration’ ultimately derives from the Greek for ‘God-breathed’, or ‘divinely breathed into’. In Greek myth, inspiration is a gift of the muses, the nine daughters of Zeus and Mnemosyne (‘Memory’), though it can also come from Apollo (Apollon Mousagetēs, ‘Apollo Muse-leader’), Dionysus, or Aphrodite. Homer famously invokes the muses in the very first line of the Iliad: ‘Sing, O Muse, of the rage of Achilles, son of Peleus, that brought countless ills upon the Achaeans…’

Similarly, the Church maintains that inspiration is a gift from the Holy Ghost, including the inspiration for the Bible itself: ‘For the prophecy came not in old time by the will of man: but holy men of God spake as they were moved by the Holy Ghost’ (2 Peter 1:21).

The Oxford English Dictionary defines ‘inspiration’ as ‘a breathing in or infusion of some idea, purpose, etc. into the mind; the suggestion, awakening, or creation of some feeling or impulse, especially of an exalted kind’. Going by this definition, there appear to be two aspects to inspiration: some kind of vision, accompanied by some kind of positive energy with which to drive or at least sustain that vision.

‘Inspiration’ is often confused with ‘motivation’ and ‘creativity’. Motivation aims at some sort of external reward, whereas inspiration comes from within and is very much its own reward. Although inspiration is associated with creative insight, creativity also involves the realization of that insight—which requires opportunity, means, and, above all, effort. In the words of Thomas Edison, genius is one percent inspiration, ninety-nine percent perspiration—although you may not get started, or get very far, without the initial one percent.

Other than creativity, inspiration has been linked with enthusiasm, optimism, and self-esteem. Inspiration need not be all artistic and highfalutin: I often feel inspired to garden or cook, to plant out some bulbs for next spring or make use of some seasonal ingredients. Such inspired tasks feel very different from, say, writing a complaint or filing my accounts. If I could be paid to do what inspires me, and pay others to do what doesn’t, I should be a very happy man.

Despite its importance to both society and the individual, our system of education leaves very little place for inspiration—perhaps because, like wisdom and virtue, it cannot easily be taught but only… inspired. Unfortunately, if someone has never been inspired, he or she is unlikely to inspire others. That is a great shame. The best education consists not in being taught but in being inspired, and, if I could, I would rather inspire a single person than teach a thousand.

But where, in the first place, does inspiration come from? In Plato’s Ion, Socrates likens inspiration to a divine power, and this divine power to a magnetic stone that can not only move iron rings, but also magnetize the iron rings so that they can do the same. This leads to a long chain of iron rings, with each ring’s energy ultimately derived from that of the original magnetic stone. If a poet is any good, this is not because he has mastered his subject, but because he is divinely inspired, divinely possessed:

For the poet is a light and winged and holy thing, and there is no invention in him until he has been inspired and is out of his senses, and the mind is no longer in him: when he has not attained to this state, he is powerless and is unable to utter his oracles.

Socrates compares inspired poets to the Bacchic maidens, who are out of their minds when they draw honey and milk from the rivers. He asks Ion, a rhapsode (reciter of poetry), whether, when he recites Homer, he does not get beside himself, whether his soul does not believe that it is witnessing the actions of which he sings. Ion replies that, when he sings of something sad, his eyes are full of tears, and when he sings of something frightening, his hairs stand on end, such that he is no longer in his right mind. Socrates says that this is precisely the effect that a rhapsode has on his audience: the muse inspires the poet, the poet the rhapsode, and the rhapsode his audience, which is the last of the iron rings in the divine chain.

In Plato’s Phaedrus, Socrates argues that madness, as well as being an illness, can be the source of our greatest blessings. There are, he continues, four kinds of inspired madness: prophecy, from Apollo; holy prayers and mystic rites, from Dionysus; poetry, from the muses; and love, from Aphrodite and Eros.

But if a man comes to the door of poetry untouched by the madness of the muses, believing that technique alone will make him a good poet, he and his sane companions never reach perfection, but are utterly eclipsed by the performances of the inspired madman.

All human beings, says Socrates, are able to recollect universals such as perfect goodness and perfect beauty, and must therefore have seen them in some other life or other world. The souls that came closest to the universals, or that experienced them most deeply, are reincarnated into philosophers, artists, and true lovers. As the universals are still present in their minds, they are completely absorbed in ideas about them and forget all about earthly interests. Humdrum people think that they are mad, but the truth is that they are divinely inspired and in love with goodness and beauty. In the 20th century, the psychoanalyst Carl Jung echoed Plato, arguing that the artist is one who can reach beyond individual experience to access our genetic memory, that is, the memory, such as the memory for language, that is already present at birth. It is perhaps no coincidence that, in Greek myth, the mother of the muses is Mnemosyne/Memory.

The idea that ‘madness’ is closely allied with inspiration and revelation is an old and recurring one. In Of Peace of Mind, Seneca the Younger writes that ‘there is no great genius without a tincture of madness’ (nullum magnum ingenium sine mixtura dementiae fuit), a maxim which he attributes to Aristotle, and which is also echoed in Cicero. For Shakespeare, ‘the lunatic, the lover, and the poet are of imagination all compact’. And for Dryden, ‘great wits are sure to madness near allied, and thin partitions do their bounds divide’. As I argued in a book called The Meaning of Madness, our reservoir of madness is a precious resource that we can learn to tap into.

For the modern writer André Gide,

The most beautiful things are those that are whispered by madness and written down by reason. We must steer a course between the two, close to madness in our dreams, but close to reason in our writing.

7 simple strategies to encourage inspiration

So it seems that inspiration is some kind of alignment or channelling of primal energies, and that it cannot quite be summoned or relied upon.

Nonetheless, here are seven simple strategies that may make it more likely to alight upon us:

1. Wake up when your body tells you to. No one has ever been tired and inspired at the same time. To make matters worse, having our sleep disrupted by an alarm clock or other extraneous stimulus can leave us feeling groggy and grouchy, as though we had ‘woken up on the wrong side of the bed’.

2. Complete your dreams. REM sleep, which is associated with dreaming, is richest just before natural awakening. Dreaming serves a number of critical functions such as assimilating experiences, processing emotions, and enhancing problem solving and creativity. In fact, the brain can be more active during REM sleep than during wakefulness. Many great works of art have been inspired by dreams, including Dalí’s The Persistence of Memory, several of Edgar Allan Poe’s poems and short stories, and Paul McCartney’s Let It Be.

3. Eliminate distractions, especially the tedious ones. Clear your diary, remove yourself from people, take plenty of time over every small thing. You want to give your mind plenty of spare capacity. You want it to roam, to freewheel. Before going to bed, I check my calendar for the next day’s engagements, and am never happier than when I see ‘No Events’. Don’t worry or feel guilty: the sun won’t fall out of the sky. Many people are unable to let their minds wander for fear that uncomfortable thoughts and feelings might arise in their consciousness. If they do, why not take the opportunity to meet them?

4. Don’t try to rush or force things. If you try to force inspiration, you will strangle it and achieve much less overall. There may be ‘on’ days and ‘off’ days, or even ‘on’ hours and ‘off’ hours. If you don’t feel inspired, that’s fine, go out and enjoy yourself. Your boss may disagree, but it’s probably the most productive thing you could do. If you can, try not to have a boss.

5. Be curious. The 17th-century philosopher John Locke suggested that inspiration amounts to a somewhat random association of ideas and a sudden unison of thought. If something, anything, catches your interest, try to follow it through. Nothing is too small or irrelevant. Read books, watch documentaries, visit museums and exhibitions, walk in gardens and nature, talk to inspired and inspiring people… Feed your unconscious.

6. Break the routine. Sometimes it can really help to give the mind a bit of a shake. Try new things that take you out of your comfort zone. Modify your routine or your surroundings. Better still, go travelling, especially to places that are unfamiliar and disorienting, such as a temple in India or a hippy farm in the Uruguayan pampas.

7. Make a start. When I write an article, I make a start and come back to it whenever I next feel inspired. The minute I start flagging, I stop and do something else, and, hopefully, while I do that, the next paragraph or section enters my mind. Some articles I write over three or four days, others over three or four weeks—but hardly ever in a single day or single sitting. When I write a book, the first half seems to take forever, while the second half gets completed in a fraction of the time. Small accomplishments are important because they boost confidence and free the mind to move on, establishing a kind of creative momentum.

If you have any other thoughts on inspiration, please put them in the comments section.


Every time I utter the word ‘wisdom’, someone giggles or sneers. Wisdom, more so even than expertise, does not sit comfortably in a democratic, anti-elitist society. In an age dominated by science and technology, by specialization and compartmentalization, it is too loose and mysterious a concept. With our heads in our smartphones and tablets, our pay slips and bank statements, we simply do not have the mental time and space for it.

But things were not always thus. The word wisdom occurs 222 times in the Old Testament, which includes all of seven so-called ‘wisdom books’: Job, Psalms, Proverbs, Ecclesiastes, the Song of Solomon, the Book of Wisdom, and Sirach. ‘For wisdom is a defence, and money is a defence: but the excellency of knowledge is, that wisdom giveth life to them that have it’ (Ecclesiastes 7:12).

The word ‘philosophy’ literally means ‘the love of wisdom’, and wisdom is the overarching aim of philosophy, or, at least, of ancient philosophy. In Plato’s Lysis, Socrates tells the young Lysis that, without wisdom, he would be of no interest to anyone: ‘…if you are wise, all men will be your friends and kindred, for you will be useful and good; but if you are not wise, neither father, nor mother, nor kindred, nor anyone else, will be your friends.’ The patron of Athens, the city in which the Lysis is set, is none other than Athena, goddess of wisdom, who sprang out in full armour from the skull of Zeus. Her symbol, and the symbol of wisdom, is the owl, which can see through darkness.

In fact, ‘wisdom’ derives from the Proto-Indo-European root weid-, ‘to see’, and is related to a great number of words including: advice, druid, evident, guide, Hades, history, idea, idol, idyll, view, Veda, vision, and visit. In Norse mythology, the god Odin gouged out one of his eyes and offered it to Mimir in exchange for a drink from the well of knowledge and wisdom, thereby trading one mode of perception for another, higher one.

And the very name of our species, Homo sapiens, means ‘wise man’.

Wisdom in perspective

So what exactly is wisdom? People often speak of ‘knowledge and wisdom’ as though they might be closely related or even the same thing, so maybe wisdom is knowledge, or a great deal of knowledge. If wisdom is knowledge, then it has to be a certain kind of knowledge, or else learning the phonebook, or the names of all the rivers in the world, might count as wisdom. And if wisdom is a certain kind of knowledge, then it is not scientific or technical knowledge, or else modern people would be wiser than even the wisest of ancient philosophers. Any 21st-century school-leaver would be wiser than Socrates.

Once upon a time, Chaerephon asked the oracle at Delphi whether there was anyone wiser than Socrates, and the Pythian priestess replied that there was no one wiser. To discover the meaning of this divine utterance, Socrates questioned a number of men who laid claim to wisdom—politicians, generals, poets, craftsmen—and in each case concluded, ‘I am likely to be wiser than he to this small extent, that I do not think I know what I do not know.’ From then on, Socrates dedicated himself to the service of the gods by seeking out anyone who might be wise and, ‘if he is not, showing him that he is not’. He offended so many people with his questioning that, eventually, they condemned him to death—which served his purposes well, since it made him immortal.

The Bible tells us, ‘When pride comes, then comes disgrace, but with humility comes wisdom’ (Proverbs 11:2). Socrates was the wisest of all people not because he knew everything or anything, but because he knew what he did not know, or, to put it somewhat differently, because he knew the limits of the little that he did know. Shakespeare put it best in As You Like It, ‘The fool doth think he is wise, but the wise man knows himself to be a fool.’

Still, there seems to be more to wisdom than mere ‘negative knowledge’, or else I could just be super-skeptical about everything and count myself wise… Or maybe wisdom consists in having very high epistemic standards, that is, in having a very high bar for believing something, and an even higher bar for calling that belief knowledge. But then we are back to a picture of wisdom as something like scientific knowledge.

In Plato’s Meno, Socrates says that people of wisdom and virtue seem to be very poor at imparting those qualities: Themistocles was able to teach his son Cleophantus skills such as standing upright on horseback and shooting javelins, but no one ever said of Cleophantus that he was wise, and the same could be said for Lysimachus and his son Aristides, Pericles and his sons Paralus and Xanthippus, and Thucydides and his sons Melesias and Stephanus. And if wisdom cannot be taught, then it is not a kind of knowledge.

If wisdom cannot be taught, how, asks Meno, did good people come into existence? Socrates replies that right action is possible under guidance other than that of knowledge: a person who has knowledge about the way to Larisa may make a good guide, but a person who has only correct opinion about the way, but has never been and does not know, might make an equally good guide. Since wisdom cannot be taught, it cannot be knowledge; and if it cannot be knowledge, then it must be correct opinion—which explains why wise men such as Themistocles, Lysimachus, and Pericles were unable to impart their wisdom even unto their own sons. Wise people are no different from soothsayers, prophets, and poets, who say many true things when they are divinely inspired but have no real knowledge of what they are saying.

Aristotle gives us another important clue in the Metaphysics, when he says that wisdom is the understanding of causes. None of the senses are regarded as wisdom because, although they give the most authoritative knowledge of particulars, they are unable to discern the distal causes of anything. Similarly, we suppose artists to be wiser than people of experience because artists know the ‘why’ or the cause, and can therefore teach, whereas people of experience do not, and cannot. In other words, wisdom is the understanding of the right relations between things, which calls for more distant and removed perspectives, and perhaps also the ability or willingness to shift between perspectives. In the Tusculan Disputations, Cicero cites as a paragon of wisdom the pre-Socratic philosopher Anaxagoras, who, upon being told of the death of his son, said, “I knew that I begot a mortal”. For Cicero, true sapience consists in preparing oneself for every eventuality so as never to be overtaken by anything—and it is true that wisdom, the understanding of causes and connexions, has long been associated with both insight and foresight.

So wisdom is not so much a kind of knowledge as a way of seeing, or ways of seeing. When we take a few steps back, as when we stand under the shower or go on holiday, we begin to see the bigger picture. In everyday language, ‘wisdom’ has two opposites: ‘foolishness’ and ‘folly’, which involve, respectively, lack and loss of perspective. For some thinkers, notably Robert Nozick, wisdom has a practical dimension in that it involves an understanding of the goals and values of life, the means of achieving those goals, the potential dangers to avoid, and so on. I agree, but I think that all this naturally flows from perspective: if you have proper perspective, you cannot fail to understand the goals and values of life, nor can you fail to act on that understanding. This chimes with Socrates’ claim that nobody does wrong knowingly: people only do wrong because, from their narrow perspective, it seems like the right thing to do.

In cultivating a broader perspective, it helps, of course, to be knowledgeable, but it also helps to be intelligent, reflective, open-minded, and disinterested—which is why we often seek out ‘independent’ advice. But above all it helps to be courageous, because the view from up there, though it can be exhilarating and ultimately liberating, is at first terrifying—whence, no doubt, the giggles and the sneers.

Courage, said Aristotle, is the first of human qualities because it is the one which guarantees all the others.


If you know your enemies and know yourself, you will not be imperiled in a hundred battles. —Sun Tzu, The Art of War

The enemies of rational thought can take one of several overlapping forms, including formal fallacy and informal fallacy, cognitive bias, cognitive distortion, and self-deception. If these are difficult to define, they are even more difficult to distinguish.

A fallacy is some kind of defect in an argument, and may be either intentional (aimed at deceiving) or, more commonly, unintentional. A formal fallacy is an argument that is invalid by virtue of its form, that is, a deductive argument with an invalid structure, for example: Some A are B. Some B are C. Therefore, some A are C. If you cannot see that this argument is invalid, complete A, B, and C with ‘insects’, ‘herbivores’, and ‘mammals’. Insects are clearly not mammals!
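For readers who like to see this checked mechanically, here is a minimal sketch in Python: the three sets are hypothetical stand-ins for insects, herbivores, and mammals, and they satisfy both premises of the form above while falsifying its conclusion.

```python
# Counterexample to the invalid form:
#   Some A are B. Some B are C. Therefore, some A are C.
# The sets below are hypothetical stand-ins for insects (A),
# herbivores (B), and mammals (C).

A = {"locust", "aphid"}                    # insects
B = {"locust", "aphid", "deer", "rabbit"}  # herbivores
C = {"deer", "rabbit", "human"}            # mammals

assert A & B, "Premise 1 holds: some A are B"
assert B & C, "Premise 2 holds: some B are C"
assert not (A & C), "Yet the conclusion fails: no A are C"
```

A single counterexample of this kind is enough to show that the form itself, and not merely one instance of it, is invalid: any argument of that shape can fail in the same way.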

A formal fallacy is built into the structure of an argument, and is invalid irrespective of the content of the argument. In contrast, an informal fallacy is one that can be identified only through an analysis of the content of the argument. Informal fallacies often turn on the misuse of language, for example, using a key term or phrase in an ambiguous way, with one meaning in one part of the argument and another meaning in another part (fallacy of equivocation). Informal fallacies can also distract from the weakness of an argument, or appeal to the emotions rather than to reason.

As I argued in Hide and Seek, all self-deception can be understood in terms of ego defence. In psychoanalytic theory, an ego defence is one of several unconscious processes that we deploy to defuse the fear and anxiety that arise when who we truly are (our unconscious ‘id’) comes into conflict with who we think we are or who we think we should be (our conscious ‘superego’). For example, a person who buys a $10,000 watch instead of a $1,000 watch because “you can really tell the difference in quality” is not only hiding his (unrecognized) craving to be loved, but also disguising it as an ego-enhancing virtue, namely, a concern for quality. Whereas formal and informal fallacies are more about faulty reasoning, self-deception is more about hiding from, or protecting, oneself.

Cognitive bias is sloppy, although not necessarily faulty, reasoning: a mental shortcut or heuristic intended to spare us time, effort, or discomfort, often while reinforcing our image of the self or the world, but at the cost of accuracy or reliability. For example, in explaining the behaviour of other people, our tendency is to overestimate the role of character traits over situational factors—a bias, called the correspondence bias or fundamental attribution error, that goes into reverse when it comes to explaining our own behaviour. So, if Charlotte omits to mow the lawn, I indict her for forgetfulness, laziness, or spite; but if I omit to mow the lawn, I excuse myself on the grounds of busyness, tiredness, or bad weather. Another important cognitive bias is confirmation, or myside, bias, which is the propensity to search for or recall only those facts and arguments that are in keeping with our pre-existing beliefs while filtering out those that conflict with them—which, especially on social media, can lead us to inhabit a so-called echo chamber.

Cognitive distortion is a concept from cognitive-behavioural therapy, developed by psychiatrist Aaron Beck in the 1960s and used in the treatment of depression and other mental disorders. Cognitive distortion involves interpreting events and situations so that they conform to and reinforce our outlook or frame of mind, typically on the basis of very scant or partial evidence, or even no evidence at all. Common cognitive distortions in depression include selective abstraction and catastrophic thinking. Selective abstraction is to focus on a single negative event or condition to the exclusion of other, more positive ones, for example, “My partner hates me. He gave me an annoyed look three days ago (even though he spends all his spare time with me).” Catastrophic thinking is to exaggerate the consequences of an event or situation, for example, “The pain in my knee is getting worse. When I’m reduced to a wheelchair, I won’t be able to go to work and pay the mortgage. So, I’ll end up losing my house and dying in the street.” Cognitive distortions can give rise to a vicious circle: the cognitive distortions feed the depression, which in turn feeds the cognitive distortions. Cognitive distortion, broadly understood, is not limited to depression and other mental disorders, but is also a feature of, among others, poor self-esteem, jealousy, and marital or relationship conflict.

Are there any other enemies of rational thought? Please name them in the comments section.

Einstein held that imagination is more important than knowledge: “Knowledge is limited. Imagination encircles the world.”

I define imagination as the faculty of the mind that forms and manipulates images, propositions, concepts, emotions, and sensations above and beyond, and sometimes independently of, incoming stimuli, to open up the realms of the abstract, the figurative, the possible, the hypothetical, and the universal.

Imagination comes in many forms and by many degrees, ranging from scientific reasoning to musical appreciation, and overlaps with a number of other cognitive constructs including belief, desire, emotion, memory, supposition, and fantasy. Belief, like perception, aims at according with reality, while desire aims at altering reality. Emotion also aims at according with reality, but more particularly at reflecting the significance of its object, or class of object, for the subject—an aspect that it shares with many forms of imagination. Like imagination, memory can involve remote imagery. But unlike imagination, it is rooted in reality and serves primarily to frame belief and guide action. Memories are often more vivid than imaginings, which are, in turn, more vivid than suppositions. Suppositions tend to be cold and cognitive, and lacking in the emotional and existential dimensions of imagination. Finally, fantasy may be understood as a subtype of imagination, namely, imagination for the improbable.

I say the improbable rather than the impossible, because there is a theory that, just as perception justifies beliefs about actuality, so imagination justifies beliefs about possibility (or at least, metaphysical as opposed to natural possibility). To quote Hume, ‘It is an established maxim in metaphysics, that whatever the mind clearly conceives, includes the idea of possible existence, or in other words, that nothing we imagine is absolutely impossible.’ Could ghosts, the devil, time travel etc. really be possible? I think inconceivability may be a better guide to impossibility than conceivability to possibility. But what does it mean for something to be conceivable or inconceivable, and by whom?

It has to be said that, until very recently, most human societies did not mark a strict divide between imagination and belief, or fiction and reality, with each one informing and enriching the other. In fact, it could be argued that, in many important respects, fiction took precedence over reality—and even that this has been, and perhaps still is, one of the hallmarks of our species. Today, there are pills for people who confuse imaginings and beliefs, but back then no one ever thought that life, with its far harsher hardships, might be meaningless—which I think tells us quite a bit about imagination and its uses, and also about mental illness.

The uses of imagination are many, more than I can enumerate. Most children begin to develop pretend play at around 15 months of age. What are children doing when they pretend play? And why are they so absorbed in works of imagination? When I was seven years old, I would devour book after book, and plead with my parents for those not already on the shelves. By playing out scenarios and stretching themselves beyond their limited experience, children seek to make sense of the world and to find their place within it. This meaning-making is joyful and exciting, and has an echo in every act of creation.

When we look at the Mona Lisa, we see much more than just the brushstrokes. In fact, we barely see the brushstrokes at all. In imagination as in our dreams, we ascribe form, pattern, and significance to things, and then reflect them back onto those things. Without this work of interpreting and assimilating, the world would be no more than an endless stream of sense impressions, as it might sometimes seem to those who lack imagination, with no hope of escape or reprieve.

More than that, by imagination we are able to complete the world, or our world, by conjuring up the missing parts, and even to inhabit entirely other worlds such as Middle-earth or the Seven Kingdoms. Imagination remains highly active throughout adulthood, and what is chick lit or even pornography if not an aid to the adult imagination? In one year (2017), Pornhub recorded 28.5 billion visits, equivalent to about four times the world population—and that’s just on the one site.

If imagination enables us to feel at home in the world, it also enables us to do things in the world. Science progresses by hypothesis, which is a product of imagination, and philosophy makes frequent use of thought experiments such as the brain in a vat and the trolley problem, going back at least as far as Plato’s Republic. Imagination makes knowledge applicable by forming associations and connections. It opens up possibilities, and guides decision-making by playing out those possibilities. So many of our failures, and some of our successes, are in fact failures of the imagination.

Imagination also enables us to talk to one another, understand one another, and work together. Without it, there could be no metaphor, no irony, no humor, no past or future tense, and no conditional. Indeed, there could be no language at all, for what are words if not symbols and representations? By imagination, we can put ourselves in other people’s shoes, think what they think, and feel what they feel. Problems in autism, which can be interpreted as a disorder of imagination, include abnormalities in patterns of communication, impairments in social interactions, and a restricted repertoire of behaviors, interests, and activities.

I’m lucky to have received a decent education, but one thing it didn’t do was cultivate my imagination. In recent years, I’ve been trying to recover the bright and vivid imagination that I left behind in childhood.

I’ve been doing just three things, all of them very simple—or, at least, very simple to explain.

  • Being aware of the importance of imagination.
  • Making time for sleep and idleness.
  • Taking inspiration from nature.

I’ll conclude with these few words from William Blake, which hint at the significance of the natural world and the transcending power of imagination:

The tree which moves some to tears of joy is in the eyes of others only a green thing that stands in the way. Some see nature all ridicule and deformity… and some scarce see nature at all. But to the eyes of the man of imagination, nature is imagination itself.