The world is officially meaningless, the individual has only the status of consumer in a blind market-driven universe
Cross-posted from Aurelien’s substack “Trying to understand the world”
Since it was part of the argument of my last essay that pundits and politicians often had no real idea about what “war” with Russia might actually mean, I decided to gird up my loins and glance through some recent media articles on the subject. And indeed, in all parts of the political spectrum, and irrespective of sympathies, it did seem that many writers had little idea what they were talking about, and little awareness that they had little idea what they were talking about, either. This has been the case since the beginning of the crisis, and it reflects the fact that understanding what’s going on in Ukraine, why it happened and how it might turn out, is objectively difficult, and requires acquired knowledge, reflection and ideally personal experience: a combination, together with time to develop ideas, which you don’t often find these days.
Then it occurred to me that Ukraine was not the only case where the intelligentsia of today (if you can call it that) seemed just to have given up, and retreated into slogans and name-calling. In an age when more people are notionally better educated than ever before, and where apparently limitless information is available on the Internet, we seem to be less intellectually able to engage with, let alone grasp, big issues than was ever the case before. And this applies from the productions of popular culture, up to the announcements and actions of governments and international organisations. As it happens, we’ve been in political crisis for months in France now, with no prospect of the Parliament agreeing a budget, let alone disgorging a majority, but media coverage of it is sporadic and personality-based, at best: it’s just all too surreally complicated. Let’s talk about things we think we understand, instead.
Other examples are easy to find. Problems like climate change, the depletion of natural resources, the effects of Long Covid, or the progressive economic and social collapse of western states are not exactly hiding from us, but our societies and those responsible for taking decisions seem intellectually paralysed before them. On the one hand, climate change and environmental degradation are accelerating; on the other, municipal authorities are promoting recycling and planting trees. Yes, every little helps, I know, but too many of these measures seem to me like attempted magical rites, somehow intended to affect a problem we can’t properly comprehend, much less think of ways of dealing with. And you can have all the political will in the world, but unless you understand what you are doing and why, and how, all that will is useless. And we don’t. Sellotaping yourself to works of art and demanding that governments “do something” is just a demonstration of intellectual failure and defeat on your own part.
The common thread among the major problems of the world today, indeed, is that they just seem too complex for us to even begin to get a grip on them. Partly, it’s a question of simple scale. We know that sea-levels are rising, and we may even realise that many important cities in the world are on low-lying coasts. But can we deal, intellectually, with the possible consequences of the inundation of the Lagos metropolitan area, where some twenty million people live? Where would we even start? And how will societies deal with the millions of children whose immune systems were damaged by Covid when they were tiny, will never be able to work, and will need medical treatment for sixty or seventy years? Such questions, and there are many others, are actually too big to contemplate, and our current political class and the Professional and Managerial Caste (PMC) are not intellectually equipped to understand them, far less to deal with them.
The practical consequences of this failure are typical of the way in which human psychology works: instead of at least trying to deal with major problems we fear we can’t solve, we take refuge in problems we can address, and can in principle do something, anything, about. In many ways, the ludicrous beating of war-drums in Europe (the militarism of the traditionally anti-militaristic) is an attempt to turn the very complex and threatening problems after the defeat, some of which I’ve discussed many times, into something that the political leadership and the PMC think they understand, from Hollywood films and Powerpoint slides. At least they know how to create money, buy things and have consultants develop ambitious and unrealisable plans. Just don’t ask them to solve real practical problems: it’s too difficult. This mentality applies at all levels: your university may be haemorrhaging good-quality staff, have problems attracting students and need massive rebuilding of its scientific laboratories, all of which is too difficult. But what you can do is start a whispering campaign against the Vice-Chancellor to force him out and have him replaced by a woman. There you are, you’ve accomplished something. Indeed, I would argue that the growth of Identity Politics essentially reflects our society’s decreasing capability to solve serious problems, and the consequent attraction of addressing trivial ones that you think you can actually manage.
So how did we get here? That would take a book, but I just want to mention a few contributory factors. One is certainly the managerial mindset of the last couple of generations, which has trained an entire class to believe that fiddling with problems somehow equates to resolving them, and that anyway there are no problems you can’t solve with a Powerpoint presentation. Another is the decay of genuine knowledge and practical capability in favour of credentials whose only purpose is to get you a better job, or indeed a job at all. A third is the massive emphasis today on financial outcomes and the associated belief that they are somehow “real” in the sense that flooding or infectious diseases are real. And of course there are few prizes these days for actually trying to address fundamental problems anyway, since that presupposes both an interest in real outcomes as opposed to financial ones, and a willingness to look at the long term, which our society no longer does. The result is the collective turning of a blind eye to problems that are simply too complicated for our society to understand. After all, something may turn up. Meanwhile, if these are the last days, we need to grab what we can while we can.
But I think there’s a deeper set of issues as well, to do with our vision of the world, or more precisely our lack of one. Most importantly, we have—to put it simply—moved from the traditional view that everything was connected, to the modern view that nothing is connected. The idea of looking at problems holistically, which survived the rise of modern science at least for a time, has now been entirely lost, and we actually have difficulty remembering how complex and interrelated the world once seemed, if indeed we ever learnt about it. We have lost the intellectual habit of considering the relationship of problems to each other, as previous religious, social and political beliefs encouraged us to do. Everything now arrives retail, like an Amazon parcel, disconnected from the rest of the world and from any wider picture. It is as though every problem were encountered, shorn of all context and history, for the first time.
This would have astonished our ancestors, for whom everything was connected, and actions taken over here had consequences over there. We may vaguely have heard of the Great Chain of Being, or that the world was once Enchanted, but we have very little idea of what that meant. So imagine, if you will, the world (and the universe insofar as there was a distinction) as a connected whole. It is like a gigantic book written by God, where all knowledge and all truth is stored, and where everything reflects and influences everything else. Once we learn how to read this book, all knowledge is at our disposal. The truth, in other words, is in there, and we simply need to understand how to interpret it. Signs and symbols abound (you can see why Umberto Eco began as a medievalist), and all natural phenomena, from flights of birds to shapes of plants to signs in the sky, convey information to those who want to understand.
It’s therefore logical to think that divination could help to explain the present and even provide indications about the future. Whether you used highly-sophisticated astrological calculations or just threw coins, you were tapping into the underlying structure and processes of the universe itself, which was an integrated whole, and which worked according to laws that human beings could learn about and understand. Needless to say, we are almost infinitely far from that situation today.
Actually, not everybody believes that, and Weber’s thesis of the “disenchantment of the world” (which he thought was progress, by the way) has come in for much criticism recently. But the word Weber used, Entzauberung, comes from the word Zauber (yes, as in Mozart’s opera) and really means “un-magicking.” That is to say that the traditional holistic magical view of the universe, one of causes and correspondences, as above so below, as below so above, was replaced by random and often inexplicable, wholly mechanistic relationships between unrelated, lifeless phenomena. The fact that people today may read horoscopes, or that books on Buddhism and Wicca remain popular, is just a sociological phenomenon, a small rebellion if you like against the dominant contemporary paradigm of a soulless and meaningless universe. (If the universe is a book, then today’s edition is written by Samuel Beckett.) We have lost the Magical Universe, and we won’t get it back, although if you are familiar with cultures in parts of Africa and Asia, you will know that they have hung on to much more of it than we have. The wider consequences of that bear thinking about.
What do we have instead? Well, nothing much, because it’s very hard to make sense of what happens in the world without at least some broad intellectual foundation to rely on, and that we no longer have. Various religions have been confident their holy books provided this foundation. So Christianity inherited from Judaism a four-layered series of interpretations of the Bible, of which only the first represented the plain meaning, while the others were allegorical. (It added the idea that everything in the New Testament was prefigured by an incident in the Old Testament, and that the rest of history was foretold there.) Likewise, part of the attraction of fundamentalist Islam is that it does indeed have a consistent and totalising world view, and that its writings contain, or can be made to divulge, answers to every question you might ever want to ask. To the extent that such world-views persist, they act among other things as a corpus of beliefs and practices that give the world, even if imperfectly, a continued and coherent meaning. (Needless to say, understanding the continued power of fundamentalist religions, in the Muslim world but also in parts of sub-Saharan Africa and the United States, is too intellectually difficult for our society, so pundits fall back on trivial and reductivist explanations which are at least within their ability to articulate.)
Yet for a very long time now, anyone who ventured into the world of Biblical scholarship was surprised to discover how fragile and contingent the text is. Eco’s monks were probably using the Vulgate Bible, a fourth-century collection by various hands from Greek, Hebrew and Latin, sometimes including translations of translations, and which itself competed with other versions. This was bad enough, but as Charles Taylor has pointed out, the rise of Protestantism, with its distrust of ritual and ecclesiastical hierarchy, and its emphasis on personal links with God and close reading of the Bible, and “thinking for yourself” about what it meant, not only helped to produce our modern, individualist un-magical world, it also enabled an almost infinite variety of competing meanings to be extracted from different translations, as the previously centralised control of Biblical interpretation broke down. The wider consequences have not always been happy, and the intellectual habits it engendered still have resonances today.
The nearest the modern western world has ever come to such a totalising system is Communism. Now I say “Communism” and not “Marxism” advisedly, because Marxism is a system of thought and analysis, which always existed independently of particular political systems, and still does. It stands or falls by its explanatory power, much as Newton’s laws of motion were not invalidated by faulty design of early rocket motors. Whereas practical Marxism was an intellectual and social hobby for middle-class thinkers, Communism was a complete system present at all levels of society. We tend to think of the Soviet Union in this context, but in many ways countries with mass political parties provide better examples. In France or Italy fifty years ago, where Communist Parties attracted perhaps a fifth of the electorate, they were effectively parallel states, often controlling entire cities and regions, with their own media, their own festivals, their own ethic of service and even their own educational activities. Moreover, they were part of an international system directed from Moscow, which, like the medieval Catholic Church, tolerated no dissent. When troubling events occurred, like the suppression of the 1956 Hungarian uprising, newspapers, magazines, local Party officials, distinguished intellectuals and commentators on radio and TV were on hand to tell people that they shouldn’t worry, and Moscow was right.
In the West, the steam had begun to run out of this system by the end of the 1960s, and “Marxist” parties, as I knew them then, were starting to become feuding talking shops, where jokes about holding annual conferences in telephone boxes were not entirely unfair. But it’s worth pointing out that Communist Parties were found all over the world (thus, I think, disposing of Bertrand Russell’s facile argument that Communism was just a Christian heresy.) Leaving China aside as a special example, one of the modernising effects of colonialism and interwar League of Nations mandates was the spread of progressive and left-wing ideas into deeply traditional societies. At one point, the Indonesian Communist Party was the third largest in the world, and its opposite numbers led a vigorous, if subterranean, existence in former Ottoman states such as Iraq and Syria. These movements are best seen as attempts to recreate the totalising effect of religion, but in a secular context, to assist in modernisation and nation-building projects. The failure of western-style politics, including Marxism, in the Arab world is acknowledged to be the primary explanation for the current interest in fundamentalist Political Islam as, effectively, the only political system that hasn’t been tried, and the only chance for societies caught between modernism and tradition to find a coherent explanation for the world.
In the West, Marxism has become a boutique enterprise, with some powerful and important thinkers, and some highly relevant things to say about the world, but these days without an overarching structure or even a shared overarching vision of the world. Its descendants, from Marcusian miserablism to glum Identity Politics, actually split society into smaller and smaller warring factions, and deny even the possibility of positive change and evolution, so complete, they argue, is the domination of capitalism/the consumer society/the patriarchy/racial groups and power structures generally. Just the kind of stuff you need when you want to be cheered up and motivated. At least Communism had a vision.
So it’s hardly surprising that people feel so alone, grasping at whatever explanatory systems they can find to help them orient themselves, and to make sense of events as they happen, and sometimes choosing pretty eccentric or even dangerous ones. Now in theory it shouldn’t be like this. The secular age has liberated us from the dead hand of the Church, it says here, hierarchical education systems have been dynamited and replaced by “co-learning,” and traditional authority is mocked and distrusted. So the way is open for each of us to reach our own conclusions and affirm our own opinions, in the glorious personal intellectual independence of our Liberal society.
Now it’s important to concede that the initial premise of Liberal thinking in this area was that people (or at least the Liberal elites) should be free to hold and express personal opinions, especially about politics, even if those opinions displeased authority. And for a while, this was arguably how many western societies worked, even if such freedom is rapidly disappearing today. But the wider purpose was to advance the position of relatively small, educated groups who wanted to challenge the existing political system and replace it with one which gave them more influence, as well as undermine the power of the Church. It was not a licence for anyone to say anything they liked, and hold any opinion they wished. Liberals in power turned out to be just as repressive as monarchists, and indeed Liberal states saw the growth of bureaucracies, of “experts,” of universities and learned institutions to which one was expected to defer, rather like the Church. And, being fair to Liberalism twice in the same paragraph, in those days such institutions and individuals were generally conscientious and did the best job they could: something else we have lost.
The progressive emancipation of Liberalism from outside restraints and influences has produced the effect that might have been anticipated. The assault on even the attempt to find some kind of usable accepted truth, the deconstruction of everything until deconstruction ate itself, and most of all the obsessive creation and sustenance of the alienated individual, with no past, no history, no culture and no society, indeed no function but consumption, have produced a society where we are abandoned in the name of freedom. This process has also, logically enough, destroyed the intermediate structures to which people could reliably turn in the past for a coherent interpretation of events. The argument is essentially the same as that which encourages us to be “CEO of our own life,” to arrange our own retirement, to “take responsibility” for our mental and physical well-being. It is servitude under the guise of freedom, placing responsibilities upon us that few of us can manage, and taking away the support structures of the past. Its result is to make us less powerful, and more dependent.
Of course, a lot of people don’t see it like that, or at least they think they don’t. Individualism has always been a popular cause (as the joke of my teenage years had it, “Dad, why can’t I be a nonconformist like all the others?”) But as with a lot of things, the actual implementation turns out to be a bit trickier than we thought. You can, of course, make ringing declarations about independence and being an individual, master of my fate, captain of my soul etc. One that comes to mind is from the famous poem by AE Housman, who, though “a stranger and afraid/in a world I never made”, yet asserted that:
The laws of God, the laws of man,
He may keep that will and can;
Not I: let God and man decree
Laws for themselves and not for me.

Yet Housman led a notably miserable life, and it’s hard to argue that his aggressively vaunted independence actually benefited him very much. In fact, most self-conscious “rebels” (Baudelaire is another good example) have led lives of miserable failure, because they spent too much of their time just rebelling, and not enough trying to construct a viable alternative life for themselves.
The standard presentation, I suppose, would be “I don’t take my opinions from others, I consider all the facts and decide for myself.” Fair enough, but how exactly do you do that? On what basis? After all, a couple of centuries ago, the freedom Liberals demanded was essentially to hold unpopular opinions without being penalised. I don’t think (and this is the last time I’m being fair to Liberalism today) they ever anticipated an anarchic free-for-all, often without any agreement on even the most basic facts. Yet this is how many people—especially aggressive individualists—do actually see things today. I’ve already mentioned some higher profile issues, but here I want to discuss a more detailed case, precisely because making judgements about it would require knowledge I don’t have, and indeed very few people do.
Earlier this year, the US carried out a bombing raid on what they claimed were nuclear weapon research facilities in Iran. Claims were made about how many aircraft of what type were involved and what the effect was. Many things, including the involvement of other nations, are still not clear and probably never will be. (I saw an official statement from the Pentagon last week, which is why it popped back into my mind.) Now to write something intelligent about the episode you should ideally have a background in military aviation and mission planning, a good theoretical knowledge of the effects of deep-penetrating air-dropped weapons, a good understanding of Iranian air defence systems, an equally good understanding of US electronic countermeasures, skill in the interpretation of satellite photographs, expertise in the geology of the region, a good idea of the configuration of tunnels the Iranians had constructed, and preferably have personally inspected the damage. Clearly no one person is ever likely to have this collection of knowledge: even governments can only pretend to parts of it. Yet the episode was written about extensively, and often by people with little or no knowledge of the technical details.
So where did they get their opinions from? Well, mostly, they either cited or silently reproduced arguments from other commentators with at least some technical knowledge in one or more of these areas. There was a wide variety of such analyses to choose from, so how does the generalist pundit, writing for the media or for their own Internet site, consider all the facts and decide for themselves? After all, the foundation for the belief in the worth of individual judgement is the idea that all facts are in principle knowable, and that human beings, as rational animals, can make judgements between them. So over here is the official statement of the US government after the operation, over there is an expert on “geostrategy” and elsewhere still is a physicist who once worked on weapons design. Who do you believe, and whose thought will you reproduce: how do you even decide? (I’m happy to say that I don’t know the truth of this episode, and I feel under no compulsion to pronounce on it. But then my livelihood doesn’t depend on such things.)
Well, it so happens that we know a great deal about how humans decide between competing explanations: in a word, they do so emotionally. As Daniel Kahneman, whom I’ve mentioned before, has shown at some length, we make most of our decisions quickly and emotionally, based on instinct. These decisions, made with what he called System 1 thinking, are the residue of the time when life was more threatening, and quick, instinctive decisions might save your life. Yet most of the important decisions we have to make in life actually require System 2 thinking, where we have to consider the evidence carefully. Crudely, we can say that most people make System 1 decisions about who to believe when they should be making System 2 decisions. Which is to say: this person appeals to me, their politics resembles mine, they attack targets I also dislike, they must be right about this issue. And given the fearsome complexity of almost every international crisis, this is all you can really do: “deciding for yourself” is in practice just about subjectively deciding who to believe.
Oddly, this puts us back in the Middle Ages. Surprisingly often, when pundits are challenged, they will cite a source that they claim is authoritative, or should be treated as such. This is the traditional practice of the Argument from Authority, which usually takes the form “X is an expert on A, B is an example of A, therefore X’s views on B must be correct.” In spite of being an obvious logical fallacy, it is a form of argument that is still very often encountered today. (Its extreme form has the wonderful name of ipsedixitism, or “he himself said it”, so there’s no argument.) However, in the Middle Ages there were recognised “authorities” (notably Aristotle) who were not challenged. Generally, it was through their writings that they were considered authoritative: “author” comes from the same root as “authority.” Obviously, also, the Bible was an Authority, but the Church insisted on the monopoly of authoritative readings of it. In both cases, as well as in traditional societies generally, and as I pointed out in one of my first essays, authority was actually based on something relatively coherent, such as age and experience, intellectual pre-eminence or even simple antiquity (the older the better.) We don’t have this today: on one side an experienced military officer who says that the Russians are suffering terrible casualties in Ukraine, on the other, an experienced military officer who says they’re not. Who we believe depends essentially on what we want to be told. It is most unlikely that we will have the necessary expertise and information to evaluate their arguments.
Now of course there are certain things we can do to “think for ourselves,” but they mostly involve access to facts and technologies that the ordinary person doesn’t have, which is why in fact the ordinary person can’t just “make up their mind.” (I’m not concerned with “fake news” and such here.) Sometimes, though, a little logical thought can help. For example, during the 1999 Kosovo crisis, when hard information of any kind was difficult to come by, there was a report that the Serb Police had massacred twenty schoolteachers in a village and left their bodies in a ditch. As usual, people took up positions according to their emotional predispositions. But when we thought about it, the number seemed very high. After all, assume a reasonably generous pupil-teacher ratio of 35:1, and twenty teachers implies a school or schools with some 700 pupils, even supposing that every single teacher was killed. It seemed unlikely that there were many villages in Kosovo with 700 children of school age, or indeed 700 inhabitants at all. And in due course, it emerged that the report had been garbled, and twenty bodies had been found, one of whom was believed to be a teacher.
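That kind of back-of-the-envelope plausibility check is easy to make explicit. The sketch below is simply the arithmetic of the anecdote written out; the figures are illustrative, taken from the report itself rather than from any other source:

```python
# A minimal sketch of the plausibility check described above.
# All numbers are illustrative, drawn from the anecdote itself.

reported_teachers_killed = 20
pupils_per_teacher = 35          # a deliberately generous ratio

# If every teacher in the village's school(s) had been killed,
# the implied number of pupils would be:
implied_pupils = reported_teachers_killed * pupils_per_teacher
print(f"Implied school population: {implied_pupils}")  # 700

# A village with far fewer than 700 inhabitants, let alone 700
# children of school age, cannot plausibly employ twenty teachers,
# so the reported figure looks garbled on its face.
```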
You can do this on a larger scale if you really want to “think for yourself,” but for that you need time and resources that few of us have. An important study (a few years old now but the situation can only have got worse) demonstrated that many of the facts and figures quoted about high-profile, controversial issues such as human trafficking and deaths from conflict are not so much exaggerated as simply made up, and passed from hand to hand until they are quoted by a reputable organisation or a government, at which point they become canonical. NGOs and campaigners justify their exaggerations, and even outright inventions, by claiming to be “bringing attention” to a problem, but of course the result is to kick off a pointless and distasteful race to prove that My Problem is Bigger than Yours. And enquiring scepticism of any kind is often attacked with emotional blackmail (“I suppose you think that human trafficking isn’t a problem then!”)
But you can do the same thing yourself, in a minor key, if you are prepared for a bit of work. It’s often interesting to click on links in polemical articles, which should, under normal good practice, lead to some authoritative source. In practice they often just lead to another article saying the same thing, which may cite another article saying the same thing, and in the end you never get to any actual evidence at all. But most people won’t care, of course, so long as the article tells them what they want to hear.
Now there are subjects—ethical ones for example—which are less evidence-dependent, and where there is supposedly more scope for “deciding what to think.” Take abortion for example. After all, we have all been a foetus, we have all been born and most adults have children. So you would expect that in a survey of perhaps a thousand people, you would find a large number of different opinions, often with several nuances. But in practice, all such enquiries show a cluster around a small handful of positions, often characterised by deep emotional involvement, and vehement and violent dismissal of other opinions. Yet this is only an extreme case of the tendency for people to shelter in emotional silos, clinging to whichever of the most common views they instinctively identify with.
The violence with which such emotions are expressed comes ultimately from fear. Our society does not value or trust logical argument, and surprisingly few people can actually construct a logical argument unaided: not much chance of “deciding for yourself,” therefore. And yet our society tells people that they should “question everything” and “reach their own conclusions.” This is hypocrisy, of course: there are an increasing number of ideas which are not allowed to be questioned, and where reaching your own conclusions makes you very unpopular. The reality is that the construction of logical arguments is not a skill we are born with, and a willingness to hold and defend genuinely personal opinions is a good way to make yourself loathed by all sides. It is conventional now to sanctify George Orwell, but he was a marginal figure in his own time, scarcely known before the publication of Animal Farm. His insistence on coming to his own conclusions and expressing them (often drawing on his own personal experiences) made him unpopular not only with the Right, for his Socialist views, but with the Left, then dominated by Communists and fellow-travellers. He’d have trouble finding a substantial audience today (“which side are you on, then, George?”)
If we were serious about “thinking for yourself,” then we would take steps to help people do so. For the last fifty years the slogan has been “teach children how to think,” rather than introducing them to systems of thought. Because I’ve been involved in education a bit, I have occasionally asked people what the syllabus for this would be, and how it would be taught. Mumble, mumble, teach children to question everything is the usual response, and as we’ve seen that’s deeply hypocritical. In fact, there is no question of “teaching children how to think,” but rather of teaching children that they will receive no help in their intellectual development, and so are required to “think for themselves,” much in the way that they are expected to choose between detailed and complex insurance policies, or evaluate the risks of taking various medicines. No-one is going to help them.
It’s interesting to imagine what such a syllabus would actually consist of. For a start, it would include formal logic, both to enable people to construct coherent arguments and, much more important, to recognise logical fallacies in the arguments of others. Most people have no idea of what logical argument and analysis really are, and hearing examples of them for the first time can induce a drowning sensation and a feeling of the earth giving way. (“But that can’t be right!”) As I tell students, be very careful following chains of logical argument, because they could lead you to places you didn’t intend to go. Much better to start from an acceptable conclusion and construct a plausible-sounding argument to support it. The syllabus would also include rhetoric, again less to teach rhetorical skills than to help students identify the misuse of rhetoric by others. Logic and Rhetoric, of course, were two of the three branches of the Medieval Trivium: the third, Grammar, to help with clear expression, would probably be unacceptable to teach today. Together with the Quadrivium (Arithmetic, Astronomy, Geometry and Music) they were the “thinking skills” of the time, which enabled the highly complex and formalised Disputatio to be organised by scholars. I suppose that’s what “teaching children to think” means. It’s a shame we don’t do it any more, but rather we deny the very concept of meaning except as a function of power, we define words to mean what we want, we regard logic as a form of oppression and we place What I Feel at the summit of truth, assuming we even accept that truth could exist.
So we are strangers and afraid in a world we never made, to a degree that Housman could never have imagined. The world is officially meaningless, the individual has only the status of consumer in a blind market-driven universe, history cannot be discussed, culture is a form of oppression and the only shared concept of the world is a vulgarised nineteenth-century materialist scientism, a dead universe of blindly colliding atoms. This makes some people unhappy. But they are told that it is they who are responsible for their happiness or lack of it, and so they should “think for themselves,” in this as in all other areas. But as in all other areas it is a lie: all that we are given is an artificial choice between what Orwell called the “smelly little orthodoxies which are now contending for our souls.” But then Orwell was old-fashioned enough to think in terms of souls.
