WHO’s learned nothing from the swine-flu panic?

The over-reaction to H1N1 influenza in 2009 was built on years of waiting for ‘the Big One’.

Over the past few days, the sixty-fourth session of the World Health Assembly (WHA) has been held in Geneva. The WHA is the highest decision-making body of the World Health Organization (WHO). It comprises delegations, up to ministerial level, from the WHO’s 193 member states.

Among the agenda items was to be a discussion of the International Health Regulations 2005 (IHR) in relation to pandemic A (H1N1) 2009 – colloquially known at the time as ‘swine flu’. The IHR first came into force in 2007 and were established to facilitate international cooperation in preventing and responding to acute public-health emergencies, such as the outbreak of influenza that appeared to originate in Mexico two years ago.

The 180-page report, presented by the IHR Review Committee to the WHA, certainly seems impressive. Aside from drawing on numerous national and institutional inputs, it credits well over a hundred individuals from a vast array of agencies worldwide, including the WHO, with contributing in some form to its findings.

But, in essence, only one point of any note is made in it: ‘Critics assert that WHO vastly overstated the seriousness of the pandemic. However, reasonable criticism can be based only on what was known at the time and not what was later learnt.’ This is felt to be of such significance that it is stated three times – in the executive summary, in a slightly modified form in the body of the text, and again in the conclusions. It is intended as a robust rebuttal to those voices – in the media, the medical professions, and elsewhere – who have questioned the global response to H1N1, and the WHO’s role in shaping this response.

Foremost among these has been Paul Flynn, a British Labour MP and rapporteur to the Social, Health and Family Affairs Committee of the Council of Europe, through which he successfully promoted an inquiry into the matter. This inquiry primarily questioned the role of advisors to the WHO who, being employed by large pharmaceutical companies that produce anti-viral drugs and vaccines, were held to have had an economic motive for raising public concerns about swine flu.

The editor of the British Medical Journal, Fiona Godlee, and others have similarly pointed to possible conflicts of interest, as well as a lack of transparency within the WHO relating to advice and appointments. Sam Everington, former deputy chair of the British Medical Association, went on the record to argue that, in his opinion, the UK’s chief medical officer and the government were ‘actively scaremongering’.

Quite a number of countries worldwide have also raised criticisms since the pandemic abated, ruing the vast stocks of vaccine they purchased at considerable cost, which have remained unused.

But, just as with the official review of the UK’s response to the outbreak, these voices and views are simply non-existent as far as the IHR Review Committee and the WHO are concerned. And anyway, as the report repeatedly reiterates, it is the considered opinion of international public-health specialists that claims of over-reaction to what turned out to be a comparatively mild illness are misguided. Those who make such claims are held to be cavalier and complacent about the risks that would have been entailed had the situation been different.

What’s more, much emphasis is placed in the report on the fact that Margaret Chan, the director-general of the WHO, and other WHO staff consistently tried to calm matters down, repeatedly noting that the overwhelming majority of cases were mild and recommending to governments that there was no need to restrict travel or trade. If anyone went beyond the measures that were officially advocated then the WHO could hardly be held responsible for this, the report contends. Hence it is to the media, and in particular new social media, that blame is attached.

But all this is to woefully misunderstand and underestimate how communication about risk affects contemporary society. Regulations and warnings are not issued into a vacuum. People and institutions do not merely respond to messages on the basis of the precise information contained within them. Rather they interpret these through the prism of their pre-existing cultural frameworks.

For example, when the former UN weapons inspector Hans Blix advised the world in 2002 that he could find no evidence for weapons of mass destruction in Iraq, it is quite clear that, rather than reading this at face value, the response of the US authorities was to assume that any such weapons were simply well hidden. In other words, they did not allow the facts to stand in the way of their mental model of the world – one in which the Iraqi authorities would invariably lie and operate surreptitiously, regardless of evidence to the contrary.

Likewise, whatever the WHO likes to think it announced about the outbreak of H1N1 influenza in 2009 – ignoring, presumably, the fact that the director-general herself described it as ‘a threat to the whole of humanity’ – its officials should also have been sensitive to the reality that their messages would emerge into a world that had steadily been preparing itself for a devastating health emergency for quite some time.

Indeed, much of this ‘pandemic preparedness’ had been instigated and driven by the WHO itself. It is quite wrong therefore for the IHR Review Committee report to argue that any criticism of the WHO was based on ‘what was later learnt’. It is clear that the global public-health culture that the WHO itself helped to create in advance would inevitably result in just such an over-reaction. It is even possible to go further than this and to predict right now that this will not be an isolated incident. Lessons may be learnt, but mostly the wrong ones.

A critical article in Europe’s largest circulation weekly magazine, Der Spiegel, published just over a year ago, noted how prior to the advent of H1N1 in 2009, ‘epidemiologists, the media, doctors and the pharmaceutical lobby have systematically attuned the world to grim catastrophic scenarios and the dangers of new, menacing infectious diseases’. Indeed, it seemed at the time of the outbreak, to one leading epidemiologist at least, that ‘there is a whole industry just waiting for a pandemic to occur’.

In this, as the IHR Review Committee report makes clear, ‘The main ethos of public health is one of prevention’, before continuing: ‘It is incumbent upon political leaders and policy-makers to understand this core value of public health and how it pervades thinking in the field.’ The authors appear to believe that this is a radical outlook; in fact, this precautionary attitude is the dominant outlook of our times. In that regard at least, the WHO and others were merely doing what came naturally to them when they acted as they did in 2009.

It is the case today that both elites and radicals view the world in near-permanent catastrophist terms. This apocalyptic outlook emerged as a consequence of the broader loss of purpose and direction that affected society in the aftermath of the old Cold War world order, which had last provided all sides of the political spectrum with some kind of organising rationale.

Indeed, it was as the Cold War was drawing to a close that the concept of emerging and re-emerging infectious diseases first took hold. And, as noted by the American academic Philip Alcabes in an excellent book on these issues, it was also the point at which the notion of dramatic flu epidemics occurring on a cyclical basis – which until the 1970s had been little more than one of many possible theories – also came to form an essential component of the contemporary imagination.

In the autumn of 2001, the anthrax incidents that affected a tiny number of people in the US in the aftermath of the devastating 9/11 terrorist attacks were heralded by the authorities as a warning of things to come. As a consequence, after many years of being regarded as an unglamorous section of the medical profession, public health was catapulted centre-stage, with vast sums made available to it by military and civilian authorities to pre-empt and prevent the bioterrorist attacks that they now all too readily anticipated.

The outbreak of a novel virus, severe acute respiratory syndrome (SARS), in 2003 – a disease that affected few individuals worldwide but had a relatively high fatality rate – was held by many to confirm that we should always prepare for the worst.

Since then it has been the projected threat of H5N1 ‘avian flu’ jumping across the animal-human barrier that has preoccupied the world public-health authorities. Irrespective of the fact that there have been just 553 cases of H5N1 since 2003, concerns generated by it have been sufficient to push through far-reaching transformations to the world public-health order – including the advent of the IHR themselves.

Now – ominously – aside from deflecting any responsibility for the confusions they helped to create by describing the H1N1 episode as having exposed ‘difficulties in decision-making under conditions of uncertainty’, the IHR Review Committee notes in conclusion that, looking forwards, the regulations’ most important shortcoming is that they ‘lack enforceable sanctions’.

In this regard, public health will not just be perceived as a national security concern – as it has already become in many influential circles – but also as one requiring effective policing, possibly with its own enforcement agency, through the establishment of a ‘global, public-health reserve workforce’, as the report suggests.

Aside from absolving the IHR and the WHO of any responsibility for the debacle that saw large numbers of well-informed healthcare workers refusing to be inoculated when the vaccine eventually emerged in 2009 – thereby encouraging the public to act in similar fashion – the report of the Review Committee is also a call to make risk communication more of a priority in the future.

But, far from the public requiring the authorities to speak more slowly, more clearly or more loudly to them, it was precisely the attempted communication of risk – where there was little – that was the problem in the first place. That is why we can be sure that this problem is set to recur, at tremendous cost – both social and economic – to society.

Risk is not simply an objective fact, as some seem to suppose. Rather, it is shaped and mediated through the prism of contemporary culture. That we perceive something to be a risk and prioritise it as such, as well as how we respond to it, are socially mediated elements. These may be informed by scientific evidence but, as indicated above in relation to Iraq, broader trends and outlooks often come to dominate the process.

These perceptions are shaped by a vast number of social, cultural and political variables, such as the cumulative impact on our imagination of books, television programmes and films that project dystopian – or positive – visions of the present and the future. Another major influence is the perception of whether the authorities have exaggerated or underestimated other problems, even such apparently unrelated matters as climate change or the 2008 financial crisis.

An emergency then – whether it relates to health or otherwise – does not simply concern the events, actions and communications of that moment. Rather, it draws together, in concentrated form, the legacies of past events, actions and communications as well. And while it may not have been in the gift of the IHR Review Committee to analyse, and – still less – to act upon all of these, there is precious little evidence that they considered such dynamics – and their own role within them – at all.

Far from struggling to convey their messages about H1N1 through a cacophony of competing voices – as some within the WHO seem to suppose – the authorities concerned totally dominated the information provided about the pandemic in its early stages. Their mistake is to presume that it was merely accurate information and the effective dissemination of it that was lacking.

Rather, it was the interpretation of this information according to frameworks that had evolved over a protracted period that came to matter most. Accordingly, the WHO tied itself in knots issuing endless advisories at the behest of the various nervous national authorities whose anxieties it had helped to foster. This even included guidance on the use of facemasks which, whilst noting the lack of any evidence for their efficacy, nevertheless conceded that they could be used – so long as they were worn and disposed of carefully!

At the onset of the 1968 ‘Hong Kong’ flu epidemic, which killed many tens of thousands more than H1N1, the then UK chief medical officer predicted – erroneously, as it turned out – that the outbreak would not be a major problem. Far from being lambasted for being wrong, or hounded out of office, as he might be in today’s febrile culture, it appears that the presumption of the times was that it was precisely the role of those in authority to reassure and calm people down, rather than to issue endless, pointless warnings as we witness today.

The WHO, on the other hand, seems determined to assert its moral authority by projecting its worst fears into the public domain. Sadly, it seems, the authorities have not learnt a single lesson from this episode.

It is not the actions of the individuals concerned that the IHR Review Committee report should have scrutinised and sought to exonerate from presumptions of impropriety or personal gain. What urgently needs to be interrogated instead is the gradual construction of a doom-laden social narrative that WHO officials have both helped to build and must now respond to.

First published on spiked, 23 May 2011

The West’s very own celeb terrorist

Whether he was droning on about climate change or consumption, OBL’s ‘ideas’ were born and bred in the West.

Soon after the death of Osama bin Laden had been announced to the world, 72-year-old Muslim cleric Abu Bakar Bashir – the purported spiritual leader of the Islamist militant group Jemaah Islamiyah – issued a statement from his jail cell in Indonesia, where he faces trial for allegedly funding and organising terrorist camps. The statement, to the effect that ‘Osama’s death will not make al-Qaeda dead’, was designed to instil a sense of foreboding across south-east Asia.

But like all nobodies who hide their own uncertainties and weaknesses behind the words and deeds of supposed somebodies – in this case, behind the dread of al-Qaeda – Bashir simultaneously revealed his own lack of substance. This was apt, because bin Laden himself was always fond of citing Western commentators, academics and diplomats in seeking to legitimise his ostensible cause.

Sounding like any other contemporary critic of American policy, bin Laden droned on about a rag-bag of causes at different times: he lambasted the US for not signing up to the Kyoto treaty to control greenhouse gases; accused Washington of being controlled by a Jewish lobby; suddenly became concerned about Palestine after 9/11; suggested that the wars in Afghanistan and Iraq were simply money-making ventures for large US corporations; and even had the gall – for one in thrall to the Taliban – to argue that Western advertising exploited women.

In this regard, bin Laden revealed his true nature through his statements – including his annual post-9/11 rants that became as boring and predictable as the British queen’s Christmas message. He was entirely parasitical on what was being said about him and about the state of world affairs in the West. After the Madrid bombings of 2004, he even proposed that Western leaders should pay more attention to surveys that revealed how few people supported the war in Iraq.

But what kind of spiritual leader is it who piggy-backs on Western opinion-poll data and the views of environmentalists to get his point across? Why did he advocate reading Robert Fisk and Noam Chomsky, rather than the Koran? In truth, bin Laden was entirely lacking in any substantial ideas of his own, let alone anything that could amount to an ideology. More media-has-been than mujahideen after his escape from US forces in late 2001, bin Laden was the leader of nothing who became the quintessential celebrity terrorist of our times – unable even to control his own fans, never mind control the course of history.

Sadly, those who opposed him were just as devoid of principles of their own. Accordingly, across the political spectrum and in all countries, political leaders and officials who themselves lacked purpose and direction sought to justify their increasingly illiberal policies and actions on the basis of the need to defeat al-Qaeda. Bashir’s recent words of warning sound true because much the same point was made by President Obama in his address to the nation, as well as being echoed by the head of the CIA, the UK prime minister David Cameron, and countless others.

Without al-Qaeda, the global counterterrorism industry would find itself in a real quandary. Little wonder that there is such enthusiasm to reiterate the danger from radical Islam now. The fact that the recent transformations in the Middle East – heralded by some as an ‘Arab spring’ – made little to no reference to either Palestine, or bin Laden and al-Qaeda, makes not a jot of difference to the insights of the self-styled experts.

Far from representing the views and grievances of those in the East and South – whom he never consulted – bin Laden was always a product of the West. He jumped on every bandwagon like some demented blogger and echoed the Western self-loathing he found there. His words would then be picked up again by both followers and critics who lacked the courage to speak out for themselves but preferred instead to point to bin Laden’s empty threats as evidence of what Muslim frustrations and humiliations might lead to.

Instead of a clash of civilisations we had a war of gestures, as every controversy in the West about cartoons, books – and now even celebrations – that might be deemed offensive was picked up on as a further example of the supposed victimisation of Muslims. This over-sensitivity to images and words only exacerbated the situation further, as whole populations were taught that they must never put up with being offended.

Many commentators, aside from implicitly supporting al-Qaeda’s cause by giving a nod to the simplistic notion that suffering, anger and resentment inevitably lead to terrorism, have also noted more critically how the group came to kill more Muslims than Americans through its actions. But this criticism suggests that if the figures had been skewed the other way – if fewer Muslims had been killed – then these commentators would have been somewhat more understanding towards bin Laden.

The solution frequently put forward to resolve matters has been to create de-radicalisation programmes. However, given that the clerics involved in such programmes share the same misgivings about the modern world as the people they’re supposed to be saving, one wonders if these initiatives could ever possibly be truly successful.

Most notable is the general presumption that the removal of bin Laden will somehow lead to a greater risk in the immediate future through the possibility of reprisal attacks that could occur against anyone, anywhere and at any time. This model is itself a construct of the contemporary culture of fear that exists in the West today, presuming that as one threat goes away, another steps in to fill the void.

Those who argue this way fail to note that while there may be aggrieved individuals at large, these people rarely target the symbols of imperial or racial oppression that are held to drive them. Rather, by lashing out at all manner of symbols of modernity – tall buildings, aeroplanes, shopping malls, night clubs – they reveal their frustrations to be a quite mainstream rejection of Western materialism, and not the religiously inspired attacks that so many commentators presume.

First published on spiked, 5 May 2011

Fukushima: sounding worse, getting better

Obsessed with the idea of a nuclear meltdown, the doom-mongers are blind to the reality at Fukushima.

Over the weekend, much of the world’s media reported a radiation spike emanating from Japan’s stricken Fukushima nuclear power plant of the order of 10 million times above the norm. It soon transpired that this figure was erroneous and it has since been retracted by the Japanese authorities. But why did so many seem so keen to report the alarming estimate?

The closer the situation comes to being resolved at Fukushima, the clearer it will become what actually happened there. Hence it will sound like matters are getting worse just as they are getting better. As things stand it would seem that one of the worst earthquakes ever recorded, followed by a devastating tsunami that took out the back-up generators required to cool the nuclear facility, may have caused a minor fissure to the casing of one of six reactors, leading to some radioactive materials being released into the environment.

It is important to maintain a sense of proportion and perspective about this. The quantities released, while alarmingly headlined as raising radiation levels in nearby seawater to 1,250 times the normal safety limit, still amount to less than one per cent of what was released over the course of the worst nuclear accident in history, at Chernobyl in the former Soviet Union in 1986.

There are two things worth noting from the outset. Firstly, that 1,250 times the normal safety level still amounts to not very much at all. And secondly, contrary to the popular myths about Chernobyl, it is today a visitor destination, albeit for what the trade identifies as extreme tourism. The three remaining reactors at Chernobyl reopened just seven months after the explosion there, with one of the reactors working right through to December 2000, since when a small army of workers has been on-site and steadily decommissioning the plant – a process that could still take many years.

Alarmist figures as to the number of people affected by the Chernobyl disaster bear no resemblance to the actual data confirmed by the Chernobyl Forum – a group that includes the UN, the IAEA and WHO – in its 2006 report on the matter. Only 50 deaths can be directly attributed to the accident. These occurred among those workers brave enough to return to the plant when it was burning to sort out the mess at the time, and among a small number of children in the wider community who developed thyroid cancer.

Those who suggest that thousands, maybe even tens of thousands, of fatal cancers are linked to the Chernobyl disaster are basing these estimates on extrapolations from the effects of the atomic bombs dropped on Japan in 1945. These estimates are derived using a linear extrapolation from the effects of high levels of radiation received in an instant as the bombs exploded. But most researchers recognise that the circumstances in Hiroshima and Nagasaki were very different to those in Chernobyl. Such estimates are, therefore, based on rather shaky evidence. It is like suggesting that because a temperature of 200 degrees Celsius would kill 100 per cent of human beings, so a temperature of 20 degrees Celsius should kill 10 per cent of them. In reality, our bodies are able to tolerate radiation up to a certain threshold. Low levels of radiation are almost certainly harmless.
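The logic being criticised here is easy to make concrete. Below is a minimal sketch, in Python, of the two dose-response models at issue: the linear extrapolation and a threshold model. Every number in it (the notional fatal dose, the threshold, the resulting risk figures) is a hypothetical illustration, not real dosimetry.

```python
# A minimal, purely illustrative sketch of the two dose-response models
# discussed above. Every number here is a hypothetical assumption chosen
# to make the contrast visible; none is real dosimetry.

def linear_no_threshold(dose_msv, fatal_dose_msv=5000.0):
    """Linear extrapolation: excess risk scales proportionally with dose,
    all the way down to zero (the reasoning criticised in the text)."""
    return min(dose_msv / fatal_dose_msv, 1.0)

def threshold_model(dose_msv, threshold_msv=100.0, fatal_dose_msv=5000.0):
    """Threshold model: doses the body can tolerate carry no excess risk;
    only above the threshold does risk begin to rise."""
    if dose_msv <= threshold_msv:
        return 0.0
    return min((dose_msv - threshold_msv) / (fatal_dose_msv - threshold_msv), 1.0)

for dose in (1, 10, 100, 1000, 5000):
    print(f"{dose:>5} mSv   linear: {linear_no_threshold(dose):.4f}   "
          f"threshold: {threshold_model(dose):.4f}")
```

Run on these toy numbers, the linear model assigns a small but non-zero risk to even the lowest doses, while the threshold model assigns none at all until the tolerance level is crossed – which is precisely the difference the temperature analogy above is driving at.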

This brings us back to the contaminated seawater, as well as the affected food items and drinking water in Japan today. The situation is certainly not ideal and no doubt lessons will be learnt from all this as they always are after every emergency. Indeed, whether we appreciate it or not, it is only by taking risks that society has evolved to the form in which it exists today, whereby we live longer and healthier lives than any preceding generation. Sadly, it is only by making mistakes that we learn the real limits of anything. And as some have also indicated, even the worst levels of radiation reported from Japan – aside from those to which a handful of workers have been exposed – amount to little compared to natural background levels in other places on earth, as well as comparing favourably with other exposures we voluntarily subject ourselves to, whether these be through flying or having an X-ray or a CT scan.

The situation now is likely to require a plentiful supply of energy to resolve – energy which, like it or not, will probably come from other nuclear facilities, not from windmills and solar panels. These renewable technologies, while they may be desirable for the future, will only emerge based on what we have available to us in the here and now.

The anti-nuclear campaigners, however – alongside the far bigger army of catastrophists, who seem keen to imagine the worst at every opportunity – are now smugly standing by to say ‘I told you so’. But none of them suggested there would be a tiny crack through which a limited amount of radiation might leak. Rather, there was a cacophony of voices projecting a meltdown and Armageddon. And, as none of these commentators were nuclear engineers who had attended the site in Japan itself, it is obvious that all they could do was imagine the worst and project that fantasy into the public domain.

It would be preferable to have a few more trained specialists dealing with the actual emergency. From a sociological perspective, however, one focused particularly on risks and how these are perceived and communicated, it was entirely predictable that an assortment of risk entrepreneurs, doom-mongers and assorted lobbyists would clamour to claim this incident for themselves and attach it to whatever fear-laden view they hold.

Eight years ago, as hostilities resumed in Iraq, there were many determined to uncover Saddam Hussein’s supposed stash of weapons of mass destruction there, despite the evidence consistently pointing to their absence. We were advised instead to focus on the unknown, or the ‘unknown unknowns’ as the US defence secretary Donald Rumsfeld famously put it. Two years ago, once the director-general of WHO had identified H1N1 as ‘a threat to the whole of humanity’, nations everywhere cranked into pandemic prevention overdrive, convinced that only their precautionary actions could save humanity – this despite all the evidence pointing towards the outbreak of a mild version of influenza. We have to recognise that once a particular mindset is established it is very hard for people to accept that their model of the world may not be correct even if the facts are staring them in the face.

This is the pattern being repeated around the nuclear incident in Japan. Some newscasters seem determined to convey the worst that could happen, as if this were some public service. But surely at such times the role of the media is to report the facts rather than imagine a Hollywood script? The problem we now confront is that a significant number of cultural pessimists have staked their reputations on proving that there was a major problem and possibly that this was covered up. Such individuals seem to desire – if not need – the worst, to confirm their apocalyptic frameworks. It is high time we focused on the evidence and let those who are actually capable of dealing with the mess at Fukushima get on with their jobs without having to worry that their every step will be projected on to the world stage as an example of incompetence and conspiracy.

First published on spiked, 29 March 2011

The mad post-tsunami food panic

You could eat Japan’s so-called ‘radioactive spinach’ for a whole year and it still wouldn’t cause you much harm.

It would require an iron will to stand in the face of today’s febrile culture and oppose the wave of countries rapidly withdrawing Japanese foodstuffs from their shelves ‘in line with the precautionary approach’, as a Singapore government spokesperson put it.

Having alerted the world to elevated levels of radiation in food items such as spinach and milk, as well as doses twice the recommended limit for babies in drinking water in Tokyo, the Japanese government really has no one other than itself to blame. After coping admirably in managing the immediate aftermath of the earthquake and the tsunami, as well as demonstrating the resolve to address the situation at the Fukushima nuclear power plant, it seems that it is at the level of communication that the authorities may yet score an own-goal.

The Japanese cabinet secretary, Yukio Edano – until now the image of cool with his detached demeanour and worker’s overalls at press conferences – has asked international importers to take a ‘logical stance’ over the food situation. They will. Unfortunately, it is not the logic he may have had in mind. ‘Even if these foods are temporarily eaten, there is no health hazard’, he advised. Others have indicated that one would have to drink a lot of the water before being harmed. Drinking the water in Tokyo for a year might expose you to an additional 0.8 millisieverts (mSv) of radiation. But then living in some of the places on earth where the natural background radiation is above the norm could easily expose you to 10 times as much.

Needless to say, people continue to live in such areas – and have babies. In fact, there is a considerable body of evidence to suggest that – if anything – their longevities may be enhanced through such exposure. After all, biological life emerged into an environment that had far more radiation, from the ground and from space, than it does today.

Eating the spinach non-stop for a year (perish the thought) would give you a radiation dose equivalent to about one CT scan. Drinking the milk endlessly would be even less of a problem. In fact, you would be sick of eating and drinking these products long before any of them could make you sick from radiation poisoning or cancer.
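Putting the figures just mentioned side by side makes the point plainly. In the back-of-envelope sketch below, only the 0.8 mSv drinking-water figure comes from the discussion above; the other doses are assumed, commonly cited ballpark values rather than measurements.

```python
# Back-of-envelope comparison of the radiation doses discussed above,
# in millisieverts (mSv). Only the 0.8 mSv water figure is taken from
# the text; the rest are assumed ballpark values, not measurements.

doses_msv = {
    "Tokyo tap water, drunk for a year": 0.8,            # figure quoted above
    "typical annual natural background (assumed)": 2.4,  # ballpark assumption
    "one CT scan (assumed)": 7.0,                        # ballpark assumption
    "high natural-background region, per year": 8.0,     # ~10x the water dose, as above
}

baseline = doses_msv["Tokyo tap water, drunk for a year"]
for source, dose in doses_msv.items():
    print(f"{source:<45} {dose:>4.1f} mSv  ({dose / baseline:4.1f}x the water dose)")
```

On these assumptions, a year of drinking the water sits well below a single scan, and an order of magnitude below what people in some high-background regions absorb, uneventfully, every year.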

So where did it all go wrong for Edano? Where did the army of over-zealous officials wanting to ban things on a precautionary basis come from? Should we blame the US – we often do – for starting the cascade? Or was it the media who irresponsibly amplified concerns?

In fact, if we truly hope to understand the confusions now emerging over the situation regarding food from Japan, there is little point in looking there, or even trying to understand nuclear accidents and radiation, or the role of today’s nervous officials and the media.

Rather, since the end of the Cold War in 1989, the world has steadily been reorganised along the principle that it is better to be safe than sorry. That sounds eminently sensible. But is it true? Is there not a point where safety comes at a cost to other areas of life? For instance, if we were to put all our resources into combating terrorism, there would be none left to deal with disease.

Risk management is always about such trade-offs. But the mantra that we should be as safe as possible and always take precautionary measures whenever possible has become good coin among bureaucratic elites the world over. This provided governments with a new role once the old Soviet enemy had imploded. Noting too that the end of the old-style confrontational politics had also left people rather more isolated and insecure, politicians refashioned themselves as the people’s protectors of last resort.

This has come at a tremendous cost to society – leaders driven more by events than by principles, and populations that are used to having their prejudices pandered to rather than challenged. The rot, of course, started at the top. Witness the large number of foreign nationals in Japan, many of whom were caught up in these tumultuous events and wanted to stay behind to help their friends and loved ones. They even wanted to help complete strangers – but of course we now know, because we have been brought up to believe so, that strangers are a danger anyway.

So, rather than pursuing their humane instincts, according to their own assessment of what the real risks were, many such individuals were advised, by their own national governments, to get out. Get out of the region. Get out of Tokyo. Get out of Japan.

In the past, people who ran away from people in need, particularly when these were people they knew, might have been accused of being cowards. Today, we call that ‘precautionary measures’.

Welcome to the brave new world of risk-obsessed politics. Far from building character and making populations more resilient, as the leaders of some of these countries constantly profess themselves to be doing, what we find is a highly confused culture that encourages a febrile response, both on the ground, and many thousands of miles away.

It is this that might prove to be the greatest problem for the wider Japanese population for quite some time to come.

First published on spiked, 24 March 2011

On Thailand, what would Trotsky say?

If the Thai Red Shirts want real change, they could do with reading History of the Russian Revolution.

According to some sources, recent publications circulated by the organisational arm of the Red Shirts in Thailand – the United Front for Democracy against Dictatorship (UDD) – ‘offer history lessons on the French and the Russian revolutions’. According to their detractors, such references to past revolutions demonstrate the republican leanings of the UDD leaders, despite the fact that the Red Shirts have been demanding free elections and the resignation of prime minister Abhisit Vejjajiva rather than an overthrow of the monarchy. In a country where expressing anti-monarchy views is still indictable under archaic lèse majesté laws, recently backed up by draconian Internal Security Act provisions, these are serious charges.

But the publication that both sides of the current political impasse in Thailand really could do with reading carefully is Leon Trotsky’s History of the Russian Revolution. In this magisterial work, Trotsky laid no claim to impartiality, but rather sought to expose ‘the actual process of the revolution’. As the insurrection in Thailand develops, it is eye-opening to consider how his analysis can serve as a ‘playbook’ of the way such events unfold.

One trend that Trotsky identified was the use of ‘conspiracy’ as one of the prime accusations made by the ruling class – consciously or not – in their attempts to demobilise the masses at a time of insurrection. Members of the elite are unable to understand through their ‘police mind’ that periods of rapid change stem, not from ‘the activities of “demagogues”’, but from the precise opposite – the ‘deep conservatism’ of the masses, whose views lag chronically ‘behind new objective conditions’. As a result, the elite focuses narrowly on ‘the deliberate undertaking of the minority’, whilst ignoring ‘the spontaneous movement of the majority’. Trotsky adds: ‘Without a guiding organisation, the energy of the masses would dissipate like steam not enclosed in a piston-box… But nevertheless what moves things is not the piston or the box, but the steam.’

The reality on the ground in Bangkok confirms this insight. Far from threatening Thailand’s royalty, the Red Shirts maintain an overly deferential position towards the monarchy. In response to allegations of a plot to overthrow King Bhumibol Adulyadej, Red Shirt leaders have set great store by proving otherwise, holding up portraits of the Thai King and Queen at recent rallies. Whether such displays of loyalty are merely tactical remains to be seen, but they point to chronic contradictions that may exist within the Red Shirt movement. The King himself did not comment on the situation during his public appearance for Coronation Day yesterday.

Two weeks ago, possibly with a view to encouraging loose talk or even stealing the moral high-ground from the Red Shirts, the Thai foreign minister, Kasit Piromya, controversially mooted the need to re-examine the role of the monarchy. As Trotsky recorded, conspiracies, apparent policy U-turns, divisions and intrigue abound, but these frequently originate from within the government camp.

Understanding how a situation changes and develops is crucial. For example, some within the media – and within the Red Shirt movement itself – have only belatedly understood that support for the deposed former prime minister, Thaksin Shinawatra, may not have been central to developments in Thailand.

Another example is the proposal by the Election Commission – an unelected group of officials charged with managing and overseeing elections in Thailand – that the Constitutional Court be asked to dissolve the ruling party for failing to report donations and misuse of funds. This seems to have confused many, on all sides. Indeed, the Red Shirt leaders now appear to view the commission as some kind of neutral body. In reality, like many other bodies with little authority other than that vested in them by statute, the commissioners most probably had their eye on their own survival.

In a period of insurrection, there is a big difference between official power and real support, something the Thai prime minister is only now starting to learn. Three weeks ago, after the first bloody clashes in the capital, which left over 20 dead and many hundreds more injured, it looked as if the tensions within the elites might come to a head. Vejjajiva was accused by supporters of his own regime of being too weak, and he handed direct control over security matters from his defence minister, Prawit Wongsuwan, to the chief of the army, General Anupong Paochinda.

But General Paochinda’s resolve has also been called into question. He is on record as stating that the current situation is a political problem that ‘must be solved by political means’, and that he is not inclined to intervene. There followed a protracted discussion about army people being possible ‘watermelons’ (green on the outside, but red at the core). Accusations have flown that Red Shirt leaders received advance warning of army and police raids to arrest them and that they escaped, in the full glare of media publicity, as the forces charged with their capture stood idly by.

Such possibilities should hardly come as a surprise. Already in 1930 Trotsky noted: ‘If an army as a whole is a copy of society, then when society openly splits, both armies are copies of the two warring camps.’ He continued: ‘The army of the possessors contained the wormholes of isolation and decay.’ In October 1917, many in the Russian army joined the Bolsheviks, while their commanders either stood aside or pretended in their own turn to have swapped sides. Likewise, in Thailand today, it would seem as if there are many, aside from some in elite squadrons, who would be unlikely to turn on the multitude of middle-aged ‘aunties’ and ‘uncles’ from the north of the country and beyond, who constitute much of this new, if unlikely, fledgling red army.

Writing in a more despairing tone for a regional paper recently, Kraisak Choonhavan, the deputy leader of Vejjajiva’s incumbent Democrat Party, lambasted those in the international media who have portrayed the protests in Bangkok as some kind of ‘class struggle between rich urban elites and the poor rural mass’. His complaint was that the Red Shirt platform is ‘without a clear programme of social and political reform to follow’. He, too, could do with reading Trotsky, who wrote that ‘the masses go into a revolution not with a prepared plan of social reconstruction, but with a sharp feeling that they cannot endure the old régime’.

Whether or not that ‘sharp feeling’ is constructively channelled will, of course, depend greatly on the skill and political acumen of political leaders. These, as in 1917, ‘were little known to anybody when the year began’, and indeed, a cursory glance through the list of 24 arrest warrants issued for Red Shirt leaders by the Thai authorities confirms that none has much of a profile, save for what has recently been reported about them in the media.

Trotsky also understood that the middle classes can be crucial at moments of breaking point. The middle classes in Thailand are extremely unnerved by the current situation. Like those in the media, academia, the government and the ‘international community’ who express concern about the economic impact of the protests, they are particularly ill-suited to handling the impact of uncertainty upon their immediate, narrow and privatised concerns. The finance minister, Korn Chatikavanij, recently suggested that the Red Shirt protests, if they continued to the end of the year, could reduce the growth rate by two per cent, while there has been much discussion of the supposed wider ramifications for the reputation of Thailand vis-à-vis the ‘international community’.

For the middle classes, making profit today is more vital to them than achieving democracy and freedom tomorrow – even for their own children. So much for the long-term, social view of business.

The key element now is timing. In his chapter on ‘The Art of Insurrection’, Trotsky explored the difficulties of, and political nous required for, knowing just when to initiate action. The more conscious elements of the movement would be champing at the bit, fearing that the moment has passed, while other social layers would only just be waking up to the possibilities of the situation.

In Thailand, the media, and others, have accused the Red Shirts of overstepping the line through their recent takeover and occupation of a hospital. This was done to ensure that army forces were not in hiding there. But Weng Tojirakorn, one of the Red Shirt leaders, has now apologised for this act. In fact, it is this apology that suggests that those who earlier had organised such galvanising activities as a mass collection of blood from their supporters – to daub over the streets – may now have gone somewhat on the defensive.

The insurrection may be undone by tactical errors. For example, some Red Shirt leaders have agreed publicly to hand themselves in to the authorities on 15 May, come what may, while asking various international organisations and institutions to send observers to Bangkok. Martyrdom and handing the initiative to external busybodies have never been successful strategies.

Such moves have allowed the cabinet, holed up for months in the 11th Infantry Regiment’s barracks in the capital, to reassert a modicum of control. They have robustly declined offers of assistance from outside, despite the shrill and irrelevant voices predicting civil war and demanding mediation emanating from the likes of the International Crisis Group in Brussels and Human Rights Watch in New York.

This week, the prime minister offered to hold elections in November and to commence a process of ‘reconciliation’, so long as the protesters went home and observed a set of other conditions. The Red Shirt leaders, in turn, indicated their inclination to accept the deal, so long as a date for the dissolution of parliament was announced and the Election Commission appointed to handle the process. The ruling party have now tentatively put September forward as the date for this but, as some have pointed out, by then they may be disbanded as a party by the Constitutional Court. Such a situation would allow the possibility for any interim establishment to renege on such commitments.

This is a crucial moment for the Thai insurrection. Following Trotsky’s metaphor, it may be time for the steam to reassert its primacy over the piston. Certainly, it is now the response of the ordinary Red Shirts to the so-called ‘roadmap’, and their own leaders’ willingness to accept it, that will determine whether real change is achieved or whether Thailand continues – as the locals say – to be ‘same, same, but different’.

First published on spiked, 6 May 2010

A semi-irresistible argument

It is refreshing to read Kishore Mahbubani’s unabashed defence of aspirations in the East. But his attachment to the very Western culture of fear means that his book ends on a pessimistic note after all.

‘If I were asked to name the date when my life entered the modern world, I would date it to the arrival of the flush toilet.’

This is the humble beginning of The New Asian Hemisphere: The Irresistible Shift of Global Power to the East, in which the author, Kishore Mahbubani, extols the trappings of modern life.

Mahbubani is currently dean of the Lee Kuan Yew School of Public Policy at the National University of Singapore. He served for 33 years as a diplomat for Singapore. His impressive CV includes posts as permanent secretary at the Singaporean foreign ministry and as president of the United Nations Security Council. He was also Singapore’s ambassador to the UN.

Mahbubani continues: ‘After the flush toilet, our little home in Singapore began to acquire other conveniences: the refrigerator, the TV set, the gas cooker (which replaced the charcoal fires my mother used), and the telephone.’ It seems the television set was particularly significant for Mahbubani, as it opened his mind to visions of what the world could be like.

For Mahbubani, it is the ‘“heaven” of modernity that the vast majority of the world’s population is aspiring to enter’. In discussions around development in the West, however, such aspirations seem to have been forgotten, abandoned or ignored. Yet there are plenty of people around who can still recall what the world was like before the advent of twentieth-century technologies and amenities, and who would hate to lose them. And, shamefully, there are millions who have yet to acquire them.

Few places on Earth have experienced as many dramatic changes in one generation as Singapore has. From a tourist’s point-of-view, life in a kampong (a traditional Singaporean village) may appear to have been idyllic, with human communities living ‘in harmony’ with nature. The reality – mosquitoes, dengue fever, floods, sewage and a complete absence of privacy – was quite different.

While in the West, the past is idealised as somehow more humane and natural, here in Singapore there is a distinct lack of nostalgia for the days of yesteryear. Just consider a photography contest recently launched in one of the daily papers here. ‘Remember those zinc-roofed huts we used to call home?’ the introduction reads, before asking prospective contestants to submit photographs from the past 50 years in order ‘to share with us how life here has become better’.

It is refreshing to see Mahbubani unashamedly defending the aspiration to modernity. As he points out: ‘The 88 per cent of the world’s population who live outside the West have stopped being objects of world history and have become subjects. They have decided to take control of their own destinies.’

The West has lost its optimism, the classically educated Mahbubani complains, just at the time when the East is gaining its own sense of confidence. He notes how, in the past, the West’s own philosophers would have welcomed this vast increase of ‘goodness’ in the world. Instead, today’s Western leaders and policymakers view global development with trepidation. China in particular has consistently been singled out by Western observers as a source of great concern.

As Mahbubani notes, the West tends to miss or ignore the massive changes to the human spirit that lifting hundreds of millions out of abject poverty has instilled. Development, Mahbubani reminds us, lowers infant mortality and crime rates, whilst simultaneously improving people’s health and longevity. But more than that, ‘[w]hat all these statistics fail to capture is the transformation of the human spirit that takes place when people experience this kind of rapid economic growth’.

Between 1983 and 2003, more than half of all doctoral degrees in science and engineering awarded in the US went to students from China, Taiwan, India and South Korea. And by 2010, according to one of the sources Mahbubani cites, 90 per cent of all PhD-holding scientists and engineers in the world will be living in Asia.

If Europeans – whom Mahbubani accuses of being smug and insecure – find it difficult to conceive of China and other parts of Asia opening up in this way, there are also changes afoot in the Islamic world to which they appear completely blind. Apart from a few feudal regimes ‘kept in office at least in part through Western support’, it is not fundamentalism that is the ascendant force in the region, he argues. Instead, it is the drive towards modernisation – and recent events in Iran appear to confirm this.

Intriguingly, though Mahbubani is such a vocal critic of the West, the weaknesses in his own work are arguably very Western in character. He celebrates free markets at a time when they have never been so questioned. But, in reality, both Mahbubani and his detractors are out of date on this point. State monopolies (with which Mahbubani must be familiar as a Singaporean) and regulation curtailed the liberal fantasy of a ‘free’ market over a hundred years ago.

For a self-proclaimed optimist, Mahbubani also appears peculiarly prone to some of the more recent Western fears around things like terrorism, weapons of mass destruction, climate change and overpopulation. He ends up proposing that the European Union and the United Nations – those undemocratic and aloof institutions – should be emulated in Asia. No doubt, his views have been shaped through his background as a career diplomat and he probably has his heart set on a successful future for the Association of South East Asian Nations (ASEAN).

The EU, Mahbubani suggests, has now achieved the position of having ‘zero prospect of war’. This may yet prove to be overly optimistic, but let’s accept the proposition for now and consider it along with von Clausewitz’s famous dictum that ‘war is the continuation of politics by other means’. It appears, then, that the reason behind this zero prospect of war may well be down to the fact that in the famously detached and elitist EU bureaucracy there is also zero prospect of politics.

Like many of his peers, Mahbubani proposes that there can be no freedom without security. That is why, ultimately, he has a rather dim view of democracy, citing an Asian-American academic who believes that for the developing world it is ‘an engine of ethnic conflagration’. As the elites, including Mahbubani himself, are busy projecting all manner of new security threats onto the world, the masses will no doubt have to wait a long time before gaining the right to express their own opinions freely.

Mahbubani’s prioritisation of security over freedom is both blinkered and contradicted by his own analysis. People do not just live their lives, they lead them. And everywhere, at all times, people have been prepared to risk it all to be free. This was also true of Mahbubani’s own parents who emigrated from Pakistan on the eve of partition. Mahbubani believes that the pillars of ‘Western wisdom’ are free markets, science and technology, the rule of law and education. Well, maybe he should also consider just how central freedom is to each of these.

Aside from a disdainful dismissal of democracy – the constant refrain in this part of the world is that the masses are not mature enough for it – The New Asian Hemisphere also suffers from too abstract a separation between politics and economics. These appear to be entirely distinct spheres for Mahbubani. Hence his recurring complaint that Western interests dominate over Western values. For Mahbubani, material, economic interests have trumped the promotion of wider societal values in the West.

There may be an element of truth to this but, more broadly, one could also point out that the West no longer truly knows its own interests, fearing both the unfettered free market and the sense of intellectual confrontation that knowing and representing an interest may involve in society. Hence the contemporary emphasis on ‘harmony’ and ‘inclusion’ that even Mahbubani buys into.

The more fundamental flaw of Mahbubani’s fascinating book is that, in attempting to explain, primarily to a Western audience, what it is that the East thinks and does today, he makes sweeping statements about both the West and the East. These are caricatures, or at best vignettes, of the kind serious analysts should steer well clear of, glossing over regional and individual variations as well as changes across time.

Mahbubani, for instance, benefited from a Western-style education at a time when this truly meant something. It is not so clear that his advocacy of Western education today would produce similar benefits for the next generation. Having lost its way in the world, the West has also lost sight of what education means, allowing it to be subordinated to contemporary fads – such as the need to deliver economic benefit or develop policy prescriptions – as well as to a reluctance to allow anyone to fail.

Despite its failings, and the transparent elitist prejudices of its author, born of his time and environment, The New Asian Hemisphere is nevertheless still worth reading for those who focus on such matters. Whether Asia really is ‘the New Modern’, as one reviewer asserts, remains to be determined. For this to be so, it will be the masses in Asia who will need to assert their vision of the future, not just their supposedly expert leaders. It is worth remembering that the American Dream was born of freedom, not security.

The New Asian Hemisphere: The Irresistible Shift of Global Power to the East, by Kishore Mahbubani, is published by Public Affairs.

First published on spiked, 25 September 2009

Why ‘deradicalisation’ is not the answer

It’s time Jacqui Smith realised that Islamist extremism is not a ‘foreign’ invader of Britain, but rather springs from our own bankrupt culture.

On Tuesday, the British home secretary, Jacqui Smith, announced the development of a nationwide ‘deradicalisation’ programme to tackle people who have supposedly been drawn into violent Islamist extremism in Britain. Muslim community groups and councils will be allocated £12.5 million, in addition to the £40 million the government has already committed to the ‘prevent’ element of the national counterterrorism strategy made public in July 2006. The funding will be used for projects that will ‘challenge and resist’ the ideas and outlooks deemed to have informed recent acts of terror in the UK.

This strategy will fail for the simple reason that the government has yet to appreciate fully what the influences are that it seeks to alter. In addition, officials have no idea what they would wish to alter them to.

The simplistic model that emerged in the aftermath of 9/11 was that the West was confronted by a resurgent form of political Islam emanating from the Middle East and further afield. Subsequent events, including the London bombings on 7 July 2005, led to an almost begrudging recognition that many of the perpetrators of terrorism had been educated in the West, if not born there.

This still allowed for the possibility that their ideas were largely foreign in origin, or that their outlooks were alien to the presumed norms prevailing in the West. Hence the continuing focus on the form that these ideas take – couched in their jihadist rhetoric – or appeals to defending an ill-defined sense of ‘our values’ or ‘our way of life’. The UK government has failed to confront the true content of what these ideas expressed: a rejection of all things Western, rather than a positive affirmation of anything else.

Nor has the government offered an alternative vision of what we stand for as a society, beyond rhetorical references to freedom and democracy. However, the espousal of such values jars with current proposals to extend the period that alleged terrorists may be held without charge (from 28 to 42 days) – from a prime minister, Gordon Brown, who was never elected by the people.

The truth is that the sources of self-styled Islamist terrorism are more likely to be found within our own shores and within our own communities than anywhere else. It may be more likely, for now, that British Asians will act upon these ideas – with the benefit of an enhanced sense of victimhood that they may have picked up within the British education system. But as the steadily increasing number of white faces appearing on the counterterrorism radar suggests, this may not remain the case for much longer.

If this sounds rather harsh, let me illustrate what I mean by way of an example. A good friend of mine recently spent a day in the law faculty of a prestigious British university. The distinguished professor she spent time with advised her that nowadays students are not the same as they once were. They are no longer expected to read numerous books, write long essays or memorise case law. Rather, they are presented with handouts of PowerPoint presentations to read and they keep a weblog of their activities.

That evening, my friend attended the Islamic society meeting at the same university. There, she encountered many of the same students she had met earlier in the day (when they had been uninterestedly sending texts on their mobile phones during the law seminars). Now, however, the students appeared eager to learn. The cleric who ran the meeting expected them to recall specific lines from the Koran and to be familiar with all aspects of Islamic jurisprudence.

Maybe somebody should ask Jacqui Smith who here is the ‘radicalising’ influence. Is it the foreign mullah who ran the evening class, demanding attention and commanding respect, or the jaded Western intellectual who deep down believes that there is no truth that can be taught, that not too much should be expected of young people nowadays, and who in any case would not wish to damage their ‘self-esteem’ by challenging them in class?

I use this vignette to suggest that the roots of so-called ‘radicalisation’ are much wider and deeper than can be addressed by a prejudicially targeted programme focusing on ill-founded notions as to where such ideas might emanate from. Indeed, rather than targeting Muslim communities and monitoring Islamic society meetings, the authorities would be better off observing and monitoring their own contemporary culture.

Far from there being a layer of vulnerable young Muslims who are preyed upon by various hotheads, what we find, time and again, are passionate, intelligent and energetic individuals who somehow fail to find any meaning or purpose to their lives from within the confines of contemporary Western culture. Most of these are neither disconnected nor alienated from society, and rather than being ‘radicalised’ from the outside, they actively look for something to join. Nick Reilly, the supposed simpleton whose rudimentary device exploded in his face recently in Exeter, is proof that it is almost impossible to ‘recruit’ anyone of note into terrorism.

In short: a few, fairly intelligent people, deprived of a sense of purpose, will go looking for answers in radical Islam. These are Western people looking for some alternatives to the bankrupt intellectual and political culture around them. Those who are apparently ‘recruited’, on the other hand, are mostly idiots.

In focusing on so-called ‘extremists’ and ‘radicals’, the authorities and security agencies manage to miss that which lies right under their nose. What’s worse, the very language they use betrays their own difficulty. By accusing someone of being ‘extreme’ or ‘radical’, they effectively give up on any attempt to address the content of what people supposedly believe, targeting instead the extent to which they are held to believe it. This is like saying, ‘I don’t care what it is you believe in, so long as it is not too much’, which in turn is an admission that they themselves believe in nothing.

At a talk given to the Smith Institute in London on the evening of her announcement regarding the proposed ‘deradicalisation’ programme, Jacqui Smith suggested that ‘lacking a positive vision, al-Qaeda can only define itself by what it opposes’. Talk about projecting yourself on to others! She and her cronies would be better off outlining what kind of Britain it is that they do want to live in, rather than obsessing over a handful of dangerous idiots whose ideas and outlooks would seem entirely unimpressive were it not for the vacuum that they confront.

First published on spiked, 5 June 2008

History: it’s just one bloody thing after another

Having jettisoned political and historical frameworks, Michael Burleigh’s story of terrorism combines a lack of insight with excessive prejudice about curry-eating loyalists and headbutting Glaswegians.

In a recent interview for the Guardian’s education supplement, historian and writer Michael Burleigh suggested that his decision to leave academia five years ago, after stints at, amongst other institutions, New College, Oxford, and the London School of Economics, was driven by a determination not to ‘become a guru-like figure’, ‘who surrounded himself with cronies’ and ended up creating ‘clones’ (1).

Judging by his latest book, Blood and Rage: A Cultural History of Terrorism, a more likely explanation is that such is the impoverished nature of his arguments that the only people who were prepared to listen were either cronies or clones. So, while describing women in burqas as ‘black sacks’, or suggesting that ‘headbutting one another’ is ‘a national (sic) pastime in Glasgow’, may appeal to a certain juvenile sense of humour, it is unlikely to endear him to those, as yet un-cloned, constituencies he might wish or need to influence.

One can only presume that he does not care. Over the course of 486 pages on the emergence and development of terrorism – beginning with nineteenth-century Fenians, Nihilists and Anarchists, ending with al-Qaeda, and taking in Italy’s Red Brigades and Germany’s Baader-Meinhof gang along the way – there is very little in the way of analysis. Indeed, he openly declares a desire to focus on ‘actions rather than theories’. But in the absence of analysis, his bombastic and belligerent asides become not just tedious – they encourage suspicion as to his reading of events.

It makes for a grating experience. Reading Blood and Rage reminded me of the great Cambridge historian Sir Herbert Butterfield’s famous aphorism – memorably adapted by Alan Bennett in his 2004 play The History Boys – that history is ‘just one bloody thing after another’. Sounding like a breathless and overexcited child who has just come back from a school trip, Burleigh delivers to the reader an uninsightful and somewhat random list of things that happened. Nowhere, other than in a short passage by Nelson Mandela, is there any attempt to explain how ideas and events may be shaped by context or will.

What really betrays Burleigh’s approach is the subtitle to his book: ‘A Cultural History of Terrorism’. That is, it’s history with the society and politics taken out. With no attempt to engage with the ideas and aspirations that motivated his assorted protagonists, be it the Basque ETA or Algeria’s FLN, or any attempt to appreciate the circumstances in which groupings found themselves, it is little wonder that Burleigh’s narration appears as a sequence of inexplicable events. Burleigh is left instead with just their actions to describe – mysterious, dangerous and impenetrable.

Annoyingly, this also means that even a reactionary like Burleigh effectively lets those who resort to acts of terror off the hook. To him they have become addicted to violence or, as he dubiously proposes, ‘are morally insane’, in which case they can hardly be held culpable.

With less sophistication and reason than the succession of mediocrities who have occupied the role of British home secretary, his rant continues, page after page after page. In the Daily Telegraph, he continued his moan: ‘there are people in this country … who despise our way of life and seek to change it for all time.’ (2) But which people, and what way of life?

Like many others, he is encouraged by his prejudices to see such forces as largely emanating from far-flung places and foreign outlooks – in other words, from ‘over there’. Yet closer scrutiny of his own invective would reveal to him the vast list of domestic enemies that exist among the ‘liberal elites’. These are variously castigated as ‘fervent human rights lawyers’, ‘loathsome academic[s]’, ‘fanciful journalists’, ‘celebrity useful idiots’ and other ‘well-to-do apologists’. He may have a point, but, unable to engage with the breadth and depth of this cultural conflict on the home front, he simply dismisses it and comes across as a grumpy old man.

At every turn, whether it is in the Middle East, North Africa, Italy, Germany or the UK, he resorts to the tired and trite notion that the roots of terror lie in the rapid expansion of higher education without a concomitant development of employment opportunities. This growth may well have presented him with students less sympathetic to his cheap caricatures of Northern Ireland’s loyalists as people whose ‘idea of an exotic meal was to add curry sauce to a bag of chips, while venturing as far as Tenerife for their first overseas holiday’. But this supposed explanation is unlikely to be ‘the actual source of anger on the part of young Muslims’, as he suggests on his website (3).

Almost inevitably, amidst so much manure, the odd flower of insight blooms. But his aperçus could have made for a 20-page essay rather than a 500-page book. One of the most useful, stemming from his rampant, yet oddly anachronistic, anti-left-wing prejudices, is a section – unusual in books covering Islamist terrorism – detailing the role and barbarity of the Mujahideen in Bosnia, as well as how their actions were supported or ignored by Western radicals.

Elsewhere, he astutely describes terrorists as ‘juvenile fantasists’ and ‘self-styled victims’ whose ‘misdirected or frustrated altruism’ makes them ‘too eager to repudiate themselves’ through their actions, hoping thereby to ‘overcome the boredom and purposelessness of their own lives’.

He also usefully debunks many illusions as to the supposed uniqueness of the threats we face today – simultaneous attacks, suicide bombings, bomb-making manuals, training camps and the targeting of information networks – as well as the overreaction of the authorities to these threats. His detailing of the sheer number and intensity of terrorist attacks in the not too distant past also acts as a reality check.

Yet, despite seeing through Islamism as a pose, he is still driven, through his refusal to see the origins and parallels for this within the West, to describe the contemporary crop of self-styled Islamist losers, plotting terrorist outrages from their bedrooms in east London, as somehow presenting ‘an existential threat to the whole of civilisation’. This seems like a tall order, but one somehow befitting a former academic left howling to the barking of Barking.

Senior figures in the world of national security today call for a new narrative of resilience to be developed in the face of these supposed threats. Burleigh may seem to offer them a little of what they need. But while history is always contested, his story is simply a fanciful myth, unable to engage or captivate a broader community as real resilience and proper history would.

In the end, Burleigh abdicates all responsibility by suggesting that ‘the battle with jihadism will only be won by Muslims themselves’. In fact, he laments that ‘it is difficult to see how things can be rectified’, comparing contemporary counterterrorism initiatives to an endless game of ‘whack a mole’. Unable to engage in, let alone win, the battle of ideas, as has happened before, Burleigh will simply be left alone with his cronies and his clones.

Blood and Rage: A Cultural History of Terrorism, by Michael Burleigh, is published by Harper Press.

(1) Michael Burleigh: The reluctant guru, Guardian, 11 March 2008

(2) Actions speak loudest to terrorists, Mr Brown, Telegraph, 15 November 2007

(3) See Michael Burleigh’s website.

First published on spiked, 30 May 2008

Death of the warrior ethos

Weaving a path from Achilles to Rambo via Shakespeare and Tolstoy, Christopher Coker’s insightful new book captures the increasing demonisation of war – even ‘good wars’ – and the denigration of honour, duty and glory.

In his 1998 BBC Reith lectures, ‘War and Our World’, the military historian and journalist John Keegan described war as ‘collective killing for a purpose’ (1). It is hardly surprising, then, that societies in which a spirit of solidarity has been diminished, the necessity to fight dismissed, and attempts to impart a sense of direction or meaning discredited, are unable to celebrate their wars and their warriors.

Primarily, of course, it is the ‘killing’ part of Keegan’s definition that contemporary societies feel uncomfortable with, or reject outright, rather than the ‘collective’ or ‘purpose’ elements, which many would dearly like to rediscover while remaining sceptical about some of their earlier incarnations. But it is precisely the absence of these latter factors that has served to create confusion about the former.

Anyone wishing to pay tribute to warriors today, or to compose a paean to war as ‘a test of, and testament to, a nation’s resilience’, would be ill-advised to do so. Christopher Coker, professor of international relations at the London School of Economics, has done the next best thing. His book The Warrior Ethos, while imbued with a sense of loss, also appropriately captures the ambivalence and ambiguity of our times.

Despite copious notes and references, this is far from being an academic text. In parts The Warrior Ethos feels more poetic than polemic, as Coker endeavours to weave a path from Achilles to Rambo via Shakespeare and Tolstoy. His sense that the spirit of an age can be captured through its literature and culture, rather than historical and political analysis alone, proves most rewarding, especially in revealing what has changed.

It is not simply a lost world that is unravelled, but lost words, too. ‘Honour’, ‘Duty’ and ‘Glory’ lose their meaning, and their use, if we forget the past, dismiss the present and refuse to face up to the future. ‘Heroism’, stripped of its subjective factor, appears merely to be bred-in, or institutionalised. Alternatively it is pathologised, as a self-serving and dangerous obsession, or worse, as the sad struggle of trauma victims.

Henry V’s decisive defeat of the French at Agincourt in 1415, as well as Shakespeare’s account of it with the famous ‘band of brothers’ speech, can now be portrayed as being about people suffering from ‘a centuries-old “deception” about the glory of war’. Inverting this new orthodoxy, Coker reveals brilliantly how ‘we tend to deprive them of the fullness of their lives in order to support and sustain the smallness of our own’.

It is our contemporary construction of events that can transform these historic episodes from being ‘full of meaning’ to being seen as a ‘futile waste’. In that sense, the postmodern disposition towards not taking anything too seriously is quite disabling, even in the absence of any enemy we may face. But we should be clear that this ‘incredulity toward metanarratives’ (2) stems from an interpretation of the world rather than being inherent.

‘All of us in the Western world come from a culture which doubts its own first principles’, rails Coker. So freedom must be fought for afresh in each generation. Struggle, too, despite its rejection by those of sensitive dispositions, is a necessity in nature. ‘Only in the last thirty years have we begun to imagine living at peace with nature’, he notes, yet increasing numbers seem to be forgetting this at their peril.

War, like all struggles, is transformative, both for the individuals concerned and for society. Little wonder then, that societies which – despite their rhetoric – fear change, rejecting the uncertainties it creates and endlessly seeking to control risks, should have such qualms about it. Fighting forces them to take a view of the future, regardless of whether they prefer the present or believe in any particular cause.

This transcendental element of existence is most acutely felt by warriors, who are asked to be willing to sacrifice themselves for the ‘greater good’ – another unfashionable concept, and one invoked by the Military Covenant, which was only relatively recently codified and published (3). But again, a ‘greater good’ presumes a ‘collective’ with a sense of ‘purpose’, despite these being conspicuous by their absence today.

Coker does not romanticise killing, although, like a recent report accusing the British Army of glamorising warfare (4), he notes a growing reluctance in military circles to use the ‘K’ word. The preference for euphemisms, such as ‘engage’ or ‘suppress’, can rightly be interpreted as defensive. As in animal testing laboratories, when researchers avoid the ‘K’ word, or claim to prioritise ‘welfare’, their evasion allows opponents to run riot.

It was enlightened modernity itself that put paid to Homeric heroes such as Achilles who, living in an unfettered Hobbesian ‘state of nature’, could go about butchering their opponents with little sense of remorse. The modern warrior is accountable to society, choosing to fight for a shared interest. We are not driven mindlessly into feuds through genetic blood ties, but determine our course by our own reason.

But society, suggests Coker, by sanctioning its warriors’ actions, simultaneously removes the determination of their destiny from them. This subordination of the one to the many works so long as there are many who wish to be one, and so long as all parties trust one another and themselves. If these bonds are broken, a vast array of legal codes is imposed upon would-be warriors to patrol their actions and even their thoughts.

In addition, the American cultural historian Paul Fussell suggests that the attenuation of religious belief in the modern world contributed to making modern war and especially death much harder to bear than in the past (5). ‘How does a society cope with death when it no longer dreams of eternity?’ asks Coker, noting how it has been turned into a risk to be avoided, thereby robbing it, and life, of their significance.

The error is to measure life in terms of risk at all. Life, argued Freud, loses its interest when death may not be risked (6). Another way to put it is that there is more to life than mere existence through risk management. As Coker argues, ‘Reason serves the passions; it doesn’t suppress them.’ Yet, in recent years, the military has tied itself in knots assessing risks, thereby encouraging its detractors to do likewise (7).

Take one example, the tragic deaths from gunshot wounds of four young soldiers at the Princess Royal Barracks in Deepcut, England, between 1995 and 2002. This has now led to, by one count, 17 separate inquiries, including those by members of parliament, the Ministry of Defence, the Official Review, the Independent Police Complaints Commission, and a two-year independent review of the various re-investigations (8). No wonder the military feels paralysed.

Meanwhile, the West’s enemies in the ‘war on terror’ claim to embrace death. But suicide bombers are not warriors, proposes Coker, because they are not accountable to society. The problem here is to take them at face value, or to view them as all that different from us in the first place. It is not just the Ummah that is not consulted nowadays, but the self-disenfranchised millions in Western democracies, too.

Maybe, in the absence of a cohering society, we are all afflicted by a form of nihilism to some degree. Coker cites Nate Fick in his memoir of the Second Gulf War exclaiming: ‘Death before dishonour. Marines tattoo it on their forearms, but these fuckers [the Iraqis] live it.’ (9) Other, more dispassionate observers, however, characterise self-styled jihadists as making a lot of noise but saying very little, and as having a passion for self-publicity.

Image influences reality, but is limited too, notes Coker. He sees how today’s ‘Jarheads’ are more likely to style themselves upon one-dimensional Hollywood heroes, hip hop and the lyrics of Marilyn Manson, than to have read or appreciated the psychological depth of Greek epic poetry, and bemoans the ‘bad ass’ influence within the US military of those for whom Tupac Shakur is a more familiar figure than Abraham Lincoln (10).

This is a lazy caricature for, while the elites may not describe Iraqis as ‘motherfuckers’ or themselves as ‘cool because we’re so good at blowing shit up’, it is they who are confused in the current period. They fail to lead for lack of purpose or belief in themselves. And contrary to Coker’s assertions, films do capture mythical dimensions and transcendence, as epics like Crouching Tiger, Hidden Dragon or Pan’s Labyrinth prove.

He is also in danger of overstating the role of technology. Coker seems mesmerised by the world of cybernetic warriors and unmanned combat aerial vehicles (UCAVs). Quite how much he knows of the ‘hypothalamic pituitary adrenal axis’ is anybody’s guess. True, such developments impact on the conduct of war, but it is the loss of confidence in humanity that drives these developments, rather than the other way around.

Technology need not erode tradition and myth, as he suggests. If, for myth, we read a self-affirming narrative that inspires, instructs, enables and connects, as he proposes, then this necessitates the engagement of human passions. For tradition, we could prioritise the truth, as we see it, one that has to be fought for and engaged with, not just imparted. This is the business of politics, not technology or management.

‘Theory’, wrote Marx, ‘becomes a material force as soon as it has gripped the masses’ (11). It is the inability of the elites today to appreciate the material power of ideas, let alone fight for them, that leaves them unarmed, looking to technology or management to fill the gap. Ensuring ideas ‘grip the masses’, and become the truth, combining objective evidence with subjective will, is a labour of love entirely alien to them.

There is a real irony, then, in the US military having now introduced a ‘Warrior Ethos’ programme across its force, from basic training to the Army War College, to remind its personnel of what is expected of them. Like ‘citizenship classes’ in the UK, this seems doomed to fail where it is most needed – at the level of lived ethos as opposed to paper exercises where, unlike on the battlefield, targets are readily met.

The British military is not immune to such instrumental trends. Reports highlight how a career in the Armed Forces ‘equips people with skills and qualifications that can be transferred to civilian life’ (12), or provides ‘an opportunity that may have been denied in civilian life’ (13). In general, the approach is one that emphasises what people can get out of the military, rather than what they will need to give.

Unsurprisingly, then, with such confusion at large across society, as well as embedded in the ranks of the military, Coker identifies how a ‘Therapy Culture’ further confuses matters. It acts as an ‘invitation to infirmity’, he proposes, noting that ‘we heal psychic wounds when we are able to give meaning to our experiences’. Clearly, if an experience is deemed ‘meaningless’, then ‘so is the pain and suffering that results’.

We are now a long way from George C Scott’s portrayal of the great American general, George Patton. Talking about war at the start of the 1970 movie, he is depicted as confessing: ‘I love it. God help me, I do love it so. I love it more than my life.’ Nowadays, it is journalists who depict themselves as the real heroes of war, risking it all in search of ‘the truth’ – and without killing anyone to boot (14).

There is no glory in killing but, as Plato reminds us, ‘What makes us human … is not nature or nurture but our capacity to rise above both’. If we do not want, as Nietzsche warned, to find the abyss looking into us when we look into it (15), then it is high time we were reminded of these few basic truths. The fight for truth and for freedom is essential, and Coker’s book goes some way towards highlighting this.

The Warrior Ethos: Military Culture and the War on Terror, by Christopher Coker, is published by Routledge.

First published on spiked, 29 February 2008

(1) War and Our World: The Reith Lectures 1998, John Keegan, Hutchinson, 1998, p.2

(2) The Postmodern Condition: A Report on Knowledge, Jean-François Lyotard, Manchester University Press, 1984, p.xxiv

(3) Soldiering – The Military Covenant, Army Doctrine Publication Volume 5, 2000

(4) Informed Choice? Armed Forces Recruitment Practice in the United Kingdom, David Gee, Joseph Rowntree Charitable Trust, 2007

(5) The Bloody Game: An Anthology of Modern Warfare, Paul Fussell, Scribner, 1991, p.24

(6) Freud and Philosophy: An Essay on Interpretation, Paul Ricoeur, Yale University Press, 1970, p.329

(7) Informed Choice?, op. cit.

(8) Breaking the Covenant: Governance of the British Army in the Twenty-First Century, Anthony Forster, International Affairs, 2006, Vol.82, No.6, p.1048

(9) One Bullet Away: The Making of a Marine Officer, Nathaniel Fick, Houghton Mifflin, 2005, p.82

(10) Generation Kill: Living Dangerously on the Road to Baghdad with the Ultra-Violent Marines of Bravo Company, Evan Wright, Bantam, 2004

(11) A Contribution to the Critique of Hegel’s Philosophy of Right, Karl Marx, 1844

(12) Ministry of Defence Responds to Independent Report ‘Informed Choice?’ on Armed Forces Recruitment Practice in the UK, Government News Network, 7 January 2008

(13) House of Commons Defence Committee, Duty of Care (Vol.1), The Stationery Office, 2005, pp.5-6

(14) This Man’s Army: A Soldier’s Story from the Front Line of the War on Terrorism, Andrew Exum, Gotham, 2005, p.233

(15) Beyond Good and Evil, Friedrich Nietzsche, 1886