Prevent: a very risky strategy

The UK’s clueless counterterrorism strategy sees threats everywhere.

In recent weeks, there has been much attention paid to, and some considerable opprobrium poured on, the UK government’s latest version of its Prevent strategy. Prevent is one part of the four Ps (the others are Pursue, Protect and Prepare), originally framed within CONTEST – the UK’s counter-terrorism strategy. CONTEST – driven by a dawning recognition of the problems posed by homegrown terrorism – was first published in 2006, the year after the 7/7 London bombings, and (together with Prevent) has undergone significant revisions since, including: to demand adherence to specific values; to differentiate and broaden its remit; and to monitor outcomes more rigorously.

Since its inception, hundreds of millions of pounds have been poured into Prevent in order to encourage liaison and dialogue between the authorities and the supposed representatives of various Muslim religious and community groups. The latest furore results from changes to Prevent mandated under the Counter-Terrorism and Security Act 2015, in particular Section 26, which places a duty on certain specified authorities (local authorities, health authorities, schools, colleges, universities and prisons) to prevent people from ‘being drawn into terrorism’ (a notably passive formulation).

Being more socially and culturally oriented than the other elements of CONTEST, Prevent has always been subject to considerable criticism. Initially, much of this came from traditionally right-wing groups and media who accused the government of consorting with and funding radicals. More recently, after the 2011 revisions, left-wing and radical groups have joined the fray, criticising the partnerships created by Prevent on the grounds that they encourage the infiltration of Muslim communities and justify a culture of suspicion and surveillance.

Both sides in this debate have a point. Following the latest changes to Prevent, various institutions, groups and associations sought to make their reservations clear by responding to the government’s call for feedback. Many of these responses are available online and raise important points about terminology, academic and religious freedom, as well as trust and accountability.

But they also miss the wider problem, which is that the approach taken by Prevent – explicitly aimed at identifying those ‘at risk’ of becoming terrorists, and implicitly framed in the fashionable language of the so-called precautionary principle – is fundamentally flawed. Worse, its latest incarnation, rather than offering mere guidance, now imposes a duty backed up by significant sanctions for those identified as being non-compliant. It transforms risk management from a loose organising principle for societies that lack a broader strategic vision into a set of laws that impact on everyone.

Of course, as the American sociologist Robert Merton noted as far back as 1948, in an article on ‘The self-fulfilling prophecy’, false assumptions have real effects for all parties. In that regard, Prevent has been problematic since its inception. The very act of engaging particular groups around specific issues has the effect of identifying them as different. But it also limits the potential for genuine dialogue between communities, leading to silly spats between groups accusing each other of being either Islamophobes or Islamofascists instead.

As I have noted before, the real drivers behind homegrown terrorism are neither religious nor political ideologies. Rather, they emerge from domestic cultural confusion (as well as confusion further afield, as evidenced by the inability of the 2008 Mumbai attackers to identify their demands). Ours is an age in which Islam acts more as a motif than a motive. It emerges as a rationale for anger rather than necessarily driving it in the first instance. Why Islam appears to have become the religion of choice for the readily disaffected in the West is worthy of further study. Still, we are all engaged in a search for purpose and meaning due to the failure of contemporary society to provide any coherent direction. Governments also fall foul of this lack of societal purpose when presenting ill-defined de-radicalisation strategies that are incapable of saying what people should be de-radicalised to.

Accordingly, if we are to ‘identify those vulnerable to being drawn into terrorism’ – knowing that there is no single or simple model for this process – it ought to be almost everyone that now comes under the purview of the authorities (although, fortunately, only an impossible-to-identify minority will ever act out their nihilistic fantasies). And, far from being vulnerable and readily groomed online by charismatic preachers or activists, would-be jihadists, according to most evidence, tend to be smart and wilful individuals determined to connect with those forces that appear to inspire them in the first place. The real question to be addressed, then, is why the jihadist narrative falls on such fertile soil here in the UK. Or, to put it another way, what is it about contemporary society that propels a minority to find meaning elsewhere?

The confusingly named Prevent Duty Guidance defines extremism as ‘vocal or active opposition to fundamental British values, including democracy’, among other things. But who is it that is undermining democracy in the current climate? When Avinash Tharoor – a contemporary of Mohammed Emwazi (aka Jihadi John) at the University of Westminster – noted in his Washington Post piece that a Westminster student wearing a niqab opposed Kant’s democratic peace theory during a seminar on the grounds that, ‘as a Muslim, I don’t believe in democracy’, who should have been reported to the authorities? The student? The instructor who, Tharoor noted, did not question her? Or all parties to the exchange (or lack of it)?

Prevent also puts great store by ‘Channel’ – a programme that offers more targeted support to certain individuals identified as being ‘at risk’ by various authorities – which has now been put on a statutory footing. That such programmes, like those offered by the Religious Rehabilitation Group in Singapore, may actually make matters worse by presenting rather confused individuals with the somewhat more robust anti-modernist discourses of enthusiastic mentors, as well as teaching Islam to those who hitherto knew little of it, is rarely conceded.

Authorities don’t simply have to comply with Prevent. There are many other interrelated policies, too, and a veritable alphabet soup of acronyms and agencies: CTLPs, LSCBs, BCUs, CSPs, LSPs, NCTTs and PEOs – to name just a few. Far from being strategic, the cacophony of voices reflects the absence of direction and purpose. Little wonder that former MI6 chief Sir Richard Dearlove made a speech at the Royal United Services Institute last year asking for a sense of proportionality to be restored. At the height of the Cold War, his agency only ever put 38 per cent of its resources – at most – into addressing the presumed threat posed by the Soviet Union. Today, by contrast, in excess of 50 per cent of the budget of the three non-military security agencies is expended on counterterrorism.

When the Cold War ended in 1989, a few insightful voices noted that one of the consequences would be a short interregnum during which time social forces would have to be reorganised to maintain the status quo. This period might have offered some opportunities for political alternatives to emerge but, at the same time, the advent of an exaggerated consciousness of risk and a concomitant diminished sense of agency simultaneously pointed to what might lie ahead as a barrier to change. Albeit unconsciously, that new framework for society – organised around risk and precaution, and treating citizens as hapless victims – has now been legislated for, and we are beginning to see the signs of a new ideology emerging that will continue to close down avenues of opportunity.

As many world leaders joined hands two months ago – ostensibly to march for freedom of expression post-Charlie Hebdo – there were some who, somewhat naively, hoped this might represent the resurgence of a core Enlightenment value. In fact, what we witnessed was the last blast from the past, and the final closing of the door to freedom for the foreseeable future. It also confirmed the establishment of a new age of control through behaviour management and a now legally mandated expectation of conformity. That so many have been complicit in allowing this to happen only highlights the enormity of the task ahead for those who still hold to freedom as the basis for any enlightened future society.

First published on spiked, 19 March 2015

Bitter Lake: searching for meaning in the Afghan abyss

Adam Curtis’s latest is a Heart of Darkness for the post-9/11 era.

‘We’re your boys, you stupid bitch! We’ve brought back Mujahideen ghosts!’

So shouts a disabled Russian serviceman on a subway train to a woman who has asked him to stop ranting and raving about the Afghan War of the 1980s. The scene appears halfway through award-winning BBC filmmaker Adam Curtis’s latest piece, Bitter Lake. Afghanistan was the Soviet Union’s heart of darkness – the place where a crumbling empire sought to rediscover a last gasp of purpose. More recently it was the West’s apocalypse now.

The film, lasting over two hours and only officially available on BBC iPlayer, is ostensibly a search for the meaning of the absence of meaning that afflicts contemporary Western society. It is told through the prism of the West’s engagements with Afghanistan – specifically Helmand Province – over a 60-year period, as mediated by relations with Saudi Arabia and occasionally reaching back as far as the Anglo-Afghan wars of the nineteenth century.

As with all of Curtis’s work, certain aspects invite contestation and clarification. ‘Those in power tell stories to help us make sense of the complexity of reality. But those stories are increasingly unconvincing and hollow’, he begins, seemingly unabashed at the single narrative he, too, is about to tell. But, while his account is replete with contradiction, he does have a point. Authorities increasingly talk of the need for narratives today – unaware, it would seem, of the need for material and ideological drivers that might determine such narratives.

It is the first Saudi monarch, Ibn Saud, whose machinations serve as the start for Curtis’s narrative. According to this tale, Ibn Saud was supposedly involved in hoodwinking an ailing President Roosevelt in the closing days of the Second World War. Roosevelt was allegedly tricked into turning a blind eye to the Kingdom of Saud’s religious radicals in return for Saudi oil. The negotiations, held on the USS Quincy at Great Bitter Lake on the Suez Canal, provide both the film’s title and the deus ex machina for its plot.

A smug Saudi oil minister, Sheikh Yamani, then lectures the West on the need to get used to the new normal in the aftermath of the 1973 oil-price hike designed, in part, to chill American support for Israel. The glut of petrodollars that ensued had to find a new home, and it is here that Curtis’s second bête noire – international investment bankers – comes in. The funds were invested primarily in armaments, a ruse to support ailing Western economies. And the weapons produced were sold back to the same Arab leaders who needed to keep their neighbours and domestic populations in check.

So far, so simple. Except that surely Curtis, too, is eliding the odd complexity here?

Because two years before the oil crisis of 1973, the US had unilaterally terminated the convertibility of the dollar to gold, thereby ending the Bretton Woods agreement on monetary management and injecting considerable uncertainty into the markets – including the oil market. This, in turn, had been driven by the relentless printing of dollar bills to fund the Vietnam War and – more significantly – by the steady exhaustion of the US economy in the aftermath of the postwar boom. Unable to invest at home, capital was flooding overseas. Dubious allies were a consequence, not a cause, of America’s economic woe.

But the tendency for the rate of profit to fall never seems to make for such good box office as bloody Arabs and greedy bankers. Fortunately, though, Curtis does note the extent to which the bankers were then encouraged in their profligacy by Western leaders desperate to make ends meet and devoid of any political vision or agenda. But those leaders largely escape judgment in Bitter Lake.

It is probably Curtis’s bleakest vision yet, punctuated by a typically insistent soundtrack, new musical additions and copious quantities of silence. Images are made to speak for themselves – a drop of blood working its way down a camera lens, a terribly injured Afghan girl dressed up by her father as a princess with a tiara to meet the press, and a scene lasting over two minutes of a fully-kitted soldier picking up and stroking a somewhat ragged-looking dove. Such footage would usually have been left on the cutting-room floor. You wonder what Nietzsche would have made of the scene in which the squaddie stares increasingly meaningfully into the dove’s eyes.

Atrocities abound – some young Afghans unselfconsciously confirm their part in the stoning of prisoners, and US Marines joke about the number of unofficial rounds, including fatal ones, they have fired. But only the most naive or innocent would be shocked by these tales. As to the inevitable question as to what the overall montage means, surely the right response is to point to the total absence of purpose or meaning in the Afghan War itself.

The overall effect of the film is both disorienting and dispiriting. How did the West come to abandon its 1950s-era optimism, when it wanted to construct dams in Helmand to help build a modern state? Why did it opt instead to chase shadows (a British Army captain admits to defining any opposition as ‘the Taliban’)? A crisis of nerve, built on the inability to define and develop a new project for humanity in the shadow of political and economic exhaustion, was the real answer here.

By the 1970s, rich, hippy kids were travelling from the West to Afghanistan in search of authenticity. It spoke volumes about their inability to find meaning domestically, and even more about the failure of Western elites to promote a purpose for them. A decade later, when wealthy US socialites like Joanne Herring were sent to Afghanistan to connect with the Mujahideen, their Orientalist fantasies about saving ‘these people who believed so much in their God’ sounded as shallow and libidinal as those of Joan Sims in Carry On Up the Khyber, from which Curtis uses clips.

Repeating one of my contributions to his 2004 three-part, BAFTA award-winning TV documentary series, The Power of Nightmares, Curtis notes how we in the West no longer believe in anything. It is to compensate for this that thousands of young people in the West now seek to find meaning in Islam – with some embracing particularly backward versions of it. But it was really 40 years earlier that the rot set in, as the sons and daughters of the well-to-do paved the way for today’s Islamist meaning-seekers with their own hippy-ish escapism.

It may well be – like the Russian soldier railing on the subway – that elements of the Afghan abyss will come back to stare into us here. But, as in Joseph Conrad’s Heart of Darkness, it seems more likely that it is we who will have ended up exporting our confusions over there.

Along with the army of security consultants, legal advisers and civil-society reformers who fleeced the fledgling Afghan state of its funds in recent years, the West also sent some ‘experts’ to improve Afghans’ wellbeing and culture. So it is that the abiding image of the film for me is that of an English art teacher enthusiastically extolling the meaning of Marcel Duchamp’s conceptual artwork, Fountain, an inverted male urinal, to a group of recently liberated and incredulous Afghan women.

‘The horror! The horror!’

First published on spiked, 5 February 2015

Charlie Hebdo: more security isn’t the solution

We don’t need new anti-terror laws – we need more open political debate.

Predictably, in the aftermath of the atrocities in Paris last week, many commentators have emphasised a presumed need for more security. The UK chancellor, George Osborne, was quick off the mark, asserting that tackling terrorism is now Britain’s ‘national priority’. Really? The number-one focus for some 64 million people is to handle the extreme actions of vexatious malcontents? What does that suggest about the malcontents’ presumed power? And what does it say about ours?

The problem revealed here, and in the talk of fighting a ‘war’ against extremists, is a complete loss of proportionality and perspective. We are no less safe today than we were a week ago. We have known for some time that random terrorist acts might strike anyone, anywhere and at any time. And yet now, in the wake of the Charlie Hebdo massacre, politicians are acting as if they had the solution to this threat all along.

UK prime minister David Cameron’s visit to Washington this week, in which he will discuss collaborating with President Obama against the threat of cyber attacks, reflects how lacking in ambition and thought this perceived solution is. In short, all Western leaders feel they can do to tackle the threat posed by terrorism is to intercept the perpetrators before they get a chance to commit their destructive acts. This leaves the central question of why these terrorist attacks are taking place unanswered.

So, billions are to be spent monitoring our movements and communications, while next to no energy is directed at trying to appreciate why it is that the nihilistic rhetoric of a tiny minority is resonating with some people in the West. But tighter security offers only a technical fix and affirms an elite narrative in which we are said to live in uncertain times and among capricious people in need of control. Thinking about what causes the new nihilism, by contrast, would open up a political debate about values and principles that could truly get to the heart of the matter. This, however, is a debate our political leaders would prefer to avoid.

What’s worse, ramping up security in response to terror attacks simply doesn’t work. It is an approach that confuses information with intelligence. What matters most is not how much evidence you can amass but how that evidence is interpreted. Drowning in data, security agents are often unable to see the wood for the trees. As the numbers of people now under surveillance run into the many thousands, the authorities ought to recognise that what they face is not a security problem, but rather a social problem that they have yet to address.

World leaders may have marched in solidarity with Charlie Hebdo this weekend, but the authorities continue to erode our liberties. Last November, the UK home secretary, Theresa May, tabled yet another counterterrorism bill – the seventh such piece of legislation to be introduced since 9/11. Among other stipulations, this bill will place new requirements on schools and universities to prevent those in their charge from becoming ‘radicalised’ – whatever that means – with severe sanctions for those who fail to do so. Most university officials, including security staff, view these new measures as both unworkable and unnecessary. Yet most universities will implement new procedures so as to be seen to be in accordance with the law. Ministers appear blind to the fact that they now act, and expect others to act, in bad faith.

So, authorities no longer say what they think, while not really believing what they say, either. Few, if any, government officials believe their anti-terror initiatives will work. These measures are purely for show. Politicians and officials have to be seen to be going through the motions.

Crises, such as the one we are facing today, often provoke a race to find meaning in society. Former British prime minister Tony Blair noted after the London riots that there is a danger at such times of crisis of reaching for the ‘wrong analysis, leading to the wrong diagnosis, leading to the wrong prescription’. It is imperative, therefore, that we oppose the predictable post-Charlie Hebdo calls for more security and instead kickstart a more rooted social analysis of the issues at hand.

Rather than accepting the supposed need for more protection, we ought to be asking why it is that our contemporary culture has so thoroughly failed to inspire and engage a generation of young people – to impart to them a sense of meaning, purpose and vision – that some of them are searching for meaning on jihadi internet forums or in the teachings of arcane religious belief systems.

Over the past week, many have repeated the mantra that the first duty of the state is to protect its citizens. That, too, is open to debate. The state itself is the creation of people who were prepared to risk everything, including their lives, to be free. Sadly, the US, in recent years, has seemed determined to make itself the land of the safe rather than the land of the free. It would be a very sad day if the French Republic was to go the same way.

What has most been missing in the so-called war on terror has been a vision for society beyond terror. That is the essence of real resilience: a projection of purpose and a sense of what we are in the absence of all adversities. If we were to achieve this, fewer people would look for purpose elsewhere, and the few that did decide to commit barbaric acts would be framed in the proper context: as mindless criminals.

First published on spiked, 12 January 2015

Dying for a purpose

The absence of meaning in modern war has made combat losses hard to bear.

‘If I should die, think only this of me: That there’s some corner of a foreign field that is forever England.’
‘The Soldier’ (1914), by Rupert Brooke

For almost 300 years, the British Army left its fallen where they fell, as had almost all armies before it. It was only after the Falklands campaign of 1982 that a significant number of bodies were first repatriated – at the behest, initially, of just one family – and even then discreetly, for fear of demoralising the public and those still serving.

In his latest work, historian Steven Casey of the London School of Economics explores America’s treatment of war losses during the twentieth century. In his telling, officials and politicians will always be caught between a rock and a hard place when it comes to dead soldiers. Underreporting losses, or failing to attend to the return of the dead, is seen both as insensitive and as an attempt to conceal the harsh realities of war from the public. But, at the same time, high-profile send-offs of the dead can be seen as self-serving.

Fundamentally, Casey sets out to explode the mantra established by the US political scientist John Mueller, who, during the Vietnam War in 1973, asserted that ‘as casualties mount, support decreases’. Casey’s retort – in effect, ‘not always’ – is unconvincing, as he fails to draw out why support doesn’t always decrease as casualties mount.

Nevertheless, When Soldiers Fall, presented cogently and chronologically, provides plenty of useful material to allow readers to draw their own conclusions. Primary among these is the growing inability of governments to project purpose outside of war, let alone through it. To ‘die for a lie’ in Iraq, for example, is considerably worse than to ‘die for a tie’ in Korea.

It is this changing ideological context that provides the backdrop to a story that opens with a relatively small and inexperienced (by European standards) US War Department in 1917 – the general staff consisted of just 20 officers – that evolved into the mightiest military on Earth just half a century later.

It is a period of undeniably rapid technological change – not least in reportage, from newsprint through radio and television, to today’s live-streaming internet coverage, which now escapes the control of a once powerful few. But technology alone does not determine outcomes, as the advocates of ‘technowar’ – from air power to smart bombs – were to discover.

Rather, Clausewitz’s dictum that war is ‘the continuation of politics by other means’, affected by domestic ‘friction’ and conceptual ‘fog’, might be a better place to start. Winning a war relies on the buy-in of the military (just as much as of the public), rather than just a nation’s technical superiority over the enemy. Spirit can matter more than kit.

Admittedly, mundane elements do affect the treatment of fatalities and casualties. Not least, as Casey examines, how to count the dead and the wounded. Losing a battle does not create a situation conducive to conducting elaborate or accurate body counts. Equally, defining serious injury is a moveable feast, complicated when casualties are treated (increasingly well and quickly) and returned to the front.

No doubt, there will always be challenges involved in the treatment of the dead. But these will not be resolved through better communications or PR strategies. It is the political dimension that will prevail.

From the war to end all wars and its reprise after Pearl Harbor, through Korea and Vietnam, to Afghanistan and Iraq (excluding the odd foray into Somalia and elsewhere), Casey identifies the painstaking lengths to which the US military went to identify and count fatalities and casualties, as well as its deliberations over how and when to present these to the public.

Implicitly, this is a story about US rulers’ changing views of the public – from partners who could be trusted to share the same values and outlooks, to concern as to how best to keep the masses on side. But, as Frank Furedi has identified, experiencing problems as relating to trust psychologises the real driver – a crisis of authority.

Casey recognises that the domestic front was always hungry ‘not only for news… but also for analyses’, effectively conceding that making sense of conflict can matter just as much as numbers. Readers will have to look elsewhere to understand this gradual inability to imbue war with any meaning across the twentieth century. It is this that matters more than Mueller’s casualties.

It is why, as Casey correctly identifies, by the time of Vietnam, many in the military, just as much as in the media, had become uncomfortable with the language of death. GIs were never killed; ‘they were aced or greased, waxed or zapped’. But if the military itself used terms such as ‘wasted’, what did this say about its belief in the mission?

Along the way Casey does make some fine observations – not least regarding how in 1917, 1941 and 1950 the United States had gone to war with a segregated army. Unhappy that their role had been diminished and portrayed as one of support or lacking combat capability, many black voices demanded the ‘right to fight’, with tragic consequences in Vietnam. Casey cites Senator George McGovern who, in the aftermath of his 1972 presidential election defeat to Richard Nixon, ruefully observed: ‘When the corpses changed colour, American interest diminished’ – a sad indictment of the anti-war movement, which was more preoccupied with the shooting, by Ohio national guardsmen, of unarmed college students at Kent State University in 1970.

Another sorry truth is that, in terms of both fatalities and casualties, American losses during the Second World War amounted to more than twice those from all its other conflicts in the twentieth century put together. This prompted the US authorities to focus on promoting objectives rather than on identifying losses, which, as Casey notes, there was hardly time to audit and report anyway.

In his wry essay ‘The Gulf War did not take place’, French philosopher Jean Baudrillard highlighted quite how ‘safe’ America’s wars subsequently became. Mistakenly, Casey sees the lower death toll as a reason why personal narratives now receive far greater prominence. But the media’s focus on emotion represents a retreat from political debate.

Most significantly, as I have argued elsewhere, values such as honour, duty and glory appear entirely anachronistic in an age when they have lost their use and meaning. When the necessity to fight – even for ideas – is dismissed, and attempts to impart a vision or direction discredited, then the game is up, and worse – destructive dissent is to be expected.

One consequence of conflicts’ loss of meaning has been the increased focus on technical processes such as auditing and communicating casualties, instead of clarifying a purpose. Yet societies possessed of a sense of mission – however misguided – have been able to countenance and withstand the most remarkable forms of barbarity, without first having to prioritise force protection.

Casey usefully reminds us that it was the Republican Party that opposed all of America’s major twentieth-century wars (two world wars, Korea and Vietnam). But his view that ‘in any war, faced with an information vacuum, the press and public tend to speculate’ confuses things. After all, the prevalence of rumours is also a measure of political confusion.

Rupert Brooke, who would not survive the war, continued the poem that opens this review: ‘There shall be in that rich earth a richer dust concealed; a dust whom England bore, shaped, made aware.’ Brooke’s soldier believed in something that gave his life, and indeed his death, a meaning. It is this loss of meaning that makes combat losses so hard to bear today – not just the absolute numbers.

When Soldiers Fall: How Americans Have Confronted Combat Losses from World War I to Afghanistan, by Steven Casey, is published by Oxford University Press.

First published on spiked, 8 August 2014

Putting the human soul on the slab

Human behaviour cannot be understood through brain scans.

If you step outside right now, you’ll find a lot of leaves have fallen on to the pavement. Some have blown away, but have left an imprint there, so you get these beautiful patterns left on the paving stones. If I were to show you a photograph of one of those prints, I think most people would say that they saw the print of a leaf. But that’s not what you would see; that would be your interpretation of what you see. What you would actually see is a patch of colour superimposed on another patch of colour.

You interpret what you see in this way because you have prior experiences of seeing leaves; you would have experienced the leaching of pigment from leaves that remains on the pavement, and you have experiences of things changing over time – that something that was there is no longer there. In fact, most human activity is the interpretation of data that we perceive through our senses rather than the mere representation of that data.

Such interpretations can be contested. Science tries to avoid that contestation by repeating experiments, gathering large numbers of results and assessing the significance of the numbers revealed.

One problem facing neuroscience is that it is plagued by very small signals – generated from the very small numbers of volunteers who are prepared to put their heads into a scanner – which are then enhanced through the amazing imaging capacity of information technology. But those images are still contested.

Another problem is that neuroscience is in many ways a fledgling science and plagued by numerous disagreements, even among its proponents. For example, some would argue that a brain on its own does not explain much; it needs to be understood in context – with its environment, with other brains. Other neuroscientists may not see it that way. This contestation of interpretation and meaning is crucial to understanding what neuroscience can and cannot tell us.

Others have also pointed out that some neuroscientists notoriously try to smuggle the language of consciousness into how they describe the processes going on in the brain. They will say things like neurons ‘signal’ or ‘provide information’ or ‘respond to’. But neurons can’t do those things; that’s what we as human beings do. Neurons simply generate and transmit electrical impulses.

Neuroscience is also plagued by vague language. So neuroscientists may talk about one phenomenon being ‘associated with’ or ‘influenced by’ another. These are descriptions, not explanations.

It’s also the case that we perceive many things simultaneously. In an experimental setting, we somehow have to prioritise the experiences the experimenter is asking us to focus on, yet there may be activity going on at the same time in our brains that we are not conscious of. For example, we are constantly maintaining a state of homeostasis, such as keeping our balance or an optimal body temperature, which may cloud other things that are going on.

I don’t say these things to dismiss neuroscience, but because it is important to say that it is a contested field with some important barriers.

Neuroscience also suffers from presenting a deficit model of the brain. There is a lot of focus on what happens when a part of the brain is damaged and what this supposedly reveals about what would normally be there. But suggesting that normality is the opposite of damaged is a bit like trying to study democracy through only looking at dictatorships because it is assumed that one is the opposite of the other. That’s not true – and that indicates that there are limits to some of the experiments that are currently being reported.

In other words, the study of the brain is incredibly complicated. If I wanted to study a butterfly colony, for example, I could say that all things are ultimately physical, that butterflies are made of protons, neutrons and electrons. Therefore, all I need to do to understand that colony is to sum up where all these particles are, how they interact and how they move across time. In reality, such an effort would simply be far too time-consuming; there’s not enough time in the universe to determine what would happen. Hence, I don’t use physics to understand butterfly colonies.

My thoughts and actions may be reflected in my brain but they are part of a social field of relations, not just a neural or chemical set of relations. For example, a slip of the tongue may leave a neural or chemical signature that can be measured, but it is a really uninteresting way of describing what happens. My degree of embarrassment will have nothing to do with that neural signature; it will have to do with social context. Likewise, I don’t think neuroscience is about to explain anything sophisticated or important like the existence of slavery or sex discrimination.

What is clear, however, is that the language of neuroscience has been hijacked by some people in order to further pre-existing political agendas. I largely work in the field of security and I could show you a whole series of US Department of Defense white papers in this vein, with titles like ‘The Neurobiology of Political Violence’, ‘Neuroscience Insights on Radicalisation’, and so on. The language of neuroscience has been hijacked and adapted to many other fields.

However, to suggest that neuroscience is corrupting or influencing society is to put things entirely the wrong way round. Science, while shaping society, is also a product of the particular society and cultural mood from within which it emerges. The current cultural mood could be categorised as one dominated by a fear of change, a sense of limits, and a feeling of fragility which encourages a particularly dystopian outlook. That dystopian outlook, whether they know it or not, guides many scientists as to what they go off and investigate. As a consequence, we have apocalyptic interpretations of environmental science and deterministic presumptions presented by neuroscience.

Karl Marx, in his introduction to A Contribution to the Critique of Hegel’s Philosophy of Right in 1844 (an article most famous for the idea that ‘religion is the opium of the people’), made two points that are very appropriate here. Firstly, he writes: ‘Theory becomes a material force as soon as it has gripped the masses.’ So here we have Marx, a materialist, who understands that matter is not all that matters, and that ideas can have a material impact.

Secondly, he writes: ‘The immediate task of philosophy, once the holy form of human self-estrangement has been unmasked, is to unmask self-estrangement in its unholy forms.’ That seems very pertinent to our current situation. Having been through a period when we have revealed the limits of religious determinism, now we need to reveal the limits of neurodeterminism.

This is an edited version of a speech given at the Battle of Ideas festival on Sunday 20 October 2013 at the Barbican Centre in London. You can watch video of the whole debate, ‘Soul on the slab: is there no limit to what neuroscience can do?’, at WORLDbytes.

Terrorism: a homegrown fear

The enemy in the ‘war on terror’ was created by a lack of meaning or purpose in the West.

When I was coming up, it was a dangerous world, and you knew exactly who they were. It was us versus them, and it was clear who them was. Today, we are not so sure who they are, but we know they’re there.

With these words, spoken in 2000 before he was elected US president, George W Bush captured some of the uncertainty that had gripped the US establishment in the long aftermath of the Cold War.

Celebrated by some, most notably Francis Fukuyama, as heralding the ‘End of History’, the dismantling of the Cold War framework that had largely organised world affairs (and shaped identities) – both internationally and domestically – across much of the twentieth century proved unsettling for all those who understood themselves through it.

Such confusions continue to this day, and not simply in the US. After a recent terror-related incident that targeted the vicinity of the Legislative Buildings of British Columbia on Canada Day, the BC premier Christy Clark announced: ‘They want us to be governed by fear. They want us to look on each other with suspicion. They want us to be seized with anger. They want this because they hate the things that make us Canadian.’ But, as some analysts immediately noted, who exactly were the ‘they’ that she was pointing to?

In this case, ‘they’ would appear to have been a petty criminal and failed heavy-metal musician turned Muslim convert, and his methadone-taking, common-law wife, neither of whom was particularly discreet about their dislikes. And – just as significantly – what exactly are ‘the things that make us Canadian’ (or American, or British, or anything else for that matter)?

As the British writer James Heartfield notes in his critique of the postmodern outlook, The Death of the Subject Explained, constantly calling into question the object of our attention also points to confusion relating to the subject – ourselves. Yet, almost 10 years into the ‘war on terror’, US president Barack Obama would still write in his foreword to the 2011 US National Strategy for Counterterrorism: ‘To defeat al-Qaeda, we must define with precision and clarity who we are fighting’.

Not only have we failed to understand the enemy but, more importantly, we have failed to grasp the extent to which we have changed, too, and how this shapes those we confront. It is our lack of vision and direction for society that generates confusion over who the enemy is in the war on terror, and how to respond to them.

Interpreting meaning

The common adage that ‘generals always fight the last war’ could be augmented to include all manner of other professionals – including politicians, media commentators and even intelligence analysts. A mental model once ingrained is truly difficult to shake off.

The atrocities of 9/11 necessitated a response, but the declaration of a ‘war on terror’ was by no means the only possible one. Compare that with the response of the mother of Dutch filmmaker Theo van Gogh, who said of her son’s murder at the hands of a self-styled jihadist in 2004: ‘What is so regrettable … is that Theo has been murdered by such a loser, such an incoherent person. Murder or manslaughter is always a terrible thing, but to be killed by such a figure makes it especially hard.’ As the Holocaust survivor Viktor Frankl (1946) noted over half a century before, it is not suffering that destroys people – but suffering without meaning.

So, after 9/11, a meaning – political ideology – was presumed and projected. It allowed a disoriented administration the semblance of clarity and offered a cohering mission to society. The administration was facilitated in this by the perpetrators themselves, whose chatter about global jihad was taken at face value.

In a similar way, the presumption that Iraq held weapons of mass destruction led – in 2003 – to its invasion. Any lack of evidence was either ignored or taken to confirm pre-existing views regarding how devious the regime was. Either way, policy needs and presumptions – not evidence – determined outcomes.

The same is true of much intelligence. This necessarily combines information with the interpretation of that information. Yet, time and again, when intelligence failures are examined, the tendency is to blame the information alone – either because there was too little of it, or because there was too much to analyse. Alternatively, analysts worry about being provided with false or misleading information.

What is rarely questioned is the framework through which that information is interpreted. So, because in the past protests and violent outbreaks usually had a political or ideological purpose, today politicians, commentators and analysts look for political and ideological explanations – even when all the evidence points to the absence of these.

In the past, groups such as the Irish Republican Army (IRA) and the Palestine Liberation Organisation (PLO) fought national-liberation struggles. They used terror as a tactical means to achieve their strategic ends. But they knew above all that they needed to win the hearts and minds of their own communities.

In other words, they relied on mobilising a conscious and coherent collective. And they confronted an equally conscious and coherent state. Failures, on all sides, can be traced to their alignment – or not – with the people they claimed to speak and act on behalf of.

But al-Qaeda and the offshoots it supposedly inspires could not be more different. While some claim to speak on behalf of the ‘Ummah’, there is no evidence of any community ever having been consulted – let alone engaged. That is why even the families and friends of those involved express shock to hear of their activities.

Nor is there any coherent text outlining the purported mission or aims of these groups. Rather, much of this has been projected for them by analysts who seek to fill the vacuum of information left behind after the various acts of destruction with their own pet prejudices. A striking example of this: asked on television to articulate his demands, one of the perpetrators of the 2008 Mumbai attacks was heard putting the phone down and asking a co-conspirator what their demands were.

Even if the perpetrators were mindless cannon-fodder, as some have suggested, and even if we knew the real origins of these attacks, this would still fail to explain why no one has come forward to claim responsibility for this incident, or for many others. Even when someone does claim responsibility – through so-called martyrdom videos and other media – there is precious little content other than a rambling rage.

Our failure is to attribute meaning – either political or ideological – to these actions. We thereby imbue vexatious acts of violence with greater import than they deserve. By doing so, we also attribute far too much authority and power to small numbers of individuals.

Implicitly, we also identify a gaping hole at the heart of our own societies – where ideology and politics should be. For what kind of society is it that can be so rattled by events that – in perspective – should be seen as minor, if unfortunate, historical footnotes?

Some analyses even effectively exonerate the individuals concerned by finding cause for them in the conditions of the developing world and our supposed insensitivity to these. Above all, our responses have allowed local and regional struggles, as well as isolated, irrational acts, to be presented as conflicts of global and epochal proportions.

Reflected caricatures

Osama bin Laden himself was fond of citing Western politicians, commentators, academics and diplomats in seeking to legitimise his ostensible cause. Sounding like any other contemporary critic of American policy, he droned on about a rag-bag of motives at different times. From primarily complaining about the relationship between the US and the Saudi regime, he switched to focusing more on Palestine after the events of 9/11 and then only later to Iraq, echoing the anti-war lobby’s claim that the war was simply a money-making venture for large corporations.

He lambasted the US for not signing up to the Kyoto treaty to control greenhouse gases, accused Washington of being controlled by a Jewish lobby, and argued that Western advertising exploited women. After the Madrid bombings of 2004, he even proposed that Western leaders should have paid more attention to surveys that revealed how few people supported going to war in Iraq.

In all of these, bin Laden and his acolytes revealed themselves as being entirely parasitical upon the caricatures and dystopian views that proliferated in, and emanated from, the West, as well as being obsessed with what was being said about them. One of the final images of bin Laden – sat watching himself on television – is quite apposite in that regard.

But what kind of Muslim leader is it who advises people to read the journalist Robert Fisk or the academic Noam Chomsky rather than, as one might have supposed, the Koran? And why did bin Laden choose to piggy-back his claims on Western opinion-poll data and the views of environmentalists in order to get his points across? (Although we should note that contemporary political leaders and religious figures in the West do much the same thing.)

Ayman al-Zawahiri – once right-hand man of bin Laden and the group’s supposed intellectual – displayed a similar tendency to draw ideas and inspiration from Western concerns when he noted, in relation to his growing, if evidently unrealistic, fascination with developing some kind of chemical or biological weapon: ‘Despite their extreme danger, we only became aware of them when the enemy drew our attention to them by repeatedly expressing concerns that they can be produced simply with easily available materials.’

In truth, bin Laden and al-Qaeda entirely lacked any substantial ideas of their own, let alone anything that amounts to an ideology. Bin Laden was the leader of nothing, who became – in an age enthralled by celebrity – the iconic terrorist of our times, unable to control his own fans, never mind the course of history. Sadly, only in an age when image and style trump insight and substance at every turn could such aimless violence prompt such an all-consuming response.

Criticism of the West has long been around, but never before has it taken such a degraded form as in our post-political age. Even the presumed rise of religion in the recent period points to the evisceration of political engagement. And there is a world of difference between the cult-like religiosities of the present and traditional, religious organisations – though the former may better countenance rash acts of barbarism through their being less accountable to any wider institutions or mores.

Homegrown nihilists

Far from being atypical, recent self-styled jihadists intercepted in the domestic arena have exemplified the ineptness of the ever-expanding roll-call of marginal fantasists and wannabe terrorists who claim to be part of, or inspired by, al-Qaeda.

spiked’s Brendan O’Neill has noted elsewhere the tactical, technical and organisational incompetence of many modern terrorists, irrespective of their economic or educational backgrounds. And these form just the tip of the iceberg. This is not to dismiss the potential lethality of these plots and the devastating consequences they could have had upon those in their proximity had they been successful in their aims. Nor should we confuse them with the more serious threat posed to troops in Afghanistan, Iraq and elsewhere.

Yet, after each of these incidents, rather than point to the combination of vacuous bravado and concomitant failure, politicians, commentators and analysts have preferred to pursue purported links to al-Qaeda, to which they invariably find a connection, however tenuous.

But associating with groups such as Al-Mujahiroun or Jemaah Islamiyah, travelling to Pakistan to attend some kind of training camp, or surfing jihadist websites including the now notorious Inspire magazine – supposedly al-Qaeda’s web-based English language organ – does not explain anything.

Ideas do not transform people unless they resonate with their experience and existing interpretation of the world. Why do the ideas of fringe organisations appear to fall on such fertile soil? What is it about the West that seems to predispose some to identify with such nihilist groups?

In view of the sheer weight of alternative media to Inspire, how has our society failed to inspire individuals who are often young, bright and energetic, and provide them with rules, structures and meaning to live their lives by?

Ultimately, ideas have to emerge from somewhere. And extremism is the extreme expression of mainstream ideas. If our aim is to stop the extremists, we have to address the mainstream ideas that drive them.

In the most recent incidents – in Boston, London and Victoria, British Columbia – as well as many others, what we find are individuals consumed by a sense of self-righteousness. Islam – if it features at all – is often more an afterthought than a driver. It is their motif, not their motive.

But moral indignation is encouraged by contemporary society, which often presents a negative view of the present combined with a dystopian projection of the future. Disengaged from what passes for politics today, many young people come to develop an aggressive sense of entitlement, indulged by a society they seek simultaneously to distance themselves from.

The outcome covers the spectrum from asserting a new identity – young women wearing headscarves whose mothers never wore one – to inchoate rage, expressed either passively, in the so-called Occupy movement, or more acutely and violently, as in recent episodes of rioting. It is the unpredictable emergence of the latter that has led some analysts to express their surprise at how rapidly so-called self-radicalisation can occur. In fact, it is the failure of observers to identify the social currents beneath the surface that leads them to view matters this way.

Indeed, the parallels between ‘homegrown terrorists’ and other ‘lone wolves’ – such as Anders Breivik, who murdered 77 people in a bombing and shooting spree in Norway in 2011 – as well as the perpetrators of various mass high-school shootings (another relatively recent phenomenon), are more important than any purported political or cultural differences.

Domestic drivers

Space here precludes a detailed exposition of the various social, economic, political and cultural drivers of these trends that were largely catalysed into being only recently.

That modernity itself produces turmoil and disruption, while generating constant uncertainty, has been known for a long time. Marx and Engels noted as much in 1848 in The Manifesto of the Communist Party. But over the course of much of the twentieth century, the Cold War effectively kept the potential for change in check, by demanding adherence to particular worldviews.

The stand-off that pitted the US and its allies against the Soviet Union and its satellite states across Eastern Europe and elsewhere divided the world externally and was reproduced internally against the ‘enemy within’, understood then as emanating from the trade unions or the political left.

But from about the mid-1980s, the erosion of the supposed twin threats of Soviet-style Marxism and state socialism – finally made evident through the unanticipated fall of the Berlin Wall in 1989 – opened the floodgates to the possibility of both public/political and private/personal transformation. This also encouraged the erosion of the distinction between these domains.

Without the forces that had held the political right together for so long, establishment elites were soon exposed as lacking any positive purpose or vision for society, and rapidly fell out among themselves. Replacement enemies were postulated, but none of the new litany of demons – from the Contras in Nicaragua and General Aideed in Somalia, through to Slobodan Milosevic in the former Yugoslavia and Saddam Hussein in Iraq – could match the military, moral and material cachet of the Red Army.

Little wonder, then, that even freedom-advocating Cold War warriors would oppose change when it came. For example, Margaret Thatcher, briefing the then Soviet leader Mikhail Gorbachev in private meetings, told him that the lifting of the Iron Curtain and German reunification would ‘undermine the stability of the whole international situation and could endanger our security’, adding that – despite public pronouncements to the contrary – US president Ronald Reagan was of the same view.

New organising frameworks for society have struggled to fill the void left by the erosion of the old political and moral frameworks shaped by the interest-based politics of left and right. Ideology has – to some extent – made way for identity, but, as some have noted, the latter is a very fragile sense of identity, based on a ‘diminished’ sense of human agency.

That is why there is such resonance today for prevailing discourses that emphasise risk and uncertainty – despite these always having been part of the human condition. More problematically, this culture also elevates our sense of vulnerability over resilience, irrespective of official intent.

Even those charged with defeating terrorism buy into such negative narratives, pointing in their turn to the possibility (rather than probability) of future catastrophes (variously to be caused by limited resources, viruses, climate, population, the economy, technology, and other forces). They then imagine and act upon worst-case scenarios rather than focusing on the most likely.

In the past, such pessimistic projections would have been condemned as a loss of nerve that encouraged low morale; today, they are considered sensible precautions. They impact not just upon counterterrorism but upon all walks of life. For example, in the aftermath of the Fukushima power-plant emergency triggered by the Great Tohoku earthquake, foreign governments encouraged their nationals to flee the vicinity of Tokyo rather than to stay behind, humanely, and help those they had been with.

A similarly shallow deterministic outlook explains why the rudimentary findings of neuroscience and simplistic business models have been co-opted to shed light on the causes and trajectories of terrorism. This is possible because they present a process without a subject in an age when our sense of autonomy and potential has been so curtailed. Accordingly, biological metaphors (ideas go viral, terrorists are spawned, etc) proliferate, as these also downplay our role and intentions (as well as – inadvertently – our accountability, too).

Nervous responses

With the West’s retreat from political ideology to process management, uncertainty has effectively been allowed to drive world affairs rather than emerge from them. A concomitant sense of insecurity has encouraged politicians and people everywhere to avoid expressing firm principles and values, settling instead for simply managing perceived, exogenous threats.

But it is how we, as a society, respond to acts of destruction that determines their impact. Civilisation cannot be bombed out of existence by terrorists. It can, however, be corroded from within if all we do is focus on technical solutions to our problems rather than expanding our horizons through a strategic vision that could project a positive sense of mission for society.

In effect, we complete the acts perpetrated by domestic nihilists. When the UK prime minister David Cameron flew back from his overseas engagement to be seen to be addressing the brutal murder of an off-duty soldier on a London street, or when the city of Boston was put into lockdown by the authorities pursuing an injured teenager on the rampage, no amount of words extolling our resolve and resilience could alter the implicit message of societies disoriented by adversity.

Not only does this act as an encouragement to other loners and losers with an exaggerated sense of self-importance and grievance, it also flies in the face of the real solidarity and fortitude displayed by those most immediately affected. Such resolute responses at the time are then further undermined by the countless medical experts, media commentators and officials who pronounce on the long-lasting consequences that such attacks are held to have for individuals and society.

In 2003, the then UK home secretary David Blunkett suggested in relation to one of these losers that the youth concerned posed ‘a very real threat to the life and liberty of our country’. What kind of country is it that can feel so threatened by the actions of such marginal figures?

Sadly, the focus on surveillance, protection, information and warnings that has emerged since 9/11 has the unintended consequence of promoting undue concern, mistrust and cynicism. It pushes people further apart from one another at a time when they need to be drawn together with a sense of common purpose. It also exemplifies the low view of the public and their likely responses evidently held by many in authority.

Contrary to the contemporary obsession with identifying unanticipated shocks to the system, it is the drift at the top of society that will prove more destabilising in the long run: the drift created by consistently seeking to protect society from without rather than revitalising it from within, and the gradual disengagement and distancing this fosters.

Dystopian projections

Less than 48 hours into the war on terror, British journalist Seumas Milne had an opinion piece published about the US: ‘They can’t see why they are hated’. Others soon followed, leading to expressions of outrage by establishment commentators. What they failed to notice was quite how normal such expressions of anti-Americanism had become.

A sense of contempt for supposedly soulless American consumerism is widespread – even among those working for the likes of Google and Citibank. And surely when Michael Moore’s Stupid White Men (2001) became a bestseller on both sides of the Atlantic – selling over 300,000 copies in the UK in its first year of publication alone – this should have alerted a few bright minds in the security agencies (and beyond) to a self-loathing that is significantly domestic in origin.

This has little to do with America itself, but rather reflects a broader dissatisfaction with the world that targets the US as its highest expression of success. That debate had been simmering for quite some time, particularly among the old political left. But the events of 9/11 catalysed – rather than triggered – this soul-searching, raising it across the board to a new level.

It is striking how common it is today to read book titles such as Alan Weisman’s The World Without Us, or hear respected academics describing humanity as a ‘plague’. These, and countless others like them, point to the low view we have come to have of ourselves in the contemporary world. They point to a significant clash within civilisation, rather than to that between civilisations as characterised by the American political scientist Samuel Huntington.

Unfortunately, such ideas serve to reinforce a cultural milieu within which low expectations and dystopian fantasies become the norm. But such a dismal view of ourselves, our role and our impact on the planet can become internalised by some. It frames a demoralised public discourse of apocalyptic failure and rejection that sustains those prepared to lose their lives – as well as those of others around them – in their misguided determination to leave their mark upon a world they feel encouraged to reject.

Conclusion

America found itself, at the turn of the last century, an indisputable – if somewhat reluctant – world power. It attained that role more formally when propelled by events elsewhere, but it was also inspired by the narrative of ‘manifest destiny’, built on the Enlightenment optimism of Washington, Adams, Jefferson, Madison and others.

By the close of the century, America appeared more gripped by a sense of millenarian pessimism. Its founding and guiding ideology – built not on size, but on the initiative of those confronting the unknown – was that of freedom: freedom from the past, and freedom of conscience, initiative, enterprise and will.

The US, as immortalised by Francis Scott Key in his poem of 1814, was ‘the land of the free’ – not, as it appears some today would have it, the ‘land of the secure’. He understood that people in all places and at all times had been prepared to risk it all to achieve this.

We do not just live our lives – we lead them. And similar aspirations have inspired the struggles of others, however distorted these became in the years that ensued. To lose sight of this, to trade our freedom in order to be watched over by others and made to feel secure, is just one of the confusions that now grips America.

But the forgotten role of leaders today is to inspire people – not just to protect them. People who believe in their cause or project are far more effective agents of it than those who are coerced, managed or nudged.

What has been most missing in the war on terror is a vision for society beyond terror. That is the essence of real resilience: a sense of what we are for in the absence of any adversity; a projection of purpose. Otherwise, as is the case here, we effectively allow the challenges we confront to determine us rather than the other way round.

America still represents much of what is best in the world – as well as a little of what is worst. For all the challenges still confronting it, as well as the pretensions and delusions of others, the future remains for America to lose rather than for others to win. But over a decade into the war on terror, it is high time for America’s search for meaning to conclude through the reinvigoration of its founding values, as well as the identification of a new vision.

That way, many of the disillusioned individuals who look elsewhere for purpose and meaning would not need to, and the few that get through would be framed in the proper context – as mindless criminals.

First published on spiked, 11 September 2013

The spy who came in from the Cold War

The Red Army toilet-raiding realities of spying certainly exhilarated Steve Gibson, but the fall of the Berlin Wall brought doubt, too.

From the end of the Second World War through to the end of the Cold War, a little-known unit of British special forces conducted spying missions behind the Iron Curtain – that is, right from the heart of Soviet-occupied East Germany.

Called the British Commanders’-in-Chief Mission to the Group of Soviet Forces of Occupation in Germany, or BRIXMIS for short, it was part of an officially sanctioned exchange of observers between the Red Army and the British Army, established by the victorious Allied powers and the USSR through the Robertson-Malinin agreement of 1946. Its ostensible purpose was to improve communication and relations between them.

In addition to BRIXMIS – and their French and American counterparts in the East – the Red Army also conducted similar operations through a unit in West Germany. But, diplomatic liaison and translation duties aside, the real purpose of these units soon became clear: to find out what each other was up to by heading out into those areas where they had been specifically told not to go.

My friend and former UK Defence Academy colleague Steve Gibson led many of these ‘tours’ just as the Cold War was coming to an end. Live and Let Spy is his gripping recollection of these episodes. Although originally published in 1997 as The Last Mission: Behind the Iron Curtain, it has now been republished and augmented some 15 years later with a significant additional chapter written with the hindsight gained during his subsequent academic career.

Much of the espionage involved gathering evidence about the weaponry available to the Red Army. Accordingly, it typically required lying in wait on a bridge over a railway line at three o’clock in the morning, in the middle of a forest, in winter. With temperatures dipping to around minus 30 degrees Celsius, the objective was to photograph all the kit that passed by on a train underneath. Alternatively, they might record the rate of fire from Russian guns from the safety of their locked vehicle in the sweltering 40-degree heat of summer.

Sent back to the Defence Intelligence Staff in Whitehall, this information allowed specialists to determine troop and equipment levels, as well as whether a new bolt on a gun or aerial on a tank might allow it to fire or communicate further than previously estimated – and if so, whether this would necessitate the complete re-evaluation of NATO’s Cold War battle plans.

Of course, the operations required meticulous planning to identify suitably concealed observation posts, as well as efficient access and escape routes. This planning was usually conducted during the day. For those so disposed, there are sufficient ‘tradecraft’ details here to sustain interest. For me, however, the real gem is the lesson identified early in the book – to be as conspicuous as possible by waving at everyone.

This waving tactic disoriented many into believing that the Mercedes G-Class (Geländewagen) passing by with three individuals in army fatigues in it was legitimate. And even if observers suspected something, the fact that nearby children would invariably wave back – raising the possibility that those inside were known to them – would add further confusion or delay. Those who did smell a rat usually did so too late.

It was not just a jolly jaunt. Over the years, a number of tour personnel lost their lives or suffered serious injury through being shot at or having Russian tanks ram their vehicles. East German ‘narks’ were also always on the look-out for anyone in the wrong place and would report them to the relevant authorities. It is noted, though, that many local ‘Easties’ were keen to help the agents.

Given the challenging circumstances, selection and training were intense and severe. The role required individuals who could think quickly on their feet rather than simply follow rules. It also meant completing advanced courses in Russian and German, photography and navigation in next to no time, memorising the look and sound of countless pieces of Soviet military equipment, and remaining calm – yet sharp – when tired or provoked.

For anyone who imagines that spying is glamorous, or somehow akin to being in a Bond movie, they will be disabused by Gibson’s chapter on document-gathering from dumps (literally). It had been recognised for some time that, when they went on manoeuvres in East Germany, the Soviet forces were not supplied with any toilet paper. They would use whatever came to hand – a copy of Pravda, a letter from a loved one, or even their mission papers. And after they were done, it was then that Her Majesty’s specially trained and equipped Cold War warriors really came into their own…

The book is a tour de force of teeth-clenching tension that will keep most readers gripped from beginning to end. But while the first nine chapters retain the action-packed core of the original narrative, filled with the escapades of small teams of rather special individuals trying to find out what the Soviets were up to, the real substance – for those of a more political disposition – is a chapter titled ‘Reflections’.

As Richard Aldrich, professor of political science at the University of Warwick, notes in the new foreword, Gibson is now clearly of the mind that ‘much of what [he] was led to believe [during the Cold War], and some of what he was told, was simply wrong!’

It is testimony to the author’s strength of character that – unlike others – he neither chose to dwell in the past nor fell prone to the ‘invention of illness’. This latter problem, he himself notes, affected many of his one-time colleagues once their personal and moral frameworks disintegrated with the end of the old, Cold War world order.

Gibson’s resolute clear-sightedness is to be admired. Despite having been caught up in the exhilaration of it all as a young man, and despite devoting the prime of his life to the East-West conflict, he refuses to lie to himself. ‘The Cold War’, he notes, ‘was a giant historical cul-de-sac where all enlightened efforts at producing a good society were suspended’.

Aldrich astutely summarises a key argument of Live and Let Spy: ‘while Cold War warriors fought a tyrannical and ruthless version of Communism abroad, they remained ignorant of – and lost – an ideological battle at home’. He then adds accusingly: ‘Western politicians now offer a watered-down version of the interfering, intolerant, controlling and authoritarian government that they were initially set against rather than anything freer.’

In this, he takes his lead from Gibson, who rails against the erosion of ‘moral values, community spirit and sense of purpose’ that now pervades Western political elites. They are ‘pessimistic and misanthropic’, Gibson argues, while ‘suffering from an acute lack of confidence in their own projects’. This lack of authority, this social pessimism, they now effectively impose on others through a ‘moralising intervention into every aspect of private life’. But while the description of this new period will, no doubt, connect with many, Gibson – possibly by trying to cover too much, including passing references to Aristotle, Kant and Bentham for good measure – fails to provide a convincing explanation of why this all came about.

Taking his lead from the BBC documentary film producer Adam Curtis, Gibson identifies how the computer modelling of behaviour – and even more bizarrely, of intentions – came to dominate an intelligence world increasingly devoid of purpose or principle. But, as he himself notes, the intelligence community’s embrace of behaviour modelling is just as likely to be an expression of a broader ‘loss of faith in humans’ as the driver of social processes. Today, that loss of faith – and an obsession with risk management – comes to be expressed through the failure to put eyes and ears on the ground, as Gibson’s once were (a job for which he was awarded an MBE), and thereby a failure to verify theory through practice.

In addition, this final chapter makes three significant and unique contributions to improving our understanding and application of intelligence.

Firstly, he argues that the most useful role of intelligence today is to understand the context correctly, without which ‘purpose is equally misguided’. Secondly, and drawing on his most important academic contribution, Gibson notes that, ‘the use of single-source intelligence-reporting drawn from individuals selected principally for their willingness to share secrets…is not the best way to analyse contemporary challenges’, as the illusions about Iraqi ‘weapons of mass destruction’ ably demonstrated.

Whether these new challenges are, as he suggests, those so-called non-traditional security threats, such as climate change, energy supply and food provision, is open to debate. Possibly, it is the ‘dismissal of free will’ and ‘decline into mediocrity’ that he identifies elsewhere that are the real problems. And it is these problems that have turned the essentially technical issues of climate change or food provision into all-consuming sources of uncertainty and insecurity.

Finally, and significantly for one who has made the pursuit of freedom and autonomy central to his existence, Gibson notes the loss of any sense of fun in a politically correct world without an ‘enlightened purposeful ideology around which to cohere’. (This comes from a man who knows something about fun having, in his youth, gatecrashed an international beauty pageant pretending to be Miss Austria’s personal bodyguard.)

Advocates of the ‘purposeless pragmatism’ and ‘bureaucratic regulation’ he now views as the real barrier to achieving ‘prosperity and progress for all’ would no doubt disapprove of Gibson’s youthful antics. It is unlikely, for instance, that they would appreciate the photographs of naked lovers taken from over one kilometre away that he and his colleagues once sent back to Ministry of Defence analysts to show that their equipment was working and that they were maintaining their skills. But, he notes, it is precisely intolerance towards the criticism – and in this case, the mockery – of widely held beliefs that precludes the effective determination of the truth.

Richard Aldrich concludes that ‘Gibson reflects that it takes the passage of time to recognise that one is misled by power’. For those who feel that after the fact is too late, and who still hope to shape history rather than merely be carried along by it, it is only through a constantly evolving analysis of present circumstances that such historical cul-de-sacs can be avoided.

This book – while not pretending to be any more than a personal memoir of some hitherto less disclosed aspects of the Cold War – serves to remind us of how far we have come since. After it all ended, Gibson concludes that ‘the somewhat hasty, undignified and testy disintegration of the Mission was intrinsically due to the absence of mission itself’.

This may well explain why – as was revealed from Kremlin minutes released some 20 years after the Berlin Wall came down – Western leaders were so keen at the time to remind the Soviet Union’s then-president, Mikhail Gorbachev, that they really did not want a reunified Germany. Despite their pro-freedom stance and rhetoric, they feared that a unified Germany would, in the words of Margaret Thatcher, ‘undermine the stability of the whole international situation and could endanger our security’.

Indeed, the Cold War may well have been the last time that Britain and the other Western powers could even pretend to have had a clear and positive sense of mission.

First published on spiked, 30 March 2012

How CSR became big business

Corporate social responsibility allows governments to avoid accountability and gives companies a sense of purpose.

Whenever society faces a crisis there tends to be a wave of moralism. So it is not surprising that, as the private-debt crisis has transformed into a public-debt calamity, there is now much discussion about the correct conduct of business and finance.

The last time such a significant conversation occurred on these matters was in the mid-1990s. Back then, economic turmoil and the dramatic downfalls of corporations and businessmen like BCCI, Polly Peck and Robert Maxwell – all tainted by accusations of fraud – led to the promotion of ‘corporate social responsibility’ (CSR). The ideas behind this concept were articulated in a landmark inquiry by the Royal Society for the Encouragement of Arts (RSA), Tomorrow’s Company: The Role of Business in a Changing World. It seems fitting, therefore, that the RSA’s current chief executive, Matthew Taylor, recently sought to articulate his vision for ‘enlightened enterprise’, laying out how ‘business can combine a strategy for competitive success with a commitment to social good’.

Looking back, though, it seems many of the corporate contributors to the original study might have been good at talking the CSR talk, but they were considerably less interested in, or capable of, walking the CSR walk.

Quite a few of the companies, including British Gas, British Airways and National Grid, were relatively recent creations of the privatisation boom under the previous Conservative administration. In their cases it is reasonable to suppose that their chief executives were keen to get behind the calls for change. Many others, such as electronics company Thorn EMI, transport and logistics firm Ocean Group, and the IT company FI Group, got caught up in the late-1990s wave of mergers and acquisitions, and so ended up being subsumed or disappearing entirely. No doubt, quite a few individuals got rich in the process.

Some of the original supporters of CSR – like The Strategic Partnership (London) Ltd – were more like tiny, shoestring-budget quangos, staffed by individuals whose intended policy clout far exceeded their business significance. At the other end of the spectrum, among those who pontificated about what makes a responsible company, were the leaders of Barings Venture Partners Ltd. Barings Bank collapsed in 1995 after one of its employees, Nick Leeson, lost £827 million due to speculative investing. So much for being responsible.

Tomorrow’s Company was a product of its time. Bemoaning the absence of non-financial measures for business success, it fed into the growing demand for procedural audits and targets that were to become one of the emblematic pledges of the New Labour government. And, in what was to become typical New Labour lingo, the inquiry demanded greater ‘dialogue’ and ‘inclusivity’.

The RSA inquiry complained of the ‘adversarial culture’ of the business world. This heralded later attacks on various supposed institutional cultures, including the ‘canteen culture’ of the police force, critiqued in the 1999 Stephen Lawrence inquiry, and the ‘club culture’ of the medical profession, lambasted in the 2001 Bristol Royal Infirmary inquiry. There have also been critiques of the army’s ‘barracks culture’ and, more recently, of the ‘macho culture’ of the International Monetary Fund (IMF). This followed the controversy involving the former IMF chief, Dominique Strauss-Kahn, who was claimed to have been protected by a French ‘culture of secrecy’.

The meaning of CSR today

Matthew Taylor, in his recent exposition of ‘enlightened enterprise’, also asks for a ‘shift in our national culture’. But whereas the 1995 RSA study called for change in response to the ‘increasingly complex, global and interdependent’ conditions within which businesses were allegedly operating, for Taylor the key problem to be corrected is human nature.

‘[H]uman beings are complex social animals’, he suggested in a recent speech, ‘influenced more by our nature and context and less by calculating, conscious decisions, than we intuitively believe’. Like other adherents of the new orthodoxies of behavioural economics and evolutionary psychology, Taylor talks of the need to create ‘more capable and responsible citizens’.

So what does all this have to do with business behaviour? One important clue was provided by Mark Goyder, programme director of the original RSA inquiry. He brought up ‘the notion of business as the most important agent of social change, in an age when governments are redefining and limiting their own sphere of influence’. Taylor, for his part, identified the idea of behaviour change as a key aspect of corporate responsibility and explained that the Lib-Con coalition has set up its own behaviour-change unit and that ‘the idea that we need to move from a government-centric to a people-centric model of social change is central to David Cameron’s vision of a Big Society’.

Against the backdrop of these two elements – the changing role of government and the view of ordinary people as little more than natural impulses on legs, as beings who need to be nudged into changing their behaviour – the new role of business becomes transparently clear. Businesses are to act on behalf of governments that can’t be trusted and for people who don’t know what’s good for them.

Taylor is quite explicit about this. ‘[T]he state’, he noted, ‘has many competing objectives and when it uses its power to nudge it opens itself up to charges of paternalism and social engineering’. Businesses, however, have the ability ‘to build on a relationship based on choice and consent, and in some cases a good degree of trust’. All these qualities are presumably no longer to be expected, or demanded, from government.

No doubt, many in the business community will jump at this invitation to take over the levers of power by acting as de facto school prefects on behalf of states that no longer want, or cannot be trusted, to rule. Many will also be excited by the ability to play an ever bigger role in the government’s nudge agenda and to take on the mantle of responsible agents for change.

From profits and growth to ‘performance with purpose’

Today, CSR is big business. The measure of enterprise in this age is not the spirit that took people to the Moon, but playing a part in slimming waistlines and reducing carbon footprints. It’s simply a question of ‘selling the right stuff’, as PepsiCo’s CEO Indra Nooyi has put it. Nooyi has committed her company to ‘performance with purpose’, which includes providing healthy snacks. Likewise, the Mars Corporation’s new focus has little in common with the bold ambitions of the space-race era. It now wants to concentrate on selling products ‘as part of a balanced diet’ and on encouraging people to get ‘fit not fat’.

Taylor is aware of the possibility that not all of us would choose to pursue the ideals that he and his fellow nudge-enthusiasts espouse. To counteract this, business has to take the lead, ‘prompted by NGOs in a sense acting as quasi-regulators and intermediaries with consuming households’. The RSA has taken the lead in this respect, working with Shell and taxi drivers to make fuel-efficient behaviour more habitual.

Ultimately, Taylor comes across as gullible for buying into the idea that corporations want to put social responsibility first. He even cites ‘Flora’s cholesterol-cutting margarine’ as a service in protecting people’s health. This despite the fact that Flora’s claims are highly dubious, and the purported link between high cholesterol and heart disease is increasingly disputed and discredited. Perhaps Taylor will be promoting anti-ageing creams next?

A major error of CSR proponents is to assume that the key determinant of success for businesses and their employees is not making money, but being fulfilled in some other way. Taylor cited a Gallup survey which showed that ‘beyond obvious basic factors like health and a reasonable income, the key determinant of whether someone described themselves as thriving in their lives as a whole was whether they saw their employer – or manager – as a partner rather than a boss’. Here, he sounds rather like one of his predecessors at the RSA, Charles Handy, who, in answer to the question ‘What is a company for?’, said: ‘To talk of profits is no answer because I would say “of course, but profits to what further purpose?”’

CSR: the displacement of responsibility

But real profits, good health and reasonable incomes cannot so readily be assumed. They still have to be achieved, and cannot just be dismissed as ‘obvious’ in a desire to promote a new business agenda. In fact, the CSR agenda has helped businesses get away with ignoring the self-expressed needs of their employees. British Airways, for instance, was commended for its social and environmental reports while simultaneously undermining working conditions for its staff.

Indeed, the ideal CSR scheme focuses its supposed benefits elsewhere – typically on poor people ‘without a voice’, or, better still, on animals or the environment, which cannot talk back at all. That way, businesses can offer token sums and gestures to impoverished communities and satisfy eco-activists and their media groupies at the same time – all the while compelling staff to subsidise the schemes by volunteering their own time and energy.

It is not at all obvious what it is about businesses, and still less self-styled civil-society groups and NGOs, that makes them legitimate representatives of the public’s needs. For the government, recruiting business to its behaviour-change agenda seems like a further evasion of accountability. Ultimately, whatever companies say about putting people and the planet before profit, they only ever have a partial view of the world.

Only states have, and are authorised by the sovereign people to promote, a more universal view. Whether society should be aiming for healthy living, sustainability or anything else should be part of a broad, democratic discussion, not sneakily foisted upon us by businesses acting under the guidance of NGOs, policy wonks or ministers looking for ways to show they’ve ‘made a difference’.

The original advocates of CSR focused their attention on culture, as do their supposedly more people-centric descendants, because it is at this level – the level of the informal relationships between people – that the potential for contestation and resolution initially emerges. This can be a messy business, and one that states that doubt their own direction and purpose are loath to engage in. They would rather outsource this messy function to others, and attempt to replace all those informal, uncertain and uncontrollable interactions with more predictable formal codes, regulations and responsibilities. That they find willing lap-dogs for this in the ethereal world of think tanks, as well as in businesses suffering from their own crisis of confidence, is not that surprising.

However, if we truly want to change the world then it is ordinary people who will have to assert what really matters to them. CSR – it has been noted by many – is invariably a by-product of business success, not the cause of it. Likewise, it is people’s aspirations for a better world – however we imagine it – that should be the only prompt for the kind of behaviour we adopt.

First published on spiked, 2 November 2011

Message to the West: ‘know thyself’

Since 9/11, terrorists have lived like parasites off the already-existing disorientation of Western elites.

In his opening remarks to the latest US National Strategy for Counterterrorism, released in June, President Barack Obama notes that ‘to defeat al-Qaeda, we must define with precision and clarity who we are fighting’.

Ten years on from 9/11, then, it would appear that one of the key protagonists in what used to be known as the ‘war on terror’ (subsequently rebranded as the ‘long war against terrorism’ and now simply redefined as a ‘war on al-Qaeda’) is still busy attempting to identify and understand its enemy.

This speaks volumes about where the fundamental difficulties of the past decade, as well as the next one, may lie. For all the much-vaunted differences with his predecessor, President Obama comes across as just as confused as George W Bush. At a time when 9/11 was probably still just a twinkle in Osama bin Laden’s eye, Bush Jr addressed an audience at Iowa Western Community College as part of his election campaign. He expounded: ‘When I was coming up, it was a dangerous world, and you knew exactly who they were. It was Us vs Them, and it was clear who Them was. Today, we are not so sure who they are, but we know they’re there.’

Perhaps both presidents Bush and Obama should have visited the fabled Temple of Apollo at Delphi, where Ancient Greek warriors consulted the oracle before engaging in protracted conflicts. In the temple forecourt they could have read the famous inscription: ‘Know thyself.’

For 10 years, the world’s sole superpower has allowed one of its key strategies to be defined for it, and has also allowed itself to be buffeted around as its understanding of who the enemy is continually changed. As its locus of interest has shifted relentlessly – from terrorists and terrorism to states that may harbour terrorists to technologies that might facilitate terror – so America has consistently and unwittingly advertised to the world that wherever the ‘they’ lead, the US follows.

This is the very opposite of strategic vision. Such vision would require knowing what you are for, what your aims and ambitions are, even in the absence of having to respond to the presumed threats posed by external forces. Knowing your enemy is, of course, a necessity, but the primary task for any nation is to be clear about its own interests in the first place.

And so it is precisely a better understanding of Western culture – its conflicts and contradictions – that might have helped the US authorities appreciate the extent to which the trajectory they were about to embark on was born of their own internal confusions and incoherence.

Osama bin Laden, Ayman al-Zawahiri and those who have sought to emulate them have also spoken of an inchoate rage against modernity that rapidly eclipsed the various Western anti-globalisation movements of the time. For all their purported claims to be representatives of others in the South and the East, the most striking thing about bin Laden et al was the extent to which their ideas were largely Western in origin. While being mindful to dress themselves and their language in Islamic garb, their complaints were predictable and had been well-rehearsed by others in the West. As I have put it before, ‘Islam was their motif, not their motive’.

Sadly, by imbuing these people’s puerile and purposeless violence with deeper meaning – to the point of even describing it as an ideology or an understandable reaction – countless international analysts both effectively absolved those involved of responsibility for their actions and helped encourage others to follow their lead.

But what these analysts often missed is that while the ‘war on terror’ may be 10 years old, for its real origins we need to go back at least another 15 years. In the mid-1980s the then Soviet president, Mikhail Gorbachev, appeared dramatically to alter the rules of the Cold War through promoting the twin policies of glasnost and perestroika. He had little choice if he was to delay and soften the blow of his country’s impending implosion. The consequences were to prove just as dramatic for the West as for the East.

It was in this period – before the collapse of the Berlin Wall, and while the CIA was still busy training the Mujahideen to assist them in rebutting the occupying Soviet forces in Afghanistan – that the need for Western elites to reorganise their own systems and ideologies first emerged. By the time Francis Fukuyama was celebrating ‘The End of History’ it was already becoming clear that the only force that had held conservative elites across the world together during the Cold War period was the supposed twin threat posed by Soviet Marxism and internal state socialism.

Without these forces, the old political right rapidly suffered intellectual exhaustion and then disintegrated, leaving the future up for grabs. In the 1990s there was a constant search for new enemies against which states – in danger of losing their own meaning and purpose – could cohere themselves.

But none of the new litany of demons – from the Contras in Nicaragua and General Aideed in Somalia to Slobodan Milosevic in the former Yugoslavia and Saddam Hussein in Iraq – could really match the cachet of military and material urgency that had been imposed by the Red Army. Ethical foreign policy came and went – invented by Tony Blair’s New Labour government and adopted later by the Bush administration.

It was in this period that the old remnants of the left, fearful of being consigned to the dustbin of history, embraced both the environmental movement and the politics of risk and precaution as a way of gaining legitimacy. Originally formulated in relation to addressing ecological problems, this rapidly spread to issues pertaining to public health, consumer safety and beyond. It provided a cohering framework in an age without one. And a key element of the precautionary outlook then being developed was the need for public officials to project worst-case scenarios for society to respond to.

The German sociologist Ulrich Beck’s 1986 bestseller Risk Society was translated into English in 1992 and rapidly gained traction through its ability to reflect the emerging mood and policies.

An outlook shaped on the fringes of local authorities and supra-national bodies of marginal relevance soon became the new organising principle of the West. And what had, until then, been largely dismissed as an exercise in left-wing political correctness by the old right, was catapulted and transmogrified through the tragic events of 9/11.

The new terrorists, then, were both a product of these confusions and, inadvertently, the providers of a flimsy new purpose for the authorities. Criticism of the West had long been around, but never before had it taken such a degraded form as in this post-political age.

In any other previous period of history, the actions of the Islamic radicals ought at best to have featured as minor disturbances in the footnotes of history. Only in an age schooled in presuming the worst in all eventualities could such mindless violence come to be seen as full of meaning and requiring an all-consuming response.

Ultimately, extremists are merely the extreme expression of mainstream ideas. Their ideas have to come from somewhere. And looking around at the dominant thinking of the post-Cold War world order, it is not too difficult to identify where some of the sources are.

Increasingly, we have become accustomed to presuming that we live in a peculiarly dangerous and uncertain age. Globalisation, which provides most of the benefits we often unconsciously enjoy, has come to be portrayed as the amoral steamroller and destroyer of humanity and history. Human beings are increasingly depicted as being both violent and degraded, as suffering from arrogance and ignorance, or as hapless and vulnerable victims needing constant therapeutic support by a range of experts. Little wonder that such a small coterie of fools, the terrorists who espoused these ideas in an extreme form, could have such strong purchase.

But by overemphasising the extremes, as we are now prone to do, we simultaneously underestimate the significance of the mainstream. Black swans happen but white swans remain far more frequent, and drift can be just as disabling as shock, if not more so.

The Enron crisis occurred at about the same time as 9/11 – and it cost significantly more. It was soon followed by the collapse of WorldCom and, years later, by the 2008 world economic crash. Yet unlike other problems that have emerged over this period, there was never quite the same sense of urgency in addressing these issues. Maybe that is because, at some deeper level, many world leaders know that they cannot be tackled without significantly more far-reaching measures which, despite the culture of precaution, they have studiously sought to avoid.

Despite the billions of dollars expended on the ‘war on terror’ thus far, the US and others are still far from understanding, not just what it is they think they are up against, but also themselves. Without such an understanding there can be little hope for positive progress and development. In the past, some believed they suffered from a US that was too confident and assertive in the world. Today, we see the legacy of a US that is both confused and ambivalent. And we don’t seem to be any better off.

First published on spiked, 8 September 2011

WHO’s learned nothing from the swine-flu panic?

The over-reaction to H1N1 influenza in 2009 was built on years of waiting for ‘the Big One’.

Over the past few days, the sixty-fourth session of the World Health Assembly (WHA) has been held in Geneva. The WHA is the highest decision-making body of the World Health Organization (WHO), comprising delegations, at up to ministerial level, from the WHO’s 193 member states.

Among the agenda items was to be a discussion of the International Health Regulations 2005 (IHR) in relation to pandemic A (H1N1) 2009 – colloquially known at the time as ‘swine flu’. The IHR first came into force in 2007 and were established to facilitate international cooperation in preventing and responding to acute public-health emergencies, such as the outbreak of influenza that appeared to originate in Mexico two years ago.

The 180-page report, presented by the IHR Review Committee to the WHA, certainly seems impressive. Aside from receiving numerous national and institutional inputs, well over a hundred individuals from a vast array of agencies worldwide, including the WHO, contributed in some form to its findings.

But, in essence, only one point of any note is made in it: ‘Critics assert that WHO vastly overstated the seriousness of the pandemic. However, reasonable criticism can be based only on what was known at the time and not what was later learnt.’ This is felt to be of such significance that it is stated three times – in the executive summary, in a slightly modified form in the body of the text, and again in the conclusions. It is intended as a robust rebuttal to those voices – in the media, the medical professions, and elsewhere – who have questioned the global response to H1N1, and the WHO’s role in shaping this response.

Foremost among these has been Paul Flynn, a British Labour MP and rapporteur to the Social, Health and Family Affairs Committee of the Council of Europe, through which he successfully promoted an inquiry into the matter. This inquiry primarily questioned the role of advisors to the WHO, who – through being employed by large pharmaceutical companies that produce anti-viral drugs and vaccines – were held to have had an economic motivation in raising public concerns about swine flu.

The editor of the British Medical Journal, Fiona Godlee, and others, have similarly pointed to possible conflicts of interests, as well as a lack of transparency within the WHO relating to advice and appointments. Sam Everington, former deputy chair of the British Medical Association, went on the record to argue that, in his opinion, the UK’s chief medical officer and the government were ‘actively scaremongering’.

Quite a number of countries worldwide have also raised criticisms since the pandemic abated, ruing the fact that they purchased vast stocks of vaccines at some considerable cost that have remained unused.

But, just as with the official review of the UK’s response to the outbreak, these voices and views are simply non-existent as far as the IHR Review Committee and the WHO are concerned. And anyway, as the report reiterates, it is the considered opinion of international public-health specialists that claims of over-reaction to what turned out to be a comparatively mild illness are misguided. Those who point to this are held to be cavalier and complacent as to the possible risks entailed had the situation been different.

What’s more, much emphasis is placed in the report on the fact that Margaret Chan, the director-general of the WHO, and other WHO staff consistently tried to calm matters down, repeatedly noting that the overwhelming majority of cases were mild and recommending to governments that there was no need to restrict travel or trade. If anyone went beyond the measures that were officially advocated then the WHO could hardly be held responsible for this, the report contends. Hence it is to the media, and in particular new social media, that blame is attached.

But all this is to woefully misunderstand and underestimate how communication about risk affects contemporary society. Regulations and warnings are not issued into a vacuum. People and institutions do not merely respond to messages on the basis of the precise information contained within them. Rather they interpret these through the prism of their pre-existing cultural frameworks.

For example, when the former UN weapons inspector Hans Blix advised the world in 2002 that he could find no evidence of weapons of mass destruction in Iraq, it is quite clear that, rather than taking this at face value, the US authorities assumed that any such weapons were simply well hidden. In other words, they did not allow the facts to stand in the way of their mental model of the world – one in which the Iraqi authorities would invariably lie and operate surreptitiously, regardless of evidence to the contrary.

Likewise, whatever the WHO likes to think it announced about the outbreak of H1N1 influenza in 2009 – ignoring, presumably, the fact that the director-general herself described it as ‘a threat to the whole of humanity’ – its officials should also have been sensitive to the reality that their messages would emerge into a world that had steadily been preparing itself for a devastating health emergency for quite some time.

Indeed, much of this ‘pandemic preparedness’ had been instigated and driven by the WHO itself. It is quite wrong therefore for the IHR Review Committee report to argue that any criticism of the WHO was based on ‘what was later learnt’. It is clear that the global public-health culture that the WHO itself helped to create in advance would inevitably result in just such an over-reaction. It is even possible to go further than this and to predict right now that this will not be an isolated incident. Lessons may be learnt, but mostly the wrong ones.

A critical article published just over a year ago in Der Spiegel, Europe’s largest-circulation weekly magazine, noted how, prior to the advent of H1N1 in 2009, ‘epidemiologists, the media, doctors and the pharmaceutical lobby have systematically attuned the world to grim catastrophic scenarios and the dangers of new, menacing infectious diseases’. Indeed, it seemed at the time of the outbreak, to one leading epidemiologist at least, that ‘there is a whole industry just waiting for a pandemic to occur’.

In this, as the IHR Review Committee report makes clear, ‘The main ethos of public health is one of prevention’, before continuing: ‘It is incumbent upon political leaders and policy-makers to understand this core value of public health and how it pervades thinking in the field.’ The authors appear to believe that this is a radical outlook; in fact, this precautionary attitude is the dominant outlook of our times. In that regard at least, the WHO and others were merely doing what came naturally to them when they acted as they did in 2009.

It is the case today that both elites and radicals view the world in near-permanent catastrophist terms. This apocalyptic outlook emerged as a consequence of the broader loss of purpose and direction that affected society after the collapse of the old Cold War world order, which had last provided all sides of the political spectrum with some kind of organising rationale.

Indeed, it was as the Cold War was drawing to a close that the concept of emerging and re-emerging infectious diseases first took hold. And, as noted by the American academic Philip Alcabes in an excellent book on these issues, it was also the point at which the notion of dramatic flu epidemics occurring on a cyclical basis – which until the 1970s had been little more than one of many possible theories – also came to form an essential component of the contemporary imagination.

In the autumn of 2001, the anthrax incidents that affected a tiny number of people in the US in the aftermath of the devastating 9/11 terrorist attacks were heralded by the authorities as a warning of things to come. As a consequence, after many years of being regarded as an unglamorous section of the medical profession, public health was catapulted centre-stage, with vast sums made available to it by military and civilian authorities to pre-empt and prevent the bioterrorist attacks they now all too readily anticipate.

The outbreak of a novel virus, severe acute respiratory syndrome (SARS), in 2003 – a disease that affected few individuals worldwide but had a relatively high fatality rate – was held by many to confirm that we should always prepare for the worst.

Since then it has been the projected threat of H5N1 ‘avian flu’ jumping across the animal-human barrier that has preoccupied the world public-health authorities. Irrespective of the fact that there have been just 553 cases of H5N1 since 2003, concerns generated by it have been sufficient to push through far-reaching transformations to the world public-health order – including the advent of the IHR themselves.

Now – ominously – aside from deflecting any responsibility for the confusions it helped to create by describing the H1N1 episode as having exposed ‘difficulties in decision-making under conditions of uncertainty’, the IHR Review Committee concludes that – looking forwards – the regulations’ most important shortcoming is that they ‘lack enforceable sanctions’.

In this regard, public health will not just be perceived as a national-security concern – as it has already become in many influential circles – but as one requiring effective policing, possibly with its own enforcement agency, through the establishment of a ‘global, public-health reserve workforce’, as the report suggests.

Aside from absolving the IHR and the WHO of any responsibility for the debacle that saw large numbers of well-informed healthcare workers refusing to be inoculated when the vaccine eventually emerged in 2009 – thereby encouraging the public to act in similar fashion – the report of the Review Committee is also a call to make risk communication more of a priority in the future.

But, far from the public requiring the authorities to speak more slowly, more clearly or more loudly to them, it was precisely the attempted communication of risk – where there was little – that was the problem in the first place. That is why we can be sure that this problem is set to recur, at tremendous cost – both social and economic – to society.

Risk is not simply an objective fact, as some seem to suppose. Rather, it is shaped and mediated through the prism of contemporary culture. That we perceive something to be a risk and prioritise it as such, as well as how we respond to it, are socially mediated elements. These may be informed by scientific evidence but, as indicated above in relation to Iraq, broader trends and outlooks often come to dominate the process.

These are shaped by a vast number of social, cultural and political variables, such as the cumulative impact on our imagination of books, television programmes and films that project dystopian – or positive – visions of the present and the future. Another major influence is the perception of whether the authorities have exaggerated or underestimated other problems, even such apparently unrelated matters as climate change or the 2008 financial crisis.

An emergency then – whether it relates to health or otherwise – does not simply concern the events, actions and communications of that moment. Rather, it draws together, in concentrated form, the legacies of past events, actions and communications as well. And while it may not have been in the gift of the IHR Review Committee to analyse, and – still less – to act upon all of these, there is precious little evidence that they considered such dynamics – and their own role within them – at all.

Far from struggling to convey their messages about H1N1 through a cacophony of competing voices – as some within the WHO seem to suppose – the authorities concerned totally dominated the information provided about the pandemic in its early stages. Their mistake is to presume that it was merely accurate information and the effective dissemination of it that was lacking.

Rather, it was the interpretation of this information according to previously determined frameworks, which had evolved over a protracted period, that came to matter most. Accordingly, the WHO tied itself in knots issuing endless advisories at the behest of the various nervous national authorities it had helped to create. This even included guidance on the use of facemasks which, while noting the lack of any evidence for their efficacy, nevertheless conceded that they could be used – provided they were worn and disposed of carefully!

At the onset of the 1968 ‘Hong Kong’ flu epidemic, which killed many tens of thousands more than H1N1, the then UK chief medical officer suggested – erroneously, as it turned out – that the outbreak would not be a major problem. Far from being lambasted for being wrong, or hounded out of office, as he might be in today’s febrile culture, the presumption of the times appears to have been that it was precisely the role of those in authority to reassure and calm people down, rather than to issue the endless, pointless warnings we witness today.

The WHO, on the other hand, seems determined to assert its moral authority by projecting its worst fears into the public domain. Sadly, it seems, the authorities have not learnt a single lesson from this episode.

It is not the actions of the individuals concerned that the IHR Review Committee report should have scrutinised and sought to exonerate from presumptions of impropriety or personal gain. What urgently needs to be interrogated, rather, is the gradual construction of a doom-laden social narrative that WHO officials have both helped to build and must now respond to.

First published on spiked, 23 May 2011